# [TomsHW] Nvidia GTX 1180 Expected in July: What You Need to Know



## epic1337

> we expect a July release for the Founder's Edition cards with third-party cards to follow in August or September.


as late as September? I do hope it's not just the GTX 1180, but at least a partial lineup,
e.g. including a GTX 1170 and possibly a GTX 1160.


----------



## ENTERPRISE

I always find these versions (Non Ti) kind of painful. I have always preferred to wait until the Ti releases before the hammer comes down. Sometimes I am disappointed with the gains of new non Ti cards against the older gen Ti variants.


----------



## Slaughtahouse

ENTERPRISE said:


> I always find these versions (Non Ti) kind of painful. I have always preferred to wait until the Ti releases before the hammer comes down. Sometimes I am disappointed with the gains of new non Ti cards against the older gen Ti variants.


While I do agree to an extent, the non-Ti versions typically have lower power draw (GTX 1080 TDP 180W vs. GTX 1080 Ti 250W).

Given this is an enthusiast forum, I understand that most users don't factor that in, but it does make a difference. Granted, if I had known a GTX 780 Ti would be released, I would have purchased that instead of a GTX 780, because I really wanted all the performance I could get. However, that is when Nvidia started this trend of releasing Titans and x80 Tis.

Going forward, I think I will stick to the non-Tis, just so I am dumping less wattage into my w/c loop. But to each their own.


----------



## keikei

So new Ti for Xmas then?


----------



## SuperZan

I'll probably just stick with the 1080 Ti, but it'll be interesting to see where these cards fall performance-wise.


----------



## doom26464

I'd like the 1180 Ti version, but I'm not going to wait 10-12 months for that to drop.

If this is 20% faster than a 1080 Ti, it will be enough to get me off my 980 Ti.


----------



## nycgtr

ENTERPRISE said:


> I always find these versions (Non Ti) kind of painful. I have always preferred to wait until the Ti releases before the hammer comes down. Sometimes I am disappointed with the gains of new non Ti cards against the older gen Ti variants.


That's how milking works. 



doom26464 said:


> I'd like the 1180 Ti version, but I'm not going to wait 10-12 months for that to drop.
> 
> If this is 20% faster than a 1080 Ti, it will be enough to get me off my 980 Ti.


LMAO, never gonna happen. Cherry-pick a title and compare non-OC vs. non-OC, and I am sure it can be made to maybe give that idea.


----------



## ThrashZone

Hi,
I sure loved my EVGA 1080 Classified, but yeah, an 1180 FE card is a complete pass.
I can wait for better air-cooled Ti cards, but I'm not all that upgrade-happy either. I'm sure prices will be insane.


----------



## Hardware Hoshi

The date of July is rumored almost everywhere. If this isn't the usual copy-and-paste journalism, the release of the new GTX generation must be really close.

I am personally not excited about the GTX 1180 because its price tag is probably beyond 800 Euros for early adopters again. To me it's more a matter of technical interest whether, or how, the card beats a current Pascal GTX 1080 Ti or even a Titan. Maybe a GTX 1170 could be interesting, but I fear the price is closer to 500 Euro/Dollar too.

The only thing left for me is a potential GTX 1160 at 1080 levels for my HTPC. Below 120W TDP it would be another killer card to have. I pray to god this will be below 300 Euro/Dollar. IIRC the smaller chips got released later than the big Gx104 ones. Waiting another 4-6 months after the release of the new gen might get really painful.

At least this is something to look forward to.


----------



## iTurn

nycgtr said:


> That's how milking works.
> 
> 
> 
> LMAO, never gonna happen. Cherry-pick a title and compare non-OC vs. non-OC, and I am sure it can be made to maybe give that idea.


Do you have any evidence for this? Why wouldn't the 1180 be 20%+ faster than the 1080 Ti?

The 980 Ti has been eclipsed for a while now, but some people want a huge leap (>50%) from their upgrade, which is understandable.


----------



## Dotachin

Imo it's all about the architecture and its driver optimization life cycle.

If this is another Pascal refresh, I wouldn't buy it even if it's 50% faster than the 1080 Ti. I like milking my GPU for around 3-4 years (currently on a 980 Ti).
I have concluded that the best way to achieve this is getting the first wave of GPUs from a new architecture. I got lucky with my 980 Ti because the 1000 series was another refresh, but I don't expect to be that lucky ever again.


----------



## tconroy135

I'd probably be willing to pay for a Titan Turing, getting the Ti level of performance a year early, but an 1180 FE has no appeal.


----------



## Hardware Hoshi

Dotachin said:


> Imo it's all about the architecture and its driver optimization life cycle.


Don't forget potential features and software achievements. The drivers are only 50% of the software. It's quite exciting to think about what kind of tricks Nvidia could pull off this time.




Dotachin said:


> If this is another Pascal refresh, I wouldn't buy it even if it's 50% faster than the 1080 Ti. I like milking my GPU for around 3-4 years (currently on a 980 Ti).


Well, 50% is quite a number. A normal refresh would bring 10-20% in the best cases. Anything 50%+ is worthy of being called 'a new generation'. At this point we don't know much about what is under the hood. I don't expect another Maxwell/Pascal-based architecture, but something completely different, or let's say at least different enough.

The hardware change to GDDR6 and probably TSMC 12nm FinFET should be enough for Nvidia to tinker around with the architecture.




Dotachin said:


> I have concluded that the best way to achieve this is getting the first wave of GPUs from a new architecture. I got lucky with my 980 Ti because the 1000 series was another refresh, but I don't expect to be that lucky ever again.


A 980 Ti was roughly on par with a GTX 1070. For the games of recent years, that is enough raw power if one does not go UHD/4K. In retrospect, the first buyers of 1070s were pretty lucky. I wish I had bought one when some models were at 380 Euro. Thanks to the mining craze, prices skyrocketed, and we get the same cards 70-120 Euro more expensive.

That is the only hurdle I see for an upcoming generation. All miners should die!


----------



## mouacyk

One can normally get 100% of the stock performance of the next-gen *80 by maximizing the cooling and clocks of a prev-gen *80 Ti. The big win is, as usual, earlier (3-6 months) access to slightly more performance (~15%) at lower power consumption and lower heat output. Of course, the big loss is, as usual, the early adopter's tax premium. If you time it right, you may be able to recoup this by selling the *80 before most are even aware of the *80 Ti's arrival.


----------



## epic1337

Wait, I just noticed part of Tom's sourcing is Wccftech, meh.


----------



## nycgtr

iTurn said:


> Do you have any evidence for this? Why wouldn't the 1180 be 20%+ faster than the 1080 Ti?
> 
> The 980 Ti has been eclipsed for a while now, but some people want a huge leap (>50%) from their upgrade, which is understandable.


They've done this for 2 generations already. What makes you think it won't be a third? There's no competition. NV is out to make money, not give performance handouts.


----------



## epic1337

Anyone know when the other SKUs will launch?


----------



## bucdan

How much do you guys think this will cost?


----------



## epic1337

bucdan said:


> How much do you guys think this will cost?


Tom's source says 1180 = $699, and looking at how much the 1080 FE cost, I don't think it's impossible.
And knowing Nvidia, they'd probably go ahead with $699 and just drop the price a few months later.


----------



## kd5151

6-5-2018


----------



## reqq

Was the 1080 stock cooler quieter than the Ti's? I only need to drive my new 1440p 144Hz screen at medium settings, and I need a blower cooler that is somewhat quiet. This might be the right card for me, even though I'm an AMD fanboy.


----------



## SuperZan

epic1337 said:


> Tom's source says 1180 = $699, and looking at how much the 1080 FE cost, I don't think it's impossible.
> And knowing Nvidia, they'd probably go ahead with $699 and just drop the price a few months later.


I'm not even sure that they'd bother lowering the price. New gen, new price, Nvidia. $849 for an 1180 Ti isn't out of the question, especially knowing how AMD and Nvidia like to spruce up those cross-gen performance graphs.


----------



## mouacyk

SuperZan said:


> I'm not even sure that they'd bother lowering the price. New gen, new price, Nvidia. $849 for an 1180 Ti isn't out of the question, especially knowing how AMD and Nvidia like to spruce up those cross-gen performance graphs.


It's kind of surprising that the *80 Ti MSRP has been consistently $650-$700 for 3 generations, and not closer to their performance-based value compared to the Titans. It's the *80 MSRPs that have ranged from $500 to $700, and more recently tended towards $700.


----------



## chessmyantidrug

The initial launch is typically the XX80 and XX70 cards. The more mainstream XX60 offering is usually a few months later. The 10 series was a bit different with the 1080 having the spotlight to itself for two weeks before the 1070 was released. The 1060 6GB was released a little over a month later with the 1060 3GB a month after that. The Titan X was wedged in between those two. Close to the full spectrum of Pascal cards was out by the end of October with the 1050 and 1050 Ti cards hitting the market. I expect a similar release cadence with the 11 series.

As far as pricing is concerned, I expect an initial price for the GTX 1180 to be around $700 like the GTX 1080 before it. I would not be surprised if the Founder's Edition price is raised $50. I anticipate an MSRP of $599 considering the GTX 980 and GTX 1080 had MSRPs of $499 and $549 respectively. I expect the GTX 1170 to be priced anywhere from $100 to $150 less. Pricing them too close together gives customers less incentive to consider the 1170 while pricing them too far apart does the same to the 1180. There's no way to anticipate how mining might affect pricing, especially with crypto down about 50% compared to four months ago. Deals will be hardest to find early as supply is always short the first few months.


----------



## epic1337

SuperZan said:


> I'm not even sure that they'd bother lowering the price. New gen, new price, Nvidia. $849 for an 1180 Ti isn't out of the question, especially knowing how AMD and Nvidia like to spruce up those cross-gen performance graphs.


The price drop isn't necessarily because they planned to; it depends on the situation.
With Pascal, for example, they lowered the 1080 to $499 ($549 FE) from $599 ($699 FE).

I don't know whether Nvidia plans on pushing the 80 Ti price point a notch higher, but I don't think they'd inflate the 80 too much; $699 should be their target point.


----------



## doom26464

Would the NVENC encoding performance of a GTX 1180 be better than a GTX 1080 Ti's?


----------



## bucdan

epic1337 said:


> The price drop isn't necessarily because they planned to; it depends on the situation.
> With Pascal, for example, they lowered the 1080 to $499 ($549 FE) from $599 ($699 FE).
> 
> I don't know whether Nvidia plans on pushing the 80 Ti price point a notch higher, but I don't think they'd inflate the 80 too much; $699 should be their target point.


I'm sure they have the pricing strategy down and the what-if scenarios, and I bet the cost of each card will pull its value from where they price the Titan, and break it down from there. With nothing from AMD, Nvidia will be pulling good numbers for their stockholders.


----------



## SuperZan

mouacyk said:


> It's kind of surprising *80 Ti MSRP has been consistently $650-$700 for 3 generations, and not closer to their performance-based value compared to the Titans. It's the *80 MSRP's that have ranged from $500 to $700, and more recently tended towards $700.


True. I suppose it's part of the delivery mechanism in that it's generally a price palatable to a lot of the early *80 adopters as well as many previous-gen holdouts and counts as a value proposition relative to the Titan. I'd still think we'd see it move a bit if the 1180 started at $699, just based on the market's behaviour the past couple of years, though by how much is anyone's guess. 



epic1337 said:


> The price drop isn't necessarily because they planned to; it depends on the situation.
> With Pascal, for example, they lowered the 1080 to $499 ($549 FE) from $599 ($699 FE).
> 
> I don't know whether Nvidia plans on pushing the 80 Ti price point a notch higher, but I don't think they'd inflate the 80 too much; $699 should be their target point.


Also true, but Nvidia has even less competition at the high end than they did during the Maxwell era, where Fury X was at least presentable at 4k relative to the Titan, and even before the Vega launch as there was still the off chance that Vega would be more competitive. Right now, Nvidia is painfully aware of their arch lead and probably won't approach so much as mild apprehension again until Navi rumours start swirling. $699 does seem like a realistic ceiling, though. Just saying that I won't be truly surprised unless and until we hit that $1,000 *80.


----------



## littledonny

ENTERPRISE said:


> I always find these versions (Non Ti) kind of painful. I have always preferred to wait until the Ti releases before the hammer comes down. Sometimes I am disappointed with the gains of new non Ti cards against the older gen Ti variants.


Same. I'm content going from 02 chip to 02 chip, even if the new 04 chip is slightly faster/better.


----------



## paulerxx

GTX 1160 @ $250ish around 1070 performance sounds like a steal.


----------



## Hardware Hoshi

paulerxx said:


> GTX 1160 @ $250ish around 1070 performance sounds like a steal.


I wouldn't mind 1080 performance for $300 either


----------



## ZealotKi11er

paulerxx said:


> GTX 1160 @ $250ish around 1070 performance sounds like a steal.


1060 FE = $300.


----------



## epic1337

Hardware Hoshi said:


> I wouldn't mind 1080 performance for $300 either


At that performance, even stretched to $350 it would match the cost-effectiveness of the GTX 1060 3GB.
A 1070-tier card at $250, on the other hand, would be about 20% more cost-efficient than the GTX 1060 3GB.
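Quick napkin math behind that comparison. This is only a sketch: the performance indices and prices are illustrative guesses (1060 3GB = 100), not benchmark numbers.

```python
# Rough perf-per-dollar comparison. The "perf" index is an
# illustrative guess (GTX 1060 3GB = 100), not benchmark data.
cards = {
    "GTX 1060 3GB @ $200":   {"perf": 100, "price": 200},
    "1080-tier 1160 @ $350": {"perf": 175, "price": 350},
    "1070-tier 1160 @ $250": {"perf": 150, "price": 250},
}

# Use the 1060 3GB as the perf-per-dollar baseline.
baseline = cards["GTX 1060 3GB @ $200"]["perf"] / cards["GTX 1060 3GB @ $200"]["price"]

for name, c in cards.items():
    ratio = c["perf"] / c["price"]
    print(f"{name}: {ratio:.2f} perf/$ ({ratio / baseline:.0%} of baseline)")
```

With these guesses, the $350 1080-tier card lands at the same perf/$ as the 1060 3GB, while the $250 1070-tier card comes out about 20% ahead, matching the rough comparison above.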


----------



## Shiftstealth

This seems like the generation for people that upgrade one part a year to go for a CPU. CPUs this year have seen a bigger uplift than GPUs.


----------



## Diffident

I only upgrade every other generation. Having a 970 with a 1440p ultrawide I'm desperately in need of an upgrade.


----------



## SoloCamo

Diffident said:


> I only upgrade every other generation. Having a 970 with a 1440p ultrawide I'm desperately in need of an upgrade.


I'm on a 290X with the reference cooler, clocked to 1120/1500MHz (core 70MHz faster than a stock 390X)... and at 4K I'm dying here. I refuse to upgrade to less than 1080 Ti performance and will not spend more than $500 on a card... I'll probably wait this gen out, too.


----------



## kd5151

AMD R7 260X for now... 1180, I mean 7nm Vega, later.


----------



## animeowns

The 1180 will have to outperform the Titan V for me to buy it, hopefully a 30% increase or something. If it's faster than the Titan Xp, how much faster exactly?


----------



## m4fox90

Nvidia's been milking Pascal for so long that they've come very close to ruining the market for a new GPU. They almost *have* to make the 11X series great, or people are going to shrug and stick with the 10X.


----------



## Threx

mouacyk said:


> It's the *80 MSRP's that have ranged from $500 to $700, and more recently tended towards $700.


Both the 980 and 1080 were $550 MSRP. I wouldn't call that trending towards 700.



Diffident said:


> I only upgrade every other generation. Having a 970 with a 1440p ultrawide I'm desperately in need of an upgrade.


I'm kinda similar to you. I usually only upgrade if it is "2 tiers" above what I have, e.g. I went from 460 -> 660 -> 770 -> 980. I just don't think a one-tier (~20-30%) upgrade is worth it. So going from my current 980 to an 1180 should be a healthy bump.


----------



## Imglidinhere

ENTERPRISE said:


> I always find these versions (Non Ti) kind of painful. I have always preferred to wait until the Ti releases before the hammer comes down. Sometimes I am disappointed with the gains of new non Ti cards against the older gen Ti variants.


I personally feel that GeForce is as confusing as it gets. There's almost a Ti model for literally every iteration of every card. I mean hell, we have a 1080 Ti, a 1070 Ti, a 1050 Ti... I could honestly understand when it was the GTX 560 and 560 Ti; that made sense. But having a Ti version to signify the 'in-between' or 'upgraded' version of a card is kinda silly, especially when the Ti of one card and its immediate successor have single-digit percent performance differences. It's gotten out of hand.

What's even funnier to me is that Nvidia claimed they didn't like how GTX became so mainstream, so to fix the issue they released the Titan series... and then they literally just slapped GTX onto some of the later Titan models. So it was GeForce GTX Titan.


It would be nice if they could get their marketing strategy straight for once. The Ti branding was cool back in the GeForce4 Ti days. Every card in that line used 'Ti' in the name, so it read Ti 4200, Ti 4400, etc. It was the GTX of its time, except that was back when they changed the naming scheme to keep things fresh and make their products appear flashy with a new name. Now? It's all the same and has been for about 6-7 generations and refreshes. Time for a change.


----------



## mouacyk

Threx said:


> Both the 980 and 1080 were $550 MSRP. I wouldn't call that trending towards 700.


Uh... the 1080 was as high as $700 for the FE. $550 came only after a later price cut from $600 for non-FE models. In actuality, there was such a shortage at launch that pricing was usually higher than MSRP for all non-FE cards.


----------



## steelbom

The performance of a 1080 will be fine for me on my 3440x1440 UW. I'm making do with an RX 480 atm... I'd upgrade, but I don't currently want to pay $775 AUD for the blower-style 1080. Maybe $600... so I hope new GPUs come out and prices drop...


----------



## 1Kaz

Waiting on an 1180TI. Already have the money set aside, but it's hard to justify spending it on a moderate upgrade when my 970 still plays everything decently well. I game at 1440P 120hz. I do turn the graphics down for FPS, but I've always been a function over form guy. 

On a lot of competitive games I actually turn the graphics down because it's less distraction for the eye. I'll be looking for decent 120 hz 4K displays during the next year.


----------



## epic1337

mouacyk said:


> Uh... the 1080 was as high as $700 for the FE. $550 came only after a later price cut from $600 for non-FE models. In actuality, there was such a shortage at launch that pricing was usually higher than MSRP for all non-FE cards.


I remember non-FE cards being even more expensive than the FE.


----------



## Threx

mouacyk said:


> Uh... the 1080 was as high as $700 for the FE. $550 came only after a later price cut from $600 for non-FE models. In actuality, there was such a shortage at launch that pricing was usually higher than MSRP for all non-FE cards.


I wasn't talking about actual market prices, I was talking about MSRP because that was the term you used.


----------



## Hardware Hoshi

epic1337 said:


> At that performance, even stretched to $350 it would match the cost-effectiveness of the GTX 1060 3GB.
> A 1070-tier card at $250, on the other hand, would be about 20% more cost-efficient than the GTX 1060 3GB.


The cost perspective is not the only way to view a new graphics card series.

There are a lot of gamers with small cases in ITX or HTPC formats. Small and powerful cards are trendy for those sizes; it's closer to 'the more the better'. A card with 1070 performance would probably be a cut-down chip too. The GTX 1060 3GB was such a card, but this one would probably come with more VRAM. We will have to wait for details until roughly half a year after the release of the x80 series. A few dollars more for a much better package is acceptable for many buyers, all within reason of course!

To me the GTX/GTS 1150 to 1160 (Ti) series is the most exciting news.


----------



## mouacyk

Threx said:


> I wasn't talking about actual market prices, I was talking about MSRP because that was the term you used.


Well, maybe you should, because the MSRP (the suggested retail price for the FE was $699) and the actual pricing of everything turned out even higher due to the shortage. The $550 you stated came only months later, when supply stabilized, and only for non-FE cards.

Google's top result: https://www.google.com/search?clien.......0...1c..64.psy-ab..0.0.0....0.2UpRSCRcC_4



> The GTX 1080 and 1070 are both available (at least theoretically) in a standard configuration and in a “Founders Edition” with better cooling. For the GTX 1080, the baseline MSRP is $599 and $699 respectively, while the GTX 1070 is supposed to be priced at $379 and $449 respectively. (Jun 16, 2016)


----------



## BigMack70

Do we have any credible info if this is going to be a big chip release like the 980 Ti/1080 Ti or is this a midrange chip like the 980/1080?


----------



## Hardware Hoshi

BigMack70 said:


> Do we have any credible info if this is going to be a big chip release like the 980 Ti/1080 Ti or is this a midrange chip like the 980/1080?


No reliable information so far, but if it is a release similar to both Maxwell and Pascal, this 1180 is another Gx104 midrange chip with up to a 400mm² die size. Practically middle-class on steroids. There will probably be a GTX 1180 Ti with an oversized Gx102 chip again, though at a later date. I wouldn't expect this monster before early 2019.

Nvidia will milk the x70 and x80 series first.


----------



## SuperZan

BigMack70 said:


> Do we have any credible info if this is going to be a big chip release like the 980 Ti/1080 Ti or is this a midrange chip like the 980/1080?


We don't have any concrete proof, but unless Nvidia drastically changes their release cadence, it's going to be a Gx104 chip. We won't see Gx102 for gamers until a few months after that, in the form of a Titan; Gx102 in Ti form will arrive six months or so after that.


----------



## epic1337

Yeah, most of the rumors point towards GT104, so there's still room for a GT102 for the 1180 Ti and Titan.


----------



## the9quad

Put me in the wait for a 3rd party Ti camp.


----------



## chessmyantidrug

I just recently went from a GTX 970 to a 1070 because I was tired of running into VRAM issues while playing FFXV. The performance difference was a bit more than I anticipated, but I'm still awaiting the 11-series. If the GTX 1170 is enough of a bump for ~$400, I'll definitely consider it. Release prices don't matter that much to me because I'm never in a hurry to upgrade. I'm perfectly content waiting for the market to come to me.

I expect the GTX 1160 to be roughly GTX 1070 performance, probably a bit better. If the GTX 960 and 1060 before it are any indication, that's an acceptable expectation. If they have two versions of the 1160, hopefully they actually differentiate them accurately. The 3GB version should have simply been called the GTX 1060 while the 6GB version should have been called the GTX 1060 Ti. At least people would have expected and understood a performance difference between the two instead of creating confusion.


----------



## epic1337

chessmyantidrug said:


> I expect the GTX 1160 to be roughly GTX 1070 performance, probably a bit better. If the GTX 960 and 1060 before it are any indication, that's an acceptable expectation. If they have two versions of the 1160, hopefully they actually differentiate them accurately. The 3GB version should have simply been called the GTX 1060 while the 6GB version should have been called the GTX 1060 Ti. At least people would have expected and understood a performance difference between the two instead of creating confusion.


If they'd launch a GTX 1160 Ti, I wonder which die it would use: whether it's binned from the 1180 die or a full GT106 die.


----------



## chessmyantidrug

I would expect it to be a 106 die. Considering the 980 and 980 Ti and 1080 and 1080 Ti don't have the same dies, that isn't a requirement for their naming scheme. It would be nice if they at least stayed consistent across their entire lineup. I wonder if that was an objective of GPP.


----------



## epic1337

chessmyantidrug said:


> I would expect it to be a 106 die. Considering the 980 and 980 Ti and 1080 and 1080 Ti don't have the same dies, that isn't a requirement for their naming scheme. It would be nice if they at least stayed consistent across their entire lineup. I wonder if that was an objective of GPP.


Well, the GTX 660 Ti is based on GK104.

Edit: now that I think about it, the GTX 660 Ti is the last non-OEM x60 Ti.


----------



## Threx

mouacyk said:


> Well, maybe you should, because the MSRP (the suggested retail price for the FE was $699)


No, I shouldn't. MSRP is MSRP. Market price is market price. They are different. If you want to talk about actual market price, then say actual market price, not MSRP.

And the $550 number is also from Google's top result.

But regardless, even at $599 it is not "trending closer to $700."


----------



## tpi2007

chessmyantidrug said:


> I just recently went from a GTX 970 to a 1070 because I was tired of running into VRAM issues while playing FFXV. The performance difference was a bit more than I anticipated, but I'm still awaiting the 11-series. If the GTX 1170 is enough of a bump for ~$400, I'll definitely consider it. Release prices don't matter that much to me because I'm never in a hurry to upgrade. I'm perfectly content waiting for the market to come to me.
> 
> I expect the GTX 1160 to be roughly GTX 1070 performance, probably a bit better. If the GTX 960 and 1060 before it are any indication, that's an acceptable expectation. If they have two versions of the 1160, hopefully they actually differentiate them accurately. The 3GB version should have simply been called the GTX 1060 while the 6GB version should have been called the GTX 1060 Ti. At least people would have expected and understood a performance difference between the two instead of creating confusion.



Agreed on the GTX 1060 naming, but when it comes to the performance of the future 1160 / 2060, expecting roughly 1070 performance would be a disappointment, especially after two years of waiting. The GTX 1060 is on par with the GTX 980, so that's what I'm expecting.


----------



## epic1337

It would depend on the price point: if they put the 1160 at $300-$350, then it should be at least 1070 Ti performance, if not 1080.
But if it's $200-$250, then 1070 performance is quite acceptable; supposedly $200 would be the "lowest" it would go.


----------



## Kaltenbrunner

I'm delighted with the high prices. I plan to buy 100x 1180 for mining, and a few more to use as ashtrays, dish scrubber holders, etc


The sad tears of geeks makes me powerful hahahahahahaha


----------



## SuperZan

That's cool, man. You're cool.


----------



## chessmyantidrug

tpi2007 said:


> Agreed on the GTX 1060 naming, but when it comes to the performance of the future 1160 / 2060, expecting roughly 1070 performance would be a disappointment, especially after two years of waiting. The GTX 1060 is on par with the GTX 980, so that's what I'm expecting.


I didn't realize the 1060 was closer to the 980 than 970. Definitely well placed between the two. Like @epic1337 mentioned, price point will definitely be important. The 1060 3GB is slightly better than the 970. If they have two versions of the 1160 and the weaker one is closer to the 1070 and the stronger one closer to the 1080, that makes the most sense.


----------



## Threx

chessmyantidrug said:


> I didn't realize the 1060 was closer to the 980 than 970. Definitely well placed between the two. Like @epic1337 mentioned, price point will definitely be important. The 1060 3GB is slightly better than the 970. If they have two versions of the 1160 and the weaker one is closer to the 1070 and the stronger one closer to the 1080, that makes the most sense.


Yeah, the 1080 is a tier above the 980 Ti, the 1070 is about the same tier as the 980 Ti, and the 1060 is the same tier as the 980.

So if the 1180 is a tier above the 1080 Ti, which I expect it to be, then it would make sense that the 1170 will be at about 1080 Ti performance, and the 1160 at about 1080 performance. Give or take a handful of fps.


----------



## Cloudforever

ENTERPRISE said:


> I always find these versions (Non Ti) kind of painful. I have always preferred to wait until the Ti releases before the hammer comes down. Sometimes I am disappointed with the gains of new non Ti cards against the older gen Ti variants.


Agreed.

I don't know why they make us wait this long for the Ti series lol, *stomps feet* "But I want it nowwwwww"


----------



## jprovido

time to sell my 1080 Ti's?


----------



## epic1337

jprovido said:


> time to sell my 1080 Ti's?


4 months later, yes.


----------



## PriestOfSin

I'm always tempted when the new x80 cards come out, but the past two gens I've waited for the Ti versions and have not been disappointed. Ti versions tend to age better as well, imo, with my 980 Ti still happily chugging along in the wife's PC.

It'd be great if we could get something nice from the red team, but I guess I'll keep dreaming.


----------



## Owneth

Same here, I do 2-3 product cycles, then upgrade.


----------



## GHADthc

I'll just sit pretty on this 1080 Ti, water-blocking it soon and flashing it with the modded XOC BIOS.

It should tide me over till something more compelling comes out, i.e. an 1180 Ti or whatever RTG ham-fists out into the market (unless AMD has finally given them an actual budget to work with).

I'm even more inclined to just skip this next generation, unless something compelling actually happens in the performance they exhibit.

I'm still internally hyped for the idea of MCM GPUs, but it seems they've been pushed back further into the future.


----------



## evensen007

Maybe it's because I'm "only" running at 3440x1440 resolution, but doesn't it seem like software has hit a wall and isn't really taxing the hardware as much anymore? Strictly speaking of gaming, the days of Crysis bringing 2-3 generations of video cards to its knees seem to be long gone. I can't imagine needing to upgrade my 1080 Ti anytime within the next 2 years.


----------



## Woundingchaney

evensen007 said:


> Maybe it's because I'm "only" running at 3440x1440 resolution, but doesn't it seem like software has hit a wall and isn't really taxing the hardware as much anymore? Strictly speaking of gaming, the days of Crysis bringing 2-3 generations of video cards to its knees seem to be long gone. I can't imagine needing to upgrade my 1080 Ti anytime within the next 2 years.


Hardware demands in gaming are generally tied to console cycles. Honestly, as PC gamers we have been fortunate because the Xbone and PS4 released with underpowered hardware for their time frame. I don't expect this trend to end anytime soon. I upgrade often for a PC gamer, and whereas I want a new GPU, there is little reason for me to get one unless I start looking at 4K above 60fps or we see more demanding titles release (both seem unlikely at this point).


----------



## JackCY

They forgot to add 2019 as the year. There would have to be massive gains for people to even bother switching to newer from Maxwell and Paxwell.


----------



## epic1337

JackCY said:


> They forgot to add 2019 as the year. There would have to be massive gains for people to even bother switching to newer from Maxwell and Paxwell.


ummm, 980 (Maxwell) to 1180 (Turing?) would be at least a 100% increase, i don't think that's anything less than "massive"? or were you expecting at least 300% gains or something?


----------



## Falkentyne

evensen007 said:


> Maybe it's because I'm "only" running at 3440x1440 resolution, but doesn't it seem like software has hit a wall and isn't really taxing the hardware as much anymore? Strictly speaking of gaming, the days of Crysis bringing 2-3 generations of video cards to its knees seem to be long gone. I can't imagine needing to upgrade my 1080ti anytime within the next 2 years.


You can bring any GPU to its knees just by enabling full 4x SSAA supersampling.
Try 200% render scale in Overwatch with ultra details and watch your GPU scream for mercy.


----------



## ZealotKi11er

Falkentyne said:


> You can bring any GPU to its knees just by enabling full 4x SSAA supersampling.
> Try 200% render scale in Overwatch with ultra details and watch your GPU scream for mercy.



Within reasonable settings. I only care to upgrade once the GPU can't handle game settings that affect IQ significantly.


----------



## thebski

ENTERPRISE said:


> I always find these versions (Non Ti) kind of painful. I have always preferred to wait until the Ti releases before the hammer comes down. Sometimes I am disappointed with the gains of new non Ti cards against the older gen Ti variants.


If you are in the market for a new card every year, I can understand. For example, 980 from 780 Ti was underwhelming. However, if you buy every two years (I generally do), to me, the Ti's make no sense.

Going from 04 to next gen 04 is often almost an identical performance increase to going from 00/02 to next gen 00/02. For example, using AnandTech's reviews, the average increase in their tests from 980 to 1080 was 66% at 1440P and 71% at 4K. Going from 980 Ti to 1080 Ti netted them 68% at 1440P and 75% at 4K. The increases from gen to gen are almost identical, whether you're talking mid dies or big dies. Why not go with the mid dies, since they are cheaper, use less power, and run cooler and quieter?

Then again, as I type this, I am less convinced the Ti's have much of a place. As in your case, you usually skip the 80's because the jump from the previous Ti isn't much. That puts you in the every two year camp, which I explained above. The only person the Ti makes sense for, to me, is someone who is buying the fastest card out every single year. The Ti's get that crown every other year.

I'm not trying to convince you to buy the 80's. I am not too concerned with what anyone else chooses to use their money on. However, for me, the Ti just doesn't make much sense. Crazy considering only a few years ago I was only interested in the big die cards. I will buy this card to replace my 1080 given that it is not some stupid price like $800+ (it probably will be). Even $700 and I may run my 1080 another round.
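For what it's worth, the gen-over-gen comparison above is just relative-uplift arithmetic. A minimal sketch, with made-up fps figures standing in for the review data (none of these values are actual AnandTech numbers):

```python
def uplift(old_fps: float, new_fps: float) -> float:
    """Percent performance increase going from old_fps to new_fps."""
    return (new_fps / old_fps - 1.0) * 100.0

# Hypothetical 1440p averages, shaped loosely like the figures discussed above.
fps = {"980": 60.0, "1080": 99.6, "980 Ti": 78.0, "1080 Ti": 131.0}

print(f"980   -> 1080:   +{uplift(fps['980'], fps['1080']):.0f}%")
print(f"980Ti -> 1080Ti: +{uplift(fps['980 Ti'], fps['1080 Ti']):.0f}%")
```

With these inputs both jumps land in the high-60s percent, which is the shape of the argument: the mid-die and big-die generational steps track each other.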


----------



## rluker5

thebski said:


> If you are in the market for a new card every year, I can understand. For example, 980 from 780 Ti was underwhelming. However, if you buy every two years (I generally do), to me, the Ti's make no sense.
> 
> Going from 04 to next gen 04 is often almost an identical performance increase to going from 00/02 to next gen 00/02. For example, using AnandTech's reviews, the average increase in their tests from 980 to 1080 was 66% at 1440P and 71% at 4K. Going from 980 Ti to 1080 Ti netted them 68% at 1440P and 75% at 4K. The increases from gen to gen are almost identical, whether you're talking mid dies or big dies. Why not go with the mid dies, since they are cheaper, use less power, and run cooler and quieter?
> 
> Then again, as I type this, I am less convinced the Ti's have much of a place. As in your case, you usually skip the 80's because the jump from the previous Ti isn't much. That puts you in the every two year camp, which I explained above. The only person the Ti makes sense for, to me, is someone who is buying the fastest card out every single year. The Ti's get that crown every other year.
> 
> I'm not trying to convince you to buy the 80's. I am not too concerned with what anyone else chooses to use their money on. However, for me, the Ti just doesn't make much sense. Crazy considering only a few years ago I was only interested in the big die cards. I will buy this card to replace my 1080 given that it is not some stupid price like $800+ (it probably will be). Even $700 and I may run my 1080 another round.


But the 80's are basically a year late ti + a little bit. Your same argument holds with buying older used cards at a steep discount. The ti's provide the highest reasonable performance at the time when they are sold. Right now that is 4k,60 and high framerate 1440 gaming. Maybe the next ones will be 4k,90 and the 4k,60 and high framerate 1440 players won't want them as much. 

If you count in depreciation of your cards, the 80's have poor bang for your buck. Take a $650 980ti that loses about a third of its useful life in premium gaming in the year it takes for the 20% better 1080 to come out at the same price. Or the same scenario with the 1080ti and 1180. And you are losing a year's worth of better performance on top of that.

But this is dependent on your gaming desires and budget too. If you have a total 1k budget and game at 1080,60 then the ti is a waste of money. 
But if you want all of the fanciness now, the 80s are the waste.


And the big dies often come with an AIO nowadays for not much more money, which is both cooler and quieter than air-cooled little dies.
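The depreciation argument above is easy to sanity-check with rough numbers. A hedged sketch; the price and resale figures are illustrative assumptions, not market data:

```python
def cost_per_year(price: float, resale: float, years: float) -> float:
    """Effective yearly cost of owning a card: depreciation spread over the holding period."""
    return (price - resale) / years

# Illustrative figures only: a $650 980 Ti that loses roughly a third of its
# value in the year before a same-priced, ~20%-faster 1080 arrives.
first_year = cost_per_year(650, 650 * 2 / 3, 1)
print(f"980 Ti held one year:  ~${first_year:.0f}/yr in depreciation")

# Versus buying the same card and holding it two full years at, say, half resale value.
two_years = cost_per_year(650, 325, 2)
print(f"980 Ti held two years: ~${two_years:.0f}/yr in depreciation")
```

On these made-up numbers, flipping the Ti after one year costs noticeably more per year than holding it longer, which is the trade-off being argued from both sides here.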


----------



## thebski

rluker5 said:


> But the 80's are basically a year late ti + a little bit. Your same argument holds with buying older used cards at a steep discount. The ti's provide the highest reasonable performance at the time when they are sold. Right now that is 4k,60 and high framerate 1440 gaming. Maybe the next ones will be 4k,90 and the 4k,60 and high framerate 1440 players won't want them as much.
> 
> If you count in depreciation of your cards, the 80's have poor bang for your buck. Take a $650 980ti that loses about a third of its useful life in premium gaming in the year it takes for the 20% better 1080 to come out at the same price. Or the same scenario with the 1080ti and 1180. And you are losing a year's worth of better performance on top of that.
> 
> But this is dependent on your gaming desires and budget too. If you have a total 1k budget and game at 1080,60 then the ti is a waste of money.
> But if you want all of the fanciness now, the 80s are the waste.
> 
> 
> And the big dies often come with an AIO nowadays for not much more money, which is both cooler and quieter than air-cooled little dies.


I would agree with you, but it's based on your first sentence, which for Pascal wasn't true. Again going back to Anand's numbers and looking at Witcher 3 specifically simply because it closely represents their average results across all games, 980 Ti to 1080 was 31% at 1080 launch, and 1080 to 1080 Ti was 30.6% at 1080 Ti launch, all based on 1440P. Of course the exact numbers depend on the game and resolution, but the point is, the increments were very similar. They released a ~30% faster card each year. As far as Pascal goes, if you're going to say that the 1080 was just a 980 Ti plus a little bit, then the 1080 Ti was just a 1080 plus a little bit.


----------



## Blameless

evensen007 said:


> Maybe it's because I'm "only" running at 3440x1440 resolution, but doesn't it seem like software has hit a wall and isn't really taxing the hardware as much anymore? Strictly speaking of gaming, the days of Crysis bringing 2-3 generations of video cards to its knees seem to be long gone. I can't imagine needing to upgrade my 1080ti anytime within the next 2 years.


Conversely, I can't imagine ever not wanting much more. There is almost always more eye candy that can be added, or higher frame rates to target. The graphics setup I want at any given moment won't exist for another 3-5 years.

As it stands now, the most demanding game I play was released in 2014, isn't generally considered exceptionally demanding, and at the settings I use I can still see my frame rates dip into the 40s on my OCed 1080 Ti on a single 2560*1440 display.


----------



## etrin

you just wait till amd shows up
They will have rebadged junk and call it new.


----------



## SuperZan

New Achievement Earned! 

Keenest Blade’s Edge(lord)


----------



## epic1337

etrin said:


> you just wait till amd shows up
> They will have rebadged junk and call it new.


at least there were tweaks to it. i remember the 400 series had some serious thermal issues while the 500 series didn't, making the 500 series a bit faster across the board.

on the other hand, Vega has been out for quite a while now yet they still haven't released the rest of the lineup.


----------



## rluker5

thebski said:


> I would agree with you, but it's based on your first sentence, which for Pascal wasn't true. Again going back to Anand's numbers and looking at Witcher 3 specifically simply because it closely represents their average results across all games, 980 Ti to 1080 was 31% at 1080 launch, and 1080 to 1080 Ti was 30.6% at 1080 Ti launch, all based on 1440P. Of course the exact numbers depend on the game and resolution, but the point is, the increments were very similar. They released a ~30% faster card each year. As far as Pascal goes, if you're going to say that the 1080 was just a 980 Ti plus a little bit, then the 1080 Ti was just a 1080 plus a little bit.


The 980ti also overclocks better, and if you go by Anand's review of the 980ti and compare oc to oc, the 1080 is ahead by less than 20%. The jump is bigger from 80 to ti than ti to next 80. But the next 80 is newer and supports some as of yet undisclosed features to make it more attractive. And factor in your resolution and budget preferences and I guess it is case by case. Guess I like mine and you like yours. I do hope the 1180 will be awesome though.


----------



## Ownedj00

rluker5 said:


> The 980ti also overclocks better, and if you go by Anand's review of the 980ti and compare oc to oc, the 1080 is ahead by less than 20%. The jump is bigger from 80 to ti than ti to next 80. But the next 80 is newer and supports some as of yet undisclosed features to make it more attractive. And factor in your resolution and budget preferences and I guess it is case by case. Guess I like mine and you like yours. I do hope the 1180 will be awesome though.


But that jump from a 980ti to an 1180 should be great enough that I won't have to wait for the 1180ti version to get a worthwhile increase in performance. This is what I'm doing, as I don't really need the ti version, and maybe from now on I'll upgrade every new gen.


----------



## Lass3

ENTERPRISE said:


> I always find these versions (Non Ti) kind of painful. I have always preferred to wait until the Ti releases before the hammer comes down. Sometimes I am disappointed with the gains of new non Ti cards against the older gen Ti variants.


Well I used to think that too, but I'm starting to think going non-Ti is better, since I don't upgrade as often anymore. Used to upgrade every year... Not anymore.

The Ti launches around 1 year later than the x80 non-Ti, and the next step after the Ti is a new arch, meaning driver focus from Nvidia will not be a priority anymore.

Going with the smaller chip means a cheaper card, cooler and quieter operation with less wattage used, and longer focus from Nvidia -> you are on the primary arch from start to finish.


----------



## thebski

rluker5 said:


> The 980ti also overclocks better, and if you go by Anand's review of the 980ti and compare oc to oc, the 1080 is ahead by less than 20%. The jump is bigger from 80 to ti than ti to next 80. But the next 80 is newer and supports some as of yet undisclosed features to make it more attractive. And factor in your resolution and budget preferences and I guess it is case by case. Guess I like mine and you like yours. I do hope the 1180 will be awesome though.


It definitely depends on ones situation and of course what the new architecture has to offer.


----------



## astrallite

rluker5 said:


> The 980ti also overclocks better, and if you go by Anand's review of the 980ti and compare oc to oc, the 1080 is ahead by less than 20%. The jump is bigger from 80 to ti than ti to next 80. But the next 80 is newer and supports some as of yet undisclosed features to make it more attractive. And factor in your resolution and budget preferences and I guess it is case by case. Guess I like mine and you like yours. I do hope the 1180 will be awesome though.


Also, the Maxwell 980 Ti was almost always thermally limited in terms of overclocking, whereas Pascal pretty much brick-walls around 2GHz even on water. The top overclocking 980 Tis, with the best aftermarket coolers and high ASIC quality, could OC to around ~1575MHz, some even to ~1600MHz. If you bought a high-end 980 Ti you could be looking at no more than a 12-13% improvement from 980 Ti to 1080, OC vs OC.


----------



## astrallite

thebski said:


> It definitely depends on ones situation and of course what the new architecture has to offer.


Well, we can already see the improvements even without gaming drivers. The Titan V has 33% more cores than the Titan Xp but in some low-level API games hits ~40% better fps. So I believe there's going to be at least a per-core efficiency increase of around ~10% in async/DX12/Vulkan situations. Perhaps with gaming drivers we might get that carried over to DX11 perf too.


----------



## SwitchFX

evensen007 said:


> Maybe it's because I'm "only" running at 3440x1440 resolution, but doesn't it seem like software has hit a wall and isn't really taxing the hardware as much anymore? Strictly speaking of gaming, the days of Crysis bringing 2-3 generations of video cards to its knees seem to be long gone. I can't imagine needing to upgrade my 1080ti anytime within the next 2 years.


True, but may I point you toward Assassin's Creed: Black Flag? Even with patches and user workarounds, it still performs terribly.


----------



## Lass3

astrallite said:


> Also Maxwell 980Ti was almost always thermally limited in terms of overclocking, whereas Pascal is pretty much brick walls around 2GHz even on water. The top overclocking 980 Tis with the best aftermarket coolers with high ASICs could OC to around ~1575MHz, some even to ~1600MHz. If you bought a high end 980 Ti you could be looking at no more than 12-13% improvement from 980 Ti to 1080, OC vs OC.


You are right - My 980 Ti at 1580 performs like a 1080 FE in 9 out of 10 games. Even beats it in some. 1080 pulls slightly ahead with OC, but not more than 10% at best.

I'd have picked up a 1080 Ti but didn't really have a reason to. My 980 Ti holds up very well and still runs pretty much everything at 1440p maxed, probably because Maxwell is so closely related to the Pascal arch and received pretty much the same tweaks driver-wise. Now I'm waiting for the 1180/2080 instead. Going small chip this time.


----------



## DrFPS

etrin said:


> you just wait till amd shows up
> They will have rebadged junk and call it new.


AMD? They used to make GPUs, a long time ago.


----------



## d5aqoep

Sitting tight on my 1080ti. Graphics have not actually improved over the past 2-3 years. It all feels like it's at the same level to me. Then you have that one weird setting which does nothing for PQ but cuts the framerate by 30%, and all the geeks have eyegasms because their burnt-smelling overclocked cards can push 1fps more.


----------



## epic1337

DrFPS said:


> AMD? They used to make GPUs, a long time ago.


that was ATI, AMD is just an investor.


----------



## doom26464

Not sure how people are able to get a 980ti past 1500MHz without extensive modding/cooling. I have had 3 980tis and all of them sit between 1400MHz-1450MHz. While it helps close the gap on a GTX 1080, it's not enough to match it.

Unless people are just playing games that don't stress the card much. PUBG plus NVENC encoding pretty much puts my 980ti under max load all the time.


----------



## Sir Beregond

Slaughtahouse said:


> While I do agree to an extent, the non ti versions typically have lower power draws.(GTX 1080 TDP 180W vs GTX 1080Ti 250W).
> 
> Given this is an enthusiast forum, I understand that most users don't factor that it but it does make a difference. Granted, if I knew a GTX 780Ti would of been released, I would of purchased that instead of a GTX 780 because I really wanted all the performance I could get. However, that is when Nvidia started this trend of releasing Titan's and X80ti's.
> 
> Going forward, I think I will stick to the non-Ti's. Just so I am dumping less wattage into my w/c loop. But to each their own


Yeah, I wish I had waited for a 980 Ti, but how was I to know that was their new release strategy, with only the 780 Ti release before it? Now it's been 3 gens of that release strategy, so it goes without saying at this point.


----------



## Lass3

doom26464 said:


> Not sure how people are able to get 980ti past 1500mhz without extensive modding/cooling. I have had 3 980ti and all of them sit between 1400mhz-1450mhz. While it helps close the gap on a gtx 1080 its not enough to match it.
> 
> Unless people are just playing games that dont stress the card much. Pubg and using NVENC encoding pretty much puts my 980ti under max load all the time.


I use custom firmware for 1580 MHz. I could do 1480 MHz on stock firmware.
Most 980 Ti's with stock firmware will do 1450-1500 MHz in my experience. 1380-1400 MHz is pretty much the out-of-the-box boost for good custom cards.


----------



## cjc75

Wow... September?
I don't know if I can wait that long....
I'm currently using an aging eVGA GTX 770 4GB FTW, and while it's a heck of a workhorse, it's starting to show its age; especially when I try to push my monitor at 1440p... and I fear it won't last much longer.
I may have an opportunity to pick up an eVGA GTX 1080 FTW2 within the next couple of weeks and I've been contemplating doing it, while considering "what if the 1180 comes out in July? I could get the 1080 and use eVGA's Step-Up to get the 1180..."
But that of course won't work if the 1180 doesn't come out until September.
...I'd be stuck with the 1080.


----------



## merlin__36

Cannot wait to see some new cards. Always pushing the limit.


----------



## keikei

NVIDIA Next Generation Mainstream GPU Detailed in August


----------



## czin125

Could this possibly clock a bit higher with the improved process? It does seem to maintain an equally high boost clock while the core count increased from P100 to V100.


----------



## mouacyk

czin125 said:


> Could this possibly clock a bit higher with the improved process? It does seem to maintain equally high boost while the cores increased from P100 to V100


Clocks did seem to increase from Maxwell (GM200) to Pascal (GP102) roughly in inverse proportion to the lithography node. However, Titan V did not see a 25% increase in clocks... could be a variety of reasons though. From what I've gathered, lithography naming isn't consistent, so a smaller process change will likely yield smaller benefits than a larger process change.
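To put rough numbers on that, here's a quick sketch comparing reference boost clocks against nominal node sizes (the clock and node figures are approximate public specs, and node names are partly marketing, so treat the ratios as ballpark only):

```python
# Reference boost clocks (MHz) and nominal process nodes (nm) -- ballpark public specs.
cards = {
    "980 Ti (GM200, 28nm)": (1075, 28),
    "1080 Ti (GP102, 16nm)": (1582, 16),
    # TSMC "12nm FFN" is a refined 16nm, so the nominal shrink overstates the change.
    "Titan V (GV100, 12nm)": (1455, 12),
}

base_clock, base_node = cards["980 Ti (GM200, 28nm)"]
for name, (clock, node) in cards.items():
    print(f"{name}: clock x{clock / base_clock:.2f}, nominal shrink x{base_node / node:.2f}")
```

The Pascal jump pairs a large clock gain with a large nominal shrink, while the "12nm" Volta part actually clocks lower than GP102, illustrating why a small process change can't be assumed to bring proportional clock gains.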


----------



## Woundingchaney

How relevant is SLI these days? The only way I see this offering being a viable upgrade is if I run an mGPU solution.


----------



## mouacyk

Woundingchaney said:


> How relevant is SLI these days? The only way I see this offering being a viable upgrade is if I run a mgpu solution.


Really not at all: https://uk.hardware.info/reviews/81...nchmarks-hardwareinfo-gpu-prestatiescore-2018


----------



## chessmyantidrug

SLI support varies from game to game. Graphics processors are so powerful these days that multi-GPU solutions aren't really needed unless you're trying to game at 8K or something silly.


----------



## Kokin

chessmyantidrug said:


> SLI support varies from game to game. Graphics processors are so powerful these days that multi-GPU solutions aren't really needed unless you're trying to game at 8K or something silly.


4K 144Hz is right around the corner as well as 3440x1440 200Hz later this year. My single 1080Ti barely does 90-120FPS (AA off) for my 3440x1440 120Hz monitor, multi-GPU would be great for those higher refresh rates.

I'm honestly just waiting for Intel 8c/16t or next-gen Ryzen to upgrade from my 3570K from 2012!


----------



## looniam

keikei said:


> NVIDIA Next Generation Mainstream GPU Detailed in August


unfortunately it's now changed
https://www.hotchips.org/program/

(i took a nice snip showing TBD but can't upload it, so google photo)


----------



## Raficoo

looniam said:


> unfortunately it's now changed
> https://www.hotchips.org/program/
>
> (i took a nice snip showing TBD but can't upload it, so google photo)


I suppose the question now is whether Nvidia really decided to change their schedule, or whether the previous mention of the event was in violation of some NDA regarding the new cards and they had to "fix" the mistake. I sure do hope it's the latter lol


----------



## ExoticallyPure

looniam said:


> unfortunately it's now changed
> https://www.hotchips.org/program/
>
> (i took a nice snip showing TBD but can't upload it, so google photo)


It was likely changed by NVIDIA due to speculation, but such activity is largely irrelevant. The new lineup release date is not far away now.


----------



## Swolern

Love new top tier GPU releases! Ray tracing please!!!!


----------



## crpcookie

Computex:
Q: When is the next GeForce coming to market?
Jensen: A long time from now...


----------



## mouacyk

Swolern said:


> Love new top tier GPU releases! Ray tracing please!!!!


It will be interesting to see how much of the current shader pipeline can be adapted to ray tracing on consumer Volta. We did see a bit of ray tracing already on Pascal with voxel-based global illumination (VXGI).

https://arstechnica.com/gaming/2018...just-how-great-real-time-raytracing-can-look/


> Getting a “cinematic” 24fps with real-time raytracing still requires some serious hardware: it’s currently running on Nvidia’s ultra-high-end, four-GPU DGX Station, which lists for $60,000 [Update: After publication, Epic reached out to clarify that the demo was running on a DGX Station, and not a DGX-1 as originally stated during the interview.] . Even with that, some elements of the scene, like the walls, need to be rasterized rather than made fully reflective, Libreri told Ars. And the Volta technology that is key to powering this kind of performance isn't even available in consumer-grade GPUs below the $3,000+ Titan V, so don't expect this kind of scene to run on your home gaming rig in the near-term.


----------



## EastCoast

> Back to the topic though, when Jensen Huang was asked when gamers can expect a Volta based GeForce card or whatever it'll be called, he answered that prices of the existing Pascal video cards had dropped to a regular level for some time and that gamers could again buy a GeForce GTX 1070, 1080 or 1080 Ti, saying that they were the best video cards that you as a gamer can buy.
> Another editor asked for a small hint on the topic, Huang answered. "Do not worry, I will invite you", followed by "It'll be a long time from now". Since the question was slightly specific on Volta, his answer by itself does not eliminate the option of a pascal refresh.


http://www.guru3d.com/news-story/nv...w-gamers-geforce-is-a-long-time-from-now.html




Interesting. I do believe they have no plans for a Volta-like gaming video card any time soon. I also believe that a refresh is very possible. Not because of AMD's competitiveness, mind you; AMD is both a CPU and GPU company now, and gone are the days of ATI, which was just a video card company like Nvidia. But also because Nvidia isn't just a gaming video card company anymore.

I've said it once and I'll say it again: gamers are a low priority, as Nvidia doesn't see gamers as its only customers anymore. I wonder how GPP would have played into this, though? I mean, they can easily just do a refresh and people will buy it even if there is only a 10-15% difference in performance.


----------



## Mysticial

crpcookie said:


> Computex:
> Q: When is the next GeForce coming to market?
> Jensen: A long time from now...





EastCoast said:


> http://www.guru3d.com/news-story/nv...w-gamers-geforce-is-a-long-time-from-now.html
> 
> 
> 
> 
> Interesting, I do believe they have no plans for a Volta like Gaming VC any time soon. I also believe that a refresh is very possible.
> Not because of AMD competitiveness mind you. AMD is both CPU/GPU company. Gone are the days of ATI which was just a VC company like Nvidia...But because Nvidia isn't just a vc gaming company anymore.
> 
> 
> I've said it once and say it again. Gamers are on the low priority as Nvidia don't see gamers as their only consumer anymore. I wonder how GPP would have played into this though? I mean, they can easily just do a refresh and people will buy it even if there is only a 10%-15% difference in performance.



The rumor that I've been hearing is that some large company (which I won't name) bought up the entire supply of Voltas for the first year or so. That's why none are left for the consumer market.

So yes, the consumer market is not high priority. We have one in our lab which some of my colleagues spent a lot of time learning/reverse-engineering it. But it's not consumer-grade - probably an engineering sample.


----------



## EastCoast

Mysticial said:


> The rumor that I've been hearing is that some large company (which I won't name) bought up the entire supply of Voltas for the first year or so. That's why none are left for the consumer market.
> 
> So yes, the consumer market is not high priority. We have one in our lab which some of my colleagues spent a lot of time learning/reverse-engineering it. But it's not consumer-grade - probably an engineering sample.



Hmm, I'm still wondering if it's a private-sector or a government entity.


----------



## mouacyk

Mysticial said:


> The rumor that I've been hearing is that some large company (which I won't name) bought up the entire supply of Voltas for the first year or so. That's why none are left for the consumer market.
> 
> So yes, the consumer market is not high priority. We have one in our lab which some of my colleagues spent a lot of time learning/reverse-engineering it. But it's not consumer-grade - probably an engineering sample.


It's probably Microsoft. They just bought out GitHub. They're gonna use the Voltas to mine GitHub and extract its juicy bits for Windows 11, or Skynet Core.


----------



## SavantStrike

Mysticial said:


> The rumor that I've been hearing is that some large company (which I won't name) bought up the entire supply of Voltas for the first year or so. That's why none are left for the consumer market.
> 
> So yes, the consumer market is not high priority. We have one in our lab which some of my colleagues spent a lot of time learning/reverse-engineering it. But it's not consumer-grade - probably an engineering sample.


Bought up the supply of large die or small die Volta? 


Either way the consumer segment comes dead last, but that shouldn't come as a shock to anyone. Given the current memory shortage, any launch would only be a paper launch.


----------



## Contiusa

What a shame... 2 years and nothing. The gaming market hyped and pumped these people with money for decades. Now we are disposable. They should know that what goes around comes around...

...


----------



## SavantStrike

Contiusa said:


> What a shame... 2 years and nothing. The gaming market hyped and pumped these people with money for decades. Now we are disposable. They should know that what goes around comes around...
> 
> ...


What are the gamers going to do, rage quit?

They have proven they are unwilling to pay more for a limited supply, so until the supply isn't limited, they've got to wait.


----------



## Mand12

nycgtr said:


> That's how milking works.



Only if you update on every card release.

Seriously, I can't fathom how people can line up to buy every single top card on launch and then still complain about prices.


----------



## EastCoast

IMO, a refresh was always coming. Based on what I've gathered, they simply held out on even the refresh.
A market shift where Nvidia loses over 15% market share in discrete graphics, even though they've moved more units, would trigger a refresh release.


----------



## SavantStrike

EastCoast said:


> IMO, a refresh was always coming. Based on what I've gathered, they simply held out on even the refresh.
> A market shift where Nvidia loses over 15% market share in discrete graphics, even though they've moved more units, would trigger a refresh release.


But the market can't shift without any competition, so for now they don't have to do anything. The CEO has said for a few months now that Pascal wasn't going anywhere, yet the tech press have been soiling themselves fantasizing about a July release.

I've been eyeing a late third-quarter paper launch since January. If yields are bad or if there are other customers, they can certainly push it back further, but there's no reason for them to push it forward.


----------



## ZealotKi11er

SavantStrike said:


> But the market can't shift without any competition, so for now they don't have to do anything. The CEO has said for a few months now that Pascal wasn't going anywhere, yet the tech press have been soiling themselves fantasizing about a July release.
> 
> I've been eying a late third quarter paper launch since January. If yields are bad or if there are other customers, they can certainly push it back further, but there's no reason for them to push it forward.


It depends. If the next card is 12nm, it will come soon. If it's 7nm + G6, it's out of Nvidia's hands. People like to speculate it's AMD holding back, hence Nvidia is "waiting". Sure, Nvidia could release halo cards like a Titan for a lot of money, but they can't mass-produce at this point.


----------



## Majin SSJ Eric

Best part about a new Nvidia X80 card release is that the previous cards (usually) plummet in price. That's what I'm hoping for. A $300 1080 would be nice...


----------



## looniam

ExoticallyPure said:


> It was likely changed by NVIDIA due to speculation, but such activity is largely irrelevant. The new lineup release date is not far away now.


what is relevant is other manufacturers demonstrating mobile/SoC graphics, which is why i included them in the screenshot, and which made it very unlikely for nvidia to "drop a bomb".


----------



## keikei

> NVIDIA will be launching their next-gen GeForce GTX 1180 graphic card on July 30. NVIDIA CEO and founder Jensen Huang said they were a while away during NVIDIA's Computex 2018 press conference, but of course he's going to say that. July 30 isn't too far away, and while we were wrong about the GTC 2018 launch, my sources were very clear about July 30. I asked multiple people and more than one confirmed it was less than two months away, with July 30 specifically being mentioned by multiple of my sources.


https://www.tweaktown.com/news/6210...ition-details-unveiled-screenshots/index.html


----------



## Threx

keikei said:


> https://www.tweaktown.com/news/6210...ition-details-unveiled-screenshots/index.html


Btw, it's reported by the same guy who said the new cards were being launched back at GTC.


----------



## Contiusa

Mand12 said:


> Only if you update on every card release.
> 
> Seriously, I can't fathom how people can line up to buy every single top card on launch and then still complain about prices.


That is not always the case. Many people get stuck between generations and wait for the new release so they don't lose money. Or worse, people buy a card today only to see a release a couple of months later with much better cards. I bought a 1060 as a temporary upgrade expecting to buy a GTX 1180 this year. If I had known they were going to screw things up, I would have bought a 1070 or even a 1080. Now I won't buy anything until they really release some new cards. So it is bad for me and for them. They lost back then because I saved money on my purchase, and they are missing out now because I won't upgrade a Pascal to another Pascal.

If the market releases good products, people will buy them. Intel skimped on things for years when they could have cashed in big by releasing a hexacore with Skylake or even Haswell. I would have upgraded (along with a big portion of the market), and yet I still have my old i7-3770K and won't upgrade until they release an octacore free of hacks and bugs.

These companies go out of their way to take the wrong route.


----------



## SavantStrike

keikei said:


> https://www.tweaktown.com/news/6210...ition-details-unveiled-screenshots/index.html


Right.... The CEO of the company says they aren't launching for a long time, so that must mean they are launching in under 60 days.

Seems legit.


----------



## chessmyantidrug

Contiusa said:


> That is not always the case. Many people get stuck in between generations and wait for the new release to don't lose money. Or worse, people who buy a card today to see the new release a couple months later with much better cards. I bought a 1060 as a temporary upgrade expecting to buy a GTX 1180 this year. If I knew they were going to screw things up I would have bought a 1070 or even a 1080. Now I won't buy anything until they really release some new cards. So it is bad for me and for them. They lost back then because I saved money on my purchase and they are missing out now because I won't upgrade a Pascal to another Pascal.
> 
> If the market releases good products, people will buy it. Intel skimped on things for years when they could cash big releasing an hexacore with Skylake or even Haswell. I would have update it (and a big portion of the market), and yet I still have my old i7-3770K and won't update until they release an octacore free of hacks and bugs.
> 
> These entrepreneurs make an effort to take the wrong route.


Nvidia doesn't care when you buy. They aren't getting your money; they made their money when the partner (Asus, MSI, EVGA, Zotac, etc.) purchased the GPU.

Intel wouldn't have been able to release 6-core parts at the same TDP and frequency on their mainstream platform. It also made no sense for them to do so because there was little demand for six or more cores. If you were doing things that required more than four cores, chances are a mainstream platform wasn't for you anyway.


----------



## Contiusa

chessmyantidrug said:


> It also made no sense for them to do so because there was little demand for six and more cores.


You could not be more wrong. People had been asking for a hexacore for ages, and the i5 had already been strained in games for quite some time. Go check the Coffee Lake thread and you will see how pissed off people were that a hexacore only appeared because of Ryzen. The i7-7700K being a quad-core was a bad joke.

The same goes for GPUs. People are waiting for some real cards to use with top-notch monitors. The whole market is waiting for these morons to release more powerful hardware.


----------



## Ksireaper

Just picked up a 1080ti from the Nvidia site today. Not really wanting to wait longer. I only play at 3440 x 1440 so i dont really need to wait for the 1180. I think this card will be good for a while.


----------



## keikei

SavantStrike said:


> Right.... The CEO of the company says they aren't launching for a long time, so that must mean they are launching in under 60 days.
> 
> Seems legit.


Let me ask you this: how much does Nvidia stand to lose on sales of the 10 series if they announce a release date? Jensen gave the press a response, but he did not answer the question, for very good reason. Also, it's been some time since I got a card, but wasn't the Ti announced and then launched about 30 days later? I'm old, my memory might be borked.



Threx said:


> Btw, it's reported by the same guy who said the new cards were being launched back in GTC.


I did see your other post in another thread reporting a slightly later release date. Things seem to be getting a little spicy.


----------



## SuperZan

keikei said:


> Let me ask you this. How much does nvidia stand to lose on sales of the 10 series if they announce a release date? Jensen gave the press a response, but he did not answer the question for very good reason. Also, its been some time since i got a card, but wasnt the Ti announced and then launched around 30 days? I'm old, my memory might be borked.


The Ti is a little bit different because it's not the amuse-bouche for the new gen. It doesn't necessarily need any lead-up time as they begin working on it much earlier than the release date. The problem with the question as posed is this: if the sales of the 10-series are still good enough to worry about spoiling them, why wouldn't they continue to market such high-demand, high-margin parts for as long as possible? If sales of the 10-series are weak, why wouldn't they try to start drumming up excitement as soon as possible with a definitive breadcrumb?


----------



## Pro3ootector

I remember a few people on this forum buying the Titan X (not XP) before the Ti release. I mean, what was that? Paying a premium to have the coolest, most powerful GPU for two months.


----------



## chessmyantidrug

Contiusa said:


> You could not be more wrong. People were asking for the hexacore for ages and the i5 was already being forced in games for quite some time. Go check the Coffee Lake thread and you will see how people were pissed off to see an haxacore just because of Ryzen. The i7-7700K to be a quad-core was a bad joke.
> 
> The same with GPUs. People are waiting for some real cards to use with top notch monitors. The whole market is waiting for these morons to release more powerful hardware.


You're right, I was wrong, but only because I used the incorrect word. There was plenty of demand, but there was no need. Again, if you _needed_ more than four cores, chances are you needed more than mainstream platforms had to offer. If you honestly think the only reason Intel brought six cores to mainstream is Ryzen, you're pretty ignorant. Six cores had been on Intel's roadmap for quite a while. The only thing Ryzen did was bring Intel's six-core mainstream processors to market before they were ready for volume production.


----------



## Falkentyne

SuperZan said:


> The Ti is a little bit different because it's not the amuse-bouche for the new gen. It doesn't necessarily need any lead-up time as they begin working on it much earlier than the release date. The problem with the question as posed is this; if the sales of the 10-series are still good enough to worry about spoiling them, why wouldn't they continue to market such high-demand, high-margin parts for as long as possible? If sales of the 10-series are weak, why wouldn't they try to start drumming up excitement as soon as possible with a definitive breadcrumb?


You know, with that line of questioning and thinking, we would all still be hammering on Apple II/e's, Commodore 64s, Atari 2600s and NESes! Who cares about progress anyway? Just milk that money and get rich and get your super expensive car and that hot sexy slim girlfriend with that nice butt!


----------



## SavantStrike

keikei said:


> Let me ask you this. How much does nvidia stand to lose on sales of the 10 series if they announce a release date? Jensen gave the press a response, but he did not answer the question for very good reason. Also, its been some time since i got a card, but wasnt the Ti announced and then launched around 30 days? I'm old, my memory might be borked.
> 
> 
> 
> I did see your other post in another thread reporting a slightly later release date. Things seem to be getting a little spicy.


They stand to lose very little. The majority of the consumer market purchases through OEMs. Most people buy a computer when they want one, not based on shifting release cycles.

The remainder is unimportant - they'll buy what they can get their hands on (which won't be a vaporware 1180). Don't worry, all the spare Pascal cards will be dumped on the AIBs.


----------



## SuperZan

Falkentyne said:


> SuperZan said:
> 
> 
> 
> The Ti is a little bit different because it's not the amuse-bouche for the new gen. It doesn't necessarily need any lead-up time as they begin working on it much earlier than the release date. The problem with the question as posed is this; if the sales of the 10-series are still good enough to worry about spoiling them, why wouldn't they continue to market such high-demand, high-margin parts for as long as possible? If sales of the 10-series are weak, why wouldn't they try to start drumming up excitement as soon as possible with a definitive breadcrumb?
> 
> 
> 
> You know, with that line of questioning and thinking, we would still be all hammering on Apple II/E's, Commodore 64's and Atari 2600's and NES's ! Who cares about progress anyway? Just milk that money and get rich and get your super expensive car and that hot sexy slim girlfriend with that nice butt!

Oh, I’m so sorry. You must have me confused with Jen-Hsun Huang. My name is Audrey; I’m just here to pontificate like everyone else. It’s easy to see where you got crossed up.

If Nvidia has a different market analysis (and seeing as I’m not in their industry, I’m sure they do) then they’ll play to that, but they’ll make a decision based on quarterly returns, not progress for the sake of it. Nvidia is a business, not the United Federation of Planets.


----------



## ThrashZone

Hi,
Might have to sell a kidney to be able to afford a 1180ti


----------



## Mooncheese

Ksireaper said:


> Just picked up a 1080ti from Nvidia site today. Not really wanting to wait longer. I only play at 3440 x 1440 so i dont really need to wait for the 1180. I think this card will be good for a while.


You should have held out; you're doing exactly what Nvidia wants you to do. They have Turing/Volta ready to go, but they want to milk us a little longer. If we all refused to budge and didn't buy what they are offering at the moment, they would take that as a market signal and release the new hardware, knowing that that is what we, "the market", are doing.


----------



## chessmyantidrug

Mooncheese said:


> You should have held out, youre doing exactly what Nvidia wants you to do. They have Turing / Volta ready to go but they want to milk us a little bit longer. If we all refused to budge and not buy what they are offering at the moment they would take that as a market signal and release the new hardware, knowing that that is what we, "the market" is doing.


That ... isn't how it works. If we collectively decided not to buy any of their products, they wouldn't simply drop the prices until products started selling. If their new product was actually ready for market, they would release it. You have to be a special kind of naive to think they're willingly suppressing their profits by refraining from releasing new hardware.


----------



## tajoh111

We are paying the price for miners leaving the market.

Nvidia has too much Pascal stock because they likely overproduced based on demand from miners inflating the number of cards the market wanted.

As a result, they have excess Pascal inventory which needs to be sold before the next gen becomes available.

This is why it is not a simple matter of ordering and making more cards when mining increases demand for a card.

Ordering excess GPUs produces inventory which must be sold before the next gen comes out. Otherwise you're taking a massive inventory write-off from the depreciation. E.g. a GTX 1080 and GTX 1070 will be worth $499 and $350 respectively after the release of the 1180.
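Back-of-envelope, that write-down argument can be sketched in a few lines of Python. The unit count and the old price below are made-up illustrative figures, not Nvidia data; only the direction of the arithmetic matters:

```python
# Rough sketch of the inventory write-down described above.
# Unit count and prices are hypothetical, for illustration only.

def write_down(units, old_price, new_price):
    """Total value lost on unsold inventory when the street price drops."""
    return units * (old_price - new_price)

# e.g. 100,000 unsold GTX 1080s repriced from $549 down to $499
loss_1080 = write_down(100_000, 549, 499)
print(f"GTX 1080 inventory write-down: ${loss_1080:,}")  # prints $5,000,000
```

Scaled across a whole product stack, even a modest per-card price cut turns leftover stock into a very real charge on the books.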


----------



## fursko

d5aqoep said:


> Sitting tight on my 1080ti. Graphics have not actually improved over past 2/3 years. All feel like they are at the same level to me. Then you have that one weird setting which does nothing to PQ but cuts the framerate by 30% and all geeks have eyegasms because their burnt smelling overclocked cards could perform 1fps faster.


We need the PlayStation 5 for next-gen graphics, I guess. But it looks like it will struggle more with pixel count than deliver a graphical improvement.


----------



## RXWX

I'm betting on Gamescom for the Volta GeForce unveiling.


----------



## Threx

RXWX said:


> I'm betting on Gamescom for the Volta GeForce unveiling.


Highly doubt it. If I'm not mistaken, Nvidia has never announced a new GeForce lineup at Gamescom. They will likely hold their own event.


----------



## SavantStrike

chessmyantidrug said:


> That ... isn't how it works. If we collectively decided not to buy any of their products, they wouldn't simply drop the prices until products started selling. If their new product was actually ready for market, they would release it. You have to be a special kind of naive to think they're willingly suppressing their profits by refraining from releasing new hardware.


Exactly.

There isn't some secret conspiracy to keep GPU prices high and delay new hardware. The current market will bear today's prices, and there isn't any new hardware to release in volume yet. The minute the market won't bear the price, there will be price cuts, and the minute new hardware is available in volume, it will be released.

Nvidia probably can't get GDDR6 in adequate quantities to release new cards.



tajoh111 said:


> We are paying the price of miners when they leave the market.
> 
> Nvidia has too much stock of pascal because they likely overproduced based on demand from miners inflating the amount of cards desired by the market.
> 
> As a result, they have an excess inventory of pascal cards which need to be sold before next gen becomes available.
> 
> This is why it is not a simple matter of ordering and making more cards when mining does increase demand for a card.
> 
> Ordering excessive GPU produces inventory which must be sold before the next gen comes out. Otherwise your taking a massive inventory write off from the depreciation. E.g a gtx 1080 and gtx 1080 will now be worth 499 and 350 respectively after the release of the 1180.


This isn't how it works either.

Most of that inventory is in the hands of AIBs and OEMs. Nvidia only directly sells a fraction of the consumer chips on the market, and they don't care if the AIBs and OEMs get stuck with old stock. If it's really a problem, they will just force the AIBs and OEMs to buy a portion of the old inventory as a condition of purchasing new cards.

They aren't delaying the release to milk the product stack at this point; they are delaying because they have to.


----------



## RXWX

Threx said:


> Highly doubt it. If I'm not mistaken, Nvidia has never announced a new Geforce line up at Gamescom. They will likely hold their own event.


Maybe this year though. I'm betting that the GeForce cards will retain the Tensor Cores, maybe lower counts on the GV104 based cards and below, with GV102 having the same amount as GV100 (don't quote me on that, I'm just thinking about it, might end up wrong lol).


----------



## mmonnin

SavantStrike said:


> Exactly.
> 
> There isn't some secret conspiracy to keep GPU prices higher and delay releasing new hardware. The current market will bear today's prices and there isn't any be hardware to release yet in volume. The minute the market won't bear the price, there will be price cuts, and the minute new hardware is available in volume it will be released.
> 
> Nvidia probably can't get GDDR6 in adequate quantities to release new cards.
> 
> This isn't how it works either.
> 
> Most of that inventory is in the hands of AIBs and OEMs. Nvidia only directly sells a fraction of the consumer chips on the market, and they don't care if the AIBs and OEMs get stuck with old stock. If it's really a problem, they will just force the AIBs and OEMs to buy a portion of the old inventory as a condition to purchase new cards.
> 
> They aren't delaying a release because of milking the product stack at this point, they are delaying because they have to.


Some people complain NV is milking users by releasing one top-end card after another, and then later complain they are milking people by not releasing new products. Some people don't get it, but it's definitely not the latter.


----------



## geoxile

Why would Nvidia launch a new lineup now when TSMC's 7nm already started volume production? IIRC their new uarchs mostly improved machine learning and compute, so is there even a point?


----------



## Lass3

geoxile said:


> Why would Nvidia launch a new line up now when TSMC's 7nm aready started volume production? IIRC their new uarchs mostly improved on machine learning and compute, so is there even a point?


7nm started volume production for bigger chips?

I've only seen AMD's demo/prototype of 7nm with 32GB HBM2. I don't exactly think yields are good for big chips on 7nm.


----------



## ibb27

geoxile said:


> Why would Nvidia launch a new line up now when TSMC's 7nm aready started volume production? IIRC their new uarchs mostly improved on machine learning and compute, so is there even a point?


This year - 11xx series with slower GDDR6.
Next year (7nm) - 20xx with faster GDDR6, and Ti thunder Q1 2020. 

They need to stay on top of the gaming market.


----------



## Lass3

ibb27 said:


> This year - 11xx series with slower GDDR6.
> Next year (7nm) - 20xx with faster GDDR6, and Ti thunder Q1 2020.
> 
> They need to stay on top of the gaming market.


Indeed.

No annual release equals fewer sales. Most people will just hold on to their current cards. Not many are buying Pascal anymore... it's more than two years old.


----------



## SavantStrike

ibb27 said:


> This year - 11xx series with slower GDDR6.
> Next year (7nm) - 20xx with faster GDDR6, and Ti thunder Q1 2020.
> 
> They need to stay on top of the gaming market.


They are already on top of the gaming market.

I bet we could see a Ti at the top of the product stack right before the next refresh on 7nm.


----------



## ibb27

SavantStrike said:


> I bet we could see a TI at the top of the product stack right before the next refresh on 7nm.


Probably, if we assume that the first new 7nm products will be in the low-power class like the 750 (Ti) series, because they will stay on 7nm tech for 2-3 generations.

July 30 (a Monday) is Videocardz's prediction, starting with Founders Edition cards, with other branded cards to follow in late August or September.


----------



## mmonnin

ibb27 said:


> This year - 11xx series with slower GDDR6.
> Next year (7nm) - 20xx with faster GDDR6, and Ti thunder Q1 2020.
> 
> They need to stay on top of the gaming market.


So from 10xx to 11xx to 20xx...


----------



## Lass3

They are not going to release 1180 and 1170, then jump to 20xx.


----------



## GenoOCAU

Australian retailers have started slashing the prices on 10-series cards, one retailer discounted 1080Ti's for $260 AUD ($197 USD) and they were sold out state-wide within 48 hours.

It's close, I can feeeeeel it! I'm looking forward to moving on from Pascal.


----------



## Hydroplane

We'll see what the performance gains are vs. 1080 Ti. If it's less than 10% I'll hold off until the Ti or Titan version.


----------



## Threx

GenoOCAU said:


> Australian retailers have started slashing the prices on 10-series cards, one retailer discounted 1080Ti's for $260 AUD ($197 USD) and were sold out state wide with-in 48hours.


Holy hell, now that's a deal if there ever was one.


----------



## geoxile

Lass3 said:


> geoxile said:
> 
> 
> 
> Why would Nvidia launch a new line up now when TSMC's 7nm aready started volume production? IIRC their new uarchs mostly improved on machine learning and compute, so is there even a point?
> 
> 
> 
> 7nm started volume production for bigger chips?
> 
> Only seen AMD's demo/prototype of 7nm with 32GB HBM2. I don't exactly think yields are good for big chips on 7nm.

It's not a prototype. AMD said Vega 20 has been sampling to customers, and previously said it will have availability in 2018, presumably at the end of the year. Considering Nvidia is closer to TSMC than AMD is, I don't see why they couldn't start producing a smaller GPU on 7nm when AMD is already making the relatively massive Vega on it. They could even do a paper launch like they have in the past to "launch" a product in 2018.


----------



## LancerVI

GenoOCAU said:


> Australian retailers have started slashing the prices on 10-series cards, one retailer discounted 1080Ti's for $260 AUD ($197 USD) and were sold out state wide with-in 48hours.
> 
> It's close, I can feeeeeel it! I'm looking forward to moving on from Pascal.


I find this exceedingly hard to believe.


----------



## Diffident

LancerVI said:


> I find this exceedingly hard to believe.


I think the discount is $260 AUD off its previous price, not that it's $260. Considering Aussie prices, it's probably still above MSRP.


----------



## LancerVI

Diffident said:


> I think the discount is $260AUD off it's previous price, not that it's $260. Considering Aussie prices, it's probably still above MSRP.


That would make more sense.


----------



## GenoOCAU

Sorry about the lack of clarification; yes, it was $260 off the price, which brought it back to $999 AUD, or $760 USD.


----------



## Lass3

geoxile said:


> It's not a prototype. Amd said vega 20 has been sampling to customers and previously said it will have availability in 2018, presumably end of the year. Considering nvidia is closer to TSMC than AMD is I dont see why they couldn't start producing 7nm on a smaller GPU when AMD's already making the relatively massive vega on it. They could even do a paper launch like they have in the past to "launch" a product in 2018


I'd gladly wait a little longer for 7nm


----------



## ExoticallyPure

*PLEASE TAKE THIS POLL EVERYONE:*
https://www.strawpoll.me/15863852

I am just curious how impatient people are about NVIDIA's new GPU and I'm sure some other people are curious too.


----------



## hlreijnders

ExoticallyPure said:


> *PLEASE TAKE THIS POLL EVERYONE:*
> https://www.strawpoll.me/15863852
> 
> I am just curious how impatient people are about NVIDIA's new GPU and I'm sure some other people are curious too.


Where's the option "I just bought a 1080ti, just let me enjoy my high-end card before you degrade it to mid-range"?


----------



## ToTheSun!

hlreijnders said:


> Where's the option "I just bought a 1080ti, just let me enjoy my high-end card before you degrade it to mid-range".


Why would one want an option one would have no legitimacy to choose?


----------



## guttheslayer

Our dear JHH actually dropped a hint about when the next GeForce card is coming (Computex 2018):

https://www.youtube.com/watch?v=0JLED2veVzw

Tune to 13:00.

Note: it is not good news.


----------



## Threx

Well, so far with the small sample size, 83% of people would like to see new GPUs sooner rather than later.

But since the poll was posted only in this thread, the results will be skewed: people who bothered to read this thread in the first place are likely people who are interested in new GPUs.


----------



## Lass3

guttheslayer said:


> Our dear JHH actually drop a hint on when is the next Geforce card (computex 2018)
> 
> 
> https://www.youtube.com/watch?v=0JLED2veVzw
> 
> 
> Tune to 13:00
> 
> 
> Note: It is not a good news.


Nvidia would never tell us if launch were close. They want to keep selling Pascal cards right up to the launch event; not many would buy Pascal if they knew for sure that new cards were close.

I bet we see them sometime in Q4, on 7nm with GDDR6. Both are in mass production.


----------



## guttheslayer

Lass3 said:


> Nvidia would never tell if they were close. They want to keep selling Pascal cards right up to the launch event. Not many would buy Pascal if they knew for sure that new cards are close.
> 
> I bet we see them Q4 sometime. 7nm and GDDR6. Both are in mass production.



I will be very surprised if this year's next-gen GeForce is 7nm. It's just unlikely, given that they haven't really put 12nm to good use. Unless their 12nm FF+ is specifically for Volta.


----------



## Lass3

guttheslayer said:


> I will be overly surprised if the Next gen geforce this year is 7nm. But its just unlikely given that they haven really put 12nm to good use. Unless their 12nm FF+ is only specifically for Volta.


Well, I don't see why they would not go with 7nm, since it's possible.
12nm over 16nm is not a big jump anyway.


----------



## Threx

Yeah, especially since there are now reports from several sites that TSMC has just started ramping up 7nm ahead of schedule due to high demand, I think Q4 for the new GeForce lineup seems quite probable. Also, E3 revealed a lot of new visually demanding games coming out early next year, so a lot of people are going to want to upgrade.


----------



## Lass3

Threx said:


> Yeah, especially that there are now reports by several sites that TSMC has just started ramping up 7nm ahead of schedule due to high demands, I think Q4 for new Geforce line up seems quite probable. And also that E3 has revealed a lot of new visually-demanding games coming out early next year a lot of people are going to be wanting to upgrade.


Yeah TSMC 7nm + GDDR6 (which is also in mass production atm)

Q4 sounds about right


----------



## looniam

i'll drop this here for no reason:


*NVIDIA is allegedly hosting first briefings for board partners about upcoming GeForce series.*


> Igor Wallosek from Tom’s Hardware Germany claims that first board partners have been briefed about the new gaming series from NVIDIA. The bill of materials (BoM) has also been shown, which basically starts the whole process of GeForce 20 (or 11 as some claim) series development.
> 
> Igor Wallosek:
> 
> […] we have unofficially learned from some board partners that Nvidia has already started training the relevant employees in the development departments. And since Nvidia does not prioritize sales training until later, the timeline based on the tables listed below is likely to tighten considerably.
> 
> From now it will take weeks, likely months to have products ready for release. In other words, there is absolutely no way of telling when new series will launch at this point, but the train has just left the station.



*Nvidia GeForce GTX 2080, 1180 or what we do not know about Turing [Update]*


> Update from 12.06.2018, 18:40
> In the meantime, we have unofficially learned from some board partners that Nvidia has already started training the relevant employees in the development departments. And since Nvidia does not prioritize sales training until later, the timeline based on the tables listed below is likely to tighten considerably. If you follow the 3-month rule, the first board-partner cards should appear on the market in late August or early September. However, some of the partners are now expecting a shift of at least two weeks, so September seems rather plausible.



*FORUM 3D post* (igor):


> Only as a side note:
> The briefings of the board partners for the new cards were already running or running. The BoM is through, so now everyone can calculate for themselves (the time window I had already explained in my news)



*another quote*:


> a degree of technical assistance, otherwise I could not afford such adventures as TH. With Turing, I do know the people who are currently working on the electrical fine details at the board partners.
> 
> You can already look forward to a new kind of very specific video output, and NV has also fundamentally changed the clocking and its handling. And now everyone is allowed to puzzle again


the end of the THG article:


> Our up-to-date information, which of course still carries a certain amount of blurriness due to timing delays, allows for the following speculation:
> 
> 
> Shipping of Turing bundles for the FE from mid-June (leak)
> Launch of the Turing "Founders Edition" reference cards in July (leak, rumor)
> Due to generally somewhat higher GDDR6 prices, the cards could be up to 100 euros more expensive at MSRP (assumption)
> Launch of the Turing board-partner cards in August/September (board-partner statements, calculation example)
> Launch of the first new Quadro Turing cards at Siggraph in August (suspected)
> Performance higher at similar power consumption compared to the direct predecessor model (condensed rumor)
> Of course, these calculations are just pure mind games, but there is a bit of truth in every rumor and jigsaw piece. And even if we have not really become smarter about the naming scheme, at least we know a little more about the production of graphics cards. And these are real, verifiable facts.




yeah, sorry everything besides the VC link is in german. (*all quotes via google translate *- chrome)


----------



## guttheslayer

looniam said:


> i'll drop this here for no reason:
> 
> 
> *NVIDIA is allegedly hosting first briefings for board partners about upcoming GeForce series.*
> 
> 
> 
> *Nvidia GeForce GTX 2080, 1180 or what we do not know about Turing [Update]*
> 
> 
> 
> *FORUM 3D post* (igor):
> 
> 
> 
> *another quote*:
> 
> 
> the end of the THG article:
> 
> 
> 
> 
> yeah, sorry everything besides the VC link is in german. (*all quotes via google translate *- chrome)


If it is September, it for sure won't be 7nm.


----------



## Lass3

guttheslayer said:


> If it is Sept for sure it wont be 7nm.


Might be a paper launch. AMD already has several working 7nm huge-die GPUs.

No 7nm, no buy, because a 12nm launch means Nvidia will refresh quickly on 7nm or bring something brand new on 7nm.

Why would Nvidia NOT use 7nm when TSMC can do it now or very soon?
They could go the extreme milking route: a 12nm Pascal refresh with GDDR6, and then in less than 12 months a new arch on 7nm with faster GDDR6... No competition is great


----------



## Threx

This is indeed good news if it is true. If it launches Sept-Oct, I don't think 7nm will be ready. If it's December, then I guess 7nm.

Personally, I don't even really care what the size of the transistors is. All I care about is the speed, price, and durability.

Oh, and the release date, of course.


----------



## Lass3

Threx said:


> This is indeed good news if it is true. If it launches sept-oct I don't think 7nm will be ready. If it's december then I guess 7nm.
> 
> Personally I don't even really care what the size of the transistors are. All I care about are the speed, price, and durability.
> 
> Oh, and the release date, of course.


Durability? Are you going to wait a few years to see how durable they are?


----------



## guttheslayer

Extreme milking seems to be the best path for them, since AMD offers absolutely zero competition at the high end.

Also, 12nm FFN has been too underutilised for TSMC to dedicate a new production flow just to it. So far only one die (GV100) uses 12nm. It doesn't make any business sense to jump to a more expensive node and squeeze the profit margin when they control the majority of the market share.

Unless AMD catches them with their pants down and they have to rush out 7nm like Intel did.


----------



## Threx

Lass3 said:


> Durability? Are you going to wait a few years to see how durable they are


Remember a few years ago the fan blades of MSI cards were straight up breaking off just from spinning? That's what I mean.


----------



## guttheslayer

Threx said:


> This is indeed good news if it is true. If it launches sept-oct I don't think 7nm will be ready. If it's december then I guess 7nm.
> 
> Personally I don't even really care what the size of the transistors are. All I care about are the speed, price, and durability.
> 
> Oh, and the release date, of course.


Nvidia on 12nm is more than capable of fighting AMD on the 7nm front. The best example is to compare Vega 64 vs Titan V (mind you, it is not even gaming-optimised) and then predict how much AMD can improve Vega 64 by jumping to 7nm.


You will realise that even on 7nm it will be tough for AMD to compete with Titan V in the 250-300W range.


----------



## guttheslayer

> There have been fresh rumours, from a writer on Tom's Hardware with friends in technical places, that the new cards will feature a new video output. The speculation is that means the GTX 1180 will natively run HDMI 2.1 out of the box, hopefully delivering the bandwidth required to deal with 4K HDR at 120Hz without messing with the colours too much. It could also introduce Game Mode VRR (variable refresh rate) that might even give us non-hardware based G-Sync.


https://www.pcgamesn.com/nvidia-gtx-1180-release-date-specifications


There are rumours pointing to HDMI 2.1 on the next-gen GeForce. While it seems good to have more options, my question is:

Will G-Sync, or any VRR technology, be supported over the new HDMI on GeForce cards?


Because the last thing I want is to be stuck with the more restrictive DP 1.4 for G-Sync. The price premium is one thing, but will they even port G-Sync over to the new HDMI?


----------



## jamesch

guttheslayer said:


> Nvidia on 12nm is more than capable of fighting AMD on the 7nm front. The best example is to compare Vega 64 vs Titan V (mind you, it is not even gaming-optimised) and then predict how much AMD can improve Vega 64 by jumping to 7nm.
> 
> 
> You will realise that even on 7nm it will be tough for AMD to compete with Titan V in the 250-300W range.


AMD is a LONG way off from releasing a giant 484mm2 die on 7nm; at best what we will get at 7nm is a die-shrunk Vega 64 with small improvements. It won't even compete with the Titan Xp, much less the Titan V.


----------



## guttheslayer

jamesch said:


> AMD is a LONG way off from releasing a giant 484mm2 die on 7nm; at best what we will get at 7nm is a die-shrunk Vega 64 with small improvements. It won't even compete with the Titan Xp, much less the Titan V.


I am talking about the supposed clock-speed increase from 16nm to 7nm. The clock-speed bump alone might bridge the gap between the Titan Xp and V64 while shrinking the die area considerably.


----------



## guttheslayer

Summary:

Expected name: GTX 1180
Expected feature: GDDR6, Tensor cores, DirectX 12.5 or 13.
Expected memory: 8GB
Expected price: $699 ($720-730 for custom AiB)
Expected launch: July 30th.


----------



## keikei

^If that price is anywhere near accurate I'm gonna hold off for a bit unless the performance matches it. July is coming up fast!


----------



## clonxy

guttheslayer said:


> https://www.youtube.com/watch?v=UBbyACygmAs&feature=youtu.be
> 
> 
> Summary:
> 
> Expected name: GTX 1180
> Expected feature: GDDR6, Tensor cores, DirectX 12.5 or 13.
> Expected memory: 8GB
> Expected price: $699 ($720-730 for custom AiB)
> Expected launch: July 30th.


If you read the posts in this thread, you would know that NVIDIA announced it isn't releasing the 1180/2080 anytime soon. This video was obviously created for views and was late to the party with nothing to back it up. I downvoted it for you.


----------



## maltamonk

guttheslayer said:


> https://www.pcgamesn.com/nvidia-gtx-1180-release-date-specifications
> 
> 
> There are rumoured pointing to HDMI 2.1 on the next gen Geforce. While it seem good to have more option, my question is..
> 
> Will G-sync or any VRR technology will be supported on the new HDMI for Geforce card?
> 
> 
> Cause the last thing I want is to stuck with the more restrictive DP 1.4 for G-sync. Price premium is one thing, but will they even port their G-sync over to the new HDMI?


If that G-sync news pans out.......yikes...way to give customers that bought G-sync panels a big middle finger.


----------



## guttheslayer

clonxy said:


> If you read the posts in this thread, you would know that NVIDIA announced it isn't releasing the 1180/2080 anytime soon. This video was obviously created for views and was late to the party with nothing to back it up. I downvoted it for you.



No one has any confirmed source for whether it is real or false, or whether it is coming out in December or July. So unless you have hard evidence, you can't say it's false either.


What disappoints me is the price: $50 above the Ti, which was released more than 15 months ago, not to mention with much less memory. It had better have very good performance to justify that price.


But we all know that if it has Tensor cores, it is going to be just a Volta rebundle. Also, given the 8GB config on a 256-bit bus, it is almost guaranteed to be 4 GPCs, or 3584 cores. Just how much can the 1180 exceed 1080 Ti performance with the same number of cores, to justify that $699 price in 2018?
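For what it's worth, the 8GB / 256-bit deduction is easy to sanity-check with quick arithmetic. This is just a sketch; the 32-bit channel width, 8Gb chip capacity and 14Gbps data rate are assumptions based on the common GDDR6 rumours, not confirmed specs:

```python
# Sanity-check of the rumoured 8GB / 256-bit GDDR6 configuration.
# Assumptions: 32-bit-wide GDDR6 chips, 8Gb (1GB) per chip, 14Gbps per pin.
BUS_WIDTH_BITS = 256
CHIP_WIDTH_BITS = 32
CHIP_CAPACITY_GB = 1        # one 8Gb chip = 1GB
DATA_RATE_GBPS = 14         # per pin

chips = BUS_WIDTH_BITS // CHIP_WIDTH_BITS            # 8 chips
capacity_gb = chips * CHIP_CAPACITY_GB               # 8 GB total
bandwidth_gbs = BUS_WIDTH_BITS * DATA_RATE_GBPS / 8  # 448 GB/s
```

At those assumed figures the card would land around 448 GB/s, a healthy step up from the GTX 1080's 320 GB/s.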


----------



## Threx

> One of NVIDIA’s Major OEM Partners Allegedely Returned 300,000 GPUs, Major Inventory Issues Being Cited For Next-Gen GeForce Launch Delay



https://wccftech.com/nvidia-oem-partner-300k-gpu-inventory-issues-next-gen-geforce-delay/


Their source is a site called SeekingAlpha. Anyone heard of them before?


----------



## Pro3ootector

Threx said:


> https://wccftech.com/nvidia-oem-partner-300k-gpu-inventory-issues-next-gen-geforce-delay/
> 
> 
> Their source is a site called SeekingAlpha. Anyone heard of them before?


rep+ ( when it's available )


----------



## teh n00binator

I'm sitting on whatever the performance and price of the 1170 turn out to be. If it's around the performance of a 1080 Ti at a similar price to the 1070, I might first see what second-hand 1080 Tis are going for. Current 980 Tis sell for less on the second-hand market than 1070s do, so I'm sort of hoping that trend continues between the 1170 and 1080 Ti.


----------



## Renegade5399

Sweet lord I hope there's some price reduction and some 1080Ti's that come out of this return.


----------



## mouacyk

How about NVidia do a sweepstakes with 1080 of them, and AMD offer to replace X of the winners?


----------



## looniam

Threx said:


> https://wccftech.com/nvidia-oem-partner-300k-gpu-inventory-issues-next-gen-geforce-delay/
> 
> 
> Their source is a site called SeekingAlpha. Anyone heard of them before?


Yes, and they get sourced for financial/stock analysis, like the Motley Fool. Maybe you're asking because they aren't a "tech site".


----------



## keikei

Threx said:


> https://wccftech.com/nvidia-oem-partner-300k-gpu-inventory-issues-next-gen-geforce-delay/
> 
> 
> Their source is a site called SeekingAlpha. Anyone heard of them before?


Makes sense. What goes up must come down. http://www.guru3d.com/news-story/graphics-card-shipments-fall-as-mining-demand-weakens.html


----------



## Threx

Nvidia tweets happy birthday to Alan Turing.

https://www.tweaktown.com/news/6233...urings-birthday-gtx-1180-confirmed/index.html

http://www.guru3d.com/news-story/nv...delay-new-upcoming-turing-graphics-cards.html


----------



## guttheslayer

Threx said:


> Nvidia tweets happy birthday to Alan Turing.
> 
> https://www.tweaktown.com/news/6233...urings-birthday-gtx-1180-confirmed/index.html
> 
> http://www.guru3d.com/news-story/nv...delay-new-upcoming-turing-graphics-cards.html


I dunno why, but Turing might turn out to be a crypto card, with Ampere on 7nm being the real gaming one.


----------



## ZealotKi11er

guttheslayer said:


> I dunno why, but Turing might turn out to be a crypto card, with Ampere on 7nm being the real gaming one.


I do not think Nvidia cares to make a crypto card. It's not a stable market.


----------



## BeeDeeEff

guttheslayer said:


> I dunno why, but Turing might turn out to be a crypto card, with Ampere on 7nm being the real gaming one.


The stuff that makes a card good at gaming is also what makes it good at mining; the mining software was written that way intentionally so that any average consumer could join in. At best the board partners could strip off some of the cheap flashy stuff and the display connectors and save a few bucks, but only a huge mining farm would ever be interested in such a card, as the resale value/market sucks.


----------



## clonxy

Threx said:


> Nvidia tweets happy birthday to Alan Turing.
> 
> https://www.tweaktown.com/news/6233...urings-birthday-gtx-1180-confirmed/index.html
> 
> http://www.guru3d.com/news-story/nv...delay-new-upcoming-turing-graphics-cards.html


why do people think they can talk to dead people over the internet?


----------



## Threx

clonxy said:


> why do people think they can talk to dead people over the internet?



If you can talk to dead people through psychics, why not over the internet?




Spoiler



/s


----------



## Threx

"NVIDIA GeForce Next-Gen PCB ‘Prototype’ Leaked Out – Features 12 GB, 384-bit GDDR6 Memory, Triple 8 Pin Power Connectors, NVLINK Support"


https://wccftech.com/nvidia-geforce-next-gen-graphics-card-pcb-leak/


----------



## keikei

This July release date is looking more and more credible. 'A long time' must be code for next month.


----------



## ENTERPRISE

Threx said:


> "NVIDIA GeForce Next-Gen PCB ‘Prototype’ Leaked Out – Features 12 GB, 384-bit GDDR6 Memory, Triple 8 Pin Power Connectors, NVLINK Support"
> 
> 
> https://wccftech.com/nvidia-geforce-next-gen-graphics-card-pcb-leak/


Very interesting, though I would have preferred 16GB. Having said that, this board may be indicative of the 1180, with a 16GB variant as the 1180 Ti. It will be great to see what AIBs bring to the table this coming gen. I am definitely holding out for the premium cards this round, looking for max OC potential on air, as WC is not something I want to chase.


----------



## Threx

Btw, the account behind the original Reddit post of that leak seems to have been deleted just a few minutes ago. This leads me to believe the pic is probably real.

Another thing: the memory chips on the board are Micron, and according to AnandTech, Micron just started mass-producing GDDR6 yesterday.

https://www.anandtech.com/show/13012/micron-begins-mass-production-of-gddr6


----------



## keikei

If the 1180 is around 30% faster than the current Ti, I'm all over it. I need that extra performance, and I'll pay the early-adopter fee.

Good work Threx! :thumb: I'm a fan of One Piece myself.


----------



## profundido

ENTERPRISE said:


> Very interesting, though I would have preferred 16GB. Having said that, this board may be indicative of the 1180, with a 16GB variant as the 1180 Ti. It will be great to see what AIBs bring to the table this coming gen. I am definitely holding out for the premium cards this round, looking for max OC potential on air, as WC is not something I want to chase.



A shame. Once you go watercooling you don't go back. But yeah, you could just wait until one comes out with a good 3-fan air cooler, like those MSI Gaming series cards.


----------



## ZealotKi11er

keikei said:


> If the 1180 is around 30% faster than the current Ti, i'm all over it. I need that extra performance. I'll pay for the early adopter fee.
> 
> Good work Threx! :thumb: I"m a fan of One Piece myself.


I do not see how that is possible. 

Just look at 780 Ti to 980 and 980 Ti to 1080.


----------



## Threx

keikei said:


> If the 1180 is around 30% faster than the current Ti, i'm all over it. I need that extra performance. I'll pay for the early adopter fee.
> 
> Good work Threx! :thumb: I"m a fan of One Piece myself.


Cheers for one piece. 

I'm also targeting the 1180. My 980 will be 4 years old this Oct so I'm itching for an upgrade. Unfortunately for me I won't be upgrading right away, I'm planning a white-blue theme for my upcoming build and kfa/galax usually takes 3-5 months to release their white exoc cards. 




> I do not see how that is possible.


1080 is about 30% faster than 980 ti in anandtech and guru3d benchmarks.

It might not be likely that 1180 is 30% faster than 1080 ti, but it is definitely not impossible.


----------



## nycgtr

Threx said:


> Cheers for one piece.
> 
> I'm also targeting the 1180. My 980 will be 4 years old this Oct so I'm itching for an upgrade. Unfortunately for me I won't be upgrading right away, I'm planning a white-blue theme for my upcoming build and kfa/galax usually takes 3-5 months to release their white exoc cards.
> 
> 
> 
> 
> 1080 is about 30% faster than 980 ti in anandtech and guru3d benchmarks.
> 
> It might not be likely that 1180 is 30% faster than 1080 ti, but it is definitely not impossible.


lmao. Some of you dreamers keep forgetting that there is no competition, and some of the tech press is pushing clickbait. The fact of the matter is we don't know what that dev board is for; looking at the NVLink connector and 3 power connectors, hell, it could be for their self-driving car platform. Mining tanked a few months ago, prices are coming down across the board, and there is excess inventory. There are people buying new 1080s today; there is no reason to release anything until those are sold off. Even then, there's the opportunity to milk Titan V performance into next year with a Ti / Titan V, whatever they want to call it. Best bet: you get Xp performance @ $649.99.


----------



## Threx

nycgtr said:


> Some of you dreamers



???


----------



## guttheslayer

nycgtr said:


> lmao. Some of you dreamers keep forgetting that there is no competition, and some of the tech press is pushing clickbait. The fact of the matter is we don't know what that dev board is for; looking at the NVLink connector and 3 power connectors, hell, it could be for their self-driving car platform. Mining tanked a few months ago, prices are coming down across the board, and there is excess inventory. There are people buying new 1080s today; there is no reason to release anything until those are sold off. Even then, there's the opportunity to milk Titan V performance into next year with a Ti / Titan V, whatever they want to call it. Best bet: you get Xp performance @ $649.99.


If it were for the self-driving car platform it would be using HBM. I am not too convinced myself, but it seems Nvidia is going straight for GV102 instead of GV104 first. GV102, by comparison, is just GV100 without the DP units and probably with half of the Tensor cores, or none at all.


There is some speculation about what this is: a super-behemoth mining card (for Turing) designed to satisfy the hungry mining community, a Titan Xp successor, or a new flagship for the GeForce 11 series. The latter is out of the norm for Nvidia, but they did it once with the GTX 780 coming from the 680 (memory also jumped 50%), so there is no guarantee it won't happen. If the latter is really true:

Possible specs:
GTX 1180: 4608 - 5376 CUDA cores, 12GB (600-660mm^2 die)
GTX 1170: 3584 CUDA cores, 8GB (400-440mm^2 die)


I am not going to lie, that die size is simply huge on 12nm and I doubt it will come cheap at all. In fact I really doubt it should exist; a mining card seems more plausible. 7nm GeForce should be the true goal for Nvidia (there is a rumour that Nvidia has just ramped production of 7nm GPUs).


----------



## Kaltenbrunner

Well I hope the prices are better, but the chances of that aren't good. I'll get a tax refund later this year, hope there's something big with enough perf/price


----------



## cainy1991

I'm still sitting here with a 1060... 

Was meant to be a stop gap GPU but honestly I'm still pretty impressed by it.


Let's see what the 1160/1160 Ti looks like lol.


----------



## Kaihekoa

First it was supposed to be March, then April, then June, now July, and here we are with 4 days left in June. I'll get excited once we actually hear the announcement from Nvidia whenever that is.


----------



## guttheslayer

Someone did an analysis of the leaked PCB test board and found the GPU package is around 26mm on each side, or 676mm^2.


Taking the ratio between GP100 and GP102, applying it to GV100, and extrapolating an estimated GV102 size gives a fairly close figure: 626mm^2. Also, GV100 uses HBM, which occupies less die space thanks to its smaller memory controller compared to GDDR6. That means a GV102 with a bigger memory controller should exceed 626mm^2.


I am not going to lie, a 12GB GTX 1180 with 5120 cores is becoming really plausible. A 24GB Quadro T6000 or V6000 might use the full 5376 cores while replacing the P6000 in that price segment. Of course the GV100 will still sit at the top with 32GB of HBM2 at $8,999.
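The extrapolation above can be reproduced from the published die areas (GP100 = 610mm^2, GP102 = 471mm^2, GV100 = 815mm^2). A rough sketch only; the result lands within a few mm^2 of the 626mm^2 quoted, depending on rounding:

```python
# Estimate a hypothetical GV102 die area from the GP100:GP102 ratio.
GP100_AREA = 610.0  # mm^2 (Pascal big compute die)
GP102_AREA = 471.0  # mm^2 (Pascal big gaming die)
GV100_AREA = 815.0  # mm^2 (Volta big compute die)

ratio = GP102_AREA / GP100_AREA        # ~0.77: gaming die vs compute die
gv102_estimate = GV100_AREA * ratio    # ~629 mm^2

# The package measured on the leaked PCB: roughly 26mm per side.
measured_area = 26.0 * 26.0            # 676 mm^2
```

The measured 676mm^2 package sitting above the ~629mm^2 estimate is consistent with the post's point that a GDDR6 memory controller should fatten the die beyond the naive extrapolation.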


----------



## RideZeLitenin

The R9 390, while still a great card with some serious nuts to it, is no match for the new Sony 4K panel I picked up this year. It'll rock 1440p decently enough, but after 3 years I'd say it's time for me to step up my game... here's looking at you, 1180.


----------



## ENTERPRISE

guttheslayer said:


> Someone did an analysis of the leaked PCB test board and found the GPU package is around 26mm on each side, or 676mm^2.
> 
> 
> Taking the ratio between GP100 and GP102, applying it to GV100, and extrapolating an estimated GV102 size gives a fairly close figure: 626mm^2. Also, GV100 uses HBM, which occupies less die space thanks to its smaller memory controller compared to GDDR6. That means a GV102 with a bigger memory controller should exceed 626mm^2.
> 
> 
> I am not going to lie, a 12GB GTX 1180 with 5120 cores is becoming really plausible. A 24GB Quadro T6000 or V6000 might use the full 5376 cores while replacing the P6000 in that price segment. Of course the GV100 will still sit at the top with 32GB of HBM2 at $8,999.


Those would be great specs for a next gen GPU but that number of Cuda Cores would have me expecting this to be a Ti variant engineering sample. I highly doubt the base 1180 will have that many cuda cores, if it did then the Ti would be insane as they always have more cuda cores.


----------



## hlreijnders

guttheslayer said:


> Someone did an analysis of the leaked PCB test board and found the GPU package is around 26mm on each side, or 676mm^2.
> 
> 
> Taking the ratio between GP100 and GP102, applying it to GV100, and extrapolating an estimated GV102 size gives a fairly close figure: 626mm^2. Also, GV100 uses HBM, which occupies less die space thanks to its smaller memory controller compared to GDDR6. That means a GV102 with a bigger memory controller should exceed 626mm^2.
> 
> 
> I am not going to lie, a 12GB GTX 1180 with 5120 cores is becoming really plausible. A 24GB Quadro T6000 or V6000 might use the full 5376 cores while replacing the P6000 in that price segment. Of course the GV100 will still sit at the top with 32GB of HBM2 at $8,999.


The leaked board has a 384-bit bus, which would make it more like a new Titan or x80 Ti part. The GTX 1080, 980 and 680 all had a 256-bit bus, and those were the Gx104 chips; only the big chips had a wider bus. The 780 Ti has a 384-bit bus, the 980 Ti has a 384-bit bus and the 1080 Ti has a 352-bit bus.


----------



## guttheslayer

hlreijnders said:


> The leaked board has a 384-bit bus, which would make it more like a new Titan or x80 Ti part. The GTX 1080, 980 and 680 all had a 256-bit bus, and those were the Gx104 chips; only the big chips had a wider bus. The 780 Ti has a 384-bit bus, the 980 Ti has a 384-bit bus and the 1080 Ti has a 352-bit bus.


I want to ask why you conveniently skipped the GTX 780. Is it because the 780 has a 384-bit bus? Lol. I have mentioned that the GTX 1100 series is analogous to the GTX 700 series, where the x70 used a different chip from the x80. For the GTX 1170 and 1180, I believe it will be the same, with the x80 using the big chip (instead of the small one).


It's not a new Ti variant. Nvidia (JHH specifically) has been marketing 25-35% gains over the previous gen for both the GTX 1080 and its Ti variant; Titan V, while huge, offers only 22-30% real-world performance over the GTX 1080 Ti. At that gap it is not suitable for a Ti -> Ti jump, which is usually at least 50-60%.

The core count increase from GTX 1080 to 1080 Ti is 40%, and from 1080 Ti to Titan V it is only 42%. The 1080 Ti arrived 9 months after the 1080, but it has now been 15 months since the 1080 Ti was released; we expect something better, really.
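Those jumps check out against the published CUDA core counts (2560 for the GTX 1080, 3584 for the 1080 Ti, 5120 for the Titan V); a quick check:

```python
# Core-count increases between recent Nvidia tiers.
GTX_1080 = 2560
GTX_1080_TI = 3584
TITAN_V = 5120

ti_jump = GTX_1080_TI / GTX_1080 - 1     # 0.40 -> +40%
titan_jump = TITAN_V / GTX_1080_TI - 1   # ~0.43 -> roughly +43%
```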


----------



## hlreijnders

guttheslayer said:


> I want to ask why you conveniently skipped the GTX 780. Is it because the 780 has a 384-bit bus? Lol. I have mentioned that the GTX 1100 series is analogous to the GTX 700 series, where the x70 used a different chip from the x80. For the GTX 1170 and 1180, I believe it will be the same, with the x80 using the big chip (instead of the small one).


Yes, I skipped the GTX 780. It was based on the same core as the 780 Ti and the first Titan. It was supposed to be the top chip, minus the Titan. But then AMD came along with their 290X, forcing nVidia to release the 780 Ti to compete with it. Because nVidia has no competition from AMD at this moment, I would be surprised if they would use this big of a bus for a mid-range chip. It's very expensive.


----------



## keikei

RideZeLitenin said:


> R9 390, while still a great card that has some serious nuts to it, is no match for the new Sony 4K Panel I picked up this year. It'll rock 1440p decently enough, but after 3 years I'd say it's time for me to step up the game... here's looking at you, 1180



You have admirable patience. I, on the other hand, must have a sickness; I get the upgrade itch with every new gen release.


----------



## guttheslayer

hlreijnders said:


> Yes, I skipped the GTX 780. It was based on the same core as the 780 Ti and the first Titan. It was supposed to be the top chip, minus the Titan. But then AMD came along with their 290X, forcing nVidia to release the 780 Ti to compete with it. Because nVidia has no competition from AMD at this moment, I would be surprised if they would use this big of a bus for a mid-range chip. It's very expensive.



So since it was based on the same core as the 780 Ti, you could also say the GTX 1180 is based on the same core as an 1180 Ti. But in this case there is a very good reason for Nvidia to do this: TSMC 7nm.

7nm maturity comes at an awkward time for Nvidia. It will be another 9 months before we start seeing 7nm GPUs, yet that is not long enough to fit Nvidia's 2-year release cycle. In other words, Nvidia is stuck in an awkward timeline: since they have already confirmed they are ramping 7nm GPUs at TSMC, Nvidia has at most 12 months to sell their 12nm GPUs, and you also need to consider the lead time to clear the 12nm inventory.


In short: there is no time for a Ti variant. Bring out a new GTX 1180, let it generate revenue during this 6-9 month period (while waiting for 7nm, which is coming in Q1 2019) and simply call it a day. If you check the specs, 7nm is a very powerful jump from 16/12nm; it has been confirmed to offer 4X the transistor density. If Nvidia doesn't jump to 7nm as soon as possible, they will be overrun by 7nm Navi, no matter how inefficient GCN is.


Fun fact: if GP102 were made on 7nm, the die size would shrink from 471mm^2 to 114mm^2. I am not joking.


----------



## zealord

guttheslayer said:


> So since it was based on the same core as the 780 Ti, you could also say the GTX 1180 is based on the same core as an 1180 Ti. But in this case there is a very good reason for Nvidia to do this: TSMC 7nm.
> 
> 7nm maturity comes at an awkward time for Nvidia. It will be another 9 months before we start seeing 7nm GPUs, yet that is not long enough to fit Nvidia's 2-year release cycle. In other words, Nvidia is stuck in an awkward timeline: since they have already confirmed they are ramping 7nm GPUs at TSMC, Nvidia has at most 12 months to sell their 12nm GPUs, and you also need to consider the lead time to clear the 12nm inventory.
> 
> 
> In short: there is no time for a Ti variant. Bring out a new GTX 1180, let it generate revenue during this 6-9 month period (while waiting for 7nm, which is coming in Q1 2019) and simply call it a day. If you check the specs, 7nm is a very powerful jump from 16/12nm; it has been confirmed to offer 4X the transistor density. If Nvidia doesn't jump to 7nm as soon as possible, they will be overrun by 7nm Navi, no matter how inefficient GCN is.
> 
> 
> Fun fact: if GP102 were made on 7nm, the die size would shrink from 471mm^2 to 114mm^2. I am not joking.


Do you have a source or something for that? I'm not an expert, I really am not, but that sounds wrong.


----------



## guttheslayer

zealord said:


> do you have a source or something for that? I am not an expert. I really am not, but that sounds wrong


https://www.semiwiki.com/forum/content/6713-14nm-16nm-10nm-7nm-what-we-know-now.html


Check out the MTr/mm^2 column, which stands for million transistors per square millimetre. For TSMC 16FF the density is 28.2, but at 7nm it quadruples to 116.7 MTr/mm^2. For comparison, GP102 (which measures 471mm^2 and packs 12 billion transistors) has a density of 25.4 MTr/mm^2, pretty much in line with the figure given by semiwiki.com.

The 7nm figure looks accurate as well, since Intel has confirmed their 10nm has a density of 100.8 MTr/mm^2. TSMC's 7nm leads them by 16% (Samsung's by 23%), which explains why Intel is panicking: they are losing the lead in node technology.


*On TSMC 7nm, a GPU with a 500mm^2 die can pack 58.3 billion transistors.* In contrast, the biggest GPU today, GV100 with its 815mm^2 die, has only 21.1 billion transistors. Do I need to say more?
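The figures in this post hang together arithmetically. A quick sketch using the quoted densities (nominal node densities from the SemiWiki table; real chips come in somewhat below nominal, as GP102's 25.4 vs 16FF's 28.2 shows):

```python
# Node-density arithmetic behind the claims above.
# Units: MTr/mm^2 = million transistors per square millimetre.
TSMC_16FF = 28.2
TSMC_7NM = 116.7
INTEL_10NM = 100.8

# GP102: 471 mm^2 and 12 billion transistors.
gp102_density = 12_000 / 471.0                # ~25.5 MTr/mm^2

# Ideal shrink by the nominal 16FF -> 7nm density ratio
# (this is where the 471 -> ~114 mm^2 "fun fact" falls out).
gp102_at_7nm = 471.0 * TSMC_16FF / TSMC_7NM   # ~113.8 mm^2

# Transistor budget of a 500 mm^2 die at 7nm.
budget_500 = 500 * TSMC_7NM / 1000            # ~58.35 billion

# TSMC 7nm density lead over Intel 10nm.
lead = TSMC_7NM / INTEL_10NM - 1              # ~0.158 -> ~16%
```

Note this is ideal density scaling only; in practice I/O, analog and memory-controller blocks shrink far less than logic, so a real 7nm GP102-class die would land larger than the naive figure.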


----------



## Falkentyne

nycgtr said:


> lmao. Some of you dreamers keep forgetting that there is no competition, and some of the tech press is pushing clickbait. The fact of the matter is we don't know what that dev board is for; looking at the NVLink connector and 3 power connectors, hell, it could be for their self-driving car platform. Mining tanked a few months ago, prices are coming down across the board, and there is excess inventory. There are people buying new 1080s today; there is no reason to release anything until those are sold off. Even then, there's the opportunity to milk Titan V performance into next year with a Ti / Titan V, whatever they want to call it. Best bet: you get Xp performance @ $649.99.


Who cares that there is no competition?
if Nvidia has the technology to push out faster cards, and they release it, and you people buy it up (like you ALWAYS do), then what's the problem?

Sheep will always follow the food source.

Nvidia is not Intel. They're doing a LOT better than Intel, financially, even if Intel is worth more.
https://www.fool.com/investing/2018/01/19/better-buy-intel-corporation-vs-nvidia.aspx


----------



## nycgtr

Falkentyne said:


> Who cares that there is no competition?
> if Nvidia has the technology to push out faster cards, and they release it, and you people buy it up (like you ALWAYS do), then what's the problem?
> 
> Sheep will always follow the food source.
> 
> Nvidia is not Intel. They're doing a LOT better than Intel, financially, even if Intel is worth more.
> https://www.fool.com/investing/2018/01/19/better-buy-intel-corporation-vs-nvidia.aspx


My statement means this gives them more opportunity to milk it further. Why offer a 30% jump over a Ti when they can offer a 10% jump at $600, then another 20-25% jump on top of that at a higher price point? The V is $3,000; take off the Tensor cores and the HBM2 and I still don't see it being under $1,200.


----------



## Falkentyne

Why milk it further?
Just push the tech so that AMD will NEVER catch up.
Them milking the old stuff is EXACTLY what AMD wants them to do. How do you think AMD caught up to Intel on the CPU front?
Because Intel was milking 4 corez for *TEN YEARS!*
That's how AMD caught up.


----------



## nycgtr

Falkentyne said:


> Why milk it further?
> Just push the tech so that AMD will NEVER catch up.
> Them milking the old stuff is EXACTLY what AMD Wants them to do. How do you think AMD caught up to Intel on the CPU front?
> Because Intel was milking 4 corez for *TEN YEARS!*
> That's how AMD caught up.


It makes money? Nvidia is generations ahead of AMD. Intel stayed stagnant; Nvidia isn't stagnant, they can develop faster chips for their enterprise, self-driving, etc. platforms. That doesn't mean the tech will come down to you unless they want it to or feel it's profitable to do so. If they can milk it a bit more while still being ahead, they sure will. When they have enough cut-downs lying around from Volta we will see Volta GPUs, and thanks to zero competition they can charge whatever price premium they want.

Also, Intel has wasted billions in the past few years trying to get into other sectors. At the end of the day, Nvidia just makes GPUs, which can be used for more than one purpose.


----------



## Lee Patekar

Falkentyne said:


> Why milk it further?
> Just push the tech so that AMD will NEVER catch up.
> Them milking the old stuff is EXACTLY what AMD Wants them to do. How do you think AMD caught up to Intel on the CPU front?
> Because Intel was milking 4 corez for *TEN YEARS!*
> That's how AMD caught up.


AMD is currently more competitive at higher core counts than at 8 cores or fewer; they have a significant manufacturing cost advantage, thanks to Infinity Fabric, that Intel cannot match at present.

I don't see why Nvidia would cannibalise their existing inventory, especially with reports of their partners returning unused inventory when the mining craze subsided.




nycgtr said:


> ... Nvidia is generations ahead of AMD. Intel stayed stagnant; Nvidia isn't stagnant, they can develop faster chips for their enterprise, self-driving, etc. platforms. That doesn't mean the tech will come down to you unless they want it to or feel it's profitable to do so. If they can milk it a bit more while still being ahead, they sure will. When they have enough cut-downs lying around from Volta we will see Volta GPUs, and thanks to zero competition they can charge whatever price premium they want.


Precisely. They may even rebrand cards at a discount just to liquidate stock.


----------



## kd5151

I think lenovo spilled the beans.


----------



## dmasteR

keikei said:


> If the 1180 is around 30% faster than the current Ti, i'm all over it. I need that extra performance. I'll pay for the early adopter fee.
> 
> Good work Threx! :thumb: I"m a fan of One Piece myself.


1180 will be 2 x 1080 performance and will be announced/released 4 months from now.


----------



## guttheslayer

nycgtr said:


> My statement means this gives them more opportunity to milk it further. Why offer a 30% jump over a Ti when they can offer a 10% jump at $600, then another 20-25% jump on top of that at a higher price point? The V is $3,000; take off the Tensor cores and the HBM2 and I still don't see it being under $1,200.


The GTX 1180 will be 25-30% faster if they adopt the Titan V core configuration, which is 5120 cores. Of course, with a better boost clock it will be slightly more. But I am fairly certain the lifespan of these 12nm 1100-series cards will be short.


What remains to be seen is whether they go straight for an 1180 Ti or just the 1180; either way, the flagship (whichever it is) will not be under $700, that is for certain.


----------



## dVeLoPe

dmasteR said:


> 1180 will be 2 x 1080 performance and will be announced/released 4 months from now.


you're a bit off on your math.


----------



## guttheslayer

dVeLoPe said:


> dmasteR said:
> 
> 
> 
> 1180 will be 2 x 1080 performance and will be announced/released 4 months from now.
> 
> 
> 
> you're a bit off on your math.
Click to expand...

I think he is being sarcastic. But either way, a 30% jump is reasonable, considering that is the gap between the 1080 and the 1080 Ti. And it's been 15 months since the 1080 Ti. It's about time.


----------



## kd5151

August 2.


----------



## Threx

kd5151 said:


> August 2.


I think this has a good chance of being the case. Even though they didn't specifically confirm new GPUs, handing out free trips to Europe for the press can't simply be about looking at some games. It's got to be something more important.


----------



## guttheslayer

kd5151 said:


> August 2.


From which source?


----------



## hlreijnders

guttheslayer said:


> kd5151 said:
> 
> 
> 
> August 2.
> 
> 
> 
> From which source?
Click to expand...

Probably from Wccftech, which claims August 2nd, during Gamescom. But Gamescom actually runs from August 21 to 25.


----------



## kd5151

guttheslayer said:


> From which source?


WCCFTech / Videocardz


----------



## LongtimeLurker

It's looking like the new 1180 will be revealed on August 2nd. Between Nvidia's fully sponsored Gamescom press invites to another country and that Lenovo rep accidentally revealing the 1180 will be released in Q3 (and August is smack in the middle of Q3), we finally have some fire to go with all this smoke.

Relevant links:

https://wccftech.com/nvidia-geforce-11-series-geforce-gtx-1180-graphics-card-lenovo-spills/
https://wccftech.com/nvidia-launching-next-generation-graphics-cards-at-gamescom-2018/

However, keep in mind it will be the boring FE versions for a bit 


> While we will almost certainly be seeing only the FE versions at this event, we don’t have any idea of how soon we can expect the custom versions to arrive.


But at least we'll hopefully be seeing leaked benchmarks soon!

And of course:


> This isn’t particularly surprising since supply is usually pretty bad at most launches and we see retailers like Amazon pumping up the MSRP to cater to the huge demand. In other words, if you manage to snag a GeForce GTX 1180 FE (or whatever NV ends up calling it) at MSRP, consider yourself very lucky.


And finally:


> As far as performance goes, we did hear that you can expect double-digit performance increase over the GeForce GTX 1080 Ti and it could reach significant digits if the company manages to work some magic using its drivers. Oh and remember when NVIDIA’s CEO Jensen stated that new GPUs are a long way off? Well, this is how he defines ‘a long way off’.


----------



## guttheslayer

30% over the GTX 1080 Ti is what I am rooting for, or ~3% faster than the Titan V. Lol.


----------



## JedixJarf

SuperZan said:


> I'll probably just stick with the 1080 Ti, but it'll be interesting to see where these cards fall performance-wise.


Same, until 4K monitors hit with reasonable refresh rates (and reasonable pricing) I'll just hold onto the 1080 ti.


----------



## guttheslayer

JedixJarf said:


> Same, until 4K monitors hit with reasonable refresh rates (and reasonable pricing) I'll just hold onto the 1080 ti.


I hope the upcoming 32" HDR10 4K@144Hz QD micro-LED panel will avoid that mistake and come with HDMI 2.1, which allows VRR for Nvidia cards.


----------



## ZealotKi11er

guttheslayer said:


> 30% over the GTX 1080 Ti is what I am rooting for, or ~3% faster than Titan V. Lol.


You think? All they've got to do is be 10% faster than the 1080 Ti FE.


----------



## black96ws6

The WCCFTech writer screwed up the date in the article; he left off the "1" when he wrote "August 2, a Tuesday".

It's actually August 21, which is when Gamescom starts, and that is a Tuesday. August 2 is a Thursday and almost three weeks before the event actually starts, so it's definitely not correct.

The GTX 1180 FE will be released on August 21.

Pricing guess: $799.


----------



## guttheslayer

ZealotKi11er said:


> You think? All they got to do is 10% faster than 1080 Ti FE.


It won't be. It will be 30%. Stop being pessimistic. Even the 1080 Ti was that much better than the 1080.


Nvidia has been keeping to a 20-30% trend, and it's about time the Titan V was dethroned. After all, that card was not marketed as a gaming card, and it is 22-29% faster in DX11 games.


My pricing guess is $699. Pricing like $799 is reserved for the upcoming 7nm GPUs, if AMD still falls well short in the GPU battle.


----------



## Recipe7

It's getting more realllll.

I'm with guttheslayer. I sold my 1080 Ti and am now back on my 980 Ti. The only way to justify a new 9700K build is with a GPU that is 30% faster than the 1080 Ti. Patiently waiting for January 2019.


----------



## caenlen

nevermind~ edit


----------



## guttheslayer

Recipe7 said:


> It's getting more realllll.
> 
> I'm with guttheslayer. I sold my 1080ti, and am now back to my 980ti. The only way to justify a new 9700k build is with a gpu that has 30% performance over 1080 ti. Patiently waiting for January 2019.



I am assuming you sold your 1080 Ti during the mining boom and got way more than the price you bought it at? If not, selling it just to wait for 30% is not worth it.

The upcoming 7nm GPUs are even nastier. A small ~240 mm² 7nm die could pack 8192 CUDA cores based on the Pascal configuration. A 12288-core 7nm Tesla GPU would require less than a 500 mm² die (~52 billion transistors), and the clock will probably boost to 1.8 GHz (>2 GHz for the smaller chip), giving double-precision compute of 22-24 TFLOPS.


Making an exascale supercomputer (Frontier?) based on these new Tesla GPUs seems a real possibility before 2021.
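For what it's worth, the 22-24 TFLOPS figure above can be reproduced with a quick sketch. All inputs are the poster's guesses rather than confirmed specs, and the half-rate FP64 assumption and function name are mine (half-rate matches recent Tesla-class parts):

```python
# Peak double-precision throughput from core count and clock:
# TFLOPS = cores * 2 FLOPs per FMA * clock (GHz) * FP64 rate / 1000
def dp_tflops(cuda_cores, boost_ghz, fp64_ratio=0.5):
    return cuda_cores * 2 * boost_ghz * fp64_ratio / 1000.0

# Hypothetical 12288-core 7nm Tesla at 1.8 GHz, half-rate FP64:
print(f"{dp_tflops(12288, 1.8):.1f} DP TFLOPS")  # 22.1 -- in the quoted 22-24 range
```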


----------



## Recipe7

guttheslayer said:


> I am assuming you sold your 1080 Ti due to mining boom and it cost way better than the price you bought them at? If not selling them just to wait for 30% is not worth it.
> 
> The upcoming 7nm GPUs are more nasty. A small 240mm sq 7nm could pack 8192 CUDA cores based on pascal configuration. A 12288 cores 7nm Tesla GPU only requires less than 500mm sq die. (52 bil transistors), and the clock is probably going to boost at 1.8 GHz (>2GHz for the smaller one), giving a Double Precision compute of 22-24 TFLOP.
> 
> 
> Making a Exa-scale Supercomputer (Frontier?) based on these new tesla GPU seem to be a real possibility before year 2021.


I sold it because I have a 2.5-year-old and a 1-year-old who have been demanding more attention from me, particularly the 1-year-old. They are not satisfied with watching me play single-player games anymore; they want to take part and slam their palms on the keyboard. Forget about online games, I will be booted within minutes for being AFK. 

I didn't make 1300USD off the sale, unfortunately, but I did at least make back what I paid for it.

Wow at those TFLOPS! I guess I should be saving for a 2000USD 144Hz 4K monitor too; it would actually be utilized with a single-GPU solution. If I put away 100 bucks a paycheck until then, it would be a viable purchase for me, ha.


----------



## guttheslayer

Recipe7 said:


> Wow at those TFLOPS! I guess I should be saving for a 2000USD 144hz 4k monitor too, it would actually be utilized with a single gpu solution. If I put away 100 bucks a paycheck until then, It would be a viable purchase for me, ha.



If you take 20 months to save up, I think by then better displays will be out and the monitor will not be $2000. I would suggest you get the micro-LED version of the X27, or even an X32 (32") with HDMI 2.1.


----------



## confed

guttheslayer said:


> If you take 20 months to save up I think by then better display will be out, and the monitor will not be $2000. I would suggest you get the Mled version of the X27 or even X32 (32") with HDMI 2.1


I hope he doesn't get paid monthly. I think he was suggesting 20 weeks, or possibly 40 weeks, definitely not 84+ weeks.


----------



## Recipe7

guttheslayer said:


> If you take 20 months to save up I think by then better display will be out, and the monitor will not be $2000. I would suggest you get the Mled version of the X27 or even X32 (32") with HDMI 2.1


I haven't been too optimistic about the monitor market. The prices of the 1440p G-Sync panels did not waver much, maybe only when other companies besides Acer and Asus started releasing their own variants, but they held firm at $699-$1199 depending on which one you wanted. I figure the 144Hz 4K market will never go below $1500, as they slowly but surely add small bells and whistles over the next 4-5 years.

I'm not too familiar with what will be coming out, but I just want to keep 2000USD in mind so as not to get disappointed when it's time to pull the trigger.



confed said:


> I hope he doesn't get paid monthly. I think he was suggesting 20 weeks, or possibly 40 weeks, definitely not 84+ weeks.


26 paychecks a year would be $2600 after a year. I am overestimating and making light of this, of course.

I'm one of those people who will blindly put money away, like a piggy bank, then smash it to pieces and buy what I was saving for a year ago. It makes a 'want' purchase a little more justifiable and less painful.
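The paycheck math being kicked around above works out as follows, assuming biweekly pay (26 checks a year) and $100 set aside per check toward a hypothetical $2000 monitor:

```python
# How many paychecks (and weeks) until a savings target is hit.
def paychecks_needed(target_usd, per_check=100, checks_per_year=26):
    checks = -(-target_usd // per_check)   # ceiling division
    weeks = checks * 52 / checks_per_year  # biweekly checks -> 2 weeks each
    return checks, weeks

checks, weeks = paychecks_needed(2000)
print(checks, weeks)  # 20 paychecks, 40.0 weeks -- about 9 months, not 20
```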


----------



## EastCoast

Wait, is this a GT104 or a GT102? I can pretty much assume that GT100 will be reserved for their Titan, etc. 

But if this is a GT104, what will the buyer's remorse be when the GT102 is released? Are we looking at another 15%+ performance gap? 

Also, with prices coming back down to normal, it's possible that the price difference (if released by fall 2018) will be more in line by the time the GT102 comes out. 

If the info is correct about the mArch we should have
GT106 1160
GT104 1170/1180
GT102 1180 TI
GT100 Titan?? XP??

But it's all speculative for now.


----------



## guttheslayer

EastCoast said:


> Wait, is this a GT104 or a GT102? I can pretty much assume that GT100 will be reserved for their Titan, etc.
> 
> But if this is a GT104 what will be the buying remorse when the GT102 is released? Are we looking at another 15%+ performance gap?
> 
> Also, with prices going back down to normal it's possible that the difference in prices (if released by fall 2018) will be more in line by the time the GT102 comes out.
> 
> If the info is correct about the mArch we should have
> GT106 1160
> GT104 1170/1180
> GT102 1180 TI
> GT100 Titan?? XP??
> 
> But it's all speculative for now.


Currently no one knows; we shouldn't assume the model list above, as that is what Nvidia uses over a two-year cycle. From what I am seeing, we are a year or less away from 7nm products, which means Turing only has a one-year lifespan. Just like Volta, Turing should basically be Pascal on core steroids (but more efficient in perf/watt).


From what I see, it will be a GT100 and GT104 together. (I really don't like the T moniker, as it has been used before for Tesla.) The GT100 is basically GV100 without the compute units, with fewer tensor cores and a GDDR6 memory controller. There is a very high chance the GTX 1180 uses GT100 directly, and the GTX 1170 will be GT104. They might squeeze in a Titan T in between with 24GB of memory, but the difference will be small (5376 cores vs 5120 cores).


Basically, they will be phased out by next year's Gamescom, and 7nm (Ampere) will be the true successor to all the GPUs we've been seeing.


----------



## ibb27

Vietnam online store (pic from Reddit); I hope it's true:


----------



## guttheslayer

ibb27 said:


> Vietnam online store (pic from reddit), I hope it's true:


The 1180 Ti naming shown in the picture is wrong.


It also remains to be seen whether it is the GTX 1180 or the 1170. 16GB is unlikely, but if it is 16GB then yes, it is the 1180; if it is 8GB, then it's the 1170.


----------



## EastCoast

guttheslayer said:


> Currently no one knows, we shouldnt assume the above model as that is used by Nvidia during a 2 years cycle.



This topic is in the Rumors and Unconfirmed Articles portion. 

But here is an article just recently published stating just that:
https://www.techpowerup.com/245657/nvidia-gt104-based-geforce-gtx-1180-surfaces-on-vietnamese-stores


Don't tell me you are going to fall for the GT104 (a mid-range µarch part) as the top-end part again? We've been through this before, and most decided they would wait until the GT102 was released. That way, if you want a GT104, at least you aren't being gouged for it. 

https://www.overclock.net/forum/225...i-launch-date-october-26th-specs-details.html


----------



## bucdan

One thing I can absolutely appreciate Nvidia for is that they lock down the rumor mill.


----------



## guttheslayer

Here is my quick guess at the upcoming 12nm GeForce Turing lineup:


GeForce GTX 1180 - 5120 cores, 1.6GHz core, 12GB G6 @ 672 GB/s, GT100, 630 mm², US$699
GeForce GTX 1170 Ti - 3584 cores, 1.8GHz core, 8GB G6 @ 448 GB/s, GT104, 420 mm², US$499
GeForce GTX 1170 - 2816 cores, 1.7GHz core, 8GB G6 @ 384 GB/s, GT104, 420 mm², US$379



Let's see how close my guess is to the actual product.
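The bandwidth figures in that guess fall straight out of bus width × per-pin data rate. The bus widths below (384-bit for the top guess, 256-bit for the others, at 14/12 Gbps GDDR6) are my inference, not something the post states:

```python
# Memory bandwidth in GB/s from bus width (bits) and per-pin data rate (Gbps).
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(384, 14))  # 672.0 -> matches the 1180 guess
print(bandwidth_gbs(256, 14))  # 448.0 -> matches the 1170 Ti guess
print(bandwidth_gbs(256, 12))  # 384.0 -> matches the 1170 guess
```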


----------



## ENTERPRISE

guttheslayer said:


> Here is my quick guess on the upcoming 12nm Geforces Turing:
> 
> 
> Geforce GTX 1180 - 5120 cores, 1.6GHz core, 12GB G6 @ 672 GB/s, GT100, 630 mm^2, US$699
> Geforce GTX 1170 Ti - 3584 cores, 1.8GHz core, 8GB G6 @ 448 GB/s, GT104, 420 mm^2, US$499
> Geforce GTX 1170 - 2816 cores, 1.7GHz core, 8GB G6 @ 384 GB/s GT104, 420 mm^2, US$379
> 
> 
> 
> Let's see how close my guess is to the actual product.


I think the GTX 1180 will have fewer CUDA cores than 5120. I feel that count will be reserved for the later 1180 Ti model.


----------



## guttheslayer

ENTERPRISE said:


> I think the GTX 1180 will have less Cuda Cores than 5120. I feel that count will be reserved for the later 1180Ti Model.


There won't be an 1180 Ti, at least that is what I feel. The 7nm GTX 1280 will be next.


----------



## Threx

The GTX x80 debut cards for Kepler, Maxwell, and Pascal did not use Gx100, so I highly doubt the 1180 will use GT100 if it is indeed a new architecture.


----------



## ENTERPRISE

guttheslayer said:


> There wont be 1180 Ti, at least that is what I feel. It will be 7nm GTX 1280 next.


Nvidia will milk this gen out, I feel, a little like Pascal; possibly not as long, but milking nonetheless. I think a Ti will be on the cards, as per the last two gens. Plus, 7nm production is going to take a little more work and time before it is ready for another consumer generation, and to give themselves more time Nvidia will undoubtedly release a Ti version.


----------



## guttheslayer

Threx said:


> The GTX x80 debut cards for kepler, maxwell, and pascal did not use Gx100 so I highly doubt the 1180 will use GT100 if it is indeed a new architecture.



You are right IF it were a completely new architecture (a new architecture means you cannot infer anything at all, not even CUDA cores). But I wouldn't call Turing a new µarch if it is just Volta stripped down to its CUDA cores.




ENTERPRISE said:


> Nvidia will milk this gen out I feel, a little like Pascal. Possibly not as long but milking nonetheless. I think a Ti will be on the cards as per the last 2 gens. Plus I think 7nm production is going to take a little more work and time before it is ready for another gen release to consumers and to give themselves more time Nvidia will undoubtedly release a Ti version.


AMD won't wait that long; the fact that 7nm is ramping for Apple's iPhone this September means at most a 9-month lead time. Nvidia only has up to a 12-month cycle, from this year's Gamescom to Q3 next year. AMD's Navi will be out by Q1 2019, and given the 4x density (>2.5x power savings), you can imagine the monstrous number of cores they can pack in. Nvidia will lose a big slice of the pie if they are still on 12nm.


Don't forget everyone is also in a race to exascale computing, and the US wants to be first to an exascale supercomputer, in 2020 or 2021. Nvidia's next-gen Tesla on 7nm could deliver, if they can iron out all the big-chip issues before 2020.


There are many reasons why we will see next-gen 7nm GeForce by Q3 2019. It's all about competition, not just in gaming but everywhere.


----------



## epic1337

So it's finally July... do we have an accurate launch date yet? End of July? Or maybe next year's July?


----------



## guttheslayer

epic1337 said:


> so its finally july... do we have an accurate launch date yet? end of july? or maybe next year's july?


I thought Gamescom made it very clear? Aug 21. Which is in line with the Hot Chips invitation mentioning a next-gen mainstream GPU (before it was removed shortly after).


----------



## prjindigo

It's July; where's the card?


----------



## tajoh111

guttheslayer said:


> You are right IF it was a complete new architecture (new architecture means you cannot infer anything at all not even CUDA cores). But i don't call Turing a new mArch if it was Volta stripped down to just CUDA cores.
> 
> 
> 
> 
> AMD wont wait so long, the fact that 7nm is preparing for Apple Iphone this Sept means we only expect 9 months lead time at most. Nvidia only have up to 12 months of cycle starting from this year Gamescon to next year Q3. AMD Navi will be out by Q1 2019. And given the 4X density (>2.5x power saving) u can imagine the monstrosity amount of cores they can pack in. Nvidia will be losing a big slice of the pie if they are still on 12nm.
> 
> 
> Dont forget everyone is in a race to exa-scale computing also, and US want to be in the first in 2020 or 2021 to hit exa-scale supercomputer. Nvidia next gen tesla on 7nm could deliver, if they can iron out all the big chip issue before 2020.
> 
> 
> There are many reason why we will see next gen Geforce 7nm by Q3 2019. Its all about competition, not just on gaming field, but everywhere.


I suspect Navi will be out at the end of Q2 at the earliest, and even that is not likely. 

The problem with any mass-produced 7nm chip is volume anytime soon, unless you're Apple or Qualcomm. 7nm will not be able to sustain the chip volumes demanded by something like a $300 GPU, which results in sales cannibalization, i.e. 7nm killing sales of the current lineup while not shipping the volume needed to offset those lost sales. A 7nm Vega is possible because it is dedicated to the pro market, where thousands of chips are being made, not hundreds of thousands to millions; and since AMD has had no real presence in the professional segment (Hawaii hasn't been updated for years), there is nothing to cannibalize. 

With Qualcomm and Apple eating all the 7nm FinFET capacity, particularly TSMC's, I think the second half of 2019 is most likely. Wafers will be incredibly expensive for 7nm FinFET given the crazy demand at the end of 2018 and beginning of 2019. 

Depending on how much Navi deviates from GCN, the driver work could be substantial. 

I could see AMD releasing 7nm Navi around July 2019 at the earliest, with Nvidia's chips arriving in August or September. However, what I think is more likely is Navi near the end of 2019. The reason is Zen 2. Zen 2, particularly in this post-mining market, will be the much more profitable play. 

With Intel's 10nm not looking so hot, if I were Su, I would be rushing and prioritizing 7nm Zen 2, because this is the first time AMD has a chance at parity or even a process advantage over Intel. That is where all my wafers and development money would go, because the revenue and profit gains are much richer there: capturing a few percentage points means hundreds of millions to billions, unlike the small pie that is the GPU market, where it takes double-digit percentage points to really move revenue. 

Also, the profit per unit on a CPU is much better, since they can sell Zen 2 for $500 without any board partner or component costs eating into the selling price. 

Spending money on wafers and R&D for 7nm GPUs would be a waste if it delays or reduces supply of Zen 2. This is the smart play, which is why Lisa will do it. 

AMD could release a 7nm Navi in Q1, but it would be the dumb play: it would eat into Zen 2 resources, be supply-limited, and come with half-baked drivers. 

I made the same prediction about Zen/Vega in 2015/2016 and it proved right, and I suspect the same will happen again.


----------



## guttheslayer

tajoh111 said:


> guttheslayer said:
> 
> 
> 
> You are right IF it was a complete new architecture (new architecture means you cannot infer anything at all not even CUDA cores). But i don't call Turing a new mArch if it was Volta stripped down to just CUDA cores.
> 
> 
> 
> 
> AMD wont wait so long, the fact that 7nm is preparing for Apple Iphone this Sept means we only expect 9 months lead time at most. Nvidia only have up to 12 months of cycle starting from this year Gamescon to next year Q3. AMD Navi will be out by Q1 2019. And given the 4X density (>2.5x power saving) u can imagine the monstrosity amount of cores they can pack in. Nvidia will be losing a big slice of the pie if they are still on 12nm.
> 
> 
> Dont forget everyone is in a race to exa-scale computing also, and US want to be in the first in 2020 or 2021 to hit exa-scale supercomputer. Nvidia next gen tesla on 7nm could deliver, if they can iron out all the big chip issue before 2020.
> 
> 
> There are many reason why we will see next gen Geforce 7nm by Q3 2019. Its all about competition, not just on gaming field, but everywhere.
> 
> 
> 
> I suspect Navi will be out at the earliest the end of q2, but this is not likely.
> 
> The problem with any mass produced 7nm chip is volume anytime soon unless your Apple or Qualcomm. 7nm will not be able to sustain the quantity of chips demands for something like a 300 dollar GPU which results in sales cannabalization. I.e 7nm killing sales of their current lineup while not being able to sell the volume needed to offset those lost sales. A 7nm vega is possible because it's dedicated to the pro market where 1000's of chips are being made, not hundreds of thousands to millions and AMD has no presence in the professional segment since hawaii hasn't been updated for years, thus no cannibalization.
> 
> With Qualcomm and Apple eating all the capacity of 7nm finfet, particularly TSMC variant, I think 2nd half of 2019 is probably the most likely. Wafers will be incredibly expensive for 7nm finfet with the crazy demand during the end of 2018/beginning of 2018.
> 
> Depending on how much navi deviates from GCN if it is different, the driver work will be substantial.
> 
> I would see AMD releasin 7nm navi about July 2019 at the earliest with Nvidia's chips being released in August or September. However, what I think is more likely is near the end of 2019 for Navi. The reason is Zen 2. Zen 2 particularly in this post mining will be the much more profitable play.
> 
> With Intel's 10nm not looking so hot, if I am Su, I am rushing and prioritizing 7nm zen 2 because this will be the first time they have the chance to have parity or even a process advantage over Intel. That's where all my wafers and money spend on development would go because the revenue and profit gains are much more rich and capturing a few percentage points means hundreds of millions to billions unlike the small pie that is the GPU market where it takes double digit percentage points to really increase revenue.
> 
> Also the profit on a CPU per unit is much betters since they will be able to sell Zen 2 for 500 dollars without any board partner/components eating into the selling price.
> 
> Spending money on wafers and R and D on 7nm for GPU would be a waste if it delays or reduces supply for zen 2. This is the smart play which is why Lisa will do it.
> 
> AMD could release a 7nm navi in Q1 but it would be the dumb play because it would eat into Zen 2 resources, would be supply limited, and come with half baked drivers.
> 
> I made the same prediction on Zen/Vega in /2015/2016 and it proved right and I suspect the same will happen again.
Click to expand...

It doesn't matter; it still shows Nvidia has a 12-month cycle for 12nm Turing, not 24 months. By Sept 2019 we should be on 7nm.

So a direct skip to GT100 is correct, and the Ti moniker will be skipped this round.

The Ti moniker will come back in the GTX 1200 series.


----------



## ibb27

> ...
> Meanwhile, NVIDIA's traditional hardware development cadence and redacted Hot Chips talks point to a new consumer lineup in the near future, and it will be very interesting to see what architectural features turn up.
> ...



Even AnandTech confirms the new "consumer lineup".


----------



## ENTERPRISE

guttheslayer said:


> You are right IF it was a complete new architecture (new architecture means you cannot infer anything at all not even CUDA cores). But i don't call Turing a new mArch if it was Volta stripped down to just CUDA cores.
> 
> 
> 
> 
> AMD wont wait so long, the fact that 7nm is preparing for Apple Iphone this Sept means we only expect 9 months lead time at most. Nvidia only have up to 12 months of cycle starting from this year Gamescon to next year Q3. AMD Navi will be out by Q1 2019. And given the 4X density (>2.5x power saving) u can imagine the monstrosity amount of cores they can pack in. Nvidia will be losing a big slice of the pie if they are still on 12nm.
> 
> 
> Dont forget everyone is in a race to exa-scale computing also, and US want to be in the first in 2020 or 2021 to hit exa-scale supercomputer. Nvidia next gen tesla on 7nm could deliver, if they can iron out all the big chip issue before 2020.
> 
> 
> There are many reason why we will see next gen Geforce 7nm by Q3 2019. Its all about competition, not just on gaming field, but everywhere.


I agree with what you are saying; then again, theoretically speaking, the Ti is kind of already out in the guise of the Turing Titan, so with that in mind perhaps they will skip the Ti, which is just the gaming variant of the Titan. We will have to wait and see.


----------



## Threx

Another possible leak, from a Thai site called notebookspec.com, shows prices of $724 for the 1180 and $573 for the 1170, including VAT, in their part-picker section.

https://i.imgur.com/1fJKH2n.jpg

Btw, that listed price for the 1180 is about the same as the 980's when it first came out here in Thailand. So if (big IF) the leak is accurate, expect the MSRP of the 1180 to match the 980's.


----------



## guttheslayer

Threx said:


> Another possible leak from a Thai site called notebookspec.com shows prices of 1180 being $724 and 1170 being $573, including vat, in their part picker section.
> 
> https://i.imgur.com/1fJKH2n.jpg
> 
> Btw, that listed price of the 1180 is about the same price as the 980 when it first came out here in Thailand. So if (big IF) the leak is accurate then expect the MSRP of the 1180 to be the same as the 980.



If it's US$549, then the 1180 will indeed be 3584 cores coupled with 8GB of RAM (a 1GB GDDR6 module costs close to $10 according to GN, or $80 for the memory alone without the GPU; a 2GB module is probably close to 2x the price, hence 16GB at $549 is unlikely). The 1180 Ti, with 11-12GB and 5120 cores, should then be released shortly after, or at the same time, at a $699-$799 MSRP. 

If that is really the case, though, it looks like nothing more than a Pascal refresh with faster memory and more cores.
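The memory-cost reasoning above can be sketched out quickly. The ~$10-per-1GB-module figure is the rumor being cited, not a confirmed price, and the helper function is my own illustration:

```python
# Rough VRAM bill-of-materials cost: number of modules times module price.
def vram_cost(capacity_gb, module_gb=1, usd_per_module=10.0):
    modules = capacity_gb // module_gb
    return modules * usd_per_module

print(vram_cost(8))                                      # 80.0 -> $80 for 8GB of 1GB modules
print(vram_cost(16, module_gb=2, usd_per_module=20.0))   # 160.0 -> if 2GB modules cost ~2x
```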


----------



## Khelben

guttheslayer said:


> The 1180 Ti with 11-12GB 5120 cores *should be released shortly or even side by side* with a MSRP of $699 - $799. This will be interesting if it is true.


May I ask if this is anything other than wishful thinking? Ti variants are usually released almost a year after the base model, are they not? I'm not trying to be snide or sarcastic, btw; just really curious why you think so.


----------



## guttheslayer

Khelben said:


> May i ask if this anything other than wishful thinking? Ti variants are usually released almost a year after the base model are they not? I m not trying to be snide or sarcastic btw. Just really curious as to why you think so.


I have been saying this many times: a one-year cycle, not a two-year cycle.

I find it funny that many don't understand the 7nm process node is around the corner, by which I mean we are within 12 months of a 7nm GPU. If Turing is released this September, the next gen on 7nm will be released by Sept 2019, not in 2020.


Nvidia will not reuse the Turing codename for their 7nm products, so Turing will exist for only a year. That is what I mean.


----------



## guttheslayer

Anyway, be it GTX 1180 or GTX 1180 Ti, it doesn't matter. What I am trying to say is that the big Turing chip will pop up by this September, it will occupy the $700-$800 price bracket, and it will be around the size of Big Maxwell (GM200).

There are two options for how it could turn out:

1) If the big Turing chip (5120 cores) is named GTX 1180, then the medium Turing (3584 cores), the GTX 1170, will occupy the $400-$450 bracket.

2) If the big Turing chip (5120 cores) is named GTX 1180 Ti, then the medium Turing (3584 cores) will be named GTX 1180 and will occupy the $500-$550 bracket.


Whether Nvidia ends up with option 1 or 2 in terms of 1100-series naming, neither changes the fact that we might see a big Turing GTX (not Titan) chip before Q4 this year, and next-gen 7nm GeForce by Q3 next year.


Note: Option 2 is the more attractive option for Nvidia, as it helps them price the medium Turing a bit higher.


----------



## SuprUsrStan

guttheslayer said:


> Anyway be it GTX 1180 or GTX 1180 Ti it doesnt matter. What I trying to say is, the big Turing Chip will pop up by this Sept, and that it will occupy the price bracket of $700-$800, and will be around the size of Big Maxwell (GM200)
> 
> There are 2 option on how it will turn out:
> 
> 1) If the big Turing chip (with 5120 cores) is named as GTX 1180, then the medium Turing (with 3584 cores) GTX 1170 will occupy the $400-$450 bracket
> 
> 2) If the big Turing chip (with 5120 cores) is named as GTX 1180 Ti, then the medium Turing (with 3584 cores) will be renamed as GTX 1080 and will occupy the $500-$550 bracket.
> 
> 
> Doesnt matter Nv will end up with option 1 or 2 in term of 1100 series naming, both options doesnt change the fact that we might see big Turing GTX (not titan) chip before Q4 this year, and next gen 7nm Geforce by Q3 next year.
> 
> 
> Note: Option 2 is a more attractive option for Nvidia as it can help them to price their medium Turing abit higher.




It's too early. It won't be by this September.


----------



## guttheslayer

Syan48306 said:


> It's too early. It wont be by this September.


Even if Sept is too early, they will still release big Turing within this year, so they have 6-9 months in 2019 to clear them out. It also means option 1 is more viable.


We shall see how this turns out; it will be interesting.


----------



## ENTERPRISE

Looks like the 1180Ti will be a thing in the end.


----------



## Wishmaker

Looks at calendar, sees July, and asks: WHERE IS IT?


----------



## Newbie2009

Hardware Hoshi said:


> The date of July is rumored almost everywhere. If this isn't the usual copy&paste journalism, the release of the new GTX generation must be really close.
> 
> I am personally not excited about the GTX 1180, because its price tag is probably beyond 800 Euros for early adopters again. To me it is more of a technical interest whether, or by how much, the card beats a current Pascal GTX 1080 Ti or even a Titan. Maybe a GTX 1170 could be interesting, but I fear the price is closer to 500 Euro/Dollar too.
> 
> The only thing left for me is a potential GTX 1160 at GTX 1080 levels for my HTPC. Below 120W TDP it would be another killer card to have. I pray to god this will be below 300 Euro/Dollar. IIRC the smaller chips got released later than the big Gx104 ones. Waiting another 4-6 months after the release of the new gen might get really painful.
> 
> At least this is something to look forward to.


That would be more expensive than a 1080 Ti, so I doubt it. (I presume the new non-Ti card will be the same speed as, or slightly faster than, the 1080 Ti.)


----------



## ZealotKi11er

I am going to skip 12nm GPU considering 7nm is around the corner.


----------



## guttheslayer

ZealotKi11er said:


> I am going to skip 12nm GPU considering 7nm is around the corner.


And that is exactly why they have to release the big Turing chip fast.




Newbie2009 said:


> (I presume the new card, non ti , will be same/slightly faster than 1080ti)


If the GTX 1180 is as fast or slightly faster, it will be only $500+; if it is 25-30% faster than the 1080 Ti, it will be _at least_ $699. Simple and easy.


----------



## doom26464

7nm gaming GPUs are still a long way off. We should be getting 7nm professional cards in Q1 2019. Gaming cards at this point only come out on mature, high-yield nodes.

12nm gaming GPUs will probably be around for 2 years, if I had to guess.


----------



## guttheslayer

doom26464 said:


> 7nm gaming GPUs are still a long way off. We should be getting 7nm professional cards in Q1 2019. Gaming cards at this point only come out on mature, high-yield nodes.
> 
> 12nm gaming GPUs will probably be around for 2 years, if I had to guess.


No it won't. 12nm will not last 2 years with AMD on 7nm by year's end or Q1 2019 (starting with Polaris 680). I believe we should see 7nm GPUs from Nvidia by mid-2019; Nvidia is never more than 6 months behind AMD on a process-node jump.


----------



## Lee Patekar

I knew about the 7nm node for Epyc server processors and Vega professional GPUs... but I heard nothing of Polaris or any other gaming card on 7nm. Was there a source I missed?


----------



## chessmyantidrug

The only way this upcoming series of graphics cards can't last anyone two years is if these cards aren't adequate from day one. I didn't even need to upgrade from Maxwell to Pascal, but the price justified the extra performance. I'm probably going to hold off on upgrading until I can get another 100% performance bump for under $400.


----------



## doom26464

I also have heard no such thing about a 7nm Polaris GPU.

Source?


----------



## keikei

A 12nm Polaris refresh is rumored for late 2018, but I'm finding nothing regarding 7nm. 7nm might be reserved for Navi? https://wccftech.com/amd-gpu-rumors-12nm-polaris-30-7nm-navi-7nm-vega-20-radeon-rx-gaming/


----------



## kd5151

Looks like we are getting more RX 580's and RX Vega56/64 until FEB 2019.

https://videocardz.com/76737/asrock...o-next-gen-radeon-till-at-least-february-2019

Sad


----------



## doom26464

Then I still stand by my claim: 12nm Nvidia gaming GPUs will be around for 2 years at least.


----------



## Gary2015

The 1180s will cost over $1000. Nvidia has a lock on this market, so they will charge what they want. Those thinking it will cost $700-800 are deluding themselves. Just look at their pricing history with the Titans.


----------



## chessmyantidrug

Gary2015 said:


> The 1180s will cost over $1000. Nvidia has a lock on this market so they will charge what they want. Those thinking that it will cost $700-800 are deluding themselves. Just look at their pricing history with the Titans.


This seems rather ignorant. Firstly, the GTX 1180 won't be a Titan so it won't be priced like a Titan. At the very most, it will be priced like a GTX 1080 Ti. I would be surprised to see an MSRP over $699 for the Founder's Edition and $599 for non-Founder's Edition. If they priced them over $1000, very few people would buy them. It's hard to make money if no one buys your product.


----------



## Malinkadink

chessmyantidrug said:


> This seems rather ignorant. Firstly, the GTX 1180 won't be a Titan so it won't be priced like a Titan. At the very most, it will be priced like a GTX 1080 Ti. I would be surprised to see an MSRP over $699 for the Founder's Edition and $599 for non-Founder's Edition. If they priced them over $1000, very few people would buy them. It's hard to make money if no one buys your product.


No, people will buy it; they always do. If 100 people buy it at $500, but only 50 buy it at $1000, they still end up making the same amount of money, and seeing how people were/are buying 1080 Tis for $1k because of miners, I'd see folks buying the new cards for $1k too if that's what it cost to get the best. I'm not spending more than $700 on anything less than an 1180 Ti, assuming I actually want or need one to replace my 1080, which at this point is a no.


----------



## MonarchX

I just hope it will finally be a true 4K gaming card or at least the Ti version of it will be. I want to get a way better display and LG OLED 4K is where it's at, but with current specs it's not an option...


----------



## ENTERPRISE

MonarchX said:


> I just hope it will finally be a true 4K gaming card or at least the Ti version of it will be. I want to get a way better display and LG OLED 4K is where it's at, but with current specs it's not an option...


Same here; I am hoping they bring HDMI 2.1 to the table. It has been said that the new-gen GPUs will have increased bandwidth for VR and 4K at 144Hz. It was speculated it could be an Nvidia proprietary connector, but most believe it will be the HDMI 2.1 standard, which makes more sense to me and will make life much better once the next-gen screens are out, especially for those wanting to double up their TVs as monitors.


----------



## Lee Patekar

Malinkadink said:


> I'm not spending more than $700 on anything less than an 1180Ti assuming i actually want or need one to replace my 1080, which at this point is a no.


This is why I don't think they'll charge Titan-level prices for their card. They probably will increase the MSRP by a hundred or so, but not by three hundred or more. With incremental price increases you slowly acclimate your customers to higher prices... suddenly throw them into the fire of a non-Titan $1k card, and they'll jump out of the pot.

We're essentially frogs in a pot of water being brought to a boil slowly.


----------



## chessmyantidrug

Malinkadink said:


> No, people will buy it; they always do. If 100 people buy it at $500, but only 50 buy it at $1000, they still end up making the same amount of money, and seeing how people were/are buying 1080 Tis for $1k because of miners, I'd see folks buying the new cards for $1k too if that's what it cost to get the best. I'm not spending more than $700 on anything less than an 1180 Ti, assuming I actually want or need one to replace my 1080, which at this point is a no.


It's cute to see you don't understand how margins work.


----------



## Jedson3614

I can not comment at this time, but I can say OCN is definitely getting all variations of the 11 series cards in for review. We will have a full breakdown of PCB and overview of founders and aftermarket solutions. Stay tuned for some great upcoming content : ) 

Don't forget to check out our latest reviews on the front page, Sponsored OCN Lab Reviews under the "Reviews" section, and don't forget to check out the improved YouTube Channel and hit that subscribe button.


----------



## Woundingchaney

Malinkadink said:


> No, people will buy it; they always do. If 100 people buy it at $500, but only 50 buy it at $1000, they still end up making the same amount of money, and seeing how people were/are buying 1080 Tis for $1k because of miners, I'd see folks buying the new cards for $1k too if that's what it cost to get the best. I'm not spending more than $700 on anything less than an 1180 Ti, assuming I actually want or need one to replace my 1080, which at this point is a no.



Every market has a sweet spot determined by several factors; it's not as if Nvidia or any large manufacturer simply picks a price point at random. Production costs, marketing, support, component supply, consumer demand, market penetration, the status of internal business, R&D, and numerous other factors are all carefully weighed when setting a consumer price. Your analogy completely disregards production costs, shipping costs, support costs, and many other factors. Profit margins at $500 and $1000 are considerably different.
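To make the margins point concrete, here is a toy comparison under an assumed fixed unit cost (the $350 figure is invented purely for illustration; real BOM, shipping, and support costs are unknown). Equal revenue at two price points does not mean equal profit:

```python
# Toy margin comparison: equal revenue does not imply equal profit.
# All numbers are hypothetical, for illustration only.

def profit(units_sold: int, price: float, unit_cost: float) -> float:
    """Gross profit = units sold times the per-unit margin."""
    return units_sold * (price - unit_cost)

UNIT_COST = 350.0  # assumed fixed cost per card (BOM, shipping, support)

low_price = profit(100, 500.0, UNIT_COST)   # 100 cards at a $150 margin
high_price = profit(50, 1000.0, UNIT_COST)  # 50 cards at a $650 margin

print(f"100 cards @ $500  -> ${low_price:,.0f} gross profit")
print(f" 50 cards @ $1000 -> ${high_price:,.0f} gross profit")
```

Both scenarios bring in $50,000 of revenue, but the high-price case clears more than twice the gross profit once the fixed unit cost is subtracted, which is the flaw in the "same amount of money" argument above.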


----------



## MonarchX

ENTERPRISE said:


> Same as that, I am hoping that they are bringing HDMI 2.1 to the table, it has been said that the new gen GPU's will have increased bandwidth for VR and 4K at 144Hz. It was speculated it could be an Nvidia propriety connector but most believe it will be the HDMI 2.1 standard, which makes more sense to me and will make life much better for when the next gen screens are out so far as those wanting to double up their TV's as monitors.


I would even settle for 60Hz, as long as I can get constant and consistent 60fps 90% of the time. I would also prefer 60Hz + variable refresh rate support to 120Hz without variable refresh rate support. I doubt we will get some kind of non-G-Sync variable refresh rate support at any point and I doubt we will get G-Sync modules for high-end displays like LG 4K OLED.


----------



## Threx

Jedson3614 said:


> I can not comment at this time, but I can say OCN is definitely getting all variations of the 11 series cards in for review. We will have a full breakdown of PCB and overview of founders and aftermarket solutions. Stay tuned for some great upcoming content : )
> 
> Don't forget to check out our latest reviews on the front page, Sponsored OCN Lab Reviews under the "Reviews" section, and don't forget to check out the improved YouTube Channel and hit that subscribe button.


So at least you're confirming that the 11 cards are coming soon.


----------



## guttheslayer

doom26464 said:


> Then I still stand by 12nm nvidia gamming gpus will be around for 2 years at least.


No it won't. Unless something goes terribly wrong with 7nm, there is no reason to believe it will take 2 years for a 7nm NV GeForce card to come out.


There is no need for any more debate: if a big Turing chip measuring ~600 mm^2 turns up this Q3/Q4, it means I am right and 7nm is coming within 12 months.


----------



## guttheslayer

Jedson3614 said:


> I can not comment at this time, but I can say OCN is definitely getting all variations of the 11 series cards in for review. We will have a full breakdown of PCB and overview of founders and aftermarket solutions. Stay tuned for some great upcoming content : )
> 
> Don't forget to check out our latest reviews on the front page, Sponsored OCN Lab Reviews under the "Reviews" section, and don't forget to check out the improved YouTube Channel and hit that subscribe button.



Hi Jed, let's see if what I predicted is anywhere near:

GeForce GTX 1180 - 5120 cores, 1.6GHz core, 12GB G6 @ 672 GB/s, GT100, 630 mm^2, US$699
GeForce GTX 1170 - 3584 cores, 1.8GHz core, 8GB G6 @ 448 GB/s, GT104, 420 mm^2, US$449
GeForce GTX 1160 - 2688 cores, 1.7GHz core, 6GB G6 @ 336 GB/s, GT104, 420 mm^2, US$299


----------



## Kana Chan

A 1160 with 2688 cores is a bit high for Nvidia to sell at $299; that would beat the 1080 and 1070 Ti. Doing this would certainly put ATI out of business until Q2 2019.
An RX 580 is roughly a 1060, and this is more than double the cores with more cache.


----------



## guttheslayer

Kana Chan said:


> 1160 with 2688 is a bit high for nvidia to sell at 299. That beats the 1080 and 1070 Ti. Doing this would certainly put ATI out of business until Q2 2019.
> An RX580 ~ 1060 and this is more than double the cores with higher cache.


There is a Polaris 680 coming for a start, and that itself could beat Vega 64 performance with double the transistor count (7nm is >4x more dense, so it can pack 4096 cores in roughly half the area).


7nm GPUs are capable of giving GTX 1080 performance at RX 580 prices, which is in line with the rumors right now.


----------



## Threx

1160 should be in the same ballpark as 1080, like how the 1060 is similar to 980. 1170 should be similar to 1080 ti, and 1180 should be 15-30% faster than 1080 ti.


----------



## guttheslayer

Threx said:


> 1160 should be in the same ballpark as 1080, like how the 1060 is similar to 980. 1170 should be similar to 1080 ti, and 1180 should be 15-30% faster than 1080 ti.


I thought the 1060 is faster than the 980?


I won't be surprised if the GTX 1160 is faster than the GTX 1080, but the difference this time is that the former has 2 GB less memory than the latter. I think that makes for a good balance somehow!

Which is also why I feel 5120 -> 3584 -> 2688 cores is a good step down in performance from the GTX 1180 to the GTX 1160.


----------



## keikei

Threx said:


> So at least you're confirming that the 11 cards are coming soon.


His non-confirmation was the confirmation. Lol. Gonna put my card up for sale as soon as I get word.


----------



## guttheslayer

Threx said:


> So at least you're confirming that the 11 cards are coming soon.


The biggest confirmation from him is that he has put all the GTX 2000-series rumors to rest. For good.


----------



## Threx

guttheslayer said:


> I thought the 1060 is faster than the 980?


Sometimes, not always. They pretty much trade blows. They're very much on the "same tier."


----------



## ToTheSun!

guttheslayer said:


> The biggest confirmation from him is that he has put all the GTX 2000s rumor to rest. For good.


Brainbean had already done that back in June, to be fair.


----------



## sefwe

Why would Nvidia release anything worthwhile with 74% Steam market share and AMD only reaching 1070 levels, in grill mode? Intel played this game for 7 years, and Nvidia isn't likely to like money any less.


----------



## Threx

Why would nvidia release anything at all if it's not gonna beat current cards?


----------



## keikei

Shareholders and to a lesser extent, AMD?


----------



## guttheslayer

Threx said:


> Why would nvidia release anything at all if it's not gonna beat current cards?


Because competition doesn't come from AMD or gaming alone.


Also, Nvidia doesn't want to be caught with its pants down like Intel did. 7nm is all it takes to put AMD back in the game if Nvidia doesn't jump to 7nm fast.


----------



## Threx

guttheslayer said:


> Because competition doesn't come from AMD or gaming alone.


I was talking about the new GeForce lineup only.


----------



## guttheslayer

Threx said:


> I was talking about new geforce line up only.


Well, the design cost was spent with AI in mind, and serving the gaming market with the same chip helps generate extra revenue.

There's no point letting the gaming lineup lag 2, 3, 4 generations behind its AI counterpart over time.


----------



## doom26464

guttheslayer said:


> doom26464 said:
> 
> 
> 
> Then I still stand by my claim: 12nm Nvidia gaming GPUs will be around for 2 years at least.
> 
> 
> 
> No it wont be, unless something went terribly wrong with 7nm, there is no reason to believe it will take 2 years for NV Geforce card to come out for
> 
> 
> 
> There is no need for any more debate: if a big Turing chip measuring ~600 mm^2 turns up this Q3/Q4, it means I am right and 7nm is coming within 12 months.

Nonsense. I doubt Nvidia would even launch 12nm gaming cards for such a short period on the market.

Unless the 11 series is on 7nm, but then that means we won't see gaming cards till next year. 7nm is far too expensive for gaming cards right now, hence we will just start to see professional cards at the start of 2019.

Professional cards can get away with low-yield, cutting-edge nodes because they sell at high prices. Gaming cards cannot; it's been that way for a while now. Unless Nvidia plans to sell cutting-edge 7nm gaming cards at absurdly high prices, possible for a Titan variant but not for mainstream xx70 and xx80 variants.


Your reference to 7nm Polaris cards coming out of thin air already throws your credibility out the window.


----------



## Threx

guttheslayer said:


> Well the design cost went to AI in mind, and serving the gaming market using the same chip can help to generate extra revenue.
> 
> No point keeping gaming lineup lag behind their AI counterpart 2,3,4 generation over time.


Eh, not sure if we're talking about the same point lol.

I was responding to someone saying "why would nvidia release *anything worthwhile* when they already own the gaming market" which implies that new cards won't be faster than old cards, to which I responded with "why would they bother releasing new cards at all then if they are not gonna be faster than old cards."


----------



## Asmodian

guttheslayer said:


> There is a Polaris 680 coming in for a start so that itself could be better than Vega 64 performance with double transistors count (7nm is > 4X more dense, so it means it can pack 4096 cores while being only 1/2 as small )
> 
> 
> The 7nm GPU are capable to give GTX 1080 performance at only RX580 prices, which is in-line with the rumors right now.


4x density would be amazing! Semiwiki agrees, TSMC's 7nm is a bit more than 4x the density of their 16nm (116.7/28.2=4.14x MTr/mm²). I hope 7nm GPUs are not too far away.

Too bad no one is going to come out with a 600mm² 7nm GPU anytime soon.
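For anyone who wants to plug in the numbers, a quick sketch of that density math (the TSMC figures are the Semiwiki ones quoted above; the Polaris 10 die size and transistor count are approximate public specs, used here only for scale):

```python
# Sanity check on the quoted TSMC density figures (MTr/mm^2).
density_16nm = 28.2    # TSMC 16nm, per Semiwiki
density_7nm = 116.7    # TSMC 7nm, per Semiwiki

ratio = density_7nm / density_16nm
print(f"7nm vs 16nm density: {ratio:.2f}x")

# Rough implication for an ideal shrink: Polaris 10 is ~232 mm^2
# with ~5.7B transistors (14nm, treated here as 16nm-class).
polaris10_area_mm2 = 232.0
ideal_shrink = polaris10_area_mm2 / ratio
print(f"Ideal 7nm shrink of Polaris 10: ~{ideal_shrink:.0f} mm^2")
```

This is an ideal scaling number; real shrinks land well short of it because analog blocks, I/O, and SRAM do not scale as well as logic.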


----------



## Kana Chan

guttheslayer said:


> I thought the 1060 is faster than the 980?
> 
> 
> I wont be surprised if the GTX 1160 is faster than GTX 1080, but the difference this time is that the former have 2 GB less memory capacity as compared to the latter. I think that made a good balance somehow!
> 
> Which is also why I feel 5120 -> 3584 -> 2688 cores is a good step down in performances from GTX 1180 -> 1160.


At minimum it should be 2688 / 2560 = 1.05x faster, without even counting the increased cache sizes.

There were four versions of the P4 with 256 KB / 512 KB / 1024 KB / 2048 KB of L2, and each step added some % increase.


----------



## guttheslayer

doom26464 said:


> Nonsense. I doubt Nvidia would even launch 12nm gaming cards for such a short period on the market.
> 
> Unless the 11 series is on 7nm, but then that means we won't see gaming cards till next year. 7nm is far too expensive for gaming cards right now, hence we will just start to see professional cards at the start of 2019.
> 
> Professional cards can get away with low-yield, cutting-edge nodes because they sell at high prices. Gaming cards cannot; it's been that way for a while now. Unless Nvidia plans to sell cutting-edge 7nm gaming cards at absurdly high prices, possible for a Titan variant but not for mainstream xx70 and xx80 variants.
> 
> 
> Your reference to 7nm Polaris cards coming out of thin air already throws your credibility out the window.



You can disagree all you want, but if Nvidia announces a big Turing chip this August or before the end of the year, you will know you were wrong, and Nvidia will have its 7nm GeForce ready by Q3 2019. No need to continue the debate.




Kana Chan said:


> At minimum it should be 2688 / 2560 = 1.05x faster without including the increased cache sizes
> 
> There's 4 versions of the P4 with 256 KB / 512 KB / 1024 KB / 2048 KB L2 and each one added some % increase.



The 1160 should be segmented the way the GTX 1070 was: basically everything cut down by 1/4, from ROPs to memory bus width to CUDA cores.

So the L2 cache should be 3MB; the full GT104 chip should have 4MB, while the big Turing chip comes with 6MB of cache.


----------



## keikei

https://wccftech.com/nvidia-geforce-gtx-2080-ti-benchmark-leak/


----------



## ENTERPRISE

keikei said:


> https://wccftech.com/nvidia-geforce-gtx-2080-ti-benchmark-leak/


Interesting, but with the information we currently have (or at least the information that seems most plausible), combined with what other OEMs have stated, this does not correlate. I really cannot see why its device ID would be 2080 Ti when, from what we understand, it is most certainly going to be the 1180 naming scheme, unless we are all being hard trolled. That, or the new gen was going to be called 2080 and this could be a device-ID remnant from the past that Nvidia binned in favour of the 1180 naming scheme since then.

This brings up more questions than answers at this point, so I think I will take it with a good dose of salt.


----------



## keikei

ENTERPRISE said:


> Interesting but with the information we currently have (or at least the information that seems the most plausible) combined with what other OEMs have stated, this does not correlate. I really cannot see why its device ID is 2080 Ti when from what we understand it is most certainly going to be the 1180 naming scheme, unless we are all being hard trolled. That or the new gens were going to be called 2080 and this could be a device ID remnant from the past and Nvidia binned it in favour of the 1180 naming scheme since then.
> 
> Brings up more questions than answers at this point so I think I will take this with a good dose of salt.


Grain of salt indeed. I don't ever remember Nvidia releasing an entire card lineup at the same time. If a Ti does get a simultaneous release, my mind will explode.


----------



## epic1337

but it's already July!


----------



## ibb27

epic1337 said:


> but its already july!


It's not the last day of July yet.


----------



## keikei

Remember, the last Ti was released the day after it was announced.


----------



## JoeDohn

ibb27 said:


> It's not last day of July.


So basically wait 3 more weeks, then grab 1080/1080Ti for cheap?


----------



## doom26464

Remember when the hip rumor was that they were going to launch in March at GTC...

I memeber


----------



## DrFPS

JoeDohn said:


> So basically wait 3 more weeks, then grab 1080/1080Ti for cheap?



Historically, new technology does not lower prices on the old. You're better off jumping in on launch day.

If you wait a few weeks, the 1080/1080 Ti might not come down in price; Nvidia rarely lowers prices. Unless you want to buy used, that is. That market is revving up now.

The price gougers will have raised the price of the newer GPUs, usually by 20% to 30%.


----------



## epic1337

DrFPS said:


> Historically new technology does not lower the prices on older. Your better off jumping in on launch day.
> 
> If you wait a few weeks 1080/1080ti might not come down in price. Nvidia rarely lowers prices. Unless you want to buy used. The market is reving up for that now.
> 
> The price gougers will have raised the price of the newer gpu's. Usually 20% to 30%.


Ehh... not quite. Didn't Nvidia recently launch the FE to milk early adopters? The non-FE is quite a bit cheaper.


----------



## guttheslayer

keikei said:


> ENTERPRISE said:
> 
> 
> 
> Interesting but with the information we currently have (or at least the information that seems the most plausible) combined with what other OEMs have stated, this does not correlate. I really cannot see why its device ID is 2080 Ti when from what we understand it is most certainly going to be the 1180 naming scheme, unless we are all being hard trolled. That or the new gens were going to be called 2080 and this could be a device ID remnant from the past and Nvidia binned it in favour of the 1180 naming scheme since then.
> 
> Brings up more questions than answers at this point so I think I will take this with a good dose of salt.
> 
> 
> 
> Grain of salt indeed. I dont ever remember nvidia releasing an entire card lineup at the same time. If a Ti does get a simultaneous release my mind will explode.

People refuse to believe the big Turing chip is coming out with the new lineup, and many also don't believe 7nm GeForce GPUs are close.


A GTX 2000 series would surprise me, but not the Ti; the big chip is coming whether you like it or not.


----------



## Asmodian

If 7nm is close and Nvidia is really going to release something on 12nm I could see them releasing the xx80 and xx80 Ti at the same time.


----------



## Kana Chan

If the RX 680 = RX 580 + 200MHz, you'd have to buy two RX 680s and hope they scale at 100% to match the 1160.


----------



## Newbie2009

Something new from AMD or Nvidia is needed. Something, anything, please.


----------



## Lee Patekar

I bet they'll make a huge fanfare over a new 7nm data-center AI card... then release a 12nm gamer version of Pascal :^)


----------



## guttheslayer

Lee Patekar said:


> I bet they'll make a huge fanfare for a new 7nm data center AI card.. then release a 12 nm gamer version of pascal :^)


What are you talking about? The GTX 1100 series is the 12nm gamer version of Pascal.


----------



## TUFinside

Slaughtahouse said:


> While I do agree to an extent, the non ti versions typically have lower power draws.(GTX 1080 TDP 180W vs GTX 1080Ti 250W).
> 
> Given this is an enthusiast forum, I understand that most users don't factor that it but it does make a difference. Granted, if I knew a GTX 780Ti would of been released, I would of purchased that instead of a GTX 780 because I really wanted all the performance I could get. However, that is when Nvidia started this trend of releasing Titan's and X80ti's.
> 
> Going forward, I think I will stick to the non-Ti's. Just so I am dumping less wattage into my w/c loop. But to each their own


yes !...simply yes.


----------



## doom26464

guttheslayer said:


> keikei said:
> 
> 
> 
> 
> 
> ENTERPRISE said:
> 
> 
> 
> Interesting but with the information we currently have (or at least the information that seems the most plausible) combined with what other OEMs have stated, this does not correlate. I really cannot see why its device ID is 2080 Ti when from what we understand it is most certainly going to be the 1180 naming scheme, unless we are all being hard trolled. That or the new gens were going to be called 2080 and this could be a device ID remnant from the past and Nvidia binned it in favour of the 1180 naming scheme since then.
> 
> Brings up more questions than answers at this point so I think I will take this with a good dose of salt.
> 
> 
> 
> Grain of salt indeed. I dont ever remember nvidia releasing an entire card lineup at the same time. If a Ti does get a simultaneous release my mind will explode.
> 
> 
> Ppl refuse to believe big turing chip is coming out with the new lineup, and many also dont believe 7nm gpus for geforce is not too far off
> 
> 
> The gtx 2000s will surprised me, but not the Ti, the big chip is coming in whether you like it or not.

7nm gaming GPUs are very far off. Once again you seem very disconnected from what's going on here.

Let's break this down using some of the recent reports swirling around.

Currently a single 7nm wafer is estimated to cost $10K+, on top of the yield issues and complications that come with 7nm at this time. Like I said before, for 2019 7nm will be reserved for GPUs in the professional market (data center / AI / machine learning).

Also, another report I read estimated that if and when we see 7nm gaming GPUs, it would be 2020 or 2021 at best (which I would have to agree with).

I mean, I would love to see a 7nm gaming GPU by next year, but with all the information available about 7nm it is unlikely. If we do, it will be a stupidly expensive card; a 7nm Titan variant I could see.


----------



## guttheslayer

doom26464 said:


> 7nm gaming GPUs are very far off. Once again you seem very disconnected from what's going on here.
> 
> Let's break this down using some of the recent reports swirling around.
> 
> Currently a single 7nm wafer is estimated to cost $10K+, on top of the yield issues and complications that come with 7nm at this time. Like I said before, for 2019 7nm will be reserved for GPUs in the professional market (data center / AI / machine learning).
> 
> Also, another report I read estimated that if and when we see 7nm gaming GPUs, it would be 2020 or 2021 at best (which I would have to agree with).
> 
> I mean, I would love to see a 7nm gaming GPU by next year, but with all the information available about 7nm it is unlikely. If we do, it will be a stupidly expensive card; a 7nm Titan variant I could see.



And so what if it is $10K per wafer? That is the worst-case price, estimated after the complicated process losses and yield.

http://www.silicon-edge.co.uk/j/index.php/resources/die-per-wafer


I can extract 600 dies at a die size of 10mm x 10mm, or 100 mm^2. That gives a base cost of US$17 per die. In comparison, the G6 memory module is estimated to cost US$10 per GB (per GN). It is the design cost that is expensive, but that is spread out over millions of GPUs.

Don't tell me 2019 is impossible; to me that is total BS. The dies will be very small, that is for sure, but even a die of 100 mm^2 packs a whopping 11.1B transistors!


Note: For a die of 225 mm^2 with 24B transistors, 256 dies can be extracted from a 300mm wafer, so the cost is $39.06 per die; even assuming 50% yield loss, that is still less than $80, cheaper than 8GB of G6. So please stop blowing smoke without facts. If you really ask me, it is the current DRAM crisis (affecting GDDR6) that is driving up costs faster than 7nm GPUs are.


Selling a 225 mm^2 GPU as the GTX 1280 and charging $699 is more than sufficient to earn the profit needed.
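The per-die math above can be reproduced with the classic dies-per-wafer approximation. Note this formula ignores scribe lanes and edge exclusion, so it comes out slightly more optimistic than the silicon-edge calculator linked in the post, and the $10K wafer price is the rumored figure, not a confirmed one:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Classic dies-per-wafer approximation: wafer area over die area,
    minus a correction for partial dies lost at the circular edge.
    Ignores scribe lanes, so it runs a bit high vs online calculators."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_cost: float, die_area_mm2: float,
                      yield_fraction: float = 1.0) -> float:
    """Wafer cost spread over the dies that survive the assumed yield."""
    good_dies = dies_per_wafer(300, die_area_mm2) * yield_fraction
    return wafer_cost / good_dies

WAFER_COST = 10_000  # USD, the rumored 7nm wafer price from the post

print(f"100 mm^2 die: {dies_per_wafer(300, 100)} dies, "
      f"${cost_per_good_die(WAFER_COST, 100):.2f} each")
print(f"225 mm^2 die, 50% yield: "
      f"${cost_per_good_die(WAFER_COST, 225, 0.5):.2f} each")
```

Even with a pessimistic 50% yield on the 225 mm^2 die, the silicon cost lands well under the quoted $80, which is the crux of the argument above.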


----------



## pony-tail

I live in Australia , I will have to see if I have any marketable body parts to raise enough $$$$$ for the upcoming 1180 ti !


----------



## guttheslayer

Asmodian said:


> If 7nm is close and Nvidia is really going to release something on 12nm I could see them releasing the xx80 and xx80 Ti at the same time.


That is what I've been saying, but there are people like the doom guy who refuse to believe that might be the case, lol.


----------



## mksteez

I wish these would come out soon! In need of a new GPU so bad


----------



## MonarchX

What could Nvidia improve in the upcoming series that would benefit performance at 4K? I want the 4K OLED LG TV, yet I won't buy it until I can get constant, consistent 60fps at 4K in modern games at Ultra / Very High settings.

Either that, or I have to pray that LG replies to my e-mail about custom-fitting a G-Sync module just for me, for unknown $$$.


----------



## LongtimeLurker

MonarchX said:


> What would benefit improved performance at 4K resolution that NVidia could improve in the upcoming series? I want the 4K OLED LG TV and yet I won't buy it until I can get constant and consistent 60fps @ 4K in modern games @ Ultra / Very High settings.
> 
> Either that or I have to pray that LG replies to my e-mail regarding custom-fitting G-Sync module just for me for Unknown $$$ .


Be careful with OLED; I've heard they have had problems with screen burn-in. You wouldn't want your Counter-Strike rifle, or the "YouTube" logo for example, permanently burned into your screen, even when it's off. I've heard Sony's Bravia X930E is a good alternative. Edit: If you're playing FPS, though, I'm not sure what its input lag is... check RTINGS.

https://www.cnet.com/news/oled-screen-burn-in-what-you-need-to-know/

https://www.zdnet.com/article/lg-oled-burn-in-at-incheon-airport-reignites-controversy/



> LG's 2018-model OLED TV installed at Incheon International Airport has shown signs of burn-in, only months after installation, reigniting worries over the long-term reliability of the technology


----------



## rluker5

Maybe it's just me, but Nvidia's ray tracing just doesn't seem compatible with SLI. If it isn't, will Turing be? Maybe the gaming version of the Titan Volta is as good as it will get.
Also, this would be a good reason for them to release the big chip right away.


----------



## Timmaigh!

rluker5 said:


> Maybe it's just me, but Nvidia's ray tracing just doesn't seem compatible with SLI. If it isn't, will Turing be? Maybe the gaming version of the Titan Volta is as good as it will get.
> Also, this would be a good reason for them to release the big chip right away.


Actually, if Nvidia's ray tracing is anything like the ray-tracing apps used for archviz production, it won't need SLI at all. IIRC I read somewhere that the Nvidia video with the characters inside the apartment, which they used to demonstrate the technology, was run on 4x V100s.


----------



## rluker5

Timmaigh! said:


> Actually, if Nvidia's ray tracing is anything like the ray-tracing apps used for archviz production, it won't need SLI at all. IIRC I read somewhere that the Nvidia video with the characters inside the apartment, which they used to demonstrate the technology, was run on 4x V100.


That would be NVLink, and not any SLI that is compatible with current games.
I suspect the denoising module will cause a lot of problems. But maybe ray tracing will be isolated the way PhysX was, since a tiny minority of gaming systems will support it (only Volta and newer Nvidia GPUs in PCs, and probably only high-end ones at that), and maybe SLI will still be there if you disable ray tracing in the games that have it. But there is a plausible scenario where Nvidia pushes the new tech in the top-end cards and puts NVLink fingers on them instead of SLI fingers. That would likely leave the new top cards slower than current top-tier cards in SLI in most games. If that's the case, SLI support won't be improving either, but at least games won't change that much for a while given the huge console + AMD + pre-Volta market.

I don't know for sure or anything, but the possibility certainly put a damper on my plans to sell my cards now and buy two 1180s soon.


----------



## ENTERPRISE

I have always been an SLI man; I figured go big and avoid disappointment. For the most part that has been true, but as we all know, SLI has been left wanting for a while now. It is still worth it when you get that demanding game that is SLI-enabled, as it makes your investment more worthwhile. I would be happy for Nvidia to dump SLI for something like NVLink. I personally don't think we are at the point where one card can reign king by itself, at least not at 4K and nearby resolutions.


----------



## Silent Scone

ENTERPRISE said:


> I have always been an SLI man, I figured go big and avoid disappointment. For the most part that has been true, but as we all know SLI has been left wanting for a little while now, it is still worth it when you get that demanding game that is SLI enabled as it makes your investment more worthwhile. I would be happy for Nvidia to dump SLI for something like NVlink. I personally don't think we are there yet when 1 card can reign king by itself, at least not at 4K and other close resolutions.


I've predominantly been an SLI user since its inception, but as you've rightly said, support has been dwindling the last few years, and even for the go-big-or-go-home types it's becoming less of a sensible decision. NVIDIA has never really upsold SLI; it costs both time and money to implement. I think the nail in the coffin for MGPU was, in fact, the birth of explicit MGPU, which relies heavily if not almost entirely on the developer. The problem with that is that developers have a lot less incentive than the vendors do, and so support was always going to dwindle further. In short, I think it's in NVIDIA's interest to drop SLI entirely, which, after dropping 3/4-way support, is exactly what has quietly happened over the last 12 months.


----------



## Nizzen

Silent Scone said:


> I've predominantly been an SLI user since it's inception, but as you've rightly said, the last few years have been dwindling and even for the go big or go home types, it's becoming less of a sensible decision. NVIDIA has not ever really upsold SLI, it costs both time and money to implement. I think the nail in the coffin for MGPU was, in fact, the birth of explicit MGPU which relies heavily if not almost entirely on the developer. The problem with that is, they have a lot less incentive than the vendors do, and thus support was always going to dwindle further. In short I think it's in NVIDIA's interests to drop SLI entirely, which after dropping 3-4 way support is exactly what has happened in the last 12 months silently.


As long as SLI works with Battlefield, it's OK.
Can't wait for BF V SLI love!
Regards from Nzz1


----------



## Drake87

Nizzen said:


> As long as SLI works with Battlefield, it's OK.
> Can't wait for BF V sli love!
> Regards from Nzz1


I'd be willing to put money on it. CrossFire, on the other hand, I'm not so sure about; BF1 has crap CrossFire support.


----------



## LuckyDuck69

I'm holding out for the Titan L


----------



## guttheslayer

LuckyDuck69 said:


> I'm holding out for the Titan L


where does the L come from?


----------



## animeowns

doom26464 said:


> Id like the 1180ti version but Im not going to wait 10-12 months for that to drop.
> 
> If this is 20% faster then a 1080ti it will be enough to get me off my 980ti


I hope these 11-series cards, the 1180 Ti or the next Titan variant, are at least 20% faster than the Titan V.


----------



## ENTERPRISE

Silent Scone said:


> I've predominantly been an SLI user since it's inception, but as you've rightly said, the last few years have been dwindling and even for the go big or go home types, it's becoming less of a sensible decision. NVIDIA has not ever really upsold SLI, it costs both time and money to implement. I think the nail in the coffin for MGPU was, in fact, the birth of explicit MGPU which relies heavily if not almost entirely on the developer. The problem with that is, they have a lot less incentive than the vendors do, and thus support was always going to dwindle further. In short I think it's in NVIDIA's interests to drop SLI entirely, which after dropping 3-4 way support is exactly what has happened in the last 12 months silently.


I agree. I think they are internally taking much less interest in the technology, and it would not surprise me if at some future press conference they say, "Guess what, SLI is EOL." That being said, I would be surprised if they killed off MGPU completely for the gaming sector. I figure NVLink or something else will replace the technology with the same or hopefully greater bandwidth. I was surprised when they brought out SLI HB when they did; personally, I thought if they were intent on letting it slowly die off, they would have left it be. Guess we will have to see what happens in the future.


----------



## guttheslayer

animeowns said:


> doom26464 said:
> 
> 
> 
> Id like the 1180ti version but Im not going to wait 10-12 months for that to drop.
> 
> If this is 20% faster then a 1080ti it will be enough to get me off my 980ti
> 
> 
> 
> I hope these 11 series cards 1180ti or the next titan variant is at least 20% faster than the titan V
Click to expand...

It will be extremely difficult to push the core count above 5376, so unless there is a big microarchitecture change, even a 10% gain from the Titan V to the next Titan is already a stretch.


----------



## Mack42

The end of July is almost here. If there is a release this month, we should have heard more by now, right?


----------



## EniGma1987

Mack42 said:


> End of July is soon here. If there is a release this month we should have heard more stuff by now, right?





You never know. Nvidia announced the Titan X (Pascal) with no warning and no prior hints at a random Stanford event in the middle of July a couple years ago.


----------



## keikei

Mack42 said:


> End of July is soon here. If there is a release this month we should have heard more stuff by now, right?


The anonymous sources state a launch on the 30th, but a more recent rumor has the card being announced at the late-August gaming conference with a release in September. Given how close we are to the 30th and the lack of leaks, I'm heavily leaning towards no 1180 this month.


----------



## Threx

I'm planning on getting a white EXOC version of the 1180, which is usually released about three months after the GPU's launch. So hopefully it launches next month, so the EXOC is in time for my new build at the end of the year.


----------



## guttheslayer

EniGma1987 said:


> You never know. Nvidia announced the Titan X (Pascal) with no warning and no prior hints at a random Stanford event in the middle of July a couple years ago.



All of the past three Titans came with no warning at all. At some small event, JHH pulled out the Titan V (and the CEO Edition as well), and the Titan X in 2016; the worst was the Titan Xp, which was simply added to their website without any public mention.


But for GeForce, the exact opposite is true. It has always been a big event, with a huge crowd cheering for JHH on a big stage. It has been this way since the 600 series as far as I remember. So this month will likely pass without any 1100-series announcement.


----------



## rbarrett96

"I always find these versions (Non Ti) kind of painful. I have always preferred to wait until the Ti releases before the hammer comes down. Sometimes I am disappointed with the gains of new non Ti cards against the older gen Ti variants."


I'm just tired of $700 cards. I remember when I built my first rig in 2010. Back then AMD was king and the 5870 was $400, which seemed like a lot at the time.


----------



## ENTERPRISE

rbarrett96 said:


> "I always find these versions (Non Ti) kind of painful. I have always preferred to wait until the Ti releases before the hammer comes down. Sometimes I am disappointed with the gains of new non Ti cards against the older gen Ti variants."
> 
> 
> I'm just tired of $700 cards. I remember when I built my first rig in 2010. Back then AMD was king and the 5870 was $400 bucks which seemed like a lot at the time.


Prices have increased, no doubt, but let's face it, everything, tech or otherwise, has gone up in price sharply. While I would love to get a top-end card for the same price as we did in 2010, I think we are just looking at a natural progression of prices over an eight-year period.


----------



## mngdew

I learned my lesson last year. I'm going to wait for the Ti version.
I just hope the 1180 Ti will have enough horsepower for smooth 4K gaming.


----------



## guttheslayer

mngdew said:


> I learned my lesson last year. I'm going to wait for the Ti version.
> I just hope 1180Ti will enough horsepower for smooth 4k gaming.


There is a good chance there will be no Ti version this time round, or if there is one, it will be released side by side with the 1180.

I am still holding my breath for these options for the 1100 series:

GTX 1180 - 5120 Cores, 12GB G6 @ US$699
GTX 1170 - 3584 Cores, 8GB G6 @ US$449
GTX 1160 - 2688 Cores, 6GB G6 @ US$299


----------



## Doubletap1911

We're fast running out of July :|


----------



## DNMock

guttheslayer said:


> There is a good chance there will be no Ti version this time round, or if there is one, it will be released side by side with the 1180.
> 
> I am still holding my breath for these options for the 1100 series:
> 
> GTX 1180 - 5120 Cores, 12GB G6 @ US$699
> GTX 1170 - 3584 Cores, 8GB G6 @ US$449
> GTX 1160 - 2688 Cores, 6GB G6 @ US$299



Suppose it depends on whether there will be a viable commercial market for Turing GPUs. If so, then I would expect to see the standard, slightly cut-down versions of the commercial GPU sold as 1180 Tis.

Otherwise my money would be on GT104 and its cut-down variants sold as the 11xx-series GPUs, and the GT100 chips and their cut-down variants coming out a year or two later as the 12xx series.

Somehow I feel like the architecture split between commercial and consumer GPUs is going to end up with Nvidia just milking the consumer side even harder. But that is fine; it leaves the door open for AMD to get back in the game. Not like jumping out ahead of AMD and then just sandbagging and milking it into the ground has ever backfired on anyone else...


----------



## stangflyer

ENTERPRISE said:


> Prices have increased no doubt, but lets face it, everything tech or otherwise has gone up in price sharply. While I would love to get a top end card for the same price as we did in 2010, I think we are just looking at a natural progression of prices over an 8 year period.


Now you can buy a 65-inch 4K HDR TV for the same price as a 1080 Ti!


----------



## guttheslayer

DNMock said:


> Suppose it depends on whether there will be a viable commercial market for Turing GPUs. If so, then I would expect to see the standard, slightly cut-down versions of the commercial GPU sold as 1180 Tis.
> 
> Otherwise my money would be on GT104 and its cut-down variants sold as the 11xx-series GPUs, and the GT100 chips and their cut-down variants coming out a year or two later as the 12xx series.
> 
> Somehow I feel like the architecture split between commercial and consumer GPUs is going to end up with Nvidia just milking the consumer side even harder. But that is fine; it leaves the door open for AMD to get back in the game. Not like jumping out ahead of AMD and then just sandbagging and milking it into the ground has ever backfired on anyone else...


Don't expect too much of a performance jump from this 1100 series; even with 5120 cores, the scaling is definitely not linear. 40% more cores doesn't translate to 40% more performance in most cases.


What NV sees, however, is a one-year stop-gap product cycle that helps fill the 9-12 month gap while waiting for 7nm GPUs, unlike AMD, who is simply doing nothing, leaving a full two-year gap with no product releases.
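The sub-linear scaling point above can be sketched with a toy Amdahl-style model: if some fixed fraction of frame time doesn't scale with core count, adding 40% more cores buys noticeably less than 40% more performance. The 0.15 fraction below is a made-up placeholder for illustration, not a measured figure:

```python
# Toy illustration of why 40% more cores != 40% more performance.
# Assumes a fixed fraction of the workload doesn't scale with core
# count (Amdahl's law); the 0.15 fraction is a placeholder, not data.

def speedup(core_ratio, serial_fraction=0.15):
    """Amdahl's law: only the parallel fraction benefits from more cores."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / core_ratio)

# 40% more cores with a 15% non-scaling fraction
gain = (speedup(1.4) - 1.0) * 100
print(f"~{gain:.0f}% faster with 40% more cores")
```

With these placeholder numbers the gain works out to roughly a third rather than 40%, and real games (with CPU, bandwidth, and driver overheads) often scale worse still.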


----------



## Scotty99

Any news on this? I'd like to sell my 1060 before they come out, for obvious reasons.


----------



## Threx

"NVIDIA GTX 1170 Alleged Benchmark Leaked, Faster Than 1080 Ti"

WCCFT, take it for what it's worth. 

https://wccftech.com/rumor-nvidia-turing-gtx-1170-benchmark-leaked-faster-than-1080-ti/


----------



## guttheslayer

Threx said:


> "NVIDIA GTX 1170 Alleged Benchmark Leaked, Faster Than 1080 Ti"
> 
> WCCFT, take it for what it's worth.
> 
> https://wccftech.com/rumor-nvidia-turing-gtx-1170-benchmark-leaked-faster-than-1080-ti/


It's fake, fake, and fake; it was proven fake when the physics, combined, and graphics scores didn't add up to the overall score.


----------



## white owl

Threx said:


> "NVIDIA GTX 1170 Alleged Benchmark Leaked, Faster Than 1080 Ti"
> 
> WCCFT, take it for what it's worth.
> 
> https://wccftech.com/rumor-nvidia-turing-gtx-1170-benchmark-leaked-faster-than-1080-ti/


It's pretty bad if WCCFT is questioning its authenticity.
16GB on an 1170 with RAM prices the way they are? I doubt that very much.
I think, since they've had so long to work on them, they could very well make a GPU that roughly fits the description, but I doubt they would. I'd expect a 20% bump for each card, with some of them having the same amount of RAM and some a little more. 8GB is already more than I'd ever need on my 1080.


----------



## c0nsistent

white owl said:


> It's pretty bad if WCCFT is questioning it's authenticity.
> 16gb on an 1170 with ram prices the way they are? I doubt that very much.
> I think since they've had so long to work on them they could very well make a GPU that roughly fits the description but I doubt they would. I'd expect a 20% bump for each card with some of them having the same capacity ram and some with a little more. 8gb is already more than I'd ever need on my 1080.


Only if it's $499. At that price it might make sense; that's $200 less than the 1080 Ti. Then maybe the 1180 at $649 and the 1180 Ti at $799.


----------



## kd5151

c0nsistent said:


> Only if it's $499. At that price it might make sense. Thats $200 less than the 1080 Ti. Then maybe the 1180 at $649, 1180 Ti @ $799


 I agree. 

More demand today is driving these high prices, and people willing to pay top dollar will enable it.


----------



## guttheslayer

kd5151 said:


> I agree.
> 
> More demand today driving these high prices. People willing to pay top dollar will enable it.


You guys are missing the point. Please go read up on the formula for the Fire Strike score and how it is computed; then you will understand that the sub-scores are inconsistent and don't add up to the overall score.


It's 200% fake. Also, regarding the 16GB, I can almost assure you it will be 8GB, seeing how things are going.
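For anyone wondering what "doesn't add up" means here: 3DMark's overall score is derived from the sub-scores via a weighted harmonic mean, so a leak with independently invented sub-scores usually fails a simple recomputation. A minimal sketch; the weights and sample scores below are illustrative assumptions, not Futuremark's published values (check the 3DMark technical guide for the real ones):

```python
# Recompute a 3DMark-style overall score from sub-scores using a
# weighted harmonic mean. Weights are placeholders for illustration.

def overall_score(graphics, physics, combined,
                  w_graphics=0.75, w_physics=0.15, w_combined=0.10):
    """Weighted harmonic mean of the three sub-scores."""
    weights = (w_graphics, w_physics, w_combined)
    scores = (graphics, physics, combined)
    return sum(weights) / sum(w / s for w, s in zip(weights, scores))

# If a leaked overall score differs wildly from this recomputation,
# the sub-scores were likely fabricated independently.
print(round(overall_score(28000, 20000, 12000)))
```

The harmonic mean always lands between the lowest and highest sub-score, weighted heavily toward graphics, which is why an implausible overall number stands out immediately.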


----------



## guttheslayer

c0nsistent said:


> Only if it's $499. At that price it might make sense. Thats $200 less than the 1080 Ti. Then maybe the 1180 at $649, 1180 Ti @ $799


Let me just highlight how much 1GB of 14Gbps GDDR6 costs: about US$27 per module at retail. Nvidia orders in massive bulk, which lets them buy at a great discount, but it would still cost around $12 per GB. For 16GB you are looking at roughly $200 before the cost of the other components, the GPU, and the PCB. Knowing that Nvidia likes a profit margin of more than 100%, that is not going to work out, even at US$499. The only way to go is 8GB, and even then it is already eating into their usual profit margin.
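To make the arithmetic above concrete, here is a back-of-the-envelope check. The $12/GB bulk price comes from the post itself; the other-components figure and the 100% margin are made-up placeholders, not real BOM data:

```python
# Back-of-the-envelope check of the GDDR6 cost argument.
# $12/GB is the post's bulk estimate; the $150 rest-of-board cost
# and 100% margin are illustrative placeholders only.

def memory_cost(capacity_gb, price_per_gb=12.0):
    """Estimated total memory cost in USD."""
    return capacity_gb * price_per_gb

def min_viable_msrp(mem_gb, other_bom=150.0, margin=1.0):
    """Rough floor price: bill of materials times (1 + margin)."""
    bom = memory_cost(mem_gb) + other_bom
    return bom * (1.0 + margin)

for mem in (8, 16):
    print(f"{mem}GB card: memory ${memory_cost(mem):.0f}, "
          f"floor MSRP ~${min_viable_msrp(mem):.0f}")
```

Under these assumptions an 8GB card squeaks under a $499 price point while a 16GB card lands well above it, which is the gist of the argument.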


----------



## G woodlogger

I don't think the chips actually cost very much.

https://www.dramexchange.com/


----------



## coreykill99

Probably pure BS, but still kinda interesting.

http://www.guru3d.com/news-story/ge...-late-august-11801170-and-1160-to-follow.html


----------



## guttheslayer

coreykill99 said:


> probably pure BS but still kinda interesting.
> 
> http://www.guru3d.com/news-story/ge...-late-august-11801170-and-1160-to-follow.html


I have started a thread on this:

https://www.overclock.net/forum/379...ube-released-date-gtx-1100-series-leaked.html



G woodlogger said:


> I don't think the chips actually costs very much.
> 
> https://www.dramexchange.com/


I don't see any GDDR6 at that link. Here is my source:


----------

