# [WCCFTECH] NVIDIA’s Ampere-based graphics card launching in 1H 2020



## Sheyster

https://wccftech.com/nvidia-next-generation-ampere-7nm-graphics-cards-landing-1h-2020/



> Ladies and gents, we have our third confirmation of NVIDIA’s Ampere-based graphics card launching in 2020. While we previously knew they were going to launch in 2020, we didn’t know which half and thanks to the report by Igor’s Lab we know it’s going to be sooner rather than later. Leaked EEC certification and a report by Taiwan’s top tech publication, Digitimes, puts the Ampere graphics card on Samsung’s 7nm node and will represent a significant performance upgrade over Turing counterparts.
> 
> The fact that it is based on Samsung’s 7nm EUV process means we are looking at a performance advantage as well as a power efficiency advantage. Not only that, but believe it or not, 7nm EUV is actually supposed to be easier to fab than standard UV multi-patterning efforts. Think of EUV as sort of a reset of the difficulty curve as the company moves to a new light source. This will, however, require extensive re-tooling, but the economies of scale will almost certainly prove to be worth it. At a bare minimum, you are looking at a 50% increase all things considered and watt for watt.


----------



## 113802

Should be in rumors, but I hope this is true, since Intel will need some competition when their Xe cards are out.


----------



## UltraMega

Maybe ray tracing will be practical this time... But if Nvidia doesn't bring their prices back down to earth, it will be hard not to label them a greedy company worth avoiding long term.


----------



## tpi2007

It could be January, or the end of July with reference models in August and AIB models in September. It will all depend on how long Nvidia can milk Turing's high prices before the competition rises to the occasion, namely AMD with big Navi (most probably Navi 2.0 with ray tracing capability) and Intel with its new discrete cards. Since Nvidia was the first to launch Turing in 2018, there is no reason why they would let either AMD or Intel get even close, as they've had ample time to iterate on their next arch. So this is essentially a timing issue and, to an extent, a question of the yields / volume of Samsung's 7nm EUV.

Having said all that (interpret the following quote how you like), it may well be the second option from above, the late July launch. Why? This is what Jensen said in the last earnings call in August when asked about the Turing Super lineup:

https://www.fool.com/earnings/call-...rp-nvda-q2-2020-earnings-call-transcript.aspx


> In a ray tracing content, it just keeps coming out and and between the performance of Super and the fact that it has ray tracing hardware, it's going to be *super well positioned for through all of next year.*



There will always be some overlap between a new and old arch, but a full year of overlap? It doesn't seem logical, so I'm betting on a mid-year launch.


----------



## Toology

I am purposely waiting for this for a new full build.


----------



## m4fox90

WannaBeOCer said:


> Should be in rumors but I hope this is true since Intel will be needing some competition when their Xe cards are out.


The mid-range market is quite saturated. If anything, it's Intel that will need to break in.


----------



## Malinkadink

I wasn't going to bite on 50% more perf over my 1080 for the $1200 of the 2080 Ti, but I can definitely swing 100% more perf with a 3080 Ti for $1200. Still stupidly expensive, as we were paying $700-800 for flagships just a few years ago, but Nvidia tested people with Titan prices and they gobbled it up. Now we're paying Titan prices for Ti cards. Sad.


----------



## keikei

Malinkadink said:


> I wasn't going to bite on 50% more perf over my 1080 for $1200 of the 2080 Ti, but i can definitely swing 100% more perf with a 3080Ti for $1200, still stupidly expensive as we were paying $700-800 for flagships just a few years ago, but Nvidia tested people with Titan prices and they gobbled it up, now we're paying Titan prices for Ti cards, sad.





You think ampere Ti 50% faster than current one?


----------



## m4fox90

keikei said:


> You think ampere Ti 50% faster than current one?


3080Ti will be far, far faster than 1080. I came to this from a 1080 and the difference is tremendous. That will be a good upgrade, though hopefully not at the $1k price tag


----------



## skupples

The 3080 Ti will be the same price.


keikei said:


> You think ampere Ti 50% faster than current one?


Highly unlikely. Nvidia hasn't given us gains like that in years; 30% max. Oh right, the 2080 Ti was 50%.

I have pretty high expectations, actually. If nothing else, at least we know it's the first HDMI 2.1 card.

Either way, I'm trying to hold out for the cards that release AFTER the new consoles, not the last run-up to the new consoles. That's like the last bit of paste in the DX11 tube.


----------



## Section31

Hope that's true. Cyberpunk 2077 is asking for a new GPU. I have decided to give my friend my RTX 2080 Ti once the next-gen stuff is out. Thanks for being the photographer at my wedding and even doing a wedding photo album.


----------



## b.walker36

I almost ordered a 5700 XT yesterday, but I think I'm just going to hold out until these drop and buy the best card in the $700 range. Whatever it is will be a huge upgrade over my 980 Ti.


----------



## littledonny

Malinkadink said:


> I wasn't going to bite on 50% more perf over my 1080 for $1200 of the 2080 Ti, but i can definitely swing 100% more perf with a 3080Ti for $1200, still stupidly expensive as we were paying $700-800 for flagships just a few years ago, but Nvidia tested people with Titan prices and they gobbled it up, now we're paying Titan prices for Ti cards, sad.


Two successive 50% jumps compound: 1.5 × 1.5 = 2.25, i.e. 225% the performance of your current card (125% better)
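For what it's worth, a quick sketch of how successive generational uplifts compound (purely hypothetical numbers: +50% per generation, over two generations):

```python
# Generational uplifts multiply rather than add.
# Hypothetical: each generation is 50% faster than the one before.

gen_gain = 1.5  # +50% per generation

# 1080 -> 2080 Ti -> 3080 Ti spans two generations.
total_uplift = gen_gain * gen_gain
print(total_uplift)  # -> 2.25
```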


----------



## littledonny

keikei said:


> You think ampere Ti 50% faster than current one?


Some of that will be diverted to RT cores imo


----------



## JackCY

UltraMega said:


> Maybe Ray tracing will be practical this time... But if Nvidia doesn't bring their prices back down to earth, it will be hard not to label them as a greedy company to avoid long term.


It has been their nickname since inception: Greedy, nGreedia, ...
Their name literally means envy, and the logo is an evil green eye. You can look it up.

At least they don't say "don't be evil" and then do some of the most evil things.

The prices are gonna be the same insanity as Turing, if not worse. They will keep going up until their total profits start dropping too much or any competition shows up on the market.


----------



## moonbogg

I don't consider myself an Nvidia customer anymore and I'm, for the first time, not excited at news of a new generation from them. The 1080ti is likely the last one I get from them. Their prices likely won't go down and I consider myself no longer welcome as a customer of theirs as I won't spend more than the $6-700 price bracket for a high end card yet they insist on charging over $1000. They have plenty of people who can't wait to spend that much, so it's fine. They don't need plebs like me, lol.


----------



## skupples

moonbogg said:


> I don't consider myself an Nvidia customer anymore and I'm, for the first time, not excited at news of a new generation from them. The 1080ti is likely the last one I get from them. Their prices likely won't go down and I consider myself no longer welcome as a customer of theirs as I won't spend more than the $6-700 price bracket for a high end card yet they insist on charging over $1000. They have plenty of people who can't wait to spend that much, so it's fine. They don't need plebs like me, lol.


That's because you're on the right page. The 30x0 series is the last of the DX11 cards, and drops before the "next gen" consoles. That, plus Intel joining the fray and AMD just getting started with 7nm, means the 3080 Ti will be one of the last times Nvidia can release a truly zero-competition flagship. AMD, Intel, and Nvidia will be trading blows on the top shelf by the end of 2021.

I'm way more interested in AMD's moves for the next year. I'm sure they're over being the discount option.


----------



## blodflekk

Considering the source, this is likely all hot garbage.


----------



## tpi2007

blodflekk said:


> Considering the source, this is likely all hot garbage.



Considering that the source's source, as explicitly stated in the article, is Igor Wallossek (igor's LAB), it probably isn't:


https://www.igorslab.media/en/nvidi...s-relaxed-with-the-old-2060-for-a-rx-5600-xt/



> And what else? So first of all (almost) everything has been written that is known or may be written. At least I don’t write here against better knowledge. Whereby up to the half of the year 2020 Ampere should finally come (I had already written something about it from time to time). As a middle class offer, as it was said from various sources and then also with a smaller structure width. Because obviously Nvidia doesn’t dare to get close to the big guys right at the beginning, but practices with light sailors. You never know..


----------



## skupples

" Because obviously Nvidia doesn’t dare to get close to the big guys right at the beginning" 

because nvidia exhibits standard industry practices -


----------



## doom26464

I'm itching to get off my 1080 Ti, but it will all come down to pricing.

Even if the 3080 is like $800+, it will be a hard sell for me. Depends on performance as well.


----------



## skupples

4080 Ti or bust. New goal, like the one I had from GK110 >> Pascal.

The 5900 XT should be pretty epic though.

I'm trying not to spend a dime until we see Intel's cards firsthand.


----------



## huzzug

Section31 said:


> Hope thats true. Cyberpunk 2077 is asking for new Gpu. I have decided to give my friend RTX2080TI once next gen stuff is out. Thanks for being the photographer at my wedding and even doing an wedding photo album.


So, when are you planning to get married next?


----------



## Ashura

b.walker36 said:


> I almost ordered a 5700xt yesterday, I think I'm just going to hold out until these drop and buy the best card in the 700 range. Whatever it is will be a huge upgrade over my 980Ti


Welcome to the club.


----------



## magnek

Titan Xp performance for $300 or no buy kkthx


----------



## Damage Inc

Section31 said:


> Hope thats true. Cyberpunk 2077 is asking for new Gpu. I have decided to give my friend RTX2080TI once next gen stuff is out. Thanks for being the photographer at my wedding and even doing an wedding photo album.


Same here. Giving away my 2080 as soon as top Ampere is out. Need a card that can handle Cyberpunk 2077 at 4k and hopefully with RTX on baby. It's going to be glorious.


----------



## DNMock

So wait, why does everyone automatically think this is gonna replace Turing and not Volta?


----------



## speed_demon

Makes me wonder what AMD has planned for their laptop GPUs in the future. Nvidia always has full-fledged performance options, and I expect a 3000 series for laptops, but AMD not so much.


----------



## TriWheel

moonbogg said:


> I don't consider myself an Nvidia customer anymore and I'm, for the first time, not excited at news of a new generation from them. The 1080ti is likely the last one I get from them. Their prices likely won't go down and I consider myself no longer welcome as a customer of theirs as I won't spend more than the $6-700 price bracket for a high end card yet they insist on charging over $1000. They have plenty of people who can't wait to spend that much, so it's fine. They don't need plebs like me, lol.


That's the attitude that I adopted.

I will get a card that is 3x the performance of Fury for $300, or I simply won't be buying any more cards.


----------



## skupples

magnek said:


> Titan Xp performance for $300 or no buy kkthx


That actually doesn't seem like a lot to ask. The 5700 XT keeps up with / beats the 1080 Ti here and there for $400.



DNMock said:


> So wait, why does everyone automatically think this is gonna replace Turing and not Volta?


Full line refresh.

Poor Super buyers, shortest-lifespan GPU of the bunch. They'll rage like the 780 kids right before Maxwell (then be thankful their 780 at least doesn't have 3.5 GB memory).


----------



## guttheslayer

skupples said:


> 3080TI will be the same price.
> 
> 
> highly unlikely. NVidia hasn't given us gains like that in years. 30% max. oh right 2080ti is 50%
> 
> i have pretty high expectations actually, if not at least we know its the first hdmi 2.1 card
> 
> either way, i'm trying to hold out for the cards that release AFTER the new consoles. Not the last run up to the new consoles. That's like the last bit of paste in the dx11 tube.


The 2080 Ti isn't 50% faster than the 1080 Ti, so it's wishful thinking that NV will give us that.

Also, the next gen will be just the RTX 3080; a Ti variant won't come until 2021, which is back to NV's usual old model. The reason is that it's too risky and expensive to manufacture a big Ampere die for the gaming market at this stage. Most of the first iteration of big dies, if any, go directly to compute GPUs like Volta.

The 2080 Ti was a special case, because 12nm was a refined 16nm that had already matured for over 2 years by Turing's release. 7nm+ is the first EUV node on the market; it won't be that easy, nor will it be cheap (due to poor yields), even at a $1200 price tag.

My take is that, given how big the RT cores are, the first GA104 die should be 320-360 mm² on 7nm EUV, 20-25% faster than the Titan RTX, and probably packing 17B transistors.

Be mentally prepared for it to be expensive, possibly an MSRP of $799 and $899 for the custom and FE editions respectively. It might come in a 12GB (256-bit) configuration, as I know 1.5GB GDDR6 modules exist.
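The 12GB / 256-bit pairing is just bus arithmetic; a minimal sketch, assuming standard 32-bit GDDR6 devices and the rumored 1.5GB module density:

```python
# GDDR6 devices each present a 32-bit interface, so the bus width
# fixes the device count; capacity is devices times per-device density.

def vram_gb(bus_width_bits: int, module_gb: float) -> float:
    devices = bus_width_bits // 32  # one GDDR6 chip per 32-bit slice
    return devices * module_gb

print(vram_gb(256, 1.5))  # 8 devices x 1.5 GB -> 12.0
print(vram_gb(352, 1.0))  # 2080 Ti-style 352-bit bus, 1 GB chips -> 11.0
```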


----------



## Slaughtahouse

skupples said:


> that actually doesn't seem like a lot to ask. 5700xt keeps up with / beats 1080ti here & there for $400.
> 
> 
> 
> full line refresh.
> 
> poor super buyers, shortest lifespan GPU of the bunch. They'll rage like 780 kids right before maxwell. (then be thankful their 780 at least doesn't have 3.5gb memory)


Not all of us raged... My 780 lasted me many years (sold it this February for $180 with the block) and was equal to the Maxwell 970. I was happy to get "TITAN"-level gaming performance for 70% of the cost at the time.

I was more "raged" that my 3GB card was shortly replaced by a new 780 with 6GB, and then the 780 Ti a couple of months after. Oh well, that was the first and last time I buy a "premium card". Back to the x60 series / used market for me.


----------



## Damage Inc

50% per watt... Where does it say 50% more performance? Sounds good in either case, seeing as AMD has nothing on Nvidia. I need to get a CL-X and a new X299 board before that, though.


----------



## SoloCamo

$500 peak for me. I hung onto my 290x from Oct 2013 to Jan 2019 so I think this V64 can hold me over until the consoles hit and we see what Intel brings to the table.


----------



## Zam15

I've been waiting forever for a worthy upgrade path for my 980 SLI setup.

If the new TI card hits with 16GB, HDMI 2.0, and DP 2.0 I'll jump.

Need something to run all the next gen AAA console ports at 4K Ultra 60hz and up for the next 4/5 years and my VR headsets then I'll be happy.


----------



## keikei

Zam15 said:


> I've been waiting forever for a worthy upgrade path for my 980 SLI setup.
> 
> If the new TI card hits with 16GB, HDMI 2.0, and DP 2.0 I'll jump.
> 
> Need something to run all the next gen AAA console ports at 4K Ultra 60hz and up for the next 4/5 years and my VR headsets then I'll be happy.



I'm still waiting for that happiness myself.


----------



## littledonny

Zam15 said:


> I've been waiting forever for a worthy upgrade path for my 980 SLI setup.
> 
> If the new TI card hits with 16GB, HDMI 2.0, and DP 2.0 I'll jump.
> 
> Need something to run all the next gen AAA console ports at 4K Ultra 60hz and up for the next 4/5 years and my VR headsets then I'll be happy.


Should be HDMI 2.1


----------



## Zam15

littledonny said:


> Should be HDMI 2.1


Sorry, typo on my end: HDMI 2.1. I plan to upgrade my AV receiver to a processor with HDMI 2.1 next year, and eventually plan to upgrade my monitor (HDR, 144Hz, 4K) and my headsets to ones that have DP 2.0.

But this all hinges on getting a card that supports the new standards.

I'm guessing they'll time the release around Cyberpunk 2077, as that will be the new game in town and will drive people to upgrade. Maybe they'll even include it with the new card.


----------



## PontiacGTX

SoloCamo said:


> $500 peak for me. I hung onto my 290x from Oct 2013 to Jan 2019 so I think this V64 can hold me over until the consoles hit and we see what Intel brings to the table.


If the price trend continues, $500 will only get you a GA104 or a cut-down GA104. I feel like the Vega GPUs didn't age so well, probably because the 290X, in comparison, lasted almost 2-3 years in the high end.


----------



## Hydroplane

Wonder if NVLink will stick around for Ampere?


----------



## CallsignVega

I hate the "performance increase per watt" metric. Outside of laptops, who gives a crap? It's raw performance gains I am interested in.


----------



## Slaughtahouse

CallsignVega said:


> I hate the "performance increase per watt" metric. Outside of laptops, who gives a crap? It's raw performance gains I am interested in.


I am interested. I don't want additional wattage (heat) or my cooling (noise) to ramp up.

Performance is relative.


----------



## magnek

moonbogg said:


> I don't consider myself an Nvidia customer anymore and I'm, for the first time, not excited at news of a new generation from them. The 1080ti is likely the last one I get from them. Their prices likely won't go down and I consider myself no longer welcome as a customer of theirs as I won't spend more than the $6-700 price bracket for a high end card yet they insist on charging over $1000. They have plenty of people who can't wait to spend that much, so it's fine. They don't need plebs like me, lol.


Likewise. I expect *at least* 2x performance for the same amount of money for a GPU to be worthy of an "upgrade". No point upgrading if I still can't play the game! 

This is why I'm still holding on to my 980 Ti, because neither camp has delivered on that metric so far.


----------



## skupples

guttheslayer said:


> 2080 Ti isnt 50% faster than 1080 Ti, so we have a wishful thinking on NV giving us that.
> 
> Also the next gen will be just RTX 3080, Ti variant wont be once till 2021 which is back to the usual NV old model, reason being it is too risky and expensive to manufacture a big Ampere die at this stage for gaming market. Most of the first iteration of big die, if any, goes directly to Computational GPU like Volta.
> 
> 
> 2080 Ti was a special case because 12nm was a refined 16nm which has already mature for >2 years during Turing release. 7nm+ is first EUV on the market, it wont be that easy nor will it be cheap (due to poor yield) even for a $1200 price tag.
> 
> 
> My take is that given how big the RT cores are, the first GA104 die should be 320-360mm^2 sq in size on 7nm EUV, that is 20-25% faster than Titan RTX and probably pack in 17B transistors.
> 
> Be mentally prepare if it is expensive, possibly MSRP at $799 and $899 for custom / FE edition respectively. Might come in with 12GB (256-bits) configuration as I know there is a 1.5GB GDDR6 modules existence.


Thanks for that... I didn't want anyone to jump on me for pointing out that 1080 Ti >> 2080 Ti is not a cut-and-dried "50% performance gain", especially where it matters, 4K... And yes, there's no reason to expect lower prices unless your goal is to set yourself up for eventual rage. NV 7nm will break price records, just like they always do.

The only good news is the trifecta all eventually competing at that ~1080 price point. We should eventually see some really good "value" out of this, and at this point it's not so far away.



Slaughtahouse said:


> Not all of us raged...  My 780 last me many years (sold it this February for $180 w. block) and was equal to the Maxwell 970. I was happy to get "TITAN" level gaming performance for 70% of the cost at the time.
> 
> I was more "raged" that my 3gb card was shortly replaced by a new 780 w. 6gb and 780 ti a couple months after. Oh well, that was first and last time buying a "premium card". Back to the X60 series / used market for me


I was wondering to myself if I meant the 780 Ti folks. Remember, the 780 Ti was the first Ti in a long time not to be the drag racer of the low-end market (560 Ti memories, anyone?!)

I've been buying GPUs based off available memory for ages now, due to being obsessed with 3x-screen gaming for many, many years. GK110 + good waterblock + software volt mod = zoom zoom!


----------



## pompss

Next year I will go with a console. The 2080 Ti will be my last card; I'm not giving Nvidia another $1200.

I could change my mind if I see a price tag of $799 and 120 FPS at 4K. Otherwise it makes absolutely no sense for me, since the new consoles will run 4K 60Hz.


----------



## ILoveHighDPI

pompss said:


> Next year will go with console. The 2080ti will be my last card. Not giving nvidia another $1200
> 
> I could change my mind if i will see a price tag of $799 and 120 fps at 4k. Otherwise make absolutely no sense for me since the new consoles will run 4k 60Hz.


Yup. In current market conditions the console makers are the only ones willing to push hardware at a decent value.
Nvidia will never undercut AMD ever again, and AMD is only undercutting Nvidia by the slimmest of margins.

The only thing that could change the situation dramatically is if AMD launches Big Navi with extremely good Rasterization performance and Nvidia is forced to react.
I still say the market would respond very well to a top tier card with zero Ray Tracing support.


----------



## skupples

pompss said:


> Next year will go with console. The 2080ti will be my last card. Not giving nvidia another $1200
> 
> I could change my mind if i will see a price tag of $799 and 120 fps at 4k. Otherwise make absolutely no sense for me since the new consoles will run 4k 60Hz.


Not too far off from that, tbh. Unless you're one of those weird people who demand ultra across all settings; then it'll be 5 years.

I can get my 1080 Ti to do 4K60 in most titles pretty easily (on mediums); I guess the 2080 Ti is just that underwhelming.

Also, you're pretty much always gonna be disappointed if you upgrade every generation, unless you do some sorta tick-tock between the low and high end.

Gotta remember the base NV metric.

You can guarantee the 3080 will be ~2080 Ti performance, give or take 10%. I expect the 30x0 series to have a short lifespan and to be underwhelming.



ILoveHighDPI said:


> Yup. In current market conditions the console makers are the only ones willing to push hardware at a decent value.
> Nvidia will never undercut AMD ever again, and AMD is only undercutting Nvidia by the slimmest of margins.
> 
> The only thing that could change the situation dramatically is if AMD launches Big Navi with extremely good Rasterization performance and Nvidia is forced to react.
> I still say the market would respond very well to a top tier card with zero Ray Tracing support.


Yep, that's how it works now, and the market would gobble up a top-end no-tensor product, but NV will never push it. You'll have to get it from AMD or Intel.

Consoles are also the limiting factor for PC games, when they exist on both platforms.

Next-gen consoles are important to PC gaming, whether you wanna admit it or not. Not just this cycle; last cycle too. It'll be ever more so the case now that they're basically gaming laptops without the screen.


----------



## UltraMega

This situation is going to be hard to predict. Nvidia has a frayed relationship with PC gamers right now due to the greedy 2000-series pricing, and it may be in their best interest to focus on offering better value this time around; moving to 7nm should give them every opportunity to do that. On top of that, Nvidia should have had plenty of time to optimize ray tracing performance to a level that is fully practical by then. Furthermore, they should be concerned about their reputation more than ever, with new competition expected to enter the market soon.

This next gen will be a perfect way to measure Nvidia's level of greed. If they release something good at a reasonable price this time around, great. If, with so many reasons to avoid greed right now, they can't pull that off, it will be a pretty clear litmus test for all to see. If Nvidia doesn't release something that doesn't feel like a rip-off this time, I don't think I will be buying any more Nvidia cards in the foreseeable future.


----------



## guttheslayer

UltraMega said:


> This situation is going to be hard to predict. Nvidia has a frayed relationship with PC gamers right now due to the greedy 2000 series pricing and it may be in their best interest to focus on offering a better value this time around, and moving to 7nm should give them every opportunity to do that. On top of that, Nvidia should have had plenty of time to optimize ray tracing performance to a level that is fully practical by then. Further more, they should be concerned about their reputation more than ever with new competition expected to enter the market soon. This next gen for Nvidia will be a perfect way to measure their level of greed. If they release something good at a reasonable price this time around, great. If not, with so many reason for them to avoid greed right now, they can't pull that off, it will be a pretty clear litmus test for all to see. If Nvidia doesn't release something that doesn't feel like a rip off this time, I don't think I will be buying any more Nvidia cards in any foreseeable future.


I agree with you, but don't expect too much leeway to be given from the NV side. Also, seeing them skip 7nm to go straight to EUV, I expect a pretty big jump, with pricing to match.

We should expect to see a VRAM jump as well, but all of this points to an extremely unattractive price, as the high-speed GDDR6 modules aren't that cheap themselves (especially the 16 Gbps ones, which NV will desperately need for their next gen).

Going by the old record, we should see a 20-25% performance jump, but with a $100 increase in pricing this time.

Hopefully by then DP 2.0 support will be out; there is no point in a 4K 120Hz HDR display when you are capped at 98Hz.
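That ~98Hz cap falls out of the DP 1.4 link budget; a rough sketch with round numbers (HBR3 x4 lanes, 8b/10b coding, 10-bit RGB, and an approximate reduced-blanking frame; the blanking figure is an estimate, so treat the result as ballpark):

```python
# Why DP 1.4 tops out near ~100 Hz for a 4K HDR (10-bit RGB) signal.

RAW_GBPS = 32.4                     # HBR3, 4 lanes
PAYLOAD_GBPS = RAW_GBPS * 8 / 10    # 8b/10b line coding overhead
BITS_PER_PIXEL = 30                 # 10 bits per channel, RGB
TOTAL_PIXELS = 3920 * 2222          # 3840x2160 active + reduced blanking (approx.)

max_refresh_hz = PAYLOAD_GBPS * 1e9 / BITS_PER_PIXEL / TOTAL_PIXELS
print(round(max_refresh_hz))  # ballpark: just under 100 Hz
```

With DSC or chroma subsampling the link can carry higher refresh rates, which is why different displays quote slightly different caps.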


----------



## UltraMega

guttheslayer said:


> I agree with you, but dont expect too much leeway to be given from NV side. Also seeing them skip 7nm to go straight to EUV, i expect pretty big jump, with pricing as well.
> 
> We should expect to see a Vram jump as well, but all these point to extremely unattractive price as the G6 high speed module aren't that cheap itself (esp for 16 gpbs which NV will desperate need for their next gen)
> 
> If given by old record we should see 20-25% performance jump but with $100 increase in pricing this time.
> 
> Hopefully by then DP 2.0 support will he out, there is no point for 4k120hz HDR display when u are capped at 98hz


That would be a huge fail from NV IMO. 

Keep in mind the 10 series lasted way longer than most GPU generations on PC, and the follow-up should have been way more advanced given the larger gap between releases, but it was a longer wait, a lesser upgrade, and way higher priced than usual. If Nvidia can't undo some of that trend, then I hope they have some other long-term plans beyond selling desktop GPUs for their own sake.

There may be a lot of people on OCN who will pay high prices for GPUs, but OCN is a very niche segment of the PC gaming market that represents the enthusiast consumer base. Nvidia will lose hard as soon as any competition gets serious if they have pissed off most of their customers two generations in a row.

Nvidia needs to take the position of "we hear our customers, we messed up, and we will do better next time", not "we want to keep exploiting our current market dominance like the only thing that matters is short-term gains for our investors". It's somewhat OK to mess up once and backtrack afterwards, but mess up and then push forward, and it sends the clear message that greed is more important than the product, and that the company is focused on getting the most out of its market control rather than just releasing a good product at a good price.


----------



## ttnuagmada

CallsignVega said:


> I hate the "performance increase per watt" metric. Outside of laptops, who gives a crap? It's raw performance gains I am interested in.


That's the wrong way to look at it. Higher perf/watt means they can make bigger chips. PCIe power delivery and GPU cooling demands mean AMD/Nvidia are generally not going to venture over a 300W TDP. I.e., high perf/watt means Nvidia can make massive GPUs like the TU102 when AMD can't get away with it.
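The point can be put in one line: at a fixed board-power budget, absolute performance is just efficiency times the cap. A toy illustration (all numbers made up):

```python
# At a fixed TDP ceiling, perf/W is the only lever left for absolute perf.

POWER_CAP_W = 300.0  # rough practical ceiling for cooling + PCIe delivery

def max_perf(perf_per_watt: float, cap_w: float = POWER_CAP_W) -> float:
    return perf_per_watt * cap_w

# A part with 1.5x the efficiency can ship 1.5x the performance
# without blowing past the same 300 W budget.
print(max_perf(1.5) / max_perf(1.0))  # -> 1.5
```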


----------



## guttheslayer

UltraMega said:


> That would be a huge fail from NV IMO.
> 
> Keep in mind the 10 series lasted way longer than most GPU gens on PC, and the follow-up should have been way more advanced given the larger gap between releases but it was a longer wait, a lesser upgrade, and way higher prices than usual. If Nvidia can't undo some of that trend.. then I hope they have some other long term plans beyond selling desktop GPUs for their own sake. There may be a lot of people on OCN who will pay high prices for GPUs but OCN is a very niche segment of the PC gaming market that definitely represents the enthusiast consumer base. Nvidia will lose hard as soon as any competition get serious if they have pissed of most of their customers two generations in a row. Nvidia needs to take the position of "we hear our customers, we messed up, and we will do better next time" and not "we want to keep exploiting our current market dominance like the only thing that matters are short term gains for our investors.". It's somewhat OK to mess up once and backtrack after, but mess up and then push forward and it sends the clear message that greed is more important that the product, and that the company is focusing on getting the most out of their market control rather than just releasing a good product at a good price.


I know what you are driving at, but maintaining that level of price / performance would mean cutting their profit margin in half.

That is because a 7nm+ wafer costs almost 2x as much to make as a 12nm one. Also, thanks to the RT cores occupying so much die space, the GA104 chip would be the biggest of all the x04 chips except TU104.

I hope for $699 or an even lower price (and I really hope I am wrong here), but given that the RTX Titan is $2500 and the 3080 will be faster, it's doubtful they can maintain the $700 pricing of the GTX 1080, especially for a first-gen EUV chip.


----------



## UltraMega

guttheslayer said:


> I know what you are trying to drive, but to maintain that level of price / performance would mean undercutting their profit margin by half.
> 
> 
> That is because it almost cost 2x as much more to make 7nm+ wafer than 12nm ones. Also thanks to the RT Cores that occupy that much die space, the GA104 chip would be the biggest compared to all previous x04 chip except TU104.
> 
> 
> I hope for $699 for even lesser price (which I really hope I am wrong), but given RTX Titan is $2500 and the 3080 will be faster, this will give us doubt if they could maintain that $700 pricing of GTX 1080 especially for 1st gen EUV chip.


Ray tracing is the variable though. 120 FPS at 4K without ray tracing should be easily attainable next gen. Since RT is so new, it's hard to predict how it's going to improve from first gen to second gen. If they can, generally speaking, do 4K 60 with RT and 4K 120 without, and for a decent price, meaning around $600, I would consider that a decent product.


----------



## The-Real-Link

Haven't jumped on the 2080TI bandwagon due to just rebuilding here and funds. But if a 3080 / 3080TI crushes a 2080Ti by any decent margin, that'd wallop my Titan X (Pascal) so probably going to jump.


----------



## guttheslayer

UltraMega said:


> Ray Tracing is the variable though. 120 FPS 4k without ray tracing should be easily attainable next gen. Since RT is so new, hard to predict how it's going to improve from first gen to second gen. If they can do 4K 60 with RT and 4K 120 without generally speaking, and for a decent price, meaning around $600, I would consider that a decent product.


120 FPS @ 4K is possible on a 3080 Ti, but RT at 4K 60 is not; first-gen Turing can only do 1080p 30 FPS without gimmicks, so that is roughly an 8x jump. Nothing short of a miracle happens within a single generation leap.
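For what it's worth, the 8x figure is just pixel-throughput arithmetic, comparing the claimed 1080p 30 FPS baseline against the 4K 60 target:

```python
# Ray-traced pixel throughput: 4K @ 60 FPS vs 1080p @ 30 FPS.
def pixels_per_second(width, height, fps):
    return width * height * fps

turing_rt = pixels_per_second(1920, 1080, 30)   # claimed first-gen RT baseline
target    = pixels_per_second(3840, 2160, 60)   # the 4K 60 goal

print(target / turing_rt)  # -> 8.0
```

The figures are the poster's, taken at face value; the point is only that 4K doubles each dimension (4x the pixels) and the frame-rate target doubles on top of that.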

DLSS has been a big fail since NV released their image sharpening option, so I hope DLSS 2.0 will be significantly better and push RT performance further.


Also, I believe this card will be released in Q3 2020, precisely because it might be DP 2.0 / HDMI 2.1 compatible. Given the time frame from DP 2.0's announcement to actual products shipping with it, anything earlier than Q3 seems like a pipe dream. It will likely be PCIe 4.0 as well.


Ampere is probably the true GPU that we have been waiting for from the green camp.


----------



## CallsignVega

ttnuagmada said:


> That's the wrong way to look at it. Higher perf/watt means they can make bigger chips. PCI-E power delivery and GPU cooling demands are going to mean AMD/Nvidia are generally not going to venture over 300w TDP. IE high perf/watt means Nvidia can make massive GPU's like the TU102 when AMD can't get away with it.


Hmm, I see your point. In my past experience, though, companies trot out the "performance per watt" increase when the absolute performance increase isn't anything to brag about.


----------



## Elmy

I will sell my 2080Ti and upgrade to anything 20% or more faster. 2560x1440 240Hz monitors will be arriving very soon. Will need all the FPS I can get. 

I will sell my 2080Ti at a loss... I will buy the new 3080Ti or whatever they call it for X amount of dollars. New GPU Cost - Old GPU selloff = X dollars. X dollars every 12 or 18 months to play on flagship GPU is worth every penny. And there are many of us that do this. There are also many ppl that are lurking in the for sale forums to snag up those used GPU's at a discount. 
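Elmy's net-cost arithmetic can be sketched like this (the prices below are hypothetical placeholders, not actual quotes):

```python
# Net cost of staying on the flagship: new price minus resale of the old card.
def net_upgrade_cost(new_gpu_price, old_gpu_resale):
    return new_gpu_price - old_gpu_resale

# Hypothetical example: $1199 new flagship, $800 resale on the outgoing card.
x = net_upgrade_cost(1199, 800)
print(x)  # -> 399
```

The "X dollars every 12 or 18 months" framing only works if resale value stays high, which is exactly why the used-flagship market he mentions matters.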

Also, I see so many ppl in here complaining about pricing. It's getting more expensive for R&D, real estate, wages, logistics, cost of wafers, etc. Flagship will never be 6-7 hundred again... EVER! Even if Intel joins the market. You think they somehow figured out how to build a GPU cheaper than Nvidia and AMD, and even if they did, you think they are going to pass that savings onto you? They know the excitement for their GPUs will be high. They will price at whatever NVIDIA/AMD is commanding at the same performance. They are NOT going to come into the industry and price their GPUs at the same performance for 50% less money. Some of you think somehow a pricing fairy is going to come rescue the market. It's NOT going to happen. Coming in here and complaining about the pricing falls on deaf ears. No one cares that you are not buying a GPU because it's too expensive. There are 20 people behind you in line that will.


----------



## keikei

I'm still working on FFXV and Metro regarding 4K/60 FPS. Whether the reason is poor optimization or new engines with more effects, the bar for higher FPS will forever be raised.


----------



## guttheslayer

Elmy said:


> I will sell my 2080Ti and upgrade to anything 20% or more faster. 2560X1440p 240Hz monitors will be arriving very soon. Will need all the FPS I can get.
> 
> I will sell my 2080Ti at a loss... I will buy the new 3080Ti or whatever they call it for X amount of dollars. New GPU Cost - Old GPU selloff = X dollars. X dollars every 12 or 18 months to play on flagship GPU is worth every penny. And there are many of us that do this. There are also many ppl that are lurking in the for sale forums to snag up those used GPU's at a discount.
> 
> Also I see so many ppl in here complaining about pricing. Its getting more expensive for R&D , Real estate , wages , logistics , cost of wafers , etc. Flagship will never be 6-7 hundred again... EVER! Even if Intel joins the market.. You think they somehow figured out how to build a GPU cheaper than Nvidia and AMD and even if they did.. You think they are going to pass that savings onto you? They know the excitement for their GPU's will be high. They will price at whatever NVIDIA/AMD is commanding at the same performance. The are NOT going to come into the industry and price their GPU's at the same performance for 50% less money. Some of you think somehow a pricing fairy is going to come rescue the market. Its NOT going to happen. Coming into here and complaining about the pricing falls on deaf ears. No one cares that you are not buying a GPU because its too expensive. There are 20 people behind you in line that will.


Haha, but the poor sales of RTX cards reflect it very clearly: they were BAD.


You said there are 20 people behind you that will buy; after them there will be no one, and the leftover cards will be left collecting dust on the shelves. If a pricing premium guaranteed good sales, Intel wouldn't have had to lower their HEDT CPU MSRPs by more than half.


Don't complain about people complaining about the price when sales and market share clearly reflect the dumb decision to price it sky-high. People cannot afford it and feel it's not worth it, as simple as that. There are many other things in this world worth buying instead, and no one cares about your 240 FPS lol.


----------



## skupples

Elmy said:


> I will sell my 2080Ti and upgrade to anything 20% or more faster. 2560X1440p 240Hz monitors will be arriving very soon. Will need all the FPS I can get.
> 
> I will sell my 2080Ti at a loss... I will buy the new 3080Ti or whatever they call it for X amount of dollars. New GPU Cost - Old GPU selloff = X dollars. X dollars every 12 or 18 months to play on flagship GPU is worth every penny. And there are many of us that do this. There are also many ppl that are lurking in the for sale forums to snag up those used GPU's at a discount.
> 
> Also I see so many ppl in here complaining about pricing. Its getting more expensive for R&D , Real estate , wages , logistics , cost of wafers , etc. Flagship will never be 6-7 hundred again... EVER! Even if Intel joins the market.. You think they somehow figured out how to build a GPU cheaper than Nvidia and AMD and even if they did.. You think they are going to pass that savings onto you? They know the excitement for their GPU's will be high. They will price at whatever NVIDIA/AMD is commanding at the same performance. The are NOT going to come into the industry and price their GPU's at the same performance for 50% less money. Some of you think somehow a pricing fairy is going to come rescue the market. Its NOT going to happen. Coming into here and complaining about the pricing falls on deaf ears. No one cares that you are not buying a GPU because its too expensive. There are 20 people behind you in line that will.


yep, i've been snagging up used flagships since maxwell.

however, that ends soon.


----------



## EniGma1987

DNMock said:


> So wait, why does everyone automatically think this is gonna replace Turing and not Volta?



Volta was the first time they really made an arch that was only for enterprise. I think everyone expects this will be a standard arch launch containing both consumer and enterprise chips. Nvidia's current strategy of the x100 chip being enterprise-focused and x102+ being consumer seems to be working.


----------



## guttheslayer

EniGma1987 said:


> Volta was the first time they really made an arch that was only for enterprise. I think everyone expects it will be a standard arch launch and contain both consumer and enterprise chips. The current Nvidia strategy of a x100 chip being enterprise focused and x102+ being consumer seems to be working


Yes, to this day we have never seen a TU100 chip, which means x100 is the code for enterprise.


I am surprised the first G code is coming back; it most likely signifies the return of enterprise and consumer cards sharing the same code name.


----------



## Doubletap1911

The 1080Ti is enough to push 7680x1440 on older titles, but I'm looking for some serious power to see if I can run NV Surround on current games at good frame rates. 

If not, I'll probably sell my 27s and buy the new LG 38"

I don't believe in "greed", I believe in market forces. They don't have a competitor and they make the fastest cards. You have to decide which you like more: frame rates or money.


----------



## skupples

multi-monitor is fading faster than ever anyways. I downgraded to a single screen for gaming quite some time ago now. 

Though really, I'd love to have one of those 5120x1440p screens from sammy.


----------



## UltraMega

guttheslayer said:


> 120 FPS @ 4K is possible on 3080 Ti, but RT at 4K 60 is not, the first gen Turing can only do 1080p 30 Hz without gimmick, that is like a 8x jump. Nothing sort of miracle can be done within a generation leap.
> 
> DLSS is a big fail after NV released their image sharpening option, so I hope DLSS 2.0 will be significantly better to push the performance of RT further.
> 
> 
> Also I believe this card will be released at Q3 of 2020 precisely because it might be DP 2.0 / HDMI 2.1 compatible. Given the time frame DP 2.0 is announced to actual product releasing it, anything earlier Q3 seem to be a pipe dream. Also it will be likely to be on PCIe 4.0 as well.
> 
> 
> Ampere is probably the true GPU chip that we have been waiting for from Green camp.


That may have been true at launch, but performance has improved a lot since then. Most RTX games can do at least 30 FPS at 4K now, so we're only asking for a 2x increase, and given how new the tech is, I think we should see at least a 3x improvement.


----------



## Sheyster

Elmy said:


> I will sell my 2080Ti at a loss... I will buy the new 3080Ti or whatever they call it for X amount of dollars. New GPU Cost - Old GPU selloff = X dollars. X dollars every 12 or 18 months to play on flagship GPU is worth every penny.


This... My Titan Xp sale covered a big chunk of the 2080 Ti purchase. I stayed on a 4790K for over 4 years without feeling a real need to upgrade until the 9900K was available last year. Every hobby has high-rollers who go all out all the time, and then there is the bang-for-the-buck crowd who upgrade much less frequently and are more strategic about how, when and what they spend their money on. Not everyone rolls the same way obviously but we all enjoy it and participate at various levels.


----------



## keikei

So what happens with RT when all 3 bois show up to the party next summer? 3 separate methods to run it or some sort of unified standard? Summer of 2020 will be a very interesting time indeed.


----------



## skupples

something along those lines.

I haven't kept up well enough to know the ins and outs, but honestly... the 2080 Ti was probably one of the least valuable buys since Kepler, given its total lifespan and how it gets usurped. 

NV had almost no competition for half of the 20x0 series line. They could release whatever the hell they wanted; they were only competing with themselves.


----------



## Sheyster

keikei said:


> So what happens with RT when all 3 bois show up to the party next summer? 3 separate methods to run it or some sort of unified standard? Summer of 2020 will be a very interesting time indeed.



It should not matter much. DXR has been part of DirectX 12 since late last year. Nvidia just had the first hardware implementation on the market that supported it.


----------



## UltraMega

Sheyster said:


> It should not matter much. DXR is part of DirectX 12 since late last year. Nvidia just had the first implementation to market that supported it.


Is Nvidia using DXR or is RTX some gameworks kinda BS? What happens to RTX games when there are more than just Nvidia cards doing DXR?


----------



## skupples

Ray Tracing is DXR. Tensor core is a "dedicated solution" to handle it. 

"RTX" is part of gameworks 3.0

at least, that's how I read it.

NVupdate, now known as Experience + Game works = RTX! or something...


----------



## Sheyster

UltraMega said:


> Is Nvidia using DXR or is RTX some gameworks kinda BS? What happens to RTX games when there are more than just Nvidia cards doing DXR?


This should shed more light on it (no pun intended):

https://developer.nvidia.com/rtx

So the short answer is that RTX does use DXR (or rather, supports it), but there is more to it than that, obviously.


----------



## 113802

Sheyster said:


> UltraMega said:
> 
> 
> 
> Is Nvidia using DXR or is RTX some gameworks kinda BS? What happens to RTX games when there are more than just Nvidia cards doing DXR?
> 
> 
> 
> This should shed more light on it (no pun intended):
> 
> https://developer.nvidia.com/rtx
> 
> So the short answer is RTX does use DXR (or it supports it would be a better term) but there is more to it than that obviously.

Just like AMD's AGS, nVidia also has a platform with enhanced libraries to make better use of their hardware. Even though AMD's is open source, it's still AMD hardware dependent, just like AMD's ROCm.


----------



## UltraMega

So then it sounds like it's just not clear what will happen to pre-existing RTX games when other RT cards come out. Sounds to me like RTX will either not work on other RT cards at all, or work but much slower and unoptimized. 

So basically those games would need to get patched to remove RTX and replace it with pure DXR standards. This has been one of the biggest issues with RTX for me: Nvidia trying to turn an open standard into something walled off and proprietary. Feels pretty anti-consumer. When competition catches up with RT, RTX's walled-off issues are going to stand out like a sore thumb.

So will this make other RT cards look slower as well? RTX games will work better on Nvidia cards if only because devs won't go back and standardize RT support for older games, so it seems to me this is likely to give Nvidia a really sneaky and artificial "boost" in certain benchmarks. Basically, they are going to be holding their competition down with this. 

Nvidia can honestly kick rocks.


----------



## Sheyster

UltraMega said:


> Basically, they are going to be holding their competition down with this.
> 
> Nvidia can honestly kick rocks.



Well, we all know that nVidia loves their proprietary tech. Case in point: G-Sync. Only recently have they started to support FreeSync 2, dubbing it "G-Sync Compatible".


----------



## 113802

Sheyster said:


> Well, we all know that nVidia loves their proprietary tech. Case in point: G-Sync. Only recently have they started to support FreeSync 2, dubbing it "G-Sync Compatible".


You do realize FreeSync is proprietary right? Just like Qualcomm's QSync. 

https://www.amd.com/en/technologies/free-sync-faq



> How are DisplayPort Adaptive-Sync and Radeon™️ FreeSync technology different?
> 
> DisplayPort Adaptive-Sync is an ingredient DisplayPort feature that enables real-time adjustment of monitor refresh rates required by technologies like Radeon™ FreeSync technology. Radeon™ FreeSync technology is a unique AMD hardware/software solution that utilizes DisplayPort Adaptive-Sync protocols to enable user-facing benefits: smooth, tearing-free and low-latency gameplay and video.


----------



## Sheyster

WannaBeOCer said:


> You do realize FreeSync is proprietary right? Just like Qualcomm's QSync.
> 
> https://www.amd.com/en/technologies/free-sync-faq



We're quibbling over semantics here... This is really all that matters:



> FreeSync:
> 
> Royalty-free licensing for monitor manufacturers;
> Open and standardized monitor requirements (e.g. no non-standard display controllers or ASICs);


----------



## 113802

Sheyster said:


> We're quibbling over semantics here... This is really all that matters:
> 
> 
> 
> 
> FreeSync:
> 
> Royalty-free licensing for monitor manufacturers;
> Open and standardized monitor requirements (e.g. no non-standard display controllers or ASICs);

We were talking about proprietary software, which FreeSync is. Just like ROCm and most of their other software on GPUOpen. Open source doesn't mean hardware agnostic. 

If you said you don't like the extra fee for the hardware module, I'd understand, but claiming it's not proprietary is a joke.


----------



## UltraMega

G-Sync and FreeSync are separate; they're hardware things that don't have anything to do with rendering. Also, Nvidia kinda invented that whole trend, since before G-Sync there was nothing like it, so I have no issue with that.

But Nvidia did not invent ray tracing. Not only that, but it was already a standard feature of DX12. They had invested a lot into the Tensor core design and it didn't work well as a GPU architecture, so they pawned it off onto gamers as RT cores at a high premium, AND they tried to make RT, again something they in no way invented, a proprietary thing at the same time. It's just all-around dirty the more I think about it.


----------



## tpi2007

WannaBeOCer said:


> We were talking about proprietary software which FreeSync is. Just like ROCm and most of their other software on GPUOpen. Open Source doesn't mean hardware agnostic.
> 
> If you said you don't like the extra fee for the hardware module I understand but claiming it's not proprietary is a joke.



That's not the correct way to put it either. Adaptive-Sync is hardware agnostic; Intel's iGPUs and discrete cards will support it, and Nvidia's already do.






UltraMega said:


> Gsync and Free Sync are separate, they're hardware things that don't have anything to do with rendering. Also, Nvidia kinda invented that whole trend since before Gsync there was nothing like it, so I have no issue with that.
> 
> But Nvidia did not invent ray tracing. Not only that, but it was already a standard feature of DX12. *They had invested a lot into the tensor core design and it didn't work well as a GPU architecture, so they pawned it off onto gamers as RT cores for a high premium* AND they tried to make RT, again something they in no way invented, a proprietary thing at the same time. It's just all around dirty the more I think about it.



What? Tensor cores are one thing, RT cores are another. The Tensor cores are needed for Nvidia's current ray tracing implementation to produce acceptable results, as they act as image denoisers for the RT cores' output.


----------



## 113802

tpi2007 said:


> That's not the correct way to put it either. Adaptive Sync is hardware agnostic, Intel's iGPU's and discrete cards will support it, and so do Nvidia's already.


I never said Adaptive-Sync wasn't hardware agnostic. 

I stated that AMD's FreeSync, Qualcomm's Q-Sync and nVidia's G-Sync Compatible are all proprietary. If you look above, you'll see AMD's exact statement, which holds for all of the others that use the Adaptive-Sync protocol.


----------



## magnek

Elmy said:


> I will sell my 2080Ti and upgrade to anything 20% or more faster. 2560X1440p 240Hz monitors will be arriving very soon. Will need all the FPS I can get.
> 
> I will sell my 2080Ti at a loss... I will buy the new 3080Ti or whatever they call it for X amount of dollars. New GPU Cost - Old GPU selloff = X dollars. X dollars every 12 or 18 months to play on flagship GPU is worth every penny. And there are many of us that do this. There are also many ppl that are lurking in the for sale forums to snag up those used GPU's at a discount.
> 
> Also I see so many ppl in here complaining about pricing. Its getting more expensive for R&D , Real estate , wages , logistics , cost of wafers , etc. Flagship will never be 6-7 hundred again... EVER! Even if Intel joins the market.. You think they somehow figured out how to build a GPU cheaper than Nvidia and AMD and even if they did.. *You think they are going to pass that savings onto you? They know the excitement for their GPU's will be high. They will price at whatever NVIDIA/AMD is commanding at the same performance. The are NOT going to come into the industry and price their GPU's at the same performance for 50% less money. Some of you think somehow a pricing fairy is going to come rescue the market. Its NOT going to happen. Coming into here and complaining about the pricing falls on deaf ears. No one cares that you are not buying a GPU because its too expensive. There are 20 people behind you in line that will.*


You're not wrong, but at some point enough is enough, and a breaking point is reached. For me, that means I either quit PC gaming for good, or will always intentionally stay at least one gen behind to keep purchase prices more sane. I'll be damned if I had to pay $1000+ for a not-even-full-chip faux "flagship".

The same line of reasoning is also why I have never paid more than half price for PC games since 2012. $60 for a base game that has at least 30% of its content cut out in the name of "DLC" is absolutely criminal, and I refuse to cave to such money gouging tactics.


----------



## guttheslayer

magnek said:


> You're not wrong, but at some point enough is enough, and a breaking point is reached. For me, that means I either quit PC gaming for good, or will always intentionally stay at least one gen behind to keep purchase prices more sane. I'll be damned if I had to pay $1000+ for a not-even-full-chip faux "flagship".
> 
> The same line of reasoning is also why I have never paid more than half price for PC games since 2012. $60 for a base game that has at least 30% of its content cut out in the name of "DLC" is absolutely criminal, and I refuse to cave to such money gouging tactics.


Like I said, let market share decide.

JHH saw his net worth drop by half in a matter of months. So let that continue. That is what happens to CEOs who think like Elmy lol.


All I can agree with is that $699-$799 for a flagship card on a mid-range die is going to stay. I am not so sure about the 3080 Ti; if it does appear in 2021, it might be $1200 or it might be $700, all depending on competition from AMD at that point.


----------



## ilmazzo

Elmy said:


> I will sell my 2080Ti and upgrade to anything 20% or more faster. 2560X1440p 240Hz monitors will be arriving very soon. Will need all the FPS I can get.
> 
> I will sell my 2080Ti at a loss... I will buy the new 3080Ti or whatever they call it for X amount of dollars. New GPU Cost - Old GPU selloff = X dollars. X dollars every 12 or 18 months to play on flagship GPU is worth every penny. And there are many of us that do this. There are also many ppl that are lurking in the for sale forums to snag up those used GPU's at a discount.
> 
> Also I see so many ppl in here complaining about pricing. Its getting more expensive for R&D , Real estate , wages , logistics , cost of wafers , etc. Flagship will never be 6-7 hundred again... EVER! Even if Intel joins the market.. You think they somehow figured out how to build a GPU cheaper than Nvidia and AMD and even if they did.. You think they are going to pass that savings onto you? They know the excitement for their GPU's will be high. They will price at whatever NVIDIA/AMD is commanding at the same performance. The are NOT going to come into the industry and price their GPU's at the same performance for 50% less money. Some of you think somehow a pricing fairy is going to come rescue the market. Its NOT going to happen. Coming into here and complaining about the pricing falls on deaf ears. No one cares that you are not buying a GPU because its too expensive. There are 20 people behind you in line that will.



TINA 

there is no alternative

"shut up and take my money"

beautiful


----------



## The Robot

The only winning move is not to buy.


----------



## ilmazzo

UltraMega said:


> That may have been true at launch but performance has improved a lot since then. Most rtx games can do at least 30fps in 4k now, so only asking for a 2x increase but given how new the tech is I think we should see at least a 3x improvement.


yeah, but even the RTX effects will be multiplied by a factor of 2-3

otherwise we would have had 4K viable since Pascal/Vega... resolution is fixed, but the effects around that resolution get more demanding with every single title...


----------



## keikei

The Robot said:


> The only winning move is not to buy.


That's crazy talk. 

Don't forget about Team Blue: https://wccftech.com/intel-xe-gpu-architecture-2x-performance-ray-tracing-support/


----------



## skupples

UltraMega said:


> So then it sounds like it's just not clear what will happen when other RT cards come out for pre-existing RTX games. Sounds to me like RTX either; will not work on other RT cards at all, work but much slower and unoptimized.
> 
> So basically those games would need to get patched to remove RTX and replace it with pure DXR standards. This has been one of the biggest issues about RTX to me. Nvidia trying to turn an open standard into something walled off and proprietary. Feels pretty anti-consumer. When competition catches up with RT, RTX's wall off issues are going to stand out like a soar thumb.
> 
> So then will this make other RT cards looks slower as well? RTX games will work better on Nvidia cards if only because devs won't go back and standardize RT support for older games, so seem to me this is likely to give Nvidia an really sneaky and artificial "boost" in certain benchmarks. Basically, they are going to be holding their competition down with this.
> 
> Nvidia can honestly kick rocks.


i think it'll mirror PhysX more closely. 

"dedicated solution"
everyone starts doing it 
boom, dedicated solution becomes pointless, useless garbage. 

the only competition they're holding down is theoretical. It's currently a fancy box badge that works poorly here and there. I'd also assume games will be patched to support AMD and Intel ray tracing solutions once they come around, UNLESS they're specifically branded RTX features; like, I'd assume GameWorks stuff is grayed out on an AMD card?

i'm gonna LOL hard if the next top tier RT card can't handle 4K + RT properly.



The Robot said:


> The only winning move is not to buy.



pretty much. The markets in a terrible spot right now, it's only made slightly better by seeing AMD coming up in the distance, with intel behind them lacing their shoes. 2021 holdout train!! WOOOT WOOOOT


----------



## JackCY

The Robot said:


> The only winning move is not to buy.


There is nothing to buy anyway. They have effectively priced modern GPUs out of reach of most people. Upgrading from a GTX 1080 or better makes little sense anyway.

---

Tracing graphics is far different from physics. While yes, you can do physics on a CPU or elsewhere, good luck doing ray-traced graphics in real time without dedicated hardware. Take your GTX or Radeon and try to play ray-traced games; so fast, isn't it?

FreeSync is as proprietary as a wheel. Anyone can implement it.


----------



## skupples

i just mean the way it'll be hustled, then integrated and be everywhere. 

its apples to oranges tech wise.


----------



## DNMock

tpi2007 said:


> What? Tensor cores are one thing, RT cores are another. The Tensor cores are needed for Nvidia's current ray tracing implementation to produce acceptable results, as they act as image denoisers for the RT cores' output.


Are they though? As far as I'm aware, they are the exact same modules, just with different firmware optimizing them for ray tracing.


----------



## 113802

UltraMega said:


> Gsync and Free Sync are separate, they're hardware things that don't have anything to do with rendering. Also, Nvidia kinda invented that whole trend since before Gsync there was nothing like it, so I have no issue with that.
> 
> But Nvidia did not invent ray tracing. Not only that, but it was already a standard feature of DX12. They had invested a lot into the tensor core design and it didn't work well as a GPU architecture, so they pawned it off onto gamers as RT cores for a high premium AND they tried to make RT, again something they in no way invented, a proprietary thing at the same time. It's just all around dirty the more I think about it.


You're right, they didn't invent ray tracing, but they did advance it. They also invented the GIVoxels/VXGI voxel-based indirect illumination technique. 

https://research.nvidia.com/person/cyril-crassin

They created RT cores, which accelerate BVH traversal, and Tensor cores, which can be used for denoising. How does DXR utilize those cores? Through enhanced DX12 libraries. Just like AMD's future DXR-compatible cards will need to utilize whatever hardware they release. 

The other method to accelerate BVH traversal would be to increase raw compute performance. We would need around 40 TFLOPS of FP32 performance; current-generation flagship cards are around 15 TFLOPS. 

We've seen the current poor performance of Vega/Pascal.
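Taking the poster's 40 and 15 TFLOPS figures at face value, the brute-force compute gap works out like this:

```python
# Brute-force DXR in shaders (no RT cores): how big is the raw-compute gap?
# Both figures are the poster's rough estimates, not measured values.
required_tflops = 40.0   # claimed FP32 budget for real-time BVH traversal
flagship_tflops = 15.0   # approximate FP32 of a 2019 flagship card

gap = required_tflops / flagship_tflops
print(round(gap, 2))  # -> 2.67
```

In other words, under these assumptions a shader-only approach would need roughly 2.7x the compute of a current flagship, which is the argument for fixed-function RT hardware.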


----------



## tpi2007

The Robot said:


> The only winning move is not to buy.



Still winning over here. During the summer I thought about buying one of the outgoing RTX 2070 models on discount, but ultimately they were still too expensive here in Europe. And now that Nvidia has, in my opinion, shot themselves in the foot in terms of mindshare with the Super lineup, I'm not inclined to buy anything at all until next summer. By that I mean: for around 500 € you now only get a good-quality *x60* class card. That could have been an unforeseen consequence of their lineup-milking strategy, but it sends the wrong message about what tier prices we should be paying in the next generation of cards, a message I'm definitely not willing to support. 

And on a super sarcastic note, I'll add that no amount of "free" games laden with obnoxious microtransactions, loot boxes and pay-to-win elements can make up for it.


----------



## 331149

Volts & Amps. Nvidia so clever when it comes to naming products.


----------



## moonbogg

I do feel very fortunate to have jumped on the 1080Ti right when it came out. I can't remember a GPU staying that high-end for this long. If AMD and Intel really get their act together in the high-end, I think things will really get fun again. For now, I feel like what I have is more than enough to let me enjoy my casual gaming hobby without any annoying compromises. Anyone who feels the same about their card is in a good spot. We can wait. The days of 8800GTX levels of excitement are long gone anyway, so screw it.


----------



## DrFPS

moonbogg said:


> I do feel very fortunate to have jumped on the 1080Ti right when it came out. I can't remember a GPU staying that high-end for this long. If AMD and Intel really get their act together in the high-end, I think things will really get fun again. For now, I feel like what I have is more than enough to let me enjoy my casual gaming hobby without any annoying compromises. Anyone who feels the same about their card is in a good spot. We can wait. The days of 8800GTX levels of excitement are long gone anyway, so screw it.



I bought a 1080 Ti MSI Duke ("dukie") the same day the 2080 was released. I paid $549 from Newegg and got free shipping, LOL. I'm glad I didn't buy the 2080; I was going to. I'm too cheap.


$1,199.99 today for the same card. Newegg prices are crazy...

https://www.newegg.com/msi-geforce-gtx-1080-ti-gtx-1080-ti-duke-11g/p/N82E16814137146?Description=gtx%201080ti&cm_re=gtx_1080ti-_-9SIAE8D8GD1254-_-Product


----------



## skupples

that's a pretty damn good price. reference was what like $649.99 MSRP?

they go used for $500 all day long still.


----------



## RealNeil

moonbogg said:


> I do feel very fortunate to have jumped on the 1080Ti right when it came out.


Me too. I bought the MSI GTX-1080Ti Gaming-X TRIO from a reviewer at a too-good-to-pass-up price and never looked back.
This is a great gaming solution.


----------



## Defoler

UltraMega said:


> Gsync and Free Sync are separate, they're hardware things that don't have anything to do with rendering. Also, Nvidia kinda invented that whole trend since before Gsync there was nothing like it, so I have no issue with that.


Not exactly accurate. 
The "prime model" both are using is eDP. eDP has supported variable refresh on laptops as a standard since around 2008. It's a slightly different hardware standard than DP (which it is based on), in order to support the sync between monitor and GPU. 
What Nvidia did was implement it via dedicated hardware for complete integrity, and AMD responded by pushing VESA to support it via the standard, so they only need to write their side and let monitor manufacturers do the hard part. 
So the first to bring it to the desktop was Nvidia, but something like it already existed, even though most were just unaware of it.


----------



## Asmodian

Defoler said:


> So the first to bring it to the desktop where nvidia, but there was something even though most were just unaware of it.


The bit Nvidia added was using it to sync with the game engine. The stuff in eDP was only used for power saving, where the screen would update slower when the image hadn't changed.

That AMD was able to use v-blank and software to enable effective variable sync connected to the game engine was very very cool, but I am not sure they would have even tried if it wasn't for G-sync. Now it is mainstream enough that my TV should support the same technique very soon. 

If you are giving credit to eDP then you are not looking far enough back. The video transmission standards could have been used to implement a variable sync since the first large scale black and white TV broadcasts, assuming the right software was developed. All they are doing is extending the v-blank interval and if the screen can hold the image that is all you need to do.
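To make the "all they are doing is extending the v-blank interval" point concrete, here's a toy timing model (the refresh limits and function name are purely illustrative, not any real panel spec or API): the display simply holds the current image until the GPU finishes the next frame, clamped to the panel's minimum and maximum refresh intervals.

```python
# Toy model of variable refresh: the panel holds the image (extended
# v-blank) until the next frame is ready, within its refresh window.
# All numbers are illustrative assumptions, not a real panel spec.
MIN_INTERVAL_MS = 1000 / 144   # panel's fastest refresh (144 Hz)
MAX_INTERVAL_MS = 1000 / 30    # panel must refresh at least at 30 Hz

def scanout_interval(frame_time_ms):
    """Time between scanouts for a frame that took frame_time_ms to render."""
    # Frame done too fast: wait out the panel's minimum interval.
    # Frame too slow: the panel refreshes anyway, repeating the old image.
    return min(max(frame_time_ms, MIN_INTERVAL_MS), MAX_INTERVAL_MS)

for ft in (5.0, 16.7, 25.0, 50.0):
    print(ft, "->", round(scanout_interval(ft), 2))
```

Inside the window, scanout tracks the game engine exactly; outside it, the old fixed-refresh behavior takes over, which is all G-Sync/FreeSync fundamentally do.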


----------



## CoD511

I'm thinking it's all dependent on the competition, performance and pricing. Nvidia has had ample time to develop their next architecture and I don't see them letting anyone take away their crown. Here's hoping raytracing is more viable at the very least. Had my 1080Ti for a long while and I've been more than happy with it.



TheBDK said:


> Volts & Amps. Nvidia so clever when it comes to naming products.


The architecture is named after people in science, as they've always done for over a decade.


----------



## braincracking

To state the obvious, we need competition. People saying that a flagship (xx80 Ti) can never be around 700 USD anymore are nuts. Look what happened in the HEDT space with Intel's pricing; that's why competition is so important, so that we can actually move on and get a worthy upgrade for a price that isn't 1200 USD.

Just my two cents.


----------



## Damage Inc

braincracking said:


> To state the obvious, we need competition, people saying that a flagship(xx80ti) can never be around 700USD anymore are nuts. Look what happened in the HEDT space with Intel's pricing, that's why it is so important to have competition, so that we can actually move on and get a worthy upgrade for a price that isn't 1200USD.
> 
> Just my two cents.


I bought an X800 XT PE for around $649 back in 2004 or thereabouts, and that wasn't even the best card. It lacked SM3.0 support, off the top of my head, and ran super hot. Here you are, 16 years later, expecting cards that are 100x more complex to cost the same? You either need a top card or you don't.


----------



## huzzug

Damage Inc said:


> I bought a X800XTPE for around $649 back in 2004 or thereabouts....


Wasn't their launch price $499?


----------



## keikei

braincracking said:


> To state the obvious, we need competition, people saying that a flagship(xx80ti) can never be around 700USD anymore are nuts. Look what happened in the HEDT space with Intel's pricing, that's why it is so important to have competition, so that we can actually move on and get a worthy upgrade for a price that isn't 1200USD.
> 
> Just my two cents.


AMD would have to pull another Navi, and they are ready to. If the Super/Navi launches weren't so close, I think Navi would've been priced higher. The 5700 XT launched at $400 (midrange). I can see a $600-700 Navi 2 top end with similar 2080 Ti performance.


----------



## skupples

HEDT price drop isn't the first time AMD has saved our asses. We would'a been doomed to $1,000 4 core hell for eternity without bulldozer flopping about. 

The question is, are the margins on GPUs as high as CPUs? Do they have the same kinda room to drop prices in half? I have no clue, but it seems unlikely. 

seems from Nvidia we would be met with slightly lower prices, + MOAR POWER! Folks are fooling themselves if they think this is the best NV can do, when they've been in a realm of their own for nearly two full cycles.


----------



## Damage Inc

huzzug said:


> Wasn't their launch price $499?



You may be right, I think it was $549. I think I paid $649 for an X1800XT or X1900XT though. I remember paying $700 for a 4800 X2 S939 dual core as well.


----------



## ilmazzo

skupples said:


> seems from nvidia we would be met with slightly lower prices, + MOAR POWER! Folks are fooling themselves if they think this is the best NV can do, when they're quite been in a realm of their own for nearly two full cycles.


Well, you can't have the profits NV has without high margins on what you sell... this is absolutely obvious for Quadros and Teslas; dunno about the gaming lineup, but for sure they haven't been "at cost" since the Kepler era...


----------



## EniGma1987

skupples said:


> HEDT price drop isn't the first time AMD has saved our asses. We would'a been doomed to $1,000 4 core hell for eternity without bulldozer flopping about.
> 
> The question is, are the margins on GPUs as high as CPUs? Do they have the same kinda room to drop prices in half? I have no clue, but it seems unlikely.
> 
> seems from nvidia we would be met with slightly lower prices, + MOAR POWER! Folks are fooling themselves if they think this is the best NV can do, when they're quite been in a realm of their own for nearly two full cycles.





If GPUs keep being designed the way Nvidia is designing them, then no, they don't have the margins and can't be price dropped. Making these single dies at 600mm2+ just costs insane amounts of money, and yields are very low at these edges of production capability. That's the consumer lineup, though. Any enterprise GPU from either company has HUGE margins.
The only way to move forward would be to perfect multi-GPU gaming, since that is essentially how a GPU configured like Ryzen/Threadripper would function. But that would be the only way to use smaller dies and still grow in performance. Nvidia is closest to this already, with the introduction of NVSwitch. It's basically like the IO die in a Ryzen: it's what gets talked to, then passes data on to the GPUs behind it, and can make everything look like one big GPU. Unfortunately, making it high enough bandwidth is very expensive right now.


----------



## skupples

i'd love a return to multiGPU gaming. IDK about AMD ever doing it again though, isn't there something about how DX12 doesn't run in FSW or some nonsense?


----------



## guttheslayer

skupples said:


> HEDT price drop isn't the first time AMD has saved our asses. We would'a been doomed to $1,000 4 core hell for eternity without bulldozer flopping about.
> 
> The question is, are the margins on GPUs as high as CPUs? Do they have the same kinda room to drop prices in half? I have no clue, but it seems unlikely.
> 
> seems from nvidia we would be met with slightly lower prices, + MOAR POWER! Folks are fooling themselves if they think this is the best NV can do, when they're quite been in a realm of their own for nearly two full cycles.


GPUs have less room to drop prices by half thanks to VRAM, PCB + premium components. Those G6 modules ain't exactly cheap by themselves, and you need about 8 to 12 of them, each costing $20-30 at least. Add that to the expensive large monolithic die on 7/12nm, and they don't have a lot of margin room to play with.


CPU dies are usually smaller, or in fact much smaller.


----------



## skupples

that's what I thought, but it's been a while since I've really paid attention.

as previously stated, the more likely outcome of new GPU wars will be power, not cost savings. At least, in the top tier.

can only hope all three start competing to see who can release the best $1299 card. instead of... you know... only one company even producing cards in that class, thus having zero competition but themselves, meaning each step can be as meek as they think users will accept.

either way, 1080ti power will be down to $300-$400 from NV in the next gen, thanks to 5700xt.


----------



## 113802

skupples said:


> i'd love a return to multiGPU gaming. IDK about AMD ever doing it again though, isn't there something about how DX12 doesn't run in FSW or some nonsense?


I'm sure the future will be toward mGPU, but drivers will need to do the lifting since developers don't want to support it. Programs will need to see the multiple GPUs as one, which is where MCM designs come into play. Just like AMD's new Radeon Pro Vega II Duo. 

nVidia is also working on it: 

https://research.nvidia.com/publication/2017-06_MCM-GPU:-Multi-Chip-Module-GPUs
https://research.nvidia.com/sites/default/files/pubs/2019-06_A-0.11-pJ/Op,//C24_1.pdf


----------



## skupples

good, idk how else the PC gaming segment really separates itself as consoles get beefier, with ever-improving APIs and rendering tricks that make "native 4K" not all that important for anything but epeen  

i wanna be running 3x 4k120 @ 120FPS all day please, and that's only going to happen with a couple 4080tis.


----------



## The Pook

Damage Inc said:


> 50% per watt... Where does it say 50% more performance.



The same place? 

50% more performance per watt means 50% more performance if they keep the same power requirements for the new cards (which is likely since the 780 Ti, 980 Ti, 1080 Ti, and 2080 Ti are all guesstimated by NVIDIA to use ~250W).
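The arithmetic is simple enough to sanity-check. As a sketch, using the rumored 50% per-watt figure and the ~250 W guesstimate from the posts above (both unconfirmed numbers, not official specs):

```python
# Rumored numbers only: 50% better performance per watt, and the same
# ~250 W envelope as the 780 Ti / 980 Ti / 1080 Ti / 2080 Ti.
perf_per_watt_gain = 1.5   # rumored 1.5x performance per watt
old_power_w = 250
new_power_w = 250          # assumption: power envelope stays the same

# Performance = (perf/W) * W, so at equal power the entire per-watt
# gain shows up as raw performance.
relative_perf = perf_per_watt_gain * (new_power_w / old_power_w)
print(relative_perf)  # 1.5, i.e. 50% more performance
```

If Nvidia instead spent part of the gain on a lower TDP, the raw performance uplift would shrink proportionally.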


----------



## DNMock

guttheslayer said:


> GPU have less room to drop prices by half thanks to Vram and PCB + premium component. Those G6 aint exactly cheap by itself and you need about 8 to 12 or them, each cost $20-30 at least? Add that with the expensive large monolithic die GPU on 7/12nm, they dont have alot of margin room to play about.
> 
> 
> CPU die are usually smaller, or in fact much smaller.



I don't know for sure on this, but hasn't the cutting edge on that stuff always been like that? I can't imagine the cost of a GDDR6 ram module is much more now than when GDDR5 first came out.

For sure fab prices have gone up, but I don't think they have gone up nearly as much as Nvidia's markups. Heck, it's not like this is even the highest GPU prices have ever been when you factor in inflation.


----------



## Seyumi

You all have to admit, the gap between console & PC is closing with each and every generation. I've always been a PC gamer, before the original Playstation & Xbox days:

My PC games looked like PS2 games when the PS1 was out
My PC games looked like PS3 games when the PS2 was out
My PC games looked like PS4 games when the PS3 came out
My PC games looked like PS4 pro games when the PS4 came out

Next generation, my PC games will probably look like PS5 games...when the PS5 comes out.

All PC has over an Xbox One X or a PS4 pro is higher resolution, frame rates, and some settings on high/ultra instead of medium. PC games are only MARGINALLY better visually than their console counterparts these days but A LOT more money. I'm losing faith in the PC-only gaming industry. I guess we have Nvidia and Intel to thank for that with their monopolistic prices. Don't even bother mentioning "but but but PC has mods" because those have now all been replaced by microtransactions these days. The PC modding community is practically dead at this point.

I can't wait to get a PS5 for $500? and get 95% of the visuals that my $5,000 computer has. My wallet is getting tired after all these years.


----------



## Asmodian

Seyumi said:


> I can't wait to get a PS5 for $500? and get 95% of the visuals that my $5,000 computer has. My wallet is getting tired after all these years.


That is not my impression. The hardware in consoles keeps getting worse compared to PC hardware so the games need to render at an obviously lower resolution and use lower quality effects to be able to run on it. If you want 4K and/or >60 fps then consoles don't seem to offer much and even at 1080p60 you can really go crazy with the graphics quality in modern games on a PC with a 2080 Ti. Consoles used to render at the same resolution as PCs.

Consoles sacrifice a lot to run games on their potato hardware, even allowing for the better optimization due to running on fixed hardware.


----------



## guttheslayer

DNMock said:


> I don't know for sure on this, but hasn't the cutting edge on that stuff always been like that? I can't imagine the cost of a GDDR6 ram module is much more now than when GDDR5 first came out.
> 
> For sure fab prices have gone up, but I don't think they have gone up nearly as much as what Nvidia has been marking up for. Heck it's not like this is even the highest GPU prices have ever been when you factor in inflation.


I am comparing it with CPUs, as someone questioned whether it's possible to drop prices by half on the GPU side just like Intel did with their HEDT. I said unlikely.


NV's markup is not due to card cost, but to increased R&D on the useless RT cores they force us gamers to swallow. The RT cores are a clear indication that R&D is being split into two major fronts, with data center on one side and gaming on the other (which means compute and gaming cards will be different going forward).

Now that they are on 7nm+ EUV, I would say a price drop is also unlikely due to the significant cost increase of 7nm+ over 12nm.



WannaBeOCer said:


> I'm sure the future will be toward mGPU but drivers will need to do the lifting since developers don't want to support it. Programs will need to see the multiple GPUs as one which is where MCM designs come in to place. Just like AMD's new Radeon Pro Vega II Duo.
> 
> nVidia is also working on it:
> 
> https://research.nvidia.com/publication/2017-06_MCM-GPU:-Multi-Chip-Module-GPUs
> https://research.nvidia.com/sites/default/files/pubs/2019-06_A-0.11-pJ/Op,//C24_1.pdf



MCM or chiplet GPUs connecting to a main I/O hub on a single card are the only way forward to deal with the demands of (>8-bit) HDR, RTX, increased graphics workloads, and higher resolutions/FPS all at once.


----------



## little_ninjai

*Discount*



guttheslayer said:


> Haha, but the poor sales of RTX card reflect it very clear, it was BAD.
> 
> 
> 20 people behind you that will buy you said, then after that will be no one, the leftover cards will be left to collect dust at the shelves. If pricing premium will guarantee good sales, then Intel doesnt have to lower their HEDT CPU MSRP by more than half.
> 
> 
> Don't complain about people complaining the price when the sales and market share will reflect clearly on the dumb decision to price it sky-high. People cannot afford it and felt that its not worth it is as simple as that. There are many thing else on this world that is worth buying over and no one cares about your 240 FPS lol.





Elmy said:


> I will sell my 2080Ti and upgrade to anything 20% or more faster. 2560X1440p 240Hz monitors will be arriving very soon. Will need all the FPS I can get.
> 
> I will sell my 2080Ti at a loss... I will buy the new 3080Ti or whatever they call it for X amount of dollars. New GPU Cost - Old GPU selloff = X dollars. X dollars every 12 or 18 months to play on flagship GPU is worth every penny. And there are many of us that do this. There are also many ppl that are lurking in the for sale forums to snag up those used GPU's at a discount.
> 
> Also I see so many ppl in here complaining about pricing. Its getting more expensive for R&D , Real estate , wages , logistics , cost of wafers , etc. Flagship will never be 6-7 hundred again... EVER! Even if Intel joins the market.. You think they somehow figured out how to build a GPU cheaper than Nvidia and AMD and even if they did.. You think they are going to pass that savings onto you? They know the excitement for their GPU's will be high. They will price at whatever NVIDIA/AMD is commanding at the same performance. The are NOT going to come into the industry and price their GPU's at the same performance for 50% less money. Some of you think somehow a pricing fairy is going to come rescue the market. Its NOT going to happen. Coming into here and complaining about the pricing falls on deaf ears. No one cares that you are not buying a GPU because its too expensive. There are 20 people behind you in line that will.


I care about his 240hz monitor, i need his discount card lol


----------



## BigMack70

Seyumi said:


> You all have to admit, the gap between console & PC is closing with each and every generation. I've always been a PC gamer, before the original Playstation & Xbox days:
> 
> My PC games looked like PS2 games when the PS1 was out
> My PC games looked like PS3 games when the PS2 was out
> My PC games looked like PS4 games when the PS3 came out
> My PC games looked like PS4 pro games when the PS4 came out
> 
> Next generation, my PC games will probably look like PS5 games...when the PS5 comes out.
> 
> All PC has over an Xbox One X or a PS4 pro is higher resolution, frame rates, and some settings on high/ultra instead of medium. PC games are only MARGINALLY better visually than their console counterparts these days but A LOT more money. I'm losing faith in the PC-only gaming industry. I guess we have Nvidia and Intel to thank for that with their monopolistic prices. Don't even bother mentioning "but but but PC has mods" because those have now all been replaced by microtransactions these days. The PC modding community is practically dead at this point.
> 
> I can't wait to get a PS5 for $500? and get 95% of the visuals that my $5,000 computer has. My wallet is getting tired after all these years.


I don't agree with this at all. Gaming on my 2080Ti feels vastly superior, visually, to my xbox one x. And the titles with good Ray tracing implementation (metro, control) look next-gen in comparison. 

When the 360 came out, it looked better than my pc for a little while. That was the last time a console was close to PC in visual fidelity, IMO. I don't think it really got outclassed by PC until the 8800 series came out a year later. I'd say the PC has been steadily improving vs consoles ever since. 

I think the difference here is graphics have improved to where "low" and "medium" settings at a lower resolution no longer look like ugly garbage like they used to 10-20 years ago.


----------



## skupples

Seyumi said:


> You all have to admit, the gap between console & PC is closing with each and every generation. I've always been a PC gamer, before the original Playstation & Xbox days:
> 
> My PC games looked like PS2 games when the PS1 was out
> My PC games looked like PS3 games when the PS2 was out
> My PC games looked like PS4 games when the PS3 came out
> My PC games looked like PS4 pro games when the PS4 came out
> 
> Next generation, my PC games will probably look like PS5 games...when the PS5 comes out.
> 
> All PC has over an Xbox One X or a PS4 pro is higher resolution, frame rates, and some settings on high/ultra instead of medium. PC games are only MARGINALLY better visually than their console counterparts these days but A LOT more money. I'm losing faith in the PC-only gaming industry. I guess we have Nvidia and Intel to thank for that with their monopolistic prices. Don't even bother mentioning "but but but PC has mods" because those have now all been replaced by microtransactions these days. The PC modding community is practically dead at this point.
> 
> I can't wait to get a PS5 for $500? and get 95% of the visuals that my $5,000 computer has. My wallet is getting tired after all these years.


yep, that's totally the only difference. ATM it is a bit depressing, especially now that multi-GPU is essentially dead. Hopefully it'll return, letting PC gamers get back to what made us PC gamers in the first place: excessive power over any other option, giving us the ability to run obscene setups like 3x 4K @ high frames, etc. 

you're right that the gap is closing though, and all PC gamers can thank the new consoles for forcing AMD and NV to adopt proper 4K 120Hz ports.


----------



## ZealotKi11er

I am interested in the VRAM configuration. With the 980 Ti it was easy. For the 1080 Ti we thought 12GB was too much, but they pulled an 11GB G5X; with the 2080 Ti, same 11GB but G6. We know for sure G6 will remain and we can assume 16-18 Gbps, but is 11GB still the right amount? I am sure people will be fine with 12GB 384-bit 18 Gbps, but AMD might have 16GB with Big Navi.


----------



## BigMack70

ZealotKi11er said:


> I am interested in the vRAM configuration. With 980 Ti it was easy. For 1080 Ti we thought 12GB was too much but they pulled an 11GB G5X, With 2080 Ti same 11GB but G6. We know for sure G6 will remain and we can assume 16-18Gbps but 11GB still the right amount? I am sure people will be fine with 12GB 384-Bit 18Gbps but AMD might have 16GB with Big Navi.


I'm expecting a vram increase to at least 16GB given that new consoles are coming out, which generally means a leap forward in memory usage.


----------



## guttheslayer

ZealotKi11er said:


> I am interested in the vRAM configuration. With 980 Ti it was easy. For 1080 Ti we thought 12GB was too much but they pulled an 11GB G5X, With 2080 Ti same 11GB but G6. We know for sure G6 will remain and we can assume 16-18Gbps but 11GB still the right amount? I am sure people will be fine with 12GB 384-Bit 18Gbps but AMD might have 16GB with Big Navi.


The VRAM is easy to guess.


NV is always consistent with VRAM bus width for their different chip sizes: the big chip is capped at 384 bits, while the mid-range runs 128-256 bits. Capacity follows from module count, since a 32-bit G6 module comes in 1GB, 1.5GB (rumored) and 2GB capacities.


If the rumor is true, NV might use the 1.5GB modules for their x04 cards, capped at 8 of them per card, totalling *12GB* @ 256 bits. If the VRAM is the fastest available, it will be 16 Gbps (512 GB/s total for the card).

NV could use the 2GB modules too, giving up to 16GB for the RTX 3080, but IMO that is not happening given the top-end RTX 2080 Ti has 11GB. A 5GB jump for a far lower asking price is just not happening (if they are maintaining the $800 price point for the RTX 3080 non-Ti).
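The module math above can be checked directly. This sketch assumes the rumored 1.5GB, 32-bit GDDR6 modules at 16 Gbps per pin (module size and speed are rumors, not confirmed parts):

```python
# Rumored GDDR6 module configuration (assumptions, not confirmed specs).
module_bits = 32               # each GDDR6 module is 32 bits wide
modules = 8                    # 8 modules on the card
capacity_per_module_gb = 1.5   # rumored 1.5 GB modules
speed_gbps = 16                # per-pin data rate in Gb/s

bus_width = modules * module_bits                 # total bus width in bits
capacity_gb = modules * capacity_per_module_gb    # total VRAM in GB
bandwidth_gbs = bus_width * speed_gbps / 8        # GB/s (8 bits per byte)

print(bus_width, capacity_gb, bandwidth_gbs)  # 256 12.0 512.0
```

Swapping in 2GB modules in the same script gives the 16GB @ 256-bit configuration the post dismisses; 12 modules of 1GB reproduces a 12GB 384-bit card.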


----------



## Newbie2009

So AMD has 8 months to release Big Navi and take the performance crown before Nvidia takes it back. 2080 Ti performance at €599? I'd probably break my rule of no more than 500 bucks on a GPU.


----------



## b.walker36

Newbie2009 said:


> So AMD have 8 months to release big NAVI to take performance crown before Nvidia takes it back. 2080ti perf : €599, I'd probably break my rule of no more than 500bucks on a gpu.


They have 8 months to try at least. There is no given big Navi would beat a 2080ti. I wonder if nvidia could counter with a 2080ti super if big navi did beat it and it was a thing.


----------



## keikei

b.walker36 said:


> They have 8 months to try at least. There is no given big Navi would beat a 2080ti. I wonder if nvidia could counter with a *2080ti super* if big navi did beat it and it was a thing.


That card would make sense before Navi 2. Possibly around Navi 2 if Ampere was delayed.


----------



## b.walker36

keikei said:


> That card would make sense before Navi 2. Possibly around Navi 2 if Armpere was delayed.


Even if there was no Super they could always lower the price and probably still make a large margin. I don't really see them doing $1200 again for the Ti. All I know is I'm buying something when Ampere launches. I don't care which camp it is. The best performance right around 700 bucks gets my money; this 980 Ti is losing at 1440p nowadays.


----------



## keikei

b.walker36 said:


> Even if there was no super they could always lower the price and probably still make a large margin. I don't really see them doing 1200 again for the Ti. All I know is I'm buying something when ampere launches. I don't care which camp it is. The best performance right around 700 bucks gets my money, *this 980ti is loosing at 1440p now a days*.


You don't like the current offerings? Several cards can handle that situation.


----------



## Section31

It's a good sign; those with an RTX 2080 Ti should sell their cards if they want to maximize their value.


----------



## b.walker36

keikei said:


> You don't like the current offerings? Several cards can handle that situation.


If new cards weren't rumored for the first half of next year, I would probably get a 5700 XT. I just figure I can hold out to at least see what Big Navi and Ampere are all about. If I bought right now, it would most likely be a Red Devil or Nitro.


----------



## keikei

b.walker36 said:


> If new cards were not rumored in the first half of next year I would probably get a 5700xt. I just figure I can hold out to at least see what big navi and ampere are all about.If I bought right now it would most likely be a red devil or nitro.


You could get an inbetweenie while you wait....*$289.99*


----------



## skupples

isn't 2080ti already a fully enabled core?


----------



## Section31

I'm looking forward to Ampere. It's started an interesting plan for me. 

Since I have to redesign the room for married life and move all the gaming/PC gear to another room, my test of upgrading the network (without tearing out the existing Cat5e in the house) worked. I got a Trendnet managed 8-port 10GbE switch and was able to get 2.5GbE through Cat5e (only my 3900X/Hero VIII had the higher-speed Ethernet). In the future, I will add another 10GbE switch somewhere in the house so it can effectively provide 10GbE to key locations throughout the house. Also, with LG TVs now supporting G-Sync, I might as well combine my 4K TV and Alienware 3418dw monitor into one device. It's probably a better deal to buy a 4K 120Hz TV with HDMI 2.1 rather than a 4K monitor. Maybe add one of those Devialet wireless speaker systems and that would be one sweet setup for streaming/4K gaming (Cyberpunk 2077). 

It's part of an ongoing project to upgrade early-2000s smart homes (most of the home automation stuff works but is non-upgradable, and it's difficult to fix or find replacement parts). I told the house builders back in the early 2000s to use Cat6 cables, but they said just use Cat5e for cost savings. The extra cost is nothing compared to having someone tear down a wall to replace internet cables. Today it's largely based on WiFi/tablets, whereas the early systems were based on IR, needed a central controller unit and keypads (Ethernet ones weren't introduced until the late 2000s), and it was difficult to get them to cooperate with different manufacturers' lighting, temperature and shade systems. That, and some features have become useless, like built-in speakers, central video and the iPad interface. I am still looking for a replacement for the custom-designed front electronic lock (keypad with camera). If I am successful with the house, this will give me solid ideas on how to upgrade the semi-smart system used in my parents' apartment in Asia (similar, and also breaking down). Both lasted a very long time though (almost 13 years now).


----------



## ToTheSun!

skupples said:


> isn't 2080ti already a fully enabled core?


Titan is.


----------



## skupples

duuuuuhhhh

a 2080 Ti is then just a few SMs short of a Titan, so technically a super 2080 Ti is plausible... seems like something they'd only release from a defensive posture, that's for sure.


----------



## tpi2007

RTX 2080 Ti: 4352 cores, 272 TMUs, 88 ROPs, 544 Tensor cores, 68 RT cores, 352-bit memory bus, 11 GB GDDR6 14 Gbps, 250W TDP

Titan RTX: 4608 cores, 288 TMUs, 96 ROPs, 576 Tensor cores, 72 RT cores, 384-bit memory bus, 24 GB GDDR6 14 Gbps, 280W TDP


There is room for a 2080 Ti Super, and with Nvidia on top as they are, they can get away with keeping a core-count separation between the two. They just have to milk things some more and release a halfway-there 2080 Ti Super with 70 out of 72 SMs enabled (4480 cores) and the full memory bus with 12 GB of 14 Gbps VRAM, thus leaving the Titan RTX's status untouched.
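Those core counts all follow from Turing's 64 FP32 cores per SM, which is easy to verify (the 70-SM "Super" configuration is speculation from the post, not an announced product):

```python
CORES_PER_SM = 64  # Turing SM: 64 FP32 CUDA cores each

full_tu102_sms = 72        # Titan RTX (full TU102 die)
rtx_2080ti_sms = 68        # RTX 2080 Ti
speculated_super_sms = 70  # hypothetical 2080 Ti Super from the post

print(full_tu102_sms * CORES_PER_SM)        # 4608
print(rtx_2080ti_sms * CORES_PER_SM)        # 4352
print(speculated_super_sms * CORES_PER_SM)  # 4480
```

So a 70-SM part would sit exactly halfway between the 2080 Ti and the Titan RTX in shader count.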


----------



## guttheslayer

b.walker36 said:


> They have 8 months to try at least. There is no given big Navi would beat a 2080ti. I wonder if nvidia could counter with a 2080ti super if big navi did beat it and it was a thing.


There won't be a 2080 Ti Super even if AMD releases Big Navi, especially if it's close to Ampere's launch date. The main reason is that the window is too short and the extra SMs aren't going to help much, unless they do a double boost: price cut + more performance. But given that Ampere is closing in, very unlikely.


We didn't see a 1080 Ti Super before the RTX series, so I guess the 2080 Ti will go down the same path.


----------



## tpi2007

guttheslayer said:


> There wont be a 2080 Ti Super even if AMD released big NAVI, especially if it was close to Ampere launch date. Main reason is that the window is too close and the extra SMs is not going to help much, unless they do a double boost which is price cut + more performance. But given that Ampere is closing in, very unlikely.
> 
> 
> We didnt see a 1080 Ti Super before the RTX series, so I guess 2080 ti will go down the same path.



Well, the 2080 Super exists and it's only ~6% faster than the 2080 at 1440p, so if lackluster updates to the lineup are anything to go by, it's possible.

Also, Nvidia launched the GTX 980 just 7 months after launching the Titan Black, and Igor Wallossek says that they'll go with a small card approach again, which makes sense since it's a new node, so they'll be back to launching the x04 chip based x80 and x70 cards first, so... if Ampere launches in June, they can still release the 2080 Ti Super next month. You know, a just in time small holiday season 'we're not that greedy (but still pretty greedy)' _present_ from Nvidia.


----------



## Section31

The worst GPU series i bought was actually the GTX1080 series, within 9 months I dumped it for the GTX1080TI.


----------



## tpi2007

Section31 said:


> The worst GPU series i bought was actually the GTX1080 series, within 9 months I dumped it for the GTX1080TI.



If you want big-card performance, you have to wait; it's been that way since Kepler in 2012. The RTX series was the exception because they were still on basically the same process and couldn't ramp clocks beyond Pascal, nor was the arch much faster in traditional raster performance without adding more cores, which is what they did, resulting in the 2080 Ti being available from the beginning, with a huge die size.


----------



## Section31

tpi2007 said:


> If you want the big card performance, you have to wait, it's been happening since Kepler in 2012. The RTX series was the exception because they were still on basically the same process and couldn't ramp up clocks more than Pascal, nor was the arch much faster in traditional raster performance without resorting to adding more cores, which is what they did, resulting in the 2080 Ti being available from the beginning and with a huge die size.


I have learned my lesson; that's why I now only buy the Ti/Titan and live with it until the next big series comes. It might cost more up front, but overall you save by not buying again 6-12 months down the road. If I can get two years between GPU changes, I'm satisfied. I'm pretty sure I'm getting the 3080 Ti; it's now just a matter of whether I go with Heatkiller or Optimus for the waterblock. The Heatkiller looks better (black nickel version), but the Optimus likely cools better. The Heatkiller itself cools very well and is at a much better price point than Aqua Computer.


----------



## guttheslayer

tpi2007 said:


> Well, the 2080 Super exists and it's only ~6% faster than the 2080 at 1440p, so if lackluster updates to the lineup are anything to go by, it's possible.
> 
> Also, Nvidia launched the GTX 980 just 7 months after launching the Titan Black, and Igor Wallossek says that they'll go with a small-die approach again, which makes sense since it's a new node. That means they'll be back to launching the x04-based x80 and x70 cards first, so if Ampere launches in June, they can still release the 2080 Ti Super next month. You know, a just-in-time holiday season 'we're not that greedy (but still pretty greedy)' _present_ from Nvidia.



It is still too close to the Ampere release; if anything is to go by, Big Navi and Ampere will only be 3-6 months apart. You can imagine that if NV releases an RTX 3080 at $700 that is faster than a 6-month-old $1,000 2080 Ti Super, a lot of people are going to be pissed. (Titans are not aimed at gamers, hence NV doesn't really care, but a 2080 Ti Super, a gaming card that is only relevant for 6 months before becoming obsolete? I foresee a real public outcry. Also, don't forget the Titan Black had more VRAM than the GTX 980, which left buyers with less remorse.)

If not, NV will push back the Ampere release date just to appease the 2080 Ti Super crowd. It might also give NV a good reason to neuter Ampere's performance to appease them further. I'd rather that didn't happen. I think at this point NV doesn't want another public backlash, given its reputation now.

Also, the Super will not come next month because Big Navi is not coming before 2020. With no Big Navi, there is no reason for NV to release a Super edition (otherwise, they would have released it together with the 2080 Super). To recap history, NV has released only one fully unlocked big die for gaming since 2011, and that was only because NV had no answer to AMD's R9 290X for almost 12 months. If Ampere is just 3-6 months away, I strongly believe NV will wait, unless they want to crash their own market.


----------



## keikei

guttheslayer said:


> It is still too close to the Ampere release; if anything is to go by, Big Navi and Ampere will only be 3-6 months apart. You can imagine that if NV releases an RTX 3080 at $700 that is faster than a 6-month-old $1,000 2080 Ti Super, a lot of people are going to be pissed. (Titans are not aimed at gamers, hence NV doesn't really care, but a 2080 Ti Super, a gaming card that is only relevant for 6 months before becoming obsolete? I foresee a real public outcry. Also, don't forget the Titan Black had more VRAM than the GTX 980, which left buyers with less remorse.)
> 
> If not, NV will push back the Ampere release date just to appease the 2080 Ti Super crowd. It might also give NV a good reason to neuter Ampere's performance to appease them further. I'd rather that didn't happen. I think at this point NV doesn't want another public backlash, given its reputation now.
> 
> *Also, the Super will not come next month because Big Navi is not coming before 2020. With no Big Navi, there is no reason for NV to release a Super edition (otherwise, they would have released it together with the 2080 Super).* To recap history, NV has released only one fully unlocked big die for gaming since 2011, and that was only because NV had no answer to AMD's R9 290X for almost 12 months. If Ampere is just 3-6 months away, I strongly believe NV will wait, unless they want to crash their own market.


2020 holiday season sale. Just like the 2080S, it'll be another redo, I mean Super. Small performance bump + cheaper manufacturing process = moar card sales. 'New' card, new sales, right? That's my prediction. Why does a 2080 Ti S potentially exist? The entire RTX lineup has an S version; why would the top card (excluding the Titan) be left out? Obviously, it's Nvidia's ace.


----------



## tpi2007

guttheslayer said:


> It is still too close to the Ampere release; if anything is to go by, Big Navi and Ampere will only be 3-6 months apart. You can imagine that if NV releases an RTX 3080 at $700 that is faster than a 6-month-old $1,000 2080 Ti Super, a lot of people are going to be pissed. (Titans are not aimed at gamers, hence NV doesn't really care, but a 2080 Ti Super, a gaming card that is only relevant for 6 months before becoming obsolete? I foresee a real public outcry. Also, don't forget the Titan Black had more VRAM than the GTX 980, which left buyers with less remorse.)
> 
> If not, NV will push back the Ampere release date just to appease the 2080 Ti Super crowd. It might also give NV a good reason to neuter Ampere's performance to appease them further. I'd rather that didn't happen. I think at this point NV doesn't want another public backlash, given its reputation now.
> 
> Also, the Super will not come next month because Big Navi is not coming before 2020. With no Big Navi, there is no reason for NV to release a Super edition (otherwise, they would have released it together with the 2080 Super). To recap history, NV has released only one fully unlocked big die for gaming since 2011, and that was only because NV had no answer to AMD's R9 290X for almost 12 months. If Ampere is just 3-6 months away, I strongly believe NV will wait, unless they want to crash their own market.



Yeah, but the Titan Black was a $1k card and the 2080 Ti is a $1.2k card, so we're well past that point. Also, will a 3080 carry 12 GB of VRAM, or will they keep 8 GB, or move to 16 GB? That's the question. Will they pair an x04 chip with a 384-bit memory bus for the first time ever? And maybe the 3080 Ti comes with HBM2? Maybe. Maybe not. As for release timelines, people have short memories: if they release the 2080 Ti Super in November (and again, it doesn't need to be the full die; it could be 70 out of 72 SMs enabled) and the 3080 in June of next year, people won't notice much. It will be a different year, it will be summer, a completely different mindset; it will seem like a long time ago.

Not saying that they will release a 2080 Ti Super next month, but they could if they wanted to. I'd rather they release Ampere with great cost / performance ratio like everyone else sooner rather than later, but the way things are with companies milking and milking some more, it will all depend on how soon AMD and Intel release their cards and how competitive they are.
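As an aside, the plausible VRAM sizes follow directly from the bus width: GDDR6 chips expose a 32-bit interface, so a given bus hosts bus/32 chips, each 1 GB or 2 GB at current densities. A rough sketch (the chip densities are an assumption, and clamshell mode, which doubles the chip count, is ignored):

```python
def vram_options_gb(bus_width_bits, chip_densities_gb=(1, 2)):
    """Possible VRAM sizes for a GDDR6 card: each memory chip has a
    32-bit interface, so a bus of width w hosts w // 32 chips."""
    chips = bus_width_bits // 32
    return [chips * d for d in chip_densities_gb]

print(vram_options_gb(256))  # [8, 16]  -> a 256-bit card gets 8 or 16 GB
print(vram_options_gb(384))  # [12, 24] -> 12 GB implies a 384-bit bus
```

Which is why 12 GB on an x04 chip would indeed require that unusual 384-bit pairing.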


----------



## skupples

That would be lovely. Idk how else they expect to keep pushing into higher frames at high resolutions; it's not just the GPU core, you need that bandwidth.

Weren't they able to stick with the same bus width thanks to going from GDDR5 to GDDR6?
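For context, peak memory bandwidth is just bus width times per-pin data rate, which is how the GDDR5(X)-to-GDDR6 jump raised bandwidth on the same bus width. A quick sketch (the per-pin rates below are the commonly cited launch figures, used purely for illustration):

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s:
    (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# GTX 1080 (GDDR5X, 10 Gbps) vs RTX 2080 (GDDR6, 14 Gbps), same 256-bit bus
print(bandwidth_gb_s(256, 10))  # 320.0
print(bandwidth_gb_s(256, 14))  # 448.0
```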


----------



## guttheslayer

tpi2007 said:


> Yeah, but the Titan Black was a $1k card and the 2080 Ti is a $1.2k card, so we're well past that point. Also, will a 3080 carry 12 GB of VRAM, or will they keep 8 GB, or move to 16 GB? That's the question. Will they pair an x04 chip with a 384-bit memory bus for the first time ever? And maybe the 3080 Ti comes with HBM2? Maybe. Maybe not. As for release timelines, people have short memories: if they release the 2080 Ti Super in November (and again, it doesn't need to be the full die; it could be 70 out of 72 SMs enabled) and the 3080 in June of next year, people won't notice much. It will be a different year, it will be summer, a completely different mindset; it will seem like a long time ago.
> 
> Not saying that they will release a 2080 Ti Super next month, but they could if they wanted to. I'd rather they release Ampere with great cost / performance ratio like everyone else sooner rather than later, but the way things are with companies milking and milking some more, it will all depend on how soon AMD and Intel release their cards and how competitive they are.


I'd prefer they release Ampere earlier if they can.


At this point, there is nothing special about a Titan RTX rebrand, especially at $999.


----------



## tpi2007

guttheslayer said:


> I prefer they release Ampere earlier if they can.
> 
> 
> At this point, there is nothing special about a Titan RTX rebrand, especially at $999



Of course there isn't, except to Jensen, who will tell you that "it just works"® as a means of buying one more leather jacket for his Ferrari (in rich land cars also wear jackets, I'm guessing).


----------



## randomizer

Hopefully I will finally be able to move off my 970.


----------



## skupples

tpi2007 said:


> Of course there isn't, except to Jensen, who will tell you that "it just works"® as a means of buying one more leather jacket for his Ferrari (in rich land cars also wear jackets, I'm guessing).


um, excuse me zir. We call them wraps.


----------



## The Pook

all I want is a 1080 Ti priced (~$700) GPU that is ~20% faster than a 1080 Ti with >11GB vRAM. 

plz NVIDIA.


----------



## keikei

The Pook said:


> all I want is a 1080 Ti priced (~$700) GPU that is ~20% faster than a 1080 Ti with >11GB vRAM.
> 
> plz NVIDIA.



I don't see how both teams can't do that. I'll lean towards Red being the 'cheaper' option. I just hope they kill the blower design... maybe. The Radeon VII at least launched with triple fans, so there's hope.


----------



## b.walker36

The Pook said:


> all I want is a 1080 Ti priced (~$700) GPU that is ~20% faster than a 1080 Ti with >11GB vRAM.
> 
> plz NVIDIA.


I would eat that up. It would like double my performance lol


----------



## ttnuagmada

Seyumi said:


> You all have to admit, the gap between console & PC is closing with each and every generation. I've always been a PC gamer, before the original Playstation & Xbox days:
> 
> My PC games looked like PS2 games when the PS1 was out
> My PC games looked like PS3 games when the PS2 was out
> My PC games looked like PS4 games when the PS3 came out
> My PC games looked like PS4 pro games when the PS4 came out
> 
> Next generation, my PC games will probably look like PS5 games...when the PS5 comes out.
> 
> All PC has over an Xbox One X or a PS4 pro is higher resolution, frame rates, and some settings on high/ultra instead of medium. PC games are only MARGINALLY better visually than their console counterparts these days but A LOT more money. I'm losing faith in the PC-only gaming industry. I guess we have Nvidia and Intel to thank for that with their monopolistic prices. Don't even bother mentioning "but but but PC has mods" because those have now all been replaced by microtransactions these days. The PC modding community is practically dead at this point.
> 
> I can't wait to get a PS5 for $500? and get 95% of the visuals that my $5,000 computer has. My wallet is getting tired after all these years.


Lol what? The 360 launched with a near top-tier GPU. By the time the PS5/Xbox 2 come out, they'll be using low-to-midrange GPUs. The tier of GPU that Navi belongs to is the same tier the original PS4/Xbox One used when they launched.


----------



## iamjanco

tpi2007 said:


> Of course there isn't, except to Jensen, who will tell you that "it just works"® as a means of buying one more leather jacket for his Ferrari (in rich land cars also wear jackets, I'm guessing).


----------



## ToTheSun!

iamjanco said:


> View attachment 301476


----------



## guttheslayer

The Pook said:


> all I want is a 1080 Ti priced (~$700) GPU that is ~20% faster than a 1080 Ti with >11GB vRAM.
> 
> plz NVIDIA.


RTX 3070 is your answer, though the VRAM might not exceed 11GB.


----------



## ZealotKi11er

guttheslayer said:


> RTX 3070 is your answer, though the VRAM might not exceed 11GB.


The VRAM might be either 8 GB or 12 GB.


----------



## iamjanco

ToTheSun! said:


>


LOL!


----------



## PontiacGTX

WannaBeOCer said:


> I'm sure the future will be toward mGPU, but drivers will need to do the lifting since developers don't want to support it. Programs will need to see the multiple GPUs as one, which is where MCM designs come into play. Just like AMD's new Radeon Pro Vega II Duo.
> 
> nVidia is also working on it:
> 
> https://research.nvidia.com/publication/2017-06_MCM-GPU:-Multi-Chip-Module-GPUs
> https://research.nvidia.com/sites/default/files/pubs/2019-06_A-0.11-pJ/Op,//C24_1.pdf


Developers still have to implement mGPU on DirectX 12/Vulkan. That won't change with an MCM GPU; it will be seen as a single GPU, just a slightly different way of utilizing the cores, but you won't have two cards in SLI if developers don't want to support it.


----------



## Zam15

Oh look, the new 1660 Super is about to be released! 

Not like I had that performance 5 years ago! Tired of this fight at the low end. I had that performance in 2014 with my 980, well, double that actually when SLI scaled with my two 980 SCs...

Seriously I want an upgrade path soon, I'd be happy with a card that can do 4k Ultra 60FPS on all modern PC titles and PS5, XBONE2X ports. 

If I can get 3-5 more years out of my rig until the next big upgrade I'll be a happy camper. 

Thinking Zen 5, 64c, DDR5, PCIE5.. but until then I need a better card!


----------



## skupples

guttheslayer said:


> RTX 3070 is your answer, though the VRAM might not exceed 11GB.


small chip is never the answer.


----------



## EniGma1987

The Pook said:


> all I want is a 1080 Ti priced (~$700) GPU that is ~20% faster than a 1080 Ti with >11GB vRAM.
> 
> plz NVIDIA.



I would really hope that performance jump would be closer to 50%. Isn't Nvidia going from 12nm to 7nm with EUV? That means moving from quad patterning back to single patterning, saving tons of money and increasing yields. Sure, EUV itself costs more, but that is more than offset by moving back to single-patterning lithography. Add substantial density improvements on top of that, and a $700 GPU at 50% higher performance than the 1080 Ti, while still good profit for Nvidia, seems perfectly reasonable.
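The yield argument can be made concrete with a textbook Poisson yield model: the fraction of good dies falls exponentially with die area times defect density, so both a smaller die and a cleaner (fewer-patterning-step) process help multiplicatively. The defect densities below are made-up illustrative numbers, not fab data:

```python
import math

def die_yield(area_mm2, defects_per_cm2):
    """Simple Poisson yield model: fraction of defect-free dies for a
    given die area and defect density. Real fabs use fancier models
    (e.g. Murphy's), but the exponential captures the intuition."""
    area_cm2 = area_mm2 / 100.0
    return math.exp(-defects_per_cm2 * area_cm2)

# Illustrative: a 545 mm^2 die vs a ~350 mm^2 die at 0.2 defects/cm^2
print(f"{die_yield(545, 0.2):.2f}")  # ~0.34
print(f"{die_yield(350, 0.2):.2f}")  # ~0.50
```

The same defect density gives the smaller die roughly half again more good chips per wafer, which is one reason new nodes usually debut on mid-size dies.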


----------



## skupples

Exactly, Enigma. He's setting his expectations reasonably low.

The 5800/5800XT should easily be 20%+ faster than the 1080 Ti, and should slide in at like $600 and $700 apiece.


----------



## Sheyster

EniGma1987 said:


> Add substantial density improvements on top of that, and a $700 GPU at 50% higher performance than the 1080 Ti, while still good profit for Nvidia, seems *perfectly reasonable*.


Jensen's wallet strongly disagrees.


----------



## doom26464

AMD might not even be ready with Big Navi for a while yet, and we still don't have many leaks on Intel.


Nvidia could launch Ampere with good performance, virtually uncontested. They could enjoy a very free window again to price to the moon.


----------



## Asmodian

I do think Nvidia will be pretty uncontested at the top end for at least most of 2020. 

I really hope a GPU from anyone else makes the 1080 Ti look out of date by the end of 2020.


----------



## guttheslayer

skupples said:


> small chip is never the answer.


He wanted 20% faster than a 1080 Ti at that price, so it's a reasonably low expectation that an x70 card can fulfil.

The RTX 3080 should be 25% faster than the RTX 2080 Ti from jumping one full node down.


----------



## skupples

doom26464 said:


> AMD might not even be ready with Big Navi for a while yet, and we still don't have many leaks on Intel.
> 
> 
> Nvidia could launch Ampere with good performance, virtually uncontested. They could enjoy a very free window again to price to the moon.


I expect them to try to squeeze in one more before AMD comes to play, and another 2-3 before Intel is worth a damn.



guttheslayer said:


> He wanted 20% faster than a 1080 Ti at that price, so it's a reasonably low expectation that an x70 card can fulfil.
> 
> The RTX 3080 should be 25% faster than the RTX 2080 Ti from jumping one full node down.


Assuming Ampere releases before any further Navi cards, NV can beat their previous cards by 5% and still run away with it. A 25% jump would be awesome, but they'll just release even smaller dies while they can.


----------



## ilmazzo

I personally don't care about top GPUs I cannot afford; I stopped being a fan of one brand over the other some years ago....

So the next top GPUs could push 144Hz 4K HDR with RT and ultra details on, but for me they might as well not exist, since I could never have one in my case. They just filled the market with lots of models to let "normal people" have something cheaper, instead of bringing down the top-tier pricing... that's it. The 1% won.


----------



## guttheslayer

skupples said:


> I expect them to try to squeeze in one more before AMD comes to play, and another 2-3 before Intel is worth a damn.
> 
> 
> 
> Assuming Ampere releases before any further Navi cards, NV can beat their previous cards by 5% and still run away with it. A 25% jump would be awesome, but they'll just release even smaller dies while they can.


I am assuming the Navi card comes first, since Ampere was rumored for July 2020 at the earliest, which means AMD might have the advantage for a few weeks before NV releases Ampere to counter it.


They don't need a big die to push performance 25% above the RTX 2080 Ti; based on my estimate, the die size only needs to be ~350mm^2 to achieve that on EUV 7nm+.
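For what it's worth, a back-of-the-envelope check of that die-size claim (TU102's ~754 mm^2 is public; the 2x density gain is purely an assumption):

```python
# Hypothetical scaling check, not confirmed specs.
tu102_area_mm2 = 754     # RTX 2080 Ti die size, 12 nm
density_gain = 2.0       # assumed 12 nm -> 7 nm EUV logic density scaling
perf_target = 1.25       # 25% more shader hardware than TU102

est_area = tu102_area_mm2 * perf_target / density_gain
print(f"{est_area:.0f} mm^2")  # 471 mm^2 under these assumptions
```

Hitting ~350 mm^2 instead would imply either a more aggressive density gain (roughly 2.7x) or per-SM performance improvements on top of the shrink.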



ilmazzo said:


> I personally don't care about top GPUs I cannot afford; I stopped being a fan of one brand over the other some years ago....
> 
> So the next top GPUs could push 144Hz 4K HDR with RT and ultra details on, but for me they might as well not exist, since I could never have one in my case. They just filled the market with lots of models to let "normal people" have something cheaper, instead of bringing down the top-tier pricing... that's it. The 1% won.


That is the sad truth for us normal peasants: there are people who can dump $20K into a setup every year, and they bash people who complain about high GPU prices. These people are part of the reason for the atrocious prices.


----------



## skupples

guttheslayer said:


> I am assuming the Navi card comes first, since Ampere was rumored for July 2020 at the earliest, which means AMD might have the advantage for a few weeks before NV releases Ampere to counter it.
> 
> 
> They don't need a big die to push performance 25% above the RTX 2080 Ti; based on my estimate, the die size only needs to be ~350mm^2 to achieve that on EUV 7nm+.
> 
> 
> 
> That is the sad truth for us normal peasants: there are people who can dump $20K into a setup every year, and they bash people who complain about high GPU prices. These people are part of the reason for the atrocious prices.


So what you mean to say is it'll be like always: AMD drops something, gets some acclaim, then NV smashes that success with price and perf.

^^^ That's what it's like when AMD and NV actually compete. I know we haven't seen this recently, but it wasn't all that long ago that this was the norm.

And please, let's stop blaming successful and wealthy people for all the world's problems. It's honestly just trashy. At least get to the root of the problem, which isn't the citizen or his paycheck.


----------



## guttheslayer

skupples said:


> and please, let's stop blaming successful and wealthy people for all the world's problems. It's honestly just trashy. At least get to the root of the problem, which isn't the citizen or his paycheck.


That is where you are wrong: no one blames successful and wealthy people if they keep quiet.

It only starts escalating when people complain about the bad prices (which is fair) and these wealthy people start telling them to stop whining. Without strong support for boycotting extreme price gouging, the practice will never stop, so rich people should just stay out of it, since they can afford all they want, instead of supporting greedy organisations like the blue or green camp.


----------



## tpi2007

iamjanco said:


> View attachment 301476



I see your hand and raise two and a half leather jackets (Lisa Su already owns the other half):








(Click to animate)


----------



## huzzug

guttheslayer said:


> That is where you are wrong: no one blames successful and wealthy people if they keep quiet.
> 
> It only starts escalating when people complain about the bad prices (which is fair) and these wealthy people start telling them to stop whining. Without strong support for boycotting extreme price gouging, the practice will never stop, so rich people should just stay out of it, since they can afford all they want, instead of supporting greedy organisations like the blue or green camp.


It would make sense if this was about rice paddies. It doesn't make sense when we're talking about luxury goods.

And no, you do not have a right to a gaming system.


----------



## Woundingchaney

guttheslayer said:


> That is where you are wrong: no one blames successful and wealthy people if they keep quiet.
> 
> It only starts escalating when people complain about the bad prices (which is fair) and these wealthy people start telling them to stop whining. Without strong support for boycotting extreme price gouging, the practice will never stop, so rich people should just stay out of it, since they can afford all they want, instead of supporting greedy organisations like the blue or green camp.


I have not seen anyone who owns a higher-end GPU ridicule someone for complaining about price. Most of what I have seen is people complaining about price and blaming others for purchasing the product at retail cost.

I and others should in no way whatsoever have to defend our luxury purchases to you simply because you don't have the income to afford the item. I'm not interested in a boycott, and I wish you the best in your feeble attempt.


----------



## skupples

guttheslayer said:


> That is where you are wrong: no one blames successful and wealthy people if they keep quiet.
> 
> It only starts escalating when people complain about the bad prices (which is fair) and these wealthy people start telling them to stop whining. Without strong support for boycotting extreme price gouging, the practice will never stop, so rich people should just stay out of it, since they can afford all they want, instead of supporting greedy organisations like the blue or green camp.


Oy vey, this is so upside down. I'm not wealthy by any stretch, and I tell people to stop QQ'ing about prices constantly. They should use all the energy they waste on complaining to make more money.

Where to even start?

First, can you please break down the math proving NV is "gouging" prices on GeForce products? That argument is sound for Quadro cards for sure, but that's not what we're discussing here.

I'll wait.

Also, by definition any for-profit organization is greedy; it's kind of in the charter. So I'll take AMD math or NV math on costs. It's simply hilarious that you see AMD as an angel. You know they'll be jacking prices as soon as they can compete, right? If you weren't aware of this, do some product price history research while doing the math on AMD's and NV's greed.

Sorry, but the government is never going to buy you GPUs, even once they've taxed carbon and jacked income taxes to 70%.


----------



## ilmazzo

skupples said:


> oi vey, this is so up side down.
> 
> where to even start -
> 
> first, can you please break down the math proving NV is "gouging" prices on Geforce products. That argument is sound for Quadro cards for sure, but that's not what we're discussing here.
> 
> I'll wait.
> 
> also, by definition any for profit organization is greedy, its kind of in the charter. So i'll take AMD math or NV math on costs.


Well, look at the last 5 years of NV's financial reports.

If you have a profit, you are asking more than your costs. Simple as that.


----------



## skupples

ilmazzo said:


> well, look at the last 5 years financial reports of NV
> 
> if you have a profit you are asking more than what are your costs, simple as that


This is a joke, right?

How do you expect a company to pay their wage slaves without profit margins?

Maybe skip out on a few hardware cycles and put that money into their stock instead.

Maybe I'm reading this wrong, but you're implying that making a profit is greed. Obscene profit could be seen as greed, like what is made on Quadros. Quadros are also used to make money, so that's kind of why they can charge obscene markups; said Quadro can crank out millions in profit.

You know, the same thing ATI does with their Apple-tier products? Those operate at obscene profit margin ratios, so they must be evil and greedy too, by your own definition.

Here's one place where I'm sure we can agree:

You shouldn't be buying new leather jackets for each hour of the day if you can afford to provide full benefits for all staff but don't.


----------



## ilmazzo

Wake me up when discussion will get to the "class struggle" point 

Interesting discussion but big OT, so.... maybe we will continue in steam chat one day

cheers


----------



## Blze001

My approach to all of this is to run a generation behind and grab up the xx80 Ti cards when people offload them for $500 to get preorders in for the new gen. #poorpeoplehacks


----------



## iamjanco

tpi2007 said:


> I see your hand and raise two and a half leather jackets (Lisa Su already owns the other half):
> 
> View attachment 301966
> 
> (Click to animate)


touché. One must maintain a sense of humor today, if not a sense of perspective. Otherwise we'd all be storming the gates when the front door is only three feet wide (that's a little under one meter for those in the mainland EU).


----------



## skupples

Blze001 said:


> My approach to all of this is to run a generation behind and grab up the xx80 Ti cards when people offload them for $500 to get preorders in for the new gen. #poorpeoplehacks


Yep, this is what I've been doing since GK110, though that'll change once this next generation is in full swing, which I suspect to be mid/late 2021.

Also, if mGPU comes back in a meaningful way? I'll divert funds for 2x flagships. (I don't make a lot of money, I just spend as little of it as possible, as I don't need fancy clothes, sneakers and cars to make me feel good about myself, or to fool other people into judging me a certain way. That's all a waste of energy, time, and money. Can't lie though, those new twin-turbo Cadillacs? hnnnnggggg)


----------



## Sheyster

huzzug said:


> It would make sense if this was about rice paddy's


Or about bacon if this was in the U.S. 



Woundingchaney said:


> I have not seen anyone that owns a higher end gpu ridicule someone for complaining about price. Most of what I have seen is people complaining about price blaming others for purchasing the product at retail cost.


Some of the old Titan video card threads had many folks wandering in to say how stupid early Titan adopters were for dropping money on the card. This usually resulted in flaming, extra moderation of the thread, warnings, etc. I even received one warning and I don't even recall why. It was a flame-fest, what can I say?


----------



## ToTheSun!

skupples said:


> first, can you please break down the math proving NV is "gouging" prices on Geforce products.





ilmazzo said:


> well, look at the last 5 years financial reports of NV
> 
> if you have a profit you are asking more than what are your costs, simple as that





skupples said:


> this is a joke right?





ilmazzo said:


> Wake me up when discussion will get to the "class struggle" point


Son, you're going to sleep a long while.


----------



## skupples

Sheyster said:


> Or about bacon if this was in the U.S.
> 
> 
> 
> Some of the old Titan video card threads had many folks wandering in to say how stupid early Titan adopters were for dropping money on the card. This usually resulted in flaming, extra moderation of the thread, warnings, etc. I even received one warning and I don't even recall why. It was a flame-fest, what can I say?


Yep, the way we were treated in the original GK110 club thread was absolutely atrocious. So much wasted energy; folks could really reapply that energy elsewhere, rendering themselves more profitable, thus allowing the acquisition of $1,000 GPUs.


----------



## tpi2007

LOL if true: https://wccftech.com/nvidia-resuming-production-of-geforce-rtx-2070-graphics-card/


AMD's RX 5700 XT is doing better in the market than Nvidia thought, it seems. Then again, the micro-managed market segmentation that produced a situation where the full TU106 die isn't used is kind of nonsensical; they're throwing performance out the window. Here's hoping they bring in a round of price cuts, say $259 for the 1660 Ti, $329 for the 2060, $379 for the 2060 Super, $399 for the 2070 and finally $449 for the 2070 Super.


----------



## keikei

tpi2007 said:


> LOL if true: https://wccftech.com/nvidia-resuming-production-of-geforce-rtx-2070-graphics-card/
> 
> 
> AMD's RX 5700 XT is doing better in the market than Nvidia thought, it seems. Then again, the micro-managed market segmentation that produced a situation where the full TU106 die isn't used is kind of nonsensical; they're throwing performance out the window. Here's hoping they bring in a round of price cuts, say $259 for the 1660 Ti, $329 for the 2060, $379 for the 2060 Super, $399 for the 2070 and finally $449 for the 2070 Super.



Nvidia realizes gamers will pay a premium, but not that big a premium. Two very similarly performing cards, but one is nearly a hundred bucks more? I can guess which one most will take. Green is getting a nice reality check in the midrange for sure.


----------



## guttheslayer

keikei said:


> Nvidia realizes gamers will pay a premium, but not that big a premium. 2 very similar performing cards, but 1 is near a bill more? I can guess which one most will take. Green is getting a nice reality check in the midrange for sure.


I hope NV realises that targeting those few rich, wealthy gamers isn't going to work in the long term.


We all have a right to own a gaming PC, as long as we can afford it, but there is a limit on how sky-high they can charge, especially given that it becomes obsolete every 12 months. I laugh at people who compare it to a luxury good like an LV bag or a Lambo; the latter doesn't become obsolete in a span of 9-12 months.


PC hardware is not supposed to be a luxury unless it lasts the consumer at least 5 years as top-end performance.


----------



## speed_demon

To add to that, many of us also use our PCs for professional work, and the value for the money has to be there for us to buy this new hardware. With Nvidia jacking up prices, the bang-for-the-buck factor keeps dropping lower and lower.


----------



## BigMack70

Woundingchaney said:


> I have not seen anyone who owns a higher-end GPU ridicule someone for complaining about price. Most of what I have seen is people complaining about price and blaming others for purchasing the product at retail cost.
> 
> I and others should in no way whatsoever have to defend our luxury purchases to you simply because you don't have the income to afford the item. I'm not interested in a boycott, and I wish you the best in your feeble attempt.


There have definitely been folks, particularly in the Kepler era, who made ridiculous, absurd, and rabid defenses of Nvidia's pricing.

That said, I think they've usually been a minority.

I owned Titan X (Maxwell) cards in SLI and currently run a 2080 Ti, and I was among the more outspoken critics of Nvidia's absurd price doubling, which they started with Kepler and have continued to the present day.

It's scummy, but it's what happens when one company completely dominates the high end without competition. Without competition, you either get insane price increases (Nvidia 2012-present) or a full stop on performance improvements (Intel 2011-2018).


----------



## huzzug

guttheslayer said:


> We all have a right to own a gaming PC, *as long as we can afford*


And that's what many of us have been asking the people who complain to do.



> but there is a limit on how sky high they can charge, especially given that it get obsolete every 12 months. I laugh at ppl who compare it to luxury good like UV bag or lambo, where the latter doesnt get obsolete in a span of 9-12 months.


The latter also does not get sold in millions of units each month.



> PC hardware is not suppose to be luxury


You're right. But it definitely isn't a necessity. 



> unless it last consumer at least 5 years as a top-end performance.


That is not the definition of a luxury good. A UV bag is more expensive than a Dollar-store-bought one, with less storage space. Its value comes from the brand embossed on its front.


----------



## guttheslayer

huzzug said:


> That is not a definition of what luxury good is. A UV bag is expensive than a Dollar store bought on, with less storage space. It's value comes from the brand embossed on it's front.



But an LV bag's value lasts about as well as it could. You buy it once and you're just done with it. How about comparing it with a top-end, studio-level sound system? That would make a lot of sense. I could spend 50 grand on studio sound system hardware and yet have it last me through decades, probably still retaining most of its value should I decide to sell it.


In that sense, the money spent on the sound system is very much worth it. But PC hardware doesn't last that long; its value drops to half or less in a matter of 12-24 months.


----------



## huzzug

guttheslayer said:


> But a LV bag value last as well as it could.


It would not if you dragged it through the dirt every time you went off-roading.



> You buy it once and you can be just done with it.


No. You need to pay, in money or time or comfort, for its upkeep.



> How about comparing it with a top end studio level sound system. That would make alot of sense.


Sure. Go ahead.



> I could spend 50 grand on a studio sound system hardware, and yet it was able to last me through decades, probably still retain most of the value should I decide to sell it.


What studio equipment lasts 20 years (since you mentioned decades) and still fetches you ~100% on the market? 




> In that sense, the $$ spend on the sound system is very worth it. But a PC hardware doesn't last that long, value drop to 1/2 or less in a matter of 12-24 months.


Are you really complaining about the cards not holding value, or about the cards being too expensive to afford? They don't hold value because there are better models available the next year, making them essentially consumables.


----------



## guttheslayer

huzzug said:


> Are you really complaining about the cards not holding value or the cards being too expensive to afford? They don't hold value because there are better models available next year making them essentially consumables.



Both; being expensive to afford and holding value go hand in hand. No one will buy a house costing millions of USD and watch it plummet to half its value in just 12 months. But if you buy a cheap toy that costs $10 and sell it at $5 next year, no one will bother or feel the pain.


----------



## Asmodian

guttheslayer said:


> However buying a cheap toy that cost $10 and sell it at $5 next year, no one will bother or feel the pain.


A lot of people feel that way about a $1000 toy too. Enough of them for Nvidia to have a good reason to offer something at that price point.

This is like any market: the top-end SKU is priced on what people will pay and what the competition has. All SKUs really, but it is at the top where they really push it. The only difference in the GPU market is that the barriers to entry are absolutely crazy. Gotta love patents being used to enforce the current oligopoly, but that is getting off on a tangent.

GPUs have some of the luxury-bag effect going on too. At least Nvidia would really like them to, and AMD wouldn't mind at all either. I think that is what Nvidia has been trying to create with the Titan line: a pointlessly expensive status symbol, but for a very different social group. A CEO edition? Like those stupidly expensive golf clubs that really aren't much, if any, better.


----------



## Woundingchaney

guttheslayer said:


> But a LV bag value last as well as it could. You buy it once and you can be just done with it. How about comparing it with a top end studio level sound system. That would make alot of sense. I could spend 50 grand on a studio sound system hardware, and yet it was able to last me through decades, probably still retain most of the value should I decide to sell it.
> 
> 
> In that sense, the $$ spend on the sound system is very worth it. But a PC hardware doesn't last that long, value drop to 1/2 or less in a matter of 12-24 months.


Same scenario with purchasing a vehicle or a multitude of other items.

The depreciation value on a vehicle is massive. I bought my gpu 1.5 years ago and could probably get 50-60% resale value, my car I bought 4 years ago and "might" get 20%.


----------



## Woundingchaney

Asmodian said:


> A lot of people feel that way about a $1000 toy too. Enough for Nvidia to have a good reason to have something at that price point.
> 
> This is like any market, the top end sku is based on what people will pay and what the competition has. All skus really, but it is at the top where they really push it. The only difference in the GPU market is that the barriers to entry are absolutely crazy. Gotta love patents being used to enforce the current oligarchy, but that is getting off on a tangent.
> 
> GPU have also got some of the luxury bag effect going on too. At least Nvidia would really like them to and AMD wouldn't mind at all either. I think that is what Nvidia has been trying to create with the Titan line, a pointlessly expensive status symbol but for a very different social group. CEO edition? Like those stupidly expensive golf clubs that really aren't much if any better.


I'm having a very difficult time associating a GPU with a social group.

The closest thing would perhaps be online forums, and even then nearly all of the discussion is operational/performance-related. Far from any social status.


----------



## Asmodian

Woundingchaney said:


> The closest thing would perhaps be online forums and even then nearly all of the discussion is operational/performance related. Far from any social status.


It is not quite the same as a handbag... but it is also pretty similar in many ways. I believe the term is "e-peen"?

The golf clubs are probably a better comparison. They are actually supposed to be better. You don't really want to have the worst clubs in the group while having the best does feel good. Perhaps it is just a more male version of the handbag effect.


----------



## huzzug

guttheslayer said:


> Both, expensive to afford and holding value are tied hand in hand.


No, they're not. If things are inexpensive, they'll likely be bought by everyone, thereby losing their appeal.



> No one will buy a house costing million of USD and have it plummet to half value in just 12 months.


2008 - 2009 would have been a very difficult period for you to be in. 



> However buying a cheap toy that cost $10 and sell it at $5 next year, no one will bother or feel the pain.


Prices are relative. That $10 would be pretty hard to shell out for someone getting by on $200 each month, only to then watch its value drop.


----------



## iamjanco

Marketing 101 today, the executive version:

Nvidia's Jensen Saying "It Just Works" for 10 Hours


----------



## skupples

guttheslayer said:


> I hope NV realised targeting that few rich wealthy gamers isnt going to work in long term.
> 
> 
> We all have a right to own a gaming PC, as long as we can afford, but there is a limit on how sky high they can charge, especially given that it get obsolete every 12 months. I laugh at ppl who compare it to luxury good like UV bag or lambo, where the latter doesnt get obsolete in a span of 9-12 months.
> 
> 
> PC hardware is not suppose to be luxury unless it last consumer at least 5 years as a top-end performance.


How about the 1080 Ti still being relevant in the top 10% three years later? ;P

Also, unfortunately, no... a gaming PC is not a right. It's a privilege, like most everything else, including your examples: a Lambo and whatever a UV bag is.

Next thing you're going to tell us is that taxing the 1% should afford the 99% GPU discounts.

Also, property values do exactly that all the time, especially just a decade ago.

Hell, my parents just purchased one of the last units in a development for half of the initial asking price from when they first broke ground 4 years ago. I can guarantee you the folks who bought in at the original price see that as an extreme injustice.


----------



## BigMack70

Disclaimer: The following is not meant as a justification of Nvidia's pricing.


I think some of you guys _really_ lack perspective on the relative costs of graphics cards vs other hobbies and purchases in adult life. PC gaming is dirt cheap even if you maintain a $3k rig over the years, relative to almost all other adult life expenses. 

I have owned at least one, if not two, top end gaming GPUs for my entire adult life. It caused no financial hardship to drop $2k on a pair of Titan XM cards when I made the jump up to 4k back in 2015, because I sold old hardware and had saved up and pre-planned the purchase. And you know what? My annual household income is just above the median household income for where we live. As an example of the relative costs on offer here, I have never and probably will never own a new car. Every vehicle we've ever had has been used, and not particularly nice or noteworthy; just reliable and functional. You don't have to be filthy rich to own a high end gaming PC; with hardware re-selling, I've spent an average of $500/year on my PC in the last decade. That's less than $50/month. 

Know what that's equivalent to? Eating out once a month. Do you think eating out once a month is something only for the wealthy millionaire 1%? Because you're seriously out of touch with reality if yes.

Graphics cards are nothing like cars, or houses, or other luxury adult goods. Graphics cards are a luxury good for high schoolers and college students. They're a relatively cheap hobby expense for adults.


----------



## criminal

BigMack70 said:


> Disclaimer: The following is not meant as a justification of Nvidia's pricing.
> 
> 
> I think some of you guys _really_ lack perspective on the relative costs of graphics cards vs other hobbies and purchases in adult life. PC gaming is dirt cheap even if you maintain a $3k rig over the years, relative to almost all other adult life expenses.
> 
> I have owned at least one, if not two, top end gaming GPUs for my entire adult life. It caused no financial hardship to drop $2k on a pair of Titan XM cards when I made the jump up to 4k back in 2015, because I sold old hardware and had saved up and pre-planned the purchase. And you know what? My annual household income is just above the median household income for where we live. As an example of the relative costs on offer here, I have never and probably will never own a new car. Every vehicle we've ever had has been used, and not particularly nice or noteworthy; just reliable and functional. You don't have to be filthy rich to own a high end gaming PC; with hardware re-selling, I've spent an average of $500/year on my PC in the last decade. That's less than $50/month.
> 
> Know what that's equivalent to? Eating out once a month. Do you think eating out once a month is something only for the wealthy millionaire 1%? Because you're seriously out of touch with reality if yes.
> 
> Graphics cards are nothing like cars, or houses, or other luxury adult goods. Graphics cards are a luxury good for high schoolers and college students. They're a relatively cheap hobby expense for adults.


QFT

My ideal hobby would be building cars, not computers. But as stated above, that's a ridiculously expensive hobby compared to building PCs.


----------



## Blze001

BigMack70 said:


> They're a relatively cheap hobby expense for adults.


Eh, I wouldn't consider $1200 every year "relatively cheap" for most adults, tbh.


----------



## BigMack70

Blze001 said:


> Eh, I wouldn't consider $1200 every year "relatively cheap" for most adults, tbh.


Please read the post before replying. It doesn't even cost half that to maintain a top end gaming PC.


----------



## skupples

Blze001 said:


> Eh, I wouldn't consider $1200 every year "relatively cheap" for most adults, tbh.


Less than two weeks of pay a year for the average to low end of the income spectrum, and lately (since Kepler, nearly) there's been almost no reason to jump on every refresh, especially as a gamer who doesn't create or profit from their GPU in any way. Even folks who want 4K performance can rely on used and mid-tier products at this point.

This can easily be seen by how salty people get EVERY TIME they do this. Not to mention the fact that they're going to get back at least 50% of their buy-in price if they're re-upping every cycle.


----------



## Sheyster

guttheslayer said:


> I hope NV realised targeting that few rich wealthy gamers isnt going to work in long term.
> 
> 
> We all have a right to own a gaming PC, as long as we can afford, but there is a limit on how sky high they can charge, especially given that it get obsolete every 12 months. I laugh at ppl who compare it to luxury good like UV bag or lambo, where the latter doesnt get obsolete in a span of 9-12 months.
> 
> 
> PC hardware is not suppose to be luxury unless it last consumer at least 5 years as a top-end performance.



You really need to get over this. Toyota Camrys and Ferraris will always exist. That's how the world works. Luxury items are EVERYWHERE, including FOOD, the most basic necessity in life. Sheesh.


----------



## skupples

and that came from a Californian... that's some serious perspective


----------



## Blze001

BigMack70 said:


> Please read the post before replying. It doesn't even cost half that to maintain a top end gaming PC.


Your scenario assumes you can recoup half the cost of a new GPU by selling the old one. That's not always the case, especially when the replacement GPU's price makes large jumps between generations like Nvidia's have been.


----------



## skupples

Blze001 said:


> Your scenario assumes you can recoup half the cost of a new GPU by selling the old one, that's not always the case, especially when the replacement GPU's price makes large jumps between generations like the Nvidia ones are.


And even if you don't get back 50% (because you got it wet, or held onto it for too long), you're still putting out less than two weeks of pay a year to have "the best" gaming card on the market, which is in absolutely no way, shape, or form a requirement for top-tier 4K gaming at this point (ray tracing aside).

Compare that to your $200-$600 a month car payment...

Hell, even the $500 shart boxes from Xbox and Sony are competent 4K machines, and the new ones aren't even out yet.

It's definitely easy to make a gaming PC obscenely expensive, especially when you're using $15-a-piece fittings, but it's still one of the least expensive adult hobbies I know of. Even RC cars will run you more.


----------



## Sheyster

Blze001 said:


> Eh, I wouldn't consider $1200 every year "relatively cheap" for most adults, tbh.



The key word here is "relatively". A new $75 million Gulfstream G700 PJ is relatively cheap for a billionaire businessman or media mogul. It's all relative.


----------



## Sheyster

skupples said:


> and that came from a Californian... that's some serious perspective


LMAO! I'm unusual here... I actually believe in the Second Amendment and own a gun safe with associated "accessories". I will tell you one thing: I will NEVER retire here.


----------



## skupples

There won't be much of a state left to retire to by then anyway. Y'all are near apocalypse status at this point: wildfires raging, millions suffering brownouts like it's Y2K all over again, etc. So glad my brother finally moved back to Florida. The real Sunshine State!

Y'all got us beat in one area, though: the FL police absolutely HATE beach bonfires.


----------



## Blze001

Maybe it was my upbringing or something, but I'll never see $1200 as a "cheap" expense. Even if it's only half of a paycheck for me, that's still a lot of money.

Also I'll never just shrug and say "eh, price increase, who cares. I can still afford it, so it's fine"


----------



## Sheyster

skupples said:


> there won't be much of a state left to retire to by then anyways. Y'all are near apocalypse status at this point. Wild fires raging, millions suffering brown outs like its Y2K all over again, etc etc. So glad my brother finally moved back to florida. The real sunshine state!
> 
> y'all got us beat in one area. The police absolutely HATE beach bonfires.


I'm primarily looking at Nevada and Arizona. Possibly even Texas or Florida. It's early yet, I have a good 10 years to think about it, unless I win the lottery of course!


----------



## BigMack70

Blze001 said:


> Your scenario assumes you can recoup half the cost of a new GPU by selling the old one, that's not always the case, especially when the replacement GPU's price makes large jumps between generations like the Nvidia ones are.


Reselling GPUs is a perfectly reasonable assumption. And your insinuation that you can't recoup half the cost is also false. You're just trying to invent reasons to be upset.

For example, from my past 7 years:
Purchases:
2x 7970 Lightning ($1300)
2x GTX 780 ($1350)
2x Titan XM ($2100)
1x GTX 1080 Ti ($770)
1x RTX 2080 Ti ($1270)

Resales:
2x 7970 Lightning ($860)
2x GTX 780 ($600)
2x Titan XM ($1600)
1x GTX 1080 Ti ($600)

That's $3130 over 7 years. Or, less than $500/year. 
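For anyone who wants to check it, here's a quick sketch of the tally above (the prices are just the ones listed in this post; the labels are only for bookkeeping):

```python
# Sanity check of the 7-year GPU cost tally: total purchases minus
# total resales, then averaged per year. No other data is assumed.
purchases = {
    "2x 7970 Lightning": 1300,
    "2x GTX 780": 1350,
    "2x Titan XM": 2100,
    "1x GTX 1080 Ti": 770,
    "1x 2080 Ti": 1270,
}
resales = {
    "2x 7970 Lightning": 860,
    "2x GTX 780": 600,
    "2x Titan XM": 1600,
    "1x GTX 1080 Ti": 600,
}

net = sum(purchases.values()) - sum(resales.values())
print(net)             # 3130
print(round(net / 7))  # 447, i.e. under $500/year
```

Note the 2080 Ti has no resale entry because it's the card still in use.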


It's a complete fantasy that it requires $1000+ per year to own a $1000 graphics card. New GPUs only come around every 2 years, and you can almost always recoup half or more of the cost of your old ones on resale.

If you want to buy top GPUs and financially cannot, just stop eating out once a month (or heck... stop buying cigarettes or alcohol or whatever other food vice if you spend money on that) and save that money towards GPU purchases.

In most circumstances, the reason not to purchase these expensive cards is that they are a bad value (performance per dollar) proposition compared to cheaper cards, not because they're a huge expense. I understand that some people genuinely cannot afford such an expense, but if you've ever had a car payment of more than $150-200/month, or if you go on a yearly vacation that costs you $1000+, or if you eat out twice a month, or if you buy a 6 pack of craft beer each week, you already engage in other lifestyle activities that are more expensive and luxurious than owning a top end GPU. 

*It's perfectly fair to say "GPUs aren't worth that much money to me", but it's utterly absurd to say "GPUs are like Ferraris... only the rich can afford them".*


----------



## skupples

you're correct, it is... but in the grand scheme of hobbies, it's pretty low on the list.


----------



## ToTheSun!

Sheyster said:


> LMAO! I'm unusual here.. I actually believe in the second amendment and own a gun safe with associated "accessories".  I will tell you one thing, I will NEVER retire here.


It's kind of amazing how no one wants to live in Cali, but everyone does. All of my North American friends say the exact same thing (that they'd never live in California).


----------



## EniGma1987

ToTheSun! said:


> It's kind of amazing how no one wants to live in Cali, but everyone does. All of my north american friends say the exact same thing (that they'd never live in California).



California actually has a TON of people fleeing the state for other ones and it has been that way for about 10 years now. They are mostly moving to Oregon, Texas, and Tennessee.


----------



## iamjanco

ToTheSun! said:


> It's kind of amazing how no one wants to live in Cali, but everyone does. All of my north american friends say the exact same thing (that they'd never live in California).


It's kinda like tripe (or Lutefisk). Until you try it, you don't know what you're missing. Once you try it, you know what to avoid. 

Disclaimer: lived there for two years back in the early 90s. I hate tripe (and Lutefisk) as well.


----------



## skupples

EniGma1987 said:


> California actually has a TON of people fleeing the state for other ones and it has been that way for about 10 years now. They are mostly moving to Oregon, Texas, and Tennessee.


Yep, they're starting to show up down here as well, especially in the RED counties... which I find strange.

This new dude at my bar is likely gonna get tossed out soon if he keeps acting like we're buckwheat-chewing morons who don't know a thing about weed.


----------



## iamjanco

skupples said:


> yep, they're starting to show up down here as well, specially in the RED counties... which I find strange.
> 
> this new dude @ my bar is likely gonna get tossed out soon if he keeps acting like we're buckwheat chewing morons that don't know a thing about weed.


A blanket party might help fix that.


----------



## Blze001

BigMack70 said:


> If you want to buy top GPUs and financially cannot, just stop eating out once a month (or heck... stop buying cigarettes or alcohol or whatever other food vice if you spend money on that) and save that money towards GPU purchases.


Sadly, I've already cut those out so I can afford my degree and maybe a down payment on a house some day (although going back to school instead of buying before this never-ending home-price increase probably ruined that dream). I'm not saying Nvidia shouldn't charge what they do; I just don't want to celebrate that they've found out there's not really an upper limit to what they can charge.

I'm perfectly fine with my generation-behind strategy, personally.


----------



## skupples

Buying a house is incredibly overrated anyway, tbh. It's part of the classic dream still ingrained in us by our folks and old society, and it's out of order to want it right away in this modern market and world. I mean, paying rent vs. paying mortgage + property tax... what's the difference if you don't actually end up paying off the mortgage either way? It's just so incredibly unlikely to spend 15-30 years somewhere, especially pre wife and kids. The only asterisk to that for me is I'd definitely hope to give my crotch goblins the ability to go through all of their primary schooling in the same place; my folks having to relocate my brother and me multiple times had serious side effects.

I've re-ranked it to a much lower spot on my list, AFTER proper self-employment.

I just sign 2-3 year leases on a house at a time instead.


----------



## Jedi Mind Trick

skupples said:


> buying a house is incredibly overrated anyways, tbh. It's a part of the classic dream still ingrained in us from our folks & old society. It's out of order to want it right away in this modern market & world. I mean, paying rent Vs. paying mortage + property tax... what's the difference if you don't actually end up paying off the mortgage either way? It's just so incredibly unlikely to spend 15-30 years somewhere, specially pre wife & kids. The only * to that for me is I'd definitely hope to give my crotch goblins the ability to go thru all of their primary schooling in the same place. My folks having to relocate my brother & me multiple times had serious side effects.
> 
> I've re-ranked it to a much lower spot on my list, AFTER proper self employment.
> 
> I just sign 2-3 year leases at a time on a house instead.


Makes sense to me; the only thing I'd be looking to buy is a 3+ unit building, with the hope of one of the units paying the mortgage/taxes (and another covering anything left over), with the rest going into my pocket. A house just seems like asking to lose money at this point (the market seems way too high to me). At least with renting, you likely don't have to pay for anything that goes wrong (at least, not directly).


----------



## ToTheSun!

skupples said:


> buying a house is incredibly overrated anyways, tbh. It's a part of the classic dream still ingrained in us from our folks & old society. It's out of order to want it right away in this modern market & world. I mean, paying rent Vs. paying mortage + property tax... what's the difference if you don't actually end up paying off the mortgage either way? It's just so incredibly unlikely to spend 15-30 years somewhere, specially pre wife & kids. The only * to that for me is I'd definitely hope to give my crotch goblins the ability to go thru all of their primary schooling in the same place. My folks having to relocate my brother & me multiple times had serious side effects.
> 
> I've re-ranked it to a much lower spot on my list, AFTER proper self employment.
> 
> I just sign 2-3 year leases at a time on a house instead.


Given that you hope to afford your kids the chance to go through school in the same town, you would probably want to pay off a house in your lifetime, because that would give them a place to come back to if push comes to shove.

That's how I see it, anyway.


----------



## Sheyster

EniGma1987 said:


> California actually has a TON of people fleeing the state for other ones and it has been that way for about 10 years now. They are mostly moving to Oregon, Texas, and Tennessee.


Not just people fleeing; entire companies too, many of them technology companies that have moved to business-friendly Texas.


----------



## skupples

Yep, I'd 100% buy a strip of units long before buying a single-family home. I've done the math numerous times for properties in the different towns I've lived in. You've just gotta have enough units to offset the cost of major failures: ACs, leaks, etc.

I think a 4-5 unit building would be a good starter. The only other issue I've heard of is getting loan money for future buildings. A co-worker of mine with 30+ buildings came back to work for two years because the banks wouldn't issue him new loans otherwise. I didn't ask many questions, so no clue on the details, but I'd be willing to bet it's due to him not properly filing his mostly-cash income.



ToTheSun! said:


> Whereby you hope to afford your kids a chance to go through school in the same town, you would probably want to pay for a house in your lifetime because that would give them a place to come back to, push comes to shove.
> 
> That's how I see it, anyway.


That's what I was trying to convey.

I see no point in buying a home until I have kids, as I'd want them to stay in the same place for the bulk of their primary schooling. Having to make new friends every 3-4 years SUCKS, and it has long-term effects on your life that you don't realize until many, many moons later.


----------



## DNMock

Depending on where you live, buying a house is a fantastic investment. It is usually actually cheaper than renting one, and you gain equity to boot. Having something to borrow against when you need it gets you way better interest rates, and, best of all, the value of property and a house (as long as you take care of it) goes up faster than the rate of inflation in a lot of places.

In fact, I bought my old home in 2008 for $105,000. In 2015 it got totaled by a tornado, and its market value had more than doubled over that time. I was able to pay off the old house and had enough left over to build myself a new home without any loans.

As for the Cali exodus: yeah, here in DFW over the last few years there has been a huge influx of people moving from California. Most of them financed the move by making a killing selling their old home in Cali and moving here, where housing is a lot cheaper. A lot of them are able to upgrade to a nicer home with zero loans, from the equity and inflated value of their old house in Cali.


----------



## skupples

Yeah, I keep hearing that speech time and time again, but it means little to me right now, as I'm moving every couple of years.

My parents did the Cali flip all the way back in 2002. They had a home built in the M section of Rohnert Park for like $200k in 1997 and sold it in 3 days for $800k in 2001 or '02, before moving to Indiana.

You definitely get more for your money when buying over renting, but it all depends on context & goals.


----------



## DNMock

Yuck, Indiana? Lived there as a kid, saw enough cornfields and farms for one lifetime; no intention of going back.


----------



## looniam

skupples said:


> this new dude @ my bar is likely gonna get tossed out soon if he keeps acting like we're buckwheat chewing morons that don't know a thing about weed.


----------



## Hydroplane

Been reading and I'm still not sure how this thread transitioned into a real estate discussion, lol.


----------



## dVeLoPe

Talking about real estate: I'm looking to get my first piece of land in FLORIDA. How do I go about knowing which lots are actually able to have a house constructed on them?


----------



## ilmazzo

lol

I think this thread derailed a little bit, unless you are implying that the next Ampere GPUs will cost like real estate.


----------



## ToTheSun!

ilmazzo said:


> lol
> 
> think this thread derailed a little bit until you are implying that next ampere gpus will cost like a real estate


Holding value is not a bad thing.


----------



## Jedi Mind Trick

Hydroplane said:


> Been reading and still not sure how this thread transitioned to a real estate discussion lol


We're in GPU bubble territory right now; when it bursts, I'm going to pick up everything I can afford, rehab 'em, flip 'em, and live comfortably for the rest of my life!


----------



## b.walker36

skupples said:


> buying a house is incredibly overrated anyways, tbh. It's a part of the classic dream still ingrained in us from our folks & old society. It's out of order to want it right away in this modern market & world. I mean, paying rent Vs. paying mortage + property tax... what's the difference if you don't actually end up paying off the mortgage either way? It's just so incredibly unlikely to spend 15-30 years somewhere, specially pre wife & kids. The only * to that for me is I'd definitely hope to give my crotch goblins the ability to go thru all of their primary schooling in the same place. My folks having to relocate my brother & me multiple times had serious side effects.
> 
> I've re-ranked it to a much lower spot on my list, AFTER proper self employment.
> 
> I just sign 2-3 year leases at a time on a house instead.


IMO there is a big difference between paying rent vs. a mortgage and taxes. At least with a mortgage you are gaining equity while you live somewhere (assuming your home value isn't falling); you don't need to pay it off in its entirety. However, I agree with you that it's not something to rush into, especially if you have no wife/kids or don't plan to live there for a long time. Getting a $100k mortgage, paying off $5k, then moving and resetting while paying an agent a fee and losing money isn't smart, lol.

I got really lucky with my house: $120k purchase price, $114k mortgage, and it's valued at $175k currently. I have gained almost $90k in equity versus throwing away $1,600 a month when I was renting and getting nothing back. If your mortgage and taxes are less than your rent, then buying a house becomes a lot more lucrative, since you just gain equity over time.


----------



## skupples

DNMock said:


> Yuck, Indiana? Lived there as a kid, saw enough cornfields and farms for one lifetime, no intention on going back.


Thus why they were able to get 4x the house and save half the money they made from the Cali flip.

I left the moment I finished high school: drove to my rental, packed my stuff, and bailed.


----------



## Sheyster

ilmazzo said:


> lol
> 
> think this thread derailed a little bit until you are implying that next ampere gpus will cost like a real estate


Yep, one or two guys come in and toss that same old price grenade about premium GPUs, and the discussion devolves from there. I'm never the one to toss that grenade, but I can't resist responding to the preposterous attitudes about it and life in general. When they're giving away luxury items of any kind and there are no more price differentials for anything, anywhere, then please come back at that time. I'll keep the lights on for ya.


----------



## BigMack70

ilmazzo said:


> lol
> 
> think this thread derailed a little bit until you are implying that next ampere gpus will cost like a real estate


These threads always derail the moment that some galaxy brain geniuses on either side of the argument decide to start comparing a relatively inexpensive hobby to luxury cars and houses. It always happens, and it's always nonsense. When top end GPUs start costing $10k, then maybe that comparison starts to be valid. Not when they're $750-1200. We're an order of magnitude off that comparison having any validity at all. 

It makes me think people have never actually had any other hobbies when they start talking about how expensive PCs are. They're an expensive hobby when you're a broke student just trying to make ends meet, but they're on the cheap end of adult hobbies.


----------



## EniGma1987

Well, it's not like there's anything left to discuss on the Ampere rumor anyway. There won't be any more meaningful discussion of next-gen Nvidia cards until we know something more specific than 1H (which we can all already guess is April-June, as usual) or until some performance leaks/rumors appear.


----------



## Sheyster

EniGma1987 said:


> well, its not like there is anything left to discuss on the Ampere rumor anyway. There wont be anymore meaningful discussion on next gen Nvidia cards until we know something more than 1H (which we all already can guess at April-June like usual) or have some performance leaks/rumors.


I agree; this thread should be closed. There's no added value left here. Haters are gonna hate, some will feel entitled to Titan-level GPUs at 2007 pricing, etc. I'm over it.


----------



## skupples

Pretty sure it's young folks who haven't yet earned more than minimum wage making these silly statements.

What they don't realize is that even at $15 an hour, GPUs become quite affordable.

Favorite line of the thread: gaming PCs are a right!


----------



## looniam

skupples said:


> pretty sure its young folks pre-not-minimum-wage-income making these silly statements.
> 
> *what they don't realize is that even @ $15 an hour, GPUs become quite affordable.
> *
> favorite line of the thread, gaming PCs are a right!


And just what state (or where) is that?

I recall a big debate in a thread a few years ago about the cost of AAA games. As a baseline, breaking down the cost of living for a family of three (old-fashioned: wife and kids), the breadwinner would have to earn ~$22.00/hour just to afford two games a year ($120) in my state/place of residence (O-H... beat Michigan!).

Honestly, that's about what's probably affordable for most...









BTW, just because I can't afford something doesn't mean I can't live vicariously through those who can; I think you've seen me cheer on folks in owners/benchmarking threads when I couldn't actively participate.

Unless you subscribe to the idea that everything must be suffering now to (maybe) be rewarded later, people do have the right to an enjoyable pastime. But (to go way off on an analogy) just because I have a driver's license doesn't immediately bestow the privilege of driving a Porsche; just that $400 POS I have sitting in my driveway.


----------



## Diffident

Everyone talks about $1,200 GPUs as if that's the only model available.


----------



## alcmdemdsks

For RTX and other premium product features to be beneficial, they have to be available to most people, even on consoles.
Despite people here arguing that it's a cheap adult hobby, fewer than 1% have a 2080 Ti according to Steam.
Only about 10% use CPUs clocked at 3.7 GHz or above (Intel & AMD combined).
It's just not possible to enjoy, even with a top-of-the-line setup, if only a few games are developed for it.
The PC gaming market is growing, but catering to the $1,000+ segment will only drive more developers toward the budget-oriented side (the bigger pie).
So if you're content with the current trend of price increases and justify their actions, you'll just have to shell out more cash for scarce hardware AND the niche games made for it.


----------



## AuraNova

Diffident said:


> Everyone talks about $1200 GPU's as if that's the only model available.


The way people think and talk, they strive for the best. There's nothing wrong with that, but you have to do it smartly.


----------



## ZealotKi11er

AuraNova said:


> The way people think and talk, they strive for the best. Which there's nothing wrong with that, but you have to do so smartly.


You can't do it smartly if you want the fastest GPU. You buy a 2080 Ti and you have peace of mind that nothing will beat it. Hopefully, next-gen AMD has an answer for all of Nvidia's tiers. You can easily play every game out there with an RX 570 at 1080p or an RX 5700 at 1440p.


----------



## AuraNova

ZealotKi11er said:


> You cant no do it smartly if you want the fastest GPU. You buy a 2080 Ti and you have peace of mind that nothing will beat it. Hopefully, next-gen AMD has an answer for all of Nvidia's tiers. You can easily play every game out there with RX 570 at 1080p and RX 5700 at 1440p.


I have to disagree. Very rarely does anyone NEED the fastest card. If they want it and have the cash, great, but that's still not the smartest move, and that's before even shopping for the best price on said high-end card. You buy what you feel is best for your needs; it's part of proper consumerism. "Future-proofing" is a myth, especially in this industry. I didn't buy the fastest card, and I have peace of mind about my purchase. To each their own.

I feel that fixating on getting the fastest and best is a factor that sets the precedent for high prices. These companies know this, and they price accordingly. I'm not saying people shouldn't; I'm saying it's not really a smart move. It's all e-peen.


----------



## tpi2007

AuraNova said:


> I have to disagree. Very rarely anyone NEEDS the fastest card. If they want it and have the cash, great, but that's still not the smartest move. This is let alone shopping for the best price on said high-end card. You buy for what you feel is best for your needs. It's a part of proper consumerism. "Future-proofing" is a myth, especially in this industry. I didn't buy the fastest card, and I have piece-of-mind on my purchase. To each their own.
> 
> I feel those who transfix themselves on getting the fastest and best is a factor that sets the precedent for the high prices. These companies know this and they price accordingly. I am not saying people shouldn't. I'm saying that it's not really a smart move. It's all e-peen.



I disagree that future proofing is a myth. It's all about how much you want to spend and what performance threshold you're comfortable with, and of course it also depends on the products on the market, so it doesn't always apply. Making predictions too far out is irresponsible, but 3-4 years, even 5, is feasible. For example, there is no card to future proof for the next generation of titles at 4K: the 2080 Ti does it for now, but there isn't much headroom in it. That's even more true for ray tracing. The current hardware is v1, unbalanced, and overall lacking in resources; it's not even good for today's titles, let alone next year's.

On the other hand, if you want a card to future proof for 1080p (60 fps), there's plenty available: get an RX 5700 XT or a 2070, unless you want future proofing with ray tracing included, in which case it doesn't exist (same as above). The RTX 2080 Super gives you some headroom for 1440p, and the 2080 Ti even more.

For CPUs, there was a time when you'd be outdated in two years, but now there are cycles where a given choice is fine for many years. Ryzen started a new cycle of multi-core with decent IPC at affordable prices, which will usher in an era where games use more than 4C/8T; games take time to develop, though, so you can make predictions based on game development lifecycles and engine updates.

Future proofing with RAM is also relatively straightforward; you just need to buy it at the right time. Buying at the start of a new standard is not so good, as you'll pay more for less speed, higher latencies, and less capacity; the sweet spot is around the middle of the lifecycle. Picking a PCIe version (CPU/motherboard combo) is likewise a matter of looking at the needs and projections of software and hardware.


----------



## AuraNova

tpi2007 said:


> I disagree that future proofing is a myth, it's all about how much you want to spend and what kind of performance threshold you are comfortable with, and of course, it will also depend on the existing products on the market, so it does not always apply. And also, making predictions too far out in the future is irresponsible, but 3-4 years, even 5 years is feasible. For example, there is no card to future proof for the next gen of titles to be played at 4K - the 2080 Ti does it for now, but there isn't much headroom in it. It's even more true for ray tracing. The current hardware is v1, unbalanced, and overall lacks resources, it's not even good for today's titles, let alone next year's.
> 
> On the other hand if you want to buy a card to future proof for 1080p (60 fps), there's plenty available. Get an RX 5700 XT or a 2070, unless you want future proofing with ray tracing included, in which case, it doesn't exist (same as above). The RTX 2080 Super gives you some headroom for 1440p and of course the 2080 Ti even more.
> 
> For CPUs, well, there was a time when you'd be outdated in two years, but now there's cycles where you're fine with a given choice for many years. Ryzen started a new cycle of multi-core with decent IPC at affordable prices which will usher an era where games will use more than 4C/8T, but they take time to develop, so you can make predictions based on game development lifecycles and game engine updates.
> 
> Future proofing with RAM is also relatively straightforward, you just need to pick it at the right time. Picking it a the start of a new standard is not so good, as you'll be paying more for less speed, higher latencies and less capacity, the sweetspot is around the middle of the lifecycle. Picking a PCIe version CPU / motherboard combo is also a matter of looking at the needs and projections of software and hardware.


Considering that most people on this forum who buy high-end video cards will likely wind up wanting the latest thing when it comes out makes it a myth. I've seen a lot of people jump right into the next generation of card when the card they have is perfectly fine for years to come. I used an HD 7870 for over 5 years after its release before I upgraded, because I felt it was time. That card ran fine for my needs, and still did when I upgraded. Granted, my experience is not everyone else's, but I feel people are too quick to drop money on something that doesn't really benefit them much. The same goes for CPUs. I'm not really disagreeing with you, per se; I'm just pointing out that people think differently when it comes to future proofing. You have the right idea, but how many people actually do that?

Some of what you said proves my point to begin with:



> it's all about how much you want to spend and what kind of performance threshold you are comfortable with


There are tons of guides out there for people who want a "gaming" rig that will play games well by today's standards. Some are high-end, some are budget; most people don't know what they need to begin with. If ray tracing is important to you, fine. If 4K gaming is important, great. But it's not always necessary. People don't buy based on what they need; they just see that something performs in tip-top form and buy it. That's what I've been seeing lately. That comfort threshold you mentioned often changes not because people have a need, but because they follow the trend of having the best. So it stands that "future proofing" is a myth in this industry. It's the old adage: if it ain't broke, don't fix it.


----------



## guttheslayer

TBH, 4K60 wasn't possible until very recently, with the RTX 2000 series.

Not everyone buys the latest graphics card for the sake of it; their display requires much more horsepower than a previous-gen GPU can offer.


----------



## keikei

guttheslayer said:


> TBH, *4K60 is not possible till very recent years which is RTX 2000s series.*
> 
> 
> Not all want to buy the latest graphic card for the sake of it. Their display require a much higher horsepower than what the GPU of previous gen can offer.



Somewhat possible-ish. Do dual 1080 Tis count?


----------



## wingman99

Elmy said:


> I will sell my 2080Ti and upgrade to anything 20% or more faster. 2560X1440p 240Hz monitors will be arriving very soon. Will need all the FPS I can get.
> 
> I will sell my 2080Ti at a loss... I will buy the new 3080Ti or whatever they call it for X amount of dollars. New GPU Cost - Old GPU selloff = X dollars. X dollars every 12 or 18 months to play on flagship GPU is worth every penny. And there are many of us that do this. There are also many ppl that are lurking in the for sale forums to snag up those used GPU's at a discount.
> 
> Also I see so many ppl in here complaining about pricing. Its getting more expensive for R&D , Real estate , wages , logistics , cost of wafers , etc. Flagship will never be 6-7 hundred again... EVER! Even if Intel joins the market.. You think they somehow figured out how to build a GPU cheaper than Nvidia and AMD and even if they did.. You think they are going to pass that savings onto you? They know the excitement for their GPU's will be high. They will price at whatever NVIDIA/AMD is commanding at the same performance. The are NOT going to come into the industry and price their GPU's at the same performance for 50% less money. Some of you think somehow a pricing fairy is going to come rescue the market. Its NOT going to happen. Coming into here and complaining about the pricing falls on deaf ears. No one cares that you are not buying a GPU because its too expensive. There are 20 people behind you in line that will.


I've been PC gaming for 24 years. With each generation of newly demanding games, the recommended (or better) system specifications go up along with the CPU and GPU hardware. So PC gaming chasing 2560x1440 240Hz monitors at 240 FPS is mostly there to generate more sales from folks like you who don't realize that peak FPS over the last 10 years has stayed roughly the same, because game graphics demands increase in step with each generation of PC hardware.

For those reasons I'll stay with 1920x1080 at 144Hz/240Hz, keeping roughly the same FPS every 2-3 years as ever more demanding new games call for new system hardware.

Consoles go through the same new-game/new-hardware demand cycle at a much lower cost. Of course, with a PC with top hardware and newly released games, you can do much better on FPS at 1920x1080.

There is one good advantage to new hardware: if you like to play old games like Crysis 3 (2013), it will now run at an average of 187 FPS at 1920x1080 with an i9-9900K and a 2080 Ti.


----------



## ZealotKi11er

Even $1,200 is not too much if you're working and upgrading every 1-2 years. I bought a 1080 Ti for $900 CAD and sold it for $680 CAD, then added another $500 and got a 2080 Ti. Worst case is ~$400-500 a year to own the best GPU every year. You can play the same games on a slower GPU; my biggest problem is that GPUs these days don't warrant the cost, especially anything over a 2070S.
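Sketched out, the treadmill math looks like this (my CAD figures; the resale number is whatever the used market happens to give you at the time):

```python
# Net yearly cost of staying on the flagship, assuming you sell the old
# card each upgrade cycle. Prices are my CAD figures from the
# 1080 Ti -> 2080 Ti hop.

def yearly_upgrade_cost(new_price: float, old_card_resale: float,
                        years_between_upgrades: float) -> float:
    """Out-of-pocket per year: the top-up cost spread over the ownership period."""
    return (new_price - old_card_resale) / years_between_upgrades

# Sold the 1080 Ti for $680 and added ~$500, i.e. the 2080 Ti ran ~$1,180:
worst_case = yearly_upgrade_cost(1_180, 680, years_between_upgrades=1)
best_case = yearly_upgrade_cost(1_180, 680, years_between_upgrades=2)
print(f"~${best_case:.0f}-{worst_case:.0f} CAD per year")
```

The point being: the sticker price isn't the real cost of the flagship if you actually recycle the old card instead of letting it sit in a drawer.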


----------



## rluker5

guttheslayer said:


> TBH, 4K60 is not possible till very recent years which is RTX 2000s series.
> 
> 
> Not all want to buy the latest graphic card for the sake of it. Their display require a much higher horsepower than what the GPU of previous gen can offer.


My SLI 780 Tis could handle it from 2014 to 2016, their biggest challenge being 300+ hours of The Witcher 3, though I could only play at high settings. (Yes, the 4K TV I picked up in 2014, which I still use as my monitor, has DP 1.2.) SLI 1080 Tis breezed through it in the games that support SLI, and one 1080 Ti is enough for 4K60 at high settings. It would be nice to have more power, but a 2080 Ti isn't worth it for me with the 3080 Ti on the way. For my uses, a $1,200 2080 Ti would have been cheaper than the $1,400 I paid for the 780 Tis or the $1,600 I paid for the 1080 Tis.

If I stay at 4K60, a 3080 Ti should be adequate and a cheaper GPU upgrade than my previous two. My CPU/RAM/mobo/PSU/case/storage still has plenty of headroom left, so I'll let that be.


----------



## neurotix

keikei said:


> Somewhat possible-ish. Do dual 1080ti's count?



✔ dual 2000mhz+ 1080tis on air

✔ just upgraded to a 3900x + oc'ed 3800mhz c16 ram

✔ doesn't use/care about raytracing


----------



## wingman99

Seyumi said:


> You all have to admit, the gap between console & PC is closing with each and every generation. I've always been a PC gamer, before the original Playstation & Xbox days:
> 
> My PC games looked like PS2 games when the PS1 was out
> My PC games looked like PS3 games when the PS2 was out
> My PC games looked like PS4 games when the PS3 came out
> My PC games looked like PS4 pro games when the PS4 came out
> 
> Next generation, my PC games will probably look like PS5 games...when the PS5 comes out.
> 
> All PC has over an Xbox One X or a PS4 pro is higher resolution, frame rates, and some settings on high/ultra instead of medium. PC games are only MARGINALLY better visually than their console counterparts these days but A LOT more money. I'm losing faith in the PC-only gaming industry. I guess we have Nvidia and Intel to thank for that with their monopolistic prices. Don't even bother mentioning "but but but PC has mods" because those have now all been replaced by microtransactions these days. The PC modding community is practically dead at this point.
> 
> I can't wait to get a PS5 for $500? and get 95% of the visuals that my $5,000 computer has. My wallet is getting tired after all these years.


I totally agree. Consoles now run PC games ported over, and console hardware is more efficient for gaming, since that's what the proprietary hardware was solely designed for. The PC is a jack of all trades and master of none.


----------



## dantoddd

AuraNova said:


> I have to disagree. Very rarely anyone NEEDS the fastest card. If they want it and have the cash, great, but that's still not the smartest move. This is let alone shopping for the best price on said high-end card. You buy for what you feel is best for your needs. It's a part of proper consumerism. "Future-proofing" is a myth, especially in this industry. I didn't buy the fastest card, and I have piece-of-mind on my purchase. To each their own.
> 
> I feel those who transfix themselves on getting the fastest and best is a factor that sets the precedent for the high prices. These companies know this and they price accordingly. I am not saying people shouldn't. I'm saying that it's not really a smart move. It's all e-peen.


Future proofing is not a myth. Look at the 1080 Ti: if you had bought that card in 2017, you'd still be enjoying amazing performance at 1440p.


----------



## wingman99

dantoddd said:


> Future proofing is not a myth. Look at the 1080Ti, If you had bought that card in 2017, you're still enjoying amazing performance at 1440P


Not with new games like Call of Duty: Modern Warfare. Link: https://www.nvidia.com/en-us/geforce/news/call-of-duty-modern-warfare-2019-system-requirements/

New, demanding games need new hardware, if you care about FPS and RTX. It's the life cycle of progressing technology.


----------



## dantoddd

wingman99 said:


> Not with the new games like Call of Duty: Modern Warfare LINK: https://www.nvidia.com/en-us/geforce/news/call-of-duty-modern-warfare-2019-system-requirements/
> 
> New demanding games need need new hardware, if you care about the FPS and RTX. It is the life cycle of progressing technology.


The 1080 Ti can't do RTX, but at present there are like 5 RTX titles and maybe 2 worthwhile implementations. In anything else at 1440p, the 1080 Ti is way above 60 FPS, closer to 100.


----------



## wingman99

dantoddd said:


> 1080Ti can't do RTX, but at present there are like 5 RTX titles and maybe 2 worthwhile implementations. on anything else at 1440P 1080TI is way above 60 fps, closer to 100 fps.


The 1080 Ti's FPS are too low for modern games. The 1080 Ti is outdated; upgrade to an RTX 2080 Ti so you can run GPU-demanding new releases at 120-144 FPS on a 1440p 144Hz monitor, or 180-240 FPS on a 1920x1080 240Hz monitor.

There is no such thing as future proofing; the new, more demanding games are already here, like Call of Duty: Modern Warfare (system requirements revealed, link: https://www.nvidia.com/en-us/geforce/news/call-of-duty-modern-warfare-2019-system-requirements/), and the GTX 1080 Ti is behind the ball now. We're in the future.


----------



## skupples

The purchasers in this market are heavily affected by FOMO; more news at 11:00.


----------



## m4fox90

wingman99 said:


> The 1080ti FPS are to low for modern games. The 1080ti is outdated upgrade to RTX 2080ti so you can run GPU demanding new release games at 120-144 FPS with a 1440p, 144Hz monitor or 180-240 FPS with a 1920x1080p, 240Hz monitor.
> 
> The is no such thing as future proofing, the new more demanding games are here like Call of Duty: Modern Warfare System Requirements Revealed LINK: https://www.nvidia.com/en-us/geforce/news/call-of-duty-modern-warfare-2019-system-requirements/ and the GTX 1080ti is behind the ball now, were in the future.


175 GB of hard drive space? Jeez, these are getting crazy. I remember thinking the 50-ish GB for FFXIII 1/2/3 was a lot.


----------



## skupples

m4fox90 said:


> 175 GB hard drive space? Jeez. These are getting crazy. I remember thinking the 50ish GB for FFXIII 1/2/3 were a lot.


This happens with every console evolution: games get bigger.

The last game I can think of that triggered this kind of "WOOWWWW" was Max Payne.


----------



## m4fox90

skupples said:


> this happens every console evolution. Games get bigger.
> 
> last game I can think of that triggered this kinda WOOWWWW was Max Payne.


I'm not smart enough on this subject: does ray tracing take up more storage space vs. "faking it"?


----------



## ToTheSun!

m4fox90 said:


> 175 GB hard drive space? Jeez. These are getting crazy. I remember thinking the 50ish GB for FFXIII 1/2/3 were a lot.


If I remember correctly, GTA V used up something like 60-ish GB, but needed some 100+ when downloading+installing. That might be the case now. I mean, it's hard to imagine the game would actually need ALL THAT.


----------



## skupples

m4fox90 said:


> I'm not smart enough on this subject, does ray tracing occupy increased storage space vs "faking it" ?


Not sure... but 150 GB for RDR2 with no ray tracing. O.O!


----------



## Sheyster

skupples said:


> the purchasers in this market are heavily affected by FOMO, more news at 11:00.


All I can say to that is: YOLO! F.T.P.


----------



## Sheyster

skupples said:


> not sure... but 150gb for RDR2 with no Ray Tracing O.O!


Samsung and Micron must be loving these new fall 2019 games, MOAR/BIGGER SSD SALES.


----------



## Ultixer

Seyumi said:


> You all have to admit, the gap between console & PC is closing with each and every generation. I've always been a PC gamer, before the original Playstation & Xbox days:
> 
> My PC games looked like PS2 games when the PS1 was out
> My PC games looked like PS3 games when the PS2 was out
> My PC games looked like PS4 games when the PS3 came out
> My PC games looked like PS4 pro games when the PS4 came out
> 
> Next generation, my PC games will probably look like PS5 games...when the PS5 comes out.
> 
> All PC has over an Xbox One X or a PS4 pro is higher resolution, frame rates, and some settings on high/ultra instead of medium. PC games are only MARGINALLY better visually than their console counterparts these days but A LOT more money. I'm losing faith in the PC-only gaming industry. I guess we have Nvidia and Intel to thank for that with their monopolistic prices. Don't even bother mentioning "but but but PC has mods" because those have now all been replaced by microtransactions these days. The PC modding community is practically dead at this point.
> 
> I can't wait to get a PS5 for $500? and get 95% of the visuals that my $5,000 computer has. My wallet is getting tired after all these years.


This is one hell of a dumb comment, bro. No PC game in 2006 looked better than Gears of War; what are you smoking? Also, modding isn't dead: the battle royale genre and auto chess came from mods. Did you think Epic Games invented the battle royale concept?


----------



## Ultixer

wingman99 said:


> I totally agree. On consoles they now run PC games ported over to console and the console hardware is more efficient with gaming since that is what the proprietary hardware was solely designed for. PC is jack of all trades and master of none.


https://gamerant.com/destiny-2-bungie-investigating-frame-rate-issues-in-new-raid-on-console/

Destiny running at like 10 FPS on consoles; efficient?


----------



## wingman99

m4fox90 said:


> 175 GB hard drive space? Jeez. These are getting crazy. I remember thinking the 50ish GB for FFXIII 1/2/3 were a lot.


The graphics detail in new games is going up, requiring more storage for all the detailed geometry and shading.

This Nvidia 1920x1080 screenshot was made in 2017 to show how good graphics could be in the far future, when graphics power is unbelievable.


----------



## skupples

Ultixer said:


> This is one hell of a dumb comment, bro no PC game in 2006 looked better than Gears of War, what are you smoking? Also modding isn't dead, battle royale genre & auto chess come from mods, did you think Epic Games invented the battle royale concept?


Thanks for pointing this out.

All we're seeing are new brands repackaging old models in fancier, higher-contrast wrappers with MTX, with tens (if not hundreds) of millions of dollars poured into the marketing and hype machine.

Or look at the mobile segment: I'm not completely up on the AFK Arena scene, but most of the major games you see advertised are just classic PC formats shoved into a phone (Civ, C&C, etc.).

Or the classic story of Candy Crush.

As for mods giving birth to genres, there are great recent examples: PUBG is an ARMA mod gone standalone, DayZ is an ARMA mod gone standalone, and Fortnite is PUBG focus-grouped to the nth degree for maximum digital vibrancy and addiction.

It all makes sense looking back on it. It's essentially the same mass of people that showed up with WoW that keeps the modern AAA monstrosities afloat.


Ultixer said:


> https://gamerant.com/destiny-2-bungie-investigating-frame-rate-issues-in-new-raid-on-console/
> 
> Destiny running at like 10fps on Consoles, efficient?


Consoles do more with less; that isn't a contested thing in any way, shape, or form.



wingman99 said:


> The graphic detail is going up on new games requiring more storage for all the detailed geometry and shading colors.
> 
> This Nvidia 1920x1080p graphic screenshot was done 2017 to show how good graphics could be in the far future when graphics power is unbelievable.
> 
> View attachment 303102


That's a pretty extreme example.

Put up the face modeling "of the future" for a more reasonable example.

I see what you're getting at, though: the CES 2017 keynote with the ray-traced car and the ultra-realistic, uncanny-valley-free faces.


----------



## keikei

https://wccftech.com/nvidia-ampere-...higher-clocks-more-vram-lower-tdps-vs-turing/


----------



## Raghar

Sheyster said:


> Samsung and Micron must be loving these new fall 2019 games, MOAR/BIGGER SSD SALES.


That's too much for an SSD; you need a proper HDD for that, a 2 TB one with 128-512 MB of cache.


----------



## ToTheSun!

keikei said:


> https://wccftech.com/nvidia-ampere-...higher-clocks-more-vram-lower-tdps-vs-turing/


"Everything is much better and cheaper."

*sprinkles pinch of salt*


----------



## Blze001

wingman99 said:


> the GTX 1080ti is behind the ball now, were in the future.


I'm running 3440x1440 at 100hz and I'm still keeping up with most of the new titles I'm interested in. Not surprised Call of Duty is more demanding, they just roll out effectively the same game every year, so pushing graphics as hard as they can is vital.

For example Outer Worlds with everything maxed is sitting around 80-90fps for me, and it's pretty badly optimized.

Cyberpunk 2077 is gonna be the first title where I have trouble. I think my days of enjoying great visuals are over once that drops.


----------



## Sheyster

Raghar said:


> That's too much for SSD, you need proper HDD for that 2 TB one with 128-512MB cache.


I'm running 2TB of SSD now. That's plenty for what I do with this PC, almost purely gaming. I'm not a collector of any sort so don't need the huge storage space some folks might need.  I have not owned an internal HDD for ~4 years.


----------



## wingman99

Blze001 said:


> I'm running 3440x1440 at 100hz and I'm still keeping up with most of the new titles I'm interested in. Not surprised Call of Duty is more demanding, they just roll out effectively the same game every year, so pushing graphics as hard as they can is vital.
> 
> For example Outer Worlds with everything maxed is sitting around 80-90fps for me, and it's pretty badly optimized.
> 
> Cyberpunk 2077 is gonna be the first title where I have trouble. I think my days of enjoying great visuals are over once that drops


Game manufacturers increase graphics detail with every generation of increased graphics card performance; otherwise we'd only need graphics and PCs from 1995. There is no such thing as future proofing.:thumb:


----------



## Blze001

wingman99 said:


> Game manufactures increase the graphics detail every generation of increased graphic card performance, otherwise we would only need graphics and PCs from the year 1995. The is no such thing as future proofing.:thumb:


Quite true, but as previously discussed, the cost of staying caught up on graphics wasn't as hefty back when the top-tier card was $700 rather than $1,200, or however much nVidia charges for the 3080 Ti now that they know $1,200 doesn't impact sales.

Even lagging a generation behind, I'm looking at $700+ for a used RTX 2080 Ti.


----------



## wingman99

Blze001 said:


> Quite true, but as previously discussed, the cost of staying caught up in graphics wasn't quite as hefty back when the top-tier card was $700 and not $1200 or however much nVidia charges for the 3080ti now that they know $1200 doesn't impact sales.
> 
> Even lagging behind a generation, I'm looking at $700+ for a used RTX-2080ti.


Nvidia charges what the market will bear for the best graphics and FPS. However, Nvidia is rumored to be lowering the price on the 3080 Ti; the market has spoken.:specool:

3080 Ti rumor link: https://wccftech.com/nvidia-ampere-...higher-clocks-more-vram-lower-tdps-vs-turing/


----------



## Blze001

wingman99 said:


> Nvidia charges what the market will bear for the best graphics and FPS. However, Nvidia is rumored to lower the price on the 3080ti, the market has spoken.:specool:
> 
> 3080ti rumor link: https://wccftech.com/nvidia-ampere-...higher-clocks-more-vram-lower-tdps-vs-turing/


Like I've said before: I understand why nVidia prices the way they do, and considering they're a company, that's what they should be doing.

Doesn't mean I have to like it or be happy about how much less my dollar buys.


----------



## skupples

Sheyster said:


> I'm running 2TB of SSD now. That's plenty for what I do with this PC, almost purely gaming. I'm not a collector of any sort so don't need the huge storage space some folks might need.  I have not owned an internal HDD for ~4 years.


likewise.

I have 4TB NVME, 4TB SSD, and a NAS for everything else that doesn't require speed in any way shape or form. 

NVME raid Vs. SSD raid - only difference is file transfer speeds. game loading is nigh identical. (both are stripes though)


----------



## guttheslayer

wingman99 said:


> Game manufacturers increase graphics detail with every generation of increased graphics card performance; otherwise we would only need the graphics and PCs of 1995. There is no such thing as future proofing.:thumb:


Every aspect of demand that requires GPU horsepower is growing at an accelerating rate; in fact, the pace of GPU performance advancement is too slow to keep up.

Factors that impact GPU performance:

HDR
RTX
Higher FPS
Higher Resolution
Game Engine Fidelity

Each of these five is becoming increasingly demanding. GPU speed would have to go up 5x just to keep pace on all five fronts at once.


----------



## guttheslayer

wingman99 said:


> Nvidia charges what the market will bear for the best graphics and FPS. However, Nvidia is rumored to lower the price on the 3080ti, the market has spoken.:specool:
> 
> 3080ti rumor link: https://wccftech.com/nvidia-ampere-...higher-clocks-more-vram-lower-tdps-vs-turing/


I think we are all too happy to forget a certain point:

The article compared it with the RTX 2080 and 2080 Ti, i.e. the non-Super variants. That is to say, the 3080 and 3080 Ti could be priced at $700 and $900 respectively. Then again, the $900 Ti could be released 9-12 months after the non-Ti variant, which is what NV has done in the past; otherwise the low $200 premium would eat into sales of the 3080.


I was guessing the specs for the 3080 to be something along these lines:

*
GPU: GA104
Process Node: 7nm+ EUV
Die Size: 350mm~
Cores: 4096 (?)
RT Cores: 64
Boost Clock: 2GHz~
Memory: 12GB GDDR6
Memory Bandwidth: 512 GB/s (256 bits)
TDP: 180W (?)
MSRP: $699 (?)
*

Performance uplift: +50% for RT, +25% for rasterization, compared to the 2080 Ti.
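For what it's worth, the bandwidth line in that guess is internally consistent. A quick sketch of the standard bandwidth formula (the per-pin rate below is derived from the guessed numbers, not something stated in the post; 16 Gbps GDDR6 parts had been announced at the time, so it is at least plausible):

```python
# Sanity-check the "512 GB/s on a 256-bit bus" line of the spec guess above.
# bandwidth (GB/s) = (bus width in bits / 8) * per-pin data rate (Gbps)
bus_width_bits = 256
bandwidth_gb_s = 512

data_rate_gbps = bandwidth_gb_s / (bus_width_bits / 8)
print(f"implied per-pin rate: {data_rate_gbps:.0f} Gbps")  # -> 16 Gbps
```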


----------



## ilmazzo

wingman99 said:


> Nvidia charges what the market will bear for the best graphics and FPS. However, Nvidia is rumored to lower the price on the 3080ti, the market has spoken.:specool:
> 
> 3080ti rumor link: https://wccftech.com/nvidia-ampere-...higher-clocks-more-vram-lower-tdps-vs-turing/


don't think so

I just read about a lower TDP, and it's exactly what I expect from them, until AMD drops a big Navi 2 that smashes the 2080 Ti at $999. But we are in the wishful-thinking realm, so better to have lower expectations for the next Nvidia lineup release....


----------



## wingman99

guttheslayer said:


> Every aspect of demand that requires GPU horsepower is growing at an accelerating rate; in fact, the pace of GPU performance advancement is too slow to keep up.
> 
> Factors that impact GPU performance:
> 
> HDR
> RTX
> Higher FPS
> Higher Resolution
> Game Engine Fidelity
> 
> Each of these five is becoming increasingly demanding. GPU speed would have to go up 5x just to keep pace on all five fronts at once.


That is why I said in a previous post that with new games' increased system requirements, FPS really never goes up with new system hardware. Nvidia GPU performance increases 25-30% every generation. 

I have purchased roughly every new mid-range GPU generation for the last 19 years. I have an RTX 2070 now and can play with all graphics settings on Ultra on a 1920x1080p monitor: Battlefield 3 (released 2011), 200 FPS average; Battlefield 4 (2013), 172 FPS average; Battlefield 1 (2016), 144 FPS average; Battlefield V (2018), 126 FPS average. 

So I just wanted to show that if you wait a few years while upgrading a mid-range GPU, you can get 144-200 FPS in older games on a 144Hz-240Hz 1920x1080p monitor. Upgrading to the top-of-the-line GPU every generation will still only produce a 25-30% performance improvement. A top-of-the-line GPU over a mid-range GPU increases performance by about 34.5%, and you get most of that back by being one generation behind while upgrading the mid-range GPU every generation. So I save around $500 by upgrading to every new mid-range generation instead of buying the top-of-the-line GPU every generation.
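The trade-off described above can be sketched numerically. The percentages are the post's own estimates (25-30% per generation, 34.5% flagship-over-midrange), and the helper function is purely illustrative:

```python
# Back-of-the-envelope check of the mid-range vs. flagship upgrade argument.
# All numbers are the post's own assumptions, not measured data.

GEN_GAIN = 1.275      # ~25-30% per generation, midpoint
TIER_GAP = 1.345      # flagship over mid-range, ~34.5%

def relative_perf(generations_behind: int, flagship: bool) -> float:
    """Performance relative to today's mid-range card."""
    base = TIER_GAP if flagship else 1.0
    return base / (GEN_GAIN ** generations_behind)

# A flagship that is one generation old vs. a current mid-range card:
old_flagship = relative_perf(1, flagship=True)
print(f"last-gen flagship: {old_flagship:.2f}x of current mid-range")
# 1.345 / 1.275 is roughly 1.05, i.e. the one-generation-old flagship is
# only ~5% ahead, which is the post's point: a new mid-range card each
# generation keeps you close to the previous flagship at lower cost.
```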


----------



## Hale59

Turing's successor core will be launched in 2021 - http://translate.google.com/transla...&u=https://ascii.jp/elem/000/001/968/1968798/


----------



## skupples

hmm, so nothing seriously new from NV until after the new consoles? spicy.

this gives AMD time to play catch up, and mess with pricing a bit maybe.

and someone else is going to have to point out how that 2019 road map has anything to do with 2020, and 2021.


----------



## wingman99

ilmazzo said:


> don't think so
> 
> I just read about a lower TDP, and it's exactly what I expect from them, until AMD drops a big Navi 2 that smashes the 2080 Ti at $999. But we are in the wishful-thinking realm, so better to have lower expectations for the next Nvidia lineup release....


What does a lower TDP have to do with the usual 25-30% performance increase Nvidia targets every generation? Also, what does the 2080 Ti have to do with the 3080 Ti's rumored price drop from the Wccftech link: https://wccftech.com/nvidia-ampere-...higher-clocks-more-vram-lower-tdps-vs-turing/


----------



## rluker5

Hale59 said:


> Turing's successor core will be launched in 2021 - http://translate.google.com/transla...&u=https://ascii.jp/elem/000/001/968/1968798/


That appears to be Yusuke Ohara's claim, and he does present a timeline of past GPU releases.
But I don't know who that guy is or why I should believe him.


----------



## Section31

I have a feeling that if AMD hypes the 5800/5900XT enough, Nvidia is going to announce earlier and price close to it so AMD can't take the top mindshare. Nvidia imo only cares about its high-end market (2080 Ti), the big compute market, and the growing car AI/self-driving market.


----------



## zGunBLADEz

didn't drink the koolaid for an RTX card, and it looks like I will stay on that bandwagon, as I don't see games demanding enough for me to give up my 1080 Tis..
got me a 5700xt tho XD


----------



## Torvi

now im wondering if there is any point of upgrading now, I wanted to upgrade from 1050ti to 1660ti this month but now im unsure.


----------



## skupples

zGunBLADEz said:


> didn't drink the koolaid for an RTX card, and it looks like I will stay on that bandwagon, as I don't see games demanding enough for me to give up my 1080 Tis..
> got me a 5700xt tho XD


why? they're nearly identical, with 5700xt being just a bit slower.

i definitely feel like the 5700XT is more responsive/less laggy though.



Torvi said:


> now im wondering if there is any point of upgrading now, I wanted to upgrade from 1050ti to 1660ti this month but now im unsure.


seems like a bit of a small jump. should jump to 5700xt or something. 

I tried not to drink the RTX kool aid, but my 1080 Ti went on the fritz. Now I'm @ 4K60 w/ a 2080 Ti, so I'm essentially stuck.  I'm not gonna jump on the ultrawide bandwagon, that's just silly. The only bandwagon left is one that hasn't shown up yet: cheaper lanes, and HDMI 2.1/DP 1.4!


----------



## skupples

thinking about this more... all that means is we're for sure seeing a full turing refresh.


----------



## tpi2007

skupples said:


> thinking about this more... all that means is we're for sure seeing a full turing refresh.



7nm EUV means that they can cram in more of everything and possibly increase average clocks by ~200 MHz, so when it comes to rasterization performance they can probably afford to do the same thing they did with Maxwell -> Pascal, i.e., keep the same basic arch. On the RTX front, I think they will go beyond just an increase in core count; they'll have to make the RT and Tensor cores better.


----------



## skupples

either way, we're seeing a tweaked turing 30x0 before we see 7nm


----------



## Section31

I suspect Turing was originally meant for 7nm, but Nvidia decided to use 12nm, which turned out to be more problematic: they didn't get the performance numbers, and the power numbers were higher. It will probably still be the actual Turing architecture, but on 7nm (maybe 7nm EUV). Still potentially interesting.

Crazy that the original roadmaps back in 2012/2013 said 7nm would have been out 1-2 years ago. That being said, the 5nm chart could be accurate, as the learning mistakes occurred at the 7nm stage.


----------



## Torvi

skupples said:


> seems like a bit of a small jump. should jump to 5700xt or something.


The 1050 Ti barely plays games on medium at 50-ish fps; the 1660 Ti will play most games at ultra at 60-80 fps, so it's a nice jump. The 5700 XT is twice as expensive, and I'm using Nvidia Ansel in games, so I'm not really considering team red. But yet again, no clue whether it's worth upgrading or whether I should just stay patient and wait for the new GPUs to come.


edit: I'm in the midrange 1080p market. I have an R5 1600 @ 4.0GHz, so I don't think going overkill on the GPU has any point since my CPU would then bottleneck it.


----------



## guttheslayer

skupples said:


> thinking about this more... all that means is we're for sure seeing a full turing refresh.


I really doubt that at this point.

7nm EUV is in full ramp, and NV products based on it should be out by H2 2020.


I really doubt NV will risk being 2 years behind AMD on process node. Furthermore, 12nm+ has nothing to offer given that the 2080 Ti already has a 280W TDP.


----------



## Blze001

Section31 said:


> I suspect Turing was originally meant for 7nm, but Nvidia decided to use 12nm, which turned out to be more problematic: they didn't get the performance numbers, and the power numbers were higher. It will probably still be the actual Turing architecture, but on 7nm (maybe 7nm EUV). Still potentially interesting.
> 
> Crazy that the original roadmaps back in 2012/2013 said 7nm would have been out 1-2 years ago. That being said, the 5nm chart could be accurate, as the learning mistakes occurred at the 7nm stage.


I suspect it has less to do with problems and fab issues and more to do with nVidia realizing they didn't have to invest in the new tech since they don't have any competition. Kinda how Intel was doing a few years ago.


----------



## skupples

same thing crossed my mind. 

seems more like an Intel issue. sure, it could totally be fab issues... or it's laziness from being king for too long.


----------



## ZealotKi11er

Blze001 said:


> I suspect it has less to do with problems and fab issues and more to do with nVidia realizing they didn't have to invest in the new tech since they don't have any competition. Kinda how Intel was doing a few years ago.


It's all about making bets.


----------



## Section31

Possibly true, considering Nvidia has been shafting us. We know we don't get their most powerful GPUs; those go to their AI and high-end compute departments. Unfortunately that is where all the money is now, and AMD was nice enough to prioritize us. Looking at Intel, their priority is getting 10nm/7nm out first for servers; laptops get it because the rejected stock can just go to them. I hope AMD doesn't change, but it's hard to resist the high profit margins associated with server chips, compute, and the AI market.


----------



## JackCY

AMD doesn't care about retail; they focus on compute and servers too, same as everyone else. Then they all release the worse bins as retail products, while the design itself is a compromise to do "gaming" at some acceptable level while being good at other/compute workloads.

There is also still a big difference between AMD's CPUs and GPUs in how aggressive and competitive they are, and how good the supply and R&D for each is.
Intel is getting into GPUs thanks to AMD: AMD threw in the towel when it comes to competing with NV in the server space, and right now all Intel does is feed NV GPUs for compute. They are not liking that very much. So unlike AMD, they can throw money at this problem the same way they are throwing money at the CPU issues they are having.


----------



## m4fox90

JackCY said:


> AMD doesn't care about retail, they focus on compute and servers too same as everyone else. Then they all release the worse bins as retail products while design itself is a compromise to do "gaming" at some acceptable level while being good at other/compute.
> 
> There also still is a big difference between AMD's CPUs and GPUs, in how aggressive and competitive they are, how good their supply and R&D for each is.
> Intel is getting into GPUs thanks to AMD, because AMD threw the towel in when it comes to competing with NV in server space and right now all Intel does is feed NV GPUs to do compute. And they are not liking that very much. So unlike AMD they can throw money at this problem same way they are throwing money at the CPU issues they are having.


AMD "threw in the towel" so hard that NVIDIA is re-starting production of the RTX 2070 to compete with the successful 5700 series.


----------



## tpi2007

If the latest rumours are true, this whole thread just went down the drain:

https://wccftech.com/nvidia-geforce-rtx-2080-ti-super-q1-2020-launch-rumor/

The other day an industry source talking about Intel's discrete GPUs made the point that they won't be competing strictly for gaming. So that's probably the reason, then: Nvidia knows the landscape and doesn't need to compete at the high end for another year; they just need token upgrades of their current line-up.






I guess this quote I put on post #4 means literally that there won't be anything but Turing for the whole of 2020:

https://www.fool.com/earnings/call-...rp-nvda-q2-2020-earnings-call-transcript.aspx



> In a ray tracing content, it just keeps coming out and and between the performance of Super and the fact that it has ray tracing hardware, it's going to be *super well positioned for through all of next year.*


----------



## rluker5

But I want a new toy 
What am I supposed to spend money on if last year's upgrade wasn't worth it and still isn't?

Saving it isn't as fun; I can't play video games at higher settings on that. It just goes in the imaginary pile.

I hope wccf is wrong.


----------



## m4fox90

Lol can't wait for a 2080ti Super with 5 additional CUDA cores and +2 MHz guaranteed boost clock. Oh Nvidia.


----------



## skupples

tpi2007 said:


> If the latest rumours are true, this whole thread just went down the drain:
> 
> https://wccftech.com/nvidia-geforce-rtx-2080-ti-super-q1-2020-launch-rumor/
> 
> The other day an industry source talking about Intel's discrete GPUs made the point that they won't be competing strictly for gaming, so that's probably the reason then, Nvidia knows the landscape and doesn't need to compete at the high end for another year, they just need token upgrades of their current line-up.
> 
> 
> 
> 
> 
> 
> I guess this quote I put on post #4 means literally that there won't be anything but Turing for the whole of 2020:
> 
> https://www.fool.com/earnings/call-...rp-nvda-q2-2020-earnings-call-transcript.aspx


LOL! 

I love pulling theories outta me arse that line up with reality. 

aint nothing grand coming until round 2 of intel GPUs.

so once again, looks like 30x0 is a complete turing refresh.

this also means AMD's gonna be playing catch up for another year +, and as soon as they do catch up, NV's gonna hulk smash em.


----------



## guttheslayer

skupples said:


> LOL!
> 
> I love pulling theories outta me arse that line up with reality.
> 
> aint nothing grand coming until round 2 of intel GPUs.
> 
> so once again, looks like 30x0 is a complete turing refresh.
> 
> this also means AMD's gonna be playing catch up for another year +, and as soon as they do catch up, NV's gonna hulk smash em.


I still seriously doubt Turing's successor will be delayed till 2021; the GPU market is effectively dead at this point.

I will wait till GTC 2020 to confirm, but I have no doubt Ampere won't be released before H2 2020.


But that GA100 with 55B transistors is pure crazy. How will they handle that monstrous size? If 7nm EUV+ has a transistor density of 48-50 MTr per mm^2, that is almost 1100mm^2 (unless it's 5nm EUV LPE). Pricing aside, the TDP will be crazy as well. I am not buying that 55B nonsense. The memory bandwidth is pretty over the top as well, at something like 2.3 TB/s?
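The die-size arithmetic above does check out. A quick sketch, using the post's own assumed density range of 48-50 MTr/mm^2 (the ~850 mm^2 reticle figure in the comment is the commonly cited single-exposure approximation, not from the post):

```python
# Reproduce the die-area estimate for a rumored 55B-transistor GA100.
transistors = 55e9

for density_mtr_per_mm2 in (48, 50):
    area_mm2 = transistors / (density_mtr_per_mm2 * 1e6)
    print(f"{density_mtr_per_mm2} MTr/mm^2 -> {area_mm2:.0f} mm^2")
# At 50 MTr/mm^2 this gives 1100 mm^2, matching the ~1100mm^2 figure
# above and sitting far beyond the ~850 mm^2 reticle limit of a
# single exposure, which is why the 55B number looks implausible.
```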


We have some crazy thing going on.


----------



## keikei

Well, if Navi 2 is about 15-20% faster than the 2080S or even gets a whiff of 2080 Ti performance, Nvidia just has to do another refresh... which is sadly gud enough. I was expecting Big Blue to come out blazing, but it looks like they might not. Their tech might still be green in 2020. Looking forward to the Supa Dupa lineup though. :doh:


----------



## EniGma1987

guttheslayer said:


> I still seriously doubt Turing will delay till 2021, the GPU market is effective dead at this point.
> 
> I will wait till GTC 2020 to confirm, but I have no doubt Ampere wont be release before H2 2020.
> 
> 
> But that GA100 with 55B transistor is pure crazy. How will they handle that monstrous size. If 7nm EUV+ has a transistor density of 48-50 mtr per mm^2, that is almost 1100mm^2 size (unless its 5nm EUV LPE) . Pricing aside, the TDP will be crazy as well. I am not buying that 55B nonsense. The memory bandwidth is pretty over the head as well with like 2.3 TB/s?
> 
> 
> We have some crazy thing going on.





I too highly doubt 55 billion; that is a very big number. I don't even think a chip that large could fit on current form-factor GPU AICs.
Though if Nvidia wanted, they could probably use something similar to what Intel just did with their EMIB and stitch two dies together. We have seen it before, but technology is finally at a place where it would be feasible for a GPU to do so, and it would be a way for Nvidia or someone else to get a much larger effective die by using smaller dies. Intel's current biggest is 43.3 billion transistors across two dies, stitched together with EMIB and having a cross-die bandwidth of 6.5 tera*bytes* per second without using an interposer. Bandwidth between dies has always been the issue, but with interconnects reaching bandwidth like that, it is finally ready for GPUs to be designed with chiplets as well without taking a major performance hit.


----------



## skupples

guttheslayer said:


> I still seriously doubt Turing will delay till 2021, the GPU market is effective dead at this point.
> 
> I will wait till GTC 2020 to confirm, but I have no doubt Ampere wont be release before H2 2020.
> 
> 
> But that GA100 with 55B transistor is pure crazy. How will they handle that monstrous size. If 7nm EUV+ has a transistor density of 48-50 mtr per mm^2, that is almost 1100mm^2 size (unless its 5nm EUV LPE) . Pricing aside, the TDP will be crazy as well. I am not buying that 55B nonsense. The memory bandwidth is pretty over the head as well with like 2.3 TB/s?
> 
> 
> We have some crazy thing going on.


that's all well and good, but when was the last time Nvidia was one and done with a single arch? or am I out of order? is Turing Pascal 2.0? I haven't been keeping up all that well.

also, as to the interlink stuff: didn't NV buy a company that devotes all its time to that kinda stuff? 

either way, I don't expect a whole lot out of the 3080 Ti; it's the 4080 Ti I'm after.


NV could'a simply screwed themselves from being overly arrogant towards AMD's position. maybe just maybe AMD will get one full season on top for the first time in 15 years? 
nahhh


----------



## guttheslayer

EniGma1987 said:


> I too highly doubt 55 billion, that is a very big number. I dont even think a chip that large can fit on the current form factor GPU AICs
> Though if Nvidia wanted, they could probably use something similar to what Intel just did with their EMIB and stitch two dies together. We have seen it before, but technology is finally at a place where it would be feasible for a GPU to do so, and it would be a way for Nvidia or someone else to get a much larger die size by using smaller dies. Intel's current biggest is 43.3 billion transistors with two dies, stitched together with EMIB and having an cross-die bandwidth of 6.5 terra*bytes* per second without using an interposer. Bandwidth between dies has always been the issue, but with interconnects reaching bandwidth like that, it is finally ready for GPUs to be designed with chiplets as well without taking a major performance hit.


It could be; then the rumoured GA101 could be the supposed single-die variant.

Stitched together or not, it is a monster GPU to contain 55B transistors.



skupples said:


> that's all well and good, but when was the last time nvidia was one and done with a single arch? or am I out of order? is turing pascal 2.0? i haven't been keeping up all that well.
> 
> also, as to the interlink stufff. didn't NV buy a company thats devotes all time to that kinda stuff?
> 
> either way, I don't expect a whole lot out of 3080ti, it's the 4080 ti i'm after.
> 
> 
> NV could'a simply screwed themselves from being overly arrogant towards AMD's position. maybe just maybe AMD will get one full season on top for the first time in 15 years?
> nahhh


Pascal was 2 years old before it was replaced, and so was Maxwell. I am expecting Turing to be replaced by Ampere by late 2020.


----------



## skupples

maybe - either way, 30x0 is turing refresh  

like 4 to 5, or 6 to 7.

I think intel and NV are both falling victim to their own egos. Thinking they were so ahead of the ball Vs. any competition.

AMD's gonna be all new hotness for the next 2 years while NV & intel flounder. I guess we'll see how my predictions pan out this time. They were pretty solid a few years back when AMD was down to like $3.50 a piece.


----------



## rluker5

skupples said:


> maybe - either way, 30x0 is turing refresh
> 
> like 4 to 5, or 6 to 7.
> 
> I think intel and NV are both falling victim to their own egos. Thinking they were so ahead of the ball Vs. any competition.
> 
> AMD's gonna be all new hotness for the next 2 years while NV & intel flounder. I guess we'll see how my predictions pan out this time. They were pretty solid a few years back when AMD was down to like $3.50 a piece.


Why not 9 to 10? Is the node shrink gone?
I'm hoping for at least a 9-to-10 repeat. A little less would be ok too.

Actually a 6-to-7 repeat would be good as well: 4352 x (2880/1536) = 8160
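Assuming those core counts refer to the GTX 680 (1536 cores) and GTX 780 Ti (2880 cores), the extrapolation above works out as:

```python
# Scale the 2080 Ti's CUDA core count by the same ratio as the
# GTX 680 -> GTX 780 Ti jump (the "6 to 7" repeat suggested above).
gtx_680_cores = 1536
gtx_780ti_cores = 2880
rtx_2080ti_cores = 4352

ratio = gtx_780ti_cores / gtx_680_cores      # 1.875
projected_cores = rtx_2080ti_cores * ratio
print(f"{ratio:.3f}x -> {projected_cores:.0f} cores")  # -> 8160 cores
```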


----------



## skupples

because i didn't pay enough attention to 9 & 10 to properly put it in the list as a factual statement.


----------



## guttheslayer

skupples said:


> maybe - either way, 30x0 is turing refresh
> 
> like 4 to 5, or 6 to 7.
> 
> I think intel and NV are both falling victim to their own egos. Thinking they were so ahead of the ball Vs. any competition.
> 
> AMD's gonna be all new hotness for the next 2 years while NV & intel flounder. I guess we'll see how my predictions pan out this time. They were pretty solid a few years back when AMD was down to like $3.50 a piece.


I won't believe there will be any more Turing refreshes as a 30x0 series on 12nm. Super is already a Turing refresh, and it stayed on 2000-series naming.


The 3080 will probably be the successor, but it might appear in Q4 2020 at the earliest.


----------



## skupples

super is only a partial refresh. There's still nothing between the 2080 Ti and the Titan yet.


----------



## ThrashZone

Hi,
I can wait my gpu's are doing just fine.

May get another 1080 Ti just to tide me over for a third build.


----------



## guttheslayer

skupples said:


> super is only a partial refresh. There's still nothing between the 2080 Ti and the Titan yet.


So how many SKUs do you think NV can squeeze between 4352 cores (2080 Ti) and 4608 cores (Titan RTX)?


A 2080 Ti Super might happen, but that is not a full refresh. Don't kid people.


Ampere is most likely next, probably after a 2080 Ti Super. No more Turing refreshes.


----------



## PontiacGTX

tpi2007 said:


> If the latest rumours are true, this whole thread just went down the drain:
> 
> https://wccftech.com/nvidia-geforce-rtx-2080-ti-super-q1-2020-launch-rumor/
> 
> The other day an industry source talking about Intel's discrete GPUs made the point that they won't be competing strictly for gaming, so that's probably the reason then, Nvidia knows the landscape and doesn't need to compete at the high end for another year, they just need token upgrades of their current line-up.
> 
> 
> 
> 
> 
> 
> I guess this quote I put on post #4 means literally that there won't be anything but Turing for the whole of 2020:
> 
> https://www.fool.com/earnings/call-...rp-nvda-q2-2020-earnings-call-transcript.aspx


if this is true, then maybe their 7nm production isn't ready? unexpected problems? because releasing a 2080 Ti Super will only give AMD the chance to beat Nvidia easily


----------



## skupples

maybe that means AMD's gonna get a leg up on both NV and Intel for a full season.

that would be the best outcome, as it would cause the most price fluctuation. If NV can only compete by releasing an ACTUAL Turing refresh (Super isn't really a refresh, it's a SKU replacement), then AMD can possibly level out the playing field, and thus bring #s down a bit.

funny to see y'all dancing circles around whats coming here.

and yes there's definitely room between 2080ti and titan for yet another super segmentation/sku replacement. the lulz we'll have seeing some folks race to spend $1,500 on 2-5% perf increase


----------



## Thingamajig

UltraMega said:


> Maybe Ray tracing will be practical this time... But if Nvidia doesn't bring their prices back down to earth, it will be hard not to label them as a greedy company to avoid long term.


This is what I find so funny and just so typical in this industry. People slap down hundreds, near thousands, for cutting-edge technology, only to get a half-baked product that's still in its experimental phase, and every time people are "outraged", only to then buy up the next product because "it does it better", never getting full use of the hundreds they put down on the previous card/hardware. Rinse and repeat.

It's all just marketing and PR now, with less substance.

If people are stupid enough to buy, I don't blame Nvidia at all. A fool and their money are easily parted. I usually try to keep behind the latest news and hardware releases for this reason. It's all noise to me, unless I ever have very specific needs for the hardware, and even then I'm extremely dubious.


----------



## Raghar

Well boys, I looked at current NVidia cards because I thought I'd upgrade my GTX 660 (the non-OC version with 4 heatpipes and two fans).
The result was:

The Asus Phoenix is something you want either for a small case far away from your ears, or to rip the heatsink off and replace it with a custom one.
The Strix version doesn't come in a 2.7-slot version, just with more heat pipes and silent fans (and a switch that allows an underclocked setting for zero-noise gaming). Naw, it comes in a 2.4-slot version for no reason whatsoever. The EVO might be decent if the LED were removed and the BIOS hacked to allow low-noise gaming.

GB... Well, after the bit of a trainwreck the current Asus cards are, I looked at what GB cards look like... and I didn't like what I saw. The only heatsink I liked was on the Aorus, and, well, I quickly dismissed the idea. 1. I didn't see anything about a switch for low-noise gaming. 2. The backplate is PLASTIC. Who would use plastic as a backplate? A backplate is for spreading heat. When metal is not an option, a thick PCB with a lot of copper works better than PLASTIC.

Then I looked at Palit.
...
...
...
And after that, all Palit GTX 1660 Supers are on my black list.

Would the Ampere cards be the same trainwreck? I need to get a new card before my old one burns out. Like a preemptive replacement.

I'm seriously thinking about Intel, 5 years from now.


----------



## skupples

Thingamajig said:


> This is what i find so funny and just so typical in this industry. People slap down hundreds, near thousands for cutting edge technology, only to get a half-baked product thats still in experimental phases, and every time people are "outraged" only to then be buying up the next product because "it does it better" - never getting the full use of all the hundreds they put down on the previous card/hardware. Rinse and repeat.
> 
> It's just all marketing and PR now with less substance.
> 
> If people are stupid enough to buy, I dont blame Nvidia at all. A fool and their money are easily parted. I usually try and keep behind the latest news and hardware releases for this reason. It's all noise to me - unless i ever have very specific needs for the hardware, and even then i'm extremly dubious.


depending on how you do it, you don't actually end up spending much, but yes it is a bit entertaining. the ray tracing growing pains will eventually pay off though. Someone had to start doing it, and it wasn't AMD to do it first, for once. 

example, i'll likely only lose $150-$250 on my 2080ti when I flip up for 3080ti. divide loss across number of months used, n cost to game in the top tier isn't all that high. 

or, my 1080tis that I got for $500, n will have sold via buy it now within an hour of listing for $450.


----------



## tpi2007

Raghar said:


> Well boys. I looked at current NVidia cards because I thought I'd upgrade my GTX 660 with 4 heatpipes and two fans non OC version.
> Result was:
> 
> Asus Phoenix is something you want to get either for small case far away from your ears, or to rip off the heatsink and replace it by custom.
> Strix version doesn't come in 2.7 slot version, just with more heat pipes and silent fans (and switch that allows underclocked seting for zero noise gaming). Naw it comes with 2.4 slot version for no reason whatever. EVO might be decent when LED would be removed and bios hacked to allow low noise gaming.
> 
> GB... Well, after bit trainwreck current Asus cards are, I looked what GB cards looks like... And I didn't like what I seen. I liked heatsink ONLY on Aorus, and well, I quickly dismissed the idea. 1. I didn't see anything about switch for low noise gaming. 2. Backplate is PLASTIC. Who would use plastic as backplate? Backplate is for spreading heat. When metal is not an option, using thick PCB with lot of copper works better than PLASTIC.
> 
> Then I looked at Palit.
> ...
> ...
> ...
> And after that all Palit GTX 1660 Super are on my black list.
> 
> Would Ampere cards look like the same trainwreck? I need to get new card before my old one burns out. Like preemptive replacement.
> 
> I'm seriously thinking about Intel, 5 years from now.



MSI Gaming X models. You're going to be paying a premium, but in my experience they're good.


----------



## wingman99

Thingamajig said:


> This is what i find so funny and just so typical in this industry. People slap down hundreds, near thousands for cutting edge technology, only to get a half-baked product thats still in experimental phases, and every time people are "outraged" only to then be buying up the next product because "it does it better" - never getting the full use of all the hundreds they put down on the previous card/hardware. Rinse and repeat.
> 
> It's just all marketing and PR now with less substance.
> 
> If people are stupid enough to buy, I dont blame Nvidia at all. A fool and their money are easily parted. I usually try and keep behind the latest news and hardware releases for this reason. It's all noise to me - unless i ever have very specific needs for the hardware, and even then i'm extremly dubious.


I totally agree. I purchased my usual midrange upgrade, the RTX 2070, and was very disappointed in the ray tracing effects for the high cost of a $500 graphics card. Now I don't use ray tracing due to the loss in FPS. Now that AMD has made the RX 5700 XT, which is 4% faster than the RTX 2070 for $400, the competition has spoken.:specool:


----------



## rluker5

Thingamajig said:


> This is what i find so funny and just so typical in this industry. People slap down hundreds, near thousands for cutting edge technology, only to get a half-baked product thats still in experimental phases, and every time people are "outraged" only to then be buying up the next product because "it does it better" - never getting the full use of all the hundreds they put down on the previous card/hardware. Rinse and repeat.
> 
> It's just all marketing and PR now with less substance.
> 
> If people are stupid enough to buy, I dont blame Nvidia at all. A fool and their money are easily parted. I usually try and keep behind the latest news and hardware releases for this reason. It's all noise to me - unless i ever have very specific needs for the hardware, and even then i'm extremly dubious.


The thing about "cutting edge" is it is inherently in the experimental and development stage.
Maybe you are expecting "cutting edge" to be well established, traditional, reliable, and boring like the very last gen from Nvidia - Paxwell, which is an improved Kepler and not much more of a change than what has been going on with GCN.

Nvidia could have tossed in bigger rasterization improvements or decreased price, but they would have to leave out a huge pile of risky innovation.

Maybe AMD should have done the same with FX? 

I'm not speaking for everybody, but my specific uses are videogames and those RTX cores would net me more benefit than 12 or 28 more cpu cores. Try switching the word "Nvidia" with "AMD" in your statement to see if it fits better.


----------



## ZealotKi11er

rluker5 said:


> The thing about "cutting edge" is it is inherently in the experimental and development stage.
> Maybe you are expecting "cutting edge" to be well established, traditional, reliable and boring like the very last gen from nvidia - paxwell, which is an improved kepler and not much more change than what has been going on with GCN.
> 
> Nvidia could have tossed in bigger rasterization improvements or decreased price, but they would have to leave out a huge pile of risky innovation.
> 
> Maybe AMD should have done the same with FX?
> 
> I'm not speaking for everybody, but my specific uses are videogames and those RTX cores would net me more benefit than 12 or 28 more cpu cores. Try switching the word "Nvidia" with "AMD" in your statement to see if it fits better.


AMD has had a lot of tech in their GPUs that Nvidia lacked. DX12 came out 2015 and 4 years later we only have some "DX12" games because of "DXR". If games were made with DX12, ASync Compute, Primitive Shader etc than AMD would see more gains.


----------



## looniam

ZealotKi11er said:


> AMD has had a lot of tech in their GPUs that Nvidia lacked. DX12 came out 2015 and 4 years later we only have some "DX12" games because of "DXR". If games were made with DX12, ASync Compute, Primitive Shader etc than AMD would see more gains.


both companies have had and still have hardware exclusive to themselves. i don't see blaming DXR for the past lack of DX12 support in games, when there have been generations of gpus to support it.

async compute first came out during the GCN1/kepler days, and i'm almost willing to bet that there are more DXR games than games that support async compute; definitely with DX12, and maybe even including vulkan games.

take a past/historic look at tessellation; amd brought it to the table but dropped it, while nvidia is now making a living off of it. its not what tech each has but what they do with it that matters. it seems that nvidia, with its relationship w/game devs (that some would call bribery), is the one who is more successful there.

yeah that last sentence put that cart before the horse but i don't think it needs explaining . .much.


----------



## 113802

ZealotKi11er said:


> AMD has had a lot of tech in their GPUs that Nvidia lacked. DX12 came out 2015 and 4 years later we only have some "DX12" games because of "DXR". If games were made with DX12, ASync Compute, Primitive Shader etc than AMD would see more gains.


We had 40 DX12 titles before DXR was re-released in Windows 10 1809 on November 13, 2018. AMD pretty much abandoned Primitive Shaders when Raja Koduri left, so no, Primitive Shaders couldn't be used in DX12 until Navi, which enabled support.

https://www.mail-archive.com/[email protected]/msg24458.html

https://www.extremetech.com/gaming/293107-meet-rdna-amds-long-awaited-new-gpu-architecture



> Navi activates Vega’s previously implemented-but-unused support for primitive shaders, so that feature is now functional on AMD’s latest GPUs. Potential IPC is also much higher, up to 1, as opposed to GCN’s limit of 0.25.


I kinda feel cheated because they bragged about Primitive Shader support. They even removed the videos explaining Primitive Shaders from their YouTube channel when Vega was released.


----------



## rluker5

ZealotKi11er said:


> AMD has had a lot of tech in their GPUs that Nvidia lacked. DX12 came out 2015 and 4 years later we only have some "DX12" games because of "DXR". If games were made with DX12, ASync Compute, Primitive Shader etc than AMD would see more gains.


I agree. 
I still like what Nvidia did with Turing and expect big Ampere to be a nice upgrade from my Pascal. With all games.


----------



## wingman99

Asynchronous compute is for compute workloads, not for graphics.


----------



## tpi2007

wingman99 said:


> Asynchronous compute is for compute workloads, not for graphics.



It's the ability "to cram compute and graphics workloads together".

https://pcper.com/2016/07/whats-asynchronous-compute-3dmark-time-spy-controversy/


----------



## wingman99

tpi2007 said:


> It's the ability "to cram compute and graphics workloads together".
> 
> https://pcper.com/2016/07/whats-asynchronous-compute-3dmark-time-spy-controversy/


Yes, I know that. I only game with my graphics cards and couldn't care less about doing any asynchronous compute workload on my PC.


----------



## tpi2007

wingman99 said:


> Yes I know that. I'm a gamer only with my graphics cards and could care less about doing any asynchronous compute workload on my PC.



Your first statement was inaccurate, hence the quote and link I provided, and so is this one. Async compute is used in some games to compute visual effects. Maybe not the games you play, but that's another thing.

Doom (2016), AoTS, Strange Brigade, Wolfenstein II, Sniper Elite 4, Gears 5 and Far Cry 5 all use it:

https://www.eurogamer.net/articles/...n-patch-shows-game-changing-performance-gains
https://www.tomshardware.com/review...ute-multi-adapter-power-consumption,4479.html
https://www.techspot.com/article/1685-strange-brigade-benchmarks/
https://www.overclock3d.net/reviews/software/wolfenstein_ii_the_new_colossus_pc_performance_review/3
https://www.guru3d.com/articles-pag...graphics-performance-benchmark-review,10.html
https://www.guru3d.com/news-story/gears-5-is-locked-loaded-with-amd-technology-features.html
http://advances.realtimerendering.c...g an Open World in Far Cry 5 (With Notes).pdf


----------



## wingman99

tpi2007 said:


> Your first statement was inaccurate, hence the quote and link I provided and so is this one. Async compute is used in some games to compute visual effects. Maybe not the games you play, but that's another thing.
> 
> Doom (2016), AoTS, Strange Brigade, Wolfenstein II, Sniper Elite 4, Gears 5 and Far Cry 5 all use it:
> 
> https://www.eurogamer.net/articles/...n-patch-shows-game-changing-performance-gains
> https://www.tomshardware.com/review...ute-multi-adapter-power-consumption,4479.html
> https://www.techspot.com/article/1685-strange-brigade-benchmarks/
> https://www.overclock3d.net/reviews/software/wolfenstein_ii_the_new_colossus_pc_performance_review/3
> https://www.guru3d.com/articles-pag...graphics-performance-benchmark-review,10.html
> https://www.guru3d.com/news-story/gears-5-is-locked-loaded-with-amd-technology-features.html
> http://advances.realtimerendering.c...g an Open World in Far Cry 5 (With Notes).pdf


Thanks for the links.


----------



## tpi2007

wingman99 said:


> Thanks for the links.



You're welcome.


----------



## guttheslayer

skupples said:


> and yes there's definitely room between 2080ti and titan for yet another super segmentation/sku replacement. the lulz we'll have seeing some folks race to spend $1,500 on 2-5% perf increase



That will not happen, and neither will a full Turing refresh.


Any Turing refresh would have to be based on a smaller node. At 12nm, the Turing TU102 is already at the maximum reticle size at 754mm^2; there is no room left for a refresh.


Either they just come out with a 2080 Ti S and leave it till 2021, or it's 7nm Turing (which NV will not call a refresh, but Ampere) out in 2020. Simple as that. Nothing else.


Please stop talking about a Turing refresh. The 2080 Ti S will probably be the last card we see before 7nm comes in.


----------



## skupples

lol! just said this in a different thread.

i'm betting 2020/2021 will play out like the Kepler >> Maxwell drop, as far as pacing & such goes. 

2080 Ti S drops, then a few months later the smallest of the EUV chips drops, then the gaming cards start dropping, then the flagships. I doubt we'll see a 2080ti replacement (other than the S) in March 2020. March 2020 would be the $150 baby card drop.


----------



## guttheslayer

https://www.tweaktown.com/news/68689/nvidia-geforce-rtx-3080-ti-june-2020-earlier/index.html


Ampere is coming in june 2020,

RTX 3070 - 12GB
RTX 3080 - 12GB
RTX 3080 Ti - 16GB


----------



## Hydroplane

guttheslayer said:


> https://www.tweaktown.com/news/68689/nvidia-geforce-rtx-3080-ti-june-2020-earlier/index.html
> 
> 
> Ampere is coming in june 2020,
> 
> RTX 3070 - 12GB
> RTX 3080 - 12GB
> RTX 3080 Ti - 16GB


But when will the Ampere Titan release?


----------



## skupples

"AMD will be unleashing RDNA 2-based "NVIDIA Killer" GPUs in 2020, which is something I'm hearing from my sources that NVIDIA is quietly preparing for behind the scenes. NVIDIA doesn't really have anything to worry about at this stage, and even for a generation or two as they're so far ahead of the game at the flagship level it's not funny."

lol, yup. NV can disappoint, and still slaughter. 

guess i'll be jumping on with pre-orders this time around. fuggit.

the absolutely massive amount of VRAM going onto everything is kinda weird; it's rare I see a game use over 8GB these days... Maybe that's where AMD can sneak in a cost-saving NV killer.


----------



## ilmazzo

RT features seem quite memory hungry, so I suppose 8GB won't be sufficient at 4K in the long run. Anyway, AMD will always provide more RAM in their SKUs; I think the top Navi 2 will be 16GB GDDR6 too... I don't think we will see HBM in consumer cards from them anymore... but at this point these are all rumors and nothing more...


----------



## skupples

ahh, duh. I've yet to do any RT testing on my 2080ti.


----------



## guttheslayer

ilmazzo said:


> RT features seems quite memory hungry so I suppose 8GB on 4k qon't be sufficient on the long run anyway amd will provide always more ram in their skus, think navi 2 top will be 16gb ddr6 too....don't think we will see hbm in consumer cards anymore from them...but at this point these are all rumors and nothing more....


Given that the Ti variant is 16GB, there is a good chance it's HBM2E, with 2 of those 8GB stacks connected together.


https://www.extremetech.com/computi...-standard-to-24gb-307gb-s-bandwidth-per-stack


There is no way NV is going for 512-bit GDDR6 (and 256-bit doesn't work for the highest-end Ti model), so if it's 16GB, it has to be HBM. Also, based on the latest HBM2E from Samsung, we could see potential bandwidth of 840GB/s from just 2 stacks of HBM alone.

https://semiengineering.com/hbm2e-the-e-stands-for-evolutionary/
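For what it's worth, the per-stack math is simple to sketch. This is just a back-of-the-envelope check, assuming the standard 1024-bit HBM interface per stack and the 3.2 Gbps/pin speed Samsung announced for HBM2E (the 840GB/s rumor would imply a slightly higher pin speed of ~3.3 Gbps):

```python
# Per-stack HBM bandwidth: 1024 data pins, each at `pin_gbps` gigabits per second.
def hbm_stack_bw_gbs(pin_gbps, pins=1024):
    return pin_gbps * pins / 8  # GB/s

per_stack = hbm_stack_bw_gbs(3.2)   # Samsung HBM2E at 3.2 Gbps/pin -> 409.6 GB/s
two_stacks = 2 * per_stack          # -> 819.2 GB/s, close to the rumored 840
```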


----------



## ilmazzo

I was talking of amd, nvidia for hbm is non existant on consumer market


----------



## guttheslayer

ilmazzo said:


> I was talking of amd, nvidia for hbm is non existant on consumer market


I am saying NV might use HBM for Ampere this time, with just 2 stacks per GPU, unless that 16GB rumor doesn't hold true.


Yields will definitely be a lot better with just 2 HBM stacks per GPU.


----------



## ilmazzo

mmmmmm

So you are implying that since they will raise the engine counts they need more bandwidth, and the only way to achieve that (aside from moar compression) is to use HBM2, because GDDR6 is already at its best?


----------



## EniGma1987

ilmazzo said:


> mmmmmm
> 
> So you are implying that since they will rise the engine counts they need more bw and the only way to achieve aside moar compression it is to use HBM2 becasue DDR6 are at their best already?



GDDR6 actually has more bandwidth per channel than HBM2 does, due to its higher clock speed. For instance, the Titan V had 27.2GB/s per memory channel (with HBM2), while the 2080 Ti has 56GB/s per channel (or 28GB/s when using 16-bit channel mode, which it isn't actually set up to do). When using GDDR, though, it is hard to go above 12 channels because of needing to maintain the proper form factor. GDDR6 technically has 2 channels per chip (which could mean you get 32 channels on a typical card-size config), but at only 16 bits per channel. These can be (and often are) combined into single 32-bit channels, just like GDDR5 and other past specifications used.

HBM2 has 8 memory channels per stack (and they are 128-bit wide channels, instead of the 32-bit/16-bit channels of GDDR6), so 4 stacks of HBM gives 32 channels to work with. You can feed more engines by having more channels.
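The per-channel numbers above follow directly from data rate times channel width; here's a quick sketch. The data rates are the shipping parts' rated speeds (14 Gbps GDDR6 on the 2080 Ti, 1.7 Gbps/pin HBM2 on the Titan V), nothing Ampere-specific:

```python
# Bandwidth of one memory channel: data rate (Gb/s per pin) x channel width (bits).
def channel_bw_gbs(data_rate_gbps, channel_bits):
    return data_rate_gbps * channel_bits / 8  # GB/s

gddr6 = channel_bw_gbs(14.0, 32)   # 2080 Ti: 14 Gbps x 32-bit  -> 56.0 GB/s
hbm2  = channel_bw_gbs(1.7, 128)   # Titan V: 1.7 Gbps x 128-bit -> 27.2 GB/s

# Totals scale with channel count:
print(11 * gddr6)  # 2080 Ti: 352-bit bus = 11 x 32-bit channels -> 616 GB/s
print(24 * hbm2)   # Titan V: 3 stacks x 8 channels -> ~652.8 GB/s
```

Which matches the cards' official 616 GB/s and 652.8 GB/s figures, so the "more bandwidth per channel, fewer channels" framing checks out.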


----------



## ZealotKi11er

guttheslayer said:


> I am talking about NV might use HBM for Ampere this time, with just 2 stack per GPU, unless that 16GB rumor doesnt hold true.
> 
> 
> Yield will definitely be alot better if it was just 2 HBM stack per GPU.


384-bit at 16-18Gbps + a larger L2 is enough for a 3080 Ti. For RT, cache is more important.


----------



## guttheslayer

ZealotKi11er said:


> 384-Bit 16-18Gbps + larger L2 is enough for 3080 Ti. For RT cache is more important.


Yes, I know, but the 16GB figure gave me the impression they were using HBM instead.

GDDR6 chips typically come in 3 capacities as far as I know: 1/1.5/2GB per module. Each chip is 32 bits wide, and 2 of them are tied to each 64-bit memory controller in the GPU.


In order to reach a round 16GB total, 8 or 16 chips are needed (you cannot hit 16GB with 1.5GB modules unless they truncate the value from 16.5GB <- this might seem very possible, but let's not digress now), which equals either a 256-bit or 512-bit bus. 256 bits works for a 3070 / 3080 but not the Ti, as the speed just isn't fast enough.


As far as we know with GDDR, NV always adopts a maximum of 384 bits on their big die, which means we usually see up to 12 modules soldered onto their flagship (Titan) cards. So for that configuration it has to be 12/18/24GB.


Unless, that is, the 3080 Ti is segmented down from an 18GB maximum configuration: lacking 1 module from its 12-module limit, 11 of the 1.5GB chips give you 16.5GB (16GB if you drop the decimals) on a 352-bit bus.
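The chip-count arithmetic above is easy to enumerate. A small sketch, where the 32-bit-per-chip width and the 1/1.5/2GB capacities are the standard GDDR6 options and "16GB" is just the rumor's number (allowing marketing to truncate 16.5 down to 16):

```python
# Enumerate GDDR6 layouts that land on (or truncate to) "16GB".
# Each chip contributes 32 bits of bus width.
configs = []
for cap_gb in (1.0, 1.5, 2.0):
    for n_chips in range(1, 17):
        total_gb = cap_gb * n_chips
        if 16 <= total_gb < 17:  # "16GB" as marketed, possibly truncated
            configs.append((n_chips, cap_gb, n_chips * 32))
            print(f"{n_chips} x {cap_gb}GB = {total_gb}GB on a {n_chips * 32}-bit bus")
```

Only three layouts come out: 16 x 1GB (512-bit), 11 x 1.5GB (16.5GB on 352-bit), and 8 x 2GB (256-bit), i.e. exactly the options the post weighs against each other.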


----------



## skupples

i know this is slightly OT, but y'all smart.

switched from DP 1.2 to HDMI 2.0 on my 10-bit ProArt, and lost the ability to select 10-bit in 4K when switching to HDMI. according to the manual, all 4 ports are identical. I switched due to a glitch I get running via the 1 DP port it has.

 what do?


----------



## looniam

http://www.3dcenter.org/news/geruec...technische-daten-zu-nvidias-ampere-generation

View attachment 306060

i'll say i really can't see HBM in the consumer market/space, too damned costly . . still.


----------



## guttheslayer

looniam said:


> http://www.3dcenter.org/news/geruec...technische-daten-zu-nvidias-ampere-generation
> 
> View attachment 306060
> 
> 
> i'll say i really can't see HBM in the consumer market/space, too damned costly . . still.


Alright guys, we have some very juicy info right here, especially the part about 8 GPCs with 1024 CUDA cores per GPC. In that case the configuration for the different dies would be:

GA100 - 8 GPC (compute based)
GA102 - 6 GPC (RT based)
GA104 - 4 GPC (RT based)
GA106 - 2 GPC (RT based)

From this we can already extrapolate the die that GA100 is based on:


Tesla A100 (name not confirmed)
Process: 7nm+ EUV
Transistors: >32B
Die size: ~650mm^2
Cores: 8192 CCs + 4096 DPs + 2048 Tensors
Memory: 48 GB HBM2
Bandwidth: 1.5TB/s


That is one big monster we are seeing here.
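The core counts fall straight out of the 1024-cores-per-GPC figure. A quick sanity check, keeping in mind the die list and per-GPC count are the rumored 3DCenter numbers, not confirmed specs:

```python
# Rumored GPC counts per die; 1024 CUDA cores per GPC per the 3DCenter leak.
CORES_PER_GPC = 1024
gpc_counts = {"GA100": 8, "GA102": 6, "GA104": 4, "GA106": 2}

cuda_cores = {die: n * CORES_PER_GPC for die, n in gpc_counts.items()}
print(cuda_cores)  # {'GA100': 8192, 'GA102': 6144, 'GA104': 4096, 'GA106': 2048}
```

GA100's 8192 matches the "8192 CCs" line in the extrapolated spec above.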


----------



## EniGma1987

Can you even go above 4 stacks on HBM? I've never seen it done or talked about before, but a 4-stack limit means a 4096-bit memory bus limit. Anything higher doesn't work if you can't get above 4 stacks.
Even future HBM3 maintains a 1024-bit per-stack interface while going up to 12-high stacks.


----------



## guttheslayer

EniGma1987 said:


> can you even go above 4 stacks on HBM? Never seen it done or talked about before, but a 4 stack limit means 4096-bit memory bus limit. Anything higher doesnt work if you cant get above 4 stacks.
> Even future HBM3 maintains a 1024-bit per stack interface while going up to 12 high chips.


Which is why some suspect it's a dual-GPU MCM, where each chip has 3 stacks of HBM and you "glue" both chips together.


If you ask me, 64GB of HBM2E @ 4096 bits at 1.6 TB/s would have been a lot more believable as well.


----------



## EarlZ

Prices will still be high unless AMD has something to offer that exceeds the 3080 Ti at a lower price point; otherwise it's still gonna be a $1200 GPU.


----------



## doom26464

EarlZ said:


> Prices will still be high unless AMD has something to offer that exceeds the 3080Ti at a lower price point, else is still gonna be a $1200 GPU.


Yeah, this is still what bothers me. On launch this has the potential to still be a $1700 CAD GPU, which is pure nonsense to buy. 

My only hope is that the 3080 or 3070 has insanely good performance and comes in below $900 CAD. I'm not willing to pay any more, as that's just financial nonsense just to play games. 


Usually I buy last-gen flagships on the used market and get nice upgrades without breaking the bank, but the 2080 Ti is such a meh flagship and was so overpriced at launch that I'm not sure how it will fare on the used market come next gen. Not worth going 1080 Ti to 2080 Ti IMO. Probably better to jump right to a 3080 or 3070.


----------



## guttheslayer

EarlZ said:


> Prices will still be high unless AMD has something to offer that exceeds the 3080Ti at a lower price point, else is still gonna be a $1200 GPU.


You are not wrong, given its size and the new EUV process.


NV has 2 paths to walk down now:

1) $1200 for a 3080 Ti released in 2020,
2) $800 for a 3080 Ti released in 2021 (the 3080 would be the flagship for 2020).

Either way it's a lose-lose lol. We either wait longer for the cheaper option, or pay a premium to access these powerhouses earlier.


----------



## skupples

doom26464 said:


> Yah this is still what bothers me. On launch this has potential to still be a 1700CAN gpu which is pure nonsense to buy.
> 
> My only hope is the 3080 or 3070 has insanely good performance and be below 900CAN. Im not willing to pay anymore as that's just financial nonsense to just play games.
> 
> 
> Usually I buy last gen flag ships on used market, and get nice upgrades without breaking the bank but 2080ti are such meh flagship and are so overpriced on launch im not sure how they will fare on used market come next gen. Not worth it doing 1080ti to 2080ti IMO. Probably better to jump right to a 3080 or 3070.


yep, i'll be ditching my 2080ti the day they announce an actual release date. I expect 'em to fall to ~$700 pretty quick - $100 less than the 3080's MSRP, because the 3080 will be as fast, or a bit faster, with better RT capabilities.


----------



## keikei

Well, the source has a decent rate of rumors being true. Wasn't Turing just released?! Ampere's successor = Hopper? https://www.techpowerup.com/261164/nvidia-ampere-successor-reportedly-codenamed-hopper


----------



## ThrashZone

Hi,
Yep, nvidia likes that $1,200 US price point way too much to lower it 
First the Titans, now the top Ti series.

I suppose buyers are the ultimate reason for, and problem with, the price point


----------



## skupples

titan started @ $999


----------



## ThrashZone

Hi,
...2080 Super duper line next


----------



## ThrashZone

skupples said:


> titan started @ $999


Hi,
I've been screwing around with EVGA too long I guess; the first one I remember was a Hybrid, and boom, $1,200 US around the 980 time


----------



## skupples

ThrashZone said:


> Hi,
> I've been screwing around with evga too long I guess first one I remember was a hybrid and boom 1200.us around the 980 time


GK110 (the chip for the 780 Ti and first Titan) = $999 MSRP. I was lucky enough to get them from CompUSA @ MSRP.  

here's a question... would Nvidia earn more customers by taking the Ti down to $999, or would the same folks still shout "too damn high"? (prices are always too damn high, on everything, always, of course)


----------



## ThrashZone

Hi,
There will always be the "way too much" crowd 
But nvidia knows its buyers way too well now to lower prices in any meaningful way, beyond saying that's why there is a 2080../2070../....


----------



## skupples

work harder, afford nicer toys. idk. 

even at 50-60k a year, these GPUs become quite affordable, especially if you flip 'em right.


----------



## Bart

Staying behind the curve by one generation used to be a decent way to save money, but even those days are over. I was content to stay one gen behind, all the time. I waited until the 1080s were out before buying 980s for the first time, and saved a ton that way. But these days, due to the complete lack of competition, the top cards hold their value for so much longer than they used to. I waited as long as I could before buying a pair of 1080 Tis, WELL into the 2080 life cycle, and the lowest price I could get was $875 CAD each. Now look at the 2080 Ti: it's been out for quite a while, and you still can't find a new one cheap. The lowest price I've seen for non-A chips was just under $1400 CAD, and I'm not paying that for an incremental upgrade. Until competition returns to this market, we're all pooched.


----------



## skupples

uh... i guess...

i sold my original Titans for $800 AFTER the 970 released, and it took less than a day for them to go. 

n my 2080ti came in for $950.


----------



## ThrashZone

skupples said:


> work harder, afford nicer toys. idk.
> 
> even at 50-60k a year, these GPUs become quite affordable, specially if you flip em right.


Hi,
Nope, not the issue 
The issue really is China 
One either cares what's going on or one doesn't.

Could have bought two 2080 Tis at launch for both builds, just didn't see a real reason why.
The launch had way too many space-invader cards.
I don't live for the privilege of RMA; only the shipper profits


----------



## ToTheSun!

skupples said:


> n my 2080ti came in for $950.


What model did you get?


----------



## skupples

used SC.


----------



## ToTheSun!

skupples said:


> used SC.


Neat. I just got a refund for my 11 month old 2080, and I'm thinking of grabbing a 2080ti AMP for 100€ more.


----------



## Bart

I wish I could bring myself to buy used, but I won't do that for GPUs (because trust issues, LOL). Mining scared me off good.


----------



## ThrashZone

Hi,
Used cards in this price range are way out of my comfort zone 
Surprised how many manufacturers do not transfer warranties; nvidia especially doesn't.
EVGA is the only one I know of to transfer the warranty, and RMA is already tough if there are scratches/... Not sure whether Zotac/MSI/.. transfer warranties or not.

Best to buy new is my policy.


----------



## Nizzen

ThrashZone said:


> Hi,
> Used cards in this price range is way out of my comfort zone
> Surprised how many manufactures do not transfer warranties nvidia especially doesn't
> EVGA is the only know to transfer warranty rma is already tough if scratches/... not sure about zotac/ msi/.. if they transfer warranties or not.
> 
> Best to buy new is my policy.


Best is to buy in Norway! 5 years of "return if broken" warranty no matter the brand  Transferring to a new owner is no problem; you just need the original bill


----------



## skupples

Bart said:


> I wish I could bring myself to buy used, but I won't do that for GPUs (because trust issues, LOL). Mining scared me off good.


just stick with EVGA; they accept an ebay receipt as POP and will accept the RMA. I've been doing this since selling my Kepler Titans, and it works out quite well. I end up spending next to nothing on hardware, aside from the buy-in. Like, my 1080 Tis were $500 each, used blocks were under $100, and they'll go for $500 Buy It Now within 24 hours of listing them this weekend.


----------



## Clos

skupples said:


> just stick with EVGA, they accept ebay receipt as POP, and will accept RMA. I've been doing this since selling my keplar titans, works out quite well. I end up spending next to nothing on hardware, aside from the buy in. Like, my 1080tis were $500 each, used blocks were under $100, n they'll go for $500 Buy It Now, within 24 hours of listing them this weekend.


Quote for truth. I JUST RMA'd my 1080 Ti Hybrid that I bought off eBay. Uploaded my eBay/PayPal receipt and BAM: warranty. Still have 400+ days left. The only thing about a second-hand warranty is they won't allow advanced RMA; you have to send yours in before they send you one. And I can live with that.


----------



## doom26464

I can't see Nvidia dropping prices on next gen; they already raised the bar and found people still bought 'em at stupid high prices, so they will do the same again even if they're cheaper to make.


Only thing that will change prices is competition which isn't going to happen any time soon.


----------



## Bart

Nice, I never knew EVGA was so good on the warranty side. I tend to buy the cheapest cards I can get my hands on, never considering warranty. I rarely sell anything (needs to change), so this was never a factor for me. It's like I'm genetically predisposed to wasting money!


----------



## skupples

I'm the kinda person to only allow myself to get burned once.

EVGA FTW until they give me a reason otherwise, at least for used card games. 

though, for 3080ti, i'll probably go NV direct FEs.


----------



## Sheyster

skupples said:


> titan started @ $999


Ah.. The good old days! 2 please!


----------



## keikei

Bart said:


> Nice, I never knew EVGA was so good on the warranty side. I tend to buy the cheapest cards I can get my hands on, never considering warranty. I rarely sell anything (needs to change), so this was never a factor for me. It's like I'm genetically predisposed to wasting money!



Given the current trend, we're seeing yearly cycles. Who cares about a warranty when the new hotness drops so relatively fast? It's like smartphones. I guess it's great for the perpetual upgrader/epeener.


----------



## skupples

Sheyster said:


> Ah.. The good old days! 2 please!


at least we don't get roasted for buying them like we used to.


----------



## Bart

I'll consider myself roasted when I spend over $1000 on a GPU. Hasn't happened yet, but it's coming.


----------



## Sheyster

skupples said:


> at least we don't get roasted for buying them like we used to.


Well, it just doesn't make much sense to buy 2 anymore, unless you happen to play one of the few games with decent scaling (or support at all) in SLI.


----------



## skupples

keikei said:


> Given the current trend, we're seeing yearly cycles. Who cares about a warranty when the new hotness drops so relatively fast. Its like smart phones. I guess its great for the perpetual upgrader/epeener.


"current trend" when has it not been this way? 

i'd say the folks who care about warranties are the ones that get brand new cards that let out the magic smoke within an hour of use... like Turing @ launch.


----------



## Bart

Sheyster said:


> Well, it just doesn't make much sense to buy 2 anymore, unless you happen to play one of the few games with decent scaling (or support at all) in SLI.


But they're like kittens, you can't get just one. They need a buddy.  Seriously though, the main reason I like two is because I prefer big cases with massive cooling. Single GPU rigs look kinda small in big cases, especially on full ATX boards. Silly reasoning to be sure, and not worth the money unless you're a benchmark numbers type. But once you get into custom loops, you do silly things just for aesthetics.


----------



## Hydroplane

Bart said:


> But they're like kittens, you can't get just one. They need a buddy.  Seriously though, the main reason I like two is because I prefer big cases with massive cooling. Single GPU rigs look kinda small in big cases, especially on full ATX boards. Silly reasoning to be sure, and not worth the money unless you're a benchmark numbers type. But once you get into custom loops, you do silly things just for aesthetics.


Me too, don't want to go back to 1, E-ATX board looks empty lol


----------



## Bart

Hydroplane said:


> Me too, don't want to go back to 1, E-ATX board looks empty lol


Nice build, I see by your sig that you understand, LOL! I did build one extremely pretty system on a white TT P3, but with only one 420 rad, it can heat up enough to get loud, even with a single GPU. People underestimate how much rad space you need if you want peace and quiet.


----------



## skupples

and will still cost you less than that nitro/lipo RC and drone habit.

wonder if the new star wars game will get some mgpu support? the vertical slice ran in ue4 w/ quad titans or some squat, supposedly.


----------



## ahnafakeef

skupples said:


> i know this is slightly OT, but y'all smart.
> 
> switched from DP1.2 to HDMI2.0 on my 10bit pro art, and lost the ability to select 10bit in 4K when switching to hdmi. according to the manual, all 4 ports are identical. I switched due a glitch I get running via the 1 dp port it has.
> 
> what do?


I can't select anything higher than 8-bit on my C9 via HDMI either. Did you find a solution for this?


----------



## skupples

nope, never did get a definitive answer. My gut tells me it's the stupid HDMI port on the 2080 Ti? Your response rules out it being my monitor.

Guess I'll have to google more. Is HDMI 8-bit somehow inherently different from DP? Seems like you'd notice banding on that beast of a screen if a lack of colors were an issue.


----------



## ahnafakeef

skupples said:


> nope, never did get a definite answer. My gut tells me its the stupid hdmi port on the 2080ti? Your response rules out it being my monitor.
> 
> Guess i'll have to google more. Is HDMI 8bit some how inherently different than DP? seems like you'd notice banding on that beast of a screen if lack of colors were an issue .


I'm pretty sure that I was using 10-bit 4K 60Hz RGB (no HDR) on my Benq BL3201PT via miniDP. So I don't think it's your monitor. Perhaps it only works via DP/miniDP and not HDMI.

This comment suggests the same: https://forums.overclockers.co.uk/t...-nvidia-control-panel.18825061/#post-31950782

Googled the issue and the first result was one of my own posts on Reddit: https://www.reddit.com/r/OLED/comments/dek9iz/can_the_lg_c9_do_4k_60hz_rgb444_10bit_via_hdmi/

Apparently, it is indeed a limitation of HDMI 2.0b. We need to drop from RGB to YCbCr 4:2:2 to get 10-bit. But apparently that shouldn't matter for movies and games, because they are mastered in 4:2:0 anyway. The only downside would be color fringing on text in the Windows environment. I experienced the color fringing firsthand just once, and I must say it is significantly worse for general PC usage than some people make it out to be.
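The arithmetic behind that HDMI 2.0 limitation is quick to check. This is a simplified sketch, assuming the standard CTA-861 594 MHz pixel clock for 4K60 (blanking included) and HDMI 2.0's 18 Gbps TMDS rate reduced by 8b/10b coding:

```python
# Video data rate = pixel clock x bits per channel x number of channels.
def data_rate_gbps(pixel_clock_mhz, bits_per_channel, channels=3):
    return pixel_clock_mhz * bits_per_channel * channels / 1000

HDMI20_PAYLOAD_GBPS = 18 * 8 / 10  # 14.4 Gbps payload after 8b/10b TMDS coding

rgb8  = data_rate_gbps(594, 8)   # 4K60 RGB 8-bit  -> 14.256 Gbps, just fits
rgb10 = data_rate_gbps(594, 10)  # 4K60 RGB 10-bit -> 17.82 Gbps, does not fit
print(rgb8 <= HDMI20_PAYLOAD_GBPS, rgb10 <= HDMI20_PAYLOAD_GBPS)  # True False
```

So 4K60 RGB 8-bit squeezes in with almost nothing to spare, while 10-bit overshoots by ~3.4 Gbps, which is why the GPU forces chroma subsampling (4:2:2/4:2:0) to offer 10-bit over HDMI 2.0.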

BUT, look here: https://forum.doom9.org/showthread.php?p=1883022#post1883022

This comment suggests that the 8-bit 4:4:4 is better than 10-bit on the C9 for some reason, at least when watching movies.

As for my own experience with banding, I've watched quite a few movies and played through almost half of Shadow of the Tomb Raider on the C9, and I don't think I've experienced much banding, if any. I mean, I'd definitely remember and troubleshoot it if I were experiencing extreme cases of banding consistently across all content. Perhaps there's some dithering going on in the C9 that's helping out? I wonder how accurate the color reproduction is though, given 8-bit has only 16 million colors as opposed to the 1 billion of 10-bit.

Sorry if this is too much information. I learned all this when I looked into it recently before purchasing my C9. I thought it might help you too.


----------



## skupples

no worries  

yeah, that's what I figured. I figured it was a limitation of the HDMI port on the GPU. So, it should go away next time around.


----------



## Asmodian

ahnafakeef said:


> This comment suggests that the 8-bit 4:4:4 is better than 10-bit on the C9 for some reason, at least when watching movies.


It is pretty easy to test. Switching back and forth between 8-bit and 10-bit output to my C9 (YCbCr 4:4:4 and PC mode) with a banding test pattern shows obviously worse banding with 10-bit input. Using any HDMI mode on the TV that subsamples chroma to 4:2:2 looks fine with 10-bit input (anything but Game or PC), but subsampled chroma is bad. Using RGB input has similar banding at either bit depth in PC/Game mode.



ahnafakeef said:


> As for my own experience with banding, I've watched quite a few movies and played through almost half of Shadow of the Tomb Raider on the C9, and I don't think I've experienced much of banding, if any. I mean, I'd definitely remember and troubleshoot it if I were experiencing extreme cases of banding consistently across all content. Perhaps there's some dithering going on in the C9 that's helping out? I wonder how accurate the color reproduction is though, given 8-bit has only 16 million colors as opposed to the 1 billion of 10-bit.


8-bit is definitely better for my C9. The accuracy of the color does not change between 8 and 10 bit. It simply does not change at all. The only difference is less dithering noise (hopefully) or less banding if the video path is terrible. Tomb Raider, and all games I have tried, do seem to dither, so on the C9 using 8-bit is simply higher quality than 10-bit. The extra colors are tiny steps between colors, not new colors. Color accuracy is about displaying the right colors at 100% red/green/blue and having the correct spacing between colors. My C9 measures exactly the same color accuracy in 10-bit and 8-bit, with an i1D3 calibrated with an i1Pro 2.

The only time this is not true is when supplying subsampled video, like from a Blu-ray player or similar. I am not sure which is better in the standard video mode; when subsampling, 10-bit and 8-bit input are indistinguishable with my banding tests.
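That dithering point can be shown numerically: adding noise before quantizing to 8 bits preserves the 10-bit average, trading hard bands for fine noise. A toy sketch in plain Python (illustrative only, not how the C9 actually processes video):

```python
import random

random.seed(0)

# A smooth 10-bit horizontal ramp (values 0..1023), as a game engine might render it.
ramp10 = [i * 1023.0 / 3839 for i in range(3840)]

# Naive truncation to 8 bits: only 256 distinct steps -> visible banding.
truncated = [int(v / 4.0) for v in ramp10]

# Dithered 8-bit: add noise before rounding so the *average* keeps the
# extra precision, replacing visible bands with fine noise.
dithered = [int(v / 4.0 + random.random()) for v in ramp10]

mean_err = abs(sum(d * 4 for d in dithered) / len(dithered) - sum(ramp10) / len(ramp10))
print(len(set(truncated)))  # 256 hard steps
print(mean_err < 1.0)       # dithered signal keeps the 10-bit mean -> True
```

Averaged over the screen (or over time), the dithered 8-bit signal carries essentially the same information as the 10-bit original, which is why games that dither look clean at 8-bit.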


----------



## Aristotelian

Bart said:


> Nice build, I see by your sig that you understand, LOL! I did build one extremely pretty system on a white TT P3, but with only one 420 rad, it can heat up enough to get loud, even with a single GPU. People underestimate how much rad space you need if you want peace and quiet.


Well, I lurk here a lot but:

I think in 2015 I bought a Thermaltake Core X and 3x 480mm EK radiators with a pretty cool fan controller. I have been waiting to 'pull the trigger' on a rig where the monitor would be 27 inches (or larger) 4K, 10-bit 4:4:4, and at least 100Hz. So I'm waiting for the new HDMI protocol and the monitors.

Yet I look at the 3 480mm and there's probably no way I can fit all 3 in that case, period. 

I think I'm going to have to end up driving to Rochester to visit this case modding company with all the parts and then asking if they'd be able to build this for me. Am I wussing out? 

I want peace and quiet - an elite-ish gaming PC. But I think I may have overestimated the rad space. 

Just a quick thanks to the community on this forum at large for the detailed analysis and guides. Your comment really urged me to reply.


----------



## skupples

there are very few cases on the market that'll fit 3x 480s of any thickness without extensive modding.

this is the main reason why CaseLabs was so loved, and why I'll never part with my STH10.

the only other thing I'd ever build would be an SFF with external radiator tower.


----------



## Aristotelian

That's what I'm looking at now: somehow making an external radiator tower that is 'plug and play' to the case setup, but doing this 'aside' so that the actual case (piping, cabling, etc) can be super neat and aesthetic. 

And I'm really hoping Nvidia hits it out of the park with Ampere...a seriously strong Ti that can push lots of frames at 4K, even for, say, a triple-A game like Cyberpunk at release.


----------



## Bart

I'm going to step down from my Case Labs M8 cube with pedestal, just to work in something a bit smaller but still with decent cooling (triple 360s in a Lian Li PC011 Dynamic XL). But I will never sell that case, ever. That behemoth can hold up to six 360 rads with ease (7 if you're insane), of nearly ANY thickness. I have a pair of 88mm thick Alphacool Monsta 360s in the pedestal, and I could fit another PAIR of those things in the PSU compartment, and still have room for a full HDD cage and a PSU. You just don't get that level of modularity and flexibility anymore.

Aristotelian: there is no such thing as too much rad space. That's like too much sex or too much money, it just doesn't exist.  As for having someone do the modding, it's not 'wussing out' if you don't want to cut up a case, or don't have the knowledge or time. But before you do that, maybe take a step back and re-assess your needs. Cases aren't that expensive, and since you haven't built the rig yet, it might be smart to change cases now and sell off the 480s before they're used. Or at the very least, do some test fitting with the parts. Are you sure they won't fit?


----------



## Aristotelian

The 'needs' are hard to calculate a priori - I mean, I guess I can estimate what the full TDP of a highly overclocked Ampere Ti would be. I'm still torn between SLI and not - more likely not, since SLI performance tends to get patched in later and I love day-1 performance.

Between that and perhaps a higher core count HEDT - overclocked - it doesn't seem like I'd need too much. But, like you said - there's no such thing as too much. And my once fairly conservative wife is now singing (we just moved to NYC) 'I like big bucks and I cannot lie' to the tune of the Sir Mixalot classic. She's very supportive of this build too. 

I am rocking a 2600k (overclocked) with a 1070, on a 1440p Dell monitor from 2010 or so (60Hz). So you can imagine how much I'm looking forward to CES next year and other hardware announcements for this build...


----------



## Sheyster

Aristotelian said:


> Well, I lurk here a lot but:
> 
> I think in 2015 I bought a Thermaltake Core X and 3x480mm EK radiators with a pretty cool fan controller. I have been waiting to 'pull the plug' on a rig where the monitor would be 27 inches (or higher) 4k, 10bit 4:4:4, and at least 100Hz. So I'm waiting for the new HDMI protocol and the monitors.
> 
> Yet I look at the 3 480mm and there's probably no way I can fit all 3 in that case, period.
> 
> I think I'm going to have to end up driving to Rochester to visit this case modding company with all the parts and then asking if they'd be able to build this for me. Am I wussing out?
> 
> I want peace and quiet - an elite-ish gaming PC. But I think I may have overestimated the rad space.
> 
> Just a quick thanks to the community on this forum at large for the detailed analysis and guides. Your comment really urged me to reply


If it's a Core X9 it can easily handle 3 480's. If you're concerned about it just roll with 2 480's on top and sell the third rad.


----------



## skupples

noooo don't give them money for knock off caselabs


----------



## Sheyster

skupples said:


> noooo don't give them money for knock off caselabs


I believe he stated he already did, a few years ago!


----------



## skupples

Still, pretty sure that's the smaller one; only two will fit up top. You'd need the W200 for the front rad support.


----------



## ZealotKi11er

guttheslayer said:


> You are not wrong, given its size and new EUV.
> 
> 
> NV have 2 paths to walk down now:
> 
> 1) $1200 for 3080 Ti release in 2020,
> 2) $800 for 3080 Ti release in 2021. (3080 will be flagship for 2020)
> 
> Either way its a lose-lose lol. We sacrifice waiting time for cheaper option, or we pay premium to access these powerhouse earlier.


I do not see how a 7nm RTX 3080 Ti could cost anything less than what the 2080 Ti does. I am worried it will be more.


----------



## EniGma1987

ZealotKi11er said:


> I do not see how 7nm RTX 3080 Ti could be anything less than what 2080 Ti. I am worried it will be more.



Final cost to build should be less, since yields should be higher due to the use of single patterning and a light source that can create sharper transistor features, which means fewer defects.
Though Nvidia may simply want to keep the extra profit and not lower the price to the end user.


----------



## skupples

I don't expect the price to drop at all. In fact, I'll be surprised if MSRP doesn't go up by $100.


----------



## ilmazzo

The more you spend, the more you save

makes sense a 3080ti C.E.O. edition at 1499 MSRP : Cashback Edition Orgasm


----------



## Aristotelian

It is a Core X9. I'll see how this works out. I bought it back then because I had some kind of 'stackable' fantasy, but then the case came and I realized how impractical that is. You end up getting a case in person that you bought online and then see that...if you had rads at the top, the clearance near the video card is pretty tight. At the side (bottom) I could do one on one side....but it won't be 'clean' to do more than 1 as far as I can plan it out with the standard case.

Perhaps a radiator tower is the solution. I'm in touch with a case modding company here that is currently busy with CES but I'll try to get something going with them because I don't have any of the skill - in particular the artistic kind to make it work. As my wife said 'yeah if you want a Diablo theme lots of red and black won't cut it - what about the art?'

On the 3080Ti - I fully expect it to cost a lot because...Nvidia can. They and other companies aren't into charitable product releases so...if they take single GPU gaming crown again and announce at the same time as Cyberpunk 2077 or another AAA release there'll be a long line of people saying 'take my money, please'.


----------



## Blze001

EniGma1987 said:


> Final cost to build should be less since yields should be higher due to use of single patterning and a light source that can create sharper transistor features which means less defects.
> Though Nvidia may simply want to have the extra profit and not lower the price to the end user.


Nvidia is gonna raise the price even higher, I bet.


----------



## Sheyster

Aristotelian said:


> It is a Core X9. I'll see how this works out. I bought it back then because I had some kind of 'stackable' fantasy but then the case came and I realize how impractical that is. You end up getting a case in person that you buy online and then see that...if you had rads at the top then the clearance near the video card is pretty tight. At the side (bottom) I could do one on one side....but it won't be 'clean' to do more than 1 as far as I can plan it out with the standard case.


It's possible to have dual Monsta 480's in the case if you drill some holes and use an external shroud for the fans on top. Two standard 480 radiators should not be an issue at all.


----------



## skupples

but why would anyone use Monstas in this day and age? They look badass, but they don't cool all that well, unless you like LOTS of noise.

I'd like to note, out of all the radiators I've purchased, Monstas always fail first, and always from the same spot (the soldered-in threading).


----------



## Sheyster

skupples said:


> but why would anyone use monsta's in this day and age? they look bad ass, but they don't cool all that well, unless you like LOTS of noise.]
> 
> I'd like to note, out of all the radiators I've purchased. Monsta's always fail first, and always from the same spot. (the soldered in threading)


I was not advocating using them at all. The OP had mentioned space limitations above the video cards in the X9, so I wanted to use them as an example due to their girthiness.  If you can handle the Monsta you can handle anything.


----------



## skupples

monsta 80 + push / pull deltas = still meh


----------



## Aristotelian

I don't want to disagree with you pros, but:

I have 3 EK 480mm CoolStreams and I remember (at least back in 2016) they were rated very well. EDIT: I think I got them after reading the thermalbench review. They are 60mm thick. Even without fans, if I put two at the top of the case I'm certain there wouldn't be clearance for pipes from the video card if I still mount it vertically like that. The space between the motherboard and the top isn't all that great at all, let alone with the rads there. I mean, I'll get rid of my old Noctua cooler (NH-D15) for the watercooling too, but even with a CPU waterblock the rads...they're just enormous already (to me) at 60mm thick.

Anyone want to bet when a 3080Ti might come out?


----------



## skupples

The 3080 Ti will come out this time next year. First-wave Ampere will be seen in June, possibly all the way up to the 3080, depending on what AMD's up to at that point.


----------



## Bart

I have a pair of Monsta 360s in my current box, but I don't notice the cooling performance. They never leaked though. But like every Alphacool rad, they were filthy! I remember flushing those stupid things an ABSURD amount of times trying to get all the crap out of them. Pretty sure I gave up on the 10th rinse, with little black things STILL coming out, and just tossed em in the loop anyway.


----------



## guttheslayer

skupples said:


> 3080ti will come out this time next year. First wave ampere will be seen in june, possibly all the way up to 3080, depending on what AMD's up to at that point.


I doubt the 3080 Ti will be released this soon this time. I'd say it will be 2021. 2020 will be the year of the 3080 and 3070. We might be able to see the Titan iteration by the end of 2020.


It will take some time before big-die (~600mm^2) 7nm+ yields become affordable for most consumers.


----------



## skupples

let me dream, and valid point on them possibly trying to sell us Titans first. That's definitely something in their playbook from the past. 

2020, the year of the $800 2080ti with better RT.


----------



## guttheslayer

skupples said:


> let me dream, and valid point on them possibly trying to sell us Titans first. That's definitely something in their playbook from the past.
> 
> 2020, the year of the $800 2080ti with better RT.


2080ti?

It will most likely turn out to be an $800 3080 with better RT (RT might double per SM, in which case we'd see >80% better performance than the 2080 Ti in RT scenarios). At that price I won't say it will be a good product, but it's definitely not terrible like the 2000-series cards.


----------



## skupples

that's what I mean. The 3080 will be 2080ti perf. w/ better RT. 

that's typically how it goes.


----------



## guttheslayer

skupples said:


> that's what I mean. The 3080 will be 2080ti perf. w/ better RT.
> 
> that's typically how it goes.


That is not how it has always gone. 3070 = 2080 Ti is more like it. The 3080 will be 15-25% faster.

That usually happens when we see a full node jump between generations.


GTX 580 -> GTX 680
GTX 980 Ti -> GTX 1080

The small die of the new node usually outperforms the big die of the previous node by a certain margin. I expect the same for the RTX 3080, which is why there is no need for a 3080 Ti to be released this fast.


----------



## tconroy135

guttheslayer said:


> That is not how it goes all along. 3070 = 2080 Ti more like it. 3080 will be 15-25% faster.
> 
> Usually that happen when we see a full node jump between generation.
> 
> 
> GTX 580 -> GTX 680
> GTX 980 Ti -> GTX 1080
> 
> The small die of the new node usually outperform the big die of previous node, by a certain margin. I expect the same for RTX 3080. Which is why there is no need for 3080 Ti to be released this fast.


NVIDIA will limit the performance to what will sell the best. If rumors are true that the cards are cheaper, I expect they will allow for lower performance gains to obtain higher yield.


----------



## skupples

Nvidia will do as little as humanly possible if AMD still hasn't released any contenders.

However, I hope you're right. It would be nice to see another Kepler>>Maxwell tier jump.


----------



## guttheslayer

tconroy135 said:


> NVIDIA will limit the performance to what will sell the best. If rumors are true that the cards are cheaper, i expect they will allow for lower performance gains to obtain higher yield.


Not really; the cheaper prices are due to poor sales of the previous entries.


GA104 can easily outperform the bigger 2080 Ti due to the full 2x density shrink, and it will be smaller. Given the small size of the mid-range die, NV wouldn't have to resort to disabling 1 or 2 SMs like they did on the RTX 2080 (that TU104 is huge at 545 mm^2, bigger than GP102). Since the yield of EUV is reportedly good, comparable to standard 7nm, and cheaper to manufacture, NV has little reason to keep neutering performance.


Besides being cheaper, what will sell best is a new card that is a certain percentage faster than the previous flagship, especially after a full 2 years of stagnation.


NV has so far been consistent on this record: the top card of the new lineup is always ~25% faster than the previous top flagship *at the time of the release*. This was true even for the Turing release, as the 2080 Ti was 25% faster than the GTX 1080 Ti. So if NV releases something next year, rest assured it will be 25% faster.


The only question, whether the 25%-faster card is a 3080 or a 3080 Ti, remains to be seen. But if it is the latter, then we will all be badly disappointed for sure.
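The die-size side of this argument can be made concrete with the textbook Poisson die-yield model, Y = exp(-A * D0). The defect density below is a made-up illustrative number, not a published Samsung or TSMC figure:

```python
import math

def poisson_yield(area_mm2: float, defect_density_per_cm2: float) -> float:
    """Classic Poisson die-yield model: Y = exp(-A * D0)."""
    return math.exp(-(area_mm2 / 100.0) * defect_density_per_cm2)

D0 = 0.2  # defects per cm^2 -- purely illustrative assumption

big_turing = poisson_yield(754, D0)      # TU102 (2080 Ti) at 754 mm^2
shrunk     = poisson_yield(754 / 2, D0)  # same design at ~2x density

print(f"{big_turing:.2f}")  # ~0.22
print(f"{shrunk:.2f}")      # ~0.47 -- halving the die area more than doubles yield
```

Whatever the real D0, the exponential means a ~600 mm^2 die is always disproportionately expensive, which is the intuition behind shipping the mid-size die first and holding the big one back.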


----------



## BulletSponge

guttheslayer said:


> Not really the release of cheaper price is due to poor sales from previous entries.
> 
> 
> GA104 can easily outperform a bigger 2080 Ti due to the full 2x density shrink, and it will be smaller. Given the small size of the mid-range card, NV wouldn't have to resort to disabling 1 or 2 SMs like they did on RTX 2080 (that TU104 is huge at 545 mm^2, bigger than GP102). Since the yield of EUV is reported good, comparable to standard 7nm, and cheaper to manufacture, NV have little reason to continue to neutered performance.
> 
> 
> What will sell best now is beside being cheaper, the new card must be certain % faster than previous flagship gen, especially after a full 2 years of stagnant.
> 
> 
> NV so far has been consistent on the record, the top card of the new lineup is always 25% faster than the previous top flagship *at the time of the release*. This is true even for Turing release as Ti was 25% faster than GTX 1080 Ti. So if NV release something next year, rest assured it will be 25% faster.
> 
> 
> The only question is, whether the 25% faster card is a 3080, or a 3080 Ti, remains to be seen. But if it is the latter, then we will all be badly disappointed for sure.


I want this to be true, but when launch day comes and it isn't, Bullet will be here for you.


----------



## skupples

well, truly... we have two things going for us.

A.) node shrink
B.) node shrink at the same time as new consoles coming out. 

we got GK110 the last time a console released.



aaaachhhhhtuuuaaalllly.

it'll probably be just like Kepler.

drop the 3080 to get folks off the 2080 Ti, then drop a Titan shortly after. Who needs Tis anyhow? I mean, the 560 Ti was pretty cool, but the 780 Ti sucked.


----------



## wingman99

guttheslayer said:


> Not really the release of cheaper price is due to poor sales from previous entries.
> 
> 
> GA104 can easily outperform a bigger 2080 Ti due to the full 2x density shrink, and it will be smaller. Given the small size of the mid-range card, NV wouldn't have to resort to disabling 1 or 2 SMs like they did on RTX 2080 (that TU104 is huge at 545 mm^2, bigger than GP102). Since the yield of EUV is reported good, comparable to standard 7nm, and cheaper to manufacture, NV have little reason to continue to neutered performance.
> 
> 
> What will sell best now is beside being cheaper, the new card must be certain % faster than previous flagship gen, especially after a full 2 years of stagnant.
> 
> 
> NV so far has been consistent on the record, the top card of the new lineup is always 25% faster than the previous top flagship *at the time of the release*. This is true even for Turing release as Ti was 25% faster than GTX 1080 Ti. So if NV release something next year, rest assured it will be 25% faster.
> 
> 
> The only question is, whether the 25% faster card is a 3080, or a 3080 Ti, remains to be seen. But if it is the latter, then we will all be badly disappointed for sure.


I believed the generational increase was always at least 25%, until I checked and found that in the past you needed SLI scaling to get there.

The 2080 Ti was a 29% FPS increase over the 1080 Ti.
The 1080 Ti was a 50% increase over the 980 Ti.
The 980 Ti was a 40% increase over the 780 Ti; the 980 was a 32% increase over the 780.
The 780 Ti was a 26.4% increase over the dual-GPU SLI 690; the 780 was a 19.94% increase over the 680.
The dual-GPU SLI 690 was a 51% increase over the dual-GPU SLI 590; the 680 was a 41.77% increase over the 580.
The dual-GPU SLI 590 was a 74% increase over the 480; the 580 was a 17.37% increase over the 480.
The 480 was a 50% increase over the GTX 285.


----------



## tconroy135

skupples said:


> well, truly... we have two things going for us.
> 
> A.) node shrink
> B.) node shrink at the same time as new consoles coming out.
> 
> we got GK110 the last time a console released.
> 
> 
> 
> aaaachhhhhtuuuaaalllly.
> 
> it'll probably be just like keplar.
> 
> drop 3080 to get folks off of 2080ti, then drop a Titan shortly after. Who needs TIs anyhow? I mean, 560TI was pretty cool, but 780ti sucked


IDK, I think they like the tier structure. The 3080 performs the same as the 2080 Ti, but of course offers better ray-tracing performance, so there is a reason to upgrade at a lower price for anyone not gaming at 4K.

Then with the 3080 Ti, they offer a reasonable performance bump, i.e. 25%, plus a large ray-tracing bump.

What I am interested to see is whether the Titan will have an actual rasterization performance difference from the 3080 Ti, because that is what would influence my purchasing decision. Now that we are in the age of 4K, I would like to see a card that can get close to 100fps@4K in the most demanding gaming titles.

*Edit: I'm also hoping they go away from the Titan being a consumer workstation card.


----------



## Woundingchaney

It's important to note that the 2000 series didn't go over well with consumers; Nvidia saw lower-than-expected sales as well as backlash, internally and externally. Nvidia still needs to release a relevant product to keep their current consumer base tied to their offerings. The notion that performance is a secondary issue simply because AMD doesn't have a competing product doesn't really tell the whole story.

It costs quite a bit of development, manufacturing and engineering time to bring a product line to market. Nvidia is well aware that performance is virtually the only selling point for GPUs (particularly after the launch of the 2000 series) and will need something to invigorate their current consumer base. I don't see any reason to expect anything lower than a 25-30 percent performance increase for similarly tiered offerings from one generation to the next.


----------



## keikei

tconroy135 said:


> IDK, I think they like the tier structure. The 3080 performs the same as 2080Ti, but they of course offer better Ray Tracing performance than 2080Ti, so there is a reason to upgrade at a lower price for anyone not gaming at 4k.
> 
> Then with the 3080 Ti, they offer a reasonable performance bump, i.e. %25, plus a large Ray Tracing Bump.
> 
> What I am interested to see if I am right is if the Titan will have an actual rasterization performance difference from the 3080Ti because that is what would influence my purchasing selection. Now that we are in the age of 4k, *I would like to see a card that can get close to 100fps@4K in the most demanding of gaming titles.*
> 
> *Edit: I'm also hoping they go away from the Titan being a consumer workstation card.


RDR2 gets 40 fps with a 2080 Ti, and that card is an avg 40% bump over its predecessor at that res. I suspect it will take 2 more gens to hit that mark.
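For what it's worth, the compounding is easy to check, assuming the ~40% per-generation uplift were to hold (an optimistic assumption):

```python
import math

START_FPS = 40.0    # 2080 Ti in RDR2 at 4K, per the post above
TARGET_FPS = 100.0
GAIN = 1.40         # assumed 40% uplift per generation

# Smallest n with 40 * 1.4**n >= 100
n = math.ceil(math.log(TARGET_FPS / START_FPS) / math.log(GAIN))
print(n)                                # 3
print(round(START_FPS * GAIN ** 2, 1))  # 78.4 -- two gens still falls short
```

So at a steady 40% per generation it is closer to three generations than two.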


----------



## skupples

why would they go away from the Titan as a consumer workstation card with GeForce drivers? That's what it's always been...

and only a 25% bump? jeesh, ye of little faith. 25% would be as terrible as the 1080 Ti>>2080 Ti jump.



keikei said:


> RDR2 gets 40 fps with a 2080Ti and that is an avg 40% bump over its predecessor at that res. I suspect 2 more gens to hit that mark.


40fps when? with everything super maxed? it might be even lower  luckily the game EASILY tunes to wherever you want your FPS to be now, without much if any IQ hit.

folks keep repeating that bit about 100 fps @ 4K, on ultra. I don't think anything like that is going to exist until the 4 series, and even then? probably not.

Most games are designed with a 60FPS target; this is STILL the case, and will continue to be the case.

Y'all silly. That doesn't even exist when you play at 1440p144 w/ a 2080 Ti. You STILL take a graphical hit even then, if you want a consistent 144fps in most things. It's just not going to happen any time soon. Y'all are setting yourselves up for extreme disappointment if you think the next Ti is going to be a proper 4K120 card. It might have the port to handle it, but the horsepower still won't be there. Close though.


----------



## keikei

skupples said:


> why would they go away from titan as a consumer workstation card with geforce drivers? that's what its always been...
> 
> and only 25% bump? jeesh, you little faith. 25% would be as terrible as the 1080ti>>2080ti jump.
> 
> 
> 
> *40fps when? with everything super maxed? it might be even lower * luckily the game EASILY tunes to wherever you want your FPS to be now, without much if any IQ hit.
> 
> folks keep repeating that bit about 100 fps @ 4K, on ultra. I don't think anything like that is going to exist until 4 series, and even then? probably not.
> 
> Most games are designed with a 60FPS target, this is STILL the case, and will continue to be the case.
> 
> Y'all silly. That doesn't even exist when you play in 1440p144 w. a 2080ti. You STILL take a graphical hit even then, if you want a consistent 144fps in most things. It's just not going to happen any time soon. Y'all setting yourselves up for extreme disapointment if you think the next TI is going to be proper 4K120 card. It might have the port to handle it, but the horse power still won't be there. Close though.


That's the supposed average. Like with all games, it'll fluctuate depending on the scene/action.


----------



## skupples

.9 res scale, thank me later.


----------



## bigjdubb

I don't think a GPU twice as powerful as my 2080 Ti will help me get 144fps @ 1440p consistently. A big portion of the games I play are limited to around 100fps from low to ultra settings on my system. Maybe I should spec my next system for content consumption instead of content creation.


----------



## lightsout

Bart said:


> I wish I could bring myself to buy used, but I won't do that for GPUs (because trust issues, LOL). Mining scared me off good.


Really? It's the only way to buy GPU's for me. The discount is too good to pass up. Like skupples said if you are worried about warranty you can't go wrong with EVGA.


----------



## Bart

lightsout said:


> Really? It's the only way to buy GPU's for me. The discount is too good to pass up. Like skupples said if you are worried about warranty you can't go wrong with EVGA.


Yeah now that I know that about EVGA, I might change my tune. The next gen might be interesting enough to make me skip the 2080 series entirely though. Not seeing much benefit in replacing 1080TIs with 2080s, now or in the future used.


----------



## bigjdubb

I got my 2080 Ti used and haven't had any troubles with it, but it was the first time in a very long time that I purchased used computer hardware. I would be leery about buying a used GPU from someone selling 20 GPUs at the same time on eBay, but I doubt the person selling just one used GPU is an ex-miner.


----------



## skupples

Bart said:


> Yeah now that I know that about EVGA, I might change my tune. The next gen might be interesting enough to make me skip the 2080 series entirely though. Not seeing much benefit in replacing 1080TIs with 2080s, now or in the future used.


that was my plan, thus the used 1080 Tis; the 2080 Ti came along due to all the weirdness I had going on.

turns out the processor in my monitor likes to artifact on all inputs, and GeForce Experience was corkblocking games from running for unknown reasons.


----------



## guttheslayer

tconroy135 said:


> IDK, I think they like the tier structure. The 3080 performs the same as 2080Ti, but they of course offer better Ray Tracing performance than 2080Ti, so there is a reason to upgrade at a lower price for anyone not gaming at 4k.
> 
> Then with the 3080 Ti, they offer a reasonable performance bump, i.e. %25, plus a large Ray Tracing Bump.
> 
> What I am interested to see if I am right is if the Titan will have an actual rasterization performance difference from the 3080Ti because that is what would influence my purchasing selection. Now that we are in the age of 4k, I would like to see a card that can get close to 100fps@4K in the most demanding of gaming titles.
> 
> *Edit: I'm also hoping they go away from the Titan being a consumer workstation card.


The 3080 will outperform the 2080 Ti like the 1080 did the 980 Ti. Rumors have pointed out 3 main advantages: a cheaper price, better RT & *faster rasterization*.


Of the 3 factors, only the price seems a bit improbable. The rest are easily achievable.


----------



## ilmazzo

Well, for the used market, Turing introduced a new factor: when will it start letting me play Space Invaders w/o RT?


----------



## criminal

ilmazzo said:


> Well, for the used market turing introduced a new factor: when it will start let me playing space invaders w/o RT?


Used Turing is a crapshoot.


----------



## ThrashZone

Hi,
All used cards are a crapshoot; not many manufacturers allow warranty transfer, so if it goes out you're SOL.


----------



## tconroy135

guttheslayer said:


> The 3080 will outperform the 2080 Ti like how they did for 1080 to 980 Ti. Rumor has point out 3 main advantage: Cheaper price, better RT & *faster raserization*.
> 
> 
> The 3 factors, only price seem abit not possible. The rest are easily achievable.


I'll give you that the 3080 might have a slight rasterization advantage over the 2080Ti, but I'd be amazed if it was enough that some 2080Ti owners choose to buy a 3080 instead of a 3080Ti as an upgrade.


----------



## m4fox90

tconroy135 said:


> I'll give you that the 3080 might have a slight rasterization advantage over the 2080Ti, but I'd be amazed if it was enough that some 2080Ti owners choose to buy a 3080 instead of a 3080Ti as an upgrade.


After the extremely marginal difference between 2080 and 1080Ti, I think folks should temper expectations severely. Nvidia isn't going to bump up performance dramatically out of the goodness of their hearts.


----------



## Zam15

Need this by March so I'll be able to jump into Half Life Alyx, Cyberpunk, Doom Eternal, Metro Exodus (on steam), Red Dead and Death Stranding!


----------



## guttheslayer

m4fox90 said:


> After the extremely marginal difference between 2080 and 1080Ti, I think folks should temper expectations severely. Nvidia isn't going to bump up performance dramatically out of the goodness of their hearts.


There is nothing to temper when NV has delivered exactly that for the past two node jumps.


That expectation is more or less warranted after two stagnant years, and that follows a disappointing 18-month jump from the 1080 Ti to Turing.


Also, Turing's poor gains over Pascal came from NV adding RT cores, which don't benefit standard rasterization, while barely increasing the CUDA core count; furthermore, it was on almost the same node (12nm and 16nm have roughly the same logic density).


----------



## epic1337

m4fox90 said:


> After the extremely marginal difference between 2080 and 1080Ti, I think folks should temper expectations severely. Nvidia isn't going to bump up performance dramatically out of the goodness of their hearts.


Well, maybe not out of compassion, but think of the possible publicity stunt: if they can pull off a massive perf increase, they'll suddenly gain all the attention.
Then again, that doesn't mean they'll price it lower either; the price brackets of the xx70 and xx80 series went up with Turing, after all.


----------



## Sheyster

tconroy135 said:


> I'll give you that the 3080 might have a slight rasterization advantage over the 2080Ti, but I'd be amazed if it was enough that some 2080Ti owners choose to buy a 3080 instead of a 3080Ti as an upgrade.


My next card will be big Ampere. I won't jump on the 3080 no matter how "good" it is.


----------



## skupples

their next genuine slayer won't come until ~ the new consoles. 

The 3080 will be a nice card either way, though, and plenty of 2080 Ti owners will jump ship if it's even 10% faster. The terrible news here is that the 2080 Ti will have resale value as bad as the 970's the day after the 3080 drops.


----------



## mouacyk

skupples said:


> their next genuine slayer won't come until ~ the new consoles.
> 
> The 3080 will be a nice card either way, though, and plenty of 2080 Ti owners will jump ship if it's even 10% faster. The terrible news here is that the 2080 Ti will have resale value as bad as the 970's the day after the 3080 drops.


The x80 Ti has always been scheduled this way since its inception with Kepler. It's not even a binning delay, because the x80 and x80 Ti are two different SKUs. NVidia formulated it that way to garner additional impulsive x80 sales from those who would eventually settle on the x80 Ti anyway, and who are willing to accept the incremental performance increase (for benchmarking) as fed slowly by NVidia themselves. This is also why the x80 has always been slightly faster than the previous generation's x80 Ti. It's simple:

a * (x80 Ti) + b * (x80) > a * (x80 Ti), where a is the number of eventual x80 Ti buyers, b is the number who impulse-buy an x80 first, and 0 < b <= a

If they offer the x80 Ti upfront, b = 0 and the extra revenue disappears.
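A quick sketch of that inequality (all prices and buyer counts here are made-up assumptions, not NVIDIA figures):

```python
# Hypothetical model of the staggered x80 / x80 Ti launch described above.
# a = eventual x80 Ti buyers, b = those who impulse-buy an x80 first.

def launch_revenue(a, b, x80_price, ti_price):
    """Total revenue when b of the a eventual Ti buyers also buy an x80."""
    assert 0 <= b <= a
    return a * ti_price + b * x80_price

# Assumed prices: $699 x80, $1,199 x80 Ti (illustrative only).
upfront   = launch_revenue(1000, 0,   699, 1199)  # Ti offered on day one: b = 0
staggered = launch_revenue(1000, 300, 699, 1199)  # 30% double-dip: b > 0

assert staggered > upfront  # staggering only pays off when b > 0
```

The staggered case wins by exactly `b * x80_price`, which is why the argument hinges on b being greater than zero.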


----------



## 113802

mouacyk said:


> The x80 Ti has always been scheduled this way since its inception with Kepler. It's not even a binning delay, because the x80 and x80 Ti are two different SKUs. NVidia formulated it that way to garner additional impulsive x80 sales from those who would eventually settle on the x80 Ti anyway, and who are willing to accept the incremental performance increase (for benchmarking) as fed slowly by NVidia themselves. This is also why the x80 has always been slightly faster than the previous generation's x80 Ti. It's simple:
> 
> a * (x80 Ti) + b * (x80) > a * (x80 Ti), where a is the number of eventual x80 Ti buyers, b is the number who impulse-buy an x80 first, and 0 < b <= a
> 
> If they offer the x80 Ti upfront, b = 0 and the extra revenue disappears.


The GTX 780 and GTX 780 Ti were both the same die. It wasn't until the GTX 980 and GTX 980 Ti that they started selling different dies.


----------



## mouacyk

WannaBeOCer said:


> The GTX 780 and GTX 780 Ti were both the same die. It wasn't until the GTX 980 and GTX 980 Ti where they started selling different dies.


You're right, and good memory there (https://www.anandtech.com/show/7492/the-geforce-gtx-780-ti-review). Same memory subsystem, just 20% fewer shaders.


----------



## 113802

mouacyk said:


> You're right, and good memory there (https://www.anandtech.com/show/7492/the-geforce-gtx-780-ti-review). Same memory subsystem, just 20% fewer shaders.


I was one of the users who bought a GTX 780 at launch (upgraded from a GTX 470) and sold it to buy a GTX 780 Ti KingPin Edition.


----------



## aDyerSituation

Really trying to hold out for a 3080 or whatever so I can fully enjoy Red Dead and Cyberpunk. 

I'd get a 2080 Super right now, but I don't like upgrading my GPU every gen, since the majority of the time I'm playing shooters on low settings.


----------



## bigjdubb

skupples said:


> their next genuine slayer won't come until ~ the new consoles.
> 
> 3080 will be a nice card either way though, and plenty of 2080ti owners will jump ship if its even 10% faster. The terrible news here is 2080ti will have as bad of resale value as 970, one day after 3080 drop.


I kind of hope they release something that tanks the value of 2080ti's. I got mine for dirt cheap (relative to msrp) so I might get a little whacky and buy another one if the floor falls out from under the price. I honestly don't need anything faster but I wouldn't mind having another one for the giggles.


----------



## pompss

The rumors state a release by summer 2020. Well, it's time for Nvidia to start lowering prices back to $599. The PS5 and Xbox are around the corner, and if those consoles cost $499 each, then I don't really see any reason to spend $1,000 on a GPU when you can get a PS5 and an Xbox for the same price.
The only reason to spend $1,000 on an RTX 3080 would be a 50% increase: 120fps at 4K! But I really doubt that will happen. Even a 30% increase won't be enough, in my opinion, to see a major difference in gaming or to convince me to buy it.

This time around, Nvidia doesn't have a choice; like Intel, they will be forced to lower prices.

Thanks AMD !!!


----------



## 113802

pompss said:


> The rumors state a release by summer 2020. Well, it's time for Nvidia to start lowering prices back to $599. The PS5 and Xbox are around the corner, and if those consoles cost $499 each, then I don't really see any reason to spend $1,000 on a GPU when you can get a PS5 and an Xbox for the same price.
> The only reason to spend $1,000 on an RTX 3080 would be a 50% increase: 120fps at 4K! But I really doubt that will happen. Even a 30% increase won't be enough, in my opinion, to see a major difference in gaming or to convince me to buy it.
> 
> This time around, Nvidia doesn't have a choice; like Intel, they will be forced to lower prices.
> 
> Thanks AMD !!!


That would be the case if GeForce cards were just meant for gaming. GeForce RTX cards are already great at deep learning because of the Tensor cores. Memory becomes less of an issue thanks to float16, which effectively doubles the size of the models that can be trained on RTX cards, because half-precision floats take up half the memory of float32. Aside from deep learning, Studio Drivers boost performance in professional applications. They recently released Studio Drivers with GeForce RTX optimizations for Arnold, one of the renderers that has always benefited from Quadro cards: https://www.nvidia.com/en-us/geforce/news/arnold-maya-nvidia-studio-driver/

I do not see AMD RDNA gaming cards dropping the price of nVidia's multi-purpose GeForce lineup.
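The half-precision memory point is easy to sanity-check with NumPy; the 10-million-parameter buffer here is just an illustrative size, not any real model:

```python
import numpy as np

params = 10_000_000  # hypothetical parameter count

fp32 = np.zeros(params, dtype=np.float32)  # 4 bytes per value
fp16 = np.zeros(params, dtype=np.float16)  # 2 bytes per value

# A float16 buffer occupies exactly half the memory of a float32 one,
# which is why half precision roughly doubles the model size that fits
# in a fixed amount of VRAM.
assert fp32.nbytes == 4 * params
assert fp16.nbytes * 2 == fp32.nbytes
```

(In practice mixed-precision training keeps some values in float32, so the real-world saving is a bit under 2x, but the per-tensor storage halving is exact.)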


----------



## DNMock

pompss said:


> The rumors state a release by summer 2020. Well, it's time for Nvidia to start lowering prices back to $599. The PS5 and Xbox are around the corner, and if those consoles cost $499 each, then I don't really see any reason to spend $1,000 on a GPU when you can get a PS5 and an Xbox for the same price.
> The only reason to spend $1,000 on an RTX 3080 would be a 50% increase: 120fps at 4K! But I really doubt that will happen. Even a 30% increase won't be enough, in my opinion, to see a major difference in gaming or to convince me to buy it.
> 
> This time around, Nvidia doesn't have a choice; like Intel, they will be forced to lower prices.
> 
> Thanks AMD !!!


With two+ years to refine their RT core situation and the node shrink, seeing 30% better performance achieved for 30% lower MSRP is absolutely within the realm of possibility.



WannaBeOCer said:


> I do not see AMD RDNA gaming cards dropping the price of nVidia's multiple purpose GeForce lineup.


Nvidia will be forced to cut their prices, I would imagine. They won't be competing with AMD per se for the gaming market; they will be competing with the consoles subsidizing AMD hardware. Let's be real here: the GeForce line may well be useful in other applications, but the overwhelming majority of its sales are for gaming.


----------



## 113802

DNMock said:


> With two+ years to refine their RT core situation and the node shrink, seeing 30% better performance achieved for 30% lower MSRP is absolutely within the realm of possibility.
> 
> 
> 
> Nvidia will be forced to cut their prices, I would imagine. They won't be competing with AMD per se for the gaming market; they will be competing with the consoles subsidizing AMD hardware. Let's be real here: the GeForce line may well be useful in other applications, but the overwhelming majority of its *sales are for gaming.*


Maybe for their mid-range cards, but I'm sure their higher-end cards see more sales to universities than to actual gamers.


----------



## DNMock

WannaBeOCer said:


> Maybe their mid range cards but I'm sure their higher end cards have more sales by universities than actual gamers.


Yup, mid range cards are also referred to as mainstream for a reason lol.

edit: Obviously we are talking about pressure from consoles, which I very much doubt will have any effect on the $1,000 GPU market segment. There will be trickle-down (or up, as it may be). I'm sure there will still be sticker shock for x80 Ti / Titan level GPUs, but it shouldn't be anywhere near as bad as the 2080 Ti was.


----------



## skupples

mouacyk said:


> You're right, and good memory there (https://www.anandtech.com/show/7492/the-geforce-gtx-780-ti-review). Same memory subsystem, just 20% fewer shaders.


GK110 vs. GK104. I have those BIOSes burned into the backs of my eyelids.

And like I've been droning on about for a year now: NV will be forced to make things happen due to the new consoles, so we'd better hope they're as OP as humanly possible. The Turing release came during a zero-competition window, so all they had to do was ship a product. NV is competing against quite the behemoth now, at least on the gaming front: new consoles + new high-end 7nm GPUs. AMD's stock is gonna keep going ^^^^^

I have zero interest in what releases in 2020, tbh. 2021 is gonna be an epic year for the glorious PC master race.


----------



## pompss

WannaBeOCer said:


> Maybe their mid range cards but I'm sure their higher end cards have more sales by universities than actual gamers.


Well, the high-end cards will be the 3080 and 3080 Ti.
If they keep the price of the 3080 at $699 like the 2080, then most likely people will buy it. Anything higher will be a big question mark.

Different story for the 3080 Ti. I don't think it will sell like the 2080 Ti if priced at $1,200, and those sales weren't that good, as we know from the reports.

I just hope these new consoles will force Nvidia to price the 3080 at $599-699 and the Ti at $999 like the old Titans.


----------



## speed_demon

I don't see the 3000 series cards being sold for less than current gen cards unless there was a huge upset by AMD or Intel GPUs where Nvidia would want to regain customers. Nvidia knows that they are the market leader and they have tremendous mind share from customers which equates to pricing power like you wouldn't believe.

I'd love to buy myself a shiny new 3080 Ti for under $800 but it's just not likely to happen this generation.


----------



## Hydroplane

I feel like they might cut 3080 / 3080ti prices a little. Not much though, maybe $100 per part. Turing sales are down vs. Pascal


----------



## dantoddd

Hydroplane said:


> I feel like they might cut 3080 / 3080ti prices a little. Not much though, maybe $100 per part. Turing sales are down vs. Pascal


I thought Turing was selling better than Pascal by about 40%.


----------



## Nineball_Seraph

magnek said:


> Titan Xp performance for $300 or no buy kkthx


I know that you are being sarcastic, but I feel this accurately sums up a lot of the complainer/entitled mindset.

People have forgotten that this is an enthusiast's hobby, and at the highest levels of performance comes a high price. Hell, in the very early days pricing was way worse. People constantly complain about the price increases of flagship products without thinking about the ever-increasing costs through the entire R&D and manufacturing process. 

We have far more choices now than ever before. So if you can't afford the flagship, then either settle for lower-tier parts or drop out of the hobby. People complaining about $1,300 flagship GPUs are like people complaining that they can't get Lambos and Ferraris for the price of Civics. 

Save Up or Shut Up


----------



## DNMock

Nineball_Seraph said:


> I know that you are being sarcastic, but I feel this accurately sums up a lot of the complainer/entitled mindset.
> 
> People have forgotten that this is an enthusiast's hobby, and at the highest levels of performance comes a high price. Hell, in the very early days pricing was way worse. People constantly complain about the price increases of flagship products without thinking about the ever-increasing costs through the entire R&D and manufacturing process.
> 
> We have far more choices now than ever before. So if you can't afford the flagship, then either settle for lower-tier parts or drop out of the hobby. People complaining about $1,300 flagship GPUs are like people complaining that they can't get Lambos and Ferraris for the price of Civics.
> 
> Save Up or Shut Up


How dare the consumers voice their opinions on the pricing scheme of the products they purchase! Filthy peasants need to know their place, right? 


I feel like I need a meme of Tiny Tim with outstretched arms asking Ebenezer Leatherjacket, "Please sir, may I have a few more CUDA cores?"


----------



## ToTheSun!

dantoddd said:


> I thought Turing was selling better than Pascal by about 40%.


What information made you conclude that?


----------



## Nineball_Seraph

DNMock said:


> How dare the consumers voice their opinions on the pricing scheme of the products they purchase! Filthy peasants need to know their place, right?
> 
> 
> I feel like I need a meme of tiny tim with outstretched arms asking Ebenezer Leatherjacket "Please sir, may I have a few more CUDA cores?"



"How dare a company try to make money back on their massive investments. We want premium products that are the best of the best but we want it for absolutely nothing."


The problem with these so-called "opinions" is that they are mostly framed as "nvidia doesn't deserve to make a profit and they better give us these products and take a massive loss in the process" So its not really voicing any logical opinion, just people feeling entitled to the best stuff on the market is paying nothing for it. Most of these people have no clue about the true costs behind these gpus nor the profit margins that nvidia (or any company for that matter) need to meet in order to keep making products.


----------



## Nunzi

Nineball_Seraph said:


> I know that you are being sarcastic, but I feel this accurately sums up a lot of the complainer/entitled mindset.
> 
> People have forgotten that this is an enthusiast's hobby, and at the highest levels of performance comes a high price. Hell, in the very early days pricing was way worse. People constantly complain about the price increases of flagship products without thinking about the ever-increasing costs through the entire R&D and manufacturing process.
> 
> We have far more choices now than ever before. So if you can't afford the flagship, then either settle for lower-tier parts or drop out of the hobby. People complaining about $1,300 flagship GPUs are like people complaining that they can't get Lambos and Ferraris for the price of Civics.
> 
> Save Up or Shut Up





Nineball_Seraph said:


> "How dare a company try to make money back on their massive investments. We want premium products that are the best of the best but we want it for absolutely nothing."
> 
> 
> The problem with these so-called "opinions" is that they are mostly framed as "nvidia doesn't deserve to make a profit and they better give us these products and take a massive loss in the process" So its not really voicing any logical opinion, just people feeling entitled to the best stuff on the market is paying nothing for it. Most of these people have no clue about the true costs behind these gpus nor the profit margins that nvidia (or any company for that matter) need to meet in order to keep making products.


Nailed it.....


----------



## skupples

and leather jackets...


----------



## Section31

The latest Nvidia news makes sense given where its focus has really shifted lately. Nvidia went where the money is: the AI compute market. However, Nvidia clearly hasn't been sleeping; the question is just when we will see this tech in consumer GPUs. 

AMD is the only one that really cares about the consumer GPU market, it seems. Though AMD is now also targeting that market; it's just way more profitable than we are.


----------



## Asmodian

Section31 said:


> AMD is the only one that really cares about the consumer GPU market, it seems. Though AMD is now also targeting that market; it's just way more profitable than we are.


This seems like an odd statement, since AMD has put very little effort (or money) into consumer GPUs for the last decade or so. Justifiably, given how their focus on Zen paid off, but they seemed to care about the consumer GPU market less than Nvidia, who released many new consumer GPUs, with the new top-end cards competing only with their own older cards for much of that time. Why bother to put out a new card in a market you do not care about if you already have the uncontested flagship in that market? AMD seemed to release new GPUs only so consumers wouldn't forget they made them, but almost everything was focused on CPUs; they simply did not have the capacity to care about the consumer GPU market.

Nvidia cares about the consumer GPU market a lot. A huge portion of their revenue comes from this market, much higher than AMD's portion, and they successfully raised their top-end consumer cards from $500+ to $1,000+. That kind of increase in price and margin really holds a company's attention. They might not care about keeping market prices low so everyone can afford one, but they definitely care about the market.


----------



## DNMock

Nineball_Seraph said:


> "How dare a company try to make money back on their massive investments. We want premium products that are the best of the best but we want it for absolutely nothing."
> 
> 
> The problem with these so-called "opinions" is that they are mostly framed as "nvidia doesn't deserve to make a profit and they better give us these products and take a massive loss in the process" So its not really voicing any logical opinion, just people feeling entitled to the best stuff on the market is paying nothing for it. Most of these people have no clue about the true costs behind these gpus nor the profit margins that nvidia (or any company for that matter) need to meet in order to keep making products.


We are reaching levels of straw-man arguments that shouldn't even be possible!

Take a breather and remember the audience here. This isn't a board room, nor is it a town hall. It's just a group of enthusiasts expressing their personal opinions. No one is trying to force legislation that would limit profits, so please leave the "entitled this" and "you have no clue that" at the door. 

Just because my water-cooler opinion happens to be that Nvidia is getting greedy and overcharging a little too excessively doesn't make me an entitled whiny baby or whatever other stuff you throw out there. What it means is that I am expressing my opinion so that Nvidia can possibly hear it, and if that opinion is echoed by the majority, they can course-correct before it costs them money long term. There is no us vs. them; everyone is on the same team here. All data points are relevant even if they don't fit your world view, and dissenting opinions are in fact data points that help everyone reach the best course of action.


----------



## Sheyster

DNMock said:


> We are reaching levels of straw-man arguments that shouldn't even be possible!
> 
> Take a breather and remember the audience here. This isn't a board room, nor is it a town hall. It's just a group of enthusiasts expressing their personal opinions. No one is trying to force legislation that would limit profits, so please leave the "entitled this" and "you have no clue that" at the door.
> 
> Just because my water-cooler opinion happens to be that Nvidia is getting greedy and overcharging a little too excessively doesn't make me an entitled whiny baby or whatever other stuff you throw out there. What it means is that I am expressing my opinion so that Nvidia can possibly hear it, and if that opinion is echoed by the majority, they can course-correct before it costs them money long term. There is no us vs. them; everyone is on the same team here. All data points are relevant even if they don't fit your world view, and dissenting opinions are in fact data points that help everyone reach the best course of action.



Entitled folks with no common sense aside, this about sums it up:

https://pragmaticpricing.com/2013/10/19/what-is-what-the-market-will-bear/


----------



## Section31

I think a lot of people are simply waiting for new GPUs. Truth be told, the prices are set by the market, and if you want the item, you have to pay for it. Basically, for the last couple of years, the pattern (if you have the cash flow) has been to buy the top of the line and then replace it when the next top of the line comes out every 18-24 months. 

I have tried the cheaper upgrade path before: GTX 260 (2008) to GTX 470 (2010) to GTX 680 (2012) to GTX 970 (2014) to GTX 1080 (2016) to GTX 1080 Ti (2017) to RTX 2080 Ti (2018) to RTX 3080 Ti (2020, say). The gap between 2010-2012 was when I was doing an internship overseas, so I had to use my Vaio Z laptop with Nvidia graphics; basically, I owned that card for nine months in total. It seems like jumping to the Ti every one or two generations is better than going the intermediate route, but only if you have the means to pay for it.


----------



## bigjdubb

Section31 said:


> The latest Nvidia news makes sense given where its focus has really shifted lately. Nvidia went where the money is: the AI compute market. However, Nvidia clearly hasn't been sleeping; the question is just when we will see this tech in consumer GPUs.
> 
> AMD is the only one that really cares about the consumer GPU market, it seems. Though AMD is now also targeting that market; it's just way more profitable than we are.


How much AMD cares about the consumer really depends on the consumer. I feel like they don't care about me at all because they keep pumping out the low performance budget minded video cards. We will see if they change their tune next year, but I don't think they care about THIS consumer one bit.


----------



## 113802

DNMock said:


> We are reaching levels of straw-man arguments that shouldn't even be possible!
> 
> Take a breather and remember the audience here. This isn't a board room, nor is it a town hall. It's just a group of enthusiasts expressing their personal opinions. No one is trying to force legislation that would limit profits, so please leave the "entitled this" and "you have no clue that" at the door.
> 
> Just because my water-cooler opinion happens to be that Nvidia is getting greedy and overcharging a little too excessively doesn't make me an entitled whiny baby or whatever other stuff you throw out there. What it means is that I am expressing my opinion so that Nvidia can possibly hear it, and if that opinion is echoed by the majority, they can course-correct before it costs them money long term. There is no us vs. them; everyone is on the same team here. All data points are relevant even if they don't fit your world view, and dissenting opinions are in fact data points that help everyone reach the best course of action.


As a PC enthusiast, I can see why the price of nVidia's cards went up compared to Pascal, with the addition of RT cores, Tensor cores, and Studio Drivers. These cards are fantastic for many prosumers and researchers. As a PC gamer, I can understand the frustration with the cost of these GPUs, given two features that don't meaningfully improve the gaming experience. nVidia GPUs do much more than they did just three years ago. AMD charging a few dollars less while doing much less, even outside gaming, should be noticed.


----------



## ZealotKi11er

WannaBeOCer said:


> As a PC enthusiast, I can see why the price of nVidia's cards went up compared to Pascal, with the addition of RT cores, Tensor cores, and Studio Drivers. These cards are fantastic for many prosumers and researchers. As a PC gamer, I can understand the frustration with the cost of these GPUs, given two features that don't meaningfully improve the gaming experience. nVidia GPUs do much more than they did just three years ago. AMD charging a few dollars less while doing much less, even outside gaming, should be noticed.


It all depends on the target audience, not what the GPU can do in theory. You can't take a family SUV, add a big engine, and expect buyers to pay more when they never asked for an engine upgrade. Some will use RTX cards for gaming, some will use them for the RT and Tensor cores, but very few people will use them for both workloads.


----------



## DNMock

WannaBeOCer said:


> As a PC enthusiast, I can see why the price of nVidia's cards went up compared to Pascal, with the addition of RT cores, Tensor cores, and Studio Drivers. These cards are fantastic for many prosumers and researchers. As a PC gamer, I can understand the frustration with the cost of these GPUs, given two features that don't meaningfully improve the gaming experience. nVidia GPUs do much more than they did just three years ago. AMD charging a few dollars less while doing much less, even outside gaming, should be noticed.


For sure, they do a ton of different tasks. That's a big part of the problem, though: I don't want to pay for silicon that I most likely will never use. To Nvidia's credit, they noticed this before it became a problem with Turing, fully splitting their consumer and commercial divisions starting with Volta. 

That is one of the reasons I actually expect Ampere to be cheaper. Turing was kind of a bastard child stuck in the middle of the divorce of those two GPU lines. At least that is my hope, anyway.


----------



## skupples

still have yet to turn RT on. rather stick to native res @ 120fps.


----------



## ZealotKi11er

DNMock said:


> For sure, they do a ton of different tasks. That's a big part of the problem, though: I don't want to pay for silicon that I most likely will never use. To Nvidia's credit, they noticed this before it became a problem with Turing, fully splitting their consumer and commercial divisions starting with Volta.
> 
> That is one of the reasons I actually expect Ampere to be cheaper. Turing was kind of a bastard child stuck in the middle of the divorce of those two GPU lines. At least that is my hope, anyway.


Turing goes against everything Maxwell and Pascal did, in some ways. This started back with Kepler: making more gaming-focused GPUs and separate, more datacenter-focused GPUs. AMD is finally doing it with RDNA/GCN.


----------



## DNMock

ZealotKi11er said:


> Turing goes against everything Maxwell and Pascal did, in some ways. This started back with Kepler: making more gaming-focused GPUs and separate, more datacenter-focused GPUs. AMD is finally doing it with RDNA/GCN.



What I meant was, you could buy a Fermi GeForce, Fermi Quadro, Fermi Tesla, etc.: totally different-use GPUs built on the same chips. 

Volta was the first arch totally dedicated to data centers, with Turing meant for consumers. I'm hoping there was just a snag that forced NV to shoehorn in a slightly modified Volta and call it Turing, and that the snafu that led to that will be ironed out with Ampere. 

It's a fool's hope, but a hope nonetheless.


----------



## Sheyster

skupples said:


> still have yet to turn RT on. rather stick to native res @ 120fps.


I tried it out with an HDR600 monitor and the BF5 campaign. It is impressive visually in conjunction with HDR but definitely not to be used for competitive online play which is where I live 99% of the time when gaming.


----------



## bigjdubb

skupples said:


> still have yet to turn RT on. rather stick to native res @ 120fps.


You haven't missed anything; most of the ray tracing visual features are a letdown. It's sort of like switching from high to ultra: you have to really look to see what improved, and once you find something... you're like, "I lost how many fps for that???"


----------



## evensen007

For me, there wasn't really even a consideration for AMD this go-round. I game at 4k, and want/need close to 60FPS in my favorite titles. Am I happy that a 2080ti cost me 1000 dollars? Nope. Am I happy I got surcharged for stupid RT capabilities I will never turn on? Nope. I believe 2020 will be a great year for competition at the 4k60 high end. Hoping AMD comes out swinging and makes me go back to them for the first time since I had my x-fire 290's. To be fair, I unloaded my 1080ti with waterblock for 500, so that helped subsidize the cost.


----------



## Bart

bigjdubb said:


> You haven't missed anything; most of the ray tracing visual features are a letdown. It's sort of like switching from high to ultra: you have to really look to see what improved, and once you find something... you're like, "I lost how many fps for that???"


And also "I spend how much $$$ for that???", LOL! I just picked one up, on sale (lulz) for $200 off, for the BARGAIN BASEMENT price of ONLY $1328CDN. I think on some level, I just raped myself for Christmas!


----------



## dantoddd

ToTheSun! said:


> What information made you conclude that?


https://www.extremetech.com/gaming/288080-nvidia-turing-sales-revenue-up-45-percent-over-pascal


----------



## DNMock

dantoddd said:


> https://www.extremetech.com/gaming/288080-nvidia-turing-sales-revenue-up-45-percent-over-pascal


That is only comparing the revenue generated in the first 8 weeks after the release of Pascal vs. the first 8 weeks after the release of Turing. 

The turd in the punchbowl here is that only the 1080 and 1070 were available in Pascal's first 8 weeks, with the 1080 Ti coming some 6 months later, whereas Turing launched with the 2080 Ti available on day one. 

So we are comparing revenue from a partial launch of mid-range-only cards against a full release of Turing cards (with a 30% markup on their GPUs, not to mention the $1,200 2080 Ti, all available during that timeframe). That tells me that in units sold alone, the 1080 and 1070 likely surpassed the entire Turing line-up. And if you were to add the 1080 Ti's first 8 weeks to Pascal's figures, it would be a runaway slam dunk on Turing.
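To put rough numbers on the price-vs-units point (everything here is hypothetical except the ~30% markup and the 45% revenue headline from the article above), a higher average selling price means a revenue comparison flatters unit sales:

```python
# Hypothetical illustration: a 45% revenue bump can hide modest unit
# growth when the average selling price rises ~30%. No real NVIDIA figures.
pascal_asp = 500.0          # assumed Pascal average selling price
turing_asp = 650.0          # assumed ~30% higher Turing ASP

pascal_revenue = 1.00       # normalized launch-window revenue
turing_revenue = 1.45       # the "up 45%" headline

pascal_units = pascal_revenue / pascal_asp
turing_units = turing_revenue / turing_asp

# Units grew only ~11.5% despite revenue growing 45%.
growth = turing_units / pascal_units - 1
print(f"unit growth: {growth:.1%}")
```

Under these assumed prices, 1.45 / 1.30 ≈ 1.115, so the headline revenue figure says little about how many cards actually moved.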


----------



## ToTheSun!

DNMock said:


> That is only comparing the revenue generated in the first 8 weeks after release of Pascal vs the first 8 weeks after the release of Turing.
> 
> Turd in the punchbowl here is only the 1080 and 1070 were available in the first 8 weeks, with the 1080ti coming some 6 months later. Turing released the 2080ti on launch day.
> 
> Since we are comparing only revenue of a partial launch of the mid range only cards to the revenue of a full release of turing cards (with a 30% mark up on their gpu's not to mention the 1200 2080ti all available during that timeframe). That tells me that in just units sold, the 1080 and 1070 alone likely surpassed that of the entire Turing line-up. And if you were to add in the later released 1080ti's first 8 weeks figures to pascals figures it would be a runaway slam dunking on Turing.


And judging by the owners thread on OCN (hardly an encompassing sample of the universe of Turing buyers, but it's likely indicative), the 2080ti makes up the bulk of the revenue.


----------



## Woundingchaney

ToTheSun! said:


> And judging by the owners thread on OCN (hardly an encompassing part of the universe of Turing buyers, but it's likely indicative), the 2080ti makes the bulk of the revenue.


Revenue or profits?

I doubt the 2080ti was the largest revenue generator.


----------



## ToTheSun!

Woundingchaney said:


> Revenue or profits?
> 
> I doubt the 2080ti was the largest revenue generator.


With the parameter I referred to above, both.

In any case, the Steam Survey from last month shows Pascal has the biggest chunk of the market, by far! The 2060 and 2070 are the Turing best sellers, also by a pretty big relative margin, but I wouldn't know how that translates to profit. Revenue is almost evenly split between all SKUs, so my theory is, at least partially, incorrect.


----------



## skupples

not surprised there. 1080tis are $400 on ebay. AnD iS bAsIcAlLy A 2080!


----------



## rainzor

DNMock said:


> That is only comparing the revenue generated in the first 8 weeks after release of Pascal vs the first 8 weeks after the release of Turing.
> 
> Turd in the punchbowl here is only the 1080 and 1070 were available in the first 8 weeks, with the 1080ti coming some 6 months later. Turing released the 2080ti on launch day.
> 
> Since we are comparing only revenue of a partial launch of the mid range only cards to the revenue of a full release of turing cards (with a 30% mark up on their gpu's not to mention the 1200 2080ti all available during that timeframe). That tells me that in just units sold, the 1080 and 1070 alone likely surpassed that of the entire Turing line-up. And if you were to add in the later released 1080ti's first 8 weeks figures to pascals figures it would be a runaway slam dunking on Turing.


During the first 8 weeks only three cards from the Turing lineup were released, same as Pascal (you forgot the 1060).
It's safe to conclude that Pascal sold more even with initial availability issues, but Turing, with its higher ASP, did produce higher revenue.
Another thing to note when comparing these launches is that the Pascal launch wasn't plagued by a crypto crash and a massive influx of super cheap 2nd hand cards on the market, as was the case with Turing.
It's hard to draw any conclusions when you factor in anomalies as big as mining.


----------



## DNMock

rainzor said:


> During the first 8 weeks only three cards from Turing lineup were released, same as Pascal (you forgot 1060).
> It's safe to conclude that Pascal sold more even with initial availability issues, but Turing with higher ASP did produce higher revenue.
> Another thing to note when comparing these launches is that Pascal launch wasn't plagued with crypto crash and massive influx of super cheap 2nd hand cards in the market like it was the case with Turing.
> It's hard to draw any conclusions when you factor in such a big anomalies like mining.


True enough, that mining boom really messed up any way of reporting it. I did forget the 1060 though; for some reason I was thinking it came a lot later. Probably confusing it with the 1070 Ti. I am sure the 40% more revenue thing is an attempt to skew the numbers as heavily as possible as a song and dance for the shareholders, so it can be thrown out too.

None of it really matters though, since for the most part it's completely unrelated to Ampere...


----------



## MonarchX

Perfect timing for Cyberpunk, at least once Cyberpunk gets patched properly!


----------



## dantoddd

DNMock said:


> That is only comparing the revenue generated in the first 8 weeks after release of Pascal vs the first 8 weeks after the release of Turing.
> 
> Turd in the punchbowl here is only the 1080 and 1070 were available in the first 8 weeks, with the 1080ti coming some 6 months later. Turing released the 2080ti on launch day.
> 
> Since we are comparing only revenue of a partial launch of the mid range only cards to the revenue of a full release of turing cards (with a 30% mark up on their gpu's not to mention the 1200 2080ti all available during that timeframe). That tells me that in just units sold, the 1080 and 1070 alone likely surpassed that of the entire Turing line-up. And if you were to add in the later released 1080ti's first 8 weeks figures to pascals figures it would be a runaway slam dunking on Turing.


Dude, you're clutching at straws here. Nvidia has a detailed segment-by-segment breakdown of their quarterly revenue online. Do you have any concrete facts to back up your assertions? Nvidia has been doing well as a company consistently; in fact, they're even gaining market share from AMD. Clear indication that Turing is doing better than Pascal.

https://investor.nvidia.com/financial-info/financial-reports/default.aspx

Just did a simple calculation: revenue from Turing for its first year was 5.086 bn USD, vs. 4.805 bn USD from Pascal.
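For anyone checking the math, a back-of-envelope sketch of what that gap works out to (using only the two figures quoted above):

```python
# Simple relative comparison of the two first-year revenue figures
# cited above (billions of USD, from Nvidia's quarterly reports).
turing = 5.086
pascal = 4.805
gain = (turing - pascal) / pascal * 100
print(f"Turing's first-year revenue is {gain:.1f}% higher than Pascal's")
```

So "higher revenue" here means a single-digit percentage gap, not a blowout.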


----------



## magnek

Nineball_Seraph said:


> I know that you are being sarcastic but i feel this accurately sums up alot of the complainer/entitled mindset.
> 
> People have forgotten that this is an enthusiasts hobby and at the highest levels of performance comes a high price. Hell, in the very early days pricing was way worse. People are constantly complaining about the price increases of flagship products without thinking about ever changing/increasing costs through the entire R&D and manufacturing process.
> 
> We have far more choices now than ever before. So if you can't afford the flagship then either settle for lower tier parts or drop out of the hobby. People complaining about $1300 flagship GPUs are like people complaining that they cant get Lambos and Ferraris for the price of civics.
> 
> Save Up or Shut Up


This was in no way a facetious comment. Actually, what I said makes perfect sense if you take a step back, take off those green blinders, and take a breather. The trend so far has been that two generations later, we should expect the x60 part to perform at the x80 (Ti) level for around $200-350 (ideally $200, but those days are long gone). 2060 beat 980 Ti, 1060 beat 780 Ti, and 960 beat 580. If the $300-something 3060 doesn't soundly kick the 1080 Ti/Titan Xp in the ass, then it's a fail, or nVidia is deliberately sandbagging.

As for save up or shut up, all I have to say is LOL. I think there are extremely few (possibly even 0) on here who literally cannot afford a $1300 GPU. But just because we can afford something, doesn't mean we want to pay that much for it.


----------



## skupples

saving for a $1300 purchase probably means you should focus more on just saving for now


----------



## ToTheSun!

skupples said:


> saving for a $1300 purchase probably means you should focus more on just saving for now


Exactly. I mean, consumerism can get the best of us, but it's just sound advice to have a few of your expensive products' worth of savings for a rainy day (or more, if you're so fortunate).

And I'd make magnek's words my own, too, even though I generally cast my vote as a consumer more silently. The market tends to correct itself accordingly.


----------



## skupples

3 months of living expenses goes by real quick when you're out of a job.

living paycheck to paycheck, even while young, is obscenely stupid if you have the ability to save & instead just blow it all on stupid frills check after check. That's how you get stuck in the loop.


----------



## BigMack70

3-12 months of living expenses saved, depending on how fast you'd reasonably expect to find a new job if yours ended, really should be standard operating procedure for more people. 

As for tech, as long as you plan out your expected purchases, it's not too bad. I'll never understand the idiocy of comparing PCs to cars as a hobby. The two are nothing alike in terms of affordability and expense. PCs cost pennies on the dollar as a hobby compared to autos. You would have a hard time spending more than $10-15k on building a desktop PC. You could blow well past $100-150k on a vehicle without blinking an eye.


----------



## skupples

top tier ENTHUSIAST PCs start to cost as much as the tuning job on a 20 year old civic.

that's where it comes from. It comes from young people & children without a proper understanding of costs quite yet.

either way - I completely agree. More people need the trait that makes me warm & tingly when looking @ money in the bank. Seeing a check land is way more gratifying than most purchases.


----------



## Sheyster

skupples said:


> 3 months of living expenses goes by real quick when you're out a job.
> 
> living pay check to pay check, even while young, is obscenely stupid if you have the ability to save & just blow it all on stupid frills check after check. That's how you get stuck in the loop.





BigMack70 said:


> 3-12 months of living expenses saved, depending on how fast you'd reasonably expect to find a new job if yours ended, really should be standard operating procedure for more people.


IMHO 6 months of all expenses should be the absolute minimum savings anyone should have, especially if you own a home. Anyone who is living paycheck to paycheck should not be buying $1300 video cards. Used consoles would be a better option.


----------



## skupples

How dare you state I can't afford something.

you've offended my sensibilities, i'll now afford it with 20%+ interest rates. What were you saying about not being able to afford something? 

fast forward 25 years - congrats, you still have no money!


----------



## aDyerSituation

As long as I can flex with my sig rig who cares about the future


----------



## skupples

right?! gimme a reason to have tri-sli again, nv!


----------



## ZealotKi11er

skupples said:


> right?! gimme a reason to have tri-sli again, nv!


It's not on Nvidia to give you a reason. It's on the game devs.


----------



## skupples

yeah yeah. NV already provides "financial assistance" to those using GameWorks, why can't dual GPU be part of that bribery?

I thought the advantage of AMD's failed checkerboard, & NV's soon-to-be-seen supposed checkerboard, was driver level dual GPU?


----------



## keikei

skupples said:


> yeah yeah. NV already provides "financial assistance" to those using gamewerx, *why can't dual GPU be part of that bribery*?
> 
> I thought the advantage of AMD's failed checkerboard, & NV's soon to be seen supposedly checkerboard = driver level dual GPU?



I suspect it will. Remember that mGPU driver leak? It may be part of Ampere. Nvidia needs a stop gap until RT is viable on a single card. My guess is the birth of universal SLI with RT enabled as well. Yeet!


----------



## skupples

that's what I'm referencing with the checkerboard comment.

also, what's being pushed into dev drivers isn't really a leak  

NV needs a way to stomp all over consoles trying to sell themselves as 4K120 anything, or 8k anything  

honestly? I don't think it's the 3080ti, but the 4080ti. I think the 3080ti is the last drop of the last-gen bucket. Even if it's EUV 7nm, it'll be the smallest example possible that'll still sell, cuz "shart, consoles are coming, drop something NOW!"


----------



## b.walker36

skupples said:


> How dare you state I can't afford something.
> 
> you've offended my sensibilities, i'll now afford it with 20%+ interest rates. What were you saying about not being able to afford something?
> 
> fast forward 25 years - congrats, you still have no money!


I went to look at cars the other day and the dude pulled one up and was like no that's too expensive, I was pissed haha. I was like you don't know me.....


----------



## skupples

^^ tactics. those dudes think they're 2 year degree psych majors.


----------



## magnek

aDyerSituation said:


> As long as I can flex with my sig rig who cares about the future


You say that as a joke, but you just _know_ there are people (not necessarily on here mind you) like that. :/

OTOH, quite proud of my now ancient X79 sig rig. It has held up surprisingly well, and the 980 Ti OC'd to 1500/8000 still does 1440p/60 like a champ even after 5 years. GM200 has gone way above and beyond the legendary G80.


----------



## ToTheSun!

magnek said:


> OTOH, quite proud of my now ancient X79 sig rig. It has held up surprisingly well, and the 980 Ti OC'd to 1500/8000 still does 1440p/60 like a champ even after 5 years. GM200 has gone way above and beyond the legendary G80.


The other day, while I was doing some benching with 3DMark, I saw an entry at the bottom of my results from my old 980ti overclocked to hell and back, and I was amazed at how close it was to my previous 2080. Brought the metaphorical tear to my eye. But, since we're on the topic of frugality, I have to thank Jensen for the hecked up launch of Turing; my 2080 died 11 months after purchase, which got me a full refund. Nowadays, my much cheaper 2070S overclocks well enough to match it. It's like they paid me to wait for the 2070 to become what it should have been from the beginning, instead of that sorry excuse of a 2060 on speed.


----------



## Sheyster

magnek said:


> OTOH, quite proud of my now ancient X79 sig rig. It has held up surprisingly well, and the 980 Ti OC'd to 1500/8000 still does 1440p/60 like a champ even after 5 years. GM200 has gone way above and beyond the legendary G80.



I felt the same way about my 5 GHz 4790K rig. It lasted 4+ years with various video cards, last one being a Titan Xp. I actually regret getting rid of the Xp. A friend bought the entire system from me for a decent price so it worked out well, not too much regret.


----------



## CoD511

Honestly, what I find most interesting is the checkerboard SLI technique... if that works out and can be universally applied to games for the most part, I'd be happy to go SLI again, but without the frame pacing issues. It'd be really nice too if the launch is in the first half of 2020; I've been itching to replace this 1080Ti. Still, everything is rumour and speculation besides the checkerboard SLI implementation being present in the driver now. I'll wait until there are hard performance numbers.




DNMock said:


> We are reaching levels of straw-man arguments that shouldn't even be possible!
> 
> Take a breather and remember the audience here. This isn't a board room nor is it a town hall. It's just a group of enthusiasts expressing their personal opinions. No one is trying to force legislature that would limit profits on anyone, so please leave the "entitled this" and "you have no clue, that" at the door.
> 
> Just because my water cooler opinion happens to be that Nvidia is getting greedy and overcharging a little too excessively doesn't make me an entitled whiny baby or whatever other stuff you throw out there. What it means is that I am expressing my opinions so that Nvidia can possiblly hear it, and if that opinion is echo'ed by the majority, they can course correct before it costs them money long term. There is no us vs them, everyone is on the same team here, all data points are relevant even if they don't fit your world view, and dissenting opinions are in fact data points to help everyone reach the best course of action.



I just have to say, you're talking to another enthusiast who expressed their opinion. Yet you asked them to leave their own views at the door? That seems rather hypocritical. They're fully within their own rights to voice their own opinion too, just as you are.


----------



## guttheslayer

It's almost 2020 and we've heard nothing about Ampere or any new releases from the Green Camp.


Nowadays PC HW is no longer as competition-driven as it was.


----------



## keikei

guttheslayer said:


> Its almost 2020 and we heard nothing from Ampere or any new releases from Green Camp.
> 
> 
> Nowaday PC HW is no longer as competitive driven as it was.



Well, Computex ain't till late May / early June. I expect some info before then. Another member mentioned new cards before the next gen console launch, and other rumors say around summer 2020. AMD must have done something right, as I don't remember Green releasing anything 1 year after a new arch. While just a refresh, Supa was a surprise to most. Yet again we have a year's time and the next cycle is upon us. I don't remember cards launching so 'fast'. Still waiting on the Ti Supa though. I suspect that's the next card.


----------



## CoD511

guttheslayer said:


> Its almost 2020 and we heard nothing from Ampere or any new releases from Green Camp.
> 
> 
> Nowaday PC HW is no longer as competitive driven as it was.


I mean, looking at Turing and official announcements from Nvidia: introduced at SIGGRAPH 2018, and then the consumer parts were revealed a week later at GamesCom. I feel like Nvidia just likes to keep things tightly under wraps.


----------



## littledonny

keikei said:


> Still waiting on the Ti Supa though. I suspect thats the next card.


Personally, I doubt it ever gets made. Feels like 780 Ti 6GB all over again.


----------



## Baasha

Summer 2020 seems most likely as the 2080 (?) was released around June (was it 2016?).


----------



## keikei

Baasha said:


> Summer 2020 seems most likely as the 2080 (?) was released around June (was it 2016?).



Sept 2018. I luv these 'fast' cycles.


----------



## skupples

Baasha said:


> Summer 2020 seems most likely as the 2080 (?) was released around June (was it 2016?).


i expect the 2080ti-beating 3080 in the summer, and the 3080ti 3 - 6 months later.


----------



## m4fox90

skupples said:


> i expect the 2080ti beating 3080 in the summer, and the 3080ti 3 - 6 months later.


Are we sure they're going to keep doing that staggered release of the 80 Ti vs the 80? Like, the cat's out of the bag on that strategy; they have to know they're only hurting their own sales by rolling out just the 80 and disappointing everybody, who can then just go buy the last 80 Ti for cheap. If they want to hit hard and get more money, they're much better off dropping the big boy at launch like with Turing.


----------



## skupples

the only time they haven't in recent history is when the node shrink isn't anything to write home about/non existent.

3080 faster than 2080ti gets the weak/wealthy to double dip. They'll take the incremental performance, n sell off the 3080 to folks like me once the 3080ti drops, then again if the titan is further staggered 

when it comes to modern big business you can pretty much put your money on them using the most aggressive and sleazy tactics to squeeze a dime. NV went from $20 a share to $240 a share in a 4 year time span folks. ball kicking good time when you were too broke to get your money in & knew this was coming 5 years back


----------



## guttheslayer

skupples said:


> the only time they haven't in recent history is when the node shrink isn't anything to write home about/non existent.
> 
> 3080 faster than 2080ti gets the weak/wealthy to double dip. They'll take the incremental performance, n sell off the 3080 to folks like me once the 3080ti drops, then again if the titan is further staggered


And that is why the 2080 Ti will not be faster than the 3080; time and time again when NV releases a new gen of cards, the fastest card in that release is always faster than the previous gen flagship.


If NV is releasing a 3080, it will be faster than the 2080 Ti; if it isn't, then the 3080 Ti will be released as the flagship together with the 3080.


----------



## skupples

3080 will likely even be faster than the Turing Titan.

assuming the next series is the shrink to 7nm then 3080 will smoke 2080ti. shrink + evolved tensor = win.

RT will maybe actually be viable @ high res + decent frames come next gen.


----------



## ZealotKi11er

It's crazy that we already know codenames and some specs for AMD's unreleased stuff, but from Nvidia all we can speculate on is the name of the GPU and that it's going to be expensive.


----------



## skupples

ZealotKi11er said:


> Its crazy that we already know codenames and some specs from AMD unreleased stuff but from Nvidia all we can speculate is the name of the GPU and that it is going to be expensive.


oh yeah? I haven't seen much AMD news recently, because this is the only forum i frequent really. what's the scoop?


----------



## umeng2002

The current "mid range" market is the old low end.

nVidia's essential monopoly in the consumer space has allowed them to repurpose their mid-range GPUs as high end... and their high end as "halo" products...

Hopefully, Intel entering the market will provide the competition AMD isn't and bring prices back in check.

If you've gotten into PCs over the past 10 years or so, all you know is price gouging and low value from Intel and nVidia... in the mid to high-end market anyway.


----------



## guttheslayer

skupples said:


> 3080 will likely even be faster than turing titan.
> 
> assuming the next series is the shrink to 7nm then 3080 will smoke 2080ti. shrink + evolved tensor = win.
> 
> RT will maybe actually be viable @ high res + decent frames come next gen.



I dun care about RT, I only care if the 3080 is indeed faster than the Titan RTX in normal rasterization games. If that is the case, I'd consider it a win despite its high price.


----------



## umeng2002

guttheslayer said:


> I dun care about RT, I only care if 3080 is indeed faster than Titan RTX in normal rasterization game. If that is the case it is consider a win despite its high price.


RT is still a bit of a joke. The best implementation is Metro Exodus, and you still need to play at around 1080p, make quality sacrifices, and put up with poorly optimized locations that tank performance by half here and there.

The next generation of cards needs to at least double the RT performance and drop DLSS, imho. DLSS is more of a joke than RT at this point.


----------



## doom26464

Nvidia stays very tight lipped so as not to hurt their current product stack. They will keep pushing the 20 series right up until they surprise launch the 30 series.

Im glad AMD has come back and Navi is a good, competitive product (at least in their respective price brackets), but nvidia will pull way ahead again with Ampere.


----------



## skupples

doom26464 said:


> Nvidia stays very tight lipped as to not hurt there current product stack. They will keep pushing 20 series right till they surprise launch the 30 series.
> 
> Im glad AMD has come back and navi is a good product and competitive(at least in there respected price brackets) but nvidia will pull way ahead again with ampere.



pulling ahead, and AMD not competing in the top segment aren't the same thing. It'll be nice to know if AMD intends to ever compete above mid range ever again. If not, then the only hope is in Intel, so not much.


----------



## guttheslayer

skupples said:


> pulling ahead, and AMD not competing in the top segment aren't the same thing. It'll be nice to know if AMD intends to ever compete above mid range ever again. If not, then the only hope is in Intel, so not much.


If the latest Navi 21 rumor holds true, the WC version of that could give Ampere a run for its money.



It wouldn't seem far-fetched if the 505mm^2 Navi on 7nm+ could match a mid-range 350-400mm^2 Ampere die known as the 3080.


----------



## skupples

that's quite the jump for them to make though. From a 5700xt that barely hangs with the 2070(S)/1080ti, to something 20-30% faster than the entire 20 series line up? oof.


----------



## BigMack70

--delete--


----------



## guttheslayer

skupples said:


> that's quite the jump for them to make though. From 5700xt that barely holds with 2070(s)/1080ti, to something 20-30% faster than the entire 20 series line up? oof.


the 5700 XT isn't even high end to begin with. It only has a die size of 251mm^2, which equates to the mid-low range of Polaris sizes in the past.


----------



## Zam15

Tom's Hardware:
Nvidia Ampere Purportedly 50% Faster Than Turing At Half The Power Consumption
Apparently Ampere is more than ready for Big Navi.

https://www.tomshardware.com/news/n...ter-than-turing-at-half-the-power-consumption

Can only hope this is true, I'm ready to upgrade!


----------



## m4fox90

Zam15 said:


> Tom's Hardware:
> Nvidia Ampere Purportedly 50% Faster Than Turing At Half The Power Consumption
> Apparently Ampere is more than ready for Big Navi.
> 
> https://www.tomshardware.com/news/n...ter-than-turing-at-half-the-power-consumption
> 
> Can only hope this is true, I'm ready to upgrade!


Usually it's one or the other, not both. 50% faster at the same power consumption would be believable, if unlikely. 50% faster at half the power is too improbable to even consider. You're talking 100+ FPS at 4K in modern titles at ~140 watts.
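To put that in concrete terms, here's a back-of-envelope sketch of what "half the power" would mean in watts; the 280 W baseline is an assumption (factory-OC 2080 Ti partner cards run near 280 W; the reference TDP is 250 W):

```python
# Back-of-envelope: absolute board power implied by the
# "50% faster at half the power consumption" rumor.
base_power_w = 280.0             # assumed Turing flagship board power (OC partner card)
rumor_power_w = base_power_w * 0.5
print(f"Implied board power: {rumor_power_w:.0f} W")  # 140 W
```

That's roughly the power budget of today's upper-midrange cards, which is why the claim strains belief.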


----------



## Zam15

m4fox90 said:


> Usually its one or the other, not both. 50% faster at same power consumption would be believable, if unlikely. 50% at half power is too improbable to even consider. You're talking 100+ FPS at 4K in modern titles at ~140 watts.


Agreed, but even in the rare event this is the case I'd say keep the power savings and give me 100%.


----------



## ToTheSun!

m4fox90 said:


> Usually its one or the other, not both. 50% faster at same power consumption would be believable, if unlikely. 50% at half power is too improbable to even consider. You're talking 100+ FPS at 4K in modern titles at ~140 watts.


Well, it's an entirely new generation on entirely new lithography. 150% capability at 50% power seems unreasonable, but perhaps some form of energy efficiency improvement is plausible.


----------



## skupples

i feel like we hear that EVERY shrink. 

2x for .5!


----------



## Zam15

I'm sure Nvidia has some incredible tech that they have been sitting on for several years; it's not like they can't do it, but they'd rather drip feed us old tech and make as much money as they can. They haven't had to worry about competition for years. Maxwell gave us Multi-Frame Sampled AA (MFAA) for a jump over the 780, not really a new architecture; then Maxwell to Pascal was pretty much just a node shrink but still nearly doubled the performance; then with Turing we got RTX but missed out on a major performance uplift. Pretty much at Kepler 4.0 at the moment.

Only way I see us actually getting that kind of performance jump from Nvidia is if they were pushed into a corner, and that's not going to happen.

We may see a 50% increase in RTX performance, but I doubt an overall jump like that. But then again, we saw that kind of jump going to Pascal's 16nm from 28nm.


----------



## m4fox90

It depends on how much space they want to devote to RT. The 2080 Ti's 700mm2 die could house some more good stuff instead of the silly RT cores and the DLSS I can't use because I'm not playing stuff at 4K.


----------



## skupples

that's honestly my expectation. Even with AMD's super-tuned-for-gaming cutting edge stuffs, NV will just brute force thru it all like usual.

the question people will start to raise is longevity, and people will blindly claim that Navi will age as well as og GCN.


----------



## dubldwn

The difference in transistor density between TSMC 12nm and 7nm+ is huge and I believe these claims are possible. The issue is nVidia has the proven ability to put out just enough, with a couple conspicuous exceptions, like GP102.
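For scale, a rough sketch of that density gap; the MTr/mm^2 figures below are commonly cited peak-density numbers (assumptions, not official GPU specs), and real shipping chips land well below peak:

```python
# Commonly cited peak logic-density figures (million transistors / mm^2).
# Ballpark public numbers -- treat the resulting ratio as an upper bound.
density_12nm = 28.9   # TSMC 16/12nm class (assumed ballpark figure)
density_7nm = 91.2    # TSMC N7 (assumed ballpark figure)
ratio = density_7nm / density_12nm
print(f"Theoretical density jump: ~{ratio:.1f}x")  # prints "~3.2x"
```

Even if a real design only captures a fraction of that ~3x, there's clearly room for a big transistor budget increase.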


----------



## skupples

i wonder how rampant corporate spying is.

like, how does NV always have a next-day response to an AMD drop?

"oh, well hey.... we just so happen to have 1,000,000 GPUs ready to go! surprise!"


----------



## Zam15

skupples said:


> i wonder how rampant corporate spying is.
> 
> like, how does NV Always have a next day response to an AMD drop?
> 
> "oh, well hey.... we just so happen to have 1,000,000 GPUs ready to go! surprise!


AMD: Releases their brand new cutting edge tech

Nvidia: "Quick, to the warehouse and dust those old video cards off, we finally got the OK to sell them!"


----------



## skupples

maybe this is why we never saw the 8 series! They're just sitting in a warehouse somewhere. Like presidential ballots at the Fort Lauderdale airport in 2016. "Oh look, we found a semi truck full!"

oh look, they're 90% votes for one side... strange!


----------



## guttheslayer

ToTheSun! said:


> Well, it's an entirely new generation on entirely new lithography. 150% capability at 50% power seems unreasonable, but perhaps some form of energy efficiency improvement is plausible.


Ok, let's be a bit more realistic here. We can compare the performance of the big bad boy from each gen:

GF110 -> GK110
GK110 -> GM200
GM200 -> GP102
GP102 -> TU102


You will realise there is a pattern: each generation's big bad boy is 50%-70% faster at the same power bracket (except TU102, which underperformed).


To line up your expectations, reduce both values by half and that should be the real performance per watt (25% faster at 75% power consumption seems like realistic performance for the 3080). Or that 50% performance boost could be focused on specific RT cases. I just hope you guys don't get too overhyped for now.
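Expressing both scenarios as perf-per-watt multipliers makes the gap obvious; both inputs are rumor/estimate figures from this thread, not measured data:

```python
# Perf-per-watt implied by the rumor vs the discounted estimate above.
def perf_per_watt_gain(perf_mult, power_mult):
    """Relative perf/watt vs the previous flagship generation."""
    return perf_mult / power_mult

rumor = perf_per_watt_gain(1.50, 0.50)       # "50% faster at half the power"
discounted = perf_per_watt_gain(1.25, 0.75)  # "25% faster at 75% power"
print(f"Rumor implies {rumor:.2f}x perf/watt")        # 3.00x
print(f"Discounted estimate: {discounted:.2f}x")      # 1.67x
```

A 3x perf/watt jump in one generation would be nearly unprecedented; ~1.7x is in line with past node-shrink generations.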


----------



## EniGma1987

Zam15 said:


> Tom's Hardware:
> Nvidia Ampere Purportedly 50% Faster Than Turing At Half The Power Consumption
> Apparently Ampere is more than ready for Big Navi.
> 
> https://www.tomshardware.com/news/n...ter-than-turing-at-half-the-power-consumption
> 
> Can only hope this is true, I'm ready to upgrade!


lol. Someone is only starting that rumor because WCCF started a rumor that Navi is so super powerful.


"OMG. Navi 21 is twice the performance as Navi 10!"
"well. well. Nvidia Ampere is 50% faster than Turing cards so it will still win!!!!"






skupples said:


> that's honestly my expectation. Even with AMD's super tuned for gaming cutting edge stuffs, NV will just brute force thru it all like usual.
> 
> the question people will start to draw is longevity, and people will blindly claim that Navi will age as well as og GCN.


Honestly, I would expect next-gen Navi to age as well as GCN has, for the same reason GCN aged well: it was put into the Xbox and PlayStation consoles early in its life. Now RDNA2 is being placed into the next-gen Xbox and PlayStation, so RDNA2 will last a long time as it gains the performance benefits of console optimization, same as GCN did.


----------



## keikei

EniGma1987 said:


> lol. Someone is only starting that rumor because WCCF started a rumor that Navi is so super powerful.
> 
> 
> "OMG. Navi 21 is twice the performance as Navi 10!"
> "well. well. Nvidia Ampere is 50% faster than Turing cards so it will still win!!!!"
> 
> 
> 
> 
> 
> Honestly, I would expect next-gen Navi to age as well as GCN has because of the same reason GCN aged well. It was put into the XBox and Playstation consoles around the beginning. Now, RDNA2 is being placed into next-gen Xbox and Playstation. So RDNA2 will last a long time as it gains performance benefits of console use, same as GCN did.


The Nvidia rumor looks to push back the release. The prior one mentioned 1H 2020; either way, this seems to narrow it down to late summer/early fall. Let the console/GPU battle commence.


----------



## skupples

from what I understand, GCN 2.0 lacks the "fine wine" capabilities due to inherent design decisions. 

lemme see if i can find it...

something about GCN 1.0 having 4 rendering pipelines, and GCN 2.0 only having 2?


----------



## pompss

All I want is HDMI 2.1 on the future GPUs so I can use my LG OLED at 4K 120Hz


----------



## Clos

pompss said:


> All I want is HDMI 2.1 on the future GPUs so I can use my LG OLED at 4K 120Hz


I'm with ya, but I just want more HDMI ports than DisplayPorts.


----------



## EniGma1987

Clos said:


> I'm with ya, but I just want more HDMI ports than DisplayPorts.


People used to complain about not being able to do triple-monitor setups well with 2 HDMI and a DP, or even 2 of each type. You either had to get monitors that had inputs for both ports (which were rare), or have two monitors of one model and one of another, or you had to use two graphics cards, which didn't always work right. So the industry changed to triple DisplayPort outputs to satisfy the triple-monitor crowd. DP is meant more for monitors, it has had far more bandwidth than HDMI, and it costs the manufacturers less, which is why they went that way.

It would be nice to see triple HDMI 2.1 ports on GPUs going forward, but I doubt that will happen. They will include a single HDMI for necessary connectivity, and expect people using multi-monitor PCs to continue to use display ports.


----------



## Clos

EniGma1987 said:


> People used to complain about not being able to do triple monitor setups well with 2 HDMI and a DP or even 2 of each type. You either had to get monitors that had inputs for both ports (which were rare), or have two monitors of one model and one of another, or you had to use two graphics cards which didnt always work right. So the industry changed to triple display port outputs to satisfy the complaints of the triple monitor crowd. DP is meant more for monitors and it had way more bandwidth than HDMI has had, and it costs the manufacturers less money which is why they went that way.
> 
> It would be nice to see triple HDMI 2.1 ports on GPUs going forward, but I doubt that will happen. They will include a single HDMI for necessary connectivity, and expect people using multi-monitor PCs to continue to use display ports.


I get why they did it, and in most cases I wouldn't mind using DisplayPort, but in my rare use case, my issue with DisplayPort is the limited cable distance. I rebuilt my 'office area' and embedded my display cables inside the walls along with USB cables for a "cleaner" desk look, and had to go with HDMI all the way through. So I have 3 Dells with both ports on the monitors, but ended up, as you say, using my 1080 Ti for my main, and my 1070 'spare' driving the other two monitors, using its HDMI for the farthest monitor and DVI converted to HDMI for the closest.

But anyways, back on track: I wish there were more HDMI ports, or at least two card options. BUT that's asking for them to put out another SKU, and we all know that won't happen at all. haha


----------



## keikei

pompss said:


> All I want is HDMI 2.1 on the future GPUs so I can use my LG OLED at *4K 120Hz*



I want a card that can run those specs. RDR2 musters a measly 4k/30fps with the top card. We got a LONG WAY to go!


----------



## BigMack70

keikei said:


> I want a card that can run those specs. RDR2 musters a measly 4k/30fps with the top card. We got a LONG WAY to go!


My dream fantasy is they come out with some new kind of game-agnostic version of multi-GPU so we can run two (or more?) cards and get actual scaling performance in more than 1/3rd of games. 

I know it's a dream fantasy and will never ever happen, but hope springs eternal.


----------



## epic1337

BigMack70 said:


> My dream fantasy is they come out with some new kind of game-agnostic version of multi-GPU so we can run two (or more?) cards and get actual scaling performance in more than 1/3rd of games.
> 
> I know it's a dream fantasy and will never ever happen, but hope springs eternal.


might happen when they move GPU dies to chiplet designs; they'll be forced to make GPUs modular as a result.


----------



## keikei

BigMack70 said:


> My dream fantasy is they come out with some new kind of game-agnostic version of multi-GPU so we can run two (or more?) cards and get actual scaling performance in more than 1/3rd of games.
> 
> I know it's a dream fantasy and will never ever happen, but hope springs eternal.



An early mgpu driver was found not too long ago. I suspect it'll get announced with Ampere. Nvidia has been working OT in regards to side benefits with RTX. Dual/Tri with current cards are a much cheaper upgrade vs next gen if viable.


----------



## BigMack70

keikei said:


> An early mgpu driver was found not too long ago. I suspect it'll get announced with Ampere. Nvidia has been working OT in regards to side benefits with RTX. Dual/Tri with current cards are a much cheaper upgrade vs next gen if viable.


I am less confident. They've been increasingly dropping support for SLI, both in hardware (midrange cards can no longer do it) and software (lower percentage of games supported, supported games rarely scale above +60% perf). NVLink was hyped up but in the end was a big nothing. 

Now, *maybe* this is because they recognize what everyone else recognizes - that AFR is dead tech that's going nowhere, and *maybe* they've been doing a huge amount of R&D to figure out some other solution to multi-GPU that will actually work in a more general modern use case, but I'll believe it when I see it. It sounds like unicorn tech to me.


----------



## Defoler

Clos said:


> I get why they did it and in most cases i wouldn't mind using display ports but in my rare user case, My issue with Display port is the lack of distance allowed with cabling. I rebuilt my 'office area' and embedded my display cables inside of the walls along with usb cables for a "cleaner" desk look. Had to go with HDMI all the way through. So i have 3 dell with both ports on the monitors but ended up, as you say, using my 1080ti for my main, and my 1070 'spare' as my driver for the other two monitors using that HDMI for the farthest monitor and DVI converted to HDMI for the closest.
> 
> But anyways, back on track, i wish there were more HDMI port  Or at least two card options BUT, that's asking for them to give out another sku and we all know that won't happen at all. haha


Not sure why you think HDMI 2.1 is different. License/spec-wise, both DP 1.4a and HDMI 2.1 are rated for about 3 m max cable length. You also need a cable rated Ultra High Speed to get the full benefit of HDMI 2.1. And you can get high-quality cables longer than 3 m that still give the full benefit of DP.
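For context on why the link (not just the cable) matters for that 4K 120Hz OLED: some rough back-of-the-envelope math (this ignores blanking intervals and other protocol overhead, so treat the numbers as approximate, and the effective link rates below are the commonly cited ones, not measurements):

```python
# Rough uncompressed video bandwidth needed for 4K 120 Hz at 10-bit RGB,
# ignoring blanking intervals and protocol overhead.
width, height, fps, bits_per_pixel = 3840, 2160, 120, 30  # 10 bits x 3 channels

gbps = width * height * fps * bits_per_pixel / 1e9
print(f"Payload needed: {gbps:.1f} Gbit/s")  # ~29.9 Gbit/s

# Approximate effective data rates of the two links being discussed:
links = {"DP 1.4a (HBR3, after 8b/10b)": 25.92,
         "HDMI 2.1 (FRL, after 16b/18b)": 42.67}
for name, rate in links.items():
    print(f"{name}: {rate} Gbit/s -> {'enough' if rate >= gbps else 'needs DSC'}")
```

Which is roughly why HDMI 2.1 keeps coming up in this thread: it's the first common link with uncompressed headroom for 4K120 at 10-bit, while DP 1.4a needs DSC or chroma subsampling to get there.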


----------



## skupples

keikei said:


> I want a card that can run those specs. RDR2 musters a measly 4k/30fps with the top card. We got a LONG WAY to go!


it's actually lower than that, if you peg it at Ultra.

luckily though, it's quite easy to get 4K60 now with very little IQ change from Ultra.

game's way more enjoyable @ 120FPS though. All my completions went from bronze/silver to mostly golds.

some day people will realize ULTRA is oftentimes a settings tier put in place for hardware that doesn't exist yet.


----------



## ToTheSun!

skupples said:


> some day people will realize ULTRA is often times a setting tier put in place for hardware that doesn't exist yet.


What about EXTREME?


----------



## umeng2002

Playing on the highest setting is to appease your OCD.

Middle or the next highest from middle (unless that's the highest) is what you should be playing on with a "new" game on "new hardware."


----------



## skupples

ToTheSun! said:


> What about EXTREME?


 

let me restate - some day people will realize that the highest toggle in the graphics options is oftentimes intended for hardware that doesn't exist yet. I figured this out first hand by jumping on the GK110 bandwagon. It did, however, finally allow me to play games @ 5760x1080 @ medium/high settings while maintaining 60+


----------



## ZealotKi11er

umeng2002 said:


> Playing on the highest setting is to appease your OCD.
> 
> Middle or the next highest from middle (unless that's the highest) is what you should be playing on with a "new" game on "new hardware."


The reason to play using the highest settings is simply laziness. I know that I am not losing any IQ. The nice thing is that some people do guides with proper comparisons and you can use those settings instead. Most of the time I don't need to lower settings with a 2080 Ti at 4K60.


----------



## Zam15

I just want an uber-powerful single card. Not going SLI the next time around; all the new features are dropping on Turing and newer, in some cases Pascal, but Maxwell has been completely left out to dry. Including the VR updates in the drivers that hit today. Sigh...

I really loved DSR; unfortunately it doesn't run in SLI, so I would only be able to use it on older titles. I feel DSR is a great feature for older and new games to clean things up.

There are a few features I'll immediately disable, including motion blur and film grain.


----------



## Sheyster

Is MFAA even a thing anymore? It seems like none of the new AAA titles use AA technology that would leverage MFAA. It was not supported with SLI, and it seems devs have moved away from MSAA.


----------



## iTurn

skupples said:


> 3080TI will be the same price.
> 
> 
> highly unlikely. *NVidia hasn't given us gains like that in years*. 30% max. oh right 2080ti is 50%
> 
> i have pretty high expectations actually, if not at least we know its the first hdmi 2.1 card
> 
> either way, i'm trying to hold out for the cards that release AFTER the new consoles. Not the last run up to the new consoles. That's like the last bit of paste in the dx11 tube.


That's not a fair argument (IMO): 980 Ti -> 1080 Ti was a +50% gain.
1080 Ti -> 2080 Ti was ~30%. Yes, that's less than the 50% expectation, but the tensor cores for ray tracing are why we didn't get that leap in performance; the silicon is there, it's just being utilized differently than the norm.

The 1080 Ti is 471 mm² vs 775 mm² for the 2080 Ti.


----------



## betam4x

ZealotKi11er said:


> The reason to play using the highest settings is simply for laziness. I know that I am not losing and IQ. The nice thing is that some people do guides with proper comparisons and you can use those settings instead. Most of the time I don't need to lower settings with 2080 Ti at 4K60.


I do need to jump in here and say that, with nearly all the games I play, I can typically play on the highest, or close to the highest settings on my 1080ti. There is a marked visual difference in most games between low/medium/high. Anyone that tries to justify playing on lower settings by stating that there is no visual difference is full of it. One can easily do side by side comparison screenshots and *see* the differences.


----------



## Clos

Defoler said:


> Not sure why do you think HDMI 2.1 is different. License/spec wise, both DP 1.4a and HDMI 2.1 are about 3m max cable length in terms of rating. You also need a cable that is rated ultra high speed to get the full benefit of HDMI 2.1. You can get high quality cables longer than 3m that can still give the full benefit of DP.


These are just what I've found... not sure if they're 'good enough' sources *shrug*

DP specification: go to 4.1.3
https://en.wikipedia.org/wiki/DisplayPort#Cable_length

HDMI spec: go to 'Relationship with DisplayPort'
https://en.wikipedia.org/wiki/HDMI#Cables

Either way... I have yet to find a DisplayPort cable capable of 10 ft plus without resolution/image-quality degradation.
I'm using 1.5 ft, 24 ft in-wall rated, and 4 ft HDMI cables and run 1440p and 4K np *shrug* Can't find a DisplayPort cable to save my life.


----------



## Clos

BigMack70 said:


> I am less confident. They've been increasingly dropping support for SLI, both in hardware (midrange cards can no longer do it) and software (lower percentage of games supported, supported games rarely scale above +60% perf). NVLink was hyped up but in the end was a big nothing.
> 
> Now, *maybe* this is because they recognize what everyone else recognizes - that AFR is dead tech that's going nowhere, and *maybe* they've been doing a huge amount of R&D to figure out some other solution to multi-GPU that will actually work in a more general modern use case, but I'll believe it when I see it. It sounds like unicorn tech to me.





Just a thought: isn't the idea of DX12, Vulkan, etc. to allow mGPU without the need for SLI? I could be completely wrong, but could it be that SLI and CrossFire connectors are disappearing because they've essentially become obsolete due to the new APIs' ability to use multiple GPUs (and supposedly even mix and match) without SLI/CrossFire drivers?

I think the real problem is devs don't care and will never program their games to utilize multiple GPUs... so Nvidia/AMD didn't and aren't killing it; game devs are.


----------



## 113802

Clos said:


> Just a thought, isn't the idea of DX12, Vulkan etc allow mGPU without the needs for "SLI". I could be completely wrong and just a thought but, could it be SLI and Xfire connectors and such are disappearing since they've essentially become obsolete due to the new API's ability to use mGPU's (and supposedly even mix match) without the need for sli/xfire drivers etc.
> 
> i think the real problem is dev's don't care and will never program their games to utilize multi-gpu's.. so Nvidia/AMD didn't or aren't killing it, game dev's are.


nVidia explains it here: https://developer.nvidia.com/explicit-multi-gpu-programming-directx-12

Bridges will probably disappear for gaming but they are still needed for compute. AMD made their own interconnect called xGMI.


----------



## Clos

WannaBeOCer said:


> nVidia explains it here: https://developer.nvidia.com/explicit-multi-gpu-programming-directx-12
> 
> Bridges will probably disappear for gaming but they are still needed for compute. AMD made their own interconnect called xGMI.


Really Good Read, Thanks for linking that WBOc.


----------



## BigMack70

Clos said:


> Just a thought, isn't the idea of DX12, Vulkan etc allow mGPU without the needs for "SLI". I could be completely wrong and just a thought but, could it be SLI and Xfire connectors and such are disappearing since they've essentially become obsolete due to the new API's ability to use mGPU's (and supposedly even mix match) without the need for sli/xfire drivers etc.
> 
> i think the real problem is dev's don't care and will never program their games to utilize multi-gpu's.. so Nvidia/AMD didn't or aren't killing it, game dev's are.


I could be mistaken, but as far as I'm aware, there is no way to implement explicit multi-GPU in a game-agnostic way at the driver level. That means it will never see meaningful implementation, because game developers will never broadly implement or invest in such technology. The only companies that have a dog in this race are the hardware manufacturers, and the only thing they can really control is driver level implementation. 

Explicit multi-GPU has been around for several years, and I'm not aware of any game outside of Ashes of the Singularity that uses it. Game devs just aren't going to support it.

As for what killed multi-GPU so far, it was the reliance on AFR as the primary method of implementing it. AFR doesn't work with a large number of modern rendering techniques. Most modern games, in rendering a frame, depend upon data present from a previous frame, and for all practical purposes that makes AFR impossible. It didn't get killed so much as multi-GPU failed to evolve with the times, and so traditional SLI became obsolete.
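To make the frame-dependency point concrete, here's a toy scheduling model (not real driver code, just an illustration): if frame N reads a buffer that frame N-1 wrote (TAA history, motion vectors, etc.), the second GPU can't start early, and the parallelism AFR relies on disappears.

```python
# Toy model: alternate-frame rendering (AFR) across 2 GPUs, 10 ms of work per frame.
FRAME_COST = 10  # ms of GPU work per frame

def afr_total_time(frames, temporal_dependency):
    gpu_free = [0, 0]        # time at which each GPU next becomes idle
    prev_frame_done = 0      # completion time of the previous frame
    for f in range(frames):
        gpu = f % 2          # frames alternate between the two GPUs
        start = gpu_free[gpu]
        if temporal_dependency:
            # e.g. TAA / motion vectors: frame f reads frame f-1's output,
            # so it cannot start until the previous frame has finished
            start = max(start, prev_frame_done)
        prev_frame_done = start + FRAME_COST
        gpu_free[gpu] = prev_frame_done
    return prev_frame_done

print(afr_total_time(100, temporal_dependency=False))  # 500  -> ~2x scaling
print(afr_total_time(100, temporal_dependency=True))   # 1000 -> fully serialized
```

With the dependency in place, two GPUs finish 100 frames no faster than one would, which is exactly the "AFR is dead tech" situation described above.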


----------



## Sheyster

BigMack70 said:


> The only companies that have a dog in this race are the hardware manufacturers, and the only thing they can really control is driver level implementation.


And it's a very small dog (small percentage of the market). It'll be interesting to see what happens next.


----------



## rv8000

keikei said:


> I want a card that can run those specs. RDR2 musters a measly 4k/30fps with the top card. We got a LONG WAY to go!


Without some monumental leap in GPU processing power, I don't think 4K/120fps will ever happen. Being dependent on both hardware and software, that target FPS will always be a moving goalpost. Until we hit a limitation on the software side, the hardware is never going to keep up. We may be lucky to have one or two games released in a year that look great and hit fps over 60, but not without potential sacrifices to image quality.


----------



## skupples

i just wish dual GPU support was part of the FLEX bribery bundle.


----------



## BigMack70

rv8000 said:


> Without some monumental leap in GPU processing power I don't think 4k/120fps will ever happen. Being dependent on both hardware and software, that target FPS will always be a moving goal post. Until we hit a limitation on the software side, the hardware is never going to keep up. We may be lucky to have one or two games released within a year that look great and hit fps over 60, but not with out potential sacrifices to image quality.


We'll get there at some point this console generation, I think. 

In 2013, the GTX 780's performance profile at 1440p was fairly similar to the RTX 2080 Ti's current performance profile at 4K.

It took a while, but 1440p120 is now a very reasonable target on top hardware. 4k 120 should be a reasonable target without compromising visual fidelity in a few more GPU generations.


----------



## rv8000

BigMack70 said:


> We'll get there at some point this console generation, I think.
> 
> In 2013, the GTX 780s performance profile at 1440p was fairly similar to the RTX 2080 Ti's current performance profile at 4k.
> 
> It took a while, but 1440p120 is now a very reasonable target on top hardware. 4k 120 should be a reasonable target without compromising visual fidelity in a few more GPU generations.


If we're lucky, GPUs in 2030 may be able to reliably hit 4K120 on titles released in the same year, and only if game engines were to completely stop any sort of graphical improvement. 1440p to 4K is a much larger jump in pixel count than 1080p to 1440p, and it definitely shows in benchmarks.

Take a look at any recently released games:

Best case, in COD:MW the 2080 Ti averages 77fps @ 4K, all the way to a worst case of 46fps @ 4K in Red Dead. Without nuking quality settings you'll never approach 120fps +/- 10fps. Even at 1440p in these titles, the 2080, 2080S, Radeon VII, and 1080 Ti struggle to hit the 120fps mark.

The next problem becomes when/if ray tracing becomes the main rendering technique across all engines and sets hardware performance back to the stone age. 4K120 is simply unrealistic. It's sad, because even when playing older games upscaled via VSR/DSR, 4K looks amazing. Unless something revolutionary happens in software/hardware in the next few years, the typical 30-50% generational performance improvement isn't going to cut it. At that rate 4K120 will certainly be 15+ years off.


----------



## BigMack70

rv8000 said:


> If we're lucky gpus in 2030 may be able to reliably hit 4k120 on titles released in the same year if game engines were to completely stop any sort graphical improvement. 2k to 4k is a much larger jump in number of pixels than 1080 to 1440p, and it definitely shows when it comes to benchmarks.
> 
> Take a look at any recently released games:
> 
> Best case with COD:MW the 2080ti averages 77fps @ 4k, all the way to worst case of averaging 46fps @ 4k in Red Ded. Without nuking quality settings you'll never approach 120fps +/- 10 fps. Even looking at 1440p in these titles the 2080, 2080S, Vega 7, and 1080ti even struggle to hit the 120 fps mark.
> 
> The next problem then becomes when/if ray tracing becomes the main rendering technique across all engines and sets hardware performance back to the stone age. 4k120 is simply unrealistic. It's sad because even when playing older games upscaled via VSR/DSR, 4k looks amazing. Unless something revolutionary happens when it comes to software/hardware advancements in the next few years, the typical 30-50% generational performance improvement isn't going to cut it. At that rate 4k120 will certainly be 15+ years off.



Ray Tracing is certainly something that could toss framerates back to 1995 depending on how it gets implemented and how fast the hardware develops to handle it, I certainly grant you that. I just don't expect it, because I expect the implementation to be heavily limited by console hardware. If I'm wrong, then you are probably correct that 4k120 is not going to be a thing at highest settings for a verrrrry long time.

However, I expect consoles are going to prevent that scenario, so I don't agree at all with your analysis of how daunting 4k 120 is to run. It's a reasonable assumption to think that GPU performance progress from 2020 through 2027 will mirror GPU progress from 2013 to 2020. And if the top cards (GTX 780) in 2013 were barely handling the top resolution (1440p) at 60fps, but now in 2020 can easily do 120fps, it's not unreasonable to expect the same in another similar timeframe - that 4k120 will be ordinary on top hardware in about another 6-7 years. Probably 2-3 GPU generations after Ampere. 

This is just the ordinary cycle of graphics. New consoles --> base game engine fidelity increases to match console capability --> PC hardware continues to develop --> PC hardware pushes up framerate and resolution because base fidelity is limited by consoles --> New consoles come out --> repeat


----------



## skupples

2x2080ti in RDR2 on ultra 4K (vulkan) = ~80-90FPS.

it isn't all that far off outside of RT realm.


----------



## rv8000

BigMack70 said:


> Ray Tracing is certainly something that could toss framerates back to 1995 depending on how it gets implemented and how fast the hardware develops to handle it, I certainly grant you that. I just don't expect it, because I expect the implementation to be heavily limited by console hardware. If I'm wrong, then you are probably correct that 4k120 is not going to be a thing at highest settings for a verrrrry long time.
> 
> However, I expect consoles are going to prevent that scenario, so I don't agree at all with your analysis of how daunting 4k 120 is to run. It's a reasonable assumption to think that GPU performance progress from 2020 through 2027 will mirror GPU progress from 2013 to 2020. And if the top cards (GTX 780) in 2013 were barely handling the top resolution (1440p) at 60fps, but now in 2020 can easily do 120fps, it's not unreasonable to expect the same in another similar timeframe - that 4k120 will be ordinary on top hardware in about another 6-7 years. Probably 2-3 GPU generations after Ampere.
> 
> This is just the ordinary cycle of graphics. New consoles --> base game engine fidelity increases to match console capability --> PC hardware continues to develop --> PC hardware pushes up framerate and resolution because base fidelity is limited by consoles --> New consoles come out --> repeat


The jump isn't the same though. On top of that, GPU performance gain per generation has regressed (not necessarily due to inability to develop faster cards, but due to market stagnation and changes in rendering techniques). It's not a linear progression, for both performance increases and the difficulty of running 4K as opposed to 1440p: 4K has 2.25x the pixels of 1440p, while 1440p has only about 1.78x the pixels of 1080p.
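The pixel ratios are easy to check; plain arithmetic on the standard resolutions:

```python
# Pixel counts for the resolutions being compared in this thread.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["4K"] / pixels["1440p"])    # 2.25 -> 4K pushes 2.25x the pixels of 1440p
print(pixels["1440p"] / pixels["1080p"])  # ~1.78 -> 1440p is ~1.78x 1080p
```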

Your theory heavily relies on the assumption that developing for a console is a 1:1 translation to PC. It isn't, and that has been proven time and time again. The console environment gets to leverage the fact that there is a single hardware/software configuration to develop for; weaker hardware can perform better due to better resource allocation and efficiency. On top of that, 90% of the time games will be subpar ports to PC, and developers will add additional settings, graphical advancements, etc., making the games harder to run on PC.

Simply put, it WILL take longer than 7 years to get to 4K120. There are games the 2080 Ti can't even run at 1440p120 today.
@skupples 

Multi GPU solutions are a moot point unless you enjoy next to no support for most games, continuing on a downward trend.


----------



## keikei

skupples said:


> 2x2080ti in RDR2 on ultra 4K (vulkan) = ~80-90FPS.
> 
> it isn't all that far off outside of RT realm.


This guy has it at 60-70fps, around 80% scaling. Yeah, that's rough in terms of frame numbers, but it's not an easy game to run.

https://www.youtube.com/watch?v=H_p9ZgiQ2mA


----------



## BigMack70

rv8000 said:


> The jump isn't the same though.


You appear not to understand my point. 1440p 60fps --> 1440p 120fps is literally the same jump as 4k 60fps --> 4k 120fps 

The GTX 780 was about a 1440p60 card in 2013. Did better than 1440p60 in some games, couldn't quite hit it in others. In general, the 2080 Ti is the same - about a 4k60 card; hits 4k60 or better in most titles, doesn't get there in a few. And most likely the better analogue to the GTX 780 will be whatever comes out this year, since that puts it a few months before console launch just like the GTX 780, and Ampere should easily be 4k60 capable.


----------



## rv8000

BigMack70 said:


> You appear not to understand my point. 1440p 60fps --> 1440p 120fps is literally the same jump as 4k 60fps --> 4k 120fps
> 
> The GTX 780 was about a 1440p60 card in 2013. Did better than 1440p60 in some games, couldn't quite hit it in others. In general, the 2080 Ti is the same - about a 4k60 card; hits 4k60 or better in most titles, doesn't get there in a few. And most likely the better analogue to the GTX 780 will be whatever comes out this year, since that puts it a few months before console launch just like the GTX 780, and Ampere should easily be 4k60 capable.


It doesn't work that way.

Per-FPS requirements are different as resolution increases. Every additional fps @ 4K requires more performance than every additional fps @ 1440p; every time you increase the resolution, the steeper the requirement becomes on the hardware.

In order to hit these idealistic fps targets at higher resolutions, we would have to see the opposite of current GPU performance trends. The generational gap between top-tier cards has shrunk every generation since the 680. If we wanted 4K120 we would have to see that trend flip on its head, with 200% performance increases, which WILL NOT HAPPEN.

Thus the end conclusion: it will take significantly longer to hit the same FPS leap you're speaking of.


----------



## skupples

keikei said:


> This guy has it 60-70fps. Around 80% scaling. Yeah, thats rough in terms of frame numbers, buts its not an easy game to run.
> 
> https://www.youtube.com/watch?v=H_p9ZgiQ2mA


that's also less than 10 days after release, and that it is, thus why I used it as an example. Folks on truly top-of-the-line SLI 2080 Ti systems maintain at LEAST 60 maxed out, with others reporting/screenshotting 80s-90s.

if Bashaaa sees this he'll link his screenies.


----------



## BigMack70

rv8000 said:


> It doesn't work that way.


It absolutely works this way. Performance increases GPU to GPU are relative to the performance of the previous GPU, not to some arbitrary resolution or framerate numbers. We could be talking about 640x480 or 8k resolution; the trend is that doubling from 60 to 120fps at a given resolution in current titles takes 6-8 years.

Trends can change. But I think this trend will continue. And if the trend changes, it will have nothing to do with the fact that 4k is a higher resolution than 1440p.

--EDIT-- Also, I don't think you are understanding the jumps in GPU architecture performance correctly...

GTX 285 (big Tesla) --> GTX 580 (big Fermi) was about a 65% performance increase
GTX 580 --> GTX 780 Ti (big Kepler) was about a 65% performance improvement
780 Ti --> 980 Ti (big Maxwell) was about a 43% performance increase
980 Ti --> 1080 Ti (big Pascal) was around an 85% performance increase
1080 Ti --> 2080 Ti (big Turing) was about a 38% performance increase

This is reasonably consistent over time. Maxwell was a slightly smaller increase because it was on the same process node as Kepler. Turing was because of all the ray tracing hardware, and it was only kinda sorta on a new process.

If we average this all out, that's a 60% improvement per generation. That means that 2-3 generations after Ampere - my stated timeframe for when I expect 4k120 to be a thing - the GPUs will be 4-7x as powerful in terms of relative gaming performance compared to the 2080 Ti. It's very reasonable to assume that GPUs several times more powerful than what's available in consoles will be able to hit the standard target resolution of the generation (4k) at high framerate (120fps).
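Sanity-checking that compounding with the same per-generation numbers (treating the ~60% average as a rough planning figure, not a prediction):

```python
# Per-generation gains listed above: big Tesla->Fermi->Kepler->Maxwell->Pascal->Turing.
gains = [0.65, 0.65, 0.43, 0.85, 0.38]

avg = sum(gains) / len(gains)
print(f"average gain per generation: {avg:.0%}")  # ~59%

# Compounding forward from a 2080 Ti baseline: Ampere plus 2-3 more generations.
for gens in (3, 4):
    print(f"{gens} generations out: {(1 + avg) ** gens:.1f}x the 2080 Ti")
```

That works out to roughly 4.0x at three generations and 6.4x at four, which is where the "4-7x" range above comes from.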


----------



## umeng2002

And the GTX 580 was $550. Inflation is almost a non-factor. Value has gone down significantly since then. I blame a lack of high-end competition from AMD.


----------



## BigMack70

umeng2002 said:


> And the GTX 580 was $550. Inflation is almost a non-factor. Value has gone down significantly since then. I blame a lack of high-end competition from AMD.


Obviously. Nvidia literally doubled their pricing across the entire product stack beginning with Kepler and the GTX 600 series, and (in part) because everyone in the tech industry fell over themselves to praise the GTX 680, they got away with it and so they've maintained that doubled pricing very consistently. My point is only about the performance, not the value or the merits of buying their high end cards or any of that.


----------



## skupples

hopefully that 7nm shrink gives folks the perf they've been waiting for since the 1080 Ti.

I may even go 2080ti>>3080>>3080ti, unless a Titan comes along first. I haven't paid enough attention to the release cycle since Kepler to guess.

I see Ampere 1.0 being the generation that really irons out the widescreen fad. 4K120 goodness @ decent settings isn't gonna kick off until the 4 series at the earliest. Yes, you can do it now in 5+ year old titles, but not anywhere else. [email protected]+ is far more enjoyable than anything I was able to get 4K to do at native... minus the clarity, which all these sharpening tools make less important now.

& hell, we literally don't even have a true 4K120 card yet. The 3070+ will probably be the first ones (I'd be surprised if NV puts it on the entire line)


----------



## keikei

skupples said:


> hopefully that 7nm shrink gives folks the perf they've been waiting for since 1080ti.
> 
> I may even go 2080ti>>*3080>>3080ti,* unless Titan comes along first. I haven't paid enough attention to the release cycle since keplar to guess.
> 
> I see Ampere 1.0 being the generation that really irons out the widescreen fad. 4K120 goodness @ decent settings isn't gonna kick off until 4 series at the earliest. Yes, you can do it now in 5+ year old titles, but not anywhere else. [email protected]+ is far more enjoyable than anything I was able to get 4K to do at native... minus the clarity, which all these sharpening tools make less important now.
> 
> & hell, we literally don't even have a true 4K120 card yet. 3070+ will probably be the first ones (i'd be surprised if NV puts it on the entire line)



I expect the same release date. Green can't milk us as gud this cycle. AMD will come out swinging this summer.


Ampere announcing in March?


----------



## christoph

guttheslayer said:


> Like I said, let Market share decide.
> 
> JHH had his net worth drop by half in a matter of months. So let that continue. CEOs who think like Elmy, that is what happens to them lol.
> 
> 
> All I can agree on is that $699-$799 for a flagship card on a mid-range die is going to stay. I am not so sure about the 3080 Ti if it does appear in 2021; it might be $1200, or it might be $700, all depending on competition from AMD at that point.





it's not about competition, it's about stupidity


----------



## skupples

3080ti is dropping end of 2020.

Sometime after September, bundled with CyberPunk 2077.


----------



## bonami2

Hmm, maybe a GPU that's going to force me to upgrade my 2600K. Sold my 4790K because the 2600K performs almost the same lol


----------



## Zam15

Salt? A truckload of it? Memory count and range seem off.

https://www.gizchina.com/2020/01/19...070-specifications-leaked-for-the-first-time/

"As per the recent emerging report, two new GPUs have appeared online with core codes GA103 and GA104. Both GPUs are part of the high-end offering of the Ampere family. On an unrelated note, this is the first time that an Nvidia core code has ended with the numeral 3. Coming to the specifications, the GA103 will come with 10/20GB of GDDR6 VRAM capacity. It will also feature an SM (Streaming Multiprocessor) count of 60, 3840 stream processors and 320-bit video memory.

The GA104, on the other hand, will feature 8/16GB of GDDR6 VRAM capacity along with 48 SM arrays. There will be 3072 stream processors and 256-bit video memory. Looking at the specs, the GA103 points towards the Nvidia RTX 3080 GPU while the latter indicates the RTX 3070. Both new GPUs will be manufactured on a 7nm process. It remains unknown who will manufacture the chips; we might see Samsung doing so, considering the good relations between the South Korean brand and Nvidia. Furthermore, TSMC is already pretty packed with its current 7nm production capacity."
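For what it's worth, the leaked SM counts imply the core counts directly, assuming the usual 64 CUDA cores per gaming SM (an assumption carried over from Turing, not something the leak states):

```python
# Back-of-the-envelope check of the leaked SM counts, assuming 64 CUDA
# cores per SM (the Turing gaming-die layout -- an assumption, not a spec).
CORES_PER_SM = 64

leaks = {"GA103": 60, "GA104": 48}  # SM counts from the quoted article

for die, sms in leaks.items():
    print(f"{die}: {sms} SMs x {CORES_PER_SM} = {sms * CORES_PER_SM} cores")
# Both rumored figures line up with the 64-cores-per-SM rule:
# GA103 -> 3840 cores, GA104 -> 3072 cores.
```

So the two dies would sit only 25% apart in core count, which is what makes the leak suspicious to several posters below.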


----------



## skupples

that sounds like 3070 and 3080. or 3060 n 70? I haven't kept up on my core maths.

and yep, that memory seems weird, but capacity isn't really the issue @ 4K, it's bandwidth. Top tier cards are rarely anywhere close to maxing out vram. 8+ is viable for mid+

pretty sure I'm grabbing a Titan this time around. At least, if it drops before 3080ti? 100% going Titan.


----------



## guttheslayer

That's a truckload of salt indeed; it's difficult to believe.


I will hold off till more information is out, or till the day NV unveils them officially.


For the record, NV has never made adjacent chips in a stack that differ in size by less than 50%. GA103 is way too close to GA104 to see a good performance lift.


----------



## ZealotKi11er

60 SMs + 10GB + 320-bit sounds very plausible for a 3080
48 SMs + 8GB + 256-bit also very possible for a 3070

With higher clocks, faster GDDR6, and a new architecture, they will get the gains.

GA103 (3080 ti) will have 60+ SMs + 12GB + 384-bit. 

It all depends on how good the architecture is. 

8GB is still fine for 4K. If Nvidia feels like they need more they might release double memory models.
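The bandwidth implied by those bus widths is easy to sketch. A quick estimate, with GDDR6 per-pin speeds picked as illustrative values rather than confirmed specs:

```python
# Peak memory bandwidth (GB/s) = bus width in bytes * per-pin data rate (Gbps).
def peak_bandwidth(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(peak_bandwidth(320, 14))  # 560.0 GB/s -- near the 2080 Ti's 616 GB/s
print(peak_bandwidth(320, 16))  # 640.0 GB/s with faster 16 Gbps GDDR6
print(peak_bandwidth(256, 14))  # 448.0 GB/s -- same as the RTX 2080
```

A 320-bit bus with faster GDDR6 would clear the 2080 Ti's bandwidth despite being narrower, which is consistent with the "higher clocks, faster G6" argument.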


----------



## BigMack70

skupples said:


> that sounds like 3070 and 3080. or 3060 n 70? I haven't kept up on my core maths.
> 
> and yep, that memory seems weird, but capacity isn't really the issue @ 4K, it's bandwidth. Top tier cards are rarely anywhere close to maxing out vram. 8+ is viable for mid+
> 
> pretty sure I'm grabbing a Titan this time around. At least, if it drops before 3080ti? 100% going Titan.


I'll go Titan for $1-1.5k. Not paying $3k for a graphics card, ever.


----------



## guttheslayer

ZealotKi11er said:


> 60 SMs + 10GB + 320-bit sounds very plausible for a 3080
> 48 SMs + 8GB + 256-bit also very possible for a 3070
> 
> With higher clocks, faster GDDR6, and a new architecture, they will get the gains.
> 
> GA103 (3080 ti) will have 60+ SMs + 12GB + 384-bit.
> 
> It all depends on how good the architecture is.
> 
> 8GB is still fine for 4K. If Nvidia feels like they need more they might release double memory models.


You are missing the point. NV tries to create as few dies as possible to cover as much of the product range as possible; it's cost-efficient. Each die design requires a hefty sum in R&D.


GA103, if it comes with 3840 cores, could easily have 20% of them fused off/disabled (to salvage bad yields) to give 3072 cores, which matches the exact configuration of GA104. Do you think the GA103 will be manufactured at 100% yield, and that they will throw away any die with a single speck of a defect? Also, if you look back, the GTX 1070 is effectively the same as the GTX 1080 but with 25% of its cores disabled.


Go back through NV's history and compare the die sizes along the stack each generation. Each die is always at least 50% bigger than the one below it, for a good reason.
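The salvage math behind that point, sketched out (core counts are the rumored figures; the 20% cut is the hypothetical above, not a confirmed SKU):

```python
# Die harvesting: a full GA103 with 20% of its cores fused off lands
# exactly on the rumored GA104 core count, which is the argument against
# two separate dies sitting this close together.
GA103_FULL = 3840               # rumored full-die core count
cut_down = int(GA103_FULL * (1 - 0.20))
print(cut_down)                 # 3072 -- identical to the rumored GA104

# Precedent: GTX 1080 (GP104, 2560 cores) vs GTX 1070 (same GP104 die
# with 25% of cores disabled) -- salvage parts fill the gap below a full
# die, so a second die with no extra headroom rarely pays for its R&D.
GP104_FULL = 2560
print(int(GP104_FULL * 0.75))   # 1920 -- the GTX 1070 configuration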


----------



## Damage Inc

BigMack70 said:


> I'll go Titan for $1-1.5k. Not paying $3k for a graphics card, ever.


Titan for $1-1.5k? Yeah, dream on.


----------



## BigMack70

Damage Inc said:


> Titan for $1-1.5k? Yeah, dream on.


Last I checked, $1k was the Titan price for a long time. It depends entirely on if they have competition from AMD or not. If they do, I'd fully expect their pricing to come back down to earth a bit, with 3080 = $500, 3080 Ti = $700-800, Titan = $1000-1500. 

If no competition from AMD, then yeah, they're gonna keep up this nonsense. But it wasn't that long ago Titan cards were $1k.


----------



## skupples

a long time? 

didn't that only last for Kepler and Maxwell? After which the Ti replaced the Titan in the gaming segment, and the Titan was pushed up to "uberprosumer"

almost positive titan has spent more time @2K+ than it did @ 1K  

will it go back down? probably not by much. Instead, they'd just eke out more perf in the Ti segment.


----------



## BigMack70

skupples said:


> a long time?
> 
> didn't that only last for Kepler and Maxwell? After which the Ti replaced the Titan in the gaming segment, and the Titan was pushed up to "uberprosumer"
> 
> almost positive titan has spent more time @2K+ than it did @ 1K
> 
> will it go back down? probably not by much. Instead, they'd just eke out more perf in the Ti segment.


2 out of 4 generations of Titan were $1k. 

The whole thing is new, and it's purely a "F You" to the market when AMD doesn't have anything within ten miles of their top card for performance.

I maintain that if AMD has a card that is even somewhat competitive with Nvidia's high end, you won't see any GPUs north of $1500. It'll be exactly like what happened with Intel where all of a sudden they cut their $2000 CPU price in half to $1000 because oh crap now there's real competition.

Now, if big Navi is only competitive with 2080 Ti, and then Nvidia starts releasing cards with +30-50% performance, then yeah I expect to see a $1k+ Ti model and a $3k+ Titan because why not.


----------



## skupples

BigMack70 said:


> 2 out of 4 generations of Titan were $1k.
> 
> The whole thing is new, and it's purely a "F You" to the market when AMD doesn't have anything within ten miles of their top card for performance.
> 
> I maintain that if AMD has a card that is even somewhat competitive with Nvidia's high end, you won't see any GPUs north of $1500. It'll be exactly like what happened with Intel where all of a sudden they cut their $2000 CPU price in half to $1000 because oh crap now there's real competition.
> *
> Now, if big Navi is only competitive with 2080 Ti, and then Nvidia starts releasing cards with +30-50% performance, then yeah I expect to see a $1k+ Ti model and a $3k+ Titan because why not.*


50/50, time flies, just not when counting in GPUs. 

and that's exactly what's going to happen, methinks. What about the Radeon VII replacement though? AMD is 100% sealing in doom and gloom for the Radeon segment if they can't even close the gap at 7nm vs. 7nm.


----------



## EniGma1987

Isn't the Gx102 normally the xx80 card? Why would Nvidia use a 103 die for it now? Also, the specs on that rumored 103 die seem very low. Dropping back down to ~3500 cores? lol. The 2080 Ti has 4352 cores. Sure there will be arch enhancements, but not enough to overcome that massive core deficit. If anything the 103 die would be a 70-class card at best.

Add to this that we are getting a die shrink to 7nm, which gives Nvidia room for even more cores than last gen. Yeah, no way the top-end die comes in at 3.5k CUDA cores.


----------



## Sheyster

BigMack70 said:


> Last I checked, $1k was the Titan price for a long time. It depends entirely on if they have competition from AMD or not. If they do, I'd fully expect their pricing to come back down to earth a bit, with 3080 = $500, 3080 Ti = $700-800, Titan = $1000-1500.


I'll be shocked if 3080 Ti drops for anything less than $1199. I don't think we'll ever see a sub $1500 Titan ever again.


----------



## ZealotKi11er

guttheslayer said:


> You are missing the point. NV tries to create as few dies as possible to cover as much of the product range as possible; it's cost-efficient. Each die design requires a hefty sum in R&D.
> 
> 
> GA103, if it comes with 3840 cores, could easily have 20% of them fused off/disabled (to salvage bad yields) to give 3072 cores, which matches the exact configuration of GA104. Do you think the GA103 will be manufactured at 100% yield, and that they will throw away any die with a single speck of a defect? Also, if you look back, the GTX 1070 is effectively the same as the GTX 1080 but with 25% of its cores disabled.
> 
> 
> Go back through NV's history and compare the die sizes along the stack each generation. Each die is always at least 50% bigger than the one below it, for a good reason.


Naming aside, it could be that the 103 is used for both the 3080 and 3080 Ti, and the 104 for the 3060 and 3070. It could also be that the 102 is not ready and we might not get a 3080 Ti right away. It all depends on when Nvidia launches. They could launch a 3080 that is 10% faster than the 2080 Ti in April and keep that until AMD releases something faster. A 102 launch with a Titan is also possible, releasing the 3080 Ti whenever they need to.


----------



## 113802

ZealotKi11er said:


> Naming aside, it could be that the 103 is used for both the 3080 and 3080 Ti, and the 104 for the 3060 and 3070. It could also be that the 102 is not ready and we might not get a 3080 Ti right away. It all depends on when Nvidia launches. They could launch a 3080 that is 10% faster than the 2080 Ti in April and keep that until AMD releases something faster. A 102 launch with a Titan is also possible, releasing the 3080 Ti whenever they need to.


A Titan will be released at the Neural Information Processing Systems conference, not because "they need to" but because they changed the target market for the card, hence the increase in memory/FP64 performance.



skupples said:


> that sounds like 3070 and 3080. or 3060 n 70? I haven't kept up on my core maths.
> 
> and yep, that memory seems weird, but capacity isn't really the issue @ 4K, it's bandwidth. Top tier cards are rarely anywhere close to maxing out vram. 8+ is viable for mid+
> 
> pretty sure I'm grabbing a Titan this time around. At least, if it drops before 3080ti? *100% going Titan.*


Hopefully you're using it for its actual purpose and not just gaming/benchmarking.


----------



## SystemTech

BigMack70 said:


> Last I checked, $1k was the Titan price for a long time. It depends entirely on if they have competition from AMD or not. If they do, I'd fully expect their pricing to come back down to earth a bit, with 3080 = $500, 3080 Ti = $700-800, Titan = $1000-1500.
> 
> If no competition from AMD, then yeah, they're gonna keep up this nonsense. But it wasn't that long ago Titan cards were $1k.


You are also assuming that if AMD does have something competitive, they will price their cards reasonably. If your competitor is asking twice the price, why keep your prices low when you don't have to?
AMD has shown they are happy to price their cards close to the respective green card, meaning even with high-end competition you could have a 3080 Ti at $1500 and a Navi 20 at $1450.

It's simple business logic, nothing to do with what the price "was" or "should" be.


----------



## BigMack70

SystemTech said:


> You are also assuming that if AMD does have something competitive, they will price their cards reasonably. If your competitor is asking twice the price, why keep your prices low when you don't have to?
> AMD has shown they are happy to price their cards close to the respective green card, meaning even with high-end competition you could have a 3080 Ti at $1500 and a Navi 20 at $1450.
> 
> It's simple business logic, nothing to do with what the price "was" or "should" be.


Competition drives prices down, not up.


----------



## skupples

Sheyster said:


> I'll be shocked if 3080 Ti drops for anything less than $1199. I don't think we'll ever see a sub $1500 Titan ever again.


it all comes down to competition. Though, true, we're highly unlikely to ever see Radeon group bring enough heat to affect prices by more than $100-$200. I think it's more likely we see more perf instead of price drops, once/if/when Radeon Group starts throwing out some competition again.

otherwise, we'll stay in the current status quo


----------



## EniGma1987

ZealotKi11er said:


> Naming aside, it could be that the 103 is used for both the 3080 and 3080 Ti, and the 104 for the 3060 and 3070. It could also be that the 102 is not ready and we might not get a 3080 Ti right away. It all depends on when Nvidia launches. They could launch a 3080 that is 10% faster than the 2080 Ti in April and keep that until AMD releases something faster. A 102 launch with a Titan is also possible, releasing the 3080 Ti whenever they need to.


I just can't see a 3500 CUDA core card being used as the 3080 this next gen. Even if Nvidia were to release a 102-400 die (or maybe 102-400 for Titan and a cut-down 102-300 with the FP64 cores removed for the Ti) as you suggest for a 3080 Ti, that would put the Ti at a minimum of around 1k cores more than the non-Ti card. It would be a huge step between models, far more than the 80 -> 70 transition. And that's ignoring that the 100 dies are typically the ones to get full FP64.


----------



## skupples

definitely don't think we'll be getting a 3080ti right away, unless there's disappointment or unknown competition brewing. (can't assume AMD is competition until they prove they are, since they haven't been competing above xx70 for 3+ years)

3080 will beat out 2080ti, n 3080ti will follow a few months later for extra milking. 

because we all know people that'll drive both, unless Titan drops first.


----------



## keikei

*Drools over 3080 Ti potential, but still running 60Hz.* :doh:


----------



## doom26464

I'm more afraid of the pricing for next gen. We already know what the 7nm improvement brings on top of any arch improvements, and it will once again leave AMD in the dust, giving Nvidia the freedom to price as they want.

Along with the typical first-six-months availability and the pricing skew that comes with it, that leaves my stomach in a bad place.


I'd love to get off my 1080 Ti but I'm not willing to pay $1100+ USD for it.


----------



## ZealotKi11er

EniGma1987 said:


> I just can't see a 3500 CUDA core card being used as the 3080 this next gen. Even if Nvidia were to release a 102-400 die (or maybe 102-400 for Titan and a cut-down 102-300 with the FP64 cores removed for the Ti) as you suggest for a 3080 Ti, that would put the Ti at a minimum of around 1k cores more than the non-Ti card. It would be a huge step between models, far more than the 80 -> 70 transition. And that's ignoring that the 100 dies are typically the ones to get full FP64.


2080 non-SUPER > 2944 
2080 Ti > 4352

A 1408 difference in CUDA cores.


----------



## skupples

ZealotKi11er said:


> 2080 non-SUPER > 2944
> 2080 Ti > 4352
> 
> A 1408 difference in CUDA cores.


so you think ampere cores are = in power to turing cores, just smaller?

i'm confused as to what your point is. You'd typically see "more" than the previous gen after a node shrink, would you not?


----------



## ZealotKi11er

skupples said:


> so you think ampere cores are = in power to turing cores, just smaller?
> 
> i'm confused as to what your point is. You'd typically see "more" than the previous gen after a node shrink, would you not?


35xx for 3080 is a possible number. 3080 Ti would need 5xxx.


----------



## skupples

so you think ampere cores are OP AF.


----------



## huzzug

BigMack70 said:


> Competition drives prices down, not up.


We haven't had competition in this segment for a long time. Duopoly isn't competition.


----------



## EniGma1987

ZealotKi11er said:


> 35xx for 3080 is a possible number. 3080 Ti would need 5xxx.


Alright, I can see your point there.


----------



## BigMack70

huzzug said:


> We haven't had competition in this segment for a long time. Duopoly isn't competition.


This is completely irrelevant to my point. Prices go down when AMD and Nvidia are competitive at a given performance level. Not up.


----------



## skupples

huzzug said:


> We haven't had competition in this segment for a long time. Duopoly isn't competition.


i'm hoping Intel is actually competing where it benefits us by Gen 3.


----------



## guttheslayer

ZealotKi11er said:


> Naming aside it could be that 103 is used for both 3080 and 3080 Ti and 104 is used for 3060 and 3070. Could also be that 102 is not ready and we might not get 3080 ti right away. It all depends when Nvidia launches. They could launch 3080 that is 10% faster than 2080 ti in April and keep that until AMD released something faster. Also, a 102 launch with Titan could be possible and released 3080 Ti whenever they need to.


That is a possibility, but it doesn't change the fact that the sizes of adjacent GPUs in a stack differ by quite a bit (50% bigger or more). Each die must have a cut-down variant to support yields, and each is usually marketed as a separate product (RTX 3080 and RTX 3070, for example).


GA103 is very fishy. Unless the 103 comes with RT cores while the 104 comes without, that small core-count difference is hard to explain. But I still don't believe NV will go with 1280-core GPC groups; if they are adding more stuff, they will most likely reduce to 1024-core GPCs (resulting in 4 GPCs and 4096 cores).


Only time will tell, but this rumor holds 10% or less credibility for me atm.


----------



## speed_demon

I know it's not the prime subject around here by any means but I'm excited to see what we end up with for mobile dGPU options. I always liked the idea that I could plug in my laptop anywhere with an outlet, hop on my hotspot, and be enjoying my games far from home. And the ever increasing SSD capacity makes laptops even more capacious with regards to games.

And no before somebody asks - I don't think laptops will ever replace a desktop. I had my gaming laptop in addition to my more serious 1700x/1080 rig.


----------



## EniGma1987

guttheslayer said:


> GA103 is very fishy. Unless the 103 comes with RT cores while the 104 comes without, that small core-count difference is hard to explain. But I still don't believe NV will go with 1280-core GPC groups; if they are adding more stuff, they will most likely reduce to 1024-core GPCs (resulting in 4 GPCs and 4096 cores).


The Titan V had a 5120-core configuration with full FP64 and tensor cores; it only lacked RT cores. For a gaming die, Nvidia could do a similar config but strip FP64 and use that space for RT cores instead. The GV100 die is 815mm2 on 12nm, but moving to Samsung 7nm with EUV, I bet that even keeping all those same resources, the die size would be manageable for a gaming-tier product.
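A rough scaling sketch of that last point, using commonly cited transistor-density figures (~25 MTr/mm² for TSMC 12nm, ~95 MTr/mm² claimed for Samsung 7nm EUV; both are approximate, and logic shrinks far better than SRAM or analog, so treat this as a best case):

```python
# Best-case die shrink from TSMC 12nm to Samsung 7nm EUV, using
# commonly cited (approximate) transistor-density figures.
V100_DIE_MM2 = 815         # GV100 die size on TSMC 12nm
TSMC_12NM_MTR_MM2 = 25     # ~25 MTr/mm^2 (approximate)
SAMSUNG_7EUV_MTR_MM2 = 95  # ~95 MTr/mm^2 claimed (approximate)

shrunk = V100_DIE_MM2 * TSMC_12NM_MTR_MM2 / SAMSUNG_7EUV_MTR_MM2
print(round(shrunk))  # ~214 mm^2 if everything scaled perfectly
# Real designs never get the full logic shrink, but even at half this
# scaling (~430 mm^2) a V100-class resource budget fits a gaming-sized die.
```

Even with generous margin for the parts that don't shrink, the point stands: a V100-scale configuration stops being reticle-busting on a 7nm-class node.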


----------



## AlphaC

BigMack70 said:


> 2 out of 4 generations of Titan were $1k.
> 
> The whole thing is new, and it's purely a "F You" to the market when AMD doesn't have anything within ten miles of their top card for performance.
> 
> I maintain that if AMD has a card that is even somewhat competitive with Nvidia's high end, you won't see any GPUs north of $1500. It'll be exactly like what happened with Intel where all of a sudden they cut their $2000 CPU price in half to $1000 because oh crap now there's real competition.
> 
> Now, if big Navi is only competitive with 2080 Ti, and then Nvidia starts releasing cards with +30-50% performance, then yeah I expect to see a $1k+ Ti model and a $3k+ Titan because why not.


The Titan gets some pro driver optimizations, and with the advent of RTX and Studio drivers there are more users of Autodesk products (AutoCAD, Inventor, Maya, 3ds Max, etc.) running GeForce GPUs with large VRAM amounts ($900 Quadro RTX 4000 = 8GB GDDR6, so everything above the RTX 2060 Super is fair game).


It's not unreasonable to expect every price tier to drop by one while the top tiers, the $2500+ Titan and the $1200+ RTX 3080 Ti, increase a bit.


That said, I also think all existing price tiers will see the 30-50% performance increase you state, as the TSMC 7nm process reportedly brings a 40% or so performance improvement at isopower. I don't believe for a second the idea of double the performance, or +50% performance at half the power; it's either/or, unless they're talking about ray-tracing performance, which could easily be improved via software and an increased share of die area from the die-shrink savings.


----------



## 113802

BigMack70 said:


> 2 out of 4 generations of Titan were $1k.
> 
> The whole thing is new, and it's purely a "F You" to the market when AMD doesn't have anything within ten miles of their top card for performance.
> 
> I maintain that if AMD has a card that is even somewhat competitive with Nvidia's high end, you won't see any GPUs north of $1500. It'll be exactly like what happened with Intel where all of a sudden they cut their $2000 CPU price in half to $1000 because oh crap now there's real competition.
> 
> Now, if big Navi is only competitive with 2080 Ti, and then Nvidia starts releasing cards with +30-50% performance, then yeah I expect to see a $1k+ Ti model and a $3k+ Titan because why not.


The last two Titans were released at NIPS, unlike the previous Titans. The Titan V's FP64 rate is 1:2 of FP32, which no GeForce card has ever had. The Titan RTX supported NVLink memory pooling, which again no GeForce card supported. You're also forgetting the huge AI performance gap between the Titan Xp and the workloads the Titan V and Titan RTX were targeted at:

The Titan V was faster than the Titan Xp by 94% in TensorFlow training performance when utilizing the Tensor cores. Both of these Titans were dirt cheap compared to purchasing a Tesla V100, at least until Nvidia recently released the V100S, which brought the 16GB Tesla V100's price down to $4000.

https://lambdalabs.com/blog/titan-v-deep-learning-benchmarks/
https://lambdalabs.com/blog/titan-rtx-tensorflow-benchmarks/


----------



## Kaltenbrunner

Hopefully nvidia can increase the prices, otherwise AMD will think they've gone soft and pc gamer snobs will feel like average peasants


----------



## Buris

Anyone taking bets on if we'll see Nvidia's MCM this year?

I don't believe the GA103 rumors. 10GB and 20GB variants are ridiculous and Nvidia has been known to limit memory size to cut down on costs. If it is true, my guess would be Nvidia *only made 10GB and 20GB variants because they know something about the new consoles* (Caching techniques, anyone?) 



Buris said:


> With that knowledge, I think it's clear to see PCs that do not have NVME SSDs will have their performance obliterated. You better hope they decide on having the Texture Cache option available on PCs rather than simply offloading that cache onto the GPU which would require 16GB+ VRAM.


____________________________________________________________________________



speed_demon said:


> I know it's not the prime subject around here by any means but I'm excited to see what we end up with for mobile dGPU options. I always liked the idea that I could plug in my laptop anywhere with an outlet, hop on my hotspot, and be enjoying my games far from home. And the ever increasing SSD capacity makes laptops even more capacious with regards to games.
> 
> And no before somebody asks - I don't think laptops will ever replace a desktop. I had my gaming laptop in addition to my more serious 1700x/1080 rig.


Totally agree. Nvidia still holds such a large architectural advantage when it comes to GPU efficiency that we should be looking at 7nm Laptop GPUs with a minimum of 50% more performance at any given price, at least I would hope so.


----------



## skupples

in geforce form? no. Maybe after Ampere.

I think NV's going to try to wring every last drop out of the current gen with the first Ampere run. Do all they can to squeeze a penny before consoles redefine mid/low-end gaming again.


----------



## EniGma1987

AlphaC said:


> That said, I also think all existing price tiers will see the 30-50% performance increase you state, as the TSMC 7nm process reportedly brings a 40% or so performance improvement at isopower.


Ampere uses Samsung 7nm though, which is inferior to TSMC 7nm+. Samsung 7nm with EUV is actually closer to TSMC regular 7nm than TSMC's 7nm+ with EUV for transistor density.
https://www.pcgamesn.com/nvidia/ampere-7nm-gpu-samsung-discount


----------



## BigMack70

EniGma1987 said:


> Ampere uses Samsung 7nm though, which is inferior to TSMC 7nm+. Samsung 7nm with EUV is actually closer to TSMC regular 7nm than TSMC's 7nm+ with EUV for transistor density.
> https://www.pcgamesn.com/nvidia/ampere-7nm-gpu-samsung-discount


The past fifteen years of GPU history say a 30-50% jump is low for a new architecture. It's a safe bet without being overly dependent on them using the most bleeding-edge process. 60-80% is a reasonable expectation if feeling extra optimistic about the new process node.


----------



## guttheslayer

EniGma1987 said:


> Titan V had a 5120 configuration, and it has full FP64 and tensor cores. It was only lacking in RT cores. For a gaming die, Nvidia could do a similar config but strip FP64 and use that space with RT cores instead. The V100 has a size of 815mm2 on 12nm, but moving to Samsung 7nm with EUV I bet if you kept all those same resources the die size would be manageable for a gaming tier product.


The GV100 is a standalone chip; no other Volta chip was ever produced besides the GV100. It is not the same situation as Pascal or Turing, hence not a good example.


GA103 and GA104 are in the same Ampere family, so I believe they are both stripped of DP units but paired with RT cores.


----------



## Mooncheese

Buris said:


> Anyone taking bets on if we'll see Nvidia's MCM this year?
> 
> I don't believe the GA103 rumors. 10GB and 20GB variants are ridiculous and Nvidia has been known to limit memory size to cut down on costs. If it is true, my guess would be Nvidia *only made 10GB and 20GB variants because they know something about the new consoles* (Caching techniques, anyone?)
> 
> 
> 
> ____________________________________________________________________________
> 
> 
> 
> Totally agree. Nvidia still holds such a large architectural advantage when it comes to GPU efficiency that we should be looking at 7nm Laptop GPUs with a minimum of 50% more performance at any given price, at least I would hope so.


They can get GDDR6 @ $7-8 / 1GB. 

The 3080 is going to have more video memory than 2080 Ti, just see every incoming 70 and 80 series launch vs the outgoing 80 Ti card. I'm seeing 10.5 GB of utilization in The Division 2 @ 3440x1440 with my 1080 Ti and 9GB+ in quite a few other titles at this resolution. The games are getting more and more complex, 10GB is not future-proof, it's already becoming obsolete. To add to this point, the next-gen consoles are going to have more system memory which means the console ports are going to be more memory intensive, especially at higher resolutions. 

If not 20GB then definitely 16GB but they aren't going to have only 10GB of video memory because memory isn't THAT expensive as they buy it in bulk @ ~$7 per GB. 

Cutting 10GB off each card may save them $70 tops. The 3070 and 3080 are absolutely not going to come with only 10GB of video memory in 2020 with next-gen consoles sporting more system memory, the outgoing 2080 Ti having 11GB of video memory and the fact that ultimately, it's not THAT expensive to add to the card. 
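At the spot price assumed above (~$7-8/GB, this thread's assumption rather than a confirmed figure), the bill-of-materials math is a quick sketch:

```python
# BOM delta for extra GDDR6, at the post's assumed spot price.
USD_PER_GB = 7.5  # assumed ~$7-8/GB; not a confirmed figure

def vram_cost(gb: int) -> float:
    return gb * USD_PER_GB

print(vram_cost(10))                  # 75.0  -- a 10 GB config
print(vram_cost(20))                  # 150.0 -- a 20 GB config
print(vram_cost(20) - vram_cost(10))  # 75.0  -- the delta on a ~$800 card
```

Under that assumption the memory delta is under 10% of the card price, which is the core of the argument here.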

They are going to try to sell the 3070 and 3080 as "4K Ray Tracing @ 60 FPS" cards (at least the latter). 4K Ray Tracing @ 60 FPS with a next-gen console port is going to require more than 10GB of video memory. Full stop. 

The 3070 might come with only 10GB of video memory, but I can nearly guarantee you that the 3080 will have more than 10GB. Given the actual cost of video memory, there is no reason why it wouldn't be 20GB over, say, 10GB. This is the way they've always increased video memory; they double it in most instances. See 780 Ti to 980 Ti (3 to 6 GB), GTX 580 to GTX 680 (1.5 to 2 GB), 980 to 1080 (4 to 8 GB). See 980 Ti to 1080 Ti (6 to 11 GB; the actual reason they omitted 1GB of video memory wasn't to save a whopping $7, it was because they didn't want Titan X owners to feel burned).

Additionally, they are going to want to wow and impress the consumer-base at the reveal, generate media buzz and excitement and coming out and saying "introducing the RTX 3080 with 10GB of video memory, up from 8 on the RTX 2080, you might be ok for a few more months!" is the furthest thing from that no, they are going to come out and say "and introducing the 3080, with 20 GB of GDDR6, you will be able to enjoy ray tracing at 4K @ 60 FPS for some time to come!"

Ampere is going to blow everyone away. It's going to be 50% faster in rasterization, more than that in RT, it's going to be cheaper (I'm predicting the 3080 FE will launch at $800, and drop to $700 thereafter). 

RTX 3080 will be some 25% faster than 2080 TI, have more video memory, do RT more than 25% faster, probably something like 50% faster, @ 220W and $800. 

RTX 3080 Ti will be 50% faster than 2080 Ti, have even more video memory, do RT ~70% faster @ 300W for $1000-1200. 

I don't see them launching the 80 Ti at launch this time around. They may release a Titan card again for something like $1500 and then a slightly cut down version of that early next year for $1000-1200.

This may sound fantastic, unrealistic, and overly optimistic, but look at what happened over the last big generational jumps.

Kepler to Maxwell (both on 28nm, an architecture-only jump): GTX 980 vs 680, GTX 980 Ti vs 780 Ti, ~50% performance difference.

Maxwell to Pascal, 28nm to 16nm (a ~40% linear shrink): GTX 1080 vs 980, GTX 1080 Ti vs 980 Ti, 50% and 65% performance uplift.

But it's not just the node shrinking, we are talking about Samsung's 7nm EUV process, which is head and shoulders better than TSMC. 

So in reality it's something like 50% when you take that into consideration, and that's where the additional RT performance will come from. 

So 50% in rasterization EASILY and something north of that in RT, possibly 70, maybe even 75% between Ampere and each predecessor card (i.e. RTX 3080 vs 2080)


----------



## Kaltenbrunner

If they really are out by August, what do you think will cost about $1000 USD? If I go crazy and give in to Nvidia for all that, would it be an RTX 3080? What was the 2080 at launch?

Damn, now I'm in planning mode for this summer, but it's good to have a goal


----------



## guitarmageddon88

Mooncheese said:


> They can get GDDR6 @ $7-8 / 1GB.
> 
> The 3080 is going to have more video memory than 2080 Ti, just see every incoming 70 and 80 series launch vs the outgoing 80 Ti card. I'm seeing 10.5 GB of utilization in The Division 2 @ 3440x1440 with my 1080 Ti and 9GB+ in quite a few other titles at this resolution. The games are getting more and more complex, 10GB is not future-proof, it's already becoming obsolete. To add to this point, the next-gen consoles are going to have more system memory which means the console ports are going to be more memory intensive, especially at higher resolutions.
> 
> If not 20GB then definitely 16GB but they aren't going to have only 10GB of video memory because memory isn't THAT expensive as they buy it in bulk @ ~$7 per GB.


Just because VRAM is available doesn't mean that every bit of it which is filled with data is necessary. I have a 2080, and in Division 2 at 3440x1440 on all ultra I utilize far less than that, and obviously my performance is quite a bit higher than yours with a 2080. Does that mean you filling most of your VRAM is superior to my card only using 6-7 GB? System memory does the same. It's hard to tell how much the system just goes "ooooh, memory available, let's stash some stuff in it, or maybe keep some space on standby just in case" versus truly needing the space to run smoothly. Considering that I see roughly 6.5 GB usage in BFV on all ultra, and close to that in pretty much any other triple-A title, I'm good for probably one more generation of releases before ultra at 3440x1440 overwhelms the 2080.
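The allocation-versus-need distinction being argued here can be illustrated with a toy sketch (my own illustration; no real engine works exactly like this): an engine with VRAM headroom will happily keep cold textures resident, so the reported "usage" number overstates what a frame actually needs.

```python
# Toy LRU texture cache: with a bigger VRAM budget it keeps everything it
# has ever streamed resident; with a smaller budget it evicts cold data,
# yet the same "hot" working set still fits. Reported allocation differs
# even though the renderable content is identical.
from collections import OrderedDict

class TextureCache:
    def __init__(self, budget_mb: int):
        self.budget = budget_mb
        self.resident = OrderedDict()   # texture id -> size in MB, LRU order

    def touch(self, tex_id: str, size_mb: int) -> None:
        """Mark a texture as referenced; evict least-recently-used if over budget."""
        self.resident.pop(tex_id, None)
        self.resident[tex_id] = size_mb
        while sum(self.resident.values()) > self.budget:
            self.resident.popitem(last=False)   # drop the coldest texture

    def allocated_mb(self) -> int:
        return sum(self.resident.values())

# Stream 10 GB of level data, but only 6 GB is hot in the current frame.
big, small = TextureCache(11_000), TextureCache(8_000)
for cache in (big, small):
    for i in range(100):
        cache.touch(f"level_{i}", 100)   # 10 GB streamed in total
    for i in range(60):
        cache.touch(f"level_{i}", 100)   # 6 GB hot working set

print(big.allocated_mb(), small.allocated_mb())  # 10000 vs 8000 "used"
```

The caveat, which cuts the other way, is that once the hot set itself exceeds the budget, eviction thrashes every frame, which is the hitching the other side of this argument is describing.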


----------



## aDyerSituation

1080Ti and 2080 are basically the same performance level.


----------



## Mooncheese

guitarmageddon88 said:


> Just because VRAM is available doesnt mean that every bit of it which is filled with data is necessary. I have a 2080, and in division 2 at 3440x1440 on all ultra, I utilize far less than that, and obviously my performance is quite a bit higher than yours being a 2080. Does that mean you filling most of your vram is superior to my card only using 6-7gb? System memory does the same. Its hard to tell how much the system just goes "ooooh memory available, lets stash some stuff too it, or maybe just keep some space on standby just in case" versus truly needed space to run smoothly. Considering that I see roughly 6.5gb usage in BFV while all ultra, and close to that on pretty much any other triple A title, Im good for probably one more generation of releases before ultra at 3440x1440 overwhelms the 2080.


"Obviously my performance is higher than yours considering I have a 2080" 

Here's my DX12 benchmark at 3440x1440 with nearly everything maxed (I followed Hardware Unboxed's optimization guide); go ahead and put yours up for comparison:

https://imgur.com/a/NKAV6hQ

Most everyone knows the 2080 isn't really faster than the 1080 Ti, but NGreedia's marketing was obviously effective on some people. 

I'm seeing 10GB of utilization in this game, not allocation. You can get by with less VRAM, but I can guarantee you that you're seeing more hitching and stuttering.


----------



## Mooncheese

Wrong thread sorry.


----------



## skupples

Kaltenbrunner said:


> If they really are out by for August, what do u think cost about $1000 USD ? If I go crazy and give in to nviida for all that, would it be a rtx3080 , what was the 2080 on launch ?
> 
> Damn, now I'm in planning mode for this summer, but it's good to have a goal


70 & 80 will be out in August, and Titan/ti at the end of the year.

Cost? Likely the same as this gen, unless AMD drops competition.

And while the 1080 Ti and 2080 are close, it's not a dead heat. Telling yourself otherwise is just a farce.


----------



## Section31

I agree with skupples. I would avoid the 3080 though if you're a 2080 Ti owner; you're going to get burned once the 3080 Ti / Titan Ampere comes out.


----------



## Mooncheese

skupples said:


> 70 & 80 will be out in August, and Titan/ti at the end of the year.
> 
> Cost? Likely the same as this gen, unless AMD drops competition.
> 
> n while 1080ti n 2080 are close, it's not a dead heat. telling yourself otherwise is just a farce.


What makes you think they will wait until August?

You don't think AMD is releasing Big Navi in early March? I mean they have Korean certification for a new GPU and historically they only get that 1-2 months before releasing a new GPU. 

https://www.overclock3d.net/news/gp...o-be-released_radeon_big_navi_graphics_card/1

Also, neither manufacturer is going to want to wait until the holiday season to release expensive new GPUs to a consuming public enticed by a $500 PS5 promising performance close to a $2,000+ PC. They are going to want to pre-empt that by quite some time. If AMD announces Big Navi on March 5 and it is indeed 15-20% faster than the 2080 Ti, there is no way that NGreedia won't also announce Ampere @ GTC in response. 

I don't see them waiting until August. That's too close to the new console release. 

Again, would-be new entrants into PC gaming will do the arithmetic and conclude that they are better off spending $500 on a PS5, especially for those stellar console exclusives, than building a $2,000 PC that offers comparable performance (4K @ 60 FPS). 

Both AMD and NGreedia want to release their respective products this summer to pre-empt those would-be entrants early. There is no way they are both going to announce new GPUs as late as August. 

Please substantiate why you feel otherwise.

And it's great to have not parted with $1,300 for last year's 80 card ("2080 Ti"). Considering the 3080 will probably be at minimum 50% faster than the 2080 in rasterization (more than that in RT), I'm looking at a nice 50% bump, probably something between 50-75% in RT, for $800 @ 220W coming from a 1080 Ti. That's a nice jump, and taking 100W out of my loop is always good. 

I'm pretty much done getting the 80 Ti card; I did that every generation going back to Kepler (when it was only $700 to do so). No way am I waiting for an 80 Ti at $1,000-$1,200; that's just stupid considering the life cycle of these things. If I upgrade to a 3080 Ti, then I'm looking at having to part with $1,000-$1,200+ again to upgrade to the next architecture 2-3 years hence for that 50% bump, whereas dropping down to the 80 card means that upgrading again to chiplets on the next architecture means another 50% bump for $700-800. 

Plus I won't have to wait for the 80 Ti, which probably won't come out this year. There might be a $1,500 Titan card, but I doubt we will see an 80 Ti (going only by the leaked information thus far, where only GA-103 and GA-104 are due at launch). 

NGreedia basically renamed the consumer Titan the 80 Ti card, completely destroying the point of the 80 Ti. (With Turing they renamed the 80 card the 80 Ti card.)


----------



## 113802

Mooncheese said:


> What makes you think they will wait until August?
> 
> You don't think AMD is releasing Big Navi in early March? I mean they have Korean certification for a new GPU and historically they only get that 1-2 months before releasing a new GPU.
> 
> https://www.overclock3d.net/news/gp...o-be-released_radeon_big_navi_graphics_card/1
> 
> 
> Also, both manufacturers are not going to want to wait until holiday season to release expensive new GPU's with a consuming public who is enticed with $2000+ PC performance with a $500 PS5. They are going to want to pre-empt that by quite some time. If AMD announces Big Navi on March 5 and it is indeed 15-20% faster than 2080 Ti there is no way that NGreedia won't also announce Ampere @ GTC in response.
> 
> I don't see them waiting until August. That's too close to the new console release.
> 
> Again, would be new entrants into PC gaming will do the arithmetic and come to the conclusion that they are better off spending $500 on a PS5, especially for those stellar console exclusives, than building a $2000 PC that will offer comparable performance (4k @ 60 FPS).
> 
> Both AMD and NGreedia want to release their respective products this summer to pre-empt those would be entrants earlier. There is no way they are both going to announce new GPU's as late as August.
> 
> Please substantiate why you feel otherwise.
> 
> And it's great to have not parted with $1300 for last years 80 card ("2080 Ti"). Considering the 3080 will probably be at minimum 50% faster than 2080 in rasterization (more than that in RT) I'm looking at a nice 50% bump, probably something between 50-75% in RT, for $800 @ 220W coming from 1080 Ti. That's a nice bump and taking 100W out of my loop is always good.
> 
> Pretty much done getting the 80 Ti card, I did that every generation going back to Kepler (when it was only $700 to do so). No way am I waiting for 80 Ti for $1000-$1200, that's just stupid considering the life-cycle of these things. If I upgrade to 3080 Ti then I'm looking at having to part with $1000-$1200+ to upgrade to the next architecture 2-3 years hence for that 50% bump again whereas dropping down to the 80 card means that upgrading again to chiplets on next architecture means another 50% bump for $700-800.
> 
> Plus I won't have to wait for 80 Ti, which probably wont come out this year. There might be a $1500 Titan card, but I doubt we will see 80 Ti (if only going by the leaked information thus far where only GA-103 and GA-104 are due out at launch).
> 
> NGreedia basically renamed the consumer Titan the 80 Ti card, completely destroying the point of 80 Ti. (With Turing they renamed the 80 card the 80 Ti card).


Step out of your gamer bubble.


----------



## skupples

I don't mean literally released in August; I mean they'll be around by then. And IDK, I haven't seen any news this week, been busy working 13-hour days because two dudes bailed.

Last I saw, they were still trying to compete with the 2080 Ti, while NV's dropping the 70 and 80 in the next few months.

Also, the days of the console exclusive are dead. MS has already confirmed it, and Sony is slowly but surely porting more and more games to PC. By mid-cycle, the norm will be timed exclusives, like RDR2.


----------



## Mooncheese

WannaBeOCer said:


> Step out of your gamer bubble.


? 



skupples said:


> i don't mean literally released in august. i mean, they'll be around by then. n IDK, I haven't seen any news in the week, been busy working 13 hour days cuz two dudes bailed
> 
> last I saw, they were still trying to compete with 2080ti, while nv's dropping 70 n 80 in the next few months.
> 
> also, the days of the console exclusive are dead. MS has already confirmed it, n Sony is slowly but surely porting more n more games to PC. By mid cycle, the norm will be timed exclusives, like RDR2.


To the contrary, Sony is absolutely not porting games to PC. We will never see their console exclusives. Without the console exclusives there is no reason for us to get a PlayStation, now is there?

Microsoft is another matter; they have already ceded the fight against Sony and taken a different strategy of cross-platform releases, RE: "Play Anywhere": Forza Horizon and Gears of War on PC, etc. (They are only doing the logical thing considering they make Windows.) 

But Sony? We will never see their titles on PC outside of their subscription service, which limits the games to being streamed from the console in question, limited to 30 FPS (at present) at 16:9, with latency to top it off, at an expensive rate. 

Might want to take a look at the information I presented in my last few posts, the Korean certification information in particular, since you're obviously not up to speed on what's going on (but still manage to offer up your opinion; an uninformed opinion is exactly that). 

Up until now Korean certification has preceded an AMD release by 1 month on avg. They just received Korean certification for a new GPU a few days ago. They have a conference on the 5th of March. It is highly likely that they will announce a new GPU on the 5th of March. 

Let's see, we have: 

5600XT
5700XT

They are probably going to announce 5800XT on the 5th of March, which is 17% faster than 2080 Ti in a certain VR benchmark (no other benchmarks were included, but it's probably safe to say 15-20% faster outside of VR as well). 

If they release a GPU that is 15-20% faster than the 2080 Ti on the 5th of March for $700, I can guarantee you that NGreedia will not take that on the chin until June and will announce Ampere at GTC at the end of the same month. 

Again, it behooves both manufacturers to get way ahead of the console refresh this holiday season. 

Very unlikely that they will release in July-August; that's too close to the Q4 holiday release window of October. 

There's a good chance we will see both manufacturers announce new cards next month. It's not a guarantee, but given the high probability of AMD announcing Big Navi on the 5th of March and the console release this holiday season, yeah, an early launch is more likely than a later one. 

Remember, both manufacturers 100% have said cards ready to go now. NGreedia is simply trying to downsize its inventory of Turing, which didn't sell particularly well (gee, I wonder why), and is in no rush (except wanting to get ahead of the console refresh), but if AMD releases Big Navi early next month then we will see Ampere at the end of March. 

All we gotta do now is wait until the 5th of March. 

But no way is NGreedia going to let AMD win customers over with a card that is 15-20% faster than the 2080 Ti for $700 until June, let alone August. If AMD announces Big Navi early next month, NGreedia will be forced to announce Ampere at GTC, and there we will see an $800 (or $700) 3080 that is some 25% faster than the 2080 Ti in rasterization, possibly as much as 50% faster in RT, hit the scene.


----------



## doom26464

WannaBeOCer said:


> Mooncheese said:
> 
> 
> 
> What makes you think they will wait until August?
> 
> You don't think AMD is releasing Big Navi in early March? I mean they have Korean certification for a new GPU and historically they only get that 1-2 months before releasing a new GPU.
> 
> https://www.overclock3d.net/news/gp...o-be-released_radeon_big_navi_graphics_card/1
> 
> 
> Also, both manufacturers are not going to want to wait until holiday season to release expensive new GPU's with a consuming public who is enticed with $2000+ PC performance with a $500 PS5. They are going to want to pre-empt that by quite some time. If AMD announces Big Navi on March 5 and it is indeed 15-20% faster than 2080 Ti there is no way that NGreedia won't also announce Ampere @ GTC in response.
> 
> I don't see them waiting until August. That's too close to the new console release.
> 
> Again, would be new entrants into PC gaming will do the arithmetic and come to the conclusion that they are better off spending $500 on a PS5, especially for those stellar console exclusives, than building a $2000 PC that will offer comparable performance (4k @ 60 FPS).
> 
> Both AMD and NGreedia want to release their respective products this summer to pre-empt those would be entrants earlier. There is no way they are both going to announce new GPU's as late as August.
> 
> Please substantiate why you feel otherwise.
> 
> And it's great to have not parted with $1300 for last years 80 card ("2080 Ti"). Considering the 3080 will probably be at minimum 50% faster than 2080 in rasterization (more than that in RT) I'm looking at a nice 50% bump, probably something between 50-75% in RT, for $800 @ 220W coming from 1080 Ti. That's a nice bump and taking 100W out of my loop is always good.
> 
> Pretty much done getting the 80 Ti card, I did that every generation going back to Kepler (when it was only $700 to do so). No way am I waiting for 80 Ti for $1000-$1200, that's just stupid considering the life-cycle of these things. If I upgrade to 3080 Ti then I'm looking at having to part with $1000-$1200+ to upgrade to the next architecture 2-3 years hence for that 50% bump again whereas dropping down to the 80 card means that upgrading again to chiplets on next architecture means another 50% bump for $700-800.
> 
> Plus I won't have to wait for 80 Ti, which probably wont come out this year. There might be a $1500 Titan card, but I doubt we will see 80 Ti (if only going by the leaked information thus far where only GA-103 and GA-104 are due out at launch).
> 
> NGreedia basically renamed the consumer Titan the 80 Ti card, completely destroying the point of 80 Ti. (With Turing they renamed the 80 card the 80 Ti card).
> 
> 
> 
> Step out of your gamer bubble.

Just wanna......


God I hope pc gaming hell has a special place for your soul. Forced to play Duke Nukem Forever....forever


----------



## 113802

doom26464 said:


> Just wanna......
> 
> 
> God I hope pc gaming hell has a special place for your soul. Forced to play Duke Nukem Forever....forever





> When GeForce launched 20 years ago, powering cutting-edge games was its focus. Over time, GPUs became more capable, evolving to the point where they were not just useful in creative design, but vital. Last summer, the entire creative industry got a shot in the arm with the launch of NVIDIA’s RTX family of GPUs and features. In this article, we’re going to explore why that is
> 
> This last year has been surprising on many levels with just how far NVIDIA’s RTX platform has come. Even the last few months alone have seen announcement after announcement centered on new and exciting ways the creative industry has not just latched onto, but embraced RTX. The biggest surprise isn’t just the 2-3x performance increases we’re seeing in applications, or the ingenuity behind the AI research – it’s the sheer scale of the industry’s support.
> 
> These ray tracing and AI inferencing engines are not restricted to NVIDIA’s Turing architecture. Blender will work great with any GPU, but the addition of those RT cores can really speed up the process. Adobe has experimented with AI in its applications in the background for a few years, the Tensor cores on Volta and Turing cards just speed up the process.
> 
> In an industry where time is money, every single advantage that can be leveraged, should be utilized. Real-time interactive renders of ray traced scenes with AI-assisted denoising allows creatives to not just rapidly change and adapt projects, but actually experiment with new ideas and methods without feeling guilty about wasting an hour on a bad render with old hardware.


https://techgage.com/article/being-creative-with-nvidia-rtx/

Night-and-day difference between GeForce Pascal and Turing in everything aside from gaming. I'm a tech geek; I don't game as much as I used to. These new technologies are exciting because I get to work with creative users who take advantage of them.


----------



## Woundingchaney

Mooncheese said:


> ?
> 
> 
> 
> To the contrary, Sony is absolutely not porting games to PC. We will never see their console exclusives. Without the console exclusives there is no reason for us to get a Playstation now is there?


I don't think you have a very good grasp on the market, or you are willfully ignorant as to where the market is going. Software and services are the future of the games industry, bar none. Sony is considerably behind in this regard; in fact, if not for their first-party lineup they would have a very bleak future. Ultimately, given that hardware has never been overly profitable for Sony, I'm not sure why they would continue to tie their first-party lineup exclusively to their hardware. If you really think it's in Sony's best interest to tie their strong games lineup solely to PlayStation hardware, I can assure you that you are quite mistaken. Ultimately it comes down to profitability and revenue, neither of which have favored console hardware for Sony (or MS, for that matter). Sony's profitability in the games market has literally always been due to software (if you don't believe me, by all means do the research).

I imagine that Sony will attempt to associate their first-party IPs with a platform, but the notion of that platform running exclusively on a Sony box is going to be a concept of the past. If they cannot create a compelling platform, they will host their games on other platforms. I expect that after the launch of the PS5 they will move full product development to creating a viable platform capable of being accessed and run across a plethora of devices. Right now the console market is a fraction of gamers, and Sony owns a fraction of the console market; do you really think they are going into the next decade with the archaic concept of limiting their potential consumers to those that want to buy a PS5 or a PS6 or whatever? From a hardware architecture or even a development standpoint there is nothing tying their console games to the PS5 hardware (or PS4, for that matter); consoles are and have been x86 architecture and design through and through.

Sony is in a dangerous position for the next decade; in order for them to offer a competing software platform, they are going to have to partner with one of the larger cloud providers (or at the very least should be considering it). This changes their profitability model considerably when their direct competition is Microsoft, which has a foothold on the majority of home PCs on the planet and a public cloud infrastructure that rivals Amazon's.

Your fanboy notions of winning and losing, or some rehashed console war, are concepts of a bygone era. Right now the major players in the industry should be focusing on how to get their platforms accessible to as many consumers as possible, and console hardware sure as hell isn't the way to do it.


----------



## Mooncheese

https://wccftech.com/amd-announcing-turing-killer-big-navi-gpu-at-financial-analyst-day/


----------



## ZealotKi11er

My prediction: Titan 3XXX in Q1, 3080 at Computex, 3080 Ti in Q4.


----------



## Kaltenbrunner

aDyerSituation said:


> 1080Ti and 2080 are basically the same performance level.


And the 1080 Ti was $800 at launch while the 2080 Ti was $1,200. Soon Nvidia will need an ultra-Titan to add another even higher-priced tier, rather than lower any prices.


----------



## keikei

ZealotKi11er said:


> My prediction. Titan 3XXX > Q1, 3080 Computex, 3080 Ti Q4.



Gud prediction. Big swing Navi may not be able to touch 3080, let alone Ti. Highend gamers are borked if true. Ugh.


----------



## Sheyster

keikei said:


> Gud prediction. Big swing Navi may not be able to touch 3080, let alone Ti. Highend gamers are borked if true. Ugh.


That prediction, if proven true, would not surprise me at all.


----------



## Section31

ZealotKi11er said:


> My prediction. Titan 3XXX > Q1, 3080 Computex, 3080 Ti Q4.


That's the best outcome. It means that for the watercoolers, we don't have to wait for blocks, since the Titan blocks are more or less usable on the Ti (as has been the case for the last couple of generations). It's good for the people who are going to use Optimus Water Cooling for their GPU blocks; Optimus doesn't yet have the contacts to get the GPUs in advance from Nvidia/AMD to have blocks ready at launch.


----------



## ZealotKi11er

keikei said:


> Gud prediction. Big swing Navi may not be able to touch 3080, let alone Ti. Highend gamers are borked if true. Ugh.


It depends. Nvidia will probably lead in RT performance. I am sure AMD will beat the 3080. Personally, RT is not a useful feature for me at 4K because of how demanding this resolution already is.


----------



## guttheslayer




keikei said:


> Gud prediction. Big swing Navi may not be able to touch 3080, let alone Ti. Highend gamers are borked if true. Ugh.


That's too close to the 3080; it could be much later, three quarters after, or even Q2 2021.


The release trend will be closer to Pascal than Turing if you ask me.


----------

