# [Various] Pascal Titan could be 50% faster than the GeForce GTX 1080 - Rumoured to launch in August



## BiG StroOnZ

Quote:


> VR World is reporting that they've had hands-on time with the GP102-based GeForce GTX Titan card, with a few details on NVIDIA's new monster.
> 
> NVIDIA's next-gen Titan X successor will feature both 8+8 and 8+6-pin PCIe power connectors, with the PCIe power connectors being found at the front of the card, and not on top. The 8+8-pin power connectors will provide up to 375W TDP, while the 8+6-pin version will provide 300W. Performance-wise, it should be a monster with around 50% more horsepower over the already lightning quick GTX 1080.
> 
> The new Pascal-based Titan X successor will reportedly be 12 inches long, while the new Titan X successor will arrive in 12/16GB versions (as we've previously reported) rocking HBM2 memory. VR World's sources say that the new GPU is actually now bound by the CPU, with NVIDIA's engineers reportedly saying that even Intel's new Core i7-6950X isn't powerful enough to deliver the performance the new Titan cards need in enthusiast level scenarios.
> 
> *We should expect NVIDIA to unveil its new Titan P at Gamescom, which takes place in Cologne, Germany between August 17-21, 2016*.


Here is an AIB GTX 1080 running Shadow of Mordor @ 5K Resolution:



50% faster than this 1080 would mean with the Pascal Titan you would be over 60 fps @ 5K resolution with a single card.
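
The arithmetic behind that projection can be sanity-checked in a couple of lines (the 42 fps baseline here is purely illustrative, not a measured benchmark):

```python
# If a card 50% faster than the GTX 1080 is to clear 60 fps at 5K,
# the 1080 itself only needs to average 60 / 1.5 = 40 fps there.
def projected_fps(base_fps: float, speedup: float) -> float:
    """Scale a baseline frame rate by a rumoured speedup factor."""
    return base_fps * speedup

required_base = 60 / 1.5
print(required_base)             # 40.0 fps needed from the 1080
print(projected_fps(42.0, 1.5))  # 63.0 fps projected from an illustrative 42 fps run
```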

*Source:* http://www.tweaktown.com/news/52886/nvidias-next-gen-titan-50-faster-geforce-gtx-1080/index.html
*Source 2:* http://vrworld.com/2016/07/05/nvidia-gp100-titan-faster-geforce-1080

[TPU] NVIDIA to Unveil GeForce GTX TITAN P at Gamescom
Quote:


> NVIDIA is preparing to launch its flagship graphics card based on the "Pascal" architecture, the so-called GeForce GTX TITAN P, at the 2016 Gamescom, held in Cologne, Germany, between 17-21 August. The card is expected to be based on the GP100 silicon, and could likely come in two variants - 16 GB and 12 GB. The two differ by memory bus width besides memory size. The 16 GB variant could feature four HBM2 stacks over a 4096-bit memory bus; while the 12 GB variant could feature three HBM2 stacks, and a 3072-bit bus. This approach by NVIDIA is identical to the way it carved out Tesla P100-based PCIe accelerators, based on this ASIC. The cards' TDP could be rated between 300-375W, drawing power from two 8-pin PCIe power connectors.
> 
> The GP100 and GTX TITAN P isn't the only high-end graphics card lineup targeted at gamers and PC enthusiasts, NVIDIA is also working the GP102 silicon, positioned between the GP104 and the GP100. This chip could lack FP64 CUDA cores found on the GP100 silicon, and feature up to 3,840 CUDA cores of the same kind found on the GP104. The GP102 is also expected to feature simpler 384-bit GDDR5X memory. NVIDIA could base the GTX 1080 Ti on this chip.
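
The two memory configurations in the quote above can be sanity-checked with a quick sketch. The per-stack figures are assumptions: 4 GB and a 1024-bit interface per HBM2 stack, with Tesla P100's ~1.4 Gb/s per pin as a stand-in data rate, since no memory clocks have been confirmed for the Titan:

```python
# Derive capacity, bus width, and rough bandwidth from the HBM2 stack count.
def hbm2_config(stacks: int, gb_per_stack: int = 4, gbps_per_pin: float = 1.4):
    bus_width = stacks * 1024                           # bits (1024 per stack)
    capacity = stacks * gb_per_stack                    # GB
    bandwidth = round(bus_width / 8 * gbps_per_pin, 1)  # GB/s, rounded for display
    return capacity, bus_width, bandwidth

print(hbm2_config(4))  # (16, 4096, 716.8) -> the rumoured 16 GB variant
print(hbm2_config(3))  # (12, 3072, 537.6) -> the rumoured 12 GB variant
```

Both results line up with the 16 GB/4096-bit and 12 GB/3072-bit variants described above.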


*Source 3:* https://www.techpowerup.com/223895/nvidia-to-unveil-geforce-gtx-titan-p-at-gamescom
*Source 4:* http://www.guru3d.com/news-story/geforce-gtx-titan-p-might-see-august-announcement.html
*Source 5:* http://www.kitguru.net/components/graphic-cards/matthew-wilson/nvidias-pascal-titan-rumoured-to-launch-in-august/


----------



## QSS-5

source?


----------



## BiG StroOnZ

Quote:


> Originally Posted by *QSS-5*
> 
> source?


http://www.tweaktown.com/news/52886/nvidias-next-gen-titan-50-faster-geforce-gtx-1080/index.html


----------



## FattysGoneWild

$1,400 FE confirmed?


----------



## Wishmaker

I am extremely curious to see how the 6950X bottlenecks this. Also, at what frequency was it tested? I highly doubt a 4.5-4.7GHz 6950X will bottleneck anything.


----------



## Yuhfhrh

Quote:


> Originally Posted by *FattysGoneWild*
> 
> $1,400 FE confirmed?


It will be interesting to see how they price this.


----------



## DADDYDC650

Already listed my GTX 1080 on ebay. I'm ready for the milking Nvidia!


----------



## Clovertail100

Quote:


> Originally Posted by *Wishmaker*
> 
> I am extremely curious to see how the 6950x bottlenecks this. Also, at what frequency was it tested? I highly doubt a 4.5-4.7 6950x will bottleneck anything


They're probably talking about when you're at 720P and doing 230 FPS instead of the 250 you'd get if you weren't CPU limited.

I don't know what's up with NV lately. They've got a new marketing team, or something, I think. Their recent tactics are more desperate than AMD's during the 3rd "release" of GCN 1.0 cards.
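
The CPU-limit scenario described here boils down to taking the minimum of two rates; a toy model with purely illustrative numbers:

```python
# Delivered fps is capped by whichever side is slower: the CPU preparing
# frames or the GPU rendering them.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

print(delivered_fps(230, 400))  # 230 -> CPU-bound, as in the 720p example
print(delivered_fps(230, 60))   # 60  -> GPU-bound, as at 4K/5K
```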


----------



## Wishmaker

Quote:


> Originally Posted by *Clovertail100*
> 
> They're probably talking about when you're at 720P and doing 230 FPS instead of the 250 you'd get if you weren't CPU limited.
> 
> I don't know what's up with NV lately. They've got a new marketing team, or something, I think. Their recent tactics are more desperate than AMD's during the 3rd "release" of GCN 1.0 cards.


...and the chip is to blame because most of the cores will be on standby, waving at the Pascal Titan: 'pick me, pick me for the next task'.

Game development is to blame if a 6950X or a Zen mega hyper jiggawatts FX Edition bottlenecks the Pascal Titan. You got cores, you got threads, use them!


----------



## Majin SSJ Eric

Gonna be a monster for sure.


----------



## michaelius

Quote:


> Originally Posted by *Wishmaker*
> 
> I am extremely curious to see how the 6950x bottlenecks this. Also, at what frequency was it tested? I highly doubt a 4.5-4.7 6950x will bottleneck anything


The 1080 is already bottlenecked at 1080p by an overclocked i7 6700K.


----------



## huzzug

Quote:


> Originally Posted by *michaelius*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Wishmaker*
> 
> I am extremely curious to see how the 6950x bottlenecks this. Also, at what frequency was it tested? I highly doubt a 4.5-4.7 6950x will bottleneck anything
> 
> 
> 
> 1080 is already bottlenecked in 1080p by overclocked i7 6700k

If you're using GTX1080 @ 1080p, you're doing it wrong


----------



## infranoia

Might be... Could be... Maybe... Don't worry, it's just around the corner. Look at these graphs, it's 1.85x better!

From both camps. This node is defined by paper promises and empty marketing skirmishes. I can't wait for node volume and maturity to start defining the choices, instead of this cynical manipulation.


----------



## Klocek001

I've already decided I'm gonna skip this one due to the horrendous price







with HBM2 onboard I don't expect any lower than $900 for 1080Ti, unless 1080Ti will be GDDR5X and HBM2 is for Titan P only. Then we might get the 1080Ti for $850


----------



## Nestala

Available for 1299$ with a 1399$ founder's edition available?

The card will be way too expensive for me, but I'm glad that we finally have GPUs that are comfortable with 4k.


----------



## Majin SSJ Eric

More like $1500 Fanboy Edition! But it will be almost worth it...


----------



## renejr902

I'm not an expert at all, but something disappoints me here. VRWorld: ''While Nvidia will pull out all the guns to fight the AMD Radeon RX 480, releasing GeForce GTX 1060 as early as July 7th - our focus is slowly turning towards the real big gun of the Pascal-based GeForce line-up. If our sources are correct, GP100 and GP102 were essentially the same chips, with the difference being the NVLink interface on the GP100 and PCI Express for the GP102. Feature set on both chips is the same, and there are no surprises.''

So if it's the same chip, we only get around 2900 single-precision units and 900 DP. I will buy the Titan for gaming, so 900 DP units are a waste. Why not use all 3800 single-precision units for the Titan P, like the Titan X? People who only game don't need those double-precision cores.

Am I wrong? Sorry for my bad English, I try my best.

And finally, are all the rumors of a GP102 with GDDR5X false?

Man, I'm dead with my i5 4690; do I need at least an i7 4790K?

I just bought a GTX Windforce 1070 while waiting for the next Titan; I will receive it next week. Maybe I should not use it and sell it brand new - the 1070/1080 (non-Founders cards) are backordered everywhere in Canada, at least in Montreal. Should I wait for the Titan P and sell my GeForce 1070 brand new?

Thanks for the answers and help. I'm really interested in buying the next Titan P. I want to be able to play games in 4K at high or ultra settings with 60fps for at least a few years. I can live with 35-40fps, but it's not optimal at all.


----------



## ChevChelios

Quote:


> Pascal Titan could be 50% faster than the GeForce GTX 1080


its ok, its ok, I still love my shiny new 1080, I love it, love it ....

I'll just get an 1180 Volta later for cheaper, same performance as a 1080Ti

also RIP Vega


----------



## Nestala

Quote:


> Originally Posted by *huzzug*
> 
> Quote:
> 
> 
> 
> Originally Posted by *michaelius*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Wishmaker*
> 
> I am extremely curious to see how the 6950x bottlenecks this. Also, at what frequency was it tested? I highly doubt a 4.5-4.7 6950x will bottleneck anything
> 
> 
> 
> 1080 is already bottlenecked in 1080p by overclocked i7 6700k
> 
> 
> If you're using GTX1080 @ 1080p, you're doing it wrong

Why did they name it like that then?

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> More like $1500 Fanboy Edition! But it will be almost worth it...


"almost"


----------



## Xuper

Quote:


> Originally Posted by *ChevChelios*
> 
> its ok, its ok, I still love my shiny new 1080, I love it, love it ....
> 
> Ill just get a 1180 Volta later for cheaper, same performance as 1080Ti
> 
> also RIP Vega


*Big Vega could be 50% faster than the Pascal Titan - features 8+8 and 128 ROP.*

RIP Pascal.


----------



## huzzug

I blame the panel manufacturers for the atrocious prices of GPUs. 1080p is for pennies while 4K is a grand.


----------



## Defoler

Can't wait.
With the new Titan or the new 1080 Ti, replacing a dual 980 Ti setup with a single card that should perform better might be worth the price.
Not going to rush out and buy it, though. Gotta let those prices sink in and settle down.

Though I'm sure some people who spent two grand on a Titan SLI setup, yet yell about the price of the new Titan being unworthy, kinda make me grin and chuckle.


----------



## Defoler

Quote:


> Originally Posted by *huzzug*
> 
> I blame the panel manufacturers for atrocious prices of GPU's. 1080p is for pennies while 4K is a grand


High end will always cost more.
When LCD was the market majority, high-end backlit LED panels cost a good $1000-2000. When the first 4K monitors came out, they cost $3000+.
When 4K becomes the standard for under $400, 8K will come out and cost $3000+, and the GPUs able to run it will still cost $1000+.

That is the price of early high-end adoption. You want the best, you pay for it.
R&D, trial manufacturing, and testing have their costs. Someone needs to earn back those billions in investment so they can start investing billions in the next product.


----------



## huzzug

Thanks for missing and ruining a well thought out joke


----------



## Nestala

Quote:


> Originally Posted by *Defoler*
> 
> Quote:
> 
> 
> 
> Originally Posted by *huzzug*
> 
> I blame the panel manufacturers for atrocious prices of GPU's. 1080p is for pennies while 4K is a grand
> 
> 
> 
> High end will always cost more.
> When LCD was the market majority, high end backlit led panels cost a good 1000-2000$. When the first 4K monitors came out, they cost 3000$+.
> When 4K starts to be the standard for under 400$, 8K will come out and cost 3000$+ and GPUs being able to run it will still cost 1000$+.
> 
> That is the price of early high end adoption. You want the best, you pay for it.
> R&D and trial manufacturing and testing has its costs. Someone need to get back those billions $$ investment so they can start investing another billions on the next product.

Good 4k panels are already pretty cheap. The LG 27MU67-B only costs 430€ right now over here in Europe for example.


----------



## Wishmaker

Quote:


> Originally Posted by *Xuper*
> 
> *Big Vega could be 50% faster than the Pascal Titan - features 8+8 and 128 ROP.*
> 
> RIP Pascal.


RIP your mobo if Vega is 50% faster than the Pascal Titan. Make sure you have enough PCI-E slots *hue hue hue*


----------



## ChevChelios

a Vega faster than Titan P will draw more power than the city of New York and atomize your poor PCI-E slot in the process


----------



## Tobiman

Here we go again.


----------



## Glottis

Quote:


> Originally Posted by *huzzug*
> 
> If you're using GTX1080 @ 1080p, you're doing it wrong


i'll be using a GTX1080Ti @ 1080p. (please don't have a heart attack)


----------



## renejr902

Quote:


> Originally Posted by *Glottis*
> 
> i'll be using GTX1080Ti @ 1080p. (please don't have heart attack)


If you want to keep your video card for 10+ years and still have it performing well at ultra settings, buy a Titan P and play at 720p.

Maybe a dual-SLI Titan P at 480p can give you 20 years at ultra. In the worst case, disable anti-aliasing.

Please, someone answer my question about SP vs DP units; I'm trying to understand.


----------



## BranField

Anyone else a bit curious about the size of the card? As this is meant to be powered by HBM2, and another HBM card (the Fury X, for instance) is only 7.5 in long, how can the Pascal Titan be 12 in long?


----------



## renejr902

Quote:


> Originally Posted by *BranField*
> 
> anyone not a bit curious about the size of the card? as this is meant to be powered by HBM2 and another HBM card (fury x for instance) is 7.5in long how can the pascal titan be 12in long?


VRWorld said 30cm


----------



## BranField

Quote:


> Originally Posted by *renejr902*
> 
> vrworld said 30cm


Even at 30cm = 11.81in, it would still be bigger than a 980 Ti at 26.7cm = 10.51in. It just doesn't seem to add up to me. Also, putting the PCIe plugs on the end would increase the length, so one would assume they moved them to the end because space allowed it - and yet it's looking to be longer than the 980 Ti?
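
The unit conversions in this comparison are easy to double-check:

```python
# Convert the rumoured card lengths from centimetres to inches.
CM_PER_INCH = 2.54

def cm_to_in(cm: float) -> float:
    return cm / CM_PER_INCH

print(round(cm_to_in(30.0), 2))  # 11.81 -> rumoured Titan length
print(round(cm_to_in(26.7), 2))  # 10.51 -> 980 Ti reference length
```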


----------



## Slashuzero

Nvidia's beasts are just under-bed monsters nowadays


----------



## Defoler

Quote:


> Originally Posted by *renejr902*
> 
> vrworld said 30cm


That length is strange.
HBM should actually make the card much smaller: all the space where the memory chips would normally sit becomes irrelevant, since the HBM2 stacks sit right around the die itself.


----------



## ChevChelios

Maybe it's not HBM2.

Quote:


> The target is at least 50% higher performance than GeForce GTX 1080 Founders Edition, and our sources are saying they're now bound by the CPU. Even Core i7-6950X isn't enough to feed all the cards and in a lot of scenarios you could see an Intel Core i7-6700K, with its supreme clock (4.0 vs. 3.0 GHz) easily feed the GP100 more efficiently than Broadwell-E based Core i7 Extreme Edition. *The running joke inside Nvidia is "don't buy the 6950X - buy 6700K and a Titan"* but we're not sure that Nvidia will use this for an official tagline. Truth to be told, they might be right - we need Intel to return to the 4-core next-gen mainstream/enthusiast and then X-core big-daddy part using the same architecture, rather than the current cadence which makes sense only if you're working for Intel. AMD's 8-core ZEN cannot come soon enough.


such jokes
much wow


----------



## Unkzilla

Big Vega or Titan... the true upgrade for 980 Ti/Fury X owners (me).

Let's see which one comes out first, and the pricing. This card could be priced close to that dual-GPU Titan that was going around.


----------



## Newbie2009

North Korea are stockpiling Pascal Titans because so much power


----------



## iLeakStuff

Already in August?
Holy crap, that's good news.


----------



## JoHnYBLaZe

Sooooo.... not even Intel's fastest can keep up with one card?

So forget about SLI entirely?

Kaby Lake and the fastest DDR4 you can wrangle up to get the most out of this?

Someone here said Nvidia was acting desperate.... well, maybe that's because they just hit a wall they have no control over, and that wall is called Intel.

6950X bottlenecking Pascal...... Volta is already dead in the water.....


----------



## w1LLz

Should be a beast. With the rise of esports/competitive gaming, 120/144Hz 1080p 200+fps single-card setups will become standard.


----------



## ChevChelios

The 6950X bottleneck is either BS, or it's only with a stock 6950X at its low clocks.

I can't see an OCed fast Intel CPU bottlenecking it, especially at 4K/5K.

Quote:


> Sooooo....not even intel's fastest can keep up with 1 card?
> 
> So forget about SLi entirely?
> 
> Kaby lake and the fastest ddr4 you can wrangle up to get the most out of this?
> 
> Someone here said Nvidia was acting desperate....well maybe cuz they just hit a wall they have no control over, and that wall is called intel
> 
> 6950x bottleneck pascal......volta is already dead in the water.....


It's dead _because it's too fast_...? Oo

and by Volta's time there should already be 10nm Cannonlake


----------



## StarGazerLeon

I wonder how many people with more money than sense will buy a Pascal Titan for 1920x1080, lol. I don't think there's a CPU on this planet that could keep one of these at full usage at 1080p. I'm not quite ready to ditch 1080p just yet, but I am glad GPU power is steadily increasing. Then again, so is the price...


----------



## FLCLimax

Looks like it'll get 60fps in 4K so it's on my radar.


----------



## ChevChelios

this sounds like the card(s) that will be a must for that new 4K@144Hz monitor that Asus showed at Computex

they might even come out at a similar time

although 2 of these + that monitor might end up close to $4500-5000


----------



## iLeakStuff

Price will probably be in the $1200 region this time I think


----------



## guttheslayer

Quote:


> Originally Posted by *renejr902*
> 
> im not a expert at all, but something disappoint me here: VRWORLD '' While Nvidia will pull out all the guns to fight the AMD Radeon RX 480, releasing GeForce GTX 1060 as early as July 7th - our focus is slowly turning towards the real big gun of Pascal-based GeForce line-up. If our sources are correct, GP100 and GP102 were essentially the same chips, with the difference being NVLink interface on the GP100 and PCI Express for the GP100. Feature set on both chips is the same, and there are no surprises. ''
> 
> So if its the same chip we only get around 2900 single core unit and 900 DP. So i will buy the titan for gaming, so 900 DP is a waste. Why not use all 3800 single core unit for the titan P like the Titan X, people that only gaming dont need these double precision unit core.
> 
> Am i wrong ? sorry for my bad english, i try my best
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and finally all rumors for gp102 with gddr5x are false ?
> 
> man im dead with my i5 4690, i need at least a i7 4790k ? isnt ?
> 
> i just bought a gtx windforce 1070 for waiting for the next titan. i will receive it next week. Maybe i should not use it and sell it brand new, the 1070-1080 (non-founders card) is backorder everywhere in canada, at least in montreal. Should i wait for the titan P and sell my gefore 1070 brand new ?
> 
> Thanks for answers and help. Im really interested to buy the next Titan P. i want to be able to play game in 4k at high setting or ultra with 60fps for atleast a few years. i can live with 35-40fps, but its not optimal at all.


No - GP102 means a GeForce core without the DP units, to improve yields and performance per watt.

And because it doesn't have DP, it doesn't undercut GP100 Tesla or Quadro sales. The problem is that all the unusable GP100 dies would be a huge waste, hence GP102 will command a premium above $1000 to compensate.

This is the only reason they could release it within 3 months of May. I wonder what they're so desperate for?


----------



## ChevChelios

If this costs 1200+ EUR in my locale, then I won't even feel bad about the 50% faster part, since I "only" paid 697 EUR for the G1 1080.

But I wonder if it means the 1080Ti will come sooner than we think?

If you already have a Titan P ready, then it pretty much means you already have a 1080Ti too.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *ChevChelios*
> 
> the 6950X bottleneck is either bs or its only a stock 6950X with its low clocks
> 
> I cant see an OCed fast Intel CPU bottlenecking it, especially at 4K/5K
> its dead _because its too fast_ .. ? Oo
> 
> and by Volta time there should already be 10nm Cannonlake


If it's true and there are bottlenecks with even one card....

Then what's needed is a major CPU upgrade, but there are no CPUs beyond Intel's latest arch....

So sure..... Volta is dead because it's too fast, or because Intel is too slow....

And at that rate I doubt Cannonlake or overclocking will be the savior of anything, given Intel's love of INCREMENTAL performance increases.


----------



## TopicClocker

The big-die GPUs are almost always 40% to over 50% faster, so no surprises here.

It will be good to see GPUs that are potentially capable of pulling off 4K 60fps with fairly high graphical settings in the more demanding titles.


----------



## ChevChelios

Quote:


> And at that rate I doubt cannonlake or overclocking will be the savior


(1) Cannonlake is a tick, so 10nm alone should give some perf increase
(2) they might feel the need to get off their asses and increase perf gains more this time to decisively beat Zen(+), which will be the first real competition in a while

although I doubt they look at Titan P/Volta and think "oh man, we gotta make our CPUs faster so we don't bottleneck Nvidia's monsters"


----------



## Ghoxt

This is like Groundhog Day...

Here we go for the 3rd iteration of "Here's the new TITAN...", with no mention of the 1080 Ti to follow - cut down, gimped, or better performance-wise than the Titan model. I'll not buy another Titan anywhere near launch until the Ti drops for comparison.

WCCF source article incoming. They live for this crap.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *ChevChelios*
> 
> (1) Cannonlake is a tick, so 10nm alone should give some perf increase
> (2) they might feel the need to get off their asses and increase perf gains more this time to decisively beat Zen which will be the first real competition in a while
> 
> although I doubt they look at Titan P/Volta and think "oh man, we gotta make our CPUs faster so we dont bottleneck Nvidias monsters"


Ooohhhh, I doubt it too......

When we're talking about feeding GPUs, we're talking about IPC, right? Not more cores.... that's why they made that supposed joke about buying the 6700K over the 6950X.

Skylake didn't shatter the earth compared to Haswell in terms of IPC; I HIGHLY doubt Cannonlake will perform any miracles against Skylake.

So if there is a bottleneck now..... Volta, with its performance increases, _is really going to be gimped hard_.


----------



## JoHnYBLaZe

Also imagine wanting 2 1080 ti's with the dream of doing 100+ fps at 4k......

_*But alas......no cpu in existence can handle that configuration*_


----------



## umeng2002

Quote:


> Originally Posted by *FattysGoneWild*
> 
> $1,400 FE confirmed?


Don't forget your $10 a month driver subscription


----------



## f1LL

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> Also imagine wanting 2 1080 ti's with the dream of doing 100+ fps at 4k......
> 
> _*But alas......no cpu in existence can handle that configuration*_


Don't worry. That's only true in DX11. With Mantle/Vulkan/DX12 there will be no bottleneck


----------



## Defoler

Quote:


> Originally Posted by *ChevChelios*
> 
> the 6950X bottleneck is either bs or its only a stock 6950X with its low clocks
> 
> I cant see an OCed fast Intel CPU bottlenecking it, especially at 4K/5K


I can see a GPU being bottlenecked even by a 10/20-core CPU if the GPU is powerful enough, because even with a lot of cores, most game engines are still not multithreaded, which gives a good core with high clock speed and higher IPC an advantage.
In single-core performance, the 6700K is much better than the Broadwell-E multi-core CPUs, even the X one (and yes, you can OC it, but you can OC the 6700K as well, no?). Even the 4790K is better. Even the 4770K is better.

I can definitely see a good OCed 6700K or 4790K giving better performance in games compared to an OCed 6950X.
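
The point about mostly single-threaded engines can be sketched as a toy scoring model (the 4-thread cap and all the numbers are illustrative assumptions, not measurements):

```python
# A game that only scales to a few threads rewards per-core speed (clock x IPC),
# not total core count: extra cores beyond the usable-thread cap sit idle.
def game_cpu_score(cores: int, clock_ghz: float, ipc: float,
                   usable_threads: int = 4) -> float:
    return min(cores, usable_threads) * clock_ghz * ipc

quad_fast = game_cpu_score(4, 4.2, 1.05)   # 6700K-style: 4 fast cores
ten_slow = game_cpu_score(10, 3.5, 1.00)   # 6950X-style: 10 slower cores
print(quad_fast > ten_slow)                # True -> the quad wins in this model
```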


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *f1LL*
> 
> Don't worry. That's only true in DX11. With Mantle/Vulkan/DX12 there will be no bottleneck


LOL

DX12 benchies I've seen have zero or even negative SLI scaling.... but one can dream, I guess.


----------



## Nestala

Quote:


> Originally Posted by *iLeakStuff*
> 
> Already in August?
> Holy crap thats good news


Only the announcement. This means nothing. For all we know, we could see this in Q1/Q2 2017.


----------



## Klocek001

Quote:


> Originally Posted by *Xuper*
> 
> *Big Vega could be 50% faster than the Pascal Titan - features 8+8 and 128 ROP.*
> 
> RIP Pascal.


with Polaris' performance-per-watt ratio, it's more likely that the card you're describing is gonna need a CLC to keep clock speeds high enough to beat a 2GHz 1080 running on a semi-passive air cooler


----------



## guttheslayer

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> Also imagine wanting 2 1080 ti's with the dream of doing 100+ fps at 4k......
> 
> _*But alas......no cpu in existence can handle that configuration*_


Overclock the BW-E to the same speed as Skylake and it should match well. After all, clock for clock, Skylake is just a little better - very, very little at the same speed.

The biggest problem right now is actually the clock speed of BW-E, not the IPC.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *guttheslayer*
> 
> Overclock the bwe to same speed as skylake it should be able to match well. Afterall clock for clock skylake is just a little better. Very very little if both are same speed.
> 
> The biggest problem now is actually the clock speed of bwe, not the ipc.


If BW-E is bottlenecking, then one must naturally wonder how much the incrementally better Skylake can alleviate the problem.

Seriously, I doubt a few hundred MHz is going to be a game-changer in that scenario, but I guess we'll have to wait and see.....


----------



## guttheslayer

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> If bwe is bottlenecking then one must naturally wonder how much the incrementally better skylake can alleviate the problem
> 
> Seriously, I doubt a few hundred mhz is going to be a game-changer in that scenario but I guess we'll have to wait and see.....


Just because Skylake is more efficient at feeding the Titan Pascal doesn't mean it's significantly faster...

In terms of pure clock, Skylake (4.0 vs. 3.0 GHz) is about 30% faster than the 6950X; I think that explains a lot.

They should have tried the 6850K - who knows, it might yield impressive results.


----------



## Eorzean

Now we're at the mercy of Intel to inch out even more performance to play a sea of unoptimized ports. Incoming inevitable price increases.

What a time to be alive.

(Yes I'm very bitter and salty at the current state of things)


----------



## SuprUsrStan

Quote:


> Originally Posted by *Wishmaker*
> 
> I am extremely curious to see how the 6950x bottlenecks this. Also, at what frequency was it tested? I highly doubt a 4.5-4.7 6950x will bottleneck anything


Probably a stock 6950X. I bet you my OC'd 5960X won't bottleneck one of those babies.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Syan48306*
> 
> Probably stock 6950x. I bet you my OC'd 5960x won't bottleneck one of those babies.


Funny you mention this....

JayzTwoCents over on YouTube has a PC running 3 Titan Xs with a 5960X, and he says the 5960X was bottlenecking the Titans, to his amazement....

Also good to remember that the actual number of CPU cores has little to no effect on GPU performance....

For example, a 6700K > 5960X for pure fps....


----------



## Kana Chan

Are they testing this with high-speed RAM too?
Some boards can take RAM to 4133-4266 (Impact / OCF).
And at 4.9/5.0GHz vs 4.4/4.5GHz?


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Kana Chan*
> 
> Are they testing this with high speed ram too?
> Some boards can take ram to 4133-4266 ( impact / ocf )
> 4.9/5.0 vs 4.4/4.5?


I'm interested in this too.

Apparently high-speed DDR4 gave a performance boost to 980 Ti SLI in some review posted here not long ago.

They didn't post any timings, but the performance seemed to scale on clock speed alone.

I wonder if some 4xxx-speed RAM would make a difference, if at all.....


----------



## Defoler

Quote:


> Originally Posted by *Eorzean*
> 
> Now we're at the mercy of Intel to inch out even more performance to play a sea of unoptimized ports. Incoming inevitable price increases.
> 
> What a time to be alive.
> 
> (Yes I'm very bitter and salty at the current state of things)


Don't worry.
AMD will come and save the day with zen extreme performance which will rival intel's high end.

*giggles*


----------



## guttheslayer

Quote:


> Originally Posted by *Defoler*
> 
> Don't worry.
> AMD will come and save the day with zen extreme performance which will rival intel's high end.
> 
> *giggles*


Yup, all aboard the hype train.


----------



## ChevChelios

Is there a hype train station for all these hype trains?


----------



## Defoler

Quote:


> Originally Posted by *ChevChelios*
> 
> is there a hype train station for all these hype trains ?


Just pick a line and stand in it. There are just too many; it's stopped mattering anymore.


----------



## Waitng4realGPU

Wow, even the Pascal Titan thread is being tainted with hype-train crap and AMD bashing? Way to go, guys.


----------



## Sleazybigfoot

Quote:


> Originally Posted by *Xuper*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ChevChelios*
> 
> its ok, its ok, I still love my shiny new 1080, I love it, love it ....
> 
> Ill just get a 1180 Volta later for cheaper, same performance as 1080Ti
> 
> also RIP Vega
> 
> 
> 
> *Big Vega could be 50% faster than the Pascal Titan - features 8+8 and 128 ROP.*
> 
> RIP Pascal.

Vega faster than Big Pascal confirmed.


----------



## JackCY

Maybe if they actually did proper parallelism, then the CPUs could feed it. But with their DX11 focus, their cards excel at low parallelism. And the claim that a 10-core or 44-core Intel CPU can't put the GPU under 100% load is nonsense - it definitely can.

$999 MSRP, $1499 retail, lol, as always.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Defoler*
> 
> Don't worry.
> AMD will come and save the day with zen extreme performance which will rival intel's high end.
> 
> *giggles*


Sarcasm noted*

But if Zen has Broadwell IPC, 8 cores, and a decent price like the reports say it will..... it definitely sounds like a rival to me.....


----------



## ChevChelios

Quote:


> But if zen has broadwell IPC, 8 cores and a decent price like reports say it will.


"Reports" also said the 480 was at stock 980 level and a good overclocker.

hype train now arriving at caution station


----------



## Rayleyne

This is... Disgusting... I want one.


----------



## prjindigo

First off, technically "Ti" stands for that.... but yeah, the Titan will be 50% more GPU with a 384-bit bus, and the 1080 Ti will be 25% more GPU with a 384-bit bus.

Known facts at this point.

Problem is that 300W thingie.


----------



## renejr902

I will buy the Titan P, if the rumors are TRUE.

Should I change my Intel i5 4690 CPU? (I have the non-K version; I didn't overclock it.)

Would the i7 4790K be enough? Thanks for the answers, guys.


----------



## Pointy

What's with all this bottlenecking fear stuff?

Sounds like certain subreddits.

You will not be bottlenecked anywhere with a decent CPU,

unless you plan on playing at 480p.

Use DSR to run at 4K

and turn on in-game AA.

That should cut your excess GPU power down quite a bit.


----------



## StarGazerLeon

Well, my 3770K at 4.6GHz bottlenecks my 1070 in a few titles at 1080p, and a 3770K at 4.6GHz isn't that much slower in games than an overclocked 6700K in CPU-limited scenarios. A Pascal Titan will require an absolutely blazing 4-core/8-thread CPU at the very least; we are talking a 6700K at 4.8GHz+ with fast RAM if you are the kind of person who likes to be GPU bound at all times.


----------



## SuprUsrStan

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Syan48306*
> 
> Probably stock 6950x. I bet you my OC'd 5960x won't bottleneck one of those babies.
> 
> 
> 
> Funny you mention this....
> 
> Jayz 2 cents over on youtube has a PC running 3 titan x's with a 5960x and says the 5960x was bottlenecking the titans to his amazement....
> 
> Also good to remember that actual amount of cpu cores has little to no effect on GPU performance....
> 
> For example a 6700k>5960x for pure fps....
Click to expand...

I agree three Titan Xs will be bottlenecked by an overclocked 5960X. Heck, my three 980 Tis are bottlenecked by my 5960X on benchmarks like Valley. That said, I really don't think a single Titan P will be able to push the same kind of horsepower as three Titan Xs.


----------



## zealord

Well, it is no surprise that a big Pascal card is coming. People should know by now that Nvidia starts with a mid-range GTX X80 GPU and then transitions into the Titan, which is much better but also much, much more expensive.

I think we can use the old X80 -> Titan cadence to gauge the performance gains. People shouldn't be surprised to see 35%+ gains on GP102. Depending on clocks it might even be close to 50%, but I doubt GP102 is clocked as high out of the box as the GTX 1080, so 35% would be a realistic expectation. The same happened with GTX 680 -> vanilla Titan and GTX 980 -> Titan X.

The real questions are when it releases and how expensive it is going to be. If we go by recent history, GP102 should release in November/December 2016 at $999. But I do not believe the Pascal Titan will be $999, considering the chance of AMD having anything remotely close to challenge it is close to 0%. AMD may have the RX 490 (Vega) by then, but it is unlikely that RX480 -> RX490 is a 3x performance increase. It will probably end up between the 1070 and 1080.

My personal speculation :

- December 2016
- 35% on GTX1080 both stock
- 1299$
- 478mm²~
- GDDR5X


----------



## Zero4549

Quote:


> Originally Posted by *zealord*
> 
> well it is no surprise that a big Pascal card is coming. People should know by now that Nvidia starts with a mid-range card GTX X80 GPU then transitions into the Titan which is much better, but also much much more expensive.
> 
> I think we can take the basis of old X80 -> Titan card to see the performance gains. People shouldn't be surprised to see 35%+ gains on the GP102. Depending on clock it might even be close to 50%, but I doubt the GP102 is clocked as high out of the box as the GTX1080, so 35% would be a realistic expectation. Same happened with GTX 680 -> Vanilla Titan, GTX 980 -> Titan X.
> 
> The real questions are when does it release and how expensive is it going to be? If we go with recent history then the GP102 should release in November/December 2016 with a price of 999$. But I do not believe that the PascalTitan will be 999$ considering the chance of AMD having anything remotely close to challenge it is close to 0%. AMD may have RX 490 VEGA by then, but it is unlike that RX480 -> RX490 is like a 3 times performance increase. It will probably end up between 1070 and 1080.
> 
> My personal speculation :
> 
> - December 2016
> - 35% on GTX1080 both stock
> - 1299$
> - 478mm²~
> - GDDR5X


And then a 1080Ti launches a month or two later and performs within a margin of error of the Titan for only 2/3 the price. It is ok though, the Titan is still better because it has a fancy name.


----------



## zealord

Quote:


> Originally Posted by *Zero4549*
> 
> And then a 1080Ti launches a month or two later and performs within a margin of error of the Titan for only 2/3 the price. It is ok though, the Titan is still better because it has a fancy name.


That is how it always will be. (or was in recent history)

The Pascal Titan probably has ~3840 CUDA cores and the GTX 1080 Ti ~3500, with less VRAM.


----------



## Exeed Orbit

Quote:


> Originally Posted by *Zero4549*
> 
> And then a 1080Ti launches a month or two later and performs within a margin of error of the Titan for only 2/3 the price. It is ok though, the Titan is still better because it has a fancy name.


Gotta admit though. Titan sounds pretty pimp.


----------



## Nestala

So, 1080Ti just a 25% upgrade of 1080 for 899$ confirmed?


----------



## Nestala

Quote:


> Originally Posted by *zealord*
> 
> well it is no surprise that a big Pascal card is coming. People should know by now that Nvidia starts with a mid-range card GTX X80 GPU then transitions into the Titan which is much better, but also much much more expensive.
> 
> I think we can take the basis of old X80 -> Titan card to see the performance gains. People shouldn't be surprised to see 35%+ gains on the GP102. Depending on clock it might even be close to 50%, but I doubt the GP102 is clocked as high out of the box as the GTX1080, so 35% would be a realistic expectation. Same happened with GTX 680 -> Vanilla Titan, GTX 980 -> Titan X.
> 
> The real questions are when does it release and how expensive is it going to be? If we go with recent history then the GP102 should release in November/December 2016 with a price of 999$. But I do not believe that the PascalTitan will be 999$ considering the chance of AMD having anything remotely close to challenge it is close to 0%. AMD may have RX 490 VEGA by then, but it is unlike that RX480 -> RX490 is like a 3 times performance increase. It will probably end up between 1070 and 1080.
> 
> My personal speculation :
> 
> - December 2016
> - 35% on GTX1080 both stock
> - 1299$
> - 478mm²~
> - GDDR5X


Are you really comparing a card that is aimed at the mainstream market (RX 480) with Vega? RX 480 has GDDR5, while Vega will have HBM2 and be a bigger die.


----------



## ChevChelios

Quote:


> Originally Posted by *Nestala*
> 
> Are you really comparing a card that is aimed at the mainstream market (RX 480) with Vega? RX 480 has GDDR5, while Vega will have HBM2 and be a bigger die.


but does that Vega die look big enough to you to go against 1080Ti ?









granted, it's Vega 10, not Vega 11

but Vega will still share some of Polaris' tech, as well as its 14nm GloFo process (as far as I know) .. and that we can already judge from the 480

Quote:


> Originally Posted by *Zero4549*
> 
> And then a 1080Ti launches a month or two later and performs within a margin of error of the Titan for only 2/3 the price.


yes, ppl knew all that when Titan X released too .. and they still bought those Titan Xs


----------



## zealord

Quote:


> Originally Posted by *Nestala*
> 
> Are you really comparing a card that is aimed at the mainstream market (RX 480) with Vega? RX 480 has GDDR5, while Vega will have HBM2 and be a bigger die.


read my post again. I said it's unlikely that RX480 -> RX490 is a 3 times performance increase. If I am wrong and VEGA RX490 is 3 times as fast as Polaris RX480 then I will apologize to you personally









I never compared them directly I simply said what I said again above. Please don't put words in my mouth. Thank you.


----------



## ChevChelios

Quote:


> So, 1080Ti just a *25%* upgrade of 1080 for 899$ confirmed?


how do you figure ?

the OP says Titan P = 50% over 1080

that, *if true*, would make 1080Ti *40%+* over 1080
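For anyone checking the napkin math, here's the arithmetic behind that 40%+ figure. To be clear, nothing in it is confirmed: the +50% is the OP's rumor, and the "Ti lands within ~5% of the Titan" ratio is just an assumption extrapolated from past gens (780 Ti vs Titan Black, 980 Ti vs Titan X):

```python
# Napkin math on the rumored positioning. Nothing here is a confirmed spec:
# the +50% is the OP's rumor; the 5% Ti-vs-Titan gap is assumed from past gens.
titan_over_1080 = 1.50   # rumored: Titan P = GTX 1080 + 50%
ti_vs_titan = 0.95       # assumed: 1080 Ti within ~5% of the Titan

ti_over_1080 = titan_over_1080 * ti_vs_titan
print(f"1080 Ti vs 1080: +{(ti_over_1080 - 1) * 100:.0f}%")  # prints +42%
```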


----------



## thebski

The only thing I'm curious about is where they price these. The fact that there will be two Titan branded cards scares me. I felt burned by the first Titan after I bought two of them. Of course it was my own choice to pony up $2K, but I didn't appreciate the 780 coming out 3 months later. I swore then that I would never pony up for a Titan again. My tune has changed a little bit, if it were the right card. I guess I've been conditioned on ridiculous prices like most others around here. Their plan is working.

I have said all along that if I could get back to a single card (which I'd like to) that matches my 980 Ti SLI performance, I'd pay for it. However, I'm nearly certain that "big Titan" will cost more than $1K if there are two Titan cards. That really wouldn't be that great of a deal considering I paid $1350 for 980 Ti SLI performance a year ago.

The other stuff isn't much of a surprise. Being 50% faster than GP104 is exactly what I expected based on rumored specs and history. It's also not surprising that it will be bottlenecked by CPUs in the right scenario. That is nothing new for high-end GPUs. GPU reviews often show GPU-bound games to demonstrate a card's performance, but there are several games that are hard on a CPU as well.

I will be like everyone else, waiting to see how much of my body I have to sell to get one. I'm afraid high end PC gaming is just getting too ridiculous. Now that the top end CPU is $1750 I wouldn't be surprised to see NV try to slap some astronomical price tag on the big Titan card. As if $1000 wasn't already astronomical enough.


----------



## ChevChelios

Quote:


> The only thing I'm curious about is where they price these. The fact that there will be two Titan branded cards scares me. I felt burned by the first Titan after I bought two of them. Of course it was my own choice to pony up $2K, but I didn't appreciate the 780 coming out 3 months later. I swore then that I would never pony up for a Titan again. My tune has changed a little bit, if it were the right card. I guess I've been conditioned on ridiculous prices like most others around here. Their plan is working.


if you don't want to get burned money-wise, never buy Titans, period

only buy X80 Ti cards each gen (or every second gen)


----------



## zealord

Quote:


> Originally Posted by *thebski*
> 
> The only thing I'm curious about is where they price these. The fact that there will be two Titan branded cards scares me. I felt burned by the first Titan after I bought two of them. Of course it was my own choice to pony up $2K, but I didn't appreciate the 780 coming out 3 months later. I swore then that I would never pony up for a Titan again. My tune has changed a little bit, if it were the right card. *I guess I've been conditioned on ridiculous prices like most others around here. Their plan is working.*
> 
> I have said all along that if I could get back to a single card (which I'd like to) that matches my 980 Ti SLI performance, I'd pay for it. However, I'm nearly certain that "big Titan" will cost more than $1K if there are two Titan cards. That really wouldn't be that great of a deal considering I paid $1350 for 980 Ti SLI performance a year ago.
> 
> The other stuff isn't much of a surprise. Being 50% faster than GP104 is exactly what I expected based on rumored specs and history. It's also not surprising that it will be bottlenecked by CPUs in the right scenario. That is nothing new for high end GPUs. GPU reviews often show GPU bound games to demonstrate a cards performance, but there are several games that are hard on a CPU as well.
> 
> I will be like everyone else, waiting to see how much of my body I have to sell to get one. I'm afraid high end PC gaming is just getting too ridiculous. Now that the top end CPU is $1750 I wouldn't be surprised to see NV try to slap some astronomical price tag on the big Titan card. As if $1000 wasn't already astronomical enough.


haha, very true. There were two polls on a German forum, in 2013 and in 2015, both asking the same thing: "What are you ready to pay for a Titan card?"

In 2013, when the vanilla Titan came out, something like 95% of people said 600€ or lower. Only 1% were ready to pay $999, and only about 0.1% were ready to pay even more.

In 2015, when the Titan X came out at $999, about 60% of people said 600€ or lower and 8% were ready to pay $999 or more.

Nvidia's marketing/strategy is working really well.

I bet that if they asked the same question shortly before the Pascal Titan comes out, about 15% would be ready to pay $999 or more for it, simply because of how expensive the GTX 1080 is.


----------



## Nightbird

Quote:


> Originally Posted by *ChevChelios*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> but does that Vega die look big enough to you to go against 1080Ti ?


If that picture is to scale, then Vega 10 is 80-100% bigger than Polaris 10. If that translates to 50-70% more performance, would it be fast enough?


----------



## sugalumps

Quote:


> Originally Posted by *zealord*
> 
> haha very true. There were two polls on a german forum in 2013 and in 2015. Both asking the same thing "What are you ready to pay for a Titan card?"
> 
> In 2013 when the Titan Vanilla came out it was like 95% of people said 600€ or lower. Only 1% was ready to pay 999$ and only like 0,1% was ready to pay even more.
> 
> In 2015 when the Titan X came out at 999$ it was like 60% of people said 600€ or lower and 8% were ready to pay 999$ or more.
> 
> Nvidias marketing/strategy is working really well.
> 
> I bet that if they asked the same question shortly before Titan Pascal comes out like 15% are ready to pay 999$ or more for it simply because of how expensive the GTX 1080 is.


It's not really marketing strategy tbh, it's just each year the competition falls a little further behind allowing nvidia to get away with more.


----------



## thebski

Quote:


> Originally Posted by *ChevChelios*
> 
> if you dont want to get burned money-wise - never buy Titans, period
> 
> only buy X80Tis cards each gen (or every second gen)


That's probably a valid point. It just all depends on where they all lie in price and performance.


----------



## ChevChelios

Quote:


> If that picture is to scale than Vega10 is 80-100% bigger than Polaris 10, if that translates to 50-70% more performance would be it fast enough?


fast enough for what specifically ?

https://www.techpowerup.com/reviews/AMD/RX_480/24.html - 1080 is 80%+ faster than a ref 480; 1070 is 50%+ faster
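Plugging Nightbird's die-size guess against those TPU deltas gives a rough answer. Assumptions up front: the 80-100% area figure is eyeballed from a slide, the ~0.8 scaling efficiency (performance rarely scales linearly with die area) is my own guess, and only the percentages from the linked review are measured:

```python
# Rough check: does a die 80-100% bigger than Polaris 10 reach the
# 1070/1080 bracket? Only the TPU deltas below are measured numbers;
# the area ratios and the 0.8 scaling efficiency are assumptions.
gtx1070_over_480 = 1.50   # TPU: GTX 1070 ~50%+ faster than reference RX 480
gtx1080_over_480 = 1.80   # TPU: GTX 1080 ~80%+ faster

for area_ratio in (1.8, 2.0):            # "80-100% bigger" die
    est = 1 + 0.8 * (area_ratio - 1)     # assumed sublinear area scaling
    print(f"area x{area_ratio}: ~{est:.2f}x RX 480")
# -> ~1.64x and ~1.80x: squarely in 1070/1080 territory, short of a 1080 Ti.
```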


----------



## Nightbird

Quote:


> Originally Posted by *ChevChelios*
> 
> Quote:
> 
> 
> 
> If that picture is to scale than Vega10 is 80-100% bigger than Polaris 10, if that translates to 50-70% more performance would be it fast enough?
> 
> 
> 
> fast enough for what specifically ?
> 
> https://www.techpowerup.com/reviews/AMD/RX_480/24.html - 1080 is 80%+ faster than a ref 480; 1070 is 50%+ faster
Click to expand...

Look at the quote, it's a rhetorical question


----------



## zealord

Quote:


> Originally Posted by *sugalumps*
> 
> It's not really marketing strategy tbh, it's just each year the competition falls a little further behind allowing nvidia to get away with more.


Well, AMD is certainly helping Nvidia, but Nvidia still has to "exploit" AMD's failure to compete in the high-end segment.

"Marketing strategy" may have been an inappropriate term for me to use in that scenario. Maybe "adapting to competition" would've been more fitting.


----------



## ChevChelios

Quote:


> Look at the quote, it's a rhetorical question


Im sorry


----------



## Exeed Orbit

Quote:


> Originally Posted by *Nestala*
> 
> So, 1080Ti just a 25% upgrade of 1080 for 899$ confirmed?


Diminishing returns bro


----------



## thebski

Quote:


> Originally Posted by *sugalumps*
> 
> It's not really marketing strategy tbh, it's just each year the competition falls a little further behind allowing nvidia to get away with more.


It is both, in my opinion. You are definitely right that AMD being nowhere in sight pretty much allows Nvidia to do what they want. However, the whole Titan branding was a marketing strategy, and it is working. They touted these cards as halo cards even though they were nothing new. Before the Titan, we had the GTX 580. A GTX 580 and a GTX Titan X are the same card relative to their respective architectures and processes. Now that the Titan branding and marketing have soaked in for a couple of gens, we are seeing $700 cards based on the small chips. And people are lapping it up.


----------



## Exeed Orbit

Quote:


> Originally Posted by *sugalumps*
> 
> It's not really marketing strategy tbh, it's just each year the competition falls a little further behind allowing nvidia to get away with more.


People are becoming more and more willing to disregard price and go for pure performance. AMD has usually been pretty good at price/performance, but Nvidia has usually beaten them outright in performance (at least in the last few gens, and longevity notwithstanding).


----------



## criminal

$1499 at least.


----------



## Wishmaker

1750 is my vote because NVIDIA has no shame









----------



## Nestala

Quote:


> Originally Posted by *ChevChelios*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nestala*
> 
> Are you really comparing a card that is aimed at the mainstream market (RX 480) with Vega? RX 480 has GDDR5, while Vega will have HBM2 and be a bigger die.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> but does that Vega die look big enough to you to go against 1080Ti ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> granted its a Vega10, not Vega 11
> 
> but Vega will still take some of Polaris tech, as well as its 14nm GLoFo (as far as I know) .. and that we can already judge from the 480
> 
> Quote:
> 
> 
> 
> Originally Posted by *Zero4549*
> 
> And then a 1080Ti launches a month or two later and performs within a margin of error of the Titan for only 2/3 the price.
> 
> Click to expand...
> 
> yes, ppl knew all that when Titan X released too .. and they still bought those Titan Xs
Click to expand...

I think with Vega it's the other way around, so that would mean that Vega 11 would be the big chip and Vega 10 the cut down one.

*
Edit:* Also, your argument is pretty bad. That's like saying we can judge the 980 Ti's performance from the 780... right? Since they're both 28nm?















Quote:


> Originally Posted by *zealord*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nestala*
> 
> Are you really comparing a card that is aimed at the mainstream market (RX 480) with Vega? RX 480 has GDDR5, while Vega will have HBM2 and be a bigger die.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> read my post again. I said it's unlikely that RX480 -> RX490 is a 3 times performance increase. If I am wrong and VEGA RX490 is 3 times as fast as Polaris RX480 then I will apologize to you personally
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I never compared them directly I simply said what I said again above. Please don't put words in my mouth. Thank you.
Click to expand...

Didn't want to offend you, mate, just sayin' that Vega will be a whole other beast. We don't know anything besides that it has a bigger die and HBM2, and that it's a card aimed at the enthusiast market. So there's no use in guessing its performance or writing it off prematurely.

Quote:


> Originally Posted by *ChevChelios*
> 
> Quote:
> 
> 
> 
> So, 1080Ti just a *25%* upgrade of 1080 for 899$ confirmed?
> 
> 
> 
> how do you figure ?
> 
> the OP says Titan P = 50% over 1080
> 
> that, *if true*, would make 1080Ti *40%+* over 1080
Click to expand...

Yeah, actually the Ti versions always released after the Titan, right? Got that mixed up there. But yeah, you're correct. Also, I just pulled the 25% out of thin air; we don't know.


----------



## zealord

Quote:


> Originally Posted by *Nestala*
> 
> I think with Vega it's the other way around, so that would mean that Vega 11 would be the big chip and Vega 10 the cut down one.
> Didn't want to offend you mate, just sayin' that Vega will be a whole other beast and we don't know anything besides that it has a bigger die and HBM2, and that it's a card aimed at the enthusiast market. So no use in guessing the performance of it or writing it off prematurely
> 
> 
> 
> 
> 
> 
> 
> .


It's alright. We will see "soon" how it turns out. My bet is that Vega definitely can't contest GP102, and will rather land in the region around the GTX 1070-GTX 1080.

If the RX480 were better it'd be a different story, but the reality is that a GTX 1080 is like 70% faster than an RX480. It will be damn hard for AMD to close that gap.









There is simply no way I see VEGA10 RX490 being able to compete with GP102. It would also leave a huge gap between RX480 and RX490 where AMD has no card to offer.

If I had to take a guess :

RX490

GTX1070 +10% performance

399$-499$


----------



## keikei

Quote:


> Originally Posted by *Rayleyne*
> 
> This is... Disgusting... I want one.


----------



## ChevChelios

what if their "6950X = 6700K + Titan P" joke hints at the price?

~$1700 - $350 = $1300-1350 for the Titan P?


----------



## Nestala

Quote:


> Originally Posted by *zealord*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nestala*
> 
> I think with Vega it's the other way around, so that would mean that Vega 11 would be the big chip and Vega 10 the cut down one.
> Didn't want to offend you mate, just sayin' that Vega will be a whole other beast and we don't know anything besides that it has a bigger die and HBM2, and that it's a card aimed at the enthusiast market. So no use in guessing the performance of it or writing it off prematurely
> 
> 
> 
> 
> 
> 
> 
> .
> 
> 
> 
> It's allright. We will see "soon" how it turns out to be. My bet is that VEGA definitively can't contest GP102, but rather in the region around GTX1070-GTX1080.
> 
> If the RX480 was better it'd be a different story, but the reality is that a GTX 1080 is like 70% faster than a RX480. It will be damn hard for AMD to close that gap.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> There is simply no way I see VEGA10 RX490 being able to compete with GP102. It would also leave a huge gap between RX480 and RX490 where AMD has no card to offer.
> 
> If I had to take a guess :
> 
> RX490
> 
> GTX1070 +10% performance
> 
> 399$-499$
Click to expand...

Soon? Half a year or something is such a long wait.
Maybe you're right, but I really think it will have 1080+X% performance. I don't think having nothing other than the 4/8GB RX 480 is bad; AMD themselves said that segment alone accounts for ~83% or something of the market. That leaves them the rest of the enthusiast market.
And just because the RX 480 has "mainstream" performance doesn't mean that Vega can't have good performance. It's a flagship card, after all. That's like saying the 980 Ti can't have nice performance because the 960 (a mainstream card on the same 28nm process) doesn't perform great. See what I mean?









Edit: OCN messed quotes up.


----------



## ChevChelios

Vega 10 should be in the 1070/1080 bracket and (the bigger) Vega 11 should be in the 1080Ti bracket

how it will be in reality we'll see


----------



## Nestala

Quote:


> Originally Posted by *ChevChelios*
> 
> what if their "6950X = 6700K + Titan P" joke can hint at the price ?
> 
> ~$1700 - $350 = $1300-1350 for Titan P ?


Could very well be possible; a bit more expensive than the original Titan at $999, and their recent pricing history shows that they slowly ramp up the price. So a new Titan in the $1300-1500 range could very well be it.

Quote:


> Originally Posted by *ChevChelios*
> 
> Vega 10 should be in the 1070/1080 bracket and (the bigger) Vega 11 should be in the 1080Ti bracket
> 
> how it will be in reality we'll see


Exactly.


----------



## zealord

Quote:


> Originally Posted by *ChevChelios*
> 
> Vega 10 should be in the 1070/1080 bracket and (the bigger) Vega 11 should be in the 1080Ti bracket
> 
> how it will be in reality we'll see


It is really confusing that Polaris 11 is the small Polaris and Polaris 10 is the big Polaris,
but Vega 10 is the small Vega and Vega 11 is supposed to be the big Vega?










Quote:


> Originally Posted by *Nestala*
> 
> Soon? Half a year or something is so long of a wait
> 
> 
> 
> 
> 
> 
> 
> .
> Maybe you're right, but I really think it will have 1080+X% performance. I don't think not having anything other than the 4/8GB RX 480 is bad, AMD said themselves that that alone accounts for..~83% or something of the market. That leaves them with the rest of the enthusiast market.
> And just because the RX 480 has "mainstream" performance, doesn't mean that Vega can't have good performance. It's a flagship card after all. That's like saying 980Ti can't have nice performance because 960 (a mainstream card that is on the same 28nm process) doesn't perform great. See what I mean?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: OCN messed quotes up.


I see what you mean, but there is a lot of hope in there. I don't think the RX480 aligns with the RX490 the same way the GTX 960 aligns with the GTX 980 Ti. Between the 960 and the 980 Ti there are two cards that fill the gap, but what is supposed to sit between the RX480 and RX490?

Maybe AMD is going the Fury route again: RX480 -> RX490 -> Fury-like 14nm card.


----------



## Nestala

Quote:


> Originally Posted by *zealord*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ChevChelios*
> 
> Vega 10 should be in the 1070/1080 bracket and (the bigger) Vega 11 should be in the 1080Ti bracket
> 
> how it will be in reality we'll see
> 
> 
> 
> It is really confusing that Polaris11 is the small Polaris and Polaris10 is the big Polaris,
> but VEGA10 is the small vega and VEGA11 is supposed to be the big VEGA?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nestala*
> 
> Soon? Half a year or something is so long of a wait
> 
> 
> 
> 
> 
> 
> 
> .
> Maybe you're right, but I really think it will have 1080+X% performance. I don't think not having anything other than the 4/8GB RX 480 is bad, AMD said themselves that that alone accounts for..~83% or something of the market. That leaves them with the rest of the enthusiast market.
> And just because the RX 480 has "mainstream" performance, doesn't mean that Vega can't have good performance. It's a flagship card after all. That's like saying 980Ti can't have nice performance because 960 (a mainstream card that is on the same 28nm process) doesn't perform great. See what I mean?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: OCN messed quotes up.
> 
> Click to expand...
> 
> I see what you mean, but there is a lot of hope in there. I don't think the RX480 aligns the same with the RX490 as the GTX 960 alings with the GTX 980 Ti. Between the 960 and 980 Ti there are 2 cards that fill the gap, *but what is supposed to be between the RX480 and RX490* ?
> 
> Maybe AMD is going the Fury route again. RX480 -> RX490 -> Fury-like 14nm card.
Click to expand...

The smaller version of Vega (whether that is 10 or 11)?


----------



## dieanotherday

they dont have to release if AMD provides no competition.


----------



## Nestala

Quote:


> Originally Posted by *dieanotherday*
> 
> they dont have to release if AMD provides no competition.


If they have no competition they'll probably just take more time releasing it and price it higher.


----------



## Waitng4realGPU

Quote:


> Originally Posted by *dieanotherday*
> 
> they dont have to release if AMD provides no competition.


Err... if they don't release, they won't make more money.

Watch many OCN members sell their 1070/1080 for Titans when they're released.

Whether AMD competes or not won't stop this card from coming out; holding it back makes no sense. They will have a beast card they can tout as the amazing 4K experience, etc.


----------



## Nestala

By the way @Waitng4realGPU, what is a "real GPU" for you?


----------



## Klocek001

Quote:


> Originally Posted by *Nestala*
> 
> By the way @Waitng4realGPU, what is a "real GPU" for you?


rx480


----------



## Waitng4realGPU

Quote:


> Originally Posted by *Nestala*
> 
> By the way @Waitng4realGPU, what is a "real GPU" for you?


A Titan @ 2500MHz for $200 that unlocks to a Super Titan with unlockable RAM, all with DX13/DX14 performance.

I'll be waiting for a while








Quote:


> Originally Posted by *Klocek001*
> 
> rx480


Correct I am waiting for an AIB 480.


----------



## zealord

Quote:


> Originally Posted by *Nestala*
> 
> The smaller version of Vega (wheather that is 10 or 11)?


AMD's new naming scheme doesn't leave room for something between those two (RX480 and RX490)








Quote:


> Originally Posted by *dieanotherday*
> 
> they dont have to release if AMD provides no competition.


Those prestige products are also for... well, prestige!

People who hear about Nvidia having the best GPU (the Pascal Titan) associate Nvidia with success and good GPUs. They are then more likely to buy Nvidia products in their own budget range.


----------



## Nestala

Quote:


> Originally Posted by *Waitng4realGPU*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nestala*
> 
> By the way @Waitng4realGPU, what is a "real GPU" for you?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Titan @ 2500mhz for $200 that unlocks to super titan and unlockable ram, all with DX13/DX14 performance.
> 
> I'll be waiting for a while
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Klocek001*
> 
> rx480
> 
> Click to expand...
> 
> Correct I am waiting for an AIB 480.
Click to expand...

Haha








Same on the AIB 480 front; I want to replace my 7950 since it's showing its age right now.
I'm planning on getting an AIB 480 for 1080p, and when Vega hits (and has decent 4K performance for an alright price), upgrading to Vega and 4K.

I can't decide if I should wait for a Sapphire Vapor-X 480 or just go with the first AIB 480 from Sapphire (the Nitro, I guess) right now.

Edit:
Quote:


> Originally Posted by *zealord*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nestala*
> 
> The smaller version of Vega (wheather that is 10 or 11)?
> 
> 
> 
> AMDs new naming scheme doesn't leave room for something between those 2 (RX480 and RX490)
Click to expand...

Well, may be the 490 is small Vega and they'll have a Fury style card for big Vega?

Or small Vega=490 and big Vega=490X?


----------



## Waitng4realGPU

Quote:


> Originally Posted by *Nestala*
> 
> Haha
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Same on the AIB 480 front, I want to replace my 7950 since it shows it's age right now.
> Planning on getting an AIB 480 for 1080p and when Vega hits (and has decent 4k performance for an alright price) upgrade to Vega and 4k.
> 
> I can't decide if I should wait for Sapphire VaporX 480 or if I should just go with the first AIB 480 from Sapphire (the Nitro I guess) right now.


I'm in the same boat with a 7970, same res too.

I'm hoping a bunch of AIB cards come out at the same time so we have some choice. Not sure if I'd fork out extra for the Vapor-X (if there is one).


----------



## Exeed Orbit

Quote:


> Originally Posted by *zealord*
> 
> AMDs new naming scheme doesn't leave room for something between those 2 (RX480 and RX490)


Wasn't it mentioned that they'd be using the "5" suffix as well? So, 480, 485, 490, etc. But I believe that would be the second iteration of the chips. So not to be expected any time soon.


----------



## zealord

Quote:


> Originally Posted by *Exeed Orbit*
> 
> Wasn't it mentioned that they'd be using the "5" suffix as well? So, 480, 485, 490, etc. But I believe that would be the second iteration of the chips. So not to be expected any time soon.


Oh, I missed that. It's a possibility.

But we also have the leak on AMD's official website that the RX490 is coming.


----------



## Cyro999

Quote:


> Originally Posted by *Wishmaker*
> 
> I am extremely curious to see how the 6950X bottlenecks this. Also, at what frequency was it tested? I highly doubt a 4.5-4.7GHz 6950X will bottleneck anything!


A 4.5-4.7GHz 6950X is slower than a 6600K in a bunch of games that are already CPU-bottlenecked on a GTX 980 or slower cards - many games in the MMO and RTS genres, plus some others.

For actual GPU performance, a 600mm² GPU with HBM2 has the potential to be roughly 2x a 1080 at similar clock speeds, so 1.5x should be easy.
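The napkin math behind that 1.5x claim, as a quick sketch: the 2560-core / ~1733 MHz figures are the GTX 1080's public specs, the 3840-core count is just the rumored GP102 configuration from this thread, and perfect scaling (no memory or CPU bottleneck) is assumed.

```python
def relative_perf(cores, clock_mhz, base_cores=2560, base_clock_mhz=1733):
    """Naive throughput scaling: performance ~ shader count x clock.

    Ignores memory bandwidth, IPC and CPU limits, so treat it as an
    upper bound, not a prediction.
    """
    return (cores * clock_mhz) / (base_cores * base_clock_mhz)

# A rumored 3840-core part at the GTX 1080's ~1733 MHz boost clock:
print(relative_perf(3840, 1733))  # 1.5
```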


----------



## ChevChelios

tbh they do kind of have to release the Titan P/1080Ti at some point, even if Vega isn't around or isn't a direct competitor

just to get 980Ti/Titan X owners to upgrade .. lots of them passed on the 1080, and rightfully so

plus the high-end community has been waiting a long time for that "4K @ 60+ fps @ single GPU" sweet combo, so for that alone they will clear the Titan P or 1080Ti off the shelves, and price won't even matter if it can do 4K@60fps


----------



## Cakewalk_S

Ok so recap...

GTX1070 - GP104 + GDDR5
GTX1080 - GP104 + GDDR5X
GTX1080ti - GPBigDie + GDDR5X
Titan P - GPBigDie + HBM2

I think that's right...


----------



## Exeed Orbit

Quote:


> Originally Posted by *zealord*
> 
> Oh I missed that. It's a possibility.
> 
> But we also have the leak on AMDs official website that the RX490 is coming.


The fact that it's coming is a surprise to no one methinks. My question is... when. If it's 2016, it's worth waiting for (though I highly doubt that). If it's 2017, I'll wait for the 1060 to come out, see which $/perf is better between that and the 480. That should hold me out until competition to the 1070/80 comes rolling around.


----------



## Exeed Orbit

Quote:


> Originally Posted by *Cakewalk_S*
> 
> Ok so recap...
> 
> GTX1070 - GP104 + GDDR5
> GTX1080 - GP104 + GDDR5X
> GTX1080ti - GPBigDie + GDDR5X
> Titan P - GPBigDie + HBM2
> 
> I think that's right...


The only thing is, if they decide to use the same GP102 on the Ti and Titan, I don't think they'd want one version that uses GDDR5X and another that uses HBM. Doesn't really make all that much sense, unless they're trying to differentiate performance enough to warrant making two separate SKUs.


----------



## ChevChelios

Quote:


> GTX1080ti - GPBigDie + GDDR5X
> Titan P - GPBigDie + HBM2


no one knows for sure

IMHO full Titan P will definitely be HBM2, but 1080Ti may or may not be

the OP also talks of a 375W Pascal and a 300W Pascal .. maybe 375W = HBM2, 300W = G5X ?


----------



## Exeed Orbit

Quote:


> Originally Posted by *ChevChelios*
> 
> no one knows for sure
> 
> IMHO full Titan P will definitely be HBM2, but 1080Ti may or may not be
> 
> the OP also talks of a 375W Pascal and a 300W Pascal .. maybe 375W = HBM2, 300W = G5X ?


You would think that based on the fact that HBM is more power efficient, it would use less power, no?


----------



## Nestala

Quote:


> Originally Posted by *ChevChelios*
> 
> tbh they do kind of have to release the Titan P/1080Ti at some point even if Vega isnt around or isnt a direct competitor
> 
> just to get 980Ti/Titan X owners to upgrade .. lots of them passed on the 1080 and rightfully so
> 
> plus the high-end community has been waiting a long time for that "*4K @ 60+ fps @ single GPU*" sweet combo, so for that alone they will clear the Titan P or 1080Ti off the shelves, and price won't even matter if it can do 4K@60fps


Yup. I mean, 4K is already doable with no AA (to be fair, not really needed at 4K anyway) and medium settings... but 4K/60 at high settings? Hell yeah I'll upgrade. That's what I and many others are waiting for right now.


----------



## ChevChelios

Quote:


> Originally Posted by *Exeed Orbit*
> 
> You would think that based on the fact that HBM is more power efficient, it would use less power, no?


well, obviously the 375W one will be the more powerful, expensive one with more cores

so that may justify using the best (and most expensive) HBM2 memory on it as well, to make a true monster

while the lesser one makes do with fewer cores and lesser memory


----------



## Cyro999

Quote:


> Originally Posted by *Exeed Orbit*
> 
> You would think that based on the fact that HBM is more power efficient, it would use less power, no?


That's one of the main advantages of HBM2 - memory bandwidth per watt. However core clock speeds (due to changed stock voltages) will likely have a greater impact on power profile.


----------



## Exeed Orbit

Quote:


> Originally Posted by *ChevChelios*
> 
> well obviously the 375W will be the more powerful expensive one with more cores
> 
> so that may justify using the best (and most expensive) HBM2 memory on it as well to make a true monster
> 
> while the lesser one makes do with fewer cores and lesser memory


Your line of thinking is sound.

I just figured that if the core count differs by a mere 300 cores, the power consumption difference wouldn't be that big.


----------



## Waitng4realGPU

Quote:


> Originally Posted by *Exeed Orbit*
> 
> Your line of thinking is sound.
> 
> I just figured that if the core count differs by a mere 300 cores, the power consumption difference wouldn't be that big.


AMD dropped about 20-30W using HBM, didn't they, iirc?

If the Titan card uses HBM2 then I'd expect it to use the same total power or less than a 1080Ti with a big pile of GDDR5.


----------



## Nestala

Quote:


> Originally Posted by *Cyro999*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Exeed Orbit*
> 
> You would think that based on the fact that HBM is more power efficient, it would use less power, no?
> 
> 
> 
> That's one of the main advantages of HBM2 - memory bandwidth per watt. However core clock speeds (due to changed stock voltages) will likely have a greater impact on power profile.

It also takes up less space, and it makes 4-32GB of VRAM possible. But the biggest advantage is ofc the 1024GB/s memory bandwidth (compare that to 320GB/s of GDDR5X on the 1080).
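Those bandwidth figures fall straight out of the bus widths quoted in the OP. A quick sketch (the 1024 GB/s number assumes the full 2 Gb/s HBM2 pin rate; early HBM2 parts like the Tesla P100 actually shipped slower):

```python
def peak_bandwidth_gbs(bus_width_bits, pin_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin rate (Gb/s) / 8."""
    return bus_width_bits * pin_rate_gbps / 8

print(peak_bandwidth_gbs(4 * 1024, 2.0))  # 1024.0 - four 1024-bit HBM2 stacks at 2 Gb/s
print(peak_bandwidth_gbs(3 * 1024, 2.0))  # 768.0  - rumored 12 GB / 3072-bit variant
print(peak_bandwidth_gbs(256, 10.0))      # 320.0  - GTX 1080's 256-bit GDDR5X at 10 Gb/s
```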


----------



## guttheslayer

Quote:


> Originally Posted by *ChevChelios*
> 
> well obviously the 375W will be the more powerful expensive one with more cores
> 
> so that may justify using the best (and most expensive) HBM2 memory on it as well to make a true monster
> 
> while the lesser one makes do with fewer cores and lesser memory


Please don't just assume. The fact that it's a big die means there has to be a full-blown version and a cut-down one, and because they're based on the same GPU core, if it's HBM both will be HBM; if it's G5X, both will be G5X.

Also, the rumors pointing toward an August release suggest both big Pascal cards are G5X variants, unless HBM2 miraculously reached volume production, or it comes with ridiculous pricing.


----------






## ChevChelios

Quote:


> The fact that it is going to release in August time frame


wat

that's just a rumor of an announcement

there is no guarantee that the rumor is true, and no guarantee of even a 2016 release, much less an August release


----------



## guttheslayer

Quote:


> Originally Posted by *ChevChelios*
> 
> wat
> 
> that's just a rumor of an announcement
> 
> there is no guarantee that the rumor is true, and no guarantee of even a 2016 release, much less an August release


Well, if they do have samples, Nvidia could already have the Titan ready. I mean, they already released the PCIe Tesla, so they have the chip. The problem is whether they want to release it now or hold it back till 2017.


----------



## Nestala

Quote:


> Originally Posted by *guttheslayer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ChevChelios*
> 
> wat
> 
> that's just a rumor of an announcement
> 
> there is no guarantee that the rumor is true, and no guarantee of even a 2016 release, much less an August release
> 
> 
> 
> Well, if they do have samples, Nvidia could already have the Titan ready. I mean, they already released the PCIe Tesla, so they have the chip. The problem is whether they want to release it now or hold it back till 2017.

Having an engineering sample =/= being ready for production.


----------



## guttheslayer

While it's good that they *MAY* release the Titan sooner than expected, there is always a catch.

The price.

I am preparing my kidneys for an MSRP of $1499-$1999. Maybe that is why they mention the 6950X: they're in effectively the same price range. Nvidia is hinting that you should step down to a 6700K and use the cash to get a Titan P instead.

Looks like there may be a hidden message.


----------



## guttheslayer

Quote:


> Originally Posted by *Nestala*
> 
> Having an engineering sample =/= being ready for production.


More like they have the card, just not enough stock.

It's always about stock, actually. No point having five cards on hand if the rest of the world can't get one.


----------



## ChevChelios

there's no way it's $2000

I'd say $1500 is the absolute max ceiling and $1200-1300 is more likely


----------



## guttheslayer

Quote:


> Originally Posted by *ChevChelios*
> 
> there's no way it's $2000
> 
> I'd say $1500 is the absolute max ceiling and $1200-1300 is more likely


I am not sure 16GB of HBM even fits in a $1500 price bracket. Sure, $1200-1300 might be plausible, but only if it doesn't have DP (double-precision) capability.

And no one knows the die size of this hidden GP102. 375W is more than 2x the power of the 1080, and that's on top of having HBM, which doesn't add up for just 50% more shaders. Something doesn't tally; is this card clocked ridiculously high?


----------



## zealord

oh god, we are already starting to give Nvidia ideas about how they should price the card. It's perfect research material for them to surf forums lol.

People expecting it to be $1300+ and being fine with it make Jen-Hsun happy.

We should all scream "NAH NAH WAY TOO MUCH. GO BELOW $700 OR NO BUY"


----------



## Cybertox

This is merely an assumption, hence the use of the word "could"; nothing to discuss here. It may or may not be 50% faster than the 1080.


----------



## guttheslayer

Quote:


> Originally Posted by *zealord*
> 
> oh god, we are already starting to give Nvidia ideas about how they should price the card. It's perfect research material for them to surf forums lol.
> 
> People expecting it to be $1300+ and being fine with it make Jen-Hsun happy.
> 
> We should all scream "NAH NAH WAY TOO MUCH. GO BELOW $700 OR NO BUY"


Actually, the only person who has the power to do that is Lisa, but I doubt she can say that.


----------



## guttheslayer

Quote:


> Originally Posted by *Cybertox*
> 
> This is just merely an assumption hence the use of the word "could", nothing to discuss here. It may or may not be 50% faster than the 1080.


Anything lower than 50% more performance for 375W is abysmal.

It's like a GTX 480 on steroids.


----------



## ChevChelios

frankly I'm not sure I believe the 375W thing

that's insane

I think 250W-300W will more than cover the performance target that would satisfy them and the buyers


----------



## BillOhio

Any interest I might have had in a 1080 is pretty much gone at this point. Call me when the 1080Ti rolls out. Hopefully/probably(?) that'll be less than 12 months from now.


----------



## SuprUsrStan

Quote:


> Originally Posted by *ChevChelios*
> 
> frankly Im not sure i believe the 375W thing
> 
> thats insane
> 
> I think 250W-300W will more than cover the performance target that would satisfy them and the buyers


Double 8-pin is absolutely insane for a Pascal part, considering the 1080 manages with a single 8-pin. 150W + 75W = 225W.

Double 8-pin would be 375W on a 16nm die. The GTX 780 was 6+8-pin. Hell, even the GTX 480 was 6+8-pin. There's almost no way a stock Titan P is double 8-pin.
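For reference, the connector math everyone in this thread is quoting comes straight from the PCIe power limits: 75W from the slot, 75W per 6-pin, 150W per 8-pin. A quick sketch:

```python
# PCIe power-delivery limits in watts, per the PCIe spec:
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

def board_power_limit(*aux_connectors):
    """Maximum board power = slot allocation + auxiliary connectors."""
    return SLOT + sum(aux_connectors)

print(board_power_limit(EIGHT_PIN))             # 225 - reference GTX 1080
print(board_power_limit(EIGHT_PIN, SIX_PIN))    # 300 - rumored 8+6-pin config
print(board_power_limit(EIGHT_PIN, EIGHT_PIN))  # 375 - rumored 8+8-pin config
```

Note these are delivery ceilings, not what the card necessarily draws; a card can carry 8+8-pin and still run well under 375W.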


----------



## guttheslayer

Quote:


> Originally Posted by *ChevChelios*
> 
> frankly Im not sure i believe the 375W thing
> 
> thats insane
> 
> I think 250W-300W will more than cover the performance target that would satisfy them and the buyers


Well if it does come with 8+8 pin it simply means it could draw anything from 300-375W.


----------



## EightDee8D

That 375W figure is just total board power: 150+150+75 (8-pin + 8-pin + PCIe slot). Actual power consumption will be more like 180+90 = 270W, unless they go with lower clocks than the GTX 1080. One thing is pretty certain: both the 1060 and the Titan P/1080Ti will have slightly lower perf/W than the 1080, just like the 980 vs the other Maxwell 2 cards.

As for being CPU limited: if you are using this card at anything less than 1440p/144Hz, you are doing it wrong.


----------



## guttheslayer

Quote:


> Originally Posted by *Syan48306*
> 
> Double 8 pin is absolutely insane for a Pascal part considering the 1080 manages with a single 8 pin. 150w + 75w = 225w.
> 
> Double 8 pin would be 375w on a 16nm die. The GTX 780 was a 6 & 8 pin. Hell, even a GTX 480 was a 6 & 8 pin. There's almost no way a stock Titan P is double 8 pin.


It's not entirely impossible, but then it wouldn't be just 3840 cores. If GP102 is all full FP32 cores, it could be a 600mm² die with 5120 cores, which could explain the 375W figure.

OR

The Titan 16GB is just a dual-GP104 card, which would explain its 375W.

The Titan 12GB is the true single-GPU Titan.


----------



## guttheslayer

Quote:


> Originally Posted by *EightDee8D*
> 
> That 375 figure is just total board power, 150+150+75 ( 8+8pin+ pci-e). actual power consumption will be 180+90 = 270w unless they go lower clocks vs gtx1080. one thing is pretty sure that both the 1060 and titan p/1080ti will have a bit lower p/w vs 1080. just like 980 vs other maxwell2 cards.
> 
> As for being cpu limited, if you are using this card anything less than 2k144, you are doing it wrong.


I thought of that too, but 8+6 pins is enough to satisfy a 270W GPU.


----------



## CasualCat

Has anything ever been unveiled at Gamescom by Nvidia? I ask because the last time there was that rumor (Titan X? or 980?) there was nothing but a cringeworthy small event with some giveaways.

edit: went back and checked post history; it was 2014, so the 980.


----------



## rcfc89

"We should expect NVIDIA to unveil its new Titan P at Gamescom, which takes place in Cologne, Germany between August 17-21, 2016."

Read more: http://www.tweaktown.com/news/52886/nvidias-next-gen-titan-50-faster-geforce-gtx-1080/index.html

Just lol at those who paid $700 for a mid-range 1080. Inb4 new-Titan drops before October at $1k.


----------



## SuprUsrStan

I would seriously be down for two Titan P's if they're going for just $1000 each. I'd offload my current 980 Ti's and pick up the two Titans. Hopefully they don't try to increase the price to $1250 or $1500


----------



## ChevChelios

Quote:


> Just lol at those who paid $700 for a mid-range 1080. Inb4 new-Titan drops before October at $1k.


just lol at those who think Titan will come as early as October and will only cost $1000


----------



## SuprUsrStan

Quote:


> Originally Posted by *ChevChelios*
> 
> Quote:
> 
> 
> 
> Just lol at those who paid $700 for a mid-range 1080. Inb4 new-Titan drops before October at $1k.
> 
> 
> 
> just lol at those who think Titan will come as early as October and will only cost $1000

Hey, a man can dream can't he?


----------



## rcfc89

"We should expect NVIDIA to unveil its new Titan P at Gamescom, which takes place in Cologne, Germany between August 17-21, 2016.

Read more: http://www.tweaktown.com/news/52886/nvidias-next-gen-titan-50-faster-geforce-gtx-1080/index.html"

Just lol at those who paid $700 for a mid-range 1080. Inb4 new-Titan drops before October at $1000.
Quote:


> Originally Posted by *Syan48306*
> 
> I would seriously be down for two Titan P's if they're going for just $1000 each. I'd offload my current 980 Ti's and pick up the two Titans. Hopefully they don't try to increase the price to $1250 or $1500


They won't. The Titan has been, and always will be, Nvidia's $1000 single GPU. People just grossly overpaid for the 1080. Nvidia's marketing at its best.


----------



## Klocek001

Quote:


> Originally Posted by *Nestala*
> 
> Haha.
> 
> Same on the AIB 480 front, I want to replace my 7950 since it shows its age right now.
> Planning on getting an AIB 480 for 1080p and, when Vega hits (with decent 4K performance at an alright price), upgrading to Vega and 4K.
> 
> I can't decide if I should wait for Sapphire VaporX 480 or if I should just go with the first AIB 480 from Sapphire (the Nitro I guess) right now.
> 
> Edit:
> Well, may be the 490 is small Vega and they'll have a Fury style card for big Vega?
> 
> Or small Vega=490 and big Vega=490X?


I think the last Vapor-X was the 290X, so sadly I think the Vapor-X line is dead and Sapphire only offers Nitro and Tri-X. With the Nitro being a confirmed dual-fan card, I'd see if they release a Tri-X and try to get that. The Tri-X can handle a lot of extra voltage, power and heat. And I mean *a lot*. Seen people running +200mV on those.


----------



## mandrake88

I'm going to laugh a lot watching the 1080 owners cry. We said it from day 1: the 1080 is going to be the worst price/performance card of the entire Nvidia lineup once the new Ti/Titan gets released.


----------



## ChevChelios

Quote:


> *We* said it from day 1


The Illuminati?

Do the Illuminati set Titan prices as well?


----------



## Klocek001

Quote:


> Originally Posted by *mandrake88*
> 
> I'm going to laugh a lot watching the 1080 owners cry. We said it from day 1, the 1080 is going to be the worst price/performance card of the entire nvidia lineup when the new Ti/Titan gets released


I don't know what it is that makes people think Nvidia will price the 1080Ti at $100 more than the 1080.
Maybe if you compare extreme cases, like a person paying $700 for a 1080 FE versus the cheapest 1080Ti you could possibly find, e.g. the 1080 at $700 and the 1080Ti at $850. Can't see what you're saying happening otherwise.


----------



## aDyerSituation

these claims just keep getting more and more funny

Of course a card this powerful will be CPU limited

...at 1080p


----------



## guttheslayer

Quote:


> Originally Posted by *rcfc89*
> 
> "We should expect NVIDIA to unveil its new Titan P at Gamescom, which takes place in Cologne, Germany between August 17-21, 2016."
> 
> Read more: http://www.tweaktown.com/news/52886/nvidias-next-gen-titan-50-faster-geforce-gtx-1080/index.html"
> 
> Just lol at those who paid $700 for a mid-range 1080. Inb4 new-Titan drops before October at $1k.


Now who says it was just a rumour?

Why is Nvidia rushing to push the Titan out? Because 1080 sales were too good? Or do they know something about AMD that we don't?

Something is not right lol.

Anyway...

http://vrworld.com/2016/07/05/nvidia-gp100-titan-faster-geforce-1080/

This explains it a lot better than anything we've seen so far.

Quote:


> "We held the "Tesla P100 for PCI Express-based Servers" board in our hands just a few weeks ago, and just a few days ago, we managed to grab our hands on a GP100/GP102-based GeForce GTX Titan. As we reported earlier, this board will only come to market after the debut of Quadro-branded products. Both Quadro and GeForce cards come with the same heatsink, albeit in different color scheme, and "QUADRO" markings vs. "TITAN" on the aluminum-machined shroud.
> 
> Display configuration is similar to GeForce GTX 1080/1070, with one small change. Our board did not have DVI connectors on it.
> 
> The boards have 8+8-pin and 8+6-pin configuration, with the power connectors being placed in the front, rather than on top (which is the case with GeForce GTX 1070/1080). Bear in mind the PCB has routings for both, but the samples we played with all had front-placed power connectors. If the company ends up using 8+6-pin, you can count on 300W TDP, while the 8+8-pin configuration would give you 375W to play with. The picture below shows position of power connectors on a Tesla PCB, and as you can see, it's an 8+6-pin configuration."


----------



## EightDee8D

Nice, hype the $1400-1500 price and release at $1000-1200. What a steal.

AMD has a chance to beat Nvidia's GPUs, even though that chance is pretty low. But they cannot beat NV's PR, *NEVER*.


----------



## rcfc89

Quote:


> Originally Posted by *Klocek001*
> 
> I don't know what it is that makes people think that nvidia will price 1080Ti @$100 more than 1080.


The 1080Ti will be priced exactly as it was last year with the 980Ti. $650-700.

You will then see the 1080 placed where it belongs at $500-550.


----------



## ChevChelios

instead of waiting for the 1080Ti (*unknown* release date (except that it will come even _later_ than the Titan), *unknown* price) I'd rather get a 1080 now (which I did) and then upgrade to Volta earlier than I would have if I'd gotten a 1080Ti

of course that doesn't apply to 980Ti owners, since they don't need a 1080 .. but for others I like that path more than waiting for the 1080Ti (and after that having to wait for the 1180Ti)

and I'm very sure the Titan P will have the worst p/p of all Pascals, _just like all Titans before it did_


----------



## Klocek001

*The 1070 is $450*, the 1080 is $700.
Yet people expect the 1080Ti will somehow be a bang-for-the-buck card, *with HBM2 onboard*...

Yeah, right.

Quote:


> Originally Posted by *rcfc89*
> 
> The 1080Ti will be priced exactly as it was last year with the 980Ti. $650-700.
> 
> You will then see the 1080 placed where it belongs at $500-550.


boy, will your face be red when I show you this comment once the official MSRP is known....


----------



## aDyerSituation

We are just getting a glimpse of the future if AMD doesn't bounce back


----------



## DETERMINOLOGY

Quote:


> Originally Posted by *huzzug*
> 
> If you're using GTX1080 @ 1080p, you're doing it wrong


Pretty much so... with a GTX 1080 you at least need to be doing 1440p, or 1080p at 144Hz.


----------



## guttheslayer

Quote:


> Originally Posted by *aDyerSituation*
> 
> We are just getting a glimpse of the future if AMD doesn't bounce back


Guys, guys, Intel was first to start the ball rolling: $1700 for the flagship CPU. Nvidia was quick to rebuke that and hint it's better to downgrade to a 6700K and put the difference toward a Titan P.

This gives a clear hint at the new Titan P pricing: it has to be about the difference between the 6700K and that $1700, or anywhere between $1300-$1700.

Jen-Hsun: if Intel can be the first to take a $999 CPU to $1759, why can't we?


----------



## Eorzean

Quote:


> Originally Posted by *EightDee8D*
> 
> Nice, hype the $1400-1500 price and release at $1000-1200. What a steal.
> 
> AMD has a chance to beat Nvidia's GPUs, even though that chance is pretty low. But they cannot beat NV's PR, *NEVER*.


I wouldn't hype their PR that much. I'd say demand combined with a lack of competition is the main driving force behind their prices. They can price these at whatever they want, because where else will people go?


----------



## Klocek001

Quote:


> Originally Posted by *guttheslayer*
> 
> Guys, guys, Intel was first to start the ball rolling: $1700 for the flagship CPU. Nvidia was quick to rebuke that and hint it's better to downgrade to a 6700K and put the difference toward a Titan P.
> 
> This gives a clear hint at the new Titan P pricing: it has to be about the difference between the 6700K and that $1700, or anywhere between $1300-$1700.
> 
> Jen-Hsun: if Intel can be the first to take a $999 CPU to $1759, why can't we?


No, it's gonna be $650 and undermine the whole GP104 line-up. Because Nvidia is a well-known Santa Claus type.


----------



## rcfc89

Quote:


> Originally Posted by *Klocek001*
> 
> *The 1070 is $450*, the 1080 is $700.
> Yet people expect the 1080Ti will somehow be a bang-for-the-buck card, *with HBM2 onboard*...
> 
> Yeah, right.
> boy, will your face be red when I show you this comment once the official MSRP is known....


I couldn't care less. I can afford it either way. I don't buy Titans. I'll wait for the Lightning 1080Ti's.


----------



## Klocek001

Quote:


> Originally Posted by *rcfc89*
> 
> I couldn't care less. I can afford it either way. I don't buy Titans. I'll wait for the Lightning 1080Ti's.


I don't see your point if you do not care about money. That 1080 owners do, and you get a kick out of it?

Frankly, both waiting for the full chip of the current gen and getting the x80 of the new gen are good GPU upgrade tactics. Volta should have native async support just when it's needed, while you'll be stuck on your 1080Ti, which is gonna get maybe 12 months of Nvidia improving it with drivers.


----------



## hollowtek

Damn, I don't even want to know the 1080Ti's price. Just thinking about it is making my wallet have a nervous breakdown.


----------



## rcfc89

Quote:


> Originally Posted by *Klocek001*
> 
> I don't see your point if you do not care about money. That 1080 owners do and you get a kick out of it ?
> 
> Frankly, both waiting for full chip of current gen and getting the x80 of the new gen are good GPU upgrade tactics. Volta should have a native async support just when it's needed while you'll be stuck on your 1080Ti, which is gonna get maybe 12 months of nvidia improving it with drivers.


I've spoken on this before. If you have a fresh build and could not wait, the 1070/1080 are great options, minus the price. Having money to burn doesn't mean I spend it unwisely. Nvidia has worked this scheme many times over; I just don't understand why people keep falling for it. These price hikes have gotten out of hand. We are currently paying $700 for a midrange, cut-down GPU. That's just nuts. If you have an itch to upgrade and can't wait, what choice do you have, I guess. AMD has been and always will be garbage, falling further and further behind. The Titan will be a monster of a card but still terrible value at $1k. The 1080Ti will once again be the smart choice when it drops. But again, if you can't wait, then you're stuck overpaying.


----------



## Neo_Morpheus

another recap?

GTX1070 - GP104 + GDDR5
GTX1080 - GP104 + GDDR5X
GTX1080ti - GPBigDie + HBM2 8GB $999
Titan P - GPBigDie + HBM2 16GB $1500


----------



## outofmyheadyo

What people fail to realise is that many don't care about the $300 price difference between this and that. Oh, the tragedy: I have to work one extra day to buy a Titan; it's just a few hundred bucks.


----------



## Tideman

With this news, I'm quite close to cancelling my preorder for 2 1080s and waiting it out for the Titan. I will be running 4k.

Is this likely to launch the moment it's unveiled? What has happened in the past?


----------



## irced

Quote:


> Originally Posted by *Tideman*
> 
> With this news, I'm quite close to cancelling my preorder for 2 1080s and waiting it out for the Titan. I will be running 4k.
> 
> Is this likely to launch the moment it's unveiled? What has happened in the past?


If you're looking to upgrade "soon", you may as well just commit to the 1080 (single, since SLI scaling is broken/garbage in pretty much everything). I don't expect any of these GP100s to materialize until after nvidia finishes their Tesla delivery to ORNL, which won't be until the middle of next year.

Given nvidia's recent history of bilking customers, the first generation of Titans will probably be immediately available after they "launch", but probably crippled P100s that failed ORNL's requirements for TDP or have defective compute clusters.

I wouldn't expect the full, unlocked, monster chip until winter 2017 at the earliest--that'll be the Titan Black to the Titan.


----------



## prjindigo

Quote:


> Originally Posted by *ChevChelios*
> 
> "reports" also said 480 is at stock 980 level and is a good overclocker
> 
> hype train now arriving at caution station


No, people said the 490 was a stock 980 and a good overclocker... then AMD released the 480.

Get your farces straight!


----------



## guttheslayer

Quote:


> Originally Posted by *irced*
> 
> If you're looking to upgrade "soon", you may as well just commit to the 1080 (single, since SLI scaling is broken/garbage in pretty much everything). I don't expect any of these GP100s to materialize until after nvidia finishes their Tesla delivery to ORNL, which won't be until the middle of next year.
> 
> Given nvidia's recent history of bilking customers, the first generation of Titans will probably be immediately available after they "launch", but probably crippled P100s that failed ORNL's requirements for TDP or have defective compute clusters.
> 
> I wouldn't expect the full, unlocked, monster chip until winter 2017 at the earliest--that'll be the Titan Black to the Titan.


If the cards with defective compute clusters come way earlier than the ORNL date, people will still grab them like hot cakes.

The fact remains it's faster than the 1080, and that's all that matters. After all, everyone is really hoping for that true 4K@60 Hz single GPU.


----------



## NikolayNeykov

Quote:


> Originally Posted by *guttheslayer*
> 
> If the cards with defective compute clusters come first, in fact way earlier than ORNL, I will still grab one.
> 
> The fact remains it's faster than the 1080, and that's all that matters. After all, everyone is really eyeing that true 4K@60 Hz single GPU.


I am eyeing pure 120/144Hz @ 4K and a monitor to match; no need for anything else, they just need to work on it a bit more.


----------



## guttheslayer

Quote:


> Originally Posted by *NikolayNeykov*
> 
> I am eyeing pure 120/144Hz @ 4K and a monitor to match; no need for anything else, they just need to work on it a bit more.


You can wait it out; 120Hz 4K monitors are still a year away, and GPUs will take another two years after the monitors release to catch up.

SLI Titan P would save you the trouble of waiting three years, though.


----------



## Cyro999

Quote:


> Double 8 pin is absolutely insane for a Pascal part considering the 1080 manages with a single 8 pin. 150w + 75w = 225w.


The reference 1080 throttles when overclocked due to the single 8-pin, even when maxing out the PCIe power specification.


----------



## wolfej

Quote:


> Originally Posted by *Neo_Morpheus*
> 
> another recap?
> 
> GTX1070 - GP104 + GDDR5
> GTX1080 - GP104 + GDDR5X
> GTX1080ti - GPBigDie + HBM2 8GB $999
> Titan P - GPBigDie + HBM2 16GB $1500


I just seriously doubt they'll sell the 1080ti for a grand.

I'm expecting ~980ti prices with it being anywhere from 700-800 bucks.


----------



## magnek

Quote:


> Originally Posted by *BranField*
> 
> even with it being 30cm = 11.81in, it would still be bigger than a 980Ti @ 26.7cm = 10.51in. It just doesn't seem to add up to me. Also, putting the PCIe plugs on the end would increase the length, so one would assume they moved them to the end because space allowed it; yet it is looking to be longer than the 980Ti?


Length definitely doesn't make sense if it indeed uses HBM, since HBM saves a ton of PCB space. So either it's GDDR5X, or they got the PCB length wrong, or this rumor was pulled from thin air.

Quote:


> Originally Posted by *Cakewalk_S*
> 
> Ok so recap...
> 
> GTX1070 - GP104 + GDDR5
> GTX1080 - GP104 + GDDR5X
> GTX1080ti - GPBigDie + GDDR5X
> Titan P - GPBigDie + HBM2
> 
> I think that's right...


HBM and GDDR5 use very different memory controllers. It would be costly and wasteful to design two chips with different memory controllers. Either it's all GDDR5X or all HBM2, unless the Titan and 1080 Ti are different chips altogether.

Quote:


> Originally Posted by *rcfc89*
> 
> "We should expect NVIDIA to unveil its new Titan P at Gamescom, which takes place in Cologne, Germany between August 17-21, 2016."
> 
> Read more: http://www.tweaktown.com/news/52886/nvidias-next-gen-titan-50-faster-geforce-gtx-1080/index.html"
> 
> Just lol at those who paid $700 for a mid-range 1080. Inb4 new-Titan drops before October at $1k.


lol @ you if you actually believe Pascal Titan will be available for sale in August or even September


----------



## Mhill2029

My wallet is ready for 4 of these bad boys.....sadly i'll have to ditch this mobo though if I want to continue using a 950 Pro.


----------



## GamerusMaximus

Quote:


> Originally Posted by *Mhill2029*
> 
> My wallet is ready for 4 of these bad boys.....sadly i'll have to ditch this mobo though if I want to continue using an 950 Pro.


Why 4 when nvidia no longer supports anything over 2 way SLI?


----------



## ZealotKi11er

If it's 50% faster it will not be $1000.


----------



## Mhill2029

Quote:


> Originally Posted by *GamerusMaximus*
> 
> Why 4 when nvidia no longer supports anything over 2 way SLI?


That's for the existing midrange cards of the Pascal family; I'm still of the belief that Nvidia is going back to the days of old.

*Gamers*
GTX 1080 = 2-Way only

*Extreme Gamers*
GTX 1080Ti = Upto 3-Way SLI

*Extreme Enthusiasts*
GTX Titan P = Upto 4-Way SLI

I'm only guessing here, but that's what I feel could be happening. They locked the GTX 1080 for good reason: it's a midrange die and has no place supporting such configurations at its price point. It's a card aimed at the masses, not the enthusiast market. Just like the GTX 1060 has no SLI support at all.

And by giving this kind of exclusivity to the Titan P, not only will it be more desirable to the enthusiast, but they'll have more reason to charge a small fortune for it. And they'll likely sell more, since it'll be the only GPU that will support such a configuration.


----------



## Kpjoslee

Why would Nvidia bother releasing a Titan when the 4096-core Vega is only likely to be competing against the 1080? And Vega has a slim chance of even making it this year.
I don't expect a Titan until Feb/March of next year.


----------



## FattysGoneWild

Quote:


> Originally Posted by *DETERMINOLOGY*
> 
> pretty much so....With a gtx 1080 atleast need to be doing 1440p or 1080p at 144hz


The 1080 won't push 1440p 60+ fps at max settings when the newer, more demanding games hit. It is a cross between a 1080p and a 1440p card. Now, the 1080 Ti I would expect to do 1440p 60+ fps at max settings always. We are not even into the newer games yet. The 1080 cannot push The Witcher 3 all maxed out at 60+ fps 1440p. That is a FACT.


----------



## GamerusMaximus

Quote:


> Originally Posted by *Kpjoslee*
> 
> Why would Nvidia bother releasing Titan when 4096 core Vega is only likely be competing against 1080? And that Vega has slim chance of even making it this year.
> I don't expect Titan until like Feb/March of next year.


Because the margins on such a product with no competition would be sky high vs the 1080. And people will buy them. They have to do something with the defective tesla and quadro parts.


----------



## vmatt1203

I predict: For the low, low price of $1,999 and the feeling guilt for putting that much on your credit card.

If true these performance gains are getting insane. When will it start to plateau?


----------



## DETERMINOLOGY

Quote:


> Originally Posted by *FattysGoneWild*
> 
> 1080 wont push 1440p 60+fps max settings when the newer and more demanding games hit. It is a cross between 1080p/1440p card. Now the 1080Ti I would expect full 1440p 60fps+ max settings always.


It's a sick card if you know how to cut settings down... Playing at 4K, you think I crank up all the settings? No lol


----------



## FattysGoneWild

Quote:


> Originally Posted by *DETERMINOLOGY*
> 
> Its a sick card if you know how to cut settings down...As me playing at 4k you think i crank up all the settings? No lol


Who buys a $700 card to cut back on settings? You're doing it wrong.


----------



## theringisMINE

Quote:


> Gamers
> GTX 1080 = 2-Way only
> 
> Extreme Gamers
> GTX 1080Ti = Upto 3-Way SLI
> 
> Extreme Enthusiasts
> GTX Titan P = Upto 4-Way SLI


This makes me giggle. Would all gamers please line up neatly in order of your epeen length?


----------



## Kpjoslee

Quote:


> Originally Posted by *GamerusMaximus*
> 
> Because the margins on such a product with no competition would be sky high vs the 1080. And people will buy them. They have to do something with the defective tesla and quadro parts.


And they would go to cut down Quadro versions for even better margin.

And historically, all Titans were released in Feb/March. Given there is no competition from AMD at the high end any time soon, I don't expect that to change.


----------



## rcfc89

People on here complaining about big Pascal taking 8+8 or 8+6 when baby Pascal only takes a single 8-pin. My Lightnings take 2x8's and a 6pin lol.


----------



## DETERMINOLOGY

Quote:


> Originally Posted by *FattysGoneWild*
> 
> Who buys a $700 dollar card to cut back on settings? Your doing it wrong.


Playing at max is icing on the cake, but it's not needed or a requirement, so it's eh... As long as my game runs nice and smooth while looking good, that's all that matters...


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mhill2029*
> 
> That's for the existing midrange cards of the Pascal family, I'm still under the belief that Nvidia are going back to days of old.
> 
> *Gamers*
> GTX 1080 = 2-Way only
> 
> *Extreme Gamers*
> GTX 1080Ti = Upto 3-Way SLI
> 
> *Extreme Enthusiasts*
> GTX Titan P = Upto 4-Way SLI
> 
> I'm only guessing here, but that's what I feel could be happening. They locked the GTX 1080 for good reason, it's a midrange die and has no place supporting such configurations within it's price point. It's a card aimed at the masses, not the enthusiast market. Just like the GTX 1060 has no SLI support at all.
> 
> And by giving this kind of exclusivity to the Titan P, not only will it be more desirable to the enthusiast, but they'll have more reason to charge a small fortune for it. And they'll likely sell more, since it'll be the only GPU that will support such a configuration.


Lol. 2 x 1080 is for Extreme Extreme + Infinity Enthusiast. Gamers buy RX 480. Extreme Gamers maybe 1070.


----------



## criminal

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Lol. 2 x 1080 is for Extreme Extreme + Infinity Enthusiast. Gamers buy RX 480. Extreme Gamers maybe 1070.


Exactly.

IF Titan P is $1400+, I would label anyone who would buy four an extreme idiot since most games don't scale for crap past two... lol


----------



## aDyerSituation

Quote:


> Originally Posted by *criminal*
> 
> Exactly.
> 
> IF Titan P is $1400+, I would label anyone who would buy *two* an extreme idiot


FTFY

sorry had to


----------



## Malinkadink

Quote:


> Originally Posted by *aDyerSituation*
> 
> FTFY
> 
> sorry had to


Yes of course, Nvidia isn't supporting past 2 cards anymore, and anyone that buys two Titan Ps isn't an idiot so much as they just have a lot of income to spend on their hobby. You don't need two enthusiast GPUs unless you're trying to max games out at 1440p 144hz or 4k 60hz while maintaining the respective framerates.

Someone on a 1080p 144hz or 1440p 60hz display with a single 1070 for example isn't really missing much eye candy vs the guy who blew over $3k on GPUs. There are some extreme diminishing returns that kick in the further you climb up the resolution ladder.


----------



## c0nsistent

Quote:


> Originally Posted by *ChevChelios*
> 
> its ok, its ok, I still love my shiny new 1080, I love it, love it ....
> 
> Ill just get a 1180 Volta later for cheaper, same performance as 1080Ti
> 
> also RIP Vega


Be prepared to spend 20% more on GPUs every year then. As long as AMD isn't level with Nvidia performance-wise, the prices will keep rising. Once AMD is gone for good, people like yourself won't have a competitive brand to talk trash about, so I'd be hoping Vega does well if I were you.


----------



## rcfc89

Quote:


> Originally Posted by *Malinkadink*
> 
> Yes of course, Nvidia isn't supporting past 2 cards anymore, and anyone that buys two Titan Ps isn't an idiot so much as they just have a lot of income to spend on their hobby. You don't need two enthusiast GPUs unless you're trying to max games out at 1440p 144hz or 4k 60hz while maintaining the respective framerates.
> 
> Someone on a 1080p 144hz or *1440p 60hz display with a single 1070 for example isn't really missing much eye candy* vs the guy who blew over $3k on GPUs. There are some extreme diminishing returns that kick in the further you climb up the resolution ladder.


I disagree. I game 1440p 60hz and there are several games that take every bit of my two 980Ti's to max them out and maintain 60fps at all times. And the graphical differences of AA off vs. 8x in games like GtaV/FC4 etc. is very noticeable and worth the extra gpu power needed.


----------



## magnek

Quote:


> Originally Posted by *criminal*
> 
> Exactly.
> 
> IF Titan P is $1400+, I would label anyone who would buy four an extreme idiot since most games don't scale for crap past two... lol


They're just overcompensating.


----------



## Mhill2029

Quote:


> Originally Posted by *magnek*
> 
> They're just overcompensating.


Overcompensating for what? It's simply a matter of what an individual's disposable income is.


----------



## Nestala

Quote:


> Originally Posted by *Mhill2029*
> 
> Quote:
> 
> 
> 
> Originally Posted by *GamerusMaximus*
> 
> Why 4 when nvidia no longer supports anything over 2 way SLI?
> 
> 
> 
> That's for the existing midrange cards of the Pascal family, I'm still under the belief that Nvidia are going back to days of old.
> 
> *Gamers*
> GTX 1080 = 2-Way only
> 
> *Extreme Gamers*
> GTX 1080Ti = Upto 3-Way SLI
> 
> *Extreme Enthusiasts*
> GTX Titan P = Upto 4-Way SLI
> 
> I'm only guessing here, but that's what I feel could be happening. They locked the GTX 1080 for good reason, it's a midrange die and has no place supporting such configurations within it's price point. It's a card aimed at the masses, not the enthusiast market. Just like the GTX 1060 has no SLI support at all.
> 
> And by giving this kind of exclusivity to the Titan P, not only will it be more desirable to the enthusiast, but they'll have more reason to charge a small fortune for it. And they'll likely sell more, since it'll be the only GPU that will support such a configuration.
Click to expand...

Did you seriously just say a $680 GPU is a mainstream card aimed at the masses? lol
The RX 480 and its price range are aimed at the mainstream market.


----------



## criminal

Quote:


> Originally Posted by *Nestala*
> 
> Did you seriously just say a 680$ GPU is a mainstream card aimed at the masses? lol
> RX 480 and it's price range is aimed at the mainstream market.


I know, right? I guess gamers without dual 1080s are just plebs... lol


----------



## Mhill2029

Quote:


> Originally Posted by *Nestala*
> 
> Did you seriously just say a 680$ GPU is a mainstream card aimed at the masses? lol
> RX 480 and it's price range is aimed at the mainstream market.


Yes I did. Just like the 970/980 before it....

Look at the sales of the GTX 1080, that should tell you enough.


----------



## keikei

Quote:


> The new Pascal-based Titan X successor will reportedly be *12 inches long*


Gonna need a new case.


----------



## Nestala

Quote:


> Originally Posted by *Mhill2029*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nestala*
> 
> Did you seriously just say a 680$ GPU is a mainstream card aimed at the masses? lol
> RX 480 and it's price range is aimed at the mainstream market.
> 
> 
> 
> Yes I did. Just like the 970/980 before it....
> 
> Look at the sales of the GTX 1080, that should tell you enough.
Click to expand...

Well, it's a midrange die, so it's technically a card aimed at the masses; it just completely misses its target with its price. People are buying it because they are stupid.
Back then, the Titan sold like hotcakes too. Your point?


----------



## jaredismee

Quote:


> Originally Posted by *Nestala*
> 
> Well it's a midrange die, it's technically a card aimed at the masses, it just completely misses it's target with it's price. People are buying it because they are stupid.
> Back then, the titan sold like hotcakes too. Your point?


Considering the competition, they can basically price it wherever they want and still sell it, especially if supplies are short.

I can't really justify buying a card over $400; I have a 2GB R9 380 I'm planning to upgrade, and I fall a little above the average American in terms of disposable income.

I really do like seeing the new cards and their benchmarks though.


----------



## magnek

Quote:


> Originally Posted by *Mhill2029*
> 
> Over compensating for what? It's simply a matter of what an individuals disposable income is.


Dude it's just a joke.


----------



## criminal

Quote:


> Originally Posted by *magnek*
> 
> Dude it's just a joke.


I think you hit a nerve... the small weewee nerve... I kid I kid.


----------



## PostalTwinkie

Quote:


> Originally Posted by *keikei*
> 
> Gonna need a new case.


Most enthusiasts who are even looking at a Titan-class card are going to have a case it will fit in. There have been, and are, cards longer than 12 inches. Although I do chuckle at the idea that someone with more money than knowledge on the topic is going to buy one and have it not fit!

I only chuckle because we have all been there!


----------



## zealord

Quote:


> Originally Posted by *criminal*
> 
> I think you hit a nerve... the small weewee nerve... I kid I kid.


----------



## thebski

Titan P at $1400 would be terrible. That's roughly 980 Ti SLI performance which was available for $1300 a year ago on what is now old technology.


----------



## BillOhio

Again... are there even any games out there that are worth $1,500+ (the card, the game, taxes)?


----------



## undeadhunter

Quote:


> Originally Posted by *BillOhio*
> 
> Again... are there even any games out there that are worth $1,500+ (the card, the game, taxes).


No, but people here on OCN will tell you otherwise to justify having more money than sense.


----------



## HeadlessKnight

Quote:


> Originally Posted by *Mhill2029*
> 
> *Gamers*
> GTX 1080 = 2-Way only
> 
> *Extreme Gamers*
> GTX 1080Ti = Upto 3-Way SLI
> 
> *Extreme Enthusiasts*
> GTX Titan P = Upto 4-Way SLI
> .


So anyone with a card below a GTX 1080 is not a gamer? x70/x60 cards, while not made for elitists, generally run games pretty well and are targeted at gamers.


----------



## Mhill2029

Quote:


> Originally Posted by *HeadlessKnight*
> 
> So anyone with a card below GTX 1080 is not a gamer? GTX X70/ X60 cards while are not made for elitists but those levels of cards generally run games pretty well and are targeted for gamers. The only X60 card that sucked is the GTX 960.


Sure, cards below that are fine, and the majority only use a single card. But what I was talking about is how I see SLI being handled on current and future Pascal cards. I.e., gamers buying midrange cards are limited to 2-way SLI max, and the Ti and Titans will likely have exclusivity on 3-way and 4-way SLI. I'm pretty certain that's what I see happening.


----------



## jtom320

Looking forward to the 1080 Ti more than the Titan at this point.

Having personally felt the sting of buying a founders 1080 you just know that the next Titan's pricing is going to be through the roof.

I realize I'm part of the problem buying an FE but I don't like this new trend from Intel and Nvidia of having 1500+ dollar enthusiast flagships. I realize that 1500 dollars is speculation on the Nvidia front but I think that's a pretty reasonable guess at this point.

Of course, on the other hand, 4K is pretty nice. I mean the 1080 is giving me 'perceptual' 60 in Rise of the Tomb Raider and GTA 5, the two games I'm playing right now, at basically ultra settings, just dropping MSAA/SSAA. The new Titan and Ti are going to be the first real, honest-to-god 60 fps 4K cards. It's going to be a pretty big step performance-wise getting over that hump.


----------



## ChevChelios

so we need a $1300-1500 Titan P monster to get 4K@60fps

meanwhile the sheer geniuses over at MS have figured out how to do 4K@60fps with a 6TFlops Scorpio


----------



## EightDee8D

Quote:


> Originally Posted by *ChevChelios*
> 
> so we need a $1300-1500 Titan P monster to get 4K@60fps
> 
> meanwhile the sheer geniuses over at MS have figured out how to do 4K@60fps with a 6TFlops Scorpio


The power of single config and optimization.

Plus a bit lower iq, which you won't notice unless you compare side by side.


----------



## Somasonic

Quote:


> Originally Posted by *EightDee8D*
> 
> The power of single config and optimization.
> 
> Plus a bit *lower iq* which you wont notice unless compare it side by side.


Is that image quality or intelligence quotient?


----------



## magnek

Quote:


> Originally Posted by *Somasonic*
> 
> Is that image quality or intelligence quotient


A little from column A, a lot from column B.


----------



## EightDee8D

Quote:


> Originally Posted by *magnek*
> 
> A little from column A, a lot from column B.


----------



## zealord

Quote:


> Originally Posted by *ChevChelios*
> 
> so we need a $1300-1500 Titan P monster to get 4K@60fps
> 
> meanwhile the sheer geniuses over at MS have figured out how to do 4K@60fps with a 6TFlops Scorpio


Do you have a source where Microsoft claimed the Scorpio promises to deliver 60fps at 4K?


----------



## Slomo4shO

Quote:


> We should expect NVIDIA to unveil its new Titan P at Gamescom, which takes place in Cologne, Germany between *August 17-21, 2016*.


Only way I can see this product being paper launched around August is if AMD has Vega lined up for a late Q3 or early Q4 release. Nvidia has no reason to prop another product on the market until it has competition for its current offerings.


----------



## Lee Patekar

Quote:


> Originally Posted by *Slomo4shO*
> 
> Only way I can see this product being paper launched around August is if AMD has Vega lined up for a late Q3 or early Q4 release. Nvidia has no reason to prop another product on the market until it has competition for its current offerings.


There were rumors of Vega in October... but in all honesty AMD didn't have a hard counter for the GTX 680 when Nvidia launched big Kepler as a Titan. So I don't see why they'd wait on AMD now for this Titan. (The brand practically prints money.)


----------



## Curleyyy

I was looking at spending about $600 and aiming for an EVGA 1070, but there are so many GPUs coming out that I'm confused.

What should I expect to buy in the next 3-6 months?


----------



## aDyerSituation

Quote:


> Originally Posted by *Curleyyy*
> 
> I was looking at spending about $600 ~ and aiming for a EVGA 1070 but there's so many GPU's coming out I'm so confused.
> 
> What should I be expecting to buy around the next 3 - 6 months?


1080's will be priced where they should be within that time frame I'd imagine.


----------



## ChevChelios

Quote:


> Originally Posted by *zealord*
> 
> do you have a source where microsoft claimed the scorpio does promise to deliver 60fps at 4K?


at the Scorpio reveal at E3 they were talking about 4K native and 60 fps was mentioned


----------



## Celcius

Now I'm glad I skipped the GTX 1080. This is the card that I've been waiting for!


----------



## ChevChelios

Quote:


> Originally Posted by *Celcius*
> 
> Now I'm glad I skipped the GTX 1080. This is the card that I've been waiting for!


if Titan P costs $1300-1400 then you can buy 2x 1080 for that price


----------



## zealord

Quote:


> Originally Posted by *ChevChelios*
> 
> at the Scorpio reveal at E3 they were talking about 4K native and 60 fps was mentioned


Ah, I see your confusion. They talked about Hz. People need to remember that fps is not the same as Hz.

They also never said "4K @ 60Hz/fps". Those terms came up separately.


----------



## rcfc89

Quote:


> Originally Posted by *ChevChelios*
> 
> at the Scorpio reveal at E3 they were talking about 4K native and 60 fps was mentioned


The only game that a system that weak will be running at 60fps in 4k is "Frogger."

It will take a machine that's way more than 4x as powerful as the Xbone to run popular titles at 4K 60fps, considering the current Xbone struggles to achieve 720p 60fps. 4K is 9x that resolution.
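
The resolution arithmetic above is easy to verify: counting active pixels, 4K UHD is exactly 9x 720p and 4x 1080p. A quick sketch:

```python
# Pixel-count arithmetic behind the resolution comparisons in this thread.
def pixels(w, h):
    """Active pixels in a w x h frame."""
    return w * h

UHD = pixels(3840, 2160)          # 8,294,400 pixels in a 4K UHD frame
print(UHD // pixels(1280, 720))   # 9 -- 4K renders 9x the pixels of 720p
print(UHD // pixels(1920, 1080))  # 4 -- and 4x the pixels of 1080p
```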


----------



## ChevChelios

Quote:


> Originally Posted by *zealord*
> 
> ah. I see your confusion. They talked about hz. people need to remember that fps is not the same as hz


What did they mean then? How does Hz relate to this?

Quote:


> They also never said "4K @ 60hz/fps"


What else would it apply to other than 4K? Since 4K is all they talked about...

I don't believe for a second Scorpio will do 4K@60fps stable in demanding games, but that seems to be what they want to achieve/want us to think...


----------



## keikei

Quote:


> Originally Posted by *zealord*
> 
> do you have a source where microsoft claimed the scorpio does promise to deliver 60fps at 4K?


Quote:


> Originally Posted by *ChevChelios*
> 
> at the Scorpio reveal at E3 they were talking about 4K native and 60 fps was mentioned


While it'd be nice, 4K@60 is never mentioned. Sony claimed a similar bench (1080p/60) and they were grilled for it. I'm sure MS is striving for 60fps, but they've never claimed it (a smart move imo). Scorpio will be the poor man's 4K gaming, while the PCMR gets the holy Titan P.





----------



## ChevChelios

but since current consoles do 1080p@60fps then it's likely prudent to expect 4K@60fps from Scorpio


----------



## Lee Patekar

Quote:


> Originally Posted by *keikei*
> 
> While it'd be nice, 4K @60 is never mentioned. Sony said a similar bench (1080p/60) and they we're grilled for it. I'm sure MS is striving for 60fps, but they've never claimed it (a smart move imo). Scorpio will be the poor mans 4k gaming, while the PCMR gets the holy Titan P.


Where did they say anything about gaming in 4K? They'll support 4K video @ 60 Hz.. so movies. At most they'll upscale so you can use your 4K TV (since they'll have the proper version of HDMI to support it).

edit: ah, never mind.. they actually claim 4K gaming.. utter bull.


----------



## ChevChelios

the _slim X1_ supports 4K video and UHD blu-ray playback

Scorpio is all about games in native 4K


----------



## zealord

Quote:


> Originally Posted by *ChevChelios*
> 
> but since current consoles do 1080p@60fps then its likely prudent to expect 4K@60fps from Scorpio


Hz is the refresh rate of your display: how often it refreshes per second.

fps is the number of frames your GPU renders per second.

For example, I can have a 120Hz monitor and still only play at 30fps.
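
That distinction can be sketched in a few lines; the 120Hz/30fps numbers below are just the example from the post:

```python
# Refresh rate (Hz) is a property of the display; frame rate (fps) is how
# fast the GPU renders. On a 120 Hz monitor fed by a 30 fps game, each
# rendered frame simply stays on screen for 4 consecutive refreshes.
REFRESH_HZ = 120
RENDER_FPS = 30

# Index of the rendered frame visible at each of the first 8 refreshes:
shown = [refresh * RENDER_FPS // REFRESH_HZ for refresh in range(8)]
print(shown)  # [0, 0, 0, 0, 1, 1, 1, 1] -- each frame repeated 4 times
```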


----------



## ryder

Noob question...

Why would an ultra-powerful GPU get bottlenecked by CPUs at lower resolutions (1080p, etc.)?

Like, a) why does the GPU get bottlenecked by a CPU in the first place, and b) why does that effect get stronger at lower resolutions?

I ask because I plan to buy a Pascal Titan and game at 1920x1200.


----------



## Lee Patekar

Because of the CPU overhead needed to prepare each frame for the GPU... and DX11 (and earlier) is mostly single-threaded, or reliant on single-threaded CPU performance. The overhead stays the same as the resolution drops, but the GPU can handle the smaller frames faster. Eventually the GPU is starved because the CPU can't prepare frames quickly enough (you see one core at 100% in your game). More or less the quick and dirty answer.

DX12 / Vulkan should change that.
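
That overhead argument fits a toy model: a roughly fixed CPU time per frame, GPU time proportional to pixel count, and the slower of the two setting the frame rate. The two cost constants below are made-up numbers purely for illustration:

```python
# Toy frame-time model: per-frame CPU cost (game logic, draw-call
# submission) is roughly fixed, while GPU cost scales with pixel count.
# Whichever side is slower caps the frame rate.
CPU_MS = 8.0            # hypothetical CPU time per frame, any resolution
GPU_MS_PER_MPIX = 3.0   # hypothetical GPU time per megapixel rendered

def fps(width, height):
    mpix = width * height / 1e6
    frame_ms = max(CPU_MS, GPU_MS_PER_MPIX * mpix)  # slower side wins
    return 1000.0 / frame_ms

print(round(fps(1920, 1080)))  # 125 -- CPU-bound: a faster GPU changes nothing
print(round(fps(3840, 2160)))  # 40  -- GPU-bound: resolution is the limit
```

At 1080p the GPU finishes its ~2 megapixels before the CPU can feed it the next frame, so the CPU's 8 ms caps the result; at 4K the GPU becomes the limit again.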


----------



## Pragmatist

Quote:


> Originally Posted by *ryder*
> 
> noob question..
> 
> why would an ultra powerful gpu bottleneck cpus at lower resolutions (1080p, etc)?
> 
> like a) why does the gpu get bottlenecked by a cpu in the first place and b) why does that effect get stronger with lower resolutions?
> 
> i ask cause plan to buy a pascal titan and game at 1920x1200p.


There is literally no reason for you to buy the Titan if you're going to just play games at that resolution. Hell, even a 1060 would be more than enough.

Edit: Typos, phone auto correction.


----------



## zealord

Quote:


> Originally Posted by *ryder*
> 
> noob question..
> 
> why would an ultra powerful gpu bottleneck cpus at lower resolutions (1080p, etc)?
> 
> like a) why does the gpu get bottlenecked by a cpu in the first place and b) why does that effect get stronger with lower resolutions?
> 
> i ask cause plan to buy a pascal titan and game at 1920x1200p.


The CPU has to handle stuff like physics. For example, download 3DMark and watch the benchmark; there you can see tests where only the CPU drives the scene, like the falling towers. CPUs also handle other work that isn't (really) related to resolution.

When resolution goes up, the GPU is stressed more, but the CPU load stays basically the same. That is why many CPU bottleneck tests run at a very low resolution, 720p or lower.

When you game at those resolutions with a super powerful GPU, there is a chance the GPU can render more frames than the CPU can prepare -> CPU bottleneck.

At higher resolutions, e.g. 4K, the GPU is stressed much more (circa 3-4 times as much, very roughly) and the chance of a CPU bottleneck decreases.

Also, CPUs have PCIe lanes; for example, a quad-SLI 980 Ti setup wouldn't fare well with an i3 CPU. You'd need an extreme CPU to fully make use of SLI then.

It's a little bit more complicated, but that's basically it.


----------



## ChevChelios

Quote:


> hz is the refresh rate of your display. How often it refreshes during a second.
> 
> fps is the number of frames your GPU does render per second.
> 
> For example. I can have a 120hz monitor and still only play at 30fps.


I know what fps/Hz are

I asked why they mentioned Hz when talking about 4K res and the specs/power of their new machine

seemed to me like he was using Hz/fps interchangeably and implying/hoping for 4K@60fps

but we'll see


----------



## Lee Patekar

Quote:


> Originally Posted by *ChevChelios*
> 
> I know what fps/Hz are
> 
> I asked why they mentioned Hz when talking about 4K res and the specs/power of their new machine
> 
> seemed to me like he was using Hz/fps interchangeably and implying/hoping for 4K@60fps
> 
> but we'll see


Probably because the current consoles don't have an HDMI port that can sustain 4K@60Hz. If I recall, there was an issue where DVI limited 4K to 30Hz on PC too... Not sure exactly; I'm still rocking 1080p so I haven't felt it personally, but I read about it a while back.
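
The 30Hz ceiling comes down to link bandwidth. A rough back-of-envelope check, counting active pixels only (real links also carry blanking, so actual needs are somewhat higher); the usable-rate figures are the commonly cited limits for HDMI 1.4 and 2.0:

```python
# Approximate video bandwidth: active pixels x refresh x 24-bit colour.
def gbits_per_s(w, h, hz, bpp=24):
    return w * h * hz * bpp / 1e9

HDMI_1_4_DATA = 8.16   # usable Gbit/s (10.2 Gbit/s TMDS with 8b/10b coding)
HDMI_2_0_DATA = 14.4   # usable Gbit/s (18 Gbit/s TMDS)

for hz in (30, 60):
    need = gbits_per_s(3840, 2160, hz)
    print(f"4K@{hz}Hz needs ~{need:.1f} Gbit/s | "
          f"HDMI 1.4: {'ok' if need <= HDMI_1_4_DATA else 'no'} | "
          f"HDMI 2.0: {'ok' if need <= HDMI_2_0_DATA else 'no'}")
```

4K@30 squeezes under the older limit; 4K@60 needs roughly 12 Gbit/s for the pixels alone, which is why it takes HDMI 2.0 (or DisplayPort 1.2).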


----------



## zealord

Quote:


> Originally Posted by *ChevChelios*
> 
> I know what fps/Hz are
> 
> I asked why they mentioned Hz when talking about 4K res and the specs/power of their new machine
> 
> seemed to me like he was using Hz/fps interchangeably and implying/hoping for 4K@60fps
> 
> but we'll see


good to know you know what it is. I wasn't sure you knew

Companies love buzz words. Most people don't know the difference between hz and fps and so it is easy to use those words to not say anything "legally" wrong, but also make people believe your product is better than it actually is.

For example, when your local tech store advertises "OUR NEW 3RD GENERATION INTEL PC WITH 250 GB MEMORY", it sounds good to uninformed people, but we know it's probably a tiny $50 i3 and a 250GB 5400rpm HDD.

Quote:


> Originally Posted by *Lee Patekar*
> 
> Probably because the current consoles don't have an HDMI port that can sustain 4K@60Hz. If I recall there was an issue with DVI where 4K was limited to 30Hz on PC too.. Not sure exactly, still rocking the 1080p so didn't feel it personally but I read about it a while back.


Oh yeah, good point. That as well!


----------



## magnek

Quote:


> Originally Posted by *Slomo4shO*
> 
> Only way I can see this product being paper launched around August is if AMD has Vega lined up for a late Q3 or early Q4 release. Nvidia has no reason to prop another product on the market until it has competition for its current offerings.


The only other scenario that makes sense is if nVidia is only releasing a Titan this year with the Ti variant at least 6 months away, and the Titan costs *at least* $1200+ to avoid cannibalizing 1080 sales.


----------



## jtom320

I'd imagine that you will see 4k 60FPS titles on Scorpio. Probably about the same ratio of 30 to 60 fps titles that you see on the xbone / ps4 now.

It's going to produce some excellent looking games. But just like 60 fps games on current consoles, there'll be some sacrifices made in all the usual categories (textures, LOD, etc.).

Just considering the first party talent difference however I'd imagine the best looking titles end up on the Neo. Not counting PC and general third party games of course.


----------



## ToTheSun!

Quote:


> Originally Posted by *jtom320*
> 
> I'd imagine that you will see 4k 60FPS titles on Scorpio.


Considering not all games run at 1080p 60 FPS on the XBOne, I'll have to classify 4x the amount of pixels rendered on a revision as wishful thinking.


----------



## zealord

Quote:


> Originally Posted by *magnek*
> 
> The only other scenario that makes sense is if nVidia is only releasing a Titan this year with the Ti variant at least 6 months away, and the Titan costs *at least* $1200+ to avoid cannibalizing 1080 sales.


interesting point. It could be a "longevity" Titan

It would definitely sit well with the Titan buyers who love a little reassuring pat on the back for lots of money well spent.

When yields get better, bring out the GTX 1080 Ti. Nvidia may not need a GTX 1080 Ti for the next 9 months to one-up AMD. The GTX 1080 may be enough, and the Titan can be the ultra-deluxe enthusiast card for quite a while. Definitely food for thought there, magnek!


----------



## jtom320

Quote:


> Originally Posted by *ToTheSun!*
> 
> Considering not all games run at 1080p 60 FPS on the XBOne, i'll have to classify 4x the amount of pixels rendered on a revision as wishful thinking.


Well of course not all titles run at 1080p 60. That's my point.

On consoles it's a developer choice with a give and take between detail level and resolution/framerate.

The reason it's pretty obviously going to happen is that Microsoft will want to be able to say their system runs 4K/60 games.

Anyway thread is about the new Titan.


----------



## Tobiman

Lmao, 1080s aren't even in stock yet.


----------



## zealord

Quote:


> Originally Posted by *Tobiman*
> 
> Lmao, 1080s aren't even in stock yet.


In Europe they are

even MSI 1080 Gaming X, ASUS STRIX etc. basically everything is in stock and ready to ship


----------



## Xuvial

Glad I stood by my decision to wait for 1080 Ti. The actual release obviously may stretch to end of 2016, but this is my logic - if I'm going to spend an absurd amount of money on a graphics card, I better be getting the full power of the architecture (or very close to it)

Quote:


> Originally Posted by *Tobiman*
> 
> Lmao, 1080s aren't even in stock yet.


Really? I mean I live in NZ, and even the main store here has 1080 FE's and aftermarket versions coming and going in/out of stock.


----------



## FattysGoneWild

$699 was a tough pill to swallow. Even at $599. Nvidia can get bent going further with pricing. My cap was reached with the 1080.


----------



## ref

Praying this isn't over 1500 USD...

I was going to get 2 of these to replace my 980s, but good god living in Canada that conversion rate is just killer.


----------



## FattysGoneWild

Big Pascal reveal: when the price of $1,400 comes up on the screen, I would love for everyone to just get up and walk out. Man, that would be epic.


----------



## zealord

Quote:


> Originally Posted by *ref*
> 
> *Praying this isn't over 1500 USD...*
> 
> I was going to get 2 of these to replace my 980s, but good god living in Canada that conversion rate is just killer.


It is shocking/horrifying to see how well Nvidia's pricing works despite being way too high.

People are not worried that this card will be more expensive than the previous Titan cards ($999); their only worry is that it not be more than 50%(!) more expensive than the previous one.

I mean, people can buy/pay what they want, but damn, that is a sour outlook for me if people are ready to pay up to $1499 for a single GPU, and maybe even more.


----------



## FattysGoneWild

Quote:


> Originally Posted by *zealord*
> 
> it is shocking/horrifying to see how well Nvidias pricing works despite being way too high.
> 
> People are not worrying that this card will be more expensive than the previous Titan cards (999$) but they are "only" worried to not be over 50%!! more expensive than the previous one.
> 
> I mean people can buy/pay what they want, but damn that is a sour outlook for me if people are ready to pay up to 1499$ for a single GPU and maybe even more


Pay that. Less than a year later, the 1180 hits for only $699, faster than Big Pascal. It's a pricing game and it works great. Bravo to Nvidia, I say. Well played. No cap exists for them, or for other people obviously. Sky is the limit. Keep on shooting!


----------



## magnek

Quote:


> Originally Posted by *zealord*
> 
> interesting point. It could be a "longevity" Titan
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Would definitively bode well with the Titan buyers that love a little reasuring clap on the back for lots of money well spent
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When yields get better bring out the GTX 1080 Ti. Nvidia may not need a GTX 1080 Ti for the next 9 months to 1-up AMD. The GTX 1080 may be enough and the Titan is the ultra deluxe enthusiast card for quite a while. Definitively food for thought there magnek !


I actually think Titan will be a cut down GP100 part that uses HBM2, while 1080 Ti will actually be GP102 but still use GDDR5X.

This way nVidia can justify the extra premium on the Titan even more, and make the Titan owners feel more special.
Quote:


> Originally Posted by *zealord*
> 
> it is shocking/horrifying to see how well Nvidias pricing works despite being way too high.
> 
> People are not worrying that this card will be more expensive than the previous Titan cards (999$) but they are "only" worried to not be over 50%!! more expensive than the previous one.
> 
> I mean people can buy/pay what they want, but damn that is a sour outlook for me if people are ready to pay up to 1499$ for a single GPU and maybe even more


The day x04 cards reach $999 is the day I quit PC gaming.


----------



## keikei

Quote:


> Originally Posted by *FattysGoneWild*
> 
> Big Pascal reveal. When the price of $*1,400* comes up on the screen. I would love for everyone just to get up and walk out. Man that would be epic.


Well, someone's gotta help fund Tegra. That's PC gaming related, right?


----------



## Chargeit

Quote:


> Originally Posted by *magnek*
> 
> I actually think Titan will be a cut down GP100 part that uses HBM2, while 1080 Ti will actually be GP102 but still use GDDR5X.
> 
> This way nVidia can justify the extra premium on the Titan even more, and make the Titan owners feel more special.
> The day x04 cards reach $999 is the day I quit PC gaming.


Kind of what I've been fearing. It doesn't make sense to continually put out a $1,000+ GPU and then equal it with a slightly cut-down version of the same card at $300 less. At the very least I'd be surprised to see HBM2 memory on the card.

It all has me on the fence. Part of me wants to give in and buy a Titan. Part of me wants to rebel and move down in GPU.

I can't complain too much about costs. I've got 2 G-Sync monitors, so Nvidia can easily check me off as a guaranteed sale for some time. I'm sure they figure all of that crap into what they're going to charge. I mean, who is going to buy AMD over Nvidia when they have a G-Sync monitor?


----------



## bobbavet

Bottlenecks a 6950X? Is that based on single thread gaming performance? Won't DX12 utilize available cores/threads?


----------



## ZealotKi11er

Quote:


> Originally Posted by *bobbavet*
> 
> Bottlenecks a 6950X? Is that based on single thread gaming performance? Won't DX12 utilize available cores/threads?


At 1080p DX11 it will. At 4K it won't. At 144 Hz, probably in some games.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *ryder*
> 
> noob question..
> 
> why would an ultra powerful gpu bottleneck cpus at lower resolutions (1080p, etc)?
> 
> like a) why does the gpu get bottlenecked by a cpu in the first place and b) why does that effect get stronger with lower resolutions?
> 
> i ask cause plan to buy a pascal titan and game at 1920x1200p.


The lower the resolution, the faster the GPU can render frames, hence higher frame rates at lower resolutions. In the case of an uber-powerful card like the Titan P, it can render frames at 1080p in excess of 200 FPS, which the CPU then has to match. If the CPU can't keep up with those frame times, the video card is stuck waiting for the CPU to serve up the next frame, and thus you have a CPU bottleneck.

A card like the Titan P is really idiotic for 1080p gaming in the first place, however. If you want to game at 1080p (even with a 144Hz monitor) you would be much better served with something like a 980 or 390X for the time being. Anything faster is just wasted on lower resolutions (unless benching)...
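The pipeline argument above can be sketched as a toy model. All timings here are made-up assumptions for illustration, not measurements of any real card:

```python
# Toy model of a CPU bottleneck: frame rate is limited by whichever of the
# CPU or GPU takes longer per frame. All timings below are illustrative
# assumptions, not measurements of any real hardware.

def fps(cpu_ms: float, gpu_ms_value: float) -> float:
    """Effective frame rate when CPU and GPU work as a pipeline."""
    return 1000.0 / max(cpu_ms, gpu_ms_value)

def gpu_ms(pixels: int, ms_per_mpix: float = 2.0) -> float:
    """Assume GPU render time scales roughly with pixel count."""
    return ms_per_mpix * pixels / 1e6

CPU_MS = 5.0                    # hypothetical ~200 FPS ceiling from the CPU
PIXELS_1080P = 1920 * 1080
PIXELS_4K = 3840 * 2160

# At 1080p the GPU finishes faster than the CPU, so the CPU is the limiter;
# at 4K the GPU takes far longer per frame and the CPU bottleneck disappears.
print(fps(CPU_MS, gpu_ms(PIXELS_1080P)))
print(fps(CPU_MS, gpu_ms(PIXELS_4K)))
```

With these assumed numbers the 1080p case caps out at the CPU's 200 FPS ceiling, while the 4K case is purely GPU-limited.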


----------



## SpeedyVT

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> The lower the resolution the faster the GPU can render frames, hence higher frame rates at lower resolutions. In the case of an uber powerful card like the Titan P it can render frames at 1080p in excess of 200 FPS which the CPU then has to match. If the CPU can't hold up to those high frame rendering times from the GPU then the video card is stuck sitting around waiting for the CPU to serve up the next frame and thus a CPU bottleneck. A card like the Titan P is really idiotic for 1080p gaming in the first place, however. If you want to hame at 1080p (even with a 144Hz monitor) you would be much better served with something like a 980 or 390X at the time being. Anything faster is just wasted on lower resolutions (unless benching)...


Not quite. Sometimes frame rates depend on other variables. Notice AMD cards lose less performance at higher resolutions while NVidia loses more. But you're definitely onto something.

I think the pricing scheme for this card is going to be $1500-1800.


----------



## Kana Chan

Aren't they coming out with a 1080P / 240Hz monitor this year?


----------



## Waitng4realGPU

Quote:


> Originally Posted by *Mhill2029*
> 
> Over compensating for what? It's simply a matter of what an individuals disposable income is.


All that disposable income and Logitech speakers


----------



## Yop

For an HBM2 card, why is it so long?


----------



## guttheslayer

Quote:


> Originally Posted by *GamerusMaximus*
> 
> Because the margins on such a product with no competition would be sky high vs the 1080. And people will buy them. They have to do something with the defective tesla and quadro parts.


If it's true that the Titan is ready by August/Sept (and mind you, I don't mean a paper launch), it can explain one thing:

TSMC's 16nm yields are much better than 28nm's were when it first came out.


----------



## guttheslayer

Quote:


> Originally Posted by *Yop*
> 
> For being an hbm2 card why is it so long?


Tons of voltage regulators? Maybe this card will be the first to break the 2.1GHz barrier. Also, the FE cooler isn't even cutting it for the 1080; a monster die like GP100, which is twice as big, definitely needs TWICE the cooling capacity.

The PCB is probably designed excessively large to accommodate the giant heatsink.

To recap, there might not be a 1080 Ti; instead, both SKUs may be named Titan so Nvidia can price them through the roof:

*
Titan P - 3584 cores, 12GB HBM2, No DP, $999, AiB available

Titan P FE - 3840 cores, 16GB HBM2, 1/4 DP, $1499, Only FE*


----------



## Asmodian

Quote:


> Originally Posted by *guttheslayer*
> 
> Tons of voltage regulator? Maybe this card will be the first to break the 2.1GHz barrier. Also the FE cooler is not even cutting it for 1080, a monster die like GP100 which is twice as big, definitely need TWICE the cooling capacity.
> 
> The PCB is probably designed excessively large to accommodate the giant heatsink.
> 
> to recap, there might not be a 1080 Ti, instead renamed as both Titan so Nvidia can price it through the roof:
> 
> *
> Titan P - 3584 cores, 12GB HBM2, No DP, $999, AiB available
> 
> Titan P FE - 3840 cores, 16GB HBM2, 1/4 DP, $1499, Only FE*


And maybe a 1080 Ti with 12 GB of GDDR5X later, for $800?

Edit: 12 GB, not 16 (384-bit bus)


----------



## guttheslayer

Quote:


> Originally Posted by *Asmodian*
> 
> And maybe a 1080 Ti with 16 GB of GDDR5X later, for $800?


Nope. Given that GP102 and GP100 are the same die but use different interlink interfaces, if GP102 uses HBM then Nvidia would have to design a new core (a GP103, say) to integrate a G5X controller instead of HBM.

That is very, very unlikely; the cost of designing a new core for the sake of G5X is just not worth it.


----------



## Asmodian

Quote:


> Originally Posted by *guttheslayer*
> 
> Nope, Given that GP102 and GP100 are same die but using different interlink interface, if GP102 is using HBM, Nvidia had to redesign a new core like GP103 to integrate the G5X controller instead of HBM.
> 
> That is very very unlikely, the cost to design a new core for the sake of G5X is just not worth it.


Are we sure GP102 is for a different interlink? The TechPowerUp post today seems to suggest the new Titan is GP100, but who really knows?

Quote:


> The GP100 and GTX TITAN P aren't the only high-end graphics cards targeted at gamers and PC enthusiasts; NVIDIA is also working on the GP102 silicon, positioned between the GP104 and the GP100. This chip could lack the FP64 CUDA cores found on the GP100 silicon, and feature up to 3,840 CUDA cores of the same kind found on the GP104. The GP102 is also expected to feature a simpler 384-bit GDDR5X memory interface. NVIDIA could base the GTX 1080 Ti on this chip.


----------



## guttheslayer

Quote:


> Originally Posted by *Asmodian*
> 
> Are we sure GP102 is for a different interlink? The TechPowerUp post today seems to suggest the new Titan is GP100, but who really knows?


Let me repost the original source that Techpowerup seems to quote from.

http://vrworld.com/2016/07/05/nvidia-gp100-titan-faster-geforce-1080/
Quote:


> While Nvidia will pull out all the guns to fight the AMD Radeon RX 480, releasing the GeForce GTX 1060 as early as July 7th, our focus is slowly turning towards the real big gun of the Pascal-based GeForce line-up. If our sources are correct, *GP100 and GP102 were essentially the same chips, with the difference being the NVLink interface on the GP100 and PCI Express on the GP102*. The feature set on both chips is the same, and there are no surprises.
> 
> We held the "Tesla P100 for PCI Express-based Servers" board in our hands just a few weeks ago, and just a few days ago, we managed to get our hands on a GP100/GP102-based GeForce GTX Titan. As we reported earlier, this board will only come to market after the debut of Quadro-branded products. Both Quadro and GeForce cards come with the same heatsink, albeit in different color schemes, and "QUADRO" vs. "TITAN" markings on the aluminum-machined shroud.
> 
> Display configuration is similar to the GeForce GTX 1080/1070, with one small change: our board did not have DVI connectors on it.


I find it hilarious that TECHPOWERUP gives a *completely different interpretation* from its own source.


----------



## Asmodian

Quote:


> Originally Posted by *guttheslayer*
> 
> Let me repost the original leak here...
> 
> http://vrworld.com/2016/07/05/nvidia-gp100-titan-faster-geforce-1080/


I wonder if their sources are correct. An entirely new die only for a different interface seems extreme to me but it could be true. However, I don't think it would add much area to pack a PCI-E interface next to the one for NVLink.

A GDDR5X-ready die also seems like a good idea, in case HBM2 didn't work out for some reason, if not simply as a cheaper option. It also looks like the same memory controller can use GDDR5 or GDDR5X; while 384-bit GDDR5 would bottleneck a 3840-CUDA-core Pascal GPU pretty badly, 384-bit GDDR5X would probably provide enough bandwidth. Maybe even some of that 14 Gbps GDDR5X.
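For reference, the bandwidth arithmetic behind that claim works out like this (the per-pin rates are the commonly quoted figures for these memory types, so treat them as assumptions):

```python
# Back-of-envelope peak bandwidth for a 384-bit bus. The per-pin data rates
# are commonly quoted figures for these memory types, not confirmed specs.

def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s: bus width (bits) x per-pin rate (Gbps) / 8."""
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gbs(384, 8))    # GDDR5  @ 8 Gbps/pin  -> 384 GB/s
print(bandwidth_gbs(384, 10))   # GDDR5X @ 10 Gbps/pin -> 480 GB/s
print(bandwidth_gbs(384, 14))   # GDDR5X @ 14 Gbps/pin -> 672 GB/s
```

So 384-bit GDDR5X sits well above plain GDDR5 at the same bus width, which is the whole point of the argument.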


----------



## guttheslayer

Quote:


> Originally Posted by *Asmodian*
> 
> I wonder if their sources are correct. An entirely new die only for a different interface seems extreme to me but it could be true. However, I don't think it would add much area to pack a PCI-E interface next to the one for NVLink.
> 
> A GDDR5X ready die also seems like a good idea, in case HBM2 didn't work out for some reason if not for being a cheaper option. It also looks like the same memory controller can use GDDR5 or GDDR5X, though 384-bit GDDR5 would bottleneck a 3840 CUDA core Pascal GPU pretty badly 384-bit GDDR5X would probably provide enough bandwidth. Maybe even some of that 14Gbps GDDR5X.


GP100 doesn't need to have PCIe, or they can disable the NVLink and just slap it on a PCIe board. Either way it's too confusing at this point, but from what we've seen so far with Tesla vs. Quadro, what VRW suggests is pretty spot on: GP100 and GP102 for NVLink and PCIe respectively. They need separate codes for the two since they run on different interfaces (and probably have minor tweaks in the GPU cores themselves).

Also, I still believe that G5X is not ready for 384 bits. So what Nvidia will do is follow the original schedule of:

HBM - High end / Ultra Enthusiast
G5X - Mainstream
G5 - Entry level

In this case, there is no need for a 384-bit G5X.


----------



## Defoler

Quote:


> Originally Posted by *Asmodian*
> 
> I wonder if their sources are correct. An entirely new die only for a different interface seems extreme to me but it could be true. However, I don't think it would add much area to pack a PCI-E interface next to the one for NVLink.
> 
> A GDDR5X ready die also seems like a good idea, incase HBM2 didn't work out for some reason if not for being a cheaper option. It also looks like the same memory controller can use GDDR5 or GDDR5X, though 384-bit GDDR5 would bottleneck a 3840 CUDA core Pascal GPU pretty badly 384-bit GDDR5X would probably provide enough bandwidth. Maybe even some of that 14Gbps GDDR5X.


If I'm not mistaken, people stated that the GP100 chip is missing parts which are essential for being a full-fledged consumer GPU, meaning the GP100 can't be a Titan, and a different version, aka the GP102, will become the Titan instead.

Regarding GDDR5X: since it is pretty new, I don't know whether a high-speed, high-capacity version will be available outside of the 1080. Since they are already making Tesla and Quadro cards with HBM2, and HBM2 has supposedly been in mass production for some good months already, I hope the cards will at least arrive with HBM2, as the rumours have said for a while. It would make more sense.


----------



## magnek

Quote:


> Originally Posted by *guttheslayer*
> 
> Tons of voltage regulator? Maybe this card will be the first to break the 2.1GHz barrier. Also the FE cooler is not even cutting it for 1080, a monster die like GP100 which is twice as big, definitely need TWICE the cooling capacity.
> 
> The PCB is probably designed excessively large to accommodate the giant heatsink.
> 
> to recap, there might not be a 1080 Ti, instead renamed as both Titan so Nvidia can price it through the roof:
> 
> *
> Titan P - 3584 cores, 12GB HBM2, No DP, $999, AiB available
> 
> Titan P FE - 3840 cores, 16GB HBM2, 1/4 DP, $1499, Only FE*


My friend, I think Occam's razor applies here and the rumor is just simply full of equine stool is all.


----------



## Asmodian

Quote:


> Originally Posted by *guttheslayer*
> 
> GP100 doesnt need to have a PCIe, or they can disabled the NVLink and just slap to a PCIe board, either way it too confusing at this point but what we see so far on Tesla vs Quadro, what VRW suggest is pretty spot on. GP100 and GP102 for both NVL and PCIe. They need a code to separate this 2 out since they run on different interface (probably have minor tweaks in the GPU cores itself)
> 
> And also, I still believe that G5X is not ready for 384 bits. So what Nvidia will do is to follow the original schedule of:
> 
> HBM - High end / Ultra Enthuiast
> G5X - Mainstream
> G5 - Entry level
> 
> In this case, there is no need for a 384 bits G5X.


From what I understand the PCIE Quadro is GP100, from VRW:
Quote:


> Feature-wise, all products are identical. GP100 physically carries 3840 CUDA cores. The number splits into two figures - 2880 FP32 Single-Precision and 960 FP64 Double-Precision cores. All products utilize GPU Boost to temporarily achieve peak performance. What Tesla P100 for PCIe-Based Servers does not carry are the display outputs. Still, we managed to see a prototype, a fully functional PCIe board which will come to market as Quadro P6000 and as GeForce GTX Titan (insert the random letter).


----------



## guttheslayer

Quote:


> Originally Posted by *magnek*
> 
> My friend, I think Occam's razor applies here and the rumor is just simply full of equine stool is all.


First, they already mentioned the board has some differences: no DVI connector.

Next, they mentioned the positioning of the 6+8 power pins.

Lastly, they mentioned the board is 12 inches long, which is different from the previous Titan, and described the "Titan" engraving and the different colour scheme.

I am very sure they did have the sample on hand to drill down to this level of detail. Unless they are imagining things.


----------



## Chargeit

Quote:


> Originally Posted by *Waitng4realGPU*
> 
> All that disposable income and Logitech speakers


Hey, don't knock simplicity. I got kind of crazy with my surround sound system and it dictates my entire computer room layout. Really want to be rid of the damned thing but it's so sweet to have when gaming, watching something, or listening to music.


----------



## guttheslayer

Quote:


> Originally Posted by *Asmodian*
> 
> From what I understand the PCIE Quadro is GP100, from VRW:


This author apparently doesn't know the layout of GP100, despite the block diagram officially released by NV. Let me help him:

Quote:


> Feature-wise, all products are identical. *GP100 physically carries 5760 cores. The number splits into two figures - 3840 FP32 Single-Precision and 1920 FP64 Double-Precision cores*. All products utilize GPU Boost to temporarily achieve peak performance. What Tesla P100 for PCIe-Based Servers does not carry are the display outputs. Still, we managed to see a prototype, a fully functional PCIe board which will come to market as Quadro P6000 and as GeForce GTX Titan (insert the random letter)


----------



## magnek

Quote:


> Originally Posted by *guttheslayer*
> 
> First they already mentioned the board have some difference, no DVI connector.
> 
> Next they mentioned the positioning of the 6+8 power pin
> 
> Lastly they mentioned the board is 12 inch long which is different from previous Titan, and describe the engrave with Titan and also different colour scheme.
> 
> I am very sure they did have the sample on board to drill down to this level of detail. *Unless they are imaginating thing.*


Well I have a Titan P sample sitting right at my desk, and I can tell you it does have a DVI connector, there is only one 8+6 pin version, and the power connectors are in the usual place. Oh and the card is only 8 inches long, not 12 as they mentioned.

...

I mean it's just too easy to write stuff and sound authoritative. Either way, I refuse to believe anything from this rumor until I see a 12 inch Titan with HBM2.


----------



## Asmodian

Quote:


> Originally Posted by *guttheslayer*
> 
> This author apparently doesn't know the layout of GP100, despite having the block diagram officially release by NV, let me help him:


Ah good, that 2880 32-bit CUDA cores number doesn't sound right because that would be almost the same as a 1080.

It is hard to know, I am inclined to believe both GP100 and GP102 can be run on PCIe but I am leaning towards agreeing that the new Titan P and at least one Quadro will actually use GP102. Hard to be sure right now though.


----------



## guttheslayer

Anyway, here is my rough estimate of the specs and details of the upcoming Titans.

Titan P 12G

Cores: 3584 CUDA
Base Clock: 1.5GHz+
Boost Clock: 1.6GHz+
FP32: 12 TFLOP
FP64: 380 GFLOP
Memory: 12GB HBM2
Bus Width: 3072 bits
Memory Bandwidth: 540 GB/s
TDP: 250W
MSRP: $999+

Others: Available as custom AiB

Titan P 16G

Cores: 3840 CUDA
Base Clock: 1.6GHz~
Boost Clock: 1.7GHz~
FP32: 13 TFLOP
FP64: 3.3 TFLOP
Memory: 16GB HBM2
Bus Width: 4096 bits
Memory Bandwidth: 720 GB/s
TDP: 300W
MSRP: $1399+

Others: Only as Founder Edition.

Should be around there; performance is estimated at a +50% delta over the 1080 for the full-fat Titan.
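Those numbers are at least internally consistent. A quick sketch of the arithmetic (FMA counts as 2 ops/clock; the ~1.4 Gbps/pin HBM2 rate is my own assumption, not a confirmed spec):

```python
# Sanity check of the estimated Titan specs above.
# Peak FP32 = cores x 2 ops/clock (FMA) x boost clock.
# HBM2 bandwidth = bus width (bits) x per-pin rate (Gbps) / 8.
# The ~1.4 Gbps/pin HBM2 rate is an assumption, not a confirmed spec.

def fp32_tflops(cores: int, boost_ghz: float) -> float:
    return cores * 2 * boost_ghz / 1000.0

def hbm2_bandwidth_gbs(bus_bits: int, gbps_per_pin: float = 1.4) -> float:
    return bus_bits * gbps_per_pin / 8

# 12G model: 3584 cores @ ~1.6 GHz boost, 3072-bit bus
print(fp32_tflops(3584, 1.6), hbm2_bandwidth_gbs(3072))
# 16G model: 3840 cores @ ~1.7 GHz boost, 4096-bit bus
print(fp32_tflops(3840, 1.7), hbm2_bandwidth_gbs(4096))
```

That lands close to the ~12/13 TFLOP and ~540/720 GB/s figures in the table above.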


----------



## Xuvial

Quote:


> Originally Posted by *guttheslayer*
> 
> Anyway here is what I can roughly estimate the specs and detail of the upcoming Titan.


Any estimates on 1080 Ti?


----------



## Haruna

If the historical price trend for Nvidia continues, where the "performance" Titan costs 3x more than the base X80 card, then I predict that Titan P will cost $2100

The cheaper titan will be $1800


----------



## guttheslayer

Quote:


> Originally Posted by *Haruna*
> 
> If the historical price trend for Nvidia continues, where the "performance" Titan costs 3x more than the base X80 card, then I predict that Titan P will cost $2100
> 
> The cheaper titan will be $1800


That is BS imho; since when is the Titan 3x the price of the x80?

It's always been less than 2x the price of the x80, or about 3x the price of the x70.

GTX 680 vs Titan = $499 vs $999
GTX 780 vs Titan Black = $649 vs $999
GTX 980 vs Titan X = $549 vs $999


----------



## Defoler

I doubt a Titan with an over-$1000 price tag.
That seems way overblown. I remember people claiming a $1000 1080, so I will take everything with a mountain of salt until the card is actually out.


----------



## guttheslayer

Quote:


> Originally Posted by *Asmodian*
> 
> Ah good, that 2880 32-bit CUDA cores number doesn't sound right because that would be almost the same as a 1080.
> 
> It is hard to know, I am inclined to believe both GP100 and GP102 can be run on PCIe but I am leaning towards agreeing that the new Titan P and at least one Quadro will actually use GP102. Hard to be sure right now though.


You know something is not right when GP100's die is almost twice as big as GP104's; 2880 cores on a die twice as big doesn't make sense.

Anyway, the actual specs of GP100 can be found on the NV devblog.


----------



## guttheslayer

Quote:


> Originally Posted by *Defoler*
> 
> I doubt a titan at over 1000$ price tag.


You forget the Titan Z at $2999.

Also, by having the 12GB Titan at $999, NV wouldn't really flout that rule.

Same case with Intel: we didn't expect their new 6950X to go way over $999 either.


----------



## Asmodian

Quote:


> Originally Posted by *guttheslayer*
> 
> You forget Titan Z at $2999.
> 
> Also by having the 12GB Titan at $999, NV didnt really "flaunt" that rule.
> 
> Same case with Intel, we didnt expect their new 6950X to go way over $999.


I agree; I expect the 12GB Titan to be the one at $999. It makes sense too, cf. Fury and Fury X. With two Titans this time, and given the price of the 1080, there isn't a good price point for the 12GB Titan if the 16GB Titan is "only" $999.

I also admit I would quickly pay well over $999 for a 3840-core, 1.8+ GHz Pascal with 16GB of HBM2... and Nvidia knows it.


----------



## guttheslayer

Quote:


> Originally Posted by *Asmodian*
> 
> I agree, I expect the 12GB Titan to be the one at $999, it makes sense too, e.g. Fury and Fury X. With two Titans this time, and the price of the 1080, there isn't a good price point for the 12GB Titan if the 16GB Titan is "only" $999.
> 
> I also admit I would quickly pay well over $999 for a 3840 core 1.8+ GHz Pascal with 16GB of HBM2... and Nvidia knows it.


To top it off:

1) GP100 is bigger than GM200, meaning fewer chips per wafer and less usable yield.
2) 16nm FF is almost twice as expensive as 28nm.
3) HBM2 is way more expensive than G5X, which is already priced higher than GDDR5.

If you expect the high-end Titan to be $999, the same as the previous 28nm Titans, you need to get your head checked.


----------



## SpeedyVT

Quote:


> Originally Posted by *guttheslayer*
> 
> To top it off:
> 
> 1) GP100 is bigger than GM200, meaning lesser chip per wafer, less usable yield.
> 2) 16nm FF is almost twice as expensive as 28nm
> 3) HBM2 is way more expensive than G5X which alr is higher price than GDDR5
> 
> If you expect the high end Titan to be $999, same as previous 28nm Titan, you need to get your head check.


I think there will be multiple Titans this time. The highest-binned chip will be in a $1.8-2k Titan, the second-highest bin at $1.6-1.8k, and the lowest bin at $1.4-1.6k.


----------



## guttheslayer

Quote:


> Originally Posted by *SpeedyVT*
> 
> I think there will multiple Titans this time. The highest binned chip will be in a $1.8-2k titan for second highest $1.6-1.8k for next bin down and $1.4-1.6k for the lowest bin quality.


That price is a bit ridiculous.

More like the 1K, 1.2K and 1.5K range.

Most of the good chips are used in Tesla/Quadro anyway; all that's left for GeForce are the ones that fail those bins.


----------



## Remij

My Canadian wallet will weep for sure.

I think the 1080ti/Titan(P) will be when I drop SLI and make a nice and small mATX build.

This go-around I'm more interested in shrinking the form factor than adding more power. There's something very satisfying about having such a powerful PC be so small.


----------



## Wishmaker

I cannot wait for the MSRP, which will be like a Jules Verne novel: non-existent on the globe.


----------



## Yttrium

Quote:


> Originally Posted by *ChevChelios*
> 
> > And at that rate I doubt cannonlake or overclocking will be the savior
> 
> (1) Cannonlake is a tick, so 10nm alone should give some perf increase
> (2) they might feel the need to get off their asses and increase perf gains more this time to decisively beat Zen(+), which will be the first real competition in a while
> 
> although I doubt they look at Titan P/Volta and think "oh man, we gotta make our CPUs faster so we don't bottleneck Nvidia's monsters"

Intel has said it is more focused on perf/watt than on raw performance, which, given their minor improvements over the years, might mean we won't see much of a performance jump.


----------



## Majin SSJ Eric

I don't see them splitting up the Titan cards in that way personally. The lesser card would more likely be the 1080 TI with the full card being the Titan, which will almost certainly cost more than $1000. But hey, let's all just speculate away as we always do! Everything always turns out exactly the way we think it will!


----------



## MACH1NE

RRP $1,999


----------



## Baasha

Well... this is good news. Having run GTX 1080s in SLI for a few weeks now at 5K (and every resolution below), I cannot wait for the new Titan to drop.

1080 SLI performs tremendously well at 5K in most games, which is phenomenal to say the least. I never expected 2 GPUs to outperform 4 of the previous generation's GPUs, and Titan Xs at that (save for a really small sample of games).

The real question, however, is: will the new Titan P, or whatever it's going to be called, support 3-way and 4-way SLI for gaming? I really hope Dell releases the 4K OLED 120Hz panels so that I can get three of those for 4K Surround with 4x 16GB Titan XX (?) and blast!

Here's a quickie of the GTX 1080 at 5K playing some BF4:


----------



## HackHeaven

What games do you guys plan to play with these $1500 cards?

I don't think AMD will ever match Nvidia for some reason, but if they did, they would win on lower price and higher power (well, it seems people may buy Nvidia just to buy Nvidia anyway).


----------



## BiG StroOnZ

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I don't see them splitting up the Titan cards in that way personally. The lesser card would more likely be the 1080 TI with the full card being the Titan, which will almost certainly cost more than $1000. But hey, let's all just speculate away as we always do! Everything always turns out exactly the way we think it will!


NVIDIA is all about making money; if that means two Titans with different core counts, memory sizes, and different prices, by all means they will do it. NVIDIA has had some time to plan out how they want to do Big Die Pascal.

With Big Die Maxwell we only saw two GTX SKUs (980 Ti and Titan X).

Yet with Big Die Kepler we saw four (780, Titan, 780 Ti, Titan Black), again with different CUDA core counts and memory sizes.

NVIDIA could easily do the following:

GP102 Titan 16GB HBM - $1200 (August) - 50% over 1080
GP102 Titan 12GB HBM (slightly cut down) - $1000 (August) - 40% over 1080
GP102 1080 Ti 8GB G5X (even more cut down) - $800 (November) - 30% over 1080


----------



## Neo_Morpheus

I didn't think they'd have the supply of HBM2 to spend on variants with differences as small as 12GB vs. 16GB models.


----------



## Nestala

Anyone who thinks Nvidia will release the Titan in August, or even this year, is delusional. It will only be a reveal in August; having an engineering sample isn't the same as being ready for production. You could have an engineering sample ready a year before mass production.

Also, the 1080 is barely out, and Nvidia knows that Vega will not come until at least October (and even that is only a rumour); more likely is a Q1 release for Vega, and the same goes for the Titan.


----------



## BiG StroOnZ

Quote:


> Originally Posted by *Neo_Morpheus*
> 
> I didn't think they have the supply of HBM2 to waste on variants with smaller differences like a 12GB & 16GB model.


Well, the problem is that a 600 mm²+ die on 16nm is quite a large die to be producing when 16nm isn't even mature yet, so the yields aren't going to be great. They are going to have tons of dies that don't make the cut, and they have to do something with them.
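To put some rough numbers on that, here's the standard dies-per-wafer approximation plus a simple Poisson yield model. The die sizes approximate GP104 (~314 mm²) and GP100 (~610 mm²); the defect density is a pure assumption chosen only to illustrate the trend:

```python
import math

# Rough dies-per-wafer and a simple Poisson yield model for a 300 mm wafer.
# Die sizes approximate GP104 (~314 mm^2) and GP100 (~610 mm^2); the defect
# density below is an assumption for illustration, not a real TSMC figure.

def dies_per_wafer(die_mm2: float, wafer_d_mm: float = 300.0) -> int:
    """Standard approximation: gross dies minus edge losses."""
    r = wafer_d_mm / 2
    return int(math.pi * r * r / die_mm2
               - math.pi * wafer_d_mm / math.sqrt(2 * die_mm2))

def poisson_yield(die_mm2: float, defects_per_cm2: float) -> float:
    """Fraction of dies with zero defects under a Poisson model."""
    return math.exp(-(die_mm2 / 100.0) * defects_per_cm2)

for die in (314, 610):
    good = dies_per_wafer(die) * poisson_yield(die, defects_per_cm2=0.2)
    print(die, round(good))
```

The big die loses twice: fewer candidates fit on the wafer, and each candidate is more likely to catch a defect, so good dies per wafer fall off a cliff.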


----------



## Eorzean

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> NVIDIA is all about making money, if that means two Titans with different core amounts, memory sizes, and different prices by all means they will do it. NVIDIA has had some time to plan out how they want to do Big Die Pascal.
> 
> With Big Die Maxwell we only saw two GTX SKUs (980 Ti and Titan X).
> 
> Yet with Big Die Kepler we saw four (780, Titan, 780 Ti, Titan Black). Again with different Cuda Core counts and memory sizes.
> 
> NVIDIA could easily do the following:
> 
> GP102 Titan 16GB HBM - $1200 (August) - 50% over 1080
> GP102 Titan 12GB HBM (slightly cut down) - $1000 (August) - 40% over 1080
> *GP102 1080 Ti 8GB G5X (even more cut down) - $800 (November) - 30% over 1080*


That's making me want to hold off on getting a 1080 (was in the process of selling my 1070 to upgrade) and live with toning settings down until the Ti comes out. As nice as a 1080 Ti for only $100 more sounds, why would Nvidia give us a 30% jump in performance for only $100 more if there is no competition? As of right now, that 30% jump comes at a ~$320 premium. If they're only competing with themselves, I really don't see them giving us any performance freebies.


----------



## ChevChelios

I'm one step ahead: instead of a Titan P/Ti, I'm waiting for Volta.


----------



## BiG StroOnZ

Quote:


> Originally Posted by *Eorzean*
> 
> That's making me want to hold off on getting a 1080 (was in the process of selling my 1070 to upgrade) and live with toning settings down until the Ti comes out. As nice as a 1080 Ti for only $100 more sounds, why would Nvidia give us a 30% jump in performance for only $100 more if there is no competition? As of right now, that 30% jump comes at a ~$320 premium. If they're only competing with themselves, I really don't see them giving us any performance freebies.


Vega is supposed to launch in the fall too (per the most recent rumor), so I'm guessing by then they might actually have some competition. AMD launched 3 SKUs with Polaris, so maybe we can expect 3 SKUs with Vega.

By then I imagine the $600 1080 will be in abundance and the $380 1070 will be widely available, so realistically it will end up being around $200 more. Which is how I priced my GP102 parts.


----------



## theringisMINE

Quote:


> Originally Posted by *c0nsistent*
> 
> Be prepared to spend 20% more for GPUs every year then. As long as AMD isn't level with Nvidia performance wise, the prices will keep rising. Once AMD is gone for good, people like yourself wont have a competitive brand to talk trash about, so I'd be hoping Vega does well if I were you.


Pretty much. Even the biggest Nvidia fanboys out there should be praying AMD does well, for everyone's sake... Otherwise this is a visual representation of the control Nvidia will have over gamers, the market, and pricing.


----------



## ChevChelios

Vega is the hero we need


----------



## Eorzean

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> Vega is supposed to launch in the Fall too (most recent rumor), so I'm guessing by then they might actually have some competition. AMD launched 3 SKUs with Polaris, so maybe we can expect 3 SKUs with Vega.
> 
> By then I imagine the $600 1080 will be in abundance, and the $380 1070 will be widely available. So realistically it will end up being for around $200 more. Which is how I priced my GP102 parts.


Hopefully Vega does come out in October and drives prices down across the board. What I really hope for is AMD to release something better than a 1080, at a better price... I'd jump ship in a heartbeat. I've been waiting for that opportunity for a long time though.
Quote:


> Originally Posted by *theringisMINE*
> 
> Pretty much. Even the biggest Nvidia fanboys out there should be praying AMD does well, for everyone's sake... Otherwise this is a visual representation of the control Nvidia will have over gamers, the market, and pricing.


LMAO. That pic brings "integrated" to a whole new level. Nvidia in the year ~~2027~~ 2019.


----------



## guttheslayer

Quote:


> Originally Posted by *Nestala*
> 
> Anyone who thinks Nvidia will release the Titan in August, or even this year, is delusional. It will only be a reveal in August; having an engineering sample isn't the same as being ready for production. You could have an engineering sample ready a year before you mass-produce.
> 
> Also, the 1080 is barely out, and Nvidia knows that Vega will not come until at least October (and even that is only a rumor); a Q1 release is more likely for Vega, and the same goes for the Titan.


You are the one that is delusional. When the market is willing to pay for the 1080, and given how fast they sold out, pushing the Titan out at an extreme premium as early as possible is one way to earn even more.

Time and time again the world has shown that we are willing to pay extreme prices. And the best way to milk the market is to cut the die down into many configs:

3840 / 3584 / 3200 / 3072 core configurations

So many different GPUs to feed the masses. That is how they can milk the market for the next 12-24 months. The above configurations should carry over to the 1100 series next year.


----------



## Xuvial

Quote:


> Originally Posted by *theringisMINE*
> 
> Pretty much. Even the biggest Nvidia fanboys out there should be praying AMD does well, for everyone's sake... Otherwise this is a visual representation of the control Nvidia will have over gamers, the market, and pricing.


As an avid nVidia (and Intel) fan, I'm absolutely HOPING with every fiber of my being that AMD can keep up with these performance leaps.

I don't know if there is some kind of antitrust law that keeps companies from monopolizing markets and inflating the heck out of prices... but Intel is pretty much already doing it, and nVidia is starting to do it too.
Quote:


> Originally Posted by *guttheslayer*
> 
> You are the one that is delusional. When the market is willing to pay for the 1080, and given how fast they sold out, pushing the Titan out at an extreme premium as early as possible is one way to earn even more.


They can't churn out new products too fast, or they will saturate and stagnate their own market. If you were in nVidia's shoes, the last thing you would want people to think is "oh, just wait a few months and get a better product". You want to hit that sweet spot where the next product is just _slightly_ too far away, so the consumer buys the current product now, while the next product is _just_ enticing enough to make the consumer consider upgrading. It's a science that Apple has mastered with their phones.









For example, I'm looking to upgrade my card and this announcement has completely killed any chance of me buying a 1080.


----------



## Waitng4realGPU

Quote:


> Originally Posted by *Xuvial*
> 
> As an avid nVidia (and Intel) fan, I'm absolutely HOPING with every fiber of my being that AMD can keep up with these performance leaps.
> 
> I don't know if there is some kind of legal/law thing that keeps companies from monopolizing stuff and inflating the heck out of prices...but Intel are pretty much already doing it and nVidia are starting to do it too.
> They can't chug out new products too fast, or they will saturate and stagnate their own market. If you were in nVidia's shoes the last thing you would want people to think is "oh just wait a few months and get a better product". You want to hit that sweet-spot where the next product is just *slightly* too far away and make the consumer buy the current product, with the next product being *just* enticing enough to make the consumer consider upgrading. It's a science that Apple has mastered with their phones.
> 
> For example, I'm looking to upgrade my card and this announcement has completely killed any chance of me buying a 1080.


To be fair, the actual MSRP isn't that much higher than their previous products; the FE is just kind of a loophole to charge more.

Apple is worse than Nvidia in my eyes.


----------



## guttheslayer

Quote:


> Originally Posted by *Xuvial*
> 
> For example, I'm looking to upgrade my card and this announcement has completely killed any chance of me buying a 1080.


If the price of the new Titan is $1400, are you sure you want to give up a 1080 at $599?

The enthusiast market is quite different from the mainstream; it's not a big market and probably affects only 5% of buyers, but the whole point is to shift the market's perception of PRICE. Basically, Nvidia's goal is to convince the world that a $1400 flagship is good bang for the buck. If they can change the market's perception as early as possible, it could mean something for them.

Putting a very big price gap between the 1080 and the Titan doesn't really saturate their market; in fact it helps alleviate the 1080 stock shortage everywhere, and it makes the 1080 look even more attractive.

Your situation only applies if a GTX 1180 comes out in the same price bracket as the 1080 within 3 months; that would create the problem you mention. But no, the Titan is twice as expensive as the 1080, and not many will actually dump their 1080 for it. Just FYI.


----------



## renejr902

Quote:


> Originally Posted by *guttheslayer*
> 
> Tons of voltage regulators? Maybe this card will be the first to break the 2.1GHz barrier. Also, the FE cooler isn't even cutting it for the 1080; a monster die like GP100, which is twice as big, definitely needs TWICE the cooling capacity.
> 
> The PCB is probably designed excessively large to accommodate the giant heatsink.
> 
> To recap, there might not be a 1080 Ti; instead both are renamed Titan so Nvidia can price it through the roof:
> 
> *
> Titan P - 3584 cores, 12GB HBM2, No DP, $999, AiB available
> 
> Titan P FE - 3840 cores, 16GB HBM2, 1/4 DP, $1499, Only FE*


I'm ready to buy the Titan P at 3584 cores. Where can I pre-order?


----------



## Kana Chan

Twice the cost for 2x the transistor count though? ~$1300-1400 seems about right?


----------



## guttheslayer

Quote:


> Originally Posted by *Kana Chan*
> 
> Twice the cost for 2x transistor count though? ~1300-1400 seems about right?


Err no, the full transistor count only applies when you include the DP compute units; most likely the Titan will be partially or fully crippled for DP. So no, you don't get twice the transistors for your Titan.


----------



## guttheslayer

Quote:


> Originally Posted by *renejr902*
> 
> I'm ready to buy the Titan P at 3584 cores. Where can I pre-order?


If anything, the 3584-core version with AiB boards might cost even more. $1200 Kingpin edition, anyone?

It's basically a 1080 Ti in disguise, slapped with a Titan price.


----------



## renejr902

Quote:


> Originally Posted by *Kana Chan*
> 
> Twice the cost for 2x transistor count though? ~1300-1400 seems about right?


To be honest I want to buy the next Titan, but only at $999-$1099 MAX. So I really hope they do a Titan P 12GB 3584... otherwise I will wait for the 1080 Ti. I won't waste $1400 on a full Titan card.


----------



## renejr902

Quote:


> Originally Posted by *guttheslayer*
> 
> If anything, the 3584-core version with AiB boards might cost even more. $1200 Kingpin edition, anyone?
> 
> It's basically a 1080 Ti in disguise, slapped with a Titan price.


I won't care about a Kingpin edition; I will buy a cheaper custom edition. I can overclock myself. I just want 2 or 3 cooler fans; I didn't like the ONE fan on the Founders Edition. Two weeks ago I bought a Gigabyte Windforce GTX 1070 with 2 fans while I'm waiting for the Titan... I can't game anymore on Intel integrated graphics.

I suppose Nvidia should know that some enthusiasts like me won't pay more than $999-$1099 for the Titan. They have to build 2 different Titan cards if they want to make a $1400+ edition; otherwise they won't sell a lot of them, and like a lot of people I will wait for the 1080 Ti.


----------



## Nestala

Quote:


> Originally Posted by *guttheslayer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nestala*
> 
> Anyone who thinks Nvidia will release the Titan in August, or even this year, is delusional. It will only be a reveal in August; having an engineering sample isn't the same as being ready for production. You could have an engineering sample ready a year before you mass-produce.
> 
> Also, the 1080 is barely out, and Nvidia knows that Vega will not come until at least October (and even that is only a rumor); a Q1 release is more likely for Vega, and the same goes for the Titan.
> 
> 
> 
> You are the one that is delusional. When the market is willing to pay for the 1080, and given how fast they sold out, pushing the Titan out at an extreme premium as early as possible is one way to earn even more.
> 
> Time and time again the world has shown that we are willing to pay extreme prices. And the best way to milk the market is to cut the die down into many configs:
> 
> 3840 / 3584 / 3200 / 3072 core configurations
> 
> So many different GPUs to feed the masses. That is how they can milk the market for the next 12-24 months. The above configurations should carry over to the 1100 series next year.

Quote:


> Originally Posted by *Xuvial*
> 
> Quote:
> 
> 
> 
> Originally Posted by *theringisMINE*
> 
> Pretty much. Even the biggest Nvidia fanboys out there should be praying AMD does well, for everyone's sake... Otherwise this is a visual representation of the control Nvidia will have over gamers, the market, and pricing.
> 
> 
> 
> As an avid nVidia (and Intel) fan, I'm absolutely HOPING with every fiber of my being that AMD can keep up with these performance leaps.
> 
> I don't know if there is some kind of legal/law thing that keeps companies from monopolizing stuff and inflating the heck out of prices...but Intel are pretty much already doing it and nVidia are starting to do it too.
> Quote:
> 
> 
> 
> Originally Posted by *guttheslayer*
> 
> You are the one that is delusional. When the market is willing to pay for the 1080, and given how fast they sold out, pushing the Titan out at an extreme premium as early as possible is one way to earn even more.
> 
> 
> They can't chug out new products too fast, or they will saturate and stagnate their own market. If you were in nVidia's shoes the last thing you would want people to think is "oh just wait a few months and get a better product". You want to hit that sweet-spot where the next product is just _slightly_ too far away to make the consumer buy the current product, with the next product being _just_ enticing enough to make the consumer consider upgrading. It's a science that Apple has mastered with their phones
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For example, I'm looking to upgrade my card and this announcement has completely killed any chance of me buying a 1080.

This. Also, I can guarantee you 100% that they aren't ready for production yet when they've just made the jump to 16nm; it's not a problem of wanting to release a new product, it's a problem of not being able to.

Also, the 1080 sold out so quickly because there was barely any stock available; whether that was artificial scarcity from Nvidia or a yield problem is another topic.

I will remind you of that again when the Titan P releases.


----------



## Power Drill

Quote:


> Originally Posted by *renejr902*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Kana Chan*
> 
> Twice the cost for 2x transistor count though? ~1300-1400 seems about right?
> 
> 
> 
> To be honest i want to buy the next Titan, but only at 999$ 1099$ MAX. So i really hope they do a titan P 12GB 3584.... otherwise i will wait for the 1080 ti. i wont waste 1400$ for a full titan card

Remember the days when, for that money, you could buy a full-blown high-end PC, monitor included, and play games at max settings?

Now that money buys you only one component to achieve the same, and it's considered the norm... truly sad times, with even sadder ones ahead.


----------



## renejr902

Quote:


> Originally Posted by *guttheslayer*
> 
> Err no, the full transistor count only applies when you include the DP compute units; most likely the Titan will be partially or fully crippled for DP. So no, you don't get twice the transistors for your Titan.


I have a question for you, guttheslayer: I just received my Windforce GTX 1070. It's still brand new; I haven't even opened it. SHOULD I SELL IT and wait for a Titan at $1099 at the end of August or September? OR is it too risky to HOPE for a Titan at $1099 max by September? I can't wait until November with my Intel GPU LOL. Thanks for your opinion. I know I have to take a chance one way or another (sorry for my bad English).


----------



## mrpurplehawk

Quote:


> Originally Posted by *guttheslayer*
> 
> If anything, the 3584-core version with AiB boards might cost even more. $1200 Kingpin edition, anyone?
> 
> It's basically a 1080 Ti in disguise, slapped with a Titan price.


Quote:


> Originally Posted by *renejr902*
> 
> To be honest I want to buy the next Titan, but only at $999-$1099 MAX. So I really hope they do a Titan P 12GB 3584... otherwise I will wait for the 1080 Ti. I won't waste $1400 on a full Titan card.
> I won't care about a Kingpin edition; I will buy a cheaper custom edition. I can overclock myself. I just want 2 or 3 cooler fans; I didn't like the ONE fan on the Founders Edition. Two weeks ago I bought a Gigabyte Windforce GTX 1070 with 2 fans while I'm waiting for the Titan... I can't game anymore on Intel integrated graphics.
> 
> I suppose Nvidia should know that some enthusiasts like me won't pay more than $999-$1099 for the Titan. They have to build 2 different Titan cards if they want to make a $1400+ edition; otherwise they won't sell a lot of them, and like a lot of people I will wait for the 1080 Ti.


AIB Titan? That'd be interesting.


----------



## renejr902

Quote:


> Originally Posted by *Power Drill*
> 
> Remember the days when people could buy full blow high end PC with that money even with monitor included an play games at max setting?
> 
> Now you get only one component with that money to achieve the same and it's considered the norm... truly sad times now and even more so ahead.


I know what you mean by that; my first video card was a 3dfx Voodoo 1 4MB.









I know they sell their cards at very expensive prices, but technology advances faster now than it did back then. They try to sell us the best product available as soon as possible; I think that can explain why prices are so much higher than in the old days...


----------



## renejr902

Quote:


> Originally Posted by *mrpurplehawk*
> 
> AIB Titan? That'd be interesting.


I hope for an AIB Titan too; I don't like this Founders Edition at all.


----------



## guttheslayer

Quote:


> Originally Posted by *renejr902*
> 
> I have a question for you, guttheslayer: I just received my Windforce GTX 1070. It's still brand new; I haven't even opened it. SHOULD I SELL IT and wait for a Titan at $1099 at the end of August or September? OR is it too risky to HOPE for a Titan at $1099 max by September? I can't wait until November with my Intel GPU LOL. Thanks for your opinion. I know I have to take a chance one way or another (sorry for my bad English).


Why do you make me sound like I work for Nvidia? I don't, lol.

I suggest you keep your 1070 and just be happy; there is no real confirmation of when Titan stock will be available or HOW EXPENSIVE it will be. A paper launch is one thing, actual product availability is another.


----------



## Nestala

Quote:


> Originally Posted by *renejr902*
> 
> Quote:
> 
> 
> 
> Originally Posted by *guttheslayer*
> 
> Err no, full transistor count only happen when you include the DP compute unit, most likely the titan will be crippled partially or fully for DP. So no, you dont get twice the transistor for your Titan.
> 
> 
> 
> I have a question for you, guttheslayer: I just received my Windforce GTX 1070. It's still brand new; I haven't even opened it. SHOULD I SELL IT and wait for a Titan at $1099 at the end of August or September? OR is it too risky to HOPE for a Titan at $1099 max by September? I can't wait until November with my Intel GPU LOL. Thanks for your opinion. I know I have to take a chance one way or another (sorry for my bad English).

You have a 1070. If you were in the target audience for the Titan, you'd have a 1080. So keep your 1070 and be happy with it.


----------



## renejr902

Quote:


> Originally Posted by *guttheslayer*
> 
> Why do you make me sound like I work for Nvidia? I don't, lol.
> 
> I suggest you keep your 1070 and just be happy; there is no real confirmation of when Titan stock will be available or HOW EXPENSIVE it will be. A paper launch is one thing, actual product availability is another.


Thanks so much for the answer.









I know you don't work for Nvidia, but you seem to understand this industry well.


----------



## guttheslayer

Quote:


> Originally Posted by *mrpurplehawk*
> 
> AIB Titan? That'd be interesting.


I am just assuming there could be an AiB Titan because this card is actually a 1080 Ti in disguise.

In the past, Titans were never allowed to be sold with custom cooling/PCBs, because that would actually affect sales in their professional market.

Imagine a fully DP-capable Titan that could be overclocked through the roof in a Kingpin edition. How the hell would they justify selling their Tesla at $5000?

The Titan X was not allowed AiB versions for the same reason: the Quadro M6000 shares exactly the same features.


----------



## renejr902

Quote:


> Originally Posted by *Nestala*
> 
> You have a 1070. If you'd be in the target audience for the Titan you'd have 1080. So keep your 1070 and be happy with it.


I have waited more than a year for the next Titan. I sold my 980 Ti because it was not good enough for 4K at 60fps, and I don't want to go SLI.

I'm gaming on a 50'' Samsung SUHD TV. I want to play most games at 4K at ultra or high settings at 60 fps.

With my 980 Ti some games were playable at high settings in 4K at 60fps, but only a few of them.


----------



## Majin SSJ Eric

Titan will not be out in August, or in 2016 at all I don't think. Just don't see them cutting off the 1080 so soon. Historically Titan has followed the X80 cards by a good long while.


----------



## guttheslayer

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Titan will not be out in August, or in 2016 at all I don't think. Just don't see them cutting off the 1080 so soon. Historically Titan has followed the X80 cards by a good long while.


Again, how do they cut the 1080 off when the two are more than 2x apart in price?

It's actually Nvidia coming out with a mid-range card and then the high end in just 3 months. AMD is doing the same, but because their GPUs are way behind Nvidia's, they have to price them at 1/3.


----------



## Nestala

Quote:


> Originally Posted by *renejr902*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nestala*
> 
> You have a 1070. If you'd be in the target audience for the Titan you'd have 1080. So keep your 1070 and be happy with it.
> 
> 
> 
> I have waited more than a year for the next Titan. I sold my 980 Ti because it was not good enough for 4K at 60fps, and I don't want to go SLI.
> 
> I'm gaming on a 50'' Samsung SUHD TV. I want to play most games at 4K at ultra or high settings at 60 fps.
> 
> With my 980 Ti some games were playable at high settings in 4K at 60fps, but only a few of them.

Best option for you would be to get a Titan/1080 Ti/Vega then, since those seem to be the first cards that can actually do 4K/60 at high settings.

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Titan will not be out in August, or in 2016 at all I don't think. Just don't see them cutting off the 1080 so soon. Historically Titan has followed the X80 cards by a good long while.


Exactly, it's in Nvidia's best interest to saturate the market with the 1060/1070/1080 first, before they push out the Titan. Since Vega seems set to drop in Q1 2017, that would be the perfect moment for Nvidia to drop the Titan, probably slightly before Vega releases, as Nvidia usually releases its products before AMD does.


----------



## renejr902

Thanks guys for the answers, it's really appreciated. I will open it later today or tomorrow; at least I can play DiRT Rally at 4K at 60fps with it, probably Forza 6 too. But I don't hope for anything in Witcher 3 or Rise of the Tomb Raider. I played through Witcher 3 once (I finished the whole game) with my 980 Ti overclocked, at 4K with ultra settings; it was playable enough, 35-45fps all the time, WITH NO AA and NO NVIDIA HAIRWORKS. LOL, with those enabled it stuttered too much.


----------



## Kana Chan

Quote:


> Originally Posted by *Nestala*
> 
> You have a 1070. If you'd be in the target audience for the Titan you'd have 1080. So keep your 1070 and be happy with it.


Unless you like upscaling videos with SVP; then you can never have enough GPU power.


----------



## Nestala

Quote:


> Originally Posted by *guttheslayer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Majin SSJ Eric*
> 
> Titan will not be out in August, or in 2016 at all I don't think. Just don't see them cutting off the 1080 so soon. Historically Titan has followed the X80 cards by a good long while.
> 
> 
> 
> Again, how do they cut the 1080 off when the two are more than 2x apart in price?
> 
> It's actually Nvidia coming out with a mid-range card and then the high end in just 3 months. AMD is doing the same, but because their GPUs are way behind Nvidia's, they have to price them at 1/3.

The thing is, let's say you want to drive a 4K/60Hz display right now; you want the best card available, right? So you get a 1080. Then let's say the Titan drops in Q1 2017; you obviously upgrade from your 1080 to a Titan, since it has better performance, right?
So Nvidia grabbed your cash 2 times. They made ~$2000 off of you (an estimate, since we obviously don't know the Titan price yet).

Now let's say Nvidia does release the Titan in August. You want to drive a 4K/60 display, so you go straight for the Titan, since it has the better performance. In this scenario Nvidia would only grab your cash 1 time, and they would've made $1300 (again, only my estimated Titan price) off of you.

So tell me, which is the better scenario for Nvidia?

That's what saturating the market is all about.
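Nestala's two scenarios boil down to simple arithmetic. A sketch of the per-enthusiast revenue in each case, using his estimated prices (neither figure is confirmed):

```python
# Nestala's two launch scenarios, using his estimated prices (not confirmed).
GTX_1080 = 600    # rough street price of a GTX 1080
TITAN_EST = 1300  # his guess at the Titan's price

# Staggered launch: the 4K gamer buys a 1080 now, then a Titan in Q1 2017.
staggered = GTX_1080 + TITAN_EST

# Early launch: the Titan ships in August, so the gamer skips the 1080.
early = TITAN_EST

print(f"Staggered launch: Nvidia makes ${staggered} off one enthusiast")
print(f"August launch:    Nvidia makes ${early} off the same enthusiast")
```

That ~$600 difference per enthusiast is the whole argument for holding the Titan back.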


----------



## renejr902

WCCFTECH just posted that 1 min ago:

AMD's Powerful Vega 10 GPU Expected For Launch in 1H 2017 - Utilizes HBM2 Memory, Featured in High-End Graphics Cards

Read more: http://wccftech.com/amd-vega-10-gpu-launch-rumor-2017/#ixzz4Dd3pVz1c


----------



## ChevChelios

Quote:


> in the first part of H1 2017, possibly even in Q1 2017


hmm

and wccftech doesn't seem to mention Vega 11 at all, and is calling Vega 10 the flagship


----------



## renejr902

Why should Nvidia release an HBM2 Titan card now if AMD will wait until Q1 2017? I'm really not sure about the August rumor anymore.


----------



## guttheslayer

Lol amd always late and underperform


----------



## renejr902

Quote:


> Originally Posted by *guttheslayer*
> 
> Lol amd always late and underperform


I hope you're right; I want my Titan P in August or September.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Titan will not be out in August, or in 2016 at all I don't think. Just don't see them cutting off the 1080 so soon. Historically Titan has followed the X80 cards by a good long while.


Quote:


> Originally Posted by *Nestala*
> 
> The things is, lets say if you want to drive a 4k/60hz display right now, you want the best card available right? So you get a 1080. Then lets say in Q1/2017 when the Titan drops, you obviously upgrade from your 1080 to a Titan, since it has better performance, right?
> So Nvidia grabbed your cash 2 times. They made ~2000$ off of you (estimate since we don't know Titan price yet obviously).
> 
> So now lets say Nvidia does release the Titan in August. You want to drive a 4k/60 display. You straight up get the Titan, right? Since it has the better performance. In this scenario Nvidia would only grab your cash 1 time, and they would've made 1300$ (only my estimated Titan price) off of you.
> 
> So tell me, what would be the better scenario for Nvidia?
> 
> That's what saturating the market is all about.


Agreed.

Think it'll be next spring, just like the previous Titans' release dates. Just in time for your tax return!


----------



## guttheslayer

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Agreed.
> 
> Think it'll be next spring just like the last Titans release dates. Just in time for your tax return!


I still believe in a September release.


----------



## keikei

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Titan will not be out in August, or in 2016 at all I don't think. Just don't see them cutting off the 1080 so soon. Historically Titan has followed the X80 cards by a good long while.


Yeah, an August release date doesn't make sense; Nvidia would be cannibalizing their own cards. Vega isn't scheduled until early 2017, so the GTX 1080 will be the top card for a while. The relevant question is when Green will release something to compete with the RX 480. We know AMD has no motivation to compete at the high end currently.


----------



## ChevChelios

Quote:


> The relevant question is when will Green release something to compete with the RX 480


Apparently it's tomorrow, July 7th.

maybe


----------



## Nestala

Quote:


> Originally Posted by *guttheslayer*
> 
> Lol amd always late and underperform


"late and underperform" now you just sound like a Nvidia fanboy because you're judging a card that we have exactly 0 info about other the fact that it's gonna be called Vega, a flagship card and will have HBM2.

Also "late": this will be around the same time the Titan will arrive. No way they'll release it this year, as many others and me have said before.


----------



## ChevChelios

He meant late to compete with the 1080, not the Titan.


----------



## Exeed Orbit

I still don't understand the concept of bashing the competition. It's as if some people enjoy being hammered by $700 mid-range cards. If AMD keeps failing, Nvidia will keep charging obscene amounts for their cards. But I guess that's worth being "right".


----------



## Nestala

Quote:


> Originally Posted by *ChevChelios*
> 
> he meant late to compete with 1080, not the Titan


I think AMD never planned to compete with the 1080 in any form right from the beginning; Vega is aimed more at the Titan P/1080 Ti.


----------



## renejr902

Videocardz says VR World's source is credible and that it's not just a rumor:

Some fresh news about a new TITAN has just popped up at VR World.
GeForce GTX TITAN 'P', GTX 1080 Ti

Before we begin, a few words for those who are unfamiliar with the VR World website. You might remember a site called Bright Side of News (BSN). VR World is essentially BSN's new name; you can still find all BSN articles at VR World. Theo Valich is a known editor, and I don't think his latest post is just a rumor.

According to the article, VRW staff had the opportunity to hold a 'GP100/102-based GTX TITAN' in their hands just a few days ago. Although this card (TITAN) is not expected to hit the market before the GP102 Quadro implementation goes official, VRW actually gave us a date: August 17th to 21st, at Gamescom in Germany.


----------



## Yttrium

Quote:


> Originally Posted by *Nestala*
> 
> Quote:
> 
> 
> 
> Originally Posted by *guttheslayer*
> 
> Lol amd always late and underperform
> 
> 
> 
> "late and underperform" now you just sound like a Nvidia fanboy because you're judging a card that we have exactly 0 info about other the fact that it's gonna be called Vega, a flagship card and will have HBM2.
> 
> Also "late": this will be around the same time the Titan will arrive. No way they'll release it this year, as many others and me have said before.

Inb4 Nvidia releases their Titan with 2 cards in stock.

Nvidia has somehow managed to make a launch date subjective. Where I live I can get a 480 but not a 1080. It's insane. Irresponsible launch dates.


----------



## keikei

Quote:


> Originally Posted by *renejr902*
> 
> _videocardz said the source from vrworld are the truth, and its not a rumor:_
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Some fresh news about a new TITAN have just popped up at VR-World.
> GeForce GTX TITAN 'P', GTX 1080 Ti
> 
> Before we begin, few words those who are unfamiliar with VR-World website. You might remember a site called Bright Side of News (BSN). VR World is essentially BSN's new name, you can still find all BSN articles at VR World. Theo Valich is known editor and I don't think his latest post is just a rumor.
> 
> According to the article, VRW staff had the opportunity to hold 'GP100/102-based GTX TITAN' in their hands just few days ago. Although this card (TITAN) is not expected to hit the market before GP102 Quadro implementation goes official, VRW actually gave us a date, which is August 17th to 21st at Gamescom in Germany.


Until Nvidia confirms, the only truth is vrworld is getting a lot of clicks.


----------



## ZealotKi11er

Considering the price of the 1080 and these rumors, it can only mean one thing: the Titan is no longer $1K if it comes this soon.


----------



## guttheslayer

Quote:


> Originally Posted by *Nestala*
> 
> "late and underperform" now you just sound like a Nvidia fanboy because you're judging a card that we have exactly 0 info about other the fact that it's gonna be called Vega, a flagship card and will have HBM2.
> 
> Also "late": this will be around the same time the Titan will arrive. No way they'll release it this year, as many others and me have said before.


Oh, and so the RX 480, which is a month late, consumes as much power as a 1070, and yet still performs so much slower, is not underperforming?

I was never an Nvidia fanboy, but after seeing AMD disappoint us time and time again, I have no more trust in the red team.


----------



## EightDee8D

Quote:


> Originally Posted by *guttheslayer*
> 
> Oh and so the RX480 which is one month late, consume as much power as 1070 and yet still perform so much slower is not under performing?
> 
> I wasnt never an nvidia boy, but seeing how AMD had again time and time disappoint us, no more trust on the red.


The 480 wasn't late; they said Polaris would launch mid-2016 and it launched exactly mid-2016 (June 29). And late or not, what does that have to do with its TDP? Losing one argument, so you put up another? lol

It's not their fault you guys believed the hype. The performance is as expected (it's just 32 ROPs, brah), but perf/W is disappointing.


----------



## Waitng4realGPU

Quote:


> Originally Posted by *guttheslayer*
> 
> I wasnt never an nvidia boy, but seeing how AMD had again time and time disappoint us, no more trust on the red.


So you trust the company that brought you irresponsible prices and the 3.5GB 970?

Trusting any large company that is out to make money is foolish, they are just a business. Trust family and friends.


----------



## Nestala

Quote:


> Originally Posted by *guttheslayer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nestala*
> 
> "late and underperform" now you just sound like a Nvidia fanboy because you're judging a card that we have exactly 0 info about other the fact that it's gonna be called Vega, a flagship card and will have HBM2.
> 
> Also "late": this will be around the same time the Titan will arrive. No way they'll release it this year, as many others and me have said before.
> 
> 
> 
> Oh and so the RX480 which is one month late, consume as much power as 1070 and yet still perform so much slower is not under performing?
> 
> I wasnt never an nvidia boy, but seeing how AMD had again time and time disappoint us, no more trust on the red.
Click to expand...

1. It wasn't a month late.
2. RX 480 isn't trying to compete with 1070.


----------



## guttheslayer

Quote:


> Originally Posted by *Nestala*
> 
> 1. It wasn't a month late.
> 2. RX 480 isn't trying to compete with 1070.


Wasn't it a month late compared to Nvidia's new offerings?

The RX 480 doesn't need to compete with the 1070, but if it offered at least 980-level performance at 100W peak consumption, the ball game would have been completely different.

Look at the gap between the RX 480 and the 1080: how much will Vega need to catch up to match a 1080? Almost twice the performance, and that means twice the power consumption and twice the cores. To stand a chance against the Titan it has to be a 6000-core Vega. Do you seriously think AMD can pack 6000 cores into their Vega lineup?
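For what it's worth, the back-of-the-envelope math behind that claim can be sketched. The shader count is the RX 480's real spec, but the performance gaps and the assumption of perfectly linear scaling with core count are guesses, not measurements:

```python
# Naive linear-scaling estimate. Assumption: performance scales 1:1 with
# shader count at fixed clocks -- real GPUs never scale this cleanly.
RX480_SHADERS = 2304      # Polaris 10 (RX 480) shader count
GAP_TO_1080 = 1.9         # assumed RX 480 -> GTX 1080 performance gap
TITAN_OVER_1080 = 1.5     # rumored Titan P advantage over the GTX 1080

shaders_to_match_1080 = RX480_SHADERS * GAP_TO_1080
shaders_to_match_titan = shaders_to_match_1080 * TITAN_OVER_1080

print(round(shaders_to_match_1080))   # ~4378 shaders just to match a 1080
print(round(shaders_to_match_titan))  # ~6566 shaders to reach the rumored Titan
```

Under those assumptions you do land in the ~6000-core ballpark, so the estimate isn't crazy, just heavily dependent on the assumed gap.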


----------



## ZealotKi11er

Quote:


> Originally Posted by *guttheslayer*
> 
> It wasnt a month late compared to Nvidia new offering
> 
> RX 480 doesnt need to compete to 1070 but if it offer at least 980 performance at 100W peak consumption the ball game would have been completely different.
> 
> Look at the gap between RX 480 and 1080, how much Vega will need to catch up to match a 1080? Almost twice the performance, and that means twice the power consumption and cores. If they need to stand a chance against the Titan its has to be a 6000 cores Vega. Do you seriously think AMD can pack a 6000 cores in their Vega lineup?


8000 cores and crush Ngridia.


----------



## Cakewalk_S

Quote:


> Originally Posted by *guttheslayer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nestala*
> 
> 1. It wasn't a month late.
> 2. RX 480 isn't trying to compete with 1070.
> 
> 
> 
> It wasnt a month late compared to Nvidia new offering
> 
> RX 480 doesnt need to compete to 1070 but if it offer at least 980 performance at 100W peak consumption the ball game would have been completely different.
> 
> Look at the gap between RX 480 and 1080, how much Vega will need to catch up to match a 1080? Almost twice the performance, and that means twice the power consumption and cores. If they need to stand a chance against the Titan its has to be a 6000 cores Vega. Do you seriously think AMD can pack a 6000 cores in their Vega lineup?
Click to expand...

RX480 competes with the GTX970. It'll be maybe on the level of a 1060. Vega will likely be around 1070 levels IMHO....


----------



## Slomo4shO

Quote:


> Originally Posted by *magnek*
> 
> The only other scenario that makes sense is if nVidia is only releasing a Titan this year with the Ti variant at least 6 months away, and the Titan costs *at least* $1200+ to avoid cannibalizing 1080 sales.


I suppose it is plausible but I suspect that GP104 has better gross margins than GP102.
Quote:


> Originally Posted by *Lee Patekar*
> 
> but in all honesty AMD didn't have a hard counter for the 680 GTX when they launched big Kepler as a Titan.


The HD 7970 wasn't competitive against the GTX 680?




----------



## guttheslayer

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 8000 cores and crush Ngridia.


Oh with what TDP? 600W TDP with triple slot water block and dual pump with 4x140mm radiator?

Be real: at this rate, AMD will never match Nvidia's high-end offerings. And not to mention Vega is still 6 months away... that's GTX 1080 performance arriving 6 months late.


----------



## Waitng4realGPU

Quote:


> Originally Posted by *guttheslayer*
> 
> Oh with what TDP? 600W TDP with triple slot water block and dual pump with 4x140mm radiator?
> 
> Be real, at this rate, AMD will never match the high end offering that Nvidia has. Oh not to mentioned Vega is still 6 months away... That is GTX 1080 performance level being 6 month late.


They seem to be only able to match the tier below the top at this stage. Then again they have an architecture that will age well, so hopefully DX12 can level the playing field at least slightly.

I'll be happy if they can put up good competition with Vega so at least the 1080-priced tier comes down. I sure wouldn't pay the price for an FE 1080 right now; they're going to depreciate very quickly, I imagine.

On the flip side to that if you need the performance for 4K etc then what else is there to buy?


----------



## EightDee8D

Quote:


> Originally Posted by *guttheslayer*
> 
> Oh with what TDP? 600W TDP with triple slot water block and dual pump with 4x140mm radiator?
> 
> Be real, at this rate, AMD will never match the high end offering that Nvidia has. Oh not to mentioned Vega is still 6 months away... That is GTX 1080 performance level being 6 month late.


You can buy a GTX 1080 and enjoy it instead of waiting for Vega; nobody is stopping you.

Or you want Vega released now so NV lowers the 1080's price and you can buy one. If that's the case, I wish they'd never release Vega and Nvidia charged even more for its next GPUs. You people (not you specifically) would really deserve it.


----------



## rcfc89

Quote:


> Originally Posted by *ChevChelios*
> 
> at the Scorpio reveal at E3 they were talking about 4K native and 60 fps was mentioned


The only game that a system that weak will be running at 60fps in 4k is "Frogger."

It will take a machine way more than 4x as powerful as the Xbone to run popular titles at 4K 60fps, considering the current Xbone struggles to achieve 720p 60fps. 4K is roughly 9x that resolution in raw pixels.
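For reference, the pixel arithmetic (standard 4K UHD vs. 720p dimensions) works out to a 9x ratio:

```python
# Raw pixel counts: 4K UHD vs. 720p
uhd_pixels = 3840 * 2160    # 8,294,400
hd720_pixels = 1280 * 720   #   921,600

print(uhd_pixels / hd720_pixels)  # 9.0 -- 4K pushes 9x the pixels of 720p
```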
Quote:


> Originally Posted by *Baasha*
> 
> well.. this is good news. Having run the GTX 1080 in SLI for a few weeks now at 5K (and every other resolution below), I cannot wait for the new Titan to drop.
> 
> The 1080 SLI perform tremendously well at 5K in most games which is phenomenal to say the least. I never expected 2 GPUs to outperform 4x GPUs, and that too the Titan X, from the previous generation (sans a really small sample of games).
> 
> The real question, however is, will the new Titan P or whatever it's going to be called, support 3-Way and 4-Way SLI for gaming? I really hope Dell releases the 4K OLED 120Hz panels so that I can get three of those for 4K Surround with 4x 16GB Titan XX (?) and blast!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here's a quickie on the GTX 1080 in 5K playing some BF4:


Damn I'm really thinking about picking up a 5K display after watching this video. The vibrancy and colors are insane. Thanks for sharing.


----------



## guttheslayer

Quote:


> Originally Posted by *Waitng4realGPU*
> 
> They seem to be only able to match the tier below the top at this stage. Then again they have an architecture that will age well, so hopefully DX12 can level the playing field at least slightly.
> 
> I'll be happy if they can get good competition with vega so at least the 1080 priced tier comes down. I sure wouldn't pay the price for an FE 1080 right now they are going to depreciate very quickly I imagine.
> 
> On the flip side to that if you need the performance for 4K etc then what else is there to buy?


That's the problem, you see. I'm not being an Nvidia fanboy; I'm pissed at AMD not being able to compete, be it midrange or high end (GTX 1060 faster than the RX 480?).

And because of AMD's incompetence, Nvidia has every reason to price sky-high, just like Intel does with Broadwell-E. I can be angry at Nvidia's insane pricing, but at the end of the day I know the culprit behind all these high prices: it's called a monopoly.

And the monopoly will continue until AMD gets bought out by some major player.


----------



## guttheslayer

Quote:


> Originally Posted by *EightDee8D*
> 
> You can buy gtx1080 and enjoy it instead of waiting for vega. nobody is stopping you.
> 
> or you want vega to be released now so nv can lower 1080's price so that you can buy it. if that's the case i wish they never release vega. and nvidia charges even more for next gpus. you people ( not you) really deserve it.


Oh, it doesn't matter whether Vega gets released or not; just look back at the Fury and Fury X pricing scheme.

It's not going to affect the price Nvidia sets anyway, not with this kind of performance, unless Vega lands below $400. Since we know that isn't going to happen, the Titan will sit happily at $1000 to $1500, and over the next few years creep toward the $2K region.


----------



## Waitng4realGPU

Quote:


> Originally Posted by *guttheslayer*
> 
> And monopoly will continue until AMD get bought over by some major player.


Heard this before. Good luck to a new company designing something to compete with Nvidia, which has been designing high-end GPUs forever.

You can't make a competitive GPU architecture overnight, lol.

Better yet, take over AMD and scrap GCN to compete again; maybe in 5-10 years' time, once the new tech has been developed.

It's a pipe dream.

Better put your faith in all games using Async compute and the playing field being level in the next year or two. Much more likely than the takeover scenario.
Quote:


> Originally Posted by *guttheslayer*
> 
> Its not going to affect the price set by nvidia anyway with this kinda performance unless Vega is below $400. Since we know its ain't going to happen Titan will be happily above $1000 to $1500, and next few years creep to 2K region.


Well, you don't need to be a blind consumer and buy the top card when it comes out; wait six months to a year and get something more reasonable.

Nobody is making you buy cutting-edge resolutions and tech. It's a luxury, and these companies spend billions on R&D, so they've gotta make big bucks to continue.


----------



## guttheslayer

Quote:


> Originally Posted by *Waitng4realGPU*
> 
> Heard this before, good luck to a new company designing something to compete with Nvidia which has been around for ever designing high end GPUs.
> 
> You can't make a competitive GPU architecture overnight lol.
> 
> Better yet a takeover of AMD and scrap GCN to compete again, maybe in 5-10 years time when their new tech has been developed.
> 
> It's a pipe dream.
> 
> Better put your faith in all games using Async compute and the playing field being level in the next year or two. Much more likely than the takeover scenario.


Getting bought out doesn't mean throwing away everything AMD has been building. It means injecting *TONS OF FUNDING* into their R&D and recruiting more skilled engineers to improve the architecture.

If AMD had funding that could take them to MARS, do you think they would perform this badly?


----------



## Waitng4realGPU

Quote:


> Originally Posted by *guttheslayer*
> 
> Getting bought over doesn't means throwing everything AMD has been building. It is to inject *TONS OF FUNDING* to their R&D and recruit more skilled engineer to improve the architecture.
> 
> If AMD has the same funding that could take them to MARS, do you think they will perform so badly?


You don't know anything about GCN if you think throwing money and engineers at it will make it as fast as Pascal.

It's a different architecture with different strengths, and different limitations.

AMD have a great GPU architecture, and games are swinging toward utilising it, Nvidia also have a great architecture though and they are winning at the moment.

Plus, let's look at it this way: put TONNES of funding in and maybe you gain 10% performance on your flagship; market share might gain 10%. Then what? You've spent TONNES of money and now whoever took it over is in major debt.

Same downward spiral.


----------



## Lee Patekar

Quote:


> Originally Posted by *ChevChelios*
> 
> The HD 7970 wasn't competitive against the GTX 680?


No, it wasn't. AMD had to relaunch the 7970 as the GHz Edition and tweak the drivers for over a year. I own a 7970 and I saw its performance relative to the 680 only increase, and then surpass it. But at the time AMD wasn't competitive against the 680, which is why Nvidia launched GK104 with 680 branding instead of 670 branding. That's also when Nvidia started selling mainstream products (~300 mm^2) at enthusiast prices. Money money money.

This covers the launch of the 7970 GHz Edition... to match the 680, or try to. (http://www.anandtech.com/show/6025/radeon-hd-7970-ghz-edition-review-catching-up-to-gtx-680)

The Titan launch wasn't a response to outside forces... it was actually the lack of competition that allowed Nvidia to launch GK110 as a luxury item instead of branding it a GTX 680 like they did with the GTX 580.

This year's Titan launch will be similar. There is no competition (yet), but they can milk the crowd with a GP102-based Titan while waiting for AMD to launch Vega. Then they'll launch the 1080 Ti to compete directly with Vega. Money money money, it's what businesses are about.


----------



## Fancykiller65

Quote:


> Originally Posted by *Waitng4realGPU*
> 
> You don't know anything about GCN if you think throwing money and engineers at it will make it as fast as Pascal.
> 
> It's a different architecture with different strengths, and different limitations.
> 
> AMD have a great GPU architecture, and games are swinging toward utilising it, Nvidia also have a great architecture though and they are winning at the moment.


There is a chance that any company buying AMD might choose not to bother competing with Nvidia. On that note, is the GPU division even profitable?


----------



## Lee Patekar

Quote:


> Originally Posted by *Fancykiller65*
> 
> There is a chance that any company buying AMD might choose to not bother with competing with Nvidia. On that note is the gpu division profitable?


They're the only ones able to offer custom designs with enough CPU and GPU horsepower to dominate the console market. Why would anyone scrap an integral and profitable part of the business? AMD never really competed with nVidia on raw performance... they always hit the sweet price/perf ratio.


----------



## Waitng4realGPU

Quote:


> Originally Posted by *Fancykiller65*
> 
> There is a chance that any company buying AMD might choose to not bother with competing with Nvidia. On that note is the gpu division profitable?


Doubt it's profitable although it might be in a year or two.


----------



## Kiros

I wonder why the TDP is 300-375W when the Titan X is 250W. I'd have imagined Pascal + HBM2 would bring it down to 200W for the Titan P.


----------



## kaosstar

Quote:


> Originally Posted by *Waitng4realGPU*
> 
> Heard this before, good luck to a new company designing something to compete with Nvidia which has been around for ever designing high end GPUs.
> 
> You can't make a competitive GPU architecture overnight lol.
> 
> Better yet a takeover of AMD and scrap GCN to compete again, maybe in 5-10 years time when their new tech has been developed.
> 
> It's a pipe dream.
> 
> Better put your faith in all games using Async compute and the playing field being level in the next year or two. Much more likely than the takeover scenario.
> Well you don't need to be a blind consumer and buy the top card when it comes out, wait six months to a year and get something more reasonable.
> 
> Nobody is making you have to buy cutting edge resolutions and tech, it's a luxury and these companies spend billions in R+D, so they gotta make big bucks to continue.


ATI was designing high end GPUs before Nvidia was even a glimmer in Jen-Hsun's eye. They just need more money. AMD was competitive in graphics for quite a while. Their CPU division has been hemorrhaging money for so long, they have no $$$ left for GPU R&D. If a big player with deep pockets bought them, they could be competitive at the high end within 2 years.


----------



## iLeakStuff

Let's see:
Nvidia out with high end, ultra high end and midrange in August.

AMD is out with midrange.

GG AMD.
Hope Zen works out better for them than this "always late to the party" strategy they have with GPUs.


----------



## ref

I can see an announcement in August with release a few months later (Late October, Early November)

Releasing in August does seem way too soon, but you never know.


----------



## DNMock

Quote:


> Originally Posted by *renejr902*
> 
> videocardz said the source from vrworld are the truth, and its not a rumor:
> 
> Some fresh news about a new TITAN have just popped up at VR-World.
> *GeForce GTX TITAN 'P'*, GTX 1080 Ti
> 
> Before we begin, few words those who are unfamiliar with VR-World website. You might remember a site called Bright Side of News (BSN). VR World is essentially BSN's new name, you can still find all BSN articles at VR World. Theo Valich is known editor and I don't think his latest post is just a rumor.
> 
> According to the article, VRW staff had the opportunity to hold 'GP100/102-based GTX TITAN' in their hands just few days ago. Although this card (TITAN) is not expected to hit the market before GP102 Quadro implementation goes official, VRW actually gave us a date, which is August 17th to 21st at Gamescom in Germany.


Bam, I called it that it would go by that name a good 6 months ago. +1 internetz to me!


----------



## Waitng4realGPU

Quote:


> Originally Posted by *kaosstar*
> 
> ATI was designing high end GPUs before Nvidia was even a glimmer in Jen-Hsun's eye. They just need more money. AMD was competitive in graphics for quite a while. Their CPU division has been hemorrhaging money for so long, they have no $$$ left for GPU R&D. If a big player with deep pockets bought them, they could be competitive at the high end within 2 years.


I'm not too sure about that. They've put a lot into GCN and it could pay off in the next few years. I could be wrong, but pouring money in always hits diminishing returns; look at Intel's CPU gains, there's only so far they can get with billions in R&D.

I think big players would be scared to tackle such a risky venture.
Quote:


> Originally Posted by *iLeakStuff*
> 
> Lets see:
> Nvidia out with high end, ultra high end and midrange in August.
> 
> AMD is out with midrange.
> 
> GG AMD.
> Hope Zen works out better for them than this "always late to the party" strategy they have with GPUs.


But it's cool to arrive late to the party. Oh wait this is the tech business.


----------



## iLeakStuff

Quote:


> Originally Posted by *ref*
> 
> I can see an announcement in August with release a few months later (Late October, Early November)
> 
> Releasing in August does seem way too soon, but you never know.


Could be October release for Titan and Q2-Q3 2017 for 1080Ti maybe


----------



## magnek

Quote:


> Originally Posted by *DNMock*
> 
> Bam, I called it that it would go by that name a good 6 months ago. +1 internetz to me!


Prepare to hear from nVidia's counsel for trademark infringement.


----------



## mothergoose729

A single GTX 1080 is not quite comfortable at 4k with max settings for newer games. Even a modest 25% increase in performance would push it over the top.

I am not looking forward to the hefty price tag though. Going to be pretty hard to convince the wife I need this card...


----------



## ZealotKi11er

Quote:


> Originally Posted by *mothergoose729*
> 
> A single GTX 1080 is not quite comfortable at 4k with max settings for newer games. Even a modest 25% increase in performance would push it over the top.
> 
> I am not looking forward to the hefty price tag though. Going to be pretty hard to convince the wife I need this card...


Yeah, the 1080 is not enough even OCed. I'm not spending more than $700 on a GPU, so whichever brand makes a card that can hit 60 fps at 4K will get my money.


----------



## iLeakStuff

Quote:


> Originally Posted by *mothergoose729*
> 
> A single GTX 1080 is not quite comfortable at 4k with max settings for newer games. Even a modest 25% increase in performance would push it over the top.
> 
> I am not looking forward to the hefty price tag though. Going to be pretty hard to convince the wife I need this card...


Since GP104 is now priced $100-$150 above earlier midrange chips, I bet one can expect $1199 or so for the Titan,
and $899 for the 1080 Ti in 2017.


----------



## Exeed Orbit

Quote:


> Originally Posted by *iLeakStuff*
> 
> Since GP104 is now $100-$150 above earlier midrange chips, I bet one can expect $1199 or something for the Titan.
> And $899 for the 1080Ti in 2017


Which is absolutely absurd, if you ask me. I'm hoping these prices eventually reach a point where they start shrinking the dGPU market. I'd much rather pay $500 a year for an entirely new console refresh than have to pay $700 for a mid-range card, let alone $900-1200 for enthusiast-grade cards.


----------



## DNMock

If I'm reading this right, the Titan P will have 16GB of HBM2, a few more cores as usual, and dual 8-pin power connectors, while the 1080 Ti will have 12GB of HBM2, a few fewer cores, and a single 8-pin plus 6-pin power connector.

Also, thanks in large part to the awful SLI scaling, a single Titan P will perform on par with SLI Titan-X cards.

That about sum everything up?


----------



## magnek

The Titan P will also cost you your left eye, right hand and right thigh.

But yeah other than that all good.


----------



## rcfc89

Quote:


> Originally Posted by *DNMock*
> 
> If I'm reading this right, the Titan P will have 16gb of HBM2 and a few more cores like usual and dual 8 pin power connectors while the 1080ti will have 12gb HBM2, a few less cores, and a single 8 and 6 pin power connector.
> 
> Also, thanks in large part to the *awful SLI scaling*, a single Titan P will perform on par with SLI Titan-X cards.
> 
> That about sum everything up?


That's odd, because I'm getting nearly double the frames in the more popular titles that I play. Even in some older games I'm getting a 65-75% increase. Not what I would call "awful." Maybe when you get into 3- or 4-way SLI. 2-way has been excellent in my experience.


----------



## ChevChelios

The Titan P could be like 75-80% faster than a stock Titan X if it's indeed 1.5x faster than the 1080.

At that point it would beat Titan X SLI even *with* good gaming scaling.
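That estimate is just the rumored ratios multiplied together. The 1080-over-Titan X figure below is an assumption (reviews put the stock gap anywhere from roughly 20% to 30% depending on the game), so the result shifts with it:

```python
# Chained relative-performance estimate; every ratio here is a rumor or guess.
titan_x = 1.00              # baseline: stock Titan X (Maxwell)
gtx_1080 = titan_x * 1.20   # assume a stock 1080 is ~20% faster than Titan X
titan_p = gtx_1080 * 1.50   # rumor: Titan P is 1.5x a GTX 1080

print(f"{(titan_p - titan_x) * 100:.0f}% faster than a Titan X")  # 80% faster
```

Plug in 1.30 for the 1080 instead and you get 95%, so the 75-80% figure implies a fairly conservative 1080-vs-Titan X gap.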


----------



## renejr902

Quote:


> Originally Posted by *guttheslayer*
> 
> Oh with what TDP? 600W TDP with triple slot water block and dual pump with 4x140mm radiator?
> 
> Be real, at this rate, AMD will never match the high end offering that Nvidia has. Oh not to mentioned Vega is still 6 months away... That is GTX 1080 performance level being 6 month late.


Oh, with what TDP? 600W with a triple-slot water block and dual pumps on a 4x140mm radiator? I WOULD LIKE TO BUY THIS CARD.


----------



## renejr902

Quote:


> Originally Posted by *rcfc89*
> 
> The only game that a system that weak will be running at 60fps in 4k is "Frogger."
> 
> It will take a machine that's way more the 4x as powerful as the Xbone to run popular titles at 4k 60fps considering the current Xbone struggles to achieve 720p 60fps. 4k is basically 8x that resolution.
> Damn I'm really thinking about picking up a 5K display after watching this video. The vibrancy and colors are insane. Thanks for sharing.


Xbox Scorpio will be great for 4K at 30fps; it's still better than nothing. All games should be able to do 4K at 30fps, and a few of them will manage 60fps.


----------



## Celcius

This will be the card worth upgrading for! I can't wait to see how much faster it is than my GTX 780 Ti... the GTX 1080 is already literally twice as fast.


----------



## renejr902

Quote:


> Originally Posted by *guttheslayer*
> 
> That is the problem you see, I am not being an nvidia fanboy, I am being pissed at AMD not able to compete be it midrange or high end (GTX 1060 faster than RX480?)
> 
> And becz of AMD incompetency, Nvidia has every reason to price it sky high, just like how Intel does with their BWE. I can be angry at insane pricing of Nvidia, but at the end of the day I know who is the culprit for all this high price, its called monopoly.
> 
> And monopoly will continue until AMD get bought over by some major player.


I agree with him. Still, AMD doesn't need to be bought; it needs to wake up. Anyway, they're still doing a great job with the consoles. I'm not an Nvidia fanatic!!! It's just the facts. In the past I bought plenty of Radeon cards: ATI Rage Pro 8MB, ATI Rage Fury Pro 128MB, Radeon 7500, Radeon 9600, Radeon 9700 Pro, X1950 Pro.

After that I always bought Nvidia cards, though there were times I hesitated over a Radeon. This is the GeForce list I bought: GeForce 256, GeForce 2 Ultra DDR, Ti 4200 64MB, GeForce 6800 GT, GTX 280, then I took a break for several years... and went from the GTX 280 to a GeForce 960, to a 980 Ti, to a 970, to a 1070.
Other cards: Diamond Stealth II 4MB (I loved this card), Voodoo 1, Voodoo Banshee 16MB, Voodoo 2.
I also bought a 512KB SVGA card (can't remember the name) and a 2MB card after that; 256KB video cards always sucked, and several DOS games didn't work with them.


----------



## renejr902

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah. 1080 is not enough even OCed. I am not spending more than $700 on a GPU so which ever brand makes a card that can hit 60 fps at 4K will get my money.


Do you think the future GTX 1080 Kingpin edition will be enough for 4K at 60fps? Maybe the Kingpin will have a custom BIOS and allow much faster clock speeds.


----------



## BiG StroOnZ

Quote:


> Originally Posted by *Eorzean*
> 
> Hopefully Vega does come out in October and drives prices down across the board. What I really hope for is AMD to release something better than a 1080, at a better price... I'd jump ship in a heartbeat. I've been waiting for that opportunity for a long time though.
> LMAO. That pic brings "integrated" to a whole new level. Nvidia in the year 2027 2019.


The 490 will probably be the card that will be competing with the 1080. But that is only one SKU. So I imagine AMD would be looking to release Fury replacements as well.


----------



## Darkpriest667

Quote:


> Originally Posted by *renejr902*
> 
> i agree with him. Still AMD dont need to be bought , it need to wake up. Anyway they still doing a great job with console. im not a nvidia fanatic !!! its just the fact. in the past i bought at least 5 radeon card: ati rage pro 8mb, ati rage fury pro 128mb, radeon 7500, radoen 9600, radeon 9700pro, x1950 pro
> 
> . After that i always bought nvidia card, but it were some time i hesitate for a radeon card. This is my Geforce list i bought: Geforce 256, geforce 2 ultra DDR, ti4200 64mb, geforce 6800 gt, geforce 280 gtx, i took a break several years... and goes from 280 gtx to geforce 960 to geforce 980 ti to geforce 970 to geforce 1070.
> others cards: diamond stealth ii 4mb ( i loved this card) , voodoo1, voodoo banshee 16mb, voodoo 2.
> i bought a 512kb video card for svga too, cant remember the name and a 2mb card too after that, 256kb videocard always suck, several dos games didnt work with them


They do need to be bought, preferably by someone with their own fabs, like Samsung. They don't have the R&D budget to compete with Intel... on top of competing with Nvidia. Samsung would give them two things: 1) brand-name recognition, since there isn't a person on this planet who doesn't know who they are, and 2) a huge infusion of engineering resources. They can lease the x86 license.


----------



## magnek

Yeah but then we'd get GPUs that slowly degraded in performance over time just like the 840 Evo did.


----------



## ToTheSun!

Quote:


> Originally Posted by *magnek*
> 
> Yeah but then we'd get GPUs that slowly degraded in performance over time just like the 840 Evo did.


So, Nvidia would finally have competition?


----------



## magnek

Quote:


> Originally Posted by *ToTheSun!*
> 
> So, Nvidia would finally have competition?


----------



## Mhill2029

I'm a little perplexed by the idea of a 12" GPU with power connectors on the FRONT of the card. That's going to make cable management a serious PITA for sure.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *guttheslayer*
> 
> Again how do they cut the 1080 off when they are more than 2x the price apart.
> 
> It actually Nvidia coming out with mid-range card and den high end in just 3 month. AMD is doing the same, but becz their gpu are way behind Nvidia, they have to price at 1/3.


I would hope you're not intimating that a $700 GTX 1080 is mid-range?!?! The x80 cards have always been the interim flagships that precede the big-chip Titans, at least since 2012. There's no reason for Nvidia to change this dynamic now, especially when AMD isn't competing at the high end at all. There's no guarantee that Vega will even offer better performance than the 1080, so it would be an absolute waste to rush out a Titan card now, just months after the 1080 launched, only to compete with themselves. They will wait to see how Vega does, and in the unlikely event that it beats the 1080 handily, they can then drop the new Titan to reclaim the crown. This is how Nvidia has operated for donkey's years now...


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Nestala*
> 
> "late and underperform" now you just sound like a Nvidia fanboy because you're judging a card that we have exactly 0 info about other the fact that it's gonna be called Vega, a flagship card and will have HBM2.
> 
> Also "late": this will be around the same time the Titan will arrive. No way they'll release it this year, as many others and me have said before.


I don't like it any more than you do, but he is just telling it like it is. AMD has been late and under-performing for years now, and that doesn't look to be changing anytime soon, unfortunately. There could be any number of reasons for this (likely a combination of being tied to GCN due to consoles and a lack of capital for extensive R&D), but the fact remains. It's gotten so bad at the high end that AMD is not competing at all, ceding that entire market to Nvidia for all of 2016 (or so it would seem). I do believe they are on the right track to profitability with the Polaris strategy, but it's going to be a looooooong time before we see parity at the flagship level between these two companies (unless Vega performs far beyond current expectations).
Quote:


> Originally Posted by *EightDee8D*
> 
> 480 wasn't late, they said polaris will launch mid 2016 and it launched on exact mid 2016 (29 june). and late or not what does it have to do with it's tdp ? losing an argument so put another ? lol
> 
> not their fault you guys believe in hype. performance is expected ( it's just 32rops brah) but p/w is disappointing.


I am only talking about the high end here. The 480 was not late by any stretch (as they actually beat Nvidia to the market segment they were targeting) and the performance is fantastic for that market and price. Efficiency is disappointing but raw performance is right where they said it would be (hype train notwithstanding) and remains far above anything else currently available in the class...


----------



## kingduqc

Does AMD have anything that can actually compete? I don't see Vega being that much faster than Polaris, and the core count won't grow much past 4K...


----------



## Asmodian

Quote:


> Originally Posted by *kingduqc*
> 
> Does AMD has anything that can actually compete? I dont see vega be that much faster then polaris and the core count wont grow to much over 4k..


Vega 11 is expected to be over 6K cores, so we might see something from AMD to compete with GP100/102.


----------



## ZealotKi11er

Quote:


> Originally Posted by *kingduqc*
> 
> Does AMD has anything that can actually compete? I dont see vega be that much faster then polaris and the core count wont grow to much over 4k..


That is what people said when the OG Titan first came out. Also, if this Titan is more than $1000, there is no reason for AMD to compete in that segment; it's only for bragging rights.


----------



## rcfc89

Quote:


> Originally Posted by *renejr902*
> 
> do you think the future 1080 gtx Kingpin edition will be enough for 4k at 60fps. Maybe kingpin will have a custom bios and allow much faster clock speed


To do 4K on Ultra, even without AA, you will likely still need two GPUs to maintain 60fps in major titles. That's my goal anyway when I upgrade to a 4K display very soon. Two highly clocked 1080s or a pair of the upcoming Titans would be perfect.


----------



## Asmodian

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What is what people said when Titan OG first came out. Also if this Titan is more then $1000 there is no reason for AMD to compete at that segment. That's only for bragging rights.


Bragging rights do seem to matter, though. When you have the fastest single-GPU card, your midrange seems to sell better, even at price points where you are slower than the competition.


----------



## kaosstar

Quote:


> Originally Posted by *renejr902*
> 
> i agree with him. Still AMD dont need to be bought , it need to wake up. Anyway they still doing a great job with console. im not a nvidia fanatic !!! its just the fact. in the past i bought at least 5 radeon card: ati rage pro 8mb, ati rage fury pro 128mb, radeon 7500, radoen 9600, radeon 9700pro, x1950 pro
> 
> . After that i always bought nvidia card, but it were some time i hesitate for a radeon card. This is my Geforce list i bought: Geforce 256, geforce 2 ultra DDR, ti4200 64mb, geforce 6800 gt, geforce 280 gtx, i took a break several years... and goes from 280 gtx to geforce 960 to geforce 980 ti to geforce 970 to geforce 1070.
> others cards: diamond stealth ii 4mb ( i loved this card) , voodoo1, voodoo banshee 16mb, voodoo 2.
> i bought a 512kb video card for svga too, cant remember the name and a 2mb card too after that, 256kb videocard always suck, several dos games didnt work with them


The Rage Pro 8MB was the first discrete GPU I ever bought. It was around $100 which is a lot when you're a kid.


----------



## kaosstar

Quote:


> Originally Posted by *Asmodian*
> 
> Bragging rights do seem to matter though, when you have the fastest single GPU card your midrange seems to sell better even when at that price point you are slower than the competition.


That's pretty much the only explanation for why the 960 sold at all.


----------



## Clocknut

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I would hope that you are not intimating that a $700 GTX 1080 is mid-range?!?! The X80 cards are always the in-between flagships between big-chip Titans, at least since 2012. There's no reason for Nvidia to change this dynamic now, especially when AMD is not even competing at the high end at all. There's no guarantee that Vega will even offer better performance than the 1080 so it would be an absolute waste to rush out a Titan card now just months after the 1080 launched only to compete with themselves! They will wait to see how Vega does, and in the unlikely event that it beats the 1080 handily, can then drop the new Titan to reclaim the crown. This is how Nvidia has operated for donkey's years now...


They still have to put a Titan out several months ahead of the actual Vega launch, so they can charge an arm and a leg for it.

The reason being that to kill Vega, they need the 1080 Ti to release around Vega's launch date. IMO, having the 1080 Ti launch too close to the Titan will cannibalize the Titan's rip-off sales.


----------



## Xuvial

Quote:


> Originally Posted by *Clocknut*
> 
> Having 1080Ti launch too close to TITAN will cannibalize titan's rip off sales.


From what I understand, the Titan isn't even aimed at the same consumer base as the x80 Ti. Yes, the Titan can be used for gaming, but it's a compute/CUDA card more than anything else, and it costs $1000+.

I mean, no gamer is seriously buying Titans just for gaming, right? Right?? Why the heck would anyone do that?


----------



## Asmodian

Quote:


> Originally Posted by *kaosstar*
> 
> That's pretty much the only explanation for why the 960 sold at all.


True for performance, but its HEVC hardware decode was great for HTPC builds.








Quote:


> Originally Posted by *Clocknut*
> 
> They still have to put a TITAN out several months ahead of actual vega launch, so they can charge arms & leg for a TITAN.
> 
> Reason being that to kill Vega, they need 1080Ti release around vega launch date. IMO Having 1080Ti launch too close to TITAN will cannibalize titan's rip off sales.


Being able to charge more to the people who are willing to pay more while also selling basically the same product to the people who are not willing to pay more is one of the capitalist's holy grails. Releasing the Titan "deluxe" version a few months early is required to make it work. Nvidia and their partners get the extra cash from those who are willing to pay more to buy early while still selling large die GPUs in reasonable volumes to those who aren't willing to spend that much. It is hard to get good product differentiators that do not add to the BOM but Nvidia really doesn't want anyone who is willing to pay $1000 to only pay $750.


----------



## Asmodian

Quote:


> Originally Posted by *Xuvial*
> 
> From what I understood Titan isn't even aimed at the same consumer base as the x80 Ti. Yes Titan can be used for gaming, but it's a compute/cuda card more than anything else and it costs $1000+.
> 
> I mean no gamer is seriously buying Titans just for gaming right? Right?? Why the heck would anyone do that?


The Titan X was pretty much exactly a 980 Ti with double the memory. I suppose they sold some for scientific applications that needed a lot of memory without paying for a Quadro M6000, but both Titans were mostly sold to gamers.

Titans are gaming cards designed to sell to those who do not mind paying 50% more to get 5% more. This is a consumer group that is often catered to, for obvious reasons. Have you looked at the price of a first class airline ticket? I will happily drop a grand or two extra to get the best computer parts but I will not spend THAT much to have a more comfortable flight to Beijing.

Compared to all the random luxuries humans spend vast sums on a slightly overpriced "deluxe" video card is a very rational use of funds.


----------



## ref

Quote:


> Originally Posted by *Xuvial*
> 
> From what I understood Titan isn't even aimed at the same consumer base as the x80 Ti. Yes Titan can be used for gaming, but it's a compute/cuda card more than anything else and it costs $1000+.
> 
> I mean no gamer is seriously buying Titans just for gaming right? Right?? Why the heck would anyone do that?


I will be.

The reasons being: regardless of the price-to-performance ratio compared to the Tis, Titans are still the fastest cards out, and I have disposable income. I want to upgrade to a big chip from my 980s and don't want to wait who knows how long for the Tis.

I upgrade every 2-3 years so I want to have the best available at the time of upgrading.


----------



## Seyumi

Quote:


> Originally Posted by *Clocknut*
> 
> They still have to put a TITAN out several months ahead of actual vega launch, so they can charge arms & leg for a TITAN.
> 
> Reason being that to kill Vega, they need 1080Ti release around vega launch date. IMO Having 1080Ti launch too close to TITAN will cannibalize titan's rip off sales.


I think you have a fairly good point. If I were Nvidia, I would release the Titan P ASAP (Summer/Fall 2016) to cash in on the probably $1200-$1500 GPU sales, then release the more rational 1080 Ti to compete with (and most likely beat) Vega 10 in Spring 2017.

Quote:


> Originally Posted by *Xuvial*
> 
> From what I understood Titan isn't even aimed at the same consumer base as the x80 Ti. Yes Titan can be used for gaming, but it's a compute/cuda card more than anything else and it costs $1000+.
> 
> I mean no gamer is seriously buying Titans just for gaming right? Right?? Why the heck would anyone do that?


Not so good a point. The current generation of Titan cards doesn't do squat with compute/CUDA; it's literally just a beefed-up version of the 980 Ti, a 100% gaming card. The first Titan, not so much. And if you're wondering who buys Titan Xs for gaming, it's this guy. I needed four of them in SLI to maintain a 60FPS minimum in modern games at 4K. I'm happy with the move to 2-way-SLI-only, as I was able to cut my system's cost in half while keeping almost the same performance (more once the Titan P gets released for SLI).


----------



## guttheslayer

That is precisely the point. The advantage of releasing the Titan ASAP is to make potential 1080 Ti buyers simply jump to the big gun. What if I told you that when the Titan releases, there will be no 1080 Ti within the next year? People will have no choice but to splurge on the Titan or settle for an inferior 1080.

A lot of people have been commenting about market saturation, but I can tell you not everyone who bought a 1080 will dump their card for a Titan that is twice as expensive; they will wait for the 1080 Ti. But when you tell the world there is no 1080 Ti for at least the next six months and quickly release the Titan at an even higher price than usual, you have a win on profits in every situation.

That is how Nvidia wants to milk the market: kill the 1080 Ti and force people to jump to the Titan. The 1080 Ti will only appear if AMD has something to compete.


----------



## SpeedyVT

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> NVIDIA is all about making money, if that means two Titans with different core amounts, memory sizes, and different prices by all means they will do it. NVIDIA has had some time to plan out how they want to do Big Die Pascal.
> 
> With Big Die Maxwell we only saw two GTX SKUs (980 Ti and Titan X).
> 
> Yet with Big Die Kepler we saw four (780, Titan, 780 Ti, Titan Black). Again with different Cuda Core counts and memory sizes.
> 
> NVIDIA could easily do the following:
> 
> GP102 Titan 16GB HBM - $1200 (August) - 50% over 1080
> GP102 Titan 12GB HBM (slightly cut down) - $1000 (August) - 40% over 1080
> GP102 1080 Ti 8GB G5X (even more cut down) - $800 (November) - 30% over 1080


They'll charge a lot more I'm certain of it.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *guttheslayer*
> 
> That is precisely the point. The advantage of releasing titan asap is to make potential 1080 ti players simply jump on the big gun. What if i tell u when titan release and there is no 1080 ti within the next one year? Ppl will have no choice but to wack titan or settle down with an inferior 1080.
> 
> Alot of ppl been commenting about market saturation but i can tell u not everyone who bought 1080 will dump their card for titan that is twice as exp. they will wait for 1080 ti. But when u tell the world there is no 1080 ti for the next 6 months at least and quickly release the titan that is even more exp than the usual price, u have a win in all situation on profits.
> 
> That is how nvidia want to milk the market. Kills the 1080 ti and force ppl to jump to titan. 1080 ti will only be available if amd had smth to compete.


All I'm saying is that if they did as you suggest and release the new Titan within the next month it would be contrary to 4+ years of Nvidia business practices. They have never launched a Titan card this close to the launch of an X80 card before. They already hold a completely uncontested GPU crown and there is simply no need to up the ante right now (especially considering the fact that they are likely going to be stuck on 16nm for a while and would like to stretch out the performance gains for as long as possible). I think you guys are just confusing your hopes for what will happen with what 4 years of a successful business model tells us will happen. But who knows?


----------



## guttheslayer

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> All I'm saying is that if they did as you suggest and release the new Titan within the next month it would be contrary to 4+ years of Nvidia business practices. They have never launched a Titan card this close to the launch of an X80 card before. They already hold a completely uncontested GPU crown and there is simply no need to up the ante right now (especially considering the fact that they are likely going to be stuck on 16nm for a while and would like to stretch out the performance gains for as long as possible). I think you guys are just confusing your hopes for what will happen with what 4 years of a successful business model tells us will happen. But who knows?


Those four years of business were based on 28nm alone, which was an accident: no one predicted that the 20nm SoC processes would turn out to be a failure.

If progress on 10nm FinFET is good, we might see the old two-year cadence return. Nvidia is already aiming for Volta within a year of Pascal's launch.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *guttheslayer*
> 
> That 4 years of business was based on just 28nm which was an accidental since no one predicted the 20nm soc would turn out to be a failure.
> 
> If progress for 10nm ff is good, we might see the old 2 years model going back. Nvidia is already aiming for volta within a year of pascal launch.


I have a feeling we're gonna be on this node for at least as long as 28nm. The end is fast approaching in terms of node shrinks, so they can't just rush through them and be left with nowhere to go in 5 years...


----------



## guttheslayer

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I have a feeling we're gonna be on this node for at least as long as 28nm. The end is fast approaching in terms of node shrinks so they can't just rush through them and be left with no where to go in 5 years...


10nm enters volume production in 2017, with the possibility of Apple's iPhone 7 using 10nm FinFET this year, and you say it will take four years for 10nm to arrive for GPUs?

2020 is the volume-production deadline for 5nm, while 7nm is 2018.


----------



## ChevChelios

I think Volta & Navi will be 16/14nm still


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *ChevChelios*
> 
> I think Volta & Navi will be 16/14nm still


Without a doubt.


----------



## Defoler

Quote:


> Originally Posted by *magnek*
> 
> The only other scenario that makes sense is if nVidia is only releasing a Titan this year with the Ti variant at least 6 months away, and the Titan costs *at least* $1200+ to avoid cannibalizing 1080 sales.


Unless it comes out when AMD releases Vega to compete with the 1080. If Vega gives 1080 performance at $200 less, Nvidia will be forced to cut the 1080's price by at least $150, and then the new Titan will cost $1000.

But considering people are willing to pay $700+ or even $800+ for a GPU, I'm sure people will be willing to pay $1200+, since Nvidia isn't going to put these out in the same volume as the 480, so stock will be gone just as fast.


----------



## Defoler

Quote:


> Originally Posted by *guttheslayer*
> 
> That 4 years of business was based on just 28nm which was an accidental since no one predicted the 20nm soc would turn out to be a failure.
> 
> If progress for 10nm ff is good, we might see the old 2 years model going back. Nvidia is already aiming for volta within a year of pascal launch.


Well, with both Samsung and TSMC claiming (more like insisting) that they are going to start 10nm production in 2016 and go to full mass production in 2017, Nvidia might try to piggyback and get somewhat ahead in the queue, or at least somewhere in the middle, for 2017 chips, since Apple is definitely going to demand high production numbers from both.

GlobalFoundries, on the other hand, has been keeping a really low profile, and since AMD has struck a deal with them well into 2017 for their GPU chips, I don't know what will happen then. But if they are going to stick with 14nm for 2017, Nvidia might do the same and wait, in order to let 10nm mature.

This, of course, assumes 10nm becomes popular in 2017 beyond Apple's A11 chip.


----------



## Kpjoslee

When Intel delayed 10nm Cannonlake and revealed Kaby Lake for this year on the same 14nm node, I gave up hope of having a 10nm GPU in 2018, lol.


----------



## Defoler

Quote:


> Originally Posted by *Kpjoslee*
> 
> When Intel delayed 10nm Cannonlake and revealed Kaby Lake for this year on same 14nm node, I have given up hopes of having 10nm GPU in 2018 lol.


TBH, TSMC and Samsung aren't chasing Intel in that sense. They are driven by Apple, which is just as big a player, if not bigger, as well as Samsung themselves, ARM, Qualcomm, etc.

Mobile has a big rush for as much performance as possible, and those fabs are pushing to 10nm much faster than Intel, since Intel doesn't really have any competition or pressure to reach 10nm. And even at 10nm, Intel isn't going to seriously push into the mobile SoC market (got to face the truth), so overall they are just not in a hurry.


----------



## Clocknut

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> All I'm saying is that if they did as you suggest and release the new Titan within the next month it would be contrary to 4+ years of Nvidia business practices. They have never launched a Titan card this close to the launch of an X80 card before. They already hold a completely uncontested GPU crown and there is simply no need to up the ante right now (especially considering the fact that they are likely going to be stuck on 16nm for a while and would like to stretch out the performance gains for as long as possible). I think you guys are just confusing your hopes for what will happen with what 4 years of a successful business model tells us will happen. But who knows?


Well duuhh.

Of course: they need this ultra-high-end buyer group to buy the 1080 first, then a few months later buy a marked-up Titan, and only a few months after that release the 1080 Ti, so that Titan owners won't feel cheated. IMO, not many will buy a Titan if they know a 1080 Ti is coming 1-2 months later at half the price.

Besides, Nvidia probably needs months of low-volume production to stock up enough imperfect GP102 dies to sell as 1080 Tis on launch day anyway.


----------



## Kpjoslee

Quote:


> Originally Posted by *Defoler*
> 
> TBH TSMC and samsung aren't chasing intel in that term. They are being driven by apple, which are just as big of a player, if not bigger, as well as samsung themselves and ARM and qualcomm etc etc.
> 
> Mobile systems have a big rush for as much performance as possible, and they are pushing to 10nm way faster than intel, since intel doesn't really have any competition or stress to reach 10nm, and even at 10nm they aren't going to really push into the SoC mobile market (got to face the truth), so overall they are just not in a hurry.


Being driven by mobile? Yes. But that doesn't guarantee we will see them by next year; a mobile SoC on a given node doesn't mean we will see GPUs on it. We all saw what happened with 20nm. I would love to see 10nm in a couple of years, but I will keep my expectations in check instead of being disappointed later.


----------



## ChevChelios

yesterday we had a report that Vega is either Q1 2017 or H1 2017
http://www.fudzilla.com/news/graphics/41034-vega-10-amd-hbm-2-can-launch-in-1h-2017

Titan P = 2, maybe 3 months before that (and before 1080Ti) .. and shortly after P100 (which should be very early 2017, maybe even December 2016 ?)

1080Ti = several weeks before Vega


----------



## Nestala

Quote:


> Originally Posted by *ChevChelios*
> 
> yesterday we had a report that Vega is either Q1 2017 or H1 2017
> http://www.fudzilla.com/news/graphics/41034-vega-10-amd-hbm-2-can-launch-in-1h-2017
> 
> Titan P = 2, maybe 3 months before that (and before 1080Ti) .. and shortly after P100
> 
> 1080Ti = several weeks before Vega


Yeah, that'd be my guess as well.


----------



## ChevChelios

Is there anyone here who still believes in an August-September Titan P launch?


----------



## Nestala

Quote:


> Originally Posted by *ChevChelios*
> 
> is there someone here that still believes in August-September Titan P launch ?


Ask @guttheslayer


----------



## Defoler

Quote:


> Originally Posted by *Kpjoslee*
> 
> Being driven by mobile systems? Yes. But it doesn't guarantee that we will see them by next year. Mobile SoC on certain node doesn't mean we will see them in GPUs. We all saw what happened with 20nm. I would love to see 10nm in couple of years, but I would keep my expectations in check instead of being disappointed later.


Well, my point was not that we will definitely see 10nm GPUs next year.

My point was that since both TSMC and Samsung are pushing to move to 10nm, we might see early 10nm GPUs, maybe at the extreme top end, or at the very low end to save costs, as a way to start the process before 2018. Mainly because TSMC and Samsung plan to stop adding 16nm capacity and move everything to 10nm, they might have enough 10nm manufacturing capacity in 2017.


----------



## guttheslayer

Quote:


> Originally Posted by *Nestala*
> 
> Ask @guttheslayer
> 
> 
> 
> 
> 
> 
> 
> .


I still believe it's a September launch. It's the 1080 Ti that will be Q1 2017, if not later.

I think some of you can't see the reason to release the Titan ASAP. The point is that the Titan was never meant to compete with AMD, nor with their own 1080; its price puts it completely outside any lineup. Pitching the Titan in Q1, when Vega is around the corner, would put it in an extremely bad light, especially if they then release a 1080 Ti at 2/3 the price just two months later to compete. Smart buyers would skip the Titan and wait for the 1080 Ti.

If the gap between the Ti and the Titan is more than six months, the situation would be very, very different.


----------



## renejr902

Quote:


> Originally Posted by *kaosstar*
> 
> The Rage Pro 8MB was the first discrete GPU I ever bought. It was around $100 which is a lot when you're a kid.


LOL, when I got my Voodoo1 all those years ago, I had to sell several SNES games to pay for it.


----------



## renejr902

Quote:


> Originally Posted by *kaosstar*
> 
> That's pretty much the only explanation for why the 960 sold at all.


I didn't keep my GeForce 960 for long; it was too weak for The Witcher III, even at 1080p on ultra.


----------



## renejr902

Quote:


> Originally Posted by *Clocknut*
> 
> They still have to put a TITAN out several months ahead of actual vega launch, so they can charge arms & leg for a TITAN.
> 
> Reason being that to kill Vega, they need 1080Ti release around vega launch date. IMO Having 1080Ti launch too close to TITAN will cannibalize titan's rip off sales.


What happens if they release the Titan next month, but the HBM2 Vega card turns out much stronger than the Titan at its release, even if that release is in Q1 2017? Nvidia couldn't then release a 1080 Ti to compete with HBM2 Vega if that 1080 Ti would be stronger than the Titan, could they? I hope you understand my explanation; I'm still confused about this myself. Isn't it risky for Nvidia?


----------



## Randomdude

If AMD did nothing but remove the Fury bottlenecks, then big Vega would be 217.5% of a 390X's performance. Where does that put it compared to the Titan P?
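For what it's worth, figures like that usually come from naive shaders × clock scaling. A minimal sketch of that back-of-the-envelope math, using the 390X's 2816 shaders as the baseline and a hypothetical ~6144-shader big Vega (the "over 6K cores" rumour mentioned earlier in the thread) at 390X-like clocks; these inputs are assumptions, not known specs:

```python
# Naive relative-performance estimate: performance ∝ shaders * clock,
# assuming no bottlenecks (the "Fury bottlenecks removed" premise).
def relative_perf(shaders, clock_mhz, base_shaders=2816, base_clock_mhz=1050):
    return (shaders * clock_mhz) / (base_shaders * base_clock_mhz)

# Hypothetical big Vega: ~6144 shaders at 390X-like clocks.
print(f"{relative_perf(6144, 1050):.1%} of a 390X")  # ≈218%, in the ballpark of the quoted 217.5%
```

Real scaling is worse than linear, of course, since ROPs, bandwidth, and clocks rarely scale together.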


----------



## renejr902

I still think Nvidia knows they can't release a Titan card priced much higher than $1000; otherwise most people will wait for the 1080 Ti, like me. While waiting for a Titan, I bought the 1070 instead of the 1080, because my goal is to buy the Titan or the 1080 Ti, and by then I think the 1080 will have had a bigger price drop than the 1070. Anyway, for me the 1070, like the 1080, is too weak for 4K at 60Hz, so I don't care much; I just can't live with the integrated Intel GPU anymore, lol. When I sold my 970 two months ago, it was to buy a Titan in April or May; rumors said that was possible at the time. I agree that Nvidia should release the Titan soon and then wait six months for the 1080 Ti. The Titan isn't really for the same people who buy a 1070 or 1080; it's far more expensive.


----------



## guttheslayer

Quote:


> Originally Posted by *renejr902*
> 
> im still thinking that nvidia know they cant release a titan card with a price much higher than 1000$, otherwise most people will wait for the 1080ti, like me. While waiting for a titan, i bought the 1070 instead of 1080, because my goal is to buy the titan or 1080ti, and by that time, i think the 1080 will got a higher price drop compared to 1070. anyway for me 1070 like 1080 are too weak for 4k at 60hz so i dont care much, i just cant live with the integrated intel gpu anymore lol. when i sold my 970 2 months ago, it was for buying a titan in april,may, rumor said it was possible at that time. i agree with the fact that nvidia should release titan soon and then wait 6 months for 1080ti. titan is not really for the same people that buy 1070 or 1080, titan is so much expensive.


Releasing the Titan earlier gives them the right to price it above $1000, aka an early-adopter tax. It probably makes more sense to release the card at a higher price now and then cut it to $999 after a few months than to release it at $999 a few months from now.

Another thing you all fail to realise: there are two Titan variants. Did the source even mention whether both will be released in the same timeframe? What if the smaller Titan is out in September, the bigger Titan in December, and the 1080 Ti in March 2017, at $999, $1199+, and $799 respectively?

All the sources say it could be 50% faster than the 1080, but none of them can confirm whether the Pascal Titan will indeed be a fully enabled 3840-core part. If you look carefully, the fastest P100 Tesla (NVLink) only has 3584 cores enabled, and the PCIe variant is 13% slower, putting it around the performance of 3200 cores at 1.45GHz.

What if the 3200-core part is the one coming out in September? There are just too many possibilities, and as you can see there are a lot of configurations they can play around with for GP100. It's really too early to conclude they won't be out. Who knows, they could carve out enough variants to milk the market for the next 12-18 months.
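As a rough sanity check on that equivalence: FP32 throughput scales with cores × clock (× 2 FLOPs per core per cycle for FMA). A quick sketch using the publicly quoted Tesla P100 boost clocks (~1.48GHz NVLink, ~1.30GHz PCIe); the "3200 cores at 1.45GHz" equivalence is the poster's own framing, not an official spec:

```python
# Naive FP32 throughput model: TFLOPS = cores * clock_GHz * 2 / 1000
# (FMA counts as 2 floating-point ops per core per cycle).
def tflops(cores, clock_ghz):
    return cores * clock_ghz * 2 / 1000

nvlink = tflops(3584, 1.48)  # Tesla P100 NVLink: ~10.6 TFLOPS
pcie   = tflops(3584, 1.30)  # Tesla P100 PCIe: lower boost clock, ~9.3 TFLOPS

print(f"PCIe is {1 - pcie / nvlink:.0%} slower")      # ~12-13% slower
print(f"3200 cores @ 1.45GHz: {tflops(3200, 1.45):.2f} TFLOPS")  # ~9.28, matching the PCIe part
```

So the "around 3200 cores at 1.45GHz" figure is just the PCIe part's throughput re-expressed at a different core count and clock.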


----------



## zealord

Quote:


> Originally Posted by *renejr902*
> 
> im still thinking that nvidia know they cant release a titan card with a price much higher than 1000$, otherwise most people will wait for the 1080ti, like me. i bought the 1070 instead of 1080, because my goal is to buy the titan or 1080ti, and by that time, i think the 1080 will got a higher price drop compared to 1070. anyway for me 1070 like 1080 are too weak for 4k at 60hz so i dont care much, i just cant live the integrated intel gpu anymore lol. when i sold my 970 2 months ago, it was for buying a titan in april,may, rumor said it was possible at that time.
> While waiting for a titan,
> i agree with the fact that nvidia should release titan soon and then wait 6 months for 1080ti. titan is not really for the same people that buy 1070 or 1080, titan is so much expensive.


If the GTX 1080 is $699 and the Pascal Titan is over $1000, then don't hope for a GTX 1080 Ti at something like $650. I would expect a GTX 1080 Ti to be $899 if the next Titan card is above a grand.

Let's hope the GTX 1080 was an isolated case and Nvidia does not increase prices again. It was a bit much over the last 5 years; basically everything went up by at least 100%. Another increase would mean we now pay 125%+ compared to what we paid 5-6 years ago. And don't even start talking about inflation over that short a time frame.

Truth be told, I am not planning to buy any card in the current landscape. With GPU prices of €700+ for cards that used to be €300, consoles are probably the more attractive route going forward. Polaris didn't give me enough hope to see the status quo changing anytime soon.

I still hope, for those who can't control the urge to buy a Titan even though they aren't as well off as some people, that the new Titan won't come in above a grand and completely wreck their credit cards.


----------



## renejr902

Quote:


> Originally Posted by *zealord*
> 
> If the GTX 1080 is $699 and the Titan Pascal is over $1000, then don't hope for a GTX 1080 Ti that is like $650 or something.
> 
> I would expect a GTX 1080 Ti to be $899 if the next Titan card is above a grand.
> 
> Let's hope the GTX 1080 was an isolated case and Nvidia does not increase prices again. It was a bit much over the last 5 years: basically everything went up by at least 100%. Another increase would mean we now pay 125%+ more than we did 5-6 years ago. And nobody even start talking about inflation in that short a time frame.
> 
> Truth be told, I am not planning to buy any card in the current landscape. With GPU prices of €700+ for cards that used to be €300, consoles are probably the more attractive route going forward.
> Polaris didn't give me enough hope that the status quo will change anytime soon.
> For those who can't control the urge to buy a Titan even though they aren't as well off as some people, I still hope the new Titan won't come in above a grand and completely wreck their credit cards.


$1100 is my max; otherwise I'll wait for a 1080 Ti. Even at $1200 I will wait for the 1080 Ti; I've made my choice already.
But I don't mind buying a Titan P (junior edition, lol) with 12GB instead of 16GB, 540 GB/s bandwidth instead of 720, and 3584 cores instead of 3840, if the price is $999. I'm OK with that.
I would really appreciate it if they allowed custom Titan cards; I don't like having only one fan on any video card. Otherwise, maybe I could invest in water cooling.


----------



## guttheslayer

Quote:


> Originally Posted by *renejr902*
> 
> $1100 is my max; otherwise I'll wait for a 1080 Ti. Even at $1200 I will wait for the 1080 Ti; I've made my choice already.
> But I don't mind buying a Titan P (junior edition, lol) with 12GB instead of 16GB, 540 GB/s bandwidth instead of 720, and 3584 cores instead of 3840, if the price is $999.


That Titan Junior might come with only 3200 cores and 12GB of HBM2 at 480 GB/s. That is probably the card you get for "just" $999.

It's the Titan Senior that gets the possible +50% performance over the 1080, not the Junior.


----------



## renejr902

Quote:


> Originally Posted by *guttheslayer*
> 
> That Titan Junior might come with only 3200 cores and 12GB of HBM2 at 480 GB/s. That is probably the card you get for "just" $999.
> 
> It's the Titan Senior that gets the possible +50% performance over the 1080, not the Junior.


I'm still OK with it.

With overclocking, I could get nearly 50% more performance than a GTX 1080.

To me, the GTX 1080 needs 30% more performance to be worth it. So a Titan Junior will be at least 30% better than the 1080, and maybe 45% better after overclocking.

Anyway, if they do a Titan Junior for $999, I won't be willing to pay $1499 for the full Titan. After that maybe a 1080 Ti will happen, maybe not; if it does, my guess is similar performance to the Titan Junior but with GDDR5X on a 384-bit bus.

I would rather pay $999 for a Titan Junior than $600-700 for a GTX 1080. In my case the 1080 won't be worth it; I want as many of my favorite games as possible to hit 60fps at 4K.
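As a sanity check on the percentages in the post, the compounding works out like this (a quick sketch; the +30% and the ~12% overclock figures are the poster's guesses, not benchmarks):

```python
# Hypothetical "Titan Junior" performance relative to a GTX 1080,
# using the post's own guesses: +30% at stock, plus ~12% from an
# overclock. Compounding the two lands near the +45% figure quoted.
base = 1.00                      # GTX 1080 stock = 1.0x
titan_jr = base * 1.30           # +30% over the 1080
titan_jr_oc = titan_jr * 1.12    # ~12% overclocking headroom

print(f"Titan Jr stock: {titan_jr:.2f}x a GTX 1080")     # 1.30x
print(f"Titan Jr OC:    {titan_jr_oc:.2f}x a GTX 1080")  # 1.46x
```

Note the gains multiply rather than add, which is why +30% plus a modest overclock gets close to the +45-50% being discussed.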


----------



## guttheslayer

Quote:


> Originally Posted by *renejr902*
> 
> I'm still OK with it.
> 
> With overclocking, I could get nearly 50% more performance than a GTX 1080.
> 
> To me, the GTX 1080 needs 30% more performance to be worth it. So a Titan Junior will be at least 30% better than the 1080, and maybe 45% better after overclocking.
> 
> Anyway, if they do a Titan Junior for $999, I won't be willing to pay $1499 for the full Titan. After that maybe a 1080 Ti will happen, maybe not; if it does, my guess is similar performance to the Titan Junior but with GDDR5X on a 384-bit bus.


The GTX 1080 Ti, if it exists, might come with only 8GB of HBM2 at the same bandwidth.


----------



## renejr902

Quote:


> Originally Posted by *guttheslayer*
> 
> The GTX 1080 Ti, if it exists, might come with only 8GB of HBM2 at the same bandwidth.


Yeah, I'm not sure GDDR5X on a 384-bit bus is possible anyway...


----------



## Defoler

Quote:


> Originally Posted by *renejr902*
> 
> What will happen if they release the Titan next month, but the Vega HBM2 video card is much stronger than the Titan at release, even if it releases in Q1 2017? Nvidia can't release a 1080 Ti after that to compete with Vega HBM2 if the 1080 Ti is stronger than the Titan. I hope you understand my explanation, but I'm still confused: isn't that risky for Nvidia?


Won't it be like what they did before? Put out a Titan P Black.
Also, I don't know. With the current architecture, AMD are really pushing it to match something like the Titan; that card will be far from just 2x 8-pins. So they might as well do a Vega duo to rival the Titan.


----------



## guttheslayer

Quote:


> Originally Posted by *renejr902*
> 
> Yeah, I'm not sure GDDR5X on a 384-bit bus is possible anyway...


It's unlikely to use G5X, as that needs a redesign of the core, which won't happen until Volta at the earliest. The roadmap could look like this:

Titan P12 - 3200 cores, 12GB HBM2 @ 480 GB/s, $899-$999, Q4 2016

Titan P16 - 3584 cores, 16GB HBM2 @ 640 GB/s, $1199-$1399, Q1 2017 (Likely to have partially crippled DP)

GTX 1180 - 3200 cores, 8GB HBM2 @ 512 GB/s, $699-$799, Q2 2017 (Tuned to be slightly faster than Titan P12, compete with Vega)

Note: each price range could span the AIB and FE variants, or simply be a range if the card is only available as an FE.

Ignore the previous possibility I posted; 3840 cores this early seems unlikely. What's more, it's supposed to be their trump card in case AMD surprises us.


----------



## guttheslayer

If you look at the roadmap I posted, it's not something I made up out of my imagination.

In fact it's almost the same as the Kepler roadmap from 2012-13, except that the supposed *GTX '780'* model was brought forward by 6 months at a premium price to fill the 6-month gap, then rebranded and re-released as the true x80 counterpart with less memory.

Other than that, everything else effectively mirrors the Kepler releases from 2012-2013.


----------



## EniGma1987

Quote:


> Originally Posted by *Xuvial*
> 
> From what I understood Titan isn't even aimed at the same consumer base as the x80 Ti. Yes Titan can be used for gaming, but it's a compute/cuda card more than anything else and it costs $1000+.
> 
> I mean no gamer is seriously buying Titans just for gaming right? Right?? Why the heck would anyone do that?


The original Titan was partly about its compute capability too, but even the Titan Black was about gaming more than anything, and every Titan since then has been about gaming. The Maxwell Titan didn't even have FP64 capability at all.


----------



## guttheslayer

Quote:


> Originally Posted by *EniGma1987*
> 
> The original Titan was about that it had compute capability too, but then even the Titan Black was about gaming more than anything, and every titan since then has been about gaming. The maxwell Titan didn't even have the FP64 capability at all.


The Maxwell Titan X was an exception because, just like the Quadro M6000, it didn't have any FP64 compute.

Pascal will be different this time.


----------



## EniGma1987

Quote:


> Originally Posted by *renejr902*
> 
> Nvidia cant release a 1080ti after that to compete with vega hbm2, if the 1080ti is stronger than Titan.


Nvidia released the 780 Ti to compete with AMD the last time AMD surprised them, and it was stronger than the Titan that had been released not long before.


----------



## guttheslayer

Quote:


> Originally Posted by *EniGma1987*
> 
> Nvidia released the 780Ti to compete with AMD last time when AMD surprised them, and it was stronger than the Titan that had just been released not long before.


Which is why I believe the full 3840-core configuration will be their trump card. Hence you might never see it if AMD continues to disappoint us.


----------



## JackCY

From what it seems so far, the 1060 costs like a 970, the 1070 like a 980, the 1080 like a 980 Ti, and the 1080 Ti? Probably the same rip-off as the Intel 6950X.
The 780 Ti used the Titan chip, just not cut down, if I remember right.


----------



## DNMock

Let's get two things straight in this thread.

1. Vega 10 = fully unlocked chip, Vega 11 = cut-down chip.
The number denotes the order the chips were produced. To produce a cut-down version of a chip, you must first produce the non-cut-down version, so Vega 10 is the larger one.

2. There won't be two versions of the Titan; the "junior" Titan is the 1080 Ti.
The 980 Ti is a slightly cut-down Titan X, and the 780 Ti is a slightly cut-down Titan, so it stands to reason the slightly cut-down Titan P will be the 1080 Ti.


----------



## Woundingchaney

When was the last time we actually saw a 50% performance increase from one high-end product to the next in a product cycle? The closest thing I can recall is the release of the 8800 GTX.


----------



## -terabyte-

Quote:


> Originally Posted by *DNMock*
> 
> Let's get two things straight in this thread.
> 
> 1. Vega 10 = fully unlocked chip, Vega 11 = cut-down chip.
> The number denotes the order the chips were produced. To produce a cut-down version of a chip, you must first produce the non-cut-down version, so Vega 10 is the larger one.
> 
> 2. There won't be two versions of the Titan; the "junior" Titan is the 1080 Ti.
> The 980 Ti is a slightly cut-down Titan X, and the 780 Ti is a slightly cut-down Titan, so it stands to reason the slightly cut-down Titan P will be the 1080 Ti.


You're the one posting wrong info:

- Vega 11 is supposed to be bigger than Vega 10: https://www.techpowerup.com/222403/amd-pulls-radeon-vega-launch-to-october
- The 780 Ti was the full chip, and the OG Titan was the cut-down version at the time. That changed with the 9xx series, where the Ti version is now the cut-down chip.


----------



## Ghoxt

Quote:


> Originally Posted by *Woundingchaney*
> 
> When was the last time we actually saw a 50% performance increase from one high-end product to the next in a product cycle? The closest thing I can recall is the release of the 8800 GTX.


Maybe another question is: when have the planets last aligned with a die shrink, a new architecture, and new memory? Seems to me that 50% is strangely not enough.


----------



## guttheslayer

Quote:


> Originally Posted by *DNMock*
> 
> Let's get two things straight in this thread.
> 
> 1. Vega 10 = fully unlocked chip, Vega 11 = cut-down chip.
> The number denotes the order the chips were produced. To produce a cut-down version of a chip, you must first produce the non-cut-down version, so Vega 10 is the larger one.
> 
> 2. There won't be two versions of the Titan; the "junior" Titan is the 1080 Ti.
> The 980 Ti is a slightly cut-down Titan X, and the 780 Ti is a slightly cut-down Titan, so it stands to reason the slightly cut-down Titan P will be the 1080 Ti.


There can be a 3200-core, a 3584-core, and a final 3840-core configuration.

The Kepler big die had 2304-, 2688-, and 2880-core configurations.

Who says there won't be two versions of the Titan? That is a load of BS, since we already saw two Titan iterations in the Kepler era alone.

There might never be a 1080 Ti; the card that matches this Ti is actually the GTX 1180, which comes a year after the 1080.


----------



## rcfc89

Quote:


> Originally Posted by *guttheslayer*
> 
> There can be a 3200-core, a 3584-core, and a final 3840-core configuration.
> 
> The Kepler big die had 2304-, 2688-, and 2880-core configurations.
> 
> Who says there won't be two versions of the Titan? That is a load of BS, since we already saw two Titan iterations in the Kepler era alone.
> 
> *There might never be a 1080 Ti*; the card that matches this Ti is actually the GTX 1180, which comes a year after the 1080.


Last generation, MSI skipped making a Lightning version of the 980, instead saving its design for the big-Maxwell 980 Ti. MSI currently has all of its normal 1080 offerings either out or announced, with not even a hint of a Lightning 1080. It looks like MSI is following the same path as last year: there will be a 1080 Ti Lightning.


----------



## Zero4549

Quote:


> Originally Posted by *rcfc89*
> 
> Last generation MSI skipped out on making a Lightning version of the 980 instead saving its design for Big-Max 980Ti. Msi currently has all its current normal offerings for the 1080 either out or announced. Not even a hint of a Lightning 1080. Looks like MSI is following the same path as last year. There will be a 1080Ti Lightning.


They learned from getting burned on the 680 lightning (and so did I. lol)


----------



## magnek

Literally?


----------



## twitchyzero

If this is even paper-launching next month, I think it's safe to say it's GDDR5X?

I just want a 1080 Ti with 8-12GB of HBM2 that's 2x the perf of a 980 Ti for no more than $750... I don't care if it takes another 12 months. Does that sound realistic?


----------



## rcfc89

Quote:


> Originally Posted by *twitchyzero*
> 
> if this is even paper launching next month I think it's safe to say it's GDDR5X?
> 
> I just want a 1080Ti with 8-12GB HBM2 that's 2x the perf of a 980Ti for no more than $750...I don't care if it takes another 12 months. Sounds realistic?


No. Considering a max-OC 980 Ti is within 5-10% of a max-OC 1080, you are virtually asking for a card that is 90-95% faster than a 1080. Unless the new Titan is a dual-GPU card, that's not happening. If you are referring to stock-clock 980 Ti performance, then you're on the wrong forum.
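Working that arithmetic through (a sketch; the 5-10% gap is the poster's estimate, not a measurement) shows why the request is so steep:

```python
# If a max-OC 980 Ti sits within 5-10% of a max-OC 1080, a card with
# "2x 980 Ti" performance must be roughly 80-90% faster than a 1080.
for gap in (0.05, 0.10):
    ti_980 = 1.0 / (1.0 + gap)   # 980 Ti relative to the 1080 (both OC)
    target = 2.0 * ti_980        # twice a 980 Ti, relative to the 1080
    print(f"1080 leads by {gap:.0%}: 2x 980 Ti = {target:.2f}x a 1080")
```

The strict multiplication lands at roughly 82-90% faster, slightly below the 90-95% quoted, but the conclusion is the same: only a dual-GPU card would get there.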


----------



## Zero4549

Quote:


> Originally Posted by *magnek*
> 
> Literally?


No, because it is a low end card and doesn't produce any heat


----------



## junkman

Quote:


> Originally Posted by *guttheslayer*
> 
> Who say there wont be 2 version of Titan? That is a load of BS since we already seen 2 Titan iteration in Kepler era alone.
> 
> There might never be a 1080 Ti, the one that matches this Ti is actually the GTX 1180, which is 1 year after the 1080 is out.


In b4 the Titan X Founder's Edition. The sheer thought of the price of that beast..

On a more serious note, I believe it is entirely possible.

I think something we should consider is the maturity of the 28nm process compared to this FinFET technology.

I'm not sure yields will hold up.
Quote:


> Originally Posted by *twitchyzero*
> 
> if this is even paper launching next month I think it's safe to say it's GDDR5X?
> 
> I just want a 1080Ti with 8-12GB HBM2 that's 2x the perf of a 980Ti for no more than $750...I don't care if it takes another 12 months. Sounds realistic?


I really don't think so. HBM2 is going to be NV's entry into HBM, where they will run into interposer issues with the die-packaging process. For the Fury X it wasn't cheap, even at 4GB: one interposer flaw, even with a good Fiji die, means a dead chip with no reclaim.

If they've improved the process since then, they will keep the savings in-house to mitigate the FinFET yield losses, not pass them on to the consumer.


----------



## iLeakStuff

GTX Titan: 384-bit, GDDR5X, Q3 launch.
Prepping for launch when school starts.


----------



## SuprUsrStan

Quote:


> Originally Posted by *iLeakStuff*
> 
> GTX Titan: 384bit. GDDR5X. Q3 launch
> Prepping up for launch when school starts


Oh?


----------



## EniGma1987

Quote:


> Originally Posted by *Syan48306*
> 
> Oh?


The 384-bit GDDR5X spec is actually an old rumor now; iLeakStuff is hoping everyone forgot it was "leaked" a while ago and will give him credit for it.
I do think this generation's Titan will be GDDR5X as well, if it is indeed based on GP102. Nvidia can make more money and build stock faster that way, and the card will still have more bandwidth than last gen thanks to the extra that the "X" brings. Adding 500 more cores compared to last gen would be just about right to use the extra bandwidth from the new memory without overdoing it either way, so it makes sense for Nvidia. Who knows, maybe we will see a 512-bit, 16GB full-die version at some point as well.


----------



## ZealotKi11er

Quote:


> Originally Posted by *EniGma1987*
> 
> The 384-bit GDDR5X is actually an old rumor now. iLeakStuff is hoping everyone forgot it was "leaked" a while ago and will give him credit for that.
> I do think this gen Titan will be GDDR5X though as well if it is indeed based on GP102. Nvidia can make more money and get stock made faster that way, and it will still have more bandwidth than last gen because of the extra that "X" brings. Adding 500 more cores compared to last gen would be right about perfect to use that extra bandwidth from the new memory without overdoing it either way, so it makes sense for Nvidia to do. Who knows, maybe we will see a 512-bit, 16GB full die version at some point as well.


384-bit G5X still provides more than enough memory bandwidth to make it 30-40% faster than the 1080. But I think people do not want to pay $650+ for a 1080 Ti with just G5X; we want to see what Pascal can do with no memory bandwidth limitation.


----------



## BiG StroOnZ

Quote:


> Originally Posted by *SpeedyVT*
> 
> They'll charge a lot more I'm certain of it.


Maybe, maybe not. NVIDIA pushed their pricing luck to its limits before with the Titan Z and that didn't turn out so well for them.


----------



## guttheslayer

Quote:


> Originally Posted by *rcfc89*
> 
> Last generation MSI skipped out on making a Lightning version of the 980 instead saving its design for Big-Max 980Ti. Msi currently has all its current normal offerings for the 1080 either out or announced. Not even a hint of a Lightning 1080. Looks like MSI is following the same path as last year. There will be a 1080Ti Lightning.


That 1080 Ti is actually next year's GTX 1180, so MSI can have its Lightning next year.


----------



## guttheslayer

So
Quote:


> Originally Posted by *iLeakStuff*
> 
> GTX Titan: 384bit. GDDR5X. Q3 launch
> Prepping up for launch when school starts


Sources have already indicated that GP102 and GP100 are the same chip and it is actually HBM-based, just like the PCIe Tesla that was released. I can't believe there are people this stubborn.

It is HBM2, so unless there is a GP103 with a GDDR5X controller, you can dream on about having G5X on a Titan.


----------



## Asmodian

Quote:


> Originally Posted by *guttheslayer*
> 
> That 1080 ti is actually gtx 1180 next year. so MSI can have its lightning next year.


You just made that up, didn't you?
Quote:


> Originally Posted by *guttheslayer*
> 
> So
> Sources have already indicated that GP102 and GP100 are the same chip and it is actually HBM-based, just like the PCIe Tesla that was released. I can't believe there are people this stubborn.
> 
> It is HBM2, so unless there is a GP103 with a GDDR5X controller, you can dream on about having G5X on a Titan.


I haven't seen anything that strongly supports this, only a few casual mentions that give conflicting impressions. Why would Nvidia make two chips if they were the same chip? There has to be something different between them; GP102 missing all the double-precision cores, having a GDDR5X memory controller, or both would make sense to me.


----------



## guttheslayer

Quote:


> Originally Posted by *Asmodian*
> 
> You just made that up, didn't you?
> I haven't seen anything that strongly supports this, only a few casual mentions that give different conflicting impressions. Why would Nvidia make two chips if they were the same chip? There has to be something different between them; GP102 missing all the double precision cores, having a GDDR5X memory controller, or both makes sense to me.


Hello read this!

http://www.nextplatform.com/2016/06/20/nvidia-rounds-pascal-tesla-accelerator-lineup/

We conjectured back in April that Nvidia could put out a dual-GPU Pascal Tesla card, call it a theoretical P80, that might have GDDR5 memory instead of the High Bandwidth Memory 2 on-package, stacked memory used in the original Pascal Tesla P100 card. *This did not happen, and it probably will never happen* now that Nvidia has announced a version of the Pascal PCI-Express card that has 12 GB of HBM2 memory and supports 540 GB/sec of memory bandwidth. (Why this part runs at 250 watts like the card with 16 GB is not clear, but 12 GB should run cooler than 16 GB we would think.)

Everything is explained very clearly on that site; read it first before claiming there is a G5X GP102. There is a reason why GP100 and GP102 are different despite having the same specs: GP100 supports node scaling while GP102 does not. And yes, both the PCIe and NVLink variants have 3584 cores, which surprised me, but the PCIe variant could come out with both 16GB and 12GB of HBM2. The Titan gets the Tesla GPUs that fail the ECC or TDP requirements, so that effectively means the Titan will come with HBM2.

As for the GTX 1180: that is what happened with Kepler. The GTX 780 is really the GTX 680 Ti, but that card never happened at the time because the concept of an x80 Ti wasn't there yet (AMD wasn't competitive) and there was no need for it.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *guttheslayer*
> 
> If you look at the roadmap I posted, it's not something I made up out of my imagination.
> 
> In fact it's almost the same as the Kepler roadmap from 2012-13, except that the supposed *GTX '780'* model was brought forward by 6 months at a premium price to fill the 6-month gap, then rebranded and re-released as the true x80 counterpart with less memory.
> 
> *Other than that, everything else effectively mirrors the Kepler releases from 2012-2013.*


If that were the case, then we wouldn't expect to see the Titan P released until at least 11 months after the 1080, i.e. April 2017. I think we will see it sooner than that tbh (like January or February), but definitely not in 2016.


----------



## spinFX

Quote:


> Originally Posted by *Wishmaker*
> 
> I am extremely curious to see how the 6950X bottlenecks this. Also, at what frequency was it tested? I highly doubt a 4.5-4.7GHz 6950X will bottleneck anything!


Yeah, it is curious what they are saying about nothing being able to remove the CPU bottleneck. Some more information would be nice, like how much it is being bottlenecked. It's highly unlikely I'll get one of these, but I'd be keen to know how much my 4930K would hold it back.


----------



## magnek

It probably bottlenecks a 6950X *at stock* with its atrocious 3.5GHz turbo. But assuming 2x Titan X performance, and excluding games that scale beyond 4 cores (the number of which I can count on two hands), I refuse to believe a 6950X @ 4.4 will bottleneck this thing. Well, maybe if you gamed at 720p, lol.


----------



## ZealotKi11er

Quote:


> Originally Posted by *magnek*
> 
> It probably bottlenecks a 6950X *at stock* with its atrocious 3.5GHz turbo. Assuming 2x Titan X performance, except for games that scale beyond 4 cores (the number of which I can count on two hands), I refuse to believe a 6950X @ 4.4 will bottleneck this thing, well maybe if you gamed at 720p LOL.


Funny, in most games my Ivy @ 4.6GHz is probably faster than this $1700 CPU at stock.


----------



## magnek

More like 95% of games.


----------



## guttheslayer

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> If that was the case then we wouldn't expect to see the Titan P released for at least 11 months after the 1080, or in April 2017. I think we will see it sooner than that tbh (like January or February) but definitely not in 2016.


The big P, maybe January 2017; but the small P? Maybe 3 months earlier. I said the rest mirrors the Kepler release, but I didn't say the small P would follow. It is the one that might come out in September.


----------



## Majin SSJ Eric

Wouldn't the small P be the 1080Ti and follow the Big P as per usual?


----------



## JakdMan

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Wouldn't the small P be the 1080Ti and follow the Big P as per usual?


Well, they could want the small P to be a direct _Titan X_ successor and the absolute highest of high-end gaming paraphernalia, while the big P is a proper successor to the DP-toutin' OG Titans.


----------



## guttheslayer

Quote:


> Originally Posted by *JakdMan*
> 
> Well they could want the small P to be a direct _Titan X_ successor and be the absolute highest of high-end gaming paraphernalia, while the big P is a proper successor to the DP toutin' og Titans


Yup, that is the plan. Branding it as a 1080 Ti would cause an uproar at $999, but the reaction is not so negative if it's flagged as a Titan-series card.

On the other hand, both the big and small P share the same 3584-core configuration. The only differences lie in 12GB vs 16GB, 480 vs 640 GB/s, and DP probably being available on the bigger brother.
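The 12GB/16GB and 480/640 GB/s pairs being discussed are consistent with a three-stack vs four-stack HBM2 layout at the same per-stack rate. A quick sketch (the 4GB stack size and the 160 GB/s per-stack rate are inferred from the rumoured totals; 160 GB/s sits slightly below the 180 GB/s per stack of the Tesla P100):

```python
# Three vs four HBM2 stacks at the same per-stack capacity and
# bandwidth reproduce the rumoured 12GB/480 and 16GB/640 splits.
GB_PER_STACK = 4      # 4GB per HBM2 stack (P100-style stacks)
BW_PER_STACK = 160    # GB/s per stack, implied by the rumoured totals

for stacks in (3, 4):
    cap = stacks * GB_PER_STACK
    bw = stacks * BW_PER_STACK
    print(f"{stacks} stacks: {cap}GB @ {bw} GB/s")
```

This mirrors how the Tesla P100 PCIe line was carved up: the 12GB variant simply drops one stack, losing a quarter of both capacity and bandwidth.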


----------



## OwnedINC

Quote:


> Originally Posted by *magnek*
> 
> It probably bottlenecks a 6950X *at stock* with its atrocious 3.5GHz turbo. Assuming 2x Titan X performance, except for games that scale beyond 4 cores (the number of which I can count on two hands), I refuse to believe a 6950X @ 4.4 will bottleneck this thing, well maybe if you gamed at 720p LOL.


And even then the "bottleneck" you'd be hitting is your monitor anyways!


----------



## iLeakStuff

Quote:


> Originally Posted by *guttheslayer*
> 
> So
> Sources have already indicated that GP102 and GP100 are the same chip and it is actually HBM-based, just like the PCIe Tesla that was released. I can't believe there are people this stubborn.
> 
> It is HBM2, so unless there is a GP103 with a GDDR5X controller, you can dream on about having G5X on a Titan.


Source is wrong. They may have held the card in their hand but have never tried it.









From Erinyes, a well-known insider:
https://forum.beyond3d.com/posts/1928645/

https://forum.beyond3d.com/posts/1928661/


----------



## guttheslayer

Quote:


> Originally Posted by *iLeakStuff*
> 
> Source is wrong. They may have held the card in their hand but have never tried it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> From Erinyes, a well known insider:
> https://forum.beyond3d.com/posts/1928645/
> 
> https://forum.beyond3d.com/posts/1928661/


Your source is likely to be wrong. VRWorld already pulled off the heatsink, as you can see from the power connectors shown on the PCB. Now, if it were G5X it should be obvious, since the memory chips are placed outside the GPU die; any amateur can identify the black memory-chip packages. The fact that they aren't there means it's HBM on the GPU package itself.

Also, Titan cards have memory chips placed on the back side of the PCB; just flip it around and you can see them.

Last but not least, if you understand how memory works, you cannot get 16GB on a 384-bit bus. Your source doesn't make sense.
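The bus-width point can be made concrete (a sketch; it assumes the 8Gb/1GB GDDR5X parts shipping in 2016 and ignores clamshell mode, which doubles the chips per channel):

```python
# A GDDR5X chip exposes a 32-bit interface, so a 384-bit bus means
# twelve chips; with 1GB per chip that caps capacity at 12GB.
# 16GB does not divide evenly onto a 384-bit bus.
BUS_WIDTH = 384
CHIP_INTERFACE = 32   # bits per GDDR5X chip
CHIP_CAPACITY = 1     # GB per chip (8Gb dies)

chips = BUS_WIDTH // CHIP_INTERFACE
print(f"{chips} chips -> {chips * CHIP_CAPACITY}GB on a {BUS_WIDTH}-bit bus")
```

So a 384-bit G5X card naturally lands at 12GB, while a 16GB figure points at either a 256-/512-bit GDDR5X bus or four HBM2 stacks.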


----------



## Asmodian

Quote:


> Originally Posted by *guttheslayer*
> 
> Hello read this!
> 
> http://www.nextplatform.com/2016/06/20/nvidia-rounds-pascal-tesla-accelerator-lineup/
> 
> We conjectured back in April that Nvidia could put out a dual-CPU Pascal Tesla card, call it a theoretical P80, that might have GDDR5 memory instead of the High Bandwidth Memory 2 on-package, stacked memory used in the original Pascal Tesla P100 card. *This did not happen, and it probably will never happen* now that Nvidia has announced a version of the Pascal PCI-Express card that has 12 GB of HBM2 memory and supports 540 GB/sec of memory bandwidth. (Why this part runs at 250 watts like the card with 16 GB is not clear, but 12 GB should run cooler than 16 GB we would think.)
> 
> Everything is explained very clearly on this site, read this first before claim there is a g5x gp102. There is a reason why gp100 and gp102 are different despite have the same specs. Gp100 support node scaling while gp102 does not, and yes both pcie and nvlink variant have 3584 cores, which surprised me, but pcie variant could come out with both 16/12 gb hbm2. Titan is the tesla gpu that fail the ecc or tdp requirement. So that effective mean Titan will come with HBM2.
> 
> As for gtx 1180, that is what happen to kepler, the gtx 780 is actually the gtx 680 ti, but it didnt happen becz the concept of x80 ti wasnt there (AMD wasnt competitive) and there is no need to.


I fail to see how the lack of a dual GPU Tesla card using GDDR5(X) in the current Tesla lineup confirms GP102 uses HBM2. That article never mentions GP102 and the smaller Tesla makes perfect sense as a use of GP100 GPUs that partially failed during the interposer process, one memory stack didn't work.

What is your source for node scaling being the only difference between GP100 and GP102?

The entire 700 series was weird because Nvidia was still on 28nm and the only option was using their bigger die. The 680 was GK104, while the 780 came out several months _after_ the GK110 Titan, and the 780 Ti had _more_ cores than the Titan... but there was also the Titan Black, which was a full die too... I assume we will not see a repeat of the 700 series.


----------



## guttheslayer

Quote:


> Originally Posted by *Asmodian*
> 
> I fail to see how the lack of a dual GPU Tesla card using GDDR5(X) in the current Tesla lineup confirms GP102 uses HBM2. That article never mentions GP102 and the smaller Tesla makes perfect sense as a use of GP100 GPUs that partially failed during the interposer process, one memory stack didn't work.
> 
> What is your source for node scaling being the only difference between GP100 and GP102?
> 
> The entire 700 series was weird because Nvidia was still on 28nm, the only option was using their bigger die. The 780 came out several months _after_ the Titan and the 780 Ti had _more_ cores than the Titan... but there was also the Titan Black that was a full die too... I assume we will not see a repeat of the 700 series.


The failed Tesla chips have to go somewhere, and GP100 is already in production. There is no news of this G5X GP102 except certain validation posts (yes, they exist) and some Chiphell rumors. If it were due in Q3, we should have seen more solid proof by now; one example is the shipping profiles.

Which is why I believe GP102 and GP100 are the same chip, which would explain why we haven't gotten any concrete information until now. If it's coming in September it has to be a cut-down Tesla chip, and Tesla doesn't run GDDR5X.

If it's G5X it won't be Tesla-based, and if it's in huge volume production there should be some solid proof when we are just a month away from release.


----------



## Asmodian

Quote:


> Originally Posted by *guttheslayer*
> 
> the failed tesla card have to go somewhere, and gp100 is already in production. There is no news of this gp102 with g5x exception certain validation post (yes it exist) and some chiphell rumor. If its put in Q3, we should have seen a more solid proof. One eg is the shipping profile.
> 
> Which is why i believe gp102 and gp100 are same chip which explain why we didnt get any concrete information till now. If its coming in sept it has to be a cut down tesla chip. And tesla dont run gddr5x.
> 
> If its g5x it wont be tesla based, and if its in huge volume production it should have some solid proof if we are just month away from release.


Couldn't we simply be getting GP100 Titans (basically the two PCIe Tesla P100s) in Q3 and see GP102 sometime later? The VR World article claiming it is GP102 might be wrong; the TechPowerUp article claims these Titans use GP100.

I would prefer GP102 to be GP100 without the double-precision cores but still using HBM2.









The very long card is weird as well; is it simply for extra cooling? I thought HBM2 would allow smaller cards.


----------



## guttheslayer

Quote:


> Originally Posted by *Asmodian*
> 
> Couldn't we be simply getting GP100 Titans (basically the two PCIe Tesla P100s) in Q3 and see GP102 sometime later? This VR World article claiming it is the GP102 might be wrong, the TechPowerUp article claims these Titans use GP100.
> 
> I would prefer GP102 to be GP100 without double precision cores but still using HBM2.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The very long card is weird as well, simply for extra cooling? I thought HBM2 would allow smaller GPUs.


The long card could be due to the cooling capacity, which is sorely needed for the insane TDP. I recall that all the past 12-inch monsters had enormous TDPs.

It's not entirely impossible for a G5X GP102 variant with no DP to emerge, but that is probably next year's matter, most likely to cater to the GTX 1100 series. For now, in this year's September release frame, they must find a place to sell all those failed Tesla chips.


----------



## Asmodian

Quote:


> Originally Posted by *guttheslayer*
> 
> The long card could be due to cooling capacity which is sorely needed for the insane tdp. I recall all the past 12 inch monster have enomous tdp.
> 
> Its not entirely impossible for a gp102 g5x variant without dp to emerge, but that is probably next year matter, most likely to cater the gtx 1100 series. For now, this year sept, they must find a place to sell all these failed tesla chip.


I agree, this September isn't going to bring Titans with G5X memory; I just don't think they will use GP102 either. I think GP102 will use 384-bit G5X and be called the 1080 Ti, but who knows.


----------



## scgeek12

Quote:


> Originally Posted by *JakdMan*
> 
> Well they could want the small P to be a direct _Titan X_ successor and be the absolute highest of high-end gaming paraphernalia, while the big P is a proper successor to the DP toutin' og Titans


I have a small P







are you sure you need a big P for some DP?


----------



## guttheslayer

Quote:


> Originally Posted by *scgeek12*
> 
> I have a small P
> 
> 
> 
> 
> 
> 
> 
> are you sure you need a big P for some DP?


Small P has no DP, so there's some incentive to go for the bigger P.

Anyway, I realised it doesn't matter if the Titan is G5X or HBM. Either way, I am excited that September is the date and I am getting one.


----------



## Tideman

Think I'm going to cancel my 1080 preorders and wait for the Titan. Suppose I should consider it a good thing non-FE 1080s have been in such short supply here. I've been waiting over a month already for mine... what's another couple of months for something far better?


----------



## xentrox

God please let this be true.. I need this in my life badly.


----------



## Ghoxt

One thing to think about: regardless of AMD's actions, Nvidia has, to an extent, a sliding window in which to make money on merchandise. They will only stall so long, and they will not burn a quarter waiting on AMD. I'm sure they have revenue targets to meet, and if those are not in jeopardy I could see them delaying a new GPU release.

However, if releasing the GP100 Titans makes sense for their quarterly revenue model, I could see them doing it, especially if the 1070/1080 were only OK revenue-wise and Nvidia sees an opportunity to garner further sales now as opposed to waiting.

With a new architecture, new memory, and a die shrink, I'm really interested to see what Nvidia can actually do (provided HBM2 is in play here).


----------



## iLeakStuff

GP100 is for Tesla and NVLink, and has HBM2 because those cards are very expensive and because interlinks and NVLink are required there.
GP102 is for consumer cards like GeForce. It comes with GDDR5X to keep the price down and to have enough supply for the massive interest.

Why do you think the OP says the Titan card is long?
Because it has GDDR5X, which requires a lot of PCB space.


----------



## DIYDeath

$1400 price tag is....no.

I could be persuaded @ $800-$1000...but not $1400.


----------



## FattysGoneWild

Quote:


> Originally Posted by *DIYDeath*
> 
> $1400 price tag is....no.
> 
> I could be persuaded @ $800-$1000...but not $1400.


Only to be beaten by a $700 card a few months later? À la Titan X to GTX 1080. What a complete money grab and waste the Titan series is.







They should call it Titan S for suckers.


----------



## Clocknut

Quote:


> Originally Posted by *DIYDeath*
> 
> $1400 price tag is....no.
> 
> I could be persuaded @ $800-$1000...but not $1400.


They're probably gonna try $1200-$1300. The mainstream-size GP104 is $600-700, which is a $150-250 markup from the GTX 980. I don't think the Titan will cost only $1000 if it comes with a bigger GPU.


----------



## guttheslayer

Quote:


> Originally Posted by *iLeakStuff*
> 
> GP100 is for Tesla and NVLink, and has HBM2 because those cards are very expensive and because interlinks and NVLink are required there.
> GP102 is for consumer cards like GeForce. It comes with GDDR5X to keep the price down and to have enough supply for the massive interest.
> 
> Why do you think the OP says the Titan card is long?
> Because it has GDDR5X, which requires a lot of PCB space.


I suggest you stop leaking nonsense until valid proof (like a leaked picture) is out.

All these failed Tesla chips need to go somewhere, and they have to go fast. The only chip available in sufficient volume for a September release is the same chip used in the Tesla P100 now.

A G5X variant of GP100 (if it really does exist) is saved for next year's GTX 1100s. If not, what happens when next year comes? Obviously this gives the Titan the edge since it's on HBM, and whatever the GTX 1180 (or even the Ti variant) churns out, it will never be on the same level as the HBM Titans, even at 3840 cores. This is a good move, as the 1180 can then be priced reasonably, like $700.

At the same time, it also opens up a whole new range of pricing for the Titans, which can be between $1000-$1500.

No matter what, I am gearing toward *a chip that actually exists* rather than some chip imagined by a chiphell poster with too much time on his hands.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Tideman*
> 
> Think I'm going to cancel my 1080 preorders and wait for the Titan. Suppose I should consider it a good thing non-FE 1080s have been in such short supply here. Been waiting over a month already for mine.. what's another couple months for something far better.


Probably a good idea anyway, not because I think Titans will be released anywhere near August/September, but because if you can stomach the wait of another month or two you will be able to pick up those 1080s much more easily, and you may even find some better bargains by then. The 1080 will be it for Nvidia at the high end for 2016...


----------



## ZealotKi11er

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Probably a good idea anyway, not because I think Titans will be released anywhere near August/September, but because if you can stomach the wait of another month or two you will be able to pick up those 1080s much more easily, and you may even find some better bargains by then. The 1080 will be it for Nvidia at the high end for 2016...


Define high end? Nvidia has been stepping on its own high end for ages now. They will introduce a $1400 Titan if they have to.


----------



## guttheslayer

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Probably a good idea anyway, not because I think Titans will be released anywhere near August/September, but because if you can stomach the wait of another month or two you will be able to pick up those 1080s much more easily, and you may even find some better bargains by then. The 1080 will be it for Nvidia at the high end for 2016...


Those Titans won't cause a drop in 1080 prices when they are twice as expensive.


----------



## Majin SSJ Eric

No, but I don't think Titans will be out this year anyway. Availability will lower 1080 prices (from where they are right now).


----------



## ZealotKi11er

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> No, but I don't think Titans will be out this year anyway. Availability will lower 1080 prices (from where they are right now).


We have seen it all: the $3K Titan Z, the $1700 Intel EE, the $700 mainstream GPU. I do not see why Nvidia can't release a Titan now for $1400 as a brand-dominance boost. Even if it's a paper launch, something like a Titan is a morale booster for people who buy 1060s and 1070s.


----------



## renejr902

I believe the VR World rumor; videocardz said they aren't liars. VR World said they tested the Titan in hand, so I believe in an August announcement, with HBM2 and two models, one with 12GB and the other 16GB. I still think the 12GB will be $999; at that price they know they will sell a lot more, and it will hurt Radeon Vega sales. If you bought a Titan at $999, I don't think you will buy a Vega flagship with HBM2. The 16GB Titan will be for the most fanatic and for people with a lot of money to spend. I think maybe they will skip the 1080 Ti and go directly to a GTX 1180 with a 384-bit bus, GDDR5X, and 3840 cores that will be near 16GB Titan performance. That's my prediction.


----------



## guttheslayer

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> No, but I don't think Titans will be out this year anyway. Availability will lower 1080 prices (from where they are right now).


Wow, are you kidding? A $1400 card can lower the price of the 1080? People could buy HB SLI 1080s for the same price.


----------



## KeepWalkinG

After 9 months the GTX 1080 will be $500.


----------



## guttheslayer

Quote:


> Originally Posted by *KeepWalkinG*
> 
> After 9 months the GTX 1080 will be $500.


And 3 months after that, the GTX 1170, faster than the 1080, will be out at $379.


----------



## renx

Quote:


> Originally Posted by *guttheslayer*
> 
> And 3 months after that, the GTX 1170, faster than the 1080, will be out at $379.


Those $379 xx70 video cards are like Bigfoot or the Loch Ness Monster: not proven to exist.
I have yet to see one to believe it.
Yet you have a point.


----------



## guttheslayer

Quote:


> Originally Posted by *renx*
> 
> Those $379 xx70 video cards are like Bigfoot or the Loch Ness Monster: not proven to exist.
> I have yet to see one to believe it.
> Yet you have a point.


You are funny. Nvidia has been doing regular releases of its next-generation lineup. We predicted there would be a 1000 series back in 2014. What is special here?

Unless we are seeing a full rebrand, the next X80 card will be GP102-based, the X70 will be a full GP104 card, and so on.

An X70 that is a full GP104 card is just a rebranded GTX 1080.


----------



## renx

Quote:


> Originally Posted by *guttheslayer*
> 
> You are funny. Nvidia has been doing regular releases of its next-generation lineup. We predicted there would be a 1000 series back in 2014. What is special here?


Oh, don't get the joke wrong. The video card does exist.
But the $379 one is nowhere to be seen.


----------



## Klocek001

The facts are pretty predictable: we are gonna see a *$1000 cut-down* chip, with the fully enabled one priced at $1200+, if Vega is either way late or can't deliver the performance to overtake the cut Titan P.
If we get a Vega that is stuck between the 1080 and Titan P, just like the Fury X was stuck between the 980 and 980 Ti and couldn't even touch the Titan X, we'll see Nvidia charge more.

I think the situation will get worse if the 1060 turns out to be faster than the 480. AMD will simply not be able to compete if the Polaris vs. Pascal architecture comparison turns out worse than GCN 1.1-1.3 vs. Maxwell.

I don't give a crap about overpaying for a GPU once, but if that's a sign of things staying that way for a long, long while then I might give that Xbox Scorpio a second thought.


----------



## renejr902

I pray for a $999 12GB Titan announcement in August. Please pray for me.







I can't spend $1400 on a 16GB Titan, or worst case $1400 on a 12GB Titan. I want a 4K 60fps video card before December; September to November is a perfect fit for me. I still don't think Far Cry Primal will hit 60fps at ultra in 4K, even with a full-die 16GB Titan, but most games will do it. And I don't enable AA at 4K at all.

Anyway, in the worst case I will play on my GTX 1070 until a $999 video card can do 4K at 60fps in most games.
Maybe Intel can do 4K at 60fps with their next integrated GPU? LoL
Intel should start making dedicated video cards; that way Nvidia would lower their prices in a panic.







The worst case for Nvidia is Intel buying Radeon. Can I dream?







... Last news: Intel surprised everyone and announced a dedicated video card with 8192 cores and 32GB of HBM2, named the Intel PowerDemon, at a 50%-off retail promotion if you buy an Intel 8-core CPU, for a limited time... LOL... I should go to sleep and stop drinking LOL


----------



## Klocek001

Quote:


> Originally Posted by *renejr902*
> 
> I pray for a $999 12GB Titan announcement in August. Please pray for me.
> 
> 
> 
> 
> 
> 
> 
> I can't spend $1400 on a 16GB Titan, or worst case $1400 on a 12GB Titan. I want a 4K 60fps video card before December; September to November is a perfect fit for me. I still don't think Far Cry Primal will hit 60fps at ultra in 4K, even with a full-die 16GB Titan, but most games will do it. And I don't enable AA at 4K at all.
> 
> Anyway, in the worst case I will play on my GTX 1070 until a $999 video card can do 4K at 60fps in most games.
> Maybe Intel can do 4K at 60fps with their next integrated GPU? LoL
> Intel should start making dedicated video cards; that way Nvidia would lower their prices in a panic.
> 
> 
> 
> 
> 
> 
> 
> The worst case for Nvidia is Intel buying Radeon.


I can justify spending $1.4K for a 4K 60fps card, but to do it because of Far Cry Primal is just too much for me.


----------



## renejr902

Quote:


> Originally Posted by *Klocek001*
> 
> I can justify spending $1.4K for a 4K 60fps card, but to do it because of Far Cry Primal is just too much for me.


LOL, I bought the GTX 1070 last week only to play DiRT Rally in 4K at 60fps while waiting for a true 4K 60fps card. Did you guys know you can start The Witcher III at ultra in 4K on an Intel integrated GPU (the one included in my i5 4690)? I get 1-2fps, still better than I expected before trying. It still looks as pretty as playing it on my old GeForce 980 Ti; it's just missing some fps.








Good night everyone! I'm falling asleep on my keyboard.


----------



## Klocek001

Well, you've got one 1070 already; in your case I'd go 1070 SLI instead of paying a huge price premium for the Titan P.
1070 SLI is already 41% faster than a 1080 in 4K, and that comes at only $200 more comparing FE vs. FE.
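The FE-vs-FE price part of that claim is easy to check against the launch Founders Edition MSRPs ($449 for the 1070 FE, $699 for the 1080 FE):

```python
# Founders Edition launch prices in USD.
gtx_1070_fe = 449
gtx_1080_fe = 699

sli_cost = 2 * gtx_1070_fe           # two 1070 FEs for SLI
premium = sli_cost - gtx_1080_fe     # extra cost over a single 1080 FE
print(sli_cost, premium)             # 898 199 -> "only $200 more"
```

Whether the 41% 4K scaling holds depends entirely on the game and SLI profile, of course.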


----------



## ref

Man, a 4K 144Hz panel needs to be released ASAP.

I'll be getting two of these whenever they release, but I'd love to upgrade from 2K 144Hz. I just can't go back to 60Hz though; first world problems.

I still think we will see a Titan before 2017. I'm not sold on it being available in August though; late September would be my absolute earliest uninformed, gut-feeling guess.


----------



## DNMock

Quote:


> Originally Posted by *Klocek001*
> 
> The facts are pretty predictable: we are gonna see a *$1000 cut-down* chip, with the fully enabled one priced at $1200+, if Vega is either way late or can't deliver the performance to overtake the cut Titan P.
> If we get a Vega that is stuck between the 1080 and Titan P, just like the Fury X was stuck between the 980 and 980 Ti and couldn't even touch the Titan X, we'll see Nvidia charge more.
> 
> I think the situation will get worse if the 1060 turns out to be faster than the 480. AMD will simply not be able to compete if the Polaris vs. Pascal architecture comparison turns out worse than GCN 1.1-1.3 vs. Maxwell.
> 
> I don't give a crap about overpaying for a GPU once, but if that's a sign of things staying that way for a long, long while then I might give that Xbox Scorpio a second thought.


Yup, discussing what name they put on it is pretty pointless; those are the facts.


----------



## st0necold

Guys, is the Titan P pretty much confirmed for August? I was going to pick up two 1080 FEs to replace my 980 Tis, but I read that the Titan P will drop in August. Any advice?


----------



## rcfc89

Quote:


> Originally Posted by *st0necold*
> 
> Guys, is the Titan P pretty much confirmed for August? I was going to pick up two 1080 FEs to replace my 980 Tis, but I read that the Titan P will drop in August. Any advice?


Definitely wait. A 980 Ti OC'd vs. a 1080 OC'd is only about a 5-10% difference in performance, depending on the game. It would be silly to upgrade from a 980 Ti to a 1080. Wait for the Titan or 1080 Ti.


----------



## sherlock

Quote:


> Originally Posted by *st0necold*
> 
> Guys, is the Titan P *pretty much confirmed for August?* I was going to pick up two 1080 FEs to replace my 980 Tis, but I read that the Titan P will drop in August. Any advice?


The Titan P is still an *unreliable rumor*; I wouldn't count on it if I were you. In your case I'd still wait for cheaper AIB 1080s to come in stock. While the FEs are not bad cards themselves, there is no point in paying $700 just to have one when AIBs are cheaper and have more potential once people figure out a 1.15-1.25V volt-modded BIOS for the 1080. If you are going for dual-blower SLI, with or without waterblocks, the cheap blower variants should also have better stock in a week or two.

Plenty of evidence and track record goes against an early (2016) Titan P release:

The OG Titan is most comparable to the Titan P (the first big-chip gaming product on a new process), and that one came 11 months after the 680.
AMD is totally noncompetitive at the high end; the latest Vega estimate isn't until Q1 2017, and the "490" seems to be 480 CFX on one card (slower than a 1080 at 2x the power consumption). *The cards NV should release next are the 1050 and/or 1050 Ti* to compete with the 460/470 coming next; I'd venture to guess those are the cards that will be announced at Gamescom instead of the Titan P.
HBM2 availability is still unclear, and a GP100-based Titan P needs it. GP102 is just a chiphell invention with no solid proof.
Also, a Titan P at any likely price point ($1K+) wouldn't hurt 1080 resale value, so your 1080 decision shouldn't really be impacted by an imminent Titan P even if that were the case.


----------



## ChevChelios

Quote:


> A 980 Ti OC'd vs. a 1080 OC'd is only about a 5-10% difference in performance, depending on the game.


Nope, it's 20%.

No one knows when the Titan P comes.

IMO - in early 2017


----------



## Slomo4shO

Quote:


> Originally Posted by *guttheslayer*
> 
> Those titan wont cause a drop in 1080 price when they are twice as exp.


The only way the 1080 will drop in price is if AMD delivers a competitive product. If not, the GTX 1080 can easily occupy the $600-750 niche, with the GTX 1080 Ti coming in at $900-1000 and the Titan P at $1100-1300. Without competition, Nvidia has no incentive to lower prices.


----------



## philosopher

Quote:


> Originally Posted by *Slomo4shO*
> 
> The only way the 1080 will drop in price is if AMD delivers a competitive product. If not, the GTX 1080 can easily occupy the $600-750 niche, with the GTX 1080 Ti coming in at $900-1000 and the Titan P at $1100-1300. Without competition, Nvidia has no incentive to lower prices.


Their incentive will be to move units. There is no way the 1080 Ti drops at or near $1,000; that price point is reserved for the Titan. They will cut the price of "little Pascal" cards (1080/1070) and release "big Pascal" at or near their old price points. The 980 Ti was only $100-$150 more than the 980, and the 780 was only $100 more than the 680. At some point even most well-off PC gamers will seriously question dropping $1200+ on a single GPU. It's easy to forget on a forum such as this that that price point is well out of reach for 95%+ of consumers in the space. The only way I can see the 1080 Ti close to $1,000 is if HBM2 is still having yield issues.


----------



## Slomo4shO

Quote:


> Originally Posted by *philosopher*
> 
> Their incentive will be to move units.


Their incentive is to maximize profits. Greater volumes don't necessarily align with that goal.


----------



## Randomdude

Quote:


> Originally Posted by *ChevChelios*
> 
> Nope, it's 20%.
> 
> No one knows when the Titan P comes.
> 
> IMO - in early 2017






The temperature, clocks, frames, what have you, are all abysmal on this particular 1080, which is what a real-life scenario looks like at the moment. Not your 2100MHz 24/7-clocked unicorn cards that nobody has seen, yet judges performance by. So yes, at the moment it's not really this magical minimum 20% difference that started off as 40%.


----------



## Asmodian

Quote:


> Originally Posted by *sherlock*
> 
> Plenty of evidence and track record goes against an early (2016) Titan P release:
> 
> The OG Titan is most comparable to the Titan P (the first big-chip gaming product on a new process), and that one came 11 months after the 680.
> AMD is totally noncompetitive at the high end; the latest Vega estimate isn't until Q1 2017, and the "490" seems to be 480 CFX on one card (slower than a 1080 at 2x the power consumption). *The cards NV should release next are the 1050 and/or 1050 Ti* to compete with the 460/470 coming next; I'd venture to guess those are the cards that will be announced at Gamescom instead of the Titan P.
> HBM2 availability is still unclear, and a GP100-based Titan P needs it. GP102 is just a chiphell invention with no solid proof.
> Also, a Titan P at any likely price point ($1K+) wouldn't hurt 1080 resale value, so your 1080 decision shouldn't really be impacted by an imminent Titan P even if that were the case.


I do not believe this logic.









What the OG Titan did is pretty random; it is the only example we have, and other releases were quite different. Using a single data point to make future predictions is often misleading.
Nvidia is competing with its own old lineups, not AMD's. Nvidia could make more money releasing a high-end card now that is a good upgrade from the 980 Ti/Titan X, and another in a year, than it could by holding it back and not selling any top-end cards for six months. How many 1080s do they have to sell to make as much money as they do selling a single Titan P?
HBM2 availability is still unclear; there might be enough for a Titan P release.
Of course, my logic is heavily influenced by hope.


----------



## sherlock

Quote:


> Originally Posted by *Asmodian*
> 
> I do not believe this logic.
> 
> What the OG Titan did is pretty random; it is the only example we have, and other releases were quite different. Using a single data point to make future predictions is often misleading.
> Nvidia is competing with its own old lineups, not AMD's. Nvidia could make more money releasing a high-end card now that is a good upgrade from the 980 Ti/Titan X, and another in a year, than it could by holding it back and not selling any top-end cards for six months. How many 1080s do they have to sell to make as much money as they do selling a single Titan P?
> HBM2 availability is still unclear; there might be enough for a Titan P release.
> Of course, my logic is heavily influenced by hope.


The OG Titan (built on a then-new 28nm process, as the Titan P would be on a still-new 16nm process) is a better example than the Titan X (which was built on a mature 28nm process), and the Titan X (6 months after the 980) is where all the optimistic people get their "Titan P in 2016" idea from.

Nvidia, like Intel, is competing with its old lineup. What does/did Intel do when competing with its old lineup?

Drag the release cycle out (a 12-month cycle elongates into a 16-18 month cycle)
Make generational product improvements smaller (Tick-Tock goes to Tick-Tock-Tock, and so on)
Increase the price of the same SKU in new generations
All of this works against the optimistic projection of an August 2016 Titan P at $1000 with a 50% improvement over the 1080. When competing only with your old lineup, it makes more sense to slow the pace of development and save on R&D instead of continually pushing out new products.
Quote:


> How many 1080s do they have to sell to make as much money as they do when selling a single Titan P?


Given how much more silicon a Titan P needs (around 610mm² vs. around 314mm² for GP104) and the price of HBM2 vs. GDDR5X, not as many as you think if they keep the price at $1000. A 1080 at $650-700 might be more profitable per unit than a $1K Titan P at this stage of yields (and all the good GP100 chips go to compute because that's more profitable) and HBM2 price/availability. Also, Titans don't sell much to begin with; *there are already more 1080s than Titan Xs* in people's rigs according to the June Steam survey.
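To put the silicon-cost point in rough numbers, here's the standard gross dies-per-wafer approximation on a 300mm wafer, using the ~610mm² GP100 and ~314mm² GP104 figures above. This ignores yield, die aspect ratio, and scribe lines, so treat it as a sketch only:

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Classic approximation: wafer area / die area, minus an edge-loss term."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

big_die = gross_dies_per_wafer(610)    # GP100-class
small_die = gross_dies_per_wafer(314)  # GP104-class
print(big_die, small_die)              # roughly 2x as many GP104 candidates per wafer
```

And that's before yield: random defects hit a 610mm² die much harder than a 314mm² one, so the effective cost gap per good die is even wider than the gross count suggests.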


----------



## magnek

Quote:


> Originally Posted by *Klocek001*
> 
> The facts are pretty predictable: we are gonna see a *$1000 cut-down* chip, with the fully enabled one priced at $1200+ *at $1400*, if Vega is either way late or can't deliver the performance to overtake the cut Titan P.
> If we get a Vega that is stuck between the 1080 and Titan P, just like the Fury X was stuck between the 980 and 980 Ti and couldn't even touch the Titan X, we'll see Nvidia charge more.
> 
> I think the situation will get worse if the 1060 turns out to be faster than the 480. AMD will simply not be able to compete if the Polaris vs. Pascal architecture comparison turns out worse than GCN 1.1-1.3 vs. Maxwell.
> 
> I don't give a crap about overpaying for a GPU once, but if that's a sign of things staying that way for a long, long while then I might give that Xbox Scorpio a second thought.


FTFY

Titans have historically been $350 more than the next cut-down card.


----------



## xioros

Quote:


> Originally Posted by *Klocek001*
> 
> I don't give a crap about overpaying for a GPU once, but if that's the sign of things to stay for a long,long while then I might give that Xbox scorpio a second thought


That's the thing. It's staying because you did it once. And then again. Once a company figures out it can screw you over and get away with it, it won't stop doing so.
Heck, the only reason we're getting screwed every generation is that you guys keep giving in and keep telling yourselves it's a good deal. Nvidia isn't to blame (as I pointed out in another thread: a system that allows for derailed capitalism, and dumb consumers, are to blame); the buyers of their products are. If no one buys at that price, the price goes down.

Nvidia is now trying to figure out the equilibrium between how much they can overcharge us and how much sales volume they lose.
Quote:


> Originally Posted by *magnek*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Klocek001*
> 
> The facts are pretty predictable: we are gonna see a *$1000 cut-down* chip, with the fully enabled one priced at $1200+ *at $1400*, if Vega is either way late or can't deliver the performance to overtake the cut Titan P.
> If we get a Vega that is stuck between the 1080 and Titan P, just like the Fury X was stuck between the 980 and 980 Ti and couldn't even touch the Titan X, we'll see Nvidia charge more.
> 
> I think the situation will get worse if the 1060 turns out to be faster than the 480. AMD will simply not be able to compete if the Polaris vs. Pascal architecture comparison turns out worse than GCN 1.1-1.3 vs. Maxwell.
> 
> I don't give a crap about overpaying for a GPU once, but if that's a sign of things staying that way for a long, long while then I might give that Xbox Scorpio a second thought.
> 
> 
> 
> FTFY
> 
> Titans have historically been $350 more than the next cutdown card.

And small dies have historically been sub-$300. Well, that became history with the GTX 680. Look at Nvidia now, managing to charge $700 for the small die, and people still think it's a fair price. I'm expecting a $1400-1500 Titan and a $1000 GTX 1080 Ti.

*On a side note: what I'm most afraid of is that AMD will price a product that's nearly as fast just below the Nvidia equivalent, instead of forcing Nvidia's price down. Nvidia won't give a crap and the consumer will be stuck with a crappy GPU market.*


----------



## Asmodian

Quote:


> Originally Posted by *sherlock*
> 
> The OG Titan (built on a then-new 28nm process, as the Titan P would be on a still-new 16nm process) is a better example than the Titan X (which was built on a mature 28nm process), and the Titan X (6 months after the 980) is where all the optimistic people get their "Titan P in 2016" idea from.
> 
> Nvidia, like Intel, is competing with its old lineup. What does/did Intel do when competing with its old lineup?
> 
> Drag the release cycle out (a 12-month cycle elongates into a 16-18 month cycle)
> Make generational product improvements smaller (Tick-Tock goes to Tick-Tock-Tock, and so on)
> Increase the price of the same SKU in new generations


True, the OG Titan is a better data point than the Titan X, but it is only one data point, and it is the first use of the "Titan" branding, in a situation where the next planned node failed and Nvidia was reinventing its roadmaps. I seriously doubt we can use the OG Titan as a reference for the future given the 780 Ti and Titan Black; those will never happen again.

The entire 28nm node was new and different: GK104 was released as the flagship for the first time, GK100 failed and needed to be reworked into GK110, the 20nm node failed, and the invention of a new high-margin price point all confused the release schedule and made basing future predictions on what happened then a bad idea, in my opinion.

A big difference between Nvidia and Intel is that Nvidia's new products are still significantly faster than their old ones. Intel is also chasing very different markets right now; the server market is well planned, mature, and predictable. The GPU market is still weird and evolving quickly, and Nvidia holding back on GPU R&D right now is not safe in the way holding back on CPU R&D is for Intel.
Quote:


> Originally Posted by *sherlock*
> 
> All of this works against the optimistic projection of an August 2016 Titan P at $1000 with a 50% improvement over the 1080. When competing only with your old lineup, it makes more sense to slow the pace of development and save on R&D instead of continually pushing out new products.
> Given how much more silicon a Titan P needs (around 610mm² vs. around 314mm² for GP104) and the price of HBM2 vs. GDDR5X, not as many as you think if they keep the price at $1000. A 1080 at $650-700 might be more profitable per unit than a $1K Titan P at this stage of yields (and all the good GP100 chips go to compute because that's more profitable) and HBM2 price/availability. Also, Titans don't sell much to begin with; *there are already more 1080s than Titan Xs* in people's rigs according to the June Steam survey.


I agree that a Titan P in late August seems unexpectedly soon, but I think we might have another new situation with the HBM2 cards. At $1000 I will agree that the Titan might not have great margins, but is $1000 really the market limit for a Titan P with 16GB of HBM2? I don't think so. This is the first time the Titan would actually be significantly different from the X80 Ti. What does Nvidia do with all the leaky, high-power-use Tesla chips? Selling them to me now, for well above $1000, might strike Nvidia as a good idea. I am not going to buy a 1080, but I am willing to drop a lot of money on something that would be significantly faster than my 980 Ti, and, like a lot of people, I will do this every year if something better is available. Nvidia really wants people like me buying a Titan instead of a Ti, so they need something that differentiates the Titan enough that I am willing to pay the extra. Releasing something like the 780 Ti after the OG Titan hurts a lot, and the Titan X was much too similar to the 980 Ti to do it. However, a version with HBM2 that gets released early? Their average price per card sold could go through the roof.









The Titan P 12GB and 16GB cards could be very expensive (as if $1000 weren't) and be out a long time before the 1080 Ti. We might not see GP102 on a 1080 Ti until spring; Nvidia might even call it the 1180 like @guttheslayer keeps claiming. This would allow the Titan line to sell more units without the Ti undercutting it, it gives flush 1080 owners something to upgrade to quickly*, and it even reinforces the Titan brand as the "super high margin" line.

I think Nvidia is still figuring out how the Titan, x80Ti, and standard x80/x70/x60 lineup fit together to generate the most money.

*Six months might be enough time that people want to get something new, and many just aren't the type to drop stupid amounts of money on the Titan line.


----------



## PostalTwinkie

Quote:


> Originally Posted by *ref*
> 
> Man, a 4K 144Hz panel needs to be released ASAP.
> 
> I'll be getting two of these whenever they release, but I'd love to upgrade from 2K 144Hz. I just can't go back to 60Hz though; first world problems.
> 
> I still think we will see a Titan before 2017. I'm not sold on it being available in August though; late September would be my absolute earliest uninformed, gut-feeling guess.












Not that this Acer Predator isn't amazing; I just want 4K to go with all that amazing. At 4K I will be happy for a number of years, and by the time we have 8K 144Hz I will be too damn blind to use a computer anyway!


----------



## Just Spear

Oooh boy, I can't wait. Hopefully nothing breaks, because the next couple of months' rent income is designated Titan funds.

My OG Titan is getting a little old.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Just Spear*
> 
> Oooh boy, I can't wait. Hopefully nothing breaks, because the next couple of months' rent income is designated Titan funds.
> 
> My OG Titan is getting a little old.


"Dear Tenants,

Due to the upcoming release of the Pascal-based Titan-class GPU from Nvidia, I have opted to increase your rent by $25 per month."


----------



## guttheslayer

Quote:


> Originally Posted by *philosopher*
> 
> Their incentive will be to move units. There is no way the 1080Ti drops at or near $1,000. That price point is reserved for Titan. They will cut the price of "little Pascal" cards(1080/1070) and release "big Pascal" at or near their old price points. 980Ti was only $100-$150 more than 980....780 was only $100 more than 680. At some point even most well off pc gamers will seriously question dropping $1200+ on a single gpu. It's easy to forget on a forum such as this that that price point is well out of reach for 95%+ of consumers in the space. The only way I can see 1080Ti close to $1,000 is if HBM2 is still having yield issues.


The 1080 Ti is the Titan Jr.

If AMD were competitive: the 12GB GP100 would be the 1080 Ti @ $700, and the 16GB would be the usual Titan @ $999.

Since AMD isn't competitive:

12GB = Titan Jr @ $999, 16GB = Super Titan @ $1200-1500.

Whatever it is, I strongly believe in an Aug/Sept launch.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *sherlock*
> 
> OG Titan(built on a still new 28nm process, like Titan P is on a still new 16nm process) is a better example than Titan X(which is built on a mature 28nm process), and Titan X(6 month after 980) is where all the optimistic People get their "Titan P in 2016" idea from.
> 
> Nvidia like Intel is competing with their old lineup, what does/did Intel do when competing with their old lineup:
> 
> Drag the release cycle out longer (a 12-month cycle elongates into a 16-18 month cycle)
> Make generational product improvements smaller (Tick-Tock goes to Tick-Tock-Tock and so on)
> Increase the price of the same SKU for new generations
> All of this works against the optimistic projection of an August 2016 Titan P at $1000 with a 50% improvement over the 1080. When competing only with your old lineup, it makes more sense to slow down the pace of development and save on R&D instead of continually pushing out new product.
> Given how much more silicon a Titan P needs (610 mm² vs GP104's roughly 314 mm²) and HBM2's price vs GDDR5X, not as much as you think if they keep the price at $1000. A 1080 at $650-700 might be more profitable per unit than a $1K Titan P at this stage of yields (and all the good GP100 chips have to go to compute because that's more profitable) and HBM2 price/availability. Also, Titans don't sell much to begin with; *there are already more 1080s than Titan Xs* in people's rigs according to the June Steam Survey.


Yes to every one of your points, and that is exactly why we won't be seeing a new Titan coming out this year (besides it really just being a stupid rumor to start with). It makes no sense for them to do so; they don't need to be MORE powerful at the high end right now when AMD is not even competing. When AMD releases Vega, what if it ends up being really strong and Nvidia has already shot their "Titan" wad back in August 2016? They would have nothing to counter with but a 1080 Ti that would have a "been there, done that" feel about it. By holding back, they can bring the fight to AMD with a new Titan release at a time of their choosing, without wasting an entire product launch at a time they absolutely do not need to.

*Keep in mind that Volta is a 2018 product, so if they did release a Titan in August 2016 they would have nothing left to release besides a cut-down Pascal 1080 Ti over the next 1.5 years! It's just an idiotic idea and it's not happening...*


----------



## ZealotKi11er

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Yes to every one of your points and that is exactly why we won't be seeing a new Titan coming out this year (besides it really just being a completely stupid and idiotic rumor to start with). It makes no sense for them to do so as they don't need to be MORE powerful at the high end right now when AMD is not even competing. When AMD releases Vega what if it ends up being really strong, and Nvidia has already shot their "Titan" wad back in August 2016? They would have nothing to counter with but a 1080Ti that would have a "been there, done that" feel about it, whereas by holding back they can bring the fight to AMD with a new Titan release at a time of their choosing without wasting an entire product release at a time they absolutely do not need to do so.
> 
> *Keep in mind that Volta is a 2018 product so if they did release Titan in August 2016 they would have nothing left to release besides a cut down Pascal 1080Ti over the next 1.5 years! Its just an idiotic idea and its not happening...*


If they do release a Titan now, it will be a cut-down Titan like the OG Titan. In 2017 they will release the 1080 Ti and, if there is a need, a full Titan. A year after that they will release the new architecture in the form of an 1180.


----------



## magnek

They could release a Titan P in 2016 as long as it's priced so high (i.e. $1400, or 2x the 1080 FE price) that there's no danger of cannibalizing 1080 sales at all.


----------



## HowAmI

Quote:


> Originally Posted by *Glottis*
> 
> i'll be using GTX1080Ti @ 1080p. (please don't have heart attack)


That's the stupidest 1080 Ti post I've seen... At that point you should be given a 970; a 1080 Ti is for 4K, not 1080p.


----------



## magnek

B-b-but 1080p *180Hz*!


----------



## HowAmI

Quote:


> Originally Posted by *magnek*
> 
> B-b-but 1080p *180Hz*!


B-b-but 1080p *1080Hz!*************


----------



## Somasonic

Plenty of horsepower for all the AA that 1080p needs


----------



## renejr902

Quote:


> Originally Posted by *Klocek001*
> 
> well you've got one 1070 already,in your case I'd go 1070 SLI instead of paying a huge price premium for Titan P.
> 1070 SLI is already 41% faster than 1080 in 4K, and that comes at only $200 more comparing FE vs FE


Hmm, it's still interesting, but I would have to change my motherboard, and I prefer to go Titan; so many people seem to hate SLI for several reasons... And I play several old games, between 1 and 10 years old, which could be a problem I suppose... Thanks anyway for the suggestion.







In the case that only a 16GB Titan card is released, if it costs too much I will wait for a 1080 Ti.


----------



## magnek

Quote:


> Originally Posted by *Somasonic*
> 
> Plenty of horsepower for all the AA that 1080p needs












Honestly might as well go up in resolution if that's truly what one is shooting for. I'll be honest, I don't think I could ever go back to 1080p now that I've been on 1440p for a year. I'll probably never go 4K though simply because it's unusable without DPI scaling (which Windows absolutely sucks at) or a 40" screen. :/


----------



## renejr902

Quote:


> Originally Posted by *HowAmI*
> 
> B-b-but 1080p *1080Hz!*************


b-but 720p in 2160hz WoW!


----------



## Ghoxt

Nvidia has revenue requirements to make, which no one on OCN thinks or talks about. I'm sure NV has short, medium and longer term QTR to QTR Revenue targets, meaning Revenue they are counting on. We don't know how well they are doing at any period until after a public earnings report, but even then we are blind to their projections.

If their Revenue roadmap is planning on a release of the Titan P in the short term, they will do it. We have to remind ourselves that this company is not doing this for altruistic GPU fanboy reasons. Follow the fat man with the cash... and you'll not be surprised.


----------



## renejr902

I tested my Gigabyte GTX 1070 Windforce OC today (I hadn't overclocked it myself yet, but I tested it in OC mode with Guru extreme). In Witcher 3 at ultra, no AA, no HairWorks, at 4K, I got 40fps with the 1070. At the same location with my old 980 Ti G1, fully overclocked by me, I got 46fps. Everywhere in the game the difference between the two cards is 4 to 6fps in favor of my old, fully overclocked 980 Ti G1. Next week I will overclock my 1070; I expect the performance of the two cards will then be the same.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Ghoxt*
> 
> Nvidia has revenue requirements to make, which no one on OCN thinks or talks about. I'm sure NV has short, medium and longer term QTR to QTR Revenue targets, meaning Revenue they are counting on. We don't know how well they are doing at any period until after a public earnings report, but even then we are blind to their projections.
> 
> If their Revenue roadmap is planning on a release of the the Titan P in the short term, they will do it. We have to remind ourselves, this company is not doing this for altruistic GPU fanboy reasons. Follow the fat man with the cash... and you'll not be surprised.


The point is, if the last four years of product cycles from Nvidia hold true, they would've never planned on releasing a titan in 2016 anyway. Profitability from the 1080/1070/1060 will be more than enough to suffice for the rest of this year. The only way I see that changing is if AMD unexpectedly dropped a high-performance Vega card into the market in the fall. But that too is highly unlikely.


----------



## Ghoxt

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> The point is, if the last four years of product cycles from Nvidia hold true, they would've never planned on releasing a titan in 2016 anyway. Profitability from the 1080/1070/1060 will be more than enough to suffice for the rest of this year. The only way I see that changing is if AMD unexpectedly dropped a high-performance Vega card into the market in the fall. But that too is highly unlikely.


I have no doubt what you say is true, Nvidia could go completely without the Titan and not miss a beat, I just think they are maximising profits. They have to pay for those McMansions...


----------



## twitchyzero

Quote:


> Originally Posted by *rcfc89*
> 
> No, considering the 980 Ti max OC is within 5-10% of a 1080 max OC, you are virtually asking for a card that is 90-95% faster than a 1080. Unless the new Titan is a dual-GPU, that's not happening. If you are referring to stock-clock performance of a 980 Ti, then, well, you're on the wrong forum.


nah, a comparison metric that takes OC into account is unreliable because there are way too many variables:

1. non-factory OC is lottery-based, ymmv
2. max OC, how's that defined? Max on air? Max on LN2?
If both are max on water, are they using identical coolers?


----------



## Glottis

Quote:


> Originally Posted by *HowAmI*
> 
> That's the stupidest 1080ti post I've seen.... At that point you should be given a 970 a 1080ti is for 4k not 1080p


the only thing dumb and ignorant here is posts like yours. i had a gtx670 and people said it's overkill for 1080p. i had gtx 780 and people said it's overkill for 1080p. i have 980ti and people said it's overkill for 1080p but i already find myself lowering gfx settings to achieve 60+fps. i'll be using 1080ti for whatever resolution i damn please and there isn't a single thing you can do about it, so DEAL WITH IT.


----------



## Kpjoslee

Quote:


> Originally Posted by *Glottis*
> 
> the only thing dumb and ignorant here is posts like yours. i had a gtx670 and people said it's overkill for 1080p. i had gtx 780 and people said it's overkill for 1080p. i have 980ti and people said it's overkill for 1080p but i already find myself lowering gfx settings to achieve 60+fps. i'll be using 1080ti for whatever resolution i damn please and there isn't a single thing you can do about it, so DEAL WITH IT.


I am curious, what game are you struggling to achieve 60fps+ with maxed out settings on 1080p with 980ti?


----------



## ChevChelios

Quote:


> Originally Posted by *renejr902*
> 
> I tested my windforce oc gigabytes 1070 gtx today (i didnt overclock it myself until now, but i tested it in OC mode with guru extreme) ,in witcher3 at ultra no AA no hairwork at 4k i got 40fps with the 1070. At the same location with my old 980ti G1 fully overclocked by myself i got 46fps. Anywhere in the game the difference between the 2 cards is 4 to 6fps more for my old 980ti G1 overclocked fully. Next week i will overclock my 1070, i suppose the performance will be the same with the 2 cards


how overclocked is your 980Ti G1 ?

but yes: OC 1080 > OC 1070 = OC 980Ti


----------



## Glottis

Quote:


> Originally Posted by *Kpjoslee*
> 
> I am curious, what game are you struggling to achieve 60fps+ with maxed out settings on 1080p with 980ti?


Many: GTA5, Witcher 3, The Division, Rise of the Tomb Raider. (There are some newer games that I don't own; afaik Hitman is very demanding, etc.) Problem is, when people see benchmarks they only care about those nice-looking average fps graphs. What matters most to me is minimum fps. An 80fps or 200fps average is meaningless to me if I get drops into the 50s or 40s from time to time.


----------



## guttheslayer

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> The point is, if the last four years of product cycles from Nvidia hold true, they would've never planned on releasing a titan in 2016 anyway. Profitability from the 1080/1070/1060 will be more than enough to suffice for the rest of this year. The only way I see that changing is if AMD unexpectedly dropped a high-performance Vega card into the market in the fall. But that too is highly unlikely.


Nope, that is where you are wrong: if Vega were competitive, it wouldn't be just a Titan, it would be a 1080 Ti ALONG WITH a Titan.

Look back at all three generations of Titan: has Nvidia ever released a Titan just to compete with AMD? No, it's always been the 780 Ti and the 980 Ti.

When the past three Titans were released, was there any competitive product from AMD? No. This shows the Titans are in a different league, and Nvidia will release them when and where they like. They won't be affected by AMD in any way, and neither will the price.


----------



## NikolayNeykov

Quote:


> Originally Posted by *Glottis*
> 
> many. GTA5, witcher3, division, rise of the tomb raider. (there are some newer games that i don't own, afaik Hitman is very demanding etc) problem is, when people see benchmarks they only care about these nice looking avarage fps graphs. what matters the most for me is minimal fps. 80fps or 200fps avarage is meaningless to me if i get drops to 50s or 40s from time to time.


I think you have other problems. I find it hard to see less than 30-35 fps in any game with ultra settings at 4K; I know this card is made for 4K, but at 1080p it should cover the 60Hz range easily.
Check your PSU / motherboard / CPU / RAM.

I tested Witcher 3 at 1080p with absolutely maxed settings and didn't drop below 90 fps anywhere.


----------



## ChevChelios

Quote:


> Originally Posted by *Glottis*
> 
> many. GTA5, witcher3, division, rise of the tomb raider. (there are some newer games that i don't own, afaik Hitman is very demanding etc) problem is, when people see benchmarks they only care about these nice looking avarage fps graphs. what matters the most for me is minimal fps. 80fps or 200fps avarage is meaningless to me if i get drops to 50s or 40s from time to time.


(1) use Gsync, helps a lot with fps drops

(2) what is your CPU/RAM ? you should be getting even minimum fps above 60 @ 1080p on a 980Ti (except maxed out Ashes/Hitman I guess), unless you have trash CPU or some other issue


----------



## prjindigo

If this thread had more garbage in it Greenpeace would protest it.

Pretty sure the 1080 Ti and Titan P will be 512-bit cards at this point; GDDR5X cuts the number of chips needed in half. We may see the 1080 Ti being a bank-disabled/reduced-bit-width version of the 512-bit Titan P.


----------



## Klocek001

Quote:


> Originally Posted by *Glottis*
> 
> many. GTA5, witcher3, division, rise of the tomb raider. (there are some newer games that i don't own, afaik Hitman is very demanding etc) problem is, when people see benchmarks they only care about these nice looking avarage fps graphs. what matters the most for me is minimal fps. 80fps or 200fps avarage is meaningless to me if i get drops to 50s or 40s from time to time.


I was like that back when I was running a fixed-refresh monitor; before I moved to Nvidia, when I had a 290 in my system, I found fps drops in games to be the #1 issue.
G-Sync/FreeSync is the way to go, not only on high-end systems but on mid-range ones too. That 980 Ti is just going to waste a lot of its capability if you're not going to run it on an adaptive-sync display.


----------



## Darkpriest667

Quote:


> Originally Posted by *Glottis*
> 
> the only thing dumb and ignorant here is posts like yours. i had a gtx670 and people said it's overkill for 1080p. i had gtx 780 and people said it's overkill for 1080p. i have 980ti and people said it's overkill for 1080p but i already find myself lowering gfx settings to achieve 60+fps. i'll be using 1080ti for whatever resolution i damn please and there isn't a single thing you can do about it, so DEAL WITH IT.


I agree. People told me 2 years ago my 980 was overkill for 1080p... What they fail to realize is A) I ran 4 monitors at that time, and B) I don't want minimum FPS ever going below 60.

The Division is the worst culprit among my games, but there are MANY that dip below 60 fps on the highest graphics settings... I'll be looking to pick up a 1080 Ti for this rig, throwing my 980 into my gf's rig, and selling her 980 (mine is SC).


----------



## JackCY

Quote:


> Originally Posted by *DNMock*
> 
> the 780ti is a slightly cut down Titan


Wasn't so back then.

In order of release:
Titan... 2688:224:48
780Ti... 2880:240:48
Titan Black... 2880:240:48
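
As a sanity check on that list: GK110's published layout is 192 CUDA cores per SMX, 15 SMX on the full die, so the shader counts map cleanly onto enabled SMX units. A minimal sketch:

```python
# GK110 ships 15 SMX units of 192 CUDA cores each (2880 total).
CORES_PER_SMX = 192

def enabled_smx(shader_count: int) -> int:
    """Return how many SMX units a given shader count implies."""
    assert shader_count % CORES_PER_SMX == 0
    return shader_count // CORES_PER_SMX

print(enabled_smx(2688))  # OG Titan: 14 of 15 SMX -> the cut-down part
print(enabled_smx(2880))  # 780 Ti / Titan Black: 15 of 15 SMX -> the full chip
```

So by SMX count the OG Titan was the cut-down GK110, and the 780 Ti that followed it was the full chip, which is the point being made above.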

---

Screw sync of any kind, just run an in-game fps limiter.


----------



## renejr902

Quote:


> Originally Posted by *ChevChelios*
> 
> how overclocked is your 980Ti G1 ?
> 
> but yes: OC 1080 > OC 1070 = OC 980Ti


I don't remember exactly the clocks of my OC'd 980 Ti G1; I don't have it anymore. But I wrote down some stats and fps at different save points, so I can compare with my 1070. It took me several days to overclock my 980 Ti last year; I got great clock numbers, 100% stable. Sorry, I really can't remember the exact clocks. I did it with the stock cooler, but my case has great airflow and temps were fine.

I play all my games on my Samsung JS8500 55" SUHD TV. (I had a 4K 28" and a 4K 40" in the past too, but I prefer the JS8500 at 55".) So I need a Titan P; drops to 30-35 are not great at 4K. This TV has very little motion blur, but 60fps with no drops is the best way to play: more fun, no tearing, no blur. I use the adaptive vsync option. With my 1070, Dirt Rally at ultra in 4K never drops below 60 (AA disabled), but one graphics setting is disabled at ultra; if I enable it I lose several fps. I don't remember the setting's name, but I can't see a visual difference.


----------



## pez

If you're not too prideful to turn AA, tessellation, etc. off, the 1080 does just fine at 4K.


----------



## twitchyzero

Quote:


> Originally Posted by *pez*
> 
> If you're not too prideful to turn AA, tesselation, etc off, the 1080 does just fine at 4K.


jaggies bother some people, and tessellation can really enhance the visuals
i'd argue if you're gonna turn all that off you might as well not run at 4K


----------



## Cyro999

Quote:


> meanwhile the sheer geniuses over at MS have figured out how to do 4K@60fps with a 6TFlops Scorpio


you mean 4k low settings at 20-30fps









We already know exactly what you can do at 4K res with an R9 290X's worth of performance


----------



## STEvil

Quote:


> Originally Posted by *twitchyzero*
> 
> jaggies bother some people and tesselation can really enhance the visuals
> i'd argue if you're gonna turn all that off might as well not run at 4K


So just turn them down so you still get good visuals. They don't necessarily need to be fully turned off.


----------



## PostalTwinkie

Quote:


> Originally Posted by *prjindigo*
> 
> If this thread had more garbage in it Greenpeace would protest it.
> 
> Pretty sure that the 1080ti and Titan-P will be 512 bit cards at this point, the GQDR5 (x) cuts the number of chips needed in half. We may see the 1080ti being a bank-disabled/reduced bit-width of the 512bit TitanP.


I would be a bit shocked if Titan didn't come with HBMv2 on it. I could possibly see a 1080 Ti with GDDR5X, but that would sour me a little.


----------



## Mhill2029

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I would be a bit shocked if Titan didn't come with HBMv2 on it. I could possibly see a 1080 Ti with GDDR5X, but that would sour me a little.


Is there much between GDDR5X and HBM2 in terms of overall gains from a gaming perspective though?


----------



## bigjdubb

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I would be a bit shocked if Titan didn't come with HBMv2 on it. *I could possibly see a 1080 Ti with GDDR5X, but that would sour me a little.*


The 1080 not having HBM2 soured me a little, the 1080ti not having it would sour me a lot. At least it would make the choice between AMD or Nvidia quite a bit easier.

Quote:


> Originally Posted by *Mhill2029*
> 
> Is there much between GDDR5X and HBM2 in terms of overall gains in a gaming perspective though?


That part of it doesn't really even matter to me. When you buy the top tier cards you expect top tier hardware. If AMD is offering HBM2 on their top tier cards and Nvidia doesn't then they are automatically second class for me, performance would be irrelevant at that point.


----------



## Somasonic

Quote:


> Originally Posted by *Glottis*
> 
> many. GTA5, witcher3, division, rise of the tomb raider. (there are some newer games that i don't own, afaik Hitman is very demanding etc) problem is, when people see benchmarks they only care about these nice looking avarage fps graphs. what matters the most for me is minimal fps. 80fps or 200fps avarage is meaningless to me if i get drops to 50s or 40s from time to time.


I poked fun before but in all fairness I think this is a valid point - there is no such thing as overkill if your minimum FPS is below 60.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Mhill2029*
> 
> Is there much between GDDR5X and HBM2 in terms of overall gains in a gaming perspective though?


Hard to really say, since we haven't seen HBMv2 yet. One would imagine that HBMv2 on a Titan class Pascal card would outperform a GDDR5X variant, especially at 4K+.
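
For a rough sense of the gap, peak bandwidth is just bus width times per-pin data rate. The GTX 1080's 256-bit GDDR5X at 10 Gbps is a known quantity; the HBM2 lines below assume the Tesla P100's roughly 1.4 Gbps per pin, so the Titan figures are speculative:

```python
def peak_bandwidth_gbps(bus_width_bits: int, data_rate_gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: pins * Gbps per pin / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

# GTX 1080: 256-bit GDDR5X @ 10 Gbps -> 320 GB/s
print(peak_bandwidth_gbps(256, 10.0))
# Tesla P100-style HBM2: 4096-bit (four stacks) @ ~1.4 Gbps -> ~717 GB/s
print(peak_bandwidth_gbps(4096, 1.4))
# Rumoured 12 GB Titan: 3072-bit (three stacks) at the same pin rate -> ~538 GB/s
print(peak_bandwidth_gbps(3072, 1.4))
```

Even the hypothetical three-stack configuration would be well ahead of GDDR5X at today's pin speeds, though whether games can use that headroom is the open question.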


----------



## guttheslayer

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Hard to really say, since we haven't seen HBMv2 yet. One would imagine that HBMv2 on a Titan class Pascal card would outperform a GDDR5X variant, especially at 4K+.


If anything, the GP100 has an enormous L2 cache that is tied to the insane bus width (4096 bits). Even with just 50% more shaders it comes with twice the cache, among other things like the register file. This is the biggest advantage, and I believe GP100 has a different class of IPC per core compared to the other GP10x variants, giving it beyond 50% more performance with only a 40% increase in core count. We shall see.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *twitchyzero*
> 
> jaggies bother some people and tesselation can really enhance the visuals
> i'd argue if you're gonna turn all that off might as well not run at 4K


You can still run tessellation at 4K (albeit at lowered settings), and I would agree that it's not a feature you'd want to turn off. At 4K, however, I really find your "jaggies" comment dubious at best. Pixel size on even a 40" 4K monitor is minuscule, so those "jaggies" would be insignificant to the point where complaining about them is merely nitpicky IMO. AA is just not needed at 4K. Its use may result in slightly better image quality, but it is most assuredly not the deal-breaker that tessellation-off would be. Of course everybody sees differently, so my opinion is definitely not the end-all, be-all (but I have noticed many ridiculous "claims" from certain people that their eyes are just oh-so-sensitive that they simply can't live with extremely minor issues, as though they would go blind from them or something).


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *bigjdubb*
> 
> That part of it doesn't really even matter to me. When you buy the top tier cards you expect top tier hardware. If AMD is offering HBM2 on their top tier cards and Nvidia doesn't then they are automatically second class for me, performance would be irrelevant at that point.


I disagree with this statement to a point. HBM didn't make the Fury X a better card than the 980Ti with its regular GDDR5 so performance is ultimately the most important metric. That said, it would be ridiculous for the new Titan to NOT have HBM2 considering its many advantages and AMD's assured use of it in Vega. Perhaps the new Titan would be faster than Vega regardless of memory config but I don't see them using anything but HBM2 on their next flagship card...


----------



## ZealotKi11er

Quote:


> Originally Posted by *Cyro999*
> 
> you mean 4k low settings at 20-30fps
> 
> 
> 
> 
> 
> 
> 
> 
> 
> We already know exactly what you can do on 4k res with an r9 290x of performance


The GPU in the Xbox will be faster than the 290X. Also, I play at 4K with a 290X and can get 30 fps with almost all settings at High/Ultra, even in the most demanding games. We know Xbox will still favor 30 fps, so it will do 4K 30fps no problem.


----------



## Xuvial

Quote:


> Originally Posted by *ZealotKi11er*
> 
> We know Xbox will still favor 30 fps so it will do 4K 30fps no problem.


I don't think XBox is aiming for 4K/30fps; it just doesn't make sense. I am betting 99.9% of console players play on their TVs and/or a 1080p monitor. With a TV, people typically sit far enough away that the visual difference between 4K and 1080p is almost unnoticeable (NOT worth the performance hit), and no console player is buying a 4K monitor.

1080p/60fps simply needs far less horsepower and memory bandwidth than 4K/30fps. It makes perfect sense for Scorpio to aim for native 1080p/60fps, which the current XBox is struggling (or completely failing) to do.
Imagine a console player getting a taste of Witcher 3 at 1080p/60fps at decent settings; their jaw is going to be on the floor. But what will they get at 4K? A bit less aliasing/jaggies they may not even notice on their TV... at the cost of horrible 30fps and lower settings. Not even remotely worth it.

4k is so incredibly redundant for console gaming it's not even funny


----------



## guttheslayer

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I disagree with this statement to a point. HBM didn't make the Fury X a better card than the 980Ti with its regular GDDR5 so performance is ultimately the most important metric. That said, it would be ridiculous for the new Titan to NOT have HBM2 considering its many advantages and AMD's assured use of it in Vega. Perhaps the new Titan would be faster than Vega regardless of memory config but I don't see them using anything but HBM2 on their next flagship card...


You have to redesign the whole chip for GDDR5X. 4 MB of L2 cache doesn't work for 384 bits, unless they are going for 512 bits this time, which is extremely unlikely.

For 384 bits it has to be tied to 3072 KB of L2 cache. So a GP100 GDDR5X variant would definitely be a bit different, not to mention needing a different memory controller to support 4096-bit vs 384-bit.
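
To put numbers on that mismatch, using only the figures in this thread (4096 KB of L2 against GP100's 4096-bit HBM2 bus, 3072 KB against a 384-bit GDDR5X bus): the L2 attached to each 32-bit slice of memory controller works out very differently. The per-slice accounting here is my assumption about how the partitioning is usually counted, not a confirmed GP100 detail:

```python
def l2_per_32bit_slice_kb(total_l2_kb: int, bus_width_bits: int) -> float:
    """KB of L2 cache per 32-bit memory-controller slice."""
    slices = bus_width_bits // 32
    return total_l2_kb / slices

# GP100 with HBM2: 4096 KB L2 over a 4096-bit bus -> 128 slices
print(l2_per_32bit_slice_kb(4096, 4096))  # 32.0 KB per slice
# Hypothetical GDDR5X variant: 3072 KB L2 over a 384-bit bus -> 12 slices
print(l2_per_32bit_slice_kb(3072, 384))   # 256.0 KB per slice
```

An 8x difference in cache per controller slice would support the point above: a GDDR5X GP100 would need a reworked memory subsystem, not just a swapped PHY.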


----------



## Somasonic

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> You can still run tessellation at 4k (albeit at lowered settings) and I would agree that its not a feature you'd want to turn off. *At 4k, however, I really find your "jaggies" comment dubious at best.* Pixel size on even a 40" 4k monitor is miniscule so those "jaggies" would be very insignificant to the point of merely being nitpicky by complaining about them IMO. *AA is just not needed at 4k.* Its use may result in a slightly better image quality but most assuredly is not the deal-breaker that Tess-off would be. Of course everybody sees differently so my opinion is definitely not the end all, be all (but I have noticed many ridiculous "claims" of certain people that their eyes are just oh-so-so sensitive that they simply can't live with extremely minor issues, as though they would go blind from them or something).


I heard the same sort of things before I went to 1440p and to be honest I like at least 4x AA at that res. Admittedly I haven't experienced 4k but my point is I wouldn't underestimate the difference in people's perception. What you may think are 'ridiculous claims' may in fact be what is ruining immersion and killing the experience for someone else.

Cheers.


----------



## guttheslayer

Quote:


> Originally Posted by *Somasonic*
> 
> I heard the same sort of things before I went to 1440p and to be honest I like at least 4x AA at that res. Admittedly I haven't experienced 4k but my point is I wouldn't underestimate the difference in people's perception. What you may think are 'ridiculous claims' may in fact be what is ruining immersion and killing the experience for someone else.
> 
> Cheers.


He is not wrong when you have pixel-density scaling of 2.25x at the same 27 inches. It's like doing 2x MSAA or even more.

AA is redundant for 4K if you are on a 27-inch display. On 32 or 40 inches, then you might consider having it on.
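
The 2.25x figure is just the pixel-count ratio between 4K and 1440p, and pixel density is easy to work out for the screen sizes being debated. A quick sketch:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch for a given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# 4K has 2.25x the pixels of 1440p, hence the "2.25x scaling"
print((3840 * 2160) / (2560 * 1440))  # 2.25

print(round(ppi(3840, 2160, 27), 1))  # 163.2 PPI: 4K at 27"
print(round(ppi(2560, 1440, 27), 1))  # 108.8 PPI: 1440p at 27"
print(round(ppi(3840, 2160, 40), 1))  # 110.1 PPI: 4K at 40"
```

Which lines up with the advice above: at 27 inches 4K is far denser than 1440p, while a 40-inch 4K panel sits at roughly the density where 1440p users already reach for AA.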


----------



## renejr902

I know what tessellation is. But I'm still not sure about one thing: how can I be sure it's enabled or disabled in a game? I know Witcher 3's HairWorks uses tessellation, and I don't care about HairWorks, so I disabled it to save some fps. But without HairWorks, will Witcher 3 use tessellation for other things? If yes, can you enable or disable it? Sorry if my question is confusing... Thanks for answering, I really want to know!


----------



## renejr902

Quote:


> Originally Posted by *guttheslayer*
> 
> He is not wrong when u have dynamic scaling of 2.25x for the same 27 inch. Its like doing msaa of 2x or even more.
> 
> AA is redundant for 4k if u are on 27 inch display. On 32 or 40 den u might consider having them on.


I tested AA at 4K on 27", 40", 50" and 55" screens.

In all cases you can disable AA, because if you are seeing some aliasing you are probably sitting too close to your screen. AA can still make an interesting difference on 50" and 55" screens, but only if you sit close enough; I sit close to my screen and I still don't find it necessary at all. I can live with a little aliasing without any problem. At 27" you don't see aliasing at all, so don't use it; if you want to play one inch from your screen you can enable it, lol. Even at 40" you don't really need it, you would have to be very close to see aliasing. In every case I disable AA.

About display quality, my favorite size was 40": everything looks great, with more detail. 27" is too small to really enjoy 4K in my opinion; it's still enjoyable, but the pixels are too small for me and Windows looks bad without scaling. Unfortunately I didn't test 32". 40" is the perfect balance. But I still play on my 55" because I love big screens.

I don't think I will ever buy an 8K screen, because even at 4K you have to sit very close to see all the detail; at 8K I don't want to play only one foot from my 55" to see it. 4K will suffice for my whole life, and I don't think 8K will be an important buying factor in the future. HDR is more important than an 8K screen.

For people playing at 1440p or 1600p, I recommend buying a 4K monitor: it's still much better, the detail is much better, it's worth it. I prefer 4K at 60Hz to 1440p at 120Hz. I can't live without a 4K monitor or TV. The day you play a PC game at ultra in 4K you won't turn back. I prefer playing a game at 4K at 40-45fps to playing at 1080p or 1440p at 60fps or more. I love the detail that 4K resolution brings; everything on screen becomes so much better, and even today I'm still impressed by a 4K screen. But I know everyone prefers different things; it's up to you. I know people who won't play at 40-45fps at all, and it does bring more motion blur too.


----------



## pez

Quote:


> Originally Posted by *twitchyzero*
> 
> jaggies bother some people and tessellation can really enhance the visuals
> i'd argue if you're gonna turn all that off might as well not run at 4K


Quote:


> Originally Posted by *STEvil*
> 
> So just turn them down so you still get good visuals. Dont necessarily need to be fully turned off.


Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> You can still run tessellation at 4k (albeit at lowered settings) and I would agree that its not a feature you'd want to turn off. At 4k, however, I really find your "jaggies" comment dubious at best. Pixel size on even a 40" 4k monitor is miniscule so those "jaggies" would be very insignificant to the point of merely being nitpicky by complaining about them IMO. AA is just not needed at 4k. Its use may result in a slightly better image quality but most assuredly is not the deal-breaker that Tess-off would be. Of course everybody sees differently so my opinion is definitely not the end all, be all (but I have noticed many ridiculous "claims" of certain people that their eyes are just oh-so-so sensitive that they simply can't live with extremely minor issues, as though they would go blind from them or something).


Quote:


> Originally Posted by *Somasonic*
> 
> I heard the same sort of things before I went to 1440p and to be honest I like at least 4x AA at that res. Admittedly I haven't experienced 4k but my point is I wouldn't underestimate the difference in people's perception. What you may think are 'ridiculous claims' may in fact be what is ruining immersion and killing the experience for someone else.
> 
> Cheers.


Quote:


> Originally Posted by *guttheslayer*
> 
> He's not wrong when you have dynamic scaling of 2.25x on the same 27". It's like doing 2x MSAA or even more.
> 
> AA is redundant at 4K if you're on a 27" display. At 32" or 40" you might consider having it on.


All good arguments. But yes, AA at 4K is less noticeable than at 2K, and so on. I've noticed some jaggies in FO4 without AA on, but you really have to look for them. Even then, something like FXAA can be turned on for a very minimal hit. I guess I should have originally said turning them down and/or off. In my mind I was thinking more of the original Tomb Raider and how turning on the hair effects could destroy framerates: a 'pretty' feature, but ultimately one that's not worth the performance hit.

However, I find that I tend to try and push my GPUs to the brink. I thought about staying on 2K and going with G-sync and high refresh rate, but ultimately 4K at 27" vs 2K at 27" is just a nice visual upgrade.


----------



## Zero4549

Quote:


> Originally Posted by *renejr902*
> 
> I tested AA at 4K on 27", 40", 50", and 55" screens.
> 
> In every case you can disable AA: if you're seeing aliasing, you're probably too close to your screen. AA can still make an interesting difference on a 50" or 55" screen, but only if you sit close enough; I sit close to mine and don't find it necessary at all. I can live with a little aliasing. At 27" you don't see aliasing at all, so don't use it (unless you want to play an inch from the screen, lol). Even at 40" you don't really need it; you'd have to be very close to notice. In every case I disable AA. About display quality, my favorite size was 40": everything looks great and you see more detail. 27" is too small to enjoy 4K in my opinion; it's still enjoyable, but the pixels are too small for me and Windows looks bad without scaling. Unfortunately I didn't test 32". 40" is the perfect balance, but I still play on my 55" because I love big screens. I don't think I'll ever buy an 8K screen: even at 4K you have to sit very close to see all the detail, and at 8K I don't want to sit one foot from my 55" to see it. 4K will suffice for my whole life; I don't think 8K will be an important buying factor in the future. HDR is more important than 8K. To people playing at 1440p or 1600p: I recommend buying a 4K monitor. It's still much better, the detail is much better, and it's worth it. I'd rather have 4K at 60 Hz than 1440p at 120 Hz. I can't live without a 4K monitor or TV; the day you play a PC game at ultra in 4K, you won't go back.


To each their own. Personally I'd take lower resolution over lower refresh rate any day.

Low resolutions aren't as pretty, but they don't negatively impact the player much (in fact they're often beneficial, both for game performance and for reducing the amount of visual noise on screen, letting you focus on the important parts more quickly).

Low refresh rates on the other hand directly hinder a player's ability to process data in real time, and more importantly in my case, give me killer migraines.

Now if I were to buy a monitor for office work, 3D rendering, photography, or digital art then yes I'd absolutely prioritize resolution over refresh rates. For gaming however, refresh rate is king.

Personally, I do both, and I don't have unlimited money or the desk space for more than one monitor, so I split the difference and got an overclockable Korean 1440p PLS panel. High enough refresh rate to keep my eyes from bleeding in games, high enough resolution and good enough colors for other work.


----------



## renejr902

Don't go 27" with a 4K screen. You need at least 32"; otherwise keep your 2K screen. I explained why in my last post.


----------



## renejr902

Quote:


> Originally Posted by *Zero4549*
> 
> To each their own. Personally I'd take lower resolution over lower refresh rate any day.
> 
> Low resolutions aren't as pretty, but they don't negatively impact the player much (in fact they're often beneficial, both for game performance and for reducing the amount of visual noise on screen, letting you focus on the important parts more quickly).
> 
> Low refresh rates on the other hand directly hinder a player's ability to process data in real time, and more importantly in my case, give me killer migraines.
> 
> Now if I were to buy a monitor for office work, 3D rendering, photography, or digital art then yes I'd absolutely prioritize resolution over refresh rates. For gaming however, refresh rate is king.
> 
> Personally, I do both, and I don't have unlimited money or the desk space for more than one monitor, so I split the difference and got an overclockable Korean 1440p PLS panel. High enough refresh rate to keep my eyes from bleeding in games, high enough resolution and good enough colors for other work.


I rarely play first- and third-person shooters; for shooters, refresh rate is very important. I mostly play RPGs, and I think 4K brings better immersion than a high refresh rate does in an RPG. I play racing games too; 60 fps is enough there, and 40-45 fps is not great at all, but in an RPG 40-45 fps is OK. 40 fps in a shooter is very bad.


----------



## pez

Quote:


> Originally Posted by *renejr902*
> 
> Don't go 27" with a 4K screen. You need at least 32"; otherwise keep your 2K screen. I explained why in my last post.


Yep, 4K at 27" is fine for me. Definitely a huge upgrade to 2K at 27".


----------



## renejr902

Quote:


> Originally Posted by *pez*
> 
> Yep, 4K at 27" is fine for me. Definitely a huge upgrade to 2K at 27".


Cool, I'm happy for you! I suppose you don't use AA, do you? I didn't try 2K at 27", so I can't really compare.


----------



## renejr902

To be honest, 2K is for the GTX 1080; people with 4K will surely buy the next Titan or 1080 Ti. I understand 2K at 144 Hz can still be useful with a Titan.
I can't wait for the next Titan!!! Come to me, 4K 60 Hz!
I think NVIDIA will try to get nearly every game to 60 fps at 4K with their Titan. I'm still hoping for nearly every game at 60-70 fps in 4K, but we have to accept that frame-rate drops will happen in some games. Some games should have a minimum of 40-45 fps and a maximum of 60-70 fps.


----------



## pez

Yeah, no AA necessary here. Strangely, the only game where I notice it, if I really look for it, is FO4. My second 1080 is hopefully on its way to me this week. The great thing is that this monitor upscales well, so 2K on it looks just as good as, if not better than, my last 27" IPS.


----------



## STEvil

I went from a 30" 2560x1600 to 46" 1920x1080 to 27" 3840x2160.

I think 27" 4k is too small. About 34-40" (depending on 16:9 or 21:9) would be perfect.


----------



## renejr902

Quote:


> Originally Posted by *pez*
> 
> Yeah, no AA necessary here. Strangely, the only game where I notice it, if I really look for it, is FO4. My second 1080 is hopefully on its way to me this week. The great thing is that this monitor upscales well, so 2K on it looks just as good as, if not better than, my last 27" IPS.


Great!


----------



## renejr902

Quote:


> Originally Posted by *STEvil*
> 
> I went from a 30" 2560x1600 to 46" 1920x1080 to 27" 3840x2160.
> 
> I think 27" 4k is too small. About 34-40" (depending on 16:9 or 21:9) would be perfect.


40" 16:9 is not too big; it works very well on my desk for work and photo editing too. 50" is absolutely too big for desktop use. You'd be happy with 40"; maybe a 32" 16:9 monitor would be OK for you, and 50-55" is great for the lounge. Now I need a 4K TV in my bathroom too! LoL, and I need to try VR in the bath too, LoL.


----------



## ChevChelios

anything more than 30" is too big for a desktop IMHO (for browsing/media/gaming use, not professional), regardless of res

maybe even 30" is too big

right now I think 27" 1440p 144hz is *the* perfect monitor, I may try a 30" 4K 144hz at some point in the future


----------



## pez

I'm still curious to try ultrawide, but I really want IPS. I paid a decent enough price for the 4K panel that the lack of G-Sync isn't too upsetting, but I really do want to try it, just not at a 3x price premium. Depending on how SLI does with 4K, I may return the 4K panel and consider holding out until Black Friday specials hit the X34 Predator.


----------



## Klocek001

Quote:


> Originally Posted by *ChevChelios*
> 
> anything more than 30" is too big for a desktop IMHO (for browsing/media/gaming use, not professional), regardless of res
> 
> maybe even 30" is too big
> 
> right now I think 27" 1440p 144hz is *the* perfect monitor, I may try a 30" 4K 144hz at some point in the future


True; I've got a 32" TV in my room and I'd never be able to use that as a monitor, it's too huge. 27" is almost perfect, though I think the best spot for me would be 28".
In the future I'd like to try a 21:9 version of a 28" 4K screen, with at least a 100 Hz panel. I might just give up upgrading forever if I had that.


----------



## DarkIdeals

Quote:


> Originally Posted by *Mookster*
> 
> They're probably talking about when you're at 720P and doing 230 FPS instead of the 250 you'd get if you weren't CPU limited.
> 
> I don't know what's up with NV lately. They've got a new marketing team, or something, I think. Their recent tactics are more desperate than AMD's during the 3rd "release" of GCN 1.0 cards.


It was confirmed to be an "inside joke" by NVIDIA, nothing more. They simply joked that, since the 6950X runs at 3.7 GHz, you might be better off with a 6700K, as a base-frequency 3.7 GHz Broadwell-E could actually bottleneck. Obviously, overclocking to ~4.2 GHz or more would mostly alleviate any possible bottleneck, even if it weren't a joke based on no real facts (which it was).


----------



## Neo_Morpheus

For full-screen immersion, 27" 4K is too small, almost a waste. It also depends on the room layout and how far back you can place your monitor. Some games are going to stress GPUs at 4K when they come out, so it's better to have a high-resolution monitor like 4K with a really high refresh rate; that way you always have the option to drop back to 2K if needed.


----------



## pez

Yep, I went back to 2K just to confirm, and I can't now; the IQ is just that much better for me. It'd be one thing if I went from a high refresh rate to a 'low' one, but 60 to 60 is great. Even desktop use isn't the same anymore. Once a proper G-Sync 4K display comes out that doesn't cost as much as the two 1080s needed to power it, maybe I'll hop on the wagon.


----------



## renejr902

Quote:


> Originally Posted by *ChevChelios*
> 
> anything more than 30" is too big for a desktop IMHO (for browsing/media/gaming use, not professional), regardless of res
> 
> maybe even 30" is too big
> 
> right now I think 27" 1440p 144hz is *the* perfect monitor, I may try a 30" 4K 144hz at some point in the future


For me 40" is perfect for desktop and 27" was too small. 4K at 27" has pixels too small for me, and I can't live with Windows scaling at 150-200%. The pixels are so small in games that my eyes water and the screen seems a little blurred; it may just be my eyes.

But it will be different for everybody. For me a 40" 4K screen is my favorite size for my desktop. One of my friends prefers the 27" over the 40", and another prefers my 40" over the 27". And another one can't see the difference between The Witcher 3 locked at 60 Hz versus 30 Hz! 30 fps vs 60 fps in The Witcher 3 is pretty obvious to me, but he told me he couldn't discern any difference; I was surprised, even shocked.

Before that I used a 40" at 1080p for my desktop, but those pixels were way too big, lol.


----------



## guttheslayer

Quote:


> Originally Posted by *renejr902*
> 
> For me 40" is perfect for desktop and 27" was too small. 4K at 27" has pixels too small for me, and I can't live with Windows scaling at 150-200%. The pixels are so small in games that my eyes water and the screen seems a little blurred; it may just be my eyes.
>
> But it will be different for everybody. For me a 40" 4K screen is my favorite size for my desktop. One of my friends prefers the 27" over the 40", and another prefers my 40" over the 27". And another one can't see the difference between The Witcher 3 locked at 60 Hz versus 30 Hz! 30 fps vs 60 fps in The Witcher 3 is pretty obvious to me, but he told me he couldn't discern any difference; I was surprised, even shocked.
>
> Before that I used a 40" at 1080p for my desktop, but those pixels were way too big, lol.


4K at 40" has almost the same PPI as 1440p at 27".

I prefer something not bigger than 32" if possible.
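The PPI comparison above is easy to verify: pixel density is just the diagonal pixel count divided by the diagonal size in inches. A minimal sketch (the helper name `ppi` is just for illustration):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'4K @ 40":    {ppi(3840, 2160, 40):.1f} PPI')  # ~110 PPI
print(f'1440p @ 27": {ppi(2560, 1440, 27):.1f} PPI')  # ~109 PPI
```

The two densities differ by about 1%, which is why the two setups render text and UI at essentially the same physical size.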


----------



## ChevChelios

30-32" is the sweet spot for 4K IMO, but everyone has their own personal preferences

anything significantly over 30" is too big for me for such a small (outstretched arm) distance to the screen


----------



## renejr902

Quote:


> Originally Posted by *guttheslayer*
> 
> 4K at 40" has almost the same PPI as 1440p at 27".
> 
> I prefer something not bigger than 32" if possible.


Maybe 32" is the sweet spot, the perfect size, but I haven't had a chance to test it yet. To be honest, I'm more impressed with the IQ on my 4K 40" than on my 4K 55"; you can't sit too close to a 55" 4K, otherwise the IQ isn't great. 32" at 4K could be the perfect desktop size for me too; I'll wait for a 32" 4K 144 Hz to test it. The PPI of a 32" 4K should be perfect.


----------



## renejr902

Quote:


> Originally Posted by *ChevChelios*
> 
> 30-32" is the sweet spot for 4K IMO, but everyone has their own personal preferences
> 
> anything significantly over 30" is too big for me for such a small (outstretched arm) distance to the screen


I have a BIG desk in a very large space, which could explain why I can deal with a 40" on it. But I would like to test a 4K 32"; I'll wait for a 4K 32" at 144 Hz to be released.


----------



## ChevChelios

If the monitor is so far away that even 40" is OK on a desk, then I can hardly call it a monitor anymore, tbh.

Or maybe I just need a jumbo-sized desk.


----------



## Ghoxt

On the HBM2 topic: was I the only one with the opinion that NVIDIA will likely get more out of HBM2 than AMD got out of first-gen HBM, all things being equal? We've learned that NVIDIA has some really good compression methods that let them get away with a 384-bit bus, as opposed to 512-bit, without suffering.

I have to give it to AMD for pushing the envelope by being first with HBM, but unfortunately it didn't translate into the "overclocker's dream" the marketing promised. I wish it had pushed things forward.

I was dreaming about a quad GP100 Titan P HBM2 watercooled rig, with NVLink fixing all the scaling problems of the past and curing cancer along the way. I know... you don't have to say it: "Keep right on dreaming, lol."


----------



## renejr902

Quote:


> Originally Posted by *ChevChelios*
> 
> If the monitor is so far away that even 40" is OK on a desk, then I can hardly call it a monitor anymore, tbh.
>
> Or maybe I just need a jumbo-sized desk.


LoL, yeah, it's a jumbo-sized desk, we could say that, LoL.


----------



## renejr902

Quote:


> Originally Posted by *Ghoxt*
> 
> On the HBM2 topic: was I the only one with the opinion that NVIDIA will likely get more out of HBM2 than AMD got out of first-gen HBM, all things being equal? We've learned that NVIDIA has some really good compression methods that let them get away with a 384-bit bus, as opposed to 512-bit, without suffering.
>
> I have to give it to AMD for pushing the envelope by being first with HBM, but unfortunately it didn't translate into the "overclocker's dream" the marketing promised. I wish it had pushed things forward.
>
> I was dreaming about a quad GP100 Titan P HBM2 watercooled rig, with NVLink fixing all the scaling problems of the past and curing cancer along the way. I know... you don't have to say it: "Keep right on dreaming, lol."


NVIDIA's compression plus HBM2 will give far more impressive results than AMD got, even with AMD using HBM2 too. AMD needs more raw bandwidth than NVIDIA because their compression isn't as good, which partly explains why the 980 Ti was stronger than the Fury X. I will buy a 4-way SLI Titan P if it prevents and cures cancer, that's for sure!!!


----------



## guttheslayer

Quote:


> Originally Posted by *renejr902*
> 
> LoL, yeah it a jumbo size desk, we could say that LoL


Or it can be on a wall mount, with your table not that near the wall...

Either way, after reading through this:

http://www.anandtech.com/show/10433/nvidia-announces-pci-express-tesla-p100

it seems the Titan will definitely be a GP100 that failed the Tesla requirements. And it has to come in two variants, because HBM failures are independent of the Tesla requirements.

So a GP100 is first split into two variants, fully working HBM or a faulty HBM stack, before being split further by the Tesla requirements, all by binning:

1) HBM passed -> 16G P100; HBM failed -> 12G P100

2) For the 16G GP100:

Passed Tesla requirements -> PCIe Tesla 16G
Failed Tesla requirements -> Titan 16G

For the 12G GP100:

Passed Tesla requirements -> PCIe Tesla 12G
Failed Tesla requirements -> Titan 12G

Because the 12G part has partially disabled HBM, and L2 is tied to the memory partitions, the cut-down Titan will have less L2 cache and lower memory bandwidth, which will definitely affect OC performance; it probably won't even hit a 2 GHz core clock. But it would come in as the *cheapest GP100* from NVIDIA, $999 most likely.

The full 16G comes with 4 MB of L2 cache and 33% more memory bandwidth, so OC'd performance won't be bottlenecked even beyond 2 GHz, especially with 8+8 power pins; it could easily reach the GTX 1080's max speed or more (with a custom loop). Because of its fully unlocked OC capability, it probably comes in at $1200-$1500.

Looks like I finally see the whole picture, and so much for those who still believe the Titan will be a G5X variant by this September.
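The two-step binning speculated above can be sketched as a simple decision function. To be clear, this is only a model of the rumour: the function name, stack counts, and tier labels are illustrative, and none of this is confirmed NVIDIA policy.

```python
def bin_gp100(good_hbm_stacks: int, passes_tesla_qual: bool) -> str:
    """Sort a GP100 die into a rumoured product tier.

    Step 1: HBM2 yield decides 16 GB (4 stacks, 4096-bit bus)
            vs 12 GB (3 stacks, 3072-bit bus).
    Step 2: the Tesla qualification decides Tesla vs Titan
            within each memory tier.
    """
    if good_hbm_stacks >= 4:
        tier = "16G"
    elif good_hbm_stacks == 3:
        tier = "12G"
    else:
        return "rejected"  # too few working stacks for either SKU
    return ("PCIe Tesla " if passes_tesla_qual else "Titan ") + tier

print(bin_gp100(4, True))   # PCIe Tesla 16G
print(bin_gp100(4, False))  # Titan 16G
print(bin_gp100(3, False))  # Titan 12G
```

The point of the two-axis sort is that an HBM defect and a failed Tesla qualification are independent, which is why four consumer/compute SKUs fall out of one die.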


----------



## renejr902

Quote:


> Originally Posted by *guttheslayer*
> 
> Or it can be on a wall mount, with your table not that near the wall...
>
> Either way, after reading through this:
>
> http://www.anandtech.com/show/10433/nvidia-announces-pci-express-tesla-p100
>
> it seems the Titan will definitely be a GP100 that failed the Tesla requirements. And it has to come in two variants, because HBM failures are independent of the Tesla requirements.
>
> So a GP100 is first split into two variants, fully working HBM or a faulty HBM stack, before being split further by the Tesla requirements, all by binning:
>
> 1) HBM passed -> 16G P100; HBM failed -> 12G P100
>
> 2) For the 16G GP100:
>
> Passed Tesla requirements -> PCIe Tesla 16G
> Failed Tesla requirements -> Titan 16G
>
> For the 12G GP100:
>
> Passed Tesla requirements -> PCIe Tesla 12G
> Failed Tesla requirements -> Titan 12G
>
> Because the 12G part has partially disabled HBM, and L2 is tied to the memory partitions, the cut-down Titan will have less L2 cache and lower memory bandwidth, which will definitely affect OC performance; it probably won't even hit a 2 GHz core clock. But it would come in as the *cheapest GP100* from NVIDIA, $999 most likely.
>
> The full 16G comes with 4 MB of L2 cache and 33% more memory bandwidth, so OC'd performance won't be bottlenecked even beyond 2 GHz, especially with 8+8 power pins; it could easily reach the GTX 1080's max speed or more (with a custom loop). Because of its fully unlocked OC capability, it probably comes in at $1200-$1500.
>
> Looks like I finally see the whole picture, and so much for those who still believe the Titan will be a G5X variant by this September.


Seems very possible, I agree. Thanks for the wall-mount and table idea; I'll make a second desk, LOL.


----------



## ChevChelios

Quote:


> Along with releasing the specifications, NVIDIA has announced that the PCIe Tesla P100 will be available in Q4 of this year.


Tesla in Q4 means the first Titan (whatever it is) comes either ~December 2016 at the earliest (unlikely) or ~Jan/Feb 2017 (since it obviously comes after Tesla), with Vega also around that time

hope this puts the August/September rumors to rest

also, 3584 is 40% more shaders than the 1080's 2560, which could theoretically mean 40% more performance than an OC'd 1080 (given the better memory too), but only if the Titan also runs at ~2000+ MHz, which remains to be seen (and if it is possible it will definitely require water)

if the 1080 Ti follows suit, then an air-cooled 1080 Ti may not clock as high as an air-cooled 1080? or will it?

generally I'd predict the 1080 Ti at 25-30% above the 1080 (in line with the previous generation), if it can clock reasonably high (since it will probably also have fewer than the Titan's 3584 SPs, so 30%+ more shaders than the 1080 rather than 40%)
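The shader arithmetic in this thread can be made explicit with a naive throughput model: FP32 rate proportional to shaders times clock. This deliberately ignores memory bandwidth and architectural differences, and the clock figures below are hypothetical, not announced specs.

```python
GTX_1080_SHADERS = 2560    # GTX 1080 CUDA core count
GTX_1080_OC_MHZ = 2000     # a typical overclocked 1080 (assumption)

def relative_throughput(shaders: int, clock_mhz: float) -> float:
    """Naive FP32 throughput relative to a ~2 GHz GTX 1080."""
    return (shaders * clock_mhz) / (GTX_1080_SHADERS * GTX_1080_OC_MHZ)

# 3584 shaders at the same 2 GHz would be the full rumoured +40%:
print(f"{relative_throughput(3584, 2000):.2f}x")  # 1.40x
# At a more conservative ~1.5 GHz the paper advantage nearly vanishes:
print(f"{relative_throughput(3584, 1500):.2f}x")
```

This is why the "40% faster" claim hinges entirely on the big chip clocking as high as the small one, which is exactly the open question in the post above.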


----------



## guttheslayer

Quote:


> Originally Posted by *renejr902*
> 
> Seems very possible, I agree. Thanks for the wall-mount and table idea; I'll make a second desk.


It looks like the ball game with HBM is very different compared to standard GDDR5, because HBM is known to respond poorly to overclocking. So whether your memory bandwidth becomes a bottleneck depends mostly on *bus width*.

From what I've seen, the best NVIDIA can clock HBM2 is 1.4 Gb/s per pin, i.e. 700 MHz. Most likely, due to the absence or cut-down of DP, NVIDIA might lower the HBM clock for the Titan to 625 MHz, as that is still sufficient at stock speed. Once you overclock the GPU, though, things change. For one, the base clock is definitely much lower than the GTX 1080's; I believe 1.4-1.6 GHz core. So an OC to 2 GHz is much more demanding when you have 40% more shaders than a 1080; the bandwidth needs to scale up proportionately.

480 GB/s is definitely not enough to feed this monster GPU once you push it all the way to 2 GHz, which can hamper OC potential. With 16 GB, however, you get at least 640 GB/s; that's decent headroom for crazy LN2 nonsense.
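As a sanity check, the bandwidth figures in this post follow directly from bus width times per-pin data rate. A minimal sketch (the 625 MHz / 1.25 Gb/s pairing for the Titan is this post's assumption, not a confirmed spec):

```python
def hbm2_bandwidth_gbs(bus_width_bits: int, rate_gbps: float) -> float:
    """Aggregate memory bandwidth in GB/s: bus width (bits) x per-pin rate (Gb/s) / 8."""
    return bus_width_bits * rate_gbps / 8

# 1.25 Gb/s per pin corresponds to the 625 MHz DDR clock mentioned above.
print(hbm2_bandwidth_gbs(3072, 1.25))  # 480.0 GB/s (12 GB, three stacks)
print(hbm2_bandwidth_gbs(4096, 1.25))  # 640.0 GB/s (16 GB, four stacks)
print(hbm2_bandwidth_gbs(4096, 1.40))  # ~716.8 GB/s (Tesla P100 rate)
```

So the quoted 480 vs 640 GB/s split is purely the 3072-bit vs 4096-bit bus at the same assumed 1.25 Gb/s per pin.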


----------



## guttheslayer

Quote:


> Originally Posted by *ChevChelios*
> 
> Tesla in Q4 means the first Titan (whatever it is) comes either ~December 2016 at the earliest (unlikely) or ~Jan/Feb 2017 (since it obviously comes after Tesla), with Vega also around that time
>
> hope this puts the August/September rumors to rest


Or they could be released in the same timeframe, which puts it in late September too.

Nope, not put to rest.


----------



## ChevChelios

Q4 2016 is not September

and Q4 is still only Tesla, not Titan


----------



## guttheslayer

Quote:


> Originally Posted by *ChevChelios*
> 
> Q4 2016 is not September
> 
> and Q4 is still only Tesla, not Titan


So where is the source that says Q4 is only Tesla, not Titan?

The AnandTech article came out before VRWorld's, so they probably didn't know of the Titan's existence back then.

Late September or early October; October-December is the Q4 timeframe.

And...

*If you check all of Titan history, they always release a Titan at least 3 months before AMD releases anything new.* That's a more convincing trend, since it has happened three times. It's always the Ti that competes with AMD's highest offering; that's always been the case until now.


----------



## ChevChelios

I've read earlier that the Titan will come after Tesla/Quadro; don't have the link right now

not to mention it makes sense: the first units are Teslas, they have priority, then they start Titans

especially since Vega is a long way off and there is no big rush

Vega is rumored for Q1 or even H1 2017; they can release the Titan P ~January 2017


----------



## guttheslayer

Quote:


> Originally Posted by *ChevChelios*
> 
> I've read earlier that the Titan will come after Tesla/Quadro; don't have the link right now
>
> not to mention it makes sense: the first units are Teslas, they have priority, then they start Titans
>
> especially since Vega is a long way off and there is no big rush


If you read the way I said they did the binning, they will have enough stock for Titan as well as Tesla; for one thing, it's easier to fail the HPC Tesla requirements. Heck, they might even have more Titans than Teslas right now.

Again, while it's not impossible, NVIDIA could release them at the same time IF the Titan doesn't cannibalize their Tesla sales, which in this case appears likely to hold (crippled DP).

Again, this is not the 1080 Ti; NVIDIA can release it anytime and it won't affect anything. Since when could AMD affect the Titan in any way?

Check the release dates for:

OG Titan vs R9 290X
Titan Black vs (nothing from AMD)
Titan X vs Fury X

You realize the Titan has nothing to do with AMD at all; it's an isolated, independent release. So stop linking it to Vega. If anything, the Titan will release at least 3 months before Vega, like it has been doing since 2013.


----------



## ChevChelios

Titan is very much linked to Vega: there is a chance Vega might outperform the 1080, but it will never touch the Titan P

so the Titan P gives a 100% guarantee that the performance crown belongs to NVIDIA

Quote:


> if anything titan will release at least 3 months before Vega


yes maybe

but not in August-Sept


----------



## guttheslayer

Quote:


> Originally Posted by *ChevChelios*
> 
> Titan is very much linked to Vega: there is a chance Vega might outperform the 1080, but it will never touch the Titan P
>
> so the Titan P gives a 100% guarantee that the performance crown belongs to NVIDIA


That will be the 1080 Ti, not the Titan.

You keep getting things mixed up.

The 780 Ti was born because the R9 290X threatened the Titan.
The 980 Ti was released to combat the Fury X.

Not obvious enough?

What makes you think a Titan P released in September couldn't guarantee a 100% performance crown? They'd probably take the crown 3 months before AMD did anything to challenge it.

----------



## ChevChelios

they might not have (enough) 1080 Tis ready for the Vega launch, or something like that

so the Titan covers them


----------



## guttheslayer

Quote:


> Originally Posted by *ChevChelios*
> 
> they might not have (enough) 1080 Tis ready for the Vega launch, or something like that
>
> so the Titan covers them


Source?

You realize the strongest point of a Ti has always been a price that closely matches AMD's highest offering, right? Not an absurd $999 or even $1399.

NVIDIA pulled out the 780 Ti when we least expected it... so where is your source that they don't have enough 1080 Tis?


----------



## ChevChelios

my source is my own speculation, same as yours


----------



## guttheslayer

Quote:


> Originally Posted by *ChevChelios*
> 
> my source is my own speculation, same as yours


Mine is obviously based on NVIDIA's Titan release trend over the past three generations, not out-of-the-blue imagination.

I'm sorry you feel sour about an "early" Titan release, but it will come whenever NVIDIA feels like releasing it.


----------



## ChevChelios

Quote:


> Mine is obviously based on NVIDIA's Titan release trend over the past three generations


well, then you should look at how long it took for a Titan to release when it was on a new node

it was 11 months, IIRC

and when did a Titan ever come out within the first 4-5 months of a new generation? even on a mature node

Quote:


> it will come whenever NVIDIA feels like releasing it.


yes

and that wont be in August-September


----------



## rcfc89

Quote:


> Originally Posted by *renejr902*
> 
> I tested AA at 4K on 27", 40", 50", and 55" screens.
> 
> In every case you can disable AA: if you're seeing aliasing, you're probably too close to your screen. AA can still make an interesting difference on a 50" or 55" screen, but only if you sit close enough; I sit close to mine and don't find it necessary at all. I can live with a little aliasing. At 27" you don't see aliasing at all, so don't use it (unless you want to play an inch from the screen, lol). Even at 40" you don't really need it; you'd have to be very close to notice. In every case I disable AA. About display quality, my favorite size was 40": everything looks great and you see more detail. 27" is too small to enjoy 4K in my opinion; it's still enjoyable, but the pixels are too small for me and Windows looks bad without scaling. Unfortunately I didn't test 32". 40" is the perfect balance, but I still play on my 55" because I love big screens. I don't think I'll ever buy an 8K screen: even at 4K you have to sit very close to see all the detail, and at 8K I don't want to sit one foot from my 55" to see it. 4K will suffice for my whole life; I don't think 8K will be an important buying factor in the future. HDR is more important than 8K. To people playing at 1440p or 1600p: I recommend buying a 4K monitor. It's still much better, the detail is much better, and it's worth it. *I'd rather have 4K at 60 Hz than 1440p at 120 Hz. I can't live without a 4K monitor or TV; the day you play a PC game at ultra in 4K, you won't go back.* I'd rather play a game in 4K at 40-45 fps than at 1080p or 1440p at 60+ fps. I love the detail 4K brings; everything on screen becomes so much more perfect, and even today it still impresses me. But I know everyone prefers different things; it's up to you, and I know people who won't play at 40-45 fps at all. It does bring more motion blur, too.


I disagree. I gamed on the Acer Predator XB271HK, which is superior in every way to the 4K display you game on (input lag, G-Sync, etc.), and it only took me a week to send it back. 4K, simply put, is just not ready. Even with G-Sync and maintaining 60 fps, things felt laggy and sloppy, no matter the game: Fallout 4, BF4, Doom, Battlefront, FC Primal. I believe our current DisplayPort tech is what causes this experience; no matter how strong your PC is or how well designed the display is, the cable struggles to push things along smoothly. In today's market I believe the best available PC gaming experience is the X34: unreal immersion, fantastic picture quality and vibrancy, along with 100 Hz and G-Sync. I played on a friend's this past Friday, sent my 4K back today, and ordered the X34.


----------



## Lee Patekar

Quote:


> Originally Posted by *ChevChelios*
> 
> and that wont be in August-September


If GP102 or GP100 is available in sufficient quantities to sell, they'll release it as a Titan for extra revenue. Once AMD can compete with the 1080 GTX or the Titan P, that's when they undercut AMD with the 1080 TI.

There's zero reason to release a 1080 TI if there's no market pressure, but a "luxury" item like a Titan, which you can put a crazy price tag on like $1,400, is a compelling sell.


----------



## ChevChelios

i dont see how they can have Titans ready this soon


----------



## rcfc89

Quote:


> Originally Posted by *Lee Patekar*
> 
> If GP102 or GP100 is in sufficient quantities to sell, they'll release it as a Titan for extra revenue. Once AMD can compete with the 1080 GTX or the Titan P, that's when they undercut AMD with the 1080 TI.
> 
> There's zero reason to release a 1080 TI if there's no market pressure.. but a "luxury" item like a titan, which you can put a crazy price tag on like 1,400$, is a compelling sell.


There are plenty of 980Ti owners like myself who are ready and willing to upgrade. A 1080 isn't nearly enough. The good thing is I can wait if need be. There's currently not one game my Lightnings can't handle at UltraWide 100 Hz.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *guttheslayer*
> 
> If you read the way I mentioned they did the sorting, they will have enough stock for the Titan as well as Tesla; for one thing, it's easier for a chip to fail the HPC Tesla requirements. Heck, they might even have more Titans than Teslas right now.
> 
> Again, while it's not impossible, NVIDIA could release them at the same time, IF the Titan doesn't cannibalize their Tesla sales, which in this case appears likely (crippled DP).
> 
> Again, this is not the 1080 Ti; NVIDIA can release a Titan anytime and it won't affect anything. Since when could AMD even affect the Titan in any way?
> 
> Check the release dates for:
> 
> OG Titan vs R9 290X
> Titan Black vs (nothing from AMD)
> Titan X vs Fury X
> 
> You realize the Titan has nothing to do with AMD at all. It is an isolated, independent release. So stop linking it with Vega; if anything, the Titan will release at least 3 months before Vega, like it has been doing since 2013.


The more relevant comparison is how the Titan has historically shown up after Tesla. I don't recall them ever releasing at the same time.


----------



## guttheslayer

Quote:


> Originally Posted by *ChevChelios*
> 
> well then you should look how long it took for Titan to release when it was under a new node
> 
> it was 11 months IIRC
> 
> and when did a Titan ever come out within the first 4-5 months of a new gen? even on a mature node
> 
> and that wont be in August-September


28 nm was an exception due to poor yields. The fact that NVIDIA never crippled the PCIe variant any further than its NVLink counterpart shows that yields were much better than expected, hence the early release.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *guttheslayer*
> 
> 28 nm was an exception due to poor yields. The fact that NVIDIA never crippled the PCIe variant any further than its NVLink counterpart shows that yields were much better than expected, hence the early release.


Are you kidding??? With the supply issues we've seen with the much smaller chips in the 1080/1070, you really think a monster chip like P100 is going to have good yields??? You really need to lay off that "hope" train, dude; you're embarrassing yourself. Yields are always poor on brand-new nodes, especially with massive chip sizes...


----------



## magnek

From Tesla K20(X) to OG Titan took 3 months (Nov 2012 to Feb 2013), so if Tesla P100 launches in early September, it's possible we could see a Titan part by December 2016.

But yeah retail availability in September is just wishful thinking.


----------



## PostalTwinkie

Quote:


> Originally Posted by *magnek*
> 
> From Tesla K20(X) to OG Titan took 3 months (Nov 2012 to Feb 2013), so if Tesla P100 launches in early September, it's possible we could see a Titan part by December 2016.
> 
> But yeah retail availability in September is just wishful thinking.


Screw the Titan, give me the 1080 Ti!


----------



## Kpjoslee

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Screw the Titan, give me the 1080 Ti!


That won't happen for a good while unless the highest-end Vega next year convincingly beats the 1080.


----------



## guttheslayer

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Are you kidding??? With the supply issues we have seen with the much smaller chips in the 1080/1070 you really think a monster chip like P100 is going to have good yields??? You really need to lay off of that "hope" train dude, you are embarrassing yourself. Yields are always poor on brand new nodes, especially with massive chip sizes...


Let's see who's embarrassed when the Titan part hits in early October or late September.

You can all go around telling everyone it's fake, but I'm not convinced until a reliable source disproves it. Otherwise it will probably still release together with the Tesla, as early as September.


----------



## magnek

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Screw the Titan, give me the 1080 Ti!


Calling it now: Q2 2017 at the absolute earliest, and there's still no guarantee it'll use HBM2 instead of GDDR5X.

Heck, depending on how Vega fares, nVidia may not even need a 1080 Ti this round, and if you want the big chip, it's Titan or bust.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *magnek*
> 
> Calling it now: Q2 2017 at the absolute earliest, and there's still no guarantee it'll use HBM2 instead of GDDR5X.
> 
> Heck, depending on how Vega fares, nVidia may not even need a 1080 Ti this round, and if you want the big chip, it's Titan or bust.


There would be little reason to release a 1080Ti if Vega fails to match or significantly beat the 1080. Nvidia fanboys ought to be praying that Vega turns in a strong performance...


----------



## ChevChelios

regardless of Vega, IMHO after some months of Titan milking it makes sense to release 1080Ti, just to give an upgrade option to 980Ti owners who dont wish to pay over $1000 for the Titan P


----------



## SuperZan

With their brand and ecosystem as strong as it is I don't think they'd need to do so if Vega doesn't pan out. Why spend extra money prepping another product when you can simply offer the 1080 or the Titan knowing that so many Nvidia owners will upgrade regardless? At that point the only reason for the 1080 Ti would be the double-milk, getting 1080 owners to upgrade again to the 1080 Ti, but if they don't reckon that the number of holdouts will be significant they could count on the Titan for that as well. They tested the waters with the 1080's FE pricing and now the Titan's typical MSRP doesn't seem a bridge too far to many, judging by the reactions we've seen.


----------



## ChevChelios

Didn't know where else to post, but since the Titan is affected by Vega, might as well post here:

http://videocardz.com/62117/amd-confirms-radeon-rx-470-and-rx-460-specifications

AMD's latest roadmap shows Vega still slated for early 2017; no October nonsense.


----------



## Cyro999

Quote:


> I disagree with this statement to a point. HBM didn't make the Fury X a better card than the 980Ti with its regular GDDR5


HBM1 allowed for ~1.33x the memory bandwidth of the 390/390X.

Maxwell and Pascal just use significantly less memory bandwidth per frame than GCN does, especially GCN 1.0/1.1.

HBM1 went from ~336-384 GB/s to 512 GB/s.

HBM2 is 1024 GB/s.
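Those bandwidth figures fall straight out of bus width times per-pin data rate. A quick sketch, using the nominal data rates commonly cited for these parts (treat the exact Gbps values as assumptions):

```python
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin data rate (Gbit/s) / 8."""
    return bus_width_bits * data_rate_gbps / 8

r9_390x = bandwidth_gbs(512, 6.0)   # GDDR5 at 6 Gbps  -> 384 GB/s
fury_x  = bandwidth_gbs(4096, 1.0)  # HBM1 at 1 Gbps   -> 512 GB/s
hbm2    = bandwidth_gbs(4096, 2.0)  # HBM2 at 2 Gbps   -> 1024 GB/s

print(fury_x / r9_390x)  # ~1.33, the ratio quoted above
```

The same formula reproduces the 12 GB Titan rumor too: a 3072-bit bus at HBM2's 2 Gbps gives 768 GB/s.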


----------



## renejr902

Quote:


> Originally Posted by *rcfc89*
> 
> I disagree. I played on the Acer Predator XB271HK, which is superior in every way to the 4K display you game on (input lag, G-Sync, etc.), and it only took me a week to send it back. 4K simply isn't ready. Even with G-Sync and a steady 60 fps, things felt laggy and sloppy, no matter the game: Fallout 4, BF4, Doom, Battlefront, FC Primal. I believe our currently available DisplayPort tech is what's causing this; no matter how strong your PC is and how well designed the display, the cable is struggling to push things along smoothly. In today's market I believe the best available PC gaming experience is the X34: unreal immersion, fantastic picture quality and vibrancy, along with 100 Hz and G-Sync. I played on a friend's this past Friday, sent my 4K back today, and ordered the X34.


You didn't play on my 55" Samsung JS8500; it's the second-best TV of 2015, go read rtings.com. NO BLUR AT ALL, no ghosting; it beats every monitor I've had before. Don't judge a monitor or TV by its spec sheet (they play with the numbers every time); try it yourself to evaluate. The JS8500, even over HDMI, has the best image quality I've seen from any monitor or other TV. I measured 24 ms of input lag, which is fine for me.


----------



## renejr902

Quote:


> Originally Posted by *Lee Patekar*
> 
> If GP102 or GP100 is in sufficient quantities to sell, they'll release it as a Titan for extra revenue. Once AMD can compete with the 1080 GTX or the Titan P, that's when they undercut AMD with the 1080 TI.
> 
> There's zero reason to release a 1080 TI if there's no market pressure.. but a "luxury" item like a titan, which you can put a crazy price tag on like 1,400$, is a compelling sell.


I agree, and the Titan is coming in September.

If the Titan is too strong for AMD to compete with, NVIDIA could play fair and release a 3dfx Voodoo 6 or a TNT3 instead; everyone would be surprised.


----------



## NikolayNeykov

Vega or no Vega, NVIDIA releases a Ti not to compete with anyone but for its own profit. After people buy the Titan, there are some who don't want to pay that much and look for an alternative; that's where the Ti comes in, so NVIDIA can collect more money. Sounds about right, don't you agree?


----------



## ChevChelios

Quote:


> Originally Posted by *NikolayNeykov*
> 
> Vega or no Vega, NVIDIA releases a Ti not to compete with anyone but for its own profit. After people buy the Titan, there are some who don't want to pay that much and look for an alternative; that's where the Ti comes in, so NVIDIA can collect more money. Sounds about right, don't you agree?


exactly

1080Ti will come even if Vega fails to beat 1080 by a significant margin

plenty of people with 980Ti/Titan X may decide to hold off a little longer and wait for 1080Ti instead of dumping money on the Titan P

_especially_ if the big Titan P will be more than $1000


----------



## renejr902

Quote:


> Originally Posted by *ChevChelios*
> 
> exactly
> 
> 1080Ti will come even if Vega fails to beat 1080 by a significant margin
> 
> plenty of people with 980Ti/Titan X may decide to hold off a little longer and wait for 1080Ti instead of dumping money on the Titan P
> 
> _especially_ if the big Titan P will be more than $1000


i agree


----------



## Lee Patekar

Everyone expecting to see a 1080 TI even if AMD doesn't compete is delusional. Why on earth would nVidia compete with their own current top seller? No, they'll launch a luxury item instead which doesn't compete with the 1080 GTX, and keep the prices of their medium-size die HIGH.

If Vega competes with the 1080 GTX, then you'll see a 1080 TI. But if Vega isn't competitive or is delayed, you'll never see a 1080 TI, only an 1180 GTX much later.

Some of you have extremely short attention spans... where's the 680 TI? Oh yeah, it doesn't exist.. it didn't need to exist.. the 7970 barely competed with the 680.. so they sat on GK110 and released it as a 780 instead. That's what a lack of competition does. But they sure did rush the 780 TI out the gate, making Titan owners feel uneasy.. remember why?

Competition, competition and competition. Without it our hobby is stale.. much like intel's desktop processors.


----------



## ChevChelios

well of course Vega is going to at least match 1080, thus competing with it

I mean its the top high-end chip of AMD, how can it not at least reach the "mid-range" 314 mm2 of the competition ?

by that logic, it should already be enough for the 1080Ti to come out no matter what and take over for 1080

unless there is a chance that if Vega = 1080 perf wise, then they just lower the price of 1080 a bit and settle at that

but thats not likely, is it ? they will just release a faster card to take the top spot .. so again - the 1080Ti will come, theres no way it doesnt come


----------



## CasualCat

Quote:


> Originally Posted by *Lee Patekar*
> 
> Everyone expecting to see a 1080 TI even if AMD doesn't compete is delusional. Why on earth would nVidia compete with their own current top seller? No, they'll launch a luxury item instead which doesn't compete with the 1080 GTX, and keep the prices of their medium-size die HIGH.
> 
> If Vega competes with the 1080 GTX, then you'll see a 1080 TI. But if Vega isn't competitive or is delayed, you'll never see a 1080 TI, only an 1180 GTX much later.
> 
> Some of you have extremely short attention spans... where's the 680 TI? Oh yeah, it doesn't exist.. it didn't need to exist.. the 7970 barely competed with the 680.. so they sat on GK110 and released it as a 780 instead. That's what a lack of competition does. But they sure did rush the 780 TI out the gate, making Titan owners feel uneasy.. remember why?
> 
> Competition, competition and competition. Without it our hobby is stale.. much like intel's desktop processors.


Maybe, but I suspect they still want to sell new cards to 980Ti/TitanX owners and honestly the 1080 is not the card for them.

You also contradict yourself with your Intel example. Despite lack of competition Intel is still releasing HEDT processors. Just look at the 5960X, 5820K, 6950X, and 6900K. We got a 6 core on the entry level HEDT chip for the first time as well as an eight core chip for the first time. Now we're getting a 10 core chip for the first time. It certainly isn't competition from AMD pressuring them to do that. Also the OG Titan was released with no competition from AMD. That was the GK110 part. Besides the thread is primarily about Pascal Titan not the 1080Ti (or whatever they decide to call it) part.


----------



## Lee Patekar

Quote:


> Originally Posted by *ChevChelios*
> 
> well of course Vega is going to at least match 1080, thus competing with it
> 
> I mean its the top high-end chip of AMD, how can it not at least reach the "mid-range" 314 mm2 of the competition ?
> 
> by that logic, it should already be enough for the 1080Ti to come out no matter what and take over for 1080


You're talking like a GP100/GP102 product isn't already being tested by nVidia. They're not going to brand or release it until market forces justify its existence. Why should they launch their big chip as a 1080 TI and cannibalize 1080 GTX sales when AMD hasn't even announced a Vega SKU yet? AMD doesn't even want to talk about Vega yet. Like with the 780, nVidia are sitting on it.
Quote:


> Originally Posted by *ChevChelios*
> 
> unless there is a chance that if Vega = 1080 perf wise, then they just lower the price of 1080 a bit and settle at that
> 
> but thats not likely, is it ? they will just release a faster card to take the top spot .. so again - the 1080Ti will come, theres no way it doesnt come


Or release factory-overclocked 1080s. Think a little bit: if they can sell a ~300 mm^2 die for 600-700 bucks, why on earth would they want to sell a ~600 mm^2 die for less than $1,200-1,400? It's wishful thinking! Much like expecting Intel to release an 8-core desktop processor for $300. They can, but they won't unless AMD forces them with Zen.

Quote:


> Originally Posted by *CasualCat*
> 
> Maybe, but I suspect they still want to sell new cards to 980Ti/TitanX owners and honestly the 1080 is not the card for them.


Why? nVidia is a corporation; they aren't your friend. The 1080 GTX is faster than your 980 TI; that's your upgrade (or the $1,400 Titan P).
Quote:


> Originally Posted by *CasualCat*
> 
> You also contradict yourself with your Intel example. Despite lack of competition Intel is still releasing HEDT processors. Just look at the 5960X, 5820K, 6950X, and 6900K. We got a 6 core on the entry level HEDT chip for the first time as well as an eight core chip for the first time. Now we're getting a 10 core chip for the first time. It certainly isn't competition from AMD pressuring them to do that. Also the OG Titan was released with no competition from AMD. That was the GK110 part. Besides the thread is primarily about Pascal Titan not the 1080Ti (or whatever they decide to call it) part.


Extreme Edition processors are luxury items; just look at their price. They have more in common with the Xeon line (which has a large market with increasing needs despite the lack of competition; you can thank data centers, or "the cloud", for that). And that 10-core monster chip? I'd rather get its equivalent Xeon, which costs less, hahaha.

I'm talking about the desktop variants: they are all 4 cores / 8 threads with similar clock speeds. As for the Titan, you're actually proving my point: there was no competition, so they packaged GK110s into a hybrid gaming/workstation card they sold for $1,000. They'll do the same with GP100/GP102; they'll sell it as a luxury item, not a 1080 TI.


----------



## ChevChelios

Quote:


> Or release factory overclocked 1080s


those already exist, the best AIB factory overclocked 1080 cards boost to 1950-2000 out of the box

a nice bump but nothing with which to beat Vega

for that they should have 1080Ti, which they will release when the time is right and I stand by that


Quote:


> Why should they *launch their big chip as a 1080 TI*


I never said that

of course they release Titan P first, milk it and then 1080Ti


----------



## CasualCat

Quote:


> Originally Posted by *Lee Patekar*
> 
> Why? nVidia is a corporation, they aren't your friend. The 1080 GTX is faster than your 980 TI, that's your upgrade. (or the 1400$ Titan P).


I don't think they're anyone's friend. Wow, they're a corporation? Thanks for your condescension. They know there are people who own 980 Tis/Titan Xs who won't buy 1080s. Even some of the marketing slides compared the 1080 to the 980, the same as the 980's slides compared it to the 680.

Presumably they'll want to sell those people new cards, and they're clearly not targeting them with the 1080.


----------



## ChevChelios

the whole X80 --> Titan --> X80 Ti cycle worked excellently for them with Maxwell, so why would it change now?


----------



## Lee Patekar

Quote:


> Originally Posted by *ChevChelios*
> 
> of course they release Titan P first, milk it and then 1080Ti


Only if Vega releases on schedule or warrants it. If enough time passes, they'll rebrand it as an 1180 GTX. Ultimately the high end, or should I say the affordable high end, is limited by AMD right now. Don't take a 1080 TI for granted: if AMD screws up again, it's another boring release schedule like we had with the 780.
Quote:


> Originally Posted by *ChevChelios*
> 
> the whole X80 --> Titan --> X80 Ti has worked excellent for them with Maxwell, and why would it change now


That mentality assumes nVidia acts alone, like they're in a vacuum.


----------



## ChevChelios

Vega will release in early 2017; it's all but guaranteed at this point, even by AMD's latest roadmap.

And I certainly hope they won't rebrand the 1080Ti as an 1180 lol, since Volta is a new arch and should be quite different from Pascal.

Quote:


> Don't take a 1080 TI for granted.


I don't even plan to buy it; I went with the 1080 --> 1180 upgrade path instead of waiting for big dies.

but I think it will definitely come


----------



## renejr902

Quote:


> Originally Posted by *Lee Patekar*
> 
> You're talking like an GP100/GP102 product isn't already being tested by nVidia. They're not going to brand or release it until market forces justify its existence. Why should they launch their big chip as a 1080 TI and cannibalize 1080 GTX sales when AMD didn't even announce a Vega SKU yet. AMD don't even want to talk about Vega yet. Like with the 780, nVidia are sitting on it.
> Or release factory overclocked 1080s. Think a little bit.. if they can sell a 300 mm^2 die for 600-700 bucks why on earth would they want to sell a 600 mm^2 for less than 1200-1400? Its wishful thinking! Much like expecting intel to release an 8 core desktop processor for 300$. They can, but they won't unless AMD forces them with Zen.
> Why? nVidia is a corporation, they aren't your friend. The 1080 GTX is faster than your 980 TI, that's your upgrade. (or the 1400$ Titan P).
> Extreme edition processors are luxury items.. just look at their price. They have more in common with the Xeon line (which has a large market with increasing needs, despite lack of competition. You can thank data servers or "the cloud" for that.) And that 10 core monster chip.. I'd rather get its equivalent Xeon which costs less hahahaha
> 
> I'm talking about the desktop variants.. they are all 4 core, 8 threads with similar clock speeds. As for the Titan you're actually proving my point.. there was no competition so they packaged GK100s into a hybrid gaming/workstation card they sold for 1000$. They'll do the same with GP100/GP102, they'll sell it like a luxury item, not a 1080 TI.


Sorry guy, I don't agree with you on several points. They know a lot of 980 Ti owners will wait for a 1080 Ti; they need to make money from them...


----------



## Lee Patekar

Quote:


> Originally Posted by *renejr902*
> 
> Sorry guy, i dont agree much with you on several points


Yes, that's a crushing argument you made there lol.

Tech enthusiast, meet business; business, meet tech enthusiast.


----------



## renejr902

Quote:


> Originally Posted by *ChevChelios*
> 
> the whole X80 --> Titan --> X80 Ti has worked excellent for them with Maxwell, and why would it change now


Exactly what i think !


----------



## renejr902

Quote:


> Originally Posted by *Lee Patekar*
> 
> Yes, that's a crushing argument you made there lol.
> 
> Tech enthusiasts meet business, business meet tech enthusiast.


I'm from Montreal too, but I'm not sure we'll become friends lol; we don't seem to understand each other. But don't worry, everyone has a right to their own opinion.


NVIDIA knows that a lot of 980 Ti owners will wait for a 1080 Ti; they need to make money from them. They need to sell those people cards, for more money, and to keep them as customers. If they don't release a 1080 Ti, maybe it will be because Volta is near and they don't care much about Pascal anymore, but I would be surprised. I think they will still release a 1080 Ti to crush AMD even if AMD's cards are weak against the 1080. Sometimes, unfortunately, I think Radeon video cards will die within two years, and AMD will focus on CPUs and sell the Radeon department, or concentrate on video cards only for consoles and APUs.


----------



## Lee Patekar

Quote:


> Originally Posted by *renejr902*
> 
> I'm from Montreal too, but I'm not sure we'll become friends lol; we don't seem to understand each other. But don't worry, everyone has a right to their own opinion.


History never repeats but it rhymes. This is my prediction:

1080 GTX -> Titan P

if no Vega (very long delay), time passes and 1180 GTX is based on GP100/GP102
if Vega is delayed, 1080TI is delayed
if Vega isn't competitive, time passes and 1180 GTX is based on GP100/GP102
if Vega is competitive, 1080 TI to beat it
if Vega is competitive with 1080 TI, expect Volta sooner than later

Personally I'm hoping for the last two options; they are the best for all consumers. But I don't take anything for granted... we could see another 680 GTX -> Titan -> 780 GTX should AMD stumble again.

As for Montreal, who cares?


----------



## czin125

If Volta comes out sooner, does that mean the DGX-1's successor comes out earlier too?


----------



## renejr902

Quote:


> Originally Posted by *Lee Patekar*
> 
> History never repeats but it rhymes. This is my prediction:
> 
> 1080 GTX -> Titan P
> 
> if no Vega (very long delay), time passes and 1180 GTX is based on GP100/GP102
> if Vega is delayed, 1080TI is delayed
> if Vega isn't competitive, time passes and 1180 GTX is based on GP100/GP102
> if Vega is competitive, 1080 TI to beat it
> if Vega is competitive with 1080 TI, expect Volta sooner than later
> 
> Personally I'm hoping for the last two options; they are the best for all consumers. But I don't take anything for granted... we could see another 680 GTX -> Titan -> 780 GTX should AMD stumble again.
> 
> As for Montreal, who cares?


Your scenario is still possible, but it's not what I think; we'll see in the near future. I think times have changed and a Ti will be present in nearly every generation. Maxwell proved it's worth it and maximizes profit. The Montreal thing was just a joke.


----------



## Lee Patekar

Quote:


> Originally Posted by *renejr902*
> 
> Your scenario is still possible, but it's not what I think; we'll see in the near future. I think times have changed and a Ti will be present in nearly every generation. Maxwell proved it's worth it and maximizes profit. The Montreal thing was just a joke.


Except the 980 TI was launched to preempt AMD's Fury launch by a few weeks. Had there been no Fury, would there have been a 980 TI?

Like I said in a previous post, nVidia doesn't exist in a vacuum. They compete and they maximize profits. That's why we never saw a 680 TI.


----------



## ChevChelios

there will be a Vega, so there will be a 1080Ti

believe it !


----------



## Lee Patekar

Quote:


> Originally Posted by *ChevChelios*
> 
> there will be a Vega, so there will be a 1080Ti
> 
> believe it !


Oh I want to believe Vega will kick ass and be on time.. but.. AMD doesn't have the best track record right now -_-


----------



## guttheslayer

Quote:


> Originally Posted by *ChevChelios*
> 
> exactly
> 
> 1080Ti will come even if Vega fails to beat 1080 by a significant margin
> 
> plenty of people with 980Ti/Titan X may decide to hold off a little longer and wait for 1080Ti instead of dumping money on the Titan P
> 
> _especially_ if the big Titan P will be more than $1000


That would make more sense with the Ti in Q1, while the Titan lands around September or Q4.


----------



## guttheslayer

Quote:


> Originally Posted by *Lee Patekar*
> 
> Oh I want to believe Vega will kick ass and be on time.. but.. AMD doesn't have the best track record right now -_-


If the Titan is officially 50% faster than the 1080, then you'll be shocked to find it's 3x the RX 480 in raw performance alone.

That means Vega would need around 6,900 cores to stand any chance of competing, and that's not counting overclocking, which will widen the gap even further.
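For what it's worth, the ~6,900-core figure is just linear core-count scaling from the RX 480, which assumes equal clocks and equal per-core throughput across architectures (a big assumption):

```python
rx480_cores = 2304  # Polaris 10 stream processors

# If the Titan P really were 3x an RX 480 in raw throughput, matching it at
# similar clocks and per-core throughput would take roughly 3x the cores:
vega_cores_needed = rx480_cores * 3
print(vega_cores_needed)  # 6912, i.e. the "~6,900 cores" above
```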


----------



## renejr902

Quote:


> Originally Posted by *Lee Patekar*
> 
> Except the 980 TI was launched to preempt AMD's fury launch by a few weeks. Had there been no fury, would there have been a 980 TI?
> 
> Like i said in a previous post, nVidia doesn't exist in a vacuum. They compete and they maximize profits. That's why we never saw a 680 TI.


Without Fury, I'm still not sure the 980 Ti would have existed. But the 980 Ti sold so well that they should have seen a Ti card will maximize their profit every generation. In the 680's time they didn't really know the Ti's impact; now they do. Anyway, I could be wrong; we'll see...


----------



## guttheslayer

Quote:


> Originally Posted by *renejr902*
> 
> Without Fury, I'm still not sure the 980 Ti would have existed. But the 980 Ti sold so well that they should have seen a Ti card will maximize their profit every generation. In the 680's time they didn't really know the Ti's impact; now they do. Anyway, I could be wrong; we'll see...


The Ti is a very interesting case for the 980 series. It was a cut-down Titan X, and the entire GM200 lineup didn't have a lot of variants; in fact, only two GPUs, the Titan X and the Ti, no more.

For GP100 the case is very different. There are already two Titans, and possibly one more variant to be released as a GTX 1180 next year. That leaves much less room for a Ti to exist.

NVIDIA needs the 1100 series to last them another year before Volta hits, possibly on 10 nm (not confirmed), and the flagship 1180 cannot be a rebranded 1080.

That leaves the possibility of a GP102 GDDR5X variant, but so far that's still in rumor territory. Maybe the 1080 Ti will be based on that, if it releases nine months from now; or maybe it gets pushed by another three months and becomes the 1180 instead.

With so many new cards crammed in between now and mid-2017, it could prove difficult for a 1080 Ti to come out at all.


----------



## sherlock

Quote:


> Originally Posted by *renejr902*
> 
> Without Fury, I'm still not sure the 980 Ti would have existed. But the 980 Ti sold so well that they should have seen a Ti card will maximize their profit every generation. In the 680's time they didn't really know the Ti's impact; now they do. Anyway, I could be wrong; we'll see...


They knew the Ti impact pretty well even in the 680's time; thus the 660 Ti ($300 cut-down GK104) was created to contest the 7870 GHz Edition or 7950 ($350) without dropping the 670's price ($400), as the 660 was inadequate.

Same goes for the 560 Ti 448, basically a cut-down 570, because Nvidia was having problems with the 6950.

The Ti variants are created because the normal 60/70/80 card has problems meeting AMD at a certain price point/performance. The same goes for the 780 Ti (the 290X surpassed the 780) and the 980 Ti (Fury X imminent). There was no 580 Ti or 680 Ti because the 580/680 were enough to compete with the 6970 and 7970/GHz.


----------



## guttheslayer

Quote:


> Originally Posted by *sherlock*
> 
> They knew the Ti impact pretty well even in the 600 series; thus the 660 Ti ($300 cut-down GK104) was created to contest the 7950 ($350) without dropping the 670's price ($400).
> 
> Same goes for the 560 Ti 448, basically a cut-down 570, because Nvidia was having problems with the 6950.
> 
> The Ti variants are created because the normal 60/70/80 card has problems meeting AMD at a certain price point/performance. The same goes for the 780 Ti (the 290X surpassed the 780) and the 980 Ti (Fury X imminent). There was no 580 Ti or 680 Ti because the 580/680 were enough to compete with the 6970 and 7970/GHz.


I have explained many times that the Ti was meant to compete with AMD's highest offering, and yet some stubborn people refuse to believe it.

On the other hand, I've never seen any Titan suffer any form of price cut, no matter how well AMD does. Even the Titan X didn't get an official price cut; it was simply EOLed.


----------



## sherlock

Quote:


> Originally Posted by *guttheslayer*
> 
> I have explained many times that *the Ti* was meant to compete with *AMD's highest offering*, and yet some stubborn people refuse to believe it.


Then what are the *750 Ti, 660 Ti, or 560 Ti 448*? None of these cards competed with AMD's highest offering. They were created to fill a gap in NV's product line where it was not competitive with AMD.

An X80 Ti is only necessary when the X80 itself can't compete with AMD's highest offering and the next X80 is still far away. This goes for the 780 Ti and 980 Ti, yet a 580 Ti/680 Ti was not necessary because the 580/680 were competitive with the 6970/7970.


----------



## guttheslayer

Quote:


> Originally Posted by *sherlock*
> 
> Then what are the 750 Ti, 660 Ti, or 560 Ti 448? None of these cards competed with AMD's highest offering. They were created to fill a gap in NV's product line where it was not competitive with AMD.


The Ti I was referring to was the 980 Ti and 780 Ti. That Ti's existence is determined by AMD's best offering.

We may never see the light of a 1080 Ti if AMD's Vega fails badly.


----------



## Lee Patekar

Quote:


> Originally Posted by *guttheslayer*
> 
> We may never see the light of a 1080 Ti if AMD's Vega fails badly.


Its so obvious but some really want to believe...


----------



## sherlock

Quote:


> Originally Posted by *guttheslayer*
> 
> The Ti I was referring to was the 980 Ti and 780 Ti. That Ti's existence is determined by AMD's best offering.
> 
> We may never see the light of a 1080 Ti if AMD's Vega fails badly.


Then spell it out that you meant the X80 Ti. I agree that a 1080 Ti might not be released at all if Vega is noncompetitive, as the 1080 would be a higher-margin product at any reasonable 1080 Ti price point. We might still see a Titan P, though.


----------



## guttheslayer

Quote:


> Originally Posted by *sherlock*
> 
> Then spell it out that you meant the X80 Ti. I agree that a 1080 Ti might not be released at all if Vega is noncompetitive, as the 1080 would be a higher-margin product at any reasonable 1080 Ti price point. We might still see a Titan P, though.


The Titan P will exist no matter what AMD does, because there will still be plenty of chips to sell that fail to meet Tesla binning requirements.

Titan pricing has always been untouchable and will remain so, so the line's existence is independent of AMD's releases, apart from possible price inflation.


----------



## magnek

Quote:


> Originally Posted by *guttheslayer*
> 
> If the Titan is officially 50% faster than the 1080, then you will be shocked to find out it is 3x the RX 480 in raw performance alone.
> 
> That would require Vega to have around 6,900 cores to stand any chance of competing, and that's not counting overclocking, which would widen the gap even further.


I fail to see how the Titan P can be 50% faster than the 1080 with only 40% more shaders and likely a lower clock as well.

The Titan X was exactly 1.5x the 980 in terms of chip specs, but it only ended up about 25-30% faster. It needed quite a significant overclock to open the gap to 50%, and even then it took an overclocked Titan X against a stock 980.

I think the Titan P might end up 50% faster than the 1080 in select games with a pushed-to-the-limit overclock, but it will in no way be 50% faster in all scenarios, and certainly not stock vs. stock or OC vs. OC.
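As a rough sanity check on this argument, the raw FP32 throughput gap can be sketched in a few lines of Python. The Titan P shader count and clock below are assumptions borrowed from the Tesla P100, not confirmed specs for any consumer card:

```python
def tflops(shaders, boost_mhz):
    """Peak FP32 TFLOPS: shaders * clock * 2 FLOPs (FMA) per cycle."""
    return shaders * boost_mhz * 1e6 * 2 / 1e12

gtx_1080 = tflops(2560, 1733)   # GTX 1080 reference boost clock
titan_p  = tflops(3584, 1480)   # assumed: Tesla P100-like shaders/clock

print(f"GTX 1080: {gtx_1080:.1f} TFLOPS")          # ~8.9
print(f"Titan P (assumed): {titan_p:.1f} TFLOPS")  # ~10.6
print(f"ratio: {titan_p / gtx_1080:.2f}x")         # ~1.20x
```

On these assumed numbers the raw compute gap is closer to 20% than 50%, which is exactly the objection above: a 50% lead would have to come from bandwidth, IPC, or clocks, not shader count alone.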


----------



## vaseria

I actually find it sad that people stomp on AMD like they're garbage when they have very little money compared to Intel but can still make CPUs. Zen is supposed to match Haswell or be slightly better, I believe, and in response Intel is going to pound them hard, potentially removing them from the x86 market permanently, yet some fools are happy about this. I'm buying Zen regardless, because Zen has to be at least semi-successful or we're never going to see real performance gains in the x86 CPU market again; we haven't really seen much at all since Sandy Bridge, but now Cannonlake is all the talk. I'd say all Skylake was is Haswell on 14nm, and Cannonlake may just be Skylake on 10nm rather than a whole new leap. This is actually not good at all: without competition, nothing is going to push the industry forward, and we will be paying $400 for the 5-15% better performance that you could just get by overclocking the previous generation.


----------



## EniGma1987

Quote:


> Originally Posted by *SuperZan*
> 
> With their brand and ecosystem as strong as it is I don't think they'd need to do so if Vega doesn't pan out. Why spend extra money prepping another product when you can simply offer the 1080 or the Titan knowing that so many Nvidia owners will upgrade regardless? At that point the only reason for the 1080 Ti would be the double-milk, getting 1080 owners to upgrade again to the 1080 Ti, but if they don't reckon that the number of holdouts will be significant they could count on the Titan for that as well. They tested the waters with the 1080's FE pricing and now the Titan's typical MSRP doesn't seem a bridge too far to many, judging by the reactions we've seen.


Nvidia would already have to have the 1080 Ti designed if they wanted to be able to release it should Vega scare them. Since it would already be designed, there's no way Nvidia simply doesn't release the card should AMD not step up to the plate. That would be billions of R&D money down the drain to design and build the GPU and then never sell it, which doesn't make any sense. If Nvidia designed it, they will release it, no matter what AMD does.


----------



## SuperZan

Quote:


> Originally Posted by *EniGma1987*
> 
> Nvidia would already have to have the 1080 Ti designed if they wanted to be able to release it should Vega scare them. Since it would already be designed, there's no way Nvidia simply doesn't release the card should AMD not step up to the plate. That would be billions of R&D money down the drain to design and build the GPU and then never sell it, which doesn't make any sense. If Nvidia designed it, they will release it, no matter what AMD does.


It takes billions of R&D money to design a cut-down chip?

The 1060, 1070, 1080, and Titan will more than return Nvidia's R&D expenditures. The 1080 Ti is only necessary if you've got to compete with something that comes near the Titan for substantially less. Besides all that, if the Titan is at or below $1200 it's going to appeal to more people than you'd think, because they've already conditioned consumers to see the *80 chip at $699. There is no reason for Nvidia to compete with itself if it doesn't have to.


----------



## aDyerSituation

1. Titan P will not be 50% faster than 1080, and I'll put money on that
2. There won't be a 1080 Ti if Vega doesn't beat the 1080


----------



## PostalTwinkie

Not sure why people are acting like the "Ti" naming convention is new, and has some magical rules to being utilized. Nvidia has looooooooooong used the "Ti" suffix on cards at various tiers.


----------



## aDyerSituation

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Not sure why people are acting like the "Ti" naming convention is new, and has some magical rules to being utilized. Nvidia has looooooooooong used the "Ti" suffix on cards at various tears.


I can't think of an instance where it WASN'T used to fill in the gap for the competition


----------



## Lee Patekar

Quote:


> Originally Posted by *aDyerSituation*
> 
> I can't think of an instance where it WASN'T used to fill in the gap for the competition


Maybe that's why he said:
Quote:


> Originally Posted by *PostalTwinkie*
> 
> Nvidia has looooooooooong used the "Ti" suffix on cards at various *tears*.


Tears at selling a large die at a competitive price instead of selling it as Titans :^)


----------



## DNMock

Quote:


> Originally Posted by *aDyerSituation*
> 
> 1. Titan P will not be 50% faster than 1080, and I'll put money on that
> 2. There won't be a 1080 Ti if Vega doesn't beat the 1080


It very well may be, with 40% more shaders and HBM2 memory. It should overclock even higher given that it's got two 8-pin power connectors, giving it a theoretical cap of pulling 375+ W. Since the HBM2 should draw less power than GDDR5X, that's even more power for the GPU. It would also explain why the card is 12" long, probably for cooling.

Slap that baby under water with a custom BIOS and a 50% curb-stomping of the 1080 shouldn't be any issue.
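For reference, the 375 W and 300 W figures in the rumor follow directly from the PCIe power-delivery limits: 75 W from the slot plus 75 W per 6-pin and 150 W per 8-pin connector. A quick sketch:

```python
# PCIe rated power limits (per the PCI Express CEM specification).
SLOT_W = 75    # x16 slot
PIN6_W = 75    # 6-pin auxiliary connector
PIN8_W = 150   # 8-pin auxiliary connector

def board_power_budget(aux_connectors):
    """Rated board power: slot plus auxiliary connectors, in watts."""
    return SLOT_W + sum(aux_connectors)

print(board_power_budget([PIN8_W, PIN8_W]))  # 8+8-pin card: 375
print(board_power_budget([PIN8_W, PIN6_W]))  # 8+6-pin card: 300
```

These are rated limits, not hard electrical ceilings; connectors can physically deliver more, which is where the "375+" framing comes from.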


----------



## ChevChelios

I doubt 50% faster too tbh (maaaybe from the 375W one, but 375W is insane)

but we'll see


----------



## magnek

Quote:


> Originally Posted by *DNMock*
> 
> It very well may be, with 40% more shaders and HBM2 memory. It should overclock even higher given that it's got two 8-pin power connectors, giving it a theoretical cap of pulling 375+ W. Since the HBM2 should draw less power than GDDR5X, that's even more power for the GPU. It would also explain why the card is 12" long, probably for cooling.
> 
> Slap that baby under water with a custom BIOS and a 50% curb-stomping of the 1080 shouldn't be any issue.


Power isn't what's holding 1080 back, so 2x8 pins on the Titan P doesn't mean it'll overclock higher. If anything the big chips always overclock a little worse because of having more of everything and thus more potential weak links.

Also this:
Quote:


> Originally Posted by *magnek*
> 
> I fail to see how the Titan P can be 50% faster than the 1080 with only 40% more shaders and likely a lower clock as well.
> 
> The Titan X was exactly 1.5x the 980 in terms of chip specs, but it only ended up about 25-30% faster. It needed quite a significant overclock to open the gap to 50%, and even then it took an overclocked Titan X against a stock 980.
> 
> I think the Titan P might end up 50% faster than the 1080 in select games with a pushed-to-the-limit overclock, but it will in no way be 50% faster in all scenarios, and certainly not stock vs. stock or OC vs. OC.


----------



## Asmodian

Quote:


> Originally Posted by *magnek*
> 
> Power isn't what's holding 1080 back, so 2x8 pins on the Titan P doesn't mean it'll overclock higher. If anything the big chips always overclock a little worse because of having more of everything and thus more potential weak links.
> 
> Also this:


Very true, it doesn't look like power is holding back any of the new GPUs. I fully expect the Titan P to overclock a bit worse than the 1080, much like GM204 vs. GM200.

However, how bandwidth-limited is the 1080? With HBM2, a Titan P should have around 100% more bandwidth than the 1080 to feed its 40% more shaders. Not that it changes anything for me if it is 40% instead of 50% faster than the 1080; I just want something significantly faster than my 980 Ti.
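The bandwidth comparison is easy to put numbers on. Treating the GTX 1080 as 256-bit GDDR5X at 10 Gbps per pin, and assuming a Titan P carried Tesla P100-style HBM2 (three or four 1024-bit stacks at roughly 1.4 Gbps per pin; these HBM2 figures are assumptions, not confirmed specs for any Titan):

```python
def bandwidth_gbs(bus_width_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s: bus width * per-pin rate / 8 bits."""
    return bus_width_bits * gbps_per_pin / 8

gddr5x_1080 = bandwidth_gbs(256, 10.0)   # GTX 1080: 320 GB/s
hbm2_3stack = bandwidth_gbs(3072, 1.4)   # assumed, P100-like: ~538 GB/s
hbm2_4stack = bandwidth_gbs(4096, 1.4)   # assumed, P100-like: ~717 GB/s

for name, bw in [("GDDR5X 1080", gddr5x_1080),
                 ("HBM2 3-stack", hbm2_3stack),
                 ("HBM2 4-stack", hbm2_4stack)]:
    print(f"{name}: {bw:.0f} GB/s ({bw / gddr5x_1080 - 1:+.0%} vs 1080)")
```

Under these assumptions the three-stack variant would have roughly 68% more bandwidth than the 1080 and the four-stack variant about 124% more, so "100% more" is plausible only for the rumored 16 GB card.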


----------



## ChevChelios

Quote:


> However, how bandwidth limited is the 1080?


At 1440p it's definitely not.

At 4K, I don't know, but I think it isn't a major issue there either.


----------



## Asmodian

Quote:


> Originally Posted by *ChevChelios*
> 
> At 1440p it's definitely not.
> 
> At 4K, I don't know, but I think it isn't a major issue there either.


Any reviews? I cannot find any good benchmarks where the impact of overclocking the 1080's memory was tested in isolation; everyone keeps benching with both core and memory overclocked (or just the core).


----------



## magnek

This is the closest I could find: https://www.reddit.com/r/nvidia/comments/4mm5zt/for_those_overclocking_1080_memory_please_read/

But just looking at the OC results from various sites, they seem to fall in line with expectations, so can't see any obvious bandwidth issues.


----------



## Asmodian

Quote:


> Originally Posted by *magnek*
> 
> This is the closest I could find: https://www.reddit.com/r/nvidia/comments/4mm5zt/for_those_overclocking_1080_memory_please_read/
> 
> But just looking at the OC results from various sites, they seem to fall in line with expectations, so can't see any obvious bandwidth issues.


I am sure it isn't starved for bandwidth, but even the Fury gains performance from a memory overclock; HBM2 should add performance compared to 256-bit GDDR5X.


----------



## okcomputer360

Looks like this is the push Intel needed to unleash their super-tech processors instead of inching along with 10% gains each release to hold AMD down. Let's hope!


----------



## ryder

Quote:


> Originally Posted by *okcomputer360*
> 
> Looks like this is the push Intel needed to unleash their super-tech processors instead of inching along with 10% gains each release to hold AMD down. Let's hope!


so true


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *okcomputer360*
> 
> Looks like this is the push Intel needed to unleash their super-tech processors instead of inching along with 10% gains each release to hold AMD down. Let's hope!


I think Zen is going to be the push that Intel needs to release something more impressive than the meh products they've been releasing since SB, not some new card from Nvidia...


----------



## okcomputer360

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I think Zen is going to be the push that Intel needs to release something more impressive than the meh products they've been releasing since SB, not some new card from Nvidia...


The perfect storm is brewing for intel to release more hidden alien tech on us!


----------



## guttheslayer

Quote:


> Originally Posted by *magnek*
> 
> I fail to see how the Titan P can be 50% faster than the 1080 with only 40% more shaders and likely a lower clock as well.
> 
> The Titan X was exactly 1.5x the 980 in terms of chip specs, but it only ended up about 25-30% faster. It needed quite a significant overclock to open the gap to 50%, and even then it took an overclocked Titan X against a stock 980.
> 
> I think the Titan P might end up 50% faster than the 1080 in select games with a pushed-to-the-limit overclock, but it will in no way be 50% faster in all scenarios, and certainly not stock vs. stock or OC vs. OC.


You are right if they have the same IPC, but there are a lot of hidden benefits in GP100 vs. GP104.

More cache, fewer cores per SM, and the very low-latency HBM2 might boost IPC by a fair bit, so we won't know how much of an IPC advantage GP100 has over GP104. The overall architecture is designed better than GP104's, tbh.


----------



## renejr902

If two Titan models are released in the near future, I don't think we will see a 1080 Ti. Otherwise I still believed a 1080 Ti would exist regardless of how AMD performs this generation, but all of you have good points about that, so maybe I will change my mind.

It's true that I won't see Nvidia spend money on a 1080 Ti and take lower profit versus the 1080, you're right, but maybe Nvidia will still release a 1080 Ti for the 980 Ti owners waiting for one? I'm still not sure; maybe you're right, guys. I'm not stubborn, I wanted proof, and some of you are convincing enough to change my mind after all.

Back in the GeForce 2 days (I was there, I know I'm old) there was the GeForce 2 GTS and the GeForce 2 Ultra, then a GeForce 2 Pro toward the end, and finally a GeForce 2 Ti released to be competitive with the Radeon 7500. Without the Radeon 7500, a GeForce 2 Ti should never have existed, so the Ti practice is an old one; I just thought times had changed. But I suppose I was wrong after all, so I will have to buy a Titan at $1400 if there is no Ti by spring 2017, or a Titan at $1099. I won't be more patient.

I too fear the Titan won't be 50% better than the 1080; that will be hard to do. Isn't 30-40% more cores and double the bandwidth enough?

Come on, give me reputation, I changed my mind. LOL, you got me!


----------



## ChevChelios

Yeah, I'd say if there are actually two Titans then a 1080 Ti may not come, but then the lesser Titan would have to serve as some form of replacement for the 1080 Ti anyway.


----------



## renejr902

Quote:


> Originally Posted by *ChevChelios*
> 
> Yeah, I'd say if there are actually two Titans then a 1080 Ti may not come, but then the lesser Titan would have to serve as some form of replacement for the 1080 Ti anyway.


I checked the history, and I remember that back then, without the Radeon 7500, no GeForce 2 Ti would have existed. Three GeForce 2 flagship models had already been released, and a fourth should never have existed without the Radeon 7500. I bought that Radeon 7500 to replace my GeForce 2 GTS, but to me the change was not worth it. Even in those days some games were badly optimized on Radeon cards, and I was disappointed: Rally Trophy was not really playable on the Radeon 7500, while I had no trouble getting high fps with my GeForce 2 GTS, and other games were only slightly better on the Radeon 7500. After that I bought a GeForce 4 Ti 4200, but the GeForce 4 series is another story: all the mid-range and high-end cards were Ti cards, which is pretty weird. The GeForce 6 had a 6800 GT and Ultra (the Ultra being the Titan of its day) but no Ti, because there was no need for one. History repeats itself every time, so in the end the Ti exists only to save face against the competition; I think I was wrong after all. The GT in those days was for people who would not pay for the Ultra, and there was not much performance difference between a GT and an Ultra, like high end versus enthusiast today. The plain 6800, without GT or Ultra, was like an entry point to the high end.


----------



## pez

Quote:


> Originally Posted by *Asmodian*
> 
> Any reviews? I cannot find any good benchmarks where the impact of overclocking the memory on the 1080 was tested, everyone keeps benching with both overclocked (or just the core).


I've done some very limited 4K benchmarking with Hitman: Absolution, as it was the hardest game for me to get stable on previous cards. I found my highest stable GPU clock, then set it back and worked with memory clocks.

The best I can give you is that across a series of 3 runs at each step (I started at +500, went to +525, etc.), with +536 MHz on top (literally 536, so 10546 MHz in my case), I saw minimum FPS readings go up from 70 to 72 a couple of times, and once it hit 74, but I'd chalk that up to margin of error as it wasn't consistent. Overall, in this game, I didn't find GPU or memory clock to really affect much: a slightly higher average, but for 4K I consider a 60 FPS minimum the more important target. IIRC, all of my testing pretty much gave me 70 minimum, 79 average, and 89 max.


----------



## renejr902

I tried to find a 1080 benchmark comparing only memory clocks, with no luck. I did that test myself with my 980 Ti some months ago in The Witcher 3 at ultra settings at 4K and got some results: with no memory overclock I got 44 fps, and with a full memory overclock 46-47 fps in the same location. I will test this with my 1070 in a few days and report back (I have two babies, so be patient, lol), though it would be better with a 1080. Try underclocking your 1080's memory by several MHz and compare the result against a fully overclocked memory.
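Those 980 Ti numbers can be turned into a rough bandwidth-scaling figure. Assuming a typical full 980 Ti memory overclock of 7 to 8 Gbps effective (about 14% more bandwidth; the overclock size is my assumption, not stated in the post), the observed fps gain works out to roughly 40% of the bandwidth gain:

```python
def bw_scaling_efficiency(fps_stock, fps_oc, bw_stock, bw_oc):
    """Fraction of a relative bandwidth increase realized as an fps increase."""
    fps_gain = fps_oc / fps_stock - 1
    bw_gain = bw_oc / bw_stock - 1
    return fps_gain / bw_gain

# Witcher 3 @ 4K figures from the post above; 7 -> 8 Gbps effective
# memory speed is an assumed typical 980 Ti overclock.
eff = bw_scaling_efficiency(44, 46.5, 7.0, 8.0)
print(f"{eff:.0%} of the bandwidth gain showed up as fps")
```

Well under 100% efficiency suggests the 980 Ti was only partially bandwidth-limited in that scene; the same method works for the proposed 1070/1080 tests.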


----------



## guttheslayer

Anyway, Vega has a long way to go if AMD wants to catch up with NVIDIA's highest offering.


----------



## FattysGoneWild

Quote:


> Originally Posted by *guttheslayer*
> 
> Anyway, Vega has a long way to go if AMD wants to catch up with NVIDIA's highest offering.


AMD is not capable. Period.


----------



## junkman

Quote:


> Originally Posted by *FattysGoneWild*
> 
> AMD is not capable. Period.


I think they are fully capable. They are a bit behind on perf/watt at stock clocks, but I think they can deliver a 1080 contender. After all, it is only a mid-range card.

A Titan P contender will require HBM2 to keep within TDP limits. I'm glad to see that with Vulkan and async support, HBM is finally getting the praise it is due.

I don't think we will see either of those until next year. Yields are poor on these processes, as FinFET technology was just a (novel) way to salvage the 20nm failure a few years ago.

I can't even imagine the yield issues with a large, dense GPU die. Why else do you think they charge an arm and a leg?


----------



## st0necold

Quote:


> Originally Posted by *rcfc89*
> 
> I disagree. I played on the Acer Predator XB271HK, which is superior in every way to the 4K display you game on (input lag, G-Sync, etc.), and it only took me a week to send it back. 4K simply is not ready. Even with G-Sync and maintaining 60 fps, things felt laggy and sloppy no matter the game: Fallout 4, BF4, Doom, Battlefront, FC Primal. I believe our currently available DisplayPort tech is what's causing this experience. No matter how strong your PC is and how well designed the display is, the cable is struggling to push things along smoothly. In today's market I believe the best available experience in PC gaming is the X34. Unreal immersion, fantastic picture quality and vibrancy, along with 100 Hz and G-Sync. I played on a friend's this past Friday, sent my 4K back today, and ordered the X34.


I agree.

I picked up an XB270HK (4K, IPS, G-Sync) and it was a great monitor, but playing Battlefield (all of them) just was not happening... the other games looked great, but until they can get 4K at 100 Hz+ it's a no-go for me.


----------



## rcfc89

Quote:


> Originally Posted by *Lee Patekar*
> 
> History never repeats but it rhymes. This is my prediction:
> 
> 1080 GTX -> Titan P
> 
> if no Vega (very long delay), time passes and 1180 GTX is based on GP100/GP102
> if Vega is delayed, 1080TI is delayed
> if Vega isn't competitive, time passes and 1180 GTX is based on GP100/GP102
> if Vega is competitive, 1080 TI to beat it
> if Vega is competitive with 1080 TI, expect Volta sooner than later
> 
> Personally I'm hoping for the last two options; they are the best for all consumers. But I don't take anything for granted... we could see another 680 GTX -> Titan -> 780 GTX should AMD stumble again.
> 
> As for Montreal, who cares?


Where is the Lightning version of the 1080 then? It's not coming. MSI skipped the Lightning for the 980 and saved it for the 980 Ti. The same thing will happen this generation. The 1080 Ti is coming, book it. Lol if you think Nvidia cares about cutting into 1080 sales: three high-end cards on the market are better than two.


----------



## Asmodian

Quote:


> Originally Posted by *rcfc89*
> 
> Where is the Lightning version of the 1080 then? It's not coming. MSI skipped the Lightning for the 980 and saved it for the 980 Ti. The same thing will happen this generation. The 1080 Ti is coming, book it. Lol if you think Nvidia cares about cutting into 1080 sales: three high-end cards on the market are better than two.


Good point about the Lightning; MSI would be likely to know if there was a higher-end card coming out. Of course, Nvidia might let them make a Lightning Titan instead.

I agree, cutting into 1080 sales with the 1080 Ti doesn't even make sense; Nvidia makes more money on the Ti. They might worry about losing a Titan sale to a 1080 Ti, but they would want to sell a 1080 Ti instead of a 1080.


----------



## rcfc89

Quote:


> Originally Posted by *Asmodian*
> 
> Good point about the Lightning; MSI would be likely to know if there was a higher-end card coming out. Of course, *Nvidia might let them make a Lightning Titan instead*.
> 
> I agree, cutting into 1080 sales with the 1080 Ti doesn't even make sense; Nvidia makes more money on the Ti. They might worry about losing a Titan sale to a 1080 Ti, but they would want to sell a 1080 Ti instead of a 1080.


Bro, you're giving me goosebumps, haha. Anyway, there's a very slim chance we ever get aftermarket PCBs on the Titan brand. We haven't had one yet.


----------



## Lee Patekar

Quote:


> Originally Posted by *Asmodian*
> 
> I agree, cutting into 1080 sales with the 1080 Ti doesn't even make sense, Nvidia makes more money on the Ti. They might worry about losing a Titan sale to a 1080 Ti but they would want to sell a 1080Ti instead of a 1080.


Ideally they'd want to sell the 1080 GTX, which is cheaper to produce, for 600-700 bucks, then eventually release its larger brother as an 1180 GTX later when yields are better (all while selling it as a Titan beforehand for $1000+).

The worst case is they have to compete with AMD and release the larger die before yields improve, costing them more in production while selling it at a competitive price point, which usually sits around $600-800.

It's basic economics, guys. It's also why we never saw a 680 Ti.

I know you want an affordable 600 mm^2 Pascal chip. I want it too... but it won't magically materialize without AMD.


----------



## EniGma1987

Quote:


> Originally Posted by *Asmodian*
> 
> Of course, Nvidia might let them make a Lightning Titan instead.


Highly doubt that. Nvidia has never let there be any custom Titans at all; I don't see why they would start now.


----------



## rcfc89

Quote:


> Originally Posted by *Lee Patekar*
> 
> Ideally they'd want to sell the 1080 GTX, which is cheaper to produce, for 600-700 bucks, then eventually release its larger brother as an 1180 GTX later when yields are better (all while selling it as a Titan beforehand for $1000+).
> 
> The worst case is they have to compete with AMD and release the larger die before yields improve, costing them more in production while selling it at a competitive price point, which usually sits around $600-800.
> 
> _*It's basic economics, guys. It's also why we never saw a 680 Ti.*_
> 
> I know you want an affordable 600 mm^2 Pascal chip. I want it too... but it won't magically materialize without AMD.


That particular history is flawed. Before the 780 Ti, the Ti brand was reserved for mid-range chips like the 560 Ti and the 660 Ti, which had nothing to do with competition from AMD. Ever since the 780 Ti, Nvidia has reserved the Ti brand for the higher-end 80-series chip. It has worked out very well for them, and I don't see anything changing this generation.


----------



## ChevChelios

Lightning big Titan P $1600


----------



## Lee Patekar

Quote:


> Originally Posted by *rcfc89*
> 
> That particular history is flawed. Before the 780 Ti, the Ti brand was reserved for mid-range chips like the 560 Ti and the 660 Ti, which had nothing to do with competition from AMD. Ever since the 780 Ti, Nvidia has reserved the Ti brand for the higher-end 80-series chip. It has worked out very well for them, and I don't see anything changing this generation.


Then explain the 680 GTX relative to the 780 GTX and why they could justify selling the first Titan.
Then compare the release date of the 780 Ti with the R9 290.
Then compare the release date of the 980 Ti with the Fury line.

Decisions aren't made on a whim, nor do they just follow a pattern. It's a dance, like a nice tango, with AMD.

Edit: Actually, if the Vega launch were imminent and early numbers showed it beating the 1080 GTX by a large margin, we wouldn't be getting Titan rumors, we'd be getting 1080 Ti rumors, and we probably wouldn't see a Titan based on Pascal. The Titan brand only makes sense if the Ti card isn't imminent.


----------



## Asmodian

Quote:


> Originally Posted by *Lee Patekar*
> 
> Then explain the 680 GTX relative to the 780 GTX and why they could justify selling the first Titan.
> Then compare the release date of the 780 TI with the R9 290
> Then compare the release date of the 980 TI with the Fury line.
> 
> Decisions aren't made on a whim nor do they just follow a pattern. Its a dance, like a nice tango, with AMD.


I seriously doubt AMD means anything to Nvidia this round; they are competing with their own old lineup, not AMD, for at least the rest of this year.

It is hard to tango when your partner isn't on the dance floor.


----------



## rcfc89

Quote:


> Originally Posted by *Lee Patekar*
> 
> Then explain the 680 GTX relative to the 780 GTX and why they could justify selling the first Titan.
> Then compare the release date of the 780 Ti with the R9 290.
> Then compare the release date of the 980 Ti with the Fury line.
> 
> Decisions aren't made on a whim, nor do they just follow a pattern. It's a dance, like a nice tango, with AMD.


Your argument is based on release dates now? My post was to explain that a 1080 Ti will indeed release; when it comes out is entirely up to Nvidia. Either way, that's beside the point. Your original point was that it wouldn't release because of "lack of competition," "hurting 1080 sales," and the fact that a 680 Ti didn't exist. The last point is bogus, as I explained in my last post. The 1080 Ti will be released. When may or may not be affected by what AMD does, but I doubt it. Nvidia has taken full control of the high-end GPU market; I don't think they really care what AMD does from this point on.


----------



## Lee Patekar

Quote:


> Originally Posted by *Asmodian*
> 
> I seriously doubt AMD means anything to Nvidia this round; they are competing with their own old lineup, not AMD, for at least the rest of this year.
> 
> It is hard to tango when your partner isn't on the dance floor.
> 
> It is hard to tango when your partner isn't on the dance floor.


If they have no competition (and it's looking like that), then they'll do what Intel does in the desktop space. What is Intel doing? Very small performance increases and a stagnant feature set (4 cores, 8 threads). Their die sizes keep shrinking, prices stay static, and profits increase with each generation.

With no competition, nVidia will concentrate on their Tesla line, where they compete with the Xeon Phi. There's no need to undercut their current top seller in the gaming market. That means no 1080 Ti, only a Titan P. Then in about a year they'll release big Pascal as an 1180 GTX, the new "generation", the same trick they pulled with the 780 GTX.

We want AMD to compete. If Zen is competitive we may yet see an 8-core processor for $350. If Vega is competitive we may see low prices on a 1080 Ti ($500-600). Without competition we get monopolies that concentrate on maximizing profits instead of innovating: we get a Titan line from nVidia and an "Extreme Edition" line from Intel.

It always amazes me how people fail to understand this simple fact.
Quote:


> Originally Posted by *rcfc89*
> 
> I don't think they really care what AMD does from this point on.


They did in the past. If AMD chooses not to compete this generation, then forget about a 1080 Ti. That's the point I'm making: we need strong competition to spur innovation and raise the performance/price ratio of all offerings.


----------



## magnek

It boggles my mind that people are so insistent that 1080 Ti will be a thing regardless of competition. AMD has nothing to compete with 1080 right now, and if Vega only ends up around 1080 level, that's still not much competition. So if this were to happen, why exactly would nVidia need to bring out a 1080 Ti? Because 980 Ti and Titan X owners need something to upgrade to? Well the 1080 will always be around, and if you want more of an upgrade, feel free to step right up to the $1300+ Titan P. No 1080 Ti needed. Or you could always go and buy Vega if you wanted to save a few bucks (yeah like that's gonna happen *snort*).


----------



## SuperZan

Quote:


> Originally Posted by *magnek*
> 
> It boggles my mind that people are so insistent that 1080 Ti will be a thing regardless of competition. AMD has nothing to compete with 1080 right now, and if Vega only ends up around 1080 level, that's still not much competition. So if this were to happen, why exactly would nVidia need to bring out a 1080 Ti? Because 980 Ti and Titan X owners need something to upgrade to? Well the 1080 will always be around, and if you want more of an upgrade, feel free to step right up to the $1300+ Titan P. No 1080 Ti needed.


You're forgetting Jen-Hsun's long and storied tradition of altruism and concern for the consumer!

....

I'll see myself out.

Seriously though, neither company has any incentive to compete against itself, and Nvidia hasn't reached its market-share levels by playing the nice guy. The 980 Ti is technically supplanted by the 1080, and if that's not enough of an increase you can buy a Titan. I can't stress enough that even with the Titan's likely increased price, the increased price of the *80 has made the Titan seem 'reasonable' by comparison if you're the sort who was already inclined to consider one.


----------



## ChevChelios

I still think there has to be either a second, smaller and cheaper Titan or a 1080 Ti, both to give 980 Ti owners more options and incentive to upgrade, and to release a direct rival to Vega.


----------



## rcfc89

Quote:


> Originally Posted by *SuperZan*
> 
> You're forgetting Jen-Hsun's long and storied tradition of altruism and concern for the consumer!
> 
> ....
> 
> I'll see myself out.
> 
> Seriously though, neither company has any incentive to compete against itself and Nvidia hasn't reached its market-share levels through playing the nice guy. 980 Ti is technically supplanted by the 1080, and *if that's not enough of an increase you can buy a Titan*. I can't stress enough too that even with the likely increased price of the Titan the increased price of the *80 has made it seem 'reasonable' by comparison if you're the sort that was already inclined to consider a Titan.


My only argument with that is that Nvidia has created a monster by moving the Ti brand to the 80 series. The Titan has been and always will be a limited luxury piece with a very poor price/performance ratio. The 1080 currently doesn't offer enough of a performance increase to get most 980 Ti owners to upgrade. The Titan will, but again, the price will likely be ridiculous and only attract a limited number of 980 Ti owners. The Ti in the last two generations, 780 Ti/980 Ti, has been the best Nvidia bargain, if you will, for price/performance. I just can't see Nvidia taking that option away, an option that has made them a lot of money. Again, three options at the top will yield much better revenue for the company: 1080 = high end, 1080 Ti = ultra-high end, Titan = luxury. And not seeing anything in the works for an MSI Lightning 1080 tells me without a doubt the 1080 Ti is coming.


----------



## magnek

The point being that in the absence of a 1080 Ti, if you wanted more performance than a 1080, you'd have to buy a Titan. The only other option is to not upgrade and keep waiting.

This is of course assuming that Vega will only be around 1080 level, which is why more than one person has said what AMD does matters.


----------



## Lee Patekar

Quote:


> Originally Posted by *rcfc89*
> 
> 1080 currently doesn't offer enough of a performance increase to get most 980Ti owners to upgrade.


Without AMD, 980 Ti owners will need to wait for the GTX 1180.
Quote:


> Originally Posted by *rcfc89*
> 
> The Ti in the last 2 generations 780Ti/980Ti has been the best Nvidia bargain if you will for price/performance.


That's because the 780 TI was direct competition for AMD's R9 290 and the 980 TI was direct competition for AMD's Fury lineup. Why you refuse to accept this is beyond me.


----------



## criminal

Quote:


> Originally Posted by *magnek*
> 
> The point being that in the absence of a 1080 Ti, if you wanted more performance than a 1080, you'd have to buy a Titan. The only other option is to not upgrade and keep waiting.
> 
> This is of course assuming that Vega will only be around 1080 level, which is why more than one person has said what AMD does matters.


I agree that if Vega can only compete with the 1080, Titan P will be the only choice available as an upgrade path for 980Ti owners unless they want to wait for Volta. Why waste chips on a 1080ti, when you can get so much more from the chips selling them as Titans?


----------



## SuperZan

Quote:


> Originally Posted by *criminal*
> 
> I agree that if Vega can only compete with the 1080, Titan P will be the only choice available as an upgrade path for 980Ti owners unless they want to wait for Volta. Why waste chips on a 1080ti, when you can get so much more from the chips selling them as Titans?


Yep, and judging by the 1080's pricing scheme, a 1080 Ti would have to be at least $800 USD "MSRP", to say nothing of a Founders Edition. At that point you're basically talking Titan prices, and a ~$1200 Titan begins to seem 'reasonable'.


----------



## bigjdubb

I haven't kept up with this thread but I really really hope that Vega falls much closer to the TitanP than the 1080. I will most likely get one either way though, unless Nvidia suddenly starts supporting freesync monitors.


----------



## Seyumi

I already see where the new "Titan Mini" direction is going, if this is true:

x80 series -> Big Titan: Charge $1200~$1500 instead of $500
x70 series -> Small Titan: Charge $1000 instead of $350
x60 Series -> New x80 Ti series: Charge $850 instead of $200
x50 series -> new x80 series: Charge $700 instead of $100

If you think I'm just pulling that out of my ass, then just look at the newest pricing schemes. You used to be able to buy Nvidia's highest-end GPU for $500. I don't think inflation is 100% over the course of 5-ish years. Anyone old enough who's been buying and installing graphics cards since the beginning already knows Nvidia's new BS pricing model after they released the "high end" GTX 680. If they do a Titan AND a Titan Mini, then they've just done it AGAIN and taken it a step further.

I'll let you in on a classified Nvidia company secret. A "high end" GPU with a single 8-pin or 2x 6-pin power connector is NOT a "highly efficient", "advanced technology" GPU... it's a mid-range GPU being labeled and sold as a high-end GPU. Nvidia made it sound like they reinvented the wheel when they stuck two "high end" GTX 680s on a single card and called it a GTX 690, which wasn't completely gimped like the previous generation of dual GPUs. Not really a feat of technology when they were really just 2x mid-range GPUs...


----------



## Asmodian

Quote:


> Originally Posted by *criminal*
> 
> I agree that if Vega can only compete with the 1080, Titan P will be the only choice available as an upgrade path for 980Ti owners unless they want to wait for Volta. Why waste chips on a 1080ti, when you can get so much more from the chips selling them as Titans?


I agree that an "affordable" GP100/102 will not be released if Vega isn't competitive, but there is more to optimizing profit than pricing one product through the roof because you do not have strong competition. The x80Ti seems to be a strong brand at this point, probably even stronger than Titan thanks to the 980Ti; there is no reason not to capitalize on it.

Think of price vs. demand curves: you want to maximize your profit, which means picking the optimal point on the price/demand curve. Having multiple products allows you to cover multiple points on the curve. This also allows you to raise the top price point a lot, well above what would be optimal with only one product. All they need is something that differentiates the higher price points enough that not too many people who would have paid more get the cheaper version.

HBM2 might do it: a 12GB HBM2 Titan at $1400, a 16GB one at $2000, and a 12GB GDDR5X 1080Ti at $999 might offer Nvidia higher profits than simply trying to sell the two Titans at $1200 and $1500.

It is hard to know; I don't have a lot of good market research data on GPUs, but it really isn't as simple as "AMD has nothing near this performance so we will just sell this one for a lot".

Intel still releases the cheap versions of their -E line. Why doesn't Intel only sell the 6950X, since AMD doesn't have anything to compete with at the high end? The 1080 is the 6700K, and a theoretical 1080Ti would be the 6800K/6850K.
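The multi-price-point argument above can be made concrete with a toy model; every price and buyer budget below is invented purely for illustration, not market data:

```python
# Toy price/demand model: each buyer has a maximum willingness to pay and
# buys the most expensive tier they can afford. All figures are invented.

def revenue(tier_prices, buyer_budgets):
    """Total revenue when each buyer pays the highest tier price within budget."""
    total = 0
    for budget in buyer_budgets:
        affordable = [p for p in tier_prices if p <= budget]
        if affordable:
            total += max(affordable)
    return total

# Hypothetical enthusiast market: a few whales, more value-minded buyers.
budgets = [2000] * 2 + [1400] * 5 + [1000] * 20

one_titan = revenue([1200], budgets)               # single price point
three_tiers = revenue([999, 1400, 2000], budgets)  # Ti + two Titan tiers

print(one_titan, three_tiers)  # 8400 30980
```

The tiered lineup both raises the ceiling for the whales and captures the 20 buyers a lone $1200 card would have priced out, which is exactly the "cover multiple points on the curve" idea.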


----------



## SuperZan

Quote:


> Originally Posted by *Asmodian*
> 
> I agree that an "affordable" GP100/102 will not be released if Vega isn't competitive but there is more to optimizing profit than pricing one product through the roof because you do not have strong competition. The x80Ti seems to be a strong brand at this point, probably even stronger than Titan due to the 980Ti, no reason not to capitalize on it.
> 
> Think of price v.s. demand curves, you want to maximize your profit which means picking the optimal point on the price/demand curve. Having multiple products allows you to cover multiple points on the curve. This also allows you to increase the top price point at lot, well above what would be optimal given only one product. All they need is something to differentiate the higher price points enough that not too many people who would have paid more get the cheaper version.
> 
> HBM2 might do it, a 12GB HBM2 Titan at $1400, a 16GB one at $2000, and a 12GB GDDR5X 1080Ti at $999 might offer Nvidia higher profits than simply trying to sell the two Titans for $1200 and $1500.
> 
> It is hard to know, I don't have a lot of good market research data on GPUs, but it really isn't as simple as "AMD has nothing near this performance so we will just sell this one for a lot".


You make some good points, though in a larger sense I'd say that a 1080 Ti as you describe it, at that price point, would ipso facto be a Titan. I could definitely see a tier-based scheme as you lay out, and I think that's actually a very reasonable prediction assuming a non-competitive Vega. What I'll say as a qualification is that when we discuss the *80 Ti, we're usually talking about Titan performance at a price nearer the *80 than the Titan, at least for reference cards. Going with your pontification and adding the element of the FE, the *80 Ti becoming a 'tiered' member of the Titan grouping by virtue of price and performance will have eliminated that 'superb value' feeling that people took from the 980 Ti. I'm not saying it wouldn't sell - it would, and likely very well. It's a bit of labelling semantics, but I think it's worth considering in terms of what consumers expect and hope for from the *80 Ti.


----------



## magnek

A "1080 Ti" in name only is really just another Titan.


----------



## Kpjoslee

It is just hard to imagine having 600mm^2 gaming GPU this year when we just started getting 16nm 1080 in May in limited quantities.


----------



## Xuvial

Quote:


> Originally Posted by *Seyumi*
> 
> I already see where the new "Titan Mini" direction is going in if true:
> 
> x80 series -> Big Titan: Charge $1200~$1500 instead of $500
> x70 series -> Small Titan: Charge $1000 instead of $350
> x60 Series -> New x80 Ti series: Charge $850 instead of $200
> x50 series -> new x80 series: Charge $700 instead of $100
> 
> If you think I'm just pulling that out of my ass then just look at the newest pricing schemes. You used to be able to buy Nvidia's highest end GPU for $500. I don't think inflation is 100% over the course of 5'ish years. Anyone old enough whos been buying and installing graphics cards since the beginning already know Nvidias new BS pricing model after they released the "high end" gtx 680. If they do a Titan AND a Titan mini then they just did it AGAIN and just took this a step further.
> 
> I'll let you in on a secret classified Nvidia company secret. A "high end" GPU with a single 8 pin or a 2x6pin power connector is NOT a "highly efficient" and "advanced technology" GPU....it's a mid-range GPU being labeled & sold as a high end GPU. Nvidia made it sound like they reinvented the wheel when they stuck two "high end" GTX 680's on a single card and called it a GTX 690 which wasn't completely gimped like the previous generation of dual GPUs. Not really a feat of technology when they were really just 2x midrange GPUs...


Call it BS or whatever, ultimately nVidia can do whatever they want as long as it remains highly profitable, which heavily depends on competition from AMD. At the end of the day they're a business with shareholders and their business decisions are determined by sales. And sales are damn good right now.


----------



## Asmodian

Quote:


> Originally Posted by *SuperZan*
> 
> You make some good points though in a larger sense I'd say that a 1080 Ti as you describe it at that price-point would ipso facto be a Titan. I could definitely see a tier-based scheme as you lay out, and I think that's actually a very reasonable prediction assuming a non-competitive Vega. What I'll say as a qualification is that when we discuss the *80 Ti, we're usually talking about Titan performance at a price nearer the *80 than the Titan, at least as far as reference. Going with your pontification and adding the element of FE, the *80 Ti becoming a 'tiered' member of the Titan grouping by virtue of price and performance will have eliminated that 'superb value' feeling that people took from the 980 Ti. I'm not saying it wouldn't sell - it would, and likely very well. It's a bit of labelling-semantics but I think it's worth considering in terms of what consumers expect and hope for with regards to *80 Ti.


You make a good point; I think this is exactly why people do not believe a 1080Ti will come out if Vega isn't competitive.

And yes, a $999 1080Ti could easily be thought of as part of the Titan line as far as price/performance goes. However, assuming AMD is not competitive (NOT a given), they might be able to push up the top price point in the way the original Titan did, and the Titan Z was laughed at for trying.

Ouch.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *magnek*
> 
> It boggles my mind that people are so insistent that 1080 Ti will be a thing regardless of competition. AMD has nothing to compete with 1080 right now, and if Vega only ends up around 1080 level, that's still not much competition. So if this were to happen, why exactly would nVidia need to bring out a 1080 Ti? Because 980 Ti and Titan X owners need something to upgrade to? Well the 1080 will always be around, and if you want more of an upgrade, feel free to step right up to the $1300+ Titan P. No 1080 Ti needed. Or you could always go and buy Vega if you wanted to save a few bucks (yeah like that's gonna happen *snort*).


Exactly. If Vega turns out to be the true flop the Nvidia fanboys seem to be praying for, you can go ahead and mark it down that there will be no 1080Ti at all...

Quote:


> Originally Posted by *Kpjoslee*
> 
> It is just hard to imagine having 600mm^2 gaming GPU this year when we just started getting 16nm 1080 in May in limited quantities.


That's because it ain't happening, no matter how overexcited guttheslayer's rhetoric gets...


----------



## xTesla1856

I see a lot of people speculating and saying that Vega will be non-competitive and inferior to the Titan. Do we even know anything solid about Vega yet?


----------



## SuperZan

Quote:



> Originally Posted by *xTesla1856*
> 
> I see a lot of people speculating and saying that Vega will be non-competitive and inferior to the Titan. Do we even know anything solid about Vega yet?


Nope. Our very best guesstimates can only be formed from our incomplete picture of Polaris and that's not exactly solid ground IMO. I personally believe Vega will be competitive with a superior price/performance ratio. My guess is something oriented towards 1080 performance and something a bit more powerful that puts up a good showing against a Pascal Titan (note that I'm not claiming that I believe Vega will beat a Pascal Titan). I think in that event both chips will undercut their competition in terms of price. All just IMO, things could go in a radically different direction.


----------



## ChevChelios

Quote:


> Originally Posted by *xTesla1856*
> 
> I see a lot of people speculating and saying that Vega will be non-competitive and inferior to the Titan. Do we even know anything solid about Vega yet?


No, that's why they're speculating.

In part it's based on the bad perf/W and perf/mm² of the 480 (it's very bad in DX11; in DX12/Vulkan titles, where GCN gets a good boost, it improves, but still doesn't quite reach Pascal's level of perf/W).

But Vega will definitely be competitive in terms of price/perf (since AMD likes/needs to set a cheaper price to entice buyers, and Nvidia is the "premium", more expensive brand), that much you can count on.


----------



## Kpjoslee

Quote:


> Originally Posted by *xTesla1856*
> 
> I see a lot of people speculating and saying that Vega will be non-competitive and inferior to the Titan. Do we even know anything solid about Vega yet?


We have a rumor that the first Vega part will feature 4096 cores. Given the clockspeed of the 480, and that bigger chips tend to clock lower than smaller parts, we can estimate Vega being about 30-40% faster than the Fury X. That performance will definitely match or beat the 1080, but likely won't touch the Titan, which will likely feature 3584-3840 cores (vs. the 1080's 2560 cores) and outperform the 1080 by 30-40%.
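The core-count arithmetic behind that Titan estimate can be sketched quickly; the core counts are the rumoured figures from this thread, and the shared clock is a simplifying assumption (the bigger die would likely clock lower, pulling the result down toward the 30-40% quoted above):

```python
# Crude same-architecture scaling: relative performance ~ cores * clock.
# Core counts are rumours from this thread; the common clock is an assumption.

def relative_perf(cores, clock_mhz=1600):
    return cores * clock_mhz

gtx_1080 = relative_perf(2560)
titan_lo = relative_perf(3584)  # rumoured cut-down GP102
titan_hi = relative_perf(3840)  # rumoured fuller GP102

print(f"+{titan_lo / gtx_1080 - 1:.0%} to +{titan_hi / gtx_1080 - 1:.0%} over a GTX 1080")
```

At equal clocks this gives +40% to +50%, so the 30-40% figure above is effectively pricing in the lower clocks of the larger die.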


----------



## xTesla1856

My personal belief is that we will see a situation similar to the 980Ti and the Fury X, where the Fiji cards were slower at launch, but today, with driver advancements and API optimization, Fiji can be faster than GM200. I'm hoping for a very strong big Vega card, as it would improve competition in the market and maybe even force lower pricing from Nvidia.


----------



## pez

Quote:


> Originally Posted by *Kpjoslee*
> 
> It is just hard to imagine having 600mm^2 gaming GPU this year when we just started getting 16nm 1080 in May in limited quantities.


I thought this was because of GDDR5X? There are actually quite a lot of 1070s in stock or available to preorder compared to 1080s. I saw speculation about both, but wasn't sure of the truth of it all, as I never followed up.

For reference:

GTX 1080
http://www.nowinstock.net/computers/videocards/nvidia/gtx1080/

GTX 1070
http://www.nowinstock.net/computers/videocards/nvidia/gtx1070/


----------



## guttheslayer

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> That's because it ain't happening, no matter how over excited guttheslayer's rhetoric gets...


More like this will be happening until another truly reliable source proves otherwise, and not some prophet from OCN by the name of Majin.


----------



## guttheslayer

Quote:


> Originally Posted by *pez*
> 
> I thought this was because of GDDR5X? There's actually quite a lot of 1070s in stock or available to preorder compared to 1080s. I saw speculation of both, but wasn't sure the truth to it all as I never followed up.
> 
> For reference:
> 
> GTX 1080
> http://www.nowinstock.net/computers/videocards/nvidia/gtx1080/
> 
> GTX 1070
> http://www.nowinstock.net/computers/videocards/nvidia/gtx1070/


There were quite a few prophets who claimed the 1080 would never release in mid-2016 if it used G5X, or that G5X would never appear on GP104. Bam, Nvidia proved them wrong this time: despite G5X being in extremely limited production, as we all know, Nvidia somehow pulled it through.

I am quite confident the Titan will be out in September. Nvidia can pull it forward if they want, but for what urgent reason they would push the release ahead, I am still a bit puzzled.

One possible reason is that Nvidia has quite a number of products at hand, especially from GP100/102, to push out, and to avoid overlapping clashes they need to start FAST and EARLY:

GTX 1080 Ti
GTX 1180
GTX Titan, with 2 variants on different dates.

If each of the 4 high-end parts takes 3 months to release, it will take 4 quarters to release all of them (1 year). Of course we can argue the 1080 Ti is optional and highly dependent on AMD, but Nvidia always prepares for the worst case and reserves a quarter-long slot for it (if AMD fails, then that period will simply be empty).

Nvidia might plan to release all of these on different time frames, and they only have 9-12 months to do so before the 1100 series is due for its annual GeForce lineup release (Q3 2017).


----------



## ChevChelios

GTX 1180, aka Volta, is a 2018 product (probably May/June 2018, maaaybe Q1 2018), so it shouldn't overlap with Pascal.

Quote:


> Nvidia might plan to release all this but different time frame, and they only have 9-12 month to do so before the 1100 series is due for its annual Geforce lineup release (Q3 2017)


Pascal just came out.

How is the 1100 series already due in Q3 2017? That's ~2 years away, not 1+ year.

You're in way too much of a rush here.


----------



## guttheslayer

Quote:


> Originally Posted by *ChevChelios*
> 
> GTX 1180 aka Volta is a 2018 product (probably May/June 2018, maaaybe Q1 2018), so it shouldnt overlap with Pascal
> Pascal just came out
> 
> how is 1100 already due in Q3 2017 ? Its ~2 years, not 1+ year
> 
> you're way in a rush here


The GTX 780 came one year after the GTX 680 was released, on the same mature 28nm node, *and it was still Kepler*.

Who says 1180 is reserved for Volta? Why can't Volta be the GTX 1280 instead? Why would it be hard to produce an entire GTX 1100 series lineup that is just a rebrand of the 1000 series?

GTX 1180 - GP102, G5X, with 3200 cores (or more)
GTX 1170 - Refined GP104
GTX 1160 - Refined GTX 1070
GTX 1150 - Rebranded GTX 1060

Volta will come 2 years later as the GeForce 1200 series.


----------



## ChevChelios

Why do you need to rebrand anything?

It seems fine to me to stretch Pascal over 2 years (or 1.5-2 years) without any rebrands (like Maxwell was) and then hit with brand-new Volta in Q1/Q2 2018 to go up against Navi.

Not to mention Nvidia's roadmap has already confirmed Volta for 2018...

Your plan with a Titan in August, rebrands, and Volta as the 1200 series is bizarre IMO.


----------



## guttheslayer

Quote:


> Originally Posted by *ChevChelios*
> 
> why do you need to rebrand anything
> 
> seems fine to me stretch Pascal over 2 years (or 1.5-2 years) without any rebrands (like Maxwell was) and then hit with brand new Volta in Q1/Q2 2018 to go up against Navi
> 
> not to mention Nvidia roadmap already confirmed Volta for 2018 ...
> 
> your plan with Titan in August, rebrands and Volta for 1200 series is bizarre IMO


Maxwell didn't have a rebrand because Nvidia didn't have to cut as many variants from their big-die Maxwell. There were only 2 different GPUs from GM200: the 310-A1 (980 Ti) and the 400-A1 (Titan X). There was no reason for Nvidia to cut GM200 further on the extremely mature 28nm node, and with just 2 variants added, and only one Titan for Maxwell, there was no compelling reason for another generation of rebrands.

Big Kepler was different: there were 3 different GPUs on GK110 alone, as well as 2 breeds of Titan. With more variants you can add a new x80 GPU and shift the rest of the lineup down in the new series while still keeping your Ti/Titan options available.

A rebrand is an extremely effective way to keep the market alive after 1 year, especially when an extremely expensive product is shifted down the lineup and priced more reasonably (GTX 680 -> GTX 770). It was so effective, in fact, that it helped keep Kepler going strong for 30 months (Mar 2012 - Sept 2014).

Maxwell, on the other hand, had a more limited release and only lasted 21 months. Not to mention the last 12 months (after the 980 Ti was out) were pretty dead for both Nvidia and the market.

Nvidia would be stupid not to pull the rebrand trick that helps keep the cycle fresh, especially when it can easily buy them another year of revenue before Volta comes in. And with so many different variants of GP100/102 (HBM vs. G5X can easily double the count), a whole new Pascal lineup is simply imminent next year.


----------



## rcfc89

Quote:


> Originally Posted by *xTesla1856*
> 
> My personal belief is that we will see a similar situation as with the 980Ti and the Fury X, where at launch the Fiji cards were slower, but today with driver advancements and API optimization Fiji can be faster than GM200. I'm hoping for a very strong big Vega card, as it would improve competition on the market and maybe even force lower pricing from Nvidia.


Having to wait a year after purchase for the Fury X to beat its competitor (the 980Ti) in one benchmark, in one game that's two months old, while getting destroyed in all other major AAA titles. That makes me want to go out and buy a couple of AMD GPUs right now. Not.


----------



## guttheslayer

Quote:


> Originally Posted by *rcfc89*
> 
> Having to wait a year after purchase for Fury X to beat its the competitor (980Ti) in one benchmark in one game while getting destroyed in all other major AAA title's. That makes me wan't to go out and buy a couple AMD gpu's right now. Not


AMD's future-proofing doesn't work when they are so late to the game (Vega 9 months late), and you need to wait another 12 months for it to win in just one title. Lol.

Either way they are a flop right now, and I am not looking forward to their hyped-up Zen.


----------



## SuperZan

Quote:


> Originally Posted by *rcfc89*
> 
> Having to wait a year after purchase for Fury X to beat its the competitor (980Ti) in one benchmark in one game that's two month's old while getting destroyed in all other major AAA title's. That makes me wan't to go out and buy a couple AMD gpu's right now. Not


Hyperbole is fun.


----------



## renejr902

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Exactly. If Vega turns out to be the true flop the Nvidia fanboys seem to be praying for you can go ahead and mark it down that there will be no 1080Ti at all...
> 
> That's because it ain't happening, no matter how over excited guttheslayer's rhetoric gets...


I too believe this September rumor for the next Titan; we will know the truth very soon...


----------



## renejr902

Quote:


> Originally Posted by *ChevChelios*
> 
> GTX 1180 aka Volta is a 2018 product (probably May/June 2018, maaaybe Q1 2018), so it shouldnt overlap with Pascal
> Pascal just came out
> 
> how is 1100 already due in Q3 2017 ? Its ~2 years, not 1+ year
> 
> you're way in a rush here


Me too, I don't think it's possible so soon.

But we could all be wrong here; Nvidia has made surprising moves lately.


----------



## guttheslayer

Quote:


> Originally Posted by *renejr902*
> 
> me too i dont think its possible so soon.
> 
> but we can be all wrong here, nvidia made suprising moves lately


There are too many Pascal variants for Pascal to end with just the 1000 series. I am willing to bet everything that the 1100 series will be updated Pascal, followed by Volta as the 1200 series in mid-2018 on 10nm (the Apple iPhone is already on 10nm by the end of this year).


----------



## ChevChelios

gut is super optimistic

I think neither Volta nor Navi will be 10nm


----------



## ChronoBodi

Quote:


> Originally Posted by *ChevChelios*
> 
> gut is super optimistic
> 
> I think both Volta and Navi wont be 10nm


They just started 16nm/14nm for GPUs; there's no way we're moving to the next node that fast. That joyride ended in 2011 at the start of the 28nm node.

This is essentially the Kepler of 16nm, sort of. There isn't any 600mm² die 16nm GPU yet; even Big Pascal for gamers is not going to be 600mm², only around 478mm², due to the immaturity of the new node.

Only when the 16nm node is mature and old will we see that kind of GPU, AND maybe 10nm after that.


----------



## guttheslayer

Quote:


> Originally Posted by *ChevChelios*
> 
> gut is super optimistic
> 
> I think both Volta and Navi wont be 10nm


That will depend on whether 10nm is suitable for high-power FinFET. If it is, it can definitely come in 2018.

But if it fails like 20nm did, then it will be 2020.

The problem is that TSMC is striving for volume production on 5nm by 2020, and from what we know that is probably the limit of physics for silicon, so by then post-silicon materials will have to materialise, or all newer node releases will slow to a crawl.


----------



## QSS-5

Quote:


> Originally Posted by *guttheslayer*
> 
> There is too many pascal variant for Pascal to end with just 1000 series, I am willingly to bet everything 1100 series will be the updated pascal, follow by volta on 1200 series in mid 2018 with 10nm (Apple Iphone is already on 10nm this year end.)


I bet you that the next series is going to be called 20xx (2080/2070), not 12xx, and will be released in 1H, and that the flagship will be a variant of the 610mm² GP100 with no double-precision cores, meaning we will see a 5120-core GPU with roughly 20 Tflops of performance, twice the performance of the 1080. I also think the 1080 will come back as a refresh with higher clocks and power limits as the 2070, and the 2080 will be the equivalent chip with better clocks than the 1080Ti. The 2080Ti will be a full Pascal chip with HBM2. So if a 1080Ti arrives this year, expect a 2000 series between March and May next year.
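For reference, the 20 Tflops figure follows from the standard peak-FP32 formula (2 FLOPs per core per clock via fused multiply-add); the 5120-core count and the ~2 GHz clock below are this post's speculation, not confirmed specs:

```python
# Peak FP32 throughput: 2 FLOPs (one FMA) per core per cycle.
def peak_tflops(cores, clock_ghz):
    return 2 * cores * clock_ghz / 1000.0

gtx_1080 = peak_tflops(2560, 1.733)    # GTX 1080 at its rated boost clock
speculated = peak_tflops(5120, 2.0)    # rumoured 5120 cores at an assumed ~2 GHz

print(f"{speculated:.1f} TFLOPS, {speculated / gtx_1080:.1f}x a GTX 1080")
```

That lands right around the "roughly 20 Tflops, twice a 1080" claim above (~20.5 TFLOPS, ~2.3x the 1080's paper rate).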


----------



## guttheslayer

Quote:


> Originally Posted by *QSS-5*
> 
> I bet you that the next series are going to be Called 20xx (2080/2070) and not 12xx and will be released 1H and the flagship will be a variant of the GP100 610mm2 with no dual precision cores meaning we will see a 5120core GPU with roughly 20 Tflops of performance, twice the performance of 1080. I also think the 1080 will come as a refresh with higher clocks and power limits as 2070, 2080 will be equivalent chip with better clocks than the 1080ti. The 2080ti will be a full Pascal chip with HBM2. So if a 1080ti arrives this year expect a 2000 series between March and May next year.


What follows the 10th series is the 11th, unless Nvidia has something wrong with their maths.

I would have preferred it, though, if they named it the X180 GTX.


----------



## Ghoxt

The question to ask is whether Nvidia made anywhere near their $$ projections this past quarter for the 1080 & 1070, released close together. If not, releasing the Titan line might make sense sooner rather than later, as many are sitting around with 980Ti, Titan X and Fury (X) class cards, waiting to see something powerful drop *well beyond* what we already had a year and a half ago.

I don't know if the Titan P will drop with a sonic boom or a slight 15%-20% whimper. Nvidia should know there is rising sentiment that if they do not deliver, with the already strong evidence of the fracture in their armor around DX12 async, and if AMD could, I don't know... DO something, and happen to deliver first, many 980Ti/Titan X owners like myself might happily jump ship.

OT: and contrary to what I just posted...

What I've noticed talking to friends, after back-to-back-to-back generations of buying "mostly" Nvidia, is that the company is focusing less on the customers that supported it for years. Growth is good for the company, but there's a strong perception that Nvidia treats longtime customers like a guaranteed commodity, taking them for granted, as if we are "owned" lock, stock and barrel. The Titan Z was a clear example of live human testing of what they could get away with...

I ask myself: is there a single game that, if I had purchased an AMD Fury or 390-class card last generation, I would not have been able to play and be satisfied with?

P.S.: I think, against all forum logic, I'm doing something I used to do more often: ignoring all the postings, postulations, proofs, etc., and trying not to nitpick about the highest FPS...

Again, I seriously question whether Nvidia has us marked as sheep/fools... We may admittedly agree with them, too.


----------



## aDyerSituation

When are we going to get a high-end GPU like the 7970 again?

So much life and growth: three generations old and still kicking, no problem. I want REAL upgrades; not 20%, not 30%, maybe not even 50%. If I have to pay a grand to get a noticeable performance upgrade, what's the point?


----------



## kaosstar

Quote:


> Originally Posted by *aDyerSituation*
> 
> When are we going to get a high end GPU like the 7970 again
> 
> So much life and growth. Three generations old and still kicking it no problem. I want REAL upgrades, not 20%, not 30%, maybe not even 50% if I have to pay a grand to get a noticeable performance upgrade, what's the point.


If DX12 pans out, I think an R9 290X could end up with comparable longevity. But yeah, it's amazing that a 7970 is still a quite viable GPU, likely even into the next generation.


----------



## renejr902

Quote:


> Originally Posted by *guttheslayer*
> 
> There is too many pascal variant for Pascal to end with just 1000 series, I am willingly to bet everything 1100 series will be the updated pascal, follow by volta on 1200 series in mid 2018 with 10nm (Apple Iphone is already on 10nm this year end.)


It's very optimistic, but still possible, I think.


----------



## guttheslayer

Quote:


> Originally Posted by *renejr902*
> 
> Its very optimistic, but still possible i think


10nm is very optimistic, but a GTX 1100 series on Pascal is not.

A 1200 series with Volta is only available from mid-2018 onward.


----------



## renejr902

Quote:


> Originally Posted by *guttheslayer*
> 
> 10nm is very optimistic, but GTX 1100 series on pascal is not.
> 
> 1200 series with volta is only available at mid 2018 onward.


Now I agree, it seems very logical. Without a 1080Ti, a Pascal GTX 1100 series instead sounds right if AMD poses no serious menace to the GTX 1080. If the 1080Ti happens, I'm not sure we will get an 1100 series on Pascal, but it's not impossible.


----------



## renejr902

I'm a little disappointed after overclocking my GTX 1070. Fully stable on core and memory, the max stable is +110 MHz on the 1557 MHz base clock = 1667 MHz, boosting to 2062 most of the time, memory at 9200 MHz (+550 MHz), max temp 66°C (my case has great airflow). Even with that, I get 2-4 fps fewer than my old, fully overclocked 980Ti in The Witcher 3 at ultra in 4K, no AA, no HairWorks.

Same location: 1070 OC = 40 fps, 980Ti OC = 42 fps. Another location: 1070 = 43-44, 980Ti = 46-47.

In the same battle: 1070 = 30 fps min, 980Ti = 34-35 fps min.

I can't beat my old maxed-out 980Ti with my maxed-out 1070. Some people got more OC headroom from their 1070s; mine can't go higher while staying 100% stable with no artifacts. I hoped to beat my GTX 980Ti with my 1070; it seems it won't happen, at least in The Witcher 3.
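For context, that headroom is fairly modest in percentage terms; a quick sanity check using the clocks quoted above (the memory base is inferred from the post's own numbers as 9200 - 550 = 8650 MHz):

```python
# Overclock deltas from the post above (MHz); purely illustrative arithmetic.
base_core, oc_core = 1557, 1667  # +110 MHz on the base clock
base_mem, oc_mem = 8650, 9200    # +550 MHz effective memory clock (inferred base)

core_gain = oc_core / base_core - 1
mem_gain = oc_mem / base_mem - 1

print(f"core +{core_gain:.1%}, memory +{mem_gain:.1%}")
```

A ~7% core and ~6% memory bump is unlikely to close a gap against a heavily overclocked GM200, which is consistent with the fps numbers reported.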


----------



## renejr902

I have a important question, thanks for answer. Im about to change my i5 4690 for a i7 4790k. If i only play in 4K resolution at ultra for each game with a future Titan P , IS IT WORTH IT to change my current cpu for a i7 4790k? Does it make a difference in gaming at 4k at ultra for FPS? Thanks for answer guys. I need to know, i got almost all my money for a titan P , so should i economize for a i7 4790k ? Thanks so much for your opinion im still not sure if it worth it for gaming at ultra in 4K !!







I don't want to be CPU-bottlenecked.

(I have 32 GB of Kingston RAM at 1600 MHz; on the memory side I think I should be OK even though it's not DDR4, right? I game from my WD Black, not an SSD, but I don't think that impacts FPS either.)


----------



## Zero4549

Quote:


> Originally Posted by *renejr902*
> 
> I'm a little disappointed after overclocking my GTX 1070. It's very stable on core clock and memory: the max stable is +110 MHz over the 1557 MHz base (1667 MHz, boosting to 2062 MHz most of the time), memory at 9200 MHz (+550 MHz), max temp 66°C (my case has great airflow). Even with that, I got 2-4 fps fewer than my old, fully overclocked 980 Ti in The Witcher 3 at Ultra in 4K, no AA, no HairWorks.
> Same place: GTX 1070 OC = 40 fps, 980 Ti OC = 42 fps. Another place: 1070 = 43-44, 980 Ti = 46-47.
> 
> In the same battle: 1070 = min 30 fps, 980 Ti = min 34-35 fps.
> 
> I can't beat my old 980 Ti at max OC with my 1070 at max OC. Some people get higher overclocks from their 1070s; mine can't go further while staying 100% stable with no artifacts. I hoped to beat my GTX 980 Ti with my 1070, but it seems that won't happen, at least in The Witcher 3.


Why on earth did you replace a 980 Ti with a 1070? Do you just like literally throwing money away for nothing?

You must have really wanted to reduce your power bill and/or room temperature, because that is the only benefit you could possibly get from that exchange.


----------



## outofmyheadyo

Just open a window, there is no need to buy a 1070.


----------



## Swolern

Quote:


> Originally Posted by *aDyerSituation*
> 
> When are we going to get a high end GPU like the *7970 again
> So much life and growth. Three generations old and still kicking it no problem.* I want REAL upgrades, not 20%, not 30%, maybe not even 50% if I have to pay a grand to get a noticeable performance upgrade, what's the point.


Still kicking the pants off 720p!


----------



## outofmyheadyo

Going from a 7970 to a 1070 is very noticeable, and it's not even $500.


----------



## guttheslayer

Quote:


> Originally Posted by *Zero4549*
> 
> Why on earth did you replace a 980ti with a 1070? Do you just like literally throwing away money for nothing?
> 
> You must have really wanted to reduce your power bill and/or room temp, because that is the only benefit you could possibly have with that exchange.


Maybe a 5% improvement for DX12.


----------



## Cyro999

Quote:


> Originally Posted by *Zero4549*
> 
> Why on earth did you replace a 980ti with a 1070? Do you just like literally throwing away money for nothing?
> 
> You must have really wanted to reduce your power bill and/or room temp, because that is the only benefit you could possibly have with that exchange.


They're indeed roughly equivalent in FPS (the 980 Ti has a bit more GPU power, perhaps; the 1070 has more VRAM).

There are other technological improvements. Maybe he wants SMP; maybe NVENC being twice as fast is relevant to him.


----------



## pez

Quote:


> Originally Posted by *renejr902*
> 
> I have an important question, thanks in advance. I'm about to swap my i5 4690 for an i7 4790K. If I only play at 4K resolution at Ultra with a future Titan P, is it worth changing my current CPU for the i7 4790K? Does it make a difference to FPS when gaming at 4K Ultra? I've saved almost all my money for a Titan P, so should I also save for an i7 4790K? I'm still not sure if it's worth it for gaming at Ultra in 4K!
> 
> I don't want to be CPU-bottlenecked.
> 
> (I have 32 GB of Kingston RAM at 1600 MHz; on the memory side I think I should be OK even though it's not DDR4, right? I game from my WD Black, not an SSD, but I don't think that impacts FPS either.)


At 4K, in games that aren't CPU-bottlenecked to hell and back, I see 10-30% CPU usage. The i5 should be fine for a good portion of games. Unfortunately, until DX12 or Vulkan becomes the norm, we're still going to see the occasional CPU-bottlenecked title.


----------



## renejr902

Quote:


> Originally Posted by *Zero4549*
> 
> Why on earth did you replace a 980ti with a 1070? Do you just like literally throwing away money for nothing?
> 
> You must have really wanted to reduce your power bill and/or room temp, because that it the only benefit you could possibly have with that exchange.


I sold my 980 Ti three months ago while waiting for the Titan P. I got tired of waiting and playing on my Intel integrated GPU, so I bought a 1070 in the meantime.


----------



## Ghoxt

Quote:


> Originally Posted by *renejr902*
> 
> I sold my 980 Ti three months ago while waiting for the Titan P. I got tired of waiting and playing on my Intel integrated GPU, so I bought a 1070 in the meantime.


Makes perfect sense. I know the guys are harsh on you.







I initially thought the same. I'm sitting on Titan X SLI on water with custom Bios @ 1500. I'm not sure when I'll upgrade. Waiting for something...


----------



## renejr902

Quote:


> Originally Posted by *pez*
> 
> At 4K, in games that aren't CPU-bottlenecked to hell and back, I see 10-30% CPU usage. The i5 should be fine for a good portion of games. Unfortunately, until DX12 or Vulkan becomes the norm, we're still going to see the occasional CPU-bottlenecked title.


Thanks so much for the answer, it's really appreciated. So do you think it's worth the exchange, the trouble, and $300+ to go from the 4690 to the 4790K? After reading your post I think it's not worth it, but I'd appreciate your personal opinion. Thanks.


----------



## renejr902

Quote:


> Originally Posted by *Ghoxt*
> 
> Makes perfect sense. I know the guys are harsh on you.
> 
> 
> 
> 
> 
> 
> 
> I initially thought the same. I'm sitting on Titan X SLI on water with custom Bios @ 1500. I'm not sure when I'll upgrade. Waiting for something...


Wait a little more to see a better performance gain.







Titan P SLI should be worth it.


----------



## pez

Quote:


> Originally Posted by *renejr902*
> 
> Thanks so much for the answer, it's really appreciated. So do you think it's worth the exchange, the trouble, and $300+ to go from the 4690 to the 4790K? After reading your post I think it's not worth it, but I'd appreciate your personal opinion. Thanks.


Personally I think no. I'm only on the CPU I am because of the great deal I got in the combo. It's rare I put my i7 to legitimate use outside of games.


----------



## renejr902

Quote:


> Originally Posted by *pez*
> 
> Personally I think no. I'm only on the CPU I am because of the great deal I got in the combo. It's rare I put my i7 to legitimate use outside of games.


Thanks so much for the answer. Same thing here: 98% of the time I'm gaming on my rig; otherwise I'm mostly surfing. I'll wait for now. I wanted to make the change for the +500 MHz only, and for gaming purposes only; I don't need an i7 otherwise.







I appreciate your help. Thanks again.


----------



## pez

Quote:


> Originally Posted by *renejr902*
> 
> Thanks so much for the answer. Same thing here: 98% of the time I'm gaming on my rig; otherwise I'm mostly surfing. I'll wait for now. I wanted to make the change for the +500 MHz only, and for gaming purposes only; I don't need an i7 otherwise.
> 
> 
> 
> 
> 
> 
> 
> I appreciate your help. Thanks again.


No worries. Glad to be of help.


----------



## ChronoBodi

Quote:


> Originally Posted by *renejr902*
> 
> I'm a little disappointed after overclocking my GTX 1070. It's very stable on core clock and memory: the max stable is +110 MHz over the 1557 MHz base (1667 MHz, boosting to 2062 MHz most of the time), memory at 9200 MHz (+550 MHz), max temp 66°C (my case has great airflow). Even with that, I got 2-4 fps fewer than my old, fully overclocked 980 Ti in The Witcher 3 at Ultra in 4K, no AA, no HairWorks.
> Same place: GTX 1070 OC = 40 fps, 980 Ti OC = 42 fps. Another place: 1070 = 43-44, 980 Ti = 46-47.
> 
> In the same battle: 1070 = min 30 fps, 980 Ti = min 34-35 fps.
> 
> I can't beat my old 980 Ti at max OC with my 1070 at max OC. Some people get higher overclocks from their 1070s; mine can't go further while staying 100% stable with no artifacts. I hoped to beat my GTX 980 Ti with my 1070, but it seems that won't happen, at least in The Witcher 3.


1070 not worth it, according to this

https://www.youtube.com/watch?v=Kgfe4QdhBbE

Seriously, even a 1080 is not worth it compared to a 1450 MHz 980 Ti; the 15%-20% improvement is not worth $600-$700 to me. Wait for the 1080 Ti at this point.


----------



## QSS-5

Quote:


> Originally Posted by *guttheslayer*
> 
> What follows the 10 series is the 11 series, unless NVIDIA has something wrong with their maths.
> 
> I would have preferred it, though, if they had named it the X180 GTX.


It is not about math, it is about marketing and communication: which name will sell more cards. GTX 2080 sounds bigger, better, faster, and newer than GTX 1280. Secondly, it won't confuse consumers into thinking the next card is in the same series; 1280 can be confusing since it also starts with a 1. You can quote me when the 2080 releases.


----------



## xxroxx

What's really disappointing is that 50% faster doesn't mean 50% more FPS. Oh boy, how happy I'd be if we got a 50% performance bump every year!
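The point above can be made concrete with a toy Amdahl-style model: if only part of each frame's time scales with GPU horsepower, a "50% faster" GPU yields less than 50% more FPS. The 80/20 split below is purely illustrative, not a measured figure:

```python
# Toy frame-time model: only the GPU-bound share of each frame speeds up.
def fps_after_speedup(base_fps, gpu_fraction, gpu_speedup):
    frame = 1.0 / base_fps
    new_frame = frame * (1.0 - gpu_fraction) + frame * gpu_fraction / gpu_speedup
    return 1.0 / new_frame

base = 40.0  # fps, roughly the 5K figure in the opening post
print(round(fps_after_speedup(base, 1.0, 1.5), 1))  # fully GPU-bound: the ideal +50%
print(round(fps_after_speedup(base, 0.8, 1.5), 1))  # 20% non-GPU time: only ~+36%
```

So even a genuine 50% compute uplift lands well short of 50% more FPS whenever any fixed per-frame cost (CPU, driver, sync) remains.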


----------



## renejr902

Quote:


> Originally Posted by *ChronoBodi*
> 
> 1070 not worth it, according to this
> 
> https://www.youtube.com/watch?v=Kgfe4QdhBbE
> 
> Seriously, even a 1080 is not worth it compared to a 1450 MHz 980 Ti, the 15%-20% improvement is not worth $600-$700 for me. Wait for 1080 Ti at this point.


It's still better for now than my integrated Intel GPU.







I sold my 980 Ti too soon; I thought the Titan P was about to be released in April-May, but instead it was the 1070 and 1080.


----------



## guttheslayer

Quote:


> Originally Posted by *renejr902*
> 
> It's still better for now than my integrated Intel GPU.
> 
> 
> 
> 
> 
> 
> 
> I sold my 980 Ti too soon; I thought the Titan P was about to be released in April-May, but instead it was the 1070 and 1080.


You are like what Luke from LinusTech says:

You hate your money.


----------



## SuprUsrStan

Quote:


> Originally Posted by *guttheslayer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *renejr902*
> 
> its still better for now than my integrated intel gpu
> 
> 
> 
> 
> 
> 
> 
> i sold my 980ti too soon, i thought the titan P was about to be release in april-may, instead it was 1070-1080.
> 
> 
> 
> You are like what luke from linustech say:
> 
> You hate your money.

Nah, money is meant to be spent.


----------



## renejr902

Quote:


> Originally Posted by *Syan48306*
> 
> Nah, money is meant to be spent.


I was about to post the same thing!


----------



## pez

Quote:


> Originally Posted by *guttheslayer*
> 
> You are like what Luke from LinusTech says:
> 
> You hate your money.


Except that was in a totally different context...in a video about SLI.


----------



## Klocek001

Quote:


> Originally Posted by *ChronoBodi*
> 
> 1070 not worth it, according to this
> 
> https://www.youtube.com/watch?v=Kgfe4QdhBbE
> 
> Seriously, even a 1080 is not worth it compared to a 1450 MHz 980 Ti, the 15%-20% improvement is not worth *$600-$700* for me. Wait for 1080 Ti at this point.


Well, I guess if you donate the 980 Ti to charity, or got it for free as opposed to paying the full $700 for a 1080, then yes, it's not worth $700.


----------



## guttheslayer

Quote:


> Originally Posted by *pez*
> 
> Except that was in a totally different context...in a video about SLI.


Wow, looks like someone watched that video.









But yeah, you should have kept your 980 Ti all the way until the Titan is out; even a Titan at $1,000-$1,400 shouldn't affect the value of the 980 Ti.

I am still rocking a Palit 670 reference card. I wonder what the performance jump at 1080p would be if I got a Pascal Titan.


----------



## SuprUsrStan

Quote:


> Originally Posted by *guttheslayer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pez*
> 
> Except that was in a totally different context...in a video about SLI.
> 
> 
> 
> Wow look like someone watch that video.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But yeah should have kept ur 980 Ti all the way till Titan is out, and even titan, shouldnt affect the value of 980 ti at $1000-$1400.
> 
> I am still rocking a Palit 670 reference card. I wonder the performance jump on my 1080P should i get a titan pascal.

Why are you even looking at a Titan for 1080p?


----------



## guttheslayer

Quote:


> Originally Posted by *Syan48306*
> 
> Why are you even looking at a titan on 1080p?


Good question. The 1080p monitor is a stand-in until the AUO 30" ultra-wide 124 PPI VA panel is out, or, if I can wait longer, the ASUS 4K 144Hz panel.

And my current 4-years-plus, dying PC is unlikely to survive until then.


----------



## TheBlindDeafMute

I bought a GTX 1070 from MSI. Awesome card. I've been playing all sorts of VR on the Vive with zero issues.

Can't wait for the new Titans. Those will be for the work computer.








tax write off


----------



## aDyerSituation

Quote:


> Originally Posted by *TheBlindDeafMute*
> 
> I bought a gtx 1070 from msi. Awesome card. Been playing all sorts of vr on the vive with zero issues.
> 
> Can't wait for the new titans. Those will be for the work computer.
> 
> 
> 
> 
> 
> 
> 
> 
> tax write off


Hmm, I wonder if I can get away with that from my simple help desk job


----------



## ChevChelios

Titan for work computer


----------



## magnek

aka tax fraud, if said "work" consists of "gaming" 90% of the time


----------



## Cyro999

Quote:


> Originally Posted by *pez*
> 
> At 4K, in games that aren't CPU-bottlenecked to hell and back, I see 10-30% CPU usage. The i5 should be fine for a good portion of games. Unfortunately, until DX12 or Vulkan becomes the norm, we're still going to see the occasional CPU-bottlenecked title.


We're still going to see a lot of them with any API. The API doesn't even dictate the CPU usage of the game - it's only one part of the CPU load. Even if it did, why would we leave midrange CPUs at 10% load rather than making a 10x bigger and more complex game?

We don't leave 1080s at 10% load just because the available performance level is 10x higher than some previous GPUs'. We don't say "yay, we're not GPU-limited any more" - the goal is always to load the hardware and translate that performance into an improved experience.

As another note, CPU usage at 4K isn't lower than at lower resolutions like 1080p. The main difference is that people tend to accept worse performance at 4K - if you're playing at 40 fps instead of 100 fps, the load on the CPU drops dramatically. If you keep the FPS the same, the CPU load stays the same or increases.
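That last point can be illustrated with a tiny model in which the CPU does a fixed amount of work per frame, so CPU load tracks frame rate rather than resolution. The 4 ms per-frame cost is an assumed, illustrative number:

```python
# CPU cost is per frame, not per pixel, in this simplified model.
CPU_MS_PER_FRAME = 4.0  # assumed simulation + draw-call cost per frame

def cpu_load(fps):
    """Fraction of one core kept busy at a given frame rate."""
    return min(1.0, fps * CPU_MS_PER_FRAME / 1000.0)

print(f"{cpu_load(40):.0%} busy at 40 fps (a typical 4K result)")
print(f"{cpu_load(100):.0%} busy at 100 fps (the same game at 1080p)")
```

Same game, same CPU: in this sketch the resolution only matters through the frame rate it allows.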


----------



## renejr902

Quote:


> Originally Posted by *magnek*
> 
> aka tax fraud if said "work" consists of "gaming" 90% the time


LOL, maybe playing 99% of the time.


----------



## renejr902

I can get a $100 deal on a friend's Acer 144Hz 1080p monitor. I still prefer playing on my 55" Samsung JS8500, but 144Hz in some games could be interesting. I haven't bought it yet. I did some testing a few minutes ago, and the minimum fps I got in Dirt Rally at 1080p Ultra was 115, with my GTX 1070, i5 4690, and 32 GB RAM.

Why can't I get a stable 144 fps? Is it a CPU or GPU bottleneck? Would I be able to hold 144 fps with a future Titan P? If not, could a 4790K do it, or would I need both? I have no problem running the game at 4K Ultra at a stable 60 fps most of the time. I want a stable 144 fps so I can use the vsync option; I don't want any tearing. I think vsync adds some input lag, but I don't notice it in Dirt Rally. Thanks for the help, guys!


----------



## pez

Quote:


> Originally Posted by *guttheslayer*
> 
> Wow, looks like someone watched that video.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But yeah, you should have kept your 980 Ti all the way until the Titan is out; even a Titan at $1,000-$1,400 shouldn't affect the value of the 980 Ti.
> 
> I am still rocking a Palit 670 reference card. I wonder what the performance jump at 1080p would be if I got a Pascal Titan.


Why of course.
Quote:


> Originally Posted by *guttheslayer*
> 
> Good question. The 1080p monitor is a stand-in until the AUO 30" ultra-wide 124 PPI VA panel is out, or, if I can wait longer, the ASUS 4K 144Hz panel.
> 
> And my current 4-years-plus, dying PC is unlikely to survive until then.


Not looking forward to the price tag that will come with that panel. However, it's definitely hard enough to push 60 frames at 4K still, even with 1080 SLI.
Quote:


> Originally Posted by *Cyro999*
> 
> We're still going to see a lot of them with any API. The API doesn't even dictate the CPU usage of the game - it's only one part of the CPU load. Even if it did, why would we leave midrange CPU's at 10% load rather than making a 10x bigger and more complex game?
> 
> We don't leave 1080's at 10% load just because the available performance level is 10x higher than some previous GPU's. We don't say "yay, we're not GPU-limited any more" - the goal is always to load the hardware and translate that performance into an improved experience.
> 
> As another note, CPU usage on 4k res isn't lower than on lower resolutions like 1080p. The main difference is that people tend to expect worse performance at 4k - if you're playing at 40fps instead of 100fps, the load on the CPU will dramatically drop. If you keep the FPS the same, the CPU load will stay the same or increase.


Well, of course. There's never going to be a year where we don't see a title like that, but it is decreasing a bit each year. And I don't remember anyone talking about CPU usage percentage being lower, but it does still remain that 2K and 4K are resolutions that rely a lot less on the CPU than something like 1080p. Hell, you can look over the ancient review in my sig between two older AMD platforms for some truly CPU-unfriendly resolutions.


----------



## Jupitel

Maybe I'm asking for too much, but how the hell is it possible that this is the first card that's able to properly run 1440p? I see everyone talking about 4K while the 1080 can barely hold over 60 fps at 1440p. What is up with these crappy cards? Will someone, NVIDIA or AMD, step their game up already? I'm supposed to pay over 1200 USD just to get 144Hz at 1440p or 60Hz at 4K? I'm very disappointed.


----------



## guttheslayer

If you really want to max out 144Hz at 1440p or 60Hz at 4K, you definitely need a dual-GPU setup.

Basically no single GPU, not even a Pascal Titan, can fully saturate DP 1.2 bandwidth. And here we have DP 1.4 coming soon.

The GPU is in a losing race against display and gaming-graphics advances combined: games are getting harder to run, and yet resolution and refresh rate are almost doubling each generation.
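The DP 1.2 point checks out on the back of an envelope. DisplayPort 1.2 carries 17.28 Gbit/s of payload (21.6 Gbit/s raw across four HBR2 lanes, minus 8b/10b coding overhead); the estimates below ignore blanking intervals, so real requirements run somewhat higher:

```python
# Uncompressed 24-bit video bandwidth vs. the DP 1.2 payload limit.
DP12_GBIT_S = 17.28  # 4 lanes x 5.4 Gbit/s, after 8b/10b coding

def gbit_s(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

modes = [("1440p @ 144 Hz", 2560, 1440, 144),
         ("4K    @  60 Hz", 3840, 2160, 60),
         ("4K    @ 144 Hz", 3840, 2160, 144)]

for name, w, h, hz in modes:
    need = gbit_s(w, h, hz)
    verdict = "fits" if need <= DP12_GBIT_S else "exceeds"
    print(f"{name}: {need:5.1f} Gbit/s ({verdict} DP 1.2)")
```

So 1440p/144 and 4K/60 squeeze through DP 1.2 (with blanking they sit close to the ceiling), while 4K/144 needs a newer, faster link - which is why the 4K 144Hz panels discussed in this thread wait on later DisplayPort revisions.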


----------



## sherlock

Quote:


> Originally Posted by *Jupitel*
> 
> Maybe I'm asking for too much, but how the hell is it possible that this is the first card that's able to properly run 1440p? I see everyone talk about 4k, the 1080 can barely run over 60 fps on a 1440p, what is up with these crappy cards? Will someone, nvidia/amd step their game up already? I'm supposed to pay over 1200 USD just to get 144hz on 1440p or 60hz on 4k? I'm very disappointed.


Because people insist on pointless AA settings on hi-DPI screens; running high AA settings on a 130+ DPI screen is a complete waste of resources with no visual benefit. My stock 1080 FTW (1974/5000) gets 65-75 FPS in Armored Warfare at 4K with AA off (with FXAA I get around 60 FPS, but no visual benefit), because I am on a 140 DPI 32" 4K screen.

If you want single-card 4K or 1440p/144 Ultra, you can either get a 1080 and not run pointless AA, or wait for a Titan P next year and pay $400 more to run your crazy AA settings.
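The "140 DPI" figure is easy to verify from the panel geometry; the 27-inch comparison line is an added contrast, not from the post:

```python
# Pixel density from resolution and diagonal size.
import math

def dpi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(dpi(3840, 2160, 32)))  # 32" 4K: ~138 DPI, matching the post
print(round(dpi(2560, 1440, 27)))  # 27" 1440p: ~109 DPI, below the 120-130 DPI threshold
```

At ~109 DPI a 27" 1440p screen sits under the threshold the posters cite, which is where AA arguably still earns its cost.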


----------



## guttheslayer

Quote:


> Originally Posted by *sherlock*
> 
> Because people insist on pointless AA settings on hiDPI screens, running high AA settings on a 130+ DPI screen is a complete waste of resources with no visual benefits. My stock 1080FTW(1974/5000) goes 65-75 FPS on Armored Warfare 4K and AA is off because I am on a 140 DPI 32" 4K screen.
> 
> If you want 1 card 4K or 1440p/144, you can either get a 1080 and not run pointless AA, or wait for a Titan P next year and $400 more to run your crazy AA settings.


AA is a waste of resources at anything above 120 DPI.

No amount of graphics power can satisfy these people.


----------



## Jupitel

I think you are both right: we are increasing resolutions each year, and most games use NVIDIA HairWorks, AA, MSAA, and whatever else you can think of that barely improves the graphics but comes at a huge cost to the GPU.

But still, all I want to do is run 1440p at 144Hz, and I'm going to be forced to wait a year or go with a... yes... SLI... blaaaaaah. Twelve years in IT, and I'd rather shoot myself in the nuts than pay twice as much for 120-140% of the performance. Honestly, I don't give a damn about 4K; it just isn't worth it. It's less expensive to run three 1080p 144Hz monitors at full speed than to run one 4K monitor (and yes, I know that three 1080p screens are still fewer pixels than 4K). IMO, I should add.

edited for typos.


----------



## pez

There's a big '*' to that statement. Not everyone is on a 144Hz panel yet, so a 1080 is actually perfect for 1440p if you're OK not being at 144Hz. However, there are quite a lot of games that will give you plenty of frames at that res to put a G-Sync monitor to the test.


----------



## Jupitel

As I said, this is just my opinion; I'm sure there are people out there willing to play at 12 fps as long as they play on a 4K screen. Personally, I think when it comes to gaming (no, I'm not a streamer/comp/prof), fps trumps resolution every time. I'd much prefer NVIDIA/AMD to focus on mastering 2K before moving on to 4K/5K like I'm seeing.

And yes, if you play at 60Hz 1440p a 1080 is fine, but so is the Titan X, and even a GTX 980 Ti can do well with a few exceptions. So what's the point of a GTX 1080? It can run 1440p 60Hz just like the previous cards, and it can't run 4K 60Hz, just like the previous cards, and even a 980 is overkill for 1080p. So you tell me, what's the point of this card?

To me, again my personal opinion, the market right now doesn't make much sense. Serious gamers, with a few exceptions, will want 144Hz; that should be the objective before moving on to bigger resolutions.


----------



## ref

I'll be getting 2 Pascal Titans for 2K 144Hz; not so much because I want fancy settings on (I'm fine lowering settings if they make little to no difference), but so that when a 4K 144Hz panel is out at a price that's 'reasonable' to me, I should be good.

Honestly, from 90Hz up the difference is very small, I find; but 60Hz to 90+ is like night and day. I could never go back to a 60Hz panel.


----------



## Klocek001

Quote:


> Originally Posted by *ref*
> 
> I'll be getting 2 Pascal Titans for 2k 144hz, not so much because I want fancy settings on, I'm fine lowering settings if they have little to no difference, but so when a 4k 144hz panel is out and a 'reasonable' price to me, I should be good.
> 
> Honestly from 90hz+ the difference is very small I find, now from 60hz to 90+ is like night and day, could never go back to a 60hz panel.


60 to 90 is very noticeable for me too, and I prefer 90 fps with a mix of settings to 60 fps all maxed out. As for framerates over 90: 120 fps is great in fast-paced games where you move around quickly, especially combined with ULMB mode. 120 to 144 is hardly a perceptible difference for me. Overall, fps adds a lot more to immersion, from my point of view, than 4K can.
I'd like to try a panel with a 200Hz ULMB mode some day; I bet that'd be just ridiculous.


----------



## ToTheSun!

Quote:


> Originally Posted by *Klocek001*
> 
> I'd like to try a panel with 200hz ulmb mode some day, I bet that'd be just ridiculous.


I think the only way we'll be able to strobe at 200 Hz without artifacts is with OLED tech.


----------



## ccRicers

Hopefully one Pascal Titan will be enough for 4K at a consistent 60 fps on high settings. It should be better than two 1070s in SLI and, with that said, probably more expensive too. I don't play competitively; it's more about just enjoying games with detailed eye candy. But I still have to look at some 120-144Hz monitors in person and see if I'll just go with one of them instead (at the lower 1440p resolution).


----------



## renejr902

Finally, I didn't buy the Acer 144Hz; its image quality isn't good enough compared to my Samsung JS8500. About 4K: I'm willing to play at 4K, but I want at least 60 fps most of the time; 30-40 fps isn't good enough and has too much blur. Like someone said, 12 fps at 4K, lol, nobody could play at that. More than 60 fps is very good but not essential. 30-40 fps is playable for some people, but it's not great at all. I played part of Rise of the Tomb Raider at 27-40 fps at Very High with my old 980 Ti; it was playable but not much fun - barely playable, playable at the limit. Sorry, my English is not perfect. Most of my friends who play PC games will play even at 25-30 fps, but to me 50-70 fps, and no lower, is really better, even necessary.


----------



## aDyerSituation

I can't play at even 60-70 FPS on my 60Hz monitor; it still feels clunky. I need at least 100+ in all the games I play (shooters, at least).


----------



## SuprUsrStan

Quote:


> Originally Posted by *aDyerSituation*
> 
> I can't play at even 60-70 FPS on my 60hz monitor. Still feels clunky. I need at least 100+ in all the games I play(shooters at least)


ಠ_ಠ

Not sure if trolling...


----------



## kingduqc

Quote:


> Originally Posted by *Syan48306*
> 
> ಠ_ಠ
> 
> Not sure if trolling...


I played CS:GO on a 60Hz screen and had the same feeling. If your fps is higher than the refresh rate, you get a more up-to-date frame at every refresh, even on a lower refresh rate screen. Say you render at 120 fps: the frame shown to you is more current. It's not the same as displaying all the frames, but it feels marginally better. That being said, 144Hz is just a must at this point.
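There's a simple latency argument behind this: the frame a 60 Hz screen grabs at each refresh is, on average, about half a render interval old, so rendering faster than the refresh rate still makes what you see more current. A sketch, assuming uniform frame times (which real games don't have):

```python
# Average age of the newest completed frame when the display refreshes.
def avg_frame_age_ms(render_fps):
    render_interval_ms = 1000.0 / render_fps
    return render_interval_ms / 2.0  # on average, half an interval has elapsed

for fps in (60, 120, 300):
    print(f"rendering at {fps} fps: frame is ~{avg_frame_age_ms(fps):.1f} ms old at refresh")
```

Doubling the render rate on the same 60 Hz panel roughly halves how stale the displayed frame is, which is the effect described above.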


----------



## rcfc89

Quote:


> Originally Posted by *ref*
> 
> I'll be getting 2 Pascal Titans for 2k 144hz, not so much because I want fancy settings on, I'm fine lowering settings if they have little to no difference, but so when a 4k 144hz panel is out and a 'reasonable' price to me, I should be good.
> 
> Honestly from 90hz+ the difference is very small I find, now from 60hz to 90+ is like night and day, could never go back to a 60hz panel.


Bingo. 60 to 100 is huge; 100+ not so much. This is why the X34 is perfect for most gamers, imo. 60 or below gives me motion sickness, which limits my playing time and how I feel afterwards. This is why 4K isn't ready for PC gaming, imo.


----------



## ChevChelios

Quote:


> 60-100 is huge. 100+ not so much. This is why the X34 is perfect for most gamers imo


With the X34, that means you need to be at ~95-100 (aka its max range) all the time, and if you go over 100 fps then you're out of G-Sync range.

It'll be good when 3440x1440 144Hz DP 1.3 monitors come out; then you can go over 100 and still be in G-Sync range.


----------



## rcfc89

Quote:


> Originally Posted by *ChevChelios*
> 
> with X34 that means you need to be at ~95-100 (aka it max range) all the time and if you go over 100 fps then you're out of Gsync range
> 
> it'll be good when 3440x1440 144hz DP 1.3 monitors come out, then you can go over 100 and still be in Gsync range


Lol, what? I said the big jump in smoothness comes between 60 and 100; 100+ is where there is very little visual difference. So a steady 100 is perfect: have enough GPU power to keep frames above 100, lock in vsync, which holds things at 100, and you're getting an incredibly smooth gaming experience, with the immersive ultra-wide resolution on top.


----------



## ChevChelios

Quote:


> Originally Posted by *rcfc89*
> 
> lock in vsync


Can't do that; Vsync gives input lag, and the whole point of using G-Sync is to not use Vsync.

Though I suppose you can frame-cap at 99 or 100 (assuming that doesn't give input lag either).


----------



## rcfc89

Quote:


> Originally Posted by *ChevChelios*
> 
> cant do that, Vsync gives input lag, whole point of using Gsync is to not use Vsync
> 
> though I suppose you can frame cap at 99 or 100 (assuming that doesnt give input lag either)


It doesn't. Vsync locks your display at its maximum refresh rate. Once the display and the NVIDIA Control Panel are set to 100Hz, that's where Vsync locks on. G-Sync is more noticeable at adjusting when frames drop below 60; around 75+ it's hardly needed.


----------



## TheBlindDeafMute

Quote:


> Originally Posted by *aDyerSituation*
> 
> Hmm, I wonder if I can get away with that from my simple help desk job
> 
> 
> 
> 
> 
> 
> 
> 
> Possibly


Quote:


> Originally Posted by *ChevChelios*
> 
> Titan for work computer










Correction: 2 Titans for work, lol.


----------



## pez

Quote:


> Originally Posted by *Jupitel*
> 
> As I said this is just my opinion, I'm sure there's people out there willing to play with 12 fps as long as they play on a 4k screen. Personally I think when it comes to gaming (no, I'm not a streamer/comp/prof) fps trumps resolution every time. I'd much prefer if Nvidia/AMD focused on mastering 2k before moving on to 4k/5k like I'm seeing.
> 
> And yes if you play with 60hz 1440p 1080 is fine, so is the titan X and even a gtx 980 TI can do well with a few exceptions, so what's the point of a gtx 1080? It can run 1440p 60hz just like previous cards and it can't run 60hz 4k like the previous cards and even a 980 is a 1080p overkill, so you tell me, what's the point of this card?
> 
> To me, again my personal opinion, the market right now doesn't make much sense. Serious gamers, with a few exceptions, will want 144hz, that should be the objective before moving on to bigger resolutions.


Because 1080 SLI CAN run 4K at 60 frames. That's not everyone's cup of tea, but it suits me fine. I just happen to prefer higher res to a higher refresh rate. A single 1080 does 1440p just fine. However, I don't think that because some people want every game to run at 144Hz, we should cease to improve display resolution and technology. That's just silly to me. You don't see many new titles coming out pushing 100+ frames off the bat. Something has to stop improving for that to be the case, whether the game engine or the monitor technology. No progress is bad progress in PC gaming.
Quote:


> Originally Posted by *ChevChelios*
> 
> cant do that, Vsync gives input lag, whole point of using Gsync is to not use Vsync
> 
> though I suppose you can frame cap at 99 or 100 (assuming that doesnt give input lag either)


The NVIDIA Control Panel also supports adaptive vsync, which does a great job of keeping gameplay smooth and consistent without input lag. However, this is much different from G-Sync, where you need to be able to maintain a minimum rate for good results. I notice most games have this feature built in, but not all. Games like Crysis 3 are a mess without it, IMO. CS:GO doesn't need it, but that's also an engine thing.


----------



## Jupitel

Everyone has their own thing; if you prefer resolution over frame rate, that's perfectly fine. It does seem odd to me that the top card right now, the 1080, can actually fail to run 144Hz at 1080p in AAA games (literally, an i7-6700K/GTX 1080/32 GB RAM can't run The Witcher 3 at 144Hz). It most definitely cannot run 144Hz at 1440p. So at the moment the best card on the market can barely handle 2K; how people are even concerned about 4K, I don't know.

I never said we shouldn't keep pushing the limits, but as far as the market is concerned, I would love to see graphics cards at least try to keep up. The 1070/1080 literally have no purpose: they can run 1080p fine like the previous ones, and can't run 1440p or 4K, like the previous ones. Isn't it strange? Again, I'm using 144Hz as a standard; if you want to play at 15 fps you can buy a 5K monitor.


----------



## ChevChelios

You don't need 144 fps everywhere, because the jump from 60 to 100 is MUCH more noticeable than from 100 to 144.

An OCed 1080 can do 85-90+ fps at 1440p in The Witcher 3 or RotTR @ Ultra, which is fine.

In Overwatch it can do 144 fps at 1440p, also fine.

Quote:


> The 1070/1080 literally have no purpose


lol, they serve plenty of purpose

the only "fault" a 1080 has is that it still isn't a 4K @ 60 card .. Titan P will be


----------



## skypine27

Quote:


> Originally Posted by *Jupitel*
> 
> The 1070/1080 literally have no purpose.


I agree with this, as a high end user (6950x + 2 x Titan X SLI, Acer X34 3440 x 1440 @ 100 HZ)

I usually spend money, regardless of price, on the latest bleeding-edge stuff. Hell, I just went from a $1000 5960X to a $1600 6950X.

But I can't seem to drop $1600 (2 x 1080s) to replace my 2 x Titan Xs. They simply aren't a big leap forward, and 2 x Titan Xs run every game there is at damn near 100 FPS at 3440 x 1440 (full 4K at 60 as well). I know new games will come out and drop this, but so will new video cards. The 2 x 1080s have convinced me to wait for 2 x Titan "Nexts".

However, for a mid-to-lower-end buyer, I think a 1070 is a great single-card solution. I think that card has a purpose; the 1080 I'm not so sure about.


----------



## loguerto

"NVIDIA's engineers reportedly saying that even Intel's new Core i7-6950X isn't powerful enough to deliver the performance the new Titan cards need in enthusiast level scenarios."
Depends on the software used, in my opinion. If they use high-overhead APIs like DX11/OpenGL, of course the card is bottlenecked; the GTX 1080 is already pretty bottlenecked. If NVIDIA instead starts building its architectures around the new low-level APIs like DX12/Vulkan, the CPU bottleneck would be pushed much further away.


----------



## Mhill2029

Quote:


> Originally Posted by *skypine27*
> 
> I agree with this, as a high end user (6950x + 2 x Titan X SLI, Acer X34 3440 x 1440 @ 100 HZ)
> 
> I usually spend money, regardless of price, on the latest bleeding edge stuff. Hell I just went from a 1000 dollar 5960x to a 1600 dollar 6950x.
> 
> *But I cant seem to drop 1600 dollars (2 x 1080's) to replace out my 2 x Titan X's. They simply aren't a big leap forward and 2 x Titan X's runs every game there is damn near 100 FPS at 3440 x 1400 (full 4K at 60 as well). I know new games will come out and drop this, but so will new video cards. 2 x 1080s have convinced me to wait for 2 x Titan "Nexts"
> *
> 
> However as a mid-lower end guy, I think a 1070 is a great single card solution. I think that card has a purpose, the 1080 I'm not so sure.


I'm actually in the same boat myself, although I have 4x Titan X SCs. Since I took two of them out, my gaming experience is tenfold better; I've known that 4-way SLI has never been a smart move for games, having done it for several generations now (I'm a sucker for it). But in light of the GTX 1080, I can't see the point in upgrading, if you can even call it an upgrade from SLI Titan Xs (especially as the GTX 1080, a midrange card, is almost Titan X money). I'll happily ride the wave until the Pascal Titans come along. Will I go 4-way SLI with those? You bet your ass I will.

Hence the reason I jumped on the 6950X; I'm preparing for some serious awesomeness.


----------



## pez

Quote:


> Originally Posted by *Jupitel*
> 
> Everyone has their own thing, if you prefer resolution over frame rate that's perfectly fine, it does seem odd to me that the top card right now, the 1080, can actually fail to run 144hz on 1080 with AAA games (literally a i7-6700k/GTX 1080/32 gb ram can't run witcher 3 with 144hz). It most definetly cannot run 144hz 1440p, so at the moment, the best card on the market, can barely run 2k, how are people even concerned about 4k I don't know.
> 
> I never said we shouldn't keep pushing the limits, but as far as the market is concern I would love too see graphics cards at least try to keep up. The 1070/1080 literally have no purpose, they can run 1080p fine like the previous ones and can't run 1440p or 4k like the previous ones. Isn't it strange? Again I'm using 144hz as a standard, if you want to play with 15 fps you can buy a 5k monitor.


Just because you think 60Hz, or apparently even 100Hz at this point in technology, isn't good enough doesn't make the card useless, nor does it dictate its ability to 'run' a certain resolution.

You also jumped onto another train, stating that 144Hz is your standard but that everyone below it wants to play at 15 FPS? Maybe your point didn't come across clearly, but you're starting to sound like a troll at this point.

I mean, if this is your mindset, then you should be excited that cards are increasing in performance and getting closer to the high standards you require to play your games.


----------



## Jupitel

Quote:


> Originally Posted by *pez*
> 
> Just because you think 60Hz or apparently even 100Hz at this point in technology isn't good enough, doesn't make the card useless nor does it dictate it's ability to 'run' a certain resolution.
> 
> You also jumped onto another train, stating that 144Hz is your standard, but think that everyone below that wants to play at 15FPS? Maybe your point didn't come across clear, but you're starting to sound like a troll at this point.
> 
> I mean if this is your mindset, then you should be excited that cards are increasing in performance and becoming closer to the high standards you require to play your games.


I might have misspoken. I did not mean the 1070/1080 serve no purpose as in useless; I meant they don't do anything previous cards couldn't do. They can run 1080p, like the 980 Ti and Titan before them; they can't run 1440p properly, like the 980 Ti and Titan; and they are still useless for 60Hz 4K. So what do they do? Yes, I'm happy about better performance, but I feel they need to keep up with the industry.

I take your point about the Hz criterion; as an ex-CS:GO player anything under 120Hz is garbage to me, but I realize that's just my personal opinion.

However, nowadays, especially with IPS 144Hz monitors, there really is no excuse, I feel. I don't know where you get the idea I'm a troll. Because I'm unsatisfied with the current graphics card market? That makes me a troll? Or for having personal standards that don't conform to most? Or because I prefer frame rates over resolution? If you read the thread you'll find others agree with me and are not as impressed with the new cards' performance as most.

I was making the point that if you (not you personally) don't care about frame rates, then you can play on 4K monitors; the technology for what you like already exists, while mine does not. You don't think that's a fair point? I didn't mean that someone who doesn't care about 144Hz would be satisfied with 15 fps.

And yes, I agree it's a very subjective criterion and I don't expect the entire industry to follow it. But you have to admit it's a decent criterion to use, because if a graphics card can't run 144Hz at a given resolution, then you know it's not going to be able to run 60Hz in a few years at the same resolution with similar games. It's a good way to see whether a graphics card has "mastered" a resolution. Even then I'm not completely insane about it; I did mention that the 1080 can't run The Witcher 3 at 1080p at a consistent 144Hz, but I don't use the top graphics-heavy AAA game (an RPG, even) as a measure for everything.


----------



## rcfc89

Quote:


> Originally Posted by *Jupitel*
> 
> Everyone has their own thing, if you prefer resolution over frame rate that's perfectly fine, it does seem odd to me that the top card right now, the 1080, can actually fail to run 144hz on 1080 with AAA games (literally a i7-6700k/GTX 1080/32 gb ram can't run witcher 3 with 144hz). It most definetly cannot run 144hz 1440p, so at the moment, the best card on the market, can barely run 2k, how are people even concerned about 4k I don't know.
> 
> I never said we shouldn't keep pushing the limits, but as far as the market is concern I would love too see graphics cards at least try to keep up. The *1070/1080 literally have no purpose,* they can run 1080p fine like the previous ones and can't run 1440p or 4k like the previous ones. Isn't it strange? Again I'm using 144hz as a standard, if you want to play with 15 fps you can buy a 5k monitor.


Sure they do. In SLI they are great cards and will be able to handle most everything you throw at them regardless of the resolution. The 8 GB of VRAM is a nice touch as well, with upcoming games like BF1 already pushing the limits of 6 GB. If you consider yourself a high-end or ultra gaming enthusiast, you must accept running SLI, and you must accept it now. Sure, you may be able to max out some games with a single card, but in most you'll have to dial things back a bit. Most AAA games support SLI, and with only 2-way you should get smooth gameplay with great scaling. The days of SLI stuttering and skipping are a thing of the past with these new GPUs; in fact I haven't experienced any SLI issues since the 7 series. It's been fantastic.


----------



## Jupitel

Quote:


> Originally Posted by *rcfc89*
> 
> Sure they do. In SLI they are great cards and will be able to handle most everything you throw at them regardless of the resolution. The 8gb of Vram is a nice touch as well with upcoming games like BF1 already pushing the limitations of 6gb. If you consider yourself a high-end or ultra gaming enthusiast you must except running SLI and you must accept it now. Sure you may be able to max out some games with a single card but most you will have to dial things back a bit. Most AAA games support SLI and with only 2-way you should experience a smooth gaming experience with great scaling. The days of SLI stuttering and skipping are a thing of the past with these new gpu's. In fact I haven't experienced any SLI issue's since the 7 series. Its been fantastic.


Fair enough; to be honest I didn't even think about SLI. My experience with it has been appalling, to say the least. To pay twice as much for a 20-40% average increase in performance... I don't know. You're right that to run capped fps at resolutions higher than 1080p you pretty much need something in SLI. Personally I'm looking at a 1440p 144Hz G-Sync monitor combined with a 1080; hopefully that will be enough to run most games at high without AA and other high-end settings. In any case, your point is well taken as far as SLI is concerned, and I stand corrected.


----------



## WorldExclusive

Quote:


> Originally Posted by *Jupitel*
> 
> I might have misspoken, I did not mean the 1070/1080 don't serve any purpose as in useless, I meant they don't do anything previous cards couldn't do. They can run 1080p, like the 980 TI and Titan before, they can't run 1440p properly, like the 980 TI and Titan, they are still useless for 60hz 4k, so what do they do? Yes I'm happy about better performance but they need to keep up with the industry I feel.
> 
> I take your point about the Hz criteria, as a ex-CSGO player anything under 120hz is garbage to me, but I realize that's just my personal opinion.
> 
> However, nowadays especially with IPS 144hz monitors there really is no excuse I feel. I don't know where you get the idea I'm a troll, because I'm unsatisfied by the current graphic card market? That makes me a troll? Or for having personal standards that do not conform to most? Or because I dislike resolution over framerates? If you read you'll find others agree with me and are not as impressed with the new cards performance as most.
> 
> I was making the point that if you (not you personally) don't care about frames rates than you can play on 4k monitors, the technology used for what you like exist already, mine does not. You don't think that's a fair point? I didn't mean that because someone doesn't care about 144hz then they would be satisfied with 15 fps.
> 
> And yes I agree it's a very subjective criteria and I don't expect the entire industry to follow it. But you have to admit it is a very decent criteria to use, because if a graphic card can't run 144hz on X resolution then you know it's not going to be able to run 60hz in a few years on the same resolution with similar games. It's a good way to see if a graphic card has "mastered" a resolution. Even then I'm not completly insane about it, I did mention that the 1080 can't run The Witcher 3 on 1080p with consistent 144hz but I don't use the top graphic AAA game (RPG even) as a measure for everything.


There are no reviews of cards at 144Hz for a reason.
Please don't use framerates not achievable above 1080p with a single card for any comparison.

Framerate whores are never satisfied.


----------



## Jupitel

Quote:


> Originally Posted by *WorldExclusive*
> 
> There are no reviews of cards at 144Hz for a reason.
> Please don't use framerates not achievable above 1080p with a single card for any comparison.
> 
> Framerate whores are never satisfied.


I'm not sure what that even means. Benchmarks are not capped; every review effectively covers 144Hz, and just because the cards can't reach it doesn't mean it isn't there. I also said it's a personal criterion, and I can use whatever the hell I want; if I decide the only good products on the market are ones with a red label, that's my decision based on my personal preferences. Besides, frame rate over resolution is hardly a unique point of view.

I would say the same about resolution whores: never satisfied, and even when they can't run one resolution they already want a bigger one.


----------



## rcfc89

Quote:


> Originally Posted by *Jupitel*
> 
> Fair enough, to be honest I didn't even think about SLI. My experience with it has been appalling to say the least. To pay twice as much for *20/40% avarage increase in performance.*.. I don't know. Yes you are right that to run capped fps on higher resolutions than 1080p you pretty much need something in SLI. Personally I'm looking at going with a 1440p 144hz gsync monitor combined with a 1080, hopefully that will be enough to run most games at high without AA and other high-end settings. In any case, your point is well taken as far as SLI is concerned and I stand corrected.


If you are running the game at low/medium settings, where GPU usage percentages are low because they're not needed, then maybe 20-30% is correct, but who runs SLI at 1080p or low/medium settings?

If you run the game at Ultra in 1440p or higher resolutions, keeping GPU usage on both cards above 85%, I've seen fps gains of as much as 70-80% in most AAA games that utilize SLI correctly. Even in the worst-case scenario, going from 45 fps to 75 fps at Ultra settings in ultrawide or 4K would definitely warrant the price of a second GPU, IMO.
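
A quick sanity check on that worst case (a back-of-the-envelope sketch; the 45 and 75 fps figures are the ones quoted above, not my own measurements):

```python
def scaling_gain(single_fps, sli_fps):
    """Percent fps gain from adding the second card."""
    return (sli_fps - single_fps) / single_fps * 100

# The "worst case" quoted above: 45 fps on one card vs 75 fps in SLI.
print(round(scaling_gain(45, 75), 1))  # 66.7 -> about 67% scaling
# Perfect 100% scaling would have meant 90 fps.
```

So even the worst case quoted here is roughly two-thirds of ideal scaling, which is the crux of the "is a second card worth it" argument.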


----------



## Jupitel

Quote:


> Originally Posted by *rcfc89*
> 
> If you are running the game a low/medium settings where the usage percentages are low because they are not needed then maybe 20-30% is correct but who runs SLI on 1080p or low/medium settings?
> 
> If you run the game in Ultra in 1440p or higher resolutions keeping gpu usage on both cards above 85% I've seen fps gains as much as 70-80% in most AAA games that utilize SLI correctly. Even in the worst case scenario running 45fps vs 75fps in Ultra settings in UltraWide resolutions or 4K would definitely warrant the price of a 2nd gpu imo.


To be honest, I'm kind of ignorant when it comes to SLI; I studied it and tried it a long time ago. I'll take a look at 1080 SLI performance, which I'm sure should be able to handle 1440p 144Hz, but I would make two quick points: 1) even at 85% scaling you're effectively buying a 1070 for the price of a 1080, and 2) most people don't run SLI.

My point is that for this last build I actually had to buy lower-spec parts to run things properly; this has never happened to me in over 15 rigs. Is that not strange or unusual?

To be fair to your point with SLI I could have avoided that.


----------



## Asmodian

Quote:


> Originally Posted by *rcfc89*
> 
> If you run the game in Ultra in 1440p or higher resolutions keeping gpu usage on both cards above 85% I've seen fps gains as much as 70-80% in most AAA games *that utilize SLI correctly*. Even in the worst case scenario running 45fps vs 75fps in Ultra settings in UltraWide resolutions or 4K would definitely warrant the price of a 2nd gpu imo.


That is a pretty big qualifier.

It isn't good when you need SLI for decent settings only to find your new game doesn't like SLI at all; if you need SLI for your resolution, that can cause problems, and I hate running non-native resolutions.

My last SLI system was two original Titans. They were nice when SLI worked, but even then scaling was often only about 40%. When SLI wasn't supported correctly, everything was terrible with it enabled: sometimes it was faster but with flickering, micro stutter, or some other annoyance, and other times it could actually be slower with SLI enabled.

My previous dual-GPU system was Crossfire 6950s (unlocked to 6970s), which had horrible stutter in many engines where Crossfire nominally worked. I had decided to give up dual GPUs after those, but all the performance available with SLI Titans tempted me... never again (... at least until Volta; it has been long enough that I have hope again).


----------



## szeged

Pascal Titan needs to release already, I'm having withdrawals from not buying any new gpus for a while.


----------



## rcfc89

Quote:


> Originally Posted by *Jupitel*
> 
> To be honest I'm kinda ignorant when it comes to SLI, I studied it and tried it a long time ago. I'll take a look at 1080 SLI performance, I'm sure that should be able to handle 1440p 144hz but I would make two quick points: 1) Even at 85% you are buying a 1070 for the price of a 1080 2) Most people don't run SLI.
> 
> My point being that in this last build I actually had to buy lower quality parts to run them properly, this has never happened to me in over 15 rigs I've had. Is this not strange or unusual?
> 
> To be fair to your point with SLI I could have avoided that.


Like I said, whether it's 60% or 80%, it's still needed if you're running games at max settings at higher resolutions. In all honesty, I didn't drop several grand on a PC and another $1200 on a display to dial back the settings. Like I said, SLI runs great now, and most AAA titles not only utilize it but do it very well with incredible scaling. Forget what you saw several generations ago; everything is better now, from the motherboards, SLI bridges, and drivers to the GPUs themselves. Give it a try and you will never look back.

If you want 1440p 144Hz, ultrawide 100Hz, or 4K 60Hz, you really don't have much of a choice. The value of the second card is being able to run these games in all their glory, with maximum eye candy, at smooth high frame rates. It's expensive, but honestly, ask yourself: would you rather run Witcher 3 at Ultra 1440p at 50 fps, or spend an extra $700 and play it at a much smoother 75-80 fps? Easy decision for me. It's all about the experience and the enjoyment I get from gaming when I have time to play. The cost is what it is.


----------



## NikolayNeykov

Quote:


> Originally Posted by *rcfc89*
> 
> Like I said whether its 60% or 80% its still needed if you're running games at max settings at higher resolutions. In all honesty I didn't drop several grand on a PC and another 1200 on a display to dial back the settings. Like I said SLI runs great now and most AAA title's not only utilize it but do it very well with incredible scaling. Forget what you have seen several generations ago. Everything is better now from the MB's, SLI bridges, drivers to the gpu's themselves. Give it a try and you will never look back.
> 
> If you wan't 1440p 144hz, Ultra Wide 100hz or 4K 60hz you really don't have much of a choice. The value of the second card is to be able to run these games in all there glory with maximum eye candy at smooth high frames. Its expensive but honestly ask your self would you rather run Witcher 3 in Ultra 1440p at 50fps or spend an extra $700 and play it at a much smoother 75-80fps. Easy decision for me. Its all about the experience and the enjoyment I get from gaming when I have time to play. The cost is what it is.


I think there is no other choice, because they are so far behind with the cards; we could already have 144Hz 4K, but they are just releasing small steps because there is no competition, so 5-10% increases are the best steps they will take, like Maxwell and Pascal. That's why it's best to avoid buying cards until they really do something about it.


----------



## Jupitel

Quote:


> Originally Posted by *rcfc89*
> 
> Like I said whether its 60% or 80% its still needed if you're running games at max settings at higher resolutions. In all honesty I didn't drop several grand on a PC and another 1200 on a display to dial back the settings. Like I said SLI runs great now and most AAA title's not only utilize it but do it very well with incredible scaling. Forget what you have seen several generations ago. Everything is better now from the MB's, SLI bridges, drivers to the gpu's themselves. Give it a try and you will never look back.
> 
> If you wan't 1440p 144hz, Ultra Wide 100hz or 4K 60hz you really don't have much of a choice. The value of the second card is to be able to run these games in all there glory with maximum eye candy at smooth high frames. Its expensive but honestly ask your self would you rather run Witcher 3 in Ultra 1440p at 50fps or spend an extra $700 and play it at a much smoother 75-80fps. Easy decision for me. Its all about the experience and the enjoyment I get from gaming when I have time to play. The cost is what it is.


I'll take a look for sure; if what you say is accurate (I don't contest it), it sounds like SLI has improved quite a bit. Yes, for now I'm looking at a 1440p 144Hz IPS G-Sync monitor run on a 1070/80, possibly SLI now.

Apart from my situation and personal preferences, I'd say it's still pretty remarkable how consumers keep requesting higher resolutions compared to what single graphics cards are actually able to run.

It seems odd to me, not to mention when you consider the benefits of 144Hz (now even on IPS) compared to a higher resolution. I have a U2515H 1440p 25" and I love it, but that's not what consumers want: it's not just higher resolution, it's bigger screens. I'm not sure why, though; we are still talking about computer monitors you sit in front of. Maybe someone can enlighten me.


----------



## rcfc89

Quote:


> Originally Posted by *NikolayNeykov*
> 
> I think there is no other choice, because they are too behind with cards and we could already have 144hz 4k,
> but they are just releasing small steps, because there is no competition, so increase in 5-10 % is the best steps they will do like maxwell and pascal...
> That's why it's best to *avoid buyng cards* untill they really do something about it.


And not be able to run upcoming games like BF1, Forza, etc. at Ultra settings... not a chance.

Quote:


> Originally Posted by *Jupitel*
> 
> I'll take a look for sure, if what you say is accurate (I don't contest it) it sounds like SLI has improved quite a bit. Yes for now I'm looking at 1440p 144hz IPS GSYNC run on a 1070/80. Possibly SLI now.
> 
> Apart from my situation and personal preferences I would say it's still pretty remarkable the way the consumers request higher resolutions compared to single graphic cards being able to run them.
> 
> Seems odd to me. Not to mention when you consider the benefits of 144hz (now even on IPS) compared to a higher resolution. I have a U2515H 1440p 25" and I love it, but that's not what consumers want, is not just higher resolution, *its bigger screens*. I'm not sure why though, we are still talking computer monitors you sit in front of. Maybe someone can enlighten me.


Bigger screens, at least in 4K, mean lower PPI, which kills the reason you game at 4K to begin with. I don't get why people PC game on big 4K TVs designed for watching movies and television; crazy input lag. The display I have is just an extension of a 27" 1440p panel, just much wider with the same PPI.
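
The PPI point is easy to check with basic geometry (an illustrative sketch; the 43" size is my example of a big 4K TV, not a specific model from this thread):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density: diagonal resolution in pixels divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 27), 1))  # 27" 1440p     -> ~108.8 ppi
print(round(ppi(3440, 1440, 34), 1))  # 34" 3440x1440 -> ~109.7 ppi (same density, just wider)
print(round(ppi(3840, 2160, 43), 1))  # 43" 4K TV     -> ~102.5 ppi (lower than the 27" 1440p)
```

So a big enough 4K TV really does end up with lower pixel density than a 27" 1440p monitor, while a 34" ultrawide keeps the same density as the 27" panel.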


----------



## Chargeit

Quote:


> Originally Posted by *rcfc89*
> 
> And not be able to run upcoming games like BF1, Forza etc in Ultra settings.............Not a chance
> 
> Bigger screens at least in 4k means less ppi which kills the reason why you game on 4K to begin with. I don't get why people PC game on big 4K tv's designed for watching movies or television. Crazy input lag. The display I have is just an extension of a 27" 1440p just much wider with the same ppi.


I tried the large 4K screen thing. It looked good on the desk and had some real "wow" moments, but once you got down to more serious gaming or desktop usage, those "wow" moments turned into a lot of compromises for me. I'm just happy I was able to accept the limits and return the screens before getting stuck with something that strained my neck and eyes while being fuzzy and half useless at normal desk distance. When you've got your chair rolled back to the middle of the room to play a game at your computer, you know you're trying to fit in a little too much.


----------



## ChevChelios

can we get back to the Titan ?


----------



## Chargeit

Quote:


> Originally Posted by *ChevChelios*
> 
> can we get back to the Titan ?


Two weeks of rumors isn't enough?


----------



## magnek

Quote:


> Originally Posted by *ChevChelios*
> 
> can we get back to the Titan ?


It's gonna be expensive.

Moving on, I don't think I could ever go 4K simply because outside of gaming and media-ing, general usage would just be horrendous without good DPI scaling, and we all know how much Windows fails at that.


----------



## Chargeit

I'll toss my TP predictions into the mix then.

Titan P in a nutshell:

Expensive
Good, but crap for the price (still not going to be able to max everything out at 4K 60 fps, since a $1,000+ GPU apparently isn't enough for that in newer games)
Terrible price/performance
Outperformed by cards a fraction of its price sooner rather than later
Money would be better spent elsewhere; you might as well give it away, because the card is going to look like junk when the new stuff comes out

That's how Titans roll.

This entire set of Milwaukee cordless power tools costs less than a Titan and will last you a lifetime.


----------



## Cyro999

Quote:


> but it does still remain that 2K and 4K are resolutions that rely a lot less on the CPU than something like 1080p.


If you keep FPS the same, higher resolutions actually have the same or higher CPU load.

The reason for lower CPU utilization is that running the game more poorly (at, say, half the frame rate due to insufficient graphics hardware or too-high settings) dramatically reduces CPU load as well: 40 frames per second is a lot less CPU work than 80 frames per second.

A lot of people will incorrectly compare 1080p 80fps to 4k 40fps and say "hey, the CPU is under more demand on the 1080p test!" because they didn't properly control for some important variables.
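
A toy frame-time model makes the point concrete (the numbers are purely illustrative, assuming a fixed per-frame CPU cost and a GPU cost that grows with resolution):

```python
def frame_stats(cpu_ms, gpu_ms):
    """fps is set by the slower side; CPU utilization is CPU work done per second."""
    fps = 1000 / max(cpu_ms, gpu_ms)
    cpu_util = fps * cpu_ms / 1000
    return round(fps, 1), round(cpu_util, 2)

# Same game, same 8 ms of CPU work per frame; only the GPU cost changes.
print(frame_stats(cpu_ms=8, gpu_ms=10))  # "1080p": (100.0, 0.8)  -> CPU busy 80% of the time
print(frame_stats(cpu_ms=8, gpu_ms=25))  # "4K":    (40.0, 0.32)  -> CPU busy 32% of the time
# Per-frame CPU cost is identical; the lower "CPU usage" at 4K only reflects the lower fps.
```

Comparing the two runs at the same fps would show the same or higher CPU load at 4K, which is the variable the flawed comparisons fail to control for.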


----------



## ChevChelios

Quote:


> Expensive
> Good, but crap for the price(Still not going to be able to max everything out at 4k 60 fps since $1,000+ GPU apparently isn't enough for that in newer games)
> Terrible price/performance
> *Out performed by cards fraction of its price sooner then later*
> Money would be better spent elsewhere. Like you might as well give that crap away because the cards going to look junk when the new stuff comes out.
> 
> That's how Titans roll.


neither the 980 Ti nor the Fury X outperforms or even equals the Titan X (and yes, a Titan X can be OCed to almost 1600 under water, so even an OCed 980 Ti can't reach it)

everything else is true, Titan is not for value .. it's for raw power, and it only gets beaten when the high end of the _next_ gen comes out

Titan P will be the same


----------



## Chargeit

Quote:


> Originally Posted by *ChevChelios*
> 
> neither 980Ti, nor Fury X outperform or even equal Titan X (and yes, Titan X can be OCed to almost 1600 under water, so even OCed 980Ti cant reach it)
> 
> everything else is true, Titan is not for value .. its for raw power and it only gets beaten when the high end of the _next_ gen comes out
> 
> Titan P will be same


Yea, but a 980 Ti factory-OCed out of the box will beat a TX out of the box. Most people aren't going to put theirs under water and OC it to the max. Even then, a Titan shouldn't be outdone until the next Titan is out.


----------



## magnek

Quote:


> Originally Posted by *ChevChelios*
> 
> neither 980Ti, nor Fury X outperform or even equal Titan X (and yes, Titan X can be OCed to almost 1600 under water, so even OCed 980Ti cant reach it)
> 
> everything else is true, Titan is not for value .. its for raw power and it only gets beaten when the high end of the _next_ gen comes out
> 
> Titan P will be same


A golden-sample Titan X could maybe bench at 1600 under water, but those will in no way be 24/7 clocks.

Clock for clock, the Titan X is 5-7% faster than the 980 Ti, an insignificant difference.


----------



## ChevChelios

Quote:


> Clock for clock Titan X is 5-7% faster than 980 Ti, so an insignificant difference.


to you 5-7% is insignificant, to me it isn't

point is it's the fastest

the Ti outdoes it in price/performance, but not in performance


----------



## magnek

In what world is a 5-7% difference considered "significant"?

Hell even by FM_Janis' own admission, 3DMark runs have an allowable margin of error of 3%.
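
For what it's worth, a ±3% run-to-run margin really can eat most of a 5-7% gap (a quick sketch; the 3% figure is the margin quoted above):

```python
# Worst case for the comparison: the slower card benches +3% high
# while the faster card benches -3% low on a given run.
margin = 0.03
for gap in (0.05, 0.07):
    observed = (1 + gap) * (1 - margin) / (1 + margin) - 1
    print(f"nominal {gap:.0%} lead -> worst-case observed {observed:+.1%}")
# A nominal 5% lead can even invert; a 7% lead shrinks to under 1%.
```

Which is the sense in which 5-7% sits uncomfortably close to benchmark noise.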


----------



## Chargeit

Quote:


> Originally Posted by *magnek*
> 
> In what world is a 5-7% difference considered "significant"?
> 
> Hell even by FM_Janis' own admission, 3DMark runs have an allowable margin of error of 3%.


In the world of marketing.


----------



## JackCY

Quote:


> Originally Posted by *ChevChelios*
> 
> can we get back to the Titan ?


I've heard that the heatsink shroud will be made of titanium on the FE card which is probably again the only version.


----------



## magnek

Quote:


> Originally Posted by *Chargeit*
> 
> In the world of marketing,


Could be worse:


----------



## Chargeit

Seems legit.


----------



## JackCY

Quote:


> Originally Posted by *magnek*
> 
> Could be worse:


And I thought that was GPU market share between Intel, Nvidia, and AMD.

But wait, that would actually make sense; never mind.


----------



## Baasha

Isn't Nvidia at PDXLAN now? No news or hints about it, so I highly doubt the Titan is going to be released next month. My hunch is next March (2017).


----------



## ChevChelios

I'm going with Jan or Feb 2017 for the Pascal Titan


----------



## rcfc89

I say we see the Pascal Titan by the end of August. It's already been 16 months since the Titan X. I expect a 1080 Ti 3 months later, just like last generation.


----------



## Chargeit

Won't be this year. Nvidia has to milk the 1080s for all they're worth, then put out the Titan. They also have to give the 1080 buyers time to save their pennies for the Titan P.

Also, AMD has nothing out in that range. Why put out a Titan, cut into your 1080 sales, and only compete with yourself? Doesn't make sense to me.

*Site's acting wonky as hell for me.


----------



## magnek

Quote:


> Originally Posted by *rcfc89*
> 
> I say we say Pascal Titan by the end of August. Its already been 16 month's since Titan X. I expect 1080Ti 3 month's later just like last generation.


I will be first in line for the all you can eat crow buffet if Pascal Titan launches with retail availability by the end of August 2016.


----------



## rcfc89

Quote:


> Originally Posted by *Chargeit*
> 
> Won't be this year. Nvidia has to milk the 1080's for all they're worth then put out the Titan. Also have to allow the 1080 buyers time to save their pennies for the Titian P.
> 
> Also, AMD has nothing out in the range. Why put out a Titan, cut into your 1080 sales, and only compete with yourself. Doesn't make sense to me.
> 
> *Sites acting wonky as hell for me.


I see AMD pushing out something extra nice for the BF1 launch in October. Nvidia will counter with a better-priced 1080 Ti. Haha, one can dream, right? And saying a Titan or 1080 Ti isn't needed is silly. We 980 Ti owners are ready to upgrade and need more VRAM for BF1. The 1080 is nowhere near enough to get my money.


----------



## Chargeit

Well, "needed for us" and "needed for Nvidia" can be two different things.

If we got what we needed, they'd release their whole line of cards at the same time and let us all make well-informed purchases.

Honestly, the only thing I see forcing Nvidia to release their Titan P or 1080 Ti cards early would be AMD putting out a GPU that stomped all over the 1080.

My guess is that if they put out a 1080 Ti first, it wouldn't be based on the Titan, or it would be majorly cut down.


----------



## Ghoxt

Quote:


> Originally Posted by *Chargeit*
> 
> Won't be this year. Nvidia has to milk the 1080's for all they're worth then put out the Titan. Also have to allow the 1080 buyers time to save their pennies for the Titian P.
> 
> Also, AMD has nothing out in the range. Why put out a Titan, *cut into your 1080 sales*, and only compete with yourself. Doesn't make sense to me.
> 
> *Sites acting wonky as hell for me.


But do we know how well the 1070 and 1080 are moving this quarter? If sales are subpar, I could see the Titan P version(s) coming out sooner rather than later. Nvidia is in the business of money per quarter, regardless.

Why would it not apply here?









I thought it was very well known that most of the people, like me, who are awaiting launch facts and specs on the Titan P were *never* going to buy 1070s and 1080s in the first place, so I don't think you have the Titan owners pegged here. That's also why I don't think it's anywhere close to a reason for Nvidia not to launch sooner rather than later.

It could very well be some other business reason (sales to government, scientific, and supercomputer contracts, etc.), but again, I really don't think it's due to "eating into 1080 sales".


----------



## sherlock

Quote:


> Originally Posted by *Ghoxt*
> 
> But do we know how well the 1070 & 1080 are moving this QTR? If sales are subpar, I could see Titan P version(s) coming out sooner than later. Nvidia is in the business of Money per QTR regardless.
> 
> Why would it not apply here?
> 
> 
> 
> 
> 
> 
> 
> 
> It thought it was very well known that most of the people, like me, who are awaiting Launch Facts & Specs on the Titan-P were *never* going to buy 1070's and 1080's in the first place so I don't think you have the Titan owners pegged here, and its also why I'm thinking its not close to a reason why Nvidia would not launch sooner than later.
> 
> It could very well be some other business reason, Sales to Govt, Scientific & Supercomputer Contract sales etc, but again, i really don't think it's due to "eating into 1080 sales".


How could the 1070/1080 be out of stock everywhere and going for $100+ over MSRP on eBay, yet have subpar sales? Clearly every card that has been manufactured has a good chance of being sold within a day or two of getting listed on a retail site.

According to the June Steam survey, there were more 1080s (0.09%) out there than Titan Xs (0.08%), so Titan owners like you make up so small a portion of the user base that serving your $1K cards now instead of next Q1 is not going to make a significant impact on Nvidia's bottom line. Nvidia's revenue will be propped up by the 1060, and their profit will be propped up by the 1070/1080; Titan sales would have very little impact on their financials due to their extremely limited volumes.

http://www.nowinstock.net/computers/videocards/nvidia/gtx1080/
http://www.nowinstock.net/computers/videocards/nvidia/gtx1070/

Nvidia might have their reasons for launching the Titan P in Q3, but poor 1070/1080 sales is not one of them. If the fastest card available at $700 is not selling well, selling the fastest card available at $1,000 is not going to help one bit.


----------



## iLeakStuff

There isn't any reason why the Titan P and the 1080 can't be on the market simultaneously.

The Titan P will make the 1080 seem like a bargain.


----------



## magnek

Sure they can, as long as the Titan P is sufficiently more expensive than the 1080, which means >$1200 at a minimum.


----------



## Chargeit

Quote:


> Originally Posted by *iLeakStuff*
> 
> There isnt any reason to why Titan P and 1080 cant be on the market simultaniously.
> 
> Titan P will make 1080 seem like a bargain


Maximizing sales.

Why sell you one GPU in a generation when I can sell you two or three?

The 1080 doesn't need the Titan P released to sell. It's selling just fine as is.


----------



## ChevChelios

Quote:


> I see AMD pushing out something extra nice for the BF1 launch in October.


They already have that something:

480 CrossFire.

Even though CF is ass overall, you can and should count on BF1 supporting it and SLI (possibly even from release day).

If you only want AMD, need something fast for BF1, and mostly only care about BF1, 480 CF is honestly a decent option.


----------



## Zero4549

Quote:


> Originally Posted by *ChevChelios*
> 
> they already have that something
> 
> 480 crossfire
> 
> even though CF is ass overall, you can/should count on BF1 supporting it and SLI (possibly/probably from release day even)
> 
> if you only want AMD and need something fast for BF1 and only mostly care about BF1 - 480 CF is honestly a decent/good option


480CF is actually a really interesting option going forward. CF's implementation for VR is rather good. I'm sure BF will support it great as well. As for other non-VR titles... that remains to be seen. You do get some pretty nice bang for your buck when it comes to DX12/Vulkan with the 480, but non-VR CF support really comes down to the game itself.


----------



## guttheslayer

Quote:


> Originally Posted by *ToTheSun!*
> 
> I think the only way we'll be able to strobe at 200 Hz without artifacts is with OLED tech.


You don't need strobing for OLED; there is no backlight for you to strobe.

And the pixel response is so fast that there is almost no pixel persistence for motion blur to occur. We are talking about roughly 0.1 ms here.


----------



## pez

Skipped a few replies, but I wanted to post before I lost all I had to say.

Quote:


> Originally Posted by *Jupitel*
> 
> I might have misspoken, I did not mean the 1070/1080 don't serve any purpose as in useless, I meant they don't do anything previous cards couldn't do. They can run 1080p, like the 980 TI and Titan before, they can't run 1440p properly, like the 980 TI and Titan, they are still useless for 60hz 4k, so what do they do? Yes I'm happy about better performance but they need to keep up with the industry I feel.
> 
> I take your point about the Hz criteria, as a ex-CSGO player anything under 120hz is garbage to me, but I realize that's just my personal opinion.
> 
> However, nowadays especially with IPS 144hz monitors there really is no excuse I feel. I don't know where you get the idea I'm a troll, because I'm unsatisfied by the current graphic card market? That makes me a troll? Or for having personal standards that do not conform to most? Or because I dislike resolution over framerates? If you read you'll find others agree with me and are not as impressed with the new cards performance as most.
> 
> I was making the point that if you (not you personally) don't care about frames rates than you can play on 4k monitors, the technology used for what you like exist already, mine does not. You don't think that's a fair point? I didn't mean that because someone doesn't care about 144hz then they would be satisfied with 15 fps.
> 
> And yes I agree it's a very subjective criteria and I don't expect the entire industry to follow it. But you have to admit it is a very decent criteria to use, because if a graphic card can't run 144hz on X resolution then you know it's not going to be able to run 60hz in a few years on the same resolution with similar games. It's a good way to see if a graphic card has "mastered" a resolution. Even then I'm not completly insane about it, I did mention that the 1080 can't run The Witcher 3 on 1080p with consistent 144hz but I don't use the top graphic AAA game (RPG even) as a measure for everything.


It's a valid opinion, but it sounded more like an attempt to preach than you stating your opinion.







I don't fault you for wanting 144Hz in everything, though that hasn't been realistic even at 1080p. I mean, by that logic every card, including the GTX 1080, is useless for 1080p because it doesn't achieve 144Hz in many games at all:

http://www.guru3d.com/articles_pages/msi_geforce_gtx_1080_gaming_z_8g_review,19.html
Quote:


> Originally Posted by *rcfc89*
> 
> Sure they do. In SLI they are great cards and will be able to handle most everything you throw at them regardless of the resolution. The 8gb of Vram is a nice touch as well with upcoming games like BF1 already pushing the limitations of 6gb. If you consider yourself a high-end or ultra gaming enthusiast you must except running SLI and you must accept it now. Sure you may be able to max out some games with a single card but most you will have to dial things back a bit. Most AAA games support SLI and with only 2-way you should experience a smooth gaming experience with great scaling. The days of SLI stuttering and skipping are a thing of the past with these new gpu's. In fact I haven't experienced any SLI issue's since the 7 series. Its been fantastic.


Quote:


> Originally Posted by *Jupitel*
> 
> Fair enough, to be honest I didn't even think about SLI. My experience with it has been appalling to say the least. To pay twice as much for 20/40% avarage increase in performance... I don't know. Yes you are right that to run capped fps on higher resolutions than 1080p you pretty much need something in SLI. Personally I'm looking at going with a 1440p 144hz gsync monitor combined with a 1080, hopefully that will be enough to run most games at high without AA and other high-end settings. In any case, your point is well taken as far as SLI is concerned and I stand corrected.


I have to agree with rcfc89's post above. 30% gains from SLI are largely a thing of the past, so long as a profile is available for the game. There are reviews showing GTA V scaling around 90% with GTX 1080 SLI at 4K. SLI is not perfect, but it's not the demon it used to be, or the one everyone still makes it out to be.
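The difference between old-style and modern scaling is easy to put in numbers. A back-of-the-envelope sketch; the 90% figure comes from the reviews mentioned above, while the 60 fps single-card baseline is an assumed placeholder, not a benchmark result:

```python
# Back-of-the-envelope SLI scaling, with illustrative numbers only.
def sli_fps(single_card_fps, scaling):
    """Effective frame rate for two cards, where `scaling` is the fraction
    of the second card's throughput actually realized (1.0 = perfect)."""
    return single_card_fps * (1 + scaling)

# Hypothetical single GTX 1080 at 4K: ~60 fps (placeholder figure).
print(sli_fps(60, 0.9))  # 90% scaling: roughly 114 fps
print(sli_fps(60, 0.3))  # old-style 30% scaling: roughly 78 fps
```

At 90% scaling the second card nearly doubles the frame rate, which is why "paying twice for 30% more" no longer describes a profiled game.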


----------



## Ghoxt

Quote:


> Originally Posted by *sherlock*
> 
> How could 1070/1080 be out of stock everywhere + going for $100+ more than MSRP on ebay and have subpar sales?? clearly every card that have been manufactured have a good chance to be sold within a day or two of getting listed on retail site.
> 
> According to steam survey in June, there were more 1080(0.09%) out there Titan X(0.08%), so Titan owner like you make up so small a portion of the user base that serving your $1K cards now instead of next Q1 is not going to make a significant impact on Nvidia bottom lines. Nvidia's revenue will be propped up by 1060, and their profit will be propped up by 1070/1080, Titan sales would have very little impact on their financials due to their extremely limited volumes.
> 
> http://www.nowinstock.net/computers/videocards/nvidia/gtx1080/
> http://www.nowinstock.net/computers/videocards/nvidia/gtx1070/
> 
> Nvidia might have their reasons for launching Titan P in Q3, but poor 1070/1080 sales is not one of them. If the fastest card avaliable at $700 is not selling well, selling the fastest card available at $1000 is not going to help one bit.


Exactly, we don't know sales numbers. We don't know yield numbers. The Titan X was never going to be a huge seller, especially gimped in DP (scientifically shunned), so comparing to it carries less weight in my opinion. Halo cards like the Titan were also never about huge volumes in any sense. We have speculated that some cards were produced as Nvidia's way of "catching the falling knife" of imperfect Tesla/Quadro yields and getting something out of them.

It's funny how we both look at this through different lenses.







I don't think either of us has enough info to be credible, however we try, lol. I admit it: we don't have enough credible information on what the hell Nvidia is doing and why. They seem to march to their own drum, which in their position is admirable.


----------



## LunaTiC123

Quote:


> Originally Posted by *guttheslayer*
> 
> You dont need strobing for OLED, there is no backlight for u to strobe.
> 
> And the pixel response is so fast, there is almost no pixel persistence for motion blur to occur. We are talking about at least 0.1ms here.


Taken from Wikipedia (not sure how true it is, though):





Quote:


> According to LG, OLED response times are up to 1,000 times faster than LCD, putting conservative estimates at under 10 μs (0.01 ms), which could theoretically accommodate refresh frequencies approaching 100 kHz (100,000 Hz). Due to their extremely fast response time, OLED displays can also be easily designed to be strobed, creating an effect similar to CRT flicker in order to avoid the sample-and-hold behavior seen on both LCDs and some OLED displays, which creates the perception of motion blur.[67]






Here's a nice article on sample-and-hold motion blur on OLEDs:

http://www.blurbusters.com/faq/oled-motion-blur/

So we still need some kind of black frame insertion or very high refresh rates to get rid of motion blur, even on OLED.
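The Blur Busters point can be put in numbers: on a sample-and-hold display each frame stays lit for the whole refresh interval, so for eye-tracked motion the blur width is roughly persistence times motion speed, and even a 0.1 ms pixel response doesn't help. A rough sketch; the 960 px/s panning speed is an arbitrary assumption:

```python
# Sample-and-hold motion blur, per the Blur Busters explanation linked above:
# for eye-tracked motion, blur width ~= persistence x motion speed.
def motion_blur_px(persistence_ms, speed_px_per_s):
    """Perceived blur in pixels: how long each frame stays lit times
    how far the tracking eye moves in that time."""
    return persistence_ms / 1000.0 * speed_px_per_s

speed = 960  # assumed panning speed, pixels per second

print(motion_blur_px(1000 / 60, speed))   # 60 Hz sample-and-hold: ~16 px
print(motion_blur_px(1000 / 144, speed))  # 144 Hz sample-and-hold: ~6.7 px
print(motion_blur_px(1.0, speed))         # 1 ms strobe/BFI: ~1 px
```

This is why strobing or black frame insertion still matters on OLED: only shortening how long each frame is lit, not the pixel transition time, shrinks the blur.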

Anyway, OLED is indeed the future, and hopefully the near future, so we can get rid of the trash IPS/VA/TN panels we have nowadays (the day my Sony FW900 CRT died was a sad day).







It still feels kinda far away, though.







Hopefully by 2018 we'll have some consumer high-refresh-rate OLED monitors under $1k...


----------



## ChevChelios

Do you still need/want G-Sync/FreeSync for OLED?


----------



## LunaTiC123

Quote:


> Originally Posted by *ChevChelios*
> 
> do you still need/want Gsync/Freesync for OLED ?


Sure, G-Sync/FreeSync should be on all monitors IMO, no matter the panel technology.


----------



## Chargeit

I'll be impressed with the idea of OLED as a computer monitor the day they're selling them and they prove that they don't suffer from burn-in or image retention. I'm already stressed out enough about black bars on my plasma; I sure don't want that with my computer monitor.

And yeah, all the motion clarity in the world won't fix screen tearing. What you won't need is ULMB.

http://televisions.reviewed.com/features/what-to-know-about-oled-screen-burn-in-problems-causes-image-retention
Quote:


> Gamers will want to take special precautions, though, especially if you like to sit down for marathon sessions of the same game for hours at a time. Your best bet is to take a break now and again, change the channel, and let the TV (and your competitors) catch a breather.


This highlights my problem with the idea of an OLED monitor, and it's why I don't let myself get caught up in the "OLED as monitor" hype train. I've already experienced this with plasma, and I know it's stressful and always in the back of my mind while using the screen. I don't worry about that stuff with my LCD-based screens.


----------



## DETERMINOLOGY

91 pages on a light rumor; y'all are overcooking threads...


----------



## guttheslayer

Quote:


> Originally Posted by *ChevChelios*
> 
> do you still need/want Gsync/Freesync for OLED ?


G-Sync or some form of VRR is a must, unless OLEDs natively refresh dynamically.

ULMB, however, is essentially useless with OLED.


----------



## rcfc89

Quote:


> Originally Posted by *magnek*
> 
> Sure they can as long as Titan P is sufficiently more expensive than the 1080, which means >$1200 at a minimum.


Or maybe the 1080 is priced too high? There's nothing on the market to compare it to, which leads to its inflated price. I'm hoping for a normal $1k Titan, 1080s dropping to $550-599, and 1080 Tis later in the $699-750 range. Sorry to you early adopters of the 1080; there's always a fee, especially when AMD is dragging their feet.


----------



## guttheslayer

Quote:


> Originally Posted by *rcfc89*
> 
> Or maybe 1080 is priced too high? Nothing on the market to compare it to thus leading to its inflated price. I'm hoping for normal $1k Titan and see 1080's drop to $550-599. Then later seeing 1080Ti's at $699-750 range. Sorry to you early adopters of the 1080. There's always a fee especially when Amd is dragging their feet.


There is no need to drop the 1080's price even if the Titan comes in at $999. First, the Titan will not be >40% faster. Second, even with a 40% performance boost, not everyone will instantly buy a Titan over a 1080, because we are talking about an extra $300-$400 on top of the already overpriced 1080.

A pair of 1080s will always be a more attractive option when you go above $1K.


----------



## rcfc89

Quote:


> Originally Posted by *guttheslayer*
> 
> There is no need to drop 1080 price even if Titan comes in at $999. First Titan will not be >40% faster. Second even with 40% performance boost not everyone will instant buy a Titan over 1080, that is because we are talking about extra $300-$400 on top of the over-expensive 1080.
> 
> A pair of 1080 will always be a more attractive option when you go above 1K.


What are the specs of your PC? Do you even own a high-end GPU? I'm going to guess the answer is no. Titan owners don't buy 1080s. Titan owners don't give a hoot about the premium price of owning a Titan. Yet people like you continue trying to justify why you would or would not buy one. Price/performance, blah blah blah. You're not buying one. OK. We get it.


----------



## magnek

Quote:


> Originally Posted by *rcfc89*
> 
> Or maybe 1080 is priced too high? Nothing on the market to compare it to thus leading to its inflated price. I'm hoping for normal $1k Titan and see 1080's drop to $550-599. Then later seeing 1080Ti's at $699-750 range. Sorry to you early adopters of the 1080. There's always a fee especially when Amd is dragging their feet.


You're in dreamland if you think the 1080 Ti will come in at $699-750, especially if it comes with HBM2.

If the Titan P drops at $999 and is 30% faster than the 1080, it'll instantly obsolete the latter, since you'd be seeing almost linear scaling of performance with price. Thus, if the Titan P is to drop soon, it will have to be priced sky-high to avoid cannibalizing 1080 sales. That, and 16 GB of HBM2 won't be cheap either.


----------



## Jpmboy

^^ Titans have always been four-figure cards... and a halo product really does not impact sales of the down-card SKUs, as evidenced by the TX launching first. Different target markets.

Gonna have to get two of these for sure!


----------



## Chargeit

Quote:


> Originally Posted by *rcfc89*
> 
> What are the specs of your PC? Do you even own a high-end gpu? I'm going to guess the answer is No. Titan owners don't buy 1080's. Titan owners don't give a hoot of the premium price to own a Titan. Yet people like you continue to try to justify why or why you would not buy a Titan. Price/Performance blah blah blah. You're not buying one. Ok. We get it.


There's a big difference between buying something and being able to afford it. If money isn't an issue, then why hold on to some crusty old Titan X when a sleek new 1080 is out? Screw that.

Not that I think Titan X/980 Ti to 1080 is a smart upgrade, but smart upgrades are for people on a budget.


----------



## magnek

Quote:


> Originally Posted by *Jpmboy*
> 
> ^^ Titans have always been 4 figure cards... and a halo product really does not impact sales of the down-card skus. Evidenced by TX launching first. Different target markets.
> 
> gonna have to get two of these for sure!


We're talking about the Titan P vs. the 1080 here, not the Titan P vs. the 1080 Ti.

The 980 had already been on the market for 6 months before the Titan X released, so consumers had already been sufficiently milked. The Titan X also cost 81% more for about 30-35% more performance, so that very much goes in line with what I said about NOT wanting 1:1 scaling of performance with price.


----------



## GunnzAkimbo

I will buy 3 Titan Ps in SLI and 2x 2699 V4s on an ASUS Z10PE-D16 WS with 1024GB of RAM.

It will be used to browse porn sites.


----------



## DesmoLocke

used to "browse" in VR...


----------



## magnek

Quote:


> Originally Posted by *GunnzAkimbo*
> 
> i will buy 3 Titan P's, sli, and 2 x 2699V4's on a asus z10pe-d16 ws with 1024GB RAM.
> 
> it will be used to browse porn sites.


Do >16GB DDR4 RAM sticks exist? If not, you'll have to settle for 256GB of RAM.


----------



## azanimefan

Quote:


> Originally Posted by *iLeakStuff*
> 
> Lets see:
> Nvidia out with high end, ultra high end and midrange in August.
> 
> AMD is out with midrange.
> 
> GG AMD.
> *Hope Zen works out better for them than this "always late to the party" strategy they have with GPUs*.


I got booed out of an RX 480 thread for saying this about a month before its release. My point was that AMD is always a quarter late with its releases. Had the RX 480 come out alongside the 1080 and 1070, it would have been OK. It probably needed to be $50 cheaper to be a truly great release, but it was solid. The problem the RX 480 had was that it was a card for a market that didn't exist. Everyone who wanted a card in that performance bracket had already bought a 970, or a used 780, 290, or 390, almost two years earlier. A saturated market and a card $50 over AMD's market value for its offerings guaranteed it to be a flop. But the biggest problem of all was (as I pointed out then) that people would just say "wait for the 1060".

THAT IS AMD's PROBLEM in a nutshell. No one wants their ****, not at the price they want to sell it for, anyway. Nvidia is in the enviable position right now of basically being able to release **** on a stick, and people would line up to buy it like sheeple buy Apple products. The only way you beat that is to offer a better product for the same price OR the same product for a significantly lower price. You can't sell an inferior product at any price if you're AMD.

Zen is going to be a repeat. It will probably overclock like a dream, perform like Haswell, and give you more cores per dollar, but people are still going to say "Skylake-E is coming out shortly, wait for that!" or "Kaby Lake will be released in 4 weeks, wait for that!", and they'll go nowhere with it. They're completely out of touch with their market.


----------



## EniGma1987

Quote:


> Originally Posted by *magnek*
> 
> Do >16GB DDR4 ram sticks exist? If not you'll have to settle for 256GB ram


Can that Xeon use registered memory? If so, there are sticks larger than 16GB; if not, then there's nothing larger than 16GB yet.
http://www.newegg.com/Product/Product.aspx?Item=N82E16820242278
http://www.newegg.com/Product/Product.aspx?Item=N82E16820014090
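The capacity ceiling being debated is just slot math. A sketch, assuming the Z10PE-D16 WS has 16 DIMM slots (inferred from the "D16" in the board name; check the board manual before relying on it):

```python
# Maximum installable RAM is simply DIMM slots x largest supported stick.
def max_ram_gb(dimm_slots, largest_stick_gb):
    return dimm_slots * largest_stick_gb

SLOTS = 16  # assumed DIMM slot count for the Z10PE-D16 WS

print(max_ram_gb(SLOTS, 16))  # 16GB unbuffered sticks: 256 GB total
print(max_ram_gb(SLOTS, 64))  # if 64GB registered/LR-DIMMs work: 1024 GB total
```

So hitting 1024GB on that board depends entirely on whether the platform accepts registered DIMMs larger than 16GB, which is exactly the question above.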


----------



## SuperZan

Quote:


> Originally Posted by *azanimefan*
> 
> I got boo'ed out of a rx480 thread for saying this about a month before it's release. My point was AMD is always a quarter late with it's releases. Had the rx480 come out with the 1080 and 1070 it would have been ok. it probably needed to be -$50 cheaper to be a truly great release, but it was solid. The problem the rx-480 had was it was a card for a market which didn't exist. Everyone who wanted a card in that performance bracket already bought a 970... or a used 780 or a 290 or a 390 almost 2 years earlier. saturated market, poor quality card $50 over AMD market value for it's offerings, *guaranteed it to be a flop*. But the biggest problem of all was (as i pointed out then) people would just say "wait for the 1060".
> 
> THAT IS AMD's PROBLEM in a nutshell. No one wants their ****. Not at the price they want to sell it anyway. Nvidia is in the enviable position right now to basically be able to release **** on a stick and people would line up to buy it like sheeple buy apple products. The only way you beat that is offer a better product for the same price OR offer the same product for a significantly lesser price. You can't sell an inferior product at any price if you're AMD.
> 
> Zen is going to be a repeat. It will probably overclock like a dream, perform like a haswell, and give you more cores per $$, but people are still going to say "skylake-e is coming out shortly, wait for that!" or "Kaby lake will be released in 4 weeks, wait for that!" and they'll go no where with it. They're completely out of touch with their market


I mean, I follow your train of thought, and I wish AMD was quicker on the draw as well, but it's not like RX 480s aren't moving. If Zen can offer Haswell IPC as a hex/octa-core at a quad-core i7 price, it will move as well.

I get what you're saying and think it applies 100% to enthusiasts. For the lion's share of the market, though, I think AMD's approach makes sense. We enthusiasts might tell somebody to wait for Skylake-E, but your average PC gamer isn't buying HEDT and quad Titans.


----------



## guttheslayer

Quote:


> Originally Posted by *rcfc89*
> 
> What are the specs of your PC? Do you even own a high-end gpu? I'm going to guess the answer is No. Titan owners don't buy 1080's. Titan owners don't give a hoot of the premium price to own a Titan. Yet people like you continue to try to justify why or why you would not buy a Titan. Price/Performance blah blah blah. You're not buying one. Ok. We get it.


You are in dreamland. I can easily afford a Titan, but for that price I could be better off with a pair of FTW 1080s.

Stop trying to act bossy just because you are rich. There are plenty of people here who can afford a lot more than you can imagine without bragging.

I am not even justifying why the 1080 is better than the Titan; I am saying there is no need for the 1080 to drop in price even with the Titan around. Read with your eyes before you comment.

PS: just in case you didn't realise, I am in fact waiting for the Titan. How fast it will be will determine whether it was worth the hassle of waiting over a pair of 1080s. Not the budget.


----------



## magnek

Funniest part is the dude doesn't even own a Titan (X) himself.


----------



## SuperZan

Quote:


> Originally Posted by *magnek*
> 
> Funniest part is dude doesn't even own a Titan (X) himself.


Those posts always make me laugh. The real Fellowship of the Bling types are only ever noticed if you pay close attention. They're not bragging about what they have or belittling anybody else. You'll catch them in a thread about a particular monitor bug, where they'll ask if anybody else with a 9-panel wall-mounted configuration has encountered X problem. Or, in a thread about a new driver, wondering if anybody else with four water-cooled TX's is encountering this odd scaling bug.

Defining an enthusiast by whether or not they buy the most expensive of everything is silly, IMO. Most enthusiasts spend, but not always on a halo product, though on the whole we're more likely to buy or at least consider one. Some enthusiasts like to try a little bit of everything; some have a particular niche they love and will have 20 of a particular item. Many will weigh "most expensive" against "pretty close on performance and less needlessly price-jacked". It's about the love of the game, not how much you spend on each component!


----------



## magnek

I think you pretty much just delineated the problem with "hardware enthusiasts" these days. Sadly in this day and age of "overclocking by dragging a few sliders", a "hardware enthusiast" is simply someone who has deep pockets who always chases the latest and greatest regardless of whether it makes sense or not.


----------



## Slomo4shO

Quote:


> Originally Posted by *iLeakStuff*
> 
> There isnt any reason to why Titan P and 1080 cant be on the market simultaniously.


Gross Margins


----------



## guttheslayer

My point from the start is to highlight that the Titan doesn't need to compete with the 1080 on price. Thus, there's no need for a price drop.


----------



## NikolayNeykov

There will be a price drop anyway, like it or not. Maybe not for the Titan P, but I am sure there will be for the 1080, because it's not a quality card in the first place and people have already tasted that. See you in a few months.








Everything is moving forward, you cannot stop time.


----------



## guttheslayer

Quote:


> Originally Posted by *NikolayNeykov*
> 
> There will be a drop of the prices anyway, like it or not, maybe not for Titan P, but i am sure there will be for 1080, because it's not quality card at first place and people already tasted that, see you in few months
> 
> 
> 
> 
> 
> 
> 
> 
> Everything is moving forward, you cannot stop time.


I didn't get the 1080, and if I'm going for the Titan, whether the 1080 drops in price or not won't really matter, right?

The more interesting question is how much faster the Titan really is, not just based on "theoretical" ideal figures.


----------



## Lee Patekar

Quote:


> Originally Posted by *NikolayNeykov*
> 
> There will be a drop of the prices anyway, like it or not, maybe not for Titan P, but i am sure there will be for 1080, because it's not quality card at first place and people already tasted that, see you in few months
> 
> 
> 
> 
> 
> 
> 
> 
> Everything is moving forward, you cannot stop time.


They won't magically lower the price unless there's competition. Just look at Intel's desktop processors: the same price each generation, the same number of cores and features, only die shrinks, which lower production costs and increase profit margins. If people are buying all the stock at current price levels and AMD doesn't compete with the GTX 1080 directly, then the price won't drop and they'll never release a 1080 Ti. They'll release the bigger die much later, once GTX 1080 sales slow down significantly, brand it the GTX 1180, and the cycle continues.

Without competition you get monopolies, high prices, and stagnation.


----------



## GunnzAkimbo

Just hope it's not a rumor, because I really need an upgrade, but the 1080 is not amazing; I think the 1080 Ti will be the single-card 4K messiah (the Titan P will probably be some stupid price).


----------



## Jpmboy

http://www.crossmap.com/news/nvidia-pascal-news-rumors-and-updates-geforce-gtx-titan-p-is-the-true-2016-flagship-gpu-of-nvidia-29558
Quote:


> Originally Posted by *magnek*
> 
> We're talking about Titan P vs 1080 here, not Titan P vs 1080 Ti.
> 
> 980 was already on the market for 6 months before Titan X released, so the consumers have already been sufficiently milked. The Titan X also cost 81% more for about 30-35% more performance, so that very much goes in line with what I said about NOT wanting to have 1:1 scaling of performance with price.


Sorry bud... I had 2 TXs before the 980 was retail.
In either case, halo products are for a limited market. Whether you are that type of "enthusiast", defined in whatever contorted way you choose, likely depends on how much "casino" money is in your pocket.
Quote:


> Originally Posted by *SuperZan*
> 
> Those posts always make me laugh. The real Fellowship of the Bling types are only ever noticed if you pay close attention. They're not bragging about what they have or belittling anybody else. You'll catch them in a thread about a particular monitor bug, where they'll ask if anybody else with a 9-panel wall-mounted configuration has encountered X problem. Or, in a thread about a new driver, wondering if anybody else with four water-cooled TX's is encountering this odd scaling bug.
> 
> Defining an enthusiast by whether or not they buy the most expensive of everything is silly, IMO. Most enthusiasts spend, but not always on a halo product, though on the whole we're more likely to buy or at least consider one. Some enthusiasts like to try a little bit of everything; some have a particular niche they love and will have like 20 of a particular item. Many will weigh 'most expensive' versus 'pretty close on perf. and less needlessly price-jacked'. It's about the love of the game, not how much you spend on each component!


well said. +1
Quote:


> Originally Posted by *magnek*
> 
> I think you pretty much just delineated the problem with "hardware enthusiasts" these days. Sadly in this day and age of "overclocking by dragging a few sliders", a "hardware enthusiast" is simply someone who has deep pockets who always chases the latest and greatest regardless of whether it makes sense or not.


dragging sliders, soldering shunts, modifying bios'... and having deep pockets.








Sense has nothing to do with any of this... especially if you are talking gaming.


----------



## magnek

Quote:


> Originally Posted by *Jpmboy*
> 
> http://www.crossmap.com/news/nvidia-pascal-news-rumors-and-updates-geforce-gtx-titan-p-is-the-true-2016-flagship-gpu-of-nvidia-29558
> Sorry bud.... I had 2 TXs before the 980 was retail.
> In either case, halo products are for a limited market. Whether you are that type of "enthusiast", defined in whatever contorted way you choose is likely based on how much "casino" money is in your pocket.


You had two TXs before the 980 went retail?

Well then you must've gotten them through unofficial means, which means you're the exception rather than the norm. Halo products or not, it very much matters the time and order of product launch if you want to avoid cannibalizing certain SKUs.
Quote:


> dragging sliders, soldering shunts, modifying bios'... and having deep pockets.
> 
> 
> 
> 
> 
> 
> 
> 
> Sense has nothing to do with any of this... especially if you are talking gaming.


When I said sense I meant this:
Quote:


> Many will weigh 'most expensive' versus 'pretty close on perf. and less needlessly price-jacked'.


----------



## Ultracarpet

Quote:


> Originally Posted by *Slomo4shO*
> 
> Gross Margins


You're right, the margins would be pretty gross


----------



## NikolayNeykov

Quote:


> Originally Posted by *Lee Patekar*
> 
> They won't magically lower the price unless there's competition. Just look at Intel's desktop processors: same price each generation, same number of cores and features, only die shrinks, which lower production costs and increase profit margins. If people are buying all the stock at current price levels and AMD doesn't compete with the GTX 1080 directly, then the price won't drop and they'll never release a 1080 Ti. They'll release the bigger die much later, once sales of the GTX 1080 slow down significantly, brand it as the GTX 1180, and the cycle continues.
> 
> Without competition you get monopolies, high prices and stagnation.


Even the Extreme Edition processors have seen some nice price drops at my location; I don't know what you're really talking about.
A monopoly only means a 5-10% performance increase per generation, like Maxwell → Pascal or Haswell → Skylake; nothing else, really, and that suits NVIDIA and Intel just fine.
So basically the performance increase is low, and that is the bad part of it.


----------



## Lee Patekar

Quote:


> Originally Posted by *NikolayNeykov*
> 
> So basically the performance increase is low, and that is the bad part of it.


I think manufacturing costs decrease over time and with node changes. If the transistor count of a design stays fairly static, each node shrink decreases die size, increasing the number of parts per wafer and lowering cost per chip. So you can deliver small performance increases while growing profits simply by node-shrinking your base product.

For example:
Core i7-2700K -> 216 mm² -> $332
Core i7-3770K -> 160 mm² -> $313
Core i7-4770K -> 177 mm² -> $303
Core i7-6700K -> ?122.4 mm² -> $339

The only real increase in transistor count and performance across these chips is the integrated GPU; the CPU itself has been getting only minor advances. Now imagine if AMD were also competing in this segment. Do you believe we'd still be seeing 4 cores as the standard in the mainstream desktop space?
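The parts-per-wafer argument can be sketched with the standard dies-per-wafer approximation; the $5000 wafer cost below is a made-up illustrative number, not Intel's actual cost:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Rough dies-per-wafer estimate (ignores defect density and scribe lines).

    Uses the common approximation
        dies = pi * r^2 / area  -  pi * d / sqrt(2 * area)
    where the second term discounts partial dies lost at the wafer edge.
    """
    r = wafer_diameter_mm / 2
    whole_wafer = math.pi * r * r / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(whole_wafer - edge_loss)

# Die areas from the list above; the wafer cost is purely hypothetical.
wafer_cost = 5000
for name, area in [("i7-2700K", 216), ("i7-3770K", 160), ("i7-6700K", 122.4)]:
    n = dies_per_wafer(area)
    print(f"{name}: ~{n} dies/wafer, ~${wafer_cost / n:.0f} raw silicon per die")
```

Shrinking from 216 mm² to about 122 mm² nearly doubles the candidate dies per wafer, which is exactly the margin lever described above.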

The Extreme Editions are harder to quantify, but if I look at the cheapest full-featured model per family I get:

Core i7-3930K -> 435 mm² -> $583 (3.2 GHz / 6 cores)
Core i7-4930K -> 257 mm² -> $555 (3.4 GHz / 6 cores)
Core i7-5820K -> 356 mm² -> $389 (3.3 GHz / 6 cores)
Core i7-6800K -> ??? mm² -> $434 (3.4 GHz / 6 cores)

I wonder how much of this low end is subsidized by the larger profit margins on the higher-clocked chips. But even ignoring that, the Core i7-4930K must have been very profitable compared to the Core i7-3930K. Not sure what's up with the Core i7-5820K though; did it have such a low price because it was the lowest offering? Your guess is as good as mine (or maybe better).

I hope Zen brings some strong competition to drive down the cost of 8-core processors in general. We'll see. This lack of competition is bad for performance, features, and cost. And this is why I also hope that Vega meets or beats the GTX 1080: it would lower prices and force a response from NVIDIA (a 1080 Ti).


----------



## Yttrium

Quote:


> Originally Posted by *guttheslayer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rcfc89*
> 
> Having to wait a year after purchase for the Fury X to beat its competitor (980 Ti) in one benchmark in one game, while getting destroyed in all other major AAA titles. That makes me want to go out and buy a couple of AMD GPUs right now. Not.
> 
> 
> 
> AMD future-proofing doesn't work when they are so late to the game (Vega 9 months late) and you need to wait another 12 months to beat just one title. Lol.
> 
> Either way they are a flop right now, and I am not looking forward to their hyped-up Zen.

Vega is not "late"; it's a 2017 product, unless you're talking about a competitor to the 1080, which we knew wouldn't happen anyway: mainstream market and all that. And hyped up?! Everyone EXCEPT AMD hyped up the 480, with claims of reaching 1600 MHz on air and other insane rumours. AMD is as tight-lipped as they can be while still giving information to stockholders. Normally companies, like Intel, really try to keep the hype alive and give snippets of information away: release dates, specs, names, etc. AMD has done very little of this.

I guarantee you that Zen will get overhyped as its unknown release date approaches, by people on this forum, reputable news sites like wccftech, and so on, just like with the RX 480.


----------



## Klocek001

$1.2K for 12GB of GDDR5X, 3584 cores, and a 1.5GHz boost clock. Supposedly 60% faster than the Maxwell Titan X. Now where does the 1080 Ti fit in here? They won't cut the VRAM, for sure. Maybe a 3200-core 1080 Ti sold at $850?


----------



## flopticalcube

https://blogs.nvidia.com/blog/2016/07/21/titan-x/

/thread


----------



## renejr902

AAAAAHH!! I'm surprised! I'm in shock!! Already announced!!

But no HBM2 memory???


----------



## Yuhfhrh

Quote:


> Originally Posted by *Klocek001*
> 
> $1.2K for 12GB GDDR5X, 3584 cores and 1.5GHz boost clock. Supposedly 60% faster than Titan X Maxwell. Now where does 1080Ti fit in here ? They won't cut the vram for sure. Maybe a 3200 core 1080Ti sold at $850 ?


There will likely not be a 1080 Ti. This is already too similar to what a 1080 Ti would have been (a cut-down die, only 12 GB of memory).


----------



## renejr902

I agree with you!

Now I'm really confused. VR World was lying! I don't know what to do; I was expecting more performance for $1200.
I was expecting HBM2 memory, but 480 GB/s is OK; still, I'm disappointed. 3500+ cores, when I really wanted 3800+.
I was expecting a faster core clock too. I hope the overclocking potential is VERY HIGH!
I still don't get it: will this card be released by third parties like ASUS, MSI, and Gigabyte? It seems the answer is no; I hope I'm wrong.

For $999, no problem, but $1200 is a lot of money to ask. I can afford it, but seriously, is it worth it?

SOME GUYS HERE SHOULD BE ASHAMED!!! I'm one of the few who predicted an August or September release.

I'm proud of myself!! LOL

But seriously, nobody could really know. I'm still disappointed by this Titan card, and I'm disappointed that VR World lied to us so badly.

They are not honest people; they said they had tested the card in hand!

Anyway, I await your opinions on this Titan card, guys; I can't decide for myself whether to buy it or not.

Could Vega with HBM2 beat it? And when will AMD announce their HBM2 Vega?

When will NVIDIA release a better card than this new Titan X???

I don't think this card will be able to hold a stable 60 fps at 4K at ultra settings in games like Rise of the Tomb Raider, Far Cry Primal, or even The Witcher 3.

So many questions I'm asking myself now!! I'm confused, LOL.

I really don't know what to do...


----------



## guttheslayer

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Are you kidding??? With the supply issues we have seen with the much smaller chips in the 1080/1070 you really think a monster chip like P100 is going to have good yields??? You really need to lay off of that "hope" train dude, you are embarrassing yourself. Yields are always poor on brand new nodes, especially with massive chip sizes...


Now look! Who is embarrassing himself?

A Titan X with GP102 in August 2016. One big slap in the face, less than 3 months after the 1080, lol!

I dare you to ask me again if I'm kidding. Stop prophesying, because you ain't a prophet.
Quote:


> Originally Posted by *ChevChelios*
> 
> well then you should look how long it took for Titan to release when it was under a new node
> 
> it was 11 months IIRC
> 
> and when did a Titan ever come out within first 4-5 months of a new gen ? even on a mature node
> yes
> 
> and that wont be in August-September


I am laughing so hard at this now. Lesson learnt: don't act like you can prophesy the future and try to shove it down people's throats.


----------



## magnek

Quote:


> Originally Posted by *magnek*
> 
> Sure they can as long as Titan P is sufficiently more expensive than the 1080, which means >$1200 at a minimum.


Does this mean I'm a prophet too?









But in all seriousness, I most definitely did not expect GP102 to drop 2 months after GP104, yet given the price, it makes sense.


----------



## guttheslayer

Quote:


> Originally Posted by *magnek*
> 
> Does this mean I'm a prophet too?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But in all seriousness, I most definitely did not expect GP102 to drop 2 months after GP104, yet given the price, it makes sense.


I already said many times that NVIDIA will release the card when and where they like. It's even before Gamescom. It's that fast. So fast it makes me sweat.


----------



## sherlock

Quote:


> Originally Posted by *NikolayNeykov*
> 
> There will be a drop in prices anyway, like it or not, maybe not for the Titan P, but I am sure there will be for the 1080, because it's not a quality card in the first place and people have already tasted that; see you in a few months
> 
> 
> 
> 
> 
> 
> 
> 
> Everything is moving forward, you cannot stop time.


The 680's price didn't move an inch when the Titan dropped, and the 980's didn't for the Titan X either. It is only when the 780/980 Ti equivalent drops that there is a price cut to make room.


----------



## Klocek001

Quote:


> Originally Posted by *Yuhfhrh*
> 
> There will likely not be a 1080 Ti. This is already too similar to what a 1080 Ti would have been (cut down die, only 12GB memory)


So that means 980 Ti owners who did not think the 1080 was good enough for an upgrade will have to pay $1200 for a better one... and can recoup maybe $300 from selling the 980 Ti. 60% over the Titan X is roughly 55% over the 1070; they're probably using their own reference-vs-reference measurements, and they advertised the 1070 as faster than the TX. That'll mean roughly 30% over the 1080.
Well, 1080 SLI for $1200 sounds a lot better.
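The chain of ratios in that estimate can be spelled out; the 1070-vs-Titan-X and 1080-vs-1070 multipliers below are assumptions reconstructed from the post's reasoning, not benchmarks:

```python
# Assumed reference-vs-reference multipliers (not measured here):
#   GTX 1070 ~ 1.03x Titan X Maxwell (NVIDIA marketed the 1070 as slightly faster)
#   GTX 1080 ~ 1.20x GTX 1070
titan_xp_over_txm = 1.60  # claimed: 60% over the Maxwell Titan X

titan_xp_over_1070 = titan_xp_over_txm / 1.03   # ~1.55 -> "roughly 55% over 1070"
titan_xp_over_1080 = titan_xp_over_1070 / 1.20  # ~1.29 -> "roughly 30% over 1080"

print(f"over 1070: +{(titan_xp_over_1070 - 1) * 100:.0f}%")
print(f"over 1080: +{(titan_xp_over_1080 - 1) * 100:.0f}%")
```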


----------



## GunnzAkimbo

It's done.

https://www.techpowerup.com/224335/nvidia-announces-the-geforce-gtx-titan-x-pascal


----------



## ChevChelios

I was so horribly wrong

guttheslayer is my Lord and Prophet


----------



## Klocek001

Probably no AIB versions, just the TX Founders Edition for $1200. Wonder what clock speed it can reach on the reference cooler.


----------



## ChevChelios

Well, this definitely does not make me regret my 1080 purchase at all.

$1200 for a GPU? Nope, not a snowball's chance in hell,

especially if it will also be reference-only with no AIBs.

The waiting for the 1180/Volta continues.


----------



## Kpjoslee

At least the Titan XP is making 1080 SLI look like an absolute steal, lol. Clever NVIDIA.


----------



## ChevChelios

Quote:


> Meanwhile for distribution, making a departure from previous generations, the card is only being sold directly by NVIDIA through their website. The company's board partners will not be distributing it, though system builders will still be able to include it.


----------



## Klocek001

lol at those crapping on the 1080 now hurrying to buy two









If we're getting a *cut* chip with GDDR5X for $1200 and Volta is 2018, then I guess my second 1080 will be here too.


----------



## SuperZan

Quote:


> Originally Posted by *Klocek001*
> 
> lol at those crapping on 1080 now hurrying to buy two
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If we're getting a *cut* chip with gddr5x for $1200 and Volta is 2018 then I guess my second 1080 will be here too.


It certainly makes the pricing on the 1080 seem reasonable, if only by comparison. I've got a 1070 to satisfy my need to play with something new; hopefully Vega will be worth the wait, or I'll have to add another 1070 and call it a day.


----------



## daviejams

Wonder how much they will be in the UK ? Probably over a grand


----------



## SuperZan

Quote:


> Originally Posted by *daviejams*
> 
> Wonder how much they will be in the UK ? Probably over a grand


The way the pound is at the moment, I'd wager you've hit the mark. A quick-and-dirty maths estimate suggests £1,100 as a minimum. Ready to drop a kilotonne bomb on your wallet?
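That £1,100 guess checks out on the back of an envelope, assuming a mid-2016 rate of roughly $1.31 to the pound and 20% UK VAT (both figures are assumptions, not from the thread):

```python
usd_msrp = 1200        # US MSRP, excludes sales tax
usd_per_gbp = 1.31     # assumed post-Brexit-vote exchange rate, mid-2016
uk_vat = 0.20          # UK retail prices include 20% VAT

gbp_price = usd_msrp / usd_per_gbp * (1 + uk_vat)
print(f"~£{gbp_price:.0f}")  # ~£1099
```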


----------



## daviejams

Quote:


> Originally Posted by *SuperZan*
> 
> Way the pound is at the moment I'd wager you've hit the mark. Quick and dirty maths estimation suggests £1,100 as a minimum. Ready to drop a kilotonne bomb on your wallet?


Pretty much out of my budget

Could not justify spending that on a GPU


----------



## SuperZan

I'm in that same boat with you, mate. I could pay that for two nice cards, but I couldn't bring myself to drop that amount of money on a single card unless it were an astronomical improvement over all other available products and let me do things in games that I couldn't do with less.


----------



## mcg75

No point in keeping this rumor thread open now we have confirmation.

Locked.


----------

