# [Game Debate] Rumour: AMD Polaris 10 Reportedly offers near 980 Ti performance for 300 USD



## Majinwar

Source
Quote:


> AMD reportedly hosted an event designed to showcase its upcoming Polaris GPUs and the Radeon Pro Duo to journalists behind closed doors in Taiwan recently, ahead of an expected official unveiling in May. The big noise coming out of the event is that the switch to the 14nm FinFET fabrication process means the Polaris 10 GPU performs extremely close to the GeForce GTX 980 Ti, but for a drastically cheaper price point.


----------



## Newbie2009

Hmmm


----------



## Cakewalk_S

Cool cool...all is well...EXCEPT I REALLY want to see power consumption numbers. I can deal with the cooling with some CLU but I REALLY REALLY hope AMD can reduce their power consumption like there's no tomorrow.


----------



## 364901

Yeah, no, I can't believe that. 980 Ti performance for half the price, on a new process that is twice as expensive? That's literally implausible.

I can't even fathom how much such a GPU would break the industry (it most certainly would), and drive a price war that'll last far longer than it needs to. If anything, it'll be 980 Ti performance for $150 less, not $300 less.


----------



## TK421

Quote:


> Originally Posted by *Cakewalk_S*
> 
> Cool cool...all is well...EXCEPT I REALLY want to see power consumption numbers. I can deal with the cooling with some CLU but I REALLY REALLY hope AMD can reduce their power consumption like there's no tomorrow.


crossfire? more like house fire.

That being said, it should be fine if AMD keeps the AIO cooler from the Fury series.


----------



## davidelite10

Quote:


> Originally Posted by *CataclysmZA*
> 
> Yeah, no, I can't believe that. 980 Ti performance for half the price, on a new process that is twice as expensive? That's literally implausible.
> 
> I can't even fathom how much such a GPU would break the industry (it most certainly would), and drive a price war that'll last far longer than it needs to. If anything, it'll be 980 Ti performance for $150 less, not $300 less.


This is my stance, and even if it's true, then the Pro Duo is literally the worst-priced card on current release: 4x "980 Ti performance AMD cards" for 1200 bucks, or a Pro Duo for 300 more that's over 50% slower....


----------



## jsc1973

Quote:


> Originally Posted by *Cakewalk_S*
> 
> Cool cool...all is well...EXCEPT I REALLY want to see power consumption numbers. I can deal with the cooling with some CLU but I REALLY REALLY hope AMD can reduce their power consumption like there's no tomorrow.


Just the fact that it's on a 14nm process should make a big reduction in the power consumption all by itself.


----------



## mandrake88

It's too good to be true.


----------



## spyshagg

All those concerned with power consumption, Polaris was seen running hitman 4K60 passively (as in, with no fan).

Not some hearsay, it came from pcper iirc


----------



## 364901

Quote:


> Originally Posted by *spyshagg*
> 
> All those concerned with power consumption, Polaris was seen running hitman 4K60 passively (as in, with no fan).
> 
> Not some hearsay, it came from pcper iirc


You can do that with an R9 Fury today quite easily - as long as the card doesn't reach a specific temperature, you can have the fans off. Also, that was 1440p at 60fps, not 4K.

4K 60fps is still a hard thing to do for a single GPU, especially with a taxing game like Hitman.


----------



## davidelite10

Quote:


> Originally Posted by *spyshagg*
> 
> All those concerned with power consumption, Polaris was seen running hitman 4K60 passively (as in, with no fan).
> 
> Not some hearsay, it came from pcper iirc


And faildozer was seen running benches at 40%+ faster than ivy.


----------



## spyshagg

Quote:


> Originally Posted by *davidelite10*
> 
> And faildozer was seen running benches at 40%+ faster than ivy.


thaaanks for the irony.

However:
http://www.overclock3d.net/articles/gpu_displays/amd_polaris_10_engineering_sample_pictured/1

4K VR, not hitman though.


----------



## BinaryDemon

980Ti performance for $300, I think that would convert me.


----------



## Yvese

Those in denial clearly haven't been around long if they find this 'implausible'. I remember when the 4870 launched for $299 and matched or beat Nvidia's $399 GTX 260. Or when the 5850 launched and offered 50%+ more performance than a 4870 at an even lower launch price of $259. Those were the golden days and I'm glad to see them making a comeback.

This Polaris 10 will likely be the 480x. Having the mainstream GPU of the next-generation match the flagship of this generation isn't farfetched at all. It's expected.


----------



## Newbie2009

From what I have heard, this is going to blow Intel i7 series out of the water. Twice the power, half the price. All we have to do is wait.


----------



## zealord

The biggest question is.

In case this is true, what does AMD do with the Fury, Fury X, 390 and 390X ?


----------



## one-shot

Quote:


> Originally Posted by *CataclysmZA*
> 
> You can do that with a R9 Fury today quite easily - as long as the card doesn't reach a specific temperature, you can have the fans off. Also that was 1440p at 60fps, not 4K.
> 
> 4K 60fps is still a hard thing to do for a single GPU, especially with a taxing game like Hitman.


lol. That's the point. The fans didn't come on. What are you trying to say? Below certain temps you can have the fans off, and the Polaris 10 had its fans off. Not sure if srs....


----------



## one-shot

Quote:


> Originally Posted by *Newbie2009*
> 
> From what I have heard, this is going to blow Intel i7 series out of the water. Twice the power, half the price. All we have to do is wait.


Intel i7 isn't a GPU.


----------



## sugarhell

Quote:


> Originally Posted by *zealord*
> 
> The biggest question is.
> 
> In case this is true, what does AMD do with the Fury, Fury X, 390 and 390X ?


EOL?


----------



## mothergoose729

On a new process node with a new architecture, AMD ought to be able to match 980 Ti performance in that price range. The GTX 1070 (or whatever it will be called) ought to do the same.


----------



## zealord

Quote:


> Originally Posted by *sugarhell*
> 
> EOL?


Yeah, but I mean a 300$ card that performs like a 980 Ti would change everything.

AMD still has a lot of stock of those cards I mentioned left over (at least I think they do).

All those cards would be completely undesirable.


----------



## sugarhell

Quote:


> Originally Posted by *zealord*
> 
> yeah but I mean a 300$ card that perform like 980 Ti would change everything.
> 
> AMD has still a lot of stock of those cards I mentioned left over (atleast I think they do).
> 
> All those cards would be completely undesireable


I bet this card will be expensive. Like the 1080 from nvidia. 14nm is really expensive at the moment. I can see a cut down version to cost 300 bucks but not the full die.


----------



## Newbie2009

Quote:


> Originally Posted by *one-shot*
> 
> Intel i7 isn't a GPU.


Exactly, so much better, won't even be comparable.


----------



## Cakewalk_S

Quote:


> Originally Posted by *sugarhell*
> 
> > Originally Posted by *zealord*
> > 
> > yeah but I mean a 300$ card that perform like 980 Ti would change everything.
> > 
> > AMD has still a lot of stock of those cards I mentioned left over (atleast I think they do).
> > 
> > All those cards would be completely undesireable
> 
> I bet this card will be expensive. Like the 1080 from nvidia. 14nm is really expensive at the moment. I can see a cut down version to cost 300 bucks but not the full die.

AMD can't really afford an expensive card. They're basically going to have to produce this card with a low margin and high volume in mind. If they want to compete with Nvidia they're really going to have to make one cheap card...


----------



## Shadymort

Well, here's the "sweet spot" strategy for ya. I can't understand why 980 Ti (I presume not overclocked) performance at 300$ (more probably 350-ish) from RTG is so absurd. Regarding the price, while 14nm is more expensive, we are talking about a far smaller chip here. I would have hoped for better than 980 Ti perf for that price at this point.


----------



## i7monkey

Quote:


> Originally Posted by *Yvese*
> 
> Those in denial clearly haven't been around long if they find this 'implausible'. I remember when the 4870 launched for $299 and matched or beat Nvidia's $399 GTX 260. Or when the 5850 launched and offered 50%+ more performance than a 4870 at an even lower launch price of $259. Those were the golden days and I'm glad to see them making a comeback.


Hate to burst anyone's bubble but the AMD Radeon Pro Duo is selling for $1500, so why would anyone get a Pro Duo if they can spend $600 and get 2 Polaris 10's and match it in performance at almost one third the cost?


----------



## Juub

It's probably gonna be far more expensive than 300$. Most likely 400$.

That's apparently what the GPU inside the PS4 Neo is based on.


----------



## davidelite10

Quote:


> Originally Posted by *spyshagg*
> 
> thaaanks for the irony.
> 
> However:
> http://www.overclock3d.net/articles/gpu_displays/amd_polaris_10_engineering_sample_pictured/1
> 
> 4K VR, not hitman though.


No settings or anything else, and it's just a demo, not a real-world scenario.
Just like the Fury X "Overclocker's dream"!

Again, I'll believe it when I see it, and I'm definitely not a fanboy by any means.
I've just been burned by AMD hype for a while. I really hope it does knock the 980 Ti out of the park; that'd be sweet.


----------



## Laserlight

When something sounds too good to be true, it usually is. I want AMD to release good things this year to put some serious competitive pressure on Nvidia, but this is too far-fetched to believe.


----------



## Shadymort

Quote:


> It's probably gonna be far more expensive than 300$. Most likely 400$.
> 
> That's apparently what the GPU inside the PS4 Neo is based on.


That would be Polaris 11. Polaris 10 is far too pricey for the PS4 Neo.


----------



## sugarhell

Quote:


> Originally Posted by *Cakewalk_S*
> 
> AMD can't really afford an expensive card. They're basically going to have to produce this card with a low margin and high volume in mind. If they want to compete with Nvidia they're really going to have to make one cheap card...


The top cards are not high-margin, high-volume products.

Even with the 980/970, the best-selling card was the 970 by far.

A cut-down version of Polaris 10 is gonna hit that spot. But don't expect the full die of Polaris 10 to cost just 300 bucks.

With the current prices of 14nm, and if we expect a size of >250mm², then I think it's impossible.

But I can be wrong.


----------



## criminal

Quote:


> Originally Posted by *sugarhell*
> 
> EOL?


Might as well. Why would any of those sell if this rumor is true? I was already hyped for Polaris 10, but this would be epic.

I still find it plausible that the full-die Polaris 10 will cost $450 and the cut chip will be $300-349.


----------



## guttheslayer

The reason why AMD can price it so low is because of its incredibly small die size of 232 mm².

The smaller the die, the easier it is to manage yield.
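That intuition can be made concrete with a first-order Poisson defect-yield model. The sketch below is illustrative only: the defect density is an assumed placeholder (not a published GloFo/Samsung figure), and 601 mm² (GM200, the 980 Ti's chip) is used purely for scale:

```python
import math

def poisson_yield(die_area_mm2, defects_per_cm2):
    """First-order Poisson yield model: Y = exp(-D0 * A)."""
    area_cm2 = die_area_mm2 / 100.0
    return math.exp(-defects_per_cm2 * area_cm2)

# Assumed defect density for a young process node (placeholder value)
d0 = 0.2  # defects per cm^2

small = poisson_yield(232, d0)  # rumored Polaris 10 die size
large = poisson_yield(601, d0)  # GM200 (GTX 980 Ti) die size, for scale

print(f"232 mm^2 die yield: {small:.1%}")
print(f"601 mm^2 die yield: {large:.1%}")
```

With these made-up numbers, the small die yields roughly twice the fraction of good chips, which is the usual argument for why small dies are cheaper on an immature node.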


----------



## Newbie2009

Quote:


> Originally Posted by *guttheslayer*
> 
> The reason why AMD can price it so low is because of its incredibly small die size of 232 mm².
> 
> The smaller the die, the easier it is to manage yield.


No, it's because AMD loves us. Even you.


----------



## Slaughterem

Quote:


> Originally Posted by *i7monkey*
> 
> Hate to burst anyone's bubble but the AMD Radeon Pro Duo is selling for $1500, so why would anyone get a Pro Duo if they can spend $600 and get 2 Polaris 10's and match it in performance at almost one third the cost?


The Pro Duo is not marketed for you but for devs. You see, the gaming business does not revolve around the 1% of you who must have the fastest card at the time. Your comments IMO represent a shortsighted, all-about-me attitude. I tend to look at future reasons: planting the seed for *multi GPU*, not only for VR but for devs working on DX12 and Vulkan. This is the big picture, and Nvidia has been planting systems with devs for a long time.
There is this little NAVI thing that will hit the market in 2018. Navi by definition is a variable star, both intrinsic and extrinsic. Dual Vega or Polaris, or APU + GPU on an interposer with HBM2 or next-gen memory, will be what this is about IMO. Devs now have a card that they can use to design their next game engines on. Even Nvidia will be looking towards this type of platform because of NVLink. You may not realize, or want to realize, that AMD is the company moving the gaming industry forward while Nvidia is moving towards a major opportunity with AI and computerized cars.


----------



## Ghoxt

The price spoken would never occur. AMD has stockholders that would go...ballistic, for them leaving money on the table. As someone else mentioned, they don't like us that damn much to basically give performance away at that price. Competitive discount yes, but $300? FUD.


----------



## criminal

Quote:


> Originally Posted by *Slaughterem*
> 
> Pro Duo is not marketed for you but for DEVS. You see the gaming business does not revolve around you 1% who must have the fastest card at the time. Your comments IMO represent the attitude of being all about me and shortsighted. I tend to look at future reasons. Planting the seed for *multi gpu* not only for VR but for devs for DX12 and Vulcan. This is the big picture and Nvidia has been planting systems with DEVS for a long time.
> There is this little NAVI thing that will hit the market in 2018. NAVI by definition is a Variable star both intrinsic and extrinsic. Dual Vega or Polaris or APU + gpu on interposer with HBM2 or next gen memory will be what this is about IMO. DEVS now have a card that they can use to design their next game engines on. Even Nvidia will be looking towards this type of platform because of NVLink. You may not realize or want to realize that AMD is the company moving the gaming industry forward while Nvidia is moving towards a major opportunity with AI and computerized cars.


I have to agree. The Pro Duo being so late to the party means either AMD screwed up royally again, or its purpose has more to do with development than it does gaming.


----------



## sugarhell

Quote:


> Originally Posted by *criminal*
> 
> I have to agree. The Pro Duo being so late to the party means either AMD screwed up royally again, or its purpose has more to do with development than it does gaming.


The FirePro Duo has FirePro drivers for specific applications. It's a niche, but dual GPUs were a niche already.


----------



## Ghoxt

Quote:


> Originally Posted by *i7monkey*
> 
> Hate to burst anyone's bubble but the AMD Radeon Pro Duo is selling for $1500, so why would anyone get a Pro Duo if they can spend $600 and get 2 Polaris 10's and match it in performance at almost one third the cost?


Having a Titan-Z flashback lol.


----------



## delboy67

Quote:


> Originally Posted by *guttheslayer*
> 
> The reason why AMD can price it so low is because of its incredibly small die size of 232 mm².


Agreed. From the rumors I've read: 232mm², 135W, HD 7870/270X-tier branding and price, and near-980 Ti performance. I'm gonna be optimistic, because I'm overdue a new card and this would be perfect.


----------



## criminal

Quote:


> Originally Posted by *Ghoxt*
> 
> Having a Titan-Z flashback lol.


Even Pro Duo is a deal compared to that turd.


----------



## Vesku

Quote:


> Originally Posted by *zealord*
> 
> yeah but I mean a 300$ card that perform like 980 Ti would change everything.
> 
> AMD has still a lot of stock of those cards I mentioned left over (atleast I think they do).
> 
> All those cards would be completely undesireable


A few weeks ago I read a post about someone RMAing to XFX. They were out of what he had, and they also told him they couldn't elevate his replacement to a 390/390X because AMD was no longer shipping new ones, i.e. EOL. Could have been BS, but it would fit with a strong Polaris 10 & 11 meaning no 28nm holdouts once they launch.


----------



## Master__Shake

Quote:


> Originally Posted by *criminal*
> 
> Even Pro Duo is a deal compared to that turd.


hey now.

just cause it was overpriced and abandoned like a used sock doesn't mean....

nevermind.

i'm sure the 3 people who bought them are happy with them.


----------



## m70b1jr

Should I save towards a new car while I'm making 7.25, or buy a $300 near-980 Ti performance chip from AMD? Well, guess I won't be having a car anytime soon.


----------



## lukart

If that happens, I wonder how many returns Newegg will have from all the guys who bought the expensive Nvidia gear.


----------



## 364901

Quote:


> Originally Posted by *one-shot*
> 
> lol. That's the point. The fans didn't come on. What are you trying to say? At a certain temps you can have the fans off, and the Polaris 10 had its fans off. Not sure if srs....


I'm pointing out that this post:

Quote:


> Originally Posted by *spyshagg*
> 
> All those concerned with power consumption, Polaris was seen running hitman 4K60 passively (as in, with no fan).
> 
> Not some hearsay, it came from pcper iirc

implies that people shouldn't be worried about power consumption because Polaris was running a game while being passively cooled.

That's not an indication of power consumption; it's an indication that heat output is low enough for the card to run the game with the fans off (or, alternatively, that the fans are set to turn on only above a specific temperature).

The fact that you can do that with existing high-end 28nm GPUs today says nothing about Polaris' power consumption.


----------



## Ultracarpet

Yea, I mean, Nvidia released the 970 with the Titan Z still on the market.... If AMD has a card they think they can aggressively gain market share with, they will do so. Who cares if they cannibalize a few sales of an extremely low-volume product, especially when they only have 20% of the market to begin with.

Tldr: stealing sales from Nvidia will make a much larger difference than protecting the sales of their old and arguably failed products.


----------



## TheLAWNOOB

Quote:


> Originally Posted by *m70b1jr*
> 
> Should I save towards a new car making while i'm making 7.25 or buy a $300 near performance 980Ti chip from AMD. Well, guess I won't be having a car anytime soon.


Just starve yourself for a week or two and you can afford the card.

Good thing minimum wage in Canada is higher. Working slightly above Canadian minimum wage isn't too bad.


----------



## prjindigo

Quote:


> Originally Posted by *CataclysmZA*
> 
> Yeah, no, I can't believe that. 980 Ti performance for half the price, on a new process that is twice as expensive? That's literally implausible.
> 
> I can't even fathom how much such a GPU would break the industry (it most certainly would), and drive a price war that'll last far longer than it needs to. If anything, it'll be 980 Ti performance for $150 less, not $300 less.


AMD's GloFo/Samsung 14nm process combination isn't new - Samsung is using it in RAM, and it rocks and sips power.

Polaris 10 is a small die = much, much lower loss rates due to normal lithography errors.

Polaris 10 draws little power (at 14nm, woo) and thus needs less hardware to feed it.

14nm is not twice as expensive; 16nm is twice as expensive as 28nm.


----------



## iLeakStuff

Although I'm certain AMD's strategy this year is to beat Nvidia on price/performance, I think $300 is stretching it.
I think the Polaris 10 card is sort of what the 7870 was when it launched: it beat the previous flagship, and was midrange for AMD.

The 7870 launched at $360, and that is where I'm guessing Polaris 10 will land: $350 to max $400,
matching or slightly beating the Fury X; matching the GTX 1070 but cheaper.


----------



## delboy67

Quote:


> Originally Posted by *m70b1jr*
> 
> Should I save towards a new car making while i'm making 7.25 or buy a $300 near performance 980Ti chip from AMD. Well, guess I won't be having a car anytime soon.


You'll never be done putting money into the car so I'd get the card first.


----------



## Sand3853

I think positioning and price really come down to how AMD plans to use Vega. Are the Vega chips the new Fury line, or the 490 and 490X... or both? If Vega is all performance/enthusiast, with Vega 11 = 490 and 490X and Vega 10 = Fury 2, then I think it's entirely plausible that Polaris 10 (480/480X) will land somewhere between $200 and $300 with the performance as rumored. Heck, if things do play out where Polaris 10 comes within 980 Ti performance at significantly lower power usage and cost as rumored, I'll happily ditch my 290X for one and then save up for Vega.


----------



## TAr

Release date?


----------



## iLeakStuff

Quote:


> Originally Posted by *TAr*
> 
> Release date?


This year


----------



## Yvese

Quote:


> Originally Posted by *m70b1jr*
> 
> Should I save towards a new car making while i'm making 7.25 or buy a $300 near performance 980Ti chip from AMD. Well, guess I won't be having a car anytime soon.


Forget the new car. Buy a used 1998 - early '00s Toyota/Honda. Those things last 300k+ miles easily. Aside from regular maintenance, you'd save so much money rather than losing half your car's value by the time you pay it off on a four-year term. There's a reason Hondas and Toyotas are the most popular cars.

Buy used, save money, have money for the card and more.


----------



## Noufel

Isn't this the end for the Pro Duo card and the Fury lineup? Who in their sane mind would buy the Fury Pro Duo if Polaris 10 CFX outperforms it for less than half the price?
Please don't bring me the form factor argument.


----------



## zealord

Quote:


> Originally Posted by *sugarhell*
> 
> I bet this card will be expensive. Like the 1080 from nvidia. 14nm is really expensive at the moment. I can see a cut down version to cost 300 bucks but not the full die.


AMD tried expensive with the Fury X. I think it is safe to say they failed big time.

Also, Polaris 10 will probably be a very small die.

I could see a 14nm ~250mm² Polaris chip being roughly on par with a 980 Ti. Remember, the 980 Ti is a cut-down chip.

But to be fair, there is probably a huge difference between what I think will happen, what could happen, and what actually will happen.

The price of Polaris 10 is heavily reliant on the GTX 1070, I think.


----------



## iLeakStuff

The Fury X would have been super popular if AMD had released it at the same time as the GTX 980 Ti.
AMD's biggest problem is their launch timing. But I guess that is related to having less money to speed up development.

Vicious cycle :/


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *iLeakStuff*
> 
> Fury X would have been super popular if AMD released Fury X at the same time as GTX 980Ti.
> AMDs biggest problems is their launch timing. But I guess that is related to less money to speed up development.
> 
> Vicious cycle :/


It was only like two weeks after the 980 Ti IIRC, but I think it was a paper launch with a couple more weeks until they were available at etailers. It would have been a nice card if it could OC worth a damn. You could easily get 25-30% more perf out of the 980 Ti, but the Fury X, not so much.


----------



## zealord

Quote:


> Originally Posted by *iLeakStuff*
> 
> Fury X would have been super popular if AMD released Fury X at the same time as GTX 980Ti.
> AMDs biggest problems is their launch timing. But I guess that is related to less money to speed up development.
> 
> Vicious cycle :/


I hate Nvidia's pricing, and yet I think the 980 Ti is a better buy than a Fury X.

Mostly because of the overclocking headroom.


----------



## iLeakStuff

Yeah, you are both right. There were other factors as well that made the Fury X the less desirable choice. Overclocking was certainly one of them, as the 980 Ti is in an entirely different field there.
AMD also struggled big time with getting cards out to the public, most likely due to yields.


----------



## erocker

Ahhhhh! A rumor I like! Good to hear about a possible bargain rather than a price increase for an incremental performance gain.


----------



## SSJVegeta

Due to be released in June?


----------



## clao

Too good to be true, but if it is, I am getting one (though I will still keep my trusty 290X).


----------



## 364901

Quote:


> Originally Posted by *Yvese*
> 
> Those in denial clearly haven't been around long if they find this 'implausible'. I remember when the 4870 launched for $299 and matched or beat Nvidia's $399 GTX 260. Or when the 5850 launched and offered 50%+ more performance than a 4870 at an even lower launch price of $259. Those were the golden days and I'm glad to see them making a comeback.
> 
> This Polaris 10 will likely be the 480x. Having the mainstream GPU of the next-generation match the flagship of this generation isn't farfetched at all. It's expected.


The problem I have with these kinds of statements is that, yes, they were true, but that's no longer the case today. Look at Pascal compared to Maxwell - it's almost Maxwell 3.0, with a few tweaks that don't change the make-up too much but add up to some serious performance boosts. Do you think it's wise to launch a brand-new architecture on a new process that no one's ever used for desktop GPUs? My personal opinion is that it's not.

That's why, looking at AMD's market positioning of Polaris 10 and 11, I don't expect there to be replacements for the R9 Fury, Nano, and Fury X cards. I don't even expect a successor to Hawaii. I expect Polaris 10 to replace Tonga and Pitcairn, while Polaris 11 replaces Bonaire and maybe Cape Verde. AMD has tested out a new process on an existing, tweaked architecture before, and that's how they got the jump on NVIDIA when they launched their high-end HD 5000 cards a few months later.

Vega, I expect, will be the high-end GPU family. It'll replace Fiji and Hawaii, usurping both families with interposer and HBM2 technology. If Polaris 10 has the performance of any previous GPU family, it'll likely be Hawaii levels of performance at a lower price, on a 256-bit bus with GDDR5 memory, priced to deliver fluid VR gaming at a reasonable level.


----------



## SlackerITGuy

Had a good laugh after reading the title.

There's absolutely no way this is true, no one should believe this.


----------



## mouacyk

Quote:


> Originally Posted by *erocker*
> 
> Ahhhhh! A rumor I like! Good to hear about a possible bargain rather than a price increase for an incremental performance gain.


Keep this rumor up and AMD will have to deliver. You know they just downsized their headquarters.


----------



## CasualCat

I'm not concerned about power consumption. If they can pull off 980Ti performance at $300, my HTPC is getting a new GPU.


----------



## tweezlednutball

almost time to retire the good ol 7970's. it was fun old pals.


----------



## Majinwar

Quote:


> Originally Posted by *tweezlednutball*
> 
> almost time to retire the good ol 7970's. it was fun old pals.


My 4870 is tired.. It wants to go home! lol


----------



## xx9e02

Quote:


> Originally Posted by *Majinwar*
> 
> My 4870 is tired.. It wants to go home! lol


My old 4770 is still chugging along in a friend's PC. It's time for it to take a nice long rest as well! Can't wait for summer.


----------



## Clovertail100

Quote:


> Originally Posted by *davidelite10*
> 
> This is my stance, and even if this is true then the pro duo is literally the worst price point card on current release. 4x""""980ti performance AMDcards"""" for 1200 bucks or a produo for 300 more and over 50% slower....


Is it any surprise? Single-card dual-chip and premium single-chip cards always offer the worst price/performance.

See: 5970, 6990, 7990, 590, 690, Titan, Titan Black, Titan X.

New nodes always bring dramatic improvements. Previous nodes becoming obsolete is not a new phenomenon.


----------



## Serios

Quote:


> Originally Posted by *criminal*
> 
> I have to agree. The Pro Duo being so late to the party means either AMD screwed up royally again, or its purpose has more to do with development than it does gaming.


AMD sold a bunch of these dual GPUs as FirePros for a supercomputer. That is why they developed it, and it's also aimed at VR developers. Not every single card is gaming-oriented, and AMD doesn't need sales in the gaming market for it to be profitable.


----------



## f33t

And did all the previous processes actually justify their prices? Or were they overpriced to begin with?


----------



## Distaste

I feel like I'm missing something. I mean, 980 Ti performance for $300 would be awesome IF Nvidia weren't also going to release new cards. Doesn't the x70 version of Nvidia cards usually meet or beat the x80 version of the previous gen? Aren't they usually around $350? So assuming Nvidia releases their new cards anywhere near these, I'm not getting the amazement. I'm not saying it won't be a great card or that it won't win price/performance; I simply don't think it's that mind-boggling that a next-gen card can meet or beat a last-gen card.


----------



## GTRagnarok

Hoping this is true so I can grab a used 290/390/X or even a Fury for cheap. I got a used 7970 last year for $110 that I recently sold for a profit


----------



## caliking420

Quote:


> Originally Posted by *SlackerITGuy*
> 
> Had a good laugh after reading the title.
> 
> There's absolutely no way this is true, no one should believe this.


Ah, someone that understands.

Even if they cut the cost enough to do this, they still wouldn't.
Why, you ask? Because they run a business, and that's how business works. The shareholders want to see the biggest return on their investment they can.

Example:
You're selling a house worth 100k. Sure, you might drop the price to be competitive in the market.
But whoever said "nahhh, just sell it for as low as you can because I love people"? No one, ever.

Edit:
Not saying I don't believe that this card will perform great. I'm not even saying it won't perform on par with a 980 Ti, because it might.
What I don't believe is that they would intentionally undercut their own profit by that much when they could just as easily not.


----------



## Ultracarpet

Quote:


> Originally Posted by *caliking420*
> 
> Ah, someone that understands.
> 
> Even if they cut the cost enough to do this, they still wouldn't.
> Why, you ask? Because they run a business, and that's how business works. The shareholders want to see the biggest return on their investment they can.
> 
> Example:
> You're selling a house worth 100k. Sure, you might drop the price to be competitive in the market.
> But whoever said "nahhh, just sell it for as low as you can because I love people"? No one, ever.
> 
> Edit:
> Not saying I don't believe that this card will perform great. I'm not even saying it won't perform on par with a 980 Ti, because it might.
> What I don't believe is that they would intentionally undercut their own profit by that much when they could just as easily not.


It's a very small chip that uses cheap memory and doesn't require a robust power delivery system. They should easily be able to maintain a margin at reduced prices, and if they believe this card can win back considerable market share, they will price it accordingly.


----------



## tajoh111

Quote:


> Originally Posted by *Ultracarpet*
> 
> It's a very small chip that is using cheap memory, and does not require a robust power delivery system. They should easily be able to maintain a margin at reduced prices, and if they believe this card can gain considerable market share back, they will price it accordingly.


There are two sides to the coin. They can either gain market share, which is good in the short term, or tacitly (never in speech or writing) cooperate with Nvidia to raise graphics card prices to account for the increased cost of production, which is good for both companies in the long term.

Nvidia, with its brisk sales and high margins, would take a loss if it maintained current pricing. This is because Nvidia's cost of production is between 500 and 600 million dollars. Add the increased wafer cost on top of that and we have losses. They would still make money before operating expenses, but factor in the overhead, R&D, and administration and they take a loss.

The same happens to AMD: price too low and they torpedo the market and get into a race to the bottom. Nvidia fights back and both companies end up in a price war, where the company with the most cash on hand wins. Too low a price cripples margins to the point where gross profits aren't enough to cover the rest of the expenses. So although gaining market share is good in the short term, margins so thin that no profit is left to pay the remaining expenses are bad in the long run. Margins have to be bigger than ever to cover the massive increase in R&D expenditure, and with wafer prices rising, video card prices have to go up to make up the difference. I think the Fury Pro Duo's price reflects this, and Nvidia cutting supplies of most of its Maxwell series supports this theory.

Also, if you look at some of the expected die sizes, it seems the companies have aligned their products to interfere with each other's lineups as little as possible: GP104 is 320 mm², Polaris 10 is 232 mm², GP106 is 180 mm², and Polaris 11 is likely 130 mm². I think the companies are going to avoid price wars by segmenting the market so that if a customer wants a certain product at a specific price, there is only one card in that segment.


----------



## Peter Nixeus

My wallet is ready...


----------



## Clocknut

This is more likely to be the Hawaii replacement, maybe at Fury Nano performance.


----------



## kaosstar

Perhaps their R&D budget for the new chip and node shrink are being significantly subsidized by the PS4K and Nintendo NX contracts. In that case, the pricing seems reasonable.

It also may be part of a strategy to build up their market share. Perhaps they predict that DX12 will really take off within the next 2 years or so, and that their DX12 performance lead over Nvidia will stay the same or widen. In that case, they'd want to expand their market share at the cost of short term profits, so that they can rake everyone over the coals on Vega or Navi.

BTW, I'm not holding my breath. We've rarely, if ever, seen a positive rumor regarding AMD hardware turn out to be true, at least since Bulldozer.


----------



## ebduncan

People are losing their minds over these new cards.

14nm and 16nm cards will change the game just like 28nm did over 40nm GPUs. I mean, the 7870 came out and pretty much tied the 6970, aka the flagship card. Going by die costs, AMD wants to completely replace its 28nm cards if it can achieve better performance with a smaller die. Replacing a 400+ mm² die with a 232 mm² die that performs the same or better not only saves them money, it will save you money in reduced power consumption. Margins improve for AMD because instead of selling a big die for $300-400, they can sell a small die for the same price.

Everyone already knows that big Pascal and Vega are the true next-generation flagships. We will find out in 5-6 weeks exactly how these cards stack up, and the next-generation wars between AMD and Nvidia can resume.


----------



## Forceman

Quote:


> Originally Posted by *ebduncan*
> 
> I mean the 7870 came out and pretty much tied the 6970 aka the flagship card.


Yeah, but the 6970 wasn't very big. The ratio of Pitcairn to Cayman is the same as Polaris 10 to Hawaii. Matching Hawaii is pretty much expected; not so sure about the considerably bigger Fiji.

Assuming the 232 mm² rumor is correct, which I'm not convinced of.


----------



## ebduncan

Quote:


> Originally Posted by *Forceman*
> 
> Yeah, but 6970 wasn't very big. The ratio of Pitcairn to Cayman is the same as Polaris 10 to Hawaii. Matching Hawaii is pretty much expected, not so sure about the considerably bigger Fiji.
> 
> Assuming the 232 mm rumor is correct, which I'm not convinced of.


The gap between a Fury X and a 390X isn't really that much; for the increase in shaders it certainly didn't scale well. And AMD have said themselves that Polaris is the largest jump in performance ever between generations. Even if you take the rumor of a 232 mm² die, 14nm alone offers near double the density of 28nm. This isn't just a die shrink, either: they also revamped the shaders themselves this time, instead of the minor tweaks they made to GCN before. The only concern I have is memory bandwidth. It's safe to say that Polaris 10 will outperform Hawaii.


----------



## Forceman

Quote:


> Originally Posted by *ebduncan*
> 
> The gap between a Fury X and a 390X isn't really that much; for the increase in shaders it certainly didn't scale well. And AMD have said themselves that Polaris is the largest jump in performance ever between generations. Even if you take the rumor of a 232 mm² die, 14nm alone offers near double the density of 28nm. This isn't just a die shrink, either: they also revamped the shaders themselves this time, instead of the minor tweaks they made to GCN before. The only concern I have is memory bandwidth. It's safe to say that Polaris 10 will outperform Hawaii.


I'm sure we'll see, but I think if it was a Fiji beater they'd be talking performance, not performance/watt. And I'm also worried about how they are going to get Fiji performance out of a 256-bit bus (although that also applies to the 1070).


----------



## Chargeit

Hope so. This would force Nvidia to be more competitive with their pricing.

Got to feed my G-Sync monitors.


----------



## Majin SSJ Eric

To be clear, I don't buy this rumor at all, but IF it did turn out to be true it would lead to a very interesting divergence of strategies by AMD and Nvidia. You'd essentially have AMD ceding flagship bragging rights to Nvidia while trying to corner the market on midrange sales ("midrange" as in pricing a la the 970). It would also put tremendous pressure on Nvidia's pricing strategy for GP104 and possibly even their future flagship offerings next year (a $300 Polaris card with the same performance as a 980 Ti would likely win AMD back a lot of market share, making obscenely priced Nvidia flagships less tenable than they have been since 28nm).

To sum up, this would be a game-changer the likes of which we have not seen in the GPU segment since all the way back in the 55nm days, and it would have a massively positive impact on consumers of both AMD and Nvidia cards. That is one of the main reasons I don't believe this rumor for a second; but then again, this is the fastest and most sure-fire way for AMD to make a dent (or more than a dent) in Nvidia's commanding market share lead. I just don't think 14nm is going to be cheap enough for this sort of aggressive pricing to make sense...


----------



## KarathKasun

$300 pricing with the current yields will net them a huge profit. Current data on 14FF at GloFo/Samsung puts the cost per P10 die at ~$60. AFAIK, at the $300 price point, they could be making $60 profit from each chip and the cards would still make good money for the AIB partners.

The HD 4870 launch was similar, if memory serves. Forced NV prices down by something like 40% overnight.
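As a sanity check on per-chip figures like the ~$60 quoted above, here is a minimal sketch using a textbook dies-per-wafer approximation and a Poisson yield model. The wafer cost and defect density below are illustrative assumptions, not known figures; only the rumoured 232 mm² die size comes from the thread.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Gross dies per wafer: wafer area over die area, minus an edge-loss term."""
    r = wafer_diameter_mm / 2
    return math.floor(
        math.pi * r ** 2 / die_area_mm2
        - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    )

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Poisson yield model: fraction of dies that land with zero defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

# Illustrative assumptions -- NOT known figures:
WAFER_COST = 9000.0   # USD per 14nm FinFET wafer (assumed)
DIE_AREA = 232.0      # mm^2, the rumoured Polaris 10 size
D0 = 0.2              # defects per cm^2 (assumed)

gross = dies_per_wafer(DIE_AREA)
good = gross * poisson_yield(DIE_AREA, D0)
print(f"gross: {gross}, good: {good:.0f}, cost per good die: ${WAFER_COST / good:.0f}")
```

With these assumptions the cost per good die lands in the $50-60 ballpark, so the quoted ~$60 is at least internally plausible; push the wafer cost or defect density up and it climbs quickly.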


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *KarathKasun*
> 
> $300 pricing with the current yields will net them a huge profit. Current data on 14FF at GloFo/Samsung puts the cost per P10 die at ~$60. AFAIK, at the $300 price point, they could be making $60 profit from each chip and the cards would still make good money for the AIB partners.


I don't know anything about wafer costs or the break down of cost per Polaris 10 chip but if what you say is true then AMD would be in a great position to really stick it to Nvidia and pull a GTX 970 here. If the rumor of 980Ti performance for $300 were to be true there would be a run on the cards for sure (including me). That brings up another point; the only way this works for AMD (in terms of making significant gains in market share) is if they have enough supply to fill demand, something they have a poor history of doing with new product launches...


----------



## KarathKasun

14FF yields are... superb to say the least. AFAIK they have nearly the whole fab to themselves as well. They can flood the market if they wanted to.


----------



## electro2u

The question I have is: will it coil whine?


----------



## KarathKasun

With a TDP of 175 W and under, it's not likely to have as much of an issue with whine.


----------



## Majin SSJ Eric

Damn you too-good-to-be-true rumors!!!! I'm already planning how I can come up with $600 for two of these cards!!!


----------



## KarathKasun

Well, they did say $300-$400. So budget up to $800.


----------



## Malinkadink

Glad I got this MG279Q; now to just replace my 970 with a 980 Ti-equivalent Polaris for $300.


----------



## Slaughterem

Quote:


> Originally Posted by *KarathKasun*
> 
> With a TDP of 175 W and under, it's not likely to have as much of an issue with whine.


And when you consider the power factor, this card could be as small as the Nano, so a smaller PCB, VRM, connectors, cooling, etc. Add to this the fact that 14nm is about 10% smaller than 16nm, which yields more dies per wafer, and that Polaris 10 is rumored to be 232 mm² vs 324 mm² for GP104. There is the potential that $300 could happen, though I don't want to jump on a hype train. Just a few more weeks and we will know for sure what products each will really offer. Either way, it appears competition is certainly happening, and that is good for consumers, whatever brand you like.


----------



## L36

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Damn you too-good-to-be-true rumors!!!! I'm already planning how I can come up with $600 for two of these cards!!!


Sell those Titan Xs. $600 for both could be the deal of the decade.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *KarathKasun*
> 
> Well, they did say $300-$400. So budget up to $800.


Well, tbh I'm not questioning the pricing but rather the actual performance. The big IF in the $300 Polaris 10 equation is IF it indeed matches the performance of a 980 Ti (the fastest card currently on the market, not counting the Titan X). That's a pretty big IF, but assuming it's true, I will be lining up behind everybody else to get my hands on a couple for sure!


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *L36*
> 
> Sell those Titan Xs. $600 for both could be the deal of the decade.


If only... They are OG Titans, not Titan X's.


----------



## Ultracarpet

Quote:


> Originally Posted by *tajoh111*
> 
> There are two sides to the coin. They can either gain market share, which is good in the short term, or tacitly (never in speech or writing) cooperate with Nvidia to raise graphics card prices to account for the increased cost of production, which is good for both companies in the long term.
> 
> Nvidia, with its brisk sales and high margins, would take a loss if it maintained current pricing. This is because Nvidia's cost of production is between 500 and 600 million dollars. Add the increased wafer cost on top of that and we have losses. They would still make money before operating expenses, but factor in the overhead, R&D, and administration and they take a loss.
> 
> The same happens to AMD: price too low and they torpedo the market and get into a race to the bottom. Nvidia fights back and both companies end up in a price war, where the company with the most cash on hand wins. Too low a price cripples margins to the point where gross profits aren't enough to cover the rest of the expenses. So although gaining market share is good in the short term, margins so thin that no profit is left to pay the remaining expenses are bad in the long run. Margins have to be bigger than ever to cover the massive increase in R&D expenditure, and with wafer prices rising, video card prices have to go up to make up the difference. I think the Fury Pro Duo's price reflects this, and Nvidia cutting supplies of most of its Maxwell series supports this theory.
> 
> Also, if you look at some of the expected die sizes, it seems the companies have aligned their products to interfere with each other's lineups as little as possible: GP104 is 320 mm², Polaris 10 is 232 mm², GP106 is 180 mm², and Polaris 11 is likely 130 mm². I think the companies are going to avoid price wars by segmenting the market so that if a customer wants a certain product at a specific price, there is only one card in that segment.


You know that one of the main reasons to drop a node size is to reduce cost, right? They are able to have a 300 mm² chip match a 600 mm² chip on the previous node... that means twice the chips from a wafer at the same performance level. AMD and Nvidia waited as long as they did to drop nodes because they were waiting for the cost benefit to be there. Polaris 10 will be drastically cheaper to produce than a Fury X / 980 Ti, especially considering, as I mentioned before, the cheaper memory and power delivery (the PCBs don't have to be that beefy). If these companies were able to sell absolutely massive chips with complex memory tech (HBM) and robust power delivery at ~$500-600 and maintain a margin, AMD should easily be able to sell Polaris 10 at $300-400 with roughly the same margin.

And no, market share is more or less an indicator of a company's success in any given market. The only time a company can operate well with low market share is if it sells a massively high-margin, low-volume product compared to the rest of the market, i.e. Ferrari vs. cheap brands like Honda. But we aren't talking about cars, we are talking about GPUs, and the thing about this market is that you get to sell at a higher margin when you have MORE market share. AMD sells at generally a lower margin than Nvidia, and yet they also have a fraction of the market share; that is literally the worst possible scenario a company can be in, lol. The only way AMD is going to win market share back is to make a compelling card and price it aggressively. If an AMD and an Nvidia card were priced similarly and performed similarly, the Nvidia card would easily outsell the AMD card (probably somewhere close to the market share split, 8-2). They have lost too much market share to just put out a card and price it around Nvidia's pricing.

If AMD does put out a compelling card and prices it aggressively, Nvidia will undoubtedly lower their prices... but not as low as you'd think. They are used to the higher margins and have been operating their company with them; until there is a sizeable swing in market share, they will not budge that far, IMO.
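The "roughly the same margin" claim above can be put in numbers with a simple gross-margin calculation. Every price and unit cost below is invented purely for illustration; the point is only that a cheap-to-build card at a low price can carry the same margin fraction as an expensive card at a high price.

```python
def gross_margin(price, unit_cost):
    """Gross margin as a fraction of the selling price."""
    return (price - unit_cost) / price

# All numbers below are invented for illustration:
fury_class = gross_margin(price=600.0, unit_cost=360.0)   # big die + HBM + beefy board
polaris_10 = gross_margin(price=330.0, unit_cost=200.0)   # small die + GDDR5 board

print(f"{fury_class:.0%} vs {polaris_10:.0%}")  # similar margin fractions at very different prices
```

With these made-up bills of materials, both cards sit around a 40% gross margin despite a near 2x price gap, which is the shape of the argument being made.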


----------



## Chargeit

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> If only... They are OG Titans, not Titan X's.


The world of computer tech is a harsh mistress.


----------



## Majin SSJ Eric

I think I could probably still get $600 for both of them. At least $500.


----------



## Forceman

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> To be clear, I don't buy this rumor at all, but IF it did turn out to be true it would lead to a very interesting divergence of strategies by AMD and Nvidia. You'd essentially have AMD ceding flagship bragging rights to Nvidia while trying to corner the market on midrange sales ("midrange" as in pricing a la the 970). It would also put tremendous pressure on Nvidia's pricing strategy for GP104 and possibly even their future flagship offerings next year (a $300 Polaris card with the same performance as a 980 Ti would likely win AMD back a lot of market share, making obscenely priced Nvidia flagships less tenable than they have been since 28nm).
> 
> To sum up, this would be a game-changer the likes of which we have not seen in the GPU segment since all the way back in the 55nm days, and it would have a massively positive impact on consumers of both AMD and Nvidia cards. That is one of the main reasons I don't believe this rumor for a second; but then again, this is the fastest and most sure-fire way for AMD to make a dent (or more than a dent) in Nvidia's commanding market share lead. I just don't think 14nm is going to be cheap enough for this sort of aggressive pricing to make sense...


Considering that P10 is rumored to be 2/3 the size of GP104, you have to wonder if Nvidia isn't going to counter it with GP106 instead. That would change the pricing structure dramatically, with GP104 on a level by itself. When was the last time AMD had a 30% performance-per-die-size advantage (which they'd seemingly need for P10 to match GP104)?

Edit: that assumes you believe the 232 mm² rumor, which I don't.


----------



## Serios

Quote:


> Originally Posted by *tajoh111*
> 
> There are two sides to the coin. They can either gain market share, which is good in the short term, or tacitly (never in speech or writing) cooperate with Nvidia to raise graphics card prices to account for the increased cost of production, which is good for both companies in the long term.
> 
> Nvidia, with its brisk sales and high margins, would take a loss if it maintained current pricing. This is because Nvidia's cost of production is between 500 and 600 million dollars. Add the increased wafer cost on top of that and we have losses. They would still make money before operating expenses, but factor in the overhead, R&D, and administration and they take a loss.
> 
> The same happens to AMD: price too low and they torpedo the market and get into a race to the bottom. Nvidia fights back and both companies end up in a price war, where the company with the most cash on hand wins. Too low a price cripples margins to the point where gross profits aren't enough to cover the rest of the expenses. So although gaining market share is good in the short term, margins so thin that no profit is left to pay the remaining expenses are bad in the long run. Margins have to be bigger than ever to cover the massive increase in R&D expenditure, and with wafer prices rising, video card prices have to go up to make up the difference. I think the Fury Pro Duo's price reflects this, and Nvidia cutting supplies of most of its Maxwell series supports this theory.
> 
> Also, if you look at some of the expected die sizes, it seems the companies have aligned their products to interfere with each other's lineups as little as possible: GP104 is 320 mm², Polaris 10 is 232 mm², GP106 is 180 mm², and Polaris 11 is likely 130 mm². I think the companies are going to avoid price wars by segmenting the market so that if a customer wants a certain product at a specific price, there is only one card in that segment.


All empty words without knowing the technical details and how much AMD is actually paying for these GPUs.
Anyway, what we know for sure is that AMD wants to bring VR to the masses, and that means their upcoming Polaris GPUs will be competitively priced.
I don't know about Nvidia, but rumors say their next GPU dies will be bigger than AMD's, and most likely they will not want to go as low as AMD can with prices.


----------



## tajoh111

Quote:


> Originally Posted by *Ultracarpet*
> 
> You know that one of the main reasons to drop a node size is to reduce cost, right? They are able to have a 300 mm² chip match a 600 mm² chip on the previous node... that means twice the chips from a wafer at the same performance level. AMD and Nvidia waited as long as they did to drop nodes because they were waiting for the cost benefit to be there. Polaris 10 will be drastically cheaper to produce than a Fury X / 980 Ti, especially considering, as I mentioned before, the cheaper memory and power delivery (the PCBs don't have to be that beefy). If these companies were able to sell absolutely massive chips with complex memory tech (HBM) and robust power delivery at ~$500-600 and maintain a margin, AMD should easily be able to sell Polaris 10 at $300-400 with roughly the same margin.
> 
> If AMD does put out a compelling card and prices it aggressively, Nvidia will undoubtedly lower their prices... but not as low as you'd think. They are used to the higher margins and have been operating their company with them; until there is a sizeable swing in market share, they will not budge that far, IMO.
> 
> And no, market share is more or less an indicator of a company's success in any given market. The only time a company can operate well with low market share is if it sells a massively high-margin, low-volume product compared to the rest of the market, i.e. Ferrari vs. cheap brands like Honda. But we aren't talking about cars, we are talking about GPUs, and the thing about this market is that you get to sell at a higher margin when you have MORE market share. AMD sells at generally a lower margin than Nvidia, and yet they also have a fraction of the market share; that is literally the worst possible scenario a company can be in, lol. The only way AMD is going to win market share back is to make a compelling card and price it aggressively. If an AMD and an Nvidia card were priced similarly and performed similarly, the Nvidia card would easily outsell the AMD card (probably somewhere close to the market share split, 8-2). They have lost too much market share to just put out a card and price it around Nvidia's pricing.


This time the cost savings are not nearly what they once were when switching nodes. The cost per transistor is more or less the same.

Most of the gains come from better yields on smaller dies, but since the cost per transistor is the same, the savings are not as dramatic as before.

If companies were simply looking at savings, they would have used 20nm planar. 20nm planar and 16/14nm FinFET are pretty similar as far as transistor density goes, but 20nm basically only brings transistor density to the table plus a bit of power savings, and the added transistor cost, along with worse yields, makes it a node to skip. The reasons to go with FinFET are that yields are a bit better and, more importantly, performance goes up along with power savings. That allows companies to charge more for the card, which offsets the increased expenditure.

Higher market share only translates into profit if it generates enough sales to cover the remaining costs. You need volume along with margins to make money. 290X sales picked up a lot after the price drop and the flaw in the GTX 970 came to light, but because margins were so bad, it led to losses. AMD priced the 7870 at 350 dollars when it first launched, and that card used a smaller die on a cheaper node. I think both companies are going to be careful about getting into a price war because of the increased cost of FinFET. GTX 980 Ti performance at 300 dollars is too aggressive a move. They don't need to go that cheap, and if they did, supply would run out and sellers would scalp the cards. In the meantime, Nvidia could do a price drop and supply the market, much like it did with initial Hawaii sales vs GTX 780/780 Ti sales. Nvidia was far more profitable, even though AMD was selling every single card for 4 or 5 months.

In addition, in business it's much harder to raise prices than to lower them. If AMD prices the cards too low, demand outstrips supply and sellers start scalping the cards, and it's the seller that makes the extra, not AMD. This happened during the mining craze, and AMD got very little of the revenue: AMD underpriced their cards, sold all their stock, and showed little profit.

Nvidia has a bad image when it comes to pricing, but when it comes to targeting the mainstream this reputation is undeserved; they are often willing to value-price their products and in some ways bleed out AMD. They did it with the GTX 970/670/570, and often with earlier cards like the GTX 460. When it comes to the mainstream, Nvidia is willing to price aggressively and has often forced AMD to lower prices or launch cards cheaper. Nvidia's GX104 cut-downs have been priced aggressively for the last 6 or 8 years.


----------



## Klocek001

Well, if this is true then hell, line me up for three!

What I'm afraid this will come down to is the actual "near 980 Ti" performance. I mean, you could cherry-pick the games and resolutions where a 290X would be 1-3 fps behind a 980 Ti even now, and I'm sure they also compare to the reference card. So do they really mean it will match a 980 Ti, or is this just big words?


----------



## KarathKasun

Quote:


> Originally Posted by *Serios*
> 
> All empty words without knowing the technical details and how much AMD is actually paying for these GPUs.
> Anyway, what we know for sure is that AMD wants to bring VR to the masses, and that means their upcoming Polaris GPUs will be competitively priced.
> I don't know about Nvidia, but rumors say their next GPU dies will be bigger than AMD's, and most likely they will not want to go as low as AMD can with prices.


If you do enough digging in the right places, you can find that info.

With the defect rate for 14FF from a month or so back and the known cost per wafer along with supposed die size, you get something like ~$60 per chip cost.

When you apply best case scenario prices and yield rates for NV's larger die you get approximately $100 per die cost.


----------



## Serios

Probably, but if I were AMD or Nvidia I would try to pay less than the standard price.
After all, AMD and Nvidia have been partners with TSMC and GloFo for many years.


----------



## KarathKasun

Their costs are pretty much known by those in manufacturing circles. They have their own forums where they talk shop like the enthusiasts talk shop here.


----------



## The-Beast

Quote:


> Originally Posted by *CataclysmZA*
> 
> Yeah, no, I can't believe that. 980 Ti performance for half the price, on a new process that is twice as expensive? That's literally implausible.
> 
> I can't even fathom how much such a GPU would break the industry (it most certainly would), and drive a price war that'll last far longer than it needs to. If anything, it'll be 980 Ti performance for $150 less, not $300 less.


If you can get 4x as many chips out of a wafer, the wafer costs twice as much, and those chips equal the performance of the 980 Ti, you can absolutely price it at $300.
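The arithmetic in that claim, spelled out: the baseline wafer cost and die count below are placeholders, since only the 4x and 2x ratios matter.

```python
# Placeholder 28nm-era baseline; only the 4x dies / 2x wafer-cost ratios matter.
old_wafer_cost, old_dies = 5000.0, 100
new_wafer_cost, new_dies = old_wafer_cost * 2, old_dies * 4

old_cost_per_die = old_wafer_cost / old_dies  # $50 per die
new_cost_per_die = new_wafer_cost / new_dies  # $25 per die
print(new_cost_per_die / old_cost_per_die)    # 0.5 -- cost per die halves
```

So under those two assumptions the silicon cost per chip halves, which is the whole basis for the aggressive-pricing argument (real savings are smaller once yield and per-transistor cost on the new node are factored in, as others in the thread point out).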


----------



## badtaylorx

I've owned quite a few team red cards, so I can speak AMD. "Near 980 Ti performance" translates to "comes within 10% FPS on a couple of select benchmarks".

I'd expect a TOP-performing non-reference 970 to be a closer comparison, like a Zotac AMP! (still no slouch, mind you).

Combine this with 8 GB of RAM, and you do indeed bring VR performance to the table for under $300.


----------



## TrueForm

Yup, waiting for these cards that will go nicely with my new Freesync monitor.


----------



## variant

Quote:


> Originally Posted by *badtaylorx*
> 
> I've owned quite a few team red cards, so I can speak AMD. "Near 980 Ti performance" translates to "comes within 10% FPS on a couple of select benchmarks".
> 
> I'd expect a TOP-performing non-reference 970 to be a closer comparison, like a Zotac AMP! (still no slouch, mind you).
> 
> Combine this with 8 GB of RAM, and you do indeed bring VR performance to the table for under $300.


Except this information doesn't come from AMD.


----------



## Power Drill

Quote:


> Originally Posted by *badtaylorx*
> 
> I've owned quite a few team red cards, so I can speak AMD. "Near 980 Ti performance" *translates to "comes within 10% FPS on a couple of select benchmarks"*
> 
> I'd expect a TOP-performing non-reference 970 to be a closer comparison, like a Zotac AMP! (still no slouch, mind you).
> 
> Combine this with 8 GB of RAM, and you do indeed bring VR performance to the table for under $300.


This, though I wish it were also the case in GameWorks titles under DX11. The truth is probably that it reaches 980 Ti performance only in a few selected DX12 titles, and in the rest it will fall 10% or more behind.

Seriously, people are over-hyping AMD stuff again, and no wonder if everybody ends up disappointed when it can't deliver the dreamland performance people are expecting for some reason.

Even if the ~2300-shader core is clocked 20% higher than current parts and has 20% better IPC, it would still fall around 20% behind the Fury X and 980 Ti.
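That estimate can be reproduced with a crude relative-throughput model (shader count x clock gain x IPC gain, against the Fury X's 4096 shaders). The 2304-shader count and the +20% clock/IPC figures are the rumoured/assumed values from the post, not confirmed specs.

```python
# Crude throughput model: effective units = shaders x relative clock x relative IPC.
# 2304 shaders, +20% clock and +20% IPC are the rumoured/assumed figures above.
FURY_X_SHADERS = 4096                      # known Fiji shader count
polaris_effective = 2304 * 1.20 * 1.20     # ~3318 "Fury X-equivalent" shaders

deficit = 1 - polaris_effective / FURY_X_SHADERS
print(f"estimated deficit vs Fury X: {deficit:.0%}")   # ~19%, i.e. "around 20% behind"
```

This obviously ignores memory bandwidth, geometry throughput, and game-to-game variation; it only shows that the "around 20% behind" figure follows directly from those assumed inputs.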


----------



## guttheslayer

Quote:


> Originally Posted by *Power Drill*
> 
> This, though I wish it were also the case in GameWorks titles under DX11. The truth is probably that it reaches 980 Ti performance only in a few selected DX12 titles, and in the rest it will fall 10% or more behind.
> 
> Seriously, people are over-hyping AMD stuff again, and no wonder if everybody ends up disappointed when it can't deliver the dreamland performance people are expecting for some reason.
> 
> Even if the ~2300-shader core is clocked 20% higher than current parts and has 20% better IPC, it would still fall around 20% behind the Fury X and 980 Ti.


Then you realise their IPC gain could be above 20%, and the core count is probably 2560 instead of 2304.


----------



## Bogga

I had the thought of a dual-card setup stuffed away, and my hopes were on a ~30% improvement over the 980 Ti from a single card. But if there's any truth to these rumors, I will strongly consider going back to the red team and CrossFire... long time no see, since I've been green since my dual 5850s.


----------



## tajoh111

Quote:


> Originally Posted by *guttheslayer*
> 
> Then you realise their IPC gain could be above 20%, and the core count is probably 2560 instead of 2304.


20% IPC gains are very difficult as it is. AMD only gained something like 10% in its transition from VLIW to GCN, which was a completely new architecture whose core size increased massively compared to its predecessor; most of the performance came from more shaders at higher clocks. With Hawaii the gains over Tahiti were very linear; from Hawaii to Fiji, not so much. What this means is that gains are most likely at shader counts above 2816, where performance has already begun to plateau; where scaling is still linear, big gains are less likely.

Expecting a 30% to 50% gain while they are still on GCN is unrealistic, because it would be basically the biggest gain ever, which is tough to expect from technology that is still GCN and, on top of that, is going to be hamstrung by GDDR5.


----------



## Nickyvida

Meh, disappointing if true.

All the hype about 2.5x and it can just about match the 980 Ti? Even after a node shrink and possible GDDR5X memory? I expected it to be 10-20% faster than the Ti.

I planned to go Polaris this year, but I guess it's Vega or bust for me. The bright side is that I can hang on to my 390 for a while more.

Graphics are slowly going the way of CPU development: milking consumers with exorbitant prices for minimal gains.


----------



## n64ADL

Quote:


> Originally Posted by *Nickyvida*
> 
> Meh, disappointing if true.
> 
> All the hype about 2.5x and it can just about match the 980 Ti? Even after a node shrink and possible GDDR5X memory? I expected it to be 10-20% faster than the Ti.
> 
> I planned to go Polaris this year, but I guess it's Vega or bust for me. The bright side is that I can hang on to my 390 for a while more.
> 
> Graphics are slowly going the way of CPU development: milking consumers with exorbitant prices for minimal gains.


I wish I could hang on to my R9 390, but my 3440x1440 monitor needs more power. CrossFire Polaris for me, yes please; with 2.5x power efficiency, my 750 watt power supply could pull it off.


----------



## Nickyvida

Quote:


> Originally Posted by *n64ADL*
> 
> i wish i could hang on to my
> wish i could hang on to my r9 390 but my 3440 x 1440 monitor needs more power. crossfire polaris for me, yes please with my 750 watt power supply you could pull it off with 2.5 power efficiency.


Better to grab another 390 and CrossFire it, imo, while waiting for Vega. HBM2 will be epic once they have it ramped up.

You don't spend much, and you can sell both 390s for two Vegas once they drop.

I'm content with a single card as I'm on 1080p. When Vega drops, that might change.


----------



## n64ADL

Quote:


> Originally Posted by *Nickyvida*
> 
> Better to grab another 390 and crossfire it imo while waiting for Vega. HBM2 will be epic once they have it ramped up.
> 
> You don't spend much and you can sell both 390's for two Vegas once they drop.
> 
> I'm content with a single as im on 1080p. When Vega drops then that might change.


I've thought about that, but I have a 750 watt power supply and I'd need at least a 900 watt unit to make CrossFire with those bad boys work.







It'd be cheaper to get Polaris. Should I just get another power supply and R9 390, or what?


----------



## Nickyvida

Quote:


> Originally Posted by *n64ADL*
> 
> i've
> i've thought about that but i have a 750 watt power supply and i'd need to get atleast a 900 watt power supply to make crossfire with those bad boys work
> 
> 
> 
> 
> 
> 
> 
> i'd be cheaper to get polaris. should i just get another power supply and r9 390 or what???


It's better to get a bigger power supply and another R9. It helps a lot when you sell both R9s, since you're planning to get two Vegas anyway, rather than worrying about power consumption. Two Vegas are going to draw a hefty load of power anyway, so better safe than sorry.

I have a 1000W that's been waiting for a CrossFire setup capable of maxing it out since the days of a single 780. Hoping Vega will be that.


----------



## xzamples

I doubt it will be at that price.


----------



## Serios

Quote:


> Originally Posted by *tajoh111*
> 
> Expecting a 30% to 50% gain when they are still on GCN is unrealistic because it will be basically the biggest gain ever, which is tough to expect from technology that is still GCN and on top of this is going to be hamstrung by ddr5 technology.


Well, they did say that with these new 14nm GPUs they will basically have their biggest leap in performance and efficiency.
GCN is, and always was, a very good architecture, and the transition to 14nm allowed them to correct a lot of its shortcomings.


----------



## n64ADL

Would a 1000 watt power supply do? Newegg.com has a great deal on a Corsair 1000i power supply for $188 today. Best deal I can think of; should I get another brand?


----------



## Mrip541

I'll believe it when I see it.


----------



## Ultracarpet

Quote:


> Originally Posted by *tajoh111*
> 
> This time the cost savings are not nearly what they once were when switching nodes. The cost per transistor is more or less the same.
> 
> Most of the gains come from better yields on smaller dies, but since the cost per transistor is the same, the savings are not as dramatic as before.
> 
> If companies were simply looking at savings, they would have used 20nm planar. 20nm planar and 16/14nm FinFET are pretty similar as far as transistor density goes. But 20nm basically only brings transistor density to the table and a bit of power savings, and the added transistor cost makes it a node to skip, along with the worse yields. The reasons to go with FinFET are that yields are a bit better and, more importantly, performance goes up along with power savings. This allows companies to charge more for the card, which offsets the increased expenditure.
> 
> Higher market share only translates into profit if it generates enough sales to cover the remaining cost. You need volume along with margins to make money. 290X sales picked up big after the price drop and the flaw in the GTX 970 came out, but because margins were so bad, it led to losses. AMD priced their 7870 at 350 dollars when it first launched, and that card used a smaller die on a cheaper node. I think both companies are going to be careful about getting into a price war because of the increased cost of FinFET. GTX 980 Ti performance at 300 dollars is too aggressive a move. They don't need to go that cheap, and if they did, supply would run out and sellers would scalp the cards. In the meantime, Nvidia could do a price drop and supply the market, much like it did with initial Hawaii sales vs GTX 780/780 Ti sales. Nvidia was way more profitable, even though AMD was selling every single card for 4 or 5 months.
> 
> In addition, in business it's much harder to raise prices than to lower them. If AMD prices the cards too low, demand outstrips supply, and companies start scalping the cards, it's the sellers that make the extra, not AMD. This happened during the mining craze, and AMD got very little revenue overall. AMD underpriced their cards, sold all their stock, and showed little profit.
> 
> Nvidia does have a bad image when it comes to pricing, but when it comes to targeting the mainstream this reputation is undeserved; they are often willing to value-price their products and in some ways bleed out AMD. They did it with the GTX 970/670/570 and often with earlier cards like the GTX 460. When it comes to the mainstream, Nvidia is willing to price aggressively and has often forced AMD to lower their prices or launch their cards cheaper. Nvidia GX104 cut-downs have been priced aggressively for the last 6 or 8 years.


You just went on a tangent about there being no way these companies can price these cards low, then completely contradicted yourself with mid-range Nvidia chips like the 970 being priced super aggressively. Also, the Hawaii series did very little to move market share regardless of the ridiculous GTX 970 incident, and on top of that, the chips were large and power hungry, i.e. they probably cost a whole lot more than the GTX 970 to produce.

This is a smaller, lower-power, and overall cheaper chip... I think it's just been so long since a node shrink that people are forgetting what is supposed to happen. Perpetually increasing prices for incremental performance increases is NOT what should happen. Both product stacks need to reset what is considered low and mid range.

This chip is around the size of the 7870, which, by the way, beat the 6970... I do not recall the 7870 being sold for $500+; that was 7970 territory. Vega is where we start seeing the performance segment of AMD's chips on this new node.


----------



## Clocknut

Quote:


> Originally Posted by *tajoh111*
> 
> 20% IPC gains are very difficult as it is. AMD only gained something like 10% from its transition from VLIW to GCN, which was a completely new architecture where the core size increased massively compared to it's predecessor. Most of the performance came from more shaders that were higher clocked. With Hawaii the gains were very linears compared to tahiti. From Hawaii to Fiji, not so much. What this means is gains are the most likely at shader amounts above 2816 when performance has already begun to plateau. Where it is still linear, the gains are not as likely.
> 
> Expecting a 30% to 50% gain when they are still on GCN is unrealistic because it will be basically the biggest gain ever, which is tough to expect from technology that is still GCN and on top of this is going to be hamstrung by ddr5 technology.


You forgot that most 28nm GCN chips run at around a 1GHz clock speed.

Nvidia has already shown they can clock at 1.3GHz even on a large server chip, the Tesla GP100 (and server chips are usually underclocked). So consumer GeForce is going to clock higher.

So I would not be at all surprised if AMD can do the same and all Polaris chips clock at 1.3GHz+, possibly more.


----------



## Nickyvida

Quote:


> Originally Posted by *n64ADL*
> 
> would a 1000 watt power supply do, newegg.com has a great deal for a corsair 1000i watt power supply for 188 today. best deal i can think of for that or should i get another brand???


It should be enough. Some people say 850W is enough, but you have to account for your other components too, such as your GPUs and CPU, whether you are overclocking, etc.

And 1000W doesn't mean it's a good PSU. There are good and bad PSUs, and they are rated from Bronze to Titanium. Bad PSUs have inferior components and are liable to take your whole PC with them if they blow.
Get a 1000W with a Gold, Platinum, or Titanium rating and you should be good to go. I'm using a Seasonic Platinum X-1000. A bit on the pricey side, but they are very reliable.


----------



## KarathKasun

Bronze/Gold/Platinum is the efficiency rating, not a quality rating. There are VERY good Bronze PSUs around; they just need more cooling and pull more at the wall.

FWIW, I ran two GTX 465s modded to 470s with massive overclocks (900MHz core @ 1.25V) in an OC'd i5-2500K system on a high-quality Antec 700W PSU. If your PSU is rated for 750W continuous, you should be fine. I'm pretty sure my GPUs were pulling 500W or more in total.
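As a sanity check on that 750W claim, the headroom math is just a component sum; all the wattages below are illustrative guesses for a setup like the one described, not measurements:

```python
# Rough PSU headroom check: compare summed component draw against the
# supply's continuous rating. Wattages are illustrative, not measured.
def psu_headroom(psu_continuous_w, component_draws_w):
    return psu_continuous_w - sum(component_draws_w)

# Two heavily overclocked GPUs (~250W each), an overclocked quad-core
# CPU (~150W), and ~80W for drives, fans, and the board.
print(psu_headroom(750, [250, 250, 150, 80]))  # 20 -- tight, which is why PSU quality matters
```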


----------



## n64ADL

Quote:


> Originally Posted by *Nickyvida*
> 
> It should be enough, some people say 850w is enough but you have to account for your other components too such as your type of gpus, chips, whether u are over clocking and etc.
> 
> And 1000w doesnt mean its a good psu. There are good and bad psu and they are rated by bronze to titanium. Bad psus have inferior components and are liable to take your whole pc along if they blow
> Get a 1000w with a gold, plat or titanium rating and you should be good to go. Using a seasonic plat x1000w. A bit on the pricey side but they are very reliable.


The Corsair one is 80 Plus Platinum certified. I've had good experiences with Corsair in the past.


----------



## Nickyvida

Quote:


> Originally Posted by *n64ADL*
> 
> the corsair one is 80 platinum rated certified. i've had good experiences with corsair in the past.


As long as it's not the RM1000 it should be fine. Go for it.


----------



## Randomdude

Can somebody tell me, for a <300mm^2 graphics card, what percentage of the total manufacturing cost is the silicon? I find this wafer-cost argument a little odd. If you paid for literally only the silicon in a card, and the card cost 250 USD, then a 14nm card of the same size, given equal yields on such regular-sized chips, would cost roughly 15% more, since that's about how much more expensive 14nm is compared to 28nm. Even with 20% lower yield on top of the 15% higher cost, ~350 USD worth of 14nm silicon would be equivalent to 250 USD worth of 28nm silicon. But silicon isn't the whole cost of the card, so I'm curious what the whole shebang is behind this "14nm cards can't help but be super expensive" when in reality I can't seem to find a reason for it.
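The back-of-the-envelope math above can be sketched as follows (the ~15% wafer premium and ~20% yield loss are the poster's assumed figures, not published foundry numbers):

```python
# Scale a hypothetical 28nm silicon cost to 14nm under the assumptions
# in the post: ~15% higher wafer cost and ~20% lower yield.
def scaled_silicon_cost(cost_28nm_usd, wafer_premium=0.15, yield_loss=0.20):
    # A pricier wafer raises cost directly; a lower yield leaves fewer
    # good dies per wafer, raising the cost per good die.
    return cost_28nm_usd * (1 + wafer_premium) / (1 - yield_loss)

print(scaled_silicon_cost(250.0))  # 359.375 -- the "~350 USD" in the post
```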


----------



## rv8000

If this card is marginally close to a 980ti in terms of performance, I'd expect a price tag of $450 at the very least. The price will be more dependent on the release schedule of the 1070/1080.


----------



## zealord

Quote:


> Originally Posted by *Randomdude*
> 
> Can somebody tell me, in a graphics <300mm^2 card, from the total price for it to be made, in % how much is the silicon? I find this wafer cost propaganda a little bit too odd, because if you paid for literally only the silicon in a card, and the card cost 250USD, then a 14nm card of the same size, and given equal yields on such regular sized chips, it would basically cost 15% more as that's roughly how much more expensive 14nm is compared to 28nm. If there were 20% less yield and 15% higher cost, ~350USD worth of 14nm silicon would be the equal to 250USD worth of 28nm silicon. But... that's not the whole cost of the card, so I'm curious what the whole shebang is behind this "14nm cards can't help but be super expensive" when in reality I can't seem to find a reason for that.


I have no idea about the actual cost, but I remember this picture:



It is a bit older (from the GTX 580 days). Also, no idea if it is legit or fake.


----------



## lahvie

Quote:


> Originally Posted by *Ultracarpet*
> 
> You just went on a tangent about there being no way these companies can price these cards low, then completely contradicted yourself with the mid range NVidia chips like the 970 being priced super aggressively. Also, the hawaii series did very little to impact the market share regardless of the ridiculous gtx 970 incident, and on top of that, the chips were large and power hungry... Ie they probably cost a whole lot more than the gtx 970 to produce.
> 
> This is a smaller, lower power, and overall cheaper chip... I think it's just been so long for a node shrink that people are forgetting what is supposed to happen.... Perpetually increasing prices for incremental performance increases is NOT what should happen. Both product stacks need to reshift what is considered low and mid range.
> 
> This chip is around the size of the 7870, which btw beat the 6970... I do not recall the 7870 being sold for $500+, that was 7970 territory. Vega is where we start seeing the performance segment of AMD's chips on this new node.


The 970 was aggressively priced considering the recent inflation of the high end.

How much was high end back in the day?

Just saying.


----------



## zealord

Quote:


> Originally Posted by *lahvie*
> 
> 970 was aggressively priced considering the recent inflation of the higher end
> 
> How much was high end, back in the day
> 
> Just saying


It was aggressively priced compared to the GTX 980 and GTX 780 Ti, but relative to how much it actually cost Nvidia to manufacture, it was priced "normally", I'd dare say.

Looking back, the GTX 970 was only a great card in disguise.

I remember when it came out I was thinking about selling my 290X and grabbing a GTX 970, but seeing how well the 290(X), and therefore the 390(X), aged, I am glad I didn't. Even putting the 3.5GB disaster aside, there are quite a few benchmarks out there where the 970 is closer to the 280X than some people might like, and cards like the 390X pull far ahead of the GTX 970.

It is still an okay card, and most people who bought it are probably happily gaming with it right now.


----------



## iLeakStuff

Quote:


> Originally Posted by *KarathKasun*
> 
> 14FF yields are... superb to say the least. AFAIK they have nearly the whole fab to themselves as well. They can flood the market if they wanted to.


LOL. No, AMD most certainly isn't the only customer ordering chips from GloFo and Samsung.
They are actually tiny compared to some other clients there:

Apple
Altera
Qualcomm
Samsung


----------



## KarathKasun

Compare that to how many companies are fighting for TSMC 16FF. The GloFo 14FF fab is also only one of two or three fabs running the same process. Having second sources helps tons with total output capacity.


----------



## lahvie

Quote:


> Originally Posted by *zealord*
> 
> It was aggressively priced compared to the GTX 980 and GTX 780 Ti, but I think in terms of how much it actually did cost to manufacture for Nvidia is was priced "normally" I'd dare to say.
> 
> Looking bad the GTX 970 was only a great card in disguise.
> 
> I remember when it came out I was thinking about selling my 290X and grabbing a GTX 970, but looking how well the 290(X) and therefore the 390(X) aged I am glad I didn't do that. Even putting the 3.5GB disaster aside there are quite a few benchmarks out there where the 970 is closer to the 280X than some people might like and cards like the 390X pulling far ahead of the GTX 970.
> 
> It is still an okay card and most people who bought it are probably happily gaming with it right now


I agree. I recommended it to my father-in-law. He runs 1080p.

But thinking AMD is about to drop 980 Ti performance for $300 is out there.

All their old cards are obsolete if they do that. How much better would their high end be, and how much would it sell for?

Better yet, I'd love that. Make my dreams come true, AMD. But what about their old lineup versus their new lineup?


----------



## KarathKasun

It is possible that AMD made the decision to write off the old inventory at launch. They are hurting that badly for market share right now. If they also get good margins at that price, it could cover the decision to dump the old inventory of chips in short order.

It's a bold and desperate move, but if the card is as good as they seem to think it is, it might be enough to dig themselves out of the money pit they dug with the extended Hawaii shelf life.


----------



## Shadymort

Quote:


> It is possible that AMD made the decision to write off the old inventory on launch. They are hurting that bad for market share right now. If they also get good margins at that price, it could likely cover the decision to dump the old inventory of chips in short order.
> 
> Its a bold and desperate move, but if its as good as they seem to think it is, it might be enough to dig themselves out of the money pit that they dug with the extended Hawaii shelf life.


They will probably renew the entire lineup with Polaris, except maybe the Fury cards (which will presumably be replaced by Vega) and the very low end, so it makes sense to drop the inventory. Several posts ago, someone suggested that inventory clearance is already going on. What RTG really needs now is market share, not money. They need to convince the market and shareholders that they can do it and that they are still relevant. Money comes later. Their "sweet spot" strategy is all about making the best deal possible for the mainstream market and pushing VR-capable hardware at a lower entry price. A $350-ish Polaris 10 with performance similar to a 980 Ti does seem pretty viable.


----------



## SliceTbone

Actually, I heard from my own sources (don't ask who, pls) that Polaris is expected to come with 390X performance at a launch price of 380.
Which brings the question: the price of the current 380, or the price of the 380 at launch?

The lineup should be as follows:
Polaris 10 XT: 490
10 Pro: 480
11: 470


----------



## PontiacGTX

Quote:


> Originally Posted by *CataclysmZA*
> 
> Yeah, no, I can't believe that. 980 Ti performance for half the price, on a new process that is twice as expensive? That's literally implausible.
> 
> I can't even fathom how much such a GPU would break the industry (it most certainly would), and drive a price war that'll last far longer than it needs to. If anything, it'll be 980 Ti performance for $150 less, not $300 less.


GTX 580 (high end) at $500 and HD 6970 (high end) at $370, then HD 7870 (midrange) at $350, then R9 285 (midrange) at $200: two midrange cards with a new architecture on a smaller node achieved lower prices and better performance. The market has been stuck on 28nm for 4 years; a midrange card for $300 that outperforms a high-end card isn't something that hasn't been done before.
Quote:


> Originally Posted by *CataclysmZA*
> 
> That's not an indication of power consumption, it's an indication that the heat levels are low enough that the card can run the game with the fans off (or, alternatively, that the fans have been set to only turn on at a specific heat level).
> 
> The fact that you can do that with existing high-end GPUs at 28nm today says nothing about Polaris' power consumption.


Where is the passively cooled HD 7970, R9 290X, or Fury X?

With the architecture tweaks, a narrower bus / fewer memory controllers, and the node shrink, the cards can reach really low power levels.


----------



## tp4tissue

If this is true, it will be a major upset like the legendary ATI HD 4870: $300, and it beats out the 8800gtx.


----------



## lahvie

Quote:


> Originally Posted by *SliceTbone*
> 
> Actually I heard from my own sources (don't ask who pls) that Polaris is expected to come with 390X performance having the launch price of 380.
> Which, brings the question, price of the current 380 or the price of 380 at launch?
> 
> The line up should be as follows:
> Polaris 10 XT 490
> 10 Pro 480
> 11 470


My source actually said it was $390, because it performs like a 390, not a 380.

He may or may not be credible, like me.


----------



## Klocek001

Quote:


> Originally Posted by *PontiacGTX*
> 
> where is the passively cooled HD 7970,R9 290X, Fury X?


http://www.silentpcreview.com/forums/viewtopic.php?f=19&t=66028


----------



## PontiacGTX

Quote:


> Originally Posted by *Klocek001*
> 
> http://www.silentpcreview.com/forums/viewtopic.php?f=19&t=66028


That's a 7950, which uses 150W, 40W less than a 7970, and it really isn't a consumer card.


----------



## f1LL

Quote:


> Originally Posted by *lahvie*
> 
> My source said it should actually said it was 390$ because it performs like a 390, not a 380
> 
> He may or may not be credible, like me


390 performance for $50-90 more than right now? sounds like a good deal


----------



## criminal

Quote:


> Originally Posted by *SliceTbone*
> 
> Actually I heard from my own sources (don't ask who pls) that Polaris is expected to come with 390X performance having the launch price of 380.
> Which, brings the question, price of the current 380 or the price of 380 at launch?
> 
> The line up should be as follows:
> Polaris 10 XT 490
> 10 Pro 480
> 11 470


My source says that Polaris will have Fury Nano performance for $299.97 plus tax and shipping.


----------



## bigjdubb

My source told me to stop asking stupid questions









I need better sources.

.....or better questions


----------



## n64ADL

Quote:


> Originally Posted by *bigjdubb*
> 
> My source told me to stop asking stupid questions
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I need better sources.
> 
> .....or better questions


^---- This lol


----------



## iLeakStuff

Quote:


> Originally Posted by *Klocek001*
> 
> http://www.silentpcreview.com/forums/viewtopic.php?f=19&t=66028


"Passively" cooled.

It still requires air blowing through the heatsinks to cool the card. It's like a house with an electric heater that used to have its own fan to keep it from overheating: now the heater overheats the house instead, and the fans on the walls have to run twice as fast to remove the heat.


----------



## DIYDeath

I'll believe it when I see it. Though @ $300 USD, I might just buy 2 of them unless Nvidia can offer something similar. (love Nvidia drivers, not a fan of red team drivers).


----------



## zealord

Quote:


> Originally Posted by *DIYDeath*
> 
> I'll believe it when I see it. Though @ $300 CAD, I might just buy 2 of them unless Nvidia can offer something similar. (love Nvidia drivers, not a fan of red team drivers).


I can tell you one thing for sure. If Polaris 10 performs very close the 980 Ti then you *WON'T* get it for 300$ *CAD*.


----------



## iLeakStuff

Quote:


> Originally Posted by *SliceTbone*
> 
> Actually I heard from my own sources (don't ask who pls) that Polaris is expected to come with 390X performance having the launch price of 380.
> Which, brings the question, price of the current 380 or the price of 380 at launch?
> 
> The line up should be as follows:
> Polaris 10 XT 490
> 10 Pro 480
> 11 470


I've seen things that might actually support what your source says, but to protect my info I can't post it here.








I sure hope it's not true, but it also fits what AMD has said about Polaris 10 lately: that it's not for enthusiasts but an affordable card for the masses.

To quote Ars Technica's article today, the parts supporting your info:
Quote:


> In an interview with Ars, AMD's Roy Taylor also confirmed that Polaris would target mainstream users, particularly those interested in creating a VR-ready system.


Quote:


> "The reason Polaris is a big deal, is because I believe we will be able to grow that TAM [total addressable market] significantly," said Taylor. "I don't think Nvidia is going to do anything to increase the TAM, because according to everything we've seen around Pascal, it's a high-end part."


Quote:


> While those after a successor to the likes of high-end graphics cards like the Fury and Fury X may be disappointed, that Polaris is a mainstream part doesn't necessarily mean it's underpowered. The minimum specs for a VR system call for an Nvidia GTX 970 or an AMD Radeon 290 (or its near-identical replacement the 390), both of which currently retail for around £250.


----------



## dubldwn

Quote:


> Originally Posted by *tp4tissue*
> 
> If this is true.. It will be the a Major upset like the Legendary ATI HD4870.. $300, beats out 8800gtx


55nm HD4870 came out over a year and a half later than 90nm 8800GTX.


----------



## lahvie

Quote:


> Originally Posted by *f1LL*
> 
> 390 performance for $50-90 more than right now? sounds like a good deal


Hey I was just quoting the other guy -_-


----------



## Offender_Mullet

For a minute, let's say this rumor is 100% true:

Alright, the MSRP is $300. Newegg and other retailers will jack the cards' pricing up near 980 Ti levels. In turn, retailers will have dozens of other cards (which perform worse) in need of a huge price drop. Can't see that happening. IF the rumors are true, it's going to be much more expensive.


----------



## criminal

Quote:


> Originally Posted by *dubldwn*
> 
> 55nm HD4870 came out over a year and a half later than 90nm 8800GTX.


I think he meant the GTX 280. The 4870 didn't beat it performance-wise, but it sure did in price/performance, and it even undercut the GTX 260, which it was faster than.


----------



## mandrake88

Quote:


> Originally Posted by *SliceTbone*
> 
> Actually I heard from my own sources (don't ask who pls) that Polaris is expected to come with 390X performance having the launch price of 380.
> Which, brings the question, price of the current 380 or the price of 380 at launch?
> 
> The line up should be as follows:
> Polaris 10 XT 490
> 10 Pro 480
> 11 470


How can you not trust a user with 9 posts and "secret sources"? lol


----------



## bigjdubb

Quote:


> Originally Posted by *Offender_Mullet*
> 
> For a minute, let's say this rumor is 100% true:
> 
> Alright, the msrp is $300. Newegg and other retailers will jack the cards' pricing up near 980Ti levels. In turn, then the retailers will have dozens of other cards (which perform worse) in need of a huge price drop. Can't see that happening. IF the rumors are true, it's going to be much more expensive.


I think the only way you will see the price increase dramatically over MSRP is if demand outweighs supply. Its performance level will not determine how retailers price the cards, though it may affect how they price the older cards it replaces. If the rumor is true, you may see some decent price cuts on the Fury lineup.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *criminal*
> 
> My source says that Polaris will have Fury Nano performance for $299.97 plus tax and shipping.


Hey, you know AMD Billy too????


----------



## Fyrwulf

Quote:


> Originally Posted by *sugarhell*
> 
> I bet this card will be expensive. Like the 1080 from nvidia. 14nm is really expensive at the moment. I can see a cut down version to cost 300 bucks but not the full die.


Well, we have to consider several factors. For a process to be production ready, a minimum of 90% of the dies must be good. Per-wafer prices right now are roughly $4,500 for 28nm and $7,250 for 14/16nm FinFET. Then we have to factor in that a Fury X die is 596mm^2 and our best information is that a P10 die is 236mm^2. Plugging that into a handy calculator gives us a per-die price of $54.95 for the Fury X and $31.47 for Polaris 10. That's the out-the-door price for a raw die, mind. We also have to figure that with AMD using cheap and mature GDDR5, a smaller PCB, and an inexpensive cooling solution, they can in fact afford the price point they're talking about. Maybe their profit margins won't be as high, but their pricing strategy is more Walmart than Gap (razor-thin margins and massive movement of product).

Something else to consider is that those are TSMC prices. It's public knowledge that after their Qualcomm and Apple production runs, Samsung cut prices on 14nm FinFET for follow-on customers. We really don't have any idea what Samsung is charging AMD, but between their new GPU and CPU production, it wouldn't surprise me if the per wafer costs were significantly cheaper due to a bulk purchase discount.
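Those per-die figures can be roughly reproduced with the common dies-per-wafer approximation for 300mm wafers, using the wafer prices and 90% yield from the post above (the formula is a standard industry estimate, not anything from AMD, and it lands within a dollar of the quoted $54.95 and $31.47):

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Common approximation: gross dies on a round wafer, minus edge losses."""
    radius = wafer_diameter_mm / 2
    return (math.pi * radius ** 2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_cost_usd, die_area_mm2, yield_rate=0.90):
    """Spread the wafer cost over the dies that actually work."""
    return wafer_cost_usd / (dies_per_wafer(die_area_mm2) * yield_rate)

fury_x = cost_per_good_die(4500, 596)     # 28nm, 596 mm^2: ~$55 per die
polaris10 = cost_per_good_die(7250, 236)  # 14nm FinFET, 236 mm^2: ~$31 per die
```

The smaller die wins twice: far more candidates per wafer, and edge losses eat a smaller fraction of them, which is how a pricier 14nm wafer can still yield a cheaper die.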


----------



## guttheslayer

Quote:


> Originally Posted by *Fyrwulf*
> 
> Well, we have to consider several factors. For a process to be production ready, a minimum of 90% of the dies must be good. Per wafer prices right now are roughly $4,500 per for 28nm and $7,250 per for 14/16nm FinFET. Then we have to factor in that a Fury X die is 596mm^2 and our best information is that a P10 die is 236mm^2. Plugging that into a handy calculator gives us a per die price of $54.95 for the Fury X and $31.47 for Polaris 10. That's the out the door price for a raw die, mind. We also have to figure that with AMD using cheap and mature GDDR5, a smaller PCB, and an inexpensive cooling solution, they can in fact afford the price point they're talking about. Maybe their profit margins won't be as high, but their pricing strategy is more Walmart than Gap (razor thin margins and massive movement of product).
> 
> Something else to consider is that those are TSMC prices. It's public knowledge that after their Qualcomm and Apple production runs, Samsung cut prices on 14nm FinFET for follow-on customers. We really don't have any idea what Samsung is charging AMD, but between their new GPU and CPU production, it wouldn't surprise me if the per wafer costs were significantly cheaper due to a bulk purchase discount.


That is a very insightful post if it's true... Yes, I still believe $300 is possible for that die size, and maybe it will be 980 Ti comparable. They are already not as fast as Pascal (the 1080 should be faster than the Titan X), so they need something else to appeal to the market.

What's more, the 980 Ti has been around for 12 months.


----------



## Fyrwulf

Quote:


> Originally Posted by *guttheslayer*
> 
> That is a very insightful post if its true... Yes i still believe 300 dollar is possible for that die size, and maybe it will be 980 ti comparable. They are already not as fast as pascal (1080 shld be faster than titan x) so they need something else to appeal the market.
> 
> Somemore 980 ti has been around for 12 months.


I think Polaris is aimed at the 390X-and-below crowd. Roy Taylor is right: if nVidia is pushing the high end, that does nothing for the legions of people who can't afford a $600 graphics card. However, those same people surely would like 980 Ti performance or close to it. If these rumors pan out, the resale market on a 980 Ti is going to hit rock bottom right out of the gate, and then people are going to have to decide between spending $250 on a second-hand card with no future-proofing or $300 on a brand-new card that approaches Fury X speeds and is actually capable in a DX12 environment.

The irony of this situation is that nVidia has made their money on some really good middle-tier cards, but they seem determined to sacrifice that for their high-margin top-tier parts. That might bite them in the behind if their DX12 performance doesn't significantly improve, because they'll be crushed in reviews for the perceived failure and nVidia's hard-won mindshare will start to tank.


----------



## Forceman

Quote:


> Originally Posted by *Fyrwulf*
> 
> I think Polaris is aimed at the 390X and before crowd. Rory Taylor is right, if nVidia is pushing the high end that does nothing for the legions of people who can't afford a $600 graphics card. However, those same people surely would like 980Ti performance or close to it. If these rumors pan out, the resale market on a 980Ti is going to hit rock bottom right out of the gate and then people are going to have to decide on spending $250 on a card that is second hand and has no future proofing or a $300 brand new card that will approach Fury X speeds and is actually capable in a Dx12 environment.
> 
> The irony of this situation is that nVidia has made their money on some really good middle tier cards, but they seem determined to sacrifice that for their high margin top tier parts. That might bite them in the behind if their Dx12 performance doesn't significantly improve, because they'll be crushed in reviews for the perceived failure and nVidia's hard won mindshare will start to tank.


Why does everyone seem to be assuming Nvidia is going to ignore the mid-market? They are certainly planning to release GP106 cards.


----------



## EightDee8D

Quote:


> Originally Posted by *Forceman*
> 
> Why does everyone seem to be assuming Nvidia is going to ignore the mid-market? They are certainly planning to release a GP106 card(s).


Because Nvidia gave only minor improvements across the 660, 760, and 960? That's the market AMD will focus on with Polaris.

That doesn't mean Nvidia is going to ignore the mid-market, but in the last 2-3 years they haven't paid much attention to this segment.


----------



## Forceman

Quote:


> Originally Posted by *EightDee8D*
> 
> Because nvidia gave minor improvements to 660-760-960 ? that's the market Amd will focus with Polaris.
> 
> That doesn't mean Nvidia is going to ignore the mid-market, but in last 2-3 years they haven't paid that much focus in this segment.


Isn't that the same market where AMD just rebrands all their cards? I don't pay much attention to that segment, but isn't Tonga the only new chip there in like three years?


----------



## KarathKasun

Honestly, AMD has been able to compete by moving chips down the SKU stack. It has been cutting into margins, but NV hasn't really brought anything head and shoulders above AMD's three-year-old parts except the GTX 980 Ti.


----------



## Fyrwulf

Quote:


> Originally Posted by *Forceman*
> 
> Why does everyone seem to be assuming Nvidia is going to ignore the mid-market? They are certainly planning to release a GP106 card(s).


Because recent rumors suggest that they're going to release the 1070/1080 first then follow that up with the Ti and Titan lines. More to the point, if Pascal isn't much better than Maxwell in Dx12, I'm not sure there's a _point_ to the mid-tier card from nVidia.


----------



## Fyrwulf

Quote:


> Originally Posted by *Forceman*
> 
> Isn't that the same market AMD just rebrands all their cards in? I don't pay much attention to that segment, but isn't Tonga the only new chip there in like 3 years?


Yes. And they're deep-sixing everything that isn't a Fury to make room for Polaris.


----------



## Forceman

Quote:


> Originally Posted by *Fyrwulf*
> 
> Because recent rumors suggest that they're going to release the 1070/1080 first then follow that up with the Ti and Titan lines. More to the point, if Pascal isn't much better than Maxwell in Dx12, I'm not sure there's a _point_ to the mid-tier card from nVidia.


Latest rumors I've seen say GP106 launches in-between those two.
Quote:


> NVIDIA is expected to launch the first consumer graphics cards based on the GP106 silicon some time in Autumn 2016 (late Q3-early Q4).


http://www.techpowerup.com/221848/nvidia-to-launch-mid-range-gp106-based-graphics-cards-in-autumn-2016


----------



## EightDee8D

Quote:


> Originally Posted by *Forceman*
> 
> Isn't that the same market AMD just rebrands all their cards in? I don't pay much attention to that segment, but isn't Tonga the only new chip there in like 3 years?


Yes, because their old cards get free performance upgrades via drivers, and they keep pace with Nvidia's cards without needing an upgrade.


----------



## Fyrwulf

Quote:


> Originally Posted by *Forceman*
> 
> Latest rumors I've seen say GP106 launches in-between those two.


Well, I'll concede the point, then. I hadn't heard that, but I don't pay very close attention to nVidia right now. It seems a very odd market strategy, but we'll see how it works for them.


----------



## Forceman

Quote:


> Originally Posted by *Fyrwulf*
> 
> It seems a very odd market strategy, but we'll see how it works for them.


It's the same strategy they've had for a while. Have the fastest card, and then turn that reputation into sales lower down the stack. It's worked pretty well for them so far.

The only thing these $300 980 Ti rumors are doing is making the inevitable disappointment that much greater when it doesn't materialize.


----------



## Fyrwulf

Quote:


> Originally Posted by *Forceman*
> 
> It's the same strategy they've had for a while. Have the fastest card, and then turn that reputation into sales lower down the stack. It's worked pretty well for them so far.


And it's worked because AMD has been rebranding the hell out of their mid-tier cards. Like I said, unless there's a massive improvement in Dx12 performance, AMD's going to use that as a cudgel to take them out at the knees in that price point.
Quote:


> Originally Posted by *Forceman*
> 
> The only thing these $300 980 Ti rumors is doing is making the inevitable disappointment that much greater when it doesn't materialize.


The rumor is that Polaris 10 will have _near_ the performance of the 980Ti. I honestly expected 390X levels of performance and would quite happily spend $350 for that, so anything in the region of 980 levels would be even better.


----------



## dagget3450

The other side of the question for me is that "near" 980 Ti could be good or bad depending on which API we're talking about. In quite a few DX12 titles the 390X has almost caught up with a 980 Ti. Are they using the DX11 side of things for the 980 Ti and Polaris 10? I would assume they are using DX12, which in my opinion would put it in 390X territory. However, if they are factoring in DX11 980 Ti performance, then it would be a nice gain over the 390X/Nano/Fury Pro.


----------



## Serios

Quote:


> Originally Posted by *dagget3450*
> 
> The other side of the question for me is, "near" 980ti could be good and bad depending on what API were talking? In quite a few dx12 titles 390x has almost caught up with a 980ti? Are they using the dx11 side of it for 980ti and polaris 10? I would assume they are using dx12, and thus in my opinion would be 390x territory. However if they are factoring in dx11 980ti performance, then it would be a nice gain over 390x/nano/fury pro?


They are using 3DMark.
The rumor says the top Polaris GPU scored close to the 980 Ti and Fury X in 3DMark.
We don't know what that says about games. These new GPUs have improvements that should make them perform better in DX11 and maybe even DX12.

Also, as AMD has demoed Polaris GPUs multiple times, I think drivers will be more than acceptable at launch and they won't repeat the mistakes they made with the introduction of GCN.


----------



## KeepWalkinG

I will get two Polaris cards if they are $300-350!!!


----------



## darealist

Most likely it'll only be close in that mediocre, unfinished Hitman game.


----------



## Nickyvida

Polaris is a disappointment if true.

Even after a long-overdue node shrink, a new architecture, and supposed memory improvements like VRAM OC and GDDR5X, it's still only within reach of a 980 Ti?

Vega it is, then. GPU advancement is going the way of CPU development: milking sky-high prices for mediocre improvements. Just look at Nvidia, with AMD likely to follow suit.


----------



## KarathKasun

What you are skipping is the fact that it is GTX 980 Ti-level performance at HALF the die size and power consumption.

If Vega is ~500mm^2, you are looking at 2x 980 Ti performance or more from that card in six months or so.


----------



## Nickyvida

Quote:


> Originally Posted by *KarathKasun*
> 
> What you are skipping is the fact that it is GTX 980Ti level performance at HALF the size and power consumption.
> 
> If Vega is ~500m^2, you are looking at 2X 980Ti performance or more for that card in 6 months or so.


Polaris should have been what Vega is, imho.

After waiting almost three years for a node shrink, amid incremental improvements and the failed 20nm debacle, it would have been better if they could at least have edged out the 980 Ti at half the size and power consumption, like you mentioned. Hyping it up with the 2.5x figure and then delivering 980 Ti performance, something Nvidia achieved a year or two ago, killed it for me.


----------



## KarathKasun

AMD has to push out the high-volume, good-profit card first; they have no other option. Ya know, unless they want to win in performance and not be able to grow their market share.


----------



## Nickyvida

Quote:


> Originally Posted by *KarathKasun*
> 
> AMD has to push out the high volume good profit card first, they have no other option. Ya know, unless they want to win in performance and not be able grow their market share.


Yeah, I guess. Just pretty peeved with all the hype, and then finding out I have to wait until 2017, after nearly four years of waiting, just to get a decent increase in performance...







They had nearly four years to work on Polaris, and even with the node shrink and new architecture it is still disappointing imo, given that they skipped 20nm.

The GPU market is pretty stagnant now; just look at Nvidia, potentially charging in excess of $600 for a midrange GP104 die. And we haven't even gotten to Vega/Big Pascal yet. The prices are out of control and the reported performance is found wanting.


----------



## Serios

Quote:


> Originally Posted by *Nickyvida*
> 
> Polaris is a disappointment if true.
> 
> Even after a long overdue node shrink, new architecture, supposed memory improvements like VRAM oc and GDDR5x, it's still only within reach of a 980Ti?
> 
> Vega it is then. GPU advancement is going the way of CPU development, milking sky high prices for mediocre improvement. Just look at Nvidia and AMD likely to follow suit.


Polaris is midrange, and a small die at around 236mm^2.
What's so bad about just matching a 601mm^2 GPU, even if it's on 28nm?

With Vega we will most likely see a big and a very big die.
AMD is doing its best to address the biggest part of the gaming GPU market right now.
It's not a bad strategy.


----------



## Klocek001

I guess we should wait for the actual cards and reviews before we voice our disappointment.
My bet is we're not getting any sort of big performance jump this year. If we get a 30% increase over the 980 Ti it's gonna be the 1080, but the price tag would be disgusting for a medium-size chip. We might also get great performance/price from Polaris, but again they won't even match a reference 980 Ti, not to mention AIBs.
I'm probably still gonna wait for Vega or big-die Pascal, and maybe buy a used 980 Ti in the meantime for SLI; people will be selling like crazy. I'm not fond of buying mid-range chips even if they are a step up from the former big die. They offer better performance and good power efficiency, but they also get replaced by a high-end chip pretty quickly.


----------



## 364901

Quote:


> Originally Posted by *Nickyvida*
> 
> They had nearly four years to work on Polaris and even with the node shrink and new architecture, it is still disappointing imo, given that they skipped 20nm.


Keep in mind that both NVIDIA and AMD must have taped out their designs at 20nm for testing, but found that things weren't working correctly or to the specification they wanted, and the performance improvements just weren't there. They didn't just decide to skip 20nm on a whim, there was some serious work done to determine its viability.


----------



## n64ADL

All this hype is kinda making me want to skip this summer's lineup even more. I'll just get another R9 390 to hold me over for the nine or so months until Vega comes out. I'll wait until the price goes down when Polaris launches; I'm more concerned about upgrading my load times and slowly converting my computer to almost all SSDs. Especially with the announcement of the Nintendo NX next year around the same time, I'm gonna spend less now and spend more later. Both the Nintendo NX and Vega come out in Q1 2017.


----------



## zealord

Quote:


> Originally Posted by *n64ADL*
> 
> all this hype is kinda making me want to skip this summers lineup more, i'll just get another r9 390 to hold me over for the 9 or so months until vega comes out. I'll wait till the price goes down when polaris launches, more concerned about upgrading my load times and slowly convert my computer to almost all SSD's. especially with the announcment of nintendo NX next year around the same time, i'm gonna save up now by spend less now, spend more later. both the nintendo NX *and vega come out in Q1 2017*.


I would not count on that or do you have an official source for that?


----------



## n64ADL

I could have sworn they both were coming out at the same time. Here's the link I first saw.

My bad, this is the correct link I saw: http://wccftech.com/amd-powerful-c99-polaris-gpu-vega-10-test-vehicle/


----------



## bigjdubb

Quote:


> Originally Posted by *n64ADL*
> 
> i could of sworn they both were coming out at that same time. heres the link i first saw.
> 
> my bad this is the correct link i saw. http://wccftech.com/amd-powerful-c99-polaris-gpu-vega-10-test-vehicle/


I think all of us have seen that thread. It is a lot of speculation with nothing substantial backing it up at this point. I think once we get past the releases this summer we will start to get more concrete information on what's going to happen in 2017.


----------



## gamervivek

That 110k board did seem like a Greenland/Vega candidate; too bad the 48k board turned out to be the small Polaris XT.


----------



## Vesku

Quote:


> Originally Posted by *Nickyvida*
> 
> Yeah i guess Just pretty peeved with what and all about the hyping up and then finding out i have to wait till 2017 after nearly 4 years of waiting just to get a decent increase in performance...
> 
> 
> 
> 
> 
> 
> 
> They had nearly four years to work on Polaris and even with the node shrink and new architecture, it is still disappointing imo, given that they skipped 20nm.
> 
> GPU market is pretty stagnant now, just look at Nvidia, potentially charging in excess of $600 for a mid range GP104 die. And we haven't even gotten to Vega/Big Pascal yet. The prices are out of control and the supposedly reported performance wanting.


Bottom line is HBM2 just isn't ready for selling into retail. In terms of not wanting to get overcharged, if I'm going to be buying a $400+ GPU this generation it better have 8+GB of HBM2.


----------



## Nickyvida

Quote:


> Originally Posted by *Vesku*
> 
> Bottom line is HBM2 just isn't ready for selling into retail. In terms of not wanting to get overcharged, if I'm going to be buying a $400+ GPU this generation it better have 8+GB of HBM2.


Just isn't ready? So why are GP100 Tesla boards carrying it?


----------



## Fyrwulf

Quote:


> Originally Posted by *Nickyvida*
> 
> Yeah i guess Just pretty peeved with what and all about the hyping up and then finding out i have to wait till 2017 after nearly 4 years of waiting just to get a decent increase in performance...
> 
> 
> 
> 
> 
> 
> 
> They had nearly four years to work on Polaris and even with the node shrink and new architecture, it is still disappointing imo, given that they skipped 20nm.
> 
> GPU market is pretty stagnant now, just look at Nvidia, potentially charging in excess of $600 for a mid range GP104 die. And we haven't even gotten to Vega/Big Pascal yet. The prices are out of control and the supposedly reported performance wanting.


Fury X is 30% faster than the 390X. Polaris is the 390X replacement, yet you're not satisfied with a decent performance increase at half the power draw, if the rumors hold true? If you've been sitting on your card for four years, that means you likely have a 7970. Polaris would murder a 7970 in a benchmark contest. Talk about millennial entitlement.


----------



## Nickyvida

Quote:


> Originally Posted by *Serios*
> 
> Polaris is midrange and a small one like 236mm^2.
> What's so bad about just matching a 601mm^2 GPU even is it's on 28nm?
> 
> Whit Vega we will most likely see a big and a very big die.
> AMD is doing it's best to address the biggest part of the gaming GPU market right now.
> It's not a bad strategy.


Whatever happened to midrange dies beating the big die of the last gen? Like the 680 vs the 590 and older?

Now it's just all about efficiency.


----------



## Vesku

Quote:


> Originally Posted by *Nickyvida*
> 
> Just isnt ready? So why are GP100 Tesla boards carrying it?
> 
> Polaris is pretty much overcharged anyway with GDDR5x. Whatever happened to mid range dies beating the big die of the last gen? Like the 680 vs 590 and older?
> 
> Now it's just all about efficiency.


Not ready for RETAIL. GP100 boards with HBM2 can only be purchased as part of ~$100,000 systems, with first deliveries scheduled (not yet shipped) for June. Doesn't matter who makes it; I think the only *retail* gaming GPUs this generation worth paying more than $400 for will be the ones with 8+GB of HBM2.


----------



## cowie

It's all good; even if they don't beat or match the 980 Ti, it may push Nvidia's prices down.


----------



## PontiacGTX

Quote:


> Originally Posted by *Nickyvida*
> 
> Just isnt ready? So why are GP100 Tesla boards carrying it?.


Those cards haven't been released yet for the whole market. And AMD, the main co-developer of HBM (as they did with GDDR5), wouldn't be using it on a performance-tier GPU, given it would be too expensive; it is only aimed at high-end GPUs.
Quote:


> Originally Posted by *Nickyvida*
> 
> Whatever happened to mid range dies beating the big die of the last gen? Like the 680 vs 590 and older?
> 
> Now it's just all about efficiency


There aren't any released GPUs yet to tell whether this will outperform an R9 Fury X.
Quote:


> Originally Posted by *cowie*
> 
> its all; good even if they don't beat match the 980ti it may put NVidia prices down


They can't push Nvidia's prices down, since Nvidia sets whatever MSRP it wishes; the GTX 580 was a high-end GPU for $500, and then the GTX 580's replacement in the 600 series was a midrange GPU for $550.


----------



## prjindigo

Quote:


> Originally Posted by *Nickyvida*
> 
> Just isnt ready? So why are GP100 Tesla boards carrying it?.


THERE ARE NO GP100 BOARDS. The Tesla PCIE card shown at nVidia's press thing was a Maxwell Tesla card used to feed the XG100 HBM2 64bit-only cluster processors in the GDX-1. Those XG100 processors are the "P100" but are not "GP100" and they only had 9 that worked. They have HBM2 ram mounted on-top of their processor die and can only clock to a maximum of 850MHz (the ones in the GDX-1 were only managing about 700) and are 110% NOT a consumer product. XP100 at 850MHz require massive heatsinks and strong flow to keep operational and they operate as a daughter-board cluster to a pair of 2011v3 Xeon's. They're not for graphics and the memory isn't on an interposer. The XP100 "Tesla Pascal" is server-only hardware that cannot operate at 32bit. nVidia did this advertising crap with Titan and Kepler and even the 600 series cards where they talked tons about their super-card and then sold you slices of it.

That "P100" you think is gonna come and be a graphics card for you? Price on it is something like $12,000.00 per chip _once they have a yield higher than 9 out of 100._

A GP104-300 is about 2/3rds the power of a "GP100" chip, a GP104-200 is probably little more than half (1070) and supposedly we're gonna see a third GP104 chip that might be either a 1060 or 1080ti or possibly even their "titan" but it's not gonna be HBM either.

But this is a thread about AMD, not for misinformation about nVidia's deceptive marketing practices.


----------



## Kokin

Quote:


> Originally Posted by *PontiacGTX*
> 
> Nvidia the main co developer of HBM (like did with GDDR5)


HBM was co-developed between AMD and Hynix. In fact AMD started working on HBM as early as 2008/2009. We haven't even seen an actual demo of Nvidia using HBM on their GPUs and now they have become a "main co-developer"?

This thread is just ridiculous with all the misinformation and "how smart can I sound if I say this" type of speculation. I know this is how things are when new products are on the horizon, but c'mon guys!


----------



## flopper

Quote:


> Originally Posted by *PontiacGTX*
> 
> AMD the main co developer of HBM


Fixed it for you.
Use Google when you don't know what you're doing.


----------



## variant

Quote:


> Originally Posted by *PontiacGTX*
> 
> Nvidia the main co developer of HBM (like did with GDDR5)


Quote:


> Originally Posted by *Kokin*
> 
> HBM was co-developed between AMD and Hynix. In fact AMD started working on HBM as early as 2008/2009. We haven't even seen an actual demo of Nvidia using HBM on their GPUs and now they have become a "main co-developer"?
> 
> This thread is just ridiculous with all the misinformation and "how smart can I sound if I say this" type of speculation. I know this is how things are when new products are on the horizon, but c'mon guys!


Not only was HBM developed by AMD, but so was GDDR5. ATI designed GDDR and the subsequent GDDR2, GDDR3, and GDDR4. GDDR5 was in development when AMD bought ATI.


----------



## PontiacGTX

Quote:


> Originally Posted by *Kokin*
> 
> HBM was co-developed between AMD and Hynix. In fact AMD started working on HBM as early as 2008/2009. We haven't even seen an actual demo of Nvidia using HBM on their GPUs and now they have become a "main co-developer"?
> 
> This thread is just ridiculous with all the misinformation and "how smart can I sound if I say this" type of speculation. I know this is how things are when new products are on the horizon, but c'mon guys!


Thanks for correcting the mistake; for some reason I had written Nvidia instead of AMD.
Quote:


> Originally Posted by *flopper*
> 
> Fixed it for you.
> use google before you dont know what your doing.


Thanks, but there are other possibilities, like a simple mistake?


----------



## KarathKasun

Quote:


> Originally Posted by *prjindigo*
> 
> THERE ARE NO GP100 BOARDS. The Tesla PCIE card shown at nVidia's press thing was a Maxwell Tesla card used to feed the XG100 HBM2 64bit-only cluster processors in the GDX-1. Those XG100 processors are the "P100" but are not "GP100" and they only had 9 that worked. They have HBM2 ram mounted on-top of their processor die and can only clock to a maximum of 850MHz (the ones in the GDX-1 were only managing about 700) and are 110% NOT a consumer product. XP100 at 850MHz require massive heatsinks and strong flow to keep operational and they operate as a daughter-board cluster to a pair of 2011v3 Xeon's. They're not for graphics and the memory isn't on an interposer. The XP100 "Tesla Pascal" is server-only hardware that cannot operate at 32bit. nVidia did this advertising crap with Titan and Kepler and even the 600 series cards where they talked tons about their super-card and then sold you slices of it.
> 
> That "P100" you think is gonna come and be a graphics card for you? Price on it is something like $12,000.00 per chip _once they have a yeild higher than 9 out of 100._
> 
> A GP104-300 is about 2/3rds the power of a "GP100" chip, a GP104-200 is probably little more than half (1070) and supposedly we're gonna see a third GP104 chip that might be either a 1060 or 1080ti or possibly even their "titan" but it's not gonna be HBM either.
> 
> But this is a thread about AMD and not for missinformation about nVidia's deceptive marketing practices.


Holy cow, you are delusional.

We have seen ACTUAL GP/P100 dies, and HBM is not on top of the GP/P100 die. There are 3D-stacked chips, but they are not high power; they are used in cell phones. High-power chips use 2.5D stacking (all components mounted beside each other on a passive interposer) for thermal dissipation. I have pics; it did happen.

You do not know the intended market for the chips either. You only use math that is as accurate as you need: FP32 when it's needed and FP64 when that is needed. FP32 vs. FP64 can have a huge impact on memory consumption and bandwidth needs, which in turn can have a negative impact on performance.


----------



## Forceman

Quote:


> Originally Posted by *KarathKasun*
> 
> Holy cow, you are delusional.
> 
> We have seen ACTUAL GP/P100 dies, HBM is not on top of the GP/P100 die. There are 3D stacked chips and they are not high power, they are used in cell phones. High power chips use 2.5D stacking (all components mounted beside each other on a passive interposer) for thermal dissipation. I have pics, it did happen.
> 
> You do not know the intended market for the chips either. You only use math that is as accurate as you need, FP32 when its needed and FP64 when that is needed. FP32/64 can have a huge impact on memory consumption and bandwidth needs, which in turn can have a negative impact on performance.


Don't waste your time. He's been told numerous times in multiple threads, including photographic evidence. He doesn't care about the truth.


----------



## hokk

£300 for 980 Ti performance?

If this is true, I'll snatch one up on release.

I've not been impressed with the price hikes over the past three gens.


----------



## variant

Quote:


> Originally Posted by *kylzer*
> 
> £300 for 980TI performance ?
> 
> If this is true
> 
> 
> 
> 
> 
> 
> 
> i'll snatch one up on release.
> 
> I've not been impressed with the price hike over the past 3 gens.


Best to just not get any hopes up based on rumors.


----------



## Malinkadink

I don't know why people are so doubtful that we'll get 980 Ti performance for $300-400; that is exactly what we got with the 970 vs the 780 Ti, and I expect no less from Nvidia or AMD this time around, especially since we're finally jumping ship from 28nm.


----------



## variant

Quote:


> Originally Posted by *Malinkadink*
> 
> I don't know why people are so doubtful that we wont get 980 Ti performance for $300-400, that is exactly what we got with the 970 vs the 780 Ti, and i expect no less from Nvidia or AMD this time around especially since we're finally jumping ship from 28nm.


I think it's possible, but I am just skeptical as a whole about any rumor.


----------



## svenge

Quote:


> Originally Posted by *Malinkadink*
> 
> I don't know why people are so doubtful that we wont get 980 Ti performance for $300-400, that is exactly what we got with the 970 vs the 780 Ti, and i expect no less from Nvidia or AMD this time around especially since we're finally jumping ship from 28nm.


$400 for 980Ti performance is _somewhat_ plausible, but expecting $300 is lunacy.

At this early date it's best to not pretend to know what either company is going to do in terms of performance and/or price, as it'll all sort itself out in a couple months anyhow. It's not quantum physics; attempting to observe it won't change the outcome.


----------



## Malinkadink

Quote:


> Originally Posted by *svenge*
> 
> $400 for 980Ti performance is _somewhat_ plausible, but expecting $300 is lunacy.


I got my 970 at launch for around $350, so if anything we'll see 980 Ti performance for $350, at least from Nvidia, and AMD can surely undercut them and sell 980 Ti performance for $300 if they so please. We'll see in a couple of months. I'm definitely gonna grab the best bang-for-buck card in this price segment, as I feel the higher-end stuff never truly justifies itself in terms of price/performance.


----------



## spinFX

Quote:


> Originally Posted by *Newbie2009*
> 
> From what I have heard, this is going to blow Intel i7 series out of the water. Twice the power, half the price. All we have to do is wait.


Apples and oranges.


----------



## KarathKasun

Quote:


> Originally Posted by *Forceman*
> 
> Don't waste your time. He's been told numerous times in multiple threads, including photographic evidence. He doesn't care about the truth.


All of his posts are just confusing to the uninformed. It's not like he is posting "NV IS TEH AWESOME!" It's more like a technical brief where everything is just wrong.


----------



## mohit9206

My prediction is that AMD's 980 Ti-equivalent card is going to be $450, not $300.


----------



## Newbie2009

Quote:


> Originally Posted by *mohit9206*
> 
> AMD equivalent 980Ti card is going to be $450 not $300 is my prediction.


It all depends on the jump in performance with the shrink.


----------



## tajoh111

Quote:


> Originally Posted by *KarathKasun*
> 
> All of his posts are just confusing to the uninformed. Its not like he is posting "NV IS TEH AWESOME!" Its more like a technical brief where everything is just wrong.


He hates Nvidia. If you couldn't get that from his posts, then you're not reading them right. Basically his agenda is that everything Nvidia says is a lie, they are going to underperform, they are going to fail, etc. While on the AMD side it's praise be to AMD, everything is going to go right, plus another 50%.
Quote:


> Originally Posted by *mohit9206*
> 
> AMD equivalent 980Ti card is going to be $450 not $300 is my prediction.


This is more realistic.

AMD has to set pricing as high as it can for the class range, because it only goes down from here. If they price too low and prices fall too far, it will make pricing the next card they release at profitable levels difficult. The more expensive the market is, the easier it is to price higher. Which they will need, since we could be on this node for a long time.

AMD is very likely to deliver performance between a 390 and a 390X for $350 or under, but for GTX 980 Ti-level performance it's going to charge more. Otherwise, for the cards that do perform like a GTX 970 or 390, which there certainly will be, they would have to charge less than $200 for the price-to-performance to make sense. That's just not profitable, as it would make cards cheaper for the die size than on 40nm (and we can forget about a 128-bit-bus Polaris 11 performing at this level). It's bad business when your wafer prices double. Nvidia is going to go through this as well, of course, and is probably paying through the nose for GDDR5X if they can get it on their cards in June.

These wafers are not cheap, and GF is likely charging AMD more per wafer than TSMC charges Apple, as AMD doesn't order in the volume those companies do. In addition, it's a higher-performance process than what Apple last used on the A9, and GF itself has to pay a licensing fee to Samsung, which is likely very high since the R&D for process technology runs into the billions. Plus GF has to buy equipment, etc. This is a significant point because GF is a smaller manufacturer, doing $5-6 billion annually in revenue. That pales next to the $24 billion and $35 billion that TSMC and Samsung, respectively, bring in from semiconductor manufacturing.
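The die-size-versus-wafer-cost argument above can be sketched with the standard dies-per-wafer approximation. The wafer price and yield below are illustrative placeholders, not actual GloFo 14nm figures; only the geometry formula is the commonly used estimate.

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Common approximation: dies by raw wafer area minus an edge-loss term."""
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius ** 2 / die_area_mm2            # dies by area alone
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return math.floor(gross - edge_loss)

# Illustrative numbers only: a ~232 mm^2 Polaris 10-class die on a 300 mm wafer.
dies = dies_per_wafer(300, 232)            # candidate dies per wafer
yield_rate = 0.70                          # assumed yield, not a known figure
wafer_price_usd = 8000                     # assumed wafer price, for illustration
cost_per_good_die = wafer_price_usd / (dies * yield_rate)
print(dies, round(cost_per_good_die, 2))
```

With these placeholder inputs the raw silicon cost per good die stays modest even if the assumed wafer price doubles, which is the gist of why a small die can anchor a ~$300 card while a 600 mm^2 part, yielding far fewer dies per wafer, cannot.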


----------



## Fyrwulf

Once again, the rumor is _near_ 980Ti performance for $350. Near, to me, means 980 level performance. If the 980 released for $350, that's perfectly in line.


----------



## julizs

Quote:


> Originally Posted by *tajoh111*
> 
> AMD has to set pricing as high as it can for the class range because it only goes down from here. If they price too low and prices fall too much, it will make pricing the next card they release at profitable levels difficult. The more expensive the market is, the easier it is to price higher. Which they will need since we could be on this node for a long time.
> 
> AMD is very likely to deliver performance between a 390 and a 390X for $350 or under, but for GTX 980 Ti-level performance it's going to charge more than that.


It's basically the same every time a new generation launches... people expect it to underperform and be overpriced. The 970 was supposed to be just a bit faster than the 770, the 980 Ti was supposed to be 800-900 Euros.

In the end the 970 is (almost) heads up with the 780 Ti, on that same lame 28nm process. Why would you expect ANYTHING less than that on the new mega die shrink, even if it's just the small chip?

AMD also said in a new article that they want to offer the VR experience to the *mainstream* and that the new Polaris 10 is aimed at them. Is $450 a mainstream price? No.

If Polaris 10 were, as you wrote, R9 390-level at $350 or under, that would be laughable, since we could buy a used R9 290 for around 200 Euros two years ago already. I don't even remember exactly, because the 390/390X launch was just *so* disappointing.

To me it's pretty clear that Nvidia's as well as AMD's new midrange are going to be 980 Ti-level at around 350 Euros, and I'm already selling off my 980 Ti to avoid taking a sudden 300 Euro loss like the 780 Ti owners did.


----------



## flopper

Quote:


> Originally Posted by *julizs*
> 
> To me it's pretty clear that Nvidia's as well as AMD's new midrange are going to be 980 Ti-level at around 350 Euros, and I'm already selling off my 980 Ti to avoid taking a sudden 300 Euro loss like the 780 Ti owners did.


welcome to AMD the better technology


----------



## TheLAWNOOB

Quote:


> Originally Posted by *Fyrwulf*
> 
> Once again, the rumor is _near_ 980 Ti performance for $350. Near, to me, means 980-level performance. If the 980 had released for $350, that would have been perfectly in line.


980 = 390X
Paying $350 for another 390X-level card would suck.


----------



## iLeakStuff

I think it's extremely strange if AMD has nothing to counter the GTX 1080/1070 with until 2017.


----------



## Fyrwulf

Quote:


> Originally Posted by *tajoh111*
> 
> These wafers are not cheap, and GF is likely charging AMD more per wafer than Apple pays TSMC, as AMD doesn't order in the volume those companies do. In addition, it's a higher-performance process than the one Apple last used for the A9, and GF itself has to pay a licensing fee to Samsung, which is likely very high, since the R&D for process technology runs into the billions; GF also has to buy equipment, etc. This is a significant point, because GF is a smaller manufacturer, doing 5-6 billion annually in revenue. That pales next to the 24 and 35 billion that TSMC and Samsung, respectively, take in from semiconductor manufacturing.


This paragraph is so riddled with errors I don't know where to start.

http://www.fudzilla.com/news/processors/39023-samsung-cutting-prices-on-14nm-finfet
http://wccftech.com/samsung-lowering-14nm-finfet-prices-enable-client/

Samsung stole Apple's and Qualcomm's business by being cheaper than TSMC, whose 16nm wafers run about $7k out the door. Then it lowered prices even further for HiSilicon, which designs the chips in Huawei's phones.

There have been 1.6 million LG G5 and 9.5 million Galaxy S7 phones sold. The other flagship smartphones that use the Snapdragon 820 haven't been released yet, so there are no sales figures for them.

In comparison, there were 268 million x86 PCs shipped in 2015, not including Apple's sales. Even if the 11.9% Q1 2016 drop holds across the rest of the year, and AMD achieves 50% market share in new PCs shipped in the last two quarters, that's still 61 million units.

So, back to the Snapdragon 820. We have an idea that it costs $70 as a finished product, and while I believe cell-phone processor margins are sky-high, Intel's average per-processor margin is around 60%. Going by that, Samsung charges Qualcomm $42 per processor (a healthy 66% margin for Qualcomm). Since it costs Samsung roughly $10.71 to manufacture a single die, and probably another $10.71 to finish it, that means Samsung has a 102% margin.

So our notional 480X is going to cost AMD $42 per GPU, right? Nope: remember that per-die cost is wafer cost divided by the number of dies. The 820 is only 93.09mm^2, whereas we have a rumored die size of 296mm^2. That means the raw die is going to cost $36.94 and a completed GPU $73.88, right? Nope, because Samsung has cut prices again for follow-on customers, so let's assume a wafer now costs an additional 5% less, or $6,317.50. That's the end of it, right? Not quite, because economies of scale come into play, and you can roughly adjust for them with the following formula: UnitCost2 = (TotalUnits1/TotalUnits2)*(UnitCost1*0.95). Of course, a unit here has to be an entire wafer. That means there were 16,087 wafers for the 820's production run. For a notional 61-million-GPU production run, AMD would have to order 305,000 wafers.

UC2 = (16,087/305,000)*(6,650*0.95)
UC2 = 0.05*$6,317.50
UC2 = $315.88

So, is that accurate? Well, no. At production runs that size you hit a floor at a certain point, and beyond it more just costs more. The floor in this case is probably around 15x that, or $4,738.13. That comes out to a raw die cost of $26.32, or $52.64 for a complete die. There's no way AMD is going to eat a 100% markup; it'll probably be more like 50%, so $78.96 to AMD. That actually makes it fairly easy to figure out what AMD will make out of this, because we have a rough estimate of what Samsung will charge. If we figure AMD charges its AIBs 50% over and above what Samsung charges AMD, we come out with a profit of $39.47 per GPU, or a profit margin of 66%. At 61 million units, that's a gross profit of $2.407 billion. The sad thing is that their net profit off Polaris will probably still be zero, because they need to service their debts.

So, can AMD afford to charge a $350 MSRP? Yes, they can.
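The per-die arithmetic above can be sanity-checked with a standard gross-dies-per-wafer approximation. This is only a sketch: the 296 mm^2 die size and $4,738.13 wafer price are the rumored/assumed figures from the post, the formula ignores yield loss, and real good-die counts would be lower.

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic gross-dies-per-wafer approximation:
    usable wafer area divided by die area, minus an edge-loss term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def raw_die_cost(wafer_price: float, die_area_mm2: float) -> float:
    """Wafer price spread evenly across the gross die count."""
    return wafer_price / dies_per_wafer(die_area_mm2)

# The post's assumed floor price and rumored die size
print(dies_per_wafer(296))                    # gross dies per 300 mm wafer
print(round(raw_die_cost(4738.13, 296), 2))   # raw cost per (gross) die
```

The gross count comes out around 200 dies, so the post's $26.32 raw-die figure implies roughly 180 good dies per wafer, i.e. a yield assumption of about 90%.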


----------



## Fyrwulf

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> 980 = 390X
> Paying $350 for another 390X-level card would suck.


Actually, the 980 is midway between the 390X and the 980 Ti in most DX11 workloads.


----------



## Fyrwulf

Quote:


> Originally Posted by *iLeakStuff*
> 
> I think it's extremely strange if AMD has nothing to counter the GTX 1080/1070 with until 2017.


Their goal is to make money. Selling 3.5 million units, at the upside, does not do that. Selling 61 million units (read my previous post on the economics of the situation) does; it makes them a boatload. Remember, they don't exist for the enthusiast user base to adulate and denigrate them. They exist to make money, and they'll do that in the best way they know how.

EDIT: At the hypothesized wafer price above, and assuming 91 valid dies per wafer for a 600mm^2 GPU die, that's a raw die cost of $57.85, or $115.70 for a finished die. A 50% markup to the AIB is a gross profit of $57.85 per die. At 3.5 million units, that's a gross profit of $202.475 million. Do you see the difference?
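The two scenarios compare directly, using the per-die profit estimates from this post and the previous one (all inputs are the poster's own estimates, not confirmed figures):

```python
def gross_profit(units: int, profit_per_die: float) -> float:
    """Total gross profit for a production run at a fixed per-die profit."""
    return units * profit_per_die

# Poster's estimates: notional mainstream Polaris run vs a 600 mm^2 halo run
mainstream = gross_profit(61_000_000, 39.47)
halo       = gross_profit(3_500_000, 57.85)
print(round(mainstream / 1e9, 3), "billion vs", round(halo / 1e6, 3), "million")
```

Despite the bigger die earning more per unit, the volume part out-earns it by more than a factor of ten on these assumptions.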


----------



## KarathKasun

Quote:


> Originally Posted by *Fyrwulf*
> 
> This paragraph is so riddled with errors I don't know where to start.
> 
> *[snip: full cost analysis, quoted in full above]*
> 
> So, can AMD afford to charge a $350 MSRP? Yes, they can.


Die size is smaller AFAIK, 250mm^2 ballpark.


----------



## Fyrwulf

Quote:


> Originally Posted by *KarathKasun*
> 
> Die size is smaller AFAIK, 250mm^2 ballpark.


That doesn't change the math overmuch.


----------



## KarathKasun

Over a few million units it does.


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *iLeakStuff*
> 
> I think its extremely strange if AMD have nothing to counter GTX 1080/1070 with until 2017.


So much for not being the "budget" card/CPU maker.







I really don't know how they're going to play it with Polaris 10. Maybe it will be within 5-15% of the 980 Ti. TPU's relative performance chart at 1080p (http://tpucdn.com/reviews/Gigabyte/GTX_980_Ti_XtremeGaming/images/perfrel_1920_1080.png) puts the reference 980 Ti at 88% and the 390X (a 290X with 4 GB more VRAM and a fancy BIOS) at 68%, so a 980 Ti is about 29% faster. AMD was able to sell 290X cards for ~$250 on sale after the 970 was released, since they were getting slaughtered. IMO the 390X should be a $300 card, $350 *tops*. Now they want to release a card and focus on getting more people to the specs for VR (2160x1200 @ 90 FPS), which they claim takes a 970 or a 290: http://videocardz.com/59445/amd-polaris-aiming-at-vr-capable-graphics-cards
Quote:


> The reason Polaris is a big deal, is because I believe we will be able to grow that TAM significantly. I don't think Nvidia is going to do anything to increase the TAM, because according to everything we've seen around Pascal, it's a high-end part. I don't know what the price is gonna be, but let's say it's as low as £500/$600 and as high as £800/$1000. That price range is not going to expand the TAM for VR. We're going on the record right now to say Polaris will expand the TAM. Full stop.


So these are going to be cards around that $300 price range, but on 14nm instead of a rebrand, and IIRC they have a really small die. How much more performance they can get out of it than a 390X, who knows, but it had better be a big bump, because it feels like they've stagnated since the end of 2014 when Maxwell 2 (970/980) was released. And all the news that they're really focusing on performance per watt makes me feel we'll get a card that's hardly better than the 390X but way more power efficient.

Nvidia will counter this top-end Polaris 10 card with their cut-down Pascal GP104 die (1060?).
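The "29% faster" figure above falls straight out of the two TPU index numbers; a trivial check (88 and 68 are the chart values cited in the post):

```python
def pct_faster(a: float, b: float) -> float:
    """How much faster relative-performance index a is than index b, in percent."""
    return (a / b - 1) * 100

# TPU 1080p indices from the post: 980 Ti reference = 88%, 390X = 68%
print(round(pct_faster(88, 68), 1))
```

Note the gap is computed relative to the slower card's index, which is why it is larger than the 20-point difference between the two percentages.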


----------



## tajoh111

Quote:


> Originally Posted by *Fyrwulf*
> 
> This paragraph is so riddled with errors I don't know where to start.
> 
> *[snip: full cost analysis, quoted in full above]*
> 
> So, can AMD afford to charge a $350 MSRP? Yes, they can.


GF and Samsung are two different entities. Samsung is licensing its technology to GF. AMD is using GF to make its products, and GF in turn pays Samsung licensing or royalty fees so Samsung gets something out of it. So even though Samsung would lower the price of wafers for mega-customers like Qualcomm and Apple, even if AMD did business with Samsung directly, it doesn't mean Samsung is obligated to give that kind of pricing to AMD. AMD isn't a big enough customer for Samsung to get the same contracts as Qualcomm or Apple. AMD currently does about one billion dollars of business with GF, which covers basically its whole lineup; that isn't the kind of volume Samsung wants on its 14nm line, particularly once you carve the graphics portion out. But this is all hypothetical, since AMD isn't dealing with Samsung directly; it's dealing with GF.

Whatever GF charges, it is higher than what Samsung charges, since GF is licensing the technology from Samsung and likely has an agreement that prevents cannibalization of Samsung's own sales.

I didn't disagree that they could charge $350 and make some money (though it would be far less profitable), but this $300 figure is impossible. $300 leaves little to no room for the 4-6 distinct products that Polaris 10 and 11 will make up. Assuming this rumor is true for the top-end Polaris card, $300 for GTX 980 Ti performance would mean everything below it would need even better performance per dollar, since the top card usually has the worst performance per dollar. So the cut-down chip would have to be priced at $229 (with vanilla Fury-like performance), the card with GTX 970 performance would need to be priced at $160 (and this would still use a Polaris 10 chip; 128-bit cards aren't going to perform at this level), and Polaris 11 cards would have to come in under $100 for their prices to make sense against the Polaris 10-based cards. This is too low. Partners wouldn't be happy, as their margins would be too thin, and AMD would not be making much money. Sure, they'd get the volume, but when profits are split three ways between AMD, AMD's board partners, and retailers, the money wouldn't cover the rest of their operating expenses along with their R&D. It's not just the increased cost of wafers that makes the price increase necessary; it's the increased cost of R&D as well, which is more than double that of 28nm cards.

But even $350 would not be all that profitable. It would just be a bit better, and it would run into the same issues on a smaller scale: trying to pack so many SKUs into such a small price window relative to the top SKU's price to performance.

Thus it doesn't make sense to price these cards below the initial pricing of 28nm cards.

One thing I do agree with you on is that die costs are probably in the $80 region, plus or minus $10. AMD is not getting the yields Apple gets, because its chips are larger and it doesn't have the tools and R&D budget Apple has to make improvements at that scale. Although $80 seems cheap enough, it's when you add in the rest of the costs that you realize the pie had better be big once you add in the rest of the partners and their expenses: the cost of the rest of the card, the marketing, the partners' share, the retailers' share, and logistics. With that in mind, AMD would not be making that much money, particularly when the volume card is likely to be the cut-down version with much better yields.


----------



## Fyrwulf

Quote:


> Originally Posted by *KarathKasun*
> 
> Over a few million units it does.


Economies of scale are economies of scale; the floor still exists, and it's hit well before the end of a production run that large. And it doesn't really do anything for AMD's net bottom line, either. If they manage to grab 50% of the total GPU market, all it will do is enable them to pay off some of their debts.


----------



## KarathKasun

With current defect density and die size numbers you can calculate a rough estimate of yield, and from that the cost per die, depending on how much is disabled and how many bins there are.
With recent figures it's in the ballpark of $60-$70 per uncut P10 and $50-$60 per first-salvage P10. P11, OTOH, is ~$40, with much less to be gained from cut-down salvage bins.

Regardless, you are right, Fyrwulf. I'm not sure a net gain will be possible from Polaris alone. Even if Zen ends up being something that can move units with good margins as well, I think AMD will only just be able to claw back into the black. They have tons of debt coming to maturity in the next 18 months or so. Even if they gain significant market share on both fronts, it will still be an uphill struggle.
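The yield-from-defect-density step mentioned above is usually done with something like the Poisson yield model. A minimal sketch; the die area, defect density, gross die count, and wafer price below are all hypothetical inputs for illustration, not published figures, and the result is raw-die cost only, before packaging, test, and foundry margin:

```python
import math

def poisson_yield(die_area_mm2: float, d0_per_cm2: float) -> float:
    """Poisson yield model: probability a die lands with zero defects."""
    return math.exp(-(die_area_mm2 / 100.0) * d0_per_cm2)

def cost_per_good_die(wafer_price: float, gross_dies: int,
                      die_area_mm2: float, d0_per_cm2: float) -> float:
    """Spread the wafer price over only the defect-free dies."""
    good = gross_dies * poisson_yield(die_area_mm2, d0_per_cm2)
    return wafer_price / good

# Hypothetical: ~232 mm^2 P10 die, 0.2 defects/cm^2, 260 gross dies, $5,000 wafer
print(round(poisson_yield(232, 0.2), 3))
print(round(cost_per_good_die(5000, 260, 232, 0.2), 2))
```

Salvage bins improve the effective numbers, since many defective dies can still ship as cut-down SKUs instead of being scrapped.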


----------



## Fyrwulf

Quote:


> Originally Posted by *tajoh111*
> 
> GF and Samsung are two different entities. Samsung is licensing its technology to GF. AMD is using GF to make its products, and GF in turn pays Samsung licensing or royalty fees so Samsung gets something out of it. So even though Samsung would lower the price of wafers for mega-customers like Qualcomm and Apple, even if AMD did business with Samsung directly, it doesn't mean Samsung is obligated to give that kind of pricing to AMD.


AMD is dual-sourcing from GloFo and Samsung. I dunno where you heard different, but they're wrong. You're also conflating revenue with the actual number of chips. For one thing, Apple makes its money on really high margins, and Qualcomm is involved in far more than smartphones; in fact, its largest single source of revenue is its networking stack.
Quote:


> Whatever GF charges, it is higher than what Samsung charges, since GF is licensing the technology from Samsung and likely has an agreement that prevents cannibalization of Samsung's own sales.


Again, economies of scale. Even if it's a flat dollar amount and not a percentage (not likely), at the number of chips I'm talking about it adds a negligible amount to the cost of each chip.
Quote:


> One thing I do agree with you on is that die costs are probably in the $80 region, plus or minus $10. AMD is not getting the yields Apple gets, because its chips are larger and it doesn't have the tools and R&D budget Apple has to make improvements at that scale.


Apple is fabless; they literally have no control over the production process. And yes, their processors are tiny, as small as any other phone processor.
Quote:


> Although $80 seems cheap enough, it's when you add in the rest of the costs that you realize the pie had better be big once you add in the rest of the partners and their expenses: the cost of the rest of the card, the marketing, the partners' share, the retailers' share, and logistics. With that in mind, AMD would not be making that much money, particularly when the volume card is likely to be the cut-down version with much better yields.


A high electronics retail margin is 15%; in the retail field I work in, that would be pathetic. So for an MSRP of $350, that's $52.50 in retail profit. That leaves the AIB partners $179.06 to play with. There's no way a completed card costs more than an additional $50, so that's actually a really healthy gross profit for them. Their _net_ profit probably only comes out to a third of that, but at the sort of volumes we're talking about, that's not bad.
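The markup chain above can be laid out step by step. The 50% markups and the 15% retail margin are this post's assumptions, not published figures:

```python
# Markup chain: foundry -> AMD -> AIB -> retail, per the post's assumptions
finished_die = 52.64                 # estimated finished-die cost at the foundry
fab_to_amd   = finished_die * 1.5    # + 50% foundry markup -> price to AMD
amd_to_aib   = fab_to_amd * 1.5      # + 50% AMD markup -> price to board partners
msrp         = 350.00
retail_cut   = msrp * 0.15           # 15% retail margin off MSRP
aib_budget   = msrp - retail_cut - amd_to_aib

print(round(fab_to_amd, 2), round(amd_to_aib, 2), round(aib_budget, 2))
# -> 78.96 118.44 179.06
```

Those three numbers reproduce the $78.96, $118.44 (implied), and $179.06 figures in the posts above.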


----------



## julizs

Quote:


> Originally Posted by *tajoh111*
> 
> I didn't disagree that they could charge $350 and make some money (though it would be far less profitable), but this $300 figure is impossible. $300 leaves little to no room for the 4-6 distinct products that Polaris 10 and 11 will make up. Assuming this rumor is true for the top-end Polaris card, $300 for GTX 980 Ti performance would mean everything below it would need even better performance per dollar, since the top card usually has the worst performance per dollar. So the cut-down chip would have to be priced at $229 (with vanilla Fury-like performance), the card with GTX 970 performance would need to be priced at $160 (and this would still use a Polaris 10 chip; 128-bit cards aren't going to perform at this level), and Polaris 11 cards would have to come in under $100 for their prices to make sense against the Polaris 10-based cards. This is too low. Partners wouldn't be happy, as their margins would be too thin, and AMD would not be making much money. Sure, they'd get the volume, but when profits are split three ways between AMD, AMD's board partners, and retailers, the money wouldn't cover the rest of their operating expenses along with their R&D. It's not just the increased cost of wafers that makes the price increase necessary; it's the increased cost of R&D as well, which is more than double that of 28nm cards.


Remember, Polaris 10 is only half the size of Hawaii and only has a 256-bit bus. I think AMD learned from Nvidia and is producing much more price-efficient designs now. I think it's very much achievable for AMD to release the full Polaris 10 at 370€ (the typical price the 970 quickly inflated to in Europe) and the cut-down version at 270€ and still be well profitable. This is exactly the price range the mainstream finds acceptable.

What AMD does will be dictated by Nvidia, and the GTX 1070 will probably repeat the story of the 970: almost as fast as the last Ti, around the 350€/$ mark, so AMD has to deliver. Their top-of-the-line card until Vega has to be at least as fast as the 1070, or even mainstream enthusiasts can't take them seriously anymore.


----------



## Redwoodz

Quote:


> Originally Posted by *tajoh111*
> 
> GF and Samsung are two different entities. Samsung is licensing its technology to GF. AMD is using GF to make its products, and GF in turn pays Samsung licensing or royalty fees so Samsung gets something out of it. So even though Samsung would lower the price of wafers for mega-customers like Qualcomm and Apple, even if AMD did business with Samsung directly, it doesn't mean Samsung is obligated to give that kind of pricing to AMD.
> 
> *[snip: rest of the post, quoted in full above]*


You do realise AMD, Samsung, GF and IBM (whose chip fabs were taken over by GF) all entered into a cooperative agreement? Do you know the terms of that deal? All your speculation omits the obvious.


----------



## Fyrwulf

Quote:


> Originally Posted by *KarathKasun*
> 
> They have tons of debt coming to maturity in the next 18 months or so.


$600 million in 2019 and $450 million in 2020, which is very doable for a company that averages $1 billion per quarter in revenue. They could set aside $78.125 million per quarter to service those debts. That's a large amount, but it won't break them.
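A quick check on that quarterly figure (all inputs are the post's own estimates):

```python
# Back-of-envelope: how many quarters of set-asides cover both maturities?
maturities = {2019: 600e6, 2020: 450e6}   # debt coming due, per the post
total = sum(maturities.values())          # $1.05 billion
quarterly_set_aside = 78.125e6
quarters = total / quarterly_set_aside
print(quarters)
```

That comes out to 13.44 quarters, i.e. under three and a half years of set-asides, which fits comfortably inside the runway to the 2020 maturity.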


----------



## spyshagg

Quote:


> Originally Posted by *Fyrwulf*
> 
> Actually, the 980 is midway between the 390X and 980Ti in most Dx11 workloads.


The last 3 or 4 games released showed both of them very close to each other.


----------



## variant

Quote:


> Originally Posted by *tajoh111*
> 
> GF and Samsung are two different entities. Samsung is licencing their technology to GF to use. AMD is using GF to make their products, which in turn has to pay Samsung licencing or royalty fees so they get something out of it.


Samsung will reportedly start making chips for AMD in 2016, under an agreement that apparently goes back to at least October 2014.


----------



## Redwoodz

Quote:


> Originally Posted by *variant*
> 
> Samsung will reportedly start making chips for AMD in 2016 which was apparently an agreement that goes back as far as at least October 2014.


http://globalfoundries.com/docs/default-source/PDF/samsung-globalfoundries-14nm-collaboration---final.pdf?sfvrsn=2

I'm pretty sure AMD won't be paying too much per wafer, as they have redesigned the whole supply network for wafers.


----------



## Olivon

Quote:


> Originally Posted by *prjindigo*
> 
> THERE ARE NO GP100 BOARDS. The Tesla PCIe card shown at Nvidia's press thing was a Maxwell Tesla card used to feed the XG100 HBM2 64-bit-only cluster processors in the DGX-1. Those XG100 processors are the "P100" but are not "GP100", and they only had 9 that worked. They have HBM2 RAM mounted on top of their processor die, can only clock to a maximum of 850MHz (the ones in the DGX-1 were only managing about 700), and are 110% NOT a consumer product. XP100 at 850MHz requires massive heatsinks and strong airflow to keep operational, and they operate as a daughter-board cluster to a pair of 2011-v3 Xeons. They're not for graphics, and the memory isn't on an interposer. The XP100 "Tesla Pascal" is server-only hardware that cannot operate at 32-bit. Nvidia did this advertising crap with Titan and Kepler, and even the 600-series cards, where they talked tons about their super-card and then sold you slices of it.
> 
> That "P100" you think is gonna come and be a graphics card for you? Price on it is something like $12,000.00 per chip _once they have a yield higher than 9 out of 100._
> 
> A GP104-300 is about 2/3rds the power of a "GP100" chip, a GP104-200 is probably little more than half (1070), and supposedly we're gonna see a third GP104 chip that might be either a 1060 or 1080 Ti, or possibly even their "Titan", but it's not gonna be HBM either.
> 
> But this is a thread about AMD and not for misinformation about Nvidia's deceptive marketing practices.


http://images.nvidia.com/content/pdf/tesla/whitepaper/pascal-architecture-whitepaper.pdf

Have a nice read.


----------



## The-Beast

Quote:


> Originally Posted by *Nickyvida*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Serios*
> 
> Polaris is midrange and a small one like 236mm^2.
> What's so bad about just matching a 601mm^2 GPU even is it's on 28nm?
> 
> Whit Vega we will most likely see a big and a very big die.
> AMD is doing it's best to address the biggest part of the gaming GPU market right now.
> It's not a bad strategy.
> 
> 
> 
> Whatever happened to mid range dies beating the big die of the last gen? Like the 680 vs 590 and older?
> 
> Now it's just all about efficiency
Click to expand...

People stupidly bought into the idea that relative performance should set the cost, a concept the companies were feeding them, which in essence removed the cost of production from their decision making. This decoupling lets a company push whatever chip it wants into any position in its lineup. AMD dropped about 100mm² off its usual new-node high-end design, so we won't get the normal performance increase on a new node. We also won't get a correctly priced product either, because price is now linked to an inflationary metric.


----------



## Brimlock

Quote:


> Originally Posted by *Olivon*
> 
> http://images.nvidia.com/content/pdf/tesla/whitepaper/pascal-architecture-whitepaper.pdf
> 
> Have a nice read.


Lol, what a slap in the face. That whitepaper has GP100 and Pascal all over it. Someone clearly didn't pay enough attention at GDC.


----------



## Vesku

Quote:


> Originally Posted by *The-Beast*
> 
> People stupidly bought into the concept of relative performance = cost, that companies were feeding them. In essence removing the cost of production from their decision making. This decoupling allows a company to push whatever chip they want into any position they want in their lineup. AMD dropped 100mm off their normal new node design for high end, so we won't get the normal performance increase on a new node. We also won't get a correctly priced product either due to price being linked to an inflationary metric.


Polaris 10 isn't AMD's high end. Due to HBM2 supply constraints, Vega won't arrive until sometime end of year or Q1 2017.


----------



## zealord

Quote:


> Originally Posted by *Vesku*
> 
> Polaris 10 isn't AMD's high end. Due to HBM2 supply constraints Vega won't arrive until sometime *end of year or Q1 2017*.


Do you have any source that it will be within that window?

I've only heard of Vega coming in 2017, but it could be January, June or December.


----------



## bigjdubb

Quote:


> Originally Posted by *Vesku*
> 
> Polaris 10 isn't AMD's high end. Due to HBM2 supply constraints Vega won't arrive until sometime end of year or Q1 2017.


Quote:


> Originally Posted by *zealord*
> 
> Do you have any source that it will be within that window?
> 
> I've only heard of Vega coming in 2017, but it could be January, June or December.


Same here; I've only seen 2017, with no mention of when in 2017 other than a chart that would lead you to believe Q1 2017.

Also, I have heard nothing about HBM2 supply being the reason we won't see the big-boy chips until next year. Samsung has been producing HBM2 since January.


----------



## Fyrwulf

Quote:


> Originally Posted by *bigjdubb*
> 
> Same here, only seen 2017 with no mention of when in 2017 other than some chart that would lead you to believe it will be Q1 2017.
> 
> Also, I have heard nothing about HBM2 supply being the reason we won't see big boy chips until next year. Samsung has been producing HBM2 since January.


Mass production isn't scheduled to start until June.


----------



## bigjdubb

Quote:


> Originally Posted by *Fyrwulf*
> 
> Mass production isn't scheduled to start until June.


https://news.samsung.com/global/samsung-begins-mass-producing-worlds-fastest-dram-based-on-newest-high-bandwidth-memory-hbm-interface

You are thinking of GDDR5X; HBM2 has been in mass production for four months.


----------



## Ithanul

Quote:


> Originally Posted by *zealord*
> 
> yeah but I mean a 300$ card that perform like 980 Ti would change everything.
> 
> AMD has still a lot of stock of those cards I mentioned left over (atleast I think they do).
> 
> All those cards would be completely undesireable


Most likely for high-end PC gamers. But not everyone has 300+ bucks lying around for PC gaming. Sold one of my 970s to one peep who was stuck on an old 750 Ti; boy, was he giddy as a schoolgirl when he smacked that in and started gaming.

Heck, I had peeps buying old 580s and 680s for their gaming machines off me. They were super happy to buy them even though newer, expensive cards were out. Most others I meet that game on PC still play at 1080p and only care whether it can run the games they play. I also know some peeps who don't play the latest AAA games, or who just play one particular game and only care if an upgrade helps that game run better (usually those games are more CPU-bound, or don't require beefy GPUs at all). Usually I just tweak the graphics settings and show them a few tricks. Makes them happy and gets me a free lunch.

(I like to eat, so I don't mind that as payment)

I do hope AMD brings out a nice GPU. I'd like to mess around with one of their GPUs again. I just hope they keep the power consumption down and the things can fold like a champ. Since their new drivers, some peeps who fold on AMD cards have seen higher PPD of late, so hopefully that's a good sign for the upcoming cards. I still miss that 7970 I abused the crap out of with folding. Was such a tank of a card.


----------



## guttheslayer

Quote:


> Originally Posted by *Ithanul*
> 
> Most likely for high end PC gamers. But not everyone got 300+ bucks lying around for PC gaming. Sold one of my 970 to one peep. Was stuck on a old 750Ti, boy, was he giddie as a schoolgirl when he smacked that in and started gaming.
> 
> 
> 
> 
> 
> 
> 
> Heck, I had peeps buying old 580 and 680 for their gaming machines off me. They where super happy to buy them even though newer expensive cards where out. Most others I meet that game on PC still play at 1080P and only care if it can run the game where they can play it. Also know some peeps who don't play the lastest AAA games or just play one particular game and only care if it helps run the game better (usually the games are more CPU bound or don't require beefy GPUs at all to play) Usually I just twick the graphic settings and show them a few tricks with settings. Makes them happy and gets me a free lunch.
> 
> 
> 
> 
> 
> 
> 
> (I like to eat, so I don't mind that as payment)
> 
> I do hope AMD bring out a nice GPU. I like to mess around with one of their GPUs again. I just hope they keep the power consumption down and the things can fold like a champ. Since their new drivers, some of peeps who do fold on AMD cards have seen higher PPD of late. So hopefully that a good sign for the up coming cards. I still miss that 7970 that abused the crap out of with folding. Was such a tank of a card.


With all respect, I hope AMD brings out both a good GPU and a good CPU.

The Intel/Nvidia dominance has been boring and atrocious for too long already: Intel doing a repetitive +5% at the same price each gen, while NV brings better performance each gen but with ever-increasing pricing as well.


----------



## Majin SSJ Eric

I expect both Vega and Big Pascal (GP100 or 102 or whatever) in the first half of 2017, but I have seen no confirmation of that whatsoever. Anyone who claims to know otherwise is likely not being truthful...


----------



## Olivon

Quote:


> Originally Posted by *zealord*
> 
> Do you have any source that it will be within that window?
> 
> I've only heard of Vega coming in 2017, but it could be January, June or December.


Quote:


> Oh I also had a chat with the AMD guys at PAX. Polaris will only use GDDR5 not even GDDR5x. *Vega 10 will use HBM 2 and it's more Spring 2017*.
> 
> Edit: Polaris was said to be faster than a 290, but much cheaper and use way less energy.


It's just a rumour of course, but I don't think this guy is talking bollocks.

edit: the source, sorry

http://www.neogaf.com/forum/showpost.php?p=201834331&postcount=11990


----------



## Tojara

Quote:


> Originally Posted by *Olivon*
> 
> It's just rumour of course but I don' think this guy talking bollocks.
> 
> edit : the source, sorry
> 
> http://www.neogaf.com/forum/showpost.php?p=201834331&postcount=11990


Seems plausible. GDDR5X production will likely not reach high enough levels to be an option for mainstream cards before the end of the year. The real question is whether the price is closer to the 290 at launch ($400) or near EOL, when the 390 replaced it (~$260).


----------



## guttheslayer

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I expect both Vega and Big-Pascal (GP100 or 102 or whatever) in the first half of 2017 but I have seen no confirmation of that whatsoever. Anyone who claims otherwise is likely not being truthful...


The GP102, if it does exist, will be a radically different design compared to GP100. That is what I believe, but we shall see.


----------



## PCGamer4Ever

Quote:


> Originally Posted by *Ithanul*
> 
> But not everyone got 300+ bucks lying around for PC gaming. [SNIP] Most others I meet that game on PC still play at 1080P and only care if it can run the game where they can play it. Also know some peeps who don't play the lastest AAA games or just play one particular game and only care if it helps run the game better (usually the games are more CPU bound or don't require beefy GPUs at all to play)


Snipped for relevant points....

THIS is something people on this forum forget. We, as tech enthusiasts are the 1% of PC Gamers. (Okay maybe more like 5% but still minority) Most gamers out there are using 1080P monitors and they are not likely to move soon, simple fact. Monitors are not like video cards, they do not upgrade them often, usually only when they fail. Even then they seldom "upgrade" as they are happy with the experience they are having. Even if they want to upgrade which way do they go? I mean a base 24" 1080P monitor is around $100. The move to a 24" 1080 with 144Hz is around $200 and adding Freesync bumps to $240. That same $240 gets a basic 1440p 27", which for gaming makes no sense as smoother game play trumps pixels every time.

Further a bigger monitor means a bigger investment to justify it. A 1080P monitor can be run just fine with even a $150 video card, $200 to $250 is the sweet spot. Move to 1440 and now you need at least a $300 to get solid performance, comparable to trhe $200 at 1080P. So a basic 1440 setup needs $550 in monitor and video and that guys is a basic 1440 setup for gaming. $500 can deliver an outstanding 1080P gaming experience with freesync and high refresh rates. Heck you can even cut that down to $450 and still get a great experience.
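The budget comparison above boils down to simple totals. A quick sketch, using only the ballpark prices quoted in the post (not live market data):

```python
# Back-of-envelope totals for the two setups described above.
# All prices are the ballpark figures from the post, not actual market data.
setups = {
    "1080p 144Hz + Freesync": {"monitor": 240, "gpu": 260},  # "outstanding" 1080p build
    "basic 1440p":            {"monitor": 250, "gpu": 300},  # entry-level 1440p build
}
totals = {name: parts["monitor"] + parts["gpu"] for name, parts in setups.items()}
for name, total in totals.items():
    print(f"{name}: ${total}")
```

Which lands right on the $500 vs $550 comparison the post makes.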

If AMD, as rumors seem to suggest, is targeting the $150 to $300 price points with this launch, then they are very smart. This is where sales numbers are massively greater than for the higher-end cards. Sure, they may not be winning the top end, but seriously, they do not need to. A Porsche 911 may be a vastly superior car in every test and every spec sheet, but it will NEVER outsell or bring in more overall revenue than a Toyota Camry.

As for the latest triple-A games and technology: I've heard a couple of industry people discuss this over the last few months. Game developers do not make games to be played, the way they envisioned them, at ultra settings. Games are actually designed to be played at high; ultra is extra load added to make the tech enthusiasts happy. Developers do not make textures designed for 1440p or even 4K; they make games designed to be played at 1080p. So to enjoy a game as the developer designed and envisioned it, you play at 1080p on high detail. Wait a second, that brings us back to the $200 video card and 1080p monitor as a great choice....

Look, I know that here amongst enthusiasts we demand more, we want more. We should; we spent our money, dang it, and deserve something for all the cash we spent. However, we need to accept the reality that we are not the be-all, end-all of PC gaming. Sure, we are the bleeding edge, and sure, our pushing is what drives things forward, but driving things forward is not what pays the bills.

Can AMD deliver a card at $300 with 980 Ti performance? I doubt it. I expect we will see a bump at $300 to around 390X performance, or maybe a bit closer to Fury. AMD has been seriously shouting from the rooftops about the power consumption of their new design, so I think we will see a pretty hefty reduction in power needs and TDP.


----------



## flopper

Quote:


> Originally Posted by *Olivon*
> 
> It's just rumour of course but I don' think this guy talking bollocks.
> 
> edit : the source, sorry
> 
> http://www.neogaf.com/forum/showpost.php?p=201834331&postcount=11990


If you look at fps:
a 390 does, let's say, 60fps.
A Fury X/980 Ti does, let's say, 80 to 90fps.
You pay twice the price for 20 to 30 more fps.
If a Polaris hits 75fps and is priced reasonably, you can OC it and won't see any difference in actual gameplay vs a Fury X/980 Ti at 1080p and 1440p.
That's pretty much what I expect.
Price/performance king.
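The argument above is really a dollars-per-frame calculation. A quick sketch using the fps figures from the post; the prices are rough illustrative assumptions (the post only says the high-end cards cost twice as much), not quotes:

```python
# Dollars per frame. fps figures come from the post above; the prices are
# illustrative assumptions chosen so the high-end tier costs twice as much.
cards = {
    "R9 390":          {"price": 300, "fps": 60},
    "Fury X / 980 Ti": {"price": 600, "fps": 85},  # midpoint of the 80-90 range
    "Polaris (rumor)": {"price": 300, "fps": 75},
}
for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['fps']:.2f} per frame")
```

On those assumptions the rumored Polaris comes out cheapest per frame, which is the "price/performance king" point.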


----------



## cowie

Polaris is the long-awaited replacement for the 390X on down.

I don't think they will have much past Fury performance, tbh.

All they have to do is put up some Hitman DX12 numbers with a "380X Polaris" and there you go: as fast as a 980 Ti.


----------



## nakano2k1

Right now, the cheapest new 980 Ti here in Canada is CAD $830; after you add taxes and shipping, you're most likely ending up just short of $1,000. If AMD can offer me something similar in performance at a much cheaper price, they've got my business.

Unless Nvidia offers me something even more compelling.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *PCGamer4Ever*
> 
> Snipped for relevant points....
> 
> THIS is something people on this forum forget. We, as tech enthusiasts are the 1% of PC Gamers. (Okay maybe more like 5% but still minority) Most gamers out there are using 1080P monitors and they are not likely to move soon, simple fact. *Monitors are not like video cards, they do not upgrade them often, usually only when they fail.* Even then they seldom "upgrade" as they are happy with the experience they are having


Absolutely agree with you there. I got a pair of Korean IPS 1440p monitors from a member here way back in 2014 and have no intention of EVER getting rid of them (especially for the $400 I paid for both of them at the time). I might eventually go to a large (40"+) 4K monitor, but it's not a priority.
Quote:


> Even if they want to upgrade which way do they go? I mean a base 24" 1080P monitor is around $100. The move to a 24" 1080 with 144Hz is around $200 and adding Freesync bumps to $240. *That same $240 gets a basic 1440p 27", which for gaming makes no sense as smoother game play trumps pixels every time*.


This is where I totally disagree with you. I'll take my non-Freesync 1440p monitors over 1080p FS ones any time for games. "Smoothness trumps pixels" is a purely subjective statement and in no way a fact.


----------



## Nickyvida

Quote:


> Originally Posted by *Serios*
> 
> Polaris is midrange and a small one like 236mm^2.
> What's so bad about just matching a 601mm^2 GPU even is it's on 28nm?
> 
> Whit Vega we will most likely see a big and a very big die.
> AMD is doing it's best to address the biggest part of the gaming GPU market right now.
> It's not a bad strategy.


We used to have small dies beat the big dies of the previous gen; the last of these was the 680. Since then it's all paltry improvements, going the way of the CPU with 5~10% if we're lucky, at almost exorbitant prices. Just look at Nvidia: it's already starting, with the small-Pascal 1070 rumored to cost $500-$600, GP104, a midrange die, retailing for what was once a big-die price, and GM200 costing almost 1k. Look at the RPD as well: full-fat Fiji, costing $1.5k with barely a 20-40% improvement over the R9 295X2. And the infamous Titan Z.

It's all about exploiting customers now with each node shrink. And to think, node shrinks used to mean things got cheaper overall for fabs.

OT: on another thread, rumored Polaris 10/11 benchmarks have surfaced, taken from Ileakstuff.

Would go a long way to explaining the $300 price. And it is no better than a 390X.


----------



## Nickyvida

Quote:


> Originally Posted by *Fyrwulf*
> 
> Fury X is 30% faster than the 390X. Polaris is the 390X replacement, yet you're not satisfied with a decent performance increase with half the power draw if the rumors hold true? If you've been sitting on your card for four years, that means you likely have a 7970. Polaris would murder hobo a 7970 in a benchmark contest. Talk about Millennial Entitlement.


It ain't entitlement when I'm parting with my hard-earned money, which I've scrimped and saved for 3 years.

I've been waiting since I had my 780, and if it hadn't decided to give up two years in, I would have sat on it till Vega. It will be 3 years by the time Polaris launches, and nearly 4 to be exact, since I'm going for Vega now.

I'm not really satisfied if I'm going to pay big-die prices for a small die with just a "decent increase." And the big-die prices are going to be at least a lot bigger. I could get 2x Polaris or Vega with what I've saved up, but in the end, if the performance doesn't justify it, I'm not going to cash in. Same goes for Zen.

I stand by my statements: Polaris should have been what Vega is, and Vega should have been 2x that.


----------



## Fyrwulf

Quote:


> Originally Posted by *Nickyvida*
> 
> It ain't entitlement when i'm parting with my hard earned and saved money, which i've scrimped for 3 years.
> 
> Been waiting since i had my 780, and if it hadn't decided to give up 2 years in, i would have sat on it till Vega. It will be 3 years by the time Polaris launches and nearly 4, to be exact since i'm going for Vega now.
> 
> Not really satisfied if i'm going to pay big die prices for a small die for just a"decent increase." And the big die prices are going to be at least a lot bigger. I could get 2x of Polaris or Vega with what i've saved up but in the end, if the performance doesn't justify it, i'm not going to cash in. Same goes for Zen.
> 
> I stand by my statements, Polaris should have been what Vega is and Vega should have been 2x of that.


Since when is $350 a "big die price"? And "I want what I want" is entitlement, especially when you're talking about something like gaming hardware. Also, if it's taken you this long to save up for a new graphics card, perhaps you need to sell your computers and turn off your internet, because the time you spend gaming and on here could be better spent working.


----------



## Nickyvida

Quote:


> Originally Posted by *Fyrwulf*
> 
> Since when is $350 a "big die price"? And "I want what I want" is entitlement, especially when you're talking about something like gaming hardware. Also, if it's taken you this long to save up for a new graphics card, perhaps you need to sell your computers and turn off your internet, because the time you spend gaming and on here could be better spent working.


Well, a 390X costs $779 here in my own currency, and we're not even talking about Nano or Fury/X yet. I could just order from Amazon and get it cheaper, but I need the warranty.

So paying S$500+ just for efficiency improvements over the 390X, if the benchmarks I've posted above hold, doesn't that sound like a waste of money?

And if the 1070 and 1080 pricing of 500+ and 650+ holds true, I'm looking at S$800-1000 for a 1070 and 1080 respectively, for midrange dies, which will probably offer just a decent increase over my 390 at the moment.

We're paying sky-high prices for merely subpar improvements, if the above benchmarks are true; I don't see any entitlement there. And I'm not criticising AMD alone, but Nvidia too. Graphics prices have been rising for so-so improvements.

Nice of you to assume and dictate how I should spend my time and money, and how I should give up on gaming anyway. Surprised your head hasn't got any bigger yet.


----------



## The Robot

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Absolutely agree with you there. I got a pair of Korean IPS 1440p monitors from a member here way back in 2014 and have no intention of EVER getting rid of them (especially for the $400 I paid for both of them at the time). I might eventually go to a large (40"+) 4k monitor but its not a priority.
> This is where I totally disagree with you. I'll take my non-Freesync 1440p monitors over 1080p FS ones any time for games. "Smoothness trumps pixels" is a purely subjective statement and is no way a fact.


I agree, especially if you like glossy monitors like I do. They simply don't make proper 8-bit premium glossy 1080p IPS monitors; the few out there are all 6-bit + FRC, which is really noticeable.


----------



## raghu78

Quote:


> Originally Posted by *Nickyvida*
> 
> Well a 390x costs $779 here in my own currency. And we're not even talking about Nano or Fury/X yet. I could just order from Amazon and get it cheaper but i need the warranty.
> 
> So paying S$500+ just for efficiency improvements over the 390x, if the above benchmarks ive posted,doesnt that sound like a waste of money?


I think it's better to wait for the Polaris launch and then judge it. We have only rumours, and nothing concrete in the form of actual pricing, performance and real reviews, to make any factual decision. It's only 1-2 months, so hold on.


----------



## tajoh111

Quote:


> Originally Posted by *Fyrwulf*
> 
> AMD is dual sourcing from GloFo and Samsung. I dunno where you heard different, but they're wrong. You're also conflating revenue with the actual number of chips. For one thing, Apple makes their money on really high margins. Qualcomm is involved in far more than smart phones, in fact their largest single source of revenue is their networking stack.
> Again, economies of scale. Even if it's a flat dollar amount and not a percentage (not likely), the number of chips I'm talking about means a negligible amount added to the cost of each chip.
> Apple is fabless. They literally have no control over the production process. Yes, their processors are tiny, as small as any other phone processor.
> A high electronics retail margin is 15%. In the retail field I work in, that's pathetic. So for an MSRP of $350, that's 52.50 in profit. That leaves the AIB partners $179.06 to play with. There's no way a completed card costs more than an additional $50. That's actually a really healthy gross profit for them. Now, their _net_ profit probably only comes out to a third of that, but at the sort of numbers we're talking about that's not bad.


http://www.sammobile.com/2016/01/04/samsung-to-rake-in-huge-licensing-profits-from-amds-new-14nm-finfet-contract/

AMD has reportedly signed a contract with Globalfoundries to manufacture its new graphics processing unit that will be released later this year, the GPU will be made using GF's 14nm FinFET process. Samsung isn't directly involved in the production and yet it's *going to earn substantial licensing profits off of this contract considering the fact that it owns Globalfoundries' 14nm FinFET technology.*

Samsung is charging GlobalFoundries a licensing fee that is likely significant. You don't spend billions on R&D and give it away for free; that is the point of licensing your technology out. For GF to develop its own competitive FinFET technology would cost billions. Samsung knows this and is charging GF significant money, which GF is likely to pass on to its customers.

The number of chips, and the revenue, that Apple deals with and brings in for Samsung is vastly more than the money AMD could potentially bring in. Apple sells 200 million iPhones a year and 50 million iPads, along with sourcing all the memory in its phones, laptops and other devices from Samsung. Considering the chips are sold at about 15-22 dollars a piece for the phones and 30-40 dollars for the iPad ones, this represents a sizable chunk of revenue.
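Running the unit and per-chip figures above through the arithmetic gives a rough range (a sketch of the post's own numbers, not audited data):

```python
# Rough yearly Apple-to-Samsung chip revenue, using the figures in the post.
iphone_units, ipad_units = 200e6, 50e6            # claimed units per year
iphone_chip, ipad_chip = (15, 22), (30, 40)       # claimed $ per chip, low/high
low  = iphone_units * iphone_chip[0] + ipad_units * ipad_chip[0]
high = iphone_units * iphone_chip[1] + ipad_units * ipad_chip[1]
print(f"roughly ${low / 1e9:.1f}B to ${high / 1e9:.1f}B per year")
```

That's several billion dollars a year from the phone and tablet chips alone, before memory, which is the scale being contrasted with AMD's potential orders.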

http://fortune.com/2015/10/07/samsung-profits-third-quarter-2015-chips-galaxy/

"And that's partially thanks to smartphone rival Apple, which uses chips manufactured by Samsung in its latest iPhone 6 lineup (Samsung's own phones also use Samsung-made chips). Samsung's chip business saw revenues grow by around 24% to about $28 billion last year."

Samsung is also manufacturing most if not all of Qualcomm's Snapdragon 820 processors, which are in just about every major smartphone and altogether represent a similar volume to Apple's chips. Between them, volume and revenue both dwarf what AMD brings in. Yes, Qualcomm makes a crap load selling baseband modems and telecommunications patents, but these chips represent a huge volume of guaranteed sales, along with likely memory sales, which are often outsourced to Samsung.

The amount of revenue AMD can bring in is also questionable, to say the least. It's why take-or-pay contracts are necessary, and most of the time AMD has not been able to deliver the revenue/sales orders needed to meet those obligations.

Nvidia and AMD are treated like second-class citizens compared to mobile/telecom manufacturers. It's no secret that Apple and Qualcomm have made it difficult for both AMD and Nvidia to get wafer allocations.

That 15% is a significant amount of money. Add in things like free shipping, which is hidden and built into most online retailers' prices (aka FOB destination), and if the retailer wants to maintain margins, that means adding more cost to the price to still make that 15%.

Also, many companies use a distributor in their supply chain, so now we have to add the distributor's cut of the profits. Then add Asus's cost for the parts to make the card ($50), packaging and assembly, their marketing cost and their own profit, and if AMD is making a 50% margin at best on its highest-end SKU, it's not going to be great for the volume cut-down part. Normally flagship products have higher margins than this. What about the cut-down chip that's 80-100 dollars cheaper?

The margins will be a lot thinner on those chips. I don't think the cost to manufacture a cut-down chip is all that different from a full chip. The PS4 APU, for example, had a portion of its chip disabled, about 1/10, which is the same amount of cut-down as a cut-down model of this card would have. The gross margin on that product was initially about 13-16%. It has improved to a bit over 20% now, but even with 10% of the chip disabled for yields, it cost AMD 80+ dollars to produce a 348mm² 28nm chip.
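To make the chain being argued over concrete, here is an illustrative margin stack for a $350 card. Only the 15% retail margin comes from the quoted post; the distributor percentage, board cost and chip cost are pure assumptions for the sake of the arithmetic:

```python
# Illustrative supply-chain split for a $350 MSRP card. Only the 15% retail
# margin is from the thread; every other figure is an assumption.
msrp = 350.00
retail_cut      = msrp * 0.15              # 15% retail margin -> $52.50
after_retail    = msrp - retail_cut        # $297.50 flows upstream
distributor_cut = after_retail * 0.05      # assumed 5% distributor margin
aib_bom         = 50.00                    # assumed board/cooler/packaging cost
chip_cost       = 110.00                   # assumed price the AIB pays per GPU
aib_gross       = after_retail - distributor_cut - aib_bom - chip_cost
print(f"retail: ${retail_cut:.2f}, distributor: ${distributor_cut:.2f}, "
      f"AIB gross: ${aib_gross:.2f}")
```

The point of the sketch is simply that each extra party in the chain shrinks what's left for the AIB, so margins on a cheaper cut-down SKU thin out fast.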

Nvidia is also a fabless company, as are Apple and AMD. That doesn't mean they can't make improvements to the chip to increase yields.

https://www.youtube.com/watch?v=pRz_CG3DZb4

Here's a video where Nvidia explains some of the various tools and equipment they use to improve yields. We are talking about tens of millions of dollars in equipment, but considering the potential savings at Apple-like volumes, it would be stupid not to do it, given Nvidia is doing it; minute increases in yields save tonnes of money. AMD is doing it too, but considering their drastically worse margins, even taking into account the value of their chips, I imagine this is where AMD saves a bunch on R&D. Considering AMD is researching and developing APUs, CPUs, memory AND GPUs, and likely more, I suspect they cut back on stuff like this, which shows in the poor supply at launch for chips like Tahiti, Hawaii and Fiji. AMD products often sell out early into their launch for a brief period of time, and although some of that comes from demand, I think much of it can be blamed on supply. When Nvidia launches and sells out, they are selling far more GPUs a month; AMD's smaller volume is still enough to tax their supply so that they run out of cards. AMD cards also tend to have greater variance, which causes them to overvolt their cards to compensate, or underclock them. This has been true of all their first-generation chips compared to Nvidia: it was seen in Hawaii, Tonga and Tahiti. Fiji might have had the same problems if they hadn't gone with the water cooler. AMD seems to a


----------



## Fyrwulf

Quote:


> Originally Posted by *Nickyvida*
> 
> Well a 390x costs $779 here in my own currency. And we're not even talking about Nano or Fury/X yet. I could just order from Amazon and get it cheaper but i need the warranty.
> 
> So paying S$500+ just for efficiency improvements over the 390x, if the above benchmarks ive posted,doesnt that sound like a waste of money?


Yes, but I didn't know you lived in a country where prices are inflated, did I? Here in America a 390X is $400-$450, so $350 for equivalent performance doesn't sound bad. More to the point, these results are from synthetic benchmarks, which nobody takes seriously as an indicator of game performance.
Quote:


> And if the 1070 and 1080 500+ and 650+ pricing holds true, i'm looking at S$800-1000 for a 1070 and 1080 respectively, for mid range dies, which probably just have a decent increase over my 390 at the moment.


The 1070 price is way high, I agree. However, the price for the 1080 is right within the wheelhouse of previous x80 products if you adjust for inflation. More to the point, Nvidia is paying TSMC $7k per wafer; AMD is not paying Samsung/GloFo anywhere near that.
Quote:


> We're paying sky high prices for merely subpar improvements if the above benchmarks are true, don't see any entitlement there. I'm not criticising AMD alone, but as well as Nvidia too. Graphics prices have been rising for so so improvements.


Buddy, I'm gaming on a laptop with an A8 APU, so _anything_ commonly available now is an improvement. I'm also in the process of building a proper tower, but even if I had a 390X, I wouldn't be crying about how something new isn't an improvement, I'd just shrug my shoulders and wait for Vega. FYI, my last gaming tower was an Alienware with an X800 Pro that I had for 8 years. You aren't going to get any sympathy from me.
Quote:


> Nice of you to assume and dictate how to spend my time saving money and how i should give up on gaming anyway. Surprised your head hasnt got any bigger yet.


Or, you know, I've lived that struggle and I'm basically suggesting that you do what I did.


----------



## Fyrwulf

Quote:


> Originally Posted by *tajoh111*
> 
> http://www.sammobile.com/2016/01/04/samsung-to-rake-in-huge-licensing-profits-from-amds-new-14nm-finfet-contract/
> 
> AMD has reportedly signed a contract with Globalfoundries to manufacture its new graphics processing unit that will be released later this year, the GPU will be made using GF's 14nm FinFET process. Samsung isn't directly involved in the production and yet it's *going to earn substantial licensing profits off of this contract considering the fact that it owns Globalfoundries' 14nm FinFET technology.*


http://www.digitaltrends.com/computing/rumor-mill-says-amd-will-switch-samsung-production-next-gen-silicon/

I don't doubt the total dollar amount is quite large, but you have to break it down per chip to get a feel for its impact on retail customers. Also, I wouldn't trust mobile-focused sources; there was a laptop review site that recently thought AMD's newest 28nm APU contained GCN 4 cores.
Quote:


> Samsung (sic) is charging Global foundaries a licencing fee that is likely signficant. You don't spend billions on R and D and give it away for free. This is the point of licencing your technology out. For GF to develop their own competitive finfet technology would costs billions. Samsung knows this is charging GF significant money which it is likely to pass on to their customers.


You do realize that Samsung's electronics fabrication business is a small fraction of the company's total revenue, right? They could literally misplace a billion dollars and not care. They're in a position to take the long view and bet on making their money back over the lifetime of the process (which is a long time; there are still things made on 40nm).
Quote:


> That 15% is a significant amount of money. Add in things like free shipping, which is hidden and built into most online retailers' prices (FOB destination). And if the retailer wants to maintain margins, it means retailers have to add more to the price to still make that 15%.


Like I said, 15% is high. 5% is more like normal. The retail guys get squeezed on margins, but quantity has a quality all its own.
Quote:


> Also, many companies use a distributor in their supply chain, so now we have to add the distributor's cut of the profits. Then add Asus's cost for the parts to make the card ($50), packaging and assembly, their marketing cost, and their own profit, and AMD is making a 50% margin at best on its highest-end SKU. It's not going to be great for the volume cut-down part. Normally flagship products have higher margins than this. What about the cut-down chip that's $80-100 cheaper?


Shipping is so dirt cheap these days that it's calculated by volume. Hell, any Walmart you walk into is restocked every night by two or three semis, and that's a company that operates on per-item margins so razor thin you'd have trouble threading a needle through them.
Quote:


> The margins will be a lot thinner on these chips. I don't think the cost to manufacture a cut-down chip is all that different from a full chip. The PS4 APU, for example, had a portion of its chip disabled, about 1/10, which is the same amount of cut-down as a cut-down model of this card would have. The gross margin on that product was initially about 13-16%. It has improved to a bit over 20% now, but even with 10% of the chip disabled for yields, it cost AMD $80+ to produce a 348mm² 28nm chip.


And? You're disagreeing with me, but you're not really demonstrating where my math is wrong. You can slide the margins this way or that, but we're still talking about a boatload of money for all parties involved.


----------



## Nickyvida

Quote:


> Originally Posted by *Fyrwulf*
> 
> Yes, but I didn't know you lived in a foreign country where prices are inflated, did I? Here in America, a 390X is $400-$450, so $350 for equivalent performance doesn't sound bad. More to the point, these results are from synthetic benchmarks, which nobody takes seriously as an indicator of game performance.
> The 1070 price is way high, I agree. However, the price for the 1080 is right within the wheel house of previous x80 products if you adjust for inflation. More to the point, nVidia is paying TSMC $7k per wafer. AMD is not paying Samsung/GloFo anywhere near that.
> Buddy, I'm gaming on a laptop with an A8 APU, so _anything_ commonly available now is an improvement. I'm also in the process of building a proper tower, but even if I had a 390X, I wouldn't be crying about how something new isn't an improvement, I'd just shrug my shoulders and wait for Vega. FYI, my last gaming tower was an Alienware with an X800 Pro that I had for 8 years. You aren't going to get any sympathy from me.
> Or, you know, I've lived that struggle and I'm basically suggesting that you do what I did.


Yeah, point conceded on that, but still, over here it is why I moan at the prices, hand in hand with the lack of improvements. If I am going to drop a lot of money, it has to be a worthwhile investment. Waiting three years isn't fun, and knowing I probably have to wait another is, well, not great.

The 1070 is way too high, considering the 970 was only $300 for Maxwell. It has jumped almost $200, and because it now sits at Maxwell's x80 price point, the 1080 is $100 more expensive than the 980 was. It is daylight robbery for a GP104 die. Back in the day we used to get the full-fat dies for $600; now the same money only nets you a small die at most.

Yes, like shrugging your shoulders is going to get you anywhere. Complaints do help and can force companies to reverse their decisions, whether that's lowering prices or restoring things that were cut out of games, like GTA V for example. Granted, one voice (mine) won't be of much note, if any, but if more people got together instead of being sheep and buying every new release...

Consumers are slowly being milked and priced out, just like in the CPU market. Go figure. Soon it'll be 10% for Volta, then 5% the generation after, and so on, for increasing amounts of money.


----------



## Fyrwulf

Quote:


> Originally Posted by *Nickyvida*
> 
> Consumers are slowly being milked and priced out like the CPU market. Go figure. Soon it'll be 10% for Volta, and 5% the next, so on and so forth, for increasing amounts of money.


Prices are only going up for CPUs at the extremely high end, at least here. Even Intel's 4-core unlocked i7s are pretty reasonable nowadays; they used to be catastrophically high.


----------



## inedenimadam

I find the dollar/performance ratio in the OP highly unlikely. But if it is true and it has 8GB+ and HDMI 2.0... sign me up for 3 or 4.


----------



## Nickyvida

Quote:


> Originally Posted by *Fyrwulf*
> 
> Prices are only going up for CPUs on the extremely high end, at least here. Even Intel's 4 core unlocked i7s are pretty reasonable nowadays, they used to be catastrophically high.


Yeah, they're pretty reasonable, but for what? 5%, if we're lucky, every tick-tock refresh? $300+ for maybe 5% on average every node refresh/shrink; only someone with money to burn will go for that.

The high-end enthusiast chips, X99, are the only place notable improvements are being made, and the prices are out of this world. $1000 for maybe 10-20% improvement at most over previous generations?


----------



## Fyrwulf

Quote:


> Originally Posted by *Nickyvida*
> 
> yeah, they're pretty reasonable, for what? 5% if we're lucky every tick, tock refresh? $300+ for maybe 5% on average every node refresh/shrink, only someone with money to dump will go for that.


So don't upgrade that often. It's not hard, especially if you like having money in your bank account.


----------



## prjindigo

Quote:


> Originally Posted by *Forceman*
> 
> Don't waste your time. He's been told numerous times in multiple threads, including photographic evidence. He doesn't care about the truth.


I'm the one that put that picture where you could find it.

I'm the one who explained that, by nVidia's own admission, NONE of those chips are fully functional OR running at full speed.

By spec those chips cannot be put on a PCIe board; there is NO "GP100" card coming, and nVidia has already stated as much.

Those processors pull 400W at 850MHz and are MISSING several of the components that make a chip a gaming GPU. They won't even keep up with a GP104-200 "1070" in generating images. There's no way to take those chips, put them on a PCIe card, and stay within the wattage limitation. That requires a different chip design with more efficiency.

If you'd read the actual information provided with the "talk" given on stage and listened, you'd know all this.


----------



## Nickyvida

Quote:


> Originally Posted by *Fyrwulf*
> 
> So don't update that often. It's not hard, especially if you like having money in your bank account.


Or you could speak out about the high prices. Just keeping quiet won't do a thing, especially when there are such small performance increases and companies take the easy way out by citing silicon costs. Whatever happened to node shrinks reducing costs and increasing performance?


----------



## Forceman

Quote:


> Originally Posted by *prjindigo*
> 
> I'm the one that put that picture where you could find it.


So you're Google image search now?
Quote:


> Originally Posted by *prjindigo*
> 
> By spec those chips cannot be put on a PCIE board, there is NO "GP100" card coming and nVidia has already stated this fact as well.


*Though the GP100 GPU at the heart of the P100 supports traditional PCI Express*, NVIDIA has also invested heavily in NVLink, their higher-speed interconnect to enable fast memory access between GPUs, and unified memory between the GPU and CPU.

http://www.anandtech.com/show/10229/nvidia-announces-dgx1-server


----------



## TheLAWNOOB

Do we have a concrete date for Polaris yet? I don't want to wait more than 2 months for a paper launch.


----------



## tajoh111

Quote:


> Originally Posted by *Fyrwulf*
> 
> http://www.digitaltrends.com/computing/rumor-mill-says-amd-will-switch-samsung-production-next-gen-silicon/
> 
> I don't doubt the total dollar amount is quite large, but you have to break it down by per chip to get a feel for its impact on retail customers. Also, I wouldn't trust mobile sources. There was a laptop review site that recently thought AMD's newest 28nm APU contained GCN 4 cores.
> You do realize that Samsung's electronics fabrication business is a small fraction of the company's total revenue, right? They could literally misplace a billion dollars and not care. They're in a position to take the long view and bet on making their money back over the lifetime of the process (which is a long time, there are still things made with 40nm).
> Like I said, 15% is high. 5% is more like normal. The retail guys get squeezed on margins, but quantity has a quality all its own.
> Shipping is so dirt cheap anymore that it's calculated by volume. Hell, any Walmart you walk into is restocked every night by two or three semis and that's a company that operates on per item margins so razor thin you'd have trouble threading a needle through them.
> And? You're disagreeing with me, but you're not really demonstrating where my math is wrong. You can slide the margins this way or that, but we're still talking about a boatload of money for all parties involved.


Samsung's semiconductor business represents a large portion of Samsung's revenue, and an even larger portion of its profits.

Gross margins on chips and wafers are usually around 50%, better than their phone business. It's R&D heavy, but the payoffs are huge and the margins are big because the business is built on huge deals with a smaller pool of customers. Only a handful of companies on the market can produce chips, and basically two can do it at the scale Apple wants.

https://news.samsung.com/global/samsung-electronics-announces-fourth-quarter-fy-2015-results



As a result it represents a huge portion of their operating profits, close to their biggest source of it.

I didn't say they could not make a profit at a $350 MSRP; it's the product stack below the high-end part that has a difficult time generating profit. Naturally, the card with the highest MSRP has the highest margins. But if their high-margin chip only has a 50% profit margin, what happens to the chip that costs $100 less and will sell in far greater volume? It will be a lot less, and this is where AMD runs into trouble. Nvidia's gross margins overall are still 55% even when mixing in their lower-margin, high-volume products. With AMD, if their highest-end product is at 50% gross margin, take $100 off that MSRP for the cut-down part and watch the gross margins disappear.

AMD as it stands needs overall margins in excess of 40% to make a profit, and that's in their current skeleton-like form in terms of R&D expenditure and staffing. At the current 32-35% they lose money while not having sufficient cash flow for investing activities and operational expenses. For AMD to be profitable while generating significant cash flow to fund their other activities, they need margins much higher than now, probably in the 50% range or greater like their competitors. This is unavoidable because AMD does the lowest volume of the three (Intel, Nvidia, AMD) while having the lowest margins; the lower your volume, the higher the margins you need to be profitable.

This is a recipe for disaster. As a result, their highest-end SKU needs a gross margin in excess of 60-70%, while their volume drivers need margins in the 50% range. If their highest-end SKU only has a 50% margin, think of what the rest of the margins will be.
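To put rough numbers on the stack math above, here's a quick back-of-envelope sketch. All dollar figures are illustrative assumptions for the sake of the example, not actual AMD costs or prices:

```python
# Back-of-envelope: how a roughly fixed chip/board cost squeezes margin
# on the cheaper SKU. All numbers are illustrative assumptions.

def gross_margin(msrp, cost):
    """Gross margin as a fraction of the selling price."""
    return (msrp - cost) / msrp

# Suppose the flagship sells at $450 and the cut-down part at $350,
# while the silicon/board cost barely drops ($225 vs $210).
flagship = gross_margin(450, 225)   # 50% margin on the top SKU
cutdown  = gross_margin(350, 210)   # margin shrinks to 40% on the volume part

print(f"flagship margin: {flagship:.0%}")
print(f"cut-down margin: {cutdown:.0%}")
```

The point the numbers make: knocking $100 off the MSRP removes far more margin than it removes cost, so the high-volume part earns proportionally much less per unit.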


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *tajoh111*
> 
> Samsungs semiconductor business represents a large portion of samsungs revenue, but a huge portion of their profits.
> 
> Gross Margins on chips and wafers is usually around 50%. This is better than their phone business. It's R and D heavy but the pay offs are huge and the margins are big because it's all about huge deals from a smaller selection of customers. This is because there is only a handful of companies on the market that can produce chips and basically two that can do it on the scale Apple wants.
> 
> https://news.samsung.com/global/samsung-electronics-announces-fourth-quarter-fy-2015-results
> 
> 
> 
> As a result it represents a huge portion of their operating profits, close to their biggest source of it.
> 
> I didn't say they could not make profits at a 350 dollar msrp, it's the the product stack below the high end part that have a difficult time generating profit. Naturally, the card with the highest MSRP have the highest margins. But if their high margin chip only has a profit margin of 50%, what happen's to the chip that costs 100 dollars less which will sell in far greater volume. It will be a lot less and this is where AMD runs into trouble. Nvidia's gross margins overall are still 55% when mixing in their lower margin, high volume products. With AMD, if their highest end product is 50% gross margin, take $100 off this msrp for the cutdown part and watch the gross margins disappear. AMD as is, needs their overall profit margins to be in excess of 40 percent to make a profit. This is in their skeleton like form at the present in regards to R and D expenditure and staffing. At this current 32-35% they lose money while not having sufficient cashflow for investing activities and operational expenses. For AMD to be profitable while generating significant cash flow to fund their other activities, they need margins much higher than now, probably in the 50% range or greater like their competitors.
> This fact is undeniable because AMD does the lowest volume out of Intel and Nvidia, while having the lowest margins. The lower the volume you do, the higher the margins you need to be profitable.
> 
> This is a recipe for disaster. Thus as a result, their highest end SKU need to have a gross margin in excess of 60-70%, while their volume drivers need to have margins in the 50% range. If their highest end SKU only have a 50% margin, think of what the rest of the margins will be.


So after all of this OT crap you've been posting the main thrust of what you are saying is AMD sucks and Nvidia rules, right? K, we got it...


----------



## Drake87

If true then I'll be ordering one asap. I'll believe it when I see it.


----------



## tajoh111

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> So after all of this OT crap you've been posting the main thrust of what you are saying is AMD sucks and Nvidia rules, right? K, we got it...


Heck no, it's just that AMD needs to raise their prices. How did you get that message? What AMD needs to do, if they have a really good product, is charge more for it when they are in a position to. Their financials reflect this; low margins along with low volume are what's killing them right now.

Pricing a card that has GTX 980 Ti performance at $350 is a very bad move.

It basically means the cards underneath have to be priced accordingly to sell well, likely to the point where they barely make any money. Flagship cards have the worst price-to-performance of all cards, so imagine what the cards underneath have to sell at to make sense for buyers.

AMD priced the 7870 $150 less than the $499 GTX 580, and it had worse performance at launch. Pricing a card at $350 (actually $300 if this rumor is to be trusted), $300 less than the competition's former $650 flagship, when your costs have gone way up, is nutty. Particularly when you know the competition is going to price their cards rather high, both because they are a bit greedy and because they also feel the cost of their chips going up.

They don't need to start an aggressive pricing war in the graphics card industry when they have neither the funds nor the brand value to keep up if Nvidia fights back with price drops. What ultimately happens is both companies start taking losses, and the one with the biggest bank balance wins.

I am concerned with the long-term health of AMD. If AMD has a card with GTX 980 Ti performance, I think they should charge at least $450 for it, because Nvidia is going to charge $550-650 for a card that performs 20% better than a GTX 980 Ti. This still gives AMD the price-to-performance victory, and $450 is a lot more accessible than $600. And from the response the GTX 970 received, people are willing to pay $330-350 for a cut-down card. I want AMD's profits and margins to rise. I don't care about short-term gain and getting a graphics card on the cheap (which is unlikely anyway when retailers scalp their cards) while AMD's graphics card division tanks.

One more thing, Eric: I think most people don't believe this rumor, particularly those who have looked at the past history of graphics cards. If you read my posts more carefully, you would see I am rationally trying to explain why this rumor is unlikely, and why, even if it is true, it's unhealthy for AMD. If the rumor turns out to be false (which is more than likely), what happens to the potential buyers in this thread when the product vastly underwhelms their expectations? They form a negative image of a product that is probably pretty decent, like what happened in the Fury X thread. As an added bonus, Nvidia fans crap all over the product, and it just makes the forum a more negative place, which a wounded company like AMD, already being kicked while it's down, doesn't need. There are genuinely stupid moves, like pricing the Fury Pro Duo at $1,500 a month and a half before new cards come out, which AMD fans know is just dumb, but the Fury X launch was AMD reaching for the stars and falling short. It's better for people to be pleasantly surprised than for their expectations merely to be met on launch day.


----------



## Fyrwulf

Quote:


> Originally Posted by *tajoh111*
> 
> I didn't say they could not make profits at a 350 dollar msrp, it's the the product stack below the high end part that have a difficult time generating profit. Naturally, the card with the highest MSRP have the highest margins. But if their high margin chip only has a profit margin of 50%, what happen's to the chip that costs 100 dollars less which will sell in far greater volume. It will be a lot less and this is where AMD runs into trouble. Nvidia's gross margins overall are still 55% when mixing in their lower margin, high volume products. With AMD, if their highest end product is 50% gross margin, take $100 off this msrp for the cutdown part and watch the gross margins disappear. AMD as is, needs their overall profit margins to be in excess of 40 percent to make a profit. This is in their skeleton like form at the present in regards to R and D expenditure and staffing. At this current 32-35% they lose money while not having sufficient cashflow for investing activities and operational expenses. For AMD to be profitable while generating significant cash flow to fund their other activities, they need margins much higher than now, probably in the 50% range or greater like their competitors.
> This fact is undeniable because AMD does the lowest volume out of Intel and Nvidia, while having the lowest margins. The lower the volume you do, the higher the margins you need to be profitable.


Wow, talk about drawing the wrong conclusions. AMD is losing money because the bottom two-thirds of their stack has been rebranded for two generations, and therefore nobody has been motivated to buy their products. I'm a huge AMD fan who refuses to buy Intel or nVidia products, but even I can acknowledge that.


----------



## tajoh111

Quote:


> Originally Posted by *Fyrwulf*
> 
> Wow, talk about drawing the wrong conclusions. AMD is losing money because the bottom two-thirds of their stack has been rebranded for two generations, therefor nobody has been motivated to buy their products. I'm a huge AMD fan that refuses to buy Intel or nVidia products, but even I can acknowledge that.


No question, rebranding was one of the major reasons; I am not disagreeing with you there.

The rebranding, plus the lack of good new products (a weak Tonga), plus Nvidia executing an incredibly polished market launch, is what caused the record-low margins and the loss of market share. I guess the rebranding is the cause and the declining market share and low margins are the effect. But I was focusing more on the financials than anything; my posts are long enough, and I didn't want to get into rebranding and similar things because it gets AMD fans riled up.

Rebranding can only do so much to improve margins, since product prices decrease over time (particularly with the 290X price collapse), and a rebrand always carries a negative connotation; it will never reset margins the way a new product can.

A good new product will let them gain market share, raise margins, and ultimately, if the volume is there, turn a profit.


----------



## 364901

Quote:


> Originally Posted by *zealord*
> 
> Do you have any source that it will be within that window?
> 
> I've only heard of Vega coming in 2017, but it could be January, June or December.


The Q1 2017 / end-of-2016 timeframe comes from this slide:



It could go either way for Vega depending on how on-time others are, but the Polaris launch seems to be hovering around Q2 2016, which fits well with a Computex 2016 introduction and a launch in June.


----------



## KarathKasun

Quote:


> Originally Posted by *tajoh111*
> 
> They basically get a negative image for a product that is likely probably pretty decent like what happened in the Fury x thread. As an added bonus, Nvidia fans overly crap on the product and it just makes the forum a more negative place which isn't needed for an wounded company like AMD that's already being kicked while it's down. There are genuinely stupid moves like pricing fury pro duo for 1500 dollars a month and a half before new cards come out, which AMD fans alike know is just a dumb move, but fury X launch was AMD reaching for the stars and falling short. It's better for people to be pleasantly surprised, than for for their expectations to be met on launch day.


I expect 390-Fury ballpark performance.

AMD is not in a market position to price itself out of reach of the majority of the market, and the majority of the market is at $350 MSRP or lower. Historically, the price for a given performance level drops considerably when going down a node; the X800 110nm changeover was a pretty big drop in price for its level of performance, and the same can be said for the 3850/3870/8800GT.

A 50% higher MSRP, and the increase in per-card revenue that comes with it, is not smart. If you can make good margins at $350 and reach double or more the TAM, you will make more money than by pricing at $450 with a much smaller TAM.

The lower end will only go down to $150 and will be cheaper to make from a whole-card perspective. The reason it exists is to serve as a dumping ground for chips that don't make the laptop bin. Margins will be a bit thinner here, but the laptop margins may be better and offset that.


----------



## flopper

Quote:


> Originally Posted by *KarathKasun*
> 
> I expect 390-Fury ballpark performance.
> 
> .


60 FPS in Hitman under DX12.
That's good for Polaris.


----------



## Skinnered

Quote:


> Originally Posted by *KarathKasun*
> 
> I expect 390-Fury ballpark performance.


If that's true, they will have a limited potential buyer base; people who already own an R9 290 (or a 970) or higher will not upgrade. It needs to be (much) faster than that to make a lot of coin. Near Fury/980 Ti performance is a good start.


----------



## KarathKasun

Fury is not much faster than the R9 390X, honestly.

Most people do not have R9 290/390s and GTX 980/980 Tis. They have R9 280/380s and GTX 770/970s, or even slower and older cards. I still see TONS of people riding out their GTX 750 Tis and HD 7800s among average PC game players.


----------



## guttheslayer

Quote:


> Originally Posted by *KarathKasun*
> 
> Fury is not much faster than R9 390x honestly.
> 
> Most people do not have R9 290/390's and GTX 980/Tis. They have R9 280/380s and GTX 770/970's, or even slower and older cards. I still see TONS of people riding out their GTX 750 Ti's and HD 7800's in the realm of average PC game players.


The Fury X is just 20% faster than the 390X, so the Fury is probably even less than that.


----------



## AmericanLoco

Quote:


> Originally Posted by *KarathKasun*
> 
> Fury is not much faster than R9 390x honestly.
> 
> Most people do not have R9 290/390's and GTX 980/Tis. They have R9 280/380s and GTX 770/970's, or even slower and older cards. I still see TONS of people riding out their GTX 750 Ti's and HD 7800's in the realm of average PC game players.


I still know a lot of PC gamers running HD 5000 series cards and GeForce 400 series stuff.


----------



## KarathKasun

Quote:


> Originally Posted by *AmericanLoco*
> 
> I still know a lot of PC gamers running HD 5000 series cards and GeForce 400 series stuff.


I'm seeing lots of people move from stuff that age to R9 380s and GTX 960s right now. Those old cards have no real performance issues aside from the lack of VRAM; newer games choke on GPUs with less than 2GB of VRAM.


----------



## PCGamer4Ever

Quote:


> Originally Posted by *KarathKasun*
> 
> Im seeing lots of people move from stuff that age to R9 380s and GTX 960 right now. Those old cards have no real performance issues aside from the lack of vram. Newer games choke on GPUs with less than 2gb vram.


This is just not true. The people buying these cards are running 1080p, and even the new games do just fine on 2GB cards at 1080p.


----------



## lahvie

Quote:


> Originally Posted by *PCGamer4Ever*
> 
> This is just not true, the people buying these cards are running 1080P and even the new games do just fine on 2gig cards at 1080P.


Could older cards with different memory clocks make a difference?


----------



## Ithanul

Quote:


> Originally Posted by *KarathKasun*
> 
> Fury is not much faster than R9 390x honestly.
> 
> Most people do not have R9 290/390's and GTX 980/Tis. They have R9 280/380s and GTX 770/970's, or even slower and older cards. I still see TONS of people riding out their GTX 750 Ti's and HD 7800's in the realm of average PC game players.


True this. Most peeps buying used cards off me around here are using 750 Tis and such, which is why I sell a little lower locally. I know the dude who bought my two old Titans off me was super giddy about them. Actually, I found it surprising that it's easier to sell computer parts back home in the South than it was when I was stationed in Northern California.

The one peep who bought the 970 off me kept asking if it had 4GB of VRAM, as the low VRAM on his 750 Ti was biting him hard in newer video games.


----------



## criminal

Quote:


> Originally Posted by *Ithanul*
> 
> True this. Most peeps buying used cards off me in the area are using 750Tis and such. Reason I sell a little lower in the local area. I know the dude who bought my two old Titans off me was super giddie about them. Actually I found surprising is it easier to sell computer parts here back home in the South than it was when I was stationed in North California.
> 
> *The one peep who bought the 970 off me kept asking did it have 4GB vram as the lower vram on his 750Ti was biting him hard in newer video games*.


So did you tell him the truth about it only having 3.5GB?









Where in Alabama do you live?


----------



## Ithanul

Quote:


> Originally Posted by *criminal*
> 
> So did you tell him the truth about it only having 3.5GB?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Where in Alabama do you live?


He really did not give a crud, as he is super happy with the performance it's giving him.

Deep South, in the tri-state area of lovely Dothan, with the crazy huge roundabout highway known as Ross Clark Circle.







With a buttload of restaurants. I ain't joking, there are like 300+ of the blasted things. Technically I live 30 minutes out from that city, in the farmland area.

Only good thing about living down here is it's about the lowest cost of living in the whole country, and gas is cheap.







Drawback is the minimum wage is super low, barely over 7 bucks an hour. Though if you can land a job paying over 10-12 bucks an hour, you're doing good.


----------



## superkyle1721

Quote:


> Originally Posted by *Ithanul*
> 
> Quote:
> 
> 
> 
> Originally Posted by *criminal*
> 
> So did you tell him the truth about it only having 3.5GB?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Where in Alabama do you live?
> 
> 
> 
> He really did not give a crud as he is super happy with the performance it is giving him.
> 
> Deep South in the TriState area of lovely Dothan with the crazy huge roundabout highway know as Ross Clark circle.
> 
> 
> 
> 
> 
> 
> 
> With a buttload of restaurants. I ain't joking there like over 300+ of the blasted things. Technical I live 30mins out from that city in the farm land area.
> 
> Only good thing about living down here is it the lowest cost of living in the whole country and gas is cheap.
> 
> 
> 
> 
> 
> 
> 
> Draw back from that is minimum wage is super low, aka barely over 7 bucks per hour. Though, if you can land a job with pay over 10-12 bucks per hour you doing good.
Click to expand...

Nice to see a fellow Alabamian here.


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *superkyle1721*
> 
> Nice to see a fellow alabamian here


Maybe you two are cousins.


----------



## criminal

Quote:


> Originally Posted by *xxdarkreap3rxx*
> 
> Maybe you two are cousins.


Wow... that is kinda harsh...lol

So what are you saying? That all Alabamians are related? I am from Alabama too, you know.


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *criminal*
> 
> Wow... that is kinda harsh...lol
> 
> So what you saying? *All Alabamians are related*? I am from Alabama too you know.


They aren't?


----------



## criminal

Quote:


> Originally Posted by *xxdarkreap3rxx*
> 
> They aren't?


Not that I am aware of.









We're going off topic bad now... lol


----------



## rbarrett96

Um yeah, and a Fury X is still over $650.


----------



## Fyrwulf

Quote:


> Originally Posted by *tajoh111*
> 
> But I was correlating things more about financials than anything.


Yes, but my point is that I've given mathematical proofs to demonstrate my point, proofs backed by facts. It's a fact that TSMC charges $7k per 16nm wafer; this is a public figure, just like Intel's $10k per 14nm wafer. It's a matter of public record that, in order to earn the business of Qualcomm and Apple, Samsung reduced prices significantly enough to make the change attractive; 5% was a guess on my part, but I think it's a good guess. It's also a matter of public record that after those production runs ceased (and dropped Samsung's revenue 40%), Samsung reduced prices further for follow-on customers; again, I assumed a 5% price reduction, but the math works out. You haven't posted any refuting proofs; you've just disagreed.

AMD does not, in fact, need to raise prices. Let's assume for a moment that nVidia makes $100 on the sale of every one of their new chips and they can only realistically expect about 3.5 million customers; that's $350 million in profit. But, again, AMD's shooting for roughly 60 million customers and at $40 per GPU they'd be looking at $2.4 billion in profit. Even if they only earn 15 million sales, which would be disappointing at 6.1% market share, that's still $600 million in profit. And both companies can do this without coming into conflict with each other because they're catering to different markets come June.

By the way, Samsung Electronics is a wholly owned subsidiary of Samsung Holdings Co, Ltd. While it's a large part of the company's $305 Billion in revenue, its 10% net profit margin is actually rather low. That gives you an idea of what the company is willing to accept as a cost of doing business. High volume trumps high per item profit every single time in every major business.
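The volume argument above is simple multiplication; here it is as a quick sketch (all inputs are the post's hypothetical figures, not reported numbers):

```python
# Back-of-the-envelope check of the volume-vs-margin argument above.
# All inputs are the post's assumptions, not published figures.

def total_profit(profit_per_unit, units):
    """Total profit = per-unit profit times units sold."""
    return profit_per_unit * units

# nVidia scenario: $100 profit per chip, ~3.5 million buyers.
nv = total_profit(100, 3_500_000)        # $350 million

# AMD scenario: $40 per GPU across ~60 million potential buyers.
amd_best = total_profit(40, 60_000_000)  # $2.4 billion

# Disappointing case: only 15 million sales.
amd_low = total_profit(40, 15_000_000)   # $600 million

print(f"nVidia: ${nv:,}; AMD best: ${amd_best:,}; AMD low: ${amd_low:,}")
```

The arithmetic matches the figures claimed in the post; whether the unit counts and per-unit profits are realistic is a separate question.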


----------



## rbarrett96

So the 490x will be the flagship then, I'm assuming? It's almost like they're rebranding Nvidia's card, lol.


----------



## tajoh111

Quote:


> Originally Posted by *Fyrwulf*
> 
> Yes, but my point is that I've given math proofs to demonstrate my point, proofs backed by facts. It's a fact that TSMC charges $7k per 16nm wafer, this is a public figure just like Intel's $10k per 14nm wafer. It's a matter of public record that in order to earn the business of Qualcomm and Apple, Samsung reduced prices significantly enough to make the change attractive; 5% was a guess on my part, but I think it's a good guess. It's also a matter of public record that after those production runs ceased (and dropped Samsung's revenue 40%), Samsung reduced prices further for follow on customers; again, I assumed a 5% price reduction, but the math works out. You haven't posted any refuting proofs, you've just disagreed.
> 
> AMD does not, in fact, need to raise prices. Let's assume for a moment that nVidia makes $100 on the sale of every one of their new chips and they can only realistically expect about 3.5 million customers; that's $350 million in profit. But, again, AMD's shooting for roughly 60 million customers and at $40 per GPU they'd be looking at $2.4 billion in profit. Even if they only earn 15 million sales, which would be disappointing at 6.1% market share, that's still $600 million in profit. And both companies can do this without coming into conflict with each other because they're catering to different markets come June.
> 
> By the way, Samsung Electronics is a wholly owned subsidiary of Samsung Holdings Co, Ltd. While it's a large part of the company's $305 Billion in revenue, its 10% net profit margin is actually rather low. That gives you an idea of what the company is willing to accept as a cost of doing business. High volume trumps high per item profit every single time in every major business.


Where did you even get this $7k figure from? The link you originally posted a while ago made no mention of prices. Worse yet, it was Fudzilla and WCCFTech, literally the worst rumor websites for reliability.

http://www.fool.com/investing/general/2015/12/02/how-much-does-the-apple-a9x-cost-to-make.aspx

The closest thing I found to a price was $8,400.

The Samsung Electronics group represents 80-90% of their net profit and over 66% of their revenue. It's the biggest and most important part of the company. Add in how much the semiconductor arm contributes to their net profit, which is close to half of that in some quarters, and your attempt to marginalize the importance of the semiconductor business is not going well.

Also, 10% net profit, which is profit after taxes, is fine and better than most in this type of industry. In a high-volume market, this is really the most you can expect in electronics. Apple is the exception to the rule.


----------



## Ithanul

Quote:


> Originally Posted by *xxdarkreap3rxx*
> 
> Maybe you two are cousins.


Highly unlikely. My family not from this area at all. Comes from the fact both my parents are military brats (Most my family has been in some branch or the other). So my family is from one end of the country to the other end.

I have relatives in California, Washington, Texas, Florida, Georgia. Then non-blood related family members in Germany, Denmark, and Ukraine.

We all over the blasted place. Also the reason my southern accent lighter compared to others from this area.
Quote:


> Originally Posted by *xxdarkreap3rxx*
> 
> They aren't?


Nah, we Alabamians joke that Georgians are though.


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *Ithanul*
> 
> Highly unlikely. My family not from this area at all. Comes from the fact both my parents are military brats (Most my family has been in some branch or the other). So my family is from one end of the country to the other end.
> 
> I have relatives in California, Washington, Texas, Florida, Georgia. Then non-blood related family members in Germany, Denmark, and Ukraine.
> 
> We all over the blasted place.


www.dictionary.com/browse/joke I like how you went through all the trouble to debunk it haha


----------



## Ithanul

Quote:


> Originally Posted by *xxdarkreap3rxx*
> 
> www.dictionary.com/browse/joke I like how you went through all the trouble to debunk it haha


I know you joking. I just like yapping.

Technical we all related anyways.


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *Ithanul*
> 
> I know you joking. I just like yapping.
> 
> *Technical we all related anyways*.


I think you've been living there too long

Need more Polaris and Pascal leaks. This thread is getting as stale as Ted Cruz's campaign.


----------



## KarathKasun

Quote:


> Originally Posted by *PCGamer4Ever*
> 
> This is just not true, the people buying these cards are running 1080P and even the new games do just fine on 2gig cards at 1080P.


Please re-read my post. Newer games DO have issues with having *less than 2gb*.


----------



## ToTheSun!

Quote:


> Originally Posted by *Ithanul*
> 
> Technical we all related anyways.


We're all brothers in Christ.


----------



## Fyrwulf

Quote:


> Originally Posted by *tajoh111*
> 
> Where did you even get this $7k figure from? The link you originally posted a while ago made no mention of prices. Worse yet, it was Fudzilla and WCCFTech, literally the worst rumor websites for reliability.


https://oemelectronic.wordpress.com/2015/01/13/race-14-nm-finfet-process-renewed-fighting-global-foundry/
Quote:


> Calculated on a capacity of 60,000 wafers per month, the average price of TSMC's 20nm process wafers is estimated at $6,000 per wafer in the fourth quarter of 2014, a big improvement over the 28nm average wafer price (about $4,500 to $5,000). *The estimated production cost of a 16nm/14nm FinFET wafer is about $4,000 per piece; with a gross margin of about 45%, the sales price works out to $7,270 per piece.* If the prediction for TSMC's 20nm process is accurate, its 20nm share will reach 95% of the world in the fourth quarter of 2014.


There you go. Samsung's 14nm LPP process will probably cost roughly the same per wafer ($4k). Anything after that is gravy.
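The quoted margin arithmetic checks out: with gross margin defined as (price - cost) / price, a $4,000 cost at a 45% margin implies a price of 4000 / 0.55, roughly the quoted $7,270. A quick sketch:

```python
# Sanity check of the quoted wafer pricing: a $4,000 production cost
# with a ~45% gross margin implies the quoted ~$7,270 selling price.
# Figures come from the quoted blog post, not official foundry data.

def sale_price(cost, gross_margin):
    """Gross margin is (price - cost) / price, so price = cost / (1 - margin)."""
    return cost / (1.0 - gross_margin)

price_16nm = sale_price(4000, 0.45)
print(f"Implied 16nm/14nm wafer price: ${price_16nm:,.0f}")  # ~$7,273
```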

EDIT: Here's another source that corroborates the manufacturing cost of 14nm FinFET

http://www.advancedsubstratenews.com/2014/03/why-migration-to-fd-soi-is-a-better-approach-than-bulk-cmos-and-finfets-at-20nm-and-1416nm-for-price-sensitive-markets/



Also this, which validates my estimate of raw die cost:


----------



## criminal

Quote:


> Originally Posted by *ToTheSun!*
> 
> We're all brothers in Christ.


Agree.


----------



## prjindigo

Quote:


> Originally Posted by *Forceman*
> 
> So you're Google image search now?
> *Though the GP100 GPU at the heart of the P100 supports traditional PCI Express* NVIDIA has also invested heavily in NVLink, their higher-speed interconnect to enable fast memory access between GPUs, and unified memory between the GPU and CPU.
> 
> http://www.anandtech.com/show/10229/nvidia-announces-dgx1-server


FOUR HUNDRED WATTS PER PROCESSOR ends up being something like 470 watts per PCIe card to support the chip - at 850MHz. Reading comprehension is important.

The processor as shown in the DGX-1 cannot be used as a consumer discrete GPU because of the power standards involved. That's why they call it the "P100" and not a "GP100" themselves.
Remember that the "Tesla" classed processors are not designed for, nor are they installed on, NOR are they given drivers for use as gaming GPUs. nVidia themselves have said that the P100 will not be put into a gaming card.


----------



## Forceman

Quote:


> Originally Posted by *prjindigo*
> 
> The processor as shown in the DGX-1 is not able to be used as a consumer discrete GPU because of the power standards involved. *This being why they call it the "P100" and not a "GP100" themselves*.
> Remember that the "Tesla" classed processors are not designed for, nor are they installed on, NOR are they given drivers for use as gaming GPUs. nVidia themselves have said that the P100 will not be put into a gaming card.


I don't know how much clearer they can be that the P100 is the card, and the GP100 is the processor on it.
Quote:


> *Based on the new NVIDIA Pascal GP100 GPU* and powered by ground-breaking technologies, *Tesla P100* delivers the highest absolute performance for HPC, technical computing, deep learning, and many computationally intensive datacenter workloads.


And no one expects to game on a Tesla card, but the underlying Pascal architecture is the same.

I still don't know where you are getting 400 watts per processor, when the TDP of the entire card is clearly stated as 300W.

https://devblogs.nvidia.com/parallelforall/inside-pascal/


----------



## Nickyvida

Polaris is a disappointment imo. Only replacements for the 480x, 470 and below are being touted around by Roy. Nothing about 490 or Fury replacements. And the 480x ain't much of an improvement if early benchmarks are to be believed.

AMD is giving Nvidia free rein over the high end market and room to overcharge.


----------



## DaaQ

Quote:


> Originally Posted by *Ithanul*
> 
> Highly unlikely. My family not from this area at all. Comes from the fact both my parents are military brats (Most my family has been in some branch or the other). So my family is from one end of the country to the other end.
> 
> I have relatives in California, Washington, Texas, Florida, Georgia. Then non-blood related family members in Germany, Denmark, and Ukraine.
> 
> We all over the blasted place. Also the reason my southern accent lighter compared to others from this area.
> Nah, we Alabamians joke that Georgians are though.


I thought it was Kentuckians. I'm originally from Michigan so I don't factor into that equation tho.


----------



## spyshagg

Quote:


> Originally Posted by *Nickyvida*
> 
> Polaris is a disappointment imo. Only replacements for the 480x, 470 and below are being touted around by Roy. Nothing about 490 or Fury replacements. And the 480x ain't much of an improvement if early benchmarks are to be believed.
> 
> AMD is giving Nvidia free rein over the high end market and room to overcharge.


I have to agree as well.

Fury sales will be decimated by Pascal's existence, but the real losers here are the buyers who will spend small fortunes on their inflated high-end nvidia cards.


----------



## flopper

Quote:


> Originally Posted by *Nickyvida*
> 
> Polaris is a disappointment imo. Only replacements for the 480x, 470 and below are being touted around by Roy. Nothing about 490 or Fury replacements. And the 480x ain't much of an improvement if early benchmarks are to be believed.
> 
> AMD is giving Nvidia free rein over the high end market and room to overcharge.


80-90% of the market is what AMD covers, so how is that a disappointment?
Anyone with a 370, 380, 960, 970 etc. then has a new shiny Polaris to look forward to.


----------



## Nickyvida

Quote:


> Originally Posted by *flopper*
> 
> 80-90% of the market is what AMD covers, so how is that a disappointment?
> Anyone with a 370, 380, 960, 970 etc. then has a new shiny Polaris to look forward to.


It is when the gains from a new architecture and node shrink are merely power efficiency. And that's only for mainstream. AMD all but conceded the high end market to Nvidia, with their overpriced Pascals, for those who want to replace their 290s, 390s and above.

So much for the 2.5x performance.

More like 2.5x the bullcrap.


----------



## Fyrwulf

Quote:


> Originally Posted by *Nickyvida*
> 
> AMD all but conceded the high end market to Nvidia, with their overpriced Pascals, for those who want to replace their 290s, 390s and above.


AMD's been upfront the entire time that Polaris would be a mainstream part and Vega the performance part. Your fault for not paying attention.
Quote:


> So much for the 2.5x performance.
> 
> More like 2.5x the bullcrap.


You're being childish. First of all, the claim was 2.5x improvement in performance/watt. Second of all, we have a screenshot of a single synthetic benchmark, whose source is of questionable provenance, which has never played well with AMD cards anyway. Never mind that, even if this is true, the card could be a pre-finalized engineering sample.


----------



## Nickyvida

Quote:


> Originally Posted by *Fyrwulf*
> 
> AMD's been upfront the entire time that Polaris would be a mainstream part and Vega the performance part. Your fault for not paying attention.
> You're being childish. First of all, the claim was 2.5x improvement in performance/watt. Second of all, we have a screenshot of a single synthetic benchmark, whose source is of questionable provenance, which has never played well with AMD cards anyway. Never mind that, even if this is true, the card could be a pre-finalized engineering sample.


Oh, really? It was only two or three weeks ago that AMD clarified Polaris's position. And even then, why a year apart?

They have nothing now to counter the x70 and x80, and by the time the Vega (490/Fury) replacements get in, the x80 Ti and the Titan will have been eked out already. The so-called high end will have been countered rather effectively.

Polaris should have been released alongside Vega, and then a 2X Fury replacement next year to counter the Titan.

It's making it hard for me to support them when there's nothing on offer from them, and they will lose customers over this.

We shall see if the 2.5x performance or the 980 Ti claim holds true. For all we know, the 2.5x claim may just be for power efficiency, since Raja did not state what the 2.5x refers to.


----------



## Fyrwulf

Quote:


> Originally Posted by *Nickyvida*
> 
> Oh, really? It was only two or three weeks ago that AMD clarified Polaris's position.


Anybody capable of reading between the lines could see where this was going. We've been talking about it on these boards for months.
Quote:


> They have nothing now to counter the x70 and x80, and by the time the Vega (490/Fury) replacements get in, the x80 Ti and the Titan will have been eked out already. The so-called high end will have been countered rather effectively.


Nobody's going to buy a mid-range card for the prices they're talking about. More to the point, you and I both don't know what the Polaris product stack is. As for Vega, do you have information that the rest of us don't?
Quote:


> It's making it hard for me to support them when there's nothing on offer from them, and they will lose customers over this.


I'm going to go out on a limb and suggest you wouldn't buy anything from AMD anyway. Also, the customers you're talking about total 7 million people worldwide versus the worldwide per year computer market of a quarter billion. If I'm running AMD, I'm not really worried about the high end on a brand new node, that's the way to ruinous losses. But, well, I suppose trying to explain simple economics to somebody whose greatest exposure to a real market is a piggy bank is wasting my breath.
Quote:


> For all we know, the 2.5x claim may just be for power efficiency, since raja did not state what the 2.5x refers to.


Actually, we do know that 2.5x efficiency isn't possible with the jump from 28nm SOI to 14nm FinFET. Some of that has to be performance.


----------



## Nickyvida

Quote:


> Originally Posted by *Fyrwulf*
> 
> Anybody capable of reading between the lines could see where this was going. We've been talking about it on these boards for months.
> Nobody's going to buy a mid-range card for the prices they're talking about. More to the point, you and I both don't know what the Polaris product stack is. As for Vega, do you have information that the rest of us don't?
> I'm going to go out on a limb and suggest you wouldn't buy anything from AMD anyway. Also, the customers you're talking about total 7 million people worldwide versus the worldwide per year computer market of a quarter billion. If I'm running AMD, I'm not really worried about the high end on a brand new node, that's the way to ruinous losses. But, well, I suppose trying to explain simple economics to somebody whose greatest exposure to a real market is a piggy bank is wasting my breath.
> Actually, we do know that 2.5x efficiency isn't possible with the jump from 28nm SOI to 14nm FinFET. Some of that has to be performance.


Oh sure, I remember many were talking about HBM2 being touted on Polaris before they confirmed Vega and so on.

Really? There will be buyers no matter what the prices; Titan basically confirmed that. And since Roy all but confirmed mainstream as the focus for Polaris, isn't it safe to assume that anything below the 490x will be what Polaris is aimed at, given that those above are basically their high end offerings?

Well, excuse me for buying a 390 from AMD, or at least, having already bought one in this case. I'm not going back to Nvidia anytime soon for my next upgrade, so yeah, Vega or bust.

And it's probably the basic reason why AMD is dying, or at least, is in the process of dying. Like they haven't lost enough already.


----------



## Bogga

Believe me when I say that I don't support either of these two... I prefer to pay as little as possible for the best available performance. But I want a big leap forward from the dual 970's I had recently... 970 SLI is about the same performance as a single 980 Ti, so one of these "980Ti-performance for 300$" cards would mean I have to get two of them. Then I take a look at the performance in CF (based on the graphs from the Duo review), which just gets worse as you go higher up in resolution; SLI just performs so much better there.

So I'll gladly pay much more and get a pleasant experience from SLI x80 than a meh experience from cheaper "490's".


----------



## KarathKasun

Quote:


> Originally Posted by *Nickyvida*
> 
> Oh sure, i remember many were talking about HBM2 touted on Polaris before they confirmed Vega and so on.
> 
> Really? There will be buyers no matter what the prices, Titan basically confirmed that. And since Roy all but confirmed mainstream as a focus for Polaris, isn't it safe to assume that anything below 490x will be what Polaris is aimed at, given that those above are basically thier high end offerings?
> 
> Well excuse me for buying a 390 from AMD, or at least, already bought in this case. I'm not going back to Nvidia anytime soon for my next upgrade, so yeah, vega or bust.
> 
> And it's probably the basic reason why AMD is dying, or at least, it is in the process of dying. Like they haven't lost enough already.


Who is "they", and can you post a link W/R/T the AMD Polaris HBM2 statements? Are you sure it wasn't just some internet forum gossip?
Quote:


> Originally Posted by *Bogga*
> 
> Believe me when I say that I don't support any of these two... I prefer to pay as little as possible for the best available performance. But I want a big leap forward from the dual 970's I had recently... 970 SLI is about the same performance that one 980Ti gives. So one of these "980Ti-performance for 300$" would mean I have to get two of these. Then I take a look at the performance in CF (based on the graphs from the duo review) which just gets worse when you go higher up in resolution, SLI just performs so much better there.
> 
> So I'll gladly pay much more and get a pleasant experience from SLI x80 than a meeh experience from cheaper "490's"


SLI and CF are in a death spiral. Native support is at an all-time low and NV totally skipped the SLI single card last gen.


----------



## Noufel

Quote:


> Originally Posted by *Nickyvida*
> 
> Oh sure, i remember many were talking about HBM2 touted on Polaris before they confirmed Vega and so on.
> 
> Really? There will be buyers no matter what the prices, Titan basically confirmed that. And since Roy all but confirmed mainstream as a focus for Polaris, isn't it safe to assume that anything below 490x will be what Polaris is aimed at, given that those above are basically thier high end offerings?
> 
> Well excuse me for buying a 390 from AMD, or at least, already bought in this case. I'm not going back to Nvidia anytime soon for my next upgrade, so yeah, vega or bust.
> 
> And it's probably the basic reason why AMD is dying, or at least, it is in the process of dying. Like they haven't lost enough already.


AMD dying ..... yes ofc

And for the HBM2, it was all rumours; AMD never said anything about Polaris being on that. And if I go down to your childish territory: what about 3.5gb?

( the last one was just kidding )


----------



## TheHorse

Quote:


> Originally Posted by *CataclysmZA*
> 
> Yeah, no, I can't believe that. 980 Ti performance for half the price, on a new process that is twice as expensive? That's literally implausible.
> 
> I can't even fathom how much such a GPU would break the industry (it most certainly would), and drive a price war that'll last far longer than it needs to. If anything, it'll be 980 Ti performance for $150 less, not $300 less.


I'm sorry, are you complaining about a price war? Do you wipe your butt with money or something? Why would you want prices to stay ridiculously and artificially high?


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *TheHorse*
> 
> I'm sorry, are you complaining about a price war? Do you wipe your butt with money or something? Why would you want prices to stay ridiculously and artificially high?


I don't think he is complaining about a possible price war. I think he is saying that it is highly unlikely that AMD will release a Polaris 10 card that would force such a price war (980Ti performance for $300 rumor)...


----------



## 364901

Quote:


> Originally Posted by *TheHorse*
> 
> I'm sorry, are you complaining about a price war? Do you wipe your butt with money or something? Why would you want prices to stay ridiculously and artificially high?


It's not artificially high. The prices are high because that's what makes the companies money, and that's what consumers are prepared to pay.

Take into account that we've arrived at a new node that costs about $4000 per 300mm wafer. Making large dies on this wafer is expensive; more than twice as expensive, in fact, as 28nm. Both companies are going to try to mitigate those increases by designing small-die chips for consumers and keeping prices at a comfortable level for them to make a profit. They can increase the die size and optimise the chip density in the next generation.

AMD could sell a GTX 980 Ti-class card for $300, but NVIDIA can also easily sell the same level of performance at $500. They use their brand strength to keep up sales, and bill their card as a "premium experience". AMD would give themselves no room to enact mass price drops, while NVIDIA can easily live with cutting off $100 about six months down the line. We can't have that, especially with AMD seeing quarter after quarter of losses. They need a win in the GPU market, and it's in our interest as consumers to have a strong alternative to NVIDIA.

In the event, however unlikely, that AMD's performance is right on the money and NVIDIA can't compete at $500, that forces a price war, and will result in stock issues, a large increase in demand, and both companies will struggle with selling future cards at this price point because consumers will expect them to offer big performance improvements with the next generation.

It's far better, instead, to have both GPUs start out at $500 (hypothetically speaking). Both companies make good, healthy profits from initial sales, and they can now play a game of chicken to see who'll drop the price first. If AMD drops to $450 after six months, so will NVIDIA. Neither company has stock issues as a result of the cards being priced relatively high, and they're both considered on the same level in terms of performance and quality when it comes to their partner's cooler and board designs. AMD becomes a highly valued brand once more, and despite the lower sales, they're in far better control of their costs.

Both companies, in addition, can further smooth out future rollouts. They can better control what price points get a certain level of performance, and because this is going to be another long node, they can manage a 20% performance improvement for the same price by managing consumer expectations going into 16 and 14nm respectively. As a consumer you're not benefiting much in terms of how much you're paying, but the higher prices help keep the market stable, and mask over any issues with supply.

In fact, NVIDIA is already doing this with Maxwell. The GTX 980 sells for around $470, whereas it launched at $549. The GTX 970 seems to be $290, but it launched at $329. In the past, this never used to be the case, and prices started dropping heavily with less than three months to go before a new GPU hit the market. AMD on the other hand took the same chip from the R9 290X on a more mature process and increased the price to $429 (it now hovers around $400), but the R9 290X started out at $549 before dropping to the $300-ish it is now. That's a huge loss for AMD, and they've learned their lesson.

I believe that both companies are preparing themselves and their customers for another extended stay on a new node. And that's why I'm saying that a price war would be unwelcome for everyone involved.
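To see why large dies on a roughly $4,000 wafer are so punishing, here's a rough sketch using the classic dies-per-wafer approximation and a Poisson yield model. The die sizes and defect density below are illustrative assumptions, not figures from either foundry:

```python
import math

# Rough sketch of why a large die on a ~$4,000 wafer costs so much more
# per chip than a small one. The wafer price comes from the discussion
# above; die sizes and defect density are illustrative assumptions.

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic approximation: usable wafer area divided by die area,
    minus an edge-loss term."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_rate(die_area_mm2, defects_per_cm2=0.2):
    """Poisson yield model: probability a die catches zero defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

def cost_per_good_die(wafer_cost, die_area_mm2):
    good_dies = dies_per_wafer(die_area_mm2) * yield_rate(die_area_mm2)
    return wafer_cost / good_dies

small = cost_per_good_die(4000, 230)  # mid-range-sized die
large = cost_per_good_die(4000, 600)  # big high-end die
print(f"~230mm^2 die: ${small:.0f} each, ~600mm^2 die: ${large:.0f} each")
```

With these made-up inputs the big die comes out several times more expensive per good chip, which is exactly the incentive to lead a new, pricey node with small dies.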


----------



## airfathaaaaa

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I don't think he is complaining about a possible price war. I think he is saying that it is highly unlikely that AMD will release a Polaris 10 card that would force such a price war (980Ti performance for $300 rumor)...


BUT if AMD wants to bring VR to the masses, there isn't really another way to do it. Mass adoption will only happen with a relatively beastly card at a reasonable price.


----------



## spyshagg

I think that's what we are getting in the next few months. A very competent VR card at a reasonable price.

As it stands, the war against the 1080 Ti will be waged by Vega. In between, I have no idea what they will have to fight the regular 1080.


----------



## Ithanul

Quote:


> Originally Posted by *ToTheSun!*
> 
> We're all brothers in Christ.


LoL, brothers, that word is a bit inaccurate for me.

Hint: I'm a gal.

Otherwise, on the video card front, I get this funky feeling both parties are not going to be nice with the prices. Probably going to be small die too.
But o well, I have patience and am a tight bugger with my money. I can wait it out till the cards hit the used market. Like I do with all my electronic goodies.


----------



## 364901

Quote:


> Originally Posted by *airfathaaaaa*
> 
> BUT if AMD wants to bring VR to the masses, there isn't really another way to do it. Mass adoption will only happen with a relatively beastly card at a reasonable price.


Can you say R9 290 performance in a card that costs $229 with less than half the power consumption? That's how I'd do it. That would be a sub-140W card that only needs a six-pin PEG power connector.

Edit: AMD's fairly close to that at any rate. R9 380X is at the same performance level as the R9 290, looking at 4K benchmarks. It launched at $229 with a 190W TDP, but often drew more.


----------



## DrFPS

Quote:


> Originally Posted by *CataclysmZA*
> 
> Can you say R9 290 performance in a card that costs $229
> 
> Edit: AMD's fairly close to that at any rate.


You have not been reading very much concerning VR. ATM the only single card capable of VR from AMD is the $1500 Radeon Pro Duo. So much for your theories. Wishing is always fun. I hope you are correct. Somehow I don't see it happening. IMO near-980 performance is not good enough. At that rate you will need two or three 480's for VR.
http://www.eweek.com/pc-hardware/amd-creates-site-dedicated-to-polaris-gpu.html
Quote:


> She said AMD will continue to expand its efforts in graphics, noting that VR will play an important role in the company's future in both the consumer and commercial spaces and pointing to AMD's new $1,500 Radeon Pro Duo platform for VR creation and consumption


----------



## tajoh111

Quote:


> Originally Posted by *Fyrwulf*
> 
> https://oemelectronic.wordpress.com/2015/01/13/race-14-nm-finfet-process-renewed-fighting-global-foundry/
> There you go. Samsung's 14nm LPP process will probably cost roughly the same per wafer ($4k). Anything after that is gravy.
> 
> EDIT: Here's another source that corroborates the manufacturing cost of 14nm FinFET
> 
> http://www.advancedsubstratenews.com/2014/03/why-migration-to-fd-soi-is-a-better-approach-than-bulk-cmos-and-finfets-at-20nm-and-1416nm-for-price-sensitive-markets/
> 
> 
> 
> Also this, which validates my estimate of raw die cost:


Not quite. They use an IBS business paper that I first showed you, and the problem is that the figures you show don't use the final cost for the die maker.

You can tell they use the same source because their numbers are lifted from this chart.

The problem is they quote Q4 2016 figures and they don't break down the cost enough.

This is the true final cost for the die manufacturer, aka the total yielded wafer cost.

I think the problem is that your initial source likely got their information from the same paper (the one I first linked) and didn't apply the math with the right figures, using the wrong data set.

I think they got a bit overzealous with their rounding, and on top of this used Q4/16 figures, which underestimates costs for the time being. On top of that, they didn't use the total yielded wafer cost, but rather the unyielded wafer cost.

Everyone giving information on die and FinFET wafer costs is using the same data, as they often refer to the IBS paper or to IBS directly. I think the blog you linked to isn't nearly as credible because, well, it's a blog, and no professional company or institution is using it as a source.
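The yielded-versus-unyielded distinction argued here boils down to one line of arithmetic: the foundry's list price is per wafer, but the chip designer's effective cost spreads that price over only the good dies. A minimal sketch (the example numbers are illustrative, not from the IBS paper):

```python
# Effective ("yielded") wafer cost: the foundry's per-wafer price divided
# by the fraction of dies that actually work. Example numbers below are
# illustrative, not figures from the IBS paper.

def yielded_wafer_cost(unyielded_cost, yield_fraction):
    """Spread the wafer price over only the usable silicon."""
    return unyielded_cost / yield_fraction

# e.g. a $7,000 wafer at 70% yield effectively costs $10,000
# per wafer's worth of good dies.
print(round(yielded_wafer_cost(7000, 0.70)))
```

This is why quoting the raw wafer price understates what a die maker actually pays per working chip, especially early in a node's life when yields are low.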


----------



## 364901

Quote:


> Originally Posted by *DrFPS*
> 
> You have not been reading very much concerning vr. ATM the only single card capable of vr from amd is the Radeon Pro Duo $1500. So much for your theories. Wishing is always fun. I hope you are correct. Some how I don't see it happening. IMO near 980 performance is not good enough. At that rate you will need two or three 480's for VR.
> http://www.eweek.com/pc-hardware/amd-creates-site-dedicated-to-polaris-gpu.html


Well, luckily, no-one needs a Radeon Pro Duo to enjoy a VR game or use a Vive or Oculus headset. Just a Core i5 processor from the Ivy Bridge family and a GTX 970 or RadeonR9 290. AMD's own tests revealed that the Radeon R9 290 is perfectly capable of matching the experience offered by the GTX 970. Radeon Pro Duo is just there for the people who want to make VR games and need much more performance in a single-card solution.

Of course, it's actually just a gaming card with FirePro drivers, but that's OK with me. And of course, the current crop of VR games is underwhelming, and current games that have added VR support don't do enough to utilise such a unique peripheral.

Quote:


> Originally Posted by *tajoh111*
> 
> This is the true final cost for the die manufacturer, i.e. the total yielded wafer cost.
> 
> 
> 
> Holy damn.


----------



## Fyrwulf

Quote:


> Originally Posted by *tajoh111*
> 
> Not quite. They use the IBS business paper that I first showed you, and the problem is that the figures you show don't use the final cost for the die maker.
> 
> You can tell they use the same source because their numbers are lifted from this chart.
> 
> 
> 
> The problem is that they quote Q4 2016 figures and don't break the cost down enough.
> 
> This is the true final cost for the die manufacturer, i.e. the total yielded wafer cost.
> 
> 
> 
> I think the problem is that your initial source likely got their information from the same paper (the one I first linked) but did the math with the wrong figures, using the wrong data set.
> 
> I think they got a bit overzealous with their rounding, and on top of that used Q4 2016 figures, which underestimate costs for the time being. They also didn't use the total yielded wafer cost, but rather the unyielded wafer cost.
> 
> Everyone giving information on die and FinFET wafer costs is using the same data, as they generally refer to the IBS paper or to IBS directly. I don't think the blog you linked is nearly as credible because, well, it's a blog, and no professional company or institution is using it as a source.


The image you just linked doesn't substantially disagree with anything I've said, though. There's a $200 price difference between the yielded and unyielded costs. Everything above and beyond that is pure profit, which I accounted for when I demonstrated that Samsung is in a position to drop those prices, and has. I have also explained how they could do that without substantially sacrificing their actual dollar-per-wafer profits.


----------



## iLeakStuff

There are a ton of these 16nm/14nm/28nm cost slides out there, and they all differ from each other by a good amount.
I think nobody here knows which one is right.


----------



## DIYDeath

Quote:


> Originally Posted by *Ultracarpet*
> 
> Yea I mean Nvidia released the 970 with the Titan Z still on the market....If AMD has a card they think they can aggressively get market share with, they will do so. Who cares if they cannibalize a few sales of an extremely low volume product... Especially when they only have 20% of the market to begin with.
> 
> Tldr: stealing sales from Nvidia will make a much larger difference than protecting the sales of their old and arguably failed products


Don't worry, those old cards are going into the new Nintendo consoles.


----------



## hokk

My friend's uncle's cousin's sister's stepson has confirmed this is true.

You heard it here first.


----------



## magnek

Quote:


> Originally Posted by *Ithanul*
> 
> LoL, "brothers"? That word is a bit inaccurate for me.
> 
> 
> 
> 
> 
> 
> 
> *Hint: I'm a gal.*
> 
> Otherwise, on the video card topic: I get this funky feeling both parties are not going to be nice with the prices. Probably going to be a small die too.
> But oh well, I have patience and I'm a tight bugger with my money. I can wait until the cards hit the used market, like I do with all my electronic goodies.


Reported for trolling. Reason: there are no "gals" on the interwebz


----------



## DCSRM

Quote:


> Originally Posted by *kylzer*
> 
> My friend's uncle's cousin's sister's stepson has confirmed this is true.
> 
> You heard it here first.


I'm really glad you finally cleared this up. I can rest easy now.


----------



## JackCY

Rumours are rumours. Those 2.5x and 10x higher-performance figures that AMD and NVIDIA are giving out are just numbers that get misinterpreted ten times over. The 2.5x is probably 2.5x higher performance/power compared to 28nm AMD chips like the 7970/280X, maybe the 290/390. The 10x etc. can be a boost in some single task like compute or other specific stuff where NV was weak. In other words, until they release the cards for review by consumers, it's all just speculation.

So far AMD has said HBM2 will wait: they pioneered HBM and built the production line for it, but that meant extra costs for a new premium product, so now they will wait until the technology becomes more cost-effective. It was similar with Mantle; they kicked the industry forward by making the first step, and now it's evolving for everyone to gain from, not just them. From my POV, Nvidia has just been sitting back and reaping the benefits lately: good but overpriced cards that even lack some features here and there. Sure, AMD may not have the best CPUs and GPUs in every class, but they have made serious efforts in several areas, both HW and SW, to bring more performance to GPUs, and not just for themselves. Most if not all of the tools they make or propose for game devs are open to anyone, including FreeSync, compared to Nvidia's overpriced, proprietary dev tools and expensive G-Sync.
They both make good cards, but their approach to the market, innovation, developers, and consumers differs.


----------



## SuperZan

Quote:


> Originally Posted by *magnek*
> 
> Reported for trolling. Reason: there are no "gals" on the interwebz


 Definitely not. None at all.

*scampers*


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Ithanul*
> 
> LoL, "brothers"? That word is a bit inaccurate for me.
> 
> 
> 
> 
> 
> 
> 
> *Hint: I'm a gal.*
> 
> Otherwise, on the video card topic: I get this funky feeling both parties are not going to be nice with the prices. Probably going to be a small die too.
> But oh well, I have patience and I'm a tight bugger with my money. I can wait until the cards hit the used market, like I do with all my electronic goodies.


Oh lord, prepare your inbox!


----------



## SuperZan

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Oh lord, prepare your inbox!


Ha! IME visiting a gaming Teamspeak will cause at least half of the previously gregarious regulars to stop talking completely. All the more when you beat up on them in PvP.


----------



## hokk

Quote:


> Originally Posted by *SuperZan*
> 
> Ha! IME visiting a gaming Teamspeak will cause at least half of the previously gregarious regulars to stop talking completely. All the more when you beat up on them in PvP.


Because you're a woman, or because your voice is just weird?

Not trying to be rude btw.


----------



## SuperZan

Quote:


> Originally Posted by *kylzer*
> 
> Because you're a woman, or because your voice is just weird?
> 
> Not trying to be rude btw.


The former, though maybe the latter applies as well; I grew up in three different countries, so my accent is unique.







Some groups of gamers are used to female gamers, but some aren't. I'm one of the leaders of a multi-game community, so we interact with several different communities from different games.


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Oh lord, prepare your inbox!


I wasn't even going to until someone else wrote it.
Quote:


> Originally Posted by *kylzer*
> 
> Because you're a woman, or because your voice is just weird?
> 
> Not trying to be rude btw.


Voices can be so deceiving. Most people say mine sounds like a 40-year-old black guy. Which brings up a funny moment: I mentioned something and another guy piped up with "wait, you're not black?". Like, come on man, we'd been teammates for 8 months by that point; there's no way you couldn't have figured it out.
Quote:


> Originally Posted by *JackCY*
> 
> Rumours are rumours. Those 2.5x and 10x higher-performance figures that AMD and NVIDIA are giving out are just numbers that get misinterpreted ten times over. The 2.5x is probably 2.5x higher performance/power compared to 28nm AMD chips like the 7970/280X, maybe the 290/390. The 10x etc. can be a boost in some single task like compute or other specific stuff where NV was weak. In other words, until they release the cards for review by consumers, it's all just speculation.


Yeah, but how are you getting 2-2.5x performance per watt when the Nano is already at "near 980 Ti performance" (sort of; at least it's like 10% faster than a 390X) with ~210W power consumption? https://www.techpowerup.com/reviews/AMD/R9_Nano/28.html Comparing against chips like the "7970/280X, maybe the 290/390" is such a lame way to market those numbers/"improvements".
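To put numbers on why the baseline matters, here's a quick sketch (performance normalised to 980 Ti = 1.00; all figures are ballpark guesses, and the "new card" is hypothetical):

```python
# How much the chosen baseline changes a "perf/W improvement" claim.
# Performance is normalised so that the 980 Ti = 1.00; the power figures
# are rough board-power ballparks, not measurements.
cards = {
    "R9 290":  (0.70, 275.0),   # (relative perf, watts) - rough guesses
    "R9 Nano": (0.90, 210.0),   # ~210 W per the TechPowerUp figure above
}

# Hypothetical Polaris part: 980 Ti-class performance at 150 W (made up)
new_perf, new_power = 1.00, 150.0
new_eff = new_perf / new_power

for name, (perf, power) in cards.items():
    gain = new_eff / (perf / power)
    print(f"vs {name}: {gain:.2f}x perf/W")
```

With these made-up numbers you get roughly 2.6x against a 290 but only about 1.6x against the Nano, which is exactly the point: the choice of baseline does the marketing work.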


----------



## Ithanul

Quote:


> Originally Posted by *SuperZan*
> 
> Definitely not. None at all.
> 
> *scampers*










Can I go by "tomboy" then? I don't really act like a gal anyway. My mom just could never get me into girly stuff; I always wanted the Hot Wheels and Legos.







Plus, I had a bad habit of catching frogs, snakes, and lizards. Blame my cousin for showing me how to catch those. Heck, I have been mistaken for a dude in real life. (I keep my hair cut short, similar to a military cut.)


----------



## magnek

Quote:


> Originally Posted by *SuperZan*
> 
> The former, though maybe the latter applies as well; I grew up in three different countries, so my accent is unique.
> 
> 
> 
> 
> 
> 
> 
> Some groups of gamers are used to female gamers, but some aren't. I'm one of the leaders of a multi-game community, so we interact with several different communities from different games.


Well color me impressed.


----------



## Diogenes5

I seriously hope AMD can light a fire under the GPU market. Nvidia has gained too much market share in the last year (80% ... ridiculous) and has abused its market-leader position. The GTX 970 launched at $310, making many think we might see a lower street price, but instead supply was constrained and 970s often sold in excess of $350 for the first few months. Not only that, its DirectX 12 performance is bad, and only 3.5 GB of its RAM is accessible at full speed on the 256-bit bus, limiting 2K and 4K performance. G-Sync is also a steaming pile of turd IMO. Why does it make monitors cost $200 more when FreeSync as a feature barely affects monitor prices?

Nvidia milks the sheep pretty hard.

If they are able to get 980 Ti performance for $300, AMD must be getting huge economies of scale from providing chips to so many different buyers. So far we know they will be in every new MacBook, iMac, PlayStation 4.5 Neo, Nintendo NX, and another unknown console (either the next Xbox One or another Nintendo device). Now, I know these are all probably Polaris 11 chips and the 980 Ti-level chip would be Polaris 10, but maybe there is a link between the two in manufacturing. Also, going from 28nm to 14nm might more than make up for a lower yield percentage on a new process.

All I know is the graphics market is ripe for (finally) some decent performance gains. The last 3 years on the 28nm process have been awfully stagnant. It's not like a 980 Ti is that much faster anyway; it's 30-40% faster than the 970s and 290s most enthusiasts are running right now. It's not completely out of line that we could get 980 Ti performance out of a process that is 3-4 years in the making due to problems at TSMC. Cell phone SoCs have seen something like 3 node changes since then, whereas the GPU market has been stuck on one.


----------



## SuperZan

Quote:


> Originally Posted by *Ithanul*
> 
> 
> 
> 
> 
> 
> 
> 
> Can I go by "tomboy" then? I don't really act like a gal anyway. My mom just could never get me into girly stuff; I always wanted the Hot Wheels and Legos.
> 
> 
> 
> 
> 
> 
> 
> Plus, I had a bad habit of catching frogs, snakes, and lizards. Blame my cousin for showing me how to catch those. Heck, I have been mistaken for a dude in real life. (I keep my hair cut short, similar to a military cut.)


I was a bit of a tomboy as well.









Quote:


> Originally Posted by *magnek*
> 
> Well color me impressed.


"I can dance on the head of a pin as well!"

Quote:


> Originally Posted by *Diogenes5*
> 
> All I know is the graphics market is ripe for (finally) some decent performance gains. The last 3 years on the 28nm process have been awfully stagnant. It's not like a 980 Ti is that much faster anyway; it's 30-40% faster than the 970s and 290s most enthusiasts are running right now. It's not completely out of line that we could get 980 Ti performance out of a process that is 3-4 years in the making due to problems at TSMC. Cell phone SoCs have seen something like 3 node changes since then, whereas the GPU market has been stuck on one.


I was initially worried about AMD not offering something in the performance space before Nvidia. That worry faded a bit as we've learned more about the impending 1080/1070. I still think those will be good cards but if we're not seeing a Titan drop first then I think AMD waiting for HBM2 before Vega makes sense. If AMD can shore up the $400 US and under market then Nvidia releasing high-mid as a high-end part could backfire. Otherwise, they'll release it at ~$500 US and we all win. Either way, I'd wager a month's pay that neither Nvidia nor AMD will be catering to the true high-end/enthusiast until 2017.

As others have mentioned we may be on these nodes for some time; they're not going to hit us with their best shots all at once.


----------



## Majin SSJ Eric

As a benchmarker I am certainly disappointed with AMD's apparent lack of even attempting to compete for the performance crown before 2017 but I also understand the realities of their business situation at the moment as well. Whether I like it or not, AMD simply no longer has the resources to go toe-to-toe with Nvidia in every market segment and has to choose the battles it thinks it can do well in and focus on those. It sucks (especially for us consumers) but it may just be the only path they have left to them for the time being. When Vega drops all we can do is hope to God that it somehow can beat (or at least compete against) big Pascal but for the rest of 2016 it seems as though we will have to watch Nvidia dictate the market and set pricing wherever the hell they feel like, just like we've been watching since early 2013...


----------



## SuperZan

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> As a benchmarker I am certainly disappointed with AMD's apparent lack of even attempting to compete for the performance crown before 2017 but I also understand the realities of their business situation at the moment as well. Whether I like it or not, AMD simply no longer has the resources to go toe-to-toe with Nvidia in every market segment and has to choose the battles it thinks it can do well in and focus on those. It sucks (especially for us consumers) but it may just be the only path they have left to them for the time being. When Vega drops all we can do is hope to God that it somehow can beat (or at least compete against) big Pascal but for the rest of 2016 it seems as though we will have to watch Nvidia dictate the market and set pricing wherever the hell they feel like, just like we've been watching since early 2013...


I don't disagree but as you mention AMD's in a financial crunch. Even with that unfortunate reality I think that they'll still take a very good swing at that performance crown in 2017. Without the excess of funds that Nvidia's got at the moment it just wouldn't make sense to dilute the process with a mid-high hybrid placeholder 1080-equivalent when they know that Titan/Ti are still inbound. Polaris 10/11 have to win at the low/mid and Vega has to win at the high end. That's the only way to win back appreciable market share and mindshare. That said, given what we've seen with the Pro Duo and how Raja's been talking up multi-GPU solutions, I would not be surprised to see a dual-Pol 10 card sometime before 2017. Multi-GPU issues aside, that's something I'd be interested in playing with.


----------



## 12Cores

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> When Vega drops all we can do is hope to God that it somehow can beat (or at least compete against) big Pascal but for the rest of 2016 it seems as though we will have to watch Nvidia dictate the market and set pricing wherever the hell they feel like, just like we've been watching since early 2013...


I think if AMD puts something out for $300 that is as fast as the Fury X, they will sell a lot of cards regardless of what Nvidia charges for their parts. A card this fast would game comfortably up to 3440x1440 for some time. The only way AMD could screw this up is to release a part comparable to the 390 in performance, with a much lower TDP, for $300; if that's the case, you can stick a fork in them.


----------



## SuperZan

Quote:


> Originally Posted by *12Cores*
> 
> I think if AMD puts something out for $300 that is as fast as the Fury X, they will sell a lot of cards regardless of what Nvidia charges for their parts. A card this fast would game comfortably up to 3440x1440 for some time. The only way AMD could screw this up is to release a part comparable to the 390 in performance, with a much lower TDP, for $300; if that's the case, you can stick a fork in them.


I think what we'll see is a super-value card that meets modern requirements, and then Vega will be a true high-end competitor. I think it's the mid-high *80 area that AMD will slack on for the beginning portion of this gen.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *SuperZan*
> 
> I don't disagree but as you mention AMD's in a financial crunch. Even with that unfortunate reality I think that they'll still take a very good swing at that performance crown in 2017. Without the excess of funds that Nvidia's got at the moment it just wouldn't make sense to dilute the process with a mid-high hybrid placeholder 1080-equivalent when they know that Titan/Ti are still inbound. Polaris 10/11 have to win at the low/mid and Vega has to win at the high end. That's the only way to win back appreciable market share and mindshare. That said, given what we've seen with the Pro Duo and how Raja's been talking up multi-GPU solutions, I would not be surprised to see a dual-Pol 10 card sometime before 2017. Multi-GPU issues aside, that's something I'd be interested in playing with.


I see what you are saying, but the reality is that their strategy essentially leaves Nvidia completely unchallenged at the 1080/1070 level, with no guarantee that Vega will even be able to compete at the top end next year (considering how formidable GP100 is likely to be). If they do indeed release only a very small ~200mm² Polaris 10, I have to question that strategy (unless of course they are simply financially unable to do better), because it would seem they could have bumped that size up 100mm² or so to challenge Pascal and saved the smaller stuff for Polaris 11. Keep in mind that all of this is still based on pure rumor (and the words of Roy, which could just be an attempt to lower expectations ahead of a "surprise" reveal). Who knows anymore, right? I mean, I try to root for AMD even though I'm an Nvidia fan, because I want choice in my purchase decisions, but they just have such a consistent track record of bungling launches that I can't be too optimistic anymore (I had really high hopes for Fiji running up to the launch, and they did deliver a huge die that should have performed better, but we all know how that turned out).


----------



## SuperZan

I think it's reasonable to question it if they went the dual-GPU route, and I would too, but I can see why they'd do it. It seems like the market share is in the sub-$400 space and the mind share is in the $650+ space, so I can see why they would concentrate on those areas. I definitely don't disagree that it's too bad the mid-range would effectively be Nvidia turf in the interim, but I can see why a bleeding company would go that way. I could definitely see them mishandling things, but I think I see where they're coming from, is all.

As far as the Fiji launch goes, that was rough and "overclocker's dream" is right up there with segmented memory for PR fails. Nano and Fury X were priced a little high, but the Fury is pure gold and I don't regret buying my pair in the least. At 4k I get great performance at High/Ultra settings and I spent less than $1,000 US. I guess in that regard what I'm saying is, even though it's been somewhat better lately AMD is never going to be as silky-smooth in the PR game as Nvidia is, but the technology they put out can still be very competitive. With that in mind I think Vega will put up a fight with GP100; I think it will be completely different from Pol 10/11 so I'm not using expectations for those chips as a bellwether for Vega.


----------



## prjindigo

Quote:


> Originally Posted by *Forceman*
> 
> I don't know how much clearer they can be that the P100 is the card, and the GP100 is the processor on it.
> And no one expects to game on a Tesla card, but the underlying Pascal architecture is the same.
> 
> I still don't know where you are getting 400 watts per processor, when the TDP of the entire card is clearly stated as 300W.
> 
> https://devblogs.nvidia.com/parallelforall/inside-pascal/


Ok, so one of the mods has said I'm not supposed to berate anybody.

That is a processor sub-assembly, not a graphics card. It does not have its own on-board power supply. It does not have an external graphics port. It does not have to drive its own heatsink. The Tesla P100 circuit board is NOT a graphics card. If you put that chip on a PCIe slot card, it will require an on-card power supply, which is not perfectly efficient. If you up-clock it to 1.2GHz instead of 850MHz, it will eat more power.
If you put it on a PCIe slot card, it will chew up nearly 500 watts per card. How can people possibly think that the picture there is a complete graphics card? That board operates with the proprietary 4-link interconnect nV developed for its 8-processor special-purpose server, which is _another_ circuit feature that will not be put into a chip that ends up on a PCIe graphics card. That processor assembly receives power from the board it is mounted to, from a major supply system.

The processor on that server socket card _cannot_ function inside a PC as a graphics card. We're not going to see a "P100" graphics card; nV has already said so. They will do the 1070, 1080, and 1080 Ti, then they will do Volta, which will still not be a P100 chip. The ONLY nV Pascal chips we've seen that are _capable_ of displaying frames on a computer monitor are the GP104-200 and GP104-300.

Would I love a 500W P100 graphics card that runs at 1.2GHz? Sure! Buy me one. Is one going to exist? No. nVidia has answered this question.

In the same presentation, Dr. Blathers held up two cards, one full-length and one half-height: http://www.nvidia.com/object/tesla-k80-boost-up.html server accelerator cards. The internet was all abuzz about them, but they're twin and single Tesla K-series accelerators with no capacity for a graphics port; they're GK110b chips. The chips inside the DGX-1 have no capacity for a graphics port and are designed to decompose images using edge detection, color interference, and other functions.

_IF_ nVidia got a P100-sized graphics chip running at the 1.2-1.4GHz speeds the GP104s can run at, it would have 50% more graphics power and 8 times the memory bandwidth of the GP104-300... and that doesn't seem like something nVidia is willing to put on the table this summer, since NOBODY would buy a 1080 Ti.


----------



## lahvie

Quote:


> Originally Posted by *12Cores*
> 
> I think if AMD puts something out for $300 that is as fast as the Fury X, they will sell a lot of cards regardless of what Nvidia charges for their parts. A card this fast would game comfortably up to 3440x1440 for some time. The only way AMD could screw this up is to release a part comparable to the 390 in performance, with a much lower TDP, for $300; if that's the case, you can stick a fork in them.


Brings me back to that whole question... what is going to happen with their current inventory?
Did they stop production of all current offerings?
Like, what sort of ramifications would there be if their new high end is slightly sub-par, or on par, with last gen?
Dump the old, in with the new?

Especially with their finances being down and out.
Even when Vega comes it's going to be late to the table, but are they really in that bad a shape?
Edit: which is why I believe this card you speak of would cost at least $400-500.
Second edit: actually, when does Vega come? I just remembered: late '16, maybe early '17?
May not be as bad as I thought.
AMD AMD!


----------



## KGPrime

Whatever comes, comes. But I was at Fry's looking at monitors recently, and the Fry's guy I was talking to (a knowledgeable guy who would probably frequent forums like this, not some Best Buy sales-pitch dimwit) said AMD reps were in just recently and told them that Polaris will be at least up to as powerful as a Titan and have up to as much RAM as well. Something to that effect. I honestly was not paying full attention at that moment, as I was thinking about the monitors, and I kind of don't care since I'm still running a GTX 660, so whatever I buy this year will be a massive upgrade, but he said something very much like that. Which seems to jibe with this thread title anyway.


----------



## airfathaaaaa

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I see what you are saying, but the reality is that their strategy essentially leaves Nvidia completely unchallenged at the 1080/1070 level, with no guarantee that Vega will even be able to compete at the top end next year (considering how formidable GP100 is likely to be). If they do indeed release only a very small ~200mm² Polaris 10, I have to question that strategy (unless of course they are simply financially unable to do better), because it would seem they could have bumped that size up 100mm² or so to challenge Pascal and saved the smaller stuff for Polaris 11. Keep in mind that all of this is still based on pure rumor (and the words of Roy, which could just be an attempt to lower expectations ahead of a "surprise" reveal). Who knows anymore, right? I mean, I try to root for AMD even though I'm an Nvidia fan, because I want choice in my purchase decisions, but they just have such a consistent track record of bungling launches that I can't be too optimistic anymore (I had really high hopes for Fiji running up to the launch, and they did deliver a huge die that should have performed better, but we all know how that turned out).


Do you even have a single fact to support any of that? AFAIK no one knows anything about either company's cards. Assuming one will be significantly faster than the other while we are totally in the dark is even worse than those sites that spread rumors just to be right in the end.


----------



## ToTheSun!

Quote:


> Originally Posted by *KGPrime*
> 
> Whatever comes, comes. But I was at Fry's looking at monitors recently, and the Fry's guy I was talking to (a knowledgeable guy who would probably frequent forums like this, not some Best Buy sales-pitch dimwit) said AMD reps were in just recently and told them that Polaris will be at least up to as powerful as a Titan and have up to as much RAM as well. Something to that effect. I honestly was not paying full attention at that moment, as I was thinking about the monitors, and I kind of don't care since I'm still running a GTX 660, so whatever I buy this year will be a massive upgrade, but he said something very much like that. Which seems to jibe with this thread title anyway.


And what monitors were you looking at?


----------



## cowie

We can all just keep dreaming of AMD putting out a $300 card that would top the Fury X.

I am sure it will be propaganda slides in Hitman (well, not the latest) or some other game that AMD does very well in.

By the time of launch it may go from 2.5x to 2.0x to 1.5x the performance per watt, and then reviews might say 1.4x.








but but they showed the card lol yeah right


----------



## KarathKasun

Quote:


> Originally Posted by *cowie*
> 
> We can all just keep dreaming of AMD putting out a $300 card that would top the Fury X.
> 
> I am sure it will be propaganda slides in Hitman (well, not the latest) or some other game that AMD does very well in.
> 
> By the time of launch it may go from 2.5x to 2.0x to 1.5x the performance per watt, and then reviews might say 1.4x.
> 
> 
> 
> 
> 
> 
> 
> 
> but but they showed the card lol yeah right


2.5x Perf / W does not mean 2.5x performance.

If performance is static, it means power consumption will be something like 1/4 of the comparative precursor GPU.


----------



## cowie

Quote:


> Originally Posted by *KarathKasun*
> 
> 2.5x Perf / W does not mean 2.5x performance.
> 
> If performance is static, it means power consumption will be something like 1/4 of the comparative precursor GPU.


Ah, where did I say it was? I just said what AMD has said, that's all.

2.5x, errrrr, 2.0x the performance/watt improvement.


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *KarathKasun*
> 
> 2.5x Perf / W does not mean 2.5x performance.
> 
> If performance is static, it means power consumption will be something like 1/4 of the comparative precursor GPU.


"Something like 1/4"? Would it not be 1/2.5.


----------



## KarathKasun

Depends on how hard they push clocks on the end product. The higher you push clocks and voltage, the less you gain from the process shrink. There is a point where you sacrifice most of the gains from the node shrink and you are relying mostly on architecture changes.

"2.5x lower" is somewhat misleading and can be made to say something different depending on how you come to "2.5x". In reality the window will be 90-120w average power dissipation with peak consumption in the 150w ballpark.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *airfathaaaaa*
> 
> do you even have a single fact to support all of that? afaik no one know anything about both companies card assuming one will be significant faster than the other while we are in a total dark is just worst than those sites that spread rumors just to be right in the end


Read what you quoted again carefully. I specifically said that this was all based on rumor and none of it was based on fact. Jeez.


----------



## 364901

Quote:


> Originally Posted by *KarathKasun*
> 
> 2.5x Perf / W does not mean 2.5x performance.


A lot of people understand this to mean that there's more room in the power budget, which is technically true. I'd like to see how far these chips can be pushed, because FinFET behaves very differently to older fabrication nodes.


----------



## Serios

It looks like the Polaris launch is near.
I just checked some local online stores, and most only have a 390X in stock, and some just a 390 and a 390X.


----------



## Bogga

Quote:


> Originally Posted by *Serios*
> 
> It looks like the Polaris launch is near.
> I just checked some local online stores, and most only have a 390X in stock, and some just a 390 and a 390X.


If I were to use that kind of logic, then Polaris would be much closer than Pascal, since there's more 9x0 stock where I usually buy my stuff...


----------



## Serios

Maybe they had higher stocks of GTXs, but it's definitely obvious that the supply of certain GPUs has slowed down significantly.
Certain people were saying that AMD won't be able to sell their 390s or Furys if Polaris is cheap and just as fast, if not faster, but I don't think that will be a problem.


----------



## xLegendary

Quote:


> Originally Posted by *Serios*
> 
> Maybe they had higher stocks of GTXs, but it's definitely obvious that the supply of certain GPUs has slowed down significantly.
> Certain people were saying that AMD won't be able to sell their 390s or Furys if Polaris is cheap and just as fast, if not faster, but I don't think that will be a problem.


From what I know from distributors, AMD doesn't have much stock of 390s and Furys, so for them it's all good.


----------



## Nickyvida

GTX 1070 faster than a Titan X.

Meanwhile at AMD...

2.5x mainstream perf/watt efficiency bullcrap. Nothing to challenge the x70 and x80. As a result the 1080 now sits at $600, a whopping $100 increase. An overpriced mid-range GP104 die for $600. And an x70 almost breaking the $400 mark. Who knows what the x80 Ti will retail for now?

Thanks a lot, Raja!


----------



## SuperZan

Quote:


> Originally Posted by *Nickyvida*
> 
> GTX 1070 faster than a Titan X.
> 
> Meanwhile at AMD...
> 
> 2.5x mainstream perf/watt efficiency bullcrap. Nothing to challenge the x70 and x80. As a result the 1080 now sits at $600, a whopping $100 increase. An overpriced mid-range GP104 die for $600. And an x70 almost breaking the $400 mark. Who knows what the x80 Ti will retail for now?
> 
> Thanks a lot, Raja!


OR, and just go with me here, we could wait a tick and find out what Polaris 10 is really all about. THEN, we could wait a bit longer for the 1080 Ti/Titan and Vega to become tangible realities and likewise judge them on their respective merits!

It's crazy, I know, but we ought to give it a go.


----------



## JonnyBigBoss

How are AMD drivers these days? Last I read they were awful.


----------



## KarathKasun

Quote:


> Originally Posted by *Nickyvida*
> 
> GTX 1070 faster than a Titan X.
> 
> Meanwhile at AMD...
> 
> 2.5x mainstream perf/watt efficiency bullcrap. Nothing to challenge the x70 and x80. As a result the 1080 now sits at $600, a whopping $100 increase. An overpriced mid-range GP104 die for $600. And an x70 almost breaking the $400 mark. Who knows what the x80 Ti will retail for now?
> 
> Thanks a lot, Raja!


I seriously doubt they're going to have the same OC headroom as prior chips, meaning an OC'd T-X will trounce a 1070.

If NV got 500MHz from the shrink, how much do you think AMD is going to get?


----------



## KarathKasun

Quote:


> Originally Posted by *JonnyBigBoss*
> 
> How are AMD drivers these days? Last I read they were awful.


Last time I had a bad AMD driver experience was back in the X1950 days. That's 10+ years ago.


----------



## SuperZan

Quote:


> Originally Posted by *JonnyBigBoss*
> 
> How are AMD drivers these days? Last I read they were awful.


They've been pretty good for a while now. Single-card is solid, Crossfire takes a few weeks after SLI but once the fixes are published I don't have any real issues. They've been addressing everything that affected me personally in a pretty systematic fashion.


----------



## Nickyvida

Quote:


> Originally Posted by *SuperZan*
> 
> OR, and just go with me here, we could wait a tick and find out what Polaris 10 is really all about. THEN, we could wait a bit longer for the 1080 Ti/Titan and Vega to become tangible realities and likewise judge them on their respective merits!
> 
> It's crazy, I know, but we ought to give it a go.


Polaris 10/11 has already been confirmed to be mainstream by Roy. I doubt there will be any challenge to the x70 and x80, which are high end; that will go to Vega.

Even if Polaris 10 actually delivers 980 Ti performance, it ain't enough. Just look at the x70 and x80.

We've been stuck on 28nm too long. I just want a tangible increase, not a match for something Nvidia already did a year ago.


----------



## Fyrwulf

Quote:


> Originally Posted by *Nickyvida*
> 
> GTX 1070 faster than a Titan X.
> 
> Meanwhile at AMD...
> 
> 2.5x mainstream perf/watt efficiency bullcrap. Nothing to challenge the x70 and x80. As a result the 1080 now sits at $600, a whopping $100 increase. An overpriced mid-range GP104 die for $600. And an x70 almost breaking the $400 mark. Who knows what the x80 Ti will retail for now?
> 
> Thanks a lot, Raja!


So let me ask you this, why do you care who is "on top"? Either you're going to buy Polaris 10 or you're not. To me it seems like you're upset because a perceived lack of competitiveness on AMD's part has allowed nVidia to jack up prices, which would only make sense if you're not interested in buying AMD. Yeah, you claim you own an AMD card. Here's the thing, I think you're lying. I think you're a professional troll, bought and paid for by nVidia.


----------



## SuperZan

Quote:


> Originally Posted by *Nickyvida*
> 
> Polaris 10/11 has already been confirmed to be mainstream by Roy. I doubt there will be any challenge to the x70 and x80, which are high end; that will go to Vega.
> 
> Even if Polaris 10 actually delivers 980 Ti performance, it ain't enough. Just look at the x70 and x80.
> 
> We've been stuck on 28nm too long. I just want a tangible increase, not a match for something Nvidia already did a year ago.


I don't put a ton of stock in the Steam hardware survey, but according to that the 970 -is- mainstream. At that price I would consider it to be so. And, if Pol 10 delivers 980 Ti performance for less than $300 it will do a lot of business. The vast majority of PC gamers don't need 980 Ti performance now. Don't judge the market by our predilections as an enthusiast community.


----------



## Nickyvida

Quote:


> Originally Posted by *Fyrwulf*
> 
> So let me ask you this, why do you care who is "on top"? Either you're going to buy Polaris 10 or you're not. To me it seems like you're upset because a perceived lack of competitiveness on AMD's part has allowed nVidia to jack up prices, which would only make sense if you're not interested in buying AMD. Yeah, you claim you own an AMD card. Here's the thing, I think you're lying. I think you're a professional troll, bought and paid for by nVidia.


Why would it not make sense? If AMD brings out a competitive product, I'd buy it. But as it stands, if Polaris is a stinker, why would I waste my hard-earned money? I'm not some super hardcore enthusiast who can afford to punt several thousand going from Devil's Canyon to Skylake to Kaby Lake and so on, with a 5% increase between generations. I want to stretch my dollar on something that performs, given that it's my hard-earned savings in play. And after waiting 3+ years through the whole 20nm saga, then finding out I'll have to wait another year for a decent performance increase because Polaris turned out to be mainstream, I'm basically just annoyed that I have to wait, that's all. I'm never going back to Nvidia after my shoddy experiences with their cards dying in less than two years, and their overpriced cards.

Calling me a professional troll is a bloody insult to a fellow enthusiast, you elitist douchebag.

Is this enough proof for you that I own an AMD card? Or do I have to dismantle my case, take a pic, and scrounge up my receipt?


----------



## Nickyvida

Quote:


> Originally Posted by *JonnyBigBoss*
> 
> How are AMD drivers these days? Last I read they were awful.


They've been pretty good so far, with decent releases at shorter intervals now that RTG has been spun off.


----------



## Travieso

I think the one thing that concerns people about the performance of Polaris is its "tiny die size".

Can a ~200 mm² GPU compete with a ~300 mm² one? Seriously? If it's true, this will be one hell of a feat of engineering.

I know Maxwell surpassed Hawaii with a smaller die, but the difference wasn't this big (only around 400 mm² vs 440 mm²).

We all know AMD is capable of bringing competitive products to market in some ways, especially on the GPU side, which relies mostly on "MOAR COARS" and clock speed bumps.

But competing with a much smaller die? That's not likely. It feels like they're going to give up the enthusiast segment this round.


----------



## n4p0l3onic

Quote:


> Originally Posted by *Travieso*
> 
> I think the one thing that concerns people about the performance of Polaris is its "tiny die size".
> 
> Can a ~200 mm² GPU compete with a ~300 mm² one? Seriously? If it's true, this will be one hell of a feat of engineering.
> 
> I know Maxwell surpassed Hawaii with a smaller die, but the difference wasn't this big (only around 400 mm² vs 440 mm²).
> 
> We all know AMD is capable of bringing competitive products to market in some ways, especially on the GPU side, which relies mostly on "MOAR COARS" and clock speed bumps.
> 
> But competing with a much smaller die? That's not likely. It feels like they're going to give up the enthusiast segment this round.


*cough* 4870 vs GTX 260, and 4890 vs 280/275.


----------



## DIYDeath

If the rumors are true... I don't think Polaris will sell too well. $380 gets you a 1070; $80 more for a sharp increase in performance.


----------



## variant

Quote:


> Originally Posted by *DIYDeath*
> 
> If the rumors are true... I don't think Polaris will sell too well. $380 gets you a 1070; $80 more for a sharp increase in performance.


What? The rumor is that Polaris 10 will have 980 Ti performance at $300. That puts it roughly in the same league as the 1070, at $80 less.


----------



## gamervivek

A 200mm² die could compete with a 300mm² die if the clock speeds are in its favor; we saw something similar with the GTX 960 vs. the 380/380X.

With Pascal touching 2GHz and Maxwell already holding a commanding lead over GCN, it's doubtful that AMD will overtake Nvidia on that front.
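The clock-compensation point can be sketched with a toy throughput model (all numbers are invented for illustration; real GPUs don't scale this cleanly):

```python
# Toy model: throughput ~ shader_units * clock. The unit counts and
# clocks below are illustrative assumptions, not specs of any real card.
small_die_units, small_die_clock = 2048, 1.40   # GHz: smaller die, clocked high
big_die_units, big_die_clock = 2816, 1.00       # GHz: bigger die, clocked low

small_throughput = small_die_units * small_die_clock
big_throughput = big_die_units * big_die_clock

# A ~27% unit deficit is roughly erased by a ~40% clock advantage:
print(small_throughput >= big_throughput)  # -> True
```

Which is why the comparison hinges on who holds the clock advantage, not die size alone.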


----------



## DIYDeath

Quote:


> Originally Posted by *variant*
> 
> What? The rumor is that Polaris 10 will have 980 Ti performance at $300. That puts it roughly in the same league as the 1070, at $80 less.


What would you rather have?

$300 for an AMD 980 Ti, or $380 for a faux Titan X?

One of these things is not like the others... one of them just doesn't belong!

Polaris 10 would need to be priced somewhere between $200-$280 to compete, given what we know about the competition, based on the rumors about Polaris 10, which at this point I doubt are true.


----------



## Nickyvida

The 980 Ti is so last year. People can just get cut-price 980 Tis instead of spending moar on Polaris.

Sad that they've given up on the enthusiast market. Zen will probably be this way too.


----------



## Diogenes5

Can't stand all the Nvidia fanboys/viral marketers. It's a f*cking piece of silicon that drives games. Buy the best one that meets your needs and budget. I remember waiting for the 770 to reach a reasonable price and then the 970. Both times, I just went with AMD instead because they just offered way more value for the buck.

People saying they are rushing out to buy right away without seeing what the competition can do are f*cking idiots. I make a very comfortable salary and would never drop that kind of money with so little thinking.

IMO, Nvidia looks overpriced to me. I don't base my evaluations on the previous generation; I look at the base components and performance. A die shrink from 28nm to 16nm should generate even more performance for the price than this; it's a two- or three-node jump. We should be getting 40% more performance for the same price, but instead they jacked the price on the xx70 line to $380. I'm sure the xx60, when it comes out, will be super gimped just like the 960 was, heavily incentivizing the purchase of a GTX 1070 as well.

And what's with the %#((% GDDR5 memory in the 1070? Given HBM is just around the corner, GDDR5X was the minimum they should have delivered.

I was looking for a 40% increase in performance at the same $300 mainstream price point, which is reasonable given the die shrink and how long we've all been waiting for one. If anything we should demand more, because die shrinks are getting progressively harder; we may not see another for yet another three years. Nvidia is milking the market and doing good business, but my money is valuable. Let's hope ATI brings the heat and gets us 980 Ti performance at a $300-or-less price point.


----------



## DaaQ

Quote:


> Originally Posted by *Nickyvida*
> 
> Polaris 10/11 has already been confirmed to be mainstream by Roy. I doubt there will be any challenge to the x70 and x80, which are high end; that will go to Vega.
> 
> Even if Polaris 10 actually delivers 980 Ti performance, it ain't enough. Just look at the x70 and x80.
> 
> We've been stuck on 28nm too long. I just want a tangible increase, not a match for something Nvidia already did a year ago.


Here's the thing: people are taking Roy's word as confirmation, which was really a vague statement to begin with, and jumping to the conclusion that P10 sucks. Then earlier today the guy with the leather jacket says the x80 is TWICE the performance of a Titan X (sorry, I don't know how to spell his name), and that's taken as gospel, while neither has any performance results to confirm either claim. Just my observation on this.

Not to mention the two-tier pricing of the regular and Founders series. So it's either +$50 or +$150.


----------



## Majin SSJ Eric

Maybe we could wait until AMD officially announces Polaris 10 before proclaiming it a total fail based on some offhand comment Roy made? Fact is, we know absolutely nothing about the performance potential of this card, and we really don't know that much about the 1070 either, for that matter.


----------



## DaaQ

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Maybe we could wait until AMD officially announces Polaris 10 before proclaiming it a total fail based on some offhand comment Roy made? Fact is, we know absolutely nothing about the performance potential of this card, and we really don't know that much about the 1070 either, for that matter.


I agree. Since when has anyone on this board ever taken Roy's word as confirmation of anything. Until now.

You have Roy saying near-980 Ti performance, and OCN interprets that as less than 390 performance. Then the Nvidia guy says the x70 is faster than a TX, and OCN interprets that as +10% or more.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *DaaQ*
> 
> I agree. Since when has anyone on this board ever taken Roy's word as confirmation of anything. Until now.
> 
> You have Roy saying near-980 Ti performance, and OCN interprets that as less than 390 performance. Then the Nvidia guy says the x70 is faster than a TX, and OCN interprets that as +10% or more.


One of the most basic marketing ploys around is to undersell and over-deliver. But never mind; if Roy says anything that team green likes, then all of a sudden he's a soothsayer. AMD will have their moment soon enough, and then we'll all know for sure.


----------



## DaaQ

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> One of the most basic marketing ploys around is to undersell and over-deliver. But never mind; if Roy says anything that team green likes, then all of a sudden he's a soothsayer. AMD will have their moment soon enough, and then we'll all know for sure.


I mean, Roy has had his foot in his mouth more often than on the ground, but the Nvidia dude says TWICE the performance of a Titan X and nerdgasms ensue.
It's disappointing that not one person has really questioned the validity of those statements near the end of his presentation. No benchmarks were shown, afaik, though I wasn't able to view the entire video; I was limited to audio because of work.

One other thing: he came pretty darn close to the "overclockers' dream" remark, with different wording of course.


----------



## SuperZan

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Maybe we could wait until AMD officially announces Polaris 10 before proclaiming it a total fail based on some offhand comment Roy made? Fact is, we know absolutely nothing about the performance potential of this card, and we really don't know that much about the 1070 either, for that matter.


Please and thank you.


----------



## Olivon

Quote:


> Originally Posted by *DaaQ*
> 
> I agree. Since when has anyone on this board ever taken Roy's word as confirmation of anything. Until now.
> 
> *You have Roy saying near-980 Ti performance*, and OCN interprets that as less than 390 performance. Then the Nvidia guy says the x70 is faster than a TX, and OCN interprets that as +10% or more.


Can you please post a link where Roy Taylor explicitly said that Polaris will have 980 Ti performance? Thanks in advance.


----------



## DaaQ

Quote:


> Originally Posted by *Olivon*
> 
> Can you please post a link where Roy Taylor explicitly said that Polaris will have 980 Ti performance? Thanks in advance.


NEAR 980 Ti. But I'm on mobile; give me a bit to get home and I'll find it.

Again though, it's Roy, which is why I'm wondering why people are believing this statement from him as fact.


----------



## Olivon

Quote:


> Originally Posted by *DaaQ*
> 
> NEAR 980 Ti. But I'm on mobile; give me a bit to get home and I'll find it.
> 
> Again though, it's Roy, which is why I'm wondering why people are believing this statement from him as fact.


Don't waste your time, he never said that. That's just a rumour from the usual clickbait sites.


----------



## 2010rig

Quote:


> Originally Posted by *DaaQ*
> 
> I mean, Roy has had his foot in his mouth more often than on the ground, but the Nvidia dude says TWICE the performance of a Titan X and nerdgasms ensue.
> It's disappointing that not one person has really questioned the validity of those statements near the end of his presentation. No benchmarks were shown, afaik, though I wasn't able to view the entire video; I was limited to audio because of work.
> 
> One other thing: he came pretty darn close to the "overclockers' dream" remark, with different wording of course.


No one, huh?
Quote:


> Originally Posted by *2010rig*
> 
> Twice the performance of TX? $599 MSRP and people cheered


Quote:


> Originally Posted by *2010rig*
> 
> And did we all hear correctly, he claimed the 1080 had TWICE the performance of a TX
> I missed that he meant in VR, sounded very general


When I thought they were falsely advertising, I pointed that out too

http://www.overclock.net/t/1599440/geforce-nvidia-gtx-1080-unveiled/450_50#post_25136110

If you follow that thread, I wasn't the only one to point this out

As far as their 2x claim in VR goes...


----------



## DaaQ

Quote:


> Originally Posted by *2010rig*
> 
> No one, huh?
> 
> When I thought they were falsely advertising, I pointed that out too
> 
> http://www.overclock.net/t/1599440/geforce-nvidia-gtx-1080-unveiled/450_50#post_25136110
> 
> If you follow that thread, I wasn't the only one to point this out


I was at work and the threads were moving too fast to keep up, honestly.

@Olivon, it's here somewhere, but it's the main reason people are saying Roy conceded the high end to Pascal. It may be in the Polaris market segment thread. I need to get home though, guys. I'll try and catch up there.


----------



## Nickyvida

Quote:


> Originally Posted by *DaaQ*
> 
> Here's the thing: people are taking Roy's word as confirmation, which was really a vague statement to begin with, and jumping to the conclusion that P10 sucks. Then earlier today the guy with the leather jacket says the x80 is TWICE the performance of a Titan X (sorry, I don't know how to spell his name), and that's taken as gospel, while neither has any performance results to confirm either claim. Just my observation on this.
> 
> Not to mention the two-tier pricing of the regular and Founders series. So it's either +$50 or +$150.


Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Maybe we could wait until AMD officially announces Polaris 10 before proclaiming it a total fail based on some offhand comment Roy made? Fact is, we know absolutely nothing about the performance potential of this card, and we really don't know that much about the 1070 either, for that matter.


From Ars Technica:
Quote:


> AMD's upcoming Polaris 10 and Polaris 11 graphics chips won't be powering high-end graphics cards, according to recent comments by AMD. In its latest financial report, the company noted that Polaris 11 would target "the notebook market," while Polaris 10 would target "the mainstream desktop and high-end gaming notebook segment."


Vague? I think it's clear as day, to be honest. Nothing for high-end cards. Pretty safe to say it's probably a fail even before it launches. 980 Ti performance is NOT cutting it for a die shrink and a new architecture after three years. Whatever happened to small dies beating big dies, like the 680?

And for the record, 980 Ti perf is so last year for a 14nm card. People can just buy second-hand 980 Tis off the shelves and camp for Vega, where it'll be beaten roundly by an overpriced xx80 Ti anyhow.


----------



## caswow

I'm actually asking myself why people can't wait until all the benches are out. What's the purpose of being so negative?


----------



## KarathKasun

I'd laugh if there was another, larger chip and P10 was misdirection. It's not likely, though, and honestly it's good business to target the largest market when your competition is focusing on a much smaller one.


----------



## Travieso

Quote:


> Originally Posted by *Nickyvida*
> 
> From Ars Technica:
> Vague? I think it's clear as day, to be honest. Nothing for high-end cards. Pretty safe to say it's probably a fail even before it launches. 980 Ti performance is NOT cutting it for a die shrink and a new architecture after three years. Whatever happened to small dies beating big dies, like the 680?
> 
> And for the record, 980 Ti perf is so last year for a 14nm card. People can just buy second-hand 980 Tis off the shelves and camp for Vega, where it'll be beaten roundly by an overpriced xx80 Ti anyhow.


P10's die is around 100 mm² smaller than GP104's.

If it can match 980 Ti performance and is priced around $279-299, this will be one hell of a card, since the 1070 is only marginally faster than a Titan X, which is the same tier as the 980 Ti.


----------



## Nickyvida

Quote:


> Originally Posted by *Travieso*
> 
> P10's die is around 100 mm² smaller than GP104's.
> 
> If it can match 980 Ti performance and is priced around $279-299, this will be one hell of a card, since the 1070 is only marginally faster than a Titan X, which is the same tier as the 980 Ti.


Yeah, I know... but 980 Ti performance isn't awfully inspiring considering it was done last year on an old process. Obviously I'm not in the market for one, but it's pretty disappointing that the gains from the die shrink have dwindled in both camps while the prices quoted for the cards keep increasing, aka milking.


----------



## Tojara

Quote:


> Originally Posted by *DIYDeath*
> 
> What would you rather have?
> 
> $300 for an AMD 980 Ti, or $380 for a faux Titan X?
> 
> One of these things is not like the others... one of them just doesn't belong!
> 
> Polaris 10 would need to be priced somewhere between $200-$280 to compete, given what we know about the competition, based on the rumors about Polaris 10, which at this point I doubt are true.


You wot m8? The TX is 10% faster than the 980 Ti at best, and the price difference is over 25%. So how would the AMD 980 Ti not be the better buy?
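For what it's worth, the perf-per-dollar arithmetic behind that, using the thread's rumored and unconfirmed numbers:

```python
# Hypothetical figures from the thread: rumored Polaris 10 at 980 Ti-class
# performance for $300, vs. a GTX 1070 ("faux Titan X") at $380 that is
# ~10% faster at best. Both are rumors, not confirmed specs.
polaris_perf, polaris_price = 1.00, 300.0   # 980 Ti-class = 1.0 (assumed)
gtx1070_perf, gtx1070_price = 1.10, 380.0   # ~10% faster at best (assumed)

perf_per_dollar_polaris = polaris_perf / polaris_price
perf_per_dollar_1070 = gtx1070_perf / gtx1070_price

# ~27% more money for ~10% more performance: the rumored card wins on perf/$.
print(perf_per_dollar_polaris > perf_per_dollar_1070)  # -> True
```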


----------



## Travieso

Quote:


> Originally Posted by *KarathKasun*
> 
> I'd laugh if there was another, larger chip and P10 was misdirection. It's not likely, though, and honestly it's good business to target the largest market when your competition is focusing on a much smaller one.


Yep, I think everyone can estimate GPU performance just by looking at die size.

A GPU isn't as complicated as a CPU, which is affected by architecture changes, pipelines, memory controllers, etc.; this is mostly about muscle, which shows up in die size.

I know both companies can pull out some special sauce, like AMD with RV770 or Nvidia with Maxwell, but that doesn't happen very often.


----------



## DaaQ

Quote:


> Originally Posted by *Nickyvida*
> 
> From Ars Technica:
> Vague? I think it's clear as day, to be honest. Nothing for high-end cards. Pretty safe to say it's probably a fail even before it launches. 980 Ti performance is NOT cutting it for a die shrink and a new architecture after three years. Whatever happened to small dies beating big dies, like the 680?
> 
> And for the record, 980 Ti perf is so last year for a 14nm card. People can just buy second-hand 980 Tis off the shelves and camp for Vega, where it'll be beaten roundly by an overpriced xx80 Ti anyhow.


What I'm saying is, it's Roy. When has Roy ever been a reliable source?
From before Hawaii, Roy may as well have been made of salt. But then he makes one statement and suddenly he has credibility.
I understand that P10 is confirmed mainstream. But will mainstream mean performance or price? We don't know yet. I'm not holding my breath on the performance, personally.
On the other hand, we have Jen (sorry, I just can't get his name or the spelling atm; it's 3:00am here + mobile)
dropping "the 1080 is twice the performance of a Titan X" with no proof beyond words, afaik (again, I was mobile during the stream and got mostly audio). That comment struck me as marketing speak. Twice the performance by what metric? Still no proof of the claims.
But we do have confirmation that this gen's x80 will have two-tier pricing, with no information as to why one carries a $100 premium over the other.

I get what you're saying about my use of "vague", though; I interpret everything Roy says as vague given his track record of BS.
I hope that clears up what I meant some.


----------



## Nickyvida

Quote:


> Originally Posted by *DaaQ*
> 
> What I'm saying is, it's Roy. When has Roy ever been a reliable source?
> From before Hawaii, Roy may as well have been made of salt. But then he makes one statement and suddenly he has credibility.
> I understand that P10 is confirmed mainstream. But will mainstream mean performance or price? We don't know yet. I'm not holding my breath on the performance, personally.
> On the other hand, we have Jen (sorry, I just can't get his name or the spelling atm; it's 3:00am here + mobile)
> dropping "the 1080 is twice the performance of a Titan X" with no proof beyond words, afaik (again, I was mobile during the stream and got mostly audio). That comment struck me as marketing speak. Twice the performance by what metric? Still no proof of the claims.
> But we do have confirmation that this gen's x80 will have two-tier pricing, with no information as to why one carries a $100 premium over the other.
> 
> I get what you're saying about my use of "vague", though; I interpret everything Roy says as vague given his track record of BS.
> I hope that clears up what I meant some.


It's pretty much mainstream performance in the Ars Technica piece, since the high-end GPUs were name-dropped as off the table before the mention of mainstream. The inclusion of the high-end gaming notebook segment nails it down firmly; I don't really think it's about price, tbh. They've pulled out of the high-end desktop segment.

Pretty disappointing if true, and it's what allowed Nvidia to quote a $600 tag for the x80; it doesn't matter whether it's a Founders Edition or not. The very notion of an x80 at $600 is blatant milking for a mid-range die with a 20-30% improvement at most.


----------



## Travieso

Quote:


> Originally Posted by *Nickyvida*
> 
> It's pretty much mainstream performance in the Ars Technica piece, since the high-end GPUs were name-dropped as off the table before the mention of mainstream. The inclusion of the high-end gaming notebook segment nails it down firmly; I don't really think it's about price, tbh. They've pulled out of the high-end desktop segment.
> 
> Pretty disappointing if true, and it's what allowed Nvidia to quote a $600 tag for the x80; it doesn't matter whether it's a Founders Edition or not. The very notion of an x80 at $600 is blatant milking for a mid-range die with a 20-30% improvement at most.


I think everyone knows that P10 cannot surpass the GTX 1080's performance considering its die size alone, unless AMD has Intel's R&D department and can squeeze every last bit of performance from that tiny die.

And targeting gaming notebooks doesn't mean weak performance. I mean, GK104 also went into notebooks, and it was considered a high-end chip until GK110 came out some 8-10 months later.


----------



## Tojara

Quote:


> Originally Posted by *Travieso*
> 
> I think everyone knows that P10 cannot surpass the GTX 1080's performance considering its die size alone, unless AMD has Intel's R&D department and can squeeze every last bit of performance from that tiny die.
> 
> And targeting gaming notebooks doesn't mean weak performance. I mean, GK104 also went into notebooks, and it was considered a high-end chip until GK110 came out some 8-10 months later.


It doesn't solely depend on R&D, but the fact remains that with more resources it's unlikely the 1080 will lose to a max-250mm² die from AMD, even with a process disadvantage. Smaller chips having far better perf/mm² has happened fairly frequently in the past; the only issues have generally been slightly lower power efficiency and not being able to scale the chips up from there.

A 250mm² Polaris 10 at ~980 Ti performance, $300, and a 150W TDP would be a great GPU all around, even once Pascal GPUs are out. Double that to a 500mm² Vega and it would be an absolute monster, 30-50% over the GTX 1080.


----------



## speedyeggtart

Quote:


> Originally Posted by *Nickyvida*
> 
> GTX 1070 faster than a Titan X.
> 
> Meanwhile at AMD...
> 
> 2.5x mainstream perf/watt efficiency bullcrap. Nothing to challenge the x70 and x80. As a result the 1080 now sits at $600, a whopping $100 increase. An overpriced mid-range GP104 die for $600. And an x70 almost breaking the $400 mark. Who knows what the x80 Ti will retail for now?
> 
> Thanks a lot, Raja!


It seems like Nvidia is pitching the same thing in their graph.

*The Y axis = relative gaming performance, while the X axis = wattage.
Nvidia is pitching "New King" in terms of perf/watt efficiency.*
So Nvidia is doing the same thing. Better than Titan only in VR using their "VRWorks", and the only chart showing real gaming performance compared it to the 980. The chart doesn't even specify the resolution and settings. *The gaming performance chart also talks about speed and power efficiency = no FPS vs FPS stated.*

Maybe it's at or close to MSI Lightning GTX 980 Ti performance vs a GTX 980 in that chart, in terms of FPS?

So they're pricing the GTX 1080 close to GTX 980 Ti pricing, in line with that near-identical performance of the GTX 980 Ti Lightning vs a vanilla GTX 980?


----------



## gamervivek

Quote:


> Originally Posted by *KarathKasun*
> 
> I'd laugh if there was another, larger chip and P10 was misdirection. It's not likely, though, and honestly it's good business to target the largest market when your competition is focusing on a much smaller one.


Shipping manifests to India showed three new graphics cards from AMD, priced at 48k, 62k and 111k INR. The 48k one was Baffin XT, or Polaris 10 at its best. That said, those chips show up quite a while before release; P10 is said to be in its early stages now, according to some Chinese forum.


----------



## Nickyvida

Quote:


> Originally Posted by *Travieso*
> 
> i think everyone know that P10 cannot surpass GTX1080's performance considering only its die size unless AMD has Intel's R&D department so that they can squeeze every performance from that tiny die.
> 
> and targeting at gaming notebook doesn't mean that it has weak performance. i mean GK104 were also put in notebook and they're considered high-end chips until GK100 came out like 8-10 months later.


Well, they could have gone with a slightly bigger die? Cost, yeah, I know, but if that was the only way to reach or better the rumored 980 Ti performance, it would have been for the best.
Quote:


> Originally Posted by *Tojara*
> 
> It doesn't solely depend on R&D, but the fact remains that with more resources it's unlikely the 1080 will lose to a max-250mm² die from AMD, even with a process disadvantage. Smaller chips having far better perf/mm² has happened fairly frequently in the past; the only issues have generally been slightly lower power efficiency and not being able to scale the chips up from there.
> 
> A 250mm² Polaris 10 at ~980 Ti performance, $300, and a 150W TDP would be a great GPU all around, even once Pascal GPUs are out*. Double that to a 500mm² Vega and it would be an absolute monster, 30-50% over the GTX 1080*.


You're forgetting the GTX 1080 Ti and the Titan Pascal.

AMD will have to compete with those by the time Vega releases. Serious and overpriced competition there, y'know.

Things aren't looking good for AMD, as much as I hate to say it.


----------



## Tojara

Quote:


> Originally Posted by *Nickyvida*
> 
> Well, they could have gone with a slightly bigger die? Cost yeah i know, but if that was the only way to extract or better the rumored 980ti performance, it would be for the best.
> You're forgetting the GTX 1080ti and the Titan Pascal.
> 
> AMD will have to compete with them when they release Vega by then. Serious and overpriced competition there, y'know.
> 
> Things aren't looking good for AMD as much as i hate to say it.


Nvidia will very likely win the first round in performance, but that's not saying much when AMD isn't addressing the enthusiast segment at all. I suspect big Pascal will (again) be another ~30% over the smaller die, like the last two Titans, and as it is Nvidia isn't far ahead in perf/mm2, so matching it on a better process shouldn't be a big issue. If anything, Vega should be closer to big Pascal than Fiji was to big Maxwell, as long as the scaling isn't as terrible as it was from Hawaii to Fiji.


----------



## Travieso

Quote:


> Originally Posted by *Nickyvida*
> 
> Well, they could have gone with a slightly bigger die? Cost yeah i know, but if that was the only way to extract or better the rumored 980ti performance, it would be for the best.
> You're forgetting the GTX 1080ti and the Titan Pascal.
> 
> AMD will have to compete with them when they release Vega by then. Serious and overpriced competition there, y'know.
> 
> Things aren't looking good for AMD as much as i hate to say it.


Things wouldn't be looking bad if they'd just put a chip in between Polaris and Vega.

Maybe something like a "Sirius 10" with a 350 mm2 die and +30% performance vs. the Fury X; that might work.

But they didn't; they let the gap between Polaris and Vega get too big. I don't know why. It's like they just gave up on this segment (and no, I don't believe in any Polaris 10 magic that lets it compete with GP104 with a 33% smaller die).


----------



## Tojara

Quote:


> Originally Posted by *Travieso*
> 
> but they didn't, they let the gap between Polaris and Vega be too big. i don't know why, it's like they just gave up this segment (nope i don't believe in magic of Polaris 10 that it can compete with GP104 with 33% smaller die size).


The weirdest thing is that we don't know what GP104 even is. The 1080 is so far ahead of the 1070, and uses a different memory type, that it's very likely a different die entirely.


----------



## n64ADL

Does anybody know when AMD is going to announce a launch date for Polaris? Is that going to be sometime in late May?


----------



## TheLAWNOOB

Quote:


> Originally Posted by *n64ADL*
> 
> Does anybody know when AMD is going to announce a launch date for Polaris? Is that going to be sometime in late May?


They might reveal it at the beginning of June and launch later.

They've only promised a release before September.


----------



## CalinTM

After seeing the 1070's price, what the title claims seems possible...


----------



## toddincabo

Did someone beat on you as a child with an AMD product


----------



## toddincabo

Quote:


> Originally Posted by *Nickyvida*
> 
> 980ti is so last year. People can just get cutprice 980tis instead of spending moar on the polaris
> 
> Sad they have given up on the enthusiast market. Zen will probably be this way too.


That AMD child beating post was for this person.


----------



## Gilles3000

Quote:


> Originally Posted by *toddincabo*
> 
> That AMD child beating post was for this shill.


You can edit your previous post, no need to double post.


----------



## toddincabo

Quote:


> Originally Posted by *Gilles3000*
> 
> You can edit your previous post, no need to double post.


Thanks. My edit box wasn't showing for some reason. I just wanted to clarify.


----------



## Nickyvida

Quote:


> Originally Posted by *toddincabo*
> 
> That AMD child beating post was for this person.


Yeah, that must be why I have AMD products such as APUs in three other PCs, as well as the 390 currently in my main PC?

Obviously thou shalt not speak ill of AMD here. Not even warranted criticism.


----------



## EightDee8D

Quote:


> Originally Posted by *Nickyvida*
> 
> Yeah, the reason why i have AMD products such as APUs in three other pcs as well as the 390 in my pc currently?
> 
> Obviously thou shalt not speak ill of AMD here. Not even warranted criticism.


People assume you're a hater, or a fanboy of the other camp, just from one post. It happened to me too. It seems people don't like reading about weaknesses, or the truth, regarding their so-called "camp".


----------



## caswow

Neither Nvidia nor AMD is selling its new cards yet, and people "criticize" AMD for not saying anything? I mean, Nvidia told us "a lot", but where are the cards? People were mad at AMD for their "paper launches", but now, with no new cards on the market except JHH's PR bullcrap, people are already "criticizing" AMD.

How about we all wait for benchmarks, and then smack-talk whoever you want.


----------



## Nickyvida

Quote:


> Originally Posted by *caswow*
> 
> neither nvidia nor amd is selling its new cards and people "criticize" amd for not saying anything? i mean nvidia told us " a lot " but where are the cards? people were mad at amd for their "paperlaunches" but now even when no new cards are on the market except jhh pr bullcrap people already "criticizing" amd
> 
> how about we all wait for benchmarks and then smacktalk whoever you want.


It doesn't matter. Polaris won't even come close when it's only aimed at the mainstream segment, meaning the 480X and below. The die size already gave that away.


----------



## kaosstar

It seems there's always a sort of "latent" bias against AMD. Maybe it's just from all the built up skepticism caused by consistently overpromising and underdelivering. It's gone a little overboard, and there seems to be a double standard building.

I probably don't need to remind you all of the fact that, upon release of the Hawaii GPU, suddenly "power efficiency" became one of the most important metrics to so-called hardware enthusiasts.


----------



## Diogenes5

Remember guys, you can block members if they are obvious viral marketers or fanboys. One poster in particular makes nothing but stupid, childish claims without citation.

I am not happy with GTX 1070 pricing. I'm not paying $400 for a card that still uses GDDR5. And 1070s will probably sell well above list price for months after launch, just like the 970 did, because Nvidia likes to constrain supply and maximize revenue. Great for Nvidia's bottom line; bad for the consumer.

I hope Polaris 10 has a better price-to-performance ratio, with GDDR5X at least, or I will be disappointed. Nvidia is slowly pushing the "mainstream" pricing of its xx70 line higher and higher.


----------



## Lee Patekar

I can't help but notice Nvidia demonstrated an overclocked GTX 1080 running at 2.1 GHz... and that makes me wonder whether all their claims about twice the performance of the Titan X are based on the demonstration unit or on actual production units. I don't know about the rest of you, but I'm waiting for third-party benchmarks of both Nvidia's cards and whatever AMD releases before buying anything. So far all we've gotten is marketing hype.

That being said, I absolutely loved the multi-monitor improvements Nvidia demonstrated. I wonder if AMD has something similar?


----------



## ebduncan

Quote:


> Originally Posted by *Nickyvida*
> 
> Polaris is a disappointment imo. Only replacements for the 480x, 470 and below being touted around by Roy. Nothing about a 490 or Fury replacement. And the 480x ain't much of an improvement, if early benchmarks are to be believed.
> 
> AMD is giving Nvidia free rein over the high-end market, free to overcharge


We really have no idea what to expect. Everyone is hyped about the 1080; I'm not, since it's likely not that much faster than the 980 Ti in actual game benchmarks.

Quote:


> Originally Posted by *spyshagg*
> 
> I have to agree as well.
> 
> Fury sales will be decimated by pascal existence, but the real losers here are the buyers who will spend small fortunes on their inflated high end nvidia cards.


Something tells me that AMD is going to release a GDDR5X version of Polaris 10 that will compete with the 1080. Supposedly the cut-down version of Polaris 10 at 800 MHz is about as fast as the 390. If clock speeds are anything like Nvidia's, then that's going to be one fast card.


----------



## Ultracarpet

Quote:


> Originally Posted by *Nickyvida*
> 
> It doesnt matter. Polaris wont even come close when its only aimed at the mainstream segment. Meaning 480x and below. The die size already gave that away.


AMD's dies, even at performance parity, are usually smaller than Nvidia's, with a more densely packed design.

In regards to this rumour: with Nvidia pricing the 1070 in the low $300s, AMD is going to have to match that price and beat the 1070 by a substantial margin (which probably won't happen; I'm guessing 5-10% max). That's the sad part, because if it only matches or slightly beats the 1070 for the same price, 8 out of 10 people are going to go Nvidia, especially after the success of the 970 (don't even try to say the memory thing made a dent; market share is through the roof and the 970 is one of their best-selling cards of all time).

Market share will jiggle a few percent, but this is not going to be anything major in my mind. I hope I'm wrong.


----------



## Forceman

Quote:


> Originally Posted by *ebduncan*
> 
> Something tells me that AMD is going to release a GDDR5X version of polaris 10, and will compete with the 1080. Supposedly the cut down version of polaris 10 at 800mhz is about as fast as the 390. If clock speeds are anything like Nvidia's then that's gonna be one fast card.


If you are talking about that leaked benchmark from a few days ago, it was about 40% slower than the 390. The onscreen score (that was even with the 390) is capped at 60 fps.


----------



## BulletBait

Quote:


> Originally Posted by *Diogenes5*
> 
> I am not happy with GTX 1070 pricing. I'm not paying $400 for a card that still uses GDDR5. And 1070's will probably be going well above street price just like the 970 did for months after launch because Nvidia likes to constrain supply and maximize revenue. Great for Nvidia's bottom line. Bad for the consumer.


This x1000. IF the performance gains are anywhere near what people are projecting onto it, I don't see them selling it at half the price of the Titan (edit: for 1080s, and half the price of the 980 Ti for the 1070). I really think the hype train on this one is going to lead to massive disappointment. They may have an "MSRP", but I'm really expecting them to restrict supply like crazy and profit from the insane pricing that follows. I still remember the 290 mining boom; thankfully I bought mine before it, but that pricing, from inflated demand rather than restricted supply, was bad enough.

I also read through the whole "unveil" thread, being the masochist I am, and came here for some solace from all the AMD bashing. That seems like a forlorn hope (I knew it would be).


----------



## Sand3853

Quote:


> Originally Posted by *Forceman*
> 
> If you are talking about that leaked benchmark from a few days ago, it was about 40% slower than the 390. The onscreen score (that was even with the 390) is capped at 60 fps.


The only problem with that leaked benchmark is no one knows what P10 chip was being used. Even the site that leaked the chip said as much. They speculated that it could even be the mobile version of P10, and some others have wondered if it is more likely the new chip to go into the refreshed consoles.

I think it is safe to say that we have very little knowledge of what Polaris 10 will really be like, other than some very contradictory reports and some leaked benches that are suspect at best. I am cautiously optimistic that the lack of information regarding Polaris is an indication that AMD is trying to keep things close to the chest, and just might surprise us. Either way, there should be some nice options for an upgrade later this year.


----------



## ebduncan

Quote:


> Originally Posted by *Forceman*
> 
> If you are talking about that leaked benchmark from a few days ago, it was about 40% slower than the 390. The onscreen score (that was even with the 390) is capped at 60 fps.


Even so, that sample was clocked at 800 MHz. If it clocks to 1080 levels stock (1600 MHz), you could realistically expect nearly 100% more performance than that benchmark shows.

Too little is known about Polaris 10; we need more info. Sadly we have to wait until June, but hey, that's when the Nvidia cards will hit the shelves anyway.
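A quick back-of-the-envelope for that claim, assuming performance scales linearly with core clock (it rarely does in practice; memory bandwidth and other bottlenecks usually cap the gains, and every figure here is a rumour or a guess):

```python
# Naive linear clock-scaling estimate for the leaked Polaris 10 sample.
# All numbers are rumoured/hypothetical; linear scaling is a best case.
leaked_clock_mhz = 800     # clock of the leaked engineering sample
retail_clock_mhz = 1600    # "if it clocks to 1080 levels stock"
leaked_perf_vs_390 = 0.60  # the leak ran ~40% slower than an R9 390

scaling = retail_clock_mhz / leaked_clock_mhz
estimated_perf_vs_390 = leaked_perf_vs_390 * scaling
print(f"Estimated performance vs. R9 390: {estimated_perf_vs_390:.2f}x")  # → 1.20x
```

Even this optimistic math only lands the card around 20% ahead of a 390, which is why the "near 980 Ti" reading of the rumour needs the leaked clock or score to be wrong.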


----------



## Forceman

Quote:


> Originally Posted by *Sand3853*
> 
> The only problem with that leaked benchmark is no one knows what P10 chip was being used. Even the site that leaked the chip said as much. They speculated that it could even be the mobile version of P10, and some others have wondered if it is more likely the new chip to go into the refreshed consoles.
> 
> I think it is safe to say that we have very little knowledge of what Polaris 10 will really be like, other than some very contradictory reports and some leaked benches that are suspect at best. I am cautiously optimistic that the lack of information regarding Polaris is an indication that AMD is trying to keep things close to the chest, and just might surprise us. Either way, should be some nice options for upgrade later this year


I'm not optimistic, considering AMD themselves said that it isn't targeting the same market as 1070/1080. It'll be cheaper, probably, but I don't want 390X/Fury speed at $300, I want 980 Ti +20% for $500. Doesn't seem like P10 or 1070 is going to scratch that itch.


----------



## BulletBait

Quote:


> Originally Posted by *Forceman*
> 
> I'm not optimistic, considering AMD themselves said that it isn't targeting the same market as 1070/1080. It'll be cheaper, probably, but I don't want 390X/Fury speed at $300, I want 980 Ti +20% for $500. Doesn't seem like P10 or 1070 is going to scratch that itch.


I'm cautiously optimistic for the market segment they're shooting for. That said, I already knew I'd be waiting for Vega (I was excited back when it was "Greenland") since "they" "delayed" it; you can never trust the rumor mill on release dates anyway. It's pretty much pushed me toward building a whole new rig with Zen and Vega, instead of piecemeal like I originally planned.

I also think AMD is being smart about their release schedule (unlike what other people think). They're on the back foot right now and can't drop something only to get one-upped two weeks later. They need to play the response game: once they see what nV's got, they can go back, tweak a little if they need to, and come out at least on par, if not a titch better.

I also think that if AMD cards keep outperforming in DX12, and devs actually shift over to the API, it will greatly shrink the gap between them and nV.


----------



## Forceman

Quote:


> Originally Posted by *BulletBait*
> 
> I'm cautiously optimistic for the market segment they're shooting for.


True enough, unfortunately it's just not the market segment I am in.


----------



## Diogenes5

Quote:


> Originally Posted by *Forceman*
> 
> I'm not optimistic, considering AMD themselves said that it isn't targeting the same market as 1070/1080. It'll be cheaper, probably, but I don't want 390X/Fury speed at $300, I want 980 Ti +20% for $500. Doesn't seem like P10 or 1070 is going to scratch that itch.


But you pay such a huge premium for it. Both GPU makers have mainstream cards that cost around $300; then for $150 more you get maybe 15% more performance, and for another $150 you get 10% more on top of that. As for looking for a $500 card to drop your money on... why would you buy before seeing what the other guy has to offer? Buying the top-end GPU always gets you burned if you're not careful. Just ask all the people who bought 980 Tis last week.
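Those tier numbers (the poster's illustrative figures, not real benchmarks) make the diminishing returns easy to see as performance per dollar:

```python
# Performance per dollar across the illustrative price tiers above.
# Prices and relative-performance values are hypothetical examples.
tiers = [
    ("mainstream", 300, 1.00),         # baseline card
    ("upper tier", 450, 1.15),         # +$150 for ~15% more performance
    ("top tier",   600, 1.15 * 1.10),  # another +$150 for ~10% on top
]
for name, price_usd, rel_perf in tiers:
    value = rel_perf / price_usd * 100  # relative performance per $100
    print(f"{name:10s} ${price_usd}: {value:.2f} perf per $100")
```

Each step up buys strictly less performance per dollar, which is the whole argument for waiting to see the competing mainstream card before spending $500.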

Quote:


> Originally Posted by *BulletBait*
> 
> This x1000, IF the performance gains are anywhere near what people are projecting onto it, I don't see them selling it at half price of the Titan (edit: for 1080s, and half price 980ti for the 1070), I really think the hype train on this one is going to lead to a massive disappointment. They may have an 'MSRP,' but I'm really expecting them to restrict the heck out of it and take kickbacks from the insane pricing that follows. I still remember the 290s mining boom, thankfully I bought mine before it, but that price from inflated demand instead of restricted supply was bad enough.
> 
> I also read through the whole 'unveil' thread, being the masochist I am, and came here for some solace to all the AMD bashing. That seems like a forlorn hope (I knew it would be).


Quote:


> Originally Posted by *BulletBait*
> 
> I'm cautiously optimistic for the market segment they're shooting for. That said, I already knew I'd be waiting for Vega (was excited back when it was 'Greenland') since 'they' 'delayed' it, you can never trust the rumor mill on release dates anyways. It's pretty much pushed me to building a whole new rig with Zen and Vega instead of piecemeal like originally planned.
> 
> I also think AMD's being smart about their release schedule (unlike what other people think). They're on the back foot right now and can't drop something only to get one-upped immediately two weeks later. They need to play the response game right now, once they see what nV's got, they can go back, tweak a little if they need to, and come out at least on par if not just a titch better.
> 
> I also think that if AMD cards keep outperforming in DX12 and devs actually shift over to the API it will greatly shrink the gap between them and nV.


There's not a game out there demanding that I upgrade right now. VR is the next big pusher of graphics, but there aren't many killer apps for it yet either. I can afford to wait. I hope AMD pushes things in both the CPU and GPU space: Nvidia needs competition for all its expensive tech (I'm looking at you, G-Sync), and Intel has basically been re-releasing the i5-2500K every year and calling it a day.

My hope for Polaris rests on AMD beating out Nvidia for a bunch of contracts to supply Apple, Sony, and Nintendo. I know that only indicates their tech is relatively power-efficient and cheap on the low-end side of things. But considering they are on a slightly smaller manufacturing node (14nm vs. 16nm), I'm hoping the stars align and they can get something reasonable out at ~$300. AMD has usually gotten my business the last few years because Nvidia keeps trying to get gamers to spend $350-$400 on cards, while I can usually get the same performance for $250-$300 from AMD instead. Here's hoping history repeats itself, only this time with AMD also competitive in performance/watt as well as performance/$, because I don't like having a space heater in my room.


----------



## Sand3853

Quote:


> Originally Posted by *Forceman*
> 
> I'm not optimistic, considering AMD themselves said that it isn't targeting the same market as 1070/1080. It'll be cheaper, probably, but I don't want 390X/Fury speed at $300, I want 980 Ti +20% for $500. Doesn't seem like P10 or 1070 is going to scratch that itch.


AMDRoy made one comment in one interview and everyone takes it as gospel. It could turn out to be true, or just as easily false, considering his track record. I have yet to see an official statement from AMD one way or the other. All we can do is wait for the official launch and the benchmarks/reviews.


----------



## Travieso

Well, the only thing we all know is the die size, which suggests it's not likely to compete with Nvidia's counterpart (at least not against the full chip).


----------



## BulletBait

Quote:


> Originally Posted by *Diogenes5*
> 
> There's not a game out there that's demanding that I upgrade now. VR is the next big pusher of graphics but there aren't many killer apps for that yet either. I can afford to wait. I hope AMD pushes things in both the CPU and GPU space. Nvidia needs some competition for all its expensive tech (I'm looking at you gsync) and Intel has been rereleasing the i5 2500k every year and calling it a day.
> 
> My hope for Polaris being better is AMD beating out Nvidia on a bunch of contracts to supply Apple, Sony, and Nintendo with cards. I know that only indicates that their tech is relatively power efficient and cheap on the low-end side of things. But considering they are on a slightly lower NM manufacturing node (14nm vs 16nm), I'm hoping that the stars align and they can get something reasonable out at ~$300. AMD's usually gotten my business the last few years because Nvidia keeps trying to get gamers to spend $350 and $400 for cards and I can usually get the same performance at $250-$300 from AMD cards instead. Hoping history repeats itself only this time AMD is also competitive in performance/watt as well as performance/$ because I don't like having a space heater inside my room.


Yeah, that's one of the major reasons I didn't swap to Intel, besides Vishera finally hitting its stride with today's broader multi-thread support. I've never really needed a reason to upgrade; even with slightly weaker IPC, I've been able to compensate with a massive overclock.

What, no Microsoft?

Yes, I've passed on nV specifically because of their pricing and proprietary crap. With Intel it used to be more about their marketing practices, but now it's more about their, I'm going to call it "predatory", pricing as well.

I also rather enjoy my AMD furnace whenever winter rolls around here. It cuts down on my heating bill, which offsets the higher power consumption XD.


----------



## Forceman

Quote:


> Originally Posted by *Diogenes5*
> 
> But you pay such a huge premium for it. Both GPU makers have their mainstream cards that cost around $300 and then for $150 more you get like 15% more performance and then another $150 you get 10% more performance on top of that. As for looking for a $500 card to drop your money on ... why would you adopt before seeing what the other guy has to offer? Buying the top end GPU always gets you burned if you're not careful. Just ask all the people who bought 980 TI's last week


Sure, but if I were happy with Fury performance, I'd just get a Fury. I want something more than that to make upgrading my 290X worthwhile, and I'm willing to pay $500 for it. If AMD isn't able or willing to offer that this summer, then I (and others in my position) will have to decide whether to wait until next year (again) or jump to Nvidia. The bottom line is that I'm ready for more performance, but it doesn't look like AMD is ready to provide it, and Nvidia is charging too much for it.

If you play the "$x for y performance" game, you end up with a 750 Ti. I have a performance floor I want, and I'm willing to pay what I consider to be a reasonable price for that level of performance. Hopefully I'll find that this summer.
Quote:


> Originally Posted by *Sand3853*
> 
> AMDRoy made 1 comment in 1 interview and everyone takes it as gospel. It could turn out to be true, and just as equally false, considering his track record. I have yet to see an official statement from AMD saying one way or the other. All we can do is wait for the official launch and benchmarks/reviews.


Assuming the die size is correct at 232 mm^2 (which I'm not convinced of), then it kind of fits. That, and their relentless comments about perf/watt and "bringing VR to the mainstream". The 1070/1080 aren't mainstream.
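For a sense of why that die size drives the skepticism, here is the jump in performance per unit area the rumour would imply (the 232 mm^2 figure is itself a rumour, and GM200's 601 mm^2 is on an older process, so this is only a rough illustration):

```python
# What "980 Ti performance from a 232 mm^2 die" would imply in perf/mm^2.
# 601 mm^2 is GM200 (the 980 Ti die); 232 mm^2 is the rumoured Polaris 10 size.
gm200_area_mm2 = 601
polaris10_area_mm2 = 232  # rumoured, not confirmed

# Assuming equal performance, the perf/mm^2 ratio is just the area ratio.
implied_gain = gm200_area_mm2 / polaris10_area_mm2
print(f"Implied perf/mm^2 gain: {implied_gain:.2f}x")  # → 2.59x
```

A ~2.6x density jump in a single generation, even with a full node shrink, is why many in the thread read the rumour as "close to" rather than "matching" 980 Ti performance.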


----------



## edmwxyz

Hmm... looking at the rumours about Polaris performance, it seems that Polaris 10 won't even match the GTX 1070. This is really quite disappointing.


----------



## BulletBait

Quote:


> Originally Posted by *edmwxyz*
> 
> Hmm....looking at the rumours about Polaris performance, it seems that Polaris 10 won't even match GTX1070....... This is really quite disappointing.


How many times does this need to be repeated in this thread...
Quote:


> Originally Posted by *Sand3853*
> 
> The only problem with that leaked benchmark is no one knows what P10 chip was being used. Even the site that leaked the chip said as much. They speculated that it could even be the mobile version of P10, and some others have wondered if it is more likely the new chip to go into the refreshed consoles.
> 
> I think it is safe to say that we have very little knowledge of what Polaris 10 will really be like, other than some very contradictory reports and some leaked benches that are suspect at best. I am cautiously optimistic that the lack of information regarding Polaris is an indication that AMD is trying to keep things close to the chest, and just might surprise us. Either way, should be some nice options for upgrade later this year


Geez, you guys get your panties in a wad over a paper launch and come out of the woodwork to make unsubstantiated claims.

Can we please leave the AMD bashing to the nV threads for at least a day?


----------



## TheLAWNOOB

Quote:


> Originally Posted by *Diogenes5*
> 
> Remember guys, you can block members if they are obvious viral marketers or fanboys. One poster in particular makes nothing but stupid, childish claims without citation.
> 
> I am not happy with GTX 1070 pricing. I'm not paying $400 for a card that still uses GDDR5. And 1070's will probably be going well above street price just like the 970 did for months after launch because Nvidia likes to constrain supply and maximize revenue. Great for Nvidia's bottom line. Bad for the consumer.
> 
> I hope Polaris 10 has a better price to performance ratio with GDDR5x at least or I will be disappointed. Nvidia is slowly pushing "mainstream" pricing of its xx70 line up higher and higher.


Don't think Polaris has GDDR5X.


----------



## Mygaffer

Quote:


> Originally Posted by *Cakewalk_S*
> 
> Cool cool...all is well...EXCEPT I REALLY want to see power consumption numbers. I can deal with the cooling with some CLU but I REALLY REALLY hope AMD can reduce their power consumption like there's no tomorrow.


They are continuing to pursue small dies with Polaris, so power consumption and heat should definitely be low.


----------



## BulletBait

Quote:


> Originally Posted by *Mygaffer*
> 
> They are continuing to pursue small die with Polaris so power consumption and heat will definitely be low.


Wasn't there a whole bunch of talk about P11 likely ending up around 50-75W and P10 at a 125-150W TDP? At least for the initial release (assuming no larger-die versions). They even showcased it against a GTX 950 (not the greatest comparison, I admit), nearly halving the system's power draw: 84W vs. 140W.


----------



## variant

Quote:


> Originally Posted by *BulletBait*
> 
> Wasn't there a whole bunch of talk about P11 likely ending up around 50-75W and P10 being 125-150W TDP? At least for initial release (assuming no larger die versions). They even showcased it against a GTX 950 (not the greatest comparison I admit) that almost halved its requirements with 84W vs 140W.


The only thing we know is that a full system with Polaris 11 was shown at CES drawing ~85W. There's a rumor that Polaris 10 will have a TDP of up to 175W, but will actually draw less than that.


----------



## Sand3853

Quote:


> Originally Posted by *BulletBait*
> 
> How many times does this need to be repeated in thread...
> Geez, you guys get your panties all in a wad from a paper launch and come out of the woodwork to make unsubstantiated claims.
> 
> Can we please leave the AMD bash to the nV threads for at least a day?


I make unsubstantiated claims? Maybe you should go back and read exactly what I wrote...

EDIT: NM... got the quote order backwards...lol...realized how what I wrote was being used.


----------



## BulletBait

Quote:


> Originally Posted by *variant*
> 
> The only thing we know is that the full system with Polaris 11 was shown at CES running at 85W~. There's a rumor that the Polaris 10 will have a TDP of up to 175W, but will actually use less than that.


Ah, I didn't know which one they "showcased"; from all the month-old articles about it, I didn't think anyone else knew either. Also, if it's going to have a "max" TDP of 175W, I could foresee it landing around the 1070 at minimum. Color me cautiously optimistic still.


----------



## variant

Quote:


> Originally Posted by *BulletBait*
> 
> Ah, I didn't know which one they 'showcased,' from all the month old articles about it, I didn't think anyone else knew which one either. Also, if it's going to have a 'max' TDP of 175, I could forsee it being around the 1070 minimum. Color me cautiously optimistic still.


If the rumored 175W TDP is true, it puts it in the same TDP category the 980 currently holds. The GTX 980M has a 125W TDP, whereas the GTX 980 has a 165W TDP. That fits with Polaris 10 going into high-end laptops. Whether it's also "mainstream" depends on price, not necessarily performance.


----------



## KGPrime

Quote:


> Originally Posted by *Nickyvida*
> 
> Things aren't looking good for AMD as much as i hate to say it.


This ridiculous statement has been thrown around for 16 years, yet AMD/ATI is still kicking. In fact, 2016-2017 will probably be the best years AMD has had in graphics since it acquired ATI.
They have the industry-standard adaptive sync, which even Intel, their mortal nemesis of 20 years, backs. FreeSync monitors galore. Polaris 10 will likely nip at the GTX 1070 for 30-40 dollars less, and they will have an answer for the 1080 Ti when it arrives, probably landing between 1080 and 1080 Ti performance for cheaper. AMD is going to be doing just fine.


----------



## BulletBait

Quote:


> Originally Posted by *Sand3853*
> 
> I make unsubstantiated claims? Maybe you should go back and read exactly what I wrote...


I read what you wrote; it came off as another whine post about how AMD is so bad and won't force nV to drop prices on their cards. Heck, I'm giving team green a critical eye on their performance claims right now: I think they've honestly been fluffed and won't live up to the gains they talked about last night. It's possible, but I'll WAIT before crying foul and throwing nV under the bus on an assumption.

There's so little information on Polaris besides a couple of bunk benchmarks and carefully staged demonstrations by AMD. That's it. Making an educated guess from what little official information there is would help; immediately going down the "whaa, team red is so bad" road when they've shown next to nothing does not, especially when it SEEMS you're basing that on bunk benchmarks that looked way too low to be a full-die P10, with a previous supposed P11 benchmark outperforming it.
Quote:


> Originally Posted by *variant*
> 
> If the rumors of a 175W TDP is true, it puts it in the same TDP category that the 980 currently holds. The GTX 980M is a 125W TDP whereas the GTX 980 has a 165W TDP. It fits with the Polaris 10 being in high end laptops. Whether it's also mainstream or not, is dependent on price, not necessarily performance.


Well... the 1080 is supposedly 180W and a massive leap over the 980/Titan. I realize the die size is also larger, but I don't see why AMD won't at least hit the 980 mark and beyond. The last "internal testing report" I saw claimed they've already hit 390 performance and surpass the 390X by "significant margins" in some cases. I assume they're still tweaking it and may improve those margins a bit. So it seems it may land somewhere between the 980 and 980 Ti, which is what they were shooting for anyway. I'd think the 1070 will drop in there as well; I just don't see it passing the Titan at that price point, and they didn't talk about it enough to give a solid performance estimate.


----------



## Buris

Quote:


> Originally Posted by *Forceman*
> 
> Sure, but if I was happy with Fury performance, I'd just get a Fury. I want something more than that to make it worthwhile to upgrade my 290X, and I'm willing to pay $500 for it. If AMD isn't able or willing to offer that this summer then I (and others in my position) are going to have to decide whether to wait until next year (again) or jump to Nvidia. The bottom line is that I'm ready for more performance, but it doesn't look like AMD is ready to provide it - and Nvidia is charging too much for it.
> 
> If you play the "$x for y performance" game, you end up with a 750 Ti. I have a performance floor I want, and I'm willing to pay what I consider to be a reasonable price for that level of performance. Hopefully I'll find that this summer.
> Assuming the die size is correct at 232 mm^2 (which I'm not convinced is the case) then it kind of fits. That and their relentless commenting about perf/watt and "bringing VR to the mainstream". 1070/1080 aren't mainstream.


The 1070 is pretty mainstream, the 1080 is "mid-high"; real dies come at the end of the year.


----------



## variant

Quote:


> Originally Posted by *BulletBait*
> 
> I read what you wrote, it came off as another whine post about how AMD is so bad and won't force nV to drop prices on their cards. Heck, I'm giving team green a critical eye on their performance gains right now, I think its honestly been fluffed and will not come out to that gains they talked about last night. It's possible, but I'll WAIT before yelling foul and throwing nV under the bus on an assumption.
> 
> There's so little information on Polaris besides a couple bunk benchmarks and carefully planned/staged demonstrations by AMD. That's it, so making an educated guess on what little official information there is instead of immediately going down the 'whaa, team red is so bad' even though they've shown next to nothing is not helping, when it SEEMS you're making that assumption off bunk benchmarks that looked way too low to be a full die P10 and had a previous supposed P11 benchmark outperforming it.
> Well... The 1080 is supposedly 180W and was a massive leap over 980/Titan. I realize the die size is also larger, but I don't see why AMD won't at least hit the 980 mark and beyond. Last I saw from an article about an 'internal testing report,' they've already hit the 390 and surpass the 390x by 'significant margins' in some cases. I assume they're still continuing to tweak it as well and may improve those margins a little bit. So it seems they may drop somewhere between the 980/980ti, which is what they were shooting for anyways. I'd think the 1070 is going to drop in there as well, I just don't see it passing the Titan personally for the price point and they didn't talk about it enough to give a solid performance estimate.


The problem with rumors is that even if true, it isn't clear what they mean exactly. Polaris 10 has two GPUs under its name, and probably multiple configurations of those two GPUs depending on the graphics card or laptop. Let's not forget the rumor that the PS4 GPU is also Polaris 10. You can change TDP just by clocking it higher or lower. A 390X-performance-at-$200 rumor could be just as true as the 980 Ti-at-$300 rumor; the former could be the cut-down and the latter the full version.


----------



## BulletBait

Quote:


> Originally Posted by *variant*
> 
> The problem with rumors is that even if true, it isn't clear what they mean exactly. Polaris 10 has two GPUs under its name, and probably multiple configurations of those two GPUs depending on the graphics card or laptop. Let's not forget the rumor that the PS4 GPU is also Polaris 10. You can change TDP just by clocking it higher or lower. A 390X-performance-at-$200 rumor could be just as true as the 980 Ti-at-$300 rumor; the former could be the cut-down and the latter the full version.


You're probably right. It all depends on the tests. I mean, if they're saying it's equal in DX12, that may be good next year, but it won't be a big boon this year and everyone will go, 'ho hum, same old AMD.' My 290Xs are perfectly fine and I'll keep the space heaters another 6-8 months, since that's about the Polaris performance estimate right now anyway. Then I can hop the full generation leap to Vega with a power decrease and performance increase.

Nice to make and see some educated speculation though. That 1080 thread was depressing the crap out of me.


----------



## KeepWalkinG

If the new Polaris flagship can reach 1500MHz or more, it may be possible to get performance close to the GTX 1080.
It has happened before that a smaller die matched or beat a bigger one.


----------



## variant

Quote:


> Originally Posted by *BulletBait*
> 
> You're probably right. It all depends on the tests. I mean, if they're saying it's equal in DX12, that may be good next year, but it won't be a big boon this year and everyone will go, 'ho hum, same old AMD.' My 290Xs are perfectly fine and I'll keep the space heaters another 6-8 months since that's about the Polaris performance estimate right now anyway. Then I can hop the full generation leap to Vega with a power and performance increase.
> 
> Nice to make and see some educated speculation though. That 1080 thread was depressing the crap out of me.


Well, the Pascal event was supposed to be a press event. What kind of press hoots and hollers at announcements? The 1080 being 2x the performance of the Titan X is still being perpetuated in places.
Quote:


> Originally Posted by *KeepWalkinG*
> 
> If the new Polaris flagship can reach 1500MHz or more, it may be possible to get performance close to the GTX 1080.
> It has happened before that a smaller die matched or beat a bigger one.


I wouldn't expect the Polaris 10 to match the 1080. Such a thing would be an amazing feat of engineering.


----------



## BulletBait

Quote:


> Originally Posted by *KeepWalkinG*
> 
> If the new Polaris flagship can reach 1500MHz or more, it may be possible to get performance close to the GTX 1080.
> It has happened before that a smaller die matched or beat a bigger one.


It's possible, but not probable. Cypress and Cayman were better than GF100 at a smaller size; GK104 was better than Tahiti at a smaller size. But like I said, it's very improbable given the performance goals AMD has already publicly stated.
Quote:


> Originally Posted by *variant*
> 
> Well, the Pascal event was suppose to be a press event. What type of press hoot and holler at announcements? The 1080 being 2x the performance of the Titan X is still being perpetuated in places.


Yeah. I'm just hoping for a HARD Polaris launch at Computex, or maybe a wee bit earlier, to match the 1080 or at least the 1070 dates. Even if nV's is a really limited release, since I guess aftermarket cards won't be seen until later in the summer, it would still be a massive PR win for team green.


----------



## Sand3853

Quote:


> Originally Posted by *BulletBait*
> 
> I read what you wrote, it came off as another whine post about how AMD is so bad and won't force nV to drop prices on their cards. Heck, I'm giving team green a critical eye on their performance gains right now, I think its honestly been fluffed and will not come out to that gains they talked about last night. It's possible, but I'll WAIT before yelling foul and throwing nV under the bus on an assumption.


I think you need to go back and re-read what I posted, in the context it was posted in. Not once did I say anything to even hint at AMD being so bad, or anything remotely close to a comment about nVidia and their prices.

If you actually go and look at my post, I point out that all the P10 'leaks' are suspect, and everyone crying that the sky is falling on AMD is premature, as those 'leaks' could be any number of P10 chips; the website reporting on said 'leak' offered the supposition that the P10 shown could just as easily be a mobile part. As far as I have been able to find, there have been three different 'leaks' for P10: one that is mere conjecture by journalists at a press event who saw it run Hitman at 1440p at a steady 60fps, one that shows a P10 scoring close to the Fury X/980 Ti in Fire Strike, and the most recent one showing rather poor performance, which most people have fixated on.

Now, if you had also read the second half of my post, I stated that we have no clue what is in store, as AMD has been really tight-lipped about everything. I honestly can't remember being this close to a possible announcement/launch of an AMD card with no more than a few obscure rumors/leaks to go on. It's because of this that I am cautiously optimistic, as hyped AMD products tend to be a huge letdown (FX, 'overclocker's dream'), while the few times they keep things quiet (R9 295X2) we are all surprised.

So, it remains to be seen what P10 really is; no matter what, it is shaping up to be an interesting couple of months.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Nickyvida*
> 
> From arstechnica
> Vague? I think it's clear as day, to be honest. Nothing for high-end cards. Pretty safe to say it's probably a fail even before it launches. 980 Ti performance is NOT cutting it for a die shrink and a new architecture after three years. Whatever happened to small dies beating big dies, like the 680?
> 
> And for the record, 980ti perf is so last year for a 14nm card. People can just buy second hand 980tis off the shelves and camp for Vega, where it'll be beaten roundly by an overpriced XX80Ti anyhow.


Most people I know consider GP104 "mainstream" as well, so it's not exactly confirmation of performance. But I'd expect all the Nvidia shills to run with that statement as far as they can and not even consider the possibility that Roy could simply have been trying to lull competitors into a false sense of security. Nah, not even possible at all, right?








Quote:


> Originally Posted by *Forceman*
> 
> Sure, but if I was happy with Fury performance, I'd just get a Fury. I want something more than that to make it worthwhile to upgrade my 290X, and I'm willing to pay $500 for it. If AMD isn't able or willing to offer that this summer then I (and others in my position) are going to have to decide whether to wait until next year (again) or jump to Nvidia. The bottom line is that I'm ready for more performance, but it doesn't look like AMD is ready to provide it - and Nvidia is charging too much for it.
> 
> If you play the "$x for y performance" game, you end up with a 750 Ti. I have a performance floor I want, and I'm willing to pay what I consider to be a reasonable price for that level of performance. Hopefully I'll find that this summer.
> Assuming the die size is correct at 232 mm^2 (which I'm not convinced is the case) then it kind of fits. That and their relentless commenting about perf/watt and "bringing VR to the mainstream". 1070/1080 aren't mainstream.


Keep in mind that AMD is going to enjoy the same clock speed advantages over 28nm as Nvidia, yet so many around here just assume that Polaris will be stuck with the same low clocking potential as Hawaii and Fiji. If GP104 is capable of 1800MHz, it's not unreasonable to expect at least an equal clock speed for Polaris 10, especially if it turns out to be an even smaller chip. That said, it's certainly possible that AMD will cede the flagship market to Nvidia and that a ~$300 Polaris performing similarly to a 980 Ti is what we get. In that case the card would obviously not be for you, but it would be perfect for the vast majority of people who buy GPUs. If it performs no better than the 390X, however, go ahead and stick a fork in AMD. I don't think even AMD believes that kind of release will fly, though...


----------



## inedenimadam

AMD Roy is haphazardly flamboyant; he knows no subterfuge.


----------



## Forceman

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Most people I know consider GP104 "mainstream" as well so its not exactly confirmation of performance. But I'd expect all the Nvidia shills to run with that statement as far as they can and to not even consider the possibility that Roy could simply have been trying to lull competitors into a false sense of security. Nah, not even possible at all right?


Well he starts the comment by saying the 970 and 290X were too small a market area, and then says Polaris is going for a larger market, so that strongly implies it is going to slot in below Hawaii and GM204's equivalent market segment. That may still end up being 980 Ti class, but it doesn't seem like they are trying to compete with GP104, but rather GP106.

If you could count on Crossfire working, two almost-980 Tis for $600 would be pretty nice though.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Forceman*
> 
> Well he starts the comment by saying the 970 and 290X were too small a market area, and then says Polaris is going for a larger market, so that strongly implies it is going to slot in below Hawaii and GM204's equivalent market segment. That may still end up being 980 Ti class, but it doesn't seem like they are trying to compete with GP104, but rather GP106.
> 
> If you could count on Crossfire working, two almost-980 Tis for $600 would be pretty nice though.


Again, it's still an implication from a notoriously foot-in-mouth Roy, not a CONFIRMATION of any sort. The way the quote is being used by green fanboys, however, is as if Roy flat-out said that Polaris 10 will only offer 390X-like performance (which is not at all what he said). At any rate, AMD is now fully aware of what they are facing in the 1080 and 1070, so they can plan accordingly. I just find it funny that all of a sudden Roy's statements are "etched in stone" around here, when everyone knows the guy is a walking, talking gaffe in human form...


----------



## ebduncan

Guys, stop comparing clock speeds between different architectures; it is pointless.

There isn't enough "official" info about Polaris to draw any real conclusions about how it will perform. Everyone is assuming a 232mm² die, and that's not even confirmed.

The only thing that can be confirmed is that P10 will be VR-capable, and the current minimum VR spec is an R9 290. AMD's slides claim 2.5x performance per watt, but don't say whether that's compared to Fiji or the 390X. Fiji is 300W, and 300 / 2.5 = 120 watts, so it could be Fury X performance at 120 watts. HBM saved a lot of power compared to Hawaii's 512-bit bus, so the numbers really are a wild guess.

I'd expect to see the top model polaris 10 somewhere between the 1080 and the 1070.
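That back-of-the-envelope math can be sketched out. Both inputs are rumored figures, not confirmed specs: 300W is Fury X's typical board power, and 2.5x is AMD's own marketing claim:

```python
# Rough perf/watt arithmetic from AMD's "2.5x performance per watt" slide.
# Both inputs are rumored/marketing numbers, not confirmed specs.
fiji_board_power_w = 300          # Fury X typical board power
claimed_perf_per_watt_gain = 2.5  # AMD's marketing claim

# If Polaris delivered Fury X performance at 2.5x the efficiency,
# the implied board power would be:
implied_power_w = fiji_board_power_w / claimed_perf_per_watt_gain
print(implied_power_w)  # 120.0
```

Of course, if the baseline is the 390X rather than Fiji, the implied wattage lands somewhere else entirely, which is the poster's point.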


----------



## Majin SSJ Eric

We still don't even really KNOW all that much about the 1080/1070 at this point, just that it's "confirmed" by Team Greensters that Polaris will have no chance against them at all (even though, as you say, we know NOTHING about Polaris)! I personally would be shocked if AMD has no answer for the 1080 this year whatsoever. In fact, the statement about AMD going after mainstream market share with Polaris doesn't exclude the existence of a larger, high-end chip at all. It could just be that they are confident the cheaper variants will offer enough performance at a lower price to increase their market share, not that nothing more expensive actually exists...


----------



## gr4474

Quote:


> Originally Posted by *Nickyvida*
> 
> Why would it not make sense? If AMD brings out a competitive product, I'd buy it. But as it stands, if Polaris is a stinker, why would I waste my hard-earned money? I'm not some super-hardcore enthusiast who can afford to punt several thousand going from Devil's Canyon to Skylake to Kaby Lake etc. with a 5% increase between releases. I want to stretch my dollar for something worthwhile, given it's my hard-earned savings coming into play. After waiting 3+ years through the whole 20nm saga and then finding out I'll have to wait another year just to get a decent increase in performance after Polaris turned out to be mainstream, I'm just pissed off that I have to wait, that's all. I'm never going back to Nvidia after my shoddy experiences with their cards dying in less than two years and their overpriced cards.
> 
> Calling me a professional troll is a bloody insult as a fellow enthusiast, elitist douchebag.
> 
> Is this enough proof for you that I own an AMD card? Or do I have to dismantle my case, take a pic, and scrounge up my receipt?


Ok boyz... I think he's ok. Do you think we should let him stay? LOL, nahhh. I'm hanging on to a 660 Ti, but soon to be sportin' a Polaris. I'm betting it will match a 1070 for $300.


----------



## SuperZan

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> We still don't even really KNOW all that much about the 1080/1070 either at this point, just that its "confirmed" by Team Greensters that Polaris will have no chance against it at all (even though, as you say, we know NOTHING about Polaris)! I personally would be shocked if AMD has no answer for the 1080 this year whatsoever. In fact, the statement about AMD going after mainstream market share with Polaris doesn't exclude the existence of a larger performance, high end chip at all. Could just be that they are confident that the cheaper variants will offer enough performance at a lower price to increase their market share, not that nothing more expensive actually exists...


Too much common sense for product reveal week! The hysteria is real; in some threads you're an AMD "fanboy" for not declaring Polaris DOA and praising Nvidia for releasing another "flagship" that's living on borrowed time from the jump-off. 1080/1070 made gains, credit where credit is due. People extrapolating that into Polaris being of no value to anybody are just a little too hyped from Jen-Hsun's performance graphs.


----------



## gr4474

Quote:


> Originally Posted by *DIYDeath*
> 
> What would you rather have?
> 
> $300 for an AMD 980 Ti, or $380 for a faux Titan X?
> 
> One of these things is not like the others...one of them just doesn't belong!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Polaris 10 would need to be priced somewhere between $200-$280 to compete, given what we know about the competition and the rumors about Polaris 10, which at this point I doubt are true.


I'll take the Polaris. I'm betting on AMD for DirectX12/Vulkan, and against Nvidia's gimping, hyping shady ways.









Edit: I'll confirm after benchmarks, and I'm not against going Nvidia if AMD flops. That's for those calling people idiots and fools for saying they want to buy something, assuming they won't know what they're getting.


----------



## tajoh111

AMD had better get their performance per transistor up, because I just realized this launch has so many parallels with the GM204/Tonga launch, and we remember how that launch turned out for AMD.

Some of the similarities include:

Same number of shaders: back then it was 2048 shaders for both cards; this time around it's 2560 for both companies.

The transistor and size advantages are also pretty close to last gen. When adjusted, Nvidia's die would be around 600mm² if it were still made on 28nm; AMD's is around 464-510mm². Nvidia had a 39mm² advantage and a 0.2-billion-transistor advantage during the Tonga/GM204 launch, so this time Nvidia's transistor lead ranges from slightly bigger in the best case to significantly larger.

The only difference is that AMD is making a 110-135W part and Nvidia is making a 180W part.

What this means is that if AMD made the same magnitude of improvement as Nvidia in pure performance, they are going to be in exactly the same position as now: a 1070 would be 30% faster than an uncut Polaris 10, which would put Polaris 10 performance between a 390 and a 390X. But this doesn't account for AMD's part consuming 66% of the power of Nvidia's part; if performance shrank along with that power consumption, it would actually land where the current 380X sits.

So to hit 390/390X-level performance, AMD needs the same increase in performance as Nvidia while also decreasing power consumption.

In numbers, AMD would need to increase performance by 65% while cutting power consumption by a third; that is, 2.5x performance per watt. That is basically the ideal case, which is very difficult to meet. Nvidia, for example, promised 2x performance per watt and delivered 1.5-1.65x. Companies exaggerate performance-per-watt claims, and AMD is likely to do the same, maybe landing at 1.8-2.0x. AMD did this with the Nano: they promised 2x performance per watt but delivered 1.6-1.7x.

What this ultimately translates into is 390-390X levels of performance at 140-150 watts. That's a realistic expectation given marketing exaggeration, the information they have given us, and the deficits of last generation. What AMD fans generally fail to realize is that Tonga is so far behind GM204 in performance per watt that reaching GTX 980 Ti levels of performance in a 120-140W chip would require AMD to basically triple their performance per watt, and a real triple, not a marketing-derived one.
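The perf/watt arithmetic in that post can be checked in one line; the 65% gain and the 66% power figure are the poster's own assumptions, not confirmed specs:

```python
# Perf/watt implied by the post's numbers: 65% more performance
# at roughly two-thirds the power. Both inputs are speculation.
perf_gain = 1.65     # 1.65x the old performance
power_ratio = 0.66   # consuming 66% of the old power

perf_per_watt_gain = perf_gain / power_ratio
print(perf_per_watt_gain)  # roughly 2.5x, matching AMD's claim
```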


----------



## variant

Quote:


> Originally Posted by *tajoh111*
> 
> AMD better get their performance per transistor rating up because i just realized this launch has so many parallel with the gm204/Tonga launch and we remember how that launched turned out for AMD.
> 
> Some of the similarities include.
> 
> Same amount of shaders, back then it was 2048 shaders for both cards, this time around they are 2560 for both companies.
> 
> Also the transistor and size advantages are pretty close like last gen. When adjusted, Nvidia die is around 600mm if it was still made on 28nm, AMD is around 464 to 510mm2. Nvidia had a 39mm2 advantage and a 0.2 billion advantage in transistor density during the tonga/gm204 launch. So Nvidia has a slightly bigger lead this time when it comes to transistor in a best case scenario to a significantly larger advantage.
> 
> The only difference is AMD is making a 110-135watt part and Nvidia is making a 180 watt part.
> 
> What this means is if AMD made the same magnitude of improvements as Nvidia as far as pure performance, they are going to be in the exact same position as now. Which means a 1070 would be 30% faster than a uncut polaris 10. Where this would put polaris 10 performance at is inbetween a 390 and 390x. However this does not take into account that AMD parts is going to consume 66% of the power Nvidia's part. This means performance if similarly shrunk down along with this power consumption would actually be where the current 380x occupies.
> 
> So what this means is to hit the 390/390x level of performance, AMD needs to get the same increase in performance as Nvidia, while decreasing power consumption.
> 
> In numbers what this means, AMD would need to increase performance by 65% while decreasing power consumption by 0.33. AKA 2.5x performance per watt. This is basically ideal which is very difficult to meet. Nvidia for example promised us 2x performance per watt while they delivered us 1.5-1.65x performance per watt. Companies exaggerate performance per watt claims and AMD is likely to do the same but maybe 1.8-2.0. AMD did the same with Nano(they promised 2x performance per watt but delivered 1.6-1.7) as far as their performance per watt claims.
> 
> What this ultimately translates into is 390-390x levels of performance for 140-150 watts. This is a realistic expectation given marketing exaggeration and what information they have given us and taking into account deficits of last generation. What AMD fans fail to realize generally is that Tonga is so far behind gm204 as far as performance per watt, that reaching gtx 980 ti levels of performance in 120-140 watt chip would need AMD to basically triple their performance per watt and a real one, not a fake marketing derived one.


I am sure you have lots of proof for all those "facts" you are stating, right?


----------



## Majin SSJ Eric

Not one single solid fact in his entire post; it's all based on assumptions, rumors, and innuendo. There has been no confirmation from AMD about die size, transistor count, clock speed, price, etc. But somehow we all know it's only going to be as fast as a 390X? Give me a break.


----------



## rtikphox

Just me, or did everyone expect Polaris to be some sort of Crysis 3 killer? Sorta like how the 3850/4870 and the 7970 murdered the Nvidia cards of their day. At least do 1.2x or 1.5x the 980 Ti.


----------



## tajoh111

Quote:


> Originally Posted by *variant*
> 
> I am sure you have lots of proof for all those "facts" you are stating, right?


Basically recent TechPowerUp charts at 2560x1440, plus the assumption of a 232mm² Polaris 10, a figure that has been examined multiple times, plus 28nm vs. 14nm FinFET being between 2 and 2.2 times denser. Mostly just calculations using those charts.

Plus, from what AMD has stated and what the CEO has said, Polaris 10 is a mainstream part, so I am most likely assuming correctly that this chip is meant to be priced at $350 and under from the get-go. Having a die as large as GP104 would put it in the same class as Tahiti, and that is not mainstream at all.

As I said many months ago, expect Polaris 10 to be the true successor to Pitcairn, which means a similar die size; the evidence for that keeps piling up as we get closer to launch. Nothing has indicated that the Polaris die size is 300mm²+; only the opposite has been stated, written, or rumored.

The parallels between Tonga/GM204 and Polaris 10/GP104 are definitely there, as there is a very good chance both products have the same number of shaders, and they likely share the same 256-bit bus.

The problem for AMD is that their performance per transistor is inferior to Nvidia's, which is well represented by the gulf between the GTX 980 and the 380X: 5.2 billion transistors vs. 5 billion respectively, and Nvidia's card is 51% faster. Or take the GTX 970 vs. the 380X, which is more representative of the upcoming 480X vs. 1070 matchup: 30% faster, with the GTX 970 consuming fewer watts to boot.

What the Tonga example shows is how much more work AMD needs to do to be competitive with Nvidia. AMD not only needs to keep up with the improvement Nvidia made with Pascal, they need to close the original gap in the first place. Making a 120-140W part that keeps up with the Titan X is something even Nvidia's Pascal architecture cannot achieve. Considering how far behind Maxwell Tonga was, closing that gap would require AMD to deliver twice the improvement, plus a bit more, over their previous architecture compared to Nvidia's Pascal upgrade. Thus the rumors of the 480X performing at 390/390X level have more weight and likelihood. Getting to 390X-level performance from the 380X would be a respectable 50-65% increase in performance, and lowering power consumption by 40-60W would be a respectable achievement; that would already be a bigger improvement than what Nvidia has shown. Nvidia is basically getting a 65% improvement while increasing power consumption by 10%.

https://www.techpowerup.com/reviews/ASUS/GTX_950/23.html

The Titan X is literally 2x as fast as a 380X. To match that while using 120-140 watts would require a 3x increase in performance per watt. Very unlikely.
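As a sanity check on that 3x figure, here is the same arithmetic with explicit inputs. The 2x performance target comes from the TPU chart cited above; the 190W figure is the 380X's official TDP, and the 120-140W range is the rumor, so treat all three as assumptions:

```python
# Required perf/watt multiplier to hit Titan X performance (~2x a 380X
# per the TPU chart) at the rumored power targets. 190W is the 380X TDP.
perf_target = 2.0
tonga_power_w = 190.0

for target_power_w in (120.0, 140.0):
    required = perf_target / (target_power_w / tonga_power_w)
    print(target_power_w, round(required, 2))  # ~3.17x and ~2.71x
```

So the "3x" in the post is roughly right at the low end of the rumored power range, and closer to 2.7x at the high end.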


----------



## EightDee8D

Quote:


> Originally Posted by *tajoh111*
> 
> Basically techpowerup charts + the assumption on the 232mm2 Polaris 10. Also 28nm vs 14nm finfet being between 2 to 2.2 times denser. Mostly just calculations using those charts.


But you forgot about the 980 having a huge frequency advantage (25-30%), plus more ROPs (64 vs. 32) and some DX11 driver overhead. Even with all that, the 980 is just 9% ahead of the 380X at 4K (4K to eliminate driver overhead). Apart from that, the R9 Nano matches the 980 Ti in perf/watt.


----------



## tajoh111

Quote:


> Originally Posted by *EightDee8D*
> 
> But you forgot about 980 having huge frequency advantage (25-30%) plus more ROPs ( 32vs64). and some dx11 driver overhead. and with that 980 is just 9% ahead of 380x on 4k ( 4k because for eliminating driver overhead). apart from that R9 nano matches 980ti in p/w.


What chart are you looking at? At 4K, the GTX 980 is 45% faster than a 380X, and 2560x1440 should be enough to eliminate most of the driver overhead issues. Besides, people buying $350 graphics cards are most likely going to be playing at 1080p, so I've already given AMD the benefit of the doubt by bumping the resolution up to 2560x1440.

Nvidia's architecture is designed to run at higher speeds, which is likely an advantage it will carry into the next generation; running at these speeds is not some accidental benefit they stumbled into. The ability to run at these frequencies while not using much power is a testament to Nvidia's engineering and a genuine technological advantage. If AMD could run at these frequencies without dramatically increasing power consumption, I'd recommend they do it, but that might require changes to GCN that would turn it into a new architecture altogether. That doesn't mean AMD doesn't have its own advantages; it is better at asynchronous compute.

AMD's $350 cards are not going to get HBM, which adds to performance per watt, and the Nano is binned to be a low-voltage, low-power part. If anything, it's a testament to Nvidia's current advantage that it takes binning, underclocking/undervolting, and HBM combined to match the performance per watt Nvidia gets across basically its entire Maxwell line.


----------



## 364901

Quote:


> Originally Posted by *tajoh111*
> 
> Plus from what AMD has stated, and what the CEO has said, Polaris 10 is a mainstream part. This I am most likely assuming correctly that this chip is meant to be priced under 350 and under from the get go. Having a die as large as GP104, would make it in the same class as tahiti. That is not mainstream at all.


Keep in mind that "mainstream" for both companies has been adjusted to the GTX 960, 970, and 980 for NVIDIA and the R9 380, 380X, 390, 390X for AMD. It's a constantly shifting definition for both companies, and the GTX 970 is about as mainstream as it gets going by the Steam hardware survey. NVIDIA is targeting that price point with a stellar GPU priced at $379. AMD will be aiming for the same price point, or lower, with similar amounts of performance.


----------



## EightDee8D

Quote:


> Originally Posted by *tajoh111*
> 
> What chart are you looking at. At 4k, the gtx 980 is 45% faster than a 380x. And 2560x1440 should be enough to eliminate most of the driver overhead issues. In addition people buying 350 dollar graphics cards are most likely going to be playing on 1080p so I already given AMD the benefit of the doubt by bumping the resolution up to 2560x1440.
> 
> Nvidia's architecture is designed to run at higher speed which is likely an advantage it will carry on to next generation, running it as these speeds is not some sort of accidental benefit they stumbled into. The ability to run at these frequencies while not using that much power is a testament to Nvidia's engineering and a technological advantage in this aspect. If AMD could run at these frequencies and not dramatically increase power consumption, I recommend they do it. But that might require changes to GCN that might turn it into a new architecture altogether.


https://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_XtremeGaming/23.html
At 4K:
980 = 64%
380X = 59%

64/59 = ~9% faster.

At 1440p it's 52% faster, which means there's still some driver overhead left. I don't care what people buy this GPU for; I mainly used the 4K numbers to remove the overhead problem, which Polaris is supposed to fix. That puts the 380X and 980 much closer too. Obviously I'm not saying Tonga is better or something.
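The percentage comparison in that post can be written out explicitly. A quick sketch, assuming TechPowerUp-style "relative performance" numbers (the 64/59 figures are the ones quoted above; the function name is mine):

```python
# Relative speedup from "performance relative to the fastest card"
# percentages, as used in TechPowerUp summary charts.
def relative_speedup(card_a_pct, card_b_pct):
    """How much faster card A is than card B, in percent."""
    return (card_a_pct / card_b_pct - 1) * 100

gtx_980 = 64   # % of the fastest card at 4K (from the post)
r9_380x = 59

print(f"{relative_speedup(gtx_980, r9_380x):.1f}% faster")  # ~8.5%
```

Note the ratio is 64/59, not 64-59: subtracting the percentages directly would understate the gap.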

Quote:


> The ability to run at these frequencies while not using that much power is a testament to Nvidia's engineering and a technological advantage in this aspect. If AMD could run at these frequencies and not dramatically increase power consumption, I recommend they do it. But that might require changes to GCN that might turn it into a new architecture altogether.


Polaris is the first major overhaul of GCN; there are lots of changes.


----------



## Travieso

Just had some time to rethink this.

Polaris 10 has a die size of around 230 mm², while Hawaii is around 440 mm².

With the process shrink, transistor density can roughly double, so a 230 mm² die at 14nm is equivalent to a ~460 mm² die at 28nm.

Deduct some fixed-size parts like the memory controller, etc.

So yeah, it should be quite easy for AMD to reach the 390X's performance with a tiny 230 mm² die.

But to reach the 980 Ti's ballpark, it needs a clock speed bump of around 30-35% at least.

Well, my final word is 'possible', but only if they want to push it.
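That back-of-envelope scaling can be spelled out. A rough sketch using the post's own rumor-era numbers (the 2x density gain and the ~230/~440 mm² die sizes are assumptions from the post, not confirmed specs):

```python
# Back-of-envelope die scaling: assumes the 28nm -> 14nm FinFET shrink
# roughly doubles transistor density, as the post above does.
DENSITY_GAIN = 2.0  # assumed, not a confirmed figure

def equivalent_28nm_area(die_mm2_14nm):
    """Area a 14nm die would occupy at 28nm transistor density."""
    return die_mm2_14nm * DENSITY_GAIN

polaris10_mm2 = 230  # rumored Polaris 10 die size
hawaii_mm2 = 440     # approximate Hawaii (290X/390X) die size

print(equivalent_28nm_area(polaris10_mm2))  # 460.0, roughly Hawaii-sized
```

This ignores the fixed-size parts (memory controllers, I/O) mentioned above, which don't shrink as well as logic, so the real equivalence is somewhat less favorable.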


----------



## x3sphere

Quote:


> Originally Posted by *rtikphox*
> 
> Just me or I expected polaris to be some sort of Crysis 3 killer? Sorta like how 3850/4870 and the 7970s murdered the nvidia cards of their day. At least do 1.20X or 1.5X 980 Ti.


That will never happen again, in my opinion. The best to hope for from AMD is good price/performance, but there will probably never be another situation where NV gets embarrassed on all fronts. It would take a massive screw-up on NV's part, as they can afford to spend way more on R&D than AMD can.


----------



## airfathaaaaa

Quote:


> Originally Posted by *x3sphere*
> 
> That will never happen again in my opinion. Best to hope for from AMD is good price/performance, but there will probably never be another situation where NV gets embarassed on all fronts. It would take a massive screw up on NV's part, as they can afford to spend way more on R&D than AMD can.


Well, considering we only heard 30 seconds about async compute and exactly zero seconds about the regular cards, you can never say never...


----------



## renx

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Not one single solid fact in his entire post. All based on assumptions, rumors and innuendo. There has been no confirmation from AMD about die size, transistor count, clock speed, price, etc. But somehow we all know that it's only going to be as fast as a 390 X? Give me a break.


100% agreed. Sometimes people forget that rumors are just that: rumors.
I believe AMD will certainly have a contender for the GTX 1080.
And they will have it right from the start.


----------



## Buris

Quote:


> Originally Posted by *renx*
> 
> 100% agreed. Sometimes people tend to forget that rumors are just that. Rumors.
> I believe AMD will certainly have contender for the GTX1080. And they will have it right from the go.


I think Nvidia, with Maxwell, got their architecture together; this will not be a repeat of the 5870/5850. But I think Polaris 10 could beat the GTX 1070, which is still very good for a $300 part vs. a $370 part.


----------



## DPB23

Quote:


> Originally Posted by *EightDee8D*
> 
> https://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_XtremeGaming/23.html
> on 4k
> 980=64%
> 380x=59%
> 
> 64/59=~9% more faster.
> 
> on 2k its 52% faster that means there's some driver overhead still left. i don't care for what they buy this gpu for, i mainly used it to remove overhead problem which polaris is supposed to be doing. that puts 380x and 980 more closer too. obviously im not saying tonga is better or something.
> polaris is a major overhaul of gcn for the first time. there are lots of changes.


I think there's a rather large typo in that graph: a 380X shouldn't be ahead of a 290, nor far ahead of a 280X. In their original review of the 380X, the 980 was 49% faster at 4K, the 290 32% faster, and the 280X 4% faster, all vs. stock.


----------



## EightDee8D

Quote:


> Originally Posted by *DPB23*
> 
> I think there's a rather large typo in that graph, a 380X shouldn't be ahead of a 290, nor should it be far ahead of a 280X. In their original review of the 380X, the 980 was 49% faster at 4K, the 290 32% faster and 280X 4% faster vs stock.


Drivers change everything. The review I used is the latest one, with newer drivers.


----------



## DPB23

Quote:


> Originally Posted by *EightDee8D*
> 
> Drivers change everything. the review i used is the latest with better drivers.


Drivers do change, but they don't work miracles. A 380X isn't suddenly going to become faster than a 290 at 4K when it's substantially slower at every other resolution.


----------



## Newbie2009

With Nvidia having announced a new card coming this month, any sign of AMD?


----------



## iLeakStuff

Quote:


> Originally Posted by *Newbie2009*
> 
> With Nvidia having announced a new card coming this month, any sign of AMD?


They should be out with cards in June too.
Not sure how they are gonna do the announcements of the Polaris cards. Could be Computex, could be at a different event.


----------



## Newbie2009

Quote:


> Originally Posted by *iLeakStuff*
> 
> They should be out with cards in June too.
> Not sure how they are gonna do the announcements of the Polaris cards. Could be Computex, could be at a different event.


It would be stupid not to beat Nvidia to the punch this round after last round's disaster.


----------



## tajoh111

Quote:


> Originally Posted by *EightDee8D*
> 
> https://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_XtremeGaming/23.html
> on 4k
> 980=64%
> 380x=59%
> 
> 64/59=~9% more faster.
> 
> on 2k its 52% faster that means there's some driver overhead still left. i don't care for what they buy this gpu for, i mainly used it to remove overhead problem which polaris is supposed to be doing. that puts 380x and 980 more closer too. obviously im not saying tonga is better or something.
> polaris is a major overhaul of gcn for the first time. there are lots of changes.


That's obviously an error that W1zzard has since corrected in his latest reviews. Considering where the 290X and 290 sit, cards that are certainly faster than the 380X, it doesn't make sense for the 380X to get a big boost at 4K, the resolution where it's most likely to do worse due to memory bandwidth. Add in the positioning of the R9 380, which is placed correctly and is a very similar card with about 12% fewer shaders (yet the chart shows a 60% difference between the two), and it's obviously an error.


----------



## Bryst

Quote:


> Originally Posted by *Newbie2009*
> 
> Stupid to not beat Nvidia to the punch this round after last round disaster.


I think it boils down to AMD needing more time, since they don't have all the resources and money Nvidia has. I, for one, can wait for a real unveiling from AMD before making my decision.

You guys need to go outside and take a walk. People get too antsy over this kind of stuff. Capitalism has consumed you.


----------



## EightDee8D

Quote:


> Originally Posted by *tajoh111*
> 
> That's obviously an error that wizard has since corrected for in his latest reviews. Considering where the 290x and 290 series are which are faster cards than the 380x for certain, it doesn't make sense for the 380x to get a super boost in performance at 4k where the 380x is most likely to do worse at due to memory bandwidth. Add in the positioning of the r9 380, which is correctly position which is a very similar card except for about 12% less shaders(There is a 60% difference between the two cards) and it is obviously an error.


Well, in that case, never mind.

(It's the latest GPU review, btw.)

Still, though, Polaris is a major overhaul of GCN, so we could see pretty close performance.


----------



## BulletBait

Quote:


> Originally Posted by *Bryst*
> 
> I think it boils down to AMD needs more time since they dont have all the resources and money Nvidia has. I for one can wait for a real unveiling from AMD to make my decision.
> 
> You guys need to go outside and take a walk. People get to antsy over this kind of stuff. Capitalism has consumed you.


Noooooooo! Consumerism has its death grip on me! I must go listen to Rage for 36 hours straight now to purge myself.
Quote:


> Originally Posted by *Buris*
> 
> I think Nvidia, with maxwell, got their architecture together, this will not be a repeat of the 5870/5850, but I think polaris 10 could beat out the GTX 1070, which is still very good for a 300$ part vs a 370$ part


This. With the shrink to a smaller node than nV, the GCN architecture improvements, and the recent frequent driver update push, I expect the '$300' version (or TAM version, whatever AMD wants to call and price it) to achieve parity with the 1070. Do we know that? Of course not, but given the entire history between AMD and nV, the performance gap between the two has never been OVERWHELMINGLY huge, and the cards are typically priced according to that gap.

We may or may not see a '1080' performance-level card this release period. Maybe it'll come slightly later, before Vega, or maybe with Vega as the cut-down Vega chip. I'll leave it to AMD engineering and marketing to decide what they need to put out. Between you, me, and the trees, I think there are far and away too many GPU versions out right now; they could probably get away with cutting the lineup by another third, like the 2xx generation to the 3xx generation. Maybe one 'budget' (say 470), two 'mid' with a mid and mid-high (say 480/480X), two 'high' (say 490/490X), one flagship (whatever the Vega flagship is called), then their King of Kings (dual Vega, whatever). Fuse off down the line on their dies, so maybe we don't get a 490X until Vega comes out, but it would save money. Right now, I personally think there's too much choice.

I might be oversimplifying (I'm sure of it, in fact), but I still think getting back to a range of choices where each step up is a tangible increase, rather than a minuscule one, would make decisions at the low and high ends easier, instead of how it is now with a 'wealth' of choice where a slight increase may not seem worth the price.

I'm just spitballing, really, to take up some time. Trying not to fall asleep at work.


----------



## Newbie2009

Quote:


> Originally Posted by *Bryst*
> 
> I think it boils down to AMD needs more time since they dont have all the resources and money Nvidia has. I for one can wait for a real unveiling from AMD to make my decision.
> 
> You guys need to go outside and take a walk. People get to antsy over this kind of stuff. Capitalism has consumed you.


Lol not really, just a passing comment. I'd still have HD7970s if I didn't brick one.


----------



## Buris

Quote:


> Originally Posted by *BulletBait*
> 
> Noooooooo! Consumerism has its death grip on me! I must go listen to Rage for 36 hours straight now to purge myself.
> This, with the shrink to a smaller node then nV, GCN architecture improvements, and the recent frequent driver update push, I expect expect the '$300' version (or TAM version, whatever AMD wants to call it and price it) to achieve parity with the 1070. Do we know that? Of course not, but given the entire history between AMD and nV, the performance gap between the two has never been OVERWHELMINGLY huge, and are typically priced accordingly with that gap.
> 
> We may or may not see a '1080' performance level card this upcoming release period. Maybe it'll be slightly later before Vega, maybe it'll be with Vega as the cutdown Vega chip version. I'll leave it to AMD engineering and marketing to decide what they need to put out. Between you, me, and the trees, I think there's far and away too many GPU versions out right now, they could probably get away with cutting the versions by another 1/3 like the 2xx generation to the 3xx generation. Maybe one 'budget' (say 470), two 'mid' with a mid and mid high (say 480/480X), two 'high' (say 490/490X), and one flagship (Whatever Vega flagship will be called), then their King of Kings (dual Vega whatever). Fusing off down the line on their dies, so maybe we don't get 490X until Vega comes out, but it would save money. Right now, I personally think there's too much choice.
> 
> I might be oversimplifying that (I'm sure of it in fact), but I still think if we got back to something in that range of choices where it's a miniscule upgrade between cards to actual tangible increase would make decision making at the low and high ends easier instead of how it is now with a 'wealth' of choice. Where a slight increase may not seem worth it for the price.
> 
> I'm just spit balling really to take up some time. Trying not to fall asleep at work.


I think AMD should release a heavily overclocked version of Polaris 10. If the rumors are true and the GTX 1080 is capable of a 2GHz clock on 16nm, then even with a slightly modified GCN, AMD should be capable of a 1600MHz clock on P10. I have no doubt in my mind that such a chip would eclipse GTX 1070 performance quite easily. Obviously, I don't think it's possible for P10 to beat the 1080. I think the best-case scenario for AMD is that they're able to match the 1080, which would require a serious miracle like the 5000 series from ATI.


----------



## BulletBait

Quote:


> Originally Posted by *Buris*
> 
> I think AMD should release a heavily overclocked version of Polaris 10, if rumors are true and he GTX 1080 is capable of 2ghz clock on the 16nm fab tech, then even with a slightly modified GCN, AMD should be capable of a 1600mhz clock on p10. I have no doubt in my mind this chip would eclipse GTX 1070 performance quite easily. Obviously I don't think it's possible for p10 to beat out the 1080. I think best case scenario for AMD is that they're able to match the 1080- which would require a serious miracle like the 5000 series from ATI


With how little is known about the 1070, since the whole release was basically about juicing the 1080, I think P10 will match it as well. Not a sure thing, but I'm willing to give AMD the benefit of the doubt and hedge my bets that way.

I also don't think nV cards will get that '2x' VR performance boost in the real world. They've, what, tripled down on proprietary stuff now? It may happen in games that USE VRWorks, but with the huge push toward open-source VR, not just by AMD but by several other large companies, I don't see it giving them the advantage everyone is expecting.

That said, I may be wrong on all counts. AMD may just tank this refresh with no real performance gain (which they may be shooting for anyway, wrapping it all into power reduction; that's their choice, and I may not be happy, but a 390X with a 1/3 price reduction and a 60%+ power reduction is nothing to sneeze at, and a lot of other people will be happy). All or most devs may decide to use VRWorks for some reason and sabotage AMD/boost Nvidia, or mindshare may just keep on like it is and AMD continues to lose market share and sales through little fault of their own (marketing failure, maybe, but that can only go so far when word of mouth is much more convincing, even when it's wrong).

So... still cautiously optimistic.


----------



## TheLAWNOOB

Is a used R9 290 worth about 200 USD or 250 CAD?


----------



## EightDee8D

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Is a used R9 290 worth about 200 USD or 250 CAD?


To me, yes. But this is still a bad time to buy any GPU; better to wait for the new stuff.


----------



## TheLAWNOOB

Quote:


> Originally Posted by *EightDee8D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheLAWNOOB*
> 
> Is a used R9 290 worth about 200 USD or 250 CAD?
> 
> 
> 
> To me, yes. but still this is a bad time for buying any gpu. better wait for new stuff.
Click to expand...

I'm going to get a used one for CF.

The CAD is so weak right now that the new Polaris will be a rip-off here, considering the 1:1.3 exchange rate and 13% tax.
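The landed cost those figures imply is easy to work out. A quick sketch using the poster's own numbers (the 1:1.3 rate and 13% tax come from the post; actual street prices vary):

```python
# Rough CAD landed cost of a US-priced GPU, using the ~1:1.3 USD->CAD
# exchange rate and 13% sales tax mentioned in the post above.
def landed_cost_cad(usd_price, fx_rate=1.30, sales_tax=0.13):
    """US list price converted to CAD with sales tax applied."""
    return usd_price * fx_rate * (1 + sales_tax)

print(round(landed_cost_cad(300)))  # ~441 CAD for a hypothetical $300 card
```

That lands in the same ballpark as the ~430 CAD estimate given a couple of posts later.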


----------



## mav451

If you're _selling_, that's a good price.
Agreed with EightDee8D - I would not be _buying_ any last-gen stuff right now.


----------



## zealord

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Is a used R9 290 worth about 200 USD or 250 CAD?


That's a close one, but I'd say no.

The 290 is nearly 3 years old, and they've been available used for $200 for about the last 12 months.

The reasons I say no:

- high power draw
- no idea how long and how hard it was used
- new cards coming soon
- Polaris 10 could be slightly more expensive, but will be better overall.

Now that I think about it, for me it's a very clear NO. I would not buy that.


----------



## TheLAWNOOB

Quote:


> Originally Posted by *zealord*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheLAWNOOB*
> 
> Is a used R9 290 worth about 200 USD or 250 CAD?
> 
> 
> 
> That is a close one, but I'd say no.
> 
> The 290 is nearly 3 years old. They could be bought for 200$ used like for the last 12 months.
> 
> The reason why I say no :
> 
> - high power draw
> - no idea how long and how intensive it was used
> - new cards coming soon
> - Polaris 10 could be slightly more expensive, but will be overall better.
> 
> Now that I think about it.
> 
> For me it's a very clear NO. I would not buy that.
Click to expand...

Even if Polaris is $300 US, it would still cost ~430 CAD after tax.

A single 980 Ti-tier card is not good enough for 4K.


----------



## BulletBait

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Even if Polaris is $300 US, it would still cost 430CAD after tax.
> 
> A single 980 Ti tier card is not good enough for 4k.


Neither are CrossFired 290s. I don't even consider my heavily OCed, watercooled 290Xs good enough for consistent 4K. You're also not guaranteed CF support in a lot of games.


----------



## Bryst

Quote:


> Originally Posted by *Newbie2009*
> 
> Lol not really, just a passing comment. I'd still have HD7970s if I didn't brick one.


I still have my 7950. And honestly, even with a 21:9 monitor it still plays BF4 on high/ultra really well, and when I played The Division it didn't have any issues either. The only thing I hate about it is the power draw. I'm ready for a 50% cut in TDP.


----------



## BulletBait

Quote:


> Originally Posted by *Bryst*
> 
> I still have my 7950. And honestly even with a 21:9 monitor is still plays BF4 on high/ultra really well and when I played the division it didn't have any issues with that either. Only thing I hate about it is the power draw. Im ready for a 50% cut in TDP.


Depending on how I feel about my money at the time, I may pick up a P10 for the giggles. If they're shooting entirely for a power decrease with a relatively small performance increase, I may pick one up and put it through some OCing paces, since I'd assume that if it's not near the design TDP, there might be quite a bit of room to kick it around.

I'm also going to wonder why it's so cold come my first winter after I get Zen and Vega and halve the thermal output of my blast furnace.


----------



## Tugrul512bit

Quote:


> Originally Posted by *cowie*
> 
> ah where did I say it was?
> just said what amd has said that's all
> 
> 2.5 errrrr 2.0 x the performance /watt improvement


My English is not so good, but:

improvement = an addition or change that makes something better or more valuable

1.0x + a 2.0x addition = 3.0x

Doesn't this mean 300 W becomes 100 W?


----------



## BulletBait

Quote:


> Originally Posted by *Tugrul512bit*
> 
> My english is not so good but,
> 
> improvement = an addition or change that makes something better or more valuable
> 
> 1.0x + 2.0 x addition = 3.0x
> 
> doesn't this mean 300 W becomes 100 W ?


Where's the '1x' coming from? "2x the performance per watt" just means 2x total. So if it all went into power reduction, it would mean the same performance at 150W as a 300W previous-gen card. And if they're shooting for 390/390X (275W TDP) performance on P10, it should end up at around ~140W TDP.
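The arithmetic in this exchange is easy to get wrong, so here it is spelled out (the 300W and 275W figures are the ones quoted above; "2x perf/watt" is AMD's marketing claim, not a measured result):

```python
# "2x performance per watt" means the same performance at half the power,
# not one third: the new efficiency is 2x the old, not old + 2x on top.
def power_at_same_perf(old_watts, perf_per_watt_gain):
    """Power required for identical performance after an efficiency gain."""
    return old_watts / perf_per_watt_gain

print(power_at_same_perf(300, 2.0))  # 150.0 W, not 100 W
print(power_at_same_perf(275, 2.0))  # 137.5 W for a 390X-class part
```

Reading "2x improvement" as "1x + 2x = 3x" is what produces the 100W figure; the claim as stated is a ratio, not an addition.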


----------



## sKorcheDeArtH

I'm sure someone has probably mentioned this already, but I don't see the point of buying Polaris 10 if this is true. Polaris 10 for close to 980 Ti performance? Spend 50 bucks more and get more than 980 Ti performance. I'll spend the extra 50 bucks. I don't think this is good news at all for AMD or for us consumers unless the card is $250.


----------



## zealord

Quote:


> Originally Posted by *sKorcheDeArtH*
> 
> I'm sure someone probably mentioned this already but I don't see the point of buying polaris 10 if this is true. Polaris 10 for Close to 980 TI performance, spend 50 bucks more and have more than 980TI performance. I'll spend the extra 50 bucks. I don't think is good news at all for AMD or us consumers unless this card is $250.


It depends on the actual numbers, I'd say.

The GTX 1070 is $379.

If the Polaris card is $300 and slightly slower, it might still be a good buy for some people.


----------



## criminal

Quote:


> Originally Posted by *sKorcheDeArtH*
> 
> I'm sure someone probably mentioned this already but I don't see the point of buying polaris 10 if this is true. Polaris 10 for Close to 980 TI performance, spend 50 bucks more and have more than 980TI performance. I'll spend the extra 50 bucks. I don't think is good news at all for AMD or us consumers unless this card is $250.


What card is faster than the 980 Ti and costs $350?


----------



## BulletBait

Quote:


> Originally Posted by *sKorcheDeArtH*
> 
> I'm sure someone probably mentioned this already but I don't see the point of buying polaris 10 if this is true. Polaris 10 for Close to 980 TI performance, spend 50 bucks more and have more than 980TI performance. I'll spend the extra 50 bucks. I don't think is good news at all for AMD or us consumers unless this card is $250.


It's already been mentioned several times. It's also been repeated that we have next to ZERO information on Polaris, including architecture, clocks, TDP, and price for ANY of the cards. The only potentially credible info was on P11, not P10. We don't know anything right now, so calling it for team green is premature at best. We also have next to no information about the 1070, since they talked about it for maybe 30 seconds, and I won't base a decision on the PR claim that it's 'faster' than the Titan.

The fact is, people can spend their money where they want either way. If the positions were reversed and I felt there might be a lack of credible, unbiased reviews or actual consumer testing on the 1070/1080, I might have picked one of those up instead. I want to know what P10 can do, though, so I might buy it to test it out, for the giggles.


----------



## Sand3853

Something I've been wondering about (mere conjecture, and most likely way off): how likely is it that all the rumors/leaks we've seen are actually of Polaris 11? I mean, they had a working demo of a Polaris 11 GPU recently, and it seems most of what AMD has shown to date has been of that particular chip. I guess what I'm getting at is: what are the chances that the leaked benchmarks and the supposed 232 mm² die size belong to the smaller Polaris chip, with the Polaris 10 chip closer to the ~300 mm² range?

Again, mere conjecture, and I will concede that I am probably way off, but given that we hardly have any information, it got me wondering.

On a side note, it will be interesting to see just what sort of changes have been made to GCN this go-around, and what kind of performance gains the new architecture brings apart from the node shrink.


----------



## zealord

Quote:


> Originally Posted by *Sand3853*
> 
> Something I've been wondering about (mere conjecture and most likely very off....)... how likely would it be that all the rumors/leaks we have seen is actually of Polaris 11? I mean, they had a working demo of an 11 gpu recently, it seems that most of ehat AMD has shown to date has been of that particular chip. I guess what im getting at is, whats the chances of the leaked benchmarks and supposed 232mm2 die size being the smaller Polaris chip with the Polaris 10 chip more around ~300mm2 range?
> 
> Again, mere conjecture and I will concede that I am probably way off, but given the fact that we hardly have any information it just got me wondering.
> 
> on a side note, it will be interesting to see just what sort of changes has been made to gcn in this go around, and what type of performance gains the mew architecture brings before a node shrink.


Very unlikely, I think.

What I think:

Polaris 11 is very small, around GTX 950 levels of performance. They also tested it against a GTX 950 to show power consumption. It is definitely not a 232 mm² card; I'd guess it's more like 120-140 mm².

Polaris 10 is probably the 232 mm² card, and with 14nm and the new architecture it fits perfectly with the expected performance of somewhere between the 390X and the 980 Ti.

Looking at the GTX 1080 rumors, we can see that many of them actually turned out to be true, or close to it within a margin of error. The GDDR5X rumors were true, the 8GB rumors were true, etc.

The only thing that bugs me about this whole thing is that:

- either AMD has a MASSIVE card for 2017 with Vega that completely destroys their remaining lineup, like twice the performance of Polaris 10,

- or Vega is not the bee's knees.

There would be a massive gap between Polaris 10 and Vega with HBM. Well, or they have two variants of Vega: one for the 490(X) and one for special halo-named cards.


----------



## azanimefan

Quote:


> Originally Posted by *criminal*
> 
> What card is faster than the 980ti and cost $350?


He's fanboying and has his rumors mixed up.

The GTX 1080 is supposed to be about as fast as an overclocked 980 Ti at a roughly $670 price point.
He's mistaking it for the 1070, which will sell for $380 and perform somewhere around a 980.

What he's not grasping is that if Polaris 10 hits at or above 980 Ti performance for $330, it completely decimates the entire Nvidia launch.


----------



## zealord

Quote:


> Originally Posted by *azanimefan*
> 
> He's fanboying and has his rumors mixed up.
> 
> The gtx1080 is supposed to be about as fast as an overclocked 980ti at a roughly $670 pricepoint.
> *He's mistaking this for the 1070 which will sell for $380 and perform somewhere around a 980.*
> 
> What he's not grasping is if the polaris 10 hits at or better than 980ti performance for $330 it completely decimates the entire nvidia launch.


Nah, that would be terrible. Jen-Hsun said the GTX 1070 is faster than the Titan X (maybe just 1%, but still faster).


----------



## Sand3853

Quote:


> Originally Posted by *zealord*
> 
> I think very unlikely.
> 
> What I think :
> 
> Polaris 11 is very small. Around GTX 950 levels of performance. they also tested it against a GTX 950 to show power consumption. It is definitively no 232mm² card. I'd guess it's like 120-140mm².
> 
> Polaris 10 is probably the 232mm² card and with 14nm and new architecture is fits perfectly with the expected performance of somewhere between 390X and 980 Ti.
> 
> Looking at the GTX 1080 rumors we can see that many of those rumors actually are true. Or somewhat close within margin of error.The GDDDR5X rumors were true. The 8GB rumors were true etc.
> 
> The only thing that bugs me about this whole thing is that :
> 
> - either AMD has a MASSIVE card for 2017 with VEGA that completely destroys their remaining lineup. Like twice the performance of Polaris10.
> 
> - or VEGA is not the bee's knees
> 
> There would be massive gap between polaris10 and VEGA with HBM. Well or they have 2 variants of VEGA. One for 490(X) and one for special halo named cards


Makes perfect sense. As I said, mere conjecture/ponderings.

I'm honestly of the mindset that we haven't seen or heard anything of the full Polaris 10, and the 'leaks' are more than likely either the mobile chip or the chip for Apple/consoles (WWDC is just around the corner). Plus, something to consider: the leaks we've had are few and far between. There have been way more rumors about the Nvidia cards, to the point where quite a few folks had a good idea of the general specifics before the announcement.

Now, I also think we will see a chip (Polaris 10) that competes with the 1070, and that Vega will come in a few variants, with Vega 10 as the big chip and Vega 11 occupying the market role of the 390(X), possibly sitting at or around the 1080 (+/- a few percent depending on SKU).

I just can't fathom them launching a complete new architecture that only matches the performance of what they have right now. I think if they can offer good performance at the right price (and can beat/match the 1070), it will give them some momentum in the market. If we remember, it really wasn't the 980 that caused problems for AMD, but the 970 and its surprisingly low price. AMD could pull something similar, provided the performance is there.


----------



## Malinkadink

Quote:


> Originally Posted by *azanimefan*
> 
> He's fanboying and has his rumors mixed up.
> 
> The gtx1080 is supposed to be about as fast as an overclocked 980ti at a roughly $670 pricepoint.
> He's mistaking this for the 1070 which will sell for $380 and perform somewhere around a 980.
> 
> What he's not grasping is if the polaris 10 hits at or better than 980ti performance for $330 it completely decimates the entire nvidia launch.


How is it fair to compare an overclocked 980 Ti with a stock 1080? The 1080s will clock to 2GHz for probably anyone who wants to go that high, given they showed off 2100MHz during the reveal. An overclocked 1080 will be a good chunk faster than the 980 Ti, 20-25% in non-VR applications. So for essentially the same price as a 980 Ti, you'll get a good chunk more performance if you wait several weeks for a 1080.

As for the 1070, it will be able to match or exceed the 980 Ti with both cards heavily overclocked. So if you fancy saving $220 over the 1080 and are content with 980 Ti-level performance at a sub-$400 price point, I don't understand what's not to like about that. The efficiency gains here are larger than the performance gains, which is a great thing on its own.

I'm aiming for a mini-ITX build next, so these powerful and efficient cards are a dream come true when they're going to be packed into a tiny case. I'm very curious to see what the strongest Polaris 10 card can do, and I hope for AMD's sake they'll give us details in a couple of weeks tops, or else I'll end up buying Nvidia, as my 970 is getting long in the tooth for 1440p/144Hz gaming.


----------



## BulletBait

So... Decided to do the googledy goog, because reasons.

This article sums up everything of what we know of Polaris and Vega so far for anyone else that comes in asking...

Polaris/Vega Details

The answer is still... Not much.


----------



## variant

Quote:


> Originally Posted by *zealord*
> 
> Nah that would be terrible. Jen Hsun said the GTX 1070 is faster than TX (maybe just 1%, but still faster)


From how he said it, it sounded like it would be faster than the TitanX in VR, not in general.


----------



## ebduncan

Quote:


> Originally Posted by *BulletBait*
> 
> Neither are CFed 290s. I don't even consider my heavily OCed underwater 290Xs good enough for consistent 4K. You're also not guaranteed CF will be supported for a lot of games.


Yes, they are. Stop being a settings junkie; the difference between high and very high is usually not even noticeable.

Quote:


> Originally Posted by *zealord*
> 
> Nah that would be terrible. Jen Hsun said the GTX 1070 is faster than TX (maybe just 1%, but still faster)


Just for reference the Titan X is slower than most custom 980ti's.

There are not enough facts to say for sure what AMD is doing. P11/P10 have already been adopted by OEMs and the next-gen consoles. Here is what I expect to happen:

P11/P10 will come out alongside the 1070/1080. The GeForce cards will be faster and use more power. Nvidia will gain more "high end" market share, while AMD gains overall market share as OEMs and smaller cards take up a much higher percentage of the market. AMD will be the first one out of the gates with Vega and blow the 1080/1070 outta the water for a while, until big Pascal comes out. Everyone will wonder where Nvidia's answer to Vega is and forget all about how P10 doesn't match the 1070/1080. Then big Pascal will be available, most likely taking over the performance crown again. That's my prediction.


----------



## Forceman

Quote:


> Originally Posted by *ebduncan*
> 
> yes they are, stop being a settings junkie. The differences between high and very high is usually not even noticeable.
> Just for reference the Titan X is slower than most custom 980ti's.
> 
> There are not enough facts to say for sure what AMD is doing. P11/P10 have already been adopted by oems, and the next gen consoles. Here is what I expect to happen
> 
> P11/P10 will come out with the 1070/1080. The geforce cards will be faster and use more power. Nvidia will gain more "high end" market share, AMD will gain "overall market share" as oems and smaller cards take up a much higher percentage of the market. AMD will be the first one out the gates with Vega, and blow the 1080/1070 outta the water for awhile until the big pascal comes out. Everyone will wonder where's Nvidia's answer to Vega, and forget all about how P10 doesn't match the 1070/1080. Then big pascal will be available most likely taking over the performance crown again. That's my prediction.


Why do you think Vega will come out first when Nvidia is already selling GP100 parts? Seems like Nvidia has the leg up there.


----------



## ebduncan

Quote:


> Originally Posted by *Forceman*
> 
> Why do you think Vega will come out first when Nvidia is already selling GP100 parts? Seems like Nvidia has the leg up there.


GP100 parts are not for gaming. Vega/Greenland has been taped out for a while, with no word yet about the gaming version of GP100 being taped out. AMD has exclusive rights to HBM as well, meaning they get priority supply for HBM.


----------



## Forceman

Quote:


> Originally Posted by *ebduncan*
> 
> Gp100 parts are not for gaming. Vega/greenland has been taped out for awhile, with no word yet about the gaming version of gp100 being taped out. Amd has exclusive rights to HBM as well, meaning they get priority supply for HBM.


They may have had rights to HBM, but they obviously don't have exclusive rights to HBM2, since Nvidia is selling parts with it today.

And I'm not saying they will use GP100, but they are fabbing 600mm² 16FF parts right now, so they obviously have some experience with it, which should carry over to whatever they use for consumer parts.


----------



## BulletBait

Quote:


> Originally Posted by *Forceman*
> 
> They may have had rights for HBM, but they obviously don't have exclusive rights for HBM2 since Nvidia is selling parts with it today.
> 
> And I'm not saying they will use GP100, but they are fabbing 600mm 16FF parts right now so they obviously have some experience with it, which should carry over to whatever they use for consumer parts.


It's not, and never was, an exclusivity deal. It's a priority supply agreement with Hynix, the co-developer. So AMD's orders get filled first; anyone else comes after. AMD hasn't needed a large amount yet, since Vega is still 6-8 months out and HBM2 just scaled up to mass production. So who knows what the production scale will be at that point, and how much will go to AMD before other orders start getting filled.

Edit: Mass production by Samsung; Hynix scales up in Q3. I'd take Hynix memory over Samsung any day.


----------



## ebduncan

Quote:


> Originally Posted by *Forceman*
> 
> They may have had rights for HBM, but they obviously don't have exclusive rights for HBM2 since Nvidia is selling parts with it today.
> 
> And I'm not saying they will use GP100, but they are fabbing 600mm 16FF parts right now so they obviously have some experience with it, which should carry over to whatever they use for consumer parts.


They still have exclusive rights to HBM. I think you're getting mixed up with the word exclusive here. Exclusive in this case means volume rights to initial production. I don't think Hynix has volume supply issues right now, because, well, GP100 costs $15k and AMD doesn't have a product out using it yet.


----------



## Forceman

Quote:


> Originally Posted by *ebduncan*
> 
> They still have exclusive rights to HBM. I think your getting mixed up with the word exclusive here. Exclusive in this case meaning volume rights with initial production. I don't think hynix has volume supply issues right now, because well gp100 cost 15k$ and AMD doesn't have a product out using it yet.


They may have priority access (which isn't the same as exclusive) to Hynix production (although I've only ever seen that as "rumored" not confirmed), but I haven't seen anything about an agreement with Samsung, and that's who's supplying Nvidia right now.


----------



## sKorcheDeArtH

Quote:


> Originally Posted by *azanimefan*
> 
> He's fanboying and has his rumors mixed up.
> 
> The gtx1080 is supposed to be about as fast as an overclocked 980ti at a roughly $670 pricepoint.
> He's mistaking this for the 1070 which will sell for $380 and perform somewhere around a 980.
> 
> What he's not grasping is if the polaris 10 hits at or better than 980ti performance for $330 it completely decimates the entire nvidia launch.


If you had read my post fully instead of instantly crying "fanboy", you'd have read that I think it would be terrible for us all if Polaris 10 is not close to 1070 performance, which isn't much more than $300. That's a valid point. Read again.


----------



## BulletBait

Quote:


> Originally Posted by *Forceman*
> 
> They may have priority access (which isn't the same as exclusive) to Hynix production (although I've only ever seen that as "rumored" not confirmed), but I haven't seen anything about an agreement with Samsung, and that's who's supplying Nvidia right now.


You're right. More rumors we gotta deal with, eh? The rumor mill just keeps on a-turnin XD.

I still stand by my statement of Hynix over Samsung.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *zealord*
> 
> Nah that would be terrible. *Jen Hsun said the GTX 1070 is faster than TX (maybe just 1%, but still faster)*


That's at stock clocks, I bet. And the Titan X is hilariously underclocked at stock...


----------



## KeepWalkinG

We need a fast answer from AMD!


----------



## ToTheSun!

Quote:


> Originally Posted by *Malinkadink*
> 
> I'm aiming for a mini-ITX build next


I know it's off-topic, but if you're looking for the best compromise between volume and compatibility (m-ITX motherboards without a daughterboard are, generally speaking, not as beefy as m-ATX in regard to power delivery, and have fewer memory slots), consider having a look at this: http://www.kimeraindustries.com/cerberus/#cerberus-hero (in case you haven't yet).

It's a very interesting project that simply needs more supporters. Anyway, carry on.


----------



## spyshagg

2x 290X still manage to run BF4 at 1440p Ultra @ 150fps.

Not every game scales like this. But the recent crossfire push (open source + DX12) makes me very hesitant to sell these cards.


----------



## AvengerNoonZz

A Polaris GPU performing like a 980 Ti would be great. It means I will get a nice upgrade and not lose FreeSync. Wonder how long my R9 390 can last at 1440p 144Hz.


----------



## BulletBait

Quote:


> Originally Posted by *spyshagg*
> 
> 2x 290x still manage to run BF4 1440P Ultra @ 150fps,
> 
> Not every game scales like this. But the recent crossfire push (open source + DX12) makes me very hesitant to sell these cards.


Quote:


> Originally Posted by *ebduncan*
> 
> yes they are, stop being a settings junkie. The differences between high and very high is usually not even noticeable.


I'll concede; I haven't done it in a while. I haven't given 4K a shot in a year, so back before DX12 and the more recent AMD driver updates. You're right, it's probably functional now.

I'd also like to point out I used the word consistent, by which I meant it varied wildly across games and settings. Still, in the end, you guys are probably right and it's completely functional now. I apologize for leaning on my outdated personal experience over current facts.


----------



## BulletBait

AMD Unveil (Launch?) End of May

Yay?


----------



## Nova.

Quote:


> Originally Posted by *BulletBait*
> 
> AMD Unveil (Launch?) End of May
> 
> Yay?


Expected. There is no way AMD was going to let Nvidia launch their cards freely without some sort of competition. Let's hope AMD brings the heat!


----------



## BulletBait

Quote:


> Originally Posted by *Nova.*
> 
> Expected. There is no way AMD was going to let Nvidia launch their cards freely without some sort of competition. Lets hope AMD brings the heat!


I know it was expected; most of us were expecting Computex proper. I was hoping it'd be a week earlier, after nV dropped their 27th 'Founders' launch date. I'm still hoping it's a proper hard launch, or no more than a week-long paper launch. A little confirmation is nice, though.


----------



## rv8000

If the leaked die size and core count are true, the biggest factors will be core clock, whether there are more significant architectural changes than from Hawaii to Tonga or Tonga to Fiji, and whether they tweaked the ratio of cores/TMUs/ROPs.

Scenario 1: 2560 shaders, 256-bit w/GDDR5, same TMU/ROP ratio as Hawaii, ~1400MHz core clock ---> marginally faster than 390X

Scenario 2: 2560 shaders, 256-bit w/GDDR5, increased ROPs, ~1400MHz core clock ---> faster than Fury X, around 980 Ti performance

Expect serious hits to performance @ 1440p and 4K if there haven't been any memory compression improvements beyond Tonga, and if there is no option for GDDR5X (and I wouldn't be surprised if both P10 cards are limited to GDDR5).

This also puts AMD in a weird spot in terms of SKUs if they plan on releasing a version of P10 with GDDR5X later in the year. But I am going to remain cautiously optimistic; if the card does come in @ 980 Ti performance, I expect to see a price tag of anywhere from $400 to $450.


----------



## caenlen

I'd rather pay the extra 80 bucks for gtx 1070 beating Titan X performance and GDDR5x, and since Nvidia is 70% of all desktop PC gaming never having to worry about a game not getting fixed driver wise super fast... but to each their own.


----------



## bigjdubb

Quote:


> Originally Posted by *rv8000*
> 
> If the leaked die size and core count is true, the biggest factors will be core clock, more significant architectural changes than from Hawaii to Tonga or Tonga to Fiji, and if they went and tweaked the ratio of core/tmus/rops.
> 
> Scenario 1: 2560 shaders, 256bit w/GDDR5, same tmu/rop ratio on hawaii, ~1400mhz core clock ---> Marginally faster than 390x
> 
> Scenario 2: 2560 shaders, 256bit w/GDDR5, increased rops, ~1400mhz core clock ---> Faster than Fury X ~ around 980ti performance
> 
> Expect serious hits to performance @ 1440p and 4k if there haven't been any more memory compression improvements above tonga, and if there is no option for GDDR5X (which I wouldn't be surprised if both p10 cards are limited to GDDR5).
> 
> This also puts AMD in a weird spot in terms of skus if they plan on releasing a version of p10 with GDDR5X later on in the year. But I am going to remain cautiously optimistic on the topic, *if the card does come in @ 980ti performance I expect to see a price tag of anywhere from 400 to 450.*


I would expect them to match or undercut the 1070 price if that's where the performance is at.


----------



## ebduncan

Quote:


> Originally Posted by *caenlen*
> 
> I'd rather pay the extra 80 bucks for gtx 1070 beating Titan X performance and GDDR5x, and since Nvidia is 70% of all desktop PC gaming never having to worry about a game not getting fixed driver wise super fast... but to each their own.


The 1070 doesn't have GDDR5X; it uses regular GDDR5.


----------



## criminal

Quote:


> Originally Posted by *caenlen*
> 
> I'd rather pay the extra 80 bucks for gtx 1070 beating Titan X performance and GDDR5x, and since Nvidia is 70% of all desktop PC gaming never having to worry about a game not getting fixed driver wise super fast... but to each their own.


GTX 1070 won't have GDDR5X. Just regular GDDR5.

... beat by 30 seconds.


----------



## rv8000

Quote:


> Originally Posted by *bigjdubb*
> 
> I would expect them to match or undercut the 1070 price if that's where the performance is at.


Not with this "Founder's Edition" BS; the GTX 1070 will likely remain around $429-449. I still believe this will end up being decided by how well P10 clocks on the new 14nm process. There is so little concrete information on P10 that it is really hard to say. AMD (RTG) will put themselves in the grave if they can't release a part that's faster than Hawaii on an entirely new node @ $300.


----------



## caswow

Quote:


> Originally Posted by *caenlen*
> 
> I'd rather pay the extra 80 bucks for gtx 1070 beating Titan X performance and GDDR5x, and since Nvidia is 70% of all desktop PC gaming never having to worry about a game not getting fixed driver wise super fast... but to each their own.


you must be joking right?


----------



## BulletBait

Quote:


> Originally Posted by *rv8000*
> 
> This also puts AMD in a weird spot in terms of skus if they plan on releasing a version of p10 with GDDR5X later on in the year. But I am going to remain cautiously optimistic on the topic, if the card does come in @ 980ti performance I expect to see a price tag of anywhere from 400 to 450.


They're not going to price over the 1070 if nV's claim about performance is true. They also said they were shooting for that performance position at around $300 anyway. I believe they're referring to 980 Ti/390 DX12 and VR performance levels, though. I don't think it's DX11; that'd be a lot of ground to cover.
Quote:


> Originally Posted by *caenlen*
> 
> I'd rather pay the extra 80 bucks for gtx 1070 beating Titan X performance and GDDR5x, and since Nvidia is 70% of all desktop PC gaming never having to worry about a game not getting fixed driver wise super fast... but to each their own.


Wow, the 1070 having 5X is news to me, and probably to nV's engineers as well. That must be what Jen was yelling about backstage. I'm amazed nV's engineers crammed it in over a weekend. I'm also going to withhold judgement on that 'faster than Titan' claim.


----------



## bigjdubb

Quote:


> Originally Posted by *rv8000*
> 
> Not with this "Founder's Edition" BS, GTX 1070 will likely remain around $429~449. I still believe this will end up being decided by how well p10 clocks on the new 14nm process. There is so little concrete information on p10 that it is really hard to say, AMD (RTG) will put themselves in the grave if they can't release a part that's faster than Hawaii on an entirely new node @ $300.


If the MSRP really is $379, there will be $379 cards from all the AIBs; they may not drop first, but they will be there.


----------



## criminal

Quote:


> Originally Posted by *BulletBait*
> 
> They're not going to price over the 1070 if nV's claim about performance is true. They also said they were shooting for that performance position at around $300 anyways. I believe they're referring to 980Ti/390 DX12 and VR performance level though. I don't think it's DX11, tha'd be a lot of ground to cover.
> Wow, 1070 having 5X is news to me, probably nV's engineers as well. That must be what Jen was yelling about to the backstage. I'm amazed nV's engineers crammed it in over a weekend. I'm also going to withhold judgement on that 'faster then Titan' claim.


And the misinformation is already spreading. The 1070 WON'T have GDDR5X.


----------



## rv8000

Quote:


> Originally Posted by *BulletBait*
> 
> They're not going to price over the 1070 if nV's claim about performance is true. They also said they were shooting for that performance position at around $300 anyways. I believe they're referring to 980Ti/390 DX12 and VR performance level though. I don't think it's DX11, tha'd be a lot of ground to cover.


If yields are amazing, MAYBE. If full P10 and the 1070 have performance parity, I really don't see them undercutting the 1070 by almost $100.

As it stands, all the rumors and benchmarks have put P10 below a 390, so why would AMD ever charge more than $229 for a card like that? If we get performance parity with the 1070/980 Ti (matches DX11, exceeds DX12), expect a $400 P10 GPU. If not, I expect a $229-$249 part that hits around 390/390X performance (worse at higher resolutions). P10 is either better than the rumors lead us to believe, or cheaper than $300; those are the only two things that make sense to me right now.
Quote:


> Originally Posted by *bigjdubb*
> 
> If the MSRP really is $379 there will be a $379 card from all the AIB's, they may not drop first but they will be there.


I doubt that. Once AIB cards drop, the "Founders Edition" will drop to MSRP and the AIB cards will slot in from $399 to $429. There may be some Gainward, Zotac, and PNY AIB cards that drop @ $379 two months down the road.


----------



## KeepWalkinG

I think AMD will attract customers on a low budget with the Polaris 11 card, and the mid range with Polaris 10.

So they will lose only in high-end graphics cards vs. the GTX 1080. This is not so bad for AMD.


----------



## maltamonk

Quote:


> Originally Posted by *rv8000*
> 
> If yields are amazing MAYBE, if full p10 and 1070 have performance parity I really don't see them undercutting the 1070 by almost $100.
> 
> As it stands, all the rumors and benchmarks have put p10 below a 390, why would AMD every charge more than $229 with a card like that. If we get performance parity with 1070/980ti (matches dx11, exceeds dx12) expect a $400 p10 gpu. If not I expect a $229-$249 part that hits around 390/390x performance (worse in higher res). P10 is either better than the rumors lead us to believe, or cheaper than $300, those are the only two things that make sense to me right now.
> I doubt that. Once AIB cards drop, the "Founders Edition" will drop to MSRP" and the AIB cards will slot in from $399 to $429. There may be some Gainward, Zotac, and PNY AIB cards that drop @ $379 2 months down the road.


"If" we can take Roy's comment for anything more than a grain of salt, then a price point for the pol10 would need to be below that of the 970 and 390 and at least have performance in that range. So the card would need to be <$250 as anything in the $300 range would not have an "expanding" outcome on the vr base.


----------



## zealord

Quote:


> Originally Posted by *variant*
> 
> From how he said it, it sounded like it would be faster than the TitanX in VR, not in general.


For me it sounded like he was talking about general performance.

I could be wrong, though.

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> That's at stock clocks I bet. And Titan X is hilariously under clocked at stock...


Yeah, that is one thing I am curious about too. How high is the 1070 clocked, and how much OC headroom is left?
Quote:


> Originally Posted by *Sand3853*
> 
> Makes perfect sense...as I said, mere conjecture/ponderings.
> 
> I'm honestly of the mindset that we havent seen/heard anything of full polaris 10 and the 'leaks' are more than likely either the mobile chip or the chip for apple/consols (WWDC is just around the corner). Plus, something to consider, the leaks we have had are few and far between. there have been way more rumors of the nVidia cards, to where quite a few folks had a good idea of the general specifics before its announcement.
> 
> Now, I also think we will find a chip (polaris 10) that competes with the 1070, and that Vega will come in a few variants with Vega 10 = big chip and Vega 11 ocupying the market roll of the 390/x possibly fitting at or around the 1080 (+/- a few % depending on sku).
> 
> I just cant fathom them having a complete launch of a new architecture that only offers the same performance of what they right now. I think if they can offer good performance at the right price (and can beat/match the 1070) it will offer them some momentum in the market. If we remember, it really wasnt the 980 that caused problems for AMD, but the 970 and its suprisingly low price. AMD could pull something similar, provided the performance there.


The best thing would be a complete surprise. Like a great card from AMD that performs like the GTX 1080, held completely under wraps. Nothing has been leaked, and then suddenly they release it and it comes in at $549 (I am talking a real $549 here, not Nvidia's bogus MSRP money).

In terms of performance it seems like nothing of the sort is gonna happen. I feel like, months before major GPU releases, we were able to pinpoint the performance of each GPU within 5% from rumors and leaks alone.

One can still dream, right?


----------



## bigjdubb

Quote:


> Originally Posted by *rv8000*
> 
> I doubt that. Once AIB cards drop, the "Founders Edition" will drop to MSRP" and the AIB cards will slot in from $399 to $429. There may be some Gainward, Zotac, and PNY AIB cards that drop @ $379 2 months down the road.


I don't think Nvidia will change their selling price for the Founders Edition/reference card unless they decide on an official price drop. Aftermarket-cooler cards always carry a premium over MSRP, so we will see plenty of G1 and other top-shelf aftermarket cards selling for well over $400. The 970's MSRP was $329, but the Gigabyte G1 version was $369, so it is very likely we will see $429 or more for aftermarket 1070s. The basic $379 card will still be available, though probably not all that popular, and likely not released until sometime later.


----------



## BulletBait

Quote:


> Originally Posted by *rv8000*
> 
> As it stands, all the rumors and benchmarks have put p10 below a 390, why would AMD ever charge more than $229 with a card like that.


You mean the bunk benchmarks that were obviously underclocked @ 800 and 1050, and worse than a 'leaked' P11 benchmark? We've gone over the same benchmarks over and over, and they just don't come *close* to matching what we're expecting from their own words, plus the added educated guesswork on the node shrink and architecture improvements.

I don't know what to expect. We'll see what happens; I'm trying to base what I say on their official statements and releases, with a sprinkling of rumor. I may be slightly optimistic, but not overly so. I don't think they'll release a 1080 competitor this launch, and if they happen to, it won't be $300; I never said that. I think they will have one within spitting distance of the 1070, and yes, it would be around that price point. They're not going to have a '1070'-level card and charge more when they're at 15-20% market share and like 5% mind share; that's financial/market suicide, period.


----------



## SuperZan

Quote:


> Originally Posted by *BulletBait*
> 
> You mean the bunk benchmarks that were obviously underclocked @800 & 1050 and worse then a 'leaked' P11 benchmark? We've gone over the same benchmarks over and over and they just don't come *close* to matching what we're expecting from their own words, plus added educated guess work on the node shrink and architecture improvements.
> 
> I don't know what to expect. We'll see what happens, I'm trying to base what I say off their official statements and releases with a sprinkling of rumor. I may be being slightly optimistic, but not overly so. I don't think they'll release a 1080 competitor this launch, and if they happen to it won't be $300, I never said that. I think they will have one within spitting distance of the 1070 and yes it would be around that price point. They're not going to have a '1070' level card and charge more when they're at 15-20% market share and like 5% mind share, that's financial/market suicide, period.


Yeah, people are betting the farm on Roy Taylor and a weirdly clocked benchmark. I don't think Polaris 10 is a mega-high-end part or anything, but I can't believe it will be the DoA failbomb that some are predicting.


----------



## airfathaaaaa

Well, we only have 2 patents.
Quote:


> Originally Posted by *SuperZan*
> 
> Yeah, people are betting the farm on Roy Taylor and a weirdly-clocked benchmark. I don't think Pol 10 is a mega-high-end part or anything, but I can't think that it will be the DoA failbomb that some are predicting.


I really can't find any of Roy's tweets calling Polaris a fail in any way, nor even a hint of it. The last tweet he posted about them was telling people to "wait if they like DX12 and VR".


----------



## SuperZan

Quote:


> Originally Posted by *airfathaaaaa*
> 
> well we do have 2 patents only
> i really cant find any of the roy tweets about pol being a fail of anyway nor even a hint about it last tweet he did about them was for people to "wait if they like dx12 and vr"


It was something he said about it being a "mainstream" part which people have interpreted to mean "terrible".


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *SuperZan*
> 
> Yeah, people are betting the farm on Roy Taylor and a weirdly-clocked benchmark. I don't think Pol 10 is a mega-high-end part or anything, but I can't think that it will be the DoA failbomb that some are predicting.


Nvidia can always cut prices on their non-EOL cards (like AMD did with the 290/X when the 970 and 980 dropped) to counter Polaris, and given what happened with 390 and 390X pricing, I can't see Polaris cards being that great of a bargain.
Quote:


> Originally Posted by *airfathaaaaa*
> 
> well we do have 2 patents only
> i really cant find any of the roy tweets about pol being a fail of anyway nor even a hint about it last tweet he did about them was for people to "wait if they like dx12 and vr"


http://videocardz.com/59445/amd-polaris-aiming-at-vr-capable-graphics-cards

"If you look at the total install base of a Radeon 290, or a GTX 970 or above [the minimum specs required for VR], it's around 7.5 million units, but the issue is that if a publisher wants to sell a £40/$50 VR game, there's not a big enough market to justify that yet. We've got to prime the pumps, which means somebody has got to start writing cheques to big games publishers. Or we've got to increase the install TAM [total addressable market]."

"The reason Polaris is a big deal, is because I believe we will be able to grow that TAM significantly. I don't think Nvidia is going to do anything to increase the TAM, because according to everything we've seen around Pascal, it's a high-end part. I don't know what the price is gonna be, but let's say it's as low as £500/$600 and as high as £800/$1000. That price range is not going to expand the TAM for VR. *We're going on the record right now to say Polaris will expand the TAM. Full stop.*"

Pretty much means inexpensive (read: budget) cards.


----------



## rv8000

Quote:


> Originally Posted by *BulletBait*
> 
> You mean the bunk benchmarks that were obviously underclocked @800 & 1050 and worse then a 'leaked' P11 benchmark? We've gone over the same benchmarks over and over and they just don't come *close* to matching what we're expecting from their own words, plus added educated guess work on the node shrink and architecture improvements.
> 
> I don't know what to expect. We'll see what happens, I'm trying to base what I say off their official statements and releases with a sprinkling of rumor. I may be being slightly optimistic, but not overly so. I don't think they'll release a 1080 competitor this launch, and if they happen to it won't be $300, I never said that. I think they will have one within spitting distance of the 1070 and yes it would be around that price point. They're not going to have a '1070' level card and charge more when they're at 15-20% market share and like 5% mind share, that's financial/market suicide, period.


I'm basically agreeing with your statement (in different words), hence the use of the word rumor, and stating that they would never charge $300 for a part that would seriously underperform in its price segment. The price will likely come down to yields and wafer cost, and while Fiji was a beast in terms of size and the likely cost of HBM, I can't see them undercutting a similar-performing part unless yields are amazing and P10 is in fact a very cheap chip to manufacture. At the same time, if they release P10 @ $300, it will be even worse for them as a company if it is barely outperforming Hawaii, performance we've had for several years at that same price range! So I still stand by the statement that P10 will be Hawaii performance @ $229, or parity with the 1070/980 Ti @ ~$400 (=DX11 +DX12).


----------



## airfathaaaaa

Quote:


> Originally Posted by *xxdarkreap3rxx*
> 
> Nvidia can always cut prices on their non-EOL cards (like AMD did with the 290/X when the 970 and 980 dropped) to counter Polaris and given what happened with the 390 and 390X pricing, I can't see Polaris cards being that great of a bargain.
> http://videocardz.com/59445/amd-polaris-aiming-at-vr-capable-graphics-cards
> 
> "If you look at the total install base of a Radeon 290, or a GTX 970 or above [the minimum specs required for VR], it's around 7.5 million units, but the issue is that if a publisher wants to sell a £40/$50 VR game, there's not a big enough market to justify that yet. We've got to prime the pumps, which means somebody has got to start writing cheques to big games publishers. Or we've got to increase the install TAM [total addressable market]."
> 
> "The reason Polaris is a big deal, is because I believe we will be able to grow that TAM significantly. I don't think Nvidia is going to do anything to increase the TAM, because according to everything we've seen around Pascal, it's a high-end part. I don't know what the price is gonna be, but let's say it's as low as £500/$600 and as high as £800/$1000. That price range is not going to expand the TAM for VR. *We're going on the record right now to say Polaris will expand the TAM. Full stop.*"
> 
> Pretty much means inexpensive (read: budget) cards.


I don't see how they're going to cut prices on non-EOL cards, since they've stopped production. If AMD turns out to have a great card, we'll see another 260/280 situation, which forced Nvidia to stop producing the 260 and lower the 280's price a lot.


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *airfathaaaaa*
> 
> i dont see how they gonna cut prices on *non eol cards since they stopped production* if amd turns out to have a great card we will see again a 260/280 situation that forced nvidia to stop producing 260 and lower the 280 price a lot


You do know what EOL means, right? Only the 970/980/980 Ti have been EOL'd, afaik.


----------



## KarathKasun

Quote:


> Originally Posted by *rv8000*
> 
> I'm basically agreeing (different words) with your statement, hence the use of the word rumor and stating how they would never charge $300 for a part that would seriously under perform in its' price segment. The price will likely come down to yields and wafer cost, and while Fiji was a beast in terms of size and likely the cost of HBM, I can't see them undercutting a similar performing part unless the yields are amazing and p10 is in fact a very cheap to manufacture chip. At the same time if they release p10 @ $300 it will be even worse for them as a company if it is barely out performing Hawaii, performance we've had for several years at that same price range! So I still stand by the statement that p10 will be Hawaii performance @ $229, or parity with 1070/980ti @ ~$400 (=dx11 +dx12).


The R9 390X is nearly at the $400 price point, not the $300 price point. P10 with 390X performance @ $300 is at least 20% cheaper.


----------



## prava

Quote:


> Originally Posted by *xxdarkreap3rxx*
> 
> Nvidia can always cut prices on their non-EOL cards (like AMD did with the 290/X when the 970 and 980 dropped) to counter Polaris, and given what happened with the 390 and 390X pricing, I can't see Polaris cards being that great of a bargain.
> http://videocardz.com/59445/amd-polaris-aiming-at-vr-capable-graphics-cards
> 
> "If you look at the total install base of a Radeon 290, or a GTX 970 or above [the minimum specs required for VR], it's around 7.5 million units, but the issue is that if a publisher wants to sell a £40/$50 VR game, there's not a big enough market to justify that yet. We've got to prime the pumps, which means somebody has got to start writing cheques to big games publishers. Or we've got to increase the install TAM [total addressable market]."
> 
> "The reason Polaris is a big deal, is because I believe we will be able to grow that TAM significantly. I don't think Nvidia is going to do anything to increase the TAM, because according to everything we've seen around Pascal, it's a high-end part. I don't know what the price is gonna be, but let's say it's as low as £500/$600 and as high as £800/$1000. That price range is not going to expand the TAM for VR. *We're going on the record right now to say Polaris will expand the TAM. Full stop.*"
> 
> Pretty much means inexpensive (read: budget) cards.


It doesn't. Polaris will surpass the Fury X. It simply has to. And why wouldn't it? If the 1080 allegedly beats the 980 Ti by 30% while drawing 180W... a Polaris could very well sit at 140W and equal a Fury X.

What we think will happen is that Polaris 10 won't compete with the 1080... but there is such a huge gap between a 1080 and a 390X that it would make zero sense to release something only as powerful as a 290X.

So no. "Budget cards" means they plan to release two different chips (probably four SKUs), and thus will offer a wider range than NVIDIA, who will only offer two SKUs for the moment.


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *prava*
> 
> It doesn't. Polaris will surpass the Fury X. It simply has to. And why wouldn't it? If the 1080 allegedly beats the 980 Ti by 30% while drawing 180W... a Polaris could very well sit at 140W and equal a Fury X.
>
> What we think will happen is that Polaris 10 won't compete with the 1080... but there is such a huge gap between a 1080 and a 390X that it would make zero sense to release something only as powerful as a 290X.
>
> So no. "Budget cards" means they plan to release two different chips (probably four SKUs), and thus will offer a wider range than NVIDIA, who will only offer two SKUs for the moment.


I sure hope they can beat a Fury X this year, because if they can't, that's a pretty massive failure. Time will tell.

As for why it might not: it's a very tiny die, around a third of Fiji XT. There's only so much you can do with that small an area, even if it's much denser.


----------



## airfathaaaaa

Quote:


> Originally Posted by *xxdarkreap3rxx*
> 
> You do know what EOL means right? Only the 970/980/980 Ti have been EOL'd afaik.


Exactly my point...
They are essentially moving 980 Ti performance down a tier, so whether Nvidia lowers prices on those cards or not, it doesn't matter if the new cards deliver — the old ones won't compete with them.


----------



## rv8000

Quote:


> Originally Posted by *KarathKasun*
> 
> R9 390x is nearly at the $400 price point, not the $300 price point. P10 with 390x performance @ $300 is %20 cheaper at least.


I referred to Hawaii, not a 390X. While the 290/290X have been replaced by the 300 series, there were times when a 290 was sub-$300, and certain 290X models were as low as $329 at points. On top of that you have the 390 @ around $310-320, which performs like the 290X due to clock bumps on the reference cards. Then you also have to consider the 970, which has hovered around the $310 mark for a long while now. And the used market... You really believe they're going to release a card that performs like a 390/390X/970/290/290X at $300, when that performance vs. price has been around for almost 2 years, and believe they will somehow gain market share?!


----------



## Redwoodz

Quote:


> Originally Posted by *zealord*
> 
> For me it sounded like he was talking about general performance.
> 
> I can be wrong though.
> 
> Yeah that is one thing I am curious about too. How high is the 1070 clocked and how much OC headroom is left.
> The best thing would be a complete surprise. Like a great card from AMD that performs like the GTX 1080, held completely under wraps. Nothing has been leaked, and then suddenly they release it, and it comes in at $549 (I am talking a real $549 here, not Nvidia's bogus MSRP pricing).
> 
> In terms of performance it seems like nothing of the sort is going to happen. I feel like in the months before major GPU releases we were able to pinpoint the performance of each GPU to within 5% from rumors and leaks alone.
> 
> One can still dream, right?


He was talking about performance per watt.

I think they overclocked it already, just to beat the 980 Ti. Look at the TDP; meanwhile, Polaris is underclocked and sipping power.

The reason is that Nvidia's new card is not that great compared to Maxwell in gaming, except for VR and 4K.

Of course I could be wrong, but I bet the GTX 1080 will show less than a 10% increase over the 980 Ti in most games and resolutions.
The price tells you the performance.


----------



## spyshagg

Quote:


> Originally Posted by *rv8000*
> 
> I referred to Hawaii, not a 390X. While the 290/290X have been replaced by the 300 series, there were times when a 290 was sub-$300, and certain 290X models were as low as $329 at points. On top of that you have the 390 @ around $310-320, which performs like the 290X due to clock bumps on the reference cards. Then you also have to consider the 970, which has hovered around the $310 mark for a long while now. And the used market... You really believe they're going to release a card that performs like a 390/390X/970/290/290X at $300, when that performance vs. price has been around for almost 2 years, and believe they will somehow gain market share?!


good point


----------



## KarathKasun

Quote:


> Originally Posted by *rv8000*
> 
> I referred to Hawaii, not a 390X. While the 290/290X have been replaced by the 300 series, there were times when a 290 was sub-$300, and certain 290X models were as low as $329 at points. On top of that you have the 390 @ around $310-320, which performs like the 290X due to clock bumps on the reference cards. Then you also have to consider the 970, which has hovered around the $310 mark for a long while now. And the used market... You really believe they're going to release a card that performs like a 390/390X/970/290/290X at $300, when that performance vs. price has been around for almost 2 years, and believe they will somehow gain market share?!


The vast majority do not buy used, and OEMs don't buy used. If NV has nothing to address the sub-$300 market, AMD will get more volume sales from OEMs. Period.

Don't talk about end users being smart. They buy GT 740s for $150 from Best Buy.


----------



## BulletBait

Fiji XT die size also includes HBM though. Anyways, that's what I meant by being overly optimistic. It may not be, but if I've learned anything since the Bulldozer/2xx releases, it's to have realistic optimism with AMD.

Anyways, we probably are saying the same thing, although I'm going by nV's MSRP, not the FE price. I would say the 'high end' of the price scale they might go would be ~350 compared to nV's 380 all things being equal. They just can't afford to go higher, they don't have the market or mind share to do that. They need to move volume, even if it kicks the margin down a couple points. If we were back a couple years before it dropped below 30% market share, I could grudgingly agree, right now, they can't afford it and I think they know it. Lisa did say they can't always be the cheaper option, but they know their market positioning and know they need to improve it. If they continue to lose market share this generation, it may just be the last generation of AMD cards we see.


----------



## rv8000

Quote:


> Originally Posted by *KarathKasun*
> 
> The vast majority do not buy used, and OEMs don't buy used. If NV has nothing to address the sub-$300 market, AMD will get more volume sales from OEMs. Period.


The vast majority don't buy $300 GPUs. What is your point? The 970 exists now, the 390 exists now, both at that performance/price point. And there will be a 1060 to replace the 970, likely at a lower price than $300. P10 is either 1070/980 Ti tier at a higher price than the rumor, or Hawaii performance for less than $250. Another $300 card with that performance will do nothing for AMD.


----------



## rbarrett96

I'm most concerned with how much VRAM it has. 4 GB compared to Nvidia's 8 GB was a joke, even for newer RAM.


----------



## KarathKasun

Quote:


> Originally Posted by *rv8000*
> 
> The vast majority don't buy $300 GPUs. What is your point? The 970 exists now, the 390 exists now, both at that performance/price point. And there will be a 1060 to replace the 970, likely at a lower price than $300. P10 is either 1070/980 Ti tier at a higher price than the rumor, or Hawaii performance for less than $250. Another $300 card with that performance will do nothing for AMD.


If the 1060 is not on the market for a few months, AMD is going to sweep that segment for that time, especially with OEMs.

It's probably going to be ~Fury performance at $299-$325. As of now it's all just conjecture, and most people here are not the "normal" market.
Quote:


> Originally Posted by *rbarrett96*
> 
> I'm most concerned with how much VRAM it has. 4 GB compared to Nvidia's 8GB was a joke. Even for newer RAM


8GB is worthless at the bandwidth numbers the 1080 is pushing, at least for single-GPU setups, much less for anything with plain GDDR5 on a 256-bit bus.


----------



## criminal

Quote:


> Originally Posted by *Redwoodz*
> 
> Of course I could be wrong, but I bet the GTX 1080 will show less than a 10% increase over the 980 Ti in most games and resolutions.
> *The price tells you the performance*.


Does it? Well, then it should be 50% faster than the 980 Ti, because the price is ridiculous.


----------



## rv8000

Quote:


> Originally Posted by *KarathKasun*
> 
> If the 1060 is not on the market for a few months, AMD is going to sweep that segment for that time. Especially with OEMs.
> 
> Its probably going to be ~Fury performance at $299-$325. As of now its all just conjecture, and most here are not the "normal" market.
> 8GB is worthless at the bandwidth numbers the 1080 is pushing, at least for single GPU setups. Much less anything with plain GDDR5/256b.


Fury performance would be the bare minimum at $299; the majority of people would flock straight to the 1070 if P10 came in @ $329, especially if Pascal ends up clocking like Maxwell. $50-80 isn't a huge step for users buying $300+ GPUs when you consider that a 980 Ti (1070) is significantly faster than a Fury in the large majority of released titles.


----------



## iLeakStuff

Not quite GTX 980 Ti performance. More like below the 390X.


----------



## spyshagg

We already have $300 cards that "expand the TAM". The TAM baseline is the minimum recommended VR card, the 970/390.

The point is that in order to expand it (bring users up to the recommended spec), you have to go lower in price.

That is why it makes no sense to predict Polaris 10 will be a 390X-class card for $300. We already have close to that now.


----------



## airfathaaaaa

Quote:


> Originally Posted by *iLeakStuff*
> 
> Not quite GTX 980Ti performance. More like below 390X


Are you actually going to justify a $700 card by going down to 1080p? What's next, benching the 1070 at 720p? Get real for once.


----------



## maltamonk

Quote:


> Originally Posted by *iLeakStuff*
> 
> Not quite GTX 980Ti performance. More like below 390X


Again, they have a disclaimer right at the beginning:
Quote:


> In our charts we are using average scores listed on Ashes website, however they might be manipulated by overclocked results, therefore they are not a good material for comparisons.


As well as the Polaris-specific one:
Quote:


> The same database also lists Polaris 10 and 11 GPUs. Unfortunately they were not tested in as many presets as GTX 1080 was. So direct comparison is problematic at this point.


----------



## rv8000

Quote:


> Originally Posted by *iLeakStuff*
> 
> Not quite GTX 980Ti performance. More like below 390X


I wouldn't trust ANYTHING extrapolated from the AOTS benchmark site. There are so many mislabeled entries it isn't even funny: 1080 results listed with no resolution @ Crazy, then 1440p @ Crazy with different results; Fury X results listed @ 1440p Crazy yet showing 4K for the resolution under info; dual-GPU scores listed as single GPUs; Fury X and 980 showing a HUGE difference at the same benchmark settings with no clock information. There is no baseline or safe way to compare scores when they aren't even labeled/compiled right in the first place.


----------



## iLeakStuff

Quote:


> Originally Posted by *airfathaaaaa*
> 
> are you actually now going to justify a 700$ card by going down on 1080p? what is next? benching 1070 on 720p? get real for once


Hey kid, the whole topic here is "near GTX 980 Ti performance".
It most certainly is not at 1080p. Not even remotely close.
If you think it will suddenly be much faster at 1440p or 4K with much lower bandwidth than the 980 Ti, you'd better go read about technology and stuff.


----------



## variant

Quote:


> Originally Posted by *iLeakStuff*
> 
> Hey kid, the whole topic here is "near GTX 980 Ti performance".
> It most certainly is not at 1080p. Not even remotely close.
> If you think it will suddenly be much faster at 1440p or 4K with much lower bandwidth than the 980 Ti, you'd better go read about technology and stuff.


So that's what you plan to take away from a dubious benchmark of a Polaris that everyone has suspected is the cut-down version? If anything can be taken away from it, it's that the cut-down version is at least at 390X level, which makes the full version more powerful than the 390X.


----------



## airfathaaaaa

Quote:


> Originally Posted by *iLeakStuff*
> 
> Hey kid, the whole topic here is "near GTX 980Ti performance"
> Its most certainly is not on 1080p. Not even remotely close.
> If you think that it will suddenly be much faster in 1440p or 4K with much lower bandwidth than 980Ti, you better go read about technology and stuff


How typical of you yet again, twisting everything just so it suits your needs.
As I said, next stop 720p, till we find the sweet spot of 25% faster.


----------



## iLeakStuff

Quote:


> Originally Posted by *variant*
> 
> So that's what you plan to take away from a dubious benchmark from a Polaris which everyone has suspected is the cutdown version? If anything can be taken away from it, it's that the cutdown version is at least 390X which makes the full version more powerful than the 390X.


That Polaris 10, cut down or full, isn't remotely close to the 390X, when the 390X beats the 980 Ti in AOTS.


----------



## airfathaaaaa

Quote:


> Originally Posted by *iLeakStuff*
> 
> That Polaris 10, cut down or full, isn't remotely close to the 390X, when the 390X beats the 980 Ti in AOTS.


Can you show us proof of that? Any benchmark? Anything REAL?


----------



## iLeakStuff

Quote:


> Originally Posted by *airfathaaaaa*
> 
> can you show us a proof of that? any benchmark about it ?anything REAL ?


Well, pretty close to the 980 Ti anyway.
(That is a 390, not a 390X.)


----------



## airfathaaaaa

Quote:


> Originally Posted by *iLeakStuff*
> 
> Well pretty close to 980Ti anyway
> (That is a 390, not 390X)


I'm sorry, where is Polaris 10 on this chart?


----------



## variant

Quote:


> Originally Posted by *iLeakStuff*
> 
> Well pretty close to 980Ti anyway
> (That is a 390, not 390X)


So does that benchmark use an i3, similar to the Polaris benchmark?


----------



## Sand3853

The whole problem I've seen with the P10 leaks thus far is that no one knows which P10 chip it is. I'm more inclined to believe that what we've seen is either severely cut down/underclocked, the mobile version of the chip, or the chip going into one of the refreshed consoles or an Apple iMac/MacBook (WWDC is less than a month away, and the new lineup should be revealed).

I don't think we've seen the full P10 at all, and AMD is keeping it under wraps quite nicely. Ultimately we know the aim for the Polaris chips is to bring out a new GCN architecture, be very energy efficient, and bring VR gaming to the masses, which means at the least matching the R9 290/390 at a lower price point, and more than likely matching or beating the Fury cards in performance. I think it's plain dumb to expect otherwise.


----------



## caenlen

lol, they all get so upset when you accidentally put the X after GDDR5. Ah, this is like the new GameFAQs, I love it, lulz.


----------



## BulletBait

The next chart down shows them all and is what should have been used, even though it's 1080p Low. It shows the disparity that's happening: a P10 averaging 20 and 50, and a P11 averaging 40. I can't trust these 'benchmarks' yet.


----------



## variant

I think it's more interesting that it only drops 3 average fps going from 4K Crazy to 5K Crazy. There's also almost no drop from Normal Batch to Medium Batch to Heavy Batch.

It's also interesting that it's only one of two benchmarks that are under 5K Crazy. Why bother testing it at that?

5K Polaris vs 5K Multi-GPU TitanX


----------



## BulletBait

Quote:


> Originally Posted by *variant*
> 
> I think it's more interesting that it only drops 3 average fps going from 4K Crazy to 5K Crazy. There's also almost no drop from Normal Batch to Medium Batch to Heavy Batch.
> 
> It's also interesting that it's only one of two benchmarks that are under 5K Crazy. Why bother testing it at that?
> 
> 5K Polaris vs 5K Multi-GPU TitanX


I'd also expect a larger performance hit going from 1080p Standard to 4K Crazy. It only took about a half-framerate hit. I'm a little inexperienced with 4K, I won't lie, but shouldn't the framerate have more than halved?

Besides, all the potential 'leaked' benchmarks are by different people at all the different resolutions, which is another reason I don't trust them.


----------



## BulletBait

I just thought I'd drop this here, since we all, on both the AMD and nV sides, have been arguing the exact opposite. An investment advisory company (read: fat-cat Wall Street stockbrokers XD) has settled the argument once and for all for us: $380-$450 is now officially 'mainstream.'
Quote:


> The cards are targeted at the high end and mai*n* stream PC gamers and enthusiasts


Fat Cats from Seeking Alpha

Now everybody have a laugh at the fat cats.


----------



## variant

Quote:


> Originally Posted by *BulletBait*
> 
> I just thought I'd drop this here since we've all, both AMD and nV sides, have been arguing the exact opposite. An investment advisement company (re: fat cat Wall Street stock brokers XD) has settled the argument once and for all for us, $380-$450 is now officially 'mainstream.'
> Fat Cats from Seeking Alpha
> 
> Now everybody have a laugh at the fat cats.


From that link:
Quote:


> Why they assumed that is perhaps revealing of the dynamics behind the recent surges in AMD's stock. Investment sentiment about AMD has become largely driven by a social media phenomenon that traffics in rumor and speculation about the company, playing on the hopes and fears of its investors.


What? AMD stock prices went up because they announced the deal with that Chinese company and three custom APUs for gaming consoles. A bunch of fanboys on forums don't determine stock prices of anything.


----------



## emeianoite

Quote:


> Originally Posted by *Newbie2009*
> 
> From what I have heard, this is going to blow Intel i7 series out of the water. Twice the power, half the price. All we have to do is wait.


Can we ban this guy for trolling with 6k posts?


----------



## airfathaaaaa

Quote:


> Originally Posted by *variant*
> 
> From that link:
> What? AMD stock prices went up because they announced the deal with that Chinese company and three custom APUs for gaming consoles. A bunch of fanboys on forums don't determine stock prices of anything.


Seeking Alpha is an Nvidia lapdog.
You should have seen the complete silence after the 57+ on the stock.


----------



## lahvie

Quote:


> Much of that rumor and speculation has been circulated about its rival Nvidia and the state of Pascal development. Earlier this year, following CES, there were numerous rumors that Pascal was "in trouble" as summarized by WCCF Tech. Huang had held up a Drive PX2 board during a CES presentation that was supposed to have Pascal GPU acceleration but in photos appeared to have older generation GPU's as a substitute. Clearly the board was a prop, not a working unit.


lmao

Edit: When I saw the video where they claimed the board was not a Pascal board, I knew it was either an accident, a squeaky duck for the dogs to fetch... or Nvidia blindly bluffing.

None of the three is something I would hang my name on.

Here boy, fetch.


----------



## BulletBait

Quote:


> Originally Posted by *variant*
> 
> From that link:
> What? AMD stock prices went up because they announced the deal with that Chinese company and three custom APUs for gaming consoles. *A bunch of fanboys on forums don't determine stock prices of anything.*


I'll be sure to quote that at an investor meeting when revenue and GAAP come up.

I'll drop the 'fanboy' and use 'consumer' instead. Something along the lines of: 'Revenue? Who cares? Consumers and their purchasing options don't determine the value of our stock.' Meeting adjourned; everyone go home and wait for those options to rocket now.


----------



## variant

Quote:


> Originally Posted by *BulletBait*
> 
> I'll be sure to quote that at an investor meeting when revenue and GAAP come up.
> 
> I'll drop the 'fanboy' and use 'consumer' instead. Something along the lines of, 'Revenue? Who cares? Consumers and their purchasing options don't determine the value of our stock.' Meeting ajourned, everyone go home and wait for those options to rocket now.


Fanboys will purchase their brand of choice whether it's good or not. They are not consumers who are swayed one way or the other, and they are counterbalanced by the fanboys on the other side, who are exactly the same.


----------



## Lee Patekar

Quote:


> Originally Posted by *variant*
> 
> Fanboys will purchase their brand of choice whether it's good or not. They are not consumers who are swayed one way or the other, and they are counterbalanced by the fanboys on the other side, who are exactly the same.


Thus the value of branding.


----------



## BulletBait

Quote:


> Originally Posted by *variant*
> 
> Fanboys will purchase their brand of choice whether it's good or not. They are not consumers who are swayed one way or the other, and they are counterbalanced by the fanboys on the other side, who are exactly the same.


That wasn't even my original point. My original point was that while most of us have been saying 'mainstream' is under $300, and about half have been saying under $250, they think $380-450 is mainstream. That's another $100-200 higher than what we 'quoted.'

We were all supposed to have a jolly old laugh and brighten the day some. I didn't figure it'd start a 'fanboy' argument.


----------



## Lee Patekar

Quote:


> Originally Posted by *BulletBait*
> 
> That wasn't even my original point. My original point was that while most of us have been saying 'mainstream' is under $300, and about half have been saying under $250, they think $380-450 is mainstream. That's another $100-200 higher than what we 'quoted.'
>
> We were all supposed to have a jolly old laugh and brighten the day some. I didn't figure it'd start a 'fanboy' argument.


If that's truly the case, I may shed a few tears for my wallet. Or maybe cheer that AMD's rumors of Polaris being mainstream aren't that bad :^)


----------



## Vintage

Doesn't seem too unlikely. The GTX 1070 reportedly has performance similar to a 980 Ti at $379. It's not implausible that AMD could match it at a slightly lower price point.

Will wait and see as always.


----------



## BulletBait

Quote:


> Originally Posted by *Lee Patekar*
> 
> If that's truly the case I may shed a few tears for my wallet. Or maybe cheer that the AMD's Polaris rumors of it being mainstream aren't that bad :^)


That would put everything below a Fury as midrange, going by last generation's release prices, if we use the $450 FE price; everything below a 390 if we use the $380 MSRP.

290/280X for the generation before that, since everyone blathers on about rebrands.

Edit: It makes me wonder what they consider 'budget.' $350? $300?


----------



## KGPrime

As far as "mainstream" goes, that article isn't _really_ talking about mainstream PC consumers. It's talking about the range from mainstream PC gamers through performance-minded PC gamers to enthusiasts.

"The cards are targeted at the high end and *main stream PC gamers* and enthusiasts."

I would consider myself performance-minded but in general "mainstream", and I buy the $350-range card on principle; anywhere from $320-$360 is fine, depending. $380 is perfectly acceptable with 8 GB of RAM and perhaps an MSI Twin Frozr cooler all day long, and there will be sales, rebates and coupons.

x70 = mainstream PC gamer. A single 980 Ti equals high end, and enthusiast = Crossfire/SLI 980 Tis, a Titan, etc.

"Mainstream" everything else is general desktop or notebook users, x40/x50 series, if even that. Intel HD, even.


----------



## BulletBait

Only 'gamers' buy dGPUs. Everyone else uses whatever comes stock, whether that's a cheap OEM-provided dGPU or, more likely, an iGPU.

The only other market is server cards, which are even more expensive. Therefore, my statement stands for now.

Edit: We said the same thing and drew different conclusions...


----------



## variant

Quote:


> Originally Posted by *BulletBait*
> 
> We were all supposed to have a jolly old laugh and brighten the day some. I didn't figure it'd start a 'fanboy' argument


My apologies then.

Quote:


> Originally Posted by *BulletBait*
> 
> Only 'gamers' buy dGPUs. Everyone else uses whatever comes stock whether that's a cheap OEM provided dGPU or more likely, an iGPU.
> 
> The only other market is server cards, which are even more expensive. Therefore, my statement stands for now.
> 
> Edit: We said the same thing and draw different conclusions...


Yep, people buy what they can afford to buy. That says nothing about how enthusiastic they are about gaming.


----------



## NicksTricks007

Found this article from a Thai website called Zolkorn.

Source

I didn't want to translate it all, but VC did a mini article about it HERE

If true, then AMD may be prepping for a paper launch the day before GTX 1080 becomes available.


----------



## 12Cores

A ~250mm 14nm die with GDDR5X will probably get pretty close to 980 Ti performance; it's all going to depend on what they charge for such a card. It's going to be really hard for AMD to screw this one up, even with their limited resources.


----------



## Forceman

There has been no indication that P10 is using GDDR5X.


----------



## Slomo4shO

Quote:


> Originally Posted by *Forceman*
> 
> There has been no indication that P10 is using GDDR5X.


I don't see the appeal of GDDR5X. It may have been cheaper to go with a larger bus than to incorporate more expensive RAM.


----------



## Forceman

Quote:


> Originally Posted by *Slomo4shO*
> 
> I don't see the appeal of GDDR5X. May have been cheaper to go with a larger bus than incorporating more expensive ram.


It probably had more appeal on 28nm, where die size was at a premium, so fewer memory controllers helped save some space. Now it's not such a big concern, but it still saves power (I guess), so that's something. Not sure what the impact on latency is.


----------



## Majin SSJ Eric

What I find interesting is just how well AMD has kept any and all information about Polaris under wraps. They are being extremely quiet about their new chips, and that can only mean one of two things: either Polaris is going to be way better than anybody thinks, or it's going to be a complete disaster. Given AMD's track record it's hard to be overly optimistic, but then again, remember the bombshell they dropped with Eyefinity...


----------



## variant

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> What I find interesting is just how well AMD has kept any and all information about Polaris under wraps. They are being extremely quiet about their new chips and that can only mean one of two things: Either Polaris is going to be way better then anybody thinks or it's going to be a complete disaster. Given AMD's track record it's hard to be overly optimistic but then again remember the bombshell they dropped with Eyefinity...


There's a third, more neutral option, where AMD does what would be expected, which is releasing competitive cards for their tier bracket.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *variant*
> 
> There's a third, more neutral option, where AMD does what would be expected, which is releasing competitive cards for their tier bracket.


I think it's fair to say that if some of the rumors are true and AMD releases a Polaris card that is only marginally better than a 390X, that would qualify as a total disaster. Maybe I'm just being overly optimistic, but it's certainly possible that the reason they are being so quiet right now is that they want to release a bombshell of a card. Given their latest releases, though, I have my doubts.


----------



## renx

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I think it's fair to say that if some of the alleged rumors are true and AMD releases a Polaris card that is marginally better than a 390 X that would qualify as a total disaster. Maybe I'm just being overly optimistic about things but it's certainly possible that the reason they are being so quiet right now is because they want to release a bombshell of a card. Given their latest releases though, I have my doubts.


Bombshell gets my vote.
Remember that "AMD's Revolutionary 14nm FinFET Polaris GPU Architecture" video?
They seemed so excited and talked like ascended masters. It felt like they had something big.
So I vote bombshell.


----------



## Buris

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I think it's fair to say that if some of the alleged rumors are true and AMD releases a Polaris card that is marginally better than a 390 X that would qualify as a total disaster. Maybe I'm just being overly optimistic about things but it's certainly possible that the reason they are being so quiet right now is because they want to release a bombshell of a card. Given their latest releases though, I have my doubts.


The best leaks about Polaris have actually been coming out of the console market.

We see that P10 is being put into $400 consoles (PS4K, Xbox 1.5, NX).

They are presumably capable of 4K, they have improved compression tech, 14nm... Architecturally they could be anywhere from on par with Nvidia to far ahead of them, as per usual.

P10 *is* a midrange card, but in the PS4K it's clocked at around 900MHz, and console GPU clocks are usually significantly reduced compared to discrete graphics cards.

I'm going to go out on a limb here and say there will be a $130 P11 meant to dethrone the 750 Ti (460), a $170 P11 with near-Tahiti performance, a $240 P10 with 390X performance, and a $300-ish P10 that matches the 980 Ti.

AMD is going to play it safe with Polaris; they're focusing on getting their chips into laptops, tablets, and midrange PCs.

Nvidia is fully leveraging its market share to sell as many midrange chips, aka GTX 1080s and 1070s, as fast as possible and for as much money as possible. AMD will sacrifice some of its profits to get a greater foothold in the GPU market as a whole, so that developers code for AMD by default, as AMD does not have a real answer to GameWorks.

This year will obviously make or break AMD on both the CPU and GPU fronts.


----------



## zealord

Quote:


> Originally Posted by *Buris*
> 
> The best leaks about polaris have actually been coming out of the console market-
> 
> We see that p10 is being put into 400$ consoles. (ps4k, xbox1.5, NX)
> 
> They are presumably capable of 4k, they have improved compression tech, 14nm... Architecturally they could be anywhere from on-par with nvidia to far ahead of them, as per usual
> 
> P10 *is* a midranged card, but on the PS4K it's clocked at around 900mhz- Console GPU clocks are usually significantly reduced compared to discrete graphics cards.
> 
> I'm going to go out on a limb here and say there will be a 130$ p11 meant to dethrone the 750 Ti(460), 170$ P11 with near-tahiti performance, 240$ P10 with 390x performance and a 300-ish P10 that matches the 980Ti.
> 
> AMD is going to play it safe with polaris, they're focusing on getting their chips in laptops, tablets, and midranged PC's
> 
> Nvidia is fully leveraging their market share to sell as many mid-range chips, aka GTX 1080's and 1070's as fast as possible and for as much money as possible. AMD will sacrifice some of their profits to get a greater foothold in GPU market as a whole so that developers code for AMD by default, as AMD does not have a real answer to GameWorks
> 
> This year will obviously make or break AMD on both the CPU and GPU front


Yeah, I've seen those rumors too: a Polaris GPU with 36 CUs at a 911 MHz clock. It should be around 2.3x (or even more) as fast as the original PS4.

I still have like 0.1% hope left that AMD has kept a somewhat better Polaris card secret and will surprise us.
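Quick back-of-envelope on that, assuming the usual 64 shaders per GCN CU and the stock PS4 at 18 CUs / 800 MHz:

```python
# GCN peak throughput: CUs * 64 shaders * 2 ops per clock (FMA) * clock
def gcn_tflops(cus, mhz):
    return cus * 64 * 2 * mhz * 1e6 / 1e12

ps4 = gcn_tflops(18, 800)   # original PS4: ~1.84 TFLOPS
p10 = gcn_tflops(36, 911)   # rumored spec: ~4.2 TFLOPS
print(f"{p10 / ps4:.2f}x")  # 2.28x, right around that 2.3x figure
```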


----------



## Buris

Quote:


> Originally Posted by *zealord*
> 
> Yeah, I've seen those rumors too: a Polaris GPU with 36 CUs at a 911 MHz clock. It should be around 2.3x (or even more) as fast as the original PS4.
> 
> I still have like 0.1% hope left that AMD has kept a somewhat better Polaris card secret and will surprise us.


It's very possible that there will be a 40 CU P10, but it looks as if AMD isn't going for more cores this time around; they're more interested in improving their GCN architecture.

A P10 chip on a discrete graphics card should be capable of around 1400-1500MHz. That's a pretty moderate estimate, considering Pascal is overclocking to around 2GHz.


----------



## renx

Quote:


> Originally Posted by *Buris*
> 
> A P10 chip on a discrete graphics card should be capable of around 1400-1500MHz. That's a pretty moderate estimate, considering Pascal is overclocking to around 2GHz.


IMHO, comparing clock rates between different architectures has never been relevant.


----------



## YellowBlackGod

I have faith in AMD and Polaris. We shouldn't forget that we blamed them for their "old" architecture, yet it has proven to be the more future-proof and mature one, with GPUs offering great performance and price. The same goes for driver support after the Crimson release.


----------



## ToTheSun!

I'm super hyped for the 1080, but i'd be really glad to see a very competitive alternative from RTG. I like Raja and his vision for graphics technology.


----------



## azanimefan

Quote:


> Originally Posted by *Malinkadink*
> 
> How is it fair to compare an overclocked 980 Ti with a stock 1080? *The 1080s will clock to 2k for probably anyone who wants to go that high if they showed off 2100 during the reveal.*


the base clock speed on the 1080 is 1850MHz; 2100 is only about a 13.5% overclock.


----------



## flopper

Quote:


> Originally Posted by *Buris*
> 
> It's very possible that there will be a 40 CU P10, but it looks as if AMD isn't going for more cores this time around; they're more interested in improving their GCN architecture.
> 
> A P10 chip on a discrete graphics card should be capable of around 1400-1500MHz. That's a pretty moderate estimate, considering Pascal is overclocking to around 2GHz.


OC should be decent, at least on FinFET.
We've seen previously how well GCN holds up once you OC it.
While that might push a chip beyond its desired perf/watt target, it might also bring superb fps along with it.


----------



## Serios

Quote:


> Originally Posted by *tajoh111*
> 
> The parallels between the tonga/gm204 and GP204/Polaris 10 are definitely there as there is a very good chance the same amount of shaders are present in both products and they likely share the same 256 bit bus.
> 
> The problem for AMD is their performance per transistor is inferior to Nvidias which is well represented by the gulf between the gtx 980/380x. 5.2 billion transistors vs 5 billion respectively and Nvidia card is 51% faster or in the case of the gtx 970 vs 380x which is more representative of the upcoming launch of the 480x vs 1070, 30% faster with the gtx 970 consuming less watts to boot.


You are not very convincing.
Tonga is not just a pure gaming GPU; what about compute and double precision?
The 980 is simply faster because it is better optimized for graphics and DX11.
We all know that Polaris 10 will be a new design, but we lack details like clocks and the architectural optimizations that improve performance in games.

I don't know how fast Polaris 10 will be, but Raja said people won't be disappointed, so we have that until they at least paper launch Polaris.


----------



## legend999

Gigabyte is announcing this for the 1080:
Quote:


> Core
> Boost: 1733 MHz/ Base: 1607 MHz


How is that 2GHz?

Same for GALAX and inno3d.


----------



## Majin SSJ Eric

Well one thing is for sure when looking at the 1080 release pricing, Nvidia certainly doesn't think Polaris 10 is going to be any competition at all. For whatever it's worth.


----------



## BulletBait

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Well one thing is for sure when looking at the 1080 release pricing, Nvidia certainly doesn't think Polaris 10 is going to be any competition at all. For whatever it's worth.


I'm going to go with this guy's answer to that one.
Quote:


> Originally Posted by *Buris*
> 
> *Nvidia is fully leveraging their market share to sell as many mid-range chips, aka GTX 1080's and 1070's as fast as possible and for as much money as possible.* AMD will sacrifice some of their profits to get a greater foothold in GPU market as a whole so that developers code for AMD by default, as AMD does not have a real answer to GameWorks


As in I do think something really spooked nV into paper launching theirs early. I know they had a launch planned, probably around Computex as well, but that PR conference just seemed so rushed and uncoordinated for it to have been planned that way.

Does that mean that P10/11 will actually come out and knock nV for a change? Maybe, I'm somewhere around 20/20/40/20 for it to be great/good/bad/terrible for AMD against nV. Like I've said, I'm cautiously optimistic for AMD, I've learned to temper my optimism for them since Bulldozer. I actually just completely ignore their marketing department or listen with a long ton worth of salt.

Even if it's a disaster, I'm 50/50 on picking one up anyway just to see how it performs with my own eyes. I've also grown increasingly skeptical of review sites these days. I've usually experienced better performance, even stock, than what they show for AMD. But that's a personal experience, and not applicable to anyone but myself.


----------



## tajoh111

Quote:


> Originally Posted by *Serios*
> 
> You are not very convincing.
> Tonga is not just a pure gaming GPU; what about compute and double precision?
> The 980 is simply faster because it is better optimized for graphics and DX11.
> We all know that Polaris 10 will be a new design, but we lack details like clocks and the architectural optimizations that improve performance in games.
> 
> I don't know how fast Polaris 10 will be, but Raja said people won't be disappointed, so we have that until they at least paper launch Polaris.


Tonga is a pure gaming chip. It has had its double precision capability removed, like Fury X. Tahiti retained its compute ability, along with Hawaii, so Tonga made sacrifices for gaming's sake if you compare it to Tahiti. Maxwell, on the other hand, although it had its DP removed, made strides to become a better GPU at compute tasks. Like Tonga and Fury, it had its double precision removed, at least compared to GK110. But when you compare their performance in compute tasks, particularly OpenCL tasks, they are much better. The M6000 is actually a good professional card for the most part and beats the Kepler-based K6000 in most professional application benchmarks. Since GM200 is just GM204 on steroids, the same applies to GM204. The same cannot really be said of the GCN derivative found in Tonga and Fiji.

http://hothardware.com/reviews/amd-radeon-pro-duo-benchmarks

The best way to see how good a GPU is at compute tasks is to test it in professional applications. The Fury series, which has the same version of GCN as Tonga, just doesn't do that well in professional tasks, as seen in the above review. Remember, this is a dual-GPU card and it loses out to Maxwell. If Maxwell is so gaming focused, it should be doing worse in professional applications, but look at the review below.

http://hothardware.com/reviews/nvidias-quadro-m6000-maxwell-goes-workstation?page=3

Maxwell pretty much sweeps Kepler. It beats Kepler everywhere except in instances of double precision use. To say Maxwell was built only for gaming doesn't do it justice.

Tonga on paper should have been a significantly better gaming part than Tahiti. What Tonga added over Tahiti was an improved front end to increase its geometry performance, which is mostly for gaming applications. It also removed certain features from Tahiti, which made it more energy efficient and a better gaming part. This last point partially applies to Nvidia as well, but Nvidia's power savings come from its improved scheduler, its new cache arrangement, and its more efficient use of its cores. It's why Nvidia got a much bigger increase in efficiency compared to AMD when both dropped double precision.


----------



## EightDee8D

Quote:


> Originally Posted by *BulletBait*
> 
> I'm going to go with this guy's answer to that one.
> As in I do think something really spooked nV into paper launching theirs early. I know they had a launch planned, probably around Computex as well, but that PR conference just seemed so rushed and uncoordinated for it to have been planned that way.
> 
> Does that mean that P10/11 will actually come out and knock nV for a change? Maybe, I'm somewhere around 20/20/40/20 for it to be great/good/bad/terrible for AMD against nV. Like I've said, I'm cautiously optimistic for AMD, I've learned to temper my optimism for them since Bulldozer. I actually just completely ignore their marketing department or listen with a long ton worth of salt.
> 
> Even if it's a disaster, I'm 50/50 on picking one up anyway just to see how it performs with my own eyes. *I've also grown increasingly skeptical of review sites these days. I've usually experienced better performance, even stock, than what they show for AMD*. But that's a personal experience, and not applicable to anyone but myself.


That's one of the reasons why I'm salty about Nvidia nowadays. Last year, when I used a 970 and a 290 in the same SB-E platform, my experience was totally different from what review sites or even some forum warriors talk about. Apart from power consumption, the AMD card was better in every single way I used it; interestingly, it performed way better in GW games such as WD/FC4 etc. Nowadays I don't really trust those reviews at all, even NV GPU reviews for that matter. I mean, just look at how all those tech sites jump on AMD's every little problem and never forget to get some traffic out of it, but somehow they don't really care about NV's shortcomings, like it's not a big deal at all unless ofc people complain about it everywhere.

For example, just look at PCPer's Pro Duo review: all the games are GW titles. I mean, seriously? And that same site avoided a GTX 970 SLI FCAT review because "reasons" (correct me if they tested it later).


----------



## ebduncan

Quote:


> Originally Posted by *rv8000*
> 
> I'm basically agreeing (different words) with your statement, hence the use of the word rumor and stating how they would never charge $300 for a part that would seriously under perform in its' price segment. The price will likely come down to yields and wafer cost, and while Fiji was a beast in terms of size and likely the cost of HBM, I can't see them undercutting a similar performing part unless the yields are amazing and p10 is in fact a very cheap to manufacture chip. At the same time if they release p10 @ $300 it will be even worse for them as a company if it is barely out performing Hawaii, performance we've had for several years at that same price range! So I still stand by the statement that p10 will be Hawaii performance @ $229, or parity with 1070/980ti @ ~$400 (=dx11 +dx12).


Well, the cost of making the die has gone up. This is because with FinFETs at 14/16nm you are forced to use double patterning instead of single-mask imaging. However, the size of the die has likely gone down (which usually means better yields). It's hard to predict the actual cost, but I'd expect the new chips to be relatively inexpensive to produce.
Quote:


> Originally Posted by *rv8000*
> 
> The vast majority don't buy $300 GPUS. What is your point? The 970 exists now, the 390 exists now, both at that performance/price point. And there will be a 1060 gpu to replace the 970, likely at a lower mark than $300. P10 is either 1070/980ti tier at a higher price than the rumor, or Hawaii performance for less than $250. Another $300 card with that performance will do nothing for AMD.


Pretty sure the 1060 would be a different die, so Nvidia won't be releasing one anytime soon unless it's another cut-down GP104.
Quote:


> Originally Posted by *Slomo4shO*
> 
> I don't see the appeal of GDDR5X. May have been cheaper to go with a larger bus than incorporating more expensive ram.


More bandwidth with fewer memory controllers, saves power.
Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I think it's fair to say that if some of the alleged rumors are true and AMD releases a Polaris card that is marginally better than a 390 X that would qualify as a total disaster. Maybe I'm just being overly optimistic about things but it's certainly possible that the reason they are being so quiet right now is because they want to release a bombshell of a card. Given their latest releases though, I have my doubts.


Not entirely; remember, price is a thing as well. If they release a $200 card that performs like the 390X and uses little power, they have a winner.
Quote:


> Originally Posted by *Buris*
> 
> It's very possible that there will be a 40 CU P10, but it looks as if AMD isn't going for more cores this time around; they're more interested in improving their GCN architecture.
> 
> A P10 chip on a discrete graphics card should be capable of around 1400-1500MHz. That's a pretty moderate estimate, considering Pascal is overclocking to around 2GHz.


FinFETs should bring an increase in clock speeds, but there is no way to tell. Comparing Pascal to Polaris is like comparing Bulldozer's clock speed to Intel's: it's pointless to compare clock speed. Most users will be interested in overall performance instead.

Quote:


> Originally Posted by *Serios*
> 
> You are not very convincing.
> Tonga is not just a pure gaming GPU; what about compute and double precision?
> The 980 is simply faster because it is better optimized for graphics and DX11.
> We all know that Polaris 10 will be a new design, but we lack details like clocks and the architectural optimizations that improve performance in games.
> 
> I don't know how fast Polaris 10 will be, but Raja said people won't be disappointed, so we have that until they at least paper launch Polaris.


Tonga has all of its compute, even DP; AMD just limits DP throughput to 1/16th of the total on consumer cards. It's not disabled. You would need to purchase a workstation card to get the full DP performance.

Quote:


> Originally Posted by *tajoh111*
> 
> Tonga is a pure gaming chip. It has had its double precision capability removed, like Fury X. Tahiti retained its compute ability, along with Hawaii, so Tonga made sacrifices for gaming's sake if you compare it to Tahiti. Maxwell, on the other hand, although it had its DP removed, made strides to become a better GPU at compute tasks. Like Tonga and Fury, it had its double precision removed, at least compared to GK110. But when you compare their performance in compute tasks, particularly OpenCL tasks, they are much better. The M6000 is actually a good professional card for the most part and beats the Kepler-based K6000 in most professional application benchmarks. Since GM200 is just GM204 on steroids, the same applies to GM204. The same cannot really be said of the GCN derivative found in Tonga and Fiji.
> 
> http://hothardware.com/reviews/amd-radeon-pro-duo-benchmarks
> 
> The best way to see how good a GPU is at compute tasks is to test it in professional applications. The Fury series, which has the same version of GCN as Tonga, just doesn't do that well in professional tasks, as seen in the above review. Remember, this is a dual-GPU card and it loses out to Maxwell. If Maxwell is so gaming focused, it should be doing worse in professional applications, but look at the review below.
> 
> http://hothardware.com/reviews/nvidias-quadro-m6000-maxwell-goes-workstation?page=3
> 
> Maxwell pretty much sweeps Kepler. It beats Kepler everywhere except in instances of double precision use. To say Maxwell was built only for gaming doesn't do it justice.
> 
> Tonga on paper should have been a significantly better gaming part than Tahiti. What Tonga added over Tahiti was an improved front end to increase its geometry performance, which is mostly for gaming applications. It also removed certain features from Tahiti, which made it more energy efficient and a better gaming part. This last point partially applies to Nvidia as well, but Nvidia's power savings come from its improved scheduler, its new cache arrangement, and its more efficient use of its cores. It's why Nvidia got a much bigger increase in efficiency compared to AMD when both dropped double precision.


AMD never removed DP from their consumer cards; they only limited its throughput to 1/16th of that of the workstation cards based on the same SKU. Also, AMD generally smokes Nvidia in compute.


----------



## Serios

Quote:


> Originally Posted by *tajoh111*
> 
> Tonga is a pure gaming chip. It has had its double precision capability removed, like Fury X. Tahiti retained its compute ability, along with Hawaii, so Tonga made sacrifices for gaming's sake if you compare it to Tahiti. Maxwell, on the other hand, although it had its DP removed, made strides to become a better GPU at compute tasks. Like Tonga and Fury, it had its double precision removed, at least compared to GK110. But when you compare their performance in compute tasks, particularly OpenCL tasks, they are much better. The M6000 is actually a good professional card for the most part and beats the Kepler-based K6000 in most professional application benchmarks. Since GM200 is just GM204 on steroids, the same applies to GM204. The same cannot really be said of the GCN derivative found in Tonga and Fiji.
> 
> http://hothardware.com/reviews/amd-radeon-pro-duo-benchmarks
> 
> The best way to see how good a GPU is at compute tasks is to test it in professional applications. The Fury series, which has the same version of GCN as Tonga, just doesn't do that well in professional tasks, as seen in the above review. Remember, this is a dual-GPU card and it loses out to Maxwell. If Maxwell is so gaming focused, it should be doing worse in professional applications, but look at the review below.
> 
> http://hothardware.com/reviews/nvidias-quadro-m6000-maxwell-goes-workstation?page=3
> 
> Maxwell pretty much sweeps Kepler. It beats Kepler everywhere except in instances of double precision use. To say Maxwell was built only for gaming doesn't do it justice.
> 
> Tonga on paper should have been a significantly better gaming part than Tahiti. What Tonga added over Tahiti was an improved front end to increase its geometry performance, which is mostly for gaming applications. It also removed certain features from Tahiti, which made it more energy efficient and a better gaming part. This last point partially applies to Nvidia as well, but Nvidia's power savings come from its improved scheduler, its new cache arrangement, and its more efficient use of its cores. It's why Nvidia got a much bigger increase in efficiency compared to AMD when both dropped double precision.


Tonga, a pure gaming chip?
What about the FirePro W7100, FirePro S7150, and S7150 x2?


----------



## BulletBait

Quote:


> Originally Posted by *EightDee8D*
> 
> That's one of the reason why i'm salty against nvidia nowadays. last year when i used 970 and 290 in same SB-E platform, my experience was totally different from what review sites or even some forum warriors talk about. apart from power consumption, amd card was better in every single way i used it. interestingly it performed way better in GW games such as wd/fc4 etc. nowadays i don't really trust those reviews at all, even nv gpu reviews for that matter. i mean just look how all those techsites look on amd's every little problem and never forget to get some traffic out of it. but somehow they don't really care nv's shortcomings. like it's not a big deal at all unless ofc people complain about it everywhere.
> 
> For example, just look at pcper's Pro duo review, all games are GW titles. i mean seriously ? and that same site avoided Gtx970 sli fcat review because "reasons" ( correct me if they tested it later)


Don't forget, power consumption (perf/W) suddenly mattered the last couple of generations. Even money it suddenly won't matter again if the Polaris power drop is as good as rumored.


----------



## EightDee8D

Quote:


> Originally Posted by *BulletBait*
> 
> Don't forget power consumption (p/w) suddenly mattered the last couple generations. Even money it suddenly won't matter again if Polaris power drop is as good as rumored.


Yep. Well, the Fury series already matches GM200 on perf/W. Pascal has improved perf/W by only 65% in the best case, and Polaris will bring a 2x-2.5x perf/W improvement, so I can see it won't matter anymore.


----------



## BulletBait

Quote:


> Originally Posted by *EightDee8D*
> 
> Yep, Well fury series already matches gm200 on p/w. pascal has improved p/w by only 65% in best case scenario and polaris will bring 2x/2.5x p/w improvement. so i can see it won't matter anymore.


That's total overall. So if they're talking about 390x performance, we should see a 110w card. I've heard *rumors* of a 120w 'top' card, which would mean a ~10% performance increase and ~130% efficiency increase or 140w 'top' card, which would be a ~40% performance increase and ~90% efficiency increase over the 390x. *IF* their 2.5x p/w claim holds true, I'm expecting a more ballpark 2-2.3 on the claim.
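A rough sketch of that perf/W arithmetic in Python, assuming a 275 W 390X as the baseline (the ratios shift with whatever baseline power figure you pick, which is probably why estimates vary):

```python
# Perf/W scaling sketch, assuming a 275 W R9 390X as the baseline.
R390X_WATTS = 275.0

def efficiency_gain(new_watts, perf_vs_390x):
    """Relative perf/W vs. the 390X for a card drawing `new_watts`
    at `perf_vs_390x` times the 390X's performance."""
    return perf_vs_390x / (new_watts / R390X_WATTS)

# AMD's 2.5x perf/W claim at 390X-level performance implies roughly a 110 W card:
print(R390X_WATTS / 2.5)           # 110.0
# A rumored 120 W card at +10% performance works out to:
print(efficiency_gain(120, 1.10))  # ~2.52x the 390X's perf/W
```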

Of course, we won't know until end of May. I'm excited, but I'm not going to roll out the red carpet for a hype train this early into the game. The P11 vs 950 demonstration looked promising with 1/2 the power consumption, but we don't know what that chip's or the P11 in general performance capability is compared to a 950. So we can't form any real conclusions when we're missing half of the puzzle.


----------



## EightDee8D

Quote:


> Originally Posted by *BulletBait*
> 
> That's total overall. So if they're talking about 390x performance, we should see a 110w card. I've heard *rumors* of a 120w 'top' card, which would mean a ~10% performance increase and ~130% efficiency increase or 140w 'top' card, which would be a ~40% performance increase and ~90% efficiency increase over the 390x. *IF* their 2.5x p/w claim holds true, I'm expecting a more ballpark 2-2.3 on the claim.
> 
> Of course, we won't know until end of May. *I'm excited, but I'm not going to roll out the red carpet for a hype train this early into the game*. The P11 vs 950 demonstration looked promising with 1/2 the power consumption, but we don't know what that chip's or the P11 in general performance capability is compared to a 950. So we can't form any real conclusions when we're missing half of the puzzle.


Oh, me too. im just talking about efficiency here. for performance we have to wait and see.


----------



## flopper

Quote:


> Originally Posted by *ebduncan*
> 
> Not entirely, remember price is a thing as well, if they release a 200$ card that performs like the 390x and uses little power, they have a winner.
> .


can't take a breath


----------



## bigjdubb

So what's the latest with the AMD launch? I got the Nvidia stuff out of my system now and I am really wanting to see what AMD is bringing to the table.


----------



## ebduncan

Quote:


> Originally Posted by *BulletBait*
> 
> That's total overall. So if they're talking about 390x performance, we should see a 110w card. I've heard *rumors* of a 120w 'top' card, which would mean a ~10% performance increase and ~130% efficiency increase or 140w 'top' card, which would be a ~40% performance increase and ~90% efficiency increase over the 390x. *IF* their 2.5x p/w claim holds true, I'm expecting a more ballpark 2-2.3 on the claim.
> 
> Of course, we won't know until end of May. I'm excited, but I'm not going to roll out the red carpet for a hype train this early into the game. The P11 vs 950 demonstration looked promising with 1/2 the power consumption, but we don't know what that chip's or the P11 in general performance capability is compared to a 950. So we can't form any real conclusions when we're missing half of the puzzle.


P11 is mostly for the mobile side of things. We may not see desktop SKUs at all; if we do, they will take over the spots of current cards like the 370/360 and will not require an external power connector. Could be aimed at OEMs.
Quote:


> Originally Posted by *flopper*
> 
> cant take a breath


ha

P10 = 256-bit memory bus at 8 Gbps, smaller die than GP104.
GTX 1070 = 256-bit memory bus at 8 Gbps, larger die than P10.
GTX 1080 = 256-bit memory bus at 10 Gbps, larger die than P10.

This sums up what we will likely see. The 1070 and P10 will be limited by their memory bandwidth; advantage AMD in performance per watt due to the smaller die. The GTX 1080, on the other hand, will be the hands-down victor in terms of total performance, but will also consume more power and be priced well higher.

Pricing-wise we've heard from Nvidia, not from AMD. With AMD having the smaller die and likely offering similar performance to the 1070, we should expect them to be very aggressive with their price.
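Taking those bus widths and data rates at face value, the peak bandwidth arithmetic is simple:

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8 bits per byte) * per-pin rate in Gbps
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(256, 8))   # 256.0 GB/s -- P10 and GTX 1070 (8 Gbps GDDR5)
print(bandwidth_gbs(256, 10))  # 320.0 GB/s -- GTX 1080 (10 Gbps GDDR5X)
```

So the rumored P10 and the 1070 would sit at the same ~256 GB/s, which is why both look bandwidth-limited next to the 1080.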


----------



## BulletBait

Quote:


> Originally Posted by *EightDee8D*
> 
> Oh, me too. im just talking about efficiency here. for performance we have to wait and see.


Right, well the problem with AMD's slides is that they wrapped the performance and efficiency increases together, twice... if you look at their GPU roadmap and FinFET vs. planar slides. So if they go all-out on efficiency with Polaris and then Vega jumps performance instead, we'd see Polaris-level TDPs with a 30% or so performance jump.

Their slides are dumb and confusing, though. I also wonder how much HBM2 plays into their Vega performance expectations.

Edit so no double post:
Quote:


> Originally Posted by *ebduncan*
> 
> This sums up what we will likely see. The 1070 and p10 will be limited by their memory bandwidth, advantage AMD in performance per watt due to smaller die. The 1080gtx on the other hand will be the hands down victor in terms of total performance, will also consume more power and be well priced higher.
> 
> pricing wise we heard from Nvidia, not from AMD, with them having the smaller die and likely offering similar performance to the 1070 we should expect AMD to be very aggressive with their price.


Right, no one is expecting AMD to compete with the 1080. Hell, I'm not even 'hoping,' I see it as extremely improbable. If they do, I'll be pleasantly surprised, but we'll cross that bridge if it comes to it.

I do think they're going to compete with the 1070 and they're going to play hardball on pricing. They kind of have to, market share has forced their hand on this one and I'm really hoping they're holding pocket aces. It's just a hope though, Pandora's box kind of hope...


----------



## ebduncan

Honestly, hbm2 is where it is at. Those are the cards everyone is craving.
Quote:


> Originally Posted by *BulletBait*
> 
> I do think they're going to compete with the 1070 and they're going to play hardball on pricing. They kind of have to, market share has forced their hand on this one and I'm really hoping they're holding pocket aces. It's just a hope though, Pandora's box kind of hope...


Pocket aces? The 970 is the most popular card on Steam. P10 will be cheaper than the 1070 and offer similar performance; P10 is their pocket ace, for now anyway.

All that will change when HBM2 comes into play. We have reached the limit of GDDR5. GDDR5X is nice, but it's really not that much faster, and combined with its limited availability at first, it just doesn't make sense. AMD could release a GDDR5X card, but there simply isn't enough volume at first to go around. So why release one, if HBM2 production is ramping up now?


----------



## BulletBait

Quote:


> Originally Posted by *ebduncan*
> 
> pocket aces? the 970 is the most popular card on steam. P10 will be cheaper than 1070, and offer similar performance. P10 is their pocket ace for now anyways.


That's why I said "Pandora's box" kind of hope. Maybe I shouldn't use metaphors next time, but it's so much more expressive.

I'm glad they skipped 5X, honestly. The only place I can see it is on low-mid and budget cards; mid-high and halo cards are more worth the HBM, and I see 5X phasing out completely in 2-3 years. That's a very short lifespan, considering GDDR5 entered volume production 8 years ago. 5X just delays that: an attempt to keep an inferior component in a market that will evaporate shortly.

Edit double post again
Quote:


> Originally Posted by *bigjdubb*
> 
> So what's the latest with the AMD launch? I got the Nvidia stuff out of my system now and I am really wanting to see what AMD is bringing to the table.


Sorry, missed your question. AMD is launching sometime May 26-29. They're holding a conference on those dates and sent out press invites. Exact date is unknown.

Everything else we're talking about is educated guesswork off official statements/demonstrations with a sprinkling of rumors.

I'd say the next time we have the potential to hear about anything will be their Annual Shareholder Meeting this Thursday. I'm expecting total blackout until the conference though.


----------



## bigjdubb

Quote:


> Originally Posted by *BulletBait*
> 
> Sorry, missed your question. AMD is launching sometime May 26-29. They're holding a conference on those dates and sent out press invites. Exact date is unknown.
> 
> Everything else we're talking about is educated guesswork off official statements/demonstrations with a sprinkling of rumors.
> 
> I'd say the next time we have the potential to hear about anything will be their Annual Shareholder Meeting this Thursday. I'm expecting total blackout until the conference though.


Well I am really hoping that the $300 near 980ti performance rumor is true. I am a longtime Nvidia user wanting to switch over to AMD so a cheaper card to test the AMD water and hold me over until next year seems like a really good idea for me. Near 980ti performance would provide me with a sidestep in performance (but with a single card) and if it turns out that I like the drivers and software I can always buy a second one and wait for the big boys to get here next year.


----------



## Gungnir

Quote:


> Originally Posted by *ebduncan*
> 
> All that will change when hbm2 comes into play. We have reached the limit on gddr5. GDDR5x is nice, but it's really not that much faster. Combined with limited availability at first for GDDR5x it just doesn't make sense. AMD could release a GDDR5X card, but there simply isn't enough volume of it at first to go around. So why release it? if HBM2 production is ramping up now.


IIRC, the biggest benefit of GDDR5X is that it's cheaper than HBM2, and more efficient than GDDR5. I wouldn't be surprised to see it on low-end and midrange cards in the second or third generation of FF. The late start on production has mostly ruled it out for this gen (other than the 1080, which I doubt will be very high volume), but that won't be true for the 5xx/11xx and later.
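For scale, the bandwidth gap behind this GDDR5 / GDDR5X / HBM debate follows from the usual peak-bandwidth formula. A quick sketch (bus widths and per-pin rates are the commonly quoted spec figures; the card pairings are just illustrative):

```python
# Rough peak-bandwidth comparison from spec-sheet numbers, not measurements:
# bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps)
def peak_bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

gddr5  = peak_bandwidth_gbs(256, 7.0)    # e.g. GTX 980:  224 GB/s
gddr5x = peak_bandwidth_gbs(256, 10.0)   # e.g. GTX 1080: 320 GB/s
hbm1   = peak_bandwidth_gbs(4096, 1.0)   # e.g. Fury X:   512 GB/s

print(gddr5, gddr5x, hbm1)  # 224.0 320.0 512.0
```

HBM gets its headroom from bus width rather than per-pin speed, which is why GDDR5X's faster pins only partly close the gap.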


----------



## BulletBait

Quote:


> Originally Posted by *Gungnir*
> 
> IIRC, the biggest benefit of GDDR5X is that it's cheaper than HBM2, and more efficient than GDDR5. I wouldn't be surprised to see it on low-end and midrange cards in the second or third generation of FF. The late start on production has mostly ruled it out for this gen (other than the 1080, which I doubt will be very high volume), but that won't be true for the 5xx/11xx and later.


Yes, 5X is cheaper right now. Samsung began mass production of HBM2 in January, and Hynix is expected to scale up in the next 2-3 months (Q3). The only direction its price can go now is down, as production ramps up, quality and yields improve, and other players enter the market. Hynix will likely ship overwhelmingly to AMD, while nV will probably continue their contract with Samsung. GDDR5 is about squeezed for all it can get from manufacturing technique and volume (with the current ramp being the exception). So unless they pop out a miraculous GDDR6, I really think it's going to go the way of the dodo pretty quickly.

That Q3 Hynix ramp is one of the reasons I'm expecting a late Q4 (Holiday season) launch of Vega.


----------



## xxdarkreap3rxx

You know what, Polaris 10 uncut is what, 2560 SPs? An R9 290, launched over 2.5 years ago, has the same number of SPs. According to TPU, at 2560x1440 the 980 Ti is about 46% faster than the R9 290. Can they really not squeeze out ~35% more performance and some power efficiency to get near-980 Ti performance after all that time? I'm starting to doubt it's only going to be around 390X performance. There's no way they only managed a 15% performance increase... Even comparing the 980 Ti to a 390 (essentially an OC'd 290 with more VRAM), the Ti beats it by 34%. This has to be at least close to, if not matching, the 980 Ti. There's no way AMD could muck up 2.5 years of R&D, and we're not even stuck on 28nm anymore.
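The percentages above can be sanity-checked with quick arithmetic (a sketch using the poster's TPU-style relative numbers, not new benchmarks):

```python
# If the faster card leads by target_gap_pct %, this returns where the slower
# card lands relative to it after applying a given performance uplift.
def relative_to_target(uplift: float, target_gap_pct: float) -> float:
    return uplift / (1 + target_gap_pct / 100)

# A ~35% uplift over an R9 290, against a 980 Ti that leads by 46%:
print(round(relative_to_target(1.35, 46), 3))  # ~0.925
```

So the post's ~35% uplift would indeed land within roughly 8% of a 980 Ti, i.e. "near" it rather than matching it.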


----------



## BulletBait

Quote:


> Originally Posted by *xxdarkreap3rxx*
> 
> You know what. Polaris 10 uncut is what, 2560 SPs? An R9 290, launched over 2.5 years ago has the same amount of SPs. According to TPU, at 2560x1440, the 980 Ti is about 46% faster than the R9 290. Can they really not squeeze ~35% performance and some power efficiency to get near-980 Ti performance after all that time? I'm starting to doubt it's only going to be around 390X performance. There's no way they only managed 15% of a performance increase... Even comparing the 980 Ti to a 390 (essentially OC'd 290 with more VRAM) the Ti beats it by 34%. This has to be at least close if not matching the 980 Ti. There's no way AMD could muck up 2.5 years of R&D, we're not even stuck on 28nm anymore.


I'm expecting quite a bit of architecture improvement myself. It goes a long way towards explaining their '2.5x' p/w number. nV got something around 1.5x with the 1080 (maths is hard ><) at 16nm? The way Pascal looks to me is a cracked-out Maxwell on a new node. With AMD also dropping from 28nm to 14nm, I'd expect a relatively close efficiency gain, with a slight edge for the lower node, maybe 1.7-1.8x.

I know they keep talking up DX11 overhead fixes, but I don't think that was a primary focus. AMD probably did what they always do and looked more towards the future than the present. You also have to ask yourself: given nV's PR statement of 3x efficiency *in VR*, is that what AMD was referring to in their statement?

Makes me wonder...
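The efficiency guesswork above amounts to factoring the claimed perf/watt gain into a node part and an architecture part (a rough multiplicative model; the ~1.7x node figure is the poster's own assumption, not an AMD number):

```python
# Perf/watt gain factors roughly multiply: total ~= node_gain * arch_gain.
# Given a claimed total and an assumed node contribution, back out the
# implied architecture contribution.
def arch_gain(total_gain: float, node_gain: float) -> float:
    return total_gain / node_gain

# AMD's claimed 2.5x, against an assumed ~1.7x from the 28nm -> 14nm node alone:
print(round(arch_gain(2.5, 1.7), 2))  # ~1.47x left for architecture
```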


----------



## TheLAWNOOB

Quote:


> Originally Posted by *BulletBait*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xxdarkreap3rxx*
> 
> You know what. Polaris 10 uncut is what, 2560 SPs? An R9 290, launched over 2.5 years ago has the same amount of SPs. According to TPU, at 2560x1440, the 980 Ti is about 46% faster than the R9 290. Can they really not squeeze ~35% performance and some power efficiency to get near-980 Ti performance after all that time? I'm starting to doubt it's only going to be around 390X performance. There's no way they only managed 15% of a performance increase... Even comparing the 980 Ti to a 390 (essentially OC'd 290 with more VRAM) the Ti beats it by 34%. This has to be at least close if not matching the 980 Ti. There's no way AMD could muck up 2.5 years of R&D, we're not even stuck on 28nm anymore.
> 
> 
> 
> I'm expecting quite a bit of architecture improvement myself. It goes a long way towards explaining their '2.5x' p/w number. nV got something around 1.5x with the 1080 (maths is hard ><) at 16nm? The way Pascal looks to me is a cracked-out Maxwell on a new node. With AMD also dropping from 28nm to 14nm, I'd expect a relatively close efficiency gain, with a slight edge for the lower node, maybe 1.7-1.8x.
> 
> I know they keep talking up DX11 overhead fixes, but I don't think that was a primary focus. AMD probably did what they always do and looked more towards the future than the present. You also have to ask yourself: given nV's PR statement of 3x efficiency *in VR*, is that what AMD was referring to in their statement?
> 
> Makes me wonder...

The way I look at it, AMD only needs to clock it to 1.3-1.4GHz to reach 980 Ti levels.

nVidia got a 16nm 2560-CUDA GPU to beat a 28nm 3072-CUDA GPU by 20%.

It won't surprise me if AMD's 14nm 2560 SP GPU beats their 28nm 2880 SP or even 3584 SP parts.
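The clock math above can be made concrete with the standard peak-FP32 formula (a sketch only: 2 FLOPs per shader per clock, using the rumoured P10 shader count and clock; real-game performance obviously doesn't track paper TFLOPS exactly):

```python
# Peak FP32 throughput: shaders x 2 FLOPs/clock x clock (GHz) -> TFLOPS.
def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000

p10   = fp32_tflops(2560, 1.35)  # rumoured full Polaris 10 at 1.35 GHz
ti980 = fp32_tflops(2816, 1.0)   # GTX 980 Ti at its 1.0 GHz base clock

print(round(p10, 2), round(ti980, 2))  # 6.91 5.63
```

By this naive measure, a 2560-SP part at ~1.35 GHz would edge out a reference 980 Ti on paper.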


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *BulletBait*
> 
> I'm expecting quite a bit of architecture improvement myself. It goes a long way towards explaining their '2.5x' p/w number. nV got something around 1.5x with the 1080 (maths is hard ><) at 16nm? The way Pascal looks to me is a cracked-out Maxwell on a new node. With AMD also dropping from 28nm to 14nm, I'd expect a relatively close efficiency gain, with a slight edge for the lower node, maybe 1.7-1.8x.
> 
> I know they keep talking up DX11 overhead fixes, but I don't think that was a primary focus. AMD probably did what they always do and looked more towards the future than the present. You also have to ask yourself: given nV's PR statement of 3x efficiency *in VR*, is that what AMD was referring to in their statement?
> 
> Makes me wonder...


Yeah. At first I believed this thread. Then I saw all the VC posts about how Polaris was slow and doubted it. Now I'm wondering if AMD really can't manage more than 20% more performance after 2.5 years with the same number of stream processors, plus a new node and architectural changes. The only way I see them releasing a slow card is if they're aggressively focusing on low TDP/power consumption.

They said they wanted to focus on increasing the market for VR. Increasing performance on the lower end (and even towards the middle) would result in more inexpensive (read: generally affordable) cards meeting the minimum specs for VR. Keeping performance the same as last gen while just lowering power consumption would not increase that market unless they also decreased the price of the cards.


----------



## PCGamer4Ever

I think, based on what we have heard from AMD, we will see cards launched in the mid-range line (the current 380 and 390 range and price points). This will cause the tech enthusiast forums to explode in negative comments, which will likely be unfair. Having the fastest card on the market is nice, but the mid-range and lower price points are where companies really make their money.


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *PCGamer4Ever*
> 
> I think, based on what we have heard from AMD, we will see cards launched in the mid-range line (the current 380 and 390 range and price points). This will cause the tech enthusiast forums to explode in negative comments, which will likely be unfair. Having the fastest card on the market is nice, but the mid-range and lower price points are where companies really make their money.


The question is whether they can afford to price it low enough (assuming performance "sucks").


----------



## provost

Quote:


> Originally Posted by *PCGamer4Ever*
> 
> I think, based on what we have heard from AMD, we will see cards launched in the mid-range line (the current 380 and 390 range and price points). This will cause the tech enthusiast forums to explode in negative comments, which will likely be unfair. Having the fastest card on the market is nice, but the mid-range and lower price points are where companies really make their money.


You know what, I have started to view these cards as a "consumable" rather than a piece of hardware to own for longer than a year or two, given how dependent this hardware now is on the GPU companies' ability and willingness to keep providing optimization support past the first few months (in certain cases... Lol). So, I will be fine with a consumable product at the right price point that I don't mind chucking out after I have gotten some entertainment out of it. This is the only way I can justify renting "current performance" while owning the underlying hardware with accelerated planned obsolescence.


----------



## BulletBait

Quote:


> Originally Posted by *provost*
> 
> You know what, I have started to view these cards as a "consumable" rather than a piece of hardware to own for longer than a year or two, given how dependent this hardware now is on the GPU companies' ability and willingness to keep providing optimization support past the first few months (in certain cases... Lol). So, I will be fine with a consumable product at the right price point that I don't mind chucking out after I have gotten some entertainment out of it. This is the only way I can justify renting "current performance" while owning the underlying hardware with accelerated planned obsolescence.


The 7970 has been going strong for 4.5 years? How is that renting?  I'm thinking we'll continue to see the same long-term support and 'future proofing' out of AMD that we've seen in the past. If I were to get a P10 '490' version, I would fully expect AMD to support it for the next 5-6 years, and it will get 'performance appreciation' in that time, same as the 5/6/7xxx and 2/3xx series have since they've been out, until support was dropped MANY years later for the earlier cards.

I don't see AMD going planned obsolescence, unless MAYBE the market share flipped. Even then, their corporate culture is pretty resilient and I, personally, still wouldn't expect them to do it.


----------



## Lee Patekar

Quote:


> Originally Posted by *provost*
> 
> This is the only way I can justify renting "current performance" while owning the underlying hardware with accelerated planned obsolescence.


I haven't noticed any planned obsolescence with my 7970.. only a bunch of crappy console ports. I'm planning an upgrade for VR, not for anything I'm currently playing.. that said I have plenty of time to wait for Vega and GP102 to land, dust to settle, drivers to mature a bit and benchmarks to choose.


----------



## provost

Quote:


> Originally Posted by *BulletBait*
> 
> The 7970 has been going strong for 4.5 years? How is that renting?  I'm thinking we'll continue to see the same long term support and 'future proofing' out of AMD that we've seen in the past. If I were to get a P10 '490' version, I will fully expect that AMD would support it for the next 5-6 years and it will get 'performance appreciation' in that amount of time, same as the 5/6/7xxx, 2/3xx has since they've been out until support was dropped MANY years later for the earlier cards.
> 
> I don't see AMD going planned obsolescence, unless MAYBE the market share flipped. Even then, their corporate culture is pretty resilient and I, personally, still wouldn't expect them to do it.


Well then, it makes the pot that much sweeter, and the value proposition that much more compelling, from my perspective anyway..


----------



## BulletBait

Quote:


> Originally Posted by *Lee Patekar*
> 
> I haven't noticed any planned obsolescence with my 7970.. only a bunch of crappy console ports. I'm planning an upgrade for VR, not for anything I'm currently playing.. that said I have plenty of time to wait for Vega and GP102 to land, dust to settle, drivers to mature a bit and benchmarks to choose.


And the 7970 launched at, what, $550? That's cheaper than a yearly 'budget' card replacement, with better performance than any budget card to boot through those years.


----------



## maltamonk

Quote:


> Originally Posted by *PCGamer4Ever*
> 
> I think, based on what we have heard from AMD, we will see cards launched in the mid-range line (the current 380 and 390 range and price points). This will cause the tech enthusiast forums to explode in negative comments, which will likely be unfair. Having the fastest card on the market is nice, but the mid-range and lower price points are where companies really make their money.


I agree with the range and the 380 price point, but not so much about the 390 price point. The performance range of the 390 and the 970 is what they are trying to expand. Simply put, they won't be able to if they keep the same $300 price point. They would need at least 970/390 performance at a lower price than what is currently available; raising the price to that level would not achieve that goal even if much-increased performance was there.

I believe (and I know that doesn't mean squat here) that the larger Pol 10 at anything over $250 will fail. If it's closer to $250 it'll need Fury/980 performance. If it's around $200, then 390/970 levels. Those would be acceptable ratios. Anything better than that is a bonus, and much worse than that and IMO it'll fail.

IMO*
bigger pol10 $200-250 (970/390-980/fury performance)
smaller pol10 $150-200 (380x-970/390 performance)
Larger pol11 $100-150 (380/960-380x performance)
smaller pol11 up to $100 (270x/950-380/960 performance)


----------



## Basard

Quote:


> Originally Posted by *BulletBait*
> 
> Yeah, that's one of the major reasons I didn't swap to Intel. Besides Vishera finally getting into stride from the larger multi thread support these days. I've never really needed a reason to upgrade, even with a slightly weaker IPC, I've been able to compensate for that with a massive overclock.
> 
> What, no Microsoft?
> 
> Yes, I've passed on nV specifically because of their pricing and proprietary crap. Intel was more to do with marketing practices before, but now is more to do with their, I'm going to call it 'predatory,' pricing as well.
> 
> I also rather enjoy my AMD furnace whenever winter rolls around here. Cuts down on my heating bill, which offsets its higher power consumption XD.


AND it gives us a good excuse to crank up the AC during summer! HAHAHA


----------



## BulletBait

Quote:


> Originally Posted by *Basard*
> 
> AND it gives us a good excuse to crank up the AC during summer! HAHAHA


Can't tell if serious or not...

We typically have *one* month of average highs over 70... So, generally plenty cool enough to not run AC with a window open most of the time during the day and at night during the *one* hot month.

Edit: I realize now you're being facetious, my bad. Still, not terrible for me when 8/12 months are spent at <50 for the highs.


----------



## SuperZan

Quote:


> Originally Posted by *BulletBait*
> 
> Can't tell if serious or not...
> 
> We typically have *one* month of average highs over 70... So, generally plenty cool enough to not run AC with a window open most of the time during the day and at night during the *one* hot month.
> 
> Edit: I realize now you're being facetious, my bad. Still, not terrible for me when 8/12 months are spent at <50 for the highs.


Love cool ambients! Our highest temperature around here is usually no more than something like... 72 F? And that's not for long, certainly not long enough to ever worry about heat even from my OC'd 8320 running 24/7 with a Fermi GPU. 

Honestly, I'm impressed with Vishera's thermal performance. It can scale up quickly past 4.7-4.8 GHz but definitely not as quickly as my Skylake.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *SuperZan*
> 
> Love cool ambients! Our highest temperature around here is usually no more than something like... 72 F? And that's not for long, certainly not long enough to ever worry about heat even from my OC'd 8320 running 24/7 with a Fermi GPU.
> 
> Honestly, I'm impressed with Vishera's thermal performance. It can scale up quickly past 4.7-4.8 GHz but definitely not as quickly as my Skylake.


Definitely jealous. It's already well into the 90s F where I live, and by July we will be over 100F (with high humidity as well) most days...


----------



## kaosstar

Quote:


> Originally Posted by *BulletBait*
> 
> The 7970 has been going strong for 4.5 years? How is that renting?  I'm thinking we'll continue to see the same long term support and 'future proofing' out of AMD that we've seen in the past. If I were to get a P10 '490' version, I will fully expect that AMD would support it for the next 5-6 years and it will get 'performance appreciation' in that amount of time, same as the 5/6/7xxx, 2/3xx has since they've been out until support was dropped MANY years later for the earlier cards.
> 
> I don't see AMD going planned obsolescence, unless MAYBE the market share flipped. Even then, their corporate culture is pretty resilient and I, personally, still wouldn't expect them to do it.


That's one thing that's kept me buying AMD for years. I wouldn't consider myself a fanboy. I just know that AMD will continue working on optimizations for "old" cards for years.

On the green side, the 980 Ti will get barely a year of optimizations. We all know when the 1080 drops, the 980 Ti will be relegated to "maintenance mode", and likely indirectly sabotaged by new and improved Gameworks technologies. Same story as the 780 Ti.


----------



## SuperZan

Quote:



> Originally Posted by *Majin SSJ Eric*
> 
> Definitely jealous. It's already well into the 90s F where I live, and by July we will be over 100F (with high humidity as well) most days...


Ouch.

I'm glad you've got blocks for those Titans!


----------



## Travieso

It's like nVidia targets the performance of every card to be 'just good enough' on launch day.

I mean, look at the GTX 780 Ti: from the fastest GPU on launch day to losing to the GTX 970; it even falls short against the 390X in many tests. *** ???

They still support old cards, but it's like every optimization is dropped after the first 4-5 months.

Right now every Kepler card feels like a product from ages ago. It's only 2-3 years old, FFS.

Look at any TechPowerUp review; you'll see the difference, since they test with the most diverse and newest game libraries. Old cards go obsolete so fast on their charts.


----------



## Olivon

Quote:


> Originally Posted by *ebduncan*
> 
> AMD never removed DP from their consumer cards, they only limited its throughput to 1/16th of the workstation cards based on the same SKU. Also AMD generally smokes Nvidia in Compute.


That's wrong:


Quote:


> And the fun doesn't stop there. Along with producing the biggest die they could, AMD has also more or less gone the direction of NVIDIA and Maxwell in the case of Fiji, building what is unambiguously the most gaming/FP32-centric GPU the company could build. With GCN supporting power-of-two FP64 rates between 1/2 and 1/16, AMD has gone for the bare minimum in FP64 performance that their architecture allows, leading to a 1/16 FP64 rate on Fiji. This is a significant departure from Hawaii, which implemented native support for ½ rate, and on consumer parts offered a handicapped 1/8 rate. Fiji will not be a FP64 powerhouse - its 4GB of VRAM is already perhaps too large of a handicap for the HPC market - so instead we get AMD's best FP32 GPU going against NVIDIA's best FP32 GPU.


http://www.anandtech.com/show/9390/the-amd-radeon-r9-fury-x-review/3


----------



## lahvie

*sigh*
I was playing my little heart out yesterday and had one fantastic revelation.
My cat chewed my headphones and my wife was sleeping (all preggers), so my volume was super low on my flimsy monitor speakers, and I was trying to hear some guy who was helping me out, so I had to put my little face up pretty close to the monitor...
It was in that moment I realized how little detail is actually being displayed on my screen. Finally done listening to the guy helping me, I backed away to my normal distance, which admittedly is actually pretty close. (I have very good eyesight, at that.)

And now I am ruined. All I can think about is the fact that with my 980 Ti and my 1440p monitor, years and years after my 3870 and my 260 and my 4850... we really haven't come so far.
Sure, textures and shading and lighting and polygons and blah blah blah.

But when we get down to it, the freaking bottleneck is that display quality itself hasn't changed nearly as much as graphics technology and power have evolved.

Perhaps I would put it this way:
my X800, with its measly 256MB, pushed over 200 billion floating-point operations per second,
whereas my 980 Ti does a mind-blowing ~5.6 TFLOPS.

Yet my monitor has come nowhere near as far as evolution is concerned.
Since I was 15 I can remember 1600x900,
and here I am today, 26 and going on being a daddy, playing at the mind-blowing resolution of 2560x1440.

And today, they are telling me they want to put that resolution on my face?

No thank you, AMD; no thank you, NVIDIA.

They are almost trying to sell me a CRT monitor to put on my face. (Compared to current OLED and LED sizes, I could quite seriously argue it's worse than strapping a CRT monitor to my face.)

Hey, I could buy a 4K monitor, but ultimately that's still just another small upgrade from 1440p.

Stop assuming VR is what EVERYBODY wants. I don't want VR; I don't want to look like a DONKEY. I want a better monitor (AND I DON'T NEED 120HZ), and I don't understand why my desire should count any less than their mark for improvement and gains.

They are telling me I should forget Polaris and Pascal because I don't care about VR. And ultimately they still aren't going to drive a 4K monitor sufficiently.


----------



## AngEviL

Quote:


> Originally Posted by *lahvie*
> 
> *sigh*
> And here I am today, 26 and going on being a daddy, playing at the mind-blowing resolution of 2560x1440.
> 
> And today, they are telling me they want to put that resolution on my face?
> 
> No thank you AMD, no Thank you NVIDIA,
> 
> They are almost trying to sell me a CRT Monitor to put on my face. (compared to current oled and LED sizes, I could quite seriously argue its worse than a crt monitor to throw on my face)
> 
> Hey, I could buy a 4K monitor, but ultimately that's still just another small upgrade from 1440p


I've been using a 4K monitor for 1.5 years now. First I had a single GTX 980, and now a GTX 980 Ti. I had a 1440p monitor before and, trust me, it is a big difference: enough to forget about jaggies, and about the lack of detail in distant textures.

FPS is lower in some games, but to me a lack of image clarity is a bigger loss of immersion than FPS being below 60 (it rarely drops below 40 for me). I also have some older or less demanding games that do 60 fps fine, such as GTA V, Max Payne 3 (2x AA), Grim Dawn (4x AA), and Sniper Elite: Zombie Army (SMAA).

I will also be getting a GTX 1080 in the next few months, so that 40 fps low will become closer to 50, which will be visibly more manageable. Just get the monitor and enjoy it, instead of being frustrated by the wait. Trust me, going from 1080p to 1440p wasn't that big of a deal to me; I could still see the jaggies and make out individual pixels. But going from 1440p to 2160p was much better.


----------



## lahvie

Quote:


> Originally Posted by *AngEviL*
> 
> I've been using a 4k monitor for 1.5 years now. First i had a single gtx 980, and now a gtx 980 ti. I had a 1440p monitor before, and trust me, it is a big difference, enough to forget about jaggies, and about lack of detail in distant textures.
> 
> Fps is lower in some games, but to me, lack of image clarity is a bigger loss of immersion than fps being below 60 (It rarely drops for me below 40). Also i have some older games or less demanding as well, which can do 60 fps well, such as GTA V, Max Payne 3 (x2 AA), Grim Dawn (x4 AA), Sniper Elite zombie army (SMAA).
> 
> I will also be getting a GTX 1080 in the next few months, so that 40 fps low will become closer to 50, which will be visibly more manageable. Just get the monitor and enjoy it, instead of being frustrated by the wait. Trust me, going from 1080p to 1440p wasn't that big of a deal to me; I could still see the jaggies and make out individual pixels. But going from 1440p to 2160p was much better.


Well, I love quality, but smoothness is way more important to me, which is why I went with 1440p.
As awesome as my system is, I had to save and save, and I'm still paying for part of it.
I'm forced to wait another generation before I can grab 4K or higher with a new GPU.

I've heard both sides: that the jump to 1440p is more noticeable than the jump from 1440p to 4K.

I can't say I know yet.


----------



## Majin SSJ Eric

My 1440p screens are more than enough for me personally. I wouldn't mind a large (40"+) 4K monitor, but at 27" I feel like 1440p is the perfect resolution. Besides, getting anything higher right now would necessitate new GPUs, which would make the expense completely unaffordable for me at the current time...


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> My 1440p screens are more than enough for me personally. I wouldn't mind a large (40"+) 4k monitor but at 27" I feel like 1440p is the perfect resolution. Besides, getting anything higher right now would necessitate new GPU's which would make the expense completely unaffordable for me at the current time...


As someone who did it, I can only say: DON'T DO IT. Going back to a 27" (or even tinier monitors at work) will have you feeling like this: https://imgflip.com/readImage?iid=14711757


----------



## lahvie

I guess this comes down to personal preference and physical differences between humans.
My sharp eyes differentiate between pixels, and I can see the color blur of AA.
I'm at 25" on my monitor, so it's slightly sharper than yours; however, I can't say my monitor isn't closer to my face than yours. I like it right in my grill (probably two feet away; I haven't ever busted out a measuring tape, and I'm at work).

I have absolutely NO desire to put a VR headset on. I just think that when I'm looking at a car (for example) in a game, I should be able to make out more than just some color shading on the side of it, especially when this car is supposed to be weathered with snow, shot at, and riddled with disease, in a AAA title.

And they want me to put a headset on! Bah. Make my normal games look more realistic, and give my monitor higher clarity.

Then, when I'm bored of my hyper-realistic gameplay on a 2D surface, let me try that VR headset on.

I just don't think we are there as much as they want us to be, because this is a whole new industry of money they are trying to spawn on people that we aren't ready for. Letting this go now ~ too much off-topic rant, maybe.


----------



## SuperZan

Quote:


> Originally Posted by *xxdarkreap3rxx*
> 
> As someone who did it, I can only say: DON'T DO IT. Going back to a 27" (or even tinier monitors at work) will have you feeling like this: https://imgflip.com/readImage?iid=14711757


I had the opposite problem: I had to go back to a 27" from a 40". It got annoying in PvP in a few games, trying to focus a target while having to literally move my head to find it. There's probably a happy medium in there somewhere for me.


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *SuperZan*
> 
> I had the opposite problem, I had to go back to a 27" from 40". It got annoying in PvP in a few games, trying to focus a target having to literally move my head to find it. There's probably a happy medium in there somewhere for me.


LOL. Best TV show ever made. It's so true, though; that's how I watch TV and play my games now (although not as exaggerated). I had a 32" before that actually seemed too big, which is why I went 27", but 27" was too small. I bit the bullet on the 40" with a curved screen and it's perfect (aside from the ONE single dead pixel that I can't even see unless I get absurdly close). Weird how 32" flat was too big but 40" curved is perfect. Plus, 4K adult entertainment on that thing is... let me just say, for the first week I got it, I was not master of my domain.


----------



## SuperZan

Quote:


> Originally Posted by *xxdarkreap3rxx*
> 
> LOL. Best TV show ever made. It's so true, though; that's how I watch TV and play my games now (although not as exaggerated). I had a 32" before that actually seemed too big, which is why I went 27", but 27" was too small. I bit the bullet on the 40" with a curved screen and it's perfect (aside from the ONE single dead pixel that I can't even see unless I get absurdly close). Weird how 32" flat was too big but 40" curved is perfect. Plus, 4K adult entertainment on that thing is... *let me just say, for the first week I got it, I was not master of my domain.*


I shall now need to make a new cup of tea. And ya, I do like the curved monitors, they just feel so immersive. That's probably my next big purchase.

And agreed, it's still my favourite. Gotta love Kramer walking in and slapping his money on the counter not five minutes into the competition.


----------



## ChevChelios

From what resolution/size does it make sense to get a curved monitor? Only 30"+? Or is 24"/27" curved OK too?


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *SuperZan*
> 
> I shall now need to make a new cup of tea. And ya, I do like the curved monitors, they just feel so immersive. That's probably my next big purchase.
> 
> And agreed, it's still my favourite.
> 
> 
> 
> 
> 
> 
> 
> Gotta love Kramer walking in and slapping his money on the counter not five minutes into the competition.


Curved is seriously awesome. I don't know how it would do in the living room with multiple people sitting at angles but right in front of it, it's fantastic.

*slap* I'm out!


----------



## bigjdubb

Quote:


> Originally Posted by *ChevChelios*
> 
> from what resolution/size does it make sense to get a curved monitor ? only 30"+ ? or 24"/27" curved ok too ?


I have not used a curved screen, but I have spent some time staring at them on the display counter at Micro Center. To me, it is the ultra-wide screens that really shine with the curve. The Samsung 27" curved 16:9 screen wasn't nearly as impressive as the 21:9s. I don't think my 32" (16:9) 1440p screen would look all that impressive with a curve; the width of the 21:9 really brings out the curve.


----------



## CasualCat

Quote:


> Originally Posted by *ChevChelios*
> 
> from what resolution/size does it make sense to get a curved monitor ? only 30"+ ? or 24"/27" curved ok too ?


IMHO, in any readily available consumer fixed-panel screen, curves are a gimmick. Twice as gimmicky on 4K TVs.


----------



## ChevChelios

Quote:


> Originally Posted by *bigjdubb*
> 
> I have not used a curved screen but I have spent some time staring at them on the display counter at micro center. To me it is the ultra wide screens that really shine with the curved screen. The Samsung 27" curved 16:9 screen wasn't nearly as impressive as the 21:9's. I don't think my 32"(16:9) 1440p screen would look all that impressive with a curve, the width on the 21:9 really brings out the curve.


hmm

a 27" 21:9 monitor has 2560x1080 resolution, right ?

do all modern games support that ?


----------



## zealord

Quote:


> Originally Posted by *ChevChelios*
> 
> hmm
> 
> a 27" 21:9 monitor has 2560x1080 resolution, right ?
> 
> *do all modern games support that ?*


Judging by the amount of "I HATE THAT GAME, IT DOESN'T SUPPORT 21:9, I'M GONNA BOYCOTT IT" comments I see in newly released game threads, I doubt it.


----------



## Robenger

Quote:


> Originally Posted by *ChevChelios*
> 
> hmm
> 
> a 27" 21:9 monitor has 2560x1080 resolution, right ?
> 
> do all modern games support that ?


I have an LG that's 29in 21:9 and I have had zero issues except for 2 games. First one being Starcraft II as they purposely don't allow it as it would give people an advantage and the second one being Fallout 4 because you know, Bethesda.


----------



## Bogga

Quote:


> Originally Posted by *Robenger*
> 
> I have an LG that's 29in 21:9 and I have had zero issues except for 2 games. First one being Starcraft II as they purposely don't allow it as it would give people an advantage and the second one being *Fallout 4* because you know, Bethesda.


Very easily modified...

Running a 34" 3440x1440 and I've had close to zero issues... just some minor modifications to get things working. WSGF is a place to check, perhaps?


----------



## smoicol

Just talk and slides; we need numbers and screens now.


----------



## Olivon

Find this via Anandtech (thanks to Sweepr) :
Quote:


> Ellesmere XT still belongs to A0 chip testing phase frequency


Quote:


> we expect the third week of may to provide A1 official version of the chip for testing at that time, performance, power, basic specifications can be determined, and the rest is driven adjustment and optimization of the product before the A1 chip out, any run points are guessing right, RTG has now issued a PCB design reference suggests, the various AIB has entered card pre-production stage, relatively speaking, Baffin PCB's proposal is very short and simple, 4-layer PCB + single fan on can, but also shorter than the Nano
> Ellesmere's PCB recommendations are slightly longer than the Nano, but as long as the 6-layer PCB
> Finally Ellesmere Pro now with Baffin A0 are not out,


http://tieba.baidu.com/p/4523923961?qq-pf-to=pcqq.group

Dunno if true or not.


----------



## BulletBait

I'm really curious about what (if anything) will come out of their annual shareholder meeting tomorrow. If shareholders aren't clamoring to hear AMD's response to nV, I'll be surprised.

So... I see tomorrow as the last and most likely time for any new tidbits before the launch conference.


----------



## revanchrist

Double post.


----------



## revanchrist

Quote:


> Originally Posted by *Olivon*
> 
> Find this via Anandtech (thanks to Sweepr) :
> 
> http://tieba.baidu.com/p/4523923961?qq-pf-to=pcqq.group
> 
> Dunno if true or not.


That Google translate is quite ok, but not 100% accurate.

- Ellesmere XT is currently in the A0 testing phase.
- Ellesmere XT is estimated to reach the A1 phase by the 3rd week of May; only then will the specs be finalised.
- Ellesmere Pro and Baffin have not even reached the A0 phase yet.
- The Baffin reference PCB will be a 4-layer board + single fan, with board length slightly shorter than the Nano.
- The Ellesmere XT reference PCB will be a 6-layer board + single fan, with board length slightly longer than the Nano.
- AIBs will release both single-fan and dual-fan versions of Ellesmere XT cards, but no triple-fan one.


----------



## Eorzean

If it ends up being 980 Ti (or near) performance for $300 US, count me in... I'll buy that and an ultrawide with FreeSync and save money all around without needing to pay for any premiums (gsync or nvidia's inflated prices) while having a card that will actually be supported 2 years down the road. Charging an extra $100 for their reference cooler and spinning it to look like a good thing didn't sit well with me at all.


----------



## BulletBait

Quote:


> Originally Posted by *Eorzean*
> 
> If it ends up being 980 Ti (or near) performance for $300 US, count me in... *I'll buy that* and an ultrawide with FreeSync and save money all around without needing to pay for any premiums (gsync or nvidia's inflated prices) while having a card that will actually be supported 2 years down the road. *Charging an extra $100 for their reference cooler and spinning it to look like a good thing didn't sit well with me at all.*


I'll be the first to welcome you to team red, post-benchmarks









I thought they shot themselves in the foot PR-wise with that reference thing as well. But from what I've seen of the majority of the nV threads here, and the internet at large, it's a 'SHUT UP AND TAKE MY MONEY!' mentality. The nV crowd and company are now officially worse than Apple in my book. I never thought someone would dethrone Apple as the most cultish and exploitative ever... Then it went and happened.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *xxdarkreap3rxx*
> 
> LOL. Best TV show ever made. It's so true though; that's how I watch TV and play my games now (although not as exaggerated). I had a 32" before that actually seemed too big, which is why I went 27", but 27" was too small. I bit the bullet on the 40" with a curved screen and it's perfect (aside from the ONE single dead pixel that I can't even see unless I get absurdly close). Weird how 32" flat was too big but 40" curved is perfect. Plus 4K adult entertainment on that thing is... let me just say for the first week I got it, I was not master of my domain.


What monitor do you have, gotta link? Or is it a TV?


----------



## NicksTricks007

While I agree that Nvidia didn't do themselves any favors by pricing the reference "founders edition" $100 more than the standard card, they unfortunately have enough of the market to get away with it. I really hope for everyone's sake that AMD can gain some market share back and make things a little more competitive again.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *NicksTricks007*
> 
> While I agree that Nvidia didn't do themselves any favors by pricing the reference "founders edition" $100 more than the standard card, they unfortunately have enough of the market to get away with it. I really hope for everyone's sake that AMD can gain some market share back and make things a little more competitive again.


That's assuming we ever actually SEE $599 1080's from the AIB's. As to that, I have my doubts...


----------



## NicksTricks007

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> That's assuming we ever actually SEE $599 1080's from the AIB's. As to that, I have my doubts...


As do I. I'll go out on a limb here and guess that most will be $630-$680, and the special-edition cards like the Lightning and Classified will fetch between $700-$750.


----------



## SuperZan

Quote:


> Originally Posted by *NicksTricks007*
> 
> As do I. I'll go out on a limb here and guess that most will be $630-$680, and the special-edition cards like the Lightning and Classified will fetch between $700-$750.


That's my assumption. I broke it down in a few threads like: $600-615 = Windforce, $620-650 = GAMING, $650-700 = GAMING EXTREME, $700+ = AIO's, Lightnings, etc.


----------



## ToTheSun!

Quote:


> Originally Posted by *BulletBait*
> 
> I thought they shot themselves in the foot PR-wise with that reference thing as well. But from what I've seen of the majority of the nV threads here, and the internet at large, it's a 'SHUT UP AND TAKE MY MONEY!' mentality. The nV crowd and company are now officially worse than Apple in my book. I never thought someone would dethrone Apple as the most cultish and exploitative ever... Then it went and happened.


Halo products (priced appropriately) have existed for a long time. Every major company does this. According to the information we have now, the 1080 will be the best consumer graphics card, period. Therefore, it will have a price tag commensurate with its status. To imply that AMD doesn't do this is hypocritical, as they've done it multiple times. The extent and intensity with which they've done it, though, were bottlenecked by their market share.

The fact that Nvidia behaving like a good company, in the economics sense, troubles you is a bit strange.

TL;DR: whichever company can provide the best product with a comfortable market share WILL charge a premium for it, not just "evil" Nvidia.


----------



## BulletBait

Quote:


> Originally Posted by *ToTheSun!*
> 
> Halo products (priced appropriately) have existed for a long time. Every major company does this. According to the information we have now, the 1080 will be the best consumer graphics card, period. Therefore, it will have a price tag commensurate with its status. To imply that AMD doesn't do this is hypocritical, as they've done it multiple times. The extent and intensity with which they've done it, though, were bottlenecked by their market share.
> 
> The fact that Nvidia behaving like a good company, in the economics sense, troubles you is a bit strange.
> 
> TL;DR: whichever company can provide the best product with a comfortable market share WILL charge a premium for it, not just "evil" Nvidia.


I never said they wouldn't do it. I've in fact explicitly stated that if the market share were flipped completely in the future it may happen. My own statement of opinion about their consumers still stands. I see a lot of parallels between the culture of the two groups.

*If* AMD were to start pulling the same crap, or more than the usual marketing stupidity, I would turn on them too. They'd have to get really bad right now, though. If I did turn on them, or they went under, I won't be waving the blue/green flags. I'll be writing off my hobby and either increasing my other hobbies of backpacking and car modding, or finding a new one. It'll be strictly smartphone use for everything else.

I do have fairly loose standards as a consumer, but those two have crossed my personal line several times. You could only get me to buy their products with a gun to my head. So, if some other company were to suddenly appear and fill the performance and/or price gap that is ALWAYS between AMD and nV, I would buy from them over AMD, but I will *never* buy from green.


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *bigjdubb*
> 
> I have not used a curved screen but I have spent some time staring at them on the display counter at micro center. To me it is the ultra wide screens that really shine with the curved screen. The Samsung 27" curved 16:9 screen wasn't nearly as impressive as the 21:9's. I don't think my 32"(16:9) 1440p screen would look all that impressive with a curve, the width on the 21:9 really brings out the curve.


27" seems too small for curved. IMO they shine at the really large sizes.
Quote:


> Originally Posted by *CasualCat*
> 
> IMHO, in any readily available consumer fixed-panel screen, curves are a gimmick. Twice as gimmicky on 4K TVs.


Have you actually tried any of them out? I felt the same exact way until I replaced my 27" ROG Swift with the 40" 4K Samsung. Would never go back to flat, tiny size, or less than 4K ever again for gaming.
Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> What monitor do you have, gotta link? Or is it a TV?


TV. http://www.samsung.com/us/video/tvs/UN40JU6700FXZA


----------



## Skinnered

Quote:


> Originally Posted by *revanchrist*
> 
> That Google translate is quite ok, but not 100% accurate.
> 
> - Ellesmere XT is currently in the A0 testing phase.
> - Ellesmere XT is estimated to reach the A1 phase by the 3rd week of May; only then will the specs be finalised.
> - Ellesmere Pro and Baffin have not even reached the A0 phase yet.
> - The Baffin reference PCB will be a 4-layer board + single fan, with board length slightly shorter than the Nano.
> - The Ellesmere XT reference PCB will be a 6-layer board + single fan, with board length slightly longer than the Nano.
> - AIBs will release both single-fan and dual-fan versions of Ellesmere XT cards, but no triple-fan one.


IF that's true... P10 (Ellesmere XT) is months away...


----------



## ToTheSun!

Quote:


> Originally Posted by *BulletBait*
> 
> *If* AMD were to start pulling the same crap


The whole point of my post is that you seem to be implying that AMD DOES NOT do it now, while, in fact, they also do it. The 295X2 is a prime example of a card that was priced to the sun initially (to, as some here like to put it, "rip off" early adopters). After a while, street prices dropped immensely.

Unless you think AMD tried to move the remaining units at almost 50% loss, they did exactly what Nvidia is trying to do with the Founders Edition.

And, at this point, shouldn't be looked at as an "evil" act, but as an industry standard.


----------



## ChevChelios

wait, Ellesmere XT and Ellesmere Pro are different versions of Polaris 10 ?

which is faster - XT or Pro ?


----------



## Cakewalk_S

I would imagine we'll start to see some leaked photos, specs, or benchmarks very soon. If Polaris is coming out at the end of the month, people usually get an itchy trigger finger and start releasing stuff before the NDA lifts. I'm actually hoping it's true that the new mid-range Polaris chip is on par with a 980 Ti; far-fetched, but if it proves true, it may be a warranted upgrade from my GTX 970. ~30-40% increase in performance to Polaris, and then another 10% on top of that from overclocking.


----------



## Forceman

Quote:


> Originally Posted by *ChevChelios*
> 
> wait, Ellesmere XT and Ellesmere Pro are different versions of Polaris 10 ?
> 
> which is faster - XT or Pro ?


XT is historically the faster one. Tahiti XT was the 7970 and Tahiti Pro was the 7950, for example.


----------



## BulletBait

Quote:


> Originally Posted by *ToTheSun!*
> 
> The whole point of my post is that you seem to be implying that AMD DOES NOT do it now, while, in fact, they also do it. The 295X2 is a prime example of a card that was priced to the sun initially (to, as some here like to put it, "rip off" early adopters). After a while, street prices dropped immensely.
> 
> Unless you think AMD tried to move the remaining units at almost 50% loss, they did exactly what Nvidia is trying to do with the Founders Edition.
> 
> And, at this point, shouldn't be looked at as an "evil" act, but as an industry standard.


That's not equatable. It was a $400 premium on two CrossFired 290Xs on a single PCB. It was also released a month before the Titan Z to compete with it, at *half* the Z's price while being almost as good. If I recall correctly, it was ~5-10% lower performance at 50% of the price.

Compared to the Z, its competitor, the 295X2 was a steal. The Z also didn't drop in price to match what the 295X2 was putting up against it. Stop defending nV's deplorable marketing and pricing practices; it's literally indefensible.


----------



## ToTheSun!

Quote:


> Originally Posted by *BulletBait*
> 
> That's not equatable.


The 1080, then, is not equatable, either, as it's a $600-700 card.

But my point was about principle, not about prices.


----------



## BulletBait

Quote:


> Originally Posted by *ToTheSun!*
> 
> The 1080, then, is not equatable, either, as it's a $600-700 card.
> 
> But my point was about principle, not about prices.


Stop calling it a 'mid-range' GPU then. You can't say it's mid and sell it at flagship prices.

I see it as a slippery price slope. If they're charging flagship prices on 'mid-range' today, they'll charge super-flagship prices on their high-end cards. We'll be seeing Tis at $1000-1500, Titans at $2000-3000, and 'dual' Titans at $4000-5000. Then everyone will whine, 'How did we get here?' about paying as much for a GPU as a decent-condition used car.

So, yes, it does equate to previous pricing over the last 10 years for GPU placement with performance expectation.

Edit: Which falls under principle of expectation to not get sucked dry every 2 years for decent hardware.


----------



## ToTheSun!

Quote:


> Originally Posted by *BulletBait*
> 
> Stop calling it a 'mid-range' GPU then.


Who did? I certainly did not, and neither did Nvidia.


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *BulletBait*
> 
> Stop calling it a 'mid-range' GPU then. You can't say it's mid and sell it at flagship prices.
> 
> I see it as a slippery price slope. If they're charging flagship prices on 'mid-range' today, they'll charge super-flagship prices on their high-end cards. We'll be seeing Tis at $1000-1500, Titans at $2000-3000, and 'dual' Titans at $4000-5000. Then everyone will whine, 'How did we get here?' about paying as much for a GPU as a decent-condition used car.
> 
> So, yes, it does equate to previous pricing over the last 10 years for GPU placement with performance expectation.
> 
> Edit: Which falls under principle of expectation to not get sucked dry every 2 years for decent hardware.


I think people get 'card' and 'GPU' confused or just use them interchangeably. In the past we had mid-range GPUs on mid-range cards. Now we have mid-range GPUs on high-end cards.


----------



## BulletBait

Quote:


> Originally Posted by *ToTheSun!*
> 
> Who did? I certainly did not, and neither did Nvidia.


Comparable to older 104 lines, although looking at their past pricing (the 780 fell into the exact same release MSRP), I suppose this isn't exactly new for them.

So I'm just going to let it go so we aren't hijacking a thread and continue speculating about P10 potential performance and pricing. If you'd like to continue my PM box is always open. +a rep for excellent points and civility as well.


----------



## ToTheSun!

Quote:


> Originally Posted by *BulletBait*
> 
> +a rep for excellent points and civility as well.


That's very nice of you. I shall return the kindness.


----------



## KGPrime

Quote:


> Originally Posted by *BulletBait*
> 
> Stop calling it a 'mid-range' GPU then. You can't say it's mid and sell it at flagship prices.
> 
> I see it as a slippery price slope. If they're charging flagship prices on 'mid-range' today, they'll charge super-flagship prices on their high-end cards. We'll be seeing Tis at $1000-1500, Titans at $2000-3000, and 'dual' Titans at $4000-5000. Then everyone will whine, 'How did we get here?' about paying as much for a GPU as a decent-condition used car.
> 
> So, yes, it does equate to previous pricing over the last 10 years for GPU placement with performance expectation.
> 
> *Edit: Which falls under principle of expectation to not get sucked dry every 2 years for decent hardware.*


Don't buy it. No one is forcing you to. And as I have posted in these various threads, with proof from the last 16 years of video card releases (which I have been a consumer of, and longer): the pricing, as well as the performance increase generation to generation, has *basically been the same for 16 years*. Maybe you are just realizing it. Yeah, I *****ed about it a decade ago too; this is a very old argument. And you are getting much more for the same pricing, give or take 100 bucks, than you did a decade or more ago. The GeForce 2 Ultra was 550 bucks and had 64MB of memory. The 980 Ti was 600 bucks and had 6GB of memory. The 980 Ti is good for another year at LEAST before anyone needs to upgrade from it. If you have one or equivalent and are *****ing about the pricing of the GTX 1080, you are an idiot. It's the same basic pricing of every x80 product ever made. I bought a GTX 280 10 years ago when it released; guess how much it cost? That's right, about 600 bucks! Then they released the GTX 295 later, which **** on it. So you are *****ing about nothing new, nor unheard of. QQ.

Geforce 2 Ultra vs Geforce 3 non Ti 10-30fps increase depending on title http://www.tomshardware.com/reviews/geforce3-performance,311-8.html Evolva 1600x1200 32 bit color Bump mapping.
MDK2 1600x1200x32 http://www.tomshardware.com/reviews/geforce3-performance,311-9.html

1080p as reference.
480 vs 580 10 fps average increase http://www.techspot.com/review/359-nvidia-geforce-gtx-560ti/page6.html
580 vs 680 10 fps average increase http://www.anandtech.com/show/6973/nvidia-geforce-gtx-780-review/11
580, 680, 780, 10 fps increase average http://www.anandtech.com/show/6973/nvidia-geforce-gtx-780-review/12
680,780, 980 10 fps average increase http://www.anandtech.com/show/6973/nvidia-geforce-gtx-780-review/12

580 - 980
http://www.ocaholic.ch/modules/smartsection/item.php?itemid=1675&page=3
Difference between a GTX 580 and a GTX 980: about 45 fps in BF4 at 1920x1080.


----------



## linbetwin

http://www.techpowerup.com/222450/more-polaris10-and-polaris11-specifications-revealed


----------



## BulletBait

@KGPrime I already said I was dropping it. If you wish to continue, my PM box is open.

Anyways, here's something new on the performance side. Haven't seen a thread, don't like to start threads.


----------



## Forceman

Quote:


> Originally Posted by *linbetwin*
> 
> http://www.techpowerup.com/222450/more-polaris10-and-polaris11-specifications-revealed


So a 2048 shader count Ellesmere with 5.5 TFlops, and a 896 shader Baffin with 2.5 TFlops.
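A quick sanity check on those leaked numbers: for GCN, peak FP32 throughput is shaders × 2 FLOPs per clock × clock speed, so the shader counts and TFLOPS figures together imply a clock. A minimal sketch (treat the leaked figures as rumour; the 2-FLOPs-per-clock factor assumes standard GCN FMA throughput):

```python
# Solve the GCN peak-throughput formula for clock speed:
#   TFLOPS = shaders * 2 FLOPs/clock * clock
def implied_clock_ghz(tflops: float, shaders: int) -> float:
    """Clock (GHz) needed for `shaders` GCN ALUs to hit `tflops` peak FP32."""
    return tflops * 1e12 / (shaders * 2) / 1e9

# Figures from the TechPowerUp link above (rumoured, not confirmed)
print(f"Ellesmere: {implied_clock_ghz(5.5, 2048):.2f} GHz")  # ~1.34 GHz
print(f"Baffin:    {implied_clock_ghz(2.5, 896):.2f} GHz")   # ~1.40 GHz
```

Both implied clocks land around 1.3-1.4 GHz, noticeably above 28nm GCN parts, which is at least consistent with a FinFET node jump.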


----------



## ChevChelios

Quote:


> Originally Posted by *linbetwin*
> 
> http://www.techpowerup.com/222450/more-polaris10-and-polaris11-specifications-revealed


Ellesmere 5.5 TFlops for *$250* and I wave Nvidia goodbye until 2017









if it's for $300 - mmm, then it depends; $300 for that might not be so good anymore

with those specs it would be ~ at or a bit above 980/390X ?


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *ChevChelios*
> 
> Ellesmere 5.5 TFlops for *$250* and I wave Nvidia goodbye until 2017
> 
> 
> 
> 
> 
> 
> 
> 
> 
> if it's for $300 - mmm, then it depends; $300 for that might not be so good anymore
> 
> with those specs it would be ~ at or a bit above 980/390X ?


VC says 1070-like perf so ~Titan X/980 Ti


----------



## ChevChelios

the specs in the link above dont look to match up to 1070/980Ti ..

what is VC ?

but if it is 980Ti for $300 then sure, thats great


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *ChevChelios*
> 
> the specs in the link above dont look to match up to 1070/980Ti ..
> 
> what is VC ?
> 
> but if it is 980Ti for $300 then sure, thats great


http://videocardz.com/59903/possible-polaris-10-and-polaris-11-specifications-emerge


----------



## criminal

Quote:


> Originally Posted by *ChevChelios*
> 
> the specs in the link above dont look to match up to 1070/980Ti ..
> 
> what is VC ?
> 
> but if it is 980Ti for $300 then sure, thats great


http://videocardz.com/59903/possible-polaris-10-and-polaris-11-specifications-emerge
Quote:


> Originally Posted by *xxdarkreap3rxx*
> 
> http://videocardz.com/59903/possible-polaris-10-and-polaris-11-specifications-emerge


Damn.









1070 level performance for $300. I will take one.


----------



## Forceman

Quote:


> Originally Posted by *xxdarkreap3rxx*
> 
> VC says 1070-like perf so ~Titan X/980 Ti


I guess the question is how well it uses those TFlops. 390X is 5.9 and isn't 980 Ti level, while Fury X is 8.6 (?) and is.
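For context, those quoted numbers fall straight out of the same GCN formula (peak FP32 = shaders × 2 × clock); a quick sketch using the cards' ~1.05 GHz shipping clocks shows where the 5.9 and 8.6 come from. Peak throughput says nothing about how efficiently a game can actually feed those ALUs, which is exactly the open question:

```python
# Theoretical peak FP32 throughput for a GCN GPU:
#   TFLOPS = shaders * 2 FLOPs/clock * clock (GHz) / 1000
def peak_tflops(shaders: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPS; an upper bound, not a gaming-performance predictor."""
    return shaders * 2 * clock_ghz / 1000.0

r9_390x = peak_tflops(2816, 1.05)  # ~5.9 TFLOPS
fury_x  = peak_tflops(4096, 1.05)  # ~8.6 TFLOPS
print(r9_390x, fury_x)
```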


----------



## BulletBait

Quote:


> Originally Posted by *Forceman*
> 
> *I guess the question is how well it uses those TFlops.* 390X is 5.9 and isn't 980 Ti level, while Fury X is 8.6 (?) and is.


Stop reading my mind XD
Quote:


> Originally Posted by *BulletBait*
> 
> I can buy that, I guess it comes down to how their new architecture improvements perform to put that theoretical peak to actual work.


----------



## linbetwin

http://experience.amd.com/amdwebinar?sf26238011=1


----------



## variant

These rumors always seem to assume there's only one Polaris 10 and one Polaris 11.


----------



## SuperZan

Quote:


> Originally Posted by *variant*
> 
> These rumors always seem to assume there's only one Polaris 10 and one Polaris 11.


Amongst many other ridiculous assumptions, yes, this one is prominent. AMD is very much damned if they do, damned if they don't. Give us information/predictions and they are accused of hype-training to t3h max. Stay silent and people start rumours about GloFo sinking into the sea and Polaris being a rebranded 6850.

Disclaimer: mild sarcasm present


----------



## variant

Two 1440p High 67DF:C7 benchmarks [1] [2] showed up a couple days ago on the Ashes of Singularity site. The same person also used the same setup to benchmark a Fury series card and a 390 series card. It's on par with the 390 series card, and about 30% less performance than the Fury series card.

He also did 4K custom and 4K Crazy benchmarks for the 67DF:C7 a few days ago. Here's a 4K Crazy 390 series and 4K Fury series benchmark he did as well. I think it's interesting that the 67DF:C7 benchmarks does not have a large drop you expect going from Normal down to Heavy batches.

This appears to be at least 390X. If this is the cutdown Polaris 10, the full one should be quite impressive.

The 67DF:C7 and 67DF:C4 seem to be two versions of the 67DF silicon. If you remember, there were 67DF:C4 5K Crazy, 4K Crazy, and 1080p Standard benchmarks posted earlier. We have yet to see any sign of 67C0.


----------



## criminal

Quote:


> Originally Posted by *variant*
> 
> Two 1440p High 67DF:C7 benchmarks [1] [2] showed up a couple days ago on the Ashes of Singularity site. The same person also used the same setup to benchmark a Fury series card and a 390 series card. It's on par with the 390 series card, and about 30% less performance than the Fury series card.
> 
> He also did 4K custom and 4K Crazy benchmarks for the 67DF:C7 a few days ago. Here's a 4K Crazy 390 series and 4K Fury series benchmark he did as well. I think it's interesting that the 67DF:C7 benchmarks does not have a large drop you expect going from Normal down to Heavy batches.
> 
> This appears to be at least 390X. If this is the cutdown Polaris 10, the full one should be quite impressive.
> 
> The 67DF:C7 and 67DF:C4 seem to be two versions of the 67DF silicon. If you remember, there were 67DF:C4 5k Crazy, 4K Crazy, and 1080p Standard benchmarks posted earlier. We have yet to see signs of 67C0 yet.


There isn't even a 30% gap between the 390 series and Fury series cards.

https://www.techpowerup.com/reviews/AMD/R9_Fury_X/31.html


----------



## 364901

In the same AotS leaderboard we have:

1080p standard - AMD 67EF:C3 - not sure what this is, because it's not a known Baffin or Ellesmere hardware ID

1080p low - 67DF:C4 - huge improvements from v0.99 to v1.11 for Polaris 10

1080p low - AMD 67FF:C8  - another Baffin card

On the NVIDIA side:

Rank 86 and rank 246 for "NVIDIA Graphics Device" on Crazy 4K

Rank 143 and rank 332 for "NVIDIA Graphics Device" on Crazy 1440p

Rank 102, rank 109, and rank 222 for "NVIDIA Graphics Device" on Crazy 1080p

Rank 1 for "NVIDIA Graphics Device" on Extreme 4K

Rank 4 and rank 62 for "NVIDIA Graphics Device" on Extreme 1440

Rank 33 and rank 135 for "NVIDIA Graphics Device" on High 4K

Rank 22 for Guru3d's result on High 1440p

Rank 1209 for "%NVIDIA_DEV.17C2%" on High 1080p

Rank 5 and rank 1525 for "NVIDIA Graphics Device" on Standard 1080p. The second result might be for a mobile Pascal GPU?

Rank 31 for "NVIDIA Graphics Device" on Low 1080p. Another mobile chip?

Edit: Also, there's a new AMD CPU listed as "ZM2111C1Y4382_34/21/13/06_9874" which is a Carrizo-based product that also has 64-bit Android Geekbench results.


----------



## variant

Quote:


> Originally Posted by *criminal*
> 
> There isn't even a 30% gap between the 390 series and Fury series cards.
> 
> https://www.techpowerup.com/reviews/AMD/R9_Fury_X/31.html


There is on these benchmarks as my links demonstrated.
Quote:


> Originally Posted by *CataclysmZA*
> 
> 1080p standard - AMD 67EF:C3 - not sure what this is, because it's not a known Baffin or Ellesmere hardware ID


There's another 67EF:C3.


----------



## Eorzean

Quote:


> Originally Posted by *BulletBait*
> 
> I'll be the first to welcome you to team red, post-benchmarks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I thought they shot themselves in the foot PR-wise with that reference thing as well. But from what I've seen of the majority of the nV threads here, and the internet at large, it's a 'SHUT UP AND TAKE MY MONEY!' mentality. The nV crowd and company are now officially worse than Apple in my book. I never thought someone would dethrone Apple as the most cultish and exploitative ever... Then it went and happened.


Thanks for the (re-)welcome (I've been on team red before... I think?!). I'm personally a little conflicted at the moment. I've been green these past two generations and need to keep reminding myself of their lack of support and eventual degradation to the 7xx series (I've owned both a 780 and 780 Ti, which have started to oddly get a smackdown from the 970 benchmark wise, whereas AMD's 2xx/3xx series are actually gaining performance). As tempted as I am to get a 1070, I need to refrain from my brainwashing, be a little patient, and most likely buy whatever AMD is offering. If Polaris 10 doesn't quite match up with the 1070, I'll most likely pick one up, and then move onto either Vega or (again, my indoctrinated mind) a 1080 Ti. Really want to move to 2K gaming, just probably need a little patience.


----------



## Vario

AMD makes these kinds of hype statements every release. I don't know how anyone falls for the hype anymore, especially after Bulldozer.
Quote:


> Originally Posted by *Eorzean*
> 
> Thanks for the (re-)welcome (I've been on team red before... I think?!). I'm personally a little conflicted at the moment. I've been green these past two generations and need to keep reminding myself of their lack of support and eventual degradation to the 7xx series (I've owned both a 780 and 780 Ti, which have started to oddly get a smackdown from the 970 benchmark wise, whereas AMD's 2xx/3xx series are actually gaining performance). As tempted as I am to get a 1070, I need to refrain from my brainwashing, be a little patient, and most likely buy whatever AMD is offering. If Polaris 10 doesn't quite match up with the 1070, I'll most likely pick one up, and then move onto either Vega or (again, my indoctrinated mind) a 1080 Ti. Really want to move to 2K gaming, just probably need a little patience.


Buy the monitor first and see if you even need to upgrade. Midrange cards can run higher res you just need to turn down the anti-aliasing, which you don't need as much of anyway once you go higher resolution.


----------



## NicksTricks007

Well, now we wait. Nvidia has shown their hand, and it looks like a full house (not great, but pretty damn good). Since we already have an idea that AMD probably won't be directly competing for the top end, I feel Polaris absolutely needs to make the 1070 look like the lesser value.


----------



## Newbie2009

Quote:


> Originally Posted by *NicksTricks007*
> 
> Well, now we wait. Nvidia has shown their hand, and it looks like a full house (not great, but pretty damn good). Since we already have an idea that AMD probably won't be directly competing for the top end, I feel Polaris absolutely needs to make the 1070 look like the lesser value.


AMD probably need Valium right about now. The only flaw with the 1080 is the price.

AMD graphics are going the way of the AMD CPU department. Hope not, but I think so.


----------



## NicksTricks007

Quote:


> Originally Posted by *Newbie2009*
> 
> AMD probably need Valium right about now. The only flaw with the 1080 is the price.
> 
> AMD graphics are going the way of the AMD CPU department. Hope not, but I think so.


Well, I'm still not sold on the performance of the 1080. Early benchmark numbers are all over the place: as little as 5% faster than Fury in DX12 @4K, and as much as 40% in DX11 @1080p. Too inconsistent, and not worth the price premium for the F.U. Edition (aka reference design).

I really hope AMD knows what they're doing though. The perception is and has been the last few gpu generations that Nvidia is the best, regardless of the performance.


----------



## Newbie2009

Quote:


> Originally Posted by *NicksTricks007*
> 
> Well, I'm still not sold on the performance of the 1080. Early benchmark numbers are all over the place. As little as only 5% faster than Fury in DX 12 @4k and as much as 40% in DX 11 @1080p. Too inconsistent and not worth the price premium for the F.U. Edition (aka reference design).
> 
> I really hope AMD knows what they're doing though. The perception is and has been the last few gpu generations that Nvidia is the best, regardless of the performance.


I think it looks like the dogs. But you couldn't put a block on it. Taking into account the shelf life Nvidia cards have, I'll wait to see if AMD mess up before I dump them.


----------



## xzamples

Quote:


> Originally Posted by *Newbie2009*
> 
> AMD probably need Valium right about now. The only flaw with the 1080 is the price.
> 
> AMD graphics are going the way of the AMD CPU department. Hope not, but I think so.


vega beats 1080


----------



## Newbie2009

Quote:


> Originally Posted by *xzamples*
> 
> vega beats 1080


I should hope so lol


----------



## 7850K

Quote:


> Originally Posted by *Newbie2009*
> 
> AMD graphics are going the way of AMD cpu department.


you mean greatly improving? because that is the only direction the cpu department is headed.
Quote:


> Hope not but think so.


why? you sound confused and uninformed...


----------



## Robenger

Quote:


> Originally Posted by *7850K*
> 
> you mean greatly improving? because that is the only direction the cpu department is headed.
> why? you sound confused and uninformed...


He is uninformed. He's in every AMD news thread going on about how much they suck.


----------



## Newbie2009

Quote:


> Originally Posted by *7850K*
> 
> you mean greatly improving? because that is the only direction the cpu department is headed.
> why? you sound confused and uninformed...


Greatly improving? Intel own the market. Don't bother talking about products which aren't out yet.


----------



## SuperZan

Quote:



> Originally Posted by *Newbie2009*
> 
> Greatly improving? Intel own the market. *Don't bother talking about products which aren't out yet.*


That may well be the daftest thing I've ever read on a PC discussion forum.

You're right, let's all hash over the 1080 reviews for the 47,000th time today. We've got to talk about the products that are out _now_!


----------



## BulletBait

Quote:


> Originally Posted by *SuperZan*
> 
> That may well be the daftest thing I've ever read on a PC discussion forum.
> 
> You're right, let's all hash over the 1080 reviews for the 47,000th time today. We've got to talk about the products that are out _now_!.


What's the consensus on the 70/80 anyways? I've been avoiding reading anything (reviews) about it to keep my lunch down.


----------



## SuperZan

Quote:


> Originally Posted by *BulletBait*
> 
> What's the consensus on the 70/80 anyways? I've been avoiding reading anything (reviews) about it to keep my lunch down.


1070 looks to be a 25% cut card this time 'round so people are speculating performance between 980 and 980 Ti with a 1070 Ti to be released after Polaris drops.

1080 looks like what it should be, 25-30% increase over 980 Ti. Interestingly the FE is running into some overclocking issues that are making its 2100 clock less impressive versus a ~1500 clocked 980 Ti where it loses some of its gains. It's also a slight improvement at 4k but it's no "60 fps at 4k" single-card solution at all. For 4k users with 980 Ti/Fiji/TX it's not worth the upgrade at all IMO. If a decent card can be had around $599 that's not a cheapo blower version (for non watercoolers) it would represent a nice upgrade at 1440/1080 for somebody with a 780 Ti / 290 type card.


----------



## airfathaaaaa

Well, isn't that interesting...

http://ranker.sisoftware.net/show_run.php?q=c2ffcdf4d2b3d2efdce4dde4d5e5c3b18cbc9aff9aa797b1c2ffc7&l=en
1.3ghz almost


----------



## Newbie2009

Quote:


> Originally Posted by *SuperZan*
> 
> That may well be the daftest thing I've ever read on a PC discussion forum.
> 
> You're right, let's all hash over the 1080 reviews for the 47,000th time today. We've got to talk about the products that are out _now_!.


What noteworthy products do AMD have out now then? Or we can speculate about products which aren't out yet. How about the overclocking dream that is the Fury X?


----------



## Forceman

Quote:


> Originally Posted by *airfathaaaaa*
> 
> well isnt that interesting..
> 
> http://ranker.sisoftware.net/show_run.php?q=c2ffcdf4d2b3d2efdce4dde4d5e5c3b18cbc9aff9aa797b1c2ffc7&l=en
> 1.3ghz almost


I don't know, is 1.3 supposed to be impressive on 14nm? Weren't people expecting 1.5~1.6 just last week?


----------



## TheLAWNOOB

Quote:


> Originally Posted by *airfathaaaaa*
> 
> well isnt that interesting..
> 
> http://ranker.sisoftware.net/show_run.php?q=c2ffcdf4d2b3d2efdce4dde4d5e5c3b18cbc9aff9aa797b1c2ffc7&l=en
> 1.3ghz almost


Not sure how much AMD improved their architecture.

With no improvements, this is about the same as an R9 390X.


----------



## 7850K

Quote:


> Originally Posted by *Newbie2009*
> 
> Greatly improving? Intel own the market. Don't bother talking about products which aren't out yet.


and that's the last post of yours I read


----------



## Newbie2009

Quote:


> Originally Posted by *7850K*
> 
> and that's the last post of yours I read


Oh no.







If you want to be a fanboy then crack on. AMD deserve criticism so they get it.


----------



## lolerk52

Quote:


> Originally Posted by *Forceman*
> 
> I don't know, is 1.3 supposed to be impressive on 14nm? Weren't people expecting 1.5~1.6 just last week?


It's a test chip. The last one we heard about was 800 MHz.


----------



## Robenger

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Not sure how much AMD improved their architecture.
> 
> With no improvements this is about the same as a R9 390X.


It's totes the same bruh.


----------



## sinholueiro

Well, almost 1.3 GHz is more reasonable now. A 1.6 GHz OC would be about a 23% OC, in line with the ~40% core clock increase of 14nm LPP (over a 1150 MHz medium OC on Hawaii or Fiji), so that core at OC speeds would match a Fury X, assuming the arch improvements can make 2304 cores almost as fast as 2816 cores (which is a very doable thing). I'm still waiting for a 2560-core version and GDDR5X samples.
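As a quick sanity check of those percentages (the clock figures are the rumoured ones from this thread, not confirmed specs):

```python
# Check the overclock arithmetic quoted above.
base_clock = 1.3   # rumoured Polaris 10 clock (GHz)
oc_clock = 1.6     # speculated OC ceiling (GHz)
hawaii_oc = 1.15   # "medium OC" on 28nm Hawaii/Fiji (GHz)

print(round((oc_clock / base_clock - 1) * 100))  # OC headroom over stock, in %
print(round((oc_clock / hawaii_oc - 1) * 100))   # gain over 28nm OC clocks, in %
```

The second figure lands at roughly 39%, which is where the "~40% increase" claim for 14nm LPP comes from.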


----------



## Newbie2009

Quote:


> Originally Posted by *sinholueiro*
> 
> Well, almost 1.3 GHz is more reasonable now. A 1.6 GHz OC would be about a 23% OC, in line with the ~40% core clock increase of 14nm LPP (over a 1150 MHz medium OC on Hawaii or Fiji), so that core at OC speeds would match a Fury X, assuming the arch improvements can make 2304 cores almost as fast as 2816 cores (which is a very doable thing). I'm still waiting for a 2560-core version and GDDR5X samples.


There was a Hitman bench at 1440p on Polaris @ 60fps, so similar to Fury X. But now they've seen Nvidia's performance, they have something to aim for/surpass.

I remember the HD 7970 was underclocked and was followed up by the GHz Edition to fight the GTX 680.


----------



## airfathaaaaa

Quote:


> Originally Posted by *Newbie2009*
> 
> There was a hitman bench at 1440p on polaris @ 60fps, so similar to fury x. But now they see Nvidia performance, they have something to aim for/surpass.
> 
> I remember the HD7970 was underclocked and was followed up by the ghz edition to fight the gtx 680


You think AMD didn't know about Pascal's capabilities? Of course they knew; it's their job to know.
What worries me is the complete silence... it could mean Polaris is way better than expected, or worse than they ever thought.


----------



## sinholueiro

Quote:


> Originally Posted by *Newbie2009*
> 
> There was a hitman bench at 1440p on polaris @ 60fps, so similar to fury x. But now they see Nvidia performance, they have something to aim for/surpass.
> 
> I remember the HD7970 was underclocked and was followed up by the ghz edition to fight the gtx 680


Fury X runs Hitman at 60fps on Ultra. The demo of Hitman they showed on P10 didn't have to be on Ultra.


----------



## NicksTricks007

Quote:


> Originally Posted by *airfathaaaaa*
> 
> You think AMD didn't know about Pascal's capabilities? Of course they knew; it's their job to know.
> What worries me is the complete silence... it could mean Polaris is way better than expected, or worse than they ever thought.


Well, the way I see it is if Polaris is worse than they thought, it would be better to just not release it and release Vega w/ HBM2 early. If Polaris isn't within spitting distance of 1070, then it needs to be priced accordingly. Like at least $100 less than 1070.


----------



## EightDee8D

Quote:


> Originally Posted by *NicksTricks007*
> 
> Well, the way I see it is if Polaris is worse than they thought, it would be better to just not release it and release Vega w/ HBM2 early. If Polaris isn't within spitting distance of 1070, then it needs to be priced accordingly. Like at least $100 less than 1070.


AMD already said they want to bring the cost of VR performance down: something like 970/290 performance for around $250-300, with more features and a lower TDP. The 1070 is well above that.


----------



## NicksTricks007

Quote:


> Originally Posted by *sinholueiro*
> 
> Fury X runs Hitman at 60fps on Ultra. The demo that they show of Hitman in P10 doesn't have to be on Ultra.


Something that caught my eye in the GTX 1080 benchmarks for Hitman and AotS is that when running DX12 @4K it's only 5-8% faster than the Fury X. I know it's only two benchmarks, which is cherry-picking, but still. If that's any indication of Pascal's DX12 performance, then maybe there's still hope for Polaris and Vega if they can take advantage of their architectural advancements in GCN.


----------



## airfathaaaaa

Quote:


> Originally Posted by *NicksTricks007*
> 
> Something that caught my eye in the GTX 1080 benchmarks for Hitman and AotS is that when running DX12 @4K it's only 5-8% faster than the Fury X. I know it's only two benchmarks, which is cherry-picking, but still. If that's any indication of Pascal's DX12 performance, then maybe there's still hope for Polaris and Vega if they can take advantage of their architectural advancements in GCN.


The indication for Pascal is this: http://www.computerbase.de/2016-05/geforce-gtx-1080-test/11/#diagramm-ashes-of-the-singularity-async-compute
Brute-forcing their way through is one way of claiming "async enabled", I guess; they're basically at a 2-3% regression again.

Also, check this:
http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_1080_review,15.html
Check what they say about the settings of the bench, and then check this:
http://www.ashesofthesingularity.com/metaverse#/personas/b02c74ea-e3d0-479f-9a11-0b0be0b732c7/match-details/9837e169-4ec9-4006-a93d-2306248b4d67
This is how they do benches nowadays... bending their word. Imagine if more devs actually had a dedicated site that could show benches like that; how could you trust anyone?


----------



## airfathaaaaa

More Polaris IDs, Golden Register Settings Added To AMDGPU
Seems like quite a lot of IDs for just the four cards we previously knew about.


----------



## renx

Are we expecting news today?


----------



## airfathaaaaa

lol forgot to put the link
https://www.phoronix.com/scan.php?page=news_item&px=AMDGPU-Polaris-More-IDs


----------



## Newbie2009

Quote:


> Originally Posted by *renx*
> 
> Are we expecting news today?


Supposed to be some.

http://www.tweaktown.com/news/52114/amd-provide-inside-look-polaris-18/index.html


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *Newbie2009*
> 
> Supposed to be some.
> 
> http://www.tweaktown.com/news/52114/amd-provide-inside-look-polaris-18/index.html


I really hope 390X perf isn't the best they can manage unless it's really inexpensive or can OC very well (like 970 type OCing). Just can't wrap my head around all of this.


----------



## KarathKasun

Quote:


> Originally Posted by *NicksTricks007*
> 
> Well, the way I see it is if Polaris is worse than they thought, it would be better to just not release it and release Vega w/ HBM2 early. If Polaris isn't within spitting distance of 1070, then it needs to be priced accordingly. Like at least $100 less than 1070.


Wow. AMD is a business. They can't just "not release" it. They have to get the cards on the market ASAP to start making back their R&D investment.

Looks like P10 is going to be R9 470 or R9 480. The desktop launch will probably be followed closely by a laptop part launch if power consumption rumors are true.


----------



## Newbie2009

Quote:


> Originally Posted by *KarathKasun*
> 
> Wow, AMD is a business. They cant just "not release" it. They have to get the cards on the market ASAP to start getting money back on the R&D investment.
> 
> Looks like P10 is going to be R9 470 or R9 480. The desktop launch will probably be followed closely by a laptop part launch if power consumption rumors are true.


I think the new laptop parts announced so far are rebrands.


----------



## KarathKasun

AFAIK there have not been any real laptop announcements outside of P11 targeting the "notebook market".


----------



## prznar1

Quote:


> Originally Posted by *Olivon*
> 
> Find this via Anandtech (thanks to Sweepr) :
> 
> http://tieba.baidu.com/p/4523923961?qq-pf-to=pcqq.group
> 
> Dunno if true or not.


Very good news for me :> VERY! Thx for posting this.


----------



## Newbie2009

Quote:


> Originally Posted by *KarathKasun*
> 
> AFAIK there have not been any real laptop announcements outside of P11 targeting the "notebook market".


http://www.pcworld.com/article/3070972/components-graphics/waiting-for-polaris-amds-new-radeon-m400-laptop-gpus-arent-new-at-allyet.html

Just this.


----------



## Majinwar

Quote:


> Originally Posted by *Newbie2009*
> 
> Supposed to be some.
> 
> http://www.tweaktown.com/news/52114/amd-provide-inside-look-polaris-18/index.html


Won't be anything that revealing unfortunately









Link


----------



## Newbie2009

Quote:


> Originally Posted by *xxdarkreap3rxx*
> 
> I really hope 390X perf isn't the best they can manage unless it's really inexpensive or can OC very well (like 970 type OCing). Just can't wrap my head around all of this.


Nah, I can't see that. People have been brainwashed by Nvidia's marketing. Mid-range next gen should be Fury X / 980 Ti level or better, with much better overclocking and less power consumption.

The Nvidia 1080 is a mid-range card; Nvidia can just market and price it as high-end because there's no competition.


----------



## prznar1

Quote:


> Originally Posted by *Majinwar*
> 
> Won't be anything that revealing unfortunately
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Link


At least they said what's what with that conference.


----------



## Cyclonic

Any link to where this presentation will be streamed?


----------



## SlackerITGuy

I just got kicked from the Webinar (user limit).


----------



## Majinwar

Quote:


> Originally Posted by *prznar1*
> 
> At least they said what's what with that conference.


No I agree. I like the honesty and the heads up, but my 4870 is very tired... lol


----------



## prznar1

Quote:


> Originally Posted by *Majinwar*
> 
> No I agree. I like the honesty and the heads up, but my 4870 is very tired... lol


LOL. Man, give it to a museum.


----------



## Newbie2009

Can't complain, good communication at least.


----------



## Cyclonic

Quote:


> Originally Posted by *Newbie2009*
> 
> Can't complain, good communication at least.


Yup, but yet again, fail marketing.


----------



## prznar1

Quote:


> Originally Posted by *Cyclonic*
> 
> Yup but yet again fail marketing


Why? After all, they just said that Polaris will be explained today, not revealed.


----------



## Cyclonic

Quote:


> Originally Posted by *prznar1*
> 
> Why? After all, they just said that Polaris will be explained today, not revealed.


They need to show something to derail the 1080/1070 hype, and not wait until everyone already has a 1080 at home.


----------



## Newbie2009

Quote:


> Originally Posted by *Cyclonic*
> 
> They need to show something to derail the 1080/1070 hype, and not wait until everyone already has a 1080 at home.


Agree. But they have a week to do it. The day before Pascal becomes available would do nicely. (If they have something ready to show, one would assume they'll show it.)


----------



## prznar1

Quote:


> Originally Posted by *Cyclonic*
> 
> They need to show something to derail the 1080/1070 hype, and not wait until everyone already has a 1080 at home.


dont worry, there is plenty of time to do so.


----------



## rbarrett96

Consumers are the ones that put us in this position. Does no one remember that the flagship 5000 series cards from AMD were $400? I thought that was ridiculous at the time, so I got an open-box 5850 for just over $200. Then at some point Nvidia just decided to up their prices because they could, and you all bought it anyway, which then forced AMD to raise their prices: while they wanted to remain competitive on price, they weren't going to leave free money on the table. Blame yourselves. I won't be buying any card that costs 50% of the price it takes to build my next PC.


----------



## vloeibaarglas

I wouldn't be surprised really. 1070 by all estimates is going to be near 980 Ti for $379. Seeing how Nvidia has always overcharged consumers for performance, $300 for 980 Ti from AMD would not be unreasonable.


----------



## bigjdubb

Quote:


> Originally Posted by *rbarrett96*
> 
> Consumers are the ones that put us in this position. Does no one remember that the flagship series 5000 cards from AMD were $400? I thought that was ridiculous at the time, so I got an open box 5850 for just over $200. Then at some point Nvidia just decided to up their prices because they could and you all bought it anyway which then forced AMD to raise their prices because while they wanted to remain competitive on price, they weren't going to leave free money on the table. Blame yourselves, I won't be buying any cards that cost 50% of the price it takes to build my next PC.


Says the guy with a GTX 680... the card that birthed this trend.


----------



## keikei

Since the thread about this webcast was locked, i'll leave this here:
Quote:


> Don't expect much from today's AMD MTE live webcast, no specific technical details to be revealed


----------



## rbarrett96

Quote:


> Originally Posted by *bigjdubb*
> 
> Says the guy with a GTX 680... the card that birthed this trend.


Which I got for less than $400 open box. I had to wait till the 780 Ti came out; I'm not stupid.


----------



## bigjdubb

Quote:


> Originally Posted by *keikei*
> 
> Since the thread about this webcast was locked, i'll leave this here:


I think that webinar is over already.


----------



## Waitng4realGPU

Quote:


> Originally Posted by *rbarrett96*
> 
> Which I got for less than $400 open box. I had to wait till the 780ti came out, I'm not stupid.


You waited a long time to buy that card though..............not exactly that much of a bargain.


----------



## keikei

Quote:


> Originally Posted by *bigjdubb*
> 
> I think that webinar is over already.


No big news, I take it? Haven't seen anything related on the news feed.


----------



## linbetwin

I understand it was only a recap of things already made public about Polaris. It was for AMD channel partners.


----------



## keikei

Quote:


> Originally Posted by *linbetwin*
> 
> I understand it was only a recap of things already made public about Polaris. It was for AMD channel partners.


Bummer. 6/1 reveal it is then...unless we get some solid leaks before then. Thanks.


----------



## bigjdubb

Quote:


> Originally Posted by *keikei*
> 
> Bummer. 6/1 reveal it is then...unless we get some solid leaks before then. Thanks.


Yeah. I think it's probably a good idea for AMD to get some numbers out there before the 1080 goes on sale if it is remotely competitive with the 1080. It wouldn't be hard to make a strong case for value with the initial $699 price of the 1080.


----------



## prznar1

What is the max cluster count of Polaris 11? 20?


----------



## TheLAWNOOB

I think it's 9. 2560-core cards were nowhere to be found; I've only seen 2304 and 2048.


----------



## prznar1

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> I think its 9. 2560 core cards were no where to be found. Only seen 2304 and 2048.


2K+ shaders is Polaris 10; I'm talking about Polaris 11.


----------



## airfathaaaaa

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> I think its 9. 2560 core cards were no where to be found. Only seen 2304 and 2048.


Well, the problem is this:
https://www.phoronix.com/scan.php?page=news_item&px=AMDGPU-Polaris-More-IDs
Now we have even more cards on top of the ones we previously knew about.


----------



## TheLAWNOOB

Quote:


> Originally Posted by *prznar1*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheLAWNOOB*
> 
> I think its 9. 2560 core cards were no where to be found. Only seen 2304 and 2048.
> 
> 
> 
> 2k+ of shaders is polaris 10, im talking about polaris 11

It's going to be so small that the "it's not about the size, it's about how you use it" argument won't even apply.


----------



## variant

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> I think its 9. 2560 core cards were no where to be found. Only seen 2304 and 2048.


The only indication of a 2048 Polaris is one rumor. We have actually seen the 2304 one in SiSoftware results.


----------



## hokk

May 4, 2016
The Ellesmere XT chip is still at the A0 testing stage.
Frequencies? Stream processor counts? Sorry, those are exactly what's being measured right now.
At the moment it's simply a chip stuffed with transistors; they're working out how many stream processors it can run, at what frequency and power.
RTG employees worldwide are on it, and the A1 production-candidate chip is expected for testing in the third week of May.
At that point performance, power and basic specifications can be determined; the rest is driver tuning and product optimisation.
Any benchmark numbers before the A1 chip is out are pure speculation.
Yes, RTG has now issued the proposed reference PCB designs, and the various graphics AIBs have entered the pre-production stage.
Baffin's proposed PCB is comparatively short and simple: a 4-layer PCB plus a single fan will do, shorter than the Nano.
Ellesmere's recommended PCB is slightly longer than the Nano, but still only a 6-layer PCB plus a single fan.
The OP asked the AIBs whether they could do a dual-fan design; the AIBs said they'd consider it...
The OP asked whether they could do a triple-fan design; the AIBs said they'll settle for two fans on the retail cards...
Finally, Ellesmere Pro and Baffin A0 aren't out yet either, so don't rush it; RTG is working on it.
PS: Someone pass a message to USG Ishimura and see if he still says the 1070 has been cancelled or that it beats the 980 Ti.

(Cleaned up from a rough Google translation, so some details may be off.)


----------



## bucdan

AMD's stock just rose 4%, I think the outlook is pretty positive for P10's announcement today.


----------



## sugarhell

I'd bet on this as well.

http://www.pcauthority.com.au/News/419711,amd-enjoys-a-rare-market-share-increase.aspx


----------



## bigjdubb

Quote:


> Originally Posted by *bucdan*
> 
> AMD's stock just rose 4%, I think the outlook is pretty positive for P10's announcement today.


It is my understanding that the web event you are referring to has already happened and it was not a Polaris announcement, just a little rehash of what's on the Polaris website.


----------



## Forceman

Quote:


> Originally Posted by *bucdan*
> 
> AMD's stock just rose 4%, I think the outlook is pretty positive for P10's announcement today.


There is no announcement today. The webinar is already over and there was no real news.


----------



## bucdan

Quote:


> Originally Posted by *bigjdubb*
> 
> It is my understanding that the web event you are referring to has already happened and it was not a Polaris announcement, just a little rehash of what's on the Polaris website.


Quote:


> Originally Posted by *Forceman*
> 
> There is no announcement today. The webinar is already over and there was no real news.


Ah, that's good to know. I stand corrected.


----------



## Vf2ss


What I imagine they talked about during the webcast.


----------



## variant

Quote:


> Originally Posted by *kylzer*
> 
> May 4, 2016
> The Ellesmere XT chip is still at the A0 testing stage.
> Frequencies? Stream processor counts? Sorry, those are exactly what's being measured right now.
> At the moment it's simply a chip stuffed with transistors; they're working out how many stream processors it can run, at what frequency and power.
> RTG employees worldwide are on it, and the A1 production-candidate chip is expected for testing in the third week of May.
> At that point performance, power and basic specifications can be determined; the rest is driver tuning and product optimisation.
> Any benchmark numbers before the A1 chip is out are pure speculation.
> Yes, RTG has now issued the proposed reference PCB designs, and the various graphics AIBs have entered the pre-production stage.
> Baffin's proposed PCB is comparatively short and simple: a 4-layer PCB plus a single fan will do, shorter than the Nano.
> Ellesmere's recommended PCB is slightly longer than the Nano, but still only a 6-layer PCB plus a single fan.
> The OP asked the AIBs whether they could do a dual-fan design; the AIBs said they'd consider it...
> The OP asked whether they could do a triple-fan design; the AIBs said they'll settle for two fans on the retail cards...
> Finally, Ellesmere Pro and Baffin A0 aren't out yet either, so don't rush it; RTG is working on it.
> PS: Someone pass a message to USG Ishimura and see if he still says the 1070 has been cancelled or that it beats the 980 Ti.
> 
> (Cleaned up from a rough Google translation, so some details may be off.)


Except we've seen working Polaris 11 as far back as March.


----------



## Ultracarpet

The performance per inch better be at least 30% better or AMD is done. Mark my words.


----------



## spyshagg

Quote:


> Originally Posted by *sugarhell*
> 
> I bet of this also.
> 
> http://www.pcauthority.com.au/News/419711,amd-enjoys-a-rare-market-share-increase.aspx


Much higher than the 18% they had last year. Good indicator


----------



## NicksTricks007

Quote:


> Originally Posted by *KarathKasun*
> 
> Wow, AMD is a business. They cant just "not release" it. They have to get the cards on the market ASAP to start getting money back on the R&D investment.
> 
> Looks like P10 is going to be R9 470 or R9 480. The desktop launch will probably be followed closely by a laptop part launch if power consumption rumors are true.


Quote:


> Originally Posted by *EightDee8D*
> 
> Amd already said they want to bring vr performance down. like 970-290 performance for around 250-300 with more features and low tdp. 1070 is well above that.


I should have worded that differently; I realized after rereading what I posted, lol. What I meant was maybe they should not release Polaris now and instead release Vega early, basically launching their low, mid and high range at the same time. I know that's unrealistic, and I'm no business connoisseur. Just an average PC enthusiast hoping for more competition in the GPU market.


----------



## prznar1

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Its going to be so small that the "its not about the size, its about how you use it" argument wont even apply.


I don't need more.







I need a small PCB with enough performance to run WoT and SWTOR.


----------



## bigjdubb

Quote:


> Originally Posted by *prznar1*
> 
> i dont need more
> 
> 
> 
> 
> 
> 
> 
> i need small pcb with enough performance to run wot and swtor


Maybe the next gen APU's will be the ticket.


----------



## Ding Chavez

I know the OP article says Polaris could have GTX 980ti level performance which would be good. But was just looking at this.
Quote:


> AMD Polaris GPU Specs Leaked; Polaris 10 Delivers R9 390/X Performance Levels for Less Than $300


http://techfrag.com/2016/05/15/amd-polaris-gpu-specs-leaked/

Doesn't sound as good, but would 390X level be good enough? 980 Ti level I'd buy; 390X level, not so sure...


----------



## airfathaaaaa

Quote:


> Originally Posted by *Ding Chavez*
> 
> I know the OP article says Polaris could have GTX 980ti level performance which would be good. But was just looking at this.
> http://techfrag.com/2016/05/15/amd-polaris-gpu-specs-leaked/
> 
> Doesn't sound as good but would 390X level be good enough, 980ti level I'd buy it, 390X level not so sure...


A rumor based on another rumor, and the base rumor doesn't provide any source.
Good luck; yet another rumor to pile on.


----------



## variant

Quote:


> Originally Posted by *Ding Chavez*
> 
> I know the OP article says Polaris could have GTX 980ti level performance which would be good. But was just looking at this.
> http://techfrag.com/2016/05/15/amd-polaris-gpu-specs-leaked/
> 
> Doesn't sound as good but would 390X level be good enough, 980ti level I'd buy it, 390X level not so sure...


I think that rumor can actually be mapped out if you wanted to go through all the rumor mongering websites. It's a conflation of the rumor that said it would be 980Ti at $300 and the rumor that said it would be 390X at $200.


----------



## prznar1

Quote:


> Originally Posted by *bigjdubb*
> 
> Maybe the next gen APU's will be the ticket.


If so, I'll ditch the ATX format in favor of mini-ITX.


----------



## bucdan

Quote:


> Originally Posted by *prznar1*
> 
> If so, ill ditch the atx format in favor of mini-itx


I've dumped ATX for mITX myself. The space savings is great for my desk. I miss out on a lot of ports and expansions, but that's okay. I do see that many case manufacturers are making mITX cases that are mDTX compatible too. If AMD starts the mDTX train, Crossfire in a mDTX, yes please! But more likely, it'll be an audio card for me. Nothing beats the challenge of cramming a full size video card, psu, high end cooler, and drives into a small case the size of 2 encyclopedia books lol.


----------



## bigjdubb

Quote:


> Originally Posted by *prznar1*
> 
> If so, ill ditch the atx format in favor of mini-itx


I will be building a second machine with an APU and ITX board as soon as I can get one with HDMI 2.0 and HDCP 2.2.


----------



## variant

I had posted this elsewhere, but I thought I would post this here as well. I wanted to see roughly what Polaris 10 could do, and while TFLOPs are a good rough estimate in many cases, they don't work well when comparing AMD and Nvidia, and specifically when comparing every other AMD card to the Fury series, since the latter are seriously bottlenecked. So I decided to use a random 3DMark 2013 Firestrike benchmark and extrapolate Polaris 10 from the 380X, since it's the only full Tonga available. I also used a GTX 980 to extrapolate the power of the 1080 and compared it to the real 1080 benchmark I later found, and then the 1070.

I basically determined the percentage of stream processor or CUDA core differences over the card I am using to extrapolate (380X, 980) and the percentage of clockspeeds, and then adjusted the Firestrike score accordingly. The 1080 score got pretty close considering I am rounding numbers, only a 2.6% difference. For the Polaris 10, I used the 67DF:C7 from here and the 67C0 is just speculation that it would be a full 2560 cores. I then just decided to see what clockspeeds would be needed to beat the 980Ti, and I specifically used 1500 Mhz because there's been a lot of speculation that would be a likely clockspeed for full Polaris 10.

Here's the list of Firestrike scores.

380X: 8,457
67DF:C7 (1266 MHz): 11,607
980: 11,168
390X: 11,686
67C0 (1266 MHz): 12,896
67DF:C7 (1500 MHz): 13,795
Fury X: 14,374
1070: 14,546
980 Ti: 15,656
67C0 (1500 MHz): 15,721
1080: 19,370
1080 (extrapolation): 19,893
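The core-count-and-clock scaling described above can be sketched in a few lines (a rough illustration of the method, not variant's exact spreadsheet; the 380X baseline clock assumed here is its 970 MHz reference clock, and the target core counts are the rumoured Polaris 10 configs from the thread, so the outputs won't match his figures exactly):

```python
def extrapolate_firestrike(base_score, base_cores, base_clock_mhz,
                           target_cores, target_clock_mhz):
    """Scale a Firestrike graphics score linearly by the ratio of
    shader cores and the ratio of core clocks, as described above."""
    return (base_score
            * (target_cores / base_cores)
            * (target_clock_mhz / base_clock_mhz))

# Baseline: R9 380X, 2048 stream processors, 970 MHz reference clock,
# scoring 8,457 (the score quoted above).  Targets: the rumoured
# cut-down (2304-core) and full (2560-core) Polaris 10 configs.
p10_cut = extrapolate_firestrike(8457, 2048, 970, 2304, 1266)
p10_full = extrapolate_firestrike(8457, 2048, 970, 2560, 1500)
print(round(p10_cut), round(p10_full))
```

Linear scaling like this ignores memory bandwidth and architectural changes, which is exactly why the estimates get shakier the further the target card is from the baseline.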


----------



## Robenger

Quote:


> Originally Posted by *variant*
> 
> I had posted this elsewhere, but I thought I would post this here as well. I wanted to see roughly what the Polaris 10 could do and while TFLOPs were a good rough estimate in many cases, it doesn't work well when comparing AMD and Nvidia, and specifically every other AMD cards and the Fury series since they are seriously bottlenecked. So I decided to use a random 3DMark 2013 Firestrike benchmark and extrapolate the Polaris 10 from the 380X since it's the only full Tonga available. I also used a GTX 980 to extrapolate the power of the 1080 and compared it to the real 1080 benchmark I later found, and then the 1070.
> 
> I basically determined the percentage of stream processor or CUDA core differences over the card I am using to extrapolate (380X, 980) and the percentage of clockspeeds, and then adjusted the Firestrike score accordingly. The 1080 score got pretty close considering I am rounding numbers, only a 2.6% difference. For the Polaris 10, I used the 67DF:C7 from here and the 67C0 is just speculation that it would be a full 2560 cores. I then just decided to see what clockspeeds would be needed to beat the 980Ti, and I specifically used 1500 Mhz because there's been a lot of speculation that would be a likely clockspeed for full Polaris 10.
> 
> Here's the list of Firestrike scores.
> 
> 380X: 8,457
> 67DF:C7 (1266 Mhz): 11,607
> 980: 11,168
> 390X: 11,686
> 67C0 (1266 Mhz): 12,896
> 67DF:C7 (1500 Mhz): 13,795
> Fury X: 14,374
> 1070: 14,546
> 980 Ti: 15,656
> 67C0 (1500 Mhz): 15,721
> 1080: 19,370
> 1080 (extrapolation): 19,893


67C0 (1500 Mhz): 15,721

This is pretty good considering it's a midrange card, no?


----------



## variant

Quote:


> Originally Posted by *Robenger*
> 
> 67C0 (1500 Mhz): 15,721
> 
> This is pretty good considering its a midrange card no?


It would be if it is indeed accurate. It's in line with the 980Ti performance at $300 rumor.


----------



## Majinwar

Quote:


> Originally Posted by *variant*
> 
> It would be if it is indeed accurate. It's in line with the 980Ti performance at $300 rumor.


To be fair... the rumour is 'near 980Ti performance'. Assuming the rumour is correct, it would fall short of 980Ti perf.

However, let's hope it's not that far from it!


----------



## variant

Quote:


> Originally Posted by *Majinwar*
> 
> To be fair... the rumour is 'near 980Ti performance'. Assuming the rumour is correct, it would fall short of 980Ti perf.
> 
> However, let's hope it's not that far from it!


The word 'near' doesn't actually indicate less.


----------



## CasualCat

Quote:


> Originally Posted by *variant*
> 
> The word 'near' doesn't actually indicate less.


I think it is implicit in the context. Otherwise the rumor would be about it matching or outperforming the 980Ti imho.


----------



## variant

Quote:


> Originally Posted by *CasualCat*
> 
> I think it is implicit in the context. Otherwise the rumor would be about it matching or outperforming the 980Ti imho.


Maybe, but benchmarks and performance are often not so cut and dry. It could outperform the 980Ti in some cases, but match it or be below it in other cases. As an example, I think it will have a tougher time in 4K simply because of the memory bandwidth deficiency.


----------



## CasualCat

Quote:


> Originally Posted by *variant*
> 
> Maybe, but benchmarks and performance are often not so cut and dry. It could outperform the 980Ti in some cases, but match it or be below it in other cases.


That's definitely a possibility and could still qualify as near.


----------



## Robenger

Quote:


> Originally Posted by *Majinwar*
> 
> To be fair... the rumour is 'near 980Ti performance'. Assuming the rumour is correct, it would fall short of 980Ti perf.
> 
> However, let's hope it's not that far from it!


How is it falling short?


----------



## 364901

Quote:


> Originally Posted by *variant*
> 
> Here's the list of Firestrike scores.
> 
> 380X: 8,457
> 67DF:C7 (1266 Mhz): 11,607
> 980: 11,168
> 390X: 11,686
> 67C0 (1266 Mhz): 12,896
> 67DF:C7 (1500 Mhz): 13,795
> Fury X: 14,374
> 1070: 14,546
> 980 Ti: 15,656
> 67C0 (1500 Mhz): 15,721
> 1080: 19,370
> 1080 (extrapolation): 19,893


Eh, that would appear to be about right. Also, you're not taking into account that Pascal is basically Maxwell with some new tricks, whereas AMD is billing Polaris as almost a complete redesign (which, going by Nalasco's slip of the tongue in the AMD partner stream, is pretty accurate).



Quote:


> Originally Posted by *Robenger*
> 
> How is falling short?


The problem is this:



Different OS, different platform, different drivers and an updated version of 3DMark. Variant's math is more or less accurate, and it's at least in the same ballpark, but he's referencing a much older result. Things are different now. Just the other day I was testing a GTX 980 inside Acer's Predator G6, and a driver update bumped up FireStrike numbers by at least 2%. It's been improving ever since the launch.


----------



## BradleyW

Polaris 10 is not designed to go head to head with the GTX 1080. That's Vega's job. Polaris is going to fight for the middle of the market.


----------



## variant

Quote:


> Originally Posted by *CataclysmZA*
> 
> Eh, would appear to be about right. Also, you're not taking into account that Pascal is basically Maxwell with some new tricks. AMD on the other hand is chalking up Polaris to be almost a complete redesign (which, going by the slip of the tongue by Nalasco in the AMD Partner stream, is pretty accurate).


I can't account for improvements in the Polaris architecture since we don't know what effect they will ultimately have.

Quote:


> Originally Posted by *CataclysmZA*
> 
> The problem is this:
> 
> 
> 
> Different OS, different platform, different drivers and an updated version of 3DMark. Variant's math is more or less accurate, and it's at least in the same ballpark, but he's referencing a much older result. Things are different now. Just the other day I was testing a GTX 980 inside Acer's Predator G6, and a driver update bumped up FireStrike numbers by at least 2%. It's been improving ever since the launch.


That benchmark was done on Windows 10, they updated it for the 1080 and nothing changed, and the benchmark you posted isn't Firestrike.


----------



## Robenger

Quote:


> Originally Posted by *variant*
> 
> I can't account for improvements in Polaris architecture since we don't know what effects it will ultimately have.
> That benchmark is done on Windows 10, they updated it for the 1080 and nothing changed. and the benchmark you link isn't Firestrike.


I thought he was trying to pull one over on me lol


----------



## prjindigo

Quote:


> Originally Posted by *xzamples*
> 
> vega beats 1080


At this rate Vega will beat the GP102 (which is basically 50% more GTX 1080, but has to run slower to stay within its power budget)


----------



## linbetwin

Quote:


> AMD today announced it will hold a press conference and live webcast during Computex 2016 in Taipei, Taiwan. The event will begin on Wednesday, June 1, 2016 at 10:00 AM CST / 10:00 PM EDT. The event will feature launch of 7th Generation AMD A-Series Processors, *Polaris updates* and more.


Source


----------



## zealord

Quote:


> Originally Posted by *linbetwin*
> 
> Source


Polaris _updates_?

So no Polaris launch?


----------



## Forceman

Quote:


> Originally Posted by *zealord*
> 
> Polaris _updates_?
> 
> So no Polaris launch?


Doesn't seem that way. The speculation is that it'll launch at E3 like Fiji did.


----------



## TheLAWNOOB

Quote:


> Originally Posted by *zealord*
> 
> Quote:
> 
> 
> 
> Originally Posted by *linbetwin*
> 
> Source
> 
> 
> 
> Polaris _updates_?
> 
> So no Polaris launch?

I bet they are not even going to fix this


----------



## zealord

Quote:


> Originally Posted by *Forceman*
> 
> Doesn't seem that way. The speculation is that it'll launch at E3 like Fiji did.


I'd rather have Vega









@ Lawnoob, what am I looking at specifically here? I've never had CF, so no idea


----------



## TheLAWNOOB

Quote:


> Originally Posted by *zealord*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> Doesn't seem that way. The speculation is that it'll launch at E3 like Fiji did.
> 
> 
> 
> I'd rather have Vega
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @ Lawnoob what am I looking at specifically here? I never had CF so no idea

Look at the frametime. I was trying to play CSGO, which does fine with one GPU.

I disabled CrossFire in the driver, and it turns out CrossFire stays active even if you choose to "disable" it if you have Windows 10.

I got Windows 10 for DX12. Thanks AMD.


----------



## zealord

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Look at the frametime. Was trying to play CSGO, which does fine with 1 GPU.
> 
> I disabled crossfire in driver, and turns out crossfire will always be active even if you choose to "disable" it if you have Windows 10.
> 
> I got windows 10 for DX12. Thanks AMD.


Sorry to hear that. Is it related to VSync somehow? I never use multi-GPU


----------



## TheLAWNOOB

Nope. It's the infamous frame time issue with CF brought to light by PCPer and FCAT.

AMD fixed it somewhat, but the games I tried still stutter badly with CF if the framerate goes below 60. Keep in mind I have a FreeSync monitor, which is meant to reduce the effect of lower framerates.

In CSGO, since CF can't be disabled (thanks, Windows 10), it runs like a piece of crap. It doesn't matter if I cap it to 60fps or leave it uncapped; it feels like I'm playing at 10fps. I tried my best and got a 0.2 K/D in online deathmatch. I usually get a 2-3 K/D.

Playing CSGO with CF is actually worse than playing at 30fps. A lot worse.


----------



## Forceman

Quote:


> Originally Posted by *zealord*
> 
> I'd rather have Vega


I don't think I can wait that long. I've had this 290X longer than I've ever had any piece of performance-impacting computer hardware, and frankly I'm getting sick of looking at it.


----------



## zealord

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> *Nope. It's the infamous frame time issue with CF brought to light by PCPer and FCAT.*
> 
> AMD fixed it somewhat, but the games I tried still studders badly with CF if framerate goes below 60. Keep in mind I have a freesync monitor, which is meant to reduce the effect of lower framerates.
> 
> In CSGO, since CF cant be disabled (thanks windows 10), it runs like a piece of crap. It doesnt matter if I cap it to 60fps or leave it uncapped, it feels like I'm playing at 10fps. I tried my best and got a 0.2KD in online deathmatch. I usually get 2-3KD.
> 
> Playing CSGO with CF is actually worse than playing at 30fps. A lot worse.


I thought AMD fixed that.

I remember that CF 7970 and 7990 were HORRIBLE with runt frames and everything, but I thought with Hawaii and especially Fury they were actually doing well with CF


----------



## TheLAWNOOB

Well, it doesn't work if you don't have a constant 60fps in KF2 or Tomb Raider 2013.

They never fixed CF for DX9.

And CF can't be disabled in Windows 10.

Still waiting for CF support in NFS 2016. The game crashes at the loading screen with the CF profile provided by AMD.


----------



## rdr09

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> I bet they are not even going to fix this


Something's wrong with your setup. I've had my two 290s for over two years now and never seen core clocks fluctuate like that.

How did you install the driver?


----------



## 364901

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Well it doesnt work if you dont have a constant 60fps in KF2 or Tomb Raider 2013.
> 
> They never fixed CF for DX9.
> 
> And CF cant be disabled in Windows 10.
> 
> Still waiting for CF support in NFS 2016. Game crashes in loading screen with CF profile provided by AMD.


Have you tried reporting this to someone at AMD? I can suggest @theMattB81, he's helped me out before.


----------



## Newbie2009

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Well it doesnt work if you dont have a constant 60fps in KF2 or Tomb Raider 2013.
> 
> They never fixed CF for DX9.
> 
> And CF cant be disabled in Windows 10.
> 
> Still waiting for CF support in NFS 2016. Game crashes in loading screen with CF profile provided by AMD.


Something's up with your config. Try disabling ULPS. I'm not familiar with Windows 10, but CrossFire is excellent on W7 these days.

*Edit - it looks like you are running a mild OC. Use Sapphire TriXX to disable ULPS, or you need to edit the registry. ULPS is a headache when not running cards at stock.
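For reference, the registry route usually described for disabling ULPS is setting the `EnableUlps` DWORD to 0 under each AMD display-adapter subkey (a sketch of the commonly cited community edit, not official AMD guidance; the `0000` subkey number varies per system, so search the registry for which numbered subkeys actually contain `EnableUlps` before changing anything):

```
Windows Registry Editor Version 5.00

; Disable ULPS (Ultra Low Power State) for one AMD display adapter.
; The GUID below is the standard display-adapter class key; repeat
; this entry for every numbered subkey that contains EnableUlps.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

A reboot (or driver restart) is typically needed afterwards; tools like Sapphire TriXX just automate the same toggle.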


----------



## sugarhell

You should never play CSGO with mGPU. mGPU adds a lot of latency, so there's no point playing CSGO with it. Just don't use exclusive fullscreen and CrossFire won't kick in


----------



## KarathKasun

mGPU is dying anyway; I wouldn't waste the money on it at this point.

Maybe when VR actually gets established in the market.


----------



## spyshagg

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Well it doesnt work if you dont have a constant 60fps in KF2 or Tomb Raider 2013.
> 
> They never fixed CF for DX9.
> 
> And CF cant be disabled in Windows 10.
> 
> Still waiting for CF support in NFS 2016. Game crashes in loading screen with CF profile provided by AMD.


Catalyst 15.11.1 beta lets you disable CrossFire per application (game profile in Catalyst); it works, I tried it.

With Crimson I don't remember if the per-app function worked (it's there, but I think it ignored what I chose); I'll try tomorrow.

At worst you can disable CrossFire for all apps, and that works (I remember using it).

I'm using Windows 10.


----------



## spyshagg

Quote:


> Originally Posted by *KarathKasun*
> 
> MGPU is dying anyway, I wouldnt waste the money on it at this point.
> 
> Maybe when VR actually gets established in the market.


It makes my BF4, PCars and Assetto Corsa experiences much, much better. I'm hitting 150fps at 1440p in BF4 almost all the time @ ultra. With only one 290X active it's not nearly as smooth.

I do want CF working, because when it works it's wonderful. I have high hopes for mGPU with DX12. Those who bought a 290X for peanuts will be set for years to come. It's becoming the 2600K of GPUs.


----------



## TheLAWNOOB

Thanks, I'll try disabling it by default


----------



## spyshagg

No problem.

Unless you need the latest Crimson for a specific game (such as Doom), I would strongly suggest using the Catalyst 15.11.1 BETA; it's a beautiful driver. Everything works in it.

It's my go-to driver as I only play BF4, BLOPS3, PCars, Assetto and R3E.

When a good single-player game is released that needs the latest driver to work, I'll use Crimson until I'm finished with the game and then go back to 15.11.1 (it's basically what I'm doing now with Doom)


----------



## TheLAWNOOB

Unfortunately I have to use the newest driver for a while. I just bought Rise of the Tomb Raider, and I'm waiting for proper CF support in NFS 2016


----------



## prznar1

Quote:


> Originally Posted by *zealord*
> 
> Polaris _updates_?
> 
> So no Polaris launch?


I think it is another misunderstanding. They will say a bit about the next range of Polaris cards. Polaris 10 and 11 are on schedule for this year.


----------



## criminal

Quote:


> Originally Posted by *Forceman*
> 
> Doesn't seem that way. The speculation is that it'll launch at E3 like Fiji did.


Which is what, two weeks later?


----------



## sugarhell

Quote:


> Originally Posted by *criminal*
> 
> Which is what, two weeks later?


14 June


----------



## BulletBait

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Unfortunately I have to use the newest driver for a while. Just bought Rise of tomb raider, and waiting for proper CF support in NFS 2016


Yeah, the latest non-WHQL 16.15.2somethingsomething disabled the preference to disable CF in the more options menu. It kind of annoyed me at first, since I was looking to make sure it was enabled for stress testing when I brought my rig back up.

I will say I'm not a fan of taking the option away, but it seems AMD are moving in that direction permanently (Navi/scalability etc.)

Edit: This was from the 5/11 driver:

The AMD Crossfire mode options in Radeon Settings may not take effect on Origin or Uplay applications.

Maybe it's only temporary with the 5/16 driver until this issue is fixed, who knows.


----------



## WanWhiteWolf

Quote:


> Originally Posted by *BulletBait*
> 
> Yeah, the latest non-whql 16.15.2somethingsomething disabled the preference to disable CF in the more options menu. Kind of annoyed me at first since I was looking to make sure it was enabled for stress testing when I brought my rig back up.
> 
> I will say I'm not a fan of taking the option away, but AMD are moving that direction permanently it seems (Navi/Scalability ect.)
> 
> Edit: This was from the 5/11 driver
> 
> The AMD Crossfire mode options in Radeon Settings may not take effect on Origin or Uplay applications.
> 
> Maybe it's only temporary with the 5/16 driver until this issue is fixed, who knows.


You can create a custom profile (it's pretty easy) to open an application with specific settings (e.g CF enabled).

Basically you create a new profile and put the path to your application. Once the application is launched - even if it is not from within the Radeon menu - the settings will be applied (as far as I've tested).


----------



## BulletBait

Quote:


> Originally Posted by *WanWhiteWolf*
> 
> You can create a custom profile (it's pretty easy) to open an application with specific settings (e.g CF enabled).
> 
> Basically you create a new profile and put the path to your application. Once the application is launched - even if it is not from within the Radeon menu - the settings will be applied (as far as I've tested).


He *just* said that wasn't working for him in CS and that he was trying to disable CF completely instead to fix it (which you can't do right now).


----------



## Forceman

Quote:


> Originally Posted by *prznar1*
> 
> I think it is another missunderstanding. They will say a bit about next polaris cards range. Polaris 10 and 11 are on shelude for this year.


They'll announce when they are going to announce them, at which time they will announce when they will go on sale, maybe end of June/early July. Just going off the Fury schedule.


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *Forceman*
> 
> They'll announce when they are going to announce them, at which time they will announce when they will go on sale, maybe end of June/early July. Just going off the Fury schedule.


An announcement for the announcement that announces their new Polaris graphics cards!


----------



## airfathaaaaa

Well, to be fair, the only OFFICIAL thing AMD said about Polaris was that it would be in stores by mid-2016


----------



## bigjdubb

Quote:


> Originally Posted by *airfathaaaaa*
> 
> well to be fair the only OFFICIAL thing amd said about polaris was that they will be on stores on mid 2016


So I guess it's a matter of how wide "mid" is. I don't think anyone is saying they over-promised, but there were a lot of AMD fans who swore up and down that AMD would launch first. This is probably a little disappointing for those people.


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *bigjdubb*
> 
> So I guess it's a matter of how wide is"mid". I don't think anyone is saying they over promised but there was a lot of AMD fans who swore up and down that AMD would launch first. This is probably a little disappointing for those people.


Definitely for me. I was super excited to ditch Nvidia again, but now I have to wait, and it looks like they might not even release anything that can surpass a 980 Ti for some time. I still have hope the rumors aren't true.


----------



## jdstock76

Except it'll be, what, two years after the 980 Ti's release when we see it? The 980 Ti will be $300 by then. No need to purchase Polaris.


----------



## GoLDii3

Quote:


> Originally Posted by *bigjdubb*
> 
> So I guess it's a matter of how wide is"mid". I don't think anyone is saying they over promised but there was a lot of AMD fans who swore up and down that AMD would launch first. This is probably a little disappointing for those people.


I swear I hate this situation. I sold my VGA just two weeks ago, and there seems to be nothing official at all from AMD. I don't want to buy Nvidia because prices are crazy here in Europe.

And even if there's a slim chance I may see a custom 1070 for 400 bucks here in Europe, when will that be? July?

I wish AMD would get their, ehm... together and at least paper launch Polaris.


----------



## airfathaaaaa

Quote:


> Originally Posted by *bigjdubb*
> 
> So I guess it's a matter of how wide is"mid". I don't think anyone is saying they over promised but there was a lot of AMD fans who swore up and down that AMD would launch first. This is probably a little disappointing for those people.


Mid-2016 means the end of June; anything later is a delay


----------



## rbarrett96

Quote:


> Originally Posted by *Waitng4realGPU*
> 
> You waited a long time to buy that card though..............not exactly that much of a bargain.


I waited because I wanted the 4 GB version to make it last. It was hard to find one too.


----------



## RedHood

Hey

Just leaving this here:


----------



## zanardi

July 2


----------



## zealord

Quote:


> Originally Posted by *RedHood*
> 
> Hey
> 
> Just leaving this here:


I don't see a 490X happening in 8 days. Maybe 480X


----------



## GoLDii3

Quote:


> Originally Posted by *zealord*
> 
> I don't see a 490X happening in 8 days. Maybe 480X


Could happen if it's a paper launch.

After all, there's P11, P10 and Vega 10. Vega is Fury-tier and Polaris is 400-series tier, unless they introduce a Vega 11.


----------



## spyshagg

I understand if AMD chose to release lower and mid-tier cards first. It's 85% of the market.

But,

I honestly can't predict what AMD will bring to the market. In the past they blew the competition out of the water when no one predicted it. Twice.

I'll make my predictions AFTER they release the cards


----------



## mav451

Quote:


> Originally Posted by *spyshagg*
> 
> I'll make my predictions AFTER they release the cards


But then it's no longer a prediction - just play-by-play analysis


----------



## zealord

Quote:


> Originally Posted by *GoLDii3*
> 
> Could happen if paper launch.
> 
> After all there's P11,P10 and Vega 10. Vega is Fury-tier and Polaris is 400 Series tier. Unless they introduce Vega 11.


They can't release a 490X that is slower than the GTX 1070 / 980 Ti. We've seen too many leaks and rumors about Polaris. We know it's mid-range; AMD even said so.

My bet:

Polaris 11 = super small, not even worth mentioning. Probably the 460(X).

Polaris 10 = 232mm², performance between the 980 and 980 Ti. Probably the 480(X).

Vega = no idea, but I hope the Vega chips are the 490 and 490X.


----------



## xzamples

http://experience.amd.com/NotifyMe/computex-live-stream/


----------



## renx

IMO, by the time they announce and show Polaris, they should address the enthusiast segment and make some kind of short-term promise.
They shouldn't just act like there are no people waiting for a high-end solution.
Polaris will probably shine in the low and mid range, but if they don't mention the high end and create some kind of hype, it would be really bad.


----------



## zealord

Quote:


> Originally Posted by *renx*
> 
> IMO, by the time they announce and show Polaris, they should address the enthusiast segment, and make some kind of short-term promise.
> They shouldn't just act like there's no people waiting for a high-end solution.
> Polaris will probably shine in the low and mid range. But if they don't mention the high-end and create some kind of hype, it would be really bad.


Yeah, I can already see it before my eyes. AMD is gonna say "best price/performance" and "best performance/watt" like 70 times during the presentation


----------



## sinholueiro

So 8 days is the day before the 27th of May, the day that the 1080 is launched, isn't it?


----------



## renx

Quote:


> Originally Posted by *zealord*
> 
> yeah I can already see it before my eyes . AMD gonna say "best price/performance" and "best performance/watt" like 70 times during the presentation


Exactly. At least Huang comes out with stuff like "The New King".
Then it may or may not be as great, but at least he tries to spoil the enthusiast.
On the other hand, I remember a certain Raja Koduri interview, and the guy was all about performance/watt/dollar.

Well, let's just hope.


----------



## flopper

Quote:


> Originally Posted by *zealord*
> 
> yeah I can already see it before my eyes . AMD gonna say "best price/performance" and "best performance/watt" like 70 times during the presentation


It's the new king for gamers.
What else is there to say about Polaris?


----------



## zealord

Quote:


> Originally Posted by *flopper*
> 
> Its the new King for gamers.
> what else is there to say about Polaris?


AMD could do it like this :

- Lisa Su comes on stage and welcomes Raja
- Raja says : "Hi. This is a very special day"
- "We promised you 16K 240hz HDR"
- "We have slides and roadmaps and new products"
- "Virtual Reality is a reality now"
- "There are no good VR games, but who cares right"
- "We at AMD know that for 16K 240hz HDR you need about 500 times the performance of a normal 1080p 60hz monitor"
- "We at AMD know that you want more performance"
- "We at AMD are here to bring it to you"
- "That is why we are introducing the new 232mm² R9 480 GPU !"
- "Yes it is slower than a Fury X"
- "Yes it is only a little bit faster than a 390X, which is basically just a 290X, which is a 3 year old product basically"
- "But I'll show you guys how I eat some fried chicken wings now. They are as delicious as our TrueAudio. You should check that out".


----------



## BulletBait

Quote:


> Originally Posted by *flopper*
> 
> Its the new King for gamers.
> what else is there to say about Polaris?


King for your Dollars.

I think that's just as viable as 'king for gamers.'


----------



## Cakewalk_S

Quote:


> Originally Posted by *zealord*
> 
> Quote:
> 
> 
> 
> Originally Posted by *flopper*
> 
> Its the new King for gamers.
> what else is there to say about Polaris?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> AMD could do it like this :
> 
> - Lisa Su comes on stage and welcomes Raja
> - Raja says : "Hi. This is a very special day"
> - "We promised you 16K 240hz HDR"
> - "We have slides and roadmaps and new products"
> - "Virtual Reality is a reality now"
> - "There are no good VR games, but who cares right"
> - "We at AMD know that for 16K 240hz HDR you need about 500 times the performance of a normal 1080p 60hz monitor"
> - "We at AMD know that you want more performance"
> - "We at AMD are here to bring it to you"
> - "That is why we are introducing the new 232mm² R9 480 GPU !"
> - "Yes it is slower than a Fury X"
> - "Yes it is only a little bit faster than a 390X, which is basically just a 290X, which is a 3 year old product basically"
> - "But I show you guys how I eat some fried chicken wings nows. They are as delicious as our TrueAudio. You should check that out".

- "We know you want to be able to game on a computer that can use so little power it can be powered by a backup generator"
- "Here's benchmarks of the new R9 480 GPU using the Intel i7-6700k CPU..."
- "No, I won't answer any questions about AMD's upcoming Zen CPU, I don't want to drive our stock price even lower..."


----------



## renx

I mean, let's forget 4K for a minute. 1440p monitors are selling like hot cakes.
As much as a good mid-range is appreciated, it won't be enough for 1440p 60Hz at high settings in most new games.
They will have to announce the powerful GPU as well, or anyone who's willing to spend $500+ will go for Nvidia.


----------



## GoLDii3

Quote:


> Originally Posted by *zealord*
> 
> yeah I can already see it before my eyes . AMD gonna say "best price/performance" and "best performance/watt" like 70 times during the presentation


performance per inch never seen before


----------



## Robenger

Quote:


> Originally Posted by *Cakewalk_S*
> 
> - "We know you want to be able to game on a computer that can use so little power it can be powered by a backup generator"
> - "Here's benchmarks of the new R9 480 GPU using the Intel i7-6700k CPU..."
> - "No, I won't answer any questions about AMD's upcoming Zen CPU, I don't want to drive our stock price even lower..."


The salt is real.


----------



## criminal

Quote:


> Originally Posted by *jdstock76*
> 
> Except it'll be what, two years after the 980ti release when we see it? The 980ti will be $300 by then. No need to purchase Polaris.


The 980Ti has another year before it is 2 years old. I am pretty sure Polaris will release before then.


----------



## Bryst

Quote:


> Originally Posted by *renx*
> 
> I mean, let's forget 4K for a minute. But 1440p monitors are selling like hot cakes.
> As much as a good mid-range is appreciated, they won't be enough for 1440p 60hz, high settings in most new games.
> They will have to anounce the power GPU as well, or anyone who's willing to spend 500+ dollars will go for Nvidia.


| Resolution | Share | Change |
| --- | --- | --- |
| 1366 x 768 | 25.82% | -0.19% |
| 1440 x 900 | 4.95% | +0.15% |
| 1536 x 864 | 3.14% | +0.04% |
| 1600 x 900 | 6.63% | -0.14% |
| 1600 x 1200 | 0.22% | 0.00% |
| 1680 x 1050 | 4.14% | -0.03% |
| 1920 x 1080 | 36.42% | -0.08% |
| 1920 x 1200 | 1.37% | -0.01% |
| 2560 x 1080 | 0.30% | 0.00% |
| *2560 x 1440* | *1.51%* | *+0.05%* |

"hot cakes"


----------



## BulletBait

Quote:


> Originally Posted by *renx*
> 
> IMO, by the time they announce and show Polaris, they should address the enthusiast segment, and make some kind of short-term promise.
> They shouldn't just act like there's no people waiting for a high-end solution.
> Polaris will probably shine in the low and mid range. But if they don't mention the high-end and create some kind of hype, it would be really bad.


First time I've ever heard anyone beg AMD to start a hype train.


----------



## zealord

Quote:


> Originally Posted by *BulletBait*
> 
> First time I've ever heard anyone beg AMD to start a hype train.


sorry what?

I don't understand your post at all


----------



## BulletBait

Quote:


> Originally Posted by *zealord*
> 
> sorry what?
> 
> I don't understand your post at all


Wasn't meant for you. My phone is being ridiculous, sorry. It's this stupid new 'Google Keyboard'; I hate it.


----------



## zealord

Quote:


> Originally Posted by *BulletBait*
> 
> Wasn't meant for you. My phone is being ridiculous, sorry. It's this stupid new 'google keyboard' I hate it.


Ah, all right, no worries then. It happens.


----------



## jdstock76

Quote:


> Originally Posted by *zealord*
> 
> AMD could do it like this :
> 
> - Lisa Su comes on stage and welcomes Raja
> - Raja says : "Hi. This is a very special day"
> - "We promised you 16K 240hz HDR"
> - "We have slides and roadmaps and new products"
> - "Virtual Reality is a reality now"
> - "There are no good VR games, but who cares right"
> - "We at AMD know that for 16K 240hz HDR you need about 500 times the performance of a normal 1080p 60hz monitor"
> - "We at AMD know that you want more performance"
> - "We at AMD are here to bring it to you"
> - "That is why we are introducing the new 232mm² R9 480 GPU !"
> - "Yes it is slower than a Fury X"
> - "Yes it is only a little bit faster than a 390X, which is basically just a 290X, which is a 3 year old product basically"
> - "But I show you guys how I eat some fried chicken wings nows. They are as delicious as our TrueAudio. You should check that out".


----------



## Cyclonic

Quote:


> Originally Posted by *zealord*
> 
> AMD could do it like this :
> 
> - Lisa Su comes on stage and welcomes Raja
> - Raja says : "Hi. This is a very special day"
> - "We promised you 16K 240hz HDR"
> - "We have slides and roadmaps and new products"
> - "Virtual Reality is a reality now"
> - "There are no good VR games, but who cares right"
> - "We at AMD know that for 16K 240hz HDR you need about 500 times the performance of a normal 1080p 60hz monitor"
> - "We at AMD know that you want more performance"
> - "We at AMD are here to bring it to you"
> - "That is why we are introducing the new 232mm² R9 480 GPU !"
> - "Yes it is slower than a Fury X"
> - "Yes it is only a little bit faster than a 390X, which is basically just a 290X, which is a 3 year old product basically"
> - "But I show you guys how I eat some fried chicken wings nows. They are as delicious as our TrueAudio. You should check that out".


----------



## EightDee8D

Quote:


> Originally Posted by *zealord*
> 
> AMD could do it like this :
> 
> - Lisa Su comes on stage and welcomes Raja
> - Raja says : "Hi. This is a very special day"
> - "We promised you 16K 240hz HDR"
> - *"We have slides and roadmaps and new products"*
> - "Virtual Reality is a reality now"
> - "There are no good VR games, but who cares right"
> - "We at AMD know that for 16K 240hz HDR you need about 500 times the performance of a normal 1080p 60hz monitor"
> - "We at AMD know that you want more performance"
> - "We at AMD are here to bring it to you"
> - "That is why we are introducing the new 232mm² R9 480 GPU !"
> - "Yes it is slower than a Fury X"
> - "Yes it is only a little bit faster than a 390X, which is basically just a 290X, which is a 3 year old product basically"
> - "But *I show you guys how I eat some fried chicken wings nows*. They are as delicious as our TrueAudio. You should check that out".


Damn dude, wii u so salty









although it's true


----------



## zealord

Quote:


> Originally Posted by *EightDee8D*
> 
> Damn dude , wii u so salty
> 
> 
> 
> 
> 
> 
> 
> 
> 
> although it's true


What does that have to do with being salty?









I am just trying to apply criticism to Nvidia, Intel, and AMD with some banter. We all know AMD has cringeworthy presentations and underwhelming product releases, and Nvidia is manipulative and overprices its products heavily.


----------



## Cyclonic

Quote:


> Originally Posted by *zealord*
> 
> what has that to do with being salty
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am just trying to apply ciriticism towards Nvidia Intel and AMD with some banter. We all know AMD has cringeworthy presentations and underwhelming product releases and Nvidia is manipulative and overprice their products heavily.


I'm going to be disappointed now if he doesn't talk about fried chicken wings, though


----------



## EightDee8D

Quote:


> Originally Posted by *zealord*
> 
> what has that to do with being salty
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am just trying to apply ciriticism towards Nvidia Intel and AMD with some banter. We all know *AMD has cringeworthy presentations* and underwhelming product releases and Nvidia is manipulative and overprice their products heavily.


Agree, but I think it's mostly because they have fewer resources to spend on marketing.


----------



## SuperZan

Quote:


> Originally Posted by *Cyclonic*
> 
> I'm going to be disappointed now if he doesnt talk about fried chicken wings now tho


Tripartite wing eating contest feat. Lisa Su vs. Raja Koduri vs. AMD ROY > JHH leather enthusiast showcase









Either way, the thing I take away from any company's presentation is the launch window. Third-party reviews (should) tell us much more than choreographed corporate theatre.


----------



## Disturbed117

No more racist images/gifs. If this continues I will hand out warnings and/or infractions.


----------



## TheLAWNOOB

Damn, every AMD thread gets derailed by certain people with certain GPUs. Poor AMD









(Am I doing this right? Are you disturbed?)


----------



## SuperZan

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Damn, every AMD thread gets derailed by certain people with certain GPUs. Poor AMD
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (Am I doing this right? Are you disturbed?)


"Men are disturbed not by things, but by the principles and notions which they form concerning things."

So yes.


----------



## airfathaaaaa

Quote:


> Originally Posted by *zealord*
> 
> what has that to do with being salty
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am just trying to apply ciriticism towards Nvidia Intel and AMD with some banter. We all know AMD has cringeworthy presentations and underwhelming product releases and Nvidia is manipulative and overprice their products heavily.


This is what we love

cringeworthy presentations to release awesome products

no one can even come close to AMD on that, praise them


----------



## kyrie74

Quote:


> Originally Posted by *Cyclonic*
> 
> I'm going to be disappointed now if he doesnt talk about fried chicken wings now tho


New rumor: Polaris comes with fried chicken wings.


----------



## airfathaaaaa

Quote:


> Originally Posted by *kyrie74*
> 
> New rumor: Polaris comes with fried chicken wings.


I guess they will give Polaris to reviewers with a KFC bucket full of nuggets?


----------



## m70b1jr

Quote:


> Originally Posted by *airfathaaaaa*
> 
> i guess they will give polaris to reviewers with a kfc bucket full of nuggets?


Speaking of KFC, I work at KFC xd


----------



## linbetwin

I'll just leave this here.

http://videocardz.com/60253/amd-radeon-r9-480-3dmark11-benchmarks


----------



## NicksTricks007

If those numbers are real and the price comes in under $400, then the 480X is indeed impressive. Even though it doesn't beat the 1080, at that price/performance I don't think it has to. The performance is close enough to give potential 1080 buyers something to think about if they price it $200-$300 less. At $350 a pop (if that's the price), you can get two for the price of one "reference" 1080.

Edit: Of course, a moderate amount of salt should be taken into consideration.


----------



## spyshagg

Quote:


> Originally Posted by *NicksTricks007*
> 
> If those numbers are real, and the price comes in at under $400 then the 480x is indeed impressive. Even though it doesn't beat 1080, at that price/performance I don't think it has to. The performance is close enough to give potential 1080 buyers something to think about if they price it $200-$300 less. At $350 a pop (if that's the price), you can get 2 for the price of one "reference" 1080.
> 
> Edit: Of course, a moderate amount of salt should be taken into consideration.


The one at the top is CrossFired Polaris. The single card sits below the Fury. It has to be ~$250 for that level of performance, because the 390 costs $300 and is only ~25% slower than that chart shows.

It wouldn't make sense to put out a new card with similar price/performance to a product they already have on the market.


----------



## Bryst

Quote:


> Originally Posted by *linbetwin*
> 
> I'll just leave this here.
> 
> http://videocardz.com/60253/amd-radeon-r9-480-3dmark11-benchmarks


Not too bad. 1266 MHz is hopefully still a conservative clock; if they can get to 1400 it would probably be near the 980 Ti mark


----------



## linbetwin

Maybe July.

http://wccftech.com/amd-radeon-r9-480-polaris-10-july/


----------



## variant

Quote:


> Originally Posted by *linbetwin*
> 
> Maybe July.
> 
> http://wccftech.com/amd-radeon-r9-480-polaris-10-july/


I am not even sure how WCCF came to "July" from Videocardz's rumor stating "next few weeks" which would put it in June.


----------



## NicksTricks007

Quote:


> Originally Posted by *spyshagg*
> 
> The one at the top is Crossfired polaris. The single one sits bellow fury. It has to be ~ 250$ for that level of performance, because 390 costs 300$ and is only 25% slower than that chart.
> 
> It wouldn't make sense to put out a new card with with similar price/performance to a product they already have on the market.


Thanks for catching that, I didn't realize the top one was CrossFire. On my phone and my eyes aren't that good anymore lol. I think you're right though. For Fury(-ish) performance, and given that 980 Tis and Fury Xs will most likely drop to $300-$350, and the vanilla Fury to around $250, they will be forced to price this at $200-$250. Of course, that's if these numbers are accurate.


----------



## airfathaaaaa

Honestly, I expected that by now no one would trust either WCCFTech or Videocardz....

they just keep throwing numbers around


----------



## Newbie2009

I hope they release a card with good drivers, that's half the battle


----------



## WanWhiteWolf

Quote:


> Originally Posted by *linbetwin*
> 
> Maybe July.
> 
> http://wccftech.com/amd-radeon-r9-480-polaris-10-july/


Something is fishy here.

Scaling of 42% in CF? Maybe with crap drivers. With a decent driver CF scales between 70-90% in my experience (depending on the game). Is 3DMark "special" in this regard?

But pricing will be difficult: no matter how they price it, they will either kill their previous-generation cards or lose heavily vs Nvidia.

I would assume they will go with a $300-350 card that in CF will beat the 1080 for the same price. And the only card that will be in a bad position would be the Fury, which is already in a bad spot considering the 1070.
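For reference, the scaling figure being questioned is just the combined score relative to a single card. A minimal sketch (the 10000/14200 scores are made-up illustrative numbers, not from the leak):

```python
def scaling_pct(single_score: float, dual_score: float) -> float:
    """Extra performance the second GPU adds, as a percentage of one card."""
    return (dual_score / single_score - 1.0) * 100.0

# Hypothetical scores: if one card scored 10000 and CrossFire scored 14200,
# the second card only added 42% -- the kind of figure being questioned here.
print(round(scaling_pct(10000, 14200), 1))  # 42.0
```

A 70-90% scaling range would instead mean dual scores of roughly 17000-19000 on the same baseline.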


----------



## NicksTricks007

Quote:


> Originally Posted by *WanWhiteWolf*
> 
> Something is fishy here.
> 
> Scalling on CF 42%? Maybe with crap drivers. WIth a decent driver CF scales between 70-90% in my experience (depending on game). Is 3D Mark "special" in this regard?
> 
> *But pricing will be difficult, as no matter how they price it,they will either kill their previous generation cards or lose heavily vs nVidia.*


That's what I'm afraid of. AMD put themselves in a difficult situation as far as pricing for Polaris. Especially if Nvidia really launches the 1070 at $380 and 1060 at $250 (my estimated speculative price). If 1070 is + or - 5% of the 980ti and 1060 is roughly Fury performance (which is what the VC chart indicates 480X is), then AMD will be forced to price it the same. Computex can't come soon enough. I'm ready to get some clarification of what we'll be dealing with. One thing I think AMD has going for them is that they're keeping the hype to a minimum, despite what some may think.


----------



## BulletBait

Quote:


> Originally Posted by *NicksTricks007*
> 
> That's what I'm afraid of. AMD put themselves in a difficult situation as far as pricing for Polaris. Especially if Nvidia really launches the 1070 at $380 and 1060 at $250 (my estimated speculative price). If 1070 is + or - 5% of the 980ti and 1060 is roughly Fury performance (which is what the VC chart indicates 480X is), then AMD will be forced to price it the same. Computex can't come soon enough. I'm ready to get some clarification of what we'll be dealing with. *One thing I think AMD has going for them is that they're keeping the hype to a minimum, despite what some may think.*


We all know this, but any talk or speculation about AMD is automatically 'hype,' whether good, bad, or, most especially, neutral. Didn't you see the memo? It doesn't matter if AMD is keeping things locked down tight; a single utterance about any of their products by someone outside the company is automatically 'hype train, hype train, hype train.'

So welcome to the hype train, I guess, with your neutral speculation.


----------



## NicksTricks007

Quote:


> Originally Posted by *BulletBait*
> 
> We all know this, but any talk or speculation of AMD is automatically 'hype,' whether good, bad, or, most especially, neutral. Didn't you see the memo? Doesn't matter if AMD is keeping locked tight, a single utterance of any of their products by someone outside the company is automatically, 'hype train, hype train, hype train.'
> 
> So welcome to the hype train I guess, with your neutral speculation.


Great, now I'm responsible for starting more hype







Choo chooooooo!

I guess this is the life of us PC enthusiasts.


----------



## WanWhiteWolf

Quote:


> Originally Posted by *NicksTricks007*
> 
> Great, now I'm responsible for starting more hype
> 
> 
> 
> 
> 
> 
> 
> Choo chooooooo!
> 
> I guess this is the life of us PC enthusiasts.


Let me fix it for you: 1070 reported at 5% below 980 Ti performance for $379. /hype derail


----------



## ToTheSun!

Quote:


> Originally Posted by *NicksTricks007*
> 
> I guess this is the life of us PC enthusiasts.


It's a life full of hypes and downs.


----------



## prjindigo

rumor confirmed, thread over.

http://www.overclock.net/t/1600940/videocardz-amd-radeon-r9-480-3dmark11-benchmarks/110#post_25189913

AMD wins thread.


----------



## Ultracarpet

Quote:


> Originally Posted by *prjindigo*
> 
> rumor confirmed, thread over.
> 
> http://www.overclock.net/t/1600940/videocardz-amd-radeon-r9-480-3dmark11-benchmarks/110#post_25189913
> 
> AMD wins thread.


Well, for starters that rumor shows it being slower than a Fury non-X, which makes it closer to a 980/390X than a 980 Ti... and the second half of this rumor cannot be confirmed until AMD releases a price, soooo.... yeah... actually it's not looking so good for this rumor if that leaked bench is real and that card is the fastest P10


----------



## variant

Quote:


> Originally Posted by *Ultracarpet*
> 
> Well for starters that rumor shows it being slower than a fury non x which makes it closer to a 980/390x than a 980ti... and the second half of this rumor cannot be confirmed until AMD releases a price soooo.... yea... actually it's not looking so good for this rumor if that leaked bench is real and that card is the fastest p10


We haven't seen Polaris 10 67C0 which would be the full one.


----------



## airfathaaaaa

Videocardz, as always, taking a number without considering anything, just like they did with the 1080. Credibility below zero


----------



## BulletBait

Quote:


> Originally Posted by *airfathaaaaa*
> 
> *videocardz* as always taking a number without considering anything just like they did with 1080..credibility below zero


*cough*All 'tech' sites*coughcough*. The longer this election cycle goes on, the more I see how ridiculous even non-political editorials are at this point.

I can't tell if it's a laughing matter still or not...

From the comments on most of the tech sites, it's not. From the comments in these threads, it is. So... the majority vs. the minority in determining a laughing matter... I don't know who to go with.


----------



## airfathaaaaa

Quote:


> Originally Posted by *BulletBait*
> 
> *cough*All 'tech' sites*coughcough*. The more this election cycles goes on, the more I see how ridiculous even non-political editorials are at this point.
> 
> I can't tell if it's a laughing matter still or not...
> 
> From the comments on most of the tech sites, it's not. From the comments in these threads, it is. So... the majority vs the minority in determining a laughing matter... don't know who to go with.


I have yet to see another site do what this one did in 2016... they literally spilled so many rumors about the 1080 that they were bound to get something right, and yet they were so wrong LOL

The same thing is happening now: they just keep throwing numbers around without even considering the uarch changes or anything else for that matter... just garbage, and they will probably be way off once more.
Even WCCFTech isn't talking about AMD, and they are like AMD's poster press boys.


----------



## magnek

Everybody needs something to fight about, and I need my daily







.

So I guess overall I don't mind if they keep churning out rumors.


----------



## BulletBait

Quote:


> Originally Posted by *magnek*
> 
> Everybody needs something to fight about, and I need my daily
> 
> 
> 
> 
> 
> 
> 
> .
> 
> So I guess overall I don't mind if they keep churning out rumors.


I don't mind *educated* guesses. Heck I don't even mind the nV trolls that come strolling in these threads every once in a while when they raise decent points.

But spreading patently false/unverifiable information *really* pisses me off.

Edit: What I mean by 'unverifiable' is so far off it can't be correct. As in, not even a normal guess, let alone an educated one.


----------



## KarathKasun

This is honestly where I expect P10 to land performance wise, low to high
390 - P10(480X) - Fury X

Price should be $250 to $300.


----------



## flopper

Quote:


> Originally Posted by *KarathKasun*
> 
> This is honestly where I expect P10 to land performance wise, low to high
> 390 - P10(480X) - Fury X
> 
> Price should be $250 to $300.


The card of a generation if accurate


----------



## ivymaxwell

Quote:


> Originally Posted by *flopper*
> 
> The card of a generation if accurate


the question is how it compares to the gtx 1060 in price and performance.


----------



## ChevChelios

GTX1060 will probably only come early Fall, so P10/P11 can have some time to make $$ in the sub-1070 segment until then


----------



## prznar1

So, Polaris 10 will be the R9 480X and cost $300. The R9 380X was priced at $250 on release day. Hmmm


----------



## ChevChelios

Quote:


> Originally Posted by *prznar1*
> 
> So, polaris 10 will be r9 480x and cost 300$. R9 380x in release day was priced at 250$. Hmmm


Nvidia increases prices as time goes on, so it's OK for AMD to do the same


----------



## prznar1

Quote:


> Originally Posted by *ChevChelios*
> 
> Nvidia increases prices as time goes on so its ok for AMD to do the same


You want to pay as much for a video card as for a car in the near future??? It is not OK for either Nvidia or AMD to raise prices.


----------



## KarathKasun

Quote:


> Originally Posted by *prznar1*
> 
> So, polaris 10 will be r9 480x and cost 300$. R9 380x in release day was priced at 250$. Hmmm


Under $300 is what we have heard from RTG so far; that's why I said $250-$300.


----------



## spyshagg

It has to be cheaper. The current product (390) costs $300. They wouldn't release to the market another $300 card with only ~20% more performance.
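The price/performance arithmetic behind this is simple enough to sketch; the $300 and ~20% figures come from the post above, and the break-even price is the only derived number:

```python
def breakeven_price(ref_price_usd: float, perf_ratio: float) -> float:
    """Price at which a card that is perf_ratio times as fast as the
    reference merely MATCHES the reference's price/performance."""
    return ref_price_usd * perf_ratio

# R9 390 at $300 as the reference, with the new card ~20% faster:
print(round(breakeven_price(300, 1.20)))  # 360
```

So even at $360 the new card would only equal the 390's value; launching meaningfully below that is what would actually move price/perf forward.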


----------



## Waitng4realGPU

Quote:


> Originally Posted by *spyshagg*
> 
> has to be cheaper. The current product (390) costs 300$. They wouldn't release to the market another 300$ card with only ~20% more performance.


Why not? Nvidia did it. Just have to hope it overclocks well I guess.


----------



## KarathKasun

Quote:


> Originally Posted by *Waitng4realGPU*
> 
> Why not? Nvidia did it. Just have to hope it overclocks well I guess.


Pretty much this. Higher performance with a much lower TDP for the same price. At least it's not 20% more performance for $150 more.

Also, "under $300" is from the horse's mouth. It could be $299.99 if they wanted to be pedantic, but it's likely ~$279.


----------



## ChevChelios

Nvidia can afford to do that from its leader position, AMD can't









also, that's only for the 1080, and the fastest card on the market isn't supposed to be value-heavy

and the 1070 is good value


----------



## spyshagg

Quote:


> Originally Posted by *Waitng4realGPU*
> 
> Why not? Nvidia did it. Just have to hope it overclocks well I guess.


Because having price/perf within 20% of your current product would:
- Not decrease the VR cost of entry (~$300) AMD touted.
- Put your product in competition with yourself.
- Give your brand a monopolistic image.

GCN 1.4 on 14nm FinFET with the same price/perf as the archaic, two-year-old 28nm architecture?

Price/perf is what people buy, not perf/watt or overclockability. The latest polls put the casual market at 85% of the entire industry. With a price in this range, you cater to them, not to overclockers.


----------



## Waitng4realGPU

Quote:


> Originally Posted by *spyshagg*
> 
> Because having price/perf withing 20% of your current product would:
> - Not decrease the VR cost of entry (~300$) AMD touted.
> - Put your product under competition from yourself
> - Give your brand a monopolistic image.


Well then it will be slightly cheaper, or will perform better than is shown in the rumoured benchmarks.

AMD's midrange is always competitive so I doubt it will change with Polaris.


----------



## Kuivamaa

Quote:


> Originally Posted by *ChevChelios*
> 
> Nvidia can afford to do that with its leader position, AMD cant
> 
> 
> 
> 
> 
> 
> 
> 
> 
> also thats only for the 1080, but the fastest card on the market isnt supposed to be value-heavy
> 
> and 1070 is good value


The market leader sets the prices; the others follow and price accordingly around the standards set by the leader. Plain and simple.


----------



## paralemptor

Oh, the agony. I don't know what to do. Sell my R9 290 now, while it's still worth something, save the money and upgrade to Polaris10 late summer? First world problems


----------



## TheLAWNOOB

Quote:


> Originally Posted by *paralemptor*
> 
> Oh, the agony. I don't know what to do. Sell my R9 290 now, while it's still worth something, save the money and upgrade to Polaris10 late summer? First world problems


P10 won't be an upgrade. I wouldn't recommend 290 CF either. Wait for Vega.


----------



## guttheslayer

Just food for thought:

If P10 can hit GTX 980 performance with just 2048 cores at 1.2 GHz or less, it means it has better IPC than Maxwell 2.0.

In that case a 4096-core Vega on overclocked steroids will be...









I really don't think AMD will lose out if their IPC per core matches Maxwell 2.0, provided their performance scales well as the core count increases.


----------



## TheLAWNOOB

Quote:


> Originally Posted by *guttheslayer*
> 
> Just a food for thought,
> 
> If P10 can hit GTX 980 performance with just 2048 cores at 1.2GHz or less.... It means it has better IPC as compared to Maxwell 2.0.
> 
> In that case a 4096 vega on a OCed steroid will be...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I really dont think AMD will lose out if their IPC per core matches maxwell 2.0, but that is provided their performance scale well with their cores increment.


From Nvidia's website, GTX 980 engine specs:

- CUDA cores: 2048
- Base clock: 1126 MHz
- Boost clock: 1216 MHz

P10 matching a 980 at 1.2 GHz would put it very close to Maxwell IPC.
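That comparison is a back-of-envelope throughput model (cores × clock × per-clock efficiency). A minimal sketch with the figures above; the 1200 MHz P10 clock is the rumoured cap discussed in this thread, not a confirmed spec:

```python
def implied_relative_ipc(ref_cores, ref_clock_mhz, cand_cores, cand_clock_mhz):
    """Per-core, per-clock efficiency the candidate needs in order to MATCH
    the reference card, relative to the reference's own efficiency.
    Crude linear model: throughput ~ cores * clock * IPC."""
    return (ref_cores * ref_clock_mhz) / (cand_cores * cand_clock_mhz)

# GTX 980: 2048 CUDA cores, 1216 MHz rated boost (specs above).
# Hypothetical P10: 2048 cores capped at 1200 MHz, as suggested in the post.
ratio = implied_relative_ipc(2048, 1216, 2048, 1200)
print(round(ratio, 2))  # 1.01 -- essentially Maxwell-level IPC
```

Matching at a lower clock pushes the ratio above 1.0, i.e. better per-clock efficiency than Maxwell.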


----------



## 4everAnoob

I recommend selling now; all current video cards will be cheaper in two months. Heck, even if you sell your 290 now, you could probably buy a 290X back for less in two months' time.


----------



## blue1512

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> From nvidia website:
> 
> GTX 980 engine specs:
> - CUDA cores: 2048
> - Base clock: 1126 MHz
> - Boost clock: 1216 MHz
> 
> P10 matching 980 at 1.2Ghz will be very similar to Maxwell IPC.


You forgot the notorious Boost from nVidia. nVidia cards almost always run at a faster clock than the base clock, at least for benchmarks, before they throttle


----------



## TheLAWNOOB

Quote:


> Originally Posted by *blue1512*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheLAWNOOB*
> 
> From nvidia website:
> 
> GTX 980 engine specs:
> - CUDA cores: 2048
> - Base clock: 1126 MHz
> - Boost clock: 1216 MHz
> 
> P10 matching 980 at 1.2Ghz will be very similar to Maxwell IPC.
> 
> 
> 
> You forgot the notorious Boost from nVidia. nVidia cards almost always run at faster clock than the base clock, at least for the benchmark before they throttle

From AnandTech's reference GTX 980 review:

| Workload | Average clockspeed |
|---|---|
| Max boost clock | 1252 MHz |
| Metro: LL | 1192 MHz |
| CoH2 | 1177 MHz |
| Bioshock | 1201 MHz |
| Battlefield 4 | 1227 MHz |
| Crysis 3 | 1227 MHz |
| TW: Rome 2 | 1161 MHz |
| Thief | 1190 MHz |
| GRID 2 | 1151 MHz |
| Furmark | 923 MHz |

About 1200 MHz on average.
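A quick check of that average (game workloads only; Furmark is excluded as a power-virus outlier, and the max boost clock is a ceiling rather than a measurement):

```python
# Per-game average clocks (MHz) from AnandTech's reference GTX 980 review, as listed above.
game_clocks_mhz = [1192, 1177, 1201, 1227, 1227, 1161, 1190, 1151]

avg_mhz = sum(game_clocks_mhz) / len(game_clocks_mhz)
print(round(avg_mhz))  # 1191 -- i.e. "about 1200 MHz" as stated
```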


----------



## paralemptor

I think I will sell the 290 now. Even if Polaris is a similarly performing card, energy consumption is an important factor to me; it's not about the energy bill, more about temperature and fan loudness. My slightly overclocked Asus DCII may not reach the loudness of a reference GTX 580, but it is still loud as hell in more demanding games


----------



## GamerusMaximus

Quote:


> Originally Posted by *blue1512*
> 
> You forgot the notorious Boost from nVidia. nVidia cards almost always run at faster clock than the base clock, at least for the benchmark before they throttle


That's what I'm doing. I'm gonna put my 770s on eBay soon and wait it out until Polaris comes out, or grab Pascal if Polaris can't deliver.


----------



## guttheslayer

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> From nvidia website:
> 
> GTX 980 Engine Specs:
> 2048CUDA Cores
> 1126Base Clock (MHz)
> 1216
> 
> P10 matching 980 at 1.2Ghz will be very similar to Maxwell IPC.


1.2 GHz or less, and we know how AMD and Nvidia boost work. A 980 rated at 1216 actually goes much higher with boost, whereas for AMD 1.2 GHz would be a hard cap. That means IPC-wise Polaris is better.

Either way, seeing a 4096-core part with that kind of IPC is pretty crazy.

Just going to a 1.5 GHz clock would crush a heavily OCed 1.5 GHz Titan X by 30-35%, which also means >30% above the 1080.


----------



## Waitng4realGPU

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> From AnandTech's reference GTX 980 review:
> 
> | Workload | Average clockspeed |
> |---|---|
> | Max boost clock | 1252 MHz |
> | Metro: LL | 1192 MHz |
> | CoH2 | 1177 MHz |
> | Bioshock | 1201 MHz |
> | Battlefield 4 | 1227 MHz |
> | Crysis 3 | 1227 MHz |
> | TW: Rome 2 | 1161 MHz |
> | Thief | 1190 MHz |
> | GRID 2 | 1151 MHz |
> | Furmark | 923 MHz |
> 
> About 1200 MHz on average.


Still a cheeky gain for benchmarks though. Not to mention it goes down to 1150 sometimes; that's just plain poor.


----------



## GamerusMaximus

Quote:


> Originally Posted by *Waitng4realGPU*
> 
> Still a cheeky gain for benchmarks though. Not to mention it goes down to 1150 sometimes, it's plain poor.


That's why he said average. It goes down to 1150 sometimes, it goes up to 1215 sometimes. And I fail to see how it's "cheeky" when AMD cards do the exact same thing. Nvidia is simply doing what Intel does: setting the base clock lower and overdelivering on performance.


----------



## Waitng4realGPU

Quote:


> Originally Posted by *GamerusMaximus*
> 
> That's why he said average. It goes down to 1150 sometimes, it goes up to 1215 sometimes. And I fail to see how it's "cheeky" when AMD cards do the exact same thing. Nvidia is simply doing what intel does, setting the base clock lower and overdelivering on performance.


My 7970 doesn't downclock?

People don't look at the base clock, though; they look at the peak clock, and many reviews will only state the peak clock the card reaches, which in turn makes it look better than it is.


----------



## Potatolisk

Quote:


> Originally Posted by *paralemptor*
> 
> I think I will sell the 290 now. Even if Polaris will be a similarly performing card the energy consumption is an important factor to me, but it's not about the energy bill - more about temperature and fan(s) loudness. My slightly overclocked Asus DCII may not reach the loudness of a reference GTX580 but it is still loud as hell in more demanding games


I have the same card. It was loud until I redid the thermal paste, flashed a modified 390 BIOS, and made sure of good airflow in the case. Now it's quiet.


----------



## Waitng4realGPU

Quote:


> Originally Posted by *paralemptor*
> 
> I think I will sell the 290 now. Even if Polaris will be a similarly performing card the energy consumption is an important factor to me, but it's not about the energy bill - more about temperature and fan(s) loudness. My slightly overclocked Asus DCII may not reach the loudness of a reference GTX580 but it is still loud as hell in more demanding games


Honestly the overclock probably isn't netting you much of a gain anyway. As potatolisk said, you could reapply the thermal paste, but I'd also see how much you can reduce the voltage while running at stock frequencies.

It's also well known that Asus dropped the ball on its 290-series coolers; they are pretty bad.


----------



## GamerusMaximus

Quote:


> Originally Posted by *Waitng4realGPU*
> 
> My 7970 doesn't downclock?
> 
> People don't look at the base clock though they look at the peak clock and many reviews will only state the peak clock the card reaches, and in turn it makes it look better than it is.


The 7970 also did not have any sort of GPU Boost built in. The 290 and 290X did, though. Check out reviews of the 290X: it dropped significantly below its boost clock unless it was set to Uber mode, and then it was stupidly loud and still ran in the 90C range.

Both parties have played this game. Nvidia has done it longer, but the 1080 is the only card from team green that has dropped below its rated boost clock. None of the Keplers or Maxwells had that issue; the 980 Ti's average boost clock was typically _higher_ than the stated boost clock on Nvidia's site. Same with the 980.

While the 1080 may be problematic here, both parties are guilty, and AMD did just as badly. It will be sorted out, I'm sure. It could be a problem with Nvidia's Boost 3.0, or simply the board design (which AMD also messed up with the Fury line). Neither party is innocent here.

EDIT: boost, not base.


----------



## Waitng4realGPU

Quote:


> Originally Posted by *GamerusMaximus*
> 
> The 7970 also did not have any sort of GPUboost built in. The 290 and 290x did though. Check out reviews of the 290x. It dropped significantly below it's boost clock unless it was set to uber, and then it was stupid loud and still ran in the 90c range.
> 
> Both parties have played this game. Nvidia has done it longer, but the 1080 is the only one that has dropped below the base clock from team green. None of the Keplers or Maxwells had that issue. the 980ti boost clock average was typically _higher_ then the stated boost clock on nvidia's site. Same with the 980.
> 
> While the 1080 may be problematic here, both parties are guilty, and AMD did just as bad. It will be sorted out, I'm sure. Could be a problem with nvidia boost 3.0, or simply the board design (which AMD also messed up with the fury line). Neither party is innocent here.
> 
> EDIT: boost, not base.


The reference 290 series cards were piles of crap and I'm fully against the boost clocks, they should've designed a cooler that could maintain solid clock speeds.


----------



## Kokin

Can't say the same for my R9 290, but it had a water block almost right after I got it. I'm hoping Vega does release this fall so I can upgrade.


----------



## BulletBait

Quote:


> Originally Posted by *Kokin*
> 
> Can't say the same for my R9 290, but it had a water block almost right after I got it. I'm hoping Vega does release this fall so I can upgrade.


Yeah, I pretty much put blocks on all my components these days, so reference cards running 'hot' doesn't really affect me. That happens with most things on stock coolers. Although I've been hearing good things about AMD's Wraith cooler these days.


----------



## stoker

Quote:


> Originally Posted by *GamerusMaximus*
> 
> The 7970 also did not have any sort of GPUboost built in.


Some do; my Sapphire 7970 Dual-X has a dual BIOS and one of them has boost


----------



## Waitng4realGPU

Quote:


> Originally Posted by *stoker*
> 
> Some do, my 7970 Sapphire Dual-X has twin bios and one has boost


Yeah, a lot of the 7950s had it too.


----------

