# [Chiphell]Possible 380X preview



## astrallite

http://www.chiphell.com/thread-1182382-1-1.html


Not sure if already posted. This is from the same group that first leaked GM204 benchmarks.


----------



## MunneY

I'd buy that...

The stats, I mean.


----------



## geoxile

Were their Maxwell benches accurate?

Also, this coincides with the previous reports (rumors?) that the 380 cards would be more than a match for GM204.


----------



## MapRef41N93W

Yeah I don't think so. The 380x is at worst going to be a re-branded 290 on a smaller node. So equal or better than 290 performance at 180-190 watts sounds more likely.

Edit: Wait I totally read that graph wrong.


----------



## astrallite

Quote:


> Originally Posted by *geoxile*
> 
> Were their Maxwell benches accurate?
> 
> Also, this coincides with the previous reports (rumors?) that the 380 cards would be more than a match for GM204.


They were. Their leak was a Tomb Raider benchmark that showed the stock 780 Ti with a slight advantage over the GTX 980, although I believe the difference was reversed after patches.


----------



## MunneY

BTW...

The link is bad.


----------



## geoxile

Quote:


> Originally Posted by *astrallite*
> 
> They were. Their leak was a Tomb Raider benchmark that showed the stock 780 Ti with a slight advantage over the GTX 980, although I believe the difference was reversed after patches.


Well that's a bit different from a huge list of games being averaged out for a product that's still several months out.


----------



## SlackerITGuy

Taken with a grain of salt, of course, but it looks promising. The better AMD does, the better for us.

Can't wait for GM200 and Fiji.


----------



## salamachaa

I think this is a leaked 390X, not the 380X. If it is the 380X, then I'll take a 390X that is 25 percent faster or more.


----------



## Clocknut

I guess it won't be ridiculous to ask for GTX 680/7970 performance in the $100-$150 range next gen?


----------



## geoxile

Quote:


> Originally Posted by *Clocknut*
> 
> I guess it won't be ridiculous to ask for GTX 680/7970 performance in the $100-$150 range next gen?


I really doubt they'll be selling Tonga for as low as $100, simply because of how big the die is.


----------



## TopicClocker

If this is real and is going to be the R9 380X, it's going to be insane.

Roughly 30% faster than an R9 290X, so we're looking at 3500-3600 stream processors, somewhat on par with the SiSoft Sandra specs.


----------



## kpzero

This would be extremely impressive if it actually is the 380X. Mildly disappointing if it is the 390X.

Impressive from a power-efficiency standpoint, beating out the 980.


----------



## PostalTwinkie

Hahahaha, yeah right!

I would buy this chart if it were a 390X, but not a 380X. Unless they have done away with the 290/290X naming scheme and just gone with a 390X (290X replacement) and a 380X (290 replacement).


----------



## zealord

No way we're so close that they already have benchmarks, lol.

Hope I'm wrong though. It would make a great card.


----------



## hyp36rmax

Well then... if this is the 380X, then I look forward to the 390X!!







Two please


----------



## ZealotKi11er

Quote:


> Originally Posted by *kpzero*
> 
> This would be extremely impressive if it actually is the 380X. Mildly disappointing if it is the 390X.
> 
> Impressive from a power-efficiency standpoint, beating out the 980.


It can't be the 390X, unless a 490X comes after it, because of the power consumption. If this is true, it's looking good for AMD. If anything, this is just to make people hold off on buying new GPUs.


----------



## SyncMaster753

My 6970 is starting to get a little long in the tooth.

A 970 will likely be my next card... unless the rumors are true.


----------



## kingduqc

Quote:


> Originally Posted by *salamachaa*
> 
> I think this is a leaked 390X, not the 380X. If it is the 380X, then I'll take a 390X that is 25 percent faster or more.


Why would a high-end 390X barely match a midrange Maxwell card (980)? The 390X will compete with the 1080, not the 980.

If this is true and it comes in at $300/325, I will be really, really happy.


----------



## azanimefan

Quote:


> Originally Posted by *kingduqc*
> 
> Why would a high-end 390X barely match a midrange Maxwell card (980)? The 390X will compete with the 1080, not the 980.
> 
> If this is true and it comes in at $300/325, I will be really, really happy.


You are reading the benches wrong. This card is 15%-20% faster than a single 980... you're looking at the SLI numbers.


----------



## SoloCamo

If this comes with more than 4 GB of VRAM and that much better power consumption, I'd be hard-pressed not to replace my 290X with one...

Bah, I almost don't want it to be good so I can resist my insatiable urge.


----------



## geoxile

Quote:


> Originally Posted by *azanimefan*
> 
> You are reading the benches wrong. This card is 15%-20% faster than a single 980... you're looking at the SLI numbers.


It's about on par with an overclocked 980 (1400 core, it seems). Not too outrageous if 1) AMD is making those big changes to GCN that were rumored a while ago, and 2) HBM has a noticeable impact on performance.

If HBM is used, there's the added bonus of HBM using less power than GDDR5, which adds to the efficiency.


----------



## salamachaa

Quote:


> Originally Posted by *kingduqc*
> 
> Why would a high-end 390X barely match a midrange Maxwell card (980)? The 390X will compete with the 1080, not the 980.
> 
> If this is true and it comes in at $300/325, I will be really, really happy.


It's happened a lot in the past between these two.

Anyway, I could see this being an interim flagship for AMD until they get big Bermuda ready. If that were true, then I bet they'd release it at $500, not $300.


----------



## brucethemoose

If the February release rumors are true, it's too early for leaks like this, isn't it?

The leaked Maxwell rumor was spot on, but that was much closer to release:
http://www.chiphell.com/thread-1122008-1-1.html


----------



## kingduqc

Quote:


> Originally Posted by *azanimefan*
> 
> You are reading the benches wrong. This card is 15%-20% faster than a single 980... you're looking at the SLI numbers.


I'm reading this how everyone is. The 390X can't be only 15% faster than a midrange die from Nvidia. Just look at what the 680 -> 780 Ti did: a 55% performance increase (http://www.techpowerup.com/reviews/MSI/GTX_780_Ti_Gaming/24.html). Now, I don't know how much faster GM200 will be, but it's a sure thing it's not only 15% faster.

If AMD releases their highest-end card and it's only 15% faster than a 980, that means their 390X will be competing with a card (GM200) that would be well over 20% faster (and it will be competing almost as soon as the 390X gets released, since Maxwell is already ready). Heck, their old 290X is trading blows with the 980 right now in quite a few games, and it's 13 months old; do you think the new high-end gen cards will be only 15% faster than that? Absurd. If the stacked VRAM is true, bandwidth will be immense on those cards too, so no bottleneck at high res.

That is all speculation ofc; who knows if it's even real. But if their 390X is that fast/slow, well, it's doom and gloom for AMD, because their market share will shrink to no end after the GM200 release.


----------



## Chrono Detector

Looks good if it is the successor to the 280X and competes against the GTX 970/980. If it is true, I can imagine how fast the 390X would be.


----------



## MunneY

Quote:


> Originally Posted by *kingduqc*
> 
> I'm reading this how everyone is. The 390X can't be only 15% faster than a midrange die from Nvidia. Just look at what the 680 -> 780 Ti did: a 55% performance increase (http://www.techpowerup.com/reviews/MSI/GTX_780_Ti_Gaming/24.html). Now, I don't know how much faster GM200 will be, but it's a sure thing it's not only 15% faster.
> 
> If AMD releases their highest-end card and it's only 15% faster than a 980, that means their 390X will be competing with a card (GM200) that would be well over 20% faster (and it will be competing almost as soon as the 390X gets released, since Maxwell is already ready). Heck, their old 290X is trading blows with the 980 right now in quite a few games, and it's 13 months old; do you think the new high-end gen cards will be only 15% faster than that? Absurd.
> 
> That is all speculation ofc; who knows if it's even real. But if their 390X is that fast/slow, well, it's doom and gloom for AMD, because their market share will shrink to no end after the GM200 release.


It's the 380X... not the 390X.


----------



## Seronx

Fiji (Enthusiast GPU)
Maui (Performance GPU)
Bermuda (Mainstream GPU)
Treasure (Value GPU)
Carrizo(-L/-E) (Mobile/Desktop APU)
Amur (Phone APU)


----------



## PostalTwinkie

Quote:


> Originally Posted by *brucethemoose*
> 
> If the February release rumors are true, it's too early for leaks like this, isn't it?
> 
> The leaked Maxwell rumor was spot on, but that was much closer to release:
> http://www.chiphell.com/thread-1122008-1-1.html


Not really, we know engineering samples have shipped, so it is possible the leaks are coming from those samples.

Although I am about 99% sure this leak is fake, as are pretty much all leaks this far out from a release.


----------



## azanimefan

Quote:


> Originally Posted by *kingduqc*
> 
> I'm reading this how everyone is. The 390X can't be only 15% faster than a midrange die from Nvidia. Just look at what the 680 -> 780 Ti did: a 55% performance increase (http://www.techpowerup.com/reviews/MSI/GTX_780_Ti_Gaming/24.html). Now, I don't know how much faster GM200 will be, but it's a sure thing it's not only 15% faster.
> 
> If AMD releases their highest-end card and it's only 15% faster than a 980, that means their 390X will be competing with a card (GM200) that would be well over 20% faster (and it will be competing almost as soon as the 390X gets released, since Maxwell is already ready). Heck, their old 290X is trading blows with the 980 right now in quite a few games, and it's 13 months old; do you think the new high-end gen cards will be only 15% faster than that? Absurd. If the stacked VRAM is true, bandwidth will be immense on those cards too, so no bottleneck at high res.
> 
> That is all speculation ofc; who knows if it's even real. But if their 390X is that fast/slow, well, it's doom and gloom for AMD, because their market share will shrink to no end after the GM200 release.


This is the 380X, not the 390X.

As I said, you're reading this wrong.


----------



## B!0HaZard

Seeing the sub-200 W power consumption, this can't be the 390X. If it were real, it'd be 380X or maybe a 390. NVIDIA has room in their lineup for a 200+ W card with the 980 Ti name so I don't think AMD would have a sub-200 W 390X with no room for a better name. It just doesn't make sense considering the naming schemes they use.

That said, these benchmarks would suggest that AMD increased their performance per watt by 84% from Hawaii to Bermuda/Fiji, whereas NVIDIA "only" managed a 50% increase from Kepler to Maxwell. I find it hard to believe. Going from significantly worse efficiency than NVIDIA to slightly better would be a big blow to NVIDIA's Maxwell cards. I think it's fake.
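The performance-per-watt comparison above can be sketched numerically. The figures below are illustrative assumptions (a ~30% lead over a 290X at the leaked 197 W gaming draw, against an assumed ~280 W gaming draw for the 290X), not data from the leak itself:

```python
# Hedged sketch of the performance-per-watt arithmetic discussed above.
# All input numbers are illustrative assumptions, not figures from the leak.

def perf_per_watt_gain(perf_old: float, power_old: float,
                       perf_new: float, power_new: float) -> float:
    """Relative improvement in performance per watt between two cards."""
    return (perf_new / power_new) / (perf_old / power_old) - 1

# Assumptions: new card scores 1.30x a 290X (the rumored ~30% lead) at
# 197 W gaming draw; the 290X is normalized to 1.00 at ~280 W gaming draw.
gain = perf_per_watt_gain(perf_old=1.00, power_old=280.0,
                          perf_new=1.30, power_new=197.0)
print(f"{gain:.0%}")  # prints "85%" under these assumptions
```

Under those assumed inputs the improvement lands near the 84% figure quoted in the post; different assumed 290X power numbers shift the result accordingly.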


----------



## hyp36rmax

Could it be that unlocked Hawaii that was briefly talked about at the beginning of 2014, with the power efficiency of Tonga, as the next R9 380X? Since these are rumors, why not!?











----------



## darealist

The joke is NVIDIA charging high-end prices for midrange cards.


----------



## zealord

Quote:


> Originally Posted by *darealist*
> 
> The joke is NVIDIA charging high-end prices for midrange cards.


The joke is Nvidia can.


----------



## joeh4384

I can see the 380X beating the 980, especially if AMD is using 20nm vs. 28nm for Nvidia.


----------



## Slomo4shO

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Although I am about 99% sure this leak is fake, as are pretty much all leaks this far out from a release.


As are 99% of all statistics on the interwebs


----------



## szeged

I actually believe the article because it's Chiphell and not videocraps or... the other place.


----------



## incog

Encouraging numbers, let's hope they're real and that the card is well priced when it comes out.


----------



## AKA1

If this is real, it's a 390X for me.


----------



## Ultracarpet

oooooh, I want to believe.


----------



## iCrap

wow.... I just hope this is real. This will be worth the wait.


----------



## Dudewitbow

If the 380X has this kind of power consumption, the 390X must be huge (relatively speaking), assuming AMD is still going with those Asetek-based coolers that Asetek confirmed to be producing for a certain company.


----------



## nyxagamemnon

This has to be the 380X. I'd say we'd see about another 20-40% from the 390X. Basically, big Maxwell and big AMD will trade blows.


----------



## John Shepard

Not bad if true.
I am curious to see how nvidia responds to this.


----------



## sjwpwpro

I just want them to come out so I can buy a 290/290X for the kind of money I paid for my 7870s.


----------



## darealist

This card is targeting NVIDIA's midrange GTX 980 in power consumption, so it is most likely midrange as well.


----------



## Azuredragon1

If true, then I know what my next upgrade is.


----------



## Noufel

I'll squeeze my 980s until the 390X comes out, so I can redeem my momentary betrayal of AMD.


----------



## Joa3d43

...just wondering whether the AMD driver they used (14.11.2 beta) wouldn't have a config file with the names/model #s of all the GPUs it applies to







...in the past, that's how new AMD GPUs could be 'confirmed'

BTW, not sure if this has been posted already; though that may not be the final word re: stacked memory, etc.


----------



## sepiashimmer

Although I'll never buy this card, I'm very excited about the rumors.


----------



## delboy67

Quote:


> Originally Posted by *joeh4384*
> 
> I can see the 380X beating the 980, especially if AMD is using 20nm vs. 28nm for Nvidia.


Rumour from another forum is that these are 14nm Samsung samples.


----------



## MapRef41N93W

Quote:


> Originally Posted by *nyxagamemnon*
> 
> This has to be the 380X. I'd say we'd see about another 20-40% from the 390X. Basically, big Maxwell and big AMD will trade blows.


10-15% for 390x over 380x. Titan 2 15-20% over a 390x. $1000 vs $599. That's how I see it playing out.


----------



## Kuivamaa

Not sure what to think about these results, especially without knowing if this is a 380X or a 390X, or what clocks are involved. If it is indeed a 380X and the successor to Tahiti, then yeah, it is shaping up to be a powerhouse. If it is a 390X, it will only be passable if it is clocked very conservatively, and even in that case the next big die from Nvidia should beat it without too much effort.


----------



## Wishmaker

Unless AMD pulls a Conroe on NVIDIA, things will not change in their favour. I have quite old rigs, true, but I am running 2x 270X DCU II on one of them, and the performance is oftentimes worse than the equivalent SLI from NVIDIA. If it is not the architecture, it is the drivers with AMD. The 390X needs to be something special for NVIDIA to be put in the corner.


----------



## Clovertail100

Quote:


> Originally Posted by *MapRef41N93W*
> 
> 10-15% for 390x over 380x. Titan 2 15-20% over a 390x. $1000 vs $599. That's how I see it playing out.


Different situation. The 380X should be a relatively small die on 20nm. The move to 20nm plus architecture improvements should produce these kinds of results (a la Fermi vs. Tahiti). If these numbers are accurate, I'd guess they're not using HBM on the 380X.

The 390X, however, could benefit from a large 20nm die _and_ HBM. I'd be disappointed with anything less than a 50% lead over big Maxwell. It has every advantage needed to do so.


----------



## blahtibla

I really hope this rumor is true. A big improvement in efficiency is exactly what AMD needed to compete. It's gonna mean cheaper and better cards for us all, and it would force NV to launch a proper Maxwell die.


----------



## Pantsu

Seems way too early for these kinds of benchmarks. In any case, I wouldn't get caught up in the naming. There was also a rumor of skipping the 300 series and going straight to 400. These are just marketing names; they change them almost every gen and they hardly make any sense. Just because the 290X was the fastest in this gen doesn't mean the next gen's fastest chip will be called 390X. What we need is specs for the chip, not some speculative name for it.


----------



## Ha-Nocri

So it is not known yet whether AMD will use 20nm or 28nm?


----------



## Clovertail100

Quote:


> Originally Posted by *Pantsu*
> 
> Seems way too early for these kind of benchmarks. In any case I wouldn't get caught up on the naming. There was also a rumor of skipping the 300-series and going straight to 400. These are just marketing names, they change them almost every gen and they hardly make any sense. Just because 290X was the fastest in this gen, doesn't mean the next gen fastest chip will be called 390X. What we need is specs for the chip, not some speculative name for it.


When people say 380x they mean Fiji, and when they say 390x they mean Bermuda. Most people in this thread just understand that (unless something changes) 380X and 390X will be the names for these unreleased chips.

No one's getting hung up on names. They're debating which of the two chips these benchmarks really represent.


----------



## Cybertox

I am not interested in anything lower than a 390X. The benches are impressive though if it is indeed a 380X.


----------



## edo101

Am I the only one who didn't see a 380X on the two charts there?


----------



## NexusRed

Quote:


> Originally Posted by *Cybertox*
> 
> I am not interested in anything lower than a 390X. The benches are impressive though if it is indeed a 380X.


Then why are you here when the title clearly says "Possible 380X preview"? That's like walking into a Nissan dealership and saying "I'm only interested in BMWs", lol.

This just sealed the deal for me. I was looking to replace my R9 280X with a GTX 970. I'll hold out on that and sell my 6300+M5A97 for some 4690K+Z97 goodness!


----------



## naved777

Quote:


> Originally Posted by *edo101*
> 
> Am I the only one that didn't see 380X on the two charts on there


Because 380x = Captain Jack


----------



## Cybertox

Quote:


> Originally Posted by *NexusRed*
> 
> Then why are you here when the title clearly says "Possible 380X preview"? That's like walking into a Nissan dealership and saying "I'm only interested in BMWs", lol.
> 
> This just sealed the deal for me. I was looking to replace my R9 280X with a GTX 970. I'll hold out on that and sell my 6300+M5A97 for some 4690K+Z97 goodness!


No, it's like walking into a Lamborghini dealership, but instead of being interested in a Gallardo, I am interested in a Murcielago. If I want to estimate the performance of a GPU which is not out yet, or has not been benched yet, I look for available benchmarks of other GPUs from the same series. Therefore I am here, looking at the 380X benches, which show how it performs, so that I can estimate the performance of a 390X, which will scale accordingly.

Hope that explains to you why I am here.


----------



## NexusRed

Quote:


> Originally Posted by *Cybertox*
> 
> No, it's like walking into a Lamborghini dealership, but instead of being interested in a Gallardo, I am interested in a Murcielago. If I want to estimate the performance of a GPU which is not out yet, or has not been benched yet, I look for available benchmarks of other GPUs from the same series. Therefore I am here, looking at the 380X benches, which show how it performs, so that I can estimate the performance of a 390X, which will scale accordingly.
> 
> Hope that explains to you why I am here.


Rep to you, sir/madam, for so elegantly correcting me. My apologies, as I should not have questioned your post in this thread.


----------



## maarten12100

This looks more like the 390X, considering how well it does. But on 20nm it could be the 380X. I think this is finally the real GCN update, rather than more revisions, and that in itself is good news.


----------



## Pantsu

Quote:


> Originally Posted by *Clovertail100*
> 
> When people say 380x they mean Fiji, and when they say 390x they mean Bermuda. Most people in this thread just understand that (unless something changes) 380X and 390X will be the names for these unreleased chips.
> 
> No one's getting hung up on names. They're debating which of the two chips these benchmarks really represent.


Nothing in this leak suggests 380X, yet the OP named it as such. Which chip did he mean? The source post says it's a no-name engineering sample that's not necessarily even final spec, let alone final drivers. Still, people call it the 380X, the 390X, Fiji. Based on no evidence.


----------



## flopper

30% faster with DX11; add Mantle and suddenly it gets interesting.


----------



## maarten12100

Quote:


> Originally Posted by *B!0HaZard*
> 
> Seeing the sub-200 W power consumption, this can't be the 390X. If it were real, it'd be 380X or maybe a 390. NVIDIA has room in their lineup for a 200+ W card with the 980 Ti name so I don't think AMD would have a sub-200 W 390X with no room for a better name. It just doesn't make sense considering the naming schemes they use.
> 
> That said, these benchmarks would suggest that AMD increased their performance per watt by 84% from Hawaii to Bermuda/Fiji, whereas NVIDIA "only" managed a 50% increase from Kepler to Maxwell. I find it hard to believe. Going from significantly worse efficiency than NVIDIA to slightly better would be a big blow to NVIDIA's Maxwell cards. I think it's fake.


AMD is going from an extremely high-density node to an energy-efficient 20nm node, so it is obvious that their gain in efficiency would be bigger than on NVIDIA's 28nm products, which have lower density than the 28nm used for Hawaii.


----------



## Olivon

Rumours, rumours... Bring it, AMD! Time to compete!


----------



## kingduqc

Quote:


> Originally Posted by *MunneY*
> 
> It's the 380X... not the 390X.


Yes, that is exactly what I said...


----------



## delboy67

Quote:


> Originally Posted by *Olivon*
> 
> Rumours, rumours ... Bring it AMD ! Time to compete !


Did you not see the graphs? They already compete. If this really is a midrange 380X, then Nvidia's hand will be forced: we'll get even more new GPUs, price drops on the 290 and Maxwell, big sales (hence price drops) on 1440p/4K displays, and good times for all.


----------



## Clovertail100

Quote:


> Originally Posted by *Pantsu*
> 
> Nothing in this leak suggests 380X, yet the OP named it as such. Which chip did he mean? The source post says it's a no-name engineering sample that's not necessarily even final spec, let alone final drivers. Still, people call it the 380X, the 390X, Fiji. Based on no evidence.


Like I said, people debating about whether it's a 380X or 390X are debating whether it's Fiji or Bermuda; they're not arguing about the name.


----------



## Lass3

I hope this is true, but I doubt it.


----------



## lacrossewacker

At least slap a decent reference cooler on it, AMD. Don't try to sell a Lamborghini in a Civic chassis and shroud.


----------



## Cybertox

Quote:


> Originally Posted by *lacrossewacker*
> 
> At least slap a decent reference cooler on it, AMD. Don't try to sell a Lamborghini in a Civic chassis and shroud.


I really hope that they won't have the cooler as a separate external attachment like with the R9 295X2.


----------



## PureBlackFire

Quote:


> Originally Posted by *Cybertox*
> 
> I really hope that they won't have the cooler as a separate external attachment like with the R9 295X2.


According to the power consumption chart, that should not be needed. But who knows.


----------



## TheBlindDeafMute

The faster we get this, the faster we get the Titan 2. Bring it on.


----------



## Olivon

Quote:


> Originally Posted by *delboy67*
> 
> Did you not see the graphs, they already compete, if this really is mid range 380x then nvidias hand will be forced, we'll get even more new gpus, price drops on 290/maxwell/ big sales hence price drops on 1440/4k displays and good times for all


And I hope so. NVIDIA's domination is quite indecent, and AMD needs to turn the tables.
Not to overstate it, though; it's maybe a huge troll/fake chart.
So wait and see...


----------



## MunneY

Just to confirm what we already know.

WCCF might just be the most ******ed news site ever... They can't even read.

http://wccftech.com/amds-radeon-r9-390x-es-performance-numbers-allegedly-leaked-faster-geforce-gtx-980-consumes-197w-gaming/


----------



## Xuper

Quote:


> Originally Posted by *MunneY*
> 
> Just to confirm what we already know.
> 
> WCCF might just be the most ******ed news site ever... They can't even read.
> 
> http://wccftech.com/amds-radeon-r9-390x-es-performance-numbers-allegedly-leaked-faster-geforce-gtx-980-consumes-197w-gaming/


LOL! They don't care. They want to keep the fanboyism going.


----------



## Orangey

Is there anything about the name "Captain Jack" that we can extrapolate that would help us figure out if it's Bermuda or Fiji?

Pirate lore.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Orangey*
> 
> Is there anything about the name "Captain Jack" that we can extrapolate that would help us figure out if it's Bermuda or Fiji?
> 
> Pirate lore.


Just the islands, old pirate lore, etc., as you mentioned yourself. Although nothing more specific comes to mind...


----------



## iamhollywood5

Looks believable to me. As long as it's not paired with an ugly, cumbersome 295X2 water cooler like it's rumored to be, my wallet is open.


----------



## PureBlackFire

the 380X is rumored to be paired with an AIO like the 295X2?


----------



## CasualCat

Quote:


> Originally Posted by *PureBlackFire*
> 
> the 380X is rumored to be paired with an AIO like the 295X2?


It is now.


----------



## Ultracarpet

I'm wondering if that leaked cooler design with the water-cooling holes could be for the next-gen dual GPU.

If those power consumption numbers are for real, I can't see the 390X consuming more than around 250W, which really wouldn't need an AIO to be cooled properly...


----------



## iamhollywood5

Quote:


> Originally Posted by *PureBlackFire*
> 
> the 380X is rumored to be paired with an AIO like the 295X2?


Sorry, I guess it's the 390X that is rumored to have that AIO water cooler.


----------



## PostalTwinkie

Quote:


> Originally Posted by *PureBlackFire*
> 
> the 380X is rumored to be paired with an AIO like the 295X2?


There was a "leak" of AMD's 300 series reference cooler, and it was an AIO+air hybrid cooler. I imagine if true it would be for their 390X, or whatever card fits that position.


----------



## Xuper

Maybe they lied and it's a GeForce 980 Ti? Because of the power consumption (197W) and the score, I think it fits a GeForce 980 Ti. Even with 20nm tech, AMD can't reduce 210W to 197W if the 380X is the successor to the 280X and yet is faster than the 980. And do you think it's possible to beat Maxwell with GCN 1.3? This chart tells us that GCN 1.3 is a highly energy-efficient arch, and it's done on 20nm? Hell no; it's either fake or a GeForce 980 Ti.


----------



## iamhollywood5

Quote:


> Originally Posted by *Xuper*
> 
> Maybe they lied and it's a GeForce 980 Ti? Because of the power consumption (197W) and the score, I think it fits a GeForce 980 Ti. Even with 20nm tech, AMD can't reduce 210W to 197W if the 380X is the successor to the 280X and yet is faster than the 980. And do you think it's possible to beat Maxwell with GCN 1.3? This chart tells us that GCN 1.3 is a highly energy-efficient arch, and it's done on 20nm? Hell no; it's either fake or a GeForce 980 Ti.


Hmmm, let's see... Codename "Captain Jack". Which does that fit better with: Maxwell or the Pirate Islands lineup?


----------



## PostalTwinkie

Quote:


> Originally Posted by *iamhollywood5*
> 
> Hmmm, let's see... Codename "Captain Jack". Which does that fit better with: Maxwell or the Pirate Islands lineup?


Rum! It fits with Rum!


----------



## edo101

IMPOSSIBRU


----------



## PureBlackFire

Quote:


> Originally Posted by *CasualCat*
> 
> It is now.


so it seems.








Quote:


> Originally Posted by *Ultracarpet*
> 
> I'm wondering if that leaked cooler design with the water-cooling holes could be for the next-gen dual GPU.
> 
> If those power consumption numbers are for real, I can't see the 390X consuming more than around 250W, which really wouldn't need an AIO to be cooled properly...


That leaked pic was instantly rumored to be for the 390X. You can see it is shorter than the one on the 295X2 and has the opening for a fan at the end rather than in the middle. Not designed for a dual-GPU card.


----------



## maarten12100

Quote:


> Originally Posted by *Xuper*
> 
> Maybe they lied and it's a GeForce 980 Ti? Because of the power consumption (197W) and the score, I think it fits a GeForce 980 Ti. Even with 20nm tech, AMD can't reduce 210W to 197W if the 380X is the successor to the 280X and yet is faster than the 980. And do you think it's possible to beat Maxwell with GCN 1.3? This chart tells us that GCN 1.3 is a highly energy-efficient arch, and it's done on 20nm? Hell no; it's either fake or a GeForce 980 Ti.


They can easily do it. A new architecture instead of revisions; this will be the first radical change since 2012. They go from high-density 28nm to either lower-density 28nm, depleted-SOI 28nm, or a 20nm node. All of this will yield major improvement.

AMD can easily outshine NVIDIA's cards on 28nm, considering Maxwell will probably be the last of them. They will go head to head on 20nm or advanced 28nm.


----------



## raghu78

I am sure there is a GPU above this model. 200W for a flagship GPU is not going to happen when AMD knows a GM200 is waiting in the wings. BTW, 20nm is ruled out for AMD's next-gen GPUs, primarily because I don't see AMD releasing a high-performance GPU at 300-350 sq. mm on TSMC 20nm when they just confirmed Carrizo-L at 28nm.

http://www.anandtech.com/show/8742/amd-announces-carrizo-and-carrizol-next-gen-apus-for-h1-2015

This plan has changed from just 6 months back, when AMD said the follow-on to Beema at 28nm would be made at 20nm and be pin-compatible with ARM A57 SoCs.

http://www.anandtech.com/show/7989/amd-announces-project-skybridge-pincompatible-arm-and-x86-socs-in-2015

If AMD cannot release a low-power mobile SoC at 20nm, what are the chances that they would do it for a high-performance device, which needs robust yields and high-performance transistors?








Quote:


> Originally Posted by *maarten12100*
> 
> They can easily do it. A new architecture instead of revisions; this will be the first radical change since 2012. They go from high-density 28nm to either lower-density 28nm, depleted-SOI 28nm, or a 20nm node. All of this will yield major improvement.


Actually, AMD is very likely to build their next-gen R9 3xx GPUs on the GF 28SHP process. GF is already manufacturing Kaveri, semi-custom game console chips, and GPUs on GF 28SHP. This process is much better than TSMC 28HP. To give you an idea:

http://www.anandtech.com/show/7974/amd-beema-mullins-architecture-a10-micro-6700t-performance-preview

"Puma+ is based on the same micro architecture as Jaguar. We're still looking at a 2-wide OoO design with the same number of execution units and data structures inside the chip. The memory interface remains unchanged as well at 64-bits wide. *These new SoCs are still built on the same 28nm process as their predecessor. The process however has seen some improvements. Not only are both the CPU and GPU designs slightly better optimized for lower power operation, but both benefit from improvements to the manufacturing process resulting in substantial decreases in leakage current.*

*AMD claims a 19% reduction in core leakage/static current for Puma+ compared to Jaguar at 1.2V, and a 38% reduction for the GPU. The drop in leakage directly contributes to a substantially lower power profile for Beema and Mullins.*"

Anandtech was wrong that Beema/Mullins are built on the same process as Kabini/Temash; the table on the same page reflects this, clearly listing Beema/Mullins as GF 28nm.

A new architecture + an extremely power efficient, state of the art high bandwidth memory subsystem + the GF 28SHP process. I am sure AMD got the message loud and clear after the R9 290X launch that their architecture needed to improve power efficiency. If the rumours are true then AMD have listened and responded well.
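To make the quoted leakage figures concrete, here is a toy calculation. The baseline wattages are hypothetical placeholders, picked only to show how the per-block cuts combine; the 19%/38% reductions are the only numbers taken from the article:

```python
# Toy illustration of the quoted leakage claims: 19% less core leakage,
# 38% less GPU leakage. Baseline wattages below are HYPOTHETICAL.
core_leak_w = 1.0   # hypothetical Jaguar core static power, in watts
gpu_leak_w = 2.0    # hypothetical Kabini GPU static power, in watts

puma_core = core_leak_w * (1 - 0.19)   # Puma+ core leakage
puma_gpu = gpu_leak_w * (1 - 0.38)     # Puma+ GPU leakage

saved = (core_leak_w + gpu_leak_w) - (puma_core + puma_gpu)
print(f"static power saved: {saved:.2f} W "
      f"({saved / (core_leak_w + gpu_leak_w):.0%} of baseline leakage)")
```

With these placeholder baselines the blended reduction lands around a third of total static power, which is the kind of headroom that lets Beema/Mullins hit much lower power profiles without a node shrink.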


----------



## Xuper

Quote:


> Originally Posted by *iamhollywood5*
> 
> Hmmm let's see... Codename "Captain Jack," which does that fit better with: Maxwell or the Pirate Islands lineup?


Yep, Captain Jack Sparrow. He said:

Quote:


> If you had a sister and a dog , I'd choose the dog


----------



## Slomo4shO

Hmm...


----------



## Seronx

It is the R9 370X, most likely on 28nm FDSOI with adaptive voltage islands per 4 CUs.

20nm FDSOI is 2H 2015.



Vmax is important for per-part/IP frequency/voltage boosts.


----------



## CasualCat

Quote:


> Originally Posted by *Seronx*
> 
> It is the R9 370X, most likely on 28-nm FDSOI with per 4CU, adaptive voltage islands.
> 
> 20nm FDSOI is 2H 2015;
> 
> 
> 
> Vmax is important for per-part/ip frequency/voltage boosts.


That'd be impressive if true, but I don't see how your slides make the case that it is the 370X vs 380X.


----------



## Seronx

Quote:


> Originally Posted by *CasualCat*
> 
> That'd be impressive if true, but I don't see how your slides make the case that it is the 370X vs 380X.


http://www.overclock.net/t/1526789/chiphell-possible-380x-preview/20#post_23197567

I made a post before:
x90 Fiji - New chip.
x80 Maui - Enhanced/respin of Hawaii with the PI feature set. Incidentally, referred to by many as "Big" Hawaii.***
x70 Bermuda - Enhanced/respin of Tonga with the complete PI feature set.
x60 Treasure - New chip, or enhanced/respin of Bonaire with the PI feature set.

Pirate Islands comes with faster ld/st, which isn't in Tonga.

***


----------



## Kinaesthetic

Quote:


> Originally Posted by *CasualCat*
> 
> That'd be impressive if true, but I don't see how your slides make the case that it is the 370X vs 380X.


Rule #1 on OCN: he is wrong almost 95% of the time.


----------



## StereoPixel

Quote:


> Originally Posted by *iamhollywood5*
> 
> Hmmm let's see... Codename "Captain Jack," which does that fit better with: Maxwell or the Pirate Islands lineup?


Pages 19 and 21:
http://www.slideshare.net/GauravSharma250/bermuda-triangle-17325215

Bermuda Triangle = Bermuda Islands and Captain Jack = Pirate Islands.
It's the Bermuda GPU, I think.


----------



## CasualCat

Quote:


> Originally Posted by *Seronx*
> 
> http://www.overclock.net/t/1526789/chiphell-possible-380x-preview/20#post_23197567
> 
> Made a post before;
> x90 Fiji - New Chip.
> x80 Maui - Enhanced/Respin of Hawaii with PI feature set. Incidentally, referred to many as "Big" Hawaii.***
> x70 Bermuda - Enhanced/Respin of Tonga with complete PI feature set.
> x60 Treasure - New Chip or Enhanced/Respin of Bonaire with PI feature set.
> 
> ***


OK, so in essence, as others are speculating, Bermuda would be the most likely "fit" for Captain Jack, and Bermuda should be the x70 part. Got it. I'm skeptical about a Tonga refresh having that kind of performance jump personally, but this isn't in any way my area of expertise.


----------



## PostalTwinkie

Quote:


> Originally Posted by *raghu78*
> 
> I am sure there is a GPU above this model. 200w for a flagship GPU is not going to happen when AMD knows a GM200 is waiting in the wings. btw 20nm is ruled out for AMD's next gen GPUs primarily because I don't see AMD releasing a high performance GPU at 300 - 350 sqmm on TSMC 20nm when they just confirmed Carrizo-L at 28nm.
> 
> http://www.anandtech.com/show/8742/amd-announces-carrizo-and-carrizol-next-gen-apus-for-h1-2015
> 
> This plan has changed from just 6 months back when AMD said the follow-on to Beema at 28nm will be made at 20nm and will be pin-compatible with ARM A57 SoCs.
> 
> http://www.anandtech.com/show/7989/amd-announces-project-skybridge-pincompatible-arm-and-x86-socs-in-2015
> 
> If AMD cannot release even a low power mobile SoC at 20nm, what are the chances that they would do it for a high performance device, which needs robust yields and high performance transistors?
> 
> 
> 
> 
> 
> 
> 
> 
> Actually AMD is very likely to build their next gen R9 3xx GPUs using GF 28SHP process. Already GF is manufacturing Kaveri, semi-custom game console chips and GPUs at GF 28SHP. This process is much better than TSMC 28HP. To give you an idea.
> 
> http://www.anandtech.com/show/7974/amd-beema-mullins-architecture-a10-micro-6700t-performance-preview
> 
> "Puma+ is based on the same micro architecture as Jaguar. We're still looking at a 2-wide OoO design with the same number of execution units and data structures inside the chip. The memory interface remains unchanged as well at 64-bits wide. *These new SoCs are still built on the same 28nm process as their predecessor. The process however has seen some improvements. Not only are both the CPU and GPU designs slightly better optimized for lower power operation, but both benefit from improvements to the manufacturing process resulting in substantial decreases in leakage current
> 
> AMD claims a 19% reduction in core leakage/static current for Puma+ compared to Jaguar at 1.2V, and a 38% reduction for the GPU. The drop in leakage directly contributes to a substantially lower power profile for Beema and Mullins.*."
> 
> Anandtech was wrong that Beema / Mullins are built on the same process as Kabini/Temash. This fact is reflected in the table in the same page where they clearly mention Beema/Mullins as GF 28nm.
> 
> A new architecture + extremely power efficient state of the art High bandwidth memory subsystem + GF 28SHP process . I am sure AMD got the message loud and clear after R9 290X launch that their architecture needed to improve power efficiency. If the rumours are true then AMD have listened and responded well.


Going to totally derail this thread here........

I wonder what sort of impact this would have on the consoles. Part of me wants to think that both Sony and Microsoft would want to take advantage of these improvements, and release a refresh of the PS4 and XB1. If anything to maintain the same power profile/requirements of the unit, but maybe have higher clock speeds on both the CPU and GPU. It might be enough to get these consoles up to that 1080/60.....


----------



## joeh4384

I think it is the 380x. GM204 from Nvidia is really only a mid range chip that is raking in flagship prices on the 980. Kudos to Nvidia for getting to market first, but that also means your competitor has a performance target to plan for. 2015 looks like an interesting year for GPUs.


----------



## CasualCat

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Going to totally derail this thread here........
> 
> I wonder what sort of implications this could mean for the consoles. Part of me wants to think that both Sony and Microsoft would want to take advantage of these improvements, and release a refresh of the PS4 and XB1. If anything to maintain the same power, but maybe have higher clock speeds on both the CPU and GPU. It might be enough to get these consoles up to that 1080/60.....


Zero. You can't (edit: successfully) do that kind of stuff with consoles. It becomes too fragmenting, and not having to worry about fragmentation is one of consoles' advantages.


----------



## joeh4384

Quote:


> Originally Posted by *CasualCat*
> 
> Zero. You can't (edit: successfully) do that kind of stuff in consoles. It becomes too fragmenting which is one of console's advantages that they don't have to worry about.


I do not think they can screw over all the people who bought it on release like that. If anything they will use the die shrinks to make small form factor versions like they did in the past.


----------



## Seronx

Quote:


> Originally Posted by *CasualCat*
> 
> I'm skeptical about a Tonga refresh having that kind of performance jump personally, but not in any way my area of expertise


The full rumor expects:
Tonga -> Bermuda

Tonga XT -- *<1 GHz* // 2048 ALUs / 128 TMUs / 32 ROPs // 1x(baseline) read and write cache speed
Bermuda XT -- *>1.1 GHz* // 2048 ALUs / 256 TMUs / 64 ROPs // 2x read and write cache speed // possibly with larger caches as well
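For a sense of scale, the rumored specs above imply roughly the following theoretical throughput, assuming an FMA counts as two ops per ALU per clock and taking the listed clocks at face value. A back-of-the-envelope sketch, not a benchmark:

```python
# Back-of-the-envelope throughput from the rumored specs above.
# All unit counts and clocks come from the rumor, not from AMD.
def fillrates(clock_ghz, alus, tmus, rops):
    gflops = 2 * alus * clock_ghz   # FMA = 2 ops per ALU per clock
    gtexels = tmus * clock_ghz      # texture fillrate, GTexel/s
    gpixels = rops * clock_ghz      # pixel fillrate, GPixel/s
    return gflops, gtexels, gpixels

tonga_xt = fillrates(1.0, 2048, 128, 32)    # rumored "<1 GHz" taken as 1.0
bermuda_xt = fillrates(1.1, 2048, 256, 64)  # rumored ">1.1 GHz" taken as 1.1

for name, (gf, gt, gp) in (("Tonga XT", tonga_xt), ("Bermuda XT", bermuda_xt)):
    print(f"{name}: {gf:.0f} GFLOPS, {gt:.1f} GTexel/s, {gp:.1f} GPixel/s")
```

At those clocks the shader throughput only moves ~10%, but texture and pixel fillrate both jump ~2.2x, which fits the idea that the respin is mostly about doubled TMUs/ROPs plus the faster cache path.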


----------



## PostalTwinkie

Quote:


> Originally Posted by *CasualCat*
> 
> Zero. You can't (edit: successfully) do that kind of stuff in consoles. It becomes too fragmenting which is one of console's advantages that they don't have to worry about.


I don't know.

Realistically, the hardware in the consoles now isn't going to last a 7 to 8 year cycle, it just isn't. They will have to cut the console cycle short or do a refresh.
Quote:


> Originally Posted by *joeh4384*
> 
> I do not think they can screw over all the people who bought it on release like that. If anything they will use the die shrinks to make small form factor versions like they did in the past.


This is almost assured.


----------



## geoxile

Quote:


> Originally Posted by *Kinaesthetic*
> 
> Rule #1 on OCN: he is wrong almost 95% of the time.


This. That said, if AMD really does integrate all those power saving features into their GPUs I can see the efficiency jumping up significantly, and more so if HBM is used.


----------



## Noufel

So if it's a 380X and it's coming in February 2015, can we expect a 390X Fiji in H1 2015, or H2?


----------



## Orangey

Maybe the process they are using won't allow that size of die (390X Fiji) right now, in which case it's anyone's guess how long it will take to improve. 20nm @ TSMC still won't allow it, and it's been about a year.

It sucks how much PC gaming is being held back by fumbling in the semiconductor industry.


----------



## MapRef41N93W

Quote:


> Originally Posted by *Mookster*
> 
> Different situation. The 380x should be a relatively small die on 20nm. The move to 20nm and architecture improvements should produce these kinds of results (ala Fermi vs Tahiti.) If these numbers are accurate, I'd guess they're not using HBM on the 380x.
> 
> The 390x, however, could benefit from a large 20nm die _and_ HBM. I'd be disappointed with any less than a 50% lead over big Maxwell. It has every advantage to do so.


And then you woke up.


----------



## noilly

Quote:


> Originally Posted by *Orangey*
> 
> Maybe the process they are using won't allow that size of die (390X Fiji) right now, in which case it's anyone's guess how long it takes to improve? 20nm @ TSMC still won't allow it and it's been about a year.
> 
> It sucks how much PC gaming is being held back by fumbling in the semiconductor industry.


making chips is hard


----------



## SlackerITGuy

Quote:


> Originally Posted by *noilly*
> 
> Shrinking transistors is hard.


Fixed.


----------



## incog

Why does anyone even care about reference coolers? We aren't getting those cards unless it's for water cooling right?


----------



## hyp36rmax

Quote:


> Originally Posted by *incog*
> 
> Why does anyone even care about reference coolers? We aren't getting those cards unless it's for water cooling right?


Probably because the new AMD reference coolers are actually good enough to keep if they plan on utilizing AIOs. Personally I'll be removing it anyway to add into my custom loop.


----------



## iCrap

Quote:


> Originally Posted by *incog*
> 
> Why does anyone even care about reference coolers? We aren't getting those cards unless it's for water cooling right?


Well it would be nice if the reference design wasn't total crap.


----------



## noilly

Quote:


> Originally Posted by *SlackerITGuy*
> 
> Fixed.


yeah, that's implied. even intel's having problems.


----------



## PureBlackFire

Quote:


> Originally Posted by *incog*
> 
> Why does anyone even care about reference coolers? We aren't getting those cards unless it's for water cooling right?


wrong. water cooling is done by a minority among a minority of consumers.


----------



## MapRef41N93W

Quote:


> Originally Posted by *incog*
> 
> Why does anyone even care about reference coolers? We aren't getting those cards unless it's for water cooling right?


Yeah because custom water cooling isn't only done by less than 1% of people who buy high end GPUs or anything.


----------



## SoloCamo

I just want a remotely quiet reference cooler; I will not buy a GPU with a built-in AIO, sorry.

Even if it's equal to my 290X's stock reference cooler I can deal with it, and even run a mild OC on stock fan speed. Realistically 60% is NOT loud; it's when you get into the 70%+ range that it's too loud for my tastes.

Ideally, a blower style cooler with a larger fan, even if it means a wider reference shroud, just so it's less of a high pitched / high RPM fan.
Quote:


> Originally Posted by *MapRef41N93W*
> 
> Yeah because custom water cooling isn't only done by less than 1% of people who buy high end GPUs or anything.


I'd bet money that less than 15% of high end card owners actually water cool them.


----------



## Meatdohx

Quote:


> Originally Posted by *Wishmaker*
> 
> Unless AMD pulls a Conroe on NVIDIA, things will not change in their favour. I have quite old rigs, true, but I am running on one of them 2x270x DCU II and the performance is oftentimes worse than the equivalent SLI from NVIDIA. If it is not the architecture, it is the drivers with AMD. The 390X needs to be something special for NVIDIA to be put in the corner.


Hawaii is better in CrossFire than anything I've seen/tried before. It is much better than Pitcairn or Tahiti due to the frame pacing improvements plus XDMA CrossFire.


----------



## Vintage

I think the 390x will perform better than this. I bet they will beat the 980 with the 380x and then wait for NV to release the GM200 card.

If the efficiency is true then that is really impressive. Getting excited for my upgrade early next year!


----------



## Kaltenbrunner

Can't wait to see prices. I've had a 5670, 6950 CF, 7950 CF, and now just a 7950.

I'd like to try Nvidia for once, but AMD sure has had the price/performance edge each time I went to buy.


----------



## Testier

This can't be on 28nm.


----------



## Master__Shake

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Going to totally derail this thread here........
> 
> I wonder what sort of impact this would have on the consoles. Part of me wants to think that both Sony and Microsoft would want to take advantage of these improvements, and release a refresh of the PS4 and XB1. If anything to maintain the same power profile/requirements of the unit, but maybe have higher clock speeds on both the CPU and GPU. It might be enough to get these consoles up to that 1080/60.....


The Cell Broadband Engine enjoyed a die shrink as well.

No performance improvements were to be had.

Lower wattage was the result.


----------



## Testier

Quote:


> Originally Posted by *Master__Shake*
> 
> the cell broadband engine enjoyed as die shrink as well.
> 
> no performance improvements were to be had.
> 
> lower wattage was the result.


Lower wattage on the consoles is just as important as performance. It allows more compact/cheaper cooling, which reduces the size of the console and/or the cost per unit.

1080/60 on console is a dream. And it will remain a dream unless consumers are willing to accept a higher cost per unit.

And/or the average console consumer gets more informed.


----------



## Orangey

By the time console gamers get 1080p60 we will have 4K120.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Orangey*
> 
> By the time conslolers get 1080p60 we will have 4K120.


I would think next gen consoles will be 4K or 1440p. There is no way they are 1080p. Still 30 fps though.


----------



## Slomo4shO

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Realistically, the hardware in the consoles now isn't going to last a 7 to 8 year cycle, it just isn't.


The average has been 6 years for a console thus far. 4K TVs are expected to have ~10% market penetration by 2017. This would suggest that new consoles in 2019 would be sufficient.

What would be more concerning to the console market may be PC on a stick machines that can theoretically surpass the GPU capacity of the consoles in the next few years...


----------



## PontiacGTX

Quote:


> Originally Posted by *salamachaa*
> 
> It's happened a lot in the past between these two.
> 
> Anyway, I could see this being an interim flagship for AMD until they get big Bermuda ready. If that were true, then I bet they release it at 500 dollars, not 300 dollars.


It's Hawaii XT (2.0), Treasure Islands, Fiji XT, Bermuda XT.


----------



## maarten12100

Quote:


> Originally Posted by *raghu78*
> 
> Actually AMD is very likely to build their next gen R9 3xx GPUs using GF 28SHP process. Already GF is manufacturing Kaveri, semi-custom game console chips and GPUs at GF 28SHP. This process is much better than TSMC 28HP. To give you an idea.


I share your views, especially about GF's nodes. People say they are worse, but those people couldn't be further from the truth: GloFo has the very best nodes compared to TSMC, and halved the power on Temash -> Mullins. They will also deliver great results with Carrizo, on whatever node that ends up on, if we can trust the slides.

When AMD said they would swap production to GF, I first thought it was just for the take-or-pay agreement, but it seems GF will truly become a global fab and direct, strong competition to TSMC. Good news.


----------



## Elric

This is not a 20nm GPU, nor will there ever be one on these nodes: they are for low power chips. Since GPUs and APUs use more power and have higher clocks, the low power 20nm would not work, and as of yesterday there is no high power 20nm node from anyone.


----------



## Elric

There is no 20nm GPU, nor will there ever be one: GPUs need high power nodes and there is no 20nm high power node. AMD did not even put their next APUs on 20nm, because they would not be able to clock them to 3.5-4.0 GHz.


----------



## crazycrave

I remember the rumors being that the 380X would compete with and take the crown from the 980. The release was January, but if things went well it could be December, just in time for Christmas.


----------



## Noufel

I own 980 G1s in SLI. They are good performers, but frankly I regret giving up my 290 Tri-X CFX because of its price/perf ratio and good CFX scaling. Don't bother explaining the power efficiency and overclockability thing to me: an OC on Maxwell doesn't give the same performance that an OC on Hawaii gives, and DSR and MFAA are simply gadgets more than useful features for me. So when the Fiji 390X comes out it will be an instant buy for me.


----------



## Paladin Goo

Pretty nice if that's legit.


----------



## maarten12100

Quote:


> Originally Posted by *crazycrave*
> 
> I remember the rumors being that the 380X would compete and take the crown from the 980.. the release was Jan but if things went well .. could be Dec just in time for Christmas..


Don't get your hopes up; Q1 2015 it is.


----------



## f0rteOC

Quote:


> Originally Posted by *raghu78*
> 
> I am sure there is a GPU above this model. 200w for a flagship GPU is not going to happen when AMD knows a GM200 is waiting in the wings. btw 20nm is ruled out for AMD's next gen GPUs primarily because I don't see AMD releasing a high performance GPU at 300 - 350 sqmm on TSMC 20nm when they just confirmed Carrizo-L at 28nm.
> 
> http://www.anandtech.com/show/8742/amd-announces-carrizo-and-carrizol-next-gen-apus-for-h1-2015
> 
> This plan has changed from just 6 months back when AMD said the follow-on to Beema at 28nm will be made at 20nm and will be pin-compatible with ARM A57 SoCs.
> 
> http://www.anandtech.com/show/7989/amd-announces-project-skybridge-pincompatible-arm-and-x86-socs-in-2015
> 
> If AMD cannot release even a low power mobile SoC at 20nm, what are the chances that they would do it for a high performance device, which needs robust yields and high performance transistors?
> 
> 
> 
> 
> 
> 
> 
> 
> Actually AMD is very likely to build their next gen R9 3xx GPUs using GF 28SHP process. Already GF is manufacturing Kaveri, semi-custom game console chips and GPUs at GF 28SHP. This process is much better than TSMC 28HP. To give you an idea.
> 
> http://www.anandtech.com/show/7974/amd-beema-mullins-architecture-a10-micro-6700t-performance-preview
> 
> "Puma+ is based on the same micro architecture as Jaguar. We're still looking at a 2-wide OoO design with the same number of execution units and data structures inside the chip. The memory interface remains unchanged as well at 64-bits wide. *These new SoCs are still built on the same 28nm process as their predecessor. The process however has seen some improvements. Not only are both the CPU and GPU designs slightly better optimized for lower power operation, but both benefit from improvements to the manufacturing process resulting in substantial decreases in leakage current
> 
> AMD claims a 19% reduction in core leakage/static current for Puma+ compared to Jaguar at 1.2V, and a 38% reduction for the GPU. The drop in leakage directly contributes to a substantially lower power profile for Beema and Mullins.*."
> 
> Anandtech was wrong that Beema / Mullins are built on the same process as Kabini/Temash. This fact is reflected in the table in the same page where they clearly mention Beema/Mullins as GF 28nm.
> 
> A new architecture + extremely power efficient state of the art High bandwidth memory subsystem + GF 28SHP process . I am sure AMD got the message loud and clear after R9 290X launch that their architecture needed to improve power efficiency. If the rumours are true then AMD have listened and responded well.


This makes a lot of sense. Even if AMD works their power efficiency magic on their 28nm GPUs, they can't get the huge performance jumps that the 20nm process would allow.

This "Captain Jack" GPU is probably the 390X, since it is a decent bump above the 290X and AMD only needs to compete with the GTX 980 for the time being. Nvidia wouldn't release a GM200-based GTX 980 Ti yet, since it would eclipse their GTX 980 in performance. I could see Nvidia releasing a GTX Titan 2, but it would be priced way higher than the 390X, and Nvidia would still need to cut the GTX 980 price substantially (i.e. by $100-$150) to keep it competitive.
Quote:


> Originally Posted by *Slomo4shO*
> 
> What would be more concerning to the console market may be PC on a stick machines that can theoretically surpass the GPU capacity of the consoles in the next few years...


The Radeon HD 7850 is halfway between the GPU performance of the PS4 and the XBONE. The current PC-on-a-stick machines run on Intel's Bay Trail Atom CPUs with Intel HD Graphics based on the Ivy Bridge microarchitecture, with 4 execution units and a max clock speed of 896MHz (Z3785). Notebookcheck.net compares it to the Radeon HD 6310 performance-wise.
If we compared the GPU performance of Bay Trail to the PS3's GPU (7800 GTX equivalent), the PS3 would come out on top (the Tom's Hardware GPU hierarchy chart lists the 7800 GTX as 6 tiers higher).
The 7800 GTX is a nine-year-old GPU, so even with an increased emphasis on developing more powerful integrated graphics, we won't see PS4-level performance in a stick PC for at least 7 or 8 years.


----------



## Nerull

For AMD's sake I hope all of this is true. Their GPUs need to kick some serious ass to make up for the dead weight of the CPU division.

Also, I'm shocked no one has linked this yet.


----------



## incog

Quote:


> Originally Posted by *PureBlackFire*
> 
> Quote:
> 
> 
> 
> Originally Posted by *incog*
> 
> Why does anyone even care about reference coolers? We aren't getting those cards unless it's for water cooling right?
> 
> 
> 
> wrong. water cooling is done by a minority among a minority of consumers.

that's my point though

reference coolers are inferior to aftermarket heatsinks from Sapphire, Gigabyte, MSI, etc.

since a minority water cool anyway, and reference cards are nice in that they're the cheapest cards you can get (the heatsink is crap, but you don't care since you're not going to use it, sort of like the Intel stock cooler), the question becomes:

why do we care about reference coolers?


----------



## hteng

jack sparrow be in the die pirating


----------



## Redwoodz

Any reason to believe this can't be TSMC's 16nm FinFet +? http://www.kitguru.net/components/graphic-cards/anton-shilov/tsmc-reveals-new-16nm-finfet-process-vows-to-start-10nm-production-in-q4-2015/


----------



## SlackerITGuy

Quote:


> Originally Posted by *Redwoodz*
> 
> Any reason to believe this can't be TSMC's 16nm FinFet +? http://www.kitguru.net/components/graphic-cards/anton-shilov/tsmc-reveals-new-16nm-finfet-process-vows-to-start-10nm-production-in-q4-2015/


It will be 28nm.


----------



## Redwoodz

Quote:


> Originally Posted by *SlackerITGuy*
> 
> It will be 28nm.


Very well might be,but you didn't answer my question.


----------



## geoxile

Quote:


> Originally Posted by *Redwoodz*
> 
> Any reason to believe this can't be TSMC's 16nm FinFet +? http://www.kitguru.net/components/graphic-cards/anton-shilov/tsmc-reveals-new-16nm-finfet-process-vows-to-start-10nm-production-in-q4-2015/


For one
Quote:


> Taiwan Semiconductor Manufacturing Co. this week unveiled an improved version of its 16nm FinFET process technology that will hit volume production in 2015


A production start in 2015, plus the actual time to market for the GPU (probably several months), would probably indicate second half 2015, whereas AMD's new GPUs are expected in the first half. And we also saw a supposed leak of a shipment of components going out for assembly, whereas 16nm FF+ isn't even ready.

Plus, it's generally expected that the 16nm FF+ will be geared primarily for less complex, smaller die chips, at least at first.


----------



## curly haired boy

ok, where the hell is big maxwell?


----------



## ZealotKi11er

Quote:


> Originally Posted by *curly haired boy*
> 
> ok, where the hell is big maxwell?


Why do u need big Maxwell? GTX980 not enough?


----------



## Orangey

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Why do u need big Maxwell? GTX980 not enough?


That extra 5% that GM204 Maxwell brings is, indeed, not enough.


----------



## curly haired boy

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Why do u need big Maxwell? GTX980 not enough?


barely enough to max DA:I 1080p @ 60 fps... i'll need way more horsepower to tackle Witcher 3.


----------



## incog

Quote:


> Originally Posted by *curly haired boy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Why do u need big Maxwell? GTX980 not enough?
> 
> 
> 
> barely enough to max DA:I 1080p @ 60 fps... i'll need way more horsepower to tackle Witcher 3.

Perhaps DAI is just a badly optimized game?

Cards at the level of 7970/770/780 should be able to tackle anything at 1080p, and I've seen the footage; DAI doesn't even look that great.


----------



## hollowtek

Well those charts sure looked ambitious. Not going to hold my breath though


----------



## Ultracarpet

Quote:


> Originally Posted by *incog*
> 
> Perhaps DAI is just a badly optimized game?
> 
> Cards at the level of 7970/770/780 should be able to tackle anything at 1080p, I've seen the footage DAI doesn't even look that great.


rofl, DAI is beautiful, and the landscapes are massive... and I'm not really sure why people keep saying certain cards are clearly good enough for 1080p for all time... like seriously, try running an 8800gt in 720p on new games. Resolution isn't all that matters.......


----------



## Blameless

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Why do u need big Maxwell? GTX980 not enough?


If I'm looking for a substantial single GPU upgrade to my 290X...no.
Quote:


> Originally Posted by *incog*
> 
> why do we care about reference coolers?


Because a good reference cooler means I don't have to wait for or fuss over the differences between the mix of coolers that follow. They also tend to be a few bucks cheaper.
Quote:


> Originally Posted by *incog*
> 
> Cards at the level of 7970/770/780 should be able to tackle anything at 1080p


They aren't, well not if you are looking at certain frame rate/IQ targets in certain games.


----------



## incog

Quote:


> Originally Posted by *Ultracarpet*
> 
> Quote:
> 
> 
> 
> Originally Posted by *incog*
> 
> Perhaps DAI is just a badly optimized game?
> 
> Cards at the level of 7970/770/780 should be able to tackle anything at 1080p, I've seen the footage DAI doesn't even look that great.
> 
> 
> 
> rofl, DAI is beautiful, and the landscapes are massive... and I'm not really sure why people keep saying certain cards are clearly good enough for 1080p for all time... like seriously, try running an 8800gt in 720p on new games. Resolution isn't all that matters.......

there hasn't really been a game that has impressed me with graphics since il-2 cliffs of dover; that game was also an un-optimized mess until its community fixed the issues themselves.

https://www.youtube.com/watch?v=lu0XRSTRayo

^that just doesn't impress me in terms of graphics,

this does: https://farm8.staticflickr.com/7034/13423161775_bf81f80857_o.png

if you look at the cockpit with all the little wires, knobs and other things, realize that that's all 3D with very, very nice textures on them

some nice lighting effects and almost photo-realistic aircraft: http://i1364.photobucket.com/albums/r739/larry691/92sqn2_zps6a50b1f6.png

if you get this close to foliage it means you're doing something wrong, but regardless you get an idea of the very realistic effects you can get in this game: http://img72.imageshack.us/img72/1185/explosion3.jpg

this is very close to what you see in reality from the air, the haze, the flatness of the landscape, etc. i guess that the bay down there isn't quite as well done and there aren't any clouds (clouds aren't the best) but you get the idea: http://tof.canardpc.com/view/23f98237-8044-4036-abeb-4172546ec9fc.jpg

it's just overall very impressive, to me at least. witcher 2 also had good graphics for example (imo) and so does DAI, but they are far from stunning, they're really just OK. now you can get some very beautiful images in il-2 cliffs of dover with kepler cards such as the 770, the 780 and the 780 ti. my 7970 also runs that game quite well at graphically beautiful settings.

so it's silly that 980 SLI would get that kind of fps at non-maxed settings and still not look that great, at least to me. let's just say it's not impressive, even if the game looks more than all right.

far cry 4 is the same thing tbh, it's a good looking game but the graphics are a far cry (heuheu) from being truly stunning.

i think arma 3 is also amazing looking but i'm not 100% certain. a modded skyrim is imo more impressive than DAI or FC4

edit: yes, i know that 7970/770 tier cards aren't murdering games like DAI or FC4; my point is that they >should<, since those games aren't immensely impressive like other games. R9 290+ and GTX 780 Ti/970+ should comfortably game at 1440p and not run into any issues whatsoever in 1080p games, unless the said games look really, really, really amazing. but they just don't. mostly because they aren't really optimized.


----------



## caswow

you are comparing a cockpit or 3 aircraft vs an open world


----------



## flopper

Quote:


> Originally Posted by *incog*
> 
> Perhaps DAI is just a badly optimized game?
> 
> Cards at the level of 7970/770/780 should be able to tackle anything at 1080p, I've seen the footage DAI doesn't even look that great.


DAI is the best looking game in the world.
Beats any other by miles.
skyrim sucks no matter how you mod it.

if you haven't played DAI you can't judge it.
the aesthetics of the game's landscapes are simply mindblowing.
no other game comes close.


----------



## PontiacGTX

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Why do u need big Maxwell? GTX980 not enough?


for those that use 4K or Eyefinity, they could need something better
Quote:


> Originally Posted by *incog*
> 
> this does: https://farm8.staticflickr.com/7034/13423161775_bf81f80857_o.png
> 
> if you look at the cockpit with all the little wires, knobs and other things, realize that that's all 3D with very, very nice textures on them
> 
> some nice lighting effects and almost photo-realistic aircraft: http://i1364.photobucket.com/albums/r739/larry691/92sqn2_zps6a50b1f6.png
> .


what game is that one?


----------



## XxOsurfer3xX

Can't wait for the 390X. This benchmark doesn't look too unrealistic; it could be possible with hybrid cooling...


----------



## joeh4384

I think this is the 380x and it will come in at $400 to undercut the 980, returning the favor of the 970 undercutting the 290 series.


----------



## pbvider

Quote:


> Originally Posted by *flopper*
> 
> DAI best looking game in the world.
> Beats any other by miles.
> skyrim sucks no matter how you modd it.
> 
> if you havent played DAI you cant judge it.
> the aesthetics of the games landscape are simply mindblowing.
> no other game comes close.


I want whatever you are smoking.


----------



## Kaltenbrunner

is this right

7950 => r9 280

7970 => r9 290

and what was the 290X and what was the double chip version like 6990 ?(can't remember if there was a 7990)?


----------



## Kuivamaa

Quote:


> Originally Posted by *Kaltenbrunner*
> 
> is this right
> 
> 7950 => r9 280
> 
> 7970 => r9 290
> 
> and what was the 290X and what was the double chip version like 6990 ?(can't remember if there was a 7990)?


7970 => R9 280X. The R9 290 and R9 290X are based on a different chip (Hawaii, vs Tahiti for the others).


----------



## Tsumi

There was a 7990. And there is also an R9 295X2 (which is a dual-chip R9 290X).


----------



## BizzareRide

What worries me is if this is on 20nm and AMD still can't completely destroy a midrange die on 28nm... I hope this is 28nm, because if not, then with the 390x on 20nm AMD will only match Big Maxwell on 28nm... What happens when NVIDIA switches to 20nm or, like the rumors suggest, skips to 16nm?


----------



## Orangey

Forget about 20nm already, geez.


----------



## Kaltenbrunner

Quote:


> Originally Posted by *Kuivamaa*
> 
> 7970=> R9 280X. R9 290 and R9 290X are based on a different chip (hawaii, vs tahiti for the others).


wow what ??????


----------



## PontiacGTX

Quote:


> Originally Posted by *incog*
> 
> this does: https://farm8.staticflickr.com/7034/13423161775_bf81f80857_o.png
> 
> if you look at the cockpit with all the little wires, knobs and other things, realize that that's all 3D with very, very nice textures on them
> 
> some nice lighting effects and almost photo-realistic aircraft: http://i1364.photobucket.com/albums/r739/larry691/92sqn2_zps6a50b1f6.png
> .


what game is that one?
Quote:


> Originally Posted by *BizzareRide*
> 
> What worries me is if this is on 20nm and AMD can't completely destroy a midrange die on 28nm... I hope this is 28nm, cause if not then with a 390x at 20nm, AMD will only match Big Maxwell on 28nm... What happens when NVIDIA switches to 20nm or, like the rumors suggest, skip to 16nm?


i don't think they could use 28nm for GM210/200 unless they want almost the same power consumption as GK110


----------



## electro2u

Quote:


> Originally Posted by *Kaltenbrunner*
> 
> wow what ??????


He's correct but it reads funny. He's saying:
The 7970 is the same as the 280x.
And the 290/290x are a different chip from the 280x.

I suspect the article is true and that the 390x won't be much more powerful than the 380x. Kinda like the 290x isn't much over the 290.


----------



## sugarhell

Quote:


> Originally Posted by *electro2u*
> 
> He's correct but it reads funny. He's saying:
> The 7970 is the same as the 280x.
> And the 290/290x are a different chip from the 280x.
> 
> I suspect the article is true and that the 390x won't be much more powerful than the 380x. Kinda like the 290x isn't much over the 290.


The 290 and 290x use the same chip. The 380x and 390x will probably be different chips, though. So expect a big difference.


----------



## Kaltenbrunner

Quote:


> Originally Posted by *electro2u*
> 
> He's correct but it reads funny. He's saying:
> The 7970 is the same as the 280x.
> And the 290/290x are a different chip from the 280x.
> 
> I suspect the article is true and that the 390x won't be much more powerful than the 380x. Kinda like the 290x isn't much over the 290.


yeah I got it but wow, what was the difference between the 280 and 290 chips ??? So the 280 isn't just a cut down version of the 290 ?

Here's to hoping I save money and get a gtx980 or 390x next year


----------



## azanimefan

Quote:


> Originally Posted by *Kaltenbrunner*
> 
> yeah I got it but wow, what was the difference between the 280 and 290 chips ??? So the 280 isn't just a cut down version of the 290 ?
> 
> Here's to hoping I save money and get a gtx980 or 390x next year


the r7-250 was a Oland cored GCN 1.1gpu
the r7-240 was a cut down oland cored GCN 1.1gpu

the 7770/r7-250x was a cape verde cored GCN 1.0gpu
the 7750 was a cut down cape verde cored GCN 1.0gpu

the 7790/r7-260x was a bonaire cored GCN 1.1 gpu
the r7-260 was a cut down bonaire cored GCN 1.1gpu

the 7870/r9-270x was a pitcairn cored GCN 1.0 gpu
the r9-270 was a pitcairn cored GCN 1.0 gpu with a lower power draw than the r9-270x, however nearly identical performance
the 7850/r7-265 was a CUT DOWN pitcairn cored GCN 1.0gpu

the 7970/r9-280x was a tahiti cored GCN 1.0 gpu
the 7950/r9-280 was a cut down tahiti cored GCN 1.0gpu

the 7990 was 2 tahiti cored GCN 1.0 gpus on the same card

the r9-285 is a tonga cored GCN 1.2 gpu ~ its number might imply it's faster than the r9-280x, however in performance terms it actually comes in around the same as the r9-280.

the r9-290x was a hawaii cored GCN 1.1 gpu
the r9-290 was a cut down hawaii cored GCN 1.1 gpu

the r9-295x2 is a DUAL hawaii cored GCN 1.1 gpu ~ it's basically 2 r9-290x's on the same board much like the TitanZ

so if i were to list them in order based on core design and performance it would go

r9-295x2 - Hawaiix2 GCN 1.1
7990 - Tahitix2 GCN 1.0
r9-290x - Hawaii GCN 1.1
r9-290 - Hawaii GCN 1.1
r9-280x - Tahiti GCN 1.0
r9-285 - Tonga GCN 1.2*
r9-280 - Tahiti GCN 1.0*
r9-270x - Pitcairn GCN 1.0
r9-270 - Pitcairn GCN 1.0
r7-265 - Pitcairn GCN 1.0
r7-260x - Bonaire GCN 1.1
r7-260 - Bonaire GCN 1.1
r7-250x - Cape Verde GCN 1.0
7750 - Cape Verde GCN 1.0
r7-250 - Oland GCN 1.1
r7-240 - Oland GCN 1.1

*note: the r9-285 and r9-280 are basically identical in performance, even though they sport two different graphics cores; i chose to put the r9-285 on top on a random whim.
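fwiw, the genealogy above boils down to a simple SKU → (core, GCN revision) mapping. Here's a quick sketch of it as a Python lookup table (data copied straight from the list above; the `same_core` helper is just my own addition for illustration):

```python
# Lookup table for the SKU genealogy listed above: SKU -> (core, GCN revision).
# Data is copied from the post; any card not listed simply isn't covered.
GPU_CORES = {
    "R9 295X2": ("Hawaii x2",  "1.1"),
    "HD 7990":  ("Tahiti x2",  "1.0"),
    "R9 290X":  ("Hawaii",     "1.1"),
    "R9 290":   ("Hawaii",     "1.1"),
    "R9 280X":  ("Tahiti",     "1.0"),
    "R9 285":   ("Tonga",      "1.2"),
    "R9 280":   ("Tahiti",     "1.0"),
    "R9 270X":  ("Pitcairn",   "1.0"),
    "R9 270":   ("Pitcairn",   "1.0"),
    "R7 265":   ("Pitcairn",   "1.0"),
    "R7 260X":  ("Bonaire",    "1.1"),
    "R7 260":   ("Bonaire",    "1.1"),
    "R7 250X":  ("Cape Verde", "1.0"),
    "HD 7750":  ("Cape Verde", "1.0"),
    "R7 250":   ("Oland",      "1.1"),
    "R7 240":   ("Oland",      "1.1"),
}

def same_core(a: str, b: str) -> bool:
    """True if two SKUs share the same core family (ignoring dual-GPU 'x2')."""
    core_a = GPU_CORES[a][0].replace(" x2", "")
    core_b = GPU_CORES[b][0].replace(" x2", "")
    return core_a == core_b

print(same_core("R9 290", "R9 290X"))  # both Hawaii
print(same_core("R9 285", "R9 280"))   # Tonga vs Tahiti
```

handy for settling exactly the kind of "is the 280 a cut-down 290?" confusion from a few posts up (it isn't: different cores).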


----------



## BizzareRide

Quote:


> Originally Posted by *PontiacGTX*
> 
> what game is that one?
> i dont think that they could use 28nm with GM210/200 unless they want almost the same power consumption of GK110


It won't matter if it consumes as much, because it will be 40% faster than GK110/780 Ti; that justifies its power draw, based on 22-24 SMX cores at 128 ALUs each.


----------



## curly haired boy

Quote:


> Originally Posted by *incog*
> 
> Perhaps DAI is just a badly optimized game?
> 
> Cards at the level of 7970/770/780 should be able to tackle anything at 1080p, I've seen the footage DAI doesn't even look that great.


i agree about "should"... but given releases these days, i can't afford to skip out on power.


----------



## Swolern

I really hope the 380/390x completely demolish the 980. AMD is really falling behind and needs to make a move!


----------



## SlackerITGuy

So, for those who know who Logan from Tek Syndicate is, he mentioned on his latest video (best black friday deals or something along those lines) that the R9 390X is gonna be coming out "pretty soon" and that it has some "amazing specs".

*FWIW.*

Maybe he knows something we don't?... Or just going by the rumor mill..?

EDIT: Source: (embedded video)

Start it at ~9:20.


----------



## Noufel

AMD, what are you playing at? Announce the damn thing. If the Fijis or Bermudas are supposed to come out this soon, why not an official announcement? The Carrizo APUs are coming in Q1 2015 and yet they announced those this early, so why not the 380/390?


----------



## incog

Quote:


> Originally Posted by *PontiacGTX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *incog*
> 
> this does: https://farm8.staticflickr.com/7034/13423161775_bf81f80857_o.png
> 
> if you look at the cockpit with all the little wires, knobs and other things, realize that that's all 3D with very, very nice textures on them
> 
> some nice lighting effects and almost photo-realistic aircraft: http://i1364.photobucket.com/albums/r739/larry691/92sqn2_zps6a50b1f6.png
> .
> 
> 
> 
> what game is that one?
> Quote:
> 
> 
> 
> Originally Posted by *BizzareRide*
> 
> What worries me is if this is on 20nm and AMD can't completely destroy a midrange die on 28nm... I hope this is 28nm, cause if not then with a 390x at 20nm, AMD will only match Big Maxwell on 28nm... What happens when NVIDIA switches to 20nm or, like the rumors suggest, skip to 16nm?
> 
> 
> i dont think that they could use 28nm with GM210/200 unless they want almost the same power consumption of GK110

That's Il-2 Cliffs of Dover with Team Fusion's 4.312 update on it.

You can look at this video for actual footage, watch in 1080p: https://www.youtube.com/watch?v=KvtTiH3BaFg&feature=youtu.be&t=24m47s

I believe that he is using "only" a GTX 770.


----------



## Olivon

Quote:


> Originally Posted by *SlackerITGuy*
> 
> So, for those who know who Logan from Tek Syndicate
> Start it at: ~9:20.


I saw one video with this guy one time. The scores he gave were totally wrong, and I asked myself if he really did test the whole thing.
This guy wants clicks, not truth. So I simply don't trust him.


----------



## Orangey

Tek Syndicate pander to people who buy riced gear and love AMD / hate Intel/Nvidia for no reason.


----------



## Klocek001

Quote:


> Originally Posted by *maarten12100*
> 
> This looks more like the 390x considering how well it does. But on 20nm it could be the 380x. I think this is finally the real GCN update rather than revisions and that on itself is good news.


no way it's 390X with 96 ROPs and 4000+ shader units, look at the power consumption.


----------



## PureBlackFire

This is 28nm for sure.


----------



## gamervivek

I like seronx's take that this is the 370; the 380 would then be fuming at 300W while the 390x trundles in at 450W on a super big die.


----------



## Xuper

Why do people believe it's the 390x?


----------



## raghu78

Quote:


> Originally Posted by *maarten12100*
> 
> They can easily do it. A new architecture instead of revisions. This will be the first radical change since 2012. The go from high density 28nm to either lower density 28 or depleted SOI 28 or a 20nm node. All this will yield major improvement.
> 
> AMD can easily outshine nvidia's cards on 28nm considering maxwell will probably be the last. They will go head to head on 20 or advanced 28nm.


I have a hunch that AMD is using the GCN 2.0 flagship chip, which is a >500 sq mm chip, to power the R9 390X, R9 390 and R9 380X. Given that GlobalFoundries and AMD are doing their first massive GPU, I expect yields to be an issue. Nvidia and TSMC have been building big-die GPUs since the G80, aka the GeForce 8800GTX, in late 2006, whereas for GF and AMD it's a first attempt at a big-die GPU.

Moreover there are other technological advancements such as a state of the art high bandwidth memory system and a completely new packaging technology (2.5D stacking on silicon interposer). I expect yield challenges and therefore more salvage SKUs. In fact I would not be surprised if there is one more salvage SKU named R9 380 using the same chip.

R9 390X - 4096 sp, 4 shader engines, 4 raster engines , 64 ROPs, 4GB HBM
R9 390 - 3584 sp, 4 shader engines, 4 raster engines , 64 ROPs, 4GB HBM
R9 380X - 3072 sp, 4 shader engines, 4 raster engines , 64 ROPs, 4GB HBM
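the rumored ladder above is just successive compute-unit cuts off one die. A quick back-of-envelope check (all SKU shader counts are from the speculation above, not confirmed specs; 64 shaders per CU is standard GCN):

```python
# Sanity-check the rumored salvage ladder from the post above.
# All shader counts are rumors, not confirmed specs.
FULL_SP = 4096  # rumored full die

rumored = {"R9 390X": 4096, "R9 390": 3584, "R9 380X": 3072}

for sku, sp in rumored.items():
    cus = sp // 64  # GCN packs 64 shaders per compute unit
    print(f"{sku}: {sp} sp = {cus} CUs = {sp / FULL_SP:.1%} of full die")
```

so the rumored 390 and 380X would be 87.5% and 75% cuts of the full die, which is a plausible salvage spread on a low-yield process.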


----------



## junkman

Quote:


> Originally Posted by *Xuper*
> 
> Why does people believe it's 390x?


It seems unlikely, but this whole thing is a big rumor mill. Nothing wrong with wondering what it "could" be.

Personally, I believe we need to wait until the official announcement. Most engineering samples I've worked with are pretty gimped, so you can't really come to strong conclusions about this card's power consumption/performance based solely on these graphs.


----------



## maarten12100

Quote:


> Originally Posted by *Klocek001*
> 
> no way it's 390X with 96 ROPs and 4000+ shader units, look at the power consumption.


An early sample really says nothing. You can undervolt the 290 to 170W while keeping a steady 800MHz.
Quote:


> Originally Posted by *raghu78*
> 
> I have a hunch that AMD is using the GCN 2.0 flagship chip which is > 500 sq mm chip to power R9 390X, R9 390 and R9 380X. Given that Globalfoundries and AMD are doing their first massive GPU I expect yields to be an issue. Nvidia and TSMC have been building big die GPUs from the G80 aka Geforce 8800GTX in late 2006 whereas for GF and AMD its a first attempt at a big die GPU .
> 
> Moreover there are other technological advancements such as a state of the art high bandwidth memory system and a completely new packaging technology (2.5D stacking on silicon interposer). I expect yield challenges and therefore more salvage SKUs. In fact I would not be surprised if there is one more salvage SKU named R9 380 using the same chip.
> 
> R9 390X - 4096 sp, 4 shader engines, 4 raster engines , 64 ROPs, 4GB HBM
> R9 390 - 3584 sp, 4 shader engines, 4 raster engines , 64 ROPs, 4GB HBM
> R9 380X - 3072 sp, 4 shader engines, 4 raster engines , 64 ROPs, 4GB HBM


GlobalFoundries may have under-delivered in the past, but the last couple of years are a different story: they have nodes more power-efficient than TSMC's. The question is whether they can be scaled to a die this size.

If they do a 7950/7970 approach, I hope they don't lock the salvage parts shut or change the PCB, so we have a chance of unlocking that goodness at the cost of some extra power over the binned models. I love free performance.


----------



## Orangey

Global Foundries was AMD's fab business a few years ago. Also another ex-ATI top AMD guy has just leapt over to Global:

http://fudzilla.net/home/item/36433-globalfoundries-got-ex-amd-svp-of-operations

I do not think they will have too many problems with a big GPU other than the baseline teething of a new process.


----------



## raghu78

Quote:


> Originally Posted by *maarten12100*
> 
> A early sample it really says nothing. You can undervolt the 290 to 170W while keeping a steady 800MHz. Globalfoundries may have under-delivered in the past. The last couple of years are a different story they have nodes more power efficient than TSMC the question is can it be scaled to size.
> 
> If they do a 7950-7970 approach I do hope they don't lock them shut or apply changes to the pcb so we have a chance of unlocking that goodness at the cost of some extra power over the binned models. I love free performance.


The yield challenge means there are not going to be any unlockable salvage SKUs. The biggest die GF has manufactured for AMD is the semi-custom game console chip at roughly 350 sq mm, so the >500 sq mm die is a huge challenge for GF and AMD. Add the 2.5D packaging complexity and brand-new HBM tech, and there is a high possibility of more salvage SKUs. This chip is easily going to be 7.5-8 billion transistors, a massive amount, and even on a mature node AMD has a lot of room to improve yields by salvaging more faulty dies.


----------



## Seronx

One of the upcoming AMD dGPUs has been taped out on GlobalFoundries' 28nm FDSOI process. Not sure, if it will hit production but it has happened.


----------



## kingduqc

Quote:


> Originally Posted by *Orangey*
> 
> Tek Syndicate pander to people who buy riced gear and love AMD / hate Intel/Nvidia for no reason.


I'm pretty sure it's price/performance that they like, not a brand.


----------



## azanimefan

Quote:


> Originally Posted by *Orangey*
> 
> Tek Syndicate pander to people who buy riced gear and love AMD / hate Intel/Nvidia for no reason.


i've got no dog in this hunt, but i do find him amusing from time to time. that said, his own system is a 6 or 8 core haswell-e with SLI'd 780ti gpus. he does have people in his little group (mostly that girl "Pistol") who use AMD.

might want to watch his video on AMD vs Intel again, because you clearly missed what he was saying in it. he was saying AMD doesn't suck, that it's pretty good... and that in certain situations (such as playing crysis 3 while streaming the video of your session) it can hold its own against top end intels. he wasn't saying fx 8-cores were better than intel, simply that they're a legit option for a gamer and shouldn't be ignored or dismissed as junk.

i happen to agree with him. while it was budget and circumstance that led me to an AMD based system at the moment, i don't feel the need for an intel yet. so i'd say this fx 8-core is an adequate gaming platform. that said, i'm looking forward to building my next system... that 8c/16t haswell-e. my experience with 4c/8t haswell cpus has shown me that, for my needs, i'd probably need to step up past a quad core intel to see a benefit from leaving this amd cpu.

he also was one of the first reviewers to talk about the heat problems on the r9-290/290x with the default cooler.

As for his gear love, he loves price/performance, and he is a legit audiophile, so out of all of the reviewers out there his is the one opinion you shouldn't dismiss on audio equipment.


----------



## havocv3

Quote:


> Originally Posted by *Orangey*
> 
> Tek Syndicate pander to people who buy riced gear and love AMD / hate Intel/Nvidia for no reason.


Yup. That's why he keeps building Intel/Nvidia rigs, because he hates them for no reason.

Pretty sure he just hates Intel's anti-competitive practices and their rabid fanboys.


----------



## PostalTwinkie

Quote:


> Originally Posted by *kingduqc*
> 
> Im preaty sure its price/performence that they like not a brand.


This.

Which makes it look like they favor AMD, but in reality Tek has typically just gone after the best price/perf metric. They are right as well in most situations, AMD has for a long time been the king of the performance per dollar.

Heck, I've seen half a dozen deals on the 290X this week for $250. Not the 290, but the 290X for $250, plus freebies.


----------



## incog

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kingduqc*
> 
> Im preaty sure its price/performence that they like not a brand.
> 
> 
> 
> This.
> 
> Which makes it look like they favor AMD, but in reality Tek has typically just gone after the best price/perf metric. They are right as well in most situations, AMD has for a long time been the king of the performance per dollar.
> 
> Heck, I seen half a dozen deals on 290X this week for $250. Not the 290, but the 290X for $250, plus had freebies.

Prices in the USA are criminally low.


----------



## Orangey

Quote:


> Originally Posted by *Seronx*
> 
> One of the upcoming AMD dGPUs has been taped out on GlobalFoundries' 28nm FDSOI process. Not sure, if it will hit production but it has happened.


I knew it


----------



## hyp36rmax

Quote:


> Originally Posted by *Seronx*
> 
> One of the upcoming AMD dGPUs has been taped out on GlobalFoundries' 28nm FDSOI process. Not sure, if it will hit production but it has happened.


Source?

Quote:


> Originally Posted by *Orangey*
> 
> I knew it


No source in the above response, so it's just as good as WCCFTech.


----------



## Seronx

Quote:


> Originally Posted by *hyp36rmax*
> 
> Source?


Can not post it yet from a legitimate source. It should pop up soon, hopefully not from WCCFTech, but eetimes or digitimes, etc.

Synapse Design is converting, with client approval, 28nm HPM designs to the 28nm FDSOI node. One of AMD's GPU projects was among the approved designs to switch over.

http://www.synapse-da.com/Industry-Sectors/Multimedia/Apps-processors
Quote:


> Instrumental in the Successful Tapeout of the First 28FDSOI Apps Processor for our Client


Do not know which one it is referring to. Apps processor is not a GPU processor, but something like a SoC. It is most likely the STi8K from STMicroelectronics.

There is a lot of upcoming 28FD tapeouts coming from Synapse + Others. Post-post-post-post-Edit; *Too many FDSOIs*.

*Soon; Anytime from now to forever.

--
Carrizo is also an Apps Processor, but it was definitely outed on "GF28A" from India SoCtronics;;


---
28FDSOI Flavors at GlobalFoundries;
Advanced; Maximum + S/D Straining + More dense SRAM. // Perf = GF20LPM - Cost = GF28HPP
Maximum; Minimum + Back-biasing + More dense SRAM + Tuned VT flavors. // Perf = GF28SHP - Cost = GF28SLP
Minimum; No back biasing + Footprint of 28SLP + Lower Vmin/Higher Vmax. // Perf = GF28HPP - Cost = GF28LPS

^-- Will be displayed 1H of 2015 on GlobalFoundries website.


----------



## Testier

Quote:


> Originally Posted by *Seronx*
> 
> Can not post it yet from a legitimate source.


So as reliable as WCCFTech.


----------



## i7monkey

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Why do u need big Maxwell? *GTX980 not enough?*


owned the 780ti and 980. the 980 is a great proof of concept but it's a total sidegrade.


----------



## GrimDoctor

Quote:


> Originally Posted by *i7monkey*
> 
> owned the 780ti and 980. the 980 is a great proof of concept but it's a total sidegrade.


Sidegrade?


----------



## nleksan

I am skipping the 980 personally, as I had SLI 780Ti Classy Kingpins (sold before the huge price drops) / CFX 290X Lightnings (sold because they were stomped by the KPEs, tested in the same machine, all using full-coverage copper blocks (+backplates), with an adequate amount of rad space lol (2x Monsta 560s + 2x UT60 560s + 2x GTX480s, all in push-pull with 140mm Bgears Blasters/Phanteks high-static-pressure and Koolance 2800rpm 120x38mm fans, with XSPC gaskets and Phobya 7mm and a few 20mm shrouds), and at the highest possible stable overclocks).
While most 780Ti won't run 1492/8200, neither will most 290X run 1340/6600 (of the friends I have who own 290/X cards, I'd say even 1200 for a non-Lightning is pretty well up there).

I passed on the Titans, but looking back they were actually a worthwhile purchase, in the same way that my X79 system has actually saved me money over a mainstream 1155/50: it stays "relevant" for far longer. Of course, I would have been mad when the Titan Black released, then the 780Ti, but their existence doesn't make the regular Titan, esp. with a custom BIOS, any less powerful...

I wasn't impressed with the 290X's myself... My 7970 Lightnings both died incredibly quickly (@ stock volts), and the 680 Lightnings they gave me instead on request were a fair amount faster and infinitely better for multi-GPU use (the 7970s = migraines from stuttering)... And AMD marketing has actively made me feel strongly against supporting them.

So, GM210 it is.

(also, the 980s ARE NOT "midrange cards"; the silicon may be the "intended mid-range GPU", but for as long as they exist as the fastest Nvidia chips, let alone the fastest GPUs period, they are not midrange...)


----------



## hyp36rmax

Quote:


> Originally Posted by *GrimDoctor*
> 
> Sidegrade?


In total performance, yes; in power consumption, no. If coming from a GTX 780Ti, then yes, a total sidegrade.


----------



## GrimDoctor

Quote:


> Originally Posted by *hyp36rmax*
> 
> In total performance yes, in power consumption no, if coming from a GTX 780Ti then yes Total Sidegrade.


So you mean quality?


----------



## Awsan

But will it run Unity?


----------



## i7monkey

Quote:


> Originally Posted by *GrimDoctor*
> 
> So your meaning quality?


Frames per second they're very similar.

Power consumption the 980 is much better. Acoustics are better too. Like I said, the 980 is a great proof-of-concept but performance-wise it's very similar to the 780Ti, so you're not moving up, just to the side, hence "sidegrade".


----------



## GrimDoctor

Quote:


> Originally Posted by *i7monkey*
> 
> Frames per second they're very similar.
> 
> Power consumption the 980 is much better. Acoustics are better too. Like I said, the 980 is a great proof-of-concept but performance-wise it's very similar to the 780Ti, so you're not moving up, just to the side, hence "sidegrade".


Awesome, thanks for the explanation.

I have 2 x 760s at the moment and to be honest they aren't even holding up that well with something like DAI. Would you move up to the current gen or wait for the 380/390 or whatever the Nvidia "retaliation" is?


----------



## Noufel

Quote:


> Originally Posted by *GrimDoctor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *i7monkey*
> 
> Frames per second they're very similar.
> 
> Power consumption the 980 is much better. Acoustics are better too. Like I said, the 980 is a great proof-of-concept but performance-wise it's very similar to the 780Ti, so you're not moving up, just to the side, hence "sidegrade".
> 
> 
> 
> Awesome, thanks for the explanation.
> 
> I have 2 x 760s at the moment and to be honest they aren't even holding up that well with something like DAI. Would you move up to the current or wait out for the 380/390 or whatever the Nvidia "retaliation" is?

Wait a month or two until the confirmed specs and price of the 380x come out.


----------



## electro2u

Quote:


> Originally Posted by *GrimDoctor*
> 
> Awesome, thanks for the explanation.
> 
> I have 2 x 760s at the moment and to be honest they aren't even holding up that well with something like DAI. Would you move up to the current or wait out for the 380/390 or whatever the Nvidia "retaliation" is?


Whatever it is, the NVIDIA retaliation you speak of is almost certainly going to be expensive. Likewise I expect AMD (particularly if they have any sort of performance advantage) will want quite a bit of moola for the 380x/390x. The baby Maxwell chips Nvidia is killing AMD with right now are probably going to stay about where they are in terms of pricing. That is how I see things.

The 970/980 have no real competition at the same price point or performance level. The 970 has the price segment nailed, and the 980 has the performance segment nailed. Not only that, but to make things worse, the 980 isn't a whole lot faster than the already-fast 970. The moment the 380x is released, big Maxwell cards will hit the market. It doesn't matter how much AMD wants for the 380/390 (or how fast they are, either); Nvidia will not have any incentive or reason to adjust baby Maxwell pricing.

What's my point? You can go ahead and buy a 970 (or 2) and not worry that they will devalue so much. A little--but not like the 290/290x did. Speaking of devaluing: I'm a moron--I bought a 295x2 for $1500.


----------



## orick

Hmm. The $250 range is supposed to be the magical consumer sweet spot, and neither company is planning to release anything new there?


----------



## electro2u

Quote:


> Originally Posted by *orick*
> 
> Hmm. The $250 range is supposed to be the magical consumer sweet point and neither company is planing to release anything new there?


Oh they will. But it will just be repackaged old stuff. The 290x is hitting that price point already... People are going to be paying $250 to have a card that ISN'T so fast and hot.


----------



## Orangey

Ever since they figured out they could charge $1000 and people would still buy it in droves, the once-lucrative low-end has been almost completely abandoned.

GM200 will be $1500.


----------



## CasualCat

Quote:


> Originally Posted by *Orangey*
> 
> GM200 will be $1500.


----------



## orick

$250 isn't low end, it's mainstream. If you are talking about the $100-and-below low end cards, then those are kinda useless with the APUs and integrated GPUs. They do need to put out the 960s and 370s for the holiday season to make their Q4 profits.


----------



## Crockturtle566

Quote:


> Originally Posted by *orick*
> 
> $250 isn't low end, it's mainstream. If you are talking about the $100 and below low end cards, then those are kinda useless with the APUs and integrated GPUs. They do need to put out the 960s and 370s for the holiday season to make their profits for Q4


For ocn standards it's low end


----------



## Shogon

Quote:


> Originally Posted by *Crockturtle566*
> 
> For ocn standards it's low end


We are an extreme minority, you know; if anything, our standards don't apply to the real world. Considering the 290/290x is in that price range, though, I'd hardly call the sub-$200 market low end, not now when you can get 780ti performance for under $300.


----------



## Crockturtle566

Quote:


> Originally Posted by *Shogon*
> 
> We are an extreme minority you know, If anything our standards don't apply to the real world. Considering the 290/290x is in that price range though I'd hardly call the sub $200 market low end, not now when you can get 780ti performance for under $300.


I'm completely aware that we are an extreme minority. /off topic I'm actually trying to make more posts so I can get more comfortable with selection. /end of off topic


----------



## crazycrave

I have to agree that the 290x is making the GTX 970 a hard sell with its price falling... the GTX 980 has to deal with the roof falling in, as the 295X2 can be had for $650... I think Nvidia is in a tougher spot right now than a month ago, plus the magic card is so close now...


----------



## mechris

It's probably too good to be true, BUT. . . .

I see the "Captain Jack" moniker and the immediate thought association is pirates, ergo Treasure Islands. Bermuda is not out of the question, though the actual pirate history associated with that island is minimal. Wikipedia also indicates that there was a captain of King Kamehameha's fleet dubbed Captain Jack, so the Maui GPU isn't out of the question, either. All in all, though, based only on the code name they're using, the Treasure Islands GPU seems like the strongest candidate here (unless, of course, they're playing games a bit, such as Captain Jack → Captain Jack Sparrow → Pirates of the Caribbean → Bermuda is in the Caribbean).

But, that'd probably be too good to be true.


----------



## hyp36rmax

Quote:


> Originally Posted by *mechris*
> 
> It's probably too good to be true, BUT. . . .
> 
> I see the "Captain Jack" moniker and the immediate thought association is pirates, ergo Treasure Islands. Bermuda is not out of the question, thouh the actual pirate history associated with that island is minimal. Wikipedia also indicates that there was a captain of King Kamehameha's fleet dubbed Captain Jack, so the Maui GPU isn't out of the question, either. All in all, though, based only on the code name they're using, the Treasure Islands GPU seems like the strongest candidate here (unless, of course, they're playing games a bit, such as Captain Jack → Captain Jack Sparrow → Pirates of the Carribean → Bermuda is in the Carribean).
> 
> But, that'd probably be too good to be true.


This is the most plausible hypothesis in this whole thread regarding the naming hints hahaha. Just sounds more fun.


----------



## Seronx

I did a geography thing. Unless AMD is not into geography... then oh well.



Tahiti -> Tonga -> Fiji

This could be a trick to make us think that Fiji is the Pirate Islands part with 32 CUs.

Treasure Island is most likely a reference to Tobago of Trinidad and Tobago.
"Captain Jack" is most likely referencing Calico Jack, who went to Bermuda first?

I'm sticking to:
Fiji - Enthusiast - R9 390*
Maui - Performance - R9 380*
Bermuda - Mainstream - R9 370*
Treasure/Tobago - Value - R7 360*

Tobago is most likely the replacement for Curacao (a Pitcairn rebrand) and Bonaire, which are also in the Caribbean.


----------



## Adglu

I'm pretty sure Treasure Island is referring to the Treasure Island in the Mamanuca archipelago near the west coast of Fiji.

Also, Maui is more of a VI (Volcanic Islands) codename since it's a Hawaiian island; my guess is that it was originally the 280X but was canned and replaced with Tahiti.


----------



## Seronx

Quote:


> Originally Posted by *Adglu*
> 
> I'm pretty sure Treasure Island is referring to...


http://en.wikipedia.org/wiki/Isla_de_la_Juventud
Quote:


> the island would also come to be known as Isla de Cotorras ("Isle of Parrots") and Isla de Tesoros ("Treasure Island") at various points in its history.


It could be this as well.
Quote:


> Originally Posted by *Adglu*
> 
> Also, Maui is more of a VI (Volcanic Islands) codename since it's a Hawaiian island; my guess is that it was originally the 280X but was canned and replaced with Tahiti.


Fiji is part of the Ring of Fire, so it is a volcanic island group.
Bermuda was formed from an Atlantic volcano.
Tobago and the Isle of Youth were both formed from volcanoes.


----------



## Wishmaker

AMD giving geography lessons


----------



## maarten12100

Quote:


> Originally Posted by *Wishmaker*
> 
> AMD giving geography lessons


Note how all their current products are named after places and rivers. The APU after Carrizo could be named Ebola, since it is a river.


----------



## EngageTheSun

Quote:


> Originally Posted by *astrallite*
> 
> http://www.chiphell.com/thread-1182382-1-1.html
> Not sure if already posted. This is from the same group that first leaked GM204 benchmarks.


Exactly 10FPS more than a GTX 980? Right on the dot? I'm starting to become skeptical...


----------



## EngageTheSun

Quote:


> Originally Posted by *astrallite*
> 
> They were, their leak was a Tomb Raider benchmark that showed the stock 780Ti had a slight advantage over the GTX 980 although I believe after patches the difference was reversed.


Seems strange that they would benchmark an AMD-optimized game with a new AMD card.


----------



## Orangey

Quote:


> Originally Posted by *EngageTheSun*
> 
> Exactly 10FPS more than a GTX 980? Right on the dot? I'm starting to become skeptical...


9


----------



## Kuivamaa

Quote:


> Originally Posted by *EngageTheSun*
> 
> Seems strange that they would benchmark a AMD optimized game with a new AMD card.


I think they benched a bunch of games, around 23, including plenty of Nvidia-friendly ones (from Watch Dogs and Far Cry 4 to WoW). Anyway, their leaks are usually fairly accurate.


----------



## hyp36rmax

Quote:


> Originally Posted by *Kuivamaa*
> 
> I think they benched a bunch of games, around 23, including plenty of Nvidia-friendly ones (from Watch Dogs and Far Cry 4 to WoW). Anyway, their leaks are usually fairly accurate.


I agree, it looks like an average across all those games. It looks fairly positive, especially for an R9 380X. *hoping*


----------



## PontiacGTX

Quote:


> Originally Posted by *Seronx*
> 
> http://en.wikipedia.org/wiki/Isla_de_la_Juventud
> It could be this as well.
> Fiji is part of the Ring of Fire, so it is a volcanic set of Islands.
> Bermuda Island was formed from an Atlantic Volcano.
> Tobago/Isle of Youth were both formed from volcanos.


Why did you change your mind? Earlier you said Bermuda would be the high-end GPU and Fiji the second.


----------



## szeged

I hope this bench is 100% true: A, because... I want one... B, because... Nvidia will make a move right after.


----------



## Orangey

It's interesting that the SIGGRAPH paper said end of 2014 for GM200. Wouldn't that mean that, at the very least, it was on 28nm, as nothing else would've been ready in time?

So perhaps they're just waiting, quietly panicking at the thought of a respin.


----------



## szeged

if we get gm200 before 2015 i will give away my entire rig free. quote me on this and hold me to it.


----------



## SlackerITGuy

Quote:


> Originally Posted by *szeged*
> 
> if we get gm200 before 2015 i will give away my entire rig free. quote me on this and hold me to it.


Yeah, there's absolutely no way it's coming out this year.

You're good.


----------



## Seronx

Quote:


> Originally Posted by *PontiacGTX*
> 
> why did you change your mind that Bermuda Wouldnt be the high end gpus and Fiji the 2nd?


Drivers and leaks changed my mind.


----------



## curly haired boy

Quote:


> Originally Posted by *szeged*
> 
> if we get gm200 before 2015 i will give away my entire rig free. quote me on this and hold me to it.


just as long as it's before february...


----------



## DannyDK

When will the new line of R9 3xx come out?


----------



## delboy67

Quote:


> Originally Posted by *DannyDK*
> 
> When will the new line of R9 3xx come out?


All we've got is 1H 2015.


----------



## raghu78

Quote:


> Originally Posted by *delboy67*
> 
> All we've got is 1h 2015


more likely Feb 2015.

http://www.thinkcomputers.org/amd-to-launch-radeon-r9-380x-in-february/
http://www.redgamingtech.com/amd-r9-380x-february-r9-390x-370x-announced/


----------



## CasualCat

Quote:


> Originally Posted by *szeged*
> 
> if we get gm200 before 2015 i will give away my entire rig free. quote me on this and hold me to it.


Can I go ahead and call dibs just in case?









edit: also, those graphs are approaching the performance I'm looking for in a single GPU (wish we could see individual bench results); it just sucks that apparently the best we can hope for is the 380X in February, with no news yet of the 390X/GM200


----------



## Orangey

Quote:


> Originally Posted by *raghu78*
> 
> more likely Feb 2015.
> 
> http://www.thinkcomputers.org/amd-to-launch-radeon-r9-380x-in-february/
> http://www.redgamingtech.com/amd-r9-380x-february-r9-390x-370x-announced/


Why do you believe those no-name sites?

They're talking rubbish.


----------



## hyp36rmax

Quote:


> Originally Posted by *Orangey*
> 
> Why do you believe those no-name sites?
> 
> They're talking rubbish.


Thinkcomputers.org is more plausible than the WCCFTECH you seem to believe... Never heard of RedGamingTech before; they probably sourced Thinkcomputers.org, as the image and content are the same the following day, and *both* link back to 4gamer.net, which is a popular Japanese site.


----------



## PontiacGTX

Quote:


> Originally Posted by *Orangey*
> 
> Why do you believe those no-name sites?
> 
> They're talking rubbish.


Those were early rumours; maybe they aren't true. It remains to be seen: using 28nm could make the release date sooner than waiting for 14/16nm in Q4 2015.


----------



## Orangey

Quote:


> Originally Posted by *hyp36rmax*
> 
> WCCFTECH that you seem to believe


I've been nothing but critical of wccf.


----------



## hyp36rmax

Quote:


> Originally Posted by *Orangey*
> 
> I've been nothing but critical of wccf.


Must be a misunderstanding on my part, based on your post replying to Seronx (who seems to believe WCCFTECH, going by his prior post). In that case, nothing to see here, moving along


----------



## hellojustinr

:0


----------



## PostalTwinkie

Quote:


> Originally Posted by *crazycrave*
> 
> I have to agree that the 290X is making the GTX 970 a hard sell with its price falling. The GTX 980 has to deal with the roof falling in, as the 295X2 can be had for $650. I think Nvidia is in a tougher spot right now than a month ago, plus the magic card is so close now.


The few deals on the 290X I've seen for $250 are very compelling. Part of me likes to say I would still take the 970 over the 290X, but really... if it was a choice between the 970 at $300 and the 290X at $250, the 290X wins.

The real question is: how sustainable is that $250 on a 290X? My answer would be not at all; it is simply a fire sale.


----------



## Seronx

Flinging guesses/speculation out for Pirate Islands:

2015 SKUs (1H 2015 launch)

- Performance SKU -
28nm FDSOI
~8+ billion transistors
<1.1 GHz
<300 watt TDP

- Mainstream SKU -
28nm FDSOI
~6 billion transistors
<1.2 GHz
<200 watt TDP

2016 SKUs (late 2H 2015 launch)

- Enthusiast SKU -
20nm FDSOI
~12+ billion transistors
<1.1 GHz
<300 watt TDP

- Value SKU -
20nm FDSOI
~3+ billion transistors
>1 GHz
<100 watt TDP


----------



## hellojustinr

Y'all are forgetting the fact that the 290X needs an aftermarket cooler on top to run at its full potential, while the 970 is a ready-to-use out-of-the-box GPU with near 780 Ti performance.

A 290X at $280 + a $100 decent aftermarket cooler is more or less the same as the 970's price (290X = 980 in performance, but still).

Nonetheless, yes, the 290X at $280 is still a steal considering it was $600 just a few months ago.


----------



## Redeemer

Quote:


> Originally Posted by *hellojustinr*
> 
> Y'all are forgetting the fact that the 290X needs an aftermarket cooler on top to run at its full potential, while the 970 is a ready-to-use out-of-the-box GPU with near 780 Ti performance.
> 
> A 290X at $280 + a $100 decent aftermarket cooler is more or less the same as the 970's price (290X = 980 in performance, but still).
> 
> Nonetheless, yes, the 290X at $280 is still a steal considering it was $600 just a few months ago.


You're confused; you can buy a 290 with aftermarket cooling for under $300, and it comes with free games. It takes a 970 @ 1500 MHz to match a 290 @ 1200 MHz. The 970 is not really a big deal, especially if you already own a GK110 variant, though it obviously runs cooler and draws less power. Right now the 290 is the best deal out there, no doubt about that.


----------



## szeged

970 also comes with free games, and most of them do 1600+ with ease.


----------



## hellojustinr

Quote:


> Originally Posted by *szeged*
> 
> 970 also comes with free games, and most of them do 1600+ with ease.


Even with liquid cooling this thing still runs hot.

The 970 is a big deal, as it pushes 780 Ti level performance into the $350 bracket, thus forcing AMD to sell their unfinished 290X into the $280 bracket.

Not saying the 290X sucks; it is a really good card.

I have a 290X as well (three, actually) and love it, but I do know its downfalls.


----------



## szeged

The 290 and 290X don't suck, not even remotely. The only thing that sucks is how cool they have to be kept to do well. At least aftermarket coolers get the job done moderately well, and waterblocks eliminate the problem almost completely, but then you're back into the $370+ range after that.


----------



## hellojustinr

Quote:


> Originally Posted by *szeged*
> 
> The 290 and 290X don't suck, not even remotely. The only thing that sucks is how cool they have to be kept to do well. At least aftermarket coolers get the job done moderately well, and waterblocks eliminate the problem almost completely, but then you're back into the $370+ range after that.


Never said they suck, but yes, you'll be in the $370 range after everything. Plus the variables of installing a liquid cooling setup, that VRM heatsink you need, and black-screen possibilities on some affected 290Xs (there is a whole thread about it).

Anyone who's serious about using the 290X will have already installed, or plans to install, a liquid cooling setup; that's how inefficient the architecture is on this thing, it practically necessitates liquid cooling. Go on the 290/290X owners' page; almost three fifths have a liquid setup installed.

Sometimes I wish I had waited and just bought the 970, with all the issues the 290 brought me; I had to invest in a Hybrid II liquid cooler + VRM heatsinks not just to keep it quiet but also to have it run at full potential.

Nothing I can do now, but I do know inside that I could've gotten a much better deal for my money.

Not bashing the 290X (check my sig), but I know the 290's shortcomings and why the 390 will be so much better for people who waited.


----------



## mtcn77

Quote:


> Originally Posted by *szeged*
> 
> 970 also comes with free games, and *most of them do 1600+ with ease*.











Considering 4K, based on direct performance comparisons in commendable reviews, and even discounting non-linear overclock scaling, it should take 1610 MHz out of a 970 to get within 7% of a 290X at 1.2 GHz. I have yet to see one 970 review that does just that, let alone prevail at 4K.
Maxwell is an efficiency gizmo, but those extra shaders in Hawaii are not without merit with some applied sample (MS & SS), analytical (SM & FX), or bias (CS & EQ) antialiasing. The trend that AMD-ATi GPUs offer better value at higher-fidelity filtering is still the rule, not the exception. I used to have fun with Nvidia when new drivers were commended for highlighting performance a tier below the best visual options - so much for the PC master race.
Reference AMD-ATi designs have always favoured overclockers with those standard digital VRMs, too. Even though I grant that it is difficult to cool those tiny chips with their high heat flux, keep the heat under control and the reference designs are literally the best overclockers, with the cleanest power delivery any custom model can only try to copy, going by my experience from three generations back.


----------



## szeged

Sorry, I'm not getting into it with you again; no matter what anyone says or does or posts, you will be a die-hard AMD fangirl that refuses to see any truth in anything. Good day.


----------



## Testier

Quote:


> Originally Posted by *szeged*
> 
> if we get gm200 before 2015 i will give away my entire rig free. quote me on this and hold me to it.


Offtopic, but I've always wanted to ask: what is your profile pic? A man pretending to be a bird?????
Quote:


> Originally Posted by *mtcn77*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Considering 4K, based on direct performance comparisons in commendable reviews, and even discounting non-linear overclock scaling, it should take 1610 MHz out of a 970 to get within 7% of a 290X at 1.2 GHz. I have yet to see one 970 review that does just that, let alone prevail at 4K.
> Maxwell is an efficiency gizmo, but those extra shaders found in Hawaii are not without merit with some applied sample (MS & SS) / analytical (SM & FX) / bias (CS & EQ) antialiasing. The trend that AMD-ATi gpus offer better value at higher fidelity filtering is still the rule and not the exception. I used to have fun with Nvidia when new drivers were commended to highlight the performance a tier below the best - so much for the PC master race.
> Reference AMD-ATi designs have always favoured overclockers with those standard digital vrm's, too. Eventhough, I assent that it is difficult cooling those tiny chips with high heat flux, you keep the heat under control and the reference designs are literally the best overclockers due to the least parasitic power signal any custom model can actually copy, according to my 3 generations back experience.


Considering that I am actually looking into 4K, it seems 3x 970 is the best option. Even granting that 3x 290X/290 = 3x 970 in performance, I still have the problem of power draw and cooling. With 3x 970s I am leaning towards reference cooling; with the 290X/290 I believe a custom waterblock is kind of a must. With the kind of power an overclocked 290X/290 pulls, plus an overclocked 5960X, I'm probably looking at a 1500W PSU? It's... unrealistic for me to run 3x 290X/290, and tbh the pricing on 970s is quite reasonable. I get an AAA game plus better temps/power for an extra 100 bucks; that seems well worth the deal. At 4K I doubt I need serious AA, if any, considering I am planning on a 24-inch 4K monitor. Now, I am not sure about 4K scaling, Hawaii vs Maxwell. I am limited to 1000W of playing room.
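Here's the rough budget math behind that, as a sketch; the wattage figures are ballpark assumptions (CPU, GPU and "rest of system" draws pulled from typical review numbers), not measurements:

```python
# Rough tri-GPU power-budget check. The board-power figures below are
# ballpark assumptions taken from typical reviews, not measurements.

def system_draw(gpu_watts, n_gpus, cpu_watts=250, rest_watts=100):
    """Estimated full-load draw: GPUs + overclocked CPU + board/drives/fans."""
    return gpu_watts * n_gpus + cpu_watts + rest_watts

PSU_LIMIT = 1000  # watts of playing room

for name, watts in [("GTX 970 (stock-ish)", 170), ("R9 290X (OC)", 300)]:
    total = system_draw(watts, n_gpus=3)
    verdict = "fits" if total <= PSU_LIMIT else "over budget"
    print(f"3x {name}: ~{total} W -> {verdict}")
```

Under those assumptions triple 970s fit comfortably in 1000W, while triple overclocked 290Xs blow past it.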


----------



## szeged

its a manicorn duh


----------



## DarkBlade6

I'd bet my ass this thing is just a Tahiti chip on a better 28nm process (FD-SOI?) with all the new features from Tonga. Nothing to get excited about here; there's not a single foundry ready to mass-produce 20/16/14nm *GPUs*, so the performance gain is not coming from a better manufacturing process/smaller node. On the other hand, they have headroom to make a bigger die on 28nm, but I really doubt they would design a big die: 1. it's AMD, and 2. the "rumored" power consumption.


----------



## mtcn77

Quote:


> Originally Posted by *Testier*
> 
> Offtopic, but always wanted to ask. What is as your profile pic? A man pretending to be a bird?????
> Considering the fact I am actually looking into 4k, it seems 970 x 3 is best option. Even providing 290x/290 x 3=970 x 3, I still have the problem of power draw and cooling. With 3 x 970s, I am leaning towards reference cooling, with 290x/290 I believe custom WB is kinda must. With the kind of power 290x/290 OC pulls and a 5960x OC, probably looking at 1500w PSU? Its.... unrealistic for me to run 3 x 290x/290 and tbh, the pricing for 970s quite reasonable. I get an AAA game and temp/power for an extra 100 bucks, it seems to be to be well worth the deal. At 4k, I am doubting I need very serious AA if any considering I am planning on a 24 inch 4k monitor. Now, I am not sure on 4k scaling hawaii vs maxwell. I am limited to a 1000w of playing room.


It feels nice when this audit of ht4u's direct comparison between the MSI 970 and the 290X@1.2GHz has findings pointing out that:

At 4K + 2/4/8x MSAA the 290X is 12% ahead of a 1342 MHz 970,
At 4K + 0/2/4x SSAA the 290X is 4% ahead of a 1342 MHz 970.
Collecting these findings, a 970 has to hit either 1816 MHz (lol) to match the 290X@1.2GHz at 4K with MSAA, or 1676 MHz to call it even with no AA and SSAA.



Spoiler: taken from ht4u direct hardware comparison
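For reference, here is the naive linear-scaling arithmetic behind clock targets like these, as a sketch. It deliberately assumes performance scales linearly with core clock, which overclock scaling does not (it is sub-linear), so it only gives optimistic lower bounds well below the figures quoted above:

```python
# Naive required-clock estimate assuming performance scales linearly with
# core clock. This is an optimistic lower bound: real overclock scaling is
# sub-linear (memory bandwidth, power limits), which is why quoted targets
# like 1816/1676 MHz land well above these numbers.

def required_clock(base_mhz, deficit):
    """Clock needed to close a fractional performance deficit, linear model."""
    return base_mhz * (1 + deficit)

base = 1342  # MHz, the tested 970 clock
print(f"MSAA gap (12%): >= {required_clock(base, 0.12):.0f} MHz")
print(f"SSAA gap (4%):  >= {required_clock(base, 0.04):.0f} MHz")
```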


----------



## szeged

and the fanboys start cherry picking reviews, time to exit the thread, it just got worthless in here.


----------



## kingduqc

Quote:


> Originally Posted by *szeged*
> 
> 970 also comes with free games, *and most of them do 1600+ with ease*.


You simply know this isn't true. I've read just about every review, and most of them do 1450-1500-ish. Doing 100 MHz over that "with ease", 100% stable, isn't true.
Quote:


> Originally Posted by *szeged*
> 
> and the fanboys start cherry picking reviews, time to exit the thread, it just got worthless in here.


Here is a nice average across a 17-game bench... I would not call either one a clear winner.

http://www.techpowerup.com/reviews/EVGA/GTX_970_SC_ACX_Cooler/25.html

Let's not get into this circle jerk and agree to disagree; both of them trade blows depending on the game. At 1440p (my resolution) both perform the same; the 970 has nicer features but the 290X comes cheaper. Honestly, let's not argue about 5% performance, just like 680 vs 7970; that's a pointless exercise.


----------



## mtcn77

Quote:


> Originally Posted by *szeged*
> 
> sorry im not getting into it with you again, no matter what anyone says or does or posts you will be a die hard amd *fangirl* that refuses to see any truth in anything. good day.


Being a fan*boy* is better than being a fan troll.


----------



## tsm106




----------



## DarkBlade6

We all know that Nvidia will release another one of those 10-15% performance-increase drivers for Maxwell GPUs to counter AMD's Pirate Islands launch.


----------



## raghu78

Quote:


> Originally Posted by *mtcn77*
> 
> Being a fan*boy* is better than being a fan troll.











Quote:


> Originally Posted by *DarkBlade6*
> 
> We all know that Nvidia will release another one of these 10-15% performance increase drivers for Maxwell GPUs to counter AMD's Pirate Island launch.


Let's not get ahead of ourselves. This applies to both AMD and Nvidia. Why don't we just let the cards launch and speak for themselves?


----------



## Seid Dark

Quote:


> Originally Posted by *szeged*
> 
> 970 also comes with free games, and most of them do 1600+ with ease.


Where is your evidence that 970 does 1600+ easily? Most seem to stop at 1450-1500.


----------



## PostalTwinkie

Quote:


> Originally Posted by *DarkBlade6*
> 
> We all know that Nvidia will release another one of these 10-15% performance increase drivers for Maxwell GPUs to counter AMD's Pirate Island launch.


Nope.

Nvidia will wait for AMD to drop their flagship, the card AMD is going to ride for a year or so, and release the GTX 980 Ti. Then they will drop the price of the 970 and 980 a few bucks to reposition themselves against AMD.

*OR*

AMD is going to surprise all of us with the new card, like Nvidia did with the 900 series, and really give Nvidia a run for the money.


----------



## geoxile

Honestly, it seems like Nvidia's pricing on the 970 anticipates something substantial from AMD. GM204 is a pretty big die at nearly 400 mm^2, 33% bigger than the 770 it replaced in the same price range (and around 9% smaller than Hawaii). It seems like Nvidia really wants to capture as much market share as they can going into 2015, most likely because they're aware that most people who buy now will wait for at least another generational cycle.


----------



## nyxagamemnon

AMD will release the 380X, forcing Nvidia's hand on the big Maxwell release. Then AMD will release their 390X to beat big Maxwell. AMD should be the one on top in 2015.


----------



## Dudewitbow

How I think it will play out: AMD will release their 380X, forcing Nvidia to release a new GPU similar to the 780/Titan, followed by AMD's 390 series, which will then lead to big-die Maxwell (similar to the 780 Ti). Somewhere in between the enthusiast ruckus, throw in the mid-range stuff like the GTX 960 and the R9 370 parts.


----------



## Testier

Quote:


> Originally Posted by *szeged*
> 
> and the fanboys start cherry picking reviews, time to exit the thread, it just got worthless in here.


Pretty much. I never even heard of the site.
Quote:


> Originally Posted by *mtcn77*
> 
> It feels nice when this audit of ht4u's direct comparison between the MSI 970 and the 290X@1.2GHz has findings pointing out that:
> 
> At 4K + 2/4/8x MSAA the 290X is 12% ahead of a 1342 MHz 970,
> At 4K + 0/2/4x SSAA the 290X is 4% ahead of a 1342 MHz 970.
> Collecting these findings, a 970 has to hit either 1816 MHz (lol) to match the 290X@1.2GHz at 4K with MSAA, or 1676 MHz to call it even with no AA and SSAA.
> 
> 
> 
> Spoiler: taken from ht4u direct hardware comparison


Even granting the biased review is right: at 4K, under my settings, on a 24-inch 4K monitor, AA is usually not needed. I am not even looking at single-GPU benchmarks; they're absolutely meaningless at 4K. I need to take tri-SLI/tri-CrossFire into consideration. I trade 4% performance for usable power consumption and heat? The 970 is still the better deal, tbh. Considering possible AMD driver issues with tri-CrossFire, and the general insanity of tri-fire 290Xs on air, yeah, not happening.

And what is this talk of reference designs overclocking better than custom PCBs? It literally makes no sense. Sorry man, I am going to ask for proof on that, considering I've never heard it before and to the best of my knowledge it should be wrong... I have seen some pretty bad "reference" cards XFX put out. AMD makes good GPUs; they just tend not to be as elegant a solution as Nvidia's.

I hope GM200 is out by March, with reasonable power consumption and heat output (not the 480 again), so I can just go with GM200.


----------



## curly haired boy

i hope it's out by february


----------



## mtcn77

Quote:


> Originally Posted by *Testier*
> 
> Pretty much. I never even heard of the site.
> Even providing the *biased* review is right. So, at 4k, under my settings considering on a 24inch 4k monitor AA is usually not needed. I am not even looking at single GPU benchmark, its absolutely meaningless at 4k. I need to take in consideration for tri sli/tri cf. I trade 4% performance for usable power consumption and heat? 970 is still a better deal for tbh. Considering possible AMD driver issues with tri sli, and general insanity of Tri firing 290x on air, yeah, not happening.
> 
> *And what is this talk of GPU on reference design OCing better then custom PCB*? It literally makes no sense. Sorry man, I am gonna ask for proof on that considering I never heard it ever before and to the best of my knowledge, _should be wrong_....... I have see some pretty bad "reference" cards XFX had put out. AMD makes good GPUs, is just, tend not to be as elegant of a solution as nvidia.
> 
> I hope GM200 is out by march. And with reasonable power consumption and heat output(Not 480 again). So I can just go with GM200.



For your information, a genuine reference card has the AMD logo on the PCB right next to the PCI-Express slot.
About the review, I would urge you to find another one with 36 data points at 4K before flaming.
Even though you are hesitant about the review, it still shows that the supposed power consumption gap between those cards is pretty much nonexistent - 217 watts for the MSI 970 and 231 watts for the 290X on the performance BIOS. A note on the 290X side: I won't go into why running the card in Quiet Mode pushes consumption to 307 watts, other than saying the Poole-Frenkel effect makes you miss the apex of temperature-related power efficiency. I also haven't seen any 970s exceed the 220-watt software power limit, so...
About overclocking reference designs: digital VRMs provide a "clean" signal, since the signal is digital; that is as far as I know. I'm no expert, there are better sources out in the open.
Btw, you lost me when you said AA is not needed. Really?


----------



## Olivon

As usual, if we listen to the same guys, we get the impression that AMD is always top of the mountain.
Reality is way more brutal, of course, and Nvidia is the crowd favourite; the GTX 970 is beating all expectations with crazy sales numbers.
970 prices have clearly increased in my country on the back of that success, though.

But, like szeged said, if the 290X Lightning is not too expensive I will grab one too.
I buy from both camps and I am quite curious to give this monster card a try.


----------



## maarten12100

Quote:


> Originally Posted by *Olivon*
> 
> As usual if we listen the same guyz, we have the impression that AMD is always top of the mountain.
> Reality is way more brutal of course and nVidia is plebiscited, GTX 970 is beating all the expectations with crazies sales numberz.
> 970 prices has clearly increased in my country regarding the success though.
> 
> But, like szeged said, if 290X Lightning is not so expensive I will grab one too.
> I buy from both camp and I am quite curious to give a try to this monster card.


Better to just wait for the new cards than blow money on R9 2xx series cards now. Unless you already have one, of course.


----------



## HyperC

I would not be shocked if AMD releases its new series very soon, since all the prices have fallen and Newegg is going out of stock on most of them. My guess is before Christmas we should have real specs!


----------



## Kuivamaa

Yeah, it kinda feels like they are getting rid of inventory right now. Still, I think we will see that GPU around February.


----------



## Awsan

Why is it hard for people to understand that at 1080p, and maybe 1440p, Maxwell is faster than AMD's R9 series, but above 1440p AMD's R9 cards win (whether the difference is 1% or 100%, no one cares)?


----------



## Olivon

If AMD wants to compete they have to slash their prices, that's all.
With GM204, AMD's plans have changed, and that's the only way for them to sell cards.
Q3 market share is so bad that they have no choice:



http://www.3dcenter.org/news/die-grafikchip-und-grafikkarten-marktanteile-im-zweiten-und-dritten-quartal-2014


----------



## azanimefan

It's silly though. The R9 290X is already cheaper than the 970, and it's as fast (or faster). I think the R9 290/290X was damaged by the stock-cooler overheating fiasco... otherwise the R9 290X would be selling like hotcakes at the $250 price point it's currently sitting at.


----------



## szeged

Link for those $250 290x cards please


----------



## azanimefan

There was a PowerColor R9 290X that was $250 this weekend... it's back up to $300 today. There is a Gigabyte going for $290 with a rebate...

I guess it was a Black Friday/Cyber Monday deal. I'm sure we'll see more of them as we get closer to Xmas; keep your eyes open. I nearly bought the PowerColor when I saw it for $250... though I'm happy with my R9 280X (the two reasons I DIDN'T buy it: not only am I happy with my current card, I also only have a 1080p monitor, and it seems like a waste of GPU power to use an R9 290X at 1080p).

It's still a little cheaper than the 970 even at the $300 price point, and the R9 290 (which is pretty comparable to the 970) was selling for $200 (both versions of the PowerColor 290 were going for $200 over the weekend).


----------



## wolfxing

As mentioned in the OP
"this is only an unstable benchmark; the final numbers are still subject to driver and spec changes, etc."


----------



## DNMock

Quote:


> Originally Posted by *mtcn77*
> 
> It feels nice when this audit of ht4u's direct comparison between the MSI 970 and the 290X@1.2GHz has findings pointing out that:
> 
> At 4K + 2/4/8x MSAA the 290X is 12% ahead of a 1342 MHz 970,
> At 4K + 0/2/4x SSAA the 290X is 4% ahead of a 1342 MHz 970.
> Collecting these findings, a 970 has to hit either 1816 MHz (lol) to match the 290X@1.2GHz at 4K with MSAA, or 1676 MHz to call it even with no AA and SSAA.
> 
> 
> 
> Spoiler: taken from ht4u direct hardware comparison


If you are running a game at 4K you need DisplayPort 1.2. Have you ever tried running a high OC on a 290X over DisplayPort 1.2 on a 4K monitor? After 2 monitors, 4 DP cables and 4 290X cards, I can safely say: if you can get a 290X to 1080 MHz without the monitor having a seizure and exploding at 4K over DP 1.2, please let me know. 1.2 GHz over HDMI (so no 4K @ 60 Hz), all day, every day. But the second you add DP 1.2 into the equation, you aren't doing anything at 1.2 GHz on a 290X setup.

Once the drivers and DisplayPort technology mature enough to negate that issue, both the 290X and the GTX 970 will be irrelevant.


----------



## iSlayer

Quote:


> Originally Posted by *szeged*
> 
> if we get gm200 before 2015 i will give away my entire rig free. quote me on this and hold me to it.


Sooner the better. Maybe Nvidia will take a risk and undercut AMD by dropping them early.
Quote:


> Originally Posted by *Seid Dark*
> 
> Where is your evidence that 970 does 1600+ easily? Most seem to stop at 1450-1500.


On water it should do 1600+ easily.

I'm using 110% power on the stock BIOS and I'm stable at ~1586 in the Heaven benchmark.


----------



## mtcn77

Quote:


> Originally Posted by *DNMock*
> 
> If you are running a game at 4k you need a display port 1.2. Have you ever tried running a high O/C on a 290x over Displayport 1.2 on a 4k monitor? After 2 monitors, 4 DP cables and 4 290x cards, I can safely say, if you get a 290x at 1080 mhz without the monitor having an seizure and exploding @ 4k over DP1.2, please let me know. 1.2 ghz over hdmi (so no 4k @ 60 hertz), all day, every day. But the second you start adding DP 1.2 into the equation, you aren't doing anything at 1.2ghz on a 290x set up.
> 
> Once the drivers and Display Port technology matures enough to negate that issue, both the 290x and the GTX 970 will be irrelevant.


Thanks for clearing that up for me. I don't know much about the compatibility issues, but I just wanted to provide pointers.
Here are the results for 1600p:

At 1600p + 2/4/8 MSAA 290x is 5.6% ahead of a 1342 MHz 970,
At 1600p + 0/2/4 SSAA 290x is dead even (~0.1%) with a 1342 MHz 970.
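As a note on how percentage leads like these are typically derived from a review's per-game results, here is a minimal sketch. The FPS numbers below are invented purely for illustration; they are NOT ht4u's data:

```python
import math

# Hypothetical per-game average FPS for two cards across four titles.
fps_290x = [62.0, 48.5, 71.2, 55.3]
fps_970  = [58.1, 46.9, 66.0, 53.8]

# Geometric mean of per-game ratios, so one high-FPS title can't
# dominate the overall figure the way a plain FPS average would.
ratios = [a / b for a, b in zip(fps_290x, fps_970)]
geo_mean = math.prod(ratios) ** (1.0 / len(ratios))

lead_percent = (geo_mean - 1.0) * 100
print(f"290x leads the 970 by {lead_percent:.1f}% (geometric mean)")
```

With these made-up numbers the lead comes out around 5%, in the same ballpark as the 1600p MSAA figure quoted above.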



Spoiler: taken from ht4u direct hardware comparison


----------



## Testier

Quote:


> Originally Posted by *mtcn77*
> 
> 
> For the best of your info, you have to have the AMD logo on the PCB right next to the PCI-Express slot for a genuine reference card.
> About the review, I would urge you to find another one with 36 data points at 4K before flaming.
> Even though you are hesitant towards the review, it still states that the supposed power consumption variance between those cards is pretty much nonexistent - 217 watts for the MSI 970 & 231 watts for the 290x @ performance BIOS 2. Note on the 290x side: I won't get into why using the card in Quiet Mode causes consumption to reach 307 watts, other than saying the Poole-Frenkel effect causes you to miss the apex of temperature-related power efficiency. I also haven't seen any 970s missing the 220 watt software power limit, so...
> About overclocking reference designs, digital VRMs provide a "clean" signal since the signal is digital, as far as I know. I'm no expert; there are better sources out in the open.
> Btw, you lost me when you said AA is not needed. Really?


At 24 inches @ 4K, I think the pixel density is high enough that no AA is needed.

http://www.techpowerup.com/reviews/EVGA/GTX_970_SC_ACX_Cooler/25.html

TechPowerUp's graph shows the 970 ACX ahead of the 290x @ 4K on average across 12 games.

Power consumption with the 290x averages around 250 W, versus 170 W for the 970.


----------



## tsm106

Quote:


> Originally Posted by *Testier*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *mtcn77*
> 
> 
> For the best of your info, you have to have the AMD logo on the PCB right next to the PCI-Express slot for a genuine reference card.
> About the review, I would urge you to find another one with 36 data points at 4K before flaming.
> Even though you are hesitant towards the review, it still states that the supposed power consumption variance between those cards is pretty much nonexistent - 217 watts for the MSI 970 & 231 watts for the 290x @ performance BIOS 2. Note on the 290x side: I won't get into why using the card in Quiet Mode causes consumption to reach 307 watts, other than saying the Poole-Frenkel effect causes you to miss the apex of temperature-related power efficiency. I also haven't seen any 970s missing the 220 watt software power limit, so...
> About overclocking reference designs, digital VRMs provide a "clean" signal since the signal is digital, as far as I know. I'm no expert; there are better sources out in the open.
> Btw, you lost me when you said AA is not needed. Really?
> 
> 
> 
> 
> 
> 
> 
> 
> At 24 inches @ 4K, I think the pixel density is high enough that no AA is needed.
> 
> http://www.techpowerup.com/reviews/EVGA/GTX_970_SC_ACX_Cooler/25.html
> 
> TechPowerUp's graph shows the 970 ACX ahead of the 290x @ 4K on average across 12 games.
> 
> Power consumption with the 290x averages around 250 W, versus 170 W for the 970.
Click to expand...

These reviews that pit custom cards against reference cards need to be taken with a grain of salt. Look at the HardOCP 4K review where they use custom vs. custom. The stock Hawaii reference cooler sucks, but it is what it is. The GPU under that stock cooler is excellent; the stock cooling just buries the card in throttle heaven. So reference results aren't really indicative of current performance, since the market has transitioned over to customs.


----------



## mtcn77

Quote:


> Originally Posted by *Testier*
> 
> At 24 inches @ 4K, I think the pixel density is high enough that no AA is needed.
> 
> http://www.techpowerup.com/reviews/EVGA/GTX_970_SC_ACX_Cooler/25.html
> 
> TechPowerUp's graph shows the 970 ACX ahead of the 290x @ 4K on average across 12 games.
> 
> Power consumption with the 290x averages around 250 W, versus 170 W for the 970.


I don't favor any particular reviewer, but the games & settings are out there. The main reason I mostly quote ht4u is that they do not mix the old and the new.
Likewise, I don't think reviews can be dismissed solely because the tested cards are of different makes; it is the clocks that matter, after all.
Also, you have to have some form of analytical AA - be it FXAA or SMAA - to comb geometry edges. Smaller pixels don't hide the artifacts. The colour depth sample count per resolution can only output a limited number of hues, too. It could well be that the final image tone is a few degrees off into the greyscale at a smaller resolution.
Techreport has a fantastic take on this. I have to commend that DSR, with its shock filtering, does a tremendous job of trimming geometry edges. The question I have still stands, though: is it more efficient than SMAA+SSAA?


Spoiler: taken from beyond3d










----------



## DNMock

Quote:


> Originally Posted by *mtcn77*
> 
> Thanks for clearing that up for me. I don't know much about the compatibility issues, but I just wanted to provide pointers.
> Here are the results for 1600p:
> 
> At 1600p + 2/4/8 MSAA 290x is 5.6% ahead of a 1342 MHz 970,
> At 1600p + 0/2/4 SSAA 290x is dead even (~0.1%) with a 1342 MHz 970.
> 
> 
> 
> Spoiler: taken from ht4u direct hardware comparison


Haha, better to find out this way than the way I did.

The real benefit of the 290x over the 970 is in multi-card setups. Once you start running 290x cards in CrossFire, especially at high resolutions like 4K, they compete with 980 SLI rather than 970 SLI, thanks to smoother frame rates and better scaling. CrossFired 290x cards smash 970 SLI on price and performance, assuming you have enough cooling and a big enough PSU to handle them.
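For what "better scaling" means concretely, here is a minimal sketch; the 30 and 54 FPS figures are hypothetical stand-ins, not measured CrossFire numbers:

```python
# In multi-GPU talk, "scaling" usually means how close two cards get to
# doubling one card's frame rate. Numbers here are illustrative only.
def scaling_efficiency(fps_single: float, fps_dual: float) -> float:
    """Fraction of the ideal 2x speedup actually achieved."""
    return fps_dual / (2.0 * fps_single)

# Illustrative values: a single card at 30 FPS, a pair at 54 FPS.
eff = scaling_efficiency(30.0, 54.0)
print(f"Dual-card scaling efficiency: {eff:.0%}")  # prints "Dual-card scaling efficiency: 90%"
```

So a setup that scales at 90% turns 30 FPS into 54 FPS with a second card, while a poorly scaling one might only reach, say, 70% and land at 42 FPS.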

edit: I digress and am falling off topic.

More to the point: bring on the 300 series cards, my wallet and body are ready!


----------



## PostalTwinkie

Quote:


> Originally Posted by *szeged*
> 
> Link for those $250 290x cards please


I saw several as part of the Black Friday through Cyber Monday sales on Amazon, Newegg, and other sites. They were PowerColor, XFX, and Gigabyte, if I remember correctly.


----------



## StereoPixel

380X and 390X for Summer 2015?








http://www.fudzilla.com/home/item/36492-amd-next-gen-graphics-are-caribbean-islands


----------



## Noufel

Quote:


> Originally Posted by *StereoPixel*
> 
> 380X and 390X for Summer 2015?
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.fudzilla.com/home/item/36492-amd-next-gen-graphics-are-caribbean-islands


It's a little late to fight Maxwell if the info about summer is true (a monopoly is never a good thing for us).


----------



## Pantsu

It was rumored previously that some new GPUs would be delayed to H2 2015, but that doesn't necessarily mean nothing is coming earlier in the year. Still, AMD needed new cards yesterday, not in H2 next year.


----------



## PontiacGTX

Quote:


> Originally Posted by *Noufel*
> 
> It's a little late to fight Maxwell if the info about summer is true (a monopoly is never a good thing for us).


A 3500 SP Hawaii can match a GTX 980; I don't see why it wouldn't be possible.


----------



## CasualCat

ugh feels like we've been waiting forever for real true successors to Hawaii/GK110. Summer is unacceptable...


----------



## Clockster

Quote:


> Originally Posted by *CasualCat*
> 
> ugh feels like we've been waiting forever for real true successors to Hawaii/GK110. Summer is unacceptable...


Because we have been waiting forever









I'm not too bothered though; my 290X Lightning runs any and everything maxed out, and with the new Omega driver launching Monday it should gain even more performance.


----------



## Vintage

Just saw that article on Fudzilla. There's no way I am waiting until Summer 2015... that's absurd. What happened to February?


----------



## PontiacGTX

Quote:


> Originally Posted by *Vintage*
> 
> Just saw that article on Fudzilla. There's no way I am waiting until Summer 2015... that's absurd. What happened to February?


Wait until February or April; if nothing happens, save your money until you need a better GPU?


----------



## Orangey

I hate buying new GPUs in the summer.


----------



## nSone

I hope this turns out to be wrong...
but if it is the case, do you think it would delay any new Maxwell releases too?


----------



## Boomstick727

Quote:


> Originally Posted by *StereoPixel*
> 
> 380X and 390X for Summer 2015?
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.fudzilla.com/home/item/36492-amd-next-gen-graphics-are-caribbean-islands


Quote:


> Originally Posted by *Noufel*
> 
> It's a little late to fight Maxwell if the info about summer is true (a monopoly is never a good thing for us).


Quote:


> Originally Posted by *Vintage*
> 
> Just saw that article on Fudzilla. There's no way I am waiting until Summer 2015... that's absurd. What happened to February?


If you re-read it, it says a new GCN Hawaii card is coming soon to take on the 970/980, with cards based on a new architecture coming in summer 2015.


----------



## CasualCat

Quote:


> Originally Posted by *nSone*
> 
> I hope this turns out wrong...
> but if this is the case do you think it would delay any new maxwell release too?


Seems possible. GM200 taped out months ago, and as far as I can see both 970/980 are selling well. They may not have any incentive to get a new board out.


----------



## nSone

Quote:


> Originally Posted by *CasualCat*
> 
> Seems possible. GM200 taped out months ago, and as far as I can see both 970/980 are selling well. They may not have any incentive to get a new board out.


That's how it seems to me too. It just feels like we never saw the full potential of these cards; I'd hoped they'd be pushed by an AMD release, both performance- and price-wise...


----------



## Noufel

Quote:


> Originally Posted by *PontiacGTX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Noufel*
> 
> It's a little late to fight Maxwell if the info about summer is true (a monopoly is never a good thing for us).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> A 3500 SP Hawaii can match a GTX 980; I don't see why it wouldn't be possible.
Click to expand...

If it's a 3500 SP Hawaii GPU it will (I presume) be watercooled, like the rumored reference cooler that was linked; that will make it a little tricky to put 2 in CFX.


----------



## Acefire

I still hate the new naming scheme.


----------



## DNMock

Either these Chipnell benchmarks are wrong or the Fudzilla article is wrong, no way a Hawaii Refresh is going to be able to beat the 980 at those power consumption levels.

I'm just gonna say they are both right and they just forgot to add an extra "0" to the end of the power consumption part of the benchmarks listed at Chipnell.

(I'd totally still buy 2 of em)


----------



## PontiacGTX

Quote:


> Originally Posted by *DNMock*
> 
> Either these Chipnell benchmarks are wrong or the Fudzilla article is wrong, no way a Hawaii Refresh is going to be able to beat the 980 at those power consumption levels.
> 
> I'm just gonna say they are both right and they just forgot to add an extra "0" to the end of the power consumption part of the benchmarks listed at Chipnell.
> 
> (I'd totally still buy 2 of em)


Hawaii is a Volcanic Islands GPU, and the new GPUs are Caribbean (Pirate?) Islands. The benchmarks aren't from Volcanic Islands, unless they fixed some part of the architecture to improve power consumption.

Chiphell.


----------



## DNMock

Quote:


> Originally Posted by *PontiacGTX*
> 
> Hawaii is a Volcanic Islands GPU, and the new GPUs are Caribbean (Pirate?) Islands. The benchmarks aren't from Volcanic Islands, unless they fixed some part of the architecture to improve power consumption.
> 
> Chiphell.


Yeah, but if the Fudzilla article is to be believed I find it hard to believe there would be benchmarks for something that's 6 to 8 months out still.


----------



## darealist

Fudzilla doe. Instant void. Invalid.


----------



## azanimefan

Quote:


> Originally Posted by *DNMock*
> 
> Yeah, but if the Fudzilla article is to be believed I find it hard to believe there would be benchmarks for something that's 6 to 8 months out still.


The R9-380 is supposed to be released in February.

That's 3 months out, not 6... and it's about the right time frame for performance leaks.


----------



## GoldenTiger

Quote:


> Originally Posted by *mtcn77*
> 
> It feels nice when this audit of ht4u's direct comparison between the MSI 970 & the 290x has findings pointing out that:
> 
> At 4K + 2/4/8 MSAA the 290x is 12% ahead of a 1342 MHz 970,
> At 4K + 0/2/4 SSAA the 290x is 4% ahead of a 1342 MHz 970,
> Collecting these findings, the 970 has got to hit either 1816 MHz (lol) to match the 290x at 4K with MSAA, or 1676 MHz to call it even with No AA & SSAA.
> 
> 
> 
> Spoiler: taken from ht4u direct hardware comparison


A no-name, cherry-picked review with settings none of us 4K users run, in a skewed comparison, and then you draw flawed conclusions from it. Most GTX 970 cards do 1500 to 1600 core. Most 4K owners, myself included, will attest that you don't really need MSAA in most cases, and if you do want it 2x is generally plenty (which with MFAA approaches 4x quality at a lower performance hit). And most 290x cards are very difficult to keep cool at even bearable noise levels when OC'ing, which is widely known to cause 4K DisplayPort problems on that card series.

Really though, you're going to be running SLI or CrossFire to power 4K, and then you essentially need watercooling for a pair of R9 290x cards, let alone to OC them. That just adds even more hassle, expense, and heat output into your room or office. Also, every driver set so far has been improving 4K Maxwell 2.0 performance; in the newest HardOCP articles, stock 970 SLI is basically dead even with 290x CrossFire (custom air) for 4K gaming. Add in how well 970 SLI OC's on air while staying quiet and cool (mine, for example, runs 1506 core whisper quiet, or almost 1600 if I pump up the volts, for 24/7 gaming), with no special watercooling install needed, and it's a simpler, more reliable solution for sure.

Also, just to add: the last Steam survey shows 4K users as a whopping three tenths of one percent of the gaming market. That means 99.7 percent of users don't run 4K, so this whole argument is basically academic anyway, and at 2560 or below it shifts back towards GTX cards regardless.


----------



## mtcn77

Quote:


> Originally Posted by *GoldenTiger*
> 
> *A lot of justification*


MFAA=exclusive sensational Temporal Anti-Aliasing.
Quote:


> Originally Posted by *GoldenTiger*
> 
> Most 4k owners including myself will attest that you don't really need MSAA in most cases with it, and if you do want it 2x is generally plenty (which with MFAA hits almost 4x quality at the lower performance hit)


I accept that MSAA isn't the filtering method with the most visual impact; hence I'm recommending all games be tested with the maximum SSAA presently available in your GPU configuration (CrossFire tends to unlock the 16x SSAA option for Radeon users), and on top of that SMAA is vital for geometric fidelity.
I mean, you literally cannot expect me to derail the topic, as you have, onto whether MSAA is beneficial or not. You just have to grow up and accept it.
The review is flawed, you say; fine. What aspect, or what proof, do you have to back up that dismissive reversal?


----------



## kingduqc

Quote:


> Originally Posted by *GoldenTiger*
> 
> A no-name, cherry-picked review with settings none of us 4K users run, in a skewed comparison, and then you draw flawed conclusions from it. *Most GTX 970 cards do 1500 to 1600 core*. Most 4K owners, myself included, will attest that you don't really need MSAA in most cases, and if you do want it 2x is generally plenty (which with MFAA approaches 4x quality at a lower performance hit). And most 290x cards are very difficult to keep cool at even bearable noise levels when OC'ing, which is widely known to cause 4K DisplayPort problems on that card series.
> 
> Really though, you're going to be running SLI or CrossFire to power 4K, and then you essentially need watercooling for a pair of R9 290x cards, let alone to OC them. That just adds even more hassle, expense, and heat output into your room or office. Also, every driver set so far has been improving 4K Maxwell 2.0 performance; in the newest HardOCP articles, stock 970 SLI is basically dead even with 290x CrossFire (custom air) for 4K gaming. Add in how well 970 SLI OC's on air while staying quiet and cool (mine, for example, runs 1506 core whisper quiet, or almost 1600 if I pump up the volts, for 24/7 gaming), with no special watercooling install needed, and it's a simpler, more reliable solution for sure.
> 
> Also, just to add: the last Steam survey shows 4K users as a whopping three tenths of one percent of the gaming market. That means 99.7 percent of users don't run 4K, so this whole argument is basically academic anyway, and at 2560 or below it shifts back towards GTX cards regardless.


I find it hard to believe that "most" 970s can do over 1550, considering many reviews can't get past 1500 on half a dozen samples...

http://techgage.com/article/taking-it-to-the-limit-overclocking-nvidias-geforce-gtx-970-980/

*"peak of 1,510MHz "*

http://www.bit-tech.net/hardware/graphics/2014/09/19/nvidia-geforce-gtx-970-review/12

Observed boost clock ~*1,430MHz ~1,430MHz ~1,504MHz*

http://www.hardocp.com/article/2014/09/29/msi_geforce_gtx_970_gaming_4g_video_card_review/8#.VIO9w_ldWgw

*1542MHz.*

http://www.guru3d.com/articles-pages/gigabyte-geforce-gtx-970-g1-gaming-review,26.html

*roughly 1516 MHz*

http://www.guru3d.com/articles_pages/galax_geforce_gtx_970_exoc_review,25.html

*roughly 1.4 GHz*

http://www.guru3d.com/articles_pages/asus_geforce_gtx_970_strix_review,26.html

*roughly 1443 MHz*

http://www.guru3d.com/articles_pages/msi_geforce_gtx_970_gaming_review,26.html

*roughly 1501 MHz*

Nowhere near 1600, not even close, even on non-reference models. The way you are talking, it's like 1600 is the average, which is far from it. Unless I'm mistaken and you can magically go 100-150 MHz faster than at release...
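Averaging the boost clocks quoted above backs this up. Assumptions: bit-tech's three observed values are counted separately, and guru3d's "roughly 1.4 GHz" is taken as 1400 MHz:

```python
# Observed boost clocks (MHz) from the seven reviews linked above:
# techgage, bit-tech (x3), hardocp, and four guru3d models.
review_clocks = [1510, 1430, 1430, 1504, 1542, 1516, 1400, 1443, 1501]

avg = sum(review_clocks) / len(review_clocks)
print(f"Average observed boost: {avg:.0f} MHz")  # prints "Average observed boost: 1475 MHz"
print(f"Samples at or above 1600: {sum(c >= 1600 for c in review_clocks)}")
```

So the review-sample average sits around 1475 MHz, with not a single sample at 1600.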


----------



## maarten12100

Quote:


> Originally Posted by *DNMock*
> 
> Either these Chipnell benchmarks are wrong or the Fudzilla article is wrong, no way a Hawaii Refresh is going to be able to beat the 980 at those power consumption levels.
> 
> I'm just gonna say they are both right and they just forgot to add an extra "0" to the end of the power consumption part of the benchmarks listed at Chipnell.
> 
> (I'd totally still buy 2 of em)


Uhm, AMD will be updating their architecture for the first time in nearly 3 years, rather than just doing revisions. AMD will also be using a more advanced node: either 28nm at GF or a smaller node at either TSMC or GF.

Clearly AMD could surpass Nvidia; that is the way it has always been, with one surpassing the other (as ATI did before AMD bought them).
The 5870 was a card where the efficiency tables were turned.

You are spouting nonsense; AMD cards can be, and sometimes are, just as efficient as Nvidia cards or more so. (How did this even become an issue on OCN? Noise is not a valid argument - a better cooler solves that - and neither is heat, since you can just cool your house too.)

Whether those benches are true or not, you can bet all your money on AMD working on a competing product.


----------



## GrimDoctor

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *kingduqc*
> 
> I find it hard to believe that "most" 970s can do over 1550, considering many reviews can't get past 1500 on half a dozen samples...
> 
> http://techgage.com/article/taking-it-to-the-limit-overclocking-nvidias-geforce-gtx-970-980/
> 
> *"peak of 1,510MHz "*
> 
> http://www.bit-tech.net/hardware/graphics/2014/09/19/nvidia-geforce-gtx-970-review/12
> 
> Observed boost clock ~*1,430MHz ~1,430MHz ~1,504MHz*
> 
> http://www.hardocp.com/article/2014/09/29/msi_geforce_gtx_970_gaming_4g_video_card_review/8#.VIO9w_ldWgw
> 
> *1542MHz.*
> 
> http://www.guru3d.com/articles-pages/gigabyte-geforce-gtx-970-g1-gaming-review,26.html
> 
> *roughly 1516 MHz*
> 
> http://www.guru3d.com/articles_pages/galax_geforce_gtx_970_exoc_review,25.html
> 
> *roughly 1.4 GHz*
> 
> http://www.guru3d.com/articles_pages/asus_geforce_gtx_970_strix_review,26.html
> 
> *roughly 1443 MHz*
> 
> http://www.guru3d.com/articles_pages/msi_geforce_gtx_970_gaming_review,26.html
> 
> *roughly 1501 MHz*
> 
> Nowhere near 1600, not even close, even on non-reference models. The way you are talking, it's like 1600 is the average, which is far from it. Unless I'm mistaken and you can magically go 100-150 MHz faster than at release...






For what it's worth I'm hitting 1483 MHz on air with my Strix 970 so far - still haven't adjusted voltage and it's been very stable in game so far.


----------



## szeged

are the reviewers using auto fan profiles? if so, that might be the problem.


----------



## GrimDoctor

Quote:


> Originally Posted by *szeged*
> 
> are the reviewers using auto fan profiles? if so, that might be the problem.


From the ones I've read, I think they have. To keep the VRMs in a reasonable range I had to define the curve myself.


----------



## szeged

Maxwell seems to scale pretty well with temps. My Classified 980 wouldn't clock for anything on the stock fan profile; I set my own profile and bam, 1606 MHz on the core with ambient temps of 78F.


----------



## DNMock

Quote:


> Originally Posted by *maarten12100*
> 
> Uhm, AMD will be updating their architecture for the first time in nearly 3 years, rather than just doing revisions. AMD will also be using a more advanced node: either 28nm at GF or a smaller node at either TSMC or GF.
> 
> Clearly AMD could surpass Nvidia; that is the way it has always been, with one surpassing the other (as ATI did before AMD bought them).
> The 5870 was a card where the efficiency tables were turned.
> 
> You are spouting nonsense; AMD cards can be, and sometimes are, just as efficient as Nvidia cards or more so. (How did this even become an issue on OCN? Noise is not a valid argument - a better cooler solves that - and neither is heat, since you can just cool your house too.)
> 
> Whether those benches are true or not, you can bet all your money on AMD working on a competing product.


So much fail in reading...

I specifically said Hawaii, referring to the specific GPU made by AMD. I said nothing of the company as a whole or their capability to do so in past or future GPU's.

I was referring to this article here: http://www.fudzilla.com/home/item/36326-amd-to-launch-faster-hawaii-iteration (the Fudzilla one) and how it does not align with the article here: http://www.chiphell.com/thread-1182382-1-1.html (The Chipnell one) In that a Hawaii refresh is the next upcoming AMD release so it would make sense for the upcoming card to be benchmarked and not the card coming in 6 to 8 months.

I am quite sure the Pirate Islands architecture will be capable of competing with, and even outperforming, the current Maxwell cards in both TDP and performance; however, I find it quite unlikely that a Hawaii refresh will be able to outperform a 980 without a massive power draw.

Again, AMD can do it, just not with a Hawaii refresh that outperforms Maxwell in every way at the same time...


----------



## maarten12100

Quote:


> Originally Posted by *DNMock*
> 
> So much fail in reading...
> 
> I specifically said Hawaii, referring to the specific GPU made by AMD. I said nothing of the company as a whole or their capability to do so in past or future GPU's.
> 
> I was referring to this article here: http://www.fudzilla.com/home/item/36326-amd-to-launch-faster-hawaii-iteration (the Fudzilla one) and how it does not align with the article here: http://www.chiphell.com/thread-1182382-1-1.html (The Chipnell one) In that a Hawaii refresh is the next upcoming AMD release so it would make sense for the upcoming card to be benchmarked and not the card coming in 6 to 8 months.
> 
> I am quite sure the Pirate Islands architecture will be capable of competing with, and even outperforming, the current Maxwell cards in both TDP and performance; however, I find it quite unlikely that a Hawaii refresh will be able to outperform a 980 without a massive power draw.
> 
> Again, AMD can do it, just not with a Hawaii refresh that outperforms Maxwell in every way at the same time...


Here is what you said again:
Quote:


> no way a Hawaii Refresh is going to be able to beat the 980 at those power consumption levels.


An upgraded Hawaii on FD-SOI would already be able to do this, let alone the new architecture, which should be better than Hawaii. I don't see how I misread your negative expectations.
The 980 is in no way a miracle card, and "Hawaii refresh" means Hawaii-based but slightly updated, in my book.

Hawaii on a less dense node would obviously be able to compete on power and performance with the 980, but of course there will not be just a refresh; it is time for something new, meaning a better node and a new architecture.

I think you meant power consumption rather than TDP. (Two different things.)


----------



## kpzero

I don't necessarily believe the following, but I'm posting it as a potential scenario:

The Chiphell graphs and the Fudzilla article could actually be referring to the same card.
If the Chiphell card really is a sample, then the person could have erroneously labeled it as Pirate/Caribbean in the graph.
Then you have several SiSoft entries listed as R9 200 series with 3520sp and 3200sp from recent months.
A 3520sp Hawaii/Maui would likely be about the size of the supposed 550mm2 die listed in the frequently posted tape-out image.
The above SiSoft entries are listed as 3GB, which would make sense for the sample time period and a 384-bit bus using the new compression. Those would become 6GB with the new wave of memory that is shipping.
The Chiphell results could potentially be achieved by 3520sp, provided it was using a preliminary Omega driver that gave it a decent boost. That may be a slight reach.
A superior GloFo 28nm process, combined with all of those power optimizations listed on the AMD roadmap (same as Nvidia's "magic dust"), could reach that 197 watt mark.
The above potential Maui chip would still likely have problems overclocking.
The above is just rambling.

I am currently leaning towards a refreshed Hawaii/Maui on GloFo 28nm FD-SOI coming out Jan/Feb that edges out the 980, with the next-generation chips coming September-ish on advanced FD-SOI. Big Maxwell likely comes out about a month after Maui, with Nvidia having free rein for several months again.

All this waiting is annoying with AMD being quiet. Too many leaks and too much misinformation. At this point nothing would surprise me with the upcoming releases.

TLDR: Ignore everything above.


----------



## Olivon

FD-SOI ? HBM ?

Grabbing pop-corns and waiting for the truth ...


Spoiler: Warning: Spoiler!


----------



## diggiddi

Quote:


> Originally Posted by *azanimefan*
> 
> there was a PowerColor R9-290x that was $250 this weekend... it's back up to $300 today. There is a Gigabyte going for $290 with a rebate...
> 
> I guess it was a Black Friday/Cyber Monday deal. I'm sure we'll see more of them as we get closer to Xmas, so keep your eyes open. I nearly bought the PowerColor when I saw it for $250... though I'm happy with my R9-280x (probably the two reasons I DIDN'T buy it were that not only am I happy with my current card, but I only have a 1080p monitor, *seems like a waste of GPU power to use an R9-290x on a 1080p monitor)*.
> 
> It's still a little cheaper than the 970 even at the $300 price point, and the R9-290 (which is pretty comparable to the 970) was selling for $200 (both versions of the PowerColor 290 were going for $200 over the weekend)


Umm Crysis 3 says NO


----------



## raghu78

Quote:


> Originally Posted by *diggiddi*
> 
> Umm Crysis 3 says NO


In fact, many more recent AAA games say NO: Far Cry 4, DAI, CODAW. Maxing these games out at 1080p requires a GTX 980. AC Unity too, but that's a poorly optimized and buggy game.









http://www.purepc.pl/karty_graficzne/wymagania_far_cry_4_niskie_test_kart_graficznych_i_procesorow?page=0,7
http://www.purepc.pl/karty_graficzne/dragon_age_inquisition_ma_wymagania_jak_smok_test_wydajnosci?page=0,6
http://www.overclock.net/t/1522671/gamegpu-call-of-duty-advanced-warfare-gpu-test
http://www.anandtech.com/show/8738/benchmarked-assassins-creed-unity/2


----------



## PontiacGTX

Quote:


> Originally Posted by *raghu78*
> 
> In fact, many more recent AAA games say NO: Far Cry 4, DAI, CODAW. Maxing these games out at 1080p requires a GTX 980. AC Unity too, but that's a poorly optimized and buggy game.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.purepc.pl/karty_graficzne/wymagania_far_cry_4_niskie_test_kart_graficznych_i_procesorow?page=0,7
> http://www.purepc.pl/karty_graficzne/dragon_age_inquisition_ma_wymagania_jak_smok_test_wydajnosci?page=0,6
> http://www.overclock.net/t/1522671/gamegpu-call-of-duty-advanced-warfare-gpu-test
> http://www.anandtech.com/show/8738/benchmarked-assassins-creed-unity/2


FC4, AC:U, and CODAW are mere console ports; don't expect them to be as demanding as the non-ported CE3 games.


----------



## daviejams

Quote:


> Originally Posted by *raghu78*
> 
> In fact, many more recent AAA games say NO: Far Cry 4, DAI, CODAW. Maxing these games out at 1080p requires a GTX 980. AC Unity too, but that's a poorly optimized and buggy game.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.purepc.pl/karty_graficzne/wymagania_far_cry_4_niskie_test_kart_graficznych_i_procesorow?page=0,7
> http://www.purepc.pl/karty_graficzne/dragon_age_inquisition_ma_wymagania_jak_smok_test_wydajnosci?page=0,6
> http://www.overclock.net/t/1522671/gamegpu-call-of-duty-advanced-warfare-gpu-test
> http://www.anandtech.com/show/8738/benchmarked-assassins-creed-unity/2


You don't have to play games at ultra you know


----------



## Orangey

Quote:


> Originally Posted by *kpzero*
> 
> I don't necessarily believe the following, but I'm posting it as a potential scenario:
> 
> The Chiphell graphs and the Fudzilla article could actually be referring to the same card.
> If the Chiphell card really is a sample, then the person could have erroneously labeled it as Pirate/Caribbean in the graph.
> Then you have several SiSoft entries listed as R9 200 series with 3520sp and 3200sp from recent months.
> A 3520sp Hawaii/Maui would likely be about the size of the supposed 550mm2 die listed in the frequently posted tape-out image.
> The above SiSoft entries are listed as 3GB, which would make sense for the sample time period and a 384-bit bus using the new compression. Those would become 6GB with the new wave of memory that is shipping.
> The Chiphell results could potentially be achieved by 3520sp, provided it was using a preliminary Omega driver that gave it a decent boost. That may be a slight reach.
> A superior GloFo 28nm process, combined with all of those power optimizations listed on the AMD roadmap (same as Nvidia's "magic dust"), could reach that 197 watt mark.
> The above potential Maui chip would still likely have problems overclocking.
> The above is just rambling.
> 
> I am currently leaning towards a refreshed Hawaii/Maui on GloFo 28nm FD-SOI coming out Jan/Feb that edges out the 980, with the next-generation chips coming September-ish on advanced FD-SOI. Big Maxwell likely comes out about a month after Maui, with Nvidia having free rein for several months again.
> 
> All this waiting is annoying with AMD being quiet. Too many leaks and too much misinformation. At this point nothing would surprise me with the upcoming releases.
> 
> TLDR: Ignore everything above.


Quote:


> Originally Posted by *Olivon*
> 
> FD-SOI ? HBM ?
> 
> Grabbing pop-corns and waiting for the truth ...
> 
> 
> Spoiler: Warning: Spoiler!


http://www.overclock.net/t/1523844/vc-amd-fiji-xt-spotted-at-zauba/80#post_23139947
http://www.overclock.net/t/1523844/vc-amd-fiji-xt-spotted-at-zauba/120#post_23145188


----------



## raghu78

Quote:


> Originally Posted by *daviejams*
> 
> You don't have to play games at ultra you know


If you don't want to play at max settings, then you might as well go with a mid-range card like the R9 280X. In fact, we can take this even further: if we lower the resolution to 1366x768 and play at medium-high settings, we can run entry-level cards like the GTX 750.


----------



## GoldenTiger

Quote:


> Originally Posted by *daviejams*
> 
> You don't have to play games at ultra you know


Quote:


> Originally Posted by *raghu78*
> 
> if you don't want to play at max settings then you might as well go with mid-range cards like R9 280X. in fact we can take this even further and say if we lower resolution to 1366 x768 and play at med-high settings we can run with entry level cards like GTX 750.


Forum warriors are as I keep saying far too obsessed with the ERMAHGERRRRRRD MAX EVERY SLIDARRRRRR mentality and cripple their performance for negligible gains in visual fidelity. I see people claiming you can't run 4k games at great settings with top end sli cards like the Gtx 970 and 980 as a pair and laugh hysterically every time. I'm loving every minute of my 4k ips 32 inch 60hz game time. I lower one or two sliders down a notch and can't even find the changes in screenshots, but the game then runs flawlessly. As usually is the case with people having computer issues, PEBKAC: user error.

P. S. Assassin's Creed unity performs well for the visual fidelity and simulation at given settings levels provided. But what would I expect other than typical forum warrior spin with no understanding but "it says high!" from our friend named after spaghetti sauce?


----------



## incog

Quote:


> Originally Posted by *szeged*
> 
> maxwell seems to scale pretty well with temps, my classified 980 wouldnt clock for anything when using stock fan profile, set my own profile and bam, 1606 mhz on the core with ambient temps of 78f.


Did you set your fans up to 80%+? That's unrealistic for most people who use their computers, I guess. Not sure how quiet those cards are, but I try to keep my video card under ~50% fan speed. 7970s aren't the coolest cards either, though.


----------



## raghu78

Quote:


> Originally Posted by *GoldenTiger*
> 
> Forum warriors are as I keep saying far too obsessed with the ERMAHGERRRRRRD MAX EVERY SLIDARRRRRR mentality and cripple their performance for negligible gains in visual fidelity. I see people claiming you can't run 4k games at great settings with top end sli cards like the Gtx 970 and 980 as a pair and laugh hysterically every time. I'm loving every minute of my 4k ips 32 inch 60hz game time. I lower one or two sliders down a notch and can't even find the changes in screenshots, but the game then runs flawlessly. As usually is the case with people having computer issues, PEBKAC: user error.
> 
> P. S. Assassin's Creed unity performs well for the visual fidelity and simulation at given settings levels provided. But what would I expect other than typical forum warrior spin with no understanding but "it says high!" from our friend named after spaghetti sauce?


I always look for the best playable experience. So that means the settings will vary from game to game. But that does not mean others don't want to play the game at close to the highest settings. Most enthusiasts would like to enable MSAA 2x or SSAA 2x if their card can provide playable frame rates (40 fps) in the latest games at the resolution they are playing. Eg: Farcry 4 with SMAA is definitely playable at 1440p on GTX 970 / GTX 980 / R9 290 / R9 290X. But if you max the game with Ultra, enhanced godrays and run with MSAA 4x you can kiss game playability goodbye


----------



## hyp36rmax

Quote:


> Originally Posted by *GoldenTiger*
> 
> Forum warriors are as I keep saying far too obsessed with the ERMAHGERRRRRRD MAX EVERY SLIDARRRRRR mentality and cripple their performance for negligible gains in visual fidelity. I see people claiming you can't run 4k games at great settings with top end sli cards like the Gtx 970 and 980 as a pair and laugh hysterically every time. I'm loving every minute of my 4k ips 32 inch 60hz game time. I lower one or two sliders down a notch and can't even find the changes in screenshots, but the game then runs flawlessly. As usually is the case with people having computer issues, PEBKAC: user error.
> 
> P. S. Assassin's Creed unity performs well for the visual fidelity and simulation at given settings levels provided. But what would I expect other than typical forum warrior spin with no understanding but "it says high!" from our friend named after spaghetti sauce?


Quote:


> Originally Posted by *raghu78*
> 
> I always look for the best playable experience. So that means the settings will vary from game to game. But that does not mean others don't want to play the game at close to the highest settings. Most enthusiasts would like to enable MSAA 2x or SSAA 2x if their card can provide playable frame rates (40 fps) in the latest games at the resolution they are playing. Eg: Farcry 4 with SMAA is definitely playable at 1440p on GTX 970 / GTX 980 / R9 290 / R9 290X. But if you max the game with Ultra, enhanced godrays and run with MSAA 4x you can kiss game playability goodbye


Agreed with both of you! It's all about playability and what looks good to you. I game and work on a 4K monitor myself; I first expected my Crossfire 7970s to cry (still looking forward to an upgrade, as these will work until a proper replacement is released), but I was proven wrong: it's very possible to play at high or ultra at 60fps as long as you disable or lower AA and ambient occlusion. I personally feel those two settings are just icing on the already scrumptious cake that 4K enables.

I'm loving the "forum warrior" terminology; I've been using "magazine benchers" (yours sounds more appropriate, @Golden Tiger) LOL!!!!


----------



## GoldenTiger

Quote:


> Originally Posted by *raghu78*
> 
> if you don't want to play at max settings then you might as well go with mid-range cards like R9 280X. in fact we can take this even further and say if we lower resolution to 1366 x768 and play at med-high settings we can run with entry level cards like GTX 750.


Quote:


> Originally Posted by *raghu78*
> 
> I always look for the best playable experience. So that means the settings will vary from game to game. But that does not mean others don't want to play the game at close to the highest settings. Most enthusiasts would like to enable MSAA 2x or SSAA 2x if their card can provide playable frame rates (40 fps) in the latest games at the resolution they are playing. Eg: Farcry 4 with SMAA is definitely playable at 1440p on GTX 970 / GTX 980 / R9 290 / R9 290X. But if you max the game with Ultra, enhanced godrays and run with MSAA 4x you can kiss game playability goodbye


Oh, I agree; I just misread your post and thought you said he should just play with a 750 or go home. What you said makes sense.


----------



## incog

Quote:


> Originally Posted by *raghu78*
> 
> Quote:
> 
> 
> 
> Originally Posted by *GoldenTiger*
> 
> Forum warriors are as I keep saying far too obsessed with the ERMAHGERRRRRRD MAX EVERY SLIDARRRRRR mentality and cripple their performance for negligible gains in visual fidelity. I see people claiming you can't run 4k games at great settings with top end sli cards like the Gtx 970 and 980 as a pair and laugh hysterically every time. I'm loving every minute of my 4k ips 32 inch 60hz game time. I lower one or two sliders down a notch and can't even find the changes in screenshots, but the game then runs flawlessly. As usually is the case with people having computer issues, PEBKAC: user error.
> 
> P. S. Assassin's Creed unity performs well for the visual fidelity and simulation at given settings levels provided. But what would I expect other than typical forum warrior spin with no understanding but "it says high!" from our friend named after spaghetti sauce?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I always look for the best playable experience. So that means the settings will vary from game to game. But that does not mean others don't want to play the game at close to the highest settings. Most enthusiasts would like to enable MSAA 2x or SSAA 2x if their card can provide playable frame rates (40 fps) in the latest games at the resolution they are playing. Eg: Farcry 4 with SMAA is definitely playable at 1440p on GTX 970 / GTX 980 / R9 290 / R9 290X. But if you max the game with Ultra, enhanced godrays and run with MSAA 4x you can kiss game playability goodbye
Click to expand...

Maybe it just looks so damn fine on 4K even with slightly reduced settings that it doesn't matter to some people.

Wish I had a 4K screen myself, but they cost half the price of my rig.


----------



## GoldenTiger

Quote:


> Originally Posted by *incog*
> 
> Maybe it just looks so damn fine on 4K even with slightly reduced settings that it doesn't matter to some people.
> 
> Wish I had myself a 4K screen. but they cost half the price of my rig


It does, and the same settings really aren't distinguishable even at 1440p for the most part. The benefit of running 4K is like putting on a pair of eyeglasses. I ran a 24-inch 4K 60Hz IPS monitor back in May and recently upgraded to a 32-inch SST model.


----------



## DNMock

Quote:


> Originally Posted by *maarten12100*
> 
> Here is what you said again:
> Upgraded hawaii on FD-SOI would already be able to do this let alone the new architecture which should be better than hawaii. I don't see how I failed in reading your negative expectations.
> The 980 is in no way a miracle card "Hawaii refresh" means hawaii based but slightly updated in my book.
> 
> Hawaii on a less dense node would obviously be able to compete on power and performance with the 980 but of course there will not be just a refresh it is time for something new meaning a better node and a new architecture.
> 
> I think you meant power consumption rather than TDP. (2 different things)


Yeah, I did mean to say power consumption; it was late. Sorry for that.

It may be possible; I'm not a computer or electrical engineer, I just have my reservations that a Hawaii refresh can win on both fronts. Asking a stop-gap refresh to improve performance by 30% and do so at a 30% decrease in power consumption over the previous version is just unreasonable. Now, a 30% performance bump at similar or slightly higher power consumption? That's not too far out of the box, I think. (Basing the 30% figures on ballpark quick math between the Captain Jack and 290X numbers in the chiphell benchmark.)
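For what it's worth, the ballpark math above can be sketched out. All numbers here are illustrative placeholders loosely based on the rumored chart (290X as a 100% baseline at roughly 290 W, the sample ~30% faster at a claimed ~197 W), not confirmed specs:

```python
# Rough perf-per-watt comparison between a known card and a rumored one.
# All numbers are illustrative placeholders, not confirmed specs.

def perf_per_watt(relative_perf, power_w):
    """Relative performance divided by board power draw."""
    return relative_perf / power_w

# R9 290X as the 100% baseline at an assumed ~290 W gaming draw.
r9_290x = perf_per_watt(100, 290)

# Rumored sample: ~30% faster at a claimed ~197 W.
rumored = perf_per_watt(130, 197)

improvement = rumored / r9_290x  # overall efficiency gain factor
print(f"Efficiency gain: {improvement:.2f}x")  # ~1.91x
```

Under those assumptions the rumored card would be nearly twice as efficient, which is exactly why a same-node refresh delivering it sounds like a stretch.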


----------



## diggiddi

Quote:


> Originally Posted by *raghu78*
> 
> in fact many more recent AAA games say NO . Farcry 4, DAI, CODAW,. maxing these games at 1080p requires a GTX 980. AC unity too, but thats a poorly optimized and buggy game


Might as well add BF4 to that list too.

Quote:


> Originally Posted by *daviejams*
> 
> You don't have to play games at ultra you know


Yes you do, Thisss issss Overrrrccclockkk!!! (In a King Leonidas voice)
On the real, if you have the GPU firepower to do so, why not? I personally want to see the game just as the creator intended it to be seen, in all its full glory (excuse the cliché).
I want to be able to max it out, and nothing less is acceptable, unless of course my system can't handle it.


----------



## Richardbenson22

Is this the new version you people are talking about !!


----------



## PostalTwinkie

Quote:


> Originally Posted by *diggiddi*
> 
> Might as well Add BF4 to that list too
> Yes you do, Thisss issss Overrrrccclockkk!!! (In a King Leonidas voice)
> On the real, if you have the GPU firepower to do so, why not? I personally want to see the game just as the creator intended it to be seen, in all its full glory(excuse the cliché)
> I want to be able to maxx it out and nothing less is acceptable, unless of course my system cant handle it


Why wouldn't you want to run at ultra?

Because in many games the visual difference between Ultra and the next step down is pretty much nonexistent, yet the performance difference can be huge. A great example is Call of Duty: Ghosts: there were no texture swaps when you went to Ultra, models didn't change, visually nothing changed, yet you took up to a 50% performance hit. I am not the only person to have noticed this. Frankly, I wouldn't be surprised if in many games "Ultra" were just a placebo for those who must have it cranked up. Ghosts wasn't the only, or the first, game to do this.

These days it seems that when you slide that button to "Ultra" they don't actually give you a visual upgrade; they might put on some stupidly high level of unnecessary AA and use some tessellation on a rock or something.

There are plenty of reasons not to use "Ultra" settings in most AAA titles: they run like ass and don't look better.


----------



## curly haired boy

Custom > Ultra

I like to know what I'm getting


----------



## PostalTwinkie

Quote:


> Originally Posted by *curly haired boy*
> 
> Custom > Ultra
> 
> I like to know what I'm getting


Exactly!

There is no reason a game today shouldn't have an expansive customization menu allowing people to tweak everything in the game's visuals/performance. Those menus should also be detailed and explain what each setting does.


----------



## GoldenTiger

Quote:


> Originally Posted by *curly haired boy*
> 
> Custom > Ultra
> 
> I like to know what I'm getting


One hundred thousand percent this. Anyone claiming you need multiple top end graphics cards to play at 95 percent or more of the visual quality of absolute maximum should probably learn more about tech instead of whining that just clicking max didn't run well, on any resolution but especially 4k. I've always done this and even in screenshots it is often either entirely or just about indistinguishable in many games, heck I'd say most games, yet runs leaps and bounds faster. I've been running 2560 resolutions since 2008 and 4k since may 2014 without issue. I was on just one GTX 780 when I got my first 4k display, and months later finally upgraded to GTX 970 cards in SLI plus more recently a 32 inch monitor. Honestly though either setup gives awesome graphics, the sli combo just lets me add some aa or tick a couple of extras up a notch and maintain 60fps at all times, instead. Bottom line really is just pebkac on gamers' parts, they think they should just slam every slider to max and if it doesn't run well it's a poorly optimized game, even if they're trying to use things way above their hardware capabilities. Been this way for decades, although it seems to be more pervasive nowadays sort of like the general entitlement attitude in a lot of places. Probably connected, but that's going a little off topic into politics eh?


----------



## Orangey

Quote:


> Originally Posted by *GoldenTiger*
> 
> Bottom line really is just pebkac on gamers' parts, they think they should just slam every slider to max and if it doesn't run well it's a poorly optimized game, even if they're *trying to use things way above their hardware capabilities.*


Like GM204 and 4K, for example.


----------



## hyp36rmax

Quote:


> Originally Posted by *GoldenTiger*
> 
> One hundred thousand percent this. Anyone claiming you need multiple top end graphics cards to play at 95 percent or more of the visual quality of absolute maximum should probably learn more about tech instead of whining that just clicking max didn't run well, on any resolution but especially 4k. I've always done this and even in screenshots it is often either entirely or just about indistinguishable in many games, heck I'd say most games, yet runs leaps and bounds faster. I've been running 2560 resolutions since 2008 and 4k since may 2014 without issue. I was on just one GTX 780 when I got my first 4k display, and months later finally upgraded to GTX 970 cards in SLI plus more recently a 32 inch monitor. Honestly though either setup gives awesome graphics, the sli combo just lets me add some aa or tick a couple of extras up a notch and maintain 60fps at all times, instead. Bottom line really is just pebkac on gamers' parts, they think they should just slam every slider to max and if it doesn't run well it's a poorly optimized game, even if they're trying to use things way above their hardware capabilities. Been this way for decades, although it seems to be more pervasive nowadays sort of like the general entitlement attitude in a lot of places. Probably connected, but that's going a little off topic into politics eh?


Amen!


----------



## curly haired boy

it's down to individual taste. i'd sacrifice AA of any kind in favor of keeping SSAO, for instance


----------



## GoldenTiger

Quote:


> Originally Posted by *Orangey*
> 
> Like GM204 and 4K, for example.


But I don't complain about performance, and my two GM204 cards OC'd are able to beat the snot out of 4K, let alone your poor fangirl jeering. Head back to the Catalyst Omega thread and enjoy the complaints there from the other Radeon owners about how they failed to deliver much of anything, let alone close to their promises.









----------



## PontiacGTX

Quote:


> Originally Posted by *GoldenTiger*
> 
> But I don't complain about performance, and my two GM204 cards OC'd are able to beat the snot out of 4K, let alone your poor fangirl jeering. Head back to the Catalyst Omega thread and enjoy the complaints there from the other Radeon owners about how they failed to deliver much of anything, let alone close to their promises.


The promises were right: they stated up to 19% compared to Catalyst 13.12, plus adding some 20 features and improving frame rates / solving issues in recent games like Ryse (8 fps), and I haven't had performance issues...


----------



## Clovertail100

Quote:


> Originally Posted by *GoldenTiger*
> 
> But I don't complain about performance, and my two GM204 cards OC'd are able to beat the snot out of 4K, let alone your poor fangirl jeering. Head back to the Catalyst Omega thread and enjoy the complaints there from the other Radeon owners about how they failed to deliver much of anything, let alone close to their promises.


The Omega drivers deliver quite well: new features, fixes, a bit more performance over the past beta and quite a bit more compared to the last WHQL driver, which they expressly noted on the performance graph. My 7970s have aged like a fine wine, as opposed to a certain other card released around that time. But hey, forget GK104 even, and just look at the 780 and Titan. What happened to those cards beating the snot out of Tahiti? Why is the gap suddenly so narrow?
AMD's drivers are nothing to complain about, and certainly not a vice. If you want to dispute his misgivings toward the 980's performance in 4K scenarios, do it. But what you're doing here is fighting ignorance with more ignorance. It only makes you look as bad as him.


----------



## Orangey

Ignorant? I have facts on my side. I said GM204, not 980.










Not great for a brand new product to be stomped by year old tech.


----------



## hyp36rmax

Quote:


> Originally Posted by *Orangey*
> 
> Ignorant? I have facts on my side. I said GM204, not 980.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not great for a brand new product to be stomped by year old tech.


That's far from stomped; 2-3 fps isn't that noticeable... The GTX 980 was aiming at the R9 290X, which would be a much fairer comparison of raw power. I don't feel price comparisons really do it justice, as the GTX 970 vs. R9 290X results in the Tom's benchmark seem to show. Not biased at all, either, as I own Crossfire Sapphire Vapor-X 290X 8GBs, Crossfire 7970s, and a GTX 780 Ti.


----------



## astrallite

Quote:


> Originally Posted by *hyp36rmax*
> 
> That's far from stomped. 2-3 fps isn't that noticeable... The GTX 980 was aiming at the R9 290X, which would be a much more fair comparison of raw power. I don't feel price comparisons really do give justice as the GTX 970 and r9 290X seem to show in the Toms benchmark. Not biased at all either as I own Crossfire Sapphire Vapor-X 290X 8gb's, Crossfire 7970's, and a GTX 780Ti.


Also the clockspeeds of the 4GB and 8GB 290X that Toms tested are not the same. When you test 4GB and 8GB versions of the same Sapphire card at the same clock rates the framerate is virtually identical in 95% of games, even at 4K.


----------



## hyp36rmax

Quote:


> Originally Posted by *astrallite*
> 
> Also the clockspeeds of the 4GB and 8GB 290X that Toms tested are not the same. When you test 4GB and 8GB versions of the same Sapphire card at the same clock rates the framerate is virtually identical in 95% of games, even at 4K.


I agree with this. There really wasn't much of a difference between the two. I had an opportunity to get two 8GB cards and jumped on impulse, haha.


----------



## HillaryClinton

So the rumor is the 380X comes out in February, based on the million links I've been hitting all over Google, right? Or is that speculation? I was going to grab a 280/290 on sale, since you can get a 280 for around $170 on sale and a 290 for around $250.

If these benches are for the 380X and are accurate, would it be worth waiting a month, or are these things going to be priced in the $350+ range? This thing looks more powerful than a 290 and uses less wattage.

Also, I've been a bit stumped on the DX12 thing. It seems to bring a bunch of nice performance gains based on speculation, and I've been told only the AMD 300 series and Nvidia 900 series will FULLY support it. Does that mean partial support for the 200 series, and if so, how partial? I know it's all speculation, but it's a concern of mine.


----------



## PontiacGTX

Quote:


> Originally Posted by *hyp36rmax*
> 
> That's far from stomped. 2-3 fps isn't that noticeable... The GTX 980 was aiming at the R9 290X, which would be a much more fair comparison of raw power. I don't feel price comparisons really do give justice as the GTX 970 and r9 290X seem to show in the Toms benchmark. Not biased at all either as I own Crossfire Sapphire Vapor-X 290X 8gb's, Crossfire 7970's, and a GTX 780Ti.


The 980 was aiming for the incoming GPU. If they were doing that comparison, they would have a hard time catching the incoming Caribbean/Pirate Islands GPUs.

You don't need to own all the current video cards to be unbiased.

Quote:


> Originally Posted by *HillaryClinton*
> 
> So rumor is the 380x comes out in feb based on the million links I been hitting all over google right? Or is that speculation? I was gonna grab a 280/290 on sale, since you can get a 280 for like 170ish on sale and a 290 for like 250ish on sale.
> 
> If these benches are for 380x and are accurate...would it be worth waiting a month or are these things gonna be priced in the 350 USD+ range? This thing looks more powerful then a 290...and uses less wattage.
> 
> Also I been a bit stumped on the DX12 thing, it seems to bring a bunch of nice performance based on speculation and I have been told only the AMD 300 series and Nvidia 900 series will FULLY support it, does that mean partial support for 200 series and if so how partial, I know its all speculation, but its a concern of mine.


If it comes in February, it will most likely be a GPU on 28nm.

Get an R9 290 now and upgrade when GPUs move to a 20nm or 14nm node; the incoming GPU could be around $450.

Also, the 200 series supports DX12:
http://www.amd.com/en-us/press-releases/Pages/amd-demonstrates-2014mar20.aspx


----------



## CasualCat

Quote:


> Originally Posted by *Orangey*
> 
> Ignorant? I have facts on my side. I said GM204, not 980.










GM204 is both the 980 and 970...


----------



## HillaryClinton

Quote:


> Originally Posted by *PontiacGTX*
> 
> the 980 was aiming for the incoming gpu.if they were doing that comparison they will have a hard time catching the incoming carribean/pirate islands gpus
> 
> You dont need to have all the video cards of the present to be unbiased
> if it comes at feb can be mostly a gpu on 28nm
> 
> Get a r9 290 by now and upgrade when the gpu node is 20 or 14nm .the gpu incoming could be at 450usd
> 
> Also the 200 series support DX12
> http://www.amd.com/en-us/press-releases/Pages/amd-demonstrates-2014mar20.aspx


Ahh, so it seems the 200 series does fully support DX12. Odd that people were saying only the 970/980 currently do; that's annoying, I read it on various forums. Thanks for the link confirming DX12 support.


----------



## CasualCat

Quote:


> Originally Posted by *HillaryClinton*
> 
> Ahh so it seems the 200 series does fully support DX12, odd that people were saying only the 970/980 currently do, thats annoying, read it on various forums, thanks for the link confirming DX12 support.


Unless the DX12 spec is finalized, I don't understand how any existing GPU (either Nvidia or AMD) can fully support all DX12 features in the hardware. Anyone know the answer to this?


----------



## HillaryClinton

Quote:


> Originally Posted by *CasualCat*
> 
> Unless the DX12 spec is finalized, I don't understand how any existing GPU (either Nvidia or AMD) can fully support all DX12 features in the hardware. Anyone know the answer to this?


Not sure but I am guessing it works with current hardware and doesn't need anything new or special?


----------



## hyp36rmax

Quote:


> Originally Posted by *PontiacGTX*
> 
> *the 980 was aiming for the incoming gpu.if they were doing that comparison they will have a hard time catching the incoming carribean/pirate islands gpus
> 
> You dont need to have all the video cards of the present to be unbiased
> if it comes at feb can be mostly a gpu on 28nm*
> 
> Get a r9 290 by now and upgrade when the gpu node is 20 or 14nm .the gpu incoming could be at 450usd
> 
> Also the 200 series support DX12
> http://www.amd.com/en-us/press-releases/Pages/amd-demonstrates-2014mar20.aspx


What are you talking about? Nvidia's GM204 GTX 980 was released to take the crown as "the fastest GPU in the world" (single GPU, not counting the R9 295X2 and Titan Z), which at the time, depending on the application, was held by the AMD R9 290X or GTX 780 Ti; and it rightfully took it. Besides, how can you aim for an incoming GPU? That's like showing *all* your cards at a poker table...









However, you're correct that you can be unbiased without owning the current GPUs. Imagine how my statement would have come across if you only knew I owned a GTX 780 Ti, in reference to that specific Tom's Hardware benchmark? I would have just been dismissed as a defending "fanboy"...

Bottom line I enjoy both AMD and Nvidia and look forward to their competition to innovate new tech for us to reap.


----------



## maarten12100

Quote:


> Originally Posted by *CasualCat*
> 
> Unless the DX12 spec is finalized, I don't understand how any existing GPU (either Nvidia or AMD) can fully support all DX12 features in the hardware. Anyone know the answer to this?


You emulate the things you haven't built into your GPU yet. Nvidia has made Kepler compliant partly through software; with almost everything else in place, worst case you lose some performance compared to having it built in.
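The idea of a driver papering over missing hardware features can be sketched abstractly. This is a toy model, not actual driver code; the class and feature names are invented for illustration:

```python
# Toy model of a driver exposing an API feature either via native
# hardware support or a slower software fallback. Names are invented
# for illustration; real drivers are vastly more complex.

class Gpu:
    def __init__(self, native_features):
        self.native_features = set(native_features)

class Driver:
    # Features the driver can emulate in software if the GPU lacks them.
    EMULATABLE = {"conservative_raster", "tiled_resources_t3"}

    def __init__(self, gpu):
        self.gpu = gpu

    def supports(self, feature):
        # Report support if the hardware has it, or if we can fake it.
        return feature in self.gpu.native_features or feature in self.EMULATABLE

    def execute(self, feature):
        if feature in self.gpu.native_features:
            return "hardware path"        # full speed
        if feature in self.EMULATABLE:
            return "software fallback"    # works, but costs performance
        raise NotImplementedError(feature)

kepler_like = Gpu(native_features={"tiled_resources_t3"})
driver = Driver(kepler_like)
print(driver.supports("conservative_raster"))   # True (via emulation)
print(driver.execute("conservative_raster"))    # software fallback
```

The point is simply that "supports feature X" can mean either a native hardware path or an emulated one, and only the latter carries the performance penalty described above.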


----------



## SlackerITGuy

Quote:


> Originally Posted by *CasualCat*
> 
> Unless the DX12 spec is finalized, I don't understand how any existing GPU (either Nvidia or AMD) can fully support all DX12 features in the hardware. Anyone know the answer to this?


Because DirectX 12, AFAICT, will not be introducing any new features in the traditional way (where hardware has to "catch up").

The whole point of a low level approach is to fully expose the GPU capabilities, that way devs gain complete control of the GPU so they can try new stuff out/innovate the way they want.


----------



## HillaryClinton

So it's similar to Mantle then, just more widely accepted (or it will be).


----------



## maarten12100

Quote:


> Originally Posted by *HillaryClinton*
> 
> So its similar to Mantle then, just more accepted(Or will be).


Exactly. Mantle will be phased out in 1.5 years in favor of DX12. It's still a win for AMD, since the low-level approach lets their high-performance GPUs pair well with lower-performance CPUs (though by that time an efficient, strong architecture and node should be in use for their low-power parts, and the need may be gone).


----------



## SlackerITGuy

Quote:


> Originally Posted by *maarten12100*
> 
> Exactly Mantle will be phased out in 1,5 years in favor of DX12 it is a win for amd since they can add high perf gpus to lower performance (thouh by that time an efficient strong arch and node should used for their low power parts. (by that time the need may be gone)


AMD is actually saying developers are telling them to keep Mantle going even after DirectX 12 launches, most likely as a way to keep Microsoft on its toes when it comes to innovating DirectX.


----------



## PostalTwinkie

Quote:


> Originally Posted by *SlackerITGuy*
> 
> AMD is actually saying developers are telling them to keep Mantle going even after DirectX 12 launches, most likely as a way to keep Microsoft at its toes when it comes to innovating DirectX.


Mantle will have zero impact on DX, any version, unless it reaches a level of adoption even close to DX's. In reality that is very unlikely, which is unfortunate, as real competition is always a good thing.


----------



## Imglidinhere

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Hahahaha, yeah right!
> 
> I would buy this chart if it were a 390X, but not a 380X. Unless they have done away with the 290/290X naming scheme, and just went with 390X (290X replacement) and 380X (290 replacement).


I sense the power of the green team in this one... strong it is...


----------



## HillaryClinton

Quote:


> Originally Posted by *Imglidinhere*
> 
> I sense the power of the green team in this one... strong it is...


Yeah, I mean, wasn't everyone in shock over the 970/980? Not sure why it's hard to believe a 2015 card will be an improvement over a 2014 card...


----------



## SlackerITGuy

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Mantle will have zero impact on DX, any version, unless it comes up to even a level of adoption close to DX. Which in reality is very unlikely, and that is pretty unfortunate as real competition is always a good thing.


Oh my sweet PostalTwinkie, we've gone back and forth on this several times already, there's no changing your mind (or mine for that matter), so I'll just leave you with this:

A little bit of background:

Microsoft announces DirectX 11.2, introducing the famous Tiled Resources. A couple of months go by, then AMD announces they're taking on Mantle, creating quite a bit of a stir. Time goes by, then all of a sudden, boom: Microsoft announces DirectX 12, committing to a low-level approach with the collaboration of AMD, Intel, NVIDIA, and Qualcomm. After that, Khronos announces an initiative to develop a brand-new OpenGL from the ground up, called Next Generation OpenGL. Not only that, but we also learn Johan Andersson is joining the Khronos Group, while at the same time finding out AMD will make every bit of Mantle available to them to support the initiative. A month goes by, and then all of a sudden Microsoft announces a new iteration of DirectX, called DirectX 11.3, which will bring some of the features of DirectX 12 to the much higher-level DirectX 11 API.

So what have we got?

*- Microsoft developed and announced DirectX 11.2 to introduce an exciting new feature called Tiled Resources; AMD then announced Mantle; then all of a sudden Microsoft announced DirectX 12, and DirectX 11.3.*

Am I the only one that sees something weird here? Or am I crazy for thinking Microsoft's original plan wasn't to allocate resources to developing and hyping DirectX 11.2 only to announce DirectX 12 (and DirectX 11.3) not that long after, making DirectX 11.2 absolutely irrelevant to dev studios everywhere? Good planning!

*- Microsoft claims DirectX 12 has been in development for years now.*

If that statement is true, then one would assume a developer like Johan Andersson would have found out about this, right? Then why did he go through with pitching his Mantle idea to every IHV out there in the past, with the incredibly significant baggage it would carry if picked up by one, in this case AMD (now viewed as an AMD puppet, having the responsibility of co-developing the API from scratch, etc.)? The smart guess would be that he would have dropped his Mantle plans right after finding out about DirectX 12 being developed as a low-level API with the collaboration of AMD, Intel, NVIDIA and Qualcomm.

Let's all not forget this was the same guy that was on an NVIDIA stage praising G-Sync just after announcing Mantle to the world.

Also, something I forgot to mention: Microsoft also announced DirectX 12 would be coming to the XBOX One as well, replacing their DirectX 11.x API over there. Now, if they have been developing DirectX 12 for years, then why in hell didn't the XBOX One launch with this API? They couldn't get this API done for a *single hardware config*!? After all, a low-level API is just a thin abstraction layer over the hardware. If DirectX 12 is so much better than DirectX 11.x and they have been developing it for years, why not launch one of their most important products and biggest sources of revenue with it? Again, this is just one hardware config we're talking about.

*- But SlackerITGuy, Microsoft and Khronos have been optimizing their APIs for years now to allow for much lower-level access (I remember you [PostalTwinkie] posting about Microsoft going low level ~10 years ago? lol).*

While true, to some extent of course, why did both DirectX 12 and OpenGL NG require a from-the-ground-up redesign?

Also, and I'm gonna use my same argument as before: why didn't developers like Johan Andersson take advantage of this instead of calling for a different approach to API design?

I remember reading several developers on Twitter when Apple launched its Metal API, making fun of OpenGL, with the most retweeted one saying something along these lines: "Somewhere an OpenGL programmer wants to flip a table now, but they probably need an extension to do that".

Here:
Quote:


> Somewhere an opengl programmer wants to flip a table now, but they probably need an extension to do that #WWDC14
> 
> - Christina Ann Coffin (@ChristinaCoffin)
> 
> 
> 
> June 2, 2014
Click to expand...

*TL;DR version? Microsoft was ready to move forward with DirectX 11.2 and tiled resources, AMD then announces they're taking on Mantle, Microsoft rapidly announces both DirectX 12 and DirectX 11.3, but AMD's Mantle had nothing to do with it according to PostalTwinkie.*


----------



## CasualCat

@SlackerITGuy I thought there was a point too after DX11 that MS said they were done with DX updates, or am I remembering incorrectly?


----------



## HillaryClinton

Quote:


> Originally Posted by *CasualCat*
> 
> @SlackerITGuy I thought there was a point too after DX11 that MS said they were done with DX updates, or am I remembering incorrectly?


Wouldn't that be a bad move on MS's part? I mean, most devs would start to swing towards OpenGL, right? Which in reality would make Linux just as good as Windows for gaming. I myself love Linux and would welcome that, but I am guessing MS would dislike that change.


----------



## DaaQ

Quote:


> Originally Posted by *CasualCat*
> 
> @SlackerITGuy I thought there was a point too after DX11 that MS said they were done with DX updates, or am I remembering incorrectly?


I remember that too. I remember reading it in Max PC mag, that there wouldn't be any new versions of it, i.e. DX12. IIRC it was around the release, or shortly after the release, of Win7.
I have tried searching it but I can't find all my old mags, but I had a physical copy of it at one time.


----------



## Olivon

Quote:


> So you're not surprised, my answer, we are not taking the foot off the gas. So you'll see continued very, very strong graphics, *we'll have a refresh that you'll see, that we'll talk about later in 2015 that we are excited about*.


http://seekingalpha.com/article/2743255-advanced-micro-devices-amd-management-presents-at-barclays-global-technology-conference-transcript?part=single

Hope this refresh will be very competitive; it's really needed if they want to counter the Maxwell galore.


----------



## dragneel

I just wish it had been released already, since I had to buy a 280 the other day, so close to the new series' release, because my 6950 died and I couldn't live without my computer for a month or two. Would have liked a GTX 970, but it was $200 over my budget.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Imglidinhere*
> 
> I sense the power of the green team in this one... strong it is...


I have had, and have, more AMD hardware than just about every "AMD" person on this forum. My 780 Ti is the first Nvidia card I have had in my rig since my GTX 470 years ago.

Quote:


> Originally Posted by *SlackerITGuy*
> 
> Here:
> *TL;DR version? Microsoft was ready to move forward with DirectX 11.2 and tiled resources, AMD then announces they're taking on Mantle, Microsoft rapidly announces both DirectX 12 and DirectX 11.3, but AMD's Mantle had nothing to do with it according to PostalTwinkie.*


So I skipped everything but your TL;DR, because in the past you have shown that you don't understand what you are talking about. Instead I will say this, addressing the quote above.....

Just from a development-time standpoint, and basic math, Microsoft wouldn't have had enough time between the Mantle announcement and when DX 12 is supposed to drop to shift from DX 11.2 to a full ground-up build of a low-level API for desktop. The time span between the Mantle announcement and DX 12's target release window isn't a long enough development window.

I know that is something many of you "Mantle changed the world" types like to ignore. It is also why every studio didn't jump on Mantle, because DX 12 has been in development for a very long time.

Quote:


> Originally Posted by *CasualCat*
> 
> @SlackerITGuy I thought there was a point too after DX11 that MS said they were done with DX updates, or am I remembering incorrectly?


....

This was debunked long ago.

Microsoft ended support for the XNA community and their XNA development tools, which were targeted towards Windows Phone. People got the "leaked" e-mail discussing it and, for some stupid reason, thought Microsoft was dumping DX. Obviously not the case, as Microsoft has the PC gaming market in their hands with DX.


----------



## GoldenTiger

Quote:


> Originally Posted by *Olivon*
> 
> http://seekingalpha.com/article/2743255-advanced-micro-devices-amd-management-presents-at-barclays-global-technology-conference-transcript?part=single
> 
> Hope this refresh will be very competitive, it's really needed if they want to counter the Maxwell galore..


Talk about? Not launch? Nvidia Pascal, the next architecture after Maxwell, is due in early to mid 2016. If AMD is only talking about their stuff later in 2015, that means a late launch or even a slip into 2016. Yikes!


----------



## SlackerITGuy

Quote:


> Originally Posted by *PostalTwinkie*
> 
> So I skipped everything but your tl:dr, because in the past you have shown that you don't understand what you are talking about. Instead I will say this, addressing the quote above.....
> 
> Just from a development time point, and basic math, Microsoft wouldn't have enough time between Mantle announcement and when DX 12 is supposed to drop to shift from a DX 11.2, to a full ground up build for a low level API for desktop. The time span between Mantle announcement and DX 12's target release window isn't a long enough development window.
> 
> I know that is something many of you "Mantle changed the world" types like to ignore. It is also why every studio didn't jump on Mantle, because DX 12 has been in development for a very long time.


Really? I don't understand what I'm talking about?

But you do, right? Claiming Microsoft has gone low level with DirectX in the past (~around 10 years ago, right?), yet DirectX 12, as you clearly pointed out, required a "from the ground up approach" to please IHVs and developers that have been complaining about DirectX for a very long time now. That doesn't seem to make much sense, right, Twinkie?

But to answer to your post:

So in your opinion *a ~21 month window* for the development of DirectX 12, with the help of AMD, Intel, NVIDIA and Qualcomm, is not enough? That doesn't seem that unreasonable to me (BTW, that time frame could easily change in the future), not to mention Microsoft's much larger workforce compared to what AMD had working on Mantle.
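For what it's worth, the ~21-month figure checks out arithmetically. A quick sanity check, where the exact dates are my assumptions (Mantle's public announcement in September 2013, and an assumed mid-2015 DirectX 12 release target alongside Windows 10):

```python
# Rough check of the ~21-month development window cited above.
# Both dates are assumptions: Mantle announced at AMD's GPU14 event
# (Sept 2013), DirectX 12 then expected alongside Windows 10 (mid-2015).
from datetime import date

mantle_announced = date(2013, 9, 25)
dx12_expected = date(2015, 6, 30)  # assumed target; could easily slip

# Whole-month difference between the two dates
months = (dx12_expected.year - mantle_announced.year) * 12 \
    + (dx12_expected.month - mantle_announced.month)

print(f"Development window: ~{months} months")  # ~21 months
```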

I find it extremely convenient that you ignored ALL of my post by just sticking to the TL;DR version. Go read the rest and try to put together a well-informed post; please, put me in my place.

EDIT: Also:
Quote:


> Originally Posted by *PostalTwinkie*
> 
> It is also why every studio didn't jump on Mantle, because DX 12 has been in development for a very long time.


Really? So that makes more sense to you than dev studios not wanting to develop a low-level rendering path for their game for just one GPU vendor?

The lengths you go to defend Microsoft/NVIDIA and bash AMD over this are just insane.

People forget the damage Microsoft has done to PC Gaming.


----------



## raghu78

Quote:


> Originally Posted by *GoldenTiger*
> 
> Talk about? Not launch? Nvidia Pascal, the next architecture after Maxwell, is due in early to mid 2016. If AMD is only talking about their stuff later in 2015, that means a late launch or even a slip into 2016. Yikes!


yeah AMD will just talk and not release their products in Q1 2015







BTW, where did you get the info on the Pascal launch dates?







My opinion is that the first 16FF+ products are going to be Maxwell die shrinks. These will come in early 2016. Pascal is more a late-2016 event. Nvidia will want to avoid taking too many risks on a bleeding-edge FinFET process, which is likely to have yield challenges at launch. A new architecture with a new high-speed, low-latency bus called NVLink, a new memory standard (HBM), a new production technology (die stacking on a 2.5D silicon interposer) and a new FinFET process is too much risk even for Nvidia. So a 16FF+ Maxwell die shrink (~300 sq mm) in early 2016, followed by a flagship Pascal in late Q3 / early Q4 2016, sounds possible.


----------



## PostalTwinkie

Quote:


> Originally Posted by *SlackerITGuy*
> 
> Really? I don't understand what I'm talking about?
> 
> But you do right? claiming Microsoft has gone low level with DirectX in the past (~around 10 years ago right?), yet DirectX 12, as you clearly pointed out, required a "from the ground up approach"? to please IHVs and developers that have been complaining about DirectX for a very long time now, that doesn't seem to make much sense right Twinkie?
> 
> But to answer to your post:
> 
> So in your opinion *a ~21 month window* for the development of DirectX 12, with the help of AMD, Intel, NVIDIA and Qualcomm is not enough? That doesn't seem that unreasonable to me (BTW that time frame could easily change in the future), *not to mention the inclusion of Microsoft's superior workforce compared to what AMD had working on Mantle.*
> 
> I find it extremely convenient for you that you ignored ALL of my post by just sticking to the TL;DR version, go read the rest and try to put together a well informed post, please, put me in my place.
> 
> EDIT: Also:
> Really? so that makes more sense to you than dev studios not wanting to develop a low level rendering path for their game for just 1 GPU vendor?
> 
> The extends you go to defend Microsoft/NVIDIA and bash AMD over this is just insane.
> 
> People forget the damage Microsoft has done to PC Gaming.


Again, you completely show that you have ZERO understanding of development on this scale!

DX did go "low level" with the ORIGINAL XBox back in 2001, the development of which began somewhere back in 1997! However, the console environment isn't the same as the desktop environment, and they didn't move that lower-level flavor of DX 8 over to desktop. There is no "I am claiming"; it is absolute objective fact this happened! You can buy the Xbox from your local Craigslist or used game store if you don't believe it.

No, 21 months isn't long enough to develop an API on the scale that DX 12 is! No, having more people doesn't mean it gets done faster, not by a long shot, that isn't how software development works.

Please, just stop, the more you speak the more you show you don't understand software development. Hell, I have been out of it for some 15 years and still understand these basic concepts.

More manpower != faster development!

21 months is hardly enough time to get from original, ground up design, to an early preview product. Let alone a release candidate or even release!

EDIT: Oh, you can find it convenient all you want, you have zero understanding of software development. I am not going to bullet point your outlandish claims that are not founded in reality.

Deny all you want, and be ignorant about it for all I care. Unfortunately for you; mathematics are against you, facts are against you, history is against you.


----------



## SlackerITGuy

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Again, you completely show that you have ZERO understanding of development on this scale!


Oh noes!, I have zero understanding of this topic because PostalTwinkie said so!! OMG!
Quote:


> Originally Posted by *PostalTwinkie*
> 
> DX did go "low level" with the ORIGINAL XBox back in 2001, the development of which began somewhere back in 1997! However, the console environment isn't the same as the desktop environment and they didn't move that lower level flavor of DX 8 over to desktop. There is no "I am claiming.", it is absolute objective fact this happened! You can buy the Xbox from your local Craigslist or used game store if you don't believe.


And what does the XBOX variant of DirectX have to do with the PC variant? We're not talking consoles here.

Of course a console API (in this case the XBOX One's) is going to be low level! Hahahaha. But again, how does that affect the PC variant of the DirectX API? DirectX has been a high-level API on the PC since its birth; that was the whole point of DirectX in the first place (which kinda saved PC gaming back then).
Quote:


> Originally Posted by *PostalTwinkie*
> 
> No, 21 months isn't long enough to develop an API on the scale that DX 12 is! No, having more people doesn't mean it gets done faster, not by a long shot, that isn't how software development works.
> 
> Please, just stop, the more you speak the more you show you don't understand software development. Hell, I have been out of it for some 15 years and still understand these basic concepts.
> 
> More manpower != faster development!
> 
> 21 months is hardly enough time to get from original, ground up design, to an early preview product. Let alone a release candidate or even release!


Oh, so that settles it then; since in your opinion it couldn't possibly be done in *freaking 21 months*, that absolutely makes it fact.

Give me a break.

And again, you've dodged my original response to your post. Go read it, and try to put together an intelligent and informed response.

EDIT:
Quote:


> EDIT: Oh, you can find it convenient all you want, you have zero understanding of software development. I am not going to bullet point your outlandish claims that are not founded in reality.
> 
> Deny all you want, and be ignorant about it for all I care. Unfortunately for you; mathematics are against you, facts are against you, history is against you.


I have zero understanding of software development, yet I work in software development 

How are mathematics, facts and history against me? Hahahahah. Because you say it couldn't possibly be done in 21 months? That's how you would sum up mathematics, FACTS and history being against me? Talk about being full of yourself.

Keep believing Microsoft has been working on this for years and years now, but somehow managed NOT to put that work into their latest console, on a SINGLE HARDWARE config. That seems reasonable as well.


----------



## PostalTwinkie

Quote:


> Originally Posted by *SlackerITGuy*
> 
> Oh noes!, I have zero understanding of this topic because PostalTwinkie said so!! OMG!
> And what does the XBOX variant of DirectX have to do with the PC variant? We're not talking consoles here.
> 
> Of course a console API (in this case the XBOX One) is going to be low level! hahahaha, but again, how does that affect the PC variant of the DirectX API?. DirectX has been a high level API for the PC since its birth, that was the whole point of DirectX in the first place (which kinda saved PC gaming back then).
> Oh, so that settles it then, since in your opinion it couldn't possibly be done in *freaking 21 months* that absolutely makes it fact.
> 
> Give me a break.
> 
> And again, you've dodged my original response to your post, go read it, and try to put together an intelligent and informed response.


Now I know why many others have put you on ignore, I have never met someone so willfully dense as you.

That is fine though, ignorance is bliss; I wish you a good evening and rest of your life, you are now added to my ignore list as well.


----------



## SlackerITGuy

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Now I know why many others have put you on ignore, I have never met someone so willfully dense as you.
> 
> That is fine though, ignorance is bliss; I wish you a good evening and rest of your life, you are now added to my ignore list as well.


Another post full of facts, mathematics and history proving me wrong


----------



## HillaryClinton

Hehe, wound up grabbing a Gigabyte Gamer Edition 970 instead of waiting, 312 dollars open box at Micro Center. It's a tad loud even in my Fractal R4, but it is a giant card with 3 fans.... a 290 probably would have been louder.


----------



## Ultracarpet

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Again, you completely show that you have ZERO understanding of development on this scale!
> 
> DX did go "low level" with the ORIGINAL XBox back in 2001, the development of which began somewhere back in 1997! However, the console environment isn't the same as the desktop environment and they didn't move that lower level flavor of DX 8 over to desktop. There is no "I am claiming.", it is absolute objective fact this happened! You can buy the Xbox from your local Craigslist or used game store if you don't believe.
> 
> No, 21 months isn't long enough to develop an API on the scale that DX 12 is! No, having more people doesn't mean it gets done faster, not by a long shot, that isn't how software development works.
> 
> Please, just stop, the more you speak the more you show you don't understand software development. Hell, I have been out of it for some 15 years and still understand these basic concepts.
> 
> More manpower != faster development!
> 
> 21 months is hardly enough time to get from original, ground up design, to an early preview product. Let alone a release candidate or even release!
> 
> EDIT: Oh, you can find it convenient all you want, you have zero understanding of software development. I am not going to bullet point your outlandish claims that are not founded in reality.
> 
> Deny all you want, and be ignorant about it for all I care. Unfortunately for you; mathematics are against you, facts are against you, history is against you.


Well, you really are assuming two likely possibilities remain untrue. First, you are assuming that it would take them longer than the 20-ish months of dev time to produce DX12. Which, even if we accept it, does not ensure that Mantle had no influence, because your second assumption is that much of the required development was not already in place.

For my example I would point to Intel. If you have no competitor, you slow development. If AMD suddenly released a competitive architecture, do you believe Intel would then have to, from that point, suddenly develop a brand new architecture from the ground up? No, it is much more likely Intel has canned responses in case of such a thing happening. Would Microsoft be so naive to not protect their industry in the same way?

This point of view is compounded by your own claims of low-level DX being in development since the late 90s. If that was the focus the entire time, one would be inclined to think they had something related to the next step of development sitting on the shelf for a rainy day.

While I agree that there may be more to the story than just "Mantle came out, Microsoft forced to make DX12", it is somewhat ignorant to claim it had no effect at all. The only motive for believing otherwise is simply not wanting it to be true. That is not necessarily a bad point of view, but it is dangerous in that it can be blind to fair arguments.

Also, can we please judge people based on the merit and truth of their arguments rather than their character? Entering into a debate with someone establishes a responsibility to address that person's arguments, not just their thesis or character. I understand that people get frustrated with each other, but if one has a prejudice towards someone that cannot be overcome, engaging in debate with them will accomplish nothing, and will result in a failure to listen to or care about the actual truths that are brought forth.


----------



## kpzero

I was going to post a new thread but I can't, so this is the best spot:

There are some new supposed benchmarks over at chiphell for Bermuda, Fiji, and GM200.

I can't read it for the life of me, but I think it is saying AMD is using GlobalFoundries 20nm. For all I know it says they aren't, lol.





http://www.chiphell.com/thread-1196441-1-1.html


----------



## Pantsu

Seems extremely implausible they'd have all these chips to test.


----------



## Olivon

LoL

I have 2 GM200 and all the upcoming AMD cardz, come at me internet !


----------



## kingduqc

Do they have Half-Life 3 to bench too?


----------



## KeepWalkinG

Now we know that the new cards will be faster than the GTX 980, and maybe faster than the full Maxwell chip.


----------



## Seronx

RMG (replacement metal gate) yields at GlobalFoundries are questionable. If it's 20nm, we won't be seeing anything until 2016.


----------



## darealist

20nm, huh. Damn! This will stop Nvidia from milking midrange Maxwells for sure. We all knew the GTX 980 should have been a 960 Ti, or a 970 at most.


----------



## zealord

yeah sure...

two new AMD cards and two new Nvidia cards, like we'd believe that


----------



## gamervivek

Quote:


> Originally Posted by *zealord*
> 
> yeah sure...
> 
> two new AMD cards and two new Nvidia like we'd believe that


? Nvidia has one new GPU; AMD has two, which in all probability will have two versions each.


----------



## PontiacGTX

Quote:


> Originally Posted by *Seronx*
> 
> RMG(Replacement Metal Gate) yields at GlobalFoundries are questionable. If its 20nm, we won't be seeing anything until 2016.


Why does Bermuda stay in the high end???


----------



## darealist

Bermuda with mature drivers and an overclock = on par with a 295X2!? This is what real next-gen high end looks like.

GTX 980 should be rebranded as 1070 for 1080p gaming.


----------



## y2kcamaross

Quote:


> Originally Posted by *darealist*
> 
> Bermuda with mature drivers and overclock = on par with 295x2!? This is what a real nextgen highend looks like.
> 
> GTX 980 should be rebranded as 1070 for 1080p gaming.


If you believe that chart is real, you have problems


----------



## CasualCat

Quote:


> Originally Posted by *gamervivek*
> 
> ? Nvidia has one new gpu, AMD have two, which in all probability will have two versions themselves.


I think the skepticism is that somehow the stars would align and they'd have samples simultaneously to benchmark them all. Throw me in that category as well.

I hope the top end AMD gpu performs that well though as I'd love to have that.


----------



## raghu78

Quote:


> Originally Posted by *y2kcamaross*
> 
> If you believe that chart is real, you have problems


We will see in Q1 2015 what is real and what is not. We will also see if anybody has problems if that chart reflects the competitive performance of the upcoming GPUs. Just a matter of a couple of months.


----------



## y2kcamaross

Quote:


> Originally Posted by *raghu78*
> 
> we will see in Q1 2015 what is real and what is not. we will also see if anybody has problems if that chart reflects the competitive performance of the upcoming GPUs. just a matter of couple of months.


That was deep, very insightful. The odds of them having all those GPUs to test now are slim to none. Do I hope those results are true? Of course. I'm all for the fastest GPUs possible; manufacturer is irrelevant.


----------



## Kuivamaa

Chiphell is usually reliable, but I see ES in front of the Radeon numbers and just "full fat" and "cut down" before the GM200 ones, as if they truly only have AMD ES cards and are just doing a projection of some sort for Nvidia. Does anyone have a link to this story?


----------



## Olivon

Quote:


> Originally Posted by *KeepWalkinG*
> 
> http://www.chiphell.com/thread-1196441-1-1.html


----------



## darealist

Quote:


> Originally Posted by *y2kcamaross*
> 
> If you believe that chart is real, you have problems


I'd have a problem for sure if I were one of those who just bought a midrange GTX 980.


----------



## Orangey

What's strange is that "full fat" is a term I have only heard on British forums, and Chiphell is a Chinese site.


----------



## PontiacGTX

Quote:


> Originally Posted by *Orangey*
> 
> What's strange is "full fat" is a term I have only heard on British forums and Chiphell is a Chinese site.


Well, "full fat" can be used for the total capability of something as a whole, e.g. a 780 Ti is the full-fat version of GK110 (maybe it isn't). English is used around the world, so any term gets used frequently.


----------



## Orangey

I didn't ask what it means, I said it's strange to see a colloquial expression used outside of the region of origin.


----------



## y2kcamaross

Quote:


> Originally Posted by *darealist*
> 
> I'll have a problem for sure if I were one of those who just bought a midrange gtx 980.


I assume this was some lame attempt at an insult to the cards in my system? I buy the fastest cards available; I'll buy either the new 390X series or the full GM200 series as well, so yup... you really got me good.


----------



## raghu78

Quote:


> Originally Posted by *y2kcamaross*
> 
> That was deep, very insightful. The odds of them having all those GPUs to test now are slim to none, do I hope those results are true? Of course. I'm all for the fastest GPUs possible, manufacturer is irrelevant


It's going to be a good old-fashioned dogfight, and the consumer is going to get good price/perf. If those numbers are true, the GTX 980 will finally hit USD 300-350. I think it's very much possible. Think about it: the R9 290X, with GCN 1.1 and no major architectural modifications, was 35% faster than the HD 7970 for 37.5% more shaders, or stream processors.

The R9 390X will be a significant architectural upgrade over the HD 7970 / R9 290X. GCN 2.0 will have the Tonga (GCN 1.2) improvements and more. AMD has improved ROP and tessellation performance and memory bandwidth efficiency.

http://techreport.com/review/26997/amd-radeon-r9-285-graphics-card-reviewed/2

What's missing is an improvement to the shader/compute unit architecture to improve perf/shader, perf/sq mm and perf/watt. HBM brings massive bandwidth of 512 GB/s with 1/3rd the memory power (controller + memory chips) of an equivalent GDDR5 system. See slide 45.

http://www.microarch.org/micro46/files/keynote1.pdf

AMD has already been producing Kaveri, Beema, Mullins, semi-custom game console chips and GPUs at GF 28SHP. GF 28SHP is a better process than TSMC 28HP on which Hawaii was built.

http://www.anandtech.com/show/7974/amd-beema-mullins-architecture-a10-micro-6700t-performance-preview

"AMD claims a *19% reduction in core leakage/static current* for Puma+ compared to Jaguar at 1.2V, and *a 38% reduction for the GPU*. The drop in leakage directly contributes to a substantially lower power profile for Beema and Mullins."

GF 28SHP (with 38% lower leakage than TSMC 28HP) + HBM (which cuts power by 2/3rd for memory controller and GDDR5 memory chips) + architectural efficiency changes + chip power efficiency improvements (using techniques like adaptive voltage operation)

http://images.anandtech.com/doci/8742/Voltage%20Adaptive.png
http://images.anandtech.com/doci/8742/Carrizo%20Efficiency.png

As for Chiphell, their GTX 980 / GTX 970 leaks were spot on. With 45% more shaders and so many other factors, it's not unimaginable for the R9 390X to beat the R9 290X by 60%.
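As a back-of-the-envelope check of the scaling numbers above (the percentages are the post's own claims and the rumored figures, not measured data), the arithmetic works out like this:

```python
# Sanity-check the shader-scaling arithmetic in the post above.
# Inputs are claimed/rumored percentages, not benchmark results.

def scaling_efficiency(speedup, extra_shaders):
    """Fraction of added shader throughput that shows up as real speedup."""
    return speedup / extra_shaders

# R9 290X vs HD 7970: +35% performance from +37.5% more shaders
eff_290x = scaling_efficiency(0.35, 0.375)  # ~0.93, i.e. near-linear scaling

# Rumored R9 390X: +60% over the 290X from +45% more shaders.
# The extra per-shader performance that claim implies:
per_shader_gain = 1.60 / 1.45 - 1  # ~0.10, i.e. ~10% more perf per shader

print(f"290X scaling efficiency: {eff_290x:.2f}")
print(f"Per-shader gain implied by the 390X rumor: {per_shader_gain:.1%}")
```

So on top of the extra units, the rumor only requires roughly a 10% per-shader improvement, which is the order of gain the architectural changes described above would need to deliver.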


----------



## gamervivek

Quote:


> Originally Posted by *CasualCat*
> 
> I think the skepticism is that somehow the stars would align and they'd have samples simultaneously to benchmark them all. Throw me in that category as well.
> 
> I hope the top end AMD gpu performs that well though as I'd love to have that.


Well, the 380X benchmark was released earlier by the same guy, the Maxwell samples have been in the wild for quite some time, and it's already three months since Nvidia's 980/970 release. Bermuda is the surprise.


----------



## FreeElectron

When when when when when when?


----------



## darealist

Quote:


> Originally Posted by *y2kcamaross*
> 
> I assume this was some lame attempt at an insult to my cards in my system? I buy the fastest cards available, I'll buy either the new 390x series or the full GM200 series as well, so yup...you really got me good


I'm on mobile. I can't even see your sig lol.


----------



## PontiacGTX

Quote:


> Originally Posted by *Orangey*
> 
> I didn't ask what it means, I said it's strange to see a colloquial expression used outside of the region of origin.


Well, either way, as I said, it is a global language.


----------



## orick

Quote:


> Originally Posted by *PontiacGTX*
> 
> well either way I said it is a global language


Maybe it's a Hong Kong site; Hong Kong used to be a British colony.


----------



## KeepWalkinG

Is it possible for *chiphell* to have these cards?


----------



## geoxile

Quote:


> Originally Posted by *KeepWalkinG*
> 
> Is it possible for *chiphell* to have these cards?


Pretty sure it's actually a forum, so who knows who goes on that site. Hell, it could be an AMD guy posting it for all we know


----------



## maarten12100

Quote:


> Originally Posted by *raghu78*
> 
> It's going to be a good old-fashioned dogfight, and the consumer is going to get good price/perf. If those numbers are true, the GTX 980 will finally hit USD 300-350. I think it's very much possible. Think about it: the R9 290X, with GCN 1.1 and no major architectural modifications, was 35% faster than the HD 7970 for 37.5% more shaders, or stream processors.
> 
> The R9 390X will be a significant architectural upgrade over the HD 7970 / R9 290X. GCN 2.0 will have the Tonga (GCN 1.2) improvements and more. AMD has improved ROP and tessellation performance and memory bandwidth efficiency.
> 
> http://techreport.com/review/26997/amd-radeon-r9-285-graphics-card-reviewed/2
> 
> What's missing is an improvement to the shader/compute unit architecture to improve perf/shader, perf/sq mm and perf/watt. HBM brings massive bandwidth of 512 GB/s with 1/3rd the memory power (controller + memory chips) of an equivalent GDDR5 system. See slide 45.
> 
> http://www.microarch.org/micro46/files/keynote1.pdf
> 
> AMD has already been producing Kaveri, Beema, Mullins, semi-custom game console chips and GPUs at GF 28SHP. GF 28SHP is a better process than TSMC 28HP on which Hawaii was built.
> 
> http://www.anandtech.com/show/7974/amd-beema-mullins-architecture-a10-micro-6700t-performance-preview
> 
> "AMD claims a *19% reduction in core leakage/static current* for Puma+ compared to Jaguar at 1.2V, and *a 38% reduction for the GPU*. The drop in leakage directly contributes to a substantially lower power profile for Beema and Mullins."
> 
> GF 28SHP (with 38% lower leakage than TSMC 28HP) + HBM (which cuts power by 2/3rd for memory controller and GDDR5 memory chips) + architectural efficiency changes + chip power efficiency improvements (using techniques like adaptive voltage operation)
> 
> http://images.anandtech.com/doci/8742/Voltage%20Adaptive.png
> http://images.anandtech.com/doci/8742/Carrizo%20Efficiency.png
> 
> as for chiphell their GTX 980 / GTX 970 leaks were spot on. With 45% more shaders and so many other factors its not unimaginable for R9 390X to beat R9 290X by 60%.


So much truth in this post








I can't wait for Carrizo and the new cards
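The scaling arithmetic in the quoted post is easy to sanity-check. A rough sketch (the shader counts are the public 7970/290X figures plus the rumored 4096-shader 390X from this thread, not confirmed specs):

```python
# Hedged sanity check of the quoted scaling claims; rumored figures, not measurements.

def perf_per_shader_gain(shader_ratio, perf_ratio):
    """Architectural gain implied by a perf uplift beyond raw shader scaling."""
    return perf_ratio / shader_ratio

# HD 7970 (2048 shaders) -> R9 290X (2816 shaders): +37.5% shaders, +35% perf
hawaii = perf_per_shader_gain(2816 / 2048, 1.35)
print(f"290X perf/shader vs 7970: {hawaii:.2f}x")  # ~0.98x, i.e. roughly flat per shader

# R9 290X -> rumored R9 390X (4096 shaders): +45% shaders, rumored +60% perf
fiji = perf_per_shader_gain(4096 / 2816, 1.60)
print(f"390X perf/shader vs 290X: {fiji:.2f}x")    # ~1.10x architectural gain needed
```

So the rumored +60% would only require roughly a 10% per-shader improvement on top of the extra units — ambitious, but not outlandish if the GCN 1.2+ efficiency changes are real.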


----------



## darealist

The main concern is the wait time. I have a 970 to hold me over for the time being. By then quantum-dot 4K TVs will be out, and I won't have to settle for those pathetic ant-size monitors.


----------



## Ganf

3xx's and Maxwell's dropping just in time for tax returns?

Oh...



Oh.... Yes....

Looks like video cards for my new PC are at the bottom of my shopping list.


----------



## ReHWolution

Having some good peeps at Sapphire, lemme tell you: no 3xx card is done yet. It's further out than you think







and so is 980Ti and Titan II, or whatever they're going to call it


----------



## kingduqc

Quote:


> Originally Posted by *ReHWolution*
> 
> Having some good peeps in Sapphire, lemme tell you: no 3xx is yet to be done. It's further than you think
> 
> 
> 
> 
> 
> 
> 
> and so is 980Ti and Titan II, or whatever they're going to call it


It's further out than April/May?

RIP AMD.


----------



## StereoPixel

Quote:


> Originally Posted by *ReHWolution*
> 
> Having some good peeps in Sapphire, lemme tell you: no 3xx is yet to be done. It's further than you think
> 
> 
> 
> 
> 
> 
> 
> and so is 980Ti and Titan II, or whatever they're going to call it


It's 2H 2015?

RIP AMD


----------



## Orangey

AMD are in no rush, Maxwell was a huge disappointment. Hopefully NV come with something good eventually to force AMD prices down.


----------



## MunneY

Quote:


> Originally Posted by *Orangey*
> 
> AMD are in no rush, Maxwell was a huge disappointment. Hopefully NV come with something good eventually to force AMD prices down.


On what planet was maxwell a huge disappointment?

The 970 alone brought the R9 prices to their knees


----------



## Luciferxy

Quote:


> Originally Posted by *MunneY*
> 
> On what planet was maxwell a huge disappointment?
> 
> The 970 alone brought the R9 prices to their knees


Somewhere on a red hot planet ofc


----------



## nleksan

GM200/210 taped out many months ago... How is AMD's perpetual tardiness in any way reflective of Nvidia, other than the latter having no reason to release something when they're already ahead and only gaining market?


----------



## Ultracarpet

Quote:


> Originally Posted by *nleksan*
> 
> GM200/210 taped out many months ago... How is AMD's perpetual tardiness in any way reflective of Nvidia, other than the latter having no reason to release something when they're already ahead and only gaining market?


Ehhh, even beyond that, I think this person's and their source's information should be taken with a grain of salt. I mean, "further than you think" is subjective; in their context they could be assuming we expect something by February, whereas it actually won't be announced until late March... I'm not saying this should be a leading theory lol, just that what that person said is not very conclusive. Also, we have no idea if those "peeps" at Sapphire actually possess information out of our reach.


----------



## Forceman

If we've learned anything over the years, it's that video card release time brings out all the "I have a friend of a friend with inside information" trolls.


----------



## Tsumi

Quote:


> Originally Posted by *nleksan*
> 
> GM200/210 taped out many months ago... How is AMD's perpetual tardiness in any way reflective of Nvidia, other than the latter having no reason to release something when they're already ahead and only gaining market?


Perpetual tardiness? In recent history, the HD5xxx series beat the GTX 4xx series to market. The HD7xxx series beat the GTX 6xx series to market.

This is a good example of selective memory. It happens to everyone, so don't feel bad about it.


----------



## Noufel

Very good if true. I'll have more time to save money for two 390Xs, because sincerely I'm a little disappointed with my 980 SLI.


----------



## Boomstick727

Quote:


> Originally Posted by *Orangey*
> 
> AMD are in no rush, Maxwell was a huge disappointment. Hopefully NV come with something good eventually to force AMD prices down.


Lol nice trolling attempt









Maxwell was a disappointment? Yup, further extending the lead they already had in performance over AMD, along with lowering power consumption and full DirectX 12 support. Such a disappointment









I've got one of those 980 disappointments in my PC, running all my games nicely at 1440P and running really cool. Disappointing really


----------



## FreeElectron

Quote:


> Originally Posted by *Boomstick727*
> 
> Lol nice trolling attempt
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Maxwell was a disappointment.. Yup further extending the lead they already had in performance against AMD, along with lowering power consumption and full Direct X12 support. Such a disappointment
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've got one of those 980 disappointments in my PC, running all my games nicely at 1440P and running really cool. Disappointing really


It is a disappointment because there was no big leap in performance,
which is really what matters.
I am not going to buy a card because it uses less power; I will get something that runs games at higher fps.

4K resolution is already here.
We now need games that run at 4K with at least 60 fps at max settings,
because high-end monitors and high-end games should be satisfied by high-end graphics cards.


----------



## kingduqc

Quote:


> Originally Posted by *Boomstick727*
> 
> Lol nice trolling attempt
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Maxwell was a disappointment.. Yup further extending the lead they already had in performance against AMD, along with lowering power consumption and full Direct X12 support. Such a disappointment
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've got one of those 980 disappointments in my PC, running all my games nicely at 1440P and running really cool. Disappointing really


Perf gained over the 780 Ti is lousy at best, and the retail price of the 980 is way above MSRP... the reason Maxwell used to sell like that was the huge perf you got at lower cost. Nowadays GPUs are almost $150 more expensive than they used to be: $450 for the 970 and close to $650 for most custom coolers on the 980.


----------



## DannyDK

Quote:


> Originally Posted by *kingduqc*
> 
> Perf gained over the 780ti is lousy at best and retail price of the 980 is way too high above the msrp... the reason maxwell is selling like that was the huge perf you got at lower cost. Nowadays gpus are almost 150$ more expensive then they used to be. 450 for the 970 and close tp 650$ for most custom coolers On the 980.


If we could get a custom 980 for 650 in Denmark it would be great; add another 100 and that's what we pay for it. I know because I did, and I'm not all that sorry about it, although it is a bit overpriced compared to the 970, which I had for a week or so before upgrading to the 980. All in all the card is great but overpriced.


----------



## Boomstick727

Quote:


> Originally Posted by *FreeElectron*
> 
> It is a disappointment because there was no big leap in performance.
> Which is really what matters..
> I am not going to buy a card because it uses less power but instead i will get something that runs games with higher fps.
> 
> 4k resolution is already here.
> We now need games that runs at 4k with at least 60 fps at max settings.
> Because high end monitors and high end games should be satisfied by high end graphics cards.


Quote:


> Originally Posted by *kingduqc*
> 
> Perf gained over the 780ti is lousy at best and retail price of the 980 is way too high above the msrp... the reason maxwell is selling like that was the huge perf you got at lower cost. Nowadays gpus are almost 150$ more expensive then they used to be. 450 for the 970 and close tp 650$ for most custom coolers On the 980.


Yeah, I don't agree. Those with a 780 Ti / 290X etc. don't need to upgrade. For those buying now, the 970 / 980 give more performance at the same price point. More VRAM. They use less power and offer full DirectX 12 support. That is not a disappointment. At least Nvidia released something worthy for new buyers. What did AMD give us recently... Tonga? Now that is a disappointment.


----------



## iSlayer

Quote:


> Originally Posted by *Orangey*
> 
> AMD are in no rush, Maxwell was a huge disappointment. Hopefully NV come with something good eventually to force AMD prices down.


Orangey, you're beginning to sound like a shill. Say it ain't so, man.

Maxwell is a winner for the same reason Evergreen was awesome...
Quote:


> Originally Posted by *MunneY*
> 
> On what planet was maxwell a huge disappointment?
> 
> The 970 alone brought the R9 prices to their knees


The nonsense planet of the horse crap dimension.


----------



## FreeElectron

Quote:


> Originally Posted by *Boomstick727*
> 
> Yeah I don't agree, those with 780 Ti / 290X etc don't need to upgrade. For those buying now, the 970 / 980 give more performance at the same price point. More vram. Use less power and offer full Direct X12 support. That is not a disappointment. At least Nvidia released something worthy for new buyers. What did AMD give us recently... Tonga? Now that is a disappointment.


Quote:


> Originally Posted by *FreeElectron*
> 
> It *is a disappointment* because there was no *big leap* in performance.
> Which is really what matters..
> I am not going to buy a card because it uses less power but instead i will get something that runs games with higher fps.
> 
> 4k resolution is already here.
> We now need games that runs at 4k with at least 60 fps at max settings.
> Because high end monitors and high end games should be satisfied by high end graphics cards.


When I say BIG I mean something more than a 50% improvement in performance over the previous-gen 780 Ti.
I don't care about what AMD will provide; if they do similar, then they are both a disappointment.
Your lack of agreement will not change anything.


----------



## Leopard2lx

Quote:


> Originally Posted by *FreeElectron*
> 
> It is a disappointment because there was no big leap in performance.
> Which is really what matters..
> I am not going to buy a card because it uses less power but instead i will get something that runs games with higher fps.
> 
> 4k resolution is already here.
> We now need games that runs at 4k with at least 60 fps at max settings.
> Because high end monitors and high end games should be satisfied by high end graphics cards.


AMD's 390X will be the same "disappointment". Roughly 20-25% more performance, give or take, over the 290X, and it's going to be a miracle if they are able to cut the power consumption / heat / noise down. It's going to be just enough to beat the 980. Then NVIDIA will come out with something that will edge out the 390X.
No one is coming out with huge leaps in performance anymore. It's all going to be relatively incremental.


----------



## FreeElectron

Quote:


> Originally Posted by *Leopard2lx*
> 
> AMD's 390x will be the same "disappointment". Roughly 20-25% more perormance give or take over the 290x and it's going to be a miracle if they are able to cut the power consumption / heat / noise down. It's going to be just enough to beat the 980. Then NVIDIA will come out with something that will edge out the 390x.
> No one is coming out with huge leaps in performance anymore. It's all going to be relatively incremental.


----------



## Ding Chavez

Quote:


> Originally Posted by *Leopard2lx*
> 
> No one is coming out with huge leaps in performance anymore. It's all going to be relatively incremental.


I think this is true. Look at Intel with their small increases in CPU performance now. It's business and marketing. They just need to make something a bit better than what's out now. And remember they have to not only make a faster product but then plan to make something after that even faster so big increases become very difficult to sustain.


----------



## Artikbot

Quote:


> Originally Posted by *Leopard2lx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *FreeElectron*
> 
> It is a disappointment because there was no big leap in performance.
> Which is really what matters..
> I am not going to buy a card because it uses less power but instead i will get something that runs games with higher fps.
> 
> 4k resolution is already here.
> We now need games that runs at 4k with at least 60 fps at max settings.
> Because high end monitors and high end games should be satisfied by high end graphics cards.
> 
> 
> 
> AMD's 390x will be the same "disappointment". Roughly 20-25% more perormance give or take over the 290x and it's going to be a miracle if they are able to cut the power consumption / heat / noise down.

The difference being that the AMD core is usable for things other than gaming.

And speaking of gaming, as per usual the 980 will be absolutely no match for AMD's counterpart at high resolution with AA/post-processing, due to the aforementioned. Like it has been for the past three generations.

Then nVIDIA will release the x10 core... And everyone will run away from the 980 like the plague and get the big cores instead, and nVIDIA will be happy because people bought GPUs twice.

Which is a genius marketing strategy if you ask me.


----------



## FreeElectron

Quote:


> Originally Posted by *Artikbot*
> 
> Difference being the AMD core is usable for anything else than gaming.
> 
> And speaking of gaming, as per usual the 980 will be absolutely no match to AMD's counterpart in high resolution with AA/postprocessing due to the aforementioned. Like it has been for the past three generations.
> 
> Then nVIDIA will release the x10 core... And everyone will run away from the 980 like the plague and get the big cores instead, and nVIDIA will be happy because people bought GPUs twice.
> 
> Which is a genius marketing strategy if you ask me.


That is why I am holding off.


----------



## Jorginto

Quote:


> Originally Posted by *FreeElectron*
> 
> That is why i am holding of.


Me too. Still sitting on my R9 290 CF.


----------



## kael13

Quote:


> Originally Posted by *Artikbot*
> 
> Difference being the AMD core is usable for anything else than gaming.
> 
> And speaking of gaming, as per usual the 980 will be absolutely no match to AMD's counterpart in high resolution with AA/postprocessing due to the aforementioned. Like it has been for the past three generations.
> 
> Then nVIDIA will release the x10 core... And everyone will run away from the 980 like the plague and get the big cores instead, and nVIDIA will be happy because people bought GPUs twice.
> 
> Which is a genius marketing strategy if you ask me.


They did it before and they'll do it again. I don't mind, who wants to shell out for a new GPU every year?


----------



## mtcn77

Some leaks indicate a 300w 2.5D HBM card. On top of that, looking at the time frame of Asetek's new cooling deal, it will possibly have a water cooling heatsink.


----------



## Rayar69

Can someone explain this to me, please?

Memory clock speed: why does Nvidia use 7 GHz and AMD 5 GHz, and is the WCCFTech leak of 1 GHz for the 390X fake? Does this memory clock speed increase speed in games or not? I don't know what it is. And what is memory bandwidth for?


----------



## Xuper

300W? What the hell? If that's true, then the leaked benchmark is fake.


----------



## DNMock

Quote:


> Originally Posted by *HillaryClinton*
> 
> So rumor is the 380x comes out in feb based on the million links I been hitting all over google right? Or is that speculation? I was gonna grab a 280/290 on sale, since you can get a 280 for like 170ish on sale and a 290 for like 250ish on sale.
> 
> If these benches are for 380x and are accurate...would it be worth waiting a month or are these things gonna be priced in the 350 USD+ range? This thing looks more powerful then a 290...and uses less wattage.
> 
> Also I been a bit stumped on the DX12 thing, it seems to bring a bunch of nice performance based on speculation and I have been told only the AMD 300 series and Nvidia 900 series will FULLY support it, does that mean partial support for 200 series and if so how partial, I know its all speculation, but its a concern of mine.


Quote:


> Originally Posted by *Rayar69*
> 
> can some one explain me this please
> 
> Memory Clock Speed why nvidia uses 7 GHZ and amd 5 ghz or fake leak in wccftech 1ghz for 390x, this memory clock speed do not increase speed ingames or whatever cuz i dont know what is this, and memory bandwidth is for what?


The RAM is just like your PC RAM. Over time you can get faster RAM with better timings due to better manufacturing practices, binning, better boards, etc. It's the same here; on some 290X cards, if you get lucky, I'm sure you can clock the memory up toward 7 GHz effective.

Having faster RAM does improve your performance too: http://techbuyersguru.com/VRAMocing.php The catch is that you generally get better results from overclocking your chip than from overclocking your RAM, and just like your PC, when you overclock your RAM, you limit how much you can overclock the core.

Long story short, you want higher clock speeds on both, but if you have to choose, go with higher GPU clocks over higher RAM clocks (on the same card, of course).
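To answer the clock-speed question above with numbers: peak memory bandwidth is just data rate times bus width, so a low clock on a very wide bus can out-deliver a high clock on a narrow one. A rough sketch (the 1 GHz / 4096-bit HBM figures are the rumored ones from this thread, not confirmed specs):

```python
# Peak memory bandwidth sketch:
# bandwidth (GB/s) = effective data rate (MHz) * bus width (bits) / 8 bits-per-byte / 1000

def peak_bandwidth_gbps(effective_mhz, bus_bits):
    return effective_mhz * bus_bits / 8 / 1000

print(peak_bandwidth_gbps(5000, 512))   # R9 290X, 5 GHz GDDR5 x 512-bit -> 320.0 GB/s
print(peak_bandwidth_gbps(7000, 256))   # GTX 980, 7 GHz GDDR5 x 256-bit -> 224.0 GB/s
print(peak_bandwidth_gbps(1000, 4096))  # rumored HBM, 1 GHz x 4096-bit  -> 512.0 GB/s
```

So the "1 GHz for the 390X" leak isn't automatically fake: with a 4096-bit HBM interface it still works out to the 512 GB/s mentioned earlier in the thread.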


----------



## Kuivamaa

Quote:


> Originally Posted by *Xuper*
> 
> 300w? What the hell ? If it's True then , the Leak of that benchmark is Fake.


The 197W part is supposed to be the 380X. The watercooled part is expected to be the 390X.


----------



## daviejams

I honestly could not care less if the new card is a 300W part. This is Overclock.net, after all.
It will give ammo to Nvidia fanboys, though.


----------



## GoldenTiger

Quote:


> Originally Posted by *daviejams*
> 
> I honestly could not care less if the new card is a 300w part. This is over clock.net after all.
> It will give ammo to nvida fanboys though


Enjoy sweating while gaming because your rig is dumping out over 600w of heat (more oc'd) under your desk onto your legs in Crossfire mode....









----------



## Ultracarpet

Quote:


> Originally Posted by *GoldenTiger*
> 
> Enjoy sweating while gaming because your rig is dumping out over 600w of heat (more oc'd) under your desk onto your legs in Crossfire mode....
> 
> 
> 
> 
> 
> 
> 
> .


Considering that people who Crossfire OC'd 290Xs and SLI 780 Tis are still alive, I don't think 30 or 40 more watts is going to kill anyone. Switch an old light bulb in your house to an LED one, lol.
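The light-bulb comparison is easy to put in numbers, since essentially all of a rig's electrical draw ends up as heat in the room. A back-of-envelope sketch (the 600 W and +40 W figures are just the ones from this exchange):

```python
# Back-of-envelope heat output: assume ~100% of electrical power becomes room heat.
W_PER_BULB = 60          # classic incandescent bulb
BTU_PER_WATT_HR = 3.412  # 1 W sustained = 3.412 BTU/hr

def as_bulbs(watts):
    """Express a power draw as a number of 60 W incandescent bulbs."""
    return watts / W_PER_BULB

def as_btu_per_hr(watts):
    """Express a power draw as heater output in BTU/hr."""
    return watts * BTU_PER_WATT_HR

print(as_bulbs(600))                   # 600 W rig ~= 10 bulbs
print(as_btu_per_hr(600))              # ~2047 BTU/hr, a small space heater
print(as_bulbs(640) / as_bulbs(600))   # +40 W is only ~7% more heat
```

Which supports both sides a little: 600 W under a desk really is space-heater territory, but an extra 30-40 W on top of that is a rounding error.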


----------



## hht92

Quote:


> Originally Posted by *GoldenTiger*
> 
> Enjoy sweating while gaming because your rig is dumping out over 600w of heat (more oc'd) under your desk onto your legs in Crossfire mode....
> 
> 
> 
> 
> 
> 
> 
> .


Man, stop being like that, like you have the best GPU ever. We will see new games in a few months, and your card will not be able to max them out.

(The 970 is a very good GPU, BUT we are talking about changing GPU every 6 months to be able to play the new games.)


----------



## DannyDK

Quote:


> Originally Posted by *hht92*
> 
> Man stop being like that, like you have the best gpu ever, we will see in a few months new games and your card will not be
> able to max them out.
> 
> 
> 
> 
> 
> 
> 
> 
> (970 is very good gpu BUT we are talking about every 6 months to change gpu to be able to play the new game.)


What about the 980?


----------



## GoldenTiger

Quote:


> Originally Posted by *Ultracarpet*
> 
> Considering that people who crossfire OC'd 290x's and sli 780ti's are still alive, I don't think 30 or 40 more watts is going to kill anyone. Switch an old light bulb in your house to an LED one lol.


Try turning on 10 light bulbs and putting them 2 feet from your legs, and report back how great it feels







Quote:


> Originally Posted by *hht92*
> 
> Man stop being like that, like you have the best gpu ever, we will see in a few months new games and your card will not be
> able to max them out.
> 
> 
> 
> 
> 
> 
> 
> 
> (970 is very good gpu BUT we are talking about every 6 months to change gpu to be able to play the new game.)


What are you on about, exactly now?


----------



## hht92

Quote:


> Originally Posted by *GoldenTiger*
> 
> What are you on about, exactly now?


I am talking about waiting to see what AMD has to offer (and don't argue about the extra watts — it's AMD, man).


----------



## ZealotKi11er

Quote:


> Originally Posted by *GoldenTiger*
> 
> Enjoy sweating while gaming because your rig is dumping out over 600w of heat (more oc'd) under your desk onto your legs in Crossfire mode....
> 
> 
> 
> 
> 
> 
> 
> .


But it's -20°C here in Canada. I am loving my AMD cards.


----------



## overpass

If you can't handle the heat, get out of the kitchen







:thumb:


----------



## daviejams

Quote:


> Originally Posted by *GoldenTiger*
> 
> Enjoy sweating while gaming because your rig is dumping out over 600w of heat (more oc'd) under your desk onto your legs in Crossfire mode....
> 
> 
> 
> 
> 
> 
> 
> .


I live in Scotland. I'd never Crossfire again, but I did run two 7970s with an overclocked CPU for a time and honestly never really noticed the heat.


----------



## FreeElectron

Quote:


> Originally Posted by *daviejams*
> 
> I live in Scotland. I'd never crossfire again but I did run two 7970s with an overclocked CPU for a time and honestly never really noticed heat


I used to have three 7970 GHz Vapor-X cards.
They used to make my winters warm.


----------



## nleksan

First FACTORY 3x 8-pin PCI-E power (reference) card?

390X Lightning = 4x 8-pin?

Also, to any AMD apologists: 300W (which, if it's like their rating for the 290X, actually means "335W") will make OC-ing unfeasible for many, and I would bet money that the OC headroom will be one of the lowest of any reference GPU ever released...

That's just ridiculous power draw!
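For context on the connector speculation above: the PCI-E spec rates the x16 slot at up to 75 W, each 6-pin connector at 75 W, and each 8-pin at 150 W, so a hypothetical 3x 8-pin card would be specced for up to 525 W delivery — far above a 300 W TDP, which is the point of such configurations. A quick sketch:

```python
# PCI-E power delivery limits per spec: x16 slot 75 W, 6-pin 75 W, 8-pin 150 W.
CONNECTOR_WATTS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def max_board_power(aux_connectors):
    """Spec-rated maximum board power for a card with the given aux connectors."""
    return CONNECTOR_WATTS["slot"] + sum(CONNECTOR_WATTS[c] for c in aux_connectors)

print(max_board_power(["8-pin", "8-pin", "8-pin"]))            # 525 W (rumored 3x 8-pin)
print(max_board_power(["8-pin", "8-pin", "8-pin", "8-pin"]))   # 675 W (4x 8-pin Lightning?)
print(max_board_power(["8-pin", "6-pin"]))                     # 300 W (typical 290X layout)
```

So extra connectors alone don't prove a monstrous TDP; vendors often add them purely for overclocking headroom.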


----------



## joeh4384

I won't trash the power usage until I see how much better it performs than the 980. I think if it uses the same power as the current 290x and out performs the current 290x by 40-50%, I think it is a pretty good job on efficiency for AMD.


----------



## caswow

people going nuts over rumours







nope times have not changed


----------



## Ultracarpet

Quote:


> Originally Posted by *GoldenTiger*
> 
> Try turning on 10 light bulbs and putting them 2 feet from your legs, and report back how great it feels
> 
> What are you on about, exactly now?


Yea... no. The argument is more like: you already have 9 light bulbs 2 feet from your legs; how much of a difference would you notice from adding 1 more?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ultracarpet*
> 
> Yea... no. The argument is more like you already have 9 lightbulbs 2 feet from your legs, how much of a difference would you notice from adding 1 more?


Quote:


> Originally Posted by *GoldenTiger*
> 
> Try turning on 10 light bulbs and putting them 2 feet from your legs, and report back how great it feels
> 
> 
> 
> 
> 
> 
> 
> .
> What are you on about, exactly now?


Is it really that bad?


----------



## Ultracarpet

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Is it really that bad?


No, it isn't.


----------

