# [TPU] AMD Announces The Radeon R9 290X - Features 64 ROPs



## Baghi

*Radeon R9 290X Features 64 ROPs*
Quote:


> A leaked company slide by AMD confirmed that its high-end "Hawaii" silicon indeed features 64 raster operations units (ROPs). In reference to its predecessor, "Tahiti," the slide speaks of 2 times the ROPs (32 on "Tahiti") and 1.4 times the stream processors (2048 on "Tahiti," so 2816 on "Hawaii"). Other known specifications include up to 1 GHz GPU clock, up to 5.00 GHz memory clock, and a 512-bit wide GDDR5 memory interface, holding 4 GB of memory. Reviews of Radeon R9 290X could surface around mid-October.


Source: TPU

----

*Pre-orders begin for AMD Radeon R9 290X, likely priced at $699*
Source: TechSpot

----

*AMD Radeon R9 290X Battlefield 4 Edition Pre-Orders Show $1,145 / €840 Price* (link)



----

*UPDATED SPECS (link):*

Price: US $599.99 (or €499.99 / £399.99 before taxes)
Availability: mid-October
28 nm silicon
2,816 GCN stream processors
44 compute units (176 SIMDs)
176 TMUs
64 ROPs
512-bit memory interface
4 GB / 6 GB GDDR5 video memory
>300 GB/s memory bandwidth
5.00 GHz effective memory clock
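The ">300 GB/s" figure follows directly from the bus width and effective memory clock. A quick sanity check (assuming the rumored 512-bit bus and 5.00 GHz effective GDDR5 data rate listed above; the helper function name is mine):

```python
def gddr5_bandwidth_gbs(bus_width_bits: int, effective_clock_ghz: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times effective data rate in GT/s."""
    return (bus_width_bits / 8) * effective_clock_ghz

# Rumored "Hawaii" config: 512-bit bus, 5.00 GHz effective GDDR5
print(gddr5_bandwidth_gbs(512, 5.0))  # 320.0 GB/s, i.e. ">300 GB/s"
```

For comparison, the same formula gives 288 GB/s for "Tahiti" (384-bit at 6.0 GHz effective, as on the 7970 GHz Edition), so the rumored configuration would be roughly a 10% bandwidth increase.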

----

Quote:


> AMD announced the new Radeon R9 290X, its next-generation flagship graphics card. *Based on the second-generation Graphics CoreNext micro-architecture, the card is designed to outperform everything NVIDIA has at the moment, including a hypothetical GK110-based graphics card with 2,880 CUDA cores.*


Source

TPU projected final specs:
- 4 Independent Tessellation Units
- ~3,000/2,800 Stream Processors
- 512-bit Memory Interface
- 4GB Video Memory
- DirectX 11.2


----------



## DADDYDC650

How much? !?!?!


----------



## batman900

Droooooooooooooollll


----------



## Hydroplane

I'm ordering two at midnight on launch day.


----------



## anticommon

Quote:


> Originally Posted by *Hydroplane*
> 
> I'm ordering two at midnight on launch day.


Or preorder them on October 3rd, before launch day?


----------



## Baghi

Quote:


> Originally Posted by *DADDYDC650*
> 
> How much? !?!?!


No word on the price yet.

BTW, "our most powerful GPU yet" makes it sound like it's faster than the HD 7990 even!


----------



## illuz

I did not expect that. Benchmarks now!


----------



## FinalForm7

I am hoping the price is free of sticker shock.


----------



## youra6

Quote:


> Originally Posted by *Baghi*
> 
> No word on the price yet.
> 
> BTW, "our most powerful GPU yet" makes it sound like it's faster than the HD 7990 even!


They clarified it as the most powerful "single GPU."


----------



## Baghi

Quote:


> Originally Posted by *youra6*
> 
> They clarified it as the most powerful "single GPU."


Oh, I see. Didn't know that.


----------



## sherlock

Quote:


> Originally Posted by *Baghi*
> 
> No word on the price yet.
> 
> BTW, "our most powerful GPU yet" makes it sound like it's faster than the HD 7990 even!


7990 is graphic card with two GPUs.


----------



## Deadboy90

"~3000 stream processors"
How much is about? Still that's amazing I can't wait for benchmarks! Take that Nvidia!


----------



## EliteReplay

Quote:


> Originally Posted by *Deadboy90*
> 
> "~3000 stream processors"
> How much is about? Still that's amazing I can't wait for benchmarks! Take that Nvidia!


Take that alatar


----------



## batman900

^ Who cares who makes it? As long as we get cool new tech at a "hopefully" good price.

Edit: Been watching this a while and reallllly want them to get past the boring AMD audio junk....


----------



## Juub

Quote:


> Originally Posted by *Baghi*
> 
> 
> Source
> 
> Final specs:
> - 4 Independent Tessellation Units
> - ~3,000 Stream Processors
> - 512-bit Memory Interface
> - 4GB Video Memory
> - DirectX 11.2




8,000 is quite terrible compared to a GTX Titan. Some guy here got 9,900 with a stock Titan and another got almost 9,000 with a stock 780 in Fire Strike.


----------



## Baghi

Quote:


> Originally Posted by *sherlock*
> 
> 7990 is graphic card with two GPUs.


Doesn't mean it can't be beaten, but *youra6* already pointed out AMD's clarification.
Quote:


> Originally Posted by *Deadboy90*
> 
> "~3000 stream processors"
> How much is about? Still that's amazing I can't wait for benchmarks! Take that Nvidia!


Very close, maybe the 2816 rumor is true. Let's see.


----------



## HowHardCanItBe

http://www.livestream.com/amdlivestream


----------



## Norlig

Got two 7970s; I'll still buy one of these first, then one more when I get a 4K gaming display, but I might wait for an 8 GB version if there is one.


----------



## Rahulzz

I had my eye on this one...


----------



## Ukkooh

Quote:


> Originally Posted by *Juub*
> 
> 8000 is quite terrible for a GTX Titan. Some guy here had 9,900 with a stock Titan and another one had almost 9,000 with a stock 780 on Fire Strike.


I find that image misleading, as the last bar looks blurred out, meaning it could go on. Time to wait for the reviews and then decide whether I'll order a GTX 780 or this.


----------



## Roaches

Where does it say 3,000 stream processors!???

Sorry, I'm at work and can't view any videos or streams.


----------



## Juub

Quote:


> Originally Posted by *Ukkooh*
> 
> I find that image deceiving as the last bar looks blurred out meaning that it could go on. Time to wait for the reviews and then decide if I'll order a gtx 780 or this.


If it could do more, you can bet they would place the bar higher.


----------



## Stay Puft

Quote:


> Originally Posted by *Rahulzz*
> 
> 
> 
> I had my eye on this one...


That's simply a rebranded 7870 with higher clocks.


----------



## Alatar

Uh... I've been watching the live stream, and nowhere did they say anything about Nvidia at all...

The part about which GK110 the thing goes up against is just some random drivel from the TPU writer. Actually, the Fire Strike performance numbers in AMD's slides were less than promising considering how well GK110 cards score...


----------



## Insan1tyOne

Quote:


> Originally Posted by *Rahulzz*
> 
> 
> 
> I had my eye on this one...


^ Okay, so there is this picture... Now where is the similar one featuring the 290X? I want to see the official price point. If the 270X is $199, I'm going to assume the 280X is $399 and the 290X will be $599? I guess that's not too bad... I was hoping for around $500-550 for the 290X. Oh well...


----------



## Baghi

Those are the official prices, "fastest GPU under $200" they said.


----------



## Newbie2009

Quote:


> Originally Posted by *Alatar*
> 
> Uh.... I've been watching the live stream and no where did they say anything about Nvidia at all...
> 
> The part about which GK110 the thing goes against is just some random drivel from the TPU writer. Actually the firestrike perf numbers in AMD's slides were less than promising considering how well GK110 cards score...


Agree. However, even in the "leaked benchmarks" the Titan scored higher in that benchmark.

I'm no expert, but I wonder: has AMD decided on clocks yet? Perhaps that's why they're so vague about the high end; they're not sure where they'll position it or what to charge for it. Or is it too late in the game for them not to have decided on clocks?


----------



## s-x

Going to be a Titan killer in price/performance.


----------



## szeged

Quote:


> Originally Posted by *s-x*
> 
> Going to be a Titan killer in price/performance.


780s and 7970s already did that.


----------



## Insan1tyOne

Quote:


> Originally Posted by *Newbie2009*
> 
> Agree. However, even in the "leaked benchmarks" the titan scored higher in that benchmark.


In that benchmark the R9 290X sat between the Titan at default factory clocks and the "Extreme Edition" Titan, so needless to say the 290X should trade blows with a completely stock Titan if all the rumors are true.


----------



## Alatar

Quote:


> Originally Posted by *szeged*
> 
> 780s and 7970s already did that.


Pretty much every card on the market already did that.

Shame if these can't score higher in Fire Strike though; I was looking forward to a benching war.


----------



## Raf Leung

Where do you pre-order it? I live in Australia.


----------



## McMogg

They said (when I watched) that it was the fastest single GPU that *THEY* had ever produced.
You'd think if they were holding the fastest single GPU ever, they'd show it off. I think it'll be close between the Titan and the R9 290X, with the 290X at least a little behind.


----------



## Alatar

Quote:


> Originally Posted by *Insan1tyOne*
> 
> In that benchmark the R9 290X sat between the Titan with default, factory clocks, and the Extreme Edition Titan, so needless to say the 290X should trade blows with a completely stock Titan if all the rumors are true.


The leaked benches had a bone stock Titan.

And a bone stock Titan is extremely conservatively clocked and TDP limited....


----------



## Yungbenny911

Oh my... I'm so fired up!

Good job AMD, now polish those drivers and you'll gain my loyalty.


----------



## Ukkooh

At this point I'll be surprised if it manages to trade blows with the GTX 780.


----------



## Razor 116

Quote:


> Originally Posted by *Alatar*
> 
> Pretty much every card on the market already did that
> 
> Shame if these can't score higher in fire strike though, was looking forward to a benching war


I'd rather concentrate on actual games.


----------



## DMHernandez

Did they mention price for the 290X?


----------



## Yungbenny911

Quote:


> Originally Posted by *Alatar*
> 
> Pretty much every card on the market already did that
> 
> *Shame if these can't score higher in fire strike though*, was looking forward to a benching war


I believe they will be better in Fire Strike than GK110. For example, my 770s lose to very highly clocked 7950s in Fire Strike (for some reason), but destroy them in 3DMark 11. I think Fire Strike is not optimized for Nvidia, so I won't be surprised if they beat GK110 in it.


----------



## sikkly

Quote:


> Originally Posted by *s-x*
> 
> Going to be a Titan killer in price/performance.


I hope no one was stupid enough to buy a Titan for price/performance. Everyone knows that the more you pay for tech, the worse a deal you're usually getting; you pay exponentially for that last little bit of power.

Anyway, I'm interested in seeing how these cards do in the real world. Press releases are nice, but we never know until real people get them and see what they can do.


----------



## Raf Leung

So there will be an R9 290 and an R9 290X,
just like the 7950 and 7970?
And an R9 280 and R9 280X?


----------



## KrazyKap

Here is everything you can now know for certain:

http://anandtech.com/show/7368/amd-gpu-product-showcase-live-blog


----------



## Baghi

Quote:


> Originally Posted by *Yungbenny911*
> 
> I believe they will be better in firestrike than GK110, for example, my 770's loose to very highly clocked 7950's in firestrike (for some reason), but destroys them in 3dmark11. *I think firestrike is not optimized for Nvidia,* so i won't be surprised if they beat GK110 in that.


OR you could say 3DMark 11 is NVIDIA-biased? Please, it varies and you know it; a graphics card that benefits in one benchmark won't necessarily benefit in others as well.


----------



## Juub

Quote:


> Originally Posted by *Raf Leung*
> 
> So there will be a R9 290 and R9 290X?
> just like 7950 and 7970?
> and R9 280 R9 280X?


They only announced a 290X, no 290. The 280X is a rebadged 7970.


----------



## Newbie2009

Quote:


> Originally Posted by *Juub*
> 
> Only announced a 290x, no 290. 280x is a rebadged 7970.


I noticed this. The 290X was the only 290-series card at the start, but a 290 and a 290X both appeared on a later slide, so I must say I'm a little confused.


----------



## bossie2000

Seems like they've got their fingers in more and more game studios' pies!!


----------



## Yungbenny911

Quote:


> Originally Posted by *Baghi*
> 
> OR you could say 3DMark 11 is NVIDIA biased? Please, it varies and you know it; a graphics card benefits in one benchmark is not necessarily to benefit in others as well.


No... don't pull that card lol. The 770 and 7950 are not in the same league of GPUs. I have a buddy who has his 7970 @ 1,380 MHz, and my 770 only tops it by a few points in 3DMark 11 with higher clocks (which shows fairness), but he totally destroys my score in Fire Strike with about a 600+ point lead.

I think it's safe to assume that 3DMark 11 is fair.


----------



## Booty Warrior

Quote:


> Based on the second-generation Graphics CoreNext micro-architecture, the card is designed to outperform everything NVIDIA has at the moment, including a hypothetical GK110-based graphics card with 2,880 CUDA cores.


Funny, I've been watching this (awful) launch presentation since it started and that was never mentioned. Where exactly did they get that from?


----------



## Baghi

Quote:


> Originally Posted by *Yungbenny911*
> 
> I think it's safe to assume that 3dmark11 is fair.


I beg to differ, bro. You'll almost always see NVIDIA GPUs topping the chart against the AMD equivalent. FS-E gives the most unbiased scores IMO; even the CPU doesn't play much of a role in that particular benchmark. But anyway, to each his own.


----------



## szeged

nvidia card beats amd card

MUST BE BIASED BENCHMARK


----------



## wstanci3

Personally, I couldn't care less about a new flagship card. I came to hear about TrueAudio.


----------



## HeliXpc

Not to bash AMD, but I've had fewer issues running NVIDIA as far as build quality and drivers go. Of course build quality differs from manufacturer to manufacturer, but reference card vs. reference card, NVIDIA has better quality overall IMO. I have used AMD cards in the past and I might give them a shot again, but it's so hard to move from my trusty and stable GTX 780.


----------



## jeffro37

R9 280X (rebadged 7970) for $299. Not too bad. I'm guessing this is supposed to replace the 7870; if so, it's priced a little lower than the 7870 was at launch. Might have to grab one of these: around 7970 performance, but with lower power draw.


----------



## Artikbot

Quote:


> Originally Posted by *szeged*
> 
> nvidia card beats amd card
> 
> MUST BE BIASED BENCHMARK


I can see those Titans of yours starting to tremble...

Joking, I really couldn't care less.

But I certainly am interested in seeing what these cards will deliver, because next year, after a few price drops, I might be getting one.


----------



## thestache

I've learnt nothing from this other than how ugly the reference shroud will be. Also, why in the pictures do the R9 290 and R9 290X not have CrossFire fingers?


----------



## Phenomanator53

Quote:


> Originally Posted by *thestache*
> 
> I've learnt nothing from this other than how ugly the reference shroud will be. Also why in the pictures does the R9 290 and R9 290X not have crossfire fingers?


^ Typical reply from a fanboi. And who even buys reference cards these days other than OEMs anyway? As for the CrossFire connectors, they removed them because apparently they can get enough bandwidth through PCIe 3.0.


----------



## Taint3dBulge

Quote:


> Originally Posted by *thestache*
> 
> I've learnt nothing from this other than how ugly the reference shroud will be. Also why in the pictures does the R9 290 and R9 290X not have crossfire fingers?


It's an engineering sample... You can see there is something there, it's just not cut out.


----------



## Star Forge

Quote:


> Originally Posted by *Phenomanator53*
> 
> ^typical reply from a fanboi. and who even buys reference cards these days other than OEM's anyway?


Your statement is equally fanboi. We haven't learned anything relevant to gaming from them today. AMD is known to have higher compute than nVidia, but higher compute =/= better gaming performance, and they were mum about actual benchmarks for the bloody thing or other more relevant stats. So yeah, if you want to go brag all over OCN about AMD having higher compute on the R9 290X, be my guest, but that means nothing to the mainstreamers here.

I would be crying a little if I were a researcher though...


----------



## undeadhunter

Quote:


> Originally Posted by *Phenomanator53*
> 
> ^typical reply from a fanboi. and who even buys reference cards these days other than OEM's anyway? for the crossfire connectors, they removed it because apparently they can get enough bandwidth though PCIE 3.0


It's the truth... a worthless conference: no price on the card everyone was expecting, no benchies or detailed specs... sorry, but it's the sad truth.


----------



## PureBlackFire

I really hope this wasn't the end of such a hyped reveal. No specs, no performance, no price on the only new card in the stack. What a waste of time.


----------



## reqq

That Mantle thing sounds sick.


----------



## fateswarm

These are not the full specs. Don't insult us. We sat through a billion hours of torture and you pretend you got full specs out of it?


----------



## MunneY

Quote:


> Originally Posted by *fateswarm*
> 
> These are not the full specs. Don't insult us. We sat through a billion hours of torture and you pretend you got full specs out of it?


SRSLY THO.....

Wanna go play in traffic? Can't possibly be worse than what we just did.


----------



## Kuivamaa

Quote:


> Originally Posted by *Juub*
> 
> 
> 
> 8000 is quite terrible for a GTX Titan. Some guy here had 9,900 with a stock Titan and another one had almost 9,000 with a stock 780 on Fire Strike.




These are the leaked numbers. AMD claims ~8,000 for their flagship, probably benched on an FX-9590/AM3+ platform, which might explain the slight regression from the leaked info (every little bit counts, even at the CPU/chipset level), which was probably run on some Intel i7. The lower Fire Strike score didn't prevent it from doing pretty well vs. the Titan in gaming benchies. Still, the leaks might be using non-stock Radeons. We will know for sure in a week or two.


----------



## Stay Puft

Quote:


> Originally Posted by *Kuivamaa*
> 
> 
> 
> These are the leaked numbers. AMD claims ~8000 for their flagship ,probably benched on an FX-9590/AM3+ ,which might explain the slight regression from the leaked info (every little bit counts even on cpu/chipset level) which probably was on some intel i7. Lower firestrike didn't prevent it from doing pretty well vs the Titan on gaming benchies. Still leaks might be using non-stock radeons. We will know for sure in a week or two.


Exactly. I chalk up all the low numbers to the sole fact that they used an AMD processor to benchmark them.


----------



## Yvese

I'm really curious about pricing.

The 280X is supposed to be $299. That leaves quite a bit of room for the 290X. In a perfect world, the 290 will be $399 (the 7950 equivalent for next-gen) and the 290X will be $499 (the 7970 equivalent for next-gen).

I mean... if those aren't the prices, then I'd guess a $499 290 and a $599 290X, which makes no sense since that leaves the $399 bracket completely open.


----------



## Stay Puft

Quote:


> Originally Posted by *Yvese*
> 
> I'm really curious about pricing.
> 
> The 280x is supposed to be $299. That leaves quite a bit of room for the 290x. Now, in a perfect world, the 290 will be $399 ( 7950 equivalent for next-gen ), and the 290x will be $499 ( 7970 equivalent for next-gen )
> 
> I mean.. if those aren't the prices then I'd go with $499 290 and $599 290x which makes no sense since that leaves the $399 bracket completely open.


The 290 will battle the 770 at $399.99. The 290X is obviously a little slower than the 780, so $499 would be a perfect price, but $549 seems more realistic.


----------



## Juub

Quote:


> Originally Posted by *Stay Puft*
> 
> The 290 will battle the 770 at 399.99. The 290X is obviously alittle slower then the 780 so 499 would be a perfect price but 549 seems more realistic


The 280X most likely beats the 770, and the 290 was not announced. Only the 290X.


----------



## Yvese

Quote:


> Originally Posted by *Stay Puft*
> 
> The 290 will battle the 770 at 399.99. The 290X is obviously alittle slower then the 780 so 499 would be a perfect price but 549 seems more realistic


Uh, no.

The 7970 battles the 770, since the 770 is essentially a rebadged 680.

It's just Nvidia charging a higher price because people pay for the name.


----------



## thestache

Quote:


> Originally Posted by *Phenomanator53*
> 
> ^typical reply from a fanboi. and who even buys reference cards these days other than OEM's anyway? for the crossfire connectors, they removed it because apparently they can get enough bandwidth though PCIE 3.0


Just telling it like it is. We learnt nothing about the R9 290X today other than that it's ugly looking and possibly doesn't have CrossFire fingers. Could you link me to a source for that? It's interesting.

I'm also surprised by the card's display outputs and them abandoning the two mini-DisplayPorts, which sucks for Eyefinity.


----------



## cloudzeng

Quote:


> Originally Posted by *thestache*
> 
> Just telling it like it is. We learnt nothing about the R9 290X today other than its ugly looking and possibly doesn't have crossfire fingers. Could you link me to a source for that? It's interesting.
> 
> I'm also surprised by the display outputs of the card and them abandoning the two MDP, which sucks for eyefinity.


Yep, I think I remember reading that they could do CrossFire over PCIe. I'll try and find a source for you.


----------



## sugarhell

Quote:


> Originally Posted by *thestache*
> 
> Just telling it like it is. We learnt nothing about the R9 290X today other than its ugly looking and possibly doesn't have crossfire fingers. Could you link me to a source for that? It's interesting.
> 
> I'm also surprised by the display outputs of the card and them abandoning the two MDP, which sucks for eyefinity.


It has crossfire fingers...


----------



## iamhollywood5

No prices, no benches, only 1 new card... honestly pretty underwhelming.

HOWEVA

The audio stuff does sound pretty cool. Too bad I don't really care about audio... I'm perfectly satisfied with the audio coming from my motherboard going to my normal stereo speakers.

Mantle, on the other hand, is something worth talking about, and sadly probably the ONLY thing that truly excited me today. I'm assuming Mantle works on GCN 1.0 as well? It sure as hell had better... if they tell me Mantle works on the rebadged 280X but not my 7970s because of some software disabling, I will flip numerous tables.

As for the price, I could see the 290 being $399 with the 290X being $499, which would give Nvidia a real good kick in the 'nads. At the same time, I can also see the 290X easily being $599. 3,000 stream processors, a 512-bit interface, and the new audio tech? I think that's justified.


----------



## Stay Puft

Quote:


> Originally Posted by *Yvese*
> 
> Uh, no.
> 
> The 7970 battles the 770 since it's essentially a rebadged 680.
> 
> It's just Nvidia charging higher price because people pay for the name.


I don't see anyone buying a 770 over a 280X. Overall I'm happy with the new lineup. The 270X at $199.99 will be a great card.


----------



## Yvese

Quote:


> Originally Posted by *Stay Puft*
> 
> I dont see anyone buying a 770 over a 280X.


You'd be surprised how many people buy into the Nvidia name and gimmicks like PhysX despite AMD offering 3 games with their card.

A lot of people buy for the brand over value, which is truly sad.


----------



## AlphaC

Quote:


> Originally Posted by *Yvese*
> 
> You'd be surprised how many people buy into the Nvidia name and gimmicks like Physx despite AMD offering 3 games with their card.
> 
> A lot of people buy for the brand over value which is truly sad.


People pay a price premium for (perceived) quality.


----------



## ZealotKi11er

I want a new toy to play with. 4GB sounds nice.


----------



## Ha-Nocri

So we saw nothing... except some synthetic benches (but who really cares 'bout those) and the cooler design, which to me looks the best so far.


----------



## Johnny Rook

Quote:


> Originally Posted by *McMogg*
> 
> They said (when I watched) that it was the fastest single GPU that *THEY* had ever produced.
> You think if they were holding the fastest single GPU ever, they'd show it off - I think it'll be close between Titan and R9 290x, with the 290x a little behind atleast.


My thoughts exactly.
I saw the live stream, and at the end of the R9 290 series presentation, right after they said nothing about R9 290X performance and/or pricing, I thought to myself: "Nah! If you had it, you'd have shown it." More so when, right after, they announced an R9 290X + BF4 bundle PRE-ORDER.
Plus, from their own slides, I see the R9 290X Fire Strike score in the mid 7k-8k range. Well, my stock GTX 780 does 8,245 and scratches 10k @ 1,202 MHz.
So, I am not expecting a Titan killer but a GTX 780 competitor at best.

I wish I were wrong though; I would very much like to get one of those "Titan killer" AMD GCN2 chips for myself.


----------



## Brutuz

Quote:


> Originally Posted by *Newbie2009*
> 
> Or too late in the game to not have decided on clocks?


AFAIK clocks are one of the few things that can be changed very close to release; IIRC the HD 4850 had its clocks and vRAM size increased a month or two before launch.
Quote:


> Originally Posted by *Yvese*
> 
> You'd be surprised how many people buy into the Nvidia name and gimmicks like Physx despite AMD offering 3 games with their card.
> 
> A lot of people buy for the brand over value which is truly sad.


And how many people still believe nVidia has better drivers than AMD... They have more features, but that's really it these days.


----------



## Master__Shake

Quote:


> Originally Posted by *Baghi*


hope the aib partners dont screw up that awesome shroud with some useless stickers.


----------



## Johnny Rook

Quote:


> Originally Posted by *Master__Shake*
> 
> hope the aib partners dont screw up that awesome shroud with some useless stickers.


Of course not! They will screw it up with aftermarket coolers!
Very much like they did with the GTX 700 series stock cooler.


----------



## th3illusiveman

Quote:


> Originally Posted by *Yungbenny911*
> 
> No... don't pull that card lol. The 770 and 7950 are not in the same league of GPU's. I have a buddy that has his 7970 @ 1380Mhz, and my 770 only tops it by a few points in 3dmark11 with higher clocks (which shows fairness), but he tottally destroys my score in firestrike with about 600+points lead.
> 
> I think it's safe to assume that 3dmark11 is fair.


3DM11 is heavily Nvidia-biased and everyone knows it; Fire Strike is AMD-biased. Valley is neutral, or was till people started cheating with software hacks lol.


----------



## scyy

Quote:


> Originally Posted by *Yvese*
> 
> You'd be surprised how many people buy into the Nvidia name and gimmicks like Physx despite AMD offering 3 games with their card.
> 
> A lot of people buy for the brand over value which is truly sad.


Yes, everyone who buys nvidia buys it for the name and gimmicks. You figured it all out. No other possible reason there at all.


----------



## BigDaddyBleeder

AMD has stated that they are not going after the extreme enthusiast gamer. This means the Titan folks. They are going after the GTX 780 (or at least should be). So anyone hoping or expecting AMD to kill a Titan is dreaming.

Having said that though, if AMD is trying to compete with or beat the GTX 780, then I feel that they either don't have the card to do that in the R9 290X or they just completely failed with their presentation. If they were trying to tease us by withholding any actual data relating to the R9 290X, like actual benchmarks and the price, then it didn't work. The way they went about the presentation makes me feel that this card actually doesn't compete with the GTX 780. They talked about all the cool other things about the new cards, but not about whether it is actually at least as fast.

I'm hoping that it does compete though because, as many people have already stated, we need the competition in price and performance. We need AMD to be able to keep up with the other guys to keep the prices of these cards in check. $600+ is way too much for a card that can properly play new games the way the developers meant them to be played.

They have about one week to release actual benches of their flagship card. If not, then I will not be ordering their Battlefield 4 bundle, and I don't know why anyone would.

Fingers crossed on this one but I really don't feel good about it.


----------



## sugarhell

Who cares about the new gpu. Mantle is way more interesting.


----------



## BigDaddyBleeder

Quote:


> Originally Posted by *sugarhell*
> 
> Who cares about the new gpu. Mantle is way more interesting.


This is VERY true but does it actually matter in games? Did AMD show anything today that shows that Mantle makes buying a new AMD GPU a good idea?

AMD presented videos and PowerPoints today, not real information. I'm hoping that in the next few days they release real information about the new cards and not more of that fluff that I sat through for 2.5 hours today.


----------



## rcfc89

Quote:


> Originally Posted by *BigDaddyBleeder*
> 
> This is VERY true but does it actually matter in games? Did AMD show anything today that shows that Mantle makes buying a new AMD GPU a good idea?
> 
> AMD presented videos and PowerPoints today and not real information. I'm hoping that in the next few days they release real information about the new cards and not more of that fluff that I sat 2.5 hours though today.


After watching that train wreck today I won't be buying anything from AMD anytime soon. What a disaster.


----------



## jomama22

Quote:


> Originally Posted by *rcfc89*
> 
> After watching that train wreck today I won't be buying anything from Amd anytime soon. What a disaster.


lol, judging performance by the speaking ability of engineers. That makes sense.


----------



## HanSomPa

Quote:


> Originally Posted by *Brutuz*
> 
> And how many people still believe nVidia has better drivers than AMD...They have more features but that's really it these days.


Frame Pacing?


----------



## Redwoodz

Quote:


> Originally Posted by *Juub*
> 
> 
> 
> 8000 is quite terrible for a GTX Titan. Some guy here had 9,900 with a stock Titan and another one had almost 9,000 with a stock 780 on Fire Strike.


Quote:


> Originally Posted by *Ukkooh*
> 
> I find that image deceiving as the last bar looks blurred out meaning that it could go on. Time to wait for the reviews and then decide if I'll order a gtx 780 or this.


Quote:


> Originally Posted by *Alatar*
> 
> Pretty much every card on the market already did that
> 
> Shame if these can't score higher in fire strike though, was looking forward to a benching war


Quote:


> Originally Posted by *Alatar*
> 
> The leaked benches had a bone stock Titan.
> 
> And a bone stock Titan is extremely conservatively clocked and TDP limited....


Quote:


> Originally Posted by *szeged*
> 
> nvidia card beats amd card
> 
> MUST BE BIASED BENCHMARK


Quote:


> Originally Posted by *thestache*
> 
> I've learnt nothing from this other than how ugly the reference shroud will be. Also why in the pictures does the R9 290 and R9 290X not have crossfire fingers?


Quote:


> Originally Posted by *th3illusiveman*
> 
> 3DM11 is heavily Nvidia bias and everyone knows it, firestrike is AMD bias. Valley is neutral, or was till people started cheating with software hacks lol .


Let's just end this train of thought right now.
http://hwbot.org/submission/2409330_kingpin_3dmark___fire_strike_extreme_geforce_gtx_titan_7810_marks
The world record for Fire Strike Extreme is 7,810, set @ 1,408 MHz with a 3930K under LN2 @ 5.7 GHz. The highest AMD score (a 7970) is 6,133 @ 1,650 MHz.


There is no AMD bias.


----------



## lacrossewacker

Quote:


> *the card is designed to outperform everything NVIDIA has at the moment*, including a hypothetical GK110-based graphics card with 2,880 CUDA cores.


probably like how the FX8150 was said to outperform the i7-980X lol

I'm all for a push in technology, but I take AMD's word with a LARGE grain of salt.


----------



## BigDaddyBleeder

Quote:


> Originally Posted by *jomama22*
> 
> lol, judges performance on speaking ability of engineers. That makes sense.


What? Was rcfc89 talking about the presenters? This has nothing to do with the people presenting (at least for me). This has everything to do with what was presented.

Please tell us what they presented today that will make you want to purchase a new AMD GPU.


----------



## SniperOct

Quote:


> AMD announced the new Radeon R9 290X, its next-generation flagship graphics card. Based on the second-generation Graphics CoreNext micro-architecture, the card is designed to outperform everything NVIDIA has at the moment, including a hypothetical GK110-based graphics card with 2,880 CUDA cores. It's based on the new "Hawaii" silicon, with four independent tessellation units, close to 2,800 stream processors, and 4 GB of memory. The card supports DirectX 11.2, and could offer an inherent performance advantage over NVIDIA's GPUs at games such as "Battlefield 4". Battlefield 4 will also be included in an exclusive preorder bundle. *The card will be competitively priced against NVIDIA's offerings. We're awaiting more details.*


Does this mean Titan level prices?


----------



## mcg75

Quote:


> Originally Posted by *Yvese*
> 
> You'd be surprised how many people buy into the Nvidia name and gimmicks like Physx despite AMD offering 3 games with their card.
> 
> A lot of people buy for the brand over value which is truly sad.


Sorry, but product loyalty and market share aren't gained by gimmicks. They're gained by giving the majority of users a trouble-free experience.

Nvidia gained the market share lead because ATI didn't provide that for a lot of users, and AMD has to work very hard to get it back.

But don't be under the delusion that AMD wouldn't be doing the same thing Nvidia is now if they ended up as the market share leader.

AMD doesn't exist as a charity for you and others to have cheap GPUs.


----------



## lacrossewacker

Quote:


> Originally Posted by *SniperOct*
> 
> Does this mean Titan level prices?


They've been on record saying they're not putting anything in that "enthusiast" market, but then I think I read either today or yesterday that they are releasing an "enthusiastic" market GPU.


----------



## jomama22

Quote:


> Originally Posted by *lacrossewacker*
> 
> They've been on the record to say they're not putting anything in that "enthusiast" market, but then, I think I read either today or yesterday that they are releasing an "enthusiastic" market GPU.


They said they won't foray into the "ultra enthusiast" pricing category and are aiming for "enthusiast".


----------



## jomama22

Quote:


> Originally Posted by *mcg75*
> 
> Sorry but product loyalty and market share isn't gained by gimmicks. It's gained by giving the majority of users a trouble free experience.
> 
> Nvidia gained the market share lead because of ATI not providing that for a lot of users and AMD has to work very hard to get that back.
> 
> But don't be under the delusion that AMD would not be doing the same thing Nvidia is now if they end up as market share leader.
> 
> AMD doesn't exist as a charity for you and others to have cheap gpus.


It's more that AMD positioned themselves as the price/performance company rather than the "go all out, spare no expense" one, as Nvidia has. It has much less to do with which actually offers a better overall experience.


----------



## wstanci3

Quote:


> Originally Posted by *Redwoodz*
> 
> Let's just end this train of thought right now.
> http://hwbot.org/submission/2409330_kingpin_3dmark___fire_strike_extreme_geforce_gtx_titan_7810_marks
> The World record for Fire Strike Extreme is 7,810,@ 1408MHz and a 3930K under LN2 @5.7GHz. Highest AMD score (7970) is 6133 @ 1650MHz.


No, let's not. That world record is FireStrike Extreme. The graph is FireStrike Performance.


----------



## scorpscarx

Quote:


> Originally Posted by *jomama22*
> 
> They said they won't foray into the "ultra enthusiast" pricing category and are aiming for "enthusiast".


Which is exactly what makes me really hyped for this card, more details, and release date.

I'm expecting $550.

I'll probably get one at that price, if it's any higher than that I will probably wait for the 880's.


----------



## BankaiKiller

Yeah, most people, even on overclock.net, don't have Titans.


----------



## jomama22

Quote:


> Originally Posted by *wstanci3*
> 
> No, let's not. That world record is FireStrike Extreme. The graph is FireStrike Performance.


Why do I have a feeling that they are being ever so modest about that last card... I like how the "8000" score number is missing on the right. When it releases they will pull it and it will actually say "OVER 9000!!!!"


----------



## Quantium40

Why does AMD always have to drum up announcements like this, only to reveal pretty much zip? Ah well, my care level is pretty close to zero. I only care that the product they release is decent, so that I may buy it.


----------



## wstanci3

Quote:


> Originally Posted by *jomama22*
> 
> Why do I have a feeling that they are being ever so modest about that last card... I like how the "8000" score number is missing on the right. When it releases they will pull it and it will actually say "OVER 9000!!!!"


The ultimate ruse.


----------



## CCast88

Oh man, this is killing me. I want the prices of the R9 290 and 290X.

It's also making me feel completely undergeared for this card. I'm wondering if I would have to build a new system for it...


----------



## mohit9206

Does anyone have an idea whether the R7 250 will be more powerful than the 7750, and does it require a PCIe power connector from the PSU?


----------



## KrazyKap

Quote:


> Originally Posted by *mohit9206*
> 
> Does anyone have an idea whether the R7 250 will be more powerful than the 7750, and does it require a PCIe power connector from the PSU?


It is and it will.
https://amd.app.box.com/GPU14publicpreso

*NVM that link doesn't show anything and I am just guessing.


----------



## mohit9206

Is it faster than the Radeon 7750? I mean the new R7 250.


----------



## KrazyKap

I would say so since it seems to be on the same tier, but newer and all


----------



## KnightVII

The R9 290X will cost $1,000 or more.


----------



## KrazyKap

Quote:


> Originally Posted by *KnightVII*
> 
> The R9 290X will cost $1,000 or more.


Source, please? That's quite the claim.


----------



## ShooterFX

Well, according to the news at http://www.thinkcomputers.org/amd-radeon-r9-290x-flagship-graphics-card-detailed/, the card will go for $599:

"The R9 290X will launch in October for $599. Expect to see many announcements from other partners on different variants of the card."


----------



## Ha-Nocri

Quote:


> Originally Posted by *KrazyKap*
> 
> Source please? It's quite the claim


If it beats the Titan it probably will cost >= Titan


----------



## Testier

Quote:


> Originally Posted by *Ha-Nocri*
> 
> If it beats the Titan it probably will cost >= Titan


Titan's huge price is justified by its 1/3 DP. I somehow doubt R9 290X can offer the same computing performance.


----------



## jincuteguy

Quote:


> Originally Posted by *anticommon*
> 
> Or preorder them on october 3rd before launch day?


When is the launch day? Oct 5th?


----------



## Cy5Patrick

Quote:


> Originally Posted by *KnightVII*
> 
> The R9 290X will cost $1,000 or more.


No.
Quote:


> While no pricing has been announced for AMD's high-end GPU just yet, it is worth mentioning that Nekechuck did confirm to us that the company does not plan to release single-GPU cards in the $1,000 price range because AMD thinks that is such a small, niche market


Source


----------



## iatacs19

Where are the reviews?


----------



## jincuteguy

I'm sure Nvidia will announce something very soon now that AMD has announced the R9 290X. Nvidia will not let AMD take the top spot this holiday season.


----------



## fateswarm

Quote:


> Originally Posted by *jincuteguy*
> 
> I'm sure Nvidia will announce something very soon now that AMD has announced the R9 290X. Nvidia will not let AMD take the top spot this holiday season.


I was thinking about that the whole time we've known about this series. I suspect they may have calculated that the loss is very small, since around Q1 or Q2 of 2014 we get 20nm from both of them. Still, the holiday season is always a money grab and an offloading of old stock, especially before a new process node. If all else fails, NVIDIA may just drop prices 20% before Christmas so they hardly lose anything.


----------



## Stay Puft

Quote:


> Originally Posted by *KnightVII*
> 
> The R9 290X will cost $1,000 or more.


No, it won't be even close. AMD isn't going to rip off its customers like Nvidia's high-end offerings do.


----------



## ZombieJon

Quote:


> Originally Posted by *jincuteguy*
> 
> When is the launch day? Oct 5th?


Doubt it. A 2-day pre-order seems kinda useless.

There's a possibility the NDA will lift on launch day, like it did for Haswell.


----------



## Master__Shake

Here is a quote from our resident AMD spokesperson, Warsam71:
Quote:


> Originally Posted by *Warsam71*
> 
> Well, one thing I can tell you, and I'm echoing some of the statements we've made recently in interviews, that the price for the R9 290X is/will be more "consumer friendly"...so you won't have to cut too many corners


http://www.overclock.net/t/1429336/high-resolution-photos-of-amds-newest-gpus-r7-and-r9/0_20#post_20869215

And when I asked him if it was $500-650 he said "warm"... so unless he is really bad at hot and cold, I wouldn't expect $1,000 to be the MSRP.


----------



## CCast88

Actually, the Titan's price is high because we allow it to be; people are willing to pay that much for it. And as long as people keep buying at these prices, they will keep raising them each generation. The Titan WILL drop in price in the near future, just like every other card in the past has. Supply and demand.

Personally, I think $600 is WAY too much for an "enthusiast"-level GPU. This card should be launching at $399. Leave the $600 price tag to the dual-GPU cards, or the super/ultra enthusiast tier or whatever they want to call that market. We are getting taken advantage of, people...


----------



## Master__Shake

Quote:


> Originally Posted by *CCast88*
> 
> Actually, Titan price is high because we allow it to be and people are willing to pay that much for it. And as long as people keep buying at these prices, they will keep raising the price each generation. Titan WILL drop in price in the near future just like every other card in the past has dropped in price. Supply and Demand.


I bet you it won't; it will EOL at that $1,000 price tag come hell or high water.


----------



## CCast88

Quote:


> And when I asked him if it was $500-650 he said "warm"... so unless he is really bad at hot and cold, I wouldn't expect $1,000 to be the MSRP.


he also said
Quote:


> Originally Posted by *Warsam71*
> 
> LOL - you'll get me trouble.
> 
> As you can imagine I'm not allowed to expand on certain topics. But take a look at the presentation it does include prices for the rest of the R7 and R9s.


I'm taking this to mean the price will be as predicted: $500-600. They won't jump from a $400 price tag to $1,000 for the next two cards up.


----------



## Forceman

Quote:


> Originally Posted by *ZombieJon*
> 
> Doubt it. A 2 day pre-order seems kinda useless.
> 
> Possibility that NDAs will lift on launch day, like how it did for Haswell.


Just occurred to me that they are going to have to lift the NDA before the 3rd, unless they expect people to just preorder it on faith.
Quote:


> Originally Posted by *CCast88*
> 
> he also said
> I'm taking this as the price will be as predicted. $500 - 600. They won't jump from a $400 price tag to $1000 for the next 2 cards up


Maybe $300, $450, $600 for the 280X, 290, 290X?


----------



## fleetfeather

Quote:


> Originally Posted by *Raf Leung*
> 
> where do you pre-order it? i live in Australia


"We're expecting them to arrive towards late October and will be listing them up as soon as the ETA is confirmed" - PCCG


----------



## MxPhenom 216

All Nvidia needs to do is drop all the prices (780 and Titan), then release a full-blown GK110 card that doesn't hold anything back to beat out this R9 290X, and call it Titan Ultra or something.


----------



## fateswarm

Quote:


> Originally Posted by *MxPhenom 216*
> 
> Titan Ultra


It sounds like a detergent. The Titans were ancient figures of glory; what about GTX Hephaestus (the god of fire)?


----------



## sherlock

Quote:


> Originally Posted by *fateswarm*
> 
> It sounds like a detergent. The Titans were ancient figures of Glory, what about GTX Hephaestus (the god of Fire)?


GTX Hyperion


----------



## fleetfeather

Quote:


> Originally Posted by *fateswarm*
> 
> It sounds like a detergent. The Titans were ancient figures of Glory, what about GTX Hephaestus (the god of Fire)?


Fire-related names might send the wrong message to consumers...

unless it's a Galaxy GTX 780 Hall of Fire Edition lololol


----------



## fateswarm

Quote:


> Originally Posted by *sherlock*
> 
> GTX Hyperion


Hrm, " twelve Titans, the males were Oceanus, *Hyperion*, Coeus, Cronus, Crius, and Iapetus and the females-the Titanesses-were Mnemosyne, Tethys, Theia, Phoebe, Rhea, and Themis. "

Phoebe, it sounds so ominous in a good way, the evil sister of Phobos.


----------



## Tonza

I like how you guys are already comparing it to the Titan; its price doesn't have anything to do with performance anyway. Also, there is a card called the GTX 780. It is about 10% slower than the Titan and costs 619€ here in Finland (reference model), while the R9 290X is surely going to be 599€ minimum here (something like 599-649€ will be its final price). And from what we've seen of those slides, its Fire Strike performance is a little slower than a stock 780, which is quite underwhelming.

As for those saying Fire Strike is not optimized for Nvidia: AMD's 7970/7950 beat the GTX 680/670 in it because they have better compute performance, and Fire Strike uses that heavily.

You guys are simply asking too much from AMD's new chip, which is about 30% smaller than the GK110.


----------



## Stay Puft

Quote:


> Originally Posted by *MxPhenom 216*
> 
> All Nvidia needs to do is drop all the prices (780 and Titan), then release a full-blown GK110 card that doesn't hold anything back to beat out this R9 290X, and call it Titan Ultra or something.


I for one am sick of the high priced enthusiast cards from Nvidia. Almost bought titan and thankfully that didn't work out. What happened to the days of 399.99 top of the line enthusiast cards?


----------



## TooBAMF

Quote:


> Originally Posted by *Stay Puft*
> 
> I for one am sick of the high priced enthusiast cards from Nvidia. Almost bought titan and thankfully that didn't work out. What happened to the days of 399.99 top of the line enthusiast cards?


My first top-of-the-line card was the 6800 Ultra at $500. They've mostly been $500+ since, except for the GTX 285 IIRC.


----------



## fateswarm

Quote:


> Originally Posted by *Tonza*
> 
> I like how you guys already compare it to Titan, its price doesnt have anything to do with performance anyway. Also there is card called GTX 780? That is like 10% slower than Titan and it costs in here Finland 619e (reference model), AMD R9-290X in the other hand is gonna be for sure 599e minimum in here (something like 599-649e is for sure gonna be its final price in here). And what have seen from those slides, its performance is little slower in Firestrike than on stock 780, which is quite underwhelming.
> 
> Those also who are saying that Firestrike is not optimized for Nvidia.... AMD won on their 7970/7950 the GTX 680/670 because they have better compute performance and Firestrike uses that a lot.
> 
> You guys are simply asking too much from AMD new chip which is like 30% smaller than GK110 chip.


They claim to have 39% more transistors. That's quite impressive and hype-free, provided they didn't flat out lie.


----------



## th3illusiveman

Quote:


> Originally Posted by *Stay Puft*
> 
> No, it won't be even close. AMD isn't going to rip off its customers like Nvidia's high-end offerings do.


Are you an AMD fanboy again? I swear you hop over the fence so many times i don't know if it's sarcasm or a legitimate statement from you.
Quote:


> Originally Posted by *Tonza*
> 
> I like how you guys already compare it to Titan, its price doesnt have anything to do with performance anyway. Also there is card called GTX 780? That is like 10% slower than Titan and it costs in here Finland 619e (reference model), AMD R9-290X in the other hand is gonna be for sure 599e minimum in here (something like 599-649e is for sure gonna be its final price in here). And what have seen from those slides, its performance is little slower in Firestrike than on stock 780, which is quite underwhelming.
> 
> Those also who are saying that Firestrike is not optimized for Nvidia.... AMD won on their 7970/7950 the GTX 680/670 because they have better compute performance and Firestrike uses that a lot.
> 
> You guys are simply asking too much from AMD new chip which is like 30% smaller than GK110 chip.


A stock 7970 isn't very far off from a stock GTX 780... if you think an improved GCN architecture with ~2,800 cores, 44 ROPs and a 512-bit bus won't demolish a stock 780, then I have bad news for you. If the rumored specs are legitimate, this thing will run over stock GK110 offerings, including the Titan. The only thing that remains to be seen is how well they overclock, because I'm sure the power draw will be huge; much higher than GK110, that's for sure.


----------



## kot0005

Quote:


> The red team says that with Mantle, its new flagship GPU will "ridicule" the GeForce Titan


Source: http://www.maximumpc.com/amd_r9_290x_will_be_much_faster_titan_battlefield_4


----------



## malpais

Quote:


> Originally Posted by *Stay Puft*
> 
> I for one am sick of the high priced enthusiast cards from Nvidia. Almost bought titan and thankfully that didn't work out. What happened to the days of 399.99 top of the line enthusiast cards?


They realized people have more money than brains


----------



## szeged

Mantle will be exciting for those hardcore AMD gamers on BF4. I only played BF3 for a total of about 10 hours of multiplayer, so it isn't as exciting for me; what I am excited to see is, if it takes off and gets widely adopted, how it will affect other games.

Now, how the R9 290X overclocks and its maximum overclock potential in benchmarks, that's something I'm interested in. My Titans need a bit of competition; my 7970s and 780s just don't compare.

Also, I'm thinking the Titan will EOL at $1k USD as well, since Nvidia will probably gouge it to the very end, then drop the 780's price to compete with the 290X, lol.


----------



## fateswarm

Guys, seriously, settle down with Mantle; they said "900% faster" in a promotional slide and you believed it.

OpenGL and DirectX are already *as fast as a card can ever be once you've sent a shader and the relevant assets to the card*. They have obviously tested it in specialized conditions to make that impressive claim, and that's if the claim holds up at all.


----------



## Scorpion49

Mildly interested in this Mantle business, but kinda worried about the 290X. They show a slide saying it has a Fire Strike score of ~7,000; my 770 did that stock and my 780 easily went past 10,000. Since that is the only thing they have shown for performance, I'm wondering what the heck this Titan-beating talk is about. Does it only beat everything Nvidia has if you're running a game with their new API?


----------



## Clocknut

Quote:


> Originally Posted by *fateswarm*
> 
> Guys, seriously, settle down with Mantle, they said "900% faster" in a promotional slide and you believed it.
> 
> OpenGL and DirectX are already *as fast as a card can ever be once you've send a shader and the relevant assets to the card*, they have obviously tested it in specialized conditions to make that impressive claim, and that if.


Anything that brings an additional 10-20% FPS is more than enough reason to use Mantle over DirectX in a top title like BF4.

Let's not forget all EA titles use Frostbite 3, so there are quite a lot of upcoming titles that could use Mantle from just that one developer.


----------



## Ashuiegi

I'm dubious, as my HD 7970 Matrix does 7,800 in Fire Strike and this would do only 7,000-8,000 stock... I don't think that's very good, and sadly not better than the overpriced Nvidia cards.


----------



## fateswarm

Quote:


> Originally Posted by *Clocknut*
> 
> Anything that bring additional 10-20% fps is more than enough reason to use Mantel over DirectX on a top title like BF4.
> 
> Lets not forget all EA titles use frostbite 3, so there is quite a lot of title coming to use mantel there alone from just 1 single developer.


In any case, all those claims require specialized cases, and that's if we assume the claims are even true. A graphics card nowadays, by the nature of the DirectX and OpenGL APIs themselves, spends almost all of its time running work that has been offloaded to it and NOT communicating with the CPU, and that's precisely what their committees have been doing for the last 20 years. Any novice programmer following the OpenGL changelog, for example, knows that it's offload, offload, and more offload.
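The argument above, that gains like Mantle's can only come from trimming CPU-side submission cost, not the GPU work itself, can be sketched with a toy model. This is not real API code; the per-call overhead and GPU render time below are invented purely for illustration:

```python
# Toy model of a frame: CPU-side submission cost scales with the number
# of API calls, while the GPU-side rendering work is fixed regardless
# of how it was submitted. All numbers are made up for illustration.

def frame_time_ms(draw_calls, per_call_overhead_ms, gpu_work_ms):
    """Total frame time = CPU submission overhead + GPU render time."""
    return draw_calls * per_call_overhead_ms + gpu_work_ms

# 5,000 individual draw calls at 2 microseconds of driver overhead each,
# on top of 10 ms of actual GPU rendering work...
naive = frame_time_ms(5000, 0.002, 10.0)

# ...versus the same scene batched into 50 submissions.
batched = frame_time_ms(50, 0.002, 10.0)

print(f"naive: {naive:.1f} ms, batched: {batched:.1f} ms, "
      f"speedup: {naive / batched:.2f}x")
```

With these invented numbers the speedup is under 2x, and it shrinks as the GPU's own work grows relative to the call overhead; a headline figure like "900% faster" would require a frame that is almost entirely submission overhead, which is the "specialized conditions" point being made above.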


----------



## M1kuTheAwesome

HOLY UNICORN PORN! That's it, I'm selling my kidney for this thing...
On a more serious note, if the benches are as good as they're saying and the price is low, things on the market are gonna get interesting...


----------



## Clocknut

Quote:


> Originally Posted by *szeged*
> 
> Mantle will be exciting for those hardcore amd gamers on bf4, i only played bf3 for like a total of 10 hours on multiplayer, so it isnt as exciting for me, what i am excited for is if it takes off and gets widely accepted how it will effect other games
> 
> now, on to how the r9 290x overclocks and its maximum overclock potential in benchmarks, now thats something im interested in, my titans need a bit of competition, my 7970s and 780s just dont compare
> 
> 
> 
> 
> 
> 
> 
> 
> 
> also, thinking the titan will EOL at $1k usd aswell since nvidia will probably gouge it to the very end, then drop the 780s price to compete with 290x lol


The TITAN is highly likely to go EOL. The 290X is already over the 250W TDP limit, so Nvidia is more than likely to release a GTX 785/790 to retaliate; maybe 2,688 cores @ 1GHz with 7GHz memory and >250W TDP. That's not hard for Nvidia: several months after the TITAN's release, GK110 is probably binning better than it did back then.

However, if Mantle provides a serious performance bump for AMD, Nvidia will really need to work hard to overcome that gap with much more powerful hardware, the same way they did with the GeForce 2 GTS vs the 3dfx Voodoo 5 on Glide.

Honestly, I am expecting just a 10-20% performance bump at best from Mantle. It won't be a super bump; not 100%, lol.

I am just glad I picked the 7790 over Nvidia's GTX 650 Ti.


----------



## Alatar

Quote:


> Originally Posted by *Scorpion49*
> 
> Mildly interested in this Mantle business, kinda worried about the 290X. They show the slide saying it has a firestrike score of 7000, my 770 did that stock and my 780 easily went past 10,000. Since that is the only thing they have "show" for performance I'm wondering what the heck this titan-beating talk is about. Does it only beat everything Nvidia if you're running a game with their new API?


The slides were actually showing ~8,000. And around 17% faster than the refreshed 1,070 MHz 7970 GHz Edition that they now call the R9 280X...


----------



## Forceman

Quote:


> Originally Posted by *th3illusiveman*
> 
> A stock 7970 isn't very far off from a stock GTX780...


Sure, what's 20+% among friends.


----------



## Stay Puft

Quote:


> Originally Posted by *th3illusiveman*
> 
> Are you an AMD fanboy again? I swear you hop over the fence so many times i don't know if it's sarcasm or a legitimate statement from you.
> A stock 7970 isn't very far off from a stock GTX780... if you think an improved GCN architecture with ~2800 cores, 44 ROPs and a 512bit bus won't demolish a stock 780 then i have bad news for you. If the rumored specs are legitimate this thing will run over stock GK110 offerings including the titan, the only thing that remains to be seen is how well they overclock because i'm sure the power draw will be huge much higher than GK110 that's for sure.


Everyone needs to stop calling each other fanboys. I never have been and never will be a fanboy. I like price/performance and will always go for the best bang for my buck. I'm just sick of the high prices; I can easily afford them, but I simply refuse to pay them.


----------



## szeged

Quote:


> Originally Posted by *Clocknut*
> 
> TITAN will highly likely to go EOL. 290X is already over 250w TDP limit. So Nvidia is more than likely to release GTX785/790 to retaliate, may be 2688 @ 1GHz @ 7Ghz memory, >250w TDP. it is not hard for Nvidia as they already a few months after TITAN release, GK110 now would probably have better binned than TITAN by now.
> 
> however if mantel are providing serious performance bump to AMD, Nvidia will really need work really hard to overcome this gap with a much more superior hardware. The same thing how they did with Geforce 2 GTS vs 3dfx Voodoo 5 on glide.
> 
> Honestly I am looking for just 10-20% performance bump at best for mantel. It won be a super bump. not 100% lol
> 
> I am just glad I pick 7790 over Nvidia's GTX650Ti.


Yeah, I'm not expecting Mantle to be a 100% increase in games like AMD is hinting at; probably 10 to 20%, maybe. Still a nice performance increase, though.

The Titan will most likely EOL within the coming 5 months, probably still at the $1k price, lol. Hopefully it drops the price on 780s, though, because I know lots of friends who want 780s but can really only get two if they are around the $500 price range. But even then, the 290X might be the better option; it all depends on how Mantle turns out and how the 290X performs compared to the GK110 cards.


----------



## glina

Quote:


> Originally Posted by *Scorpion49*
> 
> Mildly interested in this Mantle business, kinda worried about the 290X. They show the slide saying it has a firestrike score of 7000, my 770 did that stock and my 780 easily went past 10,000.


They probably tested with AMD CPUs.


----------



## Blackops_2

Quote:


> Originally Posted by *Stay Puft*
> 
> Everyone needs to stop with calling each other fanboys. I never have and never will be a fanboy. I like Price/Performance and will always go the way for best bang for my buck. I'm just sick of the high prices. I can easily afford them but I simply just refuse to pay them.


I'll even admit it does seem your attitude has changed a bit regarding AMD. Though that doesn't make you a fanboy.

So no one got out of there and leaked some info? My feeling is that I wasted two hours of my time to watch the announcement of a naming scheme we already knew about, and the one thing we cared about (the price) wasn't shown. I didn't even need benches; I just wanted to know the price of the 290X. It's frustrating when you think about it.


----------



## Chrono Detector

I never thought AMD would use a 512-bit memory bus with GDDR5. Then again, GDDR5 has been out for a while, and it's about time they decided to use a 512-bit memory bus again; I'm interested to see how this performs. Hopefully it won't be a flop like the 2900 XT.


----------



## Clocknut

Quote:


> Originally Posted by *fateswarm*
> 
> In any case, all those claims require specialized cases, and that's if we assume the claims are even true. A graphics card nowadays, by the nature of the DirectX and OpenGL APIs themselves, spends almost all of its time running work that has been offloaded to it and NOT communicating with the CPU, and that's precisely what their committees have been doing for the last 20 years. Any novice programmer following the OpenGL changelog, for example, knows that it's offload, offload, and more offload.


With Frostbite 3 fully supporting it, this is already a very good head start. I do not expect these optimizations, or a low-level, close-to-the-metal API, to bring the kind of performance gain we see on consoles.

This is a more specialized API than DirectX, but less specialized than what we see on consoles, since AMD needs it to work on all future GCN architectures.

And I don't think AMD would bother building this API for a puny <10% gain. It should give GCN at least 10%, and as much as 20%, to be worth all the trouble of implementing it.

IMO, a 10% gain is already a pretty good reason to use Mantle if you have a GCN GPU. A 10-20% gap is pretty significant for Nvidia, because they would have to make a GPU 20% more powerful to get the same performance in DirectX.


----------



## Brutuz

Quote:


> Originally Posted by *Testier*
> 
> Titan's huge price is justified by its 1/3 DP. I somehow doubt R9 290X can offer the same computing performance.


GCN is great at compute; the HD 7970 GHz isn't actually that far behind the Titan a lot of the time. This will probably beat or at least match a Titan in compute performance. I'm not even going to bother commenting on the 3DMark number because, well, it's not a real game, and the figure shown is a floor, not an exact score. We also don't know clocks or overclocks; it could easily end up like the HD 7970/HD 7950, slower at stock but coming a lot closer overclocked.
Quote:


> Originally Posted by *Ashuiegi*
> 
> i m dubious as my hd7970 matrix does 7800 in fire strike , this would do only 7000-8000 stock ,......, don't think it s that good and sadly not better then overpriced nvidia cards


Everyone seems to be ignoring the fact that it's clearly >7,700, not equal to it... You can clearly see that the top of the bar is shaded out, that the graph is fairly imprecise, and that on the other GPUs' slides all the numbers are marked as "Greater Than".
AMD is just saying, "You'll definitely get this level of performance, and maybe more."

I'm hoping this will be around $500-$550...I wanna get two and start playing with Eyefinity.


----------



## Clockster

Well, I have put my 7990s up for sale. Although the event was piss poor, I have high hopes for this card.
Hoping for $550, expecting $630 or something like that.


----------



## Baghi

Quote:


> Originally Posted by *th3illusiveman*
> 
> Are you an AMD fanboy again? I swear you hop over the fence so many times i don't know if it's sarcasm or a legitimate statement from you.


Actually, it's the way he talks. Whenever he speaks in favor of a particular company (be it Intel/NVIDIA or AMD) it makes him look like a fanboy, which doesn't seem to be the case after this statement of his.


----------



## bigtonyman1138

Posted this in the other thread, but figured I'd post it here as well since it relates to the discussion. I was still semi-tempted to get a 780 after the announcement, but this article got rid of that notion in a heartbeat.

[Maximum PC] AMD: R9 290X Will Be "Much Faster Than Titan in Battlefield 4"


----------



## Baghi

"Much Faster Than Titan in Battlefield 4" Lol, meaning my GCN-based HD 7850 will also be faster than the GTX 760 in Battlefield 4 as well, because of Metal. It's just a marketing stunt, nothing much.


----------



## bigtonyman1138

Quote:


> Originally Posted by *Baghi*
> 
> "Much Faster Than Titan in Battlefield 4" Lol, meaning my GCN-based HD 7850 will also be faster than the GTX 760 in Battlefield 4, because of Mantle. It's just a marketing stunt, nothing more.


If other developers releasing console ports sign on, though, it could be a huge advantage. It really depends on how many games make use of it. Seems like a solid offering regardless, as they hinted at the $599 price at the bottom of the article. As long as it's cheaper and consistently beats the 780, I can't really complain.


----------



## Kipsta77

Wonder how it'll compare to TITAN.


----------



## specopsFI

So it will beat the Titan... once you benchmark it with a proprietary API.

Just so you know, AMD: that's a great way to kill any goodwill you've collected over the years by supporting open APIs.

That's all there was to this announcement: two proprietary APIs. What a complete waste of time.


----------



## bigtonyman1138

Quote:


> Originally Posted by *specopsFI*
> 
> So it will beat the Titan... once you benchmark it with a proprietary API.
> 
> Just so that you know, AMD: that's a great way to kill any goodwill you've collected along the years by supporting open API's.
> 
> That's all there was to this announcement: two proprietary API's. What a complete waste of time.


Can't really blame them, though. If you had the ability to leverage the market the way Mantle looks like it could, wouldn't you go through with it?


----------



## szeged

If you look at it as a big corporation trying to maximize profits, then yes, pushing Mantle as the go-to when benching cards makes sense.

Looking at it as someone from reality, it's kinda lame.


----------



## Baghi

Quote:


> Originally Posted by *specopsFI*
> 
> So it will beat the Titan... once you benchmark it with a proprietary API.
> 
> Just so that you know, AMD: that's a great way to kill any goodwill you've collected along the years by supporting open API's.
> 
> That's all there was to this announcement: two proprietary API's. What a complete waste of time.


It's not a proprietary program like NVIDIA PhysX. Speaking of PhysX, @AMD, a GTX TITAN will "ridicule" two of those 290Xs IF PhysX is enabled.

Hope I'm not overstating things.


----------



## fateswarm

Quote:


> Originally Posted by *Kipsta77*
> 
> Wonder how it'll compare to TITAN.


I believe the most reliable piece of promotional info right now is the transistor count, assuming of course they didn't lie. It's supposed to be 39% more than the 7970's, which is pretty good, though of course it can't be directly 39% faster, for various reasons like architecture, bandwidth, etc.
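To put a rough number on that claim, here's a quick sanity check. The ~4.31 billion figure for Tahiti is the commonly published count; the 39% uplift is just the slide's claim, and the linear-scaling caveat is exactly the point made above:

```python
# Rough sanity check, not official numbers: Tahiti (HD 7970) is publicly
# listed at ~4.31 billion transistors; the slide claims Hawaii has 39% more.
tahiti_transistors = 4.31e9
hawaii_estimate = tahiti_transistors * 1.39

print(f"Estimated Hawaii transistor count: {hawaii_estimate / 1e9:.2f} billion")

# Naive linear scaling would suggest +39% performance, but real gains are
# lower: part of the budget goes to non-shader logic (ROPs, audio DSP, I/O),
# and clocks/bandwidth don't scale with transistor count.
```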


----------



## Kipsta77

Quote:


> Originally Posted by *fateswarm*
> 
> I believe the most reliable of the promotional info right now is the transistor count, assuming of course they didn't lie. They are supposed to be 39% more than 7970's which is pretty good, though of course it can't be directly 39% faster because of various reasons like architecture, bandwidth(s), ..


Major factors other than speed are noise and driver support, the things I can't depend on with AMD cards (yet).


----------



## Johnny Rook

Quote:


> Originally Posted by *Baghi*
> 
> It's not a proprietary program like NVIDIA PhysX. Speaking of PhysX, @AMD, a GTX TITAN will "ridicule" two of those 290Xs IF PhysX is enabled.
> 
> Hope I'm not overstating things.


For years I've heard the majority of ATI/AMD card owners - I was one for 5 years - say that NVIDIA card owners shouldn't turn on PhysX in software like 3DMark Vantage or games like Borderlands 2 when "fairly" benchmarking NVIDIA cards against ATI/AMD cards. I just can't wait to see what that majority says about not using Mantle to "fairly" test the cards. I think I'll be quite entertained by the discussion...

Anyway, I really hope the R9 290X beats and "ridicules" the TITAN for €550-600. I will get one for sure!


----------



## mtcn77

Not many have noticed "Forward+", it seems... Graphics card performance scaling like the DiRT series, anyone?


----------



## Baghi

Those were the good days: "NVIDIA's drivers are better," "AMD has superior image quality and build quality."


----------



## nz3777

I must say I like the design of the card itself. Looks like a fixed-up version of the 5000/6000 series, and I'm glad they stuck with the red and black theme. These are considered the reference coolers, no? ... 3,000 stream processors, wow, just wow!


----------



## Moragg

Quote:


> Originally Posted by *Johnny Rook*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Baghi*
> 
> It's not a proprietary program like NVIDIA PhysX. Speaking of PhysX, @AMD, a GTX TITAN will "ridicule" two of those 290Xs IF PhysX is enabled.
> 
> Hope I'm not overstating things.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have heard for years the majority of ATI cards owners - like I was for 5 years -, AMD cards owners saying nVidia cards owners couldn't turn on PhysX in software like 3D Mark Vantage or in games like Borderlands 2 to "fairly" benchmark the nVIDIA cards performance vs ATI/AMD cards performance. I just can't wait to see what that majority will say about not using Mantle to "fairly" test the cards. I think I'll be quite entertained with the discussion...
> 
> Anyways, I really hope the R9 290X beats and "ridicules" the TITAN for €550-600. I will get one for sure!

Wasn't PhysX an extra feature set that negatively impacted performance? Whereas the only thing Mantle does is optimize how the GPU is used.

I swear everyone on here has been asking for better software; AMD claims to have delivered it with Mantle. So what's the issue?


----------



## malmental

Quote:


> Originally Posted by *Stay Puft*
> 
> No it won't be even close. Amd isn't going to ripoff it's customers like Nvidias high end offerings


Quote:


> Originally Posted by *th3illusiveman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Stay Puft*
> 
> No it won't be even close. Amd isn't going to ripoff it's customers like Nvidias high end offerings
> 
> 
> 
> Are you an AMD fanboy again? I swear you hop over the fence so many times i don't know if it's sarcasm or a legitimate statement from you.

^^
*LOL you two...
Pot meet kettle..!*

Quote:


> Originally Posted by *kot0005*
> 
> Quote:
> 
> 
> 
> The red team says that with Mantle, its new flagship GPU will "ridicule" the GeForce Titan
> 
> 
> 
> Quote:
> 
> 
> 
> Source: http://www.maximumpc.com/amd_r9_290x_will_be_much_faster_titan_battlefield_4
> 
> 
> 
> 
> 
> he looks like Linus sucking cooks again huh.?


----------



## Baghi

TPU edited their article to say 2,800 stream processors, and shamelessly removed the 512-bit interface completely. Two rumors are down already.


----------



## Alatar

"Close to 2800 steam processors"

Either shoddy editing or there's something weird going on with Hawaii.


----------



## Moragg

Quote:


> Originally Posted by *Alatar*
> 
> "Close to 2800 steam processors"
> 
> Either shoddy editing or there's something weird going on with Hawaii.


AMD needs to hurry up and give us specs. 512-bit bus (likely), 3000+ shaders (unlikely), 2816 (possible), 2500 (very possible). Mantle isn't going to be out for two months; we need to know why we shouldn't wait until it and its benchmarks arrive. Looks certainly aren't a reason. And did they explain what the stock cooling solution is, either?


----------



## veyron1001

Quote:


> Originally Posted by *Brutuz*
> 
> afaik clocks are one of the few things that can be changed very close to release, iirc the HD4850 had its clocks and vRAM size increased a month or two before launch.
> And how many people still believe nVidia has better drivers than AMD...They have more features but that's really it these days.


Well they are better at destroying cards with their drivers.


----------



## Oubadah

..


----------



## taafe

NVIDIA's answer to this is the Titan Ultra, which is reportedly being released in late 2013... although I bet AMD would win on pricing.


----------



## Oubadah

..


----------



## Shiveron

Quote:


> Originally Posted by *Oubadah*
> 
> I can see this throwing a spanner in the works of my SSD scheme (again). But damn, if that card isn't ugly. Looks like Asus have had their hands on it. Why can't they take a leaf out of Nvidia's book (Titan/690).
> 
> I hope it has no DX9 issues. That's why I never got the 7970 - alleged issues with DX9. I want performance in new games, but if it can't handle old games competently then AMD knows where they can shove it.


I still play plenty of old DX9 games with my 7970. It's not that the card has problems with DX9; it's that most people don't realize DX11 doesn't include the libraries from DX9 and below, and with all these new machines, people just install the most up-to-date version. You have to actually go download the DirectX 9.0c runtime and install that as well, and then everything works great.


----------



## taafe

Quote:


> Originally Posted by *Oubadah*
> 
> I'd like to see evidence of this "Titan Ultra".


Try Google, because I'm on my phone and can't send links for some reason. There's talk of a Titan Ultra and a Titan LE. OK, so there's no release date, but IMO it will be released to compete with the R9 290X late this year.


----------






## TooBAMF

Isn't the Titan LE the GTX 780? If not, what purpose does it serve? A Titan Ultra with the full 2,880-core GK110 is possible, since that chip already exists in the Quadro line, but it won't compete with the buzz surrounding AMD's Mantle, and especially not at $1,000+.

If I'm NVIDIA, a Titan Ultra doesn't solve the problem. People thought AMD's announcement was a fail until Mantle was mentioned. A Titan Ultra isn't going to take any hype away from Mantle unless Mantle only adds 10-15%, which NVIDIA could match with the extra shaders of a full GK110.


----------



## wstanci3

Quote:


> Originally Posted by *TooBAMF*
> 
> Isn't Titan LE the GTX 780? If not, then what purpose does it serve? Titan Ultra with full 2880 GK110 is possible since it already exists in the Quadro line, but it won't compete with the buzz surrounding AMD's Mantle, and especially not at $1000+
> 
> If I'm Nvidia, Titan Ultra doesn't solve the problem. People thought AMD's announcement was fail until Mantle was mentioned. Titan Ultra is not going to take any hype away from Mantle unless Mantle only adds 10-15% which Nvidia could add with more shaders in a full GK110.


True enough. While NVIDIA can throw more horsepower at the market with a Titan Ultra, it wouldn't solve the problem. The problem isn't that we lack horsepower for games; it's the utilization of that power. Proper optimization should be the focal point for the new generation of consoles, which would in turn give us better PC ports.
My only fear is that AMD won't push this onto developers and it fails before it begins, or that AMD keeps it as their own proprietary feature. Time will tell.


----------



## Rahulzz




----------



## kot0005

XFX R9 280X

Source: http://videocardz.com/46097/xfx-radeon-r9-280x-double-dissipation-pictured


----------



## TooBAMF

Quote:


> Originally Posted by *Rahulzz*


I just noticed it says "Winning at all the *major* price points," yet the Fire Strike results don't show it winning against the GTX 780. Either AMD doesn't consider $650 "major," or they mislabeled the 290 as a 290X and left out the real 290X. They could mean because of Mantle, but then why show Fire Strike results?


----------



## wstanci3

Quote:


> Originally Posted by *kot0005*
> 
> XFX R9 280X
> 
> Source: http://videocardz.com/46097/xfx-radeon-r9-280x-double-dissipation-pictured


XFX?
Ugh, I'll pass.


----------



## kot0005

Quote:


> AMD's new line of cards ... can't Crossfire them on pci-e 2.0 motherboards


Source: https://linustechtips.com/main/topic/59975-amds-new-line-of-cards-cant-crossfire-them-on-pci-e-20-motherboards/#entry814022

Well, this will make a lot of people angry... so you'd need a PCIe 3.0 CPU and motherboard...


----------



## TooBAMF

Quote:


> Originally Posted by *kot0005*
> 
> Source: https://linustechtips.com/main/topic/59975-amds-new-line-of-cards-cant-crossfire-them-on-pci-e-20-motherboards/#entry814022


I assume they just mean the 290 and 290X, since the others are rebrands? Finally a reason for Sandy Bridge owners to upgrade, though...


----------



## Mas

Quote:


> Originally Posted by *kot0005*
> 
> Source: https://linustechtips.com/main/topic/59975-amds-new-line-of-cards-cant-crossfire-them-on-pci-e-20-motherboards/#entry814022
> 
> well, this will make a lot of people angry... So you need a PCI E 3.0 CPU and mobo...


Well, guess I'll be waiting on green team response then.


----------



## BradleyW

Quote:


> Originally Posted by *kot0005*
> 
> Source: https://linustechtips.com/main/topic/59975-amds-new-line-of-cards-cant-crossfire-them-on-pci-e-20-motherboards/#entry814022
> 
> well, this will make a lot of people angry... So you need a PCI E 3.0 CPU and mobo...


Will I be able to CF? I have X79 and the slots do show PCI-E 3.0 in GPU-Z!


----------



## kot0005

Quote:


> Originally Posted by *TooBAMF*
> 
> I assume they just mean 290 and 290X since the others are rebrands? Finally a reason for Sandy Bridge owners to upgrade though...


The new cards don't have a CrossFire finger; the rumor is that they communicate over the PCIe slots. It hasn't been confirmed by AMD that they require PCIe 3.0, but if they did, that would be bad. I'm guessing lots of people, including myself, prefer the i5-2500K over the new-gen Intel CPUs because of all the heat issues from Intel not soldering the silicon to the metal plate.
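For a sense of why the slot generation would matter if CrossFire traffic really does go over PCIe, here are the standard per-lane figures (spec numbers, not anything AMD has stated):

```python
# Standard PCIe per-lane throughput: gen 2.0 runs at 5 GT/s with 8b/10b
# encoding, gen 3.0 at 8 GT/s with 128b/130b encoding.
def lane_bandwidth_gbytes(gt_per_s, encoding_efficiency):
    # GT/s * encoding efficiency = usable Gb/s per lane; divide by 8 for GB/s
    return gt_per_s * encoding_efficiency / 8

pcie2_x16 = lane_bandwidth_gbytes(5.0, 8 / 10) * 16     # 8.0 GB/s
pcie3_x16 = lane_bandwidth_gbytes(8.0, 128 / 130) * 16  # ~15.75 GB/s

print(f"x16 slot: PCIe 2.0 = {pcie2_x16:.1f} GB/s, PCIe 3.0 = {pcie3_x16:.2f} GB/s")
```

So a 3.0 slot carries roughly double the usable bandwidth of a 2.0 slot, which is the only plausible reason bridge-less frame compositing would care about the slot generation.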


----------



## TooBAMF

Quote:


> Originally Posted by *BradleyW*
> 
> Will I be able to CF? I have X79 and the slots do show PCI-E 3.0 in GPU-Z!


AMD has always supported PCIe 3.0 on X79, AFAIK. Also, LOL if they don't support it on X79; that would be suicide in the ultra-enthusiast market. Then again, they said they aren't going for that.


----------



## kot0005

Quote:


> Originally Posted by *BradleyW*
> 
> Will I be able to CF? I have X79 and the slots do show PCI-E 3.0 in GPU-Z!


You'll need a CPU that can provide the bandwidth for the PCIe 3.0 ports. I'm not sure about X79 boards; I'm only aware of the socket 1150 and 1155 CPUs.


----------



## Jared2608

Am I the only one who thinks that graph shows the R9 290X getting 8000 in Fire Strike? Its last red square clearly ends above the 7000 line, and going by the way the other blocks work, that block represents 8000. There just isn't a label saying so. It could be 8000-ish and they just didn't label it.


----------



## whtchocla7e

This ain't the first time a manufacturer has broken compatibility. Progress, people, get with it.


----------



## TooBAMF

Quote:


> Originally Posted by *Jared2608*
> 
> Am I the only one that thinks that graph shows the R9-290X getting 8000 in Firestrike? Clearly it's last red square ends above the 7000 line, and if you go on the way the other blocks work, then that block represents 8000. There just isn't a label saying so. It could be 8000ish and they just didn't label it.


I was under the impression that's what most people thought. There are eight boxes, so 8000 is the only logical conclusion, unless it's over 8000 but cut off. The point is that *only* 8000 is kind of disappointing given the scores for the 780.
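The box-counting argument spelled out, for what it's worth (the 1,000-points-per-box scale is an assumption read off the bars that do carry labels):

```python
# Hypothetical reading of the slide: each shaded box on a bar appears to
# span 1,000 Fire Strike points, inferred from the labeled bars.
def estimate_score(shaded_boxes, points_per_box=1000):
    return shaded_boxes * points_per_box

# Eight boxes shaded on the R9 290X bar suggests roughly:
print(estimate_score(8))  # 8000
```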


----------



## Jared2608

I've seen people elsewhere say that since it's not labeled, it doesn't count. It could be 8000+ for all we know. It seems logical to me that it's saying 8000, but in my time on the internet I've come to the conclusion that logic isn't always applied.


----------



## kot0005

Or maybe the Fire Strike score is TBA, since the arrow points upward to infinity and there's no numbering after 7000.


----------



## raghu78

here is an interesting tidbit.

http://techreport.com/news/25428/driver-fix-for-crossfire-eyefinity-4k-frame-pacing-issues-coming-this-fall

"In our talk, Koduri acknowledged that the frame delivery issues with CrossFire were "unfortunate," and he expressed a desire to make things right. Koduri told me AMD is working on a fix for CrossFire and single-large-surface display configurations, like Eyefinity and tiled 4K monitors, for current Radeon graphics cards. He said AMD plans to deliver a driver with the fix this fall.

AMD's newest Radeon GPU, the "Hawaii" chip that will power the Radeon R9 290 and 290X cards announced earlier today, will of course be a top priority, as well. *Although we can't yet divulge too many details, we expect Hawaii-based graphics cards to arrive with a very capable solution for CrossFire frame compositing and pacing already in place.*

Some of the new Radeon cards announced today that are based on older GPUs like Tahiti and Pitcairn will presumably have to wait for the fall driver release in order to see this issue resolved."

So is that a hint at a hardware-based CrossFire frame-pacing solution in Hawaii? If it is, then niceeeeee.


----------



## Jared2608

I like reading about a new tech, and then I cry a little inside knowing I can't afford it...


----------



## darkstar585

Quote:


> Originally Posted by *kot0005*
> 
> OR may be the firestrike score is TBA as the arrow points upwards to infinity and there's no numbering after 7000


It has to be! I mean, I can clear 7000 with my non-GHz Edition 7970 with just a modest overclock of 1180/1600, and that's downclocked from what I normally run (1270/1900) due to the temps in my living room.

http://www.3dmark.com/3dm/1281733

Unless that chart really does mean the Extreme preset, in which case that is very impressive.


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *Jared2608*
> 
> I like reading about a new tech, and then I cry a little inside knowing I can't afford it...


I so totally feel ya bro...


----------



## Rustynails

Quote:


> Originally Posted by *kot0005*
> 
> XFX R9 280X
> 
> Source: http://videocardz.com/46097/xfx-radeon-r9-280x-double-dissipation-pictured


This card looks really nice. Let's hope it restores my faith in XFX, because before the 7xxx series they were the go-to card maker.


----------



## y2kcamaross

Quote:


> Originally Posted by *darkstar585*
> 
> It has to be! I mean I can clear 7000 with my 7970 non ghz edition, just with a modest overclock of 1180/1600...and that's down clocked from what I normally run (1270/1900) due to the temps in my living room
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/1281733
> 
> *Unless that chart really does mean extreme setting* in which case that is very impressive.


Not even close. You do realize the 280X (a 7970 rebrand) doesn't get anywhere near that score with Extreme settings, right?


----------



## Baghi

Quote:


> Originally Posted by *kot0005*
> 
> Source: https://linustechtips.com/main/topic/59975-amds-new-line-of-cards-cant-crossfire-them-on-pci-e-20-motherboards/#entry814022
> 
> well, this will make a lot of people angry... So you need a PCI E 3.0 CPU and mobo...


I highly doubt it. The majority of AMD motherboards still don't come with PCIe 3.0 support.
Quote:


> Originally Posted by *y2kcamaross*
> 
> not even close, you do realize the 280x(7970 rebrand) doesnt get close to that score using extreme settings, right?


Yeah, and also note the 5,500 3DMarks of the HD 7850 replacement. The HD 7870 already pulls that much.


----------



## malmental

Quote:


> Originally Posted by *kot0005*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BradleyW*
> 
> Will I be able to CF? I have X79 and the slots do show PCI-E 3.0 in GPU-Z!
> 
> 
> 
> You will need a CPU that can provide the bandwidth for the PCIe 3.0 ports. I am not sure about the X79 boards, am only aware of the sockets 1150 and 1155 CPU's


----------



## Baghi

Quote:


> [...]
> 
> Moving on, there's the $199 Radeon R9 270X. Based on a chip not much unlike "Tahiti LE," it features 2 GB of memory and a 3DMark Fire Strike score of over 5,500 points. Then there's the Radeon R9 280X. This card, priced attractively at $299, is practically a rebrand of the Radeon HD 7970 GHz Edition. It features 3 GB of RAM and over 6,800 points in 3DMark Fire Strike.


Source: techPowerUp

As expected, it's 3DMark FS and not the FS-E.


----------



## Master__Shake

Quote:


> Originally Posted by *Baghi*
> 
> I highly doubt it. Majority of AMD motherboards still don't come with PCI-E 3.0 support.
> Yeah and also do note 5500 3dmarks of the HD 7850 replacement. The HD 7870 already pulls this much.


It's an engineering sample... the release ones will be able to CrossFire.


----------



## Juub

Quote:


> Originally Posted by *Baghi*
> 
> Source: techPowerUp
> 
> As expected, it's 3DMark FS and not the FS-E.


Nobody in their right mind thought it was Fire Strike Extreme. A Titan scores ~7K in Fire Strike Extreme; a rebadged 7970 GHz Edition couldn't get that close.


----------



## darkstar585

Quote:


> Originally Posted by *y2kcamaross*
> 
> not even close, you do realize the 280x(7970 rebrand) doesnt get close to that score using extreme settings, right?


Good point, didn't think of that.

Let's hope the 290X has some serious overclocking potential, and that the chart is grossly out of proportion. Otherwise this new card could be quite lame.


----------



## Baghi

Quote:


> Originally Posted by *Juub*
> 
> Nobody in their right mind thought it was Fire Strike Extreme. Titan scores 7K in Fire Strike Extreme. A rebadged 7970GHZE couldn't get this close.


Hold your horses, young man; no need to be mad. Some people "believed" it's FS Extreme; I was just clearing up that confusion. You may want to go back a few pages, I reckon.

See the very right side of the slide above (courtesy of ChipLoco): it clearly says "*3DMARK (R) FIRESTRIKE PERFORMANCE**".


----------



## Alatar

It being the Performance preset of Fire Strike was clear, since ~6,800 is pretty much exactly what a 7970 GHz scores with a good CPU.


----------



## Baghi

Quote:


> Originally Posted by *Master__Shake*
> 
> its an engineering sample...the release ones will be able to crossfire.


At first I thought you quoted me by mistake, but then I went back to the post I originally quoted and realized what it meant. So it's an engineering sample and perhaps one of the reviewers is commenting on it; this doesn't indicate that the new lineup won't run in CrossFire on PCIe 2.0 motherboards at all (not even with bridges). Thanks.


----------



## Juub

Quote:


> Originally Posted by *Baghi*
> 
> Hold your horses young man, no need to be mad at this. There are some people who "believed" it's FS-E; I was just clearing that confusion. You may want to go back to few pages back I reckon.
> 
> 
> 
> See at the very right side of above slide (courtesy of Chiploco), it's clearly written "*3DMARK (R) FIRESTRIKE PERFORMANCE**".


Oh, I wasn't mad. Just calling those who thought it was Fire Strike Extreme insane. A $300 card competing with the Titan?


----------



## Ghoxt

Quote:


> Originally Posted by *fateswarm*
> 
> I believe the most reliable of the promotional info right now is the transistor count, assuming of course they didn't lie. They are supposed to be 39% more than 7970's which is pretty good, though of course it can't be directly 39% faster because of various reasons like architecture, bandwidth(s), ..


How much of the new transistor count is the new programmable audio engine? Didn't see that coming; none of us did.


----------



## BradleyW

I'm not sure if I should get two 290X cards or just keep what I have.....


----------



## zulk

^Do it


----------



## BradleyW

RRRRRRR!!!!! OK I WILL!!!!!!


----------



## malmental

do it....


----------



## Baghi

Do it.


----------



## BradleyW

I'm doing it!!! Right now somehow!!


----------



## DADDYDC650

Quote:


> Originally Posted by *BradleyW*
> 
> I'm doing it!!! Right now somehow!!


You can do it!!!


----------



## BradleyW

No pre-order button, No money, I am still doing this!!!!


----------



## fateswarm

What are you on about with CrossFire hardware frame-pacing fixes? It was shown that the recent drivers fix it to NVIDIA's level, and maybe beyond. Which was ironic, since the guy who proved it was accused of being a shill by AMD fans.


----------



## malmental

credit cards can be your friend.....
not American Express though, bastards.


----------



## DizzlePro

Can someone confirm this?

AMD's new lineup

*Will it be like this?*

R9 290X (Titan rival)
R9 290 (780 rival)
R9 280X (770 rival)
R9 280 (760 rival)
R9 270 (750 rival)

*Or will it be like this?*

R9 290X (Titan rival)
R9 280X (780 rival)
R9 270X (770 rival)
R7 260X (760 rival)
R7 250 (750 rival)


----------



## Robertdt

Wondering that too. And the prices of the R9 290, the 7950 equivalent.


----------



## Stay Puft

Quote:


> Originally Posted by *malmental*
> 
> credit cards can be your friend.....
> not American Express though, bastards.


I love American Express, but these new cards are going on my Newegg Preferred account.


----------



## TooBAMF

Quote:


> Originally Posted by *DizzlePro*
> 
> can someone confirm this?
> 
> Amd new lineups
> 
> *will it be like this ?*
> 
> R9 290X ( titan rival)
> R9 290 (780 rival)
> R9 280x (770 rival)
> R9 280 (760 rival
> R9 270 (750 rival) (
> 
> *or will it be like this?*
> 
> R9 290X ( titan rival)
> R9 280x (780 rival)
> R9 270x (770 rival)
> R7 260x (760 rival
> R7 250 (750 rival)


More like your first list. However, there's no real confirmation of the 290X being a Titan rival; it might just be a 780 rival. The numbers we have from AMD show the 290X potentially being slower than the 780 in Fire Strike. The 290X should beat the Titan in BF4. The 280X is a rebranded 7970 GHz with faster memory.


----------



## malmental

Quote:


> Originally Posted by *Stay Puft*
> 
> Quote:
> 
> 
> 
> Originally Posted by *malmental*
> 
> credit cards can be your friend.....
> not American Express though, bastards.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I love American Express but these new cards are going on my Newegg preferred account

I leave my AMEX for Amazon only now and use PayPal for almost everything else excluding Newegg Preferred..


----------



## Stay Puft

Quote:


> Originally Posted by *malmental*
> 
> I leave my AMEX for Amazon only now and use PayPal for almost everything else excluding Newegg Preferred..


People still use PayPal? I gave up on that scam after PayPal gave this douche back the money he scammed me out of. He got a nearly new 7970, claimed it was defective, and mailed me back a rock in a box; since he had a tracking number, PayPal refunded him. It still pisses me off to this day. I even took pics for PayPal to show he returned me a rock. Nope, didn't matter: he had a tracking number, so it was golden. Ugh.


----------



## Ghoxt

Quote:


> Originally Posted by *Stay Puft*
> 
> People still use paypal? I gave up on that scam after PayPal gave back the money this douche scammed me out of. He got a brand newish 7970, he claimed defective and mailed me back a rock in a box and since he had a tracking number PayPal gave him back his money. It still pisses me off to this day. I even took pics for PayPal to see he returned me a rock. Nope.. Didn't matter. He had a tracking number so it was golden.. Ugh


I feel for you. I had my run in with them years ago. Took forever to get it straightened out.


----------



## fateswarm

Quote:


> Originally Posted by *Stay Puft*
> 
> People still use paypal? I gave up on that scam after PayPal gave back the money this douche scammed me out of. He got a brand newish 7970, he claimed defective and mailed me back a rock in a box and since he had a tracking number PayPal gave him back his money. It still pisses me off to this day. I even took pics for PayPal to see he returned me a rock. Nope.. Didn't matter. He had a tracking number so it was golden.. Ugh


Wow, that's just purely antisocial behavior, possibly a medical condition that doesn't just make him a criminal but someone who *enjoys* being a criminal. You might have had grounds for a psychological evaluation of that person's integrity.


----------



## Sir Beregond

Ouch. Sorry to hear about that. Glad I don't use PayPal, then.

Anyway, can't wait to see the final pricing on the R9 290X. If it's around $500-ish, give or take, I may jump on one a month or so after they come out. If they pull an NVIDIA and price it at $600 or $650, forget it. Granted, 7970 pricing was crazy at launch, but the Titan and 780 are ludicrous.


----------



## ejb222

Well, the 280X was $299 on their slide, right? So if the 290X is $599, they could fill the gap with the 290 at $499, I suppose. But they would be hurting in the $300-$450 range, with not much competition on the market there. I really hope the 290X is $499; that would be a better price advantage for them.


----------



## Sir Beregond

Quote:


> Originally Posted by *ejb222*
> 
> Well 280x was $299 on their slide right? So if the 290x is $599 they could fill that with the 290 @ $499 I suppose. But they would be hurting in the $300-$450 range with not much on the market for competition. I really hope the 290x is $499. That would be a better price advantage for them.


That's my hope. I could live with $499.


----------



## pac08

Rumor has it that the 290X will retail at $699... The guy who made this claim is usually well informed, so I kind of trust him. If that's the case, lots of people are going to be disappointed.


----------



## Moragg

Quote:


> Originally Posted by *pac08*
> 
> Rumor has it that 290X will retail at 699$... The guy who made this claim is usually well informed so i kind of trust him. If that's the case lots of people are going to be disappointed.


But what will the actual consumer price be? AMD really needs to drive sales of GCN-based cards (hence the 7950 price drop) so devs will see it's worth the time to use Mantle, and they certainly won't manage that at $700.

By the way, does Mantle require an AMD CPU? Because if it does reduce CPU overhead massively, then suddenly the 4770K becomes extremely under-utilized for gaming... and the 8350 becomes the sensible choice.


----------



## scyy

Quote:


> Originally Posted by *fateswarm*
> 
> What are you on about with CF hardware frame pacing fixes? It was shown that the recent drivers fix it to NVIDIA's levels and maybe more. Which was ironic since the guy that proved it was accused of being a Shill by AMD fans.


While it is fixed (on single screens, DX10/11), it's still not quite at NVIDIA's level. There's still more variation in most games, though likely not enough to be perceivable by the vast majority of people.


----------



## rdr09

Quote:


> Originally Posted by *Baghi*
> 
> Hold your horses young man, no need to be mad at this. There are some people who "believed" it's FS-E; I was just clearing that confusion. You may want to go back to few pages back I reckon.
> 
> 
> 
> See at the very right side of above slide (courtesy of Chiploco), it's clearly written "*3DMARK (R) FIRESTRIKE PERFORMANCE**".


It does seem that the 280X is the equivalent of the 7970 GHz. Here is my 7950 OC'ed to 7970 GHz speeds . . .

http://www.3dmark.com/3dm/1168637


----------



## Blindsay

Quote:


> Originally Posted by *Baghi*
> 
> Hold your horses young man, no need to be mad at this. There are some people who "believed" it's FS-E; I was just clearing that confusion. You may want to go back to few pages back I reckon.
> 
> 
> 
> See at the very right side of above slide (courtesy of Chiploco), it's clearly written "*3DMARK (R) FIRESTRIKE PERFORMANCE**".


Where is the rest of that bar for the 290X? lol. My 780 Classified scores a little under 10k stock.


----------



## szeged

Quote:


> Originally Posted by *Blindsay*
> 
> Where is the rest of that bar for the 290x lol. my 780 classified scores a little under 10k stock


they were probably using a FX 9590 cpu, and who knows what the ram was lol


----------



## Ghoxt

Quote:


> Originally Posted by *pac08*
> 
> Rumor has it that 290X will retail at 699$... The guy who made this claim is usually well informed so i kind of trust him. If that's the case lots of people are going to be disappointed.


AMD consumers want a low price - well, it's their mantra...

AMD stockholders, however, look at the GTX Titan, which sold way beyond NVIDIA's expectations... make no mistake, they want a competitive price for the performance.

Who wins this argument on where it should be priced?

$600 - Stockholders see it as a loss, giving the cards away and losing $300+ per card sold. AMD fans rejoice... if performance is as expected. NVIDIA guys say meh. Boards implode over OC benches.
$700 - Stockholders grumble still... AMD consumers will insist on 3-5 launch-title games at this price... to save face.
$800 - Stockholders may be happy if the number sold hits the amount required for profit. AMD consumers will have to eat crow. NVIDIA fans might not be able to contain themselves regardless.

And as of right now we have no concrete info to go on....


----------



## rdr09

Quote:


> Originally Posted by *Ghoxt*
> 
> AMD consumers want a low price - well it's their Mantra...
> 
> AMD stockholders however look at the GTX Titan which sold way beyond Nvidia's expectations...make no mistake they want a competitive price for the performance.
> 
> Who wins this argument on where it should be priced?
> 
> $600 - Stockholders see it as a loss, and giving the cards away losing $300+ per card sold. AMD fans rejoice...if performance is as expected. Nvidia guys say meh. Boards implode over OC bench's.
> $700 - Stockholders grumble still... AMD consumers will insist on 3 - 5 Launch Title games at this price....to save face.
> $800 - Stockholders may be happy if numbers sold is the required amount for profit. - AMD consumers will have to eat crow. Nvidia fans might not be able to contain themselves regardless.
> 
> And as of right now we have no concrete info to go on....


I am an AMD consumer, yes. I like the low price, but last March I seriously considered the 670 over the 7970. However . . .

Check out my avatar. That was an NVIDIA card.


----------



## Stay Puft

Quote:


> Originally Posted by *rdr09*
> 
> it does seem that the 280X is an equivalent of the 7970GHz. here is my 7950 oc'ed to 7970 GHz speed . . .
> 
> http://www.3dmark.com/3dm/1168637


280X is a rebranded 7970 Ghz


----------



## CallsignVega

Hm, glad they went with 4GB of memory on the 290X. Should be plenty for those of us at higher resolutions. If they do a proper card with 5x mini-DP outputs, I just may do 4-way CrossFire with these babies on five LightBoost monitors, or just "settle" for a 4K monitor.


----------



## Ha-Nocri

So no price cuts. I was expecting the 280X (aka 7970) to be $250 since the 7970 is already under $300. Lame.


----------



## Dart06

Do we know a release date for the 290/290x?

I sold off one of my 670s and I am planning on getting a 290 or a 290x and sell off my other 670.


----------



## Seronx

Quote:


> Originally Posted by *Dart06*
> 
> Do we know a release date for the 290/290x?


Preorders are on October 3rd, but the cards should be out by October 29th.


----------



## Regent Square

Quote:


> Originally Posted by *Seronx*
> 
> Preorders are on October 3rd, but the cards should be out *by October 29th*.


Lol, how old should you be to think this way?


----------



## Dart06

Quote:


> Originally Posted by *Seronx*
> 
> Preorders are on October 3rd, but the cards should be out by October 29th.


Just in time for my birthday on November 1st. Should be a nice birthday present to myself. Do you know if Newegg is going to be in the pre-order program (which I just saw includes Battlefield 4), or where should I look for that?


----------



## Regent Square

Is 4GB enough for 1200p, or not worth it?


----------



## Regent Square

Quote:


> Originally Posted by *Dart06*
> 
> Just in time for my birthday on November 1st. Should be a nice birthday present to myself. Do you know if Newegg is going to be in the pre-order program (that I just saw includes Battlefield 4?) or where do I look for that?


Happy upcoming birthday


----------



## Seronx

Quote:


> Originally Posted by *Regent Square*
> 
> Lol, how old should you be to think this way?


You are pre-ordering the graphics card with Battlefield 4. The non-BF4 cards will probably launch afterwards, with aftermarket cards coming after November.


----------



## Regent Square

Quote:


> Originally Posted by *Seronx*
> 
> You are pre-ordering the graphics card with Battlefield 4. In reference, non-BF4 +card probably will launch afterwards. With aftermarket cards coming after November.


Since when does launching a GPU on its release day make sense?


----------



## Dart06

It would be easy for them to make the code not redeemable until BF4 launches.


----------



## Seronx

Quote:


> Originally Posted by *Regent Square*
> 
> Since when launching a gpu on the release day makes sense?


To get into it: I did type out "by October 29th," leaving the possibility that it can be earlier than October 29th.


----------



## Regent Square

Quote:


> Originally Posted by *Seronx*
> 
> To get into it I did type out "by October 29th." Leaving the possibility that it can be earlier than October 29th.


Usually people here say that when the event is happening a few days before a certain date. But, ok, not everyone is alike.


----------



## jincuteguy

Unless NVIDIA comes up with something, AMD will always be faster if most games use Mantle and are highly optimized for AMD cards.


----------



## 161029

Still can't get over that cooler for some reason.


----------



## scyy

Quote:


> Originally Posted by *jincuteguy*
> 
> Unless Nvidia comes up with something, Amd will always be faster if most of the Games used Mantle and highly optimized for Amd cards.


That's a big, big if.

I read these marketing materials from AMD and there are so many "shoulds" and "ifs" that it's hard to take them as fact till we actually see the results.

Glide worked great for 3dfx, but that was when DirectX was pretty much complete garbage. I would be surprised if it has as big of a boost as some are expecting. I could be wrong though, just my view on it.

Don't take this as me saying Mantle is pointless or anything either. I'm just saying that until we actually see how big of a boost it brings and how many devs actually utilize it, it will be hard to judge just how much this will help AMD.


----------



## malmental

Quote:


> Originally Posted by *scyy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jincuteguy*
> 
> Unless Nvidia comes up with something, Amd will always be faster if most of the Games used Mantle and highly optimized for Amd cards.
> 
> 
> 
> That's a big, big if.
> 
> I read these marketing materials from amd and there are so many should and ifs it's hard to take it as fact till we actually see the results.

BIG BIG IF....


----------



## Moragg

I don't think it's unrealistic. AMD's been working on Mantle for two years, along with the Xbox One and PS4 systems; they'll have made porting from the console API to Mantle very easy and cheap.

And all the exclusives would be very easy to port to PC too - and more importantly, so cheap that devs would be denying themselves a large revenue source by not porting to Mantle. We may even end up with a situation where all "exclusives" are available to PC gamers - though only if on AMD.


----------



## scyy

Quote:


> Originally Posted by *Moragg*
> 
> I don't think it's unrealistic. AMD's been working on Mantle for 2 years along with the xbone ps4 systems, they'll have made porting from the console API to mantle very easy and cheap.
> 
> And all the exclusives would be very easy to port to PC too - and more importantly so cheap that devs would be denying themselves a large revenue source by not porting to Mantle. We may even end up with a situation where all "exclusives" are available to PC gamers - though only if on AMD.


Enjoying that pie in the sky? I like how so many AMD users here are all for open APIs and completely anti-closed until AMD has a closed API, and suddenly who cares about open, we have Mantle!

These low-level APIs are harder to work with, and most devs will probably stick with DX11. Yes, it will give AMD an advantage when used, but as of now we have no idea how widespread it will be.


----------



## Moragg

Quote:


> Originally Posted by *scyy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Moragg*
> 
> I don't think it's unrealistic. AMD's been working on Mantle for 2 years along with the xbone ps4 systems, they'll have made porting from the console API to mantle very easy and cheap.
> 
> And all the exclusives would be very easy to port to PC too - and more importantly so cheap that devs would be denying themselves a large revenue source by not porting to Mantle. We may even end up with a situation where all "exclusives" are available to PC gamers - though only if on AMD.
> 
> 
> 
> Enjoying that pie in the sky? I like how AMD users are all for open APIs till AMD has a closed API and suddenly who cares about open, we have Mantle!

From my limited understanding, open APIs wouldn't work. To make it as low-level as possible (and thus get the best performance), AMD and NVIDIA would have to collaborate on making GPU architectures. Yeah... not gonna happen.

AMD have instead taken advantage of the fact that game devs are optimising for GCN by saying "Here's an easy, cheap PC port," with the potential to blow away NVIDIA. I never had anything against the Titan nor the 780 (or even the 770) - they were priced appropriately according to business sense. I have AMD simply because I cannot afford NVIDIA, but it's good to see AMD being proactive in getting its customers better performance for their money. If NVIDIA had done it I wouldn't care, so long as I got more value for money.

AMD fans are obviously going to be sky-high if the supposed boost from Mantle is true. They would get price and performance in that case.

If Mantle does have the gains it claims, then I would expect NVIDIA to create a higher-level software layer that would take the Mantle commands and map them to NVIDIA hardware. It would obviously have overheads that Mantle doesn't, but it might still be better than DirectX.

What I don't like about Mantle: when we move on from GCN, are the Mantle-only PC games even playable anymore?


----------



## scyy

I wasn't implying they should make Mantle open. My point was that the very same people who crap all over PhysX for being closed will praise this despite it being closed as well. I have no problem with AMD pushing this, just as I have no problem with PhysX. Just pointing out hypocrisy.

If this really gets huge and AMD incorporates hardware frame pacing, I very well may go AMD once my 780s show their age, but for now I'm skeptical of how widespread it will be.


----------



## TheLAWNOOB

Am I the only one who thinks Mantle sounds too close to "mental"?

Edit: No offense to malmental


----------



## malmental




----------



## th3illusiveman

Quote:


> Originally Posted by *Jared2608*
> 
> I like reading about a new tech, and then I cry a little inside knowing I can't afford it...


i know that feel


----------



## CallsignVega

Quote:


> Originally Posted by *Regent Square*
> 
> Is 4Gb enough for 1200p or not worth it?


----------



## malmental

Quote:


> Originally Posted by *CallsignVega*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Regent Square*
> 
> Is 4Gb enough for 1200p or not worth it?

you be nice...


----------



## Regent Square

Quote:


> Originally Posted by *malmental*
> 
> you be nice...


?


----------



## fateswarm

Quote:


> Originally Posted by *szeged*
> 
> they were probably using a FX 9590 cpu, and who knows what the ram was lol


Haha. I just realized the irony. AMD has to use Intel CPUs to get competitive numbers out of its own GPUs.


----------



## fateswarm

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Am I the only one thinks mantle sounds too close to mental ?
> 
> Edit: No offense to malmental


You British?


----------



## malmental

Quote:


> Originally Posted by *Regent Square*
> 
> Quote:
> 
> 
> 
> Originally Posted by *malmental*
> 
> you be nice...
> 
> 
> ?

CallsignVega


----------



## TheLAWNOOB

Quote:


> Originally Posted by *fateswarm*
> 
> You British?


Why do you think I'm a Brit?

Hmm, maybe it's because I learned British English instead of amurican English back in kindergarten, who knows.


----------



## Durquavian

Quote:


> Originally Posted by *scyy*
> 
> I wasn't implying make mantle open. My point was the very same people who crap all over physx for being closed will praise this despite being closed as well. I have no problem with amd pushing this just as I have no problem with physx. Just pointing out hypocrisy.
> 
> If this really gets huge and amd incorporates hardware frame pacing I very well may go amd once my 780s show their age but for now I'm skeptical of how wide spread it will be.


PhysX is not anywhere near the same here. It is an added feature, not one that increases performance. It is reasonable to expect that Mantle will have somewhat of a grip even in the PC market, simply because of ports. DX isn't viable on consoles because of inefficiency. If they truly want to sell consoles and games, they need all advantages front and center. Ports will likely - and "likely" as in 100% - have DX as well as Mantle. As I said earlier, or maybe in another thread (so many of these AMD threads right now), it will behave like ICC and set qualifiers for which API it will use depending on the hardware in use; you likely won't get a choice, assuming NVIDIA hasn't somehow worked out a fix for its use as well.


----------



## scyy

Quote:


> Originally Posted by *Durquavian*
> 
> Physix is not anywhere near the same here. It is an added feature not one that increases performance. It is reasonable that this Mantle will have somewhat of a grip even in the PC market simply because of ports. DX isn't viable on consoles because of inefficiency. If they truly want to sell consoles and games then they need all advantages front and center. Ports will likely and likely as in 100% will have DX as well as Mantle. As I said earlier or maybe in another thread(so many of these AMD threads right now) it will behave like ICC and set qualifiers for which set API it will use dependent on the hardware in use, you likely wont get a choice, assuming Nvidia hasn't somehow worked a fix for its use as well.


My mentioning of physx was to the point of having a specific advantage over the other side through a(likely) closed feature. Not a direct comparison of the technologies. If anything this is a much bigger deal than physx if it gets adopted widely, key word being if.


----------



## Durquavian

Quote:


> Originally Posted by *scyy*
> 
> My mentioning of physx was to the point of having a specific advantage over the other side through a(likely) closed feature. Not a direct comparison of the technologies. If anything this is a much bigger deal than physx if it gets adopted widely.


How widely is the question. I don't even want to bet on any outcome. But also I wonder where HSA comes in with this as well.


----------



## SniperOct

Quote:


> Originally Posted by *scyy*
> 
> My mentioning of physx was to the point of having a specific advantage over the other side through a(likely) closed feature. Not a direct comparison of the technologies. If anything this is a much bigger deal than physx if it gets adopted widely, key word being if.


Apples and oranges. You cannot use it in the sense you intend. Compare TressFX with PhysX instead. As to it being closed: there is no choice but for it to be closed. It is in the very nature of it to be closed. It's like a firmware. Even if they wanted it open (which was reported to be so by one of the tech sites), they can't, because it is architecture-specific. If it were open across architectures, it would be high-level, just like DirectX. If they open it in the sense where the programmer interface is the same but the backend code is implemented by each manufacturer, then what is the point? NVIDIA would basically need to create Mantle from scratch and to AMD's specifications. That doesn't make any sense whatsoever. Even if NVIDIA were that stupid, why would they go with it when AMD has control of the interface code and future development/versions? They might as well create their own and have full control. So no, what you are saying doesn't work from any direction I could think of.


----------



## scyy

Quote:


> Originally Posted by *SniperOct*
> 
> apples and oranges. you cannot use it in the sense you intent. Compare tressfx with physx instead. As to it being close, there is no choice but for it to be closed. It is in the very nature of it to be closed. It's like a firmware. Even if they wanted it open (which was reported to be so by one of the tech site), they can't because it is architecture specific. If it was open across architectures then it would be high level just like directX. If they open it in the sense where the programmer interface is the same but the backend code is implemented by each manufacturer then what is the point? Nvidia would basically need to create mantle from scratch and to AMD's specifications. That doesn't make any sense whatsoever. Even if Nvidia were that stupid, why would they get with it when AMD will have control of the interface code and future development/versions? They might as well create their own and have full control. So no, What you are saying doesn't work from any direction I could think of.


Again, I'm not comparing them directly as far as the technologies behind them, and I have no problem with it being closed. I completely understand that the very nature of a low-level API means it will likely be tied in some way to the architecture, making it likely not possible to run on Kepler or any other NVIDIA GPU. My point was just to point out the hypocrisy of some AMD users who complain about PhysX and talk as if anything exclusive to either side is a bad thing. I suppose you can argue that since it's not really possible to make it open, it's not as bad, but it's still a clear advantage that is exclusive to AMD.


----------



## SniperOct

Quote:


> Originally Posted by *scyy*
> 
> Again, I'm not comparing them directly as far as the technologies behind it and have no problem with it being closed and completely understand the very nature of a low level api will likely be tied in some way to the architecture making it likely not possible to run on Kepler or any other nvidia GPU. My point was just to point out hypocrisy of some amd users who complain about physx and talk as if anything exclusive to either side is a bad thing. I suppose you can argue since it's not really possible to make it open it's not as bad but it's still a clear advantage that is exclusive to amd.


Okay, you are not comparing them directly, you can't compare them indirectly, etc. Your first sentence above jumps from admitting just that to "my point is..." So tell us, how are you tying your first sentence to your second one? How are you tying the claim to the conclusion? Your point of "exclusivity hypocrisy" would be fine if you had an example of where that is being done at the moment. But since you are not using Mantle as an example, you are simply using it as an excuse to make your point.


----------



## 2010rig

Quote:


> Originally Posted by *Stay Puft*
> 
> People still use paypal? I gave up on that scam after PayPal gave back the money this douche scammed me out of. He got a brand newish 7970, he claimed defective and mailed me back a rock in a box and since he had a tracking number PayPal gave him back his money. It still pisses me off to this day. I even took pics for PayPal to see he returned me a rock. Nope.. Didn't matter. He had a tracking number so it was golden.. Ugh


Damn dude, that's literally the most ridiculous thing I've ever heard about PayPal.

It's almost on par with when they froze my account with $22,000 in it, and I had ZERO complaints, because they considered me "high risk." Bastards didn't release the funds for 6 months either. Ever visit PayPalSucks.com?

I'm a sucker though; I still have 2 PayPal accounts, I just withdraw the funds on a daily basis. That happened before eBay bought them out; they've gotten better since.

*Obligatory On Topic post:* Those scores were with the Performance preset, it says so right there on the slide.


----------



## jincuteguy

Quote:


> Originally Posted by *2010rig*
> 
> Damn dude, that's literally the most ridiculous I ever heard about PayPal.
> 
> It's almost on par when they froze my account with $22,000 in it, and I had ZERO complaints, because they considered me "high risk". Bastards didn't release the funds for 6 months either. Ever visit PayPalSucks.com?
> 
> I'm a sucker though, I still have 2 PayPal accounts, I just withdraw the funds on a daily basis. That happened before eBay bought them out, they've gotten better since.
> 
> *Obligatory On Topic post:* Those scores were with the Performance preset, it says so right there on the slide.


Wait, you said eBay bought PayPal? Since when?


----------



## Forceman

Quote:


> Originally Posted by *jincuteguy*
> 
> Wait, you said eBay bought PayPal? Since when?


5 years ago? Maybe more. That's where Elon Musk got the money to start Tesla and SpaceX.


----------



## 2010rig

Quote:


> Originally Posted by *Forceman*
> 
> 5 years ago? Maybe more. That's where Elon Musk got the money to start Tesla and SpaceX .


Try 11 years.
http://news.cnet.com/2100-1017-941964.html

My timeline was wrong, when they closed my account that I referred to above, it was 2004.


----------



## BiG StroOnZ

The only good news about the release of these cards is the following:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131483

$599


----------



## Zealon

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> The only good news about the release of these cards is the following:
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131483
> 
> $599


I just noticed these cards dropped in price, so I am seriously considering getting one and water cooling it. I'm on the fence about it though because I can either get the 7990 or wait till Q2 2014 and see what Nvidia has to offer for the Maxwell architecture.


----------



## th3illusiveman

Quote:


> Originally Posted by *Zealon*
> 
> I just noticed these cards dropped in price, so I am seriously considering getting one and water cooling it. I'm on the fence about it though because I can either get the 7990 or wait till Q2 2014 and see what Nvidia has to offer for the Maxwell architecture.


The 7990 won't be running out of steam any time soon; the thing practically leads the charts in most reviews, and it's not like you will run into any VRAM bottlenecks before 780 owners do. It's old, sure, but still a beast - at least for the games on the market right now. The new cards are based on the same architecture, so your driver updates will still be top-notch for many years to come. But hey, we're so close to the launch of the new cards, why not just wait another month...


----------



## szeged

I'm hoping the MSI Lightning version of the 290X is great; the 780 Lightning left me disappointed.


----------



## Forceman

Quote:


> Originally Posted by *2010rig*
> 
> Try 11 years.
> http://news.cnet.com/2100-1017-941964.html
> 
> My timeline was wrong, when they closed my account that I referred to above, it was 2004.


Wow, time flies.


----------



## Roaches

Quote:


> Originally Posted by *szeged*
> 
> im hoping the msi lightning version of the 290x is great, the 780 lightning left me dissapointed


They're likely to use the same cooler/shroud design, since the cost of engineering and tooling would be too great to design a new cooler - unless they make a non-Lightning Tri Frozr design, which could happen...


----------



## szeged

Quote:


> Originally Posted by *Roaches*
> 
> They're likely to use the same cooler/shroud design since the cost of engineering and tooling would be too great to design a new cooler.
> unless they make a non-lightning Tri Frozer design which could happen...


I'm just hoping we get fully unlocked support on it; the software on the 780 Lightning was lacking in so many ways.


----------



## BiG StroOnZ

Quote:


> Originally Posted by *th3illusiveman*
> 
> 7990 won't be running out of steam any time soon, the thing practically leads the charts in most reviews and it's not like you will run into any Vram bottlenecks before 780 owners. It's old sure, but still a beast - at least for the games out on the market right now. The new cards are based on the same architecture so your driver updates will still be top notch for many years to come but hey, we're so close to the launch of the new cards why not just wait another month...


Yup, I'll be grabbing a 7990 on Black Friday; if I can come across a Devil 13 that would just be perfect. Except, looking at the price drops, 7970s and 7950s are super cheap right now (can only imagine come Black Friday).


----------



## Roaches

Quote:


> Originally Posted by *szeged*
> 
> im just hoping we get full unlocked support on it, the software on the 780 lightning was lacking in so many ways


Our best bet is to hope for small revisions on their part for improvement, both software and hardware.

I'm very happy with my Gigabyte GTX 680 SOC's cooler design and performance, though the software on its part was very disappointing as well... moar fans!! Hope to see Gigabyte's cooler offering on Volcanic Islands as well...


----------



## Johnny Rook

Quote:


> Originally Posted by *Moragg*
> 
> Wasn't PhysX an extra feature set that would negatively impact performance? Whereas the only thing Mantle does is optimise how the GPU is used.
> 
> I swear everyone on here has been asking for better software, AMD claim to have delivered it with Mantle. So what's the issue?


Issue? I have no issue. Did I say I had an issue?

What I said is that there will be a discussion if Mantle proves to be proprietary software - which I don't think it will be. Why do I talk about "proprietary"? Because we can't forget it seems to be for Frostbite 3, EA games only. It's not like Mantle will be available to everybody from the get-go.

At least PhysX could and can be used by everybody who wants to use it, and software PhysX runs on pretty much every modern CPU with both AMD and NVIDIA cards. Only GPU-accelerated PhysX runs on GeForce GPUs. So I think we are not talking about the same thing unless Mantle has some chance to run on NVIDIA cards as well - as I think it will. Otherwise, it can be taken as proprietary software. Time will tell, I suppose.


----------



## Zealon

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> Yup, I'll be grabbing a 7990 on Black Friday, if I can come across a Devil 13 that would just be perfect. Except, looking at the price drops right now 7970's and 7950's are super cheap right now (can only imagine come Black Friday).


I think I'll go the same route and pick up a 7990 on black friday unless the 290X can actually impress me come late October. I thought about picking up a used 7970 later on for some nice tri-fire.


----------



## fleetfeather

[sigh, looks like I'm late to the party with my link]


----------



## 2010rig

Quote:


> Originally Posted by *fleetfeather*
> 
> [sigh, looks like I'm late to the party with my link]


You got me curious....


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *2010rig*
> 
> You got me curious....


About?


----------



## 2010rig

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> About?


That connector towards the middle. Do we know what that is?


----------



## szeged

the worst part about that video was the thermal-tacky case


----------



## Snuckie7

Quote:


> Originally Posted by *2010rig*
> 
> That connector towards the middle. Do we know what that is?


Could be a dual BIOS switch. Those are standard on AMD reference models.


----------



## 2010rig

Quote:


> Originally Posted by *Snuckie7*
> 
> Could be a dual BIOS switch. Those are standard on AMD reference models.


Cool, makes sense.


----------



## Forceman

Quote:


> Originally Posted by *Snuckie7*
> 
> Could be a dual BIOS switch. Those are standard on AMD reference models.


Yeah, that's what it looks like now that you mention it.


----------



## kot0005

Only 384-bit now??


----------



## szeged

lol, so many rumors - you probably shouldn't worry about what specs have been "released" so far.


----------



## Tonza

The switch is gonna enable "Turbo Mantle" mode, which will create a black hole in the case and suck the whole world + Titans into it.


----------



## wermad

So $599 is official?


----------



## Baghi

So the $50-cheaper trend continues, and it has 4 ROPs less than projected and no 512-bit interface as many of us predicted.

*UPDATED SPECS (link):*

Price US $599.99 (or 499.99€, £399.99 before taxes)
Availability mid-October
28nm silicon
2,816 GCN stream processors
44 SIMDs (11 computing units)
172 TMUs
44 ROPs
384-bit memory interface
4-6GB GDDR5 video memory
>300GB/s bandwidth
6.40 GHz memory clocks


----------



## szeged

Nothing is official until AMD says so, imo.

$599 is probably a price that TPU got from another site, that got it from another site, that got it from another site, who got it from someone on OCN that said "I hope it's $599."


----------



## wermad

Quote:


> Originally Posted by *Baghi*
> 
> So the $50 dollar cheap trend continues and it has 4 ROPs less than projected and no 512-bit interface as many of us predicted:
> *UPDATED SPECS (link):*
> 
> Price US $599.99 (or 499.99€, £399.99 before taxes)
> Availability mid-October
> 28nm silicon
> 2,816 GCN stream processors
> 44 SIMDs (11 computing units)
> 172 TMUs
> 44 ROPs
> 384-bit memory interface
> 4-6GB GDDR5 video memory
> >300GB bandwidth
> 6.40 GHz memory clocks


Gotcha.

384-bit means 3GB or 6GB. Hmmmm... sounds like 3GB to me for this price point.
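For what it's worth, the 3GB-or-6GB arithmetic follows directly from chip count: GDDR5 chips expose a 32-bit interface, so the bus width fixes how many chips there are, and per-chip density does the rest. A rough sketch (assuming the common 2Gbit/4Gbit chip densities of the time):

```python
# Each GDDR5 chip has a 32-bit interface, so an N-bit bus means N/32 chips.
# Capacity = chip count x per-chip density (2Gbit = 256MB, 4Gbit = 512MB;
# density values assumed for illustration).

CHIP_INTERFACE_BITS = 32

def capacity_options_gb(bus_width_bits, densities_mb=(256, 512)):
    chips = bus_width_bits // CHIP_INTERFACE_BITS
    return [chips * d / 1024 for d in densities_mb]

print(capacity_options_gb(384))  # [3.0, 6.0] -> the 3GB-or-6GB options
print(capacity_options_gb(512))  # [4.0, 8.0] -> why 4GB points at 512-bit
```

Which is also why a flat 4GB figure sits more naturally on a 512-bit bus.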
Quote:


> Originally Posted by *szeged*
> 
> nothing is official until amd says so imo
> 
> $599 is probably a price that tpu got from another site that got it from another site that got it from another site who got it from someone on OCN that said " i hope its 599"


Lol, OCN is the source.

Lame, it's getting watered down. No reason to switch from my 780s. Titan prices, come down to ~$650!!!!!!!!1


----------



## szeged

Yeah, if the specs keep coming down and down and eventually turn out to be true, there's not really a reason to drop 780s/Titans to sidegrade to these. Even if they're 50 bucks less, NVIDIA will follow suit and probably drop 50 bucks off also. I'm still hoping that AMD isn't saying anything about specs so they can blow our minds when they release 'em, but things aren't looking that way atm - unless you factor in Mantle, lol.


----------



## Forceman

We know for sure it's going to be 4GB, and the only way that makes sense is with a 512-bit bus. That also makes the >300 GB/s memory bandwidth work, so I think that's pretty much set in stone. Any 384 rumor may be the 290, but it isn't the 290X.

Edit: And the source for the Techpowerup article is a Softpedia article from before the launch announcement. So crap, in other words.
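The bandwidth math being argued over here is easy to sanity-check. A quick back-of-the-envelope sketch (the bus widths and effective memory clocks below are the rumored figures from this thread, not confirmed specs):

```python
def mem_bandwidth_gbs(bus_width_bits: int, effective_clock_ghz: float) -> float:
    """Peak GDDR5 bandwidth: bytes transferred per cycle times effective data rate."""
    return (bus_width_bits / 8) * effective_clock_ghz

# Rumored 512-bit bus at 5.00 GHz effective:
print(mem_bandwidth_gbs(512, 5.0))   # -> 320.0 (GB/s)
# Rumored 384-bit bus at 6.40 GHz effective:
print(mem_bandwidth_gbs(384, 6.4))   # -> 307.2 (GB/s)
```

Note that either rumored configuration clears the ">300 GB/s" figure, so the 4 GB capacity is actually the stronger argument for a 512-bit bus here.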


----------



## szeged

Quote:


> Originally Posted by *Forceman*
> 
> We know for sure it's going to be 4GB, and the only way that makes sense is with a 512-bit bus. That also makes the >300 GB/s memory bandwidth work, so I think that's pretty much set in stone. Any 384 rumor may be the 290, but it isn't the 290X.


TPU is saying the 290X would ship with 4GB on a 384-bit bus using different-density chips, but then again, all rumors etc. etc., grumble grumble.


----------



## wermad

Quote:


> Originally Posted by *Forceman*
> 
> We know for sure it's going to be 4GB, and the only way that makes sense is with a 512-bit bus. That also makes the >300 GB/s memory bandwidth work, so I think that's pretty much set in stone. Any 384 rumor may be the 290, but it isn't the 290X.
> 
> Edit: And the source is a Softpedia article from before the launch announcement. So crap, in other words.


so what's replacing what?

Quote:


> Originally Posted by *szeged*
> 
> yeah if the specs keep coming down and down and eventually turn out to be true, not really a reason to drop 780/titans to sidegrade to these, even if theyre 50 bucks less, nvidia will follow suit and probably drop 50 bucks off also, im still hoping that amd isnt saying anything about specs so they can blow our minds when they release em, but things arent looking that way atm, unless you factor in mantle lol.


No 4-way 780 makes me a sad panda, $1k Titan makes me a sad panda, no 5x1 Surround makes me a sad panda...


----------



## szeged

rofl









you can hack and get 4 way 780s









you can hack a bank irl and afford 4 way titans







so many options, so little time!


----------



## wermad

Quote:


> Originally Posted by *szeged*
> 
> rofl
> 
> 
> 
> 
> 
> 
> 
> 
> 
> you can hack and get 4 way 780s
> 
> 
> 
> 
> 
> 
> 
> 
> 
> you can hack a bank irl and afford 4 way titans
> 
> 
> 
> 
> 
> 
> 
> so many options, so little time!


It only works w/ driver 314.xx since it did not have any provisioning for the 780. It's really for benchmarks tbh (the hacked driver). Most of my games ran like crap or crashed w/ this older driver. Plus, 314.xx was pretty so-so w/ my old Titans. Getting nervous about Hawaii, are we???







For the cost of two Titans, I got three 780s... but I want more powah!!!!!


----------



## szeged

When EK comes out with 780 Classified blocks, I'll start benching those; then I will have all the powah I need...

...until the 290X comes out, then I will require more powah.


----------



## wermad

Quote:


> Originally Posted by *szeged*
> 
> when EK comes out with 780 classified blocks ill start benching those, then i will have all the powah i need
> 
> 
> 
> 
> 
> 
> 
> 
> 
> until the 290x comes out, then i will require more powah.


AMD did have some nice scaling with Tahiti. Really, it's the software that killed quad 7970s for me. Plus, the MST hubs weren't out then and I had the dreaded screen tearing. I'm just getting tired of paying Nvidia tons of money while they still hold a tight leash on us. Whether it's upgradeability or price, they wanna screw you anyway. Drivers have been almost pain-free, though; something AMD couldn't do previously.

We'll see. I'm going to sit back and watch the show for now. The Eyefinity frame-pacing driver is more appealing to me. I can buy some used 7970s and pick up a couple more Dells for a good 5x1 Eyefinity setup.


----------



## Oubadah

..


----------



## HellAce

Quote:


> Originally Posted by *szeged*
> 
> yeah if the specs keep coming down and down and eventually turn out to be true, not really a reason to drop 780/titans to sidegrade to these, even if theyre 50 bucks less, *nvidia will follow suit and probably drop 50 bucks off* also, im still hoping that amd isnt saying anything about specs so they can blow our minds when they release em, but things arent looking that way atm, unless you factor in mantle lol.


Nvidia rarely drops prices on their cards without a really good reason... I doubt they will do it this time around. Let me ask you this: when the 7970 GHz Edition released, did they drop the GTX 680? No. When the official 7990 came down to $650, did they drop the price of the GTX 690? No.

Don't hold your breath... it will probably stay at the same price point, same with the Titan.


----------



## Clockster

Quote:


> Originally Posted by *Oubadah*
> 
> I knew it was too good to be true.
> 
> When was the last 512-bit card from AMD/ATI, the ill fated 2900XT?


Still all just rumors xD AMD never confirmed that it would be 512-bit... lol


----------



## wermad

Quote:


> The company is expected to launch 6 GB variants of the card a little later.


This alone tells me there will be a 3GB version, with a 6GB uber version (to tackle Titan) coming out later.


----------



## szeged

R9 290X 6GB Toxic from Sapphire incoming


----------



## Ramzinho

AMD has got us all on our toes. The hype and anticipation prior to the release is so tense; it could lead either to a great outcome or to huge frustration. In the meantime I'm waiting for Black Friday. That might mean a new 7990 for me, with my old 7970 going into the old rig. Right now patience is the only solution :beer:

Sent from my GT-N7100 using Tapatalk 2


----------



## anticommon

So I'm considering preordering one of these if it's under $600 with BF4 included, but I don't know if it's worth it. Only reason I say that is because I'm not and never have been a fan of stock blower designs. I would almost rather wait for a windforce or lightning version of the card to come out, but at the same time, I need a new GPU by the time BF4 comes out (my GTX 260 can only get me so far...).

Thoughts?

Plus, with BF4 included I'm basically considering the card $50 off and looking at it as though it's $550 instead of $600, because I would be buying BF4 anyway.


----------



## Durquavian

Quote:


> Originally Posted by *anticommon*
> 
> So I'm considering preordering one of these if it's under $600 with BF4 included, but I don't know if it's worth it. Only reason I say that is because I'm not and never have been a fan of stock blower designs. I would almost rather wait for a windforce or lightning version of the card to come out, but at the same time, I need a new GPU by the time BF4 comes out (my GTX 260 can only get me so far...).
> 
> Thoughts?
> 
> Plus, with BF4 included I'm basically considering the card being $50 off and looking at it as though it's $550 instead of $600 because I would be buying BF4 anyways.


Reference has the advantage of water cooling upgrade. If that helps you feel better about the purchase.


----------



## sugarhell

Lol, what a trash article. We already know about the 4GB and the 300+ GB/s bandwidth. No way this card will come with 3GB or 6GB, or even a 384-bit bus.


----------



## Ghoxt

What's with the rumored specs getting lower and lower? The rumor that there will only be a 384-bit bus is now in the wild, and not an official word from anyone who should know. I'll take it with a grain of salt. I'm hoping Titan prices come down a notch, but if the R9 cannot match the stable OCs that OCN members have achieved with firmware & config changes on the 780 and Titan, I'll just buy another Titan if I get the itch.

And to be clear, those of us sitting on Titans or 780s, what are we expecting here, really? Are we expecting some AMD monster frame-rate increase to launch the genre forward and provide headroom for liquid and fluid dynamics? No. No one is expecting that. This seems to be just the normal incremental leapfrog game, getting no closer to photorealism in my lifetime. I know I ask too much for my money...


----------



## MaCk-AtTaCk

Where are we seeing this supposed 384-bit? All I see is 512 from TPU.


----------



## Blindsay

Quote:


> Originally Posted by *Durquavian*
> 
> Reference has the advantage of water cooling upgrade. If that helps you feel better about the purchase.


not always


----------



## FoamyV

Pretty anxious to see some reviews of it. Judging by the AMD slides it scores around 8k points on the Fire Strike performance preset; way lower than a Titan, right?


----------



## TooBAMF

Quote:


> Originally Posted by *FoamyV*
> 
> Pretty anxious to see some reviews of it. Judging by the AMD slides it scores around 8k points on firestrike perf preset, way lower than a titan right?


Yeah I get about 10500 in Firestrike. I don't know what CPU AMD is running but it's still not very impressive. The only interesting thing about the card, besides possibly the price, is Mantle. If it takes off, Nvidia and its users will be left in an awkward place. If it doesn't, looks like business as usual. AMD will have the price advantage but the new cards seem to be plugging gaps in AMD's product line as opposed to forcing prices lower across the board.


----------



## provost

Quote:


> Originally Posted by *anticommon*
> 
> So I'm considering preordering one of these if it's under $600 with BF4 included, but I don't know if it's worth it. Only reason I say that is because I'm not and never have been a fan of stock blower designs. I would almost rather wait for a windforce or lightning version of the card to come out, but at the same time, I need a new GPU by the time BF4 comes out (my GTX 260 can only get me so far...).
> 
> Thoughts?
> 
> Plus, with BF4 included I'm basically considering the card being $50 off and looking at it as though it's $550 instead of $600 because I would be buying BF4 anyways.


http://s1364.photobucket.com/user/provostelite/media/138028682130106_zps4aef3378.gif.html

Wait for some real data and reviews before making your decision. No point jumping on a bandwagon that may turn out to be a train wreck. Just saying.


----------



## Ghoxt

It will be interesting to see the bench rigs and the difference between AMD and Intel CPUs, plus the GPU/memory clocks, when we look back on some of the performance targets alluded to during the live event. What shocks me is that normally by now we would have heard some underground review from Shenzhen, China, or from the mountains of Norway, etc...

Total info blackout, and for all it gave us, AMD's media event was NDA-only.

The live stream was a complete waste of effort on their part imo. Except for the audio reveal. That was Huge, causing water-cooled systems to implode and babies to cry. Audio will never be the same on PCs! 1ST!!!


----------



## 47 Knucklehead

Well, I'm hoping AMD gives the REAL specs and costs soon. To me, the only thing more idiotic than paying $4,000 for 4 Titans is placing a pre-order for a video card whose specs and prices haven't been given.


----------



## Baghi

Quote:


> Originally Posted by *MaCk-AtTaCk*
> 
> were are we seeing this supposed 384bit? all i see is 512 for tpu.


Damn! They edited their page AGAIN! It was 384-bit and a 6.40 GHz memory clock, and now it's 512-bit with 5.00 GHz.







Thanks btw.


----------



## InfoWarrior

Quote:


> Originally Posted by *TooBAMF*
> 
> Yeah I get about 10500 in Firestrike. I don't know what CPU AMD is running but it's still not very impressive. The only interesting thing about the card, besides possibly the price, is Mantle. If it takes off, Nvidia and its users will be left in an awkward place. If it doesn't, looks like business as usual. AMD will have the price advantage but the new cards seem to be plugging gaps in AMD's product line as opposed to forcing prices lower across the board.


As far as I'm concerned, Mantle or any proprietary API is bad for the PC industry. The consumer always wins when hardware companies compete on a level playing field. Instead of a level playing field, it's more like AMD is trying to clear the playing field this time around.


----------



## Johnny Rook

Quote:


> Originally Posted by *szeged*
> 
> yeah if the specs keep coming down and down and eventually turn out to be true, *not really a reason to drop 780/titans to sidegrade to these*, even if theyre 50 bucks less, nvidia will follow suit and probably drop 50 bucks off also, im still hoping that amd isnt saying anything about specs so they can blow our minds when they release em, but things arent looking that way atm, unless you factor in mantle lol.


Yeah, I don't think I will move from what I have right now, no... Consciously, I wasn't expecting AMD to "blow our minds" by keeping the specs secret, although my inner child was kinda wishing for that. From what I saw in the AMD live stream -- mostly because of what I DIDN'T see -- I don't think I will be giving myself an R9 290X Christmas present after all.








"Hope" is the last thing to die. Let's see what October brings us. Until then, it's all speculation.


----------



## selk22

I think this card is going to finally be a worthy upgrade from the 580sc..


----------



## TooBAMF

Quote:


> Originally Posted by *InfoWarrior*
> 
> As far as I know Mantle or any proprietary api is bad for the PC industry. The consumer always wins when hardware companies compete on a level playing field. Instead of a level playing field its more like AMD is trying to clear the playing field this time around.


Yeah, I argued that most of the day yesterday, then gave up. We're on the same page.









I think it's interesting but potentially very dangerous for the industry and competition.


----------



## Doogiehouser

Quote:


> Originally Posted by *selk22*
> 
> I think this card is going to finally be a worthy upgrade from the 580sc..


I have been waiting for the same thing. The 580 is such a great card, especially for single 1080p monitors, that it has been making upgrading lately a very difficult decision.


----------



## selk22

Quote:


> Originally Posted by *Doogiehouser*
> 
> I have been waiting for the same thing. The 580 is such a great card, especially for single 1080p monitors, that it has been making upgrading lately a very difficult decision.


Yes, I am in the same boat. I am at 1920x1200 and the 580 has simply OC'd enough to fit my needs, or else the stock settings were enough to butter most games right up. I have only seen a couple of options as a reasonable upgrade:

1. 780 expensive but beastly

2. 760 SLI really good performance for a good price

3. 290x which is really catching my eye as an option lately. I am waiting for real world benchmarks.


----------



## 47 Knucklehead

What I wonder is how fast we can get waterblocks for the 290x.


----------



## anticommon

I just hope that Mantle doesn't go the way of PhysX... where only one or two AAA titles support it, everyone wants an open version that everyone's GPUs can take advantage of, and in the end everyone loses with crappy physics in most games.

Mantle needs the following to succeed:

- Ease of access for developers writing for both DX11.1 & Mantle
- Lots of games supporting it as soon as possible
- A +10-15% performance boost minimum
- Some sort of compatibility with Nvidia. Nvidia doesn't necessarily need to get the same performance boost; the programs just need to be able to run on machines that don't have Mantle.
- SteamBox (and presumably Linux) support; Apple support wouldn't hurt either

Also, I have a 240mm double-wide rad with my 4.5GHz 2500K on it, and 4 fans in push/pull config. Would there be much point in putting the R9 290X (or any GPU for that matter) in the same loop? Or would it defeat the purpose?


----------



## Testier

Quote:


> Originally Posted by *Baghi*
> 
> *UPDATED SPECS (link):*
> 
> Price US $599.99 (or 499.99€, £399.99 before taxes)
> Availability mid-October
> 28nm silicon
> 2,816 GCN stream processors
> 44 SIMDs (11 computing units)
> 172 TMUs
> 44 ROPs
> 512-bit memory interface
> 4GB/6GB GDDR5 video memory
> >300GB/s bandwidth
> 5.00 GHz memory clocks
> 
> ----
> 
> Source
> 
> TPU projected final specs:
> - 4 Independent Tessellation Units
> - ~3,000/2,800 Stream Processors
> - 512-bit Memory Interface
> - 4GB Video Memory
> - DirectX 11.2


Some of the other specs I have seen had 64 ROPs; I suppose that was rather unrealistic.


----------



## rusky1

Quote:


> Originally Posted by *TooBAMF*
> 
> Yeah I argued that most of the day yesterday then gave up. We're on the same page
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think it's interesting but *potentially* very dangerous for the industry and competition.


That's the key word right there. It all depends on which direction AMD decides to go with the tech. I've read that they're focusing more on the open-source side of things but *not completely open-source*. Only time will tell what that means exactly. From a programming perspective, if the syntax is fairly easy to become accustomed to, this seems like a very good way of extracting more performance from GPUs. I'll keep my fingers crossed that they don't pull a PhysX.


----------



## Regent Square

Quote:


> Originally Posted by *selk22*
> 
> Yes I am in the same boat. I am at 1920x1200 and the 580 has simply OC'd enough to fit my needs or else the stock setting were enough to butter most games right up. I have only seen a couple options as a reasonable upgrade..
> 
> 1. 780 expensive but beastly
> 
> 2. 760 SLI really good performance for a good price
> 
> 3. 290x which is really catching my eye as an option lately. I am waiting for real world benchmarks.


This.

You just summarized what boat I am floating in lately. Except I own a 570.


----------



## Ghoxt

It wouldn't surprise me if the R9 290 was the surprise and had lower specs than some expect...

Also, are we thinking that supporting both new consoles is what's driving the "Mantle" thingy from a strategy standpoint? A future roadmap of APUs that support it, even for PCs? Strategy down the road for AMD: make moves to get ahead in software middleware, where they can make improvements, and avoid the areas where they are not strongest... Interesting.


----------



## jomama22

Quote:


> Originally Posted by *Ghoxt*
> 
> It wouldn't surprise me if the R9 290 was the surprise and had lower specs than some expect...
> 
> Also are we thinking that supporting both new consoles is driving the "Mantle" thingy from a strategy standpoint? Future roadmap of APU's that support it etc for even PC's? Thinking strategy down the road for AMD to make moves to get ahead and not compete in the software middleware are where they can make improvements and avoid other areas where they are not strongest... Interesting.


I expect the 290 to fall about 5-10% behind a 780 at $499. That performance and price would make sense. Though $499 is high compared to the $299 280X (for only a 15% performance increase), they aren't competing with themselves. It also gives a little breathing room for price drops if needed.


----------



## bencher

Quote:


> Originally Posted by *Clockster*
> 
> Still all just rumors xD AMD Never confirmed that it would be 512bit...lol


Yes they did. That is the only way they could get that bandwidth.


----------



## Stay Puft

It's 4GB of memory, so it's definitely 512-bit.


----------



## wstanci3

Quote:


> Originally Posted by *bencher*
> 
> Yes they did. That is the only way they could get that bandwidth.


No, you can have a 384-bit bus with 4GB of VRAM.
Ex: the 660 Ti has a 192-bit bus with 2GB of VRAM.
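For what it's worth, the capacity side of this argument can be sketched quickly. This assumes one 32-bit GDDR5 chip per channel and the common 2 Gbit chip density; mixed densities across channels (as on the 660 Ti) break the uniform math:

```python
def uniform_capacity_gb(bus_width_bits: int, chip_density_gbit: int = 2) -> float:
    """VRAM capacity when every 32-bit channel carries one same-density chip."""
    chips = bus_width_bits // 32          # number of 32-bit channels
    return chips * chip_density_gbit / 8  # Gbit -> GB

print(uniform_capacity_gb(512))  # -> 4.0 : 512-bit gives 4 GB naturally
print(uniform_capacity_gb(384))  # -> 3.0 : 384-bit gives 3 GB (6 GB with 4 Gbit chips)
```

So a 4 GB card on a 384-bit bus isn't impossible; it just requires mixing chip densities the way the 660 Ti did to reach 2 GB on 192-bit, which is why 4 GB is usually read as a hint of a 512-bit bus.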


----------



## TomSG

I still can't get over the HORRIBLE naming system they've developed. Seriously, I'd like to meet the guys who come up with this stuff.


----------



## szeged

whats hard about 290x?


----------



## wermad

Quote:


> Originally Posted by *szeged*
> 
> whats hard about 290x?


I think "it's making some people hard"









Why do the specs keep changing??? At this rate, the article and this thread might as well go into the rumor section.


----------



## Robertdt

Quote:


> Originally Posted by *jomama22*
> 
> i expect the 290 to fall about 5%-10% behind a 780 @ $499. Performace and and price would make sense. Though the $499 is high compared to the $299 280x (for only 15% performance increase) they aren't competing with themselves. also gives a little breathing room for drops if needed.


If that's the case I will be skipping this generation. No way I'm paying $499 for a 7950 equivalent that can't even beat a 780. I'll pay $400 or less for a card that gives me 35-40% more performance than my current 7950.


----------



## Regent Square

Quote:


> Originally Posted by *Robertdt*
> 
> If that's the case I will be skipping this generation. No way I'm paying 499 for a 7950 equivalent that can't even beat a 780. *I'll pay 400 or less for a card that gives me 35-40%* more improvement in performance than my current 7950.


not even with 20nm


----------



## sugarhell

Quote:


> Originally Posted by *Regent Square*
> 
> not even with 20nm


A 290X between the 780 and Titan is already 40% (or close) over a 7950...


----------



## Regent Square

Quote:


> Originally Posted by *sugarhell*
> 
> A 290x between 780 and titan is already 40%(or close) over a 7950...


for 400 bucks no


----------



## sugarhell

Quote:


> Originally Posted by *Regent Square*
> 
> for 400 bucks no


I bet in 6-8 months he will get this card close to 400 bucks


----------



## 2010rig

Quote:


> Originally Posted by *Regent Square*
> 
> This.
> 
> You just summarized in what boat I am floating lately. Except I own 570


Welcome to my world, except I have a 470, and my primary uses aren't gaming.

btw - I thought you owned 3 7970's?








Quote:


> Originally Posted by *Regent Square*
> 
> I wont buy a camera and record it for you...
> 
> 1600p screens with 3 7970s do have lots of stuttering issues. Look it on youtube or elsewhere.
> 
> My 3 7970s do a fine job to play quake 3 maxed out and 2012 titles med-low settings. The higher the res, the more stutter you have. Don't believe me? Fine, go ask others who own 3+ screen set ups with high res. They will confirm what I said.
> 
> *PS If you started to talk about dp and dvi, then it makes sense why you want to go for 5x1, lol


----------



## Robertdt

Quote:


> Originally Posted by *Regent Square*
> 
> for 400 bucks no


I got my 7950 at $375 or so. Why should I have to pay more for its equivalent that can't even beat the refreshed 780?


----------



## Archngamin

Quote:


> Originally Posted by *Robertdt*
> 
> I got my 7950 at 375 or so. Why should I have to pay more for its equivalent that can't even beat the refreshed 780?


GTX 780 had/was part of a refresh?


----------



## Baghi

Quote:


> Originally Posted by *TomSG*
> 
> I still can't get over the HORRIBLE naming system they've developed. Seriously, I'd like to meet the guys who come up with this stuff.


R9-290X sounds terrifying.









BTW, has anyone seen tsm around?


----------



## Regent Square

Quote:


> Originally Posted by *2010rig*
> 
> Welcome to my world, accept I have a 470, and primary uses aren't gaming.
> 
> btw - I thought you owned 3 7970's?


I am not the only one?! Really.

I do own loads of cards:

3x 7970, as I leaned towards AMD at that time.

The 570 is for my price/performance desire; speaking of which, I am looking to satisfy it with the R9 and the free games that hopefully come with it.

Just for reference, I own 2 Titans and a (already sold) 7990. It was worth using for a few months.


----------



## Regent Square

Quote:


> Originally Posted by *sugarhell*
> 
> I bet in 6-8 months he will get this card close to 400 bucks


Lol, if there is a good deal on the market, I don't wait long to make a purchase.


----------



## Majin SSJ Eric

I really like R9 290X, to be honest. They had to do something, as the old naming scheme was rapidly approaching its end. Nvidia will have to change their naming scheme soon as well...


----------



## 2010rig

Quote:


> Originally Posted by *Regent Square*
> 
> I am not the only 1?! Really
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I do own loads of cards
> 
> 3 7970 as I got leaned towards AMD at that time.
> 
> 570 is for my price/performance desire, speaking of which, I am looking to satisfy it with R9 and free games which hopefully come with it.
> 
> Just for a reference, I own 2 Titans and(sold already) 7990. It was worth to use it for a few months.


Pics or it didn't happen.









btw - just playing with ya, though I still want to see dem pics.....


----------



## Regent Square

Quote:


> Originally Posted by *2010rig*
> 
> Pics or it didn't happen.


I am not at home now, but I will deliver when I get back. Work takes up almost all my time.

Edit:


----------



## rcfc89

Too bad they aren't releasing a dual-GPU version of this card, if it indeed reaches Titan levels at $600. I'd love to go 16x Tri-Fire.


----------



## Artikbot

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I really like R9 290X to be honest. They had to do something as the old naming scheme was rapidly approaching the end. Nvidia will have to change their naming scheme soon as well...


I agree. R9-290X sounds like some badass Nissan engine name.


----------



## sugarhell

Quote:


> Originally Posted by *Artikbot*
> 
> I agree. R9-290X sounds like some bad ass Nissan engine name


Rx?


----------



## provost

I know Regent is a bit of a shake-things-up kinda poster, but you gotta give him some credit for his wit.


----------



## EliteReplay

Quote:


> Originally Posted by *Robertdt*
> 
> If that's the case I will be skipping this generation. No way I'm paying 499 for a 7950 equivalent that can't even beat a 780. I'll pay 400 or less for a card that gives me 35-40% more improvement in performance than my current 7950.


That is plain stupid on your part... that's like asking Samsung to give you a 40% boost every year, which is not going to happen... I paid $340 for my HD 7950, and if AMD can provide a video card around $500 able to match a GTX 780, maybe I won't pay it but someone else will... remember that you can still sell your current GPU and put in the difference, or just get another HD 7950 for $210 and CF it... there is no reason for you to respond like that...


----------



## Disturbed117

I think I would pay $599 if it offers ~25% over a 7970.


----------



## Roaches

I sure missed a lot in this thread today... Is $599 the actual confirmed MSRP, or are we still speculating? :O


----------



## Mr357

Quote:


> Originally Posted by *Baghi*
> 
> R9-290X sounds terrifying.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> > BTW, has anyone seen tsm around?


You mean tsm106? I've been wondering the same thing.


----------



## wstanci3

Quote:


> Originally Posted by *Roaches*
> 
> I sure missed alot in this thread today...Is $599 the actual confirmed MSRP or we still speculating? :O


Speculation


----------



## Baghi

Quote:


> Originally Posted by *Mr357*
> 
> > You mean tsm106? I've been wondering the same thing.


Yup, him.


----------



## Roaches

Quote:


> Originally Posted by *wstanci3*
> 
> Speculation


Thanks.


----------



## lajgnd

Considering making the jump from a Titan to the R9 290X.

The raw performance in leaked benches and the hardware specs of the card itself seem to leave a bit to be desired...

But if Mantle can deliver "Titan-ridiculing performance" through optimizations, then I'm definitely on board.


----------



## Regent Square

Quote:


> Originally Posted by *provost*
> 
> I know Regent is bit of a shake things up kinda poster, but you gotta give him some credit for his wit


----------



## Roaches

Quote:


> Originally Posted by *lajgnd*
> 
> Considering making the jump from a Titan to R 290X.
> 
> The raw performance through leaked benches and raw hardware specs of the card itself seems to leave a bit to be desired...
> 
> But if Mantle can deliver "titan ridiculing performance" through optimizations, then I'm definitely on board.


Unless you're really unsatisfied with the Titan already... I really don't see it as wise to switch to the R9 290X for small marginal gains, if that's what they turn out to be, even with Mantle performance gains... 20nm is just 6-8 months away; I'd say it's more rewarding to keep your Titan until it's time to retire it. At least I would.


----------



## Fletcherea

I don't feel so bad getting my cute little ZOTAC 760 AMP! now. I never fall into the $400+ card category, and it seems nothing is really changing in my area of the price spectrum.


----------



## lajgnd

Quote:


> Originally Posted by *Roaches*
> 
> Unless you're really unsatisfied with the Titan already....I really don't see it wise to switch to the R9-290X over small marginal gains if it turns out to be, even with Mantle performance gains.... 20nm is just 6-8 months away, I'd say its more rewarding to keep your Titan until its time to retire it. At least I would


Well, that's the thing. It's all going to come down to the performance results.

If Mantle is able to do more than just a marginal performance increase (say 50%+) then I think it would be worth it.

If it's any less than that, I'm sure nvidia can compensate with optimized drivers or development moneyhats to make the switch not worth it.

Trust me, I'm not ready to jump ship immediately. I'm just keeping an open mind and waiting until I see the data to make any sort of decision.


----------



## Roaches

I see your point; very well then.


----------



## phinexswarm71

This card is a nice leap forward from my 7970; I'll probably buy it with my next build. In the meantime I will hold off until the price point drops to ~$400, all things considered, if the card delivers on its promises though.


----------



## SniperOct

Quote:


> Originally Posted by *Roaches*
> 
> Unless you're really unsatisfied with the Titan already....I really don't see it wise to switch to the R9-290X over small marginal gains if it turns out to be, even with Mantle performance gains.... 20nm is just 6-8 months away, I'd say its more rewarding to keep your Titan until its time to retire it. At least I would


If it does turn out to be better, or even trade blows, you can sell your Titan for ~$900 and buy a 290X for $599. You would make a ~$300 profit while getting a card with newer tech and more resale value. The Titan's value will probably depreciate at a rapid pace, or Nvidia might even lower prices. The case can also be made that the 780 is a better-value card than the Titan for certain users. I'd say one has more than enough reason to make the switch. I agree that a few fps is not normally a good reason to sidegrade/upgrade at the same price, but this is not that case.


----------



## Stay Puft

Quote:


> Originally Posted by *SniperOct*
> 
> If it does turn out to be better or even trade blows, you can sell your titan for +-$900 and buy a 290x for $599. You would make a $300 profit while getting a card with newer tech and more resale value. The titan's value will depreciate at a rapid pace. I'd say one has more than enough reason to make the switch. I agree that a few fps is not normally a good reason to sidegrade/upgrade if the price is the same. But this is not that case.


Who's going to pay $900 for a *used* Titan if the 290X trades blows with it at $599? When the 290X hits, the Titan's value is going to hit the crapper.


----------



## lajgnd

Quote:


> Originally Posted by *Stay Puft*
> 
> Whos going to pay 900 for a *used* titan if the 290X trades blows with it at 599? When the 290X hits titan's value is going to hit the crapper.


With the release of the 780, I think the second-hand Titan market is clearly aimed at people who aren't using the thing exclusively to play games, since it has much better compute performance.


----------



## TooBAMF

Quote:


> Originally Posted by *lajgnd*
> 
> With the release of the 780, I think the second hand Titan market is clearly aimed at people who aren't using the thing to exclusively play games since it's got the much better compute performance.


Exactly. People pay $800-$850 for used Titans on eBay, and the 290X isn't going to change that on release. In six months, if more games start using Mantle, it may. BF4 won't significantly change the selling price of used Titans if the 780 didn't, because the 290X is not significantly different from the 780. 3GB GTX 580s still go for over $200 on eBay; that's pretty insane when you can buy a 3GB 7970 for $270 AR.


----------



## wermad

Quote:


> Originally Posted by *2010rig*
> 
> Welcome to my world, accept I have a 470, and primary uses aren't gaming.
> 
> btw - I thought you owned 3 7970's?


Quote:


> Originally Posted by *2010rig*
> 
> Pics or it didn't happen.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> btw - just playing with ya.


That's:

3x 7970
Gtx 570
2x Titans
7990
3x 1600 monitors.

This is thousands of dollars in hardware, yet he can't afford a $50 p-n-s camera or a decent cell phone with a simple camera to take pics


----------



## Roaches

All I can imagine is that when 20nm is out, Nvidia and AMD will have chips crushing the Titan and 290X by miles, hopefully 1.5x or twice as fast, with more mature 4K+ resolution support, etc. Hopefully at a $500 price point, or at least GK110 and Hawaii XT rebrands with shrunken die sizes at 20nm....

(thinking out loud here)


----------



## Regent Square

Quote:


> Originally Posted by *wermad*
> 
> That's :
> 
> 3x 7970
> Gtx 570
> 2x Titans
> 7990
> 3x 1600 monitors.
> 
> This is thousands of dollars in hardware, yet he can't afford a $50 p-n-s camera or a decent cell phone with a simple camera to take pics


Hello, it is you again.

Hmm, what made you think I can't afford it?

*If you can think logically*, you'd see I don't have my setup in front of me, as I wrote in my posts. And secondly, why should I prove something to you? I don't owe you anything, as far as I can remember. Oh, and thanks for taking my advice about the Eyefinity setup.









Edit: I sold my 7990, not sure why you included it in the price pool. I think reading comprehension won't hurt; I mean, it is never too late...


----------



## Baghi

*AMD Radeon R9 290X Battlefield 4 Edition - Only 8,000 and No Specs For Pre-Order*:
Quote:


> Earlier this week we told you that AMD would be releasing a limited edition Radeon R9 290X Battlefield 4 video card. AMD Corporate Vice President of the Graphics Business Unit, Matt Skynner, said that limited quantities of the Radeon R9 290X Battlefield 4 Edition card will be available for pre-order only, starting October 3 with select partners.


Source: Legit Reviews


----------



## Forceman

Quote:


> Originally Posted by *Baghi*
> 
> *AMD Radeon R9 290X Battlefield 4 Edition - Only 8,000 and No Specs For Pre-Order*:
> Source: Legit Reviews


Well, at least that puts a number on how many die-hard fans AMD thinks it has.









Sucks though, since it means no NDA lift next week. So we get to keep waiting.


----------



## Regent Square

Quote:


> Originally Posted by *Forceman*
> 
> Well, at least that puts a number on how many die-hard fans AMD thinks it has.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sucks though, since it means no NDA lift next week. So we get to keep waiting.


I guess I am in


----------



## TheLAWNOOB

Quote:


> Originally Posted by *Baghi*
> 
> *AMD Radeon R9 290X Battlefield 4 Edition - Only 8,000 and No Specs For Pre-Order*:
> Source: Legit Reviews


Do you even know what cooler the card has when you preorder?

Must suck for those who preorder if that special edition is just a regular 290X with some stickers on top.


----------



## lajgnd

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Do you even know what the cooler the card has when you preorder?
> 
> Must suck for those who preorder if that special edition is just a regular 290X with some stickers on top.


I got the impression that it's just the standard R9-290X we've all seen, with a copy of Battlefield 4 bundled in for free; nothing more, nothing less.


----------



## TooBAMF

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Do you even know what the cooler the card has when you preorder?
> 
> Must suck for those who preorder if that special edition is just a regular 290X with some stickers on top.


Wouldn't be surprised; that's what EVGA used to do with their SC cards. Probably still do, idk.


----------



## 2010rig

Quote:


> Originally Posted by *Forceman*
> 
> Well, at least that puts a number on how many die-hard fans AMD thinks it has.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sucks though, since it means no NDA lift next week. So we get to keep waiting.


What the .... they expect people to buy the card without knowing the full specs?

Bulldozer flashbacks...


----------



## Durquavian

Quote:


> Originally Posted by *2010rig*
> 
> What the .... they expect people to buy the card without knowing the full specs?
> 
> Bulldozer flashbacks...


Careful, the article seems misleading. It doesn't actually establish that specs/benches won't be released first, nor does it give a date for the actual NDA lift. So, like most of you say, wait for the facts.


----------



## Stay Puft

All 8000 will sell in 3 hours


----------



## wermad

Quote:


> Originally Posted by *Regent Square*
> 
> Hello, it is you again.
> 
> Hmm, what made you think I cant afford it?
> 
> *If you can think logically* I don't have my set up in front of me as I wrote in posts. And secondly, why should I prove smth. to you, I don't own you anything as far as I can remember. Oh, and thanks for taking my advice about eyefinity set up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: I sold my 7990, not sure why u included it in price pool. I think reading comprehension wont hurt, I mean it is never too late...


Well, if mom and pop get you your hardware, ask for a decent camera for pics. Otherwise you can talk all you want, and that's pretty much all you've done.









I've had a few cards myself, but I've progressively been switching as the hardware changes:

4870x2 + 4870
470 x3
480 x4
6950-70 x3
6970 Lightning x3
560 Ti 448 x3
GTX 590 x2
GTX 580 3gb x4
7970 x2
670 4gb
690 x2
Titans x2
780 x3 (now).

I'll leave you alone now so you can continue to smack talk nonsense here in this thread so I can amuse myself (a little bit). Good day young sir


----------



## Artikbot

Quote:


> Originally Posted by *wermad*
> 
> Well if mom and pop get your hardware, ask for a decent camera for pics. Other wise you can talk, and that's pretty much all you've done, all you want
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If had a few cards myself but progressively have been switching as the hardware changes:
> 
> 4870x2 + 4870
> 470 x3
> 480 x4
> 6950-70 x3
> 6970 Lightning x3
> 560 Ti 448 x3
> GTX 590 x2
> GTX 580 3gb x4
> 7970 x2
> 670 4gb
> 690 x2
> Titans x2
> 780 x3 (now).
> 
> I'll leave you alone now so you can continue to smack talk nonsense here in this thread so I can amuse myself (a little bit). Good day young sir


Oh, what a massive amount of money wasted.


----------



## Baghi

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Do you even know what the cooler the card has when you preorder?


You see, this is the most interesting part. It's like buying a plot of land without knowing what part of the country or city it's located in, lol. On a serious note, the stated pre-order date is 3 Oct, so there's still plenty of time, and you can expect anything between now and then.
Quote:


> Must suck for those who preorder if that special edition is just a regular 290X with some stickers on top.


Just like what EVGA did with their GTX 580 Call of Duty editions.


----------



## psyside

Quote:


> Originally Posted by *wermad*
> 
> Well if mom and pop get your hardware, ask for a decent camera for pics. Other wise you can talk, and that's pretty much all you've done, all you want
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If had a few cards myself but progressively have been switching as the hardware changes:
> 
> 4870x2 + 4870
> 470 x3
> 480 x4
> 6950-70 x3
> 6970 Lightning x3
> 560 Ti 448 x3
> GTX 590 x2
> GTX 580 3gb x4
> 7970 x2
> 670 4gb
> 690 x2
> Titans x2
> 780 x3 (now).
> 
> I'll leave you alone now so you can continue to smack talk nonsense here in this thread so I can amuse myself (a little bit). Good day young sir


3x 780 on an i5? Non-logical move, bro.


----------



## wermad

Quote:


> Originally Posted by *Artikbot*
> 
> Oh, what a massive amount of money wasted.


If you can't play big, then go home







. It's a common disease, the upgrade bug; some of us have it worse than others







. No different than many other hobbies out there.


----------



## wermad

Quote:


> Originally Posted by *psyside*
> 
> 3x 780 on i5? non-logical move bro.


Haha, says the man with an i5 too









I only game; I don't care about benchmarks. Besides, this CPU clocks very nicely. It's already better than my old 2700K. The logic is in saving money while not worrying about having more e-peen with the baddest CPU around.

Btw, my old 2700k did the exact same fps in Metro LL compared to a 3930k in Surround







You don't need the biggest cpu to game


----------



## Robertdt

Quote:


> Originally Posted by *EliteReplay*
> 
> That is plain stupid on your part... that's like asking Samsung to give you a 40% boost every year, which is not going to happen... I paid $340 for my HD 7950, and if AMD can provide a video card around $500 able to match a GTX 780, maybe I won't pay for it but someone else will... remember that you can still sell your current GPU and put in the difference... or just get another HD 7950 for $210 and CF it... there is no reason for you to respond like that...


Sure there is. It's called being a consumer who is utterly fed up with corporate malfeasance and greed from every angle, including in the tech industry and gaming.

I'm not sure if anyone told ATI and Nvidia, but $500-600 is a LOT of money. (Even if you can sell your prior card to recoup some of it, that's not relevant to their retail pricing.)

If the card doesn't even give 25% better framerates, you're getting (in Arma 3, for instance) maybe *12-13* additional frames in your game for *500 dollars!* That's absurd, especially when you read about Nvidia and AMD price-fixing (in another thread on here).

It's about value for what you get. A GPU basically does one thing: it renders games. The price-to-value ratio is horribly skewed with GPUs. I'll pay $300 for an Intel CPU and not be terribly upset, because that benefits me in multiple areas of computing. Same for my SSD, same for my monitor (a Korean 1440p vs. 10 more fps?).

All I get from a GPU is a few more frames per second. And for $500-600, it had better be more than a few (which to me means 35-40% or more performance), or to me it is simply not worth it.

Expensive GPUs (7800 GTX, X1900XTX) are the ONLY top-quality computing parts I've ever regretted buying. I'm even very happy with my K70 keyboard, because at least it delivers a lot of quality for the price.

But paying $500 for 10-15 more frames and essentially nothing else? Not a good value. I resent how much they try to charge, and I don't fully believe it's a fair price relative to what it costs them to make, especially when we're essentially getting refreshes two years later.
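The 12-13 frame figure above implies a baseline of roughly 50 fps; as a quick sketch of the cost-per-frame argument (the 50 fps baseline is an assumption inferred from the post, not a measured number):

```python
# Hypothetical value-per-frame math behind the complaint above,
# assuming a ~50 fps baseline (e.g. Arma 3) and a $500 card.
baseline_fps = 50
uplift = 0.25          # a 25% performance improvement
card_price = 500

extra_frames = baseline_fps * uplift           # 12.5 additional fps
dollars_per_frame = card_price / extra_frames  # cost of each extra fps
print(extra_frames, dollars_per_frame)  # 12.5 40.0
```

At $40 per additional frame, the "not worth it" conclusion follows directly from the assumed baseline.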


----------



## psyside

Quote:


> Originally Posted by *wermad*
> 
> I only game, I don't care for benchmarks. Besides, this cpu clocks very nicely. Its already better then my old 2700k. The logic comes in saving money while not worrying about having more e-peen with the baddest cpu around.
> 
> Btw, says the man with an i5. Oh, my old 2700k did the exact same fps in Metro LL compared to a 3930k in Surround
> 
> 
> 
> 
> 
> 
> 
> You don't need the biggest cpu to game


Why so defensive? I didn't say your CPU is bad, just that your configuration isn't balanced, and I don't like or understand the logic behind setups like that. You have over $2K in GPUs and you cheap out on the CPU; a 39xx is the correct CPU for that configuration. I'd rather have 2x SLI than 3x SLI with a CPU holding me back. Even if it's only 10%, it still does, as you can see in the Crysis 3 benchmarks, a game I'm sure you play.

Also, I have a single GPU and I don't play games lately, so why would I need an i7? You're just getting way too defensive about this. And e-peen? You have 780s in 3x SLI; no room to talk there.

Yes, you don't need the biggest CPU to game on a single GPU, but for 3x you DO.


----------



## Regent Square

Quote:


> Originally Posted by *wermad*
> 
> *Well if mom and pop get your hardware, ask for a decent camera for pics*. Other wise you can talk, and that's pretty much all you've done, all you want
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If had a few cards myself but progressively have been switching as the hardware changes:
> 
> 4870x2 + 4870
> 470 x3
> 480 x4
> 6950-70 x3
> 6970 Lightning x3
> 560 Ti 448 x3
> GTX 590 x2
> GTX 580 3gb x4
> 7970 x2
> 670 4gb
> 690 x2
> Titans x2
> 780 x3 (now).
> 
> I'll leave you alone now so you can continue to smack talk nonsense here in this thread so I can amuse myself (a little bit). Good day young sir


If you say this, it definitely says something about what you are as a person. If your parents did buy all of these for ya, great! Other than that, keep your own frame of reference to yourself (aka, not everyone's parents buy their children expensive toys).


----------



## Regent Square

Quote:


> Originally Posted by *psyside*
> 
> 3x 780 on i5? *non-logical* move bro.


That is what most of his posts tend to be.


----------



## maarten12100

Quote:


> Originally Posted by *Artikbot*
> 
> Oh, what a massive amount of money wasted.


Why? The only cards on that list that are crap are the 690s (gimped Kepler GK104 cores), the 670 (gimped Kepler that can't utilize its 4GB since there's no horsepower behind it), and the 590 (unable to overclock). "Epic fail"


----------



## Stay Puft

Quote:


> Originally Posted by *Artikbot*
> 
> Oh, what a massive amount of money wasted.


My list looks similar.


----------



## wermad

Quote:


> Originally Posted by *psyside*
> 
> Why so defensive? i don't say your cpu is bad. just your configuration is not balanced, and i dont like or understand the logic behind stuff like that. You have over 2K dollars in gpus, and you cheap out on cpu, 39xx is the correct cpu for that configuration, i would rather have 2x sli then 3x sli with cpu which holding me back, doesen't matter if its 10% its still does, like you can see in Crysis 3 benchmarks which is game i'm sure you play.
> 
> Also i got single gpu, and i dont' play games lately why would i need i7? your just getting way much defensive about this. Also e-peen? you got 780 3x sli, no logic again to talk about it.
> 
> Yes you don't need the biggest cpu to game for single gpu, for 3x you NEED.


Few games take advantage of the extra threads that come with most of the i7s, and if they do, it's not a huge difference. BF3 is one of them, but I hardly play that.

If my lowly 2700K can match a 3930K, and my 4670K is better than my 2700K, what does that tell you? Your assumption is not valid. You conclude this because most reviews and high-end multi-GPU setups use a top-of-the-line CPU. When it comes to gaming, especially Surround gaming, the GPU is most important; the CPU just has to ensure it's not creating a road block (i.e., a bottleneck). Heck, I can run at 4.5 and still prevent a bottleneck. So far, I know anything below 4.4 will slow down my GPU(s), but above that, it won't hold them back. With the 2700K, there was never once a difference in Surround between running 4.7 and 5.0. Changing GPUs, on the other hand, did make a difference, and that's why I kept my lowly SB for a while. The Haswell change was to finally move on to PCIe 3.0, and I happened to adore the Sniper5 Z87 board.

If you're not sure about this, go to the Surround thread. We have members running 920s with two or three Titans, and guys running SB CPUs with triple 1440 monitors.

Maybe, down the road, extra threads and cores will be an advantage, but as it stands now, I'm right up there with the big CPUs. Heck, this chip might easily push beyond 5.0. And it was cheap! Can't argue with that. The proof is in the pudding; kind of hard to believe a little CPU can do that? Just look at your SB i5









Btw, calling someone out on their hardware means you're looking to pick a fight. I could easily pick apart your rig(s) and criticize them, but I'm not here for that.


----------



## HowHardCanItBe

Guys, don't start a fight in here. It's getting quite tiring to see every day now.


----------



## TheLAWNOOB

Quote:


> Originally Posted by *5entinel*
> 
> Guys, don't start a fight in here. It's getting quite tiring to see everday now.


The 290X should do well against nVidias offerings at a better price.

Btw, did you get demoted? Last time I saw you, your name wasn't dark green









Nice avatar though


----------



## Stay Puft

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> The 290X should do well against nVidias offerings at a better price.
> 
> Btw, did you get demoted? Last time I saw you your name is not dark green
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice avatar though


Sentinel can do what he wants. Don't question the all powerful


----------



## wermad

Quote:


> Originally Posted by *maarten12100*
> 
> Why is that the only cards that are crap on that list are the 690's(gimped kepler cores GK104) 670(gimped kepler and can't utilize the 4GB since there is no horsepower) and the 590 (unable to overclock) "epicfail"


Yeah, some of the GPUs weren't great on paper, but they were pretty good overall, especially in Surround. I guess it's a method of taste testing, if you will. Goes to show how imperfect both Nvidia and AMD are, and hence why I wouldn't hesitate to go with Hawaii if it turns out to be epic.

Quote:


> Originally Posted by *Stay Puft*
> 
> My list looks similar.










It's a bit Nvidia-leaning, but that's because of the tremendous amount of software headaches with AMD. I like AMD cards, so I really don't have an allegiance to one; it's just that Nvidia has a more stable (albeit expensive) product. Come on Hawaii, me wants more specs!!!!!!!


----------



## Stay Puft

So since the NDA won't be dropping for the preorder, it's safe to assume these cards won't be had till late October


----------



## wermad

Seeing all the leaked pics, have review samples been issued yet? I'm sure reviewers will need a couple of weeks for testing.


----------



## szeged

lol, AMD is really hoping people pre-order with 0 specs leaked. I'm sure the die-hard AMD fans will do it, but I hope those with at least two braincells making contact don't.


----------



## 2010rig

Quote:


> Originally Posted by *Durquavian*
> 
> Careful. The article seems misleading. It doesn't reference anything as to fact that specs/benches wont be released first. Nor does it give a date for the actual NDA lift. So like most of you say, wait for the facts.


I sure hope people are smart enough NOT to buy a card without knowing the full specs and performance.









I know everything is pretty much a rumor now, but if these guys are right, it would definitely bring back Bulldozer flashbacks. We all know how that ended...
Quote:


> Earlier this week we told you that AMD would be releasing a limited edition Radeon R9 290X Battlefield 4 video card. AMD Corporate Vice President of the Graphics Business Unit, Matt Skynner, said that limited quantities of the Radeon R9 290X Battlefield 4 Edition card will be available for pre-order only, starting October 3 with select partners.
> 
> What we have learned since bringing you that news is that AMD will be releasing a grand total of 8,000 AMD Radeon R9 290X Battlefield 4 Edition cards. This means globally and not just regionally! *We also learned that the cards specifications and the final price of the card will not be disclosed when the pre-order begins. This means that you'll have to put down a deposit without knowing the price or clock speeds of the card that you will be purchasing.* This is a very interesting and something we've never seen done before in recent times. To top that off only certain Add-In board partners will be offering it.
> 
> http://www.legitreviews.com/amd-radeon-r9-290x-battlefield-4-edition-8000-specs-pre-order_125224#MjYKFYzB3jSPqBD2.99


See the bolded part, that's quite interesting and unprecedented.


----------



## psyside

Sorry for the offtopic, I wasn't looking for a fight for sure


----------



## HowHardCanItBe

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Quote:
> 
> 
> 
> Originally Posted by *5entinel*
> 
> Guys, don't start a fight in here. It's getting quite tiring to see everday now.
> 
> 
> 
> The 290X should do well against nVidias offerings at a better price.
> 
> Btw, did you get demoted? Last time I saw you your name is not dark green
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice avatar though
Click to expand...

Nope! All is A okay here.


----------



## wermad

Quote:


> Originally Posted by *szeged*
> 
> lol amd really hoping people pre order with 0 specs leaked, im sure the die hard amd fans will do it, but i hope those with atleast two braincells making contact dont do it.


I hate the launch rush as well. Did it with the 470 and 780, and what a nightmare. I never do preorders, and I'm sure not going to do launch ordering either. Best to let the dust settle and get a good sense of what this card can do first. It also avoids price gouging by retailers due to the initial high demand (if there is one).


----------



## Regent Square

Quote:


> Originally Posted by *szeged*
> 
> lol amd really hoping people pre order with 0 specs leaked, im sure the die hard amd fans will do it, but i hope those with atleast *two braincells* making contact dont do it.


They give a free BF4 with it. Nice discount, imho. Unless you are all about spending in a "the more the better" manner, which is exactly what the bolded text corresponds to.


----------



## szeged

Quote:


> Originally Posted by *wermad*
> 
> I hate the launch rush as well. Did it w/ the 470 and 780 and what a nightmare. I never do preorders but I'm sure not going to do launch ordering. Best to let the dust settle and get a good sense of what this card can do first. Also, avoids price gauging by retailers due to the initial high demand (if there will be one).


I'll never pre-order a card with 0 listed specs, nothing but speculation and rumors, lol. It's setting yourself up for disappointment, really; people will get the card and then be like... I pre-ordered... this? That is, if they're expecting super amazing results from it. It'll be a good card, no doubt, but for what I'll be using it for, I need to know how it'll do in max-OC benching, lol


----------



## Dart06

I'm thinking about biting on this. Why not? Either way it will be better than my single current 670.


----------



## szeged

Quote:


> Originally Posted by *Dart06*
> 
> I'm thinking about biting on this. Why not? Either way it will be better than my single current 670.


In your situation, yeah, the pre-order would make sense; depending on its actual price, it'll be an upgrade either way. But for people with 780s and Titans, I would advise waiting, lol.


----------



## Blackops_2

Quote:


> Originally Posted by *2010rig*
> 
> I sure hope people are smart enough to NOT buy a card without the full specs and performance known to them.


I think you're giving some people too much credit, lol









I guess we all find out the third, either way i'm getting impatient.

Anymore news on the R9 290? (not the 290x)


----------



## Dart06

Quote:


> Originally Posted by *szeged*
> 
> in your situation yeah the pre order would make sense, depending on its actual price, itll be an upgrade either way, but for people with 780s and titans, i would advice to wait lol.


If I had a 780 or a Titan, I would skip this card altogether. I would wait for the next set, I think.

I just need to find out what retailers are going to have it for pre-order.


----------



## Regent Square

Quote:


> Originally Posted by *Dart06*
> 
> If I had a 780 or a Titan, I would skip this card all together. I would wait for the next set I think.


Your 670 doesn't need an upgrade either. It's more for Fermi owners like me.


----------



## szeged

Quote:


> Originally Posted by *Dart06*
> 
> If I had a 780 or a Titan, I would skip this card all together. I would wait for the next set I think.
> 
> I just need to find out what retailers are going to have it for pre-order.


For normal everyday gaming use, yeah, it would be fine to stick with 780s/Titans, but I'm in it for the maximum... pursuit of performance


----------



## davio

Hmm, I agree: if you don't have information about how the card will perform, it's pointless to purchase it. I know I'll probably be CrossFiring 280Xs, unless the 290X happens to match the performance of 280Xs in CF for the same price... but that will never happen. Use your brain before you buy, or else you might suffer from buyer's remorse!


----------



## szeged

Buyer's remorse is the worst, lol. I had it when the 780 Classifieds were coming out and toppled the Titans for a while, but then we got the voltage hack and everything was fine and dandy









then i ended up getting 780 classifieds cuz i couldnt stand not trying them first hand


----------



## Dart06

Quote:


> Originally Posted by *Regent Square*
> 
> Your 670 does not need an upgrade either. It is more for Fermi owners like me.


When I want to play my games at high/max settings on my 120Hz monitor, I do. My single 670 doesn't cut it. I used to have two of them in SLI, and that was decent enough.


----------



## Alatar

I still doubt that the card is even going to match a 780 in most things, especially once OC'd.

1) AMD's Firestrike score puts it at ~17% faster than a 7970 GHz (or not even a 7970 GHz, since the comparison was done against the 280X, and the latest info is that AMD is again favoring efficiency and actually will not clock the 280X as high as the 7970 GHz)

2) Price is lower than that of the 780

3) AMD clearly does not want to reveal clear performance numbers and they're always referring to efficiency, or software optimizations

4) to quote the VC article about the leaked Korean benches:
Quote:


> The leaker claims that his sample could run at a 1020 MHz core clock and a 1250 MHz memory clock. That would be the peak performance for the new Radeon (we don't know yet if this is the reference clock). You may think that a 1250 MHz clock for the memory is not very high, but that's due to the wider memory bus. This is in fact a 5 GHz effective speed with a bandwidth of 320 GB/s. The Radeon R9 290X will operate at a lower clock (somewhere between 800 and 900 MHz); it has a dual-BIOS feature for a reason. Some actually call it Turbo mode. With this feature enabled, the card will go much higher (near 1 GHz in boost mode), but this will significantly increase the TDP.


And now they're asking people to pre-order these things without knowing specs or performance... I'm sorry but all I hear is "underwhelming launch incoming".
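The bandwidth figure in that quote does check out; as a quick sanity check (plain GDDR5 arithmetic on the leaked numbers, nothing official from AMD):

```python
# GDDR5 bandwidth math behind the quoted leak:
# 1250 MHz memory clock x 4 (GDDR5 transfers four bits per pin per clock)
# = 5.0 GT/s effective, across a 512-bit (64-byte) bus.
memory_clock_mhz = 1250
effective_rate_gtps = memory_clock_mhz * 4 / 1000  # 5.0 GT/s ("5 GHz effective")
bus_width_bits = 512

# Bandwidth in GB/s: transfers per second x bytes moved per transfer.
bandwidth_gbps = effective_rate_gtps * bus_width_bits / 8
print(bandwidth_gbps)  # 320.0, matching the 320 GB/s in the leak
```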


----------



## Regent Square

Quote:


> Originally Posted by *davio*
> 
> Hmm I agree *if you don't have information about how the card will perform it is pointless to purchase it*. I know I'll probably be cross firing 280X unless the 290X happens to match the performance of 280X in CF for same price... But that will never happen. Use your brain before you buy or else you might suffer from buyers remorse!


Gibbo from OCUK told us already. Not much info at hands, it seems.


----------



## Blackops_2

Quote:


> Originally Posted by *Alatar*
> 
> I still doubt that the card is even going to match a 780 in most things, especially once OC'd.
> 
> 1) AMD's firestrike score puts it at ~17% faster than a 7970GHz. (or not even a 7970GHz as the comparison was done against the 280X and the latest info is that AMD is again favoring efficiency and actually will not clock the 280X as high as the 7970GHz)
> 
> 2) Price is lower than that of the 780
> 
> 3) AMD clearly does not want to reveal clear performance numbers and they're always referring to efficiency, or software optimizations
> 
> 4) to quote the VC article about the leaked Korean benches:
> And now they're asking people to pre-order these things without knowing specs or performance... I'm sorry but all I hear is "underwhelming launch incoming".


My feelings as well


----------



## 2010rig

Quote:


> Originally Posted by *Dart06*
> 
> I'm thinking about biting on this. Why not? Either way it will be better than my single current 670.


A 780 is also faster than your 670, and at least you know what you're getting in return.


----------



## szeged

idk if im more excited to try the 290x or to get more 780s/titans if the price drops


----------



## wstanci3

Quote:


> Originally Posted by *szeged*
> 
> idk if im more excited to try the 290x or to get more 780s/titans if the price drops


szeged, _how many_ cards do you have?


----------



## 2010rig

Quote:


> Originally Posted by *wstanci3*
> 
> szeged, _how many_ cards do you have?


At last count, 22. (I bet I'm close)


----------



## szeged

Quote:


> Originally Posted by *wstanci3*
> 
> szeged, _how many_ cards do you have?


oh lawd you had to ask

9 titans
2 evga 780 classifieds
2 reference 780s
2 680 classifieds
4 7970s, down from 6 after selling the two runts
2 7950s
1 7870

and some various other older gen cards that i cant find or forgot about lol


----------



## wstanci3

Quote:


> Originally Posted by *szeged*
> 
> oh lawd you had to ask
> 
> 9 titans
> 2 evga 780 classifieds
> 2 reference 780s
> 2 680 classifieds
> 4 7970s, down from 6 after selling the two runts
> 2 7950s
> 1 7870
> 
> and some various other older gen cards that i cant find or forgot about lol


Oh my god







Sitting on a pile of Titans, I'm jealous.
Ladies and Gentlemen, the Ebay hunter. I have no words.


----------



## szeged

oh and two 7990s lol


----------



## bencher

Quote:


> Originally Posted by *szeged*
> 
> oh and two 7990s lol


Good job...


----------



## Majin SSJ Eric

The silliest thing I've ever heard: trying to get people to preorder a card that they know nothing about! I would never spend this kind of money on a product purely on faith. Can you imagine the outrage from people who preordered if this thing is seriously behind the 780/Titan? It could be a catastrophic marketing blunder by AMD if this card isn't up to snuff. At the least they could come clean and show what performance it does have before taking people's money, instead of saying, when they complain, "well, we never promised it would be faster."

Hopefully the card delivers and all of this is irrelevant, but it's starting to leave a bad taste in my mouth. Good luck to anybody crazy enough to preorder on faith...


----------



## szeged

Yeah, preordering on faith with AMD just puts up giant alarms for me, lol. Could go either way: pre-order and get an amazing card, or pre-order and get dumped on.


----------



## 2010rig

Well after reading it carefully, it says you have to put down *a deposit*, not pay the full price, because the full price along with specs won't be revealed yet.


----------



## szeged

Quote:


> Originally Posted by *2010rig*
> 
> Well after reading it carefully, it says you have to put down *a deposit*, not pay the full price, because the full price along with specs won't be revealed yet.


can you get your deposit back if it turns out to suck?


----------



## Majin SSJ Eric

Why on earth would you put a down payment on something when you don't even know how much it's going to cost? It's just insane!


----------



## Dart06

Quote:


> Originally Posted by *szeged*
> 
> can you get your deposit back if it turns out to suck?


Yes, like any other pre order you can get your money back if you change your mind and cancel. Most websites don't charge you until a product ships.


----------



## 2010rig

Quote:


> Originally Posted by *szeged*
> 
> can you get your deposit back if it turns out to suck?


Define "suck"









No idea if you can get your money back or not, I would *assume* it's refundable.


----------



## szeged

Quote:


> Originally Posted by *2010rig*
> 
> Define "suck"


bulldozer


----------



## Majin SSJ Eric

Yeah but I'd certainly hope I knew how much money the thing was going to cost me before they charge my card and ship it...


----------



## wstanci3

See, if Amazon got a stock of those bundled 290x's I wouldn't mind putting down money at that time because you aren't charged until it ships. Amazon iz da best.


----------



## Regent Square

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Why on earth would you put a down payment on something that you don't even know how much its going to cost? Its just insane!


cause it comes with a discount for bf4 players..


----------



## Kinaesthetic

Quote:


> Originally Posted by *Regent Square*
> 
> I am not the only 1?! Really
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I do own loads of cards
> 
> 3 7970 as I got leaned towards AMD at that time.
> 
> 570 is for my price/performance desire, speaking of which, I am looking to satisfy it with R9 and free games which hopefully come with it.
> 
> Just for a reference, *I own 2 Titans and(sold already) 7990*. It was worth to use it for a few months.


Quote:



> Originally Posted by *Regent Square*
> 
> Your 670 does not need an upgrade either. *It is more for Fermi owners like me*.


I'm normally not one to do this, but doesn't this seem fishy to anyone else besides me?

You own "loads of cards" in one post, yet you label yourself as a Fermi owner, and nothing else, in another. Logically, your posts indicate you'd be upgrading from Fermi to Hawaii, which would imply that you do not have any of those cards, minus the Fermi (GTX 570) listed in the first post linked.

If mods feel like this is attacking someone, feel free to remove. But I think something is fishy, or I'm just too tired and am not reading this correctly.


----------



## Regent Square

Quote:


> Originally Posted by *2010rig*
> 
> Define "suck"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No idea if you can get your money back or not, I would *assume* it's refundable(tm).


Fixed


----------



## wstanci3

Quote:


> Originally Posted by *Regent Square*
> 
> cause it comes with a discount for bf4 players..


You are willing to put down an unconfirmed amount of money on a product that has unconfirmed performance for a $50 game?


----------



## Regent Square

Quote:


> Originally Posted by *Kinaesthetic*
> 
> I'm normally not one to do this, but doesn't this seem fishy to anyone else besides me?
> 
> You own "loads of cards" in one post, yet you label yourself as a Fermi owner, and nothing else, in another. Logically, your posts indicate you'd be upgrading from Fermi to Hawaii, which would imply that you do not have any of those cards, minus the Fermi (GTX 570) listed in the first post linked.
> 
> If mods feel like this is attacking someone, feel free to remove. But I think something is fishy, or *I'm just too tired and am not reading this correctly*.


----------



## Regent Square

Quote:


> Originally Posted by *wstanci3*
> 
> You are willing to put down an unconfirmed amount of money on a product that has unconfirmed performance for a $50 game?


60$

AND it is more or less confirmed by Gibbo. So, yes.


----------



## 2010rig

Quote:


> Originally Posted by *wstanci3*
> 
> You are willing to put down an unconfirmed amount of money on a product that has unconfirmed performance for a $50 game?


AMD said it will ridicule Titan..... in December with the Mantle patch.


----------



## Majin SSJ Eric

Stupidest thing ever. Just release the numbers and be done with it. If you're so worried about Nvidia's reaction then hurry up and get the hardware out asap...


----------



## wstanci3

Quote:


> Originally Posted by *Regent Square*
> 
> 60$
> 
> AND it is more or less confirmed by Gibbo. So, yes.


Alright, then. You are committed. Good on ya.


----------



## wstanci3

Quote:


> Originally Posted by *2010rig*
> 
> AMD said it will ridicule Titan..... in December with the Mantle patch.


I'm really expecting a Fixer video on that. I'm calling it right now.


----------



## 2010rig

Quote:


> Originally Posted by *wstanci3*
> 
> I'm really expecting a Fixer video on that. I'm calling it right now.


NVIDIA should make one this time around called "The Promiser"


----------



## wermad

Quote:


> Originally Posted by *Blackops_2*
> 
> This ^ was about to come in and say it but you clarified. Nothing wrong with a high clocked i5 and Tri/Quad setups. Hell there isn't too much difference between the i5/i7 (quads) in those scenarios so if everyone wants to be critical about it they should be running x79 for Tri/Quad setups.


I'm no expert but having owned a lot of different hardware does give you some knowledge or insight into the workings.
Quote:


> Originally Posted by *szeged*
> 
> ill never pre order a card with 0 listed specs other than speculations and rumors lol, its setting yourself up for dissapointment really, people will get the card then be like...i pre ordered....this? that is if theyre expecting super amazing results from it, itll be a good card no doubt, but for what ill be using it for, i need to know how itll do in max OC benching lol


Ditto; let's see how it stacks up first.
Quote:


> Originally Posted by *szeged*
> 
> for normal gaming every day use yeah it would be fine to stick with 780s/titans, but im in it for the maximum.....pursuit of performance


Surround is not normal usage








Quote:


> Originally Posted by *Alatar*
> 
> I still doubt that the card is even going to match a 780 in most things, especially once OC'd.
> 
> 1) AMD's firestrike score puts it at ~17% faster than a 7970GHz. (or not even a 7970GHz as the comparison was done against the 280X and the latest info is that AMD is again favoring efficiency and actually will not clock the 280X as high as the 7970GHz)
> 
> 2) Price is lower than that of the 780
> 
> 3) AMD clearly does not want to reveal clear performance numbers and they're always referring to efficiency, or software optimizations
> 
> 4) to quote the VC article about the leaked Korean benches:
> And now they're asking people to pre-order these things without knowing specs or performance... I'm sorry but all I hear is "underwhelming launch incoming".


Alatar strikes again!

Yeah, its interesting on how mum amd is on the specs. Either they're playing coy or just not admitting something....


----------



## szeged

Quote:


> Originally Posted by *wstanci3*
> 
> I'm really expecting a Fixer video on that. I'm calling it right now.


amd wouldnt smash a titan, they would rather take it home and have a better benching experience


----------



## Artikbot

Quote:


> Originally Posted by *wermad*
> 
> If you can't play big, then go home
> 
> 
> 
> 
> 
> 
> 
> . Its a common disease, the upgrade bug, some of us have it worse then others
> 
> 
> 
> 
> 
> 
> 
> . No different then many other hobbies out there.


I guess









I like high end hardware, but even if I had the money to spend, I could not justify it.

1200€ every three years is what I can justify, and even that, it still hurts a little in the inside


----------



## Regent Square

Quote:


> Originally Posted by *wstanci3*
> 
> Alright, then. You are committed. Good on ya.


pffs. The best deal for me=/= the best deal for ya. Got it


----------



## wstanci3

Quote:


> Originally Posted by *szeged*
> 
> amd wouldnt smash a titan, they would rather take it home and have a better benching experience


Now that I know you have all those Titans, I just picture Jim Carrey sitting on a pile of Titans. Lol


----------



## flopper

I have faith in amd.
the lack of faith is disturbing


----------



## Stay Puft

Quote:


> Originally Posted by *flopper*
> 
> I have faith in amd.
> the lack of faith is disturbing


After this we have a reason to be cautious


----------



## szeged

Quote:


> Originally Posted by *wstanci3*
> 
> Now that I know you have all those Titans, I just picture Jim Carrey sitting on a pile of Titans. Lol


haha


----------



## wstanci3

Quote:


> Originally Posted by *Regent Square*
> 
> pffs. The best deal for me=/= the best deal for ya. Got it


I "got it." Wasn't ridiculing your choice, believe it or not. I was just surprised.


----------



## bmgjet

Not looking good for me.
Only have $800 and was hoping to get one.

600 USD = 725.08 NZD
Then, as with most electronics here, it gets taxed hard, so that's going to put it at $833 at least before the greedy shops add their cut on top.
So it looks like it will be 1.1K+ like the 7990 is here.
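That landed-cost math can be sanity-checked in a few lines. The exchange rate is the one implied by the post (600 USD → 725.08 NZD); the 15% NZ GST and the 30% retailer markup are assumptions, not confirmed pricing:

```python
# Sketch of the landed-cost estimate above. The exchange rate is the
# one implied by the post; the GST rate and retailer markup are
# assumptions, not confirmed retail pricing.

USD_TO_NZD = 725.08 / 600.0   # implied rate, ~1.2085
NZ_GST = 0.15                 # New Zealand GST, assumed 15%

def landed_price_nzd(price_usd: float, retailer_markup: float = 0.0) -> float:
    """US price -> NZD, plus GST, plus an optional retail markup."""
    return price_usd * USD_TO_NZD * (1 + NZ_GST) * (1 + retailer_markup)

print(round(landed_price_nzd(600), 2))        # ~833.84, matching "at least $833"
print(round(landed_price_nzd(600, 0.30), 2))  # ~1083.99, near the "1.1K+" guess
```

With a roughly 30% shop markup on top of GST, the "1.1K+ like the 7990" guess is about right.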


----------



## Regent Square

Despite all the hate NV fanboys bring in this thread, I cant wait till October 3rd!


----------



## bencher

Quote:


> Originally Posted by *Regent Square*
> 
> Despite all the hate NV fanboys bring in this thread, I cant wait till October 3rd!


I am so excited


----------



## wermad

"ive got dah powahwahwahwaaaaaaah!"








Sorry, so many ppl pictured holding Hawaii like this, I just couldn't resist anymore


----------



## Stay Puft

Quote:


> Originally Posted by *Regent Square*
> 
> Despite all the hate NV fanboys bring in this thread, I cant wait till October 3rd!


Why? The NDA isnt dropping for weeks


----------



## Durquavian

Quote:


> Originally Posted by *Stay Puft*
> 
> Why? The NDA isnt dropping for weeks


Did they say when it was dropping. I mean an actual date not some vague reference to some future time where any guess would be as accurate as another...


----------



## Stay Puft

Quote:


> Originally Posted by *Durquavian*
> 
> Did they say when it was dropping. I mean an actual date not some vague reference to some future time where any guess would be as accurate as another...


Your guess is as good as mine but no details will be revealed when the preorder starts. I say it'll be released same day as BF4


----------



## wstanci3

Quote:


> Originally Posted by *Durquavian*
> 
> Did they say when it was dropping. I mean an actual date not some vague reference to some future time where any guess would be as accurate as another...


No; according to the people who went to the NDA briefing, AMD didn't even tell them when the NDA will be lifted.


----------



## Durquavian

Gotta admit AMD has been super at keeping quiet, good or bad. Seriously, so much blackout in the last few months. They don't need marketing; just say one thing and the internet lights up and spreads your name faster than fire through a propane plant.


----------



## Ghoxt

Correct me if I'm not reading this right.

No confirmed specs, and the only thing we've heard is that one Mantle-optimized game, when patched at some point in the future, will increase performance enough to beat some immaterial Titan somewhere. Oh, and put down some money now to pre-order on faith.









I'm still looking to make a purchase down the road, but their pre-launch marketing leaves something to be desired. My goal is to have 2 rigs, one green, the other red.


----------



## vs17e

Does anyone know a reason to justify buying a 290x over crossfire 280x's? Power consumption isn't a problem for me


----------



## PhantomTaco

Quote:


> Originally Posted by *vs17e*
> 
> Does anyone know a reason to justify buying a 290x over crossfire 280x's? Power consumption isn't a problem for me


You have to realize no one can justify recommending/buying any of them yet as we still don't know actual performance numbers or prices on the 290x. Anything anyone will recommend to you will be more or less baseless.


----------



## fleetfeather

Quote:


> Originally Posted by *Stay Puft*
> 
> Your guess is as good as mine but no details will be revealed when the preorder starts. I say it'll be released same day as BF4


shut the front door,

taking pre-orders on a card with no specs out? I doubt AMD expects people to hand over >$599 for a 'mystery bag', especially off the back of Bulldozer...
release specs same-day as bf4 launch? I think they'd want owners to 'have' the card before/on the day of the bf4 launch...


----------



## Forceman

Quote:


> Originally Posted by *Stay Puft*
> 
> Your guess is as good as mine but no details will be revealed when the preorder starts. I say it'll be released same day as BF4


They need to release it at least a day or two before BF4 so people can get one shipped to them before BF4 unlocks.

Do the BF4 SE pre-order people get their cards ahead of everyone else, or is that literally just a stickered-BF4-bundled-normal-card I wonder?


----------



## PhantomTaco

Quote:


> Originally Posted by *fleetfeather*
> 
> shut the front door,
> 
> taking pre-orders on a card with no specs out? I doubt AMD expects people to hand over >$599 for a 'mystery bag', especially off the back of Bulldozer...
> release specs same-day as bf4 launch? I think they'd want owners to 'have' the card before/on the day of the bf4 launch...


Just a quick update: it appears at this point that that's exactly what they're expecting people to do. I hope it's not what they actually end up doing.


----------



## Gunderman456

Quote:


> Originally Posted by *fleetfeather*
> 
> shut the front door,
> 
> taking pre-orders on a card with no specs out? I doubt AMD expects people to hand over >$599 for a 'mystery bag', especially off the back of Bulldozer...
> release specs same-day as bf4 launch? I think they'd want owners to 'have' the card before/on the day of the bf4 launch...


I'll also add that pre-ordering without knowing specs or performance numbers would set a new low that the industry will surely abuse. I hope everyone makes an informed purchase.

So with that, and with pre-orders taking place on October 3rd, AMD will, and should, lift the NDA and let review sites report on specs and performance.

Edit: Just read the update below, and AMD is acting like a thief in the night, not even divulging the price. I guess there will be 8,000 willing buyers, but those who buy on impulse will not be doing the rest of us any favours. It will be a trial balloon by AMD that leads to further abuses in the future.


----------



## fleetfeather

Quote:


> Originally Posted by *PhantomTaco*
> 
> Just a quick update. It appears at this point that's what they're expecting people to do, hope that's not what they're planning on doing.


My god, AMD certainly do make it hard to support them...

I mean, I understand the idea of keeping their competitor in the dark so they can't react, but keeping the customer in the dark so they don't know what they're getting into is a complete joke lol

Consumers should always be free to buy in confidence...


----------



## jomama22

Quote:


> Originally Posted by *Forceman*
> 
> They need to release it at least a day or two before BF4 so people can get one shipped to them before BF4 unlocks.
> 
> Do the BF4 SE pre-order people get their cards ahead of everyone else, or is that literally just a stickered-BF4-bundled-normal-card I wonder?


I believe it will be the same price as a normal 290X, just with a BF4 preorder thrown in. Kind of like a little perk for being an early adopter.

I doubt the 290X will come with a game bundle; there doesn't seem to be a need for it at the top tier, honestly.

Also, it seems weird that only one site has picked this up, unless others aren't buying the preorder story.


----------



## TooBAMF

Quote:


> Originally Posted by *fleetfeather*
> 
> My god, AMD certainly do make it hard to support them...
> 
> I mean, I understand the idea behind keeping their competitor in the dark so they can't react, but keep the customer in the dark so they don't know what they're getting into is a complete joke lol


AMD seems to be the way to go if all you care about is BF4 but it will still take 2 months to matter. If people are legitimately buying these cards just for BF4 I say go for it, but I doubt it. BF4 is a nice bonus when you were already planning to get BF4 but not really enough to take this type of gamble.


----------



## lajgnd

i don't get the big deal about pre ordering without releasing specs.

we all know what the ballpark performance is going to be.

expect a card that's roughly 25%-50% faster than a 7970.

May be a bigger deal for nVidia owners looking to jump ship, but for AMD owners looking to upgrade, this is your refresh card and Titan competitor. I don't think it's that much of a mystery as far as performance is concerned.


----------



## TooBAMF

Quote:


> Originally Posted by *lajgnd*
> 
> i don't get the big deal about pre ordering without releasing specs.
> 
> we all know what the ballpark performance is going to be.
> 
> expect a card that's roughly 25%-50% faster than a 7970.
> 
> May be a bigger deal for nVidia owners looking to jump ship, but for AMD owners looking to upgrade, this is your refresh card. I don't think it's that much of a mystery as far as performance is concerned.


True but I feel like a lot of people are on the fence between 780 and 290X with no real ties to either. Super hardcore BF players can probably go for this just like people sticking with AMD but even people who plan to play a lot of BF4 probably plan to play a lot of other things.


----------



## Majin SSJ Eric

It SHOULD be 25-50% faster but that's no guarantee. That's the point. If I'm shelling out $600-$700 for a piece of hardware I should darn well know what it is I'm getting. Sure my Titans were hideously expensive but at least I knew what kind of performance they offered beforehand and could make an informed decision on whether to buy or not...


----------



## coolhandluke41

Quote:


> Originally Posted by *2010rig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wstanci3*
> 
> I'm really expecting a Fixer video on that. I'm calling it right now.
> 
> 
> 
> NVIDIA should make one this time around called "The Promiser"

how about "Almost"...








on the other hand, I hope it performs as promised so I can put this card in my Impact, which is perfect for an AMD card (only one PCI-E slot... how's Crossfire these days?)


----------



## TheLAWNOOB

Pre-ordering a GPU without knowing how the retail version performs is like giving your neighbour's kid $2,000 and trusting him to build a PC for you.


----------



## jomama22

I have a feeling this is being overblown. If AMD did this it would be quite weird, to say the least. Don't be surprised if, when they open pre-orders, they at least give you a price. Performance-wise... who knows.


----------



## Master__Shake

Quote:


> Originally Posted by *bmgjet*
> 
> Not looking good for me.
> Only have $800 and was hoping to get one.
> 
> 600USD = 725.08NZD
> Then as most electronics here it gets taxed hard out so thats going to put it at $833 at least before the greedy shops add there cut on top.
> So looks like it with be 1.1K+ like the 7990 is here.


ncix ships internationally now









www.ncix.ca


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *jomama22*
> 
> I have a feeling this is being overblown. If and did this it would be quite weird to say the least. Don't be surprised *when they announce open Pre orders that they at least give you a price*. Performance wise... Who knows.


For God's sakes I would hope so!


----------



## 2010rig

Quote:


> Originally Posted by *lajgnd*
> 
> i don't get the big deal about pre ordering without releasing specs.
> 
> we all know what the ballpark performance is going to be.
> 
> expect a card that's roughly 25%-50% faster than a 7970.
> 
> May be a bigger deal for nVidia owners looking to jump ship, but for AMD owners looking to upgrade, this is your refresh card and Titan competitor. I don't think it's that much of a mystery as far as performance is concerned.


On the only benchmark AMD has shown it was 17% faster than a 280x.
Quote:


> Originally Posted by *jomama22*
> 
> I have a feeling this is being overblown. If and did this it would be quite weird to say the least. Don't be surprised when they announce open Pre orders that they at least give you a price. Performance wise... Who knows.


We're just going by the rumored info, see for yourself.
http://www.legitreviews.com/amd-radeon-r9-290x-battlefield-4-edition-8000-specs-pre-order_125224

It's from LegitReviews, so it must be legit.


----------



## jomama22

Quote:


> Originally Posted by *2010rig*
> 
> We're just going by the rumored info, see for yourself.
> http://www.legitreviews.com/amd-radeon-r9-290x-battlefield-4-edition-8000-specs-pre-order_125224


I read it. That's what I meant about other sources not picking it up yet. But we shall see.


----------



## lajgnd

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> It SHOULD be 25-50% faster but that's no guarantee. That's the point. If I'm shelling out $600-$700 for a piece of hardware I should darn well know what it is I'm getting. Sure my Titans were hideously expensive but at least I knew what kind of performance they offered beforehand and could make an informed decision on whether to buy or not...


I think it's safe to say it's guaranteed to be at minimum 25% faster than 7970.

37% more stream processors
Roughly 25% more bandwidth
37% more TMU
37% more ROPs

Now compared to Titan it has

4% more stream processors
roughly 12% more bandwidth
22% less TMUs
8% less ROPs

So based on how the 7970 GHz Edition performed against Titan, I'd say this is going to be a card that comes in right below Titan performance in most scenarios and even outperforms it in some. While we don't know the price, based on AMD's current pricing structure and vocal insistence on staying away from $999 single cards (and the 780's $650 price point), it's most likely going to be around $500-$600.

Of course there's also Mantle, and AMD's claim that it will "ridicule" the Titan with Mantle, so expect greater performance in that and other Frostbite 3 games. The million-dollar question, of course, is how much faster it will be. If it were going to be a massive increase, I'm sure AMD would have provided benchmarks. But I'd say it'll probably come in around 20% faster than a Titan; otherwise this big focus on Mantle doesn't seem like it would be worth the effort, IMO.

Regardless, the Titan is about 50% faster than the 7970 GE in Battlefield 3, so if the R290X is indeed faster than the Titan in BF4, it's probably going to be around 50% faster than the 7970 GE there too (assuming they both get similar increases from Mantle).

The real way to look at this scenario isn't about performance. Performance of the R290X can be estimated pretty closely at this point. The real question is whether you think this kind of upgrade is worth the price, which is the same question many people asked when they upgraded from a 680 to 780 (let's forget about Titan here since the 680-780 upgrade path is more akin to the 7970-R290x path).
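For what it's worth, the percentages above check out against the rumored spec sheet circulating in this thread. A minimal sketch; the 290X row uses the 172-TMU / 44-ROP / 512-bit @ 5.0 GHz rumor, not confirmed specs, while the 7970 and Titan rows use the published figures:

```python
# Reproduces the rough deltas in the post from rumored specs.
# The R290X figures are the leaked/rumored numbers from this thread;
# the 7970 and Titan figures are the published ones (bandwidth in GB/s,
# lightly rounded).

specs = {
    #          SPs,  TMUs, ROPs, bandwidth (GB/s)
    "7970":  (2048, 128,  32,  264),
    "Titan": (2688, 224,  48,  288),
    "R290X": (2816, 172,  44,  320),  # 512-bit @ 5.0 GHz rumor
}

def delta_pct(card: str, baseline: str) -> list:
    """Percent difference of card vs baseline, spec by spec."""
    return [round(100 * (a / b - 1), 1)
            for a, b in zip(specs[card], specs[baseline])]

print(delta_pct("R290X", "7970"))   # [37.5, 34.4, 37.5, 21.2]
print(delta_pct("R290X", "Titan"))  # [4.8, -23.2, -8.3, 11.1]
```

Rounded the way the post rounds ("roughly"), both lists line up with the 7970 and Titan comparisons above.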


----------



## GTR Mclaren

Quote:


> Originally Posted by *fleetfeather*
> 
> My god, AMD certainly do make it hard to support them...
> 
> I mean, I understand the idea behind keeping their competitor in the dark so they can't react, but keep the customer in the dark so they don't know what they're getting into is a complete joke lol
> 
> Consumers should always be free to buy in confidence...


The marketing team of AMD (and M$ xD) are just apes, thats the only explanation I can see.


----------



## kot0005

Quote:


> Originally Posted by *Master__Shake*
> 
> ncix ships internationally now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> www.ncix.ca


Whaaa?? since when ?? you go to b joking! If its true I am guna b all over linus.


----------



## xoleras

edit: dp


----------



## kot0005

Pre order deposit without specs and price? now thats low AMD.

http://www.legitreviews.com/amd-radeon-r9-290x-battlefield-4-edition-8000-specs-pre-order_125224


----------



## delavan

Quote:


> Originally Posted by *vs17e*
> 
> Does anyone know a reason to justify buying a 290x over crossfire 280x's? Power consumption isn't a problem for me


Hardware frame pacing on the new chip (290X) vs. a software solution on the R9 280X/rebranded 7970?

I'm throwing that out there; is that correct?


----------



## bencher

Quote:


> Originally Posted by *GTR Mclaren*
> 
> The marketing team of AMD (and M$ xD) are just apes, thats the only explanation I can see.


AMD doesn't need marketing right now. Every news post since June was all about AMD.


----------



## Durquavian

Quote:


> Originally Posted by *bencher*
> 
> AMD doesn't need marketing right now. Every news post since June was all about AMD.


Funny and so completely TRUE.


----------



## fateswarm

Am I the only one that gets angry when people who pretend to be journalists take the time to submit "articles" that consist of nothing but "could" and "what if"?

I hope I am alone, because we would never be friends; we would both be too angry.

But seriously, have the decency to not post them in a "news" thread but in a rumors one, and at least say clearly that they are rumors. Not just "the price changed"; what price changed? You mean "the price that some random dude speculated changed".


----------



## szeged

how the rumors started out

beats titan a little

amd chimes in

now it beats the titan tremendously

couple days later

itll beat titan when used with mantle

day after that

itll be under the titan but it wont cost as much

nda lifted

doesnt beat titan


----------



## tinmann

And priced only $50.00 under a GTX 780. Are there any benches yet? How much faster than a GTX 780 is it, if at all?


----------



## bencher

Quote:


> Originally Posted by *szeged*
> 
> how the rumors started out
> 
> beats titan a little
> 
> amd chimes in
> 
> now it beats the titan tremendously
> 
> couple days later
> 
> itll beat titan when used with mantle
> 
> day after that
> 
> itll be under the titan but it wont cost as much
> 
> nda lifted
> 
> doesnt beat titan


It doesn't have to beat Titan; it has to beat the 780, though. Otherwise it's a massive fail.


----------



## fleetfeather

Quote:


> Originally Posted by *szeged*
> 
> snip


Not sure why you'd be happy about this. Regardless of whether you have an affiliation with either party, this is bad news for enthusiasts. If one party can't (or won't) compete with the other, expect the ~$1000 cards to continue. GLHF buying Titan Ultras, 880s, 890s, etc.


----------



## Roaches

Quote:


> Originally Posted by *szeged*
> 
> how the rumors started out
> 
> beats titan a little
> 
> amd chimes in
> 
> now it beats the titan tremendously
> 
> couple days later
> 
> itll beat titan when used with mantle
> 
> day after that
> 
> itll be under the titan but it wont cost as much
> 
> nda lifted
> 
> doesnt beat titan


Basically it's Bulldozer all over again if it does fail... though I'm sure they've learned their lesson by now. They have no reason to mislead potential customers into buying before the benchmarks surface.


----------



## szeged

Quote:


> Originally Posted by *bencher*
> 
> It doesn't have to beat titan. it however has to beat the 780 though. Or else it is a massive fail.


if it doesnt beat the titan then oh well, itll probably come close, if it doesnt beat the 780 thats kinda dissapointing, but amd could just drop the price by another 50 bucks and save face by saying they never intended to beat it and all the rumors are just that


----------



## fateswarm

Quote:


> Originally Posted by *tinmann*
> 
> And priced only $50.00 under a GTX 780. Are there any benches yet? How much faster than a GTX 780 is it if at all.


Come on. It must be faster: 39% more transistors (it's the only promotional info I take at all seriously; I hope they didn't lie!).


----------



## LaBestiaHumana

Quote:


> Originally Posted by *szeged*
> 
> if it doesnt beat the titan then oh well, itll probably come close, if it doesnt beat the 780 thats kinda dissapointing, but amd could just drop the price by another 50 bucks and save face by saying they never intended to beat it and all the rumors are just that


Titan has been out since February, so why wouldn't AMD just release a card that easily outperforms it? I'm not talking about beating Titan by 2 FPS in Crysis 3 or by 300 points in Firestrike; I'm talking about beating Titan the way Titan beats the 7970: clearly, and by a good amount. Give Titan/780 owners a reason to ditch Nvidia and go AMD. If the 290X comes out and beats Titan by only 2%, I will just laugh!! If it doesn't beat Titan, it had better be dirt cheap, but then again 7970 owners won't have a reason to upgrade either.


----------



## Durquavian

Quote:


> Originally Posted by *fleetfeather*
> 
> Not sure why you'd be happy about this. Regardless of whether you have affiliation with either party or not, this is bad news for enthusiasts. If one party can't (or wont) compete with the other, expect the ~$1000 cards to continue. GLHF buying titan ultra's, 880's, 890's etc.


Keep in mind that, sure, Nvidia could charge a hefty sum, but a die that size isn't cheap to manufacture. The price has more to do with manufacturing costs, then somewhat with limited quantities, and lastly with the fact that they can.


----------



## szeged

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Titan has been out since February, why wouln't AMD just release a card that easily outperforms it? Im not talking about beating Titan by 2 FPS on Crysis 3 or getting 300 more points on Firestrike, I'm talking beat Titan just like Titan beats 7970. Clearly and be a good amount. Give Titan/780 owners a reason to ditch Nvidaia and go AMD. If the 290X comes out and actually beats Titan by 2% I will just laugh!! If it doesn't beat Titan, it better be dirt cheap, then again 7970 owners won't have a reason to upgrade either.


if it beats titan by 1 or two percent then ill probably laugh because i wouldnt call that "ridiculing" the titan lol

if it beat it by 30% then id probably change my bedsheets from the telletubbies i have now to amd







though i doubt this would happen lol.

if it beats titan at all, thatll be enough fire for nvidia to drop prices and release their next overpriced flagship lol


----------



## LaBestiaHumana

Quote:


> Originally Posted by *Durquavian*
> 
> Keep in mind sure Nvidia could charge a hefty sum, but that die size isn't cheap to manufacture. The Price has more to do with the manufacturing of the product, then some for limited quantities, and lastly because they can.


And the card performs quite well. Like, yeah, we paid a premium, but we've had awesome performance since Feb 2013.


----------



## szeged

if i could go back to feb and buy a titan on release day all over again, i still would, even for the price.


----------



## Durquavian

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Titan has been out since February, why wouln't AMD just release a card that easily outperforms it? Im not talking about beating Titan by 2 FPS on Crysis 3 or getting 300 more points on Firestrike, I'm talking beat Titan just like Titan beats 7970. Clearly and be a good amount. Give Titan/780 owners a reason to ditch Nvidaia and go AMD. If the 290X comes out and actually beats Titan by 2% I will just laugh!! If it doesn't beat Titan, it better be dirt cheap, then again 7970 owners won't have a reason to upgrade either.


Die size my friend. May not be worth it to AMD to produce one that size for such a small portion of the market.

ATTENTION: Asked in another thread. Does anyone know if the 5__mm size for Titan is the size of the usable die, or the total including parts that have been rendered unusable?


----------



## LaBestiaHumana

Quote:


> Originally Posted by *szeged*
> 
> if it beats titan by 1 or two percent then ill probably laugh, because i wouldnt call that "ridiculing" the titan lol
>
> if it beat it by 30% then id probably change my bedsheets from the teletubbies i have now to amd
>
> though i doubt this would happen lol.
>
> if it beats titan at all, thatll be enough fire for nvidia to drop prices and release their next overpriced flagship lol


I'd even wear AMD socks and undies if they beat Titan by 30%.

I figured we'd have some performance numbers by now.


----------



## selk22

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> I'd even wear AMD socks and undies if they beat Titan by 30%.
>
> I figured we'd have some performance numbers by now.


If it's over 50% I will tattoo AMD on my ass


----------



## vs17e

Quote:


> Originally Posted by *selk22*
> 
> If it's over 50% I will tattoo AMD on my ass


I feel like that's more insulting than praising; then again, I really don't know your intention


----------



## szeged

Quote:


> Originally Posted by *selk22*
> 
> If it's over 50% I will tattoo AMD on my ass


if it's over 50% ill have amd tattooed onto my ass, have someone photograph it, then have the picture of my ass with amd on it tattooed onto my face.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *selk22*
> 
> If it's over 50% I will tattoo AMD on my ass


Good one, lol

OMG imagine if they beat Titan by 50%, I'd set my Titans on fire on YouTube and make it seem like volcanic lava is melting them.


----------



## selk22

Quote:


> Originally Posted by *szeged*
> 
> if it's over 50% ill have amd tattooed onto my ass, have someone photograph it, then have the picture of my ass with amd on it tattooed onto my face.


hahah! This one got me good lol


----------



## youra6

Quote:


> Originally Posted by *szeged*
> 
> if it's over 50% ill have amd tattooed onto my ass, have someone photograph it, then have the picture of my ass with amd on it tattooed onto my face.


Nvidia announces Maxwell release in Q4 2013... leaked benchmarks show it 85% faster than the GeForce Titan, priced competitively with the R9 290X


----------



## szeged

Quote:


> Originally Posted by *youra6*
> 
> _Nvidia announces Maxwell release in Q4 2013... leaked benchmarks show it 85% faster than the GeForce Titan, priced competitively with the R9 290X_


ill get a tattoo of my face on my ass, have my ass photographed, have the photo of my face on my ass tattooed onto my face. never ending ass face.


----------



## 2010rig

I don't like the direction this thread is taking....


----------



## fateswarm

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Titan has been out since February, so why wouldn't AMD just release a card that easily outperforms it?


*Because 80% of the forum haven't realized the obvious*. The price of the GK110 isn't only demand driven, it is largely supply driven, i.e. it costs more than twice as much at TSMC since it takes more space on the wafer and fails more often. Hence NVIDIA would LOVE to have lower prices and isn't just playing high-nosed. Why do you think they went on the offensive against their own supplier's prices last year, if there wasn't a real reason and it was just a publicity stunt? Both AMD and NVIDIA have weak financials (especially AMD), which indicates the high prices are not a cash cow, and they'd love to have $400 cards that perform like an overclocked Titan with a huge profit left to them.
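The supply-side argument can be sketched with a toy wafer-cost model. Everything here is an illustrative assumption (the $5,000 wafer cost, the 0.4/cm^2 defect density, the simple Poisson yield formula), not actual TSMC figures; the point is only that cost per working die grows much faster than die area:

```python
import math

def cost_per_good_die(die_area_mm2, wafer_cost=5000.0, defects_per_cm2=0.4):
    """Toy estimate of cost per working die from a 300mm wafer.

    Uses the simple Poisson yield model Y = exp(-D * A); all input
    numbers are illustrative assumptions, not foundry figures.
    """
    wafer_area = math.pi * (300.0 / 2) ** 2        # mm^2 of a 300mm wafer
    candidates = 0.85 * wafer_area / die_area_mm2  # ~15% lost at edges/scribe
    yield_frac = math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)
    return wafer_cost / (candidates * yield_frac)

# A ~550mm^2 GK110-class die vs a ~365mm^2 Tahiti-class die:
big, small = cost_per_good_die(550), cost_per_good_die(365)
ratio = big / small  # ~3x the cost for only ~1.5x the area
```

With these assumptions the big die comes out at roughly three times the cost per working chip despite being only ~1.5x the area, which is the "costs more than twice as much" effect in a nutshell.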


----------



## 2010rig

Quote:


> Originally Posted by *fateswarm*
> 
> *Because 80% of the forum haven't realized the obvious*. The price of the GK110 isn't only demand driven, it is largely supply driven, i.e. it costs more than twice as much at TSMC since it takes more space on the wafer and fails more often. Hence NVIDIA would LOVE to have lower prices and isn't just playing high-nosed. Why do you think they went on the offensive against their own supplier's prices last year, if there wasn't a real reason and it was just a publicity stunt? Both AMD and NVIDIA have weak financials (especially AMD), which indicates the high prices are not a cash cow, and they'd love to have $400 cards that perform like an overclocked Titan with a huge profit left to them.


Checks sig rig. NVIDIA has sold big dies like GK110 for as little as $350, and GF100 had some terrible yields initially.


----------



## wermad

Quote:


> Originally Posted by *2010rig*
> 
> I don't like the direction this thread is taking....


This.....


----------



## Master__Shake

Quote:


> Originally Posted by *2010rig*
> 
> Checks sig rig. NVIDIA has sold big dies like GK110 for as little as $350, and GF100 had some terrible yields initially.


except when TSMC was making GF100, nvidia had a deal with them; they didn't have to pay for failed GF100 wafers.

this time around nvidia has to pay for every single failure.


----------



## 2010rig

Quote:


> Originally Posted by *Master__Shake*
> 
> except when TSMC was making GF100, nvidia had a deal with them; they didn't have to pay for failed GF100 wafers.
>
> this time around nvidia has to pay for every single failure.


oh, I did not know that. Learn something new every day.


----------



## Kipsta77

I also wonder what the acoustics are like...


----------



## raghu78

This thread has derailed; get it back on topic. AMD should have been more forthcoming with information. At least if they had said that the NDA expires on Oct 3rd and reviews will be up, users could then decide whether to go for the R9 290X BF4 pre-orders; that would have been fine. They have left too much to idle gossip. Time and again AMD has proved their marketing sucks.


----------



## jomama22

I'm pulling a Cartman, gonna freeze myself in the freezer. I'll call you guys on my time travel prank phone when AMD and Nvidia jointly release the Hercules r275 569xt ti to compete with the 3dfx Voodoo 9500 w/ Glide 13.7.

Hopefully k-9 and cat-10 don't screw this up again....bark bark.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *raghu78*
> 
> This thread has derailed; get it back on topic. AMD should have been more forthcoming with information. At least if they had said that the NDA expires on Oct 3rd and reviews will be up, users could then decide whether to go for the R9 290X BF4 pre-orders; that would have been fine. They have left too much to idle gossip. Time and again AMD has proved their marketing sucks.


Agreed. It's just bad business to offer a pre-order on a product you haven't released any info about...


----------



## szeged

Quote:


> Originally Posted by *raghu78*
> 
> This thread has derailed; get it back on topic. AMD should have been more forthcoming with information. At least if they had said that the NDA expires on Oct 3rd and reviews will be up, users could then decide whether to go for the R9 290X BF4 pre-orders; that would have been fine. They have left too much to idle gossip. Time and again AMD has proved their marketing sucks.


lets just hope the 290x isnt as bad as their marketing team.


----------



## Forceman

Quote:


> Originally Posted by *Durquavian*
> 
> Die size my friend. May not be worth it to AMD to produce one that size for such a small portion of the market.
> 
> ATTENTION: Asked in another thread. Does anyone know if the 5__mm size for Titan is the size of the usable die or the total including parts that have been rendered unusable.


That's the full GK110 die. The used area is effectively smaller with the SMXes disabled. The GTX 780 works out to be about 485mm^2 effective, very roughly measured by me.
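That back-of-the-envelope can be reproduced with two assumptions on top of the commonly cited ~561mm^2 full GK110 die: 15 SMXes total, and a guessed ~60% of the die occupied by SMX area. That fraction is purely illustrative, not a measured floorplan; the uncore (memory controllers, ROPs, display logic) is counted as fully active however many SMXes are fused off:

```python
GK110_FULL_AREA = 561.0  # mm^2, commonly cited full-die figure
TOTAL_SMX = 15

def effective_area(active_smx, smx_area_fraction=0.6):
    """Rough 'in use' area when some SMXes are fused off.

    smx_area_fraction is an assumed share of the die taken by SMXes;
    the remaining uncore is counted as fully active regardless.
    """
    uncore = GK110_FULL_AREA * (1.0 - smx_area_fraction)
    smx = GK110_FULL_AREA * smx_area_fraction * active_smx / TOTAL_SMX
    return uncore + smx

titan = effective_area(14)   # Titan ships with 14 of 15 SMXes -> ~539 mm^2
gtx780 = effective_area(12)  # GTX 780 ships with 12 of 15     -> ~494 mm^2
```

Both land close to the ~540mm^2 and ~485mm^2 figures quoted in this thread, though again the 60% split is only an assumption.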


----------



## jomama22

Quote:


> Originally Posted by *raghu78*
> 
> This thread has derailed; get it back on topic. AMD should have been more forthcoming with information. At least if they had said that the NDA expires on Oct 3rd and reviews will be up, users could then decide whether to go for the R9 290X BF4 pre-orders; that would have been fine. They have left too much to idle gossip. Time and again AMD has proved their marketing sucks.


We don't even know what's accurate or not, so we can't really be all that upset.

Most times, we don't even know a release date for any new generation. We just see a press release and refresh every website over and over until it's in stock. I don't think most people realize we actually know way more about these GPUs than we do about most new generations. To even say they will have pre-orders on the 3rd is more info than we usually have.

If anything, marketing has done its job in making sure the only GPU most have on their minds is the 290X. They put themselves in a position where those wanting a 780/Titan will wait for their new release.


----------



## szeged

i just wish they would go ahead and drop the 7970s price








i want more to play around with lol.


----------



## carlhil2

Quote:


> Originally Posted by *szeged*
> 
> if i could go back to feb and buy a titan on release day all over again, i still would, even for the price.


----------



## selk22

Quote:


> Originally Posted by *2010rig*
> 
> I don't like the direction this thread is taking....


I am sorry I could not resist.


----------



## selk22

On a more on topic note..

Why oh why, AMD, would you have people pre-order something they don't even have a clue about, beyond the empty promises AMD seems excellent at making?


----------



## provost

Quote:


> Originally Posted by *raghu78*
> 
> This thread has derailed; get it back on topic. AMD should have been more forthcoming with information. At least if they had said that the NDA expires on Oct 3rd and reviews will be up, users could then decide whether to go for the R9 290X BF4 pre-orders; that would have been fine. They have left too much to idle gossip. Time and again AMD has proved their marketing sucks.


I wonder whatever gave AMD the idea that they could pull an Xbox One or PS4 type of marketing trick and it would work. I would not be surprised if someone in the AMD bubble has woken up and realized this pre-order crap ain't gonna work just by promising the BF4 bundle and the song and dance they did at the AMD conference.

I would be very surprised if we don't see any real data before the pre-ordering begins. The fastest way to kill a marketing buzz is with a lackluster start, when the pre-orders fall way short of expectations.


----------



## Majin SSJ Eric

We'll just have to see what happens between now and Oct 3. Hopefully AMD at least releases a price by the preorder date. Anybody with a brain should really wait for NDA lift and impartial reviews before preordering these cards IMO...


----------



## Nick7269

Haha, I love the hype, rumors, and stories! They are so much fun!
I can't wait for the real benchmarks and the real updated drivers for a real product that people really own. Until then we are living in an imagination of what we want it to be.


----------



## szeged

Quote:


> Originally Posted by *Nick7269*
> 
> Haha, I love the hype, rumors, and stories! They are so much fun!
> I can't wait for the real benchmarks and the real updated drivers for a real product that people really own. Until then we are living in an imagination of what we want it to be.


i want it to beat the titan by 1000% and instead of blowing out hot air out of the reference cooler i want it to blow out unicorns rainbows and chocolate.


----------



## cookiesowns

Looks like I'm going back to AMD if they implement hardware frame pacing for 2-way and 3-way Crossfire.

Been with Nvidia since GTX 480 days... and recently got a 7970 for a build... Image quality on AMD cards has always been better, especially in video. Everything just looks more natural.


----------



## fleetfeather

I was in the market for that bf4 bundle, but without the specs, I'm out.

Bf4 can be preordered for around $50 last time i checked the 'online deals' section here on OCN. Ill just wait and see what the NDA lift produces, and if I have to drop an extra 50 bucks because I didn't take the early risk, so be it.


----------



## raghu78

Quote:


> Originally Posted by *cookiesowns*
> 
> Looks like I'm going back to AMD if they implement hardware frame pacing for 2-way and 3-way Crossfire.
>
> Been with Nvidia since GTX 480 days... and recently got a 7970 for a build... Image quality on AMD cards has always been better, especially in video. Everything just looks more natural.


Once reviews are up we will know. But it looks like Hawaii is the only chip to have hardware frame pacing for CF. Hawaii Pro might be the ideal value for money at USD 400-450.
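For what "frame pacing" means in practice, here is a minimal software sketch; AMD's hardware metering isn't public, so this shows only the idea, not their implementation. Each rendered frame is held back until at least one target interval after the previously displayed frame, trading a little latency for even frame delivery in AFR CrossFire:

```python
def pace_frames(render_done_ms, target_interval_ms):
    """Return display times for frames that finished at render_done_ms.

    A frame is shown no earlier than one target interval after the
    previously displayed frame, smoothing AFR micro-stutter.
    """
    shown = []
    for t in render_done_ms:
        if not shown:
            shown.append(t)
        else:
            shown.append(max(t, shown[-1] + target_interval_ms))
    return shown

# Two GPUs in AFR finishing frames in 2ms bursts every 16ms:
raw = [0.0, 2.0, 16.0, 18.0, 32.0, 34.0]
paced = pace_frames(raw, 8.0)
# paced == [0.0, 8.0, 16.0, 24.0, 32.0, 40.0]: even 8ms gaps
```

The raw timeline delivers frames in 2ms/14ms alternating gaps (the classic micro-stutter pattern); the paced one delivers a steady 8ms cadence.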


----------



## Forceman

Quote:


> Originally Posted by *raghu78*
> 
> Hawaii Pro might be ideal for value for money at USD 400 - 450


Especially if it is 2560 shaders.


----------



## OwnedINC

Quote:


> Originally Posted by *selk22*
> 
> On a more on topic note..
> 
> Why oh why, AMD, would you have people pre-order something they don't even have a clue about, beyond the empty promises AMD seems excellent at making?


Well... you know its TDP is 260W, you know it's faster than a 7970. Which is more than you can say about most pre-orders.

I mean, I could go on about the dozen other things we know about it, but that'd ruin the fun of everyone crying "we know nothing!!!!!111"


----------



## maarten12100

Quote:


> Originally Posted by *szeged*
> 
> bulldozer


Quote:


> If an application binary currently includes the instructions that are common to AMD's new core architecture
> and to the Intel CPUs (e.g., AVX, SSE3, SSE4.1, SSE4.2, AES-NI), then this code will run well on the AMD
> Opteron™ 6200 CPU, as long as the binary checks only the ISA feature bits in the CPUID. Unfortunately, much
> code generated by the Intel compiler and Intel libraries inserts checks for the CPU Vendor to be "GENUINEINTEL"
> and will thus either fail or execute an inefficient code sequence on AMD processors. Recompile such software.


Bottom line: it isn't as bad as OCN makes you think; rather the opposite, in fact. But considering most here treat Cinebench, which doesn't have the above problem, as a fair comparison, you won't hear that.

If you want to argue about whether Intel still does this: they state it in their own guides, and AMD states it in their compiling guide.
Maybe I should change the vendor ID on my dual E5520s to AuthenticAMD and see what happens running Cinebench
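The dispatch problem the quoted passage describes boils down to two ways of choosing a code path. A toy model (function names are mine, not from any real compiler runtime): correct dispatch keys only on the CPUID feature bit, while the problematic pattern additionally requires the vendor string to be "GenuineIntel":

```python
def dispatch_by_feature(has_avx):
    """What the quoted guide recommends: key only on the CPUID feature bit."""
    return "avx_path" if has_avx else "baseline_path"

def dispatch_by_vendor(vendor, has_avx):
    """The pattern the guide complains about: additionally require the
    vendor string, so an AVX-capable AMD chip falls back to slow code."""
    return "avx_path" if vendor == "GenuineIntel" and has_avx else "baseline_path"

# An AVX-capable Opteron 6200 reports vendor "AuthenticAMD":
assert dispatch_by_feature(True) == "avx_path"                      # fast path
assert dispatch_by_vendor("AuthenticAMD", True) == "baseline_path"  # penalized
```

Changing the vendor ID to AuthenticAMD, as joked about above, is exactly flipping the first argument of the second function.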








Quote:


> Originally Posted by *2010rig*
> 
> NVIDIA should make one this time around called "The Promiser"


GTX Titan and 780 both boost, and if that benchmark had all cards boosting to around 1000MHz, then being faster at stock than the Titan with a way smaller die is quite the achievement.
(Unless you're someone who doesn't need such a GPU for any purpose but running Firestrike all day; well, that isn't their problem, is it?)
Quote:


> Originally Posted by *szeged*
> 
> amd wouldnt smash a titan, they would rather take it home and have a better benching experience


I don't think anybody would smash a Titan, but GK104 cards being smashed? Well, they are bad after all
Quote:


> Originally Posted by *Stay Puft*
> 
> After this we have a reason to be cautious


It performed so badly mostly due to gimped software, but I concur, I was expecting more myself (they actually had a real Bulldozer design from what I've heard, but somehow they didn't get it complete).
Even with the gimped software, AMD, using their new base architecture, is improving in greater steps and will therefore catch up with Intel at Excavator (only if they attain the mentioned gains).
All Intel can do is make minor improvements and shrink the die (in the server space there is at least the advantage that you get extra cores and cache, but not so much for the desktop).

Quote:


> Originally Posted by *Durquavian*
> 
> Gotta admit AMD has been super at keeping quiet, good or bad. Seriously, so much blackout in the last few months. They don't need marketing; just say one thing and the internet lights up and spreads your name faster than fire through a propane plant.


More power to them, right?








Quote:


> Originally Posted by *fleetfeather*
> 
> shut the front door,
> 
> taking pre-orders on a card with no specs out? I doubt AMD expects people to hand over >$599 for a 'mystery bag', especially off the back of Bulldozer...
> release specs same-day as bf4 launch? I think they'd want owners to 'have' the card before/on the day of the bf4 launch...


Nobody serious compares the AMD CPU branch to the GPU branch; we all know AMD GPUs are on par with Nvidia's.

Quote:


> Originally Posted by *Forceman*
> 
> They need to release it at least a day or two before BF4 so people can get one shipped to them before BF4 unlocks.
> 
> Do the BF4 SE pre-order people get their cards ahead of everyone else, or is that literally just a stickered-BF4-bundled-normal-card I wonder?


The pre-orders probably wouldn't get you a direct unlock of BF4 ahead of time; it would be so nice though.


----------



## Forceman

Quote:


> Originally Posted by *OwnedINC*
> 
> Well... you know its TDP is 260W, you know it's faster than a 7970. Which is more than you can say about most pre-orders.


Most pre-orders don't cost on the order of $600 though.


----------



## Majin SSJ Eric

How nostalgic. I gotta admit it's been quite a while since I saw somebody actually trying to defend BD in a serious manner...


----------



## TheLawIX

Wish it was $499, then I'd buy three instead of two.


----------



## maarten12100

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> How nostalgic. I gotta admit it's been quite a while since I saw somebody actually trying to defend BD in a serious manner...


Just saying that it isn't as bad as Cinebench makes it seem, and it will do quite well on normal code.
Sure, even I consider it a flop, since they knew Intel was crippling them and still said they would make "the fastest processor on earth with the best IPC".
But it has potential over the P3 architecture.

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> It SHOULD be 25-50% faster but that's no guarantee. That's the point. If I'm shelling out $600-$700 for a piece of hardware I should darn well know what it is I'm getting. Sure my Titans were hideously expensive but at least I knew what kind of performance they offered beforehand and could make an informed decision on whether to buy or not...


I hope so; I figure they'll deliver. If they can match, rather than just compete on price, with that small die at equal clocks, I would say that is impressive.
Quote:


> Originally Posted by *coolhandluke41*
> 
> how about " almost" ..
> 
> 
> 
> 
> 
> 
> 
> 
> on the other hand I hope it performs as promised so I can insert this card into my Impact, which is perfect for an AMD card (only one PCI-E slot... how's Crossfire these days?)


Rather good, on par with SLI, and it has great scaling all the way up to 4 cards compared to SLI.
Those cards will have hardware-based frame metering, since Raja said it won't be dependent on the new driver









Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Pre-ordering a GPU without knowing how the retail version performs is like giving your neighbour's kid $2000 and trusting him to build a PC for you.


I think they'll let sources release numbers by end of September.

Quote:


> Originally Posted by *delavan*
> 
> Hardware frame pacing on the new chip (290X) vs software solution on the R280/rebranded 7970???
> 
> I'm throwing that out there; is that correct?


Correct, though after the driver drops they should be nearly equal
Quote:


> Originally Posted by *szeged*
> 
> how the rumors started out
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> beats titan a little
> 
> amd chimes in
> 
> now it beats the titan tremendously
> 
> couple days later
> 
> itll beat titan when used with mantle
> 
> day after that
> 
> itll be under the titan but it wont cost as much
> 
> nda lifted
> 
> doesnt beat titan


It isn't competing with Titan, which is a massive die of which ~540mm^2 is in use.
In other words, if it can compete clock for clock with a Titan, then yes, it is pretty much the better chip.

Quote:


> Originally Posted by *Roaches*
> 
> Basically it's Bulldozer all over again if it does fail... Though I'm sure they learned their lesson by now... They have no reason to shill their potential customers into buying them when the benches surface, all over their le happy merchant faces.


AMD's GPU and CPU divisions are nothing alike

Quote:


> Originally Posted by *szeged*
> 
> if it beats titan by 1 or two percent then ill probably laugh because i wouldnt call that "ridiculing" the titan lol
> 
> if it beat it by 30% then id probably change my bedsheets from the telletubbies i have now to amd
> 
> 
> 
> 
> 
> 
> 
> though i doubt this would happen lol.
> 
> if it beats titan at all, thatll be enough fire for nvidia to drop prices and release their next overpriced flagship lol


I figure, according to those leaked benches, it will match Titan boost vs. boost; then with the Mantle patch it should give at least 20% more performance than Titan.

Quote:


> Originally Posted by *Durquavian*
> 
> Die size my friend. May not be worth it to AMD to produce one that size for such a small portion of the market.
> 
> ATTENTION: Asked in another thread. Does anyone know if the 5__mm size for Titan is the size of the usable die or the total including parts that have been rendered unusable.


It is about ~540mm^2 active, where the 780 is about ~480mm^2 active
Quote:


> Originally Posted by *fleetfeather*
> 
> I was in the market for that bf4 bundle, but without the specs, I'm out.
> 
> Bf4 can be preordered for around $50 last time i checked the 'online deals' section here on OCN. Ill just wait and see what the NDA lift produces, and if I have to drop an extra 50 bucks because I didn't take the early risk, so be it.


I figure we'll see reviews before pre-orders


----------



## Majin SSJ Eric

Even if it matches Titan clock for clock, I think it's doubtful that it will clock as high (especially with the voltage options Titan owners have now). Also, let's not get ahead of ourselves with Mantle. As of now (and I really mean as of December) the ONLY game where Mantle will be of any use for AMD in bench tests is BF4...


----------



## fleetfeather

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Even if it matches Titan clock for clock, I think it's doubtful that it will clock as high (especially with the voltage options Titan owners have now). Also, let's not get ahead of ourselves with Mantle. As of now (and I really mean as of December) the ONLY game where Mantle will be of any use for AMD in bench tests is BF4...


I think if the BF4 benches look good, devs would be crazy not to take up the open-source Mantle. Sure, it'll be a change of process, and ways of working will be shaken up a bit while those who work with the APIs get their heads around a new system, but it'll end up meaning they can push the envelope for PC gaming further: more textures, more polys, more assets in general. Anything that results in improvements for the end product has got to be taken as a positive (or at least you'd hope that's the mindset)


----------



## LaBestiaHumana

Quote:


> Originally Posted by *fleetfeather*
> 
> I think if the BF4 benches look good, devs would be crazy not to take up the open-source Mantle. Sure, it'll be a change of process, and ways of working will be shaken up a bit while those who work with the APIs get their heads around a new system, but it'll end up meaning they can push the envelope for PC gaming further: more textures, more polys, more assets in general. Anything that results in improvements for the end product has got to be taken as a positive (or at least you'd hope that's the mindset)


So it will take another GPU generation for mainstream games to use Mantle. Unless someone is a die-hard Battlefield fan, Mantle means nothing for at least another year.


----------



## bigtonyman1138

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> So it will take another GPU generation for mainstream games to use Mantle. Unless someone is a die-hard Battlefield fan, Mantle means nothing for at least another year.


I bet all the next-gen games running the Frostbite 3 engine will make use of Mantle. Why go through all the work of getting the engine to work with Mantle just to use it on one game?


----------



## Forceman

Quote:


> Originally Posted by *bigtonyman1138*
> 
> I bet all the next-gen games running the Frostbite 3 engine will make use of Mantle. Why go through all the work of getting the engine to work with Mantle just to use it on one game?


Depends whether it's something that is automatically enabled with the use of FB3, or if it requires some integration work. I'm sure games started from now on will use it, but ones already in development may be too late.


----------



## fleetfeather

I guess the timeframe for mainstream game implementation depends on how much work is needed to apply the framework to other game engines and other GPU architectures. How flexible is the API for use with engines other than FB3, and how much work is needed to include other GPU variants? These are the main questions.

I take heart in that the API is being used for R9 cards and BF4 support is being added after the R9 launches, suggesting the Mantle architecture does not need to be fully implemented before a GPU ships, but rather can be added (or at least enabled) along the way. It lends hope to the idea that it can be implemented on other chips as the code base is built upon (which, if there's demand for it, should happen quicker due to open sourcing)


----------



## LaBestiaHumana

Quote:


> Originally Posted by *bigtonyman1138*
> 
> I bet all the next gen games running the frostbite 3 engine will make use of mantle. Why go through all the work of getting the engine to work with mantle just to use it on one game?


It will be a while before Army of Two, NFS and Medal of Honor bring next-gen games.

Besides Battlefield, I don't see any new games that will take advantage of Mantle for at least another 8 months. AC4, Batman, CoD Ghosts, or any current-gen title or benchmark will not be able to utilize it. While it may be a good thing, it's more of a next-gen feature.


----------



## 2010rig

Quote:


> Originally Posted by *bencher*
> 
> AMD doesn't need marketing right now. Every news post since June was all about AMD.


Yet, what new hardware has AMD brought to market ever since? The *FX-9590*; the rest has been pure rumors and speculation.

Interestingly enough, it's down to $699 now, or $589 with a Sabertooth, 8GB of RAM, and an H60.








http://www.newegg.com/Product/ComboBundleDetails.aspx?ItemList=Combo.1444538

AMD and price drops go hand in hand, I wonder why.


----------



## Clockster

Quote:


> Originally Posted by *2010rig*
> 
> Yet, what new hardware has AMD brought to market ever since? The *FX-9590*; the rest has been pure rumors and speculation.
>
> Interestingly enough, it's down to $699 now, or $589 with a Sabertooth, 8GB of RAM, and an H60.
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.newegg.com/Product/ComboBundleDetails.aspx?ItemList=Combo.1444538
> 
> AMD and price drops go hand in hand, I wonder why.


Just go through the hardware section here and check the benchmarking forums, and all you see is talk of the new AMD GPUs. It doesn't matter whether it's speculation or rumors; any and all press is good press for AMD. They don't have to bother with advertising and marketing right now, because they are plastered all over the net with what's coming.

I own my own little PC shop in SA. I have a good client base, nothing like the big companies, but since the Mantle news dropped I've gotten 30 back orders for R9 290X BF4 editions.
To put that into perspective, when the Titan launched I had 6 back orders. That's a big difference.

A close friend of mine owns the largest online shop in SA, and he has confirmed the exact same thing: a huge number of back orders, more than we've ever seen with anything from AMD.


----------



## maarten12100

It is indeed big news that it'll compete with such a small die.


----------



## Moragg

Quote:


> Originally Posted by *maarten12100*
> 
> It is indeed big news that it'll compete with such a small die.


If optimisation with Mantle brings the same improvements as optimisation on consoles, the 290X should dominate the Titan in Mantle games, unless Nvidia has something up its sleeve (likely) to get games optimised for its architecture.

As I understand it, Mantle means it's easy to port console-optimised games (so most of them) to GCN on PCs. It is also designed to make porting to DirectX easier, so convincing devs to do an NVIDIA optimisation would be quite hard.

The only thing that has not been addressed: if Mantle allows GPUs to be utilised much more, surely power draw and heat will also increase in line with this? Kind of like Furmark.


----------



## raghu78

Quote:


> Originally Posted by *Clockster*
> 
> Just go through the hardware section here and check the benchmarking forums, and all you see is talk of the new AMD GPUs. It doesn't matter whether it's speculation or rumors; any and all press is good press for AMD. They don't have to bother with advertising and marketing right now, because they are plastered all over the net with what's coming.
>
> I own my own little PC shop in SA. I have a good client base, nothing like the big companies, but since the Mantle news dropped I've gotten 30 back orders for R9 290X BF4 editions.
> To put that into perspective, when the Titan launched I had 6 back orders. That's a big difference.
>
> A close friend of mine owns the largest online shop in SA, and he has confirmed the exact same thing: a huge number of back orders, more than we've ever seen with anything from AMD.


People are underestimating AMD's sweep of next-gen console wins. Mantle is a good indication of what the benefits are for AMD. We can expect other game developers and publishers to jump on the Mantle bandwagon.

https://twitter.com/wadetb

"*We'll have to support #Mantle now; wish it was just #OpenGL extensions though! Maybe someone will lightly wrap D3D or GL around it for us?*"

Wade Brainerd
@wadetb
Principal Technical Director at Activision


----------



## bencher

Quote:


> Originally Posted by *2010rig*
> 
> Yet, what new hardware has AMD brought to market ever since? The *FX-9590*; the rest has been pure rumors and speculation.
>
> Interestingly enough, it's down to $699 now, or $589 with a Sabertooth, 8GB of RAM, and an H60.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.newegg.com/Product/ComboBundleDetails.aspx?ItemList=Combo.1444538
> 
> AMD and price drops go hand in hand, I wonder why.


What does that have to do with anything?


----------



## maarten12100

Quote:


> Originally Posted by *bencher*
> 
> What does that have to do with anything?


Because he thinks people will actually buy an overpriced binned 8320 for things other than benching. 2010rig, hate on enthusiast products much?

There is no good or bad side to either Nvidia (besides Greenlight, since this is OCN) or AMD; grow up


----------



## provost

Quote:


> Originally Posted by *Clockster*
> 
> Just go through the hardware section here and check the benchmarking forums, and all you see is talk of the new AMD GPUs. It doesn't matter whether it's speculation or rumors; any and all press is good press for AMD. They don't have to bother with advertising and marketing right now, because they are plastered all over the net with what's coming.
>
> I own my own little PC shop in SA. I have a good client base, nothing like the big companies, but since the Mantle news dropped I've gotten 30 back orders for R9 290X BF4 editions.
> To put that into perspective, when the Titan launched I had 6 back orders. That's a big difference.
>
> A close friend of mine owns the largest online shop in SA, and he has confirmed the exact same thing: a huge number of back orders, more than we've ever seen with anything from AMD.


Being in the news section is one thing, as it's all about hype, and getting people to buy your products based on blind faith is another.

I saw your exact post on overclockers.uk too, but sorry, I don't know where your shop is or who your friend is. Unless AMD opens its kimono on detailed specs and people have real reviews, this pre-order gimmick ain't going anywhere.


----------



## Moragg

Quote:


> Originally Posted by *bencher*
> 
> Quote:
> 
> 
> 
> Originally Posted by *2010rig*
> 
> Yet, what new hardware has AMD brought to market ever since? *FX-9590*, the rest has been pure rumors and speculation.
> 
> Interestingly enough it's down to $699 now, or $589 with a Sabertooth 8GB RAM, and an H60.
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.newegg.com/Product/ComboBundleDetails.aspx?ItemList=Combo.1444538
> 
> AMD and price drops go hand in hand, I wonder why.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What does that have to do with anything?
Click to expand...

There are still people who believe that price is an indicator of quality.

That said, the 9590 was never worth the cost and still isn't. But if the 290X and Mantle are any indication, AMD looks to be becoming much more aggressive; hopefully that carries over to the CPU division.


----------



## amputate

Finally I get to ditch these goddamned 680s


----------



## xzamples

now i hate myself for picking a gtx 760


----------



## Durquavian

Quote:


> Originally Posted by *raghu78*
> 
> People are underestimating AMD's sweep of next gen console wins. Mantle is a good indication of what the benefits are for AMD. We can expect other game developers and publishers jump on the mantle bandwagon.
> 
> https://twitter.com/wadetb
> 
> "*We'll have to support #Mantle now; wish it was just #OpenGL extensions though! Maybe someone will lightly wrap D3D or GL around it for us?*"
> 
> Wade Brainerd
> @wadetb
> Principal Technical Director at Activision


Now maybe it was just someone speculating, but wasn't it mentioned that AMD was providing some Mantle-to-DX conversion software to ease the transition from console to PC?


----------



## Moragg

Quote:


> Originally Posted by *Durquavian*
> 
> Now maybe it was just someone speculating, but wasn't it mentioned that AMD was giving some Mantle-to-DX conversion software for ease of transition from console to PC.


I heard someone else say (only once) that it would make the port to DX easier.
While this should be possible (to some extent), I think it's wrong. Here's how I understand it.

GCN has a very basic instruction language. Consoles have their own API in which games are programmed. For PC there is an API called Mantle, which is very similar to the console APIs but obviously has some differences. That makes porting to Mantle quite easy.

DirectX takes abstract commands ("draw a line") and turns them into a language GPUs understand. So, theoretically, if you could reverse that process you would end up with a DX version of the game. Obviously you'd have to go through and do some things manually, but that's the only way I can see this making a port to DX easier.

BUT I don't think Mantle is literally the console API. Think of it like American "English" vs British English.
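That gap between an abstract API and an explicit one can be sketched as a toy example. All function names below are made up purely for illustration; the real Mantle and Direct3D interfaces look nothing like this:

```python
# Toy sketch of "abstract command" vs "explicit command" graphics APIs.
# Hypothetical names only; not the real Mantle/D3D interfaces.

def dx_style_draw_line(x0, y0, x1, y1):
    """High-level: one abstract command; the driver figures out the rest."""
    return [("draw_line", x0, y0, x1, y1)]

def mantle_style_draw_line(x0, y0, x1, y1):
    """Low-level: the application spells out the work the driver used to do."""
    return [
        ("bind_pipeline", "line_shader"),
        ("upload_vertices", [(x0, y0), (x1, y1)]),
        ("record_draw", 2),
        ("submit", "gfx_queue"),
    ]

# Going high-level -> low-level is mechanical expansion. Going the other way
# means guessing which cluster of explicit commands *meant* "draw a line",
# which is why reversing the process would still need manual work.
print(len(dx_style_draw_line(0, 0, 1, 1)))      # one abstract command
print(len(mantle_style_draw_line(0, 0, 1, 1)))  # four explicit commands
```

Expanding abstract calls into explicit ones is easy; collapsing explicit command streams back into abstract intent is the hard direction.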


----------



## selk22

Quote:


> Originally Posted by *xzamples*
> 
> now i hate myself for picking a gtx 760


You still made a good choice, especially if you SLI in the future.


----------



## maarten12100

Quote:


> Originally Posted by *Moragg*
> 
> BUT I don't think Mantle is the API for consoles. Think of it like American "English" vs British English.


I concur. I myself lost a point on Mastering Physics because Americans don't know how to spell aluminium; instead they throw half the characters out the window. (They do this with a lot of words, e.g. colour -> color.)


----------



## Regent Square

Everything slower than a GTX 780 is a big NO NO in my book.

Also, I laugh when people say Mantle with BF4 is not a big deal. BF4 is the most anticipated game of the year. Stop posting BS info you know nothing about.


----------



## Regent Square

Quote:


> Originally Posted by *maarten12100*
> 
> I concur I myself lost a point on my Mastering Physics because the Americans don't know how to spell aluminium instead they throw half the characters out the window. (they do this with a lot of things I mean colour -> color)


What's the main difference between British and American English, apart from the accent?


----------



## Moragg

There are lots of differences between American and British English (e.g. the latter is actually correct), but if you're interested there's a whole wiki article on it.

Back on topic: Mantle does appear to be very useful; a single 290X ran BF4 on 3x1080p Eyefinity. Not sure on fps/quality settings, but if it provides even a 50% boost, Nvidia's flagships will take a while to catch up. Of course, most games (or at least the very graphics-intensive ones) have to adopt Mantle first. And if the 290X can go toe-to-toe with a Titan without Mantle, then I can't see myself recommending Nvidia for a while.

'tis going to be an interesting few months in the tech world.


----------



## Yeroon

Quote:


> Originally Posted by *maarten12100*
> 
> I concur I myself lost a point on my Mastering Physics because the Americans don't know how to spell aluminium instead they throw half the characters out the window. (they do this with a lot of things I mean colour -> color)


Lol, saying the other spelling is wrong is laughable. They are both correct.
The British have actually made an attempt to use as little of the 'American' spelling as possible, along with changing their own pronunciation (going from rhotic to non-rhotic [hard vs soft r's]), after the Americans separated.

http://www.livescience.com/33652-americans-brits-accents.html

http://www.livescience.com/33844-british-american-word-spelling.html


----------



## maarten12100

Quote:


> Originally Posted by *Regent Square*
> 
> Main difference between British and American English is?? Except the British accent.


Removing characters from many words, and writing words together that should have been separate.
No need to turn this into a discussion of why American English seems less polite.


----------



## Baghi

Jeez! Car discussion is more interesting than this.


----------



## Stay Puft

Quote:


> Originally Posted by *Regent Square*
> 
> Main difference between British and American English is?? Except the British accent.


Americans don't get that cool accent they have across the pond


----------



## darkstar585

Quote:


> Originally Posted by *Stay Puft*
> 
> Americans don't get that cool accent they have across the pond


It's not as cool as you think in places... Cue example below.


----------



## Yeroon

Quote:


> Originally Posted by *darkstar585*
> 
> Its not as cool as you think in places.... Cue example below.


That man is awesome! Nothing you can say about his accent will take that away. (Provided that is Guy, who races the sportbikes and rides with Peaty on the pedal bikes?)

Yes, it is.


----------



## Stay Puft

Girls with an Aussie accent as well mmmmmmmm


----------



## GTR Mclaren

Quote:


> Originally Posted by *Stay Puft*
> 
> Girls with an Aussie accent as well mmmmmmmm


OMG she is gorgeous


----------



## bencher

There are a lot of differences between English and American English (bootleg English)


----------



## Mr357

So.... how about that 290X? Pretty cool huh?











Spoiler: Warning: Spoiler!


----------



## maarten12100

Quote:


> Originally Posted by *Mr357*
> 
> So.... how about that 290X? Pretty cool huh?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Is that you in the picture?
Anyhow, yes, it is very cool. Can't wait for non-reference models with more DP ports and more RAM.


----------



## GTR Mclaren

Quote:


> Originally Posted by *Mr357*
> 
> So.... how about that 290X? Pretty cool huh?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


We don't know anything about the card... at least no more than they showed in that horrible presentation.


----------



## Majin SSJ Eric

Lol, Porkins!


----------



## Moragg

Quote:


> Originally Posted by *Mr357*
> 
> So.... how about that 290X? Pretty cool huh?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


On topic - i.e. speculate about a card with no specs, on a new architecture, that gets a massive performance boost from a new API we've never seen before, and compare it with its closest competitor - a card which Nvidia has yet to reveal or release any specs about.

Oh, and I do hope the new shroud is alumini*u*m. I really like the colo*u*r


----------



## Clockster

Quote:


> Originally Posted by *provost*
> 
> Being in the news section is one thing, as its all about hype
> 
> 
> 
> 
> 
> 
> 
> and getting people to buy you products based on blind faith is another.
> 
> I saw your exact post on overclockers.uk too, but sorry, don't know where your shop is and who your friend is. Unless AMD opens its kimono on detailed specs and people have real reviews, this pre-order gimmick ain't going anywhere.


I am not a member on overclockers.uk.

Also, my PC shop is based in SA as in South Africa. People saw how well BF4 ran at that massive res on one card and how smooth it was.
That's pretty impressive. At the end of the day I'll probably only get my hands on about 10-12 of these cards anyway, seeing as there is such a limited number of the BF4 edition.

I am really excited to see final specs and pricing just like everyone else, but we can't force them into giving us what we want; we'll just have to wait and see.


----------



## provost

I could swear I saw this exact post on another forum... lol, maybe [H]ard Forum, I can't recall. Thought it was Overclockers UK... anyway.

8,000 is the limited edition, if the rumor is true? If so, not sure what an unlimited run looks like - 80,000?
Anyway, AMD will be lucky to sell even 800 if they ask people to lock in via a deposit or any other kind of commitment for pre-order.

I have seen a lot of things look awesome in a sales pitch, but we will only know how awesome something is once we have some independent reviews or real-world results.
I give AMD credit for a good job hyping everyone, but I am always skeptical when I see this much hype and nothing to back it up.


----------



## Clockster

Quote:


> Originally Posted by *provost*
> 
> i could swear I saw this exact post on another forum..lol may be hard forum, i can't recall. thought it was over clocker uk..anyway
> 
> 8,000 is limited edition, if he rumor is true? if so, not sure what unlimited run looks like, 80,000
> anyways, AMD will be lucky to sell even 800, if they ask for people to lock in via a deposit or any other kind of commitment for pre-order
> 
> I have seen a lot of things look awesome in a sales pitch, but we will only know how awesome something is once we have some independent reviews or real world results.
> I give AMD credit for a good job hyping everyone, but I am always skeptical when I see this much hype and nothing to back it up.


Well, I do know a little which I am unfortunately not allowed to share, but I haven't heard of this deposit crap from any of my suppliers.
I also doubt AMD would pull something like that; they are just getting back on their feet, def not the smartest thing to do.

On a related note, I have had guys willing to pay up to R7500, which is roughly $750. That is taking into account how weak the Rand is compared to the dollar.
Oh, and I am not on that forum either









Oh and locally a Titan retails for around $1300 xD lol


----------



## mcg75

Quote:


> Originally Posted by *provost*
> 
> i could swear I saw this exact post on another forum..lol may be hard forum, i can't recall. thought it was over clocker uk..anyway
> 
> 8,000 is limited edition, if he rumor is true? if so, not sure what unlimited run looks like, 80,000
> anyways, AMD will be lucky to sell even 800, if they ask for people to lock in via a deposit or any other kind of commitment for pre-order
> 
> I have seen a lot of things look awesome in a sales pitch, but we will only know how awesome something is once we have some independent reviews or real world results.
> I give AMD credit for a good job hyping everyone, but I am always skeptical when I see this much hype and nothing to back it up.


You aren't the only one. I recall somebody posting the same thing about owning a store except I'm sure he said somewhere in California. San Diego perhaps.


----------



## provost

Quote:


> Originally Posted by *mcg75*
> 
> You aren't the only one. I recall somebody posting the same thing about owning a store except I'm sure he said somewhere in California. San Diego perhaps.


Yeah, the more I read about this 290X, the more it's looking like a bunch of BS. Oh well, it was amusing to read, I guess. But a waste of time.


----------



## Artikbot

Quote:


> Originally Posted by *provost*
> 
> Yeah, the more I am reading about this 290x, the more its looking like a bunch of bs. Oh well, it was amusing to read, I guess. But a waste of time.


I learnt the hype lesson the hard way with Zambezi.

So no expectations have been let down with the R9 290X, nor did I have any strong hype to begin with.

This way, any positive news is good news, and never a letdown. Small tips for living a happy life


----------



## Regent Square

How can the 290X be a BS card if it trades blows with a Titan and is faster than the 780 in some games?!


----------



## L36

Quote:


> Originally Posted by *Regent Square*
> 
> how can 290x be a bs card if it trades blows with a titan and is faster than 780 in some games?!


I'd like to see some valid data from a reputable source that confirms that statement.


----------



## Stay Puft

Quote:


> Originally Posted by *L36*
> 
> Id like to see some valid data from reputable source that confirms that statement.


Not going to happen till the NDA drops but the leaked benchmarks are not fakes. They're from a respected overclocker who a lot of us know very well.


----------



## Blackops_2

Quote:


> Originally Posted by *Stay Puft*
> 
> Not going to happen till the NDA drops but the leaked benchmarks are not fakes. They're from a respected overclocker who a lot of us know very well.


Which leaks? Point me in that direction please?


----------



## flopper

Quote:


> Originally Posted by *Regent Square*
> 
> how can 290x be a bs card if it trades blows with a titan and is faster than 780 in some games?!


single card eyefinity also
http://www.dsogaming.com/news/battlefield-4-demo-was-running-on-a-single-r9-290x-at-amds-event-at-5760x1080/


----------



## Regent Square

Quote:


> Originally Posted by *flopper*
> 
> single card eyefinity also
> http://www.dsogaming.com/news/battlefield-4-demo-was-running-on-a-single-r9-290x-at-amds-event-at-5760x1080/


could not get any better than that.


----------



## bencher

Quote:


> Originally Posted by *Artikbot*
> 
> I learnt the hype lesson the hard way with Zambezi.
> 
> So no expectations have been let down with the R9-290X, nor I did have any strong hype to begin with.
> 
> This way, any positive news are good news, and never let downs. Small tips to live a happy life


If this card lets me down I would have no choice but to go nvidia.


----------



## Regent Square

Quote:


> Originally Posted by *Blackops_2*
> 
> Which leaks? Point me in that direction please?


No, no! Your Titan is still king, no need to worry about a slummy 290.


----------



## AlphaC

Quote:


> Originally Posted by *Artikbot*
> 
> I learnt the hype lesson the hard way with Zambezi.
> 
> So no expectations have been let down with the R9-290X, nor I did have any strong hype to begin with.
> 
> This way, any positive news are good news, and never let downs. Small tips to live a happy life


Yeah, it's OK to be excited, but until release benches hit the net with stock speeds, max OCs and max stable OCs, I'd say be wary of AMD's claims.


----------



## Mr357

Quote:


> Originally Posted by *maarten12100*
> 
> Is that you in the picture?


What would compel you to make such a terrible and rude joke?


----------



## Blackops_2

Quote:


> Originally Posted by *Regent Square*
> 
> no, no! Your titan is still a king, no need to worry about a slummy 290,


What are you talking about?


----------



## Regent Square

Quote:


> Originally Posted by *Mr357*
> 
> What would compel you to make such a terrible and rude joke?


Dude, are you the guy with the "Gaben" picture who was visiting 780 pre-release threads back in the day?


----------



## provost

I think I am starting to get excited again. The 290X would blow away a Titan, a 780, and maybe Maxwell too.


----------



## 2010rig

Quote:


> Originally Posted by *provost*
> 
> I think I am starting to get excited again. 290x would blow a Titan, 780, and may be Maxwell too.


And Volta.


----------



## Regent Square

Quote:


> Originally Posted by *2010rig*
> 
> And Volta.


Maxwell is not looking promising imho.


----------



## Mr357

Quote:


> Originally Posted by *Regent Square*
> 
> Dude are you a guy with "Gaben: picture who was visiting 780 pre release threads, back in days?


No, that wasn't me. I wasn't too big into the 780.

EDIT: And I didn't start using this avatar until a few days ago.


----------



## Regent Square

Quote:


> Originally Posted by *Mr357*
> 
> No, that wasn't me. I wasn't too big into the 780.


Got ya! Thanks.


----------



## 2010rig

Quote:


> Originally Posted by *Regent Square*
> 
> Maxwell is not looking promising imho.


Why not?


----------



## Regent Square

Quote:


> Originally Posted by *2010rig*
> 
> Why not?


long story short..


----------



## maarten12100

Quote:


> Originally Posted by *flopper*
> 
> single card eyefinity also
> http://www.dsogaming.com/news/battlefield-4-demo-was-running-on-a-single-r9-290x-at-amds-event-at-5760x1080/


To the Millennium ***con!

Sounds like a hell of a card. "Perfect for 4K" sounded like marketing, but it seems to be really, really good; might be due to the abundance of bandwidth, right?


----------



## maarten12100

Quote:


> Originally Posted by *Mr357*
> 
> What would compel you to make such a terrible and rude joke?


Your avatar looks like the guy in the picture.


----------



## maarten12100

Quote:


> Originally Posted by *2010rig*
> 
> And Volta.


That is years away


----------



## Mr357

Quote:


> Originally Posted by *maarten12100*
> 
> Your avatar looks like the guy in the picture.


Well my avatar is in no way, shape, or form me. It's Gabe Newell, the CEO of Valve.


----------



## PhantomTaco

Quote:


> Originally Posted by *maarten12100*
> 
> To the millennium ***con!
> 
> Sound like a hell of a card "perfect for 4K" sounded like marketing but it seems to be really really good might be due to the abundance of bandwidth right?


Tell me if I'm interpreting this wrong, but the memory bandwidth is 300 GB/s, while the Titan's is 288. I don't think that extra 12 GB/s would be enough to make that kind of difference, do you?
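For what it's worth, peak GDDR5 bandwidth is just bus width times effective transfer rate, so the gap can be checked with a quick back-of-the-envelope calculation. The 290X numbers below are the rumored 512-bit / 5.0 GT/s figures from the spec list earlier in the thread, so treat them as assumptions until reviews land:

```python
# Peak GDDR5 bandwidth in GB/s: (bus width in bits / 8 bits per byte) * GT/s
def peak_bandwidth_gbs(bus_width_bits, effective_rate_gtps):
    return bus_width_bits / 8 * effective_rate_gtps

titan = peak_bandwidth_gbs(384, 6.0)  # Titan: 384-bit @ 6.0 GT/s
r290x = peak_bandwidth_gbs(512, 5.0)  # 290X (rumored): 512-bit @ 5.0 GT/s

print(titan)                                # 288.0
print(r290x)                                # 320.0
print(round((r290x / titan - 1) * 100, 1))  # 11.1 (% advantage)
```

If those rumored numbers hold, the gap is closer to 32 GB/s (~11%) than 12 GB/s, though bandwidth alone still wouldn't explain a large performance difference.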


----------



## davio

long story short? Do go on, I'm interested, I thought it might be a good series!


----------



## Forceman

Quote:


> Originally Posted by *Regent Square*
> 
> long story short..


That was short alright.


----------



## Mombasa69

The R9 290X does look like a damn fine card, but I like the R9 280X as well. My 3-way 570 SLI setup is still managing to run every game at max settings, even though they're nearly 3 years old; maybe I should wait.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mombasa69*
> 
> R9 290X does look like a damn fine card, but I like the R9 280X as well. My 3 way 570 SLI setup are still managing to run every game in max settings, even though they're nearly 3 years old, maybe I should wait.


You are probably playing 1080p. 1.25GB is not going to be enough for BF4 even @ 1080p.


----------



## 2010rig

Quote:


> Originally Posted by *Regent Square*
> 
> long story short..


Now that makes perfect sense, thanks for the explanation.


----------



## jomama22

Quote:


> Originally Posted by *2010rig*
> 
> Now that makes perfect sense, thanks for the explanation.


I dunno, I don't really agree with the whole ".." business; it just doesn't seem like the optimizations we were looking for.


----------



## 2010rig

Quote:


> Originally Posted by *jomama22*
> 
> I dunno, i dont really agree with the whole ".." business, just doesnt seem like the optimizations we were looking for.


Well, obviously you're incapable of reading between the lines dots, Regent Square delivers informative info as usual.


----------



## coolhandluke41

Quote:


> Originally Posted by *provost*
> 
> I think I am starting to get excited again. 290x would blow a Titan, 780, and may be Maxwell too.


Quote:


> Originally Posted by *2010rig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *provost*
> 
> I think I am starting to get excited again. 290x would blow a Titan, 780, and may be Maxwell too.
> 
> 
> 
> And Volta.
Click to expand...

...and whatever is there after Volta. The 290X is the greatest, best-est card of all time! You guys make my day; there is not one legit review, and all that talk.

Keep it up, it's very entertaining.


----------



## ZealotKi11er

I am pretty sure everyone wants it to be faster than Titan and cheaper than the GTX 780. Those that don't should really think more before buying a $1K or $650 GPU. If you spend a lot of money on a PC hoping nothing will beat it for the longest possible time, you are doing it wrong. With the price of the HD 7970, you know the GTX 780 is very overpriced.


----------



## szeged

Anyone who doesn't want it to be priced at the 780's level and beat the Titan has some backwards-as-all-hell thinking.


----------



## Durquavian

Quote:


> Originally Posted by *szeged*
> 
> anyone who doesnt want it to be priced at the 780s level and beat the titan has some backwards as all hell thinking.


Can't blame 'em for wanting it. Whether it is sound business or not probably wasn't in their conclusion.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *maarten12100*
> 
> Is that you in the picture?
> Anyhow yes it is very cool can't wait for non reference models with more DP ports and more ram.


LMAO,


----------



## LaBestiaHumana

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am pretty sure everyone want it to be faster then Titan & Cheaper then GTX780. Those that don't should really think more before buying 1K @ $650 GPU. If you spend a lot of money in PC hopping nothing will beat it for longest possible time you are doing it wrong. With the price of HD 7970 you know GTX780 is very overpriced.


Not really; gameplay is a lot smoother on the 780 than it is on the 7970. It's not just about x amount of fps.


----------



## KaiserFrederick

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Not really, game play is a lot smoother on 780 than it is on 7970. Is not just about x amount of fps.


Really? In XFire/SLI sure, but I thought smoothness was equal for single card configurations.
Really want some hard info about the specs of the R9 290X, I have the money for a 780 but am trying to hold off until I can make a more informed decision...


----------



## LaBestiaHumana

Quote:


> Originally Posted by *KaiserFrederick*
> 
> Really? In XFire/SLI sure, but I thought smoothness was equal for single card configurations.
> Really want some hard info about the specs of the R9 290X, I have the money for a 780 but am trying to hold off until I can make a more informed decision...


On my game capture pc, I had a 660ti installed, then replaced it with a 7870, and there is noticeable stuttering on the 7870. Gets worse with crossfire, luckily I don't game on that card. Funny thing is my friend who owned the card never complained about stuttering issues. Now that he has the 660ti, he noticed games to be much smoother.


----------



## KaiserFrederick

Quote:


> Originally Posted by *bencher*
> 
> Even though my 7870 plays smooth as butter.
> 
> Cool story bro.


Yeah, I went from a 670 FTW to a 7970, couldn't discern any difference, but that's just me.


----------



## Ukkooh

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> On my game capture pc, I had a 660ti installed, then replaced it with a 7870, and there is noticeable stuttering on the 7870. Gets worse with crossfire, luckily I don't game on that card. Funny thing is my friend who owned the card never complained about stuttering issues. Now that he has the 660ti, he noticed games to be much smoother.


TBH this sounds like a faulty card.


----------



## fateswarm

Quote:


> Originally Posted by *Master__Shake*
> 
> except when TSMC was making gf100 nvidia had a deal with tsmc, they didn't have to pay for failed gf100 wafers.
> 
> this time around nvidia has to pay for every single failure.


Let me speculate something. I rarely do it, but let me do it. Doesn't it make sense that, since TSMC did not accept the same agreement of paying for failed chips, the failed chips will be a mountain? Not only would NVIDIA have a much higher cost for GK110 because of the larger die size and failure rate (hence the consistently high price), but we'd also have an indication that the failures are *a lot*.
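The economics behind that speculation are simple to sketch: if the foundry no longer eats the cost of failed dies, the effective cost per good die rises as yield falls. Every number below is made up purely for illustration; no one outside TSMC/NVIDIA knows the real figures:

```python
# Toy wafer economics: when the customer pays for the whole wafer regardless
# of yield, cost per *good* die scales with 1/yield. All numbers are made up.
def cost_per_good_die(wafer_cost, dies_per_wafer, yield_rate):
    return wafer_cost / (dies_per_wafer * yield_rate)

print(cost_per_good_die(5000, 100, 0.8))  # 62.5  (healthy yield)
print(cost_per_good_die(5000, 100, 0.4))  # 125.0 (poor yield: cost doubles)
```

Halving the yield doubles the per-chip cost, which is one plausible (but unconfirmed) reading of why a big die like GK110 stays expensive.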


----------



## Regent Square

Quote:


> Originally Posted by *2010rig*
> 
> Well, obviously you're incapable of reading between the lines dots, *Regent Square delivers informative info as usual.*


*cough* 780 launch (its existence); R9 launch *cough*. No one believed. When the time comes, you'll see.

People hate on deliberate info as usual


----------



## 2010rig

Quote:


> Originally Posted by *Regent Square*
> 
> cough* 780 launch(it`s existence); r9 launch* cough, no 1 believed. When the time comes, you`ll see.
> 
> People hate on a deliberate info as usual


*cough*
Quote:


> Originally Posted by *2010rig*
> 
> Why not?
> Quote:
> 
> 
> 
> Originally Posted by *Regent Square*
> 
> long story short..
Click to expand...

*cough*


----------



## wermad

Quote:


> Originally Posted by *KaiserFrederick*
> 
> Yeah, I went from a 670 FTW to a 7970, couldn't discern any difference, but that's just me.


Briefly had a 670 4GB and also had a 7970 on hand. For a single screen the AMD was better, but on triple monitors the Kepler was more consistent. Both suck for triple monitors as they don't have enough grunt. This was before the frame pacing fix. I ended up selling both. Quad Keplers have sucky scaling, and the Tahitis had massive amounts of screen tearing (pre MST hub).


----------



## Regent Square

Quote:


> Originally Posted by *2010rig*
> 
> *cough*
> *cough*


yep


----------



## bencher

Quote:


> Originally Posted by *wermad*
> 
> Briefly had a 670 4gb and also had a 7970 on hand. For a Single screen, the amd was better but in triple monitors, the kepler was more consistent. Both suck for triple monitors as they don't have enough grunt. This was before the pacing the fix. I ended selling both in the end. Quad Keplers have sucky scaling and the tahitis had massive amounts of screen tearing (pre MST hub).


I have 3x 23" screens on my 7870. Was a breeze to set up - just plug and play.

I do not game on 3 screens though; I use them for efficiency while programming.

No minimizing ftw.


----------



## wermad

Quote:


> Originally Posted by *bencher*
> 
> I have 3x 23" screens on my 7870. Was a breeze to setup. Just plug and play.
> 
> I do not game on 3 screens though. I use them efficiency while programming.
> 
> No minimizing ftw.


Zotac sent me a 670 4GB since they couldn't replace my 580 3GB. Had some fun playing a bit with both. I ended up picking up a pair of 690s for a killer price.

Throw in another 7870 for Eyefinity, just don't go crazy with the settings.


----------



## bencher

Quote:


> Originally Posted by *wermad*
> 
> Zotac sent me a 670 4gb since they couldn't replace my 580 3gb. Had some fun playing a bit with both.I ended up picking up a pair of 690s for a killer price.
> 
> Throw in another 7870 for Eyefinity, just don't go crazy with the settings.


I have been thinking about getting another 7870.

However, I cannot live with having 2GB of VRAM, so I plan to go 280X just for the 3GB of RAM.


----------



## Stay Puft

Quote:


> Originally Posted by *bencher*
> 
> I have been thinking about getting another 7870.
> 
> However I cannot live with having 2GB vram. So i plan to go 280x just for 3GB ram.


I could play BF3 with a 512MB video card at 1920x1080. Why are all of you so scared of 2GB? It's plenty.


----------



## bencher

Quote:


> Originally Posted by *Stay Puft*
> 
> I could play bf3 with a 512mb video card at 1920x1080. Why are all of you so scared of 2gb? It's plenty


When I play BF3, Afterburner shows that I am using 1.5GB of VRAM.

2GB feels outdated now. I've been on 2GB of VRAM since the 6970.


----------



## wermad

Quote:


> Originally Posted by *bencher*
> 
> I have been thinking about getting another 7870.
> 
> However I cannot live with having 2GB vram. So i plan to go 280x just for 3GB ram.


AA is what hampers VRAM; in most of the newer games I had to tone down the AA with the 690s due to the 2GB. Isn't the 280X a 7970? We should see prices fall more for Tahiti after the Hawaii launch; I've seen them go for ~$250 used. Even when you have a lot of VRAM, you still need a lot of horsepower to utilize and justify it. That's why we still recommend 2GB for Surround on entry-level systems.
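A rough sketch of why AA eats VRAM: with super/multisampled render targets, the color buffer alone scales linearly with the sample count. The numbers below ignore depth/stencil buffers, textures, compression and driver overhead, so they understate real usage considerably:

```python
# Approximate color-buffer size in MB: width * height * bytes/pixel * samples
def color_buffer_mb(width, height, bytes_per_pixel=4, samples=1):
    return width * height * bytes_per_pixel * samples / (1024 ** 2)

print(round(color_buffer_mb(1920, 1080), 1))             # 7.9  (1080p, no AA)
print(round(color_buffer_mb(1920, 1080, samples=4), 1))  # 31.6 (1080p, 4x)
print(round(color_buffer_mb(5760, 1080, samples=4), 1))  # 94.9 (triple-wide, 4x)
```

The per-buffer numbers look small, but games keep many such buffers alive at once, which is how high AA at Surround/Eyefinity resolutions blows past a 2GB card.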


----------



## bencher

Quote:


> Originally Posted by *wermad*
> 
> Aa is what hampers vram. Most of the newer games I had to tone down aa with the 690s due to the 2gb. Isn't the 280x a 7970? Should see prices fall more for tahiti after hawaii launch. I've seen them go for ~$250 used. even when you have a lot of vram, you still need a lot hp to utilize and justify it. Thats why we still recommend 2gb for Surround for entry level systems.


If they both have the same TDP I will definitely get a 7970 instead of a 280X.


----------



## wermad

Quote:


> Originally Posted by *bencher*
> 
> If they both have same tdp I will definitely get a 7970 instead of 280x.


Thought I saw the TDP was 250W, but I can't find the article. Interesting how the AnandTech site says 300W. MSRP $299, not bad.


----------



## 2010rig

Quote:


> Originally Posted by *Regent Square*
> 
> yep


btw - I've been curious to see your Tri-Fire set up, you said you were posting pics yesterday, did you get around to it?


----------



## Warfox101

I have been saving $30 a month since I got my two 5870s on launch day. I have not found a card that I could love more than my 5870s. However, I'm getting a warm fuzzy feeling about this card.


----------



## th3illusiveman

Quote:


> Originally Posted by *selk22*
> 
> Yes I am in the same boat. I am at 1920x1200 and the 580 has simply OC'd enough to fit my needs or else the stock setting were enough to butter most games right up. I have only seen a couple options as a reasonable upgrade..
> 
> 1. 780 expensive but beastly
> 
> 2. 760 SLI really good performance for a good price
> 
> 3. 290x which is really catching my eye as an option lately. I am waiting for real world benchmarks.


y-you could get a 7990 for 699 ya know... Superior performance to all those options while being $50 less than a 780.


----------



## maarten12100

Quote:


> Originally Posted by *PhantomTaco*
> 
> Tell me if i'm interpreting this wrong but the memory bandwidth is 300gb/s, while the titan is 288. I don't think that extra 12 gb/s would be enough to make that kind of difference do you?


3x 1080p from a single card at a glorious-looking 60 fps is 3/4 of the pixels of a 4K screen (a little more if you account for the extra blanking).
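For what it's worth, those bandwidth figures fall straight out of bus width times effective memory clock. A quick sketch using the numbers quoted in this thread (the Hawaii clocks are still rumors at this point):

```python
def gddr5_bandwidth_gbs(bus_width_bits, effective_clock_gtps):
    """Peak theoretical bandwidth in GB/s.

    GDDR5 marketing quotes the *effective* data rate ("5.00 GHz" means
    5 gigatransfers/s), so peak bandwidth is simply bytes per transfer
    times transfers per second.
    """
    return (bus_width_bits / 8) * effective_clock_gtps

hawaii = gddr5_bandwidth_gbs(512, 5.0)    # rumored R9 290X: 320.0 GB/s
titan = gddr5_bandwidth_gbs(384, 6.008)   # GTX Titan: ~288.4 GB/s
```

So the ">300 GB/s" in the leaked specs is consistent with a 512-bit bus at 5 GT/s, and at those full rumored clocks the gap over Titan would be closer to 30 GB/s than 12.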


----------



## maarten12100

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You are probably playing 1080p. 1.25GB is not going to be enough for BF4 even @ 1080p.


I ran a single 570 1.25GB at 3840x2160. Granted, I hit the memory limit a couple of times, but staying away from AA I was getting very playable framerates.
Quote:


> Originally Posted by *Stay Puft*
> 
> I could play bf3 with a 512mb video card at 1920x1080. Why are all of you so scared of 2gb? It's plenty


Exactly this
Quote:


> Originally Posted by *bencher*
> 
> When I play BF# afterburner shows that I am using 1.5GB vram.
> 
> 2GB feels outdated now. Been on 2GB vram since 6970.


Running @ 4K UHD I max out the little RAM I have, but 3GB would suffice. Still, I'm going with an 8GB aftermarket card with more DisplayPorts.
Quote:


> Originally Posted by *wermad*
> 
> Aa is what hampers vram. Most of the newer games I had to tone down aa with the 690s due to the 2gb. Isn't the 280x a 7970? Should see prices fall more for tahiti after hawaii launch. I've seen them go for ~$250 used. even when you have a lot of vram, you still need a lot hp to utilize and justify it. Thats why we still recommend 2gb for Surround for entry level systems.


AA makes your effective resolution so much bigger that you are actually rendering huge amounts of pixels and then downsampling.
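That point, that AA inflates the number of pixels actually rendered, is easy to put numbers on. A back-of-the-envelope sketch, treating supersampling as a straight pixel multiplier and assuming 8 bytes per pixel for a color-plus-depth target (a simplification; real drivers allocate more):

```python
def ssaa_buffer_mb(width, height, aa_factor, bytes_per_pixel=8):
    """Rough render-target size in MiB under supersampling.

    aa_factor multiplies the pixel count (4x SSAA renders 4x the
    pixels). bytes_per_pixel lumps a 32-bit color and a 32-bit
    depth/stencil buffer together; real usage (extra render targets,
    MSAA resolve buffers) is higher, so treat this as a lower bound.
    """
    return width * height * aa_factor * bytes_per_pixel / 1024**2

no_aa_1080p = ssaa_buffer_mb(1920, 1080, 1)   # ~15.8 MiB
ssaa4_1080p = ssaa_buffer_mb(1920, 1080, 4)   # ~63.3 MiB
ssaa4_4k = ssaa_buffer_mb(3840, 2160, 4)      # ~253 MiB
```

Even this lower bound shows why cranking AA at high resolutions eats VRAM far faster than the base resolution alone suggests.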
Quote:


> Originally Posted by *bencher*
> 
> If they both have same tdp I will definitely get a 7970 instead of 280x.


The 280X will probably have a better cooler, PCB layout, and better-clocking memory. Besides, for the 7970 you could go non-reference, but that would be throwing the TDP out the window, as those cards are built for overclocking rather than power efficiency.

Quote:


> Originally Posted by *Warfox101*
> 
> I have been saving $30 a month each month since I got my two 5870's on launch day. I have not found a card that i could love more than my 5870's However im getting a warm fuzzy feeling about this card.


The 5870's were, and are, such great cards.

http://www.overclock.net/t/1429998/wccftech-battlefield-4-and-frostbite-3-will-support-both-amd-mantle-and-nvidia-nvapi-apis-for-pc-optimizations
Even NvAPI can't save them from Mantle, I guess, but only time will tell (since Mantle sits a layer lower than NvAPI does).


----------



## selk22

Quote:


> Originally Posted by *th3illusiveman*
> 
> y-you could get a 7990 for 699 ya know... Superior performance to all those options while being $50 less than a 780.


All I have heard is horror stories of driver issues and heat problems. It just never appealed to me


----------



## szeged

Quote:


> Originally Posted by *selk22*
> 
> All I have heard is horror stories of driver issues and heat problems. It just never appealed to me


The 7990's stock cooler was surprisingly good (for thermals... not acoustics), and AMD recently "fixed" their CrossFire drivers, for the most part. I would recommend a 7990 over a 780 or Titan any day for people who just want the best bang for their buck for gaming today.

But if you've got money to throw around and want the absolute best...


----------



## darkstar585

Quote:


> Originally Posted by *Yeroon*
> 
> That man is awesome! Nothing you can say about his accent will take that away. (providing that is Guy? who races the sportbikes, and rides with Peaty on the pedallybikes)
> 
> Yes, it is.


There is no denying the man's a legend. However, his accent and dialect are ridiculous. Whenever he is on TV I have to put the subtitles on... and even they don't make any sense 90% of the time.


----------



## USFORCES

Quote:


> Originally Posted by *Stay Puft*
> 
> No it won't be even close. Amd isn't going to ripoff it's customers like Nvidias high end offerings


Did you say ripoff? I find this funny considering the AMD FX-62 was $1,050.00 seven years ago. It's a safe bet that if the 290X is faster than the Titan, then AMD is going to compete with Titan prices.


----------



## Artikbot

Quote:


> Originally Posted by *USFORCES*
> 
> Did you say Ripoff, I find this funny considering the AMD FX62 was $1050.00 7yrs ago, it's a safe bet if the 290x is faster than the titan than AMD is going compete with titan prices.


FX chip... Not a mass produced high end GPU.


----------



## maarten12100

Quote:


> Originally Posted by *Artikbot*
> 
> FX chip... Not a mass produced high end GPU.


Back in the days when Intel compilers weren't gimping code and Intel hadn't started bribing OEMs yet. Great times for the enthusiast.

This card will be great whether it gives good price/perf or costs the same as the 780 and beats the Titan.


----------



## fleetfeather

Slightly off-topic; when preorders for the limited ed. cards begin, is the general idea to smash online preorder options, or do US/CA retailers typically do some sort of midnight bricks-and-mortar preorder system?

This question is based on the idea that no Aussie retailers will be taking preorders for the limited ed's of these cards, and I have no idea how they do it over in the states and canada


----------



## Brutuz

Quote:


> Originally Posted by *Stay Puft*
> 
> I could play bf3 with a 512mb video card at 1920x1080. Why are all of you so scared of 2gb? It's plenty


Heavily mod Skyrim on that 512MB card and tell me how it goes. I bet it won't go that well...

Not to mention, 2+ GB is great for future proofing.
Quote:


> Originally Posted by *maarten12100*
> 
> Back in the days when Intel compilers weren't gimping code and Intel hadn't started to bribe OEM's yet.


afaik Intel compilers have always crippled AMD CPUs, and they were bribing OEMs mostly around that specific time...The Athlon64 was just that much faster than the Pentium 4.


----------



## EliteReplay

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> On my game capture pc, I had a 660ti installed, then replaced it with a 7870, and there is noticeable stuttering on the 7870. Gets worse with crossfire, luckily I don't game on that card. Funny thing is my friend who owned the card never complained about stuttering issues. Now that he has the 660ti, he noticed games to be much smoother.


Really, just really? That has to be something with the drivers... going from NV to AMD, in a single-GPU solution you don't get any stuttering... why do people keep saying nonsense like this?
I agree with CF/SLI stuttering in some cases... every piece of electronic equipment has issues in some way... but if you had problems with yours, it doesn't mean everyone has...

Fact is, AMD is as smooth as Nvidia when it comes to one GPU.


----------



## Nonehxc

Quote:


> Originally Posted by *darkstar585*
> 
> Its not as cool as you think in places.... Cue example below.


Doesn't matter. That guy is doing the Isle of Man TT. Finely tuned Superbikes. Balls to the wall. Whatever he's saying, he speaks the truth.


----------



## revro

this guy's a werewolf

but what's this guy's excuse? xD
Quote:


> Originally Posted by *darkstar585*
> 
> Its not as cool as you think in places.... Cue example below.


I think the 290X is a great card too. Hmm, personally I've never experienced a game crashing because I did not have enough VRAM, so I guess my 780 will suffice.

best
revro


----------



## maarten12100

Quote:


> Originally Posted by *Brutuz*
> 
> afaik Intel compilers have always crippled AMD CPUs, and they were bribing OEMs mostly around that specific time...The Athlon64 was just that much faster than the Pentium 4.


I thought they only did that for VIA chips in the beginning. It is bad; we need someone to call forth a higher power and show them the performance difference caused by a simple string (strings, as it checks for more than just the family number).
I managed to change my CPUID info for an i5 2300k to a Harpertown 45nm ES processor, but I'm certainly not educated enough to write those strings for VMware to show the non-believers what is actually happening and allow them to test it for themselves.
A lot of non-believers say Intel is better because of Cinebench, and then I'm like "mind is blown".


----------



## Nonehxc

Quote:


> Originally Posted by *revro*
> 
> this guys a werewolf
> 
> *but whats this guys excuse? xD*
> i think the 290x is great card too, hmm personelly i never experienced that a game crashed cause i did not have enough vram, so i guess my 780 will suffice
> 
> best
> revro


Adrenaline and endorphins make your hair grow like that.









It's really useful for a Superbike rider, he can knot his palms' hair to the throttle/clutch and never lose grip.


----------



## Karlz3r

Checking out AMD's new card gave me goosebumps. Got my hopes up!


----------



## Regent Square

Quote:


> Originally Posted by *2010rig*
> 
> btw - I've been curious to see your Tri-Fire set up, you said you were posting pics yesterday, did you get around to it?


I will be in residence for 4 more months, then I come back home. My job is not the best in the world.


----------



## mcg75

Quote:


> Originally Posted by *EliteReplay*
> 
> really just really? that has to be something with the drivers... going from NV to AMD... in single GPU solution u dont get any stuttering.... why people keep saying nonsense like this?


How exactly is another person's experience nonsense? Just because it differs from yours doesn't make it nonsense.
Quote:


> Originally Posted by *EliteReplay*
> 
> i greed with CF/SLI stuttering in some cases... every peace of eletronic device has issue on some way... but if u had problems in yours it doesnt mean everyone has...


It doesn't mean nobody else has problems either. Very possible he had a bad card.
Quote:


> Originally Posted by *EliteReplay*
> 
> fact is AMD is as smooth as Nvidia when it comes to 1 GPU.


Yeah, after umpteen tech sites called AMD out on it they do. The difference in Skyrim was night and day once they fixed it.


----------



## Stay Puft

Quote:


> Originally Posted by *USFORCES*
> 
> Did you say Ripoff, I find this funny considering the AMD FX62 was $1050.00 7yrs ago, it's a safe bet if the 290x is faster than the titan than AMD is going compete with titan prices.


If my memory serves me right, it was $999.99 on Newegg.


----------



## 8800GT

Quote:


> Originally Posted by *scyy*
> 
> Enjoying that pie in the sky? I like how so many amd users here are all for open api's and completely anti closed till amd has a closed api and suddenly who cares about open, we have mantle!
> 
> These low level api's are harder to work with and most devs will probably stick with dx11. Yes it will give amd an advantage when used but as of now we have no idea how wide spread it will be.


If it really is the same as the console API, then you're talking about potentially every game released on the Xbox One (and, for that matter, the PS4, which will utilize hUMA, another AMD technology, assuming it uses a similar API as well). That's a big deal if you not only have an advantage from the get-go, but also a large library of games that already support your software.


----------



## revro

Mantle is open just like Nvidia's API is open, but you can't use the access language for AMD hardware to access Nvidia hardware, because those are different architectures.


----------



## darkstar585

Quote:


> Originally Posted by *Nonehxc*
> 
> Doesn't matter. That guy is doing the Isle of Man TT. Finely tuned Superbikes. Balls to the wall. Whatever he's saying, he speaks the truth.


You're right, Guy is mental and I have huge respect for what he does... I just think his accent is great and find it funny that nobody can really understand him. I am British and I haven't got a clue without subtitles!

The point I was trying to make is when people think British accent, they normally only think about accents from the likes of Hugh Grant or Colin Firth. When in actual fact our accents are very diverse...and not always cool and posh







Quote:


> Originally Posted by *revro*
> 
> this guys a werewolf
> 
> but whats this guys excuse? xD


That's what being bottle fed testosterone from birth will do to you. Guy is 150% man








Quote:


> Originally Posted by *revro*
> 
> mantle is open just like nvidia api is open, but you cant use access language for amd hw to access nvidia hw, cause those are different architectures


Excuse my ignorance, as I just don't know, but wouldn't hardware emulation be possible and still be faster than DirectX?


----------



## Brutuz

Quote:


> Originally Posted by *EliteReplay*
> 
> really just really? that has to be something with the drivers... going from NV to AMD... in single GPU solution u dont get any stuttering.... why people keep saying nonsense like this?
> i greed with CF/SLI stuttering in some cases... every peace of eletronic device has issue on some way... but if u had problems in yours it doesnt mean everyone has...
> 
> fact is AMD is as smooth as Nvidia when it comes to 1 GPU.


It's because a lot of nVidia owners hear from other nVidia owners about AMDs drivers and assume it's truth when in reality, AMDs drivers have been equal to nVidia's (On Windows) for a very long time now.
Quote:


> Originally Posted by *maarten12100*
> 
> I thought they only did that for Via chips in the beginning it is bad we need someone to call forth a higher power and show them the performance different caused by a simple string (strings as it checks for more like family number)
> I maneged to change my CPUID info for an i5 2300k to a harpertown 45nm ES processor I'm certainly not educated enough to write those strings for VMware to show the non believers what is actually happening and allow them to test it for themselves.
> A lot of non believers saying Intel is better because Cinebench and then I'm like "mind is blown"


It was both VIA and AMD, however VIAs CPUs were the only ones that allowed you to change the family name and see the difference first hand iirc.

I think it's actually been going on as far back as when VIAs x86 division was still Centaur/IDT and Cyrix.


----------



## revro

Quote:


> Originally Posted by *darkstar585*
> 
> Your right, Guy is mental and I have huge respect for what he does...I just think his accent is great and find it funny that nobody can really understand him. I am British and I haven't got a clue without subtitles!
> 
> The point I was trying to make is when people think British accent, they normally only think about accents from the likes of Hugh Grant or Colin Firth. When in actual fact our accents are very diverse...and not always cool and posh
> 
> 
> 
> 
> 
> 
> 
> 
> That's what being bottle fed testosterone from birth will do to you. Guy is 150% man
> 
> 
> 
> 
> 
> 
> 
> 
> Excuse my ignorance as I just don't know but wouldn't hw emulation be possible and still make it faster than direct X?


Uff, I am pretty sure it would not. I mean, you would need to emulate from
Mantle -> Nvidia API -> Nvidia hardware
or
Nvidia API -> Mantle -> AMD hardware

Oh boy, my head hurts just thinking about it









I mean, accessing hardware through a hardware API that's meant for another vendor's architecture... I am sure it might be theoretically possible; everything is possible, since at least Mantle is open. But why would anyone do such emulation at all?









best
revro


----------



## maarten12100

Quote:


> Originally Posted by *Brutuz*
> 
> It's because a lot of nVidia owners hear from other nVidia owners about AMDs drivers and assume it's truth when in reality, AMDs drivers have been equal to nVidia's (On Windows) for a very long time now.
> It was both VIA and AMD, however VIAs CPUs were the only ones that allowed you to change the family name and see the difference first hand iirc.
> 
> I think it's actually been going on as far back as when VIAs x86 division was still Centaur/IDT and Cyrix.


I'd give $100 for a universal spoofer. Anger has one for VIA processors, but I'd rather have a universal one so I can give my E5520s a different vendor ID and see the performance go out the window.
For AMD the performance would come in through the window (which means my Phenom II 955 will probably beat a single E5520 hands down).


----------



## Brutuz

Quote:


> Originally Posted by *maarten12100*
> 
> I'd give 100$ for a universal spoofer Anger has one for VIA processors but I rather have a universal one so I can give my E5520's a different vendor ID and see the performance go out of the window.
> For AMD the performance would enter trough the window. (which means my phenom II 955 will probably beat a single E5520 hands down
> 
> 
> 
> 
> 
> 
> 
> )


Personally I'm glad that the PS4 and Xbox One are using the MS compiler in Visual Studio 2010 now, I'm also hoping SteamOS takes off to increase support for applications in Linux.


----------



## 2010rig

Quote:


> Originally Posted by *Brutuz*
> 
> It's because a lot of nVidia owners hear from other nVidia owners about AMDs drivers and assume it's truth when in reality, AMDs drivers have been equal to nVidia's (On Windows) for a very long time now.
> It was both VIA and AMD, however VIAs CPUs were the only ones that allowed you to change the family name and see the difference first hand iirc.
> 
> I think it's actually been going on as far back as when VIAs x86 division was still Centaur/IDT and Cyrix.


Sorry bro, but it's not about NVIDIA owners, it's about what has been in reviews, and real world users.

Remember what got Skyrim and other games finally fixed on SINGLE cards? Tech Report reviews.

What got frame pacing drivers finally out? A combination of PcPer & Tech Report reporting on Microstuttering on CF systems.

If it wasn't for these sites, AMD would have never gotten around to fixing the issue, which they acknowledged they "didn't know existed".

I can speak from personal experience that Microstutter has been in place in CF for many years, saw it first hand on a pair of 5870's, and if you like, I can link you to reviews that date back to 2008 when they were speaking out about it on CF 4870's.


----------



## fateswarm

Quote:


> Originally Posted by *2010rig*
> 
> "didn't know existed"


That indicates they need a Quality Assurance team. Or one that isn't totally horrible. They should not require the audience, after the fact, to tell them that which frankly should be obvious (even casual gamers notice it and write about it). NVIDIA probably has a better QA team.


----------



## amd655

I would like to say, I have a pair of 7770's, and they run great.
We use RadeonPro for quick fixes if they are needed. BF3 actually takes the cake even on a single 7770; it requires the frame rate to be capped with RadeonPro to stop the input lag, which is really horrible, but once done it works great.

My bro played FC3 on dual/single 7770 and had massive stuttering with both setups; however, later drivers fixed this to a good extent.

Other than the small niggles, I think AMD drivers are improving; they just need to be a lot quicker in releasing fixes.

Nvidia drivers... BF3 would randomly artifact on my GTX 480 back when I was using it. It was all down to the 320.xx drivers; 314.xx and below worked flawlessly. However, 327.23 is installed on both of my Nvidia systems with not a single issue thus far.


----------



## rdr09

Quote:


> Originally Posted by *2010rig*
> 
> Sorry bro, but it's not about NVIDIA owners, it's about what has been in reviews, and real world users.
> 
> Remember what got Skyrim and other games finally fixed on SINGLE cards? Tech Report reviews.
> 
> What got frame pacing drivers finally out? A combination of PcPer & Tech Report reporting on Microstuttering on CF systems.
> 
> If it wasn't for these sites, AMD would have never gotten around to fixing the issue, that they acknowledged they "didn't know existed"
> 
> I can speak from personal experience that Microstutter has been in place in CF for many years, saw it first hand on a pair of 5870's, and if you like, I can link you to reviews that date back to 2008 when they were speaking out about it on CF 4870's.


As much money as some spend on their GPUs, they deserve better support. Look at one member here in this thread...

http://www.overclock.net/t/1427861/guru3d-geforce-327-23-whql/80


----------



## Brutuz

Quote:


> Originally Posted by *2010rig*
> 
> Sorry bro, but it's not about NVIDIA owners, it's about what has been in reviews, and real world users.
> 
> Remember what got Skyrim and other games finally fixed on SINGLE cards? Tech Report reviews.
> 
> What got frame pacing drivers finally out? A combination of PcPer & Tech Report reporting on Microstuttering on CF systems.
> 
> If it wasn't for these sites, AMD would have never gotten around to fixing the issue, that they acknowledged they "didn't know existed"
> 
> I can speak from personal experience that Microstutter has been in place in CF for many years, saw it first hand on a pair of 5870's, and if you like, I can link you to reviews that date back to 2008 when they were speaking out about it on CF 4870's.


Really? I played Skyrim like mad on my HD 7950, with drivers as far back as 12.11, and never noticed those issues. If you mean random small issues, then I don't know why you're complaining about AMD getting them, since every GPU driver ever has some, regardless of brand... Remember the recent driver issues with corrupt colours?

As for frame pacing...It's been a while in the making for both nVidia and AMD as proven by the R9 290X having hardware frame pacing, they wouldn't have added that this year or even last year...The specs for the GPU would have been set in stone by then. nVidia simply got there earlier than AMD which, to their credit is a great thing for everyone. As for not knowing it existed, it's called damage control...Do you really believe the PR departments from any company? Remember when Fermi wasn't going to be delayed from November to March? And then it was? Or the wood screw thing? PR departments are paid to make the company look as good as possible, and it looks better to "not know it existed" than "Oh, we knew but we've been working on a software fix for x months already". It makes their driver team look better.

I had CFX HD4890s, it was barely any less smooth than my single GTX 470...There was microstutter but it isn't anywhere nearly as big of a problem as people make it out to be. Funnily enough, that's when I worked out that at least some of the "AMD driver issues" were unstable OCs...I had my RAM running very tight timings at DDR2-800 and was getting the GSOD (Grey Screen of Death) until I loosened my memory timings a little bit.
Why am I not surprised to see you pushing the FUD about how bad AMDs drivers are? They're not any worse than nVidia's these days...Or are you forgetting that nVidia only incorporated frame pacing with the GTX 6*0s? AMD is ahead sometimes, nVidia is ahead other times but they usually catch up fairly quickly. (Frame Pacing, quad SLI/quadfire, early 64bit drivers, etc)


----------



## 2010rig

Quote:


> Originally Posted by *Brutuz*
> 
> -snip-


What fud am I spreading?

I'm just telling you the timeline of events. If it wasn't for the reviews that exposed the issues, the stuttering wouldn't have gotten fixed.

You're implying that NVIDIA users are making stuff up about Microstutter, and driver issues, which is simply false.

I'm not gonna take your word for it that AMD hasn't had these MS issues for years, because you'll always spin things as "AMD drivers are great!"





Source

Here's some interesting reading for ya, may want to see the date of that while you're at it.
http://hardforum.com/showthread.php?t=1317582

It wasn't until Tech Report came along exposing single-card MS that it finally got fixed, and it wasn't until PCPer & TR had the tools to properly demonstrate MS in CF that AMD finally took action to fix those issues. They are also *still* getting fixed for Eyefinity & DX9. If those issues didn't exist before, care to explain this:









Notice the following graphs WITH frame pacing drivers:





It's laughable that you would still argue that these issues did not exist, after so much evidence has been presented for the past year.

If you want to accuse someone of spreading fud, look in the mirror.
Quote:


> Originally Posted by *Brutuz*
> 
> It's because a lot of nVidia owners hear from other nVidia owners about AMDs drivers and assume it's truth when in reality, AMDs drivers have been equal to nVidia's (On Windows) for a very long time now.


----------



## Nonehxc

Quote:


> Originally Posted by *darkstar585*
> 
> Your right, Guy is mental and I have huge respect for what he does...I just think his accent is great and find it funny that nobody can really understand him. I am British and I haven't got a clue without subtitles!
> 
> The point I was trying to make is when people think British accent, they normally only think about accents from the likes of Hugh Grant or Colin Firth. When in actual fact our accents are very diverse...and not always cool and posh
> 
> 
> 
> 
> 
> 
> 
> 
> That's what being bottle fed testosterone from birth will do to you. Guy is 150% man
> 
> 
> 
> 
> 
> 
> 
> 
> Excuse my ignorance as I just don't know but wouldn't hw emulation be possible and still make it faster than direct X?


Yeah, I know. I'm a Spanish BoA in English, and you have quite the glottal in there. Scottish is particularly difficult for me, although many people find it easy. Go figure.









Nice knowing another TT lover. The TT and Group B rallies are my most beloved motorsports. Being an 80s child, motorbikes and rallies were literally EVERYWHERE in my country. Man + machine + a line to follow. No safety net, only your skills. If you don't know what you are doing, you'd better not do it. Only if you are/were mental enough to take those machines to the edge, and smart enough not to cross that edge.




Group B and their huge turbocharged engines. They were drifting even through the straights because the torque was so high the tires couldn't hold grip for long. During that time, Group B cars were more powerful than Formula 1 cars. Although, to be fair, the 80s were an amazing time for motorsports. Many builders reached their peak in the different motorsports during that time, building machines for the TT (Isle of Man), Group B, and F1. That was the time when winning was all about "we make the biggest, meanest machine we can build and give it to a guy who has enough balls to try to tame it".


----------



## Durquavian

Quote:


> Originally Posted by *2010rig*
> 
> What fud am I spreading?
> 
> I'm just telling you the timeline of events, If it wasn't for the reviews that exposed the issues, the stuttering wouldn't have gotten fixed.
> 
> You're implying that NVIDIA users are making stuff up about Microstutter, and driver issues, which is simply false.
> 
> I'm not gonna take your word for it that AMD hasn't had these MS issues for years, because you'll always spin things as "AMD drivers are great!"
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Source
> 
> Here's some interesting reading for ya, may want to see the date of that while you're at it.
> http://hardforum.com/showthread.php?t=1317582
> 
> It wasn't until Tech Report came along exposing single card MS that it finally got fixed, and it wasn't until PCPER & TR having the tools to properly demonstrate MS in CF, that AMD finally took action to fixing those issues. They are also *still* getting fixed for Eyefinity & DX9. If those issues didn't exist before, care to explain this:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Notice the following graphs WITH frame pacing drivers:
> 
> 
> 
> 
> 
> 
> It's laughable that you would still argue that these issues did not exist, after so much evidence has been presented for the past year.
> 
> If you want to accuse someone of spreading fud, look in the mirror.


I think his original point there was in fact about single GPUs, not CF/SLI. And in the other post, when he mentioned CF, he said it wasn't nearly as bad for him as it was being reported. I have said the same thing. Neither of our experiences says it doesn't exist, just that the descriptions were not what we experienced. And review sites... well, it makes me remember that one APU Hybrid CF review where the writer used 1866 CL13 memory and complained about the fps but never tried the memory at the suggested, already-tested 2133 CL10. To this day I wish I had the funds to build an APU rig and test it for myself.


----------



## Kuivamaa

Quote:


> Originally Posted by *2010rig*
> 
> What fud am I spreading?
> 
> I'm just telling you the timeline of events, If it wasn't for the reviews that exposed the issues, the stuttering wouldn't have gotten fixed.


It is much much more complex than that, not just AMD having issues, reviewers finding those and forcing AMD to fix them. You've mentioned Techreport:
http://techreport.com/review/21982/today-mid-range-gpus-in-battlefield-3/6

This is one year before all hell broke loose with the Skyrim 7970 stuttering commotion. Fermi cards in that test clearly have worse stuttering issues than VLIW ones. But there was no crusade against nvidia. Why reviewers (I do not include PCPer among them; in my book they are just nvidia partners. I explained my reasoning in another thread, so I won't go off topic) suddenly deemed it such a huge deal when both vendors suffered from it in the past is beyond me. But of course the nvidia focus group also helped to spread the word; that's their job.

http://forums.anandtech.com/showthread.php?t=2287709


----------



## KillThePancake

The 270X looks interesting to me, especially at $199. I'm too cheap for flagship cards


----------



## Jack Mac

Quote:


> Originally Posted by *KillThePancake*
> 
> The 270X looks interesting to me, especially at $199. I'm too cheap for flagship cards


7950s are $200 now... particularly interesting is this one:
http://www.amazon.com/Sapphire-DisplayPort-PCI-Express-Graphics-21196-00-20G/dp/B00BXVFM3K/ref=sr_1_2?ie=UTF8&qid=1380343838&sr=8-2&keywords=sapphire+7950


----------



## Durquavian

Quote:


> Originally Posted by *KillThePancake*
> 
> The 270X looks interesting to me, especially at $199. I'm too cheap for flagship cards


I usually don't get flagships either, but I am really trying to find a way to convince both me and my wife that it is a *necessary* purchase.


----------



## Stay Puft

Quote:


> Originally Posted by *KillThePancake*
> 
> The 270X looks interesting to me, especially at $199. I'm too cheap for flagship cards


270X Hawk is going to be a great card


----------



## 2010rig

Quote:


> Originally Posted by *Kuivamaa*
> 
> It is much much more complex than that, not just AMD having issues, reviewers finding those and forcing AMD to fix them. You've mentioned Techreport:
> http://techreport.com/review/21982/today-mid-range-gpus-in-battlefield-3/6
> 
> This is one year before all hell broke loose with the Skyrim 7970 stuttering commotion. Fermi cards in that test clearly have worse stuttering issues than VLIW ones. But there was no crusade against nvidia. Why reviewers (I do not include PCPer, among them in my book they are just nvidia partners,I explained my reasoning in another thread,won't go offtopic) suddenly deemed it as such a huge deal when both vendors suffered from it in the past is beyond me. But of course nvidia focus group also helped to spread the word, that's their job.


About Battlefield 3: the link you shared is from 2011, and covers mid-range cards. It's crazy how bad it was on Fermi's *mid-range cards*; NVIDIA addressed the issue with hardware frame metering in Kepler.
http://techreport.com/review/23527/review-nvidia-geforce-gtx-660-graphics-card/4

See, if there's an issue, I will happily acknowledge it, pretending like it's not there doesn't do anyone any good.

Please don't pretend like AMD fans never point out issues with NVIDIA, if / when they have it, it goes both ways really. I was never implying that NVIDIA doesn't have issues, which they most certainly have had them in the past.

I appreciate your concerns about PCPer; keep in mind that AMD is their #1 advertiser. What are your thoughts on that?









I remember taking screenshots of AMD ads alongside those articles when they first appeared. I couldn't imagine what kind of reaction would have taken place if those were NVIDIA ads instead.


----------



## xoleras

I can't believe that to this day, people are bored enough to argue nvidia vs AMD for pages on end.


----------



## Blackops_2

I think it will be interesting to see how the 290 fares. We know it will be faster than the 770/7970, so apparently it's going to take that $400 spot in between the 780 and 290X (assuming the 290X is as the title claims).


----------



## raghu78

Quote:


> Originally Posted by *Blackops_2*
> 
> I think it will be interesting to see how the 290 fares. We know it will be faster than the 770/7970 so apparently it's going to take that 400$ spot in between the 780/290x (assuming the 290x is as the title claims)


The 290 will be the card that provides awesome value for money. Clock for clock, the R9 290 will be close to the R9 290X, just like the HD 7950 and HD 7970 (around 5-8%). It will cost $400-$450, which is $100-$150 less than the R9 290X. This is the card that is the biggest threat to NVIDIA. We can expect this chip to have low stock speeds of around 800-850 MHz and a ton of OC headroom with voltage control. With 4 GB VRAM and hardware CF frame pacing, this card will be a perfect choice for Eyefinity 1600p and Eyefinity 4K.


----------



## rdr09

Quote:


> Originally Posted by *2010rig*
> 
> About Battlefield 3, the link you shared is from 2011, and mid-range cards. Craziness how bad it was on Fermi's *mid-range cards*, NVIDIA addressed the issue with Hardware frame metering with Kepler.
> http://techreport.com/review/23527/review-nvidia-geforce-gtx-660-graphics-card/4
> 
> See, if there's an issue, I will happily acknowledge it, pretending like it's not there doesn't do anyone any good.
> 
> Please don't pretend like AMD fans never point out issues with NVIDIA, if / when they have it, it goes both ways really. I was never implying that NVIDIA doesn't have issues, which they most certainly have had them in the past.
> 
> I appreciate your concerns about PcPer, keep in mind that AMD is their #1 advertiser, what are your thoughts on that?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I remember taking screenshots of AMD ads along side those articles, when they first appeared. I couldn't imagine what kind of reaction would have taken place if those were NVIDIA ads instead.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


here ignore this . . .

" . . . Anything 306.97 works well for me. Just that for many 306.97 is known as the official last good driver for BF3 . . . I do play some DayZ and Deus Ex and get lower FPS. If I decide to give BF3 a two day break, I just uninstall 306.97 and install 314.07. I have probably installed and reinstalled at least 20 times this week alone lol."

That guy said "lol." To me, that was not funny.


----------



## Stay Puft

Not happy about the current rumor from Raja that the price will be $649.99. For that price I can grab 780s instead.


----------



## xoleras

Quote:


> Originally Posted by *Stay Puft*
> 
> Not happy about the current rumor from raja or raj that the price will be 649.99. For that price I can grab 780s instead


The only thing I heard was from Kyle at HardOCP - he showed the $599.99 rumored price to Raja at the press event and asked if it was true, and Raja shook his head no.

Of course that doesn't tell us the real price, though.


----------



## Brutuz

Quote:


> Originally Posted by *2010rig*
> 
> What fud am I spreading?
> 
> I'm just telling you the timeline of events, If it wasn't for the reviews that exposed the issues, the stuttering wouldn't have gotten fixed.
> 
> You're implying that NVIDIA users are making stuff up about Microstutter, and driver issues, which is simply false.
> 
> I'm not gonna take your word for it that AMD hasn't had these MS issues for years, because you'll always spin things as "AMD drivers are great!"
> 
> Here's some interesting reading for ya, may want to see the date of that while you're at it.
> http://hardforum.com/showthread.php?t=1317582
> 
> It wasn't until Tech Report came along exposing single card MS that it finally got fixed, and it wasn't until PCPER & TR having the tools to properly demonstrate MS in CF, that AMD finally took action to fixing those issues. They are also *still* getting fixed for Eyefinity & DX9. If those issues didn't exist before, care to explain this:
> 
> Notice the following graphs WITH frame pacing drivers:
> 
> It's laughable that you would still argue that these issues did not exist, after so much evidence has been presented for the past year.
> 
> If you want to accuse someone of spreading fud, look in the mirror.


You're acting as though AMD's drivers are far worse than nVidia's, which is pure FUD through and through now that the frame pacing driver is out. (And it was FUD before Kepler came out and fixed frame pacing for nVidia.)

You're putting words in my mouth and trying to make me look like a fanboy when, in reality, I buy what is fastest for me: I ran a GTX 470 before this HD 7950, and a GTX 275 before that. I also never said AMD's drivers are great, merely that they're equal to nVidia's, nor did I imply that microstutter isn't a problem, just that not everyone can see it, which is true. Otherwise, why do you think it only became a big issue after those reviews came out? A few people noticed it before on both SLI and CFX, but plenty of people also noticed the increased FPS without noticing the microstutter, since frame pacing was only added for nVidia with Kepler, yet plenty of people who are now complaining were perfectly happy with SLI GTX 580s or CFX HD 5870s. I've known and heard about it as far back as when Crossfire required an external dongle and the ability to run two high-end GPUs on a consumer-level motherboard was still fairly new after years of AGP. Everyone knows that a faster single GPU is better than two slower GPUs (why else do you think people buy a Titan instead of two HD 7950s or GTX 770s? Replace that with two mid-range cards against one high-end card for every generation and it has always applied, barring the 8800 GTX beating the 7950 GX2 both in FPS and lack of stutter), and I've been making recommendations based on that since the 8800 GT was the card to get, if not further back.

Also, that quote was about *driver quality*. If you're going to start saying nVidia's drivers are better quality than AMD's these days, I'm not even going to bother replying. Neither company's drivers are great, both have bugs, yet people (nearly always nVidia owners) still act like nVidia's drivers are better quality. Do you really want me to start digging up screenshots of the texture corruption with the 314 drivers? Or of the nvlddmkm.sys BSODs? Or of the driver that broke fan profiles and caused cards to overheat? Or the other various issues that have occurred over the years with nVidia's drivers? The same goes for AMD, as neither company has great drivers, despite what words you try to put in my mouth.
Quote:


> Originally Posted by *Durquavian*
> 
> I think his original point there was in fact single GPU not CF/SLI. And in the other post when he mentioned CF he said it wasn't nearly as bad for him as it was being reported. I have said the same thing. Neither of those experiences say it doesn't exist just that the descriptions were not what we experienced. And review sites.. well makes me remember that one APU Hybrid CF review where the writer used 1866 cl13 memory and complained about the fps but never tried the memory at the suggested and already tried 2133 cl10. To this day I wish I had the funds to build an APU rig and test for myself.


I thought it was pretty obvious what I was talking about, given that I've maintained the same stance for ages now: single-card stutter is barely a problem regardless of brand unless it's a game bug, like in FC3, while multi-card stutter is an issue depending on how sensitive you are to it. But I guess that doesn't paint nVidia in a good enough light for him. I also like that he tries to paint me as an AMD fanboy despite me owning more nVidia cards than AMD ones (X1600 Pro, 3x HD 4890, HD 7850, and HD 7950 vs. 2x 6800 GS, 9600 GT, 9800 GT, 9800 GTX+, G210, GTX 275, and GTX 470).
I'm not a fanboy for AMD or nVidia. I'm simply realistic about driver issues on both sides; the nvlddmkm.sys BSODs are still occurring for some people today. (They originally started around the time of Vista's launch. That's seven years, and to be fair they have much reduced in frequency and in how many people they affect, but they still occur nonetheless.)
Quote:


> Originally Posted by *Forceman*
> 
> Straight from AMD PR's mouth to our ears.


The fanboys on either side are just plain sad.


----------



## Stay Puft

Quote:


> Originally Posted by *xoleras*
> 
> The only thing I heard was from Kyle at HardOCP - he showed the 599.99 rumored price to Raja at the press event and asked if it was true, and Raja shook his head no.
> 
> Of course that doesn't tell us the real price, though.


Hoping it will be lower... Much lower. $499.99 would be outstanding.

290X - 499
290 - 399
280X - 299
280 - 249
270X - 199


----------



## th3illusiveman

Quote:


> Originally Posted by *2010rig*
> 
> Sorry bro, but it's not about NVIDIA owners, it's about what has been in reviews, and real world users.
> 
> Remember what got Skyrim and other games finally fixed on SINGLE cards? Tech Report reviews.
> 
> What got frame pacing drivers finally out? A combination of PcPer & Tech Report reporting on Microstuttering on CF systems.
> 
> If it wasn't for these sites, AMD would have never gotten around to fixing the issue, that they acknowledged they "didn't know existed"
> 
> I can speak from personal experience that Microstutter has been in place in CF for many years, saw it first hand on a pair of 5870's, and if you like, I can link you to reviews that date back to 2008 when they were speaking out about it on CF 4870's.


You do realize that even when running the 12.11 drivers, the HD 7XXX cards didn't have any major issues with stutter in single-card configurations, right? There were two methods used for recording stutter: one was recording frame times with FRAPS, and the other was using NVIDIA's own FCAT tool. When using FRAPS, the HD 7K cards reported high stutter rates, yet when using NVIDIA's FCAT with the exact same drivers, those stutters were reduced to "acceptable" levels. Which method produces more accurate data is still debatable, but considering most sites have moved to using FCAT, it's fair to assume it's the more trusted one among reviewers.

As for micro-stutter, you should also know that up until the GTX 600 series, NVIDIA cards had stutter just as bad as the 7K series; look at some GTX 570 SLI reviews on Tech Report.

Given that AMD has a significantly smaller driver team and budget than NVIDIA, it shouldn't come as a surprise that they don't have the same resources NVIDIA has to look into every single aspect of their driver performance and bugs, and as a result they will have a few more issues. There is also the fact that the GTX 400, 600, and 700 series are all evolutionary developments of the Fermi architecture, and NVIDIA has had years of experience developing drivers for that lineage, while the GCN architecture in the HD 7K series is completely new for AMD and required completely new driver code.

You really should give AMD more credit than you do, because with all they have going against them, they are still very competitive in the GPU industry. Having owned an HD 4870, GTX 460, GTX 570, SLI GTX 570s, and an HD 7970, I have had no major issues with drivers from either of these two companies, so I really don't get where all these "horror" stories are coming from.
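To make the FRAPS-versus-FCAT point above concrete: both tools ultimately produce a list of per-frame times, and "stutter" metrics are just statistics over that list. As a rough sketch (the function name and the frame-time numbers are made up for illustration, not taken from any real FRAPS or FCAT capture):

```python
# Sketch: computing common stutter metrics from a list of frame times
# (milliseconds per frame). The data below is hypothetical; in practice
# it would come from a FRAPS frametimes log or an FCAT overlay analysis.

def stutter_metrics(frame_times_ms):
    """Return average FPS, 99th-percentile frame time, and worst frame."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    ordered = sorted(frame_times_ms)
    # 99th percentile: the frame time that 99% of frames beat or match.
    p99 = ordered[int(0.99 * (n - 1))]
    return avg_fps, p99, ordered[-1]

# A mostly smooth run with two long "stutter" frames mixed in: the
# average FPS looks fine, but the percentile/worst-case numbers expose
# the hitches that an FPS-only benchmark would hide.
times = [16.7] * 98 + [45.0, 60.0]
fps, p99, worst = stutter_metrics(times)
print(round(fps, 1), p99, worst)
```

This is why the two measurement points can disagree: FRAPS samples before the driver's present queue, FCAT after the display output, so the same run can yield two different frame-time lists and therefore two different sets of these statistics.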
Quote:


> Originally Posted by *Stay Puft*
> 
> Hoping it would be lower... Much lower. 499.99 would be outstanding
> 
> 290X - 499
> 290 - 399
> 280X - 299
> 280 - 249
> 270X - 199


lol, I'm getting déjà vu of those Quadfire 7970 posting days while reading your recent posts. I sure hope the 290X lives up to your expectations this time.


----------



## Stay Puft

Quote:


> Originally Posted by *th3illusiveman*
> 
> You do realize that even when running on 12.11 drivers the HD7XXX cards didn't have any major issues with stutter in a single card configurations right? There were two methods used for recording stutter, one was recording FPS with fraps and the other was using Nvidias own FCAT tool. when using Fraps the HD7K cards reported high stutter rates yet when using Nvidias FCAT and the exact drivers those stutters were reduced to "acceptable" levels. Which method produces more accurate data is still debatable but considering most sites have moved to using FCAT it's fair to assume that one is more trusted among reviewers.
> 
> As for micro-stutter you should also know that up until the GTX600 series Nvidia cards had stutter just as bad as the 7K series, you should look at some GTX570 Sli reviews on Techreport.
> 
> Given the fact that AMD has a significantly smaller driver team and budget then Nvidia, it shouldn't come as a surprise that they don't have the same resources Nvidia has into looking at every single aspect of their driver performance and bugs and as a result will have afew more issues then they will. There is also the fact that the GTX400, 600 and 700 series are all based on the fermi architecture and Nvidia has had years of experience developing drivers for it while the GCN architecture for the HD7K series is completely new for AMD and required completely new driver code.
> 
> You really should give AMD more credit then you do, because with all they have going against them they are still very competitive in the GPU industry.... Having owned an HD4870, GTX460, GTX570, Sli GTX570 and an HD7970 i have had no major issues with drivers from any of these two companies so i really don't get where all these "horror" stories are coming from.
> lol, I'm getting deja-vu of those Quadfire 7970 posting days while reading your recent posts. I sure hope the 290x lives up to your expectations this time


That setup was fun except for the crashing in BF3 and the stutter


----------



## Baghi

Hawaii won't require external CF bridges, and the release date is 15 October.
http://wccftech.com/amd-radeon-r9-290x-feature-amd-crossfirex-technology-crossfire-bridge-required-anymore/
http://videocardz.com/46186/amd-radeon-r9-290x-officially-released-october-15th


----------



## Forceman

Quote:


> Originally Posted by *Baghi*
> 
> Hawaii won't require external CF bridges and the released date is 15 Oct.
> http://wccftech.com/amd-radeon-r9-290x-feature-amd-crossfirex-technology-crossfire-bridge-required-anymore/
> http://videocardz.com/46186/amd-radeon-r9-290x-officially-released-october-15th


Wonder if it'll be available at retail that day, or if it'll be an announce-now, sell-later kind of situation like the 7970.


----------



## th3illusiveman

Quote:


> Originally Posted by *Baghi*
> 
> Hawaii won't require external CF bridges and the released date is 15 Oct.
> http://wccftech.com/amd-radeon-r9-290x-feature-amd-crossfirex-technology-crossfire-bridge-required-anymore/
> http://videocardz.com/46186/amd-radeon-r9-290x-officially-released-october-15th


For this card to win me over, it needs to be within 10% of GTX 780 performance, or above it, while costing $500. "Hoping" for a $600 price point because NVIDIA got greedy and decided that should be the new flagship price is silly, but deep down I know it's either one or the other.

I don't know when $500 stopped being a lot of freaking money for a GPU...


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *2010rig*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> What fud am I spreading?
> 
> I'm just telling you the timeline of events, If it wasn't for the reviews that exposed the issues, the stuttering wouldn't have gotten fixed.
> 
> You're implying that NVIDIA users are making stuff up about Microstutter, and driver issues, which is simply false.
> 
> I'm not gonna take your word for it that AMD hasn't had these MS issues for years, because you'll always spin things as "AMD drivers are great!"
> 
> 
> 
> 
> 
> Source
> 
> Here's some interesting reading for ya, may want to see the date of that while you're at it.
> http://hardforum.com/showthread.php?t=1317582
> 
> It wasn't until Tech Report came along exposing single card MS that it finally got fixed, and it wasn't until PCPER & TR having the tools to properly demonstrate MS in CF, that AMD finally took action to fixing those issues. They are also *still* getting fixed for Eyefinity & DX9. If those issues didn't exist before, care to explain this:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Notice the following graphs WITH frame pacing drivers:
> 
> 
> 
> 
> 
> It's laughable that you would still argue that these issues did not exist, after so much evidence has been presented for the past year.
> 
> If you want to accuse someone of spreading fud, look in the mirror.


I wonder just how many times you've posted those same tired graphs? I think there should be a limit on posting the same material over and over again; say, after 100 times or so you aren't allowed to post it anymore.









I ran CF 7970s for over a year, and while there was some noticeable stutter, it really wasn't any worse than my SLI 580s had been. It certainly NEVER got to a point where I couldn't enjoy my games, and all the massively overhyped hysteria on the subject honestly makes me nauseous.


----------



## maarten12100

Quote:


> Originally Posted by *2010rig*
> 
> About Battlefield 3, the link you shared is from 2011, and mid-range cards. Craziness how bad it was on Fermi's *mid-range cards*, NVIDIA addressed the issue with Hardware frame metering with Kepler.
> http://techreport.com/review/23527/review-nvidia-geforce-gtx-660-graphics-card/4
> 
> See, if there's an issue, I will happily acknowledge it, pretending like it's not there doesn't do anyone any good.
> 
> *Please don't pretend like AMD fans never point out issues with NVIDIA*, if / when they have it, it goes both ways really. I was never implying that NVIDIA doesn't have issues, which they most certainly have had them in the past.
> 
> I appreciate your concerns about PcPer, keep in mind that AMD is their #1 advertiser, what are your thoughts on that?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I remember taking screenshots of AMD ads along side those articles, when they first appeared. I couldn't imagine what kind of reaction would have taken place if those were NVIDIA ads instead.


Well, they have, but you do it all the time.
Considering NVIDIA has had this same issue, why hate on AMD for it?


----------



## Baghi

Quote:


> Originally Posted by *Forceman*
> 
> I'm not talking about the information, I'm talking about the wording. Who talks like that? It sounds like exactly what you hear from a marketing team. If someone had posted that as a quote from AMD no one would know the difference.


I can clearly see the difference between his post and comments made by a hardware rep. He's not promising anything; he's "hoping" for said things. A hardware rep would never reveal OC potential like he has.


----------



## Kuivamaa

Quote:


> Originally Posted by *2010rig*
> 
> About Battlefield 3, the link you shared is from 2011, and mid-range cards. Craziness how bad it was on Fermi's *mid-range cards*, NVIDIA addressed the issue with Hardware frame metering with Kepler.
> http://techreport.com/review/23527/review-nvidia-geforce-gtx-660-graphics-card/4
> 
> See, if there's an issue, I will happily acknowledge it, pretending like it's not there doesn't do anyone any good.
> 
> Please don't pretend like AMD fans never point out issues with NVIDIA, if / when they have it, it goes both ways really. I was never implying that NVIDIA doesn't have issues, which they most certainly have had them in the past.
> 
> I appreciate your concerns about PcPer, keep in mind that AMD is their #1 advertiser, what are your thoughts on that?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I remember taking screenshots of AMD ads along side those articles, when they first appeared. I couldn't imagine what kind of reaction would have taken place if those were NVIDIA ads instead.


Is AMD still getting advertised there (PCPer) after their true colours were shown? That's the question; my guess is that they haven't gotten a cent's worth of money from AMD in a while. Your previous post insinuated exactly that only AMD had issues, and that good reviewers found them and helped poor Radeon users finally get help from team red. When it comes to single-GPU metering (dual-GPU is a different discussion, where AMD had real issues), even Kepler has had its share of stuttering games. As for BF3, even if we accept that everything is fine now on the green side in this game (it isn't; their drivers have been iffy there for a long time), what happened to Fermi owners?

The bottom line is this: NVIDIA, through a well-oiled marketing mechanism (lobbying the press and using their focus group), often takes minor issues, issues that are sometimes present in their own products as well, and blows them out of proportion to create a mostly false image of superiority. That is exactly what they tried to do before the 290X announcement. It works, and of course now AMD is doing the same: hiring NVIDIA marketeers, forming their own version of the focus group, etc.


----------



## PhantomTaco

I posted this in the thread on PCPer's recent article about multi-monitor frame pacing, so a little of it is out of context, but the vast majority still applies. Sorry if this technically counts as double posting, but I'm not in the mood to retype it. It's slightly modified from the original version.
Quote:


> This entire thread has taken a turn for the disgusting, and it applies to both sides:
> 
> To the AMD shills (or whatever you see fit to call yourselves). You actively deny there was an issue based on anecdotal evidence from yourselves? Some grattitude to those that spent the time to quantify it to help get these updates pushed. What's more is it's not like [Ryan Shrout of PCPer] is the only one that's reported these issues and it isn't like we haven't seen a lot of posts on forums as well stating similar issues (although not necessarily quantified). Yes you can complain oh but Nvidia made the software there's a bias we don't fully understand how it's being measured etc. So would you rather stick your heads in the sand and let nothing be done to figure out ways to quantify these issues? And then you complain about the timing. Ryan Shrout was even was in contact with them a month ago about it and told them well in advance and what did they have to say to him apparently? Nothing. And yet you still want to poo poo him over it? Don't forget that AMD actively DENIED there were any issues whatsoever until Ryan and others found ways to show them it was in fact an issue. Instead you jump at his throat for the timing? If anything the timing couldn't be better. It's a very cold reminder that they need to fix these issues, and that people have every right to be reminded/know that these issues are apparent in AMD cards before they do go out and buy the new lineup, or would you rather it was slipped under the table without anyone knowing? What's more is on one side you have a quick hastily written article mud slinging at pcper insinuating they're being guided by nvidia to release these, and on the other side you have a review site merely releasing factual data, and you choose to side with the mud slinger? Shameful.
> 
> To the NVIDIA shills. Don't forget for a second that them improving these issues does nothing but improve the overall market and increases competition. Increased competition forces both sides to work harder to provide better products to us at better prices. Instead of giving AMD afficionados a hard time for their decisions, we should focus on shifting the blame to manufacturers on both sides and demanding better. Arguing amongst ourselves does nothing but ruin the community, while standing up and complaining to our beloved manufacturers has the chance of getting us all better products and better prices. What's more for the love of donkey balls keep this out of this thread. THIS IS NOT TO DISCUSS THIS BEATEN MS HORSE, but to talk about the 290x itself. Operate under the assumption that they WILL have it fixed by the time that the cards are released and base your discussion points on that, we've all already discussed this garbage a million times on a trillion threads. Get off your high horses and discuss instead of stirring up more garbage.


Going back to being on topic:
Quote:


> the 290 will be the card which would provide awesome value for money. clock for clock R9 290 will be close to R9 290X , just like HD 7950 and HD 7970 (around 5 - 8%). it will cost $400 - $450. $100 - $150 less than R9 290X. this is the card which is the biggest threat to Nvidia. We can expect this chip to have low stock speeds of around 800 - 850 mhz and have a ton of OC headroom with voltage control. with 4 GB VRAM and hardware CF frame pacing this card will be a perfect choice for Eyefinity 1600p and Eyefinity 4k. thumb.gif


Is this conjecture on your part? All of it seems to me like educated guessing based on the rumored information we've gleaned (with a few exceptions). I'm very interested to see how exactly the 290 and 290X are differentiated, as you may very well be right that the 290 will be the value killer for them. As for the pricing part of your post, I'll reserve judgement until the 15th. I'm still more convinced personally that the 290X will be either $600 or $650, but the idea of the 290 at $400 or $450 makes sense to me.

As for the cards no longer needing CrossFire connectors and instead actively using bandwidth from the PCIe lanes, I'm a little skeptical as to how well that will work. As we know, to date only the most powerful cards exhibit signs of bottlenecking on PCIe 3.0 x8 (even then it's still hotly contested, and it's something I hope to test for myself soon when I make the jump to X79 in the coming weeks), so wouldn't this potentially come at the cost of performance? I don't know if it would even be a noticeable amount, but I feel it has the potential to be. Then again, I could be wrong; regardless, it's a smart move, at least for those on the X79 platform where PCIe lanes aren't a concern in multi-GPU setups at all. That said, seeing the EVGA SLI bridges (and planning to buy one myself) shows that they can be made into something that looks good and adds flair to a build, so at the same time I'm somewhat sad to see they may be leaving us soon. Conflicted :/
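The PCIe-bandwidth worry above can be sanity-checked with back-of-the-envelope arithmetic. This is a rough sketch under stated assumptions (32-bit pixels, alternate-frame rendering where the second card ships half the frames over the bus, and PCIe 3.0's ~0.985 GB/s per lane after 128b/130b encoding); it ignores protocol overhead and any non-frame traffic:

```python
# Rough estimate: can the PCIe bus carry a second GPU's finished frames
# for compositing, with no external CrossFire bridge? All figures are
# ballpark assumptions, not measurements of the actual XDMA engine.

def frame_traffic_gbps(width, height, fps, bytes_per_pixel=4):
    """Raw bandwidth needed to move rendered frames, in GB/s."""
    return width * height * bytes_per_pixel * fps / 1e9

# PCIe 3.0 delivers ~0.985 GB/s per lane after 128b/130b encoding.
pcie3_x8 = 0.985 * 8    # ~7.9 GB/s
pcie3_x16 = 0.985 * 16  # ~15.8 GB/s

# In alternate-frame rendering at 4K / 60 FPS, the second card sends
# every other frame across the bus: 30 frames per second.
traffic_4k60 = frame_traffic_gbps(3840, 2160, 60 / 2)
print(round(traffic_4k60, 2), "GB/s needed vs", round(pcie3_x8, 1), "GB/s on x8")
```

By this estimate the frame traffic is around 1 GB/s, a small fraction of even a PCIe 3.0 x8 link, which is presumably why AMD felt safe dropping the bridge; whether real-world contention with normal rendering traffic changes that picture is exactly what the post above is skeptical about.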


----------



## mcg75

Quote:


> Originally Posted by *Brutuz*
> 
> Really? So despite having played Skyrim like mad on my HD7950 and not noticing those issues with drivers as far back as 12.11, if you're meaning random small issues then I don't know why you're complaining about AMD getting them as every GPU driver ever has some, regardless of brand...Remember the recent driver issues with corrupt colours?


With all due respect Brutuz, Skyrim on any driver before the 13.2 beta was terrible on my 7970. Bad enough I hunted down the leaked version of 13.2 beta sent to Wasson that the AMD rep said not to use because it was incomplete.

13.2 vs anything previous I used was night and day. With 13.2, you could spin around and the view was smooth. Before that, you'd get frame skips that felt like I was back playing console games. I thought I had it settled down using various other methods like RadeonPro but nothing came close to the actual fix in 13.2.

Not going to comment on anything else because other than that, I had no issues with AMD drivers. But that issue was 110% real and should not be downplayed because without Wasson's report, we may not have a fix to this day.


----------



## Artikbot

Quote:


> Originally Posted by *mcg75*
> 
> With all due respect Brutuz, Skyrim on any driver before the 13.2 beta was terrible on my 7970. Bad enough I hunted down the leaked version of 13.2 beta sent to Wasson that the AMD rep said not to use because it was incomplete.
> 
> 13.2 vs anything previous I used was night and day. With 13.2, you could spin around and the view was smooth. Before that, you'd get frame skips that felt like I was back playing console games. I thought I had it settled down using various other methods like RadeonPro but nothing came close to the actual fix in 13.2.
> 
> Not going to comment on anything else because other than that, I had no issues with AMD drivers. But that issue was 110% real and should not be downplayed because without Wasson's report, we may not have a fix to this day.


I could play it with no problems (albeit with generous stuttering when riding the horse) on my HD 6950 the whole time I've had it.

It improved a crapton with 12.8, though.


----------



## raghu78

http://www.youtube.com/watch?v=P67tWF2mMrM

Austin explains the Rx GPU product stack and gives hints at the performance of the entire stack. The chips themselves are Oland (most likely), Bonaire, Pitcairn, Tahiti, and Hawaii.


----------



## Rtrbtn

Is the 7990 still a viable option at this point? How is it now with microstutter? If I am considering spending a large amount on a GPU, I might as well consider the absolute most performance I can get for my money.


----------



## Durquavian

Quote:


> Vsync - Only a Partial Answer
> In some select instances for AMD's CrossFire we can actually see a completely resolved frame variance result, as demonstrated with the Battlefield 3 2560x1440 graphs. But Vsync still introduces other problems to latency and interactivity of PC games and is a topic we are going to dive into again soon.


See, herein lies the issue with calling AMD/ATI users liars when they say that microstutter wasn't really an issue for them. Some people, like myself, have always used V-sync or better, i.e. RadeonPro. So those huge issues we have been told about may not have been so huge for all users. That doesn't mean the issue didn't exist; by god, we have more than enough data to prove otherwise. But you have to understand one thing: these articles of proof are meant to have the greatest impact possible, and had they started with V-sync or Dynamic Frame Control from RadeonPro, the issue would not have seemed so dire. I got lucky and used RadeonPro well before I ran CrossFire, so I never really had the issues documented, and that may be the case for a lot of others. So it's not that those of us on AMD are trying to cover it up, or lie, or be fanboys; it's simply that we were already fixing the problem and doing what OCers do best: fix our stuff and get the most we can.
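The reason V-sync or RadeonPro's Dynamic Frame Control masked microstutter is that frame pacing is conceptually simple: hold each finished frame until its scheduled slot so the intervals the viewer sees are even. A toy sketch of that idea (simplified and hypothetical; real drivers pace in the present queue rather than with explicit timestamps like this):

```python
# Toy sketch of frame pacing. Frames in alternate-frame rendering tend
# to finish in uneven bursts; pacing delays presentation so the on-screen
# intervals approach the target (e.g. 16.7 ms for 60 Hz).

def paced_present(render_times_ms, target_interval_ms):
    """Given uneven render-completion times, return evened present times."""
    present_times = []
    next_slot = 0.0
    for t in render_times_ms:
        # A frame is shown at its slot, or when it finishes, whichever is later.
        show_at = max(t, next_slot)
        present_times.append(show_at)
        next_slot = show_at + target_interval_ms
    return present_times

# Frames finishing in a bursty pattern (the classic AFR microstutter shape):
renders = [10.0, 33.0, 44.0, 66.0]
print(paced_present(renders, 16.7))
```

Note the trade-off the thread keeps circling: the paced output is smoother, but frames are shown later than they were rendered, which is the added latency people object to with V-sync-style fixes.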


----------



## Baghi

Lack of Crossfire connector on R9 290X explained
Quote:


> Naturally, it supports Crossfire, no need to panic, two to three R9 or R7 cards will work in Crossfire as long as the motherboard has enough slots. The reason behind the decision to drop the connector is relatively simple. We spoke with a few industry friends and we were told that you don't need this communication anymore as the PCIe bus is fast enough for all the frame syncing and communication necessary to make the card work synced.


Nothing that we don't know already, but sharing this regardless.


----------



## Kuivamaa

Quote:


> Originally Posted by *Johnny Rook*
> 
> I am not defending Ryan Shrout, and he doesn't need me to fight his battles, but if there's a person in the industry who could be called a "fanboy," it's him. Do you know his first websites had the words "Athlon" and "AMD" in their names? Yeah! Call Ryan Shrout an "AMD fanboy"!
> 
> That being said, of course there are problems with both AMD (formerly ATI) and with nVIDIA. It couldn't be otherwise, since there are literally thousands of PC configurations and it's very difficult to deliver a hassle-free experience to everybody.
> Are you calling the problems with Crossfire, and the crap gaming experience it provided, a "minor issue"?! I've used ATI cards since they first showed up and I used Crossfire for 5 years, 2008-2013, and I was aware of the micro-stutter and poor GPU scaling long before TechReport or PCPer talked about it! It was NOT a minor issue! It was a HUGE issue! It actually hurt my gaming experience. In less taxing games I wouldn't notice it; the sheer raw power and performance of the GPUs would just disguise the problem. However, in games highly taxing on the GPUs, the stutter was noticeable, and that was when it felt like one of my GPUs' rendering was actually being wasted; a waste of resources and a waste of power. In that matter of multi-GPU implementation, nVIDIA did not have a "false image of superiority". It was (and still is) superior.
> So, what "minor issues" that nVIDIA blew "out of proportion to create a mostly false image of superiority" are you talking about?


No, I am specifically calling the stuttering issues on single-GPU solutions in a select few games minor, because they are exactly that: minor issues that affect both camps from time to time. If you read my post, I clearly state Crossfire had true issues. I don't care about "fanboys/fanboyism" and I didn't call Shrout a fanboi.

_" It doesn't matter all that much who exactly invented FCAT, PCPer or nvidia. What does matter is this: The rest of the sites got it from nvidia which at a point either invented it or endorsed it. Now, this is a very specific type of hardware for a very specialized type of work duty. Chances that both pcper and nvidia worked independent one from another to make it, are NIL. They worked together, either co-developed or one showed it to the other before the rest of the sites had any knowledge of it. No way around this. What I believe? It was created by nvidia that needed a vocal advocate of it which they found in pcper. "_

http://www.overclock.net/t/1427828/bsn-state-of-4k/250#post_20834380

No conspiracy theories, just common sense. PCP and nvidia are partners.


----------



## Brutuz

Quote:


> Originally Posted by *2010rig*
> 
> Please re-read what I wrote, no where did I state or imply NVIDIA drivers are far better. It's only been until recently ( see August 1st ) that AMD finally put out the MS drivers, that still do not fully fix the MS issues ( Eyefinity, DX9 )
> 
> You said AMD drivers are as good as NVIDIA's for a long time now ( 2 months isn't a "long time"), you also stated that you had no issues with 4890's.
> 
> Perhaps you're just not susceptible to MS, I've seen and experienced 5870's first hand, it was so bad that my brother in law just ended up disabling the 2nd card most of the time, due to all the issues he was having.
> 
> Maybe I misunderstood what you meant by this....
> 
> I see nothing wrong in demonstrating that an issue does in fact exist, where others will have you believe it doesn't, and everything is peachy and perfect. I've only posted those graphs 23 times, so I still have 77 to go.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How is this overhyped hysteria?
> 
> 
> 
> It makes me nauseous just looking at it.
> 
> See what I did there?


They have been. The MS issue aside (which is matched, in my book, by the texture corruption and other issues in recent nVidia drivers, considering those were WHQL drivers... which are meant to be tested), there aren't any major problems with AMD's drivers.

It's not just me, *most* people aren't susceptible to MS...That's why I keep saying it's an overblown issue, you had people (Typically nVidia owners) saying how bad it was with only a handful of actual users complaining about it...You also didn't have as many people complaining about it until after the PCPer articles came out, people who like to rag on AMD were handed a great opportunity to do so and took full advantage of it to spread FUD as to how badly it actually affects users and how many can actually feel it...You didn't see tonnes of people complaining about the MS of a GTX 580 SLI (Or GTX 4*0, GTX 2**, 9800GTX, 8800GTX, etc) setup that often, same with CFX...Until the PCPer article. I'm not saying it wasn't there, but it certainly wasn't anywhere nearly as big of an issue as people made it out to be. (Much like the 314 drivers apparently killing cards, which seemed more and more like crap as time went on)

You can post graphs until the cows come home, but that doesn't change the fact that the microstutter wasn't that noticeable on my CFX HD4890s, nor was it that noticeable on Majin SSJ Eric's CFX HD7970s... Plus it's completely irrelevant given that the R9 290X has hardware frame pacing like Kepler, which is in fact one of the various reasons (beyond mere performance and easier availability) I'm considering going for an R9 290X (with a second later) instead of a second HD7950.
Quote:


> Originally Posted by *mcg75*
> 
> With all due respect Brutuz, Skyrim on any driver before the 13.2 beta was terrible on my 7970. Bad enough I hunted down the leaked version of 13.2 beta sent to Wasson that the AMD rep said not to use because it was incomplete.
> 
> 13.2 vs anything previous I used was night and day. With 13.2, you could spin around and the view was smooth. Before that, you'd get frame skips that felt like I was back playing console games. I thought I had it settled down using various other methods like RadeonPro but nothing came close to the actual fix in 13.2.
> 
> Not going to comment on anything else because other than that, I had no issues with AMD drivers. But that issue was 110% real and should not be downplayed because without Wasson's report, we may not have a fix to this day.


I've had it since launch and have been modding it since I was able to get good mods for it. With the mods, the vRAM on my GTX 470 would fill up and I'd get lag whenever I turned around as it loaded textures... It was far better on my HD7950, which I got before the 13.2 betas even came out, although they did improve the experience some. It certainly wasn't unplayable for me, and that's the thing with a lot of bugs, especially when they're more related to a particular game like that: some people get them way worse than others, if they get them at all. It definitely was an issue, or the 13.2 drivers wouldn't have been so important.


----------



## Moragg

Quote:


> Originally Posted by *th3illusiveman*
> 
> don't know when $500 stopped being alot of freaking money for a GPU...


Probably when Nvidia released the 780. Cos, like, if Nvidia is charging $650, it must be worth it.

I once saw an article about menu pricing, and GPU lineups work very similarly. There is a price "anchor", the big expensive item against which people judge the lesser goods. In this case the anchor was the Titan, and the "small" Titan next to it at a much lower cost seemed very good value by comparison.


----------



## $ilent

£400 before tax, so it's still gonna end up being the best part of £500 after tax. Yeah... no, AMD, I don't think so.


----------



## scyy

Quote:


> Originally Posted by *Kuivamaa*
> 
> No, I am specifically calling the stuttering issues on single solutions in select few games, minor because they are exactly that, minor issues that affect both camps from time to time. If you read my post I clearly state crossfire had true issues. I don't care about "fanboys/fanboyism" and I didn't call Shrout a fanboi.
> 
> _" It doesn't matter all that much who exactly invented FCAT, PCPer or nvidia. What does matter is this: The rest of the sites got it from nvidia which at a point either invented it or endorsed it. Now, this is a very specific type of hardware for a very specialized type of work duty. Chances that both pcper and nvidia worked independent one from another to make it, are NIL. They worked together, either co-developed or one showed it to the other before the rest of the sites had any knowledge of it. No way around this. What I believe? It was created by nvidia that needed a vocal advocate of it which they found in pcper. "_
> 
> http://www.overclock.net/t/1427828/bsn-state-of-4k/250#post_20834380
> 
> No conspiracy theories, just common sense. PCP and nvidia are partners.


Again, another person who has no idea of the history of what went down. Look at their initial reviews: they had the frame-capture setup before nvidia came to them. They were going through captured data one frame at a time and actually coloring in the frames by hand, in software, to show the microstutter. What nvidia brought forward was software that automatically processed the captured footage and generated the graphs showing the results.

Those are exactly conspiracy theories, and you are just showing how misinformed you are about the entire issue. Just because something benefited nvidia doesn't automatically mean they were behind every article about it. Go do a little research on the history of multi-card setups and maybe realize this is something that has been complained about for years. Do you really believe that reviewers who had noticed it and wanted a way to measure it wouldn't have done this on their own? Again, the only thing nvidia provided was the means to measure the captured data; everything else was already in place before FCAT was given to them in Feb/March.

Seriously, I know you seem to absolutely hate pcper, but try watching one of their podcasts once. They have plenty of really good things to say about AMD as well. You are making yourself look foolish by only looking at one side and making a sweeping claim of pcper being an nvidia partner, despite the fact that they get a large chunk of their advertising money from AMD and rarely have nvidia ads. That sure looks like an nvidia partner to me, you know. Or is that part of their plan to make it seem like they aren't nvidia partners? Grow up; not everyone is in collusion against AMD, and pointing out real issues doesn't make you an nvidia shill.
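As an aside, the number-crunching that FCAT automates is straightforward once per-frame times have been extracted from the captured video. A toy sketch of the idea (hypothetical numbers, not FCAT's actual code):

```python
# Toy frame-time analysis in the spirit of what FCAT-style tools automate.
# Illustrative only: the real pipeline extracts per-frame timing from
# colored overlay bars in captured video; here we start directly from
# frame times in milliseconds.

def stutter_metric(times_ms):
    """Mean absolute change between consecutive frame times.
    0 = perfectly paced; large values = visible microstutter,
    even when the average FPS looks fine."""
    deltas = [abs(b - a) for a, b in zip(times_ms, times_ms[1:])]
    return sum(deltas) / len(deltas)

# Two runs with identical average frame time (16.7 ms, ~60 FPS):
paced       = [16.7] * 10
alternating = [8.0, 25.4] * 5   # classic alternating multi-GPU pattern

for name, ts in [("paced", paced), ("alternating", alternating)]:
    avg = sum(ts) / len(ts)
    print(f"{name}: avg {avg:.1f} ms/frame, stutter {stutter_metric(ts):.1f} ms")
```

Both runs report the same average FPS; only the frame-to-frame deltas expose the stutter, which is exactly why FPS-only reviews missed it for so long.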


----------



## scyy

Quote:


> Originally Posted by *Brutuz*
> 
> They have been, the MS issue aside (Which is matched by the texture corruption and issues with recent nVidia drivers in my books considering they've been on WHQL drivers...Which are meant to be tested) there aren't any major problems with AMDs drivers.
> 
> It's not just me, *most* people aren't susceptible to MS...That's why I keep saying it's an overblown issue, you had people (Typically nVidia owners) saying how bad it was with only a handful of actual users complaining about it...You also didn't have as many people complaining about it until after the PCPer articles came out, people who like to rag on AMD were handed a great opportunity to do so and took full advantage of it to spread FUD as to how badly it actually affects users and how many can actually feel it...You didn't see tonnes of people complaining about the MS of a GTX 580 SLI (Or GTX 4*0, GTX 2**, 9800GTX, 8800GTX, etc) setup that often, same with CFX...Until the PCPer article. I'm not saying it wasn't there, but it certainly wasn't anywhere nearly as big of an issue as people made it out to be. (Much like the 314 drivers apparently killing cards, which seemed more and more like crap as time went on)
> 
> You can link post graphs until the cows come home, but that doesn't change the fact that the microstutter wasn't that noticable on my CFX HD4890s nor was it that noticable on Majin SSJ Eric's CFX HD7970s...Plus it's completely irrelevant given that the R9 290X has hardware frame pacing like Kepler and is in fact one of the various reasons (Beyond mere performance and easier availability) I'm considering going for an R9 290X (With a second later) instead of a second HD7950.
> I've had it since launch and have been modding it since I was able to get good mods for it, with the mods the vRAM on my GTX 470 would fill up and I'd get lag whenever I turned around as it loaded textures...It was far better on my HD7950 which I got before the 13.2 Betas even came out although they did improve the experience some, it certainly wasn't unplayable for me and that's the thing with a lot of bugs especially when they're more related to a game like that, some people get them way worse than others if they get them at all. It definitely was an issue or the 13.2 drivers wouldn't have been so important.


Did you really just compare the stupid rumors of the 314 drivers killing cards to microstutter, which has been commented on since the reintroduction of multi-GPU setups?

Different people perceive microstutter at different levels; everyone's eyes are different. But the fact of the matter is that testing across a wide range of systems and games showed it exists almost everywhere multi-GPU setups are used. Playing it down because you don't notice it as much doesn't mean others don't notice it much more, or that their complaints aren't valid because they are more susceptible to it.


----------



## raghu78

Quote:


> Originally Posted by *$ilent*
> 
> *No, I think £550 is too much for a GTX 780 and a 290X*, hence why I haven't bought a 780 and probably won't buy a 290X now, because it's too much. I don't think it's justifiable paying that much money for a graphics card.


Well said. The R9 290X needs to be around £450; that was the launch price of the HD 7970. Anything higher is indeed disappointing.
Quote:


> My title just shows my anger at AMD for having a 3 hour GPU presentation and using 2 hours 59 minutes of that talking about audio. *I wanted to know the 290X specs and its price, and they disappointed.*


Can't blame AMD for that. Those details were revealed to the press, who are under NDA till Oct 15th, when the embargo lifts and reviews go up.


----------



## Majin SSJ Eric

I blame AMD for it. There's no reason not to just go ahead and lift the NDA now. Seriously, there's nothing Nvidia can do in two weeks that's going to really harm the 290X's release...


----------



## scyy

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I blame AMD for it. There's no reason not to just go ahead and release the NDA now. Seriously, there's nothing Nvidia can do in two weeks that's going to really harm the 290X's release...


Pretty much. I just want to know the real specs and get some real benches. I wouldn't mind if it's faster than the 780, honestly; I've had almost half a year of owning the best mainstream GPUs money can buy (I guess you could consider Titan mainstream, but I just mean the regular graphics lines), and that's worth the price premium imo.


----------



## Clockster

Quote:


> Originally Posted by *$ilent*
> 
> No I think £550 is too much for a gtx 780 and a 290X, hence why I havent bought a 780 and probably wont buy a 290X now because its too much. I dont think its justifiable playing that much money for a graphics card.
> 
> My title just shows my anger at AMD for having a 3 hour GPU presentation and using 2 hours 59 minutes of that talking about audio. I wanted to know the 290X specs and its price, and they disappointed. Also whats wrong with my sig name?
> thank you, nice to know some people understand


Referring to both your title and this: "The NEW AMD Radeon Naming Scheme: 9970 is R2D2, 9950 is R2D2 LE, 9870 is R2D1, 9850 is R2D1 LE and so on."

No hard feelings








Quote:


> Originally Posted by *Sheyster*
> 
> You're even more of a Fanboi than he is! You have two R290X's in your sig rig already, with no reviews/benches/pricing to base that decision on.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Why not change your avatar photo to the AMD logo right now? That's all you need to complete your profile.


Um, lol. The reason I have them there is because I've already placed a back order for them with our local supplier.









Oh, and I've owned a Titan, a 780, 680 SLI, and 670 SLI... lol (still have a 670 for testing purposes)
Quote:


> Originally Posted by *Baghi*
> 
> He thinks you're biased towards Intel/NVIDIA just because you have them in your sig rig.


Has nothing to do with his sig rig... lol
I run Nvidia cards; I am far from a fanboy. I just wish people would stop assuming that AMD automatically has to price their cards lower.
So, for argument's sake: if the R9 290X is 15% faster than a 780 (no idea about performance yet) and it comes with 6-8 AAA games, do you think it should be priced lower than the 780? LOL


----------



## Vesku

Quote:


> Originally Posted by *$ilent*
> 
> No I think £550 is too much for a gtx 780 and a 290X, hence why I havent bought a 780 and probably wont buy a 290X now because its too much. I dont think its justifiable playing that much money for a graphics card.
> 
> My title just shows my anger at AMD for having a 3 hour GPU presentation and using 2 hours 59 minutes of that talking about audio. I wanted to know the 290X specs and its price, and they disappointed. Also whats wrong with my sig name?
> thank you, nice to know some people understand


Yes, not sure why AMD was so coy about 290X specs considering they will probably be revealed in 2 weeks anyway.

There may be a 290 (non-X) that fits your price cutoff; we'll just have to wait and see.


----------



## james8

Look, when Nvidia releases a TITAN II with a fully enabled GK110, AMD will inevitably put out a GHz Edition of this card using 6 GHz memory.


----------



## Ghoxt

Quote:


> Originally Posted by *xoleras*
> 
> I can't believe that to this day, people are bored enough to argue nvidia vs AMD for pages on end.


You underestimate the 'Power' of the Dark Side!


----------



## $ilent

The R2D2 name thing was a joke because the new naming scheme was confusing









I hope the 290 is cheap enough then!


----------



## Yvese

Quote:


> Originally Posted by *$ilent*
> 
> The r2d2 name thing was a joke because the new naming scheming was confusing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I hope the 290 is cheap enough then!


Agreed. I'm hoping the 290 matches 780 performance while the 290X trades blows with Titan, with the 290 at $499 and the 290X at $599. That would then push Nvidia into lowering 780 prices. That would be sweet.


----------



## provost

Checked in to see if there are any updates on this card... nope, people still hoping, praying and wishing... okie dokie...
My guess is this card is not ready for prime time yet, and that's why the delay in reviews, NDA, pre-order date, etc...
Either way, it does not instill a lot of confidence in AMD's ability to deliver after flip-flopping on the Oct 2nd pre-order date... I guess the live stream snafu was a good lead indicator of where things were headed with this release...
Well, good luck to AMD!


----------



## Forceman

What's the flip-flop on the October pre-order date (and it's the 3rd, right)?


----------



## provost

Quote:


> Originally Posted by *Forceman*
> 
> What's the flip-flop on the Oct pre-release date (and it's the 3rd, right)?


That's what I meant, Oct 3rd.
Who knows... pick a date from now until Christmas and you may get it right. The 15th may just be a paper launch, with the NDA lifted and pre-orders open. But who the heck knows at this point.


----------



## xoleras

Quote:


> Originally Posted by *provost*
> 
> checked in to see if any updates on this card...nope..people still, hoping, praying and wishing...okie dokie ..
> My guess is this card is not ready for prime time yet, and this is why the delay in reviews, NDA, pre-order date, etc. ...
> either way does not instill a lot of confidence in AMD's ability to deliver, after flip flopping on Oct 2nd pre -order date... I guess the live stream snafu was a good lead indicator of where things were headed with this release..
> well, good luck to AMD!


I wouldn't expect anything until the 15th October. That is when the NDA lifts, apparently.


----------



## Stay Puft

Quote:


> Originally Posted by *Forceman*
> 
> What's the flip-flop on the Oct pre-release date (and it's the 3rd, right)?


October 3rd, the 290X and the BF4 290X bundle will be available.

What I'm thinking:

290X - $599
290X/BF4 - $649.99


----------



## Clockster

Quote:


> Originally Posted by *Stay Puft*
> 
> October 3rd the 290X and BF4 290X bundle will be available
> 
> What im thinking
> 
> 290X - 599
> 290X/BF4 - 649.99


Yip, that's exactly what I'm expecting as well.
I spoke to the local suppliers and that's what they estimate too.


----------



## Sheyster

Quote:


> Originally Posted by *Stay Puft*
> 
> Its what people do. It's like Coke vs Pepsi... coke wins hands down


More like Coke vs. Ralphs or Vons store brand knock-off Cola...


----------



## Forceman

Quote:


> Originally Posted by *provost*
> 
> That's what I meant, Oct 3rd.
> Who knows..pick a date from now until Christmas and you may get it right. The 15th may just be a paper launch with NDA lifted and for pre-orders. But who the heck knows at this point.


I just saw the WCCF article about the possible change of the pre-order date, so I understand your comment now. Hadn't seen that before. Makes sense, really; selling a pre-order without releasing performance data is kinda iffy.
Quote:


> Originally Posted by *Stay Puft*
> 
> October 3rd the 290X and BF4 290X bundle will be available
> 
> What im thinking
> 
> 290X - 599
> 290X/BF4 - 649.99


I was thinking the pre-order would be the same price and BF4 would be a freebie as an incentive to pre-order. Kind of like how pre-orders of BF4 include the first DLC; I just assumed we were now moving to that model for hardware as well.


----------



## $ilent

I think the reveal is October 3rd; surely you can't actually buy it on the 3rd?


----------



## kpo6969

Quote:


> Originally Posted by *Clockster*
> 
> So its ok to pay that much for a 780 but not for a R9 290X? LOL I love your logic xD
> 
> Edit: Nvm I just saw your sig and title, that says it all


Quote:


> Originally Posted by *Sheyster*
> 
> You're even more of a Fanboi than he is! You have two R290X's in your sig rig already, with no reviews/benches/pricing to base that decision on.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Why not change your avatar photo to the AMD logo right now? That's all you need to complete your profile.


+1
System signature specs usually don't say a lot, but sometimes they can cause anything the poster says to be taken with a grain of salt.


----------



## Regent Square

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I don't know why anybody would ever drink a Pepsi?


Wut!? Pepsi was king until this Coke trend began to take hold. Pfft, people who dare to say anything about Pepsi being inferior to Coke!


----------



## Brutuz

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I blame AMD for it. There's no reason not to just go ahead and release the NDA now. Seriously, there's nothing Nvidia can do in two weeks that's going to really harm the 290X's release...


Not to mention, information about upcoming GPUs usually gets spread around Taiwan, meaning nVidia probably knows more about the R9 290X than we do... That's what made the lack of any leaks about Eyefinity so weird.


----------



## mcg75

I still find it strange that AMD could have done so much damage to Nvidia at this conference by announcing the price and full specs of the 290X, but chose not to. That seems to point to one of two things: either the card is not what the leaked benches made it out to be, or AMD is going to launch it at a higher price point than rumored.

If I were AMD, promoting the hell out of a $599 290X that beats Titan by 5-10% would have been the main focus of the conference, with Mantle and audio a distant second.


----------



## Stay Puft

Quote:


> Originally Posted by *Regent Square*
> 
> Wut!? Pepsi has been a king until this coke trend begun to take in place. Pffs pips who dare to say smth, about Pepsi being inferior to coke!


Pepsi Sucks


----------



## Vesku

Quote:


> Originally Posted by *mcg75*
> 
> I still find it strange that AMD could have done so much damage to Nvidia with this conference by announcing the price and full specs for the 290x but they choose not to. That's seems to point to one of two things. Either the card is not what it was made out to be by the leaked benches. Or AMD is going to launch it at a higher price point than rumored.
> 
> If I was AMD, promoting the hell out of the 290x at $599 and beats Titan by 5-10% would have been the main focus of the conference with Mantle and audio coming distant second.


I think it's mainly an attempt to build suspense, one which is failing to work on me and apparently on most enthusiasts. There is probably a secondary purpose, too: trying to get Nvidia to reveal any price-drop plans early.


----------



## $ilent

Quote:


> Originally Posted by *Vesku*
> 
> I think it's mainly an attempt to build up suspense, one which is failing to work with me and apparently most enthusiasts. There is probably a secondary purpose in terms of trying to get Nvidia to reveal any price drop plans they may have, early.


I think all it's done is irritate a lot of people. I'm sick of waiting.


----------



## szeged

Quote:


> Originally Posted by *$ilent*
> 
> I think all its done has irritated alot of people. Im sick of waiting.


amd is really good at getting the opposite of the results they desire.

attempt to hype people up, end up pissing everyone off. history repeats itself again.


----------



## hatlesschimp

When will GPUs get HDMI 2.0?


----------



## TheLAWNOOB

Quote:


> Originally Posted by *Stay Puft*
> 
> Pepsi Sucks


Damn right it sucks. I've always been a coca-cola fan myself.

Heck, even that "part-time" model friend of mine couldn't change my mind


----------



## szeged

we all know dr pepper is the best, you need at least quadfire pepsi or 4-way sli cola to even compete.


----------



## xoleras

Quote:


> Originally Posted by *szeged*
> 
> we all know dr pepper is the best, you need atleast quadfire pepsi or 4 way sli cola to even compete.


You sir, shut your dirty whore mouth immediately. Everyone knows Mt. Dew in 4-way SLI is the best.


----------



## Stay Puft

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Damn right it sucks. I've always been a coca-cola fan myself.
> 
> Heck, even that "part-time" model friend of mine couldn't change my mind


Plus size model?








Quote:


> Originally Posted by *szeged*
> 
> we all know dr pepper is the best, you need atleast quadfire pepsi or 4 way sli cola to even compete.


Dr. Pepper? Nasty! 7Up used to be great till they changed the formula


----------



## szeged

burn the heretic


----------



## TheLAWNOOB

Quote:


> Originally Posted by *Stay Puft*
> 
> Plus size model?
> 
> 
> 
> 
> 
> 
> 
> 
> Dr. Pepper? Nasty! 7Up used to be great till they changed the formula


Oh hell no. She is under 100









And she only does it part time, her other job requires her to be in a video if you know what I mean









Edit: How did this thread turn into a discussion of drinks again?


----------



## fateswarm

The frame pacing story teaches a great lesson. If you are a fan and not a user, it comes back like a boomerang: by hiding the faults of "your" company (not really yours; they get the money), you may end up harming it, since if you had exposed them, you might have gotten better products, sooner.


----------



## wermad

Where are all the leakers? In Soviet AMD, spy work for us









Seriously, a few leaked pics (now published) and some graphs, we need moar!


----------



## Kuivamaa

Quote:


> Originally Posted by *scyy*
> 
> Again, another person who has no idea the history of what went down. Look at their initial reviews, they had the frame capture setup before nvidia came to them. They were going through captured data one frame at a time and actually coloring in the frames by hand in software to show the microstutter. What nvidia was able to do was bring software forward that allowed software to automatically process the captured footage for them and create graphs for them which showed the results.
> 
> Those are exactly conspiracy theories and you are just showing how misinformed you are on the entire issue. Just because something benefited nvidia doesn't automatically mean they were behind every article about it. Go and do a little research on the history of multicard setups and maybe realize this is something that has been complained about for years. Do you really believe that reviewers and the like who had noticed it and wanted to find a way to measure it wouldn't have done this on their own? Again, the only thing nvidia gave was the means to measure captured data, everything else was already in place before fcat was given to them in Feb/March.
> 
> Seriously, I know you seem to absolutely hate pcper but try watching one of their podcasts once. They have plenty of really good things to say about AMD as well, you are making yourself look foolish by only looking at one side and making a sweeping claim of pcper being an nvidia partner despite the fact that they get a large chunk of their advertising money from AMD and rarely have nvidia ads. That sure looks like an nvidia partner to me you know. Or is that part of their plan to make it seem like they aren't nvidia partners? Grow up, not everyone is in collusion against AMD and pointing out real issues doesn't make you an nvidia shill.


Have you even read what I wrote? Have you even seen my post in this very thread showing that I am well aware Tech Report spotted frame latencies on single-GPU setups as early as 2011? Most people here are also well aware that HardOCP spent years talking about good numbers but a not-so-good visual experience on Crossfire setups vs SLI ones. Do you seriously believe that nvidia set FCAT up in the month or so after they saw what pcper did? This is their first publication on the issue.

http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-New-Graphics-Performance-Metric

Nvidia was obviously working on this technology years before. This kind of work takes a lot of time, funds and research, and I do not believe in this type of coincidence: that a tech site which had never before spoken in detail about multi-GPU issues already had it all figured out. "I've stayed quiet about it simply because we were working on quite a few different angles on our side and I wasn't ready to share results" was Shrout's poor explanation. So there are only two cases. Both PCPer and nvidia developing this method independently is totally out of the question. It was either PCPer inventing it and reaching out to nvidia, or nvidia inventing it and reaching out to PCPer. Make your choice; they are partners either way. No conspiracy, just common sense.


----------



## fateswarm

There are no saints in this world. Say NVIDIA saw a benefit; so what? It's not that evil. It's actually very positive in the end for both, since even if NVIDIA gets some publicity out of it (*very limited* publicity, because let's face it, only us nerds notice these things), it will last for what, two months? AMD has already fixed it, or is fixing it.


----------



## Nonehxc

Who cares? We came here to see a scarlet woman with big cards in CF, a fine lady entertainment for our sad and hungry eyes. That's the only thing we're interested in. Bonus pay if she shows us her pci-e slot being fully crossfired and no CF connector on her back.









Cherry Coke sponsors this Nightclub.


----------



## wermad

ATI ftw!:



Purrtttyyy gurl


----------



## wermad

Quote:


> Originally Posted by *szeged*
> 
> the best part is the gpu, i almost didnt notice the girl.


There's silicone there!?!?!? Hardware, I mean... Go fig.


----------



## wermad

Yeah baby! Groovy!


----------



## wermad

Googled "amd models" and little if anything came up. "nvidia models" got me Katrina (bombshell). Typed "amd girls" and finally got some green/red team pics (AMD logo and CPU stuff is green btw, fellas):


----------



## wermad

I think we have found neutral ground for both camps


----------



## nvidiaftw12

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> The only good news about the release of these cards is the following:
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131483
> 
> $599


Quote:


> Originally Posted by *Zealon*
> 
> I just noticed these cards dropped in price, so I am seriously considering getting one and water cooling it. I'm on the fence about it though because I can either get the 7990 or wait till Q2 2014 and see what Nvidia has to offer for the Maxwell architecture.


Quote:


> Originally Posted by *BiG StroOnZ*
> 
> Yup, I'll be grabbing a 7990 on Black Friday, if I can come across a Devil 13 that would just be perfect. Except, looking at the price drops right now 7970's and 7950's are super cheap right now (can only imagine come Black Friday).


Quote:


> Originally Posted by *Zealon*
> 
> I think I'll go the same route and pick up a 7990 on black friday unless the 290X can actually impress me come late October. I thought about picking up a used 7970 later on for some nice tri-fire.


Don't. I have one and it has bad, bad coil whine. I RMA'd it and the replacement sounds exactly the same. If you really want one, I have one I'd be happy to sell.


----------



## $ilent

Quote:


> Originally Posted by *nvidiaftw12*
> 
> Don't. I have one and it has bad, bad coil whine. I RMA'd it and the replacement sounds exactly the same. If you really want one, *I have one I'd be happy to sell*.


Lol, you're really 'selling' it to potential buyers


----------



## 2010rig

Quote:


> Originally Posted by *Kuivamaa*
> 
> Have you even read what I wrote? Have you even seen my post on this very thread that shows I am well aware that Techreport has spotted frame latencies on single GPU setups even as early as 2011? Most people in here are well aware that HardOCP was talking about good results but not so good visual experience on crossfire setups vs SLI ones for years too. Do you seriously believe that nvidia set FCAT up in 1 month or so after they saw what pcper did? This is their first publication on the issue.
> 
> http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-New-Graphics-Performance-Metric
> 
> Nvidia was obviously working on this technology years before. This kind of work takes alot of time,funds and research and I do not believe in this type of coincidence, that a techsite, that haven't spoken in detail of multigpu issues ever before, had it all figured already. " I've stayed quiet about it simply because we were working on quite a few different angles on our side and I wasn't ready to share results." was Shrout's poor explanation. So there are only two cases. Both PCPer and nvidia developing this method independently is totally out of the question. It was either PCPer inventing it and reaching out to nvidia or nvidia inventing it and reaching out to PCPer. Make your choice, they are partners either way. No conspiracy, just common sense.


Let's entertain the idea of your conspiracy theory.

What difference does it make? The issue is very real, and cannot be denied.

If it was a made-up issue, nothing would have changed when the Frame Pacing drivers were released. But you can see clearly in the videos and the charts that CF is MUCH MUCH smoother now than it was prior to August 1st.

So be happy that PcPer & TR exposed these issues, AMD users who use CF benefit in the end, and we have much more competitive solutions. I really don't see the problem here.

Why are you guys so adamant in SHOOTING the messenger, yet, just as adamant at downplaying the issues?

If you don't like PCPER, what about this?
http://www.anandtech.com/show/7195/amd-frame-pacing-explorer-cat138/2

I can't believe this whole issue is still up for debate. I also don't understand why so many people literally hate PCPER & NVIDIA ( though it's never been proven ) for exposing very real issues that before August 1st, were NOT fixed yet.
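For what it's worth, the charts those sites publish mostly boil down to frame-time percentiles: sort the per-frame render times and look at the tail, since a healthy average FPS can hide ugly spikes. Here's a rough sketch of that kind of metric; all the numbers are invented for illustration, not measurements from any card:

```python
def percentile_frame_time(frame_times_ms, pct):
    """Frame time (ms) at the given percentile of a run, nearest-rank style."""
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return ordered[idx]

# Two made-up runs with the same 20 ms average frame time...
steady  = [20.0] * 100                 # perfectly even pacing
stutter = [15.0] * 90 + [65.0] * 10    # same mean, spiky tail

# ...but very different 99th-percentile frame times.
print(percentile_frame_time(steady, 99))
print(percentile_frame_time(stutter, 99))
```

Both runs average 50 FPS, yet only the second one stutters, which is exactly why average-FPS bar charts missed the CF problem for so long.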


----------



## wermad

Quote:


> Originally Posted by *amd655*
> 
> 
> 
> Not the AMD we know of... unless AMD are now a car tuning company


Great eye! Didn't realize that myself. Meh, it still says AMD and she's a beauty; I say leave her there for now.

Btw:



Bought a couple of 7990s, got dibs on pic w/ girls. Pretty much all he got....


----------



## dman811

Quote:


> Originally Posted by *wstanci3*
> 
> Yeah, "mostly" no change. I just think I'm going to delete my search history. Better that way.


What has been seen cannot be unseen.

I am thinking the R9 280X in CF will just tear it up. I might just get some. I would just have to upgrade my CPU, MB, and possibly PSU first.


----------



## TooBAMF

Hitting a VRAM wall won't automatically and permanently cut frame rate in half. It causes momentary drops when textures need to be swapped in and out. This can be interpreted as stuttering or pausing, but it doesn't cause a constant drop.

The game will run and, depending on how big the shortfall is, seem OK most of the time. Someone who claims a game runs fine on a low-VRAM card probably doesn't notice or care about the momentary drops. That's their prerogative; however, just because the average frame rate is high doesn't mean the game isn't negatively affected by a VRAM wall.

Edit: didn't realize how far back the VRAM discussion was...


----------



## Durquavian

Quote:


> Originally Posted by *TooBAMF*
> 
> Hitting a VRAM wall won't automatically and permanently cut frame rate in half. It causes momentary drops when textures need to be swapped in and out. This can be interpreted as stuttering or pausing, but it doesn't cause a constant drop.
> 
> The game will run, and depending on how much of a short fall there is, seem OK most of the time. Someone that claims a game runs fine on a low VRAM card probably doesn't notice or care about momentary drops. That's their preference, however, just because the average frame rate is high doesn't mean the game isn't negatively affected by a VRAM wall.
> 
> Edit: didn't realize how far back the VRAM discussion was...


But then how does offloading to system RAM affect that? Skyrim will only use 3.2 GB of RAM, and I have used upwards of 7 GB of RAM for the game. Of course, at the time I was using a lot of high-res textures, but I've since reduced them to 2K and that usage is down to 5 GB, so about 2 GB for textures I guess. Honestly not sure what is in the RAM, be it textures, GPU offloading, or whatnot.


----------



## Kuivamaa

Quote:


> Originally Posted by *2010rig*
> 
> Let's entertain the idea of your conspiracy theory.
> 
> What difference does it make? The issue is very real, and cannot be denied.
> 
> If it was a made up issue, when the Frame Pacing drivers released, nothing would have changed. But you can see it clearly in the videos and the charts that CF is MUCH MUCH smoother now, than prior to August 1st.
> 
> So be happy that PcPer & NVIDIA exposed these issues, AMD users who use CF benefit in the end, and we have much more competitive solutions. I really don't see the problem here.
> 
> Why are you guys so adamant in SHOOTING the messenger, yet, just as adamant at downplaying the issues?
> 
> If you don't like PCPER, what about this?
> http://www.anandtech.com/show/7195/amd-frame-pacing-explorer-cat138/2
> 
> I can't believe this whole issue is still up for debate. I also don't understand why so many people literally hate PCPER & NVIDIA ( though it's never been proven ) for exposing very real issues that before August 1st, were NOT fixed yet.


Did you ever pay attention to my posts? Have I, in any shape or form, denied there was a real issue with Crossfire, at least out of the box? Are you done with the straw man? Can we proceed now? I do not hate Nvidia (you, on the other hand, obviously adore them); I have never bought a desktop AMD/ATi card in my life (I've only had two laptop GPUs from the red team, and I suspect an ATi CGA in a Hyundai XT my parents bought me 23-24 years ago). My criticism is purely focused on PCPer because I am a journalist myself. Most famous sites had stories on the subject last spring; they were all very critical of AMD, but only PCPer came to ridiculous conclusions (2 Radeons in xfire as good as one), plus a very iffy Sleeping Dogs video comparison (where they compared obviously different IQ), which raised an eyebrow from me in the first place. If you can't see how a site that had never ever been a part of the "inside the second/crossfire has the numbers but not smoothness" discussion came to prominence out of the blue, a couple of months before this whole FCAT story became mainstream, then I don't know what to tell you. PCPer is the case here, not Nvidia or Crossfire microstutter. I rest my case.


----------



## wermad

Quote:


> Originally Posted by *Kuivamaa*
> 
> Did you ever pay attention to my posts?Have I,in any shape of form denied there was a real issue with crossfire, at least out of the box? Are you done with the straw man?Can we proceed now? I do not hate nvidia (you on the other hand obviously adore them),I have never bought a desktop AMD/ATi card in my life (have only had two laptop gpus from red team and I suspect an ATi CGA in a huyndai XT my parents bought me 23-24 years ago). My criticism is purely focused on pcper because I am a journalist myself. Most famous sites had stories on the subject last spring, they were all very critical of AMD but only PCPer came to ridiculous conclusions (2 radeons in xfire as good as one), plus a very iffy Sleeping Dogs video comparison (where they compared obviously different IQ) which raised an eyebrow from me in the first place. If you can't see how a site that had never ever ever been a part of the "inside the second/crossfire has the numbers but not smoothness" discussion ,came to prominence out of the blue,a couple of months before this whole FCAT story became mainstream, then I don't know what. PCPer is the case here,not nvidia or crossfire microsttuter. I rest my case.


A lot of issues with hardware are found not by reviewers but by customers. Customers who may be complete noobs or long-time enthusiasts. In the end, voicing your concern or displeasure is very common when a company is not willing to accept a product failure and do something about it. Nvidia is guilty of this too (GTX 570 and 590 go boom). The truth will eventually surface whether you like it or not, even if it's reported by someone you despise. You don't have to agree with everyone's opinion (but do accept the truth); as the journalist you claim to be, don't you already know this? I rest my case









edit: some ATi love:



Who dat in the awning? Lol...


----------



## wermad

Interesting news:
Quote:


> Radeon R9 290X Launch Date Revealed?
> by
> btarunr
> Sunday, September 29th 2013 Discuss (25 Comments)
> At a local press gathering in Turkey, AMD revealed the launch date of its next high-end product, the Radeon R9 290X. The press NDA over the card will end on October 15, 2013, at 12:01 AM EST (Berlin time). This NDA expiry time was disclosed in the slides that AMD showed the press. There's always the possibility that the NDA expiry date doesn't match market availability. It could merely mark NDA expiry for the press to post reviews of their AMD reference-design R9 290X (the only kind of R9 290X that will be initially available). AMD is handling R9 290X launch much in the same way NVIDIA handled the GTX TITAN, in that there won't be non-reference design cards in the foreseeable future, with the exception of cards with factory-fitted full-coverage water-blocks.


http://www.techpowerup.com/191704/radeon-r9-290x-launch-date-revealed.html

Could be NDA lift and launch on the same day (October 15th). If so, it could be a smart move, not wasting time if the card proves to be a big success (reviewers will release their findings as the NDA lifts). On the other hand, it could be panic to get it to market asap because it may not deliver on the hype. With the supposedly low stock of the BF4 bundle, pre-orders have stopped; seems a bit panicky imho. I'm really hoping this guy is a winner (another excuse to "upgrade").

To mods: I hope this is on-topic enough to not warrant a post delete. The article is too short to really warrant another thread.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Kuivamaa*
> 
> Did you ever pay attention to my posts?Have I,in any shape of form denied there was a real issue with crossfire, at least out of the box? Are you done with the straw man?Can we proceed now? I do not hate nvidia (you on the other hand obviously adore them),I have never bought a desktop AMD/ATi card in my life (have only had two laptop gpus from red team and I suspect an ATi CGA in a huyndai XT my parents bought me 23-24 years ago). My criticism is purely focused on pcper because I am a journalist myself. Most famous sites had stories on the subject last spring, they were all very critical of AMD *but only PCPer came to ridiculous conclusions (2 radeons in xfire as good as one), plus a very iffy Sleeping Dogs video comparison (where they compared obviously different IQ) which raised an eyebrow from me in the first place*. If you can't see how a site that had never ever ever been a part of the "inside the second/crossfire has the numbers but not smoothness" discussion ,came to prominence out of the blue,a couple of months before this whole FCAT story became mainstream, then I don't know what. PCPer is the case here,not nvidia or crossfire microsttuter. I rest my case.


This, so much this. After PCPer tried to pull that crap, I decided then and there I would never read that ridiculous website again (I had CF 7970s at the time, and I can tell you in no uncertain terms that the "2 video cards equals 1 video card experience" claim was pure horse bleep).


----------



## Sheyster

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> This, so much this. After PCPer tried to pull that crap I decided then and there I would never read that ridiculous website again (I had CF 7970's at the time and I can tell you in no uncertain terms that the "2 video cards equals 1 video card experience" was pure horse bleep).


I have no respect for that site either, FWIW...


----------



## wermad

There have been plenty of reviews where people will cry absurd and heresy. It's been published, it's done, it's over. If the author felt there was enough demand and compelling information to issue a retraction or correction, they would have done so. If you don't like it, don't read it. Not reading Ryan's reviews won't spell the end of pcper.com. Just like we like to pick sides (or none), we have our favorite review sites.

I hate how some never test MMG, how others have irrelevant graphs, or how some use older or too-new benchmarks (why not a little bit of both?), so I can 8i7ch about any site easily, but I don't let it drag on and on and on and *off-topic*.


----------



## 2010rig

I've owned ATI cards in the past such as the X1800XT, 2900XT....

I've also seen and experienced 5870s in CF first hand, where the "fix" was disabling the 2nd card in order not to have to deal with so many issues in some games. I went the NVIDIA route due to the simple fact that my primary use wasn't gaming, and AMD was NOT an option in Adobe Premiere at the time. I'm sure that's somehow my fault though, for not choosing a 5870 instead.

I buy what is best suited for my needs, tyvm.


----------



## bencher

Quote:


> Originally Posted by *2010rig*
> 
> I've owned ATI cards in the past such as the X1800XT, 2900XT....
> 
> I've also seen and experienced 5870's CF first hand, where the "fix" was disabling the 2nd card in order not to have to deal with so many issues in some games. After that experience is what really drove me to stay away from AMD for my next upgrade, that and the simple fact that my primary uses wasn't gaming, and AMD was NOT an option in Adobe Premiere at the time.
> 
> I buy what is best suited for my needs, tyvm.


I enjoy seeing you pretend to be neutral lol...

Be true to yourself.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *2010rig*
> 
> I've owned ATI cards in the past such as the X1800XT, 2900XT....
> 
> *I've also seen and experienced 5870's CF first hand, where the "fix" was disabling the 2nd card in order not to have to deal with so many issues in some games.* I went the NVIDIA route due to the simple fact that my primary uses wasn't gaming, and AMD was NOT an option in Adobe Premiere at the time. I'm sure that's somehow my fault though, for not choosing a 5870 instead.
> 
> I buy what is best suited for my needs, tyvm.


Yeah, I guess it's just too much to consider that maybe, just maybe, AMD improved things a little bit by the time they went from the 5870s to the 7970s? Ya think?


----------



## 2010rig

Quote:


> Originally Posted by *bencher*
> 
> I enjoy seeing you pretend to be neutral lol...
> 
> Be true to yourself.


This coming from you.









I am true to myself, have you seen what it says below my user name?

Did you see his passive aggressive post, where the implication was that I've never owned an AMD card in my life, you know what happens when people assume things....

I'm shocked you had nothing to say to his post, yet reply to mine.








Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Yeah, I guess its just too much to consider that maybe, just maybe, AMD improved things a little bit by the time they went from the 5870's to the 7970's? Ya think?


Oh, I'm not at all denying that AMD certainly improved things for *7970 CF 20 months into their life*, I'm very proud of AMD and their timely fixes.

I'll be happy to look up some posts where I've said that the 7990 is an amazing deal after its price drops, and after the Frame Pacing drivers were released. Those 2 facts cannot be denied.


----------



## Majin SSJ Eric

Haha, well I'm not sure what you would call me considering that my 7970's were the only AMD cards I've ever owned, meanwhile I've had two 560Ti's, two 580 Lightnings, and two Titans. I buy whatever's best at the time (if I have the cash) to get decent bench scores. I just hate seeing plain old fanboyism where everything one side does smells of roses while the other side shouldn't even build video cards because they suck so bad.


----------



## Stay Puft

Quote:


> Originally Posted by *szeged*
> 
> no problem with being a fanboy, if you like something then thats fine, but going around calling others names etc while youre doing exactly what youre trying to call out, makes you look like an idiot (not you)


OCN needs a policy of automatic bannage of fanboys and anyone who prefers Pepsi over Coke


----------



## kot0005

Leaked benches

http://www.techpowerup.com/191768/radeon-r9-290x-clock-speeds-surface-benchmarked.html


----------



## kot0005

Quote:


> Originally Posted by *kot0005*
> 
> Leaked benches
> 
> http://www.techpowerup.com/191768/radeon-r9-290x-clock-speeds-surface-benchmarked.html


At 800 MHz core and 1150 MHz memory, it's close to a Titan. I wonder how it performs at 1.2 GHz; will it go that high?


----------



## dman811

It might not be a Titan killer, but it most definitely is happy to throw a few punches with it in the ring and even win a few rounds.


----------



## Ultracarpet

Quote:


> Originally Posted by *kot0005*
> 
> Leaked benches
> 
> http://www.techpowerup.com/191768/radeon-r9-290x-clock-speeds-surface-benchmarked.html


good god please be true... and $499


----------



## wermad

Quote:


> Originally Posted by *kot0005*
> 
> Leaked benches
> 
> http://www.techpowerup.com/191768/radeon-r9-290x-clock-speeds-surface-benchmarked.html


Woot woot!









edit: so no boost technology? Could have given it a slight edge imho. I'm sure they're going to launch a GHz Edition later on.


----------



## szeged

Guess the extra memory bandwidth helps; let's see some benches at other resolutions.


----------



## Forceman

Quote:


> Originally Posted by *kot0005*
> 
> At 800Mhz and 1150mhz memory speeds, its close to a titan, i wonder how it performs atb 1.2ghz, will it go that high?


If it's clocked at 800 I doubt it'll overclock to 1200 without some effort, although I guess that partly depends on the reason for the clock speed. Wonder why it's clocked so low though: TDP or just a chip limit?


----------



## LaBestiaHumana

Quote:


> Originally Posted by *kot0005*
> 
> Leaked benches
> 
> http://www.techpowerup.com/191768/radeon-r9-290x-clock-speeds-surface-benchmarked.html


Over 5GB vram usage on Skyrim?

The 290X did really well on those benchmarks. Nice to see AMD step up their game.


----------



## wermad

Quote:


> Originally Posted by *Ultracarpet*
> 
> good god please be true... and *$499*


Sign me up for quad-fire!

Lol, seriously, at $599, I could get three and then beg the wife to let me get a 4th on tax day









Anyone know or wanna guess if the Eyefinity frame-pacing fix was implemented in this run?


----------



## Clocknut

Quote:


> Originally Posted by *kot0005*
> 
> Leaked benches
> 
> http://www.techpowerup.com/191768/radeon-r9-290x-clock-speeds-surface-benchmarked.html


TechPowerUp has been wrong all the time. AMD already said it is over 300 GB/s.


----------



## kot0005

Quote:


> Originally Posted by *Forceman*
> 
> If it is clocked 800 I doubt it'll overclock to 1200 without some effort, although I guess that partly depends on the reason for the clockspeed. Wonder why it's clocked so low though, TDP or just a chip limit?


I think it's because of the 512-bit bus. Or it could be that, if Nvidia releases a 2,880-SP Titan Ultra, AMD can overclock the card and release it as an R9 295X?










Quote:


> Originally Posted by *Clocknut*
> 
> techpower up has been wrong alll the time. AMD already said it is over 300GB/s


But the article is saying that the card was validated in their GPU-Z.


----------



## amd655

o.0

290X looking sexy in the benchies, and frame latency is higher than TITAN, so that now stops the claim of hardware frame pacing lol.


----------



## wermad

Quote:


> Originally Posted by *Clocknut*
> 
> techpower up has been wrong alll the time. AMD already said it is over 300GB/s


Early ES and not finalized (maybe the driver too)??? 288 is pretty close to 300, so I feel this is somewhat true imho.
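The 288 GB/s figure falls straight out of the leaked clocks: peak GDDR5 bandwidth is just the bus width (in bytes) times the effective data rate. A quick sanity check, taking 4.5 GHz effective as the leaked ES clock and 5 GHz as the rumored final one:

```python
def peak_bandwidth_gbs(bus_width_bits, effective_rate_ghz):
    """Peak memory bandwidth in GB/s: (bus width / 8 bytes) * effective GT/s."""
    return bus_width_bits / 8 * effective_rate_ghz

print(peak_bandwidth_gbs(512, 4.5))  # leaked ES clocks -> 288 GB/s
print(peak_bandwidth_gbs(512, 5.0))  # rumored final clocks -> 320 GB/s, clearing 300
```

So both the leaked GPU-Z shot and AMD's ">300 GB/s" claim can be true at once if the ES memory was simply running below the final clock.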


----------



## fleetfeather

yeah but
Quote:


> Originally Posted by *Alatar*
> 
> These cards wont match highly OC'd 780s or Titans


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *amd655*
> 
> o.0
> 
> 290X looking sexy in the benchies, and frame latency is higher than TITAN, *so that now stops the claim of hardware frame pacing lol*.


Lol, no it doesn't. Just because its not as good as Titan doesn't mean it doesn't have frame pacing...


----------



## superericla

Quote:


> Originally Posted by *fleetfeather*
> 
> yeah but


Unless of course they can overclock similarly.


----------



## Baghi

They said >300 GB/s (more than three hundred), not <300.


----------



## Stay Puft

Quote:


> Originally Posted by *amd655*
> 
> o.0
> 
> 290X looking sexy in the benchies, and frame latency is higher than TITAN, so that now stops the claim of hardware frame pacing lol.


Here's how i look at it.

599 for a 290X that performs like a titan with some high frame latency is FINE BY ME


----------



## amd655

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Lol, no it doesn't. Just because its not as good as Titan doesn't mean it doesn't have frame pacing...


HW frame pacing.


----------



## Forceman

Quote:


> Originally Posted by *amd655*
> 
> HW frame pacing.


I don't think AMD ever actually said it had hardware frame pacing, it was just implied/interpreted from one of the slides. Unless I missed something in the announcement.

But even if it did, would it make any difference for a single card? Thought it was a Crossfire fix.


----------



## amd655

Quote:


> Originally Posted by *Forceman*
> 
> I don't think AMD ever actually said it had hardware frame pacing, it was just implied/interpreted from one of the slides. Unless I missed something in the announcement.


A certain member kicked up a large fuss saying it would have hardware frame pacing, and I was stating that he/she was just talking rubbish with no proof.

If that is indeed hardware frame pacing, it is really bad.

Still, the card is looking great in numbers and cost


----------



## kot0005

Quote:


> Originally Posted by *Stay Puft*
> 
> 599 for a 290X


Hopefully. I need a new GPU; looking towards an EVGA 780 Classified or an AMD 290X.


----------



## wermad

Quote:


> Originally Posted by *Stay Puft*
> 
> Here's how i look at it.
> 
> 599 for a 290X that performs like a titan with some high frame latency is FINE BY ME


I look forward to your quad-fire results


----------



## Stay Puft

Quote:


> Originally Posted by *wermad*
> 
> I look forward to your quad-fire results


Looking forward to dethroning Joa as the king of Valley


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Forceman*
> 
> I don't think AMD ever actually said it had hardware frame pacing, it was just implied/interpreted from one of the slides. Unless I missed something in the announcement.
> 
> But even if it did, would it make any difference for a single card? Thought it was a Crossfire fix.


Agreed. If this is a single card then the frame pacing becomes a moot point...


----------



## Pheozero

Needs 1920x1080 and 2560x1440.


----------



## Sheyster

Quote:


> Originally Posted by *$ilent*
> 
> Lol your really 'selling' it to potential buyers


At least he's honest about it. +Rep to him.

I owned a 5970 for a week. Never going to own a dual GPU AMD card ever again.


----------



## EastCoast

All they need to do is:
1. Create a 3D print of Heisenberg on the fan shroud.
2. Use some Breaking Bad box cover art.
3. ????
4. Profit!!!!


----------



## bencher

Quote:


> Originally Posted by *dman811*
> 
> Don't worry, he is just acting out because he is about to evolve into his true form.



















I nearly died laughing.

Hmm, so now we know it easily matches a Titan.


----------



## maarten12100

Quote:


> Originally Posted by *james8*
> 
> Look, when Nvidia release TITAN II with a fully enabled GK110, AMD will inevitably put out a GHz edition of this using 6 GHz memory


A Titan ultra is only one cluster extra and it can't compete on price due to Nvidia's big die.
Quote:


> Originally Posted by *$ilent*
> 
> The r2d2 name thing was a joke because the new naming scheming was confusing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I hope the 290 is cheap enough then!


Agreed
Quote:


> Originally Posted by *provost*
> 
> That's what I meant, Oct 3rd.
> Who knows..pick a date from now until Christmas and you may get it right. The 15th may just be a paper launch with NDA lifted and for pre-orders. But who the heck knows at this point.


What a lot of people don't figure is that any company works with NDAs


----------



## Blindsay

So the 290X is looking pretty good, but at the same time I can't help but feel meh. It basically just ties a card like my 780 Classified, which is great for someone buying a new card because it's cheaper, but for someone looking to upgrade there's nothing. It's like they finally just caught up rather than taking a step forward. When is Nvidia due to have their next gen out?


----------



## maarten12100

Quote:


> Originally Posted by *2010rig*
> 
> Let's entertain the idea of your conspiracy theory.
> 
> What difference does it make? The issue is very real, and cannot be denied.
> 
> If it was a made up issue, when the Frame Pacing drivers released, nothing would have changed. But you can see it clearly in the videos and the charts that CF is MUCH MUCH smoother now, than prior to August 1st.
> 
> So be happy that PcPer & TR exposed these issues, AMD users who use CF benefit in the end, and we have much more competitive solutions. I really don't see the problem here.
> 
> Why are you guys so adamant in SHOOTING the messenger, yet, just as adamant at downplaying the issues?
> 
> If you don't like PCPER, what about this?
> http://www.anandtech.com/show/7195/amd-frame-pacing-explorer-cat138/2
> 
> I can't believe this whole issue is still up for debate. I also don't understand why so many people literally hate PCPER & NVIDIA ( though it's never been proven ) for exposing very real issues that before August 1st, were NOT fixed yet.


It will have hardware frame metering, and the second patch will fix it for normal cards.
Besides, Fermi had its own frame latency issues.

Why are you being so biased?


----------



## SKYMTL

Quote:


> Originally Posted by *maarten12100*
> 
> It will have hardware frame metering, and the second patch will fix it for normal cards.
> Besides, Fermi had its own frame latency issues.
> 
> Why are you being so biased?


What the heck is "hardware frame metering"? Can you explain what architectural enhancements AMD and NVIDIA could possibly make to their respective core architectures that would affect frame pacing?

Moving Crossfire over the primary PCI-E interface could potentially reduce latency times but under no circumstance is that frame metering, nor will this move eliminate all problems.

People don't seem to understand that for frame pacing to work, the driver and the core have to work together. AMD failed at the first point regardless of what their architecture can achieve from a hardware perspective. Everything else is marketing / PR gobbledygook.
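To make the driver-side half concrete: the pacing part is conceptually just holding back presents so frames go out on an even cadence instead of whenever a GPU happens to finish. A toy sketch of the idea, emphatically not AMD's or NVIDIA's actual implementation:

```python
def pace_presents(render_done_ms, target_interval_ms):
    """Delay each present so consecutive frames are at least
    target_interval_ms apart (and never shown before they finish rendering)."""
    presents = []
    for done in render_done_ms:
        earliest = presents[-1] + target_interval_ms if presents else done
        presents.append(max(done, earliest))
    return presents

# AFR-style bursty completions: the two GPUs finish frames in close pairs.
done = [0.0, 4.0, 33.0, 37.0, 66.0, 70.0]
paced = pace_presents(done, 16.7)
print(paced)  # presents are spread out to an even ~16.7 ms cadence
```

The hard part SKYMTL describes is everything around this loop: measuring completion times, picking the target interval, and keeping the render queue fed, all of which needs the driver stack and the hardware cooperating rather than either one alone.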


----------



## jomama22

Quote:


> Originally Posted by *SKYMTL*
> 
> What the heck is "hardware frame metering"? Can you explain what architectural enhancements AMD and NVIDIA could possibly make to their respective core architectures that would affect frame pacing?
> 
> Moving Crossfire over the primary PCI-E interface could potentially reduce latency times but under no circumstance is that frame metering, nor will this move eliminate all problems.
> 
> People don't seem to understand that for frame pacing to work, the driver and core have to work together. AMD failed at the first point regardless of what their architecture can achieve from a hardware perspective. Everything else is marketing / PR goobleygack.


I think the reason people talk about such a thing is that Nvidia claimed hardware-based frame metering in their promo for the 690.

http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-690

Top center states it uses "hardware based frame metering...", so if it's just marketing spin, I wouldn't exactly rest it all on AMD's shoulders.


----------



## Mainian

I'm kinda happy I picked up a 7970 Vapor-X during a nice summer sale. I'm actually really disappointed by the hardware progression of the new GPUs (although it honestly makes me pretty happy with my purchase).


----------



## Nonehxc

Quote:


> Originally Posted by *Stay Puft*
> 
> Here's how i look at it.
> 
> 599 for a 290X that performs like a titan with some high frame latency is FINE BY ME


How much more latency than Titan? Do you have that info? Pls? Titan is a beast in frame metering; 5 ms more on average would be nothing. It would be idiotic to fix a problem at the end of Tahiti's life only to reproduce it on Hawaii, which is (supposedly) GCN+1.

Lol, I was thinking about holding off this gen and instead going for a 2nd really low-priced 7950 until Maxwell, then deciding between Maxwell and Volcanic Islands. But a TOTALLY NOT TITAN that performs like a Titan at a TOTALLY NOT TITAN PRICE is an iced dark chocolate cake. TASTY.


----------



## Penryn

This thread, as far as I am aware, is for the discussion of the R9 290X. Let's keep it that way. Discussion of fanboyism, PCPer, etc. has its own places, and they are not this thread.

Thank you~


----------



## SKYMTL

Quote:


> Originally Posted by *jomama22*
> 
> I think the reason people talk about such a thing is because Nvidia has stated it uses hardware-based frame pacing in its promo for the 690.
> 
> http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-690
> 
> Top center states it uses "hardware based frame metering..." so if it's just marketing spin, I wouldn't exactly rest it all on AMD's shoulders.


It is marketing spin as far as I am concerned.

For frame metering to work on the R9 or NVIDIA cards for that matter, there needs to be close harmony between the software driver stack and the hardware. It's not a one-way street. As AMD has demonstrated so effectively, a driver update CAN indeed fix frame pacing, regardless of the underlying hardware. With that in mind, NVIDIA's position of "hardware-based" solutions strikes me as well-placed PR above all else, especially considering some of NVIDIA's own issues in this field have been fixed by driver updates rather than new hardware.


----------



## Baghi

Report: Radeon R9-290X Won't Need CFX Bridge, to be $600
Quote:


> The AMD Radeon R9-290X is expected to hit retail starting October 15 for a rather surprising price. Numerous sources, including Softpedia, indicate that the card will cost just $599.


Note the word "just". lol

What has happened to tom's? They're always late to the party.


----------



## Roaches

Quote:


> Originally Posted by *Baghi*
> 
> Report: Radeon R9-290X Won't Need CFX Bridge, to be $600
> Note the word "just". lol
> 
> What has happened to tom's? They're always late to the party.


Tom's Hardware is always a few days to several weeks late when it comes to tech news... really not a reliable place for the latest tech news, whatever the merits of their other articles, benches, reviews, etc. The site is no longer worth bookmarking.


----------



## 2010rig

Quote:


> Originally Posted by *SKYMTL*
> 
> It is marketing spin as far as I am concerned.
> 
> For frame metering to work on the R9 or NVIDIA cards for that matter, there needs to be close harmony between the software driver stack and the hardware. It's not a one-way street. As AMD has demonstrated so effectively, a driver update CAN indeed fix frame pacing, regardless of the underlying hardware. With that in mind, NVIDIA's position of "hardware-based" solutions strikes me as well-placed PR above all else, especially considering some of NVIDIA's own issues in this field have been fixed by driver updates rather than new hardware.


That's very interesting.
Quote:


> Combining the power of two Kepler GPUs, the GeForce GTX 690 is the fastest graphics card ever built. The GTX 690 uses hardware based frame metering for smooth, consistent frame rates across both GPUs. A meticulously designed board with dual vapor chamber coolers ensure quiet operation.


http://www.nvidia.in/object/geforce-gtx-690-in.html
Quote:


> In fact, in a bit of a shocking revelation, Petersen told us Nvidia has "lots of hardware" in its GPUs aimed at trying to fix multi-GPU stuttering. The basic technology, known as frame metering, dynamically tracks the average interval between frames. Those frames that show up "early" are delayed slightly-in other words, the GPU doesn't flip to a new buffer immediately-in order to ensure a more even pace of frames presented for display.


http://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking/11

If they claim to have hardware based frame metering, and they don't, wouldn't that be false advertising?

I guess that's not something you can prove one way or the other...
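For what it's worth, the frame-metering idea in that TechReport quote (track the average interval between frames, delay frames that show up "early") can be sketched roughly like this. This is a hypothetical illustration of the described technique, not NVIDIA's actual logic:

```python
# Hypothetical sketch of "frame metering" as TechReport describes it: track
# the average interval between frames and delay frames that show up "early"
# so the presented cadence stays even. Illustrative only, not NVIDIA's code.

class FrameMeter:
    def __init__(self, smoothing=0.9):
        self.avg_interval = None   # running average frame interval (ms)
        self.last_present = None   # timestamp of the last presented frame
        self.smoothing = smoothing

    def present_time(self, frame_ready):
        """Return when a frame that became ready at `frame_ready` should flip."""
        if self.last_present is None:
            self.last_present = frame_ready
            return frame_ready
        interval = frame_ready - self.last_present
        if self.avg_interval is None:
            self.avg_interval = interval
        else:
            self.avg_interval = (self.smoothing * self.avg_interval
                                 + (1 - self.smoothing) * interval)
        # An early frame is held back until the average cadence says to flip.
        target = max(frame_ready, self.last_present + self.avg_interval)
        self.last_present = target
        return target
```

So a frame arriving 1 ms after the previous one, when the average pace is ~16 ms, would be held back rather than flipped immediately.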


----------



## Kuivamaa

But is that something present only in the 690, or is it in every SLI-capable GeForce out there? AMD's slide insinuates that the 290X will have some sort of hardware-based solution as well.


----------



## maarten12100

There is just an extra chip that ensures frames are put out in the correct order and are complete; it also skips a frame that takes too long (which lowers performance a bit).
We can't really say Nvidia does better than AMD here, or we can start throwing around the frame-time benchmarks of Fermi cards. I, for one, am excited for this card.


----------



## sugarhell

If I remember correctly, even a hardware solution for frame metering needs software 'optimizations' based on the engine. If I remember, Nvidia was bad on GRID 2.


----------



## wermad

I hope PCIe CrossFire helps with the issue of Eyefinity having to communicate everything through the primary card. I still can't understand the advantage of plugging all your monitors into one GPU, whereas Nvidia allows you to spread them across each GPU. A notable disadvantage is the pairing of different connection interfaces, which causes dreaded screen tearing.

Two weeks left until judgement day.


----------



## fateswarm

Hold your horses. There is a real possibility they pull an NVIDIA. 290X $99999 and 290 being reasonable.


----------



## Brutuz

Quote:


> Originally Posted by *TooBAMF*
> 
> Hitting a VRAM wall won't automatically and permanently cut frame rate in half. It causes momentary drops when textures need to be swapped in and out. This can be interpreted as stuttering or pausing, but it doesn't cause a constant drop.
> 
> The game will run, and depending on how much of a short fall there is, seem OK most of the time. Someone that claims a game runs fine on a low VRAM card probably doesn't notice or care about momentary drops. That's their preference, however, just because the average frame rate is high doesn't mean the game isn't negatively affected by a VRAM wall.
> 
> Edit: didn't realize how far back the VRAM discussion was...


This. It happened on my GTX 470 in Skyrim: if I turned around too quickly (even at a moderate pace) it'd pause, and with a performance monitor running I could watch it load textures and the like as fast as it could when I ran it modded. The HD 7950 fixed that due to more vRAM, and even then I'm pretty sure I could go further if I went for Eyefinity.
Quote:


> Originally Posted by *2010rig*
> 
> I've owned ATI cards in the past such as the X1800XT, 2900XT....
> 
> I've also seen and experienced 5870's CF first hand, where the "fix" was disabling the 2nd card in order not to have to deal with so many issues in some games. I went the NVIDIA route due to the simple fact that my primary uses wasn't gaming, and AMD was NOT an option in Adobe Premiere at the time. I'm sure that's somehow my fault though, for not choosing a 5870 instead.
> 
> I buy what is best suited for my needs, tyvm.


No, it's not anything to do with what you picked...You picked nVidia because of CUDA and Adobe, fair enough.

It's the fact you keep acting like this problem is a massive one despite plenty of CFX owners telling you it didn't affect everyone, then you act as though we're saying it's a complete non-issue. It's overblown, and what PCPer was saying is not indicative of real-world results, unless you're going to tell me that a single HD 4890 can nearly keep up with a GTX 470, because the difference between my CFX HD 4890s and the GTX 470 wasn't that great. I'm not even comparing FPS here, just how it felt, because I went to a GTX 275 in between the two and it was a heck of a lot slower than either of them. Anyway, enough on this topic; it's well established by now that it's subjective whether you can feel it or not.
Quote:


> Originally Posted by *2010rig*
> 
> If they claim to have hardware based frame metering, and they don't, wouldn't that be false advertising?


Most likely related to the output part of the GPU: instead of putting frames out as it gets them (or whatever they were doing beforehand), it probably times them as closely as possible to the refresh rate of the monitor instead.


----------



## Regent Square

No way in HELL will the 290X cost $500. The price settles at $600, which is very reasonable.

Still out of my budget.

I do hope they include the Never Settle bundle with it, just to equalize it more or less to $550, the money I can afford to spend on a card.


----------



## amd655

290X for $400 and I'm sold.

Make it happen.


----------



## fateswarm

Various reviews have established that slight VRAM congestion doesn't drop performance like a rock. Most of the time it's, roughly, "you lose 0.5% performance for every extra 100MB of VRAM needed" or something in that vein. Of course, I'd personally still go for at least 40% more than usually needed, since being 'just' fine is bound to be outdated too soon.


----------



## youra6

Quote:


> Originally Posted by *amd655*
> 
> 290x for 400 i'm sold.
> 
> Make it happen.


Give me your 780s for $300 each and I'll make it happen.

$400 will never happen, considering the 280X sells for $299. The 290 may go for $399-499, but not the 290X.


----------



## wermad

Crossing fingers for $599, and I wouldn't mind offloading my Titan LEs. The SC versions seem to sell quicker; glad I spent that extra $10 for the SC version.

But before they go, I need some #s... especially Eyefinity!


----------



## szeged

might sell my one titan SC i use for my web browser/mmo machine and try a 290x in its place, depending on how bad the prices tank on them on ebay/marketplace...if they get low enough i might just pick up another titan and call it a day for that rig


----------



## szeged

double post because internet is being sucky. zzz


----------



## fleetfeather

Quote:


> Originally Posted by *szeged*
> 
> might sell my one titan SC i use for my web browser/mmo machine and try a 290x in its place, depending on how bad the prices tank on them on ebay/marketplace...if they get low enough i might just pick up another titan and call it a day for that rig


Really, the best way to go about this is you give me your Titan, and I'll buy you an R2-D2.


----------



## szeged

Quote:


> Originally Posted by *fleetfeather*
> 
> Really, The best way to go about this is you give me your Titan, and I'll buy you a R2D2.


haha

Depending on price, grab me two 290X's and I'll throw in a Hydro Copper and an EK Titan block.


----------



## Brutuz

Quote:


> Originally Posted by *fateswarm*
> 
> Slight congestion of VRAM has been established from various reviews that it doesn't drop performance like a rock. Most of the time it's like, roughly, "you lose 0.5% performance for every extra 100MB of VRAM needed" or something in that track. Of course, I'd personally still go for at least 40% more than usually needed since being 'just' fine, is bound to be outdated too soon.


Unless the game has to suddenly load new textures in, then you get a pause as it loads them in because it couldn't preload them.

3GB is perfectly fine for 1080p as it stands now, though.


----------



## M1kuTheAwesome

If I could sell someone my GTX 550 Ti's with badly resprayed blue rear whatevertheycallit and get sponsored after my birthday, I might go for a 270X. Actually, looking at the Firestrike scores, any of these would be better than my current setup. Too bad 7790's weren't around when I was building my rig in the first place. They seem to pack one hell of a punch for the money.


----------



## Baghi

Crossfired Radeon R9-290X:









Image courtesy of twitter.com/AMDRadeon


----------



## Sheyster

Quote:


> Originally Posted by *wermad*
> 
> Crossing fingers for $599, and I wouldn't mind offloading my Titan LEs. The SC versions seem to sell quicker; glad I spent that extra $10 for the SC version.
> 
> But before they go, I need some #s... especially Eyefinity!


IMO you're nuts to go from SLI to Xfire.

Fellow San Diegan here BTW.


----------



## fineyoung

These are two 8-pin power connectors, right?


----------



## Majin SSJ Eric

I thought they were 6 + 8?


----------



## SKYMTL

The image is purposely blurred for some reason.


----------



## Sheyster

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I thought they were 6 + 8?


Pretty sure that is the case. Hard to tell from the pic though.


----------



## youra6

Quote:


> Originally Posted by *Sheyster*
> 
> Pretty sure that is the case. Hard to tell from the pic though.


The one on the left is definitely an 8-pin. The one on the right could be a 6-pin, but it is hard to tell. With how much juice this thing needs, I'm not surprised.


----------



## gregoryH

Quote:


> Originally Posted by *fineyoung*
> 
> These are two 8 pins power connectors right ?


It's 8 + 6. Count the wires.


----------



## Testier

I am more concerned about the lack of a CrossFire bridge. I'd prefer not to upgrade to Ivy Bridge to run 2 of these in CF. PCIe 2.0 x8 is probably not enough.


----------



## SKYMTL

The amount of data passed over a Crossfire connector is infinitesimal compared to the bandwidth of an 8x PCI-E 2.0 interface.
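Some rough numbers behind that claim. The per-lane rate below is the PCIe 2.0 spec figure; the frame-traffic estimate is my own assumption about what AFR has to ship from the second GPU to the primary card for each displayed frame:

```python
# Back-of-the-envelope comparison: PCIe 2.0 x8 link bandwidth vs. the data a
# bridgeless CrossFire setup would need to move for display output. The AFR
# frame-traffic figure is an illustrative assumption, not a measured number.

PCIE2_LANE_GBPS = 0.5           # PCIe 2.0: ~500 MB/s usable per lane, per direction
pcie_x8 = 8 * PCIE2_LANE_GBPS   # 4.0 GB/s for an x8 link

# One 1080p frame, 32-bit color, 60 frames per second from the slave GPU:
frame_bytes = 1920 * 1080 * 4
frame_traffic = frame_bytes * 60 / 1e9   # ~0.5 GB/s

print(f"PCIe 2.0 x8:          {pcie_x8:.1f} GB/s")
print(f"AFR traffic @1080p60: {frame_traffic:.2f} GB/s")
```

Even under those assumptions, shipping finished frames over the bus uses well under half of an x8 2.0 link.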


----------



## Roaches

I wonder if their revision also increased the bandwidth between GPUs compared to CF bridges... I'd love to start seeing 2x the performance when a second GPU is added, instead of small performance gains from expensive dual-GPU setups...


----------



## jomama22

Quote:


> Originally Posted by *Roaches*
> 
> I wonder if their revision also increased the bandwidth between GPUs compared to CF bridges...I'd love to start seeing 2x the performance when a second GPU added instead of small performance gains from expensive dual GPU setups....


CFX scaling isn't dependent so much on the bandwidth needed for communication between the two. Scaling is more affected by the timing difference between frame production on each GPU: frame A may take longer than frame B to be churned out by the GPU, or vice versa.


----------



## fateswarm

Quote:


> Originally Posted by *Brutuz*
> 
> Unless the game has to suddenly load new textures in, then you get a *pause* as it loads them in because it couldn't preload them.


I have personal experience fixing this problem on the id Tech 3 engine. What you are describing there is very often an issue of having to read the *disk* for the missing assets, which is a very slow contraption that in many cases has to wake from sleep. That means if you make your engine pre-load the missing assets into RAM as a persistent cache, the pauses will go away.


----------



## Forceman

Quote:


> Originally Posted by *gregoryH*
> 
> It's 8 + 6. Count the wires.


Why does the top card appear to have the 6 pin connector on the left instead of the right like the bottom card?


----------



## FtW 420

Quote:


> Originally Posted by *Forceman*
> 
> Why does the top card appear to have the 6 pin connector on the left instead of the right like the bottom card?


Probably just the cables twisted & crossing, it's like an optical illusion in the pic & hard to tell. But they wouldn't have produced cards with the connectors swapped around both ways.


----------



## szeged

Quote:


> Originally Posted by *SKYMTL*
> 
> The image is purposely blurred for some reason.


I think that's called depth of field.


----------



## Forceman

Quote:


> Originally Posted by *FtW 420*
> 
> Probably just the cables twisted & crossing, it's like an optical illusion in the pic & hard to tell. But they wouldn't have produced cards with the connectors swapped around both ways.


The more I look at it, the more confusing it gets. The top wire in the top card's right wire bundle looks like it just keeps going out the top of the picture. And the actual plugs look like they are in different places on the card, judging by where the card turns from silver to black near the plugs. Must be a really weird angle.


----------



## numero-uno

Jesus Christ! Stop over analysing the photo.

I'm sure both cards are the same and have the same connectors.


----------



## TheLawIX

Still hoping Nvidia will lower pricing and AMD will hit a $499 price mark for the R9 290X. I'll buy 3 if that actually happens.


----------



## Brutuz

Quote:


> Originally Posted by *fateswarm*
> 
> I have personal experience in fixing this problem on the id tech 3 engine. What you are describing there is very often an issue with having to read the *disk* for the assets missing, which is a very slow contraption that in many cases has to go out of sleep. That means if you make your engine to pre-load in RAM in the form of a consistent cache the assets missing, then the pauses will go away.


With Skyrim it was definitely due to running out of vRAM from high resolution textures, and I run it off of my SSD to lower the bottleneck as much as I can. Above 1080p and 3GB is really the minimum acceptable vRAM unless you only play FPS and the like, honestly.


----------



## wermad

Final specs have been confirmed:

Quote:


> AMD Radeon R9 290X Specifications
> One of the stores even posted the final specifications (source):
> 
> GPU Codename - Hawaii
> GPU Process - 28nm
> Stream Processors - 2816
> TMUs - 176
> ROPs - 44
> Base Clock - 800 MHz
> Turbo Clock - 1000 MHz
> VRAM - 4GB GDDR5
> Memory Bus - 512-Bit
> Memory Clock - 1250 MHz (5GHz Effective)
> Memory Bandwidth - 320 GBps
> Power Configuration - 8+6 Pin
> PCB VRM - 5+1+1
> Die Size - 424mm2
> Transistors - 6 billion
> 3D Technology - DirectX 11.2, OpenGL 4.3 and Mantle
> Audio Technology - AMD TrueAudio


http://www.overclock.net/t/1430935/vc-r9-290x-final-specifications-confirmed

I'm sure the mods will merge these threads soon
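For what it's worth, the quoted 320 GBps checks out against the listed bus width and effective memory clock:

```python
# Sanity check on the quoted memory spec: bandwidth = bus width x effective rate.
bus_bits = 512
effective_rate_gtps = 5.0                       # 1250 MHz GDDR5, quad-pumped
bandwidth_gbs = bus_bits / 8 * effective_rate_gtps
print(bandwidth_gbs)  # 320.0, matching the listed 320 GBps
```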


----------



## dman811

Any word on the official price?


----------



## wermad

Quote:


> Originally Posted by *dman811*
> 
> Any word on the official price?


That's still mum. Everyone is still speculating, from $499 to $750 USD. My guess is $599-649, which would make it compete with both the Titan and the 780.

Looks like everyone jumped over to the new thread. I thought mods were merging additional threads with this one??? Oh well, let's just follow the crowd.


----------



## Baghi

*AMD Radeon R9 290X Battlefield 4 Edition Pre-Orders Show $1,145 / €840 Price* (link)


Spoiler: Warning: Spoiler!







Damn you AMD!


----------



## wermad

Quote:


> Originally Posted by *Baghi*
> 
> *AMD Radeon R9 290X Battlefield 4 Edition Pre-Orders Show $1,145 / €840 Price* (link)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Damn you AMD!


Some sites are taking pre-orders. This one seems way too gouged. So far, we've seen $729 w/ the bundle, so most likely it will fall around $650-700.

A UK site is taking a £99 refundable deposit. Upon availability, you pay the difference.

Looks like the crowd has moved on to this thread:

http://www.overclock.net/t/1430935/vc-amd-radeon-r9-290x-final-specifications-bf4-bundle-available-for-preorder


----------



## szeged

If it is $650 without the BF4 bundle, and it ends up not beating the 780, or just ties with it... that's a long wait to be disappointed.


----------



## DarkBlade6

Quote:


> Originally Posted by *szeged*
> 
> If it is $650 without the BF4 bundle, and it ends up not beating the 780, or just ties with it... *that's a long wait to be disappointed.*


Déjà vu?


----------



## maarten12100

Quote:


> Originally Posted by *Baghi*
> 
> *AMD Radeon R9 290X Battlefield 4 Edition Pre-Orders Show $1,145 / €840 Price* (link)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Damn you AMD!


Quote:


> Originally Posted by *wermad*
> 
> Some sites are taking pre-orders. This one seems way too gouged. So far, we've seen $729 w/ the bundle, so most likely it will fall around $650-700.
> 
> A UK site is taking a £99 refundable deposit. Upon availability, you pay the difference.
> 
> Looks like the crowd has moved on to this thread:
> 
> http://www.overclock.net/t/1430935/vc-amd-radeon-r9-290x-final-specifications-bf4-bundle-available-for-preorder


You shouldn't look at the pre-pre-order Europe price.


----------



## Newbie2009

http://www.overclockers.co.uk/showproduct.php?prodid=GX-015-AM&groupid=701&catid=56&subcat=1752

Interesting one. Not the price btw, note the ETA. 31/10/13


----------



## Ghoxt

It's notable that the box is dominated by the Battlefield 4 artwork, with the R9 290X as a footnote. Why?


----------



## Nintendo Maniac 64

Quote:


> Originally Posted by *Newbie2009*
> 
> http://www.overclockers.co.uk/showproduct.php?prodid=GX-015-AM&groupid=701&catid=56&subcat=1752
> 
> Interesting one. Not the price btw, note the ETA. *13-10-31*


Fixed that for you.


----------



## Tisca

Quote:


> Originally Posted by *Baghi*
> 
> *AMD Radeon R9 290X Battlefield 4 Edition Pre-Orders Show $1,145 / €840 Price* (link)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Damn you AMD!


Webhallen rep said they only priced it like that because the competitor Inet had that price. Real price to be determined later.

http://www.techpowerup.com/191909/radeon-r9-290x-priced-at-729-99-on-newegg-com.html
$730 + BF4


----------



## selk22

Quote:


> Originally Posted by *Tisca*
> 
> Webhallen rep said they only priced it like that because the competitor Inet had that price. Real price to be determined later.
> 
> http://www.techpowerup.com/191909/radeon-r9-290x-priced-at-729-99-on-newegg-com.html
> $730 + BF4


I'd say that's a pretty solid find! Good job.


----------



## Testier

Newegg.ca on my phone is showing the price. Hopefully it's wrong.


----------



## L36

It's normal for Newegg to mark up new hardware. If the actual price is supposed to be $700 as rumored, with no BF4, I would not be surprised if Newegg marked it up by up to $50.
What throws me off is: if AMD claims this slaps around the 780 and goes toe to toe with the Titan, why are there no numbers of any kind directly from AMD?

Preordering something you know no numbers for is just silly.


----------



## SKYMTL

Quote:


> Originally Posted by *L36*
> 
> Its normal for newegg to mark up new hardware. If the actual price supposed to be $700 as rumored with no BF3, i would not be surprised if newegg marked it up up to $50.
> That throws me off is if AMD claims this slaps around the 780 and goes toe to toe with the titan, why are there no numbers of any kind directly from AMD?
> 
> Preordering something you know no numbers of is just silly.


Three key things have not been mentioned to reviewers: power consumption, comparative performance results, and price. I've actually never attended a briefing where the first two items were omitted. Is it odd? Yes. Should we worry? Absolutely not.


----------



## provost

Actually, whenever I hear not to worry, I start worrying more. Lol.
You only have to worry when someone tells you not to; otherwise they wouldn't have to say it.


----------



## raghu78

Quote:


> Originally Posted by *SKYMTL*
> 
> Three key things have not been mentioned to reviewers: power consumption, comparative performance results and price. I've actually never attended a briefing where the two first items were omitted. Is it odd? Yes. Should we worry? Absolutely not.


You mean to say that in the slides under NDA there was no mention of the R9 290X's TDP, performance comparisons with the Nvidia Titan / GTX 780, or price? The first two points are not an issue, since you reviewers would have been given your GPU samples for preparing reviews and will have actual data on performance and power consumption. But without the price, how do you rate the product and give your recommendations? It doesn't make sense. Is AMD going to give you the price a day before the Oct 15th embargo lifts?


----------



## rusky1

I'm still hoping for $599.

Sent from my Galaxy Nexus using Tapatalk 2


----------



## Blindsay

So we can pre-order it today but we can't see an actual review for like 2 weeks? Call me crazy, but who would pre-order it before knowing what it can actually do?


----------



## dir_d

OK, I'm tired of the fluff from all around the web. I need some benches; so far the price is depressing and I know nothing.


----------



## Tisca

All I know is if the price seems too good to be true, it is.


----------



## Newbie2009

Quote:


> Originally Posted by *Blindsay*
> 
> So we can pre-order it today but we can't see an actual review for like 2 weeks? Call me crazy, but who would pre-order it before knowing what it can actually do?


Crazy
Quote:


> Originally Posted by *dir_d*
> 
> OK, I'm tired of the fluff from all around the web. I need some benches; so far the price is depressing and I know nothing.


Well, prices are always gouged on launch and pre-order so don't give up hope yet.


----------



## Stay Puft

Quote:


> Originally Posted by *rusky1*
> 
> I'm still hoping for $599.
> 
> Sent from my Galaxy Nexus using Tapatalk 2


Not going to happen


----------



## fleetfeather

Quote:


> Originally Posted by *Tisca*
> 
> All I know is if the price seems too good to be true, it is.


Price isn't too good to be true, it's too bad to be true.

Wait.

Does that mean it's going to be good?


----------



## Ashuiegi

Two weeks ago I saw a new-in-box HD 7970 Matrix Platinum for about $350 from a retailer. I really regret not getting it and going for Matrix CF; it would be a far better price/perf ratio than 290X CF, even with the price of the waterblocks...


----------



## Gabrielzm

http://www.anandtech.com/show/7388/r9290x-preordering-begins-today

It seems AMD will be moving up the price ladder to tie with 780 prices... Personally, I really hope the card does well. We need competition, and I don't hold allegiances to either side (green or red). I always buy whatever is fast and not obscenely loud. Last generation I went with a Sapphire 7950, which tied with or even surpassed a bunch of 7970s while being very quiet.

Cheers


----------



## Nonehxc

$1,145?!?!?!?

Oh my poor heart! Tell me that's a placeholder!


----------



## kosmosrl

At this price I might as well go 280x CF..


----------



## Gabrielzm

Quote:


> Originally Posted by *Nonehxc*
> 
> 1145$?!?!?!?
> 
> Oh my poor heart! Tell me that's a placeholder!


Not likely... It seems more like US$700, give or take. It will all come down to what performance level it offers. Remember, so far we only have leaked benchmarks, not a thorough and validated investigation of comparative benchmarks.


----------



## maarten12100

Quote:


> Originally Posted by *Nonehxc*
> 
> 1145$?!?!?!?
> 
> Oh my poor heart! Tell me that's a placeholder!


it is


----------



## Nonehxc

Quote:


> Originally Posted by *maarten12100*
> 
> it is












I'm reading that the BF4 Edition may be $729 on preorder. After the European early-adopter fee and the direct-conversion craze subside, I think we can see them for as low as 500-something euros after some months.

Early European adopters, my sincerest condolences.


----------



## routek

I'll wait till 2014, see how the new games run

7970 seems a great buy right now


----------



## xoleras

A mod at xtremesystems has tested the R9 290X:

http://www.xtremesystems.org/forums/...=1#post5209420

Quote:


> Performance in. Killer card on air (RIP TITAN), sucks on LN2. 1400 MHz or so. Its (Titan's) reign is over.
> 
> I, for one, thought those benchmark results shown were fake and actually lower than shown. Turns out the exact opposite is true. Well played, AMD, bloody well played.


Apparently the 290X overclocks REALLY well on air; he is saying "RIP Titan".


----------



## Forceman

Quote:


> Originally Posted by *xoleras*
> 
> A mod at xtremesystems has tested the R9 290X:
> 
> http://www.xtremesystems.org/forums/...=1#post5209420
> Apparently the 290X overclocks REALLY well on air, he is saying "RIP Titan"


Followed immediately by this post from SKYMTL
Quote:


> Please stop posting baseless rumors. It's because of posts like this that people's hopes are stretched too far and then they end up complaining that performance doesn't live up to some imaginary figures.


----------



## xoleras

So did SKYMTL benchmark the 290X yet? I dunno, I get a strict anti-AMD vibe from him, something I don't like seeing from a reviewer who should be objective. Maybe I'm just reading into his posts too much; I do love HWC's website.

That said, I'm not putting 100% confidence in the rumors, obviously. I want real benchmarks just as much as everyone else.


----------



## Master__Shake

Quote:


> Originally Posted by *xoleras*
> 
> So did SKYMTL benchmark the 290X yet? I dunno, I get a strict anti-AMD vibe from him. I love HWC, though.
> 
> That said, I'm not putting 100% confidence in the rumors, obviously. I want real benchmarks just as much as everyone else.


he uses amd in his gaming rig AFAIK (gpu)


----------



## Forceman

Quote:


> Originally Posted by *xoleras*
> 
> So did SKYMTL benchmark the 290X yet? I dunno, I get a strict anti-AMD vibe from him. Something I don't like seeing from a reviewer that should be objective. Maybe i'm just reading into his posts too much, I do love HWC's website.
> 
> That said, I'm not putting 100% confidence in the rumors, obviously. I want real benchmarks just as much as everyone else.


I don't know, but he was at the AMD NDA event. Follow-up post by the original poster says:
Quote:


> You call it baseless based on what? It comes within a few percentage points of the TITAN while costing significantly less, it outright beats the 780 (not just in price/performance but overall performance as well), I know what it can or can't do on LN2. I'm sure you know all of this too.


So is it within a few percentage points of the Titan, or does it kill the Titan? We knew it would be within a few percentage points of the Titan, so I'm not sure how that translates into RIP Titan, unless you are factoring in cost (which is kind of dumb in Titan's case since it is priced so outrageously anyway). If that's the criterion, then the 780 already RIPs the Titan.


----------



## xoleras

Quote:


> Originally Posted by *Forceman*
> 
> I don't know, but he was at the AMD NDA event. Follow-up post by the original poster says:
> So is it within a few percentage points of the Titan, or does it kill the Titan? We knew it would be within a few percentage points of the Titan, so I'm not sure how that translates into RIP Titan, unless you are factoring in cost (which is kind of dumb in Titan's case since it is priced so outrageously anyway).


I guess overclocking is the wildcard. The vibe I get from that guy is that the 290X overclocks very well on air.

We'll see on the 15th, though.







Like I said - I want real benchmarks just like everyone else does. Sick of rumors!


----------



## Forceman

Quote:


> Originally Posted by *xoleras*
> 
> I guess overclocking is the wildcard. The vibe I get from that guy is that the 290X overclocks very well on air.
> 
> We'll see on the 15th, though.
> 
> 
> 
> 
> 
> 
> 
> Like I said - I want real benchmarks just like everyone else does. Sick of rumors!


Yep, always the wildcard. Hopefully it's not voltage locked. He also said the power delivery might be holding it back, so we might have to wait for non-reference cards.

Here's an update, maybe not as impressive as the first post made it sound:
Quote:


> When I talk about performance I'm talking stock performance. In some cases it pulls ahead of the TITAN nicely, in others the TITAN pulls ahead. I averaged it out and the TITAN is slightly faster, but not by much. *I said TITAN was history before averaging the scores, and thought that a good trumping pulled the 290X further ahead than it really did.* 1400 MHz is what the card does on LN2.


----------



## $ilent

xoleras that xtreme link you put up is down?


----------



## Forceman

Quote:


> Originally Posted by *$ilent*
> 
> xoleras that xtreme link you put up is down?


His link got shortened, try this one:

http://www.xtremesystems.org/forums/showthread.php?286994-Amd-Volcanic-Islands-details/page28


----------



## $ilent

thanks

I'm not sure what to think. Like you say, Forceman, that guy on the XS site says one thing, then contradicts himself with his next sentence.

Do you think he really does have a 290x and its true what he says?


----------



## Forceman

Probably. But is it an ES or a retail card? Was it cherry-picked for delivery to a known overclocker?

This is the worst part of the GPU cycle - when there are reasonably reliable leaks, but still no hard info. Lots of room for error and correction over the next two weeks until NDA lifts. I wish we knew more about the 290 (non-X) though.


----------



## wstanci3

Quote:


> Originally Posted by *Forceman*
> 
> Probably. But is it an ES or a retail card? Was it cherry-picked for delivery to a known overclocker?
> 
> This is the worst part of the GPU cycle - when there are reasonably reliable leaks, but still no hard info. Lots of room for error and correction over the next two weeks until NDA lifts. I wish we knew more about the 290 (non-X) though.


AMD has been awfully quiet about the 290. Even though I am excited to see the official specs and performance of the 290x, the 290 has me intrigued a bit more honestly.


----------



## rusky1

Quote:


> Originally Posted by *wstanci3*
> 
> AMD has been awfully quiet about the 290. Even though I am excited to see the official specs and performance of the 290x, the 290 has me intrigued a bit more honestly.


Maybe they're keeping quiet about it because they know it unlocks to a 290x at a much lower price?


----------



## $ilent

I just hope the price isn't too ridiculous. Don't get me wrong, I'd love to get a 290X to go with my new 1440p monitor, but when you only play 1 or 2 games on your PC, can I really justify paying over the odds for a premium GPU?

I waited for the GTX 780 hoping it would be relatively good value for money, but then it was released at £550 and has stayed at that price since release. I was just like, get out of here, I'm not paying that, NVIDIA. I was so annoyed, because people were speculating, much like they are with this 290X, and saying it was going to be cheap.


----------



## Stay Puft

Since when did you have to start paying to join XS?


----------



## kingduqc

Quote:


> Originally Posted by *wstanci3*
> 
> AMD has been awfully quiet about the 290. Even though I am excited to see the official specs and performance of the 290x, the 290 has me intrigued a bit more honestly.


So quiet I thought it didn't exist... I'm building a nice rig for my friend and waited to see some benchmarks. He has about a $650 max budget for the GPU, so it's either dual 7970s, a 290X at $599, or a 780... still not sure what's the best move for 1080p 120Hz.


----------



## specopsFI

It seems to me that AMD is still calculating different scenarios. It's clear they've decided not to sell the new cards with current performance but rather on the promise of Mantle. That's a tricky aspect for pricing: how much can you ask for something that isn't quite there yet? A similar situation bit them the wrong way with 7990: on release it had great potential but they couldn't get people to pay for it with a promise that it will shine _some day_. If there was no Mantle, they'd probably just go with $549 for the 290X. But there is, and they're not quite sure if it's worth $50 or $150 extra.

I'm actually more interested in what the 280X and most of all the 290 is going to be about. There are some amazing 7970 deals out right now and I've been itching to build an all-AMD rig on the side. Most likely I should just go for it, but I've already had 4 of those 7970's... So if 290X is to be their cash cow then maybe 290 will hit the sweet spot.


----------



## maarten12100

Quote:


> You call it baseless based on what? It comes within a few percentage points of the TITAN while costing significantly less, it outright beats the 780 (not just in price/performance but overall performance as well), I know what it can or can't do on LN2. I'm sure you know all of this too.


'Nuff said. It doesn't beat the Titan on raw performance alone, but it OCs great on air and has incredible bandwidth that scales well at ultra-high resolutions. We have a winner for 95% of the ultra enthusiasts.


----------



## rusky1

Quote:


> Originally Posted by *maarten12100*
> 
> 'Nuff said. It doesn't beat the Titan on raw performance alone, but it OCs great on air and has incredible bandwidth that scales well at ultra-high resolutions. We have a winner for 95% of the ultra enthusiasts.


Except according to AMD it wasn't marketed towards the ultra enthusiast market which makes me question their $730 pre-order price. Anything over $600 for a single card is ultra enthusiast for most people.


----------



## Phaethon666

Quote:


> Originally Posted by *rusky1*
> 
> Except according to AMD it wasn't marketed towards the ultra enthusiast market which makes me question their $730 pre-order price. Anything over $600 for a single card is ultra enthusiast for most people.


I sorta agree with this statement. $730 is a lot of money, and though others have a stronger opinion on what "Ultra Enthusiast" means, the price gap between that and plain "Enthusiast" seems negligible.


----------



## szeged

and people continue to forget that titans can be overclocked.


----------



## maarten12100

Quote:


> Originally Posted by *rusky1*
> 
> Except according to AMD it wasn't marketed towards the ultra enthusiast market which makes me question their $730 pre-order price. Anything over $600 for a single card is ultra enthusiast for most people.


It is probably, or rather most certainly, a placeholder.








Quote:


> Originally Posted by *szeged*
> 
> and people continue to forget that titans can be overclocked.


The XS mod accounted for OCing potential: clock for clock the Titan pulls ahead slightly, and the 290X fails to impress on LN2 (probably they haven't zombie-modded it yet and were using standard voltages, meaning they only benefit from the chip's decreased power draw/leakage rather than going all the way).

Alatar will most likely keep his 10 dollars and make 10 extra. Guess he made a smart bet.


----------



## $ilent

Quote:


> Originally Posted by *maarten12100*
> 
> Alatar will most likely keep his 10 dollar and make 10 extra guess he made a smart bet.


If he has money to burn buying GTX Titans he probably won't care about winning that $10; it's likely more the principle of being right.

That's what I think anyway.


----------



## szeged

Quote:


> Originally Posted by *maarten12100*
> 
> It is probably, or rather most certainly, a placeholder.
> 
> The XS mod accounted for OCing potential: clock for clock the Titan pulls ahead slightly, and the 290X fails to impress on LN2 (probably they haven't zombie-modded it yet and were using standard voltages, meaning they only benefit from the chip's decreased power draw/leakage rather than going all the way).
> 
> Alatar will most likely keep his 10 dollars and make 10 extra. Guess he made a smart bet.


Hope they don't have to zombie-mod the card. NVIDIA screwed up by giving the Titan a poor VRM section compared to other cards (the Classified's should have been the standard VRM for reference Titans/780s, imo).

I'm not much into LN2, mostly just water benching, so I wanna see max OC vs. max OC on air and water, with LN2 as an afterthought.


----------



## maarten12100

Quote:


> Originally Posted by *$ilent*
> 
> If he has money to burn buying GTX Titans he probably wont care about winning that $10, its likely more the principle of being right.
> 
> Thats what I think anyway.


You are correct there


----------



## $ilent

Hey guys the AMD R9 290X price is official:

http://www.aria.co.uk/Products/Components/Graphics+Cards/AMD+Radeon/Radeon+HD+R9+290X/Club+3D+R9+290X+Graphics+Card+%2B+FREE+Battlefield+4!+?productId=57699

£11999 or just under $19200. I am on the fence about preordering at this price now.


----------



## bmt22033

Quote:


> Originally Posted by *$ilent*
> 
> Hey guys the AMD R9 290X price is official:
> 
> http://www.aria.co.uk/Products/Components/Graphics+Cards/AMD+Radeon/Radeon+HD+R9+290X/Club+3D+R9+290X+Graphics+Card+%2B+FREE+Battlefield+4!+?productId=57699
> 
> £11999 or just under $19200. I am on the fence about preordering at this price now.


Whew! And I feared it would be priced out of my budget. Thank you AMD! I'll take 4 please!!!


----------



## maarten12100

Quote:


> Originally Posted by *$ilent*
> 
> Hey guys the AMD R9 290X price is official:
> 
> http://www.aria.co.uk/Products/Components/Graphics+Cards/AMD+Radeon/Radeon+HD+R9+290X/Club+3D+R9+290X+Graphics+Card+%2B+FREE+Battlefield+4!+?productId=57699
> 
> £11999 or just under $19200. I am on the fence about preordering at this price now.


That FREE! BF4 really makes it worth the price.
Which is OVER 9000!
Placeholders are such fun.


----------



## Jack Mac

Quote:


> Originally Posted by *$ilent*
> 
> Hey guys the AMD R9 290X price is official:
> 
> http://www.aria.co.uk/Products/Components/Graphics+Cards/AMD+Radeon/Radeon+HD+R9+290X/Club+3D+R9+290X+Graphics+Card+%2B+FREE+Battlefield+4!+?productId=57699
> 
> £11999 or just under $19200. I am on the fence about preordering at this price now.


Lmao, I could buy good used cars for my entire family with that kind of money.


----------



## $ilent

I went ahead and preordered two of those £12,000 290X's. I did it quickly before they sold out their allocation.


----------



## Nonehxc

Quote:


> Originally Posted by *bmt22033*
> 
> Whew! And I feared it would be priced out of my budget. Thank you AMD! I'll take 4 please!!!


Okay. It will be prima nocta and your firstborn, not the first born in your couple.









Also, if you were so kind as to sign here with your blood, we'll process your order right away.


----------



## Forceman

Quote:


> Originally Posted by *$ilent*
> 
> I went ahead and preordered two of those £12,000 290X's. I did it quickly before they sold out their allocation.


That's good thinking. With a price like that you know it has to be good.


----------



## Moragg

Quote:


> Originally Posted by *Forceman*
> 
> That's good thinking. With a price like that you know it has to be good.


AMD. Bringing POWER to the masses classes.


----------



## CynicalUnicorn

Quote:


> Originally Posted by *$ilent*
> 
> Hey guys the AMD R9 290X price is official:
> 
> http://www.aria.co.uk/Products/Components/Graphics+Cards/AMD+Radeon/Radeon+HD+R9+290X/Club+3D+R9+290X+Graphics+Card+%2B+FREE+Battlefield+4!+?productId=57699
> 
> £11999 or just under $19200. I am on the fence about preordering at this price now.


Since AMD is always better than Nvidia when it comes to price/performance, that must be _at least 25 Titans!_ I'll take ten!


----------



## BankaiKiller

Seriously? Someone pre ordered that from that website?


----------



## $ilent

Quote:


> Originally Posted by *BankaiKiller*
> 
> Seriously? Someone pre ordered that from that website?


Yeah, I got 2 just in case.


----------



## fateswarm

Quote:


> Originally Posted by *Brutuz*
> 
> With Skyrim it was definitely due to running out of vRAM from high resolution textures, and I run it off of my SSD to lower the bottleneck as much as I can. Above 1080p and 3GB is really the minimum acceptable vRAM unless you only play FPS and the like, honestly.


I wouldn't discount an fdisk() call on an SSD pausing it, since those can go to sleep as well. Sometimes disks give the impression that even if the OS tells them not to sleep, they do sleep in one way or another (or perhaps their controller does). Or there may be other ways for them to not deliver data fast enough. In general, it's silly nowadays not to keep your overflowed assets in a RAM cache, since it's extremely fast compared to a disk; it probably approaches the full speed of the PCI-E bus, which is still far slower than VRAM<->GPU speed, but nowhere near as slow as disks.
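To put rough numbers on that hierarchy, here's a back-of-the-envelope sketch. The bandwidth figures are ballpark values (SATA SSD sequential read, PCIe 3.0 x16 theoretical peak, and the 290X's rumored 512-bit GDDR5), not measurements:

```python
# Rough time to move a 1 GB asset at each tier of the memory hierarchy.
# Bandwidth figures are ballpark values, not measurements.
TIERS_GBS = {
    "SATA SSD (~500 MB/s)": 0.5,
    "RAM over PCIe 3.0 x16 (peak)": 15.75,
    "GDDR5 on-card (512-bit @ 5 GHz)": 320.0,
}

def transfer_ms(size_gb, bandwidth_gbs):
    """Milliseconds to move size_gb at the given bandwidth."""
    return size_gb / bandwidth_gbs * 1000

for tier, gbs in TIERS_GBS.items():
    print(f"{tier:34s} ~{transfer_ms(1.0, gbs):7.1f} ms")
```

The roughly three orders of magnitude between the SSD and on-card memory are the point: anything that has to come off disk mid-frame will stall the GPU, while a RAM cache keeps the worst case at PCIe speed.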


----------



## Mr357

Quote:


> Originally Posted by *$ilent*
> 
> Yeah i got 2 just in case.


Stop! You're killing me!


----------



## selk22

Quote:


> Originally Posted by *BankaiKiller*
> 
> Seriously? Someone pre ordered that from that website?


I just got one for my Lan Rig. You should hop on it before they run out of stock.


----------



## amd655

Quote:


> Originally Posted by *Jack Mac*
> 
> Lmao, I could buy good used cars for my entire family with that kind of money.


----------



## Ashuiegi

Does it beat a 7990 ? , because you can find them cheaper then that atm ,.....


----------



## Tisca

I wish no one would preorder at an unknown price so AMD would have to (re)think their pricing for a lower one.
Quote:


> Originally Posted by *Ashuiegi*
> 
> Does it beat a 7990 ? , because you can find them cheaper then that atm ,.....


I would never compare a dual GPU card to a single.


----------



## wermad

Quote:


> Originally Posted by *Tisca*
> 
> I wish no one would preorder at an unknown price so AMD would have to (re)think their pricing for a lower one.


It's a mess right now for the US pre-order. Judging from a Twitter screenshot, an AMD rep basically says sorry, Newegg and TD are late. Not sure if this is an AMD fudge-up they're trying to cover by blaming the retailers for being unable to add a simple listing (which I find hard to believe for Newegg.com).

Apparently, Newegg will have pre-orders available today the 4th (Newegg is on PST). I'm still seeing the "auto notify" button.


----------



## Baghi

Quote:


> Originally Posted by *szeged*
> 
> and people continue to forget that titans can be overclocked.


Custom BIOSes came a while after the TITAN launched; wait a few months till we see the max-OC vs. max-OC battle.









----

That guy at XS says:
Quote:


> In Firestrike the 290X is in line with a 780 OC.


Stock vs. OC? Makes me believe those who say FS is bandwidth-intensive.
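If Firestrike really is bandwidth-bound, the leaked specs would favor the 290X. As a quick sketch, peak GDDR5 bandwidth is just bus width times effective data rate; the 290X numbers below come from the leaked slide, and the TITAN's are NVIDIA's reference figures (roughly 6 GHz effective):

```python
def gddr5_bandwidth_gbs(bus_width_bits, effective_clock_ghz):
    """Peak theoretical GDDR5 bandwidth in GB/s: (bits / 8) * GT/s."""
    return bus_width_bits / 8 * effective_clock_ghz

# 290X per the leaked slide; TITAN per NVIDIA reference specs (~6 GHz).
r9_290x = gddr5_bandwidth_gbs(512, 5.0)  # 320.0 GB/s
titan = gddr5_bandwidth_gbs(384, 6.0)    # 288.0 GB/s

print(f"R9 290X: {r9_290x:.0f} GB/s vs TITAN: {titan:.0f} GB/s")
```

So even at a lower memory clock, the wider bus gives the 290X about 11% more raw bandwidth, which is consistent with it pulling ahead at high resolutions.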


----------



## anticommon

Quote:


> Originally Posted by *Baghi*
> 
> Custom BIOSes came after a while for the TITAN, wait a few months till we see max OC vs. max OC battle.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ----
> 
> That guy at XS says:
> Stock vs. OC? Makes me believe those who say FS is bandwidth-intensive.


But what kind of OC? An 1100 MHz OC on a 780 is nowhere near the 1300-1400 MHz OCs people are getting with the likes of Classys... which can be had for $600 through an EVGA promotion.


----------



## Alatar

Quote:


> Originally Posted by *xoleras*
> 
> A mod at xtremesystems has tested the R9 290X:
> 
> http://www.xtremesystems.org/forums/...=1#post5209420
> Apparently the 290X overclocks REALLY well on air, he is saying "RIP Titan"


So 1400 MHz on LN2....

My Titan does more on water...

And even in the leaked multi-monitor 4xMSAA benches the 290X hasn't been on par with a Titan clock for clock.


----------



## Forceman

Quote:


> Originally Posted by *wermad*
> 
> Not sure if this is a Amd fudge up and they're trying to cover it up and blame the retailers they can't add a simple listing (which I find hard to believe for Newegg.com).


Wonder if it has something to do with no price being available. Maybe Newegg is reluctant to list a card without an actual price attached.


----------



## 2010rig

Quote:


> Originally Posted by *Baghi*
> 
> Custom BIOSes came after a while for the TITAN, wait a few months till we see max OC vs. max OC battle.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ----
> 
> That guy at XS says:
> Stock vs. OC? Makes me believe those who say FS is bandwidth-intensive.


He also said.
Quote:


> When I talk about performance I'm talking stock performance. In some cases it pulls ahead of the TITAN nicely, in others the TITAN pulls ahead. *I averaged it out and the TITAN is slightly faster,* but not by much. I said TITAN was history before averaging the scores, and thought that a good trumping pulled the 290X further ahead than it really did. 1400 MHz is what the card does on LN2.


780s do 1800 on LN2.

So I wonder what the max air/water OC will be.

I also wonder what games were tested; was the good "trumping" in Dirt Showdown, for example... Inquiring minds want to know.

And...
Quote:


> would you say its time to take the 780 (1250/7200) out of my system ?
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Nope
Click to expand...


----------



## Alatar

Quote:


> Originally Posted by *2010rig*
> 
> He also said.
> 780's do 1800 on LN2.
> 
> So I wonder what the Max on Air / Water OC will be.
> 
> I also wonder what games were tested, were the good "trumping" Dirt Showdown for example.... Inquiring minds want to know.
> 
> And...


If the 290X truly only does 1400 MHz on LN2 (which isn't entirely unfeasible, since 7970s usually hit the 1600 MHz range) then the thing won't stand a chance against OC'd Titans.


----------



## 2010rig

Quote:


> Originally Posted by *Alatar*
> 
> If the 290X truly only does 1400MHz on LN2 (which isn't entirely unfeasible since 7970s usually hit the 1600mhz range) then the thing wont stand a chance against OC'd Titans.


No kidding, but depending on who you ask, Titans don't OC.









You are only allowed to compare OC'd cards to stock Titans.









So, my guess based on what we know is this:

The card comes with a boost of 1020 MHz, which is what the leaked benchmarks were run at; max OC will likely be around 1100-1200 MHz.

So essentially, it's the 6970 all over again; those were clocked near their limits in order to compete.

Makes sense why AMD isn't releasing benchmark numbers, clocks, etc. until the last possible second.


----------



## szeged

Titans can't be overclocked and all 290Xs hit 1400 out of the box, according to these people.


----------



## maarten12100

Quote:


> Originally Posted by *szeged*
> 
> titans cant be overclocked and all 290x's hit 1400 out of the box, according to these people.


That is on LN2. I'd guess 1200 MHz and more on non-reference cards for sure.


----------



## $ilent

Call it a wild guess if you will, but I think we will see 290Xs hitting over 2000 MHz on air.


----------



## kot0005

Quote:


> Originally Posted by *$ilent*
> 
> Call it a wild guess if you will, but I think we will see 290X's hitting over 2000mhz on air.


LOL

Edit: you mean the memory?


----------



## fleetfeather

Quote:


> Originally Posted by *$ilent*
> 
> Call it a wild guess if you will, but I think we will see 290X's hitting over 2000mhz on air.


10/10


----------



## $ilent

Quote:


> Originally Posted by *kot0005*
> 
> LOL
> 
> Edit: you mean the memory?


No, I mean the GPU core.


----------



## Moragg

Quote:


> Originally Posted by *$ilent*
> 
> Call it a wild guess if you will, but I think we will see 290X's hitting over 2000mhz on air.


Sunny-side-up for me please


----------



## psyside

Quote:


> Originally Posted by *Stay Puft*
> 
> Girls with an Aussie accent as well mmmmmmmm


I don't get people who record/upload videos for YouTube with a terrible mic/sound card. I turned it off within the first few seconds; the audio is simply awful!!!!!


----------



## Chewy

£800 to buy the 290X with bf4

http://www.overclockers.co.uk/showproduct.php?prodid=GX-015-AM&groupid=701&catid=56&subcat=1752

Forget it!!


----------



## sugarhell

Quote:


> Originally Posted by *Chewy*
> 
> £800 to buy the 290X with bf4
> 
> http://www.overclockers.co.uk/showproduct.php?prodid=GX-015-AM&groupid=701&catid=56&subcat=1752
> 
> Forget it!!


I would like to know where you saw the £800 price.


----------



## maarten12100

Quote:


> Originally Posted by *Chewy*
> 
> £800 to buy the 290X with bf4
> 
> http://www.overclockers.co.uk/showproduct.php?prodid=GX-015-AM&groupid=701&catid=56&subcat=1752
> 
> Forget it!!


Placeholder. Duh!


----------



## Chewy

Quote:


> Originally Posted by *sugarhell*
> 
> I would like to know where did you see the 800 bucks price


Did you read the description???
Quote:


> Originally Posted by *maarten12100*
> 
> Place holder Duh!


Once again......


----------



## sugarhell

Quote:


> Originally Posted by *Chewy*
> 
> Did you read the description???
> Once again......





Spoiler: Warning: Spoiler!



Quote:


> Quote:
> 
> 
> 
> **THIS IS NOT AN R290X FOR £99 - THIS IS A DEPOSIT for PRE-ORDER SLOT ONLY! YOU WILL NEED TO PAY ADDITIONAL MONEY MID - END OCTOBER!!**
> 
> You are placing a DEPOSIT / PRE-ORDER for your slot in the queue of R290X BF4 Edition Pre-Orders. Upon product launch in mid-end October you will need to pay the remaining fee which will be several hundred pounds.
> 
> OcUK shall be stocking BF4 Editions of Sapphire, HIS, Asus, Gigabyte and MSI. OcUK is one of two exclusive Etail partners which has stock guaranteed en-route and we have several hundred units coming. So we can guarantee you will get a BF4 edition by placing a deposit with ourselves!
> 
> Instructions:-
> 1. If you want AMD's new flagship single GPU card the R9 290X place a DEPOSIT NOW by ordering!
> 2. Middle too end October we shall contact you via email, at which point you can call us to select the brand you want and pay the additional money required.
> 3. If you are unhappy with the launch pricing or have changed your mind, your full £99 DEPOSIT shall be refunded at your request.
> 4. Final price to be confirmed!
Click to expand...





Where? Show me the £800 price. Also, Gibbo (OcUK staff) said that the price will be between £450-650.


----------



## Baghi

Quote:


> Originally Posted by *Chewy*
> 
> Once again......


Count me in, too, because I don't see the price either. Maybe all three of us should get our eyes checked?


----------



## $ilent

Yea chewy where does it say £800?


----------



## Chewy

Quote:


> Originally Posted by *sugarhell*
> 
> 
> Where tell me the 800 bucks price. Also gibbo (ocuk stuff) said that the price will be between 450-650


Are you completely blind??

You are placing a DEPOSIT for your slot in the queue of R290X BF4 Edition Pre-Orders. Upon product launch in October you will need to pay the remaining fee which will be several hundred pounds.

£99 + £700 = £799

SRSLY GUYS....


----------



## Baghi

Quote:


> Originally Posted by *$ilent*
> 
> Yea chewy where does it say £800?


4th! I think his eyes are overclocked!


----------



## sugarhell

Maybe you already have something in your basket?


----------



## Yvese

Quote:


> Originally Posted by *Chewy*
> 
> Are you completely blind??
> 
> You are placing a DEPOSIT for your slot in the queue of R290X BF4 Edition Pre-Orders. Upon product launch in October you will need to pay the remaining fee which will be several hundred pounds.
> 
> £99 + £700 = £799
> 
> SRSLY GUYS....


Where are you getting the 700 from?


----------



## Chewy

Quote:


> Originally Posted by *Yvese*
> 
> Where are you getting the 700 from?


The description says, "in English": you will need to pay the remaining balance of several hundred pounds. OMG, are you guys not able to convert words to numbers or what???


----------



## sugarhell

Several doesn't mean 700...


----------



## fleetfeather

Quote:


> Originally Posted by *Chewy*
> 
> Are you completely blind??
> 
> You are placing a DEPOSIT for your slot in the queue of R290X BF4 Edition Pre-Orders. Upon product launch in October you will need to pay the remaining fee which will be several hundred pounds.
> 
> £99 + £700 = £799
> 
> SRSLY GUYS....


"Several" =/= Seven

Sorry buddy, too funny.


----------



## criminal

Quote:


> Originally Posted by *Chewy*
> 
> The description says "in english" You will need to pay the remaining balance of several hundred pounds? Omg you guys not able to convert words to numbers or what???


It says "several" hundred pounds, not "seven" hundred pounds. Read it again.


----------



## wstanci3

He's trolling guys.
Well played, Cartman.


----------



## Chewy

Quote:


> Originally Posted by *sugarhell*
> 
> Several doesnt mean 700...


No, it means three, obviously


----------



## Newbie2009

Several means more than two, "in English".

I assume you are trolling. Thanks, I needed a good lol today


----------



## sugarhell

Quote:


> Originally Posted by *Chewy*
> 
> No it means three obviously


Are you for real? Several means:

http://www.thefreedictionary.com/several


----------



## criminal

Quote:


> Originally Posted by *sugarhell*
> 
> Are you for real? Several means:
> 
> http://www.thefreedictionary.com/several


More than two or three but not many. So four, five or six I take it? $599 hopefully.


----------



## Chewy

OK guys, you got me..... But I had a laugh nonetheless


----------



## sugarhell

Quote:


> Originally Posted by *criminal*
> 
> More than two or three but not many. So four, five or six I take it? $599 hopefully.


I hope $499, because it's crazy to pay so much for a single GPU. Anyone remember the 4000 series? If it is more than $600 I will not get one.


----------



## criminal

Quote:


> Originally Posted by *sugarhell*
> 
> I hope 499 because its ******ed to pay so much for single gpus? Anyone remember 4k series? If it is more than 600 i will not get any


Well yeah, that would be better. I just really hope no more than $599.


----------



## $ilent

Chewy, several and SEVEN are two different words.


----------



## Newbie2009

Quote:


> Originally Posted by *criminal*
> 
> Well yeah, that would be better. I just really hope no more than $599.


Any more than that and I will have to spend some serious time in the sperm bank.


----------



## wstanci3

Quote:


> Originally Posted by *Newbie2009*
> 
> Any more than that and I will have to spend some serious time in the sperm bank.


LOL








You got to make that green somehow. Desperate times call for desperate measures.


----------



## Chewy

Quote:


> Originally Posted by *$ilent*
> 
> Chewy several and SEVEN are two different words.


Poor attempt at humour...

Now SRSLY, I don't for one minute expect it to be £800


----------



## Baghi

Quote:


> Originally Posted by *Chewy*
> 
> Poor attempt of humour...
> 
> Now SRSLY i dont for one minute expect it to be £800


LOL. +REP added. Much needed lol.









EDIT:
*Pre-orders begin for AMD Radeon R9 290X, likely priced at $699*

Source: TechSpot


----------



## Ashuiegi

At these prices it's insane; I would go for a 7990 over this anytime. For this price you get the 7990 and a waterblock.


----------



## rusky1

Quote:


> Originally Posted by *Ashuiegi*
> 
> for these price it s insane , i would go for a 7990 anytime over this. for this price you got the 7990 and the waterblock


Or two 7970s. The Gigabyte non-reference ones are going for $289.99 after mail-in rebate right now.


----------



## Blindsay

Quote:


> Originally Posted by *Ashuiegi*
> 
> for these price it s insane , i would go for a 7990 anytime over this. for this price you got the 7990 and the waterblock


there are pros/cons to each, neither is a clear winner


----------



## djriful

Quote:


> Originally Posted by *szeged*
> 
> oh lawd you had to ask
> 
> 9 titans
> 2 evga 780 classifieds
> 2 reference 780s
> 2 680 classifieds
> 4 7970s, down from 6 after selling the two runts
> 2 7950s
> 1 7870
> 
> and some various other older gen cards that i cant find or forgot about lol


Whhyyyyy? Give me another Titan and I'll be a happy SLI user.


----------



## Blindsay

Quote:


> Originally Posted by *Regent Square*
> 
> Funny how Nvidia fanboys make a conclusion about R9 290X already meanwhile praised 780 in a pre release thread.


Not exactly the same. The 780's performance was a lot easier to figure out since we knew it was a cut-down Titan, whereas the 290X is something entirely new.

edit: also, if anyone is a fanboy, it's you


----------



## bencher

Quote:


> Originally Posted by *Regent Square*
> 
> Funny how Nvidia fanboys make a conclusion about R9 290X already meanwhile praised 780 in a pre release thread.


They weren't praising the 780, if I remember correctly.

They were saying there was no way NVIDIA would cannibalize the Titan by making the 780 a close performer.


----------



## maarten12100

Point being, this card doesn't beat the Titan stock vs. stock without boost, but it costs less, makes less heat, and scales better with multiple cards and at higher resolutions.
AMD has a winner; you'd have to be a complete fanboy to take a 780 over a 290X, let alone a Titan over a 290X (unless you need CUDA for something).

The above point of view is based on the statements of the XS mod and the posts from [H] Kyle (he doesn't have one, but he has the contacts)


----------



## youra6

No way I'm getting this now.


----------



## Blindsay

Quote:


> Originally Posted by *maarten12100*
> 
> Point being this card doesn't beat Titan stock vs stock without boost but it costs less and makes less heat and scales better with multiple cards and resolution.
> AMD has a winner you have to be a complete fanboy to take a 780 over a 290x let alone take a Titan over a 290x (unless you need CUDA for something
> 
> 
> 
> 
> 
> 
> 
> )
> 
> The above point of view is made up by the statements of the XS mod and the posts from [H] Kyle (he doesn't have one but he has the contacts)


If it is $700 then it's not really that good of a deal; that's an OK deal but nothing special.


----------



## bencher

Quote:


> Originally Posted by *Ashuiegi*
> 
> for these price it s insane , i would go for a 7990 anytime over this. for this price you got the 7990 and the waterblock


That's what I am thinking. If I were to spend $700 on a card it would have to be a 7990.


----------



## youra6

Quote:


> Originally Posted by *bencher*
> 
> That's what i am thinking. If I was to spend $700 on a card it would have to be a 7990.


In the states, you can find 7990s for under 600 dollars.


----------



## Stay Puft

Quote:


> Originally Posted by *Regent Square*
> 
> Funny how Nvidia fanboys make a conclusion about R9 290X already meanwhile praised 780 in a pre release thread.


What conclusions are we making? A member over at XS said it needs LN2 for 1400 core. Alatar's Titan does 1400 core on water, I believe.


----------



## Baghi

Quote:


> Originally Posted by *Stay Puft*
> 
> *What conclusions are we making?* A member over at XS said it needs LN2 for 1400 core. Alatar's titan does 1400 core on water i believe.


FANBOY DETECTED!


----------



## Stay Puft

Quote:


> Originally Posted by *Baghi*
> 
> FANBOY DETECTED!


----------



## provost

The conclusion is that we already have NVIDIA's version of the 290X, with the same price/performance attributes vis-a-vis the Titan, and we have had it for a while now. What does the 290X bring to the table that is not already there?
780 Classy - $700, beats the stock-BIOS Titan
290X - rumored $600, or $700 with BF4? Rumored to beat the stock-BIOS Titan (stock clocks and stock BIOS are two different things). May have something called Mantle sometime in December for BF4 and sometime next year for other games (again, based on rumors). May have 4 GB of RAM, may have frame-pacing drivers, and TrueAudio.

AMD's turbo, unlike NVIDIA's boost (which I don't like anyway), is a new feature, and thus yet unproven
AMD's version of NVIDIA's GeForce Experience (which I could not care less for) is a new feature, and thus unproven
AMD's driver-based frame pacing is a new feature (if true), compared to NVIDIA's hardware metering, and thus unproven
AMD's Mantle is still a work in progress, thus unproven
TrueAudio is a new feature and thus unproven

So, the conclusion is that there are a lot of "unprovens" at this point for someone to take a leap of faith and purchase this card at launch. Just speaking for myself, of course.


----------



## maarten12100

Quote:


> Originally Posted by *Stay Puft*
> 
> What conclusions are we making? A member over at XS said it needs LN2 for 1400 core. Alatar's titan does 1400 core on water i believe.


It will go higher with a voltage mod on LN2; a Titan won't go that high either without a voltage hard mod.


----------



## Stay Puft

Quote:


> Originally Posted by *Regent Square*
> 
> A member told us, okey. Where is a creditability of this souce?! Exactlly.
> 
> Nvidia fanboys call all the benches fake and every negative fact about R9 290X a truth worthy source, LOl Stay Puff, please post more of your laughable arguments. It is real fun to read them the way u post em.


At least my arguments are spelled correctly









Okay? Credibility? Source?


----------



## fateswarm

The OP keeps editing the title with a new rumored price. Please move it to the speculation side. The forum doesn't exist for the "glory" of any OP's popularity.


----------



## Supranium

Quote:


> Originally Posted by *Stay Puft*
> 
> What conclusions are we making? A member over at XS said it needs LN2 for 1400 core. Alatar's titan does 1400 core on water i believe.


He posted like this:
Quote:


> Quote Originally Posted by [XC] Oj101
> Performance in. Killer card on air (RIP TITAN), sucks on LN2. 1400 MHz or so


----------



## Moragg

Quote:


> Originally Posted by *Supranium*
> 
> He posted like this:


Shh, let's all concentrate on the fact it doesn't do well on LN2, which absolutely nobody is going to use.

I think the best bet is to wait 6 months; then we can compare modded BIOSes for the 7xx and R9 290X and see how much of a difference Mantle makes, and how many games support it. With the new gen of consoles, everything's up in the air as to what will happen for the next few years.


----------



## $ilent

Is it safe to assume the 290X beats a 7990 too at this point? It seems to me that it must, because the 7990 is £500. So either it beats the 7990 too (in which case it must also beat the Titan), or it doesn't cost as much as everyone is thinking and it will be under £500.

I don't see how they could price the 290X over £500 if it doesn't beat the 7990, but then if it did beat the 7990, people in these quotes from XS would be saying the 290X beats the Titan across the board too.


----------



## Forceman

Quote:


> Originally Posted by *$ilent*
> 
> Is it safe to assume the 290X beats a 7990 too at this point? It seems to me that it must do because the 7990 is £500. So either it beats the 7990 too (in which case it must also beat the titan), or it doesnt cost as much as everyone is thinking and it will be under £500.
> 
> I dont see how they could price the 290X over £500 if it doesnt beat the 7990, but then if it did beat the 7990 people would be saying in these quotes from XS that the 290X beats the titan across the board too.


There's no chance it'll beat a 7990 in games that have anything better than atrocious CFX scaling. It's probably 25-30% faster than a 7970. AMD appears to be trying to clear stock of the 7990, or else they just aren't selling at all, which would account for the low price (now $569 after rebate, which is crazy for a card that launched at $1000 6 months ago).


----------



## $ilent

Quote:


> Originally Posted by *Forceman*
> 
> *There's no chance it'll beat a 7990 in games* that have anything better than atrocious CFX scaling. It's probably 25-30% faster than a 7970. AMD appears to be trying to clear stock of the 7990, or else they just aren't selling at all, which would account for the low price (now $569 after rebate, which is crazy for a card that launched at $1000 6 months ago).


Exactly, so how could they possibly price the 290X at like £500+ like people keep guessing?


----------



## maarten12100

Quote:


> Originally Posted by *Stay Puft*
> 
> At least my arguments are spelled correctly
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Okay? Credibility? Source?


Quote:


> Originally Posted by *Regent Square*
> 
> A member told us, okey. Where is a creditability of this souce?! Exactlly.
> 
> Nvidia fanboys call all the benches fake and every negative fact about R9 290X a truth worthy source, LOl Stay Puff, please post more of your laughable arguments. It is real fun to read them the way u post em.


Two moderators of two credible forums. I mean, really guys, they are credible.


----------



## Kinaesthetic

Quote:


> Originally Posted by *$ilent*
> 
> Exactly, so how could they possibly price the 290X at like £500+ like people keep guessing?


Because the markets for single-GPU cards and dual-GPU cards are completely different.


----------



## $ilent

How bad is the 7990 to use? Because if the 290X is over £500 I probably won't get one; I'll just grab a cheap 780 or 7990.


----------



## wermad

So, we're at $699 now? Meh, I'll just sell my 780s and look for used Titans, they're going for ~$700 now. I can re-use my blocks.


----------



## $ilent

Quote:


> Originally Posted by *wermad*
> 
> So, we're at $699 now? Meh, I'll just sell my 780s and look for used Titans, they're going for ~$700 now. I can re-use my blocks.


Ill give you 400 bucks for a 780


----------



## fateswarm

Quote:


> Originally Posted by *$ilent*
> 
> Ill give you 400 bucks for a 780


Trader Rating: 11+1


----------



## wermad

Quote:


> Originally Posted by *$ilent*
> 
> Ill give you 400 bucks for a 780


£? Sure! That's ~$650 USD


----------



## $ilent

No not £400 lol, how much in $ would you want?


----------



## darkstar585

Quote:


> Originally Posted by *wermad*
> 
> £? Sure! That's ~$650 USD


Doubt it, I am guessing he means dollars, as we don't use the term "bucks" for anything else... On the other hand, if he had said quid or squid you would be laughing









^beat me to it







^


----------



## smoggysky

So that's why an unnamed local electronics store (f*ys) is selling an ASUS 7970 for 259... I almost bought it...


----------



## nvidiaftw12

Quote:


> Originally Posted by *Ashuiegi*
> 
> for these price it s insane , i would go for a 7990 anytime over this. for this price you got the 7990 and the waterblock


Quote:


> Originally Posted by *bencher*
> 
> That's what i am thinking. If I was to spend $700 on a card it would have to be a 7990.


I wouldn't. They almost all have bad coil whine.


----------



## Red eyed fiend

Surprised Regent hasn't been banned yet; by the looks of things he is only in here for a fight, while making out that anyone who has any reservations about the prices is an Nvidia fanboy.


----------



## wermad

Quote:


> Originally Posted by *$ilent*
> 
> No not £400 lol, how much in $ would you want?


Gonna need to research first but $400 USD is doable if you pay $250 shipping









Other than that, it's way too lowball. I'll formulate my sales strategy soon if I decide to switch (checks own trader rating







). I'm gonna need to research this a bit more. Titans are sitting @ $800 USD on eBay, so that tells me they're selling for ~$750-775 there. Translated into a forum sale, they should be going for ~$700-725 (usually forum sales go for ~10-15% less than eBay due to the eBay fees).

Btw, is it official that the 290x is $699 USD????


----------



## anticommon

Quote:


> Originally Posted by *wermad*
> 
> Gonna need to research first but $400 USD is doable if you pay $250 shipping
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Other then that, way too low ball. I'll formulate my sales strategy soon if I decide to switch (checks own trader rating
> 
> 
> 
> 
> 
> 
> 
> ). I'm gonna need to research this a bit more. Titans are sitting @ $800 USD on ebay so that tells they're selling for ~$750-775 there. Translated into the forum sale, they should be going for ~$700-725 (usually forum sales go for ~10-15% less then ebay due to the ebay fees).
> 
> Btw, is it official that the 290x is $699 USD????


Honestly, $475-525 depending on model is pretty much where it's at for 780s, I'd guess. Especially when Classifieds are going for $600 new.


----------



## Testier

Quote:


> Originally Posted by *wermad*
> 
> Gonna need to research first but $400 USD is doable if you pay $250 shipping
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Other then that, way too low ball. I'll formulate my sales strategy soon if I decide to switch (checks own trader rating
> 
> 
> 
> 
> 
> 
> 
> ). I'm gonna need to research this a bit more. Titans are sitting @ $800 USD on ebay so that tells they're selling for ~$750-775 there. Translated into the forum sale, they should be going for ~$700-725 (usually forum sales go for ~10-15% less then ebay due to the ebay fees).
> 
> Btw, is it official that the 290x is $699 USD????


I'd say around 550-600 CAD is fair for the reference model. 600 might be on the high end though. And no, I do not think there is any official pricing for the R9 290X yet.


----------



## $ilent

Quote:


> Originally Posted by *wermad*
> 
> Gonna need to research first but $400 USD is doable if you pay $250 shipping
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Other then that, way too low ball. I'll formulate my sales strategy soon if I decide to switch (checks own trader rating
> 
> 
> 
> 
> 
> 
> 
> ). I'm gonna need to research this a bit more. Titans are sitting @ $800 USD on ebay so that tells they're selling for ~$750-775 there. Translated into the forum sale, they should be going for ~$700-725 (usually forum sales go for ~10-15% less then ebay due to the ebay fees).
> 
> Btw, is it official that the 290x is $699 USD????


Lol, if Titans are going for $750, why would I pay $650 for a 780? The price difference here between new 780s and new Titans is £400, so like $650. No official price has been set for the 290X yet.


----------



## wermad

The Classy is very mediocre, so EVGA dropped the price. It took a while for the 580 Classy to drop. I lost count of how many owners were disappointed with their cards. My best guess is that dictator Nvidia pulled the reins back to protect their Titan cash cow. The 780 overall feels like an 85% product, to avoid encroachment on Titan sales. Sad, and it's why I'm willing to go with something else.


----------



## wermad

Quote:


> Originally Posted by *$ilent*
> 
> Lol if Titans are going for $750 why would I pay 650 for a 780? The price difference here between new 780s and new titans is £400, so like $650. No official price been ste for 290x yet.


Lol. How I counter lowball offers, offer highball


----------



## $ilent

perhaps we could say $450?


----------



## wermad

Quote:


> Originally Posted by *$ilent*
> 
> perhaps we could say $450?


Haha, nope. Plus shipping to the UK is expensive. I'll stop before the mods step in, since "you can't discuss sales outside the market". If I do sell, I'm probably going with eBay for a quicker sale.


----------



## $ilent

good ol fleabay, I got some stuff on there too


----------



## wermad

Did anyone in the US already get a preorder in?


----------



## TheLAWNOOB

Quote:


> Originally Posted by *wermad*
> 
> Did anyone in the US already get a preorder in?


You should ask that in AMDZone or some gaming forums.

I don't think many people here would do a pre-order without knowing the exact price and benchmarks.


----------



## kot0005

Quote:


> Originally Posted by *wermad*
> 
> Did anyone in the US already get a preorder in?


I don't think you can preorder, at least till the 10th. Just give up already. You are just wasting your time.

AMD failed to deliver.


----------



## $ilent

Quote:


> Originally Posted by *kot0005*
> 
> I dont think you can preorder atleast till the 10th. Just give up already. You are just wasting your time.
> 
> AMD failed to deliver.


Wait so we could preorder from the 3rd, but people in the US have to wait another week? Where did you read that?


----------



## wermad

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> You should ask that in AMDZone or some gaming forums.
> 
> I don't think many people here would do a pre-order without knowing the exact price and benchmarks.
> Quote:
> 
> 
> 
> Originally Posted by *kot0005*
> 
> I dont think you can preorder atleast till the 10th. Just give up already. You are just wasting your time.
> 
> AMD failed to deliver.
Click to expand...

I definitely wouldn't pre-order without knowing for certain the product I'm going to purchase. I'm curious, since other countries/sites are offering pre-orders, with some offering a refund on the deposit. Kind of strange the US is not getting the option to at least preorder. Sounds like AMD has dropped the ball big time, which makes this whole launch look even worse.

I'm waiting for the MSRP release and reviews to pass final judgement. Honestly, if MSRP is $699, I won't bother. Titans are selling for this price and I already have blocks for them. Quad Titans would be a much easier transition if I can get four (and get them past the missus) for close to this speculated MSRP. If MSRP is ~$599, then reviews will make the final decision for me







.


----------



## $ilent

Where are these Titans that are $700?


----------



## szeged

Quote:


> Originally Posted by *$ilent*
> 
> Where are these Titans that are $700?


you must have strong ebay kung fu, my titans were mostly $800 each and came with waterblocks


----------



## kot0005

Quote:


> Originally Posted by *$ilent*
> 
> Wait so we could preorder from the 3rd, but people in the US have to wait another week? Where did you read that?


No, everyone has to wait. I don't know if it will be a week or a month, but I think AMD is collecting data to price it based on the interest.


----------



## $ilent

Quote:


> Originally Posted by *kot0005*
> 
> No, everyone has to wait. I dont know if it will be a week or a month, but I think AMD is collecting data to price it based on the interest.


Ive preordered mine.


----------



## kot0005

Quote:


> Originally Posted by *$ilent*
> 
> Ive preordered mine.


Not everyone is as awesome as Gibbo..

Do you have any discount codes for OCUK? The shipping costs me $37, which isn't too bad. So I might get it from the OCUK store if AU retailers price it ridiculously.


----------



## $ilent

Forum members get free shipping.


----------



## wermad

Quote:


> Originally Posted by *$ilent*
> 
> Where are these Titans that are $700?


eBay lets you see sold listings, and if you check current listings, there are quite a few sitting at ~$800 USD. Now, six months ago ppl would have scooped these up. Then look at the auction listings: most are ending at ~$750-775. Factor in that you're paying ~11-12% in eBay fees (including PayPal), and you can expect forum prices to be ~$700-750. I'm also seeing wanted listings in the forums, and ppl are asking for ~$700-750.

Quote:


> Originally Posted by *szeged*
> 
> you must have strong ebay kung fu, my titans were mostly $800 each and came with waterblocks


There you go. Factor in that the block is probably ~$50-75, so that brings it down to ~$700-750 *then*. eBay is always a double-edged sword, so I'm always wary when either selling or buying there.
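The fee math above is simple enough to sketch. A rough Python version, where the ~11.5% combined eBay + PayPal fee and the 10-15% forum discount are the poster's ballpark estimates, not official rates:

```python
# Resale math sketch. Fee and discount figures are forum estimates,
# not official eBay/PayPal rates.

def net_after_fees(sale_price, fee_rate=0.115):
    """Seller's take-home from an eBay sale after combined eBay + PayPal fees."""
    return sale_price * (1 - fee_rate)

def forum_price_range(ebay_price, discount=(0.10, 0.15)):
    """Expected forum asking range: eBay price minus the typical 10-15% discount."""
    lo, hi = discount
    return ebay_price * (1 - hi), ebay_price * (1 - lo)

if __name__ == "__main__":
    # A Titan auction ending around $775 on eBay:
    print(round(net_after_fees(775)))   # seller nets roughly $686
    print(forum_price_range(775))       # forum range roughly $659-698
```

Which is why a card ending at ~$750-775 on eBay translates to a ~$700ish forum price: the seller clears about the same money either way.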


----------



## Regent Square

When will pre-orders go live so I can ditch my dust-collecting cash?!


----------



## wstanci3

Quote:


> Originally Posted by *wermad*
> 
> Ebay allows you to see sold listings and also if you check current listings, there are quite a few sitting at ~$800 USD. Now, six months ago ppl would have scooped this up. Then look at the auction listings, Most are ending ~$750-775. Factor in that you're paing ~11-12% in ebay fees (including paypal), and you can expect forum prices to be ~$700-750. I'm also seeing wanted listings in the forums and ppl are asking for ~$700-750.
> There you go. Factor in the block is probably ~$50-75 so that brings it down to ~$700-750 *then*. E bay is always a double sided sword, so I"m always weary when either selling or buying there.


I wish there was somewhere other than eBay to sell. I've lost love for that site.


----------



## szeged

I'm done with eBay forever though; too many scammers lately. They either break the item when they get it and demand a refund even though it's their fault, or they take your payment and straight up try to ignore you, acting like you won't get your money back. Only selling and buying stuff from people I know / the OCN market from now on.


----------



## nvidiaftw12

Quote:


> Originally Posted by *szeged*
> 
> im done with ebay forever though, too many scammers lately, either break the item when they get it and demand a refund even though its their fault, or get your payment and straight up try to ignore you acting like you wont get your money back. only selling and buying stuff from people i know/ocn market from now on.


Craigslist is good for some things, too. Cash in hand, then forget about 'em.


----------



## wermad

Craigslist has its nuggets if you're patient. I've heard numerous stories of ppl finding outrageous deals on parts. Most are being sold by ppl who have no idea of their market value. Or an ex-lover getting their revenge (ouch!







).


----------



## szeged

Quote:


> Originally Posted by *wermad*
> 
> Craiglist has its nuggets if you're patient. I've heard numerous stories of ppl finding outrageous deals on parts. Most are being sold by ppl who have no idea on their market value. Or an ex-lover getting their revenge (ouch!
> 
> 
> 
> 
> 
> 
> 
> ).


I live in a town with 3 major colleges and a few not-as-big colleges. When the kids are moving out for the summer, or whenever classes are over, 99% of them just give their stuff away if it can't fit in their car; the other 1% basically charge 10% of the actual retail price. I love it: got a pool table that was normally $3500 for free. The only downside was it was on the 3rd floor of the dorms and we had to get it out ourselves -_-


----------



## LaBestiaHumana

Quote:


> Originally Posted by *wermad*
> 
> Craiglist has its nuggets if you're patient. I've heard numerous stories of ppl finding outrageous deals on parts. Most are being sold by ppl who have no idea on their market value. Or an ex-lover getting their revenge (ouch!
> 
> 
> 
> 
> 
> 
> 
> ).


My first Titan was $700 on Craigslist. I drove from Chicago to Cincinnati, plus $70 in gas. In April.
I was able to test it and register it with the original receipt.

I've seen some great deals, but lately people are overpricing stuff. I also noticed that computer hardware is really hard to sell unless you price it aggressively.


----------



## nvidiaftw12

95% of what I see is people trying to sell age-old, slow-to-begin-with Dells for $300. Then a few gaming rigs that people call super fast when they are not, and then that small percentage of good deals.


----------



## Kinaesthetic

Quote:


> Originally Posted by *szeged*
> 
> i live in a town with 3 major collages and a few not as big collages, when the kids are moving out for the summer or whenever classes are over, 99% of them just give their stuff away if it cant fit in their car, the other 1% basically charge 10% of the actual retail price, i love it, got a pool table that was normally $3500 for free, only downside was it was on the 3rd floor of the dorms and we had to get it out ourselves -_-


Not as bad as when a friend and I had to carry a leather loveseat down from the 5th floor of a dorm through a cramped stairwell. It was a total pain. But damn, it was worth it since it was free. Too bad my dog tore it up >.>

I've also picked up some used PSUs/CPUs off of Craigslist for dirt cheap (and not bad gear either).


----------



## FtW 420

Quote:


> Originally Posted by *wermad*
> 
> Lol. How I counter lowball offers, offer highball


Excellent idea, I have to remember that.

Feed the guy a few highballs & then try the lower offer again...


----------



## szeged

Quote:


> Originally Posted by *Kinaesthetic*
> 
> Not as bad as when I and a friend had to carry a leather loveseat down from the 5th floor of a door down a cramped stairwell. Was a torrential pain. But damn it was worth it since it was free. Too bad my dog tore it up >.>
> 
> I've also picked up some used PSUs/CPUs off of Craigslist for dirt cheap (and not bad gear either).


lol loveseats are always the worst to get around tight bends









all the computer related stuff in my town on craigslist is like " have this old computer with some stuff in it, i payed $1500 for it brand new about 10 years ago so i wanna get about $1200 for it total"

then you look at the picture and its like


----------



## wstanci3

Quote:


> Originally Posted by *szeged*
> 
> lol loveseats are always the worst to get around tight bends
> 
> 
> 
> 
> 
> 
> 
> 
> 
> all the computer related stuff in my town on craigslist is like " have this old computer with some stuff in it, i payed $1500 for it brand new about 10 years ago so i wanna get about $1200 for it total"
> 
> then you look at the picture and its like


That case has plenty of good airflow.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *szeged*
> 
> lol loveseats are always the worst to get around tight bends
> 
> 
> 
> 
> 
> 
> 
> 
> 
> all the computer related stuff in my town on craigslist is like " have this old computer with some stuff in it, i payed $1500 for it brand new about 10 years ago so i wanna get about $1200 for it total"
> 
> then you look at the picture and its like


LMAO, yep I've seen those posts.


----------



## CynicalUnicorn

Quote:


> Originally Posted by *szeged*
> 
> all the computer related stuff in my town on craigslist is like " have this old computer with some stuff in it, i payed $1500 for it brand new about 10 years ago so i wanna get about $1200 for it total"


Guys... Stahp. Why. $1500 ten years ago is $100 today and mostly for the OS license.


----------



## $ilent

Hey, also: the BF4 290X bundle comes with just normal BF4, not the Premium edition.


----------



## Majin SSJ Eric

I generally only buy and sell computer hardware here on OCN. Much easier to trust people here and much more likely to deal with someone that knows what they are talking about.


----------



## $ilent

Anyone else noticed gtx 780 prices have all been reduced?

Unless I'm mistaken, all GTX 780s up until the past few days were minimum £550; now they are as low as £470 brand new.


----------



## szeged

Quote:


> Originally Posted by *$ilent*
> 
> Anyone else noticed gtx 780 prices have all been reduced?
> 
> Unless Im mistaken all gtx 780s up until like past few days were minimum £550, now they are as low as £470 brand new.


from where?


----------



## wermad

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709&IsNodeId=1&Description=gtx%20780&bop=And&Order=PRICE&PageSize=100

Except for a few specials, most are still hovering at MSRP at the egg.


----------



## $ilent

Quote:


> Originally Posted by *szeged*
> 
> from where?


On the UK sites Scan and OCUK.

I'd have thought I would have noticed, but I swear all 780s have been around £550 until this week.

Also, wermad, those 780s are $630 brand new, so surely you wouldn't be seriously expecting over $500 for each of your cards?


----------



## BBEG

EVGA just did or is doing a 15% discount on 780s as well.


----------



## wermad

Quote:


> Originally Posted by *$ilent*
> 
> on UK sites scan and at OCUK.
> 
> I would have noticed id of thought, but I swear all 780s have been around £550 until this week.
> 
> Also wermad those 780s are $630 brand new, so surly you wouldnt be seriously expecting over $500 for your cards each?


nice try







. Only one is at $630.

Edit: selling tip - EVGA SC versions sell quicker and command more money. I sold my Asus Titan for $860 and the EVGA SC sold for $960. Seems like the SC always sells quicker and for the most cash. That's why I invested a little bit more in those


----------



## $ilent

Well, your card is $650 online. So minus 20% for the second-hand price, you're looking at only $520, and even then people appraise at less than 20% off new prices. So yeah, $500 would be about right?


----------



## Stay Puft

Quote:


> Originally Posted by *$ilent*
> 
> Where are these Titans that are $700?


I'm watching an eBay Titan at $660 right now. I think the Titan market is going to crash after the 15th, which is great because I'd love a few used Titans


----------



## wermad

Quote:


> Originally Posted by *$ilent*
> 
> Well your card is $650 online. So minus 20% for second hand price your looking at only $520, and even then people appraise less than 20% off new prices. So yeah $500 would be about right?


Nah, that's an EVGA sale; it doesn't represent the market as a whole. Some of those sales are out anyway since they charge California sales tax... Lol, I've sold a few things already (checks trader rating again!) so I'll know how to price the 780s if they go







.


----------



## Regent Square

Quote:


> Originally Posted by *$ilent*
> 
> Well your card is $650 online. So minus 20% for second hand price your looking at only $520, and even then people appraise less than 20% off new prices. So yeah $500 would be about right?


I would not buy 780 even for 500$.

Why did you pre order 9970? I mean, u can buy 780 cheap in UK now.


----------



## wstanci3

Quote:


> Originally Posted by *wermad*
> 
> Nah, that's evga sale, it doesn't represent the market as a whole. Some sales are done since they charge California Sales tax.. Lol, I've sold a few things already (checks trader rating again!) so I'll know how to price the 780s if they go
> 
> 
> 
> 
> 
> 
> 
> .


So, what would your lowest price be, hypothetically, if you were selling?


----------



## $ilent

Quote:


> Originally Posted by *Stay Puft*
> 
> I'm watching an eBay titan at 660 right now. I think the Titan market is going to crash after the 15th which is great because I'd love a few used titans


Someone has a used GTX Titan on eBay for £640 BIN or best offer, but it finishes in 3 hours, which is 7 AM here. What an odd time to have a listing end...


----------



## Stay Puft

Quote:


> Originally Posted by *Regent Square*
> 
> I would not buy 780 even for 500$.
> 
> Why did you pre order 9970? I mean, u can buy 780 cheap in UK now.


That's because you're an AMD fan. A $500 780 is an amazing deal


----------



## Regent Square

Quote:


> Originally Posted by *Stay Puft*
> 
> Thats because you're an amd fan. A 500 dollar 780 is an amazing deal


I don't buy non supported products.


----------



## szeged

Quote:


> Originally Posted by *Stay Puft*
> 
> Thats because you're an amd fan. A 500 dollar 780 is an amazing deal


People could be giving out Titans and 780s, and AMD fans would still call them bad and stay with their 7970s.


----------



## Stay Puft

Quote:


> Originally Posted by *Regent Square*
> 
> I don't buy non supported products.


What is your current rig anyway?


----------



## wermad

As much as I would like to discuss the sale of my gear, it's against the ToS. Didn't Pio already step in and mention this? I'll be refraining from this topic at this point, but I can discuss other things







. Check the market; anything I have for sale will be in my signature and in the market


----------



## $ilent

Quote:


> Originally Posted by *Regent Square*
> 
> I would not buy 780 even for 500$.
> 
> Why did you pre order 9970? I mean, u can buy 780 cheap in UK now.


A GTX 780 is still the best part of £500 in the UK. Why would I buy one of those when there is a good chance the 290X might not cost £500, plus it's better than a 780?

Quote:


> Originally Posted by *wstanci3*
> 
> So, what would your lowest price you would consider? $575? $600?


You do realise the EVGA 780 SC is $650 new, right?


----------



## Majin SSJ Eric

What are you guys smoking? $500 is a great deal on a 780...


----------



## Regent Square

Quote:


> Originally Posted by *$ilent*
> 
> gtx 780 is still best part of £500 in UK. Why would I buy one of those when there is a good chance the 290X might not cost £500, plus its better than a 780
> You do realise the EVGA 780 SC is $650 new right?


Ok, what about if it is not cheaper; will u cancel?


----------



## wstanci3

Quote:


> Originally Posted by *$ilent*
> 
> gtx 780 is still best part of £500 in UK. Why would I buy one of those when there is a good chance the 290X might not cost £500, plus its better than a 780
> You do realise the EVGA 780 SC is $650 new right?


Yes, I do...
But it is a Classified. That's all I got.


----------



## szeged

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> What are you guys smoking? $500 is a great deal on a 780...


But it's an Nvidia card! It's obviously an inferior card, even if it were free!


----------



## nvidiaftw12

Quote:


> Originally Posted by *Regent Square*
> 
> I don't buy non supported products.


What do you mean nVidia cards aren't supported?

Quote:


> Originally Posted by *wermad*
> 
> As much as I would like to discuss the sale of my gear, its against the ToS. Didn't Pio already stepped in and mentioned this? I'll be refraining from this topic at this point but I can discuss other things
> 
> 
> 
> 
> 
> 
> 
> . Check the market, anything I have for sale will be in my signature and in the market


Wait, what? Since when have we not been able to talk about stuff we have for sale in the forums? So we can link it in our sigs, but not posts? That's ******ed.


----------



## $ilent

I haven't decided yet; even £400 is a lot of money for a piece of plastic that will be obsolete in 3 months.
Quote:


> Originally Posted by *wstanci3*
> 
> Yes, I do...
> But it is a Classified. That's all I got.


His 780s aren't Classifieds... unless I have the wrong end of the stick and you're saying you already own a Classified, and you're trying to gauge how much to sell yours for?


----------



## Regent Square

Quote:


> Originally Posted by *szeged*
> 
> people could be giving out titans and 780s, amd fans would still call them bad and stay with their 7970.


Dear, I have 3 Titans. Sold 7990. Calling me an AMD fan is :wrong: in all senses.

I would buy 780 for 400$ though.


----------



## wstanci3

Quote:


> Originally Posted by *$ilent*
> 
> I havent decided yet, even £400 is alot of money for a piece of plastic that will be obsolete in 3 months.


That's the nature of the beast.


----------



## szeged

Quote:


> Originally Posted by *Regent Square*
> 
> Dear, I have 3 Titans. Sold 7990. Calling me an AMD fan is :wrong: in all senses.
> 
> I would buy 780 for 400$ though.


Wasn't referring to you







 Why'd you sell the 7990?

I'd buy a 780 for $400; I would also buy one for $500. $600 is pushing it unless it's a great overclocker (used, that is).


----------



## Regent Square

Quote:


> Originally Posted by *szeged*
> 
> wasnt refering to you
> 
> 
> 
> 
> 
> 
> 
> whyd you sell the 7990?
> 
> id buy a 780 for $400, i would also buy one for $500, $600 is pushing it unless its a great overclocker(used that is)


U want Titan almost new for 800$?

Edit; nm,got a dealer already.


----------



## szeged

Quote:


> Originally Posted by *Regent Square*
> 
> U want Titan almost new for 800$?


I got my first two Titans for $800 each, and they came with EK and Hydro Copper waterblocks









If I needed Titans right now, I would pay $800 for them; fortunately I already have mine. Hopefully Nvidia drops the price on Titans so that used ones are more affordable for everyone.


----------



## Regent Square

Homer Simpson at his finest...


----------



## szeged

lol


----------



## bencher

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> What are you guys smoking? $500 is a great deal on a 780...


I wouldn't buy a GTX 780, or any card, for $500


----------



## szeged

which you have stated a thousand times, thanks for reminding us.


----------



## Regent Square

Quote:


> Originally Posted by *bencher*
> 
> I wouldn't buy a Gtx 780 or any card for $500


Yea, bring 350 $ for X70 part back. Bencher`s got ***** together, the guy knows what is the deal.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *bencher*
> 
> I wouldn't buy a Gtx 780 or any card for $500


It's a 650 dollar card for 500. Isn't that your thing? Paying as little as you can for something that's worth more?


----------



## wermad

edit:
delete


----------



## bencher

Quote:


> Originally Posted by *szeged*
> 
> which you have stated a thousand times, thanks for reminding us.


Quote:


> Originally Posted by *LaBestiaHumana*
> 
> It's a 650 dollar card for 500. Isn't that your thing? Paying as little as you can for something that's worth more?


Hmm, it's worth more to you.

For example, I bought my 7870 for $200 + tax. That's what it's worth to me, and it should never have been $350.

The 7970... never in a million years would I spend $550 on that. To me that card is worth $320 at best.

For $550 I would consider a 7990 or GTX 690. Those are worth $550 to *me*.


----------



## szeged

Which is why I don't get why people bash Titan owners; obviously the Titans are worth the price to *us*


----------



## th3illusiveman

Quote:


> Originally Posted by *Ashuiegi*
> 
> for these price it s insane , i would go for a 7990 anytime over this. for this price you got the 7990 and the waterblock


These prices are insane. I knew from the moment it launched that the Titan would screw *everyone* over, despite what those die-hard Nvidia fanboys were claiming.

All those "trusted" reviewers, instead of actually calling Nvidia out on that blatant cash grab, started sucking the long green pipe, and now $600 and $700 GPUs are the norm because apparently $500 isn't enough anymore for "high end".

As for Nvidia, I hope their greed catches up to them one day, and if AMD follows their lead then they can suffer the same fate for all I care.

Yes, my jimmies are indeed rustled at the moment.


----------



## wermad

Not sure if you'll have seen this:

So, this is pointing towards an MSRP of $699.

http://cdn4.wccftech.com/wp-content/uploads/2013/10/AMD-Radeon-R9-290X-Newegg.jpg


----------



## Regent Square

Quote:


> Originally Posted by *wermad*
> 
> Not sure if you'll have seen this:
> 
> 
> 
> So, this is pointing towards an MSRP of $699
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://cdn4.wccftech.com/wp-content/uploads/2013/10/AMD-Radeon-R9-290X-Newegg.jpg


Might as well post a "GTX 680 official pricing list".


----------



## LaBestiaHumana

Quote:


> Originally Posted by *bencher*
> 
> Hmm it's worth more to you.
> 
> For example I bought my 7870 for $200 + tax. That's what it's worth to me and should have never been $350.
> 
> 7970.... never in a million years would I spend $550 on that. To me that cards is worth $320 at best.
> 
> For $550 I would consider a 7990 or gtx 690. Those are worth $550 to *me*.


Lmao, you're saying it's worth less because it's Nvidia? Wow!

I mean, who sells 780s for $350?

I've seen 7970s and 7950s around $300, which is a good deal. But the 780 cream-pies both of those cards, so why would it be priced in the same category? Just 'cause you feel it should be $350? Get real, man.


----------



## bencher

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Lmao, you're saying because it's Nvidia its worth less? Wow!
> 
> I mean who sells 780s for 350?
> 
> I've seen 7970 and 7950 around 300. Which is a good deal. But 780 cream pies both of those cards, why would it be priced in the same category? Just cause you feel it should be 350? Get real man.


Where and when did I say that?


----------



## kpoeticg

Quote:


> Originally Posted by *wermad*
> 
> Not sure if you'll have have seen this:
> 
> 
> 
> So, this is pointing towards an MSRP of $699
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://cdn4.wccftech.com/wp-content/uploads/2013/10/AMD-Radeon-R9-290X-Newegg.jpg


Didn't the OP update it with that TPU article saying $599?

That's the price I've been hoping for since the announcement, so I can order 2 on the release date.


----------



## Testier

Flagship cards are worth $600-700 to me; dual-GPU cards are $900-1,000.


----------



## fleetfeather

Quote:


> Originally Posted by *Testier*
> 
> For me, Flagship cards are worth between 600-700 to me. dual gpus are at 900-1000 for me.


Flagship $500

Dual-GPU $800-$1000 depending on crippling


----------



## th3illusiveman

Quote:


> Originally Posted by *fleetfeather*
> 
> Flagship $500
> 
> Dual-GPU $800-$1000 depending on crippling


this


----------



## Durquavian

Quote:


> Originally Posted by *fleetfeather*
> 
> Flagship $500
> 
> Dual-GPU $800-$1000 depending on crippling


I agree this seems more logical.


----------



## wermad

Quote:


> Originally Posted by *kpoeticg*
> 
> Didn't the OP update it with that TPU article saying $599?
> 
> That's the price I've been hoping for since announcement so i can order 2 on the release date


I believe this one was outdated by the tpu.com announcement. Hehehe, the anticipation is getting the better of us. Good job, AMD, good job.

edit: I see the new article; it was released after this one I posted. The tpu.com article in the OP says $699 (so does the thread title).


----------



## nvidiaftw12

Quote:


> Originally Posted by *fleetfeather*
> 
> Flagship $500
> 
> Dual-GPU $800-$1000 depending on crippling


Maybe up to $550 for flagship but that's it.


----------



## wermad

You're forgetting how AMD pushed the pricing scheme with the 9590. It didn't really work out for them, but I'm sure they won't shoot for the skies this time. My guesstimate is ~$650-699.


----------



## wstanci3

How's it look?
$699-290x BF4 Edition
$649-290x
$499-290


----------



## th3illusiveman

Quote:


> Originally Posted by *wstanci3*
> 
> How's it look?
> $699-290x BF4 Edition
> $649-290x
> $499-290


They won't charge you extra for a game they will be offering in their Never Settle bundle with most of their cards....


----------



## wstanci3

Quote:


> Originally Posted by *th3illusiveman*
> 
> they won't charge you extra for a game they will be offering in their Never settle bundle with most of their cards....


Comes with BF4 Premium...


----------



## fateswarm

Quote:


> Originally Posted by *wermad*
> 
> dictator Nvidia pulled the reins back to protect their Titan cash cow


Do you live in an imaginary world? There is no evidence the Titan sells much outside of enthusiasts, and NVIDIA even expressed surprise that it sold at all (there was a shortage at first because supply was extremely limited). In all likelihood cards do not sell easily above $400 (unless they are professional-grade), and various indications show that. If you want to start seeing it realistically, realize GK110 isn't as cheap as GK104 to manufacture, at least at the point of release and the initial pricing.


----------



## kot0005

Quote:


> Originally Posted by *Testier*
> 
> For me, Flagship cards are worth between 600-700 to me. dual gpus are at 900-1000 for me.


I'd go up to $600 for a flagship, but too bad there's none available at that price with flagship-rated performance.


----------



## wermad

Quote:


> Originally Posted by *fateswarm*
> 
> Do you live in an imaginary world? There is no evidence Titan sells a lot outside enthusiasts, and NVIDIA even expressed surprise it even sold at all (they had a shortage at first because the pieces were extremely limited). In all likelihood they do not sell easily above $400 (unless the cards are Professional-grade) and that is shown from various indications, if you want to start seeing it realistically realize GK110 isn't as cheap as GK104 to manufacture, at least at the point of release and the initial pricing.


Lol, I have yet to visit your world. Bottom line: Nvidia went to great lengths to ensure the Titan stayed king all this time, hence the limitations they added to the 780s. That's my gripe.


----------



## szeged

Quote:


> Originally Posted by *wermad*
> 
> Lol, I have yet to visit your world. Bottom line, Nvidia went to great lengths to ensure Titan stayed king all this time. Hence the limitations they added to 780s. That's my gripe


If they wanted the Titan to truly be king, it should have come with the 780 Classified's power section.


----------



## kpoeticg

Quote:


> Originally Posted by *wermad*
> 
> I believe this was ousted after the tpu.com announcement. Hehehe, the anticipation is getting the most of us. Good job Amd, good job
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: I see the new article, it was released after this one i posted. Tpu.com article in the op says $699 (so does the thread title).


I'm talking about THIS ONE in the op. So that one's outdated then?


----------



## nvidiaftw12

Yup.


----------



## kpoeticg

Blah. There are 3 links in the OP, and the one with $599.99 is the only one that says "Updated" next to it. Misleading...


----------



## fateswarm

Quote:


> Originally Posted by *wermad*
> 
> Lol, I have yet to visit your world. Bottom line, Nvidia went to great lengths to ensure Titan stayed king all this time. Hence the limitations they added to 780s. That's my gripe


I think you are confusing hype with financial success. The Titan is a luxury product and their flagship for advertisement. There are various indications it is not a cash cow, especially since their own analysis showed it would hardly sell at all, even if it sold a bit more than anticipated. That is not the only or even the strongest indication, but it's one that is easily found.


----------



## wermad

Quote:


> Originally Posted by *fateswarm*
> 
> I think you are confusing hype with financial success. Titan is a Luxury product and their flagship for advertisement. There are various indications, it is not a cash cow though, especially since their own analysis showed it would not sell at all, even if it sold a bit more than anticipated, and that is not the only or even stronger indication, but it's one that is easily found.


It's based on a crippled SKU. It's not an entirely new SKU, hence the continual attempt to generate revenue from an older one, also known as a cash cow. The 780 is a cash cow as well, if we're going to get technical.

Seems like people just like to nit-pick at others' posts. Not sure if this is due to the anticipation of AMD.


----------



## fateswarm

Quote:


> Originally Posted by *wermad*
> 
> Its based on a crippled sku. Its not an entirely new sku and hence the continual attempt to revenue from an older sku. Also known as cash cow. 780, is a cash cow as well if we're going to get technical
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Seems like ppl just like to nit-pick at other's post. Not sure if this is due to the anticipation of Amd


Using random partial technical information doesn't make your old argument right. You are not even right about those details: the Titan is not "crippled", it just misses a marginal part of the die, which I doubt is noticeable outside benchmarks.


----------



## wermad

Quote:


> Originally Posted by *kpoeticg*
> 
> I'm talking about THIS ONE in the op. So that one's outdated then?


Yes, it's old. Lame. If you go back to the OP, it's the link at the very top of the post:

http://www.techspot.com/news/54229-pre-orders-begin-for-amd-radeon-r9-290x-likely-priced-at-699.html


----------



## wermad

Quote:


> Originally Posted by *fateswarm*
> 
> Using random partial technical information doesn't make your old argument right. You are not even right about those. Titan is not "crippled", it just misses a marginal part of the die that I doubt it's noticeable outside benchmarks.







It's pretty common to use "lesser", ahem, crippled parts to make something "cheaper". I'm sorry, that's as simple as I can put it. Thank you for the engaging but totally unnecessary dialogue.


----------



## wermad

Quote:


> Originally Posted by *fateswarm*
> 
> Do you have any arguments? Or only memes and taunts? I'm not a child so they don't work on me.


It's a JT vid, c'mon now! Gotta count for something. I'll stop here and leave you to chase your own tail.

Btw, says the ex-owner of two Titans and current owner of triple 780s.

Something to keep this on-topic:

Should say "also, select countries".


----------



## Baghi

Quote:


> Originally Posted by *kpoeticg*
> 
> I'm talking about THIS ONE in the op. So that one's outdated then?


The announcement of the R9 290X is news; the price isn't. I just updated the title based on the most recent article/post from TechSpot. When the official price is finally out, I'll update the title with the word "official". The most recent update is at the top, and anything below it is outdated.


----------



## kpoeticg

Thanx for clearing that up. The "blah" wasn't meant for you =P It was meant towards the price "not" being officially $599 =)


----------



## wermad

If you guys remember, the 6990 was boldly priced and retailers had fun hiking that price up. So it's not new territory for AMD (albeit the 7990 played with the $1k crowd), and I'm sure the 290X will outduel a $699 (MSRP) 6990.

Based on the "BF4 bundle" speculation of $729, $699 seems more plausible. But nothing's "official" yet, so I'm still keeping an open mind for now.


----------



## Blackops_2

Waiting for performance, then if it lives up, waiting for price drop/competition to ensue.


----------



## maarten12100

Quote:


> Originally Posted by *kot0005*
> 
> I dont think you can preorder atleast till the 10th. Just give up already. You are just wasting your time.
> 
> AMD failed to deliver.


The NDA lifts on the 15th, and the same performance from a smaller die is a big win.
That's what the 290X will be: a big win for something on the 28nm node.


----------



## Dart06

So did AMD say anything about the wait in NA for preorders yet?


----------



## kot0005

Quote:


> Originally Posted by *Dart06*
> 
> So did AMD say anything about the wait in NA for preorders yet?


There was no pre-order plan to begin with; AMD just worded their "register your interest" as "pre-order". It shows how inexperienced AMD is with marketing.

If this card fails to live up to the hype, AMD will lose a lot of customers and potential future AMD product buyers.

Just like how Diablo 3 ended up...


----------



## XxOsurfer3xX

As soon as AMD gives real price and there are reviews, I'm buying. This card is looking beastly.


----------



## amd655




----------



## maarten12100

Quote:


> Originally Posted by *kot0005*
> 
> There was no pre order plan to begin with, AMD just worded their "register for Interest" as "pre order". It shows how inexperienced AMD is with marketing.
> 
> If this card fails to live up to the Hype, AMD will loose a lot of customers and potential future AMD product buyers.
> 
> Just like how Diablo 3 ended up...


We already know it won't fail to deliver, but how great it is still depends on the final, non-placeholder price after the pre-order phase is over.
You're not really fond of AMD's new card, are you?


----------



## kot0005

Quote:


> Originally Posted by *maarten12100*
> 
> we already know it won't fail to deliver but how great it is still depends on final non place holder price after the pre order phase is over.
> You're not really font of AMD's new card, are you?


I need a GPU upgrade! So I was really looking forward to the R9 290X, but I started losing hope after the live event. I even woke up at 5am to watch it! I registered for the card with my local retailers, and I will preorder it if it's within my $800 budget.


----------



## wstanci3

Quote:


> Originally Posted by *$ilent*
> 
> I havent decided yet, even £400 is alot of money for a piece of plastic that will be obsolete in 3 months.
> His 780s arent classifieds...unless i have the wrong end of the stick and your saying your already own a classified and your trying to gauge how much to sell yours for?


Wow, for some reason I thought he had Classifieds.


----------



## maarten12100

Quote:


> Originally Posted by *kot0005*
> 
> I need an GPU upgrade! So I was really looking foreword to the R9 290X, but i started loosing hope since the live event, I even woke up at 5am to watch it! I registered for the card with my local retailers, I will preorder it if its within my $800 budget.


It'll be a great card according to both [H] Kyle's sources and the XS moderator. That being said, I don't think you'll have it before the 15th anyway.


----------



## kot0005

Also, I asked EK whether the blocks for this card will be released on the same day or if we'll have to wait. Their reply was that they will be available on the same day.

https://www.facebook.com/photo.php?fbid=567549039965798&set=a.506206946100008.1073741826.182927101761329&type=1&theater
9 more days to go..


----------



## fleetfeather

Quote:


> Originally Posted by *kot0005*
> 
> I need an GPU upgrade! So I was really looking foreword to the R9 290X, but i started loosing hope since the live event, I even woke up at 5am to watch it! I registered for the card with my local retailers, I will preorder it if its within my $800 budget.


You already know it won't hit us at <$800. PCCG jacks up the price in the first month of release. They put the 780 Lightning at $960 when it hit us, then dropped it back to $900 a month later ('cos that's a bargain now, right?).


----------



## Tisca

Anything on the 290 (not the X)? Specs, price speculation, performance? That one will certainly be of better value.


----------



## CynicalUnicorn

Quote:


> Originally Posted by *wermad*
> 
> You'll forgetting how Amd pushed the pricing scheme with the 9590. It didn't really work out for them but I'm sure they won't shoot for the skies this time. My guesstimate is ~$650-699.


But remember that 9590s got their prices slashed in half, and its little brother, the 9370, has too and is at a more reasonable sub-$300. Granted, that's double an 8320 for less than a 1GHz overclock, but it only took a couple of months. I foresee these quickly dropping in price to $500-550, if only because AMD seems to get more bad press than Intel/Nvidia for the same stuff, and not just because Nvidia cuts prices on 780s.


----------



## kot0005

Quote:


> Originally Posted by *fleetfeather*
> 
> you already know it wont hit us @ <$800. PCCG rakes up the price first month of release. They put the 780 Lightning @ $960 when it hit us, then dropped back to $900 a month later (cos that's a bargain now, right)


Well, there is no official price yet. If it's above $800, I'd rather try buying it from NCIX.ca or Amazon. Right now the 780 Classified comes to $766 AUD shipped for me, so I am looking forward to that as well. Not paying $900 for it...

Also, the Classified is $699 USD and the Lightning is $769 USD. The Classified is much better in terms of overclocking with the Samsung memory(?).


----------



## jojoenglish85

I'm so confused at this point, lol. I have seen videos of SLI 670s running at 7680x1440, which I will be on in the next week, and all I want is a pair of cards good enough to play at mostly High and some Ultra settings. Which cards that are available now will do that?


----------



## wermad

Quote:


> Originally Posted by *CynicalUnicorn*
> 
> But remember that 9590s got their prices slashed in half and its little brother, the 9370, has too and is at a more reasonable sub-$300. Granted, that's double an 8320 for less than a 1GHz ocerclock, but it only took a couple months. I foresee these quickly dropping in price to $500-550 if only because AMD seems to get more bad press than Intel/Nvidia for the same stuff and not just because Nvidia cuts prices on 780s.


If it delivers 780 performance, it's sure to stay around $600-650. The only scenario where I see AMD dropping the price is with the GTX 8xx series (aka Maxwell). When the 680 came out at $500 and traded blows with the $550 7970, AMD dropped the price. As their Tahiti aged, they dropped the price even further. Now that the 280X is ~$300, you're seeing 7970s at ~$300 (same card).

Well, I hope you're right and they seriously drop the price to $500. I would definitely be in line for quads at that price.


----------



## KaiserFrederick

Quote:


> Originally Posted by *kot0005*
> 
> I registered for the card with my local retailers, I will preorder it if its within my $800 budget.


Lucky you, I went down to my local retailer (MSY) to ask when they'd be getting the 290X in stock and they didn't even know what it was! Not sure whether that's incompetence on their part, really bad communication on AMD's end, or a bit of both


----------



## rusirius

What a sad situation we have: people posting about blindly preordering, and this behavior is tolerated by the members. These members are harmful to our hobby.


----------



## kot0005

Quote:


> Originally Posted by *KaiserFrederick*
> 
> Lucky you, I went down to my local retailer (MSY) to ask when they'd be getting the 290X in stock and they didn't even know what it was! Not sure whether that's incompetence on their part, really bad communication on AMD's end, or a bit of both


I registered with PCCG, Scorptech and Mwave. I wouldn't really buy an expensive unreleased GPU from MSY or any stores other than Scorptech, PCCG and maybe Umart.

PCCG and Scorptech charge a premium compared to other stores, but it's worth it. Their RMA is excellent! I bent my old motherboard's CPU socket pins by accident and they repaired it for free! Well, it was under warranty, but physical damage would normally void it. I had a build-quality issue with a Seasonic 1000W Platinum on day 1 and they gave me a refund the next day. Scorptech is good with warranty too; I got my GTX 570 graphics card exchanged for a new one because I was worried about the temperatures getting too high (80-90°C).


----------



## KaiserFrederick

Quote:


> Originally Posted by *kot0005*
> 
> I registered with PCCG, Scorptech and Mwave. I woudnt really buy a Expensive unreleased GPU from MSY or anyother stores than scorptech,pccg and may be umart.


Yeah, I mostly use PCCG for my hardware, I was just interested in when Australian retailers would be receiving shipments of the 290X, as we're usually up to a week behind US stores.


----------



## Axon14

Wait what, places are taking pre-orders? Screw that noise.


----------



## diggiddi

Nope, bruh u need to upgraaade!


----------



## maarten12100

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> You think my dual Xeon 8 cores can handle it at medium? With AA off at 800 * 600 ?
> 
> But seriously, I can probably run it on the iGPU of a server mobo.


If it is Sandy, yes; if you're running older ones from the 45nm era and are counting HT threads as cores, then no.
Games fail to utilize multiple cores, though that will change a bit in the coming years.


----------



## wermad

If we go back to the RIVE BF3E, it sold for an extra $20 over the non-BF3 RIVE. So the TPU speculation of $729 USD and $749 CAD for the 290X BF4E would price the non-bundle package at ~$709 USD.

http://www.brightsideofnews.com/news/2013/10/3/amd-r9-290x-pre-orders-go-live---but-no-price!.aspx
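That back-of-the-envelope math can be sketched as follows; the $20 game premium and the $729 bundle price are the thread's speculated figures, not confirmed numbers:

```python
# Rough estimate of the non-bundle 290X price, assuming the BF4
# bundle carries the same ~$20 premium the RIVE BF3E carried
# over the plain RIVE. Both inputs are speculation from the thread.
bundle_price_usd = 729   # rumored 290X BF4 Edition price
game_premium_usd = 20    # premium observed on the RIVE BF3E

estimate = bundle_price_usd - game_premium_usd
print(f"Estimated non-bundle MSRP: ~${estimate}")  # ~$709
```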

Have we received any more info?


----------



## szeged

So, $699 for this card that was speculated to be sent by the AMD gods to be consumer-wallet friendly.

That turned out well.


----------



## wermad

Quote:


> Originally Posted by *szeged*
> 
> so $699 for this card that was speculated to be sent by the amd gods to be consumer wallet friendly
> 
> that turned out well.


If they're bold enough to price it this high, it may be that it does much better than a 780, and close enough to (or better than) a Titan to warrant a higher price. On paper, it looks to have the upper hand on the 780.

Btw, enter XFX and Asus:


----------



## maarten12100

Quote:


> Originally Posted by *szeged*
> 
> so $699 for this card that was speculated to be sent by the amd gods to be consumer wallet friendly
> 
> that turned out well.


We won't know until release, or actually a little after that.
Not that I care what the Americans pay, as I'll pay that ×1.5.


----------



## Stay Puft

Quote:


> Originally Posted by *wermad*
> 
> If they're bold enough to price it this high it may be that it does much better then a 780 and enough or better then a Titan to warrant a higher price. On paper, it looks to have the upper hand on 780.
> 
> Btw, enter XFX and Asus:


To those who wanted to preorder: I urge you to wait for 2 things.

1. Official benchmarks and reviews
2. Nvidia's price cuts

The Titan and 780 are getting cut. Some say:

Titan - 699
780 - 599

Some even say we will get a 785 with a full GK110 and 2880 CUDA cores. Either way, this is a huge win for all of us.


----------



## szeged

Quote:


> Originally Posted by *Stay Puft*
> 
> Those all wanted to preorder I urge you to wait for 2 things.
> 
> 1. Official benchmarks and reviews
> 2. Nvidias price cuts
> 
> The Titan and 780 are getting cut. Some say
> 
> Titan - 699
> 780 - 599
> 
> Some even say we will get a 785 with a full GK110 and 2880 cuda cores. Either way this is a huge win for all of us.


I can see Titans dropping to $800-ish, but anything below that and I think Nvidia would rather just EOL it, because... Nvidia.


----------



## wermad

I'm not pre-ordering, since there's little to nothing out there to justify it right now. If AMD would like to send me four for evaluation, sure!

Btw, a Titan price drop could leave room for a Titan Ultra to take over at $1k.


----------



## maarten12100

Quote:


> Originally Posted by *Stay Puft*
> 
> Those all wanted to preorder I urge you to wait for 2 things.
> 
> 1. Official benchmarks and reviews
> 2. Nvidias price cuts
> 
> The Titan and 780 are getting cut. Some say
> 
> Titan - 699
> 780 - 599
> 
> Some even say we will get a 785 with a full GK110 and 2880 cuda cores. Either way this is a huge win for all of us.


I concur, but as far as price cuts on the Nvidia cards go, I doubt we'll see them.
The Titan will drop, since CF scales better per added card than SLI does, but it won't drop that much.

Nvidia has a bigger advantage, so if this were to turn into a price war, it would be AMD that has the last laugh (as in, they can go the lowest).


----------



## Stay Puft

Quote:


> Originally Posted by *szeged*
> 
> i can see titans dropping to $800 ish but anything below that and i think nvidia would rather just EOL it because...nvidia.


If the 290X leaks are correct and the Titan is slower, Nvidia would have no choice but to match. They'd lose a lot more money EOLing it than dropping the price. Also, remember when the 4870 was released: the GTX 280 dropped 200 dollars overnight.


----------



## szeged

I'm not gonna pre-order it anymore. I was going to, and just sell the BF4 code or something, but with the way the "leaked rumors" on this card are going, I'd rather just wait for confirmed specs. I'm gonna get one either way, cuz I wanna test it vs the 780 Classified and Titans in person, but who knows if I'll keep it.

Only time will tell.


----------



## maarten12100

Quote:


> Originally Posted by *Stay Puft*
> 
> If the 290X leaks are correct and Titan is slower nvidia would have no choice but to match. They'd lose a lot more money EOLing then dropping the price. Also remember when the 4870 was released. The gtx 280 dropped 200 dollars overnight


Stock vs stock, no boost, the Titan is a few percent faster according to 2 respected mods on tech forums.
AMD will have the multi-GPU performance crown and, depending on how well they clock on air, maybe also the single-card performance crown.


----------



## szeged

Quote:


> Originally Posted by *maarten12100*
> 
> Stock vs stock no boost Titan is a few percent faster according to 2 respected mods on tech forums.
> AMD will have the multi gpu performance crown and depending on how well they clock on air also the single card performance crown.


Boo, stock clock battle... boooooooo.

I don't think many people buy Titans (or will buy the 290X) to run them at stock, so that bench doesn't matter to me at all. I wanna see max OCs on air, water and, to a lesser extent, LN2.


----------



## Blackops_2

I just need two 290Xs or 780s at $1,000 or cheaper.

Come on, competition!


----------



## fleetfeather

[no longer relevant]


----------



## wermad

Final specs leaked (R9 290X & R9 290):

http://www.overclock.net/t/1432010/tpu-final-radeon-r9-290-series-specifications-leaked


----------



## szeged

ooh nice find


----------



## BradleyW

Quote:


> Originally Posted by *Blackops_2*
> 
> I just need two 290x's or 780s at 1000$ or cheaper
> 
> 
> 
> 
> 
> 
> 
> come on competition


This! Exactly This! My budget is 1k for these cards.


----------



## Ghoxt

I knew it. Stockholder pressure. "Welcome back to February 2013." For most of us here there's really nothing new except audio and the promise that one day Mantle will change the game for some AAA titles. I.e., how is this any different from the usual nine-months-later driver optimizations?

I'll take the lower prices from competition, but I was hoping for more performance from AMD at a foundational level, not just some driver/API down the road... AMD owners are likely tired of that by now, even with the new marketing.

I really do wish they could stop fighting a two-front war (Intel & Nvidia), but that's another long post... so I desist.


----------



## rdr09

you can get this even with a HD7770 . . .


----------



## maarten12100

Quote:


> Originally Posted by *Ghoxt*
> 
> I knew it. Stockholders pressure. "Welcome back to February 2013" For most of us here, there's really nothing new here except Audio and the promise that one day Mantle will change the game for some AAA titles. IE how is this any different than the usual 9 months later driver optimizations.
> 
> I'll take the competition lower prices, but was hoping for more performance from AMD at a foundational level, not just some Driver/API down the road....AMD owners are likely tired of that by now even with the new Marketing,.
> 
> I really do wish they could stop fighting a two front war (Intel & Nvidia) but that's another long post...so i desist.


Performance per die area is actually how an architecture should be measured. Winning against the ~485 mm² GK110 (if it were cut down to the portion of the die actually used) while having a ~430 mm² die on the same node is a huge win.
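Taking the thread's rough die sizes at face value (they are approximations, not official figures) and assuming roughly equal absolute performance, the perf-per-area comparison works out like this:

```python
# Performance per die area with the thread's rough numbers:
# GK110 ~485 mm^2 (the portion in use), Hawaii ~430 mm^2, same 28nm node.
# Absolute performance is assumed equal (1.0) purely for illustration.
def perf_per_mm2(relative_perf: float, die_area_mm2: float) -> float:
    return relative_perf / die_area_mm2

gk110 = perf_per_mm2(1.0, 485)
hawaii = perf_per_mm2(1.0, 430)
print(f"Hawaii perf/area advantage: {hawaii / gk110 - 1:.1%}")  # ~12.8%
```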


----------



## superx51

I had a long talk last night with my friend who is an engineer at Nvidia, and they will be releasing a GTX 790 that uses two full 780s on one PCB, like the 690. It will have 3 GB of RAM x2 and a lower TDP than 780 SLI, like the 690. They tried to develop a 790 using Titans but couldn't fit 6 GB x2 on the board, plus I guess the dual Titans created too much heat and required too much power. I believe Nvidia will be announcing it by the end of the month, but I'm not exactly sure when they plan to unveil it. Sounds pretty cool to me!


----------



## y2kcamaross

Quote:


> Originally Posted by *superx51*
> 
> I had a long talk with my friend last night who is a engineer at Nvidia and they will be releasing a gtx 790 that uses 2 full 780s on one pcb like the 690. It will have 3 gigs of ram x2 and a lower tdp than 780 sli like the 690. They tried to develop a 790 using titans but couldn't fit 6 gb x2 on the board, plus I guess the dual titans created to much heat and required to much power . I believe nvidia will be announcing it by the end of the month, but I'm not exactly sure when they plan to unveil it. Sounds pretty cool to me!


My grandma's friend's nephew works there and told me they were bringing out a single-slot GPU based on the Titan that has a 100W TDP and is cooled by unicorns.


----------



## anticommon

Quote:


> Originally Posted by *y2kcamaross*
> 
> my grandma's friends nephew works there and told me they were bringing a wish single slot gpu based on the titan that has 100w tdp and is cooled by unicorns


Eh, not entirely unreasonable.

But, back on topic, I really, really want to see what kinds of 290Xs will be released and whether they will compete well against the 780 Classified. ~$700-710 for a DCU2 or Windforce w/ BF4 would be perfect for me. Good performance, halfway decent price.

But even at those prices, can it compete against EVGA's offer of a $600 Classified?

I've seen that the stock 290X seems to get around 9600 in Firestrike, while an OC'd Classy gets around 10600 at 1300/7000 clocks.

So, do you guys think the 290X can overclock enough to beat the Classy? And if so, is it worth the price premium ($100-150)? This, all coming from someone who is currently using a backup GTX 260 to hold me over until I get a new card.


----------



## Stay Puft

Quote:


> Originally Posted by *anticommon*
> 
> Eh, not entirely unreasonable.
> 
> But, back on topic, I really really really want to see what kinds of 290x's will be released and if they will compete well against the 780 classified. ~$700-710 for a DCU2 or Windforce w/BF4 would be perfect for me. Good performance, half-way decent price.
> 
> But even at these prices, can it compete against EVGA's offer for $600 classified.
> 
> I've seen that the stock 290x seems to get around 9600 in firestrike, while an OC'ed classy gets around 10600 with 1300/7000 clocks.
> 
> So, do you guys think that the 290x can overclock enough to beat the classy? And if so is it worth the price premium ($100-150)? This, all coming from someone who is currently using a backup GTX260 to hold me over until I get a new card.


290X Firestrike Extreme



VS

GTX Titan firestrike extreme

http://www.3dmark.com/3dm/1346514?

We're talking a 700-point difference in graphics score alone.

More here

http://www.overclock.net/t/1432081/290x-vs-gtx-780-vs-gtx-770-in-firestrike-extreme


----------



## criminal

Quote:


> Originally Posted by *y2kcamaross*
> 
> my grandma's friends nephew works there and told me they were bringing a wish single slot gpu based on the titan that has 100w tdp and is cooled by unicorns


Will I be able to preorder?


----------



## GoldenTiger

Quote:


> Originally Posted by *Stay Puft*
> 
> 290X Firestrike Extreme
> 
> 
> 
> VS
> 
> GTX Titan firestrike extreme
> 
> http://www.3dmark.com/3dm/1346514?
> 
> Were talking a 700 point difference in graphics score alone
> 
> More here
> 
> http://www.overclock.net/t/1432081/290x-vs-gtx-780-vs-gtx-770-in-firestrike-extreme


Considering a well-OC'd 780 can pull a 6000+ graphics score in Firestrike Extreme, that would be a heck of a gap for AMD to close if they only pull 4400-4500 stock. It would need to OC well in excess of 35% (remember, you don't get a linear performance increase matching your OC percentage on GPUs), and with reports of LN2 capping out at 1400MHz against a 1050MHz stock clock... yeah, not happening on air or water. I'm unimpressed here; I was hoping for some good competition from both sides, but this isn't going to spur Nvidia to the kind of price cuts we'd all love.
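For perspective, that gap can be put in rough numbers. A minimal sketch (Python), assuming a hypothetical stock score of ~4450, a ~6000 target, and performance scaling at roughly 70% of the clock increase; all three numbers are illustrative assumptions, not measured data:

```python
# Rough estimate of the core overclock needed to close a benchmark gap,
# assuming performance scales sublinearly with clock. All inputs are
# hypothetical; real scaling varies by benchmark and memory clocks.

def required_oc(stock_score: float, target_score: float,
                scaling: float = 0.7) -> float:
    """Fractional clock increase needed, if a clock gain of x
    yields a performance gain of only scaling * x."""
    perf_gain = target_score / stock_score - 1.0
    return perf_gain / scaling

oc = required_oc(4450, 6000)
print(f"needed clock increase: {oc:.0%}")            # ~50%
print(f"implied core clock: {1050 * (1 + oc):.0f} MHz")
```

Which is why, even granting the leaked numbers, a 35%+ overclock on air looks implausible.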


----------



## superx51

It will be unveiled at the Montreal gaming event on the 16th of October.


----------



## scyy

Quote:


> Originally Posted by *GoldenTiger*
> 
> Considering a well-oc'd 780 can pull 6000+ graphics score in Firestrike Extreme, that would be a heck of a gap for AMD to fill if they only pull in 4400-4500 stock. It would need to OC well in excess of 35% (remember, you don't get a linear increase matching the percentage of your OC in performance on GPU's) and from the reports of LN2 capping out even at 1400mhz with the stock clock being 1050... yeah, not happening on air or water. I'm unimpressed here, was hoping for some good competition from both sides but this isn't going to spur nVidia to the kind of cuts we'd all love.


Yup, it's going to be all about overclocking headroom, and given the reports, it's looking like GK110 may have the edge there. We shall all know in 8 days, though.


----------



## jomama22

Pretty sure that Firestrike result is for the 290 and not the 290X. Though we've been shown like 5 different 3DMarks and they're all different, so who knows, lol.


----------



## fateswarm

Quote:


> Originally Posted by *superx51*
> 
> I had a long talk with my friend last night who is a engineer at Nvidia and they will be releasing a gtx 790 that uses 2 full 780s on one pcb like the 690. It will have 3 gigs of ram x2 and a lower tdp than 780 sli like the 690. They tried to develop a 790 using titans but couldn't fit 6 gb x2 on the board, plus I guess the dual titans created to much heat and required to much power . I believe nvidia will be announcing it by the end of the month, but I'm not exactly sure when they plan to unveil it. Sounds pretty cool to me!


That "they tried to use Titans" sounds like your friend is an engineer in their toilet-paper storage room. But! Anything is possible; I'm an agnostic, always.


----------



## TheLAWNOOB

Quote:


> Originally Posted by *fateswarm*
> 
> That "they tried to use titans" sounds like your friend is an angineer at their storage room of toilet paper. But! Anything is possible, I'm an agnostic, always.


Yep. And what does "full 780s" mean anyway? He didn't think the Titan's die is a lot different from the 780's (both of which are cut-down dies), did he?


----------



## superx51

I meant the same number of cores as the 780! Duhh


----------



## fateswarm

It's a good guess, though, that 6GB x 2 would indeed be hard to fit. Well, I guess they could get weird novel chips or do other machinations.

edit: No wait, not even that stands. Professional cards do it. Fake!


----------



## wermad

Quote:


> Originally Posted by *superx51*
> 
> I had a long talk with my friend last night who is a engineer at Nvidia and they will be releasing a gtx 790 that uses 2 full 780s on one pcb like the 690. It will have 3 gigs of ram x2 and a lower tdp than 780 sli like the 690. They tried to develop a 790 using titans but couldn't fit 6 gb x2 on the board, plus I guess the dual titans created to much heat and required to much power . I believe nvidia will be announcing it by the end of the month, but I'm not exactly sure when they plan to unveil it. Sounds pretty cool to me!


Cool beans. Makes sense to avoid two Titans, as the MSRP would be ~$1599-1999 and that would just be plain silly. It is technically feasible to cram two Titans on a single PCB, albeit it would be wider and longer (see the Asus Mars).

A 780x2 makes sense. It can be priced ~$1000-1299, and 6GB (3GB usable) has been done (AMD 7990 and 7970x2). If a Titan Ultra comes in at ~$1k, this would ensure the performance crown for a single-card solution until AMD can respond with a 290x2 (8GB of vram!). By then, we'll be well into the next GTX 8xx series. AMD didn't go for a 7990 at first, but eventually did to buy more time and to offer something to compete against Titan and the 690 (the AIBs did a good job w/ the 7970x2s too, imho).

Also, a Titan Ultra and a 780x2 can give Nvidia more time to work on next-gen cards to bring the game to a new level. Hawaii is definitely looking like it may do that shortly. Maxwell better be impressive.

----------



## Majin SSJ Eric

See, this is the kind of nonsense that holding back numbers gets you. We all sit around with leaked benches that may or may not have any basis in reality, and everybody starts making snap judgments about the card before it even gets a chance to prove itself. I just wish AMD had used the forum they had in Hawaii to give us concrete performance numbers and a price for the card, rather than leaving us dangling in the breeze for 3 more weeks of rumors and speculation. Either it's up to snuff or it isn't, and 3 weeks isn't going to change that....


----------



## raghu78

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> See this is the kind of nonsense that holding back numbers gets you. We all sit around with leaked benches that may or may not have any basis in reality and everybody starts making snap judgments about the card before it even gets a chance to prove itself. I just wish AMD had used the forum they had in Hawaii to give us concrete performance numbers and a price for the card rather than having us dangling in the breeze for 3 more weeks of rumors and speculation. Either its up to snuff or it isn't and 3 weeks isn't going to change that....


People can speculate all they want; it doesn't affect AMD. What matters is launch-day reviews: stock and OC performance, and price. The other point I want to mention is that Linus said in his YouTube video that he (and all the press) got custom R9 290X cards for testing (video at 50:10). So if we get custom cards on launch day, that would be a cool thing.









http://www.youtube.com/watch?v=c8WrGJ4GLm4


----------



## rubicsphere

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> See this is the kind of nonsense that holding back numbers gets you. We all sit around with leaked benches that may or may not have any basis in reality and everybody starts making snap judgments about the card before it even gets a chance to prove itself. I just wish AMD had used the forum they had in Hawaii to give us concrete performance numbers and a price for the card rather than having us dangling in the breeze for 3 more weeks of rumors and speculation. Either its up to snuff or it isn't and 3 weeks isn't going to change that....


Well said sir


----------



## Gabrielzm

Ok Fellows...some real data start to pop up. It is not the r9 290x but instead the 280x. Enjoy:

http://www.anandtech.com/show/7400/the-radeon-r9-280x-review-feat-asus-xfx


----------



## wermad

Quote:


> Originally Posted by *Gabrielzm*
> 
> Ok Fellows...some real data start to pop up. It is not the r9 290x but instead the 280x. Enjoy:
> 
> http://www.anandtech.com/show/7400/the-radeon-r9-280x-review-feat-asus-xfx


Reading through that right now. I'm baffled why the new turbine design of the 280X (which looks very similar to the 290/290X's) was not shown at load. I can't see them saying the 7970 GE represents it, since the coolers are slightly different.

Btw, the NDA for these guys lifts tomorrow (already Oct. 8th in the UK).


----------



## Majin SSJ Eric

Oh yeah, NDA lifted today for the 280X didn't it?


----------



## wermad

Yes, the NDA for the 280/270/260 mid-to-entry SKUs was (or is, for me) 10/8/2013.

From these reviews, they seem to be on par with their 7xxx-series equivalents.


----------



## geoxile

I'm surprised how much better the 270X is than the 7870 in some cases. I thought for sure it'd perform nearly the same.


----------



## Gabrielzm

It seems the review is not fully online yet... the overclocking section is still missing. Some nice custom cards in there too, especially the ASUS one.


----------



## Majin SSJ Eric

Why are people disappointed with the 270X? It's not the replacement for the 7950; the 280 is, isn't it?


----------



## fleetfeather

There's no NSF bundling with the R7 series, according to PCPer


----------



## wermad

Quote:


> Originally Posted by *fleetfeather*
> 
> There's no NSF bundling with the R7 series, according to PCPer


I think NSF is more plausible w/ quad 290x









(*N*on *S*ufficient *F*unds )


----------



## raghu78

Quote:


> Originally Posted by *fleetfeather*
> 
> There's no NSF bundling with the R7 series, according to PCPer


Yeah, you're correct. In fact, none of the Rx cards get the bundle; confirmed by AnandTech and HWC. It's done mostly to give an edge to the HD 7000 cards and to clear existing stock in the channel. I'm guessing the Rx series will get the game bundle sometime later, but yeah, disappointing nonetheless as it stands today.


----------



## cbarros82

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Why are people disappointed with the 270X? Its not the replacement for the 7950, the 280 is isn't it?


Yes, the 280 is the old 7950, and the 270X is the old 7870 GHz. They should have used the 7870 XT w/ 1536 SPs instead.


----------



## Blackops_2

Is there an actual 280? I never saw confirmation of many non-X variants, just the 290.


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *Gabrielzm*
> 
> Ok Fellows...some real data start to pop up. It is not the r9 290x but instead the 280x. Enjoy:
> 
> http://www.anandtech.com/show/7400/the-radeon-r9-280x-review-feat-asus-xfx


Is it wrong to sell all my Christmas presents from all my uncles and aunties to get that Asus 280X?









On a more serious note: if the 280X can do that, what will happen when those Asus blokes get their hands on a 290X? Getting shivers just thinking about it... Well done, there.


----------



## Lennyx

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> Is it wrong to sell all my christmas presents from all mu uncles and aunties to get that Asus 280X?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On a more serious note, if the 280X can do that, what will happen if those Asus blokes get their hands on a 290X? Getting shivers just thinking about that... Well done, there.


You should tell your uncles and aunties to get together and buy you a GPU for Xmas


----------



## TrevBlu19

Should i pick up a 7950 Boost now, while they last?


----------



## fleetfeather

As opposed to a R7 variant? Yeah, I would.


----------



## raghu78

Quote:


> Originally Posted by *TrevBlu19*
> 
> Should i pick up a 7950 Boost now, while they last?


Yeah, grab an HD 7950 before they run out of stock. Also, the game bundle is only with HD 7000 cards for the time being, so that's more beneficial to you.


----------



## fleetfeather

andddd suddenly things get confusing...

http://www.youtube.com/watch?v=hZQANNndqOA&feature=youtu.be&a

While reviewing a 280X, Linus says NSF is "still a thing" (FF to 5m52s)

Has he been misled, by chance?


----------



## fateswarm

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> See this is the kind of nonsense that holding back numbers gets you. We all sit around with leaked benches that may or may not have any basis in reality and everybody starts making snap judgments about the card before it even gets a chance to prove itself. I just wish AMD had used the forum they had in Hawaii to give us concrete performance numbers and a price for the card rather than having us dangling in the breeze for 3 more weeks of rumors and speculation. Either its up to snuff or it isn't and 3 weeks isn't going to change that....


It's fine, provided we know we're speculating. It's a problem when people believe the rumors as "news", since most tech "journalism" is basically "I just copy whatever I see", "double sourcing? what is that?", "some guy told me, not AMD, but a guy", and "I heard it on a Chinese site that's correct 40% of the time!" (a main culprit lately).


----------



## TrevBlu19

Quote:


> Originally Posted by *fleetfeather*
> 
> andddd suddenly things get confusing...
> 
> http://www.youtube.com/watch?v=hZQANNndqOA&feature=youtu.be&a
> 
> While reviewing a 280X, Linus says NSF is "still a thing" (FF to 5m52s)
> 
> has he been mislead by chance?


Yay what games?? BF4?!?


----------



## Baghi

Quote:


> Originally Posted by *jomama22*
> 
> Pretty sure that firstrike is for the 290 and not the.


Yup, pretty much this. The R9 280X already pulls 4400 on max overclocks; I don't believe AMD would want their top card beaten like that by a GPU based on the nearly-two-year-old Tahiti chip. Then again, a +99MHz GTX 780 scores just over 4800 FS-E pts, so who knows?


----------



## Baghi

Sorry for the dp, but...


*Radeon R9 290X Features 64 ROPs*
Quote:


> A leaked company slide by AMD confirmed that its high-end "Hawaii" silicon indeed features 64 raster operations units (ROPs). In reference to its predecessor, "Tahiti," the slide speaks of 2 times the ROPs (32 on "Tahiti") and 1.4 times the stream processors (2048 on "Tahiti," so 2816 on "Hawaii"). Other known specifications include up to 1 GHz GPU clock, up to 5.00 GHz memory clock, and a 512-bit wide GDDR5 memory interface, holding 4 GB of memory. Reviews of Radeon R9 290X could surface around mid-October.


Source: TPU


----------



## raghu78

Wow, that's a lot of ROPs. This card is going to be a beast at MSAA and SSAA.







Serious competition for Titan.


----------



## criminal

Quote:


> Originally Posted by *raghu78*
> 
> wow thats a lot of ROPs. this card is going to beast at MSAA and SSAA
> 
> 
> 
> 
> 
> 
> 
> serious competition for Titan


Wow.... you don't say.


----------



## youra6

48, then 44, now 64 ROPs. Which one is it? 6 more days until we know for sure.


----------



## Durquavian

Quote:


> Originally Posted by *youra6*
> 
> 48, then 44, now 64 ROPs. Which one is it? 6 more days until we know for sure.


Since this is the first actual AMD slide we've seen with this info, I'd say 64 is the actual count. Most earlier numbers were sites releasing what they had, or likely typos.


----------



## criminal

Quote:


> Originally Posted by *Durquavian*
> 
> since this is the first actual AMD slides we've seen with this info, I'd say 64 is the actual count. Most numbers before were sites releasing what they had or likely a typo.


I agree. 44 just seemed off the whole time.


----------



## raghu78

Quote:


> Originally Posted by *criminal*
> 
> Wow.... you don't say.


Why not? For the first time in ages, the top AMD card has more ROPs than the top Nvidia card. I've always felt AMD was slow to move to higher ROP counts. The 64-ROP configuration can last for another 3 generations at the high end; AMD has laid the foundation for at least 3 generations now: 28nm R9 290X, 20nm R9 390X, 16nm FinFET R9 490X.

GTX 280 - 32 ROPs vs HD 4870 - 16 ROPs
GTX 480 - 48 ROPs vs HD 5870 - 32 ROPs
GTX 580 - 48 ROPs vs HD 6970 - 32 ROPs
GTX 680 - 32 ROPs vs HD 7970 - 32 ROPs
GTX Titan - 48 ROPs vs R9 290X - 64 ROPs
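The reason ROP count matters for AA is peak pixel fillrate, which is roughly ROPs multiplied by core clock. A quick sketch (Python) using approximate clocks mentioned in this thread; the figures are illustrative only, and real-world throughput is often memory-bandwidth limited before the ROPs become the bottleneck:

```python
# Theoretical peak pixel fillrate ~= ROPs x core clock (GHz), in Gpixel/s.
# Clocks are rough reference values for illustration, not measurements.

cards = {
    # name: (ROPs, core clock in GHz)
    "GTX Titan": (48, 0.876),   # stock boost clock
    "R9 290X":   (64, 1.000),   # rumored "up to 1 GHz"
    "HD 7970":   (32, 0.925),   # original (non-GHz) 7970
}

for name, (rops, ghz) in cards.items():
    print(f"{name}: {rops * ghz:.1f} Gpixel/s")
```

On these assumptions the 290X's raw fillrate would be roughly 50% above Titan's, which is consistent with the leaks showing it pulling ahead mainly at high AA settings.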


----------



## criminal

Quote:


> Originally Posted by *raghu78*
> 
> why not ? for the first time in ages the top AMD card has more ROPs than the top Nvidia card. I have always felt AMD were slow to move to higher ROP counts. the 64 ROPs configuration can last for another 3 generations at the high end. AMD has laid the foundation for atleast 3 generations now. 28nm R9 290X , 20nm R9 390X, 16 FF R9 490X.
> 
> GTX 280 - 32 ROPs vs HD 4870 - 16 ROPs
> GTX 480 - 48 ROPs vs HD 5870 - 32 ROPs
> GTX 580 - 48 ROPs vs HD 6970 - 32 ROPs
> GTX 680 - 32 ROPs vs HD 7970 - 32 ROPs
> GTX Titan - 48 ROPs vs R9 290X - 64 ROPs


What I am saying is that anyone with half a brain knows by now it is going to compete with the Titan. We don't need you to keep stating the obvious... lol


----------



## Newbie2009

Quote:


> Originally Posted by *raghu78*
> 
> why not ? for the first time in ages the top AMD card has more ROPs than the top Nvidia card. I have always felt AMD were slow to move to higher ROP counts. the 64 ROPs configuration can last for another 3 generations at the high end. AMD has laid the foundation for atleast 3 generations now. 28nm R9 290X , 20nm R9 390X, 16 FF R9 490X.
> 
> GTX 280 - 32 ROPs vs HD 4870 - 16 ROPs
> GTX 480 - 48 ROPs vs HD 5870 - 32 ROPs
> GTX 580 - 48 ROPs vs HD 6970 - 32 ROPs
> GTX 680 - 32 ROPs vs HD 7970 - 32 ROPs
> GTX Titan - 48 ROPs vs R9 290X - 64 ROPs


Quote:


> Originally Posted by *criminal*
> 
> What I am saying is that anyone with half a brain knows by now it is going to compete with the Titan. We don't need you to keep stating the obvious... lol


Neither of you are asshats, so don't start, please. We have enough.


----------



## criminal

Quote:


> Originally Posted by *Newbie2009*
> 
> Neither of u are asshats, so don;t start please. We have enough.


LOL... my bad.


----------



## Majin SSJ Eric

Lol!


----------



## grunion

Quote:


> Originally Posted by *raghu78*
> 
> why not ? for the first time in ages the top AMD card has more ROPs than the top Nvidia card. I have always felt AMD were slow to move to higher ROP counts. the 64 ROPs configuration can last for another 3 generations at the high end. AMD has laid the foundation for atleast 3 generations now. 28nm R9 290X , 20nm R9 390X, 16 FF R9 490X.
> 
> GTX 280 - 32 ROPs vs HD 4870 - 16 ROPs
> GTX 480 - 48 ROPs vs HD 5870 - 32 ROPs
> GTX 580 - 48 ROPs vs HD 6970 - 32 ROPs
> GTX 680 - 32 ROPs vs HD 7970 - 32 ROPs
> GTX Titan - 48 ROPs vs R9 290X - 64 ROPs


Remember the performance increase the last time AMD doubled the ROPs?
Architecture transition at the time, exciting times.


----------



## raghu78

Quote:


> Originally Posted by *criminal*
> 
> What I am saying is that anyone with half a brain knows by now it is going to compete with the Titan. We don't need you to keep stating the obvious... lol


Sorry to remind you, but there are many on the forums who were arguing for the past few months that the R9 290X is going to be slower than the GTX 780.







How come all of a sudden it's so obvious that it's going to compete with Titan?


----------



## Stay Puft

Someone want to give me a quick lesson on what ROPs do and why more is better?


----------



## Majin SSJ Eric

More is always better!









Sorry, I don't really know!


----------



## keikei

Quote:


> Originally Posted by *Stay Puft*
> 
> Someone want to give me a quick lesson on what ROP's do and why more is better?


I would like to know as well. Please.


----------



## Usario

Hm.

44 or 48 never made much sense, since 512 (as in the 512-bit bus) isn't evenly divisible by either. Also, the people claiming 44 said it was because Hawaii supposedly has 4 CUs with 11 SIMDs each, and therefore 44 ROPs; but ROP and CU counts are independent of each other, afaik.

64 ROPs would make it a monster. Also helps explain why it really only starts to crush NVIDIA in the leaked benchmarks we've seen when you up the AA.


----------



## criminal

Quote:


> Originally Posted by *raghu78*
> 
> sorry to remind you there are many in the forums who were were arguing for the past few months that R9 290X card is going to be slower than GTX 780
> 
> 
> 
> 
> 
> 
> 
> how come all of a sudden its so obvious that its going to compete with Titan


People had very much been saying the new AMD cards would be slower than the comparable Nvidia cards, but that has died down some since the "unconfirmed" leaks. Part of that, I believe, is due to the AMD cards having higher default clock speeds. Anyway, 64 ROPs is exciting.

Also, I apologize for my snarky comment.


----------



## SpacemanSpliff

Quote:


> Originally Posted by *grunion*
> 
> Remember the performance increase the last time AMD doubled the ROPs?
> Architecture transition at the time, exciting times.


I certainly do... my 5870 has been a trooper through a couple years of solid gaming / folding use.


----------



## vs17e

http://en.wikipedia.org/wiki/Render_output_unit


----------



## Usario

Quote:


> Originally Posted by *criminal*
> 
> People have been very much saying that the new AMD cards would be slower than the comparable Nvidia cards, but that has kinda died down some since the "unconfirmed" leaks. But part of that I believe is due to the AMD cards having higher default clock speeds. Anyway, 64 rops is exciting.
> 
> Also, I apologize for my snarky comment.


The default clocks seem to barely be any different when you factor in GPU Boost. Looks like 1000 or 1020MHz on the 290X and around 950MHz on the 290. Most GK110s seem to boost to around 1000 afaik.


----------



## criminal

Quote:


> Originally Posted by *Usario*
> 
> The default clocks seem to barely be any different when you factor in GPU Boost. Looks like 1000 or 1020MHz on the 290X and around 950MHz on the 290.


A 290X at 1k versus a regular Titan that boosts to 876. Clock for clock, GK110 appears to be faster, and it overclocks like a beast.


----------



## Usario

Quote:


> Originally Posted by *criminal*
> 
> Yep. A 290X at 1k versus a regular Titan that boosts to 876. Clock for clock, GK110 appears to be faster, and it overclocks like a beast.


Come on. No Titan is ever going to run at 876, not when gaming at least... most seem to boost to 950-1050 or so out of the box.

We could be looking at an issue similar to the GTX 680's, where cards labeled "1006MHz" that actually ran at 1100-1150 with maybe 100-150MHz of extra headroom were compared to 925MHz HD 7970s that mostly had around 250-300MHz of OC headroom.


----------



## IIMaxII

IMO AMD is going to win customers this holiday season just because a lot of consumers will think AMD is the way to go, since they're in all the consoles. Meanwhile, IMO, Intel will still stomp them, and NVIDIA will still provide comparable or slightly better performance, until AMD steps their game up.

Sent from my SAMSUNG-SGH-I337 using Tapatalk 4


----------



## y2kcamaross

Quote:


> Originally Posted by *criminal*
> 
> Will I be able to preorder?


yes, but we won't tell you the price or the exact specifications until we ship you the card, and there's no take backs!


----------



## criminal

Quote:


> Originally Posted by *Usario*
> 
> Come on. No Titan is ever going to run at 876; not when gaming at least... most seem to boost to 950-1050 or so out of the box.
> 
> We could be looking at an issue similar to that of the GTX 680, where cards labeled as "1006MHz" running at 1100-1150 that had maybe 100-150MHz of extra headroom were being compared to 925MHz HD 7970s that mostly had around 250-300MHz of OC headroom.


No, a regular stock Titan will not boost that high without some seriously cold temps and some overclocking applied:
Quote:


> The results of the default real-world frequencies are quite revealing. In Crysis 3 the frequency settled to 862MHz at 80c. Tomb Raider settled to 875MHz at 80c. Far Cry 3 settled to 875MHz at 80c. Sleeping Dogs settled to 862MHz at 80c.
> 
> What this tells us is that to keep the temperature at 80c, the clock speeds are being kept at GPU Boost clock, or even slightly under in the more demanding games. Crysis 3 and Sleeping Dogs seem to be so demanding that GPU Boost runs the clocks a bit lower to keep the GPU temperature at 80c. At the fastest speed, 875MHz, the GPU is running right at the default GPU Boost clock speed, no faster.


http://www.hardocp.com/article/2013/04/29/nvidia_geforce_gtx_titan_overclocking_review#.UlWVMhDkzGY

The Titan and 780 use Boost 2.0. Take away the restrictions Nvidia imposed and there is 275-400MHz of overclocking headroom.

Like I said a while back: we will know the truth once some OCN members get the cards in their hands.


----------



## 2010rig

Quote:


> Originally Posted by *youra6*
> 
> 48, then 44, now 64 ROPs. Which one is it? 6 more days until we know for sure.


Definitely looks like 64, and makes more sense.
Quote:


> Originally Posted by *y2kcamaross*
> 
> yes, but we won't tell you the price or the exact specifications until we ship you the card, and there's no take backs!


Trust us, it's worth it.









It will ridicule a Titan!

In 1 game that requires Mantle optimizations, and will be available ONLY 2 months after its release... *A*lways "*M*inor" *D*elays


----------



## raghu78

Quote:


> Originally Posted by *criminal*
> 
> People have been very much saying that the new AMD cards would be slower than the comparable Nvidia cards, but that has kinda died down some since the "unconfirmed" leaks. But part of that I believe is due to the AMD cards having higher default clock speeds. Anyway, 64 rops is exciting.


GTX 780 cards boost up to 992MHz out of the box, and unless you have poor case cooling, you are most likely to run close to 1GHz.

http://www.anandtech.com/show/6973/nvidia-geforce-gtx-780-review/3

Titan will run at lower clocks due to less TDP headroom, as both the 780 and Titan have the same 250W TDP.
Quote:


> Also, I apologize for my snarky comment.


no probs.


----------



## Regent Square

Quote:


> Originally Posted by *2010rig*
> 
> Definitely looks like 64, and makes more sense.
> Trust us, it's worth it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It will ridicule a Titan!
> 
> In 1 game that requires Mantle optimizations, and will be available ONLY 2 months after its release... *A*lways "*M*inor" *D*elays


Lol, the only game where the 290X won't beat Titan with 4xMSAA is RAGE, which runs poorly on AMD hardware. This is on air. Aka, a clear winner.


----------



## Usario

criminal:

Anandtech's sample was hitting 992MHz max at stock: http://www.anandtech.com/show/6774/nvidias-geforce-gtx-titan-part-2-titans-performance-unveiled/2

Guru3D got similar results: http://www.guru3d.com/articles_pages/geforce_gtx_titan_review,4.html (look at the afterburner screenshot)

I could go looking for more reviews showing the same thing. I didn't even cherry pick these; they were the first two I looked at. Seems more likely that [H] got a bad sample than anything else.


----------



## 2010rig

Quote:


> Originally Posted by *Regent Square*
> 
> Lol, the only game where 290x wont beat Titan with 4xmaas is RAGE which runs poorly on AMD hardware. This is on air.


Keyword: *ridicule*.

We shall see if it beats a Titan in EVERY game, at Max OC vs Max OC.


----------



## CBZ323

Given that the Titan launched last February, I would only *expect* AMD to come out with something better. I don't understand what the big deal is. They *should* bring out a better product, and it seems that they are.

Clearly, Nvidia had no Titan competitor in the single-GPU market, so they marked up prices as much as people were willing to pay, as any normal company would. This is the tech hardware business, not a charity.


----------



## Usario

Quote:


> Originally Posted by *2010rig*
> 
> Keyword: *ridicule*.
> 
> We shall see if it beats a Titan in EVERY game, at Max OC vs Max OC.


There are always going to be a few odd titles where the overall slower card pulls ahead. The 7970 is faster than the Titan in Dirt 3. The 660 Ti, IIRC, is faster than the 7970 in Portal 2 (or at least it was with the old drivers; not that either of them won't pull well over 100fps anyway).
Quote:


> Originally Posted by *CBZ323*
> 
> Given that the titan launched last February I would only *expect* AMD to come with something better. I don't understand what the big deal is about. They *should* bring out a better product and it seems that they are.
> 
> Clearly, Nvidia had no titan competitor in the single GPU market, so they marked up the prices as much as people were willing to pay, as any normal company would do. This is the tech hardware business, not a charity.


If everybody always charged $1000 for the first GPU of a new generation (which, duh, is going to be faster than anything else on the market)... I'm not even going to finish.

The 5870 had no competition for half a year, and AMD only charged $400.

And the 5870 actually competed very well with the GTX 295, unlike the Titan, which gets its ass handed to it by the 7990 (which also happens to cost, what, $300 less?).


----------



## Regent Square

Quote:


> Originally Posted by *2010rig*
> 
> Keyword: *ridicule*.
> 
> We shall see if it beats a Titan in EVERY game, at Max OC vs Max OC.


Max OC on air? No problem.

Even if it ridicules Titan in only 1 game, it's worthy of applause, as that's a unique scenario on the same architecture.


----------



## AlphaC

Quote:


> Originally Posted by *keikei*
> 
> I would like to know as well. Please.


raster operations pipeline...

http://http.developer.nvidia.com/GPUGems/gpugems_ch28.html

http://graphics.stanford.edu/~liyiwei/courses/GPU/paper/paper.pdf
Quote:


> ROP (Raster Operation) is the unit that writes fragments into the
> frame-buffer. The main functionality of ROP is to efficiently write
> batches of fragments into the frame-buffer via compression. It
> also performs alpha, depth, and stencil tests to determine if the
> fragments should be written or discarded. ROP deals with several
> buffers residing in the frame-buffer, including color, depth, and
> stencil buffers.


http://cs.nyu.edu/courses/spring12/CSCI-GA.3033-012/lecture2.pdf

http://www.cse.ohio-state.edu/~crawfis/cse786/ReferenceMaterial/CourseNotes/Modern%20GPU%20Architecture.ppt

http://www.cs.kent.edu/~zhao/gpu/lectures/ProgrammableGraphicsPipeline.pdf

http://web.cs.wpi.edu/~gogo/courses/cs543/misc_materials/Cg_tutorial_Chapter_1_abridged.pdf

http://www.win.tue.nl/~keesh/ow/2IV40/pipeline2.pdf

http://csc.lsu.edu/~kooima/csc4356/notes/icg-pipeline-2.pdf
see page 2

http://homepages.math.uic.edu/~jan/mcs572/gpu_evolution.pdf
Quote:


> ROP (Raster Operation)
> The final raster operations blend the color of overlapping/adjacent
> objects for transparency and antialiasing effects.
> For a given viewpoint, visible objects are determined and occluded
> pixels (blocked from view by other objects) are discarded.


You know, _culling_ and AA? Google knows.

You people need to use Google more.









Plus, it won't ridicule a TITAN unless they mean price/performance-wise at stock (and if we're talking FPS/$, the GTX 780 already does that). Maybe with whatever Mantle optimizations, but not in a standard DX/OpenGL workload.
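For those still asking what a ROP actually does: the quoted description above boils down to a per-fragment depth test plus a blend into the framebuffer. A toy software sketch (Python), purely illustrative; real ROPs also handle stencil testing and framebuffer compression:

```python
# A toy "software ROP": for each incoming fragment, depth-test it against
# the depth buffer, and if it survives, alpha-blend its color into the
# color buffer. Illustrative only; names here are made up for the sketch.

def rop_write(color_buf, depth_buf, x, y, frag_rgb, frag_a, frag_z):
    """Write one fragment if it passes the depth test (closer = smaller z)."""
    if frag_z >= depth_buf[y][x]:
        return False                      # occluded: fragment discarded
    depth_buf[y][x] = frag_z
    dst = color_buf[y][x]
    # standard "over" blend: src * alpha + dst * (1 - alpha)
    color_buf[y][x] = tuple(
        frag_a * s + (1 - frag_a) * d for s, d in zip(frag_rgb, dst)
    )
    return True

# 1x1 framebuffer: black color, depth cleared to "infinitely far"
cbuf = [[(0.0, 0.0, 0.0)]]
zbuf = [[float("inf")]]
rop_write(cbuf, zbuf, 0, 0, (1.0, 0.0, 0.0), 0.5, 0.3)  # half-alpha red, near
rop_write(cbuf, zbuf, 0, 0, (0.0, 1.0, 0.0), 1.0, 0.9)  # green, farther: discarded
print(cbuf[0][0])   # (0.5, 0.0, 0.0)
```

More ROPs means more of these test-and-blend operations per clock, which is exactly why high MSAA/SSAA settings (many blended samples per pixel) lean on ROP count.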


----------



## Moragg

Quote:


> Originally Posted by *2010rig*
> 
> Keyword: *ridicule*.
> 
> We shall see if it beats a Titan in EVERY game, at Max OC vs Max OC.


Ridicule won't happen unless Mantle is used in most new games. If that happens and we get the expected 20%+ perf boost _and_ an increase in game quality, since we can have more stuff on screen thanks to more draw calls, only then can we say it makes the Titan a waste of money for most gamers.

For now I'd expect the Titan and 290X to be fairly close OC'ed, with the 290X gaining a few fps at high AA.


----------



## criminal

Quote:


> Originally Posted by *Usario*
> 
> criminal:
> 
> Anandtech's sample was hitting 992MHz max at stock: http://www.anandtech.com/show/6774/nvidias-geforce-gtx-titan-part-2-titans-performance-unveiled/2
> 
> Guru3D got similar results: http://www.guru3d.com/articles_pages/geforce_gtx_titan_review,4.html (look at the afterburner screenshot)
> 
> I could go looking for more reviews showing the same thing. I didn't even cherry pick these; they were the first two I looked at. Seems more likely that [H] got a bad sample than anything else.


Both those look like they applied voltage and/or upped the clock speeds.

Do you actually believe that the leaked slides had any of that going on? If I were "leaking" slides of a card I wanted to look good, I would stick the competitor's card in a hot box and run the tests.

Anyway, it doesn't matter, we will know in the short term once the cards are officially released. And once they start hitting here on OCN.


----------



## Blindsay

Quote:


> Originally Posted by *criminal*
> 
> Both those look like they applied voltage and/or upped the clock speeds.
> 
> Do you actually believe that the leaked slides had any of that going on? If I was "leaking" slides of a card I want to look good, I would stick the competitors card in a hot box and run the tests.
> 
> Anyway, it doesn't matter, we will know in the short term once the cards are officially released. And once they start hitting here on OCN.


Yeah, at this point I'm not believing anything until the cards are actually out. I swear the details change each week.

I think I'll be keeping my 780 Classy though.









On a random side note, your first GPU and your current GPU are exactly the same as mine lol


----------



## Usario

Nope. Anandtech was just demonstrating the overvolting feature, further evidenced by this later in the article:

GeForce GTX Titan Average Clockspeeds
Max Boost Clock 992MHz
DiRT:S 992MHz
Shogun 2 966MHz
Hitman 992MHz
Sleeping Dogs 966MHz
Crysis 992MHz
Far Cry 3 979MHz
Battlefield 3 992MHz
Civilization V 979MHz

And in the screenshot from Guru3D there was absolutely nothing about upping clock speeds, and you can clearly see "Core Voltage - +0mV".

And you can also see later in their review that when they actually overclocked the card, it was boosting up to 1176MHz max.


----------



## s-x

Quote:


> Originally Posted by *2010rig*
> 
> Definitely looks like 64, and makes more sense.
> Trust us, it's worth it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It will ridicule a Titan!
> 
> In 1 game that requires Mantle optimizations, and will be available ONLY 2 months after its release... *A*lways "*M*inor" *D*elays


Yeah... about that. The optimizations are being done to the engine, not BF4. So any game using Frostbite 3 will have Mantle optimizations, which currently include, but are not limited to, Battlefield 4, Mirror's Edge 2, Dragon Age: Inquisition, Need for Speed: Rivals, Plants Vs Zombies: Garden Warfare, Mass Effect 4 and Star Wars: Battlefront.

Pretty big games, huh? Then you have companies like Activision already pledging to use Mantle, and an AMD manager saying in an interview that there were even more undisclosed companies pledging to use it.

What are you going to come up with next? Nvidia's PhysX fur in The Witcher 3 performs 9000% better than on AMD? Selective thinking, bro.


----------



## keikei

Quote:


> Originally Posted by *s-x*
> 
> Yeah... about that. The optimizations are being done to the engine, not BF4. So any game using frostbite 3 will have mantle optimizations, which currently include, but are not limited to Battlefield 4, Mirror's Edge 2, Dragon Age: Inquisition, Need for Speed: Rivals, Plants Vs Zombies: Garden Warfare, Mass Effect 4 and Star Wars: Battlefront.
> 
> Pretty big games, huh? Then you have people like activision already pledging to use mantle, and an AMD manager being interviewed saying their were even more undisclosed company's pledging to use mantle.


Oh wow!


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *criminal*
> 
> 780 @ 1k versus a regular Titan that boosts to 876. Clock for clock GK110 appears to be faster and overclocks like a beast.


I have the EVGA SC's and mine boost to 1006MHz completely stock...


----------



## Testier

Quote:


> Originally Posted by *s-x*
> 
> Yeah... about that. The optimizations are being done to the engine, not BF4. So any game using frostbite 3 will have mantle optimizations, which currently include, but are not limited to Battlefield 4, Mirror's Edge 2, Dragon Age: Inquisition, Need for Speed: Rivals, Plants Vs Zombies: Garden Warfare, Mass Effect 4 and Star Wars: Battlefront.
> 
> Pretty big games, huh? Then you have people like activision already pledging to use mantle, and an AMD manager being interviewed saying their were even more undisclosed company's pledging to use mantle.
> 
> What are you going to come up with next? Nvidia's physx fur in the witcher 3 performs 9000% better than AMD? Selective thinking bro.


Except there's zero proof of how much Mantle optimization actually does. Also, thank you for bringing up "fur". The Witcher 3 is the top game I am looking forward to.


----------



## PureBlackFire

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I have the EVGA SC's and mine boost to 1006MHz completely stock...


1074mhz boost.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *PureBlackFire*
> 
> 1074mhz boost.


For your 780 SC?


----------



## Regent Square

Let's start the NvA, I am ready.


----------



## maarten12100

Quote:


> Originally Posted by *2010rig*
> 
> Definitely looks like 64, and makes more sense.
> Trust us, it's worth it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It will ridicule a Titan!
> 
> In 1 game that requires Mantle optimizations, and will be available ONLY 2 months after its release... *A*lways "*M*inor" *D*elays


Cards with GCN will match Nvidia and beat them hands down in any Mantle game.
And they were saying consoles were too low-margin to bother with (well, they have no console CPU besides ARM Tegra cores)


----------



## specopsFI

My personal experience with both a reference Titan and a reference 780 is perfectly in line with what [H] wrote in their Titan OC article. Under sustained heavy load such as Sleeping Dogs or even 3DMark11, a reference GK110 card will eventually throttle down all the way to base clock. How long it takes and how often that happens is subject to cooling and chip quality, but all of them *will* throttle unless the power and temp limits are raised. Just by raising those two limits to their max will give about 10% more FPS in long benchmark runs.
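The behaviour boils down to something like this toy Python model. The limit numbers here are made up for illustration; the real boost algorithm is far more granular:

```python
# Toy model of GPU Boost-style throttling: the card runs at its boost clock
# until temperature or power exceeds a limit, then falls back to base clock.
# All numbers are illustrative, not NVIDIA's actual tables.

def effective_clock(base_mhz, boost_mhz, temp_c, power_pct,
                    temp_limit=80, power_limit=100):
    if temp_c > temp_limit or power_pct > power_limit:
        return base_mhz   # sustained heavy load: throttled to base
    return boost_mhz      # within limits: free to boost

print(effective_clock(837, 993, temp_c=85, power_pct=98))  # 837 (throttled)
print(effective_clock(837, 993, temp_c=75, power_pct=98))  # 993 (boosting)
# Raising the two limits keeps the card boosting under the same load:
print(effective_clock(837, 993, temp_c=85, power_pct=98,
                      temp_limit=94, power_limit=106))     # 993
```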


----------



## PureBlackFire

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> 1074mhz boost.
> 
> For your 780 SC?


yea.


----------



## Blindsay

Hexus noted that the 780 Classified boosted to 1163MHz even

http://hexus.net/tech/reviews/graphics/60269-evga-geforce-gtx-780-classified/


----------



## Moragg

Quote:


> Originally Posted by *Testier*
> 
> Except zero proof of how much mantle optimization actually do. Also, thank you for bring up "fur". Witcher 3 is top game I am looking forward to.


Mantle should provide at the very least a 20% improvement. The performance boost will be nice, but the lower CPU overhead could mean the difference in gaming performance between the 8350 and the much more expensive 4770K becomes too small to worry about. The 9x more draw calls means many more things can be drawn in the same frame, which will make games look better and feel more immersive.

Add to all that TrueAudio, which would further free up CPU resources and provide sorely needed high-quality 3D positional audio, and sell it for lower prices than Nvidia sell their GPUs, and AMD has a potential winner on their hands here. Sure, maybe I don't get performance-decreasing fog, fur, and explosions, but the upsides of Hawaii are pretty big.
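To see why per-draw-call overhead matters, here's a back-of-the-envelope Python model. The per-call costs are made-up illustrative figures, not measured Mantle or DX11 numbers:

```python
# Rough frame-budget model: how many draw calls fit in the CPU time
# available for one frame, given a fixed cost per call.

def max_draw_calls(frame_budget_ms, cost_per_call_us):
    return int(frame_budget_ms * 1000 / cost_per_call_us)

budget_ms = 16.7  # one frame at 60 fps

thick_api = max_draw_calls(budget_ms, 40.0)  # e.g. a heavyweight API layer
thin_api = max_draw_calls(budget_ms, 5.0)    # e.g. a thin, console-like API

print(thick_api, thin_api)  # 417 vs 3340: ~8x more calls per frame
```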


----------



## Terse

Mantle is in zero games right now, we don't have anything besides speculation at this point. If game devs are just now pledging to use it, we probably won't see many games using it within a year, maybe longer.


----------



## Moragg

Quote:


> Originally Posted by *Terse*
> 
> Mantle is in zero games right now, we don't have anything besides speculation at this point. If game devs are just now pledging to use it, we probably won't see many games using it within a year, maybe longer.


Definitely not. AMD have only been working on it for 2 years and have made it close to the console APIs to make ports as easy as possible







And it's only tiny things like the Frostbite 3 engine that are going to use Mantle.

Look at console hardware and how much they benefit from optimisation, then tell me we won't get at least a 20% boost from Mantle.


----------



## bmt22033

Quote:


> Originally Posted by *CBZ323*
> 
> Clearly, Nvidia had no titan competitor in the single GPU market, so they marked up the prices as much as people were willing to pay, as any normal company would do. This is the tech hardware business, not a charity.


How dare you suggest that a company operating in a free market be allowed to charge what the market will bear for their products?!?!?!?


----------



## maarten12100

Now everything seems to turn out better than expected...

I hope it'll have a nice backplate like my HD5870 has:


----------



## criminal

Quote:


> Originally Posted by *Blindsay*
> 
> Yeah at this point im not believing anything until the cards are actually out, i swear the details change each week,
> 
> I think ill be keeping my 780 classy though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On a random side note, your first gpu and your current gpu are exactly the same as mine lol












Quote:


> Originally Posted by *Usario*
> 
> Nope. Anandtech was just demonstrating the overvolting feature, further evidenced by this later in the article:
> 
> GeForce GTX Titan Average Clockspeeds
> Max Boost Clock 992MHz
> DiRT:S 992MHz
> Shogun 2 966MHz
> Hitman 992MHz
> Sleeping Dogs 966MHz
> Crysis 992MHz
> Far Cry 3 979MHz
> Battlefield 3 992MHz
> Civilization V 979MHz
> 
> And in the screenshot from Guru3D there was absolutely nothing about upping clock speeds and you can clearly see "Core Voltage - +0mV"
> 
> And you can also see later in their review that when they actually overclocked the card they were boosting up to 1176MHz max


Oh well, my head hurts. I still stand by my statement that clock for clock GK110 will be faster. Maybe not by much, but faster. I haven't used a boost-enabled BIOS in so long, I can't remember how my Titan or this 780 fares.


----------



## Terse

Quote:


> Originally Posted by *Moragg*
> 
> Definitely not. AMD have only been working on it for 2 years and have made it close to the console APIs to make ports as easy as possible
> 
> 
> 
> 
> 
> 
> 
> And it's only tiny things like the Frostbite 3 engine that are going to use Mantle.
> 
> Look at console hardware and how much they benefit from optimisation, then tell me we won't get at least a 20% boost from Mantle.


If they have been working on it for 2 years, why is their biggest game partner only implementing it months after it releases? Will devs signing up now take 2 more years as well to actually release games with it?

I'm probably going red team this holiday season but I am definitely NOT using mantle or supposed 'console synergy' as justifications.


----------



## Moragg

Quote:


> Originally Posted by *Terse*
> 
> If they have been working on it for 2 years, why is their biggest game partner only implementing it months after it releases? Will devs signing up now take 2 more years as well to actually release games with it?
> 
> I'm probably going red team this holiday season but I am definitely NOT using mantle or supposed 'console synergy' as justifications.


Umm... no? BF4 is on Frostbite 3, and any other game on that engine will be optimised for Mantle. Another big dev is also using it for next-gen games, and supposedly others are too.

As for the release date -
1) So people don't find their 7970 plays BF4 amazingly and don't upgrade
2) To have more DX11 data to improve drivers
3) So issues with BF4 don't get blamed on them.
4) For final tweaking

So mostly about bringing a good addition rather than a half-complete item that has problems caused by other factors.

Edit: My bad, I accidentally put BF3 when I meant BF4


----------



## Gooberman

Quote:


> Originally Posted by *Moragg*
> 
> Umm... no? *BF3 is on Frostbite 3*, and any other game on that engine will be optimised for Mantle. Some other big dev is also using it for next-gen games, and supposedly others are also using it.
> 
> As for the release date -
> 1) So people don't find their 7970 plays BF4 amazingly and don't upgrade
> 2) To have more DX11 data to improve drivers
> 3) So issues with BF4 don't get blamed on them.
> 4) For final tweaking
> 
> So mostly about bringing a good addition rather than a half-complete item that has problems caused by other factors.


isn't bf3 on frostbite 2? lol


----------



## Regent Square

Quote:


> Originally Posted by *Gooberman*
> 
> isn't bf3 on frostbite 2? lol


It uses 80% of Frostbite 2. EA decided to call it Frostbite 3 for the sake of marketing.

Source: Leaks from EA China a while ago.


----------



## Mr357

Quote:


> Originally Posted by *Regent Square*
> 
> It uses 80% of frostbite 2. EA decided to call it Frostbite 3 for the sake of marketing.
> 
> Source: Leaks from EA China awhile ago.


BF3 or BF4?


----------



## Gooberman

Well it seems ea is marketing frostbite 2 with bf3 lol

EDIT: ^^ what he said up there lol


----------



## Ukkooh

Most probably Moragg just typoed BF4 as BF3 and Regent Square used his brain's autocorrect to read it as BF4. Move on, people.


----------



## Regent Square

Quote:


> Originally Posted by *Mr357*
> 
> BF3 or BF4?


BF4 - 80% of Frostbite

BF3 - 20%

Why? Because at the time of BF3, next-gen consoles were not ready to be released.

Now, they decided to unveil more of DICE's engine. BF4 is just BF3 with sidegrades/new weapons/maps. Aka, it resembles MW3 vs. MW2.


----------



## mltms

Quote:


> Originally Posted by *2010rig*
> 
> Definitely looks like 64, and makes more sense.
> Trust us, it's worth it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It will ridicule a Titan!
> 
> In 1 game that requires Mantle optimizations, and will be available ONLY 2 months after its release... *A*lways "*M*inor" *D*elays


Even if the R9 290X smashes the head of the Titan in all games and all settings, 4K or 800x600, stock and overclocked, you will continue to complain


----------



## Moragg

Quote:


> Originally Posted by *Ukkooh*
> 
> Most propably moragg just typoed BF4 as BF3 and regent square used his brain's autocorrect to read it as BF4. Move on people.


Yeah, my bad. Fixed









And may I congratulate you sir on your common sense


----------



## Archngamin

Quote:


> Originally Posted by *Regent Square*
> 
> BF4 - 80% of frostbite
> 
> BF3- 20%
> 
> Why? Cause at the time of BF3 next gen consoles were not ready to be released.
> 
> Now, they decided to unveil more of Dice` engine. BF4 is just BF3 with sidegrades/new weapons/maps. Aka, resembles MW3 and MW2.


Except the scale of BF games means the content added from BF3 to BF4 is more than MW2 and MW3 combined added to CoD4.


----------



## Regent Square

Quote:


> Originally Posted by *mltms*
> 
> even if the R9 290X Smash the head of the titan in all games and all setting 4K or 800x600 stock and overclock you Will continue to complain


They will say Nvidia has better drivers/PhysX and 3DMark scores. Then they will abandon the forum, buy AMD cards, and come back when Nvidia releases something better.


----------



## Regent Square

Quote:


> Originally Posted by *Archngamin*
> 
> Except the scale of BF games means the content added from BF3 to BF4 is more than MW2 and MW3 combined added to CoD4.


Yeah, more guns, more perks.

And dead BF3 support.


----------



## vish92

Looks like the AMD R9 290X is approx. 2x faster than the Titan

Look at this

Notice FC3 gets about 60FPS



Now look at Hexus 4K gaming review

http://hexus.net/tech/reviews/displays/57849-asus-pq321q-4k-gaming-tried-tested/?page=5

Here 7970 scores 27.8FPS while Titan scores 35.4FPS

AMD won't run FC3 at lower settings just for presentation decks.


----------



## kot0005

Just got this email lol, and Newegg doesn't ship to Australia haha.


----------



## grunion

A couple of months ago I sat in on a marketing meeting; the cards they showed us had the dual-slot DCII coolers but were black & gold themed like the Z87 boards.
Wonder what happened there








And the numbers I saw weren't that impressive; the card that was demoed at stock = 7970 Matrix Platinum @ 1200 in FS.


----------



## TamaDrumz76

Me being without my 7970 for a few weeks now (damn MSI pulling an "oops" on my RMA) really makes me want a 290X... Hell, it makes me want a video card, period.

I wonder with all this horsepower if there is going to be any feasible way of getting these new AMD cards to support down-sampling. I've tried tweeting and Fbooking them, telling them to make it a feasible option like it is on NV products.


----------



## mltms

Quote:


> Originally Posted by *vish92*
> 
> Looks like AMD R9290X is approx 2x faster than titan
> 
> Look at this
> 
> Notice FC3 gets about 60FPS
> 
> 
> 
> Now look at Hexus 4K gaming review
> 
> http://hexus.net/tech/reviews/displays/57849-asus-pq321q-4k-gaming-tried-tested/?page=5
> 
> Here 7970 scores 27.8FPS while Titan scores 35.4FPS
> 
> AMD wont run FC3 at lower settings just for presentation decks.


Look at Crysis 3: the Titan gets 26 and the R9 290X is about 35. That is 35-40% faster.


----------



## Testier

Quote:


> Originally Posted by *vish92*
> 
> Looks like AMD R9290X is approx 2x faster than titan
> 
> Look at this
> 
> Notice FC3 gets about 60FPS
> 
> 
> 
> Now look at Hexus 4K gaming review
> 
> http://hexus.net/tech/reviews/displays/57849-asus-pq321q-4k-gaming-tried-tested/?page=5
> 
> Here 7970 scores 27.8FPS while Titan scores 35.4FPS
> 
> AMD wont run FC3 at lower settings just for presentation decks.


Oh? I was not aware we had OFFICIAL BENCHMARKS. Please link me the page if possible. Otherwise, this bench is worthless; it could easily be photoshopped.


----------



## Forceman

Quote:


> Originally Posted by *vish92*
> 
> AMD wont run FC3 at lower settings just for presentation decks.


Ha. Good one. Just like they would never manipulate the scales to emphasize tiny differences.


----------



## vish92

You, sir, didn't read the chart correctly.

Crysis 3 as per the AMD slide gets 45-46 FPS.

Titan gets 26 FPS.

So, 46/26 equals approx. 80% faster.

Clock the mem to 1.5GHz and you will see it at 100%.

That is 2x.
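For the record, here's the arithmetic checked in Python (slide numbers as quoted, unverified leak figures); strictly it comes out a bit under 80%:

```python
# Speedup implied by the quoted slide numbers.
r9_290x_fps = 46
titan_fps = 26

speedup = r9_290x_fps / titan_fps
print(f"{speedup:.2f}x, i.e. {(speedup - 1) * 100:.0f}% faster")  # 1.77x, i.e. 77% faster
```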


----------



## vish92

Quote:


> Originally Posted by *Testier*
> 
> Oh? I was not aware we had OFFICAL BENCHMARKS. Please link me the page if possible. Otherwise, this bench is worthless, it could be easily photoshopped.


http://wccftech.com/amd-radeon-r9-290x-radeon-r9-290-series-official-presentation-leaked-4k-gaming-performance-unveiled/


----------



## wstanci3

Quote:


> Originally Posted by *vish92*
> 
> Looks like AMD R9290X is approx 2x faster than titan
> 
> *AMD wont run FC3 at lower settings just for presentation decks*.


Yeah. Sure.
If you are marketing a product, you want to show your product in the best light possible. Everyone should take the marketing slides with a heap of salt and wait for the official benchmarks.


----------



## criminal

Quote:


> Originally Posted by *vish92*
> 
> You sir didnt read the chart correctly
> 
> Crysis 3 as per AMD slide gets 45-46 FPS
> 
> Titan gets 26 FPS
> 
> So, 46/26 equals approx 80%
> 
> Clock the mem to 1.5Ghz and you will see it at 100%
> 
> that is 2x


If you believe those charts to be 100% fact, well... I have some ocean front property in Arizona for sale.


----------



## sugarhell

So that means 64 ROPs?


----------



## vish92

Things to note before coming up against me:

1. I am unbiased (I own a 560 and am praising AMD)

2. The graph is showing equal increments and starting from 0

3. Let's deduct 10% perf as a marketing tactic, but we will still be left with 70%; that's great


----------



## Moragg

Quote:


> Originally Posted by *TamaDrumz76*
> 
> Me being without my 7970 for a few weeks now (damn MSI pulling an "oops" on my RMA) really makes me want a 290X... Hell, it makes me want a video card, period.
> 
> I wonder with all this horsepower if there is going to be any feasible way of getting these new AMD cards to support down-sampling. I've tried tweeting and Fbooking them, telling them to make it a feasible option like it is on NV products.


I believe that's called SSAA.
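For anyone wondering what down-sampling actually does: render at a higher resolution, then box-filter blocks of pixels down to the display resolution. A toy Python sketch with made-up grayscale values:

```python
# Down-sampling / ordered-grid SSAA in miniature: average each 2x2 block
# of a high-res grayscale "render" into one output pixel.

def downsample_2x(image):
    h, w = len(image), len(image[0])
    return [
        [(image[y][x] + image[y][x + 1] +
          image[y + 1][x] + image[y + 1][x + 1]) // 4
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]

# A hard black/white edge rendered at 4x4...
hires = [
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 255, 255, 255],
    [0, 255, 255, 255],
]
print(downsample_2x(hires))  # [[0, 255], [127, 255]] - the edge is smoothed
```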
Quote:


> Originally Posted by *vish92*
> 
> You sir didnt read the chart correctly
> 
> Crysis 3 as per AMD slide gets 45-46 FPS
> 
> Titan gets 26 FPS
> 
> So, 46/26 equals approx 80%
> 
> Clock the mem to 1.5Ghz and you will see it at 100%
> 
> that is 2x


While I've been touting the 290X as pretty darn good, I don't believe the 290X can be twice as good as Titan without some majorly serious optimisation. No details of quality settings have been released, so I don't think the benches can be considered useful. I'll wait for an independent reviewer to do a test using their best test-bench that they also used for Titan with exactly the same settings.

A properly documented testing procedure by independent competent people should never be underestimated.


----------



## criminal

Quote:


> Originally Posted by *vish92*
> 
> Things to note before coming up against me
> 
> 1. I am unbiased (own a 560 praising AMD )
> 
> 2. The graph is showing equal increments and starting from 0
> 
> 3.Lets deduct 10% perf as marketing tactic but still we will be left with 70% thats great


I am unbiased as well. Even though I own a 780 Classified, I still praise AMD. See, I can type that too.

It is not a matter of us going against you. Those slides are intended to make AMD's new cards look really good. They do not show any settings, so as far as we know they're running everything on low. Best to take those slides with a grain of salt.
Quote:


> Originally Posted by *Moragg*
> 
> While I've been touting the 290X as pretty darn good, I don't believe the 290X can be twice as good as Titan without some majorly serious optimisation. No details of quality settings have been released, so I don't think the benches can be considered useful. I'll wait for an independent reviewer to do a test using their best test-bench that they also used for Titan with exactly the same settings.
> 
> A properly documented testing procedure by independent competent people should never be underestimated.


This^

Don't get me wrong though, if the 290X was really that fast, I would seriously be looking at a way to obtain one ASAP... lol


----------



## vish92

Quote:


> Originally Posted by *criminal*
> 
> I am unbiased as well. Even though I own a 780 Classified, I still praise AMD. See I can type that too.
> 
> It is not a matter of us going against you. Those slides are intended to make AMD's new cards look really good. They do not show any settings, so as far as we know they running everything thing on low. Best to take those slides with a grain of salt.


To put it in context, a single post is enough to derail a thread, so it was just a reminder not to waste time on my post and to move ahead.


----------



## mtcn77

Those extra render back-ends will really push the resolution & SSAA ceiling. I recall my 4890 not being much faster than a 5770; in fact, the frame delivery was quite jumpy.


----------



## Forceman

Quote:


> Originally Posted by *vish92*
> 
> Things to note before coming up against me
> 
> 1. I am unbiased (own a 560 praising AMD )
> 
> 2. The graph is showing equal increments and starting from 0
> 
> 3.Lets deduct 10% perf as marketing tactic but still we will be left with 70% thats great


If you really believe AMD has figured out a way to get 70% greater performance from a similar number of shaders, then I don't know what to say. Those slides have no information at all besides the resolution, so trying to compare them to other posted benchmarks is completely pointless.


----------



## dir_d

It's kinda funny to see everyone drink the AMD Kool-Aid instead of the Nvidia Kool-Aid.


----------



## EastCoast

Quote:


> Originally Posted by *Forceman*
> 
> If you really believe AMD has figured out a way to get 70% greater performance from a similar number of shaders, then I don't know what to say. Those slides have no information at all besides the resolution, so trying to compare them to other posted benchmarks is completely pointless.


Ok, we get it that you aren't thrilled by the latest charts. However, the opinion remains that the R9 290X will hold the performance crown.


----------



## maarten12100

Quote:


> Originally Posted by *Regent Square*
> 
> They will say Nvidia has better drivers/ PhysX, and 3d mark score. then they will abandon the forum, buy amd cards and come back when NVidia releases somt better.


Yep, that's Nvidia fanboys for you in a nutshell.
"Better drivers" is such a joke; more like the opposite. Both could have PhysX, but Nvidia is keeping it proprietary to claim it is special. As for benchmarks, well, you can't play a benchmark anyway, nor can it do anything useful besides grade and load your hardware.

I'm so thrilled for these cards "something is emerging"
Quote:


> Originally Posted by *Forceman*
> 
> Ha. Good one. Just like they would never manipulate the scales to emphasize tiny differences.


Nvidia does that too, and so does Intel; it is just a marketing gimmick.
The chart starts at zero, though, and has a numbered Y axis. (Nothing about the settings, however; if that is maxed out at 4K UHD then I'm even more impressed.)


----------



## provost

132 pages of opinions, but no independent benchmarks to prove the facts...will check back again when this thread is at 264 pages


----------



## Roaches

PhysX is a flop!

Source: My GTX 680s and Planetside 2 GPU physics

Expect frequent crashes every 3-5 minutes in that game....


----------



## Regent Square

Quote:


> Originally Posted by *maarten12100*
> 
> Yep that are Nvidia fanboys for you in a nutshell.
> Better drivers such a joke more like the opposite, both can have physX but Nvidia is making it proprietary to claim it is special, benchmarks well what can you do about it you can't play a benchmark anyways nor can it do something usefull beside grade and load your hardware.
> 
> I'm so thrilled for these cards "something is emerging"


Then they all switched from Titans to 780s, meanwhile commenting how each fps matters and the price does not.









I will never pick up a Titan at this point unless it is a bargain, aka $400.


----------



## keikei

Quote:


> Originally Posted by *vish92*
> 
> Looks like AMD R9290X is approx 2x faster than titan


If true....







I think we've found our *1 card* solution to 4K gaming. Jesus, please be true.


----------



## Regent Square

Quote:


> Originally Posted by *Roaches*
> 
> PhysX is a flop!
> 
> Source: My GTX 680s and Planetside 2 GPU physics
> 
> Expect frequent crashes every 3-5 minutes in that game....


When I read your sig, instead of pronouncing housefire, I say housewife... LOL


----------



## Forceman

Quote:


> Originally Posted by *EastCoast*
> 
> Ok, we get it that you aren't thrilled by the latest charts. However, the opinion remains that the R9 290X will hold the performance crown.


No one said it wouldn't, but the idea that it will be twice as fast as a Titan at 4K (or even 70%) based on those charts is ridiculous.
Quote:


> Originally Posted by *maarten12100*
> 
> Nvidia does that too and so does Intel it is just a marketing gimick.
> Chart starts at zero though and has a numbered Y axis.(nothing about the settings those if that is maxed at 4K UHD then I'm even more impressed)


Of course they do, which is why you can never make judgements based on marketing slides.


----------



## bencher

Quote:


> Originally Posted by *vish92*
> 
> Looks like AMD R9290X is approx 2x faster than titan
> 
> Look at this
> 
> Notice FC3 gets about 60FPS
> 
> 
> 
> Now look at Hexus 4K gaming review
> 
> http://hexus.net/tech/reviews/displays/57849-asus-pq321q-4k-gaming-tried-tested/?page=5
> 
> Here 7970 scores 27.8FPS while Titan scores 35.4FPS
> 
> AMD wont run FC3 at lower settings just for presentation decks.


----------



## Moragg

Quote:


> Originally Posted by *keikei*
> 
> If true....
> 
> 
> 
> 
> 
> 
> 
> I think we've found our *1 card* solution to 4K gaming. Jesus, please be true.


Titan will give you playable fps at 4K. Just turn down the quality settings









512-bit bus/64 ROPs should give AMD a distinct advantage at the higher resolutions, but you have to consider how much that will affect us. I'm running Korean 1440p now, and can't see myself being able to afford or wanting a higher res display for a while. My next upgrade will be whenever a new tech like quantum dot or OLED becomes available.


----------



## Roaches

Quote:


> Originally Posted by *Regent Square*
> 
> When I read your sig, instead of pronouncing housefire, I say housewife... LOL


"Nvidia Housewife Enthusiast"

You just made my day









No seriously, Planetside 2 really pissed me off when it comes to PhysX... even other people in my outfit with high-end Nvidia setups are complaining too. It's a shame not to experience the extra eye candy after investing in a beefy GPU...


----------



## bencher

Quote:


> Originally Posted by *Moragg*
> 
> *Titan will give you playable fps at 4K. Just turn down the quality settings
> 
> 
> 
> 
> 
> 
> 
> *
> 
> 512-bit bus/64 ROPs should give AMD a distinct advantage at the higher resolutions, but you have to consider how much that will affect us. I'm running Korean 1440p now, and can't see myself being able to afford or wanting a higher res display for a while. My next upgrade will be whenever a new tech like quantum dot or OLED becomes available.


Shouldn't need to do that after spending 1k on a gpu.


----------



## Moragg

Quote:


> Originally Posted by *bencher*
> 
> Shouldn't need to do that after spending 1k on a gpu.


4K is a crazy number of pixels. I reckon 290X + Mantle is the only chance of an ultra (no) single gpu solution for 4K. And with that I expect you to be hovering around 30fps at best for AAA games.


----------



## Regent Square

Quote:


> Originally Posted by *Roaches*
> 
> "Nvidia Housewife Enthusiast"
> 
> You just made my day
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No seriously; Planetside 2 really pissed me off when it comes to PhysX...even other people in my outfit with high end Nvidia setups are complaning too....its a shame to not to experience the extra eye candy with investment on a beefy GPU....


LOLz

Planetside 2 gets played a lot. PhysX is def. a flop. All I know is, considering how this gen of cards is going, I won't be buying an Nvidia GPU unless it is $400.


----------



## keikei

Quote:


> Originally Posted by *Moragg*
> 
> 4K is a crazy number of pixels. I reckon 290X + Mantle is the only chance of an ultra (no) single gpu solution for 4K. And with that I expect you to be hovering around 30fps at best for AAA games.


Very exciting times for PC gaming. Kinda funny that 1080p used to be the pinnacle resolution. So, we get real benchmarks on the 15th?


----------



## maarten12100

Quote:


> Originally Posted by *Moragg*
> 
> Titan will give you playable fps at 4K. Just turn down the quality settings
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 512-bit bus/64 ROPs should give AMD a distinct advantage at the higher resolutions, but you have to consider how much that will affect us. I'm running Korean 1440p now, and can't see myself being able to afford or wanting a higher res display for a while. My next upgrade will be whenever a new tech like quantum dot or OLED becomes available.


My 570 gives playable fps at 4K UHD with reasonable settings, no AA.
Quote:


> Originally Posted by *bencher*
> 
> Shouldn't need to do that after spending 1k on a gpu.


The people that buy Titan are usually the people that would buy the overpriced 4K screens just throwing away money for the bragging rights. (I like my MVA black levels)
Quote:


> Originally Posted by *Moragg*
> 
> 4K is a crazy number of pixels. I reckon 290X + Mantle is the only chance of an ultra (no) single gpu solution for 4K. And with that I expect you to be hovering around 30fps at best for AAA games.


It's only 8.3MP, so really nothing special compared to 4-way 1080p Eyefinity or 2-way 1440p.
The only thing I don't like is the lack of CF connectors on this card; I was hoping for at least two.


----------



## criminal

Quote:


> Originally Posted by *Moragg*
> 
> 4K is a crazy number of pixels. I reckon 290X + Mantle is the only chance of an ultra (no) single gpu solution for 4K. And with that I expect you to be hovering around 30fps at best for AAA games.


Yeah, 4K is something serious. Good thing I have no plans to go that route until it becomes more mainstream.

If the 290X somehow pulls off a 70% increase over the Titan... well, let's just say prices on Nvidia cards will bottom out quick.

Anyone want to buy a 780 Classified?


----------



## DzillaXx

Quote:


> Originally Posted by *criminal*
> 
> Yeah 4k is something serious. Good thing I have no plans to go that route until it becomes more mainstream.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If the 290X somehow pulls off 70% increase over the Titan... well let's just say prices on Nvidia cards will bottom out quick.
> 
> Anyone want to buy a 780 Classified?


Personally, I think we will only see those increases at a resolution like 4K.

1080p or 1440p users will most likely find that the Titan and 290X are pretty close at those resolutions.


----------



## Usario

Quote:


> Originally Posted by *vish92*
> 
> Looks like AMD R9290X is approx 2x faster than titan
> 
> Look at this
> 
> Notice FC3 gets about 60FPS
> 
> Now look at Hexus 4K gaming review
> 
> http://hexus.net/tech/reviews/displays/57849-asus-pq321q-4k-gaming-tried-tested/?page=5
> 
> Here 7970 scores 27.8FPS while Titan scores 35.4FPS
> 
> AMD wont run FC3 at lower settings just for presentation decks.


Can't really compare to other benchmark scores since we don't know what settings AMD was using.

But those 64 ROPs are definitely going to give AMD a big advantage in high-resolution as well as high-AA situations.
Quote:


> Originally Posted by *keikei*
> 
> If true....
> 
> 
> 
> 
> 
> 
> 
> I think we've found our *1 card* solution to 4K gaming. Jesus, please be true.


AMD probably wasn't using the best settings. 4K will probably remain a multi-GPU-only affair, unless you're fine with spending thousands on a monitor and $700 on a graphics card to play at medium.


----------



## Blindsay

Quote:


> Originally Posted by *criminal*
> 
> Yeah 4k is something serious. Good thing I have no plans to go that route until it becomes more mainstream.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If the 290X somehow pulls off 70% increase over the Titan... well let's just say prices on Nvidia cards will bottom out quick.
> 
> Anyone want to buy a 780 Classified?


As crazy as that would be, I find it highly unlikely; my money is on it matching the Titan.


----------



## criminal

Quote:


> Originally Posted by *Blindsay*
> 
> as crazy as that would be i find it highly unlikely, my money is on it matching the titan


Oh I agree. I was just joshing.


----------



## Ultracarpet

Just curious about some points brought up earlier, that it would have to be some sort of miracle for AMD to beat the Titan. As a rough estimate, the Titan is usually about ~30 percent faster than a 7970, right? So with 2x the ROPs, ~30% more shaders, a 512-bit memory bus, etc., how could it not compete? Even if the R9 290X has no core improvements over the 7970 at all, technically speaking it should trade blows with a Titan, no? Any core improvements on top of all those upgraded specs, which AMD has had two years to make, are just icing on the cake compared to a Titan. I saw alatar a while back saying GCN 2.0 needs to be like 30 percent faster than 1.0 for the 290X to compete with the Titan, and I don't understand why that has to be. I would like someone to explain it to me.
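As a rough back-of-envelope version of that argument (my own toy numbers and simplistic scaling assumptions, not real benchmarks):

```python
# Hypothetical sketch of the spec-scaling argument above.
# Assumes performance scales with unit counts, which is a big
# simplification -- real scaling is sub-linear and workload-dependent.

hd7970 = {"shaders": 2048, "rops": 32, "bus_bits": 384}
r9_290x = {"shaders": 2816, "rops": 64, "bus_bits": 512}

shader_ratio = r9_290x["shaders"] / hd7970["shaders"]  # 1.375x
rop_ratio = r9_290x["rops"] / hd7970["rops"]           # 2.0x
bus_ratio = r9_290x["bus_bits"] / hd7970["bus_bits"]   # ~1.33x

titan_vs_7970 = 1.30  # the "~30 percent faster" estimate quoted above

# Even the shader increase alone exceeds the ~30% Titan gap on paper,
# before any architectural (GCN 2.0) improvements are counted.
print(f"shaders: +{(shader_ratio - 1) * 100:.1f}%")
print(f"ROPs:    +{(rop_ratio - 1) * 100:.1f}%")
print(f"bus:     +{(bus_ratio - 1) * 100:.1f}%")
print(shader_ratio > titan_vs_7970)  # True
```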


----------



## bencher

Quote:


> Originally Posted by *DzillaXx*
> 
> Personally we will only see those increases on a rez like 4K
> 
> 1080p or 1440p users will most likely find that the Titan and 290X are pretty close at those Rez.


I agree.


----------



## Regent Square

Quote:


> Originally Posted by *Ultracarpet*
> 
> Just curious as to some points brought up earlier about it having to be some sort of miracle for amd to beat the titan... as a rough estimate the titan is usually about ~30 percent faster than a 7970 right? so with 2x the rop's, 30% more shaders, 512 bit memory bus etc... how could it not compete? if the r9 290x has no core improvements over the 7970 at all, it technically speaking should trade blows with a titan no? So if you added any core improvements at all on-top of all these upgraded specs, which amd has had 2 years to do, it's just icing on top the cake when being compared to a titan. Like I saw alatar a while back saying gcn2.0 needs to be like 30 percent faster than 1.0 in order for the 290x to compete with the titan... I don't understand why that has to be. I would like someone to explain it to me.


This forum is filled with Nvidia fanboys. Some spread false info; others trash AMD any way they can while praising Nvidia. Don't look for logical explanations here. People here are not AMD engineers; take them with a grain of salt.


----------



## $ilent

is anyone excited for the 290?


----------



## selk22

Quote:


> Originally Posted by *$ilent*
> 
> is anyone excited for the 290?


Me, personally. Because I think the 290X will be slightly out of reach, the 290 should be a very solid upgrade from a 580 1.5GB. It gives me room for Crossfire later for even better performance. I only game on a single 1920x1200 screen, so I think it's plenty.


----------



## Ultracarpet

Quote:


> Originally Posted by *$ilent*
> 
> is anyone excited for the 290?


<--- this guy. Probably what I'm going to buy


----------



## $ilent

Who's going to be up for some overvolting of these new cards, though? I'm thinking 1.4V should be enough...


----------



## Blameless

Quote:


> Originally Posted by *sugarhell*
> 
> 
> 
> That means 64 rop?


Since the 79xx, 78xx, R270, and R280 all have 32, yes.


----------



## CynicalUnicorn

Quote:


> Originally Posted by *Moragg*
> 
> 4K is a crazy number of pixels. I reckon 290X + Mantle is the only chance of an ultra (no) single gpu solution for 4K. And with that I expect you to be hovering around 30fps at best for AAA games.


If you have a setup that can handle 120Hz 1080p, it ought to work for a 4K TV since those are typically 30Hz. The closest thing to a single-card (which I interpret as anything that works with mITX) 4K solution right now is a 7990 and considering the 690's failure and 7990's mostly-failure, I doubt we'll see new dual-GPUs soon.


----------



## lacrossewacker

Quote:


> Originally Posted by *CynicalUnicorn*
> 
> If you have a setup that can handle 120Hz 1080p, it ought to work for a 4K TV since those are typically 30Hz. The closest thing to a single-card (which I interpret as anything that works with mITX) 4K solution right now is a 7990 and considering the 690's failure and 7990's mostly-failure, I doubt we'll see new dual-GPUs soon.


There are only a few recent AAA games that can run @ 60fps with SLI Titans.

Not only is 30Hz crap, you'll need at least a Titan to even hold that.


----------



## $ilent

Guys how valuable is AA on say BF3?

Is it worth spending, say, $300 extra on a GPU just to get 60fps with 4x AA/MSAA, as opposed to having full settings but no AA?


----------



## lacrossewacker

Quote:


> Originally Posted by *$ilent*
> 
> Guys how valuable is AA on say BF3?
> 
> Is it worth spending extra say $300 on a gpu just to get 60fps with 4x AA/MSAA as apposed to having full settings but no AA?


I'm personally one who'd sacrifice AA for fps/textures/shading, etc....

The need for AA does vary from game to game, though. Some games can get away without AA. Other games, BF3 for example, NEED AA to me.

You do have a point, though. Benchmarking games with 8x MSAA is pretty stupid.


----------



## DzillaXx

Quote:


> Originally Posted by *$ilent*
> 
> Guys how valuable is AA on say BF3?
> 
> Is it worth spending extra say $300 on a gpu just to get 60fps with 4x AA/MSAA as apposed to having full settings but no AA?


I just made the jump from GTX 470 SLI @ 850MHz to a single 7950 @ 1250MHz.

I lost 400 points in 3DMark 11.

I can now play games with MSAA at or near max settings.

Newer games like Crysis 3, the new BioShock, BF3, and BF4 have been smoother. BF3 performance was pretty bad with my 470s because of the drivers, until the last couple of releases, which helped a lot. BF4 has been butter on my 7950: 60 avg FPS @ 1080p with everything maxed, 4x MSAA, and 120% resolution scaling. Crysis 3 I can now play with MSAA on and it's still smoother.


----------



## CynicalUnicorn

Quote:


> Originally Posted by *lacrossewacker*
> 
> There are only a few recent AAA games that can run @ 60fps with SLI titans.
> 
> Not only is 30hz, crap, you'll need at least a titan to even hold that


30Hz sucks, but that's common for TVs. 4K is still a bit out of reach considering HD channels are still 720p sometimes. What do you mean by SLI Titans? 4-way? Given their performance in 1080p, and assuming perfect SLI and resolution scaling, I think you are referring to $4000 of GPUs to go with your $4000 TV.


----------



## Usario

Quote:


> Originally Posted by *$ilent*
> 
> is anyone excited for the 290?


Definitely. Coming in at ~$500-$550 it'll be a great deal, considering that the only difference between it and the 290X is 256 SPs.


----------



## $ilent

See, here's my dilemma... it's pretty much double the cost if I want to buy a GPU for my res to play with AA, as opposed to turning it off, if the AnandTech bench is anything to go by.

Can AA really be worth that much more for BF?


----------



## PureBlackFire

Quote:


> Originally Posted by *Usario*
> 
> Definitely. Coming in at ~$500-$550 it'll be a great deal, considering that the only difference between it and the 290X is 256 SPs.


Yeah, and if the base clock speed is really 100MHz higher than the 290X's, then even at stock it won't be much behind at all.


----------



## lacrossewacker

Quote:


> Originally Posted by *CynicalUnicorn*
> 
> 30Hz sucks, but that's common for TVs. 4K is still a bit out of reach considering HD channels are still 720p sometimes. What do you mean by SLI Titans? 4-way? Given their performance in 1080p, and assuming perfect SLI and resolution scaling, I think you are referring to $4000 of GPUs to go with your $4000 TV.


2x Titans is just enough for recent games at 60fps, but games will only get harder to run, too.


----------



## BradleyW

Quote:


> Originally Posted by *lacrossewacker*
> 
> 2xTitans is just enough for recent games at 60fps, but games will only get harder to run too


At what res?
I have all games maxed out except Witcher 2, and I achieve 60fps at all times. My CPU also helps on CPU-intensive games compared to my 3770K. Then again, CFX scales much better, and I only play at 1080p, not 1440p.


----------



## lacrossewacker

Quote:


> Originally Posted by *BradleyW*
> 
> At what res?
> I have all games max out except witcher 2 and I achieve 60fps at all times. My CPU also helps on CPU intensive games compared to my 3770k. Then again, CFX scales much better and I only play at 1080p HD, not 1440p.


4K.

It'd be more demanding than Witcher 2's ubersampling... and that's actually an easy game to run! Imagine applying that same "ubersampling" to Tomb Raider, Crysis, BF4, Metro, etc...


----------



## ZealotKi11er

Quote:


> Originally Posted by *$ilent*
> 
> See heres my dilemma...Its pretty much double the cost if I want to buy a gpu for my res to play with AA as apposed to turning it off if anandtech bench is anything to go buy.
> 
> Can AA really be worth that much more for BF?


AA @ 1440p makes no difference to my eye. The only game that needed AA, in my eye, was Crysis 3. The only time I use AA is when the CPU is the bottleneck. In reality it's not a worthwhile feature, at least in BF.


----------



## lacrossewacker

Quote:


> Originally Posted by *ZealotKi11er*
> 
> AA @ 1440p makes no difference in my eye. Only game that needed AA in my eye was Crysis 3. Only time i use AA is CPU is bottlneck. In reality not a worth feature at least in BF.


Oddly enough, you actually need more AA at 1600p than at 1440p.

A 27in 1440p monitor has a higher PPI than a 30in 1600p....
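For reference, the PPI numbers behind that claim work out like this (standard diagonal-PPI formula):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from resolution and diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 27), 1))  # ~108.8 PPI for a 27" 1440p panel
print(round(ppi(2560, 1600, 30), 1))  # ~100.6 PPI for a 30" 1600p panel
```

So the 30" 1600p screen has larger pixels, and aliasing is more visible on it despite the higher resolution.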


----------



## $ilent

Quote:


> Originally Posted by *ZealotKi11er*
> 
> AA @ 1440p makes no difference in my eye. Only game that needed AA in my eye was Crysis 3. Only time i use AA is CPU is bottlneck. In reality not a worth feature at least in BF.


So if that's the case, should I not just get a 280X, overclock it a bit, and then turn AA off or down? On AnandTech the 7970 GHz gets 50fps in BF3 at ultra with 4x MSAA, so I'm guessing with AA off that would be well over 60fps?

Unless I'm missing something and somehow BF4 is going to be a lot more GPU hungry... it's possible, but we won't know till the end of October.


----------



## keikei

Quote:


> Originally Posted by *$ilent*
> 
> See heres my dilemma...Its pretty much double the cost if I want to buy a gpu for my res to play with AA as apposed to turning it off if anandtech bench is anything to go buy.
> 
> Can AA really be worth that much more for BF?


If you want to win in BF, AA topples detail. If you can see more clearly than the other guy, because of a higher-res monitor or maxed-out AA, you have the advantage. IMO, AA is very important in BF. Set details/shadows to low or medium, but crank dat AA high, baby!


----------



## ZealotKi11er

Quote:


> Originally Posted by *$ilent*
> 
> So if thats the case should I not just get a 280X and overclock it a abit and then turn AA off or down? On anandtech the 7970 ghz gets 50fps on BF3 at ultra and 4xMSAA, so Im guessing with AA off that would be well over 60fps?
> 
> Unless im missing something and somehow BF4 is gonna be alot more GPU hungry...its possible but we wont know til end of October.


Right now, with Beta performance, you need 2x HD 7970 to get over 60fps @ Ultra with no AA. People like running one card; personally, I do too.


----------



## lacrossewacker

Quote:


> Originally Posted by *$ilent*
> 
> So if thats the case should I not just get a 280X and overclock it a abit and then turn AA off or down? On anandtech the 7970 ghz gets 50fps on BF3 at ultra and 4xMSAA, so Im guessing with AA off that would be well over 60fps?
> 
> Unless im missing something and somehow BF4 is gonna be alot more GPU hungry...its possible but we wont know til end of October.


Who knows how demanding BF4 will actually be. My OC'd 670 could handle BF3 at 1440p with no AA just fine, with the occasional slowdown if I was surrounded by smoke.

A single 280X wouldn't have much of a long life at 1440p with future games, though. If you go with the 280X or 7970, at least go 2-way CFX.


----------



## $ilent

See, I can't be bothered risking micro stutter with two GPUs; I know AMD keep releasing drivers, but honestly AMD drivers are complete garbage, and I wouldn't trust them to bring out a be-all-end-all driver to solve all the CF problems.

I'll just wait and see how much a 290 is then, or a 290X if I get lucky and it's cheaper than I imagine it will be.


----------



## wstanci3

Quote:


> Originally Posted by *$ilent*
> 
> See I cant be bothered risking getting micro stutter with 2 gpus; I know AMD keep releasing drivers but *honestly AMD drivers are complete garbage* and I wouldnt trust them to bring out a be all end of driver to solve all the CF problems.
> 
> Ill just wait and see how much a 290 is then, or 290x if i get lucky and its cheaper than I imagine it will be.


Oh dear. You are about to get flak. Incoming.


----------



## lacrossewacker

Quote:


> Originally Posted by *$ilent*
> 
> See I cant be bothered risking getting micro stutter with 2 gpus; I know AMD keep releasing drivers but honestly AMD drivers are complete garbage and I wouldnt trust them to bring out a be all end of driver to solve all the CF problems.
> 
> Ill just wait and see how much a 290 is then, or 290x if i get lucky and its cheaper than I imagine it will be.


Depending on the 290's release, you may want to keep an eye out for Nvidia's potential counter. It could be pathetic, but it could make a 780 (or even two) a very attractive offer.
Quote:


> Originally Posted by *wstanci3*
> 
> Oh dear. You are about to get flak. Incoming.


not everybody's play style falls right into that pretty little cushion that AMD released a month and a half ago

Maybe there will be a hardware fix with the 290s


----------



## Blackops_2

Quote:


> Originally Posted by *lacrossewacker*
> 
> Depending on the 290's release, you may want to keep an eye out for nvidias potential counter. It could be pathetic, but it could make a 780 (or even two) a very attractive offer


This ^. If the 290X matches Titan and exceeds the 780, Nvidia should lower prices on the 780, especially if the 290 trades a couple of blows with the 780 for $499.

Or at least one can hope, 'cause I don't want to pay $1,300 for a pair of either, lol.


----------



## AlphaC

Quote:


> Originally Posted by *$ilent*
> 
> is anyone excited for the 290?


If it is 90% of the R9-290X in shaders (2560) with everything else except clock speed intact (memory bus, memory speed, TMUs, ROPs, etc.) for $450 or less (tops), and coming in at <250W TDP, maybe...

The HD 7950 had its TMUs slashed from 128 to 112 in addition to the 12.5% shader loss (256 shaders), and that wasn't too bad. If the R9-290 has its TMUs intact and 90% of the R9-290X's shaders, it'd be a stronger derivative than the HD 7950.

The rumors for the GTX 770 Ti say it will have 1920 shaders (5 disabled SMXes, with possibly one or two GPCs disabled), 48 ROPs, and 160 TMUs (texture units), down from 192 on the GTX 780 or 224 on TITAN.


----------



## wstanci3

The 290 does sound very intriguing: ~10%-15% less performance, but maybe $100 cheaper than the 290X. I could live with that.
Assuming, of course, Nvidia doesn't counter by slashing prices and/or bringing more cards into the fold. Competition is a glorious thing.


----------



## $ilent

Quote:


> Originally Posted by *lacrossewacker*
> 
> Depending on the 290's release, you may want to keep an eye out for nvidias potential counter. It could be pathetic, but it could make a 780 (or even two) a very attractive offer
> not everybody's play style falls right into that pretty little cushion that AMD released a month and a half ago
> 
> Maybe there will be a hardware fix with the 290s


I don't have £550 to spend on a GTX 780, to be honest; frankly, I'm not chuffed about having to fork out around £400 for this new AMD GPU either. Also, I want the new GPU in time for BF4; I don't want to wait and play at crap settings for a month until NV decide to drop prices.


----------



## Clocknut

Quote:


> Originally Posted by *criminal*
> 
> Yeah 4k is something serious. Good thing I have no plans to go that route until it becomes more mainstream.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If the 290X somehow pulls off 70% increase over the Titan... well let's just say prices on Nvidia cards will bottom out quick.
> 
> Anyone want to buy a 780 Classified?


Maybe with a highly optimized Mantle game.


----------



## SchmoSalt

Quote:


> Originally Posted by *$ilent*
> 
> See I cant be bothered risking getting micro stutter with 2 gpus; I know AMD keep releasing drivers but honestly AMD drivers are complete garbage and I wouldnt trust them to bring out a be all end of driver to solve all the CF problems.
> 
> Ill just wait and see how much a 290 is then, or 290x if i get lucky and its cheaper than I imagine it will be.


It's not like nVidia drivers are that amazing either. The main reason I switched to ATI was nVidia's drivers: I'd get game crashes and even monthly BSODs from nVidia drivers from early 2011 to December 2012. I haven't had any problems with my 7850 or 7950 since.


----------



## Roaches

Quote:


> Originally Posted by *$ilent*
> 
> See I cant be bothered risking getting micro stutter with 2 gpus; I know AMD keep releasing drivers but honestly AMD drivers are complete garbage and I wouldnt trust them to bring out a be all end of driver to solve all the CF problems.
> 
> Ill just wait and see how much a 290 is then, or 290x if i get lucky and its cheaper than I imagine it will be.


I'm expecting improvements with the removal of the CFX bridges and fingers... hardware-level revisions usually mean good R&D behind the design improvements and performance...


----------



## $ilent

I think NV drivers are fine, to be honest. The only issue I ever have with BF3, which is just about the only game I play, is when PunkBuster doesn't update or just messes up altogether.


----------



## Blackops_2

Personally, I've never had driver issues with either camp, though I've only owned two AMD cards. My third will probably be a 290/290X, unless the 780 price drops.


----------



## mtcn77

Currently, 7970 GHz CF is stomping GTX 780 SLI... and ironically, "driving 4K" is Nvidia's current slogan.
AMD is really amazing this time. I mean, come on, what is the ratio between those things, 1:2?
Source


----------



## TheOCNoob

Quote:


> Originally Posted by *jomama22*
> 
> lol, judges performance on speaking ability of engineers. That makes sense.


Rep'd.


----------



## Lennyx

Quote:


> Originally Posted by *$ilent*
> 
> I think NV drivers are fine to be honest, the only issues I ever have with BF3, which is just about the only game I play, is when punkbuster doesnt update or it just messes up altogether.


Well, I had a horrible couple of days finding a decent driver for my GTX 670. Older drivers worked well but did not support new games, and some new drivers gave me fps drops in all the games I tried.
In the end, this new beta driver seems to work for now.

Both AMD and Nvidia have problems with drivers.
It's your decision as a consumer which company you want to experience problems with, because it's 100% certain you will at some point.


----------



## mcg75

Quote:


> Originally Posted by *vish92*
> 
> Looks like AMD R9290X is approx 2x faster than titan
> 
> Look at this
> 
> Notice FC3 gets about 60FPS
> 
> 
> 
> Now look at Hexus 4K gaming review
> 
> http://hexus.net/tech/reviews/displays/57849-asus-pq321q-4k-gaming-tried-tested/?page=5
> 
> Here 7970 scores 27.8FPS while Titan scores 35.4FPS
> 
> AMD wont run FC3 at lower settings just for presentation decks.


Sweet, yet another fake home made graph.

"aliens vs predators?"
"bioshock infinte?"


----------



## wstanci3

Quote:


> Originally Posted by *mcg75*
> 
> Sweet, yet another fake home made graph.
> 
> "aliens vs predators?"
> "bioshock infinte?"











Just noticed that.
AMD, go home; you're drunk.


----------



## anticommon

That leaked presentation could very well be fake with just a ton of speculation. Who knows. We won't know until the 15th +/- a day or two.


----------



## Majin SSJ Eric

Chart seems suspect to me because there's no way any card runs BF3 and FC3 at the same FPS...


----------



## y2kcamaross

Quote:


> Originally Posted by *mtcn77*
> 
> Currently, 7970Ghz CF stomping GTX780SLI... and ironically it is Nvidia's current slogan, "driving 4K".AMD is really amazing, this time. I mean come on, what is the ratio between those things, 1:2?Sourcehttp://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/27912-4k-gaming-amd-gegen-nvidia-wer-ist-bereit-fuer-die-zukunft.html?start=3]Source[/URL[/URL]]


Wow, if you believe those numbers you are dense. 7970 Crossfire averaging 100+ fps in Bioshock Infinite and SLI 780s getting less than 50?


----------



## LaBestiaHumana

Quote:


> Originally Posted by *anticommon*
> 
> That leaked presentation could very well be fake with just a ton of speculation. Who knows. We won't know until the 15th +/- a day or two.


Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Chart seems suspect to me because there's no way any card runs BF3 and FC3 at the same FPS...


Here's the real graph!


----------



## GoldenTiger

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Here's the real graph!
> 
> 
> 
> 
> 
> 
> 
> 
> 


Hahahahahaa.


----------



## mtcn77

Quote:


> Originally Posted by *y2kcamaross*
> 
> wow if believe those numbers you are dense, 7970 crossfire averaging over 100+fps in Bioshock infinite and sli 780s getting less than 50?


Got more serious comments?


----------



## anticommon

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Here's the real graph!


Seems a bit low. I thought they said something about BF4 mantle 4k eyefinity 120hz. Yeah. That's the one.


----------



## szeged

One 780 Classified sold to buy a 290X when they arrive. Hurry up, AMD; my bench rig is looking lonely with no GPU in it.


----------



## anticommon

Quote:


> Originally Posted by *szeged*
> 
> one 780 classified sold to buy a 290x when they arrive, hurry up amd my bench rig is looking lonely with no gpu in it.


That was fast... didn't you start selling that card within the last day or so?


----------



## y2kcamaross

Quote:


> Originally Posted by *mtcn77*
> 
> Got more serious comments?


That was a serious comment; that entire "article" is a lie.


----------



## szeged

Quote:


> Originally Posted by *anticommon*
> 
> That was fast... didn't you start selling that card within the last day or so?


Yeah, I sold it for pretty cheap, lol, so people were all over it.

$600 for a 780 Classified ACX card with a Hydro Copper and EVGA backplate, pre-overclocked to 1400+ core and +800 on the memory at 1.35V, lol.

The guy got a good deal, imo. I'll be selling another one soon, depending on how well the 290Xs do.


----------



## mtcn77

Quote:


> Originally Posted by *y2kcamaross*
> 
> that was a serious comment, that entire "article"is a lie


Mkay, it is Nvidia's FCAT after all...


----------



## Majin SSJ Eric

Holy hell that's a heckuva deal that guy got!


----------



## jomama22

Quote:


> Originally Posted by *szeged*
> 
> yeah i sold it for pretty cheap lol so people were all over it,
> 
> $600 for a 780 classified ACX card with a hydro copper and evga backplate pre overclocked to 1400+ core and +800 on the memory with 1.35v lol
> 
> guy got a good deal imo. ill be selling another one soon depending on how well the 290x's do.


I am referring to the drastic drop in value of a new GPU as being "szeged" from now on.

"Dude, I totally just bought a Titan today!"
"You know the 290X is coming out this month, right?"
"Aww crap, I totally just got szeged."


----------



## y2kcamaross

Quote:


> Originally Posted by *mtcn77*
> 
> Mkay, it is Nvidia's FCAT after all...


Do you seriously think two 7970s can average over 100fps in Bioshock Infinite at 4K resolution, at settings where the 780s only average ~40? Here's a hint: they can't. I have two of them; I know what they can and can't do.


----------



## szeged

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Holy hell that's a heckuva deal that guy got!


yeah he was very happy to say the least









Quote:


> Originally Posted by *jomama22*
> 
> I am reffering to the drastic drop in value of a new GPU as "szeged" from now on.
> 
> "dude i totaly just bought a titan today!"
> "You know that the 290x is coming out this month right?"
> "aww crap, i totaly just got szeged"


haha

I'll probably never sell any of my Titans; I love them too much, and when they become too old to power new games, I'll make a glass wall mount for them to sit in, like I do with all my old cards.

I also sold the Classified for so cheap because I got one for pretty cheap, so it was really an even trade-off.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *anticommon*
> 
> Seems a bit low. I thought they said something about BF4 mantle 4k eyefinity 120hz. Yeah. That's the one.


Fixed. Aliens vs. Predator is just too demanding.


----------



## mtcn77

Quote:


> Originally Posted by *y2kcamaross*
> 
> do you seriously think 2 7970s can average over 100fps in Bioshock [email protected] 4k resolutions at settings where the 780s only average ~40? Here's a hint, they can't. I have 2 of them, I know what they can and can't do


And your argument?
I'll help;
-"Maybe he didn't activate sli".


----------



## szeged

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Fixed. Alien vs predatos is just to demanding.


rofl

I wanna get quad 290Xs to play Oregon Trail at 4K res. Hurry up, AMD.


----------



## jomama22

Quote:


> Originally Posted by *szeged*
> 
> rofl
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i wanna get quad 290x's to play oregon trail at 4k res, hurry up amd.


You have died of cholera, lost 3 wagon wheels, and dropped your grandfather clock. Next time, don't ford the river.


----------



## szeged

Quote:


> Originally Posted by *jomama22*
> 
> You have died of cholera, lost 3 wagon wheels, and dropped your grandfather clock. Next time, don't ford the river.










EVERY TIME! Never drink the water!


----------



## y2kcamaross

Quote:


> Originally Posted by *mtcn77*
> 
> And your argument?
> I'll help;
> -"Maybe he didn't activate sli".


No, my argument is that the entire article is a lie. I could barely average 100fps in Bioshock Infinite with my 7970s @ 1440p, yet they are averaging over 100fps with 2.25 times more pixels? Get real.
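For what it's worth, the pixel math behind that ratio:

```python
# Pixel counts for 4K UHD vs. 1440p ("Korean" QHD).
uhd = 3840 * 2160  # 8,294,400 pixels
qhd = 2560 * 1440  # 3,686,400 pixels

print(uhd / qhd)   # 2.25 -- the "more pixels" multiplier quoted above
```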


----------



## anticommon

Quote:


> Originally Posted by *szeged*
> 
> yeah i sold it for pretty cheap lol so people were all over it,
> 
> $600 for a 780 classified ACX card with a hydro copper and evga backplate pre overclocked to 1400+ core and +800 on the memory with 1.35v lol
> 
> guy got a good deal imo. ill be selling another one soon depending on how well the 290x's do.


Damn, lol, I would have been on that like seaweed on a sushi roll. Now I'll have to play the silicon lottery with the Classy coming in the mail...


----------



## anticommon

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Fixed. Alien vs predatos is just to demanding.


There we go. Now we just need to add Titans to the chart and show how they get ~10fps at 1280x768.


----------



## specopsFI

Quote:


> Originally Posted by *Ultracarpet*
> 
> Just curious as to some points brought up earlier about it having to be some sort of miracle for amd to beat the titan... as a rough estimate the titan is usually about ~30 percent faster than a 7970 right? so with 2x the rop's, 30% more shaders, 512 bit memory bus etc... how could it not compete? if the r9 290x has no core improvements over the 7970 at all, it technically speaking should trade blows with a titan no? So if you added any core improvements at all on-top of all these upgraded specs, which amd has had 2 years to do, it's just icing on top the cake when being compared to a titan. Like I saw alatar a while back saying gcn2.0 needs to be like 30 percent faster than 1.0 in order for the 290x to compete with the titan... I don't understand why that has to be. I would like someone to explain it to me.


Since no one else said it, I will.

Power is the thing. If AMD had 30% TDP headroom over the 7970GE then it wouldn't be any problem to hit Titan performance or beat it. The reality is though that 7970GE already has the same TDP as Titan. So they would need to get all that extra hardware to run with the same TDP without sacrificing clock speed.

Some people will counter that by saying "but look at the 7790, it brought 30% more fps with just 5% more power consumption". Yes, it did. And if you've read any R7 260X reviews, you'd know what happened when that 7790 needed to be pushed for any higher clock speeds: the power consumption went up significantly. "...overall performance per watt has actually dropped a lot. The HD 7790 delivered leading efficiency, but the R7 260X is now below the average when compared to the whole market. The good HD 7790 numbers in mind, I was a bit disappointed by the R7 260X's gaming power consumption." (source). The same thing happened with 7970GE: the original 7970 was actually pretty good for performance/W, 7970GE much less so.

TDP limit is the reason why AMD has made Hawaii so massive. They couldn't go for higher clocks because GCN goes mental with TDP. They needed to _lower_ their clocks and the only way to get more performance with lower clocks is with a big chip running low voltage. The rumoured turbo BIOS thing seems like an indication that they've still had to push the clocks higher than they would have liked to. There most likely is very little extra to be had over those turbo clocks (last time they did this was 6990 which was tapped out). Another thing that might be a limiting factor is that Hawaii is extremely densely packed: the transistors/area is very high. Heat transfer gets more difficult when the heat source is more dense.

If I'm sounding too critical, then that's just the hype getting out of control. There are limiting factors for the Hawaii as well although it seems that some folks are taking any news as indication of world domination. I was sceptical that AMD would aim for Titan performance but the hardware seems to be aimed at just that. As of now, I'm thinking 290X will practically match the Titan at stock and with OC on stock BIOS since Titan is so strictly restricted, but Titan will have more headroom when those limits are removed. All in all, I'm really excited to see AMD doing a real monster chip!
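To put that in numbers, here's a rough sketch of the argument (the figures are purely illustrative assumptions, not measured specs):

```python
# Back-of-envelope check: more performance inside the SAME power envelope
# requires a proportional perf/W improvement.
tahiti_perf = 1.00   # normalized 7970GE performance (assumed baseline)
tahiti_tdp = 250     # watts, roughly the 7970GE/Titan class (assumption)
target_perf = 1.30   # Titan-class target, ~30% faster (assumption)
target_tdp = 250     # same TDP as the 7970GE

required_perf_per_watt = (target_perf / target_tdp) / (tahiti_perf / tahiti_tdp)
print(round(required_perf_per_watt, 2))  # → 1.3, i.e. ~30% better perf/W needed
```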


----------



## Ultracarpet

Quote:


> Originally Posted by *specopsFI*
> 
> Since no one else said it, I will.
> 
> Power is the thing. If AMD had 30% TDP headroom over the 7970GE then it wouldn't be any problem to hit Titan performance or beat it. The reality is though that 7970GE already has the same TDP as Titan. So they would need to get all that extra hardware to run with the same TDP without sacrificing clock speed.
> 
> Some people will counter that by saying "but look at the 7790, it brought 30% more fps with just 5% more power consumption". Yes, it did. And if you've read any R7 260X reviews, you'd know what happened when that 7790 needed to be pushed for any higher clock speeds: the power consumption went up significantly. "...overall performance per watt has actually dropped a lot. The HD 7790 delivered leading efficiency, but the R7 260X is now below the average when compared to the whole market. The good HD 7790 numbers in mind, I was a bit disappointed by the R7 260X's gaming power consumption." (source). The same thing happened with 7970GE: the original 7970 was actually pretty good for performance/W, 7970GE much less so.
> 
> TDP limit is the reason why AMD has made Hawaii so massive. They couldn't go for higher clocks because GCN goes mental with TDP. They needed to _lower_ their clocks and the only way to get more performance with lower clocks is with a big chip running low voltage. The rumoured turbo BIOS thing seems like an indication that they've still had to push the clocks higher than they would have liked to. There most likely is very little extra to be had over those turbo clocks (last time they did this was 6990 which was tapped out). Another thing that might be a limiting factor is that Hawaii is extremely densely packed: the transistors/area is very high. Heat transfer gets more difficult when the heat source is more dense.
> 
> If I'm sounding too critical, then that's just the hype getting out of control. There are limiting factors for the Hawaii as well although it seems that some folks are taking any news as indication of world domination. I was sceptical that AMD would aim for Titan performance but the hardware seems to be aimed at just that. As of now, I'm thinking 290X will practically match the Titan at stock and with OC on stock BIOS since Titan is so strictly restricted, but Titan will have more headroom when those limits are removed. All in all, I'm really excited to see AMD doing a real monster chip!












So the perf/watt would need to be ~30% better if they just straight up scaled a 7970... ooooook, that makes more sense.


----------



## Newbie2009

Do we know when the reviews are out?


----------



## hrockh

Quote:


> Originally Posted by *Newbie2009*
> 
> Do we know when the reviews are out?


rumour has it that the NDA lifts on October 15th.


----------



## maarten12100

Quote:


> Originally Posted by *lacrossewacker*
> 
> Oddly enough you actually need more AA with 1600p than 1440p
> 
> 
> 
> 
> 
> 
> 
> 
> 
> A 27in 1440p monitor has a higher ppi than a 30in 1600p....


PPI isn't that important; it's the pixel count that matters.
A higher PPI will only make the artifacts smaller.
Quote:


> Originally Posted by *$ilent*
> 
> See I cant be bothered risking getting micro stutter with 2 gpus; I know AMD keep releasing drivers but honestly AMD drivers are complete garbage and I wouldnt trust them to bring out a be all end of driver to solve all the CF problems.
> 
> Ill just wait and see how much a 290 is then, or 290x if i get lucky and its cheaper than I imagine it will be.


They'll have hardware-based frame metering.
And really, besides the horrible layout of the control panel, AMD's drivers are rather good.


----------



## Lumo841

Hey look another Red vs Green circlejerk... never saw that coming.


----------



## Chickenman

Quote:


> Originally Posted by *Lumo841*
> 
> Hey look another Red vs Green circlejerk... never saw that coming.


Are you new around here?

Lol, I'm joking









Kinda looking forward to this card, even if it just draws out a new Nvidia card. I wont be buying either but we need some real tit for tat and oneupmanship.


----------



## y2kcamaross

Quote:


> Originally Posted by *mtcn77*
> 
> > Originally Posted by *y2kcamaross*
> > 
> > No, my argument is that the entire article is a lie. I could barely average 100 fps in Bioshock Infinite with my 7970s @1440p, yet they're averaging over 100fps with 2.3 times more pixels? Get real.
> 
> I love your humour crossfiring with that cpu.


um no, the 7970s were in my main rig until I got my 780s, but good try


----------



## mtcn77

Quote:


> Originally Posted by *y2kcamaross*
> 
> um no, the 7970s were in my main rig until I got my 780s, but good try


You never proved your results were any better than the tests, either. Relatively better than 7970's, shall we say?
I would recommend you check out HPET and Timer resolution utilities before any concrete remark, though.


----------



## y2kcamaross

Quote:


> Originally Posted by *mtcn77*
> 
> You never proved your results were any better than the tests, either. Relatively better than 7970's, shall we say?
> I would recommend you check out HPET and Timer resolution utilities before any concrete remark, though.


??
I don't have to prove my tests. Anyone in here who has both 780s and 7970s, or at least HAD both at one point, can tell you right now that the 780s absolutely crush the 7970s at any resolution. Common sense would tell you that 2 7970s aren't pulling those kinds of framerates, considering they can barely break 45fps @1600p in Crysis 3 with all the bells and whistles. Do you REALLY think they are averaging 50+ fps @ 3840x2160? Just use your head.


----------



## mtcn77

Quote:


> Originally Posted by *y2kcamaross*
> 
> ??
> I don't have to prove my tests, anyone in here who has both 780s and 7970s, or at least HAD both at one point, can tell you right now that the 780s absolutely crush the 7970s at any resolution. Common sense would tell you that 2 7970s aren't pulling those kind of framerates, considering they can barely break 45fps @1600p in crysis 3 with all the bells and whistles, do you REALLY think they are averaging over 50+ @ 3840x2160? Just use your head.


"I have used it." -Fixer.
The tests really demolish 780 SLI; I mean, it could be triple CrossFire, although they clearly state there were only two cards.
When did you last check 7970s?


----------



## mltms

damn, this Tahiti is a monster when you OC



Spoiler: Warning: Spoiler!















http://pctuning.tyden.cz/hardware/graficke-karty/28069-asus-r9-280x-dc2-top-staronovy-radeon-v-akci?start=9


----------



## y2kcamaross

Quote:


> Originally Posted by *mtcn77*
> 
> "I have used it." -Fixer.
> The tests really demolish 780SLI, I mean it could be triple crossfire; although they clearly state there were only two cards.
> When was the last time you last checked 7970's?











So let me get this straight: you actually believe that 2 7970s are pulling a 120+fps average in Bioshock Infinite @ a 4k resolution? And over 50fps average in Crysis 3 at the same settings where 2 780s in SLI only pull roughly 20? Is that what you actually believe?


----------



## maarten12100

Quote:


> Originally Posted by *y2kcamaross*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> so let me get this straight, you actually believe that 2 7970s are pulling 120+fps average in Bioshock Infinite @ a 4k resoultion? And over 50fps average in Crysis 3 at the same settings that 2 780s in sli only pull roughly 20, is that what you actually believe?


Without AA on high settings you can easily pull 120+ at 4K UHD, but of course the 780 can do a tad more.


----------



## Devildog83

Well yeah, the GTX 780 costs as much as the 7990, which crushes it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *mltms*
> 
> dam this tahiti is a monster when you OC
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://pctuning.tyden.cz/hardware/graficke-karty/28069-asus-r9-280x-dc2-top-staronovy-radeon-v-akci?start=9


It's always been like that. People just chose to ignore it. A card that costs more than 2x the amount, came out 1.5 years later, and is only faster by a very small percentage.


----------



## Stay Puft

Quote:


> Originally Posted by *Devildog83*
> 
> Well yeah, the GTX 780 costs as much as the 7990 which crush's it.


The 7990 was garbage till the frame pacing driver was released. Would you buy a 7990 over a 780? Because I sure wouldn't.


----------



## SpacemanSpliff

Quote:


> Originally Posted by *y2kcamaross*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> so let me get this straight, you actually believe that 2 7970s are pulling 120+fps average in Bioshock Infinite @ a 4k resoultion? And over 50fps average in Crysis 3 at the same settings that 2 780s in sli only pull roughly 20, is that what you actually believe?


Now I know there's a huge difference between ultra settings at 1080 and at 4K... but using the Bioshock benching utility... here's what my 1GB Vapor-X 5870 can do at max settings (DX11 with DDOF, 1080p, ultra settings, vsync on)...

Per Scene Stats:

32.72 sec avg 43.4 fps Min 25.08 fps max 88.77 fps Welcome Center
22 sec avg 45.79 fps min 25.06 fps max 71.78 fps Town Center
8.15 sec avg 42.7 fps min 11.23 fps max 62.47 fps Raffle
9.15 sec avg 61.89 fps min 23.18 fps max 85.59 fps Monument Island
81.7 sec avg 47.06 fps min 7.94 fps max 88.77 fps Overall (includes scene changes, hence the 7.94 minimum and the extra almost 10 seconds of total time)

Now I know that's not during actual gameplay, but still, for maxed out settings with a 5870, I figure I get 40-45 fps average in actual gameplay... with a single 1GB, 256 bit card at stock clocks. This isn't that GPU demanding of a game as you think it is... not if I can get such good performance with an almost 3 year old card... with the extra ram, bandwidth, and faster clocks, Yeah, I think crossfire 7970s should be easily capable of 100+ fps, even at 4K.


----------



## $ilent

Quote:


> Originally Posted by *mltms*
> 
> dam this tahiti is a monster when you OC


Damn, that 280X is getting almost 60 fps with 4x AA in BF3 at 1600p...

I am so tempted right now, but I'm going to wait for 290 prices


----------



## Devildog83

Quote:


> Originally Posted by *Stay Puft*
> 
> 7990 was garbage till the frame pacing driver was released. Would you buy a 7990 over a 780 because i sure wouldnt


Yes I would. I would get the Devil 13 or most 7990s over the 780 Lightning any day. It's just my choice, not a knock on Nvidia. The GTX 790 might be a different thing. Just on price to performance, the 7990 wins IMO.


----------



## y2kcamaross

Quote:


> Originally Posted by *SpacemanSpliff*
> 
> Now I know there's a huge difference between ultra settings at 1080 and at 4K... but using the Bioshock benching utility... here's what my 1GB Vapor-X 5870 can do at max settings (DX11 with DDOF, 1080p, ultra settings, vsync on)...
> 
> Per Scene Stats:
> 
> 32.72 sec avg 43.4 fps Min 25.08 fps max 88.77 fps Welcome Center
> 22 sec avg 45.79 fps min 25.06 fps max 71.78 fps Town Center
> 8.15 sec avg 42.7 fps min 11.23 fps max 62.47 fps Raffle
> 9.15 sec avg 61.89 fps min 23.18 fps max 85.59 fps Monument Island
> 81.7 sec avg 47.06 fps min 7.94 fps max 88.77 fps Overall (includes scene changes, hence the 7.94 minimum and the extra almost 10 seconds of total time)
> 
> Now I know that's not during actual gameplay, but still, for maxed out settings with a 5870, I figure I get 40-45 fps average in actual gameplay... with a single 1GB, 256 bit card at stock clocks. This isn't that GPU demanding of a game as you think it is... not if I can get such good performance with an almost 3 year old card... with the extra ram, bandwidth, and faster clocks, Yeah, I think crossfire 7970s should be easily capable of 100+ fps, even at 4K.


The settings weren't shown, but at the same settings used, 780s in SLI got roughly 50 FPS, so they have to be some pretty intense settings. All I'm trying to say is the tests are completely flawed, because that's just not possible.


----------



## Baghi

I don't understand why everyone is going goo-goo ga-ga over the R9 280X's overclocked performance; an overclocked GTX 770 already gives a hard time to even a Titan, let alone the GTX 780!


The thing was overclocked to 1220MHz on the core, and that's not even a monstrous overclock for said card; ~1200MHz is the average OC seen on several samples Hexus tested (like the WF 3X, AMP! and LTD OC by KFA2).


----------



## Stay Puft

Quote:


> Originally Posted by *Baghi*
> 
> I don't understand why everyone is going goo-goo ga-ga over R9-280X's overclocked performance, GTX 770 already gives a hard time to even a Titan when overclocked let alone GTX 780!
> 
> 
> The thing was overclocked to 1220MHz on the core and it's not even a monstrous overclock on said card, ~1200C is seen average OC on several samples Hexus tested (like WF 3X, AMP! and LTD OC by KFA2).


The 770 is in no man's land at $399. It needs a major price drop.


----------



## fleetfeather

Less OC vs Stock comparisons pls... I suspect no 780 owners on OCN are running their 780 @ stock


----------



## PureBlackFire

Quote:


> Originally Posted by *fleetfeather*
> 
> Less OC vs Stock comparisons pls... I suspect no 780 owners on OCN are running their 780 @ stock


stock? not in here lol.


----------



## szeged

Quote:


> Originally Posted by *PureBlackFire*
> 
> stock? not in here lol.


comparing vs stock titans is apparently the in thing now lol

"omg 780s beat titans when super overclocked, omg 7970s match titans with super overclocked, omg etc etc"

"titan overclocks also"

" RAWR BURN THE HERETIC, TITAN CANT BE OVERCLOCKED!"


----------



## fleetfeather

Quote:


> Originally Posted by *szeged*
> 
> comparing vs stock titans is apparently the in thing now lol
> 
> "omg 780s beat titans when super overclocked, omg 7970s match titans with super overclocked, omg etc etc"
> 
> "titan overclocks also"
> 
> " RAWR BURN THE HERETIC, TITAN CANT BE OVERCLOCKED!"


yeah but dat stock titan bench doe


----------



## Stay Puft

Quote:


> Originally Posted by *szeged*
> 
> comparing vs stock titans is apparently the in thing now lol
> 
> "omg 780s beat titans when super overclocked, omg 7970s match titans with super overclocked, omg etc etc"
> 
> "titan overclocks also"
> 
> " RAWR BURN THE HERETIC, TITAN CANT BE OVERCLOCKED!"


1300 core 290X Vs 800 core Titan

290X OWNZZZZZZZZ Titan!!!!!!!!!!!!!!!!


----------



## fateswarm

NVIDIA is so ridiculed. They had a better card many months ago. But now an unconfirmed set of specs and an unreliable set of benchmarks, likely manufactured for marketing, say they may be beaten, for a month, until 20nm is on.


----------



## szeged

Quote:


> Originally Posted by *Stay Puft*
> 
> 1300 core 290X Vs 800 core Titan
> 
> 290X OWNZZZZZZZZ Titan!!!!!!!!!!!!!!!!


im not going to be surprised when this happens lol


----------



## ZealotKi11er

Quote:


> Originally Posted by *Stay Puft*
> 
> 1300 core 290X Vs 800 core Titan
> 
> 290X OWNZZZZZZZZ Titan!!!!!!!!!!!!!!!!


Actually, Titans run at ~1000MHz.


----------



## szeged

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Actually Titan run ~ 1000MHz.


we know, but the point is people try to say "card x beats card y when card x is super heavy overclocked and card y is left at stock, therefore card x is better."


----------



## fateswarm

All those problems would be eliminated with my idea: graphics motherboards. That way you pick your own electronics for overclocking, your VRAM, your everything.


----------



## raghu78

Quote:


> Originally Posted by *fateswarm*
> 
> NVIDIA is so ridiculed. They had a better card many months ago. But now an unconfirmed set of specs and an unreliable set of benchmarks likely manufactured for marketing say they may be beaten *,for a month, until 20nm is on*.


1 month, oh you mean 1 year







20nm is a long way off. 9 months best case, 12 months worst case. You haven't been paying attention to TSMC's Q1 and Q2 earnings calls. TSMC is struggling to be consistent with their statements about the 20nm volume production start and 20nm wafer volume in 2014. They are trying hard to put on a show that 20nm is ready for H1 2014, but the reality is it's H2 2014, and with very low volume till Q4 2014.


----------



## y2kcamaross

Quote:


> Originally Posted by *raghu78*
> 
> 1 month, oh you mean 1 year
> 
> 
> 
> 
> 
> 
> 
> 20nm is a long way off. 9 months best case. 12 months worst case. you haven't been paying attention to TSMC earnings calls Q1 and Q2. TSMC is struggling to be consistent with their statements about 20nm volume production start and 20nm wafer volume in 2014 in Q1 and Q2 calls. they are trying hard to put on a show that 20nm is ready for H1 2014. but the reality is its H2 2014 and with very low volume till Q4 2014.


Odd how AMD shills were saying, before AMD officially announced that the 290X would be on a 28nm process, that there was a small hope that it COULD be on 20nm! Now that they've confirmed it's not, we won't see any 20nm for a year! Hilarious.


----------



## Baghi

Quote:


> Originally Posted by *szeged*
> 
> we know, but the point is people try to say "card x beats card y when card x is super heavy overclocked and card y is left at stock, therefore card x is better."


LOL, Mad Catz! Point being, there's no need to go nuts over the R9 280X's overclockability when the almost half-year-old GTX 770 overclocks ridiculously well.


----------



## fleetfeather

Quote:


> Originally Posted by *y2kcamaross*
> 
> Odd how AMD shills were saying, before AMD officially announced that the 290x would be on a 28nm process, that there was small hope that it COULD be on 20nm! Now that they confirm it's not, we wont see any 20nm for a year! Hilarious.


Those who claimed it could be on 20nm are not the same people who are saying 20nm won't be available for a while, afaik


----------



## Usario

Quote:


> Originally Posted by *y2kcamaross*
> 
> Odd how AMD shills were saying, before AMD officially announced that the 290x would be on a 28nm process, that there was small hope that it COULD be on 20nm! Now that they confirm it's not, we wont see any 20nm for a year! Hilarious.


Because everyone who likes AMD shares all the same thoughts...


----------



## Stay Puft

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Actually Titan run ~ 1000MHz.


We know Zeal. I was just making a joke


----------



## Jack Mac

Quote:


> Originally Posted by *fateswarm*
> 
> NVIDIA is so ridiculed.


You're joking, right?


----------



## szeged

Quote:


> Originally Posted by *Jack Mac*
> 
> You're joking, right?


he is, the post pretty much spelled the word sarcasm lol


----------



## Kinaesthetic

Quote:


> Originally Posted by *szeged*
> 
> he is, the post pretty much spelled the word sarcasm lol


Well, knowing that it was Fateswarm that posted that.....I wouldn't be so sure that it is sarcasm.....


----------



## criminal

Quote:


> Originally Posted by *szeged*
> 
> he is, the post pretty much spelled the word sarcasm lol


I don't know why, but this is the mental image I get now when I hear ridicule.


----------



## Stay Puft

Quote:


> Originally Posted by *criminal*
> 
> I don't know why, but this is the mental image I get now when I here ridicule.


Always been more of a "Ken" fan


----------



## Tobiman

Quote:


> Originally Posted by *Stay Puft*
> 
> 7990 was garbage till the frame pacing driver was released. Would you buy a 7990 over a 780 because i sure wouldnt


I find it funny that those who don't own AMD cards have the most problems with 'em, but hey, this is the internet after all.


----------



## BradleyW

Quote:


> Originally Posted by *Tobiman*
> 
> I find it funny that those who don't own AMD cards have the most problem with em but, hey, this is the internet after all.


----------



## TamaDrumz76

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127758

What a bargain!!11!


----------



## kpoeticg

LMAO. Nice Find!!


----------



## PureBlackFire

Quote:


> Originally Posted by *TamaDrumz76*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127758
> 
> What a bargain!!11!





Spoiler: This card's price


----------



## Sheyster

Quote:


> Originally Posted by *Tobiman*
> 
> I find it funny that those who don't own AMD cards have the most problem with em but, hey, this is the internet after all.


As a former owner of an AMD 5850, 5970 and 6970, I don't have a problem with AMD. However, I believe that nVidia has better graphics cards and drivers overall. If the R9 290x ends up being a good card, and overclocks well, I'll buy one.


----------



## Stay Puft

Quote:


> Originally Posted by *Tobiman*
> 
> I find it funny that those who don't own AMD cards have the most problem with em but, hey, this is the internet after all.


Since when don't I own AMD cards?


----------



## xoleras

Quote:


> Originally Posted by *fleetfeather*
> 
> Those who claimed it could be on 20nm are not the same people who are saying 20nm wont be available for a while still, afaik


Most folks in the know would state that TSMC's 20nm won't be ready until 2H 2014 if they're lucky. Anyone expecting TSMC's 20nm prior to July 2014 is probably delusional. On top of this, wafer costs are extremely expensive on 20nm; so much so that most companies may not be able to pay the asking price. Anyway, TSMC is in the risk production stage right now with 20nm. Obviously some here have no idea what risk production is.

The abridged version is that "risk production" means it is NOWHERE near ready. Just to put this into perspective, TSMC's 28nm was in "risk production" in late 2009 and 2010. When did TSMC's 28nm ship? Q4 2011. El oh el. If anyone thinks 20nm is imminent, yeah, good luck with that. The only fans that ever suggested the 290X had a possibility of being 20nm were clueless. That isn't to say that 28nm is bad; it just means that the rate of transistor increases will basically halt. Performance increases are still possible with 28nm, see Apple's A7 for proof of this; the main problem is that transistor density cannot increase. Therefore there are definite diminishing returns.

Anyway, current rumors pin NVIDIA's Maxwell as also being 28nm, with a likely "refresh" on 20nm when it's ready.


----------



## maarten12100

Quote:


> Originally Posted by *PureBlackFire*
> 
> 
> 
> Spoiler: This card's price


beat me to it









most ridiculous placeholder (Titan's price really makes this placeholder look ridiculous)


----------



## Usario

Quote:


> Originally Posted by *TamaDrumz76*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127758
> 
> What a bargain!!11!


I'm buying on release day


----------



## criminal

Quote:


> Originally Posted by *Usario*
> 
> I'm buying on release day


You are crazy! That is just the 290. The 290X couldn't be much more than that. It should totally be worth the extra cost.


----------



## TamaDrumz76

Quote:


> Originally Posted by *criminal*
> 
> You are crazy! That is just the 290. The 290X couldn't be much more than that. It should totally be worth the extra cost.


If the 290X is not at least the price of a brand new car, it can't be worth a damn!


----------



## maarten12100

Quote:


> Originally Posted by *TamaDrumz76*
> 
> If the 290X is not at least the price of a brand new *sports* car, it can't be worth a damn!


fixed it for you


----------



## nvidiaftw12

Quote:


> Originally Posted by *Devildog83*
> 
> Yes I would. I would get the Devil 13 or most 7990's over the 780 lightning any day. It's just my choice, not a knock on Nvidia. The GTX 790 might be a different thing. Just price to performance 7990 wins IMO.


I wouldn't. 7990's have terrible coil whine.

Quote:


> Originally Posted by *Tobiman*
> 
> I find it funny that those who don't own AMD cards have the most problem with em but, hey, this is the internet after all.


Not always.


----------



## infranoia

Yup, I thought we already saw the X was $55 more. So, $10,054.99.

http://www.youtube.com/watch?v=Pa9AI8FdMdk#t=20s


----------



## Devildog83

Quote:


> Originally Posted by *nvidiaftw12*
> 
> I wouldn't. 7990's have terrible coil whine.
> 
> Not always.


Not all 7990s are the same. Because some people have coil whine doesn't mean they all do. If you're telling me you would rather have less performance for the same price, I would say you have to be crazy, because the 780 can't even come close to the 7990. The 790 Titan is close to equal, but it's still $1000. I am not a fanboy either way, I just have common sense.

Edit: Since you have Nvidia in your name I will assume you must be a fanboy.


----------



## nvidiaftw12

Quote:


> Originally Posted by *Devildog83*
> 
> Not all 7990's are the same. Because some people have coil whine doesn't mean they all do. If you are telling me you would rather have less performance for the same price I would say you have to be crazy because the 780 can't even come close to the 7990. The 790 Titan is close to equal but it's still $1000. I am not a fanboy either way I just ave common sense.


It appears to be in every Powercolor 7990.


----------



## Devildog83

Quote:


> Originally Posted by *nvidiaftw12*
> 
> It appears to be in every Powercolor 7990.


I am quite sure that's what you believe. I couldn't hear it very well anyhow because I have tinnitus.


----------



## Booty Warrior

Quote:


> Originally Posted by *Devildog83*
> 
> I am quite sure that's what you believe. I couldn't here it very well anyhow because I have Tinnitus.


Haha, that's the first time I've heard a plus side to having tin. Coil whine? What coil whine?


----------



## nvidiaftw12

These guys can hear it.

http://www.overclock.net/t/1419187/question-for-the-7990-owners


----------



## Devildog83

Quote:


> Originally Posted by *Booty Warrior*
> 
> Haha, that's the first time I've heard a plus side to having tin. Coil whine? What coil whine?


LOL


----------



## nvidiaftw12

Listen to it long enough and you will have tinnitus. xD


----------



## Devildog83

Quote:


> Originally Posted by *nvidiaftw12*
> 
> These guys can hear it.
> 
> http://www.overclock.net/t/1419187/question-for-the-7990-owners


I read that; all you have to do is use Afterburner, which is what I would use anyway. I use it now on my Devil because the PowerColor tool is trash. All in all I would still get it over a 780 just for the performance boost. 2 7970s @ about $700 beat both; the 7990 Devil 13, which I could buy right now for $800 and which is on par with 2 7970s, would be my 1st choice just because it looks awesome. If I were to compare it to the Titan or 790 then it would be another story. I like that card too, but I would balk at $1000; maybe if you could find an open box or something for $800, I could see it.


----------



## Moragg

Quote:


> Originally Posted by *nvidiaftw12*
> 
> Listen to it long enough and you _will_ have tinnitus. xD


All I can say is my 7870XT gets coil whine when OCed, but I have my headphones in so can't hear it anyways







though if you can't block it out it is very, very annoying.


----------



## Devildog83

Quote:


> Originally Posted by *Moragg*
> 
> All I can say is my 7870XT gets coil whine when OCed, but I have my headphones in so can't hear it anyways
> 
> 
> 
> 
> 
> 
> 
> though if you can't block it out it is very, very annoying.


Who makes the XT? I have none on mine at all. Very quiet unless I manually turn the fans to 80%+.


----------



## nvidiaftw12

Quote:


> Originally Posted by *Devildog83*
> 
> I read that, all you have to do is use afterburner which is what I would use anyway. I use it now on my Devil because the Powercolor one is trash. All in all I would still get it over a 780 just for a performance boost. 2 7970's @ about $700 beats both except for the 7990 Devil 13, which I could buy right now for $800 and it's on par with 2 7970's, would be my 1st choice just because it looks awesome. If I were to compare it to the Titan 790 then it would be another story. I like that card too but I would balk at $1000, maybe you could find an open box or something for $800 and I could see it.


The Afterburner fix didn't work on my second card; I didn't read that thread until after I RMA'd the first. I paid $680 for my Devil; I could have gotten it for $50 less if I'd known you could file a rebate while the card was on RMA.


----------



## grunion

Coil whine now?

I have 14 AMD cards and not a one coil whines.
I have 7 NV cards and 3 of those whine, my 780 is the worst, it also makes my PSU run hotter.


----------



## DzillaXx

Quote:


> Originally Posted by *Moragg*
> 
> All I can say is my 7870XT gets coil whine when OCed, but I have my headphones in so can't hear it anyways
> 
> 
> 
> 
> 
> 
> 
> though if you can't block it out it is very, very annoying.


Anyone can get it.

My gtx470 has it and so does my current 7950.
Not while gaming though. The GTX 470 would whine while folding, and my 7950 whines at the exit screen of Heaven and a few other things, but never while gaming.

I know someone with a brand new gtx760 that sometimes whines during gameplay, luck of the draw I guess.

Though the reference HD 7000 series seems to get it a lot more, it's rarely bad enough to affect users during normal gameplay. I've seen more NVIDIA cards whine during gameplay than AMD ones.


----------



## Majin SSJ Eric

My Titans whine during the credit page of the Valley bench...


----------



## Devildog83

Quote:


> Originally Posted by *nvidiaftw12*
> 
> Afterburner fix didn't work on my second card, I didn't read that thread until after I rma'd the first. I paid $680 for my Devil, could have gotten it at $50 bucks less if know you could file a rebate while the card was on rma.


$680 is a good deal for that card. I wish I could afford it now, but for now I will just make do with the 7870. I am sure when I can afford an upgrade there will be a lot more to choose from.


----------



## mtcn77

Quote:


> Originally Posted by *y2kcamaross*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> so let me get this straight, you actually believe that 2 7970s are pulling 120+ fps average in Bioshock Infinite @ a 4K resolution? And over 50 fps average in Crysis 3 at the same settings where 2 780s in SLI only pull roughly 20, is that what you actually believe?


That's it, use your knowledge; get statistical.








I just don't count some hardware performing a feat as an achievement of mine, since "it isn't yours if you cannot fix it."
That reminds me of a chess strategic principle I learned from Chessmaster GM: "you cannot defend two weaknesses."
- How do you want to approach this incident? Either the price is out of line, or the performance is abject, which I would believe to be totally false if they hadn't posted the review with an FCAT reference, which makes it ironic.

In all solemnity, I really do believe people have to up their game. Especially the review editors, who have to drop the vendor PR jargon.
How can you state that some hardware (say, 2GB) is superior to 3GB hardware, when that comparison becomes absolutely false as soon as new software makes use of more hardware resources? The same thing happened when I bought a 6870: it plainly lacks compute units compared to a 5870, yet they stated it was comparable.
I think this is unacceptable. Stating GTX 770 > GTX 780 is plain false. I'm talking about you, ThinkComputer.
The same applies at the other end of the spectrum: you need the GPU only to compute (provide torque). If you have more memory access, you have more headroom for resolution, and I do believe AMD (ATi in this instance) can pull together a GPU that keeps pace with that card's specified GDDR5 memory, GDDR5 being a standard AMD finalized.


----------



## GoldenTiger

Quote:


> Originally Posted by *Devildog83*
> 
> *you have to be crazy because the 780 can't even come close to the 7990.*
> 
> Edit: Since you have Nvidia in your name I will assume you must be a Fanboy.


Oh really?











It can in certain games & settings







but as a general across-the-board thing, no, it isn't as fast by a fair margin.


----------



## grunion

Quote:


> Originally Posted by *y2kcamaross*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> so let me get this straight, you actually believe that 2 7970s are pulling 120+ fps average in Bioshock Infinite @ a 4K resolution? And over 50 fps average in Crysis 3 at the same settings where 2 780s in SLI only pull roughly 20, is that what you actually believe?


IDK, might not be too far-fetched.
A single 7970 pummels a 680 in Bioshock Infinite at 1200p.


----------



## GoldenTiger

Grrr, won't let me edit for some reason. Also was gonna post this chart:


----------



## Booty Warrior

Quote:


> Originally Posted by *GoldenTiger*
> 
> Grrr, won't let me edit for some reason. Also was gonna post this chart:
> 
> 
> Spoiler: Warning: Spoiler!


Never settle!


----------



## Devildog83




----------



## Brutuz

Quote:


> Originally Posted by *Stay Puft*
> 
> 7990 was garbage till the frame pacing driver was released. Would you buy a 7990 over a 780 because i sure wouldnt


I would, because I've run CFX previously without issue, and as I (and others) have said plenty of times, the issue isn't clearly noticeable for everyone.
Quote:


> Originally Posted by *Baghi*
> 
> I don't understand why everyone is going goo-goo ga-ga over R9-280X's overclocked performance, GTX 770 already gives a hard time to even a Titan when overclocked let alone GTX 780!
> 
> 
> The thing was overclocked to 1220MHz on the core, and that's not even a monstrous overclock on said card; ~1200MHz is the average OC seen on several samples Hexus tested (like the WF 3X, AMP! and LTD OC by KFA2).


This is staying fairly close to an overclocked 780; a stock one or an OCed 770 would be beaten here.
Quote:


> Originally Posted by *grunion*
> 
> Coil whine now?
> 
> I have 14 AMD cards and not a one coil whines.
> I have 7 NV cards and 3 of those whine, my 780 is the worst, it also makes my PSU run hotter.


My HD 7950 whines when I'm running a decent OC (at 1200/1475 atm, gonna start playing with memory voltage next I think), but at the same time my 470 whines too, and my 9800GT did. Yet another thing that's practically equal, yet people hold nVidia in higher regard...


----------



## GoldenTiger

Quote:


> Originally Posted by *Devildog83*
> 
> [IMG]http://www.overclock.net/content/type/61/id/1695543/width/500/height/1000[/IMG]


Think you missed the point of my post. I said as a general thing it's not the case, but when you OC them to the hills they can match in *some* (not all) games. 7990 has very little to none on the OC-room front, as a side note. I wasn't saying an oc'd 780 beats/matches a 7990 in most cases, just that it can in a few games. Not sure what your post of a random stock-clock chart was for when I am talking oc'd and in only certain cases.


----------



## DzillaXx

Quote:


> Originally Posted by *GoldenTiger*
> 
> Think you missed the point of my post. I said as a general thing it's not the case, but when you OC them to the hills they can match in *some* (not all) games. 7990 has very little to none on the OC-room front, as a side note. I wasn't saying an oc'd 780 beats/matches a 7990 in most cases, just that it can in a few games. Not sure what your post of a random stock-clock chart was for when I am talking oc'd and in only certain cases.


Personally I would only get a 7990 over a 780 if I was never going to SLI that GTX 780.

Though for someone getting a 7990 who wants to OC, a waterblock is key. You still need an OCed GTX 780 to, at best, match a 7990. And while you can still get a slight OC out of a 7990 before temps get out of control, tossing a waterblock on would give you much better max OC performance than getting a GTX 780 and tossing a block on that.

A 7990 + waterblock is still cheaper than a Titan and would run circles around it.

CF with GCN is pretty good ATM


----------



## Devildog83

Quote:


> Originally Posted by *DzillaXx*
> 
> Personally I would only get a 7990 over a 780 if I was never going to SLI that GTX 780.
> 
> Though for someone getting a 7990 who wants to OC, a waterblock is key. You still need an OCed GTX 780 to, at best, match a 7990. And while you can still get a slight OC out of a 7990 before temps get out of control, tossing a waterblock on would give you much better max OC performance than getting a GTX 780 and tossing a block on that.
> 
> A 7990 + waterblock is still cheaper than a Titan and would run circles around it.
> 
> CF with GCN is pretty good ATM


Thank you, that was my point. For the same price you get either a 7990 or a 780 Lightning; which gives you better performance? It's obvious, so I don't understand what the deal is. I wasn't only responding to GoldenTiger, sorry about that, but we were having a small debate over this, and I said I would buy a 7990 as opposed to a 780 because of price for performance. All of the reasons I was given why somebody would get the 780 had nothing to do with performance. For $150 or so more I would get a Devil 13, because with Afterburner it overclocks like a champ and would hang with or beat a Titan. It also doesn't run nearly as hot as most other 7990's, so W/C might not be needed. These are just my opinions.


----------



## bencher

Quote:


> Originally Posted by *Devildog83*
> 
> Thank you, that was my point. For the same price you get either a 7990 or a 780 Lightning; which gives you better performance? It's obvious, so I don't understand what the deal is. I wasn't only responding to GoldenTiger, sorry about that, but we were having a small debate over this, and I said I would buy a 7990 as opposed to a 780 because of price for performance. All of the reasons I was given why somebody would get the 780 had nothing to do with performance. For $150 or so more I would get a Devil 13, because with Afterburner it overclocks like a champ and would hang with or beat a Titan. It also doesn't run nearly as hot as most other 7990's, so W/C might not be needed. These are just my opinions.


You dont need to overclock a 7990 for it to beat titan

-.-


----------



## szeged

Quote:


> Originally Posted by *bencher*
> 
> You dont need to overclock a 7990 for it to beat titan
> 
> -.-


havent had a chance to catch up with the past few pages, are we talking about in game fps 7990 vs 780/titan? or benchmarks? because any serious bencher doesnt put the 7990 into the same category as 780/titans etc. Single pcb yeah, dual gpu etc etc .


----------



## Devildog83

Quote:


> Originally Posted by *bencher*
> 
> You dont need to overclock a 7990 for it to beat titan
> 
> -.-


I saw a couple of head to head reviews that showed that.


----------



## Durquavian

Quote:


> Originally Posted by *szeged*
> 
> havent had a chance to catch up with the past few pages, are we talking about in game fps 7990 vs 780/titan? or benchmarks? because any serious bencher doesnt put the 7990 into the same category as 780/titans etc. Single pcb yeah, dual gpu etc etc .


I am sure Devildog is speaking of games for the most part.


----------



## scyy

Quote:


> Originally Posted by *xoleras*
> 
> Most folks in the know would state that TSMC's 20nm won't be ready until 2H 2014 if they're lucky. Anyone expecting TSMC's 20nm prior to July 2014 is probably delusional - on top of this, wafer costs are extremely expensive on 20nm; so much so that most companies may not be able to pay the asking price. Anyway, TSMC is in the risk production stage right now with 20nm - obviously some here have no idea what risk production is.
> 
> The abridged version is that "risk production" means it is NOWHERE near ready. Just to put this into perspective, TSMC's 28nm was in "risk production" in late 2009 and 2010. When did TSMC's 28nm ship? Q4 2011. El oh el. If anyone thinks 20nm is imminent, yeah, good luck with that. The only fans that ever suggested the 290X had a possibility of being 20nm were clueless. That isn't to say that 28nm is bad - it just means that the rate of transistor increases will basically halt. Performance increases are still possible on 28nm (see Apple's A7 for proof of this); the main problem is that transistor density cannot increase, so there are definite diminishing returns.
> 
> Anyway, current rumors pin NVIDIA's Maxwell as also being 28nm, with a likely "refresh" on 20nm when it's ready.


So pretty much be ready for the next 280.


----------



## szeged

Quote:


> Originally Posted by *Durquavian*
> 
> I am sure Devildog is speaking of games for the most part.


yeah no doubt 7990 beats single gpu cards like 780 and titan for games, unless the game is awfully coded or the crossfire/sli support is almost non existent lol.


----------



## wstanci3

Bit off topic, but how much OC headroom does a 7990 have? With and without a water block?


----------



## szeged

Quote:


> Originally Posted by *wstanci3*
> 
> Bit off topic, but how much OC headroom does a 7990 have? With and without a water block?


without, if your case doesnt have some good airflow, the OC headroom isnt that great from what i saw with my tests, but it was the middle of summer in florida.

With the waterblock on it goes up, but my 7990 still couldnt reach the overclocks of crossfired 7970s i tested against. Might just be my results with a dud overclocking 7990, but i didnt wanna buy more 7990s just to see if i got a bad clocker lol


----------



## Testier

People should not compare single gpu to dual gpu.


----------



## bencher

Quote:


> Originally Posted by *Testier*
> 
> People should not compare single gpu to dual gpu.


Why because you said so?


----------



## szeged

Quote:


> Originally Posted by *Testier*
> 
> People should not compare single gpu to dual gpu.


most of us know this, but those that only look at pure fps in games will always compare dual gpu cards to single gpu cards and say "but it's on just one pcb so it counts." Many people have tried to help change their PoV on it. Yes, it's true dual gpu cards such as the 7990 beat the Titan, but then again they aren't in the same category (imo at least). A lot of these users just see "HA IT BEATS YOUR TITAN LOL NVIDIA" and go with that (once again imo, and not all users do this).


----------



## wstanci3

^ Yet people still do, unfortunately.
You can't blame some people, though. If you are looking for pure performance and if a dual gpu costs the same as a single gpu and offers superior performance, then that is a no brainer. But, if you are looking for something to bench or experiment with, then the single gpu offers more "freedom."


----------



## GoldenTiger

Quote:


> Originally Posted by *szeged*
> 
> yeah no doubt 7990 beats single gpu cards like 780 and titan for games, unless the game is awfully coded or the crossfire/sli support is almost non existent lol.


Definitely, I was just throwing out a few random tests where it didn't as a sarcastic post... apparently that went over people's heads even after I responded back saying it was done that way, and obviously the 7990 is much faster in most games/tests.


----------



## selk22

For many people it's not about whether it is dual GPU or single GPU; it's about how much performance they can get from one single card. So the 7990 is still a valid comparison to these single-GPU cards when it comes to purchase options. If you want to pit cards against each other in benchmarks, yes, it's silly to compare dual to single, but we are talking about what yields the most FPS for the money, right?


----------



## Devildog83

Quote:


> Originally Posted by *szeged*
> 
> without, if your case doesnt have some good airflow, the OC headroom isnt that great from what i saw with my tests, but it was the middle of summer in florida.
> 
> With the waterblock on it goes up, but my 7990 still couldnt reach the overclocks of crossfired 7970s i tested against. Might just be my results with a dud overclocking 7990, but i didnt wanna buy more 7990s just to see if i got a bad clocker lol


Since you're the graphics card dude, I was wondering: have you tried the new Afterburner beta with the 7990's? I saw a review that said he could not overclock the card worth a darn until he used it, but with it he was able to clock the heck out of the Devil 13.


----------



## Devildog83

Quote:


> Originally Posted by *selk22*
> 
> For many people it's not about whether it is dual GPU or single GPU; it's about how much performance they can get from one single card. So the 7990 is still a valid comparison to these single-GPU cards when it comes to purchase options. If you want to pit cards against each other in benchmarks, yes, it's silly to compare dual to single, but we are talking about what yields the most FPS for the money, right?


Yep, the question was: for ~$650, would you buy a 7990 or a 780? I said 7990, and that's when it was on like Donkey Kong.


----------



## szeged

Quote:


> Originally Posted by *Devildog83*
> 
> Since you're the graphics card dude, I was wondering: have you tried the new Afterburner beta with the 7990's? I saw a review that said he could not overclock the card worth a darn until he used it, but with it he was able to clock the heck out of the Devil 13.


i havent played with the 7990s in a while, definitely not on the latest afterburner, ill have to check it out and see if they clock any differently than an older version.


----------



## selk22

Quote:


> Originally Posted by *Devildog83*
> 
> Yep, the question was: for ~$650, would you buy a 7990 or a 780? I said 7990, and that's when it was on like Donkey Kong.


Right. I was really replying to this
Quote:


> Originally Posted by *Testier*
> 
> People should not compare single gpu to dual gpu.


So like I said: at that price point, what yields the best frames? It's more often the 7990 in gaming, so for people looking for a single-card solution I think it's an excellent choice.

I also think the 780 is an excellent choice if you ever plan to SLI or heavy OC. I am also a sucker for PhysX and really enjoy when it's present in my gaming experience.


----------



## Devildog83

Quote:


> Originally Posted by *selk22*
> 
> Right. I was really replying to this
> So like I said: at that price point, what yields the best frames? It's more often the 7990 in gaming, so for people looking for a single-card solution I think it's an excellent choice.
> 
> I also think the 780 is an excellent choice if you ever plan to SLI or heavy OC. I am also a sucker for PhysX and really enjoy when it's present in my gaming experience.


Question, can you crossfire 7990's?


----------



## szeged

Quote:


> Originally Posted by *selk22*
> 
> Right. I was really replying to this
> So like I said: at that price point, what yields the best frames? It's more often the 7990 in gaming, so for people looking for a single-card solution I think it's an excellent choice.
> 
> I also think the 780 is an excellent choice if you ever plan to SLI or heavy OC. I am also a sucker for PhysX and really enjoy when it's present in my gaming experience.


yeah if all you care about is raw performance for the best price, its hard to beat a 7990 if your space is constrained to just 1 gpu. Sorry im always stuck in benchmark mode so i never put the dual gpu cards in the same category as single lol.


----------



## wstanci3

Quote:


> Originally Posted by *Devildog83*
> 
> Question, can you crossfire 7990's?


Yes, but Oh God Why?


----------



## selk22

Quote:


> Originally Posted by *Devildog83*
> 
> Question, can you crossfire 7990's?


Yes you can, but in my opinion at this point I'd prefer the 780, because if I'm going to have two then I may just want three someday, and the 7990 can't do that, my friend.


----------



## Devildog83

Quote:


> Originally Posted by *wstanci3*
> 
> Yes, but Oh God Why?


Well, he talked about multiple 780's, so why not 2 7990's? They cost the same.


----------



## szeged

two 7990s on the amd reference cooler, i dont even want to think about that. especially if you want them in a closed case lol.


----------



## Devildog83

Quote:


> Originally Posted by *selk22*
> 
> Yes you can, but in my opinion at this point I'd prefer the 780, because if I'm going to have two then I may just want three someday, and the 7990 can't do that, my friend.


6 gpus is way in the future huh.


----------



## wstanci3

Quote:


> Originally Posted by *Devildog83*
> 
> Well he talked about multiple 780's why not 2 7990's? They cost the same.


1) Dat power draw
2) Dat heat
3) 4-way doesn't scale too well

Though, it would be sick if both were in a watercooled setup.


----------



## nvidiaftw12

Quote:


> Originally Posted by *Devildog83*
> 
> Well he talked about multiple 780's why not 2 7990's? They cost the same.


Scaling gets worse with more gpus.


----------



## szeged

if i could get my hands on two ares II 7990s, i would be all over that so fast despite the cost.


----------



## selk22

Quote:


> Originally Posted by *Devildog83*
> 
> 6 gpus is way in the future huh.


I am not trying to argue with you buddy just saying what I would do personally in this unlimited money situation.
Quote:


> Originally Posted by *szeged*
> 
> two 7990s on the amd reference cooler, i dont even want to think about that. especially if you want them in a closed case lol.


----------



## Devildog83

Quote:


> Originally Posted by *selk22*
> 
> I am not trying to argue with you buddy just saying what I would do personally in this unlimited money situation.


I knew that, I was just playin' around. I can't even afford 1 7990 or 780 right now.


----------



## Devildog83

Quote:


> Originally Posted by *szeged*
> 
> if i could get my hands on two ares II 7990s, i would be all over that so fast despite the cost.


Right now I would be happy just to get a second Devil 7870. It's only a 7870 but I love this card.


----------



## szeged

Quote:


> Originally Posted by *Devildog83*
> 
> Right now I would be happy just to get a second Devil 7870. It's only a 7870 but I love this card.


i tested one of the devil 7870s, best 7870 i got to play with, you have a nice card there.


----------



## selk22

Quote:


> Originally Posted by *Devildog83*
> 
> I knew that, I was just playin' around. I can't even afford 1 7990 or 780 right now.


Such is life









Most likely I will end up with either the 280X or the 290, then Xfire at some point. That or SLI 770's.

I'd love to get my hands on that GK110 but it's a little out of my price range right now. Maybe the 290X will force the price down on the 780's some...

Who knows until the 15th, but I feel it's a great time for GPU purchases this year.


----------



## Devildog83

Quote:


> Originally Posted by *szeged*
> 
> i tested one of the devil 7870s, best 7870 i got to play with, you have a nice card there.


Thanks, I got an open box for $208; the only thing missing was the mouse pad. $52 is a lot for a mouse pad, LOL.

Maybe I got the one you tested. Do you have my mouse pad?


----------



## szeged

rofl no mousepad for $52 no ty lololol

thats a nice discount you got, that 7870 is worth the premium over the other ones imo.


----------



## rdr09

Quote:


> Originally Posted by *nvidiaftw12*
> 
> Scaling gets worse with more gpus.


i wonder how 4 290X will scale?



is it even possible?


----------



## Devildog83

Quote:


> Originally Posted by *szeged*
> 
> rofl no mousepad for $52 no ty lololol
> 
> thats a nice discount you got, that 7870 is worth the premium over the other ones imo.


I love the backplate too. It would have cost about $30 to get an aftermarket one.


----------



## szeged

i wish they did a devil 7970, i would have grabbed one for sure lol


----------



## Devildog83

Quote:


> Originally Posted by *szeged*
> 
> i wish they did a devil 7970, i would have grabbed one for sure lol


No doubt but it probably would have been $500


----------



## DzillaXx

Quote:


> Originally Posted by *rdr09*
> 
> i wonder how 4 290X will scale?
> 
> 
> 
> is it even possible?


IIRC QuadFire scales better than Quad SLI.
So that should at least help it.


----------



## selk22

Quote:


> Originally Posted by *DzillaXx*
> 
> IIRC QuadFire scales better than Quad SLI.
> So that should at least help it.


Really? Interesting!

Maybe we will get a Devil 280X?


----------



## Devildog83

Anyone know what the 290x will cost?


----------



## rdr09

Quote:


> Originally Posted by *DzillaXx*
> 
> IIRC QuadFire scales better than Quad SLI.
> So that should at least help it.


but is it possible to quadfire the 290s?


----------



## Roaches

Quote:


> Originally Posted by *szeged*
> 
> i wish they did a devil 7970, i would have grabbed one for sure lol


I'd love an R9-280X Devil as well, or hope Gigabyte does a WindForce 5X on their 280X.


----------



## wstanci3

Quote:


> Originally Posted by *Devildog83*
> 
> Anyone know what the 290x will cost?


The rumor mill says $699 or $649.


----------



## Devildog83

Quote:


> Originally Posted by *wstanci3*
> 
> The rumor mill says $699 or $649.


Hmmm, about the same as a 7990 or 780


----------



## wstanci3

Yeah, about right unfortunately.
Was hoping for a $599 price tag at the most. Oh well. I want to know more about the 290 though!


----------



## selk22

Quote:


> Originally Posted by *wstanci3*
> 
> Yeah, about right unfortunately.
> Was hoping for a $599 price tag at the most. Oh well. I want to know more about the 290 though!


I do also; if we can get a 4GB-6GB 290, I will be hopping on that train pretty quick if it can rival a 780's stock performance.


----------



## wstanci3

Wouldn't be surprised if the 290 rivals the 780 at stock. But, with heavy over clocking? That will be interesting to see.


----------



## szeged

i doubt im gonna even read all the sponsored review sites' takes on the 290x when it comes out, because stock vs stock and high overclock vs stock are the worst reviews. gonna wait till we can find max OC vs max OC lol. so tired of seeing reviews go "omg when we overclocked this card a lot, it beat this other card at stock! clear winner guys!"


----------



## wstanci3

Yeah, going by some reviews today for GTX 780s, if you didn't know any better you'd have thought the Titan was incapable of overclocking. Lol


----------



## szeged

Quote:


> Originally Posted by *wstanci3*
> 
> Yeah, going by some reviews today for gtx 780s, if you didn't know any better, you'd thought the Titan was incapable of overclocking. Lol


the classified 780 reviews are perfect examples, dont get me wrong the 780 classifieds are monster cards, but all the reviews compare them to stock titans then go " well titan is dead all hail our 780 overlords, its official all 780s now beat stock titans when highly overclocked, alert the press"


----------



## bencher

Quote:


> Originally Posted by *selk22*
> 
> I do also; if we can get a 4GB-6GB 290, I will be hopping on that train pretty quick if it can rival a 780's stock performance.


I am pretty sure it will compete with a 780 if rumored benches are true.


----------



## fateswarm

Quote:


> Originally Posted by *raghu78*
> 
> 1 month, oh you mean 1 year
> 
> 
> 
> 
> 
> 
> 
> 20nm is a long way off. 9 months best case. 12 months worst case. you haven't been paying attention to TSMC earnings calls Q1 and Q2. TSMC is struggling to be consistent with their statements about 20nm volume production start and 20nm wafer volume in 2014 in Q1 and Q2 calls. they are trying hard to put on a show that 20nm is ready for H1 2014. but the reality is its H2 2014 and with very low volume till Q4 2014.


You keep acting with juvenile sarcasm when you are apparently 35 years old. You refuse to see the obvious news. It has been explicitly reported by several news agencies that Q1 '14 will have 20nm, and in fact they have a deal with Apple to have it ready by then or they will be ridiculed, and they will even have 16nm launching by the end of '14 because of the same Apple deal (even if the latter sounds hard to believe).


----------



## renaldy

the numbers look good but i don't think it will surpass the titan


----------



## raghu78

Quote:


> Originally Posted by *fateswarm*
> 
> You keep acting with juvenile sarcasm when you are apparently 35 years old. You refuse to see the obvious news. It has been explicitly reported by several news agencies that Q1 '14 will have 20nm, and in fact they have a deal with Apple to have it ready by then or they will be ridiculed, and they will even have 16nm launching by the end of '14 because of the same Apple deal (even if the latter sounds hard to believe).


I reserve my sarcasm for the people who deserve it. You are probably one of the few people clueless enough to believe TSMC 20nm will be ready in Q1 2014. In TSMC's Q1 2013 earnings call they stated Q2 2014 for the start of volume production. In the Q2 2013 earnings call, when the same question was asked, they were vague on the start of 20nm volume production; they started saying 20nm wafer revenue will be high single digits (around 8%) of 2014 TSMC revenue. In the Q1 2013 call TSMC said 20nm wafer volume in 2014 would be greater than 28nm wafer volume in 2012. In the Q2 2013 call they did an about-turn and said 20nm wafer revenue will be more than 28nm wafer revenue in the first year of 28nm volume production. TSMC 28nm started in Q4 2011 with 2% wafer volume that quarter. These statements indicate TSMC's lack of confidence in getting 20nm volume production out in H1 2014. Furthermore, the 20nm volume in 2014 is going to be very small; only in Q4 2014 will it get to >=5%. Do you think AMD knew less than you about the state of TSMC 20nm when they decided to go with Hawaii on 28nm? Silly guy.


----------



## Forceman

FWIW, this DigiTimes article from 2 weeks ago says they are possibly ahead of schedule and will start volume production in 1Q 2014, although that almost certainly wouldn't be for GPUs. Next earnings call is next week, so we should know more then.
Quote:


> Taiwan Semiconductor Manufacturing Company (TSMC) has stepped up its purchases of manufacturing equipment for its 20nm process which is slated to enter volume production in the first quarter of 2014, according to sources at fab-tool suppliers.


http://www.digitimes.com/news/a20130926PD205.html


----------



## raghu78

Quote:


> Originally Posted by *Forceman*
> 
> FWIW, this DigiTimes article from 2 weeks ago says they are possibly ahead of schedule and will start volume production in 1Q 2014, although that almost certainly wouldn't be for GPUs. Next earnings call is next week, so we should know more then.
> http://www.digitimes.com/news/a20130926PD205.html


FWIW which do you believe ? TSMC official statements on an earnings call or some internet news site


----------



## GoldenTiger

Quote:


> Originally Posted by *raghu78*
> 
> FWIW which do you believe ? TSMC official statements on an earnings call or some internet news site


Real news site saying official statements from TSMC, vs. old TSMC call when a new one's coming in a week that'll clarify things for sure? Yeah, I go with the former, it's more up-to-date. AMD seems to want us to believe 20nm is bad for end-users, which is hilarious, per an interview I saw claiming it's "too hot and too expensive".


----------



## TrevBlu19

Quote:


> Originally Posted by *GoldenTiger*
> 
> Real news site saying official statements from TSMC, vs. old TSMC call when a new one's coming in a week that'll clarify things for sure? Yeah, I go with the former, it's more up-to-date. AMD seems to want us to believe 20nm is bad for end-users, which is hilarious, per an interview I saw claiming it's "too hot and too expensive".


Just skip to 16nm with finfets.


----------



## Artikbot

Quote:


> Originally Posted by *GoldenTiger*
> 
> AMD seems to want us to believe 20nm is bad for end-users, which is hilarious, per an interview I saw claiming it's "too hot and too expensive".


If fewer nanometers were always better, you wouldn't see most of the ASICs on the market besides GPUs and CPUs still being made on older 32, 45, 65, 90 and 130 nm nodes, would you?

*DISCLAIMER: numbers pulled out of my rear*

Imagine AMD gets a Tahiti XT die manufactured on 28nm for $120. It has a 300W power draw and a heat density of 0.82W/mm^2.

The same die is scaled down to 20nm. Imagine it now costs AMD $160 to make, with a 250W power draw and a corresponding heat density of 0.96W/mm^2.

That it will run hotter is pretty much a given: the cooler will have a harder time coping with the much-reduced radiating surface, even though consumption is lower. The die is also more expensive (this is quite obvious, too). The cost of a redesigned cooler plus a pricier die adds up down the road... What if the R9 290X then retailed for $829 instead of $729? It would be far less competitive.

And not just that card: if they scaled every single die down, the R9 280X, instead of sitting at the $299 mark, could potentially hit $379, making NVIDIA's offerings a no-brainer...

See what I mean?
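The scaling argument above is just power divided by die area. A quick sketch, with die areas that are assumptions in the spirit of the disclaimer (Tahiti XT is roughly 365 mm^2; the 20nm area is picked to reproduce the post's figure, since real shrinks are far from an ideal optical shrink):

```python
def heat_density(power_w, area_mm2):
    """Watts dissipated per square millimeter of die."""
    return power_w / area_mm2

# 28nm Tahiti XT: ~365 mm^2 at 300W
tahiti_28 = heat_density(300, 365)   # ~0.82 W/mm^2

# Hypothetical 20nm shrink: assume ~260 mm^2 at 250W
# (an ideal (20/28)^2 shrink would be smaller still, making it worse)
tahiti_20 = heat_density(250, 260)   # ~0.96 W/mm^2

print(f"28nm: {tahiti_28:.2f} W/mm^2, 20nm: {tahiti_20:.2f} W/mm^2")
```

So even with 50W less to dissipate, the smaller radiating surface pushes heat density up, which is the whole point of the post.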


----------



## DzillaXx

Quote:


> Originally Posted by *Forceman*
> 
> FWIW, this DigiTimes article from 2 weeks ago says they are possibly ahead of schedule and will start volume production in 1Q 2014, although that almost certainly wouldn't be for GPUs. Next earnings call is next week, so we should know more then.
> http://www.digitimes.com/news/a20130926PD205.html


Apple has first access to TSMC's 20nm anyway.

It won't be ready for AMD's and NVIDIA's GPUs until at least the second half of 2014.


----------



## raghu78

Quote:


> Originally Posted by *GoldenTiger*
> 
> Real news site saying official statements from TSMC, vs. old TSMC call when a new one's coming in a week that'll clarify things for sure? Yeah, I go with the former, it's more up-to-date. AMD seems to want us to believe 20nm is bad for end-users, which is hilarious, per an interview I saw claiming it's "too hot and too expensive".


Q2 2013 was the last official TSMC earnings call. Now that's "old" because some internet news site posted rumours a couple of weeks back? Moreover, the news site did not have any TSMC official on record, by name, saying anything, so you are just talking rubbish.

http://www.tsmc.com/uploadfile/ir/quarterly/2013/1J5NC/C/TSMC%201Q13%20Transcript.pdf

pages 11-12, 17-18

http://www.tsmc.com/uploadfile/ir/quarterly/2013/2TGw1/E/TSMC%202Q13%20Transcript.pdf

pages 9-10, 12-13

If you still don't get a clue that 20nm is not ready for H1 2014, let me say you are incapable of basic understanding.


----------



## Forceman

Quote:


> Originally Posted by *raghu78*
> 
> FWIW which do you believe ? TSMC official statements on an earnings call or some internet news site


I believe that both could be true. In June when they made the earnings call they thought one thing, but then they accelerated the schedule (or just got ahead) between then and the end of September (when that article came out). They don't have to be mutually exclusive, considering one is several months old. As I said, we'll find out more next week - a lot can change in a quarter. And I'd rate DigiTimes as better than just "some" internet news site. We're not talking about WCCF or Videocardz here.

As for what I believe, I think it'll be second half 2014 until we see 20nm GPUs, but I don't work at TSMC and neither do you.


----------



## TrevBlu19

Quote:


> Originally Posted by *Forceman*
> 
> I believe that both could be true. In June when they made the earnings call they thought one thing, but then they accelerated the schedule (or just got ahead) between now and the end of September (when that article came out). They don't have to be mutually exclusive, considering one is several months old. As I said, we'll find out more next week - a lot can change in a quarter.
> 
> I believe it'll be second half 2014 until we see 20nm GPUs, but I don't work at TSMC and neither do you.


And I bet they will cost an arm and a leg too, lol.


----------



## Seronx

@raghu78

You should probably look for some quotes from Dr. Jack Sun (TSMC Vice President of R&D and CTO). He is the guy to listen to, not the CEO or the Marketing Director.

--
Quote:


> First the conference, as I have written before 20nm is ramping, exceeding expectations. We will see production 20nm FPGAs and mobile devices in Q1 2014, absolutely. This comes not only from TSMC's Jack Sun and Cliff Hou, but also from the fabless crowd: Bob Maines of Oracle, Brad Howe of Altera, VJ Janapaty of LSI Logic, Esin Torfioglu of QCOM, and Sandeep Bharathi of Xilinx. Always listen to the crowd, never listen to the press, especially EETimes.


http://www.semiwiki.com/forum/content/2820-tsmc-oip-2013-trip-report.html


----------



## raghu78

Quote:


> Originally Posted by *Seronx*
> 
> @raghu78
> 
> You should probably look for some quotes from Dr. Jack Sun(TSMC Vice President of R&D and CTO). He is the guy to listen to, not the CEO or the Marketing Director.
> 
> --
> http://www.semiwiki.com/forum/content/2820-tsmc-oip-2013-trip-report.html


Do you think Morris Chang, the CEO of TSMC, is not regularly updated on the state of TSMC's 20nm? I find that tough to accept. He is the boss and needs to know how his company is doing on the next-gen process node. I think the Q3 2013 earnings call on Oct 17th should bring more clarity.


----------



## TamaDrumz76

A few more 290s showed up on Newegg. There are now a total of 9 different ones listed (including both 290 and 290X).


----------



## Nonehxc

Quote:


> Originally Posted by *szeged*
> 
> i doubt im gonna even read all the sponsored review sites take on the 290x when it comes out because stock vs stock and high overclock vs stock are the worst reviews. gonna wait till we can find max OC vs max OC lol, so tired of seeing reviews " omg when we overclocked this card a lot, it beat this other card at stock! clear winner guys!"


FPS Lol.









OC FPS is an accurate depiction of what a card can do (yes, yes, silicon lottery, headroom, etc.). Then, for comparison, I look for users' results on the particular card I want it compared with (that would be your Titan OC vs. the journos' 290X OC, or my humble but hard-kicking 7950 vs. the 780 I plan to buy). Guru3D has, IMHO, the best reviews out there, so I usually go by their numbers; they like to show OCed results and don't skimp on the sliders.









Then I look for user benchmarks here, which at release is confusing, since some guys get the right settings and numbers while others unleash the apocalypse on their cards, which usually translates, in classy OCN fashion, into "OMG THIS CARDZ OWSUM!!!!!!!!!!!" or "CRAPCRAPCRAPCRAP WURTHLISS PIE OF CRAP!!!!". The partisan's duality.


----------



## TrevBlu19

http://videocardz.com/46610/amd-hawaii-r9-290-series-gpu-diagram-leaks


----------



## Forceman

So the pixel fill rate is quite a bit higher than Titan, but the texture fill rate is still quite a bit lower. What impact does that have on performance?

Looks like they made a concerted effort to un-bottleneck the backend, if that's what was holding them back before.
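Rough numbers, since fill rate is just unit count times clock. These use the rumored 290X specs (64 ROPs, 176 TMUs, ~1 GHz) against Titan's base clock, so treat them as illustrative:

```python
def fill_rates(rops, tmus, clock_ghz):
    """Return (pixel fill in Gpixels/s, texture fill in Gtexels/s)."""
    return rops * clock_ghz, tmus * clock_ghz

r9_290x = fill_rates(64, 176, 1.0)    # ~64 GP/s, ~176 GT/s
titan   = fill_rates(48, 224, 0.837)  # ~40 GP/s, ~187 GT/s

print(f"290X: {r9_290x}, Titan: {titan}")
```

That is roughly 60% more pixel fill than Titan but slightly less texture fill, which matches the diagram.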


----------



## Jared Pace

Quote:


> Originally Posted by *Forceman*
> 
> So the pixel fill rate is quite a bit higher than Titan, but the texture fill rate is still quite a bit lower. What impact does that have on performance?


it means 3% faster & 40% cheaper.


----------



## sugarhell

Wow, that die. How did they do it? Interesting.


----------



## TheBlademaster01

The 64 ROPs along with 512-bit bus (assuming clocks will follow) might be killer at high resolutions.


----------



## psyside

Motherofgod.jpg


----------



## raghu78

NVIDIA picked probably the worst time to hype up 4K gaming. Last month they went viral marketing on 4K and AMD's CF problems at 4K; this month the R9 290 series arrives with CF Eyefinity at 4K working at launch, according to reviewers who are under NDA. 33% more ROPs (than Titan's 48) should help at 1440p/1600p/4K with 4x/8x MSAA and 2x/4x SSAA.

http://techreport.com/news/25428/driver-fix-for-crossfire-eyefinity-4k-frame-pacing-issues-coming-this-fall

"AMD's newest Radeon GPU, the "Hawaii" chip that will power the Radeon R9 290 and 290X cards announced earlier today, will of course be a top priority, as well. Although we can't yet divulge too many details, we expect Hawaii-based graphics cards to arrive with a very capable solution for CrossFire frame compositing and pacing already in place."

http://www.pcper.com/news/General-Tech/TechReport-AMD-Plans-Frame-Pacing-Driver-CrossFire-Eyefinity-and-4K-Autumn

"Editor's Note: I just spoke with Raja Koduri as well and he basically reiterated everything that Scott noted in his story on The Tech Report as well. The upcoming 290X will have frame pacing at Eyefinity and 4K resolution at launch while the cards below that in the R9 series, and users of Radeon HD 7000 cards (and likely beyond) will need some more time before the driver is ready. I'll be able to talk quite a bit more about the changes to BOTH architectures very shortly so stay tuned for that."


----------



## Moragg

Quote:


> Originally Posted by *raghu78*
> 
> *Nvidia picked probably the worst time to hype up 4k gaming*. last month they went viral marketing on 4K and AMD CF problems at 4k. this month R9 290 series arrives with CF Eyefinity at 4K working at launch according to reviewers who are under NDA. 33% more ROPs which should help at 1440p / 1600p / 4k with 4x/8x MSAA and 2x/4x SSAA .
> "


How do you make your products look better? Get your competitors to advertise them.


----------



## maarten12100

Actually, 100% more ROPs (versus Tahiti's 32).
I have no doubt Titan will get ridiculed at high resolutions, and even more so with AA at high resolutions, which is just a higher resolution downsampled.


----------



## TheBlademaster01

Quote:


> Originally Posted by *maarten12100*
> 
> Actually 100% more ROPs
> I doubt the Titan will not get ridiculed in high resolutions and even more on AA high resolutions *which is just a higher resolution downsampled*


That totally depends on the algorithm. For supersampling and its derivatives, yes.
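For supersampling, the "render big, filter down" idea is literally this. A toy box filter on grayscale values, not any driver's actual implementation:

```python
def downsample_2x(img):
    """Average each 2x2 block of a 2D list of grayscale values
    into one output pixel (a simple box filter)."""
    h, w = len(img), len(img[0])
    return [
        [
            (img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# A hard black/white edge rendered at 2x resolution...
hi_res = [
    [0, 0, 255, 255],
    [0, 255, 255, 255],
]
# ...comes out with an intermediate gray at the edge: [[63.75, 255.0]]
print(downsample_2x(hi_res))
```

Post-process "blur" AA methods work on the final-resolution image instead, which is why they are cheaper but softer.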


----------



## Kuivamaa

This thing dwarfs Tahiti. I would be surprised if AMD didn't push it out in FirePro form somehow, like, ASAP.


----------



## kingduqc

Quote:


> Originally Posted by *bencher*
> 
> Why because you said so?


Because there are obvious perks of having a single GPU vs. two that are not measured in framerate...

No issues at release, less heat and power draw, the possibility of upgrading later, less stuttering and tearing, and the list goes on and on. Two GPUs vs. one will always win the framerate benchmarks, but if you look at the bigger picture, you'll understand that comparing single vs. dual by looking at only one side of the card is being blind and unfair.

The Titan gets almost dual-card performance while keeping the perks of a single GPU, and that's how you should look at it.


----------



## specopsFI

Slightly OT, but could someone tell me what's the current status on downsampling with Radeons? I've become addicted to it, such an easy way (straight from Nvidia Control Panel) to get universal AA that looks amazing and is surprisingly lightweight.

Personally, this really is the biggest thing keeping me from going from my 780 to 290(X). Hawaii seems like a really good high-res card but since I don't think I'm going to get a native 4K monitor any time soon, downsampling would be the only way to get good use out of all that back-end!


----------



## Regent Square

Quote:


> Originally Posted by *raghu78*
> 
> Nvidia picked probably the worst time to hype up 4k gaming. last month they went viral marketing on 4K and AMD CF problems at 4k. this month R9 290 series arrives with CF Eyefinity at 4K working at launch according to reviewers who are under NDA. 33% more ROPs which should help at 1440p / 1600p / 4k with 4x/8x MSAA and 2x/4x SSAA .
> 
> http://techreport.com/news/25428/driver-fix-for-crossfire-eyefinity-4k-frame-pacing-issues-coming-this-fall
> 
> "AMD's newest Radeon GPU, the "Hawaii" chip that will power the Radeon R9 290 and 290X cards announced earlier today, will of course be a top priority, as well. Although we can't yet divulge too many details, we expect Hawaii-based graphics cards to arrive with a very capable solution for CrossFire frame compositing and pacing already in place."
> 
> http://www.pcper.com/news/General-Tech/TechReport-AMD-Plans-Frame-Pacing-Driver-CrossFire-Eyefinity-and-4K-Autumn
> 
> "Editor's Note: I just spoke with Raja Koduri as well and he basically reiterated everything that Scott noted in his story on The Tech Report as well. *The upcoming 290X will have frame pacing at Eyefinity and 4K resolution at launch while the cards below that in the R9 series*, and users of Radeon HD 7000 cards (and likely beyond) will need some more time before the driver is ready. I'll be able to talk quite a bit more about the changes to BOTH architectures very shortly so stay tuned for that."


290X AND 290, that is probably what he meant. They are both based on the same architecture; it would be dumb to optimize for only one card.


----------



## Ukkooh

Quote:


> Originally Posted by *kingduqc*
> 
> The titan get almost dual card performance


This depends entirely on what you compare it to. This makes no sense if you don't state what you are comparing it against.


----------



## maarten12100

Quote:


> Originally Posted by *TheBlademaster01*
> 
> Totally depends on the algorithm. For supersampling and its derivatives, yes.


Of course, I'm not talking about blur AA methods, just supersampling; it's the best, but the most taxing.


----------



## TrevBlu19




----------



## sugarhell

Quote:


> Originally Posted by *TrevBlu19*


----------



## Stay Puft

Quote:


> Originally Posted by *TrevBlu19*


Fake


----------



## TrevBlu19

ahhh sorry..







i hope its fake


----------



## Stay Puft

Quote:


> Originally Posted by *TrevBlu19*
> 
> ahhh sorry..
> 
> 
> 
> 
> 
> 
> 
> i hope its fake


A US site has already listed the 290X for $591. It's extremely fake.


----------



## wstanci3

Oh God, that would be hilarious. So sad, but so hilarious.


----------



## TheBlademaster01

Artifacts near the reflection of the 290 and 290X models -> bad photoshop.


----------



## Regent Square

Quote:


> Originally Posted by *Stay Puft*
> 
> US site has already listed the 290X for 591. Its extremely fake


I can stare at your avatar for hours....

Damn,


----------



## grunion

Quote:


> Originally Posted by *specopsFI*
> 
> Slightly OT, but could someone tell me what's the current status on downsampling with Radeons? I've become addicted to it, such an easy way (straight from Nvidia Control Panel) to get universal AA that looks amazing and is surprisingly lightweight.
> 
> Personally, this really is the biggest thing keeping me from going from my 780 to 290(X). Hawaii seems like a really good high-res card but since I don't think I'm going to get a native 4K monitor any time soon, downsampling would be the only way to get good use out of all that back-end!


Still no native support, but CRU is easy peasy to use.


----------



## BusterOddo

Quote:


> Originally Posted by *grunion*
> 
> Still no native support, but CRU is easy peasy to use.


Could you please explain to me what CRU is? I had a downsampling tool but it only worked with older drivers. I really liked the results and would love to get downsampling back with current drivers. Thanks


----------



## cdoublejj

would around 100-128 bit truly "DOUBLE" with 48-64 being the norm for better/higher end cards?


----------



## grunion

Quote:


> Originally Posted by *BusterOddo*
> 
> Could you please explain to me what CRU is? I had a downsampling tool but it only worked with older drivers. I really liked the results and would love to get downsampling back with current drivers. Thanks


CRU


----------



## Stay Puft

Quote:


> Originally Posted by *Regent Square*
> 
> I can stare at your avatar for hours....
> 
> Damn,


Boxxy?


----------



## BusterOddo

Quote:


> Originally Posted by *grunion*
> 
> CRU


+ High five since I can't rep you?







Will give this a go when I get home from work tonight. Thanks


----------



## vhsownsbeta

Quote:


> Originally Posted by *BusterOddo*
> 
> + High five since I cant rep you?
> 
> 
> 
> 
> 
> 
> 
> Will give this a go when I get home from work tonight. Thanks


Yep. Definitely virtua-brofist worthy. I had never heard of this tool before. It looks extremely interesting...


----------



## TamaDrumz76

I've tried using CRU to downsample... It does not work in nearly the same way as NV control panel's implementation does. I can also report that I've had 0 success trying to use it. On the other hand, on my laptop, downsampling works well and is done very easily on NV side (with the control panel, not CRU).


----------



## specopsFI

Yeah, CRU doesn't seem to do what Nvidia Control Panel can do. I've read about it before and this part of ToastyX's notes baffles me: "Lower resolutions will be scaled up if GPU scaling is enabled, but higher resolutions won't be scaled down by the GPU. Higher resolutions will only work if the monitor can handle them."

I want a way to render games at 3840x2160, then use the GPU to scale the image down to 1920x1080 and leave the connection between the GPU and the monitor (EDID) out of all of that. I suppose that can't be done with AMD as of now?


----------



## CynicalUnicorn

Quote:


> Originally Posted by *wstanci3*
> 
> ^ Yet people still do, unfortunately.
> You can't blame some people, though. If you are looking for pure performance and if a dual gpu costs the same as a single gpu and offers superior performance, then that is a no brainer. But, if you are looking for something to bench or experiment with, then the single gpu offers more "freedom."


The only time a dual-GPU is better than a single-GPU is for mITX. You can only fit a single Titan/780, but you can fit two 680s/7970s using a 690/7990. Otherwise, crossfire/SLI is better, mostly for heat, but also partially for additional PCIe bandwidth.


----------



## TamaDrumz76

Quote:


> Originally Posted by *specopsFI*
> 
> Yeah, CRU doesn't seem to do what Nvidia Control Panel can do. I've read about it before and this part of ToastyX's notes baffles me: "Lower resolutions will be scaled up if GPU scaling is enabled, but higher resolutions won't be scaled down by the GPU. Higher resolutions will only work if the monitor can handle them."
> 
> I want a way to render games at 3840x2160, then use the GPU to scale the image down to 1920x1080 and leave the connection between the GPU and the monitor (EDID) out of all of that. I suppose that can't be done with AMD as of now?


Exactly the problem with CRU. I want a simple and effective way of downsampling on AMD like NV has.

I couldn't effectively increase my resolution at all as it had severe adverse effects on the monitor.


----------



## Regent Square

Quote:


> Originally Posted by *TamaDrumz76*
> 
> Exactly the problem with CRU. I want a simple and effective way of downsampling on AMD like NV has.
> 
> I couldn't effectively increase my resolution at all as it had severe adverse effects on the monitor.


Can you overclock your monitor with an AMD GPU?


----------



## TheBlademaster01

Yes, third party software.


----------



## Regent Square

Quote:


> Originally Posted by *TheBlademaster01*
> 
> Yes, third party software.


Could you give a few examples?


----------



## HighTemplar

Quote:


> Originally Posted by *CynicalUnicorn*
> 
> The only time a dual-GPU is better than a single-GPU is for mITX. You can only fit a single Titan/780, but you can fit two 680s/7970s using a 690/7990. Otherwise, crossfire/SLI is better, mostly for heat, but also partially for additional PCIe bandwidth.


Not sure why you're referring to PCIe bandwidth when discussing single GPU cards vs crossfire. In the case of the 7990 for example, the 7990 is actually MORE efficient than 7970s in CF, and performs better, which nulls your statement about there being 'more' bandwidth available. There is actually less, and FAR more latency over PCIe than a dual gpu card would have locally over the PCB. The signal has to travel a much farther distance. Not only that, but we are not limited by PCIe bandwidth at all with the current GPUs. Even PCIe 2.0 8x does not incur a significant performance penalty when running even a GTX Titan, for example.


----------



## TheBlademaster01

Quote:


> Originally Posted by *Regent Square*
> 
> could u give a few examples?


Quote:


> *Step 1*
> 
> Update your video card drivers to the newest version.
> If using a crossfire (multi AMD GPU) setup, than multiple crossfire bridges must be used. For example, if crossfiring two GPUs than two crossfire bridges must be used.
> 
> Note: If you have a 200 or below series Nvidia card (ex: GTX260, etc.), reports suggest that an OC above 96 Hz is not possible. Time for an upgrade!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Step 2*
> 
> Single and SLI Nvidia card setups should patch their Nvidia driver to overclock (OC). The patch is mostly unnecessary for SLI card owners, but it's recommended for all Nvidia owners. *[CLICK] The patch is found here.*
> 
> AMD/ATI card owners need to patch their driver as well. The patch used is different than the Nvidia one used above. *[CLICK] Instructions and patch can be found here.*
> 
> *Step 3*
> 
> Both the Nvidia and AMD clock patch should include an executable which puts your computer in and out of Test Mode. You need to put your computer in Test Mode and keep it in Test Mode as long as your monitor is OCed. Restart your computer. In the lower right corner it will say "Test Mode" as long as it is on. I've read there is a way to make the text go away.
> 
> *Step 4*
> 
> Nvidia and AMD:
> Download and install the *[CLICK] Custom Resolution Utility (CRU) here.*
> 
> To use the utility above: click the stock 60Hz profile, press the copy button, make a new profile, and then use the paste button. This creates a mirror image of the stock profile. Next, just edit the refresh rate number at the bottom (originally 60) to something like 96. *Here is some explanation + pictures of the window that pops up to edit your custom refresh rate in CRU* *[CLICK]*. This makes a 96 Hz refresh rate profile for your computer. Once that's done, reboot your computer and go into your settings (Catalyst Control Center for AMD, Nvidia Control Panel for Nvidia) and choose the refresh rate you just created.
> 
> If you get bad OC results, it may help to enable "LCD Reduced" in CRU. Also, in CCC, enable Reduce DVI Frequency.


Not sure where this came from. But someone I know is using it for his 7990.


----------



## Ultracarpet

Quote:


> Originally Posted by *fateswarm*
> 
> raghu78 has an exceptional ability to fool teenagers that he is "intelligent" and I applaud him for it, though it's not any real achievement at his age. He indirectly imposes, under the facade of "more reliable sources", that reports related to financial data are more reliable than news directly sourced to TSMC and Taiwan that 20nm will be ready by Q1 '14. But his logic has a black hole: he hasn't explained why sources directly pointing to TSMC and Taiwan aren't more reliable than the financial data he found.


Who are you talking to?

Kind of sounds like some sort of internal dialogue you are having with yourself lmao.


----------



## Regent Square

Quote:


> Originally Posted by *TheBlademaster01*
> 
> Not sure where this came from. But someone I know is using it for his 7990.


Thank you









+rep


----------



## fateswarm

Quote:


> Originally Posted by *Ultracarpet*
> 
> Who


Read first word in what you quoted.


----------



## BusterOddo

Quote:


> Originally Posted by *TamaDrumz76*
> 
> I've tried using CRU to downsample... It does not work in nearly the same way as NV control panel's implementation does. I can also report that I've had 0 success trying to use it. On the other hand, on my laptop, downsampling works well and is done very easily on NV side (with the control panel, not CRU).


Yes, it does seem like it will be different; from the Notes section: "This program adds monitor resolutions, not scaled resolutions. Lower resolutions will be scaled up if GPU scaling is enabled, but higher resolutions won't be scaled down by the GPU. Higher resolutions will only work if the monitor can handle them." Will be interesting to see the results when I get home from work.


----------



## TheBlademaster01

Quote:


> Originally Posted by *Regent Square*
> 
> Thank you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> +rep


thanks


----------



## raghu78

Quote:


> Originally Posted by *fateswarm*
> 
> Read first word in what you quoted.


I have noted down your post in which you say Q1 2014 for 20nm. I am going to show it back to you in April 2014 to prove how wrong you were.


----------



## fateswarm

Quote:


> Originally Posted by *raghu78*
> 
> i have noted down your post in which you say q1 2014 for 20nm. i am going to show it back to you in april 2014 to prove how wrong you were.


Don't be so sure until April is well underway, since they have promised Apple capacity through March, so it's very likely to be April for everyone else. I have two extreme scenarios: as an optimist, March, maybe mid-February; as a pessimist, June.


----------



## BusterOddo

Quote:


> Originally Posted by *grunion*
> 
> Still no native support, but CRU is easy peasy to use.


Quote:


> Originally Posted by *TamaDrumz76*
> 
> I've tried using CRU to downsample... It does not work in nearly the same way as NV control panel's implementation does. I can also report that I've had 0 success trying to use it. On the other hand, on my laptop, downsampling works well and is done very easily on NV side (with the control panel, not CRU).


Quote:


> Originally Posted by *TamaDrumz76*
> 
> Exactly the problem with CRU. I want a simple and effective way of downsampling on AMD like NV has.
> 
> I couldn't effectively increase my resolution at all as it had severe adverse effects on the monitor.


Quote:


> Originally Posted by *BusterOddo*
> 
> Yes it does seem like it will be different; from the Notes section: •This program adds monitor resolutions, not scaled resolutions. Lower resolutions will be scaled up if GPU scaling is enabled, but higher resolutions won't be scaled down by the GPU. Higher resolutions will only work if the monitor can handle them. Will be interesting to see the results when I get home from work.


From reading several pages of posts on CRU, directly from ToastyX: this tool will not downsample. It's a shame that the AMD GUI downsampling tool only works with pre-13.1 drivers.


----------



## Ultracarpet

Quote:


> Originally Posted by *fateswarm*
> 
> Read first word in what you quoted.


Usually, when you are talking to someone you don't use words like "he" and "him" to address the person you are talking to.

Watch as I now construct that sentence with your English:

Usually, when he is talking to someone he doesn't use words like "he" and "him" to address the person he is talking to.

Sounds like I'm not talking to you, rather a 3rd party, right?


----------



## TheBlademaster01

Quote:


> Originally Posted by *Ultracarpet*
> 
> Usually, when you are talking to someone you don't use words like "he" and "him" to address the person you are talking to.
> 
> Watch as I now construct that sentence with your English:
> 
> Usually, when he is talking to someone he doesn't use words like "he" and "him" to address the person he is talking to.
> 
> Sounds like I'm not talking to you, rather a 3rd party, right?


Doesn't have to since he didn't quote anything/anyone. He was just stating his opinion about Raghu in this thread, towards the readers of this thread. It's not that unusual to express yourself in such a manner and does not imply "having an internal dialogue with yourself" IMO (look at the OP or any post without a quote for that matter).


----------



## CynicalUnicorn

Quote:


> Originally Posted by *HighTemplar*
> 
> Not sure why you're referring to PCIe bandwidth when discussing single GPU cards vs crossfire. In the case of the 7990 for example, the 7990 is actually MORE efficient than 7970s in CF, and performs better, which nulls your statement about there being 'more' bandwidth available. There is actually less, and FAR more latency over PCIe than a dual gpu card would have locally over the PCB. The signal has to travel a much farther distance. Not only that, but we are not limited by PCIe bandwidth at all with the current GPUs. Even PCIe 2.0 8x does not incur a significant performance penalty when running even a GTX Titan, for example.


While the performance hit is small, it's measurable: PCIe x16 is slightly better than x8. I would assume a single dual-GPU card would be a bit faster than two discrete cards, since the GPUs share the same PCB, but two 690s/7990s in two x16 slots are essentially four GPUs on x8 links, so bandwidth would take a small hit compared to four 680s/7970s each in its own x16 slot.
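The raw numbers behind that, using the usual effective per-lane rates (PCIe 2.0 ~0.5 GB/s/lane, PCIe 3.0 ~0.985 GB/s/lane after encoding overhead), so treat them as back-of-envelope:

```python
# Effective one-way bandwidth per lane, in GB/s, after encoding overhead
# (8b/10b for PCIe 2.0, 128b/130b for PCIe 3.0).
PER_LANE_GBPS = {"2.0": 0.5, "3.0": 0.985}

def pcie_bandwidth(gen, lanes):
    """Effective one-way bandwidth in GB/s for a PCIe link."""
    return PER_LANE_GBPS[gen] * lanes

print(pcie_bandwidth("2.0", 8))   # 4.0 GB/s
print(pcie_bandwidth("3.0", 16))  # 15.76 GB/s
```

So an x8 link still has several GB/s each way, which is why the hit from halving lanes is measurable but small for current GPUs.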


----------



## TamaDrumz76

Quote:


> Originally Posted by *BusterOddo*
> 
> From reading several pages of posts on the CRU, directly from ToasyX this tool will not downsample. Its a shame that the AMD GUI downsampling tool only works for pre 13.1 drivers.


Yeah, it's a shame. I really hope there ends up being a way to do it with the 290 cards. I've sent AMD tweets and Facebook messages asking them to consider making it a viable option like it is for NVIDIA.


----------



## Ultracarpet

Quote:


> Originally Posted by *TheBlademaster01*
> 
> Doesn't have to since he didn't quote anything/anyone. He was just stating his opinion about Raghu in this thread, towards the readers of this thread. It's not that unusual to express yourself in such a manner and does not imply "having an internal dialogue with yourself" IMO (look at the OP or any post without a quote for that matter).


Except that he said he was talking to raghu...


----------



## Blameless

Quote:


> Originally Posted by *specopsFI*
> 
> Since no one else said it, I will.
> 
> Power is the thing. If AMD had 30% TDP headroom over the 7970GE then it wouldn't be any problem to hit Titan performance or beat it. The reality is though that 7970GE already has the same TDP as Titan. So they would need to get all that extra hardware to run with the same TDP without sacrificing clock speed.
> 
> Some people will counter that by saying "but look at the 7790, it brought 30% more fps with just 5% more power consumption". Yes, it did. And if you've read any R7 260X reviews, you'd know what happened when that 7790 needed to be pushed for any higher clock speeds: the power consumption went up significantly. "...overall performance per watt has actually dropped a lot. The HD 7790 delivered leading efficiency, but the R7 260X is now below the average when compared to the whole market. The good HD 7790 numbers in mind, I was a bit disappointed by the R7 260X's gaming power consumption." (source). The same thing happened with 7970GE: the original 7970 was actually pretty good for performance/W, 7970GE much less so.
> 
> TDP limit is the reason why AMD has made Hawaii so massive. They couldn't go for higher clocks because GCN goes mental with TDP. They needed to _lower_ their clocks and the only way to get more performance with lower clocks is with a big chip running low voltage. The rumoured turbo BIOS thing seems like an indication that they've still had to push the clocks higher than they would have liked to. There most likely is very little extra to be had over those turbo clocks (last time they did this was 6990 which was tapped out). Another thing that might be a limiting factor is that Hawaii is extremely densely packed: the transistors/area is very high. Heat transfer gets more difficult when the heat source is more dense.
> 
> If I'm sounding too critical, then that's just the hype getting out of control. There are limiting factors for the Hawaii as well although it seems that some folks are taking any news as indication of world domination. I was sceptical that AMD would aim for Titan performance but the hardware seems to be aimed at just that. As of now, I'm thinking 290X will practically match the Titan at stock and with OC on stock BIOS since Titan is so strictly restricted, but Titan will have more headroom when those limits are removed. All in all, I'm really excited to see AMD doing a real monster chip!


A few points:

- A more mature process will allow for lower power consumption on the same parts as yields improve and average leakage goes down.

- Lower clock speeds will reduce power even if voltage remains the same.

- TDP limits can be bypassed. The limiting factor is usually the VRM, but if the card has a robust one, or you can cool it well enough, most GCN cards seem to have considerable headroom.
Quote:


> Originally Posted by *Stay Puft*
> 
> 7990 was garbage till the frame pacing driver was released. Would you buy a 7990 over a 780 because i sure wouldnt


Now that frame pacing works for the majority of situations I would personally need two GPUs for, I certainly wouldn't automatically choose a 780 over a 7990.
Quote:


> Originally Posted by *grunion*
> 
> Coil whine now?
> 
> I have 14 AMD cards and not a one coil whines.
> I have 7 NV cards and 3 of those whine, my 780 is the worst, it also makes my PSU run hotter.


Pretty much every GPU I've ever had, NVIDIA, AMD, or otherwise, that could pull more than 50w has had coil whine under the right load.
Quote:


> Originally Posted by *szeged*
> 
> because any serious bencher doesnt put the 7990 into the same category as 780/titans etc. Single pcb yeah, dual gpu etc etc .


"Serious benchers" are a tiny subset of a tiny subset of the enthusiast market. Most enthusiasts are buying high-end GPU setups to play games on them, and only care about benchmarks insofar as they reflect gameplay.

The fact of the matter is that, where frame pacing is working, you normally get a gaming experience from a 7990 that's similar to or better than what you'd get from a Titan.

Now, AMD's frame pacing still isn't mature, and that is certainly a downside, but it's often not as much a downside as the Titan costing ~400 dollars more.


----------



## nvidiaftw12

Quote:


> Originally Posted by *Blameless*
> 
> Pretty much every GPU I've ever had, NVIDIA, AMD, or otherwise, that could pull more than 50w has had coil whine under the right load.


Yes, but in my case, "at the right load" is a lot different than "at idle".


----------



## Devildog83

Quote:


> Originally Posted by *nvidiaftw12*
> 
> Yes, but in my case, at the right load is a lot different that at idle.


You are correct: if you have constant coil whine and it's loud enough to annoy you, it's an issue. But I would still take a Devil 13 in a New York minute.







A lot of folks would not but I would.


----------



## nvidiaftw12

I'd rather have 2 of the new MSI Gaming 280Xs.


----------



## Devildog83

Quote:


> Originally Posted by *nvidiaftw12*
> 
> I'd rather have 2 of the new MSI Gaming 280Xs.


That would be cool too.


----------



## Regent Square

Quote:


> Originally Posted by *Devildog83*
> 
> That would be cool too.


----------



## Devildog83

Quote:


> Originally Posted by *Regent Square*


That little guy never gets tired of laughing.


----------



## Devildog83

Question: it seems the 280 and 290 cards are designed specifically for higher resolutions as opposed to massive frame rates (not that they would not get high frame rates too). Is that true? Just trying to learn here.


----------



## Forceman

Quote:


> Originally Posted by *Devildog83*
> 
> Question: it seems the 280 and 290 cards are designed specifically for higher resolutions as opposed to massive frame rates (not that they would not get high frame rates too). Is that true? Just trying to learn here.


Existing cards are already good enough to run almost any game at 1080p at near max settings, so yeah, the top-end cards are really geared towards 1440p and higher gaming. Doesn't stop people from buying them for 1080p of course, especially 120Hz panel people, but the target audience would be higher resolution or multi-screen.


----------



## Blameless

Quote:


> Originally Posted by *Devildog83*
> 
> Question: it seems the 280 and 290 cards are designed specifically for higher resolutions as opposed to massive frame rates (not that they would not get high frame rates too). Is that true? Just trying to learn here.


The 280 is a rebranded 7970.

You can't really design for resolution but not frame rate, or vice versa. Fill rate can reveal the maximum potential frame rate at a given resolution, or resolution and a target frame rate can hint at the minimum fill rate you'll need, but that's about it. The point is that they are too intertwined to be separated.
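As a back-of-the-envelope illustration of that fill-rate/frame-rate relationship, here is a minimal sketch. It assumes, unrealistically, exactly one ROP write per screen pixel per frame; real games have overdraw, blending, and shader limits, which is why fill rate is only a loose ceiling and rarely the actual bottleneck.

```python
# Theoretical frame-rate ceiling from pixel fill rate alone.
# Assumes one ROP write per screen pixel per frame (no overdraw/blending),
# so real-world frame rates sit far below this bound.

def fill_rate_fps_ceiling(rops, clock_ghz, width, height):
    pixels_per_second = rops * clock_ghz * 1e9  # Gpixels/s -> pixels/s
    return pixels_per_second / (width * height)

# 64 ROPs at 1 GHz (the rumoured Hawaii figures), at 4K:
print(round(fill_rate_fps_ceiling(64, 1.0, 3840, 2160)))  # 7716
```

The huge number shows why the two can't be separated in practice: fill rate bounds frame rate at a resolution, but the binding limits are elsewhere.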


----------



## GraveDigger7878

Not impressed...


----------



## Devildog83

Quote:


> Originally Posted by *Blameless*
> 
> The 280 is a rebranded 7970.
> 
> You can't really design for resolution but not frame rate, or vice versa. Fill rate can reveal the maximum potential frame rate at a given resolution, or resolution and target frame rate can hint at the minimum fill rate you'll need, but thats about it. Point is that they are too intertwined to be separate.


I am, thanks for the info.


----------



## Devildog83

Quote:


> Originally Posted by *Forceman*
> 
> Existing cards are already good enough to run almost any game at 1080p at near max settings, so yeah, the top-end cards are really geared towards 1440p and higher gaming. Doesn't stop people from buying them for 1080p of course, especially 120Hz panel people, but the target audience would be higher resolution or multi-screen.


I use a 240hz 47in LED TV for a monitor.


----------



## Blameless

Quote:


> Originally Posted by *Devildog83*
> 
> I use a 240hz 47in LED TV for a monitor.


Your TV is almost certainly limited to 60Hz input.


----------



## zealord

Why does the 280X non-reference run so hot? There are a few reviews where it hits 87°C (94°C with OC) in games, and that doesn't make any sense since it's just a 7970, or am I missing something?

http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/30.html


----------



## Devildog83

Quote:


> Originally Posted by *Blameless*
> 
> Your TV is almost certainly limited to 60Hz input.


You could be right, all I know is that's what the specs say. I love gaming with it though, since I don't see too well anymore. I am one of those old guys.


----------



## Stay Puft

Quote:


> Originally Posted by *zealord*
> 
> Why does the 280X non-reference run so hot? There are a few reviews where it hits 87°C (94°C with OC) in games, and that doesn't make any sense since it's just a 7970, or am I missing something?
> 
> http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/30.html


Fan profile is probably low to eliminate noise. A custom fan profile would eliminate this issue


----------



## Forceman

Quote:


> Originally Posted by *zealord*
> 
> Why does the 280X non-reference run so hot? There are a few reviews where it hits 87°C (94°C with OC) in games, and that doesn't make any sense since it's just a 7970, or am I missing something?
> 
> http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/30.html


There was speculation that they got a bad sample or something, because some other reviews didn't get such high temps. There was a discussion about it in the reviews thread, I think.


----------



## zealord

Oh alright, thanks for clearing that up. Let's hope the 290X is a bit cooler.


----------



## Regent Square

Quote:


> Originally Posted by *Forceman*
> 
> There was speculation that they got a bad sample or something, because some other reviews didn't get such high temps. There was a discussion about it in the reviews thread, I think.


The 780 has 3GB of VRAM, but it is not aimed at high-res displays; rather, it gives you memory headroom. Lots of games use 2+ GB at 1080p.

The R9 290 series has 4GB of VRAM to give you potential "future proofing" for higher resolutions plus demanding games. Those cards are not aimed only at higher res; rather, they give you a backup if you decide to upgrade.

The 290 series will compete with the 780/Titan from the green camp. None of 'em offers a big chunk of FPS compared to the 600 series.


----------



## raghu78

Quote:


> Originally Posted by *Forceman*
> 
> There was speculation that they got a bad sample or something, because some other reviews didn't get such high temps. There was a discussion about it in the reviews thread, I think.


Yeah, MSI sent a BIOS which fixed the problem, but TPU could not get the new BIOS installed as the review had gone live by that time.

http://www.techpowerup.com/forums/showpost.php?p=2993244&postcount=44

http://www.techpowerup.com/forums/showthread.php?t=191904&page=4


----------



## Slaughterem

Maybe someone could help with some advice on a system I would like to build next month. I was looking at the QNIX 1440p monitors and would like to purchase 3 to make a landscape Eyefinity setup.
My first question is: do I have to purchase one with a DisplayPort, or can all be DVI? What graphics cards would I need to run these if I overclock them to 120Hz? I would think that I would need at least 2.


----------



## criminal

Quote:


> Originally Posted by *specopsFI*
> 
> Slightly OT, but could someone tell me what's the current status on downsampling with Radeons? I've become addicted to it, such an easy way (straight from Nvidia Control Panel) to get universal AA that looks amazing and is surprisingly lightweight.
> 
> Personally, this really is the biggest thing keeping me from going from my 780 to 290(X). Hawaii seems like a really good high-res card but since I don't think I'm going to get a native 4K monitor any time soon, downsampling would be the only way to get good use out of all that back-end!


Thanks for bringing up downsampling. I had heard the term used before, but never really tried using it myself. Just got through trying it out in Borderlands 2. Very awesome!

+rep


----------



## GraveDigger7878

Is there a downsampling how-to that someone could be kind enough to direct me to, please? I have tried but failed at it a few times!


----------



## grunion

Quote:


> Originally Posted by *specopsFI*
> 
> Slightly OT, but could someone tell me what's the current status on downsampling with Radeons? I've become addicted to it, such an easy way (straight from Nvidia Control Panel) to get universal AA that looks amazing and is surprisingly lightweight.
> 
> Personally, this really is the biggest thing keeping me from going from my 780 to 290(X). Hawaii seems like a really good high-res card but since I don't think I'm going to get a native 4K monitor any time soon, downsampling would be the only way to get good use out of all that back-end!


What is your native res, what is your target downsample?


----------



## Majin SSJ Eric

I've been somewhat interested in downsampling as well but never have really given it a go. I generally just use whatever max AA is available in the particular game I'm playing. Isn't that good enough?


----------



## zealord

Found this at the local shop where I always buy stuff. Don't know if the price is accurate, but normally they have middling prices among shops in Germany.

http://bora-computer.de/detail/index/sArticle/23235

699€ would probably mean $699, but IMHO that price seems a bit high. Maybe it's a placeholder until they get final confirmation of the real price.

It's the OC version though. Maybe the normal version is a bit cheaper.


----------



## istudy92

On October 15th we will know the specs, prices, and bundles of the new cards 100%!!
That's when the NDA is lifted, so stay tuned!


----------



## specopsFI

Quote:


> Originally Posted by *GraveDigger7878*
> 
> Is there a downsampling how to that someone could be kind enough to direct me to please! I have tried but failed at it a few times!


This works like a charm for me:

http://screenarchery.wikia.com/wiki/Downsampling_%E2%80%93_A_full_guide_to_achieve_3840x2160_resolution_%E2%80%93_NVIDIA_only

The settings in example 2 are the ones I use to get 3840x2160 -> 1920x1080, which in theory is 4x SSAA. The interesting part is that it looks a bit better than any in-game SSAA I've tried and usually runs a tiny bit better, too. Combined with some slight post-processing AA, it really makes 1080p gaming look amazing.
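For anyone wondering where the "4xSSAA" figure comes from: the effective supersampling factor of downsampling is simply the pixel-count ratio between the internal render resolution and the display resolution. A minimal sketch:

```python
# Downsampling renders at a higher internal resolution, then scales down to
# the display. Effective SSAA factor = render pixels / display pixels.

def ssaa_factor(render_w, render_h, display_w, display_h):
    return (render_w * render_h) / (display_w * display_h)

print(ssaa_factor(3840, 2160, 1920, 1080))  # 4.0  (2x per axis = 4x SSAA)
print(ssaa_factor(2880, 1620, 1920, 1080))  # 2.25 (a lighter middle ground)
```

The 2880x1620 line is just an illustrative intermediate resolution, handy when full 4x is too heavy for a given game.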


----------



## TheBlademaster01

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I've been somewhat interested in downsampling as well but never have really given it a go. I generally just use whatever max AA is available in the particular game I'm playing. Isn't that good enough?


Those mostly don't offer true supersampling. Supersampling is often the only method that really blurs out Moiré patterns and other annoying artifacts/flicker while moving. That comes at the cost of quite some performance, though.


----------



## specopsFI

Quote:


> Originally Posted by *Blameless*
> 
> A few points:
> 
> - A more mature process will allow for lower power consumption on the same parts as yields improve and average leakage goes down.
> 
> - Lower clock speeds will reduce power even if voltage remains the same.
> 
> - TDP limits can be bypassed. The limiting factor is usually the VRM, but if the card has a robust one, or you can cool it well enough, most GCN cards seem to have considerable headroom.


All true, but...

- I'm not sure how much the process has matured since the 7970GE was introduced. It seems to me that neither AMD nor Nvidia has been able to get much more out of the process, although both have reintroduced 28nm chips.

- Clocks reduce (or increase) power consumption linearly, whereas voltage reduces (or increases) it roughly quadratically (dynamic power scales with V²).

- TDP limits can be bypassed, but that's not something either chip maker really wants. They aim for a certain TDP (usually 250W for single-chip high-end) and build the reference PCB and cooler accordingly. If the 290X is to be all reference cards, then I wouldn't expect miracles for OC headroom. True, Tahiti reference cards are amazing overclockers, but that is a much smaller die, and from what we've seen, the Hawaii PCB and cooler don't seem that much beefed up compared to the Tahiti reference design. The chip design is totally different though, so we'll have to wait and see.
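The clocks-vs-voltage distinction can be sketched with the usual first-order dynamic-power approximation, P ∝ C·V²·f. The numbers below are purely illustrative, not actual Tahiti or Hawaii figures:

```python
# First-order dynamic power model: P scales linearly with clock (f) and
# quadratically with voltage (V). Baseline wattage is a made-up example.

def dynamic_power(base_power_w, f_ratio, v_ratio):
    """Scale a baseline power figure by clock ratio and voltage ratio."""
    return base_power_w * f_ratio * (v_ratio ** 2)

base = 200.0  # hypothetical baseline board power in watts

# +10% clock at unchanged voltage: ~10% more power (linear in f)
print(round(dynamic_power(base, 1.10, 1.00), 1))  # 220.0

# +10% clock that also needs +10% voltage: ~33% more power (f * V^2)
print(round(dynamic_power(base, 1.10, 1.10), 1))  # 266.2
```

This is why a big chip at low clocks and low voltage can deliver more performance per watt than a smaller chip pushed hard.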


----------



## SpacemanSpliff

Quote:


> Originally Posted by *wstanci3*
> 
> 1) Dat power draw
> 2) Dat heat
> 3) 4-way doesn't scale too well
> 
> Though, it would be sick if both were in a watercooled setup.


Why bother with crossfiring them? For a while I was considering dual 7990s for my upcoming build... put them both under water, one for all my normal use and one as a dedicated 24/7 folder... think of the PPD, ooooo. Having each one dedicated to a specific set of tasks is the better way to use two of them in the same tower... the only catch is it would most likely require an XL/E-ATX board, since you would need to have 4 PCIe slots available. That would honestly help with the cooling potential if one wanted to use a high-airflow case and aftermarket coolers like an Accelero Hybrid or Xtreme.


----------



## theilya

Does anyone know if a 750W PSU can handle 2x 280X?


----------



## specopsFI

Quote:


> Originally Posted by *criminal*
> 
> Thanks for bringing up downsampling. I had heard the term used before, but never really tried using it myself. Just got through trying it out in Borderlands 2. Very awesome!
> 
> +rep


My pleasure. It really is some nice stuff. As it so happens, I'm playing BL2 myself at the moment. 780 downsampling from 4K and my 570 as a dedicated PhysX card, if there ever was a real TWIMTBP title then this is it. And even then, I'm so intrigued by Hawaii... It's been friggin six months since I've bought something for my main rig! There's only so much playing games can do, I need new hardware! For crying out loud, I'm trying to get my fix by buying a 7790 but even that is painful since it's screaming for a Kaveri to go with and it's nowhere to be found. I WANT MY HARDWARE!!!11!


----------



## maarten12100

Quote:


> Originally Posted by *theilya*
> 
> does anyone know if 750w PSU can handle 2x 280x?


Easily, as long as it isn't a gray-market no-name PSU.


----------



## Tkconserve

I think Newegg lists CrossFire as needing 1000 watts for the R9 280X.


----------



## fleetfeather

I'd grab an 850W just to be safe. No clue how hard we'll be able to push voltages on these puppies.
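A rough power-budget sketch shows why 750W is cutting it close. All figures are assumptions: ~250W is the commonly quoted typical board power for a 280X, and the CPU and rest-of-system numbers are guesses; overclocking and voltage bumps push everything higher.

```python
# Worst-case DC load estimate for a 2x 280X build. Component wattages are
# assumed/typical figures, not measurements.

def system_draw(gpu_w, gpu_count, cpu_w, rest_w):
    return gpu_w * gpu_count + cpu_w + rest_w

load = system_draw(gpu_w=250, gpu_count=2, cpu_w=130, rest_w=75)
print(load)        # 705 W estimated peak draw
print(load / 750)  # ~0.94 -> a quality 750 W unit runs near its limit
print(load / 850)  # ~0.83 -> 850 W leaves headroom for overclocking
```

So a good 750W unit can technically carry the load at stock, but an 850W unit gives margin for voltage tweaking, which matches the advice in the thread.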


----------



## theilya

I'd be upgrading from my 660 Ti SLI setup, and I currently own a Corsair 750W.

I don't want to upgrade my PSU yet.


----------



## Pheonix777z




----------



## fleetfeather

Ugh, I don't even pay attention to claims made by AMD marketing anymore. I would put 0 grains of salt into that tweet.


----------



## TamaDrumz76

Quote:


> Originally Posted by *Pheonix777z*


Looks like a friendly jab more than anything else.


----------



## TheBlademaster01

Roy is another AMD trolling prototype, not much harm in this trolling attempt though.


----------



## raghu78

Quote:


> Originally Posted by *TheBlademaster01*
> 
> Roy is another AMD trolling prototype, not much harm in this trolling attempt though.


All PR people are supposed to do a bit of trolling, especially against the competition. It's part of their job description.


----------



## TheBlademaster01

Not talking about this attempt.


----------



## mcg75

AMD could probably get back 10% market share in mere months if EVGA was one of their partners.


----------



## TheBlademaster01

Wouldn't be that good for EVGA I imagine.


----------



## maarten12100

Quote:


> Originally Posted by *Pheonix777z*


Well, a single card is exaggerated even with those abundant ROPs and that memory bandwidth, but if we factor in scaling and running 4K with massive amounts of AA, I would say 4 Titans will equal 3 or 2 Hawaii cards.


----------



## FearzUSA

http://gyazo.com/2328b7d451391da646ee7910376d90a1

http://gyazo.com/c9fe3fd055a4ead619fe83fecad6d8ff

http://gyazo.com/63028d779dd16fb79453a2b542b5d64f

$750, maybe?


----------



## raghu78

Quote:


> Originally Posted by *maarten12100*
> 
> Well a single is exagerated even with those abundant ROPs and memory bandwidth but if we factor in scaling and running 4K with massive amounts of AA I would say 4 Titan's will be 3 or 2 Hawaii cards


The R9 290X, even in DX11.1, looks to be the top card for BF4, and the gap is only going to increase at 4K since AMD has 64 ROPs and a 512-bit memory bus to drive it. But once you bring in Mantle, it could literally be a single R9 290X competing with a Titan SLI (DX11.1). The AMD BF4 Eyefinity demo at GPU14 was running on a single R9 290X powering 3x 1080p displays.

https://twitter.com/AMDRadeon/status/383322739286609920
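For reference, the rumoured 512-bit bus at 5.00GHz effective works out as below (a quick sketch; the 7970 GHz Edition figure is included for comparison, and the ">300GB" line from the leaked slide matches the 320 GB/s result):

```python
# GDDR5 bandwidth = (bus width in bytes) * (effective data rate per pin).

def gddr5_bandwidth_gbs(bus_width_bits, effective_gbps):
    return (bus_width_bits / 8) * effective_gbps

print(gddr5_bandwidth_gbs(512, 5.0))  # 320.0 GB/s (rumoured Hawaii specs)
print(gddr5_bandwidth_gbs(384, 6.0))  # 288.0 GB/s (7970 GHz Edition)
```

So the wider bus buys roughly 11% more bandwidth than Tahiti despite the lower effective memory clock.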


----------



## NABBO

BF4 in 4K at how many frames? 60, 50, 40??
And what anti-aliasing? Post-AA or MSAA?


----------



## szeged

3 more days, pepper your angus.


----------



## Death Saved

Anyone think that AMD will delay the R9 290 cards till after Nvidia's announcement on the 16th/17th?


----------



## maarten12100

Quote:


> Originally Posted by *raghu78*
> 
> R9 290X even in DX11.1 looks to be the top card for BF4 and the gap is only going to increase at 4k as AMD has 64 ROPs and 512 bit mem bus to drive 4K. But once you bring in Mantle it could literally be a single R9 290X competing with a Titan SLI (DX11.1). The AMD BF4 Eyefinity demo at GPU 14 was running on a single R9 290X and powering 3 x 1080p displays.
> 
> https://twitter.com/AMDRadeon/status/383322739286609920



Oh mother of god, the real 2900X.


----------



## SMK

Wouldn't mind ponying up the dough if one 290X was faster than two 7950s in CF. But with this release, I haven't really seen much from the other cards (or Nvidia) that would make me want to upgrade...


----------



## Sheyster

Quote:


> Originally Posted by *SMK*
> 
> wouldnt mind ponying up the dough if one 290X was faster than two 7950s in CF. But with this release, I havent really seen much from the other cards (or Nvidia) that would make me want to upgrade...


For me it's gonna be all about the BF4 performance. Looking forward to seeing how well Mantle does (or does not) do...


----------



## Regent Square

The butcher is ready for business from October 15 (has to be clarified).


----------



## fateswarm

It may yet.


----------



## fleetfeather

Quote:


> Originally Posted by *fateswarm*
> 
> It may yet.


ease up turbo, you're stepping close to contradiction territory with that statement
Quote:


> Originally Posted by *fateswarm*
> 
> Hi, I'd like to place a bet for $599 and the performance of a 780 please. If a system is available, $550-650 with the performance of +/-20% 780


----------



## mcg75

Quote:


> Originally Posted by *Durquavian*
> 
> Sorry but sounds the same as what he interpreted. You basically said AMD would gain 10% with and gain none without. Sounds very close to the same.


Wrong and wrong and wrong again.

I said AMD might get back 10% market share in mere months if EVGA was one of their partners. I didn't say a word about what they would do without EVGA. Both of you invented that for your own purposes.

But since you opened your mouth, tell us when has either company made a 10% swing in market share in a few months? That's right, never.

It would, at a minimum, take acquiring a large AIB partner such as EVGA to do such a thing.

Common sense.


----------



## Regent Square

Quote:


> Originally Posted by *Slaughterem*
> 
> The smart thing would have been for him not to make speculative hard claims in the first place.


Confirmative.


----------



## Slaughterem

Hey PureBlackFire, maybe you could advise me on a system I would like to put together. If I were to have 3 QNIX monitors at 1440p, overclocked to 120Hz, would I be able to play most games at max settings with 2 of these cards, going by the speculation about the R9 290X's performance?


----------



## NABBO

The 290X = equal to, or perhaps 5% less than, 2x 7870 CrossFire.
The 290, 10 or 15% less.


----------



## PureBlackFire

Quote:


> Originally Posted by *Slaughterem*
> 
> Hey pure black fire maybe you could advise me on a system I would like to put together. If i was to have 3 Qnix monitors at 1440P and OC to 120 Hz with the speculation of the R9 290X cards performance would I be able to play most games at max settings with 2 of these?


the cards should be able to handle that. whether or not you can get all three monitors to overclock to 120hz I can't say. from what I've seen it's not 100%.
Quote:


> Originally Posted by *Regent Square*
> 
> got put bottled up fast, huh


yeah I had enough playing devil's advocate for one post.
Quote:


> Originally Posted by *NABBO*
> 
> The 290X = equal to, or perhaps 5% less than, 2x 7870 CrossFire.
> The 290, 10 or 15% less.


if their scaling has improved it will be faster than cf 7870. at higher resolutions it will be faster anyway.


----------



## NABBO

Quote:


> Originally Posted by *Regent Square*
> 
> No worries, Titan will lose its crown very soon.


A comparison (a review at random): CrossFire 7870 vs GTX Titan

http://www.legionhardware.com/articles_pages/gigabyte_geforce_gtx_titan,7.html


----------



## CoolRonZ

Scaling is bad on AMD GPUs????? That's news to me.... I think it has a lot to do with your MB/CPU too..... with my HD 7970s in, say, Metro: LL and Crysis 3/Eyefinity, they scale really, really well, and everything else doesn't need to... and although it may not be perfect, the last few beta drivers definitely make things a lot smoother...







Can't wait till they completely fix the CF/4K drivers.


----------



## fateswarm

Quote:


> Originally Posted by *NABBO*
> 
> a comparison (review at random) crossfire 7870 vs GTX Titan
> 
> http://www.legionhardware.com/articles_pages/gigabyte_geforce_gtx_titan,7.html


Did you make the face of your avatar when you thought that comparison is of any interest?


----------



## raghu78

Quote:


> Originally Posted by *Regent Square*
> 
> No worries, Titan will lose its crown very soon.


Yeah, by now it's pretty clear that the AMD R9 290X will beat Titan (stock) and match it on a clock-for-clock basis. Given the massive resources the R9 290X has, 2x the performance of an HD 7870 is the norm even if AMD only maintained the efficiency of GCN 1.0.

http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/26.html

HD 7770 - 36
HD 7870 - 68 (scaling of roughly 1.9x)
R9 280X - 97 (newer drivers put it faster than the HD 7970 GHz with older drivers)
R9 290 (1 GHz) - 129.2 (similar 1.9x scaling as HD 7870 from HD 7770)
R9 290 (947 MHz) - 123 (around 4-5% less perf)
R9 290X (1 GHz) - 140 (8% faster perf is quite possible when you have such a well-balanced chip)

The HD 7870 perf numbers are with older drivers, so applying the 3% gain seen going from the HD 7970 GHz (older drivers) to the MSI R9 280X (newer drivers) puts the HD 7870 at 71. A doubling for the 2.2x increase in SP count puts the R9 290X at 140, so 12-13% faster than Titan would not surprise me. That would put Titan and the R9 290X at around the same perf on a clock-for-clock basis.
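The projection above can be sketched as a quick calculation. All inputs are the thread's own assumed/leaked numbers, and the 0.91 efficiency factor is my own assumption, chosen to mirror the post's "doubling for a 2.2x SP increase" logic:

```python
# Back-of-the-envelope performance projection from shader counts.
# Inputs are the thread's assumed figures, not measured results.

def project(base_score, sp_ratio, scaling_efficiency):
    """Scale a baseline relative-performance score by shader-count ratio,
    discounted by how imperfectly performance scales with shaders."""
    return base_score * sp_ratio * scaling_efficiency

hd7870_adjusted = 68 * 1.03  # ~70, the driver-adjusted HD 7870 baseline
# 2816 SPs (Hawaii) vs 1280 SPs (Pitcairn) = 2.2x; ~0.91 assumed efficiency
r9_290x = project(hd7870_adjusted, 2816 / 1280, 0.91)
print(round(r9_290x))  # 140, in line with the post's estimate
```

Change the efficiency factor and the projection moves accordingly, which is exactly why these pre-launch estimates have such a wide spread.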


----------



## NABBO

If the 290X performs that way, it would be + or - equal in performance with Titan in DX9/10/11.
It will also depend on which games are tested in reviews.


----------



## Regent Square

Quote:


> Originally Posted by *raghu78*
> 
> yeah by now its pretty clear that AMD R9 290X will beat Titan (stock) and match it on a clock for clock basis. given the massive resources R9 290X has a 2x perf of HD 7870 is the norm even if AMD maintained the same efficiency of GCN 1.0
> 
> http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/26.html
> 
> HD 7770 - 36
> HD 7870 - 68 (scaling of roughly 1.9x)
> R9 280X - 97 (newer drivers put it faster than HD 7970 Ghz with older drivers)
> R9 290 (1 Ghz) - 129.2 ( similar 1.9x scaling as HD 7870 from HD 7770)
> R9 290( 947 mhz) - 123 (around 4 - 5% less perf)
> R9 290X(1 Ghz) - 140 (8% faster perf quite possible when you have such a well balanced chip)
> 
> the HD 7870 perf numbers are with older drivers. so a 3 % gain as seen from HD 7970 ghz with older drivers compared to MSI R9 280X with newer drivers puts the HD 7870 as 71. a doubling for 2.2x increase in sp count puts R9 290X at 140. so 12 - 13% faster than Titan would not surprise me. that would put Titan and R9 290X at around same perf on a clock for clock basis.


Not considering future driver improvements that will come along.


----------



## Slaughterem

Quote:


> Originally Posted by *Regent Square*
> 
> Exactly. Half of this forum's members were trash talking AMD's new GPUs and now they are quiet like rats, anxiously waiting for proper benches. In case they were right, be ready for 'em to stick their tails up and pronounce loudly: "Ha ha, AMD fail....."
> *
> If they appear to be wrong, most of 'em will make up with the AMD dudes and put on smiley faces..* aka, lots of people saying $700 for the 290X are now gone due to being misinformed.


The one thing that I realized about this forum is that you cannot change people's allegiance to their product choice. I don't have a problem with that, as long as they are not trolling and making comments as if they are the all-knowing god about someone else's allegiance. I personally buy products that give me the best price/performance.

The release of the rebranded R9 280X cards at their price point and performance is an exceptional bargain for the consumer. After the 15th we will have reviews, and no matter what the results are, there will be people who bash AMD. Guaranteed there will be comments about drivers, about the price of the new cards and the flame war over who first moved the price point up to whatever amount, about the CrossFire drivers that are still to come, and anything else they can think of, such as power usage and heat output.

We should all hope that the R9 290, since its specs are twice those of a 7870, will give the same results as CrossFired 7870s, which makes it competitive with Titan, and that the R9 290X with the additional shaders provides even more, at a price point less than a 780. This will lower prices on the Nvidia cards, and if that is where your allegiance lies, then I am happy for you.


----------



## NABBO

Its technical features are comparable to those cards, just as Titan is comparable to GTX 660 Ti SLI (and in fact it performs + or - at those levels).


----------



## wstanci3

Quote:


> Originally Posted by *fateswarm*
> 
> Did you make the face of your avatar when you thought that comparison is of any interest?


This should be your signature








Fateswarm:
"Making friends at every turn"


----------



## fateswarm

Those benchmarks are comedic. They have an excuse of comparison but on the big picture it's clear what they are. Propaganda attempts to ridicule the Titan to children "See? It's Worse than a 7950((((((((((((((((((((crossfire))))))))))))))))))))".


----------



## Regent Square

Quote:


> Originally Posted by *NABBO*
> 
> the technical features which is comparable to those cards.
> as titan is comparable to a SLI GTX 660 Ti (and in fact performs + or - at those levels)


I won't even say anything to that.....


----------



## fleetfeather

Quote:


> Originally Posted by *fateswarm*
> 
> Pitty. And I liked your avatar. Apparently you fail to grasp the concept of basic probability, it might be wise to talk about what you think it's most probable, not about certainties and impossibilities, having an agnostic perspective even if there are dominating probabilities.


I understand probability fine. My post was in reference to your previously exhibited hard-line stance against the 290x before more was known about it. The above comment was one of many from you suggesting the 290x was DOA (read: suggesting, not stating). I'm fairly confident you'll ask me to provide proof of these past comments, but really you make a lot of posts on a daily basis, so I'd rather not trawl through your extensive post history on the subject.


----------



## Falknir

Quote:


> Originally Posted by *PureBlackFire*
> 
> he's probably doing the smart thing and waiting on actual results before making hard claims. there is enough speculation going on in this and the 7 other threads about these cards. or maybe he has a life outside of these topic.


I would avoid these threads too. Many people are getting overly defensive and hostile over speculation and supposed leaks. When everyone is using pseudo-ammunition, civil and reasonable discussion tends to get tossed out the window.

I am hoping AMD has a really good card to compete with the GTX 780 and GTX TITAN soon, would like to see some alternatives and competition in the marketplace.


----------



## Slaughterem

Quote:


> Originally Posted by *Falknir*
> 
> I would avoid these threads to. Many people are getting overly defensive and hostile over speculation and supposed leaks. When everyone is using pseudo ammunition, civil and reasonable discussion tends to get tossed out the window.
> 
> I am hoping AMD has a really good card to compete with the GTX 780 and GTX TITAN soon, would like to see some alternatives and competition in the marketplace.


I agree competition is what we need, but even that subject becomes hostile, about who released what card and when. Things like "sure, they beat X now, but look how long it took them."
I heard a rumor that Alatar, Fateswarm, and Stay Puft are now the owners of Origin PC, so much for competition.







( This is not true JK)


----------



## Usario

Quote:


> Originally Posted by *fateswarm*
> 
> Those benchmarks are comedic. They have an excuse of comparison but on the big picture it's clear what they are. Propaganda attempts to ridicule the Titan to children "See? It's Worse than a 7950((((((((((((((((((((crossfire))))))))))))))))))))".


Meh... when you consider that you can get four 7950s and go to a fancy restaurant for the price of a Titan...


----------



## PureBlackFire

Quote:


> Originally Posted by *szeged*
> 
> im gonna sell one of my titans and buy 1000 cheeseburgers from mcdonalds and feed all the hobos in town.


that's an admirable thing to do. you should make a more personal gesture though and make them all sandwiches yourself.


----------



## mcg75

Quote:


> Originally Posted by *Usario*
> 
> Meh... when you consider that you can get four 7950s and go to a fancy restaurant for the price of a Titan...


99% of people would have to buy a new motherboard, processor, and PSU in order to run 4-way 7950s.

So that $1k quickly becomes $2k, which makes it a pointless comparison.

If you had said you could buy a pair of 290s for the price of a Titan, then it would be very valid.
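As a quick sanity check on the arithmetic above (a sketch only; the prices are the thread's round hypothetical numbers, not retailer quotes):

```python
# Back-of-envelope cost comparison from the post above.
# All prices are hypothetical round numbers from the thread, not retailer quotes.
titan_price = 1000            # one GTX Titan
quad_7950 = 4 * 250           # four 7950s at roughly $250 each
platform_upgrade = 1000       # new motherboard + CPU + PSU needed for 4-way

print(quad_7950 == titan_price)       # the "same money as one Titan" comparison
print(quad_7950 + platform_upgrade)   # the total once the platform is included
```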


----------



## szeged

If EVGA sold AMD cards as well, I'd never buy from a different company again.


----------



## Regent Square

Quote:


> Originally Posted by *szeged*
> 
> If EVGA sold AMD cards as well, I'd never buy from a different company again.


Are you gonna buy the Asus 290X?


----------



## szeged

Quote:


> Originally Posted by *Regent Square*
> 
> Are you gonna buy the Asus 290X?


If they make a Matrix 290X early on, I'll grab that. If it takes them a while to announce one, I'll go with HIS again, as I always will for AMD cards from now on, lol.


----------



## mcg75

Quote:


> Originally Posted by *Durquavian*
> 
> What I based my thoughts on. See the last part.


Dear lord.

I guessed, hypothetically, that EVGA might give them 10% more market share.

*That 10% is applied on top of whatever AMD's market share really is. This is the point you are not getting: I did not say AMD could not gain 10% market share without EVGA. I was saying they could gain 10% more, on top of whatever they have, with EVGA.*


----------



## Regent Square

Quote:


> Originally Posted by *szeged*
> 
> If they make a Matrix 290X early on, I'll grab that. If it takes them a while to announce one, I'll go with HIS again, as I always will for AMD cards from now on, lol.


So who is better to buy from: Sapphire, HIS, or Club 3D, which recently changed sides?

They are not too far apart in number of cards sold.


----------



## szeged

Quote:


> Originally Posted by *Regent Square*
> 
> So who is better to buy from: Sapphire, HIS, or Club 3D, which recently changed sides?
> 
> They are not too far apart in number of cards sold.


My favorite of those three is definitely HIS; I've had nothing but good luck with their cards. My best-overclocking 7970s are from HIS, and the IceQ X² coolers they use are pretty top-notch for those who air-cool.


----------



## Regent Square

Quote:


> Originally Posted by *szeged*
> 
> My favorite of those three is definitely HIS; I've had nothing but good luck with their cards. My best-overclocking 7970s are from HIS, and the IceQ X² coolers they use are pretty top-notch for those who air-cool.


Thanks. I still haven't decided if I'll buy from AMD at all. The 500 series looks good.


----------



## TheBlademaster01

HIS was really great in the IceQ4 and IceQ5 times; not sure about now. Club3D always had AMD cards but was budget/value as far as I can remember. They might change their image now, though.

Sapphire is probably the best out of those.


----------



## fleetfeather

Quote:


> Originally Posted by *szeged*
> 
> My favorite of those three is definitely HIS; I've had nothing but good luck with their cards. My best-overclocking 7970s are from HIS, and the IceQ X² coolers they use are pretty top-notch for those who air-cool.


I can't get past the IceQ X² cooler design, hey... it may cool well, but the bowed edges just rustle my jimmies.

The 280 Toxic, on the other hand: it's like the DCUII and the Lightning had a baby.


----------



## Regent Square

Quote:


> Originally Posted by *TheBlademaster01*
> 
> HIS was really great in the IceQ4 and IceQ5 times; not sure about now. Club3D always had AMD cards but was budget/value as far as I can remember. They might change their image now, though.
> 
> Sapphire is probably the best out of those.


Thanks


----------



## mboner1

Quote:


> Originally Posted by *szeged*
> 
> My favorite of those three is definitely HIS; I've had nothing but good luck with their cards. My best-overclocking 7970s are from HIS, and the IceQ X² coolers they use are pretty top-notch for those who air-cool.


Yep, I got a really good 7970 from HIS as well. I was skeptical going in, but it overclocks easily to the max in Afterburner. I got an MSI 7970 for CrossFire, expecting the MSI to be the better card, and it could barely OC at all. I will never buy MSI again and won't hesitate to grab HIS if it's available. The fan is super loud on the HIS, though.


----------



## szeged

Quote:


> Originally Posted by *mboner1*
> 
> Yep, I got a really good 7970 from HIS as well. I was skeptical going in, but it overclocks easily to the max in Afterburner. I got an MSI 7970 for CrossFire, expecting the MSI to be the better card, and it could barely OC at all. I will never buy MSI again and won't hesitate to grab HIS if it's available. The fan is super loud on the HIS, though.


I've had nothing but bad experiences with MSI; they're done for me. I'm only grabbing HIS cards when it comes to AMD from now on, unless I can actually get an Asus Matrix card this time around, lol. Weird about the loud fan; I used the air cooler on my 7970s for about 3 days waiting for the waterblocks to arrive, and all of them ran silent, lol.


----------



## Regent Square

Quote:


> Originally Posted by *szeged*
> 
> I've had nothing but bad experiences with MSI; they're done for me. I'm only grabbing HIS cards when it comes to AMD from now on, unless I can actually get an Asus Matrix card this time around, lol. Weird about the loud fan; I used the air cooler on my 7970s for about 3 days waiting for the waterblocks to arrive, and all of them ran silent, lol.


290x day 1 buy?


----------



## szeged

Yeah, I'll try to get one on day one, and if they release custom cards later on, I'll sell the reference model to grab one of the custom cards. Unless I really end up liking the 290X reference, in which case I may sell a 780 Classy to grab a custom 290X.


----------



## Regent Square

Quote:


> Originally Posted by *szeged*
> 
> Yeah, I'll try to get one on day one, and if they release custom cards later on, I'll sell the reference model to grab one of the custom cards. Unless I really end up liking the 290X reference, in which case I may sell a 780 Classy to grab a custom 290X.


Will you join the R9 290X owners club?


----------



## szeged

Quote:


> Originally Posted by *Regent Square*
> 
> Will you join the R9 290X owners club?


Of course.

The best place to share and learn info on new cards is the clubs on OCN. I love the Titan owners club; amazing info is shared there, lol.


----------



## Regent Square

Quote:


> Originally Posted by *szeged*
> 
> Of course.
>
> The best place to share and learn info on new cards is the clubs on OCN. I love the Titan owners club; amazing info is shared there, lol.


Do you read through all the pages for info, or do you only look at the first page?


----------



## szeged

Quote:


> Originally Posted by *Regent Square*
> 
> Do you read through all the pages for info, or do you only look at the first page?


If I join a club late, I'll skim through most of the pages, skipping most of the random talk while keeping an eye out for any valuable info; front pages miss a lot of good stuff.

If I join a club near day one of it opening, I try to read every page there is. I've read almost every page of the 1600+ page Titan owners club completely, lol.


----------



## Regent Square

Quote:


> Originally Posted by *szeged*
> 
> If I join a club late, I'll skim through most of the pages, skipping most of the random talk while keeping an eye out for any valuable info; front pages miss a lot of good stuff.
>
> If I join a club near day one of it opening, I try to read every page there is. *I've read almost every page of the 1600+ page* Titan owners club completely, lol.


(insert cursing word here) man, that's awesome.

If you join a club late, do you go through all 1000+ pages, spending like 6 seconds on a page?


----------



## szeged

Quote:


> Originally Posted by *Regent Square*
> 
> (insert cursing word here) man, that's awesome.
>
> If you join a club late, do you go through all 1000+ pages, spending like 6 seconds on a page?


If I join late, I scroll through the pages and read a sentence or so of each post to see if it's interesting to me, lol. Sometimes I'll just skip pages if the recent chat in the club is way off topic, lol.


----------



## Regent Square

Quote:


> Originally Posted by *szeged*
> 
> If I join late, I scroll through the pages and read a sentence or so of each post to see if it's interesting to me, lol. Sometimes I'll just skip pages if the recent chat in the club is way off topic, lol.


Thanks for the info. I will have plenty to read in a few days.


----------



## Stay Puft

Quote:


> Originally Posted by *szeged*
> 
> My favorite of those three is definitely HIS; I've had nothing but good luck with their cards. My best-overclocking 7970s are from HIS, and the IceQ X² coolers they use are pretty top-notch for those who air-cool.


HIS? Really? If the price is under $600, I'm grabbing any reference card. Over $600, I'll just wait.


----------



## szeged

Quote:


> Originally Posted by *Stay Puft*
> 
> HIS? Really? If the price is under $600, I'm grabbing any reference card. Over $600, I'll just wait.


The HIS 7970 GHz cards were reference cards with a slightly beefed-up VRM and two 8-pin PCIe connectors instead of 6+8, better power draw, etc., yet they fit reference waterblocks because of the way HIS designed the card. Best of both worlds.


----------



## infranoia

On topic: I hadn't seen CrossFire 290X yet, but 17K people out there have.

3x 4K Eyefinity. My god, the pixel pumps in these things...


----------



## Regent Square

Quote:


> Originally Posted by *szeged*
> 
> The HIS 7970 GHz cards were reference cards with a slightly beefed-up VRM and two 8-pin PCIe connectors instead of 6+8, better power draw, etc., yet they fit reference waterblocks because of the way HIS designed the card. Best of both worlds.


Man, I am disappointed by the amount of leaks being released. We have had a leaked diagram of the 290X floating around for 2 days already; so bad.


----------



## keikei

Quote:


> Originally Posted by *Regent Square*
> 
> Man, I am disappointed by the amount of leaks being released. We have had a leaked diagram of the 290X floating around for 2 days already; so bad.


May I ask, are you picking one up (or at least waiting until benches)? You're on this R9 290X news like 'white on rice'.


----------



## Regent Square

Quote:


> Originally Posted by *keikei*
> 
> May I ask, are you picking one up (or at least waiting until benches)? You're on this R9 290X news like 'white on rice'.


I have been eager to upgrade since a few months prior to the 600 series release. Yes, I am considering picking one up (if things work out well enough this gen).

I was around during the 780 hype, but it was rather a failure...


----------



## keikei

Quote:


> Originally Posted by *Regent Square*
> 
> I have been eager to upgrade since a few months prior to the 600 series release. Yes, I am considering picking one up (if things work out well enough this gen).
> 
> I was around during the 780 hype, but it was rather a failure...


Well, early next week we'll have our long awaited answers. I think the TITAN might just be dethroned. Either way, we'll get some price drops on both sides.


----------



## Regent Square

Quote:


> Originally Posted by *keikei*
> 
> Well, early next week we'll have our long awaited answers. I think the TITAN might just be dethroned. Either way, we'll get some price drops on both sides.


It is still 3 long days of waiting. We will see reviews coming a day before release; they will be leaked from Chinese forums or elsewhere.


----------



## Majin SSJ Eric

In my 7970 days it seemed like Diamond had all the best clockers believe it or not...


----------



## szeged

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> In my 7970 days it seemed like Diamond had all the best clockers believe it or not...


When I was first searching around for a new 7970, everyone was saying the Diamond reference cards had the best luck with OC potential. I went with HIS instead and got an amazing card. After a while, I got a Diamond reference to compare, to see if it was true; it was an awful OC card, lol. I eventually sold it to a friend for $50 less than I got it for, since he was going to run it at stock anyway, lol.


----------



## Majin SSJ Eric

I believe two or three of TSM's 7970s are Diamonds...


----------



## Stay Puft

Back when I had quad 7970s, they were all Diamond brown-box cards and did 1250 MHz core at 1250 mV.


----------



## szeged

I just had bad luck with the Diamond card, imo; there were lots of reports of people getting insane clockers with them.


----------



## Majin SSJ Eric

My Sapphires were dogs. They'd only do 1225MHz at 1381mV!


----------



## szeged

D: wow lol


----------



## szeged

TTL's 780 was overclocked for that run, btw, while the Titan was at stock.


----------



## wstanci3

Quote:


> Originally Posted by *szeged*
> 
> TTL's 780 was overclocked for that run btw, while the titan was at stock.


Lol, didn't we have the same conversation a few pages back about reviewers comparing overclocked or non-reference 780s to stock Titans?

History repeats itself, it seems. Over and over and over...


----------



## Regent Square

Quote:


> Originally Posted by *szeged*
> 
> TTL's 780 was overclocked for that run btw, while the titan was at stock.


It lost to the Titan in every benchmark except FC3...

What kind of overclock was that?

It was turbo-boosting higher than the Titan, as I remember...


----------



## szeged

Quote:


> Originally Posted by *wstanci3*
> 
> Lol, didn't we have the same conversation a few pages back about reviewers comparing overclocked or non-reference 780s to stock Titans?


Yes, lol, which is why review sites like that shouldn't be trusted 100% when comparing cards that are so close to each other.

Both cards are within 5% of each other: overclock the lesser one a lot, claim it beats the other one in all scenarios, and get paid for advertising another good product from the supplier of the item.


----------



## wstanci3

Quote:


> Originally Posted by *szeged*
> 
> Yes, lol, which is why review sites like that shouldn't be trusted 100% when comparing cards that are so close to each other.
> 
> Both cards are within 5% of each other: overclock the lesser one a lot, claim it beats the other one in all scenarios, and get paid for advertising another good product from the supplier of the item.


Or, if they are going to compare products with little performance gap between them, at least show the overclocking potential of both, not just one. Try to at least make it not so one-sided all the time...
Edit: But when you really think about it, reviewers aren't going to spend extra time overclocking everything to put it all on a level playing field (level as much as possible, given the silicon lottery).


----------



## szeged

Quote:


> Originally Posted by *wstanci3*
> 
> Or, if they are going to compare products with little performance gap between them, at least show the overclocking potential of both, not just one. Try to at least make it not so one-sided all the time...


They could, or they could lavish praise on one while claiming it's better, so they can get paid for giving it a good review, lol.


----------



## Regent Square

Quote:


> Originally Posted by *wstanci3*
> 
> Or, if they are going to compare products with little performance gap between them, at least show the overclocking potential of both, not just one. Try to at least make it not so one-sided all the time...
> Edit: But when you really think about it, reviewers aren't going to spend extra time overclocking everything to put it all on a level playing field (level as much as possible, given the silicon lottery).


They need to review the 290X OC vs. Titan OC.


----------



## szeged

Quote:


> Originally Posted by *Regent Square*
> 
> They need to review the 290X OC vs. Titan OC.


They need to, but they aren't going to. They always overclock new cards, leave the other cards at stock, and claim victory for the new card.


----------



## wstanci3

Quote:


> Originally Posted by *szeged*
> 
> They could, or they could lavish praise on one while claiming it's better, so they can get paid for giving it a good review, lol.


Yeah, lol.
I don't take hardware review sites seriously for GPU reviews. But, even though I shouldn't, I will probably read the 290X reviews simply because I can't help myself.

I will wait to see what people on OCN are able to do with it, and base on that whether I will buy one or skip.


----------



## szeged

Yeah, it's definitely smarter to trust people getting paid to give a card a good review over actual users.


----------



## Regent Square

Quote:


> Originally Posted by *szeged*
> 
> Yeah, it's definitely smarter to trust people getting paid to give a card a good review over actual users.


Some sites only.*

There are still non-paid sites that provide proper reviews.

I mean, if you read Tom's Hardware, then...


----------



## BradleyW

I don't understand fanboys. They support businesses that make millions from taking all your money.


----------



## Echoa

Quote:


> Originally Posted by *Regent Square*
>
> > Originally Posted by *szeged*
> >
> > They need to, but they aren't going to. They always overclock new cards, leave the other cards at stock, and claim victory for the new card.
>
> I won't trust OCN for overclocks, considering the number of NV fanboys here. Some reviews do OC vs. OC, but not to the max limit. Also, a day-one release can't be the deciding factor for who is the victor, as the Titan has been on the market for a while.

My personal fave is Guru3D c: After looking at the reviews of many tech sites, CPU- and GPU-wise I find them to review best. Besides that, I see no reason not to trust people here. I personally am an AMD fan, but I will also give you a fairly objective opinion on A vs. B hardware. I've seen plenty of constructive and decent feedback on most hardware; sure, bias does get inserted at times, but there is still plenty of fairly unbiased info here.



----------



## xoleras

Quote:


> Originally Posted by *szeged*
> 
> They need to, but they aren't going to. They always overclock new cards, leave the other cards at stock, and claim victory for the new card.


Don't agree with you here. What I always see at nearly every website is testing everything at stock, and THEN devoting one page to show the overclock potential of the new GPU. For instance, Guru3d tested the 280X reference and compared it against reference cards from NV. Then they devoted one page to overclocking to show the potential of the card - this is what most websites do.

As seen here:

http://www.guru3d.com/articles_pages/radeon_r7_260x_r9_270x_280x_review_benchmarks,1.html

Now, post-launch, they will test pre-overclocked models and compare them to the reference 280X. As an example, they may test a DCII 280X and compare it to reference. This is similar to how the 680 was reviewed: at launch, it was all stock-clocked GTX 680s. Then, a couple of weeks later, the factory-overclocked models were tested to show how much faster they were versus other cards *and* the stock 680. This is the formula most websites follow, and it is a fair testing methodology for launch cards: launch cards for *new GPUs* are *usually* tested at stock clocks, again with one page dedicated to OC potential.

Like I said, though, they may test pre-overclocked models such as the MSI Lightning, but for new launch SKUs they test reference across the board for the most part, with one page dedicated to the OC potential of the new GPU architecture. Now, for what I like to call the "nutty insane" type of testing, where someone does a 2-minute suicide run in Unigine or 3DMark Fire Strike using 16 gabillion volts, you won't find stuff like that on most websites. For suicide testing, you'll want to ask here, heh. I don't personally find suicide-run benchmarks useful as far as real-world performance goes.


----------



## selk22

This is one Sexy GPU

http://www.anandtech.com/show/7406/the-sapphire-r9-280x-toxic-review

I am hoping to see something amazing from Sapphire with the 290-290x


----------



## Regent Square

Quote:


> Originally Posted by *selk22*
> 
> This is one Sexy GPU
> 
> http://www.anandtech.com/show/7406/the-sapphire-r9-280x-toxic-review
> 
> I am hoping to see something amazing from Sapphire with the 290-290x


No reference for ya?


----------



## selk22

Quote:


> Originally Posted by *Regent Square*
> 
> No reference for ya?


I would like to... then expand my H220 with a second radiator and put it under water. But that is also the pricey option.

I don't see the point of reference unless you plan to put them under water for sure. My experience with Nvidia reference coolers has not been good; my 580 SC reference card is really quite the heater.

It's all speculation for me at this point, but I do really like the look and temps of that 280X Toxic.


----------



## maarten12100

The reference cooler seems to do well, and since the 280X and 290X have about the same draw at stock, this cooler is going to do just fine.
To let you guys in on a little something: my HD 5870s actually go lower than what Afterburner allows as fan settings; they go to like 21% and at full load up to 35%. I really love this factory fan profile. (They don't run hot, either.)
Having bought those 5870s a couple of days ago and seeing what they can do makes me wonder why I got my 570 over this...

I still can't stand the CCC panel, though; it is so bad compared to the well-sorted Nvidia control panel.
Not sure if I'll buy this GPU, but it'll be a beast, and it is needless to say it will have the multi-GPU performance crown, because CF scales much better than SLI and the single cards are probably very close.

I recall an AMD tech saying that it was stronger than anything Nvidia had; well, time will tell. (UHDL is gonna rock our worlds, and Mantle after that.)


----------



## Fniz92

If this card's specifications are indeed true, then mother of god... that is one powerful beast.


----------



## Regent Square

Quote:


> Originally Posted by *selk22*
> 
> I would like to... then expand my H220 with a second radiator and put it under water. But that is also the pricey option.
> 
> I don't see the point of reference unless you plan to put them under water for sure. My experience with Nvidia reference coolers has not been good; my 580 SC reference card is really quite the heater.
> 
> It's all speculation for me at this point, but I do really like the look and temps of that 280X Toxic.


I've always been a fan of reference. They said all reviewers were given non-reference coolers; does that mean the reference design will be in limited quantities only?


----------



## Moragg

"Fans" are just people who like the products and/or company, may have a slight bias but otherwise look at all offerings and price equally.
"Fanboys" are people who will stick to one company regardless of price/performance, bas competitors and _not even look at alternatives_.

On topic, I am a fan of AMD because of their prices and really want them to bring out a CPU that can compete with Intel when both are OCed purely on performance. A new chipset and a slightly more expensive die is more than tolerable if it pushes AMD close enough to Intel. As it stands having AMD for budget and Intel for performance means relativly little competition, unlike the GPU side of things.


----------



## Fniz92

I agree with what you say, but I still don't think AMD can compete with such a huge company as Intel.
With graphics cards they can compete against Nvidia, and in my opinion they have the better products for the money.

Competition would be good for the CPU market, though; seeing Intel throw out 3 generations of processors with barely 5-7% improvement every time is a disgrace...


----------



## Moragg

I'm not sure AMD can compete, but they need to do something. As for us consumers, we might get some competition from ARM, so hopefully they can push Intel and drive down prices through similar offerings.


----------



## CallsignVega

I only check this thread every 20 pages or so. So what's the deal, still no leaked benchmarks?


----------



## PureBlackFire

Quote:


> Originally Posted by *CallsignVega*
> 
> I only check this thread every 20 pages or so. So what's the deal, still no leaked benchmarks?


nope.


----------



## Stay Puft

Quote:


> Originally Posted by *CallsignVega*
> 
> I only check this thread every 20 pages or so. So what's the deal, still no leaked benchmarks?


There's a thread over at Chiphell with new 3DMark scores for the 290X, tested with a 3930K:

3DMark 11: X4487
Fire Strike Extreme: 4723


----------



## Fniz92

Quote:


> Originally Posted by *PureBlackFire*
> 
> nope.


There are already leaked benchmarks out there.
The NDA lifts in 2 days, so reviews should be up by then.


----------



## PureBlackFire

Quote:


> Originally Posted by *Stay Puft*
> 
> There's a thread over at Chiphell with new 3DMark scores for the 290X, tested with a 3930K:
> 
> 3DMark 11: X4487
> Fire Strike Extreme: 4723


link to this thread please.
Quote:


> Originally Posted by *Fniz92*
> 
> There are already leaked benchmarks out there.
> The NDA lifts in 2 days, so reviews should be up by then.


I know.


----------



## Stay Puft

Quote:


> Originally Posted by *PureBlackFire*
> 
> link to this thread please.
> I know.


http://www.chiphell.com/thread-875687-1-1.html


----------



## geoxile

How does the Titan compare?


----------



## Stay Puft

Quote:


> Originally Posted by *geoxile*
> 
> How does the Titan compare?


Titan's higher


----------



## CallsignVega

Quote:


> Originally Posted by *Stay Puft*
> 
> There's a thread over at Chiphell with new 3DMark scores for the 290X, tested with a 3930K:
> 
> 3DMark 11: X4487
> Fire Strike Extreme: 4723


Those are a bit lower than the Titan, aren't they?


----------



## Fniz92

Quote:


> Originally Posted by *geoxile*
> 
> How does the Titan compare?


About 100 less.

Titan = 4600 something.


----------



## sugarhell

Below, they mention about 9K in Fire Strike.


----------



## PureBlackFire

Quote:


> Originally Posted by *Stay Puft*
> 
> http://www.chiphell.com/thread-875687-1-1.html


thanks.
Quote:


> Originally Posted by *geoxile*
> 
> How does the Titan compare?


A stock Titan is in the same range; mostly lower, but it could be equal or higher depending on how much the card boosts at stock settings.


Spoiler: Warning: Spoiler!


----------



## Stay Puft

Quote:


> Originally Posted by *Fniz92*
> 
> About 100 less.
> 
> Titan = 4600 something.


I haven't seen a Titan score with a 3930K.

Quote:


> Originally Posted by *CallsignVega*
> 
> Those are a bit lower than the Titan aren't they?


Yes they are.


----------



## Fniz92

The Titan got a score of 4364 with a 3960X at 4.6 GHz at launch.

http://www.guru3d.com/articles_pages/geforce_gtx_titan_review,22.html

The R9 290X should be above 5,000 once drivers get optimized.
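Putting the leaked score next to the Titan's launch number (a rough sketch; both figures are just the forum-reported scores quoted in this thread, not official results):

```python
# Fire Strike Extreme scores as reported in this thread (not official figures).
titan_launch = 4364     # Titan at launch, Guru3D review (3960X @ 4.6 GHz)
r9_290x_leak = 4723     # leaked 290X score (3930K)

delta = r9_290x_leak - titan_launch
pct = 100.0 * delta / titan_launch
# about an 8.2% lead for the leaked 290X score over the Titan's launch score
print(f"leaked 290X leads the Titan launch score by {delta} points ({pct:.1f}%)")
```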


----------



## Stay Puft

Quote:


> Originally Posted by *Fniz92*
> 
> The Titan got a score of 4364 with a 3960X at 4.6 GHz at launch.
> 
> http://www.guru3d.com/articles_pages/geforce_gtx_titan_review,22.html
> 
> The R9 290X should be above 5,000 once drivers get optimized.


Unfortunately, those numbers are from 8 months ago.


----------



## jincuteguy

So when are these R9 290Xs supposed to come out?


----------



## Fniz92

Quote:


> Originally Posted by *Stay Puft*
> 
> Unfortunately those numbers are from 8 months ago


You don't get the point I'm trying to get across. There is still work to be done, and a score of 4.7K is very decent compared to the Titan's 4.4K at launch.


----------



## wstanci3

Quote:


> Originally Posted by *jincuteguy*
> 
> So when are these R9 290Xs supposed to come out?


NDA is lifted on the 15th.
I'd expect it to be available to order shortly after.


----------



## CallsignVega

That stinks; I want something faster than my Titans, not something around the same speed. Are there any limitations on 290X CrossFire, or can you go full 4-way?


----------



## sugarhell

Quote:


> Originally Posted by *CallsignVega*
> 
> That stinks, I am wanting something faster than my Titans, not something around the same speed. Any limitations on 290X Crossfire or can you go full 4-way?


Full 4-way. Also, with 64 ROPs and a 512-bit bus, it's great for your resolution, Vega.


----------



## BradleyW

Waiting for 290's or 290x's here! Will be making the purchase!


----------



## Fniz92

Quote:


> Originally Posted by *CallsignVega*
> 
> That stinks, I am wanting something faster than my Titans, not something around the same speed. Any limitations on 290X Crossfire or can you go full 4-way?


Nvidia is king of multi-GPU; why would you ever change your Titans?


----------



## Regent Square

Quote:


> Originally Posted by *BradleyW*
> 
> Waiting for 290's or 290x's here! Will be making the purchase!


Same


----------



## Regent Square

Quote:


> Originally Posted by *CallsignVega*
> 
> That stinks, I am wanting something faster than my Titans, not something around the same speed. Any limitations on 290X Crossfire or can you go full 4-way?


We don't know:

a) whether it was an Intel or AMD CPU,

b) whether it was the 290X or the 290 that got this score,

c) and the drivers are not mature.


----------



## Regent Square

I remember Golden Tiger was wondering whether it is worth upgrading. Definitely not; approximately Titan/780 performance is expected.


----------



## whtchocla7e

These 290X threads are fun to read, full of closet AMD-lover Titan owners who are eager to "upgrade" to the latest, greatest, and 5% faster, lol.


----------



## SoloCamo

Quote:


> Originally Posted by *CallsignVega*
> 
> That stinks, I am wanting something faster than my Titans, not something around the same speed. Any limitations on 290X Crossfire or can you go full 4-way?


<-- Can't comprehend the need to replace four Titans...

Perhaps I'm just not gaming right?


----------



## Stay Puft

Quote:


> Originally Posted by *Regent Square*
> 
> We don't know
> 
> a) Was it intel or amd cpu
> 
> b) 290x or 290 that got this score
> 
> c) drivers are not mature.


Read the thread I linked:

a. It was a 3930K.
b. The 290X, obviously.
c. It's GCN; it's not something entirely new.


----------



## vs17e

Quote:


> Originally Posted by *Fniz92*
> 
> Nvidia is king of multi-GPU; why would you ever change your Titans?


Doesn't CrossFire have greater scaling?


----------



## Fniz92

Quote:


> Originally Posted by *vs17e*
> 
> Doesn't crossfire have greater scaling?


It depends on the game; Nvidia is strongly focused on multi-GPU.
Let's assume almost every game coming out on consoles is AMD-optimized, which seems to be the case with BF4, which has much lower frame latency and better scaling than its Nvidia counterparts. An example would be the 7990 being 3-4 frames off the 780 SLI in BF4; that is incredibly good, nearly flawless scaling.

I assume the 290X, which doesn't have CrossFire bridges, will also scale better, according to AMD at least. Who knows, perhaps Mantle will make scaling much better.

Most games today run better on SLI, though. Time will tell how much optimization developers can squeeze out of their games, seeing as they are coding directly to AMD hardware; I assume that is the only way they can make Mantle possible.


----------



## Joa3d43

Quote:


> Originally Posted by *Fniz92*
> 
> It depends on the game; Nvidia is strongly focused on multi-GPU.
> Let's assume almost every game coming out on consoles is AMD-optimized, which seems to be the case with BF4, which has much lower frame latency and better scaling than its Nvidia counterparts. An example would be the 7990 being 3-4 frames off the 780 SLI in BF4; that is incredibly good, nearly flawless scaling.
> 
> I assume the 290X, which doesn't have CrossFire bridges, will also scale better, according to AMD at least. Who knows, perhaps Mantle will make scaling much better.


...also wondering about multi-R9 290X scaling, and why AMD only showed up to tri-fire (and not quads) in the pic linked below... how 'externally bridgeless' will do with 4 GPU cards on the PCIe bus. Hopefully some test results soon, as it looks to be an interesting card.

http://videocardz.com/images/2013/10/AMD-Radeon-R9-290X-CrossFire-Scaling.jpg


----------



## Blackops_2

Quote:


> Originally Posted by *Joa3d43*
> 
> ...also wondering about multi R9 290x scaling - and why AMD only showed up to tri-fire (and not quads) in their pic-link below... how 'externally bridgeless' will do w/4 GPU cards on PCIe bus...hopefully some test results soon as it looks to be an interesting card
> 
> http://videocardz.com/images/2013/10/AMD-Radeon-R9-290X-CrossFire-Scaling.jpg


Maybe for some odd reason they figured no one would use/need 4 290s? Either way it's a little strange.


----------



## infranoia

Quote:


> Originally Posted by *Fniz92*
> 
> Depends on the game


But not really. There is an absolute answer to that question.
http://forums.anandtech.com/showthread.php?t=2234652

Unless you include Batman as the 'depends' part, which is a game nestled deeply in Nvidia's pocket, with all kinds of engine shenanigans going on.


----------



## Particle

Fniz92 is correct. Some games scale poorly or not at all with CFX. Some games like Natural Selection 2 work with CFX but do not yield better performance. Other games like Company of Heroes 2 don't even work with CFX enabled.


----------



## infranoia

Quote:


> Originally Posted by *Particle*
> 
> Fniz92 is correct. Some games scale poorly or not at all with CFX. Some games like Natural Selection 2 work with CFX but do not yield better performance. Other games like Company of Heroes 2 don't even work with CFX enabled.


The question was "does Crossfire have greater scaling" and in the general sense, yes it does, all other things being equal. But I'm not about to defend crappy game code.

EDIT: Oh, frack. Besides that one Anand forum post all I can find are Tom's Hardware links. I hereby formally abandon this line of reasoning.


----------



## Fniz92

Quote:


> Originally Posted by *infranoia*
> 
> But not really. There is an absolute answer to that question.
> http://forums.anandtech.com/showthread.php?t=2234652
> 
> Unless you include Batman as the 'depends' part, which is a game nestled deeply in Nvidia's pocket, with all kinds of engine shenanigans going on.


Games like Assassin's Creed 3, Batman: Arkham City, F1 2012, StarCraft II, Skyrim, and World of Warcraft are all cases where Crossfire performance is very poor.

But you are indeed right; for the most part Crossfire scaling is pretty good, perhaps just as good as Nvidia's. TechPowerUp seems to place the 7990 at the same performance as a 690, outside the games I listed.

My concern is frame latency though. I know they have fixed a lot of issues with Crossfire lately, but does Crossfire cause stutter?


----------



## SpacemanSpliff

Quote:


> Originally Posted by *Fniz92*
> 
> Games like Assassin's Creed 3, Batman: Arkham City, F1 2012, StarCraft II, Skyrim, and World of Warcraft are all cases where Crossfire performance is very poor.
> 
> But you are indeed right; for the most part Crossfire scaling is pretty good, perhaps just as good as Nvidia's. TechPowerUp seems to place the 7990 at the same performance as a 690, outside the games I listed.
> 
> My concern is frame latency though. I know they have fixed a lot of issues with Crossfire lately, but does Crossfire cause stutter?


From the sounds of it (and we'll have to wait for official real-world reviews to know for sure), it seems like AMD has implemented some hardware changes to help out with this, including a frame pacing/metering controller on the card itself, as well as designing the cards to no longer require bridge cables for Crossfire...

From the way it looks, these should have much better Crossfire performance than anything currently out from AMD by a large margin... but as I said, it's just speculation... we still have to wait until Tuesday to know for sure...
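For what it's worth, the stutter question is usually answered with frame-time percentiles rather than average FPS; a minimal sketch of how reviewers quantify it (the timestamps below are invented for illustration):

```python
def frame_times_ms(timestamps_s):
    """Per-frame render times (ms) from successive frame presentation timestamps (s)."""
    return [(b - a) * 1000.0 for a, b in zip(timestamps_s, timestamps_s[1:])]

def percentile(values, pct):
    """Nearest-rank percentile; good enough for frame-time analysis."""
    ordered = sorted(values)
    rank = max(1, round(pct / 100.0 * len(ordered)))
    return ordered[rank - 1]

# Made-up capture: mostly ~16.7 ms frames with one 50 ms spike (a visible stutter).
stamps = [0.0, 0.0167, 0.0334, 0.0834, 0.1001, 0.1168]
times = frame_times_ms(stamps)
print(f"avg {sum(times) / len(times):.1f} ms, 99th pct {percentile(times, 99):.1f} ms")
```

A card that wins on average FPS can still lose badly on the 99th-percentile number, which is why frame pacing hardware matters.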


----------



## Echoa

Won't let me quote for some reason :3 but for those mentioning quad-fire not being shown: doesn't performance in SLI and Crossfire scale poorly beyond 2 cards, with 3 cards really being the max before your cost-to-benefit ratio drops into the toilet? That's probably why AMD didn't bother showing quad-fire; the performance increase was probably negligible.

Sent from my Galaxy Nexus using Tapatalk now Free


----------



## fateswarm

Quote:


> Originally Posted by *fleetfeather*
> 
> I understand probability fine. My post was in reference to your previously exhibited hard-line stance against the 290x before more was known about it. The above comment was one of many from you suggesting the 290x was DOA (read: suggesting, not stating). I'm fairly confident you'll ask me to provide proof of these past comments, but really you make a lot of posts on a daily basis, so I'd rather not trawl through your extensive post history on the subject.


So? As I said, I see probabilities and I support the highest probability, and I may even change a stance completely in the distant future if I see a more complete picture; there is no reason to attack those who do that. My role models aren't politicians, they are scientists.


----------



## Fniz92

Seems like I was wrong and AMD has stepped up its game in terms of frame latency, at least most of the time.

Stumbled upon this review, which seems to suggest that the frame latency problem is a thing of the past with the new drivers.

http://www.hardwareheaven.com/reviews/1727/pg20/amd-radeon-hd-7990-graphics-card-review-conclusion-and-rating.html

Just had too many issues with my 6970 Crossfire at the time; glad they finally fixed the issues.

Can't seem to find any reviews with their newest drivers, which should have some frame pacing, though. I still think Nvidia has a slight edge in multi-GPU; time will tell how much longer that will last.


----------



## maarten12100

Quote:


> Originally Posted by *CallsignVega*
> 
> That stinks, I am wanting something faster than my Titans, not something around the same speed. Any limitations on 290X Crossfire or can you go full 4-way?


Vega, these cards will be faster than your Titans at your resolution.
Way faster, and they will scale better with 4 cards than your Titans do.


----------



## maarten12100

Quote:


> Originally Posted by *Fniz92*
> 
> Nvidia is king of multiple GPU's why would you ever change your titans?


What?!
After the frame pacing driver? Nope, not anymore.
These cards will have hardware-based frame metering and will scale better than SLI.


----------



## fleetfeather

Quote:


> Originally Posted by *fateswarm*
> 
> So? I said, I see probabilities and I support the highest probability, and I may even change a stance completely in the distant feature if I see a more complete picture, there is no reason to attack those that do it. My role models aren't politicians, they are scientists.


PM'd

Edit: stuff it, I see who I'm trying to rationalise with and realise such an attempt would fall on deaf ears.


----------



## bencher

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> My Sapphires were dogs. They'd only do 1225MHz at 1381mV!


I bought an Asus 7970 DCII and the voltage can't go higher than 1.18 V and clocks can't go higher than 1.15 GHz :'(


----------



## mcg75

Quote:


> Originally Posted by *infranoia*
> 
> Unless you include Batman as the 'depends' part, which is a game nestled deeply in Nvidia's pocket, with all kinds of engine shenanigans going on.


The source information for this "graph" was from March 23, *2012.*

The game makers released patches and AMD has had multiple Catalyst updates for upping performance in Xfire since then.

If you're going to scream conspiracy, have proof of it first please.


----------



## Fahrenheit85

I didn't know until this thread that the 290 and the 290X are two different cards.


----------



## Regent Square

Quote:


> Originally Posted by *Stay Puft*
> 
> Read the thread I linked.
> 
> a. It was a 3930K
> b. 290X obviously
> c. It's GCN. It's not something entirely new.


a) ok

b) lol

c) lol


----------



## XxOsurfer3xX

The 290X at extreme resolutions is going to crush the Titan IMO. I want to see CFX scaling, but I'm not going to CFX for sure; SLI and CFX have been a pain in the ass for me. Just give me the fastest single card and I'm happy...


----------



## King4x4

CRUSH, SMASH, MURDER, SCREW, MAKE BABIES OUT OF... Been hearing this all day.

Let's see some benchmarks before going on the spam train, please.


----------



## Jack Mac

Quote:


> Originally Posted by *King4x4*
> 
> CRUSH, SMASH, MURDER, SCREW, MAKE BABIES OUT OF... Been hearing this all day.
> 
> Lets see some benchmarks before going on the spam train please.


Two days to go.


----------



## MeanBruce

Quote:


> Originally Posted by *Jack Mac*
> 
> Two days to go.


Best case scenario: the R9 290 (without the X) comes in at $499 and the GTX 780 is discounted to match; the DC2 is $50 more, so a cool $549. Finally I'll have a real video card, instead of the junk I've been running.









T-minus 48 hours. Aren't there normally a few barely translatable leaks by now? What's going on?


----------



## ultimeus

Can't wait for reviews.


----------



## DADDYDC650

Any news on a possible 6-8GB R9-290x? I needs moar vramz @ 7680x1440p!


----------



## maarten12100

Quote:


> Originally Posted by *MeanBruce*
> 
> Best case scenario: the R9 290 (without the X) comes in at $499 and the GTX 780 is discounted to match; the DC2 is $50 more, so a cool $549. Finally I'll have a real video card, instead of the junk I've been running.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> T-minus 48 hours. Aren't there normally a few barely translatable leaks by now? What's going on?


We already had those for some time.
It pulls ahead of Titan at higher resolutions and/or with supersampling, and is equal at lower resolutions clock for clock (a few percent slower according to the XS mod).
At this point we pretty much know that it will beat Titan on average in multi-GPU setups.

Kyle said a while back that an AMD employee told him it was "faster than anything Nvidia had"; that was some time ago, but I figure it will be faster at high resolutions. (Too bad it has only 1 DP port.)


----------



## Alatar

Quote:


> Originally Posted by *maarten12100*
> 
> We already had those for some time.
> It pulls ahead of Titan at higher resolutions and/or with supersampling, and is equal at lower resolutions clock for clock (a few percent slower according to the XS mod).
> At this point we pretty much know that it will beat Titan on average in multi-GPU setups.
> 
> Kyle said a while back that an AMD employee told him it was "faster than anything Nvidia had"; that was some time ago, but I figure it will be faster at high resolutions. (Too bad it has only 1 DP port.)


Do find the quote and the benches.

Because in the only high-res (3x1080p) benches we've seen, with 4xMSAA, the 290X at 1050 MHz was matching a Titan. The XS mod you're talking about has also said that once he checked more tests and actually calculated the averages, the card was below the Titan with both at stock. And the same guy also said not to expect much when it comes to OCing potential.

Trading blows with the 780 once both are at max OCs is what I'm still going to guess, based on all the leaked stuff and the OCing talk from XS.


----------



## CallsignVega

Quote:


> Originally Posted by *maarten12100*
> 
> Vega, these cards will be faster than your Titans at your resolution.
> Way faster, and they will scale better with 4 cards than your Titans do.


While I hope so (as I am always looking for excuses to build/get the newest parts), what are your references for such a statement?

"Way faster" than 1.32v 1300 MHz Titan's? Sign me up!


----------



## Alatar

Quote:


> Originally Posted by *CallsignVega*
> 
> While I hope so (as I am always looking for excuses to build/get the newest parts), what are your references for such a statement?
> 
> "Way faster" than 1.32v 1300 MHz Titan's? Sign me up!


Everything we've seen so far suggests lower clock for clock performance than Titans and questionable overclocking potential (1400mhz on LN2 and 1150 on air rumored max).

Not saying that's exactly what's gonna happen, but all rumors are pointing to it.


----------



## sugarhell

Quote:


> Originally Posted by *Alatar*
> 
> Everything we've seen so far suggests lower clock for clock performance than Titans and questionable overclocking potential (1400mhz on LN2 and 1150 on air rumored max).
> 
> Not saying that's exactly what's gonna happen, but all rumors are pointing to it.


In the last thread from Chiphell, with a stock 3930K, a 290X scored 9K in Firestrike and ~4700 in Firestrike Extreme, with alpha drivers and quite unstable.


----------



## SoloCamo

So at this point, I'm only at 1080p, but I absolutely love to use AA, and as much of it as possible. Just from what has been gathered so far, how would a 780/Titan fare against a 290/290X under high levels of AA / SSAA? Looking to play BF4 with more AA; though my 1225/1650 OC on my 7970 GE does admirably at 1080p maxed, it definitely could use more grunt.


----------



## Alatar

Quote:


> Originally Posted by *sugarhell*
> 
> In the last thread from Chiphell, with a stock 3930K, a 290X scored 9K in Firestrike and ~4700 in Firestrike Extreme, with alpha drivers and quite unstable.


Yeah, and if you read the thread beyond a couple of pages, the same guy said his Titan was ~100 points (or something along those lines) higher in FSE.


----------



## Regent Square

Quote:


> Originally Posted by *Alatar*
> 
> Everything we've seen so far suggests lower clock for clock performance than Titans and questionable overclocking potential (1400mhz on LN2 and 1150 on air rumored max).
> 
> Not saying that's exactly what's gonna happen, but all rumors are pointing to it.


And you're gonna compare a day-one card on release drivers vs. a Titan which has been out for 7 months. LOL. Yeah, fair treatment...

The XS mod is not the most reliable source; you are turning rumors into 100% accurate info.

No worries, the Titan will have only a certain lifespan left upon the 290X's release...


----------



## infranoia

Quote:


> Originally Posted by *mcg75*
> 
> The source information for this "graph" was from March 23, *2012.*
> 
> The game makers released patches and AMD has had multiple Catalyst updates for upping performance in Xfire since then.
> 
> If you're going to scream conspiracy, have proof of it first please.


1. The graph was an argument for AMD's superior scaling. I'm not sure why you're calling out Catalyst improvements since then, as the data shows Tahiti's superior scaling in a number of recent games.
2. Part of TWIMTBP in the Batman case was just a series of IF AMD / THEN SKIP statements, which might explain why Batman is alone in showing superior SLI scaling. My only point there.
http://www.ngohq.com/graphic-cards/16223-nvidia-disables-physx-when-ati-card-is-present.html
http://forums.techgage.com/showthread.php?t=5313

Necroedit: One more source showing AMD's superior PCI-E scaling efficiency across a huge set of tested 2012 games, which may account for the Crossfire edge:
http://www.techpowerup.com/reviews/Intel/Ivy_Bridge_PCI-Express_Scaling/23.html

I've come to understand that it is often the game engine that determines how efficient the scaling is, but there are some broad statistics that show Crossfire slightly more efficient overall. Not enough to get our panties in a wad though, so I'm bailing out of this burning wreck a second time.


----------



## Moragg

Quote:


> Originally Posted by *Alatar*
> 
> Everything we've seen so far suggests lower clock for clock performance than Titans and questionable overclocking potential (1400mhz on LN2 and 1150 on air rumored max).
> 
> Not saying that's exactly what's gonna happen, but all rumors are pointing to it.


No offence, but there haven't exactly been many "rumors" about the 290(X)'s OCing ability; at best a random line or two in a post. And the one which said "1400 on LN2" also said "1300 on air", which you appear to have forgotten.









So as far as I can see, neither you nor anyone else has any basis on which to say how well these OC. Let the rest of us keep our hopes that it OCs and scales well, please.


----------



## maarten12100

Quote:


> Originally Posted by *Alatar*
> 
> Do find the quote and benches.
> 
> Because the only high res (3x1080p) benches we've seen with 4xMSAA the 290X at 1050MHz was matching a Titan. The XS mod you're talking about has also said that once he checked more tests and actually calculated the averages the card was below the Titan when both were at stock. And the same guy also said not the expect much when it comes to the OCing potential.
> 
> Trading blows with the 780 once both are at max OCs is what I'm still going to guess. Based on all the leaked stuff and OCing talk from XS.


Kyle said it somewhere end of September
Quote:


> Faster than anything Nvidia has


Yes, I based it off the statement by the XS mod, but Alatar, you have to admit this card is going to be beastly at higher resolutions and with supersampling.
I wonder, however: wouldn't doubling the number of ROPs also give only half the bandwidth per ROP? Raster units are quite bandwidth-intensive.

It might be time to face that you may lose that 10 dollar bet (though it will depend on which benches you run).
If you run high-res or multi-GPU, then the 290X should win on average; if just running normal benches, then it might be the other way around.
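The per-ROP bandwidth worry above can be put into rough numbers using the commonly cited specs (384-bit at 5.5 GHz effective for the HD 7970, 512-bit at 5.0 GHz for the 290X; the Hawaii figures are still leaks, not confirmed):

```python
def bandwidth_gbs(bus_width_bits: int, effective_clock_ghz: float) -> float:
    """Peak GDDR5 bandwidth in GB/s: (bus width in bits / 8) * effective data rate."""
    return bus_width_bits / 8 * effective_clock_ghz

tahiti_bw = bandwidth_gbs(384, 5.5)   # HD 7970: 264 GB/s shared by 32 ROPs
hawaii_bw = bandwidth_gbs(512, 5.0)   # R9 290X (leaked): 320 GB/s shared by 64 ROPs

print(f"Tahiti: {tahiti_bw / 32:.2f} GB/s per ROP")   # 8.25
print(f"Hawaii: {hawaii_bw / 64:.2f} GB/s per ROP")   # 5.00
```

So per-ROP bandwidth does drop, but by roughly 40% rather than half, since total bandwidth also rises from 264 to 320 GB/s.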
Quote:


> Originally Posted by *CallsignVega*
> 
> While I hope so (as I am always looking for excuses to build/get the newest parts), what are your references for such a statement?
> 
> "Way faster" than 1.32v 1300 MHz Titan's? Sign me up!


Well, it may depend a bit on OC ability; the power phases seem to suck a bit (LN2 without a voltage mod is a joke).
It OCs well on air, I would say in the 1200 MHz range, but:
Quote:


> Thanks to more Texture Mapping Units and Raster Operating Units we have 30% and 90% fill rate increase for textures and pixels


Over a 7970, which would mean it will be faster, especially at higher resolutions.

http://www.overclock.net/t/1433350/vc-amd-hawaii-r9-290-series-gpu-diagram-leaks-out
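The quoted 30%/90% fill-rate gains line up with the leaked unit counts if you pit an assumed 1.0 GHz Hawaii against the 7970 GHz Edition's 1.05 GHz boost clock; a quick sanity check (the Hawaii clock and unit counts are assumptions from the leaks, not confirmed specs):

```python
def fill_rate(units: int, clock_ghz: float) -> float:
    """Theoretical fill rate: units (TMUs or ROPs) * clock, in Gtexels/s or Gpixels/s."""
    return units * clock_ghz

# 7970 GHz Edition: 128 TMUs / 32 ROPs at 1.05 GHz boost (known specs).
# Hawaii (leaked):  176 TMUs / 64 ROPs, assumed 1.00 GHz.
tex_gain = fill_rate(176, 1.00) / fill_rate(128, 1.05) - 1
pix_gain = fill_rate(64, 1.00) / fill_rate(32, 1.05) - 1

print(f"texture fill rate: +{tex_gain:.0%}")  # ~+31%
print(f"pixel fill rate:   +{pix_gain:.0%}")  # ~+90%
```

Under that clock assumption, the leaked 176 TMU / 64 ROP counts reproduce the slide's 30%/90% figures almost exactly.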

Quote:


> Originally Posted by *Alatar*
> 
> Everything we've seen so far suggests lower clock for clock performance than Titans and questionable overclocking potential (1400mhz on LN2 and 1150 on air rumored max).
> 
> Not saying that's exactly what's gonna happen, but all rumors are pointing to it.


I'm not talking about the Firestrike bench; what use are benches if a card does better at a higher resolution than a card that did great in Firestrike?
If anything, AMD underwhelmed us with those Firestrike numbers in order to drop a "better than expected" card.

But I concur: even with better power gating (less leakage / higher efficiency), under LN2 the power phases of the card could not keep up. (Either that or the chip is a dud.)
There has been speculation that there won't be aftermarket-PCB cards based on the Hawaii chip; if there are aftermarket-PCB cards, that would be so great.

Depending on how easy that resistor pot mod is, people might be willing to do it. (I'm a chicken; last time I was soldering on a graphics card's PCB I failed to make a tweezer connection and screwed up my EEPROM, which was already failing but was then beyond repair.)

I would like to get this card for myself. I actually ordered a Titan for 850 euro a long time ago, but it was canceled since that was not the real price.
I recently picked up 2 5870s for 40 euro (broken), which I fixed, so I might switch from my GTX 570 to these if this card isn't that good. (1 GB of VRAM is really little for 3840x2160, but the raw muscle of the 5870s makes up for it; they should still be more than 50% faster in CF than my 570.)

Too bad Zotac doesn't offer AMD cards; I would've loved a 5-year warranty (my cards die on me a lot).
My 570 Classified currently runs 963 MHz as its boost profile, but I usually keep it at ~938 to be stable.

As you can see, I'm very excited for this launch.


----------



## sugarhell

Quote:


> Originally Posted by *Alatar*
> 
> Yeah and if you read the thread beyond a couple of pages the same guy said his Titan was ~100 points (or something along those lines) higher in FSE.


Quote:


> with alpha drivers and quite unstable


It matches the Titan even with alpha drivers. So it's a good thing for us: a cheaper Titan for all of us.


----------



## Regent Square

Quote:


> Originally Posted by *maarten12100*
> 
> Kyle said it somewhere end of September
> Yes I based it of the statement by the XS mod but Alatar you have to admit this card is going to be beastly at higher resolutions and supersampling.
> I wonder however wouldn't doubling the number of ROPs also give only half the bandwidth per ROP and raster units are quite bandwidth intensive.
> 
> It might be time to face that you may lose that 10 dollar bet (but it will depend on what benches you run)
> If you run high res or multi gpu then the 290X should win on average if just running normal benches then it might be the other way around.


He wrote:

AMD Sez

Next gen is faster than everything Nvidia has.


----------



## Regent Square

Quote:


> Originally Posted by *sugarhell*
> 
> It matches the Titan even with *alpha drivers*. So it's a good thing for us: a cheaper Titan for all of us.


Really?


----------



## sugarhell

Quote:


> Originally Posted by *Regent Square*
> 
> Really?


Search the thread and use Google Translate. Or read it directly if you can read Chinese.


----------



## Regent Square

Quote:


> Originally Posted by *sugarhell*
> 
> Search the thread and use google translate. Or if you can read chinese


I am Chinese...









+REP


----------



## Alatar

Quote:


> Originally Posted by *Regent Square*
> 
> And you're gonna compare a day-one card on release drivers vs. a Titan which has been out for 7 months. LOL. Yeah, fair treatment...
> 
> The XS mod is not the most reliable source; you are turning rumors into 100% accurate info.
> 
> No worries, the Titan will have only a certain lifespan left upon the 290X's release...


That's what the 290X will be competing against, so I have no problem comparing it against that.
Quote:


> Originally Posted by *Moragg*
> 
> No offence, but there haven't exactly been many "rumors" about the 290(X) OCing ability - at best a random line or two on a post. And the one which said "1400 on LN2" also said "1300 on air" which you appear to have forgotten
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So as far as I see you nor anyone else has any basis on which to say how well these OC. Let the rest of us keep our hopes that it OCs and scales well, please.


And where did he say 1300 on air? The only air-OCing quote I've seen has been 1100 MHz stable, plus another post where the same user hinted at running at 1.15 GHz.

Quote:


> Originally Posted by *maarten12100*
> 
> Kyle said it somewhere end of September
> Yes I based it of the statement by the XS mod but Alatar you have to admit this card is going to be beastly at higher resolutions and supersampling.
> I wonder however wouldn't doubling the number of ROPs also give only half the bandwidth per ROP and raster units are quite bandwidth intensive.
> 
> It might be time to face that you may lose that 10 dollar bet (but it will depend on what benches you run)
> If you run high res or multi gpu then the 290X should win on average if just running normal benches then it might be the other way around.


Kyle didn't say anything, AMD did.

And again, where is the proof of those claims about high-resolution or CFX scaling? We have none.

So far everything points at lower clock-for-clock perf than the Titan and considerably less OCing headroom.

How many of each specific computational unit it has is of little relevance. Only the performance matters.


----------



## Regent Square

Quote:


> Originally Posted by *Alatar*
> 
> That's what the 290X will be competing against so i have no problem comparing it against that.
> And where did he say 1300 on air? Only air OCing quote I've seen has been 1100MHz stable and another post where the same user hinted at running at 1.15ghz.
> Kyle didn't say anything, AMD did.
> 
> And again, where is the proof of those claims about high resolution or CFX scaling? We have none.
> 
> So far everything points at a lower clock for clock perf than the Titan and a considerably lower OCing headroom.
> 
> How many of specific computational units it has has little relevance. Only the performance matters.


1) Can't compare a day-one card vs. one that's been out for 7 months. Might as well compare an OC'd Titan vs. a stock 290X and call it a victory.

2) Kyle has a reliable source inside AMD... Not everyone at AMD is a company-devoted worker who only praises it or isn't allowed to talk.

Crossfire scaling is shown in the leaked VC diagram; according to it, scaling has been improved.


----------



## GoldenTiger

Quote:


> Originally Posted by *maarten12100*
> 
> Kyle said it somewhere end of September
> Yes I based it of the statement by the XS mod but Alatar you have to admit this card is going to be beastly at higher resolutions and supersampling.
> I wonder however wouldn't doubling the number of ROPs also give only half the bandwidth per ROP and raster units are quite bandwidth intensive.
> 
> If you run high res or multi gpu then the 290X should win on average if just running normal benches then it might be the other way around.


Lol, Kyle said nothing; he quoted an AMD rep's paraphrase. Nothing more than marketing. And the guy on XS isn't a moderator; he is a normal member who subscribes for the $25 annual title-edit/donation. You can't even get basics like that right.....?


----------



## Regent Square

Quote:


> Originally Posted by *GoldenTiger*
> 
> Lol, Kyle said nothing, he quoted an amd rep paraphrasing. Nothing more than marketing. And the guy on Xs isn't a moderator, he is a normal member who subscribes for the 25 dollar annual title edit/donation. You can't even get basics like that right.....?


Why was he screaming about a card if he subs for $25 a year? Plus, it said he was a forum moderator... (not sure if that is a paid feature)


----------



## maarten12100

Quote:


> Originally Posted by *GoldenTiger*
> 
> Lol, Kyle said nothing, he quoted an amd rep paraphrasing. Nothing more than marketing. And the guy on Xs isn't a moderator, he is a normal member who subscribes for the 25 dollar annual title edit/donation. You can't even get basics like that right.....?


Sorry I said it like that, but I stated in earlier posts all over OCN that Kyle said that AMD said it; go through my recent posts and you'll see.

Now don't get mad at me for such things; I'm just trying to help people figure out what to expect, just like you guys.

There: if you had read the thread before posting you would've seen it, and you wouldn't have had to respond in such a manner. (I mean, it is not like it is far away, just 3 pages or so.)
Quote:


> Originally Posted by *maarten12100*
> 
> We already had those for some time.
> It pulls ahead of Titan at higher resolutions and/or with supersampling, and is equal at lower resolutions clock for clock (a few percent slower according to the XS mod).
> At this point we pretty much know that it will beat Titan on average in multi-GPU setups.
> 
> Kyle said a while back that an AMD employee told him it was "faster than anything Nvidia had"; that was some time ago, but I figure it will be faster at high resolutions. (Too bad it has only 1 DP port.)


----------



## Regent Square

That is a pointless discussion.

In 2 days Titan will have a card that dethrones it.

Will come back when everything settles down.


----------



## raghu78

Quote:


> Originally Posted by *GoldenTiger*
> 
> Lol, Kyle said nothing, he quoted an amd rep paraphrasing. Nothing more than marketing. And the guy on Xs isn't a moderator, he is a normal member who subscribes for the 25 dollar annual title edit/donation. You can't even get basics like that right.....?


http://hardforum.com/showthread.php?t=1782602&highlight=

This is the thread; draw your own inferences. Anyway, it's just 18 hrs. Quit squabbling; all will be settled.


----------



## Regent Square

Quote:


> Originally Posted by *raghu78*
> 
> http://hardforum.com/showthread.php?t=1782602&highlight=
> 
> this is the thread. draw your own inferences. anyway its just 18 hrs. quit squabbling. all will be settled


Do you have a link to Time To Love Customs' channel, where he should be showing a review of the card like he did with the 780?!


----------



## Testier

Quote:


> Originally Posted by *Regent Square*
> 
> That is a pointless discussion.
> 
> In 2 days Titan will have a card that dethrones it.
> 
> Will come back when everything settles down.


I actually hope the R9 will be slower. I'd much rather get a Titan than an AMD card for certain features, aka fur tech and PhysX.


----------



## Moragg

Quote:


> Originally Posted by *Regent Square*
> 
> That is a pointless discussion.
> 
> In 2 days Titan will have a card that *could possibly* dethrone it *in a few places*.
> 
> Will come back when everything settles down.


Fixed that for you







it's this kind of comment that feeds some very logically lacking arguments.


----------



## Majin SSJ Eric

Haha, it is funny to me that Alatar dismisses rumors and claims he doesn't like as unsubstantiated and unconfirmed, yet is more than happy to quote rumors all day long that support his views. I personally think AMD would be crazy to release a 290X that DIDN'T at least match Titan, considering how late they are and how much hype they have created; but then again, this is the same company that released BD....


----------



## Kane2207

I can't help but feel that if AMD could thoroughly trounce Titan, there would have been benches and leaked slides all over the place by now.

Can anyone point me in the direction of these benches or AMD marketing slides?


----------



## Death Saved

Does the R9 290X have HDMI 2.0?


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Kane2207*
> 
> I can't help but feel that if AMD could thoroughly trounce Titan, there would have been benches and leaked slides all over the place by now.
> 
> Can anyone point me in the direction of these benches or AMD marketing slides?


I don't think anyone is foolish enough to believe the 290X will trounce the Titan (except maybe in BF4 when Mantle is released). I think it will win some and lose some against Titan, but I do believe it will be faster than the 780...

EDIT - I also don't believe it will be able to compete with an overvolted Titan over 1300 MHz, but we will see soon enough...


----------



## Alatar

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Haha, it is funny to me that Alatar dismisses rumors and claims that he doesn't like as unsubstantiated and unconfirmed yet is more than happy to quote rumors all day long that support his views. I personally think AMD would be crazy to release a card like the 290X that DIDN'T at least match Titan considering how late they are and how much hype they have created but then again, this is the same company that released BD....


What rumor did I dismiss....?

I just asked to be quoted the specific post somewhere (about clocks) or given the benches some people were talking about.

However yes I do dismiss PR talk. Works for all companies though. Never trust AMD when it comes to AMD products, NV when it comes to NV products etc.

E: also I don't know how you find my views and reasoning funny when you yourself apparently hold the same views (post above this one)


----------



## Majin SSJ Eric

But always trust NV when it comes to AMD products, right Al?


----------



## maarten12100

Quote:


> Originally Posted by *Testier*
> 
> I actually hope the r9 will be slower. I much rather get a titan for certain features aka fur tech and physx than AMD.


If it is slower but keeps the same price, that is actually bad for competition, even if you want a Titan.

Not sure if you're serious about the fur tech though (it might be nice for The Witcher 3, but it should be universal code instead of unported proprietary crap you need a license for).
It is proven to run on AMD GPUs with very minimal code changes; Nvidia is just hating, especially since they went over the line by no longer allowing an Nvidia card to run PhysX alongside AMD cards.

You get my point
Quote:


> Originally Posted by *Death Saved*
> 
> Does the R9 290X have HDMI 2.0?


Not confirmed, but it will have the pixel clock block removed, which could already be done with ToastyX's pixel clock patcher.
Anyhow, HDMI 2.0 still uses 3 links just like HDMI 1.4, so I think it will work just fine with older cards. (I'm currently trying to get my hands on the PCB used in the Panasonic TV, but to no avail.)


----------



## Moragg

Quote:


> Originally Posted by *Alatar*
> 
> What rumor did I dismiss....?
> 
> I just asked to be quoted the specific post somewhere (about clocks) or given the benches some people were talking about.
> 
> However yes I do dismiss PR talk. Works for all companies though. Never trust AMD when it comes to AMD products, NV when it comes to NV products etc.
> 
> E: also I don't know how you find my views and reasoning funny when you yourself apparently hold the same views (post above this one)


About the 1300MHz quote - would you mind pointing me to the 1400MHz-on-LN2 source? IIRC it was the same guy who said 1300MHz on air.


----------



## infranoia

Quote:


> Originally Posted by *Kane2207*
> 
> I can't help but feel that if AMD could thoroughly trounce Titan, there would have been benches and leaked slides all over the place by now.
> 
> Can anyone point me in the direction of these benches or AMD marketing slides?


AMD is keeping a tight ship. Speculation is fun, I guess, but nearly every post in this thread becomes irrelevant in T-minus-36 hours or so.


----------



## Kane2207

Quote:


> Originally Posted by *infranoia*
> 
> AMD is keeping a tight ship. Speculation is fun, I guess, but nearly every post in this thread becomes irrelevant in T-minus-36 hours or so.


Keeping a tight ship isn't gaining them market share or equating to pre-orders.

If they had something, they'd be singing it from the rooftops.


----------



## Regent Square

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I don't think anyone is foolish enough to believe that the 290X will trounce the Titan (except maybe in BF4 when Mantle is released). I think it will win some and lose some against Titan but I do believe it will be faster than the 780...
> 
> *EDIT - I also don't believe it will be able to compete with a overvolted Titan over 1300MHz but we will see soon enough*...


How long did it take to make this hack for the Titan? Exactly...

Wait till the R9 290X has it and we will see.


----------



## infranoia

Quote:


> Originally Posted by *Kane2207*
> 
> Keeping a tight ship isn't gaining them market share or equating to pre-orders.


And you know this how?
I see a lot of buzz out there. I had a few carts filled with 780s a few weeks back that I deleted because of the rumors. So there's one, at least.


----------



## maarten12100

Quote:


> Originally Posted by *Regent Square*
> 
> How long it took to make this hack to titan? Exactly..
> 
> Wait till r9 290x has it and we will see.


1200MHz after 2 weeks, but it would be locked at a higher voltage, which sucks.
2 months after launch, a BIOS that throttles down while idle came out, made by a TI mod.
5 months in (I don't know exactly; I left the scene after I couldn't get myself a Titan), a voltage soft-unlock came out.


----------



## Testier

Quote:


> Originally Posted by *maarten12100*
> 
> If it is slower but keeps the same price that is actually bad for competition even if you want a Titan.
> 
> Not sure if you're serious about the fur tech though (might be nice for the Witcher 3 though it should be universal code instead of non ported proprietary crap you need a license for)
> It is proven to run on AMD GPU's with very minimal code changes Nvidia is just hating especially since they went over the line by no longer allowing having a Nvidia card running next to AMD cards for PhysX.


I am very serious about fur tech for The Witcher 3. I do not care whether it is Nvidia's decision to lock it down or not, nor do I care about competition. I simply care about what I want, though I am still gonna see if Nvidia does a price drop on the Titan either way. Admittedly, Nvidia is a prick, and I would not be happy to see them lock it down, but there's nothing I can do.


----------



## Kane2207

Quote:


> Originally Posted by *infranoia*
> 
> And you know this how?
> I see a lot of buzz out there. I had a few carts filled with 780s a few weeks back that I deleted because of the rumors. So there's one, at least.


Anyone pre-ordering an unpriced product with no benchmarks released really should have their head checked, especially when some sites are charging extortionate cancellation fees for those pre-orders.

The only 'buzz' I keep hearing is Mantle - which sounds great and could be promising, but also hasn't been benched or proven, and after all, this is a company that should have delivered a frame pacing fix about June time but have now said DX9, Eyefinity and 4k fixes will be part of 'Phase 2'. Always waiting for something it seems...

*edit* - Grammar...


----------



## Alatar

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> But always trust NV when it comes to AMD products, right Al?


Please do point out where I've said this...
Quote:


> Originally Posted by *Moragg*
> 
> About the 1300MHz quote - would you mind pointing me to the 1400MHz on LN2 source, iirc it was the same guy who said 1300MHz on air.


http://www.xtremesystems.org/forums/showthread.php?286994-Amd-Volcanic-Islands-details&p=5209441&viewfull=1#post5209441

And the only quotes there from supposed card owners that hint at normal OCing potential are these:
Quote:


> Don't hold your breath for a 290X OC...


Quote:


> overclocking of hawaii is not too good like Tahiti! Max Tahiti was around 1250 MHz, Hawaii can be rock stable clocked aroun 1100/1150 MHz and mem around 6 GHz maximaly


----------



## infranoia

Quote:


> Originally Posted by *Kane2207*
> 
> Anyone pre-ordering an unpriced product with no benchmarks released really should have their head checked, especially when some sites are charging extortionate cancellation fees for those pre-orders.


So your proof that AMD's prerelease strategy is failing is... your opinion? I'd need more than that.

Never underestimate a successful GPU launch. R300 blew everyone's doors off and redefined the industry. Several recent Nvidia SKUs have been unobtainable for months after their hard launch. IMHO this is not that, but it's premature to judge right now.


----------



## maarten12100

Quote:


> Originally Posted by *Testier*
> 
> I am very serious for fur tech for witcher 3. I do not care whether it is nvidia's decision to lock it down or not. Nor do I care about competition. I simply care about what I want, though I am still gonna see if nvidia do a price drop on the titan either way. Though admitting, nvidia is a prick, and I would not be happy to see them lock it down, but nothing I can do.


Nvidia has less headroom for a price drop than AMD does, since they have a larger die, but it wouldn't surprise me if prices went down about 100 dollars on 780 reference models (non-reference models will keep the same price, as they usually do).
I can't really speculate about the Titan, since it is not only for the gaming market but is also the GF110 Fermi replacement for CUDA work; I would say a 200 dollar drop at best.


----------



## Regent Square

Quote:


> Originally Posted by *maarten12100*
> 
> 1200MHz after 2 weeks but it would be locked at a higher voltage which sucks.
> 2 months after a bios that throttles down while in idle came out made by a TI mod
> 5 months (don't know exactly have left the scene after I couldn't get myself a Titan) a voltage soft unlock came out.


5 months. Now he says he will compare the two on day one. See the point?! NV fanboys are absurd...


----------



## Liranan

Quote:


> Originally Posted by *Kane2207*
> 
> Keeping a tight ship isn't gaining them market share or equating to pre-orders.
> 
> If they had something, they'd be singing it from the rooftops.


You mean like how nVidia keeps a tight lid on data until release? The only website that keeps releasing numbers is Chiphell, and they never release all they have, just a little at a time, which could be a marketing ploy, as these companies keep giving them hardware despite their "violations" of NDA.


----------



## Regent Square

Quote:


> Originally Posted by *Alatar*
> 
> Please do point out where I've said this...
> http://www.xtremesystems.org/forums/showthread.php?286994-Amd-Volcanic-Islands-details&p=5209441&viewfull=1#post5209441
> 
> And the only quotes there from supposed card owners that hint at normal OCing potential are these:


And they are the ones to trust? Please give me a reliable source that says AMD's 290X is not a good OCer.


----------



## Alatar

Quote:


> Originally Posted by *Regent Square*
> 
> 5 months. Now he says he will compare the 2 on 1 day. See the point?! NV fanboys are absurd...


Like it or not, those Titans are what the 290X will have to compete against if it wants to claim the single GPU crown. One of the advantages of launching 8 months before the competition. That's an even bigger gap than the one between Cypress and GF100.

Also, unless AMD locks down the TDP limit and voltage extremely aggressively, you won't need any similar fixes for the 290X, so I don't understand what waiting would really do...
Quote:


> Originally Posted by *Regent Square*
> 
> And they are the once to trust? Please give me a reliable source where it says amd` 290x is not a good oce`r.


Please give me a credible source that says it will match a Titan.

We don't have any. Rumors only, however as I've pointed out those are pretty much the only OCing rumors we have. So until we get something better I'll speculate based on those.


----------



## dawn1980

The R9 290X will beat a GTX 780 and be close to a Titan... but I can't trust AMD due to their drivers. I've owned both companies' cards for testing and benchmarking, but for my own personal gaming rig I stick with the solid drivers NV puts out. I remember my 6870 CF rig constantly freezing up and blue-screening on certain games, but after switching to NV all is well. If AMD gets better with their drivers and this Mantle takes off, then maybe I will come back, because I know the performance per dollar will always be there over NV, but better drivers.... NO!! I will stick with my EVGA 780 ACXs that overclock to Titan performance thanks to my water-cooled rig... IT'S A STABLE GREEN MACHINE


----------



## mboner1

Quote:


> Originally Posted by *Alatar*
> 
> Like it or not, those Titans are what the 290X will have to compete against
> .


If it performs somewhere between the 780 and the Titan and is priced the same as the 780 (which it is going to be, isn't it?), then I believe it only has to compete with the 780... and will win.

I'm holding off on grabbing a 780 for this very reason. The Titan is overpriced and I would be embarrassed to own it; the 780 would also make me feel a little dirty, like I'm being played for a fool by Nvidia. If the 290X meets my above criteria, I will be grabbing one for sure.


----------



## Blackops_2

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Haha, it is funny to me that Alatar dismisses rumors and claims that he doesn't like as unsubstantiated and unconfirmed yet is more than happy to quote rumors all day long that support his views. I personally think AMD would be crazy to release a card like the 290X that DIDN'T at least match Titan considering how late they are and how much hype they have created but then again, this is the same company that released BD....


Agreed, though I don't compare Bulldozer to this product. This product is, at the very least, an improved implementation of GCN 1.0, which was/is a success; it would be something spectacular to see them take a step backwards with Hawaii the way they sort of did with Bulldozer.

As you said, it'd be a waste of time to release a product seven months later that didn't at least match Titan. Given the specs, I'm expecting it to. Pricing is the real question. I want a 290X for $500 and a 290 for $400.


----------



## Moragg

Quote:


> Originally Posted by *Alatar*
> 
> http://www.xtremesystems.org/forums/showthread.php?286994-Amd-Volcanic-Islands-details&p=5209441&viewfull=1#post5209441


Read a few pages around that, you'll find quite a few posts which say the 290X is better. Specifically on here:
http://www.xtremesystems.org/forums/showthread.php?286994-Amd-Volcanic-Islands-details/page28

he says
Quote:


> I'm not going to give numbers, but I'll say that it's all over for the TITAN until LN2 is used, at which point the roles are reversed. LN2 is kinda wasted on the card.


----------



## maarten12100

Quote:


> Originally Posted by *dawn1980*
> 
> the r9 290x will beat a gtx 780 and be close to a titan...but I cant trust amd due to there drivers. I've owned both company cards for testing and benchmarking but for my own personal gaming rig I stick with the solid drivers NV puts out. I remember my 6870 cf rig constant freeze up and blue screens on certain games but switch to NV all is well. If amd gets better with there drivers and this mantle takes off than maybe I will come back because I know the performance per dollar will always ne there over NV but better drivers....NO!! I will stick with my evga 780's acx that overclock titan performance thanks to my water cooled rig....ITS A STABLE GREEN MACHINE


They certainly have gotten better, actually on par with Nvidia by now (due to them taking their time instead of doing a release every month).
The look of the control panel still sucks compared to the tidy NV control panel, but you ain't gonna spend a lot of time there anyway.


----------



## Alatar

Quote:


> Originally Posted by *Moragg*
> 
> Read a few pages around that, you'll find quite a few posts which say the 290X is better. Specifically on here:
> http://www.xtremesystems.org/forums/showthread.php?286994-Amd-Volcanic-Islands-details/page28
> 
> he says


That was way before his other posts that said that once he did some more tests the averages showed the card being slower than a Titan at stock.

He also said he spoke too quickly and that it isn't as good as it originally seemed from his tests (referring to the post you linked me to).

Also way before any of the OCing potential talk.


----------



## Fniz92

Kyle_Bennett says the card is faster than the Titan, yet Alatar still says he doesn't believe it?
I believe we are sticking to the green side, are we not?

Alatar, releasing a $600-650 GPU that is equal to or better than a $1000 GPU is what it's about; I'm not sure why you're comparing it to an overclocked Titan. The R9 280X has cards unlocked to 1.4V, so why would anything stop AMD from listening to the overclocking community for the R9 290X?


----------



## Kane2207

Quote:


> Originally Posted by *Fniz92*
> 
> Kyle_Bennett says the card is faster than the titan yet alatar still says he don't believe it?
> I believe we are sticking to the green side are we not?
> 
> Alatar, releasing a 600-650 $ GPU that is equal or better than a 1000 $ GPU is whats it's about, not sure why you're comparing it to a overclocked Titan. R9-280X has unlocked 1.4v cards, so why would that stop AMD from listening to the overclocking community for the r9-290x?


Seeing as you already claim to have a 290X in your sig, why don't you bench it and tell us whether you agree with Kyle or not?


----------



## maarten12100

Quote:


> Originally Posted by *Kane2207*
> 
> Seeing as you already claim to have a 290X in your sig, why don't you bench it and tell us whether you agree with Kyle or not?


Because he is under NDA, of course.

Gotta hurt to call someone out when, even if he did have the card, it would be obvious he couldn't bench it without getting into trouble.


----------



## Kuivamaa

From what I've gathered, people with access to these cards say they will trade blows. Benching 5 pro-AMD games out of 7 doesn't mean the 290X will be faster just because it averages better, and the opposite also applies: I could make a test using WoW, Diablo III, and AC III and find that the 770 ~ 290X, or bench DiRT Showdown, Company of Heroes 2, etc. and claim the plain 290 beats Titan. Unless one is consistently faster than the other, like the 780 is vs. the 7970, you just see what you wanna play and buy accordingly.


----------



## Fniz92

Quote:


> Originally Posted by *Kane2207*
> 
> Seeing as you already claim to have a 290X in your sig, why don't you bench it and tell us whether you agree with Kyle or not?


Let's just wait a little longer, it's gonna be worth it


----------



## numero-uno

Quote:


> Originally Posted by *Kane2207*
> 
> Seeing as you already claim to have a 290X in your sig, why don't you bench it and tell us whether you agree with Kyle or not?


Lol.

That is quality!


----------



## Moragg

Quote:


> Originally Posted by *Alatar*
> 
> That was way before his other posts that said that once he did some more tests the averages showed the card being slower than a Titan at stock.
> 
> He also said he spoke too quickly and that it isn't as good as it originally seemed from his tests (referring to the post you linked me to).
> 
> Also way before any of the OCing potential talk.


Just read that far ahead, shame about that. Here's hoping to Mantle and better drivers.


----------



## zealord

Hmm, I don't know if 4GB of RAM is enough for 4K atm. That one guy needed 5.1GB of VRAM for BF4 in 4K. My next monitor is very likely going to be 4K, and I am looking for a rather powerful card that can properly drive 4K (WHEN IT'S TIME). But I think the GPU I buy next will still be in my rig when I buy a 4K monitor, so I just can't justify dropping so much money on a 4GB GPU that I would need to swap out when I buy a 4K display.
I think I am gonna lay low until proper 20nm GPUs are out.


----------



## maarten12100

Quote:


> Originally Posted by *zealord*
> 
> Hmm don't know if 4GB ram is enough for 4K atm. That one guys needed 5.1 GB Ram for BF4 in 4K. Since my next monitor is very likely going to be 4K and I am looking for a rather powerful card that can properly display 4K (WHEN ITS TIME). But I think the next GPU I buy is still in my rig, when I am buying a 4K monitor so I just can't justify dropping so much money on a 4GB GPU that I would need to swap out when I am buying a 4K display.
> I think I am gonna lay low until proper 20nm GPUs are out.


I'm running it on 1.25GB, so yes, 4GB will do fine in 99% of cases, and where it doesn't, the card will make up for it with raw power (it has a gigantic edge over the Titan at high resolutions, judging by the specs).


----------



## fateswarm

Quote:


> Originally Posted by *fleetfeather*
> 
> PM'd
> 
> Edit: stuff it, I see who I'm trying to rationalise with and realise such an attempt would fall on deaf ears.


Are you talking to the mirror? I suggest you go learn what the scientific method is. You do not attack a person for changing their opinion based on new data, and you do not attack them for basing their opinion on the highest probability. (In this case it was exceptionally ridiculous, because I didn't even change my opinion; I initially said what I think is most probable, and then just pointed out to someone else that there is still a small probability of something else.)


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Alatar*
> 
> *That was way before his other posts that said that once he did some more tests the averages showed the card being slower than a Titan at stock.*
> 
> He also said he spoke too quickly and that it isn't as good as it originally seemed from his tests (referring to the post you linked me to).
> 
> Also way before any of the OCing potential talk.


Lol, you mean that was before NV got to him with some $$ and/or free hardware...

Just being silly...


----------



## szeged

Quote:


> Originally Posted by *Regent Square*
> 
> U have a link to Time To Love Customs` channel where he should be showing a review of a card, like he did with 780 ?!


TTL: while I like his personality and he does do some good hardware reviews (cases, fans, etc.), I will never trust reviewers like him on the potential of a graphics card or CPU.

Quote:


> Originally Posted by *kingduqc*
> 
> Nvidia fanboys will argue that the titan mac OC is 4% faster so it actually worth 400$ more..


I own many Titans; I'll go ahead and tell you that if you demand the absolute best performance, and are willing to pay the price, then it's worth more to those people. Everyday people who want the best, but don't want to pay for the best, should go ahead and get a 780 or 290 or 290X (depending on how good the 290 and 290X are and how they are priced).

Quote:


> Originally Posted by *Regent Square*
> 
> They will call it "ridiculing"


Just like how, if the 290X ends up beating the Titan by even 1%, it'll be "ridiculing"; that frame of thought goes both ways.

No matter who wins, even if it's by the slightest amount, it will "ridicule" the other card, and fanboys on both sides will go apesh... for weeks.

I'm actually gonna miss all this endless banter going back and forth once the actual results are out and the hype dies down in a few months; it is certainly interesting to read both "sides'" views on the upcoming cards and what they will do to the current cards.


----------



## Forceman

Quote:


> Originally Posted by *Fniz92*
> 
> Kyle_Bennett says the card is faster than the titan yet alatar still says he don't believe it?


Actually Kyle didn't say that. His full quote was "AMD sez ...".


----------



## mcg75

Quote:


> Originally Posted by *infranoia*
> 
> 1. The graph was an argument for AMD's superior scaling. I'm not sure why you're calling out Catalyst improvements since then, as the data shows Tahiti's superior scaling in a number of recent games.
> 2. Part of TWIMTBP in the Batman case was just a series of IF AMD / THEN SKIP statements, which might explain why Batman is alone in showing superior SLI scaling. My only point there.
> http://www.ngohq.com/graphic-cards/16223-nvidia-disables-physx-when-ati-card-is-present.html
> http://forums.techgage.com/showthread.php?t=5313
> 
> Necroedit: One more source showing AMD's superior PCI-E scaling efficiency across a huge set of tested 2012 games, which may account for the Crossfire edge:
> http://www.techpowerup.com/reviews/Intel/Ivy_Bridge_PCI-Express_Scaling/23.html
> 
> I've come to understand that it is often the game engine that determines how efficient the scaling is, but there are some broad statistics that show Crossfire slightly more efficient overall. Not enough to get our panties in a wad though, so I'm bailing out of this burning wreck a second time.


I wasn't interested in the comparison results. It's pretty clear to everyone by now that the AMD cards do scale better. I don't even know anyone who would argue otherwise.

But the whole point of my post is what you claimed was "shenanigans" by Nvidia to make Batman not scale well on purpose was completely disproven by the fact that game patches and several catalyst updates have been done to improve it. AMD and Nvidia have both played this game before but before condemning either entity, proof is needed. In this case, you have no facts to back up your claim.


----------



## infranoia

Quote:


> Originally Posted by *mcg75*
> 
> I wasn't interested in the comparison results. It's pretty clear to everyone by now that the AMD cards do scale better. I don't even know anyone who would argue otherwise.
> 
> But the whole point of my post is what you claimed was "shenanigans" by Nvidia to make Batman not scale well on purpose was completely disproven by the fact that game patches and several catalyst updates have been done to improve it. AMD and Nvidia have both played this game before but before condemning either entity, proof is needed. In this case, you have no facts to back up your claim.


My response was to someone who absolutely doesn't believe Crossfire scaling is more efficient. So that's the context.

Re: Batman, I can keep tossing links at you to read. They are legion. But you're right, everything was hearsay, and didn't address Crossfire performance itself, despite the clearly anomalous behavior-- just like Nvidia's recent OriginPC incident.

EDIT: Not hearsay, per my second link, Eidos admitted they were locking down vendor IDs per Nvidia legal, and removed the restriction later.

When you change your adapter ID and get an immediate feature and FPS boost, then I'm happy to stand by my "shenanigans" claim. I've already backed that up above. I'm simply explaining away the one anomaly in the posted data set to show Crossfire efficiency across the board.

But it sounds like whoever you are, you have a dog in that race. I can accept that. Happy to move on. It's old news, hashed and rehashed, and way off-topic.


----------



## SpacemanSpliff

Quote:


> Originally Posted by *szeged*
> 
> ...Im actually gonna miss all this endless banter going back and forth once the actual results are out and all the hype dies down in a few months, it is certainly interesting to read both "sides" view on the upcoming cards and what they will do to the current cards.


Not me... it'll be nice for those of us looking to get a great card or two for an upgrade or new build, because we WILL know the results, and we'll have seen several months of price battles (not to mention all those wonderful holiday sales pricing events). I'll probably be able to get two excellent GPUs that will destroy what my 5870 can do (and I'll be totally honest: if the 780 can give me 5-10% more performance when overclocked under water than the 290/290X, Team Green it is, as long as Nvidia caves and sets a price drop that makes it feasible), and for probably at least $250 less than initial pricing would have cost for the pair. That sounds like a win/win situation for those who have the patience to wait until the holidays.


----------



## GoldenTiger

Quote:


> Originally Posted by *Regent Square*
> 
> Why was he screaming about a card if he subs 25$ per month/ Plus, it said he was a forum moderator... (not sure if it is paid feature)


$25/year, and he is not a moderator if you go look at XS. He is a normal member/subscriber.

Not to discount his knowledge, of course, but it's a real distinction, and it carries its own connotation to say someone is a moderator of a highly technical forum when they're not.
Quote:


> Originally Posted by *raghu78*
> 
> http://hardforum.com/showthread.php?t=1782602&highlight=
> 
> this is the thread. draw your own inferences. anyway its just 18 hrs. quit squabbling. all will be settled


He said that AMD said it, not that HE is saying it himself. There's a definite distinction between saying you have personally tested something and know it is faster, versus saying an AMD marketer told you so. I'm not sure why you link this considering I posted in that very same thread you linked, unless you're just trying to troll.


----------



## maarten12100

http://www.overclock.net/t/1432331/vc-amd-radeon-r9-290x-and-r9-290-european-pricing-unveiled/310#post_20978429

Not sure whether to be happy that AMD finds it worth the wait, or worried that there might not be a release on the 15th.


----------



## szeged

I was excited for the 15th NDA release, but after this I'm debating not even getting the 290X anymore. Yeah, I want one, but I don't want to give AMD money after all the delays and teasing.


----------



## GoldenTiger

Quote:


> Originally Posted by *maarten12100*
> 
> http://www.overclock.net/t/1432331/vc-amd-radeon-r9-290x-and-r9-290-european-pricing-unveiled/310#post_20978429
> 
> Not sure if happy that AMD finds it worth the wait or there might not be a release the 15th


Considering reviewers don't have cards for the most part, it's not a "might" at this point. They aren't holding back because they want to; rather, they're seeing what the market does and simply don't have it ready yet. Again, AMD and nVidia aren't benevolent charities...

http://videocardz.com/46729/amd-radeon-r9-290-series-launch-postponed


----------



## Forceman

Quote:


> Originally Posted by *szeged*
> 
> i was excited for the 15th nda release, but after this im just debating not even getting the 290x anymore, yeah i want one, but i dont want to give amd money after all the delays and teasing.


Worst part is that now all the speculative crap has to continue for an unknown amount of additional time.


----------



## jomama22

Quote:


> Originally Posted by *szeged*
> 
> i was excited for the 15th nda release, but after this im just debating not even getting the 290x anymore, yeah i want one, but i dont want to give amd money after all the delays and teasing.


Considering AMD only started talking about the 290X three weeks ago, I don't exactly put that on their shoulders. They haven't promised anything, and never spoke about true performance or price; that was everyone else.

The community has blown it up the way you describe, not really AMD.


----------



## szeged

The community blew it up, and now AMD continues to delay, making it worse. Both are at fault.


----------



## jomama22

Quote:


> Originally Posted by *szeged*
> 
> the community blew it up, and now amd continues to delay, making it worse. both at fault.


How did they delay it? Everyone expected a 290X release at the end of October, and the NDA lifts on the 15th... they never set a date of any sort. It's been all rumors.


----------



## Forceman

Quote:


> Originally Posted by *jomama22*
> 
> The community has blown it up the way you describe, not really amd.


Well, the GPU 14 event was pretty unusual, and responsible for a lot of the hype. Always plenty of hype for new releases though, nothing new there.


----------



## SKYMTL

How can something that wasn't even announced be delayed? It's not AMD's fault that all these rumors were taken as gospel truth.

Basically, there is no delay. AMD will launch it when they're good and ready.


----------



## maarten12100

I figure it will still launch in October, so I couldn't care less.
Though I was hoping for benchmarks (it took 2 weeks before the Titan was available here in NL).


----------



## Majin SSJ Eric

No kidding. Why on earth host that ridiculous hype bonanza in September then hold onto the cards until nearly November? Just a bunch of BS in my opinion...


----------



## szeged

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> No kidding. Why on earth host that ridiculous hype bonanza in September then hold onto the cards until nearly November? Just a bunch of BS in my opinion...


Yeah, if we don't get an NDA lift soon, I'm gonna go ahead and forget this launch is even happening.


----------



## maarten12100

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> No kidding. Why on earth host that ridiculous hype bonanza in September then hold onto the cards until nearly November? Just a bunch of BS in my opinion...


Why polish a game to perfection? Why look at your competition? AMD has their reasons, so I figure it'll be timed against Nvidia's "new" cards.

Does anybody else notice that the 290X and the 2900 XT were both hyped, but the 2900 XT flopped terribly due to being a bad design? If we were to see a 290XT, I would probably do a gallon-of-milk challenge for that again...


----------



## Majin SSJ Eric

It's made worse by the trolling posts from AMD employees on social media, yipping and yapping about how great the 290X is and how it's so worth the wait, yet they are apparently scared to death to let us actually see how fast it is.

I hear this nonsense about there being no delay because they never announced an actual release date, but that is pure bull. By that logic they could wait indefinitely to release it and still say, "Hey, we never actually gave you guys a release date..."

Give me a break...


----------



## infranoia

Mass confusion, @amd_roy at least is declaring ignorance, and he of all people should know of a delay. I'm sure the phone calls are flying if that's true.


----------



## PureBlackFire

Quote:


> Originally Posted by *SKYMTL*
> 
> How can something that wasn't even announced be delayed? It's not AMD's fault that all these rumors were taken as gospel truth.
> 
> Basically, there is no delay. AMD will launch it when they're good and ready.


According to this rumor it's the embargo date that's been delayed, not a launch we never knew the exact timing of. I believe you yourself agreed the embargo date was the 15th? What about it? Any word on when you can publish a review? Have you got a sample to review, for that matter?


----------



## Majin SSJ Eric

Yeah, I don't really have a problem with there not being any stock for sale on the 15th, I just want the stupid NDA to be over with...


----------



## Death Saved

Quote:


> Originally Posted by *PureBlackFire*
> 
> according to this rumor it's the embargo date that's been delayed, not a launch that we never knew the exact time of. I believe you yourself were in agreement that the embargo date was the 15th? what about it? any word on when you can publish a review? have you got a sample to review for that matter?


If it is indeed the embargo date that has been delayed, could that mean they are working on new drivers that would either solve a problem or give an increase in performance?


----------



## psyside




----------



## selk22

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *psyside*





Let's hope this is the case.


----------



## psyside

Someone with a Twitter account should ask him whether the reviews will be delayed, not the launch.


----------



## Majin SSJ Eric

They never really said the release date would be the 15th. I always thought that was just when the NDA lifted. That's what really aggravates me anyway. Why are they making us wait so long to find out what the precious reviewers have known since Hawaii?


----------



## selk22

Quote:


> Originally Posted by *psyside*
> 
> Someone with a Twitter account should ask him whether the reviews will be delayed, not the launch.


Yeah... Twitter...


----------



## szeged

yeah i can wait for the launch, i just wanted the NDA to be lifted this week. i guess its all in how you choose your wording with them.

"launch isnt delayed, nda wont be lifted till an hour before launch though" could be the entire thing he would have said if asked about delaying the nda.


----------



## Moragg

The teasing's gone on long enough. Right now we just want some benches


----------



## fleetfeather

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> They never really said the release date would be the 15th. I always thought that was just when the NDA lifted. That's what really aggravates me anyway. Why are they making us wait so long to find out what the precious reviewers have known since Hawaii?


Quote:


> All those who have expressed interest will be contacted by email once the cards arrive in mid October


Retailer in Aus

Sure, "mid October" != the 15th specifically, but it sure doesn't mean the end of Oct or beyond either

just sayin'


----------



## mcg75

Quote:


> Originally Posted by *infranoia*
> 
> My response was to someone who absolutely doesn't believe Crossfire scaling is more efficient. So that's the context.
> 
> Re: Batman, I can keep tossing links at you to read. They are legion. But you're right, everything was hearsay, and didn't address Crossfire performance itself, despite the clearly anomalous behavior-- just like Nvidia's recent OriginPC incident.
> 
> EDIT: Not hearsay, per my second link, Eidos admitted they were locking down vendor IDs per Nvidia legal, and removed the restriction later.
> 
> When you change your adapter ID and get an immediate feature and FPS boost, then I'm happy to stand by my "shenanigans" claim. I've already backed that up above. I'm simply explaining away the one anomaly in the posted data set to show Crossfire efficiency across the board.
> 
> But it sounds like whoever you are, you have a dog in that race. I can accept that. Happy to move on. It's old news, hashed and rehashed, and way off-topic.


There's only one problem with your hypothesis.

We were talking about crossfire scaling issues in Batman Arkham City and you keep providing links to Batman Arkham Asylum to back up your claims.

You can sit there and accuse me of whatever you want, but the reality is I just like to see things backed up with facts. To this point, you've provided none.

And no, it's not off topic. It's information for anyone who would want to Crossfire 290Xs. Sometimes it takes a little while to get Crossfire working, with both patches from the developer and Catalyst enhancements.


----------



## Forceman

Quote:


> Originally Posted by *psyside*
> 
> Someone with a Twitter account should ask him whether the reviews will be delayed, not the launch.


Yes, NDA does not have to be the same as launch. So both could very well be true (NDA is delayed, launch [meaning retail sales] is unchanged). But what everyone cares about right now is the reviews.


----------



## PureBlackFire

Quote:


> Originally Posted by *psyside*
> 
> Someone with a Twitter account should ask him whether the reviews will be delayed, not the launch.


Exactly. Somebody needs to ask the right questions.


----------



## infranoia

Quote:


> Originally Posted by *PureBlackFire*
> 
> exactly. somebody needs to ask the right questions.




AMD tends to clam up on Twitter if they don't like the question. We'll see.


----------



## psyside

Don't know who HE HATE ME is, but thanks.


----------



## infranoia

Quote:


> Originally Posted by *mcg75*
> 
> The OCN community deserves to have factual information to make their decisions. Don't know why you have such an issue with that.


*sigh* Fact: Crossfire shows a higher efficiency in several 2011/2012 games with the sole exception, at one moment in time, of Batman Arkham City.

Supposition never posed as fact: As a statistical anomaly, why would this be the case? Well, one suggestion (not fact, but suggestion) is that the game's codebase has had a storied and controversial history that penalized AMD (indeed, *a fact*), and that may have had-- *may have had*, not did have-- something to do with it.

Hope that's clear. That's it. That's all, and I've already abandoned the argument as pretty thin, especially since SLI vs. Crossfire continues to be a moving target and is affected by game engines in different ways. Not sure what more you want from me.

Apologies to all, I'll take any further stuff on this to PM.


----------



## szeged

Quote:


> Originally Posted by *infranoia*
> 
> 
> 
> AMD tends to clam up on Twitter if they don't like the question. We'll see.


Incoming no answer, because they knew what we meant in the first place.


----------



## maarten12100

Quote:


> Originally Posted by *fleetfeather*
> 
> Retailer in Aus
> 
> Sure, "mid October" != the 15th specifically, but it sure doesn't mean the end of Oct or beyond either
> 
> just sayin'


Cards arriving at the shop mid-October doesn't mean they can be sold at that moment in time.


----------



## mcg75

Quote:


> Originally Posted by *infranoia*
> 
> *sigh* Fact: Crossfire shows a higher efficiency in several 2011/2012 games with the sole exception, at one moment in time, of Batman Arkham City.
> 
> Supposition never posed as fact: As a statistical anomaly, why would this be the case? Well, one suggestion (not fact, but suggestion) is that because that game's codebase has had a storied and controversial history that penalized AMD (indeed, *a fact*), and that may have had-- *may have had*, not did have-- something to do with it.


*Batman: Arkham City patched for AMD 7900 cards series users.*

http://forums.anandtech.com/showthread.php?t=2233816
Quote:


> Big patch for crossfire users, it is on steam now. This enhances crossfire performance in the game _a lot_, My performance went up literally by 4x with all GPU's showing 99% usage at all times.


It's obvious the testing was done before the patch. There simply were no shenanigans, just the developer needing time to fix it.


----------



## fleetfeather

Quote:


> Originally Posted by *maarten12100*
> 
> cards arrive at shop mid October doesn't mean they can be sold at that moment in time


Correspondence via email with them says it does. Now I have to respond to you over two threads?


----------



## Stay Puft

BLT isn't getting any 290Xs till Halloween so I'm thinking no one will have them come the 15th


----------



## maarten12100

Quote:


> Originally Posted by *fleetfeather*
> 
> correspondence via email with them says it does. now I have to respond to you over 2 threads?


Don't you get it? If the cards arrive mid-October, that just means they won't ship until they get green-light status; emailing the people who pre-ordered means nothing. If they don't launch in October I'll be disappointed, though more at the rumours being inaccurate than at AMD.

We all figured 44/48 ROPs from the rumours, yet we were proven wrong. That's the thing with rumours: they don't have to have any place in reality.

Alatar hasn't been this wrong about the specs at a card launch in a long time; he said a 512-bit bus wasn't going to happen, 64 ROPs wouldn't happen, and probably a few other things. Depending on whether the tests run at high resolutions or just lame benchmarks like Firestrike, he either loses or wins 10 dollars.


----------



## Usario

Quote:


> Originally Posted by *Stay Puft*
> 
> BLT isn't getting any 290Xs till Halloween so I'm thinking no one will have them come the 15th


Since when have BLT's preorder pages been right about anything at all


----------



## Stay Puft

Quote:


> Originally Posted by *Usario*
> 
> Since when have BLT's preorder pages been right about anything at all


When haven't they? Their prices are normally high, and for the 290X this one is actually low. The in-stock dates have never really been the issue.


----------



## fleetfeather

Quote:


> Originally Posted by *maarten12100*
> 
> Don't you get it if the cards arrive mid October that means they won't ship until they get the greenlight status but they will email the people that did pre ordered it means nothing.
> If they don't launch in October I will be dissapointed but not at AMD more likely at the rumours being not accurate.
> 
> We all figured 44/48 ROPs from rumours yet we were proven wrong that is the thing with rumours they don't have to have a place in reality.
> 
> Alatar hasn't been so wrong about the specs in a long time at a card launch he said 512bit bus wasn't going to happen 64 ROPs wouldn't happen and probably a few other things.
> Depending on whether they run it on high resolutions in the tests or are just going to run lame benchmarks like firestrike he either loses or wins 10 dollar.


The store said they would contact people when the cards arrive. What do you think they'll say? "Hey there, the 290Xs have arrived in our warehouse. They'll be shipped at a later date. Bye!"

No. A store wouldn't email you to say they have items in stock which can't be shipped to you. That's what pre-order emails are for: stock which hasn't arrived yet but will later. This is not a pre-order. They explicitly state that the cards will arrive in mid-Oct.

Pre-order = not in stock yet = will be in stock soon = not what this is.

In-stock = arrived at store = in stock now = what this is.

Edit: this discussion is pointless. I'm going to bed.


----------



## Fniz92

Oh, am I the only one who just loves the AMD/Nvidia competition? Greatest thing ever for us customers, and the drama too.


----------



## selk22

Quote:


> Originally Posted by *Fniz92*
> 
> Oh, am I the only one who just loves the AMD/Nvidia competition? Greatest thing ever for us customers, and the drama too.


I really enjoy it. I hold no loyalties, and it's often amusing to watch those who do defend their side.


----------



## GraveDigger7878

Yeah, I just want prices to go down on both sides.


----------



## GoldenTiger

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> No kidding. Why on earth host that ridiculous hype bonanza in September then hold onto the cards until nearly November? Just a bunch of BS in my opinion...


Sorta like the framepacing drivers... "they're coming soon, well before July 31st!" and they didn't release them until a couple of days past their self-set deadline. (shrugs)
Quote:


> Originally Posted by *selk22*
> 
> I really enjoy it. I hold no loyalties and its often amusing to watch those who do defend his/her side.


Yeah, it's pretty humorous frankly to see how wound up people get over which card someone wants to buy or prefers. The only thing not amusing is self-admitted shills/employees trolling threads pretending they're normal end-users. There's one in particular here and at Hardforum who always referred to AMD as his employer, but if you bring that up nowadays... well, moderation incoming (heck, at that other site I got a nasty PM with an infraction attached, quite pathetic really). But hey, that's the "viral marketing" job... drum up hype for their company's products and run away, no disclosure given.


----------



## 2010rig

Quote:


> Originally Posted by *Fniz92*
> 
> Oh, am I the only one who just loves the AMD/Nvidia competition? Greatest thing ever for us customers, and the drama too.


What competition? It's been boring for the past 7 months, and who knows when AMD will release the cards that match performance already available.
Quote:


> Originally Posted by *GoldenTiger*
> 
> Sorta like the framepacing drivers... "they're coming soon, well before July 31st!" and they didn't release them until a couple of days past their self-set deadline. (shrugs)
> Yeah, it's pretty humorous frankly to see how wound up people get over which card someone wants to buy or prefers. The only thing not amusing is self-admitted shills/employees trolling threads pretending they're normal end-users. There's one in particular here and at Hardforum who always referred to AMD as his employer, but if you bring that up nowadays... well, moderation incoming (heck, at that other site I got a nasty PM with an infraction attached, quite pathetic really). But hey, that's the "viral marketing" job... drum up hype for their company's products and run away, no disclosure given.


Are you implying that he may work for AMD? Would make a lot of sense.


----------



## Fniz92

Quote:


> Originally Posted by *2010rig*
> 
> and who knows when AMD will release the cards that match performance already available.


What cards? There is only one GPU they are planning to counter: the GK110.
You don't know the performance nor the price/performance, so stating something like that makes no sense at all.
Also, seeing the title under your username makes me feel silly for responding.


----------



## 2010rig

Quote:


> Originally Posted by *Fniz92*
> 
> What cards? There is only 1 GPU they are planning to counter, that is the GK110.
> You don't know the performance nor the price/performance, so stating something like that makes no sense at all.
> Also seeing the title under your username makes me feel silly for responding


All I'm saying is that AMD cards aren't out yet, and there's no word of when they're coming out.


----------



## Majin SSJ Eric

Well, I think it's safe to assume it should be some time before BF4 comes out, but I'm more interested in when the NDA lifts...


----------



## Blackops_2

Quote:


> Originally Posted by *2010rig*
> 
> All I'm saying is that AMD cards aren't out yet, and there's no word of when they're coming out.


Tis true ^

I want some benchmarks Tuesday. Most will say it's not like many of us can get that agitated over this, but I've been waiting on their response since Titan came out.

Who's tired of waiting? > This guy.


----------



## Fniz92

We already know the release date.
Just dig in deeper and don't let the rumors go over your head.


----------



## Forceman

Quote:


> Originally Posted by *Fniz92*
> 
> We already know the release date.
> Just dig in deeper and don't let the rumors go over your head.


Really? What is it?


----------



## Seronx

Quote:


> Originally Posted by *Forceman*
> 
> Really? What is it?


October 29th.


----------



## Forceman

Quote:


> Originally Posted by *Seronx*
> 
> October 15th to October 31st.


Thanks for narrowing that down for us.


----------



## 2010rig

Quote:


> Originally Posted by *Forceman*
> 
> Thanks for narrowing that down for us.


That's a typical AMD response though.

It's not delayed....

Launch is at launch....

Drivers will be out in 2 months.

I could be at this for days.


----------



## Stay Puft

My guess is a paper launch on the 15th and retail availability the 31st


----------



## theilya

sold my 2 660tis for $400...........

debating on 2x 280x or 1 290.... arghh


----------



## jomama22

Quote:


> Originally Posted by *2010rig*
> 
> That's a typical AMD response though.
> 
> It's not delayed....
> 
> Launch is at launch....
> 
> Drivers will be out in 2 months.
> 
> I could be at this for days.


Well, considering they never said a launch date, it can't be delayed. I find it funny that merely because a company holds an event for a new generation of cards (only 3 weeks ago, mind you) it means they should have released them by now. Most companies have floor samples and showings months in advance of a product release. All these rumors are getting people confused about what AMD itself has said about the 290/X.

And can you blame a company for hyping a product on Twitter or by other means? I mean hell, the PS2/Xbox/PS3/Xbox 360 launches consisted of months of small teaser pictures and leaks here and there, a full-on hype-machine bonanza for months. It's been less than 3 weeks since the event and people want to jump down AMD's throat for not having a price or official benchmarks. And I can't remember the last time a GPU received more than a week's notice of an official release date. Titan/780, for example, were officially announced 2-3 days prior to the actual street date. So if you expect a hard date from AMD more than a few days out, you are expecting the unusual, not the norm.

In the end, the community and the rumors are the main contributors to the way many people feel about being tugged around, not AMD.


----------



## fateswarm

I suspect many of those promotional missteps are casualties of a press that doesn't care about GPUs. AMD is under the radar for 99% of the technology press (it's all Apple and Samsung and whatever these days). So we don't have major agencies "embedded" at AMD giving us reliable information; instead we have "journalist wannabe" Joes 90% of the time (not all are like that, very few have consistency), and 90% of what we get is "I heard it's like that" and "I heard it's like this." If we had big coverage we would have 600 more sources pressuring AMD and getting more reliable results early.

edit: It might work both ways, i.e. if the press doesn't care, it might mean people don't care. In turn it might mean AMD doesn't care to be that spectacularly accurate with their timing (which might be flawed as a strategy, but it might be a strategy that is followed, at least indirectly).


----------



## Superplush

Quote:


> Originally Posted by *fateswarm*
> 
> I suspect many of those promotional missteps are casualties of a press that doesn't care about GPUs. AMD is under the radar for 99% of the technology press (it's all Apple and Samsung and whatever these days). So we don't have major agencies "embedded" at AMD giving us reliable information; instead we have "journalist wannabe" Joes 90% of the time (not all are like that, very few have consistency), and 90% of what we get is "I heard it's like that" and "I heard it's like this." If we had big coverage we would have 600 more sources pressuring AMD and getting more reliable results early.
> 
> edit: It might work both ways, i.e. if the press doesn't care, it might mean people don't care. In turn it might mean AMD doesn't care to be that spectacularly accurate with their timing (which might be flawed as a strategy, but it might be a strategy that is followed, at least indirectly).


Very true, especially for the Nvidia news.

ATI has won contracts for all the next-gen consoles and pretty much dominates all but the enthusiast GPU price range (still a battle going on for "high-end" though), and yet Nvidia breaks wind and the press are all over it while AMD gets left out. It's not always strictly like that, but it generally feels that way, especially with the smear campaign Nvidia seems to be running lately.

I'm still trying to get my head around these new numbering systems. I hope it's all sorted out by the time I upgrade from my 6970 2GB.


----------



## Newbie2009

Logically, the date of the 15th was never correct in the first place, and the website, finding out it was wrong, says the cards are delayed.

OR AMD is holding the cards back because they want to lose more customers day by day in the high-end segment?

I wonder which it is.


----------



## Forceman

Wouldn't be the first time AMD changed the NDA date at the last minute. Same thing happened with the 7970, although that was earlier instead of later.
Quote:


> AMD originally told us that we'd have until January 9th to put together our review of the Radeon HD 7970. For a brand new GPU architecture, about three weeks with the card would be good enough to thoroughly investigate performance and deliver a complete review. Sometimes things are too good to be true. The 9th got pulled into the 22nd and three weeks turned into 6 days of testing.


----------



## raghu78

Quote:


> Originally Posted by *Stay Puft*
> 
> My guess is a paper launch on the 15th and retail availability the 31st


I don't think the gap will be that big. I'm thinking retail availability on Oct 18th or Oct 23rd; reviews might go up in the next 2-3 days. AMD would want to get it done before their earnings call on the evening of Oct 17th, so that if they do take back the GPU crown they can at least mention it on the call.

http://www.anandtech.com/show/7405/maingear-prepping-r9290x-desktop-systems

"*Interestingly, Maingear says both systems will ship on October 23 if ordered today. AMD has yet to announce a release date for the R9 290X, but perhaps this could be it.*"

I don't think AMD will want to push availability any further. They want to cash in on the BF4 sales, and for that AMD ideally needs to launch a week before BF4.


----------



## maarten12100

Quote:


> Originally Posted by *szeged*
> 
> way to make it obvious you just look at our rigs then see titan and assume all we care about is titans and nvidia.


Most kinda do. I mean, Titan owners got mad over an NDA date that wasn't the release date, and both were rumours.

AMD never stated a release date, but we expect it before BF4, or end of October.


----------



## szeged

Quote:


> Originally Posted by *maarten12100*
> 
> Most kinda do I mean Titan owners got mad for a NDA that wasn't the release date and both were rumours.
> 
> AMD never stated a release date but we expect before BF4 or end of October


I was upset the NDA was supposedly moved back because I've been waiting for it, lol. I'm a Titan owner wanting these cards to release because I wanna grab one to benchmark it, not because I lose sleep thinking a card would beat my current one, like most seem to think we do. If you're scared of your stuff getting outdated and beaten by better stuff, you're in the wrong hobby, lol.


----------



## vhsownsbeta

HIS 290x gets naked...

http://www.expreview.com/album/28801.html


----------



## vhsownsbeta

(nvm, just saw the other thread...)


----------



## Majin SSJ Eric

I've never seen an AMD die in that orientation before. They're usually diagonal, right? I guess because this is the biggest die they've used in years...


----------



## sugarhell

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I've never seen an AMD die in that orientation before. They're usually diagonal, right? I guess because this is the biggest die they've used in years...


The die is still small (compared to a full GK110).


----------



## Remij

Quote:


> Originally Posted by *szeged*
> 
> way to make it obvious you just look at our rigs then see titan and assume all we care about is titans and nvidia.


Indeed. I mean, if anything, I would think that people who have Titans are interested in ONE thing: having the best.


----------



## fateswarm

In general they can't afford many more delays. Reach Christmas and people will start thinking 20nm. Sure, that might still be at least 2 to 4 months from release, but many smart users with no immediate hardware needs will start holding back.


----------



## Fniz92

20nm is at least 6-10 months away, and those cards will be very expensive at launch.


----------



## fateswarm

Quote:


> Originally Posted by *Fniz92*
> 
> 20nm is atleast 6-10 months away, and they will be very expensive at launch.


Google the sources about Q1 '14. I can pull random numbers as well, but they need sources, e.g. "it's in 10 days," but nope, I have no sources; or "it's in H2 of '14," but nope, I played "economist" and didn't hear TSMC's official word.


----------



## Fniz92

Just use your head: who do you think is getting 20nm in Q1 2014? Nvidia? AMD? They're tiny underdogs compared to Apple and Samsung.


----------



## istudy92

Quote:


> Originally Posted by *theilya*
> 
> sold my 2 660tis for $400...........
> 
> debating on 2x 280x or 1 290.... arghh


You'd best go with one 290X; a single GPU is always better than dual. Plus, supposedly the 290X may be better than Titan.


----------



## Regent Square

Quote:


> Originally Posted by *Fniz92*
> 
> Just use your head, who do you think is getting the 20nm at Q1 2014? Nvidia? Amd? They are little tiny underdogs compared to apple and samsung.


Lol, Apple will be there first.

Nvidia/AMD will follow up. The best case is a June release of 20nm: one card with an insane price tag.


----------



## bencher

Quote:


> Originally Posted by *theilya*
> 
> sold my 2 660tis for $400...........
> 
> debating on 2x 280x or 1 290.... arghh


Wait for reviews of the 290x or 290.


----------



## theilya

Quote:


> Originally Posted by *istudy92*
> 
> You'd best go with one 290X; a single GPU is always better than dual. Plus, supposedly the 290X may be better than Titan.


Hopefully the 290X lives up to the expectations.
I plan to game BF4 mostly, so I'm looking for the BF4 bundle.

This would be my first "red" card after 10+ EVGAs.

I kept getting errors in BF4 with my SLI setup, but as soon as I disconnected one card, all the BF4 errors went away...


----------



## fateswarm

Quote:


> Originally Posted by *Fniz92*
> 
> Just use your head, who do you think is getting the 20nm at Q1 2014? Nvidia? Amd? They are little tiny underdogs compared to apple and samsung.


You seem to imply TSMC has a microwave oven that can only be used by one customer at a time for months. They said mass production is available by March. It's highly unlikely they'll be cutting only Apple products for months, and even if they give priority to Apple - which they will - it doesn't mean NVIDIA will be around begging for months for a turn.

edit: Let alone that Apple's volume is likely enormous compared to the GPUs, making the GPUs easy to fit in. And in general it makes the "mass production" part really "massive".

edit: Samsung fabs its own dies.


----------



## kingduqc

Quote:


> Originally Posted by *Regent Square*
> 
> Lol, apple will be there 1st
> 
> Nvidia/AMD will follow up. The best situation will be June release of 20nm, like 1 card with insane price tag.


Yeah, can't wait to see Titan 2.0 on 20nm... $1500, maybe $2000?

edit: quoted the wrong guy...


----------



## criminal

Quote:


> Originally Posted by *kingduqc*
> 
> Yeah, can't wait to see titan 2.0 on 20nm... 1500$ maybe 2000$?
> 
> edit: quoted the wrong guys...


Will never happen. And unless Nvidia is happy with where the current Titan stands once the 290X is released, we might just get a Titan 2.0 on 28nm rather than waiting for 20nm to be available.


----------



## fateswarm

There's no way it's going to be called Titan 2.0 or ultra or whatever. It's going to have its own unique glorious names. GTX Goliath.

edit: No wait, that's racist.


----------



## Stay Puft

Titan Ultra = T2
290X = John Connor


----------



## Roaches

They'll probably call it GTX Atlas or something, since the fully enabled GK110 is the GK180.
I do agree Titan Ultra is a silly successor name...


----------



## GraveDigger7878

I do not think they are going to make another Titan style card for awhile. Probably just a 880 or something.


----------



## maarten12100

Quote:


> Originally Posted by *Roaches*
> 
> They'll probably call it GTX Atlas or something since the fully enabled GK-110 is the GK-180
> I do agree Titan Ultra is a silly successor name...


Nvidia, please tell me more about how GK180 isn't an unlocked GK110.

Guess it isn't, though, as they already used the unlocked GK110 in a Quadro product; wonder what they did.


----------



## DzillaXx

Quote:


> Originally Posted by *maarten12100*
> 
> Nvidia please tell me more about how GK180 isn't a unlocked GK110.
> 
> Guess it isn't though as they already used unlocked GK110 in a Quadro product wonder what they did.


I think GK180 is just minor revisions to improve compute slightly.
Nothing useful for gamers.


----------



## SandGlass

Maxwell is coming in Q4 2014; I'm personally not waiting for it.


----------



## Stay Puft

Quote:


> Originally Posted by *SandGlass*
> 
> Maxwell is coming in Q4 2013, I'm personally not waiting for it.


Maxwell won't be here till at least June next year.


----------



## Tobiman

Quote:


> Originally Posted by *Roaches*
> 
> They'll probably call it GTX Atlas or something since the fully enabled GK-110 is the GK-180
> I do agree Titan Ultra is a silly successor name...


Nvidia will probably call it Titan TI and if AMD responds with a GHZ edition then we'll see the Titan TI Boost edition. LoLz


----------



## SandGlass

Semiaccurate reported that Maxwell would be coming in Q4 2014, contrary to earlier reports that it would come in Q1 2014. Of course Nvidia hasn't publicly announced this, so take it with a grain of salt. But they seem to have a tendency to delay things now; Denver was supposed to be out by Q1 2013, and we still don't have it in Q4 2013. That's a pretty big delay.


----------



## Nonehxc

Quote:


> Originally Posted by *Roaches*
> 
> They'll probably call it GTX Atlas or something since the fully enabled GK-110 is the GK-180
> I do agree Titan Ultra is a silly successor name...


*GTX Chronos. Or GTX Aeon*

Mythologically...

Can't get better than Time, yo.

(Don't confuse with Cronus, Zeus' father and one of the first Titans, although the ancient Greeks somewhat merged them into one after some time... Father Time (Chronos) with the reaping scythe (Cronus).)


----------



## SandGlass

Oh yeah, I also posted the slides for the R9 290X from VC, link here.
So far I think this is the only PCIe 3.0 x16 device that can saturate the connection that is not SATA/SAS related...
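For anyone wanting to sanity-check that claim, here's a back-of-the-envelope calculation; a rough sketch only, assuming PCIe 3.0's standard 8 GT/s per lane with 128b/130b encoding and the rumoured 512-bit / 5.0 GHz effective GDDR5 figures from earlier in the thread:

```python
# Back-of-the-envelope PCIe 3.0 x16 bandwidth, to put the "saturate" claim in context.
GT_PER_S = 8.0          # PCIe 3.0: 8 gigatransfers/s per lane
ENCODING = 128 / 130    # 128b/130b line-encoding efficiency
LANES = 16

# Effective one-direction slot bandwidth in GB/s (1 transfer = 1 bit per lane)
pcie3_x16 = GT_PER_S * ENCODING * LANES / 8
print(f"PCIe 3.0 x16 ~= {pcie3_x16:.2f} GB/s per direction")  # ~15.75 GB/s

# Rumoured 290X local memory bandwidth: 512-bit bus at 5.0 GHz effective GDDR5
mem_bw = 512 / 8 * 5.0
print(f"290X memory ~= {mem_bw:.0f} GB/s")  # 320 GB/s
```

That puts the slot at roughly 15.75 GB/s per direction against ~320 GB/s of local memory bandwidth, so the GPU can easily outrun the x16 link; the unusual part is a non-storage workload actually pushing that much traffic over the bus.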


----------



## keikei

Quote:


> Originally Posted by *Nonehxc*
> 
> *GTX Chronos. Or GTX Aeon*
> 
> Mythologically...
> 
> Can't get better than Time, yo.
> 
> (Don't confuse with Cronus, Zeus' father and one of the first Titans. Although the ancient Greeks somewhat merged the two over time... Father Time (Chronos) with the reaping scythe (Cronus).)


Either way, the price will be beyond us mere humans.


----------



## Fniz92

Quote:


> Originally Posted by *keikei*
> 
> Either way, the price will be beyond us mere humans.


*rational humans.


----------



## scyy

Quote:


> Originally Posted by *szeged*
> 
> I was upset the NDA was supposedly moved back because I've been waiting for it, lol. I am a Titan owner wanting these cards to release because I want to grab one to benchmark it, not because I lose sleep thinking a card would beat my current card, like most seem to think we do. If you're scared of your stuff getting outdated and beaten by better stuff, you are in the wrong hobby lol.


Exactly. I don't get what's so hard to understand about that.


----------



## istudy92

Quote:


> Originally Posted by *GraveDigger7878*
> 
> I do not think they are going to make another Titan style card for awhile. Probably just a 880 or something.


or a 790.... lol


----------



## SoloCamo

Quote:


> Originally Posted by *Forceman*
> 
> Wouldn't be the first time AMD changed the NDA date at the last minute. Same thing happened with the 7970, although that was earlier instead of later.


Hey, if it's even remotely as great as the 7970 has been in its lifetime, then they can take all the time they want.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *SoloCamo*
> 
> Hey, if it's even remotely as great as the 7970 has been in its lifetime, then they can take all the time they want.


I agree the 7970 has been a great card for AMD (maybe one of their best ever), but it's time for a replacement...


----------



## fateswarm

Quote:


> Originally Posted by *Stay Puft*
> 
> Titan Ultra = T2
> 290X = John Connor


That means they would team up. Against what common enemy? Intel's Xeon Phi!

edit: The irony is that's exactly what's going on with the AMD/NVIDIA duopoly. Intel has superior transistor technology, but AMD/NV hold the GPU patents captive.

edit: Or Intel avoids destroying them so as not to be broken up under antitrust laws.


----------



## Stay Puft

Quote:


> Originally Posted by *fateswarm*
> 
> That means they would team up. Against what common enemy? Intel's Xeon Phi!
> 
> edit: The irony is that's exactly what's going on with the AMD/NVIDIA duopoly. Intel has superior transistor technology, but AMD/NV hold the GPU patents captive.
> 
> edit: Or Intel avoids destroying them so as not to be broken up under antitrust laws.


Could you imagine how amazing 3,000+ CUDA cores would be on Intel's 14nm process? It could probably be done on a ~200 mm² die.


----------



## fateswarm

Quote:


> Originally Posted by *Stay Puft*
> 
> Could you imagine how amazing 3,000+ CUDA cores would be on Intel's 14nm process? It could probably be done on a ~200 mm² die.


Even 22nm would be an impressive advantage over 28nm! Intel's 32nm was considered better than TSMC's 28nm.
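For rough intuition on why a node advantage matters this much: ideal transistor density scales with the inverse square of the feature size, so on paper even the 28nm-to-22nm jump is worth ~1.6x the density (real processes never hit the geometric ideal, so treat this as a back-of-the-envelope sketch, not fab data):

```python
def density_ratio(old_nm: float, new_nm: float) -> float:
    """Ideal (geometric) transistor-density gain when moving from one
    process node to another: density scales ~ 1 / (feature size)^2."""
    return (old_nm / new_nm) ** 2

print(f"28nm -> 22nm: {density_ratio(28, 22):.2f}x")  # ~1.62x
print(f"28nm -> 14nm: {density_ratio(28, 14):.2f}x")  # 4.00x
```

Which is why a GPU vendor on Intel's node would have roughly 1.6-4x the transistor budget at the same die size, at least in theory.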


----------



## Stay Puft

Quote:


> Originally Posted by *fateswarm*
> 
> Even 22nm would be an impressive advantage over 28nm! Intel's 32nm was considered better than TSMC's 28nm.


Intel needs to buy Nvidia.


----------



## szeged

Quote:


> Originally Posted by *Stay Puft*
> 
> Intel needs to buy Nvidia.


So we could get 5% performance bumps each time a new product comes out in the GPU segment as well? nty.


----------



## raghu78

The wait continues. Is the NDA going to be up in the next two days, before their Oct 17th earnings call, or is it the 24th? AMD is teasing us with this wait.


----------



## Stay Puft

Quote:


> Originally Posted by *szeged*
> 
> So we could get 5% performance bumps each time a new product comes out in the GPU segment as well? nty.


Thankfully Nvidia has a competitor in AMD, so that would never happen.


----------



## GraveDigger7878

Yawn. AMD will no-show.


----------



## raghu78

Quote:


> Originally Posted by *Stay Puft*
> 
> Thankfully Nvidia has a competitor in AMD, so that would never happen.


Nvidia with Intel's process node advantage would kill the competition. GPUs are completely different from CPUs: graphics is an extremely parallel problem, and by throwing more transistors at it you can scale performance quite linearly. If Nvidia had twice the transistor budget of AMD, it's game over for AMD.
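The "quite linearly" claim can be sanity-checked with Amdahl's law; a rough sketch with an assumed parallel fraction, purely illustrative and not a model of any real GPU:

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Amdahl's law: speedup with n parallel units when a fraction p
    of the workload is perfectly parallelizable."""
    return 1.0 / ((1.0 - p) + p / n)

# For a highly parallel workload (assume p = 0.98, typical of the
# embarrassingly parallel nature of rasterization), doubling the
# number of units nearly doubles throughput:
print(f"2x units: {amdahl_speedup(0.98, 2):.2f}x")  # ~1.96x
print(f"4x units: {amdahl_speedup(0.98, 4):.2f}x")  # ~3.77x
```

So as long as the serial fraction stays tiny, doubling the transistor budget (and thus the shader count) really does buy close to double the performance.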


----------



## fateswarm

Quote:


> Originally Posted by *szeged*
> 
> So we could get 5% performance bumps each time a new product comes out in the GPU segment as well? nty.


Lol, I agree a monopoly is hardly an improvement (also notice how Intel shrank their die sizes lately to leave headroom for improvement; they seem to be taking precautions for the impending end of the common silicon era).

What I see as the real problem is the patents held on x86/x86-64, making AMD and Intel effectively a duopoly (and the unhealthy effects are showing, e.g. Intel is almost a monopoly now because of that), and the GPU patents that I suppose NV and AMD hold captive in a duopoly (though I also suspect Intel may want to avoid destroying them in case it gets hit by antitrust laws).

Then again, there is another possibility: all of this being an American government ploy. The latest attack by Obama on Samsung made that possibility quite clear.


----------



## mingocr83

Quote:


> Originally Posted by *Fniz92*
> 
> 20nm is atleast 6-10 months away, and they will be very expensive at launch.


Indeed... 2H'14 for the 20nm node. They are at test runs right now; it takes a while to tune up all the machinery and processes. The 28nm node has been ready since December, this at TSMC, where NV makes all its cards and AMD some of them... we need to wait for GlobalFoundries... they should be ready by now, IIRC...


----------



## mingocr83

Quote:


> Originally Posted by *raghu78*
> 
> Nvidia with Intel's process node advantage would kill the competition. GPUs are completely different from CPUs: graphics is an extremely parallel problem, and by throwing more transistors at it you can scale performance quite linearly. If Nvidia had twice the transistor budget of AMD, it's game over for AMD.


Well, Intel, with the idea of expanding their business, will start to offer fab services soon. I don't recall if it was the 14nm node or the 20nm node, but for sure they are going to offer that service... TSMC and GlobalFoundries basically have the cake to themselves... Samsung on a smaller scale, mostly for their phone/memory business.


----------



## Forceman

Quote:


> Originally Posted by *raghu78*
> 
> The wait continues. Is the NDA going to be up in the next two days, before their Oct 17th earnings call, or is it the 24th? AMD is teasing us with this wait.


What happened to all your confident proclamations that it would be lifted on the 15th?


----------



## fateswarm

Quote:


> Originally Posted by *mingocr83*
> 
> Indeed... 2H'14 for the 20nm node. They are at test runs right now; it takes a while to tune up all the machinery and processes. The 28nm node has been ready since December, this at TSMC, where NV makes all its cards and AMD some of them... we need to wait for GlobalFoundries... they should be ready by now, IIRC...


I doubt AMD makes any GPUs at GF. The CPUs are at GF, and GF is likely behind TSMC in progress.

PS. It's very likely we'll see 20nm earlier than you think. They have a deal with Apple to have mass production ready by March, hence GPUs still have a chance to be here before June.


----------



## Seronx

GlobalFoundries is ahead of TSMC but is usually late to the production chain.


----------



## mingocr83

Quote:


> Originally Posted by *fateswarm*
> 
> I doubt AMD makes any GPUs at GF. The CPUs are at GF, and GF is likely behind TSMC in progress.
> 
> PS. It's very likely we'll see 20nm earlier than you think. They have a deal with Apple to have mass production ready by March, hence GPUs still have a chance to be here before June.


Correct, Apple has some business now with TSMC. The thing is that everyone has to join the conga line for manufacturing... bet you Apple bought thousands of wafers way before anyone else did...


----------



## raghu78

Quote:


> Originally Posted by *Forceman*
> 
> What happened to all your confident proclamations that it would be lifted on the 15th?


Videocardz was the one who started the 15th rumours, and now they have given up on predicting a launch date.

http://videocardz.com/46741/radeon-r9-290x-pictured-close

But right now it's most likely the 18th or the 24th.


----------



## Forceman

Quote:


> Originally Posted by *raghu78*
> 
> Videocardz was the one who started the 15th rumours, and now they have given up on predicting a launch date.
> 
> http://videocardz.com/46741/radeon-r9-290x-pictured-close
> 
> But right now it's most likely the 18th or the 24th.


You certainly embraced it:
Quote:


> Originally Posted by *raghu78*
> 
> wait till 15th. once reviews are out make your decision.


Quote:


> Originally Posted by *raghu78*
> 
> yeah as was earlier expected, Oct 15th will be the retail launch date and the day when reviews are up. AMD should not have had pre-orders for the BF4 Battlefield edition on Oct 3rd; they should have released the SKU on Oct 15th along with reviews, and then users would decide.


Quote:


> Originally Posted by *raghu78*
> 
> if you want a good resale value on the GTX 770, do it before Oct 15th.
> 
> that's the case with any price cuts; no GPU is exempt. oh, you mean like the people who bought a GTX 770 or GTX 780 this last month only to find out their card's price got cut after Oct 15th.


Quote:


> Originally Posted by *raghu78*
> 
> wait for Oct 15th and check out R9 290 reviews for perf and price.


----------



## Artikbot

Quote:


> Originally Posted by *Forceman*
> 
> You certainly embraced it:


As all of us did?


----------



## wermad

Ladies, let's quit the schoolgirl chatter. Some more pics (BF4 bundle) and some benchmark screenies:

http://www.overclock.net/t/1434320/tieba-xfx-radeon-r9-290x-battlefield-4-edition-unbox-photos-and-3d-mark-results-from-china


----------



## maarten12100

Quote:


> Originally Posted by *mingocr83*
> 
> Well Intel on the idea of expanding their business, they will start to offer fab services soon. I dont recall if it was 14nm node or 20nm node..but for sure they are going to offer that service...TSMC and Global Foundries have the cake basically for them...Samsung on a smaller scale...for only for their phone /memory business


They already do that for Altera, but it's overpriced; better to stay with TSMC from a price perspective.


----------



## mboner1

Quote:


> Originally Posted by *GraveDigger7878*
> 
> Yawn, AMD will no show


... And people wonder why we question the Titan owners' motives, lololol.

edit: Go play with your Titan and stop worrying about this snoozefest, then.


----------



## This calling

Quote:


> Originally Posted by *mboner1*
> 
> ... And people wonder why we question the Titan owners' motives, lololol.
> 
> edit: Go play with your Titan and stop worrying about this snoozefest, then.


Defensive much? AMD has an entire PR team; they don't need you to defend them as well =p


----------



## Sheyster

Quote:


> Originally Posted by *mboner1*
> 
> ... And people wonder why we question the Titan owners' motives, lololol.


This whole thread is 90% a troll fest. Ignore/report them...


----------



## fateswarm

Quote:


> Originally Posted by *Seronx*
> 
> GlobalFoundries is ahead of TSMC but are usually late to production chain.


Well, that means they are in effect behind, though, doesn't it? The main reason I see GF as behind is that TSMC recently announced they vetted most of their customers, and most stayed with them. Then again, their prices might have been better, or companies like NVIDIA also factor in the cost of switching manufacturers to begin with.


----------

