# [wccf] NVIDIA GeForce Titan Features 6 GB Memory – Review Samples Already Shipped



## rubicsphere

And so it begins...

Dang those look a little on the long side


----------



## Qu1ckset

Ugh, it's going to be hard not to replace my GTX 690 with this. I'm loving all that VRAM!


----------



## Interpolation

This is wonderful news. Thank you for sharing it with us Alatar.


----------



## Vlasov_581

Are those Quadros?


----------



## Jermasaurus

Those are MONSTROUS!!

...I love it!


----------



## DuckieHo

Single GPU....


----------



## rubicsphere

Please no Green PCB!!


----------



## Alatar

Quote:


> Originally Posted by *rubicsphere*
> 
> Please no Green PCB!!


The ones in the pic are Teslas; I'm sure the Titan will have a nice black PCB.


----------



## TAr

When are they going to release these? Anyone know?


----------



## th3illusiveman

So it is a single GPU... now to see if it wrecks the 690 like the rumors said it would.


----------



## TheGovernment

Sweet mother of god!!!


----------



## Booty Warrior

Quote:


> Originally Posted by *TAr*
> 
> When are they going to release these? Anyone know?


Supposedly late Feb. At $899.99


----------



## Bloodbath

I'll take three.


----------



## Roadkill95

I'm calling an Nvidia vs. AMD/ATI war.

Subbed.


----------



## youra6

Can anyone make out the green words on the side of the card?

Quote:


> Originally Posted by *Alatar*
> 
> The ones in the pic are Teslas; I'm sure the Titan will have a nice black PCB.


That sounded dirty...


----------



## rubicsphere

Quote:


> Originally Posted by *Booty Warrior*
> 
> Supposedly late Feb. At $899.99


If performance is where it's rumored to be (>690) in a single-GPU solution, it's worth every red cent.


----------



## youra6

Maxwell had better be leaps and bounds better than this; otherwise I might just cave and get the Titan... (if I'm lucky enough to get my hands on one)


----------



## Forsakenfire

My bank account will suffer for this later this year.


----------



## MKHunt

... tempting. This or 690 successor? Life is hard.


----------



## doomlord52

Quote:


> Originally Posted by *Forsakenfire*
> 
> My bank account will suffer for this later this year.


Unfortunately, yeah.


----------



## Alatar

No matter how much people want to get their hands on one of these I would still take everything with a grain of salt at this point.

As SKYMTL said on XS:
Quote:


> Originally Posted by *SKYMTL*
> Some sites are really blowing this all out of proportion with false information.....


----------



## Mad Pistol

The picture in the OP isn't actually them, btw. Notice how the cards each have 2 6-pin power connectors. Titan will need more power than that if the rumors are correct.

Also, +1 for everyone blowing this thing completely out of proportion.


----------



## Alatar

Quote:


> Originally Posted by *Mad Pistol*
> 
> The picture in the OP isn't actually them, btw. Notice how the cards each have 2 6-pin power connectors. Titan will need more power than that if the rumors are correct.
> 
> Also, +1 for everyone blowing this thing completely out of proportion.


It actually says they're K20X cards just under the pic. I added it because I knew people would get confused, but apparently no one reads.


----------



## Mad Pistol

Quote:


> Originally Posted by *Alatar*
> 
> It actually says they're K20X cards just under the pic. I added it because I knew people would get confused, but apparently no one reads.


Yep. I'm proof positive on that one.


----------



## rubicsphere

IDK, at almost 3000 CUDA cores performance has to be pretty dang good.


----------



## Mad Pistol

Quote:


> Originally Posted by *rubicsphere*
> 
> IDK, at almost 3000 CUDA cores performance has to be pretty dang good.


Guaranteed the clock speeds are going to be throttled due to TDP limitations.


----------



## motherpuncher

I'm just going to have to buy another 7970 and pretend this doesn't exist... my wife would freaking kill me. SOOOO tempting.


----------



## SageQi

TBH, if the rumors on the delay of the AMD 8000 and GTX 700 series are true, then I might just buy this card.

Here's to hoping that the Titan beats the 690


----------



## tsm106

Quote:


> Originally Posted by *Alatar*
> 
> 
> (pictured: two Tesla K20X cards)


Man, that is so retro with the green PCBs.


----------



## SinX7

This is AMAZING. Hope AMD comes out with something like this to compete! Gotta start saving!


----------



## mavere

235W TDP? Significantly less than a 690's TDP, while having a massive amount of transistors and supposedly better performance. All that without a process shrink...

Yeah, at least one of those pieces doesn't fit. I guess we'll find out soon enough though.


----------



## Alatar

Quote:


> Originally Posted by *mavere*
> 
> 235W TDP? Significantly less than a 690's TDP, while having a massive amount of transistors and supposedly better performance. All that without a process shrink...
> 
> Yeah, at least one of those pieces doesn't fit. I guess we'll find out soon enough though.


235W is the TDP for the Teslas. I'm pretty sure that if and when we get the card, it'll be with leakier dies and higher clocks, meaning higher power consumption and, most likely as a result, a higher TDP.

But you do have to remember that TDP =/= power consumption.


----------



## Dimaggio1103

Quote:


> Originally Posted by *Roadkill95*
> 
> I call Nvidia vs AMD/ATI war.
> 
> Subbed.


There is no war; Titan rocks any other single-GPU card. I highly doubt the 8xxx series will offer much of a challenge.


----------



## Cannon19932006

Quote:


> Originally Posted by *Dimaggio1103*
> 
> There is no war; Titan rocks any other single-GPU card. I highly doubt the 8xxx series will offer much of a challenge.


It's a little premature for that don't you think?


----------



## Vakten

Quote:


> Originally Posted by *Dimaggio1103*
> 
> There is no war; Titan rocks any other single-GPU card. I highly doubt the 8xxx series will offer much of a challenge.


And this is based on what proven evidence?


----------



## Alatar

Quote:


> Originally Posted by *Cannon19932006*
> 
> It's a little premature for that don't you think?


It's a pretty safe assumption considering that it's going to be the biggest chip on 28nm by far. NV and AMD aren't changing their architectures, so big changes won't be made; whatever replaces GK104 and Tahiti later this year just won't have the transistor budget for the improvements that even conservative speculation puts GK110 at.

Unless AMD pulls a miracle chip from somewhere, the assumption that GK110 will be the most powerful single GPU on 28nm is entirely reasonable.


----------



## twitchyzero

As soon as I saw WCCF I stopped reading.


----------



## Alatar

Quote:


> Originally Posted by *twitchyzero*
> 
> As soon as I saw WCCF I stopped reading.


The original sources aren't WCCF; they just put some stuff into a single article, which is why it was easy to post.


----------



## tsm106

Quote:


> Originally Posted by *Alatar*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Cannon19932006*
> 
> It's a little premature for that don't you think?
> 
> 
> 
> It's a pretty safe assumption considering that it's going to be the biggest chip on 28nm by far. NV and AMD aren't changing their architectures, so big changes won't be made; whatever replaces GK104 and Tahiti later this year just won't have the transistor budget for the improvements that even conservative speculation puts GK110 at.
> 
> Unless AMD pulls a miracle chip from somewhere, the assumption that GK110 will be the most powerful single GPU on 28nm is entirely reasonable.

It's odd that y'all are already crowning this card king before it even actually materializes. There's a descriptor for this type of thing, right?


----------



## ghostrider85

I hope it will fit in an SG08 case.


----------



## tpi2007

Quote:


> Originally Posted by *tsm106*
> 
> It's odd that y'all are already crowning this card king before it even actually materializes. There's a descriptor for this type of thing, right?


Yes, there is, it's called expectations of Titanic proportions. Let's just hope that the card can sail the seven seas and not sink enthusiasts' wet dream.

I can't possibly afford this, but it's nice to know it exists. I'll be waiting for the GTX 780 and GTX 770 with a 384-bit memory bus and 3 GB of RAM.


----------



## Vakten

Quote:


> Originally Posted by *Alatar*
> 
> It's a pretty safe assumption considering that it's going to be the biggest chip on 28nm by far. NV and AMD aren't changing their architectures, so big changes won't be made; whatever replaces GK104 and Tahiti later this year just won't have the transistor budget for the improvements that even conservative speculation puts GK110 at.
> 
> Unless AMD pulls a miracle chip from somewhere, the assumption that GK110 will be the most powerful single GPU on 28nm is entirely reasonable.


That's fine to say, but even then isn't everything still just a rumor about specs? I can't say I've seen anything solid on specs other than "Zomg it's gonna blow everything out of the water" lol


----------



## zinfinion

For the price of one of these I could get way more fun out of a racing wheel & pedal combo as well as a Warthog HOTAS and rudder pedals.


----------



## Alatar

Quote:


> Originally Posted by *tsm106*
> 
> It's odd that y'all are already crowning this card king before it even actually materializes. There's a descriptor for this type of thing, right?


As I said, it's a pretty safe *assumption*... I'm not 100% on anything yet.

It's a bit like assuming SB-E was going to be the most powerful CPU for a while. No one could be 100% sure but everyone knew it was a safe assumption due to the specs and so on...
Quote:


> Originally Posted by *Vakten*
> 
> That's fine to say, but even then isn't everything still just a rumor about specs? I can't say I've seen anything solid on specs other than "Zomg it's gonna blow everything out of the water" lol


We know the exact specs of the Teslas and GK110. If there's a GeForce card being released based on it (which looks pretty certain at this point, seeing how the rumor mill has kicked into gear and how some reviewers are claiming to have samples), then you don't really need an official press release for the specs, since you already have them.


----------



## emett

Quote:


> Originally Posted by *Bloodbath*
> 
> Ill take three


Be careful you may blow up your 1080p monitor.


----------



## rubicsphere

Quote:


> Originally Posted by *ghostrider85*
> 
> I hope it will fit in an SG08 case.


I'm hoping it'll fit in an SG05!


----------



## Alatar

Quote:


> Originally Posted by *rubicsphere*
> 
> I'm hoping it'll fit in an SG05!


NV reference designs aren't really all that long, but I'd be much more concerned about the kind of heat this thing is going to dump into the case, even though the exhaust will be through the PCI-E bracket...


----------



## tsm106

You know... the last time a company was that sure of something, it was a bona fide flop. Just saying. It's a bit early for the sure thing. That said, if true, it would be cool not to have to sport quad cards anymore.


----------



## Arni90

Quote:


> Originally Posted by *Alatar*
> 
> It's a pretty safe assumption considering that it's going to be the biggest chip on 28nm by far. NV and AMD aren't changing their architectures, so big changes won't be made; whatever replaces GK104 and Tahiti later this year just won't have the transistor budget for the improvements that even conservative speculation puts GK110 at.
> 
> Unless AMD pulls a miracle chip from somewhere, the assumption that GK110 will be the most powerful single GPU on 28nm is entirely reasonable.


Transistor count and chip size have little to do with actual performance. The most conservative estimate for GK110 has it clocking in at 732MHz with ROPs being the limit: that would give it 9% more performance than a GTX 680, or 30% more shading power, which in no way would make it impossible for AMD to beat.

The *most likely* estimate is that this Titan will clock in at around 850MHz with 14 SMXes and 6 ROP/L2/memory partitions; that would put the performance in the ballpark of 35-40% better than the 680, which *still* isn't out of reach if AMD has decided to make a monster as big as GK110, even though I really doubt it.
Quote:


> Originally Posted by *tsm106*
> 
> It's odd that y'all are already crowning this card king before it even actually materializes. There's a descriptor for this type of thing, right?


The term is fan, or fanatic if you prefer the full word.
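For what it's worth, the 35-40% ballpark above can be sanity-checked with back-of-envelope shader math. Here's a minimal sketch, using the rumored figures only (14 SMX × 192 = 2688 cores at an assumed ~850MHz, against the GTX 680's 1536 cores at 1006MHz base); raw throughput scaling is an upper bound, since games rarely scale purely with shader rate:

```python
# Relative raw shader throughput vs. a GTX 680 (1536 cores @ 1006 MHz base clock).
# Titan numbers here are rumored/assumed, not confirmed.
def rel_throughput(cores, mhz, base_cores=1536, base_mhz=1006):
    return cores * mhz / (base_cores * base_mhz)

print(round(rel_throughput(2688, 850), 2))  # ~1.48x raw shader throughput
```

So the theoretical ceiling is ~48%, and the 35-40% real-world guess just assumes imperfect scaling.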


----------



## Alatar

Quote:


> Originally Posted by *tsm106*
> 
> You know... *the last time a company was that sure of something, it was a bona fide flop.* Just saying. It's a bit early for the sure thing. That said, if true, it would be cool not to have to sport quad cards anymore.


I'm pretty sure lots of companies have been sure about their products after BD launched (and I'm sure AMD knew it sucked)... like Intel with SB-E, AMD with the 7970, etc. BD was also surrounded by rumors of bad performance and a low price.

Not that this has anything to do with companies anyway, there's never an official statement on anything when it comes to performance.

Are you seriously going to tell me that when we know the specs of GK110, we know there won't be any radical architecture changes for the next gen, and we know AMD's small-die strategy, the assumption that was made isn't a safe one?


----------



## Stay Puft

I WANT IT!!!!!

But I don't need it.

God I hate getting older. In my younger days I would have just whipped out the AE card.


----------



## Dimaggio1103

Quote:


> Originally Posted by *Vakten*
> 
> And this is based on what proven evidence?


Um, it should be fairly obvious to anyone with glasses and a basic understanding of GPUs.

Nvidia is placing a GK110 chip with 2688 stream processors, a 384-bit memory controller, and 6GB of GDDR5 memory, and calling it a GTX Titan. IIRC it is the Tesla GPU just tweaked for mainstream. So knowing what we know about the Tesla, we can safely guess at its performance.
Quote:


> Originally Posted by *tsm106*
> 
> It's odd that y'all are already crowning this card king before it even actually materializes. There's a descriptor for this type of thing, right?


Not really a rumor, considering we already know what GPU it is along with the specs. Simple math will lead you to the conclusion that the Titan will have zero competitors this year.


----------



## Alatar

Quote:


> Originally Posted by *Arni90*
> 
> Transistor count and chip size have little to do with actual performance. The most conservative estimate for GK110 has it clocking in at 732MHz with ROPs being the limit: that would give it 9% more performance than a GTX 680, or 30% more shading power, which in no way would make it impossible for AMD to beat.
> 
> The *most likely* estimate is that this Titan will clock in at around 850MHz with 14 SMXes and 6 ROP/L2/memory partitions; that would put the performance in the ballpark of 35-40% better than the 680, which *still* isn't out of reach if AMD has decided to make a monster as big as GK110, even though I really doubt it.
> The term is fan, or fanatic if you prefer the full word.


The proof of ROPs being the limit has been shady at best so far, and the proof of them being a hard limit that's the only thing to look at when it comes to Kepler performance isn't there. Even if they are limiting, they aren't the only factor, which makes the 9% number complete bogus that I wouldn't even call an estimate. Not to mention a 732MHz clock is very unlikely, considering that's also pretty much what GK104 Teslas are clocked at.

And relying on AMD making a gigantic die shouldn't be part of the discussion when the topic is "safe assumptions about future GPUs"


----------



## DETERMINOLOGY

Sick, now give us dates on the 700 series, woot.


----------



## almighty15

I doubt the desktop will see cards like this... maybe in a very limited run but not as a mainstream release.


----------



## Defoler

I still don't know the use of this upcoming GPU.
Is it going to be the "GPU to rule all other GPUs"?
A special-case GPU with limited manufacturing?
A test case before the 7xx series comes out?
If they are going to put out a GPU which will not be the next 7xx series, but be better, what is the point of the 7xx series?

I'm really confused here :thinking:

Not that I wouldn't like to see that monster coming out. But I'm not sure where it will fit.


----------



## DrBrogbo

Quote:


> Originally Posted by *Defoler*
> 
> I still don't know the usage of this up and coming GPU.
> Is it going to be the "GPU to rule all other GPUs"?
> A special case GPU with limited manufacturing?
> A test case before 7xx series comes out?
> If they are going to put out a GPU which will not be the next 7xx series, but be better, what is the point of the 7xx series?
> 
> I'm really confused here:thinking:
> 
> Not that I wouldn't like to see that monster coming out. But I'm not sure where it will fit.


I've been thinking the same thing.

I imagine that played a small part in why Nvidia delayed the 700 series until Q4. Might give them time to shrink their reference design, or work on lowering TDP or something.

Of course, maybe they don't care about the 700 series competing. Maybe the Titan will be the flagship card, and the rest will fall into place below it, for people that don't have $1k to blast on a single card.


----------



## Dimaggio1103

Quote:


> Originally Posted by *Defoler*
> 
> I still don't know the usage of this up and coming GPU.
> 
> Not that I wouldn't like to see that monster coming out. But I'm not sure where it will fit.


It will fit, right in my case.


----------



## sugarhell

I predict OOS like the GTX 680. Wait 3 months to get one.


----------



## PatrickCrowely

Quote:


> Originally Posted by *Defoler*
> 
> I still don't know the usage of this up and coming GPU.
> 
> Not that I wouldn't like to see that monster coming out. But I'm not sure where it will fit.


At the top of the hill. Nvidia is really looking to put a stranglehold on the market. AMD has nothing to counter this at the moment. Nvidia will sell these cards rather quickly. If the TDP is 300W or less it will be great. A person only needs to run one of these, and if you were lucky enough to SLI this beast it would be amazing. The only thing that would put this card over the top is unlocked voltage, but many doubt that, including myself. It still will overclock well.


----------



## i7monkey

Let's assume for argument's sake that it's $899 and it's ~85% the speed of a GTX 690.

I'd love to hear your opinions and guesses on a bunch of general questions.

Is this card worth getting for someone who only wants to buy one card and then get a refresh of Maxwell in two years?

Is it worth buying seeing as how it's so limited?

Is a limited card like this going to have decent driver support in a year?

Do you think it will come with a rash of issues like Fermi 1.0 that they will later fix if they release the 780 in Q4?

Rumors indicate that this will be faster than the GTX 780, which is rumored for Q4, but is there a chance that Nvidia will come out with a GTX 780 that comes close to the Titan in performance and in a 3GB package that will be much cheaper? In other words, do you think this is a rushed card and early adopters of the Titan are guinea pigs?

To be honest, I'm thinking of buying it despite what I've said about the price, and I'd hate to have Nvidia release a full GK110 chip as a GTX 780 down the road, once they've had more time producing chips and sorting out some kinks.

Do you think people buying this are guinea pigs for future 780 users?

In general, do you think this is a safe buy in terms of future driver support, kinks, issues, and protected from surprise releases like a cheaper better version in Q4 or even Maxwell in Q1 2014?


----------



## Vonnis

It would be a sidegrade at best from GTX680 SLI, but still... so much want.


----------



## Kaldari

Quote:


> Originally Posted by *Defoler*
> 
> I still don't know the usage of this up and coming GPU.
> Is it going to be the "GPU to rule all other GPUs"?
> A special case GPU with limited manufacturing?
> A test case before 7xx series comes out?
> If they are going to put out a GPU which will not be the next 7xx series, but be better, what is the point of the 7xx series?
> 
> I'm really confused here:thinking:
> 
> Not that I wouldn't like to see that monster coming out. But I'm not sure where it will fit.


It fits in the same place 680 SLI fits, except it's actually better for the most part. Getting comparable performance without having to deal with SLI is a huge boon. There are a variety of issues that SLI introduces. And if the price point is what the rumors say, the performance/dollar ratio is excellent.

After running SLI for a couple generations, I'm pretty much done with it. It's more trouble than it's worth. I'm all about the most powerful single-GPU card I can get these days.

I'll likely pick one up if the rumors are remotely close to true.

*edit*: I forgot to mention that the 6GB of RAM completely future-proofs it as well. You can run as many mods as you want in any game with that kind of buffer.


----------



## MxPhenom 216

Quote:


> Originally Posted by *Dimaggio1103*
> 
> There is no war; Titan rocks any other single-GPU card. I highly doubt the 8xxx series will offer much of a challenge.


The logic of a fanboy. Don't get too excited now. For all we know this could be another GTX480.


----------



## Capt

Quote:


> Originally Posted by *th3illusiveman*
> 
> So it is a single GPU... now to see if it wrecks the 690 like the rumors said it would.


Highly doubt it.


----------



## Seid Dark

Quote:


> Originally Posted by *MxPhenom 216*
> 
> The logic of a fanboy. Don't get too excited now. For all we know this could be another GTX480.


So it would be a very powerful card with excellent OC scaling at the cost of high power consumption. I wouldn't mind that.


----------



## Cyclonic

I am going to buy a 690 if this releases.

I think it will drop in price to around $600.


----------



## ABeta

Quote:


> Originally Posted by *Vonnis*
> 
> It would be a sidegrade at best from GTX680 SLI, but still... so much want.


A single card that's a sidegrade to any multi-card setup is always an upgrade. If the rumors hold true, and this thing can surpass 690 performance and even get close to dual-680 performance, I will surely sell mine to end up with a single-card beast.


----------



## Systemlord

What's this Q4 release date for GTX 780?


----------



## Artikbot

Quote:


> Originally Posted by *PatrickCrowely*
> 
> At the top of the hill. Nvidia is really looking to put a stranglehold on the market. AMD has nothing to counter this @ the moment. Nvidia will sell these cards rather quickly. If the TDP is 300 or less it will be great.


As long as it doesn't turn out the same as the GTX 480... Two HD 5870s cost less, ran faster, and ate less power.


----------



## RB Snake

This card seems perfect for multi-monitor people, especially with 1440p monitors. But I can't really see myself buying this unless there are actually more games that'll challenge it on a single monitor. I think I'll wait for Ivy Bridge-E for my next upgrade.


----------



## Celeras

Quote:


> Originally Posted by *RB Snake*
> 
> This card seems perfect for multimonitor people, especially with 1440p monitors. But I can't really see myself buying this, unless there's actually more games that'll challenge it on single monitor. I think I'll wait for IvyBridge-E for my next upgrade.


You'd need to go higher than two 1440p monitors to even come close to utilizing 6GB. Seems so unnecessary to me. By the time you get to a resolution that requires that much, you'd need more than one card to power it at decent frames.
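For a rough sense of scale (simple arithmetic, ignoring AA samples, depth buffers, and driver overhead), the render targets themselves are tiny next to 6GB; it's textures and mods that actually fill VRAM. A quick sketch:

```python
# Size of a single 32-bit (4 bytes/pixel) color buffer at a given resolution.
def framebuffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

print(framebuffer_mb(2560, 1440))      # ~14 MB for one 1440p buffer
print(framebuffer_mb(2 * 2560, 1440))  # ~28 MB for two 1440p side by side
```

Even a triple-1440p surround setup is on the order of tens of megabytes per buffer, so the 6GB headroom is really about assets, not resolution.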


----------



## i7monkey

I have a suspicion that:

- This card is rushed because they want attention away from the PS4; it being limited in quantity also supports that theory.
- Because of this, they haven't worked out all the kinks, and users who buy Titan will end up as guinea pigs.

I for one don't want to be a guinea pig.

I want this card to be a mainstream release like the 680 or 580. I don't want to be stuck with an unsupported, overheating mess that gets fixed with the release of the GTX 780 8 or 9 months from now.

I don't want to pay a grand to beta test a video card so that others benefit 9 months from now by getting a trouble free, efficient card that's much cheaper.

Anyone else afraid of this too?


----------



## Kaldari

Quote:


> Originally Posted by *i7monkey*
> 
> I have a suspicion that:
> 
> 
> This card is rushed because they want attention away from the PS4 and given that it's limited in quantity it also supports that theory.
> Because of this, they haven't worked out all the kinks and users who buy Titan will end up as guinea pigs.
> 
> I for one don't want to be a guinea pig.
> 
> I want this card to be a mainstream release like the 680 or 580. I don't want to be stuck with an unsupported, overheating mess that gets fixed with the release of the GTX 780 8 or 9 months from now.
> 
> I don't want to pay a grand to beta test a video card so that others benefit 9 months from now by getting a trouble free, efficient card that's much cheaper.
> 
> Anyone else afraid of this too?


Or you could just wait for these review samples to actually be *reviewed*, see what they say about the cards, and make a decision from that instead of worrying about gut feelings.


----------



## Newbie2009

With a bit of luck this will be like the GTX 580 launch and not the GTX 480.


----------



## Ghoxt

Why would it not be just like the 680 launch, quality-wise? I would think Nvidia learned from the 590 mess, so don't expect a repeat of the questionable drivers and people overvolting their cards. That has been remediated. Does someone think Nvidia just forgot?

The 680 launch seemed fine, and it's recent, so there's no reason to think they would take a complete turnaround, forget every lesson learned over the last couple of years, jump in a time machine, go back 5 years... and release a dog.


----------



## edalbkrad

Are those 10 VRAM chips?
10 x 512 MB = 5 GB, not 6 GB


----------



## Kiracubed

Any exact, or even rumored, release day? I keep hearing February 2013... and we're already 5 days into February 2013. With test samples out it seems legit, and I love hearing it, but I'd like to see a pre-order option on Amazon, or a date when a listing will show, so I can camp out here with all the Overclock members and put in an order; for those that wouldn't mind risking being a "guinea pig"

I certainly hope the latter remains a joke. I would love to buy this with full driver support. I don't mind the cost, because I paid for my 680s mostly with credit, so I can eBay them no problem. For $900 for a single GPU that delivers 85% of the 690, with 6GB of GDDR5 VRAM and a 384-bit memory interface, I can only say, "Shut up and take my money!"


----------



## sydas

Hawt.


----------



## supergamer

Quote:


> Originally Posted by *edalbkrad*
> 
> Are those 10 VRAM chips?
> 10 x 512 MB = 5 GB, not 6 GB


Those are K20s, so 320-bit and 5 GB.
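The chip-count math above follows directly from the bus width. A minimal sketch of the arithmetic, assuming 512 MB per 32-bit memory channel (one 4Gbit chip, or a clamshell pair of 2Gbit chips; the per-channel capacity is an assumption, not a confirmed spec):

```python
# VRAM capacity from memory bus width: GDDR5 devices use a 32-bit interface,
# so bus_width / 32 gives the channel count; assume 512 MB per channel here.
def vram_gb(bus_width_bits, mb_per_channel=512):
    channels = bus_width_bits // 32
    return channels * mb_per_channel / 1024

print(vram_gb(320))  # K20: 10 channels -> 5.0 GB
print(vram_gb(384))  # K20X / rumored Titan: 12 channels -> 6.0 GB
```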


----------



## maarten12100

Quote:


> Originally Posted by *Kiracubed*
> 
> a single GPU that delivers 85% of the 690, with 6GB of GDDR5 VRAM and a 384-bit memory interface, I can only say, "Shut up and take my money!"


My thoughts exactly


----------



## EnticingSausage

Why are people saying this is a good deal, if the RRP is to be believed? If price is supposed to scale linearly with performance, then shouldn't cards go up in price every generation? I sure don't want Nvidia, or anyone else for that matter, thinking they can charge almost a grand for a single GPU.


----------



## Artikbot

Quote:


> Originally Posted by *i7monkey*
> 
> I have a suspicion that:
> 
> 
> This card is rushed because they want attention away from the PS4 and given that it's limited in quantity it also supports that theory.
> Because of this, they haven't worked out all the kinks and users who buy Titan will end up as guinea pigs.


8800 Ultra, AKA a supervitaminated 8800 GTX. It was made just for the hell of it, to show Microsoft that nVIDIA had a larger e-peen than them.


----------



## mcg75

Quote:


> Originally Posted by *MxPhenom 216*
> 
> The logic of a fanboy. Don't get too excited now. For all we know this could be another GTX480.


That's not the logic of a fanboy. That's solid logic based on everything we currently know about the Titan and next-gen AMD cards to this point.

Does this give AMD a chance to retaliate? Sure. But whether they can is another story, and we won't know the answer for a while.

Oh and mods frown upon the "fanboy" term being used here as well.


----------



## Edge Of Pain

Quote:


> Originally Posted by *Mad Pistol*
> 
> The picture in the OP isn't actually them, btw. Notice how the cards each have 2 6-pin power connectors. Titan will need more power than that if the rumors are correct.
> 
> Also, +1 for everyone blowing this thing completely out of proportion.


I thought the rumours said it would sit right in the middle of the GTX 680 and 690? That's fairly reasonable IMO. Unless I haven't read the right rumours.


----------



## mcg75

Quote:


> Originally Posted by *i7monkey*
> 
> I have a suspicion that:
> 
> 
> This card is rushed because they want attention away from the PS4 and given that it's limited in quantity it also supports that theory.
> Because of this, they haven't worked out all the kinks and users who buy Titan will end up as guinea pigs.


Can't agree on the first one Monkey.

A limited-release card is going to do absolutely nothing against next-generation console sales. 99% of console buyers don't game on PC anyway.


----------



## Artikbot

Quote:


> Originally Posted by *mcg75*
> 
> Can't agree on the first one Monkey.
> 
> A limited release card is going to do absolutely nothing against next generation console sales. 99% of them don't game on PC anyway.


Gives NVIDIA e-peen.

E-peen attracts buyers, even if they end up buying GT 620s with 2GB of DDR2. They have a GeForce, and to their knowledge GeForces are the fastest cards on the market.


----------



## Serephucus

Has no-one else noticed that the cards don't have any display outputs?

They're not Titans.

Edit: Derp, never mind, didn't read the fine print.


----------



## Alatar

Quote:


> Originally Posted by *Artikbot*
> 
> As long as it doesn't turn out the same as the GTX 480... Two HD 5870s cost less, ran faster, and ate less power.


What?

a 5870 was $400

a 480 was $500

The 480 at launch performed 5-10% better than the 5870, and after a few months of drivers the difference was around 15%+. Two 5870s also definitely ate more power. I actually had a 5870 CFX setup when they were new, and I still use a 480 as a secondary card today.

Quote:


> Originally Posted by *supergamer*
> 
> those are K20's so 320bit 5GB.


Oh, I didn't notice they had two memory chips missing. Yeah, K20s instead of K20Xs; will fix.


----------



## bojinglebells

Quote:


> Originally Posted by *EnticingSausage*
> 
> Why are people saying this is a good deal if the RRP is to be believed? If the price is supposed to be linear with performance then shouldn't cards go up in price every generation? I sure don't want Nvidia or anyone else for that matter thinking they can charge almost a grand for a single gpu


People are saying it's a good deal based on a few things from the latest rumors:

1. It's faster than the 690
2. Single GPU vs. the 690's dual GPUs
3. $100 cheaper than the 690


----------



## Artikbot

Quote:


> Originally Posted by *Alatar*
> 
> What?
> 
> a 5870 was $400
> 
> a 480 was $500
> 
> The 480 at launch performed 5-10% better than the 5870 and after a few months of drivers the difference was around 15%+


Let me browse the internet again, as there is some serious evidence that my memory derped in massively epic proportions









Edit: Reviews from back then indicate that my memory did indeed derp, and what I was thinking of was 5850 CFX against GTX480, which were indeed faster, cheaper, and eating less power









http://hexus.net/tech/reviews/graphics/24061-nvidia-geforce-gtx-480-vs-amd-radeon-hd-5850-crossfire-battle-450/

Thanks for correcting me









What's kind of weird is that I effectively owned an HD 5870 until 6 months ago, and I always felt it was a good 30% slower than the GTX 480.


----------



## Vengeance47

Quote:


> Originally Posted by *bojinglebells*
> 
> people are saying its a good deal based on a few things with the latest rumors
> 
> 1. its faster than 690
> 2. single gpu vs. 690 dual gpu
> 3. $100 cheaper than 690


This.

To have a *single GPU* that _beats_ a *dual-GPU* card by something like 10%(?) while costing $100 less makes it an absolute behemoth, worth every last cent.

Props to Nvidia if Titan does indeed perform as rumoured. Sadly people will whinge because it's such a limited release and blah blah, but at the end of the day, if Titan does what is rumoured... you'd be a fool not to realise what an amazing engineering feat GK110 is


----------



## EnticingSausage

Right, except when the 680 came out it was within spitting distance of the 590 in most games at half the price. Why is this any different? Aren't you afraid of price gouging like this becoming the norm?


----------



## Bloodbath

Quote:


> Originally Posted by *emett*
> 
> Be careful you may blow up your 1080p monitor.


As soon as I buy my next house I'm getting two more monitors, one will just have to do for the next couple months I guess


----------



## Vengeance47

Quote:


> Originally Posted by *EnticingSausage*
> 
> Right, except when the 680 came out it was within spitting distance of the 590 in most games at half the price. Why is this any different? Aren't you afraid of price gouging like this becoming the norm?


Not really a fair comparison, since it was the change in architecture (Fermi to Kepler) that brought that jump.

Titan is based on the same architecture (Kepler) as the 600 series and destroys them, let alone anything else out there.

Likewise, when Maxwell releases, it's fair to assume that the GTX 880 (or whatever the high-end first-gen Maxwell is called) will beat the Titan, since that's a change in architecture. But to destroy your own dual-GPU product using the same architecture... astonishing


----------



## maarten12100

Quote:


> Originally Posted by *Vengeance47*
> 
> Likewise, when Maxwell releases, it is fair to assume that the GTX880 (or whatever the high-end first gen maxwell is called) will beat the Titan as its a change in architecture. But to destroy your own dual GPU product using the same architecture.......astonishing


Well, putting two small dies on one PCB and clocking the hell out of them isn't really something to write home about either.
The GTX 295 was the real doomsday device back in the day, with two huge chips and heat output and power consumption beyond anything seen before (the 9800 GX2 wasn't even close to that beast).

If only they'd make a dual-Titan 500W powerhouse xD (but as we all know there will be no non-reference designs)
Asus Mars 3, anyone?


----------



## coachmark2

Quote:


> Originally Posted by *Vengeance47*
> 
> Not really a fair comparison since that was a change in architecture (Fermi to Kepler) that brought that change.
> 
> Titan is based on the same architecture (kepler) as the 600 series and destroys them let alone anything else that is out there.
> 
> Likewise, when Maxwell releases, it is fair to assume that the GTX880 (or whatever the high-end first gen maxwell is called) will beat the Titan as its a change in architecture. *But to destroy your own dual GPU product using the same architecture.......astonishing*


Very well thought out. Especially the bold.


----------



## mott555

I hope this causes a big price drop for the GTX 680.


----------



## dph314

Quote:


> Originally Posted by *mott555*
> 
> I hope this causes a big price drop for the GTX 680.


................. Anyone wanna buy some 680's?


----------



## mott555

Quote:


> Originally Posted by *dph314*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mott555*
> 
> I hope this causes a big price drop for the GTX 680.
> 
> 
> 
> ................. Anyone wanna buy some 680's?
Click to expand...

I think it's about time to upgrade the ol' 460. Especially since I have a 1440p monitor now. And it's tax season.


----------



## ZealotKi11er

Like the rumors said: paper launch and $899.


----------



## wTheOnew

Quote:


> Originally Posted by *EnticingSausage*
> 
> Aren't you afraid of price gouging like this becoming the norm?


This isn't price gouging. Price gouging can't exist for non-essential items; this is just capitalism. If the market can't bear the $900 price tag, it will drop.

That being said, a very low total production run, plus the segment of the enthusiast world that will buy the highest-end whatever at any price, means this will likely sell out at the $900 asking price.


----------



## Alatar

I really hope the guys thinking that this will beat a 690 aren't getting their hopes up...

I mean I've been touting GK110 as the best thing since sliced bread for almost a year now but cmon, lets be realistic.


----------



## ZealotKi11er

Also, you can't really compare it to the GTX 690 and conclude it's good value, because the GTX 690 was never good value. The GTX 680 is already a year old, so this card should not be that much more expensive. That's like the GTX 680 costing $699 because it's that much faster than the GTX 580.


----------



## Boomstick777

Quote:


> Originally Posted by *Alatar*
> 
> What?
> 
> a 5870 was $400
> 
> a 480 was $500
> 
> The 480 at launch performed 5-10% better than the 5870 and after a few months of drivers the difference was around 15%+. Two 5870s also definitely ate more power. I actually had a 5870 CFX setup when they were new and I still use a 480 a a secondary card today


Agreed, the GTX 480 was a beast. In fact, anyone still at 1080p could have rocked a GTX 480 from launch (2010) right up until now (2013); that's three years' worth of gaming, good value.

Hopefully the 'Titan' is like the 8800 GTX or GTX 480: a big powerful beast that'll stay on top for a long time. For all we know it may be an early April Fools' joke









----------



## Raptorpowa

Time to dump the 690 and the 7950s... one of these babies in each rig will last me a couple of years.


----------



## maarten12100

Quote:


> Originally Posted by *Raptorpowa*
> 
> time to dump the 690 and the 7950's....One of these baby's each rig will last me a couple of years.


That all rather depends on whether you want the highest of the high. Judging by you running a GTX 690 and crossfired 7950s, you do want the best of the best for graphics.


----------



## brasco

It's like the ultimate tease. Come on, reviewers (and the shipping people), faster, faster


----------



## Master__Shake

Quote:


> Originally Posted by *i7monkey*
> 
> I have a suspicion that:
> 
> 
> This card is rushed because they want attention away from the PS4 and given that it's limited in quantity it also supports that theory.
> Because of this, they haven't worked out all the kinks and users who buy Titan will end up as guinea pigs.
> 
> I for one don't want to be a guinea pig.
> 
> I want this card to be a mainstream release like the 680 or 580. I don't want to be stuck with an unsupported, overheating mess that gets fixed with the release of the GTX 780 8 or 9 months from now.
> 
> I don't want to pay a grand to beta test a video card so that others benefit 9 months from now by getting a trouble free, efficient card that's much cheaper.
> 
> Anyone else afraid of this too?


I don't understand how this card can take attention away from the PS4? If anything it will give console fanboys more ammo in the console vs. PC war...

It's not really a viable solution for pulling gamers from console to PC, nor is it a better deal, considering this card is just a card and not a whole system.


----------



## Cloudfire777

The Chinese sources who usually leak benchmarks way ahead of everyone else can't publish scores either, since the Titan was most likely only sent to a handful of review sites to keep everything clean.

Sucks


----------



## Victor_Mizer

I wonder how the Titan will compare against the 7xx series in Q4... It would suck to put out $900 and then have to replace it later in the year.


----------



## Vengeance47

Quote:


> Originally Posted by *Alatar*
> 
> I really hope the guys thinking that this will beat a 690 aren't getting their hopes up...
> 
> I mean I've been touting GK110 as the best thing since sliced bread for almost a year now but cmon, lets be realistic.


Agreed that it is unlikely, but even if it were, say, ~40-50% faster than the GTX 680/7970 GE... that's still a beastly chip. Not only will it have the gaming performance, but its GPGPU performance should smash the 7970 GE too (let alone what it will do to the GTX 680). So you get the best of both worlds.

But from the sounds of it, Titan will be more than 40-50% faster (the specs would support that theory), which is a staggering performance increase achieved without a new architecture. Even the GTX 580 was only 20% faster than the GTX 480, and that counted as a decent improvement on the same architecture. So getting a 40%+ increase... how can anyone possibly argue that is not an incredible feat


----------



## Votkrath

Quote:


> Originally Posted by *Victor_Mizer*
> 
> I wonder how the Titan will compare against the 7xx in Q4... It would suck to put out 900$ and then have to replace it later in the year.


Why would you have to replace it? The only reason I can see is if you want to be a benchmark whore.


----------



## maarten12100

Quote:


> Originally Posted by *Victor_Mizer*
> 
> I wonder how the Titan will compare against the 7xx in Q4... It would suck to put out 900$ and then have to replace it later in the year.


Doubt that, as the GTX 7xx series will likely have small dies all over again.
It would be like comparing a GTX 275 with a GTX 460


----------



## Victor_Mizer

Quote:


> Originally Posted by *Votkrath*
> 
> Why would you have to replace it? Only reason I can see is if you want to benchmark whore.


I am just looking ahead. I am interested in seeing how the Titan performs; what I'm trying to say is, they could have a 790 in Q4 that is faster than the Titan. Some people may want more FPS in games at ultra settings; not everyone is a benchmark whore.


----------



## PatrickCrowely

Quote:


> Originally Posted by *Vengeance47*
> 
> Even the GTX580 was only 20% faster than the GTX480 and that was a decent improvement using the same architecture. So getting a 40%+ increase........how can anyone possibly argue that that is not an incredible feat


Maybe this is the real 680, some say that the current 680 is a mid-range GPU. Maybe Nvidia is finally releasing *"The Real GTX 680"*


----------



## maarten12100

Quote:


> Originally Posted by *PatrickCrowely*
> 
> Maybe this is the real 680, some say that the current 680 is a mid-range GPU. Maybe Nvidia is finally releasing *"The Real GTX 680"*


It contains a mid-range chip that has been clocked the hell out of.
So yeah, however it happened, I think they planned it this way: the "Forest" Tesla cards being rebranded as high-end GTX cards.


----------



## Falknir

These cards have a real possibility of destroying my bank balance before the end of the month.


----------



## Vengeance47

Quote:


> Originally Posted by *PatrickCrowely*
> 
> Maybe this is the real 680, some say that the current 680 is a mid-range GPU. Maybe Nvidia is finally releasing *"The Real GTX 680"*


GK110 was originally intended to be the GTX 680; it's a known fact.

But obviously when Nvidia saw just how good GK104 was, they decided to use GK110 as a Tesla-only chip. There was no reason to release a GK110-based GeForce product, as it cost much, MUCH more to produce than GK104 (much larger die) and it didn't make economic sense for them to do so.

However, the point still stands: Kepler is an amazing architecture. The simple fact that the mid-range GK104 keeps up with Tahiti XT, and then the original high-end chip (GK110) puts both of them to shame... it just goes to show what a wonderful job the engineers at Nvidia did with Kepler, and they should be praised for their work. It's by no means an easy job to design a new GPU architecture, and given how efficient and powerful Kepler is, they should be acknowledged for their work. No matter whether you are an AMD or Nvidia fan, you have to give credit where it is due. And by god do the engineers at Nvidia deserve praise for what they achieved with Kepler


----------



## Stay Puft

Quote:


> Originally Posted by *Alatar*
> 
> I really hope the guys thinking that this will beat a 690 aren't getting their hopes up...
> 
> I mean I've been touting GK110 as the best thing since sliced bread for almost a year now but cmon, lets be realistic.


How can they actually believe that? The 690 has 14.2% more cuda cores and will probably be clocked 5-8% higher
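
That core-count figure checks out against the leaked numbers, treating the 2688-core GK110 count as rumor rather than confirmed spec:

```python
# GTX 690: two GK104 GPUs at 1536 CUDA cores each (published spec).
# Titan:   rumored single GK110 with 2688 CUDA cores enabled (unconfirmed).
gtx690_cores = 2 * 1536            # 3072 total
titan_cores_rumored = 2688

advantage = (gtx690_cores / titan_cores_rumored - 1) * 100
print(f"GTX 690 core advantage: {advantage:.1f}%")  # ~14.3%
```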


----------



## WarMacheen

Maybe time to get rid of my 7970's


----------



## Master__Shake

Quote:


> Originally Posted by *Vengeance47*
> 
> GK110 was originally intended to be the GTX680, it's a known fact
> 
> But obviously when Nvidia saw just how good GK104 was they decided to use the GK110 as a Tesla card only as there was no reason to release a GK110 based GeForce product as they cost much MUCH more to produce than the GK104 chips (much larger die) and it didn't make economic sense for them to do so.
> 
> However the point still stands. Kepler is an amazing architecture. The simple fact that the mid-range GK104 keeps up with Tahiti XT and then the original high-end chip (GK110) puts both of them to shame.....it just goes to show what a wonderful job the engineers at Nvidia did with Kepler and they should be praised for their work. Its by no-means an easy job to design a new GPU architecture and given that Kepler is as efficient and powerful as it is, they should be acknowledged for their work. No matter whether you are an AMD or Nvidia fan, you have to give credit where it is due. And by god do the engineers at Nvidia deserve praise for what they acheived with Kepler


ahem...gk110 wasn't even made when the 680 was released...

unless nvidia waited till november to release the 680 that is.

http://www.anandtech.com/show/6446/nvidia-launches-tesla-k20-k20x-gk110-arrives-at-last


----------



## linkdiablo

This is old, but seems to be mostly coming to reality http://www.geeks3d.com/20111126/nvidia-kepler-gpus-roadmap-gk107-gk106-gk104-gk110-and-gk112/

What happened to GK112 or was it a hoax?


----------



## maarten12100

Quote:


> Originally Posted by *linkdiablo*
> 
> This is old, but seems to be mostly coming to reality http://www.geeks3d.com/20111126/nvidia-kepler-gpus-roadmap-gk107-gk106-gk104-gk110-and-gk112/
> 
> What happened to GK112 or was it a hoax?


Yes, it's a hoax: on that roadmap the GK104 is 384-bit, and the GK110 is the GTX 690, i.e. a dual-GPU card. Something above a dual Titan in a single die? What a joke. #700Wpowerhouse


----------



## RagingCain

I will believe it when I see it. Then if I see it, I may buy it. I may just close my eyes from this point on...


----------



## Mhill2029

Ponders the idea of 4-Way Titans......


----------



## Vengeance47

Quote:


> Originally Posted by *Master__Shake*
> 
> ahem...gk110 wasn't even made when the 680 was released...
> 
> unless nvidia waited till november to release the 680 that is.
> 
> http://www.anandtech.com/show/6446/nvidia-launches-tesla-k20-k20x-gk110-arrives-at-last


You're clearly missing the distinction between when a product is released and when it is actually finished.

You can bet your bottom dollar that GK110 had completed the prototyping phase well and truly before Kepler was ever released, and could have been rolled out as a GTX 680 if Nvidia pleased. What happened was, once Kepler taped out and the HD 7970 was released, Nvidia saw the potential of GK104 and made the decision to shelve GK110 from GeForce production... in other words, they could have used GK110, but GK104 was strong enough and much cheaper to manufacture, so GK104 became the GTX 680: it performed on par with Tahiti XT and cost considerably less to make (which means higher profit margins for Nvidia).

GPU chips aren't like shoes; they don't get manufactured, shoved in a box, and sent off in a shipping container the same day. Production-ready designs are completed well in advance of shipment dates, and Nvidia would have had an idea of GK104 and GK110 performance months before the GTX 680 was released. Once they saw the Tahiti XT performance numbers, it made economic and business sense to go with GK104 for the GTX 680.

So just because the Tesla GK110 cards weren't released until November doesn't mean GK110 wasn't ready for mass production well in advance; it simply means Nvidia made the smart business decision and used a cheaper product that performed more than adequately.


----------



## tsm106

Quote:


> Originally Posted by *Mhill2029*
> 
> Ponders the idea of 4-Way Titans......


Can you say cpu bottleneck?


----------



## freitz

Quote:


> Originally Posted by *Mhill2029*
> 
> Ponders the idea of 4-Way Titans......


Would be sick. I just want to see an Nvidia announcement.


----------



## pcfoo

Quote:


> Originally Posted by *Vengeance47*
> 
> Not really a fair comparison since that was a change in architecture (Fermi to Kepler) that brought that change.
> 
> Titan is based on the same architecture (kepler) as the 600 series and destroys them let alone anything else that is out there.
> 
> Likewise, when Maxwell releases, it is fair to assume that the GTX880 (or whatever the high-end first gen maxwell is called) will beat the Titan as its a change in architecture. But to destroy your own dual GPU product using the same architecture.......astonishing


Well, what you say there is not exactly logical.
There are two ways to do it within the same architecture: either clock one product higher, or provide drivers that "unlock" potential that is locked away in the other card (i.e. something like Quadro vs. GeForce on the same architecture).

We don't know the exact numbers yet (it is all rumors), but the Titan won't be able to beat the 690 in real-world gaming unless it is clocked higher or Nvidia works "magic" on the driver side.
The first is not likely to be the case.


----------



## Avonosac

Quote:


> Originally Posted by *Stay Puft*
> 
> How can they actually believe that? The 690 has 14.2% more cuda cores and will probably be clocked 5-8% higher


Regardless, I would take the single card over on-board SLI any day. The 6GB can actually be utilized, and the 384-bit memory interface will remove a lot of the high-resolution memory limits, as the bandwidth will be huge when OC'd. Even if the performance is 10% less than the 690's, this is the first 600-series-era card that feels worth the investment.
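
The bandwidth argument is easy to sanity-check. A rough sketch, assuming the rumored 384-bit bus and assuming the Titan carries over the GTX 680's 6 Gbps effective GDDR5 data rate (neither is confirmed):

```python
# Peak GDDR5 bandwidth (GB/s) = (bus width in bits / 8) * effective data rate (Gbps).
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

per_gpu_690 = peak_bandwidth_gbs(256, 6.0)  # each GK104 on the 690: 192 GB/s
titan_rumor = peak_bandwidth_gbs(384, 6.0)  # rumored Titan: 288 GB/s, +50% per GPU
```

And that's at stock; a memory overclock scales the figure linearly with the data rate.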


----------



## Master__Shake

Quote:


> Originally Posted by *Vengeance47*
> 
> You clearly lack the distinction between when a product is released and when it is actually finished.
> 
> You can bet your bottom dollar that GK110 had completed the prototyping phase well and truly before kepler was ever released and could have been rolled out as a GTX680 if Nvidia pleased. What happened was once kepler tapered out and the HD7970 was released, Nvidia saw the potential of GK104 and made the decision to shelve GK110 from GeForce production......in other words, they could have used GK110, however GK104 was strong enough, and much cheaper to manufacture and as such, GK104 became the GTX680 as it performed on par with Tahiti XT and cost considerably less to manufacture (this means higher profit margins for Nvidia).
> 
> GPU chips aren't like shoes, they don't get manufactured, shoved in a box, then sent in a shipping container on a boat on the same day. The production ready designs are completed well in advance of product shipment dates and Nvidia would have had an idea of GK104 and GK110 performance months before the GTX680 was released and once they saw the Tahiti XT performance numbers, it made economical and business sense to go with the GK104 chip for the GTX680.
> 
> So just because the Tesla GK110 cards weren't released until November, that doesn't mean that GK110 wasn't ready for mass production well and truly in advance, it simply means that Nvidia made the smart business decision and used a cheaper product that performed more than adequately.


http://semiaccurate.com/2012/02/07/gk110-tapes-out-at-last/#.URFCoPJLQwo

It taped out in January, so they had two months to get it on shelves for the 680's release, when in reality it took about ten months to get it to market.

http://nvidianews.nvidia.com/Releases/NVIDIA-Unveils-World-s-Fastest-Most-Efficient-Accelerators-Powers-World-s-No-1-Supercomputer-8b6.aspx


----------



## Cloudfire777

Why are you linking to SemiAccurate? It's run by a clown who doesn't know anyone in the industry, and everyone @ Intel laughs at him

And yeah, I almost forgot: he charges $50 for a BIOS picture


----------



## h2spartan

Aww man, I just bought an EVGA 680 FTW. I've heard of EVGA's Step-Up program. Do you guys think I'll be able to do that when this comes out?


----------



## freitz

90 days for the program


----------



## h2spartan

Is it supposed to come out within 90 days? Or is that still up in the air?


----------



## freitz

Quote:


> Originally Posted by *h2spartan*
> 
> Is it suppose to come out within 90 days? or is that still up in the air?


I have no idea on that; Nvidia has yet to release any type of information to my knowledge. But if something does come out, and there is reason to believe something will, then you should be OK.


----------



## rcfc89

Quote:


> Originally Posted by *Cyclonic*
> 
> I am going to buy a 690 if this releases
> 
> 
> 
> 
> 
> 
> 
> Think that will drop in price to arround 600


Keep dreaming. With the Titan going for 9 bills, I see the 690 dropping to maybe 8 bills, and that's only if the Titan performs better. If not, the 690 will stay at a G.


----------



## maarten12100

Quote:


> Originally Posted by *dtolios*
> 
> Well, that what you say there is not exactly logical.
> There are two ways to do it with the same architecture: either clock one product higher, or provide drivers that "unlock" potential that is locked in the other card


A third is microarchitecture improvements, and those can really add up; cache is the simplest example of that kind of improvement.


----------



## h2spartan

Okay, yeah, that would make the most sense: since the 780 is supposed to hit in Q4, the Titan would be released much sooner. Thx!


----------



## freitz

Quote:


> Originally Posted by *h2spartan*
> 
> Okay, yeah that would make most sense since 780 is suppose to hit Q4, that the Titan would be released much sooner. Thx!


I hope everyone is right. I'm looking to make a move on a new GPU solution within the next 30 days.


----------






## Master__Shake

Quote:


> Originally Posted by *Cloudfire777*
> 
> Why are you linking to SemiAccurate? Its run by a clown, doesnt know anyone in the industry and everyone @ Intel laughs at him
> 
> And yeah I almost forgot: He charge $50 for a BIOS picture


Well, the same jackoff who is the source for this article got his info for the GK110 article he re-posted from 3DCenter, who got their repost from SemiAccurate

http://wccftech.com/nvidia-gk110-specifications-disclosed/

http://www.3dcenter.org/news/tape-out-von-nvidias-kepler-chip-gk110-erfolgreich-gk110-karten-aber-nicht-vor-august

so who's to say wccf is even accurate in this article.


----------



## h2spartan

Unless of course the Titan is the 780, then I'm out of luck


----------



## Artikbot

Quote:


> Originally Posted by *Boomstick777*
> 
> Agreed, the GTX 480 was a beast, in fact those still @ 1080P could of rocked a GTX 480 from launch (2010) right up until now (2013) that's 3 years worth of gaming, good value.
> 
> Hopefully the 'Titan' is like the 8800GTX or GTX 480, big powerful beast that'll be on top for a long time.


One that also eats enough power to light up a whole city? No thanks. I'll stick with wimpy in this scenario.


----------



## Master__Shake

Quote:


> Originally Posted by *Cloudfire777*
> 
> Why are you linking to SemiAccurate? Its run by a clown, doesnt know anyone in the industry and everyone @ Intel laughs at him
> 
> And yeah I almost forgot: He charge $50 for a BIOS picture


And there's this, posted May 17th, two months after the 680 launched.
Quote:


> The other Tesla announced this week is Tesla K20, which is the first and so far only product announced that will be using GK110. T*esla K20 is not expected to ship until October-November of this year due to the fact that GK110 is still a work in progress,* but since NVIDIA is once again briefing developers of the new capabilities of their leading compute GPU well ahead of time there's little reason not to announce the card, particularly since they haven't attached any solid specifications to it beyond the fact that it will be composed of a single GK110 GPU.


http://www.beta.forums.anandtech.com/Show/Index/5840?cPage=2&all=False&sort=0&page=2&slug=gtc-2012-part-1-nvidia-announces-gk104-based-tesla-k10-gk110-based-tesla-k20

unless Anandtech is run by a clown as well


----------



## Avonosac

Quote:


> Originally Posted by *Master__Shake*
> 
> http://semiaccurate.com/2012/02/07/gk110-tapes-out-at-last/#.URFCoPJLQwo
> 
> taped out in january, so they had 2 months to get it on the shelves for the 680 release, when in reality it took about 10 months to get it to the market.
> 
> http://nvidianews.nvidia.com/Releases/NVIDIA-Unveils-World-s-Fastest-Most-Efficient-Accelerators-Powers-World-s-No-1-Supercomputer-8b6.aspx


Plenty of time to bring those GPUs to market if they wanted to. They didn't, because they can sell a K20 for $3,000 retail, or at least $2,500+ in bulk, but can only sell a GTX 680 for $500-600 retail. And GK104 was already beating Tahiti XT even with the very bad early drivers, so why take the hit to revenue for no reason?

This limited run will only be out there to prove performance dominance; it won't be a huge release.


----------



## Master__Shake

Quote:


> Originally Posted by *Avonosac*
> 
> *Plenty of time to bring those GPUs to market if they wanted to,* they didn't because they can sell a K20 for 3000 retail or at least 2500+ bulk, but can only sell a gtx 680 for 500-600 retail. And the GK104 was already beating the tahiti-XT with the very bad drivers, so why take the loss in revenue for no reason?
> 
> This limited run will only be out there to prove the performance dominance, it won't be a huge release.


ummmm no


----------



## prava

Quote:


> Originally Posted by *Vengeance47*
> 
> You clearly lack the distinction between when a product is released and when it is actually finished.
> 
> You can bet your bottom dollar that GK110 had completed the prototyping phase well and truly before kepler was ever released and could have been rolled out as a GTX680 if Nvidia pleased. What happened was once kepler tapered out and the HD7970 was released, Nvidia saw the potential of GK104 and made the decision to shelve GK110 from GeForce production......in other words, they could have used GK110, however GK104 was strong enough, and much cheaper to manufacture and as such, GK104 became the GTX680 as it performed on par with Tahiti XT and cost considerably less to manufacture (this means higher profit margins for Nvidia).
> 
> GPU chips aren't like shoes, they don't get manufactured, shoved in a box, then sent in a shipping container on a boat on the same day. The production ready designs are completed well in advance of product shipment dates and Nvidia would have had an idea of GK104 and GK110 performance months before the GTX680 was released and once they saw the Tahiti XT performance numbers, it made economical and business sense to go with the GK104 chip for the GTX680.
> 
> So just because the Tesla GK110 cards weren't released until November, that doesn't mean that GK110 wasn't ready for mass production well and truly in advance, it simply means that Nvidia made the smart business decision and used a cheaper product that performed more than adequately.


So much fail in this reply.

a) In R&D environments, as soon as the product is done, you put it out. Every day the tech is ready but not on the market, you are losing money, since you allow your competition to catch up. Would you invest hundreds of millions into a product and then wait a year to release it? Really?

b) Any CEO not following a) strictly would get fired by the shareholders in a minute. Do you have any idea why products get revisions? Or bugs? Because all companies in R&D-driven industries work against the clock, and every minute the product is not out there is a ton of revenue lost.

c) Do you truly think GK110 didn't get released early even for Tesla, when such cards have a HUGE markup and there was a TON of expectation?

Conclusion: you are clueless. NVIDIA, and any other tech company with competition around it, will release every product the single minute they can, and not a minute later. Why did GK110 come that late? Because it wasn't ready: be it that it was too expensive, or the fab couldn't keep up, or whatever the reason, it wasn't ready.


----------



## Rayleyne

Quote:


> Originally Posted by *Avonosac*
> 
> Plenty of time to bring those GPUs to market if they wanted to, they didn't because they can sell a K20 for 3000 retail or at least 2500+ bulk, but can only sell a gtx 680 for 500-600 retail. And the GK104 was already beating the tahiti-XT with the very bad drivers, so why take the loss in revenue for no reason?
> 
> This limited run will only be out there to prove the performance dominance, it won't be a huge release.


Again, no. GK110 wasn't working at all; they were having problems. GK104 was the best they had, and they clocked the balls off of it to compete. And FYI, the 7970 beat the 680

Quote:


> Originally Posted by *Master__Shake*
> 
> ummmm no


+1
Quote:


> Originally Posted by *prava*
> 
> So much fail in this reply.
> 
> a) In I+D environments, as soon as the product is done, you put it out. Everyday the tech is ready but not into the market you are losing money, since you allow your competition to catch-up. Would you invest hundreds of millions into a product and, then, wait a year to release it? Really?
> 
> b) Any CEO not following a) strictly would get fired by their shareholders any minute. Do you have any idea why products get revisions? Or bugs? Because all companies in I+D related industries work against the clock and any minute the product is not out there is a ton of revenue lost.
> 
> c) Do you truly think that GK110 didn't get released early even for Tesla when such cards have a HUGE mark-up and there was a TON of expectation?
> 
> Conclusion: you are clueless. NVIDIA, and any other tech company with competition around it, will release every product the minute it can, and not a minute later. Why did GK110 come this late? Because it wasn't ready: whether it was too expensive, or the fab couldn't keep up, or whatever the reason, it wasn't ready.


^this


----------



## Avonosac

Quote:


> Originally Posted by *Rayleyne*
> 
> *Again, no. GK110 wasn't working at all; they were having problems. GK104 was the best they had, and they clocked the balls off of it to compete. And FYI, the 7970 beat the 680.*
> +1
> ^this


It's beating the snot out of it now, but it didn't at launch. This is taking into account CF and SLI, and the bad drivers for each new architecture. I believe GK110 was ready, but production wasn't, because of TSMC's huge manufacturing issues; a GK110 release then would have been a complete paper launch, instead of the mostly paper launch we have now. As I stated, and prava argued from the other side, they would have been killing their profit margins trying to sell the few GK110s they could produce as GeForce instead of Tesla K20. It was manufacturing holding them up with such low yields, not the architecture. GK110 was working just fine.


----------



## Master__Shake

Quote:


> Originally Posted by *Avonosac*
> 
> It's beating the snot out of it now, but it didn't at launch. This is taking into account CF and SLI, and the bad drivers for each new architecture. *I believe GK110 was ready, but production wasn't, because of TSMC's huge manufacturing issues;* a GK110 release then would have been a complete paper launch, instead of the mostly paper launch we have now. As I stated, and prava argued from the other side, they would have been killing their profit margins trying to sell the few GK110s they could produce as GeForce instead of Tesla K20. It was manufacturing holding them up with such low yields, not the architecture. GK110 was working just fine.


the only company having issues at TSMC was NVIDIA.


----------



## Avonosac

Quote:


> Originally Posted by *Master__Shake*
> 
> the only company having issues at TSMC was NVIDIA.


I'm not sure I understand your point. I'm saying the reason the 680 shipped with GK104 is GK110's production issues at TSMC plus the 7970's poor release performance. Your statement agrees with my point, but in a way that leads me to believe you think you're disagreeing...


----------



## L36

Why are people so delusional in this thread? To the bunch saying that GK110 was not working: sure, it had problems, but they have been ironed out for the most part. Beyond that, it's all about yields. This thing is 7.1 billion transistors. Manufacturing it at a profit pushes the boundaries of semiconductor manufacturing as is. It was held back not because it was broken, but to let yields improve. The only reason Teslas were released first is that NVIDIA can make a good buck there to recoup losses due to poor yields. Now 28nm is much more stable than a year ago, and they can afford to release this card to consumers and still make a profit. It's all about binning: Teslas get the well-binned GPUs while consumers get the lesser bins, with some GPGPU features cut out but for the most part functional.

That brings me to the second part, where some claim it will be a "limited run". Lol, why would they do that? They will keep making GK110 dies, and some will not qualify for Tesla TDP standards, so those will be dumped into the consumer space. NVIDIA isn't just going to throw away leaky but functional GK110 dies; they're going to capitalize on them as much as they can.


----------



## GoldenTiger

Quote:


> Originally Posted by *Master__Shake*
> 
> the only company having issues at TSMC was NVIDIA.


Wrong... both companies had plenty of trouble. Are you trying to disagree with the guy you quoted, or back up his statement by mentioning that there were 28nm troubles?


----------



## Master__Shake

Quote:


> Originally Posted by *Avonosac*
> 
> I'm not sure I understand your point. I'm saying the reason the 680 shipped with GK104 is GK110's production issues at TSMC plus the 7970's poor release performance. Your statement agrees with my point, but in a way that leads me to believe you think you're disagreeing...


you say it was TSMC's fault for their manufacturing issues, which is plain old wrong, like super wrong, like the wrongest thing ever said.

i say NVIDIA's design is the problem... it always was. that's why GK100 was a failure, why GK104 had yield issues, and why GK110 reached the market 8-9 months after design.

and the 7970's performance was poor?? maybe because of a lower clock speed, but poor? really? ok then, poor it is....

WTB: hair-pulling emoticon...


----------



## Capt

Let's hope it's not over $1000.


----------



## GoldenTiger

Quote:


> Originally Posted by *Rayleyne*
> 
> Again, no. GK110 wasn't working at all; they were having problems. GK104 was the best they had, and they clocked the balls off of it to compete. And FYI, the 7970 beat the 680.
> +1
> ^this


What? No, it didn't.... the 680 was faster than the 7970 at launch and had better drivers, features, and pricing. Recent drivers may have helped the 7970 some (still with horrifically bad frame-time distribution that makes the image juddery), but the two aren't far apart at all even after all that, as the 680 has seen nice driver improvements as well.


----------



## GoldenTiger

Quote:


> Originally Posted by *Master__Shake*
> 
> and the 7970's performance was poor?? maybe because of a lower clock speed, but poor? really ok then poor it is....
> 
> WTB: hair pulling emoticon...


He said "release performance" as in the TSMC manufacturing, presumably. What are you on about? Clocks have nothing to do with that.


----------



## Master__Shake

Quote:


> Originally Posted by *GoldenTiger*
> 
> Wrong... both companies had plenty of trouble. Are you trying to disagree with the guy you quoted, or back up his statement by mentioning that there were 28nm troubles?


yes, yes i am...
Quote:


> SemiAccurate's channel checks have confirmed the anecdotal evidence that simple shopping provides; Nvidia can't supply GK104 based GPUs. Sources tell us that AMD shipped more than 10,000 units in their initial shipment of Tahiti based GPUs, the GK104's direct competitor, and another larger shipment followed the first. Both were before the launch of the first Tahiti card. Since then, with the exception of TSMC's still unexplained hiccup, shipments have been common and plentiful. Stock in the channel is also plentiful, and Tahiti has never been completely out of stock even if specific models come and go.
> 
> In contrast, sources tell SemiAccurate that initial shipments of GK104s, launched more than three months after Tahiti, were unlikely to be more than 1000 units worldwide. Those same channel sources tell us that to date, volume of GK104 based Kepler cards is almost assuredly less than 10,000. At this point, TSMC's 28nm process has had more than six months to mature, essentially 1/4 of its lifespan, and Nvidia can still not get yields up to par. Everyone else however, can. Nvidia repeatedly says that the abject unavailability of the GTX 680 is due to demand, but shipping numbers directly contradict that claim. At least that is what is being reported here, here, here, and many other places. You may recall that during the Q&A session of the last quarter's conference call, Nvidia was rather subdued about 28nm. Jen-Hsun Huang said, "The gross margin decline is contributed almost entirely to the yields of 28-nanometer being lower than expected. And that is, I guess, unsurprising at this point.". While the effect on gross margins may not be a surprise, the fact that Nvidia was having problems on 28nm sure was. Why? Because they are the only company having problems with 28nm. It also doesn't play well with the rather stretched reasoning about demand vs supply as highlighted above.


http://semiaccurate.com/2012/05/08/nvidias-five-new-keplers-raise-a-red-flag/#.URFXSPJLQwo

almost forgot


----------



## Master__Shake

Quote:


> Originally Posted by *GoldenTiger*
> 
> He said "release performance" as in the TSMC manufacturing, presumably. What are you on about? Clocks have nothing to do with that.


so launch cards are garbage right?? they don't perform at all that well right? you're wronger than wrongy up there!


----------



## FTWRoguE

I keep seeing people saying they'd rather go with the fastest single card than go SLI. Is it really that bad?


----------



## BizzareRide

Quote:


> Originally Posted by *i7monkey*
> 
> I have a suspicion that:
> 
> 
> This card is rushed because they want attention away from the PS4 and given that it's limited in quantity it also supports that theory.
> Because of this, they haven't worked out all the kinks and users who buy Titan will end up as guinea pigs.
> 
> I for one don't want to be a guinea pig.
> 
> I want this card to be a mainstream release like the 680 or 580. I don't want to be stuck with an unsupported, overheating mess that gets fixed with the release of the GTX 780 8 or 9 months from now.
> 
> I don't want to pay a grand to beta test a video card so that others benefit 9 months from now by getting a trouble free, efficient card that's much cheaper.
> 
> Anyone else afraid of this too?


I don't get what the PS4 has to do with this... NVIDIA's current cards are already significantly faster than an underclocked 7870.


----------



## n00byn4t3r

Quote:


> Originally Posted by *FTWRoguE*
> 
> I keep seeing people saying they'd rather go with the fastest single card than go SLI. Is it really that bad?


Nah, it's not all that bad; just look at 600-series SLI, which is pretty good. But SLI will always have more issues than a single GPU, such as the need for good SLI profiles and other things.


----------



## Mygaffer

So are they really going to be $899?


----------



## Master__Shake

SLI/CrossFire both have pros and cons. Scaling is sometimes really good in some games and sometimes very bad, and some games never get a working SLI/CrossFire profile; GTA IV, for example, sucks a big one when it comes to SLI or CrossFire.


----------



## rcfc89

Quote:


> Originally Posted by *FTWRoguE*
> 
> I keep seeing people saying they'd rather go with the fastest single card than go SLI. Is it really that bad?


The only issue I've run into with both my 690 and my old 580 SLI is the occasional game not utilizing both GPUs. I've never had any serious stuttering or some of the other issues people report. My experience with NVIDIA SLI has been great. If I can pull it off on launch day I'm going to shoot for two of these Titans. Thankfully I have connections with an NVIDIA rep


----------



## maarten12100

Quote:


> Originally Posted by *FTWRoguE*
> 
> I keep seeing people saying they'd rather go with the fastest single card than go SLI. Is it really that bad?


Well, you don't want bugs in top-of-the-line cards.
That's not the only issue, though. Most games run without bugs, but some games don't use SLI or CrossFire at all, rendering the second chip useless.

With everything contained on a single die, glue-less, those problems are eliminated.


----------



## Dimaggio1103

Quote:


> Originally Posted by *FTWRoguE*
> 
> I keep seeing people saying they'd rather go with the fastest single card than go SLI. Is it really that bad?


No, SLI has not been that bad, IMO, since before the 4xx series. People keep repeating information they heard through the grapevine, even though it's incorrect. Even CrossFire has improved significantly over the past two generations.

I have run SLI 460, 560, 560 Ti, and now 660. Not a single issue at all. I've even CrossFired without cause for complaint.

People (noobs) have a way of hearing something and then repeating it like fact even though they have no idea what they're talking about.

Here's my thinking with my recent purchase: I could have afforded a 7970 GHz Edition, but it was cheaper and more powerful to grab two GTX 660s in SLI, which will beat a 7970 GHz hands down for less. I knew I would never CrossFire the 7970, so my choice was obvious. I never once worried about SLI problems.


----------



## ABeta

Quote:


> Originally Posted by *FTWRoguE*
> 
> I keep seeing people saying they'd rather go with the fastest single card than go SLI. Is it really that bad?


It's not bad at all, but given the choice, with lower power consumption and less heat and noise (if not watercooling) at similar performance, I think the single-card route is best. I have never had any issues with SLI aside from slight perceived input lag. Otherwise, my SLI experience has been good.


----------



## Avonosac

Quote:


> Originally Posted by *Master__Shake*
> 
> so launch cards are garbage right?? they don't perform at all that well right? you're wronger than wrongy up there!


My brain hurts trying to make you see there's a temporal difference in what we're talking about. You and Rayleyne seem stuck conflating release statistics with current-day performance. You are either supporting my statements, or holding your fingers in your ears saying "nananananana CAN'T HEAR YOU!!"

The GTX 680 beat the 7970 when it was released, and SLI worked MUCH better than CF the day the 680 launched. Both architectures have since had driver enhancements, and those statements no longer hold for current performance.
Quote:


> Originally Posted by *FTWRoguE*
> 
> I keep seeing people saying they'd rather go with the fastest single card than go SLI. Is it really that bad?


No, and yes. Both the client (game/program) and the driver need to support SLI, and the driver needs to be optimized for that title. When it works, it's almost as good as one card with twice the power; you can see scaling approaching 100% in some cases. Usually it's somewhere below that, and the best-optimized titles tend to be the newest and flashiest games.
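That scaling figure is just the relative frame-rate gain from the second card; with hypothetical FPS numbers (not from any real benchmark):

```python
def sli_scaling(fps_single: float, fps_dual: float) -> float:
    """Scaling efficiency of the second GPU: 1.0 (100%) means a perfect doubling."""
    return (fps_dual - fps_single) / fps_single

# Hypothetical frame rates: a well-profiled title vs. one with no working profile
print(f"{sli_scaling(60, 112):.0%}")  # good SLI profile -> ~87%
print(f"{sli_scaling(60, 63):.0%}")   # no working profile -> ~5%
```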
Quote:


> Originally Posted by *Dimaggio1103*
> 
> No, SLI has not been that bad, IMO, since before the 4xx series. People keep repeating information they heard through the grapevine, even though it's incorrect. Even CrossFire has improved significantly over the past two generations.
> 
> I have run SLI 460, 560, 560 Ti, and now 660. Not a single issue at all. I've even CrossFired without cause for complaint.
> 
> People (noobs) have a way of hearing something and then repeating it like fact even though they have no idea what they're talking about.
> 
> Here's my thinking with my recent purchase: I could have afforded a 7970 GHz Edition, but it was cheaper and more powerful to grab two GTX 660s in SLI, which will beat a 7970 GHz hands down for less. I knew I would never CrossFire the 7970, so my choice was obvious. I never once worried about SLI problems.


No, the software needs to be designed to utilize it; good drivers alone don't make software use both GPUs. Also, CrossFire's main issues show up in x3 and x4 configurations, when high resolutions or refresh rates are combined with high levels of AA. Since CrossFire uses pass-through, all the traffic must cross the CF bridges to reach the first card, and the link gets saturated very quickly.
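The saturation point is easy to estimate: each composited frame has to cross the bridge, so the required bandwidth is just width x height x bytes-per-pixel x FPS. The bridge capacity below is an illustrative assumption, not a measured CrossFire figure:

```python
def frame_traffic_gbs(width: int, height: int, fps: float, bytes_per_pixel: int = 4) -> float:
    """Bandwidth needed to pass finished frames to the master card, in GB/s."""
    return width * height * bytes_per_pixel * fps / 1e9

BRIDGE_CAPACITY_GBS = 1.0  # illustrative assumption, not a measured CF bridge figure

for w, h, fps in [(1920, 1080, 60), (2560, 1600, 60), (2560, 1600, 120)]:
    t = frame_traffic_gbs(w, h, fps)
    status = "saturated" if t > BRIDGE_CAPACITY_GBS else "ok"
    print(f"{w}x{h}@{fps}Hz: {t:.2f} GB/s ({status})")
```

Under this assumed capacity, 2560x1600 at high refresh already exceeds the link, which lines up with the observation that the problems appear at high resolutions and refresh rates.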


----------



## Master__Shake

Quote:


> Originally Posted by *Avonosac*
> 
> My brain hurts trying to make you see there's a temporal difference in what we're talking about. You and Rayleyne seem stuck conflating release statistics with current-day performance. You are either supporting my statements, or holding your fingers in your ears saying "nananananana CAN'T HEAR YOU!!"


Quote:


> Originally Posted by *Avonosac*
> 
> It's beating the snot out of it now, but it didn't at launch. This is taking into account CF and SLI, and the bad drivers for each new architecture. I believe GK110 was ready, but production wasn't, because of TSMC's huge manufacturing issues; a GK110 release then would have been a complete paper launch, instead of the mostly paper launch we have now. As I stated, and prava argued from the other side, they would have been killing their profit margins trying to sell the few GK110s they could produce as GeForce instead of Tesla K20. It was manufacturing holding them up with such low yields, not the architecture. GK110 was working just fine.


Quote:


> It's beating the snot out of it now, but it didn't at launch. This is taking into account CF and SLI, and the bad drivers for each new architecture.


The 7970 at launch was 925 MHz; since launch, AMD has put out a card clocked at 1050 MHz and released drivers better than the launch drivers, which weren't proper 7970 drivers to begin with. But yes, the 680 was marginally better.
Quote:


> I believe GK110 was ready, but production wasn't, because of TSMC's huge manufacturing issues; a GK110 release then would have been a complete paper launch, instead of the mostly paper launch we have now.


it was not ready until October-November of 2012, as has been shown before...
Quote:


> It was manufacturing holding them up with such low yields, not the architecture. GK110 was working just fine.


the only company having issues with the 28nm process was NVIDIA; I have already quoted and posted that article, so it was the architecture.


----------



## RobotDevil666

This really sounds a little too good to be true, but if it is, I'm sold. I really want to get rid of my 670 SLI, and if a single-GPU card can give me comparable performance... shutupandtakemymoney.jpg


----------



## Avonosac

Quote:


> Originally Posted by *Master__Shake*
> 
> The 7970 at launch was 925 MHz; since launch, AMD has put out a card clocked at 1050 MHz and released drivers better than the launch drivers, which weren't proper 7970 drivers to begin with. But yes, the 680 was marginally better.
> it was not ready until October-November of 2012, as has been shown before...
> the only company having issues with the 28nm process was NVIDIA; I have already quoted and posted that article, so it was the architecture.


Manufacturing process != the architecture.

As for the guy talking about releasing the architecture as soon as it's ready (development and manufacturing): this is kind of true, except it's not. 3dfx did exactly that, and it's the reason the company went under. It released the next generations it was developing well before it could recoup R&D costs and make a profit on its previous generations.

TSMC's trouble producing the 28nm chips may or may not have been architecturally based, and I would lean towards not (though the architecture might have made it harder to produce). But you certainly don't release a flagship chip for $550 that you can sell for $3000, when your midrange chip, at less than half the production cost, already beats the competition's flagship, just because you have the tech completed. Low manufacturing yields could easily explain the March 22nd to fall release window. Not to mention that marketing always plays a huge hand in when products are released. It's naive to assume companies will push tech out the door the minute it's complete if the market is not ready for it.


----------



## test tube

These GK110s are the Tesla cards... they come with high-throughput FPUs and are not targeted at gaming performance. My personal guess is that the GTX 680 and 690 will be higher performance and use less power/produce less heat.

The neat thing about GK104 (GTX 680) is that NVIDIA removed all the components that I use, namely FP/DP ops, and kept only the components that make gaming fast. That's why I'm still running 5xx-series cards in the lab.


----------



## Rayleyne

Quote:


> Originally Posted by *Avonosac*
> 
> Its beating the snot out of it now, but it didn't at launch. This is taking into account CF and SLI, and the bad drivers for each new architecture. I believe GK110 was ready, but production wasn't because of TSMCs huge manufacturing issues, the release of GK110 would have been a complete paper release, instead of this mostly paper release now. As I stated, and prava said the other side of the point, they would have been killing their profit margins trying to sell the few GK110 they were able to produce as GeForce instead of Tesla K20. It was the manufacturing holding them up with such low yields, not the architecture .. GK110 was working just fine.


They were equal, with the 7970 ahead in more games at launch, especially considering the 7970 is older; it won then hands down and wins now. Sorry, but the 680, while a great card, and GK104, while a great chip, didn't beat Tahiti XT outright, and it loses thoroughly now.

GK110 wasn't ready. It wasn't working at all because they were having issues at TSMC.
Quote:


> Manufacturing process != the architecture.
> 
> As for the guy who was talking about releasing the architecture as soon as its ready (development and manufacturing), this is kind of true, except its not. 3DFX did exactly that, and it is the reason the company went under. It released the next generations they were developing well before they were able to recoup R&D costs and make a profit on their previous generations.
> 
> TSMC having trouble producing the 28nm chips may or may not have been architecturally based and I would lean towards not(the architecture might have made it more complicated to produce), but you certainly don't release a flagship chip for 550 which you can sell for 3000, when your midrange chip at less than half the cost to produce already beats the flagship of the competition just because you have the tech completed. Low yields of the manufacturing process could easily explain the march 22nd to fall release window. Not to mention how marketing always plays a huge hand in when products will be release. It's naive to assume companies will push tech right out the door the minute they get it complete if the market is not ready for it.


That's the thing: their $3000 chip had not yet been built, it was having issues. NVIDIA was the only one at TSMC having issues, and as such it was an NVIDIA issue that GK110 wasn't working. Hence they clocked the balls off GK104, taped some features onto it, and hoped it worked, and it did.


----------



## Alatar

Quote:


> Originally Posted by *Rayleyne*
> 
> They were equal, with the 7970 ahead in more games at launch, especially considering the 7970 is older; it won then hands down and wins now. Sorry, but the 680, while a great card, and GK104, while a great chip, didn't beat Tahiti XT outright, and it loses thoroughly now.


Even the TPU review that was quoted by everyone saying the difference between the 680 and 7970 wasn't a big one says that the 680 was the faster card at all resolutions...

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_680/27.html

The 680 launched with higher performance than the 7970 at a lower price. Back then it was about the 7970 and 680 being on par when both were overclocked (7970 or 680 winning by a bit depending on the OCs in question).


----------



## rcfc89

Quote:


> Originally Posted by *Alatar*
> 
> Even the TPU review that was quoted by everyone saying the difference between the 680 and 7970 wasn't a big one says that the 680 was the faster card at all resolutions...
> 
> http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_680/27.html
> 
> The 680 launched with higher performance than the 7970 at a lower price. Back then it was about the 7970 and 680 being on par when both were overclocked (7970 or 680 winning by a bit depending on the OCs in question).


^This

The 7970 never took the lead until the GHz Edition was released with new drivers six months later. It was too little, too late at that point; NVIDIA had already raked in most of the sales even though the 7970 came first. Add in the GTX 690, which had no competition since AMD failed to ship a dual-GPU card, and NVIDIA easily took this round in profits.


----------



## Rayleyne

Quote:


> Originally Posted by *Alatar*
> 
> Even the TPU review that was quoted by everyone saying the difference between the 680 and 7970 wasn't a big one says that the 680 was the faster card at all resolutions...
> 
> http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_680/27.html
> 
> The 680 launched with higher performance than the 7970 at a lower price. Back then it was about the 7970 and 680 being on par when both were overclocked (7970 or 680 winning by a bit depending on the OCs in question).


Unfortunately, Alatar, my frozen friend, some of those are within what some would call a margin of error. And I never quoted TPU.


----------



## hammerforged

The real question is: will it have voltage control or not? If so...


----------



## Avonosac

Quote:


> Originally Posted by *Rayleyne*
> 
> They were equal, with the 7970 ahead in more games at launch, especially considering the 7970 is older; it won then hands down and wins now. Sorry, but the 680, while a great card, and GK104, while a great chip, didn't beat Tahiti XT outright, and it loses thoroughly now.
> 
> GK110 wasn't ready. It wasn't working at all because they were having issues at TSMC.
> That's the thing: their $3000 chip had not yet been built, it was having issues. NVIDIA was the only one at TSMC having issues, and as such it was an NVIDIA issue that GK110 wasn't working. Hence they clocked the balls off GK104, taped some features onto it, and hoped it worked, and it did.


Quote:


> Rayleyne
> ATI Enthusiast


Explains a lot.

Now, on to TSMC: the chip was *obviously* built, or they wouldn't have been able to find out their manufacturing process was borked. The problem they had was low yields, *not* architecture problems, meaning the process was flawed and needed tweaking. Your beloved ATI was also having issues; they were just getting higher yields, though not great ones at the time.

Even so, GK104 was also yielding poorly; it was just less expensive to fail those dies than GK110s, so they shipped the 680 as a mostly voltage-locked, clocked-to-hell GK104. They could get away with it because the 7970's launch performance was abysmal due to horrible drivers. Obviously that has changed, since today's 7970 beats a 680 quite readily, but that is also because the 680 should really have been released as the 660 or 665 or 660 Ti, since it's the top of the midrange Kepler line.
Quote:


> Originally Posted by *Alatar*
> 
> Even the TPU review that was quoted by everyone saying the difference between the 680 and 7970 wasn't a big one says that the 680 was the faster card at all resolutions...
> 
> http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_680/27.html
> 
> The 680 launched with higher performance than the 7970 at a lower price. Back then it was about the 7970 and 680 being on par when both were overclocked (7970 or 680 winning by a bit depending on the OCs in question).


^


----------



## Alatar

Quote:


> Originally Posted by *Rayleyne*
> 
> Unfortunately, Alatar, my frozen friend, some of those are within what some would call a margin of error. And I never quoted TPU.


Never said you did but it was quoted a lot. Other sites showed bigger differences in favor of the 680.

Anyways, I'd agree that they're within the margin of error if we were talking about one game where each card won some resolutions. But the 680 averaged better across a big bunch of games and won at all resolutions.

That doesn't mean that the 7970 didn't pull ahead clearly after the GHz edition and 12.11s but when the 680 launched, there was a reason NV called it the fastest and most efficient GPU ever built. Pretty much all reviewers said it was just that.


----------



## Avonosac

Quote:


> Originally Posted by *Rayleyne*
> 
> Unfortunately, Alatar, my frozen friend, some of those are within what some would call a margin of error. And I never quoted TPU.


How does that even matter? You are still comparing the first-to-market top-of-the-line card to a mid-range GPU. How can you delude yourself into thinking this was somehow a GOOD situation for AMD to be in? Heads should have rolled on the driver teams down in Austin, and you boast of this like it's a _good_ thing?


----------



## Rayleyne

Quote:


> Originally Posted by *Avonosac*
> 
> Explains a lot.
> 
> Now, on to TSMC: the chip was *obviously* built, or they wouldn't have been able to find out their manufacturing process was borked. The problem they had was low yields, *not* architecture problems, meaning the process was flawed and needed tweaking. Your beloved ATI was also having issues; they were just getting higher yields, though not great ones at the time.
> 
> Even so, GK104 was also yielding poorly; it was just less expensive to fail those dies than GK110s, so they shipped the 680 as a mostly voltage-locked, clocked-to-hell GK104. They could get away with it because the 7970's launch performance was abysmal due to horrible drivers. Obviously that has changed, since today's 7970 beats a 680 quite readily, but that is also because the 680 should really have been released as the 660 or 665 or 660 Ti, since it's the top of the midrange Kepler line.
> ^


Actually, just having the ATI Enthusiast tag below my name doesn't mean anything; it just says you don't have an argument. They weren't merely having yield problems, they were having problems everywhere, and the whole thing needed tinkering. At the time of the GK104 launch, GK110 wasn't even remotely close to ready. And before anyone tries to say the 680 was cheaper: it was a thousand bucks here on launch and stayed that way for a very long time. Only recently did it start heading down to the affordable $600 and $500 marks in Australia. Want to know the 7970's launch price? 500 bucks. Want to know what it was when the 680 launched? Still 500 bucks.

The 7970 was equal or better and was cheaper. I'm not saying the 680 was a bad chip; it was brilliant. But everyone saying NVIDIA "knew" GK104 could compete and didn't need to sell GK110 is wrong. NVIDIA needed something to compete, knew they weren't going to get GK110 out anywhere near in time, so they supercharged GK104 and slapped some features onto it.


----------



## maarten12100

Quote:


> Originally Posted by *Alatar*
> 
> Even the TPU review that was quoted by everyone saying the difference between the 680 and 7970 wasn't a big one says that the 680 was the faster card at all resolutions...
> 
> http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_680/27.html
> 
> The 680 launched with higher performance than the 7970 at a lower price. Back then it was about the 7970 and 680 being on par when both were overclocked (7970 or 680 winning by a bit depending on the OCs in question).


That isn't really fair, as relative performance also counts the games in which bad drivers gave the HD 7970 rubbish performance.
So I guess neither is really better: the HD 7970 supports multi-monitor setups and has the VRAM for it, while the 680 is more efficient and has a better cooler. (The performance/watt sweet spot is still an HD 7750 or HD 7850, so let's keep that out of the picture; we're talking about monster cards here.)


----------



## freitz

So are we going to get back on topic with new info about the GK110 release, or is this another thread the mods will close soon now that it's AMD vs. NV?


----------



## iARDAs

Hmmm to SLI my 670 4GB or sell it and get this.

Hard choice ahead.


----------



## maarten12100

Quote:


> Originally Posted by *freitz*
> 
> So are we going to get back on topic with new info about the GK110 release, or is this another thread the mods will close soon now that it's AMD vs. NV?


Let's hope not. I don't get the argument; it's a proven fact the GTX 680 used a midrange GK104 chip. I myself own a GTX 570, an HD 7850, and a GTX 275, and I've also had an HD 4870 2GB, a 9800 GTX, and an HD 4850 512MB.
I tend to choose whatever is best; who cares who it's from? Many years have shown us there was never a clear winner. (However, we can all agree the GTX 2xx series was very nice.)
Quote:


> Originally Posted by *iARDAs*
> 
> Hmmm to SLI my 670 4GB or sell it and get this.
> 
> Hard choice ahead.


If you can get good money for it, sell it and get this monster: it has more RAM and the same performance in a single card.

Why would you put money towards something that will be outdated sooner? You could SLI the Titan in a few years if you need more performance.


----------



## rcfc89

What's really sad is that AMD had to do a refresh of the 7970 with the GHz Edition six months later just to compete. The 7970 was getting outperformed even by the 670. Did Nvidia ever do a refresh? Nope. They don't have to, since they already took the majority of sales and no longer care. They have much bigger things in store.


----------



## maarten12100

Quote:


> Originally Posted by *Avonosac*
> 
> That first statement kind of ignores the point: the card's performance at its best, at the time of the 680 launch, was below or equal to the 680's. People buy off of performance now; they had no way to know AMD would finally fix the drivers and make the card perform as it should.


Well, that's almost like saying that if a GTX 690 doesn't scale in SLI in one game, and therefore falls below a GTX 680, then the GTX 690 is worse than a GTX 680 just because it doesn't have a correct SLI profile.

You see what I'm getting at?
If you look at the benches individually, you see some games really screwing up on the AMD cards and dragging the average score way down.

I would rather have a strong card with stinky drivers than a weak card with good drivers (as long as the drivers improve over time).

Now let's get back on topic about this monster card.


----------



## Rayleyne

Quote:


> I'm glad I don't live in Australia
> 
> I'd still have to pay around $1400 for a 690 but at least AMD and NV are competitively priced (compared to AUS, obviously two 7970s are better value than a 690).
> 
> Oh wait that's not good is it...


I have no doubt Titan will be a great card, but I don't see it retailing for any less than $1500 here.


----------



## Avonosac

Quote:


> Originally Posted by *maarten12100*
> 
> Well that is almost like saying if a gtx690 doesn't scale in sli in one game therefor falls below a gtx680, the gtx690 is worse than a gtx680 just because it doesn't have a correct SLI profile.
> 
> You understand what I'm getting at?
> If you look at the benches individually you see some games really screwing up under the AMD cards and making the average score turn out way lower.
> 
> I would rather have a strong card with sinky drivers than have a weak card with good drivers (As long as drivers getting better over time)
> 
> Now lets get back ontopic about this monster card


You are missing the entire context of the discussion. You jumped in and, without context, said something that didn't mesh with the conversation.

It still doesn't; go back and read the thread.

Quote:


> Originally Posted by *Rayleyne*
> 
> I have no doubts titan will be a great card, But i don't see it retailing any less then 1500 here


Bru-tal. I guess I can forgive you; it seems quite painful trying to nerd it up on the prison continent.


----------



## Rayleyne

Quote:


> Originally Posted by *Avonosac*
> 
> My mistake, I was more concerned with the jibe.
> You are missing the entire context of the discussion. You jumped in and without context said something that didn't mesh with the conversation.
> 
> It still doesn't, go back and read the thread.
> Bru-tal. I guess I can forgive you, seems quite painful to try to nerd it up on the prison continent.


I did the math at launch: if it retails for $899 in America and about $1500 here, it's cheaper in the long run to buy a return plane ticket to America, buy four there, and bring them home in a suitcase than it is to buy four here.
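A quick sanity check of that suitcase math, as a sketch: the card prices come from the thread (~$899 US vs. ~$1500 AU), while the return airfare is an assumed figure, not anything quoted here.

```python
# Hypothetical cost comparison: flying to the US for four Titans vs. buying locally.
US_PRICE = 899
AU_PRICE = 1500
RETURN_FARE = 2000  # assumed round-trip Australia-US airfare

cost_abroad = 4 * US_PRICE + RETURN_FARE  # four cards plus the flight
cost_local = 4 * AU_PRICE                 # four cards at home

print(cost_abroad, cost_local)  # 5596 6000
```

Even with a generous fare estimate, the trip comes out ahead at four cards.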


----------



## Avonosac

Quote:


> Originally Posted by *Rayleyne*
> 
> I did the math at launch if it retails for 899 in america and about 1500 here it's cheaper in the long run if i buy a 2 way plane ticket to america and buy four there and bring them home in a suitcase then it is to buy 4 here


I'll bring the beer.


----------



## Gallien

Quote:


> Originally Posted by *Rayleyne*
> 
> I did the math at launch if it retails for 899 in america and about 1500 here it's cheaper in the long run if i buy a 2 way plane ticket to america and buy four there and bring them home in a suitcase then it is to buy 4 here


I'll ship them for you if you want; save on the plane ticket.

Just let me take pictures with 4 of them in my rig first!


----------



## Rayleyne

Quote:


> Originally Posted by *Avonosac*
> 
> I'll bring the beer.


I'm more of a tea person. And I think one Titan should be able to do 7680x1440, no?


----------



## Usario

Quote:


> Originally Posted by *rcfc89*
> 
> What's really sad is Amd had to do a refresh of the 7970 with the GHZ 6 months later just to compete. The 7970 was getting outperformed by even the 670. Did Nvidia ever do a refresh? Nope. They don't have to since they already took the majority of sales and no longer care. They have much bigger things in store.


This card costs $900 and appeals to an entirely different market than the one that the 7970 and 680 appeal to, so I have little idea what you're on about when you say they have bigger things in store. This card is for those who would otherwise require CFX/SLI.


----------



## Nautilus

Ugh. Who cares? It's still a Kepler card, which means it's crippled compute-wise. Almost 3k CUDA cores and still pathetic GPGPU performance...

If you're into folding, crunching, Photoshop, After Effects, or video editing, then this is not the right card for you. This is only for gaming.


----------



## Avonosac

Quote:


> Originally Posted by *Rayleyne*
> 
> I'm more of a Tea person, And i think 1 titan should be able to do 7680x1440 no?


The GDDR shouldn't be any problem at all, but I think the FPS you get will depend a lot on exactly where its performance lands relative to a 690, and on how much post-render processing is being done. Short answer: probably, but it might not hold a steady 60 in the rougher games.
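For what it's worth, the render targets themselves at 7680x1440 are tiny next to 6 GB; it's textures that actually fill VRAM. A rough sketch, assuming 4 bytes per pixel for an RGBA8 color buffer:

```python
# Size of one full-resolution buffer at triple-1440p.
WIDTH, HEIGHT = 7680, 1440
BYTES_PER_PIXEL = 4  # assumed RGBA8 color buffer

buffer_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL / 2**20
print(round(buffer_mb, 1))  # 42.2 (MB per buffer)
```

Even a handful of such buffers (double-buffering, depth, post-processing targets) stays well under a gigabyte, which is why the memory itself "shouldn't be any problem."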


----------



## smithyzbak

The Titan is supposedly incredibly powerful, but who is the target for this card? The multi-monitor and 1440p+ crowd?


----------



## Usario

Quote:


> Originally Posted by *Nautilus*
> 
> Ugh. Who cares? It's still a kepler card which means it's crippled computing wise. Almost 3k CUDA cores and still pathetic GPGPU performance...


Poor GPGPU performance has nothing to do with Kepler; it was the GK104 chip, which has 1/24-rate FP64. GK110's FP64 performance is, IIRC, 1/3 of FP32 (not as impressive as Fermi's 1/2, but considering that FP32 throughput went up by nearly 3x, it's not bad at all).
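To put those ratios in rough numbers, here's a peak-throughput sketch. The core counts and clocks below are illustrative placeholders, not confirmed specs; the formula assumes 2 FLOPs per core per cycle (fused multiply-add).

```python
# Peak GFLOPS from cores, clock, and the FP64:FP32 ratio.
def peak_gflops(cores, clock_ghz, fp64_ratio):
    fp32 = 2 * cores * clock_ghz  # 2 FLOPs/core/cycle via FMA
    return fp32, fp32 * fp64_ratio

fp32, fp64 = peak_gflops(1536, 1.0, 1 / 24)  # GK104-style: 1/24-rate FP64
print(fp32, fp64)                            # 3072.0 128.0
fp32, fp64 = peak_gflops(2688, 0.85, 1 / 3)  # GK110-style: 1/3-rate FP64
print(round(fp32), round(fp64))              # 4570 1523
```

The point of the sketch: even with roughly similar FP32 peaks, the 1/24 vs. 1/3 ratio is an order-of-magnitude gap in double-precision throughput.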


----------



## maarten12100

Quote:


> Originally Posted by *Nautilus*
> 
> Ugh. Who cares? It's still a kepler card which means it's crippled computing wise. Almost 3k CUDA cores and still pathetic GPGPU performance...
> 
> If you're into folding, crunching, photoshop, after effects or video editing then this is not the right card for you. This is only for gaming.


Well, it might have decent GPGPU performance, unless Nvidia decided to laser out those parts of the die, but it will still be half crippled, since the EEPROM BIOS will be rewritten for use as a gaming card.


----------



## rcfc89

Quote:


> Originally Posted by *Avonosac*
> 
> The gddr shouldn't be any problem at all, but I think the FPS you get out of it will depend a lot on exactly where the performance lands in relation to a 690 and how much post render processing will be done. Short answer, probably but it might not get a steady 60 in the rougher games.


I'm going to say no on this one. My 690 can barely hold 60fps at max settings at 2560x1440 in the latest demanding games (FC3, Hitman, C3, etc.). Throwing three 2560 screens at it would surely bring trouble. I'd say go SLI Titans to make sure you have no problems with upcoming titles.


----------



## Razor 116

So let me get this straight: Nvidia is going to release a card that is not 7xx series, supposedly more powerful than a GTX 690, in a new "Enthusiast" line called Titan, while also having the 7xx series coming out Q3-Q4. So is the 780 going to be toned down, or will it be faster than a 680 but not as fast as a GeForce Titan? It all seems a little far-fetched to me, but we'll see in the coming weeks.


----------



## Avonosac

Quote:


> Originally Posted by *rcfc89*
> 
> I'm going to say no on this one. My 690 can barely hold things at 60fps in max settings 2560x1440 in the latest demanding games FC3, HitMan,C3 etc. Throwing 3 (2560) screens at it would surely bring trouble. I'd say go sli Titan's to assure you have no problems with upcoming titles.


I'd listen to this guy, then. I tend to play DX9c or DX10 games when I have time to game, and usually just on one 2560x1440 monitor; I can't see them being that demanding, since my 670 has no trouble. If this card does end up voltage-unlocked, I can see myself ponying up the money for it, and potentially getting a second one down the line. Maybe getting into some newer, and hopefully not crappy, FPSes.


----------



## rcfc89

Quote:


> Originally Posted by *smithyzbak*
> 
> The Titan is supposedly to be incredibly powerful, but who is the target for this card? Like the multi monitor and 1440+ crowd?


In short... yes. A single 670 can easily max things out on a single 1080p monitor. This is for the big dawgs wanting max eye candy: 2560, 120Hz, and Eyefinity resolutions.


----------



## Razor 116

Quote:


> Originally Posted by *rcfc89*
> 
> In short....Yes. A single 670 can easily max things out on a single 1080p monitor. This is for the big dawgs wanting max eyecandy. 2560/120hz/Eyefinity resolutions


A 670 cannot max out everything at 1080p. Far Cry 3, Crysis 3 etc.


----------



## Nocturin

Quote:


> Originally Posted by *Razor 116*
> 
> A 670 cannot max out everything at 1080p. Far Cry 3, Crysis etc.


Especially FC3.

The 670 is great, but it's not a 680, and even a 680 falls short without SLI.


----------



## Avonosac

Quote:


> Originally Posted by *Razor 116*
> 
> A 670 cannot max out everything at 1080p. Far Cry 3, Crysis etc.


Save for the horribly optimized games, I didn't have any issues with older games on my 670 at 1200p. It does require a bit more massaging at 2560x1440, though, especially when it's still running stuff on the 1200p second monitor. Skyrim and AC3 were no issue for me to play.


----------



## freitz

My GTX 680 overclocked to 1300 only gets 30fps in Crysis 3 at 1440p, to give you an idea of where this card would be needed: those of us who would otherwise SLI could now have a single-card solution. I myself would pull the trigger on this if the numbers are true. However, we still have yet to hear anything from Nvidia.


----------



## Rayleyne

Well, the main issue is that I'll be going phase change with one plate for the CPU and GPU (cooling both GPU and CPU with only one unit) and aiming for high overclocks. Multi-GPU setups always introduce micro stutter; a single GPU seems a lot smoother. Two 9600 GTs? Micro stutter. Two GTX 460s? Micro stutter again. Same with two 7970s. Hence I'd really like a single ultra-fast card.


----------



## Nocturin

Skyrim a properly optimized game?

AC3 a properly optimized game?

You funny.


----------



## Avonosac

Quote:


> Originally Posted by *Nocturin*
> 
> Especially FC3.
> 
> the 670 is great but... not a 680, and even then it falls short without sli.


There isn't that much difference between the 670 and 680 when you clock them similarly; it's why I ended up with the 670 and not the 680. To really max out games you need SLI, in which case you usually have at least 20% performance overkill for a single-monitor display, even at 2560x1600.


----------



## Stay Puft

Quote:


> Originally Posted by *Nocturin*
> 
> Especially FC3.
> 
> the 670 is great but... not a 680, and even then it falls short without sli.


The 670 is like 5-8% slower than a 680 with the same memory bandwidth.


----------



## Avonosac

Quote:


> Originally Posted by *Nocturin*
> 
> Skyrim a properly optimized game?
> 
> AC3 a properly optimized game?
> 
> You funny.


No, but out of the box, when I played them on my 670 4GB, I didn't have any issues.


----------



## Usario

Quote:


> Originally Posted by *Nocturin*
> 
> woosaaa Usario.
> 
> They don't think it be like it is, but it do.


They never seem to think nothing be like it is. ;-;
Quote:


> Originally Posted by *Razor 116*
> 
> So let me get this straight Nvidia is going to release a card that is not 7xx series, more powerful than a GTX 690 supposedly in a new line for "Enthusiasts" called Titan whilst also having the 7xx series coming out Q3-Q4. So is the 780 going to be toned down or are they going to be faster than a 680 but not as fast as a GeForce Titan. All seems a little far fetched to me but we'll see in the coming weeks.


The Titan is entering a class of single GPUs that in recent history only had one other member (the 8800 Ultra). It has no equivalent in the 600 series, or any series before that until you get back to the 8 series... and even then, the 8800 Ultra really wasn't much better than the GTX; it just clocked insanely high (similar to the HD 4890 on the ATI side). The GeForce 700 series will be a moderate improvement over the 600 series, probably around 15-30%, which is reasonable considering they're on the same 28nm node.

Here's how it's probably going to stack up:

GTX 680 - 1536 SPs - 2GB/4GB 256-bit GDDR5 - 32 ROPs - GK104 - ~$500
GTX 770 - ~1728 SPs - 2GB/4GB 256-bit GDDR5 - 32 ROPs - GK114 - ~$400
GTX 780 - ~1920 SPs - 2GB/4GB 256-bit GDDR5 - 32 ROPs - GK114 - ~$500
Titan - ~2688 SPs - 6GB 384-bit GDDR5 - 48 ROPs - GK110 - ~$900
GTX 690 - 3072 SPs - 4GB 512-bit GDDR5 - 64 ROPs - GK104 x2 - ~$1000
GTX 790 - ~3840 SPs - 4GB/8GB? 512-bit GDDR5 - 64 ROPs - GK114 x2 - $????

On the AMD side there probably won't be a direct competitor:

HD 7970 - 2048 SPs - 3GB 384-bit GDDR5 - 32 ROPs - ~$550 (now down to ~$400 ofc)
HD 8950 - ~2304 SPs - 3GB 384-bit GDDR5 - 48 ROPs - ~$400?
HD 8970 - ~2560 SPs - 3GB/6GB 384-bit GDDR5 - 48 ROPs - ~$500?
HD 7990 (Dual Tahiti XT [7970]) - 4096 SPs - 6GB 768-bit GDDR5 - 64 ROPs - ~$900
HD 8990 (Dual 8970) - ~5120 SPs - 6GB/12GB 768-bit GDDR5 - 96 ROPs - ~$1000?

These are just the generally accepted rumors going around though so don't quote me on it... they're probably correct though.
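One thing the rumored bus widths above do pin down is peak memory bandwidth: bytes per second = (bus width / 8) x the effective GDDR5 data rate. A quick sketch, where the 6 GT/s effective rate is an assumption on my part:

```python
# Peak memory bandwidth in GB/s from bus width (bits) and data rate (GT/s).
def bandwidth_gb_s(bus_bits, data_rate_gtps=6.0):
    return bus_bits / 8 * data_rate_gtps

print(bandwidth_gb_s(256))  # 192.0 GB/s, 680-class 256-bit bus
print(bandwidth_gb_s(384))  # 288.0 GB/s, Titan-class 384-bit bus
```

So at the same memory clock, the rumored 384-bit bus alone is a 50% bandwidth jump over the 680.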


----------



## maarten12100

Quote:


> Originally Posted by *Avonosac*
> 
> There isn't that much difference between the 670 and 680 when you clock em similar, its why I ended up with the 670 and not the 680.. to really max out the games you need SLI, in which case you're usually at least 20% performance overkill for a single monitor display, even at 2560x1600.


Isn't it a proven fact that these cards simply cannot run Far Cry 3 maxed, even at 1080p?
An SLI setup is required to get a decent 20+ frames/s, especially going up to 1440p.

I guess you're pro-Nvidia or something, but what does it really matter? I've had both, still have both, and both are fine.


----------



## Cloudfire777

FFS, leave AMD or any fanboy discussions out of this discussion.

The 680 was faster than the 7970 at launch. AMD made a new card with higher clocks, aka a "new" GPU (let's be real, it's not really a new GPU, but you get my point). It is faster than the 680. Nvidia could have done the same thing to counter the 7970 GHz version, and around we go.

Include the fanboys who have to have a pissing contest over which GPU is faster and which is the best for the buck, while some prefer Nvidia and don't care so much about performance/price and just want the best; top it off with AMD people calling Nvidia guys stupid, threads get closed, new cards are soon out, and the whole freaking cycle repeats itself.

Nobody actually games on their computer. GPUs are just e-peen, and people spend way more time on forums discussing the very GPU they just bought, to reassure themselves and to try to convince other people that their train of thought is the only correct one.

Forget it, and just buy whatever you like. Less discussing, more gaming.


----------



## DADDYDC650

Quote:


> Originally Posted by *Usario*
> 
> They never seem to think nothing be like it is. ;-;
> The Titan is entering a class of single GPUs that in recent history only had one other member (the 8800 Ultra). It has no equivalent in the 600 series, or any series before that until you get back to the 8 series... and even then, the 8800 Ultra really wasn't much better than the GTX; it just clocked insanely high (similar to the HD 4890 on the ATI side). The GeForce 700 series will be a moderate improvement over the 600 series, probably around 15-30%, which is reasonable considering they're on the same 28nm node.
> 
> Here's how it's probably going to stack up:
> 
> GTX 680 - 1536 SPs - 2GB/4GB 256-bit GDDR5 - 32 ROPs - GK104 - ~$500
> GTX 770 - ~1728 SPs - 2GB/4GB 256-bit GDDR5 - 32 ROPs - GK114 - ~$400
> GTX 780 - ~1920 SPs - 2GB/4GB 256-bit GDDR5 - 32 ROPs - GK114 - ~$500
> Titan - ~2688 SPs - 6GB 384-bit GDDR5 - 48 ROPs - GK110 - ~$900
> GTX 690 - 3072 SPs - 4GB 512-bit GDDR5 - 64 ROPs - GK104 x2 - ~$1000
> GTX 790 - ~3840 SPs - 4GB/8GB? 512-bit GDDR5 - 64 ROPs - GK114 x2 - $????
> 
> On the AMD side there probably won't be a direct competitor:
> 
> HD 7970 - 2048 SPs - 3GB 384-bit GDDR5 - 32 ROPs - ~$550 (now down to ~$400 ofc)
> HD 8950 - ~2304 SPs - 3GB 384-bit GDDR5 - 48 ROPs - ~$400?
> HD 8970 - ~2560 SPs - 3GB/6GB 384-bit GDDR5 - 48 ROPs - ~$500?
> HD 7990 (Dual Tahiti XT [7970]) - 4096 SPs - 6GB 768-bit GDDR5 - 64 ROPs - ~$900
> HD 8990 (Dual 8970) - ~5120 SPs - 6GB/12GB 768-bit GDDR5 - 96 ROPs - ~$1000?
> 
> These are just the generally accepted rumors going around though so don't quote me on it... they're probably correct though.


+1


----------



## Rayleyne

Quote:


> Originally Posted by *Cloudfire777*
> 
> FFS, leave AMD out of this discussion.
> 
> 680 was faster than 7970 at launch. AMD made a new card with higher clocks, aka a new GPU. It is faster than 680. Nvidia could have done the same thing to 7970GHz version, and the whole freaking cycle repeats itself.
> 
> Included fanboys who have to have a pissing contest on which GPU is faster, which is the best for the buck, while some prefer Nvidia and doesn`t care so much about the performance/price and just want the best, this is topped with AMD people calling Nvidia guys stupid, threads are closed, new cards are soon out, and the whole freaking MF cycle repeats itself.
> 
> Nobody actually game on their computer. GPUs are just e-peen, and people use way more time on forums discussing the very GPU they just bought to reinsure themselves and try to convince other people that their train of thoughts is the only correct.
> 
> **** humans, and **** and just buy whatever you like. Less discussing, more gaming.


Uhhhh... the 7970 and the 7970 GHz Edition are the same thing, just clocked differently...


----------



## Nocturin

Quote:


> Originally Posted by *Avonosac*
> 
> There isn't that much difference between the 670 and 680 when you clock em similar, its why I ended up with the 670 and not the 680.. to really max out the games you need SLI, in which case you're usually at least 20% performance overkill for a single monitor display, even at 2560x1600.


Quote:


> Originally Posted by *Stay Puft*
> 
> 670 is like 5-8% slower then a 680 with the same memory bandwidth


You're both talking to someone who has a 7950. I can clock just as high and get roughly the same numbers as a 7970 (within 5-8%). It's still not a 7970. I can pretend all day by clocking at 1100+ and getting those extra few frames... see where I'm going?
Quote:


> Originally Posted by *Cloudfire777*
> 
> FFS, leave AMD or any fanboy discussions out of this discussion.
> 
> 680 was faster than 7970 at launch. AMD made a new card with higher clocks, aka a new GPU (lets be real, its not really a new GPU, but you get my point). It is faster than 680. Nvidia could have done the same thing to 7970GHz version, and around we go.
> 
> Included fanboys who have to have a pissing contest on which GPU is faster, which is the best for the buck, while some prefer Nvidia and doesn`t care so much about the performance/price and just want the best, this is topped with AMD people calling Nvidia guys stupid, threads are closed, new cards are soon out, and the whole freaking MF cycle repeats itself.
> 
> *Nobody actually game on their computer. GPUs are just e-peen,* and people use way more time on forums discussing the very GPU they just bought to reinsure themselves and try to convince other people that their train of thoughts is the only correct.
> 
> **** humans, and **** and just buy whatever you like. Less discussing, more gaming.


I dunno about you, but I game on my computer more than anything else. I care very much about price/performance, and that's why I have a 7950, not a 680 (splitting hairs).

If I wanted e-peen, I wouldn't have a 965 BE paired with a 7950.


----------



## Cloudfire777

Quote:


> Originally Posted by *Rayleyne*
> 
> Uhhhh.... the 7970 and the 7970ghz edition are the same thing just clocked differantly...


Try to use your brain for once, ok?
It's the same GPU, but it counts as a "new GPU" because it's clocked differently, the same way rebadges are "new GPUs," and the same way higher-clocked GPUs are faster than lower-clocked ones. Nvidia could have done the exact same thing. Case closed.

Continue with the Titan discussion and leave AMD out of this. I know I'm asking for something impossible here, but it's so frustrating watching people try to convince each other that one company is better than the other.


----------



## Usario

Quote:


> Originally Posted by *Cloudfire777*
> 
> FFS, leave AMD or any fanboy discussions out of this discussion.
> 
> 680 was faster than 7970 at launch. AMD made a new card with higher clocks, aka a new GPU (lets be real, its not really a new GPU, but you get my point). It is faster than 680. Nvidia could have done the same thing to 7970GHz version, and around we go.
> 
> Included fanboys who have to have a pissing contest on which GPU is faster, which is the best for the buck, while some prefer Nvidia and doesn`t care so much about the performance/price and just want the best, this is topped with AMD people calling Nvidia guys stupid, threads are closed, new cards are soon out, and the whole freaking MF cycle repeats itself.
> 
> Nobody actually game on their computer. GPUs are just e-peen, and people use way more time on forums discussing the very GPU they just bought to reinsure themselves and try to convince other people that their train of thoughts is the only correct.
> 
> **** humans, and **** and just buy whatever you like. Less discussing, more gaming.


The 7970 GHz Edition isn't a new GPU. It's a new BIOS. The GPU itself did not change one bit and there's no evidence to suggest there's any binning going on.

And at this point in time, Nvidia being the best is subjective and depends on your needs.

But I agree that some people make GPUs out to be all about epeen.
Quote:


> Originally Posted by *Cloudfire777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Rayleyne*
> 
> Uhhhh.... the 7970 and the 7970ghz edition are the same thing just clocked differantly...
> 
> 
> 
> Try to use your brain for once ok?
> Its the same GPU but its a new GPU because its clocked differently. Same as rebadge are "new GPUs", and same as higher clocked GPUs are faster than lower clocked GPUs. Nvidia could have done the exact same. Case closed.
> 
> Continue with the Titan discussion and leave AMD out of this. I know I`m asking for something impossible here but its so frustrating watching people try to convince other that any company is better than the other.
Click to expand...

Um, no. It's not like the 4890, which actually clocked better than the 4870. You can literally turn a 7970 into a GHz Edition by flashing a BIOS. It's like GeForce 8800 -> 9800 or Radeon 5700 -> 6700. It's not a new GPU.

If Nvidia did the same thing, they'd have to remove the TDP cap, considering that Kepler Boost already lets their cards boost as high as they can so long as they don't start putting out too much heat.


----------



## Nocturin

Quote:


> Originally Posted by *Cloudfire777*
> 
> Try to use your brain for once ok?
> Its the same GPU but its a new GPU because its clocked differently. Same as rebadge are "new GPUs", and same as higher clocked GPUs are faster than lower clocked GPUs. Nvidia could have done the exact same. Case closed.
> 
> Continue with the Titan discussion and leave AMD out of this.


It's the same GPU with different firmware. That's it.

It replaced the hot-clocked versions from the different manufacturers.

Don't insult others.
Quote:


> Originally Posted by *Usario*
> 
> The 7970 GHz Edition isn't a new GPU. It's a new BIOS. The GPU itself did not change one bit and there's no evidence to suggest there's any binning going on.
> 
> And at this point in time nvidia being the best is subjective and depends on your needs.
> 
> But I agree that some people make GPUs out to be all about epeen.


It's been a while since you and I have gone tit-for-tat in a thread. I forgot how fun it was!


----------



## Rayleyne

Quote:


> Originally Posted by *Cloudfire777*
> 
> Try to use your brain for once ok?
> Its the same GPU but its a new GPU because its clocked differently. Same as rebadge are "new GPUs", and same as higher clocked GPUs are faster than lower clocked GPUs. Nvidia could have done the exact same. Case closed.
> 
> Continue with the Titan discussion and leave AMD out of this. I know I`m asking for something impossible here but its so frustrating watching people try to convince other that any company is better than the other.


I am using my brain. If I take a 7970 and clock it at 1GHz, it's still the same 7970.


----------



## Avonosac

Quote:


> Originally Posted by *maarten12100*
> 
> Isn't it a proven fact that the cards simply can not run Far Cry 3 even at 1080P.
> A sli set is required to get decent pro 20+ frames/s especially if going up to 1440P
> 
> I guess you're pro Nvidia or something but what does it really matter had both still have both and both are fine.


I have a 670; I came from 4890s. I buy the right price/performance cards that fit my needs. The significantly lower power consumption of the 600 series swayed me, since performance in the games I play most (CS:GO, LoL, AC3, and Skyrim) was the same.

Quote:


> Originally Posted by *Nocturin*
> 
> Your both talking to someone who has a 7950. I can clock just as high and get roughly (between 5-8%) same numbers as the 7970. It's still not a 7970. I can pretend all day by clocking at 1100+ and getting those extra few frames... see where I'm going?


You're missing the point; I'm really not sure what you're trying to get at. The point is that my 670s in SLI will net me 3-4 fewer FPS than 680s clocked the same in FC3 or other card-burner games, but those 3-4 FPS would cost $200. If I need to spend $800 to make the game playable, the question becomes: are the extra 3-4 frames worth $200? My answer was no, so I got the 670, which fits my needs for AA, high textures, and 1440p in Skyrim.


----------



## Cloudfire777

My point here was that a vBIOS with higher clocks from AMD/Nvidia, along with a new hardware ID, equals a "new GPU."

That's how it's been for ages. It's really not a new GPU but rather a rebadge, at least to people who know tech.


----------



## maarten12100

Quote:


> Originally Posted by *Cloudfire777*
> 
> Try to use your brain for once ok?
> Its the same GPU but its a new GPU because its clocked differently. Same as rebadge are "new GPUs", and same as higher clocked GPUs are faster than lower clocked GPUs. Nvidia could have done the exact same. Case closed.
> 
> Continue with the Titan discussion and leave AMD out of this.


I'm going to say something, and it's not intended as flaming; I just want to get this thread back on topic, because every time I check, it's fan garbage all over the place.
We have a place on the forum for bragging about either Nvidia or AMD: the designated "owners' topics."

Leave all the fanboy junk out of this thread.
Nobody cares!


----------



## Usario

Quote:


> Originally Posted by *Nocturin*
> 
> It's been awhile since you and I've gone tit-a-tat in thread. Forgot how fun it was!


Quote:


> Originally Posted by *Avonosac*
> 
> You're missing the point, I'm really not sure what you are trying to get at, but the point is my 670s in SLI will net me 3-4 less FPS when clocked the same as a 680s playing FC3 or other card burner games, but those 3-4 FPS will cost $200 If I need to spend 800 to get the game playable, the question becomes is the extra 3-4 frames worth 200$? My answer was no, so I got the 670 which fits my needs for AA, high textures and 1440p in Skyrim.


I don't think anyone's arguing that the 670 isn't definitely a far superior value... I don't think anyone mentioned value though.
Quote:


> Originally Posted by *Cloudfire777*
> 
> My point here was that a vbios with higher clocks from AMD/Nvidia along with a new hardware ID name = New GPU.
> 
> Thats how its been since ages ago. Its really not a new GPU but rather a rebadge to people who know tech.


Usually it IS a new GPU (see: 8800 Ultra, GTX 285, HD 4890). However, the 7970 GHz Edition clocks no better and isn't binned at all. It's not really a new GPU; it's a BIOS.


----------



## Avonosac

Quote:


> Originally Posted by *Usario*
> 
> I don't think anyone's arguing that the 670 isn't definitely a far superior value... I don't think anyone mentioned value though.
> Usually it IS a new GPU (see: 8800 Ultra, GTX 285, HD 4890). However, the 7970 GHz Edition clocks no better and isn't binned at all. It's not really a GPU, it's a BIOS.


That guy seemed like he was; that's usually where people go when they bring up 7950s in a 7970 (or related) thread. I guess it was a knee-jerk reaction.


----------



## Master Freez

*WHAT TO BUY???*

Just think about it: Titan almost now and the 780 later... it looks like the 780 should be at least similar, with a lower TDP or a smaller die.

So... is there any reason to upgrade from 680s now? I don't know... probably no reason.


----------



## maarten12100

Quote:


> Originally Posted by *Avonosac*
> 
> I have a 670, I came from 4890s. I buy the right price / performance cards which fit my needs. The significantly lower power consumption of 600 series swayed me, when performance in the games I play most was the same. CS:GO, LoL, AC3 and Skyrim.


Seems fair. I just thought so because your previous arguments were all like "yeah, Nvidia is so much better than AMD cards, it can max everything," blah blah.
But if that was the case, why didn't you pick dual HD 7850s, as they have better performance per watt?


----------



## Cloudfire777

Quote:


> Originally Posted by *Usario*
> 
> Usually it IS a new GPU (see: 8800 Ultra, GTX 285, HD 4890). However, the 7970 GHz Edition clocks no better and isn't binned at all. It's not really a GPU, it's a BIOS.


Alright, you are partly right.
Does it really matter? A higher-clocked GPU will be faster. Nvidia could have done the same. That was my point.


----------



## Nocturin

Quote:


> Originally Posted by *Avonosac*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That guy seemed like he was, thats usually where people go when they bring up 7950's in a 7970 or anything related thread. I guess it was a knee-jerk reaction.



This guy was not.


----------



## Usario

Quote:


> Originally Posted by *Master Freez*
> 
> *WHAT TO BUY???*
> 
> Just think about it: Titan almost now and 780 later....looks like that 780 should be at leat similar with lower TDP or smaller.
> 
> So...is there any reason to change 680's now? I don't know.... Probably no reason.


Titan will remain much more powerful.
Quote:


> Originally Posted by *Cloudfire777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Usario*
> 
> Usually it IS a new GPU (see: 8800 Ultra, GTX 285, HD 4890). However, the 7970 GHz Edition clocks no better and isn't binned at all. It's not really a GPU, it's a BIOS.
> 
> 
> 
> Allright, you are partly right.
> Does it really matter? Higher clocked GPU will be faster. Nvidia could have done the same. That was my point.
Click to expand...

But Nvidia already had the boost feature implemented. AMD, on the other hand, did not: their cards ran at a constant 925MHz and often overclocked above 1200MHz. Nvidia's cards were marketed as running at 1006MHz yet almost always boosted well above 1100MHz, infrequently even over 1200MHz out of the box, and usually overclocked to somewhere between 1200 and 1300MHz after boost (though I must admit the voltage-unlocked 600-series cards are some very, very impressive overclockers, often breaking 1400MHz). Nvidia would have had to go over TDP to do the same, and then their fanboys' power-consumption argument would crumble.


----------



## Avonosac

Quote:


> Originally Posted by *maarten12100*
> 
> Seems fair just thought so since your previous arguments were all like yeah Nvidia is so much better than AMD cards it can max everything blaba.
> But if this was the case why didn't you pick dual hd7850 as they have lower power consumption/performance.


Micro-stutter and future-proofing.

When you start out with multi-GPU, you can't extend the life of your current generation by adding another card. It's generally better value to buy at or near the top, wait for later tech, use multi-GPU to catch up a generation, and then upgrade once a single near-top card surpasses your multi-GPU setup by a margin. That way you make it through 2-3 generations before you truly feel a graphics hit in your gaming. It's the sweet spot I've found, and I've spent far too much money on gaming machines.


----------



## Cloudfire777

Quote:


> Originally Posted by *Usario*
> 
> Titan will remain much more powerful.
> But Nvidia already had the boost feature implemented. They'd have to go over TDP to do the same.


Sure, but compared to the TDP of the 7970, Nvidia has a LOT of headroom to catch up to and surpass AMD.

250W > 195W
7970/GHz > GTX 680

But enough of this. I'm not going to keep participating in these discussions. Buy whatever you like. Both will do just fine... well, until Titan is here.
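Cloudfire777's headroom point is simple arithmetic; here's a rough perf-per-watt sketch in Python (the TDP figures are the cards' rated ones, but the performance numbers are illustrative placeholders, not benchmark results):

```python
# Rough perf-per-watt comparison between two cards.
# TDPs are the official board ratings; the "perf" values are
# made-up placeholders, not measured benchmark results.
def perf_per_watt(perf, tdp_watts):
    """Relative performance delivered per watt of rated TDP."""
    return perf / tdp_watts

hd7970 = perf_per_watt(perf=100, tdp_watts=250)  # 250W board
gtx680 = perf_per_watt(perf=100, tdp_watts=195)  # 195W board

# At equal performance, the lower-TDP card wins on efficiency,
# while the higher-TDP card has more thermal headroom to clock up.
print(f"HD 7970: {hd7970:.2f} perf/W, GTX 680: {gtx680:.2f} perf/W")
```

The point being made in the thread is the second comment: a bigger power budget is room to raise clocks, not an efficiency win.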


----------



## Usario

Quote:


> Originally Posted by *Cloudfire777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Usario*
> 
> Titan will remain much more powerful.
> But Nvidia already had the boost feature implemented. They'd have to go over TDP to do the same.
> 
> 
> 
> Sure, but compared to the TDP of 7970, Nvidia have a LOT of headroom to catch up and surpass AMD.
> 
> 250W > 195W
> 7970/GHz > GTX 680
> 
> But enough of this. I`m not going to participate in these discussions. Buy whatever you like. Both will do just fine, well until Titan is here
Click to expand...

The GTX 580 has a 245W TDP; I'm sure we all know that the 7970 is more efficient however. The TDP calculations are different.


----------



## thestache

Quote:


> Originally Posted by *tsm106*
> 
> You know... the last time a company was that sure of something it was a bonafide flop. Just saying. It's a bit early for the sure thing. That said if true, it would be cool not to have to sport quad cards anymore.


That's it.

The most exciting thing is potentially being able to run 2-way SLI that far exceeds GTX 680/690 4-way SLI, which is huge for surround users but incredibly limited by Nvidia's current drivers. A 3-screen max surround limit (plus the accessory display) would be a horrible waste with these cards, and I hope Nvidia makes the jump to supporting 5 with this release, even if it works horribly at first and is heavily patched in the future.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Avonosac*
> 
> Micro-stutter and future proofability.
> 
> When you start out with multiGPU, you can't extend the life of your current gen by adding another. Its generally more cost / performance viable to buy at or near top, and then wait for later tech, use multi-GPU to catch up to a generation, and then when a single near top card surpasses your multi- setup by a margin, you then upgrade. This way you make it through 2-3 generations before you truly feel a graphics hit on your gaming. Its the "sweet spot" I've found, and I've spent far too much money on gaming machines.


Hardly anyone does that, though. You get HD 7970s and GTX 680s because you want the best. The people who will get the Titan are those who hate SLI/CF or are already running 3+ cards. I got my second HD 7970, so no complaints here, but considering the best PC games don't need that much GPU power, spending $2K on GPUs is kind of stupid with current games. The sweet spot is $300-350. I'd much rather wait for something with the power of Titan at a lower price. And considering this is most likely a paper launch, it will be hard to get anyway.


----------



## Cloudfire777

You lose performance going SLI or CF; they don't scale 100% perfectly...


----------



## rcfc89

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Hardly anyone does that though. You get HD 7970s and GTX680s because you want the best. People that will get the Titan are those that hate SLI/CF or those that are now running 3x + cards. I got my second HD 7970 so no complaining here but considering the best PC games *dont need that much GPU power* spending 2K in GPUs is kind of stupid with current games. The sweet spot is $300-350. Much rather wait for something with the *power of Titan and cheaper*. Considering this is most likely a paper launch it will be hard to get.


Good luck with your 2015 goals. I can name at least four brand-new, current games that will stomp 7970s in CrossFire on a single 2560x1440 monitor at max settings: Far Cry 3, Crysis 3, Sleeping Dogs, Hitman A. My 690 is brought to its knees at max settings in these games. With anything past 2x MSAA in FC3 I drop below 30fps on a single 2560x1440 monitor. For those who want everything completely maxed out in Eyefinity or at 2560 resolutions, the GPU power that Titan SLI will bring is very much needed. Of course you can turn things down a bit (AA off, etc.) and keep things smooth at 60fps, but honestly, who builds a top-of-the-line gaming rig and drops thousands of dollars doing so in order to "turn things down"? Not me, and there are plenty of others like me. I'll drop $1800 easily to achieve this.


----------



## Dimaggio1103

Quote:


> Originally Posted by *Cloudfire777*
> 
> You lose performance with going SLI or CF. They don`t scale 100% perfect...


How do you lose performance? Even if it only scales 50%, you're still not losing performance. If a current-gen game doesn't utilize both GPUs, odds are it isn't GPU-dependent anyway.

Almost all the games I play scale 80-99%: BF3, Crysis 3, BFBC2, Black Ops 2, Metro 2033, etc. People don't go SLI/CrossFire to play Starcraft-type games. lol
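The scaling math in that post is easy to sanity-check; a minimal sketch (the FPS and scaling figures are made-up examples, not measurements):

```python
def sli_fps(single_fps, scaling):
    """FPS of a two-card setup where the second card contributes
    `scaling` (0.0-1.0) of a single card's throughput."""
    return single_fps * (1 + scaling)

base = 40.0                 # hypothetical single-card FPS
print(sli_fps(base, 0.50))  # 50% scaling: 60.0 FPS
print(sli_fps(base, 0.90))  # 90% scaling: 76.0 FPS
# Even poor scaling is still a net gain over one card; the real
# cost of multi-GPU is frame pacing (microstutter), not raw FPS.
```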


----------



## rcfc89

Quote:


> Originally Posted by *Dimaggio1103*
> 
> How do you lose performance? even if it only scales 50% your still not losing performance. If a current gen game does not utilize both GPU's odds are its not GPU dependent anyways.
> 
> Almost all the games I play scale 80-99% BF3, Crysis 3, BFBC2, BLack ops 2, Metro 2033, ect. People don't go sli/crossfire to play Starcraft type games. lol


Or BF3/TF2. I easily average over 100fps with BF3 maxed at 2560x1440; it's far from demanding. FC3 was the first game that made me realize I need more GPU power.


----------



## Rayleyne

Quote:


> Originally Posted by *Dimaggio1103*
> 
> How do you lose performance? even if it only scales 50% your still not losing performance. If a current gen game does not utilize both GPU's odds are its not GPU dependent anyways.
> 
> Almost all the games I play scale 80-99% BF3, Crysis 3, BFBC2, BLack ops 2, Metro 2033, ect. People don't go sli/crossfire to play Starcraft type games. lol


Actually, for a very long time I had to run GTX 460 SLI to get 60 fps in SC2 once the map was loaded, and that was with a 2600K @ 5GHz.


----------



## SavantStrike

I'll be impressed when they have a non thousand dollar variant of this card.

Either that, or perhaps it may drive 680 prices down a bit, but it's Nvidia, so that's most likely not going to happen. The GTX 560 Ti I bought two years ago is only a little cheaper than it was when I bought it.


----------



## Rayleyne

Quote:


> Originally Posted by *SavantStrike*
> 
> I'll be impressed when they have a non thousand dollar variant of this card.
> 
> Either that or perhaps it may drive 680 prices down a bit, but it's Nvidia, so most likely not going to happen. The GTX 560 TI I bought two years ago is only a little cheaper than it was when I bought it.


As fanboyish towards AMD as it sounds, Nvidia isn't in the habit of lowering prices.


----------



## Dimaggio1103

Quote:


> Originally Posted by *Rayleyne*
> 
> Actually For a very long time i had to run GTX 460 sli to get 60 fps in sc2 when the map was loaded and that was with a 2600k @ 5g's


Then something was wrong with your system, champ, 'cause I maxed the game on a GTS 450 and a 3570K. StarCraft 2 is not graphically demanding in the least.


----------



## Rayleyne

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Then something was wrong with your system champ, cause I maxed the game on a GTS 450, and 3570K. Starcraft 2 is not graphically demanding in the least.


When there are 200-odd units on screen, SC2 is exceptionally demanding, especially at 1440. Don't underestimate Blizzard games when it comes to being demanding: 10-man raids could crush SLI setups on the Maloriak fight, the Rag fight, and during Zonozz's black phases, and SC2 can crush GPUs when you've got a ridiculous number of units on screen. Plain and simple.

The difference between you and me is that dropping below 60 fps at max details isn't acceptable for me. Hence why I have two 7970s and you don't.


----------



## Nocturin

Quote:


> Originally Posted by *rcfc89*
> 
> Good luck with your 2015 goals. I can name at least 4 games that are brand new and current that will stomp 7970's in crossfire on a single 2560x1440 monitor at max settings. FarCry 3, Crysis 3, Sleeping Dogs, Hitman A. My 690 is brought to its knee's in max settings in these games. Anything past 2x msaa in FC3 I drop below 30fps on a 2560x1440 single monitor. For those that want to have everything completely maxed out in eyefinity or 2560 resolutions the Gpu power that Titan sli will bring is very much needed. Of course you can turn things down a bit (AA off etc.) and keep things smooth at 60fps. But honestly who builds a top of the line gaming rig and drops thousands of dollars doing so to "turn things down." Not me and there are plenty others like me. I'll drop 1800 dollars easily to achieve this.


There are more like that wont spend more than $300, and even more that won't spend more than 150$ (OEMS).

I wish I had your job and/or credit.


----------



## xFALL3Nx

OH COME ON, I just got my 480!

Runs wicked hot, but still a wicked beast!


----------



## yoi

If this "Titan" can help me render faster (like a Quadro) AND is as good as the top-of-the-line GTX for gaming... I'm sold, and I want two of these.


----------



## EpicAMDGamer

I can has a review sample? lol
Quote:


> Originally Posted by *xFALL3Nx*
> 
> OH COMMON I just got my 480!
> 
> Runs wicked hot, but still a wicked beast!


What are you doing with your GTX 2 series card? (Gonna Sell?)


----------



## Kaldari

Quote:


> Originally Posted by *Dimaggio1103*
> 
> No, SLI has not been that bad IMO since before the 4xx series. People keep repeating information they heard through the grapevine, even though the information is incorrect. Even CrossFire has improved significantly over the past two generations.
> 
> I have run SLI 460, 560, 560 Ti, and now 660. Not a single issue at all. Even CrossFired without cause for complaint.
> 
> People (noobs) have a way of hearing something and then repeating it like it's fact even though they have no idea what they are talking about.
> 
> Here are my thoughts on my recent purchase: I could have afforded a 7970 GHz Edition, but it was cheaper and more powerful to grab two GTX 660s in SLI, which beat a 7970 GHz hands down for less. I knew I would never CrossFire the 7970, so my choice was obvious. I never once worried about SLI problems.


I ran SLI with both the 2xx series and the 4xx series, so I've dealt with SLI evolving over years now. I just recently sold off my 480s, so this experience runs up until very recently. Games supporting SLI is very hit and miss, and you can bet that at least half of the new games that hit the market will have problems out of the gate. Some don't get profiles for months, and making your own profile doesn't fix anything in many cases. It isn't only a scaling issue either. Some games just hate SLI. They'll run better with a single card than with two of the same card with SLI on. Having to constantly toggle SLI off and on, off and on, off and on depending on the game you're playing or what you're doing is a headache and just not worth it after a while.

I guess I'm a "noob" who "keeps repeating information they heard through the grapevine," though.


----------



## Rayleyne

Quote:


> Originally Posted by *Kaldari*
> 
> I ran SLI with both the 2xx series and the 4xx series, so I've dealt with SLI evolving over years now. I just recently sold off my 480s, so this experience runs up until very recently. Games supporting SLI is very hit and miss, and you can bet that at least half of the new games that hit the market will have problems out of the gate. Some don't get profiles for months, and making your own profile doesn't fix anything in many cases. It isn't only a scaling issue either. Some games just hate SLI. They'll run better with a single card than with two of the same card with SLI on. Having to constantly toggle SLI off and on, off and on, off and on depending on the game you're playing or what you're doing is a headache and just not worth it after a while.
> 
> I guess I'm a "noob" that "keeps repeating information they heard through the grape vine" though.


I've never had any "problems" getting games to scale in multi-GPU configs myself, but I always notice microstutter.


----------



## Kaldari

Quote:


> Originally Posted by *Rayleyne*
> 
> I've never had any "problems" getting games to scale in multi gpu configs myself, But i always notice microstutter


Well, that's another issue that was really bad early on, but it has mostly been dealt with in recent times.


----------



## ZealotKi11er

I bet AMD will come out with the codename Zeta or Zeus, if you know the joke.


----------



## Rayleyne

Quote:


> Originally Posted by *Kaldari*
> 
> Well that's another issue that was real bad early on, but it has been mostly dealt with in recent times.


Gotta be honest, microstutter only goes away past the 100+ fps mark; the higher the framerate, the less noticeable it is.


----------



## F1ynn

Bleh, green color.


----------



## GoldenTiger

Quote:


> Originally Posted by *F1ynn*
> 
> bleh,. Green color


It pictures a Tesla card, not a GeForce one.


----------



## GoldenTiger

Quote:


> Originally Posted by *Rayleyne*
> 
> Gotta be honest, Microstutter only goes away at the 100+ fps mark, the higher it is the less noticeable.


Nvidia cards have frame metering tech that helps reduce the effect significantly compared to AMD's.


----------



## SavantStrike

Quote:


> Originally Posted by *F1ynn*
> 
> bleh,. Green color


You know what needs to make a comeback? That golden brown of old-school mobos.

I remember red PCBs and loved them. Now everything is black. I'd almost take a green PCB just for something different.


----------



## GoldenTiger

Quote:


> Originally Posted by *SavantStrike*
> 
> You know what needs to make a come back? That golden brown of old school mobos.
> 
> I remember Red PCBs and loved them. Now everything is black. I'd almost take a green PCB just for something different.


I liked green PCBs; red was cool too.


----------



## Rayleyne

Quote:


> Originally Posted by *GoldenTiger*
> 
> Nvidia cards have frame metering tech that helps reduce the effect significantly compared to AMD's.


Dynamic vsync does not fix microstutter. Microstutter is caused by using two or more GPUs in tandem, because of the delay between the two parts.


----------



## GoldenTiger

Quote:


> Originally Posted by *Rayleyne*
> 
> Dynamic vsync microsttuter does not fix, Microstutter is caused by using 2 or more gpu's in tandem because of the delay between the two parts.


Wrong: microstutter comes from poor frame-time distribution within each second, and it affects both single- and dual-GPU solutions. Did you somehow miss TechReport, AlienBabelTech, and the many other recent articles about it? AMD's GPUs have very poor frame times, resulting in juddery motion. This doesn't have anything to do with vsync.


----------



## Dimaggio1103

Quote:


> Originally Posted by *Rayleyne*
> 
> When there is 200 odd units on the screen SC2 is exceptionaly demanding especialy at 1440, Don't underestimate blizzard games when it comes to demanding. 10 man raids could rape sli setups on the maloriak fight, The rag fight, During Zonozz in black phases, Sc2 can rape gpu's when you've got a rediculous amount of units on screen, Plain and simple.
> 
> The differance between you and me is below 60 fps at max details isn't acceptable for me, Hence why i have two 7970s and you don't.


Lol, I never drop below 60 FPS in any game on this new setup. For the cost of one of your cards I got two 660s that handle anything at max. You mad?

As I said, SLI/CrossFire is not needed for SC2, and I'm talking 4v4 with a max late-game battle. You show your ignorance, my friend. When you have tons of units on screen, it's CPU-dependent and has almost nothing to do with GPU strength.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> I bet AMD will come with the codename Zeta or Zeus if you know the joke
> 


Lol, that would be awesome.


----------



## Rayleyne

Quote:


> Originally Posted by *GoldenTiger*
> 
> Wrong, microstutter happens from poor frame distribution per second, and affects both single and dual gpu solutions. Did you somehow miss TechReport, AlienBabelTech, and many other articles over the recent past about that? AMD's gpu's have very poor frametimes and result in a juddery motion. This doesn't have anything to do with vsync.


Actually, I am right. On both sides, given that I've built many SLI and CrossFire setups, adding a second GPU introduces noticeable microstutter. Sorry bro, but it's true: dynamic vsync does not fix microstutter, and both the 600 and 7000 series, Nvidia and AMD, get it. Deal with it.


----------



## Rayleyne

Quote:


> Originally Posted by *Dimaggio1103*
> Lol, I never drop below 60 FPS in any games on this new setup. For the cost of one of your cards I got two 660's that handle anything at max. you mad?
> 
> 
> As I said sli/crossfire is not needed for SC2 im talking 4v4 with max late game battle. You show your ignorance my friend. When you have tons of units on screen, it be CPU dependent and almost nothing to do with GPU strength.
> Lol, that would be awesome.


Gotta ask bro, do you play at 7680x1440?


----------



## ZealotKi11er

The microstutter is apparent in both cases. If you game at 60Hz, having 120 fps helps because the frame time is only 8ms, so a jump up to 20ms is not that bad. Most of the time it's a game thing, but Nvidia has done work in the past to fix this, mostly on flagship cards. For example, the GTX 580 had more consistent frame times than the GTX 570. AMD is a bit late, but they are working on it too.
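The frame-time figures above follow directly from the FPS numbers; a minimal sketch of that arithmetic:

```python
def frame_time_ms(fps):
    """Average time per frame, in milliseconds, at a given FPS."""
    return 1000.0 / fps

print(frame_time_ms(120))  # ~8.3 ms per frame
print(frame_time_ms(60))   # ~16.7 ms per frame
# A single frame that takes 20 ms is a small hitch when the norm
# is 8 ms, but a much more visible stutter when the norm is 16.7 ms.
```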


----------



## Rayleyne

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The micro stutter is apparent in both cases. If you game at 60Hz having 120 fps helps because the frame time is only 8ms and the jump is not that bad up to 20ms. Most of the time is a game thing but Nvdia has done work in the past to fix this. This was mostly done on flagship cards most of the time. For example GTX580 had more consistent frames then GTX570. AMD is a bit late but they are working on it too.


Frame times on a single card aren't really an issue unless something is REALLY wrong, but when you introduce a pair of cards, the time it takes for them to communicate with each other and with the CPU does introduce delays, and thus microstutter is born.


----------



## GoldenTiger

Quote:


> Originally Posted by *Rayleyne*
> 
> Actually i am right, Both sides given i've made many sli and crossfire setups introduce noticeable microstutter when a second gpu is added sorry bro but it's true, dynamic vsync does not fix microstutter, both the 600 and 7000 series from nvidia and amd both get it, Deal with it.


Wrong, buddy, as I said...
Since you seem not to want to Google it yourself, here:

http://techreport.com/review/23981/radeon-hd-7950-vs-geforce-gtx-660-ti-revisited/9
http://techreport.com/r.x/7950-vs-660ti/skyrim.gif

Other sites have confirmed the results... AMD has horrendously bad frame-time distribution, which introduces NOTABLE microstutter. Again, vsync has nothing to do with this; please have some grasp of the terminology and how things work before flaming again.

http://www.pcper.com/news/General-Tech/Frame-latency-youve-always-cared-about-it-you-just-didnt-know-it
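For reference, the metric those frame-latency articles report is a distribution of per-frame times rather than an FPS average; a hypothetical sketch of two such metrics (the captured frame times below are invented for illustration):

```python
def percentile_ms(frame_times, pct):
    """Frame time (ms) at the given percentile, nearest-rank method."""
    ordered = sorted(frame_times)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def time_beyond_ms(frame_times, threshold):
    """Total milliseconds accumulated above `threshold` per frame."""
    return sum(t - threshold for t in frame_times if t > threshold)

# Made-up capture: mostly smooth ~16 ms frames with two big spikes.
capture = [16, 16, 17, 16, 45, 16, 17, 16, 16, 50]
print(percentile_ms(capture, 99))   # dominated by the worst frames
print(time_beyond_ms(capture, 20))  # 25 + 30 = 55 ms of "stutter time"
```

The idea is that an FPS average hides the spikes, while percentile and time-beyond-threshold metrics expose exactly the judder being argued about here.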


----------



## HellAce

*MOAR POWER! I WANT GEFORCE TITAN TI SUPERCLOCKED SIGNATURE EDITION!!!!*

Lol, on a more serious note... this seems like great news, and reviews will be up soon. Also, I've noticed Alatar and some others making foolish predictions that no single card will be able to match the Titan this year. I'd hold back on that thought if I were you; it's way too soon to make that kind of prediction.


----------



## Rayleyne

Quote:


> Originally Posted by *GoldenTiger*
> 
> Wrong, buddy, as I said...
> Since you seem to not want to use google for yourself, here:
> 
> http://techreport.com/review/23981/radeon-hd-7950-vs-geforce-gtx-660-ti-revisited/9
> http://techreport.com/r.x/7950-vs-660ti/skyrim.gif
> 
> Other sites have confirmed the results.... AMD has horrendously bad frame time distribution which introduces NOTABLE microstutter. Vsync again has nothing to do with this, please have some knowledge of terminology and how things work before flaming again.


Actually, "buddy," I am right. They tested Skyrim (back when the whole frame-time fad went around a few weeks ago), a game notorious for having problems. Microstutter is introduced when you use two or more GPUs because it takes additional time to sync the second, third, and fourth cards (if you run 2, 3, or 4) with the first and alternate frames between them. YES, you can and often do get better performance from 2, 3, or 4 cards over one, but until they get scaling to 100% (pro tip: you can get 99.999999999%, but never 100%), you will always suffer from microstutter in one form or another. Sorry buddy, but this is how it works: you sacrifice a very slight amount of smoothness that most won't notice in exchange for the additional performance of the extra GPU. That is why some people prefer a single high-end GPU, i.e. a Titan, instead of two lesser cards.

And FYI, to the other person:

Two 660s are far more powerful than the two 460s I had back then, and again, I'm still playing at a higher resolution.

It doesn't matter which camp you're in; I've done it with cards from nearly every generation:

Nvidia 9000 series, Nvidia 8000 series, Nvidia 200 series, Nvidia 400 series, Nvidia 600 series, AMD 4000 series, AMD 5000 series, AMD 7000 series. Guess what: all smooth as butter with a single card, bar a hitch here and there. The second you add another card? BAM, microstutter.


----------



## onthemour

Quote:


> Originally Posted by *Rayleyne*
> 
> Gotta ask bro, do you play at 7680x1440? But since you want to start acting like a child with the "You mad" crap i've reported your post to a moderator


lmao, reminds me of my sister telling on me to my mom when I was 7 years old


----------



## GoldenTiger

Quote:


> Originally Posted by *Rayleyne*
> 
> Actually "Buddy" I am right, See they tested the game (when the whole frametime fad went through a few weeks ago) the game "skyrim" A game notorious for having problems, Microstutter is introduced when you are using 2 or more gpu's because it takes additional time to sync second third and fourth (If you choose to have 2 3 or 4) With the first card and create alternating frames, YES you can get better performance and often do get better performance using 2 3 or 4 cards over one, But until they get it to 100% scaling (pro hint you can get 99.999999999% but never 100%) scaling you will always suffer from microstutter in one form or another, Sorry buddy but this is how it works, you sacrifice very very minor slight amount of smoothness that most won't notice for additional performance by adding an additional gpu, That is why some people prefer a single high end gpu IE a titan instead of 2 lesser,
> 
> AND fyi to the other person
> 
> 2 660s are far more powerful then 2 460s i had back then, And again i'm still playing at a higher resolution.
> 
> It doesn't matter what camp you are with, I've done it with cards from nearly every generation
> 
> 9000 NVIDIA series, 8000 Nvidiia series, 200 Nvidia series, 400 Nvidia series 600 Nvidia series, 4000 AMD series 5000 AMD series 7000 AMD series, Guess what all smooth as butter with a single card bar a hitch here and there, Second you add an additional card? BAM microstutter.


You're claiming it can only occur on a multi-GPU setup, but you're 100% wrong, as I linked and as is well known. Quit while you're behind... AMD's cards have worse frame-time distribution, as I said.


----------



## GoldenTiger

Quote:


> Originally Posted by *onthemour*
> 
> lmao reminds me of my sister telling my mom when I was 7 years old


Considering he/she can't comprehend that frame-time distribution causes the same visual effect as microstutter... yeah, I kinda wonder too.


----------



## Rayleyne

Quote:


> Originally Posted by *GoldenTiger*
> 
> Considering he/she can't comprehend frametime distribution causing the same visual effect as microstutter.... yeah, I kinda wonder too.


Considering this argument started with multi-GPU setups, yes, that is actually what I am talking about. I never claimed it can't happen on a single card; it's possible if something's wrong: a poorly coded game, a bad driver, etc. But since you want to act like a child like everyone else, I am done with you.


----------



## GoldenTiger

Quote:


> Originally Posted by *Rayleyne*
> 
> Considering this arguement started with multi gpu setups, Yes that is actually what i am talking about, I never claimed that it can't happen on a single card, It's possible if somethings wrong, Poorly coded game, Bad driver etc, But since you want to act like a child like everyone else, I wonder why i come to OCN thesedays, I heard this was a professional forum where you clearly don't know ****, I'm done here have a nice day.


Yet it happens severely with AMD GPUs despite there being no "poorly coded game" or "bad driver" or "something wrong"; it is a fairly widespread thing. Thanks for your input.


----------



## sugarhell

Quote:


> Originally Posted by *GoldenTiger*
> 
> Yet, it happens severely with AMD GPU's despite not having a "poorly coded game" or "bad driver" with "somethings wrong", it is a fairly widespread thing. Thanks for your input.


One BETA driver had worse frame latency.


----------



## GoldenTiger

Quote:


> Originally Posted by *sugarhell*
> 
> one BETA driver had worse frame latency


Wrong. TechReport has been reporting this for years, as has HardOCP. It's not a single driver.

http://techreport.com/review/22192/amd-radeon-hd-7970-graphics-processor/12

Here's one from February 2012, for example, where it showed huge spikes.


----------



## twitchyzero

Quote:


> Originally Posted by *ABeta*
> 
> A sidegrade to any multi card setup that is a single card is always an upgrade. If rumors hold true, and if this thing can surpass 690 performance and even gets close to dual 680 performance, I will surely sell mine to end up with a single card beast.


Trading 670/680 SLI or 7950/7970 CFX for it would be silly.

Let's say it has comparable performance:

you're going to take, let's say, a $400-500 hit just so it runs cooler/quieter/more space-efficiently than a 2-way SLI? The only real problem I've had with SLI is microstuttering, which is only apparent when it dips below 30fps.

The only real benefit of this is the generous amount of frame buffer, but even 3-4GB is enough to last us a few more years (unless you're the type to run 3x1600p at 8xMSAA with mods or some other ridiculous setting). And if it really delivers comparable performance, that probably means limited production, which means subpar drivers a year after release.


----------



## Roadkill95

Quote:


> Originally Posted by *Roadkill95*
> 
> I call Nvidia vs AMD/ATI war.
> 
> Subbed.


..... Called it.


----------



## sugarhell

Quote:


> Originally Posted by *GoldenTiger*
> 
> Wrong, techreport has been reporting this for years, as has hardocp. It's not a single driver.
> 
> http://techreport.com/review/22192/amd-radeon-hd-7970-graphics-processor/12
> 
> Here's from February 2012 for example, where it showed huge spikes.


Let's play the same game with you:

http://techreport.com/review/22653/nvidia-geforce-gtx-680-graphics-processor-reviewed/9


----------



## GoldenTiger

Quote:


> Originally Posted by *sugarhell*
> 
> Lets play the same game with you
> 
> http://techreport.com/review/22653/nvidia-geforce-gtx-680-graphics-processor-reviewed/9


Did you read the page? It explains that it was due to a driver hitch at the beginning of each test run, not normal frame-time latency throughout the run. It says, "A broader look at the latency picture shows that the GTX 680 generally produces lower-latency frames than the 7970, which is why its FPS average is so high." Amazing.


----------



## GoldenTiger

Quote:


> Originally Posted by *Roadkill95*
> 
> ..... Called it.


What did you expect? The AMD lovers feel a perceived threat to their beloved "kingdom" and the drones swarm like fire ants to "PROTECT THE QUEEN". Why? No one knows; the fanboy is a mysterious insect.


----------



## Master__Shake

Quote:


> Originally Posted by *GoldenTiger*
> 
> What did you expect? The AMD lovers feel a perceived threat to their beloved "kingdom" and the drones swarm like fireants to "PROTECT THE QUEEN". Why? No one knows, the fanboy is a mysterious insect.


you won the internet!!!! now shut off your computer and go to bed.


----------



## HellAce

Quote:


> Originally Posted by *GoldenTiger*
> 
> What did you expect? The AMD lovers feel a perceived threat to their beloved "kingdom" and the drones swarm like fireants to "PROTECT THE QUEEN". Why? No one knows, the fanboy is a mysterious insect.


Yes... and your post wasn't biased at all. How about all the Nvidia fanatics in this thread who say nothing will top it this year, especially "AMD"? -___-


----------



## sherlock

Quote:


> Originally Posted by *sugarhell*
> 
> Lets play the same game with you
> 
> http://techreport.com/review/22653/nvidia-geforce-gtx-680-graphics-processor-reviewed/9


Nvidia had an immediate response, in the same review you picked
Quote:


> Nvidia tells us the slowdown is the result of a problem with its GPU Boost mechanism that will be fixed in an upcoming driver update


In the 670 review:


----------



## GoldenTiger

Quote:


> Originally Posted by *sherlock*
> 
> Nvidia had an immediate response, in the same review you picked
> In the 670 review:


Uh, actually, that's the review you linked, not me. And that chart you're posting? It's a summary, not a time graph, but even there it shows Nvidia having an advantage. It's the time graphs that matter, however.


----------



## sugarhell

Quote:


> Originally Posted by *GoldenTiger*
> 
> Did you read the page? It explains that it was due to a driver hitch at the beginning of each test run, not normal frametime latency throughout the run. It says "A broader look at the latency picture shows that the GTX 680 generally produces lower-latency frames than the 7970, which is why its FPS average is so high. " Amazing.


So, child. You didn't like that one?

How about this?

http://techreport.com/review/22653/nvidia-geforce-gtx-680-graphics-processor-reviewed/11

Yeah, Nvidia is so much better with latency. I can play this game just as easily. But it's not right to cherry-pick games for an overall performance picture. Some games run better on Nvidia, some on AMD. You just favor Nvidia. That's not bad, but it's unhealthy.


----------



## Master__Shake

Quote:


> Originally Posted by *sherlock*
> 
> Nvidia had an immediate response, in the same review you picked
> In the 670 review:


you mean, an nvidia card plays better in an nvidia game!!!!!!!!!! MIND BLOWN!!


----------



## sherlock

Quote:


> Originally Posted by *GoldenTiger*
> 
> Uh, actually, that's the review you linked, not me. And that chart you're posting? It's a summary, not a timegraph, but even there shows nVidia having an advantage. It's time graphs that matter however.


I was not responding to you AFAIK.


----------



## sherlock

Quote:


> Originally Posted by *sugarhell*
> 
> So child.You didint like it
> 
> This?
> 
> http://techreport.com/review/22653/nvidia-geforce-gtx-680-graphics-processor-reviewed/11
> 
> Yeah nvidia is so much better with latency.I can play this game so easily. But its not right to choose random games for an overall performance. Some games play better with nvidia some others with amd.You just favor nvidia.Its not bad but its unhealthy.


This


----------



## ZealotKi11er

This is about Titan. Titan is so fast the time between frames is 0.00000001 ms. Titan thinks micro stutter is a myth.
Anyway, to get everyone clear on the micro stutter problem: Nvidia has made more effort to fix it. It's more apparent with dual+ GPU setups, low fps, and badly coded games.
Most people don't notice it, and for single-player it's not really a problem. The problem is mostly when you have CF or SLI and are getting 50-60 fps; a few ms of variance will cause havoc.
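A rough way to picture it (a made-up metric with made-up numbers, just to illustrate): AFR micro stutter shows up as alternating short/long frame times even when the average fps is identical to a smooth single-card run:

```python
# Crude micro-stutter detector (an assumed metric, not any official tool):
# compare consecutive frame times; even pacing gives a ratio near zero.

def microstutter_ratio(frame_times_ms):
    """Mean absolute difference between consecutive frame times,
    as a fraction of the mean frame time. Near 0 = evenly paced."""
    diffs = [abs(a - b) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_ft = sum(frame_times_ms) / len(frame_times_ms)
    return (sum(diffs) / len(diffs)) / mean_ft

# Single GPU at a steady 20 ms per frame (50 fps): evenly paced.
single = [20.0] * 60

# AFR pair averaging the same 50 fps, but alternating 5 ms / 35 ms frames.
afr = [5.0 if i % 2 == 0 else 35.0 for i in range(60)]

print(microstutter_ratio(single))  # 0.0
print(microstutter_ratio(afr))     # 1.5: heavy alternation despite identical average fps
```

Both runs report 50 fps on an FPS counter, which is why micro stutter is easy to miss in benchmarks and easy to feel in games.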


----------



## sugarhell

Quote:


> Originally Posted by *sherlock*
> 
> I was not responding to you AFAIK.


I know, but he's just bashing AMD. Latency can change from driver to driver; we all know that. Nvidia had a problem with its dynamic boost that produced stuttering. Both teams have flaws in their drivers.

GG, "AMD drivers sux" wins again.


----------



## sherlock

Quote:


> Originally Posted by *sugarhell*
> 
> I know but he just bash amd. Latency can change from driver to driver. We all know that.Nvidia had problem with the dynamic boost that produce stuttering. Both teams have flaws in their drivers
> 
> GG win;Amd drivers 'sux' again


I agree. AMD has made substantial improvements with the 13.2 beta (albeit only in 3 games, though according to TR more fixes are coming). It took NV about the same amount of time to fix their performance in TR's tests (680 to 670 was about 6 weeks) as it took AMD after the 12.11 article (early December to Jan 16th).

Back on topic, I look forward to seeing these reviews. It's a good thing that, according to rumors, TR will get a Titan to review (Scott Wasson also hinted at it in Monday's TR podcast).


----------



## sugarhell

Quote:


> Originally Posted by *sherlock*
> 
> I agree, AMD had made substential improvement with 13.2


As you can see, we both linked reviews from the same site. Drivers can introduce latency or bugs in some games. But picking one single game that makes your card look better is dumb. He's here just to bash and start a flame war.


----------



## supermi

So much talk about things that could be PM'd if they're that important to you.

Good topics for "OTHER" threads as well; reading through so much off-topic muck wastes time, but not in a fun way, for those of us wanting to talk about, ask about, and think about the Titan LOL.

My questions center around what kind of TDP this Titan would need to surpass a 690.

Might this be a good enough substitute, and if so, how many, 2 or 3, to replace 4 680 Classifieds, which run at about 1400 MHz core and 7200 MHz vram? I max all 4 out in C3 surround 3D and get 50-65 fps, but would only upgrade for more fps than my cards currently give me...

Each of my cards is currently over 30% faster than a stock 680... but might the bandwidth of the Titans let 2 of them keep up with these 4 cards?

High-powered users like me want to know LOL

Though if you have a single screen and 2 lightnings/classifieds, a single Titan might be enough; 3 screens in 3D, or 3 higher-res screens, that is the question!


----------



## GoldenTiger

Quote:


> Originally Posted by *sugarhell*
> 
> I know but he just bash amd. Latency can change from driver to driver. We all know that.Nvidia had problem with the dynamic boost that produce stuttering. Both teams have flaws in their drivers
> 
> GG win;Amd drivers 'sux' again


AMD left it "unfixed" for a year. Nvidia left it in better but not great shape, then fixed it, in 6 weeks. Yeah, GG AMD drivers do "sux again". And it still isn't fixed for most games.


----------



## thestache

Quote:


> Originally Posted by *rcfc89*
> 
> Good luck with your 2015 goals. I can name at least 4 games that are brand new and current that will stomp 7970's in crossfire on a single 2560x1440 monitor at max settings. FarCry 3, Crysis 3, Sleeping Dogs, Hitman A. My 690 is brought to its knee's in max settings in these games. Anything past 2x msaa in FC3 I drop below 30fps on a 2560x1440 single monitor. For those that want to have everything completely maxed out in eyefinity or 2560 resolutions the Gpu power that Titan sli will bring is very much needed. Of course you can turn things down a bit (AA off etc.) and keep things smooth at 60fps. But honestly who builds a top of the line gaming rig and drops thousands of dollars doing so to "turn things down." Not me and there are plenty others like me. I'll drop 1800 dollars easily to achieve this.


For once you said something that didn't come completely out of your ass! You're exactly right and I'm proud. All those games and more are ahead of current gen tech and we need a refresh or new GPUs like the Titan. Especially for surround users. New GPUs are very much needed.


----------



## GoldenTiger

I can name another.... The Secret World (TSW) with TXAA enabled.


----------



## Captain1337

I guess someone at Nvidia decided to make a 6 GB card because he or she is tired of all the forum posts asking why Nvidia only has 1.5 to 2 GB of vram compared to AMD's cards.


----------



## Dimaggio1103

Quote:


> Originally Posted by *Rayleyne*
> 
> AND fyi to the other person
> 2 660s are far more powerful then 2 460s i had back then, And again i'm still playing at a higher resolution.
> 
> It doesn't matter what camp you are with, I've done it with cards from nearly every generation
> 
> 9000 NVIDIA series, 8000 Nvidiia series, 200 Nvidia series, 400 Nvidia series 600 Nvidia series, 4000 AMD series 5000 AMD series 7000 AMD series, Guess what all smooth as butter with a single card bar a hitch here and there, Second you add an additional card? BAM microstutter.


Sorry brah, but you're still wrong. Adding a second card does not automatically induce micro stutter, at least not with SLI, in my recent experience. I just think you're still mad.........

Since you're all-knowing on the subject, let's see this proof of micro stutter when adding a second card, no matter the brand. Maybe on those junky AMD cards, but not so much on newer Nvidia cards.


----------



## th3illusiveman

Well this thread turned out badly....

Wonder how well titan will OC.


----------



## Kiracubed

ALL THREAD DERAILING AND TROLLING ASIDE, CHILDREN....

I hope Nvidia announces a date in the next few days!


----------



## TheReciever

Wow, nV has a card on the way and people just can't appreciate the added competition? This will only bring prices lower and advance the tech a little further than it was yesterday.

Simply out of curiosity, I look forward to the performance of the new card!


----------



## PimpSkyline

If this is not a GeForce card, then why do we care? Sure, it might CAD like a BOSS and maybe Fold, but unless this is gonna be a GTX 685 or something, let's all calm down.

Does this mean the GTX 780 will be like the 580 and have the Kepler GPU FULLY unlocked and able to OC?


----------



## Dimaggio1103

Quote:


> Originally Posted by *PimpSkyline*
> 
> If this is not a GeForce card then why do we care? Sure it might CAD like a BOSS and maybe Fold, but unless this is gonna be a GTX 685 or something then lets all calm down.
> 
> Does this mean the GTX 780 will be like the 580 and have the Keplar GPU FULLY Unlocked and OC?


Um, it is gonna be a gaming card, so not sure what you mean, bro. Not a GTX 780 but a GeForce Titan, made for gaming.


----------



## yoi

Quote:


> Originally Posted by *PimpSkyline*
> 
> If this is not a GeForce card then why do we care? Sure it might CAD like a BOSS and maybe Fold, but unless this is gonna be a GTX 685 or something then lets all calm down.
> 
> Does this mean the GTX 780 will be like the 580 and have the Keplar GPU FULLY Unlocked and OC?


This, exactly. If this card can "CAD like a BOSS" and game as the best GPU out there, it might be a good thing for someone with a home office who could do his work at home, like a 3ds Max user, or a SolidWorks user like myself...

...I hate it when I do my perforated panels... my computer freezes ;_; It can't even preview the item, and I have to "hide" the feature to keep working, save it to a pen drive, and load it up at work.


----------



## PostalTwinkie

Oh god...it is getting real, isn't it?

If this thing actually breaks X6000, I would have to buy one.....


----------



## Cloudfire777

Quote:


> Originally Posted by *GoldenTiger*
> 
> What did you expect? The AMD lovers feel a perceived threat to their beloved "kingdom" and the drones swarm like fireants to "PROTECT THE QUEEN". Why? No one knows, the fanboy is a mysterious insect.


Well said.

If there is one type of fanboy that is very active and does everything they can to protect their precious brand of choice, it's AMD and Apple fanboys.

Other fanboys don't ever come near how hostile and protective those two groups are. It's like totally different leagues.


----------



## Overkill

Quote:


> Originally Posted by *Cloudfire777*
> 
> Well said.
> 
> If there is one type of fanboys that is very active and does everything they can to protect their precious brand of choice, its AMD and Apple fanboys.
> 
> Other fanboys don`t ever come near to how hostile and protective those two groups mentioned are. Its like totally different leagues.


I would rather deal with an AMD fanboy than an Apple one.


----------



## Votkrath

I'm looking forward to this card very much. I just want a good single-GPU card that will keep me going for a while. I couldn't care less about the voltage lock and stuff like that, since I'm not really much of an overclocker. Will prolly run some benchies though, just for the kicks.


----------



## Roadkill95

Quote:


> Originally Posted by *Cloudfire777*
> 
> Well said.
> 
> If there is one type of fanboys that is very active and does everything they can to protect their precious brand of choice, its AMD and Apple fanboys.
> 
> Other fanboys don`t ever come near to how hostile and protective those two groups mentioned are. Its like totally different leagues.


A fanboy is a fanboy, regardless of the corporation they're a fan of. Nvidia fanboys are just as irritating as AMD or Apple fanboys.


----------



## Votkrath

Guys, please don't get this thread locked as well.

For some of us this thread contains interesting information and is not a playground.


----------



## Roadkill95

Quote:


> Originally Posted by *Votkrath*
> 
> I'm looking forward to this card very much. I just want a good single-gpu card that will keep me going for a while. I could care less about voltage lock and stuff like that since I'm not really much of an overclocker. Will prolly run some benchies though, just for the kicks.


Honestly, voltage locking isn't a problem if the card is already running at a relatively high voltage. Some of the Gigabyte 7950s running at 1.2 V can clock just as high as voltage-unlocked ones. Plus, because this is an underclocked K20 with a few fewer cores, it should OC admirably.


----------



## Votkrath

Quote:


> Originally Posted by *Roadkill95*
> 
> honestly voltage locking isn't a problem if it's already running at a relatively high voltage. Some of the gigabyte 7950s running at 1.2v can just as high voltage unlocked ones. Plus because this is an underclocked k20 with a few less cores it should OC admirably.


I'm not afraid of anything being too locked down for me anyway but you're probably right. It's just that I've heard a lot of whining about voltage lock lately.


----------



## maarten12100

Quote:


> Originally Posted by *Votkrath*
> 
> I'm not afraid of anything being too locked down for me anyway but you're probably right. It's just that I've heard a lot of whining about voltage lock lately.


We have stock over here.
I just OC the core clock and leave the voltage the same; not that I have much choice if it's locked.


----------



## d3v0

I have been googling this every 10-15 minutes looking for more news on it. Got the OK from the wife for purchase, haha.


----------



## Cloudfire777

F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5 F5

d3v0: Congrats getting approval from your boss


----------



## Cyclonic

Quote:


> Originally Posted by *d3v0*
> 
> I have been googling this every 10-15 minutes looking for more news on it. Got the OK from the wife for purchase, haha.


Have fun on Google then for the next 2 weeks


----------



## guinner16

Quote:


> Originally Posted by *d3v0*
> 
> I have been googling this every 10-15 minutes looking for more news on it. Got the OK from the wife for purchase, haha.


I am building my first rig and also just got the OK from my wife. I live in Pittsburgh too, so I may be messaging you for help on the build. lol

To everybody else: how does a so-called "limited release" typically go with a card like this? Does the average person have a chance at getting one? If you are ready to go at a midnight release, do you have a pretty good shot? Or is it something where they are normally sold out after a few days? I would just hate to have a whole build planned around something like this and then not be able to get one (or two!!!!!). I know it is hard without knowing how limited it is, but I was wondering what experiences have been like in the past.


----------



## Cyclonic

Quote:


> Originally Posted by *guinner16*
> 
> I am building my first rig and also just got the ok from my wife. I live in pittsubrgh too so I may be messaging you for help on the buil. lol
> 
> To everybody else. How does a so called "limited release" typically run with a card like this. Does the average person have the chance at getting one? If you are ready to go at midnight release do you have a pretty good shot? Or is it something where they are normally sold out after a few days. I would just hate to have a whole build planned around something like this and then not be able to get one (or two!!!!!) I know it is hard without knowing how limited it is, but I was wondering what experiences have been like in the past.


Spam Newegg like 100 times a minute, and hope you are one of the first to order them.

:thumb:


----------



## Apocalypse Maow

Quote:


> Originally Posted by *ghostrider85*
> 
> i hope it will fit in a SG08 case


You and I both!


----------



## EPiiKK

How are the Teslas related?
Good post though, really looking forward to actual info.


----------



## driftingforlife

I have learned that these will be available in very limited quantities.


----------



## Cloudfire777

That limited-quantity rumor might come from the fact that Titan will probably get the best silicon; the chips are binned, and the ones that don't make the cut in terms of quality will be used in the GTX 780 or some other GPU.

The last thing I heard from TSMC is that they can now deliver more than AMD and Nvidia want to buy, so I don't know if Nvidia will have a problem producing what the market wants...


----------



## freitz

I don't see why spamming Google and Newegg will help. They will have an official announcement like everything else, with a launch date. It's not just going to appear like magic. Poof... GeForce Titan.


----------



## Votkrath

Quote:


> Originally Posted by *freitz*
> 
> I don't see why spamming google and newegg will help. They will have a offical announcment like everything else with a launch date. Its not just going to appear like magic. Poof.... Geforce Titan


Actually, I don't think it's impossible that it will show up at the vendor's websites prematurely.


----------



## guinner16

Quote:


> Originally Posted by *Cloudfire777*
> 
> That limited quantity rumor might come from the fact that Titan will probably have the greatest silicon and are therefor binned and the chips that don`t make the cut in terms of quality will be used in the GTX 780 or any other GPU.
> 
> Last thing I heard from TSMC is that they now deliver more than AMD and Nvidia want to buy so I don`t know if Nvidia will have problem producing what the market want...


I hope you are correct. Hopefully it means they will be producing a lot, but not enough to keep shelves totally stocked until the 7XX series comes out. It would suck if they have the launch-day stock and then nothing else after that. My wife would kill me, but I am thinking about picking up 2 of these (if I can). If for some reason I don't use one, I can sell it to a friend at cost.

Also, let's keep in mind these will be around $900. That will narrow it down to a very small group of buyers right away.


----------



## guinner16

Quote:


> Originally Posted by *freitz*
> 
> I don't see why spamming google and newegg will help. They will have a offical announcment like everything else with a launch date. Its not just going to appear like magic. Poof.... Geforce Titan


I think he meant spamming google for news updates, and spamming newegg when it releases so he can order it right away.


----------



## PatrickCrowely

Quote:


> Originally Posted by *guinner16*
> 
> I hope you are correct. Hopefully it means they will be producing alot, but not enough to keep shelves totally stocked until 7XX series comes out. It would suck if they have the launch day stock, and then nothing else after that. My wife would kill, but I am thinking about picking up 2 of these (if I can). If for some reason I dont use it I can sell it to a friend at cost.
> 
> Also, lets keep in mind these will be around $900. That will narrow it down to a very small group of buyers right away.


They'll sell out in an hour... Then people will have them on eBay for $2,000.


----------



## guinner16

Quote:


> Originally Posted by *PatrickCrowely*
> 
> They'll sell out in a hour...... Then people will have them on Ebay for $2,000


If they last at least an hour I would be happy.


----------



## j3st3r

Holy crap that card is a beast.... Makes my 7950 look like a noob


----------



## PostalTwinkie

Alright, who else threw their wallet at their display hoping it would make something happen?

I know I did a few times!


----------



## maarten12100

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Alright, who else threw their wallet at their display hoping it would make something happen?
> 
> I know I did a few times!


Better be patient; the end of Feb is at least 2 weeks from now.


----------



## PatrickCrowely

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Alright, who else threw their wallet at their display hoping it would make something happen?
> 
> I know I did a few times!


LOL.... On another note: pics via Cloudfire777. Also, I read these cards have no video outputs, which you can notice in the previous shots...


----------



## PostalTwinkie

Quote:


> Originally Posted by *PatrickCrowely*
> 
> LOL.... On another note, Pics VIA Cloudfire777... Also I read these cards have no video outputs, which you can notice in previous shots...


Those are the Tesla cards that Titan is based on.....

If the rumors are true, Titans are those cards with one SMX disabled and video outputs added. The cards in the above picture are not Titan.


----------



## PatrickCrowely

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Those are the Tesla cards that Titan is based off of.....
> 
> If the rumors are true, Titans are those cards, with one SMX disabled, and video out put on them. The cards in the above picture are not Titan.


Oh okay, thanks for clearing that up


----------



## Cloudfire777

[image: GPU-Z screenshot]

----------



## PostalTwinkie

Quote:


> Originally Posted by *Cloudfire777*


We have seen this one a few times.


----------



## maarten12100

Quote:


> Originally Posted by *Cloudfire777*


Why did you crop that from the screenshot? We've already seen it, and there's no info in it.


----------



## Cloudfire777

Silly nugget. I haven't cut out anything.
http://www.overclockers.ru/hardnews/51993/GeForce_Titan_po_sledam_pervyh_benchmarkov.html


----------



## sherlock

Quote:


> Originally Posted by *Cloudfire777*
> 
> Silly nugget. I haven`t cut out anything
> http://www.overclockers.ru/hardnews/51993/GeForce_Titan_po_sledam_pervyh_benchmarkov.html


That X7377 score has been scrutinized enough (2 GTX 680s in SLI match it exactly) that it is not really believable.


----------



## maarten12100

Quote:


> Originally Posted by *Cloudfire777*
> 
> Silly nugget. I haven`t cut out anything
> http://www.overclockers.ru/hardnews/51993/GeForce_Titan_po_sledam_pervyh_benchmarkov.html


It was cropped from a 3DMark bench screenshot showing X7300 points or so.
If you ask me, it was a fake; I put more trust in the "85% of the GTX 690's performance" figure.
If I could push it to 1 GHz under water, this would blow the GTX 690 away.


----------



## Cloudfire777

Stop going around in circles.
We have already established that the X7300 score is not from Titan, and I didn't post that screenshot either. X71xx is the real one, or 85% of a 690.

The GPU-Z shot, however, shows information that matches Titan: the die size, transistor count, and the GK110.

Sorry for posting it; I didn't know it had been posted before.
Continue thread.


----------



## n00byn4t3r

People are a little aggressive at times in this thread. Don't worry about it too much; you can't really expect everyone to know everything.


----------



## rush2049

The titan release will be no different than the gtx 590 release as far as availability is concerned. It was available for the first two days from all the usual etailers, and then it was sold out. It did come back into stock when they let some of their reserved RMA stock go, and when defective units were repaired. It was a single production run, which is what I suspect the titan will be as well.


----------



## PostalTwinkie

Quote:


> Originally Posted by *n00byn4t3r*
> 
> People are a little aggressive at times in this thread. Don't worry a out it too much, you can't really expect everyone to know everything.


This will calm down completely when Nvidia gives an official release date and specifications.

I think what worries a lot of us is that we are so close to the impending rumored release, and still haven't had official word.


----------



## guinner16

Quote:


> Originally Posted by *PostalTwinkie*
> 
> This will calm down completely when Nvidia gives an official release date and specifications.
> 
> I think what worries a lot of us is that we are so close to the impending rumored release, and still haven't had official word.


If I had already built my rig I wouldn't be worried about this card. It just so happens that I am building my first rig, and I get my yearly bonus at the exact time the Titan is coming out. I just think a single-GPU option would be a much easier first-time build/setup than SLI. If for some reason I don't get a card I will just go with a pair of cheaper 670s and then upgrade when the 7XX series comes out. Hopefully the estimate of them being available for a day or two is correct.


----------



## TheBlindDeafMute

Quote:


> Originally Posted by *Booty Warrior*
> 
> Supposedly late Feb. At $899.99


Ironically at tax time, lmao


----------



## Avonosac

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Alright, who else threw their wallet at their display hoping it would make something happen?
> 
> I know I did a few times!


Seriously. If someone gets a full cover block out for this thing even somewhat soon, I'll faint.

Quote:


> Originally Posted by *guinner16*
> 
> If I had already built my rig I wouldnt be worried about this card. It just so happens that I a building my first rig, and get my yearly bonus at the exact time the titan is coming out. I just think a single gpu option would be much easier for a first time build/setup than sli. If for some reason I dont get a card I will just go with a pair of cheaper 670's and then upgrade when the 7xx comes out. Hopefully the estimate of them being available for a day or two is correct.


670s aren't really "cheap" unless you go with the 2GB version, and even then they're only... somewhat cheaper.


----------



## Cloudfire777

Need
a
review
-
-
-
-
NOW


----------



## Kaldari

Quote:


> Originally Posted by *TheBlindDeafMute*
> 
> Ironically at tax time, lmao


It being released around tax time is the opposite of ironic.


----------



## akbisw

ALL I want is a $200 card that can play 1080p 3D at 60 Hz. Is that too much to ask??? I hope the 760 can achieve what I'm asking for.


----------



## Avonosac

Quote:


> Originally Posted by *akbisw*
> 
> ALL i want is a $200 card that can play 1080p 3d at 60Hz. Is that too much to ask??? I hope that the 760 can achieve what Im asking for


Save money, I doubt that will ever exist.


----------



## Rayleyne

1080p 60 Hz 3D? That's relatively easy, yo...


----------



## Avonosac

Quote:


> Originally Posted by *Rayleyne*
> 
> 1080p 60hz 3d? that's relativly easy yo...


It's easy, but at that price point I don't think it will happen. Also, he doesn't really say what software he wants to run in 3D at 60 Hz, or at what AA settings, etc. Does he mean 120 Hz 3D, meaning 60 full frames shown per eye, or 60 Hz 3D with 30 frames shown per eye?


----------



## maarten12100

Quote:


> Originally Posted by *akbisw*
> 
> ALL i want is a $200 card that can play 1080p 3d at 60Hz. Is that too much to ask??? I hope that the 760 can achieve what Im asking for


Why spend $400 on a 3D monitor if you are only going to spend $200 on the card? Better to get yourself a decent IPS screen for $200, then spend $200 on a decent card like a 7870 or a GTX 660 Ti; both can do 1080p maxed in most games at 30+ fps (of course, some games are a tad heavier).


----------



## Compaddict

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Quote:
> 
> 
> 
> Originally Posted by *n00byn4t3r*
> 
> People are a little aggressive at times in this thread. Don't worry a out it too much, you can't really expect everyone to know everything.
> 
> 
> 
> This will calm down completely when Nvidia gives an official release date and specifications.
> 
> I think what worries a lot of us is that we are so close to the impending rumored release, and still haven't had official word.

Is there some ridiculous purpose for not revealing a tentative release date? Fabricated rumors and leaked tidbits, all totally unnecessary because of what, exactly? I'm not alone in waiting for this next card to upgrade; Crysis 3 will need this power for my maximum enjoyment, and I'm not going to buy a GTX 6XX card no matter how long the wait. When they finally do release the ???700-whatever-series card, Nvidia will get some of my money, and it seems like the sooner the better would make both parties happy, IMO. SO HURRY IT UP ALREADY, or at least give out an unofficial date and some REAL data instead of this ridiculous top-secret crap, which really doesn't do anyone much good.

Whew! I feel a little better now.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Avonosac*
> 
> Seriously. If someone gets a full cover block out for this thing even somewhat soon, I'll faint.


I will absolutely add it to my loop if that happens.


----------



## supermi

Quote:


> Originally Posted by *d3v0*
> 
> I have been googling this every 10-15 minutes looking for more news on it. Got the OK from the wife for purchase, haha.


CONGRATS!!!!


----------



## NBAasDOGG

Lucky Americans.

This thing is gonna be 900 dollars, which is going to cost 900 euros in the EU.

Normally 900 dollars = 665 euros, but EU people are rich, especially the Scandinavian countries and the northern EU (Netherlands, Germany, etc.).
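Back-of-the-envelope on that conversion (the exchange rate and VAT rate below are illustrative assumptions, not quotes): US list prices exclude sales tax, while EU shelf prices include VAT, which closes part of the apparent gap:

```python
# Illustrative arithmetic only; exchange rate and VAT rate are assumptions.
usd_msrp = 899.99
usd_per_eur = 1.35           # assumed early-2013 exchange rate
vat = 0.19                   # e.g. German VAT at the time

pre_tax_eur = usd_msrp / usd_per_eur       # the "fair" pre-tax conversion
with_vat_eur = pre_tax_eur * (1 + vat)     # before any retailer margin

print(round(pre_tax_eur), round(with_vat_eur))  # 667 793
```

So VAT alone takes the "fair" ~665-euro figure most of the way toward an 800-900 euro shelf price; the remainder is typically retailer margin and regional pricing.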


----------



## Avonosac

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I will absolutely add it to my loop if that happens.


It will completely derail my plans for my Prodigy build, but a single card of this power, without any SLI headaches, under water? Perfect. I'll have to work on building a custom case that can fit this monster.


----------



## maarten12100

Quote:


> Originally Posted by *NBAasDOGG*
> 
> Lucky Americans.
> 
> This thing is gonna be 900 dollars, which is going to cost 900 euros in the EU.
> 
> Normally 900 dollars = 665 euros, but EU people are rich, especially the Scandinavian countries and the northern EU (Netherlands, Germany, etc.).


Yes we are, the rich supremacy of Europe: Germany, the Netherlands, Belgium, Austria, and Switzerland (not sure if England can be considered rich).
Hopefully it will be more like 800 euros; sometimes we even pay more in euros than the dollar price, which is really too bad.
That's the main reason I get stuff from China, as it's even cheaper, though high-end parts cost almost the same in most cases.


----------



## RJT

Quote:


> Originally Posted by *supermi*
> 
> Originally Posted by d3v0
> 
> I have been googling this every 10-15 minutes looking for more news on it. Got the OK from the wife for purchase, haha.
> 
> CONGRATS!!!!


Hahaha! I'm continually checking the web for news on the Titan as well. I just recently bought a 27" 120 Hz 2560x1440 monitor, and I'm going to buy the new Corsair 900D "Godzilla" and build the mother of all water-cooled systems. This card (or 2) will be its defining feature, and will let me pump up the eye candy in virtually any game on my monitor, even Crysis 3!! God, I'm so OCD. I hope there's some kind of announcement soon, or I'm gonna end up getting fired, and divorced, for being so tech-obsessed!


----------



## xFALL3Nx

Quote:


> Originally Posted by *EpicAMDGamer*
> 
> I can has a review sample? lol
> What are you doing with your GTX 2 series card? (Gonna Sell?)


My gtx 2 series? I sold my GTX 275s already. Unless the GTX480 uses a 2xx chip?


----------



## Avonosac

Quote:


> Originally Posted by *RJT*
> 
> Hahaha! I'm continually checking the web for news on the Titan as well. I just recently bought a 27" 120Hz 2560x1440p monitor, and am going to buy the new Corsair 900D Godzilla and build the mother of all water-cooled systems. This card (or 2) will be its defining feature, and allow me to pump up the eye candy in virtually any game on my monitor, even Crysis 3!! God, I'm so OCD. I hope there's some kind of announcement soon, or I'm gonna end up getting fired, and divorced for being so tech-obsessed!


Between this and the H220, it is driving me nuts waiting for these things to come out.


----------



## Roadkill95

So back on topic, what do you guys reckon the card's gonna look like? I'd be stoked if they made it look like the GTX 480; I think it's the best looking graphics card ever made. I don't wanna see your run-of-the-mill GTX 600 design though, it has to look the part as well considering that it costs $900...


----------



## Avonosac

Quote:


> Originally Posted by *Roadkill95*
> 
> So back on topic, what do you guys reckon the card's gonna look like? I'd be stoked if they made it look like the GTX 480; I think it's the best looking graphics card ever made. I don't wanna see your run-of-the-mill GTX 600 design though, it has to look the part as well considering that it costs $900...


Doesn't matter to me, I'll only see it look stock for however long it takes EK to make a full cover block for it.


----------



## driftingforlife

Quote:


> Originally Posted by *Avonosac*
> 
> Doesn't matter to me, I'll only see it look stock for however long it takes EK to make a full cover block for it.


Same here.


----------



## Roadkill95

Is water cooling your GPU really worth it, except for benching? A card this powerful should be able to hold respectable fps at 60hz even with the most demanding games.


----------



## Avonosac

Quote:


> Originally Posted by *Roadkill95*
> 
> Is water cooling your GPU really worth it, except for benching? A card this powerful should be able to hold respectable fps at 60hz even with the most demanding games.


Yes.

No, a 690 even has trouble with FC3 with everything maxed. You will see people get 2 of these and SLI them to truly tame some of the current games.


----------



## driftingforlife

Lower temps = longer life and quieter.


----------



## supermi

Quote:


> Originally Posted by *Roadkill95*
> 
> Is water cooling your GPU really worth it, except for benching? A card this powerful should be able to hold respectable fps at 60hz even with the most demanding games.


Well, if it doesn't have much in the way of voltage control, then no, not much help other than noise and a few more MHz of OC.

But any card with voltage control like the 680 Lightnings (or my first-batch Classifieds) OHHHH MY it does!!!... If this TITAN can pour on the voltage, heat will be the limiting factor, and GOSH some water goes a LOOOOONG way towards letting you wrestle EACH and every drop of fps from this card for high-res and 3D triple-monitor gaming.

Another use of water is if you run multiple cards in the same system, say 2 cards on a small motherboard or 3-4 cards on a large one; it keeps the heat in check.


----------



## maarten12100

Quote:


> Originally Posted by *driftingforlife*
> 
> Lower temps = longer life and quieter.


And higher OC potential if you can OC it that is.


----------



## Roadkill95

Quote:


> Originally Posted by *Avonosac*
> 
> Yes.
> 
> No, a 690 even has trouble with FC3 with everything maxed. You will see people get 2 of these and SLI them to truly tame some of the current games.


1080p? Didn't know that FC3 was that demanding. Gaming is an expensive hobby; glad I'm not into it lol.
Quote:


> Originally Posted by *driftingforlife*
> 
> Lower temps = longer life and quieter.


True. I don't know if I agree with the longer-life thing, because it's pretty much negligible considering your average enthusiast's upgrade cycle, but on noise levels I can definitely agree.


----------



## Mals

Quote:


> Originally Posted by *rcfc89*
> 
> Or BF3/TF2. I easily average over 100fps with BF3 maxed at 2560x1440. Its far from demanding. FC3 was the first that made me realize I need more gpu power.


It could surely be a GPU power issue, and it's great to see the 690 sweating, but couldn't a lot of that bottlenecking also be due to CPU power?


----------



## Roadkill95

Quote:


> Originally Posted by *supermi*
> 
> Well if it does not have much in the way of voltage control then no not much help, other than noise and a few more MHZ OC.
> 
> But any card with voltage control like the 680 lightnings (or my first batch classifieds) OHHHH MY it does!!!... If this TITAN can pour on the voltage heat will be the limiting factor and GOSH some water goes a LOOOOONG way towards letting you wrestle EACH and every drop of fps from this card for high res and 3d triple monitor gaming.
> 
> Another use of water is if you run multiple cards in the same system say 2 cards on a small mb or 3 - 4 cards on a large mb, keeps the heat in check.


Rumor is that NVIDIA isn't allowing any non-reference PCBs or coolers, so it'll most likely be voltage locked.


----------



## Avonosac

Quote:


> Originally Posted by *Roadkill95*
> 
> 1080p? Didn't know that FC3 was that demanding. Gaming is an expensive hobby glad I'm not into it lol.
> True.


Yeah, that's one of the biggest contributors to the size of the console gaming market: it's a fixed, and far lower, entry cost.
Quote:


> Originally Posted by *Mals*
> 
> It could surely be some GPU power issues, which is great to see the 690 sweating, but couldn't a lot of that bottlenecking be also due to CPU power?


You won't likely find too many 690s ever bottlenecked by CPU, simply because people who buy a $1,000 GPU will not settle for a CPU that will bottleneck it.


----------



## PhantomTaco

Yeah, I highly doubt there will be voltage control on these cards; I'd give it a 5% chance at best. I'm just hoping this card really performs better than the 690, as my 690 isn't good enough for what I want it for (120fps 1440p gaming lol, I know it will be a while before that's a reality).


----------



## Noob_with_Tools

Any news about the release date? Maybe it will be from the 18th to the 26th, or the 13th to the 22nd, which will be "THE BIG TITAN week".


----------



## shahramkel

Can't wait for reviews!


----------



## supermi

Quote:


> Originally Posted by *Roadkill95*
> 
> Rumor is that nvidia isn't allowing any non-reference pcbs or coolers so it'll most likely be voltage locked.


I KNOW

That is the main reason I have not sold my 680 Classifieds. I WANT voltage control on a high-end card.

SHEESH NVIDIA

Let us hope that THOSE rumors are wrong LOL!!!


----------



## Cloudfire777

Quote:


> Originally Posted by *Noob_with_Tools*
> 
> Any news about the release date? Maybe it will be from the 18th to the 26th, or the 13th to the 22nd, which will be "THE BIG TITAN week".


Late February or early March, according to the rumors


----------



## supermi

Quote:


> Originally Posted by *Avonosac*
> 
> Yea, that is one of the biggest contributors to the size of the console gaming market, its a fixed, and far lower entry cost.
> You won't likely find too many 690's ever bottlenecked by CPU, simply because the people who buy a $1,000 GPU will not settle for a CPU which will bottleneck it.


Well, my 3930K at 5GHz does bottleneck my 4-way SLI in BF3, but in most other games, like C3 with enough pixels and eye candy, the GPUs become the main bottleneck again YAY!


----------



## Cloudfire777

Quote:


> Originally Posted by *supermi*
> 
> Well, my 3930K at 5GHz does bottleneck my 4-way SLI in BF3, but in most other games, like C3 with enough pixels and eye candy, the GPUs become the main bottleneck again YAY!


Well in that case you must buy 4 x Titans so that you can show that CPU who is the boss


----------



## RJT

Quote:


> Originally Posted by *PhantomTaco*
> 
> Yeah I highly doubt there will be voltage control on these cards, I'd give it a 5% chance at best. I'm just hoping that this card really performs better than the 690, as my 690 isn't good enough for what I want it (120fps 1440p gaming lol, I know it will be a while before that's a reality)


Not true. I get 120fps in Black Ops II on my 2560x1440 Catleap 2B Extreme (120Hz) with my GTX 680. Mind you, that PC game is a console port and not all that demanding, but it sure is fun gaming at that resolution and 120fps!

Gonna need the Titan (or 2) to achieve that on more demanding titles, however: e.g. in BF3 at high settings, 2xMSAA, and motion blur off, I can only average 80fps at that resolution with my single 680.


----------



## supermi

Quote:


> Originally Posted by *Cloudfire777*
> 
> Well in that case you must buy 4 x Titans so that you can show that CPU who is the boss


LOL


----------



## Cloudfire777

The funny thing is that my GTX 680M inside my notebook cost almost as much as this Titan. My 3940XM cost like $500-600 more than an i7-3930K.

When you're in the notebook world you get accustomed to expensive hardware.


----------



## Avonosac

Quote:


> Originally Posted by *supermi*
> 
> Well my 3930 at 5ghz does bottlneck my 4 way sli in BF3
> 
> 
> 
> 
> 
> 
> 
> but in most other games like C3 with enough pixels and eye candy the GPU's become the main bottleneck again YAY!


That's also because BF3 is bad on the CPU for some god-awful unoptimized reason. You are right though: with enough GPUs in SLI/CF there is enough CPU overhead that it could become a problem, so go for 5.5GHz.


----------



## Noob_with_Tools

Quote:


> Originally Posted by *Cloudfire777*
> 
> Late February or early March, according to the rumors


March? Mmmm, long wait... I want some news from NVIDIA. At least an official date, or even just a seller listing.


----------



## supermi

Quote:


> Originally Posted by *Cloudfire777*
> 
> The funny thing is that my GTX 680M inside my notebook cost almost as much as this Titan. My 3940XM cost like $500-600 more than i7 3930.
> 
> When you`re in the notebook world you get accustomed to expensive hardware.


I HEAR YOU. I have a 4GB 680M Clevo and a 2920XM in my M18x. HOWEVER, having a quad-core CPU @ 4.3-4.5GHz and a GPU that, when overclocked, matches a stock 4GB 670, ALL in a laptop you can use on the sofa, at a cafe, at a friend's house, in the bedroom, etc., is PRICELESS ... I can run C3 fully maxed in stereo 3D on my 720p 3D Vision projector from my laptop and get like 65fps lol!!! Will add another in SLI at some point as well!!! HEHEHE

Quote:


> Originally Posted by *Avonosac*
> 
> Thats also because BF3 is bad on CPU for some god aweful un-optimized reason. You are right though, with enough GPU's in SLI / CF there is enough CPU overhead where it could become a problem, so go for 5.5ghz


YES, though a lot of the BF3 trouble seems like poor SLI optimization as well ... a buddy of mine and I find that switching to and from full screen or changing res can at times trick the game into giving us full GPU scaling LOL. However, Battlelog seems to CRASH my whole PC if I try to start more than one match before doing a full PC restart!?!?!?!

I am sooooo happy C3 is here because I am sooooooooooo tired of the issues with BF3 and especially Battlelog. I get similar issues with MOH if I open it with Battlelog ... but if I open MOH from the game itself, no issues at all, amazing GPU scaling and ZERO crashes LOL ....

ok back to the titans!!!

I would love to shoot for 5.2ghz, but not until IVY E , my next upgrade is closer LOL OHHH YEAH!!!!


----------



## PhantomTaco

Quote:


> Originally Posted by *RJT*
> 
> Not true. I get 120fps in Black Ops II on my 2560x1440 Catleap 2B Extreme (120Hz) with my GTX 680. Mind you, that PC game is a console port and not all that demanding, but it sure is fun gaming at that resolution and 120fps!
> 
> Gonna need the Titan (or 2) to achieve that on more demanding titles, however: e.g. in BF3 at high settings, 2xMSAA, and motion blur off, I can only average 80fps at that resolution with my single 680.


Like you said, Black Ops II isn't that demanding. Try something like Far Cry 3 for instance, at 1440p everything maxed out I get an average of 34 fps. I'd be interested to see how the titan does in that game.


----------



## Cloudfire777

Quote:


> Originally Posted by *supermi*
> 
> I HEAR YOU, I have a 680m 4gb clevo and 2920XM in my M18X, HOWEVER having a quad core cpu @ 4.3 - 4.5ghz and a GPU that when overclocked matches a stock 670 4gb ALL in a laptop you can use on the sofa, cafe, friends house bedroom etc is PRICELESS ... I can run C3 fully maxed in stereo 3d on my 720p 3d vision projector form my laptop and get loke 65fps lol!!! Will add another in SLI at some point as well!!! HEHEHE


Agreed 100%. It's nice to have a notebook that is capable of playing all games in 1080p with max settings. Kepler was a godsend to notebooks since we got such a huge performance leap over Fermi. It's also nice to have a notebook when going to LAN parties and such. You see people dragging all sorts of crap with them, and you just come there with a thin little notebook, plug in the power cable, and you are done.

My MSI even has a pretty good sound system (well, not as good as real 5.1 systems and similar, obviously) which is good enough for gaming and movies.

That said, this Titan seems very interesting, and it won't surprise me if I once again build myself a new desktop system.


----------



## supermi

Quote:


> Originally Posted by *Cloudfire777*
> 
> Agreed 100%. It's nice to have a notebook that is capable of playing all games in 1080p with max settings. Kepler was a godsend to notebooks since we got such a huge performance leap over Fermi. It's also nice to have a notebook when going to LAN parties and such. You see people dragging all sorts of crap with them, and you just come there with a thin little notebook, plug in the power cable, and you are done.
> 
> My MSI even has a pretty good sound system (well, not as good as real 5.1 systems and similar, obviously) which is good enough for gaming and movies.
> 
> That said, this Titan seems very interesting, and it won't surprise me if I once again build myself a new desktop system.


YES YES YES, and add a nice external DAC and headphones, a nice mouse < WOW > you have MONSTER gaming ANYWHERE.

Oh there is no substitute for a full desktop rig either!!! Def, need both hahaha....

May TITAN usher you into another build!!!!


----------



## RJT

Quote:


> Originally Posted by *PhantomTaco*
> 
> Like you said, Black Ops II isn't that demanding. Try something like Far Cry 3 for instance, at 1440p everything maxed out I get an average of 34 fps. I'd be interested to see how the titan does in that game.


Definitely. If the Titan can handle FC3, BF3, Crysis 3 or a heavily modded Skyrim @ max setting and get 120fps at that resolution, we truly will have the mother of all gpus!


----------



## Avonosac

Quote:


> Originally Posted by *RJT*
> 
> Definitely. If the Titan can handle FC3, BF3, Crysis 3 or a heavily modded Skyrim @ max setting and get 120fps at that resolution, we truly will have the mother of all gpus!


It will likely run out of juice, but it will also not have the overhead of the 690 and the other instabilities SLI introduces. You'll probably need 2 of 'em or more to get that performance out of these newest games. That being said, I think 2 of these in SLI will probably beat the crap out of 2x 690s, since they will be able to capitalize on memory bandwidth and power.


----------



## maarten12100

Quote:


> Originally Posted by *Avonosac*
> 
> It will likely run out of juice, but it will also not have the overhead of the 690, and other instabilities SLI introduces. You'll probably need 2 of em or more to get that performance out of these newest games, that being said I think 2 of these in SLI will probably beat the crap out of 2x690s.
> 
> Since they will be able to capitalize on memory bandwidth and power.


And they will scale beyond 2 GPUs, where the GTX 690 mostly doesn't win.


----------



## RJT

Quote:


> Originally Posted by *Avonosac*
> 
> It will likely run out of juice, but it will also not have the overhead of the 690, and other instabilities SLI introduces. You'll probably need 2 of em or more to get that performance out of these newest games, that being said I think 2 of these in SLI will probably beat the crap out of 2x690s.
> 
> Since they will be able to capitalize on memory bandwidth and power.


Agreed. If these Titans and their purported specs are real, and they are not voltage locked... enthusiast/gamer nirvana!!

And even though a pair of them might set you back ~$1800... I'd bet they could handle anything thrown at them for the next 3-5 years!


----------



## goesto11

Am I the only one who thinks that's a little weird for review samples to be shipped this early before expected release (2-4 weeks away)? Maybe I'm wrong, but I would think shipping review samples about a week prior to launch would be typical. Conversely, if review samples have been shipped, would that indicate that the card might be released sooner than anticipated?


----------



## driftingforlife

Quote:


> Originally Posted by *goesto11*
> 
> Am I the only one who thinks that's a little weird for review samples to be shipped this early before expected release (2-4 weeks away)? Maybe I'm wrong, but I would think shipping review samples about a week prior to launch would be typical. Conversely, if review samples have been shipped, would that indicate that the card might be released sooner than anticipated?


Nope. 4xx cards went out waaaaaaaay before release.


----------



## GoldenTiger

Quote:


> Originally Posted by *goesto11*
> 
> Am I the only one who thinks that's a little weird for review samples to be shipped this early before expected release (2-4 weeks away)? Maybe I'm wrong, but I would think shipping review samples about a week prior to launch would be typical. Conversely, if review samples have been shipped, would that indicate that the card might be released sooner than anticipated?


I hear they usually don't go out more than two weeks in advance.... I'm guessing more towards the "sooner than anticipated" thing if the review samples have truly shipped.


----------



## Avonosac

Quote:


> Originally Posted by *maarten12100*
> 
> And they will scale beyond 2 gpus which the gtx690 mostly doesn't win


Yeah, scaling at 2 is much better than at 4, although I saw a review here a few days ago that showed scaling at 4 in some games over 80%; that is not bad at all.

Quote:


> Originally Posted by *RJT*
> 
> Agreed. If these Titans and their purported specs are real, and they are not voltage locked....enthusiast/gamer Nirvana!!
> 
> 
> 
> 
> 
> 
> 
> And even though a pair of them might set you back ~$1800...I'd bet they could handle anything thrown at them for the next 3-5 years!


I wouldn't say 3-5 years, but at least 2 for real solid gaming. The only issue I have with that kind of a statement is, we don't know how much devs are going to go nuts since the newer consoles will be here soon.

Either way, they will be strong for a while.
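For a rough sense of what those scaling percentages mean in frame rate, here's a quick sketch. The `effective_fps` helper and the 60 fps single-card baseline are hypothetical, not from any review; "80% scaling" is read as each card beyond the first contributing 80% of one card's performance:

```python
# Effective FPS from multi-GPU scaling efficiency (hypothetical numbers).
# Each GPU beyond the first contributes `efficiency` times one card's
# worth of performance on top of the single-card baseline.
def effective_fps(single_card_fps: float, num_gpus: int, efficiency: float) -> float:
    return single_card_fps * (1 + (num_gpus - 1) * efficiency)

print(round(effective_fps(60, 2, 0.90)))  # 2-way SLI at 90% -> 114
print(round(effective_fps(60, 4, 0.80)))  # 4-way SLI at 80% -> 204
```

So even "only" 80% scaling at 4 cards is a big jump over 2-way, which is why people chase quad setups at huge resolutions.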


----------



## xd_1771

*I would like to issue a reminder that trolling is a violation of the Overclock.net TOS (link in sig) and is punishable through our standard infraction policy.*

*Please carry on with the on-topic discussion.*


----------



## Avonosac

Quote:


> Originally Posted by *xd_1771*
> 
> *I would like to issue a reminder that trolling is a violation of the Overclock.net TOS (link in sig) and is punishable through our standard infraction policy.*
> 
> *Please carry on with the on-topic discussion.*












And in come the Mounties.


----------



## HanakoIkezawa

I think it's great that NVIDIA is creating another market segment for the enthusiasts. For people for whom price is no object, the 3960/70X is the obvious choice over the 3930K, but in graphics the best money can buy (that is usable in games) is 3 680s or 3 7970s. I see droves of users on this forum running out to buy 3 Titans the day they are released, because they are simply the best money can buy.

I also do not see this affecting the price of the 680/670/660 Ti very much; $50-100 max. I hope it does though, because I would love another 670 :3


----------



## goesto11

Quote:


> Originally Posted by *RJT*
> 
> Agreed. If these Titans and their purported specs are real, and they are not voltage locked....enthusiast/gamer Nirvana!!
> 
> 
> 
> 
> 
> 
> 
> And even though a pair of them might set you back ~$1800...I'd bet they could handle anything thrown at them for the next 3-5 years!


I'd be shocked if these cards were not voltage locked. If NVIDIA finds it "necessary" to voltage lock a GK104 card like the GTX 680, why would a GK110-based Titan not be voltage locked as well? It's a massive chip, much larger than GK104, and yet it will only be available with a reference board and cooling. Frankly, I think it makes more sense to voltage lock a Titan than a GTX 680 or 670. The current Kepler cards may run cool, but I suspect that won't be the case with the Titan. Maybe you could adapt a GTX 680 aftermarket water cooler or a top-notch air cooler, but there goes the warranty on your $900+ card. Point is that, even if the Titan's voltage were unlocked, how much could you really boost it before GPU temps become a problem?

FWIW, I'd prefer no voltage lock, but before that, I'd rather have either high-end non-reference versions (e.g. MSI Lightning and similar), or NVIDIA improving "reference" to an equivalent level of component quality and cooling performance. At 28nm, my guess is that Titan OC headroom will be quite limited, much more so than a GTX 680 or 670.


----------



## smithyzbak

Quote:


> Originally Posted by *Avonosac*
> 
> Yea, scaling at 2 is much better than at 4. Although I saw a review here a few days ago that showed the scaling at 4 in some games over 80%, that is not bad at all.
> I wouldn't say 3-5 years, but at least 2 for real solid gaming. The only issue I have with that kind of a statement is, we don't know how much devs are going to go nuts since the newer consoles will be here soon.
> 
> Either way, they will be strong for a while.


I'm wondering exactly how limited the quantity of Titans is actually going to be; would it even be possible to get another one a year or two down the line to SLI? The rumored performance numbers people are suggesting are pretty damn high, and I have a hard time imagining any game coming out this year, or even announced, that could legitimately max out SLI Titans or even stress them.


----------



## HanakoIkezawa

Quote:


> Originally Posted by *smithyzbak*
> 
> I'm wondering exactly how limited the quantity of Titans is actually going to be, so would it even be possible to get another one a year or two down the line to sli? The rumored performance numbers people are suggesting are pretty damn high, and I have a hard time imagining any game coming out this year or even mentioned that could legitmately max out sli titans or even stress them.


Metro at 7680x1440.


----------



## goesto11

Quote:


> Originally Posted by *driftingforlife*
> 
> Nope. 4xx cards went out waaaaaaaay before release.


Thanks. Never paid attention to that stuff before. This is the first time I'm seriously considering buying a new card design at release. One last question - even if reviewers do get the cards far ahead of release, they still can't post reviews until NVIDIA allows, right? I'm dying to know how the Titan performs so the sooner we get some benchmarks the better.


----------



## GoldenTiger

Quote:


> Originally Posted by *goesto11*
> 
> Thanks. Never paid attention to that stuff before. This is the first time I'm seriously considering buying a new card design at release. One last question - even if reviewers do get the cards far ahead of release, they still can't post reviews until NVIDIA allows, right? I'm dying to know how the Titan performs so the sooner we get some benchmarks the better.


Usually the most they receive them at is about 2 weeks ahead, and no, they are bound by an NDA (non-disclosure agreement) as to what they can say (which is usually virtually nothing) before the embargo date (launch time/reviews allowed).


----------



## maarten12100

Quote:


> Originally Posted by *goesto11*
> 
> Thanks. Never paid attention to that stuff before. This is the first time I'm seriously considering buying a new card design at release. One last question - even if reviewers do get the cards far ahead of release, they still can't post reviews until NVIDIA allows, right? I'm dying to know how the Titan performs so the sooner we get some benchmarks the better.


Well, performance-wise I would say 80-85% of the GTX 690 while scaling works.
Anything gained by overclocking the core clock is a bonus.


----------



## Avonosac

Quote:


> Originally Posted by *smithyzbak*
> 
> I'm wondering exactly how limited the quantity of Titans is actually going to be, so would it even be possible to get another one a year or two down the line to sli? The rumored performance numbers people are suggesting are pretty damn high, and I have a hard time imagining any game coming out this year or even mentioned that could legitmately max out sli titans or even stress them.


Quote:


> Originally Posted by *HanakoIkezawa*
> 
> metro at 7680x1440


Bingo, the thing you aren't accounting for is massive resolutions. There are display setups out there bringing 4x 680s or 7970s to their knees. I game now at 2560x1440, and really that is going to be a small resolution in a short time; for some it is now.

Game developers will start to take advantage of the more expensive portions of the DX or OGL libraries, and all the games you play that are just console ports will all of a sudden be much harder on your graphics cards.

There are a lot of ways now to put 2 of these on their knees. They might have the power at 4x in SLI to make everything bow down currently, but how soon until the next Crysis comes out? A game which is designed to be as gorgeous as possible, which means it's designed to crush all hardware available; a hardware industry target. 3-5 years is a LOT of time in the tech industry; I just don't see these lasting that long at the top.


----------



## goesto11

Quote:


> Originally Posted by *maarten12100*
> 
> Well, performance-wise I would say 80-85% of the GTX 690 while scaling works. Anything gained by overclocking the core clock is a bonus.


I'm hoping for 90% of GTX 690 gaming performance, though 85% may be more accurate. You make a good point about scaling, but how much can you overclock a stock Titan? That might be harder to guess than relative stock performance. Very much looking forward to release & reviews.


----------



## NoGuru

Quote:


> Originally Posted by *rubicsphere*
> 
> And so it begins...
> 
> Dang those look a little on the long side


That's what she said


----------



## tastegw

300W is fine with me; make it up to 400 if you have to.
900 bucks is fine with me as long as its performance is at least 75% of the 690.
If it's a limited production, plz reserve two of them for me, cherry picked if applicable.

For those thinking this will be a dual-GPU card, what GPUs were you thinking of?
2x K20s? No way 2 K20s would be slower than or about the same as a 690, unless the K20s are severely gimped.
So if it's not K20s, then what? Best I could come up with is 2 660 Tis, and I highly doubt that's the case here.

So bring on this single GPU and take my money!
384-bit / 6GB GDDR5 / 2500± CUDA cores / a 32oz squeeze bottle of strawberry jelly / and for the love of god, put a damned backplate on it.


----------



## AMD_Freak

Quote:


> Originally Posted by *rubicsphere*
> 
> If performance is where it's rumored to be (>690) in a single GPU solution it is worth every red cent.


That will be great. Now for the wait on the marketplace here on OCN for slightly used GTX 690s.


----------



## Whole Wheat

2688 CUDA cores, 6GB GDDR5 memory, 384-bit interface.
NVIDIA just used their ulti!


----------



## rcfc89

Quote:


> Originally Posted by *AMD_Freak*
> 
> that will be great , now the wait on the marketplace here on OCN for the slightly used gtx 690s


Mine will be there at only 2 months old... but only if I can get a pair of Titans.


----------



## Avonosac

Even if its performance is 85% of the 690, it's a better card because of the memory. The GTX 690 starts losing a ton of performance at higher resolutions / AA because it runs out of room for textures and starts playing games with the memory. Being able to utilize 100% of the Titan's performance at higher resolutions, with the full memory and bandwidth, will outweigh the slight loss in processing power in the long run.


----------



## Ghoxt

Quote:


> Originally Posted by *prava*
> 
> So much fail in this reply.
> 
> a) In R&D environments, as soon as the product is done, you put it out. Every day the tech is ready but not on the market you are losing money, since you allow your competition to catch up. Would you invest hundreds of millions into a product and then wait a year to release it? Really?
> 
> b) Any CEO not following a) strictly would get fired by their shareholders any minute. Do you have any idea why products get revisions? Or bugs? Because all companies in R&D-related industries work against the clock, and any minute the product is not out there is a ton of revenue lost.
> 
> c) Do you truly think that GK110 didn't get released early even for Tesla, when such cards have a HUGE mark-up and there was a TON of expectation?
> 
> Conclusion: you are clueless. NVIDIA, and any other tech company that has competition around it, will release every product the single minute they can, and not a minute later. Why did GK110 come this late? Because it wasn't ready: be it that it was too expensive, or the fab couldn't keep up, or whatever reason: it wasn't ready.


Intel and NVIDIA consider themselves smarter than that. Having a taped-out GK110 ES is different from paying for a full run of thousands from the fab. They get samples made at "any" time and make an educated management decision. I won't repeat what others have said, but your reasoning is not bulletproof, and it assumes these companies are run by robots.

By your reasoning, Intel would never have hamstrung half their lineup. They've had insanely powerful chips available lately, and at any time within the last year they could have delivered their best offering to consumers. But why should they? Their reasoning is "give the market a decent 10% bump", stay ahead of the competition, but go "no further" until they have to show their hand. This gives them breathing room down the road. It's not assured of course, but with AMD's current offerings this is almost child's play for Intel, CPU-wise... right now.

NVIDIA is in the same game with their entire lineup. The engineers develop, benchmark, and deliver the reports to management. Management makes the decision not just for the present, but for the future of the company, which means not blowing your load and then having nothing for a long while; that can also disappoint shareholders, and that's what gets CEOs fired.


----------



## Compaddict

Wouldn't it be perfect to see this in a couple of weeks? *"Free Crysis 3 with Titan 780 purchase"*


----------



## chronicfx

I think I may replay The Witcher 2 with übersampling after Crysis.

Titan should handle that easily.


----------



## RJT

Oooo ya! As long as that bundle includes the new Bioshock and The Last of Us, I'm in! Hahaha!


----------



## dioxholster

When will it be released?


----------



## rcfc89

Quote:


> Originally Posted by *Avonosac*
> 
> Even if their performance is 85% of the 690, its a better card because of the memory bandwidth. The GTX690 runs out of memory and starts losing a ton of performance at higher resolutions / AA because it runs out of room for textures and starts playing games with the memory. being able to utilize 100% of the performance of the titan at the higher resolutions with the full memory bandwidth will outweigh the slight loss in processing power in the long run.


I've never seen memory usage go past 1850MB in any current game with AA maxed on my 690 at 2560x1440, so I'm not sure where you're getting this info from. Only a heavily modded Skyrim could possibly do that to a 690 at 2560x1440. Maybe if you went to Eyefinity at 2560 you'd run into it, but never on a single monitor.


----------



## PhantomTaco

Quote:


> Originally Posted by *rcfc89*
> 
> I've never seen memory usage go past 1850 in any current game with aa maxed on either gpu on my 690 on 2560x1440. Not sure where you are getting this info from. Only I highly modded skyrim would possibly do that to a 690 in 2560x1440. Maybe if you went to eyefinity 2560 would you run into this but never on a single monitor.


FC3 hits 1924mb on my 690...


----------



## Brianmz

Haven't been monitoring VRAM usage in most games lately, but I decided to run the Hitman: Absolution benchmark maxed at 1440p and it hit a constant 3GB usage on my cards with a 38fps average, so the 6GB of VRAM wouldn't hurt for future games that push the envelope.

Really looking forward to these Titans; planning to grab a pair or three of them and put them under water. Hopefully a 3770K doesn't bottleneck them; I'm looking at the 3930K in case it does, and then I'd replace that with Ivy-E at the end of the year.


----------



## Avonosac

Quote:


> Originally Posted by *rcfc89*
> 
> I've never seen memory usage go past 1850 in any current game with aa maxed on either gpu on my 690 on 2560x1440. Not sure where you are getting this info from. Only I highly modded skyrim would possibly do that to a 690 in 2560x1440. Maybe if you went to eyefinity 2560 would you run into this but never on a single monitor.


Quote:


> Originally Posted by *PhantomTaco*
> 
> FC3 hits 1924mb on my 690...


Quote:


> Originally Posted by *Brianmz*
> 
> Haven't been monitoring vram usage on most games lately, but decided to run the hitman absolution benchmark maxed at 1440p and it hit a constat 3gb usage on my cards and 38fps average, so the 6gb vram wouldn't hurt for future games that push the envelope.
> 
> Really looking forward to these titans, to grab a pair or 3 of them and put them underwater. Hopefully a 3770k doesn't bottleneck them, looking at the 3930k in case it does and then replace it at the end of the year with ivy-e.


I only have a 670, since none of the offerings from either side really tickled my fancy as a true in-line upgrade from my 4890s. So I don't have first-hand experience, but I was taking the demanding new games as baselines for future titles, with surround/Eyefinity starting to gain market share. Even so, what happens when 1600p monitors start getting 120/144Hz and 3D? The resolution might not be bigger, but you're going to need much more memory to drive those. The 256-bit bus is a major buzzkill for the 690.


----------



## Systemlord

You'd think Nvidia could easily get higher GK110 yields by Q4 and release more Titan goodness right alongside the GTX 780...?









On another note can we please stay on topic, too many pages full of ****!


----------



## i7monkey

It sounds like it's a very limited release, so I'm worried that early adopters of Titan will be nothing but guinea pigs, and that this will later translate into a better release of Titan in Q4 alongside the GTX 780. I can easily see them doing something like this.

And what will driver support for such a limited card look like?


----------



## Votkrath

Quote:


> Originally Posted by *i7monkey*
> 
> It sounds like it's a very limited release so I'm worried that early adopters of Titan will be nothing but guinea pigs and this will later translate to a better release of Titan In Q4 along wit the GTX 780. I can easily see them doing something like this.
> 
> And what's driver support for such limited cards like this?


Titan Test Version $899

Get real.


----------



## brasco

Quote:


> Originally Posted by *i7monkey*
> 
> It sounds like it's a very limited release


Genuine question: where is the info on its limited release?


----------



## driftingforlife

It IS a very limited release. I know this from trying to get a sample from the vendors I know.


----------



## brasco

Thanks. I'm no market expert; does the availability of review samples etc. directly translate into a very limited retail release?


----------



## dph314

Quote:


> Originally Posted by *driftingforlife*
> 
> It IS a very limited release. I know this from trying to get a sample from the vendors I know.


Like 590 limited? Or Mars II limited?


----------



## maarten12100

Quote:


> Originally Posted by *dph314*
> 
> Like 590 limited? Or Mars II limited?


Nope. Those were based on non-Tesla chips: GF110 was used in Teslas as well, but it wasn't a Tesla-only chip. GK110 actually is a Tesla-only chip right now, since it hasn't been released in any consumer card. The 590 and Mars II were limited because only a couple of thousand were ever made.


----------



## i7monkey

Quote:


> Originally Posted by *Votkrath*
> 
> Titan Test Version $899
> 
> Get real.


Haha...I get what you're saying, but consider the following:

A) How much future driver support will a super-niche product like this really get, even at $899? How much time and money will Nvidia set aside to support a product that only a small % of enthusiasts own?

B) Nvidia is only releasing this to prove a point; it doesn't seem like a mainstream release at all, so we're going to be guinea pigs. If this were a mainstream release that many people were buying, Nvidia would be risking its reputation. But since only a small % of enthusiasts are buying this, they know they can get away with a bit more: enthusiasts are generally more forgiving and understand the risks of buying such a limited-release card better than a guy who buys his computer at Best Buy.


----------



## driftingforlife

I don't know the numbers sorry.


----------



## brasco

Quote:


> Originally Posted by *i7monkey*
> 
> Haha...I get what you're saying, but consider the following:
> 
> A) How much future driver support will a super niche product like this really get, even if it's $899? How much time and money will Nvidia set aside to support a product that only a small % of enthusiasts own?


Won't the more mature drivers from the Tesla cards be used as the foundation for a more GeForce-oriented driver?


----------



## maarten12100

I think it'll work just fine.
The architecture is mostly the same as GK104's, so it should run fine.

I hope it does


----------



## maarten12100

Quote:


> Originally Posted by *brasco*
> 
> Won't the more matured drivers from the Tesla cards be used as a the foundation for a more Geforce oriented driver?


It might be, but that's a driver made for GPGPU; it wouldn't do well at graphics if you just ported it, just like GK104 doesn't do well at compute.


----------



## i7monkey

I'm a noob compared to most of you guys so please take my comments with a grain of salt.

In the business world, companies always strive for the best customer service and reputation, but we also have to be honest with ourselves: we can't expect customer support on par with mainstream releases. It's just not feasible.

Nvidia is basically granting a minority of its enthusiast customer base "the luxury" of such a card; it comes at a hefty price and with the usual risks that extremely limited products tend to carry.

This isn't something that gets advertised at the Super Bowl. It's an extremely niche product geared towards a small percentage of enthusiasts, let alone casual gamers.

It would be suicide for Nvidia to release a half-baked product, but if they release only a thousand cards and then stop, how much support can we really honestly expect?

I'm not experienced with super high end cards like the Mars II etc..., but logic tells me that these types of products, assuming the rumors are true, have very limited quantity, may come with possible issues, and maybe have limited support in the long run.

To be honest, it's a risky card to buy.


----------



## Stay Puft

Titan listed on a Denmark site

http://www.proshop.dk/Grafikkort/ASUS-GeForce-GTX-Titan-6GB-GDDR5-2394804.html

6GB also confirmed







What is that, like 900 dollars USD?

There is also this


----------



## GoldenTiger

Quote:


> Originally Posted by *Stay Puft*
> 
> Titan isn't a rumor anymore, guys. Listed on a Denmark site
> 
> http://www.proshop.dk/Grafikkort/ASUS-GeForce-GTX-Titan-6GB-GDDR5-2394804.html
> 
> 6GB also confirmed
> 
> 
> 
> 
> 
> 
> 
> What is that like 900 dollars USD?


Yup, $907 pre-VAT if you convert the currency directly. I just came here to post it myself!


----------



## dph314

Quote:


> Originally Posted by *Stay Puft*
> 
> Titan isn't a rumor anymore, guys. Listed on a Denmark site
> 
> http://www.proshop.dk/Grafikkort/ASUS-GeForce-GTX-Titan-6GB-GDDR5-2394804.html
> 
> 6GB also confirmed
> 
> 
> 
> 
> 
> 
> 
> What is that like 900 dollars USD?
> 
> There is also this




*Ti*tan! I get it!


----------



## GoldenTiger

Quote:


> Originally Posted by *i7monkey*
> 
> I'm a noob compared to most of you guys so please take my comments with a grain of salt.
> 
> In the business world, companies always strive for the best customer service and reputation, but we also have to be honest with ourselves. We can't expect customer support similar to that of mainstream releases, it's just not feasible.
> 
> Nvidia is basically granting a minority of it's enthusiast customer base "the luxury" of such a card, and it comes at a hefty price and it will come with the usual risks that extremely limited products tend to come with.
> 
> This isn't something that's advertised at the SuperBowl, it's an extremely niche product that's geared towards a small percentage of enthusiasts, let alone casual gamers in general.
> 
> It would be suicide for Nvidia to release a half-baked product, but if they release only a thousand cards and then stop, then how much support can we really honestly expect?
> 
> I'm not experienced with super high end cards like the Mars II etc..., but logic tells me that these types of products, assuming the rumors are true, have very limited quantity, may come with possible issues, and maybe have limited support in the long run.
> 
> To be honest, it's a risky card to buy.


It will use the same drivers likely as "normal" cards do, anyway... cards like the MARS just use standard ones with crossfire profiles. There is no particular extra risk here on that front. Additionally, enthusiasts dropping $900 on a card know it's not going to be top dog for the "long run" anyway....


----------



## GoldenTiger

Quote:


> Originally Posted by *dph314*
> 
> 
> 
> 
> *Ti*tan! I get it!


That chart is bull: just Tesla specs with random numbers filled in.


----------



## Stay Puft

Quote:


> Originally Posted by *GoldenTiger*
> 
> That chart is bull, just Tesla and random specs filled in.


I thought so too, but I figured I'd post it anyway.


----------



## i7monkey

Quote:


> Originally Posted by *Stay Puft*
> 
> There is also this


So wait, since it says 140%, the Titan is only 40% faster than a GTX 680?


----------



## dph314

Well, hopefully the release dates are accurate. But I doubt it.

At least that one Danish site listing them as coming Feb 18 is something to look forward to.


----------



## PatrickCrowely

I pray those release dates are close. It sucks not being able to game!


----------



## Stay Puft

Quote:


> Originally Posted by *dph314*
> 
> Well, hopefully the release dates are accurate. But I doubt it.
> 
> At least that one Danish site listing them as coming Feb 18 is something to look forward to.


I've changed my tune. I'm buying one. The lady will be peeved, but I can live with this sex instead of hers for a week.


----------



## MoBeeJ

A Danish site that took a pic from an Egyptian site!

Regardless, the chart is definitely wrong. I mean, the GTX 780 has a higher TDP than Titan?? Almost exactly the same bandwidth?? The Titan's memory is at 5200 while the others are at 6000?? I hope this is true, because it would mean the card can OC a LOT. I also like the absence of boost, meaning old-fashioned OCing.

But all in all, the chart is wrong.


----------



## GoldenTiger

Quote:


> Originally Posted by *Stay Puft*
> 
> I've changed my tune. I'm buying one. The lady will be peeved but i can live with this sex instead of her's for a week.


LOL.... I'm trying to convince myself I really need a video card with that much power for 60hz 2560x1600 gaming.


----------



## Alatar

Quote:


> Originally Posted by *i7monkey*
> 
> So wait, since it says 140%, the Titan is only 40% faster than a GTX 680?


Completely plausible.

According to TPUs 690 review that would bring the titan to 83% of the 690 at 1600p, 92% of the 690 at 1200p and 97% of the 690 across all tested resolutions.
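The arithmetic here can be sketched as below. The 690-over-680 ratios (1.69x at 1600p, 1.52x at 1200p, 1.44x overall) are back-calculated from the percentages in this post rather than read directly from TPU's review, so treat them as assumed inputs:

```python
# If Titan lands at 140% of a GTX 680, its share of a GTX 690 depends
# only on how far the 690 leads the 680 at each resolution. These
# 690-vs-680 ratios are implied by the post, not taken from TPU's tables.
titan_vs_680 = 1.40
gtx690_vs_680 = {"1600p": 1.69, "1200p": 1.52, "all resolutions": 1.44}

for res, ratio in gtx690_vs_680.items():
    share = titan_vs_680 / ratio
    print(f"{res}: Titan at {share:.0%} of a GTX 690")
```

This reproduces the 83%/92%/97% figures above.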


----------



## Stay Puft

Quote:


> Originally Posted by *GoldenTiger*
> 
> LOL.... I'm trying to convince myself I really need a video card with that much power for 60hz 2560x1600 gaming.


YES.. Yes you do


----------



## GoldenTiger

Quote:


> Originally Posted by *Stay Puft*
> 
> YES.. Yes you do










I know I do... but.... I... money.... somethingsomething.... want it!

For posterity...

Leveringsstatus: Bestilt, forventet på lager d. 18-02-2013 ("Delivery status: ordered, expected in stock 18-02-2013")


----------



## Alatar

For some reason the thought of these going in i5, AIO/air cooled, low to mid range mobo systems is depressing me.

(in other words, I hope I get one)


----------



## GoldenTiger

Quote:


> Originally Posted by *Alatar*
> 
> For some reason the thought of these going in i5, AIO/air cooled, low to mid range mobo systems is depressing me.
> 
> (in other words, I hope I get one)


I have an i7, can I has one?


----------



## i7monkey

Quote:


> Originally Posted by *Stay Puft*
> 
> Titan listed on a Denmark site
> 
> http://www.proshop.dk/Grafikkort/ASUS-GeForce-GTX-Titan-6GB-GDDR5-2394804.html


That website says (after Google translate):
Quote:


> Ordered, expected on stock d 02.18.2013


Hmm....Feb 18 then? Hope it's true.


----------



## Stay Puft

Quote:


> Originally Posted by *Alatar*
> 
> For some reason the thought of these going in i5, AIO/air cooled, low to mid range mobo systems is depressing me.
> 
> (in other words, I hope I get one
> 
> 
> 
> 
> 
> 
> 
> )


I doubt people with mid-range rigs will get one. Either the performance-minded will buy them, or they'll be snapped up to be flipped on eBay for $2K. I just want one. An Asus one, please


----------



## driftingforlife

That chart is rubbish. It makes no sense to make the memory slower on the card that needs it more.


----------



## Alatar

Quote:


> Originally Posted by *GoldenTiger*
> 
> I have an i7, can I has one?


Seeing as you're not competing for the same stock as I am, go for it








Quote:


> Originally Posted by *Stay Puft*
> 
> I doubt people with mid range rigs will get one. Either the performance minded will or they'll be bought to be sold on ebay for 2K. I just want one. An asus one please


I dunno, I've seen a lot of people say they want one.

I'm hoping I'll be able to grab one on launch day (assuming any stores here get stock). And I don't care about the brand; the cards are all reference anyway, and the retailer offers a full warranty here.


----------



## i7monkey

Guys, is my rig good enough to run this? i7 920 @ 4.2Ghz.


----------



## n00byn4t3r

No, a 920 will probably bottleneck a high-end GPU nowadays.


----------



## driftingforlife

You really want Ivy or SB-E.


----------



## i7monkey

Quote:


> Originally Posted by *n00byn4t3r*
> 
> No, a 920 will probably bottleneck a high end GPU now a days.


Quote:


> Originally Posted by *driftingforlife*
> 
> You want IVY or SB-E really.


----------



## Stay Puft

Quote:


> Originally Posted by *i7monkey*
> 
> Guys, is my rig good enough to run this? i7 920 @ 4.2Ghz.


You'd have a massive CPU bottleneck. You want a high-clocked 3770K or 3930K for this card.


----------



## Cakewalk_S

oh boy...this year might be great for graphics cards....


----------



## iTurn

Quote:


> Originally Posted by *i7monkey*


It's a limited release... get the GPU and upgrade the processor later.


----------



## Alatar

Quote:


> Originally Posted by *i7monkey*
> 
> Guys, is my rig good enough to run this? i7 920 @ 4.2Ghz.


You'd most likely get higher fps in the majority of situations with SB-E or IB, but the difference shouldn't be huge. And it will still be faster than any other single-GPU setup...


----------



## striderstone

I just invested in the ASRock Extreme11 with a 3970X; for socket 2011 chips, was this the best choice for running three of these Titans? I'll most likely be running a dual loop and OCing the processor pretty high. I've done minor overclocking on air and AIO watercooling in the past, but nothing up to 5GHz. I just need to make sure I won't get any disgusting bottlenecks like I have now.


----------



## driftingforlife

SB-E is best for 2+ cards.


----------



## Alatar

Quote:


> Originally Posted by *striderstone*
> 
> I just invested in the Asrock extreme 11 with a 3970X, as for 2011 socket chips was this the best choice to be able to run 3 of these titans? I will most likely be running a double loop and OCing the processor to a pretty decently high level. I have done minor air and AIO watercooling overclocking in the past but nothing to the 5ghz before. I just need to make sure that I won't get any disgusting bottlenecks like I have now


Rampage IV Extreme for proper overclocking on LGA2011, and custom watercooling (or better) for 5GHz+


----------



## freitz

Quote:


> Originally Posted by *striderstone*
> 
> I just invested in the Asrock extreme 11 with a 3970X, as for 2011 socket chips was this the best choice to be able to run 3 of these titans? I will most likely be running a double loop and OCing the processor to a pretty decently high level. I have done minor air and AIO watercooling overclocking in the past but nothing to the 5ghz before. I just need to make sure that I won't get any disgusting bottlenecks like I have now


I'd say get a RIVE and save on the CPU: just get a 3930K and overclock it... plenty of rad space, and a roomy case helps too.


----------



## Stay Puft

Quote:


> Originally Posted by *striderstone*
> 
> I just invested in the Asrock extreme 11 with a 3970X, as for 2011 socket chips was this the best choice to be able to run 3 of these titans? I will most likely be running a double loop and OCing the processor to a pretty decently high level. I have done minor air and AIO watercooling overclocking in the past but nothing to the 5ghz before. I just need to make sure that I won't get any disgusting bottlenecks like I have now


I just have to ask. Why would you buy a 3970x when the 3930K does the same job and is 600 dollars cheaper?


----------



## brasco

So based on the Danish listing, that'd be ~£900 (~US$1400) here in the UK. If it's the CUDA beast it promises to be, I'm in.


----------



## driftingforlife

If it's that much, that's a whole month's pay for me


----------



## brasco

Sell something! It's a big whack but I can recoup a lot in direct time savings _IF_ it's close to the Teslas' terafloppage








So really I need to see benches before jumping in, hopefully won't mean they all get snapped up...


----------



## Cloudfire777

Quote:


> Originally Posted by *Stay Puft*
> 
> Titan listed on a Denmark site
> 
> http://www.proshop.dk/Grafikkort/ASUS-GeForce-GTX-Titan-6GB-GDDR5-2394804.html
> 
> 6GB also confirmed
> 
> 
> 
> 
> 
> 
> 
> What is that like 900 dollars USD?


EVGA GTX 690 from ProShop: 7416 DKK/$1348
EVGA GTX 690 from Newegg: $1000
Ratio from European price to American price: 0.74.

Geforce Titan from Proshop: 7226 DKK/$1312
*Geforce Titan in US: $1312x0.74 = $973 = $999*
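Cloudfire777's conversion can be written out as below; the GTX 690 price pair is the reference point, and rounding the result up to a $999 price tier is the post's own guess:

```python
# Scale a European listing down to an expected US street price using a
# known EU/US pair for a comparable card (GTX 690 figures from the post).
def us_price_estimate(eu_price_usd, ref_eu_usd=1348, ref_us_usd=1000):
    ratio = ref_us_usd / ref_eu_usd   # ~0.74 for the GTX 690 pair
    return eu_price_usd * ratio

print(round(us_price_estimate(1312)))  # ~973, suggesting a $999 MSRP
```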


----------



## GoldenTiger

Quote:


> Originally Posted by *Cloudfire777*
> 
> EVGA GTX 690 from ProShop: 7416 DKK/$1348
> EVGA GTX 690 from Newegg: $1000
> Ratio from European price to American price: 0.74.
> 
> Geforce Titan from Proshop: 7226 DKK/$1312
> *Geforce Titan in US: $1312x0.74 = $973 = $999*


*And you're using the post-VAT value, so it's actually LESS than that. The actual price is 5,820kr, *NOT* 72xx kr*


----------



## PatrickCrowely

Quote:


> Originally Posted by *Cloudfire777*
> 
> *Geforce Titan in US: $1312x0.74 = $973 = $999*


I believe the Titan will be $1,099.99, just as the 690 was at first, or it will sell at $100 over MSRP. I was able to get my 900D cheaper by buying on Amazon, but Newegg has already marked the case up...


----------



## maarten12100

Quote:


> Originally Posted by *Stay Puft*
> 
> Titan listed on a Denmark site
> 
> http://www.proshop.dk/Grafikkort/ASUS-GeForce-GTX-Titan-6GB-GDDR5-2394804.html
> 
> 6GB also confirmed
> 
> 
> 
> 
> 
> 
> 
> What is that like 900 dollars USD?
> 
> There is also this


It has been removed just now, but 18 Feb is pretty soon; better than expected


----------



----------



## Cloudfire777

Quote:


> Originally Posted by *GoldenTiger*
> 
> *And you're using the post-VAT value, so it's actually LESS than that. The actual price is 5,820kr, *NOT* 72xx kr*


Now let's be real here. Americans pay tax on their products too. Titan obviously won't cost $799.
Quote:


> Originally Posted by *PatrickCrowely*
> 
> I believe the Titan will be $1,099.99, just as the 690 was @ first or it will be a $100 over MSRP. I was able to get my 900D cheaper by buying on Amazon, but Newegg has already marked the case up...


Yeah sounds reasonable. Around $999-$1099 at launch I guess


----------



## GoldenTiger

Quote:


> Originally Posted by *Cloudfire777*
> 
> Now lets be real here. American`s pay tax too on their products. Titan obv won`t cost $799
> Yeah sounds reasonable. Around $999-$1099 at launch I guess


You can't compare prices that have a 25 percent tax built in to ones with no tax (or 5-7 percent) added at checkout. Use pre-tax values to compare.
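A fairer comparison strips the VAT first. A minimal sketch, assuming Denmark's 25% rate and the 7,226 DKK listing figure from earlier in the thread:

```python
# Remove a VAT that is baked into a listed price before comparing it
# with US pre-tax prices (US sales tax is added at checkout instead).
def ex_vat(price, vat_rate=0.25):
    return price / (1 + vat_rate)

print(round(ex_vat(7226)))  # ~5781 DKK, close to the 5,820 kr quoted
```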


----------



## Cloudfire777

Quote:


> Originally Posted by *Stay Puft*
> 
> There is also this


Several things here seem a bit off:

Die size of the GTX 780Ti = 550mm²? GK110, aka the Tesla chip, is reported by GPU-Z at 502mm².

TDP of the GTX 770 and 760Ti is the same, yet the 770 has more CUDA cores. TDP of the 780Ti is lower than the GTX 780's, yet there is a huge difference in transistor count between the two.

And why does the GTX 780 have higher memory bandwidth than the 780Ti? I can see it follows from the memory bus and the 780's higher-clocked GDDR, but why clock the memory higher on the card with far fewer CUDA cores to use that bandwidth?


----------



## Cloudfire777

Proshop just removed the product from their website


----------



## maarten12100

Quote:


> Originally Posted by *Stay Puft*
> 
> I just have to ask. Why would you buy a 3970x when the 3930K does the same job and is 600 dollars cheaper?


Why would you buy a 3930K when an ES can do the same job faster for 150 euros? There's no sense in those procs. However, the 8-core/16-thread, 20MB, 2.3GHz 32nm part can't OC, even though it is otherwise far superior.


----------



## WorldExclusive

Quote:


> Originally Posted by *Cloudfire777*
> 
> Several things here that seems a bit off:
> 
> Die size of GTX 780Ti = 550mm^2? GK110 aka Tesla, GPU-Z report 502m^2.
> 
> TDP og GTX 770 and 760TI is the same, yet the 770 have more cuda cores.
> TDP of 780Ti is lower than GTX 780, and there is a huge difference in transistor count between the two.
> 
> Why does GTX 780 have higher memory bandwidth than 780Ti? I can see that from the memory bus and that the 780 GDDR is higher clocked, but why is it higher clocked when it have a lot less cuda cores to use that bandwidth?


That chart is way off, especially on the TDP values. GK114's TDP will be lower than 200W.

Also, the GTX 700 series will release in June 2013 at the earliest. GTX 780 Ti? Gimme a break. Looks like someone is trying too hard.


----------



## driftingforlife

Quote:


> Originally Posted by *maarten12100*
> 
> Why would you buy a 3930K if a ES can do the same job faster for 150 Euro there is no sense in having those procs.
> However the 8/16 20m 2.3Ghz 32nm processor can't oc is is far superior.


Because ESes aren't supposed to be sold, they come with no warranty, and some can't clock for crap.


----------



## maarten12100

Quote:


> Originally Posted by *driftingforlife*
> 
> Because ESs should not be sold and no warranty and some can't clock for crap.


They're stable enough, and you can buy four for the price of one retail chip; the yearly failure rate is 10% or less. They're not meant to be clocked anyway: server boards can't even clock them, plus they're locked. But yeah, they shouldn't be sold; I just like the fact that they are.


----------



## brasco

Quote:


> Originally Posted by *GoldenTiger*
> 
> And you're using the post-VAT value, so it's actually LESS than that. The actual price is 5,820kr, *NOT* 72xx kr


Didn't realise that; that's even better then, about 100 quid cheaper. Of course the price might not be final, and the site might just be driving clicks its way; we've had no announcement from Nvidia about it, so who knows.


----------



## Votkrath

Through the Swedish price-matching site I found another store listing it, at a higher price than Proshop: some Finland-based computer store that sells to Sweden.

Proshop price in SEK: 8,399
Multitronic price in SEK: 8,709


----------



## striderstone

Quote:


> Originally Posted by *Alatar*
> 
> rampage IV extreme for proper overclocking on LGA2011 and custom watercooling (or better) for 5ghz+


Quote:


> Originally Posted by *Stay Puft*
> 
> I just have to ask. Why would you buy a 3970x when the 3930K does the same job and is 600 dollars cheaper?


I picked up the Extreme11 mainly because I wanted the RAID controller (I have 8 Vertex 4s in RAID 0) and I wanted to keep the option of running 4-way SLI. I've also heard recently that the Extreme11 overclocks very nicely, like the RIVE if not better; at least that's what I've seen from the results people have posted on the forums.

This is why I asked, because I don't know these kinds of things. I know almost everyone was getting the 3930K because it overclocked really nicely, but I haven't looked much into it. I had the money to spend on an Extreme Edition processor, and I remember reading that EE processors could be stable at higher voltage (overclock further) since they came from the center of the wafer. Is that not the case anymore? I will definitely be doing custom watercooling, most likely a dual loop. I'd also think the extra 3MB of L3 cache would help prevent bottlenecks if I get to run four Titans. I use this machine for a lot of streaming and video editing for my YouTube channel, so I'm trying to get the best of both worlds. Is any of this relevant? I'd be 100% happy saving 500 bucks; I just need to make sure it can run everything I want it to.


----------



## Daredevil 720

Wow, that was a big price jump there. Going from $899 to $1300-1400 is quite the leap.

I think one is better off buying three 680s instead of this (unfortunately) overpriced beast.


----------



## Votkrath

Quote:


> Originally Posted by *Daredevil 720*
> 
> Wow, that was a big price jump there. To go from 899$ to 1300-1400 is quite the leap.
> 
> I think one is better off buying 3 680s instead of this (unfortunately) overpriced beast.


Do note that hardware tend to be more expensive in Europe than in the US, but since you live in Europe yourself, I guess you are affected.


----------



## Avonosac

Quote:


> Originally Posted by *Daredevil 720*
> 
> Wow, that was a big price jump there. To go from 899$ to 1300-1400 is quite the leap.
> 
> I think one is better off buying 3 680s instead of this (unfortunately) overpriced beast.


Problem with that is you can't put 3 680s in SLI with 3 680s.


----------



## Daredevil 720

Quote:


> Originally Posted by *Votkrath*
> 
> Do note that hardware tend to be more expensive in Europe than in the US, but since you live in Europe yourself, I guess you are affected.


Yeah, I was a bit off there. It'll probably be around €900-1000 in Europe and $900-1000 in the US.

I've always hated how things are more expensive in Europe, but in reality I guess it all depends on wages. If Americans earn the same number of dollars as we earn euros, then it's fair. Then again, I don't know the average wage in America.


----------



## Stay Puft

Quote:


> Originally Posted by *striderstone*
> 
> This is why I asked, because I don't know these kinds of things. I know that almost everyone was getting the 3930k because it overclocked really nicely, but I also haven't looked much into it. I had the money to spend on an extreme edition processor and I remember when I looked into them in the past that the EE processors can be stable with higher voltage (overclocked more) since it came from the center part of the wafer. Is that not the case anymore? I will be doing some custom watercooling for sure, most likely a double loop. I would also think that the extra 3mb of L3 cache would help prevent bottlenecking if I got to run 4 titans. I am using this machine for a lot of streaming and video editing for my Youtube channel, so I am trying to get the best of both worlds. Is any of this relevant ? I would be 100% happy saving 500 bucks, I just need to make sure that it would be able to run everything I want it to.


The 3930K, 3960X, and 3970X are all going to max out about the same on air and water. LN2 would probably yield slightly better results with the Extreme. It's your money, so get whatever you want.


----------



## dph314

You guys really think the core clock will be that low? 732MHz? I remember the same being said about the 680 prior to launch, and we all know how that turned out...


----------



## Votkrath

Hmm... now they've removed it from the online store I mentioned too. Looks like they're reading this forum.


----------



## guinner16

Quote:


> Originally Posted by *freitz*
> 
> So NV. Whats the deal where is my press release??


I have said before that I'm pretty new to PCs, but doesn't it seem weird that a product is rumored to release in 11 days and there hasn't been a word from NVIDIA? My guess is this will be a March release and we'll hear something in 1-2 weeks. My Vegas bet stays the same: 6GB of VRAM, March release, voltage locked, 85% of a 690's performance with the ability to overclock to near parity with a 690, and an $899 USD price tag (healthcare not included... LOL).


----------



## Nocturin

Quote:


> Originally Posted by *guinner16*
> 
> I have stated before that I am pretty new to PC's, but doesn't it seem weird a product is rumored to release in 11 days, and there has not been a word from NVIDIA. My guess is this will be a march release and we will hear something in 1-2 weeks. My bet with vegas still stays with 6gb vram, March release, voltage locked, 85% performance compared to a 690, with the ability to overclock near equal to a 690, and an $899 USD price tag (healthcare not inlcuded...LOL).


It's normal not to hear anything concrete, and everything is a rumor until it's being sold


----------



## guinner16

Alright. Since we have zero facts to go on now, how about helping me with my possible Titan build? As stated earlier, this is my first build ever. I will be gaming at 2560x1440 IPS 120Hz. Obviously with the 120Hz monitor I would like to stay at 120 fps whenever I can. I will list my previously planned build (670 SLI) and just insert Titan in there. My main concern is if I ever went with Titans in SLI. I know the PSU will probably not be enough, but I don't have a problem upgrading that. The big question is how my motherboard and CPU would handle Titans in SLI; I would imagine I would have a bottleneck problem. Also, would two reference Titans stay cool on just air? Selling the PSU and buying a better one is not a big deal compared to upgrading a motherboard and CPU. Build date is March 2013.

PCPartPicker part list: http://pcpartpicker.com/p/BCH3 Price breakdown by merchant: http://pcpartpicker.com/p/BCH3/by_merchant/ Benchmarks: http://pcpartpicker.com/p/BCH3/benchmarks/ CPU:

Intel Core i7-3770K 3.5GHz Quad-Core Processor ($229.99 @ Microcenter)
CPU Cooler: Noctua NH-D14 65.0 CFM CPU Cooler ($80.76 @ Amazon)
Motherboard: Asus Maximus V Formula EATX LGA1155 Motherboard ($269.99 @ Newegg)
Memory: Corsair Vengeance LP 16GB (2 x 8GB) DDR3-1600 Memory ($89.99 @ Amazon)
Storage: Seagate Barracuda 2TB 3.5" 7200RPM Internal Hard Drive ($99.98 @ Outlet PC)
Storage: Samsung 840 Pro Series 256GB 2.5" Solid State Disk ($226.99 @ Amazon)
Video Card: EVGA GeForce GTX 670 4GB Video Card (2-Way SLI) ($433.98 @ Newegg)
Video Card: EVGA GeForce GTX 670 4GB Video Card (2-Way SLI) ($433.98 @ Newegg)
Case: NZXT Switch 810 (Black) ATX Full Tower Case ($165.98 @ Outlet PC)
Power Supply: Corsair Professional Gold 850W 80 PLUS Gold Certified ATX12V / EPS12V Power Supply ($150.49 @ Newegg)
Optical Drive: Lite-On iHAS124-04 DVD/CD Writer ($18.98 @ Outlet PC)
Operating System: Microsoft Windows 7 Home Premium SP1 (OEM) (64-bit) ($89.98 @ Outlet PC)
Other: Catleap 2b OC Edition Pixel Perfect ($817.00)

Total: $3108.09 (Prices include shipping, taxes, and discounts when available.) (Generated by PCPartPicker 2013-02-07 11:23 EST-0500)


----------



## freitz

Spoiler: Warning: Spoiler!






Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *guinner16*
> 
> Alright. Since we have zero facts to go on now, how about helping me with my possible Titan buiild. As stated earlier this is my first build ever. I will be gaming at 2560X1440 IPS 120hz. Obviously with the 120HZ monitor I would like to stay at 120 fps whenever I can. I will list my previously planned build (670SLI), and just insert Titan in there. My main concerns is if I ever went with Titans in SLI. Now I know the PSU will probably not be enough but I dont have a problem upgrading that. The big questions is how would my motherboard and CPU hand Titan's in SLI. I would imagine I would have a bottleneck problem. Also would 2 reference titans stay cool with just air. Selling the PSU and buying a better one is not a big deal compared to upgrading a motherboard and cpu. Build date is March 2013
> 
> PCPartPicker part list: http://pcpartpicker.com/p/BCH3 Price breakdown by merchant: http://pcpartpicker.com/p/BCH3/by_merchant/ Benchmarks: http://pcpartpicker.com/p/BCH3/benchmarks/ CPU:
> 
> Intel Core i7-3770K 3.5GHz Quad-Core Processor ($229.99 @ Microcenter)
> CPU Cooler: Noctua NH-D14 65.0 CFM CPU Cooler ($80.76 @ Amazon)
> Motherboard: Asus Maximus V Formula EATX LGA1155 Motherboard ($269.99 @ Newegg)
> Memory: Corsair Vengeance LP 16GB (2 x 8GB) DDR3-1600 Memory ($89.99 @ Amazon)
> Storage: Seagate Barracuda 2TB 3.5" 7200RPM Internal Hard Drive ($99.98 @ Outlet PC)
> Storage: Samsung 840 Pro Series 256GB 2.5" Solid State Disk ($226.99 @ Amazon)
> Video Card: EVGA GeForce GTX 670 4GB Video Card (2-Way SLI) ($433.98 @ Newegg)
> Video Card: EVGA GeForce GTX 670 4GB Video Card (2-Way SLI) ($433.98 @ Newegg)
> Case: NZXT Switch 810 (Black) ATX Full Tower Case ($165.98 @ Outlet PC)
> Power Supply: Corsair Professional Gold 850W 80 PLUS Gold Certified ATX12V / EPS12V Power Supply ($150.49 @ Newegg)
> Optical Drive: Lite-On iHAS124-04 DVD/CD Writer ($18.98 @ Outlet PC)
> Operating System: Microsoft Windows 7 Home Premium SP1 (OEM) (64-bit) ($89.98 @ Outlet PC)
> Other: Catleap 2b OC Edition Pixel Perfect ($817.00)
> 
> Total: $3108.09 (Prices include shipping, taxes, and discounts when available.) (Generated by PCPartPicker 2013-02-07 11:23 EST-0500)









I'm not completely positive, but I think 2x 670 4GB will not max out new games like Crysis 3 at 120Hz; you might get 60 fps. You would probably need 2x Titans. With that said, as far as I'm aware the Catleap 2B is unavailable right now, and the only other 120Hz 1440p monitors are the Overlord ones, which are also out of stock. Everything else looks good, good use of parts. I would go for the Samsung 30nm "miracle" RAM over the Vengeance.


----------



## guinner16

Quote:


> Originally Posted by *freitz*
> 
> 
> Not completely positive but I think 2x 670 4gb will not max out new games like Crysis 3 at 120hz. You might get 60 fps. You would probably need 2x titans. With that said as far as Im aware the Catleap 2b are unavailable right now. Only thing that is a 120hz 1440p is the overlord ones which are also out of stock. Everything seems to be good, good use of parts I would go for the Samsung 30nm mircle ram over the vengeance.


I know in games like Crysis and Far Cry I will get somewhere around 60, but Battlefield and others will be close. With the higher-res monitor I don't plan on cranking AA. There are some 2Bs available right now, just not the pixel-perfect ones. However, there may be an order coming in the next 2-3 months; if that is the case I will just hook up an existing crap monitor for now. I was also looking at the Asus PB278Q as an alternative if the 120Hz Catleap wasn't available. I just wanted to see if anything else would need to change besides the PSU to run Titans in SLI.


----------



## zulk

The chart does not make sense for the GTX 780 and the 770.

Also, wasn't there a confirmation that the new card will not be a GTX but rather a GeForce Titan :s. Anyhow, I'm really looking forward to this ;o


----------



## Mals

Quote:


> Originally Posted by *Avonosac*
> 
> Yea, that is one of the biggest contributors to the size of the console gaming market, its a fixed, and far lower entry cost.
> You won't likely find too many 690's ever bottlenecked by CPU, simply because the people who buy a $1,000 GPU will not settle for a CPU which will bottleneck it.


I don't know if any CPU can keep up with certain games right now. I've posted on other threads about the minimum FPS increases going from 4 cores to 8 (with HT), etc.

I know for a fact games like Planetside 2 are severely CPU limited, but my GPUs (dual 670s) handle it like a cakewalk.
I currently have a 3770K and I don't think a 3930K would solve that.

This could have more to do with poor game optimization, but to tie this back to the Titan thread...

We might never be able to run most games perfectly at 120Hz on 1440p monitors until CPU technology and game optimization catch up.

I run BF3 at 100+ FPS always on a 1080p 120Hz monitor.

I currently have zero games (I don't have FC3 or Max Payne installed yet) that run under 100 frames unless it is a CPU limitation. Graphically, my cards are capable of handling anything. Makes me think a Titan will be more than enough (especially SLI'd). Let's hope CPU and game optimization catch up.
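The CPU-limited behavior described above can be illustrated with a toy pipeline model (my own sketch, not anything measured in the thread): each frame needs both a CPU pass and a GPU pass, so the slower of the two stages caps the frame rate, and a faster GPU changes nothing when the CPU is the longer stage.

```python
# Toy model of why a faster GPU can't fix a CPU-limited game:
# the frame rate is capped by whichever stage takes longer per frame.
def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

print(fps(12.0, 8.0))   # CPU-limited: ~83 fps regardless of the GPU
print(fps(12.0, 4.0))   # doubling GPU speed changes nothing here
print(fps(7.0, 8.0))    # GPU-limited: a faster card would help
```

The millisecond figures are made up for illustration; real games overlap CPU and GPU work, but the "slowest stage wins" intuition still holds.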


----------



## h2spartan

So I may be a little behind in following the news of the supposed "Titan"... is it real or still just rumor? Is it really coming soon if it's legit? I really do hope this is past speculation, as I should have the funds by the time it's released mid-February (if that's really the case).


----------



## PostalTwinkie

Quote:


> Originally Posted by *h2spartan*
> 
> So I may be a little behind in following the news of the supposed "Titan".....is it real or just rumor still? Is it really coming soon if is legit? I really do hope this is past speculation as I should have the funds by the time its released mid February(if that's really the case)


It is past speculation, but the exact specifications have not been confirmed by Nvidia yet....

This is what is known to be true....


It does exist.
It is based on GK110.
It is called the GeForce Titan.
It was listed online today by a retailer as the "GeForce GTX Titan" (quickly pulled), and the page did list specifications.
New information I read today is that a "source" puts the release date between February 24th and 26th, with an $899 USD MSRP.

The other tidbit still being argued is whether an SMX unit is disabled: as we get closer, the debate over whether it has the full 15 SMX units or only 14 is heating up.


----------



## hifibuff

Quick question : will my RIVE and 3960X @ 4.4 bottleneck 3xTitan at 2560x1600 ? I haven't tried more than 4.4ghz for the time being but I doubt I can squeeze much more out of a Corsair H80...


----------



## sugarhell

Quote:


> Originally Posted by *hifibuff*
> 
> Quick question : will my RIVE and 3960X @ 4.4 bottleneck 3xTitan at 2560x1600 ? I haven't tried more than 4.4ghz for the time being but I doubt I can squeeze much more out of a Corsair H80...


It's a limited edition, so you probably won't be able to get three. Like the GTX 680 at launch, it will probably be limited to one per buyer. One Titan is enough for 1600p. You probably won't get a bottleneck, but I'm not sure.


----------



## tsm106

Quote:


> Originally Posted by *hifibuff*
> 
> Quick question : will my RIVE and 3960X @ 4.4 bottleneck 3xTitan at 2560x1600 ? I haven't tried more than 4.4ghz for the time being but I doubt I can squeeze much more out of a Corsair H80...


Most definitely.


----------



## PostalTwinkie

Quote:


> Originally Posted by *hifibuff*
> 
> Quick question : will my RIVE and 3960X @ 4.4 bottleneck 3xTitan at 2560x1600 ? I haven't tried more than 4.4ghz for the time being but I doubt I can squeeze much more out of a Corsair H80...


Eh, no. I would expect (and hope) that even a 2500K with a decent clock on it wouldn't bottleneck a Titan. You can run 680 SLI on a 2500K without a bottleneck....


----------



## hifibuff

Hmmm, I wouldn't be interested in getting a single Titan... I would barely get the same performance level as I have now with my two 680's.
The aim for me is to get three titans and be settled for 2-3 years. If indeed I cannot snatch more than one upon release, I'll wait for the 780's.


----------



## Cloudfire777

Quote:


> Originally Posted by *GoldenTiger*
> 
> *And you're using the post-VAT value, so it's actually LESS than that. The actual price is 5,820kr, *NOT* 72xx kr*


Quote:


> Originally Posted by *Cloudfire777*
> 
> EVGA GTX 690 from ProShop: 7416 DKK/$1348
> EVGA GTX 690 from Newegg: $1000
> Ratio from European price to American price: 0.74.
> 
> Geforce Titan from Proshop: 7226 DKK/$1312
> *Geforce Titan in US: $1312x0.74 = $973 = $999*


Quote:


> Originally Posted by *GoldenTiger*
> 
> You can't compare prices with a 25 percent tax to ones with none built in and sometimes 5-7 percenttax . Use non taxed values to compare.


I compared the Danish price vs. the American price. Obviously all tax has been taken care of there: 5-7% for Newegg, and 20% for ProShop. I did a rough calculation and found how much less US customers pay for the same product. It goes without saying that this is a rough estimate, since some products are cheaper or more expensive and the ratio changes.

If I were to follow your line of reasoning, I would be comparing a pre-tax price in Denmark vs. a customer price in the US. That ratio would be 100% wrong.
But if you meant using the non-VAT price and adding 7%, it's roughly the same, just a little less accurate, because the pre-VAT price in Denmark will most likely be higher than the pre-tax price in the US: hardware IS more expensive in Denmark. The shops don't buy from Nvidia at the same price as the US and then trick the customers by keeping the extra profit for themselves.

5820 DKK x 1.07 = 6227 DKK/$1117

The die size of Titan is 502mm^2. The GTX 690 has 588mm^2 (2x 294mm^2). The manufacturing cost of Titan and the materials used should be less than the 690's.
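The ratio method above can be written out as a tiny sketch. The dollar figures are the ones quoted in the post; the method simply scales the Danish Titan price by the GTX 690's EU-to-US price ratio, so it is only as good as the assumption that the ratio carries over between products.

```python
# Rough US-price estimate from a Danish listing, using the post's ratio method.
# Assumption: the GTX 690's EU/US price ratio carries over to the Titan.
gtx690_dk_usd = 1348   # EVGA GTX 690 at ProShop (7416 DKK converted)
gtx690_us_usd = 1000   # EVGA GTX 690 at Newegg
ratio = gtx690_us_usd / gtx690_dk_usd          # ~0.74

titan_dk_usd = 1312    # Titan listing at ProShop (7226 DKK converted)
titan_us_est = titan_dk_usd * ratio            # ~$973
print(round(ratio, 2), round(titan_us_est))
```

A two-point estimate like this ignores retailer margins and currency swings, which is why the post rounds the result up to a $999-style price point.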


----------



## Phishy714

<--- Excited

Here's hoping they release the benchmarks before the actual product release, instead of releasing them both on the same day.


----------



## rcfc89

Quote:


> Originally Posted by *Phishy714*
> 
> <--- Excited
> 
> Here's hoping they release the benchmarks before the actual product release, instead of releasing them both on the same day.


I sure hope so. I'd hate to pick one up and find out it has weaker performance than my GTX 690. If it turns out it does, it's either two or bust for me. I'll wait for the 790.


----------



## Cloudfire777

Quote:


> Originally Posted by *dph314*
> 
> You guys really think the core clock will be that low? 732mhz? I remember the same was said about the 680 prior to launch, and we all know how that turned out...


732MHz is for the Tesla. That picture from EgyptHardware is flawed and contains several errors.

I think the Nvidia Titan will have turbo boost just like every GTX card. 732MHz is most likely the base clock, and turbo boost goes to...? I don't know.


----------



## furyn9

I can't wait for the reviews , that's gonna be the key factor for me


----------



## Cloudfire777

Here is more news on Titan








Quote:


> *NVIDIA GeForce Titan PCB Unveiled - 6 GB Memory, GK110 GPU, 8 Phase Supply*


Quote:


> Since our source was still under NDA, he had to blur out the key mappings of the GPU which include part labels. Still we can see much of the GPU itself so have a look below at the beauty that is the GeForce Titan. The PCB is custom designed and reminds me of the GTX 680 classified from EVGA except that the GeForce Titan is based on the GK110 core architecture that features a massive core count of 2688 Cuda Cores. The PCB holds a total 8 phase VRM which is situated at the backside of the PCB along with ICs and MOSFETs. Additional two VRM phases are located near the SLI connectors that power the memory.


Quote:


> There's only one 4 pin fan connector on the PCB so we can say that the cooler shroud would consists of only one fan unlike dual fans found on other flagship GPU. The card itself would occupy two slots and the upper slot would be used to ventilate heat out of the GPU. Well this is all for now but expect more details soon. The GeForce Titan launches this month at a price tag just around the $1000 mark. Its listing can be found here.


wccftech.com/exclusive-nvidia-geforce-titan-pcb-unveiled-6-gb-memory-gk110-gpu-8-phase-supply/


----------



## guinner16

I have been telling myself that it will be 85% of the 690 so as not to disappoint myself. However, part of me says it may be equal to or better than a 690. The card had better perform with that high a price tag, especially since it should be cheaper to produce: a single-GPU card should cost less than a dual-GPU one. Also, it seems like these chips were just "laying around", for lack of a better term. My understanding is these were underperforming Tesla chips, so why not recoup part of that cost by selling them to PC gamers? Having said all that, I think we will see a stock card get within 10% of a 690, and overclock up to or past a 690.


----------



## guinner16

Quote:


> Originally Posted by *Cloudfire777*
> 
> 732MHz is for the Tesla. That picture from EgyptHardware is flawed and contains several errors.
> 
> I`m think that the Nvidia Titan will have turbo boost just like every GTX card. 732MHz is most likely the base clock and turbo boost goes to...? I dont know


There was already an article stating they were clocked this low to deal with sustained server loads. For PC gaming this wouldn't be needed. I can't confirm it, but I recall a site stating these cards could easily get to 900MHz or so.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Cloudfire777*
> 
> Here is more news on Titan
> 
> 
> 
> 
> 
> 
> 
> 
> 
> wccftech.com/exclusive-nvidia-geforce-titan-pcb-unveiled-6-gb-memory-gk110-gpu-8-phase-supply/


The pricing link just goes to the other article about the Titan listing from today.....

At $1000 I would pass; at $899, the US cost being tossed around, I would get it. We all have our points of justification. If I can't get it for $899 I will just buy another 7970, add them both to my water loop, overclock the hell out of them, and be happy.
Quote:


> Originally Posted by *Cloudfire777*
> 
> 732MHz is for the Tesla. That picture from EgyptHardware is flawed and contains several errors.
> 
> I`m think that the Nvidia Titan will have turbo boost just like every GTX card. 732MHz is most likely the base clock and turbo boost goes to...? I dont know


If they have one SMX unit disabled, which looks to be the case, I would expect a base clock of about 850MHz boosting to 900MHz, with the ability to maybe hit 1GHz in an OC on Titan.


----------



## Usario

Quote:


> Originally Posted by *SavantStrike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *F1ynn*
> 
> bleh,. Green color
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You know what needs to make a come back? *That golden brown of old school mobos.*
> 
> I remember Red PCBs and loved them. Now everything is black. I'd almost take a green PCB just for something different.
Click to expand...

Ew, no. I love PCBs in all kinds of colors except that nasty golden brown and that stupid shade of blue that was overused to death (at least the overuse of green has completely stopped, and now it's "retro").

I prefer my PCBs white, black, red, and purple.


----------



## maarten12100

Quote:


> Originally Posted by *Usario*
> 
> I prefer my PCBs


How I prefer my woman
Quote:


> Originally Posted by *Usario*
> 
> white, black, red, and purple.


Purple and red woman









Can't wait for a review. The price is at least within budget, if that Danish site reflects regular pricing.


----------



## freitz

Edit I was wrong


----------



## PostalTwinkie

Quote:


> Originally Posted by *freitz*
> 
> 
> 
> not sure if anyone has posted this yet but I just came across this... Has the DVI ports.


I believe that is a photo of a 670/680 next to the Tesla cards to show size and comparison.


----------



## Phishy714

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I believe that is a photo of a 670/680 next to the Tesla cards to show size and comparison.


That's the server cards put against the GTX 690 bud.


----------



## Newbie2009

Quote:


> Originally Posted by *freitz*
> 
> Edit I was wrong


1 sli connector


----------



## PostalTwinkie

Quote:


> Originally Posted by *Phishy714*
> 
> That's the server cards put against the GTX 690 bud.


Ah, yes, you are right...

The single SLI connector gives that away.


----------



## freitz

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *PostalTwinkie*
> 
> I believe that is a photo of a 670/680 next to the Tesla cards to show size and comparison.






I fixed it and said I was wrong; I guess you replied before I caught it.


----------



## Newbie2009

We want benchmarks


----------



## freitz

Quote:


> Originally Posted by *Newbie2009*
> 
> We want benchmarks


I just wanted an official announcement. I'd be happy with that.


----------



## RJT

Quote:


> Originally Posted by *freitz*
> 
> I just wanted a official announcement. I'd be happy with that.


Agreed. Enough with the rumours, suppositions and cheesy photoshop fake pictures. Until Nvidia actually announces something, anything official, I'm going to retain a healthy amount of scepticism.


----------



## alick

I wll take 4 of them please














when the price comes down lol


----------



## chronicfx

leave some for us plzzz.. Just take what u need lol


----------



## Cloudfire777

What I'm super curious about is this:
Is Titan part of a "Kepler 2.0"? I just call it that, but I mean architecture improvements.
If yes,
then how much have they managed to squeeze out of Kepler this time?

Anyone remember GTX 480 and GTX 580 (both Fermi)?


----------



## maarten12100

Quote:


> Originally Posted by *alick*
> 
> I wll take 4 of them please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> when the price comes down lol


I don't think it will as it is the ultimate high end second hand maybe in a year or 2 but why would you buy it then as the high end then will have the same power.


----------



## RJT

Quote:


> Originally Posted by *maarten12100*
> 
> I don't think it will as it is the ultimate high end second hand maybe in a year or 2 but why would you buy it then as the high end then will have the same power.


Dude. Punctuation. Google it.


----------



## RJT

Quote:


> Originally Posted by *chronicfx*
> 
> leave some for us plzzz.. Just take what u need lol


With a limited production run for this special card they will probably set an ordering limit per person of 1-2 cards. If someone could tri or quad sli/cf these babies, they wouldn't need to buy new cards for 3-5 years...not good for business. LoL


----------



## hifibuff

Also, provided the GK110 is only in the Titan card and the GK114 is used for the whole 7xx series, wouldn't there be a risk that drivers aren't as optimized and "durable" for the GK110 as they would be for the GK114? Maybe it doesn't work this way, but I'm just making sure my potential 3000 euros' worth of graphics cards isn't made obsolete from the drivers' side of things too quickly









@RJT, precisely my point ! I'm willing to shell out big bucks right now for three of these bad babies in order to be settled for the foreseeable future


----------



## Cloudfire777

^ Meh, "Kepler is Kepler"


----------



## RJT

Quote:


> Originally Posted by *hifibuff*
> 
> Also, provided the GK110 is only in the Titan card and that the GK114 is used for the whole 7XX series, wouldn't there be risk that drivers aren't as optimized and "durable" for the GK110 as they would for the GK114 ? Maybe it doesn't work this way, but just making sure my potential 3000 euro worth of graphics card is not obsolete from the drivers' side of things too quickly


That's a helluva good question, but I'd assume they would have good driver support given that even 4-5 year old cards are still supported by the new drivers released for later generation cards. Hmmmm.


----------



## dph314

Quote:


> Originally Posted by *RJT*
> 
> Quote:
> 
> 
> 
> Originally Posted by *chronicfx*
> 
> leave some for us plzzz.. Just take what u need lol
> 
> 
> 
> With a limited production run for this special card they will probably set an ordering limit per person of 1-2 cards. If someone could tri or quad sli/cf these babies, they wouldn't need to buy new cards for 3-5 years...not good for business. LoL
Click to expand...

It may be a one or two limit, yep. But I think Maxwell will be a decent rival for Titan, so I don't think Nvidia would lose 3-5 years' revenue from most Titan owners. The audience Titan is marketed towards is enthusiasts who have too much to spend (or in my case, don't, but will anyways) and want the latest and greatest. If someone buys Titan and is completely satisfied, but Maxwell comes out with even minimal performance gains, many will still upgrade, I think.


----------



## PostalTwinkie

Quote:


> Originally Posted by *dph314*
> 
> It may be one or two limit, yep. But I think Maxwell will be a decent rival for Titan, so I don't think Nvidia would lose 3-5 years' revenue from most Titan owners. The audience Titan is marketed towards are enthusiasts who have too much to spend (or in my case, don't, but will anyways
> 
> 
> 
> 
> 
> 
> 
> ) and want the latest and greatest. If someone buys Titan and is completely satisfied, but Maxwell comes out with even only minimal performance gains, many will still upgrade I think.


Hmm... hard to say, I guess.

Maxwell is supposed to be the next huge jump in performance; there are rumors of double GK104......

I would be disappointed if Maxwell, which comes after the 700 series, didn't give Titan a run for its money.


----------



## nyk20z3

When Nvidia releases it then I will believe it although I would take 2 770 or 780's over it anyway.


----------



## Bitech

Quote:


> Originally Posted by *nyk20z3*
> 
> When *Invidia* releases it then I will believe it although I would take 2 770 or 780's over it anyway.


Nvidia outsources to India?


----------



## nyk20z3

Sry I am mobile and misspelled it.


----------



## Arni90

Quote:


> Originally Posted by *Cloudfire777*
> 
> What I`m super curious about is
> Is Titan part of the Kepler 2.0? I just call it that, but its architecture improvements.
> If yes,
> Then how much have they managed to squeeze out of Kepler this time?
> 
> Anyone remember GTX 480 and GTX 580 (both Fermi)?


The difference between the 480 and 580 is that the 580 is clocked 10% higher on the core and slightly less on the memory, has 5% more shaders and TMUs, and has improved texture filtering capabilities and Z-culling. The performance difference between the two adds up from the improvements in the core architecture, but there's nothing to suggest Nvidia implemented any raster-performance improvements over the first iteration of Kepler.


----------



## Phishy714

[H]ardOcp Editor in Chief just threw in a little bit of a bummer:

The OP of that thread mentioned some specs and said that cards were out being reviewed at the moment. Then Kyle said the OP was wrong.
Quote:


> Review samples are not out to reviewers.


http://hardforum.com/showpost.php?p=1039592395&postcount=52

He could very well be wrong, or trolling, but I don't think he would purposely mislead his audience; he is the editor-in-chief of a highly respected tech news site. What do you guys think of this?

other quotes:
Quote:


> So, less performance than GTX 680 SLI for $900?


Quote:


> I have not seen specs presented to me by NVIDIA, so I do not know those for certain.


Quote:


> Given the time lines I have heard rumored on the launch date, again NOT from NVIDIA, I expect this to be a soft launch. Again, these are not facts, just what I THINK is likely to happen.


----------



## freitz

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Phishy714*
> 
> [H]ardOcp Editor in Chief just threw in a little bit of a bummer:
> 
> The OP of the thread mentioned some specs and that cards were out being reviewed atm. Then Kyle mentioned that the OP was wrong.
> http://hardforum.com/showpost.php?p=1039592395&postcount=52
> 
> While, he could very well be wrong or trolling - I don't think he would purposely mislead his audience. Being an editor in chief of a highly respected tech news site, what do you guys think of this?
> 
> other quotes:






Well that would suck, all this hype for nothing.


----------



## Cloudfire777

Quote:


> Originally Posted by *freitz*
> 
> 
> Well that would suck, all this hype for nothing.


So that is HardForum's word
Quote:


> Review samples are not out to reviewers.


http://hardforum.com/showpost.php?p=1039592395&postcount=52

against Rage3D writer...
Quote:


> PC Perspective and TechReport have them in hand for testing now. Others may too, but not me.


http://www.rage3d.com/board/showpost.php?s=8a67c78f7eabea7e7a84f31c2638c375&p=1337152339&postcount=6679

So who is right?


----------



## dph314

I'm going to guess Rage3D is, because, well, I _want_ him to be


----------



## maarten12100

Quote:


> Originally Posted by *dph314*
> 
> I'm going to guess Rage3D is, because, well, I _want_ him to be


You want the card to have a 300W TDP instead of 235W!?
I agree if the clocks are made higher, but you know what I mean


----------



## Avonosac

Quote:


> Originally Posted by *maarten12100*
> 
> You want the card to have a 300w tdp instead of 235w!?
> I agree if the clocks are made higher but you know what I mean


It's his second statement he wants to be right: that some review samples HAVE already shipped, meaning the release date would be somewhat imminent.


----------



## dph314

Quote:


> Originally Posted by *Avonosac*
> 
> Quote:
> 
> 
> 
> Originally Posted by *maarten12100*
> 
> You want the card to have a 300w tdp instead of 235w!?
> I agree if the clocks are made higher but you know what I mean
> 
> 
> 
> Its his second statement he wants to be right, as in some review samples HAVE already shipped, meaning the release date would be somewhat imminent.
Click to expand...

Correct


----------



## Alatar

Quote:


> Originally Posted by *maarten12100*
> 
> You want the card to have a 300w tdp instead of 235w!?
> I agree if the clocks are made higher but you know what I mean


I want as high clocks as possible out of the box because that also means that the voltage will be as high as possible out of the box.


----------



## Avonosac

Quote:


> Originally Posted by *Alatar*
> 
> I want as high clocks as possible out of the box because that also means that the voltage will be as high as possible out of the box.


Especially if it's voltage locked


----------



## dph314

I'm just kind of prepping myself for a very limited OC. The 780 shouldn't be any competition for this Titan if it's supposed to be the first of its kind, a new "enthusiast" line. So it'll probably have killer performance out of the box, but hardly any OC room. But many reference 680s seem to be able to hit +100-150MHz even with the voltage lock, so at least that would be _something_.

Sure will miss my Lightnings







I miss the Fermi days even more.


----------



## iARDAs

I think I will stick with a 670 SLI. Instead of grabbing the Titan, I will go with a 2nd 670 and a 3rd one perhaps later on.

Very happy with single 670 at 1440p 60hz but 2nd one will help me more.

Selling the Titan in the future will be a pain in the arse here in Turkey. Selling a 670 is always easier.

Great GPU though.


----------



## evoll88

I have 670 FTW 4GB cards in SLI, so I won't upgrade until they can't keep up with a lot of games.


----------



## hifibuff

If it really is 300W per card, what kind of PSU would you recommend to run three Titans?
I also have an Asus RIVE + 3960X @ 4.4GHz, 4 HDDs, 1 SSD, 1 Blu-ray burner, and a few fans.
I currently have a Thermaltake Toughpower 1200W, but I'm wondering if it would be enough if the 300W scenario turns out to be true...
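For the three-card scenario above, a back-of-the-envelope budget suggests a 1200W unit is comfortable at the 235W figure but gets tight at the rumored 300W TDP. The CPU and peripheral draw numbers below are my own rough assumptions, not figures from the thread:

```python
# Rough PSU budget for a 3x Titan rig (all non-GPU figures are assumptions).
cpu_w  = 180   # overclocked 3960X under load (assumed)
misc_w = 60    # 4 HDDs, SSD, Blu-ray burner, fans (assumed)
psu_w  = 1200  # Thermaltake Toughpower rating

headroom = {}
for gpu_tdp in (235, 300):            # the two rumored Titan TDP figures
    total = cpu_w + misc_w + 3 * gpu_tdp
    headroom[gpu_tdp] = psu_w - total
    print(f"3x {gpu_tdp}W -> ~{total}W total, {psu_w - total}W headroom")
```

TDP is not the same as peak draw, and overclocking plus PSU efficiency at high load eat into that headroom, so the 300W case would leave little margin on a 1200W unit.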


----------



## Cloudfire777

Quote:


> Originally Posted by *maarten12100*
> 
> You want the card to have a 300w tdp instead of 235w!?
> I agree if the clocks are made higher but you know what I mean


Since the Tesla K20X with a 735MHz clock has a TDP of 235W, a 300W TDP would mean that Titan is an insane beast.

Then we would be talking around 1000MHz for the shaders, and a higher memory clock than Tesla. Who wouldn't want that...

It's still only 50W over the 7970.


----------



## maarten12100

Quote:


> Originally Posted by *Cloudfire777*
> 
> Since Tesla K20x with 735MHz as clock have a TDP of 235W, having a TDP of 300W would mean that Titan is an insane beast.
> 
> Then we would be talking around 1000MHz for the shaders, and a higher memory clock than Tesla. Who wouldn`t want that...
> 
> Its still only 50W over 7970


732MHz, but indeed.
I hope that they up the core clock and the voltage.
It would be disappointing if they just increased the memory clock.


----------



## PostalTwinkie

Quote:


> Originally Posted by *maarten12100*
> 
> You want the card to have a 300w tdp instead of 235w!?
> I agree if the clocks are made higher but you know what I mean


Those of us buying this card don't give a crap about TDP. You don't buy this kind of card for its TDP profile.
Quote:


> Originally Posted by *hifibuff*
> 
> If it really is 300w per card, what kind of PSU would you recommend to run three titans ?
> I also have an Asus RIVE + 3960X @ 4.4ghz, 4 HDDs and 1 SSD + 1 blu ray burner and a few fans.
> I currently have a thermaltake toughpower 1200w, but I'm wondering if it would be enough if the 300w scenario turns out to be true...


Uh...you are more than fine with a 1200W......the possible TDP being thrown around right now is just over a 7970.
Quote:


> Originally Posted by *Cloudfire777*
> 
> Since Tesla K20x with 735MHz as clock have a TDP of 235W, having a TDP of 300W would mean that Titan is an insane beast.
> 
> Then we would be talking around 1000MHz for the shaders, and a higher memory clock than Tesla. Who wouldn`t want that...
> 
> Its still only 50W over 7970


That is just the point. The 7970 GHz Edition is 250W; add any overclock onto that and it goes up. So knowing the K20X is 235W at factory clocks....

What kind of clocks would it be running if it was 300W?
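For the tri-Titan PSU question above, a quick budget sketch. All the component figures below are ballpark guesses, not measurements, and the per-card TDPs are the two rumored numbers floating around this thread.

```python
# Rough full-load power budget for an RIVE + OC'd 3960X + 3 Titans rig.
# Every component figure here is a ballpark estimate, not a measurement.

def system_draw(gpu_tdp_w, n_gpus=3):
    """Ballpark full-load system draw in watts."""
    cpu = 200          # heavily overclocked 3960X, rough guess
    board_ram = 60     # motherboard + RAM
    drives = 40        # 4 HDDs + 1 SSD
    misc = 30          # Blu-ray burner + fans
    return n_gpus * gpu_tdp_w + cpu + board_ram + drives + misc

for tdp in (235, 300):
    total = system_draw(tdp)
    print(f"{tdp}W per card -> ~{total}W total, "
          f"{1200 - total:+d}W vs a 1200W PSU")
```

At the 235W Tesla-like figure a 1200W unit has comfortable headroom; if the full 300W rumor were true, three cards plus an overclocked hexacore would be marginal on 1200W.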


----------



## maarten12100

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Those of us buying this card don't give a crap about TDP. You don't buy this kind of card for its TDP profile.


That was not what I was saying. I said that it would be disappointing if the extra power draw didn't result in higher voltage and clocks, as I'm a power user myself.


----------



## PostalTwinkie

Quote:


> Originally Posted by *maarten12100*
> 
> That was not what I was saying I said that it would be dissapointing if the extra usage wouldn't result in higher voltage and clocks.
> As I'm a power user myself


Aaahhhh...I got ya.

Yea, I would be a little irked if Nvidia was like "Here! Have a 300W TDP...but no extra power for you!!!"
Quote:


> Originally Posted by *Phishy714*
> 
> [H]ardOcp Editor in Chief just threw in a little bit of a bummer:
> 
> The OP of the thread mentioned some specs and that cards were out being reviewed atm. Then Kyle mentioned that the OP was wrong.
> http://hardforum.com/showpost.php?p=1039592395&postcount=52
> 
> While, he could very well be wrong or trolling - I don't think he would purposely mislead his audience. Being an editor in chief of a highly respected tech news site, what do you guys think of this?
> 
> other quotes:


My problem with this is that a couple of retailers have already listed them, but quickly pulled them, with complete model numbers that fall in line with everything we have heard so far.


----------



## AMD_Freak

but will it play D3 on Ultra settings?


----------



## maarten12100

Quote:


> Originally Posted by *AMD_Freak*
> 
> but will it play D3 on Ultra settings?


Diablo 3 isn't really that hard to max out, so yes, this beast can run it with ease.


----------



## GoldenTiger

Quote:


> Originally Posted by *maarten12100*
> 
> Diablo 3 isn't really that hard to max out so yes this beast can run it with ease.


Pretty sure it was a joke









----------



## GoldenTiger

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Aaahhhh...I got ya.
> 
> Yea, I would be a little irked if Nvidia was like "Here! Have a 300W TDP...but no extra power for you!!!"
> My problem with this is that a couple of retailers have already listed them, but quickly pulled them, with complete model numbers that fall in line with everything we have heard so far.


Same here.

And.... there is this which makes me kinda confused too... that I just got alerted to... from a major tech site.



Shows a beginning-of-April release date for the GTX 7xx cards, and Feb 18th for Titan.

There's a radeon one too which also fits with current rumors....

http://img507.imageshack.us/img507/5388/hmmmm2redacted.png


----------



## cookiesowns

Quote:


> Originally Posted by *GoldenTiger*
> 
> Same here.
> 
> And.... there is this which makes me kinda confused too... that I just got alerted to... from a major tech site.
> 
> Shows a beginning-of-April release date for the GTX 7xx cards, and Feb 18th for Titan.
> 
> There's a radeon one too which also fits with current rumors....
> 
> http://img507.imageshack.us/img507/5388/hmmmm2redacted.png


The GTX770 with 3GB of VRAM seems tempting. I wonder if they changed the bus-width, or went with a mixture of different VRAM


----------



## GoldenTiger

Quote:


> Originally Posted by *cookiesowns*
> 
> The GTX770 with 3GB of VRAM seems tempting. I wonder if they changed the bus-width, or went with a mixture of different VRAM


Both cards (GTX 770/780) show a 384-bit bus width. I have no idea if this stuff is accurate, just passing along the info....


----------



## cookiesowns

Quote:


> Originally Posted by *GoldenTiger*
> 
> Both cards (GTX 770/780) show a 384-bit bus width. I have no idea if this stuff is accurate, just passing along the info....


Ah right, totally missed the fine print. I really doubt that is accurate; considering the die size of the current GK104, I doubt they were able to fit a wider memory controller without increasing TDP or die size significantly.

That said, I still think the Titan rumor is true to an extent, just probably won't beat the GTX690.


----------



## GoldenTiger

Here is the data page for GTX 780...
Quote:


> Graphics Processor
> 
> GPU Name: GK114
> Process Size: 28 nm
> Transistors: 3,540 million
> Die Size: 294 mm²
> 
> Render Config
> 
> Shading Units: 1920
> TMUs: 160
> ROPs: 48
> SMX: 10
> Pixel Rate: 40.0 GPixel/s
> Texture Rate: 160.0 GTexel/s
> Floating-point performance: 3,840.00 GFLOPS
> 
> Graphics Features
> 
> DirectX: 11.1
> OpenGL: 4.3
> OpenCL: 1.2
> Shader Model: 5.0
> 
> Graphics Card
> 
> Released: April 01, 2013
> Bus Interface: PCIe 3.0 x16
> 
> Clock Speeds
> 
> GPU Clock: 1000 MHz
> Boost Clock: 1050 MHz
> Memory Clock: 1502 MHz
> 6008 MHz effective
> 
> Memory
> 
> Memory Size: 3072 MB
> Memory Type: GDDR5
> Memory Bus: 384 bit
> Bandwidth: 288.4 GB/s
> 
> Reference Board
> 
> Slot Width: Dual-slot
> TDP: 250 W


And yeah, I don't think Titan is going to beat a GTX 690 at this point. I guess time will tell if this database is accurate... not too long till we know.
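Whether or not the entry is real, the headline numbers in that spec sheet are at least internally consistent, which a quick check shows (the 4x effective-rate factor is standard for GDDR5; the 2-FLOPs-per-shader-per-clock factor is the usual FMA convention):

```python
# Sanity-check the leaked GTX 780 spec sheet: bandwidth and GFLOPS
# both follow arithmetically from the listed clocks and widths.

def gddr5_bandwidth_gbs(mem_clock_mhz, bus_bits):
    # GDDR5 is quad-pumped: effective data rate = 4x the memory clock
    return mem_clock_mhz * 4 * bus_bits / 8 / 1000

def sp_gflops(shaders, core_clock_mhz):
    # 1 FMA = 2 FLOPs per shader per clock
    return 2 * shaders * core_clock_mhz / 1000

print(gddr5_bandwidth_gbs(1502, 384))  # listed: 288.4 GB/s
print(sp_gflops(1920, 1000))           # listed: 3840 GFLOPS
```

Internal consistency is easy to fabricate, of course; it just means whoever filled in the database did the arithmetic.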


----------



## kyfire

Quote:


> Originally Posted by *GoldenTiger*
> 
> Same here.
> 
> And.... there is this which makes me kinda confused too... that I just got alerted to... from a major tech site.
> 
> 
> 
> Shows a beginning-of-April release date for the GTX 7xx cards, and Feb 18th for Titan.
> 
> There's a radeon one too which also fits with current rumors....
> 
> http://img507.imageshack.us/img507/5388/hmmmm2redacted.png


Nicely shopped to *not* show which "major tech site" it was


----------



## GoldenTiger

Quote:


> Originally Posted by *kyfire*
> 
> Nicely shopped to *not* show what "major tech site"


I was saving pages still and didn't want it taken down while I did so. Have at it








http://www.techpowerup.com/gpudb/
http://www.techpowerup.com/gpudb/1701/NVIDIA_GeForce_GTX_780.html (April 2013)
http://www.techpowerup.com/gpudb/1856/NVIDIA_GeForce_GTX_770.html (April 2013)
http://www.techpowerup.com/gpudb/1996/NVIDIA_GeForce_GTX_Titan.html (Feb 18 2013)

http://www.techpowerup.com/gpudb/1932/AMD_Radeon_HD_8970.html (Oct 2013)

You're welcome.


----------



## cookiesowns

Quote:


> Originally Posted by *GoldenTiger*
> 
> Here is the data page for GTX 780...
> And yeah, I don't think Titan is going to beat a GTX 690 at this point. I guess time will tell if this database is accurate... not too long till we know.


Almost certain it's fake. According to Wikipedia, the GTX 680 has the same die size and transistor count as the image above. Judging from some posts that were going around regarding Nvidia's situation with TSMC, I seriously doubt they were able to make that much of a refinement in the 28nm process.
Quote:


> Originally Posted by *GoldenTiger*
> 
> I was saving pages still and didn't want it taken down while I did so. Have at it
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpudb/
> http://www.techpowerup.com/gpudb/1701/NVIDIA_GeForce_GTX_780.html (April 2013)
> http://www.techpowerup.com/gpudb/1856/NVIDIA_GeForce_GTX_770.html (April 2013)
> http://www.techpowerup.com/gpudb/1996/NVIDIA_GeForce_GTX_Titan.html (Feb 18 2013)
> 
> http://www.techpowerup.com/gpudb/1932/AMD_Radeon_HD_8970.html (Oct 2013)
> 
> You're welcome.


Had a feeling it was techpowerup. Thanks btw!


----------



## Joneszilla

Hope that release date ends up being legit. I'm ready to buy.


----------



## cookiesowns

Quote:


> Originally Posted by *Joneszilla*
> 
> Hope that release date ends up being legit. Im ready to buy.


As am I!


----------



## kyfire

Wasn't trying to be a jerk. There have just been so many screenshots posted and cryptic references to major tech sources.


----------



## GoldenTiger

Quote:


> Originally Posted by *kyfire*
> 
> Wasn't trying to be a jerk. There's just been so many screen shots posted and cryptic references to major tech sources.


Ahhh, no worries.







I guess you're right, it would look a bit iffy heh otherwise.


----------



## xzamples

beasts!!!


----------



## supermi

Well, I am putting up my 4 Classifieds for sale for these.

HOPE HOPE HOPE they turn out to be real 'Titans'







and are all we hope for!!!!!!!!!!!

More info please


----------



## mike88931

I realize this is speculation, but I went from pissing myself in excitement to feeling kind of "eh." After RMAing 2 ridiculously overpriced 6GB 7970 cards, I now have a $1200 budget for GPUs. I originally planned on getting one of these, which at the time was said to be 20 percent better than the 690, and was just going to wait until I needed it more, after the prices dropped. But now, after hearing they may just be turned-down Teslas, not even as fast as the 690, and very limited in supply, I am quite hesitant. I do not want to spend $900 on one that is that weak compared to the original rumor if I will not be able to get another one for around $300 when the prices drop enough by the time I need another. Especially not with next-gen cards being right around the corner, and the fact that I could fit two 780's into my budget on release. The only real advantage of this over doing that is the 6GB of VRAM, which comes in handy for extreme Skyrim modding aficionados such as myself, and the fact that it offers SLI performance without the dual-GPU issues (or at least almost SLI performance, if the latest rumors are to be trusted).

As everyone else, I am obsessively wishing for benchmarks ASAP!

PS: hopefully I will have a complete absence of the ludicrously pathetic driver issues that accompanied me with the AMD cards...


----------



## dph314

Quote:


> Originally Posted by *mike88931*
> 
> I realize this is a speculation but I went from pissing myself in excitement to feeling kind of "eh." After rmaing 2 ridiculously overpriced 6b 7970 cards i now have a 1200 dollar budget for gpus. I originally planned on getting one of these which at the time it was said would be 20 percent better than the 690 and was just going ot wait until i needed it more after the prices dropped. But now after hearing they may just be turned down teslas, not even as fast as the 690, and be very limited I am quite hesitant. Ido not want to spend 900 on one that is that weak compared to the original rumor if I will not be able to get another one for around 300 when the prices drop enough by the time i need another. Especially not with next gen cards being right around the corner and the fact i could get two 780's into my budget on release. The only real advantage of this over doing that is the 6gb of vram which comes in handy for extreme skyrim modding aficionados such as myself and the fact it offers sli performance without the dual gpu issues (or at least almost sli performance if latest rumors areto be trusted.
> 
> As everyone else, I am obsessively wishing for benchmarks ASAP!
> 
> ps hopefully i will have a complete absence of the ludicrously pathetic driver issues that accompanied me with the AMD cards...


I think this card is going to be a lot more expensive for a little more performance, relative to the 780, just like all top-of-the-line cards are. Look at the 670 and 680. ~5-7% more performance for $100 more. For Titan and the 780, I don't think the numbers will be _exactly_ similar to those, but you get my point. It's always been like that for the best: minimal gain for a good chunk of loot. If you want the best, Titan will be the best. But you'll have to decide if the price-gap between the best and second-best is worth it for you.

Basically what I'm saying is benchmarks don't matter if you want the best. You'll be going with Titan if you do. If you, however, have a price/performance value you want to stay around, then you'll have to wait till both are out for a bit to see if the price jump is worth the performance.


----------



## Avonosac

Quote:


> Originally Posted by *dph314*
> 
> I think this card is going to be a lot more expensive for a little more performance, relative to the 780, just like all top-of-the-line cards are. Look at the 670 and 680. ~5-7% more performance for $100 more. For Titan and the 780, I don't think the numbers will be _exactly_ similar to those, but you get my point. It's always been like that for the best: minimal gain for a good chunk of loot. If you want the best, Titan will be the best. But you'll have to decide if the price-gap between the best and second-best is worth it for you.
> 
> Basically what I'm saying is benchmarks don't matter if you want the best. You'll be going with Titan if you do. If you, however, have a price/performance value you want to stay around, then you'll have to wait till both are out for a bit to see if the price jump is worth the performance.


If the 780 and 770 both do actually have 384-bit memory buses... I think the argument is going to come down to which one has unlocked voltages or the better VRMs. Based on this chart, the Titan obviously has the advantage, but it's clocked so low that the amount of performance you can get out of it will be severely throttled at the memory.

The way I'm looking at this is that there will be 1 memory controller now taking care of effectively 2 GPUs' worth of cores. I think the far slower memory clocks, and the harder work the controller will have to do, will be a huge limit on Titan's performance. If the 700-series release pans out the way it does in these pictures, I can easily see the Titan beating the performance of the 690, because the card you really need to compare it against will likely be the 790, which should be 7-10% better, in normal Nvidia fashion.


----------



## brettjv

Quote:


> Originally Posted by *Avonosac*
> 
> If the 780 and 770 both do actually have 384bit memory bandwidth... I think the arguement is going to come down to which one has unlocked voltages or the better VRMs. Based on this chart, the titan obviously has the advantage, but its clocked so low, the amount of power you can get out of it will be severely throttled at the memory.
> 
> The way I'm looking at this, is there will be 1 memory controller now taking care of effectively 2 GPUs worth of cores. I think the far slower clock times on the memory and the harder work the controller will have to do, will be a huge limit to titans performance. If the release for 700 series pans out the way it does in these pictures, I can easily see the Titan beating the performance of the 690, because the card you really need to compare it against will likely be the 790, which should be 7-10% better, in normal NVidia fashion.


Actually, increasing the bus width by 50% pretty much automatically implies a +50% increase in the count of the memory controllers attached to the chip.









Although, since the core count is going up by >50%, and it's already known that GK104 with a 256-bit bus can be bandwidth-limited, there's some legitimate cause for concern around the issue of available bandwidth for Titan, esp. given the purported memory downclocking to the 1300MHz range. That might also be part of why they're looking at only 732-ish for the core clock... it may be they're seeing diminishing returns with further core increases due to bandwidth restrictions, and they're looking to put out a balanced part.

Hopefully people will be able to push the memory clocks on Titan like we see on GK104, where +250MHz (in actual frequency) OC's to the memory are fairly routine, and even +300-350MHz not unheard of.

At the clocks we're presently discussing, the Titan isn't going to even be in the same ballpark as a 690 in terms of raw fps, let alone 'beating' it. Not unless you're using a game w/ terrible SLI scaling for comparisons.
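How much those memory OC's would actually buy can be sketched quickly. Both the ~1300MHz (actual) stock memory clock and the 384-bit bus are thread rumors, not confirmed specs:

```python
# Bandwidth gain from a memory OC on the rumored Titan config:
# 384-bit bus, ~1300MHz actual GDDR5 clock (both figures are rumors).

def bandwidth_gbs(actual_mhz, bus_bits=384):
    # GDDR5 effective rate is 4x the actual memory clock
    return actual_mhz * 4 * bus_bits / 8 / 1000

stock = bandwidth_gbs(1300)
for oc in (250, 300, 350):
    total = bandwidth_gbs(1300 + oc)
    print(f"+{oc}MHz: {total:.1f} GB/s "
          f"({(total - stock) / stock:.1%} over stock)")
```

So the "routine" +250MHz would be worth roughly 19% more bandwidth, which is exactly why the downclocked memory is the interesting knob here.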


----------



## Avonosac

Quote:


> Originally Posted by *brettjv*
> 
> Actually, increasing the bus width by 50% pretty much automatically implies a +50% increase in the count of the memory controllers attached to the chip
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Although, since the core count is going up by >50%, and it's already known that GK114 with a 256-bit bus can be bandwidth-limited, there's some legitimate cause for concern around the issue of available bandwidth for Titan, esp. given the purported memory downclocking to the 1300MHz range. That might also be part of why they're looking at only 732-ish for the core clock ... it may be they're seeing diminishing returns with further core increases due to bandwidth restrictions, and they're looking to put out a balanced part.
> 
> Hopefully people will be able to push the memory clocks on Titan like we see on GK114, wherein +250MHz (in actual frequency) OC's to the memory are fairly routine, and even +300-350MHz not unheard of.
> 
> At the clocks we're presently discussing, the Titan isn't going to even be in the same ballpark as a 690 in terms of raw fps, let alone 'beating' it. Not unless you're using a game w/terrible SLi scaling for comparisons.


I believe the terrible SLI scaling is what most people are banking on for the Titan. Also, if you have seen the TPU pages for the GTX 780, GTX 770, and Titan, you will see where I was drawing my information on the memory buses. The transistor count for the memory wasn't exactly my issue, more like the trouble with addressing 6GB of memory at the same speed as 2GB of memory. If you're still using the same algorithm to address the space, the logical loops will take longer for the higher capacity. We already see with the 4GB versions that the GK104 does not scale well when you add extra memory at the higher clocks, and you lose performance from the chips overall.


----------



## almighty15

You'll find that this will be released in limited numbers like the GTX 690.

This card will also replace the Ultra/dual-GPU cards.

Then we'll have a normal GTX 780 and GTX 770 with a 10-15% increase over the 600 series.


----------



## hifibuff

almlighty15, source ?


----------



## dph314

Quote:


> Originally Posted by *hifibuff*
> 
> almlighty15, source ?


Anything at this point is an opinion, so I just think not everyone bothers putting an 'imo' at the end of their statements, since it's implied


----------



## maarten12100

In a few days we'll know if the Feb 18 rumor is true.
If the core is locked we can at least up the memory clock, but I don't think memory will be a bottleneck since the Tesla runs at this core clock.
But if you up the core (if you could), you might run into a memory bottleneck.


----------



## Avonosac

Quote:


> Originally Posted by *maarten12100*
> 
> In a few days we'll know if the feb 18 rumor is true.
> If the cores is locked we can atleast up the memory clocks but I don't think it will be a bottleneck as the tesla have this core clock.
> But if you up the core (if you could) you might run into a memory bottleneck


I would imagine the demands on memory from Tesla workloads are very different from gaming. I think capacity is more important than speed when rendering large textures. I really think memory will be an issue with Titan, at least when trying to use it all.


----------



## brettjv

Quote:


> Originally Posted by *Avonosac*
> 
> I believe the terrible SLI scaling is what most people are banking on for the titan. Also if you have seen the TPU pages for GTX 780, GTX 770, and Titan, you will see where I was drawing my information on the memory buses. The size of the transitor count for memory wasn't exactly my issue, more like the trouble with addressing 6 GBs of memory at the same speed as 2GBs of memory. If you're still using the same algorithm to address the space, the logical loops will take longer for the higher capacity. We already see with the 4GB versions, the GK104 does not scale well when you add extra memory at the higher clocks and you lose performance from the chips overall.


Ah, I see. Well, I can only respond to what's on the page, and you were talking as though you thought the # of memory controllers wouldn't be higher on Titan vs GK104, so ... I just wanted to clear it up ... there will be 50% more memory controllers.

And I don't know where you are drawing the points in your current post from... the way I see it, the core of GK104 (esp. the cut-down version in the 670) just isn't powerful enough in general to benefit from 4GB, even in SLI (this is per the review on [H]). It would require some pretty sophisticated tests to prove whether this is due to bandwidth restrictions in the memory subsystem, or just simple raw GPU grunt (calculations, rasterizing, etc), and ttbomk those tests have really not been done... but if you have evidence to the contrary, I'd be keen to see it









And I've definitely not seen the proof that going to 4GB (in and of itself) hurts the perf of GK104, at least not outside the bounds of what could be explained by differences in actual operating frequency on the core clock on the models being tested, or looser timings on the 4GB model.

AFA SLi goes, I suspect 'scaling' will be much the same as GK104 (2GB or 4GB ... again, I've not seen evidence there's a difference in terms of SLi scaling), but with the caveat that CPU bottlenecking will be even more likely to come into play than it already is (which means a pretty high likelihood outside of surround resolutions ... in part because CPU's have not been getting faster at NEARLY the rate GPU's have over the past 4 years).


----------



## Avonosac

Quote:


> Originally Posted by *brettjv*
> 
> Ah, I see. Well, I can only respond to what's on the page, and you were talking as though you thought the # of memory controllers wouldn't be higher on Titan vs GK104, so ... I just wanted to clear it up ... there will be 50% more memory controllers.
> 
> And I don't know where you are drawing the point your current post from ... the way I see it, the core of GK104 (esp. the cut-down version in the 670) just isn't powerful enough in general to benefit from 4GB, even in SLi (this is per the review on [H]). It would require some pretty sophisticated tests to prove whether this is due to bandwidth restrictions in the memory subsystem, or just simple raw gpu grunt (calculations, rasterizing, etc), and ttbomk those tests have really not been done ... but if you have evidence to the contrary, I'd be keen to see it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And I've definitely not seen the proof that going to 4GB (in and of itself) hurts the perf of GK104, at least not outside the bounds of what could be explained by differences in actual operating frequency on the core clock on the models being tested, or looser timings on the 4GB model.
> 
> AFA SLi goes, I suspect 'scaling' will be much the same as GK104 (2GB or 4GB ... again, I've not seen evidence there's a difference in terms of SLi scaling), but with the caveat that CPU bottlenecking will be even more likely to come into play than it already is (which means a pretty high likelihood outside of surround resolutions ... in part because CPU's have not been getting faster at NEARLY the rate GPU's have over the past 4 years).


1) More memory controllers don't necessarily improve memory addressing the way a better architecture would. It still has the same big-O complexity (whatever it is for hardware).

2) Well, the [H] review did say that when over 2GB was used it performed better, but only then; otherwise it was distinctly inferior. Yeah, the idea for the 4GB card when I got mine was for SLI, but it proved not worth it, so I'm just staying at one.

3) The 4GB cards perform worse than reference; in the same [H] review you were referencing it was usually 6-8% worse. I am also drawing on many other posts from 4GB owners with less-than-satisfactory results, but no additional proof. Also, when you stress a CPU's memory controller by adding more memory to address, it usually greatly reduces its OC potential. The memory of a GPU and a CPU are different, and addressed differently, but I believe the same principles of electronics should likely hold true.

4) CPU bottlenecking could seriously come into play with 2 Titans, though I doubt a single one would bottleneck on the CPU. It remains to be seen if this will be the case. But recall that the current SLI scaling is not the way it was on release; there have been serious improvements for SLI to get where it is, on anything but the top titles the cards are being marketed to.


----------



## Shiftstealth

Quote:


> Originally Posted by *brettjv*
> 
> Ah, I see. Well, I can only respond to what's on the page, and you were talking as though you thought the # of memory controllers wouldn't be higher on Titan vs GK104, so ... I just wanted to clear it up ... there will be 50% more memory controllers.
> 
> And I don't know where you are drawing the point your current post from ... the way I see it, the core of GK104 (esp. the cut-down version in the 670) just isn't powerful enough in general to benefit from 4GB, even in SLi (this is per the review on [H]). It would require some pretty sophisticated tests to prove whether this is due to bandwidth restrictions in the memory subsystem, or just simple raw gpu grunt (calculations, rasterizing, etc), and ttbomk those tests have really not been done ... but if you have evidence to the contrary, I'd be keen to see it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And I've definitely not seen the proof that going to 4GB (in and of itself) hurts the perf of GK104, at least not outside the bounds of what could be explained by differences in actual operating frequency on the core clock on the models being tested, or looser timings on the 4GB model.
> 
> AFA SLi goes, I suspect 'scaling' will be much the same as GK104 (2GB or 4GB ... again, I've not seen evidence there's a difference in terms of SLi scaling), but with the caveat that CPU bottlenecking will be even more likely to come into play than it already is (which means a pretty high likelihood outside of surround resolutions ... in part because CPU's have not been getting faster at NEARLY the rate GPU's have over the past 4 years).


Do you have any idea on where we would see diminishing returns on memory bandwidth (Gbps)?


----------



## maarten12100

Quote:


> Originally Posted by *brettjv*
> 
> AFA SLi goes, I suspect 'scaling' will be much the same as GK104 (2GB or 4GB ... again, I've not seen evidence there's a difference in terms of SLi scaling), but with the caveat that CPU bottlenecking will be even more likely to come into play than it already is (which means a pretty high likelihood outside of surround resolutions ... in part because CPU's have not been getting faster at NEARLY the rate GPU's have over the past 4 years).


Cpus did actually but multithreading isn't ready at all.


----------



## Avonosac

Quote:


> Originally Posted by *maarten12100*
> 
> Cpus did actually but *software isn't ready for multithreading* at all.


FYP.

Since I'm pretty sure that is what you meant.


----------



## maarten12100

Quote:


> Originally Posted by *Avonosac*
> 
> FYP.
> 
> Since I'm pretty sure that is what you meant.


Most games utilize only 2 cores, which is more than sad when you have 16.
CPU bottlenecks suck, since the only way to resolve them is upping the per-thread performance.
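The "2 of 16 cores" complaint is really just Amdahl's law. A quick sketch with made-up fractions (the p values below are illustrative, not measurements of any real game):

```python
# Amdahl's law: if only a fraction p of a game's frame time can run in
# parallel, piling on cores barely helps. The p values are illustrative.

def amdahl_speedup(p, n):
    """Speedup on n cores when fraction p of the work parallelizes."""
    return 1 / ((1 - p) + p / n)

for p in (0.5, 0.9):
    print(f"p={p}: 2 cores -> {amdahl_speedup(p, 2):.2f}x, "
          f"16 cores -> {amdahl_speedup(p, 16):.2f}x")
```

Even with 90% of the frame parallelizable, 16 cores only get you ~6.4x, which is why per-thread speed still dominates game performance.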


----------



## Arni90

Quote:


> Originally Posted by *Shiftstealth*
> 
> Do you have any idea on where we would see diminishing returns on memory bandwidth (Gbps)?


When the amount of data a card is able to process per second is less than the amount of data the memory controller is able to feed the card with


----------



## Shiftstealth

Quote:


> Originally Posted by *Arni90*
> 
> When the amount of data a card is able to process per second is less than the amount of data the memory controller is able to feed the card with


To be more specific, I mean for Titan with 2688 cores at 732MHz.


----------



## Arni90

Quote:


> Originally Posted by *Shiftstealth*
> 
> To be more specific i mean for Titan with 2688 Cores at 732 Mhz.


That depends on what the hardware is doing. The ROPs and TMUs can easily increase bandwidth consumption just by activating anisotropic filtering and multisample AA, not to mention that doubling texture resolution increases bandwidth consumption for textures by a factor of four.
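The "doubling resolution quadruples texture bandwidth" point, made concrete: texture data grows with the square of the linear resolution (width times height). This ignores compression and mipmaps, and the 4-bytes-per-texel figure is just the common uncompressed RGBA8 case:

```python
# Doubling a texture's linear resolution quadruples its data size,
# since bytes scale with width * height. Assumes uncompressed RGBA8.

def texture_bytes(width, height, bytes_per_texel=4):
    return width * height * bytes_per_texel

base = texture_bytes(1024, 1024)
doubled = texture_bytes(2048, 2048)
print(doubled // base)  # 4
```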


----------



## Shiftstealth

Quote:


> Originally Posted by *Arni90*
> 
> That depends on what the hardware is doing, the ROPs and TMUs can easily increase bandwidth consumption just by activating Anisotropic Filtering and Multisampling AA, not to mention doubling texture resolution increases bandwidth consumption for textures by four.


I feel you are being difficult and you understand what I am asking.

Let's simplify this. For a 7970, when do you see diminishing returns on bandwidth in 3DMark 11?


----------



## maarten12100

Quote:


> Originally Posted by *Shiftstealth*
> 
> I feel you are being difficult and you understand what i am asking.
> 
> Lets simplify this. For a 7970, when do you see diminishing returns on bandwidth in 3D Mark 11.


The only way to spot this is to measure latency on the board, which we can't.
The other way is having 2 cards made, one with a narrower memory interface and one with a wider one, with the same core attached.


----------



## striderstone

Found this little gem on the EVGA forums in case no one here saw it.

http://videocardz.com/395...eased-on-february-18th

If anyone ends up getting 2 of them and then has 2 4gb 680 classy HC's for sale hit me up


----------



## supermi

Quote:


> Originally Posted by *striderstone*
> 
> Found this little gem on the EVGA forums in case no one here saw it.
> 
> http://videocardz.com/395...eased-on-february-18th
> 
> If anyone ends up getting 2 of them and then has 2 4gb 680 classy HC's for sale hit me up


That might be me, but I have 4. Are you into EXTREME systems? LOL


----------



## striderstone

Well, I am planning on getting 4. Some guy on the EVGA forums said he would hook me up: 1200 for 2 of his. So unless you can beat that, I wouldn't be interested in getting more than 2. I would be interested in getting 4, yes. I do a lot of video editing, so the CUDA cores help ^_^


----------



## almighty15

Quote:


> Originally Posted by *striderstone*
> 
> well I am planning on getting 4, Some guy on the EVGA forums said he would hook me up, 1200 for 2 of his. So unless you can beat that, I wouldn't be interested in getting more than 2. I would be interested in getting 4, yes. I do a lot of video editing so the cuda cores help ^_^


Then move to AMD and OpenCL and get more performance..


----------



## maarten12100

Quote:


> Originally Posted by *almighty15*
> 
> Then move to AMD and OpenCL and get more performance.


The Titan will be perfect for him, as it is partly a GPGPU core.


----------



## striderstone

Quote:


> Originally Posted by *almighty15*
> 
> Then move to AMD and OpenCL and get more performance.


I use Sony Vegas Pro 12 right now, and AMD's drivers crash Vegas on the version I am using; on other versions they are still not optimized. This is just from what I have read online, as I did look into the possibility. Everywhere I looked stated that I should stick with Nvidia strictly for the support, and yes, Nvidia may have fewer cores, but it still renders faster. I'm not an expert, but I would rather not just go against the grain, since I have always had a bad experience with AMD to begin with. That, and I love EVGA; I don't know if it's possible to rip me away from that company.


----------



## Master Freez

I hope that the Titan will support GOP and Ultra Fast Boot into Windows 8.

Probably nobody has remembered that feature in this thread.


----------



## NCSUZoSo

The link you posted above was a faulty publish on my part. It was put into the wrong section and had to be moved so it would show up on our front page, lol.

Here is the correct link: http://videocardz.com/39536/nvidia-geforce-gtx-titan-to-be-released-on-february-18th


----------



## maarten12100

Quote:


> Originally Posted by *NCSUZoSo*
> 
> Our link that you posted above was a faulty publish by myself. It was put into the wrong section and had to be moved so it would show up on our front page, lol.
> 
> Here is the correct link: http://videocardz.com/39536/nvidia-geforce-gtx-titan-to-be-released-on-february-18th


Seen it already; can't wait.
Money well spent. Can't wait to post in the Titan ownership thread.


----------



## NCSUZoSo

Quote:


> Originally Posted by *maarten12100*
> 
> Seen it already can't wait.
> Money well spent can't wait to post in the Titan ownership thread.


Do you somehow have one secured already?

VideoCardz.com (the site I am Chief Editor and Writer for) is trying to get a hold of one, and we are even willing to pay for it. So far we have yet to secure one.

We are trying to start our card reviews off with a bang (this would be our first), so we have sourced the funds to purchase a Titan for a review.

I am contacting EVGA tomorrow and then ASUS. We tried OBR Hardware (as we have a relationship there), but as of yet they haven't been able to help us.

So if you know of a way to pre-order this card, please PM me with info. Or maybe an EVGA/ASUS rep will see this?? (yah right)


----------



## maarten12100

Quote:


> Originally Posted by *NCSUZoSo*
> 
> Do you somehow have one secured already?
> 
> VideoCardz.com (the site I am Chief Editor and Writer for) is trying to get a hold of one and we are even willing to pay for it. So far we have yet to secure one..
> 
> We are trying to start our card reviews off with a bang (this would be our first), so we have sourced the funds to purchase Titan for a review.
> 
> I am contacting EVGA tomorrow and then ASUS. We tried OBR Hardware (as we have a relationship there), but as of yet they haven't been able to help us.
> 
> So if you know of a way to pre-order this card, please PM with info. Or maybe EVGA/ASUS rep will see this?? (yah right)


I meant that I had seen the link already.
If I had one as a consumer, I would of course post info about it.


----------



## NBAasDOGG

If there's gonna be an ASUS DC II version of this Titan, I am going to buy it.


----------



## 47 Knucklehead

I wonder how long it will be until EVGA comes out with the "EVGA Titan Hydro Copper 2".


----------



## dph314

Nothing but reference designs, if the rumors hold any truth at all. And judging from the 690, I wouldn't doubt it.

Would love a Lightning version


----------



## Cloudfire777

Quote:


> Originally Posted by *NCSUZoSo*
> 
> Do you somehow have one secured already?
> 
> VideoCardz.com (the site I am Chief Editor and Writer for) is trying to get a hold of one and we are even willing to pay for it. So far we have yet to secure one..
> 
> We are trying to start our card reviews off with a bang (this would be our first), so we have sourced the funds to purchase Titan for a review.
> 
> I am contacting EVGA tomorrow and then ASUS. We tried OBR Hardware (as we have a relationship there), but as of yet they haven't been able to help us.
> 
> So if you know of a way to pre-order this card, please PM with info. Or maybe EVGA/ASUS rep will see this?? (yah right)


There is very little chance you are getting any. As far as I know, only the reputable sites that have done a lot of reviews get samples from Nvidia/Asus/EVGA. Even among sites that have done a lot of reviews before, they still handpicked only 2-3 to do the review.

What you could try is contacting said sites and asking if you could borrow any of their samples. All of them have received 3 Titans according to rumors, to cover all bases for those who want to do an SLI or 3-way review. This might mean you have to publish your review a day after them, so they can keep the exclusive on the review.


----------



## hifibuff

Is it confirmed that the design will be exactly the same as that of the 690? I'd much prefer a blower type (similar to that of the 680) for 3-way SLI purposes.


----------



## NCSUZoSo

Quote:


> Originally Posted by *hifibuff*
> 
> Is it confirmed that the design will be exactly the same as that of the 690s ? I'd much prefer a blower-type (similar to that of the 680s) for 3-way SLI purposes


It is confirmed that similar materials will be used (magnesium alloy), but the actual design of the card is still a mystery, besides the fact that it ships inside a wooden box, just like the 690 did.


----------



## hifibuff

Well, as long as the hot air is exhausted completely out of the case, I'll be fine with magnesium and whatnot.


----------



## Mr.Eiht

Hm, I wanted to go all in when Nvidia releases their new cards, but I thought that would be in Q3 this year. I've got the money already, and I thought I could keep it for a little longer, hug it, squeeze it... now it looks like Nvidia might have something soon.

Maybe someone can enlighten me: is this already their new flagship, or will they release the top GK110 cards later this year?
Thanks in advance.


----------



## Cloudfire777

Quote:


> Originally Posted by *Mr.Eiht*
> 
> Hm, I wanted to go all in when nvidia releases their new cards. But I thought that this will be in Q3 this year. I got the money already but I thought I can keep it for a little longer, hug it, squeeze it....now it looks like nvidia might have it soon.
> 
> Maybe somone can enlighten me if this is already their new flagship or of they will release the TOP GK110 cards later this year...
> Thanks in advance.


This will be their flagship for sure. It will beat all 700 series cards except the GTX 790, which will be 2x GK114. So for single GPUs, this is the best you can get with Kepler.

NB: There is a small risk that Nvidia pushes out a FULL GK110 later on, aka 2880 cores. It's not been rumored or shown in any leaks, but a Titan 2 might just happen. Really small risk, though.


----------



## Mr.Eiht

@Cloudfire777 thanks for the info. +REP`ed you for sure.

OK, so the idea of this Titan thing is that it outperforms all 7xx cards except the 790...
Let's see how that thing computes. I don't need two of them for gaming, just some funky rendering.
And Crysis 3.


----------



## Shiftstealth

Quote:


> Originally Posted by *Mr.Eiht*
> 
> @Cloudfire777 thanks for the infos. +REP`ed you for sure.
> 
> OK so the idea of the titan thing is that it outperforms all 7XX cards except the 790...
> Lets see how that thing computes. I dont need two of them for gaming. just some funky rendering.
> And Crysis 3


From what I've read, Nvidia is supposed to be hosting something tomorrow, and they have a financial call on the 17th (or they had it on the 13th), so we might know something this weekend.

EDIT: I saw this posted over in Hardforums.


----------



## NCSUZoSo

It is pretty much confirmed: paper launch on the 18th, and then reviews at midnight (the 19th).


----------



## Shiftstealth

Quote:


> Originally Posted by *NCSUZoSo*
> 
> It is pretty much confirmed Paper Launch on the 18th and then reviews at midnight (the 19th)


Yeah, whatever they are hosting, they will probably have it locked in a wooden crate again lol.


----------



## Cloudfire777

Quote:


> Originally Posted by *NCSUZoSo*
> 
> It is pretty much confirmed Paper Launch on the 18th and then reviews at midnight (the 19th)


We are counting on videocardz getting hold of a sample tonight and putting out the review on the 19th.
I can't wait til the 19th. No way.


----------



## Shiftstealth

Quote:


> Originally Posted by *Cloudfire777*
> 
> We are counting on videocardz getting hold of a sample tonight and putting out the review 15th.
> I can`t wait til 19th. No way


+1


----------



## dph314

Why would a review go out on the 15th? Did videocardz say they were planning on releasing something earlier than the 18th? Sorry, haven't had much time to keep up on everything. Though I'm having fun trying.


----------



## NCSUZoSo

They are just joking; we (videocardz.com) haven't even gotten access to a single Titan card yet. We are trying to work it out between other sites and/or ASUS/EVGA.


----------



## Cloudfire777

LOL, you should contact the guys @ Tweaktown. They seem to know how to get their hands on the GPUs, spit on Nvidia's NDA, and do whatever they want.

Quote:


> Originally Posted by *dph314*
> 
> Why would a review go out on the 15th? Did videocardz say that they were planning on releasing something earlier than the 18th? Sorry, haven't had much time to keep up on everything. Though I'm having fun trying


I wrote it wrong. It should say the 19th. Sorry.


----------



## dph314

Quote:


> Originally Posted by *Cloudfire777*
> 
> LOL you should contact the guys @ Tweaktown. They seem to know how to get hands on the GPUs, spit on Nvidia`s NDA and do whatever they want
> 
> Quote:
> 
> > Originally Posted by *dph314*
> >
> > Why would a review go out on the 15th? Did videocardz say that they were planning on releasing something earlier than the 18th? Sorry, haven't had much time to keep up on everything. Though I'm having fun trying
> 
> I wrote wrong. Should say 19th. Sorry

Click to expand...

Ah, I see. No problem. Yeah I'll definitely be up late Monday. Looking forward to what comes out.


----------



## driftingforlife

I have been trying as well. No luck, and I know a well-known reviewer who can't get one either.


----------



## Nemessss

Quote:


> Originally Posted by *Cloudfire777*
> 
> LOL you should contact the guys @ Tweaktown. They seem to know how to get hands on the GPUs, spit on Nvidia`s NDA and do whatever they want
> 
> 
> 
> 
> 
> 
> 
> 
> I wrote wrong. Should say 19th. Sorry


How can we contact them?


----------



## hifibuff

@ driftingforlife, that wouldn't happen to be Kyle from Hard OCP by any chance ?


----------



## driftingforlife

Nope it wouldn't.


----------



## hifibuff

Ok, so it seems a lot of reviewers haven't received their cards yet. I hope it's no indication of how scarce the availability is going to be


----------



## maarten12100

Quote:


> Originally Posted by *hifibuff*
> 
> Ok, so it seems a lot of reviewers haven't received their cards yet. I hope it's no indication of how scarce the availability is going to be


Maybe they want to keep it a "surprise" so it doesn't leak out early; however, we already know exactly what it is going to be at release, and we have indications of its performance in three benchmarks.


----------



## Cloudfire777

Quote:


> Originally Posted by *hifibuff*
> 
> Ok, so it seems a lot of reviewers haven't received their cards yet. I hope it's no indication of how scarce the availability is going to be


Only a select few got samples, from what I've read.


----------



## xoleras

Quote:


> Originally Posted by *hifibuff*
> 
> Ok, so it seems a lot of reviewers haven't received their cards yet. I hope it's no indication of how scarce the availability is going to be


It will definitely be scarce, it's a limited edition item.


----------



## guinner16

Quick question: how do these releases work on sites like Newegg or Amazon? Do they list them at midnight, or are they listed randomly throughout the morning/day? Any help would be greatly appreciated, as I have not tried to buy a very limited GPU like this at launch.


----------



## ChronoBodi

Where does one pre-order, or at least order, a Titan?


----------



## willdearborn

Quote:


> Originally Posted by *ChronoBodi*
> 
> where does one preorder, or at least, order a titan?


There is no pre-ordering. Once the NDA lifts they will just suddenly be available on sites like Newegg. First come, first served.


----------



## ChronoBodi

Quote:


> Originally Posted by *willdearborn*
> 
> There is no pre-ordering. Once the NDA lifts they will just suddenly be available on sites like newegg. First come first served


-_- dammit... I understand there's a paper launch on the 18th, but when's the real release date?


----------



## guinner16

Quote:


> Originally Posted by *willdearborn*
> 
> There is no pre-ordering. Once the NDA lifts they will just suddenly be available on sites like newegg. First come first served


I get that the 18th is a paper launch, but let's say you wanted to try to get one off Newegg. Do you basically just sit there for 24 hours hitting F5? That sucks. The retail release date is rumored to be the 24th-26th. Does this work the same way, or do they list the items at specific times? At least with the iPhone they tell you it will be ready at midnight or something like that.


----------



## Murlocke

Quote:


> Originally Posted by *guinner16*
> 
> I get the 18th is a paper launch but lets say you wanted to try and get one off new egg. You basically just sit there for 24 hours hitting f5???? That sucks. The retail release date is rumored on the 24-26th. Does this work the same way, or do they list the items at specific times? At least with Iphone they tell you it will be ready at midnight or something like that.


Same as every other piece of hardware ever released. It's always the F5 game; it's just even worse with GPUs. I could see the Titan being REALLY hard to get, or easy to get, mainly because it's ~$1000. I use a program to scan websites for changes. The 680s sold out in minutes.

Since only 2 brands are getting these, your best bet (imo) is buying directly from the ASUS or eVGA websites. It's something lots of people overlook.
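For anyone curious, a scanner like the one Murlocke mentions can be very simple. This is just a generic sketch of the idea, not his actual tool; the URL, polling interval, and function names are placeholders:

```python
import hashlib
import time
import urllib.request

def page_fingerprint(html: bytes) -> str:
    # Hash the raw page body so any change, however small, is easy to spot.
    return hashlib.sha256(html).hexdigest()

def watch(url: str, interval_s: int = 60) -> None:
    # Poll the page and print a message whenever its contents change.
    last = None
    while True:
        with urllib.request.urlopen(url) as resp:
            current = page_fingerprint(resp.read())
        if last is not None and current != last:
            print(f"{url} changed -- time to hit Buy!")
        last = current
        time.sleep(interval_s)
```

In practice you'd want to hash only the product-listing part of the page, since things like timestamps and ads change on every load and would trigger false alarms.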


----------



## ChronoBodi

Quote:


> Originally Posted by *Murlocke*
> 
> Same as every other piece of hardware ever released. It's always the F5 game, it's just even worse with GPUs. I could see Titan being REALLY hard to get, or easy to get. Mainly because it's ~$1000. I use a program to scan websites for changes. The 680s sold out in minutes.
> 
> Since only 2 brands are getting these, your best bet (imo) is buying directly from ASUS or eVGA websites. It's something lots of people overlook.


Huh... hopefully not too many people order from EVGA/ASUS directly.

Do you guys think the Titan will have the same availability as the GTX 690 in general? I still see 690s on Newegg, but maybe ordering direct is the surefire way here?


----------



## ChronoBodi

Anyone have the length of this Titan? I'm gonna be pissed if it's longer than 11.3 inches... with the GTX 690 being 11 inches, this Titan has to be a wee bit shorter due to the single-GPU design. It HAS to be!

Edit: never mind, it'll fit. Here's the K20 compared to a 690:


----------



## ChronoBodi

Quote:


> Originally Posted by *Murlocke*
> 
> Same as every other piece of hardware ever released. It's always the F5 game, it's just even worse with GPUs. I could see Titan being REALLY hard to get, or easy to get. Mainly because it's ~$1000. I use a program to scan websites for changes. The 680s sold out in minutes.
> 
> Since only 2 brands are getting these, your best bet (imo) is buying directly from ASUS or eVGA websites. It's something lots of people overlook.


What magical program is this? I would like one, please.


----------



## Alatar

Another listing:

http://webshop.eurosys.be/en/components/graphics-cards/gtxtitan-6gb-graphics-card-gtxtitan6gd5-p387473


----------



## maxmix65

Price for the Asus GTXTITAN-6GD5:

GTX TITAN-6GD5
http://www.hwlegend.com/bbforum/viewtopic.php?f=17&t=2643&start=150#p41648


----------



## Phishy714

It is a well-known fact that these random little online retailers will mark up the price of unreleased hardware, listing it early just to drive traffic to their websites, in the hope that some idiot will throw down the money expecting to get one early.

I would take that listing price with a grain of salt, much like all the "articles" that have come out recently and add absolutely nothing to what's already known.


----------



## Seraphic

I am interested in a Titan card. However, I am waiting for the next generation Intel Ivy Bridge E/EP CPUs before I build a new system. Will Titan still be available for sale Q3/Q4 of this year?


----------



## Shiftstealth

Quote:


> Originally Posted by *Seraphic*
> 
> I am interested in a Titan card. However, I am waiting for the next generation Intel Ivy Bridge E/EP CPUs before I build a new system. Will Titan still be available for sale Q3/Q4 of this year?


A 1-in-1,560,000 chance.


----------



## ChronoBodi

Quote:


> Originally Posted by *Seraphic*
> 
> I am interested in a Titan card. However, I am waiting for the next generation Intel Ivy Bridge E/EP CPUs before I build a new system. Will Titan still be available for sale Q3/Q4 of this year?


Even then... the Sandy Bridge-E CPUs still blow everything away, especially when OCed. So overpowered are they, indeed, that I intend to get a Titan to complete my overkill i7-3930K rig, which currently has a GTX 580 3GB.


----------



## Cloudfire777

Quote:


> Originally Posted by *maxmix65*
> 
> Price Asus GTXTITAN-6GD5
> 
> GTX TITAN-6GD5
> http://www.hwlegend.com/bbforum/viewtopic.php?f=17&t=2643&start=150#p41648


Wow, that's a lot of shops that have listed the Asus Titan...


----------



## Shiftstealth

Quote:


> Originally Posted by *Cloudfire777*
> 
> wow thats a lot of shops that have listed the Asus Titan...


Probably a lot of cows they collected fecal matter from too.


----------



## maarten12100

I advise everyone not to purchase from those sites yet, as you will probably get the card later than if you buy it after the release from a known shop.
All the sites I've seen offering them were poorly designed, and leaking a product isn't really something to be proud of.

If it is 1000 euro, that is a big miss for the European market; it should be more like 800/900, but certainly not more.
For America the rumor is 1000/1100 dollars, however it will probably be a tad cheaper, so somewhere in the 950/1100 dollar price range.


----------



## ChronoBodi

Rather, check the official ASUS/EVGA sites and Newegg, if anyone is willing to sit there F5ing those sites or has a scanner program to do it.


----------



## maxmix65

Another listing:
http://uptiki.altervista.org/viewer.php?file=u91a6tpdxrx9rhvtunp.jpg


----------



## maarten12100

Quote:


> Originally Posted by *maxmix65*
> 
> Another listing:
> http://uptiki.altervista.org/viewer.php?file=u91a6tpdxrx9rhvtunp.jpg


Why does that pic say DVI-D, D-SUB, HDMI?
D-SUB is VGA; if this card has a VGA port, I'll eat both my shoes.


----------



## guinner16

If my math is correct, that converts to about $1,100 USD without VAT. If that is the case, this will be a no-go for me. $1,000 is my limit, and only if I can get 2.


----------



## ChronoBodi

Quote:


> Originally Posted by *guinner16*
> 
> If my math is correct does that convert to about $1,100 USD without VAT. If that is the case this will be a no go for me. $1,000 is my limit, and only If i can get 2.


You live in Europe? Are they jacking up the $900 price?


----------



## Shiftstealth

Quote:


> Originally Posted by *ChronoBodi*
> 
> You live in Europe? Are they jacking up the $900 price?


IDK, $900 is my limit. Not spending $1k.

I'd rather buy a used GTX 690 from someone who got a Titan.


----------



## maarten12100

Quote:


> Originally Posted by *ChronoBodi*
> 
> You live in Europe? Are they jacking up the $900 price?


I do, but it will most likely be euro price = dollar price, as we always pay a tad more for the same stuff.
$900 is a very fair price.


----------



## Dart06

So quick question:

If I already have a 2GB GTX 670 (got it with the evga step up when it released) would it be better to just get another one of those?

I switch between a 2560x1440 IPS monitor and a BenQ XL2420T 120Hz monitor for games currently, but I don't know if 2GB of VRAM will be enough to justify getting another 2GB card. I COULD sell it off and buy two 4GB 670s, or sell it off and get a Titan regardless of its price.

One 670 just doesn't cut it for my needs.

Trying to find the best solution to this problem...


----------



## maarten12100

Quote:


> Originally Posted by *Dart06*
> 
> So quick question:
> 
> If I already have a 2GB GTX 670 (got it with the evga step up when it released) would it be better to just get another one of those?
> 
> I switch between a 2560x1440 ips monitor and a benq xl2420t 120hz monitor for games currently but I don't know if 2GB of VRAM will be enough yet to justify getting another 2GB card. I COULD sell it off and buy two 670 4GB cards or sell it off and get a Titan regardless of it's price.
> 
> One 670 just doesn't cut it for my needs.
> 
> Trying to find the best solution to this problem...


The Titan will make sure you don't have SLI problems; you'll also have enough power and VRAM for everything, unless you go 3-way 1600p surround.


----------



## Dart06

Quote:


> Originally Posted by *maarten12100*
> 
> The Titan will make sure you don't have sli problems also you have enough power and vram for everything unless you go 3 way 1600p surround.


So you think I should try to secure a Titan when it comes out?

I've been thinking about it.


----------



## Shiftstealth

Quote:


> Originally Posted by *Dart06*
> 
> So you think I should try to secure a Titan when it comes out?
> 
> I've been thinking about it.


A Titan might squeeze out 110 fps in BF3, but in Crysis 3 and BF4, not so sure.

EDIT: It is also generally accepted that GTX 670s or 680s in SLI will be more powerful than a Titan.


----------



## Cloudfire777

EK Water Blocks is running a poll asking which water block they should make for the Titan:
http://thinkcell.ekwb.com/idea/new-full-cover-block-design---choose-your-best


----------



## Domino

Quote:


> Originally Posted by *Qu1ckset*
> 
> Ugh its going to be hard not to replace my gtx690 with this, im loving all that vram!


downgrade! downgrade! downgrade! downgrade! doooooowngraaaaaaade!!!!!


----------



## maarten12100

Quote:


> Originally Posted by *Dart06*
> 
> So you think I should try to secure a Titan when it comes out?
> 
> I've been thinking about it.


I think it is the best way to go if you need the power.
This is mostly because I dislike the hassle of SLI configs and the fact that it sometimes doesn't work at all or gives you less performance.


----------



## Shiftstealth

Quote:


> Originally Posted by *Domino*
> 
> downgrade! downgrade! downgrade! downgrade! doooooowngraaaaaaade!!!!!


Pffft lol, my plan is if I can't catch a Titan, I'm grabbing a used 690.


----------



## Avonosac

Quote:


> Originally Posted by *Domino*
> 
> downgrade! downgrade! downgrade! downgrade! doooooowngraaaaaaade!!!!!


Unless you're running surround 1440p or higher :/


----------



## Dart06

Quote:


> Originally Posted by *Shiftstealth*
> 
> A titan might squeeze out 110 fps in bf3, but in crysis 3 and bf4, not so sure.
> 
> EDIT: It also has been accepted that SLi GTX 670 or 680's will be more powerful than a titan.


Sure, 670s in SLI would be more powerful, but at 2560x1440 I don't know if 2GB of VRAM will do much and/or bottleneck if I want to have AA on.

If I do decide to run 670s, I would end up buying 2 of the 4GB versions and selling off my 2GB card. I'm trying to decide what route to take and it sucks. :/

Quote:


> Originally Posted by *maarten12100*
> 
> I think it is the best way too go if you need the power.
> This is mostly because I dislike the hassle with sli configs and the fact it sometime doesn't work at all or give you less performance.


I've SLI'd 2.5GB 570s before, and the only problem I had was League of Legends; I had to disable SLI every time I played that game. I'd much like to avoid SLI, but it depends on whichever route is better for me overall.


----------



## rcfc89

Quote:


> Originally Posted by *Shiftstealth*
> 
> Pffft lol, my plan is if i cant catch a titan i'm grabbing a used 690


Mine will hit the market if I can snag a pair of Titans. It's only 7 weeks old.


----------



## guinner16

Quote:


> Originally Posted by *ChronoBodi*
> 
> You live in Europe? Are they jacking up the $900 price?


No, I am in the US. I believe that site had it listed in krona. I just did the conversion and subtracted 20% for VAT, which came out to about $1,100 USD. Wasn't the ASUS 690 a couple hundred dollars more than the EVGA? It could be that the ASUS Titan will be $1,100, and the EVGA $900 or $1,000. Something tells me this card will be priced just below the 690 at $900. The price will be justified by the 6GB of VRAM, and the less-than-690 price will be justified by the lower performance than a 690.
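For what it's worth, the kind of conversion guinner16 describes works out like this. The listed price and exchange rate below are made-up placeholders, not the actual listing; the one subtlety is that VAT is charged on top of the base price, so you remove it by dividing by 1.20 rather than subtracting 20% of the sticker price:

```python
# Illustrative numbers only -- not the real listing.
listed_eur = 1050.00   # hypothetical VAT-inclusive EU shelf price
eur_to_usd = 1.33      # rough early-2013 EUR->USD rate (assumption)

# VAT sits on top of the base price, so divide by 1.20 to strip it
# instead of subtracting 20% of the listed price.
ex_vat_eur = listed_eur / 1.20
usd_equivalent = ex_vat_eur * eur_to_usd

print(f"ex-VAT: EUR {ex_vat_eur:.2f} -> about USD {usd_equivalent:.2f}")
# -> ex-VAT: EUR 875.00 -> about USD 1163.75
```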


----------



## Cloudfire777

You can't do that. Finnish retailers pay more for hardware than US retailers, meaning that when you remove 20% VAT, it's still more expensive than the US price.


----------



## guinner16

Quote:


> Originally Posted by *Cloudfire777*
> 
> You can`t do that. Finnish retailers pay more for hardware than US retailers. Meaning that when you remove 20% VAT, its still more expensive than the US price


I understand that; however, it still gives us a pretty good idea of the high side. I think it is safe to assume this card will cost no less than $899 and no more than $1,099. That website could also have some fluff built in for extra profit.

Edit: I am on my crappy work monitor and just noticed it is in euros anyway.


----------



## Cloudfire777

Yup, probably close to the real price. I'm also guessing $799-899.


----------



## NCSUZoSo

I'll stick with the rumors of $800-$1000; I honestly doubt they charge over $1000, because it can't beat the 690 across the board. But in a game/app that doesn't play well with SLI, the Titan is going to be the fastest consumer GPU in the world.

Pricing is always a hard thing to write about; even when we have solid info at VC, the purchase cost of the card is always a little different from what we posted. This is due to retailers, like Newegg, profiting off of extremely popular or limited releases. Price gouging will be at its finest with the Titan release, just watch. When quantity goes down, prices are going to skyrocket; it's basic economics, the supply and demand curve.


----------



## Shiftstealth

Quote:


> Originally Posted by *NCSUZoSo*
> 
> I'll stick with the rumors of $800-$1000, I honestly doubt they charge over $1000 because it can't beat the 690 across the board. Now in a game/app that doesn't play well with SLI, Titan is going to be the fastest consumer GPU in the world.
> 
> Pricing is always a hard thing to write about, even when we have solid info at VC the purchase cost of the card is always a little different than what we posted. This is due to retailers, like NewEgg, profiting off of either extremely popular/limited releases. Price gouging will be at its finest with the Titan release, just watch. When quantity goes down, prices are going to sky rocket, it is basic economics, supply and demand curve.


WHERES YOUR SAMPLE I WANT A REVIEW!


----------



## NCSUZoSo

Lol, help us get a hold of a Titan. I can't believe we are having such trouble even though we are willing to pay for it.


----------



## guinner16

Quote:


> Originally Posted by *Cloudfire777*
> 
> Yups, probably close to the real price. I`m also guessing $799-899


If it is $799 I might be able to get two without my wife killing me.


----------



## rationalthinking

Quote:


> Originally Posted by *guinner16*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ChronoBodi*
> 
> You live in Europe? Are they jacking up the $900 price?
> 
> 
> 
> No, I am in the US. I believe that site had it listed in Krona. I just did the conversion and subtraced 20% for VAT, which came out to about $1,100 USD. Wasn't the ASUS 690 a couple hundred dollars more than the EVGA. It woulc be the ASUS Titan will be $1,100, and the EVGA $900 or $1,000. Somthing tells me this card will be priced just below the 690 at $900. The price will be justified by the 6GB VRAM, and the less than 690 price will be justified by the lower performance than a 690.
Click to expand...

No, the EVGA and ASUS were at the same price point. They are both reference cards.


----------



## maarten12100

Quote:


> Originally Posted by *rationalthinking*
> 
> No the EVGA and Asus were the same price point. They are both reference cards.


The EVGA one might have a sticker.
Deal with it!


----------



## NCSUZoSo

Quote:


> Originally Posted by *Cloudfire777*
> 
> EKWaterBlocks is having a poll where they ask which Waterblock they are gonna make for the Titan
> http://thinkcell.ekwb.com/idea/new-full-cover-block-design---choose-your-best


Just wrote this: http://videocardz.com/39590/ekwb-preparing-a-water-block-for-geforce-gtx-titan

Quote:


> Originally Posted by *rationalthinking*
> 
> No the EVGA and Asus were the same price point. They are both reference cards.


Differences in warranties could slightly affect the prices.


----------



## Rezard

Am I the only one who sees the Titan as "meh"?

I guess I care more about pixel-pushing power and bandwidth than most.
It's a great jump over the HD 7900s in textures,
but just a modest upgrade over one otherwise.

I'd rather save a TON of money and go with a GTX 770 or 780.
They still top the 7900s' specs, and they're right around the corner, right?

I tell ya--
It's those GTX 880 specs in the image posted on this thread --
*They* got me drooooooolin'....

640 GB/s bandwidth...!?
....!
......!
.........!!!!!


----------



## supermi

Quote:


> Originally Posted by *Rezard*
> 
> Am I the only one who sees the Titan as "meh"?
> 
> I guess I care more about pixel pushing power and bandwidth than most.
> It's a great jump over the HD 7900s in textures,
> but just a modest upgrade over one, otherwise.
> 
> I'd rather save a TON of money and go with those GTX 770 or 780.
> They still top the 7900s specs, and are right around the corner, right?
> 
> I tell ya--
> It's those GTX 880 specs in the image posted on this thread --
> *They* got me drooooooolin'....
> 
> 640 GB/s bandwidth...!?
> ....!
> ......!
> .........!!!!!


I am on the same page as you!!!!

Though I am still looking at the Titans, as my 4x 680 Classifieds seem to struggle with MSAA in BF3 and MSAA or SMAA in the C3 beta, which SUCKS!!!! I was hoping the extra bandwidth would help with that!

How is your system running those games?

And YES, Maxwell 880, OH I am getting those!!!!! Titan or not LOL


----------



## NCSUZoSo

If right around the corner is 7-8 months, sure.


----------



## Rezard

Quote:


> Originally Posted by *NCSUZoSo*
> 
> If right around the corner is 7-8 months, sure.


The posted image says April 2013, bro...


----------



## maarten12100

Quote:


> Originally Posted by *Rezard*
> 
> The posted image says April 2013, bro...


It was pushed to next year according to most rumors, bro.
However, I think Q3 or Q4 of this year would be more likely.


----------



## hifibuff

Unfortunately for us, it seems increasingly likely that nVidia will wait for AMD, who recently said: "nothing new until Q4 at best". And for those who won't be able to get their hands on the Titans, well... let's just say it will feel like a loooong year


----------



## maarten12100

Quote:


> Originally Posted by *hifibuff*
> 
> Unfortunately for us, it seems increasingly likely that nVidia will wait AMD who recently said : "nothing new until Q4 at best". And for those who won't be able to get their hands on the Titans, well...let's just say, it will feel like a loooong year


Well, Ivy-E is coming this year, and so is the Brickland platform with Ivy-EX. There will also be Haswell, which will be POOP for the low end: it uses less power while not gaining any performance. Just great, guys.


----------



## sherlock

Quote:


> Originally Posted by *maarten12100*
> 
> Well Ivy-E is coming this year and so is the brickland platform with ivy-EX also there will be *Haswell which will be POOP* for the low end it uses less power while not gaining any performance just great guys.


POOP = Protect Our Only Planet

[/sarcasm]


----------



## driftingforlife

Quote:


> Originally Posted by *maarten12100*
> 
> Well Ivy-E is coming this year and so is the brickland platform with ivy-EX also there will be Haswell which will be POOP for the low end it uses less power while not gaining any performance just great guys.


How do you know there is no more performance gain?


----------



## hifibuff

Quote:


> Originally Posted by *maarten12100*
> 
> Well *Ivy-E* is coming this year and so is the brickland platform with ivy-EX also there will be Haswell which will be POOP for the low end it uses less power while not gaining any performance just great guys.


Now, this is something I'm looking forward to, and which hopefully won't be Titan-like difficult to get at launch.


----------



## Avonosac

Quote:


> Originally Posted by *Rezard*
> 
> The posted image says April 2013, bro...


880 was 2014, not 2013.
Quote:


> Originally Posted by *hifibuff*
> 
> Unfortunately for us, it seems increasingly likely that nVidia will wait AMD who recently said : "nothing new until Q4 at best". And for those who won't be able to get their hands on the Titans, well...let's just say, it will feel like a loooong year


Hi, welcome to OCN. While AMD will not have a new architecture until Q4 of this year, Nvidia is still most likely to release their 700 series cards in Q2 of this year, April/May. There has only been one article which hints at Nvidia delaying their release, and nothing else. With their slated release for Maxwell, they would be crazy to release two series of cards within less than a year of each other.


----------



## cookiesowns

Does anyone know where to find that website scanner program that someone on the EVGA forums coded and released as open source? I had it before, around the Kepler launch, but can't seem to find it anymore.

Really excited for the Titan, and most likely will get one, or maybe more!


----------



## xoleras

Quote:


> Originally Posted by *driftingforlife*
> 
> How do you know there is no more performance gain?


The performance differences between the Sandy Bridge and Ivy Bridge microarchitectures are minimal. It's all about efficiency and graphics performance; that's all Intel is concerned about improving, as they use this for their mobile platforms. This will apply to IB-E as well.

That said, IB is possibly a 10% IPC increase over SB. Still insignificant, though.


----------



## xoleras

Quote:


> Originally Posted by *Avonosac*
> 
> 880 was 2014, not 2013.
> Hi welcome to OCN. While AMD will not have a new architecture until Q4 of this year, NVidia is still most likely to release their 700 series cards in Q2 of this year, April / May. There has only been one article which hints at NVidia delaying their release, and nothing else. With there slotted release for Maxwell, they would be crazy to release two series of cards within less than a year of each other.


Nvidia has already stated Q4 for new GTX 700 products.

Titan is an outlier that is being made in ridiculously small quantities. If you want more than a GTX 600, you'd better F5 Newegg on release.


----------



## Avonosac

Quote:


> Originally Posted by *xoleras*
> 
> Did you make this up out of thin air? Apparently you did because nvidia already stated Q4 for new GTX 700 products.
> 
> Titan is an outlier that is being made in ridiculously small quantities. If you want more than GTX 600 better F5 newegg on release.


Source? I've been trawling the hardware threads and I've seen nothing but the Sweclockers rumor article about Nvidia saying Q4. I, on the other hand, am using this well-respected and so far (recently) fairly accurate page as my source for Q2.


----------



## sherlock

Quote:


> Originally Posted by *xoleras*
> 
> The performance difference between the sandy bridge and ivy bridge uarchitecture are minimal. *It's all about efficiency and graphics performance,* that's all intel is concerned about improving - as they use this for their mobile platforms. This will apply to IB-E as well.
> 
> That said, IB is possibly 10% IPC increase over SB. Still insignificant , though.


Intel is not concerned about graphics performance for the server platform imo, otherwise SB-E/EP would have had an IGP. Without an IGP and with a 22nm die shrink, the question arises of what to do with all the extra transistors.

With a die shrink you can get a sizeable increase in transistor count (400 million from SB to Ivy, spent mostly on the IGPU), so I wouldn't rule out a core count increase for Ivy-E. A 22nm die shrink would allow Ivy-EP to be a native 10/12 core chip, which gives reason to believe the rumor of an 8 core i7 for Ivy-E or a 6 core 3820 successor.

Ivy without the IGPU and with a 95 W TDP could easily have been a native 6 core. Blow that Ivy die shot (minus the IGPU) up to a 125 W TDP and LGA2011 size, and a 6 core 3820 successor is probably attainable.


----------



## formula m

So.. what is the prevailing thought here, how much moAr of an improvement is the Titan going to be, over an HD7970..?

15%~20%..?


----------



## Alatar

Quote:


> Originally Posted by *formula m*
> 
> So.. what is the prevailing thought here, how much moAr of an improvement is the Titan going to be, over an HD7970..?
> 
> _15%~20%..?_


Most likely considerably more.


----------



## Astral Fly

Quote:


> Originally Posted by *xoleras*
> 
> Nvidia already stated Q4 for new GTX 700 products.


You're just making stuff up in your own mind here. In the future it might be a good idea to remind yourself that you need sources when you post claims like that and stop yourself before you hit the 'submit' button.

They might wait till the end of the year with the 700 series, but I hope not because I plan to get a gtx 770.


----------



## maarten12100

Quote:


> Originally Posted by *driftingforlife*
> 
> How do you know there is no more performance gain?


I have looked through the benchmarks comparing Haswell to Ivy.
Haswell will be energy efficient but not faster at the same clocks as Ivy Bridge.


----------



## cookiesowns

Aren't most of the performance figures for Haswell leaked already anyway? Plus, most of those figures were produced with ES chips, outdated BIOSes and ME firmware.


----------



## chaogui

So who here has managed to recruit their own F5 squad?


----------



## XxOsurfer3xX

Can't wait. If there are not going to be alternatives from AMD until the end of the year, I might pull the trigger on one of these, if it doesn't go over 1000 bucks.


----------



## maarten12100

Quote:


> Originally Posted by *cookiesowns*
> 
> Aren't most of the performance figures of Haswell leaked anyways? Plus, most of those figures are mostly done with ES, outdated BIOS and ME.


Different microcode won't make it perform better; also, an ES can be benched as the real deal simply by evaluating performance clock for clock.


----------



## zalbard

Quote:


> Originally Posted by *maarten12100*
> 
> Haswell will be energy efficient but not faster @samespeeds as ivybridge.


False.
It will be around 10% faster at the same clock speed.


----------



## driftingforlife

Quote:


> Originally Posted by *maarten12100*
> 
> I have looked trough the benchmarks comparing haswell to ivy.
> Haswell will be energy efficient but not faster @samespeeds as ivybridge.


This is wrong.


----------



## villain

Newer articles state the Titan could cost $1200. Not unlikely when there are only 10000 of these cards.


----------



## maarten12100

Quote:


> Originally Posted by *zalbard*
> 
> False.
> It will be around 10% faster at the same clock speed.


No, the IPC improvements were ~10%, but they shrank the die (transistor count), hence the lower power consumption.

http://nl.hardware.info/nieuws/33187/eerste-benchmarks-van-intel-haswell-duiken-op
It is in Dutch, but the benches speak for themselves.
Quote:


> Originally Posted by *driftingforlife*
> 
> This is wrong.


It is pretty common sense, as the low end consumer (read: notebook user) wants decent performance with as low power consumption as possible.
They decided to take it all the way to the mid end; the high end, however, will still be performance based, as the TDP is set between 70 and 150 W.


----------



## Swolern

Quote:


> Originally Posted by *villain*
> 
> Newer articles state the Titan could cost $1200. Not unlikely when there are only 10000 of these cards.


I doubt it. Nvidia is in the business to make money, but they don't screw their customers over, just because they can. I believe the Titan's price will be related to performance when compared to the GTX 690 and its $1000 price point. Now I can already see price gouging by 3rd party sellers that could go up to $1200-1400 USD.


----------



## altereDad

I would probably guess it to be the same price as (or greater than) the ARES II AMD GPU. It's selling for over $1,600 US.


----------



## Alatar

Quote:


> Originally Posted by *altereDad*
> 
> I would probably guess it to be at the same price (or greater) than the ARES II AMD GPU. It's selling for over $1,600 US


Not going to happen. The only listing that priced it that high came from Australia.


----------



## maarten12100

Quote:


> Originally Posted by *altereDad*
> 
> I would probably guess it to be at the same price (or greater) than the ARES II AMD GPU. It's selling for over $1,600 US


1000 euro or lower is my estimate; 1.6k dollars would just be absurd, as that is half the price of a Tesla, which you could get right now.
I know the Teslas don't have display out connectors, but a simple secondary card will fix that.


----------



## maarten12100

Btw AMD said 8xxx series will be launched by the end of this year.
http://www.pcworld.com/article/2028408/amd-to-release-new-radeon-hd-8000-graphics-cards-in-2013.html

Nvidia will most likely also wait and take it slowly, so if you want to upgrade and you're into ultra high end this may be a very viable option.


----------



## xoleras

Quote:


> Originally Posted by *Astral Fly*
> 
> You're just making stuff up in your own mind here. In the future it might be a good idea to remind yourself that you need sources when you post claims like that and stop yourself before you hit the 'submit' button.
> 
> They might wait till the end of the year with the 700 series, but I hope not because I plan to get a gtx 770.


I already posted this, smart guy:

http://www.tomshardware.com/news/Radeon-GeForce-Delay-GPU-Next-Generation,20838.html


----------



## sherlock

Quote:


> Originally Posted by *maarten12100*
> 
> No the ipc improvements were 10% but they shrank the die(transistor count) > lower power consumption.
> 
> http://nl.hardware.info/nieuws/33187/eerste-benchmarks-van-intel-haswell-duiken-op
> It is in Dutch but the benches speak for themselves.
> It is pretty common sense as the low end consumer read notebook user want decent performance with as low power consumption as possible.
> They decided to take it all the way too the mid end however the high end will still be performance based as the tdp is set between 70 and 150W


Just FYI, you are contradicting driftingforlife, who has a Haswell ES.


----------



## maarten12100

Quote:


> Originally Posted by *sherlock*
> 
> Just FYI you are contradicting Driftingforlife who have a Haswell ES. We already have Haswell benchmarks showing IPC gains from 5-


That all rather depends on whether he has a high end chip, say socket 2011 (2011B, the new socket), or a mid end chip. If he has a mid end one he is incorrect; if he has a high end chip he isn't.

On the other hand, it would be quite weird for him to have a high end Haswell, as Ivy-E and Ivy-EX are being tested as we speak.


----------



## sherlock

Quote:


> Originally Posted by *maarten12100*
> 
> That all rather depends whether he has a high end chip say socket 2011(2011B the new socket) or he has a mid end chip.
> If he has a mid end one he is incorrect if he has a high end chip he isn't.
> 
> On the other hand it would be quite weird for him to have a high end Haswell as Ivy-e and Ivy-ex is being tested as we speak.


You can't get socket 2011 Haswell-E ES chips outside an Intel lab right now (those are slated for 2014 release at the earliest); all the ES out there of that type are Ivy-E. He definitely has an LGA 1150 ES chip.

So you believe you are correct and he is wrong, despite him having a Haswell ES? Haswell benches so far have shown IPC gains of 5-10% (the transistor count decrease you are claiming would certainly reduce IPC), and we already have leaked 4670K/4770K clocks showing they have the same clocks as Ivy (3.4/3.8 for the i5 K, 3.5/3.9 for the i7 K), so I don't see where you get the performance-not-increasing stuff from.


----------



## maarten12100

Quote:


> Originally Posted by *sherlock*
> 
> You can't get Socket 2011 Haswell-E ES chips right now(as those are slated for 2014 release at the earliest), all the ES out there of that type are Ivy-E. He definitely have a LGA 1150 ES chip.
> 
> So you believe you are correct and he is wrong despite him having a Haswell ES? Haswell benches so far have shown IPC gains 5-10% increase and we already have leaked 4670K/4770K clocks showing they have the same clocks as Ivy(3.4/3.8 for i5K, 3.5/3.9 for i7K) so I don't see where you get the performance not increasing stuff from.


It depends on how he compares performance with the current gen: if he clocks it up until power consumption is equal, the Haswell would of course perform way better. But clock for clock, it seems not so much.

If he has one, I can't do anything but assume either my source was wrong or the benching was done differently. He can't leak specs, but he can leak pictures of the proc itself (not that I have a reason to look at an IHS that says Intel Confidential or qa96).


----------



## sherlock

Quote:


> Originally Posted by *maarten12100*
> 
> It depends on how he comaperes performance with current gen if he clock it up until consumption is equal the Haswell would perform way better of course.
> But clock per clock it seems *not so much.*
> 
> If he has one I can't do anything but assume either my source was wrong or the benching was done different.
> He can't leak specs but he can leak pictures of the proc itself (not that I have reason to look at a IHS that says Intel Confidential or qa96 )


The 5-10% IPC increase shown so far in leaked benches seems in line with the predictions I've seen from sites like Anandtech (5-15% depending on application), and Haswell i5 K/i7 K TDP is actually higher than Ivy's (84 W vs 77 W) while their clock speeds are the same (this leaked last year) for the i5 4670K (vs 3570K) and i7 4770K (vs 3770K).


----------



## maarten12100

Quote:


> Originally Posted by *sherlock*
> 
> 5-10% IPC increase shown so far in leaked benches seem in line from what I seen from predictions made by sites like Anandtech(5-15% depend on application), and Haswell i5K/i7K TDP is actually higher than Ivy(84W vs 77W) and their clock speed is the same(this leaked last year) for i5 4670K(vs3570K)/i74770K (vs 4670K).


Higher TDP and almost no increase in performance? It had better use less power at idle, otherwise mid end Haswell is pretty much a bust.
I feel lucky I won't touch mid end Haswell anyway


----------



## whitingnick

But can it handle Crysis??


----------



## sherlock

Quote:


> Originally Posted by *maarten12100*
> 
> Higher tdp and almost no increase in performance it better use less power idle, otherwise mid end Haswell is pretty much a bust.
> I feel so lucky I won't touch the mid end Haswell anyway


Idle power saving is one of the top 3 main focuses of Haswell; you must not have read any of the Anandtech Haswell articles. Plus, that TDP increase is mainly from a more powerful IGPU (disable that on Ivy and the TDP drops 10 W+, could be much more for Haswell) and the integrated VRM (which helps power saving).


----------



## HYPERDRIVE

I would like to ask a realistic question.
Let's consider single GPU numbers only, without dual GPU solutions, for guys who need power for just a single screen.
Considering what we know, the Titan will be about 70-85% faster than the current top end GPU, the GTX 680.
We speculate that the GTX 780 will bring maybe a 15-20% gain over the GTX 680.
Would whatever comes after the 780, the 880s, be able to match the raw power and speed of the Titan?
Even if it did, that will be a long time from now, probably 1.5-2 years at the soonest, at the typical price range of 500-600 for the top end model. If the 880s will "just" match the power of the Titan, then having had that same power for the past 2 years, for an additional 300 bucks over the typical high end price of 500-600, is not really all that bad, considering you will have a really "futureproof" card that might last you at least 3-5 years.
That is why I'm really considering getting one: future tech is slowing down, and power gains seem to be merely incremental each generation.
Let's not put AMD's offerings out of the question either; I'm only looking for the best and fastest at a given time, within a reasonable price range.
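To put some numbers on that premium, here's a quick sketch using the post's own figures (a rumored ~$900 Titan vs a typical $600 flagship, kept for 3-5 years — all speculation from this thread, nothing official):

```python
# Amortized cost of the Titan's rumored price premium over a typical
# high-end card, per the speculation above.
premium = 900 - 600  # rumored Titan price minus typical flagship price, dollars

for years in (3, 5):
    # Spread the extra up-front cost over the card's assumed useful life.
    print(f"{years} years: ${premium / years:.0f}/year extra")
# prints:
# 3 years: $100/year extra
# 5 years: $60/year extra
```

So the "futureproofing" argument boils down to whether $60-100 a year is worth holding the single-GPU crown for the card's whole lifespan.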


----------



## Boomstick777

Quote:


> Originally Posted by *Swolern*
> 
> Nvidia is in the business to make money, but they don't screw their customers over, just because they can.


Hahahahahahahahahahaha

Best comment ever.


----------



## furyn9

Quote:


> Originally Posted by *Boomstick777*
> 
> Hahahahahahahahahahaha
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Best comment ever
> 
> 
> 
> 
> 
> 
> 
> .


----------



## Brianmz

^ lol, indeed. Besides, the price range is pure speculation at this point, and we'll find out soon enough. While I do plan to get 2-3 of these cards if possible, I think $1000 is overpriced for the rumoured 60-80% of a GTX 690, which is almost 1 year old; I would think a price range of $699-799 is fair. Either way I will get them, as I need an upgrade soon and they are the best; $1000 will just leave a bad taste in my mouth as a loyal Nvidia customer of the past 9 years.


----------



## armando666

Quote:


> Originally Posted by *Swolern*
> 
> I doubt it. Nvidia is in the business to make money, but they don't screw their customers over, just because they can. I believe the Titan's price will be related to performance when compared to the GTX 690 and its $1000 price point. Now I can already see price gouging by 3rd party sellers that could go up to $1200-1400 USD.


I would, if I could, were I Nvidia. "Screwing" is a subjective nuance as seen through the eyes of the beholder (buyer). But they would not screw the customers, because they can't: if the customers saw it as a "screwing", they would vote with their proverbial feet by walking away, rather than with their wallets by buying it. The forces of the free market are a beautiful thing, even in an oligopolistic market structure.


----------



## maarten12100

Quote:


> Originally Posted by *Brianmz*
> 
> ^ lol Indeed, besides price range is just pure speculation at this point, plus we'll find out soon enough, While i do plan to get 2-3 of these cards if possible, I think at 1000$ it's overpriced for the rumoured increase of 60-80% of a gtx 690 which is almost 1 year old, I would think a price range of 699-799$ is fair. Either way I will get them as I need an upgrade soon and they are the best, just 1000$ will leave a bad taste in my mouth as a loyal nvidia consumer for the past 9 years.


If this goes for 800 dollar / 800 euro I'm gonna eat one of my socks


----------



## Astral Fly

Quote:


> Originally Posted by *xoleras*
> 
> I already posted this smart guy
> 
> http://www.tomshardware.com/news/Radeon-GeForce-Delay-GPU-Next-Generation,20838.html


So Sweclockers is now officially speaking for Nvidia? That is just a rumor. Nowhere has Nvidia stated that the 700 series is coming in Q4.


----------



## brandonb21

Linus Tech Tips confirmed it on his live stream, under "something is in the mail"


----------



## NCSUZoSo

Quote:


> Originally Posted by *villain*
> 
> Newer articles state the Titan could cost $1200. Not unlikely when there are only 10000 of these cards.


Source on quantity being produced?


----------



## maarten12100

Quote:


> Originally Posted by *NCSUZoSo*
> 
> Source on quantity being produced?


Videocardz said it I guess.
10k is a pretty grand number


----------



## invincible20xx

Quote:


> Originally Posted by *brandonb21*
> 
> linus tech tips confirmed it on his live stream under something is in the mail


well i would trust a guy like linus if he confirmed it


----------



## badrapper

So this is being done to take the single gpu crown away from AMD, and if only 10k are being produced, i.e. nvidia will earn $10k from it all... does that even count with a card that isn't mainstream and is rare?

This is just nvidia getting desperate over the dwindling sales of the overpriced 6 series.. with nothing set in stone, I suppose we will find out soon what this drama is all about.


----------



## maarten12100

Quote:


> Originally Posted by *badrapper*
> 
> So this is being done to take single gpu crown away from AMD, and if only 10k are being produce ie nvidia will earns $10k from it all... does that even count with a card that isn't mainstream and rare?
> 
> This is just nvidia getting desperate from the dwindling sales of the over priced 6 series.. with nothing being set in stone, i suppose we will find out soon what this drama is all about.


They're marketing chips they couldn't sell in their Teslas, so almost the whole price is profit, as the PCB and components cost little compared to the rest of the card; let's say 800 dollars of profit per card.
10,000 of those would make 8 million dollars of extra profit for something they would otherwise put to waste.

Seems like a great deal for them and a great deal for us.
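As a quick sanity check on those rumored numbers (the $800-per-card margin and the 10,000-unit run are both speculation from this thread, not anything official):

```python
# Back-of-the-envelope profit estimate from the rumored figures above.
cards = 10_000          # rumored production run
profit_per_card = 800   # assumed margin in dollars (pure speculation)

total = cards * profit_per_card
print(f"${total:,} total, i.e. ${total / 1e6:.1f} million")
# prints: $8,000,000 total, i.e. $8.0 million
```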


----------



## Avonosac

Quote:


> Originally Posted by *badrapper*
> 
> So this is being done to take single gpu crown away from AMD, and if only 10k are being produce ie nvidia will earns $10k from it all... does that even count with a card that isn't mainstream and rare?
> 
> This is just nvidia getting desperate from the dwindling sales of the over priced 6 series.. with nothing being set in stone, i suppose we will find out soon what this drama is all about.


How do you draw the conclusion that this is them being desperate?


----------



## maarten12100

Quote:


> Originally Posted by *Avonosac*
> 
> How do you draw the conclusion that this is them being desperate?


Indeed, this is them being on top of their game, since the result is "Nvidia owns the GPU crown, Nvidia makes the best gpus in the world" (which is what a regular person thinks).
So the regular person goes to the store and buys a junk low end card: "yeah, nvidia is the best", blah blah.
AMD, however, stated they might counter it if they find it necessary; I don't yet see how.


----------



## NCSUZoSo

Quote:


> Originally Posted by *maarten12100*
> 
> Videocardz said it I guess.
> 10k is a pretty grand number


K, I was wondering if any other site has released that info after we updated our article with the info.

Quote:


> Originally Posted by *maarten12100*
> 
> Indeed this is them being on top of their game since the result is Nvidia owns crown gpu Nvidia makes the best gpus in the world (that is what a regular person thinks)
> So regular person goes to the store and buys a junk low end card yeah nvidia is the best blah blah.
> Amd however stated they might counter it if they find it necessary however I yet don't see how.


AMD's only counter is a true reference release of the 7990 and that is two GPUs fighting one. That would still give NVIDIA the single GPU crown. AMD has no counter here, even their 8000 Series in Q4 will most likely not get close to Titan.


----------



## Forceman

Quote:


> Originally Posted by *badrapper*
> 
> So this is being done to take single gpu crown away from AMD, and if only 10k are being produce ie nvidia will earns $10k from it all... does that even count with a card that isn't mainstream and rare?
> 
> This is just nvidia getting desperate from the dwindling sales of the over priced 6 series.. with nothing being set in stone, i suppose we will find out soon what this drama is all about.


So Nvidia is only making $1 a card on them?


----------



## Alatar

Quote:


> Originally Posted by *NCSUZoSo*
> 
> K, I was wondering if any other site has released that info after we updated our article with the info.
> 
> AMD's only counter is a true release of the 7990 and that is two GPUs fighting one. That would still give NVIDIA the single GPU crown. AMD has no counter here, even their 8000 Series will most likely not get close to Titan.


xbitlabs mentioned the 10K number: http://www.xbitlabs.com/news/graphics/display/20130213151212_Nvidia_to_Speed_Up_Launch_of_GeForce_GTX_Titan_Reports.html


----------



## NCSUZoSo

Quote:


> Originally Posted by *Alatar*
> 
> xbitlabs mentioned the 10K number: http://www.xbitlabs.com/news/graphics/display/20130213151212_Nvidia_to_Speed_Up_Launch_of_GeForce_GTX_Titan_Reports.html


It's nice to see XBitLabs a full 22 hours behind us.

I was just wondering if anyone had revealed their source for the 10k number; we didn't, and for good reason.


----------



## Cloudfire777

AAAAAARRRRRGGGGHHHHHHHHHHH SO UNFAIR

1:04:15, start there: "Saturday delivery". Oh well, at least it's 100% confirmed now that this GPU is real and coming within a pretty short time


----------



## maarten12100

Quote:


> Originally Posted by *NCSUZoSo*
> 
> K, I was wondering if any other site has released that info after we updated our article with the info.
> AMD's only counter is a true reference release of the 7990 and that is two GPUs fighting one. That would still give NVIDIA the single GPU crown. AMD has no counter here, even their 8000 Series in Q4 will most likely not get close to Titan.


Roy Taylor said this:


Spoiler: Warning: Spoiler!



Code:

"We aren't afraid," says Roy Taylor, a corporate vice president and head of AMD's global channel sales. "We have new products. We have a roadmap. We are not sitting still, we do not lack resources, and we do not lack imagination. Let me be clear: The new products will be a new architecture."

When asked specifically about the Nvidia GeForce GTX Titan-a rumored top-of-the-line card based on Nvidia's powerful Tesla K20 offering , supposedly being released soon-Taylor said "We'll wait and see what Nvidia comes back with, and when it arrives we'll deal with it, but we believe we'll maintain leadership. I'm a firm believer in bringing back the old GPU wars. We're taking them on again."





They might have something on their hands after all or they are panicking as we speak to put a product together.


----------



## XxOsurfer3xX

Quote:


> Originally Posted by *maarten12100*
> 
> Roy Taylor said this:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Code:
> 
> 
> "We aren't afraid," says Roy Taylor, a corporate vice president and head of AMD's global channel sales. "We have new products. We have a roadmap. We are not sitting still, we do not lack resources, and we do not lack imagination. Let me be clear: The new products will be a new architecture."
> 
> When asked specifically about the Nvidia GeForce GTX Titan-a rumored top-of-the-line card based on Nvidia's powerful Tesla K20 offering , supposedly being released soon-Taylor said "We'll wait and see what Nvidia comes back with, and when it arrives we'll deal with it, but we believe we'll maintain leadership. I'm a firm believer in bringing back the old GPU wars. We're taking them on again."
> 
> 
> 
> 
> 
> They might have something on their hands after all or they are panicking as we speak to put a product together.


They are not nervous, because the Titan is going to be a niche product, and if Nvidia is coming out with this, it probably means the 700 series is not coming anytime soon.


----------



## Avonosac

Quote:


> Originally Posted by *maarten12100*
> 
> Indeed this is them being on top of their game since the result is Nvidia owns crown gpu Nvidia makes the best gpus in the world (that is what a regular person thinks)
> So regular person goes to the store and buys a junk low end card yeah nvidia is the best blah blah.
> Amd however stated they might counter it if they find it necessary however I yet don't see how.


I don't see how you draw this conclusion; Nvidia is raking in money with the 670/680s. They are doing quite well, and the marketing wars have little to do with a company's approach to business when they are selling things just fine.

Quote:


> Originally Posted by *maarten12100*
> 
> Roy Taylor said this:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Code:
> 
> 
> "We aren't afraid," says Roy Taylor, a corporate vice president and head of AMD's global channel sales. "We have new products. We have a roadmap. We are not sitting still, we do not lack resources, and we do not lack imagination. Let me be clear: The new products will be a new architecture."
> 
> When asked specifically about the Nvidia GeForce GTX Titan-a rumored top-of-the-line card based on Nvidia's powerful Tesla K20 offering , supposedly being released soon-Taylor said "We'll wait and see what Nvidia comes back with, and when it arrives we'll deal with it, but we believe we'll maintain leadership. I'm a firm believer in bringing back the old GPU wars. We're taking them on again."
> 
> 
> 
> 
> 
> They might have something on their hands after all or they are panicking as we speak to put a product together.


If it's true, and not just necessary product backing and support, then awesome. When GPU wars start, we win.
Quote:


> Originally Posted by *XxOsurfer3xX*
> 
> They are not nervous because titan is going to be a niche product, and if nvidia is coming out with this it probably means that the 700 series is not coming anytime soon.


There is absolutely no logical reason to draw that conclusion from this statement.


----------



## Cloudfire777

Great, pollute this thread with another one of the AMD vs Nvidia discussions.


----------



## Astral Fly

I just read an interesting rumor over on Xtremesystems. Maxwell on 28nm in Q4. We already know AMD is coming with a new architecture on 28nm in Q4. Nvidia might decide to skip Kepler refresh and do the same. It would fit with the Sweclockers rumor, wouldn't it? Perhaps TSMC is having problems with 20nm and it will be late 2014 before it is ready? Perhaps there is information out there that contradicts this, this is all speculation of course.


----------



## XxOsurfer3xX

Quote:


> Originally Posted by *Avonosac*
> 
> I don't see how you draw this conclusion. Nvidia is raking in the money with 670/680s. They are doing quite well; the marketing wars have little to do with a company's approach to business when they are selling things just fine.
> If it's true, and not just the necessary product backing and support, then awesome. When GPU wars start, we win.
> There is absolutely no logical reason to draw that conclusion from this statement.


I was just saying what I think: I don't think AMD is going to respond to Titan. I think they would respond if it were a 780, something affordable and without limited quantities.
Quote:


> Originally Posted by *Astral Fly*
> 
> I just read an interesting rumor over on Xtremesystems. Maxwell on 28nm in Q4. We already know AMD is coming with a new architecture on 28nm in Q4. Nvidia might decide to skip Kepler refresh and do the same. It would fit with the Sweclockers rumor, wouldn't it? Perhaps TSMC is having problems with 20nm and it will be late 2014 before it is ready? Perhaps there is information out there that contradicts this, this is all speculation of course.


Exactly what I'm saying.


----------



## mcg75

Quote:


> Originally Posted by *badrapper*
> 
> This is just Nvidia getting desperate from the dwindling sales of the overpriced 6 series. With nothing set in stone, I suppose we will find out soon what this drama is all about.


That's great logic. If 6 series sales are indeed tumbling (which they aren't), let's introduce a top-tier niche card to boost sales instead of a price drop.

Ok, let's go back to reality now. In Q3 2012, Nvidia raised its discrete GPU market share to 65% vs AMD's 35%. I have yet to find the result for Q4 2012, but in AMD's latest financial report, the graphics division was down $16 million in revenue for Q4 2012 vs Q3 2012.

I ROFL every time I hear someone say the 6 series is overpriced. Both AMD and Nvidia are businesses. If AMD could sell their product at Nvidia's prices, they would. They don't reduce prices because they feel sorry for us gamers and want to be nice guys; they drop prices because the product isn't moving.


----------



## driftingforlife

I have to put this carefully. The charts that were posted are not right. HS is faster (can't say by how much) but it's not down to just IPC.


----------



## Cloudfire777

The latest articles had very little substance:

If AMD releases their next architecture in Q4 2013, there is absolutely no point for Nvidia to release Kepler 2.0 at the same time. None. Zero. AMD's next architecture would destroy those cards. I personally believe the latest chart is legit where it said Kepler 2.0 is announced in March, one month after Titan. It's one year after the GTX 680, and it makes sense given AMD's next move.

AMD can't combat Titan. No chance, unless they harvested great architecture gains from GCN. The die will be too big and too hot for a single-GPU configuration. But I doubt they will even try, because A) they just said yesterday that the 7970 GHz will be their fastest GCN single GPU through all of 2013, and they would rather use their resources on getting the next architecture done, and B) $1000 GPUs are a niche within a niche, meaning they will let Nvidia have that market alone with Titan. AMD will combat this with a lower price for the 7970 and by pushing 7970 CF instead.

Not to mention a 7990 at a semi-low price. AMD said themselves that they would let the 7970 GHz be the fastest single GPU from GCN in 2013, but will still push out new dual-GPU cards from GCN. Go figure.


----------



## NCSUZoSo

Quote:


> Originally Posted by *maarten12100*
> 
> Roy Taylor said this:
> 
> 
> "We aren't afraid," says Roy Taylor, a corporate vice president and head of AMD's global channel sales. "We have new products. We have a roadmap. We are not sitting still, we do not lack resources, and we do not lack imagination. Let me be clear: The new products will be a new architecture."
> 
> When asked specifically about the Nvidia GeForce GTX Titan (a rumored top-of-the-line card based on Nvidia's powerful Tesla K20 offering, supposedly being released soon), Taylor said: "We'll wait and see what Nvidia comes back with, and when it arrives we'll deal with it, but we believe we'll maintain leadership. I'm a firm believer in bringing back the old GPU wars. We're taking them on again."
> 
> 
> 
> 
> 
> They might have something on their hands after all or they are panicking as we speak to put a product together.


AMD has no counter to Titan in a respectable time frame, I guess, is what I should have said. I don't believe the HD 8970 will be able to beat Titan, considering predictions say the 780 will be behind Titan by at least 25%, and that is what the 8970 is going to be sparring with (the 780).

Now about this Maxwell 28nm rumor, that could be very interesting if we see that in Q4. If that is the case I will be upgrading GPUs a lot quicker than I expected.


----------



## Cloudfire777

Quote:


> Originally Posted by *driftingforlife*
> 
> I have to put this carefully. The charts that were posted are not right. HS is faster (can't say by how much) but it's not down to just IPC.


What's HS? If Haswell, what charts?


----------



## driftingforlife

Yeah, Haswell. The charts maarten12100 posted a link to.


----------



## j3st3r

I am super excited to see how well this thing does. I have a 7950, but if these numbers are correct then this thing is a pure monster. I will be fine CPU-wise for quite some time, but the GPU side might change in a year or so.


----------



## Cloudfire777

Quote:


> Originally Posted by *driftingforlife*
> 
> Yea haswell. The charts maarten12100 posted a link to.


Ah yeah.

I'd say Intel will push out higher-clocked CPUs than Ivy Bridge, netting that extra speed plus the 10% extra IPC, which in total will be a big deal. Power consumption will be better because of the onboard voltage regulator and better management of idle/active phases.


----------



## maarten12100

Quote:


> Originally Posted by *Avonosac*
> 
> I don't see how you draw this conclusion. Nvidia is raking in the money with 670/680s. They are doing quite well; the marketing wars have little to do with a company's approach to business when they are selling things just fine.


Well, being able to say "we make the fastest GPUs in the world" will boost the lower end with normal consumers; by normal consumer I mean non-PC-enthusiasts.


----------



## maarten12100

Quote:


> Originally Posted by *driftingforlife*
> 
> I have to put this carefully. The charts that were posted are not right. HS is faster (can't say by how much) but it's not down to just IPC.


I'm really glad to hear that.
I know you can't say anything on performance or specs, just like that other guy I spoke to on this forum who would only give me photographs.
Lucky bastards









Haswell might be a future upgrade, unless of course I skip a generation; it all rather depends on whether it is worth it for me at that time.


----------



## driftingforlife

It could have been the motherboard they were using, or a different ES.


----------



## Cloudfire777

Quote:


> Originally Posted by *maarten12100*
> 
> Haswell might be a future upgrade unless I of course skip a generation all rather depends if it is worth it for me or not at that time.


Have fun waiting those 2-2.5 years for something worth upgrading over Haswell...


----------



## maarten12100

Quote:


> Originally Posted by *Cloudfire777*
> 
> Have fun waiting those 2-2.5 years for something worth upgrading over Haswell...


What makes you think that I have to wait, if my platform suffices? My new platform will last 3 gens, so I might skip a gen.
Even my dual-1366 build, which I built years ago, still holds up against the upper mid-range. (I'm just itching to upgrade, but I would be too late to go quad 2011; however, I can get it for cheap: 750 for the board and another 150 euro per proc.)
Quote:


> Originally Posted by *driftingforlife*
> 
> It could have been the motherboard they were useing or a different ES.


That could've been the case. Also, those benches aren't really that great; I would've rather seen Cinebench 11.5 (however, I won't be able to compare that with my new platform, as Cinebench only goes up to 64 threads).


----------



## Cloudfire777

You'd be a fool to skip Haswell only to aim for Broadwell. It's only a die shrink.

And 3 gens before upgrading? That sounds very boring


----------



## maarten12100

Quote:


> Originally Posted by *Cloudfire777*
> 
> You`d be a fool to skip Haswell only to aim for Broadwell. Its only a die shrink.
> 
> And 3 gens before upgrading? That sounds very boring


I'm on Nehalem; I'm bad and I should feel bad.
From Ivy-EX to Broadwell-EX will be the plan, so as soon as I buy the platform it can last me 3-4 years.
It is actually great price/performance, as it will be faster than any single socket for the coming 6+ years.


----------



## Cloudfire777

Quote:


> Originally Posted by *maarten12100*
> 
> I'm on Nehalem; I'm bad and I should feel bad.
> From Ivy-EX to Broadwell-EX will be the plan, so as soon as I buy the platform it can last me 3-4 years.
> It is actually great price/performance, as it will be faster than any single socket for the coming 6+ years.


Oh well, your choice. You could have gotten that Haswell-EX a year before the Broadwell-EX, and the only difference between the two will be like 5%.

I totally understand the price/performance thing, although I'm not like that at all. I buy new hardware each year, maybe twice a year


----------



## hifibuff

Since everyone is going OT: when's the IB-E due? Is there even a confirmed release date and, more importantly, how many cores will the most powerful model feature? I'm guessing 8, but I saw rumors not too long ago about a 10-core IB-E??


----------



## maarten12100

Quote:


> Originally Posted by *hifibuff*
> 
> Since everyone is going OT: when's the IB-E due? Is there
> even a confirmed release date and, more importantly, how many cores will the most powerful model feature? I'm guessing 8, but I saw rumors not too long ago about a 10-core IB-E??


We shouldn't get too off topic, but there isn't really that much to discuss as long as there are no new benches, pictures, or listings.
Release is set for 2013; it should have been released already, but Q2 is my guess.


----------



## sherlock

It is Q3 for IB-E according to this roadmap leak


----------



## maarten12100

Quote:


> Originally Posted by *sherlock*
> 
> It is Q3 for IB-E according to this roadmap leak


Sometimes I really hate my life.
Q3 means I will already be in university when it launches


----------



## Jaju123

Worth selling sli 680s for this card do you guys think? Would two used 680s come close to covering the cost anyway?

Sent from my Nexus 7 using Tapatalk HD


----------



## amstech

Quote:


> Originally Posted by *Jaju123*
> 
> Worth selling sli 680s for this card do you guys think?


Nope.


----------



## hifibuff

680s in SLI will outperform the Titan, since it won't match a 690, which itself is inferior to two 680s.
Coming from this setup, my advice is to go for at least 2 Titans if you want to notice any improvement.


----------



## dph314

Quote:


> Originally Posted by *Jaju123*
> 
> Worth selling sli 680s for this card do you guys think? Would two used 680s come close to covering the cost anyway?
> 
> Sent from my Nexus 7 using Tapatalk HD


If you want to get a good amount of the performance you have now, but within a single card. A good amount, but not all









You would still be paying more for less performance though (in games where SLI worked flawlessly for you). But you could always SLI Titans someday and blow those 680s right outta the water.


----------



## Ghoxt

All this Haswell talk in this specifically designated "only talk about Titan here" thread... And you guys have been here a while...


----------



## dph314

Yeah, don't get this thread closed. We need at least one of these open for tomorrow night, while we're all excited and waiting up for the first new thread around midnight with "[XXXXX] First _Official_ Titan Review" as the title


----------



## maarten12100

Quote:


> Originally Posted by *dph314*
> 
> Yeah, don't get this thread closed. We need at least one of these open for tomorrow night, while we're all excited and waiting up for the first new thread around midnight with "[XXXXX] First _Official_ Titan Review" as the title


This is going like a similar discussion on a different forum, which is currently 473 pages long and doesn't have a release date yet.
I heard it will be gimped for compute, which is a bit sad, since I was planning on folding with it when not gaming or using it in other ways.
I figured this would be great if not gimped: the Tesla K20X is 6x faster than a GTX 580, and a GTX 580 does 20k PPD, so up to 120k PPD from one of these
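
The back-of-the-envelope folding estimate there is just the claimed speedup times the GTX 580 baseline. A minimal sketch of that arithmetic, using only the figures quoted in the post (both numbers are the post's claims, not measured values):

```python
# Rough Folding@home estimate from the figures quoted above (thread claims, not benchmarks).
gtx580_ppd = 20_000        # claimed GTX 580 points per day
k20x_speedup = 6           # claimed Tesla K20X speedup over a GTX 580
titan_ppd_estimate = gtx580_ppd * k20x_speedup
print(titan_ppd_estimate)  # 120000
```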


----------



## hifibuff

http://www.ebay.fr/itm/Asus-Nvidia-GTX-TITAN-GPU-6gb-GK110-/200896975248?pt=PCC_Video_TV_Cards&hash=item2ec6649190#ht_500wt_1361

Seems legit.


----------



## maarten12100

Quote:


> Originally Posted by *hifibuff*
> 
> http://www.ebay.fr/itm/Asus-Nvidia-GTX-TITAN-GPU-6gb-GK110-/200896975248?pt=PCC_Video_TV_Cards&hash=item2ec6649190#ht_500wt_1361
> 
> Seems legit.


As in, it doesn't seem legit.
That cooler looks like reference crap; we were promised a magnesium design just as staggering as the GTX 690's, and this is far from it.


----------



## Alatar

It's just a reference asus 680 lol


----------



## hifibuff

The photo is obviously that of a 680, but maybe the specs are legit?


----------



## maarten12100

Quote:


> Originally Posted by *hifibuff*
> 
> The photo is obviously that of a 680, but maybe the specs are legit?


They are; however, the Asus was rumored to be clocked higher than the 840 MHz reference


----------



## Alatar

If they're listed on ebay with pictures of a reference 680 I don't see why anyone would trust them.


----------



## Newbie2009

Still no update? Hurry up!


----------



## dph314

They probably did that so they could start taking money for the card without breaking the NDA. They'll probably take it down after they sell however many they actually have. But I'm sure people have no problem buying it without seeing it. I know I'm buying it regardless of what it looks like









The seller actually has a perfect history over more than a year, so, I think it's legit and they just can't post pics yet. If it was the seller's first sale, I'd be suspicious


----------



## Nemessss

interesting:

http://www.xtremesystems.org/forums/showthread.php?284767-Geforce-Titan-GK110-will-be-a-consumer-part&p=5171013&viewfull=1#post5171013

"I seem to recall something to that effect.

This card is potent; based on what I've seen myself with premature drivers, I'm selling my 580s right away."


----------



## maarten12100

Quote:


> Originally Posted by *dph314*
> 
> They probably did that so they could start taking money for the card without breaking the NDA. They'll probably take it down after they sell however many they actually have. But I'm sure people have no problem buying it without seeing it. I know I'm buying it regardless of what it looks like
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The seller actually has a perfect history over more than a year, so, I think it's legit and they just can't post pics yet. If it was the seller's first sale, I'd be suspicious


Reclaim your money after receiving it, stating "this was not what I paid for; it didn't match the pictures" (since they stated they took it out of the box to photograph it).
If they send you a GTX 680, also reclaim, because it doesn't fit the specifications.
PayPal, trolololo


----------



## gdmk

http://tieba.baidu.com/p/2168512087?pn=1


----------



## maarten12100

Quote:


> Originally Posted by *gdmk*
> 
> 
> 
> 
> http://tieba.baidu.com/p/2168512087?pn=1


They weren't lying about it looking similar to a GTX 690, but it doesn't really look better; just the same.


----------



## mcg75

Now that, that is the most believable picture I've seen to date.

Doesn't mean it is though.


----------



## grunion

I'd hit it!!!!!


----------



## DADDYDC650

Quote:


> Originally Posted by *gdmk*
> 
> 
> 
> 
> http://tieba.baidu.com/p/2168512087?pn=1


Sold!


----------



## Shiftstealth

Quote:


> Originally Posted by *gdmk*
> 
> 
> 
> 
> http://tieba.baidu.com/p/2168512087?pn=1


If no one caught this: TITAN is stamped close to the video outputs.


----------



## dph314

That looks pretty damn legit.

Are those white and green round lights on the PCB?


----------



## Darco19

Looks great


----------



## maarten12100

I took the liberty of grabbing the full picture; it's pretty high-def. I think this can be confirmed legit, or it's the best mock-up in years.


----------



## Shiftstealth

Quote:


> Originally Posted by *maarten12100*
> 
> 
> 
> 
> I took the liberty of grabbing the full picture; it's pretty high-def. I think this can be confirmed legit, or it's the best mock-up in years.


I like the chrome on it, makes me want it more.


----------



## Joneszilla

Nice looking card! Dying to see the true specs...


----------



## Roadkill95

looks fantastic. If only it didn't cost such outrageous amounts of money


----------



## Swolern

SEXY PICS!!!!!


----------



## WALSRU

Needs a backplate but maaaan that cooler is fancy


----------



## dph314

Quote:


> Originally Posted by *WALSRU*
> 
> Needs a backplate but maaaan that cooler is fancy


Yep. And it damn-well better be for what I'm paying for it


----------



## Levesque

They will probably be near impossible to buy for months (with the usual Nvidia non-availability and half-baked paper-launch tactics), but if I can find some at launch, I'm in for 4 in Quad-SLI to replace my 4x 7970 Quad-Fire.

IF I can find some. Knowing Nvidia, only the ''reviewers'' and geeks refreshing their pages every 2 seconds at midnight will be able to buy some at launch, while people with a life will have to stare at all those ''Not available'' and ''Out of stock'' messages for months...









Nvidia, I want you to take my money! Put them on the market more than 10 at a time! I know you do it intentionally because the marketing dept says so, but it's getting old...


----------



## WALSRU

Seriously doubt anyone is going to get a hold of more than 2 at launch. Even that might be a stretch.


----------



## Alatar

Quote:


> Originally Posted by *Levesque*
> 
> They will probably be near impossible to buy for months (with the usual Nvidia non-availability and half-baked paper-launch tactics), but if I can find some at launch, I'm in for 4 in Quad-SLI to replace my 4x 7970 Quad-Fire.
> 
> IF I can find some. Knowing Nvidia, only the ''reviewers'' and geeks refreshing their pages every 2 seconds at midnight will be able to buy some at launch, while people with a life will have to stare at all those ''Not available'' and ''Out of stock'' messages for months...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nvidia, I want you to take my money! Put them on the market more than 10 at a time! I know you do it intentionally because the marketing dept says so, but it's getting old...


I doubt they care who their money goes to since all the cards will be sold anyway. Releasing them a few at a time is supposed to stop people from hoarding them and losing the opportunity to get one just because they couldn't press the button 5 minutes after launch.


----------



## maarten12100

So sad it can only drive 3 screens, and you have to drive them half DP, half DVI-D; such a hassle.
Why doesn't Nvidia switch to all DisplayPort and drop in some adapters?


----------



## Compaddict

Quote:


> Originally Posted by *WALSRU*
> 
> Seriously doubt anyone is going to get a hold of more than 2 at launch. Even that might be a stretch.


I will be happy if I can get 2 of these (depending on launch price); they will be a great replacement for my 2x GTX 570 SLI rig.







I skipped the GTX6XX series and am ready for this! If it all falls into place, it will be interesting to see how high I can go with Crysis 3 settings in 6010x1080 2D Surround.


----------



## Shiftstealth

Quote:


> Originally Posted by *Alatar*
> 
> I doubt they care who their money goes to since all the cards will be sold anyway. Releasing them a few at a time is supposed to stop people from hoarding them and losing the opportunity to get one just because they couldn't press the button 5 minutes after launch.


So Nvidia is releasing 10 every 5 minutes or something?


----------



## Shaba




----------



## HYPERDRIVE

For me, it's being able to at least play my games at 1920x1200 at max settings, with 85+ frames throughout.
Man, I will be fighting the urge to buy it right up until the moment they pop up on sale; the next 24-48 hours will be intense.


----------



## driftingforlife

IF I could afford one it would almost be a shame to put a waterblock on it.


----------



## HYPERDRIVE

Quote:


> Originally Posted by *driftingforlife*
> 
> IF I could afford one it would almost be a shame to put a waterblock on it.


I know what you mean. If I do get it, that's what I'm planning on doing; kind of a shame with that nice air cooler.
Going to build a Haswell 4770K rig this summer, and that's when I'm planning on doing a whole new loop.


----------



## andre02

I like the looks of the standard cooler too; if it's efficient and not noisy, I wouldn't change it.


----------



## renat77

Real pictures here!!


----------



## Cloudfire777

I have no idea if this have been posted before but the Asus Titan is up for sale @ Ebay:
http://www.ebay.co.uk/itm/Asus-Nvidia-GTX-TITAN-GPU-6gb-GK110-/200896975248?pt=PCC_Video_TV_Cards&hash=item2ec6649190


----------



## Remij

Nah, that's extremely fake. It's a reference Asus 680. Not only the cooler, but it's obvious the PCB is different because of the power connectors: they are stacked like the 680's, whereas the PCB of the Titan has them side by side.

Also, tons of other things give it away, but yeah... no.


----------



## Cloudfire777

Thanks for clearing it up. Doesn't look like the real thing


----------



## Cloudfire777

Question: the NDA ends February 18th. What time zone? Are we talking hours here? In 2.5 hours it's the 18th here in Norway


----------



## Alatar

Probably midnight PST or something


----------



## Cloudfire777

Noooo, 12 hours left?


----------



## Shiftstealth

Quote:


> Originally Posted by *Alatar*
> 
> Probably midnight PST or something


I thought it was EST?


----------



## Roadkill95

Quote:


> Originally Posted by *Cloudfire777*
> 
> Question: the NDA ends February 18th. What time zone? Are we talking hours here? In 2.5 hours it's the 18th here in Norway


Lol someone should report that.. what if someone actually buys it hoping that it's a titan?


----------



## xoleras

Quote:


> Originally Posted by *Cloudfire777*
> 
> Question: the NDA ends February 18th. What time zone? Are we talking hours here? In 2.5 hours it's the 18th here in Norway


I thought it was midnight on the 18th. So 1 more full day?


----------



## maarten12100

Quote:


> Originally Posted by *Roadkill95*
> 
> Lol someone should report that.. what if someone actually buys it hoping that it's a titan?


You mean the eBay ad?
If it is the Titan, reclaim your money; if it isn't, reclaim your money: free Titan!
Since the picture shows a GTX 680 but the specs list a Titan, you can't miss in the eyes of the lord named PayPal


----------



## lowfiwhiteguy

Forgive my foolishness but, is the Titan the card that is supposed to succeed the GTX 680? Or is this "named" series of GTX cards a new tier which performs/costs above the numbered GTX series (GTX x80 etc) and below the dual-GPU (GTX x90, etc)? So, is the Titan the GTX 780 with a name change, or is there actually going to be a cheaper, less-high-end GTX 780 card released later on as the successor to the GTX 680?


----------



## Shiftstealth

Quote:


> Originally Posted by *lowfiwhiteguy*
> 
> Forgive my foolishness but, is the Titan the card that is supposed to succeed the GTX 680? Or is this "named" series of GTX cards a new tier which performs/costs above the numbered GTX series (GTX x80 etc) and below the dual-GPU (GTX x90, etc)? So, is the Titan the GTX 780 with a name change, or is there actually going to be a cheaper, less-high-end GTX 780 card released later on as the successor to the GTX 680?


A cheaper, less-high-end 780


----------



## maarten12100

Quote:


> Originally Posted by *lowfiwhiteguy*
> 
> Forgive my foolishness but, is the Titan the card that is supposed to succeed the GTX 680? Or is this "named" series of GTX cards a new tier which performs/costs above the numbered GTX series (GTX x80 etc) and below the dual-GPU (GTX x90, etc)? So, is the Titan the GTX 780 with a name change, or is there actually going to be a cheaper, less-high-end GTX 780 card released later on as the successor to the GTX 680?


It is not the GTX 780; this is basically a rebranded Tesla K20X (however, it might be gimped on the compute part of the die; we don't know yet)


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *lowfiwhiteguy*
> 
> Forgive my foolishness but, is the Titan the card that is supposed to succeed the GTX 680? Or is this "named" series of GTX cards a new tier which performs/costs above the numbered GTX series (GTX x80 etc) and below the dual-GPU (GTX x90, etc)? So, is the Titan the GTX 780 with a name change, *or is there actually going to be a cheaper, less-high-end GTX 780 card released later on as the successor to the GTX 680?*


Yes, there will be a GK114-based GTX 780 late this year, according to the rumors I've read. Titan will be a stand-alone SKU...


----------



## Cloudfire777

Quote:


> Originally Posted by *lowfiwhiteguy*
> 
> Forgive my foolishness but, is the Titan the card that is supposed to succeed the GTX 680? Or is this "named" series of GTX cards a new tier which performs/costs above the numbered GTX series (GTX x80 etc) and below the dual-GPU (GTX x90, etc)? So, is the Titan the GTX 780 with a name change, or is there actually going to be a cheaper, less-high-end GTX 780 card released later on as the successor to the GTX 680?


Rating and explanation of top-tier GeForce cards:

#1: GTX 690 (GK104x2, 600 series) --> GTX 790 (GK114x2, 700 series)
#2: GTX Titan (GK110, neither 600 nor 700 series, an exclusive card)
#3: GTX 680 (GK104, 600 series) --> GTX 780 (GK114, 700 series)
Quote:


> Originally Posted by *xoleras*
> 
> I thought it was midnight on the 18th. So 1 more full day?


Ah, I found the right website. 7 more hours until the NDA ends. Geez, gonna be a long night








Quote:


> According to popular technology websites, on Monday, February 18th at 12:01AM EST Nvidia will lift the NDA on GeForce GTX Titan
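
For anyone converting that embargo time to their own zone, a quick sketch using Python's zoneinfo (the 12:01 AM EST figure comes from the quote above; the specific zone names are just illustrative choices, Norway being the example asked about upthread):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Embargo lift time quoted above: Monday, February 18th at 12:01 AM US Eastern.
embargo = datetime(2013, 2, 18, 0, 1, tzinfo=ZoneInfo("America/New_York"))

# Convert to Central European Time (Europe/Oslo for Norway).
local = embargo.astimezone(ZoneInfo("Europe/Oslo"))
print(local.strftime("%Y-%m-%d %H:%M %Z"))  # 2013-02-18 06:01 CET
```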


Quote:


> Originally Posted by *Roadkill95*
> 
> Lol someone should report that.. what if someone actually buys it hoping that it's a titan?


Yeah, someone should really report that thug. I'm 100% certain someone will take that bait and be scammed by this guy.









I'm too lazy to log in and write a mail with pictures of the 680 and all that crap.
But please, someone else do it
http://www.ebay.co.uk/itm/Asus-Nvidia-GTX-TITAN-GPU-6gb-GK110-/200896975248?pt=PCC_Video_TV_Cards&hash=item2ec6649190


----------



## Shiftstealth

Quote:


> Originally Posted by *Cloudfire777*
> 
> Rating and explanation of top tier Geforce cards:
> 
> #1: GTX 690 (GK104x2, 600 series) --> GTX 790 (GK114x2, 700 series)
> #2: GTX Titan (GK110, neither 600 or 700 series, exclusive card)
> #3: GTX 680 (GK104, 600 series) --> GTX 780 ( GK114, 700 series)
> Ah I found the right website. 7 more hours until NDA ends. Geez gonna be a long night
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah someone should really report that thug. I`m 100% certain someone will take that bait and be scammed by this guy.


Do we think it's going on sale tonight?


----------



## Majin SSJ Eric

No.


----------



## hifibuff

No, we don't know, or no, it won't?


----------



## maarten12100

Quote:


> Originally Posted by *hifibuff*
> 
> No, we don't know, or no, it won't?


No, we don't think it will; it's a paper launch, so it'll take at least until tomorrow.


----------



## Sapientia

Quote:


> Originally Posted by *Cloudfire777*
> 
> Ah I found the right website. 7 more hours until NDA ends. Geez gonna be a long night


Well, a lot of Nvidia NDAs end at 9 AM EST, so we could be waiting a bit longer


----------



## Roadkill95

Quote:


> Originally Posted by *Cloudfire777*
> 
> Rating and explanation of top tier Geforce cards:
> 
> #1: GTX 690 (GK104x2, 600 series) --> GTX 790 (GK114x2, 700 series)
> #2: GTX Titan (GK110, neither 600 or 700 series, exclusive card)
> #3: GTX 680 (GK104, 600 series) --> GTX 780 ( GK114, 700 series)
> Ah I found the right website. 7 more hours until NDA ends. Geez gonna be a long night
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah someone should really report that thug. I`m 100% certain someone will take that bait and be scammed by this guy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I`m to lazy to log in and write a mail with pictures of 680 and all that crap.
> But please someone else do it
> http://www.ebay.co.uk/itm/Asus-Nvidia-GTX-TITAN-GPU-6gb-GK110-/200896975248?pt=PCC_Video_TV_Cards&hash=item2ec6649190


Just did. Hope eBay comes cracking down on his fraudulent butt.


----------



## lowfiwhiteguy

I reported the guy on eBay selling the GTX 680 masquerading as a GTX Titan. I suggest you all do the same.


----------



## Cloudfire777

Good job guys. Did the same too


----------



## maarten12100

Quote:


> Originally Posted by *lowfiwhiteguy*
> 
> I reported the guy on eBay selling the GTX 680 masquerading as a GTX Titan. I suggest you all do the same.


I suggest ordering and then reclaiming my money, since if it is a GTX Titan the picture doesn't match, and if it is a GTX 680 the specs don't match.
PayPal will refund and you can keep the card; seems like a fair deal to me.


----------



## hifibuff

I, for one, give the guy the benefit of the doubt. The specs he gives in the description seem accurate. Oh, and the photos on eBay are non-contractual. Plus, he's got a flawless record so far.
On the other hand, he shouldn't have posted ANY pics at all, or should have explained why the photo isn't that of the actual product.


----------



## dph314

I think he's just trying to make the sale, but can't post pics because of the NDA. He didn't just become an eBay member yesterday, and he has a flawless record over the past year, or more. I think it's legit and he just can't post the pics yet.


----------



## Roadkill95

1. How did he get a card so soon!? He's not a known reviewer or retailer.

2. His description implies that the card pictured is actually what you're getting, because he describes the condition as 'New but opened to take pics' or something like that


----------



## 2010rig

Quote:


> Originally Posted by *maarten12100*
> 
> I suggest ordering then reclaiming my money since if it is a gtx titan the picture doesn't match and if it is a gtx 680 the specs don't match.
> Paypal will refund and you can keep the card seems like a fair deal to me.


If you do get a Titan, your little plan is called fraud.

Besides, why would someone be stupid enough to pay $1600 in the first place...


----------



## maarten12100

Quote:


> Originally Posted by *2010rig*
> 
> If you do get a Titan, your little plan is called fraud.
> 
> Besides, why would someone be stupid enough to pay $1600 in the first place...


It is called EU guidelines, practically making the buyer untouchable.
A while back, Dell had a wrongly advertised laptop on their site containing 2 SSDs worth about 3x the laptop's price; Dell figured it out and removed it after a couple of hours.
Thanks to EU guidelines and regulations, those who bought it got it for that price.

I know that plan is wrong, but he is being wrong posting those pictures and stating they are GTX Titan pictures, don't you think?


----------



## Majin SSJ Eric

I'm sure all legitimate resellers have Titans in stock right now if the launch is this week. He may be ready to ship tomorrow but can't show pics without violating the NDA. Whatever, I ain't paying that much for a Titan even if it is legit.


----------



## Cloudfire777

He just changed his pictures
http://www.ebay.co.uk/itm/Asus-Nvidia-GTX-TITAN-GPU-6gb-GK110-/200896975248?pt=PCC_Video_TV_Cards&hash=item2ec6649190

and used this one....
http://videocardz.com/39618/nvidia-geforce-gtx-titan-pictured


----------



## 2010rig

Quote:


> Originally Posted by *maarten12100*
> 
> It's called EU consumer guidelines, which practically make the buyer untouchable.
> A while back Dell had a wrongly advertised laptop on their site containing two SSDs worth about 3x the laptop's price; Dell figured it out and removed it after a couple of hours.
> Thanks to EU guidelines and regulations, those who bought it got it for that price.
> 
> I know that plan is wrong, but he's also wrong to post those pictures claiming they are GTX Titan pictures, don't you think?


2 wrongs don't make a right.









If he did send you a real Titan, why would you screw him out of that money?

If he sent you a 680, then by all means get your money back from PayPal, AND make him pay for shipping to send the card back.









That'd be the right thing to do, after making the wrong choice to pay $1600 for it in the first place.

Besides, I've dealt with PayPal both as a seller and a buyer, I really wouldn't rely on them to do the right thing every time. It's *possible* that even if you got a 680, that you wouldn't get your money back.

All he has to do is update the pics for his listing, provide a tracking number as proof of delivery, and PayPal *could* still rule in his favor. His listing does specifically state "No Returns Accepted".

It's a sticky situation, just order one from a legit retailer. Don't deal with shady eBay sellers.


----------



## Roadkill95

Quote:


> Originally Posted by *Cloudfire777*
> 
> He just changed his pictures
> http://www.ebay.co.uk/itm/Asus-Nvidia-GTX-TITAN-GPU-6gb-GK110-/200896975248?pt=PCC_Video_TV_Cards&hash=item2ec6649190
> 
> and used this one....
> http://videocardz.com/39618/nvidia-geforce-gtx-titan-pictured


Lol. Even if it was the real deal, $1600 is madness, considering the only card he could've gotten is one of the review samples, which could've been used, abused and torn apart for reviewing. I honestly don't think some guy with 70 rep on eBay gets a Titan from NV when even videocardz couldn't get one.


----------



## maarten12100

Quote:


> Originally Posted by *2010rig*
> 
> 2 wrongs don't make a right.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If he did send you a real Titan, why would you screw him out of that money?
> 
> If he sent you a 680, then by all means get your money back from PayPal, AND make him pay for shipping to send the card back.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That'd be the right thing to do, after making the wrong choice to pay $1600 for it in the first place.
> 
> Besides, I've dealt with PayPal both as a seller and a buyer, I really wouldn't rely on them to do the right thing every time. It's *possible* that even if you got a 680, that you wouldn't get your money back.
> 
> All he has to do is update the pics for his listing, provide a tracking number as proof of delivery, and PayPal *could* still rule in his favor. His listing does specifically state "No Returns Accepted".
> 
> It's a sticky situation, just order one from a legit retailer. Don't deal with shady eBay sellers.


I don't swing that way anyway.
It's just wrong to advertise with false info, and scamming that way is equally wrong; I'm just saying that PayPal + EU rules give the buyer an advantage.

"Packaging opened for stock photographing."
Still not quite right, as he took the picture from a site.


----------



## hzac

Quote:


> Originally Posted by *Cloudfire777*


Just burst out laughing.. hahaa

EDIT: What time is it in America? It's like 11am on the 18th here in Australia. I'll be lurking local sites to see if there is any news.


----------



## Vonnis

So back on track...

We can get reviews on the 18th right? Which time zone does this normally go for? Because it's been the 18th for a while here, and I'd like to know how many hours I have to wait to finally get some facts.


----------



## mike88931

Do you guys have any idea when exactly the reviews should go up? I was considering pulling an all-nighter, but if the reviewers are in a different time zone I may as well rise and shine and see them a couple of hours late with a full body of energy. Also, they will not be available for purchase until the 24th, right? No need to get ready to buy tonight or anything...?


----------



## maarten12100

And he's gone.








The kid is probably mad because he can't afford a GTX Titan.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Vonnis*
> 
> So back on track...
> 
> We can get reviews on the 18th right? Which time zone does this normally go for? Because it's been the 18th for a while here, and I'd like to know how many hours I have to wait to finally get some facts.


It's been speculated several times that the NDA will lift at 9am EST. Could be before or after that though...


----------



## Sapientia

Quote:


> Originally Posted by *mike88931*
> 
> Do you guys have any idea when exactly the reviews should go up? I was considering pulling an all-nighter, but if the reviewers are in a different time zone I may as well rise and shine and see them a couple of hours late with a full body of energy. Also, they will not be available for purchase until the 24th, right? No need to get ready to buy tonight or anything...?


If past releases are anything to go by it'll be 9AM EST. All rumors point to a paper launch but I don't think anyone knows for sure.


----------



## Vonnis

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> It's been speculated several times that the NDA will lift at 9am EST. Could be before or after that though...


Quote:


> Originally Posted by *Sapientia*
> 
> If past releases are anything to go by it'll be 9AM EST. All rumors point to a paper launch but I don't think anyone knows for sure.


Cheers. Let's hope it'll be earlier than that, I don't want to have to wait that long.


----------



## Forceman

Quote:


> Originally Posted by *Sapientia*
> 
> If past releases are anything to go by it'll be 9AM EST. All rumors point to a paper launch but I don't think anyone knows for sure.


I was against it at first, but now I kind of like the announce-with-reviews, then product availability a week later (or whatever) strategy. It gives you time to read the reviews and make a decision before it's time to buy. Otherwise you almost have to buy first, then go read the reviews, to be sure you get one.


----------



## Majin SSJ Eric

I agree. I kinda hope they aren't available for public consumption til next week (like some of the rumors have said). That way I have plenty of time to peruse the reviews and get my money together.


----------



## Roadkill95

Quote:


> Originally Posted by *maarten12100*
> 
> And he's gone.
> 
> 
> 
> 
> 
> 
> 
> 
> The kid is probably mad because he can't afford a GTX Titan.


It's sold!! :O


----------



## Parkerm35

2am, still no reviews.







I'm waiting up until one surfaces!


----------



## maarten12100

Quote:


> Originally Posted by *Parkerm35*
> 
> 2am, still no reviews.
> 
> 
> 
> 
> 
> 
> 
> I'm waiting up until one surfaces!


3 AM local time here. However, I have a friend who is currently in the UK, as part of his family lives there.
Just gonna get some sleep now, and maybe in the morning I'll be greeted by some wonderful reviews.


----------



## Vonnis

Tempted to just stay up until I see something. I normally don't go to bed for another couple hours anyway.


----------



## maarten12100

Quote:


> Originally Posted by *Vonnis*
> 
> Tempted to just stay up until I see something. I normally don't go to bed for another couple hours anyway.


It's Monday tomorrow, don't you have work to go to?
Or are you, like me, on holiday or having a day off?

I could stay up, but I don't really see the point; staying up until 9 AM EST, there's really no reason to do so.


----------



## Vonnis

Quote:


> Originally Posted by *maarten12100*
> 
> It's Monday tomorrow, don't you have work to go to?
> Or are you, like me, on holiday or having a day off?
> 
> I could stay up, but I don't really see the point; staying up until 9 AM EST, there's really no reason to do so.


I work evenings. A side effect of that is that I basically reversed my day/night cycle over time.








As an added bonus I have the day (or night I suppose) off.


----------



## maarten12100

Quote:


> Originally Posted by *Vonnis*
> 
> I work evenings. A side effect of that is that I basically reversed my day/night cycle over time.
> 
> 
> 
> 
> 
> 
> 
> 
> As an added bonus I have the day (or night I suppose) off.


That is nice if you have the ability to fall asleep instantly, something I don't possess.
Actually sleeping during the day is near impossible for me; I always get up at 9 AM no matter when I go to bed.

Good luck tonight.


----------



## Vonnis

Quote:


> Originally Posted by *maarten12100*
> 
> That is nice if you have the ability to fall asleep instantly, something I don't possess.
> Actually sleeping during the day is near impossible for me; I always get up at 9 AM no matter when I go to bed.
> 
> Good luck tonight.


It's something you get used to after a while. I'm now at the point where if I go to bed earlier, say 23:00-2:00, I only sleep for three hours at most and then I'm wide awake.

Anyhow, cheers and good night.


----------



## Seeing Red

I would agree with Alatar, the NDA probably won't be over until midnight PST since Nvidia's HQ is in Cali.


----------



## guinner16

Quote:


> Originally Posted by *Seeing Red*
> 
> I would agree with Alatar, the NDA probably won't be over until midnight PST since Nvidia's HQ is in Cali.


A lot of events that happen on the West Coast still prefer to use Eastern Standard as the "standard", for lack of a better word. Either way I don't think it will matter for us peons, as I think this will be strictly a paper launch.


----------



## Shiftstealth

Quote:


> Originally Posted by *Seeing Red*
> 
> I would agree with Alatar, the NDA probably won't be over until midnight PST since Nvidia's HQ is in Cali.


So the sites can't even tell us when the NDA is up?


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Shiftstealth*
> 
> So the sites can't even tell us when the NDA is up?


I actually don't think they can.


----------



## Seeing Red

Quote:


> Originally Posted by *guinner16*
> 
> A lot of events that happen on the West Coast still prefer to use Eastern Standard as the "standard", for lack of a better word. Either way I don't think it will matter for us peons, as I think this will be strictly a paper launch.


This is not really an event, it's an expiration of a contract so it may be more time specific to the location the contract was created.


----------



## guinner16

Quote:


> Originally Posted by *Seeing Red*
> 
> This is not really an event, it's an expiration of a contract so it may be more time specific to the location the contract was created.


Maybe... maybe not. If this is more than a paper launch, then the 9am time would be smart. That means the US will be awake or at work, and Europe will be awake. Who would want to launch a product where its biggest markets are sleeping, or right in the middle of a work day?


----------



## HanakoIkezawa

I hope reviews are live tomorrow so I have something to do at work


----------



## Majin SSJ Eric

I'm sure Nvidia would've specified the exact date and time the NDA expires. That doesn't mean every site is going to release reviews right at that time though...


----------



## Seeing Red

I would only expect the NDA to expire in the morning if Nvidia were to make an official announcement before then. It's easier to define the end of a day to be an expiration date.


----------



## Shiftstealth

Quote:


> Originally Posted by *Seeing Red*
> 
> I would only expect the NDA to expire in the morning if Nvidia were to make an official announcement before then. It's easier to define the end of a day to be an expiration date.


Disappointing, I can't find anything. Anyone else have any luck?


----------



## -iceblade^

Nothing on AnandTech and HardwareCanucks as of the time of this post.


----------



## The-Real-Link

Guess I'd imagine 3am EST (midnight PST) is the next time milestone, then 9am EST?

Good luck to all looking at one (or more).


----------



## Roadkill95

I know I'm gonna get a lot of ish for this but the more I look at the pictures the more I think that it looks bad.

I've come to the point where I think it looks terrible.


----------



## dph314

Quote:


> Originally Posted by *The-Real-Link*
> 
> Guess I'd imagine 3am EST (midnight PST) is the next time milestone, then 9am EST?
> 
> Good luck to all looking at one (or more).


That's what I did last time, for the 680. Woke up every 3 hours, just before 3am, 6am, and 9am. Snagged one shortly after 9am









Quote:


> Originally Posted by *Roadkill95*
> 
> I know I'm gonna get a lot of ish for this but the more I look at the pictures the more I think that it looks bad.
> 
> I've come to the point where I think it looks terrible.












Ah well. That's how it is with everything. There's nothing out there that _everyone_ will like. I think it looks damn sexy though.


----------



## HyperBCS

TechPowerUp just posted this; maybe the NDA will be lifted soon?
http://www.techpowerup.com/180297/NVIDIA-GeForce-GTX-Titan-Graphics-Card-Pictured-in-Full.html


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Roadkill95*
> 
> I know I'm gonna get a lot of ish for this but the more I look at the pictures the more I think that it looks bad.
> 
> I've come to the point where I think it looks terrible.


I think it looks pretty good for a reference card but not as good as the 690 (I prefer central fan position). I would LOVE to see MSI's Lightning version if they were allowed to make one...


----------



## Roadkill95

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I think it looks pretty good for a reference card but not as good as the 690 (I prefer central fan position). I would LOVE to see MSI's Lightning version if they were allowed to make one...


I know, imagine a solid black lightning without those yellow stripes with white fans... sigh.
Quote:


> Originally Posted by *HyperBCS*
> 
> TechPowerUp just posted this; maybe the NDA will be lifted soon?
> http://www.techpowerup.com/180297/NVIDIA-GeForce-GTX-Titan-Graphics-Card-Pictured-in-Full.html


Is that a PWM controller I see next to the power pins?


----------



## striderstone

It's past midnight! Anyone found anything yet? Are we now waiting for the 9AM mark?


----------



## supermi

Quote:


> Originally Posted by *striderstone*
> 
> It's past midnight! Anyone found anything yet? Are we now waiting for the 9AM mark?


Still checking, still hoping.

Will keep doing so till 9 AM if necessary LOL


----------



## The-Real-Link

Is that 9am EST or PST?


----------



## Majin SSJ Eric

Best part about going to bed is by the time you wake up there may be reviews and benches to look at!


----------



## Kiracubed

Where would you even look to buy this? Amazon, Newegg? Or direct from Nvidia, Asus and EVGA?


----------



## The-Real-Link

I figure it'd be stocked at your favorite etailers just like any GPU. You should be able to buy it from their stores or partner stores.


----------



## Artikbot

250W TDP I see?

That's absolutely impossible. It's only 55W higher than the GTX680 for DOUBLE the die space!


----------



## maarten12100

Quote:


> Originally Posted by *Artikbot*
> 
> 250W TDP I see?
> 
> That's absolutely impossible. It's only 55W higher than the GTX680 for DOUBLE the die space!


It hasn't been pushed hard on clocks, like GK104 with its terribly small die.


----------



## Alatar

Quote:


> Originally Posted by *Artikbot*
> 
> 250W TDP I see?
> 
> That's absolutely impossible. It's only 55W higher than the GTX680 for DOUBLE the die space!


Lower clocks, and the process has matured a bit.


----------



## Kiracubed

Strictly speaking performance, which is better: the GeForce Titan, or two 680s in SLI? Check the build in my sig to see which model I'm using, as well as my system.

I see that the Titan scores X7100 in 3DMark 11, and I just ran the same Extreme preset on the build in my sig and got X6197, with the highest being X6201; ~X6200 is a safe average.

I know that people say the Titan is more efficient and has better driver support because some games won't take full advantage of SLI, and single-card solutions use 100% of the GPU where SLI will be more like 96-99% of each. But power consumption and driver support aside, what will run better in games? Specifically at 1440p?

Boiling it down short: all other factors aside, and speaking strictly gaming performance (and benchmarks, too), which will be better?
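For a rough sense of that gap, here's a quick back-of-the-envelope check, assuming the X7100 figure floating around and the ~X6200 average above are both valid 3DMark 11 Extreme scores:

```python
# Rough percentage gap between the rumored Titan score and the 680 SLI average.
# Both figures are the 3DMark 11 Extreme scores mentioned in the post above.
titan_score = 7100      # rumored Titan result (X7100)
sli_680_score = 6200    # ~X6200 average from the 680 SLI rig

gap_pct = (titan_score / sli_680_score - 1) * 100
print(f"Titan leads the SLI setup by about {gap_pct:.1f}%")  # ~14.5%
```

Of course, a synthetic benchmark gap says little about per-game SLI scaling, which is the real question here.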


----------



## hifibuff

Your two 680s will.


----------



## maarten12100

Quote:


> Originally Posted by *Kiracubed*
> 
> Strictly speaking performance, which is better: Geforce Titan, or 2 680's in SLI? Check my build in sig to see which model I'm using, as well as my system.
> 
> I see that the Titan scores a X7100 in 3D Mark 11, and I just ran it at same Extreme preset on build in sig, and got an X6197, with highest being X6201; ~X6200 being a safe average.
> 
> I know that people say that Titan is more efficient, has better driver support because some games won't take full advantage of SLI, and single card solutions use 100% of GPU, where SLI will be more like 96-99% of each, but power consumption and driver support aside, where needed, what will run better in games? Specifically 1440p?
> 
> Boiling it down short: All other factors aside, and speaking strictly gaming performance (and benchmarks, too), which will be better?


Clock for clock and performance-per-watt wise, the Titan will be better.

However, in raw performance, if scaling works, the GTX 680s in SLI will be better (as long as they aren't VRAM limited).


----------



## X-oiL

According to this http://www.nordichardware.se/Grafik/nvidia-lanserar-geforce-gtx-titan-den-21-februari.html we have to wait until Thursday for the launch. NDA will be lifted on Tuesday.


----------



## maxmix65

http://uptiki.altervista.org/viewer.php?file=ls7mbxysrb99ylss01k.jpg
OMG, look at the graph of the GPU boost frequency.


----------



## maarten12100

Quote:


> Originally Posted by *maxmix65*
> 
> http://uptiki.altervista.org/viewer.php?file=ls7mbxysrb99ylss01k.jpg
> OMG, look at the graph of the GPU boost frequency.


80Hz vsync? But there already is 120Hz with vsync; just throw in moar power!


----------



## AverageNinja

Still waiting for any reviews, I really want to see those things perform!


----------



## maxmix65

http://videocardz.com/39662/nvidia-geforce-gtx-titan-official-slides-leaked-launch-postponed-to-feb-19th


----------



## maarten12100

Quote:


> Originally Posted by *maxmix65*
> 
> http://videocardz.com/39662/nvidia-geforce-gtx-titan-official-slides-leaked-launch-postponed-to-feb-19th


NO, my dream!

Edit: for those not getting my comment, the voltage is locked...


----------



## Cloudfire777

So basically it will have turbo boost on top of turbo boost whenever the temperature allows it?








That might be the reason why it's beating the Ares II (rumors)...

The 80Hz is nice though; since the Titan is able to push above 60FPS in most games, the new vsync for this baby will be capped at 80FPS for those who have 120Hz screens.


----------



## hatlesschimp

When will it actually be for sale, i.e. on websites and in shops?


----------



## chronicfx

Found an unfinished review

http://www.ocaholic.ch/xoops/html/modules/smartsection/item.php?itemid=955&page=0


----------



## driftingforlife

Quote:


> Originally Posted by *chronicfx*
> 
> Found an unfinished review
> 
> http://www.ocaholic.ch/xoops/html/modules/smartsection/item.php?itemid=955&page=0


I'm skeptical; it's wiping the floor with a pair of 680s.


----------



## chronicfx

Quote:


> Originally Posted by *driftingforlife*
> 
> I'm skeptical; it's wiping the floor with a pair of 680s.


The one thing that makes sense is that Skyrim is known to scale like crap, so a single GPU would do a lot better. That makes me think maybe it's real. But who knows...


----------



## driftingforlife

Never mind. It's not a review.
Quote:


> What we ended up with is a performance prediction for an absolute monster of a graphics card.


----------



## Cloudfire777

The seller is back again.
I'm still sceptical because he is using pictures found on videocardz. It would take him 1 minute to open a box and take a picture of his own GPUs.

He has 7 GTX Titans available:
http://www.ebay.com/itm/Asus-Nvidia-GTX-TITAN-GPU-6gb-GK110-/200897669690?pt=PCC_Video_TV_Cards&hash=item2ec66f2a3a


----------



## chronicfx

Doesn't PayPal give your money back if it's fraud?


----------



## chronicfx

Quote:


> Originally Posted by *driftingforlife*
> 
> Never mind. It's not a review.


A prediction? Lol, I hope they're right.


----------



## maarten12100

Quote:


> Originally Posted by *chronicfx*
> 
> Found an unfinished review
> 
> http://www.ocaholic.ch/xoops/html/modules/smartsection/item.php?itemid=955&page=0


That would be great, as it is performing extremely well.
Quote:


> Originally Posted by *driftingforlife*
> 
> I'm skeptical; it's wiping the floor with a pair of 680s.


Might be due to them using an i7 3770K with a tiny OC and HT on...
Quote:


> Originally Posted by *chronicfx*
> 
> The one thing that makes sense is that Skyrim is known to scale like crap, so a single GPU would do a lot better. That makes me think maybe it's real. But who knows...


Or they are just clever enough to make it look that way.

I hope this is true, but I very much doubt it.


----------



## hatlesschimp

I've had this before with a mobile phone; the dude had 20 to sell. He sold them all in a couple of hours, then we never saw the phones, and it took 3 weeks before we all got refunded. It's not a bad way to get money quick and pay off a debt, or go to the casino and put it all on black.


----------



## Cloudfire777

Quote:


> Originally Posted by *hatlesschimp*
> 
> I've had this before with a mobile phone; the dude had 20 to sell. He sold them all in a couple of hours, then we never saw the phones, and it took 3 weeks before we all got refunded. It's not a bad way to get money quick and pay off a debt, or go to the casino and put it all on black.


It could also be that this guy selling the Titans doesn't have any in stock right now, which is why he doesn't take his own pictures. So what he's basically doing is getting people to pre-order them for an insanely high amount of money, fooling them into thinking they'll get the GPU tomorrow, when instead it might take up to a month before the seller actually gets his shipment from Nvidia.

Or, like you say, he doesn't really get the GPUs at all and is only trying to pay off debt.


----------



## NCSUZoSo

Quote:


> Originally Posted by *maarten12100*
> 
> That would be great, as it is performing extremely well.
> Might be due to them using an i7 3770K with a tiny OC and HT on...
> Or they are just clever enough to make it look that way.
> 
> I hope this is true, but I very much doubt it.


That "review" was out over 3 days ago and it was confirmed to all be speculation.

Here is our new posted material on the delayed launch: http://videocardz.com/39662/nvidia-geforce-gtx-titan-official-slides-leaked-launch-postponed-to-feb-19th


----------



## Zeek

Was already posted


----------



## maarten12100

Quote:


> Originally Posted by *NCSUZoSo*
> 
> That "review" was out over 3 days ago and it was confirmed to all be speculation.
> 
> Here is our new posted material on the delayed launch: http://videocardz.com/39662/nvidia-geforce-gtx-titan-official-slides-leaked-launch-postponed-to-feb-19th


I know, I've seen that link already.
However, it would be quite something if it had beaten SLI 680s.


----------



## maxmix65

Slide test








http://uptiki.altervista.org/viewer.php?file=1she654tfoky3lsiyxm.jpg


----------



## NCSUZoSo

lol at expertreviews leaking the announcement early


----------



## hifibuff

Seems slightly disappointing, but the 1080p resolution is not what the Titan was created for.


----------



## maarten12100

Quote:


> Originally Posted by *hifibuff*
> 
> Seems slightly disappointing, but the 1080p resolution is not what the Titan was created for.


It is 1200p...
However, that is also not what the Titan was built for; now give us 1440p and 3-way 1440p benches.


----------



## maxmix65

https://docs.google.com/document/d/1pOQQ2FyPtuLxkuwdF8UhXEQ72EsubyUUIfzoOxX5Llw/preview?pli=1&sle=true


----------



## guinner16

Here is the contact info for the guy selling them on ebay. Maybe somebody wants to call him to see if it is legit.

https://www.facebook.com/CtrlEZ/info


----------



## maarten12100

Quote:


> Originally Posted by *maxmix65*
> 
> https://docs.google.com/document/d/1pOQQ2FyPtuLxkuwdF8UhXEQ72EsubyUUIfzoOxX5Llw/preview?pli=1&sle=true


I will remember that next time I "olvervolt" my hardware xD.
No but seriously, those are nice slides, thanks for sharing.


----------



## mike88931

Alright, at this point I see 3 possibilities.

1: The Titan will cost $900 and be 25 percent better than the 680, as shown in the Nvidia-style graph you all saw earlier. In this case Nvidia will be receiving nothing from me but a one-finger salute. And not the ring finger, or the thumb; it's the one you put up when you're pissed at Nvidia and don't give a truck.

2: Those graphs are all fake (which would make sense, as I do not know why Nvidia would release graphs before their own NDA was up; and if it was a review site leaking them, why make them look like Nvidia graphs instead of their own style graphs?).

3: Those performance results are real and the Titan will be $600 or lower, in which case the lower than expected performance (MUCH LOWER) would be justifiable due to the much lower price.

Since I know that I am not going to get a better-than-690 card for $600, I am hoping for the best realistic option, which is that it costs $900 (the price I am prepared to pay for the two I would SLI) and is damn near or better than the 690. *Crossing my fingers*


----------



## maxmix65

Crysis 2 - Titan +47% vs 7970 / +52% vs 680
3DMark 13 - Titan +30% vs 7970 / +49% vs 680
3DMark Vantage - Titan +31% vs 7970 / +23% vs 680
BF3 - Titan +35% vs 7970 / +53% vs 680
Far Cry 3 - Titan +43% vs 7970 / +37% vs 680
Hitman - Titan +23% vs 7970 / +37% vs 680

AVERAGE ACROSS SLIDE TESTS - Titan +34.8% vs 7970 / +41.8% vs 680
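The quoted averages do check out against the per-slide numbers; here's a quick sanity check (game labels in the comments are taken from the list above):

```python
# Per-slide Titan leads, in percent, as listed above.
vs_7970 = [47, 30, 31, 35, 43, 23]  # Crysis 2, 3DMark 13, Vantage, BF3, Far Cry 3, Hitman
vs_680 = [52, 49, 23, 53, 37, 37]

print(f"avg vs 7970: +{sum(vs_7970) / len(vs_7970):.1f}%")  # +34.8%
print(f"avg vs 680:  +{sum(vs_680) / len(vs_680):.1f}%")    # +41.8%
```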


----------



## Compaddict

Quote:


> Originally Posted by *mike88931*
> 
> Alright, at this point I see 3 possibilities.
> 
> 1: The Titan will cost $900 and be 25 percent better than the 680, as shown in the Nvidia-style graph you all saw earlier. In this case Nvidia will be receiving nothing from me but a one-finger salute. And not the ring finger, or the thumb; it's the one you put up when you're pissed at Nvidia and don't give a truck.
> 
> 2: Those graphs are all fake (which would make sense, as I do not know why Nvidia would release graphs before their own NDA was up; and if it was a review site leaking them, why make them look like Nvidia graphs instead of their own style graphs?).
> 
> 3: Those performance results are real and the Titan will be $600 or lower, in which case the lower than expected performance (MUCH LOWER) would be justifiable due to the much lower price.
> 
> Since I know that I am not going to get a better-than-690 card for $600, I am hoping for the best realistic option, which is that it costs $900 (the price I am prepared to pay for the two I would SLI) and is damn near or better than the 690. *Crossing my fingers*


#3 please!

I need 3 TITANS for my 2D surround (1 for each monitor).


----------



## maarten12100

Quote:


> Originally Posted by *Compaddict*
> 
> #3 please!
> 
> I need 3 TITANS for my 2D surround (1 for each monitor).


Why would you want a slower card?
I prefer the fastest single card.


----------



## Vonnis

Quote:


> Originally Posted by *maarten12100*
> 
> Why would you want a slower card...
> I prefer the fastest single card


Coming from three 580s, it would still be a good upgrade for him, and a lot cheaper.









I'm still hoping this will match or even beat GTX680 SLI, but I highly doubt this'll be the case.


----------



## Compaddict

Quote:


> Originally Posted by *maarten12100*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Compaddict*
> 
> #3 please!
> 
> I need 3 TITANS for my 2D surround (1 for each monitor).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Why would you want a slower card...
> I prefer the fastest single card

I meant lower as in price, not speed.









Seriously with my sig rig I can barely play FarCry 3 max settings at 1920x1080 (Which still lags some) and to play it in 2D surround I would probably have to use low settings for playable frame rates. Isn't it time we get hardware that will let us play these games smoothly, especially since we are paying plenty to do just that (TONS more than the cost of a console)!

IF the TITAN rumors are true then MAYBE we can enjoy all the money we spent trying to play the games we buy the way they were meant to be played. IMO we should be able to enjoy multi monitor 3D at this point without having to wait years for hardware to catch up!

I just read this: http://www.geek.com/articles/chips/nvidias-2688-shader-geforce-gtx-titan-graphics-card-leaks-20130215/

Quote:


> "Two versions of the Titan are expected from ASUS and EVGA, with the ASUS card thought to be clocked even higher at 915MHz."


Looks like ASUS will get my money this time.


----------



## Avonosac

Quote:


> Originally Posted by *Compaddict*
> 
> I meant lower as in price, not speed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Seriously with my sig rig I can barely play FarCry 3 max settings at 1920x1080 (Which still lags some) and to play it in 2D surround I would probably have to use low settings for playable frame rates. Isn't it time we get hardware that will let us play these games smoothly, especially since we are paying plenty to do just that (TONS more than the cost of a console)!
> 
> IF the TITAN rumors are true then MAYBE we can enjoy all the money we spent trying to play the games we buy the way they were meant to be played. IMO we should be able to enjoy multi monitor 3D at this point without having to wait years for hardware to catch up!
> 
> I just read this: http://www.geek.com/articles/chips/nvidias-2688-shader-geforce-gtx-titan-graphics-card-leaks-20130215/
> 
> 
> Looks like ASUS will get my money this time.


Why? It's the same card, and factory clock settings mean squat unless you're not overclocking.


----------



## Compaddict

Good point, but in the past stable overclocks seem to have been higher with ASUS cards.

I will say this though, if EVGA comes out with a water cooled TITAN at launch, it will be my first choice.


----------



## Forceman

Quote:


> Originally Posted by *Cloudfire777*
> 
> The 80Hz is nice though, since Titan is able to push FPS above 60FPS in most games, the new vsync for this baby will be capped at 80FPS for those who have 120Hz screens


You can't just make up your own vsync; it has to match the monitor. Otherwise it's just a frame-rate cap. Refresh rate is measured in Hz, not FPS.

It kind of sounds like it will be doing whatever people do with the Korean monitors to overdrive them to higher refresh rates. But Nvidia can't just send an 80Hz signal to a 60Hz monitor and expect it to work, unless I'm completely wrong about the way most LCD screens operate. You could do it with a CRT, but most LCDs only support 60, 75, or 120, to my knowledge.
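The refresh-rate point can be made concrete with a little arithmetic. Below is a rough sketch (my own illustration, assuming plain double-buffered vsync with no triple buffering): a finished frame waits for the next refresh tick, so the displayed rate can only be the refresh rate divided by a whole number, and an "80 FPS vsync" on a 60Hz panel is simply unreachable.

```python
import math

def vsync_rate(render_fps, refresh_hz=60):
    """Displayed FPS under double-buffered vsync on a fixed-refresh panel.

    A frame that takes t seconds to render is held until the next refresh
    tick, so each frame occupies ceil(t / tick) ticks and the displayed
    rate snaps to refresh_hz / n for a whole number n.
    """
    tick = 1.0 / refresh_hz
    t = 1.0 / render_fps
    ticks_per_frame = math.ceil(t / tick - 1e-9)  # epsilon guards exact ratios
    return refresh_hz / ticks_per_frame

# A GPU rendering 80 FPS on a 60Hz panel still displays only 60 FPS...
print(vsync_rate(80, 60))   # 60.0
# ...and dipping to 50 FPS rendered snaps all the way down to 30 FPS.
print(vsync_rate(50, 60))   # 30.0
# An 80Hz *refresh* (an overdriven panel) is what actually unlocks 80 FPS.
print(vsync_rate(80, 80))   # 80.0
```

Which is why the 80Hz rumor only makes sense if the card is actually overdriving the panel to an 80Hz refresh, as with the Korean monitors.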


----------



## JinxyJ08

So do we have any real word on a release date? I really haven't seen a true launch yet, and some people say today, tomorrow, in a week... When is it, Nvidia!?


----------



## xoleras

Quote:


> Originally Posted by *JinxyJ08*
> 
> So do we have any real word on a release date? I really haven't seen a true launch yet, and some people say today, tomorrow, in a week... When is it, Nvidia!?


Sounds like reviews are going to be up on the 21st, from the latest leaks.


----------



## xorbe

Quote:


> Originally Posted by *Compaddict*
> 
> Seriously with my sig rig I can barely play FarCry 3 max settings at 1920x1080 (Which still lags some)


Dude turn down post processing. It's the same story as Crysis ... huuuge post-proc perf hit for small effects.


----------



## Avonosac

Quote:


> Originally Posted by *xorbe*
> 
> Dude turn down post processing. It's the same story as Crysis ... huuuge post-proc perf hit for small effects.


I'm sorry, but this is not a helpful post. He obviously knows how to make it playable, but his system isn't built for "playable"; it's built for everything maxed.


----------



## maarten12100

Quote:


> Originally Posted by *xorbe*
> 
> Dude turn down post processing. It's the same story as Crysis ... huuuge post-proc perf hit for small effects.


Not because we can spot the difference, but because YOLO, I want everything to run maxed GNARGNARGNAR.
I don't even play demanding games, but even games that aren't considered demanding can be more demanding than the ones on that list.


----------



## zGunBLADEz

Quote:


> Originally Posted by *Avonosac*
> 
> I'm sorry, but this is not a helpful post. He obviously knows how to make it playable, but his system isn't built for "playable" its built for everything maxed.


Like when the first Crysis got released, right?


----------



## maarten12100

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Like when the first Crysis got released, right?


Exactly


----------



## zGunBLADEz

Quote:


> Originally Posted by *maarten12100*
> 
> Exactly


Didn't happen though, right?
What about Doom 3 on the 9700 Pro, or the 9800 XT, even the X850 XT..? Went too far back, but it's still comparable..


----------



## Murlocke

Turning settings down because they "make no difference"? My friend was claiming the same recently, then went to the eye doctor and had to get glasses. He didn't even know his vision was bad. A few weeks later, he made me build him a new computer, totally going against what he'd said a few weeks prior.

Try running Max settings, 2x SSAA or 4x MSAA, 16x AF, and Very High image quality and come back and say this card is pointless. Even at 1080p, a 680 cannot achieve 60FPS with these settings in many newer games. Sure if you run 1080p, 0x AA, 0x AF it would make this card pointless for you... but honestly if you are willing to do that then why are you even in this thread?

People buying these things want the best possible graphics, while maintaining 60+ FPS, and aren't willing to turn a setting down. If your solution is "just turn a setting down, this card is pointless" then this card is not for you... but it does not make it pointless...


----------



## zGunBLADEz

Quote:


> Originally Posted by *Murlocke*
> 
> Not trying to be a dick... but when's the last time some of you visited an eye doctor? Turning settings down because they "make no difference"? My friend was claiming the same recently, then went to the eye doctor and had to get glasses. He didn't even know his vision was bad. A few weeks later, he made me build him a new computer, totally going against what he'd said a few weeks prior.
> 
> Try running Max settings, 2x SSAA or 4x MSAA, 16x AF, and Very High image quality and come back and say this card is pointless. Even at 1080p, a 680 cannot achieve 60FPS with these settings in many newer games. Sure if you run 1080p, 0x AA, 0x AF it would make this card pointless for you... but honestly if you are willing to do that then why are you even in this thread?
> 
> People buying these things want the best possible graphics, while maintaining 60+ FPS, and aren't willing to turn a setting down. If that's not your thing, or you don't care if you have to turn some settings down, then the card isn't for you. However, _no_ progress is pointless.


YOU are trying to play at a res like 1440p+? Aliasing takes a back seat at those resolutions...

No way in hell I'd run 0x AF, though...


----------



## Avonosac

Quote:


> Originally Posted by *Murlocke*
> 
> Turning settings down because they "make no difference"? My friend was claiming the same recently, went to the eye doctor and had to get glasses. He didn't even know his vision was bad. Few weeks later, he made me build him a new computer totally going against what he was saying a few weeks prior.
> 
> Try running Max settings, 2x SSAA or 4x MSAA, 16x AF, and Very High image quality and come back and say this card is pointless. Even at 1080p, a 680 cannot achieve 60FPS with these settings in many newer games. Sure if you run 1080p, 0x AA, 0x AF it would make this card pointless for you... but honestly if you are willing to do that then why are you even in this thread?
> 
> People buying these things want the best possible graphics, while maintaining 60+ FPS, and aren't willing to turn a setting down. If your solution is "just turn a setting down, this card is pointless" then this card is not for you... but it does not make it pointless...


Well put, I was about ready to throw my hands up at the constant suggestion of "turn the settings down".


----------



## zGunBLADEz

I'm most definitely gonna say you guys are so spoiled nowadays...

Take a look at this topic:
http://hardforum.com/showthread.php?t=789125

I have dealt with PC hardware for so long XD

Back in those days there were monitors capable of higher res than the ones in that thread; there were monitors capable of running @ 120Hz and all those jingles... XD


----------



## Murlocke

Quote:


> Originally Posted by *zGunBLADEz*
> 
> YOU are trying to play on res like 1440P+? aliasing pass to a second term at those resolutions...
> 
> No way in hell i will run a 0xAF tho...


I have had a 1600p and a 1440p monitor and could definitely tell the difference between 0x AA and 2x SSAA/4x MSAA. I was just messing with a Samsung 970D this weekend. Anything beyond that? Not so much. I definitely wouldn't run 0x AA at any resolution. At 1080p I'd consider 4x MSAA to be the sweet spot; at 1440p it's hard to tell the difference between 2x MSAA and 4x MSAA.

I would consider 16x AF an absolute must in any situation. It seems like many newer games have it forced on, but you get those rare few that don't even have it in the menus and you need to force it via drivers. Such a difference...


----------



## zGunBLADEz

Quote:


> Originally Posted by *Murlocke*
> 
> I have had a 1600p and a 1440p monitor and could definitely tell the difference between 0x AA and 2x SSAA/4x MSAA. Anything after that? Not so much.. I definitely wouldn't run 0x AA on any resolution.
> 
> I would consider 16x AF an absolute must in any situation. It seems like many newer games have it forced on, but you get those rare few that don't even have it in the menu systems and you need to force it via drivers. Such a difference...


And still *we have to settle for the settings that are implemented in the games*, like FXAA or SMAA, which honestly are not that bad at all..

Try forcing MSAA on Dead Space 3 and see what happens... Yes, it's a console port, but we are plagued by those nowadays; we have no choice, sadly..

I mean, back in the day I would play with the AF settings; nowadays the perf hit is so negligible that it's a must to have it on 16x all the time..


----------



## Majin SSJ Eric

Yeah, it's one thing to wish for things. It's another thing entirely to expect them...


----------



## Murlocke

We definitely ARE spoiled now... but if we had the same standards as we did 25 years ago then we'd have no progression. In 25 years from now, people will look back at these posts and think "lol what noobs, anything less than 12k resolution at 120hz is awful!"


----------



## tsm106

Quote:


> Originally Posted by *Murlocke*
> 
> We definitely ARE spoiled now... but if we had the same standards as we did 25 years ago then we'd have no progression. In 25 years from now, people will look back at these posts and think "lol what noobs, anything less than 12k resolution at 120hz is awful!"


I believe this is what they call first world problems.


----------



## zGunBLADEz

Quote:


> Originally Posted by *Murlocke*
> 
> We definitely ARE spoiled now... but if we had the same standards as we did 25 years ago then we'd have no progression. In 25 years from now, people will look back at these posts and think "lol what noobs, anything less than 12k resolution at 120hz is awful!"


That's for sure, I agree with you 110% there; otherwise there'd be no progression at all...

But the technology was there, that's what I was getting at, with the higher resolutions and the 120Hz thing and all that..

For example:
Doom 3 was announced running on a 9700 PRO; then the 9800 XT came out and the game ran like the link I posted shows (the main reason for the source).
It took 2 generations of video cards to run the game the way it was supposed to be played by "enthusiast" standards..

Crysis 1 would be my next example... How many gens did it take? Even nowadays it's still a demanding game, that's the funny thing...

This link will speak for itself XD
http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/16

DAMN this topic got ninja edited


----------



## rcfc89

Quote:


> Originally Posted by *Murlocke*
> 
> I have had a 1600p and a 1440p monitor and could definitely tell the difference between 0x AA and 2x SSAA/4x MSAA. I was just messing with a Samsung 970D this weekend. Anything after that? Not so much.. I definitely wouldn't run 0x AA on any resolution. On 1080p, i'd consider 4x MSAA to be the sweet spot.. on 1440p, it's hard to tell the difference between 2x MSAA and 4x MSAA.
> 
> I would consider 16x AF an absolute must in any situation. It seems like many newer games have it forced on, but you get those rare few that don't even have it in the menu systems and you need to force it via drivers. Such a difference...


Let's try to get two of these Titan threads closed. Switching between all three is getting tiresome. What do you guys think of the new Viper? lol


----------



## Majin SSJ Eric

New Viper is ace but the new Vette is even cooler!


----------



## Roadkill95

American cars are just a big joke (except for the original GT80),

and this is coming from an American.


----------



## badrapper

Mehhh, if those graphs are right... I'm not paying over 100% more, money-wise, for 30% more performance, with a bigger electric bill, heat, etc. Fudzilla is stating that Titan will come to market in a week's time, so this will only be a paper launch..

If those performance numbers are from Nvidia, then it's looking really bad relative to what it's been hyped up to be.


----------



## CallsignVega

Quote:


> Originally Posted by *Roadkill95*
> 
> American cars are a just a big joke(except for the original gt80)
> 
> and this is coming from an American


Someone is obviously stuck in 1987.


----------



## badrapper

*This is most likely a best-case scenario*

Crysis 2
Radeon HD 7970 GHz Edition: 68 %
GeForce GTX 680: 65 %
GeForce GTX Titan: 100 %

GeForce GTX Titan vs Radeon HD 7970 GHz Edition: (100/68)*100 = 147 % = 47 % faster
GeForce GTX Titan vs GeForce GTX 680: (100/65)*100 = 154 % = 54 % faster

3DMark 2013 X Firestrike
Radeon HD 7970 GHz Edition: 77 %
GeForce GTX 680: 67 %
GeForce GTX Titan: 100 %

GeForce GTX Titan vs Radeon HD 7970 GHz Edition: (100/77)*100 = 130 % = 30 % faster
GeForce GTX Titan vs GeForce GTX 680: (100/67)*100 = 149 % = 49 % faster

3DMark Vantage GPU
Radeon HD 7970 GHz Edition: 76 %
GeForce GTX 680: 81 %
GeForce GTX Titan: 100 %

GeForce GTX Titan vs Radeon HD 7970 GHz Edition: (100/76)*100 = 132 % = 32 % faster
GeForce GTX Titan vs GeForce GTX 680: (100/81)*100 = 124 % = 24 % faster

Battlefield 3
Radeon HD 7970 GHz Edition: 74 %
GeForce GTX 680: 65 %
GeForce GTX Titan: 100 %

GeForce GTX Titan vs Radeon HD 7970 GHz Edition: (100/74)*100 = 135 % = 35 % faster
GeForce GTX Titan vs GeForce GTX 680: (100/65)*100 = 154 % = 54 % faster

Far Cry 3
Radeon HD 7970 GHz Edition: 70 %
GeForce GTX 680: 73 %
GeForce GTX Titan: 100 %

GeForce GTX Titan vs Radeon HD 7970 GHz Edition: (100/70)*100 = 143 % = 43 % faster
GeForce GTX Titan vs GeForce GTX 680: (100/73)*100 = 137 % = 37 % faster

Hitman
Radeon HD 7970 GHz Edition: 81 %
GeForce GTX 680: 73 %
GeForce GTX Titan: 100 %

GeForce GTX Titan vs Radeon HD 7970 GHz Edition: (100/81)*100 = 124 % = 24 % faster
GeForce GTX Titan vs GeForce GTX 680: (100/73)*100 = 137 % = 37 % faster

Conclusion
GeForce GTX Titan average increase over Radeon HD 7970 GHz Edition: (47 + 30 + 32 + 35 + 43 + 24) / 6 = 35 %
GeForce GTX Titan average increase over GeForce GTX 680: (54 + 49 + 24 + 54 + 37 + 37) / 6 = 42.5 %

http://www.xtremesystems.org/forums/showthread.php?284767-Geforce-Titan-GK110-will-be-a-consumer-part/page26
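The per-benchmark arithmetic above is easy to reproduce. A quick sketch (the dictionary just restates the leaked normalized scores from the chart, with Titan = 100%; the helper name is my own):

```python
# Leaked normalized scores quoted above (Titan = 100 %).
scores = {
    "Crysis 2":                 {"7970 GHz": 68, "GTX 680": 65},
    "3DMark 2013 X Firestrike": {"7970 GHz": 77, "GTX 680": 67},
    "3DMark Vantage GPU":       {"7970 GHz": 76, "GTX 680": 81},
    "Battlefield 3":            {"7970 GHz": 74, "GTX 680": 65},
    "Far Cry 3":                {"7970 GHz": 70, "GTX 680": 73},
    "Hitman":                   {"7970 GHz": 81, "GTX 680": 73},
}

def pct_faster(rival, titan=100):
    """(titan / rival - 1) * 100, i.e. how much faster Titan is, in percent."""
    return (titan / rival - 1) * 100

for card in ("7970 GHz", "GTX 680"):
    gains = [pct_faster(s[card]) for s in scores.values()]
    print(f"Titan vs {card}: {sum(gains) / len(gains):.1f} % faster on average")
# → 35.0 % vs the 7970 GHz Edition, 42.4 % vs the GTX 680
```

(Averaging the unrounded ratios gives 42.4% rather than 42.5%; the post's figure comes from averaging the already-rounded per-game percentages.)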


----------



## Roadkill95

Quote:


> Originally Posted by *CallsignVega*
> 
> Someone is obviously stuck in 1987.


That would be the American automotive industry.


I wish we could go back to the glory days of the 60s. Sigh...

P.S. - I meant GT40 when I said GT80.


----------



## boot318




----------



## maarten12100

Quote:


> Originally Posted by *CallsignVega*
> 
> Someone is obviously stuck in 1987.


Go drive your Ford pickup truck home
while I sit in my Audi; you see my point?

American cars are really all plastic, not meant to last, where European cars are.

Not that cars are related to the subject, of course; just throwing it out there.


----------



## CallsignVega

Oh please, that is ancient news. When is the last time you drove a modern American car?

http://abcnews.go.com/blogs/business/2012/03/american-cars-turning-corner-on-quality/

http://www.huffingtonpost.com/2010/06/17/american-car-quality-tops_n_616592.html

/back on topic.


----------



## Avonosac

Quote:


> Originally Posted by *CallsignVega*
> 
> Oh please, that is ancient news. When is the last time you drove a modern American car?
> 
> http://abcnews.go.com/blogs/business/2012/03/american-cars-turning-corner-on-quality/
> 
> http://www.huffingtonpost.com/2010/06/17/american-car-quality-tops_n_616592.html
> 
> /back on topic.


Back on topic, did you seriously preorder 2? I clicked around forever on translated pages, and it was nothing but 690s for sale.


----------



## chammii

just came across this on ebay

http://www.ebay.com/itm/Asus-Nvidia-GTX-TITAN-GPU-6gb-GK110-/200897669690?pt=PCC_Video_TV_Cards&hash=item2ec66f2a3a

US $1,599.00

says he had 10 to start with


----------



## chammii

Behold the fastest GPU in the world:

6gb of GDDR5 memory (6008mhz) on a 384bit bus, 2,688 shader units, 840mhz GPU Clock all on a single card.

Unit comes equipped with 2x DVI-I, 1x DVI-D, 1 HDMI, and a Mini-Displayport connection.

Bid today and become one of the first owners of the latest, greatest Graphics Card.

I have several of these cards (started with 10) and plan on selling out in my store tomorrow, so cards are first-come-first-serve.

Rush shipping is available! Unless specified, item will be shipped via USPS as soon as payment is received.


----------



## chammii

is this REAL?


----------



## dph314

Quote:


> Originally Posted by *Avonosac*
> 
> Back on topic, did you seriously preorder 2? I clicked around forever on translated pages, and it was nothing but 690s for sale.


He was joking.
Quote:


> Originally Posted by *chammii*
> 
> just came across this on ebay
> 
> http://www.ebay.com/itm/Asus-Nvidia-GTX-TITAN-GPU-6gb-GK110-/200897669690?pt=PCC_Video_TV_Cards&hash=item2ec66f2a3a
> US $1,599.00
> says he had 10 to start with


Yeah, we've seen him. He's been reported multiple times and he's still selling. Maybe he's legit. Who knows.


----------



## Vonnis

Quote:


> Originally Posted by *chammii*
> 
> just came across this on ebay
> 
> http://www.ebay.com/itm/Asus-Nvidia-GTX-TITAN-GPU-6gb-GK110-/200897669690?pt=PCC_Video_TV_Cards&hash=item2ec66f2a3a
> 
> US $1,599.00
> 
> says he had 10 to start with


The funny thing is that he was using pics of a GTX680 at first, then when the Titan pics leaked he changed them. The guy's a fraud.


----------



## Swolern

Quote:


> Originally Posted by *Vonnis*
> 
> The funny thing is that he was using pics of a GTX680 at first, then when the Titan pics leaked he changed them. The guy's a fraud.


What the guy is doing is listing them at an inflated price, waiting for the Titans to release, and then buying however many he sells on eBay for a profit. He doesn't have them on hand; the guy is waiting around just like the rest of us. What a douche.


----------



## guinner16

Quote:


> Originally Posted by *Vonnis*
> 
> The funny thing is that he was using pics of a GTX680 at first, then when the Titan pics leaked he changed them. The guy's a fraud.


Type "Ctrl EZ" into Facebook and his Facebook page pops up. From the Facebook page it looks like an at-home or basement operation. There is a phone number listed, and it would be funny if someone called. I would be surprised if a place like that already had ten in stock, but who knows.


----------



## maarten12100

Quote:


> Originally Posted by *Swolern*
> 
> What the guy is doing is listing them at an inflated amount and is waiting for the Titans to release and buying how ever many he sells on ebay for a profit. He doesnt have them on hand, the guy is waiting around just like the rest of us. What a douche.


He's French, 'nuff said.


----------



## Avonosac

Quote:


> Originally Posted by *dph314*
> 
> He was joking.
> Yeah, we've seen him. He's been reported multiple times and he's still selling. Maybe he's legit. Who knows.


I'm sad, I was too busy to read it all.


----------



## Vonnis

Quote:


> Originally Posted by *Swolern*
> 
> What the guy is doing is listing them at an inflated amount and is waiting for the Titans to release and buying how ever many he sells on ebay for a profit. He doesnt have them on hand, the guy is waiting around just like the rest of us. What a douche.


Indeed. I wonder if anyone is stupid enough to fall for his little scheme. I hope not, but I wouldn't be surprised if someone did.


----------



## Tatakai All

Quote:


> Originally Posted by *maarten12100*
> 
> Go drive your Ford pick up truck home.
> While I sit down in my Audi you see my point?
> 
> American cars are really all plastic not meant to last where European cars are.
> 
> Not that cars are related to the subject of course just throwing it out there.


Sit down in your Audi, I'll be driving my Cadillac.


----------



## striderstone

Quote:


> Originally Posted by *Roadkill95*
> 
> American cars are a just a big joke(except for the original gt80)
> 
> and this is coming from an American


I believe that the Saleen S7 is an AMAZING American car. Just sayin', there are always exceptions.


----------



## mcg75

Quote:


> Originally Posted by *maarten12100*
> 
> Go drive your Ford pick up truck home.
> While I sit down in my Audi you see my point?
> 
> American cars are really all plastic not meant to last where European cars are.
> 
> Not that cars are related to the subject of course just throwing it out there.


I've been in the car repair business for almost two decades.

Audi is a bad example of longevity unless you go in already expecting to pay for an above-average number of repairs.

Anything will last a long time if you're silly enough to keep dumping money into it.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Tatakai All*
> 
> Sit down in your Audi, I'll be driving my Cadillac.


No doubt. The CTS-V and ATS are great cars.


----------



## dph314

Quote:


> Originally Posted by *Avonosac*
> 
> Quote:
> 
> 
> 
> Originally Posted by *dph314*
> 
> He was joking.
> Yeah, we've seen him. He's been reported multiple times and he's still selling. Maybe he's legit. Who knows.
> 
> 
> 
> I'm sad, I was too busy to read it all.

Aw, don't be sad


Don't worry though. You really didn't miss much. We know about as much now as we did a month ago.


----------



## Avonosac

Quote:


> Originally Posted by *dph314*
> 
> Aw, don't be sad
> 
> 
> Don't worry though. You really didn't miss much. We know about as much now as we did a month ago.


Shocking.


----------



## Majin SSJ Eric

Sucks. I don't get all the secrecy and NDA crap anyway. Nvidia knows AMD has nothing to compete with GK110, so why bother being so secretive about it?


----------



## Avonosac

Corporate policy. If you start getting lax during one release, you'll be lax during the next...


----------



## Kiracubed

So TONIGHT is the night, right? They said it got pushed back to the 19th, with reviews coming the 21st?


----------



## dph314

Quote:


> Originally Posted by *Kiracubed*
> 
> So TONIGHT is the night, right? They said it got pushed back to the 19th, with reviews coming the 21st?


I think we should see something soon. 9am EST today, or Thursday. The 680s released at 9am on a Thursday as well.


----------



## Kiracubed

Quote:


> Originally Posted by *dph314*
> 
> I think we should see something soon. 9am EST today, or Thursday. The 680s released at 9am on a Thursday as well.


Thanks for the info! Would it just show up listed at e-tailers like Newegg and Amazon with a simple "GeForce Titan" search, or direct from Nvidia, EVGA, and ASUS? Sorry for the million questions; I just want to be able to buy it if I think the final price is good. I keep hearing anywhere from $900 to $1500!


----------



## maxmix65

GTX TITAN also comes with GPU Boost 2.0, which allows increased clock speeds and performance, plus an additional manual overclocking ceiling (up to 1.10GHz) given to consumers if they understand that going this extra step will diminish the longevity of the card (and presumably the warranty). Of course, partner OEMs like ASUS and Gigabyte will have the ability to disable the OverVoltage option within the BIOS itself.
http://uptiki.altervista.org/viewer.php?file=r1xc88gno2ysn2z1jvcg.jpg


----------



## striderstone

Here is a bunch of stuff that I am ripping from the EVGA forums since it doesn't look like it's been posted here...

Source: http://www.anandtech.com/...rce-gtx-titan-part-1/2

" Moving on, with a $999 launch price NVIDIA's competition will be rather limited. The GTX 690 is essentially a companion product; NVIDIA's customers can either get the most powerful single-GPU card NVIDIA offers in a blower design, or an alternative design composed of two lesser GPUs in SLI, in a front and rear exhausting design. The GTX 690 will be the faster card, but at a higher TDP and with the general drawbacks of SLI. On the other hand Titan will be the more consistent card, the lower TDP card, the easier to cool card, but also the slower card. Meanwhile though it's not a singular product, the GTX 680 SLI will also be another option, offering higher performance, higher TDP, more noise, and a cheaper price tag of around $900.

As for AMD, with their fastest single-GPU video card being the 7970 GHz Edition, offering performance closer to the GTX 680 than Titan, Titan essentially sits in a class of its own on the single-GPU front. AMD's competition for Titan will be the 7970GE in CrossFire, and then the officially unofficial 7990 family, composed of the air cooled PowerColor 7990, and the closed loop water cooled Asus Ares II. But with NVIDIA keeping GTX 690 around, these are probably closer competitors to the multi-GPU 690 than they are the single-GPU Titan."

http://www.geforce.com/ha...-gtx-titan/performance

http://hexus.net/tech/reviews/graphics/51857-nvidia-geforce-gtx-titan-6gb-graphics-card-overview/

http://www.legitreviews.com/article/2143/1/

http://www.pcper.com/revi...rclocking-and-GPGPU/Bi

http://www.guru3d.com/articles_pages/geforce_gtx_titan_preview_reference.html

MSRP = 999.99USD
no 4-way SLI (3-way max)
I will be passing on this card.


----------



## guinner16

Quote:


> Originally Posted by *striderstone*
> 
> Here is a bunch of stuff that I am ripping from the EVGA forums since it doesn't look like it's been posted here...
> 
> Source: http://www.anandtech.com/...rce-gtx-titan-part-1/2
> 
> " Moving on, with a $999 launch price NVIDIA's competition will be rather limited. The GTX 690 is essentially a companion product; NVIDIA's customers can either get the most powerful single-GPU card NVIDIA offers in a blower design, or an alternative design composed of two lesser GPUs in SLI, in a front and rear exhausting design. The GTX 690 will be the faster card, but at a higher TDP and with the general drawbacks of SLI. On the other hand Titan will be the more consistent card, the lower TDP card, the easier to cool card, but also the slower card. Meanwhile though it's not a singular product, the GTX 680 SLI will also be another option, offering higher performance, higher TDP, more noise, and a cheaper price tag of around $900.
> 
> As for AMD, with their fastest single-GPU video card being the 7970 GHz Edition, offering performance closer to the GTX 680 than Titan, Titan essentially sits in a class of its own on the single-GPU front. AMD's competition for Titan will be the 7970GE in CrossFire, and then the officially unofficial 7990 family, composed of the air cooled PowerColor 7990, and the closed loop water cooled Asus Ares II. But with NVIDIA keeping GTX 690 around, these are probably closer competitors to the multi-GPU 690 than they are the single-GPU Titan."
> 
> http://www.geforce.com/ha...-gtx-titan/performance
> 
> http://hexus.net/tech/reviews/graphics/51857-nvidia-geforce-gtx-titan-6gb-graphics-card-overview/
> 
> http://www.legitreviews.com/article/2143/1/
> 
> http://www.pcper.com/revi...rclocking-and-GPGPU/Bi
> 
> http://www.guru3d.com/articles_pages/geforce_gtx_titan_preview_reference.html
> 
> MSRP = 999.99USD
> no 4-way SLI (3-way max)
> I will be passing on this card.


I am pretty sure the Tom's Hardware preview said 4-way SLI is supported.


----------



## striderstone

Quote:


> Originally Posted by *guinner16*
> 
> I am pretty sure toms hardware preview said 4 way sli is supported.


Hmmm, you are correct, they do say that.
So we have conflicting information. This is the deciding factor in whether I get these or not. I guess I really will just have to wait.


----------



## maarten12100

Quote:


> Originally Posted by *striderstone*
> 
> hmmm you are correct, they do say that.
> What a coincidence, conflict of information. This is the deciding factor if I get these or not. I guess I really will just have to wait.


Better to go 3-way with this, as 4-way never scales well.
Unless you are planning on BF3 only, which would be a waste as it is too strong.
You might actually be losing performance versus 3-way.


----------



## maarten12100

double post

Edit: post #1000


----------



## striderstone

Quote:


> Originally Posted by *maarten12100*
> 
> Better to go 3 way with this as 4 way doesn't scale well ever.
> Unless you are planing on BF3 only which would be a waste as it is too strong.
> You might actually be losing performance over 3 way


I'm just waiting for video rendering software to come out with multi-GPU support. I just want to be ready for when that happens, because right now it's a pain in the ******* ass to render 40-minute-long 1080p videos







At least this has more CUDA cores, so I will hopefully get one for that reason alone. I will go 4-way because I invested in a 4-way SLI board... therefore I must use it or I wasted money.


----------



## maarten12100

Quote:


> Originally Posted by *striderstone*
> 
> i'm just waiting for a video rendering software to come out with multi-GPU support. I just want to be ready for when that happens, because right now it's a pain in the ******* ass to render 40 minute long 1080p videos
> 
> 
> 
> 
> 
> 
> 
> At least this has more CUDA cores, so I will hopefully get one for that reason alone. I will go 4-way because I invested in a 4-way SLI board... therefore I must use it or I wasted money.


If you want to use them for accelerated computing you don't need to SLI them; you can have them running side by side,
since it's not like they have to communicate if they can split the work and never join it again, you know what I'm saying?


----------

