# [Official] NVIDIA Titan X Pascal Owners Thread



## GunnzAkimbo

http://www.geforce.com/hardware/10series/titan-x-pascal

Elite members welcome 24/7.









Peasants' opening times: 4:45pm - 5:00pm.

Peasants must remain behind barriers at all times.

DO NOT TOUCH IT. DO NOT SNEEZE. EVER! DO NOT BREATHE ON IT...

I SAID DON'T BREATHE ON IT!









and no RGBs... don't you even think about it


----------



## Leyaena

I'm straight up disappointed. No GP100, no HBM2, and a higher price tag than the previous TX... I'll wait for the benchmarks, but I'll probably be passing on this one. At an MSRP like that, I'd expect it to be 1.5k EUR over here at the minimum.


----------



## GunnzAkimbo

imagine the AUD then... $2000 or thereabouts.


----------



## Mhill2029

Quote:


> Originally Posted by *Leyaena*
> 
> I'm straight up disappointed. No GP100, no HBM2 and a higher price tag that the previous TX... I'll wait for the benchmarks, but I'll probably be passing on this one. At an msrp like that, I'd expect it to be 1.5k EUR over here at the minimum.


It was something I expected to see happen this year, but not August 2nd. That was waaaaaaay sooner than I expected.

I'm more concerned with how GTX 1080 owners must be feeling right now, having spent so much money on the GTX 1080, which was almost Titan X money in the first place for a midrange die, only for this monster to come along so soon after. Especially when you consider how long it took before people actually got hold of the GTX 1080 and SLI HB Bridge due to low stocks.

This is probably the biggest stab in the back I've ever seen from Nvidia. 6 months later, sure, but 2 months? Man, that's gotta hurt....


----------



## Corsa911

Why would any half-intelligent person purchase computer components solely based on them being the best, rather than on what fits their needs?

If I were a 1080 owner I couldn't care less what new card came out, so long as I was still hitting my intended frame rates based on the research I would have done before purchasing a high-end component.

I don't really think 1080 owners care about the new Titan X release, nor should they.


----------



## Lays

Quote:


> Originally Posted by *Mhill2029*
> 
> It was something I expected to see happen this year, but not August 2nd. That was waaaaaaay sooner than I expected.
> 
> I'm more concerned with how GTX 1080 owners must be feeling right now, spending so much money on the GTX 1080 which was almost Titan X money in the first place for a midrange die, only for this monster to come along so soon after. Especially when you consider how long it was before people actually got hold of the GTX 1080 and SLI HB Bridge due to low stocks.
> 
> This is probably the biggest stab in the back I've ever seen from Nvidia. 6months later sure, but 2months? Man that's gotta hurt....


From the looks of it, the 1080 still has better price-to-performance. I've seen claims that the TX Pascal is supposed to be ~60% faster than the old TX, which puts it around 25-30% faster than the 1080, but it costs almost 2x as much.


----------



## meson1

Quote:


> Originally Posted by *Leyaena*
> 
> I'm straight up disappointed. No GP100, no HBM2 and a higher price tag that the previous TX... I'll wait for the benchmarks, but I'll probably be passing on this one. At an msrp like that, I'd expect it to be 1.5k EUR over here at the minimum.


I have a theory.

I'm sure I read that Nvidia pushed Volta back to 2018. So I think they're stretching Pascal out over two years. Like this.

They release the flagship 1080 based on GP104 (as they have done already). But now the 1080 outperforms the Titan X, so to rectify the situation they release a new Titan X based on GP102 with GDDR5X memory.

NEXT YEAR, 2017: that's when they release the GP100 with HBM2, positioned above the Titan X (or replacing it), and release a slightly cut-down GP100 as the 1080 Ti. Both with performance improvements over the 2016 releases.

Just a theory.


----------



## Gary2015

Quote:


> Originally Posted by *Lays*
> 
> From the looks of it, the 1080 still has better price to performance. I've seen claims that the TX Pascal is supposed to be ~60% faster than the old TX, which puts it around the ~25-30% faster than 1080, but it almost costs 2x as much.


Yes, agreed. For that price I can get 2x GTX 1080. The 1080 has a better price/performance ratio. Unless you are running 4K and want over 60fps, a 1080 is a better bet.


----------



## Zurv

oh god.. i have soooo many pointless video cards now.
After i upgrade to these new titans.. i'll have 8! gtx 1080s and 7 Titan X...

anyone looking to buy some cards?


----------



## mouacyk

You guys have a lot of catching up to do:
http://www.overclock.net/t/1606550/twitter-jen-hsun-introduces-the-new-nvidia-titan-x


----------



## wirk

Quote:


> Originally Posted by *Zurv*
> 
> oh god.. i have soooo many pointless video cards now.
> After i upgrade to these new titans.. i'll have 8! gtx 1080s and 7 Titan X... anyone looking to buy some cards


I would, if I were sure seven such cards would run on a single mobo with 7 PCIe slots (with watercooling the cards can be converted to single-slot).








Anyway, in my opinion the just-announced Titan X is not the last one in the Pascal line: it has a 250 W TDP and GDDR5X RAM, while the P100 for compute has a 300 W TDP and HBM2 memory. One can thus expect a similar Titan XXL with a TDP of 300 W, HBM2 memory, and a killer heart-attack price.









----------



## zipeldiablo

Quote:


> Originally Posted by *Zurv*
> 
> oh god.. i have soooo many pointless video cards now.
> After i upgrade to these new titans.. i'll have 8! gtx 1080s and 7 Titan X...
> 
> anyone looking to buy some cards


Indeed I am.


----------



## Juub

Quote:


> Originally Posted by *Mhill2029*
> 
> It was something I expected to see happen this year, but not August 2nd. That was waaaaaaay sooner than I expected.
> 
> I'm more concerned with how GTX 1080 owners must be feeling right now, spending so much money on the GTX 1080 which was almost Titan X money in the first place for a midrange die, only for this monster to come along so soon after. Especially when you consider how long it was before people actually got hold of the GTX 1080 and SLI HB Bridge due to low stocks.
> 
> This is probably the biggest stab in the back I've ever seen from Nvidia. 6months later sure, but 2months? Man that's gotta hurt....


Why would they feel bad? That card is twice the price of a GTX 1080 for 30% more performance.


----------



## Zurv

Quote:


> Originally Posted by *Zurv*
> 
> oh god.. i have soooo many pointless video cards now.
> After i upgrade to these new titans.. i'll have 8! gtx 1080s and 7 Titan X...
> 
> anyone looking to buy some cards


oh pooo.. I can't post anything in the market till I have 35 rep.. hrmm.. I wonder what is needed to set up a selling account on Amazon.


----------



## Steven185

Quote:


> Originally Posted by *Juub*
> 
> Why would they feel bad? That card is twice the price of a GTX 1080 for 30% more performance.


Depends on whether one overclocks or not. I suspect that the new Titan X will be loaded: it will have the overclockability of a Founders Edition card. So if you overclock, you can get quite a lot more out of it compared to a GTX 1080. For example, if you reach 2-2.1 GHz (which is very possible for a Founders Edition card), you get at least 30% on top of its initial performance. That totals to around 70% above the GTX 1080.

You can say that you can overclock the GTX 1080 too. But due to its high starting clocks and aggressive adaptive boosting, you don't actually net more than 10% additional performance. If all of the above is true, you can have 50-60% greater performance compared to a GTX 1080 merely by overclocking your Titan X. This in turn would make the Titan X (overclocked) the first true 4K card (minimum FPS > 50 in most games maxed @ 4K).

Can't say I'm not excited; what I can say is that I'll probably not be able to justify that cost to myself (or my wife).


----------



## Derpinheimer

Quote:


> Originally Posted by *Steven185*
> 
> Depends on whether one overclocks or not. I suspect that the new Titan X would be loaded, it will have the overclockability of a founder's edition card. So if you overclock you would get it quite a lot higher than that as compared to a GTX 1080. For example if you reach 2-2.1 GHz (which is very possible for a founder edtion's card), you get at least 30% on top of its initial performance. That totals to around 70% above GTX 1080.
> 
> You can say that you can overclock GTX 1080 too. But due to the high starting clocks and aggressive adaptive overclocking you don't actually net more than 10% additional performance. If all of the above are true, you can have +50-60% of greater performance as compared to GTX 1080 merely by overclocking your Titan X. This in turn would make Titan X (overclocked) the first 4K card (minimum FPS to most games maxed @ 4K > 50FPS)
> 
> Can't say I'm not excited, what I can say is that I'll probably not be able to justify that cost to me (or my wife).


1417 vs 1607 MHz base clock.

So even if they both OC to 2100 on average, the Titan would still be less than 50% faster (1.14 * 1.3).
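The point can be sketched as back-of-the-envelope math. This is a sketch under the idealized assumption that performance scales linearly with shader count times clock speed; the 2560/3584 core counts are the two cards' rated shader counts, and the shared 2100 MHz figure is the hypothetical overclock from the post:

```python
# Idealized scaling: performance ~ shader cores x clock speed.
CORES_1080, CORES_TITAN = 2560, 3584   # GTX 1080 vs Titan X (Pascal)

oc_clock = 2100  # MHz, hypothetical: both cards overclocked to the same speed

perf_1080 = CORES_1080 * oc_clock
perf_titan = CORES_TITAN * oc_clock
print(f"Titan advantage at equal clocks: +{perf_titan / perf_1080 - 1:.0%}")  # +40%
```

At identical clocks the gap collapses to the core-count ratio (3584/2560 = 1.4), so it stays under 50% either way.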


----------



## jtom320

Quote:


> Originally Posted by *Steven185*
> 
> Depends on whether one overclocks or not. I suspect that the new Titan X would be loaded, it will have the overclockability of a founder's edition card. So if you overclock you would get it quite a lot higher than that as compared to a GTX 1080. For example if you reach 2-2.1 GHz (which is very possible for a founder edtion's card), you get at least 30% on top of its initial performance. That totals to around 70% above GTX 1080.
> 
> You can say that you can overclock GTX 1080 too. But due to the high starting clocks and aggressive adaptive overclocking you don't actually net more than 10% additional performance. If all of the above are true, you can have +50-60% of greater performance as compared to GTX 1080 merely by overclocking your Titan X. This in turn would make Titan X (overclocked) the first 4K card (minimum FPS to most games maxed @ 4K > 50FPS)
> 
> Can't say I'm not excited, what I can say is that I'll probably not be able to justify that cost to me (or my wife).


You're assuming it will overclock like a 1080. Pretty big assumption. Why in the world would Nvidia neuter the Titan's performance with clock speed out of the box anyway? It doesn't even make sense for them to do that to their flagship.

Anyway, these cards are priced very far apart. 700 dollars is a lot for me to spend on a GPU; 1200 is unthinkable. It will be cool to see, but speaking as a 1080 owner, I was never going to buy it even if it had launched first.


----------



## bfedorov11

Quote:


> Originally Posted by *Zurv*
> 
> oh pooo.. i can' t post anything in the market till i have 35 rep.. hrmm.. i wonder what is needed to setup a selling account on amazon


Don't use amazon. I've read about lots of people getting ripped off selling expensive items. Use /r/hardwareswap
Quote:


> Originally Posted by *jtom320*
> 
> You're assuming it will overclock like a 1080. Pretty big assumption. Why in the world would Nvidia neuter the Titan's performance with clock speed out of the box anyway? It doesn't even make sense for them to do that to their flagship.


Probably to keep it at 250W. But I agree, I don't see it clocking past 2000 like the other cards. Heat and the power target will be killers. The FE cooler can't even keep the smaller chips cool.


----------



## Tobiman

Quote:


> Originally Posted by *Juub*
> 
> Why would they feel bad? That card is twice the price of a GTX 1080 for 30% more performance.


If you can actually find one for that price.


----------



## pewpewlazer

A 20% price increase with no actual differences over the previous gen? At least Intel added some cores when they made a totally insane price increase. But I guess that's what you can get away with when you have literally zero competition and the future of multi-GPU is grim...


----------



## Kpjoslee

Quote:


> Originally Posted by *pewpewlazer*
> 
> 20% price increase over previous gen with no actual differences over the previous gen? *At least Intel added some cores when they made a totally insane price increase.* But I guess that's what you can get away with when you have literally zero competition and the future of multi-GPU is grim...


Nvidia did the exact same thing here; they just added more cores.


----------



## Steven185

Quote:


> Originally Posted by *jtom320*
> 
> You're assuming it will overclock like a 1080. Pretty big assumption. Why in the world would Nvidia neuter the Titan's performance with clock speed out of the box anyway? It doesn't even make sense for them to do that to their flagship.
> 
> Anyway, these cards are priced very far apart. 700 dollars is a lot for me to spend on a GPU; 1200 is unthinkable. It will be cool to see, but speaking as a 1080 owner, I was never going to buy it even if it had launched first.


That's easy: to keep the TDP in check. The original Titan X could easily reach the 1500 MHz range, which is very much the range of the lower Maxwell cards...


----------



## Steven185

Quote:


> Originally Posted by *Derpinheimer*
> 
> 1417 vs 1607 base clock
> 
> So even if they both OC to 2100 on average, then the Titan would still be less than 50% faster. (1.14*1.3)


If the clocks adapt less aggressively, then it would be a 50%+ performance difference... Also, non-FE cards (which many of the people here probably have) would probably not reach as high as the Titan X (binned, handpicked chips; the ones of lesser quality would probably become the 1080 Ti)....
Quote:


> Originally Posted by *bfedorov11*
> 
> Don't use amazon. I've read about lots of people getting ripped off selling expensive items. Use /r/hardwareswap
> Probably to keep it at 250w. But I agree, I don't see it clocking past 2000 like the other cards. Heat and PT will be killer. The FE cooler can't even keep the smaller chips cool.


Keep in mind that Titan X chips are probably binned (non-binned ones would become the 1080 Ti series once enough stock of them is built up). Binned chips don't need as much voltage. That's why you'd get something similar to the original Titan X (oftentimes 1500+ MHz clocks, which is what the lower cards would get too).


----------



## loki993

Quote:


> Originally Posted by *Mhill2029*
> 
> It was something I expected to see happen this year, but not August 2nd. That was waaaaaaay sooner than I expected.
> 
> I'm more concerned with how GTX 1080 owners must be feeling right now, spending so much money on the GTX 1080 which was almost Titan X money in the first place for a midrange die, only for this monster to come along so soon after. Especially when you consider how long it was before people actually got hold of the GTX 1080 and SLI HB Bridge due to low stocks.
> 
> This is probably the biggest stab in the back I've ever seen from Nvidia. 6months later sure, but 2months? Man that's gotta hurt....


First off, I don't consider a 400 dollar difference... almost Titan money. Second, I'd be happy if I'd bought a 1080 and then they announced the new Titan: the new Titan is twice the price of a 1080 and seems like it's only going to be about 30 percent faster, which doesn't seem like enough for something that expensive. Also keep in mind this thing's 200 bucks more than the Titan it's replacing.

Quote:


> Originally Posted by *meson1*
> 
> I have a theory.
> 
> I'm sure I read that Nvidia pushed Volta back to 2018. So I think they're stretching Pascal out over two years. Like this.
> 
> They release flagship 1080 based on GP104 (as they have done already). But now the 1080 outperforms Titan X, so to rectify the situation they release new Titan X based on GP102 with GDDR5X memory.
> 
> NEXT YEAR, 2017: Is when they release the GP100 with HBM2 positioned above Titan X (or replacing it) and release a slightly cut down GP100 as 1080Ti. Both with performance improvements over the 2016 releases.
> 
> Just a theory.


With the prices cards are coming out at now, could you imagine how much those cards would cost if what you're saying is true? If the jump in price of the new Titan is any indication, you're probably talking about a $2k and a $1500 graphics card, respectively.


----------



## meson1

Quote:


> Originally Posted by *loki993*
> 
> With the prices cards are coming out at now could you imagine how much those cards would be if what you're saying is true? If the jump in price of the new titan is any indication you're probably talking about a 2k and a 1500 dollar graphics card respectively.


I withdraw that theory now. I misunderstood what the GP100 was. GP100 is the double-precision chip intended for compute markets with the Tesla line.

However, with the latest news that Nvidia have supposedly decided to bring Volta forward by putting it on 16nm, we are now looking at 2017 for GVxxx products. Pascal was only supposed to be a stopgap until Volta. It's the Volta architecture that's supposed to represent the significant jump.


----------



## versions

Quote:


> Originally Posted by *Steven185*
> 
> Depends on whether one overclocks or not. I suspect that the new Titan X would be loaded, it will have the overclockability of a founder's edition card. So if you overclock you would get it quite a lot higher than that as compared to a GTX 1080. For example if you reach 2-2.1 GHz (which is very possible for a founder edtion's card), you get at least 30% on top of its initial performance. That totals to around 70% above GTX 1080.
> 
> You can say that you can overclock GTX 1080 too. But due to the high starting clocks and aggressive adaptive overclocking you don't actually net more than 10% additional performance. If all of the above are true, you can have +50-60% of greater performance as compared to GTX 1080 merely by overclocking your Titan X. This in turn would make Titan X (overclocked) the first 4K card (minimum FPS to most games maxed @ 4K > 50FPS)
> 
> Can't say I'm not excited, what I can say is that I'll probably not be able to justify that cost to me (or my wife).


Here are three Time Spy links: one with the card underclocked to the ~1670MHz that Computerbase saw as the average clock speed on the air-cooled FE after letting the card heat up, one stock, and one mostly-stable overclocked run.
http://www.3dmark.com/spy/80728
http://www.3dmark.com/spy/80588
http://www.3dmark.com/spy/48466

Compared to the underclocked result (a stand-in for the FE at stock on air), the score is 26% higher. Compared to the stock one, where it's already running almost 200MHz over the boost clock due to being under water, it's 15%. This is also on the stock BIOS, and one can expect it to go up once a custom BIOS becomes available.

Yeah, the 1080 overclocks itself far beyond the boost clock (mine hovers around 1898MHz and 1911MHz stock), but the Titan will do the same thing, which cuts into the gains you would see when overclocking just the same. GM204 also reaches higher clock speeds than GM200; it could very well be the same here, with the Titan falling 50-100MHz behind the 1080 when overclocked.

The GTX 1080 is an 8.9 TFLOPS card at 1733MHz, the Titan 11 TFLOPS. That makes 24%. Let's say memory bandwidth brings that up to 30% at stock. Then let's also say that it overclocks a bit better, and maybe we'll have 35% at most when both are overclocked. If you think it'll be 70%, you're setting yourself up for a major disappointment.
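The TFLOPS figures quoted above follow from the standard peak-FP32 formula (2 FLOPs per core per clock, one fused multiply-add). A small sketch using the rated boost clocks from the post:

```python
def fp32_tflops(cores, clock_mhz):
    """Peak FP32 throughput: 2 FLOPs (one fused multiply-add) per core per cycle."""
    return 2 * cores * clock_mhz * 1e6 / 1e12

gtx_1080 = fp32_tflops(2560, 1733)  # ~8.9 TFLOPS at rated boost
titan_x  = fp32_tflops(3584, 1531)  # ~11 TFLOPS at rated boost
print(f"raw compute advantage: +{titan_x / gtx_1080 - 1:.0%}")  # +24%
```

The Titan's core-count advantage is partly eaten by its lower rated clocks, which is where the modest 24% comes from.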


----------



## Steven185

Quote:


> Originally Posted by *versions*
> 
> Here are three Time Spy links, one with the card underclocked to reach the ~1670MHz that Computerbase saw as their average clock speed on the air-cooled FE after letting the card heat up, one stock and one overclocked mostly stable run.
> http://www.3dmark.com/spy/80728
> http://www.3dmark.com/spy/80588
> http://www.3dmark.com/spy/48466
> 
> Compared to the underclocked result to try and compare it to the FE at stock on air, the score is 26% higher. Compared to the stock one, where it's already running almost 200MHz over the boost clock due to being under water, it's 15%. This is also on stock BIOS and one can expect it to go up once custom BIOS becomes available.
> 
> Yeah, the 1080 overclocks itself far beyond the boost clock, mine hovers around 1898MHz and 1911MHz stock, but the Titan would do the same thing, which cuts into the gains you would see when overclocking just the same. GM204 also reaches higher clock speeds than GM200, could very well be the same here that the Titan will fall 50-100MHz behind the 1080 when overclocked.
> 
> GTX 1080 is a 8.9TFLOPs card at 1733MHz, the Titan 11TFLOPs. That makes 24%. Let's say that the memory bandwidth brings that up to 30% at stock. Then let's also say that it overclocks a bit better, and maybe we'll have 35% at most when both are overclocked. If you think it'll be 70% you're setting yourself up for a major disappointment.


I explained how it will be 70%:
30% at stock and another 30% from overclocking (2000 MHz is 30% more than 1531 MHz). 1.3 * 1.3 = 1.69 = 69% more performance.

There's nothing to disagree with there; it's a matter of arithmetic that the card will be ~70% faster for overclockers (compared to a stock GTX 1080).

Of course, one may overclock the GTX 1080 as well. According to reviews (even overclockers-club, which does some of the most extreme overclocks), you won't get more than 10-12%. Due to the TDP limit (I assume), Nvidia keeps the clocks down in real-world use. In benchmarks things are possibly different; in games, not so much.

A combination of better bins (= lower power draw even under overclock) and a higher TDP limit would allow the Titan X to sustain its clocks. I suspect that to be the case because that's what happened with my original Titan X as well. I could keep it at 1500 MHz sustained...

In light of the above, $1200 is still too expensive, but if it were $1000 it would be a bargain (especially since you rarely find a GTX 1080 for less than $700)...

Yes, unlocking the TDP limit on the GTX 1080 may change all that, but even then the difference will hover around 40-45% (due to the similar clocks).
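The compounding arithmetic here can be written out explicitly. Every input below is one of the post's own assumptions (a 30% stock lead, a hypothetical 2.0 GHz overclock against the 1531 MHz rated boost, and a ~10% net OC gain on the 1080), not a measurement:

```python
stock_advantage = 1.30          # claimed Titan X lead over a stock GTX 1080
titan_oc_gain   = 2000 / 1531   # ~1.31: hypothetical 2.0 GHz vs 1531 MHz rated boost
gtx1080_oc_gain = 1.10          # ~10% net OC gain on a 1080, per the cited reviews

vs_stock_1080 = stock_advantage * titan_oc_gain
vs_oc_1080    = vs_stock_1080 / gtx1080_oc_gain
print(f"OC Titan vs stock 1080: +{vs_stock_1080 - 1:.0%}")  # +70%
print(f"OC Titan vs OC 1080:    +{vs_oc_1080 - 1:.0%}")     # +54%
```

Dividing out the 1080's own OC gain is what pulls the figure back down from 70% toward the 40-45% conceded at the end of the post.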


----------



## versions

Quote:


> Originally Posted by *Steven185*
> 
> I explained how it will be 70%.
> 30% stock and another 30% from overclocking (2000 Mhz is 30% more than 1531mhz). 1.3*1.3=1.69 = 69% more performance
> 
> There's nothing to disagree there, it's a matter of math that the card will be 70% faster for overclockers (as compared to stock GTX 1080)
> 
> Of course one may also overclock GTX 1080 as well. According to reviews (even overclockers-club that makes some of the most extreme overclocks) you won't get more than 10%-12%. Due to the TDP limit (I assume) nVidia keeps the clocks down in real time performance. In benchmarks things are possibly different, in games not so.
> 
> A combination of better bins (=lower TDP even under overclock) + higher TDP limit would allow Titan X to sustain the clocks. I suspect that to be the case because that's happened to my original Titan X as well. I could get at 1500 Mhz sustained...
> 
> Under the light of the above $1200 is still too expensive, but if it was $1000 it would be a bargain (since you rarely find GTX 1080 for less than $700)...
> 
> Yes unlocking the TDP limit on GTX 1080 may change all that, but even then the difference will hover around 40-45% (due to the similar clocks).


It won't overclock 30% because it will boost over the rated boost clock at stock just like every other Pascal card does. It probably won't be able to maintain said clock speed for a very long time on air though, and would throttle down like the 1080. That won't happen on water, which should also be needed to overclock it properly. The results you see in benchmarks for stock clocks are probably going to be a good bit higher than the boost clock.

Let's say it runs at 1650MHz on air at stock for a short time before it throttles down, and maintains 1700MHz under water. If we then say that it does 2000MHz on water with stock BIOS, that's 18% compared to stock (and my 1080 does 15% under the same circumstances, with mine being average).

It won't hold the clocks on air, pretty sure it's the same cooler as it is on the 1080 and it's a good bit more power hungry than the 1080 so if anything it should struggle more to maintain the clocks.

The Titan X will be faster than the 1080, yes. A good bit faster, too. Just don't overhype it because you will be disappointed.
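The hypothetical clocks above translate to only modest OC headroom; a sketch using those same made-up figures (1700 MHz sustained stock under water, 2000 MHz overclocked):

```python
stock_water = 1700  # MHz, hypothetical sustained stock clock under water
oc_water    = 2000  # MHz, hypothetical overclock under water, stock BIOS

print(f"OC gain over sustained stock: +{oc_water / stock_water - 1:.0%}")  # +18%
```

Because Pascal already boosts well past the rated clock at stock, the realized overclocking gain is measured against that sustained boost, not the box spec, which is why it lands near 18% rather than 30%.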


----------



## Steven185

Quote:


> Originally Posted by *versions*
> 
> It won't overclock 30% because it will boost over the rated boost clock at stock just like every other Pascal card does. It probably won't be able to maintain said clock speed for a very long time on air though, and would throttle down like the 1080. That won't happen on water, which should also be needed to overclock it properly. The results you see in benchmarks for stock clocks are probably going to be a good bit higher than the boost clock.
> 
> Let's say it runs at 1650MHz on air at stock for a short time before it throttles down, and maintains 1700MHz under water. If we then say that it does 2000MHz on water with stock BIOS, that's 18% compared to stock (and my 1080 does 15% under the same circumstances, with mine being average).
> 
> It won't hold the clocks on air, pretty sure it's the same cooler as it is on the 1080 and it's a good bit more power hungry than the 1080 so if anything it should struggle more to maintain the clocks.
> 
> The Titan X will be faster than the 1080, yes. A good bit faster, too. Just don't overhype it because you will be disappointed.


OK, to be fair, I haven't tried any Pascal card yet; all I know, I know from reviews. I did try the Maxwell Titan X though, and even on air I could sustain 1500 MHz. No water needed, just good case ventilation; that's 38% over boost. So the Titan series is known for insane overclocks over its initial clocks.

I think we should take that into account too instead of quoting the oft-cited 30%. Yes, it's only 30% faster if you keep it stock, but if you've just spent that much money you'd better not keep it stock, especially if it overclocks as well as the original Titan X. Better overclockability + 30% may actually give insane performance.

Of course, like you said, it may not be able to sustain clocks. If so... bummer. But if it can sustain high clocks while gaming, even without water, that would be huge. My 70% (over a stock GTX 1080) figure would not be far from reality...


----------



## loki993

Quote:


> Originally Posted by *meson1*
> 
> I withdraw that theory now. I misunderstood what the GP100 was. GP100 is the double precision chip intended for compute markets with the Tesla line.
> 
> However, with the latest news that Nvidia have supposedly decided to bring Volta forward by putting it on 16nm, we are now looking at 2017 for GVxxx products. Pascal was only supposed to be a stop gap until Volta. It's the Volta architecture is supposed to represent a significant jump.


I wasn't calling you out; I was merely stating that, going by the prices in this cycle of cards, anything better than what's out now or coming soon will be mind-numbingly expensive.


----------



## meson1

Quote:


> Originally Posted by *loki993*
> 
> I wasn't calling you out, I was merely stating that going by the prices in this cycle of cards anything better that whats out now or coming soon will be mind numbingly expensive.


No, it's fine. It wasn't that. What changed my mind was the news that Nvidia is bringing Volta forward to 2017 by manufacturing it on the established 16nm process. That changed the landscape of everything.

We're cool, dude.

And yes, depressingly, Nvidia will continue their policy of gradually increasing prices. Look, they have us conditioned to expect it already.

Next time round, the xx80 will be $800. The Titan will be $1500.

Our only hope is for AMD to turn it around and suddenly compete at the top end, putting genuine pressure on Nvidia for market share in that performance bracket.


----------



## dante`afk

Just comparing the numbers, 2x 1080 will still be faster than the new Titan, right?


----------



## mouacyk

Quote:


> Originally Posted by *dante`afk*
> 
> Just comparing the numbers, 2x 1080 will still be faster than the new titan right?


Where multi-GPU or SLI scales well, of course.

2560 * 2 > 3584


----------



## GunnzAkimbo

SLI is like 2x CPU sockets: it needs application-specific support.

It's still awesome, and if you have the money you'd do it, but

one fast video card is better, just as one fast CPU is, for mainstream bug-free usage.

I would push one video card over SLI, and I have run SLI since the 295 series and CrossFire since the ATI 2900 series (only because SLI support is lacklustre, or you have to wait quite a long time for it, by which point something new is out and the old game is finished and done with).

IF Nvidia really improved support for SLI, then that's a different story. IF.


----------



## Lass3

Let the milking ~~begin~~ continue

AMD we need Vega


----------



## meson1

I think Nvidia are trying to force the hands of the software houses. I think Nvidia originally intended for software to be written with explicit SLI. I think almost no-one did except benchmark designers. Most publishers relied on Nvidia's implicit SLI to do the leg work for them, only tweaking titles where necessary so they didn't crash or have issues. But they often didn't go as far as optimising games for SLI.

My understanding is that Nvidia are taking a gamble: by withdrawing implicit SLI for DX12 onwards, they hope they can force publishers to write for explicit SLI and thereby dramatically improve the scaling that can be achieved with SLI. To help them, they have reduced formal SLI support to a maximum of 2-way, so that the number of GPU combinations to write and test for is reduced. If it works, and software IS written and optimized for SLI, and if scaling IS much improved, I think Nvidia hope that many more buyers may actually be persuaded to buy extra cards for two-way SLI configurations.


----------



## GunnzAkimbo

Quote:


> Originally Posted by *meson1*
> 
> I think Nvidia are trying to force the hands of the software houses. I think Nvidia originally intended for software to be written with explicit SLI. I think almost no-one did except benchmark designers. Most publishers relied on Nvidia's implicit SLI to do the leg work for them, only tweaking titles where necessary so they didn't crash or have issues. But they often didn't go as far as optimising games for SLI.
> 
> My understanding is that Nvidia are taking a gamble: that by withdrawing implicit SLI for DX12 onwards, they hope they can force publishers to write for explicit SLI and thereby dramatically improve the scaling that can be achieved with SLI. To help them, they have reduced formal SLI support to a maximum of 2-way, so that the number of combinations of GPU's to write and test for are reduced. If it works, and software IS written and optimized for SLI, and if scaling IS much improved, I think Nvidia hope that many more buyers may actually be persuaded to buy extra cards for two way SLI configurations.


Marketing: the disabler of computing advancement.


----------



## Lass3

Quote:


> Originally Posted by *meson1*
> 
> I think Nvidia are trying to force the hands of the software houses. I think Nvidia originally intended for software to be written with explicit SLI. I think almost no-one did except benchmark designers. Most publishers relied on Nvidia's implicit SLI to do the leg work for them, only tweaking titles where necessary so they didn't crash or have issues. But they often didn't go as far as optimising games for SLI.
> 
> My understanding is that Nvidia are taking a gamble: that by withdrawing implicit SLI for DX12 onwards, they hope they can force publishers to write for explicit SLI and thereby dramatically improve the scaling that can be achieved with SLI. To help them, they have reduced formal SLI support to a maximum of 2-way, so that the number of combinations of GPU's to write and test for are reduced. If it works, and software IS written and optimized for SLI, and if scaling IS much improved, I think Nvidia hope that many more buyers may actually be persuaded to buy extra cards for two way SLI configurations.


3 and 4-way SLI was never good outside of benchmarks. In many games, performance went down with more than 2 cards, or the gain was incredibly small. Not worth the money at all.

Personally I'm never going SLI or CF again, too many issues.
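Lass3's diminishing-returns point can be framed with a toy scaling model: each extra GPU contributes less than the previous one because of sync and driver overhead. The per-card efficiency factors below are made up for illustration, not measured numbers.

```python
# Toy multi-GPU scaling model: each added card contributes a smaller
# fraction of one card's performance (hypothetical efficiency factors).
def effective_gpus(n_cards, per_card_efficiency=(1.0, 0.85, 0.45, 0.25)):
    """Sum the (illustrative) contribution of each card."""
    return sum(per_card_efficiency[i] for i in range(n_cards))

for n in range(1, 5):
    print(f"{n} cards -> {effective_gpus(n):.2f}x of one card")
```

With factors like these, the 3rd and 4th cards together add barely more than half a card, which is roughly the complaint above.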


----------



## dante`afk

I've been running SLI since the GTX 580 and never had issues. Never.


----------



## Lass3

Quote:


> Originally Posted by *dante`afk*
> 
> I've been running SLI since the GTX 580 and never had issues. Never.


Oh well, there have been a lot of issues with SLI/CF, even on AAA releases, especially if you play games on release day. I can mention a few; the reason I remember these in particular? They're the reason I stopped using SLI.

Far Cry 3 did not support multi GPU and crashed with SLI. The Steam forums were glowing for weeks if not months, workarounds did not work for most people, and when they did, GPU usage was still low and jumpy. Many users simply disabled SLI/CF and played the game on one card. I did just that, and I completed the game before a fix was out. Yay.

Far Cry 4, almost the exact same thing: bad scaling and a shadow bug with multi GPU. The workaround was to change the SLI bits/profile, but it was still bugged in many places and GPU usage was all over the place. Felt jittery. Again I disabled SLI, and then sold the 2nd card. In between this and Far Cry 3, I had numerous other issues. Some games worked "fine" with other SLI profiles, some did not. I was tired of tinkering around just because I wanted to play games on release and not weeks or months later.

It's funny when some people claim SLI/CF simply works every single time, yet game forums are spammed on pretty much every release by multi GPU users having issues.

Today I prefer a single powerful GPU. Never any issues, it just works. Low frametimes and smooth gameplay.

When I see my friends' CF/SLI setups I find them much less smooth, especially when they use low/mid-end cards; the minimum fps spikes are horrible. Something you learn to live with, I guess. It's almost like going from a 144 Hz monitor back to 60 Hz. It's just not as smooth. After some time you forget about it and think it's fine..

If DX12/Vulkan improves multi GPU a lot, I might try it again at some point. Right now, no thanks..


----------



## dante`afk

I don't doubt that people have issues, however I never had issues. Maybe I'm just lucky, maybe it's just maybelline.
Then again, you need SLI for The Division or Tomb Raider at 1440p and max settings.


----------



## stefxyz

If I squeeze 25% vs my 1080 at 2100 MHz under water I will be super happy with my Titan purchase. 20% would still justify it; at 15%, slight disappointment would kick in...
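For anyone framing expectations the same way, the percentage uplift is just a ratio of frame rates. A minimal sketch, with placeholder fps numbers rather than real benchmark results:

```python
# Turn two fps readings into the percentage uplift being discussed.
# The fps values below are placeholders, not actual benchmarks.
def uplift_percent(new_fps, old_fps):
    """Relative performance gain of new_fps over old_fps, in percent."""
    return (new_fps / old_fps - 1) * 100

# e.g. 90 fps on the Titan vs 72 fps on an overclocked 1080 -> 25%
print(f"{uplift_percent(90, 72):.0f}%")
```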


----------



## ratzofftoya

So, just to make sure I'm not crazy, two of these will certainly outperform three 980 Tis, right?

Also, getting three of these is not advisable, right?


----------



## mbze430

I asked this in the official Titan X thread, but I will ask here as well. From some of the initial "reports" on the Pascal version, it sounded like the Titan X Pascal is an exclusive and there won't be any 3rd party partner cards? Is that true? Or might there be cards coming from partners in the next 6 months?


----------



## meson1

Quote:


> Originally Posted by *mbze430*
> 
> I asked this in the Official Titan X thread, but I will ask here as well. From some of the initial "reports" of the Pascal version, it sounded like the Titan X Pascal is an exclusive and there won't be any 3rd party partners cards? Is that true? Or there might be cards coming from partners in the next 6 months?


Nvidia impose restrictions on the add-in-board partners for cards carrying the Titan brand. They are always reference PCBs with reference BIOSes and reference air coolers. The only exceptions made are those allowing for water-cooling solutions.

Gigabyte sort of got round the restrictions with a couple of products by selling the reference Titan bundled with their Windforce cooler for the end user to fit.

For Titan X Pascal though, Nvidia are selling it exclusively directly; not via the AIB partners. It is unknown if this is some kind of timed exclusive or not.


----------



## stefxyz

It's basically typical monopolistic behaviour. They don't face any competition in the high end, so they don't need board partners. Now they can pocket the board partner margin, especially in the high end where no extra marketing is needed. People who shop for Titans inform themselves and find out.


----------



## mbze430

Right
Quote:


> Originally Posted by *meson1*
> 
> For Titan X Pascal though, Nvidia are selling it exclusively directly; not via the AIB partners. It is unknown if this is some kind of timed exclusive or not.


Exactly... I read another news bit that it might be a timed exclusive... but I only ask because of the water block. If all previous Titan series cards were reference designs even from partners, then I won't worry. I am planning to get a water block for this Titan X Pascal.

Anyone have experience as to what time they go on sale? I fear the limited quantity might put me at a disadvantage getting one on August 2nd, since I am in the Pacific time zone.


----------



## Steven185

Quote:


> Originally Posted by *stefxyz*
> 
> If I squeeze 25% vs my 1080 at 2100 MHz under water I will be super happy with my Titan purchase. 20% would still justify it; at 15%, slight disappointment would kick in...


You do know that you can also overclock your Titan, and given its thermals and power setup the Titan may be able to sustain high clocks better (during gaming). So even against your watercooled GTX 1080 you may be looking at a 30%+ difference in performance (with Titan X). It's all up to Nvidia and how they implement their adaptive overclocking profile on Titan X.


----------



## stefxyz

I hope you are right. I just try to manage my expectations so I don't get disappointed







The more power the better!


----------



## GunnzAkimbo

renamed.


----------



## dante`afk

do you think there will be any benchmarks before the release?


----------



## meson1

Quote:


> Originally Posted by *dante`afk*
> 
> do you think there will be any benchmarks before the release?


Long answer: There's an NDA in place. It lifts on Tuesday August 2nd; the day of release.

Short answer: No.


----------



## FattysGoneWild

Quote:


> Originally Posted by *ratzofftoya*
> 
> So, just to make sure I'm not crazy, two of these will certainly outperform three 980 Tis, right?
> 
> Also, getting three of these is not advisable, right?


4 or bust. Don't be a cheapskate.


----------



## Gary2015

Quote:


> Originally Posted by *meson1*
> 
> Long answer: There's an NDA in place. It lifts on Tuesday August 2nd; the day of release.
> 
> Short answer: No.


What time is the release?


----------



## Metros

What time do we get the reviews for the Titan X?


----------



## AdamK47

Quote:


> Originally Posted by *Gary2015*
> 
> What time is the release?


That's what I want to know.


----------



## st0necold

Well, considering no one is playing The Division, I'd hold off on picking anything up to cater to it.

Division sucks.


----------



## KickAssCop

Where dat review?


----------



## loki993

Quote:


> Originally Posted by *st0necold*
> 
> Well considering no one is playing the division i'd hold off on picking anything up to cater to it.
> 
> Division sucks.


The game itself doesn't suck... the developers suck; they majorly screwed it up and never properly dealt with the hackers.


----------



## newls1

Waiting for reviews too


----------



## rodpp

http://videocardz.com/62738/nvidia-titan-x-pascal-3dmark-performance


----------



## vmanuelgm

First Benchs:

http://www.gamestar.de/hardware/grafikkarten/nvidia-titan-x/news-artikel/nvidia_titan_x,1010,3276621.html


----------



## Gary2015

On sale NOW!!!


----------



## Evo X

The official page won't let people add to cart for some reason. Use this link.

http://www.geforce.com/hardware/10series/geforce-store

Limit 2 per customer. Just ordered one.









Anyone else?


----------



## Gary2015

Quote:


> Originally Posted by *Evo X*
> 
> The official page won't let people add to cart for some reason. Use this link.
> 
> http://www.geforce.com/hardware/10series/geforce-store
> 
> Limit 2 per customer. Just ordered one.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone else?


I just got two of them. The site works; I can still order. Kinda disappointed they aren't sold out already. Nvidia did a good job on the site... better than Apple. I was cheap though, only went for free shipping.


----------



## chrisk2305

just did it


----------



## Zurv

I just ordered 4







(overnight AM.. but i don't think it will ship till tomorrow)

now i can be number 1 in 3dmark time spy!

(shh.. don't get your panties all bunched up... yes.. still 2 way sli is maxed. I have 2 gaming systems)


----------



## Gary2015

There should be an official TITAN XP owners thread..


----------



## Jpmboy

I should have 2 here tomorrow.


decadent is an appropriate term.

http://www.pcworld.com/article/3102877/components-graphics/tested-nvidias-new-titan-x-is-absolutely-decadant-in-sli.html
Quote:


> Originally Posted by *Zurv*
> 
> I just ordered 4
> 
> 
> 
> 
> 
> 
> 
> (overnight AM.. but i don't think it will ship till tomorrow)
> now i can be number 1 in 3dmark time spy!
> (shh.. don't get your panties all bunched up... yes.. still 2 way sli is maxed. I have 2 gaming systems)


lol- you are so far off the reservation, but it is fun to watch.


----------



## Gary2015

I skimped on the shipping. Mine will probably arrive next week.


----------



## dante`afk

Thank You For Your Order

Overnight before 10am.


----------



## Gary2015

Anyone get the HB SLI bridges? The 2 slot is out of stock.


----------



## Jpmboy

Quote:


> Originally Posted by *Gary2015*
> 
> Anyone get the HB SLI bridges? The 2 slot is out of stock.


I have the 80mm bridge already. I don't think it will matter unless you are running above 4K60
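That 4K60 cutoff squares with back-of-the-envelope math: under AFR, the secondary card ships roughly every other displayed frame across the bridge. A rough sketch (the resolutions are examples, and real bridge capacities vary; none of these figures are official specs):

```python
# Rough bandwidth needed to move rendered frames over the SLI bridge
# under AFR, where the slave card transfers about half the displayed
# frames. Purely back-of-the-envelope; assumes uncompressed 32-bit pixels.
def frame_transfer_gbps(width, height, hz, bytes_per_pixel=4):
    frames_per_sec = hz / 2  # AFR: every other frame crosses the bridge
    return width * height * bytes_per_pixel * frames_per_sec / 1e9

print(f"4K60:  {frame_transfer_gbps(3840, 2160, 60):.2f} GB/s")
print(f"4K120: {frame_transfer_gbps(3840, 2160, 120):.2f} GB/s")
```

At 4K60 the transfer load is about 1 GB/s, which is around what a legacy bridge is commonly said to handle; doubling the refresh doubles the load, which is where the HB bridge is supposed to earn its keep.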


----------



## Gary2015

decadent is an appropriate term.

"When it comes to gaming, a glorious pair of SLI'd new-look Titan X cards are made for high rollers with bleeding-edge displays alone."

High rollers is the appropriate term...


----------



## Gary2015

Quote:


> Originally Posted by *Jpmboy*
> 
> I have the 80mm bridge already. I don;t think it will matter unless you are running above 4K60


It will matter because I'm going get me one of these...

https://pcmonitors.info/dell/dell-up3017q-4k-uhd-oled-monitor/


----------



## vmanuelgm

I ordered one from spanish web...

Lets see if it reaches home soon... The page showed 1-3 working days...

Hope EK have a new block asap...

If you have two old single bridges, it's the same as the HB bridge...


----------



## Gary2015

Quote:


> Originally Posted by *vmanuelgm*
> 
> I ordered one from spanish web...
> 
> Lets see if it reaches home soon... The page showed 1-3 working days...


Nvidia, must be disappointed...nearly an hour and still in stock...


----------



## vmanuelgm

Maybe they have produced a lot of them...

Its the only seller now, so...


----------



## Jpmboy

Quote:


> Originally Posted by *Gary2015*
> 
> It will matter because I'm going get me one of these...
> 
> https://pcmonitors.info/dell/dell-up3017q-4k-uhd-oled-monitor/


IDK bud, right now that resolution/refresh is not supported by the available transcoders, and I don't know of a cable that will carry 4K120.

also:

http://www.overclock.net/t/1603647/tpu-nvidia-geforce-gtx-1080-sli-is-the-sli-hb-bridge-essential/0_20


----------



## vmanuelgm

The HB bridge is more aesthetic, but 2 old single bridges do the same job...


----------



## NoDoz

Ordered one but won't be back in town until Monday so had to get the slow shipping. But happy regardless to get one.


----------



## vmanuelgm

I couldn't choose fast shipping on the Spanish web store... Don't really know when they will send the item...


----------



## Gary2015

Quote:


> Originally Posted by *Jpmboy*
> 
> IDK bud, right now that resolution/refresh is not supported by the available transcoders, and I don't know of a cable that will carry 4K120.
> 
> also:
> 
> http://www.overclock.net/t/1603647/tpu-nvidia-geforce-gtx-1080-sli-is-the-sli-hb-bridge-essential/0_20


Damn..was looking forward to that. Just have to wait for the ASUS 4k 144Hz coming out next year. Which 4k LCD you using man?


----------



## Jpmboy

Quote:


> Originally Posted by *Gary2015*
> 
> Damn..was looking forward to that. Just have to wait for the ASUS 4k 144Hz coming out next year. Which 4k LCD you using man?


I have a Samsung 4K60 and a Seiki 55'' 4K (for over 3? years now - piggy-backed one off a pallet of the things going to a special effects shop). Don't get me wrong, the Dell is beautiful... just make sure you do the research. I mean, even the Dell 5K monitor uses 2 DP cables and can't show a BIOS screen without pulling them or using a second SST (single stream transport, i.e. normal) monitor.
I also use the 1440p 144 Hz Swift - a very good monitor


----------



## Gary2015

Quote:


> Originally Posted by *Jpmboy*
> 
> I have a Samsung 4K60 and a Seiki 55'' 4K (for over 3? years now - piggy-backed one off a pallet of the things going to a special effects shop). Don't get me wrong, the Dell is beautiful... just make sure you do the research. I mean, even the Dell 5K monitor uses 2 DP cables and can't show a BIOS screen without pulling them or using a second SST (single stream transport, i.e. normal) monitor.
> I also use the 1440p 144 Hz Swift - a very good monitor


Thanks man, looking forward to your custom BIOSes!


----------



## Zurv

Quote:


> Originally Posted by *Jpmboy*
> 
> I should have 2 here tomorrow.
> 
> decadent is an appropriate term.
> 
> http://www.pcworld.com/article/3102877/components-graphics/tested-nvidias-new-titan-x-is-absolutely-decadant-in-sli.html
> lol- you are so far off the reservation, but it is fun to watch.




haha.. pot calling kettle








i sold all the 8 1080s (ok.. that was a crazy one







) so it is like i saved money going to titan X(p)









how is the chiller? the hum drive you crazy?
Did chilling "help" - sadly, as we talked about, even with the lower temp i didn't get anymore OC from it.


----------



## Metros

You should all do a video about the performance in different games and upload it to YouTube


----------



## GunnzAkimbo

Makes thread about elite class GPU... can never have one


----------



## Nizzen

Quote:


> Originally Posted by *vmanuelgm*
> 
> The HB bridge is more aesthetical, but 2 old single bridges do the same job...


No


----------



## combat fighter

Ordered mine about 30min after they went online.

Looking forward to it, should be a nice jump up from my 980ti (which is still a great card)

Can't wait to get her under water, chuffed the block can be sent out middle of this month!


----------



## Gary2015

Quote:


> Originally Posted by *combat fighter*
> 
> Ordered mine about 30min after they went online.
> 
> Looking forward to it, should be a nice jump up from my 980ti (which is still a great card)
> 
> Can't wait to get her under water, chuffed the block can be sent out middle of this month!


Are the water blocks for the Titan XP out?


----------



## Zurv

Quote:


> Originally Posted by *Gary2015*
> 
> Are the water blocks for the Titan XP out?


i talked to EK and they said they are making them. ETA? no info. blah


----------



## mbze430

I ordered just 1 today; going to gauge whether I'll need to SLI or use this to tide me over to Volta



Besides EK, who else is making waterblocks?

and are the EK blocks going to block off the NVLink??? I read the current blocks for the 1080 are blocking the NVLink


----------



## Zurv

Quote:


> Originally Posted by *mbze430*
> 
> I ordered just 1 today; going to gauge whether I'll need to SLI or use this to tide me over to Volta
> 
> 
> 
> Besides EK, who else is making waterblocks?
> 
> and are the EK blocks going to block off the NVLink??? I read the current blocks for the 1080 are blocking the NVLink


EK said they would make their own bridges.. but I'm not holding my breath. That said, if you are at 4K or under, the LED hard bridges work the same as the new HB bridges. (Do stay away from the ribbon crap.. but that was the case before the 10x series)


----------



## mbze430

I hope Aqua computer makes a block.. I still have 2 of the Aqua Computer water blocks for the 980TI. Going to keep one 980TI as a dedicated PhysX. My build would be "off" if Aqua Computer doesn't make one lol

Will this be the "official" Titan X New/P/Pascal owner's thread or since the name is still Titan X, we all report to that thread?


----------



## mbze430

I don't know if anyone else has picked up on or posted this story

http://www.gamestar.de/hardware/grafikkarten/nvidia-titan-x/news-artikel/nvidia_titan_x,1010,3276621.html

Since I am not fluent in German... I used Google Translate. I noticed this paragraph:

"Depending on the title, at 4K the Titan X is a whopping 20 percent ahead of the Zotac GeForce GTX 1080 AMP Extreme, which thanks to its factory overclock is one of the fastest 1080 variants today. Under load the Titan X remains relatively quiet, but the GP102 chip heats up to 85 degrees, and the clock rate then levels off at about 1,600 MHz."

[Someone might want to read it in native German]

"....the GP102 chip heats up to 85 degrees, and the clock rate then levels off at about 1,600 MHz"

sounds like it needs some watercooling for sure!


----------



## Jpmboy

Quote:


> Originally Posted by *Zurv*
> 
> haha.. pot calling kettle
> 
> 
> 
> 
> 
> 
> 
> 
> i sold all the 8 1080s (ok.. that was a crazy one
> 
> 
> 
> 
> 
> 
> 
> ) so it is like i saved money going to titan X(p)
> 
> 
> 
> 
> 
> 
> 
> 
> how is the chiller? the hum drive you crazy?
> Did chilling "help" - sadly, as we talked about, even with the lower temp i didn't get anymore OC from it.


8 1080s... lol. The chiller works fine. I really only use it on occasion, for benchmarking etc. But also for some needed stress/limit testing. For those uses it is absolutely great!








Can't really run it 24/7... I'd say it's more than a "hum".
Quote:


> Originally Posted by *Zurv*
> 
> i talked to EK and they said they are making them. ETA? no info. blah


I heard 6 weeks. Ugh. I do have two EK uniblocks - might have to try those if the temp throttling is bad.


----------



## stefxyz

Guys are you sleeping?

Its written in this forum in the EK section:
Quote:


> Originally Posted by *andrejEKWB*
> 
> We have confirmed date for launch; *Tuesday, 16th of August*
> 
> The block will be available for pre-order tomorrow in our webshop!


----------



## Jpmboy

Quote:


> Originally Posted by *stefxyz*
> 
> Guys are you sleeping?
> 
> Its written in this forum in the EK section:


Great!!


----------



## Zurv

Jpmboy,

sadly the 1080 EK waterblock doesn't cool the RAM chip to the bottom left of the GPU (maybe more components too)


----------



## dpoverlord

I went in for 2 with the SLI bridge. I really wanted an EVGA variant. I am a bit wary of purchasing straight from Nvidia since there is no clear way of knowing the warranty terms.







I am going to sell my 1080 Founders Edition and hope that Nvidia fabbed the new Titan X the right way. I will almost feel bad if I paid $2,400 and it did what my last Titans, 980 Tis, and 1080 did..... sat around doing not much. LoL

JPMBoy, I think it's time I really dedicated some time to my X99 rig and got the max OC on the CPU like I did with my Xeon / previous platform, so I can really push this Titan X on air and showcase what it can do with you guys.

Anyone else using a 4k curved Sammy?

You guys pumped? Get pumped!


----------



## seross69

So jelly


----------



## Zurv

i hope EK fixed their stuff so normal HB bridges work with these new cards.


----------



## vmanuelgm

Hey guys, did you receive any trackings for your purchases???

If you receive mine, send it to Spain... xD


----------



## CallsignVega

NVIDIA just went out of stock. Hope to have mine tomorrow morning!


----------



## vmanuelgm

Still in stock in Spain... xD


----------



## Jpmboy

Quote:


> Originally Posted by *dpoverlord*
> 
> I went in for 2 with the SLI bridge. I really wanted an EVGA variety. I am a bit wary purchasing straight from Nvidia since there is no clear way of knowing the warranty.
> 
> 
> 
> 
> 
> 
> 
> I am going to sell my 1080 founders edition and hope that Nvidia fabbed the new Titan X the right way. *I will almost feel bad if I paid $2,400* and did what my last titans to 980tis to 1080 did..... *Sat around doing not much*. LoL
> 
> JPMBoy, I think its time I really dedicated some time to my X99 Rig, get the max OC on the CPU like I did with my xeon / previous platform, so I can really push this Titan X on air and showcase what it can do with you guys.
> 
> Anyone else using a 4k curved Sammy?
> 
> You guys pumped? Get pumped!


Almost?
My Titan XMs are direct from NVidia. Haven't had to use the RMA.
Still, all we'll need is a Pascal BIOS editor. Absent that, the entire Pascal line is "pedestrian".


----------



## Zurv

Quote:


> Originally Posted by *vmanuelgm*
> 
> Hey guys, did you receive any trackings for your purchases???
> 
> If you receive mine, send it to Spain... xD


Nor have I, but when I called Digital River (they run the store for NVidia) it sounded like they weren't shipping till tomorrow.


----------



## vmanuelgm

Ok Zurv.

You will probably receive yours earlier than me; mine will surely ship from Germany...


----------



## mlb426

Will EVGA stock these or is it only from Nvidia? Would like to step up from my 1080 as opposed to selling on ebay


----------



## CallsignVega

Quote:


> Originally Posted by *mlb426*
> 
> Will EVGA stock these or is it only from Nvidia? Would like to step up from my 1080 as opposed to selling on ebay


NVIDIA store only.


----------



## Zurv

weee... got shipping notice. I'll have all 4 cards tomorrow morning.

Looks like they worked out that I ordered 4 and tried to cancel it.. but it had shipped before they got to it.
ugh.. so much spam email from nvidia


----------



## Jpmboy

Quote:


> Originally Posted by *Zurv*
> 
> weee... got shipping notice. I'll have all 4 cards tomorrow morning.
> 
> looks like they worked out that i order 4 and tried to cancel it.. but it was shipped out before they got it.
> ugh.. so much spam email from nvidia


yeah - I got a shipping notice also. Delivery tomorrow. STD overnight in my area with fedex is always late in the day.


----------



## NoDoz

I ordered mine pretty quick today...didnt get a shipping notice yet though.


----------



## newls1

I want you guys's money


----------



## Jpmboy

Quote:


> Originally Posted by *mlb426*
> 
> Will EVGA stock these or is it only from Nvidia? Would like to step up from my 1080 as opposed to selling on ebay


It did take a little while for resellers to get the Titan XM.


----------



## Mad Pistol

We need to come up with a universal naming scheme here.

We are all calling the new Titan "Titan XP", so that's settled.

I propose that we call the original Titan X "Titan XO" for original generation.

Any takers?


----------



## KickAssCop

http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Titan-X-Pascal-12GB-Graphics-Card-Review/Sound-Testing-Pricing-and-Clo


----------



## Evo X

Hardware Canucks posted their review as well

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/73148-nvidia-titan-x-12gb-performance-review.html

Holy crap, reading some of those conclusions is insane. This card is actually CPU capped at 2560x1440 max settings in some games and needs 4K to stretch its legs. It overclocks like a beast, provides a large jump over the already super fast GTX 1080, and delivers nearly double the performance of the Maxwell Titan X!

I love how absurd this card is and can't wait to play around with it.


----------



## ChevChelios

Real flagship behemoth


----------



## Trunkey

So what size PSU are you guys with two of these things running then?


----------



## Charcharo

Quote:


> Originally Posted by *Mad Pistol*
> 
> We need to come up with a universal naming scheme here.
> 
> We are all calling the new Titan "Titan XP", so that's settled.
> 
> I propose that we call the original Titan X "Titan XO" for original generation.
> 
> Any takers?


Call it Titan XM.
Might tingle some tankers as well.


----------



## meson1

Quote:


> Originally Posted by *Mad Pistol*
> 
> We need to come up with a universal naming scheme here.
> 
> We are all calling the new Titan "Titan XP", so that's settled.
> 
> I propose that we call the original Titan X "Titan XO" for original generation.
> 
> Any takers?


Or Titan XM with the M for Maxwell.


----------



## PatrickCrowely

WOW! Better than I expected, but man, I'd hate to spring for Titans again only to have the Ti variant be just as good


----------



## Bloodymight

So Titan X is nvidia store exclusive?

What about the warranty if you plan on using another cooler on it?


----------



## Steven185

Quote:


> Originally Posted by *PatrickCrowely*
> 
> WOW! Better than I expected, but man I hate to spring for Titans again & the Ti variant be just as good


If it's any consolation to you, the 1080 Ti (whenever it may be released) will be equally expensive. OK, maybe not $1200, but certainly closer to $1000 than last year's $650... so it's not as if you're going to save a lot of money by waiting. Also, if AMD has nothing to counteract this kind of performance, you may have to wait until 2017 before the 1080 Ti is out...

But yeah, those super high end cards are always like that. Never the sensible thing to buy, yet people still do... I guess telling them to wait another year (or two on this occasion) for this kind of performance at sensible prices just doesn't cut it for them...


----------



## dante`afk

just wait for the titan black and then everyone will be pissed


----------



## Lass3

Quote:


> Originally Posted by *Steven185*
> 
> If it's any consolation to you, 1080 ti (whenever that may be released) would be equally expensive. OK maybe not $1200, but certainly closer to $1000 than last year's $650 ... so it's not as if you're going to save a lot of money by waiting. Also if AMD has nothing to counteract this kind of performance you may have to wait for 2017 before 1080 ti is out...
> 
> But, yeah, those super high end cards are always like that. Never the sensible thing to buy, yet people still do... I guess telling them to wait for another year (or two in this occasion) for this kind of performance in sensible prices just doesn't cut it for them...


What else does your crystal ball tell you?

Can you see when Vega 10 is being released?


----------



## CallsignVega

Time to find out what these can really do.


----------



## Steven185

Quote:


> Originally Posted by *Lass3*
> 
> What else does your crystal ball tell you?
> 
> Can you see when Vega 10 is being released?


No crystal ball needed. The GTX 1080 is $650 to $700 while the Titan X is $1200. Predicting a ~$900 Ti is hardly an insane proposition.
As for Vega, given how bad the RX 4xx cards were in perf/W, I really doubt AMD has anything up their sleeve. We could go a year straight, maybe more, with the Titan/1080 Ti as the top performers...


----------



## Jpmboy

Quote:


> Originally Posted by *Mad Pistol*
> 
> We need to come up with a universal naming scheme here.
> 
> We are all calling the new Titan "Titan XP", so that's settled.
> 
> I propose that we call the original Titan X "Titan XO" for orignal generation.
> 
> Any takers?


XP and XM
Quote:


> Originally Posted by *CallsignVega*
> 
> Time to find out what these can really do.


lol - what did you do.. get same day shipping?


----------



## CallsignVega

Haha AM overnight.

Hmm, MSI Afterburner cannot read the card's voltages. Anyone have any ideas?

My cards are boosting to 1853 MHz out of the box.


----------



## Jpmboy

Quote:


> Originally Posted by *CallsignVega*
> 
> Haha AM overnight.
> 
> Hmm, MSI Afterburner cannot read the card's voltages. Anyone have any ideas?
> 
> My cards are boosting to 1853 MHz out of the box.


On your Z170 mobo.. are they at x8? Do you have the HB bridge? If yes, could you run the attached concurrent bandwidth test and post the resulting command window?
2 TXM, R5E-10/6950X


Unzip and open the folder; on Windows 10, File > Open command prompt as admin. Type _concbandwidthtest 0,1_

ConcBandwidth.zip 5k .zip file


this is not clock dependent (well, except for the PEG/DMI). so straight stock is all that's needed.

4.3beta? Try GPUZ


----------



## PatrickCrowely

Quote:


> Originally Posted by *CallsignVega*
> 
> Haha AM overnight.
> 
> Hmm, MSI Afterburner cannot read the cards voltages. Anyone have any ideas?
> 
> My cards are boosting to 1853 MHz out of the box.


Very nice @1853. These cards may be what the Titans should've always been. Seems to have a decent performance jump over 1080's so far.


----------



## Kyouki

Slept on it and looks like they are back in stock so I placed my order for one Titan for my new build!


----------



## CallsignVega

OK found the wall on mine at stock voltage and stock air cooler, 2038 MHz core.

We could be looking at 2100-2200 with upgraded voltage and cooler.

JPM will do, gotta make some breakfast first.


----------



## Jpmboy

Quote:


> Originally Posted by *CallsignVega*
> 
> OK found the wall on mine at stock voltage and stock air cooler, 2038 MHz core.
> 
> We could be looking at 2100-2200 with upgraded voltage and cooler.
> 
> JPM will do, gotta make some breakfast first.


Those are incredible clocks! Nice. I'm not so sure about the voltage part, but with Pascal... every 10 C is worth ~25-50 MHz, down to ~10 C. The next thermal steps are cryogenic.
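That rule of thumb (~25-50 MHz per 10 C, flattening out around 10 C) can be turned into a quick headroom estimator. The 35 MHz midpoint slope and the example temperatures below are assumptions for illustration, not measurements:

```python
# Rough GPU Boost headroom estimator from the rule of thumb above:
# ~25-50 MHz gained per 10 C drop, with no further gain below ~10 C.
def estimated_clock(base_clock_mhz, base_temp_c, new_temp_c, mhz_per_10c=35):
    cooled = max(new_temp_c, 10)              # rule stops paying off below ~10 C
    delta = max(base_temp_c - cooled, 0)      # degrees of useful cooling
    return base_clock_mhz + delta / 10 * mhz_per_10c

# e.g. a card that walls at 2038 MHz at 84 C on air, dropped to 40 C under water:
print(f"{estimated_clock(2038, 84, 40):.0f} MHz")
```

With those assumed numbers a 44 C drop buys roughly 150 MHz, which lines up with the 2100-2200 MHz under-water guesses in this thread.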
Quote:


> Originally Posted by *PatrickCrowely*
> 
> Very nice @1853. These cards may be what the Titans should've always been. Seems to have a decent performance jump over 1080's so far.


lol - which Titan are you talking about? The OG (TXK) and the TXM are fantastic cards.


----------



## TronZy

Quote:


> Originally Posted by *CallsignVega*
> 
> OK found the wall on mine at stock voltage and stock air cooler, 2038 MHz core.
> 
> We could be looking at 2100-2200 with upgraded voltage and cooler.
> 
> JPM will do, gotta make some breakfast first.


Wow! Run some benchmarks


----------



## Gary2015

Quote:


> Originally Posted by *CallsignVega*
> 
> OK found the wall on mine at stock voltage and stock air cooler, 2038 MHz core.
> 
> We could be looking at 2100-2200 with upgraded voltage and cooler.
> 
> JPM will do, gotta make some breakfast first.


WOWOWOOWOWOWWOW! Under water they could go to 2200+. That's 50%+ faster than a GTX 1080!!!


----------



## Zurv

wooo.. time to break some stuff!


----------



## Gary2015

Quote:


> Originally Posted by *Zurv*
> 
> wooo.. time to break some stuff!


Damn, should have got same day shipping!! Let you guys take the plunge first!!


----------



## Jpmboy

Ek fullcover block pre order:
https://www.ekwb.com/shop/ek-fc-titan-x-pascal


----------



## Zurv

Quote:


> Originally Posted by *Jpmboy*
> 
> Ek fullcover block pre order:
> https://www.ekwb.com/shop/ek-fc-titan-x-pascal


ordered!

nickel manz! https://www.ekwb.com/shop/ek-fc-titan-x-pascal-nickel

i don't see a warning about SLI HB bridges.. that is a good thing (unless i'm blind)


----------



## Jpmboy

Quote:


> Originally Posted by *Zurv*
> 
> ordered!
> 
> nickel manz! https://www.ekwb.com/shop/ek-fc-titan-x-pascal-nickel
> 
> i don't see a warning about SLI HB bridges.. that is a good thing (unless i'm blind)


I didn't either.. gotta ask. I have Ni and Cu on other rigs ATM .. I tend to like the look of Cu with a black backplate. lol - no colored coolants.








oh yeah, the back side of the PCB on the 1080 is very fragile, I'm assuming the TXP is the same. Get a backplate if the EK block is not compatible with the stock BP.


----------



## CallsignVega

First run of Pascal Titan

Callsign_Vega --- 6700K / 4.8 GHz --- Titan X (Pascal), 2050 / 5693 --- 161.7 --- 6764



Even with 4.8 GHz 6700K GPU was under max quite a bit.


----------



## Gary2015

Quote:


> Originally Posted by *Jpmboy*
> 
> Ek fullcover block pre order:
> https://www.ekwb.com/shop/ek-fc-titan-x-pascal


Got 2x Acetal/Nickel. This isn't CSQ right?


----------



## Jpmboy

Quote:


> Originally Posted by *Gary2015*
> 
> Got 2x Acetal/Nickel. This isn't CSQ right?


I pulled my order until they have backplates and answer the bridge question.


----------



## Gary2015

Quote:


> Originally Posted by *Jpmboy*
> 
> I pulled my order until they have backplates and answer the bridge question.


There is no warning on the bridge so I assume that's sorted. I ran dual TITAN XM SLI on Aquacomputer copper blocks without backplates and they were fine.


----------



## Metros

Quote:


> Originally Posted by *CallsignVega*
> 
> First run of Pascal Titan
> 
> Callsign_Vega --- 6700K / 4.8 GHz --- Titan X (Pascal), 2050 / 5693 --- 161.7 --- 6764
> 
> 
> 
> Even with 4.8 GHz 6700K GPU was under max quite a bit.


What is the clock speed after a certain amount of time, with the default fan curve, not at 100 percent, as no one would want to use that for gaming


----------



## Snaporz

Quote:


> Originally Posted by *Jpmboy*
> 
> I pulled my order until they have backplates and answer the bridge question.


I placed an order earlier for
EK-FC Titan X Pascal - SKU 3831109831663 -$ 91.30

It was the only option at the time and now I need to change. How did you cancel your order? I am unable to see cancel.


----------



## lyang238

Damn I knew I should have gotten at least overnight shipping since I STILL haven't gotten a shipping confirmation. I ordered within 5 minutes as well!


----------



## Gary2015

Quote:


> Originally Posted by *Snaporz*
> 
> I placed an order earlier for
> EK-FC Titan X Pascal - SKU 3831109831663 -$ 91.30
> 
> It was the only option at the time and now I need to change. How did you cancel your order? I am unable to see cancel.


You need to fill in the contact support form.


----------



## Jpmboy

Quote:


> Originally Posted by *Gary2015*
> 
> There is no warning on the bridge so I assume that's sorted. I ran previous TITAN XM SLI dual on Aquacomputer copper blocks with out backplates and they were fine.


I'm running 2 TXMs with plates right now, and having put a uniblock on a 1080FE (sold it)... like I said, *if* the TXP has the same fragile resistors on the backside of the PCB, I'll use backplates. I swap out components too often not to.
The 1080 PCB is the most fragile I've seen from among 20+ GPU models over the past 3 years. One OCN member broke a resistor off the back of his 1080 just handling it (supposedly). Anyway - smoke 'em if you got 'em.









Spoiler: Warning: Spoiler!








Quote:


> Originally Posted by *CallsignVega*
> 
> First run of Pascal Titan
> 
> Callsign_Vega --- 6700K / 4.8 GHz --- Titan X (Pascal), 2050 / 5693 --- 161.7 --- 6764
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Even with 4.8 GHz 6700K GPU was under max quite a bit.


lol - that's ridiculous.


----------



## Diverge

Just ordered mine w/ overnight shipping... after telling myself I didn't need the new one. I guess I should look into unloading my original Titan, and Titan X maxwell to offset the hurt on my wallet.


----------



## Gary2015

Quote:


> Originally Posted by *lyang238*
> 
> Damn I knew I should have gotten at least overnight shipping since I STILL haven't gotten a shipping confirmation. I ordered within 5 minutes as well!


I ordered at 6:02am and chose FREE shipping. Still no ship confirmation. That's ok though, I'll let the first guys test theirs out first. Also just got some EK blocks.


----------



## Gary2015

Quote:


> Originally Posted by *Jpmboy*
> 
> I'm running 2 TXMs with plates right now, and having put a uniblock on a 1080FE (sold it)... like I said, *if* the TXP has the same fragile resistors on the backside of the PCB, I'll use backplates. I swap out components too often not to.
> The 1080 PCB is the most fragile I've seen from among 20+ GPU models over the past 3 years. One OCN member, broke a resistor off the back of his 1080 just handling it (supposedly). anyway - smoke 'em if you got 'em.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


To be safe, I'll wait for the backplates before going to water. Support said the plates will be available later this week.


----------



## Gary2015

Quote:


> Originally Posted by *Diverge*
> 
> Just ordered mine w/ overnight shipping... after telling myself I didn't need the new one. I guess I should look into unloading my original Titan, and Titan X maxwell to offset the hurt on my wallet.


The new stock should deal the eBay scalpers a heavy blow. Some guys asking for $1999.


----------



## NoDoz

Quote:


> Originally Posted by *Zurv*
> 
> wooo.. time to break some stuff!


Nice!!


----------



## Gary2015

Quote:


> Originally Posted by *NoDoz*
> 
> Nice!!


Is this now the official Titan XP Owner's thread?


----------



## Snaporz

Quote:


> Originally Posted by *Gary2015*
> 
> You need to fill in the contact support form.


Did that as only option I saw I could take. Thanks man. Too bad it wasn't the all nickel one I was getting for only $91. Lol


----------



## Jpmboy

Quote:


> Originally Posted by *Diverge*
> 
> Just ordered mine w/ overnight shipping... after telling myself I didn't need the new one. I guess I should look into unloading my original Titan, and Titan X maxwell to offset the hurt on my wallet.


Your OG Titan - with its incredible double-precision FLOPS - will sell quickly to the right audience (deep learning, computational guys, etc.). With NV crippling DP floating point ever since, the Titan series is no longer the poor man's Tesla.








Quote:


> Originally Posted by *Gary2015*
> 
> To be safe, Ill wait for the backplates before going to water. Support said the plates will be available later this week.


smart move IMO.


----------



## DADDYDC650

Quote:


> Originally Posted by *CallsignVega*
> 
> OK found the wall on mine at stock voltage and stock air cooler, 2038 MHz core.
> 
> We could be looking at 2100-2200 with upgraded voltage and cooler.
> 
> JPM will do, gotta make some breakfast first.


Very nice and congrats! Think I'll leave mine at 2Ghz for awhile...


----------



## Metros

Is anyone going to keep using the stock cooler?


----------



## lombardsoup

Quote:


> Originally Posted by *Metros*
> 
> Is anyone going to keep using the stock cooler?


For use as a doorstop, yes.


----------



## Snaporz

Haha. It would've been too good to be true if waterblocks were already available to be shipped. That 2 week+ wait is a major buzzkill.


----------



## Metros

Anyone else think it is insane that in less than a year, we will have Volta that will easily beat the Titan X, we seem to be evolving quickly at the moment


----------



## Fiercy

Quote:


> Originally Posted by *Metros*
> 
> Anyone else think it is insane that in less than a year, we will have Volta that will easily beat the Titan X, we seem to be evolving quickly at the moment


Dude where have you been its been like this for ages...


----------



## techguymaxc

Quote:


> Originally Posted by *xarot*
> 
> With EK, always bare copper. So, acetal.


Why do you say that? I've had fantastic results with the acetal + nickel variety. 37C load on my GTX 1070 @ 2126MHz in Furmark with ambient temps in the low-mid 20s (poor ventilation in room & no AC). You getting better than that?


----------



## Metros

Quote:


> Originally Posted by *Fiercy*
> 
> Dude where have you been its been like this for ages...


No, the original Titan X lasted a year and three months. This Titan X will only last 9 months


----------



## Gary2015

Quote:


> Originally Posted by *Snaporz*
> 
> Did that as only option I saw I could take. Thanks man. Too bad it wasn't the all nickel one I was getting for only $91. Lol


I'm sure you can also order the nickel one, and since it's a preorder, they have plenty of time to cancel your first one.


----------



## CallsignVega

Holy moley, testing individually, and one of my cards will do 2101 MHz on air.


----------



## Fiercy

Quote:


> Originally Posted by *Metros*
> 
> No, the original Titan X lasted a year and three months. This Titan X will only last 9 months


It will last as long as you need it to







Ain't a lot of games coming that require that kind of performance.


----------



## Snaporz

Quote:


> Originally Posted by *Gary2015*
> 
> Im sure you can also order the nickel one and since its a preorder, they have plenty of time to cancel your first one.


They are quick today, already processing it. I just meant I apparently pre-ordered when it was $30 cheaper. They probably would've force cancelled me anyway due to that discrepancy.


----------



## Metros

Quote:


> Originally Posted by *Fiercy*
> 
> It will last as long as you need it to
> 
> 
> 
> 
> 
> 
> 
> Ain't a lot of games coming that require that kind of performance.


No, that is another reason why I do not want to upgrade; it only seems to be a problem at 4K. It seems fine at 3440x1440, however it is still tempting


----------



## Jpmboy

Quote:


> Originally Posted by *Metros*
> 
> No, the original Titan X lasted a year and three months. This Titan X will only last 9 months


here come the sour grape trolls.

wait for volta.. oops, then you'll have to wait for the next generation... blah blah.


----------



## Gary2015

Quote:


> Originally Posted by *Metros*
> 
> Anyone else think it is insane that in less than a year, we will have Volta that will easily beat the Titan X, we seem to be evolving quickly at the moment


That's the nature of the beast! The Power of Now my friend. Don't look back nor forwards. Enjoy the now because you will have the most powerful GPU known to man!


----------



## Metros

Quote:


> Originally Posted by *CallsignVega*
> 
> Holy moley, testing individual and one of my cards will do 2101 MHz on air.


You using the default fan speed, also what is the clock speed over time


----------



## carlhil2

Quote:


> Originally Posted by *Metros*
> 
> No, the original Titan X lasted a year and three months. This Titan X will only last 9 months


You are speculating though....


----------



## NoDoz

Quote:


> Originally Posted by *lyang238*
> 
> Damn I knew I should have gotten at least overnight shipping since I STILL haven't gotten a shipping confirmation. I ordered within 5 minutes as well!


Yeah I'm in the same boat.


----------



## Metros

Quote:


> Originally Posted by *carlhil2*
> 
> You are speculating though....


No, Volta was moved forward to next year

https://www.techpowerup.com/224413/nvidia-accelerates-volta-to-may-2017


----------



## Gary2015

Quote:


> Originally Posted by *Jpmboy*
> 
> here come the sour grape trolls.
> 
> wait for volta.. oops, then you'll have to wait for the next generation... blah blah.


Too true. We are on the bleeding edge. Wait for Volta, then wait another year for the next one. The guys here who throw down $2500+ know what the game is about!


----------



## NoDoz

Quote:


> Originally Posted by *CallsignVega*
> 
> Holy moley, testing individual and one of my cards will do 2101 MHz on air.


Dayum. That's great.


----------



## Gary2015

Quote:


> Originally Posted by *CallsignVega*
> 
> Holy moley, testing individual and one of my cards will do 2101 MHz on air.


***??!?! These things could do 2300+ on water then!!!!


----------



## Zurv

god damn nvidia! they always break stuff when putting artificial limits on their hardware.

I can't even do the 4way SLI for these new cards.. even on benchmarking....


----------



## techguymaxc

Quote:


> Originally Posted by *Metros*
> 
> No, Volta was moved forward to next year
> 
> https://www.techpowerup.com/224413/nvidia-accelerates-volta-to-may-2017


That's the architecture debut at an industry conference, not the date you'll be able to purchase one. And of course they're not going to release the next Titan as the first member of the product family, you'll see 1180s and 1170s first (or whatever they decide to call them).


----------



## carlhil2

Quote:


> Originally Posted by *Metros*
> 
> No, Volta was moved forward to next year
> 
> https://www.techpowerup.com/224413/nvidia-accelerates-volta-to-may-2017


Maybe, but, most who buy this GPU will be waiting on the Volta equivalent...


----------



## techguymaxc

Quote:


> Originally Posted by *Gary2015*
> 
> ***??!?! These things could do 2300+ on water then!!!!


Not without a serious power mod... I'll bet it would be sucking down close to 400W at that point, much of the additional amperage coming over the PCI-e slot too. We all know where that leads...
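A rough sanity check on that estimate, assuming dynamic power scales as P ~ f * V^2. The stock 250W / 2000MHz / 1.062V baseline and the 1.20V endpoint are illustrative guesses, not measurements:

```python
# Back-of-envelope dynamic power scaling, P ~ f * V^2.
# The stock figures (250 W TDP, 2000 MHz, 1.062 V) and the hypothetical
# 2300 MHz / 1.20 V operating point are assumptions for illustration only.

def scaled_power(p0, f0_mhz, v0, f1_mhz, v1):
    """Scale dynamic power from point (f0, v0) to point (f1, v1)."""
    return p0 * (f1_mhz / f0_mhz) * (v1 / v0) ** 2

p = scaled_power(250, 2000, 1.062, 2300, 1.20)
print(f"~{p:.0f} W estimated at 2300 MHz")
```

Even with these conservative guesses it lands well over 350W, so the ~400W figure is in the right ballpark.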


----------



## Gary2015

Quote:


> Originally Posted by *NoDoz*
> 
> Yeah I'm in the same boat.


I just called NVIDIA support. I'm on FREE shipping (bought mine at 6:02 am on day one) and they said they will be processing the orders within the next 24 hours, so I guess shipping in 48 hours. I wanted to cancel and reorder with overnight shipping, but considering I'm still waiting for the blocks, a couple more days won't hurt; plus I'll know how far these babies can go from the guys getting their cards first.


----------



## Gary2015

Quote:


> Originally Posted by *techguymaxc*
> 
> Not without a serious power mod... I'll bet it would be sucking down close to 400W at that point, much of the additional amperage coming over the PCI-e slot too. We all know where that leads...


Who is brave enough to test that?


----------



## NoDoz

Quote:


> Originally Posted by *Zurv*
> 
> god damn nvidia! they always break stuff when putting artificial limits on their hardware.
> 
> I can't even do the 4way SLI for these new cards.. even on benchmarking....


Do you need to request a key or something? I know going tri for 1080s you have to. I think anyway.


----------



## techguymaxc

Quote:


> Originally Posted by *Zurv*
> 
> god damn nvidia! they always break stuff when putting artificial limits on their hardware.
> 
> I can't even do the 4way SLI for these new cards.. even on benchmarking....


Yeah, but they kinda told you that already if you've been following the SLI debacle this generation. Did you really buy 4 intending to use them in a single machine for graphical workloads or are you putting them in different machines/have a different (compute) workload in mind?


----------



## CallsignVega

Quote:


> Originally Posted by *Zurv*
> 
> god damn nvidia! they always break stuff when putting artificial limits on their hardware.
> 
> I can't even do the 4way SLI for these new cards.. even on benchmarking....


That's a shame. One of the reasons I don't do 4-way anymore and just stick to two and the HB bridge.


----------



## techguymaxc

Quote:


> Originally Posted by *Gary2015*
> 
> Who is going brave enough to test that?


I may be able to justify buying one of these things, but I sure can't justify burning one up and having to pay for the second one out of pocket.


----------



## Zurv

Quote:


> Originally Posted by *NoDoz*
> 
> Do you need to request a key or something? I know going tri for 1080s you have to. I think anyway.


i have 4 way 1080 too. It will go 4 way SLI in the control panel, but will lock it to 2 way in anything but approved benchmarks.

i'm not planning on using 4 way. I was just doing this to benchmark and then move 2 to another PC.


----------



## DADDYDC650

Quote:


> Originally Posted by *CallsignVega*
> 
> Holy moley, testing individual and one of my cards will do 2101 MHz on air.


Will you be sending the "defective" card back for an exchange?


----------



## Gary2015

EK just put up the block prices by $10.


----------



## DADDYDC650

Quote:


> Originally Posted by *Metros*
> 
> No, Volta was moved forward to next year
> 
> https://www.techpowerup.com/224413/nvidia-accelerates-volta-to-may-2017


It's rumored that Santa Claus is real. Do you believe that as well?


----------



## CallsignVega

I'm being so irresponsible with this amount of performance.


----------



## Metros

Quote:


> Originally Posted by *Gary2015*
> 
> Too true. We are on the bleeding in edge. Wait for volta, then wait for next one year. The guys here who throw $2500+ know what the game is about!


Really, was there any need for that comment? I have spent a lot on this system, so do not tell me "$2500+ know what the game is about" - if you read my comment you might find out. Pascal was a stepping stone to Volta, which is why Volta has been moved forward; Pascal was never planned. Why would I want to spend $2500 for a new GPU when one comes out next year at $600 that will beat it in performance? Also, as stated by one of the members on here, we do not have any demanding games coming out this year (Battlefield 1 uses the same engine). Even if you run 4K, you do not need to get Titan X SLI; you are just wasting performance.

There are many multi-GPU setups that are cheaper while you get the same performance. Depending on the games you play, you may or may not like SLI; however, getting one Titan X would be good, as it can get almost 4K 60 FPS with one GPU.

It is your money; however, considering we have Volta next year (with HBM2) - the main generation that was planned, not the stepping stone - it just seems a waste to go SLI


----------



## lombardsoup

Quote:


> Originally Posted by *DADDYDC650*
> 
> It's rumored that Santa Claus is real. Do you believe that as well?


I believe you should send me the 'bad' cards that don't overclock well.


----------



## Metros

Quote:


> Originally Posted by *NoDoz*
> 
> Do you need to request a key or something? I know going tri for 1080s you have to. I think anyway.


There is no key, NVIDIA does not support it any more (completely) therefore you need to use Multi-Adapter (if the DirectX12 game supports it)


----------



## Gary2015

Quote:


> Originally Posted by *CallsignVega*
> 
> I'm being so irresponsible with this amount of performance.


Are you going try 4 way SLI? That would really be irresponsible.


----------



## Murlocke

Someone asked for heaven in the other thread.. here's a +150/+150 overclocked Titan X (Maxwell) versus my STOCK Titan X (Pascal) at 4K, max settings.


----------



## CallsignVega

Holy hell, my fast card's memory can also do 5900 MHz. +900 on the slider.


----------



## stefxyz

Metros, how's your waiting for Volta going? :D Consider waiting for death. Would safe you even more bucks.

Anyone from Germany got a shipping confirmation yet?


----------



## lyang238

Quote:


> Originally Posted by *stefxyz*
> 
> Metros hows your waiting for Volta going? D Consider waiting for death. Would safe you even more bucks.



I suppose he'll be safed in a year or so


----------



## NoDoz

Quote:


> Originally Posted by *CallsignVega*
> 
> Holy hell my fast cards memory can also do 5900 MHz.


Do some benching and post! Single card benching please!


----------



## Metros

Quote:


> Originally Posted by *stefxyz*
> 
> Metros hows your waiting for Volta going? D Consider waiting for death. Would safe you even more bucks.


I am fine at the moment; I have 3440x1440 at 100 FPS in nearly every game now. Did you buy two Titan X GPUs for gaming then? What a waste


----------



## CallsignVega

Quote:


> Originally Posted by *NoDoz*
> 
> Do some benching and post! Single card benching please!


That fast of memory is downclocking my core. I NEED MO POWA!


----------



## DADDYDC650

Quote:


> Originally Posted by *Murlocke*
> 
> Someone asked for heaven in the other thread.. here's a +150/+150 overclocked Titan X (Maxwell) versus my STOCK Titan X (Pascal) at 4K, max settings.


That's beastly right there! Can't wait!


----------



## Difunto

On stock for about 3 or 4 laps of Valley - not bad temps.

It started at 1896MHz, then the temps rose and the boost dropped.


----------



## Gary2015

Can we get an official Titan XP Owners Club already please?


----------



## Metros

Anyone got two GTX 980ti they could use to benchmark, along with the new Titan X, want to compare performance, thanks


----------



## The-Real-Link

Haven't ever dabbled with a water setup short of simple AIO so yeah, I'm a horrible schmoe leaving mine on air. But then again I just want 4K60 ultra. No need for higher hz than that for me at the moment so my needs aren't quite as "extreme" as some of you guys, haha.

But after seeing more benchmarks come out on Tom's, HWC, etc., this seems to blow the original Titan X out of the water. I think I'll be very happy.

Will do an unboxing and my own benchmark videos and such once I have time to spend with it after this weekend due to a lot of work coming up.


----------



## Murlocke

Heaven doesn't seem to care about overclocking, lol.

A whole +1 FPS average with a +200 core. 2013MHz, no throttling, max temp was 82C at 90% fan.


----------



## Diverge

Quote:


> Originally Posted by *The-Real-Link*
> 
> Haven't ever dabbled with a water setup short of simple AIO so yeah, I'm a horrible schmoe leaving mine on air. But then again I just want 4K60 ultra. No need for higher hz than that for me at the moment so my needs aren't quite as "extreme" as some of you guys, haha.
> 
> But after seeing more benchmarks come out on Tom's, HWC, etc., this seems to blow the original Titan X out of the water. I think I'll be very happy.
> 
> Will do an unboxing and my own benchmark videos and such once I have time to spend with it after this weekend due to a lot of work coming up.


IMO, there's nothing wrong with sticking to air cooling. I've water cooled a number of systems in the past, and personally done with it... too much maintenance for me. Now I prefer the highest performance parts in the smallest SFF cases. Can't wait for the Dan A4-SFX cases to start shipping at the end of the year


----------



## vmanuelgm

Quote:


> Originally Posted by *Murlocke*
> 
> Heaven doesn't seem to care about overclocking, lol.
> 
> A whole +1 FPS average with a +200 core. 2013MHz, no throttling, max temp was 82C at 90% fan.


Try more resolution...

I envy you guys who received the TitanX... I called digital river to know about my order and they dont know when they are shipping...

Jesús Juan from Nvidia took my money and said thanks, *******!!!


----------



## Woundingchaney

Quote:


> Originally Posted by *Murlocke*
> 
> Heaven doesn't seem to care about overclocking, lol.
> 
> A whole +1 FPS average with a +200 core. 2013MHz, no throttling, max temp was 82C at 90% fan.


How loud was the fan at 90%?


----------



## Jpmboy

Quote:


> Originally Posted by *techguymaxc*
> 
> Not without a serious power mod... I'll bet it would be sucking down close to 400W at that point, *much of the additional amperage coming over the PCI-e slot t*oo. We all know where that leads...


this is a TechPowerUp (or whoever it was) myth. 1st, the PCI-E slot power is spec'd at 75W... not limited to 75W, and a top HEDT MB will have no problem (there was a time when we could "soften" the ATX power connector tho). The MB OCP will kick in at >>75W. Second, if this is due to a vBIOS setting, it is correctable and is an error/bad practice in the bios, but it's still very unlikely that at the limit the card will pull preferentially from the lowest-capacity power source. Yeah, I know one reviewer "measured" this. Frankly, I don't believe the result.
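For what it's worth, here's the paper power budget being argued about as a quick sketch. The 75W and 150W figures are the PCI-e spec ratings, and the connector layout assumed is the TXP's 6-pin + 8-pin:

```python
# Nominal PCI-e power budget check for a card with one 6-pin and one
# 8-pin connector (the Titan X Pascal layout). The 75/150 W figures are
# the PCI-e spec ratings; as argued above, these are spec points, not
# hard limits, and boards can exceed the slot's 75 W before OCP trips.

SPEC_BUDGET_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def headroom(draw_w, connectors=("slot", "6-pin", "8-pin")):
    """Spec-rated budget minus actual draw (negative = over paper spec)."""
    budget = sum(SPEC_BUDGET_W[c] for c in connectors)
    return budget - draw_w

print(headroom(250))  # stock TDP: 50 W of nominal headroom
print(headroom(400))  # the modded scenario: 100 W over the paper spec
```

So a ~400W mod is ~100W past the paper budget; where those extra amps actually flow (slot vs. connectors) is exactly the point under dispute.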
Quote:


> Originally Posted by *CallsignVega*
> 
> That fast of memory is downclocking my core. I NEED MO POWA!


or causing error correction.
Quote:


> Originally Posted by *Gary2015*
> 
> Can we get an official Titan XP Owners Club already please?


You should start one.. or contact the OP of this thread to either Update post #1 or transfer the thread to you.


----------



## Gary2015

Quote:


> Originally Posted by *Diverge*
> 
> IMO, there's nothing wrong with sticking to air cooling. I've water cooled a number of systems in the past, and personally done with it... too much maintenance for me. Now I prefer the highest performance parts in the smallest SFF cases. Can't wait for the Dan A4-SFX cases to start shipping at the end of the year


Water cooling is a hassle, but I could never stand the noise of the fans, especially in 2-slot SLI where the first GPU runs hotter than the second due to lack of airflow, because I don't have a 40-lane CPU.


----------



## Gary2015

Quote:


> Originally Posted by *Jpmboy*
> 
> this is a techpower up (or whom ever it was) myth. 1st, the PCIE power is speced at 75W.. not limited to 75W and a top HEDT MB will have no problem (there was a time when we could "soften" the ATX power connector tho). Now the MB OCP will kick in.. at >>75W. Second, if this is due to a vBIOS setting, it is correctable and is an error/bad practice in the bios, but still very unlikely that at the limit the card will pull preferentially from the lowest cap power source. Yeah I know one reviewer "measured" this. Frankly, I don't believe the result.
> or causing error correction.
> You should start one.. or contact the OP of this thread to either Update post #1 or transfer the thread to you.


You start one JPM, you are the Guru around here, Im just a noob trying to learn from you guys.


----------



## Murlocke

Valley, +150/+150 on the Titan X Maxwell and +200/+500 on the Titan X Pascal. And yes, it seems very stable. I have not attempted to push higher, still no throttling.


----------



## Jpmboy

Quote:


> Originally Posted by *Gary2015*
> 
> You start one JPM, you are the Guru around here, Im just a noob trying to learn from you guys.


I already do too many threads... and I'm no guru.


----------



## Gary2015

Quote:


> Originally Posted by *Murlocke*
> 
> Valley, +150/+150 on the Titan X Maxwell and +200/+500 on the Titan X Pascal. And yes, it seems very stable. I have not attempted to push higher, still no throttling.


Wow, it seems this card lives up to its billing. Can you test GTA V at 4k at max settings?


----------



## Trunkey

What's the minimum psu to run a pair of these beasts then?
I imagine it would soundly thump my poor old R9 290x in everything, imagine the folding points SLI titan x would score


----------



## Gary2015

Quote:


> Originally Posted by *Jpmboy*
> 
> I already do too many threads... and I'm no guru.


Officially I don't have the cards yet so I'm not an owner







So please guru's with cards start a thread!


----------



## Jpmboy

Quote:


> Originally Posted by *Murlocke*
> 
> Valley, +150/+150 on the Titan X Maxwell and +200/+500 on the Titan X Pascal. And yes, it seems very stable. I have not attempted to push higher, still no throttling.


run the benches as described in the OP of these threads and submit an entry in each... TimeSpy is the only one that uses DX12 in a serious manner.

http://www.overclock.net/t/1518806/firestrike-ultra-top-30/0_20
http://www.overclock.net/t/1443196/firestrike-extreme-top-30
http://www.overclock.net/t/1464813/3d-mark-11-extreme-top-30
http://www.overclock.net/t/872945/top-30-3d-mark-13-fire-strike-scores-in-crossfire-sli
http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores
http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad
http://www.overclock.net/t/1406832/single-gpu-firestrike-top-30
http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/0_20


----------



## Metros

Quote:


> Originally Posted by *Gary2015*
> 
> Water cooling is a hassle but i could never stand the noise of the fans, especially in 2 slot SLI where the first GPU is hotter than the second due to lack of air because I don't have a 40 lane CPU.


You do not have a 40 PCIE lane CPU, these guys with expensive CPUs know what the game is about


----------



## Steven185

Quote:


> Originally Posted by *CallsignVega*
> 
> First run of Pascal Titan
> 
> Callsign_Vega --- 6700K / 4.8 GHz --- Titan X (Pascal), 2050 / 5693 --- 161.7 --- 6764
> 
> 
> 
> Even with 4.8 GHz 6700K GPU was under max quite a bit.


Can you contrast this with a stock run so that we can see the actual gains you got?
Impressive results regardless (not that we expected any less from the Titan XP







)


----------



## Gary2015

Quote:


> Originally Posted by *Metros*
> 
> You do not have a 40 PCIE lane CPU, these guys with expensive CPUs know what the game is about


Technically I don't need to. 16x/8x SLI and 4x for SSD. Yes I'm cheap I know. I opted for free shipping also.


----------



## Diverge

Quote:


> Originally Posted by *Gary2015*
> 
> Water cooling is a hassle but i could never stand the noise of the fans, especially in 2 slot SLI where the first GPU is hotter than the second due to lack of air because I don't have a 40 lane CPU.


I don't multicard anymore either... too many hassles in the past with drivers, games not supporting it, etc. Not sure how it is now, but I just like the best single-GPU card. I don't like loud fans either, but the NVIDIA reference Titan coolers don't bother me much.


----------



## Difunto

Quote:


> Originally Posted by *Murlocke*
> 
> Valley, +150/+150 on the Titan X Maxwell and +200/+500 on the Titan X Pascal. And yes, it seems very stable. I have not attempted to push higher, still no throttling.


hey are you using precision x or msi after burner?


----------



## Gary2015

I wonder what the ASIC quality is on cards so far.


----------



## Gary2015

Quote:


> Originally Posted by *Diverge*
> 
> I don't multicard anymore either.. Too much hassles in the past with drivers, or games not supporting it, ect. Not sure how it is now, but I just like best single gpu card. But I don't like loud fans either, but the nvidia reference titan coolers don't bother me much.


I swore I wouldn't go SLI after TitanXM but one card looks so lonely in my CaseLabs Magnum SM8.


----------



## Jpmboy

Quote:


> Originally Posted by *Difunto*
> 
> hey are you using precision x or msi after burner?


PX does not work with the TXP yet. Only AB4.3beta.
Quote:


> Originally Posted by *Gary2015*
> 
> I wonder what the ASIC quality is on cards so far.


is it even implemented? Not available for the 1080 in GPUZ.


----------



## Difunto

Quote:


> Originally Posted by *Jpmboy*
> 
> PX does not work with the TXP yet. Only AB4.3beta.


oh thanks!


----------



## Gary2015

Quote:


> Originally Posted by *Jpmboy*
> 
> PX does not work with the TXP yet. Only AB4.3beta.
> is it even implemented? Not available for the 1080 in GPUZ.


Maybe we should compile a dataset of cards and their overclocks/power/voltage.


----------



## Metros

Quote:


> Originally Posted by *Gary2015*
> 
> Technically I don't need to. 16x/8x SLI and 4x for SSD. Yes I'm cheap I know. I opted for free shipping also.


You do not know where the game is at then, x16 and x8, really, get with the game


----------



## Gary2015

Quote:


> Originally Posted by *Metros*
> 
> You do not know where the game is at then, x16 and x8, really, get with the game


Correct me if I am wrong, but running 16x/16x versus 16x/8x makes less than 1% difference to frame rates and saves me $300.


----------



## Murlocke

Had to step back to +200/+400. +500 on the memory is too much a voltage hog it seems with a 2013MHz core.


Quote:


> Originally Posted by *Difunto*
> 
> hey are you using precision x or msi after burner?


Afterburner.


----------



## Difunto

ok time to go test all my HTC VIVE games and crank that super sampling to the max!!


----------



## Zurv

shame the score doesn't work because it is an unknown card...

for a single card i'm already faster than kingpin on firestrike ultra.... WAAAAAAA?!?!?

gpu is hang'n out around 2ghz


----------



## Metros

Quote:


> Originally Posted by *Gary2015*
> 
> Correct me if I am wrong but running 16x/16x and 16x/8x makes less than 1% difference to frame rates but saves me $300.


Just like getting Titan X SLI then for 4K and below


----------



## Gary2015

Quote:


> Originally Posted by *Metros*
> 
> Just like getting Titan X SLI then for 4K and below


I have a 4k monitor as well.


----------



## Metros

Quote:


> Originally Posted by *Gary2015*
> 
> I have a 4k monitor as well.


Yeah, what does that add then?


----------



## Gary2015

Quote:


> Originally Posted by *Metros*
> 
> Yeah, what does that add then?


We shall find out when I get me cards....


----------



## Metros

Quote:


> Originally Posted by *Gary2015*
> 
> We shall find out when I get me cards....


We can just look at benchmarks, to notice that one Titan X is enough for 4K


----------



## Naennon

Quote:


> Originally Posted by *Zurv*
> 
> wooo.. time to break some stuff!


lol.... 5 grand man!!!


----------



## Gary2015

Quote:


> Originally Posted by *Metros*
> 
> We can just look at benchmarks, to notice that one Titan X is enough for 4K


You can't play benchmarks.


----------



## Arizonian

Though we can turn this into an *owners thread* at some point, any Titan X owner willing to start a Titan X *club* with a members list on OP feel free.


----------



## Metros

Quote:


> Originally Posted by *Gary2015*
> 
> You can't play benchmarks.


Gaming benchmarks


----------



## Gary2015

Quote:


> Originally Posted by *Arizonian*
> 
> Though we can turn this into an *owners thread* at some point, any Titan X owner willing to start a Titan X *club* with a members list on OP feel free.


I would but strictly speaking I am not an owner yet since my cards haven't arrived yet.


----------



## vmanuelgm

Quote:


> Originally Posted by *Arizonian*
> 
> Though we can turn this into an *owners thread* at some point, any Titan X owner willing to start a Titan X *club* with a members list on OP feel free.


I would start it myself if I knew when I am gonna receive my card... According to Digital River Europe it's lost somewhere around China!!!


----------



## DADDYDC650

Quote:


> Originally Posted by *Gary2015*
> 
> I would but strictly speaking I am not an owner yet since my cards haven't arrived yet.


You bought the cards so you own them. Just haven't arrived that's all. Now make the thread and become OCN famous!


----------



## CallsignVega

Going under water. Holy TIM batman.


----------



## Testier

Mine still haven't shipped out yet.... Order confirmed at 10:29am PST yesterday.

How consistently can the Titan X hold 2GHz+? Anyone hit 2.1GHz yet?


----------



## Gary2015

Quote:


> Originally Posted by *DADDYDC650*
> 
> You bought the cards so you own them. Just haven't arrived that's all. Now make the thread and become OCN famous!


You do it man! I'm still a noob.


----------



## Gary2015

Quote:


> Originally Posted by *CallsignVega*
> 
> Going under water. Holy TIM batman.


Holy crap!!!! LOL.


----------



## DADDYDC650

Quote:


> Originally Posted by *Gary2015*
> 
> You do it man! Im still a noob.


Noobs don't buy 2x $1200 cards. I'll be waiting for you to make that thread.... Or else!


----------



## DADDYDC650

Quote:


> Originally Posted by *CallsignVega*
> 
> Going under water. Holy TIM batman.


Same thing with the 980 Ti's and Titan X's I owned. Nvidia = TIM pigs.


----------



## Gary2015

Quote:


> Originally Posted by *DADDYDC650*
> 
> Noobs don't buy 2x $1200 cards. I'll be waiting for you to make that thread.... Or else!


Gotta do google docs and all that stuff man, I can just about cope with email.


----------



## DADDYDC650

Quote:


> Originally Posted by *Gary2015*
> 
> Gotta do google docs and all that stuff man, I can just about cope with email.


I don't even know who you are anymore.. Wait. I don't know who you are at all!


----------



## Gary2015

Quote:


> Originally Posted by *DADDYDC650*
> 
> I don't even know who you are anymore.. Wait. I don't know who you are at all!


LOL, nice board btw..


----------



## Murlocke

Was that heatsink as easy to remove as it looked?

That has to be hurting temps. Tempted to take mine off and apply my own TIM.
Quote:


> Originally Posted by *Testier*
> 
> Mine still havent shipped out yet.... Order confirmed at 10:29am PST yesterday.
> 
> How consistent can the titan x hold 2ghz+? Anyone hit 2.1ghz yet?


I think everyone is hitting 2GHz, but much higher and I start to have issues. It holds it pretty well.


----------



## Creator

Getting annoyed at lack of shipping info here. I order before 6:30AM PST.


----------



## Gary2015

Quote:


> Originally Posted by *Creator*
> 
> Getting annoyed at lack of shipping info here. I order before 6:30AM PST.


Call them. I ordered mine yesterday just after 9am and chose free shipping. They said orders will be processed in next 24 hours.


----------



## DADDYDC650

Quote:


> Originally Posted by *Gary2015*
> 
> LOL, nice board btw..


Thanks. Yours is amazing as well. Love that CPU too


----------



## Gary2015

Quote:


> Originally Posted by *Murlocke*
> 
> Was that heatsink as easy to remove as it looked?
> 
> That has to be hurting temps. Tempted to take mine off and apply my own TIM.
> I think everyone is hitting 2GHz, but much higher and I start to have issues. It holds it pretty well.


How many people have tested their cards so far and what are their overclocks?


----------



## DADDYDC650

Quote:


> Originally Posted by *Murlocke*
> 
> Was that heatsink as easy to remove as it looked?
> 
> That has to be hurting temps. Tempted to take mine off and apply my own TIM.
> I think everyone is hitting 2GHz, but much higher and I start to have issues. It holds it pretty well.


I had a Titan X that would run hot. Took off the heatsink and reapplied the paste. Not sure if the heatsink wasn't installed correctly or there was too much TIM, but temps dropped 10C.


----------



## Gary2015

Quote:


> Originally Posted by *DADDYDC650*
> 
> Thanks. Yours is amazing as well. Love that CPU too


Thanks man. It's out of commission until I get my new TitanXP's since I sold my TitanXM's a few days ago. Maybe I could put in my old GTX 590?


----------



## DADDYDC650

Quote:


> Originally Posted by *Gary2015*
> 
> Thanks man, Its out of commission until I get my new TitanXP's since I sold my TitanXM's a few days ago. Maybe I could put in a my old 590 GTX.?


590? Eww. Might as well have a hobo drive your car.


----------



## Gary2015

Quote:


> Originally Posted by *DADDYDC650*
> 
> 590? Eww. Might as well have a hobo drive your car.


ROFL!!


----------



## NoDoz

Quote:


> Originally Posted by *Creator*
> 
> Getting annoyed at lack of shipping info here. I order before 6:30AM PST.


Today or yesterday?


----------



## Gary2015

Quote:


> Originally Posted by *NoDoz*
> 
> Today or yesterday?


I ordered mine yesterday. They told me they had to clear a backlog, whatever that meant.


----------



## DADDYDC650

Can anyone with 2x Titan XP's run benches at 16x/16x pci-e vs 8x/8x? Would like to know if these cards saturate the lanes at all


----------



## Gary2015

Quote:


> Originally Posted by *DADDYDC650*
> 
> Can anyone with 2x Titan XP's run benches at 16x/16x pci-e vs 8x/8x? Would like to know if these cards saturate the lanes at all


I would like to know this as well. Maybe need to upgrade to 6850K.


----------



## Murlocke

Quote:


> Originally Posted by *Gary2015*
> 
> Wow, it seems this card lives up to its billing. Can you test GTA V at 4k at max settings?


Completely maxed out, including advanced graphic options, at 4K with no AA. It maintained 60+ FPS in the benchmark 99% of the time. The only dips were the last scene at the bridge for some reason where it went to 47FPS briefly, then right back up to 70-80FPS as the plane flew around.


----------



## Gary2015

Quote:


> Originally Posted by *Murlocke*
> 
> Completely maxed out, including advanced graphic options, at 4K with no AA. It maintained 60+ FPS in the benchmark 99% of the time.


Wow. Thanks a lot for this! Can't wait to get my cards.


----------



## Testier

Quote:


> Originally Posted by *CallsignVega*
> 
> Going under water. Holy TIM batman.


You need the weird hexagonal screws again? I feel like just a repaste would help it a lot.


----------



## fisher6

Do you guys think with OC and no AA this card will hold itself until next year when Volta is out?


----------



## EniGma1987

Quote:


> Originally Posted by *CallsignVega*
> 
> Holy moley, testing individual and one of my cards will do 2101 MHz on air.


I know there is a lot more to it than ASIC quality percentage, but do the two cards happen to be quite different in that rating or are they pretty similar ASIC %?


----------



## Gary2015

Quote:


> Originally Posted by *EniGma1987*
> 
> I know there is a lot more to it than ASIC quality percentage, but do the two cards happen to be quite different in that rating or are they pretty similar ASIC %?


Like with the 1080's, I don't think we can read ASIC quality on Pascal.


----------



## mouacyk

Quote:


> Originally Posted by *fisher6*
> 
> Do you guys think with OC and no AA this card will hold itself until next year when Volta is out?


Are you serious? Unless AMD of course...
Quote:


> Originally Posted by *EniGma1987*
> 
> I know there is a lot more to it than ASIC quality percentage, but do the two cards happen to be quite different in that rating or are they pretty similar ASIC %?


GPUz hasn't been able to read Pascal ASIC scores.


----------



## cookiesowns

Quote:


> Originally Posted by *Metros*
> 
> Anyone got two GTX 980ti they could use to benchmark, along with the new Titan X, want to compare performance, thanks


If I ever get my cards shipped, damnit.

I'm expecting the TX Pascal to come within 15% of overclocked Kingpins in SLI, assuming 80%+ scaling efficiency on the 980 Ti.
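
SLI scaling efficiency like that 80% figure is just a ratio of two-card to one-card frame rates. A minimal sketch (the FPS numbers here are purely illustrative, not anyone's measurements):

```python
def sli_scaling(fps_single: float, fps_sli: float) -> float:
    """Percent of the second card's theoretical doubling actually gained.

    100% would mean the SLI pair runs exactly twice as fast as one card.
    """
    return (fps_sli / fps_single - 1) * 100


# Example: a single card at 69.2 FPS and the SLI pair at 129 FPS
print(round(sli_scaling(69.2, 129.0), 1))   # ~86.4% scaling
```

Dividing a single overclocked card's FPS by `fps_single * (1 + efficiency)` is then how you estimate whether one big GPU lands "within 15%" of a scaled pair.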


----------



## Murlocke

Quote:


> Originally Posted by *Gary2015*
> 
> Wow. Thanks a lot for this! Can't wait to get my cards.


Just played through the whole first mission of GTA V, completely maxed still, 3840x2160, FXAA and VSYNC on. Not a single time did it budge from 60FPS, smooth as butter as far as 60Hz goes. This was completely unplayable at these settings on my Titan X Maxwell. I enabled 2x MSAA to test, but I started seeing low 50s while driving around. I'm sure I could turn grass down from Ultra or something and 2x MSAA would be fine. With 2 cards, I'm sure you'll be able to crank MSAA no problems.

The game looks absolutely mind blowing on a 65" 4K OLED. Words can't describe.


----------



## D749

My two Titan XPs arrive tomorrow. In celebration I just ordered an Asus Rampage V Extreme E10, 32GB Corsair DDR4 3200, Seasonic Platinum 1200 PSU and a Samsung 950 Pro, which all should arrive the day after the GPUs. I'll pick up the 6900K locally. I was planning to hold out for the Samsung 960 Pro to be released, but oh well.







Will throw it in a bench for now. My new CL SM8 doesn't arrive for months.


----------



## Spiriva

It's very tempting to sell the two 1080's and buy two of these new Titans. I guess a few of you guys already did that switch?


----------



## bl4ckdot

Anyone from France got their Titans shipped ? I'm still waiting


----------



## lyang238

Quote:


> Originally Posted by *Spiriva*
> 
> Its very tempting to sell the two 1080´s and buy two of these new Titans, I guess a few of you guys already did that switch ?


I did just that. I want 1 card that does roughly 30% better than 1080s which is what this will give me. Plus it actually costs the same anyways for me since I got a deal on both my 1080s.


----------



## vmanuelgm

Quote:


> Originally Posted by *bl4ckdot*
> 
> Anyone from France got their Titans shipped ? I'm still waiting


Europe is second class, USA first class...

I talked to Digital River (Spanish department) and they don't know when they are shipping, so the 1-3 working days is b**sh**t...

Smart guy Jesus Juan... 1310 euros with no ETA...


----------



## Murlocke

Well, I am completely satisfied and it exceeded my expectations by a large amount. Every game I try, 60FPS smooth maxed out at 4K with a +200/+400 OC.

Now to decide if I want to reapply the TIM after seeing vega's screenshot. The card gets to about 83-84C in some titles even with 90% fan when you set TDP to 120%.


----------



## vmanuelgm

Quote:


> Originally Posted by *Murlocke*
> 
> Well, I am completely satisfied and it exceeded my expectations by a large amount. Every game I try, 60FPS smooth maxed out at 4K with a +200/+400 OC.
> 
> Now to decide if I want to reapply the TIM after seeing vega's screenshot. The card gets to about 83-84C in some titles even with 90% fan when you set TDP to 120%.


Well, if you get bored of those great things, send the card to Spain, be kind!!!


----------



## Baasha

Got Titan X?


----------



## cookiesowns

FUDGE IT. I YOLO'd. Cancelled my first order, added one-day shipping to next order, and a second card with some bridges. YOLOLOLOL


----------



## Metros

Quote:


> Originally Posted by *Spiriva*
> 
> Its very tempting to sell the two 1080´s and buy two of these new Titans, I guess a few of you guys already did that switch ?


I would keep your GTX 1080 SLI or get one Titan X, you do not need two of them


----------



## CallsignVega

My Franken Titan is almost alive.


----------



## D749

Quote:


> Originally Posted by *CallsignVega*
> 
> My Franken Titan is almost alive.


Nice, but how are you going to route those tubes when the card is installed? More pics please.


----------



## Mad Pistol

Quote:


> Originally Posted by *Zurv*
> 
> wooo.. time to break some stuff!


$5000 worth of video cards... if that's not VC porn, I don't know what is.

Also, to everyone who ordered these video cards... you're freakin nuts!!!


----------



## Newtocooling

Quote:


> Originally Posted by *Murlocke*
> 
> Completely maxed out, including advanced graphic options, at 4K with no AA. It maintained 60+ FPS in the benchmark 99% of the time. The only dips were the last scene at the bridge for some reason where it went to 47FPS briefly, then right back up to 70-80FPS as the plane flew around.




Was that one card or two? Never mind just saw your other post.


----------



## Z0eff

To those that bought 3 or 4 of these monsters - What will you be using them for? Purely benching? Hoping for a DX12 game that you like to come along where the devs support 4 graphics adapters?

Very jealous regardless, especially after seeing that these cards seem to be very capable of going well north of 2GHz.


----------



## Murlocke

Quote:


> Originally Posted by *Newtocooling*
> 
> Was that one card or two?


One.

The card is a freaking beast.


----------



## Jpmboy

Quote:


> Originally Posted by *Naennon*
> 
> lol.... 5 grand man!!!


Hey bro.... where's a Pascal bios editor??
Quote:


> Originally Posted by *Arizonian*
> 
> Though we can turn this into an *owners thread* at some point, any Titan X owner willing to start a Titan X *club* with a members list on OP feel free.


volunteers... someone?
Quote:


> Originally Posted by *Gary2015*
> 
> I would but strictly speaking I am not an owner yet since my cards haven't arrived yet.


Does Not Matter.
lol- that's no excuse.








Quote:


> Originally Posted by *CallsignVega*
> 
> Going under water. Holy TIM batman.


oh man, they're goobering them up again? Geeze.
Quote:


> Originally Posted by *Testier*
> 
> You need the weird hexagonal screws again? I feel like just a repaste would help it a lot.


4mm socket to remove the backplate
Quote:


> Originally Posted by *CallsignVega*
> 
> My Franken Titan is almost alive.


I'll go EK uniblock after a day or two on air. Let us know how the temps tame, if at all.









Mine arrived while I was out for a ride.


----------



## cookiesowns

Quote:


> Originally Posted by *Jpmboy*
> 
> Hey bro.... where's a Pascal bios editor??
> volunteers... someone?
> Does Not Matter.
> lol- that's no excuse.
> 
> 
> 
> 
> 
> 
> 
> 
> oh man, they're goobering them up again? Geeze.
> 4mm socket to remove th ebackplate
> I'll go EK uniblock after a day or two on air. Let us know how the temps tame, if at all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Mine arrived while I was out for a ride.


Maybe I'll do it???? Just don't want to leave you guys high and dry if work gets busy again.

Ugh... I should have just ordered two to begin with. Fingers crossed Nvidia doesn't give me the jerk-around like the guy that ordered 4, or the guy that got 3 charges for 2 cards. I CAN'T WAIT FOR THESE CARDS, I WANT THEM NOW.


----------



## sammkv

If I had the money to buy these I would have bought 6, kept one, and sold the rest for a profit on eBay.


----------



## Zurv

Quote:


> Originally Posted by *Baasha*
> 
> Got Titan X?


you are a manly man sir!

FYI, Nvidia is working on a driver fix so you can enable 4-way SLI (...which will only work in approved benchmarks, same as the 1080).
Right now you are only able to turn on 2-way SLI. (If you can, please tell me so I can update Nvidia.)


----------



## CallsignVega

First installed.




These EVGA 980Ti Hybrid kits are pretty nice quality. Glad I found them on Amazon for only $67.


----------



## Naennon

Quote:


> Originally Posted by *Jpmboy*
> 
> Hey bro.... where's a Pascal bios editor??


no ETA - when it's done - probably no more


----------



## lyang238

Quote:


> Originally Posted by *CallsignVega*
> 
> First installed.
> 
> 
> 
> 
> These EVGA 980Ti Hybrid kits are pretty nice quality. Glad I found them on Amazon for only $67.


Hmm this looks interesting/promising. However I think EVGA is already working on a Titan X hybrid AIO cooler, but this is good for those that already have this setup or one laying around from a previous 980ti selloff.


----------



## CallsignVega

Quote:


> Originally Posted by *lyang238*
> 
> Hmm this looks interesting/promising. However I think EVGA is already working on a Titan X hybrid AIO cooler, but this is good for those that already have this setup or one laying around from a previous 980ti selloff.


Any of the EVGA Hybrid kits will work. I prefer leaving on the stock aluminum shroud around the blower fan instead of putting on the EVGA plastic shroud.

So far 2100 MHz at 48C full load not too bad..


----------



## DADDYDC650

Quote:


> Originally Posted by *CallsignVega*
> 
> Any of the EVGA Hybrid kits will work. I prefer leaving on the stock aluminum shroud around the blower fan instead of putting on the EVGA plastic shroud.
> 
> So far 2100 MHz at 48C full load not too bad..


Very nice! If only we could play with the voltage freely...


----------



## lyang238

Quote:


> Originally Posted by *CallsignVega*
> 
> Any of the EVGA Hybrid kits will work. I prefer leaving on the stock aluminum shroud around the blower fan instead of putting on the EVGA plastic shroud.
> 
> So far 2100 MHz at 48C full load not too bad..


Damn those are nice temps. Do you happen to have a thermal imaging cam to check how the VRMs are being cooled before/after?


----------



## CallsignVega

Quote:


> Originally Posted by *lyang238*
> 
> Damn those are nice temps. Do you have a happen to have a thermal image cam to check how VRMs are being cooled before/after?


Sorry, no IR. I'm not worried about the VRM temps, as the stock blower is only cooling them now without the GPU chip load. The VRM area has its own heatsink with air blowing over it from the stock fan. The setup is much quieter than stock, of course.


----------



## Jpmboy

just installed... but the wife wants to hang at the pool. will test later.


----------



## CallsignVega

So it looks like under water the cards will still be voltage/BIOS limited. I get no throttling but without more voltage/modified BIOS ~2100 MHz is looking like the limit under water. Works for me though as these cards scream at 2.1 GHz.

Bring on the modded BIOS!


----------



## Darkstar757

Hey bud I am also in the DMV. I would love to talk hardware with you sometime. LMK if you would be interested.

Darkstar


----------



## Jpmboy

Okay... quick Time Spy, bone stock, boosting to 1860. lol - it would be 1st place in the 2-card HOF and OCN's thread.









FM is not recognizing the cards or the new driver.


----------



## Baasha

Quote:


> Originally Posted by *Zurv*
> 
> you are a manly man sir!
> 
> FYI, NVidia is working on a driver fix so you can enable 4 way SLI (... which will only work in approved benchmarks... same as the 1080)
> Right now you are only able to turn on 2 way sli. (if you can please tell me so i can update NVidia)


4-Way SLI or bust!









Glad to know they are working on a 'fix.'

Btw, did you ever get 4-Way to work in non-DX12 games w/ the 1080s? I never even bothered - I put 2 in each of my rigs and called it a day.

The way the 2 1080s performed at 5K was just mind-boggling; better than 4-Way SLI Titan X (Maxwell) in most games/scenarios. Can't wait to test out the Pascal Titan X in 5K!









I would love to enable 4-Way in some older games (BF4 etc.) to see the madness!!

So the 4-Way Titan X (Pascal) club is just the two of us (for now) eh?


----------



## CallsignVega

Quote:


> Originally Posted by *Darkstar757*
> 
> Hey bud I am also in the DMV. I would love to talk hardware with you sometime. LMK if you would be interested.
> 
> Darkstar












Looks like 5594 on my memory is the best speed/timing combo.


----------



## Baasha

Okay - I just installed the first card (going to test each individually).

Installed the Titan X driver (369.05) and after reboot (seems to auto restart for me instead of asking me after driver install), there is no NVCP option in the right-click menu.

Also, PhysX is not installed in Control Panel!









I used DDU to remove the driver and tried again to no avail!

HELP!

EDIT: Nvm, I mixed up the DP cables lol.. works fine now.


----------



## Murlocke

So who wants to be the guinea pig and take the cooler off to reapply TIM and report the differences?








Quote:


> Originally Posted by *CallsignVega*
> 
> So it looks like under water the cards will still be voltage/BIOS limited. I get no throttling but without more voltage/modified BIOS ~2100 MHz is looking like the limit under water. Works for me though as these cards scream at 2.1 GHz.
> 
> Bring on the modded BIOS!


Both your cards do 2100? So that's about +300 on the core? What's your RAM at when running that OC? I got some artifacts popping up at +200/+400 on mine. Stepped down to +200/+300 and seemed to have fixed it.

How hard was it to remove the cooler?


----------



## Steven185

Quote:


> Originally Posted by *CallsignVega*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Look's like 5594 on my memory is the best speed/timing combo.


Hey, once you find your best clocks can you do a before/after shot/test?

I mean run a test on stock clocks and then with your stable clocks. It's worthy to note how much this card can overclock under good cooling and have it measured in actual performance difference instead of mere clocks...

Thanks







(still waiting for mine on the mail).


----------



## CallsignVega

Quote:


> Originally Posted by *Murlocke*
> 
> So who wants to be the guinea pig and take the cooler off to reapply TIM and report the differences?
> 
> 
> 
> 
> 
> 
> 
> 
> Both your cards do 2100? So that's about +300 on the core? What's your RAM at when running that OC? I got some artifacts popping up at +200/+400 on mine. Stepped down to +200/+300 and seemed to have fixed it.
> 
> How hard was it to remove the cooler?


Your memory shouldn't be capping out that low. Cooler is pretty easy to remove. This card has about the most screws I've seen on a GPU yet.


----------



## Murlocke

Quote:


> Originally Posted by *CallsignVega*
> 
> Your memory shouldn't be capping out that low. Cooler is pretty easy to remove. This card has about the most screws I've seen on a GPU yet.


I get weird red flashes. It doesn't look like artifacting; it's like an entire red frame every now and then. I saw about 6 of them at +200/+400 when doing Fire Strike Ultra. I lowered it to +200/+300 and there were none, so I'll assume it was the RAM until I see it again.

I always get absolutely terrible overclockers, so wouldn't surprise me. My previous Titan did +150/+150 MAX, way below most.


----------



## CallsignVega

Ouch. One of mine does +900 lol.


----------



## MrTOOSHORT

Got mine 45min ago, incredible it was 1 day shipping from US to Canada. Also no duty charge.

Here is a 3dmark11 run:



*http://www.3dmark.com/3dm11/11466052*

Very impressed and happy so far!

Need block...


----------



## Murlocke

Quote:


> Originally Posted by *CallsignVega*
> 
> Ouch. One of mine does +900 lol.


Ofc it does, you always get the best cards. I think you and I have compared OC results at least 3 times in past generations and your cards always OC much better. I called this yesterday!










It could be the fact that +200 core isn't stable and OCing the memory more brings out the issues faster. I'll try something like +150/+500 sometime tonight.


----------



## vmanuelgm

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Got mine 45min ago, incredible it was 1 day shipping from US to Canada. Also no duty charge.
> 
> Here is a 3dmark11 run:
> 
> 
> 
> *http://www.3dmark.com/3dm11/11466052*
> 
> Very impressed and happy so far!
> 
> Need block...


Nice score TooShort, wait for mine to finish with arm wrestling!!! xD

Blocks will ship on 16th from EK... You can preorder them...


----------



## DADDYDC650

Quote:


> Originally Posted by *Murlocke*
> 
> Ofc it does, you always get the best cards. I think you and I have compared OC results at least 3 times in past generations and your cards always OC much better. I called this yeterday!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It could be the fact that +200 core isn't stable and OCing the memory more brings out the issues faster. I'll try something like +150/+500 sometime tonight.


Perhaps his power supply helps with overclocks? Best PSU available.


----------



## CallsignVega

Yes, test the max of both individually, then raise both to those levels and see if it remains stable. It's a little bit of an art.

It will be interesting to see how high I will be able to keep mine stable in SLI with the HB bridge.
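
That "max each individually, then combine" routine can be sketched in a few lines. This is only an illustration of the search logic: `run_stress`, the step size, and the stability cut-offs in the stub are all hypothetical stand-ins for a real stress-test loop (Heaven, Fire Strike, etc.):

```python
def find_max_offset(run_stress, lo=0, hi=600, step=25):
    """Walk the offset up in `step` MHz increments until the stress
    test fails, then report the last offset that passed."""
    best = lo
    offset = lo
    while offset <= hi:
        if not run_stress(offset):
            break
        best = offset
        offset += step
    return best


# Stub stability checks: pretend this card artifacts above +200 core
# and +450 memory. In reality these would launch a benchmark run.
core_stable = lambda off: off <= 200
mem_stable = lambda off: off <= 450

max_core = find_max_offset(core_stable)        # -> 200
max_mem = find_max_offset(mem_stable, hi=900)  # -> 450

# Combined load hits the power limit harder, so back both off one
# step before testing them together (the "bit of an art" part).
combined = (max_core - 25, max_mem - 25)
print(max_core, max_mem, combined)
```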


----------



## DADDYDC650

I wonder if these bad boys saturate the pci-e lanes at all.....


----------



## CallsignVega

Quote:


> Originally Posted by *DADDYDC650*
> 
> Perhaps his power supply helps with overclocks? Best PSU available.


Oh ya, never cheap out on power supplies. I like to run a power supply about twice as powerful as I need. That keeps it at about half capacity under load: best efficiency and the most stable voltage/clean power.
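
The "twice what I need" rule of thumb is simple arithmetic. A rough sketch, where the wattage figures are illustrative assumptions rather than measurements from this thread:

```python
def recommended_psu(component_watts, headroom=2.0):
    """Estimated system draw under load, times the desired headroom factor."""
    return sum(component_watts) * headroom


# Hypothetical loaded rig: 2x Titan XP (~250W each), CPU (~140W),
# board/drives/fans (~60W) -> roughly 700W estimated load.
draw = [250, 250, 140, 60]
print(recommended_psu(draw))  # 1400.0 -> shop in the ~1500W class
```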


----------



## DADDYDC650

Quote:


> Originally Posted by *CallsignVega*
> 
> Oh ya, never cheapen out on power supplies. I like to run a power supply about twice as powerful as I need. This keeps the power supply at about half capacity under load, best efficiency and most stable voltage/clean power.


Indeed. I got your PSU's lil brother.


----------



## Difunto

Vega, do you think my old EVGA Titan X AIO will fit this one? I'd like to swap it over since I'm not using the old Titan X.


----------



## DADDYDC650

Quote:


> Originally Posted by *Difunto*
> 
> Vega you think my old evga titan x aio will fit this one? like to swap it since am not using the old titan x?


It will fit just like his.


----------



## Difunto

But I'd like to use the EVGA shroud, since I can't route the cables the way he has them.


----------



## DADDYDC650

Quote:


> Originally Posted by *Difunto*
> 
> but like using the evga shroud since i can't route the cables the way he has them


Doubt the shroud will fit well. Probably why EVGA is making a new one for the XP.


----------



## GnarlyCharlie

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Got mine 45min ago, incredible it was 1 day shipping from US to Canada. Also no duty charge.
> 
> Here is a 3dmark11 run:
> 
> *http://www.3dmark.com/3dm11/11466052*
> 
> Very impressed and happy so far!
> 
> Need block...


Stud-like. That's just a couple K off my SLI OG TX score.

http://www.3dmark.com/3dm11/10991803


----------



## Difunto

Quote:


> Originally Posted by *DADDYDC650*
> 
> Doubt the shroud will fit well. Probably why EVGA is making a new one for the XP.


Damn, the thing is that I don't know if it will let me route the cables upwards instead of frontwards like Vega's.


----------



## Testier

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Got mine 45min ago, incredible it was 1 day shipping from US to Canada. Also no duty charge.
> 
> Here is a 3dmark11 run:
> 
> 
> 
> *http://www.3dmark.com/3dm11/11466052*
> 
> Very impressed and happy so far!
> 
> Need block...


Congrats! And happy to see a fellow enthusiast in Edmonton!
Quote:


> Originally Posted by *DADDYDC650*
> 
> Doubt the shroud will fit well. Probably why EVGA is making a new one for the XP.


Do we really need the shroud or is it just for looks?


----------



## Baasha

rofl.. just tested one card and it's >7000 in Fire Strike Ultra at stock (no OC). o_0

This Pascal Titan X seems to be an absolute monster.









COME ON 4-WAY SLI!!!


----------



## DADDYDC650

Quote:


> Originally Posted by *Testier*
> 
> Congrats! And happy to see a fellow enthusiasts in Edmonton!
> Do we really need the shroud or is it just for looks?


Just for looks mostly. Vega hit 2.1Ghz and probably didn't break 50c. You should get similar results unless you live in Africa.


----------



## mbze430

Man, talk about catching up with the thread.....

Just wanted to report back that Aqua Computer will have/should have a Titan XP waterblock and backplate in the next couple of weeks. I asked them about the HB Bridge/NVLink... will report when they send info.

I am still waiting for shipping info. SOB!


----------



## HyperMatrix

Quote:


> Originally Posted by *Baasha*
> 
> rofl.. just tested one card and it's >7000 in fire strike ultra at stock (no OC). .o_0
> 
> This Pascal Titan X seems to be an absolute monster.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> COME ON 4-WAY SLI!!!


Know what I'm really curious about? Gaming performance comparing your 4 cards at pcie 3.0 x8 with standard sli bridge, vs. 2 way on pcie 3.0 x16 with sli hb bridge.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Testier*
> 
> Congrats! And happy to see a fellow enthusiasts in Edmonton!


Thanks!









Be careful with the word _enthusiast_ around here, it's touchy!


----------



## dante`afk

are there any 1080 sli vs titan benches yet?


----------



## newb here

Just ordered my two today!!!









Was curious: I also ordered the EK waterblocks and was wondering what everyone else is using for their SLI bridge, seeing that the NVIDIA HB bridge won't work with the block.


----------



## gamingarena

Wow what a ride!

Got it this morning and have been benching it since against a 1080 OC and 1080 OC SLI at 2025MHz/11GHz.

Short conclusion: this card OCs like a monster. At 2050MHz/11GHz it straight up matches 1080 OC SLI in most tests, beats it in some, and loses by only ~15% in others; in everything that doesn't support SLI (Unreal 4, Unity, VR, etc.) it just walks over it like a bulldozer...
This card just made SLI obsolete. Buying two of these is an absolute waste at anything below 4K; the second card will just idle in the system.

I'm running it on Acer Predator 1440p/165hz
5960x 4.3ghz
16gb 3000mhz

Here are some of the results, OC'd at the above-mentioned speeds for both, HB Bridge used for the 1080 SLI.

Heaven 1440p 8XAA
1080 - 69.2FPS
1080SLI - 129FPS
TitanXP - 103.4FPS

Valley 1440p 8XAA
1080 - 69.7FPS
1080SLI - 120.8FPS
TitanXP - 104.2FPS

Division 1440p Fully Maxed
1080 - avg 83.1FPS
1080SLI - avg 118.7FPS
TitanXP - avg 108.6FPS

Mordor 1440p Fully Maxed
1080 - avg 113.22FPS
1080SLI - avg 198FPS
TitanXP - avg 161.29FPS

Doom 1440p Fully Maxed
1080 - 148.9FPS
1080SLi - 151.3FPS
TitanXP - 200FPS pegged (that's the max FPS)

Far Cry Primal 1440p Fully Maxed
1080 - avg 82FPS
1080SLI - avg 87FPS
TitanXP - avg 103FPS

Witcher 3 1440p Fully Maxed
1080 - 68FPS
1080SLI - 101FPS
TitanXP - 94.2FPS

ROTR 1440p Fully Maxed DX11
1080 - 94.18FPS
1080SLI - 122.76FPS
TitanXP - 124.56FPS

I've been running SLI since Voodoo2 SLI. I moved from 2x Titan X to 2x 1080 to now a single Titan XP, and I'm blown away; the smoothness and FPS this monster provides in a single-GPU package is just out of this world.
Now to test some Oculus and Vive games.

Again, this Titan X ran like a champ at +200/+500 (2050MHz, settling at a flat 2GHz after warming up), fan at 90%, default voltage, temps under 80C.
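For anyone wanting to sanity-check the numbers above, the SLI scaling and the single-card deficit fall out of a few divisions (a throwaway Python sketch; the figures are just the ones posted here):

```python
# Scaling math from the posted 1440p results (single 1080, 1080 SLI, Titan XP).
results = {
    "Heaven":    (69.2, 129.0, 103.4),
    "Valley":    (69.7, 120.8, 104.2),
    "Division":  (83.1, 118.7, 108.6),
    "Mordor":    (113.22, 198.0, 161.29),
    "Witcher 3": (68.0, 101.0, 94.2),
    "ROTR DX11": (94.18, 122.76, 124.56),
}

for name, (single, sli, txp) in results.items():
    sli_gain = (sli / single - 1) * 100   # % gain from the second 1080
    txp_vs_sli = (txp / sli - 1) * 100    # Titan XP relative to 1080 SLI
    print(f"{name:>10}: SLI +{sli_gain:.0f}%, Titan XP vs SLI {txp_vs_sli:+.0f}%")
```

So where SLI scales well (Heaven: +86%, Mordor: +75%) the single Titan XP trails by ~20%, and where scaling collapses (Far Cry Primal, ROTR) it pulls ahead, which matches the conclusion above.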


----------



## mbze430

This might be a dumb question.. has anyone tried matching the mounting holes from a 980TI waterblock on the new Titan XP??


----------



## MrTOOSHORT

Quote:


> Originally Posted by *mbze430*
> 
> This might be a dumb question.. has anyone tried matching the mounting holes from a 980TI waterblock on the new Titan XP??


EK already said the blocks won't work from past gens.


----------



## HyperMatrix

Quote:


> Originally Posted by *gamingarena*
> 
> this card just made SLi obsolete, buying 2x of this card is absolute waste at anything bellow 4K will just idle in the system.


165Hz 1440p is more demanding than 60Hz 4K.


----------



## dante`afk

Quote:


> Originally Posted by *gamingarena*
> 
> ...


Your settings are not fully maxed. If you fully maxed them, The Division would give you about 90 FPS and ROTR about 75 FPS in SLI.


----------



## Slomo4shO

For those interested, waterblocks are available through EK


----------



## CallsignVega

Got these babies all set up, time to dial in the overclocks for SLI.


----------



## Captivate

Eww, no FC blocks and/or CPU block?


----------



## lyang238

Quote:


> Originally Posted by *gamingarena*
> 
> Wow what a ride!
> 
> got it this morning and been benching since against 1080OC and 1080OC SLi 2025mhz/11ghz
> 
> Short conclusion this card OC like monster and at 2050mhz/11Ghz it straight up matches 1080OC SLi in most, in some beat it and some loose it by only 15% and the rest that does not support SLi like Unreal4, Unity, VR etc just walks over it like bulldozer...
> this card just made SLi obsolete, buying 2x of this card is absolute waste at anything bellow 4K will just idle in the system.
> 
> I'm running it on Acer Predator 1440p/165hz
> 5960x 4.3ghz
> 16gb 3000mhz
> 
> here are some of the results OC at above mentioned speeds for both, HB Bridge used for 1080 SLi.
> 
> Heaven 1440p 8XAA
> 1080 - 69.2FFPS
> 1080SLI - 129FPS
> TitanXP - 103.4FPS
> 
> Valley 1440p 8XAA
> 1080 - 69.7FPS
> 1080SLI - 120.8FPS
> TitanXP - 104.2FPS
> 
> Division 1440p Fully Maxed
> 1080 - avg 83.1FPS
> 1080SLI - avg 118.7FPS
> titanXP - avg 108.6FPS
> 
> Mordor 1440p Fully Maxed
> 1080 - avg 113.22FPS
> 1080SLI - avg 198FPS
> TitanXP - avg 161,29FPS
> 
> Doom 1440p Fully Maxed
> 1080 - 148.9FPS
> 1080SLi - 151.3FPS
> TtianXP - 200FPS pegged (thats max FPS)
> 
> Far Cry Primal 1440p Fully Maxed
> 1080 - avg 82FPS
> 1080SLI - avg 87FPS
> TitanXP - avg 103FPS
> 
> Witcher 3 1440p Fully Maxed
> 1080 - 68FPS
> 1080SLI -101FPS
> TitanXP -94.2FPS
> 
> ROTR 1440p Fully Maxed DX11
> 1080 - 94.18FPS
> 1080SLI - 122.76FPS
> TitanXP - 124.56FPS
> 
> I been running SLi since voodoo2 SLi i moved from 2x Titan X to 2x 1080 to now single Titan XP and im blown away, the smoothness and FPS this monster provides in single GPU package its just out of this world
> Now to test some Oculus and Vive games
> 
> Again this TianX ran like a champ at +200/+500 2050mhz and after warming up 2ghz flat, fan at 90% this is default voltage temps under 80c


^^^THIS is what I've been waiting for. Thanks and repped. I should expect similar results!


----------



## CallsignVega

Quote:


> Originally Posted by *Jpmboy*
> 
> on your z170 mobo.. are they at x8? You have the HB bridge? If yes, could you run the attached concurrentbandwidth test and post the resulting command window?
> 2 TXM, R5E-10/6950X
> 
> 
> Unzip and open the folder, for windows10, File>open command prompt as admin. Type _concbandwidthtest 0,1_
> 
> ConcBandwidth.zip 5k .zip file
> 
> 
> this is not clock dependent (well, except for the PEG/DMI). so straight stock is all that's needed.
> 
> 4.3beta? Try GPUZ


Here are the results at two x8. I don't think this tests the SLI bridge, eh?


Quote:


> Originally Posted by *Captivate*
> 
> Eww, no FC blocks and/or CPU block?


No, as this system is going into an Airstream. Very space limited. The sad part is a water CPU block wouldn't really get my 6700K any faster. It's at the wall frequency wise, not heat based.


----------



## EniGma1987

Quote:


> Originally Posted by *dante`afk*
> 
> are there any 1080 sli vs titan benches yet?


Guru3D says SLI 1080's get 16,385 FSE score. Titan XP gets 12,256 FSE. Overclocked Titan should get around 13,500 I think.
A Titan X scores right about what GTX 1070 SLI scores, both stock and overclocked on each.
In real games, the 1080 SLI gets 10-15 more FPS if the game actually scales well. In games where SLI scaling is crap and only adds like 5fps over a single card, then the Titan X actually wins.


----------



## ManuelG_at_NVIDIA

Quad SLI support for 3DMark Fire Strike is not enabled in our current 369.05 driver for the new Titan X (Pascal). Support will be added in our next driver releasing later this month. Sorry for the inconvenience.


----------



## stryker7314

Quote:


> Originally Posted by *CallsignVega*
> 
> Got these babies all set up, time to dial in the overclocks for SLI.


Is that the 980Ti hybrid cooler?


----------



## Fiercy

2050 on air so far, getting 110-120 in Witcher 3, up from 65-75 on the old Titan X on water.


----------



## dante`afk

Quote:


> Originally Posted by *ManuelG_at_NVIDIA*
> 
> Quad SLI support for 3DMark Fire Strike is not enabled in our current 369.05 driver for the new Titan X (Pascal). Support will be added in our next driver releasing later this month. Sorry for the inconvenience.


any info when new titan x's will be available again in store? need to buy my second


----------



## Murlocke

Been playing Witcher 3. A single card handles it pretty well even with hair works at 4K. Lowest dip I saw was about 51FPS, I prefer playing the game with a controller so 51 vs 60 is barely noticeable.

Disable/Lower hairworks settings, 60 at all times.
Quote:


> Originally Posted by *Fiercy*
> 
> 2050 on air so far getting 110-120 in witcher 3 from 65-75 on old titan x on water.


I assume you're not running 4K with that FPS.


----------



## dmasteR

Quote:


> Originally Posted by *ManuelG_at_NVIDIA*
> 
> Quad SLI support for 3DMark Fire Strike is not enabled in our current 369.05 driver for the new Titan X (Pascal). Support will be added in our next driver releasing later this month. Sorry for the inconvenience.


Not exactly related to the Titan XP, but any day on when the Netflix stutter will be fixed?


----------



## stangflyer

Quote:


> Originally Posted by *stryker7314*
> 
> Is that the 980Ti hybrid cooler?


Yes they are


----------



## Baasha

Quote:


> Originally Posted by *ManuelG_at_NVIDIA*
> 
> Quad SLI support for 3DMark Fire Strike is not enabled in our current 369.05 driver for the new Titan X (Pascal). Support will be added in our next driver releasing later this month. Sorry for the inconvenience.


well, at least it's coming.

Manuel,

Can you say whether the 4-Way SLI support will be enabled for games (non-DX12) at least in the next driver update?

Also, is there really an "enthusiast key" for 3 and 4-Way SLI? If so, how do we get it? I've had 4x 1080s and now have 4x Titan X (Pascal) GPUs. would really like to run 4-Way SLI as I used to for years.


----------



## Baasha

Quote:


> Originally Posted by *HyperMatrix*
> 
> 165Hz 1440p is more demanding than 60Hz 4K.


what is the formula to calculate data transfer w/ refresh rate and resolution? Is 144Hz @ 1440P as demanding as 4K UHD @ 60Hz?

5K @ 60Hz still trounces everything out there currently - I would think(?).

Stay tuned for some benchies!


----------



## stryker7314

Quote:


> Originally Posted by *stangflyer*
> 
> Yes they are


Sweet thanks!









Good to know! Using one of those on the OG Titan X. I decided to be cheap and stay a generation behind to get these awesome cards at half the price. If I live a year behind the times, it will cost me half as much from now on. I just can't justify the price gouging, even though I probably would have caved for the full chip.


----------



## Fiercy

100+ FPS on a 144Hz G-Sync screen is what every digital-life addict needs. Titan X (Pascal): your bright future, somewhere you want to be.


----------



## HaniWithAnI

Quote:


> Originally Posted by *CallsignVega*
> 
> First installed.
> 
> 
> 
> 
> These EVGA 980Ti Hybrid kits are pretty nice quality. Glad I found them on Amazon for only $67.


Curious: does the EVGA 980 Ti cooler fit at a 90-degree angle only? Or does it also work in the vertical orientation (pipes coming out of the top of the card, near the GTX logo) and this was just your preference? Would it work with the logo removed?

Thinking of trying this, but I don't have a hybrid cooler lying around to test with, so I'd have to order one. Would appreciate it if you could tell me before I waste $70.


----------



## CallsignVega

Quote:


> Originally Posted by *stryker7314*
> 
> Is that the 980Ti hybrid cooler?


Quote:


> Originally Posted by *HaniWithAnI*
> 
> Curious- does the EVGA 980TI cooler fit at a 90degree angle only? or does it also work in the vertical orientation (pipes coming out of top of the card near the GTX logo) and this was just your preference? Would it work with the logo removed?
> 
> Thinking of trying this but I don't have a hybrid cooler lying around to try with so I'd have to order one, would appreciate if you could tell me before I waste 70$


Mine are the 980Ti Hybrid coolers. They work perfectly. You can rotate them 90 degrees either facing aft like mine, or up.


----------



## Captivate

Quote:


> Originally Posted by *Baasha*
> 
> what is the formula to calculate data transfer w/ refresh rate and resolution? Is 144Hz @ 1440P as demanding as 4K UHD @ 60Hz?
> 
> 5K @ 60Hz still trounces everything out there currently - I would think(?).
> 
> Stay tuned for some benchies!


I wonder this as well. Is it really just resolution*framerate?

4K@60Hz = 497,664,000 pixels / second

1440p@144Hz = 530,841,600 pixels / second

3440x1440@100Hz = 495,360,000 pixels / second
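These figures really are just width x height x refresh rate; a quick Python check (the 5K line is added for comparison, since Baasha asked about it):

```python
# Pixel throughput = width * height * refresh rate (pixels pushed per second).
modes = {
    "4K @ 60Hz":         (3840, 2160, 60),
    "1440p @ 144Hz":     (2560, 1440, 144),
    "3440x1440 @ 100Hz": (3440, 1440, 100),
    "5K @ 60Hz":         (5120, 2880, 60),
}

for name, (w, h, hz) in modes.items():
    print(f"{name:>17}: {w * h * hz:,} pixels/second")
```

By this metric 1440p@144 is actually slightly *more* demanding than 4K@60, and 5K@60 (~885M pixels/s) does trounce everything else listed.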


----------



## HyperMatrix

Quote:


> Originally Posted by *Baasha*
> 
> what is the formula to calculate data transfer w/ refresh rate and resolution? Is 144Hz @ 1440P as demanding as 4K UHD @ 60Hz?
> 
> 5K @ 60Hz still trounces everything out there currently - I would think(?).
> 
> Stay tuned for some benchies!


Well, you don't need to calculate bandwidth; that would just be used to determine port spec limits. You can compare the GPU power required by comparing resolution x refresh rate. 4K has 2.25x the pixels of 1440p, which means 4K at 60Hz is roughly equal to 1440p at about 135Hz.


----------



## HaniWithAnI

Quote:


> Originally Posted by *CallsignVega*
> 
> Mine are the 980Ti Hybrid coolers. They work perfectly. You can rotate them 90 degrees either facing aft like mine, or up.


Awesome, ordering one now. Thanks for the heads up!


----------



## st0necold

I'm ready for 2 can't wait!


----------



## outofmyheadyo

I would order some if I wasn't so poor








But nice to see your benches and stuff, sure looks like ur having fun.


----------



## Artah

I'm not able to get to the Titan XP EK backplate order because I'm getting a 404, anyone successfully get it? I sent them a support email already letting them know. I was able to order the two water blocks with no issues.


----------



## outofmyheadyo

They said backplates are coming later this month.


----------



## criminal

Congrats to all the owners. Y'all have fun with those monsters.


----------



## MrTOOSHORT

Another bench:



*http://www.3dmark.com/3dm/13840890*


----------



## Murlocke

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Another bench:
> 
> 
> 
> *http://www.3dmark.com/3dm/13840890*


That's a crazy high score for a single GPU. Is that the Xeon's doing? I get 8983 with +200/+300 on my 6700k @ 4.6GHz.


----------



## bl4ckdot

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Another bench:
> 
> 
> 
> *http://www.3dmark.com/3dm/13840890*


Lmao you just beat kingpin record, well done


----------



## MrTOOSHORT

Yes, IPC plus more threads will help a lot in Time Spy.

Compare gpu scores.


----------



## Murlocke

I tested more: +0 core and +600 mem = instant artifacting in Fire Strike Ultra. +500 results in artifacting about 2-3 times during the test. At +400 I saw an artifact while playing a different game. Seems my memory just ain't that good at OCing.


----------



## stefxyz

500 is what most can do; I would try that first...


----------



## Murlocke

Quote:


> Originally Posted by *stefxyz*
> 
> 500 is what most can do would try that first...


I edited. 300 is stable (I think), 400 is not.

I'm going to try 350 now, but still that's pretty subpar compared to the average.


----------



## CallsignVega

Callsign_Vega --- 6700K / 4.8 GHz --- Titan-XP, 2101 / 5603 --- 166.3 --- 6957 - 1440P.



Ugh, so CPU limited; will need to do a 4K run. GPUs are at half usage in the screenshot LOL.


----------



## unreality

Damn, this card is a beast... I actually swore a week ago I wouldn't buy a cut chip, but from what I've been reading this must be one of the best cards in years, even on air! I ordered one, but the European shop is a total disaster so far :S


----------



## Zurv

Quote:


> Originally Posted by *Baasha*
> 
> well, at least it's coming.
> 
> Manuel,
> 
> Can you say whether the 4-Way SLI support will be enabled for games (non-DX12) at least in the next driver update?
> 
> Also, is there really an "enthusiast key" for 3 and 4-Way SLI? If so, how do we get it? I've had 4x 1080s and now have 4x Titan X (Pascal) GPUs. would really like to run 4-Way SLI as I used to for years.


Nope, we got f'd. This was a marketing decision, not a technical one. They are actively blocking the 10-series at the driver level from doing more than 2-way SLI (there is no reason it couldn't work with other SLI profiles).
Give them some good marketing back: on every benchmark you rock, put a comment.








check out my comment on the #1 spot on 3dmark hall of fame timespy


----------



## renejr902

So far: 100% fan, 2050 boost clock, memory 11010. Works 100% stable. Max temp 84C.


----------



## outofmyheadyo

Can you european guys direct me to a shop that has em in stock ? Just sold my 1080 today, and I might not be able to resist the urge


----------



## Jpmboy

2 cards above 2000...


----------



## dante`afk

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Can you european guys direct me to a shop that has em in stock ? Just sold my 1080 today, and I might not be able to resist the urge


Only nvidia sells them


----------



## dante`afk

Quote:


> Originally Posted by *Jpmboy*
> 
> 2 cards above 2000...


Get em higher? I have 16600 with my ftws at stock


----------



## CallsignVega

Is Time Spy super core-count dependent? Downloading it now. Just curious if 8-10 cores would have a natural advantage over us 4-core guys lol.


----------



## Folken29

Hello guys.

To play on a Swift PG348Q (3440x1440, 100Hz), which is better: two 1080s in SLI or a Titan X?

Greetings.


----------



## dante`afk

Quote:


> Originally Posted by *CallsignVega*
> 
> Is Time Spy super core count dependant? Downloading it now. Just curious if 8-10 cores would have a natural advantage over us 4 core guys lol.


well, higher physics score obv


----------



## ratzofftoya

Quote:


> Originally Posted by *Jpmboy*
> 
> 2 cards above 2000...


Wow, how'd you get it all the way up to 2050?


----------



## ratzofftoya

My two are coming tomorrow.























Do you guys think they will be bottlenecked by a 5960X @ 4.0 Ghz?


----------



## ManuelG_at_NVIDIA

Quote:


> Originally Posted by *dante`afk*
> 
> any info when new titan x's will be available again in store? need to buy my second


Sorry I don't have that info. I would just check back daily.


----------



## ManuelG_at_NVIDIA

Quote:


> Originally Posted by *dmasteR*
> 
> Not exactly related to the Titan XP, but any day on when the Netflix stutter will be fixed?


The Titan X (Pascal) driver v369.05 has the fix. For the rest of the GPUs, the fix will be in the next driver.


----------



## renejr902

Quote:


> Originally Posted by *renejr902*
> 
> Until now, 100% fan, 2050 boost clock. Memory 11010. Work 100% stable. Max temp 84


When I said 2050 boost clock, I meant 2050 as reported in the Unigine Heaven benchmark stats.


----------



## ManuelG_at_NVIDIA

Quote:


> Originally Posted by *Baasha*
> 
> well, at least it's coming.
> 
> Manuel,
> 
> Can you say whether the 4-Way SLI support will be enabled for games (non-DX12) at least in the next driver update?
> 
> Also, is there really an "enthusiast key" for 3 and 4-Way SLI? If so, how do we get it? I've had 4x 1080s and now have 4x Titan X (Pascal) GPUs. would really like to run 4-Way SLI as I used to for years.


SLI support for the GeForce GTX Titan X (Pascal) is the same as the GeForce GTX 1080. I am sorry but there isn't any software way to unlock three-way/four-way SLI on those GPUs for all SLI games.


----------



## Barefooter

Quote:


> Originally Posted by *ManuelG_at_NVIDIA*
> 
> SLI support for the GeForce GTX Titan X (Pascal) is the same as the GeForce GTX 1080. I am sorry but there isn't any software way to unlock three-way/four-way SLI on those GPUs for all SLI games.


That's very disappointing!


----------



## Jpmboy

Quote:


> Originally Posted by *dante`afk*
> 
> Get em higher? I have 16600 with my ftws at stock


show a link. otherwise it's BS.
then post a sub here
Quote:


> Originally Posted by *CallsignVega*
> 
> Is Time Spy super core count dependant? Downloading it now. Just curious if 8-10 cores would have a natural advantage over us 4 core guys lol.


Timespy is the only FM bench (at the moment) that actually uses all available threads for physx.


----------



## Jpmboy

Quote:


> Originally Posted by *ratzofftoya*
> 
> My two are coming tomorrow.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do you guys think they will be bottlenecked by a 5960X @ 4.0 Ghz?


4.0? yes, at low resolutions they will be. But not at higher resolutions.


----------



## ratzofftoya

Quote:


> Originally Posted by *Jpmboy*
> 
> 4.0? yes, at low resolutions they will be. But not at higher resolutions.


I have a Predator X34 (3440x1440) and a 4K G-sync monitor. Both should be OK, right? Thanks!


----------



## CallsignVega

Quote:


> Originally Posted by *Jpmboy*
> 
> Timespy is the only FM bench (at the moment) that actually uses all available threads for physx.


Oh ya definitely one of those benchmarks that will heavily favor 8-10 core CPU's. Even though for actual gaming a higher clocked 6700K will come out on top 95% of the time.


----------



## s1rrah

Quote:


> Originally Posted by *ratzofftoya*
> 
> My two are coming tomorrow.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do you guys think they will be bottlenecked by a 5960X @ 4.0 Ghz?


LOL ... no worries man. I'm going to be running x2 with a 2700K @ 5ghz ... and I'm not stressing on the whole bottleneck thing ...


----------



## mbze430

Quote:


> Originally Posted by *ratzofftoya*
> 
> My two are coming tomorrow.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do you guys think they will be bottlenecked by a 5960X @ 4.0 Ghz?


The target is at least 50% higher performance than GeForce GTX 1080 Founders Edition, and our sources are saying they're now bound by the CPU. Even Core i7-6950X isn't enough to feed all the cards and in a lot of scenarios you could see an Intel Core i7-6700K, with its supreme clock (4.0 vs. 3.0 GHz) easily feed the GP100 more efficiently than Broadwell-E based Core i7 Extreme Edition. The running joke inside Nvidia is "don't buy the 6950X - buy 6700K and a Titan"

Cut/Paste from http://vrworld.com/2016/07/05/nvidia-gp100-titan-faster-geforce-1080/


----------



## NoDoz

I have a 5930k OC'd decent. I am going to run a single Titan, do I need to worry about bottleneck? I play at 4k.


----------



## Murlocke

Here's shadow of mordor maxed at 4K (Ultra textures installed). I remember when this game was impossible to max due to the VRAM requirements. This card just destroys it.

Quote:


> Originally Posted by *NoDoz*
> 
> I have a 5930k OC'd decent. I am going to run a single Titan, do I need to worry about bottleneck? I play at 4k.


lol no!

A CPU bottleneck at 4K is extremely hard to hit.


----------



## CallsignVega

Now it's time to answer the question, can decently overclocked Titan-XP SLI handle upcoming 4K @ 120 FPS/Hz displays?

Time to find out...


----------



## NoDoz

Quote:


> Originally Posted by *Murlocke*
> 
> lol no!


Ok thanks. Few of these posts had me starting to worry lol.


----------



## Murlocke

Quote:


> Originally Posted by *NoDoz*
> 
> Ok thanks. Few of these posts had me starting to worry lol.


You'd be fine even with two.


----------



## cookiesowns

Quote:


> Originally Posted by *CallsignVega*
> 
> Now it's time to answer the question, can decently overclocked Titan-XP SLI handle upcoming 4K @ 120 FPS/Hz displays?
> 
> Time to find out...


I don't see why not. Need a crap ton of CPU power though..


----------



## HyperMatrix

Well holy moly... Overclocks to 2050MHz GPU and 11.2Gbps on memory, no sweat. Only thing I noticed: as the heat goes up, even when not by a lot (like when it hits 70C), it starts throttling down to around 2000-2025MHz. I can get it to run max load at 2100, but it will crash after about a minute or two.

I'd say from what we've seen so far, 2GHz/11Gbps should be achievable on air for anyone, with an aggressive enough fan profile. This also makes the probability of hitting 2.2GHz under water look very good. Meaning we can look at between 40-50% OC on these cards with a modified bios, and under water...


----------



## ratzofftoya

Quote:


> Originally Posted by *HyperMatrix*
> 
> Well holy moly....Overclocks to 2050MHZ GPU and and 11.2Gbps on memory no sweat. Only thing I noticed is as the heat goes up, even when not a lot (like when it hits 70c) it starts throttling down to around 2000-2025MHz. Can get it to run max load at 2100, but will crash after about a minute or two.
> 
> I'd say from what we've seen so far, 2GHz/11Gbps should be achievable on air for anyone, with an aggressive enough fan profile. This also makes the probability of hitting 2.2GHz under water look very good. Meaning we can look at between 40-50% OC on these cards with a modified bios, and under water...


Which utility are you using to overclock? Are you just setting the power target to 120% and letting 'er rip?


----------



## HyperMatrix

Quote:


> Originally Posted by *ratzofftoya*
> 
> Which utility are you using to overclock? Are you just setting power target to 120% and letter er' rip?


Yeah. Just MSI afterburner. 120% target, +200 to GPU, +600 to Mem. 100% Fan.


----------



## dante`afk

Quote:


> Originally Posted by *Jpmboy*
> 
> show a link. otherwise it's BS.
> then post a sub here
> Timespy is the only FM bench (at the moment) that actually uses all available threads for physx.


oh timespy


----------



## carlhil2

Quote:


> Originally Posted by *dante`afk*
> 
> oh timespy


Lol.....


----------



## Murlocke

Quote:


> Originally Posted by *HyperMatrix*
> 
> Well holy moly....Overclocks to 2050MHZ GPU and and 11.2Gbps on memory no sweat. Only thing I noticed is as the heat goes up, even when not a lot (like when it hits 70c) it starts throttling down to around 2000-2025MHz. Can get it to run max load at 2100, but will crash after about a minute or two.
> 
> I'd say from what we've seen so far, 2GHz/11Gbps should be achievable on air for anyone, with an aggressive enough fan profile. This also makes the probability of hitting 2.2GHz under water look very good. Meaning we can look at between 40-50% OC on these cards with a modified bios, and under water...


100% fan and I can't achieve 10.7Gbps (+350 crashed on the 3rd Fire Strike Ultra loop). In fact, I think I just saw an artifact at 10.6Gbps when messing around in Metro Last Light. The card is only hitting 72C currently. I don't see how it could be the PSU; the AX860 is a quality PSU.

From the brief time I spent with +500, the difference seems less than 1% FPS over +300.


----------



## renejr902

Quote:


> Originally Posted by *Murlocke*
> 
> 100% fan and I can't achieve 10.7Gbps. Infact, I think I just saw an artifact at 10.6Gbps.
> 
> Cards only hitting 72C currently.


72? How did you do that? Even with an open case and an additional fan at the backplate, I'm still at 83-84C max with Unigine Heaven. 100% fan, +150 clock, +500 mem, no artifacts after more than 1 hour. Thanks for the answer.


----------



## HyperMatrix

Quote:


> Originally Posted by *Murlocke*
> 
> 100% fan and I can't achieve 10.7Gbps. Infact, I think I just saw an artifact at 10.6Gbps.
> 
> Cards only hitting 72C currently.


Hmm....that's really unlucky. A 6% OC on memory shouldn't be that hard to do. Someone should make an owners thread so we can start posting OC results to compare and see what others are getting.


----------



## Murlocke

Quote:


> Originally Posted by *renejr902*
> 
> 72? How you do that, even with open case and a additional fan at the back plate card im still at 83c-84c at max temp with uniengine heaven. 100% fan , +150 clock + 500mem Thanks for answer


I made the room really cold to test if the RAM was overheating, to see if that was causing my OCing issue.

Before I cranked the AC down, I was getting about 83C.


----------



## HyperMatrix

Quote:


> Originally Posted by *renejr902*
> 
> 72? How you do that, even with open case and a additional fan at the back plate card im still at 83c-84c at max temp with uniengine heaven. 100% fan , +150 clock + 500mem Thanks for answer


What about fan profile on the video card itself? Also ambient temperature is important. Vega is running 18c ambient. I'm running 20c.


----------



## CallsignVega

I love beating 4-way setups









Callsign_Vega --- 6700K / 4.8 GHz --- Titan-XP SLI, 2088 / 5594 --- 127.2 --- 5322 - 4K.





Two cards taking down 4-Way FuryX and previous 4-Way Titan-X. Not too shabby.


----------



## renejr902

Quote:


> Originally Posted by *Murlocke*
> 
> I made the room really cold to test if the RAM was overheating, to see if that was causing my OCing issue.
> 
> Before I cranked the AC down, I was getting about 83C.


Lol, ok, I'll turn on my AC too







Thanks for the answer!
It's strange about your mem overclock...


----------



## Murlocke

Seems like you're getting dang close to 100% scaling in most benchmarks.


----------



## renejr902

Please, people, tell us your best overclock results.
Until now, mine is +200 core (2100MHz in Unigine Heaven) and +500 mem.

Thanks for the results. I tested it stable for 30 min, no glitches.


----------



## Baasha

How do we 'read' the memory clock exactly? If we do +500, what is the actual memory speed? MSI Afterburner shows 5508MHz stock. Do we double that, or...?


----------



## renejr902

Quote:


> Originally Posted by *Baasha*
> 
> How do we 'read' the Memory exactly? If we do +500, what is the actual memory speed? MSI shows 5508Mhz stock. Do we double that or...?


Yes, double that in your head, but don't add +1000.


----------



## Jpmboy

8x the GPUZ reported memory frequency.
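Putting the two answers together as arithmetic (a sketch; the ~1251MHz stock GPU-Z reading and the 4x Afterburner readout are typical GDDR5X numbers I'm assuming, not values confirmed in this thread):

```python
# Titan X (Pascal) GDDR5X clock bookkeeping, stock values assumed for illustration.
gpuz_clock_mhz = 1251                        # what GPU-Z reports (real memory clock)
afterburner_mhz = 4 * gpuz_clock_mhz         # what MSI Afterburner typically displays
effective_gbps = 8 * gpuz_clock_mhz / 1000   # the marketed "10 Gbps" data rate

print(f"GPU-Z:       {gpuz_clock_mhz} MHz")
print(f"Afterburner: {afterburner_mhz} MHz")
print(f"Effective:   {effective_gbps:.0f} Gbps (= 2x the Afterburner number)")

# A "+500" offset in Afterburner adds 500 MHz to its 4x readout, so the
# effective rate goes up by 1000 MHz: double the readout, "don't add +1000".
offset = 500
print(f"+{offset} in Afterburner -> {(afterburner_mhz + offset) * 2} MHz effective")
```

That lines up with the readings quoted above: an Afterburner value around 5500 doubling to the ~11000MHz effective figure renejr902 reported at +500.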


----------



## renejr902

I'm surprised +200 is stable; it's a lot. I will try more.


----------



## renejr902

Quote:


> Originally Posted by *Jpmboy*
> 
> 8x the GPUZ reported memory frequency.


exactly


----------



## renejr902

Quote:


> Originally Posted by *renejr902*
> 
> Im surprised +200 is stable, its a lot, i will try more


+220 froze after 20 seconds.
+210 core speed, 100% stable after 10 minutes, no artifacts (max temp 83C, +500 mem).


----------



## Murlocke

Quote:


> Originally Posted by *renejr902*
> 
> Please people tell us your best overclock result.
> Until now , mine is +200 core(2100mhz in unuengine heaven) and +500 mem
> 
> Thanks for result, i tested it stable gor 30min, no glitch


Heaven/Valley are wrong when it comes to reading core clocks.

+200/+500 was stable for 5+ loops of heaven, didn't test further, but crashed almost immediately in Fire Strike Ultra. They aren't good stability tests for modern GPUs.


----------



## renejr902

Quote:


> Originally Posted by *Murlocke*
> 
> Heaven/Valley are wrong when it comes to reading core clocks.
> 
> +200/+500 was stable for 5+ loops of heaven, didn't test further, but crashed almost immediately in Fire Strike Ultra. They aren't good stability tests for modern GPUs.


ok my +210 core and +500mem is in msi afterburner

I will try Fire Strike Ultra later today. And I will try Witcher 3; it can artifact and freeze easily.


----------



## Testier

Quote:


> Originally Posted by *Murlocke*
> 
> Heaven/Valley are wrong when it comes to reading core clocks.
> 
> +200/+500 was stable for 5+ loops of heaven, didn't test further, but crashed almost immediately in Fire Strike Ultra. They aren't good stability tests for modern GPUs.


I find GTA V to be a good testing game because of how punishing it can be. Fallout 4 can also be good.


----------



## gamingarena

Looks like the new Titan X has the same old bug found in the first 1080 drivers: if you use G-Sync with a 165Hz screen, it flickers. I guess the fix they released works only on the 1080; I have to run at 120Hz on the desktop to stop the flickering.


----------



## cg4200

Little off topic, but I came home from work to see my Titan XP. It made my day; gotta love the smell of a new card.
Anyway, I was wondering what everyone's card boosts to with 120% power limit and no overclock. Mine is 1873.
With the room upstairs at 83 degrees, fan at 80%, +200/+400: Fire Strike GPU score 31,433 first run.
Turned the AC on, room temp 75, 80% fan speed, +205/+400: second run GPU score 30,940; third run about the same.
Over +205 core in Fire Strike I crash; 2100 is the highest boost I saw.
Has anyone taken off the backplate? Do you know if there are thermal pads under it? The backplate feels really warm.
Also, has anyone checked the thermal paste? I got home from work late and haven't caught up yet. Thanks.
Can't wait for EK blocks!!


----------



## EniGma1987

Quote:


> Originally Posted by *CallsignVega*
> 
> Any of the EVGA Hybrid kits will work. I prefer leaving on the stock aluminum shroud around the blower fan instead of putting on the EVGA plastic shroud.
> 
> So far 2100 MHz at 48C full load not too bad..


Did you see any throttling on your card with the stock air cooler? I am trying to OC and my temp and power limits are maxed out and the card seems to throttle to 1930-1965MHz in Firestrike Ultra







Temp never gets above 75 degrees though it says which is weird. Seems like a PT limit to me.

And it seems I got a crap card too. Firestrike crashes with some invalid memory reference error when the core hits 2,038MHz


----------



## renejr902

+800 mem, no glitch, I'm surprised. 2 loops in Unigine; I will try more.

Glitch on the third loop.

More tests needed, but +700 seems OK.


----------



## CallsignVega

Quote:


> Originally Posted by *EniGma1987*
> 
> Did you see any throttling on your card with the stock air cooler? I am trying to OC and my temp and power limits are maxed out and the card seems to throttle to 1930-1965MHz in Firestrike Ultra
> 
> 
> 
> 
> 
> 
> 
> Temp never gets above 75 degrees though it says which is weird. Seems like a PT limit to me.
> 
> And it seems I got a crap card too. Firestrike crashes with some invalid memory reference error when the core hits 2,038MHz


Yes, the card will throttle beginning in the 45C range or if the power target hits 120%. My cards are constantly pinging off the 120% power maximum. I needed a new BIOS hours ago.


----------



## Jpmboy

Quote:


> Originally Posted by *renejr902*
> 
> +800 mem, no glitch, I'm surprised. 2 loops in Unigine; I will try more.
> 
> Glitch on the third loop.
> 
> More tests needed, but +700 seems OK.


check the FPS. There is very tolerant error correction in this architecture.


----------



## HyperMatrix

Quote:


> Originally Posted by *Baasha*
> 
> How do we 'read' the Memory exactly? If we do +500, what is the actual memory speed? MSI shows 5508Mhz stock. Do we double that or...?


If you're using MSI Afterburner, you can set the field to x*2 so it automatically doubles it in the app and in the OSD.
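To make the arithmetic concrete, here's a tiny sketch assuming the interpretation above (effective data rate = displayed clock x 2) and the 5508MHz stock readout from the question. The function name is mine, and none of this comes from NVIDIA documentation:

```python
# Hypothetical helper for the "what is the actual memory speed?" question.
# Assumes, per the reply above, that the effective (double data rate) figure
# is simply twice what MSI Afterburner displays.
STOCK_DISPLAYED_MHZ = 5508  # Afterburner's stock readout from the post

def effective_memory_mhz(offset_mhz, stock=STOCK_DISPLAYED_MHZ):
    """Displayed clock plus the slider offset, doubled for the data rate."""
    return (stock + offset_mhz) * 2

print(effective_memory_mhz(0))    # stock effective rate
print(effective_memory_mhz(500))  # with the +500 offset from the question
```

So under that x2 assumption, a +500 offset would read as 6008MHz in the app and about 12000MHz effective.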


----------



## HyperMatrix

Quote:


> Originally Posted by *gamingarena*
> 
> Looks like the new Titan X has the same old bug found in the first 1080 drivers: if you use G-Sync with a 165Hz screen, it flickers. I guess the fix they released works only on the 1080; I have to run at 120Hz on the desktop to stop the flickering.


Hmm....that's odd. I'm using the Acer XB271HU at 165Hz and don't have any flickering at all.


----------



## EniGma1987

Quote:


> Originally Posted by *CallsignVega*
> 
> Yes, the card will throttle beginning in the 45C range or if the power target hits 120%. My cards are constantly pinging off the 120% power maximum. I needed a new BIOS hours ago.


Ah OK, mine must be hitting that PT limit then too. Easy fix. Time to take off the cooler and lower the shunt resistance.
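For context on that last sentence: the card estimates current from the voltage drop across a small shunt resistor, so soldering a second resistor in parallel lowers the effective resistance and makes the card under-report its power draw by that ratio. A rough sketch of the arithmetic; the milliohm values are purely illustrative, not the actual parts on this PCB:

```python
# Parallel-resistor arithmetic behind the shunt mod. The controller computes
# current as V_drop / R_shunt, so a lower effective shunt value means the
# same real current reads as less power. Values here are illustrative only.

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def reported_power_fraction(r_orig_mohm, r_added_mohm):
    """Fraction of the real power the card will report after the mod."""
    return parallel(r_orig_mohm, r_added_mohm) / r_orig_mohm

# Stacking an equal-value resistor on top halves the reading:
print(reported_power_fraction(5.0, 5.0))
```

Halving the reported draw effectively doubles the power ceiling, which is why people reach for this mod when the 120% slider isn't enough.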


----------



## DADDYDC650




----------



## EniGma1987

el.
oh.
el.



My friend and I were joking and I thought I'd run it just for fun. The highest I saw was actually in the 5,400s FPS, but this was the highest I got a picture of. Haha. OK, playtime's over.


----------



## Diverge

Just got my shipping notification







Ordered around 10-11am EST today, and it's coming tomorrow by 10:30am... looks like I'm working a half day tomorrow.


----------



## Fiercy

I'm thinking: why aren't we getting a free game with a $1,200 graphics card... so cheap..


----------



## hotrod717

Awww. You guys are really making this hard to pass over.


----------



## NoDoz

I got my shipping notification as well!


----------



## chronicfx

Quote:


> Originally Posted by *DADDYDC650*


Lol dude is kinda funny


----------



## Diverge

Quote:


> Originally Posted by *Fiercy*
> 
> I am thinking why aren't we getting a game with 1200$ graphics card... so cheap..


Initially, Nvidia didn't give a game away with the Maxwell Titan X. But later they decided to throw owners a bone and gave us The Witcher 3, even for all previous purchases. So it still might happen.


----------



## Fiercy

Quote:


> Originally Posted by *Diverge*
> 
> Initially, Nvidia didn't give a game away with Maxwell Titan X's. But they decided later to throw owner's a bone, and gave us Witcher 3, even for all previous purchases. So it still might happen.


I remember that, but at the time they said it had something to do with testing the ability to do that in the GeForce Experience app. Nvidia, if you are listening, please throw us something.


----------



## fernlander

How are we overclocking this beast? I tried to increase the base clock by 400MHz and it locked up. Does anyone know what to target?

Also, in Heaven it gets 3727 stock, but even at 100% fan it hits 60C.

Edit: I see why. It's at 1911MHz stock.


----------



## renejr902

Quote:


> Originally Posted by *Jpmboy*
> 
> check the FPS. There is very tolerant error correction in this architecture.


Quote:


> Originally Posted by *renejr902*
> 
> +800 mem, no glitch, I'm surprised. 2 loops in Unigine; I will try more.
> 
> Glitch on the third loop.


I got a higher score and FPS with +750 mem than with +600. +215 core / +705 mem is stable in Unigine Heaven. I will try Fire Strike now.


----------



## Mad Pistol

*looks @ GTX 1070 SLI in rig*


----------



## Celcius

Quote:


> Originally Posted by *EniGma1987*
> 
> el.
> oh.
> el.
> 
> 
> 
> My friend and I were joking, and I thought Id run it just for fun. Highest I saw was actually in the 5,400's fps but this was the highest I got a picture of. haha. ok, play times over.


Did you have any coil whine at that frame rate?


----------



## DarkIdeals

Anyone else having temp issues with this card? I've tried both MSI Kombustor (FurMark test) and regular gaming in The Witcher 3, and every time it climbs to a crazy 89C with the power limit/temp limit raised. This flies against the "83-84C max" everyone else is reporting, and it's kinda pissing me off here. No way this card should be that hot, and I haven't even overclocked it yet.


----------



## Gary2015

Quote:


> Originally Posted by *Murlocke*
> 
> Just played through the whole first mission of GTA V, completely maxed still, 3840x2160, FXAA and VSYNC on. Not a single time did it budge from 60FPS, smooth as butter as far as 60Hz goes. This was completely unplayable at these settings on my Titan X Maxwell. I enabled 2x MSAA to test, but I started seeing low 50s while driving around. I'm sure I could turn grass down from Ultra or something and 2x MSAA would be fine. With 2 cards, I'm sure you'll be able to crank MSAA no problems.
> 
> The game looks absolutely mind blowing on a 65" 4K OLED. Words can't describe.


Wow looks like we have found 4K nirvana. I wonder what frame rates would be with the Natural Vision Photorealistic mod.


----------



## Gary2015

Quote:


> Originally Posted by *NoDoz*
> 
> I got my shipping notification as well!


Me too. I'm on free shipping and bought at 9:01am on the day of release.


----------



## EniGma1987

Quote:


> Originally Posted by *Celcius*
> 
> Did you have any coil whine at that frame rate?


Nope, no whine at all.


----------



## Gary2015

Quote:


> Originally Posted by *DarkIdeals*
> 
> Anyone else having temp issues with this card? I've tried both MSI Kombustor (FurMark test) and regular gaming in The Witcher 3, and every time it climbs to a crazy 89C with the power limit/temp limit raised. This flies against the "83-84C max" everyone else is reporting, and it's kinda pissing me off here. No way this card should be that hot, and I haven't even overclocked it yet.


That definitely doesn't sound right! What speed are the fans?


----------



## DarkIdeals

Quote:


> Originally Posted by *Gary2015*
> 
> That definitely doesn't sound right! What speed are the fans?


It cranks the fans up to ~70-80% in manual. I've tried doing it myself too; no dice even at 80%. Any higher is too loud. I wonder if it's bad thermal paste...


----------



## Gary2015

Quote:


> Originally Posted by *Fiercy*
> 
> I am thinking why aren't we getting a game with 1200$ graphics card... so cheap..


I don't think it was meant to be a mainstream product. They give free games with cards for the general public. I would like a free game too, but I guess people who can afford these cards don't usually care.


----------



## Gary2015

Quote:


> Originally Posted by *DarkIdeals*
> 
> It cranks up the fans to ~70-80% in manual. I've tried doing it myself too, no dice even at 80%. Any higher is too loud. I wonder if it's bad thermal paste...


CallsignVega opened his and it had bad TIM. I would RMA it.


----------



## HyperMatrix

Quote:


> Originally Posted by *Gary2015*
> 
> That definitely doesn't sound right! What speed are the fans?


If you have it on auto-fan, you'll see that raising the power target also raises the temp target. You can unlink them and raise just the power target. Or better yet, set a manual fan speed, or better still, a custom fan curve. Because right now it slow-spins as long as the temperature is below 90C, then goes into panic mode trying to dissipate a crap ton of heat when it's already about to melt.
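The custom-curve suggestion boils down to interpolating fan duty between a few (temperature, fan %) breakpoints so the fan ramps early instead of waiting for the panic point. A minimal sketch; the breakpoints are made up for illustration, and in practice you'd draw this in Afterburner's fan curve editor rather than in code:

```python
# Linear interpolation between (temp C, fan %) breakpoints, mimicking what a
# custom fan curve does. Breakpoints are illustrative, not a recommendation.
CURVE = [(30, 25), (50, 45), (65, 65), (75, 85), (85, 100)]

def fan_percent(temp_c):
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # interpolate between the two neighbouring breakpoints
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pegged at max above the last breakpoint

print(fan_percent(70))  # already ramping hard well below 90C
```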


----------



## DarkIdeals

Quote:


> Originally Posted by *HyperMatrix*
> 
> If you have it on auto-fan, you'll see that raising the power target also raises the temp target. You can unlink them, and raise just the power target. Or better yet...set a manual fan speed, or even better....do a custom fan curve. Because right now it slow spins as long as the temperature is below 90c. Then it goes into panic mode trying to dissipate a crap ton of heat when it's already about to melt.


Well, I tried setting it manually to ~75-80% fan speed too, and it still gets into the real high 80s temps. I haven't let it run for long though, as I'm scared of 89C on a $1,200 card, so I'm not sure if it will go higher than that.

Quote:


> Originally Posted by *Gary2015*
> 
> CallsignVega opened his and it had bad TIM. I would RMA it.


Why RMA over bad TIM? If that's the case, I'll just open it up and put some Thermal Grizzly Kryonaut on it. Should help if that's actually the problem.


----------



## CallsignVega

Crysis 3 maxed out I'm getting 100-130 FPS, and Star Wars Battlefront maxed out 130-160 FPS, at 4K in SLI. Looks like the time has come for 4K 120Hz monitors...


----------



## carlhil2

Quote:


> Originally Posted by *CallsignVega*
> 
> Crysis 3 maxed out I'm getting 100-130 FPS, and Star Wars Battlefront maxed out 130-160 FPS, at 4K in SLI. Looks like the time has come for 4K 120Hz monitors...


It's what I'm waiting for... 'til then, my Sammy 4K will hold me...


----------



## NoDoz

Quote:


> Originally Posted by *DarkIdeals*
> 
> It cranks up the fans to ~70-80% in manual. I've tried doing it myself too, no dice even at 80%. Any higher is too loud. I wonder if it's bad thermal paste...


I would clean off the TIM and apply some more.


----------



## HyperMatrix

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Another bench:
> 
> 
> 
> *http://www.3dmark.com/3dm/13840890*


Quote:


> Originally Posted by *Murlocke*
> 
> That's a crazy high score for a single GPU. Is that the Xeon's doing? I get 8983 with +200/+300 on my 6700k @ 4.6GHz.


This is with +175 / +700, which is pretty much the max stable I can get on air.


----------



## Gary2015

Quote:


> Originally Posted by *NoDoz*
> 
> I got my shipping notification as well!


Me too. I'm on free shipping and bought at 9:01am on the day of release.
Quote:


> Originally Posted by *CallsignVega*
> 
> Crysis 3 maxed out I'm getting 100-130 FPS, and Star Wars Battlefront maxed out 130-160 FPS, at 4K in SLI. Looks like the time has come for 4K 120Hz monitors...


http://hexus.net/tech/news/monitors/93476-asus-4k-gaming-monitor-144hz-refresh-rate-works/


----------



## DarkIdeals

Quote:


> Originally Posted by *Gary2015*
> 
> Me too. I'm on free shipping and bought 9.01am on day of release.
> http://hexus.net/tech/news/monitors/93476-asus-4k-gaming-monitor-144hz-refresh-rate-works/


Just had to remark on how similar our setups are.

Mine is:

Asus Rampage V Edition 10 (same as you)
Intel i7 6800K @ 4.5ghz - Silicon Lottery (same as you)
512GB 950 Pro SSD (similar)
16GB Dominator Platinum @ 3200mhz 15-16-16-35 (different but sorta close)
2x TITAN X Pascal SLI (only using one so far, the other is coming soon)
Caselabs SMA8 Magnum (basically an SM8 with lower chamber)
EVGA Supernova G2 1000w PSU (different)

and i'm about to buy the Acer Predator X34 to replace my ROG Swift PG278Q next week.


----------



## Gary2015

Quote:


> Originally Posted by *DarkIdeals*
> 
> Just had to remark on how similar our setups are.
> 
> Mine is:
> 
> Asus Rampage V Edition 10 (same as you)
> Intel i7 6800K @ 4.5ghz - Silicon Lottery (same as you)
> 500GB 950 Pro SSD (similar)
> 16GB Dominator Platinum @ 3200mhz 15-16-16-35 (different but sorta close)
> 2x TITAN X Pascal SLI (only using one so far)
> Caselabs SMA8 Magnum (basically an SM8 with lower chamber)
> EVGA Supernova G2 1000w PSU (different)
> 
> and i'm about to buy the Acer Predator X34 to replace my ROG Swift PG278Q next week.


We both have good taste! The switch for me was a huge upgrade; once you go 21:9 you can't go back. I also had the ROG Swift PG278Q before that!


----------



## Murlocke

Fast Sync is amazing. I tried it in Witcher 3, GTA, etc. It's infinitely better than VSYNC as long as your FPS stays above 50-55 or so. If it drops below that, you start to get some blurry motion and it starts to feel weird, but nothing unplayable if it's just a quick dip. Being able to play with no tearing, and no input lag, as long as you maintain a decent FPS is extremely nice. It's a great alternative to GSYNC for OLED TV owners.

It makes a huuuge difference in Witcher 3. I always sucked at the combat in that game, and I found out it was due to VSYNC input lag. With fast sync, my rolls are much more precise.

I really want to replay GTA V and Witcher 3 now.....


----------



## ChrisxIxCross

Got my card in this morning. Now just got back from work and about to test this monster out!


----------



## DarkIdeals

Quote:


> Originally Posted by *Gary2015*
> 
> We both have good taste! The swift for me was a huge upgrade. Once you go 21:9 you cant go back and also had ROG Swift PG278Q before that!


I'm just REALLY debating between the X34 34 inch ultra-wide 100hz and the Acer XB321HK 32 inch 4K 60hz. They're both nearly identical in most ways; both have G-Sync, 32 and 34 inch is super close in size, both are IPS panel, both have 10 bit color (with FRC+8 bit etc..), both have 4ms response time etc..etc.. but it's between 60hz 4K and 100hz 3440x1440 21:9.

I'm really thinking the 21:9 will be great, but I kinda wonder if two Titan X Pascals aren't a tiny bit overkill for even 3440x1440 100Hz. I just did some more testing in Far Cry 4 on my ROG Swift, and I can definitely notice a less laggy, less blurry experience just sprinting and moving the camera side to side at ~120Hz vs 60Hz. I also noticed that aiming/shooting is a fair bit more precise at 120Hz than at 60Hz, especially when I turned down some settings on the GTX 1080 I had so I could legitimately push a full 100fps+ at 1440p.

That's the ONE thing that I think is keeping me from getting the XB321HK: the fact that it's only 60Hz and can't be overclocked. I'm really wondering how the X34 looks with Nvidia DSR on. 3440x1440 native with DSR would give me a render resolution of 5120 x 2160 (essentially "4K ultrawide") downsampled to the original 3440x1440, and I'm wondering if that wouldn't actually look relatively close to the sharpness of 4K (especially if I added in a bit of good-quality AA to sharpen things further) while still letting me have the benefit of ultrawide... decisions, decisions lol.
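For reference on the DSR numbers: the factor NVIDIA's slider exposes multiplies the pixel count, so each axis scales by the square root of the factor. A quick sketch (the function name is mine); the standard 2.25x step from 3440x1440 lands at 5160x2160, essentially the "5K2K" figure discussed above:

```python
import math

# DSR renders at factor x the native pixel count, then downsamples.
# Each axis therefore scales by sqrt(factor).
def dsr_resolution(width, height, factor):
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

print(dsr_resolution(3440, 1440, 2.25))  # 1.5x per axis
print(dsr_resolution(3440, 1440, 4.0))   # 2x per axis
```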


----------



## ChrisxIxCross

What overclock utility are you guys using? My current install of EVGA Precision doesn't support this card


----------



## sherlock

Quote:


> Originally Posted by *ChrisxIxCross*
> 
> What overclock utility are you guys using? My current install of EVGA Precision doesn't support this card


They are all using MSI Afterburner, specifically 4.3 beta 4 I believe.


----------



## EniGma1987

Quote:


> Originally Posted by *ChrisxIxCross*
> 
> What overclock utility are you guys using? My current install of EVGA Precision doesn't support this card


Only Afterburner works right now.

I thought I was PT-limited in my overclock before, since benchmarks only reach 75 degrees, but in games I am hitting 89-90C, so it is definitely temperature limited and throttling because of that. It's possible I'm hitting both the temp and PT limits, but I can clearly see I am temp limited in all the games I've tried so far. Good thing I have a waterblock coming; it will actually provide a real benefit. I didn't have any new resistors on hand for the PT limit mod and have to order some, so that will be a week or two as well. I'll probably just do it at the same time as the waterblock install, since the stock cooler has to come off for both projects.


----------



## DarkIdeals

Speaking of Afterburner: anyone actually have voltage control yet on the Titan X P?


----------



## ChrisxIxCross

Quote:


> Originally Posted by *sherlock*
> 
> They are all using MSI Afterburner, specifically 4.3 beta 4 I believe.


Ah ok thanks!


----------



## DarkIdeals

Quote:


> Originally Posted by *EniGma1987*
> 
> Only Afterburner works right now.
> 
> I thought I was PT-limited in my overclock before, since benchmarks only reach 75 degrees, but in games I am hitting 89-90C, so it is definitely temperature limited and throttling because of that. It's possible I'm hitting both the temp and PT limits, but I can clearly see I am temp limited in all the games I've tried so far. Good thing I have a waterblock coming; it will actually provide a real benefit. I didn't have any new resistors on hand for the PT limit mod and have to order some, so that will be a week or two as well. I'll probably just do it at the same time as the waterblock install, since the stock cooler has to come off for both projects.


You're not alone. I'm hitting 89-90C too unless I crank to ~90% fan speed or higher; then it stays at 85-86C or so. I'm gonna try putting new thermal paste on it (Thermal Grizzly Kryonaut) and I'll see how it helps. I'll let you know if it fixes anything so you can try it on yours; might be nice to have low temps while you wait for the waterblock, amirite? (gonna get a block for my two cards too!)

This throttling has me limited to 1,975MHz max currently, but I think I can go higher for sure without the cooling problems.


----------



## ChrisxIxCross

Quote:


> Originally Posted by *DarkIdeals*
> 
> You're not alone. I'm hitting 89-90C too unless I crank to ~90% fan speed or higher; then it stays at 85-86C or so. I'm gonna try putting new thermal paste on it (Thermal Grizzly Kryonaut) and I'll see how it helps. I'll let you know if it fixes anything so you can try it on yours; might be nice to have low temps while you wait for the waterblock, amirite? (gonna get a block for my two cards too!)
> 
> This throttling has me limited to 1,975MHz max currently, but I think I can go higher for sure without the cooling problems.


Yeah this card is just WAY too much for the reference cooler, my block from EK can't come in soon enough!


----------



## Murlocke

Quote:


> Originally Posted by *DarkIdeals*
> 
> You're not alone. I'm hitting 89-90C too unless I crank to ~90% fan speed or higher; then it stays at 85-86C or so. I'm gonna try putting new thermal paste on it (Thermal Grizzly Kryonaut) and I'll see how it helps. I'll let you know if it fixes anything so you can try it on yours; might be nice to have low temps while you wait for the waterblock, amirite? (gonna get a block for my two cards too!)
> 
> This throttling has me limited to 1,975MHz max currently, but I think I can go higher for sure without the cooling problems.


This is why I recommend putting your computer in another room and using extension cables for everything, then just letting the sucker turn into a jet engine.

Looking forward to hearing results, I've been debating redoing TIM on mine all day.


----------



## ChrisxIxCross

Good lord, this thing is at like 50C at idle... My 980 Ti Hybrid, which I had before, was at like 26C idle and never went above 60C at full load.

On another note, is this thread going to become the owners club, or is someone going to make an owners thread soon?


----------



## HyperMatrix

So I was investigating some FurMark fun. Here's what's happening:

- You can hit 120% TDP in FurMark even at only around 50-60% GPU load
- As the card heats up, more power is required for the same performance level
- This results in throttling, in order to stay below TDP
- This cycle continues for quite a while: the card keeps throttling down to stay within TDP, while the gradual heat build-up requires more and more power for the card to operate, which in turn leads to even more throttling

I was running a game and at 70% usage everything was fine, clocked between 2050-2080MHz. The instant I changed a setting that pushed the GPU to 100% usage, it would instantly drop to 1900-1980MHz.

Even on air, we're going to need a modified BIOS to take full advantage of these cards.
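The cycle described above can be caricatured in a few lines. This is a toy model with invented constants (the dynamic-power slope, the leakage slope, and the 300W cap are all illustrative, not measured from the card), but it shows why a hotter die settles at a lower clock under the same power target:

```python
# Toy model of power-target throttling: dynamic power scales with clock,
# leakage grows with die temperature, and the boost algorithm drops one
# ~13MHz bin at a time until the draw fits under the cap. Constants invented.
POWER_CAP_W = 300.0  # stand-in for the raised 120% power target

def power_draw(clock_mhz, temp_c):
    dynamic = 0.12 * clock_mhz               # roughly linear in clock
    leakage = max(1.5 * (temp_c - 40), 0.0)  # rises as the die heats up
    return dynamic + leakage

def throttled_clock(temp_c, start_mhz=2080.0):
    clock = start_mhz
    while power_draw(clock, temp_c) > POWER_CAP_W and clock > 1000:
        clock -= 13.0  # drop one boost bin
    return clock

for temp in (45, 65, 85):
    print(temp, "C ->", throttled_clock(temp), "MHz")
```

In this sketch the cool card holds its full 2080MHz boost, while the 85C case settles around 1937MHz, qualitatively matching the 2050-2080 to 1900-1980MHz drop described above.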


----------



## Murlocke

Quote:


> Originally Posted by *ChrisxIxCross*
> 
> Good lord this thing is like at 50c at idle... My 980 Ti Hybrid which I had before was at like 26c idle and never went above 60c at full load


Mine is idling at 32C right now with 23% fan.

Most likely you have Windows set to High Performance instead of Balanced in the power options. If I do that, Chrome tends to just throw it into 3D mode. Check GPU usage and Power % in Afterburner; they should be around 7% and 0-1%.
Quote:


> Originally Posted by *HyperMatrix*
> 
> So I was investigating some FurMark fun. Here's what's happening:
> 
> - You can hit 120% TDP on FurMark even with around a 50-60% GPU load on FurMark
> - As the card heats up, more power is required for the same performance level
> - This results in throttling, in order to stay below TDP
> - This cycle continues for quite a while, as the card keeps throttling down to stay within TDP, and the gradual heat build up requiring more and more power for the card to operate, which in turn leads to even more throttling.
> 
> Was running a game and at 70% usage everything was fine. Clocked at between 2050-2080MHz. The instant I changed a setting that pushed the GPU to 100% usage, it would instantly drop to 1900-1980MHz.
> 
> Even on air, we're going to need a modified bios to take full advantage of these cards.


Honestly, all that program is good for anymore is breaking cards. It's not a realistic scenario, and it doesn't even bring out unstable overclocks as well as some games do.


----------



## Gary2015

Quote:


> Originally Posted by *DarkIdeals*
> 
> I'm just REALLY debating between the X34 34 inch ultra-wide 100hz and the Acer XB321HK 32 inch 4K 60hz. They're both nearly identical in most ways; both have G-Sync, 32 and 34 inch is super close in size, both are IPS panel, both have 10 bit color (with FRC+8 bit etc..), both have 4ms response time etc..etc.. but it's between 60hz 4K and 100hz 3440x1440 21:9.
> 
> I'm really thinking the 21:9 will be great, but i kinda wonder if two TITAN X pascal isn't a tiny bit overkill for even 3440x1440 100hz. I just did some more testing on Far Cry 4 on my ROG Swift and i can definitely notice a less laggy and less blurry experience when just sprinting and moving camera side to side with ~120hz vs 60hz. I also noticed that aiming/shooting is a fair bit more precise at 120hz than at 60hz; especially when i turn down some settings on the GTX 1080 i had so i could legit push full 100fps+ at 1440p.
> 
> That's the ONE thing that i think is keeping me from getting the XB321HK, the fact that it's only 60hz and can't be overclocked. I'm really wondering how the X34 looks like with Nvidia DSR on. 3440x1440 native with DSR would give me a render resolution of 5120 x 2160 (essentially "4k ultra-wide") downsampled to the original 3440x1440, and i'm wondering if that wouldn't actually look relatively close to the sharpness of 4K (especially if i added in a bit of good quality AA to sharpen things farther) while still letting me have the benefit of ultra-wide....decisions decisions lol.


Depends what games you play. I have the 4K one as well, but my daily driver is the X34. I will see if it's overkill when I get my cards. I had two 1080s in SLI and I wasn't getting 100fps in ESO or GTA V with mods. With BF1 coming out soon, I would prefer a constant 100fps. With the new Asus 4K 144Hz out next year, it won't be overkill.

With DSR at max, my 1080s only did 25fps in ESO.


----------



## Murlocke

I would choose 4K over 3440x1440, and I've owned both. 3440x1440 doesn't work in some games, and can be a headache. 4K simply works. It also looks significantly better.

I've never felt higher than 60FPS is worth it, in most cases I prefer to just increase AA or use SSAA. My opinion is definitely the minority though, I've gamed on my friend's 1080p 144Hz and the entire time I was like "meh" compared to 4K at home.


----------



## HyperMatrix

Quote:


> Originally Posted by *Murlocke*
> 
> Mine is idling at 32C right now with 23% fan.
> 
> Most likely you have windows set to High Performance instead of Balanced in power saving options. If I do that, chrome tends to just throw it into 3D mode. Check GPU usage and Power % in afterburner. Should be 7% and 0-1%.
> Honestly all that program is good for anymore is breaking cards. It's not a realistic scenario and doesn't even bring out unstable overclocks as well as some games.


I started investigating after World of Warcraft started throttling the card heavily under 4x SSAA.


----------



## fernlander

Quote:


> Originally Posted by *CallsignVega*
> 
> Crysis 3 maxed out I'm getting 100-130 FPS, and Star Wars Battlefront maxed out 130-160 FPS, at 4K in SLI. Looks like the time has come for 4K 120Hz monitors...


I don't know about maxed out. I tried Crysis 3 and some settings can bring it to its knees. For example, 4K plus even medium TXAA will drop it below 60fps.

Also, the game finally looks dated. And one more thing: nothing on earth gets rid of the horrendous aliasing on every stair in that game. Not 4K + high TXAA. Nothing. Those stairs will be aliased forever, it seems.


----------



## Gary2015

Quote:


> Originally Posted by *HyperMatrix*
> 
> I started investigating after World of Warcraft started throttling the card heavily under 4x SSAA.


Would like to see how these cards perform under water.


----------



## Murlocke

Quote:


> Originally Posted by *HyperMatrix*
> 
> I started investigating after World of Warcraft started throttling the card heavily under 4x SSAA.


You know what's sad? The main reason I bought this card is World of Warcraft. The new Legion graphics are super demanding at 4K. I went from ~45FPS to ~100FPS in Hyjal coming from the previous Titan X. It's sad that you need such a powerful computer to chug through that 15-year-old engine, but at least the game looks good with the new graphics.
Quote:


> Originally Posted by *fernlander*
> 
> I don't know about maxed out. I tried Crysis 3 and some settings can bring it to its knees. For example 4K plus even medium TXAA will drop it below 60fps.
> 
> Also the game looks dated finally. And one more thing. Nothing on earth gets rid of the horrendous aliasing on every stair in that game. Not 4K + high TXAA. Nothing. Those stairs will be aliased forever it seems.


Honestly, any form of AA that isn't FXAA or SMAA is likely to kill FPS at 4K. Luckily, in most titles, nothing beyond those is a big improvement.


----------



## Gary2015

Quote:


> Originally Posted by *Murlocke*
> 
> I would choose 4K over 3440x1440, and I've owned both. 3440x1440 doesn't work in some games, and can be a headache. 4K simply works. It also looks significantly better.
> 
> I've never felt higher than 60FPS is worth it, in most cases I prefer to just increase AA or use SSAA. My opinion is definitely the minority though, I've gamed on my friend's 1080p 144Hz and the entire time I was like "meh" compared to 4K at home.


That's true, 21:9 isn't compatible with all games, but most of them work. I just like the curved screen. I can't go back to 16:9.


----------



## DarkIdeals

Quote:


> Originally Posted by *Gary2015*
> 
> Depends what games you play. I have the 4K one as well, but my daily driver is the X34. I will see if it's overkill when I get my cards. I had two 1080s in SLI and I wasn't getting 100fps in ESO or GTA V with mods. With BF1 coming out soon, I would prefer a constant 100fps. With the new Asus 4K 144Hz out next year, it won't be overkill.
> 
> With DSR at max, my 1080s only did 25fps in ESO


Hmm... yeah, I was thinking that perhaps I'd be better off with the X34 due to demanding games. Like I thought to myself, "yes, it is GENERALLY overkill, but in The Witcher 3 at max settings with HairWorks, or 4K Fallout 4 with max godrays and 100+ mods, I would probably end up at ~80-85fps or something, so perhaps it'd be worth it to get the X34 anyway."

But then there's the X34P coming out SOMETIME this year (I hate Acer's ambiguously vague estimates... "we think it might possibly sort of be kinda ready around Q4 of next year... I guess?" lmao). The X34P tempts me to not get a monitor yet, as it will have 100Hz out of the box with overclocking still enabled. An Acer rep said a minimum of 120Hz would be possible, and since it uses DisplayPort 1.4, assuming the limitation is bandwidth and not the panel, I'm betting that ~130-140Hz is more likely. Plus it has the joystick controls like the ASUS ROG monitors, has a matte black finish, has the swivel stand, and it uses one of the higher-quality LG "S-IPS" panels instead of the "AH-IPS" panels that have all the backlight bleed issues.

On a side note... is ESO REALLY that demanding? I mean, yeah, it's online, so there are latency and CPU issues that affect ALL online games, but I didn't think the graphics of ESO put it at the level where you would dip to 25fps! And yeah, BF1 is one of the main things making me want to try the X34. I'm never a big FPS player; I enjoy the occasional playthrough of Metro 2033/LL, Far Cry 3/4, the Fallout games, etc., but I'm never a competitive CoD or BF type player. This is why I thought I might be able to get away with 60Hz, especially when it's in exchange for 4K, since I mostly play RPG, RTS, etc. type games like Dark Souls, Dragon Age, Skyrim, The Witcher, Total War... but idk...

Quote:


> Originally Posted by *HyperMatrix*
> 
> So I was investigating some FurMark fun. Here's what's happening:
> 
> - You can hit 120% TDP on FurMark even with around a 50-60% GPU load on FurMark
> - As the card heats up, more power is required for the same performance level
> - This results in throttling, in order to stay below TDP
> - This cycle continues for quite a while, as the card keeps throttling down to stay within TDP, and the gradual heat build up requiring more and more power for the card to operate, which in turn leads to even more throttling.
> 
> Was running a game and at 70% usage everything was fine. Clocked at between 2050-2080MHz. The instant I changed a setting that pushed the GPU to 100% usage, it would instantly drop to 1900-1980MHz.
> 
> Even on air, we're going to need a modified bios to take full advantage of these cards.


Yeah, even in The Witcher 3 I was getting 89C after only 2-3 minutes, even with 80% fan speed; I have to crank it to 90% or higher to keep it at 86C. I'm guessing this is because Witcher 3 at Ultra settings + HairWorks was using 85-99% of the GPU. I have NO idea how people like JayzTwoCents are running at 83-84C max, unless it really IS just a bad factory TIM application on certain cards.


----------



## HyperMatrix

Quote:


> Originally Posted by *Murlocke*
> 
> You know what's sad? The main reason I bought this card is World of Warcraft. Those new legion graphics are super demanding at 4K. I went from ~45FPS to ~100FPS in the Hyjal coming from the previous Titan X. It's sad you need such a powerful computer to chug through that 15 year old engine, but at least the games looks good with the new graphics.


Have you seen the CPU workload distribution in that game? If you go to somewhere like Firelands, you'll see 1 core is maxed out at 100% while the others are at 5-15%. What really pisses me off, though, is that the Mac version now has support for the Metal API. So basically...Mac users get "DX12" performance. And we don't. Since when did gaming on a Mac have an advantage over PC? Lol.


----------



## Fiercy

Quote:


> Originally Posted by *Murlocke*
> 
> I would choose 4K over 3440x1440, and I've owned both. 3440x1440 doesn't work in some games, and can be a headache. 4K simply works. It also looks significantly better.
> 
> I've never felt higher than 60FPS is worth it, in most cases I prefer to just increase AA or use SSAA. My opinion is definitely the minority though, I've gamed on my friend's 1080p 144Hz and the entire time I was like "meh" compared to 4K at home.


1440p 144hz is where it's at. It's not as pixelated as 1080p and it scales a lot better for productivity than 4K.


----------



## HyperMatrix

I should mention...what really upsets me the most right now, is that even with a 2K+ clocked Nvidia Titan X...I still can't play Quantum Break because of how incredibly crappy it is in terms of performance. And it upsets me because it's actually a good game.


----------



## Murlocke

Quote:


> Originally Posted by *Gary2015*
> 
> That's true 21:9 isn't compatible with all games but most of them work. I just like the curved screen. I can't go back to 16:9.


I thought the same until I saw 4K OLED. I immediately stopped caring about 21:9, and didn't miss the increased FOV at all.

Now if it wasn't for OLED, 3440x1440 vs 4K would be a pretty hard choice. I think it would depend on the game which I prefer. Overall, I'd say I prefer 4K simply because too many titles don't have proper AA support and 3440x1440 still needs a good chunk of AA.
Quote:


> Originally Posted by *DarkIdeals*
> 
> Yeah even in The Witcher 3 i was getting 89C after only 2-3 minutes even with 80% fan speed; have to crank to 90% or higher to keep it at 86C. I'm guessing this is because Witcher 3 with Ultra settings + Hairworks etc.. was using 85-99% usage. I have NO idea how people like JayztwoCents are running at 83-84C max, unless it really IS just a bad factory TIM application on certain cards.


I just played Witcher for a couple hours with maxed out settings, my GPU stuck around 79C with fan at 100%. I think you need to reapply the TIM and/or possibly improve case airflow?
Quote:


> Originally Posted by *HyperMatrix*
> 
> Have you seen the CPU workload distribution in that game? If you go to somewhere like Firelands, you'll see 1 core is maxed out at 100% while the others are at 5-15%. What really pisses me off though, is that the Mac version now has support for the Metal API. So basically...Mac users get "DX12" performance. And we don't. Since when did gaming on a Mac have an advantage over PC. Lol.


I'm sure they will add DX12 at some point, much like they did with DX11.


----------



## Testier

Quote:


> Originally Posted by *Murlocke*
> 
> I thought the same until I saw 4K OLED. I immediately stopped caring about 21:9, and didn't miss the increased FOV at all.
> 
> Now if it wasn't for OLED, 3440x1440 vs 4K would be a pretty hard choice. I think it would depend on the game which I prefer. Overall, I'd say I prefer 4K simply because too many titles don't have proper AA support and 3440x1440 still needs a good chunk of AA.


Is there a smaller model of OLED that's not absolutely overwhelming?


----------



## KickAssCop

Wow, you guys are scoring the same as my 980 Tis in SLi. Amazing!
I get 10390 in Timespy.


----------



## Murlocke

Quote:


> Originally Posted by *Testier*
> 
> Is there a lower size model of OLED thats not absolutely overwhelming?


55" is as small as they come for current OLED TVs.
Quote:


> Originally Posted by *KickAssCop*
> 
> Wow, you guys are scoring the same as my 980 Tis in SLi. Amazing!
> I get 10390 in Timespy.


He's rocking a Xeon processor and Timespy loves it. The typical Titan XP gets ~9000 in TimeSpy at +200 core.


----------



## Z0eff

Quote:


> Originally Posted by *DarkIdeals*
> 
> _-snip-_
> 
> Yeah even in The Witcher 3 i was getting 89C after only 2-3 minutes even with 80% fan speed; have to crank to 90% or higher to keep it at 86C. I'm guessing this is because Witcher 3 with Ultra settings + Hairworks etc.. was using 85-99% usage. I have NO idea how people like JayztwoCents are running at 83-84C max, unless it really IS just a bad factory TIM application on certain cards.


Might be ambient temperatures being different? Try putting the AC to full blast if you have one.


----------



## HyperMatrix

Quote:


> Originally Posted by *Murlocke*
> 
> 55" is as small as they come for current OLED TVs.
> He's rocking a Xeon processor and Timespy loves it. The typical Titan XP gets about ~9000 in TimeSpy at +200 core.


Lies, Murlocke. Didn't you see mine?

I'm still waiting for someone like CallsignVega to show up with an 11k or 12k score.


----------



## Testier

Quote:


> Originally Posted by *Murlocke*
> 
> He's rocking a Xeon processor and Timespy loves it. The typical Titan XP gets about ~9000 in TimeSpy at +200 core.


That Xeon is a 5960X on Ivy-E.


----------



## fernlander

Quote:


> Originally Posted by *Murlocke*
> 
> You know what's sad? The main reason I bought this card is World of Warcraft. Those new legion graphics are super demanding at 4K. I went from ~45FPS to ~100FPS in the Hyjal coming from the previous Titan X. It's sad you need such a powerful computer to chug through that 15 year old engine, but at least the games looks good with the new graphics.
> Honestly any form of AA that isn't FXAA or SMAA is likely to kill FPS at 4K. Luckily, in most titles, nothing beyond those is a big improvement.


I'd have to agree. At 4K, post-process AA is usually enough. But those stairs, nothing can help them. Also, that game in particular does actually need TXAA or MSAA to get rid of some jaggies. Thankfully no modern games need it; usually DSR/SSAA will get the job done.


----------



## DarkIdeals

Quote:


> Originally Posted by *Murlocke*
> 
> I would choose 4K over 3440x1440, and I've owned both. 3440x1440 doesn't work in some games, and can be a headache. 4K simply works. It also looks significantly better.
> 
> I've never felt higher than 60FPS is worth it, in most cases I prefer to just increase AA or use SSAA. My opinion is definitely the minority though, I've gamed on my friend's 1080p 144Hz and the entire time I was like "meh" compared to 4K at home.


To be fair, 1080p 144hz is actually quite a bit worse than even 2560x1440p at 144hz, let alone 3440x1440p at 100hz+. Not that i'm discounting your opinion, in fact i'm taking it dead serious considering it's a $1300 monitor purchase.

The 3440x1440p screen you had, was it G-Sync? (i.e. was it the Acer X34 or ASUS PG348Q? Since those are the ONLY two 3440x1440 G-Syncs out currently...although i suppose it could've been FreeSync?) And was the 4K monitor you tried G-Sync?

I'm just curious about the comparison of:

3440x1440p + G-Sync + 100hz + Curved 21:9 ratio VS. 4K + G-Sync + 60hz 16:9 ratio

specifically; not "4K non-G-Sync 60hz vs 3440x1440p non-G-Sync 60hz" or "4K G-Sync 60hz vs 3440x1440p non-G-Sync" etc.. i'm looking for opinions on the difference between two otherwise nearly identical monitors that differ only in curvature, aspect ratio, and refresh rate.

Quote:


> Originally Posted by *HyperMatrix*
> 
> I started investigating after World of Warcraft started throttling the card heavy under 4x SSAA.


To be fair (i seem to be saying this a lot lately lol), SSAA has a downsampling effect to it, so technically you are playing at 4K if you use 4xSSAA on a 1080p monitor (4 samples per pixel, so 2x on each axis). On a 1440p monitor the same setting renders at 5120x2880, basically "5K", and if you actually were crazy enough to use full 4xSSAA on a 4K screen you'd be rendering the game at 7680x4320, which is "EIGHT K"!!
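For what it's worth, those render resolutions can be checked with a couple of lines, assuming the usual convention that "Nx SSAA" means N samples per pixel (so sqrt(N) scaling per axis):

```python
import math

# Render resolution for NxSSAA, where N is samples per pixel
# (so 4x means 2x scaling on each axis).

def ssaa_render_res(width, height, samples):
    scale = math.isqrt(samples)  # 4 samples -> 2x per axis
    return width * scale, height * scale

print(ssaa_render_res(1920, 1080, 4))  # -> (3840, 2160), i.e. 4K
print(ssaa_render_res(2560, 1440, 4))  # -> (5120, 2880), "5K"
print(ssaa_render_res(3840, 2160, 4))  # -> (7680, 4320), "8K"
```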

Quote:


> Originally Posted by *Gary2015*
> 
> That's true 21:9 isn't compatible with all games but most of them work. I just like the curved screen. I can't go back to 16:9.


Yeah the curvature is actually the thing i am most interested in surprisingly lol. I even scoffed when i discovered the fact that the X34 "only" had a curvature of 3800r haha. I saw like the Z35 and such that had 2000r curves, or the LG ones with ~2300r and i was like "why?". Didn't help that even ACER REPRESENTATIVES in Computex vids for the new X34P were specifically saying "the old X34 had 2300r curve, this one updated to 1900r so it's curvier" when in fact the old ones are apparently 3800r, not 2300r.

I think the ONE thing that irritates me about 21:9 support is that Dark Souls games don't work well with it. Dark Souls 1 seems to be well enough supported using the "flawless widescreen" mod, but in Dark Souls 3 (and maybe others, i don't know) there is a glitch/issue/problem (not sure if it's intended behavior by the devs or a glitch) where enemies in the outermost 1-2" of the screen, right at the left or rightmost edges, will literally get choppy as all heck, as if their animations suddenly lose 80% of the frames. They basically "glitch around", appearing to "float/glide" in a choppy fashion. Like the blacksmith: if you turn the camera so he's at the edge of the 21:9 screen view, his hammering will go from a smooth "uuuuupp.....doooown....STRIKE.....raiiiising uuuuup......dooooownn...STRIKE" to "Up.....*no movement*....DOWN.....*no movement*....UP...." etc..

If that could be fixed i think i'd definitely lean to 21:9 ratio. Even if the fix meant simply adding a thin ~1" to 1.5" black bar on either side so i couldn't SEE the glitchy behavior occurring. (People's theory on why this happens in Dark Souls 3 is that the devs coded enemies/NPCs etc.. to only use a TINY bit of their animations when you can't see them, in order to save on resources so your fps doesn't tank from all the off-screen enemies being fully animated when you're completely across the area from them. Makes sense.)


----------



## fernlander

Quote:


> Originally Posted by *Murlocke*
> 
> This is why I recommending putting your comp in another room and using extensions cables for everything, then just turn the sucker into a jet engine.
> 
> Looking forward to hearing results, I've been debating redoing TIM on mine all day.


I did exactly that and also used gefen hdmi over cat5 to get it done. Then came VR so I had to go hybrid.


----------



## Jared Pace

Quote:


> Originally Posted by *DarkIdeals*
> 
> Speaking of afterburner. Anyone actually have voltage control yet on the TITAN X P?


try a softmod by hax0ring with Afterburner
poll your card's I2C like this and check
http://www.overclock.net/t/1363440/official-nvidia-geforce-gtx-titan-owners-club/13400_100#post_20635197

The OG Titan had an NCP4206 voltage controller, which was documented. Zawarudo made a tool to force the voltage through Afterburner
http://www.overclock.net/t/1421221/gtx780-titan-any-ncp4206-card-vdroop-fix-solid-1-325v/0_100

maybe Sheyster or Cyclops of the Maxwell Titan X bioses can help share info. The TXP should still be TDP limited even with a forced voltage increase through the I2C interface. An unencrypted bios, or a signed bios with the key, would be needed for more power, unless you hardmod. Currently a voltage boost could increase TDP draw and give the unwanted result of decreased clock speed







A decrease in volts might get a couple extra MHz on a good ASIC while staying within the power profile... or just crash. Worth testing though, just to see if it's possible. Someone will still have to make a bios or bios editor
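The voltage/TDP tradeoff described above falls out of the classic CMOS dynamic-power relation, P ≈ C·V²·f. A rough sketch with invented reference numbers (not measured Titan X P values) shows why a voltage bump eats clock headroom under a fixed power cap:

```python
# Why a voltage bump can *cost* clocks under a fixed power cap:
# dynamic power scales roughly as P ~ C * V^2 * f (CMOS switching power).
# Reference numbers below are illustrative, not measured card values.

def dynamic_power(p_ref, v_ref, f_ref, v, f):
    """Scale a reference power draw to a new voltage/frequency point."""
    return p_ref * (v / v_ref) ** 2 * (f / f_ref)

P_CAP = 250.0  # watts, the card's assumed power limit

# Sanity check at the reference point: 250 W at 1.062 V, 2000 MHz.
base = dynamic_power(250, 1.062, 2000, 1.062, 2000)

# Bump voltage to 1.10 V at the same clock: now over the cap...
bumped = dynamic_power(250, 1.062, 2000, 1.10, 2000)

# ...so the max clock that still fits under 250 W drops:
f_max = 2000 * P_CAP / bumped
print(round(bumped), round(f_max))  # -> 268 1864
```

Under this toy model a ~3.6% voltage bump costs roughly 7% of sustainable clock, which is exactly the "unwanted result of decreased clock speed" the post warns about.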


----------



## fernlander

Quote:


> Originally Posted by *EniGma1987*
> 
> Only Afterburner works right now.
> 
> I thought I was PT limited in my overclock before since benchmarks only reach 75 degrees, but in games I am hitting 89-90c so it is definitely temperature limited and throttling cause of that. It is possible I am htting both temp and PT limits, but I can clearly see I am temp limited in all games I have tried so far. Good thing I have a waterblock coming, it will actually provide real benefit. Didnt have any new resistors on hand for the PT limit mod, have to order some so that will be a week or two as well. Probably just do it at the same time as waterblcok install since the stock cooler has to come off for both projects.


Nvinspector works great.


----------



## Murlocke

Quote:


> Originally Posted by *HyperMatrix*
> 
> Lies, Murlocke. Didn't you see mine?
> 
> I'm still waiting for someone like CallsignVega to show up with an 11k or 12k score.


How? I get ~8900 consistently on TimeSpy with +200 core. Is that the difference in that demo between 4 and 6 cores?
Quote:


> Originally Posted by *DarkIdeals*
> 
> The 3440x1440p screen you had, was it G-Sync? (i.e. was it the Acer X34 or ASUS PG348Q? SInce those are the ONLY two 3440x1440 G-Sync's out currently...although i suppose it could've been freesync?) And was the 4K monitor you tried G-Sync?


No it was the first 3440x1440 monitor and 60hz, LG 34UM95. My 4K monitor is an OLED TV, no GSYNC on these.

I honestly won't consider a non-OLED at this point because it's too big of a difference, so I am probably not the right person to help you decide which to get. I've always felt 60hz is plenty, so I prefer 4K, but I do not play FPS. 144hz/GSYNC is nice but I can't swallow the picture quality of LCD displays anymore.


----------



## HyperMatrix

Quote:


> Originally Posted by *Murlocke*
> 
> How? I get ~8900 consistently on TimeSpy with +200 core. Is that the difference in that demo between 4 and 6 cores?


8 core. And I'd assume so, especially since DX12 is supposed to do a better job of spreading work between cores. Also, I ran mine at +175 / +700


----------



## CallsignVega

Quote:


> Originally Posted by *Murlocke*
> 
> Fast Sync is amazing. I tried it in Witcher 3, GTA, etc. It's infinitely better than VSYNC as long as your FPS stays above 50-55 or so. If it drops below that, you start to get some blurry motion and it starts to feel weird, but nothing unplayable if it's just a quick dip. Being able to play with no tearing, and no input lag, as long as you maintain a decent FPS is extremely nice. It's a great alternative to GSYNC for OLED TV owners.
> 
> It makes a huuuge difference in Witcher 3. I always sucked at the combat in that game, and I found out it was due to VSYNC input lag. With fast sync, my rolls are much more precise.
> 
> I really want to replay GTA V and Witcher 3 now.....


It's a damn shame FastSync only works single GPU.









As for OLED, all I would have to do is load up the new DOOM on my 4K OLED and it would be impossible not to convert someone to its epicness. My X34 looks like a kid's piece of junk next to it.

Quote:


> Originally Posted by *fernlander*
> 
> I don't know about maxed out. I tried Crysis 3 and some settings can bring it to its knees. For example 4K plus even medium TXAA will drop it below 60fps.
> 
> Also the game looks dated finally. And one more thing. Nothing on earth gets rid of the horrendous aliasing on every stair in that game. Not 4K + high TXAA. Nothing. Those stairs will be aliased forever it seems.


When I say maxed out, I mean every setting is at its highest besides AA. AA is very subjective, and you can toss huge amounts of performance out the window with AA. With 4K, FXAA looks pretty good to me. Once you start talking maxed out with crazy amounts of AA, DSR etc, that's a huge rabbit hole.


----------



## Gary2015

Quote:


> Originally Posted by *DarkIdeals*
> 
> Hmm....yeah i was thinking that perhaps i'd be better off with the X34 due to demanding games. Like i thought to myself "yes it is GENERALLY overkill, but on like The Witcher 3 at Max settings with Hairworks, or 4K Fallout 4 with max Godrays and 100+ mods etc.. i would probably end up at ~80-85fps or something, so perhaps it'd be worth it to get the X34 anyway"
> 
> But then there's the X34P coming out SOMETIME this year (i hate Acer's ambiguous vague estimates...."we think it might possibly sort of be kinda ready around Q4 of next year....i guess?" lmao. The X34P tempts me to not get a monitor yet, as it will have 100hz out of box with overclocking still enabled, an Acer rep said minimum 120hz would be possible, and since it uses Displayport 1.4, assuming the limiation is bandwidth and not the panel, i'm betting that ~130-140hz is more likely. Plus it has the joystick controls like the ASUS ROG monitors, and has a matte back finish, and also has the swivel etc.. stand; and it uses one of the higher quality LG "S-IPS" panels instead of the "AH-IPS" panels that have all the backlight bleed issues.
> 
> On a side note....is ESO REALLY that demanding? I mean yeah it's online so there's latency and CPU issues etc.. that effect ALL online games; but i didn't think the graphics and whatnot of ESO put it at the level that you would dip to 25fps! And yeah BF1 is one of the main things making me want to try the X34. I'm never a big FPS player, i enjoy the occasional playthrough of Metro 2033/LL, Far Cry 3/4, Fallout games etc.. but i'm never a competitive CoD, BF, etc.. type player. This is why i thought i might be able to get away with 60hz especially when it's in exchange for 4K, since i mostly play RPG, RTS, etc.. type games like Dark Souls, Dragon Age, Skyrim, The Witcher, Total War etc..etc.. but idk...
> Yeah even in The Witcher 3 i was getting 89C after only 2-3 minutes even with 80% fan speed; have to crank to 90% or higher to keep it at 86C. I'm guessing this is because Witcher 3 with Ultra settings + Hairworks etc.. was using 85-99% usage. I have NO idea how people like JayztwoCents are running at 83-84C max, unless it really IS just a bad factory TIM application on certain cards.


I have all my settings maxed out on ESO. Maybe there are latency issues, but I've seen frame rates dip to 35fps on dual GTX 1080s at native res. Need my cards to do testing! But ESO at 100fps with no dips at max is a sight to behold.


----------



## Murlocke

Quote:


> Originally Posted by *HyperMatrix*
> 
> 8 core. And I'd assume so, especially since DX12 is supposed to do a better job of spreading work between cores. Also, I ran mine at +175 / +700


Ah, yes that explains it. That's the only "gaming" benchmark I know of that has a _huge_ difference between 4 and 8 cores. DX12 will likely make me regret getting a 6700k.


----------



## xTesla1856

I decided to go with 2 1080s, which is still cheaper than importing 1 Titan X


----------



## DarkIdeals

Quote:


> Originally Posted by *Murlocke*
> 
> I thought the same until I saw 4K OLED. I immediately stopped caring about 21:9, and didn't miss the increased FOV at all.
> 
> Now if it wasn't for OLED, 3440x1440 vs 4K would be a pretty hard choice. I think it would depend on the game which I prefer. Overall, I'd say I prefer 4K simply because too many titles don't have proper AA support and 3440x1440 still needs a good chunk of AA.
> I just played Witcher for a couple hours with maxed out settings, my GPU stuck around 79C with fan at 100%. I think you need to reapply the TIM and/or possibly improve case airflow?
> I'm sure they will add DX12 at some point. much like they did with DX11.


Ahhhh...i forgot you had an OLED. So the 4K monitor you have is that $5,000 Dell OLED then? Or is it one of the LG OLED TVs? Either way it makes sense i suppose that you would prefer a 4K OLED over high refresh 1440p (even ultra-wide 21:9 1440p), because OLED has an INSANE response time of only 0.1ms!!! That crazy quick response time means that even with the input lag of an HDTV (which is higher than most monitors typically), your overall "lag" (which places like TFT Central define by adding true response time to input lag for a general "responsiveness" number) would easily be about as fast as a ~4-5ms IPS 144hz monitor, if not faster.

Like for example if your OLED is a TV with ~12ms of input lag then your total lag would only be 12.1ms, compared to like the X34 which even at 100hz iirc has a response time of ~7ms and an input lag of ~4.5-5ms, totalling ~11.5-12ms; meaning your 4K OLED would be basically identical in overall lag, but would have the huge advantage of the crispness of 4K and the beautiful colors of OLED, likely making it superior.
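That "responsiveness" arithmetic (TFT Central-style: panel response time + input lag) is just a sum; here it is with the post's own estimated numbers, which are rough guesses rather than measurements:

```python
# Total "responsiveness" = panel response time + input lag.
# Numbers are the post's estimates, not measured values.

def total_lag(response_ms, input_lag_ms):
    return response_ms + input_lag_ms

oled = total_lag(0.1, 12.0)   # 0.1ms OLED pixel response + ~12ms TV input lag
x34  = total_lag(7.0, 4.8)    # ~7ms G2G + ~4.5-5ms input lag on the X34
print(oled, x34)  # ~12.1 vs ~11.8, essentially a wash
```

(As Murlocke notes further down, the real E6 measures ~32ms of input lag in PC mode, so the TV-side guess here is optimistic.)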

Unfortunately the only 4K OLED monitor is $5,000 and even i can't afford that, especially after spending money on buying all kinds of new stuff (Rampage V Edition 10, i7 6800K, two of these TITANs etc.. i barely have enough to get the X34 monitor and MAYBE two waterblocks for the cards; might have to make do with just one waterblock for now till i get more $$$)

Quote:


> Originally Posted by *Z0eff*
> 
> Might be ambient temperatures being different? Try putting the AC to full blast if you have one.


I've been doing just that since ~30 minutes ago; it was fairly hot today even in Michigan (~84 degrees Fahrenheit even at 11:30pm). I'll see if it makes any difference

Quote:


> Originally Posted by *Murlocke*
> 
> How? I get ~8900 consistently on TimeSpy with +200 core. Is that the difference in that demo between 4 and 6 cores?


Timespy LOVES extra threads; moving from my 8-core i7 5960X @ 4.5ghz to a 6-core i7 6800K @ 4.5ghz made my 6800K + GTX 1080 score fall almost as low as my old 5960X + Maxwell TITAN X score was, before i sold that card to buy the 1080.


----------



## Murlocke

Quote:


> Originally Posted by *CallsignVega*
> 
> It's a damn shame FastSync only works single GPU.
> 
> As for OLED, all I would have to do is load up the new DOOM on my 4K OLED and it would be impossible not to convert someone to its epicness. My X34 looks like a kid's piece of junk next to it.
> When I say maxed out, I mean every setting is at it's highest besides AA. AA is very subjective, and you can toss huge amounts of performance out the window with AA. With 4K, FXAA looks pretty good to me. Once you start talking maxed out with crazy amounts of AA, DSR etc, that's a huge rabbit hole.


REALLY?! That sucks, SLI would be incredible for fast sync + OLED 60hz.

I just played a bunch of Metro: Last Light and Doom on my OLED today. If someone has never seen one, it'd only take them about 1 second to lose all faith in LCD displays in those dark games. The PQ of mine still impresses me every day, and I've been using 4K OLED since the EG9600 (about a year now?). We actually talked in the AVS EG9600 thread a bunch when you first bought yours, but I ended up returning it and getting the E6 for reduced input lag.
Quote:


> Originally Posted by *DarkIdeals*
> 
> Ahhhh...i forgot you had an OLED. So the 4K monitor you have is that $5,000 Dell OLED then? Or is it one of the LG OLED TV's? Either way it makes sense i suppose, that you would prefer a 4K OLED over high refresh 1440p (even ultra-wide 21:9 1440p) because OLED has an INSANE response time of only 0.1ms!!! That crazy quick response time means that even with the input lag of an HDTV (which is higher than most monitors typically) your overall "lag" that places like TFT central define by adding true response time to input lag for a general "responsiveness" number; would be easily around as fast as a ~4-5ms IPS 144hz monitor if not faster.


It's the $6000 LG E6. I think it dropped to $5000 very recently. The 55" is much cheaper than that, and there are cheaper models now like the B6 and C6.

It's not as good as you make it sound though; the E6 has ~32ms input lag in PC mode, which is about the same as a high quality non-gaming 60hz IPS LCD. Its pixels refresh at 0.1ms, which makes 60hz motion a bit better than a typical 60hz LCD, but the input lag is much higher than a 144hz LCD's. I don't play FPS games usually, and even when I do, 32ms input lag doesn't bother me. I've been gaming on ~30ms input lag my entire life because I always choose image quality over speed.

I am also turning 30 next month, and my reaction times aren't what they used to be. I have a hard time telling the difference between 1ms input lag and 30ms input lag anymore. Much above 30, though, I start to get annoyed.


----------



## Testier

Quote:


> Originally Posted by *Murlocke*
> 
> REALLY?! That sucks, SLI would be incredible for fast sync + OLED 60hz.
> 
> I just played a bunch of Metro: Last Light and Doom on my OLED today. If someone has never seen one, it'd only take them about 1 second to lose all faith in LCD displays in those dark games. The PQ of mine still impresses me every day and I've been using 4K OLED since the EG9600. We actually talked in the AVS EG9600 thread a bunch, but I ended up returning and getting the E6 for reduced input lag.


Honestly a 55-inch OLED isn't that bad in terms of price. I wish I had seen it before; I would have re-set up my entire PC table, but I am up for testing new tech.

As for your 6700k, I think it's completely fine for gaming for now. Worst case, go jump on SK-E and get 3D XPoint at the same time.


----------



## Murlocke

Quote:


> Originally Posted by *Testier*
> 
> Honestly OLED 55inch isnt that bad in terms of price. I wish I had seen it before and I would have resetup my entire PC table but I am up for testing new tech.
> 
> As for your 6700k, I think its completely fine for gaming for now. Worst case, go jump on SK-E and get 3d xpoint at the same time.


I have seen both the 55" E6 and 65" E6 being used as a monitor, as my friend has the 55" version. He bought it after seeing my 65".









For PC use, the 55" is superior to a 65" due to increased PPI and the fact you can sit closer. The 55" models also seem to have a bit better uniformity because it's pretty hard to get perfect uniformity across 65".

I went with the 65" because I use the same display for movies, where I sit quite a bit further back than I do when I am on the PC.


----------



## fernlander

Quote:


> Originally Posted by *Murlocke*
> 
> How? I get ~8900 consistently on TimeSpy with +200 core. Is that the difference in that demo between 4 and 6 cores?
> No it was the first 3440x1440 monitor and 60hz, LG 34UM95. My 4K monitor is an OLED TV, no GSYNC on these.
> 
> I honestly won't consider a non-OLED at this point because it's too big of a difference, so I am probably not the right person to help you decide which to get. I've always felt 60hz is plenty, so I prefer 4K, but I do not play FPS. 144hz/GSYNC is nice but I can't swallow the picture quality of LCD displays anymore.


I could never. Went from plasma to OLED. Can't stand LCD.


----------



## renejr902

Quote:


> Originally Posted by *renejr902*
> 
> I got more score and fps with +750 mem than +600. +215 core +705 stable in uniengine heaven. I will try firestrike now.


I made an error: stock fan = 80-83C with +215 and +750.

100% fan = 62-68C overclocked at the same settings


----------



## renejr902

I properly passed Fire Strike Ultra with core +215, mem +750 and got a score of 6945


----------



## CallsignVega

Quote:


> Originally Posted by *Murlocke*
> 
> Ah, yes that explains it. That's the only "gaming" benchmark I know of that has a _huge_ difference between 4 and 8 cores. DX12 will likely make me regret getting a 6700k.


I don't regret trading in my 4.6GHz 5960X for a 4.8GHz 6700K. The latter will be faster in 98% of games. The Time Spy benchmark is silly; no game even comes close to needing that much CPU processing across 8-10 cores. Games just don't have that much parallelism. It's pretty pointless running that benchmark on less than an 8-10 core CPU. Games are heading in that direction, but nowhere near as fast as that benchmark would imply.
Quote:


> Originally Posted by *Murlocke*
> 
> REALLY?! That sucks, SLI would be incredible for fast sync + OLED 60hz.
> 
> I just played a bunch of Metro Last Night and Doom on my OLED today. If someone has never seen one, It'd only take them about 1 second to lose all faith in LCD displays in those dark games. The PQ of mine still impresses me every day and I've been using 4K OLED since the EG9600 (about a year now?). We actually talked in the AVS EG9600 thread a bunch when you first bought yours, but I ended up returning it and getting the E6 for reduced input lag.


I also returned my 9600 due to the vignette issue. The 55OLED6CP I have now is way better.


----------



## Murlocke

Quote:


> Originally Posted by *CallsignVega*
> 
> I don't regret trading in my 4.6GHz 5960X for a 4.8GHz 6700K.


I got the 6700k thinking 4.7GHz would be a sure thing, 4.8GHz if I got a good chip.

I got 4.6GHz and that needs 1.37v to be stable.








Quote:


> Originally Posted by *CallsignVega*
> 
> I also returned my 9600 due to the vignette issue.


Yeah, that TV was a flop in my opinion; most people got free upgrades and/or refunds from LG. Mine had 12" black bars on both sides at anything below 10% gray. How something like that got past quality control is beyond me.


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> I don't regret trading in my 4.6GHz 5960X for a 4.8GHz 6700K. The latter will be faster in 98% of games. The Time Spy benchmark is silly, no game even comes close to needing that much CPU processing of 8-10 cores. Games just don't have that much parallelism. It's pretty pointless running that benchmark on less than a 8-10 core CPU. Games are heading that direction, but nowhere near as much as that benchmark would imply.
> I also returned my 9600 due to the vignette issue. The 55OLED6CP I have now is way better.


That's not true anymore, Vega. The reason I switched from a 5.2GHz quad core to a 4.7GHz octacore is because I noticed that almost all newer games benefit greatly from the additional threads. Look at GTA V. Or even better, look at Tomb Raider under DX12. It actually hits 90% usage across all 8 cores. I first became suspicious when I noticed my 4.625GHz 5820k work PC outperforming my gaming rig on some games. Basically, older games will be fed sufficiently by a 4 core CPU at 4.7GHz without there being a huge advantage in going with a 6700k with an extra 7% OC. And all other games will benefit more from the additional cores available on the 5960x. Heck if I could guarantee that a 6950x would OC to 4.5GHz I'd have gone that route. But with the 5960x, you get enough cores, with enough clock speed.


----------



## DADDYDC650

Guess the 1080 Ti is coming out. Nvidia engineer Tom Petersen pretty much let it slip as an unreleased product back in May. He said the same thing about the Titan XP, which of course was just released. The comment is around the 2 hour, 17 minute and 44 second mark.


----------



## carlhil2

Quote:


> Originally Posted by *DADDYDC650*
> 
> Guess the 1080 Ti is coming out. Nvidia engineer Tom Petersen pretty much let it slip as an unreleased product back in May. Said the same thing about the Titan XP which of course was just released. Comment around the 2 hour, 17 minutes and 44 seconds mark.


Nah, I pass, I will be str8 with the TXP...







I am staying up for a while. I hope that I can score one soon. ..


----------



## DADDYDC650

Quote:


> Originally Posted by *carlhil2*
> 
> Nah, I pass, I will be str8 with the TXP...


Same here. I'd rather be a Titan baby!


----------



## carlhil2

Quote:


> Originally Posted by *DADDYDC650*
> 
> Same here. I'd rather be a Titan baby!


Really though, I highly doubt that it would be more powerful than the TXP at a lower cost. It would make zero sense for Nvidia to do this...


----------



## DADDYDC650

Quote:


> Originally Posted by *carlhil2*
> 
> Really though, I highly doubt that it would be more powerful than the TXP at a lower cost. it would make zero sense for nVIDIA to do this...


It's not going to be faster than the XP. I'm guessing Vega will be a little faster than the 1080 and Nvidia will counter with the Ti.


----------



## KickAssCop

So when is this available on Amazon or Newegg? Apparently, the NVidia store does not accept my credit card and US forwarding address.


----------



## DarkIdeals

Quote:


> Originally Posted by *KickAssCop*
> 
> So when is this available on Amazon on Newegg. Apparently, NVidia store does not accept my credit card and US forwarding address.


Nobody knows. Nvidia says it's ONLY available through their site; so there's no guarantee that it will EVER be available elsewhere. You could always try getting a different card, or using a debit card instead of the credit perhaps. Worth a try. Or possibly get a friend or family member to buy it with their card and just pay them back for it.


----------



## DADDYDC650

Quote:


> Originally Posted by *KickAssCop*
> 
> So when is this available on Amazon on Newegg. Apparently, NVidia store does not accept my credit card and US forwarding address.


Have you tried a prepaid card?


----------



## xarot

Quote:


> Originally Posted by *techguymaxc*
> 
> Why do you say that? I've had fantastic results with the acetal + nickel variety. 37C load on my GTX 1070 @ 2126MHz in Furmark with ambient temps in the low-mid 20s (poor ventilation in room & no AC). You getting better than that?


While EK has gotten better with their nickel plating, bare copper can only oxidize, whereas the nickel plating CAN start flaking away. So why not bare copper, the easiest choice? Choose nickel only if you need the looks... but not with a black acetal top.







Cooling performance? Negligible. I've had EK's nickel flaking even with their more recent blocks.


----------



## DarkIdeals

Welp....my self control has faded....i was stuck for so long on 4K@60Hz vs 3440x1440(21:9)@100Hz. Checked to see if there was any decent ~30" OLED out there, didn't see any. Noticed that by some bizarre manner of glitch my TITAN X charge had utterly disappeared from my bank statement leaving $1400 still in my account. I said f*** it and bit, getting this monitor. I have a feeling it'll correct itself, but i can always cover the mess with my next payment coming in a week or so.


----------



## DADDYDC650

Quote:


> Originally Posted by *DarkIdeals*
> 
> Welp....my self control has faded....i was stuck for so long on 4K@60Hz vs 3440x1440(21:9)@100Hz. Checked to see if there was any decent ~30" OLED out there, didn't see any. Noticed that by some bizarre manner of glitch my TITAN X charge had utterly disappeared from my bank statement leaving $1400 still in my account. I said f*** it and bit, getting this monitor. I have a feeling it'll correct itself, but i can always cover the mess with my next payment coming in a week or so.


Acer X34P coming around November. Think LG and Viewsonic are coming out with their own soon as well.


----------



## Captivate

Quote:


> Originally Posted by *DADDYDC650*
> 
> Acer x34p coming around November. Think LG and Viewsonic coming out with thier own soon as well.


Very little difference between that one and the original X34. The former runs 100Hz native, but most X34s can overclock to 100Hz just fine (mine included). Also the curve is slightly larger. To be honest, I don't really notice that much curve on mine, but I haven't used any non-curved ultrawides, so it's hard for me to compare.


----------



## DADDYDC650

Quote:


> Originally Posted by *Captivate*
> 
> Very little difference between that one and the original X34. The former runs 100Hz native, but most X34s can overclock to 100Hz just fine (mine included). Also the curve is slightly larger. To be honest, I don't really notice that much curve on mine, but I haven't used any non-curved ultrawides, so it's hard for me to compare.


The larger curve is welcome, and perhaps better quality control, since the newest LG 34" panels (which the X34P is sure to use) suffer from a lot less backlight bleed. Native 100Hz is also guaranteed; what if the monitor starts to act weird at 100Hz? Acer warranty has you covered. I also prefer the matte finish on the back of the X34P over the glossy finish on the X34. Of course, all of this isn't a huge deal, but it's very welcome.


----------



## EDORAM

Hi,

is there someone that can run a Firestrike Performance and a Heaven 4.0 in FHD?

Are 30k GS and 3500 in Heaven possible?

Thanks.


----------



## DarkIdeals

Quote:


> Originally Posted by *Captivate*
> 
> Very little difference between that one and the original X34. The former runs 100Hz native, but most X34s can overclock to 100Hz just fine (mine included). Also the curve is slightly larger. To be honest, I don't really notice that much curve on mine, but I haven't used any non-curved ultrawides, so it's hard for me to compare.


Actually it's more different than you think.

According to Acer the X34P will have:

1) 100hz native, with overclockability to 120hz or higher

2) 1900r curve compared to 3800r curve on X34 (~2x more curve)

3) Matte black finish to the back, so no more shiny cheap plastic on the back

4) Joystick style OSD buttons similar to those on the ASUS ROG Swift monitors

5) Full swivel etc.. stand control (Tilt, Swivel, Height adjust etc..)

6) And most importantly imo, it will use a newer "S-IPS" panel from LG, rather than the older "AH-IPS" that had so many backlight bleed issues. (The S-IPS in the X34P is supposedly the same panel type used in the new 34UC98-W, which is $1200 even without the inclusion of G-Sync and 100Hz etc.: http://www.lg.com/us/monitors/lg-34UC98-W-ultrawide-monitor )

But i just couldn't see myself waiting till the end of the year to get a new monitor, especially after getting new GPUs, which makes you itch for better display tech to take advantage of them lol. (Plus there's no info that i'm aware of that says November. In fact Acer has only said "Q4", with one clarifying remark saying "Late Q4", and some people who were at Computex claiming they were told possibly December. There's also TFTCentral saying that 120Hz+-capable 3440x1440 panels are just now entering mass production in late July/early August, which they say usually means 2-3 months before companies like Acer get hold of them in volume, and ANOTHER 2-3 months before they can manufacture enough of them to have a feasible amount of stock for an anticipated release.)


----------



## DarkIdeals

Quote:


> Originally Posted by *EDORAM*
> 
> Hi,
> 
> is there someone that can run a firestrike performance and and heaven 4.0 in FHD?
> 
> Are 30k GS and 3500 heaven possibile?
> 
> TKS.


If you look back a few pages there are some people who were getting ~31,000 or more in FS at FHD resolution. Not sure about Heaven, most people are running Firestrike, Timespy, Valley, or in-game benchmarking tools.


----------



## DADDYDC650

Quote:


> Originally Posted by *DarkIdeals*
> 
> Actually it's more different than you think.
> 
> According to Acer the X34P will have:
> 
> 1) 100hz native, with overclockability to 120hz or higher
> 
> 2) 1900r curve compared to 3800r curve on X34 (~2x more curve)
> 
> 3) Matte black finish to the back, so no more shiny cheap plastic on the back
> 
> 4) Joystick style OSD buttons similar to those on the ASUS ROG Swift monitors
> 
> 5) Full swivel etc.. stand control (Tilt, Swivel, Height adjust etc..)
> 
> 6) And most importantly imo, it will use a newer "S-IPS" panel from LG, rather than the older "AH-IPS" that had so much backlight bleed issues. (The S-IPS in the X34P is supposedly the same panel type used in the new 34UM98-W which is $1200 even without the inclusion of G-Sync and 100hz etc.. http://www.lg.com/us/monitors/lg-34UC98-W-ultrawide-monitor )
> 
> But i just couldn't see myself waiting till the end of the year to get a new monitor, especially after getting new GPUs which makes you itch for better display tech to take advantage of it lol. (Plus there's no info that i'm aware of that says November. In fact Acer has only said "Q4" with one clarifying remark saying "Late Q4" and some people who were at Computex claiming they were told possibly December. There's also TFTCentral saying that 120hz+ capable 3440x1440p panels are just now in mass-production in late july early august, which they say usually means 2-3 months before companies like Acer get a hold of them in volume, and ANOTHER 2-3 months before they can manufacture enough of them to have a feasible amount of stock for an anticipated release.


x34 is an awesome monitor that you can use right now. Besides, all these monitors are placeholders for OLED and HDR enabled monitors.


----------



## EDORAM

Quote:


> Originally Posted by *DarkIdeals*
> 
> If you look back a few pages there are some people who were getting ~31,000 or more in FS at FHD resolution. Not sure about Heaven, most people are running Firestrike, Timespy, Valley, or in-game benchmarking tools.


Ok, thanks. I didn't find the Firestrike Performance benches in the thread gallery at right...

I will find them manually.
Thanks


----------



## Steven185

Anybody with a Titan XP having measured the actual performance gains from overclocking (in %)? (i.e. run the card at stock everything, including power target, and then overclock some.)

It's the only way for those of us still waiting for the card to project the actual performance we're going to get (what the reviews say, plus the actual performance gain due to the overclock).
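For anyone doing that projection, it's just one multiply. A minimal sketch in Python; the 60 fps and 15% figures below are made-up placeholders, not real benchmark numbers:

```python
def projected_fps(review_fps: float, oc_gain_pct: float) -> float:
    """Project overclocked performance from a stock review figure."""
    return review_fps * (1 + oc_gain_pct / 100)

# Placeholder inputs: a hypothetical 60 fps review result and a 15% OC gain.
print(round(projected_fps(60.0, 15.0), 1))  # 69.0
```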


----------



## DarkIdeals

So, after seeing this thing my faith in ultra-wide went up a ton. This is HUGE frankly for budget gamers and such. This monitor and others like it will see widespread adoption of 21:9 i bet you. It's 29 inch which is big enough to give an immersive feel, it's only 2560x1080 which means you can run it on a cheap card like an RX 480 without much issue, it's a quality LG AH-IPS panel with 99% sRGB coverage that also overclocks to 75hz due to having Freesync, oh and it has freesync lol. And it's only $275 new cost!

https://www.amazon.com/dp/B01B9IDLAW/ref=psdc_1292115011_t3_B01CX26VNC

If this kinda thing doesn't bring widespread adoption to 21:9 by game developers in the next year or two i'll be eating crow for a month. I mean, hell, i'm not an AMD fan really (thus my presence in this forum lol) but even i would've snagged this up instead of the ASUS VG23AH that i picked up when i got back into PC gaming after years of absence. I had bought one of the "ibuypower" pre-built rigs with an i5 3570K and GTX 660 (gigabyte 1gb model) in an NZXT mid tower with a 500GB WD Blue iirc and a cheapo 500w PSU. But there's not much difference between a GTX 660 running a 23.5" 1920x1080 monitor like the VG23AH and running a 2560x1080 29" monitor like this. And i paid ~$180 for the VG23AH after rebate ($200 normal price), i would've GLADLY got something like this instead, even knowing i had an Nvidia card; as i could've always sent the pre-built back and re-configure a new one with a 7870 or something lol.

Quote:


> Originally Posted by *DADDYDC650*
> 
> x34 is an awesome monitor that you can use right now. Besides, all these monitors are placeholders for OLED and HDR enabled monitors.


Maybe i was mistaken. It might've just been people TYPING that they got over 30k on Firestrike, not posting pics. I know for sure i saw 30k+ scores a couple times recently though. Check this out:

http://www.overclock.net/content/type/61/id/2841708/

That's a stock-speed TITAN X getting 26,660 in Firestrike at FHD. We're seeing on average around a ~19-20% increase in performance with a decent overclock of around ~1,950MHz (so far i've got mine to 1,975MHz with the reference cooler; others have hit as high as 2,050MHz or so already). Take 26,660, add that ~19-20% from the overclock that we see in other benchmarks, and it gives you a score somewhere between ~31,725 and ~31,992, which is exactly what i've been seeing from the few people who posted their graphics scores for FS (~31,000-32,000 or so).
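If you want to redo that projection yourself, it's a one-liner in Python (the 26,660 baseline and the ~19-20% OC gain are the figures quoted above):

```python
baseline = 26660                          # stock TITAN X Firestrike graphics score at FHD
oc_gain_low, oc_gain_high = 0.19, 0.20    # typical gain from a ~1,950MHz overclock

projected_low = round(baseline * (1 + oc_gain_low))    # 31725
projected_high = round(baseline * (1 + oc_gain_high))  # 31992

print(projected_low, projected_high)  # lands right in the ~31,000-32,000 range people post
```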


----------



## Swolern

Quote:


> Originally Posted by *DADDYDC650*
> 
> x34 is an awesome monitor that you can use right now. Besides, all these monitors are placeholders for OLED and HDR enabled monitors.


Agreed. OLED 34 in curved 21:9 would be amazing! One day!!

BTW, has anyone seen any benchmarks at 3440x1440 with the new Titan? If not, could someone with this res run Heaven 4.0 for me?


----------



## bl4ckdot

Just received my Titan XP in France (Chronopost). They didn't even send me the shipping notification mail, so I was kinda surprised


----------



## renejr902

Finally I'm 100% stable at +210 core and +750 mem. But I have problems running Witcher 3 smoothly: even after downgrading some settings, and even lowering my 4K resolution to 2K or less, I still get a lot of stuttering. Sometimes the stutter can last half a second; it's not fun at all. Even my 1070 and 980 Ti didn't have this stuttering problem, and I'd rather play at 40-45 fps with no stuttering. The game stutters with or without vsync, even at a solid 60 fps. I need help.







So far I don't have this problem with any other game. Thanks if you can help me, or try the game in 4K yourself; maybe it's the drivers?
Note: it does the same thing with stock core and mem clocks


----------



## KickAssCop

Try the game without the overclock?


----------



## renejr902

Quote:


> Originally Posted by *KickAssCop*
> 
> Try the game without the overclock?


I just edited my post; it's the same problem without the overclock. I played 1 hour of Dirt Rally at 4K ultra, no stuttering at all.


----------



## Kielon

Quote:


> Originally Posted by *renejr902*
> 
> Finally im 100% stable at +210 core and +750 mem. I have problem to run witcher3 smoothly even if i downgraded some settings even lowered my 4k resolution to 2k or less, i still have a lot of stuttering. Sometime the stuttering can be a half second, its not fun at all, even my 1070 and 980ti didnt have this stuttering problem. The game is stuttering with or without vsync even if still 60fps solid.i need help
> 
> 
> 
> 
> 
> 
> 
> until now i dont have this problem with any other game, thanks if you can help me, or try the game in 4k by yourself, maybe the drivers?
> Note: it do the same problem with stock clock for core and mem


Can u check LatencyMon at idle and running in the background while gaming? This looks like the DPC latency spikes we've experienced with the 1070/80 from the very beginning.


----------



## renejr902

Quote:


> Originally Posted by *Kielon*
> 
> Can u check LatencyMon at idle and running in the background while gaming? This looks like DPC latency spikes we experience with 1070/80 from the very beginning.


I installed it. It's running, but I don't know this app. What do you want me to note?


----------



## HyperMatrix

Quote:


> Originally Posted by *renejr902*
> 
> Finally im 100% stable at +210 core and +750 mem. I have problem to run witcher3 smoothly even if i downgraded some settings even lowered my 4k resolution to 2k or less, i still have a lot of stuttering. Sometime the stuttering can be a half second, its not fun at all, even my 1070 and 980ti didnt have this stuttering problem, better playing 40-45fps and no stuttering. The game is stuttering with or without vsync even if still 60fps solid.i need help
> 
> 
> 
> 
> 
> 
> 
> until now i dont have this problem with any other game, thanks if you can help me, or try the game in 4k by yourself, maybe the drivers?
> Note: it do the same problem with stock clock for core and mem


Not sure if this is your problem or not, but when the memory clock goes up too high you actually end up getting errors, and the error correction that's done causes huge spikes in fps.


----------



## renejr902

What is that in red???


----------



## Kielon

Here u can find how other peeps try to handle the problem: http://www.overclock.net/t/1605618/ongoing-pascal-latency-problems-hotfix-doesnt-work-for-everyone


----------



## renejr902




----------



## renejr902




----------



## Kielon

Quote:


> Originally Posted by *renejr902*


It seems like there is some other source of latency than the Nvidia driver, most probably a USB device or an aggressive power-saving mode.


----------



## renejr902

Ok, I will remove my USB drives and retry. I tried the latency checker: at idle it's ok, but in Witcher 3 it goes between 1000 and even 2000 µs.

Thanks so much for the help


----------



## Kielon

Look at the Stats/Processes tab at the very top and sort the latencies of the individual drivers/processes from highest to lowest.


----------



## Woundingchaney

Well, I'm personally looking forward to aftermarket cooling options; very pleased with my purchase.


----------



## newls1

Am I correct in thinking this Titan XP "IS" finally the card that will be just as fast, if not faster, than my 2 980 Ti's in SLI? If so... I'll go ahead and buy an EK FCWB for it, wait till Nvidia gets stock again, and buy one. I'm wanting to get away from SLI and all its headaches, so if this card is the answer, then she's mine... Please, someone with knowledge on this question, provide some input; it would be greatly appreciated!


----------



## renejr902

Quote:


> Originally Posted by *Kielon*
> 
> Look at Stats/Processes tab on the very top and sort the latencies of individual drivers/processes from highest to lowest .


Which one is the latency?

Witcher 3 is running in the background; I didn't close it yet.

I tried several things; Witcher is still stuttering a lot. No memory overclock, no USB....


----------



## renejr902

Quote:


> Originally Posted by *Woundingchaney*
> 
> Well Im personally looking forward to aftermarket cooling options, very please with my purchase.


I want the EVGA watercooler. Do you know when it will be released? Can we preorder?


----------



## renejr902

Quote:


> Originally Posted by *HyperMatrix*
> 
> Not sure if this is your problem or not. But when the memory clock goes up too high, you actually end up getting errors. And the error correction that's done, causes huge spikes in fps.


I have severe fps drops in Witcher when it rains, like a 20 to 30 fps drop sometimes, but I tried with the stock memory clock and the same thing happens.


----------



## Kielon

Quote:


> Originally Posted by *renejr902*
> 
> which one is the latency
> 
> witcher3 is running in the background, i didnt close it yet
> 
> i tried several things, witcher still stuttering a lot, no memroy overclock, no usb....


Highest execution (ms)


----------



## Woundingchaney

Quote:


> Originally Posted by *renejr902*
> 
> i want the evga watercooler, do you know when it will be release ? can we preorder?


I personally haven't seen anything outside of a few references to custom blocks being released in the near future. I'm wanting something simpler, along the lines of an aftermarket cooler or a pre-built water cooling solution. It has been a while since I have done custom cooling, and I simply don't have the patience for it these days.


----------



## renejr902

ndis.sys, NDIS driver 6.20, 40ms, the worst thing by a long shot. I don't know this driver and I don't know what to do.

If you want, check my pic and the list; I really don't know much about these things and latency stuff. Thanks.

I can install the hotfix, but I don't want to screw something up.


----------



## renejr902

I think the EVGA one is simple; EK blocks are too complex for me right now.


----------



## Kielon

Quote:


> Originally Posted by *renejr902*
> 
> ndis.sys , ndis driver 6.20 40ms, the worst thing by a long shot. how dont know this river and i dont know what to do.


This must be related to network/wi-fi card or tcp traffic caused by anti-virus/firewall programs.


----------



## renejr902

Antivirus is disabled right now. I can remove my wifi card; I don't use it much.


----------



## Kielon

Ok, disable both and re-check the latency.


----------



## renejr902

Quote:


> Originally Posted by *Kielon*
> 
> oki disable both and re-check the latency,


I removed my wireless card from my PC, disabled the antivirus, and corrected a driver problem I found. Now LatencyMon detects no problem. Thanks so much.
I will still try Witcher 3 again, but I'm not sure the problem I had was causing the stuttering in Witcher.


----------



## renejr902

Quote:


> Originally Posted by *renejr902*
> 
> I removed my wireless card from my pc, disabled antivirus and i corrected a driver problem i found. Now latencymin detect no problem. Thanks so much.
> I will still try witcher3 again, but im not sure the problem i had cause stuttering in witcher


The problem happened again with LatencyMon.








But this time it said the Nvidia Windows kernel mode driver, version 369.05, has the highest execution. I don't think the driver causes a problem... right? Or is it the same problem as the 1070 and 1080? But I don't think the hotfix will work for the Titan X.


----------



## GunnzAkimbo

Also try MultiMon to monitor background activity for errors or problems that are invisible to other monitoring software and that can lag up the system, like file access and registry errors.

http://www.resplendence.com/multimon


----------



## renejr902

Witcher 3 does the same thing, even after I rebooted. It doesn't seem to do it everywhere, but it does in a lot of places.

Is it possible that Witcher 3 is just too graphically demanding to be really playable at 4K, even with several settings downgraded? When I run in a city, if it rains and a lot of people appear, my fps drops from 55-65 to 30 or 40 in one shot, which causes SEVERE stuttering VERY OFTEN. I tried vsync on and off; it's the same thing. But in the open world with no rain and a few enemies, the stuttering doesn't happen often and the fps is a solid 60. And yet my 1070 and 980 Ti were not stuttering at 4K; that is strange.

This is my PC, if you find something fishy:

Intel i5 4690 3.5GHz-3.9GHz
32GB DDR3 1600MHz RAM
1TB WD Black
Asus H97-Plus
Antec 750W power supply

It's so bad; I found Witcher 3 more playable at 4K with my 1070 and 980 Ti. It's so sad: only 35-45 fps, but no stuttering, with the same graphics settings and the same 4K resolution.


----------



## Foxrun

Mine should be here tomorrow! Anyone know if this http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1 will work on the titan? I think both of the dimensions (1080 and titan) are the same


----------



## cg4200

I only played Witcher for 25 mins to show my nephew last night, running 205/600, temp 70 max with fan at 80%.
Someone asked for Firestrike scores, stock vs OC:
120% power, 205/700 OC: 31,982 http://www.3dmark.com/fs/9633278
100% power, no OC: 28,713 http://www.3dmark.com/fs/9633444
120% power, no OC: 29,039
My take is the Titan XP throttles quickly, from 2088 to 2024 to the mid-1900s through Firestrike part one; in the second part of the test the boost stays higher, 2025 or so, although the temp max was 74 degrees. I think TDP and temp bring the clock down quickly. I am going to order an EK block; hopefully keeping boost near 2075 while gaming would be a huge benefit.
Anyone take the backplate off yet to see the contact made? Thanks.
6700K OC'd to 4.8
32GB G.Skill 3000 14-14-34
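If anyone wants to turn those runs into percentages, here's a quick Python sketch using just the three scores posted above:

```python
# Firestrike graphics scores posted above
stock_100pct = 28713  # 100% power target, no OC
stock_120pct = 29039  # 120% power target, no OC
oc_120pct = 31982     # 120% power target, +205/+700 OC


def gain(new, base):
    """Percentage gain of `new` over `base`, rounded to one decimal."""
    return round((new / base - 1) * 100, 1)


print(gain(stock_120pct, stock_100pct))  # raising the power limit alone: 1.1 (%)
print(gain(oc_120pct, stock_120pct))     # the OC on top of that: 10.1 (%)
print(gain(oc_120pct, stock_100pct))     # total over stock: 11.4 (%)
```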


----------



## Murlocke

Quote:


> Originally Posted by *cg4200*
> 
> I only played witcher for 25 mins to show nephew last night running 205/600 temp 70 max with fan 80 %
> Someone asked for firestrike score stock vs oc
> this run 120% power 205/700 31,982 http://www.3dmark.com/fs/9633278
> this is with power100% not 120% no oc 28,713 http://www.3dmark.com/fs/9633444
> last score is 120% no oc 29,039
> My take is titan xp throttles quick from 2088 to 2024 to mid 1900 thru firestrike part one second part of test boost stays higher 2025 or so although temp max was 74 degrees ..I think tdp and ran temp brings down clock quick, I am going to order Ek block hopefully can keep boost near 2075 gaming would be huge benefit.
> anyone take off backplate yet to see contact made ?? thanks
> 6700k oc to 4.8
> 32 gskill 3000 14 14 34


I heard there is no contact/RAM on the back of the card.
Quote:


> Originally Posted by *renejr902*
> 
> Witcher3 do samething, even after i rebooted. But it dont seem to do it everywhere. But at a lot of places.
> 
> Is it possible that witcher3 is just too graphical powered to be really playable at 4k even with several setting downgraded ?. When i run in a city , if it rain and a lot of people appear my fps drop from 55-65 to 30 or 40 in one shot, that cause SEVERE stuttering VERY OFTEN, i tried vsync on and off, its the same thing. But in openworld with no rain and a few ennemies the stuttering dont happen often and fps is solid 60fps.
> 
> Is it possible that witcher3 is just too graphical powered to be really playable at 4k even with several setting downgraded ?. But my 1070 and 980ti were not stuttering at 4k, that is strange.
> 
> This is my pc, if you find something fishy:
> 
> Intel I5 4690 3.5ghz-3.9ghz
> 32gb ram ddr3 1600mhz
> 1tb wd black
> Asus h97-plus
> Antec power supply 750watts
> 
> That so bad , i found witcher3 more playable at 4k with my 1070 and 980ti, its so sad, only 35-45fps but no stuterring, with same graphics setting and same 4k resolution


Not having this problem; Witcher 3 runs beautifully for me. I still disable HairWorks though, because it honestly isn't worth it. If you haven't, that's the first thing I'd disable, as it's known to cause severe performance issues at times for almost no difference.

If you enabled Fast Sync, then you will get severe stuttering below 50 fps or so.


----------



## WorldExclusive

Quote:


> Originally Posted by *Foxrun*
> 
> Mine should be here tomorrow! Anyone know if this http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1 will work on the titan? I think both of the dimensions (1080 and titan) are the same


Yes

https://hardforum.com/threads/nvidia-announces-the-new-titan-x.1905829/page-19#post-1042460620


----------



## cg4200

Thanks, mate... Also forgot to mention: while playing Witcher I had no stutter of any kind; ran in the city and the woods, did a couple battles, no issues.
Off to work, will try tonight.


----------



## DarkIdeals

Quote:


> Originally Posted by *Swolern*
> 
> Agreed. OLED 34 in curved 21:9 would be amazing! One day!!
> 
> BTW has anyone seen any benchmarks at 3440x1440 with the new Titan? If none someone with this res run Heaven 4.0 for me.


My Acer X34 3440x1440p monitor comes via Fedex tomorrow so i'll let you know then as i have the TITAN already, just need the monitor. Will be doing a full review within a couple days too.









Quote:


> Originally Posted by *renejr902*
> 
> Finally im 100% stable at +210 core and +750 mem. I have problem to run witcher3 smoothly even if i downgraded some settings even lowered my 4k resolution to 2k or less, i still have a lot of stuttering. Sometime the stuttering can be a half second, its not fun at all, even my 1070 and 980ti didnt have this stuttering problem, better playing 40-45fps and no stuttering. The game is stuttering with or without vsync even if still 60fps solid.i need help
> 
> 
> 
> 
> 
> 
> 
> until now i dont have this problem with any other game, thanks if you can help me, or try the game in 4k by yourself, maybe the drivers?
> Note: it do the same problem with stock clock for core and mem


I didn't notice "stuttering" per se, but i did notice that it felt awfully laggy. The input lag seemed a bit off, like button presses and mouse movements took longer to take effect than they usually do. I'll check it again soon. It could just be that this card needs driver updates since it's so new.
Quote:


> Originally Posted by *newls1*
> 
> am i correct in thinking this Titan XP "IS" finally the card that will be just as fast if not faster then my 2 980Ti's in SLi... If so, ...... Ill go ahead and buy a EK FCWB for it, and wait till nvidia gets stock again, and buy 1. Im wanting to get away from SLi and all its headaches so if this card is the answer, then shes mine... Please someone with knowledge with this question please provide some input as it would be greatly appreciated!


You are indeed correct sir. This card is ~30-35% faster than a GTX 1080, and a GTX 1080 is about ~30% faster than a 980 Ti. Since those gains compound, that puts this card at roughly ~70% faster than a 980 Ti at stock, and when overclocked it basically ties with SLI 980 Ti's. Of course two highly overclocked 980 Ti's might SLIGHTLY beat it, but not by much, i guarantee. If you are wanting to move away from SLI, then this is the card as of now. Of course they could always make a 1080 Ti and piss off a lot of people that bought this, but i kinda doubt it, since everything about Pascal mimics the 600 series (the 1080 was the first 16nm card just like the 680 was the first 28nm card; both the 680/670 and 1080/1070 had bad stock issues where the cards were out of stock a lot for a couple months after release, etc.), and the 600 series had the original TITAN come out not long after the 680/670/660 did. Then the next year we got the GTX 780 and GTX 780 Ti, with the 780 being ~10-12% slower than the TITAN and the 780 Ti being ~5-6% faster than the TITAN. My guess would be that we will get a GTX 1180 using either a modified GP100 or the same GP102 as this TITAN card, but with roughly ~3,072-3,328 CUDA cores and 8GB of 3072-bit HBM2 VRAM (540GB/s bandwidth), and the 1180 Ti having roughly ~3,840-4,096 CUDA cores and the same 8GB of 3072-bit HBM2 VRAM. We might also get a "TITAN X BLACK" that could possibly end up using 4096-bit HBM (720GB/s) to differentiate from the others. This lines up with the "Volta in 2017" rumor, as i just don't see them having Volta ready by then; but i can EASILY see them having a full ~4,096-core-capable "Pascal 2.0" architecture ready by then (just like how the 700 series was still the same Kepler architecture as the 600 series, but tweaked/enhanced and using the full GK110 die).
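Quick sanity check on chaining those estimates: relative speedups multiply rather than add, so the compounded figure comes out a bit higher than just summing the percentages. A sketch in Python, using the rough ~30-35% and ~30% estimates above (ballpark figures, not measurements):

```python
# Rough relative speedups quoted above (estimates, not benchmarks):
txp_over_1080_low, txp_over_1080_high = 1.30, 1.35  # Titan XP vs GTX 1080
gtx1080_over_980ti = 1.30                           # GTX 1080 vs 980 Ti

# Chained speedups compound multiplicatively, not additively.
low = txp_over_1080_low * gtx1080_over_980ti    # 1.69  -> ~69% faster than a 980 Ti
high = txp_over_1080_high * gtx1080_over_980ti  # 1.755 -> ~75% faster than a 980 Ti

print(round(low, 3), round(high, 3))
```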

Quote:


> Originally Posted by *Foxrun*
> 
> Mine should be here tomorrow! Anyone know if this http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1 will work on the titan? I think both of the dimensions (1080 and titan) are the same


Just to add to the answers you already got: yes, it would work, but the 980 Ti one may actually work better on this card than the 1080/1070 one, since you wouldn't have to fuss with making room for the additional power connector etc. that the 1080/1070 don't have.


----------



## Frosted racquet

Quote:


> Originally Posted by *renejr902*
> 
> the problem happen again with lamencymon
> 
> 
> 
> 
> 
> 
> 
> 
> But this time it said nvidia windows kernel mode driver version 369,05 has highest execution. I dont think the driver cause a problem.. Right? Or its same problem then 1070 and 1080, but i dont think the hotfix will work for titanx


Quote:


> Originally Posted by *renejr902*
> 
> ndis.sys , ndis driver 6.20 40ms, the worst thing by a long shot. how dont know this driver and i dont know what to do.
> 
> if you want , check my pic and the list, i really dont know much about these things and latency thing. thanks
> 
> i can install the hotfix, but i dont to screw something


Disable your LAN adapter temporarily in Device manager and see if the issue is still there.
There's a known problem with Windows 10 and various NICs/LAN drivers.


----------



## Zurv

Quote:


> Originally Posted by *DarkIdeals*
> 
> Anyone else having temp issues with this card? I've tried both MSI Kombustor (Furmark test) and regular gaming in The Witcher 3, and every time it will rise up to a crazy 89C with the power limit/temp limit raised. This flies against the "83-84C max" that everyone else is saying and it's kinda pissing me off here. No way this card should be that hot, and i haven't even overclocked it yet.


With the case closed and no OC'ing, the cards got hella hot really fast. They hit 90 right away, got hotter faster than the 1080... but that is what waterblocks are for.


----------



## newls1

Quote:


> Originally Posted by *newls1*
> 
> Am I correct in thinking this Titan XP "IS" finally the card that will be just as fast, if not faster, than my 2 980 Ti's in SLI? If so, I'll go ahead and buy an EK full-cover waterblock for it, wait till NVIDIA gets stock again, and buy one. I'm wanting to get away from SLI and all its headaches, so if this card is the answer, then she's mine... Please, someone with knowledge on this question, provide some input; it would be greatly appreciated!


I just don't want this to get buried... can someone please add some input on this before I make this purchase? Thanks!


----------



## Woundingchaney

Quote:


> Originally Posted by *newls1*
> 
> I just don't want this to get buried... can someone please add some input on this before I make this purchase? Thanks!


The single card, OC'ed, has better performance than my previous Titan X Maxwell SLI setup, OC'ed. I apologize that I can't be more precise in the statement, but I have only tested a few AAA titles.

The performance varies depending on the title, but I haven't seen anything that actually runs slower.


----------



## Zurv

Quote:


> Originally Posted by *CallsignVega*
> 
> It's a damn shame FastSync only works single gpu.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As for OLED, all I would have to do is load up the new DOOM on my 4K OLED and it would be impossible not to convert someone to it's epic-ness. My X34 looks like a kids piece of junk next to it.
> When I say maxed out, I mean every setting is at it's highest besides AA. AA is very subjective, and you can toss huge amounts of performance out the window with AA. With 4K, FXAA looks pretty good to me. Once you start talking maxed out with crazy amounts of AA, DSR etc, that's a huge rabbit hole.


Nod nod to the 4K OLED. Doom and The Witcher 3 blow me away when I play on the TV. I have the Acer XB321HK 4K, and on that they look totally different (and worse); the G-Sync (even at 60Hz) is pretty nice. I'm happy the Dell OLED isn't out, because I'd have to buy it for the OLED (but it is too small and has old DP inputs).


----------



## DarkIdeals

Quote:


> Originally Posted by *Zurv*
> 
> With the case closed and no OC'ing, the cards got hella hot really fast. They hit 90 right away, got hotter faster than the 1080... but that is what waterblocks are for.


Yeah, but it still doesn't make sense that the early reviews out there, from the likes of Hexus and JayzTwoCents, are all saying ~84C max under load. It makes me think there's a bad batch of TIM or something. Was too tired to change the TIM last night; gonna try it today and see how it goes.
Quote:


> Originally Posted by *newls1*
> 
> I just dont want this to get burried.... can someone please add some input on this before i make this purchase please... thanks


I'm pretty sure I replied to this a page or two back, but basically, yes: it is very close to, if not tied with, two 980 Ti's in SLI in most cases. Hell, in a few games I've tested it's FASTER! This card is almost always ~30-35% faster than the GTX 1080, and the GTX 1080 was ~30% faster than the 980 Ti; so this card at stock is around ~60-65% faster than the 980 Ti.

With the average ~18-20% overclock, this card becomes roughly ~78-85% faster than the 980 Ti. Since SLI scaling is never perfect, this means the single Pascal TITAN X "IS" as fast as two 980 Ti's in SLI, because most games only give ~80% extra performance AT MOST from a second card, sometimes not even that. The only game I've seen get ~85-90% extra from SLI in recent times is The Witcher 3, and even then it's only SLIGHTLY faster than the TITAN X while keeping all the disadvantages of SLI. And I can point out many games with WORSE SLI support, giving maybe ~50-60% from an extra card.

Overall, this card is basically 980 Ti SLI speeds, just like how EVERY other Pascal card is basically equal to two of the equivalent 900-series cards in SLI: the GTX 1080 is almost always roughly as fast as two 980's in SLI, the GTX 1070 is almost always within ~10% of two 970's in SLI, the 1060 is about as fast as two 960's in SLI, etc.
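As a sketch of the comparison in the paragraphs above, using the post's own estimates (the specific ratios are rough forum figures, not benchmarks):

```python
def oc_speedup(stock_ratio, oc_gain):
    """Speedup over the baseline card after overclocking: stock ratio * (1 + OC gain)."""
    return stock_ratio * (1 + oc_gain)

def sli_speedup(scaling):
    """Two-card speedup over one baseline card; 0.80 scaling -> 1.8x."""
    return 1 + scaling

# Post's estimates: Titan X ~1.60x a 980 Ti at stock, ~18-20% OC headroom,
# and SLI adds ~80% of a second card at best.
titan_oc = oc_speedup(1.60, 0.18)   # ~1.89x a single 980 Ti
ti_sli = sli_speedup(0.80)          # 1.80x a single 980 Ti
print(titan_oc >= ti_sli)  # True: the OC'd single card edges out typical 980 Ti SLI
```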


----------



## Trys0meM0re

Quote:


> Originally Posted by *renejr902*
> 
> Witcher 3 does the same thing, even after I rebooted. It doesn't seem to do it everywhere, but it does in a lot of places.
> 
> Is it possible that Witcher 3 is just too graphically demanding to be really playable at 4K, even with several settings turned down? When I run in a city, if it rains and a lot of people appear, my FPS drops from 55-65 to 30-40 in one shot, which causes SEVERE stuttering VERY OFTEN. I tried V-Sync on and off; it's the same thing. But in the open world, with no rain and a few enemies, the stuttering doesn't happen often and the FPS is a solid 60.
> 
> The strange thing is that my 1070 and 980 Ti were not stuttering at 4K.
> 
> This is my PC, if you find something fishy:
> 
> Intel i5 4690 3.5-3.9GHz
> 32GB DDR3 1600MHz RAM
> 1TB WD Black
> Asus H97-Plus
> Antec 750W power supply
> 
> It's so bad; I found Witcher 3 more playable at 4K with my 1070 and 980 Ti. It's sad: only 35-45fps, but no stuttering, with the same graphics settings and the same 4K resolution.


Couldn't it be a CPU bottleneck?
I mean, that i5 is running at stock, causing the game to choke with so much GPU horsepower.
I bet the same will happen running BF4.
All the other users here seem to be running 6700K's and up and don't have this problem; I'd put my money on the CPU as the culprit.

And DiRT is GPU-bound when maxed, not taxing on the CPU at all.

Anyway, I'm selling my 1080 and am gonna buy the Titan. It's such a beast, especially when OC'ed!


----------



## newls1

Quote:


> Originally Posted by *Woundingchaney*
> 
> The single card OCed has better performance than my previous TX Maxwell SLI OCed. I apologize that I cant be more accurate in the statement, but I have only tested a few AAA titles.
> 
> The performance varies depending on the title but I haven't seen anything that actually runs slower.


Quote:


> Originally Posted by *DarkIdeals*
> 
> Yeah, but it still doesn't make sense that the early reviews out there, from the likes of Hexus and JayzTwoCents, are all saying ~84C max under load. It makes me think there's a bad batch of TIM or something. Was too tired to change the TIM last night; gonna try it today and see how it goes.
> I'm pretty sure I replied to this a page or two back, but basically, yes: it is very close to, if not tied with, two 980 Ti's in SLI in most cases. Hell, in a few games I've tested it's FASTER! This card is almost always ~30-35% faster than the GTX 1080, and the GTX 1080 was ~30% faster than the 980 Ti; so this card at stock is around ~60-65% faster than the 980 Ti.
> 
> With the average ~18-20% overclock, this card becomes roughly ~78-85% faster than the 980 Ti. Since SLI scaling is never perfect, this means the single Pascal TITAN X "IS" as fast as two 980 Ti's in SLI, because most games only give ~80% extra performance AT MOST from a second card, sometimes not even that. The only game I've seen get ~85-90% extra from SLI in recent times is The Witcher 3, and even then it's only SLIGHTLY faster than the TITAN X while keeping all the disadvantages of SLI. And I can point out many games with WORSE SLI support, giving maybe ~50-60% from an extra card.
> 
> Overall, this card is basically 980 Ti SLI speeds, just like how EVERY other Pascal card is basically equal to two of the equivalent 900-series cards in SLI: the GTX 1080 is almost always roughly as fast as two 980's in SLI, the GTX 1070 is almost always within ~10% of two 970's in SLI, the 1060 is about as fast as two 960's in SLI, etc.


Thanks for the replies... time to SERIOUSLY THINK about this very expensive purchase


----------



## carlhil2

Shops open...just placed order...


----------



## KillerBee33

Got one for this Saturday


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> Got one for this Saturday


Your tax, ouch..


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> Your tax, ouch..


You in MA







yours was lower?


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> You in MA
> 
> 
> 
> 
> 
> 
> 
> yours was lower?


If you want to call it that, $75...


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> If you want to call it that, $75...


Damn!!! I knew I was getting ripped off living in NY.









FL is my plan for 2018


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> Damn!!! I knew I was getting ripped off living in NY
> 
> 
> 
> 
> 
> 
> 
> 
> FL is my plan for 2018


Believe it or not, I took today off (thanks, Uncle Jerry) to snatch this up. But now I can head in by 10 and still get paid for the whole day.


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> Believe it or not, I took today off, thanks uncle jerry, to snatch this up, but, now, I can head in by 10, still get paid for the whole day..


Heh, I was at work just clicking that BUY NOW button for fun, just to see when it would actually start working.


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> Heh , i was at work just clicking that BUY NOW button for fun , just to see when it starts actually working


I have a PC at work, but it's down right now for maintenance, and I don't dare use my phone to shop... so I am out.


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> I have a pc at work, but, it's down right now for maintenance, and, I don't dare use my phone to shop, so....and, I am out...


Anyone here already got theirs? I'd like to see, at least on paper, what it can do before I take that plastic wrap off.


----------



## PatrickCrowely

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - which titan are you talking about? the OG (TXK) and the TXM are fantastic cards.


Any Titan, but I had the OG Titans... Hopefully these will be worth it.


----------



## EniGma1987

Quote:


> Originally Posted by *KickAssCop*
> 
> So when is this available on Amazon on Newegg. Apparently, NVidia store does not accept my credit card and US forwarding address.


You could try setting up your credit card on a PayPal account and doing it all through that, then paying on NVIDIA's site with PayPal. I tend to pay with PayPal for everything I can, and that's what I used when I bought the Titan a couple of days ago. I don't really like PayPal much, but I feel safer having my CC info stored only on PayPal's servers and routing everything through them than giving out my CC info to be stored at companies all over the place.

Quote:


> Originally Posted by *EDORAM*
> 
> Hi,
> 
> is there someone who can run a Fire Strike Performance and a Heaven 4.0 in FHD?
> 
> Are 30k GS and 3500 in Heaven possible?
> 
> Thanks.


31,313 graphics score for me.
http://www.3dmark.com/fs/9629131

Quote:


> Originally Posted by *Steven185*
> 
> Anybody with a Titan XP having measured the actual performance gains from overlock (in %)? (i.e. have the card at stock everything, including power target and then overclocked some).
> 
> It's the only way for us who are still waiting for the card to project the actual performance that we're going to get (what the reviews say + the actual performance gain due to the overclock)


The only comparison I have of stock to overclocked is in Firestrike Ultra; I didn't do any real game comparisons, as I didn't have a ton of time last night. But from stock to overclocked I gained a bit over 13% performance from a 10% overclock. Go figure that one out. My guess would be that we are at the edge of the benchmark's usefulness: once you get past the tipping point in any benchmark, the points just start scaling up at increasing rates. So anyway, very good boosts to performance from overclocking. I found I tended to get an extra 100 points of graphics score for each 50MHz core increase. The numbers I got are about what everyone else here is averaging as well: +200 on the core.
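The scaling observation above reduces to simple ratios; a minimal sketch, using only the point and clock figures quoted in this post:

```python
def scaling_ratio(clock_gain_pct, score_gain_pct):
    """Benchmark gain per unit of clock gain; 1.0 would mean perfectly linear scaling."""
    return score_gain_pct / clock_gain_pct

points_per_mhz = 100 / 50                 # ~2 graphics points per extra core MHz
superlinear = scaling_ratio(10.0, 13.0)   # 13% score gain from a 10% OC -> 1.3
print(points_per_mhz, superlinear)  # 2.0 1.3
```

A ratio above 1.0 is what the post means by the benchmark "scaling up at increasing rates" past its tipping point.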


----------



## Zurv

Anyone have the NVIDIA HB bridge? I assume the issue with it not working with waterblocks is the pointy ends that would bump into the raised water connectors. Maybe just cut them off? Or is the whole connector just too fat to fit?
The EVGA one (which is never in stock) looks the same as the old v2 LED bridges from the past and should fit.


----------



## Gary2015

Quote:


> Originally Posted by *Zurv*
> 
> anyone of have the NVidia HB bridge? I assume the issue with it not working with water blocks is the pointy ends of it that would bump into the raised water connectors. Maybe just cut them off? or is the whole connector just to fat to fit?
> The evga one (which are never in stock) looks the same as the old v2 LED bridges from the past and should fit.


I have 2 slot and 4 slot nvidia bridge . Also the evga HB bridge . I called EK and they said the nvidia HB bridge will work.


----------



## unreality

Full review online now at gamestar.de (german site):

http://www.gamestar.de/hardware/grafikkarten/nvidia-titan-x/test/nvidia_titan_x_rev_20,1010,3276696,2.html#spiele-benchmarks

2GHZ TITAN X faster than 1080
1080p -> 23%
WQHD -> 43%
4K -> 52%


----------



## Diverge

Damn, my card was supposed to be delivered at 10:30am EST, which is 8 minutes away... but my tracking is showing a "delivery exception" with no reason given (looks like it was late getting to the end-point FedEx for distribution), so now it's just sitting at FedEx a couple of towns away. Gonna bother NVIDIA for a shipping refund if it's not here by 10:30. Making me waste $36 on shipping, and my day, when I should be working. Was planning to go in for a half day.


----------



## Zurv

Quote:


> Originally Posted by *Gary2015*
> 
> I have 2 slot and 4 slot nvidia bridge . Also the evga HB bridge . I called EK and they said the nvidia HB bridge will work.


Really? Their website now says the NVIDIA HB bridge will not work (that wasn't there yesterday).

But unless they moved the terminal connectors back (and they might have), the NVIDIA one won't work.


----------



## renejr902

Quote:


> Originally Posted by *renejr902*
> 
> Witcher 3 does the same thing, even after I rebooted. It doesn't seem to do it everywhere, but it does in a lot of places.
> 
> Is it possible that Witcher 3 is just too graphically demanding to be really playable at 4K, even with several settings turned down? When I run in a city, if it rains and a lot of people appear, my FPS drops from 55-65 to 30-40 in one shot, which causes SEVERE stuttering VERY OFTEN. I tried V-Sync on and off; it's the same thing. But in the open world, with no rain and a few enemies, the stuttering doesn't happen often and the FPS is a solid 60.
> 
> The strange thing is that my 1070 and 980 Ti were not stuttering at 4K.
> 
> This is my PC, if you find something fishy:
> 
> Intel i5 4690 3.5-3.9GHz
> 32GB DDR3 1600MHz RAM
> 1TB WD Black
> Asus H97-Plus
> Antec 750W power supply
> 
> It's so bad; I found Witcher 3 more playable at 4K with my 1070 and 980 Ti. It's sad: only 35-45fps, but no stuttering, with the same graphics settings and the same 4K resolution.


Quote:


> Originally Posted by *Murlocke*
> 
> There is no contact/RAM on the back of the card I heard.
> Not having this problem, witcher 3 runs beautifully for me. I still disable hair works though because it honestly isn't worth it. If you haven't that's the first thing i'd disable as it's known to cause severe performance issues at times for almost no difference.
> 
> If you enabled fast sync, then you will get severe stuttering below 50fps or so.


I tried SEVERAL games, 20-30 minutes each. No stuttering at all in any of them at 4K with maximum graphics settings, even Rise of the Tomb Raider:

Project CARS, Rise of the Tomb Raider, Dragon Age: Inquisition, Risen 3...

No stuttering at all in any of these games; I use adaptive V-Sync.

One thought: I updated my Witcher 3 from the old 1.04 or 1.05 to the latest version a few days ago. Maybe the latest patch caused this. I will uninstall it later today, reinstall version 1.05, and compare. I have a little less stuttering right now; I used the anti-stuttering recommendations for this game from the net, but I'm still not satisfied. I will report back on the stuttering after trying 1.05. Thanks so much, everybody, for your help; it's much appreciated. I read all your comments and recommendations. My 100% final overclock results are: core +210, memory +700, for 547GB/s of bandwidth. Better and faster than the Quadro 12GB HBM2 version at 540GB/s; who said HBM2 is faster than GDDR5X? LOL

Note: I got 6942 in Fire Strike Ultra. I'm a little disappointed, since overclocked cards seem to average 7200+ and mine is overclocked to the max. Maybe I lost a few points because of my DDR3 memory and i5 4690, but I don't think an i5 4690 should bottleneck my Titan X in 4K gaming; at 1440p it probably could.

I pray to get Witcher 3 working with no stuttering later today. Pray for me, guys!
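That bandwidth figure follows from the usual GDDR formula (per-pin data rate times bus width, divided by 8). The assumption here, consistent with the 547GB/s the post reports, is that the +700MHz Afterburner offset counts double on GDDR5X:

```python
def gddr_bandwidth(data_rate_gbps, bus_width_bits=384):
    """Peak memory bandwidth in GB/s: per-pin rate (Gb/s) * bus width (bits) / 8."""
    return data_rate_gbps * bus_width_bits / 8

stock = gddr_bandwidth(10.0)          # Titan X (Pascal) stock 10 Gb/s: 480.0 GB/s
oc = gddr_bandwidth(10.0 + 2 * 0.7)   # +700 offset, doubled -> 11.4 Gb/s per pin
print(stock, round(oc, 1))  # 480.0 547.2
```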


----------



## EniGma1987

Quote:


> Originally Posted by *Gary2015*
> 
> I have 2 slot and 4 slot nvidia bridge . Also the evga HB bridge . I called EK and they said the nvidia HB bridge will work.


That is interesting. EKWB must have changed the block design for the Titan, because I thought the HB bridge didn't work with the GTX 1080 waterblock. People were a bit upset about it, and EK was designing their own compatible HB bridge. Good news if true.


----------



## renejr902

Quote:


> Originally Posted by *EniGma1987*
> 
> That is interesting. EKWB must have changed the block design of the Titan because I thought the HB bridge didn't work with the GTX 1080 waterblock. People were a bit upset about it and EK was designing their own HB bridge that is compatible. Good news if true


If someone learns something about the EVGA watercooling block for the Titan X, please post it here. Thanks.


----------



## CallsignVega

Quote:


> Originally Posted by *HyperMatrix*
> 
> That's not true anymore, Vega. The reason I switched from a 5.2GHz quad core to a 4.7GHz octacore is because I noticed that almost all newer games benefit greatly from the additional threads. Look at GTA V. Or even better, look at Tomb Raider under DX12. It actually hits 90% usage across all 8 cores. I first became suspicious when I noticed my 4.625GHz 5820k work PC outperforming my gaming rig on some games. Basically, older games will be fed sufficiently by a 4 core CPU at 4.7GHz without there being a huge advantage in going with a 6700k with an extra 7% OC. And all other games will benefit more from the additional cores available on the 5960x. Heck if I could guarantee that a 6950x would OC to 4.5GHz I'd have gone that route. But with the 5960x, you get enough cores, with enough clock speed.


Hmm Hyper I'm not so sure about that. Virtually all tests I've ever seen put the 6700K ahead of the lower clocked 8 to 10 core CPU's for gaming. Remember Skylake is still the fastest architecture per clock and once you put 4.8 GHz on top of that, it's hard to beat.

Take a look at this that just came out. 6700K cleaned up across the board, even in some games you mentioned:

http://www.eurogamer.net/articles/digitalfoundry-2016-what-is-the-fastest-gaming-cpu

http://www.hardocp.com/article/2016/06/24/dx11_vs_dx12_intel_6700k_6950x_framerate_scaling/1#.V6NXiTsrK0o

I've always gotten the highest-end/most expensive GPU, but the $1700 6950X has about the worst cost-to-gaming-performance ratio of any CPU ever released, due to its low clocks. Plus, with SLI now capped at two cards and the HB SLI bridge out, you don't need those extra PCI-E lanes/slots either.


----------



## lexlutha111384

im not very impressed by the Titan XP numbers







1080ti ftw


----------



## KillerBee33

Quote:


> Originally Posted by *CallsignVega*
> 
> Hmm Hyper I'm not so sure about that. Virtually all tests I've ever seen put the 6700K ahead of the lower clocked 8 to 10 core CPU's for gaming. Remember Skylake is still the fastest architecture per clock and once you put 4.8 GHz on top of that, it's hard to beat.
> 
> Take a look at this that just came out. 6700K cleaned up across the board, even in some games you mentioned:
> 
> http://www.eurogamer.net/articles/digitalfoundry-2016-what-is-the-fastest-gaming-cpu


Any chance you have a Fire Strike run with the 6700K @ 4.8? Mine is at 4.6, and I think my score is much lower with a 1080 in it.


----------



## Murlocke

Quote:


> Originally Posted by *renejr902*
> 
> I tried SEVERAL games, 20-30 minutes each. No stuttering at all in any of them at 4K with maximum graphics settings, even Rise of the Tomb Raider:
> 
> Project CARS, Rise of the Tomb Raider, Dragon Age: Inquisition, Risen 3...
> 
> No stuttering at all in any of these games; I use adaptive V-Sync.
> 
> One thought: I updated my Witcher 3 from the old 1.04 or 1.05 to the latest version a few days ago. Maybe the latest patch caused this. I will uninstall it later today, reinstall version 1.05, and compare. I have a little less stuttering right now; I used the anti-stuttering recommendations for this game from the net, but I'm still not satisfied. I will report back on the stuttering after trying 1.05. Thanks so much, everybody, for your help; it's much appreciated. I read all your comments and recommendations. My 100% final overclock results are: core +210, memory +700, for 547GB/s of bandwidth. Better and faster than the Quadro 12GB HBM2 version at 540GB/s; who said HBM2 is faster than GDDR5X? LOL
> 
> Note: I got 6942 in Fire Strike Ultra. I'm a little disappointed, since overclocked cards seem to average 7200+ and mine is overclocked to the max. Maybe I lost a few points because of my DDR3 memory and i5 4690, but I don't think an i5 4690 should bottleneck my Titan X in 4K gaming; at 1440p it probably could.
> 
> I pray to get Witcher 3 working with no stuttering later today. Pray for me, guys!


1.05 is super old and lacks a lot of the optimization they've put into the game since. The latest patch should perform the best.

Disable HairWorks if you haven't, and make sure the FPS limit is set to unlimited. Try doing a fresh install of the latest Titan X driver and check "clean install".

My card still caps out at a super low +300 on the RAM. I tested +350 a bit, and I ended up getting artifacts after about 30 minutes, even with +0 on the core. I get 7300ish in Fire Strike with +200/+300, so your score gap is most likely your processor; benchmarks are not the same as games.

I have heard that newer processors smooth out FPS significantly in Witcher 3 and GTA 5. Unless you stated the wrong model, your processor can't be overclocked, and it's probably what's causing the stuttering and random drops. Most people buying a Titan X overclock their processors and/or run a newer processor with higher stock clocks.
Quote:


> Originally Posted by *CallsignVega*
> 
> Hmm Hyper I'm not so sure about that. Virtually all tests I've ever seen put the 6700K ahead of the lower clocked 8 to 10 core CPU's for gaming. Remember Skylake is still the fastest architecture per clock and once you put 4.8 GHz on top of that, it's hard to beat.
> 
> Take a look at this that just came out. 6700K cleaned up across the board, even in some games you mentioned:
> 
> http://www.eurogamer.net/articles/digitalfoundry-2016-what-is-the-fastest-gaming-cpu
> 
> http://www.hardocp.com/article/2016/06/24/dx11_vs_dx12_intel_6700k_6950x_framerate_scaling/1#.V6NXiTsrK0o
> 
> I've always gotten the highest/most expensive GPU, but the $1700 6950X has about the worst cost to gaming performance due to its low clocks of about any CPU ever released. Plus with SLI now two cards max and the HB SLI bridge out, you don't need those extra PCI-E lanes/slots either.


You are right. I just bought my 6700K about two weeks ago, and I researched this topic extensively. Money wasn't a factor, and I still went with the 6700K.


----------



## Trys0meM0re

Quote:


> Originally Posted by *Murlocke*
> 
> Unless you stated the wrong model, your processor can't be overclocked. It's probably your processor that's causing the stuttering and random drops. .


^This


----------



## Gary2015

Quote:


> Originally Posted by *Zurv*
> 
> really? their website now says the NVidia HB bridge will not work. (it wasn't there yesterday)
> 
> but unless they moved back the terminal connectors (and they might have) the NVidia one won't work.


Damn, that's what the guy told me yesterday. Thinking of cancelling the order.


----------



## Jpmboy

Quote:


> Originally Posted by *CallsignVega*
> 
> Hmm Hyper I'm not so sure about that. Virtually all tests I've ever seen put the 6700K ahead of the lower clocked 8 to 10 core CPU's for gaming. Remember Skylake is still the fastest architecture per clock and once you put 4.8 GHz on top of that, it's hard to beat.
> 
> Take a look at this that just came out. 6700K cleaned up across the board, even in some games you mentioned:
> 
> http://www.eurogamer.net/articles/digitalfoundry-2016-what-is-the-fastest-gaming-cpu
> 
> http://www.hardocp.com/article/2016/06/24/dx11_vs_dx12_intel_6700k_6950x_framerate_scaling/1#.V6NXiTsrK0o
> 
> I've always gotten the highest/most expensive GPU, but the $1700 6950X has about the worst cost to gaming performance due to its low clocks of about any CPU ever released. Plus with SLI now two cards max and the HB SLI bridge out, you don't need those extra PCI-E lanes/slots either.


Extra threads don't hurt, tho.


----------



## Zurv

Quote:


> Originally Posted by *Gary2015*
> 
> Damn thats what the guy told me yday. Thinking of cancelling the order.


You could still use the EVGA HB bridge (if you can find it) or an LED hard bridge; that should be fine. If you aren't above 4K, the older v2 EVGA LED bridges work great.

I did order the NVIDIA bridge, and I'll see what happens when I cut the ends off.


----------



## Metros

Quote:


> Originally Posted by *DADDYDC650*
> 
> The larger curve is welcome and perhaps better quality control since the newest LG 34" display panels suffer from a lot less backlight bleeding which the x34p is sure to use. Native 100Hz is also guaranteed. What if the monitor starts to act weird at 100Hz? Acer warranty has you covered. I also prefer the matte finish on the back of the x34p over the glossy finish on the x34. Of course all of this isn't a huge deal but it's very welcome.


You could also buy the ASUS PG348Q, which is much better anyway


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> extra threads don;t hurt tho.


Tried changing GEN1-GEN2-GEN3 in BIOS , no changes from Auto


----------



## Metros

Quote:


> Originally Posted by *DarkIdeals*
> 
> Actually it's more different than you think.
> 
> According to Acer the X34P will have:
> 
> 1) 100hz native, with overclockability to 120hz or higher
> 
> 2) 1900r curve compared to 3800r curve on X34 (~2x more curve)
> 
> 3) Matte black finish to the back, so no more shiny cheap plastic on the back
> 
> 4) Joystick style OSD buttons similar to those on the ASUS ROG Swift monitors
> 
> 5) Full swivel etc.. stand control (Tilt, Swivel, Height adjust etc..)
> 
> 6) And most importantly imo, it will use a newer "S-IPS" panel from LG, rather than the older "AH-IPS" that had so much backlight bleed issues. (The S-IPS in the X34P is supposedly the same panel type used in the new 34UM98-W which is $1200 even without the inclusion of G-Sync and 100hz etc.. http://www.lg.com/us/monitors/lg-34UC98-W-ultrawide-monitor )
> 
> But I just couldn't see myself waiting till the end of the year to get a new monitor, especially after getting new GPUs, which makes you itch for better display tech to take advantage of them, lol. (Plus there's no info that I'm aware of that says November. In fact, Acer has only said "Q4", with one clarifying remark saying "Late Q4", and some people who were at Computex claiming they were told possibly December. There's also TFTCentral saying that 120hz+ capable 3440x1440 panels are only now entering mass production in late July/early August, which they say usually means 2-3 months before companies like Acer get hold of them in volume, and another 2-3 months before they can manufacture enough of them to have a feasible amount of stock for an anticipated release.)


No, I have just looked again, and even Acer says it is native 100Hz, with nothing about any overclock to 120Hz. It uses the same LG panel, so that will be DP 1.2, and it cannot be overclocked, due to DP bandwidth.


----------



## hotrod717

So tempted. I've had one in the cart several times, just can't get my conscience to let me do it. My wife would flip spit.


----------



## Zurv

Quote:


> Originally Posted by *hotrod717*
> 
> So tempted. I've had one in the cart several times, just can't get my conscience to let me do it. My wife would flip spit.


Just tell her it is 1/2 the price you are really paying (and she will still flip out), but that is what I do. Also, don't share bank accounts or CCs... cause that is crazy. You never know what can happen in the future.

(Also, from her view: what kind of crazy person would buy a $1200 video card!)


----------



## Gary2015

Quote:


> Originally Posted by *Zurv*
> 
> Just tell her it is 1/2 the price you are really paying (and she will still flip out), but that is what I do. Also, don't share bank accounts or CCs... cause that is crazy. You never know what can happen in the future.


I bought 2 of them and told her they cost $2,500.


----------



## mouacyk

Quote:


> Originally Posted by *EniGma1987*
> 
> ...
> 31,313 graphics score for me.
> http://www.3dmark.com/fs/9629131
> ...


Nice FS score, just like I called it before release: "Calling Firestrike at 31,250."


----------



## dante`afk

¯\_(ツ)_/¯





guess I'll keep my SLI.


----------



## NoDoz

Quote:


> Originally Posted by *lexlutha111384*
> 
> im not very impressed by the Titan XP numbers
> 
> 
> 
> 
> 
> 
> 
> 1080ti ftw


Welp, bye.


----------



## Murlocke

Quote:


> Originally Posted by *dante`afk*
> 
> ¯\_(ツ)_/¯
> 
> 
> 
> 
> 
> guess I'll keep my SLI.


Know what I find funny about this video? Even though the Titan X is consistently ~10FPS lower, the gameplay looks far smoother. There is a stutter on the SLI setup, like on every single SLI setup I had in the past. Not quite sure how it doesn't annoy the hell out of everyone; I guess some people can't see it? *shrug*

Also, 1080s are out of stock, and even at their typical ~$700 price, two of them run $200 more than the Titan.

Happy with my purchase.


----------



## dante`afk

I don't see any stutter. Depending on the situation, it's even a 15 FPS difference.

Apart from that, G-Sync >

Also, the Titan XP is at about 1950-2000MHz, while the 1080s run at stock clocks.


----------



## hotrod717

Quote:


> Originally Posted by *Zurv*
> 
> Just tell her it is 1/2 the price you are really paying (and she will still flip out), but that is what I do. Also, don't share bank accounts or CCs... cause that is crazy. You never know what can happen in the future.
> 
> (Also, from her view: what kind of crazy person would buy a $1200 video card!)


lol. Yeah, we have separate accounts and I am the breadwinner; I have the cash, but spending $10k on hardware every year is definitely crazy. I'm sure I'll justify it to myself over the next couple of days, lol. Especially after visiting these kinds of threads.


----------



## Gary2015

Quote:


> Originally Posted by *hotrod717*
> 
> lol. Yeah, we have separate accounts and I am the breadwinner, have the cash, but spending $10k on hardware every year is definitely crazy. I'm sure I'll justify it to myself over the next couple of days, lol. Especially after visiting these types of threads.


How are you spending $10k on PC hardware every year? The priciest upgrades are the video cards. CPUs/monitors/mobos are more like an every-2-3-years upgrade.


----------



## Murlocke

Quote:


> Originally Posted by *dante`afk*
> 
> don't see any stutter. depending on the situation it's even a 15 fps difference.
> 
> apart from that gsync >
> 
> also, the titan xp is at about 1950-2000mhz, while the 1080s run at stock clocks.


The stutter on the SLI setup is night/day for me. The Titan looks almost twice as smooth to me if I ignore the FPS counter.

Look at geralt's shoulders as he runs. There is a clear microstutter.


----------



## Metros

Quote:


> Originally Posted by *Murlocke*
> 
> The stutter on the SLI setup is night/day for me. The Titan looks almost twice as smooth to me if I ignore the FPS counter.
> 
> Look at geralt's shoulders as he runs. There is a clear microstutter.


It could also be the recording. It seems you want to blame the GTX 1080 SLI setup to justify the Titan X purchase; not sure why you are so quick to blame SLI for the issue.


----------



## Murlocke

Quote:


> Originally Posted by *Metros*
> 
> It could also be the recording. It seems you want to blame the GTX 1080 SLI setup to justify the Titan X purchase; not sure why you are so quick to blame SLI for the issue.


Or it's the fact I've had around 5 SLI setups in the past and they've all done that.









Many people around here won't buy SLI because of it. SLI is only worth it when you are getting ~50%+ more FPS than a single GPU solution, otherwise it feels worse in most titles due to the stutter. In my opinion, anyway.
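A rough way to put numbers on that frame-delay point (illustrative frame times I made up for the example, not measurements):

```python
# Illustrative only: why uneven AFR frame pacing can feel worse than a
# slower but evenly paced single GPU. Frame times below are invented.

def effective_fps(frame_times_ms):
    """Perceived smoothness is bounded by the LONGEST gap between frames,
    not by the average frame rate."""
    return 1000.0 / max(frame_times_ms)

single_gpu = [20.0, 20.0, 20.0, 20.0]   # 50 FPS, evenly paced
sli_afr    = [8.0, 25.3, 8.0, 25.3]     # ~60 FPS average, but uneven pacing

avg_fps_single = 1000.0 * len(single_gpu) / sum(single_gpu)
avg_fps_sli    = 1000.0 * len(sli_afr) / sum(sli_afr)

print(f"single GPU: {avg_fps_single:.0f} FPS avg, feels like ~{effective_fps(single_gpu):.0f}")
print(f"SLI (AFR):  {avg_fps_sli:.0f} FPS avg, feels like ~{effective_fps(sli_afr):.0f}")
```

With numbers like these, the SLI setup averages 60 FPS but its longest frame gap is that of a ~40 FPS card, which is one simple model of why a 10 FPS average lead can still feel worse.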


----------



## Testier

Quote:


> Originally Posted by *Murlocke*
> 
> I have heard that newer processors smooth out FPS significantly in Witcher 3 and GTA 5. Unless you stated the wrong model, your processor can't be overclocked. It's probably your processor that's causing the stuttering and random drops. Most people buying a Titan X overclock their processors and/or are running a newer processor with higher stock clocks.


I was thinking he could be CPU bottlenecked. That's a stock Haswell i5....


----------



## DADDYDC650

Quote:


> Originally Posted by *dante`afk*
> 
> ¯\_(ツ)_/¯
> 
> 
> 
> 
> 
> guess I'll keep my SLI.


Amazing how the Titan XP is only 10 frames behind 1080 SLI in Witcher 3. You don't have to deal with SLI compatibility issues either.


----------



## Gary2015

Quote:


> Originally Posted by *Testier*
> 
> I was thinking he could be CPU bottlenecked. That's a stock Haswell i5....


Yes
Quote:


> Originally Posted by *DADDYDC650*
> 
> Amazing how the Titan XP is only 10 frames behind 1080 SLI in Witcher 3. You don't have to deal with SLI compatibility issues either.


What happens if you have Titan X SLI? How much faster than 1080 SLI?


----------



## Testier

Quote:


> Originally Posted by *DADDYDC650*
> 
> Amazing how the Titan XP is only 10 frames behind 1080 SLI in Witcher 3. You don't have to deal with SLI compatibility issues either.


I honestly don't understand the love for SLI these days. Scaling is poor, and SLI simply doesn't play well with UE4, which I'm sure a lot of games are developed on.


----------



## Gary2015

Quote:


> Originally Posted by *Testier*
> 
> I honestly don't understand the love for SLI these days. Scaling is poor, and SLI simply doesn't play well with UE4, which I'm sure a lot of games are developed on.


Didn't think the love for SLI went up recently. I think SLI is still for the minority.


----------



## DADDYDC650

Quote:


> Originally Posted by *dante`afk*
> 
> don't see any stutter. depending on the situation it's even a 15 fps difference.
> 
> apart from that gsync >
> 
> also, the titan xp is at about 1950-2000mhz, while the 1080s run at stock clocks.


Stock boost on the 1080 is 1733Mhz.


----------



## DADDYDC650

Quote:


> Originally Posted by *Metros*
> 
> It could also be the recording. It seems you want to blame the GTX 1080 SLI setup to justify the Titan X purchase; not sure why you are so quick to blame SLI for the issue.


Titan X owners don't need to justify their purchase. Titan XP is king and 1080 SLI needs improvement.


----------



## EniGma1987

Quote:


> Originally Posted by *Metros*
> 
> It could also be the recording. It seems you want to blame the GTX 1080 SLI setup to justify the Titan X purchase; not sure why you are so quick to blame SLI for the issue.


Have you used SLI in the past generation or two? Just wondering.


----------



## Zurv

Quote:


> Originally Posted by *dante`afk*
> 
> ¯\_(ツ)_/¯
> 
> 
> 
> 
> 
> guess I'll keep my SLI.


yeah.. keep the SLI -> titan XP SLI!


----------



## mouacyk

Quote:


> Originally Posted by *Murlocke*
> 
> Know what I find funny about this video? Even though the Titan X is consistently ~10 FPS lower, the gameplay looks far smoother. There is a stutter on the SLI setup, like every single SLI setup I've had in the past. Not quite sure how it doesn't annoy the hell out of everyone, but I guess some people can't see it? *shrug*
> 
> Also, 1080s are out of stock, and even at their typical ~$700 price, two of them run $200 more than the Titan.
> 
> Happy with my purchase.


Not sure what it is with these SLI videos, but I can definitely notice the "stutter" too. The rendering is sluggish and the animations look to be rubber banding on the left. Despite the lower fps, the right video doesn't exhibit those symptoms.

I've used SLI 660's in BF3/4 and the stuttering really only appears when graphics are turned up far enough to negatively affect frame pacing.


----------



## DADDYDC650

Quote:


> Originally Posted by *Gary2015*
> 
> Yes
> What happens if you have Titan X SLI? How much faster than 1080 SLI?


Some say Titan XP SLI owners are now able to time travel and read minds.


----------



## dante`afk

Quote:


> Originally Posted by *EniGma1987*
> 
> Have you used SLI in the past generation or two? Just wondering.


i guess he has not.

I'm just testing witcher 3 myself and don't see any of that supposed stuttering here.


----------



## Murlocke

Quote:


> Originally Posted by *Testier*
> 
> I honestly don't understand the love for SLI these days. Scaling is poor, and SLI simply doesn't play well with UE4, which I'm sure a lot of games are developed on.


There's a lot less love for it now than there used to be; many people refuse to buy it at all. A lot of people have realized that, due to the slight frame delays of using multiple cards to render, you need much higher FPS than a single-GPU solution for it to feel "faster". At only a 10 FPS difference, the single GPU will always feel superior.

The only time I'd honestly recommend/consider getting SLI is when you need more power and there's not a single GPU on the market that can fit the bill. So I can understand why people are buying two Titan X, but I can't understand why someone would buy two 1080s now that the Titan exists.
Quote:


> Originally Posted by *dante`afk*
> 
> i guess he has not.
> 
> I'm just testing witcher 3 myself and don't see any of that supposed stuttering here.


What FPS are you getting?

I noticed if you get high FPS, SLI can be quite good. 60 and under, it gets pretty bad and feels much worse than a single GPU solution.


----------



## Zurv

I don't see any stutter in the witcher 3 with SLI... be that 4 way Titan X, 2 way 1080 or 2 way Titan XP

maybe there is a bottleneck some place else.




this was my video of Blood and Wine... i don't recall which system i recorded it on.. but it was some SLI







i think 1080s? there is a very small counter on the bottom right.


----------



## dante`afk

Yea, I'd notice stuttering very quickly since that was one of the reasons I went to gsync, even with single GPU.

I'm playing in 1440p and have about 90-110 fps in witcher3, with AA enabled.

I have the titan xp also here and still considering what to do. I can see why one single GPU is better.

with 1080 sli I have 90 fps in the city, with one card 70 fps, ....^^


----------



## Murlocke

Quote:


> Originally Posted by *Zurv*
> 
> I don't see any stutter in the witcher 3 with SLI... be that 4 way Titan X, 2 way 1080 or 2 way Titan XP
> 
> maybe there is a bottleneck some place else.
> 
> 
> 
> 
> this was my video of Blood and Wine... i don't recall which system i recorded it on.. but it was some SLI
> 
> 
> 
> 
> 
> 
> 
> i think 1080s? there is a very small counter on the bottom right.


I notice a stutter immediately, and skimming through the video I can identify a stutter pretty quickly. Based on your FPS in the corner, that looks closer to 35-40 FPS to me, not the ~55 FPS you are averaging. That could be due to the fact it's a 30 FPS video, though.









If you've been on SLI for a while it becomes harder to see the stutter, in my opinion; you get used to it. I was using SLI for about 4 builds in a row and I never saw it until I went back to single GPU and realized how much smoother <60FPS is on them.
Quote:


> Originally Posted by *dante`afk*
> 
> Yea, I'd notice stuttering very quickly since that was one of the reasons I went to gsync, even with single GPU.
> 
> I'm playing in 1440p and have about 90-110 fps in witcher3, with AA enabled.
> 
> I have the titan xp also here and still considering what to do. I can see why one single GPU is better.
> 
> with 1080 sli I have 90 fps in the city, with one card 70 fps, ....^^


Yes, that's why you're not seeing it. SLI is great at that FPS; it basically gets rid of the stutter because it's so fast you don't see it.

An SLI setup in the 40-60 FPS range will feel worse than a single-GPU setup in the same range, based on my experience, but after that point they start to feel roughly the same. By the time you hit ~90 FPS, I can't see the microstutter anymore.

At your resolution, I'd keep the SLI probably unless it's giving you headaches/problems.


----------



## Metros

Quote:


> Originally Posted by *Testier*
> 
> I honestly don't understand the love for SLI these days. Scaling is poor, and SLI simply doesn't play well with UE4, which I'm sure a lot of games are developed on.


It depends on the games you play. DICE games seem to hit 99 percent usage on both GPUs, and the SLI profile comes out when the game is released. Witcher 3 has good scaling; so do Elite Dangerous, Crysis 3, and many other games. The fact that GTX 980 Ti SLI still beats a Titan X shows how good SLI is in most games. However, if you play games from smaller developers, then getting a single GPU is better for performance.


----------



## Zurv

Quote:


> Originally Posted by *Murlocke*
> 
> I notice a stutter immediately, and skimming through the video I can identify a stutter pretty quickly. Based on your FPS in the corner, that looks closer to 35-40 FPS to me, not the ~55 FPS you are averaging. That could be due to the fact it's a 30 FPS video, though.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you've been on SLI for a while it becomes harder to see the stutter, in my opinion; you get used to it. I was using SLI for about 4 builds in a row and I never saw it until I went back to single GPU and realized how much smoother <60FPS is on them.
> Yes, that's why you're not seeing it.
> 
> SLI is great at that FPS; it basically gets rid of the stutter because it's so fast you don't see it.


In the close-up chat scenes the witcher drags down. That is why 1080 SLI isn't powerful enough and i replaced them with the titan XPs. Right now those same scenes on the titan X are in the 70s+

also, capturing in 4k is brutal. I use two NVMe intel 750s in raid 0 and sometimes they max out (but i capture raw rgb, not the **** that other software does). ugh.. the GeForce exp video capture quality is junk







- then like 8 hours in encode.. then another 10 hours to youtube to butcher it







haha


----------



## Murlocke

Quote:


> Originally Posted by *Zurv*
> 
> In the close-up chat scenes the witcher drags down. That is why 1080 SLI isn't powerful enough and i replaced them with the titan XPs. Right now those same scenes on the titan X are in the 70s+
> 
> also, capturing in 4k is brutal. I use two NVMe intel 750s in raid 0 and sometimes they max out (but i capture raw rgb, not the **** that other software does). ugh.. the GeForce exp video capture quality is junk
> 
> 
> 
> 
> 
> 
> 
> - then like 8 hours in encode.. then another 10 hours to youtube to butcher it
> 
> 
> 
> 
> 
> 
> 
> haha


Yeah, I tried doing raw 4K@60 capture once with my 850 EVO.

It could manage about 19 frames a second.
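For anyone curious why it tops out around there, a quick back-of-the-envelope calc (assuming uncompressed 24-bit RGB frames and a ~500 MB/s SATA SSD; actual capture formats and drive speeds vary):

```python
# Back-of-the-envelope: sustained write speed needed for raw RGB 4K capture.
width, height = 3840, 2160
bytes_per_pixel = 3                       # 24-bit RGB, no alpha, no compression

frame_bytes = width * height * bytes_per_pixel
mb_per_frame = frame_bytes / 1e6
print(f"{mb_per_frame:.1f} MB per frame")                 # ~24.9 MB

target_fps = 60
print(f"{mb_per_frame * target_fps / 1000:.2f} GB/s for 60 FPS")

# What a SATA SSD sustaining ~500 MB/s can actually keep up with:
sata_mb_s = 500
print(f"~{sata_mb_s / mb_per_frame:.0f} FPS on a ~500 MB/s SATA drive")
```

That works out to roughly 1.5 GB/s for 60 FPS, and ~20 FPS on a SATA drive, which lines up with the ~19 FPS reported above and with why the NVMe RAID 0 setup is needed.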


----------



## DADDYDC650

Quote:


> Originally Posted by *Zurv*
> 
> I don't see any stutter in the witcher 3 with SLI... be that 4 way Titan X, 2 way 1080 or 2 way Titan XP
> 
> maybe there is a bottleneck some place else.
> 
> 
> 
> 
> this was my video of Blood and Wine... i don't recall which system i recorded it on.. but it was some SLI
> 
> 
> 
> 
> 
> 
> 
> i think 1080s? there is a very small counter on the bottom right.


That doesn't look very smooth. I noticed stutter even when scrolling thru the map.


----------



## Zurv

Quote:


> Originally Posted by *Murlocke*
> 
> Yeah, I tried doing raw 4K@60 capture once with my 850 EVO.
> 
> It could manage about 19 frames a second.


yeah.. even with the nvme intel or the 950 nvme from Sammy. Getting over 30 is pretty hard.
But i'm kinda "saved" now. Before i'd make 4k videos when only crazy people had the hardware to render it. But now i'm limited like everyone else to two-way SLI.. soo.. what's the point...boo.. what the hell am i using these 10 cores for!!


----------



## bee144

Quote:


> Originally Posted by *renejr902*
> 
> i want the evga watercooler, do you know when it will be release ? can we preorder?




__ https://twitter.com/i/web/status/760539551655862273


----------



## Zurv

Quote:


> Originally Posted by *DADDYDC650*
> 
> That doesn't look very smooth. I noticed stutter even when scrolling thru the map.


that map is always wacky.. but then i never play on anything but SLI.


----------



## Gary2015

Quote:


> Originally Posted by *bee144*
> 
> 
> __ https://twitter.com/i/web/status/760539551655862273


I guess they are pissed Nvidia decided to sell the Titan XP themselves.


----------



## dante`afk

welp, gotta have the newest of the new. will sell my two 1080s and stick with the titan xp for a bit, then later buy a second one


----------



## stangflyer

I have had SLI/Crossfire since the 7950GX2. Had 280 GTX x2, 5970, 7950's, 970's in SLI. Always noticed small hitching when I played but just thought "that is the way it is" because I'd had it so long. Then I saw some friends' and my cousin's PCs with fast single cards. I had 970 G1's in SLI, and when my cousin had his PC at my place, his single 780 was smoother at 50 fps than mine at 70, same games, different settings. Got a 980 Ti Hybrid and I can honestly say I will never SLI again. Night and day difference. I had my doubts until I went back and forth between the two PCs with no fps showing on screen.

Keep in mind I have not seen GSync in SLI!


----------



## bee144

Quote:


> Originally Posted by *Gary2015*
> 
> I guess they are pissed Nvidia decided to sell the Titan XP themselves.


how much do you really think EVGA actually makes by selling a reference card? $20-$25? I think they make their money off of their custom-PCB cards like the FTW, where they only have to pay NVIDIA for the GPU chip and can source everything else themselves.

I still think they have a lot of money to be made off Titan XP owners. I plan on buying two hybrid kits, their SLI bridge, and two of their power link adapters.

I spoke to Jacob about the cooler last week and requested they include RGB. Maybe they're making improvements to their reference 1080 cooler before adding a titan XP cooler.


----------



## D749

Quote:


> Originally Posted by *dante`afk*
> 
> ¯\_(ツ)_/¯
> 
> 
> 
> 
> 
> guess I'll keep my SLI.


What overlay is he/she using? I wish that I could get MSI AB to organize info like that.


----------



## Steven185

Quote:


> Originally Posted by *EniGma1987*
> 
> The only comparison I have of stock to overclocked is in Firestrike Ultra; I didn't do any real game comparisons as I didn't have a ton of time last night. But stock to overclocked I gained a bit over 13% performance from a 10% overclock. Go figure that one out. My guess would be that we are at the edge of the benchmark's usefulness, as once you get past the tipping point in any benchmark the points just start scaling up at increasing rates. So anyway, very good boosts to performance from overclocking. I found I tended to get an extra 100 points of graphics score for each 50MHz core increase. The numbers I got tend to be what everyone else here is averaging as well, +200 core.


Hey, thanks!

I guess gone are the days when overclocking made much difference; nowadays it's all about cooling (if one can sustain the clocks). Just last gen I could squeeze 30% more performance over stock; of course a tweaked BIOS helped some, as did good cooling. Thus far this gen seems quite underwhelming OC-wise (I had hoped some of the 1080's restrictions were gone, but I guess that's not so).

Thanks once again for the report.


----------



## mbze430

Quote:


> Originally Posted by *D749*
> 
> What overlay is he/she using? I wish that I could get MSI AB to organize info like that.


HWInfo 64 w/ RTSS


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Tried changing GEN1-GEN2-GEN3 in BIOS , no changes from Auto


is that a question or a result?
Quote:


> Originally Posted by *hotrod717*
> 
> So tempted. I've had one in the cart several times, just can't get my conscience to let me do it. My wife would flip spit.


decadent for sure!
Quote:


> Originally Posted by *Gary2015*
> 
> How are you spending $10k on PC hardware every year?. Most pricey upgrade are the video cards. CPU/Monitors/Mobos are like every 2-3 years upgrade.


Guess you don't know hotrod717 and his extreme habit.
Quote:


> Originally Posted by *dante`afk*
> 
> i guess he has not.
> 
> I'm just testing witcher 3 myself and don't see any of that supposed stuttering here.


Back again? Just sell the SLI setup and get a Titan... otherwise why troll a TXP owners' thread?


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> is that a question or a result?


Results.


----------



## mbze430

Quote:


> Originally Posted by *DADDYDC650*
> 
> Some say Titan XP SLI owners are now able to time travel and read minds.


Uhmm yea... we all bought into Titan XP SLI for DEEP LEARNING! *****


----------



## lilchronic

Quote:


> Originally Posted by *Jpmboy*
> 
> extra threads don't hurt tho.


A 6600K can easily beat a 6950X in this bench. Valley doesn't use more than 4 cores. With a single card you see one core maxed out most of the time, with a second core around 50%. In SLI it seems the load gets spread out between all four of the cores it's able to use.


----------



## techguymaxc

Quote:


> Originally Posted by *dante`afk*
> 
> ¯\_(ツ)_/¯
> 
> guess I'll keep my SLI.


K. Enjoy your micro-stutter.


----------



## Artah

Quote:


> Originally Posted by *DADDYDC650*
> 
> Some say Titan XP SLI owners are now able to time travel and read minds.


Quote:


> Originally Posted by *mbze430*
> 
> Uhmm yea... we all bought in to Titan XP SLI for DEEP LEARNING! *****


That is just completely inaccurate, Titan XP SLI owners can also teleport and use beaming technology!


----------



## Baasha

The Witcher 3 in 4-Way SLI - Titan X (Maxwell):


----------



## CallsignVega

Almost at max 300 FPS single card, so close:

JPM, representing the 6700K:



I freaking love these cards.


----------



## Baasha

Quote:


> Originally Posted by *CallsignVega*
> 
> Almost at max 300 FPS single card, so close:
> 
> JPM, representing the 6700K:
> 
> 
> 
> I freaking love these cards.


what's the maximum stable OC you've gotten with the Titan XP in SLI?

Still testing the single GPU - benchies are going amazingly, actually.


----------



## CallsignVega

2101 MHz, but playing around with the memory seems to have quite the impact.


----------



## dante`afk

Quote:


> Originally Posted by *DADDYDC650*
> 
> That doesn't look very smooth. I noticed stutter even when scrolling thru the map.


damn, even I see the stuttering now.

@zurv what's your fps there, and do you use the HB Bridge?
Quote:


> Originally Posted by *D749*
> 
> What overlay is he/she using? I wish that I could get MSI AB to organize info like that.


i think that's NZXT's CAM.


----------



## HaniWithAnI

Quote:


> Originally Posted by *CallsignVega*
> 
> 2101 MHz, but playing around with the memory seems to have quite the impact.


Care to elaborate? Do you mean backing off on mem results in higher core?


----------



## mbze430

Anyone here with a Titan X Pascal and an Oculus? Someone on the Oculus forum is saying they are getting a message that the minimum configuration is not met.

Can anyone else confirm? I am not getting mine till tomorrow or Saturday, and I very much intend to use it with my Oculus Rift CV1.


----------



## Testier

I can't get past 2GHz on my Titan X in FS. Stuck around 1950ish; I feel I am being capped by the power limit.

http://www.3dmark.com/fs/9637982


----------



## Testier

Quote:


> Originally Posted by *CallsignVega*
> 
> 2101 MHz, but playing around with the memory seems to have quite the impact.


Do you have a single-GPU run score of that, Vega? I just want to compare it.


----------



## D749

Quote:


> Originally Posted by *mbze430*
> 
> HWInfo 64 w/ RTSS


Thanks. I use AIDA64 and MSI AB. Will have to check that out.


----------



## D749

My Titan XPs just arrived in the US.

Did anyone else receive a box like this? It has security tape/stickers which were cut and then taped over. The box also had a Foxconn sticker so maybe this went through inspection in customs?





Also, the actual Titan XP box was covered in plastic, but was not shrink wrapped. They just had tape on each side keeping the box closed.

Thanks.


----------



## DADDYDC650

Quote:


> Originally Posted by *D749*
> 
> My Titan XPs just arrived in the US.
> 
> Did anyone else receive a box like this? It has security tape/stickers which were cut and then taped over. The box also had a Foxconn sticker so maybe this went through inspection in customs?
> 
> 
> 
> 
> 
> Also, the actual Titan XP box was not shrink wrapped. They just had tape on each side keeping the box closed.
> 
> Thanks.


Mine just came in just now. Regular brown box with a shipping label on it. Titan X box was wrapped with plastic.


----------



## D749

Quote:


> Originally Posted by *DADDYDC650*
> 
> Mine just came in just now. Regular brown box with a shipping label on it. Titan X box was wrapped with plastic.


Did your exterior packaging look like mine?

Also, each Titan XP box was covered in a plastic bag, but each box was not shrink wrapped. Is that what you noticed?


----------



## techguymaxc

The Titan X boxes sound exactly like how I received mine; nothing obviously missing from the box.


----------



## CallsignVega

Quote:


> Originally Posted by *Testier*
> 
> Do you have a single-GPU run score of that, Vega? I just want to compare it.


Here you go:


----------



## Murlocke

I keep seeing the FPS and thinking "that's way higher than what i get...". Then I see 1920x1080.









Quote:


> Originally Posted by *D749*
> 
> My Titan XPs just arrived in the US.
> 
> Did anyone else receive a box like this? It has security tape/stickers which were cut and then taped over. The box also had a Foxconn sticker so maybe this went through inspection in customs?
> 
> 
> 
> 
> 
> Also, the actual Titan XP box was covered in plastic, but was not shrink wrapped. They just had tape on each side keeping the box closed.
> 
> Thanks.


That is not how mine came. You were right to take pictures before touching. Hopefully the package is fine.


----------



## DADDYDC650

Quote:


> Originally Posted by *D749*
> 
> Did your exterior packaging look like mine?
> Also, each Titan XP box was covered in a plastic bag, but each box was not shrink wrapped. Is that what you noticed?


Just a regular brown box with a label. XP box isn't shrink wrapped. Looks like yours went through customs.

On that note... Hello darkness, hello friend.


----------



## D749

Almost forgot the money shot.


----------



## GunnzAkimbo




----------



## marc0053

Hey guys,
I did a quick unboxing video yesterday followed by some trial runs with the stock air cooler for Firestrike.
After seeing the core throttling issue I then placed the waterblock and it helped tons.
There is still some core throttling due to hitting the power limit but I will be doing the shunt mod using CLU next week.
Hopefully it fixes the issue.

My first live stream (low budget / low quality but will improve soon with capture card)

unboxing:





air/watercooling setup:





A few HWBOT Subs:
http://hwbot.org/submission/3280668_marc0053_3dmark___fire_strike_titan_x_pascal_27344_marks
http://hwbot.org/submission/3280673_marc0053_3dmark___fire_strike_extreme_titan_x_pascal_15719_marks
http://hwbot.org/submission/3280678_marc0053_3dmark___fire_strike_ultra_titan_x_pascal_8206_marks
http://hwbot.org/submission/3280694_marc0053_3dmark___time_spy_titan_x_pascal_11073_marks
http://hwbot.org/submission/3280521_marc0053_3dmark11___performance_titan_x_pascal_35283_marks
http://hwbot.org/submission/3280574_marc0053_3dmark_vantage___performance_titan_x_pascal_86085_marks
http://hwbot.org/submission/3280435_marc0053_catzilla___720p_titan_x_pascal_70729_marks
http://hwbot.org/submission/3280459_marc0053_catzilla___1440p_titan_x_pascal_25695_marks
http://hwbot.org/submission/3280211_marc0053_gpupi___1b_titan_x_pascal_10sec_967ms


----------



## Jpmboy

Quote:


> Originally Posted by *lilchronic*
> 
> A 6600K can easily beat a 6950X in this bench. Valley doesn't use more than 4 cores. With a single card you see one core maxed out most of the time, with a second core around 50%. In SLI it seems the load gets spread out between all four of the cores it's able to use.


erm - yeah. Think I came across that before.








my 6320 @ 5.3 does even better in valley.
Quote:


> Originally Posted by *CallsignVega*
> 
> Almost at max 300 FPS single card, so close:
> 
> JPM, representing the 6700K:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I freaking love these cards.


lol - less to do with a 4.8 6700K than with 5800+ on the video ram.

Guys - I know the IPC on a 6700K is the best right now, especially in DX11 and older APIs (which Unigine is). DX12 is a bit different (e.g., Time Spy). Windows 8?
Still, your 170+ FPS score in Valley is not easy to match, even with my 6700K @ 5.0 and RAM at 3866.


----------



## EniGma1987

Quote:


> Originally Posted by *Zurv*
> 
> I don't see any stutter in the witcher 3 with SLI... be that 4 way Titan X, 2 way 1080 or 2 way Titan XP
> 
> maybe there is a bottleneck some place else.
> 
> 
> 
> 
> this was my video of Blood and Wine... i don't recall which system i recorded it on.. but it was some SLI
> 
> 
> 
> 
> 
> 
> 
> i think 1080s? there is a very small counter on the bottom right.


If you want us to try and see (or not see) stutter at 55fps, you are going to need to make the video at 60fps, not 30fps. Otherwise, no matter what you are running, we will always see stutter, because the video is not playing back anywhere close to the game's fps.
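A simple sampling sketch illustrates the point (idealized timings; 55 FPS gameplay recorded at 30 FPS is an assumption for the example):

```python
# Idealized sketch: a 30 FPS video sampling 55 FPS gameplay must repeat or
# skip game frames unevenly, which shows up as judder in the recording
# regardless of how smooth the game itself was.
import math

game_fps, video_fps = 55.0, 30.0
video_frames = 12

# Which game frame is on screen at each video-frame capture instant?
sampled = [math.floor(i / video_fps * game_fps) for i in range(video_frames)]
steps = [b - a for a, b in zip(sampled, sampled[1:])]

print(sampled)  # which game frame each video frame shows
print(steps)    # a mix of 1-frame and 2-frame jumps -> uneven motion on video
```

Because the steps alternate irregularly between advancing one and two game frames, the recording shows uneven motion even if the game's own frame pacing were perfect.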


----------



## Jpmboy

not easy, but not impossible.


----------



## Zurv

Quote:


> Originally Posted by *EniGma1987*
> 
> If you want us to try and see or not see stutter at 55fps, you are going to need to make the video in 60fps, not 30fps. Otherwise no matter what you are running we will always see stutter cause the video is not playing anything close to what the game fps is.


you are talking crazy town!







haha.. that is too much data to capture, both in size and write speed








but now that i'm limited to 2 way SLI.. what's the point of making videos


----------



## hotrod717

Quote:


> Originally Posted by *Gary2015*
> 
> How are you spending $10k on PC hardware every year? The priciest upgrades are the video cards. CPUs/monitors/mobos are more like an every-2-3-years upgrade.


980kp, titan xm, 980ti, 4 to 5 motherboards, various RAM kits, various SSDs, various CPUs, various legacy GPUs. If I did a true costing, more than $10k a year. There are different levels of enthusiast.


----------



## lilchronic

Quote:


> Originally Posted by *Jpmboy*
> 
> erm - yeah. Think I came across that before.
> 
> 
> 
> 
> 
> 
> 
> 
> *my 6320 @ 5.3 does even better in valley.*
> lol - less to do with a 4.8 6700K than with 5800+ on the video ram.
> 
> Guys - I know the IPC on a 6700K is the best right now, especially in DX11 and older APIs (which Unigine is). DX12 is a bit different (e.g., Time Spy). Windows 8?
> Still, your 170+ FPS score in Valley is not easy to match, even with my 6700K @ 5.0 and RAM at 3866.


lol i don't doubt it.









seems like you got a nice gain from 166 FPS to 171 FPS going to Skylake, but I'm sure you pushed it slightly higher.


----------



## Jpmboy

Quote:


> Originally Posted by *lilchronic*
> 
> lol i don't doubt it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> seems like you got a nice gain from 166 FPS to 171 FPS going to skylake, but im sure you pushed it slightly higher.


never would have thought to push the ram up that far... it certainly can't survive many other benchmarks at that speed. Vega has some amazing ram on his card.


----------



## D749

Quote:


> Originally Posted by *Murlocke*
> 
> I keep seeing the FPS and thinking "that's way higher than what i get...". Then I see 1920x1080.
> 
> 
> 
> 
> 
> 
> 
> 
> That is not how mine came. You were right to take pictures before touching. Hopefully the package is fine.


I think I know why my box had the security sticker cut. The box sent to me had some Foxconn labeling. It shows quantity 4 on the box. I only ordered 2 cards. Nvidia probably used the same box and just took out 2 cards.


----------



## Jpmboy

Quote:


> Originally Posted by *marc0053*
> 
> Hey guys,
> I did a quick unboxing video yesterday followed by some trial runs with the stock air cooler for Firestrike.
> After seeing the core throttling issue I then placed the waterblock and it helped tons.
> There is still some core throttling due to hitting the power limit but I will be doing the shunt mod using CLU next week.
> Hopefully it fixes the issue.
> 
> My first live stream (low budget / low quality but will improve soon with capture card)
> 
> unboxing:
> 
> 
> 
> 
> 
> air/watercooling setup:
> 
> 
> 
> 
> 
> A few HWBOT Subs:
> http://hwbot.org/submission/3280668_marc0053_3dmark___fire_strike_titan_x_pascal_27344_marks
> http://hwbot.org/submission/3280673_marc0053_3dmark___fire_strike_extreme_titan_x_pascal_15719_marks
> http://hwbot.org/submission/3280678_marc0053_3dmark___fire_strike_ultra_titan_x_pascal_8206_marks
> http://hwbot.org/submission/3280694_marc0053_3dmark___time_spy_titan_x_pascal_11073_marks
> http://hwbot.org/submission/3280521_marc0053_3dmark11___performance_titan_x_pascal_35283_marks
> http://hwbot.org/submission/3280574_marc0053_3dmark_vantage___performance_titan_x_pascal_86085_marks
> http://hwbot.org/submission/3280435_marc0053_catzilla___720p_titan_x_pascal_70729_marks
> http://hwbot.org/submission/3280459_marc0053_catzilla___1440p_titan_x_pascal_25695_marks
> http://hwbot.org/submission/3280211_marc0053_gpupi___1b_titan_x_pascal_10sec_967ms


hey marc - did you put a uniblock on the card and leave the blower for the ram/vrms?


----------



## DADDYDC650

Quote:


> Originally Posted by *D749*
> 
> My Titan XPs just arrived in the US.
> 
> Did anyone else receive a box like this? It has security tape/stickers which were cut and then taped over. The box also had a Foxconn sticker so maybe this went through inspection in customs?
> 
> 
> 
> 
> 
> Also, the actual Titan XP box was covered in plastic, but was not shrink wrapped. They just had tape on each side keeping the box closed.
> 
> Thanks.


Yours came straight off the assembly line from Foxconn.


----------



## Jpmboy

Quote:


> Originally Posted by *DADDYDC650*
> 
> Yours came straight off the assembly line from Foxconn.


that's the same box/packaging i received too. Digital river.


----------



## mbze430

Quote:


> Originally Posted by *hotrod717*
> 
> 980kp, titan xm, 980ti , 4 to 5 motherboards, various ram kits, various ssds, various cpus,
> Various legacy gpu's. If I did a true costing, more than $10k a year. There are different levels of enthusiast.


I know the feeling. I spend almost that much on electronics, and then I have my SCUBA, skydiving, HPDE weekends... the list goes on and on. Crazy how far an enthusiast can go! So I feel you!


----------



## DADDYDC650

Quote:


> Originally Posted by *Jpmboy*
> 
> that's the same box/packaging i received too. Digital river.


You lucky dawg. It was never put on any shelf.... Straight from the factory. I bet it smelled like newborn babies when you opened it right?


----------



## dante`afk

gotta love that the 1080s are out of stock; sold both my cards on amazon.com with $100 profit each


----------



## DADDYDC650

Quote:


> Originally Posted by *dante`afk*
> 
> gotta love that the 1080s are out of stock; sold both my cards on amazon.com with $100 profit each


1080's are for poor people. Kidding!


----------



## marc0053

Quote:


> Originally Posted by *Jpmboy*
> 
> hey marc - did you put a uniblock on the card and leave the blower for the ram/vrms?


Yes, Uni block + fan on VRM and on back of PCB


----------



## Diverge

My titan arrived a couple hours ago... already sold my Maxwell card









So far I've only bumped the core to +205, and seems stable in Witcher 3. Card maintains ~83-84C w/ 90% fan speed. I'll have to test more to make sure I'm stable, and see about more OC.


----------



## lilchronic

Quote:


> Originally Posted by *marc0053*
> 
> Yes, Uni block + fan on VRM and on back of PCB


you do the shunt mod yet?


----------



## szeged




----------



## marc0053

Quote:


> Originally Posted by *lilchronic*
> 
> you do the shunt mod yet?


Just waiting for CLU in the mail


----------



## Testier

Quote:


> Originally Posted by *marc0053*
> 
> Just waiting for CLU in the mail


Which resistors to take out for the shunt mod? I am curious...


----------



## EniGma1987

Quote:


> Originally Posted by *marc0053*
> 
> Just waiting for CLU in the mail


Going that route instead of soldering resistors over the capacitors, huh? I wanted to do the CLU way too, but TiN scared me off of it. Let me know how your mod goes, and if it works then I'll just go that route too. Easy enough to remove if the CLU over the shunts doesn't work









Which reminds me, will CLU work on the die with the stock vapor chamber heatsink? Or will it eat into the sink? I assume the vapor chamber is copper plated with nickel, but I didn't really want to chance that the whole heatsink block was made of aluminum, as that would go... very, very bad. I am going waterblock anyway, but figured I may do a CLU TIM job in the meantime.


----------



## Viveacious

Quote:


> Originally Posted by *techguymaxc*
> 
> K. Enjoy your micro-stutter.


I don't know how he could've watched that video and thought the SLI setup was smoother. Must've just looked at the FPS number.

Titan was clearly smoother, even with the lower framerate.


----------



## stefxyz

Guys,

I am officially in the club now. Just did some testing (will have to wait for the bloody waterblock). But she's performing like a champ already:

http://www.3dmark.com/3dm/13862794


----------



## Murlocke

Quote:


> Originally Posted by *dante`afk*
> 
> gotta love that the 1080s are out of stock; sold both my cards on amazon.com with $100 profit each


I will never understand this: who is buying these? If someone is willing to drop $800+ on a $700 card, and waste $100+ by not having the patience to wait, then what's another $400 for ~37% more performance (based on HardwareCanucks' average)?

If $100 means so little to them, surely they can stretch $400 for much better performance! I have a feeling most of the people buying 1080s above market value have no clue this card exists yet.


----------



## KillerBee33

Quote:


> Originally Posted by *Murlocke*
> 
> I will never understand this, who is buying these? If someone is willing to drop $800+ on a $700 card, and waste $100+ by not having the patience of waiting, then what's another $400 for ~37% more performance (based on HardwareCanuck's average)?
> 
> If $100 means so little to them, surely they can stretch $400 for much better performance! I have a feeling most of the people buying 1080s for above market value have no clue this card exists yet.


Got it, played it for two months, will sell it and lose 100 bucks at most


----------



## mbze430

Quote:


> Originally Posted by *Murlocke*
> 
> I will never understand this, who is buying these? If someone is willing to drop $800+ on a $700 card, and waste $100+ by not having the patience of waiting, then what's another $400 for ~37% more performance (based on HardwareCanuck's average)?
> 
> If $100 means so little to them, surely they can stretch $400 for much better performance! I have a feeling most of the people buying 1080s above market value have no clue this card exists yet.


Then they're the same people that ask us why we paid $1,200 for a Titan XP for a 20% gain..


----------



## ratzofftoya

I am going to be running two of these in SLI. Should I leave one of my 980Tis in the system for PhysX?


----------



## techguymaxc

Quote:


> Originally Posted by *ratzofftoya*
> 
> I am going to be running two of these in SLI. Should I leave one of my 980Tis in the system for PhysX?


Not enough. You should leave both 980 Tis in there.


----------



## stefxyz

Don't think many games or apps use dedicated PhysX. This might change with VR, but I wouldn't count on it.


----------



## techguymaxc

Quote:


> Originally Posted by *stefxyz*
> 
> Dont think many games or apps use dediucated physx. This might change with VR but I wouldnt count on it.


You're no fun.


----------



## mbze430

Quote:


> Originally Posted by *ratzofftoya*
> 
> I am going to be running two of these in SLI. Should I leave one of my 980Tis in the system for PhysX?


That is how I am planning to run my setup, mainly to future-proof my VR experience. I own the Oculus Rift.

I got an extra 980 Ti laying around, and those Titan XPs coming in a few days.

As indicated by the VR Funhouse demo from Nvidia: to run VR SLI, you need 2x 10-series GTX cards in SLI mode, plus 1x dedicated 980 Ti card.


----------



## stefxyz

I think in theory that would be great, but let's face it, there are only a handful of crazy enthusiasts like us who would really do this. Developers will not code for 0.01% of the market. I would put 3 bloody Titans in if that would really add substantial value to VR... but not just for Funhouse alone.


----------



## Foxrun

I did next-day 10am shipping yesterday, but Nvidia still hasn't shipped my GPU. Are they slow for anyone else? I was hoping to get it tomorrow because I sold my 1080s.


----------



## KillerBee33

Quote:


> Originally Posted by *mbze430*
> 
> That is how I am planning to run my setup, and mainly for future proofing my VR experience. I own the Oculus Rift
> 
> I got an extra 980TI laying around, and those Titan XP coming in a few days
> 
> As indicated with the VR Funhouse Demo from nvidia. to run VR SLI, you needed 2x 10-Series GTX cards in SLI mode, then 1x dedicated 980TI card.


You can't fully run Funhouse on a Rift; the main idea of Funhouse is room scale and Touch controllers, which the Rift isn't capable of right now, but not for long


----------



## mbze430

Quote:


> Originally Posted by *ratzofftoya*
> 
> I am going to be running two of these in SLI. Should I leave one of my 980Tis in the system for PhysX?


That is how I am planning to run my setup, mainly to future-proof my VR experience. I own the Oculus Rift.

As indicated by the VR Funhouse demo from Nvidia: to run VR SLI, you need 2x 980 Ti or above in SLI mode, plus 1x dedicated 980 Ti card.
Quote:


> Originally Posted by *stefxyz*
> 
> I think in theory that would be great but lets face it there are a hand full of crazy enthusiasts like we are who would really do this. Developers will not code for 0.01% of the market. I would put 3 bloody Titans in if that would really add substantial value to VR... But not just for Funhouse alone.


I wouldn't do 2x Titan XP SLI and 1x Titan XP dedicated PhysX... the most I would do is a GTX 1080 as a dedicated PhysX card, which I might do, since I am planning to get one of those external TB3 GPU boxes for my laptop


----------



## ratzofftoya

Quote:


> Originally Posted by *mbze430*
> 
> That is how I am planning to run my setup, and mainly for future proofing my VR experience. I own the Oculus Rift
> 
> I got an extra 980TI laying around, and those Titan XP coming in a few days
> 
> As indicated with the VR Funhouse Demo from nvidia. to run VR SLI, you needed 2x 10-Series GTX cards in SLI mode, then 1x dedicated 980TI card.


If it won't confer a substantial benefit (because it relies on developer implementation), I'll likely base my decision on whether or not this works with an EK FC watercooling bridge. If the ports don't line up, forget it!


----------



## mbze430

Quote:


> Originally Posted by *KillerBee33*
> 
> You can't fully run Fun House on a Rift, main idea of a Fun House is Room Scale and Touch Controllers which Rift isn't capable of now, but not for long


True, true. Can't wait for Touch to come... but I did start VR Funhouse with my old 980 Ti SLI setup. It didn't use the second card, but you can move around with the Xbox controller.


----------



## TUFinside

Congratz to all the lucky b4st4rdz owning that little beast.


----------



## mbze430

Quote:


> Originally Posted by *ratzofftoya*
> 
> If it won't confer a substantial benefit (because it relies on developer implementation), I'll likely base my decision on whether or not this works with an EK FC watercooling bridge. If the ports don't line up, forget it!


I have the Aqua Computer waterblocks on my 980 Ti, so I am waiting for Aqua Computer to release their Titan XP waterblock; they said their blocks will be able to use the HB bridge


----------



## CallsignVega

Quote:


> Originally Posted by *Jpmboy*
> 
> not easy, but not impossible.


Quote:


> Originally Posted by *Jpmboy*
> 
> erm - yeah. Think I came across that before.
> 
> 
> 
> 
> 
> 
> 
> 
> my 6320 @ 5.3 does even better in valley.
> lol-=less to do with a 4.8 6700K, than 5800+ on the video ram.
> 
> Guys- I know the IPC on a 6700K is the best right now, especially in DX11 and older APIs (which Unigine is). DX12 is a bit different (eg, time spy). Windows 8?
> Still your 170+FPS score in Valley is not easy to match even with my 6700K @ 5.0 and ram at 3866.


0.3 FPS!









Going to have to bump up my 6700K to 4.9 GHz and I can increase another boost step on the GPU. I should probably be able to get 173 with that lol. Don't want to take my SLI apart again though to get the top GPU back to 16x.









By the way what cooler and voltage you using for 5 GHz 6700K? Mine tops out at 4.9 GHz and I use 4.8 24/7.


----------



## DarkIdeals

Changed thermal paste, still terrible temp issues. If I don't pump the fan to ~85% or higher it will climb all the way to 90C after as little as 4-5 minutes, if not less. And the fan is just WAY too loud at that level. I'm not even overclocking either! All I did was raise the power and temp limits.

This is disappointing to say the least; what the hell was Nvidia thinking using this cooler on this card? I'm still curious how so many people are apparently "never going over 84C" in reviews and such; seems like none of them are raising the temp target or something. And there are no waterblocks till the 16th, so I'm basically gimped to ~1,400MHz till then, as my entire GPU Boost is lost to this throttling nonsense.


----------



## techguymaxc

Quote:


> Originally Posted by *DarkIdeals*
> 
> Changed thermal paste, still terrible temp issues. If i don't pump the fan to ~85% or higher it will raise and raise up to 90C after as little as 4-5 minutes if not less. And the fan is just WAY too loud at that level. I'm not even overclocking either! All i did was raise power and temp limit.
> 
> This is disappointing to say the least, what the hell was Nvidia thinking using this cooler on this card. I'm still curious how so many people are apparently "never going over 84C" in reviews and stuff; seems like none of them are raising temp target or something. And there's no waterblocks till the 16th so i'm basically gimped to ~1,400mhz till then as my entire GPU boost is lost from this throttling nonsense.


Unless your ambient temps are garbage (like high 20s/low 30s C), there's no reason your card should throttle like that; it may be defective.

Edit: assuming of course you also have proper ventilation in your case...


----------



## lilchronic

Quote:


> Originally Posted by *CallsignVega*
> 
> 0.3 FPS!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Going to have to bump up my 6700K to 4.9 GHz and I can increase another boost step on the GPU. I should probably be able to get 173 with that lol. Don't want to take my SLI apart again though to get the top GPU back to 16x.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> By the way what cooler and voltage you using for 5 GHz 6700K? Mine tops out at 4.9 GHz and I use 4.8 24/7.


That's about what my 6600k can do water cooled.


Spoiler: Warning: Spoiler! off topic


----------



## batata face

Hello everybody


----------



## dante`afk

does the corsair h90 fit on the txp?

or any other aio watercooling anyone is using?


----------



## KillerBee33

Quote:


> Originally Posted by *dante`afk*
> 
> does the corsair h90 fit on the txp?
> 
> or any other aio watercooling anyone is using?




Same unit http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1


----------



## CallsignVega

JPM if you best this you win our duel as I am spent:


----------



## carlhil2

Did anyone who paid for "next day" actually get theirs the next day? also, has anyone put the EK uni-block on their TXP yet?


----------



## DarkIdeals

Quote:


> Originally Posted by *carlhil2*
> 
> Did anyone who paid for "next day" actually get theirs the next day? also, has anyone put the EK uni-block on their TXP yet?


Got mine next day, but apparently at the expense of proper cooling. I'm getting ~50C at idle with max performance set in the control panel; when it's set to max power savings it still idles at ~44-45C, and I can't seem to cure the ~89C load temps either. It's getting infuriating.

Wish I still had my EK uniblock; I'd be using it now if I did.


----------



## Diverge

Quote:


> Originally Posted by *carlhil2*
> 
> Did anyone who paid for "next day" actually get theirs the next day? also, has anyone put the EK uni-block on their TXP yet?


I ordered yesterday at around 10-11am EST. Got it today at around 1PM, even though it was supposed to be delivered by 10:30AM.


----------



## carlhil2

Quote:


> Originally Posted by *Diverge*
> 
> I ordered yesterday at around 10-11am EST. Got it today at around 1PM, even though it was supposed to be delivered by 10:30AM.


Thanks for the response, you too @DarkIdeals....


----------



## HaniWithAnI

Quote:


> Originally Posted by *KillerBee33*
> 
> 
> 
> Same unit http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1


Holy **** that looks so effing good I'm totally doing that!! Presumably he had to drill the top of the cooler?


----------



## KillerBee33

Quote:


> Originally Posted by *HaniWithAnI*
> 
> Holy **** that looks so effing good I'm totally doing that!! Presumably he had to drill the top of the cooler?


Just the side of the Shroud


----------



## carlhil2

Do you guys think that the ASIC not being read is by design, to stop people from returning cards because of it?


----------



## Roaches

I'm curious about the Vram usage and FPS figures at 4K in Ark Survival Evolved at Epic settings if anyone here plays that game? Thanks.


----------



## Jpmboy

Quote:


> Originally Posted by *marc0053*
> 
> Yes, Uni block + fan on VRM and on back of PCB


Nice! I'm gonna see if the EK block will fit by just removing the heat sink, but not the shroud. It's gonna be some time before EK blocks get here.








Quote:


> Originally Posted by *CallsignVega*
> 
> JPM if you best this you win our duel as I am spent:
> 
> 
> Spoiler: Warning: Spoiler!


Nice - won't be for a while with the 6700K bud... I just boxed the M8E/6700K and put a microATX on the bench with this 6320... gotta try something.
But I may give it a go with this other rig (6950X). Waaay too much stuff going on here.








My 6700K needs 1.54V for 5.0/4.8 cache. Water cooled, sometimes with an aquarium chiller.


----------



## carlhil2

Quote:


> Originally Posted by *Jpmboy*
> 
> Nice! I'm gonna see if the EK block will fit by just removing the heat sink, but not the shroud. It's gonna be some time before EK blocks get here.
> 
> 
> 
> 
> 
> 
> 
> 
> Nice - won't be for a while with the 6700K bud... I just boxed the M8E/6700K and put a microATX on the bench with this 6320... gotta try something.
> But I may give it a go with this other rig (6950X). Waaay too much stuff going on here.
> 
> 
> 
> 
> 
> 
> 
> 
> My 6700K needs 1.54V for 5.0/4.8 cache. water cooled - sometimes with an aquarium chiller.


Lol, I didn't even see that pic, thanks, now I have my answer...


----------



## Fiercy

Quote:


> Originally Posted by *carlhil2*
> 
> Did anyone who paid for "next day" actually get theirs the next day? also, has anyone put the EK uni-block on their TXP yet?


Got mine next day. Blocks are available for pre-order now, but you won't get one till they ship on August 16.


----------



## dante`afk

Quote:


> Originally Posted by *Diverge*
> 
> I ordered yesterday at around 10-11am EST. Got it today at around 1PM, even though it was supposed to be delivered by 10:30AM.


when did you receive the shipping confirmation after ordering? if there was any at all.


----------



## Foxrun

Quote:


> Originally Posted by *dante`afk*
> 
> when did you receive the shipping confirmation after ordering? if there was any at all.


Not a chance. I paid for next day 10am on Wed and I still haven't received a confirmation. I've called customer service twice and their excuse was the high volume of orders; they have no idea when it will be shipping. Also, fun fact: they have to have another dept approve my demand for a refund on shipping. Their customer service is pathetic


----------



## Diverge

Quote:


> Originally Posted by *dante`afk*
> 
> when did you receive the shipping confirmation after ordering? if there was any at all.


Last night at around 9PM EST. But according to the tracking it actually shipped out much earlier (it just sat at a fedex location for a while):
Quote:


> Date/Time
> Activity Location
> 
> 8/04/2016 - Thursday
> 12:55 pm Delivered Farmingville, NY
> Left at front door. Package delivered to recipient address - release authorized
> 10:47 am On FedEx vehicle for delivery RONKONKOMA, NY
> 9:37 am Delivery exception RONKONKOMA, NY
> 9:36 am At local FedEx facility RONKONKOMA, NY
> 6:52 am At destination sort facility JAMAICA, NY
> 4:59 am Departed FedEx location INDIANAPOLIS, IN
> 12:10 am Arrived at FedEx location INDIANAPOLIS, IN
> 
> 8/03/2016 - Wednesday
> 7:15 pm Left FedEx origin facility ROSEVILLE, MN
> 2:32 pm Picked up ROSEVILLE, MN
> 12:25 pm Shipment information sent to FedEx


----------



## Diverge

Quote:


> Originally Posted by *Foxrun*
> 
> Not a chance. I paid for next day 10am on Wed and I still havent received a confirmation. Ive called customer service twice and they're excuse was the high volume of orders and they have no idea when it will be shipping. Also fun fact they have to have another dept approve my demand for a refund on shipping. Their customer service is pathetic


It's not really Nvidia's customer service (although their sites are a mess, too many different logins). It's Digital River that's handling Nvidia's web store. I called them for a refund too, as it was supposed to be delivered before 10:30AM. I had planned to go to work a little late, and instead ended up sitting around half the day waiting for them.


----------



## DADDYDC650

Quote:


> Originally Posted by *carlhil2*
> 
> Do you guys think that the ASIC not being read is by design, to stop people from returning cards because of it?


Yup and also to stop companies like EVGA from selling cards by ASIC quality.


----------



## GunnzAkimbo

After buying a TXP, you won't want to leave the room.


----------



## GunnzAkimbo




----------



## Darkstar757

165fps Lets go!


----------



## AdamK47

Mine are in.


----------



## carlhil2

Quote:


> Originally Posted by *Darkstar757*
> 
> 165fps Lets go!


Do you know how much OT I would need for 2 of these? ...looking good though...


----------



## techguymaxc

What I don't understand is how people can buy cards like this and not have real watercooling.


----------



## carlhil2

https://www.techpowerup.com/reviews/NVIDIA/Titan_X_Pascal/


----------



## Darkstar757

Quote:


> Originally Posted by *AdamK47*
> 
> Mine are in.


Mind loaning me that cpu.


----------



## Darkstar757

Quote:


> Originally Posted by *techguymaxc*
> 
> What I don't understand is how people can buy cards like this and not have real watercooling.


Waiting on blocks; I am going to do a full custom loop. I love the look of WC, but it's a lot of work for little return these days.


----------



## Darkstar757

Quote:


> Originally Posted by *carlhil2*
> 
> Do you know how much OT that I would need for 2 of these, ..looking good though...


Thanks m8!


----------



## carlhil2

Missed out on the 10-core BW-E, which I wanted BADLY, but I now have my loot ready for the 10-core Skylake-E chip when it drops...







that, and Volta, match made in heaven, for me at least...


----------



## Metros

Quote:


> Originally Posted by *carlhil2*
> 
> Missed out on the 10 core BW-E , which I wanted BADLY, but, I now have my loot ready for the 10 core Skylake-E chip when it drops...
> 
> 
> 
> 
> 
> 
> 
> that, and Volta, match made in heaven, for me at least...


Not long until Volta, expect some news soon


----------



## carlhil2

Quote:


> Originally Posted by *Metros*
> 
> Not long until Volta, expect some news soon


BIG Volta is what I am referring to..


----------



## Darkstar757

Quote:


> Originally Posted by *CallsignVega*
> 
> JPM if you best this you win our duel as I am spent:


Wonder why my score is so much lower at factory defaults. I'm getting 5468


----------



## Menthol

Quote:


> Originally Posted by *Jpmboy*
> 
> Nice! I'm gonna see if the EK block will fit by just removing the heat sink, but not the shroud. It's gonna be some time before EK blocks get here.
> 
> 
> 
> 
> 
> 
> 
> 
> Nice - won't be for a while with the 6700K bud... I just boxed the M8E/6700K and put a microATX on the bench with this 6320... gotta try something.
> But I may give it a go with this other rig (6950X). Waaay too much stuff going on here.
> 
> 
> 
> 
> 
> 
> 
> 
> My 6700K needs 1.54V for 5.0/4.8 cache. water cooled - sometimes with an aquarium chiller.


I removed the shroud like marc did; I didn't try it with the shroud on. Let me know if it fits


----------



## Metros

Quote:


> Originally Posted by *carlhil2*
> 
> BIG Volta is what I am referring to..


That will be 2018 then; however, GV104 will be out next year and that will beat the Titan X

Also, to those going on about microstutter in SLI: why buy two Titan X GPUs then? Instead of saying "enjoy your microstutter" to a GTX 1080 SLI owner, you should be saying it to Titan X owners as well


----------



## Baasha

The Titan X Pascal is an absolute monster!

Behold benchmarks in the real world (wide variety of games) in 4K & 5K w/ 6950X @ 4.30Ghz:


----------



## DADDYDC650

Quote:


> Originally Posted by *carlhil2*
> 
> Missed out on the 10 core BW-E , which I wanted BADLY, but, I now have my loot ready for the 10 core Skylake-E chip when it drops...
> 
> 
> 
> 
> 
> 
> 
> that, and Volta, match made in heaven, for me at least...


Do you do a lot of encoding?


----------



## DADDYDC650

Quote:


> Originally Posted by *Baasha*
> 
> The Titan X Pascal is an absolute monster!
> 
> Behold benchmarks in the real world (wide variety of games) in 4K & 5K w/ 6950X @ 4.30Ghz:


I'm 1 minute into the video and I've heard about 15 "uhhh" "ummm".


----------



## magbarn

I placed my order on launch day in the afternoon. Still haven't received any confirmation. Anyone else here place an order on Tuesday afternoon and received shipping confirmation?


----------



## carlhil2

Quote:


> Originally Posted by *DADDYDC650*
> 
> Do you do a lot of encoding?


Mostly. I charge guys who want to master their music on it, etc. I have invested in very expensive software for this reason... but yeah, that too...


----------



## mouacyk

Quote:


> Originally Posted by *DADDYDC650*
> 
> I'm 1 minute into the video and I've heard about 15 "uhhh" "ummm".


Lol, right? With a 6950X, you'd think he could edit those out or mute them









Still pretty kick-butt performance for a single GPU, dang. I'm eyeing that 130fps+ in BF4 for my new XB271UH 165Hz monitor at 1440p.


----------



## DADDYDC650

Quote:


> Originally Posted by *carlhil2*
> 
> Mostly, I charge guys who want to master their music on it.etc. I have invested in very expensive software for this reason...but, yeah, that too...


Gotcha. Was wondering if it was strictly for gaming.


----------



## carlhil2

Quote:


> Originally Posted by *DADDYDC650*
> 
> Gotcha. Was wondering if it was strictly for gaming.


I would stick to a fast quad for that, which I have. My rig is a do-all PC. I have an in-home studio thing going... I also do mixes for parties/functions..


----------



## DADDYDC650

Quote:


> Originally Posted by *carlhil2*
> 
> I would stick to a fast Quad for that, which, I have. my rig is a do all pc. I have a in home studio thing going...I also do mixes for parties/functions..


Or a 6800k which I own.


----------



## marc0053

Quote:


> Originally Posted by *Testier*
> 
> Which resistors to take out for the shunt mod? I am curious...


Once I get my CLU paste I will take before and after pics of those resistors and post them here.
I also plan on doing a quick live stream of performance before and after, and will post the link here when I do.
For reference, on my GTX 1080 Gigabyte G1 the mod on the 3x resistors lowered the % power utilization from the max of 108% (throttling) to about 95% (non-throttle zone).

my card after the mod with CLU


Pic from Techpowerup
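For anyone wondering why smearing CLU over the shunts changes the power reading at all: the card's power monitor measures the voltage drop across a shunt resistor and assumes a fixed resistance, so lowering the effective resistance (CLU adds a parallel conduction path) makes the reported power drop proportionally. A back-of-envelope sketch; the wattage and resistance numbers here are made up for illustration, not measured from a Titan X Pascal:

```python
# Back-of-envelope: why the shunt mod lowers the card's *reported* power.
# The controller reads V_drop across the shunt and computes I = V / R_assumed,
# so reported power scales by (R_actual / R_assumed) relative to true power.
# All values below are illustrative assumptions, not measured figures.

def reported_power(actual_watts, r_actual_mohm, r_assumed_mohm):
    """Power the card believes it draws after the shunt resistance changed."""
    return actual_watts * (r_actual_mohm / r_assumed_mohm)

stock = reported_power(270, 5.0, 5.0)   # unmodified shunt: reads true power
modded = reported_power(270, 4.3, 5.0)  # CLU in parallel lowers resistance

print(f"stock reading:  {stock:.0f} W")
print(f"modded reading: {modded:.0f} W")  # reads low, so it throttles later
```

Same idea as the 108% -> ~95% drop marc0053 describes: the card is drawing the same current, it just under-reports it, so the power limit kicks in later. (That's also why TiN's warning matters: the limiter is there for a reason.)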


----------



## Maintenance Bot

Quote:


> Originally Posted by *mouacyk*
> 
> I'm eye'ing that 130fps+ in bf4 for my new XB271UH 165Hz monitor at 1440p.


Fps counter was pegged at 165 here, card is a tank for sure.


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> Hmm Hyper I'm not so sure about that. Virtually all tests I've ever seen put the 6700K ahead of the lower clocked 8 to 10 core CPU's for gaming. Remember Skylake is still the fastest architecture per clock and once you put 4.8 GHz on top of that, it's hard to beat.
> 
> Take a look at this that just came out. 6700K cleaned up across the board, even in some games you mentioned:
> 
> http://www.eurogamer.net/articles/digitalfoundry-2016-what-is-the-fastest-gaming-cpu
> 
> http://www.hardocp.com/article/2016/06/24/dx11_vs_dx12_intel_6700k_6950x_framerate_scaling/1#.V6NXiTsrK0o
> 
> I've always gotten the highest/most expensive GPU, but the $1700 6950X has about the worst cost to gaming performance due to its low clocks of about any CPU ever released. Plus with SLI now two cards max and the HB SLI bridge out, you don't need those extra PCI-E lanes/slots either.


That's why I didn't go with the 6950x. Because clocks were lower. Would you be interested in running a couple tests? Games I've tested since getting the 5960x recently have been Crysis 3, GTA V, and Rise of the Tomb Raider. And in all of them, I get substantially better performance with the 5960x at 4.7GHz than my 3770k at 5.2GHz. But as the 3770k is 2 generations behind the 6700k, there may be truth to what you're saying. Just for reference...the 5960x allowed me for the first time, to run Crysis 3's "welcome to the jungle map" at my software imposed 160fps cap. With all settings maxed out.


----------



## Murlocke

Has anyone replaced the TIM yet and posted results? Some people were seeing ~10C drops on the previous TItans.


----------



## DADDYDC650

Quote:


> Originally Posted by *HyperMatrix*
> 
> That's why I didn't go with the 6950x. Because clocks were lower. Would you be interested in running a couple tests? Games I've tested since getting the 5960x recently have been Crysis 3, GTA V, and Rise of the Tomb Raider. And in all of them, I get substantially better performance with the 5960x at 4.7GHz than my 3770k at 5.2GHz. But as the 3770k is 2 generations behind the 6700k, there may be truth to what you're saying. Just for reference...the 5960x allowed me for the first time, to run Crysis 3's "welcome to the jungle map" at my software imposed 160fps cap. With all settings maxed out.


A Broadwell-E running at 4.4Ghz can keep pace with a 6700k around 4.6Ghz and beat it in games that love more cores. Can't beat out those beastly 6700k's running at 4.8 and above though.


----------



## HyperMatrix

Quote:


> Originally Posted by *DADDYDC650*
> 
> A Broadwell-E running at 4.4Ghz can keep pace with a 6700k around 4.6Ghz and beat it in games that love more cores. Can't beat out those beastly 6700k's running at 4.8 and above though.


I'm sitting on a mediocre 5960x that only clocks to 4.7GHz but I find it hard to believe that a 4.8GHz 6700k would beat it in modern games since they all benefit at least slightly from the additional cores. But I am curious about it. And since we have the same video cards, would be neat to test a few games to see how each cpu performs.


----------



## DADDYDC650

Quote:


> Originally Posted by *HyperMatrix*
> 
> I'm sitting on a mediocre 5960x that only clocks to 4.7GHz but I find it hard to believe that a 4.8GHz 6700k would beat it in modern games since they all benefit at least slightly from the additional cores. But I am curious about it. And since we have the same video cards, would be neat to test a few games to see how each cpu performs.


Well, your CPU = a 6900K at around 4.3-4.4GHz. Check out the video below. It doesn't feature the 8-10 core CPUs, but it does pit a 6800K against a 6700K, at the 5 minute 15 second mark.


----------



## ChrisxIxCross

This card is an absolute monster, achieved over 2GHZ OC on air no problem and over 30K graphics score in Firestrike!







I can only imagine what I'll be able to get once I put this beast under water


----------



## HyperMatrix

Quote:


> Originally Posted by *DADDYDC650*
> 
> Well your CPU = a 6900k @ around 4.3-4.4Ghz. Check out the video below. Doesn't feature the 8-10 core CPU but it does pit a 6800k against a 6700k. 5 minute and 15 secs mark.


Not exactly how it works. Because there is a point where higher clocks are more important than more cores. That's how the entire argument for 4 faster cores comes about. The question is whether a similarly clocked 6700K, like the one vega has, will beat out its 8 core counterpart. And whether that performance difference is with HT enabled or disabled. Because my theory is that if 4 real cores + 4 virtual cores are good for gaming, then 8 real cores + 0 virtual cores should be even better. Even with slightly lower clocks (100MHz in this case).
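The tradeoff can be sketched with a toy Amdahl's-law model. The clock speeds match the CPUs being discussed, but the parallel fractions are pure assumptions, not benchmarks:

```python
# Toy Amdahl's-law sketch of clocks vs. cores -- illustrative assumptions only,
# not measurements. A frame's CPU throughput is modeled as clock speed times the
# speedup from cores, where only a fraction p of the work is parallelizable.

def speedup(cores: int, p: float) -> float:
    """Amdahl's law: speedup on `cores` when fraction p of the work is parallel."""
    return 1.0 / ((1.0 - p) + p / cores)

def relative_perf(clock_ghz: float, cores: int, p: float) -> float:
    return clock_ghz * speedup(cores, p)

# 4.8GHz quad core (6700k-style) vs. 4.7GHz octa core (5960x-style):
for p in (0.05, 0.3, 0.9):
    quad = relative_perf(4.8, 4, p)
    octa = relative_perf(4.7, 8, p)
    print(f"p={p:.2f}: quad={quad:.2f}  octa={octa:.2f}  octa wins: {octa > quad}")
```

With only a ~2% clock deficit, the octa core pulls ahead in this model as soon as a game scales even moderately past 4 threads; the quad only wins when almost nothing is parallel.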


----------



## sgs2008

Hoping evga will give us a hybrid cooler for these


----------



## ChrisxIxCross

Quote:


> Originally Posted by *sgs2008*
> 
> Hoping evga will give us a hybrid cooler for these


They have confirmed that they will! Only downside is expected ETA is 1-2 months


----------



## carlhil2

Where are all the peeps that said the TXP wouldn't OC like the other Pascal chips?


----------



## CallsignVega

HT is no good for gaming, I only use 4 pure cores. Remember also Skylake is two architectures more advanced than Haswell and one more advanced than Broadwell. So it's more cores of the 8-10 core chips working against less cores but faster architecture and frequency of Skylake. Skylake is still king until Kaby Lake.


----------



## degenn

Any Canadians in this thread who have ordered one? Wondering what kind of international shipping Nvidia offers, hoping for some fast options like 2-3 day FedEx/UPS Express. I would assume they have such options but ya never know... suppose I will probably have to go through the checkout process to find out...

*Edit- blah nvm they just went out of stock when I began checkout... oh well.


----------



## Testier

Quote:


> Originally Posted by *degenn*
> 
> Any Canadians in this thread who have ordered one? Wondering what kind of international shipping Nvidia offers, hoping for some fast options like 2-3 day FedEx/UPS Express. I would assume they have such options but ya never know... suppose I will probably have to go through the checkout process to find out...
> 
> *Edit- blah nvm they just went out of stock when I began checkout... oh well.


Ordered on 8/2, got the card today. Some people got it yesterday. It was only a bit more than ground.


----------



## DADDYDC650

Quote:


> Originally Posted by *HyperMatrix*
> 
> Not exactly how it works. *Because there is a point where higher clocks are more important than more cores.* That's how the entire argument for 4 faster cores comes about. The question is whether a similarly clocked 6700K, like the one vega has, will beat out its 8 core counterpart. And whether that performance difference is with HT enabled or disabled. Because my theory is that if 4 real cores + 4 virtual cores are good for gaming, then 8 real cores + 0 virtual cores should be even better. Even with slightly lower clocks (100MHz in this case).


I agree which is why I pointed out that a high clocked 6700k is king. My response was a little off-topic though so my bad. I was watching Game of Thrones so I didn't read the posts correctly.


----------



## Swolern

Can anyone run a bench of Heaven @ 3440x1440 on a single TXP and post? I haven't seen any yet.


----------



## Testier

Quote:


> Originally Posted by *Swolern*
> 
> Can anyone run a bench of Heaven @ 3440x1440 on a single TXP and post? I havent seen them yet.


Ok give me a sec.


----------



## Evo X

Installed in my system and had some time to do some testing today.

My god this card is incredible!

After adjusting the fan curve, I'm running a rock solid 2Ghz on the core and freaking 11.2Ghz on the memory!

It's tearing through every game I throw at it at 3440x1440 100hz.

Huge upgrade from my mediocre overclocking Maxwell Titan X. Very happy right now.


----------



## techguymaxc

Quote:


> Originally Posted by *Metros*
> 
> That will be 2018 then, however GV104 will be out next year and that will beat the Titan X
> 
> Also to those who are saying about microstutter in SLI, why buy two Titan X GPUs then, instead of saying "enjoy your microstutter" to a GTX 1080 SLi owner, you should be saying it to Titan X owners as well


You appear not to understand how microstutter works... If single card A achieves a good deal less than 60 FPS at given settings and card A in SLI gets above 60, your FPS will be high but so will your variance. Not the case with single card B getting at or close to 60 FPS on its own, let alone in SLI.

Obvious examples here being 4k GP104 vs GP102.
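A toy frame-time calculation (the numbers are made up) shows why average FPS alone hides this:

```python
# Toy illustration with made-up frame times: two runs with the same average FPS
# can feel completely different if frame-time variance differs -- that variance
# is the microstutter.
from statistics import mean, stdev

smooth = [16.7] * 6                              # steady pacing, ~60 FPS
stutter = [10.0, 25.0, 10.0, 25.0, 10.0, 20.2]  # same average, uneven pacing

for name, times in (("smooth", smooth), ("stutter", stutter)):
    print(f"{name}: avg {1000 / mean(times):.1f} FPS, "
          f"frame-time stdev {stdev(times):.2f} ms")
```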


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> HT is no good for gaming, I only use 4 pure cores. Remember also Skylake is two architectures more advanced than Haswell and one more advanced than Broadwell. So it's more cores of the 8-10 core chips working against less cores but faster architecture and frequency of Skylake. Skylake is still king until Kaby Lake.


I get the theory craft behind what you're saying. You're not willing to put it to the test though?


----------



## carlhil2

Just received my tracking #, says by 3pm tomorrow...


----------



## sgs2008

Quote:


> Originally Posted by *ChrisxIxCross*
> 
> They have confirmed that they will! Only downside is expected ETA is 1-2 months


Oh sweet, that gives me some time to save up a bit more since I'll be getting two of these


----------



## Testier

Quote:


> Originally Posted by *Swolern*
> 
> Can anyone run a bench of Heaven @ 3440x1440 on a single TXP and post? I havent seen them yet.




I was getting around 1.9-1.95ghz core, 11100mhz mem.


----------



## degenn

Quote:


> Originally Posted by *Testier*
> 
> Ordered on 8/2, got the card today. Some people got it yesterday. It was only a bit more than ground.


Cheers. Which carrier do they use? What was duty/brokerage like or did you get lucky and only have to pay local taxes?


----------



## ssgwright

Where the hell can I get this thing? Been looking everywhere.


----------



## degenn

Quote:


> Originally Posted by *ssgwright*
> 
> where the hell can I get this thing been looking everywhere.


geforce.com but they are out of stock right now.


----------



## Swolern

Quote:


> Originally Posted by *Testier*
> 
> 
> 
> I was getting around 1.9-1.95ghz core, 11100mhz mem.


Damn with 8xMSAA too! Awesome! Thanks man. Repped!


----------



## EniGma1987

Some geek pr0n for you all













and high res downloads in case on-site upload compresses them:
https://dl.dropboxusercontent.com/u/73615195/TitanXpictures/3quarters%20wallpaper.jpg
https://dl.dropboxusercontent.com/u/73615195/TitanXpictures/Die%20Shot%20Wallpaper.jpg
https://dl.dropboxusercontent.com/u/73615195/TitanXpictures/shroud%20with%20GPU%20front.jpg
https://dl.dropboxusercontent.com/u/73615195/TitanXpictures/shround%20with%20GPU%20behind.jpg
https://dl.dropboxusercontent.com/u/73615195/TitanXpictures/Die%20shot%20mirror.jpg
https://dl.dropboxusercontent.com/u/73615195/TitanXpictures/board%20picture.jpg

The stock heatsink base looks terrible though:


----------



## Testier

So the Titan X is marked GP102-400-A1. As the -400 part is usually the top-end consumer part, I don't think we will see a full GP102 on revision A1.


----------



## CallsignVega

Quote:


> Originally Posted by *HyperMatrix*
> 
> I get the theory craft behind what you're saying. You're not willing to put it to the test though?


Which tests? Games or benchmarks?


----------



## Fiercy

I did some benches between overclocked Maxwell and Pascal Titans under different CPU clocks put the entire thing into a review.
http://www.overclock.net/products/nvidia-titan-x-pascal/reviews/7414


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> Which tests? Games or benchmarks?


Games of course. Want to see performance difference on a few new titles. Because in theory...and I think we're all getting tired of theories...the performance gains in games that support more cores should be much higher than any potential performance loss in games that only utilize 4 cores. I started testing this stuff when I realized I got better performance in the division with HT on. The last few years I'd been going with HT off because it was a no brainer.


----------



## EniGma1987

Quote:


> Originally Posted by *Testier*
> 
> Which resistors to take out for the shunt mod? I am curious...


I know it has been discussed already but here is an actual picture where you can see the three 5 mΩ shunt resistors you would need to cover with CLU to lower the power target reading (giving you 30-40% more PT headroom)



Nvidia always names them RS1, RS2, and RS3, so you can easily find them on all Nvidia cards going back many generations now, and to my knowledge they are always 5 mΩ resistors too.
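A rough back-of-the-envelope for why bridging the shunts adds headroom. The bridge resistance here is a pure guess; the values are illustrative only:

```python
# Why shunting the shunts works, in rough numbers. The card estimates power from
# the voltage drop across a 5 mOhm sense resistor; a blob of liquid metal adds a
# parallel path, so the real resistance falls while the controller still assumes
# 5 mOhm and under-reads power. The bridge resistance below is an assumption.

def parallel(r1: float, r2: float) -> float:
    """Effective resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

R_SHUNT = 0.005   # stock 5 mOhm shunt
R_BRIDGE = 0.015  # assumed resistance of the CLU blob (illustrative)

r_eff = parallel(R_SHUNT, R_BRIDGE)
seen = r_eff / R_SHUNT        # fraction of real power the controller reports
headroom = 1.0 / seen - 1.0   # extra real power at the same reported power target

print(f"effective shunt: {r_eff * 1000:.2f} mOhm")
print(f"controller sees {seen:.0%} of real power -> ~{headroom:.0%} more headroom")
```

With those guessed values the controller under-reads by a quarter, which lands in the 30-40% extra PT headroom ballpark mentioned above.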


----------



## CallsignVega

Quote:


> Originally Posted by *HyperMatrix*
> 
> Games of course. Want to see performance difference on a few new titles. Because in theory...and I think we're all getting tired of theories...the performance gains in games that support more cores should be much higher than any potential performance loss in games that only utilize 4 cores. I started testing this stuff when I realized I got better performance in the division with HT on. The last few years I'd been going with HT off because it was a no brainer.


Ya sure. Which games?
Quote:


> Originally Posted by *EniGma1987*
> 
> I know it has been discussed already but here is an actual picture where you can see the three 5 mΩ shunt resistors you would need to cover with CLU to lower the power target reading (giving you 30-40% more PT headroom)


Did you do it to yours?


----------



## Evo X

Here is Timespy side by side with both my old Maxwell and new Pascal Titan. Both overclocked near their limit. Not a bad gain at all.


----------



## EniGma1987

Quote:


> Originally Posted by *CallsignVega*
> 
> Did you do it to yours?


Sadly I seem to have thrown out my remaining CLU when I moved a couple months ago. I have some ordered, but there is no Prime shipping and 5-day shipping costs $45, so I won't get any to do the mod with for a bit over a week









Mouser already shipped my resistors though, so I may just solder resistors on and go that route since it is the same thing, only more permanent and a bit more difficult. We'll see. I am most definitely hitting the PT limit hard in benchmarks, but in actual games I am hitting the temp limit really hard. Kinda interesting. I have a waterblock ordered already for the temp problem and supplies on the way for the PT problem, so it will all be good soon enough.


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya sure. Which games?
> Did you do it to yours?


A few I'm curious about. Rise of the tomb raider. GTA V (though I'm not sure how we could replicate the same scene?). Crysis 3, particularly "welcome to the jungle" map as you get off the elevator before the minefield. And the division. Also if you have any games in mind that you'd like me to test. At the end of the day...overclock.net testing and benching is far more reliable than what you find on other websites.


----------



## HyperMatrix

Quote:


> Originally Posted by *EniGma1987*
> 
> Sadly I seem to have thrown out my remaining CLU when I moved a couple months ago. I have some ordered but there is no Prime shipping and 5 day shipping costs $45, so I wont get any to do the mod with for a bit over a week
> 
> 
> 
> 
> 
> 
> 
> 
> Mouser already shipped my resistors though, so I may just solder resistors on and go that route since it is the same thing, only more permanent and a bit more difficulty. We'll see. I am most definitely hitting PT limit hard in benchmarks, but in actual games I am hitting temp limit really hard. Kinda interesting. I have a waterblock ordered already for the temp problem and supplies on the way for PT problem, so it will all be good soon enough.


Definitely let us know how it goes. I'm down for CLU but don't want to hard mod like I did with my maxwell Titan Xs. Which was fun. Especially with a variable pot to adjust voltage. But would prefer to avoid doing that again.


----------



## Sheyster

Quote:


> Originally Posted by *Evo X*
> 
> Here is Timespy side by side with both my old Maxwell and new Pascal Titan. Both overclocked near their limit. Not a bad gain at all.


Not bad = an almost 60% gain in that benchmark! That's outstanding IMHO.









My card will arrive on Monday. I cheaped out and took the free shipping option.


----------



## carlhil2

Quote:


> Originally Posted by *Sheyster*
> 
> Not bad = an almost 60% gain in that benchmark! That's outstanding IMHO.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My card will arrive on Monday. I cheaped out and took the free shipping option.


Blasphemy... ?


----------



## Sheyster

Quote:


> Originally Posted by *CallsignVega*
> 
> HT is no good for gaming, I only use 4 pure cores.


It depends on the game. HT on is good for FPS in some games but not others.


----------



## DADDYDC650

Sorry for being off topic but I was cleaning out my parents' 15-year-old desktop that I built (AMD Athlon XP 2800+) and adding an SSD, and I came across this dead lil fella.... YIKES!!!


----------



## CallsignVega

Quote:


> Originally Posted by *HyperMatrix*
> 
> A few I'm curious about. Rise of the tomb raider. GTA V (though I'm not sure how we could replicate the same scene?). Crysis 3, particularly "welcome to the jungle" map as you get off the elevator before the minefield. And the division. Also if you have any games in mind that you'd like me to test. At the end of the day...overclock.net testing and benching is far more reliable than what you find on other websites.


Hmm, unless the game has a canned benchmark not sure how we would test. Unless you just let a level load and not move or something. I do know that playing Crysis 3 on my setup my GPU's are both pegged at 99%, so no CPU limiting.


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> Hmm, unless the game has a canned benchmark not sure how we would test. Unless you just let a level load and not move or something. I do know that playing Crysis 3 on my setup my GPU's are both pegged at 99%, so no CPU limiting.


Even the Welcome to the Jungle level? And this is with max graphics? Best way to compare would be stock clocks on the video cards. Exit the elevator and look around the area. The grass is usually a bit of an fps killer. Check the highest (without looking directly at the ground/sky, haha) and lowest fps you get. Run the game at 1440p to give your cards a break, and since that's the highest resolution I can run without DSR.

Tomb raider has a built in benchmark. Not sure if it's a good one, since the game comes to a crawl in Soviet station and other open world areas. Although the new dx12 multi-gpu support has resolved that problem too.


----------



## carlhil2

Quote:


> Originally Posted by *DADDYDC650*
> 
> Sorry for being off topic but I was cleaning out my parents' 15-year-old desktop that I built (AMD Athlon XP 2800+) and adding an SSD, and I came across this dead lil fella.... YIKES!!!


Lunch for this afternoon?


----------



## DADDYDC650

Quote:


> Originally Posted by *carlhil2*
> 
> Lunch for this afternoon?


LoL, not a bad idea. I let my dog eat it. Crickets are a good source of protein.


----------



## CallsignVega

Quote:


> Originally Posted by *HyperMatrix*
> 
> Even the welcome to the jungle level? And this is with max graphics? Best way to compare would be stock clocks on video cards. Exit elevator and look around the area. The grass is usually a bit of an fps killer. . Check highest (without looking directly at the ground/sky. Haha) and lowest fps you get. Run the game at 1440p to give your cards a break, and since that's the highest resolution I can run without dsr.
> 
> Tomb raider has a built in benchmark. Not sure if it's a good one, since the game comes to a crawl in Soviet station and other open world areas. Although the new dx12 multi-gpu support has resolved that problem too.


Don't have Tomb Raider. Where is that Crysis level, how far into the game? Any way I can just jump to it?


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> Don't have Tomb Raider. Where is that Crysis level, how far into the game? Any way I can just jump to it?


I think it's the second or third level. If you've played it before, you can jump to any level you want. Do you have hitman? That has a benchmark built in.

Any games you want me to test?


----------



## ratzofftoya

Quote:


> Originally Posted by *DADDYDC650*
> 
> A Broadwell-E running at 4.4Ghz can keep pace with a 6700k around 4.6Ghz and beat it in games that love more cores. Can't beat out those beastly 6700k's running at 4.8 and above though.


How does the 6800K compare?


----------



## Testier

For GTA V benchmarking, the beginning drive with Lamar takes you through the city and it's decently predictable. It should easily bring a Titan X to its knees maxed out.


----------



## DADDYDC650

Quote:


> Originally Posted by *ratzofftoya*
> 
> How does the 6800K compare?


5 minutes and 15 seconds mark.


----------



## HyperMatrix

Quote:


> Originally Posted by *Testier*
> 
> For GTA V benchmarking, the beginning drive with Lamar takes you through the city and its decently predictable. It should easily bring a titan x to its knees on maxed.


We're testing with 2 Titans though. In order to bring out the CPU bottlenecking.







But yeah that's a really good idea. Just have to run same graphics settings.


----------



## lilchronic

Quote:


> Originally Posted by *HyperMatrix*
> 
> We're testing with 2 Titans though. In order to bring out the CPU bottlenecking.
> 
> 
> 
> 
> 
> 
> 
> But yeah that's a really good idea. Just have to run same graphics settings.


Here is a spot in Crysis 3 that is really CPU demanding. @ 1080p my 5820k is bottlenecking my Titan X Maxwell.


With an i5 6600k @ 4.8Ghz at the same spot I get around 71fps, and a 6700k should be around 85fps. So if you have enough gpu power to drive that many frames @ 1440p 144hz, a 6700k is not going to cut it.


----------



## ratzofftoya

They're here....


----------



## HyperMatrix

Quote:


> Originally Posted by *lilchronic*
> 
> Here is a spot in crysis 3 where it is really cpu demanding @ 1080p my 5820k is bottlenecking my Titan x Maxwell
> 
> 
> With a i5 6600k at the same spot i get around 71fps and a 6700k should be around 85 fps if you have enough gpu power to drive that many frames @ 1440p 144hz a 6700k is not going to cut it.


I'm pretty sure I ran through that area maxed out at the 160fps cap I had put in place, even on my Tri-SLI Maxwell Titans. I'll check with the new cards now in case I'm full of crap.







It is a real possibility.


----------



## ratzofftoya

Wow, so are we really saying that Z170 outperforms X99 for gaming?


----------



## HyperMatrix

Quote:


> Originally Posted by *ratzofftoya*
> 
> They're here....


What on God's green earth happened to Lays...


----------



## Testier

Quote:


> Originally Posted by *HyperMatrix*
> 
> We're testing with 2 Titans though. In order to bring out the CPU bottlenecking.
> 
> 
> 
> 
> 
> 
> 
> But yeah that's a really good idea. Just have to run same graphics settings.


Run in circle in emerald city fallout 4?


----------



## lilchronic

Quote:


> Originally Posted by *HyperMatrix*
> 
> I'm pretty sure I ran through that area maxed out at the 160fps cap I had put in place, even on my Tri-SLI Maxwell Titans. I'll check with the new cards now in case I'm full of crap.
> 
> 
> 
> 
> 
> 
> 
> It is a real possibility.


It's right after you take a zipline on Welcome to the Jungle.
Or right before you get on the gas tank train car and ride it through the tunnel.

You see that cpu usage though? 70-90% on all cores while gpu usage was at 89%.
With the 6600k, cpu usage is pegged @ 90-100%


----------



## HyperMatrix

Quote:


> Originally Posted by *Testier*
> 
> Run in circle in emerald city fallout 4?


Played Fallout 4 for about 1 hour before I got bored of it and never touched it again. Sorry.


----------



## Testier

Quote:


> Originally Posted by *ratzofftoya*
> 
> Wow, so are we really saying that Z170 outperforms X99 for gaming?


its not news....

people get x99 for PCIE lanes and other stuff...


----------



## ratzofftoya

Quote:


> Originally Posted by *Testier*
> 
> its not news....
> 
> people get x99 for PCIE lanes and other stuff...


Yeah, I got the 5960X because I wanted 3x 980Ti and the Intel 750. But now that I have 2 Titan Xs, should I switch to Z170?


----------



## renejr902

Quote:


> Originally Posted by *renejr902*
> 
> I tried SEVERAL games 20-30min each. No stuttering at all in all these games at 4k with maximum graphics setting even rise of the tomb raider:
> 
> Project cars, rise of the tomb raider, dragon age inquisition, risen 3...
> 
> No stuttering at all in any of these games, i use adaptive vsync.
> 
> I just thought I'd update my Witcher 3 from the old 1.04 or 1.05 version to the latest version a few days ago. Maybe the latest patch caused me this. I will uninstall it later today, reinstall it with version 1.05 and compare. I have a little less stuttering right now, I used the stuttering recommendations for this game on the net, still not satisfied; I will tell you news about the stuttering after trying the 1.05 version. Thanks so much everybody for your help, it's so much appreciated. I read all your comments and recommendations. My 100% final overclock result is: core +210, mem +700, for 547gb/s of bandwidth, BETTER and FASTER than the Quadro 12gb HBM2 version at 540gb/s. Who said HBM2 is faster than GDDR5X? LoL
> 
> Note: i have 6942 result for firestrike ultra, im a little disappointed it seems overclocked one in average got 7200+ , mine is overclocked to max, maybe i lost a few points because of my memory ddr3 and i5 4690, but i dont think a i5 4690 should bottleneck my titanx in 4k gaming, in 1440p it could be probably
> 
> I pray to get Witcher 3 working with no stuttering later today. Pray for me guys!


Quote:


> Originally Posted by *Murlocke*
> 
> 1.05 is super old and lacks a lot of optimization they put into the game. The latest patch should perform the best.
> 
> Disable hairworks if you haven't, make sure you have FPS limit set to unlimited. Try doing a fresh reinstall of the latest TItan X driver and check "clean install".
> 
> My card still caps out at a super low +300 RAM. I tested +350 a bit, and I ended up getting some artifacts after about 30 minutes even with +0 on the core. I get 7300ish in Fire Strike with +200/+300. Definitely your processor, benchmarks are not the same as games.
> 
> I have heard that newer processor smooth out FPS significantly in Witcher 3 and GTA 5. Unless you stated the wrong model, your processor can't be overclocked. It's probably your processor that's causing the stuttering and random drops. Most people buying Titan X overclock their processors and/or are running a newer processor with higher stock clocks.
> You are right. I just bought my 6700k about 2 weeks ago and I researched extensively on this topic. Money wasn't in the question, and I still went with a 6700k.


I found the solution after losing 3 hours in Windows 7. I went to my Windows 10 partition, reinstalled The Witcher 3 in Windows 10, and no more stuttering. I don't know what the Windows 7 problem was, but it's not important, I will continue the game on Windows 10. Thanks for the help everybody


----------



## cookiesowns

Quote:


> Originally Posted by *ratzofftoya*
> 
> Yeah, I got the 5960X because I wanted 3x 980Ti and the Intel 750. But now that I have 2 Titan Xs, should I switch to Z170?


Don't think so. I would still take a 5960X or 6900K over a 6700K for gaming, even with 2x Titan XPs, assuming you have a high-binned CPU: at least 4.4Ghz on either BW-E or HW-E.

A 6700K @ 4.8+ will still give you better scores in certain benchmarks, eg: Valley, but in gaming we're talking no more than a few FPS, or 10FPS in CPU-demanding titles.


----------



## HyperMatrix

Quote:


> Originally Posted by *lilchronic*
> 
> Its right after you take a zipline on welcome to the jungle.
> Or right before you get on the gas tank train car and ride it through the tunnel.
> 
> You see that cpu usage though 70-90% on all cores while gpu usage was at 89%.
> With the 6600k cpu usage is pegged @ 90-100%


Ok. I have a good idea of where it is now. Let me find it. Also, what graphics settings were you using?


----------



## Testier

Quote:


> Originally Posted by *ratzofftoya*
> 
> Yeah, I got the 5960X because I wanted 3x 980Ti and the Intel 750. But now that I have 2 Titan Xs, should I switch to Z170?


I wont bother imo.


----------



## renejr902

Oh man! I just played 15min, the difference is like night and day.
I'm so happy right now!!!!!
I didn't believe it was a CPU problem.
GPU load is 100% and CPU is between 40 and 60%.

So I had to search for a solution in software... I think I'm going to start loving Windows 10









The game runs pretty smooth, solid 60 most of the time. Battle is 55-59fps.
I don't have the fps drops I had in Windows 7. No severe fps drops in Windows 10.

The game has never gone below 50fps so far. No HairWorks, no AA, in 4k. Most other settings are at maximum. I play with vsync on, but vsync off is still playable, just with more tearing. In 15min of playing I saw maybe 3 or 4 very little stutters, nothing important.

Now I'm happy with my Titan X, all other games run flawlessly, even Rise of the Tomb Raider


----------



## HyperMatrix

Quote:


> Originally Posted by *lilchronic*
> 
> Here is a spot in crysis 3 where it is really cpu demanding @ 1080p my 5820k is bottlenecking my Titan x Maxwell
> 
> 
> With a i5 6600k @ 4.8Ghz at the same spot i get around 71fps and a 6700k should be around 85 fps. So if you have enough gpu power to drive that many frames @ 1440p 144hz a 6700k is not going to cut it.


Tried to get as close to where you were standing/looking. But I think I came back to it after the destruction happened. So there's a lot more going on in terms of explosions/smoke on my screen. This is at 2560x1440. Everything completely maxed out. Except AA. There is no AA. And I guess I lied. It's below 160 fps. But it may be a GPU limitation.


----------



## lilchronic

Quote:


> Originally Posted by *HyperMatrix*
> 
> Ok. I have a good idea of where it is now. Let me find it. Also, what graphics settings were you using?


Everything maxed out, but AA is disabled.

I'm purposely trying to cause a bottleneck, so I'm running a lowered resolution until my card drops its usage below 99%.


----------



## lilchronic

Quote:


> Originally Posted by *HyperMatrix*
> 
> Tried to get as close to where you were standing/looking. But I think I came back to it after the destruction happened. So there's a lot more going on in terms of explosions/smoke on my screen. This is at 2560x1440. Everything completely maxed out. Except AA. There is no AA. And I guess I lied. It's below 160 fps. =D


Try standing on that ledge you walk out on with the railing and look at that bush that is to the left of your gun in the current screen cap.


----------



## HyperMatrix

Quote:


> Originally Posted by *lilchronic*
> 
> Try standing on that ledge you walk out on with the railing and look at that bush that is to the left of your gun in the current screen cap.


I actually ran to this spot from a saved game. This is where I'm supposed to follow the dude to get on the train. It won't let me up on the ledge. Is that before or after where I am now, time-wise?


----------



## lilchronic

Quote:


> Originally Posted by *HyperMatrix*
> 
> I actually ran to this spot from a saved game. This is where I'm supposed to follow the dude to get on the train. It wont let me up on the ledge. Is that before or after where I am now, time-wise?


Yeah, it's hard to get back up on that ledge, but I think I have done it before.

It's strange you're getting so much fps, as every time I try and look at that spot in the game it hammers my cpu.

... wait, is that an 8 core cpu? Mine is a 5820k.


----------



## HyperMatrix

Quote:


> Originally Posted by *lilchronic*
> 
> Yeah it's hard to get back up on that ledge but i think i have done it before.
> 
> It's strange your getting so much fps as every time i try and look at that spot in the game it hammers my cpu?
> 
> ... wait is that a 8 core cpu? mine is 5820k.


Yeah. 5960x. I found the exact spot you were. And yeah it's definitely an FPS killer. The comparison testing challenge came after CallsignVega said a 4.8GHz 6700k would perform better than a 4.7GHz 5960x. So we're just testing to see if cores matter or not.


----------



## Captivate

Quote:


> Originally Posted by *Testier*
> 
> 
> 
> I was getting around 1.9-1.95ghz core, 11100mhz mem.


Is that right? I'm getting this with SLI 780s (not even Ti's):



I'm sure games wouldn't have good SLI scaling like the benchmarks have, but still, doesn't really warrant such an upgrade (money wise)


----------



## HyperMatrix

Quote:


> Originally Posted by *Captivate*
> 
> Is that right? I'm getting this with SLI 780s:


Can you run resolutions your monitor doesn't support in this test? If so, I'll try it out for you.


----------



## lilchronic

Quote:


> Originally Posted by *HyperMatrix*
> 
> Yeah. 5960x. I found the exact spot you were. And yeah it's definitely an FPS killer. The comparison testing challenge came after CallsignVega said a 4.8GHz 6700k would perform better than a 4.7GHz 5960x. So we're just testing to see if cores matter or not.


Nice
This was a few years ago with my 3570k @ 5Ghz and sli 670's
51FPS


and 4790k @ 5Ghz
86FPS so i think a 6700k should be around 95 - 100 fps at around 5 ghz


----------



## Captivate

Quote:


> Originally Posted by *HyperMatrix*
> 
> Can you run resolutions your monitor doesn't support in this test? If so, I'll try it out for you.


Hmm, what do you mean? I'm using 3440x1440 as my gaming resolution, any other resolution's result is not really important to me.


----------



## intrigger

Hi everyone,

Quick question for you. I have a 5960x capable of 4.8ghz, and a 6950x that can reach 4.4ghz on a smaller build (with less cooling capability than the main build). Currently the big/main system (with 5960x) has 3 x Titan X Maxwell, which are soon to be replaced with 2 x Titan XP (and also considering keeping a Maxwell Titan X for PhysX, is that a good idea??).

Should I switch to the 6950x to my main system or stick with the 5960x? I think I will test in any case because the big system has some serious cooling dedicated to the CPU and Motherboard only (2 x 480s and 2 x 360s with dual pumps), so there is a chance I can get the 6950x to 4.5ghz if I am lucky.

Last point. On the big build (the one I am contemplating switching to the 6950x), I have 3 x Acer Predator 27 inch 144hz Gsync displays that I run in surround for gaming. I guess this would diminish the chance of a CPU bottleneck?

The smaller build currently has 2 x 980 Ti Strix, switching those in a few days to GTX 1080 FTWs, and running an Acer 34 inch Gsync display (this is the system I will install the 5960x in; it currently houses the 6950x).

Cheers


----------



## HyperMatrix

Quote:


> Originally Posted by *intrigger*
> 
> Hi everyone,
> 
> Quick question for you. I have a 5960x capable of 4.8ghz, and a 6950x that can reach 4.4ghz on a smaller build (with less cooling capability than the main build). Currently the big/main system (with 5960x) has 3 x Titan X Maxwell, which are soon to be replaced with 2 x Titan XP (and also considering keeping a Maxwell Titan X for PhysX, is that a good idea??).
> 
> Should I switch to the 6950x to my main system or stick with the 5960x? I think I will test in any case because the big system has some serious cooling dedicated to the CPU and Motherboard only (2 x 480s and 2 x 360s with dual pumps), so there is a chance I can get the 6950x to 4.5ghz if I am lucky.
> 
> Last point. On the big build (that I am contemplating switching to 6950x), I have 3 x Acer predator 27 inch 144hz Gsync displays that I run in surround for gaming). I guess this would diminish the chance of CPU bottleneck?
> 
> The smaller build currently has 2 x 980ti strix, and switching those in the a few days to GTX 1080 FTW... And running a acer 34 inch gsync display on this (this is the system I will install the 5960x in and it currently houses the 5950x)..
> 
> Cheers


I could end up being wrong but I'd go with the 5960x at 4.8GHz over a 6950x at 4.4GHz for "current gaming." This way you're covered for both multi-threaded games, and also for those that rely on clockspeed. But we're still talking about minor differences here. And the 6950x is more futureproof with DX12, especially if you can hit 4.5GHz. I wouldn't bother keeping any card for dedicated PhysX. However, it can be handy for DX12 multi-gpu enabled games. Because they bypass SLI and distribute the workload as they see fit. So technically for DX12 games that support multi-gpu, even 4 Titan X's can work without issue. Since you look like you have money to throw around...keep that option in mind as more DX12 games come out. The only one currently that actually does a good job of multi-threaded distribution and multi-gpu, is Rise of The Tomb Raider. And to hit 165Hz 1440p, it's already running my 4.7GHz 5960x at 90%+ in certain areas.

But that's a lot of hypotheticals. The new Titan is great. I went from 3 Maxwell Titans to 2 of the new ones. While they're amazing individual cards... honestly... they're only about 33% faster than my old setup. So while this generation of cards is amazing, the limited number of cards you can run in SLI still puts a pretty hard cap on maximum GPU performance for high-end gaming. Unless, again, we're talking about DX12, which makes your 6950x the better CPU in all cases and would let you run up to 4 Pascal Titan X's under multi-GPU.

Edit: Another idea....test out 2 Pascal Titans and 2 Maxwell Titans in multi-gpu with Rise of The Tomb Raider DX12 Multi-GPU and see if you get any gains. If you do...just keep 2 of your Maxwell cards for now.


----------



## Gary2015

How are the overclocks on water?


----------



## renejr902

Quote:


> Originally Posted by *renejr902*
> 
> Oh man! I just played 15 min, the difference is like night and day.
> I'm so happy right now!!!!!
> I didn't believe it was a CPU problem.
> GPU load is 100% and CPU between 40 and 60%.
> 
> So I had to search for a solution in software... I think I will begin to love Windows 10.
> 
> The game runs pretty smooth, solid 60 most of the time. Battle is 55-59fps.
> I don't have the FPS drops I had on Windows 7. No severe FPS drops in Windows 10.
> 
> The game never went below 50fps so far. No HairWorks, no AA, in 4K. Most other settings are at maximum. I play with vsync on, but vsync off is still playable, just with more tearing. In 15 min of playing I saw maybe 3 or 4 very small stutters, nothing important.
> 
> Now I'm happy with my Titan X; all other games run flawlessly, even Rise of the Tomb Raider.


I just found something great !
I changed PhysX to GPU instead of Auto or CPU, and The Witcher 3 runs more smoothly: no more FPS drops at all, no more stuttering at all, and more stable FPS. COULD IT BE A PROBLEM to set PhysX to GPU?? Thanks for answering; I don't know if it's a good thing to do, even if my FPS is much better. I'd also appreciate it if someone could explain why I get better FPS performance with PhysX on the GPU.

Thanks for answers guys


----------



## HyperMatrix

Quote:


> Originally Posted by *Testier*
> 
> Run in circle in emerald city fallout 4?


I ran around in Lexington in Fallout 4. Like I said, I haven't really played the game, so that's the biggest city area I could find. With all settings fully maxed out, as I entered the city and ran around, FPS would fluctuate between 100 and 165fps (G-Sync cap), with a couple of dips down to 80fps during sprinting, which lowers FPS for some reason. I looked at CPU usage, and the game is very heavily loaded on one core. Not even 2. But 1. And once that core is maxed out, performance can't go any higher.
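One way to confirm this kind of single-core bottleneck from the OS side is comparing per-core busy time between two snapshots. A rough sketch, assuming Linux-style `/proc/stat` text; the `per_core_busy` helper and the sample snapshots are purely illustrative, not from any tool mentioned in the thread:

```python
# Compare two /proc/stat snapshots taken a few seconds apart while the
# game runs; if one core shows ~100% busy while the rest idle, the game
# is bound to a single thread (as described above for Fallout 4).

def per_core_busy(before: str, after: str) -> dict:
    """Return the busy fraction of each core between two snapshots."""
    def parse(text):
        cores = {}
        for line in text.splitlines():
            fields = line.split()
            # per-core lines look like: cpu0 user nice system idle iowait ...
            if fields and fields[0].startswith("cpu") and fields[0] != "cpu":
                vals = [int(v) for v in fields[1:]]
                idle = vals[3] + (vals[4] if len(vals) > 4 else 0)
                cores[fields[0]] = (sum(vals), idle)
        return cores

    first, second = parse(before), parse(after)
    busy = {}
    for core, (total2, idle2) in second.items():
        total1, idle1 = first[core]
        dt = total2 - total1
        busy[core] = (dt - (idle2 - idle1)) / dt if dt else 0.0
    return busy

# Synthetic snapshots: cpu0 pegged, cpu1 nearly idle.
before = "cpu0 100 0 100 800 0 0 0 0\ncpu1 10 0 10 980 0 0 0 0"
after = "cpu0 600 0 600 800 0 0 0 0\ncpu1 20 0 20 1960 0 0 0 0"
print(per_core_busy(before, after))  # cpu0 -> 1.0 (maxed), cpu1 -> 0.02
```

On a live Linux box you would read `/proc/stat` twice with a short sleep between reads; on Windows, the per-core graphs in Task Manager or an OSD tool tell the same story.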


----------



## renejr902

Quote:


> Originally Posted by *renejr902*
> 
> I just found something great !
> I changed PhysX to GPU instead of Auto or Cpu, and witcher 3 run more smooth , no more fps drop at all, no more stuttering at all and fps more stable. COULD IT A PROBLEM to change PHYSX to GPU?? Thanks for answer, i dont know if its a good thing to do that, even if my fps are much better. I will appreciate too if someone can explain me why i get a better fps performance with physx to gpu.
> 
> Thanks for answers guys


Maybe it's another setting that improved the FPS. It seems you can't force PhysX to GPU with Witcher 3... Hmmm


----------



## st0necold

Quote:


> Originally Posted by *intrigger*
> 
> Hi everyone,
> 
> Quick question for you. I have a 5960x capable of 4.8ghz, and a 6950x that can reach 4.4ghz on a smaller build (with less cooling capability than the main build). Currently the big/main system (with 5960x) has 3 x Titan X Maxwell, which are soon to be replaced with 2 x Titan XP (*and also considering keeping a Maxwell Titan X for PhysX, is that a good idea??)*.
> 
> Should I switch to the 6950x to my main system or stick with the 5960x? I think I will test in any case because the big system has some serious cooling dedicated to the CPU and Motherboard only (2 x 480s and 2 x 360s with dual pumps), so there is a chance I can get the 6950x to 4.5ghz if I am lucky.
> 
> Last point. On the big build (that I am contemplating switching to 6950x), I have 3 x Acer predator 27 inch 144hz Gsync displays that I run in surround for gaming). I guess this would diminish the chance of CPU bottleneck?
> 
> The smaller build currently has 2 x 980ti strix, and switching those in the a few days to GTX 1080 FTW... And running a acer 34 inch gsync display on this (this is the system I will install the 5960x in and it currently houses the 5950x)..
> 
> Cheers


No, that would not result in any real gains. A dedicated PhysX card is rarely needed, and you have EE CPUs, so I would skip that part.

Three Predators in surround is 7680x1440, loosely called "8K" for its width. Not sure how well two new Titan XPs will push that; that's a lot of pixels. There was an article on that same monitor setup. I can't find the link, but I'll try to locate it and update this post.


----------



## Jpmboy

Quote:


> Originally Posted by *HyperMatrix*
> 
> Not exactly how it works. Because there is a point where higher clocks are more important than more cores. That's how the entire argument for 4 faster cores comes about. The question is whether a similarly clocked 6700K, like the one vega has, will beat out its 8 core counterpart. And whether that performance difference is with HT enabled or disabled. Because my theory is that if 4 real cores + 4 virtual cores are good for gaming, then 8 real cores + 0 virtual cores should be even better. Even with slightly lower clocks (100MHz in this case).


Not sure how this myth got started... sure, in DX11, DX10, and DX9 games a 2- or 4-core processor running at higher clocks will do better than 6-10 cores at lower clocks, because those games don't use the extra cores and don't use the modern API, DX12. Vega is on Win 8 (no DX12), so he can't run games or benchmarks that call on that API.
If you have any plans on going VR or ultra-high resolution, a 4-core with HT disabled is not gonna cut it. And in any DX12 game or benchmark, more cores make a major difference. Unigine Valley and Heaven are DX11 and can't use the latent crunch power in a modern CPU (or GPU, for that matter, since they are not DX12).
Quote:


> Originally Posted by *CallsignVega*
> 
> HT is no good for gaming, I only use 4 pure cores. Remember also Skylake is two architectures more advanced than Haswell and one more advanced than Broadwell. So it's more cores of the 8-10 core chips working against less cores but faster architecture and frequency of Skylake. Skylake is still king until Kaby Lake.


Depends on the application. Disabling HT on a modern CPU to play yesterday's games is... well, you shoulda got a 6600K or a 2-core.

Quote:


> Originally Posted by *CallsignVega*
> 
> Which tests? Games or benchmarks?


Any DX12 game or bench and use SLI. Have you run Timespy?
Quote:


> Originally Posted by *HyperMatrix*
> 
> I ran around in Lexington, in Fallout 4. Like I said, haven't really played the game so that's the biggest city area I could find. With all settings fully maxed out, as I entered the city and ran around, FPS would fluctuate between 100fps to 165fps (gsync cap) with a couple dips down to 80fps during sprinting, which lowers fps for some reason. I looked at CPU usage, and the game is very heavily loaded on one core. Not even 2. But 1. And once that core is maxed out, performance can't go any higher.


console game port.


----------



## cookiesowns

Quote:


> Originally Posted by *st0necold*
> 
> 3 Predator's would be 8K resolution. Not sure how well 2 new titan xp's will push that that's a lot of pixels. There was an article regarding that same monitor setup I can't find the link but i'll try and locate it and update this post.


I'll be benching the TXPs once I get them. I also have triple 27"s, so I can provide some numbers at 2.5K triple surround (7680x1440).

What are some good benchmarks that are repeatable?
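For scale, the pixel arithmetic behind triple-1440p surround is worth spelling out (a back-of-envelope sketch; the resolution figures are standard values, not something anyone measured in this thread):

```python
# Total pixels pushed at the resolutions under discussion.
resolutions = {
    "single 2560x1440": 2560 * 1440,
    "triple surround 7680x1440": 7680 * 1440,
    "4K UHD 3840x2160": 3840 * 2160,
    "true 8K 7680x4320": 7680 * 4320,
}
for name, px in resolutions.items():
    print(f"{name}: {px / 1e6:.1f} megapixels")
```

Triple surround (about 11.1 MP) is roughly a third more pixels than 4K UHD (8.3 MP) but only a third of true 8K (33.2 MP), so the "~8K" shorthand really refers to the 7680-pixel width, not 8K's pixel count.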


----------



## techguymaxc

Quote:


> Originally Posted by *Jpmboy*
> 
> NOt sure how this myth got started... sure, in DX 11, 10 and 9 games a 2 or 4 core processor running at higher clocks will do better than 6-10 cores at lower clocks... 'CAUSE those games do not use the cores and do not use the modern API - DX12. Vega is on Win 8 - no DX12 so he can't run games or benchmarks that call on that API.
> If you have any plans on going VR or ultra high resolution, a 4-core with HT disabled is not gonna cut it. And in any DX12 game or benchmark, more cores make a major difference. Unigine Valley and Heaven are DX11, and can't use the nascent crunch power in a modern CPU (or GPU for that matter since they are not DX12).


That's not a "myth" those are called conditions. You know, the conditions under which 99% of today's games operate... Not exactly a corner case here.
Quote:


> Originally Posted by *Jpmboy*
> 
> Depends on the application. Disabling HT on a modern cpu to play yesterday's games is... well, you shoulda got a 6600K or a 2 core.


To add to this: most games today aren't affected by HT as long as you have 4 threads or more. The few that are affected see little difference, positive or negative, in FPS, so it's rarely an issue. Now, if you step down to a non-HT dual core and compare it to an HT-enabled dual core on the same architecture at similar clocks, the HT-enabled chip can offer decent gains, because many games target 3-4 threads.


----------



## HyperMatrix

Quote:


> Originally Posted by *techguymaxc*
> 
> That's not a "myth" those are called conditions. You know, the conditions under which 99% of today's games operate... Not exactly a corner case here.
> To add to this: Most games today aren't affected by HT as long as you have 4 threads or more. The few that are see little difference positive or negative in terms of FPS so it's just not an issue, even when it is. Now, if you step down to a non-HT dual core and compare to an HT-enabled dual core on the same architecture and with similar clocks, the HT-enabled chip can offer decent gains because many games target 3-4 threads.


Your argument would be correct if this were 2013.


----------



## Jpmboy

Quote:


> Originally Posted by *HyperMatrix*
> 
> I ran around in Lexington, in Fallout 4. Like I said, haven't really played the game so that's the biggest city area I could find. With all settings fully maxed out, as I entered the city and ran around, FPS would fluctuate between 100fps to 165fps (gsync cap) with a couple dips down to 80fps during sprinting, which lowers fps for some reason. I looked at CPU usage, and the game is very heavily loaded on one core. Not even 2. But 1. And once that core is maxed out, performance can't go any higher.


Quote:


> Originally Posted by *HyperMatrix*
> 
> Your argument would be correct if this were 2013.


^^
exactly.


----------



## Sheyster

Quote:


> Originally Posted by *HyperMatrix*
> 
> But that's a lot of hypotheticals. The new Titan is great. I went from 3 Maxwell Titans to 2 of the new ones. While they're amazing individual cards...honestly...they're only about 33% faster than my old setup. *So while this generation of cards are amazing, the limited number of cards you can run in SLI, still puts a pretty hard cap on maximum GPU performance for driving high end gaming.* Unless, again, we're talking about DX12, which makes your 6950x the better CPU in all cases, and would let you run up to 4 Pascal Titan X's under Multi-GPU.


That's really not relevant if we're talking games. Tri and Quad SLI scaling in games has been awful for quite a while, generally speaking. Moving forward with DX12, they're totally irrelevant.


----------



## CallsignVega

Quote:


> Originally Posted by *HyperMatrix*
> 
> Yeah. 5960x. I found the exact spot you were. And yeah it's definitely an FPS killer. The comparison testing challenge came after CallsignVega said a 4.8GHz 6700k would perform better than a 4.7GHz 5960x. So we're just testing to see if cores matter or not.


How far into the game is that area? Right now in Crysis 3 I am in the storm rig where I have to cross the platform with the helicopter attacking. Any hacks out there to skip forward?

Quote:


> Originally Posted by *Jpmboy*
> 
> NOt sure how this myth got started... sure, in DX 11, 10 and 9 games a 2 or 4 core processor running at higher clocks will do better than 6-10 cores at lower clocks... 'CAUSE those games do not use the cores and do not use the modern API - DX12. Vega is on Win 8 - no DX12 so he can't run games or benchmarks that call on that API.
> If you have any plans on going VR or ultra high resolution, a 4-core with HT disabled is not gonna cut it. And in any DX12 game or benchmark, more cores make a major difference. Unigine Valley and Heaven are DX11, and can't use the nascent crunch power in a modern CPU (or GPU for that matter since they are not DX12).
> Depends on the application. Disabling HT on a modern cpu to play yesterday's games is... well, you shoulda got a 6600K or a 2 core.
> 
> Any DX12 game or bench and use SLI. Have you run Timespy?
> console game port.


Quote:


> Originally Posted by *Jpmboy*
> 
> ^^
> exactly.


Who's on Windows 8?

Just curious how one explains these results where a 6700K beats a 5960X in SLI in every single modern game tested:

http://www.eurogamer.net/articles/digitalfoundry-2016-what-is-the-fastest-gaming-cpu

BTW DX12 is what, like a handful of games? And just one single popular game, Star Wars Battlefront.

Also, there is more cache on a 6700K versus a 6600K, so HT is not the only reason to go with a 6700K.

And no, I don't run Timespy. That benchmark is unrealistic and is simply designed to punish as many CPU cores as possible and is not indicative of 99% of gaming.

SWBF is a DX12 game that I can hop into to see if my 6700K can keep up with my GPUs. I'll post back.


----------



## stefxyz

Vega is right. Unfortunately, at the moment the 6700k is the better CPU for gaming. However, I am not sure whether the missing lanes on Z170 might impact performance in a 2GHz Titan X SLI system by now. Someone would have to test it.

Also, have a look at the latest HardwareCanucks video, where Dmitry observed that a 6700k beats a 6-core CPU even in a lot of non-game scenarios in Adobe products.

It is plainly a shame that the consumer platform is so far ahead of the enthusiast platform. It should be the other way around, or at least on par. While Haswell-E was good, Broadwell-E in particular is a joke with its marginal IPC improvement and sometimes worse OC potential than Haswell-E...


----------



## kalston

Are you sure the Battlefront PC version is DX12? AFAIK only the Xbox One version is DX12.
BF1 will have DX12 for sure though!


----------



## DADDYDC650

Quote:


> Originally Posted by *stefxyz*
> 
> Vega is right. Unfortunately at the moment the 6700k is the better CPU for gaming. However I am not sure if the missing lanes on Z170 might impact performance in a 2GHZ Titan X SLI system by now. Someone would have to test it.
> 
> Also have a lok at the latest hardwarecanucks video where Dimitry observed that a 6700k beats a 6 core CPU even in a lot of non game scenariois on Adobe products.
> 
> It is plain a shame that the conusmer platform is so far ahead of the enthusiast platform. It should be the other way round or at least on par. While Haswell-E was good especially Borwadwell-E is a joke with its marginal IPC improvement annd sometimes worse OC potential than Haswell-E...


The 6700k was mostly a little faster in Adobe After Effects but trailed the 6800k in every game he tested, as well as in every other program.

Like I said before, the 6700k will pull ahead in games that favor a quad core (newest arch) with a 4.7-4.8GHz or higher OC.


----------



## dante`afk

how does one remove the screws behind the backplate? there's just a hole, not for a screwdriver?


----------



## CallsignVega

Ya, SWBF is DX11 on PC. Nonetheless, at 3440x1440 at max graphics my game is pinging off the 200 FPS in-game limit. Obviously no CPU limit here.


----------



## Gary2015

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, SWBF is DX11 on PC. None-the-less, at 3440x1440 at max graphics my game is pinging off the 200 FPS in-game FPS limit. Obviously no CPU limit here.


What are your OCs? Are you on water?


----------



## CallsignVega

Crysis 3 in the storm, 1440P with all settings maxed besides AA (using FXAA). "Measly" 261 FPS.


----------



## dante`afk

welp I ****ed up when trying to unscrew those impossible screws.

can I tape that on that position ? -_-


----------



## DADDYDC650

Quote:


> Originally Posted by *dante`afk*
> 
> welp I ****ed up when trying to unscrew those impossible screws.
> 
> can I tape that on that position ? -_-


Looks like you made a titanic mistake. Ouch!

BTW, next time try going over the screws with a blow dryer until warm. They'll come out WAY easier.


----------



## Kyouki

Quote:


> Originally Posted by *dante`afk*
> 
> welp I ****ed up when trying to unscrew those impossible screws.
> 
> can I tape that on that position ? -_-


If you're any good with a soldering iron, or have a friend who is, you're fine. Just place it back on the pads and heat it up; it should soak up the solder and settle back into place. Or you can clean the pads, lay new solder on them first, then set it back on and heat it up.

Looking closer, you may have ripped the pads a little, but I still think there's enough there to reattach it with some good soldering skill.


----------



## CallsignVega

What on earth were you trying to do that you ripped off resistors?


----------



## DADDYDC650

Quote:


> Originally Posted by *CallsignVega*
> 
> What on earth are you trying to do that you are ripping off resisters?


He probably slipped since those screws are hard to take out at times. I'm pretty sure I'd punch the first person I saw if that happened to me.


----------



## Fiercy

What tool are you guys actually using to pull the screws off?


----------



## dante`afk

yep, slipped,

which tool are you using to remove those screws anyway?


----------



## Gary2015

Quote:


> Originally Posted by *Fiercy*
> 
> What tool are you guys actually using to pull the screws off?


https://www.ekwb.com/shop/hex-socket-4mm


----------



## DADDYDC650

Quote:


> Originally Posted by *Gary2015*
> 
> https://www.ekwb.com/shop/hex-socket-4mm


Hey, can you do me a favor and take a good pic of your mobo and G.Skill RAM? It only comes in one color and I was wondering how it would look. Wish they would come out with black/white sticks that run at those speeds.


----------



## CallsignVega

Quote:


> Originally Posted by *DADDYDC650*
> 
> He probably slipped since those screws are hard to take out at times. I'm pretty sure I'd punch the first person I saw if that happened to me.


Ya but I wouldn't be working on those screws on a $1200 GPU without some protective measures in place.


----------



## dante`afk

How do I solder it back on? Does it matter which direction (+/-)? Do I put the solder on the card or under the component and then set it back on?


----------



## DADDYDC650

Quote:


> Originally Posted by *dante`afk*
> 
> how do I solder it back on, does it matter which direction + - ? do i put the solder on the card or below transistor and put it back on?


Just go into any repair shop. Shouldn't cost too much.


----------



## Kyouki

Quote:


> Originally Posted by *DADDYDC650*
> 
> Hey, can you do me a favor and take a good pic of your mobo and G.Skill RAM? It only comes in one color and I was wondering how it would look. Wish they would come out with black/white sticks that run at those speeds.


Check out http://promotions.newegg.com/gskill/16-4852/index.html?icid=364993 They have White and black at 3200 or white and grey like I got.


----------



## DADDYDC650

Quote:


> Originally Posted by *Kyouki*
> 
> Check out http://promotions.newegg.com/gskill/16-4852/index.html?icid=364993 They have White and black at 3200 or white and grey like I got.


They don't have white/black 3200/14 CAS though.


----------



## Kyouki

Quote:


> Originally Posted by *dante`afk*
> 
> how do I solder it back on, does it matter which direction + - ? do i put the solder on the card or below transistor and put it back on?


A repair shop is one option, but if you are comfortable with soldering it is an easy fix. I personally would clean the pads by heating them up and wicking the old solder off. Then just add new solder to each pad, lay the resistor on the joints, and hold it with tweezers while applying heat evenly; you'll see the resistor suck up the solder and lay flat on the joints. You could Google or YouTube it and find some good info.


----------



## geort45

Does anyone have 3ds Max and V-Ray RT to test on this bad boy???


----------



## Kyouki

Quote:


> Originally Posted by *DADDYDC650*
> 
> They don't have white/black 3200/14 CAS though.


They do; check out Newegg and filter by CAS latency 14. Not sure if this link will keep all my filters in place: http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007611 50008476 601190328 600546709&Manufactory=8476


----------



## DADDYDC650

Quote:


> Originally Posted by *Kyouki*
> 
> They do, check out newegg and filter it down by 14 Cas latency, not sure if this link will have all my filers in place. http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007611 50008476 601190328 600546709&Manufactory=8476


Forgot to mention that I need a quad channel 32GB or 64GB kit.


----------



## Kyouki

Quote:


> Originally Posted by *DADDYDC650*
> 
> Forgot to mention that I need a quad channel 32GB or 64GB kit.


Yeah, it is hard to match a theme while hitting all the performance needs at the same time. I got lucky on my build; I feel like the RAM was designed for the motherboard.


----------



## Diverge

Quote:


> Originally Posted by *Kyouki*
> 
> A repair shop is one options, but if you are comfortable with soldering it is an easy fix. I personally would clean the pads by heating them up and wicking the old solder off. Then just add new solder to each pad, lay the resistor on the joints and hold with some tweezers while applying heat evenly and you'll see the resistor just suck up the solder and lay flat on the joints. You could google or youtube it and find some good info.


Personally, I wouldn't suggest that someone attempt to solder a tiny SMD component, if they've never even soldered before. Good chance they cook the lands, and/or component, and further mess it up. But it is an easy fix for anyone with some soldering skills.

I just imagine someone using one of those wood-burning type irons, blobbing solder everywhere, cooking lands and traces.


----------



## dante`afk

good idea,

what are repair shops that do such things called here in the US?

does home depot do that?


----------



## DADDYDC650

Quote:


> Originally Posted by *Kyouki*
> 
> Yea it is hard to match a theme while hitting all the performance needs at the same time. I got lucky on my build I feel like the ram was designed for the motherboard.


You have the gray/white G.Skill right? Any pix? I have the same board.

NVM, thought you had the Rampage mobo.


----------



## Gary2015

Quote:


> Originally Posted by *DADDYDC650*
> 
> Hey, can you do me a favor and take a good pic of your mobo and G.Skill RAM? It only comes in one color and I was wondering how it would look. Wish they would come out with black/white sticks that run at those speeds.




Here you go...It looks ok but not great.


----------



## Diverge

Quote:


> Originally Posted by *dante`afk*
> 
> good idea,
> 
> what are repair shops that do such things called here in the US?


Search for electronic repair shops with surface-mount experience. Probably not a lot around, as fixing electronics isn't that profitable; most people throw stuff out and buy new.

edit: or if you know anyone in the electrical engineering field, their workplaces usually have prototyping people who are experts at soldering surface-mount parts.


----------



## Kyouki

Quote:


> Originally Posted by *DADDYDC650*
> 
> You have the gray/white G.Skill right? Any pix? I have the same board.


I am using Asus X99 deluxe II you can view pictures here at build log --> http://www.overclock.net/t/1607002/


----------



## Z0eff

I remember reading rumors that big pascal would end up creating CPU bottlenecks, very interesting to see that that's actually true. Though not by that much.

So now we're at a strange point where the latest arch with fewer cores (6700k) is the better option for gaming, but DX12 has arrived, and any games using it can use more than 4 cores far more effectively, suddenly making the -E chips very interesting for gaming purposes.

To anyone currently trying to figure out what is best for them: look at the games you're playing and going to play. For me, I'm currently playing a game from 2013 a lot, which only has DX11 support yet is quite hard on the CPU and GPU. In the future I'll be playing some BF1, which is most likely a full DX12 title, but probably not a whole lot, just some casual games. I think I'll keep my 6700k for now, but for my next CPU upgrade I might switch to the enthusiast platform, with the hope that Intel has native USB 3.1 Gen2 support by then.


----------



## GnarlyCharlie

Quote:


> Originally Posted by *DADDYDC650*
> 
> Looks like you made a titanic mistake. Ouch!
> 
> BTW, next time try going over the screws with a blow dryer until warm. They'll come out WAY easier.


If you already have your soldering iron handy, sticking the tip into the screw head also works great and concentrates the heat.


----------



## DADDYDC650

Quote:


> Originally Posted by *GnarlyCharlie*
> 
> If you already have your soldering iron handy, sticking the tip into the screw head also works great and concentrates the heat.


He's already "slipped" up once. Don't think he's feeling lucky today.


----------



## Kyouki

Quote:


> Originally Posted by *Diverge*
> 
> I just image someone using one of those wood burning type of irons, blobing solder everywhere, cooking lands and traces,


Good point, I can see the follow-up picture now! lol. I agree, if you have the skill.


----------



## DADDYDC650

Quote:


> Originally Posted by *Z0eff*
> 
> I remember reading rumors that big pascal would end up creating CPU bottlenecks, very interesting to see that that's actually true. Though not by that much.
> 
> So now we're at a strange point where the latest arch with fewer cores (6700k) is the better option for gaming, but DX12 has arrived, and any games using it can use more than 4 cores far more effectively, suddenly making the -E chips very interesting for gaming purposes.
> 
> To anyone currently trying to figure out what is best for them: look at the games you're playing and going to play. For me, I'm currently playing a game from 2013 a lot, which only has DX11 support yet is quite hard on the CPU and GPU. In the future I'll be playing some BF1, which is most likely a full DX12 title, but probably not a whole lot, just some casual games. I think I'll keep my 6700k for now, but for my next CPU upgrade I might switch to the enthusiast platform, with the hope that Intel has native USB 3.1 Gen2 support by then.


The 6700k is only the better option if you plan on spending less and/or can get a highly clocked chip.


----------



## lutjens

New arrivals...


----------



## DNMock

Go ahead and sign me up.

I caved:



Chickenfish for good measure

edit: two of them, had to order them individually because I was too lazy to up my daily limit on my card.


----------



## DNMock

Quote:


> Originally Posted by *dante`afk*
> 
> welp I ****ed up when trying to unscrew those impossible screws.
> 
> can I tape that on that position ? -_-


Holy crap!

Ok, here is what you do: put the backplate back on, use a tiny amount of glue to put the warranty-voiding sticker back on, box it back up, return it to Nvidia as "defective", and never, ever speak of this again.

edit: soldering back on will probably work, but that stuff is a pain to get decent at, so unless you have done it a lot, get someone to do it. Call up an electrician, a lot of them can do it or will know who to contact locally.


----------



## Woundingchaney

Ok guys, I am dialing my card in (nice to be back to a single-card solution after what has been a decade of multi-GPU).

I'm stable at +200 on the core, though my card doesn't seem to like memory increases, so as of right now I'm at +50 on the mem. I'm hitting just above 2000MHz in the FS Ultra stress test, running at 80 degrees.

Is this comparable to what others are getting, and does anyone have any further tweaking suggestions?

Thanks in advance.


----------



## DNMock

Quote:


> Originally Posted by *Woundingchaney*
> 
> Ok guys, I am tweaking my card in (nice to be back to a single card solution in what has been a decade of multi gpus).
> 
> I'm stable at +200 on the core though my card doesn't seem to like memory increases so as of right now I'm at +50 on the mem. I'm hitting just above 2000 MHz on FS ultra stress test running at 80 degrees.
> 
> Is this comparable to what others are getting and does anyone have any further tweaking suggestions.
> 
> Thanks in advance.


power slider cranked all the way up to 150% or whatever?

beyond that, until Sheyster/Cyclops/whoever makes some custom bios, that's probably the only thing more you can do.


----------



## Testier

Hey guys, make sure to use the OSD to check that the actual power limit is set to 120%. I noticed AB sometimes glitches and still runs it at default even though the slider is set to 120%.
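Another way to cross-check the applied limit outside Afterburner is NVIDIA's own nvidia-smi query. The `--query-gpu`/`--format` flags are a real nvidia-smi feature; the small parser and the sample reading below are illustrative, not output from my card:

```python
# Run:  nvidia-smi --query-gpu=power.draw,power.limit --format=csv,noheader
# and parse the line it prints to verify the limit actually applied.

def parse_power_csv(line: str) -> tuple:
    """Parse one 'power.draw, power.limit' CSV line, e.g. '98.53 W, 300.00 W'."""
    draw_field, limit_field = line.split(",")
    draw = float(draw_field.strip().split()[0])
    limit = float(limit_field.strip().split()[0])
    return draw, limit

STOCK_LIMIT_W = 250.0  # Titan X Pascal reference power limit
draw, limit = parse_power_csv("98.53 W, 300.00 W")  # sample output line
print(f"power limit is {limit / STOCK_LIMIT_W:.0%} of stock")  # 120% of stock
```

If nvidia-smi reports 250 W while the slider says 120%, the slider didn't take and you're hitting the glitch described above.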


----------



## CluckyTaco

In case anyone was waiting for stock, it looks like they just restocked. I'm still on the fence whether to buy or wait for the x80 Ti.


----------



## Artah

Quote:


> Originally Posted by *dante`afk*
> 
> yep, slipped,
> 
> which which tool are you removing those screws anyway?


If you attempt to solder it yourself, make 100% sure you are using a surface-mount soldering iron, or you could do much more damage.


----------



## Baasha

So is the Voltage not readable in MSI Afterburner even though the setting is turned on?

Is there an update to MSI Afterburner for the Titan X Pascal?


----------



## EniGma1987

Quote:


> Originally Posted by *dante`afk*
> 
> how does one remove the screws behind the backplate? there's just a hole, not for a screwdriver?


Those tiny holes are actually where the backplate screws go. You need a wrench, or better yet a socket wrench, to remove the screws once the backplate is off. They serve a double purpose: they screw the PCB into the cooler shroud and act as standoffs that the backplate then screws into. Think of how motherboard standoffs work in towers: you screw the standoff into the case, and the motherboard screws go into the standoffs. Same thing here.

Just go to Home Depot or Lowe's and buy 2.5mm, 3mm, and 4mm wrenches or sockets so you have all the common tiny sizes for things like this.


----------



## Glzmo

I ordered mine yesterday and today I was surprised by the UPS man delivering the beast even though the order status still said it's being processed and I got no shipping confirmation. I'm pleasantly surprised, to say the least. Now I'll go fire up some games.


----------



## techguymaxc

Quote:


> Originally Posted by *HyperMatrix*
> 
> Your argument would be correct if this were 2013.


Quote:


> Originally Posted by *Jpmboy*
> 
> ^^
> exactly.


Quote:


> Originally Posted by *CallsignVega*
> 
> Just curious how one explains these results where a 6700K beats a 5960X in SLI in every single modern game tested:
> 
> http://www.eurogamer.net/articles/digitalfoundry-2016-what-is-the-fastest-gaming-cpu


Hyper and Jpm: you were saying?

Not all tasks have inherent parallelism. Even when there are gains to be had from parallelization, unlocking it can be one of the hardest tasks in programming. Clock speed and IPC are still king for most games, once you get to 4 threads.
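The classic way to put numbers on that is Amdahl's law: once the serial part of a frame dominates, extra cores buy almost nothing. A quick sketch (the 60% parallel fraction is just an illustration, not a measured figure from any game):

```python
# Amdahl's law: speedup from n cores when only a fraction p of the
# work can be parallelized. Hypothetical numbers, purely illustrative.
def amdahl_speedup(p, n):
    """p = parallel fraction (0..1), n = core count."""
    return 1.0 / ((1.0 - p) + p / n)

# A game where 60% of the per-frame work parallelizes:
for cores in (4, 8, 16):
    print(cores, "cores ->", round(amdahl_speedup(0.6, cores), 2), "x")
# Going 4 -> 16 cores gains well under 30%, while a clock/IPC bump
# speeds up the serial 40% directly.
```

That's why, past 4 threads, clock speed and IPC keep winning in most engines.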


----------



## dante`afk

Quote:


> Originally Posted by *EniGma1987*
> 
> Those tiny holes are actually where the backplate screws go into. You need a wrench to remove the screws once the backplate is off. They serve a double purpose, to screw the PCB into the cooler shroud and act as standoffs for the backplate itself, which the backplate then screws into them. Think of how the motherboard standoffs work in towers, you screw the standoff into the tower and then the motherboard screws screw into the standoffs. Same thing here.


I did it with the fkin wrench, that's how I slipped onto that resistor.

Later I read you need a 4mm hex (which I have...); with that, removal was easy.

I'm going to a shop later; he'll put it back on. Surface-mount soldering it is, he told me, which @Artah confirmed above.


----------



## Diverge

Unigine Heaven @ 3440 x 1440 w/ Titan X @ +205/+204, 6700K @ 4.6GHz

My card got up to 90C towards the end.


----------



## Jpmboy

Quote:


> Originally Posted by *dante`afk*
> 
> welp I ****ed up when trying to unscrew those impossible screws.
> 
> can I tape that on that position ? -_-


solder it or use conductive epoxy. You need a 4mm socket for the PCB bolt/screw.
Quote:


> Originally Posted by *dante`afk*
> 
> how do I solder it back on, does it matter which direction + - ? do i put the solder on the card or below transistor and put it back on?


Quote:


> Originally Posted by *Baasha*
> 
> So is the Voltage not readable in MSI Afterburner even though the setting is turned on?
> 
> Is there an update to MSI Afterburner for the Titan X Pascal?


nah not yet man, GPUz does read it tho.
Quote:


> Originally Posted by *techguymaxc*
> 
> Hyper and Jpm: you were saying?
> 
> Not all tasks have inherent parallelism. Even when there are gains to be had from parallelization, unlocking it can be one of the hardest tasks in programming. Clock speed and IPC are still king for most games, once you get to 4 threads.


I was sayin... lol, you can play core-crippled all you like. Never had a 6, 8 or 10 core that made anything feel or be slow. And non-parallel (serial) execution is, today, poor programming "hygiene".









@CallsignVega
oops my bad - Valley is so "old" it does not even know Windows 10. Unigine is reading it wrong: 8 in valley, NT in Heaven... just noticed that.


----------



## dante`afk

that guy wants 50$ to solder that.

For the moment I put it back on with non-conductive tape, and the GPU works. Any idea what that resistor is for? It's behind the memory, so maybe it's memory-related? It still recognizes all 12GB though, so maybe my tape solution works too?


----------



## CallsignVega

Ugg I've played for like an hour and still can't find that exact spot in the screenshot. Is it before or after the scissor monsters? I took a screenshot in the grass anyway.



Game definitely hammers the CPU, mine's at 90+% usage.


----------



## techguymaxc

Quote:


> Originally Posted by *Jpmboy*
> 
> nah not yet man, GPUz does read it tho.
> I was sayin... lol, you can play core-crippled all you like. Never had a 6, 8 or 10 core that made anything feel or be slow. And non-parallel (serial execution) is, today, poor programming "hygiene"


Feelings are not data points. You can feel however you like about it, but the benchmarks show a high-clocked quad core is still the best CPU for the vast majority of games.


----------



## Ninjawithagun

So, this is supposed to be a Titan X owner thread, but all I see are non-owners complaining about the price.


----------



## Darkstar757

Quote:


> Originally Posted by *dante`afk*
> 
> that guy wants 50$ to solder that.
> 
> for the moment I put it back on with non conductive tape. the gpu works, any idea what that resistor is for? its behind the memory, so maybe for memory? though it recognizes still 12gb, maybe my tape solution works too?


Bro spend the 50 to get it done right, or send the card back is my advice.


----------



## Fiercy

Quote:


> Originally Posted by *Ninjawithagun*
> 
> So, this is supposed to be a Titan X owner thread, but all I see are non-owners complaining about the price.


It started as a non-owners thread that's why you might have that impression. It turned out no one wanted to start one with a spreadsheet and staff.


----------



## Gary2015

Quote:


> Originally Posted by *Ninjawithagun*
> 
> So, this is supposed to be a Titan X owner thread, but all I see are non-owners complaining about the price.


OP needs to list owners


----------



## mbze430

Quote:


> Originally Posted by *Fiercy*
> 
> It started as a non-owners thread that's why you might have that impression. It turned out no one wanted to start one with a spreadsheet and staff.


No one wants to start a spreadsheet because any one of us here that can afford 2+ of these cards probably has maids and employees or slaves that work under them


----------



## Gary2015

Quote:


> Originally Posted by *Darkstar757*
> 
> Bro spend the 50 to get it done right, or send the card back is my advice.


Agreed, you don't know what else could be wrong. But did you void the warranty by doing that?


----------



## CallsignVega

I'd just say the thread title should be changed to an owners club. No need for a spreadsheet.

EDIT: just noticed it's already been changed.









Hmm thinking about ordering parts up for a 6950X just to compare systems and keep the one I like.


----------



## techguymaxc

Quote:


> Originally Posted by *Ninjawithagun*
> 
> So, this is supposed to be a Titan X owner thread, but all I see are non-owners complaining about the price.


You must not have read much of it to say that... Plenty of benchmark results here. I haven't gotten around to running any benches or playing games with mine yet, but it's installed and waiting for me to get done with work so I can finally use it!


----------



## Fiercy

Quote:


> Originally Posted by *techguymaxc*
> 
> You must not have read much of it to say that... Plenty of benchmark results here. I haven't gotten around to running any benches or playing games with mine yet, but it's installed and waiting for me to get done with work so I can finally use it!


I am just annoyed by the fact that EK is so late with the water block







my PC looks like crap with two different Titan X in.


----------



## Ninjawithagun

There are way more posts by disgruntled non-owners. Stop trying to defend them. Thanks!


----------



## techguymaxc

Quote:


> Originally Posted by *CallsignVega*
> 
> I just say the thread title be changed to owners club. Don't need a spreadsheet.
> 
> EDIT: just noticed it's already been changed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hmm thinking about ordering parts up for a 6950X just to compare systems and keep the one I like.


Dude, did you read the link to Eurogamer that you posted? It shows mainstream i7 is still the way to go for the vast majority of games. If you have other tasks to do like video editing or transcoding go ahead and grab a nicely clocked 6-10 core HEDT chip, otherwise don't waste your money.


----------



## Gary2015

Quote:


> Originally Posted by *Fiercy*
> 
> I am just annoyed by the fact that EK is so late with the water block
> 
> 
> 
> 
> 
> 
> 
> my PC looks like crap with two different Titan X in.


They are even later with backplates.


----------



## carlhil2

My first bench, needs water.. 
http://www.3dmark.com/spy/218525


----------



## techguymaxc

Quote:


> Originally Posted by *Fiercy*
> 
> I am just annoyed by the fact that EK is so late with the water block
> 
> 
> 
> 
> 
> 
> 
> my PC looks like crap with two different Titan X in.


Me too







I have a water cooled 970 in my gaming system now (haven't taken the loop apart yet to add the Titan X) and an air cooled Titan X in the next PCI-e slot. So awkward...
Quote:


> Originally Posted by *Ninjawithagun*
> 
> There are way more posts by disgruntled non-owners. Stop trying to defend them. Thanks!


I've literally read every post in this thread and my analysis is entirely different from yours. That raises the question: have you done the same (read every post)? Lots of excited owners talking about shipping confirmations, delays, finally receiving their cards, and actually using them. Maybe you only read the first 7 pages of the thread before the cards launched and came in here to post your opinion, but the rest of the thread is nothing like you say.


----------



## Gary2015

Quote:


> Originally Posted by *techguymaxc*
> 
> Dude, did you read the link to Eurogamer that you posted? It shows mainstream i7 is still the way to go for the vast majority of games. If you have other tasks to do like video editing or transcoding go ahead and grab a nicely clocked 6-10 core HEDT chip, otherwise don't waste your money.


People who buy 2x Titan XP love wasting money!


----------



## techguymaxc

Quote:


> Originally Posted by *Gary2015*
> 
> People who buy 2x Titan XP love wasting money!


This is true... I'm debating between a 6900k/cherry-picked 5960x/6950x for my media server/encoding machine. But then again I have 2 Dell servers with dual six-core processors (each) that I might just repurpose for the encoding side of things and skip spending $1000-2000 on a CPU.


----------



## CallsignVega

Quote:


> Originally Posted by *techguymaxc*
> 
> Dude, did you read the link to Eurogamer that you posted? It shows mainstream i7 is still the way to go for the vast majority of games. If you have other tasks to do like video editing or transcoding go ahead and grab a nicely clocked 6-10 core HEDT chip, otherwise don't waste your money.


I just like building computers...


----------



## lilchronic

Quote:


> Originally Posted by *Z0eff*
> 
> I remember reading rumors that big pascal would end up creating CPU bottlenecks, very interesting to see that that's actually true. Though not by that much.
> 
> So now we're at a strange point where the latest arch with less cores (6700k) is the better option for gaming, but DX12 is just around the corner DX12 has arrived and any games using it can end up using more than 4 cores far more effectively, thus suddenly making the -E chips very interesting for gaming purposes.
> 
> To anyone currently trying to figure out what is best for them, look at the games you're playing and going to play. For me I'm playing a game from 2013 a lot right now which only has DX11 support yet is quite hard on the CPU and GPU. in the future I'll be playing some BF1 which is most likely a full DX12 title but probably not a whole lot, just some casual games. I think I'll keep my 6700k for now but I for my next CPU upgrade I might switch to the enthusiast platform with the hope that intel has native USB3.1 Gen2 support by then.


It really depends on the resolution and frame rate. If you're going for 144Hz at 1440p and have the GPU power to push past that many frames, certain spots in some games like Crysis 3 are really CPU-demanding, and a 6700K would keep you from reaching that 144fps.
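To put a number on how tight that is: at a given refresh rate, the CPU and GPU together have to finish every frame inside a fixed time budget. A quick sketch (illustrative only):

```python
# Frame-time budget: at a target refresh rate, each frame must be
# finished within 1000/target milliseconds. Illustrative numbers.
def frame_budget_ms(target_fps):
    return 1000.0 / target_fps

print(round(frame_budget_ms(144), 2), "ms per frame at 144 Hz")
print(round(frame_budget_ms(60), 2), "ms per frame at 60 Hz")
```

Under 7ms at 144Hz vs almost 17ms at 60Hz — that's why CPU-heavy spots that are a non-issue at 60fps become the bottleneck when chasing 144.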


----------



## techguymaxc

Quote:


> Originally Posted by *CallsignVega*
> 
> I just like building computers...


Fair enough. Can't say I'm any different.


----------



## DADDYDC650

Quote:


> Originally Posted by *Gary2015*
> 
> People who buy 2x Titan XP love wasting money!


Spending money on something I want isn't a waste.


----------



## DADDYDC650

Quote:


> Originally Posted by *CallsignVega*
> 
> I just say the thread title be changed to owners club. Don't need a spreadsheet.
> 
> EDIT: just noticed it's already been changed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hmm thinking about ordering parts up for a 6950X just to compare systems and keep the one I like.


Why? Might as well wait for Skylake-E since you already own a bad a$$ proc.


----------



## renejr902

I just bought an i7 4790K; I will tell you if I see a difference in performance vs my i5 4690.

I will overclock the 4790K to 4.5GHz before testing, starting at 1.2V.


----------



## lilchronic

Quote:


> Originally Posted by *CallsignVega*
> 
> Ugg I've played for like an hour and still can't find that exact spot in the screenshot. Is it before or after the scissor monsters? I took a screenshot in the grass anyway.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Game definitely hammers the CPU, mines at 90+% usage.


You still have a little ways to go, but you're on the right map/level.







once you get to the part where you take a *zipline* you are there.

Either way that welcome to the jungle map is very demanding especially with all that grass swaying.


----------



## carlhil2

So far, on air, +190 on the core, +600 on the ram, is the best that I can do..  http://www.3dmark.com/spy/218735


----------



## dante`afk

So my card keeps running into the power target limit, or even above it. Vcore is not adjustable via MSI Afterburner, right?

it stays at 1.0430 max 200/550 - 2050/5550.

that's ok i guess









temps are below 60. looks like powertarget limit and vcore limit.

and so far I'm getting 5-10 more fps than with 1080 SLI 2100/5300 in the games I play

 + 2x 140mm fans


----------



## st0necold

Quote:


> Originally Posted by *cookiesowns*
> 
> I'll be benching the TXP's once I get them. I also have triple 27"s so can provide some numbers at 2.5K triple surround ~8K.
> 
> What are some good benchmarks that are repeatable ?


bro just moved from around there--

run 3DMark Fire Strike Extreme/Ultra; you can also give Unigine Valley/Heaven some runs.


----------



## aylan1196

Quote:


> Originally Posted by *dante`afk*
> 
> so my card keeps running into the power target limit or even above that. vcore is not adjustable via msi afterburner right?
> 
> it stays at 1.0430 max 200/550 - 2050/5550.
> 
> that's ok i guess
> 
> 
> 
> 
> 
> 
> 
> 
> 
> temps are below 60. looks like powertarget limit and vcore limit.
> 
> and so far I'm getting 5-10 more fps than with 1080 SLI 2100/5300 in the games I play
> 
> + 2x 140mm fans


Nice. Sold my SLI GTX 1080 FEs and ordered the Titan X Pascal; it should arrive in a few days. Snatched it on the Nvidia website right after it came back in stock. I have an EVGA Hybrid kit sealed and ready for that beast. Can't wait!


----------



## DNMock

Quote:


> Originally Posted by *mbze430*
> 
> no wants to start a spreadsheet because any one of us here that can afford 2+ of these cards probably have maids and employees or slaves that work under them


Whatever dude, I looked on Newegg, Ebay, and Amazon and there weren't any slaves up for sale...


----------



## Darkstar757

Quote:


> Originally Posted by *mbze430*
> 
> no wants to start a spreadsheet because any one of us here that can afford 2+ of these cards probably have maids and employees or slaves that work under them


The hate is strong with you.


----------



## carlhil2

I am crushing GTA V @4K right now...







am I going deaf, or is the fan not that loud at full blast?


----------



## dante`afk

dam that soldering guy is 1.5 hours away lmao.

Anyone from Cleveland here with soldering skills?







Or does anyone have a good tutorial video?


----------



## Panther Al

Quote:


> Originally Posted by *CallsignVega*
> 
> I just say the thread title be changed to owners club. Don't need a spreadsheet.
> 
> EDIT: just noticed it's already been changed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hmm thinking about ordering parts up for a 6950X just to compare systems and keep the one I like.


Dooooo eeet.

Love to see what sort of build you could do - they usually are pretty good.

As is, about *this* close to picking a Titan X (p) myself. Mainly because it will be the cherry on top of the build I am working on now.


----------



## Darkstar757

Quote:


> Originally Posted by *dante`afk*
> 
> dam that soldering guy is 1.5 hours away lmao.
> 
> someone from cleveland here with soldering skills ?
> 
> 
> 
> 
> 
> 
> 
> or anyonea good tutorial video?


Oren Elliott Products, Inc.
Edgerton, OH 43517-9600 Locations
Contact Company: 419-298-2306


----------



## Diverge

Quote:


> Originally Posted by *dante`afk*
> 
> so my card keeps running into the power target limit or even above that. vcore is not adjustable via msi afterburner right?
> 
> it stays at 1.0430 max 200/550 - 2050/5550.
> 
> that's ok i guess
> 
> 
> 
> 
> 
> 
> 
> 
> 
> temps are below 60. looks like powertarget limit and vcore limit.
> 
> and so far I'm getting 5-10 more fps than with 1080 SLI 2100/5300 in the games I play
> 
> + 2x 140mm fans


If that's your titan on the carpet.... that's a big no-no. Lots of static electricity in carpets that could fry your card, especially with the backplate off and exposed components.


----------



## dante`afk

Quote:


> Originally Posted by *Darkstar757*
> 
> Oren Elliott Products, Inc.
> Edgerton, OH 43517-9600 Locations
> Contact Company: 419-298-2306


that's 3 hours









i'm around shaker heights/beachwood.


----------



## outofmyheadyo

Last time I tried "fixing" a keyboard it ended up in the garbage. I would really not learn soldering on a $1300 GPU.


----------



## Jpmboy

Quote:


> Originally Posted by *techguymaxc*
> 
> Feelings are not data points. You can feel however you like about it, but the benchmarks show a high-clocked quad core is still the best CPU for the vast majority of games.


Here are plenty of data points. Read up so you can come up to speed.









http://www.overclock.net/t/1518806/firestrike-ultra-top-30/0_20
http://www.overclock.net/t/1443196/firestrike-extreme-top-30
http://www.overclock.net/t/1464813/3d-mark-11-extreme-top-30
http://www.overclock.net/t/872945/top-30-3d-mark-13-fire-strike-scores-in-crossfire-sli
http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores
http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad
http://www.overclock.net/t/1406832/single-gpu-firestrike-top-30
http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/0_20
Quote:


> Originally Posted by *CallsignVega*
> 
> I just say the thread title be changed to owners club. Don't need a spreadsheet.
> EDIT: just noticed it's already been changed.
> 
> 
> 
> 
> 
> 
> 
> 
> Hmm thinking about ordering parts up for a 6950X just to compare systems and keep the one I like.


You'll like both.


----------



## techguymaxc

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Last time I tried "fixing" a keyboard it ended up in the garbage, I would really not learn soldering on a 1300 $ gpu


This. I've got halfway decent soldering skills but mostly just DC jacks on laptop motherboards. I wouldn't want to try my hand at such a tiny SMC, let alone on a brand new Titan X.


----------



## techguymaxc

Quote:


> Originally Posted by *Jpmboy*
> 
> here are plenty of data points. Read up so you come up to speed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1518806/firestrike-ultra-top-30/0_20
> http://www.overclock.net/t/1443196/firestrike-extreme-top-30
> http://www.overclock.net/t/1464813/3d-mark-11-extreme-top-30
> http://www.overclock.net/t/872945/top-30-3d-mark-13-fire-strike-scores-in-crossfire-sli
> http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
> http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad
> http://www.overclock.net/t/1406832/single-gpu-firestrike-top-30
> http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/0_20
> You'll like both.


Zero games in your list. I benchmark to have a comparison between components when I upgrade, then I use those components to play games. What do you do?


----------



## lilchronic

Quote:


> Originally Posted by *dante`afk*
> 
> dam that soldering guy is 1.5 hours away lmao.
> 
> someone from cleveland here with soldering skills ?
> 
> 
> 
> 
> 
> 
> 
> or anyonea good tutorial video?


some videos to watch before trying











----------



## Darkstar757

Quote:


> Originally Posted by *dante`afk*
> 
> that's 3 hours
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i'm around shaker heights/beachwood.


Reach out to them.

http://clevelandcircuits.com/index.html


----------



## Testier

Quote:


> Originally Posted by *CallsignVega*
> 
> I just say the thread title be changed to owners club. Don't need a spreadsheet.
> 
> EDIT: just noticed it's already been changed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hmm thinking about ordering parts up for a 6950X just to compare systems and keep the one I like.


The 6950X doesn't clock well even by Haswell-E standards. Unless you need it for some other application, I personally wouldn't bother. The X99 platform is aging; better to jump to Skylake-E.


----------



## DADDYDC650

Quote:


> Originally Posted by *Testier*
> 
> 6950x doesnt clock well even by haswell E standards. Unless you need it for some other application, I personally wouldnt bother. x99 system is aging, better to jump on skylake E.


A 4.4Ghz 6950x would destroy games though.


----------



## Testier

Quote:


> Originally Posted by *DADDYDC650*
> 
> A 4.4Ghz 6950x would destroy games though.


Games that use more than 4 cores...

For future-proofing, just buy it in the future. No need to rush into a dying platform now unless 4 cores just ain't enough for you today.

I am happy with my 5960X and will probably keep it until at least Skylake-E/Cannonlake-E.


----------



## Silent Scone

Quote:


> Originally Posted by *Testier*
> 
> Games that uses more than 4 cores......
> 
> For future proofing, just buy it in the future. No need to rush into a dying platform now unless 4 cores just aint enough for you now.


A dying platform that's literally just had a refresh, that is fantastic logic.


----------



## lutjens

Quote:


> Originally Posted by *CallsignVega*
> 
> I just like building computers...


I'm the same way...my family of computers keeps growing...it's like they're breeding or something...







Number 6 on the way...might be time to cull the herd and rehome a couple of them to needy family members...








Quote:


> Originally Posted by *CallsignVega*
> 
> I just say the thread title be changed to owners club. Don't need a spreadsheet.
> 
> EDIT: just noticed it's already been changed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hmm thinking about ordering parts up for a 6950X just to compare systems and keep the one I like.


The 6950X is definitely a bad ass chip...try it...you'll love it.









Quote:


> Originally Posted by *DADDYDC650*
> 
> Spending money on something I want isn't a waste.


Agreed. The way I look at it, when the next better card comes out, I'll just pass off these cards to a backup system.


----------



## brunok6g

I have SLI 1080s under water. I need to know: how does Titan X Pascal SLI perform in games at 4K?


----------



## Metros

Quote:


> Originally Posted by *techguymaxc*
> 
> You appear not to understand how microstutter works... If single card A achieves a good deal less than 60 FPS at given settings and SLI card A gets above 60, your FPS well be high but so will your variance. Not the case with single card B getting at or close to 60 FPS let alone in SLI.
> 
> Obvious examples here being 4k GP104 vs GP102.


This "microstutter" happens on any SLI setup; Titan X SLI will still get the same microstutter as GTX 1080 SLI.


----------



## carlhil2

Decided to skip putting my EK Uni block on this card. I will put a full-cover block on it since I will be keeping it even when I buy Big Volta. This thing is a beast: sitting at 2000MHz with temps at 65 (thanks to the AC) in GTA V, and it doesn't dip below that. I am a happy camper.


----------



## Testier

Quote:


> Originally Posted by *Silent Scone*
> 
> A dying platform that's literally just had a refresh, that is fantastic logic.


It's not getting any more support after this. Skylake-E should come with native USB 3.1, 3D XPoint, etc., and possibly a much more reasonably priced 10-core. I am just saying, if you are buying a 10-core for future games, why not wait until a better platform comes out? The Z170 chipset has a much nicer feature set than X99, imo.

Either way I'd still take X99 over Z170 for the higher core count and 40 PCIe lanes, but just for gaming? It's a much harder decision; I hardly ever see utilization above 50% on my 5960X in games alone.


----------



## DADDYDC650

Quote:


> Originally Posted by *Testier*
> 
> Games that uses more than 4 cores......
> 
> For future proofing, just buy it in the future. No need to rush into a dying platform now unless 4 cores just aint enough for you now.
> 
> I am happy with my 5960x and probably keep it until at least skylake E/cannon lake E.


Doesn't matter if the game only uses 4 cores. It's a beast regardless.


----------



## Exilon

Anyone strapped a Kraken G10 bracket on yet?


----------



## techguymaxc

Quote:


> Originally Posted by *Metros*
> 
> This "mircostutter" happens on any SLI settup, the Titan X SLI will still get the same microstutter as the GTX 1080 SLI


Micro-stutter *can* happen on any setup, single or multi-GPU, but with proper frame-pacing and fast enough GPU(s) it should not. Again, if card A delivers less than 60 FPS (say 40) and adding a second card gets you above 60 FPS, your frame times are still going to be in the 20-30ms range with spikes even higher, ergo micro-stutter.
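For anyone wondering what that looks like in numbers: average FPS can hide micro-stutter entirely; it's the frame-to-frame variance you actually feel. A rough sketch with made-up frame times:

```python
# Micro-stutter shows up as frame-time variance, not low average FPS.
# Two runs with the SAME average frame time, very different pacing.
# All numbers are invented for illustration.
from statistics import mean, pstdev

def frame_stats(frame_times_ms):
    """Return (average FPS, frame-time standard deviation in ms)."""
    fps = 1000.0 / mean(frame_times_ms)
    return round(fps, 1), round(pstdev(frame_times_ms), 1)

smooth  = [16.7] * 6                             # steady pacing
stutter = [10.0, 23.4, 10.0, 23.4, 10.0, 23.4]   # same average, alternating

print(frame_stats(smooth))   # high FPS, ~0 deviation
print(frame_stats(stutter))  # identical average FPS, big deviation = felt stutter
```

Both runs report the same FPS counter reading, but the alternating one is what people perceive as micro-stutter.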


----------



## Testier

Quote:


> Originally Posted by *DADDYDC650*
> 
> Doesn't matter if the game only uses 4 cores. It's a beast regardless.


For sure! Never said it wasnt. Would love to own one


----------



## CallsignVega

Ugg, Skylake-X is not for another year.

http://www.digitaltrends.com/computing/intel-skylake-x/


----------



## mbze430

Why is everyone here so hell-bent on EK waterblocks? pffft... I am waiting for the Aqua Computer block instead.


----------



## cookiesowns

Bit late to the party. But I finally got them!


----------



## Jpmboy

Quote:


> Originally Posted by *techguymaxc*
> 
> Zero games in your list. *I benchmark to have a comparison between components when I upgrade*, then I use those components to play games. What do you do?


erm... you just made my point.








Quote:


> Originally Posted by *Testier*
> 
> 6950x doesnt clock well even by haswell E standards. Unless you need it for some other application, I personally wouldnt bother. x99 system is aging, better to jump on skylake E.


Z170 IS a good mainstream platform (I have 3). Unfortunately the only way to really know/understand the benefits of a higher core count is to actually use one.
You keep comparing clock speeds... that's not valid across architectures if you want to understand IPC and IPT. But yes dude, a Z170 makes a very good gaming platform... now let's move on and stop with this off-topic rant.


----------



## techguymaxc

Quote:


> Originally Posted by *mbze430*
> 
> Why is everyone here so hell bend with EWK waterblocks? pffft... I am so waiting for the Aqua Computers instead.


EK is a much bigger name. Also they look better. I use XSPC and Bitspower stuff in my loops as well, but for GPU blocks EK pretty much owns the market. They're also usually first to market and have the widest range of blocks. I had Tri-SLI 980 Strix and EK was the only manufacturer to make blocks for these custom cards.


----------



## Jpmboy

Quote:


> Originally Posted by *Jpmboy*
> 
> erm... you just made my point.
> 
> 
> 
> 
> 
> 
> 
> 
> z170 IS a good mainstream platform, ( I have 3). Unfortunately the only way to really know/understand the benefits of a higher core count is to actually use one.
> You keep comparing clock speeds... not valid across architectures if you want to understand IPC and IPT. But yes dude, a z170 makes a very good gaming platform... now lets move on and stop with this off topic rant.


Quote:


> Originally Posted by *mbze430*
> 
> Why is everyone here so hell bend with EWK waterblocks? pffft... I am so waiting for the Aqua Computers instead.


yeah man, AQ makes the best looking blocks... not always the best performing (well, since their 7970 blocks, of which I had 2), but hands down the best looking!


----------



## techguymaxc

Quote:


> Originally Posted by *Jpmboy*
> 
> erm... you just made my point.
> 
> 
> 
> 
> 
> 
> 
> 
> z170 IS a good mainstream platform, ( I have 3). Unfortunately the only way to really know/understand the benefits of a higher core count is to actually use one.
> You keep comparing clock speeds... not valid across architectures if you want to understand IPC and IPT. But yes dude, a z170 makes a very good gaming platform... now lets move on and stop with this off topic rant.


Your point is irrelevant to the conversation. We're clearly talking about games here, and you come in talking about synthetic benchmarks. I get it, benchmarking is your hobby, and that's fine. I use my PCs for work, for video editing, and for games. Benchmarks are just a tool.


----------



## DADDYDC650

Quote:


> Originally Posted by *CallsignVega*
> 
> Ugg, Skylake-X is not for another year.
> 
> http://www.digitaltrends.com/computing/intel-skylake-x/


6950x it is!!! Get rid of that crappy 4 core peasant proc. You deserve better.


----------



## Jpmboy

Quote:


> Originally Posted by *techguymaxc*
> 
> Your point is irrelevant to the conversation. We're clearly talking about games here, and you come in talking about synthetic benchmarks. I get it, benchmarking is your hobby, and that's fine. I use my PCs for work, for video editing, and for games. Benchmarks are just a tool.


last reply to you:
Read your own post (clarification added for the comprehension-compromised)... You state that you use benchmarks "to make comparisons between hardware before upgrading" {I assume this is in order to select the best components} for gaming.
QED.


----------



## techguymaxc

Quote:


> Originally Posted by *Jpmboy*
> 
> last reply to you:
> Read your own post (clarification added for the comprehension-compromised)... You state that you use benchmarks "to make comparisons between hardware before upgrading" {I assume this is in order to select the best components} for gaming.
> QED.


I didn't bring up benchmarks, you did. I was only responding to you. Every post prior to that was clearly discussing game performance.


----------



## Jpmboy

Quote:


> Originally Posted by *CallsignVega*
> 
> Ugg, Skylake-X is not for another year.
> 
> http://www.digitaltrends.com/computing/intel-skylake-x/


lol - and you'll be buying new cards about the same time!









grab a 6950X, you will not be disappointed.


----------



## Difunto

OK! Just finished removing the EVGA Hybrid kit from my Titan X (M). Going to do what Vega did to his.








wish me luck!


----------



## EniGma1987

Quote:


> Originally Posted by *dante`afk*
> 
> + 2x 140mm fans


I just wanted to say, your posts this morning will probably be the highlight of my day. So thank you for that. I laugh so hard anytime I read your stuff. From wondering if Home Depot will solder that tiny little SMD resistor, to taping it back on, to doing your GPU modding while it sits on the carpet. Man. Just. oh my gosh.

Quote:


> Originally Posted by *mbze430*
> 
> Why is everyone here so hell bent on EK waterblocks? pffft... I am so waiting for the Aqua Computer ones instead.


I think everyone likes EK because they are usually fastest to market with a new block and have great rep support here on the forums. Personally I would get a Heatkiller block over anything else; love the looks, and they have some great performance. But I don't think they make Heatkiller GPU blocks anymore :'( I do like the look of the Aqua Computer blocks more than EK's, and Aqua did just put up their Titan X Pascal block for pre-order too, but looking at the two blocks I would say that EK will have better VRM cooling, and that is kinda important on this Titan.
http://shop.aquacomputer.de/product_info.php?products_id=3458
https://www.ekwb.com/shop/ek-fc-titan-x-pascal

As you can see in the pics of the blocks, the AC block just kinda lazily has a couple of water channels in the general area of the VRM at a diagonal slant. The EK block has two channels specifically over the MOSFETs and chokes.


----------



## CallsignVega

Quote:


> Originally Posted by *DADDYDC650*
> 
> 6950x it is!!! Get rid of that crappy 4 core peasant proc. You deserve better.


lol, this 4.9 GHz "peasant" proc rips through games.









Just worried about spending another 3 grand and actually coming in lower overall besides a few rare scenarios like being strangled to death in Crysis grass.









Maybe someone would want to buy this whole system off me (Minus the Titan-XP's of course).


----------



## stefxyz

These missing EK blocks drive me nuts. I had to put the 1080 back in and return the Titan to its box, as this construction here is definitely too dangerous:





Really difficult if you have a custom loop going and no block available.

Still I am super happy with Time Spy results.

RAM +489 MHz, offset +160 MHz, custom fan curve at 90% (to take out the temp soft-throttle), power target 120%, and it went through just fine with no artifacts at a 10k GPU score:

http://www.3dmark.com/3dm/13862794

Now, thanks to EK Waterblocks, I bought a NAS to bridge the time: a Synology DS916+ 8 GB with 4 x 4 TB WD Red drives... At least thanks to EK my VR "videos" will soon be safely stored in a RAID 10...


----------



## dante`afk

Quote:


> Originally Posted by *Darkstar757*
> 
> Oren Elliott Products, Inc.
> Edgerton, OH 43517-9600 Locations
> Contact Company: 419-298-2306


Quote:


> Originally Posted by *Darkstar757*
> 
> Reach out to them.
> 
> http://clevelandcircuits.com/index.html


thanks, unfortunately that's just a fab; they don't do small jobs ^^


----------



## outofmyheadyo

Quote:


> Originally Posted by *CallsignVega*
> 
> lol, this 4.9 GHz "peasant" proc rips through games.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just worried about spending another 3 grand and actually coming in lower overall besides a few rare scenarios like being strangled to death in Crysis grass.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Maybe someone would want to buy this whole system off me (Minus the Titan-XP's of course).


sorry for the off-topic, CallsignVega, but how many volts are you feeding that 6700K for 4.9? I am a bit of a wuss; my 6700K does 4.8 on 1.43V and I don't have the balls to go higher.


----------



## Jpmboy

Quote:


> Originally Posted by *outofmyheadyo*
> 
> sorry for the offtopic CallsignVega, but how much volts are u feeding that 6700K for 4.9 ? I am a bit of a ***** and my 6700K does 4.8 on 1.43v dont have the balls to go higher.


if the core voltage @ 4.9 is too high for your liking (or your cooling), there is a lot of unused speed in the RAM of most Z170 rigs.


----------



## Testier

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - and you'll be buying new cards about the same time!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> grab a 6950X, you will not be disappointed.


Hey, I like high core count systems too...

On topic:

How much extra clock speed do you guys think I'd get going from stock air to a hybrid, and/or from stock air to a full block?


----------



## outofmyheadyo

Quote:


> Originally Posted by *Jpmboy*
> 
> if the core voltage @ 4.9 is too high for your liking, or cooling - there is a lot of unused speed in the ram of most z170 rigs.


I tried 3600 but my MB doesn't like it; perhaps I should grab a nice 3200 CL14 kit.


----------



## Baasha

People comparing the 6950X to a quad core? lol.. what's next? A Honda Civic compared to a Viper ACR?









6950X is an absolute beast.

I render 4K 60FPS video WHILE playing games maxed out at 5K. Try that with a quad core.


----------



## CallsignVega

Quote:


> Originally Posted by *outofmyheadyo*
> 
> sorry for the offtopic CallsignVega, but how much volts are u feeding that 6700K for 4.9 ? I am a bit of a pussi and my 6700K does 4.8 on 1.43v dont have the balls to go higher.


1.44v.

Quote:


> Originally Posted by *Baasha*
> 
> People comparing the 6950X to a quad core? lol.. what's next? A Honda Civic compared to a Viper ACR?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 6950X is an absolute beast.
> 
> I render 4K 60FPS video WHILE playing games maxed out at 5K. Try that with a quad core.


A nice 6700K is faster in 99% of games, no hyperbole is going to change that. Broadwell-E loses out on both architecture and frequency. Only wins in core count.


----------



## outofmyheadyo

but he is right, you can't encode a 4K video while playing in 5K with a 6700K.


----------



## DADDYDC650

Quote:


> Originally Posted by *CallsignVega*
> 
> lol, this 4.9 GHz "peasant" proc rips through games.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just worried about spending another 3 grand and actually coming in lower overall besides a few rare scenarios like being strangled to death in Crysis grass.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Maybe someone would want to buy this whole system off me (Minus the Titan-XP's of course).


Stop the denial! 4 cores are for libraries and hobos. What's 3 grand to a Titan owner???


----------



## Jpmboy

Quote:


> Originally Posted by *outofmyheadyo*
> 
> I tried 3600 but my mb doesnt like it perhaps I should grab a nice 3200 CL14 kit


yeah, a Samsung B-die kit will work really well on the Z170-A. That Kingston kit is likely Hynix; you should be able to squeeze out 3200C16 or lower (but at 1.45+ volts). Z170 is about frequency, not really bandwidth, so shoot for the lowest latency at a frequency above 3000.
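If you want to put numbers on the latency-vs-frequency tradeoff, the first-word latency math is simple enough to sketch in a few lines (the kits below are illustrative examples, not a claim about what any specific kit will do):

```python
def latency_ns(cas: int, transfer_rate: int) -> float:
    """First-word latency in ns.

    DDR transfers twice per clock, so the clock period in ns is
    2000 / transfer_rate (MT/s); multiply by the CAS cycle count.
    """
    return cas * 2000 / transfer_rate

# Lower is better; frequency and timings trade off against each other.
for cas, rate in [(14, 3200), (16, 3200), (16, 3600)]:
    print(f"DDR4-{rate} CL{cas}: {latency_ns(cas, rate):.2f} ns")
# DDR4-3200 CL14: 8.75 ns
# DDR4-3200 CL16: 10.00 ns
# DDR4-3600 CL16: 8.89 ns
```

By this yardstick a 3200C14 kit is already in the same ballpark as 3600C16, which is the point about chasing low latency above 3000.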


----------



## CallsignVega

Quote:


> Originally Posted by *outofmyheadyo*
> 
> but he is right, cant encode a 4k video while playing in 5k with a 6700K.


And why would I want to encode a video when playing a game? That's what my laptop is for.

Frankly, considering we had 5 GHz CPU overclocks years ago, going down to a realistic max of 4.3 GHz with Broadwell-E is quite an embarrassing showing from Intel.


----------



## KillerBee33

Can someone post a Firestrike run with 6700K @ 4.6 ?


----------



## Jpmboy

Quote:


> Originally Posted by *CallsignVega*
> 
> 1.44v.
> A nice 6700K is faster in 99% of games, no hyperbole is going to change that. Broadwell-E loses out on both architecture and frequency. Only wins in core count.


If you guys want to compare the inherent IPT (instructions per tick) between platforms, *THIS* does a pretty good job of sorting out efficiency


----------



## outofmyheadyo

I have no idea why you would want to do that, but technically he is right. I sort of hate the lag on the 6700K when a game is running in the background and I want to do something else, but I do like my 6700K overall; really nice for 250.


----------



## axiumone

Cards are back in stock if anyone still needs one.


----------



## stefxyz

I like my 6700K too, but I would love to go for more cores. Spending serious money (which I like to do) and getting WORSE performance in most games is something I just can't get past. Especially since I render a video maybe once every 2 months and don't care if that takes 30% longer... My Excel and Google Chrome seem to be fine with my 4.6 GHz 6700K already...

I really hope this finally changes with Skylake-X and IPC on the enthusiast platform at least evens out...


----------



## CallsignVega

Quote:


> Originally Posted by *stefxyz*
> 
> I like my 6700k too but I would love to go for more cores. But spending serious money (which I like to) and having WORSE performance in most games I just cant get around. Especially since I render a video may be every 2 months once and dont care if that takes 30% longer... My excel and Google Chrome Browser seem to be fine with my 4600MHZ 6700k already...
> 
> I really hope with Skylake E this will change finally and IPC on enthusiast platform at least equal out...


Pretty much. Most people can't wrap their head around the fact that a $400 CPU is better at a certain task (gaming) than a $1700 CPU. Of course the higher core count CPU is better for productivity and multi-tasking, but 99% of the time not for gaming. That may change a few years down the road when DX12 becomes more ubiquitous.

On another note, anyone else's NVIDIA control panel V-Sync setting keep switching to ON by itself? It constantly does it for me; so annoying.


----------



## outofmyheadyo

I had the constant Vsync switching issue on my 1080, annoying to say the least.


----------



## cookiesowns

@Jpmboy

Any luck with the EK Uniblock while maintaining the fan shroud?


----------



## opt33

Even with my old Titan X, V-Sync keeps turning itself back on in the NVIDIA control panel when restarting the computer, so it's not just Pascal. I noticed it after getting a G-Sync monitor a while back, but didn't bother to sort out whether it was related to that or just different NVIDIA drivers. With Fallout 4 at 144Hz the game is wonky with G-Sync on/V-Sync off anyway, so I leave V-Sync on for that game.

I get my Titan XP tomorrow afternoon, hopefully...I will see what it can do, then wait on waterblock to use 24/7.


----------



## Jpmboy

Quote:


> Originally Posted by *cookiesowns*
> 
> @Jpmboy
> 
> Any luck with the EK Uniblock while maintaining the fan shroud?


I wish - spent most of the day in the garage prepping a car for tomorrow...


----------



## cookiesowns

Quote:


> Originally Posted by *Jpmboy*
> 
> I wish - spent most of the day in the garage prepping a car for tomorrow...


Heh, track day?


----------



## CallsignVega

Hmm, BF1 is going to be DX12. Maybe I will order the new components after all and compare the systems.









Sucks though that X99 is coming to the end of its life; I'll need a new motherboard next year when Skylake-X hits.


----------



## lyang238

My life....IS COMPLETE!

(attached: GrandOpening.jpg, TitanXInstalled.jpg)


----------



## Jpmboy

Quote:


> Originally Posted by *cookiesowns*
> 
> Heh, track day?


lol - yes. minor tweaks and tape over lights etc... early morning, a couple of hours with some friends. One just got a used DB9 so I got roped into going ( well, it only took a string really







)


----------



## HyperMatrix

Quote:


> Originally Posted by *techguymaxc*
> 
> Hyper and Jpm: you were saying?
> 
> Not all tasks have inherent parallelism. Even when there are gains to be had from parallelization, unlocking it can be one of the hardest tasks in programming. Clock speed and IPC are still king for most games, once you get to 4 threads.


I keep seeing links to other benchmarks, but no one actually taking me up on my challenge. There appears to be a lot of fear from Skylake owners not wanting to find out that their CPU may actually be holding them back.


----------



## DADDYDC650

Quote:


> Originally Posted by *CallsignVega*
> 
> Hmm, BF1 is going to be DX12. Maybe I will order the new components after all and compare the systems.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sucks though that X99 is coming to the end of it's life and will need a new motherboard next year when Skylake-X hits.


Skylake-X will be great. I couldn't be bothered waiting for it, and I didn't want a 6700K because I'd feel a little dumb when Kaby Lake is out in a few months. Figured I'd take the middle road with a 6800K from SL. I don't do a bunch of encoding and can't justify the $1700, so a 6950X wouldn't make sense. I figure by the time 2018 rolls around Intel will have released an 8-core CPU for the masses on the latest arch. If not, I'll go with whatever 8-10 core blows my socks off. Until then, 6 cores should be plenty for gaming.

I know you didn't ask but I felt like rambling.


----------



## EniGma1987

Quote:


> Originally Posted by *HyperMatrix*
> 
> I keep seeing links to other benchmarks. But no one actually taking me up on my challenge. There appears to be a lot of fear from Skylake owners not wanting to find out their that their cpu may actually be holding them back.


I'd test with ya, but I don't have dual Titan X's to do an even test with, so it wouldn't help much for me to try. From skimming your and Vega's posts on the subject, you seem to be trying to test whether the CPUs are holding back the dual Titans, right? Otherwise there wouldn't be enough GPU horsepower with just a single card to make the test matter.


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> How far into the game is that area? Right now in Crysis 3 I am in the storm rig where I have to cross the platform with the helicopter attacking. Any hacks out there to skip forward?


My game save showed it's about 30-40 minutes into the game. I'm not sure of any hacks, but I'm surprised you never finished the game before. Haha. You're very far ahead. Welcome to the Jungle is, I think, the second level. Go to the load game menu and scroll down until you see "Welcome to the Jungle."


----------



## HyperMatrix

Quote:


> Originally Posted by *EniGma1987*
> 
> I'd test with ya, but I don't have dual Titan X's to do an even test with, so it wouldn't help much for me to try. From skimming your and Vega's posts on the subject, you seem to be trying to test whether the CPUs are holding back the dual Titans, right? Otherwise there wouldn't be enough GPU horsepower with just a single card to make the test matter.


You could run on lower resolution to lower gpu usage, allowing higher frame rates which require cpu power.


----------



## CallsignVega

Quote:


> Originally Posted by *HyperMatrix*
> 
> My game save showed its about 30-40 minutes into the game. I'm not sure of any hacks. But I'm surprised you never finished the game before. Haha. You've very far ahead. Welcome to the jungle is I think the second level. Go to the load game menu and scroll down until you see "welcome to the jungle."


I'm not a huge single player FPS guy. I usually only do multiplayer.

I couldn't find the exact spot in those screenshots, but the grass does max out my 6700K to 90+% utilization. GPU utilization is still good, 3440x1440 everything maxed:



Decided to build a second machine, so just blew around 4 grand on SL 4.4 6950X, quad channel ram, 1TB SM961, RVE10.


----------



## Difunto

Well it fits just like vega said it would! Now to put it back in the case


----------



## CallsignVega

Quote:


> Originally Posted by *Difunto*
> 
> Well it fits just like vega said it would! Now to put it back in the case


Looking good!


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> I'm not a huge single player FPS guy. I usually only do multiplayer.
> 
> I couldn't find the exact spot in those screenshots, but the grass does max out my 6700K to 90+% utilization. GPU utilization is still good, 3440x1440 everything maxed:
> 
> 
> 
> Decided to build a second machine, so just blew around 4 grand on SL 4.4 6950X, quad channel ram, 1TB SM961, RVE10.


That specific spot lilchronic shared is the most taxing. In most generic spots with grass I'm at 150+ fps. It's right after a checkpoint where you exit a building onto a ledge, with the aliens running around. That's the exact spot we used. Just match your screen position with the pictures. I was originally in about the same spot but with a different viewport and had over 150fps.

Also, welcome to the club. Hope you get a good overclocker. Being unsure of attainable clocks is why I skipped over the 6950x.


----------



## CallsignVega

Quote:


> Originally Posted by *HyperMatrix*
> 
> That specific spot lilchronic shared is the most taxing. In most generic spots with grass I'm at 150+ fps. It's right after a checkpoint where you exit a building onto a ledge, with the aliens running around. That's the exact spot we used. Just match your screen position with the pictures. I was originally in about the same spot but with a different viewport and had over 150fps.
> 
> Also, welcome to the club. Hope you get a good overclocker. Being unsure of attainable clocks is why I skipped over the 6950x.


Just moving your viewpoint a few degrees from the same relative position can significantly change your FPS. That's why you either have to do these comparisons right after a loading screen without moving, so everyone is at an identical position/view, or use an in-game benchmark.

I've had the top Intel CPU for like the last 8 years; just got rid of my 5960X to try out the 6700K. Supposedly SL guarantees me a 4.4 GHz 6950X; maybe I can get it to 4.5.


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> I've had the top Intel CPU for like the last 8 years, just got rid of my 5960X to try out the 6700K. Supposedly SL guarantees me a 4.4 GHz 6950X, maybe I can get it to 4.5.


4.5GHz on a 6950X is considered golden. I hope you hit it, man. I just didn't want to spend $2000 on a chip that ended up only clocking to 4.2-4.3GHz.

It's easy to match viewports. I matched mine to lilchronic's, if you were following the thread. You stand at the very edge of that ledge and just align your mouse cursor to point at the same thing our screenshots were pointing at.

If you remember, I used to be a huge fan of 4 cores with no HT years ago, because I had seen it perform so much better, especially since it allowed higher clocks. But over the past year I've seen a lot of games benefit from more cores and also from Hyper-Threading, particularly open-world games. Even with 8 cores, having Hyper-Threading helps handle all the draw calls more easily.


----------



## Metros

Quote:


> Originally Posted by *techguymaxc*
> 
> Micro-stutter *can* happen on any setup, single or multi-GPU but with proper frame-pacing and fast enough GPU(s) it should not. Again, if card A provides , < 60 FPS (say 40) and adding a second card gets you above 60 FPS, your frame times are still going to be in the 20-30ms range with spikes even higher, ergo micro-stutter.


GTX 980 Ti SLI and GTX 1080 SLI are easily fast enough then; this did not just happen with the Titan X. Also, G-Sync will remove most of it as well.


----------



## CallsignVega

You should get in the ballpark, it's not exact though.







Yes splitting hairs a bit.


----------



## Baasha

Where did you get the SM961? Other than that store in Australia, I haven't seen it available anywhere (in the US).


----------



## Metros

Quote:


> Originally Posted by *Baasha*
> 
> Where did you get the SM961? Other than that store in Australia, I haven't seen it available anywhere (in the US).


I would avoid buying it due to the lack of support (it is an OEM part); get the Samsung 950 Pro or wait for the Samsung 960 Pro.

Some retailers have it in the UK


----------



## HMoneyGrip

Quote:


> Originally Posted by *HyperMatrix*
> 
> That specific spot lilchronic shared is the most taxing. In most generic spots with grass I'm at 150+ fps. It's right after a checkpoint where you exit a building onto a ledge, with the aliens running around. That's the exact spot we used. Just match your screen position with the pictures. I was originally in about the same spot but with a different viewport and had over 150fps.
> 
> Also, welcome to the club. Hope you get a good overclocker. Being unsure of attainable clocks is why I skipped over the 6950x.


Baller!


----------



## DNMock

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - yes. minor tweaks and tape over lights etc... early morning, a couple of hours with some friends. One just got a used DB9 so I got roped into going ( well, it only took a string really
> 
> 
> 
> 
> 
> 
> 
> )


Ewww, a DB9? Absolutely beautiful cars; I would love to own one but wouldn't want to work on one. When it comes to cars I prefer KISS and stick with classic Chevy, Olds, Ford, etc., where the parts are plentiful and cheap and mistakes won't cost you $1,000s lol


----------



## CallsignVega

http://www.ebay.com/itm/Samsung-960-Pro-Series-OEM-1TB-NVMe-M-2-NGFF-SSD-PCIe-3-0-x4-80mm-SM961-/142074303288?hash=item211449d338:g:mI0AAOSwyLlXoRqB

A few hours ago they had 5 in stock. I bought one and one left.


----------



## Metros

Quote:


> Originally Posted by *CallsignVega*
> 
> http://www.ebay.com/itm/Samsung-960-Pro-Series-OEM-1TB-NVMe-M-2-NGFF-SSD-PCIe-3-0-x4-80mm-SM961-/142074303288?hash=item211449d338:g:mI0AAOSwyLlXoRqB
> 
> A few hours ago they had 5 in stock. I bought one and one left.


That's misadvertised and could be taken down if they don't change it; it is not a Samsung 960 Pro.


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> Pretty much. Most people can't wrap their head around that a $400 CPU is better at a certain task (gaming) than a $1700 CPU. Of course the higher core count CPU is better for productivity and multi-tasking, but 99% of the time not for gaming. That may change a few years down the road when DX12 becomes more ubiquitous.
> 
> On another note, anyone else's NVIDIA control panel V-Sync setting keep switching to ON by itself? It constantly does it for me; so annoying.


I can't wait until you get your 4.5GHz 6950x and finally understand what we're saying.







You might be 5% lower in some older, poorly threaded games, but 100% faster in games that can take advantage of the cores. This 4-core nonsense needs to stop. Especially since you said HT is bad too.


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> http://www.ebay.com/itm/Samsung-960-Pro-Series-OEM-1TB-NVMe-M-2-NGFF-SSD-PCIe-3-0-x4-80mm-SM961-/142074303288?hash=item211449d338:g:mI0AAOSwyLlXoRqB
> 
> A few hours ago they had 5 in stock. I bought one and one left.


Tempting. But not sure I want to pay a $200 premium to get it 2 or 3 weeks early.


----------



## CallsignVega

Quote:


> Originally Posted by *HyperMatrix*
> 
> I can't wait until you get your 4.5GHz 6950x and finally understand what we're saying.
> 
> 
> 
> 
> 
> 
> 
> You might be 5% lower in some older, poorly threaded games, but 100% faster in games that can take advantage of the cores. This 4-core nonsense needs to stop. Especially since you said HT is bad too.


lol Hyper you act like I'm new to this or something. I just came from a 5960X a month ago.

Yes, in certain grass scenes in Crysis 3 and in the Time Spy benchmark it will be faster, but still overall slower in most games. I really bought the new system for BF1, as it's DX12. I will play the heck out of that game.


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> lol Hyper you act like I'm new to this or something. I just came from a 5960X a month ago.


I know. Which makes it all the more confusing. Haha. Never would have imagined you going the 4-core way. Because they simply don't make 4 cores in your price range.


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> lol Hyper you act like I'm new to this or something. I just came from a 5960X a month ago.
> 
> Yes in some certain grass scenes in Crysis 3 and in Time Spy benchmark it will be faster, but still overall slower in most games. Really bought the new system for BF1 as it's DX12. I will play the heck out of that game.


Check Rise of the Tomb Raider and Hitman, both DX12. Also check GTA 5, which isn't DX12 but can load threads like mad. Wanna run a comparison test of that first automated drive Franklin takes through the city at the beginning of the game?


----------



## Difunto

Quote:


> Originally Posted by *CallsignVega*
> 
> Looking good!


Thanks man, you gave me the idea!
Loving my new temps... min 17C, max 45C with +200 core and +350 mem.
Ran Valley at 3440x1440 for 30 mins.


----------



## Fiercy

Quote:


> Originally Posted by *HyperMatrix*
> 
> Check Rise of the Tomb Raider and Hitman, both DX12. Also check GTA 5, which isn't DX12 but can load threads like mad. Wanna run a comparison test of that first automated drive Franklin takes through the city at the beginning of the game?


GTA 5 with the Pascal Titan benefits from clocks like crazy; just look at this.


----------



## CallsignVega

Quote:


> Originally Posted by *HyperMatrix*
> 
> Check Rise of the Tomb Raider and Hitman, both DX12. Also check GTA 5, which isn't DX12 but can load threads like mad. Wanna run a comparison test of that first automated drive Franklin takes through the city at the beginning of the game?


lol I don't have any of those games. Like I said not a huge single player guy. What other games? I wish I could copy my steam list somehow.


----------



## HyperMatrix

Quote:


> Originally Posted by *Fiercy*
> 
> GTA 5 with Pascal Titan benefits from clocks like crazy just look at this


We're comparing a 4.8GHz 6700K to a 4.7GHz 5960X, so it's not lacking in clock speed; it's just the IPC difference between Haswell-E and Skylake.


----------



## cookiesowns

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - yes. minor tweaks and tape over lights etc... early morning, a couple of hours with some friends. One just got a used DB9 so I got roped into going ( well, it only took a string really
> 
> 
> 
> 
> 
> 
> 
> )


Haha, awesome man.

So... what are people getting on stock Titan XP? My cards will boost to ~1830 but quickly throttle to around 1750 in Firestrike Extreme, even with fans maxed out. VDDC is around 1.00-1.05V; I haven't tested the second card yet. Probably won't be able to hit 2.0GHz on air with this 32C+ ambient.
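If you want to see exactly when and why a card drops clocks, nvidia-smi can poll it once a second. A rough sketch in Python (assumes `nvidia-smi` is on your PATH; the field names are the ones listed by `nvidia-smi --help-query-gpu` on recent drivers):

```python
import subprocess

def monitor_cmd(gpu_index: int = 0, interval_s: int = 1) -> list:
    """Build an nvidia-smi command that prints one CSV row per interval
    with clock, temperature, power draw, and active throttle reasons."""
    return [
        "nvidia-smi", "-i", str(gpu_index), "-l", str(interval_s),
        "--query-gpu=timestamp,clocks.gr,temperature.gpu,power.draw,"
        "clocks_throttle_reasons.active",
        "--format=csv",
    ]

# To actually stream the log (Ctrl+C to stop):
# subprocess.run(monitor_cmd())
```

The throttle-reasons field comes back as a bitmask; a non-zero value while the clocks sag tells you whether it's the power limit, thermals, or VREL doing it.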


----------



## bl4ckdot

Did anyone else notice what was written inside the box?


----------



## HaniWithAnI

Quote:


> Originally Posted by *CallsignVega*
> 
> Looking good!


Quote:


> Originally Posted by *Difunto*
> 
> Thanks man you gave me the idea!
> loving my new temps.. min 17c max 45c with +200 and +350 mem
> ran valley at 3440-1440 for 30mins


Awesome







my Hybrid kit arrives on Monday, can't wait. Could you or @CallsignVega comment on the noise? Does the rear fan still need to spin up much to keep the VRMs cool? Aside from temps, it'd be awesome to have the fan spin down silently if that doesn't risk anything on the card.


----------



## carlhil2

Still on air...


----------



## Jpmboy

Quote:


> Originally Posted by *CallsignVega*
> 
> I'm not a huge single player FPS guy. I usually only do multiplayer.
> 
> I couldn't find the exact spot in those screenshots, but the grass does max out my 6700K to 90+% utilization. GPU utilization is still good, 3440x1440 everything maxed:
> 
> 
>
> 
> 
> 
> 
> 
> 
> 
> Decided to build a second machine, so just blew around 4 grand on SL 4.4 *6950X*, quad channel ram, 1TB SM961, *RVE10*.


good combination!
check this: http://edgeup.asus.com/2016/06/17/broadwell-e-overclocking-guide/
and: http://edgeup.asus.com/2016/05/31/get-best-performance-broadwell-e-processors-asus-thermal-control-tool/


----------



## lutjens

Quote:


> Originally Posted by *CallsignVega*
> 
> http://www.ebay.com/itm/Samsung-960-Pro-Series-OEM-1TB-NVMe-M-2-NGFF-SSD-PCIe-3-0-x4-80mm-SM961-/142074303288?hash=item211449d338:g:mI0AAOSwyLlXoRqB
> 
> A few hours ago they had 5 in stock. I bought one and one left.


Thanks for the heads up...snagged the last one...


----------



## mouacyk

Quote:


> Originally Posted by *bl4ckdot*
> 
> 
> 
> Did anyone else noticed what was written inside the box ?


Lawl! Good luck have fun... It is a gaming card!


----------



## cookiesowns

:/

First card severely throttles and hits VREL and power limits; 2GHz is unobtainable. The second card seems to do much better but is SLI-limited by the first card. Looks like I'll probably sell one of them and just go single card this round... or I can play with the 3rd card that's coming. What to do?


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> lol I don't have any of those games. Like I said not a huge single player guy. What other games? I wish I could copy my steam list somehow.


So basically you don't own half of the biggest modern AAA games.







Rise of the Tomb Raider is like Uncharted for PC. You should get it, even if only for the fact that it is the ONLY proper DX12 game with multi-GPU support.


----------



## HyperMatrix

Quote:


> Originally Posted by *cookiesowns*
> 
> :/
> 
> First card severely throttles, and hits VREL and power limits. 2Ghz is unobtainable. The second card seems to do much better but is SLI limited by the first card. Looks like I'll probably sell one of them and just go single card this round.. Or I can play with the 3rd card that's coming.. what do.


Test DX12 multi-GPU, which bypasses SLI, in Rise of the Tomb Raider with your 3 cards, then decide. Also, while the MSI Afterburner overlay won't show in DX12, it still records data in the app, so you can tab out and look at your graphs afterwards. But it's basically heaven. I ran my 3 Maxwell Titans with that setup and was getting ridiculously good scaling; areas that were limited to about 60-65fps shot up to 165 with DX12 multi-GPU and tri-SLI Maxwell Titans.
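If you'd rather crunch the recorded numbers than eyeball the graphs, Afterburner's hardware-monitor logs can be exported as delimited text. Here's a rough sketch of pulling average and 1%-low FPS out of such a log; the exact column name and delimiter vary by version and setup, so treat "Framerate" here as an assumption and check your own file:

```python
import csv
from statistics import mean

def fps_stats(path: str, column: str = "Framerate"):
    """Return (average FPS, 1%-low FPS) from a CSV-style monitoring log.

    The column name is hypothetical; match it to whatever header your
    Afterburner export actually uses.
    """
    with open(path, newline="") as f:
        fps = [
            float(row[column])
            for row in csv.DictReader(f)
            if row.get(column, "").strip()
        ]
    fps.sort()
    worst = fps[: max(1, len(fps) // 100)]  # bottom 1% of samples
    return mean(fps), mean(worst)
```

The 1%-low figure is usually more telling than the average for the kind of multi-GPU stutter being discussed here.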


----------



## sherlock

Quote:


> Originally Posted by *cookiesowns*
> 
> :/
> 
> First card severely throttles, and hits VREL and power limits. 2Ghz is unobtainable. The second card seems to do much better but is SLI limited by the first card. Looks like I'll probably sell one of them and just go single card this round.. Or I can play with the 3rd card that's coming.. what do.


You can return the card for a refund, I think? It is still < 30 days after purchase. Though if you are into milking people on eBay, you can probably turn a profit by selling.


----------



## Difunto

Quote:


> Originally Posted by *HaniWithAnI*
> 
> Awesome
> 
> 
> 
> 
> 
> 
> 
> my hybrid kid arrives on monday, can't wait. could you or @CallsignVega comment on the noise? does the rear fan still need to spin up much to keep the VRM's cool? Aside from temps it'd be awesome to have the fan spin silently if that doesn't risk anything on the card


I can't hear the fan. I have it set to follow the temp, so the max fan speed was 45% since that was my highest temp. You will love it!


----------



## Jared Pace

Quote:


> Originally Posted by *HyperMatrix*
> 
> Check Rise of the Tomb Raider and Hitman. Both DX12. Also check GTA 5



https://translate.googleusercontent.com/translate_c?depth=1&hl=en&ie=UTF8&prev=_t&rurl=translate.google.com&sl=auto&tl=en&u=http://pclab.pl/art70409-21.html&usg=ALkJrhirvP1iSq6yf_uuNlLZhHeN3CWeDg


----------



## CallsignVega

Quote:


> Originally Posted by *Jpmboy*
> 
> good combination!
> check this" http://edgeup.asus.com/2016/06/17/broadwell-e-overclocking-guide/
> and : http://edgeup.asus.com/2016/05/31/get-best-performance-broadwell-e-processors-asus-thermal-control-tool/


Ya, I am just curious if when SL says I'll get a minimum 4.4 6950X, if I may be able to get it to 4.5. But I know from reading Broadwell-E hits walls pretty hard. SL says only 19% of 6950X's hit 4.4, so I figure if I'm spending $1700 on a CPU I may as well spend another couple hundred to guarantee 4.4.


----------



## HyperMatrix

Quote:


> Originally Posted by *Jared Pace*
> 
> 
> https://translate.googleusercontent.com/translate_c?depth=1&hl=en&ie=UTF8&prev=_t&rurl=translate.google.com&sl=auto&tl=en&u=http://pclab.pl/art70409-21.html&usg=ALkJrhirvP1iSq6yf_uuNlLZhHeN3CWeDg


Pascal sli Titans are a whole new ball game. That fps is ridiculously low in general. Likely gpu limited. I say this because I average 160fps (software imposed cap) in GTA. This is why I don't trust benchmarks done by people who don't understand technology. And it's why I wanted to have us compare them here.


----------



## CallsignVega

There aren't any DX12 games with built in benchmarks? What about Metro-LL or something (ya I know not DX12).


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> There aren't any DX12 games with built in benchmarks? What about Metro-LL or something (ya I know not DX12).


I think I may have it installed. I'll check. Last time I played it was when maxwell Titans launched. Bit of an old game, notorious for bad performance, so I'm not expecting the 5960 to show any dominance here.

Also hitman, tomb raider, and even gears of war have built in benchmarks.


----------



## Fiercy

Quote:


> Originally Posted by *CallsignVega*
> 
> There aren't any DX12 games with built in benchmarks? What about Metro-LL or something (ya I know not DX12).


Rise of Tomb Raider has a very good DX12 Benchmark.


----------



## Evo X

Quote:


> Originally Posted by *outofmyheadyo*
> 
> I had the constant Vsync switching issue on my 1080, annoying to say the least.


Guys, VSYNC has to be on for GSYNC to work.

So if you have a GSYNC monitor, it turning itself on in the control panel is not a glitch.

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - yes. minor tweaks and tape over lights etc... early morning, a couple of hours with some friends. One just got a used DB9 so I got roped into going (well, it only took a string really)


lol, nice. I was just at the track with a couple of my friends. I got my car wrapped in Xpel Ultimate with a Modesta coating on top so I don't have to deal with tape and wax every time I race.


----------



## cookiesowns

Quote:


> Originally Posted by *sherlock*
> 
> You can return the card for refund I think? it is still < 30 days after purchase, though if you are into milking people on Ebay you can probably turn a profit by selling.


Yes, definitely. Typically that is frowned upon here, no? So I didn't want to say it.

Still trying to see if putting these on water is worth it. The card that's doing better still probably won't hit 2Ghz on air.

Are people actually getting 2000+ MHz without throttling in 3DMark extreme ?


----------



## carlhil2

And here I was thinking that ALL TXPs could, at least, hit 2000 on air.....


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> There aren't any DX12 games with built in benchmarks? What about Metro-LL or something (ya I know not DX12).


I didn't try with HT off. But this is what I got:


----------



## CallsignVega

Quote:


> Originally Posted by *Fiercy*
> 
> Rise of Tomb Raider has a very good DX12 Benchmark.


Hmm, well not going to spend $60 on a game I won't play just to benchmark.









Quote:


> Originally Posted by *Evo X*
> 
> Guys, VSYNC has to be on for GSYNC to work.
> 
> So unless you don't have a GSYNC monitor, it turning itself on in the control panel is not a glitch.
> lol, nice. I was just at the track with a couple of my friends. I got my car wrapped in Xpel Ultimate with a Modesta coating on top so I don't have to deal with tape and wax every time I race.


Sweet cars. I didn't put any protection on my Viper and got three paint chips on ****ty Virginia roads in the first week. I was so pissed.


----------



## HyperMatrix

Quote:


> Originally Posted by *cookiesowns*
> 
> Yes, definitely. Typically that is frowned upon here, no? So I didn't want to say it.
> 
> Still trying to see if putting these on water is worth it. The card that's doing better still probably won't hit 2Ghz on air.
> 
> Are people actually getting 2000+ MHz without throttling in 3DMark extreme ?


Turn the fans to 100%. If you're throttling from anything other than hitting TDP limit, then your card has a problem. Or your ambient temps are way too high. But TDP limit is real. And will throttle you. Have to wait for a modified bios for that.


----------



## cookiesowns

Quote:


> Originally Posted by *HyperMatrix*
> 
> Turn the fans to 100%. If you're throttling from anything other than hitting TDP limit, then your card has a problem. Or your ambient temps are way too high. But TDP limit is real. And will throttle you. Have to wait for a modified bios for that.


fans at 100% will make it throttle at TDP much quicker. My other card that I thought was better is unstable above 2025.. but can run at 2K if I keep it under 71C.

What do..


----------



## HyperMatrix

Quote:


> Originally Posted by *cookiesowns*
> 
> fans at 100% will make it throttle at TDP much quicker. My other card that I thought was better is unstable above 2025.. but can run at 2K if I keep it under 71C.
> 
> What do..


Problem with Firestrike and any other intensive game/bench is that if you don't max out the fan, 2 things will happen:

1) voltage leak due to heat, causing you to hit TDP quicker

and

2) thermal throttling because it doesn't like to go above 84c. And it'll hit that if you're not maxing your fans.

I don't have the greatest score in firestrike but still got a little over 7800.
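The chain described above (heat → voltage/current leakage → power draw → TDP throttle) can be sketched as a toy model. This is not NVIDIA's actual boost algorithm; the constants (leakage coefficient, 84C thermal limit, 250W TDP, 13 MHz bins) are illustrative assumptions picked to mirror the behavior people report in this thread.

```python
# Toy model of GPU Boost-style throttling: higher temps increase leakage
# power, which pushes the card into its TDP limit sooner. All constants
# here are made up for illustration, not NVIDIA's real numbers.

TDP_LIMIT_W = 250.0      # assumed board power limit
THERMAL_LIMIT_C = 84.0   # temp at which hard thermal throttling kicks in

def power_draw(clock_mhz: float, temp_c: float) -> float:
    """Dynamic power scales with clock; leakage grows with temperature."""
    dynamic = 0.10 * clock_mhz               # ~0.1 W per MHz (illustrative)
    leakage = 1.5 * max(0.0, temp_c - 40.0)  # leakage above 40C (illustrative)
    return dynamic + leakage

def boost_clock(requested_mhz: float, temp_c: float) -> float:
    """Step the clock down in 13 MHz bins until power and temp limits are met."""
    clock = requested_mhz
    if temp_c >= THERMAL_LIMIT_C:
        clock -= 13 * 5  # hard thermal throttle: drop several bins at once
    while power_draw(clock, temp_c) > TDP_LIMIT_W and clock > 0:
        clock -= 13      # Pascal adjusts clocks in roughly 13 MHz steps
    return clock

# Cooler card sustains a higher clock at the same requested OC:
print(boost_clock(2050, 50))  # mild temps: full requested clock
print(boost_clock(2050, 80))  # hot: leakage eats the power budget
```

The point of the sketch is only the shape of the curve: at the same +OC offset, the cooler card holds more bins before the power cap bites, which is why maxing the fans (or water) buys clockspeed even below the thermal limit.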


----------



## EniGma1987

Quote:


> Originally Posted by *cookiesowns*
> 
> Are people actually getting 2000+ MHz without throttling in 3DMark extreme ?


I bounce between 2000-2050 perfectly fine in FS Extreme. I am pretty sure all Firestrike stuff is power target limited. I run at 72c max temp in Extreme.


----------



## Jpmboy

Quote:


> Originally Posted by *CallsignVega*
> 
> There aren't any DX12 games with built in benchmarks? What about Metro-LL or something (ya I know not DX12).


eh - check the Canucks review for DX12 game benches. TimeSpy is DX12; all other FM benches are DX11.
Quote:


> Originally Posted by *Evo X*
> 
> Guys, VSYNC has to be on for GSYNC to work.
> 
> So unless you don't have a GSYNC monitor, it turning itself on in the control panel is not a glitch.
> lol, nice. I was just at the track with a couple of my friends. I got my car wrapped in Xpel Ultimate with a Modesta coating on top so I don't have to deal with tape and wax every time I race.
> 
> 
> Spoiler: Warning: Spoiler!


Nice rides! I'm bringing the ZR1. It's already in my trailer - don't want to drive it through the chipped road up to Pocono.


Spoiler: Warning: Spoiler!








Quote:


> Originally Posted by *cookiesowns*
> 
> Yes, definitely. Typically that is frowned upon here, no? So I didn't want to say it.
> 
> Still trying to see if putting these on water is worth it. The card that's doing better still probably won't hit 2Ghz on air.
> Are people actually getting 2000+ MHz without throttling in 3DMark extreme ?


yeah - with 2 I can get >2000. One card is setting the max freq for sure (as always!)








http://hwbot.org/submission/3281702_


----------



## Face2Face

Quote:


> Originally Posted by *Jpmboy*
> 
> eh - check the Canucks review for DX12 game benches. TimeSpy is DX12; all other FM benches are DX11.
> Nice rides! I'm bringing the ZR1. It's already in my trailer - don't want to drive it through the chipped road up to Pocono.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> yeah - with 2 I can get >2000. One card is setting the max freq for sure (as always!)
> 
> 
> 
> 
> 
> 
> 
> 
> http://hwbot.org/submission/3281702_


You have a GNX? You bastard... One of my favorite cars. Super jelly. JK about being a bastard


----------



## auraofjason

Dang, my titan x is a "below average" overclocker. Even under water (hybrid) its max clock is 2025 and sits at around 1970-2000mhz. Anything higher crashes. Oh well I'm still extremely satisfied with my purchase, few extra mhz doesn't really matter lol.


----------



## Menthol

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, I am just curious if when SL says I'll get a minimum 4.4 6950X, if I may be able to get it to 4.5. But I know from reading Broadwell-E hits walls pretty hard. SL says only 19% of 6950X's hit 4.4, so I figure if I'm spending $1700 on a CPU I may as well spend another couple hundred to guarantee 4.4.


I bought mine from SL and got very lucky: 4.5 easily on custom water. Add my chiller and it benches quite easily at 4.6-4.7, probably higher, but I am being conservative on voltage. 4.5 is a very good place on these chips; 3D benchmarks don't seem to gain much going to 4.7.


----------



## Jpmboy

Quote:


> Originally Posted by *Menthol*
> 
> I bought mine from SL and got very lucky, 4.5 easily on custom water, add my chiller and it benches quite easily at 4.6, 4.7, probably higher but I am being conservative on voltage. 4.5 is a very good place on these chips 3D benchmarks don't seem to gain much going to 4.7


probably the best 6950X i've seen actually running (well, seen results from anyway)!


----------



## PasK1234Xw

Quote:


> Originally Posted by *Evo X*
> 
> Guys, VSYNC has to be on for GSYNC to work.
> 
> So unless you don't have a GSYNC monitor, it turning itself on in the control panel is not a glitch.


No, vsync doesn't have to be on. In fact GSYNC acts as a frame buffer throughout the GSYNC range; it doesn't just turn on when you hit max framerate like a lot of people think. Mouse responsiveness is much better with vsync off and frame capping. Latency is noticeable, even more so at lower framerates with gsync/vsync. BF4 is the perfect game to see this, mainly when shooting.

GSYNC isn't perfect 100% of the time, and Nvidia's solution to fixing these imperfections is to add a frame buffer to GSYNC. While it shouldn't be an issue since frames are in sync with the refresh rate, that's not always the case; it does add latency and can also result in stutter. I see this on my 1080 SLI and fix it by increasing the power limit to stabilize my clocks - BF4 again, on a map like Siege of Shanghai with 64 players at 1440p 165Hz, max settings, and 150% resolution scale to push the cards.

Turning off global vsync isn't an option, as it will reset on reboot; you have to turn off vsync per game profile.


----------



## Evo X

Quote:


> Originally Posted by *Jpmboy*
> 
> Nice rides! I'm Bringing the zr1. it's already in my trailer - don;t want to drive it thru the
> chiped" road up to Pocono.


Thanks! That ZR1 must be a beast. Good to see high performance cars actually being used for what they were designed for.

There is a guy near me who tracks his Enzo. Snapped some quick pics yesterday after both of us got a post race detail. Dude has a crazy collection of cars and beats the hell out of all of em. Love it!


Spoiler: Warning: Spoiler!


----------



## pez

Seems like everyone is enjoying their cards. Mine arrived this morning and I got to play with it a bit tonight. Loving this card. +225 is unstable for me, but +200 seems fine. I might try to eke a couple extra MHz out of it, but I'm very satisfied. Clocks seem to be sticking to 1924-2000 depending on the game. Going to go back and look through some more of the thread.


----------



## HyperMatrix

Quote:


> Originally Posted by *pez*
> 
> Seems like everyone is enjoying their cards. Mine arrived this morning and I got to play with it a bit tonight. Loving this card. +225 is unstable for me, but +200 seems fine. I might try to eke a couple extra MHz out of it, but I'm very satisfied. Clocks seem to be sticking to 1924-2000 depending on the game. Going to go back and look through some more of the thread.


I don't get how everyone is doing +225 with mediocre clocks and I'm doing +175 and getting 2050MHz stable.


----------



## ryder

as petty as this is, i wish the titan x had an adjustable led on the side so you could change the green font to match your system.


----------



## Baasha

One thing I've noticed with the Titan X Pascal is that although I have the OC set to +200 on both cards, the frequencies differ based on the game which is quite strange. In some games, I hit ~ 2050 while others are at 2012 or throttle down to the ~ 1974 range.

No idea why/how this happens.

Also, didn't know that we need to put the fans at 100% during benchmarks.. the cards throttle quite a bit during the benchmarks I've done so far but I had the fan on Auto (custom profile).


----------



## Viveacious

Deleted.


----------



## HyperMatrix

Quote:


> Originally Posted by *Baasha*
> 
> One thing I've noticed with the Titan X Pascal is that although I have the OC set to +200 on both cards, the frequencies differ based on the game which is quite strange. In some games, I hit ~ 2050 while others are at 2012 or throttle down to the ~ 1974 range.
> 
> No idea why/how this happens.
> 
> Also, didn't know that we need to put the fans at 100% during benchmarks.. the cards throttle quite a bit during the benchmarks I've done so far but I had the fan on Auto (custom profile).


That's TDP throttling, assuming temps aren't over the top. You can confirm using gpu-z.
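GPU-Z surfaces this as "PerfCap Reason"; the same data comes from NVML (what nvidia-smi and pynvml read) as a throttle-reason bitmask. A small sketch of decoding such a mask is below. The bit values follow NVML's documented `clocksThrottleReasons` flags, but treat them as illustrative; on a live system you would read the mask via pynvml's `nvmlDeviceGetCurrentClocksThrottleReasons()`.

```python
# Decode an NVML-style "clocks throttle reasons" bitmask - the same signal
# GPU-Z shows as PerfCap. Bit values mirror NVML's documented flags, but
# verify against your NVML headers; this table is illustrative.

THROTTLE_REASONS = {
    0x01: "GPU idle",
    0x02: "Applications clocks setting",
    0x04: "SW power cap (TDP limit)",
    0x08: "HW slowdown",
    0x20: "SW thermal slowdown",
    0x40: "HW thermal slowdown",
}

def decode_throttle(mask: int) -> list:
    """Return the human-readable reasons set in the bitmask, lowest bit first."""
    return [name for bit, name in sorted(THROTTLE_REASONS.items()) if mask & bit]

# A card pinned at its power limit while also thermal throttling:
print(decode_throttle(0x04 | 0x20))
```

If the only bit you ever see under load is the power cap, a fan curve won't help; only a raised power limit (or a modded BIOS, once one exists) will.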


----------



## Baasha

Quote:


> Originally Posted by *Jpmboy*
> 
> probably the best 6950X i've seen actually running (well, seen results from anyway)!


Is there a perceptible advantage to having a CPU clocked 100 - 200Mhz higher other than e-peen?

I want to use that Temp tool thing you linked to for the 6950X to see if I can clock my CPU higher for games/apps that use fewer threads.

Also, I am supremely happy with the CPU being able to pass RealBench and game-stable @ 4.30Ghz at 1.270V. Since I do a LOT of encoding, photo/video editing (while gaming I might add), I need to bump the v-core to 1.320V.

I think I stress-tested the CPU the hardest - 4K 60fps rendering while gaming at 5K!







Handbrake shmandbrake. lol...


----------



## pez

Quote:


> Originally Posted by *HyperMatrix*
> 
> I don't get how everyone is doing +225 with mediocre clocks and I'm doing +175 and getting 2050MHz stable.


I'm not sure I'm deciphering exactly what you're saying, but I think mine is temp-related. That's generally how it is: the longer you hold that 84-85C (or whatever your temp limit is set to), the more the clocks usually decrease. My fans are at about 70% max at any point. I'm not a person that games with 100% fan. Benchmarks I could see, but I'll be OK with 70% fan and losing a fraction of a single FPS.


----------



## Testier

Quote:


> Originally Posted by *pez*
> 
> I'm not sure I'm deciphering exactly what you're saying, but I think mine is temp-related. That's generally how it is: the longer you hold that 84-85C (or whatever your temp limit is set to), the more the clocks usually decrease. My fans are at about 70% max at any point. I'm not a person that games with 100% fan. Benchmarks I could see, but I'll be OK with 70% fan and losing a fraction of a single FPS.


Higher temperature, higher leakage, higher power usage, hits power limit, throttle.

On a slightly offtopic note, does anyone know what type of res/pump combo would be decent for a single 5960X/Titan X? I am on a Thermaltake Core V21 atm, looking at a possible 2 x 240mm rad setup. First time looking into a custom loop and I feel like I am completely out of my depth.


----------



## DNMock

Quote:


> Originally Posted by *Testier*
> 
> Higher temperature, higher leakage, higher power usage, hits power limit, etc.
> 
> On a slightly offtopic note, anyone knows what type of res/pump combo would be decent for a single 5960x/titan x? I am on thermaltake core v21 atm. Looking at possible 2 x 240mm rad setup. First time looking into custom loop and I feel like I am completely out of my depth


5960x and Titan X Pascal, I will assume you aren't working with a thin budget.

Check your case and find out what the max rad space you can fit into it and get that much.

Put Gentle Typhoon, NB-eLoop, or EK Vardar fans on the rads, most or all set to intake. If you are feeling lazy, get an EKWB D5 pump/reservoir combo; if you wanna spice it up a bit more, go for a Monsoon Modular Reservoir and an MCP35X2 (dual DDC pump) from Swiftech.

Which type of radiator? http://www.xtremerigs.net/2015/02/11/radiator-round-2015/ check that out.

To answer your question, yes, dual 240's will be enough, but watercooling is like crack. It won't be too long before you want to add another rad to get even better temps, and so on and so forth, so just save yourself the time of having to take your rig apart multiple times and just max out the rad space from the start.


----------



## pez

Quote:


> Originally Posted by *Testier*
> 
> Higher temperature, higher leakage, higher power usage, hits power limit, throttle.
> 
> On a slightly offtopic note, anyone knows what type of res/pump combo would be decent for a single 5960x/titan x? I am on thermaltake core v21 atm. Looking at possible 2 x 240mm rad setup. First time looking into custom loop and I feel like I am completely out of my depth


Yeah.

I'm also in a similar predicament, though not with the pure beast of a CPU as you. However, I'm trying to downsize to mITX.


----------



## DNMock

Quote:


> Originally Posted by *CallsignVega*
> 
> Hmm, well not going to spend $60 on a game I won't play just to benchmark.
> 
> 
> 
> 
> 
> 
> 
> 
> Sweet cars. I didn't put any protection on my Viper and got three paint chips in ****ty Virginia roads in the first week. I was so pissed.


That's why the 442 stays covered in the garage and only gets driven on nice days over well paved roads


----------



## HyperMatrix

Quote:


> Originally Posted by *DNMock*
> 
> 5960x and Titan X Pascal, I will assume you aren't working with a thin budget.
> 
> Check your case and find out what the max rad space you can fit into it and get that much.
> 
> put a Gentle Typhoon, NB-Eloop or EK Vardar fan the rads, most or all set to intake. If you are feeling lazy, get a EKWB D5 pump reservoir combo, if you wanna spice it up a bit more, go for a Monsoon Modular Reservoir and a MCP35x2 (dual DDC pump) from Swifttech.
> 
> Which type of radiator? http://www.xtremerigs.net/2015/02/11/radiator-round-2015/ check that out.
> 
> To answer your question, yes, dual 240's will be enough, but watercooling is like crack. It won't be too long before you want to add another rad to get even better temps, and so on and so forth, so just save yourself the time of having to take your rig apart multiple times and just max out the rad space from the start.


I'd go so far as to say invest in an aquarium water chiller for your loop. Just make sure it can be set to match ambient or sub-ambient, or you yourself know how to rig it up to do that. I was going to do that a few years ago. Even bought the unit for it. But decided to just stick a portable AC over my radiators in the end.


----------



## ChrisxIxCross

Seriously how is there no Pascal Bios Tweaker yet? I can't imagine what this card could do w/ unlocked voltage & PT


----------



## DNMock

So has anyone taken apart an Nvidia HB bridge yet? I would imagine once you take that stupid plastic cover off of it, you would end up with something looking like this:



Quote:


> Originally Posted by *ChrisxIxCross*
> 
> Seriously how is there no Pascal Bios Tweaker yet? I can't imagine what this card could do w/ unlocked voltage & PT


Some of us just hit regular shipping on day one and won't get our cards until tomorrow. Give it a week for the first wave of betas to come out then another week we should have some more refined variants.


----------



## HyperMatrix

Quote:


> Originally Posted by *DNMock*
> 
> So has anyone taken apart an Nvidia HB bridge yet? I would imagine once you take that stupid plastic cover off of it, you would end up with something looking like this:


http://www.pcworld.com/article/3087524/hardware/tested-the-payoff-in-buying-nvidias-40-sli-hb-bridge.html

Pictures of it without the LED cover portion. Feel free to ignore the article as "Gordon Mah Ung" from PC World is a "doodoo head" (self-censored) paid shill who thinks he knows what he's talking about...and did a whole review trying to promote the HB Bridge by comparing it to SLI Ribbon connectors as opposed to the actual hard connectors that come with any premium gaming motherboard. Every single article he's written infuriates me.


----------



## Mike211

My 2 Titan X


----------



## DNMock

Quote:


> Originally Posted by *HyperMatrix*
> 
> http://www.pcworld.com/article/3087524/hardware/tested-the-payoff-in-buying-nvidias-40-sli-hb-bridge.html
> 
> Pictures of it without the LED cover portion. Feel free to ignore the article as "Gordon Mah Ung" from PC World is a POS paid shill who thinks he knows what he's talking about...and did a whole review trying to promote the HB Bridge by comparing it to SLI Ribbon connectors as opposed to the actual hard connectors that come with any premium gaming motherboard. Every single article he's written infuriates me.


So is there a specific front and back on the HB bridge that prevents you from just turning it the other way or something? Wouldn't that give you all the space you need for an EKWB block?


----------



## HyperMatrix

Quote:


> Originally Posted by *DNMock*
> 
> So is there a specific front and back on the HB bridge that prevents you from just turning it the other way or something? Wouldn't that give you all the space you need for an EKWB block?


Technically, you should be able to split that thing down the middle and it should work fine (don't do it...there is some overlapping circuitry, likely just for the LED, but still). There is no bandwidth or data sharing between the 2 SLI fingers. I believe you can even use 2 individual sli connectors to simulate the functionality of the HB bridge. I can't give you a 100% answer, but in theory...there should be nothing preventing the usage of the HB bridge upside down. Because all it does is give connectivity to the fingers on each card.

Aqua Computer just released the preorder for their blocks as well, although there is no active XCS backplate yet, so I'm holding off. I'm wondering if they're not going to bother with an actively cooled backplate since the card comes with a passive backplate already. But if it does become an option, it'll help get a higher memory clock.

For reference: without any fans on my backplate right now, in some benches/games I can't get higher than 11Gbps without some artifacting popping up. But if I get some good airflow on the backplates, I can crank the memory up to 11.5Gbps without artifacting. Might not seem like much, but if you can get an extra ~5% memory OC with an actively cooled backplate, why not. I'm a big fan of them on my Maxwell Titan cards: dropped temps from the standard ~100C under OC and hours of intensive use down to about ~55C at 8000MHz.
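For context on what that memory bump is worth: effective bandwidth is just the per-pin data rate times the bus width. A quick sketch (the 384-bit bus is the Titan X Pascal's published spec; the rest is plain arithmetic):

```python
# Memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8 bits-per-byte.
# Titan X Pascal runs its GDDR5X on a 384-bit bus.

def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int = 384) -> float:
    """Effective memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

stock = bandwidth_gbs(10.0)  # stock 10 Gbps
oc = bandwidth_gbs(11.5)     # backplate-cooled OC from the post above
gain_pct = (oc / bandwidth_gbs(11.0) - 1) * 100  # 11.5 vs the 11 Gbps air-flow ceiling
print(stock, oc, f"{gain_pct:.1f}%")
```

So 11 → 11.5 Gbps is roughly a 4.5% bandwidth gain (528 → 552 GB/s), which matches the "extra 5%" ballpark in the post.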


----------



## Gary2015

Quote:


> Originally Posted by *CallsignVega*
> 
> Just moving your viewpoint even from the same relative position a few degrees can significantly change your FPS. That's why you either have to do these comparisons after a loading screen and don't move so everyone is at an identical position/view, or use an in-game benchmark.
> 
> I've had the top Intel CPU for like the last 8 years, just got rid of my 5960X to try out the 6700K. Supposedly SL guarantees me a 4.4 GHz 6950X, maybe I can get it to 4.5.


I got a 4.4 ghz 6800k from SL. Actually got it to 4.6ghz


----------



## Gary2015

Which blocks are better EK or Aquacomputer?


----------



## DNMock

Quote:


> Originally Posted by *HyperMatrix*
> 
> Technically, you should be able to split that thing down the middle and it should work fine (don't do it...there is some overlapping circuitry, likely just for the LED, but still). There is no bandwidth or data sharing between the 2 SLI fingers. I believe you can even use 2 individual sli connectors to simulate the functionality of the HB bridge. I can't give you a 100% answer, but in theory...there should be nothing preventing the usage of the HB bridge upside down. Because all it does is give connectivity to the fingers on each card.
> 
> Aqua Computer just released the preorder for their blocks as well. Although there is no Active XCS backplate yet so I'm holding off. I'm wondering if they're not going to bother with an active cooled backplate since the card comes with a passive backplate already. But if it does become an option, it'll help get a higher memory clock. For reference...without any fans on my backplate right now, in some benches/games I can't get higher than 11Gbps without some artifacting popping up. But if I get some good air flow on the backplates, I can crank the memory up to 11.5Gbps without artifacting. Might not seem like much...but if you can get an extra 5% memory OC with an active cooled backplate...why not. I'm a big fan of them on my Maxwell Titan cards. Dropped temps from the standard 100c~ under OC and hours of intensive use down to about 55c~ at 8000MHz.


From what I'm reading and seeing, there shouldn't be a problem just using a dremel and cutting off those two stupid tips.
Quote:


> Originally Posted by *Gary2015*
> 
> Which blocks are better EK or Aquacomputer?


Generally, they perform very similarly, the Aquacomputer blocks are much nicer looking, but they are usually a few weeks behind EK on releasing and come with a higher price tag.


----------



## HyperMatrix

Quote:


> Originally Posted by *Gary2015*
> 
> Which blocks are better EK or Aquacomputer?


Don't know about this generation. But for Maxwell, Aqua Computer was better. They're kind of a weird company though. The block actually sits on the memory modules. You have to put non-conductive thermal paste on them. Precision engineered. Haha. But better heat transfer. That along with active cooled backplate, means more heat dissipated from both sides of the card.


----------



## Gary2015

Quote:


> Originally Posted by *lutjens*
> 
> Thanks for the heads up...snagged the last one...


Wow that's pricey. Got mine for 389 GBP from scan computers.


----------



## Gary2015

deleted


----------



## lutjens

Quote:


> Originally Posted by *Gary2015*
> 
> Wow that's pricey. Got mine for 389 GBP from scan computers.


Yeah, but they're out of stock everywhere...and the places that carry them are so deep in backorders, the snow will fly before an order placed today will ship...


----------



## Gary2015

Quote:


> Originally Posted by *lutjens*
> 
> Yeah, but they're out of stock everywhere...and the places that carry them are so deep in backorders, the snow will fly before an order placed today will ship...


I might sell mine and get the new 4TB Samsung SSD.


----------



## Gary2015

Quote:


> Originally Posted by *Baasha*
> 
> Where did you get the SM961? Other than that store in Australia, I haven't seen it available anywhere (in the US).


Overclockers uk have it 499 GBP


----------



## outofmyheadyo

Quote:


> Originally Posted by *HyperMatrix*
> 
> Don't know about this generation. But for Maxwell, Aqua Computer was better. They're kind of a weird company though. The block actually sits on the memory modules. You have to put non-conductive thermal paste on them. Precision engineered. Haha. But better heat transfer. That along with active cooled backplate, means more heat dissipated from both sides of the card.


Do vrms need paste or thermalpads for aquacomp blocks?


----------



## HyperMatrix

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Do vrms need paste or thermalpads for aquacomp blocks?


vrms still require pads.


----------



## auraofjason

Took off the stock heatsink to put on a hybrid cooler, I was surprised how perfect the stock paste looked on mine compared to others that have posted theirs.


----------



## MrTOOSHORT

Ordered some CLU paste for the shunt resistor mod. Hopefully it removes most of the throttling. Will be using an EK FC block.


----------



## ChrisxIxCross

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Order some CLU paste for the shunt resistors mod. Hopefully it removes most of the throttling. Will be using an FC EK block.


So the only option we really have is a resistor mod? No way to tweak the bios to increase PT/ unlock voltage?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *ChrisxIxCross*
> 
> So the only option we really have is a resistor mod? No way to tweak the bios to increase PT/ unlock voltage?


So far yes.

If someone with hex editor knowledge steps up, then maybe the BIOS can be modded.


----------



## ChrisxIxCross

Just finished a Vive session w/ this card. It's absolutely INSANE how smooth everything runs compared to a 980 Ti, not to mention with 1.5x supersampling!


----------



## IlIfadeIlI

I noticed this too. VR seems WAY smoother compared to maxwell.


----------



## ChrisxIxCross

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> So far yes.
> 
> If some with knowledge with a HEX editor, then maybe the bios can be modded.


I'm surprised that it's been over 2 months and we still have no Pascal BIOS Tweaker. I hope they're still working on it.


----------



## ChrisxIxCross

Quote:


> Originally Posted by *IlIfadeIlI*
> 
> I noticed this too. VR seems WAY smoother compared to maxwell.


Yeah like I remember my 980 Ti was really struggling w/ the VR Funhouse demo while the Pascal Titan just KILLED it


----------



## HyperMatrix

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Order some CLU paste for the shunt resistors mod. Hopefully it removes most of the throttling. Will be using an FC EK block.


Have you done this yet?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *HyperMatrix*
> 
> Have you done this yet?


Just ordered it tonight. Will do the mod when I get the EK block.


----------



## carlhil2

Welp, I removed the thousands of screws and put my EK Uni to use.. 
have some heatsinks coming later today. this will hold me til I get the EK block with backplate..


----------



## outofmyheadyo

Have mercy, put a fan on the backside of the card and one on the VRM with a rubber band like a dude here showed previously


----------



## carlhil2

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Have mercy, put a fan on the backside of the card and one on the VRM with a rubber band like a dude here showed previously


I have about enough fans/air flow to hold it down til I get the heat sinks, I had this on one of my 980Ti Classifieds, she will be ok..







fans blowing on top, several from the bottom also..that's why I am not pushing it yet.. clocks are steady though..


----------



## outofmyheadyo

Quote:


> Originally Posted by *carlhil2*
> 
> I have about enough fans/air flow to hold it down til I get the heat sinks, I had this on one of my 980Ti Classifieds, she will be ok..
> 
> 
> 
> 
> 
> 
> 
> fans blowing on top, several from the bottom also..that's why I am not pushing it yet.. clocks are steady though..


Out of curiosity, what do you plan on using to attach the heatsinks to the card? I have a bit of a Frankenstein uniblock project planned and I'm not too sure what to use that could easily be removed later. Perhaps some 3M thermal tape or something?


----------



## carlhil2

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Out of curiosity, what do you plan on using to attach the heatsinks to the card? I have a bit of a Frankenstein uniblock project planned and I'm not too sure what to use that could easily be removed later. Perhaps some 3M thermal tape or something?


I have some of this .. https://www.amazon.com/dp/B019MUICV2?psc=1 I get a higher score with lower clocks since I added the uni.. http://www.3dmark.com/spy/223714


----------



## Maintenance Bot

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Just ordered it tonight. Will do the mod when I get the EK block.


Post pics please.

I will do this as well. Not sure which shunt to do: R51, R52, or R53?


----------



## outofmyheadyo

Quote:


> Originally Posted by *carlhil2*
> 
> I have some of this .. https://www.amazon.com/dp/B019MUICV2?psc=1 I get a higher score with lower clocks since I added the uni.. http://www.3dmark.com/spy/223714


Have you used it before? Usually if it sticks on and the sinks won't fall off, it's good that way, but if you ever want to take it off it's a major hassle.

No throttling = better score.


----------



## cg4200

Hey, I'm new to the shunt resistor mod. Is anyone here running their Titan XP with the mod already? If so, is it a big difference?
I saw the pics in the post and it seems easy. I ran out of CLU re-delidding my 6700K, so I'm ordering more. Would my Arctic Silver 5 work the same, or is it not conductive enough? Thanks. Also, it looks like we should pay someone a couple bucks to speed up a Pascal editor. I'd throw $10.00 on PayPal; we should start a fund to pay a pro to crack a hex editor.


----------



## carlhil2

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Have you used it before? Usually if it sticks on and the sinks won't fall off, it's good that way, but if you ever want to take it off it's a major hassle.
> 
> No throttling = better score.


Lol, true, ask my Classy about it...


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Post pics please
> 
> 
> 
> 
> 
> 
> 
> I will do this as well, not sure which shunt to do R51, R52, or R53 ?


Not so sure the mod is that easy anymore:



*https://xdevs.com/guide/pascal_oc/*

This is on an FE 1080; most likely the same as the Titan X Pascal.


----------



## HyperMatrix

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Not so sure the mod is that easy anymore:
> 
> 
> 
> *https://xdevs.com/guide/pascal_oc/*
> 
> This is on a FE 1080, most likely same as Titan-X P


If you don't feel comfortable soldering right on top, you could always mount the resistors on an empty part of the board with a glue gun, then just solder 30 gauge wire from the points in the picture to the resistor. Won't look quite as clean. But it'll get the job done.


----------



## Gary2015

Quote:


> Originally Posted by *HyperMatrix*
> 
> If you don't feel comfortable soldering right on top, you could always mount the resistors on an empty part of the board with a glue gun, then just solder 30 gauge wire from the points in the picture to the resistor. Won't look quite as clean. But it'll get the job done.


Send it back to Nvidia to get fixed.


----------



## pez

Quote:


> Originally Posted by *DNMock*
> 
> So has anyone taken apart an Nvidia HB bridge yet? I would imagine once you take that stupid plastic cover off of it, you would end up with something looking like this:
> 
> 
> Some of us just hit regular shipping on day one and won't get our cards until tomorrow. Give it a week for the first wave of betas to come out then another week we should have some more refined variants.


Quote:


> Originally Posted by *HyperMatrix*
> 
> http://www.pcworld.com/article/3087524/hardware/tested-the-payoff-in-buying-nvidias-40-sli-hb-bridge.html
> 
> Pictures of it without the LED cover portion. Feel free to ignore the article as "Gordon Mah Ung" from PC World is a "doodoo head" (self-censored) paid shill who thinks he knows what he's talking about...and did a whole review trying to promote the HB Bridge by comparing it to SLI Ribbon connectors as opposed to the actual hard connectors that come with any premium gaming motherboard. Every single article he's written infuriates me.


It's very unfortunate that that's the case. I saw the PCB was basically the same shape and would make doing SLI with the EK blocks impossible without modding. IIRC, even EVGA's HB bridge interferes unless you remove the cover. I'm hoping to see a very nice and sleek HB bridge from EK, but I haven't seen any news since they told people it was coming.


----------



## HyperMatrix

Quote:


> Originally Posted by *pez*
> 
> It's very unfortunate that that's the case. I saw the PCB was basically the same shape and would make doing SLI with the EK blocks impossible without modding. IIRC, even EVGA's HB bridge interferes unless you remove the cover. I'm hoping to see a very nice and sleek HB bridge from EK, but I haven't seen any news since they told people it was coming.


I'll check my Aqua computer blocks on maxwell Titans in a sec to see if the hb bridge fits on that.


----------



## techguymaxc

Quote:


> Originally Posted by *Baasha*
> 
> I render 4K 60FPS video WHILE playing games maxed out at 5K.


Got any personal benchmarks to back that up? This reviewer was only in the 50s for several AAA titles @ 5k on SLI Titan X Pascals with a 6950x. http://www.pcworld.com/article/3102877/components-graphics/tested-nvidias-new-titan-x-is-absolutely-decadant-in-sli.html
Quote:


> Originally Posted by *Baasha*
> 
> Try that with a quad core.


Don't need to. I have a 5GHz 4790k with my Titan X for gaming, a separate machine with a 5820k @ 4.4GHz for video rendering/media serving, and 2 Dell T410 servers with dual six core CPUs and 64GB RAM each for VMs and spawning additional rendering jobs if I feel like it. When you need more compute power: build another machine. Better yet, buy some time on Azure and spin up a VM https://azure.microsoft.com/en-us/pricing/details/virtual-machines/#Windows


----------



## pez

Quote:


> Originally Posted by *HyperMatrix*
> 
> I'll check my Aqua computer blocks on maxwell Titans in a sec to see if the hb bridge fits on that.


Cool.

I've essentially scrapped SLI plans at this point. 1080 SLI > Titan was worth it for me. Even have my ITX board ordered already. I thought a 1080 on its own was good for the X34, but the Titan X is just that much better. And if anything, I'm pretty sure the OC I've achieved with it helped my minimums in GTA V quite a bit.


----------



## HyperMatrix

Quote:


> Originally Posted by *techguymaxc*
> 
> Got any personal benchmarks to back that up? This reviewer was only in the 50s for several AAA titles @ 5k on SLI Titan X Pascals with a 6950x. http://www.pcworld.com/article/3102877/components-graphics/tested-nvidias-new-titan-x-is-absolutely-decadant-in-sli.html
> Don't need to. I have a 5GHz 4790k with my Titan X for gaming, a separate machine with a 5820k @ 4.4GHz for video rendering/media serving, and 2 Dell T410 servers with dual six core CPUs and 64GB RAM each for VMs and spawning additional rendering jobs if I feel like it. When you need more compute power: build another machine. Better yet, buy some time on Azure and spin up a VM https://azure.microsoft.com/en-us/pricing/details/virtual-machines/#Windows


Gordon Mah Ung is a total noob and you should consider anything he writes to be the words of an amateur who likes to pretend he knows what he's talking about. There will be times where he's correct by fluke. But overall, he's just a douche. Ignore him. And "PC World."

Also, Baasha said he plays games maxed out. He didn't say it was running at more than 30 fps. Haha. Though I think he also said he bought 4 Titan XPs.


----------



## HyperMatrix

Quote:


> Originally Posted by *pez*
> 
> Cool.
> 
> I've essentially scrapped SLI plans at this point. 1080 SLI > Titan was worth it for me. Even have my ITX board ordered already. I thought a 1080 on its own was good for the X34, but the Titan X is just that much better. And if anything, I'm pretty sure the OC I've achieved with it helped my minimums in GTA V quite a bit.


Nvidia HB SLI Bridge fits on my Maxwell Titans with the Aqua Computer Kryographics blocks. The pointy bits scrape against it a tiny bit as it's going into place, but it fits perfectly! So as long as they don't change anything on the new Titan XP blocks, that won't be a problem. That's a good bit of peace of mind for me. Haha.


----------



## pez

Quote:


> Originally Posted by *HyperMatrix*
> 
> Nvidia HB SLI Bridge fits on my Maxwell Titans with the Aqua Computer Kryographics blocks. The pointy bits scrape against it a tiny bit as it's going into place, but it fits perfectly! So as long as they don't change anything on the new Titan XP blocks, that won't be a problem. That's a good bit of peace of mind for me. Haha.


That is good to know, and that's good info for everyone else as well.


----------



## DNMock

Quote:


> Originally Posted by *HyperMatrix*
> 
> Nvidia HB SLI Bridge fits on my Maxwell Titans with the Aqua Computer Kryographics blocks. The pointy bits scrape against it a tiny bit as it's going into place, but it fits perfectly! So as long as they don't change anything on the new Titan XP blocks, that won't be a problem. That's a good bit of peace of mind for me. Haha.


Ya that's definitely good news, may have to cancel my ekwb pre-order and just wait on aquacomputer.


----------



## Gary2015

Quote:


> Originally Posted by *DNMock*
> 
> Ya that's definitely good news, may have to cancel my ekwb pre-order and just wait on aquacomputer.


just canceled my ek preorder and went for aqua..


----------



## pez

Link to the blocks you guys are ordering from Aqua? I'm a noob to WC'ing, so I'm just trying to get an idea for it. The way the Titan X OCs, and does so with results, has been enough to convince me I need to put it on water to maintain the best clocks I can.


----------



## Gary2015

Quote:


> Originally Posted by *pez*
> 
> Link to the blocks you guys are ordering from Aqua? I'm a noob to WC'ing, so I'm just trying to get an idea for it. The way the Titan X OCs, and does so with results, has been enough to convince me I need to put it on water to maintain the best clocks I can.


http://shop.aquacomputer.de/index.php?cPath=7_11_149


----------



## DNMock

http://shop.aquacomputer.de/index.php?cPath=7_11_149

*FAIL*


----------



## Gary2015

Quote:


> Originally Posted by *DNMock*
> 
> http://shop.aquacomputer.de/index.php?cPath=7_11_149


lol ..beat you to it..


----------



## CallsignVega

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Order some CLU paste for the shunt resistors mod. Hopefully it removes most of the throttling. Will be using an FC EK block.


How hard is Liquid Ultra to remove from those shunt resistors if warranty work is needed? I recall when I've used it, it's very hard to remove from CPUs, and you usually scrub off the writing on the Intel heat spreader doing it.

What about something like this?

http://www.thinkgeek.com/product/b70c/

We actually want to allow more current to flow over the shunt, correct?
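For what it's worth, my understanding of why the mod works (all values below are illustrative assumptions, not measurements from a real card): the controller estimates current from the voltage drop across the shunt using the stock resistance. Putting conductive material over the shunt adds a parallel path, lowering the effective resistance, so the same real current produces a smaller drop and the card under-reports power, which delays the power-limit throttle. A rough sketch:

```python
# Rough sketch of why the shunt mod lowers *reported* power.
# All resistance and current values are illustrative assumptions.

def parallel(r1, r2):
    """Effective resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def sensed_power(true_current_a, rail_v, shunt_ohm, assumed_shunt_ohm):
    """Power the controller *thinks* is drawn: it converts the measured
    voltage drop back to current using the stock (assumed) shunt value."""
    v_drop = true_current_a * shunt_ohm            # actual drop across the shunt
    sensed_current = v_drop / assumed_shunt_ohm    # controller assumes stock R
    return sensed_current * rail_v

stock_shunt = 0.005                      # 5 mOhm stock shunt (assumption)
mod_path    = 0.005                      # resistance of the CLU bridge (assumption)
modded      = parallel(stock_shunt, mod_path)   # 2.5 mOhm effective

current, rail = 20.0, 12.0               # 20 A actually flowing on a 12 V rail

print(sensed_power(current, rail, stock_shunt, stock_shunt))  # 240.0 W reported
print(sensed_power(current, rail, modded, stock_shunt))       # 120.0 W reported
```

With equal stock and mod-path resistances the reported power halves; the actual draw is unchanged, which is also why this mod makes real power consumption invisible to monitoring tools.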


----------



## pez

Quote:


> Originally Posted by *Gary2015*
> 
> http://shop.aquacomputer.de/index.php?cPath=7_11_149


Quote:


> Originally Posted by *DNMock*
> 
> http://shop.aquacomputer.de/index.php?cPath=7_11_149


Like clockwork. Thank you both. You think they will do a full-cover WB like EK? Aesthetically speaking, I just like the EK WB better.


----------



## Gary2015

Quote:


> Originally Posted by *pez*
> 
> Like clockwork. Thank you both. You think they will do a full-cover WB like EK? Aesthetically speaking, I just like the EK WB better.


No, Aqua don't usually do full-cover blocks.

I used Aqua blocks on my old SLI Titan X setup. Installation is quite fiddly for newbies; if the rubber grommet comes out, you will get leaks. I wanted to try EK this time, but the SLI HB bridge thing is a deal breaker.


----------



## CallsignVega

Actually this stuff looks perfect for the shunt mod and it can easily be removed:

https://smile.amazon.com/Bare-Conductive-Electric-Paint-10ml/dp/B00B888LQ8/ref=pd_sim_sbs_328_3?ie=UTF8&dpID=31t8BeiheHL&dpSrc=sims&preST=_AC_UL160_SR160%2C160_&psc=1&refRID=ZEH676G5GQPVR4ZYJDY2


----------



## pez

Quote:


> Originally Posted by *Gary2015*
> 
> No, Aqua don't usually do full-cover blocks.
> 
> I used Aqua blocks on my old SLI Titan X setup. Installation is quite fiddly for newbies; if the rubber grommet comes out, you will get leaks. I wanted to try EK this time, but the SLI HB bridge thing is a deal breaker.


Yeah, since I'm sticking with one GPU, I'll be more inclined toward them. It'd be nice if they could give us a solid ETA on their HB bridge.


----------



## Gary2015

Quote:


> Originally Posted by *pez*
> 
> Yeah, since I'm sticking with one GPU, I'll be more inclined toward them. It'd be nice if they could give us a solid ETA on their HB bridge.


Then go with EK if you're only using one GPU.


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> Actually this stuff looks perfect for the shunt mod and it can easily be removed:
> 
> https://smile.amazon.com/Bare-Conductive-Electric-Paint-10ml/dp/B00B888LQ8/ref=pd_sim_sbs_328_3?ie=UTF8&dpID=31t8BeiheHL&dpSrc=sims&preST=_AC_UL160_SR160%2C160_&psc=1&refRID=ZEH676G5GQPVR4ZYJDY2


Hey. Did you do that Metro Last Light bench run to compare to mine?


----------



## CallsignVega

Whoops, started the Steam download and forgot about it. Should be done now.


----------



## cookiesowns

Curious: for those that can do 2000MHz+ in Fire Strike or games, what temps, fan speed, TDP, and voltage are you running at?

My cards cap at around 1950 due to instability, temps and TDP.


----------



## Foxrun

Finally finished putting an AIO on it; dropped my temps by almost 20°C. Not a lot of thermal paste on mine, and the fan wire was tricky to place for the shroud to sit flush with the rest of the card. Starting to push this baby now!


----------



## CallsignVega

Quote:


> Originally Posted by *HyperMatrix*
> 
> I didn't try with HT off. But this is what I got:


Hmm, yours says Metro LL Benchmark and mine says "Metro ReduX Benchmark".

Is it the same thing?


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> Hmm, yours says Metro LL Benchmark and mine says "Metro ReduX Benchmark".
> 
> Is it the same thing?


Redux is 2033 and Last Light in one. I don't know if there were any performance improvements in it, but I guess we'll find out from your bench results. Haha. My benchmark was a tunnel, then a train coming, explosions, lots of fighting. At that point fps performance dropped in half.


----------



## dante`afk

Quote:


> Originally Posted by *Metros*
> 
> The GTX 980 Ti SLI and GTX 1080 SLI are easily fast enough then. This did not just happen with the Titan X; also, G-Sync will remove most of it as well.


G-Sync removes tearing, not microstutter; you'll still have microstutter with G-Sync and SLI.


----------



## Zurv

Quote:


> Originally Posted by *CallsignVega*
> 
> Hmm, yours says Metro LL Benchmark and mine says "Metro ReduX Benchmark".
> 
> Is it the same thing?


They are not the same; they revisited the game and tweaked it. Only Redux should be used to test (the last update to it was to work with the Titan X, 12/2015; non-Redux was from around 2013).

There is a Redux for both games.


----------



## Baasha

Quote:


> Originally Posted by *techguymaxc*
> 
> Got any personal benchmarks to back that up? This reviewer was only in the 50s for several AAA titles @ 5k on SLI Titan X Pascals with a 6950x. http://www.pcworld.com/article/3102877/components-graphics/tested-nvidias-new-titan-x-is-absolutely-decadant-in-sli.html
> Don't need to. I have a 5GHz 4790k with my Titan X for gaming, a separate machine with a 5820k @ 4.4GHz for video rendering/media serving, and 2 Dell T410 servers with dual six core CPUs and 64GB RAM each for VMs and spawning additional rendering jobs if I feel like it. When you need more compute power: build another machine. Better yet, buy some time on Azure and spin up a VM https://azure.microsoft.com/en-us/pricing/details/virtual-machines/#Windows


Well, there's this benchmark video:




I also have a 2nd rig (144Hz RoG Swift) w/ 2x Titan XP and a 3970X @ 4.5Ghz.

The point I made was that the 6950X is an absolute monster that can encode 4K video while playing games at 5K maxed out. No quad core can do that, period. I never claimed the 6950X is the best 'gaming' chip, whatever that means. The 6950X is the BEST CPU out there for the desktop market. Period.

Many people on this forum can't seem to understand that a budget CPU is just that: budget. A top-of-the-line CPU is the best. A Ford may be a better grocery-getter than a Lamborghini; too bad people here are trying to compare the former to the latter.


----------



## dante`afk

Quote:


> Originally Posted by *Foxrun*
> 
> 
> 
> 
> 
> Finally finished putting an AIO on it; dropped my temps by almost 20°C. Not a lot of thermal paste on mine, and the fan wire was tricky to place for the shroud to sit flush with the rest of the card. Starting to push this baby now!


how loud is the stock fan with the aio? considering leaving this or replacing it with kraken g10.


----------



## dante`afk

Quote:


> Originally Posted by *Baasha*
> 
> Well, there's this benchmark video:
> 
> 
> 
> 
> I also have a 2nd rig (144Hz RoG Swift) w/ 2x Titan XP and a 3970X @ 4.5Ghz.
> 
> The point I made was that the 6950X is an absolute monster that can encode 4K video while playing games at 5K maxed out. No quad core can do that period. I never claimed the 6950X is the best 'gaming' chip - whatever that means. The 6950X is the BEST CPU out there for the desktop market. Period.
> 
> Many people on this forum can't seem to understand that a budget CPU is just that, budget. A top of the line CPU is the best. A Ford maybe a better grocery-getter than a Lamborghini. Too bad people here are trying to compare the former to the latter.


Depends on each user, I guess.

Why should I get a 6950X if I only game and will never render videos or do anything that will require the 10 cores?


----------



## pez

Quote:


> Originally Posted by *dante`afk*
> 
> depends on each user I guess.
> 
> why should I get a 6950x if i only game and will never render videos or do anything ever that will require the 10 cores?


If you're running SLI Titans or 1080s, you'll get the benefit of x16/x16 without losing functionality of M.2/U.2, etc. Other than that, a Z97 or Z170 platform is going to generally be better for OC'ing and gaming performance.


----------



## DADDYDC650

Any x16/x16 vs x16/x8 vs x8/x8 PCIe benches with these beasts yet?


----------



## dante`afk

Quote:


> Originally Posted by *pez*
> 
> If you're running SLI Titans or 1080s, you'll get the benefit of x16/x16 without losing functionality of M.2/U.2, etc. Other than that, a Z97 or Z170 platform is going to generally be better for OC'ing and gaming performance.


There is still no difference between x16/x16 and x8/x8, though.

And SLI still has microstuttering, even at 200 fps with G-Sync.

And about M.2: if you're not rendering or don't otherwise require fast read/write performance, a normal SSD does the same job; the difference is only noticeable in benchmarks.
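For context on the x16 vs x8 debate, the theoretical numbers are easy to work out: PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding, so per-direction bandwidth scales linearly with lane count. A quick sketch (theoretical peak only; real-world throughput is lower due to protocol overhead):

```python
# Theoretical PCIe 3.0 per-direction bandwidth. Real-world throughput is
# lower because of TLP headers and flow-control overhead.

def pcie3_bandwidth_gbps(lanes):
    gt_per_s = 8.0               # PCIe 3.0: 8 GT/s per lane
    encoding = 128.0 / 130.0     # 128b/130b line coding efficiency
    return lanes * gt_per_s * encoding / 8.0   # GB/s per direction

for lanes in (4, 8, 16):
    print(f"x{lanes}: {pcie3_bandwidth_gbps(lanes):.2f} GB/s")
# x8 comes out to roughly 7.88 GB/s, x16 to roughly 15.75 GB/s per direction
```

So x8 really is half of x16 on paper; whether games actually saturate even x8 during SLI frame transfers is exactly what the benchmark requests here would settle.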


----------



## stefxyz

Dante, where is the x8/x8 proof?


----------



## DADDYDC650

Quote:


> Originally Posted by *dante`afk*
> 
> There is still no difference between 16x/16x and 8x/8x though.
> 
> And sli has still microstuttering, even with 200 fps and gsync.
> 
> And about m.2, if your not rendering/require fast read/write performances, a normal ssd does the same job, the differnce is only in benchmarks noticeable.


I'd like someone to bench the different pci-e speeds in SLI.


----------



## seckzee

In terms of SSDs vs M.2 drives: I am running a 512GB SM951 M.2 drive and I am consistently the first one to enter BF4 maps on map change.


----------



## outofmyheadyo

Same; load times are faster than regular SSDs in other games too.


----------



## sena

Quote:


> Originally Posted by *dante`afk*
> 
> There is still no difference between 16x/16x and 8x/8x though.
> 
> And sli has still microstuttering, even with 200 fps and gsync.
> 
> And about m.2, if your not rendering/require fast read/write performances, a normal ssd does the same job, the differnce is only in benchmarks noticeable.


We are on OCN; overkill is fun.

Back on topic: I am currently without a PC and looking forward to Titan X SLI. How is overclocking on water?


----------



## Foxrun

Quote:


> Originally Posted by *dante`afk*
> 
> how loud is the stock fan with the aio? considering leaving this or replacing it with kraken g10.


Same as it is without the AIO; it all depends on the fan speed. I have my AC going now, so I leave the fan at 60%.


----------



## Zurv

Quote:


> Originally Posted by *sena*
> 
> We are on OCN; overkill is fun.
> 
> Back on topic: I am currently without a PC and looking forward to Titan X SLI. How is overclocking on water?


We are all still waiting on blocks, but I'd assume you'd be solid over 2GHz; that is what I'm having no problem getting at 100% fan (and water will be better at cooling).


----------



## czin125

Wouldn't a maxed-out/binned 6700K rig + 6800K rig beat a 6950X rig? Unless you need all 10 cores for encoding when not multitasking.
Quote:


> Originally Posted by *pez*
> 
> If you're running SLI Titans or 1080s, you'll get the benefit of x16/x16 without losing functionality of M.2/U.2, etc. Other than that, a Z97 or Z170 platform is going to generally be better for OC'ing and gaming performance.


36 lanes is enough for two-card SLI and M.2, right?

Can the P6000 play games, or are certain functions disabled?


----------



## seckzee

It seems like OC'ing on water is only helping with temps at this point... seems like they're waiting for custom BIOS editing to be released.


----------



## pez

Quote:


> Originally Posted by *dante`afk*
> 
> There is still no difference between 16x/16x and 8x/8x though.
> 
> And sli has still microstuttering, even with 200 fps and gsync.
> 
> And about m.2, if your not rendering/require fast read/write performances, a normal ssd does the same job, the differnce is only in benchmarks noticeable.


Quote:


> Originally Posted by *stefxyz*
> 
> Dante where is the 8x/8x proof?


The HB bridges are already showing tangible results in frametimes (for the better), which suggests that transferring data over the PCIe bus isn't as efficient as a dedicated connection (i.e. an SLI bridge). The video below is not a perfect showing, but it is a relevant one, showing this is the case even for 980 Tis. Will you see a performance increase whether you're using x8/x8 or x16/x16? Definitely. Will x8/x8 bottleneck it to the point that it's useless? No.

Also, the micro-stuttering argument is over-used without any proof ever given... could you provide a source of non-anecdotal evidence on micro-stutters? After GTX 970 SLI and 1080 SLI, micro-stuttering was the last thing that was still an issue with those setups.

I will point you both to this video:




Quote:


> Originally Posted by *czin125*
> 
> Wouldn't a maxed out/binned 6700K rig + 6800K rig > 6950X rig? Unless you need all 10 cores for encoding when not multitasking.
> 36 lanes is enough for 2card SLI and m.2, right?
> 
> Can the P6000 play games or certain functions disabled?


Yeah, however, most X99/2011 chips have 28 or 40 lanes. 1150 and 1151 all have 20 IIRC. No idea about the P6000.


----------



## Woundingchaney

Quote:


> Originally Posted by *Foxrun*
> 
> 
> 
> 
> 
> Finally finished putting an AIO on it; dropped my temps by almost 20°C. Not a lot of thermal paste on mine, and the fan wire was tricky to place for the shroud to sit flush with the rest of the card. Starting to push this baby now!


What model AIO are you using, and did you have to do any tinkering to get it to mount?


----------



## dante`afk

Quote:


> Originally Posted by *stefxyz*
> 
> Dante where is the 8x/8x proof?


Quote:


> Originally Posted by *DADDYDC650*
> 
> I'd like someone to bench the different pci-e speeds in SLI.


Quote:


> Originally Posted by *pez*
> 
> The HB bridges are already showing tangible results in how it is affecting frametimes (for the better) and this is solely proving that transferring data via the PCIE bus isn't as efficient as a dedicated connection (i.e. SLI bridge). The video below is not a perfect showing, but a relevant one showing that this is the case for even 980Tis. Will you see a performance increase whether you're using x8/x8 or x16/x16? Definitely. Will it bottleneck it to the point that it's useless? No.
> 
> Also, the micro-stuttering argument is over-used without any proof ever given....could you provide a source of non-anecdotal evidence talking about micro-stutters? After GTX 970 SLI and 1080 SLI, micro-stuttering was the last thing that became an issue with those setups.
> 
> I will point you both to this video:
> 
> 
> 
> 
> Yeah, however, most X99/2011 chips have 28 or 40 lanes. 1150 and 1151 all have 20 IIRC. No idea about the P6000.






The HB bridge does nothing. I ran my own tests, and TechPowerUp found the same:

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/22.html


----------



## YpsiNine

Quote:


> Originally Posted by *ChrisxIxCross*
> 
> I'm surprised that it's been over two months and we still have no Pascal BIOS tweaker. I hope they're still working on it.


Who are "they"?

If you have some links to someone actively developing an editor that'll be appreciated. I have yet to find such a discussion...


----------



## KillerBee33

It's here


----------



## carlhil2

Putting my TXP under water didn't do any magic as far as OCing is concerned, but it doesn't throttle anymore. I won't go hard anyway til I get a FC block.


----------



## opt33

Mine is now ghetto-installed with my old Titan X hanging... waiting on the waterblock. I thought these boosted to 1500; mine says 1809 boost clock under load... guess it depends on the load.


----------



## Newtocooling

Does anyone know if my Heatkiller IV 980ti block would work with this card?


----------



## Z0eff

Quote:


> Originally Posted by *dante`afk*
> 
> 
> 
> 
> 
> the HB bridge, does nothing, I made myself tests, also techpowerup mentioned the same
> 
> 
> 
> https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/22.html


Those are 980s being tested though, not even 980 Tis. I think the question is more about how SLI Titan XP would handle PCIe Gen3 x8 speeds. Probably still fine, or very little change, but it would be interesting to test regardless.


----------



## pez

Quote:


> Originally Posted by *dante`afk*
> 
> 
> 
> 
> 
> the HB bridge, does nothing, I made myself tests, also techpowerup mentioned the same
> 
> 
> 
> https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/22.html


I'm not sure if you're ignoring what I said on purpose or just playing ignorant. I showed you a video with actual gameplay showing a tangible difference. You linked a video trying to teach me what I already know about PCIe and GPUs (and I've seen that video already).

You also tested FPS and not frame times. These are not the same thing, and there have been tests showing that frame times are improved. There's a reputable member on here who has even gone as far as to say he noticed a tangible difference with the HB bridge.

@Mad Pistol^

http://www.hardwareunboxed.com/nvidias-hb-sli-bridge-surprising-gains-gtx-1080-sli-testing-inside/

There's a link with at least 4 titles showing results that aren't just margin-of-error increases.
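To illustrate the FPS-vs-frame-time point with made-up numbers: two runs can average nearly the same FPS while one has far worse spikes, which is exactly what percentile frame times catch and a plain average hides:

```python
# Average FPS can hide stutter; percentile frame times expose it.
# Frame times are in milliseconds; the values are made up for illustration.

def avg_fps(frametimes_ms):
    """Average FPS over a run: 1000 ms divided by the mean frame time."""
    return 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))

def percentile(frametimes_ms, p):
    """Nearest-rank p-th percentile frame time."""
    s = sorted(frametimes_ms)
    idx = min(len(s) - 1, int(round(p / 100.0 * (len(s) - 1))))
    return s[idx]

smooth  = [16.7] * 100                      # steady ~60 fps
stutter = [12.0] * 90 + [60.0] * 10         # similar average, big spikes

print(avg_fps(smooth), avg_fps(stutter))    # ~59.9 vs ~59.5: look the same
print(percentile(smooth, 99), percentile(stutter, 99))  # 16.7 vs 60.0 ms
```

Both runs "benchmark" at roughly 60 fps, but the second one spends every tenth frame at 60 ms, which is the stutter you feel. That's why bridge comparisons need frame-time data, not just an FPS counter.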


----------



## Foxrun

Quote:


> Originally Posted by *Woundingchaney*
> 
> What model AIO are you using, and did you have to do any tinkering to get it to mount?


I am using EVGA's Hybrid kit for the 1080. The power cable for the fan and the pump is a pain to wire underneath the shroud of the fan. I was not able to take the backplate off because I did not have the proper hex driver for it, but you can still remove the fan header gently with a pair of needle-nose pliers. Other than that it's just your typical tinkering with screws.


----------



## Foxrun

Quote:


> Originally Posted by *seckzee*
> 
> it seems like their OC'ing on water is only helping with temp's at this point... seems like theyre waiting for custom bios editing to be released


Yeah, I can't get more than 225 on the core, but the water dropped my temps by 20°C.


----------



## carlhil2

Quote:


> Originally Posted by *opt33*
> 
> mine is now ghetto installed with my old titanx hanging....waiting on waterblock. I thought these boosted to 1500...mine says 1809 boost clock on load....guess it depends on load.


My card boosts to 1860 at stock...


----------



## DNMock

Not too bad.

About 33% faster stock on air out of the box than my best runs with Maxwell Titan X cards watercooled, heavily overclocked, with maxed-out voltage from a custom BIOS.


----------



## DNMock

BTW, anyone know if there is a way to convince the cards that the two SLI bridges are actually a single HB bridge?


----------



## Zurv

Quote:


> Originally Posted by *DNMock*
> 
> BTW, anyone know if there is a way to convince the cards that the two SLI bridges are actually a single HB bridge?


Just use an LED hard bridge; that will have the bandwidth needed for 4K. (Not a mobo hard bridge, but LED.)


----------



## HyperMatrix

Quote:


> Originally Posted by *Zurv*
> 
> just use an LED hard bridge. That will have the bandwidth needed for 4k. (not mobo hard bridge.. but LED)


Does the motherboard hard bridge give the warning message in Nvidia control panel saying your sli configuration could be performing better with a higher quality bridge? My assumption was that motherboard hard bridges were pretty much the same as the LED bridges.


----------



## Zurv

Quote:


> Originally Posted by *HyperMatrix*
> 
> Does the motherboard hard bridge give the warning message in Nvidia control panel saying your sli configuration could be performing better with a higher quality bridge? My assumption was that motherboard hard bridges were pretty much the same as the LED bridges.


It was my understanding that the LED bridge works at a higher clock. There was once an Nvidia slide that showed ribbon, hard, LED, and HB SLI bridges, with the LED being better than the hard bridge. I think it was a pre-release slide; the official ones only used ribbon, LED, and HB.


----------



## CaliLife17

Quote:


> Originally Posted by *Zurv*
> 
> it was my understanding that the LED bridge work at a higher hz. There was once a NVidia slide that showed ribbon, hard, led, and HB SLI bridges with the LED being better than the hard. I think it was some pre-release slide. The official ones only used ribbon, LED and HB.


is this the slide you are referring to?


----------



## Zurv

Quote:


> Originally Posted by *CaliLife17*
> 
> is this the slide you are referring to?


That was the final one, but they had one with hard in there too. *shrug* Don't ask me.. I'm getting old









That said, I used both hard and LED bridges with 4-way Titan X and I didn't really see a diff.. that said, SLI got REAL broken when Windows 10 came out. (Also, traffic can still spill over to the PCIe bus.)


----------



## Kyouki

Woot! Was not even expecting it till Monday! Now it gets to look pretty till the water block comes in.


----------



## HyperMatrix

Quote:


> Originally Posted by *Zurv*
> 
> it was my understanding that the LED bridge work at a higher hz. There was once a NVidia slide that showed ribbon, hard, led, and HB SLI bridges with the LED being better than the hard. I think it was some pre-release slide. The official ones only used ribbon, LED and HB.


Their chart says standard bridge, led bridge, and hb bridge. Standard is sli ribbons. I'm pretty sure hard bridge and led bridge do the same. There's no actual circuitry involved in these things I don't believe. It's just about the capability of the traces and the bandwidth they can provide.
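Back-of-envelope, the clock figures floating around for the different bridges work out roughly like this. This is only a sketch: the 400/540/650 MHz clocks and the dual-link HB assumption come from launch-era slides and forum chatter, not verified specs, so treat the numbers as assumptions.

```python
# Rough relative-bandwidth comparison of SLI bridge types.
# Clock and link-count figures are ASSUMPTIONS from launch-era
# slides, not official specs; verify against NVIDIA's docs.
BRIDGES = {
    "ribbon": {"clock_mhz": 400, "links": 1},
    "led/hard": {"clock_mhz": 540, "links": 1},
    "hb": {"clock_mhz": 650, "links": 2},  # HB is assumed dual-link
}

def relative_bandwidth(name, baseline="ribbon"):
    """Bandwidth of bridge `name` relative to the baseline bridge."""
    b, base = BRIDGES[name], BRIDGES[baseline]
    return (b["clock_mhz"] * b["links"]) / (base["clock_mhz"] * base["links"])

for name in BRIDGES:
    print(f"{name}: {relative_bandwidth(name):.2f}x ribbon")
```

If those inputs are right, an LED/hard bridge is only a modest step up over ribbon, while the HB bridge's second link is where the big jump comes from.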


----------



## Zurv

Quote:


> Originally Posted by *HyperMatrix*
> 
> Their chart says standard bridge, led bridge, and hb bridge. Standard is sli ribbons. I'm pretty sure hard bridge and led bridge do the same. There's no actual circuitry involved in these things I don't believe. It's just about the capability of the traces and the bandwidth they can provide.


If you recall, EVGA said their v2 LED was better than their first. Higher pixel clock or something.


----------



## dante`afk

Quote:


> Originally Posted by *pez*
> 
> I'm not sure if you're ignoring what I said on purpose or just trying to play ignorant. I showed you a video with actual gameplay showing a tangible difference. You linked a video trying to teach me what I already know about PCIE and GPUs (and I've seen that video already).
> 
> You also did testing of FPS and not frame times. These are not the same thing and there has been tests that show frame times have been improved. There's a reputable member on here that has even gone as far as to say he noticed a tangible difference with the HB bridge.
> 
> @Mad Pistol^
> 
> http://www.hardwareunboxed.com/nvidias-hb-sli-bridge-surprising-gains-gtx-1080-sli-testing-inside/
> 
> There's a link that has at least 4 titles with results that aren't just 'margin-of-error' increases.


I'm on the road, did not see your vid. I don't doubt it's getting better and better with each generation; nonetheless SLI is plagued with microstutter. I'm coming from SLI 580, 680, 780, 970, 1080.


----------



## EniGma1987

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Not so sure the mod is that easy anymore:
> 
> 
> 
> *https://xdevs.com/guide/pascal_oc/*
> 
> This is on a FE 1080, most likely same as Titan-X P


That is why I got scared off of doing CLU over the shunts. However, what the guide is talking about is shorting them out; using CLU is sort of the same, but it has much higher resistance than soldering a small wire. CLU should be fine because it still maintains enough resistance to not put the card in fault mode, but little enough that it still lets you get 2-3x the power target. So I will be doing the CLU route myself next weekend. I saw someone else here was going to do it as well.
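For anyone wondering why a higher-resistance path like CLU still raises the power ceiling, here's the arithmetic. The card infers current from the voltage drop across the shunt resistor, so anything conductive bridging it lowers the effective resistance and scales the reading down. Sketch only: the 5 milliohm shunt and CLU-path values below are hypothetical, not measured.

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def reported_power_fraction(r_shunt, r_mod):
    """Fraction of true power the controller reads after the mod.

    The controller computes current as V_shunt / R_shunt using the
    ORIGINAL R_shunt, but the actual drop is across the lower parallel
    resistance, so it under-reads by exactly this ratio.
    """
    return parallel(r_shunt, r_mod) / r_shunt

# Hypothetical values: a 5 mOhm shunt bridged by a CLU path of ~5 mOhm.
frac = reported_power_fraction(0.005, 0.005)
print(f"controller sees {frac:.0%} of true power -> ~{1/frac:.1f}x headroom")
```

A dead short (soldered wire) would drive the reading toward zero, which is what seems to trip fault mode; the CLU path keeping some resistance is why the card still behaves.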


----------



## HyperMatrix

Quote:


> Originally Posted by *dante`afk*
> 
> I'm in the road, did no see your vid, I don't doubt it's getting better and better with each generation; nontheless sli is plagued with MS. I'm coming from sli 580, 680, 780, 970, 1080.


There shouldn't be any microstutter at high frame rates. For me, unless a specific game has a problem with SLI, I can't tell the difference between running it in single-GPU or SLI mode. But again, I target 165Hz G-Sync.


----------



## opt33

Benching this is useless till I get it on water. Ran Time Spy with 2000 core, then 2100: score no different. Then I realized both runs had throttled from 84C temps. But this Titan XP at stock is ~35% faster than my old max-OCed Titan X... I'm just going to wait on the waterblock to play with it.


----------



## MaxFTW

Will I be able to use the new Titan X with my current PSU? I have been running a Titan X on my Seasonic 650W Gold for about a year now, and the new card is in the same power envelope, but still.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *MaxFTW*
> 
> Will i be able to use the new Titan X with my current PSU, I have been running Titan X on my seasonic 650W Gold for about a year now and it is in the same power envelope but still.


The new Titan-X will consume a little less power than the last Titan X.
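For anyone doing the same math on a 650W unit, a rough sanity check looks like this. All the wattages are ballpark assumptions, not measurements; plug in your own components.

```python
# Back-of-envelope PSU headroom check. Every wattage here is a rough
# ASSUMPTION for a typical single-GPU build, not a measurement.
PSU_WATTS = 650

system = {
    "titan_x_pascal": 250,          # stock TDP; more with a raised power target
    "cpu_overclocked": 150,
    "motherboard_ram_drives": 60,
    "fans_pump_misc": 40,
}

draw = sum(system.values())
headroom = PSU_WATTS - draw
print(f"estimated draw {draw}W, headroom {headroom}W "
      f"({headroom / PSU_WATTS:.0%} of the PSU)")
```

Even with a raised power target the single card should leave comfortable margin on a quality 650W unit, assuming the CPU side isn't drawing far more than sketched here.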


----------



## pez

Quote:


> Originally Posted by *HyperMatrix*
> 
> There shouldn't be any micro stutter at high frame rates. For me, unless a specific game has a problem with sli, I can't tel the difference between running it in single gpu or in sli mode. But again I target 165Hz GSYNC.


Yeah, my issues in SLI usually are a result of the title not having support or having very poor support. Nonetheless, my desire to go ITX has exceeded my desire to have a full on SLI rig this time around. Once the case I want comes in stock, or I decide on a new one, I'm going to be super excited to complete it.


----------



## KillerBee33

First run looks good, a bit hotter than the 1080.

http://www.3dmark.com/3dm/13928141


----------



## DNMock

Quote:


> Originally Posted by *Zurv*
> 
> just use an LED hard bridge. That will have the bandwidth needed for 4k. (not mobo hard bridge.. but LED)


I am, using 2 of them atm. The pixel clock seems to still be at 540 though.


----------



## carlhil2

Once the drivers are approved, these TXPs will be able to do what the 1080 couldn't: knock those pesky extreme-cooled 980 Ti runs off the top of the 3DMark Fire Strike HOF...


----------



## dante`afk

Quote:


> Originally Posted by *HyperMatrix*
> 
> There shouldn't be any micro stutter at high frame rates. For me, unless a specific game has a problem with sli, I can't tel the difference between running it in single gpu or in sli mode. But again I target 165Hz GSYNC.


If you pay very close attention, there is, even with G-Sync.

I did not believe it myself until I carefully compared it side by side.


----------



## Sheyster

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> So far yes.
> 
> If someone with knowledge looks at it with a HEX editor, then maybe the BIOS can be modded.


I'm going to be looking at the BIOS very closely once I get my card on Monday.









If it's at all similar to the old Titan X BIOS I may be able to get something done quickly. We'll see.


----------



## HyperMatrix

Quote:


> Originally Posted by *dante`afk*
> 
> If you pay very close attention there is even with gsync.
> 
> I did not believe it myself until I compared it side by side crucially.


Might be something else in your system causing it. I can detect when fps drops from 140 to 120. I would notice microstutter. But I don't unless as I said the game in question has a problem with sli in general. Maybe it's pcie lane limitation. Your quad core cpu. PLX chip depending on your motherboard. Shader cache. Who knows. Lots of things.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Sheyster*
> 
> I'm going to be looking at the BIOS very closely once I get my card on Monday.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If it's at all similar to the old Titan X BIOS I may be able to get something done quickly. We'll see.


Hope you do, thanks a lot.


----------



## Z0eff

Quote:


> Originally Posted by *Sheyster*
> 
> I'm going to be looking at the BIOS very closely once I get my card on Monday.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If it's at all similar to the old Titan X BIOS I may be able to get something done quickly. We'll see.


----------



## SundayGamer

My first run...


----------



## Stateless

I ordered my Titan X Pascal card and it should arrive on Monday. I am currently using 2 Titan X Maxwells under water, and going to a single Titan X Pascal. Based on what we know of the new card so far, will this single card outclass the Titan X Maxwell in SLI? I have been running SLI systems for a while without much issue, but for now I am just going with a single card and wondering how a single new Titan X will compare to dual Titan X Maxwell GPUs.


----------



## carlhil2

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Hope you do, thanks a lot.


Where are your notorious massive OCs on the RAM at? Waiting on blocks?


----------



## cookiesowns

Quote:


> Originally Posted by *Stateless*
> 
> I ordered my Titan X Pascal card and it should arrive on Monday. I am currently using 2 Titan X Maxwell under Water. Going to a single Titan X Pascal. Based on what we know of the new card so far, will this single card outclass the Titan X Maxwell in SLI? I have been running SLI systems for a while without much issues, but for now I am just going with a single card and wondering how a single new Titan X will compare to Dual Titan X Maxwell GPU?


Assuming you are overclocked on the Titan X Maxwells, it will not outclass them. However, I've noticed Pascal having much better performance consistency in general.

My 980 Ti SLI @ 1.55GHz still takes down a single TXP @ 1.9GHz in benchmark scores. (Different CPU, though that shouldn't affect GPU scores.)


----------



## MrTOOSHORT

Quote:


> Originally Posted by *carlhil2*
> 
> Where are your notorious massive OC on the ram at, waiting on blocks?


+750 on air is all she wrote I think. Massive enough. Vegas +900 is just a bonus, lucky dog!


----------



## Mad Pistol

Quote:


> Originally Posted by *Stateless*
> 
> I ordered my Titan X Pascal card and it should arrive on Monday. I am currently using 2 Titan X Maxwell under Water. Going to a single Titan X Pascal. Based on what we know of the new card so far, will this single card outclass the Titan X Maxwell in SLI? I have been running SLI systems for a while without much issues, but for now I am just going with a single card and wondering how a single new Titan X will compare to Dual Titan X Maxwell GPU?


It's hit-or-miss, unfortunately. A single Titan XP is faster than 980 Ti/Titan XM/1070 SLI in some cases, but not all. For one, most benchmarks tend to favor the SLI flavors listed before.

For games, though, the single Titan XP is definitely the preferred choice, especially since not all games have good SLI support.


----------



## SundayGamer

Quote:


> Originally Posted by *Stateless*
> 
> ... wondering how a single new Titan X will compare to Dual Titan X Maxwell GPU?


In benchmarks, a single TXP might lose, but in gaming I'm sure a single card will perform very well. I see massive improvement over the 980 Ti I had before; even OCed to 1500MHz, it couldn't really get more FPS in GTA 5 or BF4, and those are old titles already, while this TXP literally smashes BF4 at 85-100+ FPS (4K res) and a stable 60 FPS in GTA 5. Haven't installed other games yet, as I rebuilt my system and only got my TXP on Friday, so I'm still in the testing process.


----------



## cookiesowns

That said..

SLI Titan XP overclocked is still not enough to run The Division at Max settings 0 AA, 100% res scale at 1440P 165Hz. Thank goodness for Gsync. 110FPS is still better than 40-80FPS.


----------



## Mad Pistol

Quote:


> Originally Posted by *cookiesowns*
> 
> That said..
> 
> SLI Titan XP overclocked is still not enough to run The Division at Max settings 0 AA, 100% res scale at 1440P 165Hz. Thank goodness for Gsync. 110FPS is still better than 40-80FPS.


Yea. I'm shocked at how much of a resource hog The Division is. Even my 1070 SLI setup cannot run it at a consistent frame rate maxed out @ 3440x1440.


----------



## dante`afk

Quote:


> Originally Posted by *HyperMatrix*
> 
> Might be something else in your system causing it. I can detect when fps drops from 140 to 120. I would notice microstutter. But I don't unless as I said the game in question has a problem with sli in general. Maybe it's pcie lane limitation. Your quad core cpu. PLX chip depending on your motherboard. Shader cache. Who knows. Lots of things.


Nope, it's not only me; everyone who is staying at single GPU for those reasons will confirm that.

I did not see it myself for years until someone pointed out the differences; that's why you don't see it.


----------



## Stateless

Quote:


> Originally Posted by *Mad Pistol*
> 
> It's hit-or-miss, unfortunately. A single Titan XP is faster than 980 Ti/Titan XM/1070 SLI in some cases, but not all. For one, most benchmarks tend to favor the SLI flavors listed before.
> 
> For games, though, the single Titan XP is definitely the preferred choice, especially since not all games have good SLI support.


For me it is strictly for gaming. I will run some benches to see what my card can do. I will also be putting it under Water to push it as far as I can to maximize framerates while gaming at 4k. One of the reasons I am going single card, well at least for now is that many games recently are not getting SLI support or poor SLI support.


----------



## Mad Pistol

Quote:


> Originally Posted by *Stateless*
> 
> For me it is strictly for gaming. I will run some benches to see what my card can do. I will also be putting it under Water to push it as far as I can to maximize framerates while gaming at 4k. One of the reasons I am going single card, well at least for now is that many games recently are not getting SLI support or poor SLI support.


Then the Titan XP is the ticket, for sure. GTX 1070 SLI was barely within my budget. Even selling my two 1070's right now, I couldn't afford the difference to spring for a TXP.

That, and I think NONE OF US expected Nvidia to drop a Titan so soon after the 1080/1070 launch.


----------



## HyperMatrix

Quote:


> Originally Posted by *dante`afk*
> 
> Nope it's not only me, everyone who is staying at single gpu fo those reasons will confirm that.
> 
> I did not see it myself for years until someone pointed out the differences, that's why you don't see it.


I disable sli for a few games that I play. So I play with it on and off every day. And I don't see it.


----------



## Stateless

Quote:


> Originally Posted by *Mad Pistol*
> 
> Then the Titan XP is the ticket, for sure. GTX 1070 SLI was barely within my budget. Even selling my two 1070's right now, I couldn't afford the difference to spring for a TXP.
> 
> That, and I think NONE OF US expected Nvidia to drop a Titan so soon after the 1080/1070 launch.


Thanks.

When will we begin to see the BIOS mods for the Titan X P? From my research and reading, people have not been able to add voltage to push it farther. Lastly, it has been a while since I overclocked my GPU; is MSI Afterburner or Precision X the preferred program for overclocking?


----------



## SundayGamer

Quote:


> Originally Posted by *Stateless*
> 
> is MSI Afterburner or Precision X the preferred program for overclocking?


I OCed mine in MSI Afterburner, worked fine: 2050MHz. Haven't played a lot, first attempt.


----------



## CallsignVega

Well, I did the resistor shunt mod with this stuff and it did absolutely nothing:



https://www.amazon.com/gp/product/B00CSMDT8S/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1

That stuff just has too much resistance IMO. Went ahead and ordered Liquid Ultra.

A strange thing happened too when I did the shunt mod: my EVGA Hybrid turned into an Arctic Accelero, what the heck?



This air cooler is actually a beast. Cools just as well as the EVGA Hybrid and it's quiet. Going back to all air with my new system. All of these modern components just don't put out the heat like they used to.


----------



## Stateless

Quote:


> Originally Posted by *SundayGamer*
> 
> I ocd my in MSI Afterburner, worked fine- 2050mhz, haven't played a lot, first attempt


Thanks. Can you confirm whether you can add any voltage within the program? On my old Titan Maxwell I was able to add some voltage via the program, but it was not until we had a modded BIOS that I was able to add a bit more.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Stateless*
> 
> Thanks. Can you confirm that you can or cannot add any voltage within the program? On my old Titan Maxwell I was able to add some voltage via the program, but not until we had modded bios that I was able to add a bit more.


No voltage control at all with the current AB.


----------



## czin125

Quote:


> Originally Posted by *pez*
> 
> Yeah, however, most X99/2011 chips have 28 or 40 lanes. 1150 and 1151 all have 20 IIRC. No idea about the P6000.


Z170 has 20+16 right?


----------



## sherlock

Quote:


> Originally Posted by *czin125*
> 
> Z170 has 20+16 right?


No, Z170 has 16+4, and only the 16 can be used for SLI; the x4 is routed through the chipset and mostly used to run PCIe SSDs/LAN/USB 3.1 ports etc.


----------



## czin125

But the CPU is x16 and the PCH is x20?


----------



## axiumone

Quote:


> Originally Posted by *czin125*
> 
> But the cpu is 16x and the pch is 20x?


No, the CPU has 16 and the PCH has 4, equaling 20 usable lanes for certain devices.


----------



## Gary2015

Quote:


> Originally Posted by *seckzee*
> 
> it seems like their OC'ing on water is only helping with temp's at this point... seems like theyre waiting for custom bios editing to be released


Would hold off on the blocks for now until EK fixes the HB bridge problem. Otherwise go with AC.


----------



## Testier

Quote:


> Originally Posted by *CallsignVega*
> 
> Strange thing happened too when I did the shunt mod. My EVGA Hybrid turned into a Arctic Accellero what the heck?
> 
> This air cooler is actually a beast. Cools just as well as the EVGA Hybrid and it's quiet. Going back to all air with my new system. All of these modern components just don't put out the heat like they used to.


Which version do you have, and does it cool the VRMs properly?


----------



## dante`afk

Quote:


> Originally Posted by *CallsignVega*
> 
> Well I did the resister shunt mod with this stuff and it did absolutely nothing:
> 
> 
> 
> https://www.amazon.com/gp/product/B00CSMDT8S/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1
> 
> That stuff just has too much resistance IMO. Went ahead and ordered Liquid Ultra.


Hmmm, I ordered that too... doesn't work?


----------



## 8472

Quote:


> Originally Posted by *CallsignVega*
> 
> 
> 
> This air cooler is actually a beast. Cools just as well as the EVGA Hybrid and it's quiet. Going back to all air with my new system. All of these modern components just don't put out the heat like they used to.


Off topic, but what kind of CPU temps do you get with the Noctua? I have a 280mm AIO, but in RealBench things still get too hot unless I turn the fans up to an annoying level.


----------



## guttheslayer

So what is the max clock the TXP stayed while under water?

Capped at 2.1 GHz?


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> Hmm, yours says Metro LL Benchmark and mine says "Metro ReduX Benchmark".
> 
> Is it the same thing?


Hey. Here's the bench in Metro LL Redux. Did you upload yours already? Don't remember seeing it.


----------



## Bloodymight

Quote:


> Originally Posted by *CallsignVega*
> 
> Strange thing happened too when I did the shunt mod. My EVGA Hybrid turned into a Arctic Accellero what the heck?
> 
> 
> 
> This air cooler is actually a beast. Cools just as well as the EVGA Hybrid and it's quiet. Going back to all air with my new system. All of these modern components just don't put out the heat like they used to.


What are your temps with that Arctic Accelero?

Edit: found a review (German) with the Arctic Accelero on a GTX 1080.






curious to see how it handles the titan X


----------



## cookiesowns

Man, I still don't get how you guys get such good cards.

While my cards seem to be game stable at around 1970-2025MHz at 100% fan, they will fail 3DMark because the TDP limit causes the core voltage to drop too aggressively. I'm thinking I can do 2050MHz max on water with these cards.

My cards will only do 1780-1800 at 120% TDP + 100% fan before throttling in 3DMark with 0 offset. Memory seems to be good up to +450.


----------



## Gary2015

Quote:


> Originally Posted by *cookiesowns*
> 
> Man, I still don't get how you guys get such good cards.
> 
> While my cards seem to be game stable at around 1970-2025 Mhz 100% FAN, it will fail 3Dmark due to too aggressive of a TDP limit causing core voltage to drop too aggressively. I'm thinking I can do 2050Mhz max on Water on these cards.
> 
> My cards will only do 1780-1800 on 120% TDP + 100% FAN before throttling in 3Dmark with 0 offset. Memory seems to be good up till +450.


Luck of the draw. If you can be bothered, you can always exchange...


----------



## cookiesowns

Quote:


> Originally Posted by *Gary2015*
> 
> Luck of the draw. If you can be bothered, you can always exchange...


What I want to see is logs of other people's cards. We could make some good correlations of ASIC quality on Pascal based on boost bins, voltage bins and overall OC potential if we have enough data points.


----------



## Gary2015

Quote:


> Originally Posted by *cookiesowns*
> 
> What I want to see is logs of other people's cards. Can make some good correlation of ASIC quality of pasta based on their boost bins, voltage bins and overall OC potential if we have enough data points.


That's what I suggested earlier in the thread when people got their cards. So far it seems overclocks aren't that great.


----------



## Gary2015

Quote:


> Originally Posted by *CallsignVega*
> 
> Well I did the resister shunt mod with this stuff and it did absolutely nothing:
> 
> 
> 
> https://www.amazon.com/gp/product/B00CSMDT8S/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1
> 
> That stuff just has too much resistance IMO. Went ahead and ordered Liquid Ultra.
> 
> Strange thing happened too when I did the shunt mod. My EVGA Hybrid turned into a Arctic Accellero what the heck?
> 
> 
> 
> This air cooler is actually a beast. Cools just as well as the EVGA Hybrid and it's quiet. Going back to all air with my new system. All of these modern components just don't put out the heat like they used to.


So I guess fitting waterblocks won't do much...


----------



## Woundingchaney

I'm actually looking forward to what we get with aftermarket cooling for these cards. It looks like they are simply begging for a better cooling solution.


----------



## Gary2015

Quote:


> Originally Posted by *Woundingchaney*
> 
> I'm actually looking forward to what we get with aftermarket cooling for these cards. It looks like they are simply begging for a better cooling solution.


It will lower temps but won't do much for performance as of now.


----------



## stefxyz

I disagree. Even if we don't achieve higher clock speeds, we will see much more stable clock speeds, which smooths out the experience significantly. Right now they fluctuate in benchmarks and in most games based on thermal and power restrictions. Once temps go below 50 degrees Celsius with watercooling, this changes. I see this with my 1080, and basically the Titan is the same chip with more CUDA cores and higher memory bandwidth.


----------



## Woundingchaney

Quote:


> Originally Posted by *Gary2015*
> 
> It will lower temps but won't do much performance as of now.


That is kind of the point, though. As of right now, performance is slightly degraded because the card seems to start throttling at around 80 degrees. I agree that I don't see the headroom in overall OC increasing, but keeping clocks solid would only improve performance.


----------



## cookiesowns

Quote:


> Originally Posted by *Gary2015*
> 
> It will lower temps but won't do much performance as of now.


It'll help with boost throttling due to temps. I think I could hit 2050+ in some games that don't trigger hard TDP limits if I had these on water. In 3DMark the lower temps should help even with throttling, and you can always add more offset to the core clock to raise the clock at the TDP throttle bin!

These cards can be stable at 1950+ at under 1.03V

Koolance QDCs get here Tuesday, so I'll have these under water by Wednesday.
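To illustrate the offset trick: GPU Boost steps the clock in small bins, and throttling walks you down the voltage/frequency curve, while a positive core offset shifts the whole curve up, so the bin you land at while throttled is higher too. Sketch only: the ~13 MHz bin size and the example card's numbers are assumptions, not measurements.

```python
# Sketch of why adding core offset raises the clock you land at even
# while TDP-throttled. The ~13 MHz Pascal boost-bin granularity is an
# ASSUMPTION, and the example card below is hypothetical.
BIN_MHZ = 13

def throttled_clock(max_boost_mhz, bins_dropped, offset_mhz=0):
    """Clock after dropping `bins_dropped` boost bins, with an offset applied.

    The offset shifts the whole curve, so it adds directly to the
    throttled clock as long as the card drops the same number of bins.
    """
    return max_boost_mhz + offset_mhz - bins_dropped * BIN_MHZ

# Hypothetical card: 1898 MHz max boost, throttling 8 bins under load.
print(throttled_clock(1898, 8))                  # throttled, no offset
print(throttled_clock(1898, 8, offset_mhz=100))  # same throttle, +100 offset
```

The caveat is that a higher clock at the same voltage draws more power, so in a hard TDP-limited scenario the card may just drop extra bins; this is why the trick works better once water keeps temps out of the picture.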


----------



## Gary2015

Quote:


> Originally Posted by *stefxyz*
> 
> I disagree. Even if we dont achieve higher clockspeeds we will see much more stable clock speeds which smoothens the experience significantly. Right now they fluctuate on benchmarks and in most games based on thermal and power restrictions. Once temps go below 50 c delsius with watercoolingh this changes. I see this with my 1080 and basicvally the titan is the same chip with more cuda cores and higher memory bandwith.


Well, be my guest; go test it out and let us know your findings.


----------



## Gary2015

Quote:


> Originally Posted by *cookiesowns*
> 
> It'll help with boost throttling due to temps. I think I can hit 2050+ in some games that don't trigger hard TDP limits. If I had these on water. In 3Dmark the temps should help even with throttling. Can always just add more offset to core clock to raise the clocks at the TDP throttle bin!
> 
> These cards can be stable at 1950+ at under 1.03V
> 
> Koolance QDCs get here Tuesday so I'll have these under water by Wednesday.


How does that translate in frame rates?


----------



## cg4200

Quote:


> Originally Posted by *CallsignVega*
> 
> Well I did the resister shunt mod with this stuff and it did absolutely nothing:
> 
> 
> 
> https://www.amazon.com/gp/product/B00CSMDT8S/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1
> 
> That stuff just has too much resistance IMO. Went ahead and ordered Liquid Ultra.
> 
> Strange thing happened too when I did the shunt mod. My EVGA Hybrid turned into a Arctic Accellero what the heck?
> 
> 
> 
> This air cooler is actually a beast. Cools just as well as the EVGA Hybrid and it's quiet. Going back to all air with my new system. All of these modern components just don't put out the heat like they used to.


That's too bad it did nothing to help with power. Which ones did you do? I see one in the picture. Do you think Arctic Silver 5 would work? I was going to switch out the thermal paste for Thermal Grizzly and was thinking I would try the shunt mod myself.

I see random OC results; wish we had an average chart, and wish I didn't work 6 days a week or I would start one. I am curious about other cards' stock boost. Mine boosts to 1873 stock; the max clock I hit is 2088 with OC. My max overclock is +205 on the core, can't go higher, and that is with no OC on memory. I can get +750 on memory, so basically running 205/750 will pass Fire Strike; 205/600 gives a higher score, due to throttling I think. With a water block it can keep boost at 2050 to 2088 instead of bouncing around.


----------



## renejr902

For people interested: my exchange from an i5 4690 to an i7 4790K removed 66-75% of the FPS drops and stuttering in Witcher 3 at 4K (no HairWorks, no AA) with the Titan X Pascal. With the Titan overclocked at core +205 / mem +700, Witcher 3 runs at 60+ FPS nearly all the time, except when a lot of effects are on screen, several enemies are in a fight, or it's raining heavily with a lot of people around; otherwise 60+, so I can use vsync with no tearing. Thanks guys for the advice about buying an i7 CPU.


----------



## CallsignVega

Quote:


> Originally Posted by *HyperMatrix*
> 
> Hey. Here's the bench in Metro LL Redux. Did you upload yours already? Don't remember seeing it.


No, not yet; I'm on my laptop as my dang FiOS is down. Verizon is pissing me off.

Quote:


> Originally Posted by *dante`afk*
> 
> hmmm I ordered that too...doesnt work?


No, it didn't work. I'm thinking it has too much resistance, or it's a compound that doesn't make good contact with the ends of the resistors.
Quote:


> Originally Posted by *Bloodymight*
> 
> What are your temps with that Arctic Accelero
> 
> Edit: found a review(german) with the arctic Accelero on a gtx 1080
> 
> 
> 
> 
> 
> 
> curious to see how it handles the titan X


Temps are pretty great for an air cooler. The Arctic is running high 40's C with 2060 MHz on the core.


----------



## unreality

Since we cannot read any ASIC values so far, is there another way of quickly seeing how good a card is? Like default voltage or max boost clock out of the box? Any comparison values so far?

I really hope my card is coming tomorrow, and I do hope for some silicon lottery luck this time!


----------



## Bloodymight

Quote:


> Originally Posted by *CallsignVega*
> 
> Temps are pretty great for an air cooler. The Arctic is running high 40's C with 2060 MHz on the core.


Is that at idle or under load?

The only time I used an Accelero Xtreme was with a HD4870X2









Don't remember the temps at all though ._.


----------



## stefxyz

Put the power target to 120%, fan to 100%, and raise the core and mem clocks. The higher you can go, the better the card...


----------



## Woundingchaney

Quote:


> Originally Posted by *Gary2015*
> 
> How does that translate in frame rates?


If you are stressing your card and temperature is throttling the max MHz, then the lower the MHz, the lower the framerate. Essentially, with the Nvidia cooler people are able to OC above what they can keep cool. Aftermarket cooling would remove this concern, allowing people to maintain their set OC during gameplay or benchmarking.


----------



## CallsignVega

Quote:


> Originally Posted by *Bloodymight*
> 
> Is that in Idle or load?
> 
> The only time I used an Accelero Xtreme was with a HD4870X2
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Don't remember the temps at all thought ._.


lol load of course.

HT with Physx


No HT with Physx


HT no Physx


No HT no Physx


So basically, an 8-10 core is able to offload the PhysX to the CPU better than a 6700K. Actually a pretty CPU-intensive benchmark; it wasn't maxing out my GPUs, but that's also due to the low resolution of the benchmark.


----------



## Z0eff

Quote:


> Originally Posted by *pez*
> 
> If you're running SLI Titans or 1080s, you'll get the benefit of x16/x16 without losing functionality of M.2/U.2, etc. Other than that, a Z97 or Z170 platform is going to generally be better for OC'ing and gaming performance.


Quote:


> Originally Posted by *axiumone*
> 
> No, the cpu has 16 and the pch has 4. Equaling to 20 usable lanes for certain devices.


Something's off here. I'm pretty sure Z170 is 16+20 with the 16 coming from the CPU directly and the 20 from the PCH.
If the Z170 platform has a total of 20 PCIe lanes then what happens when my GPU takes up 16 of those lanes? GPUz confirms it's running at PCIe Gen3 x16.

See the original Z170 article on anandtech: http://www.anandtech.com/show/9485/intel-skylake-z170-motherboards-asrock-asus-gigabyte-msi-ecs-evga-supermicro

How could my SATA and Ethernet ports still be functioning if the PCH only has 4 PCIe lanes itself and the other 16 just routed from the CPU?

EDIT: Of course if you want to run Quad SLI or benefit from 16x/16x Dual SLI then X99 it is, running graphics cards off of the PCH isn't an option as far as I'm aware.
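The lane accounting that reconciles the "16+4" and "16+20" views looks roughly like this. Sketch only: the lane counts come from public Z170 block diagrams, and treating DMI 3.0 as roughly a PCIe 3.0 x4 equivalent is an approximation.

```python
# Z170 lane budget sketch. The CPU provides 16 gen3 lanes for graphics;
# the PCH exposes up to 20 more gen3 lanes (shared HSIO with SATA/USB/LAN),
# but all PCH traffic funnels upstream through DMI 3.0, which is roughly
# PCIe 3.0 x4 of bandwidth. Counts are from public block diagrams and
# should be treated as approximate.
CPU_LANES = 16
PCH_LANES = 20          # lanes physically exposed by the chipset
DMI_UPSTREAM_EQUIV = 4  # shared upstream bottleneck, in x-lane equivalents

total_exposed = CPU_LANES + PCH_LANES
total_concurrent_equiv = CPU_LANES + DMI_UPSTREAM_EQUIV
print(f"lanes you can wire up: {total_exposed}")
print(f"lanes' worth of concurrent bandwidth: ~{total_concurrent_equiv}")
```

So both posts are right in a sense: you can attach devices to 36 lanes, but simultaneous bandwidth behind the chipset tops out around the x4 DMI link, which is why "16+4" is the fairer number for SLI discussions.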


----------



## pez

Quote:


> Originally Posted by *czin125*
> 
> Z170 has 20+16 right?


Quote:


> Originally Posted by *sherlock*
> 
> No, Z170 have 16+4 and only the 16 can be used for SLI, the X4 is routed through chipset and mostly used to run PCIE SSDs/LAN/USB3.1 ports etc.


Correct. Good info, sherlock.
Quote:


> Originally Posted by *Woundingchaney*
> 
> I'm actually looking forward to what we get with aftermarket cooling for these cards. It looks like they are simply begging for a better cooling solution.


As others have said, it probably won't do much to increase overall clocks, but will work more to maintain a more consistent GPU clock. I.e., instead of dropping from 2000MHz in the first 20 minutes down to 1911MHz after about 2 hours of gameplay, on water it may start at 2000MHz and only drop to 1974 or 1950 after the same amount of time under load.
Quote:


> Originally Posted by *Z0eff*
> 
> Something's off here. I'm pretty sure Z170 is 16+20 with the 16 coming from the CPU directly and the 20 from the PCH.
> If the Z170 platform has a total of 20 PCIe lanes then what happens when my GPU takes up 16 of those lanes? GPUz confirms it's running at PCIe Gen3 x16.
> 
> See the original Z170 article on anandtech: http://www.anandtech.com/show/9485/intel-skylake-z170-motherboards-asrock-asus-gigabyte-msi-ecs-evga-supermicro
> 
> How could my SATA and Ethernet ports still be functioning if the PCH only has 4 PCIe lanes itself and the other 16 just routed from the CPU?
> 
> EDIT: Of course if you want to run Quad SLI or benefit from 16x/16x Dual SLI then X99 it is, running graphics cards off of the PCH isn't an option as far as I'm aware.


I'll take a look, but unless something has changed, SATA and Ethernet run off of their own chipsets. Hell, Z97 boards sometimes had 3 or 4 different chipsets for Ethernet, SATA, etc. M.2 SATA and PCIe SSDs should be the only things using PCIe. However, see the pic posted in the quote I quoted above.


----------



## Edge0fsanity

Quote:


> Originally Posted by *Z0eff*
> 
> Something's off here. I'm pretty sure Z170 is 16+20 with the 16 coming from the CPU directly and the 20 from the PCH.
> If the Z170 platform has a total of 20 PCIe lanes then what happens when my GPU takes up 16 of those lanes? GPUz confirms it's running at PCIe Gen3 x16.
> 
> See the original Z170 article on anandtech: http://www.anandtech.com/show/9485/intel-skylake-z170-motherboards-asrock-asus-gigabyte-msi-ecs-evga-supermicro
> 
> How could my SATA and Ethernet ports still be functioning if the PCH only has 4 PCIe lanes itself and the other 16 just routed from the CPU?
> 
> EDIT: Of course if you want to run Quad SLI or benefit from 16x/16x Dual SLI then X99 it is, running graphics cards off of the PCH isn't an option as far as I'm aware.


You can run graphics cards in the x4 slot on Z170, but you won't have SLI support. I tested my TXP in the x4 slot when I got it, since I can't remove my 980 Tis yet due to them being under water.


----------



## CallsignVega

The confusion is that the DMI 3.0 link is effectively only a PCIe 3.0 x4 speed link, no matter how many lanes are connected downstream of the PCH. Nothing connected to the PCH will run faster than that. That's why on my Z170, my two 950 Pros in RAID 0 are ceiling-limited at 3.0 x4 speed.
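A quick back-of-the-envelope calc makes that ceiling concrete (my own numbers, not from this thread: PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding, and DMI 3.0 is electrically a 3.0 x4 link):

```python
# Theoretical one-way PCIe 3.0 bandwidth; DMI 3.0 behaves like a 3.0 x4 link,
# so everything hanging off the PCH shares roughly this ceiling.
GT_PER_SEC = 8.0        # PCIe 3.0 transfer rate per lane
ENCODING = 128 / 130    # 128b/130b line-encoding efficiency

def pcie3_bandwidth_gbps(lanes: int) -> float:
    """Theoretical one-way bandwidth in GB/s for a PCIe 3.0 link."""
    return GT_PER_SEC * ENCODING * lanes / 8  # Gbit/s -> GB/s

print(f"DMI 3.0 / PCH ceiling: ~{pcie3_bandwidth_gbps(4):.2f} GB/s")   # ~3.94
print(f"CPU x16 GPU slot:      ~{pcie3_bandwidth_gbps(16):.2f} GB/s")  # ~15.75
```

A single 950 Pro already reads around 2.5 GB/s, so a RAID 0 pair behind the PCH saturates that ~3.94 GB/s link long before the drives do.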


----------



## NoDoz

My card was delivered Friday. Can't wait to get home tomorrow and check it out.


----------



## Steven185

Quote:


> Originally Posted by *carlhil2*
> 
> Putting my TXP under water didn't do any magic as far as OCing is concerned, but, it doesn't throttle anymore. I will not go hard anyways til I get a FC block.....


At this point we're not overclocking, we're just avoiding throttling. As it turns out, even stock clocks are extremely high out of the box (higher than the GTX 1080?). This thing boosts to 1850MHz+, which is 20% more than the stated stock boost. The problem is retaining those clocks; the stock cooling can't, so it ends up averaging far below those values.

If you *can* maintain boost you're already 10-15% higher than the rest of us, so I would say watercooling matters. It's just that OCing has changed: we get most of the clocks from nVidia, and all we need to do is maintain them nowadays; gone are the days when we had to find the clocks ourselves.

Still, maintaining clocks *is* effective overclocking, so I've started looking at those hybrid water coolers. I'm still not sure if they fit out of the box or if I'd need to modify my Titan XP in some way (I'd prefer not to).
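For what it's worth, the 20% figure checks out against the spec sheet (NVIDIA's stated boost clock for the Titan X Pascal is 1531MHz); a one-liner to verify:

```python
# Stated boost clock from NVIDIA's Titan X (Pascal) spec page vs. the
# ~1850MHz boost people in this thread are reporting out of the box.
STATED_BOOST_MHZ = 1531
observed_mhz = 1850

headroom_pct = (observed_mhz / STATED_BOOST_MHZ - 1) * 100
print(f"~{headroom_pct:.1f}% above stated boost")  # ~20.8%
```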


----------



## dante`afk

Quote:


> Originally Posted by *8472*
> 
> Off topic but what kind of cpu temps do you get with the Noctua. I have 280mm AIO, but in realbench things still get too hot unless I turn the fans up to an annoying level.


I have the AIO 280mm H115i here too, also the Noctua DH15. The Noctua actually cools better than the AIO... but I need to do further testing.


----------



## Jpmboy

For anyone wondering, I tried to fit an EK Supremacy VGA uniblock to the TXP leaving the fan and base plate installed... no good. The block base is larger than the cutout for the OEM heatsink. You'd either have to mill the EK block or Dremel the base plate opening larger. Neither of which is worth doing (for me) with the EK full-cover blocks shipping shortly.









For those sticking with air cooling, changing the thermal paste to Grizzly Kryo dropped peak sustained temperatures on that card by at least 5C (67C to 61C during Heaven 4.0)

lol - 1 TXP pushing 1440P/120Hz and 4K60 live stream (55 inch Seiki).



a Glorious *20* threads!


----------



## pez

Quote:


> Originally Posted by *dante`afk*
> 
> I have the AIO 280mm H115i here too, also the Noctua DH15. The Noctua actually cools better than the AIO... but I need to do further testing.


Both should cool similarly as long as sufficient airflow is in place. However, the Corsair unit will require the fans ramp up quite a bit more for that to happen. It's possible to add more fans to the unit, but at that point, you're breaching a $150 price point, and once you've hit that, you might as well look into something like a Swiftech or EKWB kit.


----------



## mouacyk

Quote:


> Originally Posted by *Jpmboy*
> 
> for anyone wondering, I tried to fit an EK supremacy VGA uniblock to the TXP leaving the fan and base plate installed... no good. the block base is larger than the cutout for the OEM heat sink. You'd either have to mill the EK block or dremmel the base plate opening larger. Neither of which are worth doing (for me) with the EK full cover blocks shipping shortly.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For those sticking with air cooling, changing the thermal paste to Grizzly Kryo dropped peak sustained temperatures on that card by at least 5C (67C to 61C during Heaven 4.0)
> 
> lol - 1 TXP pushing 1440P/120Hz and 4K60 live stream (55 inch Seiki).
> 
> 
> 
> a Glorious *20* threads!


paleez... your picture won't even fit on my monitor


----------



## kakan9

How is the performance increase with the Titan X if you use a 6700K with different RAM speeds? Or is that negligible?


----------



## carlhil2

Quote:


> Originally Posted by *stefxyz*
> 
> I disagree. Even if we don't achieve higher clock speeds, we will see much more stable clock speeds, which smooths the experience significantly. Right now they fluctuate in benchmarks and in most games based on thermal and power restrictions. Once temps go below 50 degrees Celsius with watercooling, this changes. I see this with my 1080, and basically the Titan is the same chip with more CUDA cores and higher memory bandwidth.


This.. as I stated before, putting my card under water didn't result in any OC magic, but, I am at a steady 2035/2050 in, for example, GTA V, no down-clocking because of temps...I did lower my ram OC from +600 to +400, til I get a FC block..


----------



## Gary2015

Quote:


> Originally Posted by *carlhil2*
> 
> This.. as I stated before, putting my card under water didn't result in any OC magic, but, I am at a steady 2035/2050 in, for example, GTA V, no down-clocking because of temps...I did lower my ram OC from +600 to +400, til I get a FC block..


What is the difference in frame rates, steady vs down clocking due to throttling at high temps?


----------



## carlhil2

Quote:


> Originally Posted by *Gary2015*
> 
> What is the difference in frame rates, steady vs down clocking due to throttling at high temps?


Well, for example, in benching, I get higher scores at a lower OC. My card's temp never goes over 45, depending on ambient temps; with the AC on, temps hardly cross 40. My card doesn't down-clock at all in any game. That speaks for itself as far as the advantages go... no more "bouncing betty" clocks...


----------



## Gary2015

Quote:


> Originally Posted by *carlhil2*
> 
> Well, for example, in benching, I get higher scores at lower OC. my cards temp never goes over 45, depending on ambient temps. with the AC on, temps hardly cross 40. my card doesn't down clock at all in any game. that speaks for itself as far as the advantages go....


Can you quantify benching scores? Would really like to know. Thanks. I'm holding off water-cooling at the moment because there are no hard numbers to justify the cost. If you could put up some numbers, benching scores and games like GTA V for example, that would be much appreciated.


----------



## carlhil2

Quote:


> Originally Posted by *Gary2015*
> 
> Can you quantify benching scores? Would really like to know. Thanks. I'm holding off water-cooling at the moment because there are no hard numbers to justify the cost. If you could put up some numbers, benching scores and games like GTA V for example, that would be much appreciated.


My latest FS Ultra.. http://www.3dmark.com/fs/9674194 edit: oops, my bad, you want gaming, get back to you on that...regardless, clocks still fluctuate in graphics test one to some degree..in gaming, not at all...


----------



## 8472

Quote:


> Originally Posted by *dante`afk*
> 
> I have the aio 280mm h115i too here, also the noctua dh15. the noctua actually cools beter than the aio...but I need to do further testing.


Do you still have the 4790k in your sig? I'm wondering if it'll also perform better when cooling a HW-E cpu.

I'm going to have to look into that accelero as well. With a H90 on my TXP, I'm getting 44C max at stock after an hour of Valley. If it can match that, it'll allow me to have less cable clutter in my build.


----------



## carlhil2

And to think, just weeks ago, ,most didn't think that these cards would break 2000, now, seems everyone is expecting 2100+...on air..


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> My latest FS Ultra.. http://www.3dmark.com/fs/9674194 edit: oops, my bad, you want gaming, get back to you on that...regardless, clocks still fluctuate in graphics test one to some degree..in gaming, not at all...


Without a custom BIOS they're all gonna be ±100 points








http://www.3dmark.com/3dm/13946288


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> Without Custom BIOS they all gonna be +- 100 points
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/13946288


Nice score..







I dare not push my card yet, the only thing being cooled is the chip at the moment..


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> Nice score..
> 
> 
> 
> 
> 
> 
> 
> I dare not push my card yet, the only thing being cooled is the chip at the moment..


The only thing cooling mine is:


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> The only thing cooling mine is:


My ram chips, etc., are bare, removed the nVIDIA shroud/cooler entirely...


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> My ram chips, etc., are bare, removed the nVIDIA shroud entirely...


Got EVGA's Hybrid kit for the 10 series laying untouched; will probably use it if BIOS tools show up. Other than that, I haven't seen over 72 degrees even @ 4K gaming.


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> My ram chips, etc., are bare, removed the nVIDIA shroud/cooler entirely...


http://i.memeful.com/media/post/lMzzPmM_700wa_0.gif






















I'm glad I waited and didn't Dremel my 1080 for that kit; I don't want to do the power mod, and OC options are limited now.


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> http://i.memeful.com/media/post/lMzzPmM_700wa_0.gif
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm glad i waited and didn't Dremel my 1080 for that Kit, don't want to do the Power mod and OC options are limited now .


Because I am going FC block, the cooler needed to come off anyways... this is a temp job..


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> Because I am going FC block, the cooler needed to come off anyways... this is a temp job..


This thing won't fit with the factory fan still in? That sux.


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> This thing wont fit with factory fan still in? that sux.


It might have; I didn't even try. Like I said, the whole cooler needed to come off anyway, so why make more work for myself? All of those screws though...


----------



## MunneY

Just wanted to stop by and give everyone my .02 on this card.

Mine boosts to 1800ish at stock, around 1900 when you bump the power limit.

From there I can OC it up to about 2070MHz-ish before it starts getting temperamental with me. I haven't had the time to REALLY push it, but I suspect if you can hold temps down in that 50C range you can get into that 2100+ range easily.

I wish I had more time to play with it, but it's back to the farm for me.


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> It might have, didn't even try, like I said, the whole cooler needed to come off anyways, why make more work for myself. all of those screws though...


Ehh, if only I had time for a custom loop








Really want to try Custom Wall Mount.


----------



## Sheyster

Quote:


> Originally Posted by *stefxyz*
> 
> I disagree. Even if we don't achieve higher clock speeds, we will see much more stable clock speeds, which smooths the experience significantly. Right now they fluctuate in benchmarks and in most games based on thermal and power restrictions. Once temps go below 50 degrees Celsius with watercooling, this changes. I see this with my 1080, and basically the Titan is the same chip with more CUDA cores and higher memory bandwidth.


It's nothing a modded BIOS won't fix. I would wait on aftermarket cooling a bit.







When I say this I don't mean the folks who are putting them in loops.


----------



## carlhil2

Quote:


> Originally Posted by *Sheyster*
> 
> It's nothing a modded BIOS won't fix. I would wait on aftermarket cooling a bit.
> 
> 
> 
> 
> 
> 
> 
> When I say this I don't mean the folks who are putting them in loops.


+1...I had no issue gaming with the stock cooler, at full blast, it wasn't even too loud for me. my system is under water, so, I made the move that I made for that reason only..


----------



## Sheyster

Quote:


> Originally Posted by *CallsignVega*
> 
> That's why on my Z170, my *two 950 Pro's in Raid 0* are ceiling limited at 3.0 X4 speed.


You sir are a BOSS!


----------



## Sheyster

Quote:


> Originally Posted by *dante`afk*
> 
> I have the AIO 280mm H115i here too, also the Noctua DH15. The Noctua actually cools better than the AIO... but I need to do further testing.


Yeah, I think I'm done with AIO's for CPU cooling. Probably just gonna pick up a True Spirit 140 Power Edition and call it a day. Plus, it's super quiet.


----------



## DADDYDC650

I'd go air cooling but those heatsinks are massive and take away from the aesthetics of builds.


----------



## gamingarena

Quote:


> Originally Posted by *DADDYDC650*
> 
> I'd go air cooling but those heatsinks are massive and take away from the aesthetics of builds.


Aesthetics? What aesthetics? It's all about performance


----------



## Sheyster

Quote:


> Originally Posted by *DADDYDC650*
> 
> I'd go air cooling but those heatsinks are massive and take away from *the aesthetics of builds.*


Open bench FTW son! That's how I roll now.


----------



## CRITTY

Quote:


> Originally Posted by *techguymaxc*
> 
> K. Enjoy your micro-stutter.


sh sh sh shut your mouth


----------



## hotrod717

It would be nice if, when we're talking about water cooling, people specified what they're actually using. There are large differences in setups, and just saying "water cooled" really doesn't tell much. There is a huge difference between a single 240 loop on CPU and GPU vs. dedicated loops and 3x 1080 rads. Ambient air and load temps also play a role. It's hard to make any kind of comparison without those details.


----------



## cookiesowns

Quote:


> Originally Posted by *Jpmboy*
> 
> for anyone wondering, I tried to fit an EK supremacy VGA uniblock to the TXP leaving the fan and base plate installed... no good. the block base is larger than the cutout for the OEM heat sink. You'd either have to mill the EK block or dremmel the base plate opening larger. Neither of which are worth doing (for me) with the EK full cover blocks shipping shortly.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For those sticking with air cooling, changing the thermal paste to Grizzly Kryo dropped peak sustained temperatures on that card by at least 5C (67C to 61C during Heaven 4.0)
> 
> lol - 1 TXP pushing 1440P/120Hz and 4K60 live stream (55 inch Seiki).
> 
> 
> 
> a Glorious *20* threads!


Bummer. I guess I'll repaste, or get rid of the cards. Don't feel worth dropping another $300 on mediocre clocking cards.


----------



## carlhil2

Quote:


> Originally Posted by *cookiesowns*
> 
> Bummer. I guess I'll repaste, or get rid of the cards. Don't feel worth dropping another $300 on mediocre clocking cards.


Did you try OCing without touching the ram?


----------



## cookiesowns

Quote:


> Originally Posted by *carlhil2*
> 
> Did you try OCing without touching the ram?


Yes. Stock, my cards won't do 1800 boost in 3DMark. They will TDP-throttle down to 1750MHz-ish.


----------



## Testier

Quote:


> Originally Posted by *cookiesowns*
> 
> Yes. Stock my cards won't do 1800 boost in 3Dmark. It will TDP throttle down to 1.75 ish.


Wait, your card cant do 1800 in 3dmark overclocked?


----------



## dante`afk

Who needs watercooling? And that's only with a 140mm AIO, because the 280mm does not fit onto the GPU. Didn't even re-apply fresh TIM, just kept the old one.

I should be able to push it a bit further down temperature-wise with Thermal Grizzly TIM tomorrow.



30 degrees lower than with the MK26. ****'s running into the PT all the time and downclocking then.


----------



## tpwilko08

The stock boost on my card is 1885MHz as reported by GPU-Z; don't know if this is good or not?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *tpwilko08*
> 
> The stock boost on my card is 1885Mhz as reported on GPUZ don't know if this is good or not?


That's pretty good. You have a good card.


----------



## Jpmboy

Quote:


> Originally Posted by *cookiesowns*
> 
> Bummer. I guess I'll repaste, or get rid of the cards. Don't feel worth dropping another $300 on mediocre clocking cards.


thermal throttling (well, dropping clock bins; throttling is more of a stack execution delay, in my day anyway) starts at ~45C. Open AB 4.3 beta, put the power limit at 120%, open GPU-Z and do the PCIe test - what boost is showing in the GPU-Z sensor tab?
Quote:


> Originally Posted by *cookiesowns*
> 
> Yes. Stock my cards won't do 1800 boost in 3Dmark. It will TDP throttle down to 1.75 ish.


I'd bet they do, but slam into the PL, VL and TL.








Quote:


> Originally Posted by *tpwilko08*
> 
> The stock boost on my card is 1885Mhz as reported on GPUZ don't know if this is good or not?


good... and normal as far as I know - this is with the PL at 120%?


----------



## tpwilko08

Quote:


> Originally Posted by *Jpmboy*
> 
> good... and normal as far as I know - this is with the PL at 120%?


No this was with the PL at 100%.


----------



## tpwilko08

Quote:


> Originally Posted by *dante`afk*
> 
> who needs watercooling? and that only with a 140mm AiO because the 280mm does not fit onto the gpu. did not even re-apply fresh TIM, just took the old one.
> 
> I should be able to push it even a bit further down temperaturewise with thermal grizzly TIM tomorrow.
> 
> 
> 
> 30 degrees lower than with mk26. ****'s running into PT all the time and downclocking then.


Whats your max stock boost at 100% PT?


----------



## MunneY

So which one of y'all is gonna unlock power target and all that jazz... I can't wait to see all these cards in that 2300 range under water.


----------



## HyperMatrix

Quote:


> Originally Posted by *MunneY*
> 
> so which one of ya'll is gonna unlock power target and all that jazz... I can't wait to see all these cards in that 2300 range under water.


I'm thinking voltage limit is going to be an issue when trying for 2300MHz. I'd expect 2200MHz under water. And 2250 if lucky. I would love to be wrong.


----------



## carlhil2

My GPU only OCs to 2062; above that crashes the driver in Heaven... needs voltage...


----------



## MunneY

Quote:


> Originally Posted by *carlhil2*
> 
> My gpu only OC to 2062, above that crashes the driver in Heaven...needs voltages..


I really don't think we'll ever get it without some sort of hard mod.


----------



## carlhil2

Quote:


> Originally Posted by *MunneY*
> 
> I really don't think we'll ever get it without some sort of hard mod.


True, I am very happy with this card though..


----------



## dante`afk

you don't need voltage, you're running into PT cap


----------



## Testier

Quote:


> Originally Posted by *MunneY*
> 
> I really don't think we'll ever get it without some sort of hard mod.


Well, I will be happy with some type of power limit unlock. My issue with the card is that it throttles down once the temp goes up and it hits the power limit.

Doubt we will see voltage being a huge factor unless the card is kept below ambient.


----------



## carlhil2

Quote:


> Originally Posted by *dante`afk*
> 
> you don't need voltage, you're running into PT cap


You could be right. I will just leave it at +195, which gives me 2050 til further notice, lol. it doesn't throttle from that, so, I will be good..


----------



## Testier

Quote:


> Originally Posted by *carlhil2*
> 
> You could be right. I will just leave it at +195, which gives me 2050 til further notice, lol


I feel like I am being bottlenecked by my CPU in certain games. 4.3GHz is simply not fast enough for Haswell IMO.

You see it on your 5960x?


----------



## carlhil2

Quote:


> Originally Posted by *Testier*
> 
> I feel like I am being bottlenecked by my CPU in certain games. 4.3ghz is simply not fast enough for haswell IMO.
> 
> You see it on your 5960x?


I am pushing a 4K monitor and have been playing GTA V to test my gaming OC, haven't had any issues yet... my chip is at 4.7, cache at 4.5...


----------



## HyperMatrix

Quote:


> Originally Posted by *dante`afk*
> 
> you don't need voltage, you're running into PT cap


Power target causes throttling. Voltage limit causes the driver crash. We have both problems right now depending on the bench. A modified BIOS can up the power target a ton, but we have to see how much extra voltage, if any, can be applied, either through MSI Afterburner itself or through a modified BIOS like we had with the last 2 Titan cards. There are a lot of unknowns. But honestly, I'm thrilled that it can do 2K+ on air. I'm looking at the rest of it as a potential 10-15% bonus. There are some games that can't quite max out 165Hz at 1440p that could use the boost.
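The throttling side of this can be sketched with a toy model (assumed numbers throughout: GPU Boost 3.0's exact algorithm isn't public, but Pascal is widely observed to step the core down in roughly 13MHz bins whenever a power or thermal limiter trips):

```python
BIN_MHZ = 13  # approximate size of one Pascal boost bin (assumed)

def next_clock(clock_mhz, power_w, temp_c, power_limit_w=300, temp_limit_c=84):
    """Drop one bin while a limiter is tripped, otherwise hold (simplified)."""
    if power_w > power_limit_w or temp_c > temp_limit_c:
        return clock_mhz - BIN_MHZ
    return clock_mhz

# Hypothetical samples: two power-limited intervals, then back under the cap.
clock = 2000
for power_w, temp_c in [(310, 70), (305, 72), (295, 75)]:
    clock = next_clock(clock, power_w, temp_c)
print(clock)  # 1974
```

Raising the power limit (modded BIOS) or lowering temps (water) removes those downward steps; extra voltage only matters once the clock is stable but crashing.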


----------



## Testier

Quote:


> Originally Posted by *carlhil2*
> 
> I am pushing a 4K monitor and have been playing GTA V to test my gaming OC, haven't any issues yet..my chip is at 4.7, cache at 4.5..


What settings you running with GTA V? I still drop to 35ish fps on ultrawide maxed out.


----------



## carlhil2

Quote:


> Originally Posted by *Testier*
> 
> What settings you running with GTA V? I still drop to 35ish fps on ultrawide maxed out.


Max settings, other than AA, which I use at 2x, Tess on high, reflection x2, and grass on very high... I use vsync in this game... I still get dips to the low 40's, in the bench anyways; in game, smooth as butter...


----------



## cookiesowns

Quote:


> Originally Posted by *Testier*
> 
> Wait, your card cant do 1800 in 3dmark overclocked?


They can do 1950ish when overclocked, but it's borderline stable. They will do 1950-2030 in games though.

As far as STOCK BOOST, 120% TDP, 100% fan, they will not do 1800 in 3DMark.
Quote:


> Originally Posted by *Jpmboy*
> 
> thermal throttling (well, dropping clock bins, throttling is more of a stack execution delay, in my day anyway
> 
> 
> 
> 
> 
> 
> 
> ) starts at ~45C. Open AB 4.3 beta, put the power limit at 120%, open GPU-Z and do the PCIe test - what boost is showing in the GPU-Z sensor tab?
> I'd bet they do, but slam into the PL, VL and TL.
> 
> 
> 
> 
> 
> 
> 
> 
> good... and normal as far as I know - this is with the PL at 120%?


I was under the impression that people are getting 1850+ sustained stock, with just a PT increase to 120% and more aggressive fan curves, in 3DMark. Is this not correct?

I'm looking at average / lowest clock throughout the bench, not Maximum boost that only kicks in for 5-10 seconds in a game/bench.

My cards hit about 1810-1873 stock boost ( 120% TDP ) if that's the case, for like 5-10 seconds in 3Dmark LOL.


----------



## Testier

Quote:


> Originally Posted by *cookiesowns*
> 
> They can do 1950 ish when overclocked, but it's borderline stable. They will do 1.95-2030 in games though.


Thats around the same clock my card get for game clocks.


----------



## DNMock

I'm not sure how people use these cards on air. Being thermally limited just feels like the card is getting neutered. Even with the cap set to 85C I basically freak out every time I see those temps, since I'm so used to seeing upper 30's/low 40's with the voltage cranked all the way up, limited only by whatever OC the card is capable of putting out lol.

Quote:


> Originally Posted by *Testier*
> 
> I feel like I am being bottlenecked by my CPU in certain games. 4.3ghz is simply not fast enough for haswell IMO.
> 
> You see it on your 5960x?


Is your resolution at 800 x 600 or are you running a game that only uses 2 threads or something? 5930K at 4.5 is showing nothing close to bottleneck with SLI XP, but that's in Fallout 4 and Witcher 3, haven't tested on other games yet.


----------



## DNMock

~Deleted double post


----------



## cookiesowns

Timespy bench: 15405, 2x TXP @ 1930-2021MHz, mostly sitting around 1950 / +400 on mem. 6950X @ 4.4, 3.5 cache, 8x8GB 3000 @ C15-2T.

GPUs were at +143 & +156 to maintain SLI boost, since the 2nd card doesn't boost the same way as the first card.



FSX. Same settings as above, but boost mostly sat around 1930. Might not be stable, and had GPUZ open along with AB during this run. Listening to spotify on both runs as well.



For comparo, my best run in FSX with the ol' 5960X @ 4.75, and 2x 980Ti KPE at ~1550 Mhz or so.

http://hwbot.org/submission/2967299_cookiesowns_3dmark___fire_strike_ultra_2x_geforce_gtx_980_ti_9926_marks


----------



## Jpmboy

Quote:


> Originally Posted by *tpwilko08*
> 
> Whats your max stock boost at 100% PT?


what's yours? that's the question.








Quote:


> Originally Posted by *MunneY*
> 
> so which one of ya'll is gonna unlock power target and all that jazz... I can't wait to see all these cards in that 2300 range under water.


Quote:


> Originally Posted by *HyperMatrix*
> 
> I'm thinking voltage limit is going to be an issue when trying for 2300MHz. I'd expect 2200MHz under water. And 2250 if lucky. I would love to be wrong.


not really... the PL is the problem. Whether or not Pascal does better with a bit more voltage (not extreme voltage) is unknown AFAIK
Quote:


> Originally Posted by *MunneY*
> 
> I really don't think we'll ever get it without some sort of hard mod.


Didn't you say the same thing about the Titan XM?








IF we get a Pascal BIOS editor, we'll be able to change the PL and voltage to the limit of the VRM/PCM.
Quote:


> Originally Posted by *cookiesowns*
> 
> They can do 1950 ish when overclocked, but it's borderline stable. They will do 1.95-2030 in games though.
> 
> As far as STOCK BOOST, 120% TDP 100% fan, they will not do 1800 in 3dmark.
> I was under the impression that people are getting 1850+ sustained stock, with just PT increase to 120% and more aggresive fan curves, in 3DMark. Is this not correct?
> 
> I'm looking at average / lowest clock throughout the bench, not Maximum boost that only kicks in for 5-10 seconds in a game/bench.
> 
> My cards hit about 1810-1873 stock boost ( 120% TDP ) if that's the case, for like 5-10 seconds in 3Dmark LOL.


they are slamming the PL. look at AB and you'll see the power limit kick in.


----------



## CallsignVega

Whoever is staying on air, you need to ditch that stock cooler and get an Arctic Accelero III. In my testing this thing is a great and quiet cooler. Seriously good deal for $54 on Amazon. 48C full load at 2080MHz core at about 1/4th the noise of the stock cooler at max fan.


----------



## Baasha

Just installed my 3rd and 4th Titan X Pascal in the Asus RoG Swift rig:





The CPU (3970X @ 4.50GHz vs. 6950X @ 4.30GHz) seems to have a HUGE impact in terms of performance in synthetic benchmarks (haven't tried gaming yet).

Anyway, real world benchmarks of the Titan X Pascal in SLI (2x GPUs):


----------



## Testier

Quote:


> Originally Posted by *CallsignVega*
> 
> Whoever is staying on air you need to ditch that stock cooler and get an Arctic Accelero III. In my testing this thing is a great and quiet cooler. Seriously good deal for $54 on Amazon. 48C full load at 2080 MHz core at about 1/4th the noise of the stock cooler at max fan.


Edit nvm. Apparently it comes with heatsinks.


----------



## Menthol

sorry


----------



## cookiesowns

Quote:


> Originally Posted by *Jpmboy*
> 
> they are slamming the PL. look at AB and you'll see the power limit kick in.


Oh come on JPM. You of all people should know I realize that









I know it's slamming into the PL; that's why I'm curious if people are actually doing 1.85 sustained or just hitting their top boost bin stock.

However, 1930 at 1V is impressive. My cards will do 1999.5 at 1.05V or so, same offset.


----------



## dante`afk

Quote:


> Originally Posted by *Testier*
> 
> I feel like I am being bottlenecked by my CPU in certain games. 4.3ghz is simply not fast enough for haswell IMO.
> 
> You see it on your 5960x?


Nope, your CPU is not a bottleneck at 4.3GHz.


----------



## spinFX

Far oot! How can anyone justify this card! Must be some pretty specific cases where SLI is just not an option and you must get a single TX...
I guess this card was always on the edge of the enthusiast market and the commercial market (companies doing 3d rendering n such buy them I guess.)
Definitely seems more on the commercial side than enthusiast side now, the 1080Ti will probably be close enough in perf to not even remotely consider TX this round.


----------



## Gary2015

Quote:


> Originally Posted by *spinFX*
> 
> Far oot! How can anyone justify this card! Must be some pretty specific cases where SLI is just not an option and you must get a single TX...
> I guess this card was always on the edge of the enthusiast market and the commercial market (companies doing 3d rendering n such buy them I guess.)
> Definitely seems more on the commercial side than enthusiast side now, the 1080Ti will probably be close enough in perf to not even remotely consider TX this round.


1080 Ti? What's that?


----------



## guttheslayer

Guys would u recommend a 6800K or a 6700K to pair with a TXP for 1440p 144Hz gaming?


----------



## Stateless

Will my OC'd 3930K CPU at 4.7 be fine with the Titan Pascal? I have this same CPU at the same clock with my 2x Titan Maxwells and it has been doing fine. With the new architecture of Pascal, I'm just wondering if my CPU will be an issue at all?


----------



## dante`afk

Quote:


> Originally Posted by *guttheslayer*
> 
> Guys would u recommend a 6800K or a 6700K to pair with a TXP for 1440p 144Hz gaming?


6700k

Quote:


> Originally Posted by *Stateless*
> 
> Will my OC 3930k CPU at 4.7 be fine with the Titan Pascal? I have this same CPU at the same clock with my 2xTitan Maxwell's and it has been doing fine. Just with the new architecture of Pascal, just wondering if my CPU will be an issue at all?


you're fine. the cpu will last at least another 5 years.


----------



## guttheslayer

Quote:


> Originally Posted by *dante`afk*
> 
> 6700k
> you're fine. the cpu will last at least another 5 years.


Considering I can OC the 6800K to 4.4GHz and the i7 6700K to 4.7GHz, would you still recommend the 6700K?


----------



## cookiesowns

Hrm. Who's going for the Aquacomputer block vs the EK one? It appears that the aqua block won't cause issues with the HB bridge?

Anyone know what's the deal with EK not supporting HB?


----------



## mbze430

http://videocardz.com/63063/aqua-computer-intros-kryographics-pascal-for-nvidia-titan-x

I am ordering 2 of these babies to match my 980TI one!



http://aquacomputer.de/newsreader/items/kryographics-fuer-nvidia-titan-x.html


----------



## HyperMatrix

Quote:


> Originally Posted by *guttheslayer*
> 
> Guys would u recommend a 6800K or a 6700K to pair with a TXP for 1440p 144Hz gaming?


More cores means more performance in the majority of upcoming games. My 4.7GHz 5960x dominates my old 5.2GHz 3770k. 8 cores + Hyper-Threading also helps deal with DX11 draw call limitations in many games. There's a lot of nonsense going around about 4 cores being better than 8 cores for gaming. Yes, a 4.8GHz 6700k will probably be better than a stock-clocked 6800k. But if you overclock that 6800k to 4.4GHz, you'll be 5-10% slower in poorly threaded games and 50% faster in DX12 games or DX11 games with proper workload distribution.

After saying this, you're going to have people linking to garbage reviews showing the 6700 performing better. But those are garbage reviews and don't take everything that matters into account. Especially since we're talking about very high frame rate gaming.

Don't make a mistake. Get as many cores as you can. Dx12 demands it.


----------



## HyperMatrix

Quote:


> Originally Posted by *mbze430*
> 
> http://videocardz.com/63063/aqua-computer-intros-kryographics-pascal-for-nvidia-titan-x
> 
> I am ordering 2 of these babies to match my 980TI one!
> 
> 
> 
> http://aquacomputer.de/newsreader/items/kryographics-fuer-nvidia-titan-x.html


They make amazing blocks. I'm waiting until they release the active cooled backplate before ordering though. Anyone who takes overclocking seriously should do the same. This way you're grabbing heat from both sides of the card.


----------



## mbze430

Quote:


> Originally Posted by *HyperMatrix*
> 
> They make amazing blocks. I'm waiting until they release the active cooled backplate before ordering though. Anyone who takes overclocking seriously should do the same. This way you're grabbing heat from both sides of the card.


I ran SLI 980 TIs with these babies... did a hard mod on them to push them above 1.30v+, and it kept them under 87 with 2x 240 radiators. Awesome block. Plus it looks good, not like all the square and rectangular waterblocks out there.


----------



## Gary2015

Quote:


> Originally Posted by *DNMock*
> 
> im not sure how people use these cards on air. Being thermally limited just feels like the card is getting neutered. Even with the cap set to 85 C i basically freak out every time I see those temps since I'm so used to seeing upper 30's low 40's with the voltage cranked all the way up and only being limited to whatever O/C the card is capable of putting out lol.
> Is your resolution at 800 x 600 or are you running a game that only uses 2 threads or something? 5930K at 4.5 is showing nothing close to bottleneck with SLI XP, but that's in Fallout 4 and Witcher 3, haven't tested on other games yet.


From what I read so far, these cards perform well on air.


----------



## Gary2015

Quote:


> Originally Posted by *mbze430*
> 
> http://videocardz.com/63063/aqua-computer-intros-kryographics-pascal-for-nvidia-titan-x
> 
> I am ordering 2 of these babies to match my 980TI one!
> 
> 
> 
> http://aquacomputer.de/newsreader/items/kryographics-fuer-nvidia-titan-x.html


You're going to need/wait for a backplate. Those blocks are heavy and could crack the PCB.


----------



## Gary2015

Quote:


> Originally Posted by *CallsignVega*
> 
> Whoever is staying on air you need to ditch that stock cooler and get an Arctic Accelero III. In my testing this thing is a great and quiet cooler. Seriously good deal for $54 on Amazon. 48C full load at 2080 MHz core at about 1/4th the noise of the stock cooler at max fan.


Is it easy to install?


----------



## mbze430

Quote:


> Originally Posted by *Gary2015*
> 
> You're going need/wait for a backplate. Those blocks are heavy and could crack the PCB.


I will get the active backplates, but it doesn't matter to me... because in my case the motherboard is not mounted on its side. It's mounted flat, parallel to the ground. Thermaltake Core X5.

Prior to the Titan X Pascal, I had those Aqua Computer blocks on my 980 TIs for almost 1.5 years and didn't have a problem (that was in an Antec P280 case)... I did notice a slight bend in the PCB when I took off the block to sell one of the 980 TIs, but after putting the original fan cooler back on it was normal again.


----------



## Stateless

I have a pre-order in for the EK block, but what is this that it not compatible with the new SLI bridge? I am thinking of sticking with single card, but if I decide to SLI, will the EK block screw me?


----------



## Testier

Quote:


> Originally Posted by *Gary2015*
> 
> Is it easy to install?



You'll probably need a specific screwdriver.


----------



## Gary2015

Quote:


> Originally Posted by *Stateless*
> 
> I have a pre-order in for the EK block, but what is this that it not compatible with the new SLI bridge? I am thinking of sticking with single card, but if I decide to SLI, will the EK block screw me?


You can use the ribbon SLI bridge but not the Nvidia HB bridge nor the EVGA HB bridge.


----------



## Gary2015

Quote:


> Originally Posted by *mbze430*
> 
> I will get the active backplates, but it doesn't matter to me... because in my case my motherboard is not mounted off the side. It's mounted on flat parallel to the ground. Thermaltake Core X5
> 
> Prior to the Titan X Pascal, I had those Aqua computer on my 980TI for almost 1.5yrs, didn't have a problem(was in a Antec P280 case)... I did notice a slight bend to the PCB when I took off the block to sell one of the 980TIs. But after putting on the original fan cooler it was normal.


Cool, let us know your temps!


----------



## sena

Quote:


> Originally Posted by *Stateless*
> 
> I have a pre-order in for the EK block, but what is this that it not compatible with the new SLI bridge? I am thinking of sticking with single card, but if I decide to SLI, will the EK block screw me?


That sucks, any other block that supports hb bridge?


----------



## cookiesowns

Quote:


> Originally Posted by *Stateless*
> 
> I have a pre-order in for the EK block, but what is this that it not compatible with the new SLI bridge? I am thinking of sticking with single card, but if I decide to SLI, will the EK block screw me?


You can use the regular hard bridges; they should support 550MHz, which should be enough for 1440p 165Hz. EK is stating they may make an HB bridge that works with their blocks, or you can "cut" the corners off the Nvidia HB bridge and try fitting it. Someone on Reddit has done this and it seems to work fine.


----------



## Gary2015

Quote:


> Originally Posted by *sena*
> 
> That sucks, any other block that supports hb bridge?


Aquacomputer works


----------



## sena

Quote:


> Originally Posted by *Gary2015*
> 
> Aquacomputer works


Thx, i will buy two of these then.


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> lol load of course.
> 
> HT with Physx
> 
> 
> No HT with Physx
> 
> 
> HT no Physx
> 
> 
> No HT no Physx
> 
> 
> So basically a 8-10 core is able to off-load the Physx better to the CPU than a 6700K. Actually a pretty CPU intensive benchmark, wasn't maxing out my GPU's. But also due to the low resolution of the benchmark.


Just saw your post now. Weird. Ok so 5960x is better at Metro LL Redux. What other recent games should we test?


----------



## D749

Finally built a new system - it was long overdue.

I installed two Titan XPs (slots 1 & 3) on an Asus R5E Edition 10 board. I've been sitting in the BIOS a lot and soon will be running a 12-24h memory test off a USB stick. I noticed that the fans on both of these cards run extremely slowly while in the BIOS. Is that normal? I'm just concerned about them getting too hot.


----------



## sena

Since i am new to aquacomputers, do they come with thermal pads like ek does?


----------



## cookiesowns

Quote:


> Originally Posted by *D749*
> 
> Finally built a new system - it was long overdue.
> 
> I installed two Titan XP (slot 1 & 3) on an Asus R5E Edition 10 board. I've been sitting in the BIOS a lot and soon will be running a 12-24H memory test off of a USB stick. I noticed that the fans on both of these cards run extremely slow while in the BIOS. Is that normal? I just concerned about them getting too hot.


Totally normal.


----------



## pez

Quote:


> Originally Posted by *Stateless*
> 
> I have a pre-order in for the EK block, but what is this that it not compatible with the new SLI bridge? I am thinking of sticking with single card, but if I decide to SLI, will the EK block screw me?


Yeah, EK said they're working on one, but no news since then.


----------



## HyperMatrix

Quote:


> Originally Posted by *sena*
> 
> Since i am new to aquacomputers, do they come with thermal pads like ek does?


Aqua only uses thermal pads over the VRM, and yes, they're supplied. You need non-conductive thermal paste for direct contact between the block and the memory modules. I recommend either Thermal Grizzly Kryonaut or Phobya Nanogrease Extreme. And of course, CLU for the GPU itself.


----------



## Gary2015

Quote:


> Originally Posted by *HyperMatrix*
> 
> Aqua only uses thermal pads over the VRM. And yes, it's supplied. You need non-conductive thermal paste for direct contact between the block and memory modules. I recommend either Grizzly Kryonaut or Phobia Nanogrease Extreme. And of course, CLU for the GPU itself.


You need to be careful that the rubber rings sit properly.


----------



## tpwilko08

Quote:


> Originally Posted by *Jpmboy*
> 
> what's yours? that's the question.


The highest I have seen it go is 1885. Gaming is usually around 1838-1845, but then temps kick in and it drops to 1810-1825 at 70c. Firestrike is around 1797-1810. Can't wait for the waterblock to be released; these fanboy coolers are too loud at 100% fan...


----------



## toncij

Quote:


> Originally Posted by *HyperMatrix*
> 
> Aqua only uses thermal pads over the VRM. And yes, it's supplied. You need non-conductive thermal paste for direct contact between the block and memory modules. I recommend either Grizzly Kryonaut or Phobia Nanogrease Extreme. And of course, CLU for the GPU itself.


Why not Grizzly for the GPU too? Wouldn't liquid metal of the CLU actually be in position to damage the chip?


----------



## HyperMatrix

Quote:


> Originally Posted by *toncij*
> 
> Why not Grizzly for the GPU too? Wouldn't liquid metal of the CLU actually be in position to damage the chip?


Far better heat transfer with CLU. And why would you think that it would damage the chip?


----------



## toncij

Quote:


> Originally Posted by *HyperMatrix*
> 
> Far better heat transfer with CLU. And why would you think that it would damage the chip?


Not sure; somewhere I've read it can damage the cover of the chip because it bonds with it all too well. How do you clean liquid metal paste off, btw?
Also, why not CLU on everything then? Is there a reason not to use CLU on everything, from the CPU to GPUs and VRAM, RAM etc.?


----------



## HyperMatrix

Quote:


> Originally Posted by *toncij*
> 
> Not sure; somewhere I've read it can damage the cover of the chip because it connects with it all too well. How do you clean a liquid metal paste, btw?
> Also, why not CLU on everything then? Is there a reason not to go and use CLU on everything, from the CPU to GPUs and VRAM, RAM etc.?


CLU is hard to get off of metal. You'll have to scrape it off of your block if you ever want to re-apply it. They provide little pads with CLU to help you do this. On Intel CPUs, cleaning it off the IHS (not the die) causes a lot of the chip information to get wiped (not really important). But CLU always wipes cleanly off of the CPU/GPU die itself.

CLU is metal. It's conductive. So you can't use it everywhere. Grizzly Kryonaut and Phobya Nanogrease Extreme are the highest-performing non-conductive pastes. You use each one where appropriate.


----------



## toncij

Quote:


> Originally Posted by *HyperMatrix*
> 
> CLU is hard to get off of metal. You'll have to scrape it off of your block if you ever want to re-apply it. They provide little pads with CLU to help you do this. On Intel CPUs, cleaning it off the IHS (not the die) causes a lot of the chip information to get wiped (not really important). But CLU always wipes cleanly off of the CPU/GPU die itself.
> 
> CLU is metal. It's conductive. So you can't use it everywhere. Grizzy Kryonaut and Phobya Nanogrease Extreme are the highest performing non-conductive pastes. You use each one where appropriate.


I know about conductivity, but I was interested in this 'taking off' part since I supposed it could be something like that.

Well, I'll probably test all of these since I'm moving to a custom loop - my 5960X even on a moderate 1.27V runs up to 80 degrees at 30 ambient when running 4.5GHz. I don't think it should run that hot on a pretty good Corsair H115i AIO...


----------



## HyperMatrix

Quote:


> Originally Posted by *toncij*
> 
> I know about conductivity, but I was interested in this 'taking off' part since I supposed it could be something like that.
> 
> Well, I'll probably test all of these since I'm moving to a custom loop - my 5960X even on a moderate 1.27V runs up to 80 degrees at 30 ambient when running 4.5GHz. I don't think it should run that hot on a pretty good Corsair H115i AIO...


Definitely get that under water. At 1.41v mine never hits 80c.


----------



## toncij

Quote:


> Originally Posted by *HyperMatrix*
> 
> Definitely get that under water. At 1.41v mine never hits 80c.


OT... it is under water... an AIO... but the performance is ... lol.


----------



## HyperMatrix

Quote:


> Originally Posted by *toncij*
> 
> OT... it is under water... an AIO... but the performance is ... lol.


The only AIO units worth a damn are some of the Swiftech ones. But they're pretty much just a pre-setup/packaged custom loop. I'd recommend one of the XSPC starter kits as well, which comes with all the parts you need to do an actual full custom loop for a cpu.


----------



## toncij

I'll put everything in a custom one. It'll be a custom-loop comeback after 15 years, but I plan to go with the CPU and GPUs all under a single 360mm rad loop.


----------



## StullenAndi

Quote:


> Originally Posted by *dante`afk*
> 
> welp I ****ed up when trying to unscrew those impossible screws.
> 
> can I tape that on that position ? -_-


Damn, that´s the 3rd case of ripped-off SMD components on Nvidia Pascal cards. I fixed a card for a guy in Germany and now it works fine, like before. He had ripped off the resistor mounted at R1035 while unscrewing the cooler.


----------



## Glzmo

If I just want to put on an EVGA Hybrid GPU cooler from my old Maxwell Titan X, which screws of the Pascal Titan X do I have to screw off? Just the screws on the top and side of the shroud and then the four big ones that hold the GPU block? Is there any need to unscrew the screw on the backplate/back of the card at all to do this?


----------



## unreality

My Card arrived!

It does 1847 boost at default. It seems +220 on the core is the absolute max; Heaven won't do +220, but +200 works, for example. Average results, or even a bit above average?

Besides it being a furnace crying out to be watercooled, we two are already in love


----------



## dante`afk

Quote:


> Originally Posted by *guttheslayer*
> 
> Considering if I can OC 6800K to 4.4GHz and I7 6700K to 4.7GHz. You will still recommend 6700K?


for gaming yes.
if you do encoding, video **** and picture crap, 6800k


----------



## dante`afk

Quote:


> Originally Posted by *StullenAndi*
> 
> Damn, that´s the 3rd case of ripped off smd components on nvidia pascal cards. I fixed a card from a guy in germany and now the card is working fine like before. He ripped of the resistor mounted on R1035 while unscrewing the cooler.


The card works flawlessly without the SMD resistor; not sure what it does though.


----------



## StullenAndi

The part you ripped off is a capacitor, not a resistor, and that´s the reason the card is still working. Capacitors are most often used as filters. If you take a look at my picture, the capacitor *C*(apacitor)503 can be removed and the card will work fine. But the 0Ohm resistor at *R*(esistor)1035 is needed to power up another component.


----------



## dante`afk

Lucky me. I tried yesterday to put it back on, but no chance: the solder won't stay liquid, it freezes immediately at that tiny spot.

I guess I'd need a hot air tool to make it work.


----------



## sena

Quote:


> Originally Posted by *HyperMatrix*
> 
> Aqua only uses thermal pads over the VRM. And yes, it's supplied. You need non-conductive thermal paste for direct contact between the block and memory modules. I recommend either Grizzly Kryonaut or Phobia Nanogrease Extreme. And of course, CLU for the GPU itself.


Thanks, i will find some good paste for memory-block contact.


----------



## StullenAndi

If you want to solder it with hot air, you need to preheat the PCB to get enough heat into the joint while soldering. It´s much easier with a good regular soldering iron and a very small tip.

The problem with hot-air stations, even high-end ones, is that they can't heat up the pad fast enough because the copper in the PCB pulls the heat away.


----------



## KillerBee33

Quote:


> Originally Posted by *Glzmo*
> 
> If I just want to put on an EVGA Hybrid GPU cooler from my old Maxwell Titan X, which screws of the Pascal Titan X do I have to screw off? Just the screws on the top and side of the shroud and then the four big ones that hold the GPU block? Is there any need to unscrew the screw on the backplate/back of the card at all to do this?


----------



## Jpmboy

Quote:


> Originally Posted by *cookiesowns*
> 
> Oh come on JPM. You of all people should know I realize that
> 
> 
> 
> 
> 
> 
> 
> 
> I know it's slamming into PL that's why I'm curious if people are actually Doing 1.85 sustained or just their top boost BIN stock.
> However 1930 at 1V is impressive. My cards will do 1999.5 at 1.05V or so. Same offset.











Quote:


> Originally Posted by *guttheslayer*
> 
> Guys would u recommend a 6800K or a 6700K to pair with a TXP for 1440p 144Hz gaming?


6700K as long as it can run 4.8core and 4.8 cache.
Quote:


> Originally Posted by *sena*
> 
> Since i am new to aquacomputers, do they come with thermal pads like ek does?


yes!









Quote:


> Originally Posted by *tpwilko08*
> 
> The highest i have seen it go is 1885. gaming is usually around 1838-1845 but then temps kick in and it drops to 1810-1825 at 70c. Firestrike is around 1797-1810 can not wait for waterblock to be released these fanboy coolers are to loud at 100% fan...


yeah so, without overclocking the card and just raising the power limit, 1885 is the expected boost.








Quote:


> Originally Posted by *toncij*
> 
> Why not Grizzly for the GPU too? Wouldn't liquid metal of the CLU actually be in position to damage the chip?


Not gonna damage the chip with CLU - and the block must be nickel-plated (all these blocks are copper underneath). But these GPUs do not need CLU or CLP. Grizzly is fine, as is Gelid Extreme or any other top paste. lol - want the card to run cool? Cool the water loop better.


----------



## toncij

Quote:


> Originally Posted by *StullenAndi*
> 
> The part you ripped of is a capacitor, not a resistor. And that´s the reason why the card is working. Capacitors used most times to work as a filter. If you take a look on my picture, the capacitor *C*(apacitor)503 can be removed and the card will work fine. But the 0Ohm resistor on *R*(esistor)1035 is needed to power up another component.


Will work, but "fine" hmm...?


----------



## GosuPl

LOL just LOL

This monster with an OC performs slightly worse than SLI TX M OC

http://www.3dmark.com/fs/9187357

http://www.3dmark.com/3dm/13963033










My old TITANs vs new









https://scontent-fra3-1.xx.fbcdn.net/v/t35.0-12/13923664_334027503652757_8691721456507039474_o.jpg?oh=682d46222f0f331fc04496f4486b8e65&oe=57AAE898

And I didn't start TX Pascal SLI tests yet









Btw, will the EVGA Hybrid from the TX Maxwell fit the TX Pascal? I ordered 2x EVGA Hybrids for Maxwell a few days ago, but... I think I will keep a single TX P, or maybe even SLI









http://www.3dmark.com/3dm/13963280

http://www.3dmark.com/fs/6484495


----------



## KillerBee33

Quote:


> Originally Posted by *GosuPl*
> 
> LOL just LOL
> 
> This monster with an OC performs slightly worse than SLI TX M OC
> 
> http://www.3dmark.com/fs/9187357
> 
> http://www.3dmark.com/3dm/13963033
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My old TITANs vs new
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://scontent-fra3-1.xx.fbcdn.net/v/t35.0-12/13923664_334027503652757_8691721456507039474_o.jpg?oh=682d46222f0f331fc04496f4486b8e65&oe=57AAE898
> 
> And I didn't start TX Pascal SLI tests yet
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Btw, will the EVGA Hybrid from the TX Maxwell fit the TX Pascal? I ordered 2x EVGA Hybrids for Maxwell a few days ago, but... I think I will keep a single TX P, or maybe even SLI
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/13963280
> 
> http://www.3dmark.com/fs/6484495


32000 is about what an OC'd Titan X P normally gets for the Graphics Score in Firestrike. With BIOS tools it will get to 33000, maybe even 34000.


----------



## StullenAndi

Quote:


> Originally Posted by *toncij*
> 
> Will work, but "fine" hmm...?


That´s another question


----------



## DADDYDC650

U peeps using MSI AB or Precision to OC?


----------



## KillerBee33

Quote:


> Originally Posted by *DADDYDC650*
> 
> U peeps using MSI AB or Precision to OC?


MSI AB without RivaTuner.


----------



## unreality

Quote:


> Originally Posted by *DADDYDC650*
> 
> U peeps using MSI AB or Precision to OC?


I'm using Nvidia Inspector. Always loved the minimalistic-but-working feel.


----------



## DADDYDC650

Why aren't you folks using Precision and what's wrong with RivaTuner?


----------



## Steven185

My card performs great but throttles like crazy. The stock cooler is kind of underwhelming. Of course I can make the fan run 100% at all times, but that's not the point. So I was looking at AIO coolers. Would this one work? http://eu.evga.com/Products/Product.aspx?pn=400-HY-0996-B1

Or maybe this one? http://eu.evga.com/Products/Product.aspx?pn=400-HY-H980-B1

Without any modding?

I saw some guys being able to fit them, but I'm not sure which one of those they used. Also, I'd prefer not to mod my card (so that I don't completely destroy its resale value; Volta is near).


----------



## KillerBee33

Quote:


> Originally Posted by *DADDYDC650*
> 
> Why aren't you folks using Precision and what's wrong with RivaTuner?


No particular reason, just got used to it. I've got an NZXT GRID running all my fans, so I'm using the CAM software for FPS and temps.


----------



## KillerBee33

Quote:


> Originally Posted by *Steven185*
> 
> My card performs great but throttles like crazy. The stock cooler is kind of underwhelming. Of course I can make the fan run 100% at all times, but that's not the point. So I was looking at AIO coolers. Would this one work? http://eu.evga.com/Products/Product.aspx?pn=400-HY-0996-B1
> 
> Or maybe this one? http://eu.evga.com/Products/Product.aspx?pn=400-HY-H980-B1
> 
> Without any modding?
> 
> I saw some guys being able to fit them. But I'm not sure which one of those they used. Also I'd prefer not to mod my card (so that I may not completely destroy its resale values, Volta is near).


http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1 NEW
Also, I don't see crazy temps just yet; the fans don't even get to 100%. Try this curve


----------



## DADDYDC650

Quote:


> Originally Posted by *KillerBee33*
> 
> No particular reason , just got used to it. Got NZXTs GRID running all my fans so i'm using CAM Soft. for FPS and Temps.


I've heard of CAM. Any issues with rivatuner and pascal? I'm used to the software.


----------



## Steven185

Quote:


> Originally Posted by *KillerBee33*
> 
> http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1 NEW
> Also i don't see crazy temps just yet fans dont even get to 100% , try this curve


Woah! That's some steep curve. Thanks, I'll try it.

As for AIOs, I'm in Europe and can't find any vendor stocking the new one. Are you positive that none of the old ones will work with my Titan XP (without modding, of course)? Because those are the only ones I can find for sale.


----------



## KillerBee33

Quote:


> Originally Posted by *DADDYDC650*
> 
> I've heard of CAM. Any issues with rivatuner and pascal? I'm used to the software.


Not really, but RivaTuner doesn't have an ON/OFF switch and CAM does.


----------



## KillerBee33

Quote:


> Originally Posted by *Steven185*
> 
> Woah! That's some steep curve. Thanks, I'll try it.
> 
> As for AIOs, I'm in Europe and can't find any vendor having the new one. Are you positive that none of the old ones will work with my Titan XP (without modding of course)? Because those are the only I can find selling.


I've opened that 10-Series Hybrid box and there is absolutely no, at least visual, difference aside from the shroud, so yes, the Maxwell Hybrid kit is exactly the same and will fit perfectly without any mod.








As for the fan curve, it's 27% all the way to 40 degrees, the next point is 70% @ 65 degrees, and the last is 100% @ 75 degrees
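Those three points describe a simple piecewise-linear curve. As a sketch (the point values come from this post; the interpolation function itself is just an illustration, not what the tuning software does internally):

```python
# (temperature C, fan speed %) points from the post above
CURVE = [(40, 27), (65, 70), (75, 100)]

def fan_speed(temp_c, curve=CURVE):
    """Linearly interpolate fan % for a temperature; clamp at both ends."""
    if temp_c <= curve[0][0]:
        return curve[0][1]        # 27% "all the way to 40 degrees"
    if temp_c >= curve[-1][0]:
        return curve[-1][1]       # pinned at 100% past 75 degrees
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(30))   # 27
print(fan_speed(70))   # 85.0 (halfway between the 65 and 75 degree points)
```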


----------



## Steven185

Quote:


> Originally Posted by *KillerBee33*
> 
> I've opened that 10Series Hybrid box and there is absolutely no , atleast Visual difference aside from the sroud , so yes Maxwell Hybrid kit is exactly the same and will fit perfectly without any MOD


So buying the maxwell kit (either for 980 or 980 ti) will do, right?
As for the shroud, I don't mind it; I'm not even going to fit it. My understanding is that it's there just for looks...


----------



## KillerBee33

Quote:


> Originally Posted by *Steven185*
> 
> So buying the maxwell kit (either for 980 or 980 ti) will do, right?
> As for the shroud , I don't mind it, I'm not even going to fit it, my understanding is that it is there just for looks...


Get the 980Ti kit; I don't know why, but it's cheaper LOL
Just keep the outlined part of the shroud off and you're good


----------



## DADDYDC650

Quote:


> Originally Posted by *KillerBee33*
> 
> Not really but RivaTuner doesn't have an ON / OFF switch and CAM does


Did a recent update break the on/off function?


----------



## KillerBee33

Quote:


> Originally Posted by *DADDYDC650*
> 
> Did a recent update break the on/off function?


Wouldn't know, haven't used it for a while


----------



## toncij

A question to all the owners: TXP worth moving from [email protected],1? Considering of going 1x TXP for those...


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> A question to all the owners: TXP worth moving from [email protected],1? Considering of going 1x TXP for those...


Judging by my old 980, which went from a software OC at 15000 in Firestrike to a BIOS OC at 17000, I'd say wait a little; hopefully BIOS tools are in the making


----------



## EniGma1987

Quote:


> Originally Posted by *CallsignVega*
> 
> Well I did the resistor shunt mod with this stuff and it did absolutely nothing:
> 
> 
> 
> https://www.amazon.com/gp/product/B00CSMDT8S/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1
> 
> That stuff just has too much resistance IMO. Went ahead and ordered Liquid Ultra.


The stock shunt resistance is 0.005 ohms, which is already extremely low. All of the conductive glues I have seen have fairly high resistance and are really only barely conductive, way more than 5 milliohms. Liquid metal also has pretty high resistance for a metal, nowhere close to copper wire, but it is low enough to get the combination down to around 3 milliohms, which is perfect for doing this mod.
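For reference, the effect of a shunt mod can be sketched as a parallel-resistance calculation. The 5 milliohm and 3 milliohm values come from the post above; the under-reporting factor is my own illustration of why lowering the sensed resistance raises the effective power limit:

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

r_stock = 0.005   # ohms, the stock shunt (value from the post)
r_mod   = 0.003   # ohms, combined target the post calls "perfect" for the mod

# Resistance the liquid-metal path layered over the shunt would need
# to bring the combination down to r_mod:
r_lm = 1 / (1 / r_mod - 1 / r_stock)   # works out to 0.0075 ohms

# The card's controller still assumes r_stock when converting the sensed
# voltage drop to current, so reported current/power scales by:
scale = r_mod / r_stock   # 0.6 -> the card under-reports draw by 40%
```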

Quote:


> Originally Posted by *Z0eff*
> 
> Something's off here. I'm pretty sure Z170 is 16+20 with the 16 coming from the CPU directly and the 20 from the PCH.
> If the Z170 platform has a total of 20 PCIe lanes then what happens when my GPU takes up 16 of those lanes? GPUz confirms it's running at PCIe Gen3 x16.
> 
> See the original Z170 article on anandtech: http://www.anandtech.com/show/9485/intel-skylake-z170-motherboards-asrock-asus-gigabyte-msi-ecs-evga-supermicro
> 
> How could my SATA and Ethernet ports still be functioning if the PCH only has 4 PCIe lanes itself and the other 16 just routed from the CPU?
> 
> EDIT: Of course if you want to run Quad SLI or benefit from 16x/16x Dual SLI then X99 it is, running graphics cards off of the PCH isn't an option as far as I'm aware.


See this here from the article:
Quote:


> The processor is connected to the chipset by the four-lane DMI 3.0 interface. The DMI 3.0 protocol is an upgrade over the previous generation which used DMI 2.0 - this upgrade boosts the speed from 5.0 GT/s to 8.0 GT/s


Notice the speeds? DMI2 was 5GT/s and DMI3 is 8GT/s, the same speeds as PCI-E 2.0 and 3.0... A DMI link has always been pretty much the same thing as PCI-E. The CPU is connected to the PCH with four DMI3 lanes, or in other words with 4x PCI-E 3.0 lanes. The things the PCH offers are referred to as "ports". Z170 has 26 ports it can use for I/O on the board, but it all routes back through the 4 PCI-E lanes between the PCH and CPU, and that is where the big bottleneck is. Think of the PCH as a sort of PLX chip for I/O.
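As a back-of-envelope check on those numbers, four lanes at each generation's transfer rate and line encoding work out as follows (a rough sketch using the PCI-E 2.0/3.0 encoding overheads, not an official DMI figure):

```python
def link_bandwidth_gbs(gt_per_s, lanes, enc_num, enc_den):
    """Usable bandwidth in GB/s: transfer rate x lanes x encoding efficiency / 8 bits."""
    return gt_per_s * lanes * (enc_num / enc_den) / 8

dmi2 = link_bandwidth_gbs(5.0, 4, 8, 10)     # DMI 2.0 / PCI-E 2.0: 8b/10b encoding
dmi3 = link_bandwidth_gbs(8.0, 4, 128, 130)  # DMI 3.0 / PCI-E 3.0: 128b/130b encoding

print(f"DMI 2.0 x4: {dmi2:.2f} GB/s")   # 2.00 GB/s
print(f"DMI 3.0 x4: {dmi3:.2f} GB/s")   # ~3.94 GB/s, shared by everything on the PCH
```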


----------



## Gary2015

Quote:


> Originally Posted by *toncij*
> 
> A question to all the owners: TXP worth moving from [email protected],1? Considering of going 1x TXP for those...


If you're going to move, move to 2x TXP.


----------



## EniGma1987

Quote:


> Originally Posted by *HyperMatrix*
> 
> I'm thinking voltage limit is going to be an issue when trying for 2300MHz. I'd expect 2200MHz under water. And 2250 if lucky. I would love to be wrong.


These cards will always have a voltage problem, whether we get unlocked voltage or not. I really hope we do not get unlocked voltage, or Nvidia will just lock things down harder next time because of all the RMAs. The VRMs are absolute crap on this card and I wouldn't dare raise the voltage by more than 0.05v. The one cool thing about the VRMs is that they have no real de-rating with temperature: they can run the same at 25c as at 100c. The problem is that their rating is very, very low.


----------



## toncij

Quote:


> Originally Posted by *KillerBee33*
> 
> Judging by my old 980 , coming from Software OC to 15000 Firestrike and BIOS OC to 17000 , i'd say wait a little , hopefully Bios Tools are in the making


You think two 1080s are still better? I'm a bit worried about the SLI...
Quote:


> Originally Posted by *Gary2015*
> 
> If you're going to move, move to 2x TXP.


Well, since I don't plan to shell out 3000€ on GPUs, that's out of the question. If I'm going to move, it's going to be a single one.


----------



## Z0eff

Quote:


> Originally Posted by *EniGma1987*
> 
> See this here from the article:
> Notice the speeds? DMI2 was 5GT/s and DMI3 is 8GT/s? Same speeds as PCI-E 2.0 and 3.0 are... A DMI link has always been pretty much the same as PCI-E is. The CPU is connected to the PCH with four DMI3 lanes, or in other words is connected with 4x PCI-E 3.0 lanes. The stuff the PCH has are talked about as "ports". Z170 has 26 ports it can use for I/O on the board, but is all routes back into 4 PCI-E lanes between the PCH and CPU, that is where the big bottleneck is. Think of the PCH as a sort of PLX chip for I/O.


I'm aware of that, and not sure how that answers my question...


----------



## Z0eff

Quote:


> Originally Posted by *toncij*
> 
> You think two 1080s are still better? I'm a bit worried about the SLI...
> Well, since I don't plan to shell 3000€ on GPUs, that's out of the question. If I'm going to move, It's going to be a single one.


It really depends on the game. Two 1080s have more raw horsepower, but if the game doesn't work well with SLI then it won't matter. The TXP would give you less theoretical performance but guarantees it in any game you may play.

To use myself as an example: I've been playing a single multiplayer game for quite a while and will do so for a while longer, but it doesn't support SLI at all, so the TXP is a product I desire...


----------



## EniGma1987

Quote:


> Originally Posted by *Z0eff*
> 
> I'm aware of that, and not sure how that answers my question...


You said you thought it was 16+20. The article shows it is 16+4.
If you want to know how it affects your GPUs to take up 16 lanes, just look at how your computer is running now. That is what happens, lol. The system was designed so that 16 lanes go to the GPU, and it's been this way for... 10 years? More? IDK. The motherboard I/O has always been constrained by far too few lanes, with everything on the southbridge sharing bandwidth.
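The DMI-vs-PCIe comparison above works out numerically like this; a quick sketch, using the published per-lane transfer rates and the line-encoding overhead (8b/10b for gen2-class links, 128b/130b for gen3-class):

```python
def link_bandwidth_gbps(gt_per_s, lanes, encoding_efficiency):
    """Usable one-direction bandwidth of a serial link in GB/s."""
    return gt_per_s * lanes * encoding_efficiency / 8  # bits -> bytes

dmi2 = link_bandwidth_gbps(5.0, 4, 8 / 10)            # DMI2: 4 lanes @ 5 GT/s
dmi3 = link_bandwidth_gbps(8.0, 4, 128 / 130)         # DMI3: 4 lanes @ 8 GT/s
pcie3_x16 = link_bandwidth_gbps(8.0, 16, 128 / 130)   # a GPU's x16 slot

print(f"DMI2 x4:      {dmi2:.2f} GB/s")       # ~2.00 GB/s
print(f"DMI3 x4:      {dmi3:.2f} GB/s")       # ~3.94 GB/s
print(f"PCIe 3.0 x16: {pcie3_x16:.2f} GB/s")  # ~15.75 GB/s
```

So everything hanging off the Z170 PCH (USB, SATA, NVMe on chipset lanes, NICs) shares roughly the bandwidth of a single PCIe 3.0 x4 link back to the CPU, which is the bottleneck being described.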


----------



## DNMock

Quote:


> Originally Posted by *Gary2015*
> 
> From what I read so far, these cards perform well on air.


I'm losing at least 200 mhz due to thermal throttling, probably more.


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> You think two 1080s are still better? I'm a bit worried about the SLI...


There's always a debate about SLI. Some claim it's great and some say it's a pain in the @ss. I'm not rendering or working on my rig, it's strictly for entertainment, so I have no need for SLI.


----------



## lyang238

Quote:


> Originally Posted by *DNMock*
> 
> I'm losing at least 200 mhz due to thermal throttling, probably more.


Same here, I'm sitting at around 1900-2000 MHz boost.


----------



## Woundingchaney

Quote:


> Originally Posted by *DNMock*
> 
> I'm losing at least 200 mhz due to thermal throttling, probably more.


I don't think I'm losing quite 200 MHz, but it is a real drop due to throttling. The users having luck with existing cooling solutions are making me want to pull the trigger on one.


----------



## DADDYDC650

Finally got around to overclocking this beast. Played some Battlefront at 2GHz core and 11GHz memory, stable after 15 mins with no artifacts. Will keep pushing.


----------



## Woundingchaney

Quote:


> Originally Posted by *DADDYDC650*
> 
> Finally got around to overclocking this beast. Played some Battlefront @2Ghz Core and 11Ghz memory stable after 15 mins stable with no artifacts. Will keep pushing.


Is battlefront even really stressing the card though?


----------



## EniGma1987

Quote:


> Originally Posted by *sena*
> 
> That sucks, any other block that supports hb bridge?


Heatkiller should:


That is their GTX 1080 water block, but the Titan block should be pretty much the same. The reason the EK doesn't work with the HB bridge is that the "fins" on the new bridge stick over too far and hit the water connection points. AquaComputer and Heatkiller both have their water I/O area over about an extra 1/2" compared to EK, so it does not interfere with the new bridge. I like Heatkiller blocks better than anyone else's, and they look nice too, but they are usually later to market than both EK and Aqua. Aqua also makes an active backplate with VRM cooling, but I don't think that will matter on this card, as the VRM temps don't seem to matter: same rating at 25C as at 100C. The problem with the VRMs is that they are just rated really, really low, so they can't be pushed hard no matter what the temp is.


----------



## DADDYDC650

Quote:


> Originally Posted by *Woundingchaney*
> 
> Is battlefront even really stressing the card though?


Demanding titles push the card in their own way. The best way to find instability is to just play games! I was hoping I'd hit 2GHz core/11GHz memory. So far so good.


----------



## KillerBee33

Quote:


> Originally Posted by *DADDYDC650*
> 
> Finally got around to overclocking this beast. Played some Battlefront @2Ghz Core and 11Ghz memory stable after 15 mins stable with no artifacts. Will keep pushing.


Try Far Cry 4 with all its tweaks @ 4K; mine couldn't handle it.


----------



## DADDYDC650

Quote:


> Originally Posted by *KillerBee33*
> 
> Try FarCry 4 with all it's Tweaks @ 4K , mine couldnt handle it.


I'll download it later today. What's your max stable OC? I'd imagine it throttles, so it's hard to pin down a max stable OC.


----------



## KillerBee33

Quote:


> Originally Posted by *DADDYDC650*
> 
> I'll download it later today. What's ur max stable OC? I'd imagine it would throttle so it's hard to pin down max stable OC.


3DMark reported 2076, no artifacts, no crashes. Also played NFS 2016 @ 4K for a few hours without any issues.
http://www.3dmark.com/3dm/13928497


----------



## DADDYDC650

Quote:


> Originally Posted by *KillerBee33*
> 
> 3DMark reported 2076 no Artifacts, no crashes , also played NFS 2016 @ 4k for few hours without any issues
> http://www.3dmark.com/3dm/13928497


Speaking of 3DMark. I purchased the full version on Steam a year ago. Saw that Time Spy was out so I downloaded the free version. Now I can't run the full version anymore. No idea what's going on.


----------



## mouacyk

Quote:


> Originally Posted by *DADDYDC650*
> 
> Demanding titles push in their own way. Best way to find instability is to just play games! I was hoping I'd hit 2Ghz core/11Ghz memory. So far so good.


Just want to confirm: are you seeing full GPU utilization and as high as possible memory utilization? Nvidia is getting too good with their drivers implicitly lowering GPU usage if the engine doesn't surpass its load threshold(s). Preferring the maximum performance power setting seems to do the trick.


----------



## KillerBee33

Quote:


> Originally Posted by *DADDYDC650*
> 
> Speaking of 3DMark. I purchased the full version on Steam a year ago. Saw that Time Spy was out so I downloaded the free version. Now I can't run the full version anymore. No idea what's going on.


Heh, got the same issue when trying to run the standalone 3DMark; Time Spy just refuses to activate, and I paid for it through Steam.
Try running 3DMark from Steam.


----------



## Kana Chan

Anyone use Kryonaut on these?


----------



## NoDoz

So what is the deal with the part coming off the board? Someone said it has happened 3 times so far. Is this happening when people are trying to put an AIO cooler on it? I want to throw an EVGA AIO on mine, but I'm not going to attempt it if something is snapping off.


----------



## Gary2015

Quote:


> Originally Posted by *Kana Chan*
> 
> Anyone use Kryonaut on these?


Yes, Vega did. It's around 5C lower temps.


----------



## Jpmboy

Quote:


> Originally Posted by *Kana Chan*
> 
> Anyone use Kryonaut on these?


yes ~ 5C drop in sustained load temp.
Quote:


> Originally Posted by *NoDoz*
> 
> So what is the deal with the part coming off the board? Someone said it has happened 3 times so far. Is this happening when people are trying to put a AIO cooler on it? I'm wanting to throw a evga AIO on mine but not going to attempt it if something is snapping off.


usually it's folks removing the OEM air cooler without a 4mm socket... put the pliers away.







(was the same problem on the 1080).


----------



## NoDoz

Quote:


> Originally Posted by *Jpmboy*
> 
> yes ~ 5C drop in sustained load temp.
> usually folks removing the OEM AC without a 4mm socket... put the pliers away.
> 
> 
> 
> 
> 
> 
> 
> (was the same problem on the 1080).


Ok so have the right tool. Got it


----------



## mbze430

Here is what I used with my AquaComputer blocks on my 980 Tis.

CLU on the GPU. GELID GC-Extreme on the RAM. Thermal Grizzly 0.5mm pad on the VRM.

The surface material of the Nvidia GPU is "glassy", so the CLU comes off fine. I DON'T RECOMMEND buffing off the CLU once it has bonded with metal. If you put metal under a microscope, there are "micro-scratches"; the CLU actually fills in those micro-scratches, and when it bonds with the metal it hardens and essentially fills in those gaps.

I also got confirmation from AquaComputer that the HB Bridge will fit 100%.
Quote:


> Sent: Mon, 08 Aug 2016 12:32 AM
> Subject: Re-4: Support Formular Aqua Computer Titan X Pascal Water block?,2016-08-02
> Hello,
> 
> I can confirm it will fit.
> 
> Best regards,
> 
> Stephan Wille


----------



## opt33

Quote:


> Originally Posted by *DADDYDC650*
> 
> Speaking of 3DMark. I purchased the full version on Steam a year ago. Saw that Time Spy was out so I downloaded the free version. Now I can't run the full version anymore. No idea what's going on.


Same issue here, except I did not download Time Spy... I had purchased 3DMark a while back, went to run benches a couple of days ago, and couldn't. My license was active and stored in 3DMark, but it said I had not purchased Firestrike etc. and would not let me run benchmarks, and I was running it via Steam as always. This must have happened with some 3DMark update or some hardware change on my end? I went to repurchase on Steam, and Steam said I had already purchased it. After a few hours of failing to solve the issue, reinstalling, etc., I just purchased the whole version again with Time Spy and ignored the Steam repurchasing issue. Pretty annoying on 3DMark's end...


----------



## Artah

Quote:


> Originally Posted by *cookiesowns*
> 
> Hrm. Who's going for the Aquacomputer block vs the EK one? It appears that the aqua block won't cause issues with the HB bridge?
> 
> Anyone know what's the deal with EK not supporting HB?


I wish aqua would make totally square blocks then I would buy them. I don't like that curvy look on their blocks.


----------



## DNMock

This is some handy info on the Nvidia HB bridge and EKWB blocks. Not my work though:
Quote:


> Originally Posted by *lifeisshort117*
> 
> i've had to dremel a bunch of these bridges at work. the modifying doesn't affect performance, and the driver still reads the bridge as a high bandwidth model as well.
> 


Quote:


> Originally Posted by *lifeisshort117*
> 
> no problem at all! glad I was able to help. I'd say a 1/4" cut. when I had cut the first bridge I was super nervous about trimming more and more until it fit. now I just have memorized the amount in my head, but I haven't measured it. so I'll say 1/4" because it would seem pretty close to how much is actually cut.
> 
> and I'm glad you're digging the look of the loop! I appreciate that.


Thread origin is here http://www.overclock.net/t/1603864/hwunboxed-nvidia-s-hb-sli-bridge-surprising-gains-gtx-1080-sli-testing-inside/140#post_25419650

This info might be handy to sticky to the OP.


----------



## Testier

Is the proper screwdriver needed for this card sold only by EKWB?


----------



## gamingarena

Quote:


> Originally Posted by *KillerBee33*
> 
> http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1 NEW
> Also i don't see crazy temps just yet fans dont even get to 100% , try this curve


So I'm just about ready to order the new Hybrid cooler for the 1080 from EVGA; I just want to make sure it works 100% on the Titan XP with no problems.

Anyone know if it's 100% compatible?
Thanks


----------



## KillerBee33

Quote:


> Originally Posted by *gamingarena*
> 
> So im just ready to order new Hybrid cooler for 1080 from EVGA i jsut want to make sure that it works 100% on Titan XP with no problems?
> 
> Anyone knows if its 100% compatible?
> Thanks


I asked on the 1080 Owners Thread and no answers yet, but I'm more than sure it'll fit with no issues. Here's something to go through if you're still skeptical about it:

http://www.overclock.net/t/1601323/gtx-1080-fe-ref-hybrid-guide-minimal-tools-clean-look


----------



## gamingarena

Quote:


> Originally Posted by *KillerBee33*
> 
> I asked on 1080 Owners Thread and no answers yet, more than sure it'll fit with no issues. Here's something to go thru if you still skeptical about it
> 
> http://www.overclock.net/t/1601323/gtx-1080-fe-ref-hybrid-guide-minimal-tools-clean-look


Yeah, I know about that method, but thanks anyway. I just liked the look of the new Hybrid cooler; I guess I'll wait a bit longer for confirmation.


----------



## KillerBee33

Quote:


> Originally Posted by *gamingarena*
> 
> Yeah i know about that method but thanks anyways i just liked the look of the new Hybrid cooler i guess ill wait a bit longer for confirmation


Another thing:

The Titan X Pascal is a reference card, and the only thing EVGA will change for the Titan X is the shroud color, to BLACK.
I'm just waiting for better OC options before I take mine apart and dremel it.
Mine is just sitting here.


----------



## EniGma1987

Quote:


> Originally Posted by *Testier*
> 
> Is the only proper screwdriver needed for this card sold only on ekwb?


You can buy a screwdriver with a tiny Phillips head all over the place. I got the one I used from Amazon, but you can buy them at Lowes or Home Depot in the electronics aisle, and most other electronics shops sell them, or just get a pack of small ones from Amazon like I did.


----------



## Testier

Quote:


> Originally Posted by *EniGma1987*
> 
> You can buy a screwdriver with a tiny phillips head all over the place. I got the one I used from Amazon but you can buy them at lowes or Home Depot in the electronics isle or most other electronics shops would sell them, or just get a pack of small ones from Amazon like I did.


I thought this one needed the weird 4mm hex screw.


----------



## EniGma1987

Quote:


> Originally Posted by *Testier*
> 
> I thought this one needed the weird hexa 4mm screw.


Oh, you are talking about the allen bolts, not the screws. They are pretty standard stuff; allen wrenches are very easy to find. Again: Lowes, Home Depot, Amazon, all over the place. The allen bolts are not 4mm size though; they are either 2.5mm or 3mm. The 4mm part a lot of people are talking about just needs any standard 4mm socket, and those are to take out the standoff bolts that connect the PCB to the cooler and backplate.

So to recap, tools needed to disassemble the stock cooler from the card:

tiny Phillips screwdriver for all the little screws on the cooler
allen wrench somewhere between 2.5mm and 3mm (can't remember which) for the allen bolts on the cooler
4mm socket wrench for the standoff bolts that connect the cooler and backplate mounts to the PCB.


----------



## gamingarena

Quote:


> Originally Posted by *KillerBee33*
> 
> Another thing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> TitanX Pascal is a Reference Card and the only thing EVGA will change for TitanX is the Shroud Color to BLACK.
> I'm just waiting for Better OC options before i take mine apart and dremel it.
> Mine is just sitting here


Nice.
Dremel it? Do you actually need to do that at all? I just want an easy install with no dremeling.


----------



## Seyumi

Just a reminder to everyone that EVGA is coming out with a Titan XP Hybrid kit; EVGA_Jacob confirmed it himself. You could theoretically make the 1080/1070 Hybrid kit work, but you'd have to dremel the shroud due to the extra power connectors on the Titan XP. You could also probably fit the gold 980 Ti Hybrid that sells on Amazon for pretty cheap right now (same power connector spacing as the Titan XP), but I'm not sure that's 100% confirmed to work yet.


----------



## KillerBee33

Quote:


> Originally Posted by *gamingarena*
> 
> Nice,
> dremel it? do you actually need to do that at all? i just want easy install with no dremeling


I don't like the EVGA shroud and will try to keep the reference look, so yes, a small piece will be cut out.

By the way, this gen of Hybrid kits comes with an LED if you're interested; it's just an EVGA Hybrid logo.


----------



## KillerBee33

Quote:


> Originally Posted by *Seyumi*
> 
> Just a reminder to everyone that EVGA is coming out with a Titan XP Hybrid kit. EVGA_Jacob confirmed it himself. You could theoretically make the 1080/1070 Hybrid kit work but you'd have to dremel the shroud due to the extra power connectors on the Titan XP. You could also probably fit the gold 980Ti Hybrid that sells on Amazon for pretty cheap right now (same same power connectors spacing as the Titan XP) but not sure if that 100% confirmed to work yet.


That's for whoever wants to keep the EVGA shroud, I assume.


----------



## toncij

Are they making a hydro-copper block too?


----------



## Tideman

So is there any OC software that supports voltage adjustments yet?

I've been using AB beta and can't adjust voltage.


----------



## HyperMatrix

Quote:


> Originally Posted by *KillerBee33*
> 
> 3DMark reported 2076 no Artifacts, no crashes , also played NFS 2016 @ 4k for few hours without any issues
> http://www.3dmark.com/3dm/13928497


I'm impressed that you can play nfs 2016 for that long. Kudos to you. For me the entire game feels like 10% of what an actual game should be. Lol.


----------



## KillerBee33

Quote:


> Originally Posted by *Tideman*
> 
> So is there any OC software that supports voltage adjustments yet?
> 
> I've been using AB beta and can't adjust voltage.


I've pushed more voltage through AB on a 1080 and the results were worse, believe it or not. I don't think voltage is the issue here; it's the power limit.


----------



## KillerBee33

Quote:


> Originally Posted by *HyperMatrix*
> 
> I'm impressed that you can play nfs 2016 for that long. Kudos to you. For me the entire game feels like 10% of what an actual game should be. Lol.


It's kind of a replica of the original NFS U2; at least they tried.

But overall the game is not bad. Got it on sale for $29, and this was my first few hours with it.


----------



## DADDYDC650

Quote:


> Originally Posted by *opt33*
> 
> Same issue here except I did not download timespy.... I had purchased 3D mark while back, went to run benches couple days ago and couldnt. My license was active and stored in 3dmark, but it said I had not purchased firestrike, etc and would not let me run benchmarks, and I was trying via steam as always. This must have happened with some 3dmark update or some hardware change on my end?. I went to repurchase on steam, and steam said I already purchased it. After a few hours of failing to solve the issue, reinstalling, etc, I just purchased whole version again with timespy and ignored the steam repurchasing. pretty annoying on 3dmark's end...


No way I'm paying these fools twice!


----------



## KillerBee33

Quote:


> Originally Posted by *DADDYDC650*
> 
> No way I'm going I'm paying these fools twice!


I forgot to mention , this issue started after 1607 Win10 upgrade.


----------



## WaXmAn

I had this same issue with 3DMark failing on start of benchmark after upgrading to the new anniversary edition of WIN 10. I deleted my Steam install of 3DMark and re-downloaded it. This fixed my issues


----------



## VeritronX

With Steam, all the various parts of 3DMark are separate DLCs that you get for free. You have to tick the box for Time Spy, and then Steam will download it and it will work. This is all to do with how they set it up on Steam and nothing to do with the OS.


----------



## combat fighter

Quote:


> Originally Posted by *toncij*
> 
> Are they making a hydro-copper block too?


They do already. It's called the EK water block.

Same thing different name.


----------



## KillerBee33

Quote:


> Originally Posted by *VeritronX*
> 
> With steam all the various parts of 3dmark are seperate dlcs that you get for free, you have to tick the box for time spy and then steam will download it and it will work.. this is all to do with how they set it up on steam and nothing to do with the os.


If you purchased it from Steam and run it through Steam, it works fine. If you download the standalone 3DMark and paste in your key from the Steam version, that will work too, but Time Spy gives an error when it even tries to look up the key. Not sure if it was just the timing of the upgrade for me, but it worked fine in the standalone version the day before 1607.


----------



## carlhil2

Quote:


> Originally Posted by *Jpmboy*
> 
> yes ~ 5C drop in sustained load temp.
> usually folks removing the OEM AC without a 4mm socket... put the pliers away.
> 
> 
> 
> 
> 
> 
> 
> (was the same problem on the 1080).


I actually used needle-nosed pliers to remove those screws. SUCCESS...

(but not everyone should try it; it takes patience)


----------



## toncij

Quote:


> Originally Posted by *combat fighter*
> 
> They do already. It's called the EK water block.
> 
> Same thing different name.


Well, EK is late even with EVGA blocks, not sure about TXP.


----------



## toncij

Quote:


> Originally Posted by *KillerBee33*
> 
> If you purchased it from Steam and run it thru Steam it works fine but if you downloaded standalone 3DMark and paste your key from Steam version it will work , but TimeSpy gives an error when trying to even look up the Key. Not sure if this was just the timing for me and the Upgrade but it worked fine day before 1607. in standalone Version.


You have to buy a TimeSpy upgrade, right?


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> You have to buy a TimeSpy upgrade, right?


Yep, $9+ through Steam, but it won't let me look up the Time Spy key, so pasting it into the standalone version isn't possible. Also, you only have to buy it if you want the option to turn OFF the demo and have custom settings, and I think the Time Spy stress test.


----------



## toncij

Who managed to keep TXP on 2GHz+ so far stable under full load? (I presume water-cooled). Anyone here managed to do it with stable output?


----------



## HaniWithAnI

Quote:


> Originally Posted by *toncij*
> 
> Who managed to keep TXP on 2GHz+ so far stable under full load? (I presume water-cooled). Anyone here managed to do it with stable output?


Stable on air with 100% fan at 2085 boost under full load (~+210MHz core, +450MHz mem). It's loud though. Hoping to fix that and maybe push to 2150 once I fit the hybrid cooler that arrived today, but I'm expecting to hit the PL somewhere in between, so I might need to wait for some nice tutorials on how to shunt correctly.


----------



## CRITTY

I am up to page 138 and getting loads of good info; thanks. I ran Fire Strike Ultra and I am #4 in the WORLD (for now)! Nice to meet everybody!

http://www.3dmark.com/fs/9694615


----------



## toncij

What is the running temp.? What wall are you hitting? PL, VL, TL?


----------



## MunneY

Quote:


> Originally Posted by *toncij*
> 
> Who managed to keep TXP on 2GHz+ so far stable under full load? (I presume water-cooled). Anyone here managed to do it with stable output?


mine did that on air in a 2 hour session. 2050mhz at +200.


----------



## CRITTY

Quote:


> Originally Posted by *toncij*
> 
> What is the running temp.? What wall are you hitting? PL, VL, TL?


In games thus far I have not been over 76C. From what I have observed and read, temps are not the main limiting factor (they play a role though). I am just enjoying digging into my backlog and throwing every slider to the right.


----------



## BehindTimes

Could anyone else test out lanes? I'm noticing roughly a 5% difference now using 16x/16x over 16x/8x (no extra services running when they're installed). I'm just wondering if it's mainly my other PCI Express cards slowing down the system, or if there might be a tangible difference now.


----------



## Edge0fsanity

Quote:


> Originally Posted by *toncij*
> 
> Who managed to keep TXP on 2GHz+ so far stable under full load? (I presume water-cooled). Anyone here managed to do it with stable output?


i did that on the stock cooler when i tested the card a couple days ago. Didn't even bother pushing to find the limit, i'll find it whenever EK ships my block.


----------



## Steven185

Quote:


> Originally Posted by *MunneY*
> 
> mine did that on air in a 2 hour session. 2050mhz at +200.


Temps?


----------



## MunneY

Quote:


> Originally Posted by *Steven185*
> 
> Temps?


Mid 60s with fan on 70%


----------



## Luke212

Hi. It seems no one in the world can confirm how many TFLOPS you get with FP16 operations.

If you want to do a world exclusive, please download SiSoft Sandra (http://www.sisoftware.net/download-lite/) and test the FP16 performance.

The Tesla P100 is supposed to do 22 TFLOPS FP16.

So is the Titan X more like the 1080 or more like the P100?

It's amazing that no one else has done this! It's the single most important performance number for deep learning, which is how the Titan X is marketed!!!

Please do the world a favour and find out!!!


----------



## mbze430

Quote:


> Originally Posted by *Luke212*
> 
> Hi can any pascal owners confirm how many TFlops you get with FP16 operations?
> 
> The Tesla P100 is supposed to do 22 TFlops FP16.
> 
> So is the Titan X more like the 1080 or more like the P100?


More like the 1080. The Titan XP can only do 11 TFLOPS FP16 and is missing FP64. Tesla is pure HPC; the Titan XP is like a stepping stone in dedicated computation.


----------



## Luke212

Quote:


> Originally Posted by *mbze430*
> 
> More like 1080. The Titan XP can only do 11Tflops FP16 and missing FP64. Telsa is a pure HPC. Titan XP is like a stepping stone in dedicated computation


This is probably not correct. Have you benchmarked it?

The GTX 1080 is 0.2 TFLOPS (200 GFLOPS) FP16; it runs FP16 at a 1/64 rate, not at 11 TFLOPS.

You can see a page here that shows the capabilities. No one has tested the Titan X yet though! Please do a world first!!
http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/5
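A rough sketch of the two hypotheses being debated here, using the Titan X Pascal's published specs (3584 CUDA cores, ~1531 MHz boost). The 1/64 FP16 rate is what GP104 does per the AnandTech table linked above; the 2x rate is what GP100 (Tesla P100) does. Which one GP102 uses is exactly what a Sandra run would settle:

```python
def tflops_fp32(cores, boost_ghz):
    """Theoretical FP32 throughput: 2 FLOPs per core per clock (FMA)."""
    return 2 * cores * boost_ghz / 1000

titan_xp_fp32 = tflops_fp32(3584, 1.531)  # ~11.0 TFLOPS, the marketing number

print(f"FP32:         {titan_xp_fp32:.1f} TFLOPS")
print(f"FP16 at 1/64: {titan_xp_fp32 / 64:.2f} TFLOPS (GP104-style)")
print(f"FP16 at 2x:   {titan_xp_fp32 * 2:.1f} TFLOPS (GP100-style)")
```

At 1/64 rate that's roughly 0.17 TFLOPS, consistent with the ~0.2 TFLOPS figure quoted for the 1080; at GP100's 2x rate it would be ~22 TFLOPS, a factor of over 100 apart, which is why a benchmark matters.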


----------



## mbze430

I am sure someone will run a SiSoft Sandra benchmark, but I am pretty sure it will have a similar result to the 1080. They have to "dumb down" the consumer version to market their HPC products.

I would run one, but I am not at home to do it. I am sure someone here who wants to will do so.


----------



## Maintenance Bot

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Not so sure the mod is that easy anymore:
> 
> 
> 
> *https://xdevs.com/guide/pascal_oc/*
> 
> This is on a FE 1080, most likely same as Titan-X P


Probably a bit above my skill level atm


----------



## HaniWithAnI

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Probably a bit above my skill level atm


Isn't the shunt only the first part of that image though? I've definitely seen posters with the 1080 reporting success with only the first step, Der8auer on youtube actually has a tutorial demonstrating it on the 1080.

See here: http://overclocking.guide/increase-the-nvidia-power-limit-all-cards/
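For anyone wondering why the shunt part of that guide raises the power limit: the card measures current as V_sense / R_shunt, so lowering the effective shunt resistance makes the controller under-read current, and the same reported "power limit" then allows proportionally more real power. A minimal sketch of the arithmetic; the 5 mOhm stock value is an assumption for illustration, not a measured figure for this card:

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

r_stock = 0.005                    # assumed stock shunt, ohms (5 mOhm)
r_mod = parallel(r_stock, 0.005)   # stack an identical shunt on top

scale = r_stock / r_mod            # factor by which current is under-read
print(f"Effective shunt: {r_mod * 1000:.2f} mOhm")
print(f"Real power drawn at a reported 250 W limit: {250 * scale:.0f} W")
```

Halving the shunt doubles the effective limit, which is also why the mod is risky: the card's protection circuitry is being blinded by the same factor.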


----------



## Maintenance Bot

Quote:


> Originally Posted by *HaniWithAnI*
> 
> Isn't the shunt only the first part of that image though? I've definitely seen posters with the 1080 reporting success with only the first step, Der8auer on youtube actually has a tutorial demonstrating it on the 1080.
> 
> See here: http://overclocking.guide/increase-the-nvidia-power-limit-all-cards/


I'm not 100% sure.

Buried in the comment section of that video, though, Der8auer says he will do a TXP tutorial video when he gets one.


----------



## HaniWithAnI

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Im not 100% sure.
> 
> Buried in the comment section of that video Der8auer says he will do a TXP tutorial video though when he gets one.


Haha I know, I'm the one who asked him!


----------



## Glzmo

Quote:


> Originally Posted by *Steven185*
> 
> My card performs great but throttles like crazy. The stock cooler is kind of underwhelming. Of course I can make the fan run 100% at all times, but that's not the point. So I was looking at AIO coolers. Would this one work? http://eu.evga.com/Products/Product.aspx?pn=400-HY-0996-B1
> 
> Or maybe this one? http://eu.evga.com/Products/Product.aspx?pn=400-HY-H980-B1
> 
> Without any modding?
> 
> I saw some guys being able to fit them. But I'm not sure which one of those they used. Also I'd prefer not to mod my card (so that I may not completely destroy its resale values, Volta is near
> 
> 
> 
> 
> 
> 
> 
> ).


From what I understand, both will work without having to modify the card; they are both the same except for the 980 Ti or 980 branding on the shroud (as is the one for the Maxwell Titan X, if you can find it). You would have to go without the plastic shroud that comes with the kit, though, due to the different shape/mounting holes on those. The plastic shroud is mostly cosmetic (you could argue that it directs the airflow, although some will say it obstructs it; I guess it depends on your case's airflow). If you really want to keep a shroud, you could possibly unscrew the glass part of the Titan X's original shroud and stick the tubes through that.
You could probably also use the AIO for the 10 series, though I'm not sure if that shroud will fit the Titan X either.


----------



## DADDYDC650

I'd mod my XP but I'll probably be upgrading in 6 months anyway.


----------



## CallsignVega

Quote:


> Originally Posted by *Seyumi*
> 
> Just a reminder to everyone that EVGA is coming out with a Titan XP Hybrid kit. EVGA_Jacob confirmed it himself. You could theoretically make the 1080/1070 Hybrid kit work but you'd have to dremel the shroud due to the extra power connectors on the Titan XP. You could also probably fit the gold 980Ti Hybrid that sells on Amazon for pretty cheap right now (same same power connectors spacing as the Titan XP) but not sure if that 100% confirmed to work yet.


Any of the EVGA Hybrid kits will work, from what I've seen. Although I didn't use that silly plastic "shroud".

Kind of a moot point for me though, as the Arctic Accelero III cools just as well and is half the price. And you don't have to worry about cheap AIO pumps/leaks.

Do yourself a favor no matter what cooler you go with: wipe off that stock TIM and put something good on there, like Gelid Extreme.


----------



## DarkIdeals

Quote:


> Originally Posted by *toncij*
> 
> Who managed to keep TXP on 2GHz+ so far stable under full load? (I presume water-cooled). Anyone here managed to do it with stable output?


Mine hits 2,025MHz on stock air cooling as a max overclock. I have to really crank the fan though; in demanding games like TW3 I can only get 1,975MHz with the fan below 80% speed. There's DEFINITELY a ton of thermal throttling, as when I up my temp limit to max, the card hits 90C within a couple of minutes flat! I'm honestly really concerned that there's something specifically wrong with my card, as I've already replaced the TIM with Thermal Grizzly Kryonaut and temps are STILL really high. Maybe I just did a bad job applying the TIM, but I simply followed the standard EKWB "star" method, so it should be fine.
Quote:


> Originally Posted by *BehindTimes*
> 
> If anyone else could test out lanes. I'm noticing roughly a 5% difference now using 16x/16x over 16x/8x (no extra services running when they're installed). I'm just wondering if mainly my other PCI Express cards are slowing down the system, or if there might be a tangible difference now.


God I hope not. I went and bought an i7 6800K after selling my 5960X, thinking that 40 lanes were pointless now that SLI is limited to 2-way etc. But even a 5% difference is enough to change my mind towards getting a 6850K/6900K instead... sigh -____-


----------



## carlhil2

Had to kick that Kingpin LN2 980Ti run to the curb.. http://www.3dmark.com/fs/9698237.. http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+ultra+preset/version+1.1/1+gpu


----------



## Sheyster

My card arrived this afternoon. 1860 MHz max boost stock (+120% PL). I'll start looking at the BIOS tomorrow. Gonna just enjoy the card this evening.


----------



## carlhil2

Quote:


> Originally Posted by *Sheyster*
> 
> My card arrived this afternoon. 1860 MHz max boost stock (+120% PL). I'll start looking at the BIOS tomorrow. Gonna just enjoy the card this evening.


My chip is the same, and thanks, man..


----------



## renejr902

Hi guys! I have a Titan X Pascal OC'd to core +205 and mem +700, 100% stable, and an i7 4790K OC'd to 4.5GHz. Do I need faster RAM than 1600MHz to game at 4K at 60 FPS? If yes, I'd have to change my motherboard too, because my ASUS H97-Plus can't run memory above 1600MHz; it only overclocks the CPU. I can't find a BIOS mod for the board to allow 1600+ RAM, so I'm not sure it's worth buying a new board and 32GB of DDR3-2400 if 4K gaming performance is no better at all, or only 2-3% better.

I'm playing at 4K in all games and get 60+ FPS in most of them. Sometimes, though it's RARE, I get FPS drops or stuttering in a few games; nothing too bad, and it doesn't happen often at all. Witcher 3 and Rise of the Tomb Raider have a few FPS drops or stutters, but I get 60+ FPS in both. Witcher 3 is with no HairWorks and no AA; other settings are mostly at maximum quality. I get 60 FPS most of the time. I play with vsync, which plays so much better with no tearing, but I still have a few FPS drops or stutters, though not often at all.

I still have 32GB of Kingston DDR3-1600 value RAM at CL11.

With 32GB of DDR3-2400, will I get better FPS, fewer FPS drops, or less stuttering at 4K?

Thanks for answering; I'm short of money right now. The Titan X Pascal was so expensive, and I don't want to waste money on RAM and a new board if it doesn't make any difference in gaming at 4K resolution.

Thanks for answering, guys, it's really appreciated.

Some links to RAM benchmarks at different speeds, thanks for your opinions (the first link seems to show no FPS difference between RAM speeds at 1440p in nearly all games):









http://techbuyersguru.com/gaming-ddr4-memory-2133-vs-26663200mhz-8gb-vs-16gb?page=1

http://www.anandtech.com/show/8959/ddr4-haswell-e-scaling-review-2133-to-3200-with-gskill-corsair-adata-and-crucial/6

http://www.anandtech.com/show/8959/ddr4-haswell-e-scaling-review-2133-to-3200-with-gskill-corsair-adata-and-crucial/8

http://www.anandtech.com/show/7364/memory-scaling-on-haswell/7


----------



## carlhil2

Jacked up my RAM.. http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+ultra+preset/version+1.1/1+gpu My results show the wrong core clock though; it should be 2076ish..


----------



## DADDYDC650

I've seen my card hit 2070Mhz but it needs to go under water to stop some of the throttling. VRAM stable at 11Ghz. Haven't pushed it further.


----------



## carlhil2

3DMark throttles my card still, even under water. I am living dangerous though, only thing being water cooled IS the chip.. it's only temporary though...


----------



## Jpmboy

Quote:


> Originally Posted by *Testier*
> 
> I thought this one needed the weird hexa 4mm screw.


Quote:


> Originally Posted by *EniGma1987*
> 
> Oh you are talking about the allen bolts, not the screws. They are pretty standard stuff, very easy to find allen wrenches. Again, Lowes, Home depot, amazon, all over the place. The allen bolts are not 4mm size though, they are either 2.5mm or 3mm. The 4mm part a lot of talking about just needs any standard 4mm socket and those are to take out the standoff bolts that connect the PCB to the cooler and backplate.
> 
> So to recap, tools needed to disassemble the stock cooler from the card:
> 
> tiny phillips screwdriver for all the little screws on the cooler
> allen wrench somewhere between 2.5mm-3mm (cant remember which) for the allen bolts on the cooler
> ww.amazon.com/TEKTON-1202-4-Inch-Socket-9-Sockets/dp/B005G7QHKY/ref=sr_1_1?ie=UTF8&qid=1470682139&sr=8-1&keywords=4mm+socket]4mm socket wrench for the standoff bolts that connect the cooler and backplate mounts to the PCB.[/URL]


Guys - the tiny screws on the back plate screw into an internally tapped 4mm hex socket screw that in turn screws into the air cooler. The mishaps occur because guys are taking needle-nose pliers to the hex - which can be very tight with the Loctite used.
*There is no allen wrench used to disassemble the stock cooler from the PCB and backplate.* I just did one a few hours ago:


Scanning the PCB with an IR gun while running Timespy shows that the 2 R22 mosfets (or whatever) are the hottest things on the card. The core with ambient water never gets above 35C (Grizzly on the GPU), and I was able to squeeze about 60MHz more out of the weak card. Gotta do my "good" card in the next couple of days.
Quote:


> Originally Posted by *DADDYDC650*
> 
> No way I'm going I'm paying these fools twice!


If I remember right - you pay $10 for Timespy and all your 3DMark licenses are renewed with the TS purchase.


----------



## DADDYDC650

Quote:


> Originally Posted by *carlhil2*
> 
> 3DMark throttles my card still, even under water. I am living dangerous though, only thing being water cooled IS the chip..
> 
> 
> 
> 
> 
> 
> 
> , it's only temporary though...


3dMark throttles my card the most. Runs around 2030-2070Mhz running Far Cry Primal.


----------



## renejr902

Quote:


> Originally Posted by *DADDYDC650*
> 
> I've seen my card hit 2070Mhz but it needs to go under water to stop some of the throttling. VRAM stable at 11Ghz. Haven't pushed it further.


Is it dangerous? My RAM runs perfectly even at +750, but I left it at +700 because that's the speed I stability-tested my system at.


----------



## carlhil2

Quote:


> Originally Posted by *DADDYDC650*
> 
> 3dMark throttles my card the most. Runs around 2030-2070Mhz running Far Cry Primal.


Oh, I am straight in games.. Star Citizen crushes my card at 4k though.., to be expected..seriously thinking about getting another TXP..


----------



## renejr902

Quote:


> Originally Posted by *carlhil2*
> 
> 3DMark throttles my card still, even under water. I am living dangerous though, only thing being water cooled IS the chip..
> 
> 
> 
> 
> 
> 
> 
> , it's only temporary though...


I have +205 core stable with msi overdrive.

How much "+" did you get with your core watercooling setup? Thanks for answering.


----------



## DADDYDC650

Quote:


> Originally Posted by *renejr902*
> 
> Is it dangerous, my ram run perfect even at +750 ? But i let it at 700mhz because i tested my stable system at that speed


I wouldn't bother running higher than +500 unless you are benching.


----------



## renejr902

Quote:


> Originally Posted by *DADDYDC650*
> 
> I wouldn't bother running higher than +500 unless you are benching.


To be honest, even in 4K gaming, +500 to +750 seems to give me 1-2 FPS more at best.


----------



## carlhil2

Quote:


> Originally Posted by *renejr902*
> 
> I have +205 core stable with msi overdrive.
> 
> How much " + " did you got with your core watercolling system ? Thanks for answer


+208, but, my boost at stock is 1860...I can do +600+ on the ram, no issue...


----------



## DADDYDC650

Quote:


> Originally Posted by *carlhil2*
> 
> +208, but, my boost at stock is 1860...I can do +600+ on the ram, no issue...


Think my card boosts to 1848 at stock. Is there something about the 12Mhz difference?


----------



## renejr902

Quote:


> Originally Posted by *carlhil2*
> 
> +208, but, my boost at stock is 1860...I can do +600+ on the ram, no issue...


Thanks a lot. If you got +208 and I got +205, I suppose we need a modded BIOS to get more speed, right?


----------



## carlhil2

Quote:


> Originally Posted by *DADDYDC650*
> 
> Think my card boosts to 1848 at stock. Is there something about the 12Mhz difference?


ASIC Quality?


----------



## Testier

Quote:


> Originally Posted by *carlhil2*
> 
> ASIC Quality?


Do we just not have any tool to read that this gen?


----------



## renejr902

Quote:


> Originally Posted by *DADDYDC650*
> 
> Think my card boosts to 1848 at stock. Is there something about the 12Mhz difference?


In Witcher 3 my boost speed is between 1946 and 2000MHz.
My temps are 75C with 100% fan.


----------



## DADDYDC650

Quote:


> Originally Posted by *renejr902*
> 
> Thanks a lot. If you got +208 and i got +205, i suppose we need a modded bios to get more speed. Right?


Modded BIOS + Water cooling.


----------



## carlhil2

Quote:


> Originally Posted by *Testier*
> 
> Do we just not have any tool to read that this gen?


Not that I know of, but, going by past experience, the higher ASIC meant higher stock boost for my Maxwells...


----------



## renejr902

Quote:


> Originally Posted by *DADDYDC650*
> 
> Modded BIOS + Water cooling.


I will buy a watercooling system, but I want more speed than +208. I hope a modded BIOS will be released.


----------



## DADDYDC650

Quote:


> Originally Posted by *carlhil2*
> 
> Not that I know of, but, going by past experience, the higher ASIC meant higher stock boost for my Maxwells...


Perhaps yours OC's 12Mhz higher?


----------



## renejr902

Quote:


> Originally Posted by *renejr902*
> 
> In witcher 3 my speed boost is between 1946 and 2000mhz
> My temp are 75c with 100% fan


Is it good ? ( my mhz speed boost)


----------



## carlhil2

Quote:


> Originally Posted by *DADDYDC650*
> 
> Perhaps yours OC's 12Mhz higher?


?


----------



## DADDYDC650

Quote:


> Originally Posted by *carlhil2*
> 
> ?


It's a joke considering yours hits 1860 stock boost and mine hits 1848.


----------



## carlhil2

Quote:


> Originally Posted by *DADDYDC650*
> 
> It's a joke considering yours hits 1860 stock boost and mine hits 1848.


Lol, got you..


----------



## HaniWithAnI

Got some time to test more today, running furmark and Witcher 3 at 100% fan got me 2100ish, but now I'm hitting VRel limit way more than power (about 80% of the time VRel, other 20% is VRel and power)...

As far as I know there's no way (software-or-shunt mod wise) to feed the card more voltage correct? I'd be comfortable with liquid metal or pencil mods but not cutting or soldering anything on the card...

Also, am I waiting for custom BIOS tools only or is the MSI Afterburner team likely to add TitanX Pascal voltage controls eventually?

As usual, any help appreciated, I'm pretty new to this but it's hella fun pushing the card so far, she's a beaut!


----------



## DADDYDC650

Nvidia has Pascal on lock down. Be a miracle if I see BIOS mods the way things are looking.


----------



## carlhil2

To be honest, I am happy that I can hit at least 2050+ with TXP....2150+, I would be ..


----------



## Testier

Quote:


> Originally Posted by *DADDYDC650*
> 
> Nvidia has Pascal on lock down. Be a miracle if I see BIOS mods the way things are looking.


I hope you are wrong....


----------



## DADDYDC650

Quote:


> Originally Posted by *Testier*
> 
> I hope you are wrong....


Me too....


----------



## DADDYDC650

Quote:


> Originally Posted by *carlhil2*
> 
> To be honest, I am happy that I can hit at least 2050 with TXP....2150+, I would be ..


Nvidia won't allow it. Everything is locked down for a reason. They have the perfect plan on getting us to upgrade.


----------



## carlhil2

Quote:


> Originally Posted by *DADDYDC650*
> 
> Nvidia won't allow it. Everything is locked down for a reason. They have the perfect plan on getting us to upgrade.


Depending on how these push BF1 @4K, I will be ok til Titan XV...


----------



## DADDYDC650

Quote:


> Originally Posted by *carlhil2*
> 
> Depending how these push BF1 @4K, I will be ok til Titan XV...


I believe 1080's can max out BF1 at 4k. We'll be fine for awhile. Or until they release a better card and I get the upgrade itch...


----------



## carlhil2

Quote:


> Originally Posted by *DADDYDC650*
> 
> I believe 1080's can max out BF1 at 4k. We'll be fine for awhile. Or until they release a better card and I get the upgrade itch...


Lol, there it is...


----------



## Z0eff

Quote:


> Originally Posted by *carlhil2*
> 
> Oh, I am straight in games.. Star Citizen crushes my card at 4k though.., to be expected..seriously thinking about getting another TXP..


Pretty much one of the main reasons why I'm closely following the development of GPUs atm. That game is going to be very intensive on GPUs and will probably still be somewhat unoptimized when the first part of the singleplayer campaign comes out.


----------



## carlhil2

Quote:


> Originally Posted by *Z0eff*
> 
> Pretty much one of the main reasons why I'm closely following the development of GPUs atm. That game is going to be very intensive on GPUs and will probably still be somewhat unoptimized when the first part of the singleplayer campaign comes out.


I will have to stick with 1440p in that game til I can tweak the settings I guess.., still looks good at that rez...


----------



## HaniWithAnI

Well, I would expect at least some form of voltage adjustment (even if it's only the gimped Boost 3.0 version seen on the 1070 and 1080 in the 4.3.0 beta of Afterburner) - not to mention a custom BIOS (we know that's possible on Pascal, since ASUS has a custom 1080 BIOS that allows boosted VCORE even on 1080 FE cards)

Time will tell I suppose...


----------



## NoDoz

Finally got home from my trip today and installed my TXP that has been sitting at my house since Friday. Threw a simple OC on it, +190/+485. Really like it so far. Used the settings for top 30.


----------



## toncij

Quote:


> Originally Posted by *DarkIdeals*
> 
> Mine hits 2,025mhz on stock air cooling as a max overclock. I have to really crank the fan though as in demanding games like TW3 i can only get 1,975mhz with the fan below 80% speed. There's DEFINITELY a ton of thermal throttling, as when i up my temp limit to max the card hits 90C within a couple minutes flat! I'm honestly really concerned that there's something specifically wrong with my card, as i've already replaced the TIM with Thermal Grizzly kryonaut and it's STILL real high temps. Maybe i just did a bad job applying the TIM, but i simply followed the standard EKWB "star" method so it should be fine.
> God i hope not. I went and bought an i7 6800K after selling my 5960X thinking that 40 lanes was pointless now that SLI is limited to 2 way etc.. But even a 5% difference is enough to change my mind into getting a 6850K/6900K instead....sigh -____-


I expected temperatures to be even higher after some gaming. Really interested in 100%-fan temps, since 100% fan should get you close to 90% of water-cooling performance, temperatures aside.

Regarding the CPU, get what clocks best for current games and get 6+ cores for future ones.

Quote:


> Originally Posted by *Z0eff*
> 
> Pretty much one of the main reasons why I'm closely following the development of GPUs atm. That game is going to be very intensive on GPUs and will probably still be somewhat unoptimized when the first part of the singleplayer campaign comes out.


I don't think there'll ever be a game tbh...

Quote:


> Originally Posted by *DADDYDC650*
> 
> I believe 1080's can max out BF1 at 4k. We'll be fine for awhile. Or until they release a better card and I get the upgrade itch...


A single 1080 runs 5K at about 35-45 FPS. Two in SLI do 65-85.


----------



## TUGenius

I currently have an R9 295X2, but after lackluster CrossFire performance, I'm going to move to a single-GPU card, the TITAN XP. I've seen here that decent OCs are possible with the stock cooler, at the expense of noise and heat (which I was getting on a closed-loop AIO card anyway lol).

The EVGA kit should be coming soon, along with a custom BIOS of sorts, and I was wondering if it's worth the extra money to move to the EVGA closed-loop water block. Does it void the card's warranty? Is it significantly cooler, or does it perform significantly better? I don't want to move to a custom water solution, since I don't trust myself enough to install a water block and backplate and then have to worry about leakage.

I don't know if anyone has tested this either, but does the PCIe slot make a difference? I have mine in the 3rd slot of the ASUS R5E X99 mobo (x8 3.0). Maybe I could wait for EVGA to release a hybrid TITAN XP, then I wouldn't have to modify the FE shroud. Thanks in advance!

(Go TeamGreen!)


----------



## gamingarena

Quote:


> Originally Posted by *CallsignVega*
> 
> Any of the EVGA Hybrid kits will from what I've seen. Although I didn't use that silly plastic "shroud".
> 
> Kinda a moot point for me though as the Arctic Accelero III cools just as well and is half the price. And you don't have to worry about cheap AiO pumps/leaks.
> 
> Do yourself a favor no matter what cooler you go with, wipe off that stock TIM and put something good on there. Like Gelid Extreme.


Hey Vega, question for you: I'm looking to get the Accelero Xtreme IV. Is that the one you're using, or the III?
How about the VRM and VRAM? I see the Xtreme IV doesn't have any sinks for those, just the backplate, which is supposed to cool the VRM and VRAM from the back side.

Do you think I should put something over those, or would the Accelero solution be enough in this case?

Thanks


----------



## MrTOOSHORT

Well, I preordered the EK block just now and should receive it in the middle of next week. It's actually surprising what kind of bench scores you can get on the stock cooler running 100% fan speed. Can only imagine the boost from a water block.

The backplate is released at the end of the month; I'll order it then. Just need the block at the very least ASAP. Enough of this air cooler already!


----------



## pez

I had a whole string of quotes lined up, but I guess I wasn't attentive enough to the browser and it timed out--*sigh*.

Most notable one I wanted to ask about was the stock TIM. Have a lot of you replaced it on just the stock cooler? Any significant improvements?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *pez*
> 
> I had a whole string of quotes lined up, but I guess I wasn't attentive enough to the browser and it timed out--*sigh*.
> 
> Most notable one I wanted to ask about was the stock TIM. Have a lot of you replaced it on just the stock cooler? Any significant improvements?


Some people have checked theirs; the factory application was bad on some cards and fine on others. If you're staying on the stock cooler, it's best to take a look. Mine will be coming off next week, and my temps seem pretty normal compared to others around here, so I won't be checking until then.


----------



## HyperMatrix

Can't deny I like staring at these cards. Especially with that HB Bridge...


----------



## Glzmo

Quote:


> Originally Posted by *HaniWithAnI*
> 
> Got some time to test more today, running furmark and Witcher 3 at 100% fan got me 2100ish, but now I'm hitting VRel limit way more than power (about 80% of the time VRel, other 20% is VRel and power)...
> 
> As far as I know there's no way (software-or-shunt mod wise) to feed the card more voltage correct? I'd be comfortable with liquid metal or pencil mods but not cutting or soldering anything on the card...
> 
> Also, am I waiting for custom BIOS tools only or is the MSI Afterburner team likely to add TitanX Pascal voltage controls eventually?
> 
> As usual, any help appreciated, I'm pretty new to this but it's hella fun pushing the card so far, she's a beaut!


You could try adding the following lines to your ...\MSI Afterburner\Profiles\VEN_10DE&DEV*.cfg:

Code:


[Settings]
VDDC_Generic_Detection = 1

That will unlock the Core Voltage slider on the Nvidia Titan X, although I haven't tried whether it actually works.


----------



## pez

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Some people have checked theirs and it was bad or good. If you're staying on the stock cooler, best to take a look. Mine will be coming off next week, and my temps seem pretty normal compared to others around, so I won't be checking until then.


I'll have to check mine out. So far I'm hitting the temp limit and the fans seem to stick around 70-80%. This is with Power Target at 120%, Temp Target/Limit at 85C, and GPU clock at +200MHz. In trivial games like CS:GO I'm sitting dead at 2000MHz, and GTA V seems to sit at 1974MHz even after 30 minutes or so.

On these hybrid coolers, how loud is it compared to the stock blower fan, and what percentage/speed does it normally get to?
Quote:


> Originally Posted by *HyperMatrix*
> 
> Can't deny I like staring at these cards. Especially with that HB Bridge...


I keep telling myself I don't need SLI, but I mean....I've got this leftover HB bridge that just looks so lonely.


----------



## DADDYDC650

Quote:


> Originally Posted by *HyperMatrix*
> 
> Can't deny I like staring at these cards. Especially with that HB Bridge...


Very sexy! Mine sits in the dark at the moment.


----------



## toncij

Quote:


> Originally Posted by *pez*
> 
> I'll have to check mine out. So far I'm hitting temp limit and fans seem to stick around 70-80%. This is with Power Target at 120%, Temp Target/Limit at 85C, and GPU clock at +200Mhz. In trivial games like CS:GO I'm sitting dead at 2000Mhz and GTA V seems to sit at 1974Mhz even after 30 minutes or so.
> 
> On these hybrid coolers, how loud is it compared to the stock blower fan, and what percentage/speed does it normally get to?
> I keep telling myself I don't need SLI, but I mean....I've got this leftover HB bridge that just looks so lonely
> 
> 
> 
> 
> 
> 
> 
> .


What temps do you get at 100% fan? What is your ambient?

I question the validity of the SLI option... I'm running 1080s in SLI and thinking of moving to a single TX. Just for, well, the boost everywhere. It's 30%, but it's everywhere. SLI, on the other hand, doesn't run that well; I mean, it doesn't run at all in many games.


----------



## Steven185

Quote:


> Originally Posted by *Glzmo*
> 
> From what I understand, both will work without having to modify the card and simply they are both the same except the 980 Ti or 980 branding on the shroud, as is the one for the Maxwell Titan X if you can find that), but without the plastic shroud that comes with it due to different shape/mounting holes on those. The plastic shroud is mostly cosmetic (although you could argue that it directs the airflow - although some will say it obstructs it, I guess it depends on your case's airflow - but I guess if you really want to keep a shroud, you could possibly just screw off the Titan X's original shroud's glass part and stick the tubes through that.
> You could probably also use the AIO for the 10 Series, not sure if that shroud will work on the Titan X either, though.


Thanks, my concern is mostly with the differing die size. Since the 980 Ti's die is larger and the 980's is smaller, will the thermal plate really make full contact, or will there be thermal losses? My understanding is that EVGA has started developing an AIO for the Titan XP in particular, but if that takes weeks (or even months), I'm better off buying one of their older models, for the simple fact that I may be using my card before the release of the new AIO...


----------



## Maintenance Bot

Quote:


> Originally Posted by *HaniWithAnI*
> 
> Well I would expect at least some form of Voltage adjust (even if it is only the boost 3.0 gimped version seen on the 1070 and 1080 in the 4.3.0 beta of Afterburner) - not to mention custom bios (we know that's possible on pascal since ASUS has a custom 1080 bios that allows boosted VCORE even on 1080 FE cards)
> 
> Time will tell I suppose...


Hilbert says 4.3.0 Beta 11 will offer voltage control when it's released. This will be nice if it works. http://www.guru3d.com/articles_pages/nvidia_titan_x_(pascal)_overclock_guide,2.html


----------



## DADDYDC650

Quote:


> Originally Posted by *HyperMatrix*
> 
> Not having any issues with Doom? I have nightmare stats mode running under vulkan. Showing GPU/CPU usage all under 50% but in areas FPS drops to 160. Asides from Vulkan being bust in general due to no SLI, and no GSYNC, I was disappointed to see it drop below 200fps.


I was running OpenGL and it constantly hit 2GHz without throttling. Not sure about the FPS, but it never lagged one bit.


----------



## Steven185

BTW, this is the least overclockable card/generation yet. It seems that Nvidia, not feeling the heat from AMD, has cut back on allowed tweaks and performance boosts.

Just last gen I could routinely get a 20%+ performance increase, even 30%+ under water. From what I'm reading in this thread, even under water we cannot top a 15% performance boost over stock clocks/boost.
This is the card/gen that needs a tweaked BIOS the most... let's see what we'll get.


----------



## KillerBee33

Quote:


> Originally Posted by *Steven185*
> 
> BTW this the least overclockable card/generation yet. It seems that nVidia not feeling the heat from AMD made her to cut back in allowed tweaks and performance boosts.
> 
> Just last gen I could routinely get 20% + performance increase, even 30% + under water. From what I'm reading in this thread even under water we cannot top 15% performance boost over stock clocks/boost.
> This is the card/gen that needs tweaked bios the most... let's see what we'll get.


Interesting. 1417 + 45% is ~2055; I get a stable benchmark run @ 2098, which is roughly 48% over stock.


----------



## unreality

Quote:


> Originally Posted by *Steven185*
> 
> BTW this the least overclockable card/generation yet. It seems that nVidia not feeling the heat from AMD made her to cut back in allowed tweaks and performance boosts.
> 
> Just last gen I could routinely get 20% + performance increase, even 30% + under water. From what I'm reading in this thread even under water we cannot top 15% performance boost over stock clocks/boost.
> This is the card/gen that needs tweaked bios the most... let's see what we'll get.


Stock boost is ~1530, so when people like me are hitting 2000+ on air, that's already an OC of over 30%. We'll see what happens when we put these monsters under water with voltage control.


----------



## Steven185

Quote:


> Originally Posted by *unreality*
> 
> Stock Boost ist ~1530 so when people like me are hitting 2000+ on air already thats already an OC over 30%. We will see what happens when we put those monster under water with voltage control


Quote:


> Originally Posted by *KillerBee33*
> 
> interesting , 1417 + 45% is 2054 , i get a stable benchmark run @ 2098 that's roughly 50% Over Stock


Well, what I did was measure what I was getting before overclocking (stock) and afterwards. I got about 10-13% extra performance out of my Titan XP. I'm not sure how you got more.

Guru3D's overclocking review backs me up; the actual gains are minimal: http://www.guru3d.com/articles_pages/nvidia_titan_x_(pascal)_overclock_guide,12.html

Here's hoping we get more from water + voltage + BIOS tweaks.


----------



## KillerBee33

Quote:


> Originally Posted by *Steven185*
> 
> Well, what I did was to measure what I was getting before overclocking and afterwards. I got about 10%-13% extra performance in my Titan XP. I'm not sure how you got more.
> 
> Guru 3D's overclocking review backs me up, the actual gains are minimal: http://www.guru3d.com/articles_pages/nvidia_titan_x_(pascal)_overclock_guide,12.html


Quite easy really: the stock clock is 1417MHz, it boosts itself by ~30% from the factory to around 1860MHz, and you can easily overclock to 2050MHz in most cases, +50MHz more in some. Overall that's about 45-50% over the stock 1417MHz.
EDIT: If you do the math, the 900 series was worse than the 10 series: a 980 @ 1127MHz stock with a most-common OC of 1506MHz isn't even 35% over stock.


----------



## renejr902

Quote:


> Originally Posted by *NoDoz*
> 
> Finally got home from my trip today and installed my TXP that has been sitting at my house since Friday. Threw a simple OC on it, +190/+485. Really like it so far. Used the settings for top 30.


If you have time, can you run it at 4K? I scored: FPS 44.4, score 1118, min 21.2, max 99.6.


----------



## Steven185

Quote:


> Originally Posted by *KillerBee33*
> 
> Quite easy really , Stock clock is 1417MHz , it Boosts itself by 30% from factory to around 1860MHz and you can easily oveclock to 2050MHz in most cases and +50Mhz in some. Overall its about 50% over stock 1417MHz
> EDIT: If you do the Math 9Series were worse then 10Series , 980 @ 1127 Stock most common OC was 1506MHz over 1127MHz STOCK , thats not even 35%


I understand the math, but in actual performance terms, overclocking (as it currently stands, pre-modding and pre-good-cooling) is not worth much.

Just by keeping it stock you get 90% of the attainable performance; that certainly did not happen last gen. My last Titan X pre-overclocking (and modding) sat at 75% of its potential, which ultimately made the overclocking worth my while. This gen I don't see much point to overclocking. I'll probably return to it when more BIOS/voltage tweaks are in and when I receive my hybrid cooler; as of now I'm kind of underwhelmed.


----------



## EniGma1987

Quote:


> Originally Posted by *Maintenance Bot*
> 
> 4.3.0 Beta 11 to offer voltage control, to be released says Hilbert. This will be nice if it works. http://www.guru3d.com/articles_pages/nvidia_titan_x_(pascal)_overclock_guide,2.html


Good news if they limit max voltage to 1.08V. People just don't seem to understand how inadequate the Titan X VRM is, and how big the risk of blowing up the card is at anything over 1.1V when overclocking.
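As a rough illustration of why a voltage cap matters: dynamic power scales roughly with frequency times voltage squared, so even modest overvolting pushes a lot of extra current through the VRM. The clocks and voltages below are example figures only, not measured values:

```python
def relative_power(f_new: float, f_old: float, v_new: float, v_old: float) -> float:
    """Approximate dynamic CMOS power ratio: P scales with f * V^2."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Example figures only: ~1.062V at 1860MHz boost vs a hypothetical
# 1.2V needed for a 2100MHz overvolted OC.
ratio = relative_power(2100, 1860, 1.2, 1.062)
print(f"~{(ratio - 1) * 100:.0f}% more power through the VRM")  # -> ~44%
```

A ~13% clock bump bought with a ~13% voltage bump costs roughly 44% more power, which is why an underbuilt VRM becomes the limiting factor.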

Quote:


> Originally Posted by *unreality*
> 
> Stock Boost ist ~1530 so when people like me are hitting 2000+ on air already thats already an OC over 30%. We will see what happens when we put those monster under water with voltage control


Nvidia lists the stock boost as the minimum possible. All cards so far have boosted to 1800MHz completely stock out of the box, with 90%+ of them at 1850MHz. So really, that is stock, not 1500MHz.


----------



## KillerBee33

Quote:


> Originally Posted by *Steven185*
> 
> I understand the math, but in actual performance overclocking (as it currently stands, pre-modding and pre-good cooling) is not much worth it.
> 
> Just by keeping it stock you get 90% of the performance, this certainly did not happen last gen. My last Titan X pre-overclocking (and modding) was at 75% of the performance , ultimately making the overclocking worth my while. In this gen, I don't see much point to overclocking, I'll probably return to it when more bios/voltage tweaks are in and when I'd receive my hybrid cooler; as of now I'm kind of underwhelmed.


Well, I was able to push a 980 to 1582MHz; let's just wait and see what BIOS tools will bring.

Nvidia did take trying to lock that down to the next level.

And yes, the Hybrid kit is stashed away until needed; as of today, no matter what my OC is, it didn't get over 74 degrees.


----------



## DNMock

What form of TIM is on the stock backplate, if any? Is it the 3mm adhesive pads?


----------



## NoDoz

Quote:


> Originally Posted by *renejr902*
> 
> If you have time can you do it in 4k, i scored: fps:44,4 score:1118 min 21,2 max 99,6.


Here's mine at 4K. I haven't tweaked my OC yet; I could go higher for sure.


----------



## pez

Quote:


> Originally Posted by *toncij*
> 
> What temps do you get at 100% fan? What is your ambient?
> 
> I question validity of the SLI option... running 1080s in SLI and thinking of moving to a single TX. Just for well, the boost everywhere. It's 30% but it's everywhere. SLI, on the other hand, doesn't run that well, I mean, doesn't run at all in many games.


I never ran my 1080s at 100% while gaming, but they are both sold off as of today, officially. I did do so just to see what 100% fan was like. I actually made a video of a single G1's fan speed at 100%. I'd say it's every bit of annoying as any cooler (blower or AIB) at 100%. Temps on those cards were about 81-83C for the top at around 75% fan and the bottom card got to around 73C with about 60% fan. This was with a custom fan profile that was set to be more aggressive near the top of that temperature range.

I don't run my Titan X P at 100% either, but temps will hit the 85C limit that I set. My ambients are ~77F/25C.

My biggest suggestion is to take into account what game you think you'll play the most, or the majority of the games you are interested in, and go from there. If you're into a lot of indie titles or triple-A games with mediocre SLI support (i.e. Fallout 4), then most definitely stick to a single card if you want to avoid as many headaches as possible. However, if you're a BF4 or Battlefront fan, SLI is an amazing thing. My 1080s pushed BF4 completely maxed at around 80+ FPS in 4K. Currently, my plan is to stick to a single Titan X P, but I'm getting the itch for a second card again... I ordered a mITX board for my system to downsize and actually prevent me from doing SLI again, but I still am unable to decide.

EDIT: Video of different fan speeds on the G1:


----------



## aylan1196

http://www.guru3d.com/articles-pages/nvidia-titan-x-(pascal)-overclock-guide,1.html
Happy tweaking - and notice the new Afterburner in the article. Yay, voltage control, can't wait!


----------



## KillerBee33

Quote:


> Originally Posted by *aylan1196*
> 
> http://www.guru3d.com/articles-pages/nvidia-titan-x-(pascal)-overclock-guide,1.html
> Happy tweaking and notice the new afterburner in the article yay voltage control can't wait


If it's the same as the 1080, then raising voltage will give you slightly higher clocks but a lower score.
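That matches what you'd expect under a fixed power cap: dynamic power scales roughly with V²·f, so once the limiter kicks in, extra voltage has to be paid for with lower sustained clocks. A back-of-the-envelope sketch in Python (the constant and voltages below are invented for illustration, not measured from any card):

```python
# Rough CMOS dynamic-power model: P ~ c * V^2 * f, so at a fixed power
# limit the sustainable clock is f ~ P_limit / (c * V^2).
# The constant c is made up purely so the numbers land in a plausible
# MHz range -- it is not a measured value for any GPU.
def sustained_clock_mhz(power_limit_w, voltage_v, c=0.11):
    return power_limit_w / (c * voltage_v ** 2)

stock = sustained_clock_mhz(300, 1.062)   # illustrative stock voltage
bumped = sustained_clock_mhz(300, 1.093)  # with a voltage bump
print(round(stock), round(bumped))  # -> 2418 2283: the bumped clock is lower
```

So a voltage offset only starts to help once the power limit itself is raised (or modded past), which is consistent with the shunt-mod and BIOS-tool talk elsewhere in this thread.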


----------



## DADDYDC650

Quote:


> Originally Posted by *KillerBee33*
> 
> If it's the same as the 1080, then raising voltage will give you slightly higher clocks but a lower score.


What a buzz kill. Get it?


----------



## Artah

EK backplates are available for pre-order, guys, if you were waiting.

https://www.ekwb.com/configurator/step1_complist


----------



## KillerBee33

Quote:


> Originally Posted by *DADDYDC650*
> 
> What a buzz kill. Get it?


Look at those guys' graphics scores in Fire Strike; I might be wrong, but it looks like they ran it @ +250...
This is my last run @ +215 on stock voltage: http://www.3dmark.com/3dm/13985049
EDIT: Keep in mind everything is in the BETA stage - drivers and OC software.


----------



## HaniWithAnI

Unwinder replied to me in the Afterburner support thread and confirmed that to enable voltage offset in Afterburner for the TITAN XP, you just need to do the following:
Quote:


> You can add unofficial support for TITAN X via editing hardware profile (profiles .\VEN_10DE&DEV_....cfg) and adding the following lines to it:
> 
> [Settings]
> VDDC_Generic_Detection = 1


In my experience it doesn't do much, as the TL/PL seems to cause the card to choose a lower voltage regardless of the offset (and despite me being under target for both??? :S), so I end up at the same voltage as before under load. It will likely be most useful to those who have already done the shunt mod or put the card under water. I'll try it again once I have my hybrid applied; still haven't had the chance to fit it yet.


----------



## CallsignVega

Quote:


> Originally Posted by *gamingarena*
> 
> hey Vega, question for you: I'm looking to get the Accelero Xtreme IV - is that the one you're using, or the III?
> The VRM and VRAM - I see the Xtreme IV doesn't have any sinks for those, just the backplate, which is supposed to cool the VRM and VRAM from the back side.
> 
> Do you think I should put something over those, or would the Accelero solution be enough in this case?
> 
> Thanks


I originally got the IV, which was a mistake. I reordered the III, which will cool the power section better.


----------



## cg4200

Quote:


> Originally Posted by *HaniWithAnI*
> 
> Unwinder replied to me in the Afterburner support thread and confirmed that to enable Voltage offset in Afterburner for TITAN XP you just need to do the following:
> In my experience it doesn't do much as the TL/PL seems to be causing the card to choose a lower voltage regardless of offset (and despite me being under targets for both??? :S), so I end up at the same voltage as before under load. Will likely be most useful to those who have already performed shunt mod or watercooling. Will try it again once I have my hybrid applied, still haven't had the chance to fit it yet.


hey, when editing the MSI profile VEN_10DE&DEV_1B00&SUBSYS_119A10DE&REV_A1&BUS_1&DEV_0&FN_0.cfg - do I just add VDDC_Generic_Detection = 1 inside the .cfg, or does something go after it? thanks


----------



## HaniWithAnI

Quote:


> Originally Posted by *cg4200*
> 
> hey when editing msi profile VEN_10DE&DEV_1B00&SUBSYS_119A10DE&REV_A1&BUS_1&DEV_0&FN_0.cfg VDDC_Generic_Detection = 1 or after .cfg should I add _ thanks


Open the file that looks like "VEN_10DE&DEV_....cfg" in the Afterburner install directory -> Profiles folder. You can open it in Notepad.

Paste the following lines (as is, both of them) at the end of it, then save. That's all.

[Settings]
VDDC_Generic_Detection = 1
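If editing by hand feels error-prone, the same two lines can be appended from a shell instead. This is just a sketch: the default filename below is a placeholder, so point it at the actual VEN_10DE&DEV_....cfg in your Afterburner Profiles folder, and note it keeps a backup first.

```shell
# Append the unofficial voltage-control switch to an Afterburner hardware
# profile. The default filename here is a stand-in for the real
# VEN_10DE&DEV_....cfg -- pass your own path as the first argument.
profile="${1:-VEN_10DE.cfg}"
if [ -f "$profile" ]; then cp "$profile" "$profile.bak"; fi  # backup if present
printf '\n[Settings]\nVDDC_Generic_Detection = 1\n' >> "$profile"
```

Afterwards, restart Afterburner and check whether the core-voltage slider unlocks.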


----------



## cg4200

Quote:


> Originally Posted by *HaniWithAnI*
> 
> open the file that looks like "VEN_10DE&DEV_....cfg" in the afterburner install directory -> profiles folder - you can open it in notepad
> 
> paste the following lines (as is, both of them) to the end of it then save. That's all.
> 
> [Settings]
> VDDC_Generic_Detection = 1


Hey mate, it worked - opened her up, thanks.


----------



## Jpmboy

Quote:


> Originally Posted by *cg4200*
> 
> Hey mate it worked opened her up thank..


check the voltage... most likely it will not raise the voltage much, depending on the voltage table in the BIOS.


----------



## Woundingchaney

Quote:


> Originally Posted by *CallsignVega*
> 
> I originally got the IV which was a mistake. I reordered the III which will cool the power section better.


Did you have to mod the Accelero at all to get it to fit?


----------



## Yuhfhrh

This thing is smashing against that 120% power limit in games. Has anybody tried the power mod with CLU yet? I'm tempted to try it now, but it's probably best to just wait for the waterblock to come in.


----------



## Jokanaan

To avoid any questions regarding eligibility











I have one question for you guys, since I'm a bit new to the wondrous world of overclocking. I hope you don't mind, since it's related to the card itself. With my new Titan X, I decided to build a new PC and chose the i5-6600k.

I thought to myself that the difference between the i5 and current i7s is not that big, and that I'd go for Kaby Lake or Zen when they come out instead of spending more on an i7.

So the question is: will my i5 work well with the Titan X (or at least well enough)? I'm sure some of you are using this chip, so does it work well for you? Or should I go with the i7?

Sorry if I'm asking in the wrong place, but a single 'yes' or 'no' will be fine.


----------



## bee144

Quote:


> Originally Posted by *Jokanaan*
> 
> To avoid any questions regarding eligibility
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have one question to you guys, since I'm a bit new to the wondrous world of overclocking. I hope you don't mind, since it's related to the card itself. With my new Titan X, I decided to build a new PC and chose i5-6600k.
> 
> I thought to myself, that the difference between i5 and current i7s is not that big and I will go for Kaby Lake or Zen, when it comes out instead of spending more for i7.
> 
> So the question is will my i5 work well with Titan X (or at least well enough). I'm sure some of you are using this chip, so does it work well for you?
> Or should I go with the i7.
> 
> Sorry, if I ask in the wrong place but a single 'yes' or 'no' will be fine.


Congrats. You really need to tell us what you're trying to do, though. Are you trying to game at 1080p 60Hz? Then yes, you'll be fine, albeit a bit overkill.

Are you trying to set new world records in benchmarks? Then no, you won't be, as you'll need more cores and higher clock speeds to break world records.

Again, what are you trying to accomplish?


----------



## Seyumi

Quote:


> Originally Posted by *Jokanaan*
> 
> To avoid any questions regarding eligibility
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have one question to you guys, since I'm a bit new to the wondrous world of overclocking. I hope you don't mind, since it's related to the card itself. With my new Titan X, I decided to build a new PC and chose i5-6600k.
> 
> I thought to myself, that the difference between i5 and current i7s is not that big and I will go for Kaby Lake or Zen, when it comes out instead of spending more for i7.
> 
> So the question is will my i5 work well with Titan X (or at least well enough). I'm sure some of you are using this chip, so does it work well for you?
> Or should I go with the i7.
> 
> Sorry, if I ask in the wrong place but a single 'yes' or 'no' will be fine.


I'm too lazy to find & link the actual benchmark statistics, but they're out there on the internet somewhere.

The 6600k is actually better for gaming than the 6700k for 3 reasons:

- Hyperthreading actually decreases gaming performance by 1-5%, because you're essentially splitting 1 core into 2 threads. You get more performance out of 1 core working at 99% than 2 threads working at 99%. Even if I had a CPU with hyperthreading, I'd disable it (I think you also save 1-2C on temperatures).
- The 6600k will typically overclock higher than the 6700k. Yes, the 6700k is clocked higher out of the box, but it's more limited when overclocking.
- The 6600k is cheaper. Put that money toward Titan X Pascal SLI or something instead.

You can buy pre-overclocked CPUs from Siliconlottery.com. Notice they have 5.0GHz 6600k's for sale but can't get the 6700k's past 4.9GHz.

General rule of thumb: speed > cores. Since there are going to be people in here who cannot truly fathom their $1700 6950X or their $1000 5960X being outbeat by a $250 processor, here is a recent example using Ashes of the Singularity, one of the most CPU-intensive DX12 games out there at the moment:

http://www.hardocp.com/article/2016/06/24/dx11_vs_dx12_intel_6700k_6950x_framerate_scaling/3#.V6pRNygrLAQ

FYI: I'm actually pro-X99 platform. I came from a 4.7GHz 5960X. Nvidia killed that for me when they announced that only 2-way SLI is supported in games now, hence my Z170 platform "downgrade" (more like an upgrade IMO for games).

FYI 2: I think the only games in the world right now that actually use more than 4 cores 'properly' are Crytek games. You'll see people defending games like Crysis 3 up and down. I played that game once a year ago, beat it in several hours, and never played it again.


----------



## cookiesowns

So I just repasted my cards. The TIM looked fairly decent; it maybe dropped my temps 2-4C max. I am using an older tube of GC Extreme, so not as good as Grizzly.

Here's something I discovered, though: the card that boosts higher (1860) actually seems to clock worse than the card that only boosts to 1843 - it had artifacts at the same OC settings. Will need to re-test.

Here are some GPU-Z logs of the 2nd card after the repaste, running 3DMark DX12 Time Spy. The card is in the 2nd 16x slot on the RVE10, so there may be a bit of perf loss, if any (first slot disabled).

GTXTitanX2-208-475-120TDP.txt 72k .txt file


GTXTitanX2-195-475-120TDP.txt 74k .txt file


Didn't get to refuse the shipment from FedEx, since the guy didn't ring the bell, so I guess I'll be testing the 3rd card out, lol - fingers crossed it's in the same boost bin as the best card, or even better.

Now I just need to chase JPM down in 3Dmark =)
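For anyone who wants to summarize attached sensor logs like those without scrolling through thousands of rows, here's a rough sketch in Python. GPU-Z writes comma-separated logs, but the exact header names vary by version, so the column names below (and the tiny inline sample standing in for a real log file) are assumptions to adjust against your own file:

```python
import csv
import io

# Tiny inline sample standing in for a real GPU-Z sensor log; real logs use
# the same comma-separated layout, but header names differ by version.
sample_log = """Date , GPU Core Clock [MHz] , GPU Temperature [C] , TDP [%]
2016-08-10 12:00:01 , 1823.0 , 68.0 , 112.3
2016-08-10 12:00:02 , 1848.5 , 70.0 , 119.8
2016-08-10 12:00:03 , 1835.0 , 71.0 , 120.0
"""

def summarize(fileobj, column):
    """Average a numeric column from a GPU-Z style comma-separated log."""
    reader = csv.DictReader(fileobj, skipinitialspace=True)
    # GPU-Z pads header names with spaces, so normalize them first.
    reader.fieldnames = [name.strip() for name in reader.fieldnames]
    values = [float(row[column]) for row in reader if row[column].strip()]
    return sum(values) / len(values)

avg_clock = summarize(io.StringIO(sample_log), "GPU Core Clock [MHz]")
print(round(avg_clock, 1))  # -> 1835.5 for the sample above
```

For a real log, swap `io.StringIO(sample_log)` for `open("GTXTitanX2-208-475-120TDP.txt")` and pick whichever column (temperature, TDP %, core clock) you want averaged.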


----------



## Baasha

Okay so since my 3rd and 4th Titan XP are installed in the X79 rig, there are some weird things going on.

The performance seems to be quite crappy @ 1440P - I understand these cards are ridiculous overkill for that resolution, but @ 144Hz they should be utilized pretty well.

In my X99 rig, I'm getting ~90 FPS in GTA V @ 5K. In the X79 rig, I'm getting 70 FPS @ 1440P. The main difference is that the X99 rig has the 6950X @ 4.30GHz, whereas the X79 rig has the 3970X @ 4.50GHz. The CPU alone can't make THAT big of a difference - please tell me that's not the case.

Games like BF4 are fine - pegged at 144FPS and the SLI OC on the X79 rig is +200 / +650.

Haven't tested other games yet but will do some more benchmarks on it.


----------



## Yuhfhrh

Quote:


> Originally Posted by *Baasha*
> 
> Okay so since my 3rd and 4th Titan XP are installed in the X79 rig, there are some weird things going on.
> 
> The performance seems to be quite crappy @ 1440P - I understand these cards are ridiculous overkill for that resolution but @ 144Hz, it should be utilized pretty well.
> 
> In my X99 rig, I'm getting ~90 FPS in GTA V @ 5K. In the X79 rig, I'm getting 70 FPS @ 1440P. The main difference is that the X99 rig has the 6950X @ 4.30Ghz whereas the X79 rig has the 3970X @ 4.50Ghz. The CPU alone can't make THAT big of a difference - please tell me that that is not the case.
> 
> Games like BF4 are fine - pegged at 144FPS and the SLI OC on the X79 rig is +200 / +650.
> 
> Haven't tested other games yet but will do some more benchmarks on it.


Ivy to Haswell/Broadwell did have a pretty nice IPC jump, plus those extra cores.


----------



## Baasha

Quote:


> Originally Posted by *Yuhfhrh*
> 
> Ivy to Haswell/Broadwell did have a pretty nice IPC jump, plus those extra cores.


Hmm... but that is a MASSIVE difference in performance. 1440P --> 5K is an ~11-million-pixel jump, and the same cards perform 20 FPS *better* at 5K than @ 1440P?!?

Will be testing some RoTR in DX12...


----------



## pompss

One important question for you guys.

I bought the Titan X and I was wondering if I should buy a new CPU. At the moment I have an i7 5820k, and I'm not sure if I should upgrade to a new CPU or just wait.

What are your thoughts, and what CPU do you guys suggest?

Thanks


----------



## carlhil2

Quote:


> Originally Posted by *pompss*
> 
> One important question for you guys.
> 
> I bought the titan x and i was wondering if i should buy a new Cpu. At the moment i have i7 5820k and im not sure if i should upgrade to a new cpu or just wait .
> 
> What are your thoughts and what cpu you guys suggest?
> 
> Thanks


Just OC the snot out of it....the 5820k I mean...


----------



## CRITTY

Quote:


> Originally Posted by *Baasha*
> 
> Okay so since my 3rd and 4th Titan XP are installed in the X79 rig, there are some weird things going on.
> 
> The performance seems to be quite crappy @ 1440P - I understand these cards are ridiculous overkill for that resolution but @ 144Hz, it should be utilized pretty well.
> 
> In my X99 rig, I'm getting ~90 FPS in GTA V @ 5K. In the X79 rig, I'm getting 70 FPS @ 1440P. The main difference is that the X99 rig has the 6950X @ 4.30Ghz whereas the X79 rig has the 3970X @ 4.50Ghz. The CPU alone can't make THAT big of a difference - please tell me that that is not the case.
> 
> Games like BF4 are fine - pegged at 144FPS and the SLI OC on the X79 rig is +200 / +650.
> 
> Haven't tested other games yet but will do some more benchmarks on it.


Digital Foundry spoke about this phenomenon in their YT video series about the Titan XP. Long story short: Titan XPs kick ass at higher resolutions.

I think this is the vid:


----------



## HyperMatrix

Quote:


> Originally Posted by *Seyumi*
> 
> I'm too lazy to find & link the actual statistic benchmarks out there but it's on the internet somewhere.
> 
> 6600k is actually better for gaming than the 6700k for 3 reasons:
> 
> -Hyperthreading actually decreases gaming performance by 1-5% because you're essentially splitting 1 core into 2 threads. You get more performance out of 1 core working at 99% versus 2 threads working at 99%. Even if I had a CPU with hyperthreading, i'd disable it (think you save 1-2C on well on temperatures)
> -The 6600k will typically overclock faster than the 6700k. Yes the 6700k is clocked higher out of the box but it's more limited to overclock.
> -The 6600k costs cheaper. Put that money into a Titan X Pascal SLI or something instead.
> 
> You can buy pre-overlocked CPU's from Siliconlottery.com. Notice they have 5.0Ghz 6600k's for sale but can't get the 6700k's past 4.9Ghz.
> 
> General rule of thumb more speed > cores. Since there's going to be people in here who cannot truly fathom their $1700 6950x or their $1000 5960k is being outbeat by a $250 processor, here is just a recent example using Ashes of Singularity which is one of the most CPU intensive DX12 games out there at the moment:
> 
> http://www.hardocp.com/article/2016/06/24/dx11_vs_dx12_intel_6700k_6950x_framerate_scaling/3#.V6pRNygrLAQ
> 
> FYI: I'm actually pro x99 platform. I came from a 4.7ghz 5960x. Nvidia killed that for me when they announced only 2 way SLI is supported in games now thus my Z170 platform "downgrade" (more like an upgrade IMO for games)
> 
> FYI 2: I think the only game in the world right now that actually uses more than 4 cores 'properly' would be Crytek games. You'll see people defending games like Crysis 3 up and down. I played that game once a year ago, beat it in several hours, and never played it again.


Wrong. Challenge me to benchmarks in GTA, Tomb Raider, Metro, Watch Dogs, whatever. If you're hitting DX11 draw-call limitations, HyperThreading helps. As Vega found, when using PhysX in games, HyperThreading helps. Games like The Division? HyperThreading helps.

There is so much misinformation about cores vs. clocks and HT being spread around that it's starting to annoy me. I was one of the few people doing the no-HT thing a few years ago. I was one of the first to advocate for a high-clocked 4-core over 6/8 cores. But that is not valid information anymore, and it will be even less true as more DX12 games come out. If you don't need high FPS, then you have nothing to worry about. But if you're pushing 120+ Hz, you'll need a good 6 or 8 core.


----------



## tpwilko08

Quote:


> Originally Posted by *HyperMatrix*
> 
> Wrong. Challenge me to benchmarks in gta, tomb raider, metro, watchdogs, whatever. If you're hitting dx11 draw call limitations, HyperThreading helps. As vega found, when using PhysX in games, HyperThreading helps. Games like the division. HyperThreading helps.
> 
> There is so much misinformation regarding cores vs. Clock and HT being spread around that it's starting to annoy me. I was one of few people who did the No-HT thing as of a few years ago. I was one of the first to advocate for high clocked 4 core vs. 6/8 cores. But that is not valid information anymore. And will be even less true as more dx12 games are coming out. If you don't need high FPS, then you have nothing to worry about. But if you're pushing 120+ Hz, you'll need a good 6 or 8 core.


Will I be OK with my 4-year-old 3770k (delidded) clocked at 5GHz? I'm running at 1440p 144Hz; not sure if I need to upgrade to keep up with the beastly Titan X...


----------



## HyperMatrix

Quote:


> Originally Posted by *tpwilko08*
> 
> Will i be ok with my 4 year old 3770k (Delid) clocked at 5 ghz. i am running at 1440p 144Hz not sure if i need to upgrade to keep up with the beastly Titan X....


Depends on the game. I had a 5.2GHz 3770k until a month ago and noticed significant improvement in many games when switching to a 4.7GHz 5960x. But I'm also pushing a 165Hz monitor. I wouldn't be in a rush to upgrade it; it's a decent CPU. But if you look at DX12 games like Tomb Raider, there are scenes where it's using over 90% CPU across all 8 cores and moves from 60 FPS in areas like the Soviet Installation all the way up to 165 FPS. You should test Tomb Raider with your 3770k! I admit I never got to test it with my 3770k.


----------



## cookiesowns

Hahahahaha..

Guess what, guys... With my luck, the first card that I ordered (arrived last) actually ended up being worse than the other two. Maximum boost is the same as the "lesser" card's, but it ends up unstable at the same core clocks as the lesser card. It seems to TDP/thermal throttle just a wee bit less, though.

Looks like I'll be keeping the first two... sigh, should have just refused delivery.

On a side note, here are my cards in Time Spy (the second card is at max OC, the first card was not):

Best card 182-425


Second best card 221-575


----------



## axiumone

Quote:


> Originally Posted by *HyperMatrix*
> 
> Wrong. Challenge me to benchmarks in gta, tomb raider, metro, watchdogs, whatever. If you're hitting dx11 draw call limitations, HyperThreading helps. As vega found, when using PhysX in games, HyperThreading helps. Games like the division. HyperThreading helps.
> 
> There is so much misinformation regarding cores vs. Clock and HT being spread around that it's starting to annoy me. I was one of few people who did the No-HT thing as of a few years ago. I was one of the first to advocate for high clocked 4 core vs. 6/8 cores. But that is not valid information anymore. And will be even less true as more dx12 games are coming out. If you don't need high FPS, then you have nothing to worry about. But if you're pushing 120+ Hz, you'll need a good 6 or 8 core.


I just spent 3 days testing my 6700k with 1080 SLI vs. my X99 rig with 1080 SLI at 3440x1440. The 6700k pulled ahead in almost every bench. I'll post a thread about all of this soon.

I was pretty surprised at the results. Needless to say, after spending years on the Intel extreme platform, it'll be mainstream platforms for me from now on, especially given the imposed two-way SLI limitation.


----------



## HyperMatrix

Quote:


> Originally Posted by *axiumone*
> 
> I just spent 3 days testing my 6700k with 1080 SLI vs. my X99 rig with 1080 SLI at 3440x1440. The 6700k pulled ahead in almost every bench. I'll post a thread about all of this soon.
> 
> I was pretty surprised at the results. Needless to say, after spending years on the Intel extreme platform, it'll be mainstream platforms for me from now on, especially given the imposed two-way SLI limitation.


The small gains you might see in some poorly optimized games are nothing compared to the gains you get from games that actually take advantage of multiple cores. Vega and I tested Metro LL, for example, and this was true. And as usual, I challenge people to bench Tomb Raider in DX12.


----------



## axiumone

Quote:


> Originally Posted by *HyperMatrix*
> 
> The small gain you might see in some poorly optimized games are nothing compared to the gains you get from games that actually take advantage of multi core. Vega and I tested metro ll for example and this was true. And as usual, I challenge people to bench tomb raider in dx12.


As requested.


----------



## Jpmboy

Quote:


> Originally Posted by *Seyumi*
> 
> I'm too lazy to find & link the actual statistic benchmarks out there but it's on the internet somewhere.
> 
> 6600k is actually better for gaming than the 6700k for 3 reasons:
> 
> -Hyperthreading actually decreases gaming performance by 1-5% because you're essentially splitting 1 core into 2 threads. You get more performance out of 1 core working at 99% versus 2 threads working at 99%. Even if I had a CPU with hyperthreading, i'd disable it (think you save 1-2C on well on temperatures)
> -The 6600k will typically overclock faster than the 6700k. Yes the 6700k is clocked higher out of the box but it's more limited to overclock.
> -The 6600k costs cheaper. Put that money into a Titan X Pascal SLI or something instead.
> 
> You can buy pre-overlocked CPU's from Siliconlottery.com. Notice they have 5.0Ghz 6600k's for sale but can't get the 6700k's past 4.9Ghz.
> 
> General rule of thumb more speed > cores. *Since there's going to be people in here who cannot truly fathom their $1700 6950x or their $1000 5960k is being outbeat by a $250 processor, here is just a recent example using Ashes of Singularity which is one of the most CPU intensive DX12 games out there at the moment:
> *
> http://www.hardocp.com/article/2016/06/24/dx11_vs_dx12_intel_6700k_6950x_framerate_scaling/3#.V6pRNygrLAQ
> 
> FYI: I'm actually pro x99 platform. I came from a 4.7ghz 5960x. Nvidia killed that for me when they announced only 2 way SLI is supported in games now thus my Z170 platform "downgrade" (more like an upgrade IMO for games)
> 
> FYI 2: I think the only game in the world right now that actually uses more than 4 cores 'properly' would be Crytek games. You'll see people defending games like Crysis 3 up and down. I played that game once a year ago, beat it in several hours, and never played it again.


lol - this is _such_ a myth (i.e., a statement lacking any basis in fact). "Outbeat", c'mon. But with regard to gaming: a 6700K is fine, and a 6600K is budget-friendly and fully capable of playing most any game. CPU dependence (or clock-speed scaling) drops significantly as resolution increases. (I have all 4 processors running right now.)
Quote:


> Originally Posted by *cookiesowns*
> 
> Hahahahaha..
> 
> Guess what guys..... With my luck, the first card that I ordered ( arrived last ), actually ended up being worse than the other two. Maximum boost is the same of the "lesser" card, but ends up being unstable at the same core clocks as the lesser card. Seems to TDP/Thermal throttle just a wee bit less though.
> 
> Looks like I'll be keeping the first two... sigh, should have just refused delivery.
> 
> On a side note here's my cards in Time Spy ( second card as at max OC, first card was not ):
> 
> Best card 182-425
> 
> 
> Second best card 221-575


hey bud - it seems that everything changes once you keep the core below 40C. Testing at ambient with core temps above 45C might be misleading. IDK, thermal scaling does not seem to track linearly?


----------



## NoDoz

So after installing my new Titan, I'm having a RAM issue. Not sure if it's related, but my PC never did this before I put in my Titan. I'm playing WoW and I'm getting messages from Windows saying I'm out of memory. I checked, and I was topping out at 15.9GB out of 16GB. Anyone have any idea what's going on? I've never had an issue with memory before. Is my new card in any way causing it to do that?


----------



## tpwilko08

Quote:


> Originally Posted by *HyperMatrix*
> 
> Depends on the game. I had a 5.2GHz 3770k until a month ago. Noticed significant improvement in many games when switching to a 4.7GHz 5960x. But I'm also pushing a 165Hz monitor. I wouldn't be in a rush to upgrade it. It's a decent cpu. But if you look at dx12 games like tomb raider, there are scenes where it's using over 90% cpu usage across all 8 cores and moves from 60fpa in areas like Soviet station all the up to 165Hz. You should test tomb raider with your 3770k! I admit I never got to test it out with my 3770k.


Thanks for the quick reply; guess I will be sticking with the 3770k for now. I will try Tomb Raider in DX12 later today. I had finished the game in DX11 before the DX12 patch came out, so I haven't been back on the game to try it...

How much voltage did you have to put into your 3770k to get 5.2GHz? Mine is at 5GHz.


----------



## CRITTY

Haswell-E VS Skylake | i7 5960X VS 6700K 4.5GHz Gaming performance | GTX 1080 FE




Sharing is caring.


----------



## Jokanaan

Quote:


> Originally Posted by *bee144*
> 
> Congrats. You really need to tell us what you're trying to do though. Are you trying to game at 1080p 60hz? Then yes, you'll be fine abit overkill though.
> 
> Are you trying to create new world records in benchmarks? Then no, you aren't good as you'll need more cores and higher clock speeds to break world records.
> 
> Again, what are you trying to accomplish?


If you are interested here is a full list of my components, most of which I intend to buy:

- CPU: i5-6600k
- Mobo: Asus Maximus VIII Hero Z170
- RAM: G.Skill Ripjaws 4, 2400 Mhz/CL15, 32 GB
- Drives:
SSD for Windows 10/programs: SanDisk Extreme PRO 240GB SATA 6.0Gb/s
HDD for the rest: WD Black 2 TB, 7200 RPM SATA 6 Gb/s 64MB Cache 3.5 Inch - WD2003FZEX
- CPU Cooler: be Quiet! Dark Rock Pro 3
- Case Z11 Neo
- PSU: Corsair RMi Series RM750i
- Screen: Asus ROG Swift PG279q

I intend to use it for 144 Hz gaming instead of 4k.

The Asus PG279q 27" is my current weapon of choice (https://www.amazon.com/SWIFT-PG279Q-Screen-LED-Lit-Monitor/dp/B017EVR2VM/ref=sr_1_1?s=pc&ie=UTF8&qid=1470787248&sr=1-1&keywords=asus+pg279q). I prefer smooth gameplay to eye candy, so this is a 1440p IPS screen with G-Sync and 144Hz.









No breaking world records for me yet because I'm just learning and observing first.

In general, I just wanted to know if any of you have had problems with bottlenecks or low performance using this particular i5 and the Titan X, preferably at 1440p or higher. I haven't upgraded my PC for a while, but I have done my homework and I just wanted to make the right call. It's always better to have a second opinion.


----------



## bee144

Quote:


> Originally Posted by *Jokanaan*
> 
> If you are interested here is a full list of my components, most of which I intend to buy:
> 
> - CPU: i5-6600k
> - Mobo: Asus Maximus VIII Hero Z170
> - RAM: G.Skill Ripjaws 4, 2400 Mhz/CL15, 32 GB
> - Drives:
> SSD for Windows 10/programs: SanDisk Extreme PRO 240GB SATA 6.0Gb/s
> HDD for the rest: WD Black 2 TB, 7200 RPM SATA 6 Gb/s 64MB Cache 3.5 Inch - WD2003FZEX
> - CPU Cooler: be Quiet! Dark Rock Pro 3
> - Case Z11 Neo
> - PSU: Corsair RMi Series RM750i
> - Screen: Asus ROG Swift PG279q
> 
> I intend to use it for 144 Hz gaming instead of 4k.
> 
> Asus PG279q 27" is my current weapon of choice (https://www.amazon.com/SWIFT-PG279Q-Screen-LED-Lit-Monitor/dp/B017EVR2VM/ref=sr_1_1?s=pc&ie=UTF8&qid=1470787248&sr=1-1&keywords=asus+pg279q. I prefer a smooth gameplay to eye candy, so this is a 1440p IPS screen with G-Sync and 144Hz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No breaking world records for me yet because I'm just learning and observing first.
> 
> In general, I just wanted to know if any of you had some problems with bottleneck or low performance using this particular i5 and Titan X, preferrably in 1440p or higher. I haven't upgraded my PC for a while but I have done my homework and I just wanted to make a right call. It's always better to have a second opinion


Honestly, it looks like a great system. Of course, the responses you receive will vary depending on who you ask here. Some are arguing, just a few posts above mine, that the hyper-threading that comes with the i7 series does provide some benefit now (compared to a few years ago, when it was argued that HT didn't really help in games), while others are posting benchmarks showing their cheaper Z170 system tying their higher-end X99 system.

At the end of the day, given your budget, I think you've picked out great parts. Sure, you can buy an i7, but you'll have to increase your budget. Is the extra $100-$200 for the i7 within your budget? If so, then go for it, assuming you can still eat 3 meals a day.







Otherwise don't sweat the stats and just simply enjoy gaming on your new rig!


----------



## Menthol

I would suggest a 480-512GB SSD; a 240 will fill up fast.


----------



## cookiesowns

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - this is _such_ a myth. (eg, a statement lacking any basis in fact). "Outbeat", c'mon. But with regard to gaming.. a 6700K is fine, a 6600K is budget friendly and fully capable of playing most any game. The cpu dependence (or clock speed scaling) drops significantly as resolution increases. . (I have all 4 processors running right now).
> hey bud - it seems that every thing changes once you keep the core below 40C. Testing at ambient with core temps above 45C might be misleading. IDK, thermal scaling does not seem to track straight?


Interesting. How so? How's running a uniblock without the fan shroud? Do you recommend some strong active airflow, or do you think it's fine with passive airflow from the rads?

Do you think the two other cards might scale better than the best card sub-40C?


----------



## carlhil2

Quote:


> Originally Posted by *CRITTY*
> 
> Haswell-E VS Skylake | i7 5960X VS 6700K 4.5GHz Gaming performance | GTX 1080 FE
> 
> 
> 
> 
> Sharing is caring.


The RAM speed would help a little also, methinks. 3200 vs 2666, cache, etc...


----------



## HyperMatrix

Quote:


> Originally Posted by *axiumone*
> 
> As requested.


Just got home. Let me run a couple benches and see. Are these with or without HT?


----------



## Testier

Quote:


> Originally Posted by *carlhil2*
> 
> The ram speed would help a little also me thinks. 3200 vs 2666. cache, etc...


Quad channel vs dual channel as well.


----------



## axiumone

Quote:


> Originally Posted by *HyperMatrix*
> 
> Just got home. Let me run a couple benches and see. Are these with or without HT?


All HT on.


----------



## HyperMatrix

Quote:


> Originally Posted by *axiumone*
> 
> All HT on.


That's with just 1 GPU? Just saw your other post - SLI 1080s. So we can't really make a good comparison bench, because my score is higher, but that's likely because I'm running Titans.


----------



## cg4200

Quote:


> Originally Posted by *Jpmboy*
> 
> check the voltage... most likely it will not raise the voltage much - depending on the voltage table in bios.


Yeah, it did not raise the voltage much (1.050 V to 1.080 V), but it did take me from a 2088 MHz max to 2125 MHz before throttle city kicked in. My ambient is 73F with AC; with the full +100mV, temps shot up in Fire Strike Ultra and quickly hit 84C. With +50mV it was not so bad - it helped stabilize the core a little, and temps hit 69C max in Fire Strike.
I did replace the thermal paste. Mine wasn't too bad, but I used Thermal Grizzly and got about a 5C drop. I kept the backplate off though, since there's no real contact except one little piece of thermal tape on the little chip that hits the backplate, so I have my spot fan angled blowing on the back. Seems to be a little cooler.
I'm waiting on CLU for the shunt mod and a water block too. Has anyone done the shunt mod? Any difference?
Gaming, I average in Witcher 3 4K maxed with HairWorks x4 60-68 FPS, with HairWorks x8 51-59; after 30 minutes, 67C at 95% fan, and I think I saw 48 as a low. Running a 6700K at 4.87.


----------



## CallsignVega

Quote:


> Originally Posted by *axiumone*
> 
> I just spent 3 days testing my 6700K with 1080 SLI vs my 5960X with 1080 SLI at 3440x1440. The 6700k pulled ahead in almost every bench. I'll post a thread about all of this soon.
> 
> I was pretty surprised at the results. Needless to say, after spending years on the Intel extreme platform, it'll be mainstream platforms for me from now on, especially given the imposed two-way SLI limitation.


Quote:


> Originally Posted by *HyperMatrix*
> 
> The small gain you might see in some poorly optimized games are nothing compared to the gains you get from games that actually take advantage of multi core. Vega and I tested metro ll for example and this was true. And as usual, I challenge people to bench tomb raider in dx12.


I have both a highly overclocked 6700K and 6950X system so I have no skin either way. Apart from very specific circumstances like trying to render a million blades of grass in Crysis 3 or off-loading Physx to the CPU, 90+% of games will run faster on the higher clocked 6700K. It will slowly move in favor of the higher core count CPU's as time goes by however.


----------



## HyperMatrix

Quote:


> Originally Posted by *axiumone*
> 
> As requested.


For what it's worth...


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> I have both a highly overclocked 6700K and 6950X system so I have no skin either way. Apart from very specific circumstances like trying to render a million blades of grass in Crysis 3 or off-loading Physx to the CPU, 90+% of games will run faster on the higher clocked 6700K. It will slowly move in favor of the higher core count CPU's as time goes by however.


Did you test in other games like GTA V? Your 6950x should beat out your 6700k.


----------



## GosuPl

Hi, I ran plenty of tests of the TITAN X Pascal vs the TITAN X Maxwell (including SLI).

Here are a few of them for you.

AC Syndicate, 1440p max + FXAA (FPS)

TITAN X Maxwell @1380/7700 vs TITAN X Pascal @1974/11000:

MIN 43 vs 63 (+46%)
MAX 51 vs 78 (+52%)
AVG 46 vs 70 (+52%)

TITAN X Maxwell SLI @1380/7700 vs TITAN X Pascal @1974/11000:

MIN 71 vs 63 (-11%)
MAX 89 vs 78 (-12%)
AVG 78 vs 70 (-10%)

SLI scaling (Maxwell single vs SLI):

MIN 43 vs 71 (+65%)
MAX 51 vs 89 (+74%)
AVG 46 vs 78 (+69%)

Witcher 3, 1440p max + HW AAx8

TITAN X Maxwell @1380/7700 vs TITAN X Pascal @1974/11000:

MIN 51 vs 75 (+47%)
MAX 58 vs 89 (+53%)
AVG 54 vs 83 (+53%)

TITAN X Maxwell SLI @1380/7700 vs TITAN X Pascal @1974/11000:

MIN 86 vs 75 (-13%)
MAX 95 vs 89 (-6%)
AVG 90 vs 90 (0%)

SLI scaling (Maxwell single vs SLI):

MIN 51 vs 86 (+68%)
MAX 58 vs 95 (+63%)
AVG 54 vs 90 (+66%)

Rise of the Tomb Raider, 1440p max + SMAA

TITAN X Maxwell @1380/7700 vs TITAN X Pascal @1974/11000:

MIN 41 vs 53 (+29%)
MAX 54 vs 82 (+51%)
AVG 47 vs 69 (+46%)

TITAN X Maxwell SLI @1380/7700 vs TITAN X Pascal @1974/11000:

MIN 57 vs 53 (-7%)
MAX 96 vs 82 (-14%)
AVG 77 vs 69 (-10%)

SLI scaling, TITAN X Maxwell (single vs SLI):

MIN 41 vs 57 (+39%)
MAX 54 vs 96 (+77%)
AVG 47 vs 77 (+63%)

AC Syndicate, 4K max

TITAN X Maxwell @1380/7700 vs TITAN X Pascal @1974/11000:

MIN 26 vs 41 (+57%)
MAX 39 vs 47 (+20%)
AVG 27 vs 43 (+59%)

TITAN X Maxwell SLI @1380/7700 vs TITAN X Pascal SLI @1974/11000:

MIN 45 vs 70 (+55%)
MAX 56 vs 87 (+55%)
AVG 50 vs 79 (+57%)

SLI scaling, Maxwell vs Pascal (old bridge):

MIN +73% vs +70%
MAX +43% vs +85%
AVG +85% vs +83%

Tomorrow I will get the HB bridge and rerun the tests.

This HB bridge is too short, LOL.

So I'm waiting for a new one.
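For anyone wanting to sanity-check numbers like these, the percentages are just the relative FPS change, truncated. A quick Python sketch - the helper name is mine, and the FPS pairs are the AC Syndicate 1440p single-card numbers from the post above:

```python
# Recompute the single-card Maxwell-vs-Pascal gains from the
# AC Syndicate 1440p numbers above. int() truncates toward zero,
# which matches how the post rounds its percentages.

def gain_pct(old_fps, new_fps):
    """Percent change going from old_fps to new_fps, truncated."""
    return int((new_fps / old_fps - 1) * 100)

# (Maxwell, Pascal) MIN/MAX/AVG pairs from the post
pairs = [(43, 63), (51, 78), (46, 70)]
gains = [gain_pct(a, b) for a, b in pairs]
print(gains)  # [46, 52, 52]
```

The same helper reproduces the negative SLI-vs-single-card numbers too, e.g. `gain_pct(89, 78)` gives -12.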


----------



## axiumone

Quote:


> Originally Posted by *HyperMatrix*
> 
> Did you test in other games like GTA V? Your 6950x should beat out your 6700k.


Not likely. Again, my comparison of ROTR was running same resolution, same settings. So your result doesn't add much without knowing settings, resolution, etc.

I'll have my post up tonight or tomorrow comparing GTA V, among other things. Spoiler ahead: the 6700k beats the 5960x in GTA V. Even with a 5% IPC improvement in Broadwell-E, the Skylake system would still beat it.

DX12 does have an advantage in current titles, but it's within margin of error. I imagine that extreme based systems would trump mainstream systems in future titles. Especially titles that have been developed from the ground up to take advantage of DX12. However, right now, if you want the best gaming performance, Skylake is the way to go.


----------



## HyperMatrix

Quote:


> Originally Posted by *axiumone*
> 
> Not likely. Again, my comparison of ROTR was running same resolution, same settings. So your result doesn't add much without knowing settings, resolution, etc.


Tomb raider bench uses a preset resolution and settings.


----------



## Yuhfhrh

Quote:


> Originally Posted by *HyperMatrix*
> 
> Tomb raider bench uses a preset resolution and settings.


No it doesn't, it uses whatever settings you currently have applied.


----------



## HyperMatrix

Quote:


> Originally Posted by *Yuhfhrh*
> 
> No it doesn't, it uses whatever settings you currently have applied.


Why does it run below native monitor resolution then?


----------



## axiumone

Quote:


> Originally Posted by *HyperMatrix*
> 
> Tomb raider bench uses a preset resolution and settings.


Quote:


> Originally Posted by *HyperMatrix*
> 
> Why does it run below native monitor resolution then?


Why would it? Can you elaborate? Are you running the bench from the in game menu or is there an executable for the bench that I wasn't aware of?


----------



## HyperMatrix

Quote:


> Originally Posted by *axiumone*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Why would it? Can you elaborate? Are you running the bench from the in game menu or is there an executable for the bench that I wasn't aware of?


Running it from in-game, but since it was rendering below my monitor resolution of 1440p, I assumed it was using canned settings/1080p res. You didn't notice how terrible the bench looked compared to when you're actually in-game?


----------



## axiumone

Quote:


> Originally Posted by *HyperMatrix*
> 
> Running it from in-game, but since it was rendering below my monitor resolution of 1440p, I assumed it was using canned settings/1080p res. You didn't notice how terrible the bench looked compared to when you're actually in-game?


Something's wrong on your end. This is what I see as I'm running the bench this very moment.


----------



## HyperMatrix

Quote:


> Originally Posted by *axiumone*
> 
> Somethings wrong on your end. This is what I see as I'm running the bench this very moment.


Even when I set in-game resolution to 1080p manually, monitor is still reporting that it's at 2560x1440. But you're right about the graphics settings. Setting them to low did change the bench results. I had been running everything on full max, except AA which I had set to smaa.


----------



## axiumone

Quote:


> Originally Posted by *HyperMatrix*
> 
> Even when I set in-game resolution to 1080p manually, monitor is still reporting that it's at 2560x1440. But you're right about the graphics settings. Setting them to low did change the bench results. I had been running everything on full max, except AA which I had set to smaa.


Yeah, I see that too. I just changed the resolution a bunch of times and the display reported that I'm still running 3440. I'll see if there's another way to dig up the actual resolution that the bench is rendering in, as it's definitely worth looking into.

I just want to bring this up: I'm not debating value for performance, or saying that it's a bad buy to get an extreme CPU, or anything else. I'm usually first in line to get the latest extreme CPU. I just wanted to investigate the performance difference between the 6700k and the 5960x I had on hand. It just seems like mainstream Skylake has been the better choice from the moment it came out. I found the results pretty intriguing - enough to dissuade me from purchasing the 6950x.


----------



## cookiesowns

Quote:


> Originally Posted by *axiumone*
> 
> Yeah, I see that too. I just changed the resolution a bunch of times and the display reported that I'm still running 3440. I'll see if there's another way to dig up the actual resolution that the bench is rendering in, as it's definitely worth looking into.
> 
> I just want to bring this up. I'm not debating value for performance or that it's a bad buy to get an extreme cpu or anything else. I'm usually first in line to get the latest extreme cpu. I just wanted to investigate the performance difference between the 6700k and the 5960x I had on hand. It just seems like the mainstream skylake is a better choice as of the moment it came out. I found the results pretty intriguing. Enough to dissuade me from purchasing the 6950x.


I don't get why people are surprised that a 6700K is faster than a lower-clocked, lower-IPC, older-architecture CPU in IPC-bound games.

Try encoding H265 or H264 (streaming) while playing games on a 6700K vs a 6800K, or heck, a 6950X. If you're going multi-card + multi-NVMe/PCIe, X99 wins every time.

The thread has been derailed quite a bit lol - can we get back to overclocking results? Has anyone else posted GPU-Z logs of throttling / overall performance in 3DMark?


----------



## HyperMatrix

Quote:


> Originally Posted by *axiumone*
> 
> Yeah, I see that too. I just changed the resolution a bunch of times and the display reported that I'm still running 3440. I'll see if there's another way to dig up the actual resolution that the bench is rendering in, as it's definitely worth looking into.
> 
> I just want to bring this up. I'm not debating value for performance or that it's a bad buy to get an extreme cpu or anything else. I'm usually first in line to get the latest extreme cpu. I just wanted to investigate the performance difference between the 6700k and the 5960x I had on hand. It just seems like the mainstream skylake is a better choice as of the moment it came out. I found the results pretty intriguing. Enough to dissuade me from purchasing the 6950x.


Dude. 100%. I'm just tired of people linking other sites/articles/etc... when I say let's actually sit down and bench a few modern games on OCN to see the difference. Because while I know my 3770k was old architecture and the 6700k has much better IPC, I honestly can't believe how great it is to see a huge FPS jump in some games that I struggled with before.

I was waiting for the 6950x to upgrade, but I saw very poor overclocking results and didn't see any evidence that a game would benefit from 10 cores vs 8. Then I got a hot deal from someone trying to dump their 5960x. So it was either $1050 for a 4.7GHz 5960x with mobo and RAM, or ~$2500 for the same deal but with a ~4.3GHz 6950x.


----------



## axiumone

Quote:


> Originally Posted by *cookiesowns*
> 
> I don't get why people are surprised a 6700K is faster than a lower clocked, lower IPC, older architecture CPU in IPC bound games?
> 
> Try encoding H265 or H264 ( streaming ) and playing games on a 6700K vs a 6800K or heck 6950X. If you're going Multi card + multi NVMe / PCIe X99 wins every time.
> 
> Thread has been derailed quite a bit lols, can we get back to overclocking results? Has anyone else posted up GPUz logs of throttling / overall performance in 3DMark?


I apologize wholeheartedly for derailing the thread. I realize that this isn't a CPU discussion; I just wanted to put my two cents in. As I don't stream, it's never been a concern for me, though that's a really good point. Although, I would like to point out that I'm running two Pascal cards in SLI along with a PCIe SSD, and I've gained performance from switching to Z170 (though with a PLX chip).

I'd also like to mention that I'm very pleased that the Pascal Titan seems to overclock just as well as the 1080. I was sure the clocks would be much lower, topping out around 1900.


----------



## Jpmboy

Quote:


> Originally Posted by *cookiesowns*
> 
> Interesting. How so? How's running a uniblock without the fan shroud. You recommend some strong active airflow or do you think it's fine with passive airflow from rads.
> 
> You think the two other cards might scale better than best card sub 40C?


yeah - I just put the second ("better") card in the loop - running two uniblocks, it's "loopy". What I've seen so far is that you can't predict the Freq_max from what the card maintains while being downclocked from the TL (well, and PL etc). They all work together, and the weighting of each is hard to figure out when I can only remove the thermal effect. One thing for sure: the clock holds much more constant. I had the weak card at 9C (max of 18C under load) and it will hold 2101 in Timespy, Heaven and Valley without dropping a clock bin. Just on the rads, the cards never go above 35C.








I posted the IR gun scan earlier... the hottest things on the board are the 2 R22 MOSFET thingies up by the PCIe power connectors.




Spoiler: Warning: Spoiler!







Quote:


> Originally Posted by *CallsignVega*
> 
> I have both a highly overclocked 6700K and 6950X system so I have no skin either way. Apart from very specific circumstances like trying to render a million blades of grass in Crysis 3 or off-loading Physx to the CPU, 90+% of games will run faster on the higher clocked 6700K. It will slowly move in favor of the higher core count CPU's as time goes by however.


so... which rig do you use? the 6700K or the 6950X.


----------



## Yuhfhrh

Quote:


> Originally Posted by *Jpmboy*
> 
> yeah - I just put the second ("better") card in the loop - running two uniblocks, it's "loopy". What I've seen so far is that you can't predict the Freq_max from what the card maintains while being downclocked from the TL (well, and PL etc). They all work together, and the weighting of each is hard to figure out when I can only remove the thermal effect. One thing for sure: the clock holds much more constant. I had the weak card at 9C (max of 18C under load) and it will hold 2101 in Timespy, Heaven and Valley without dropping a clock bin. Just on the rads, the cards never go above 35C.
> 
> 
> 
> 
> 
> 
> 
> 
> I posted the IR gun scan earlier... the hottest things on the board are the 2 R22 MOSFET thingies up by the PCIe power connectors.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> so... which rig do you use? the 6700K or the 6950X.


I have both too, and use the 6700K system exclusively for gaming. It just does better most of the time. Although in reality the difference between the two is negligible.









The 6950X is my Premiere Pro baby.


----------



## renejr902

Quote:


> Originally Posted by *NoDoz*
> 
> Heres mine at 4K. I havent tweaked my OC yet, I could go higher for sure.


Thanks a lot, very similar result, and you have 2800 MHz RAM. My 1600 MHz CAS 9 RAM doesn't seem to slow the FPS at 4K, so I think I won't change my RAM. Thanks a lot, it's fun to compare.


----------



## cookiesowns

Quote:


> Originally Posted by *Jpmboy*
> 
> yeah - I just put the second ("better") card in the loop - running two uniblocks, it's "loopy". What I've seen so far is that you can't predict the Freq_max from what the card maintains while being downclocked from the TL (well, and PL etc). They all work together, and the weighting of each is hard to figure out when I can only remove the thermal effect. One thing for sure: the clock holds much more constant. I had the weak card at 9C (max of 18C under load) and it will hold 2101 in Timespy, Heaven and Valley without dropping a clock bin. Just on the rads, the cards never go above 35C.
> 
> 
> 
> 
> 
> 
> 
> 
> I posted the IR gun scan earlier... the hottest things on the board are the 2 R22 MOSFET thingies up by the PCIe power connectors.


Hrm. I don't see active cooling on the vmem or VRM of the card. Think it's "safe" to run like that? I didn't see the picture with the IR readings, only the one with the R22 chokes/MOSFETs circled.

I'm debating just skipping FC blocks and running with uniblocks like I did on the KPEs.


----------



## pez

Quote:


> Originally Posted by *Baasha*
> 
> Okay so since my 3rd and 4th Titan XP are installed in the X79 rig, there are some weird things going on.
> 
> The performance seems to be quite crappy @ 1440P - I understand these cards are ridiculous overkill for that resolution but @ 144Hz, it should be utilized pretty well.
> 
> In my X99 rig, I'm getting ~90 FPS in GTA V @ 5K. In the X79 rig, I'm getting 70 FPS @ 1440P. The main difference is that the X99 rig has the 6950X @ 4.30Ghz whereas the X79 rig has the 3970X @ 4.50Ghz. The CPU alone can't make THAT big of a difference - please tell me that that is not the case.
> 
> Games like BF4 are fine - pegged at 144FPS and the SLI OC on the X79 rig is +200 / +650.
> 
> Haven't tested other games yet but will do some more benchmarks on it.


Glad I'm not the only one who noticed what seemed like better performance at higher resolutions. In GTA V, I had to push my 1080s to use around 5GB of VRAM (via the in-game slider) to get it to use both GPUs at 100% and actually scale performance. Play with some settings in GTA V to get your VRAM usage a bit closer to max VRAM (much harder to do on the Titans, obviously) and see how it does.
Quote:


> Originally Posted by *GosuPl*
> 
> Hi, I ran plenty of tests of the TITAN X Pascal vs the TITAN X Maxwell (including SLI).
> 
> Here are a few of them for you.
> 
> AC Syndicate, 1440p max + FXAA (FPS)
> 
> TITAN X Maxwell @1380/7700 vs TITAN X Pascal @1974/11000:
> 
> MIN 43 vs 63 (+46%)
> MAX 51 vs 78 (+52%)
> AVG 46 vs 70 (+52%)
> 
> TITAN X Maxwell SLI @1380/7700 vs TITAN X Pascal @1974/11000:
> 
> MIN 71 vs 63 (-11%)
> MAX 89 vs 78 (-12%)
> AVG 78 vs 70 (-10%)
> 
> SLI scaling (Maxwell single vs SLI):
> 
> MIN 43 vs 71 (+65%)
> MAX 51 vs 89 (+74%)
> AVG 46 vs 78 (+69%)
> 
> Witcher 3, 1440p max + HW AAx8
> 
> TITAN X Maxwell @1380/7700 vs TITAN X Pascal @1974/11000:
> 
> MIN 51 vs 75 (+47%)
> MAX 58 vs 89 (+53%)
> AVG 54 vs 83 (+53%)
> 
> TITAN X Maxwell SLI @1380/7700 vs TITAN X Pascal @1974/11000:
> 
> MIN 86 vs 75 (-13%)
> MAX 95 vs 89 (-6%)
> AVG 90 vs 90 (0%)
> 
> SLI scaling (Maxwell single vs SLI):
> 
> MIN 51 vs 86 (+68%)
> MAX 58 vs 95 (+63%)
> AVG 54 vs 90 (+66%)
> 
> Rise of the Tomb Raider, 1440p max + SMAA
> 
> TITAN X Maxwell @1380/7700 vs TITAN X Pascal @1974/11000:
> 
> MIN 41 vs 53 (+29%)
> MAX 54 vs 82 (+51%)
> AVG 47 vs 69 (+46%)
> 
> TITAN X Maxwell SLI @1380/7700 vs TITAN X Pascal @1974/11000:
> 
> MIN 57 vs 53 (-7%)
> MAX 96 vs 82 (-14%)
> AVG 77 vs 69 (-10%)
> 
> SLI scaling, TITAN X Maxwell (single vs SLI):
> 
> MIN 41 vs 57 (+39%)
> MAX 54 vs 96 (+77%)
> AVG 47 vs 77 (+63%)
> 
> AC Syndicate, 4K max
> 
> TITAN X Maxwell @1380/7700 vs TITAN X Pascal @1974/11000:
> 
> MIN 26 vs 41 (+57%)
> MAX 39 vs 47 (+20%)
> AVG 27 vs 43 (+59%)
> 
> TITAN X Maxwell SLI @1380/7700 vs TITAN X Pascal SLI @1974/11000:
> 
> MIN 45 vs 70 (+55%)
> MAX 56 vs 87 (+55%)
> AVG 50 vs 79 (+57%)
> 
> SLI scaling, Maxwell vs Pascal (old bridge):
> 
> MIN +73% vs +70%
> MAX +43% vs +85%
> AVG +85% vs +83%
> 
> Tomorrow I will get the HB bridge and rerun the tests.
> 
> This HB bridge is too short, LOL.
> 
> So I'm waiting for a new one.


I know you're waiting on the proper sized bridge, but PLEASE get frametimes when you do the testing between old and new bridges. This is a very important measurement that a lot of big review sites fail to do.
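Seconding the frametime request. Once you have a frametime log (PresentMon/FRAPS-style, one milliseconds-per-frame value per line), the numbers that expose SLI microstutter take only a few lines to pull out. A rough sketch - the function name and sample data are made up for illustration:

```python
# Minimal frametime crunching for a PresentMon/FRAPS-style log:
# one "milliseconds per frame" value per entry. Average FPS can look
# fine while the 99th-percentile frametime reveals stutter.

def frametime_stats(frametimes_ms):
    """Return (average FPS, 99th-percentile frametime in ms)."""
    times = sorted(frametimes_ms)
    avg_fps = 1000.0 * len(times) / sum(times)
    # nearest-rank 99th percentile: the frame slower than ~99% of the rest
    idx = min(len(times) - 1, int(round(0.99 * len(times))) - 1)
    return avg_fps, times[idx]

# 8 smooth ~60 FPS frames plus two 40 ms microstutter spikes:
# the average still looks decent, but the p99 exposes the hitching.
sample = [16.7] * 8 + [40.0, 40.0]
fps, p99 = frametime_stats(sample)
print(round(fps, 1), p99)  # 46.8 40.0
```

That p99 number is exactly what separates "90 FPS average" SLI from SLI that actually feels smooth on the old vs HB bridge.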


----------



## xarot

Quote:


> Originally Posted by *Jpmboy*


Sweet-looking rig. That looks like a mad scientist's lab... lol. That's pretty much how my "PC room" would look if we lived in a much bigger apartment. I sometimes bench on top of the fridge...
Quote:


> Originally Posted by *Baasha*
> 
> Okay so since my 3rd and 4th Titan XP are installed in the X79 rig, there are some weird things going on.
> 
> The performance seems to be quite crappy @ 1440P - I understand these cards are ridiculous overkill for that resolution but @ 144Hz, it should be utilized pretty well.
> 
> In my X99 rig, I'm getting ~90 FPS in GTA V @ 5K. In the X79 rig, I'm getting 70 FPS @ 1440P. The main difference is that the X99 rig has the 6950X @ 4.30Ghz whereas the X79 rig has the 3970X @ 4.50Ghz. The CPU alone can't make THAT big of a difference - please tell me that that is not the case.
> 
> Games like BF4 are fine - pegged at 144FPS and the SLI OC on the X79 rig is +200 / +650.
> 
> Haven't tested other games yet but will do some more benchmarks on it.


Are you running at PCIe 2.0 by any chance? The 3970X needs a patch to enable PCIe 3.0. I don't know if it still works with Pascal (I have a 3970X too over here, but I don't have a Pascal yet... and the chip isn't even installed currently).

http://nvidia.custhelp.com/app/answers/detail/a_id/3135/~/geforce-gen3-support-on-x79-platform


----------



## toncij

Quote:


> Originally Posted by *NoDoz*
> 
> So after installing my new titan, Im having a ram issue. Not sure if its related but my pc never has done this before I put in my titan. Im playing WoW and Im getting messages from windows saying Im out of memory. I checked it and I was topping out at 15.9gb out of 16gb. Anyone have any idea whats going on? Ive never had a issue with memory before. Is my new card in any way causing it to do that?


You should have enough system RAM to cover your VRAM - Windows backs GPU memory with system memory.
Quote:


> Originally Posted by *CRITTY*
> 
> Haswell-E VS Skylake | i7 5960X VS 6700K 4.5GHz Gaming performance | GTX 1080 FE
> 
> 
> 
> 
> Sharing is caring.


What software is used for this kind of measurement (or even better, that one where they also have graphs with frametimes at the bottom)?

Quote:


> Originally Posted by *CallsignVega*
> 
> I have both a highly overclocked 6700K and 6950X system so I have no skin either way. Apart from very specific circumstances like trying to render a million blades of grass in Crysis 3 or off-loading Physx to the CPU, 90+% of games will run faster on the higher clocked 6700K. It will slowly move in favor of the higher core count CPU's as time goes by however.


Real cores will come to fruition but not yet I guess.

Quote:


> Originally Posted by *Jpmboy*
> 
> yeah - I just put the second ("better") card in the loop - running two uniblocks, it's "loopy". What I've seen so far is that you can't predict the Freq_max from what the card maintains while being downclocked from the TL (well, and PL etc). They all work together, and the weighting of each is hard to figure out when I can only remove the thermal effect. One thing for sure: the clock holds much more constant. I had the weak card at 9C (max of 18C under load) and it will hold 2101 in Timespy, Heaven and Valley without dropping a clock bin. Just on the rads, the cards never go above 35C.
> 
> 
> 
> 
> 
> 
> 
> 
> I posted the IR gun scan earlier... the hottest things on the board are the 2 R22 MOSFET thingies up by the PCIe power connectors.
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> so... which rig do you use? the 6700K or the 6950X.


You're not cooling the VRAM, VRM, etc.? How does that work for you temp-wise?









----

Darn, those results make me want to switch from 1080 SLI to TXP...


----------



## Stateless

So, need some help. I removed my 2 Maxwell Titans and re-routed my water loop. Plugged in the new Titan P, and now my computer is telling me that I need to select a proper boot drive or insert a boot disk. No other changes happened to the computer other than removing the 2 Titans and inserting the single new Titan. I have checked all my cables and everything is secured and plugged in. Any suggestions as to what I could look into?


----------



## HyperMatrix

Quote:


> Originally Posted by *Stateless*
> 
> So need some help. I removed my 2 Maxwell Titan's, re-routed my water loop. Plugged in the new Titan P and now my computer is telling me that I need to select a proper boot drive or insert a boot disk. No other changes happened to the computer other than removing the 2 Titans and inserting the single new Titan. I have checked all my cables and everything is secured and plugged in. Any suggestions as to what I could look into?


Disable UEFI boot mode by enabling CSM, then reboot into the BIOS. Now you should be able to manually select your hard drive. Boot into Windows and let the drivers install for the video card. Then reboot into the BIOS and disable CSM to re-enable UEFI mode; at this point it should work fine.


----------



## GosuPl

@pez

"I know you're waiting on the proper sized bridge, but PLEASE get frametimes when you do the testing between old and new bridges. This is a very important measurement that a lot of big review sites fail to do."

No problem, I will.

When the tests are finished, you can see them on my YT channel:

https://www.youtube.com/channel/UCQSy7A7a75eE0H7bDfRpEew

A few more days to go.

I still have GTX 1080 SLI vs TITAN X Maxwell SLI tests waiting to be finished and uploaded.


----------



## toncij

Quote:


> Originally Posted by *GosuPl*
> 
> @pez
> 
> "I know you're waiting on the proper sized bridge, but PLEASE get frametimes when you do the testing between old and new bridges. This is a very important measurement that a lot of big review sites fail to do."
> 
> No problem, I will.
> 
> When the tests are finished, you can see them on my YT channel:
> 
> https://www.youtube.com/channel/UCQSy7A7a75eE0H7bDfRpEew
> 
> A few more days to go.
> 
> I still have GTX 1080 SLI vs TITAN X Maxwell SLI tests waiting to be finished and uploaded.


At 1.5 GHz (TXM) vs 2.1 GHz (1080), I expect ~13-14% more perf for the 1080.


----------



## pez

Quote:


> Originally Posted by *GosuPl*
> 
> @pez
> 
> "I know you're waiting on the proper sized bridge, but PLEASE get frametimes when you do the testing between old and new bridges. This is a very important measurement that a lot of big review sites fail to do."
> 
> No problem, I will.
> 
> When the tests are finished, you can see them on my YT channel:
> 
> https://www.youtube.com/channel/UCQSy7A7a75eE0H7bDfRpEew
> 
> A few more days to go.
> 
> I still have GTX 1080 SLI vs TITAN X Maxwell SLI tests waiting to be finished and uploaded.


Thank you. I've subscribed.


----------



## Gary2015

Idle temps:
Top card runs 31C
Bottom card runs 25C


----------



## Z0eff

The more white colored cases I see the more I want my next case to be white...


----------



## toncij

Quote:


> Originally Posted by *Z0eff*
> 
> The more white colored cases I see the more I want my next case to be white...


It's harder to get white memory, white board, white ... I want that InWin 909 tho...


----------



## Z0eff

Quote:


> Originally Posted by *toncij*
> 
> It's harder to get white memory, white board, white ... I want that InWin 909 tho...


That's actually the exact case I was looking at too


----------



## toncij

Quote:


> Originally Posted by *Z0eff*
> 
> That's actually the exact case I was looking at too


Far from white tho and a serious pain if you use the board connectors often. But it looks awesome.


----------



## Z0eff

Quote:


> Originally Posted by *toncij*
> 
> Far from white tho and a serious pain if you use the board connectors often. But it looks awesome.


Strange, I thought that one had a white option too. Looking at the other InWin cases, no white options. What was the case I had in my head, then?

EDIT: Eh, going a bit OT I guess.... *whistles innocently*


----------



## Glzmo

Quote:


> Originally Posted by *Steven185*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Glzmo*
> 
> From what I understand, both will work without having to modify the card. They are both the same except for the 980 Ti or 980 branding on the shroud (as is the one for the Maxwell Titan X, if you can find that), but you'll go without the plastic shroud that comes with it due to the different shape/mounting holes. The plastic shroud is mostly cosmetic (although you could argue that it directs the airflow - some will say it obstructs it; I guess it depends on your case's airflow), but if you really want to keep a shroud, you could possibly just screw off the Titan X's original shroud's glass part and stick the tubes through that.
> You could probably also use the AIO for the 10 series; not sure if that shroud will work on the Titan X either, though.
> 
> 
> 
> Thanks, my concern is mostly with the differing die size. Since the 980 Ti's die is larger and the 980's is smaller, would it really fit the thermal plate, or is it going to have thermal losses? My understanding is that EVGA has started developing an AIO for the Titan XP in particular, but if that is going to take weeks (or even months), I'm better off buying one of their older models, for the simple fact that I may be using my card before the release of the new AIO...
Click to expand...

The cooler itself is the same size for the 980, 980 Ti and Titan X (Maxwell). I just slapped the 980 Ti cooler on my Pascal Titan X a couple of minutes ago, and the heatsink is large enough for the die (there is still some room). So yeah, any of those should work - get whichever one you can find immediately and cheapest. I went for the 980 Ti one, since only that and the 980 one were in stock, so I figured I'd get the one for the stronger card, although they are exactly the same anyhow, except that one says 980 Ti and the other 980 on the plastic shroud (and Titan X on the Maxwell Titan X one), which you won't be using anyway.
As for the 10 series version, I don't know; it looks a little different and the warranty is lower for some reason. In any case, it wasn't in stock over here in Europe anyway.


----------



## dante`afk

Quote:


> Originally Posted by *HyperMatrix*
> 
> Tomb raider bench uses a preset resolution and settings.


nope?

Quote:


> Originally Posted by *axiumone*
> 
> As requested.


By the looks of the FPS, this was without SSAA 4x, 16x AA, or VXAO?


----------



## HyperMatrix

Quote:


> Originally Posted by *dante`afk*
> 
> nope?
> by the looks of the fps this was without SSAA 4x, AA 16x, VXAO?


Not sure about his run. But there is no VXAO in DX12. I ran everything maxed except SSAA - just SMAA, at 2560x1440. SSAA 4x is ridiculous and would kill the GPUs before the CPU could even hope to become a bottleneck.

Also, as a side note: VXAO in Tomb Raider (under DX11, of course) actually increases FPS in some scenes.


----------



## DADDYDC650

Rainbow Six Siege, 2560x1440, max settings. Original Titan X (1300MHz) vs Titan XP (1700MHz)

Original Titan X: 4790K @ 4.4GHz / 2400MHz DDR3



Titan XP: 6800K @ 4.4GHz / 3200MHz DDR4


----------



## fisher6

Quote:


> Originally Posted by *DADDYDC650*
> 
> Rainbow Six Siege 2560x1440p max settings. Original Titan X (1300mhz) vs Titan XP (1700Mhz)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Original Titan X. 4790k @4.4Ghz/2400Mhz DDR3
> 
> 
> 
> Titan XP 6800K @4.4Ghz/3200Mhz DDR4


That's quite an increase at 1440p. I have the same CPU but a 980 Ti as my GPU. It's getting harder not to order one.


----------



## Jpmboy

Quote:


> Originally Posted by *Yuhfhrh*
> 
> I have both too, and use the 6700K system exclusively for gaming. It just does better most of the time. Although in reality the difference between the two is negligible.
> 
> 
> 
> 
> 
> 
> 
> 
> The 6950X is my premier pro baby.


Yeah, I think they trade blows depending on the game/test being used. A delidded 6700K @ 4.8/4.8 or higher is hard to beat in low core count apps... as it should be.








Quote:


> Originally Posted by *cookiesowns*
> 
> Hrm. I don't see active cooling on the vmem or VRM of the card. Think it's "safe" to run like so? I didn't see the picture with IR readings, only the one circled on the R22 chokes/mostfets.
> I'm debating just skipping FC blocks and run with uniblocks like i did on the KPEs.


I have a Delta fan on each card to cool the top side. I'm not sure the stock cooler does much for the VRMs etc. The great thing about the KPEs is the cold plate and stock backplate - perfect for uniblocks!
No need to stick silly little copper heatsinks on the exposed ICs, but really strong airflow is needed either way. EK blocks ship next week.









Quote:


> Originally Posted by *xarot*
> 
> Sweet looking rig. That looks like the mad scientist's lab...lol no. That's pretty much how my "PC room" might look if we lived in a much bigger apartment. I sometimes bench on top of the fridge...
> *Are you running at PCIe 2.0* by any chance? The 3970X needs a patch to enable PCIe 3.0. I don't know if it still works with Pascal (I have a 3970X too over here, but I don't have a Pascal yet...and the chip isn't currently installed
> 
> 
> 
> 
> 
> 
> 
> )
> 
> http://nvidia.custhelp.com/app/answers/detail/a_id/3135/~/geforce-gen3-support-on-x79-platform


I wasn't for those runs, but 2.0 will reduce the CPU overhead... good for low core counts.

Quote:


> Originally Posted by *toncij*
> 
> You should cover VRAM with your system RAM.
> What software is used for this kind of measurement (or even better, that one where they also have graphs with frametimes at the bottom)?
> Real cores will come to fruition but not yet I guess.
> *You're not cooling VRAM, VRM etc.? How that works for you temp wise?*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ----
> 
> Darn, those results make me switch from 1080 SLI to TXP...


The IR gun temps were measured as shown. The two R22s I pointed out will hit very high temps (70C+), but that's still below the spec T_max for that part. Regardless, performance/stability declines at higher temps.


----------



## Tideman

Damn these things run HOT.

Top card peaked at 84C after 1hr of GTA V (at 4K). No overclock. They boost to over 1800 but settle at 1785. This is with a custom fan curve that hits 100% at 85C.

Is this normal? I've also noticed that my GPU usage doesn't seem as stable as with my previous 980s, which held 98% the majority of the time (in GTA V). My Titans jump around from 70-99%. Guessing this is driver related?


----------



## DADDYDC650

Quote:


> Originally Posted by *Tideman*
> 
> Damn these things run HOT.
> 
> Top card peaked at 84C after 1hr GTAV (at 4k). No overclock. They boost to over 1800 but stabilize at 1785. This is with a custom fan curve that hits 100% at 85C.
> 
> Is this normal? Also noticed that my gpu usage doesn't seem quite as stable as my previous 980s which pretty much held 98% the majority of the time (in GTAV). My Titans jump around from 70 - 99.. Guessing this is driver related?


Normal for Titans.


----------



## Tideman

Quote:


> Originally Posted by *DADDYDC650*
> 
> Normal for Titans.


Thanks, good to know. I guess they can handle it.


----------



## AvengerUK

Very happy with my new Titan.

It is a shame about the reference cooler though!

I've ordered a new WC loop and an EK GPU block to fix that - the fan really does need to be pushed up high to maintain clock speeds under load









Hopefully once voltage tweaking is unlocked and under water I can really push it !


----------



## Jpmboy

Quote:


> Originally Posted by *AvengerUK*
> 
> Very happy with my new Titan.
> 
> It is a shame about the reference cooler though!
> 
> I've ordered a new wc loop and EK gpu block to fix that - as the cooler really does need to be pushed up high to maintain clock speeds under load
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hopefully once voltage tweaking is unlocked and under water I can really push it !


It's not so much voltage with Pascal... it's the power and thermal limit throttling.


----------



## Dr Mad

Also got my TX-P and, no surprise, the card is limited by thermal throttling.

I wish Watercool would release the new Titan waterblock soon, as it's the best looking in my opinion, but it hasn't been announced yet.

The Aquacomputer block in black is only planned for release in 30 days - too bad, since their actively cooled backplate is a real selling point.

So my money will go to EK, despite the fact that I'm a bit bored with their waterblock design.

I believe someone here in this thread said it's possible to fit the original backplate to the EK waterblock, at least for the GTX 1080.
Do you think that's possible for the TX-P as well?

Thanks


----------



## toncij

I don't understand Nvidia's decisions on the TXP. Not only did they ban custom designs, this time they even barred AIBs from branding and selling it, making it hard to get a card at all.


----------



## vmanuelgm

To everyone interested in reading this, including Nvidia people!!!

I purchased one of the new Titan X Pascals from Spain on August 2nd (the first hour after release) and I'm still waiting for it, even though nvidia.es has shown stock (it's still showing) and 1-3 working days for delivery since that day.

I contacted Digital River and they don't know what's happening. It's a problem only with the Spanish site, since the card is being received in other countries. Seems like discrimination!!!

So please, Nvidia, solve the problem. You've had my money since August 2nd, and I have neither the card nor a good explanation of what's going on!!!

Very disappointing!!!


----------



## Zurv

Quote:


> Originally Posted by *Gary2015*
> 
> Would hold off on the blocks for now until EK fixes the HB bridge problem. Otherwise go with AC.


I've seen no gain between the HB bridge and the LED bridge...
Also, just cut the tips off the HB bridge and it fits (and works fine)


----------



## Gary2015

Quote:


> Originally Posted by *Zurv*
> 
> I've seen no gain between HB bridge and LED bridge...
> also, just cut tips off the HB bridge and it fits (and works fine)


I'll wait for the Aqua/Heatkiller blocks..


----------



## EniGma1987

Quote:


> Originally Posted by *Jpmboy*
> 
> yeah - I just put the second ("better") card in the loop - running two uniblocks, it's "loopy". What I've seen so far is that you can't predict Freq_max from what the card maintains while being downclocked by the TL (well, and PL etc.). They all work together, and the weighting of each is hard to figure out when I can only remove the thermal effect. One thing for sure: the clock holds much more constant. I had the weak card at 9C (max of 18C under load) and it will hold 2101 in Time Spy, Heaven and Valley without dropping a clock bin. Just on the rads, the cards never exceed 35C.
> 
> 
> 
> 
> 
> 
> 
> 
> I posted the IR gun scan earlier... the hottest things on the board are the 2 R22 mofsets thingys up by the PCIE power connectors.


The two R22 "thingys" are chokes (inductors). That is the memory VRM up there. Not surprising that they run so hot being that we are overclocking into the 11GHz range on all those dense memory chips.

You can easily solve your power limit problem, especially since your cards are already bare like that. Just put some Coollaboratory Liquid Ultra over the 5 mΩ shunt resistors near the power connectors, covering the whole resistor so the two ends are bridged. That should roughly double the power limit on the GPUs.
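To see why bridging a shunt raises the effective power limit, here's a rough back-of-the-envelope sketch in Python. The 5 mΩ shunt value comes from the post above; the resistance of the liquid-metal bridge path is an assumption for illustration only, so the exact factor on a real card will differ:

```python
# Rough sketch of how a shunt mod skews the sensed power reading.
# The controller measures the voltage drop across a shunt and computes
# current as I = V_drop / R_nominal. Bridging the shunt with liquid
# metal adds a resistance in parallel, so the same real current produces
# a smaller drop and the card under-reports its own power draw.
# R_BRIDGE below is an assumed value, not a measured one.

def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

R_NOMINAL = 0.005   # 5 mOhm shunt the controller assumes
R_BRIDGE = 0.005    # assumed resistance of the liquid-metal path

r_effective = parallel(R_NOMINAL, R_BRIDGE)   # 2.5 mOhm

# The sensed current is I * r_effective / R_NOMINAL, so the reported
# power scales by the same factor:
report_factor = r_effective / R_NOMINAL       # 0.5

real_draw_w = 300
print(f"reported draw: {real_draw_w * report_factor:.0f} W")  # prints 150 W
```

With the bridge path equal to the shunt, the card reports half its real draw, which is where the "about double your power limit" estimate comes from. A real liquid-metal blob's resistance is unknown, so treat the 2x figure as a ballpark.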

Quote:


> Originally Posted by *Jpmboy*
> 
> the IR gun temps were measured as shown. The 2 R22s I pointed out will hit very high temps (70C+) but that is still below the spec T_max for that part. Regardless, the performance/stability declines with higher temps.


The mosfets in this card's VRMs (typically the limiting factor of a GPU VRM) have basically no de-rating at higher temps - same rating at 25C as at 100C. That's odd for a mosfet, but the reason is that the package rating is extremely low and is the limiting factor. They do become a little less power-efficient at higher temps, but on this model they don't lose performance or anything like that. The good news is that the memory VRM's current capability is way overbuilt for the memory - it can easily put 150 watts through the mosfets (though 150W would probably saturate the memory chokes). Just buckets more power than you would ever use. It's surprising Nvidia would build up the memory VRM so much, considering how badly they cut costs with a weak core VRM on all their cards, and considering GDDR5X is supposed to use even less power than normal GDDR5. I don't know - maybe the new X RAM requires more intricate power delivery to achieve its lower draw and higher bandwidth.

I am curious, though: how can you claim the parts aren't near their T_max when you don't even know what the parts you're referring to are called? It seems like you wouldn't be able to look up the spec sheet for something you can't identify.

Quote:


> Originally Posted by *Dr Mad*
> 
> I wish Watercool make the new Titan waterblock available soon as it has the better looking in my opinion but it's not announced yet.


Watercool told me yesterday that the Titan Heatkiller block is being designed. My card performs well enough as-is for the time being, so I'm just going to leave it and wait for the Heatkiller block.


----------



## NoDoz

Quote:


> Originally Posted by *NoDoz*
> 
> So after installing my new Titan, I'm having a RAM issue. Not sure if it's related, but my PC never did this before I put in my Titan. I'm playing WoW and I'm getting messages from Windows saying I'm out of memory. I checked and I was topping out at 15.9GB out of 16GB. Anyone have any idea what's going on? I've never had an issue with memory before. Is my new card in any way causing this?


Going to bump my question since it got buried in this popular thread. If anyone has any input, I'd appreciate it. Do I just need more RAM? Or does the Titan allow my PC to use more RAM?


----------



## Testier

Quote:


> Originally Posted by *NoDoz*
> 
> Going to bump my question since it got buried in this popular thread. If anyone has any input I would appreciate it. Do I just need more ram? Or does the Titan allow my pc to use more ram?


Do it again and see which program is using the excessive memory. It really shouldn't happen unless WoW has a memory leak or something.
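The check being suggested is just "rank processes by memory and see who's hogging it". A minimal sketch of that idea in Python - the process names and sizes below are invented for illustration; on a real Windows box you'd feed it a snapshot from Task Manager, `tasklist`, or a library like psutil instead:

```python
# Sort a snapshot of (process name, working-set MB) pairs to find the
# top memory consumers. The sample data is made up for illustration.

def top_memory_hogs(snapshot, n=3):
    """Return the n processes using the most memory, largest first."""
    return sorted(snapshot, key=lambda p: p[1], reverse=True)[:n]

sample = [
    ("Wow-64.exe", 1200),   # roughly what the poster reported for WoW
    ("chrome.exe", 950),
    ("explorer.exe", 180),
    ("svchost.exe", 320),
]

for name, mb in top_memory_hogs(sample):
    print(f"{name:>14}: {mb} MB")
```

If the visible processes only add up to a few GB while the system reports 15.9/16GB used, the leak may be in a driver or kernel allocation rather than a user process, which wouldn't show up in a list like this.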


----------



## NoDoz

Quote:


> Originally Posted by *Testier*
> 
> Do it again and see what program is using the excessive memory. It shouldnt really, unless WoW have a memory leak or something.........


I looked to see what was going on, and I think WoW said 1200 MB. I really didn't see anything else that high. I'll have to check again later this afternoon.


----------



## Testier

Quote:


> Originally Posted by *NoDoz*
> 
> I looked to see what was going on and I think wow said 1200 mb. I really didn't see anything else really high. I'll have to check again later this afternoon.


I think it's probably just a one-time bug or something. If it can't be replicated, I personally wouldn't worry too much about it.


----------



## NoDoz

Quote:


> Originally Posted by *Testier*
> 
> I think its just probably a one time bug or something. If it is cant be replicated, I personally wouldnt worry too much about it.


Ok I'll check it out again. Thanks for the response.


----------



## DNMock

Quote:


> Originally Posted by *NoDoz*
> 
> Going to bump my question since it got buried in this popular thread. If anyone has any input I would appreciate it. Do I just need more ram? Or does the Titan allow my pc to use more ram?


I recall something about having a ton of VRAM and a low amount of system RAM causing issues on the old Titan X. Something about how data flows from the hard drive --> system RAM --> VRAM.

I don't remember the details, but I think some software would have the VRAM shadowed in system RAM, so 12GB of system RAM would be set aside to match the VRAM.

It's over my head, but people did get improved results in some games (mostly older ones) by adding more system RAM.


----------



## NoDoz

Quote:


> Originally Posted by *DNMock*
> 
> I recall something about having a ton a Vram and low amount of system ram causing issues on the old Titan - X. Something about the way data goes from the Hard Drive --> system ram --->V Ram.
> 
> I don't remember the details but I think some software would have the VRAM queue the system ram, so 12GB of system ram would be set aside by the VRAM.
> 
> It's over my head but people did get improved results in some games (mostly older) by adding more system ram.


Interesting. I'll see if it does it again. If it does I'll bump up to 32gb.


----------



## toncij

Do you have 16GB?


----------



## NoDoz

Quote:


> Originally Posted by *toncij*
> 
> Do you have 16GB?


Yes 16gb currently.


----------



## RedM00N

Kinda regret deciding against getting two of these cards after reading this thread.

Plus I could've (possibly) whipped up a custom BIOS while we wait for the tool.

Guess I'll have to pull strings to try to convince my financial advisor to allow this









Will probably end up selling my Maxwell TX to cover costs if I can.


----------



## Lennyx

Quote:


> Originally Posted by *vmanuelgm*
> 
> Write here to all interested in reading, including Nvidia people!!!
> 
> I purchased one of the new TitanX Pascal from Spain on august 2nd (first hour after release) and still waiting for it when nvidia.es have shown stock (still showing) and 1-3 working days for delivery since that day.
> 
> I contacted Digital River and they dont know what its happening. Its a problem only with spanish web since the card is being received in other countries. Seems like discrimination!!!
> 
> So, please, Nvidia, solve the problem. You got my money since august 2nd, I dont have the card neither a good explanation of what is going on!!!
> 
> Very disappointing!!!


It has been a mess. Same issue ordering from Norway. It seems there was missing export documentation, so cards were sent back for a lot of people who ordered on release day.
I ordered the card on Monday and it got stuck in customs all day because of that missing documentation, but it cleared a couple of hours ago, so I'm expecting to get the card sometime next week.

So yeah, your card might be stuck in customs somewhere or sent back.


----------



## xarot

Quote:


> Originally Posted by *Lennyx*
> 
> It has been a mess. Same issue ordering from Norway. Seems like there was missing export documentations and stuff. So cards where sendt back for alot of people that ordered on release day.
> I ordered the card on monday and it got stuck in customs all day couse of those missing documentations but it cleareds couple of hours ago. So im expecting to get the card sometime next week.
> 
> So yeah, your card might be stuck in customs somewhere or sendt back.


Customs?? Don't they ship the cards from the EU? At least I ordered mine through the UK store to Finland.


----------



## KillerBee33

Quote:


> Originally Posted by *RedM00N*
> 
> Kinda regret deciding against getting two of these cards after reading this thread.
> 
> Plus I could've(possibly) whipped up a custom bios while we wait for the tool for it.
> 
> Guess I'll have to pull strings to try to convince my financial advisor to allow this
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Will probably end up selling my Maxwell TX to cover costs if I can.


Or, I can get you my BIOS and you can try to change the power limit on this thing, so I can love you








I'm not doing that ridiculous hard mod on a $1300 card.


----------



## toncij

Quote:


> Originally Posted by *xarot*
> 
> Customs?? Don't they send the cards from EU? At least I ordered mine through UK store to Finland.


Norway is not in the EU, so that might be the problem?


----------



## xarot

Quote:


> Originally Posted by *toncij*
> 
> Norway is not in EU so that might be a problem?


I could be wrong, but to my understanding there are no customs coming from the EU.


----------



## Lennyx

Quote:


> Originally Posted by *xarot*
> 
> Customs?? Don't they send the cards from EU? At least I ordered mine through UK store to Finland.

Quote:

> Originally Posted by *xarot*
>
> I could be wrong but to my understanding no customs there from EU.

Everything ordered from outside Norway has to go through customs, and sometimes things get stuck there for days before they're cleared. Nvidia had a store page for Norwegian customers, but they made an error somewhere with the export documents, which slowed the process down or, in some cases, got the shipment returned.


----------



## toncij

Quote:


> Originally Posted by *xarot*
> 
> I could be wrong but to my understanding no customs there from EU.


Quote:


> Originally Posted by *Lennyx*
> 
> Everything ordered outside Norway need to go thru customs. Sometimes things get stuck in customs for days before it gets cleared. Nvidia had a Store page for Norwegian customers but they have made an error somewhere with the export documents wich slowed down the process or in some cases the shippment was returned.


Some light...









Anyone here moved from 1080 SLI to TXP (single) and not satisfied with the slowdown?


----------



## RedM00N

Quote:


> Originally Posted by *KillerBee33*
> 
> Or, i can get you my BIOS and you can try and change the power limit on this thing so i can love you
> 
> 
> 
> 
> 
> 
> 
> 
> I'm not doing that ridiculous Hard Mod on a 1300$ card .


If you want, toss your BIOS file up here and I'll see what I can make of it. I won't promise to deliver anything, since I don't have a card to test the BIOS on myself (and I'd feel uncomfortable putting someone else's card at risk while I work through modding it).


----------



## Tideman

Quote:


> Originally Posted by *xarot*
> 
> Customs?? Don't they send the cards from EU? At least I ordered mine through UK store to Finland.


Yes they ship from Ireland.

I even had to jump through hoops to get my cards and I live in Ireland lol


----------



## KillerBee33

Quote:


> Originally Posted by *RedM00N*
> 
> If you want, toss your bios file up here. I'll see what I can make of it. Won't make the promise of delivering anything since I do not have a card to test the bios with myself (and I'd feel uncomfortable putting someone else's card at risk while I run thru modding it)


Will give it a shot in about 5 hours when home


----------



## Lennyx

Quote:


> Originally Posted by *Tideman*
> 
> Yes they ship from Ireland.
> 
> I even had to jump through hoops to get my cards and I live in Ireland lol


Are you sure? On my order the payment went to a company in Ireland, but the card shipped from Sweden.

First upgrade in almost 3 years. I also ordered a new monitor and a waterblock. Next week is gonna be a lot of fun


----------



## mbze430

There's a BIOS mod available for the Titan X Pascal?!? I thought it was locked down, being an Nvidia-exclusive card.


----------



## EniGma1987

Quote:


> Originally Posted by *RedM00N*
> 
> If you want, toss your bios file up here. I'll see what I can make of it. Won't make the promise of delivering anything since I do not have a card to test the bios with myself (and I'd feel uncomfortable putting someone else's card at risk while I run thru modding it)


Have you been reading up on the issues people have been having with GTX 1080 BIOS modding? I have no idea if the Titan is the same way, but I would assume it is. The problem isn't so much that people can't mod the BIOS, but that they can't get signature validation that allows a modded BIOS to flash. So far people can flash from one BIOS to another between cards and vendors, as long as both are official BIOSes, but no one has managed to produce a valid signature on a modded BIOS that the GPU will accept.
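For anyone wondering why a single-byte edit bricks the validation: the flasher checks a vendor signature over the image, and any change to the image invalidates it. A toy illustration of the integrity-check half in Python - real cards use public-key signatures over the firmware, not a bare SHA-256, and the "firmware" bytes below are invented:

```python
import hashlib

# Toy model: the "signature" here is just a digest the flasher trusts.
# Real firmware signing uses a vendor private key; without that key,
# no modded image can produce a signature the GPU will accept.

official = bytearray(b"POWER_LIMIT=250W" + b"\x00" * 16)
trusted_digest = hashlib.sha256(official).hexdigest()

modded = bytearray(official)
modded[12:16] = b"450W"   # bump the power limit in place

# The official image still verifies; the modded one no longer matches.
assert hashlib.sha256(official).hexdigest() == trusted_digest
assert hashlib.sha256(modded).hexdigest() != trusted_digest
print("modded image no longer matches the trusted digest")
```

This is why "flashing another official BIOS works, flashing a modded one doesn't": official images already carry a valid vendor signature, while any edit produces bytes nobody can re-sign.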







Hopefully you can figure out a way around this. I'm sure everyone on this site would love you if you got a modded BIOS working.


----------



## KillerBee33

Quote:


> Originally Posted by *EniGma1987*
> 
> Have you been reading up on the issues people have been having with the GTX 1080 bios modding? I have no idea if the Titan is the same way, but I would assume it is. The problem isnt so much that people cant mod the bios, but that they cant get a signature validation for the bios to allow it to flash. So far people can flash from one bios to another between cards and vendors, as long as they are official bios. But so far no one has managed to get a valid signature on a modded bios that lets it flash to the GPU
> 
> 
> 
> 
> 
> 
> 
> Hopefully you can figure out a way around this. Im sure everyone would love you on this site if you were able to get a modded bios working.


5.92 works just fine - I've flashed my 1080 quite a few times.


----------



## Z0eff

Quote:


> Originally Posted by *toncij*
> 
> I don't understand Nvidia's decisions on TXP. Not only they banned custom designs, but this time they even banned AIBs from branding and sale making it hard to get a card at all.


I'm wondering if this was done exactly to make it _easier_ to get a card, by putting the entirety of the available stock behind a single store front. As opposed to splitting it up to various AIBs and having some just floating around in a warehouse due to inefficiencies and anything else that may cause a product to not go to a willing customer immediately. I wonder if the 1080 would have less availability issues if for whatever reason it was just available from nvidia themselves for a while.

EDIT: Not that I would like to see that happening with a card like the 1080 of course. Having access to custom cooler solutions immediately is far more important..


----------



## Yuhfhrh

Quote:


> Originally Posted by *KillerBee33*
> 
> 5.92 Works just fine Flashed 1080 quite a few times.


What?


----------



## KillerBee33

Quote:


> Originally Posted by *Yuhfhrh*
> 
> What?


Sorry, 5.292








https://www.techpowerup.com/downloads/2709/nvflash-5-292-0-for-windows
And a few guys claim they have a MODDED 1080 BIOS


----------



## EniGma1987

Quote:


> Originally Posted by *KillerBee33*
> 
> 5.92 Works just fine Flashed 1080 quite a few times.


Flashed a user-modded BIOS? Or flashed an official BIOS from another card?
Quote:


> Originally Posted by *KillerBee33*
> 
> Sorry 5.292
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.techpowerup.com/downloads/2709/nvflash-5-292-0-for-windows
> And few guys clainm they have MODDED 1080 BIOS


That's news to me.







Would be awesome if true. I thought people had only managed to flash other official BIOSes with v5.292 so far


----------



## KillerBee33

Quote:


> Originally Posted by *EniGma1987*
> 
> flashed a user modded bios? Or flashed an official bios from other card?
> Thats news to me.
> 
> 
> 
> 
> 
> 
> 
> Would be awesome if true. I had thought people had only managed to flash other official bios with v5.292 so far


Search for this guy - UNDEWOOTAGE, if I remember right. He even posted a few screenshots but refuses to share. Yes, he claims someone modded that BIOS with a higher power limit and 1.21V.
EDIT: Keep checking this thread for a certificate-check bypass in nvflash








http://www.overclock.net/t/1521334/official-nvflash-with-certificate-checks-bypassed-for-gtx-950-960-970-980-980ti-titan-x


----------



## carlhil2

Anyone have any good results with the CLU mod yet? Also, as Jpmboy said, I nixed the heatsink idea and am just using my EK Uni block by itself. I have two of these on top of the card: https://www.amazon.com/gp/product/B002CYPWTG/ref=oh_aui_detailpage_o01_s01?ie=UTF8&psc=1 , with fans blowing up from the bottom of my case; along with all the other fans in the case, that gives it enough airflow to make heat a non-issue. This will suffice till I put a FC block on it. Very happy with the results...


----------



## toncij

Quote:


> Originally Posted by *Z0eff*
> 
> I'm wondering if this was done exactly to make it _easier_ to get a card, by putting the entirety of the available stock behind a single store front. As opposed to splitting it up to various AIBs and having some just floating around in a warehouse due to inefficiencies and anything else that may cause a product to not go to a willing customer immediately. I wonder if the 1080 would have less availability issues if for whatever reason it was just available from nvidia themselves for a while.
> 
> EDIT: Not that I would like to see that happening with a card like the 1080 of course. Having access to custom cooler solutions immediately is far more important..


You have a valid point here. It might be the reason, since yields for GP102 are obviously quite bad (obvious from only 3584 of the 3840 cores being enabled).

Now, this all concerns me, since it would mean a 1080 Ti won't happen. The Titan XP is what I'd expect a 1080 Ti to be; anything less doesn't make sense performance-wise.


----------



## Yuhfhrh

Quote:


> Originally Posted by *KillerBee33*
> 
> Search for this guy UNDEWOOTAGE if i remember , he even posted few screenshots but refuses to share , yes he claims someone moded that bios with higher Power and 1.21V


Would love a link for this.


----------



## mbze430

Quote:


> Originally Posted by *toncij*
> 
> You have a valid point here. It might be the reason, since yields for GP102 are obviously very bad (obvious from 3584 cores ON and up to 3840 being shut off).
> 
> Now, this all concerns me since this would mean 1080Ti won't happen. Titan XP is what I'd make a 1080Ti to be. Anything less doesn't make sense performance-wise.


they "can" make anything happen. cut a few more cores between 1080 to Titan XP and gimp the ram. Question is, will it be profitable for them to do it?


----------



## Stateless

Quote:


> Originally Posted by *HyperMatrix*
> 
> Disable UEFI boot mode by enabling CSM. Reboot to BIOS. Now you should be able to manually select your hard drive. Boot into windows. Let drivers install for the video cards. Reboot into BIOS. Disable CSM to re-enable UEFI mode again and at this point it should work fine.


CSM was already enabled.

How can installing a new GPU cause issues like this? I was using the computer a few hours before with zero issues. I have only 3 SSDs, one of which obviously has Windows on it. I just don't get how removing two Titan X Maxwells and inserting the new Titan XP leaves me unable to get into Windows. I'm at a complete loss. All 3 SSDs are seen by the BIOS. I'm not sure what else to do. Could the Titan XP be the cause of this - can inserting a new GPU kill the SSD or the boot sector so Windows won't load at all? Thanks in advance for any help.

I have inserted new GPUs into rigs for years without ever having an issue. I really thought last night, after re-routing my loop and leak testing, that I could just turn it on, download the Titan XP drivers, and be bragging or crying about my overclocks.







Right now, I just want to get into Windows.


----------



## Z0eff

Quote:


> Originally Posted by *toncij*
> 
> You have a valid point here. It might be the reason, since yields for GP102 are obviously very bad (obvious from 3584 cores ON and up to 3840 being shut off).
> 
> Now, this all concerns me since this would mean 1080Ti won't happen. Titan XP is what I'd make a 1080Ti to be. Anything less doesn't make sense performance-wise.


I'm secretly hoping this generation will be a repeat of the 700 series. The original Titan was also partially disabled, like the TXP; then the 780 Ti was the same big Kepler chip, fully enabled (but with less VRAM).


----------



## carlhil2

Quote:


> Originally Posted by *Z0eff*
> 
> I'm secretly hoping that this generation will be a repeat of the 700 series. The original Titan also was partially disabled like the TXP, then the 780 Ti was the same full fat kepler chip but fully enabled. (But with less VRAM)


Why though? They could get $4K for that chip by selling it as a Pro card....


----------



## KillerBee33

Quote:


> Originally Posted by *Yuhfhrh*
> 
> Would love a link for this.


Searching through, I found where he posted that he's testing it for someone, and where everyone ganged up on him for not sharing it, but I can't find those screenshots.


----------



## Z0eff

Quote:


> Originally Posted by *carlhil2*
> 
> Why though, they could get 4G for that chip by selling it as a Pro card....


Same thing happened with the big Kepler chip, IIRC; the 780 Ti then had the same chip but with the compute stuff disabled, I think.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Searching thru , found where he posted that he is testing it for someone and when everyone Ganged up on a guy for not shering it , but cant find those screenshots


never saw any data or results he claimed. I put it in the BS bin.


----------



## carlhil2

Quote:


> Originally Posted by *Z0eff*
> 
> Same thing happened with the big kepler chip iirc, then the 780 Ti had the same thing but with the compute stuff disabled I think


I think that by the time the yields are good enough for that, you would be better off waiting on the **80 Volta, no?


----------



## Glzmo

Quote:


> Originally Posted by *carlhil2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Z0eff*
> 
> Same thing happened with the big kepler chip iirc, then the 780 Ti had the same thing but with the compute stuff disabled I think
> 
> 
> 
> I think that by the time the yields are good enough for that, you would be better off waiting on the 1180 Volta, no?
Click to expand...

I'm guessing the 11xx Series might be the Pascal Refresh with HBM2 early next year and Volta will be 12xx or whatever.


----------



## carlhil2

Quote:


> Originally Posted by *Glzmo*
> 
> I'm guessing the 11xx Series might be the Pascal Refresh with HBM2 early next year and Volta will be 12xx or whatever.


Lol, true, I edited my post...


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> never saw any data or results he claimed. I put it in the BS bin.


Is it possible his posts were deleted? [email protected] ...
Ehh, apologies for relying on an unreliable source







I could've sworn I saw those screenshots. Maybe in a different thread.


----------



## mbze430

Quote:


> Originally Posted by *carlhil2*
> 
> I think that by the time the yields are good enough for that, you would be better off waiting on the **80 Volta, no?


No, if the yields get better they will just call it the Titan X Ti


----------



## DADDYDC650

Quote:


> Originally Posted by *carlhil2*
> 
> Why though, they could get 4G for that chip by selling it as a Pro card....


Doubt we'll see the full chip this year but we should see the 1080 Ti in the next couple of months.


----------



## gamingarena

Man, so hard to resist getting a second one. I've always run SLI, but for the first time I'm on just one card. I'm running 1440p/165Hz, and honestly an overclocked TXP is enough for anything I throw at it at this resolution;
all the games where high FPS matters, like multiplayer games (BF4, SWBF, COD BO3, Overwatch, Doom, etc.), are locked at 165 FPS with a single TXP. The rest is a walk in the park.

For a week now I've been trying to find a reason to justify a second one, but I just can't find any ("zero reasons"), until 4K/120 that is, and by then we'll have a Titan Volta.
I moved from 2x Titan XM to 2x 1080 SLI to a single TXP, and by far this is the smoothest and best experience in everything.

This card is a monster, but my HB bridge is lonely


----------



## CRITTY

Quote:


> Originally Posted by *gamingarena*
> 
> Man, so hard to resist getting a second one. I've always run SLI, but for the first time I'm on just one card. I'm running 1440p/165Hz, and honestly an overclocked TXP is enough for anything I throw at it at this resolution;
> all the games where high FPS matters, like multiplayer games (BF4, SWBF, COD BO3, Overwatch, Doom, etc.), are locked at 165 FPS with a single TXP. The rest is a walk in the park.
> 
> For a week now I've been trying to find a reason to justify a second one, but I just can't find any ("zero reasons"), until 4K/120 that is, and by then we'll have a Titan Volta.
> I moved from 2x Titan XM to 2x 1080 SLI to a single TXP, and by far this is the smoothest and best experience in everything.
> 
> This card is a monster, but my HB bridge is lonely


Peer pressure!?


----------



## Sheyster

Has anyone been able to extract the BIOS from the card? I've tried using the latest NVflash (Joe Dirt Edition) and GPU-Z, but no luck so far. Please post it up if you have it!


----------



## HaniWithAnI

So I just had a thought regarding power limits.

TDP of Titan XP is 250W.

120% of 250W is 300W.

6+8pin power = 75W + 150W, +75W base from the PCIE slot = 300W.

Does this pose an issue if we attempt a shunt mod to raise the power limit or am I missing something? Where would the card draw the extra wattage from?
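The arithmetic above can be laid out as a quick sketch. All figures come from this post and are the PCI-E spec ratings, not hard electrical limits (later replies note the connectors can actually deliver more):

```python
# Back-of-the-envelope power budget for the Titan X Pascal, using the
# PCI-E spec ratings quoted in the post above.

TDP_W = 250          # stock TDP
POWER_TARGET = 1.20  # 120% power limit slider

# Spec-rated delivery per power input
PCIE_SLOT_W = 75     # x16 slot
SIX_PIN_W = 75       # 6-pin auxiliary connector
EIGHT_PIN_W = 150    # 8-pin auxiliary connector

spec_budget = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # 300 W total by spec
max_draw = TDP_W * POWER_TARGET                      # 300 W at 120%

headroom = spec_budget - max_draw  # 0 W -- the question being asked
print(f"spec budget: {spec_budget} W, draw at 120%: {max_draw:.0f} W, "
      f"headroom: {headroom:.0f} W")
```

The zero headroom is exactly the concern: a shunt mod pushes `max_draw` past `spec_budget`, so the extra wattage has to come from somewhere beyond what the spec nominally guarantees.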


----------



## bee144

Quote:


> Originally Posted by *gamingarena*
> 
> Man, so hard to resist getting a second one. I've always run SLI, but for the first time I'm on just one card. I'm running 1440p/165Hz, and honestly an overclocked TXP is enough for anything I throw at it at this resolution;
> all the games where high FPS matters, like multiplayer games (BF4, SWBF, COD BO3, Overwatch, Doom, etc.), are locked at 165 FPS with a single TXP. The rest is a walk in the park.
> 
> For a week now I've been trying to find a reason to justify a second one, but I just can't find any ("zero reasons"), until 4K/120 that is, and by then we'll have a Titan Volta.
> I moved from 2x Titan XM to 2x 1080 SLI to a single TXP, and by far this is the smoothest and best experience in everything.
> 
> This card is a monster, but my HB bridge is lonely


I have two Titan XPs in HB SLI and I play BF4 on my 1440p 144Hz G-Sync monitor. I was seeing 70-95% usage and getting 120-144 FPS. I have all settings on ultra except the last three options, which I believe are AA, post-processing, and ambient occlusion; those three are turned off. I have resolution scaling set to 200%, so I believe that's rendering at 5K?

How are you getting 165 FPS with only a single card? What are your settings and your reasoning for those settings? Just curious.


----------



## KillerBee33

Quote:


> Originally Posted by *Sheyster*
> 
> Has anyone been able to extract the BIOS from the card? I've tried using the latest NVflash (Joe Dirt Edition) and GPU-Z, but no luck so far. Please post it up if you have it!


Interesting, will have to try today. Some things still have to be updated: GPU-Z still shows a few fields as Unknown or grayed out, 3DMark only just started recognizing the Titan X Pascal, and Oculus keeps telling me my system does not meet the requirements


----------



## Z0eff

Quote:


> Originally Posted by *carlhil2*
> 
> I think that by the time the yields are good enough for that, you would be better off waiting on the **80 Volta, no?


I'm sort of guessing here (well, hoping...) that NVIDIA is releasing the Titan XP this early precisely to leave a window to release the 1080 Ti in before it gets too close to the 1100 series launch.


----------



## HyperMatrix

Quote:


> Originally Posted by *Stateless*
> 
> CSM was already enabled.
> 
> How can installing a new GPU cause issues like this? I was using the computer a few hours before this with zero issues at all. I have only 3 SSDs, with one obviously having Windows on it. I just don't get how, after removing 2 Titan X Maxwells and inserting the new Titan XP, I still cannot get into Windows. I am at a complete loss. All 3 SSDs are being seen by the BIOS. I am not sure what else to do. Can the Titan XP be the cause of this? Can inserting a new GPU kill the SSD or the boot sector, preventing Windows from loading at all? Thanks in advance for any help.
> 
> I have inserted new GPU's into rigs for years without ever having an issue. I really thought last night after re-routing my loop and leak testing that I could just turn it on, download the Titan XP drivers and be bragging or crying about my overclocks.
> 
> 
> 
> 
> 
> 
> 
> Right now, I just want to get into Windows.


And it won't load even if you do a boot override of your Windows hard drive instead of using Windows boot manager?


----------



## toncij

Quote:


> Originally Posted by *Z0eff*
> 
> I'm sort of guessing here (well, hoping...) that NVIDIA is releasing the Titan XP this early precisely to leave a window to release the 1080 Ti in before it gets too close to the 1100 series launch.


I doubt there is any space for a Ti... when? What kind of perf and price?


----------



## gamingarena

Quote:


> Originally Posted by *bee144*
> 
> I have two titan xp in HB SLI and I play BF4 on my 1440p 144Hz Gsync monitor. I was noticing 70-95% usage and achieving 120-144 FPS. I have all settings set to ultra except the last 3 options, which I believe are AA, post processing, and ambient occlusion? Those 3 are turned off. I have scaling set to 200% so I believe this is scaling at 5k?
> 
> How are you achieving 165 FPS with only a single card? What are your settings and justification for those settings? Just curious.


My settings are maxed out at 1440p including 4xAA, usage is 90-95%, and it's pegged at 165 FPS. This is on a 5960X at 4.3GHz.

Scaling at 200% isn't 1440p anymore. Just use 4xAA and 100% scaling and you'll be pegged at 165 on a single card; I can't really tell any difference between 4xAA and 200% scaling except the massive FPS hit.

Well, if people are using these overkill settings then I guess 2x TXP can be justified, but that's basically running at 4-5K and expecting 165 FPS, which is a pipe dream even with 2x TXP.

I'm talking about pure 1440p res with no DSR or supersampling.


----------



## Z0eff

Quote:


> Originally Posted by *toncij*
> 
> I doubt there is any space for a Ti... when? What kind of perf and price?


Bah, I can dream!


----------



## RedM00N

Quote:


> Originally Posted by *Sheyster*
> 
> Has anyone been able to extract the BIOS from the card? I've tried using the latest NVflash (Joe Dirt Edition) and GPU-Z, but no luck so far. Please post it up if you have it!


If you're going down the route of BIOS modding, let me know your findings. Until I get this card, anything I do to the file (if we get one) is shooting in the dark.


----------



## bee144

Quote:


> Originally Posted by *gamingarena*
> 
> My settings are maxed out at 1440p including 4xAA, usage is 90-95%, and it's pegged at 165 FPS. This is on a 5960X at 4.3GHz.
> 
> Scaling at 200% isn't 1440p anymore. Just use 4xAA and 100% scaling and you'll be pegged at 165 on a single card; I can't really tell any difference between 4xAA and 200% scaling except the massive FPS hit.
> 
> Well, if people are using these overkill settings then I guess 2x TXP can be justified, but that's basically running at 4-5K and expecting 165 FPS, which is a pipe dream even with 2x TXP.
> 
> I'm talking about pure 1440p res with no DSR or supersampling.


But trees look so fake in BF4, even with 4xAA


----------



## EniGma1987

Quote:


> Originally Posted by *HaniWithAnI*
> 
> So I just had a thought regarding power limits.
> 
> TDP of Titan XP is 250W.
> 
> 120% of 250W is 300W.
> 
> 6+8pin power = 75W + 150W, +75W base from the PCIE slot = 300W.
> 
> Does this pose an issue if we attempt a shunt mod to raise the power limit or am I missing something? Where would the card draw the extra wattage from?


See here:
http://www.overclock.net/t/1604477/reddit-rx-480-fails-pci-e-specification/1030#post_25326330


----------



## HaniWithAnI

Quote:


> Originally Posted by *EniGma1987*
> 
> See here:
> http://www.overclock.net/t/1604477/reddit-rx-480-fails-pci-e-specification/1030#post_25326330


Interesting. But the "fix" mentioned was a deliberate change made by AMD, correct? Is it possible the Titan will overdraw from the PCIe slot in a similar fashion rather than (apparently safely) via the 6/8-pin?

Not going to stop me doing it, but I'm curious about the ramifications. More than happy to lose a motherboard, less than happy to lose a $1200 GPU lol


----------



## EniGma1987

Quote:


> Originally Posted by *HaniWithAnI*
> 
> Interesting. But the "fix" mentioned was a deliberate change made by AMD, correct? Is it possible the Titan will overdraw from the PCIe slot in a similar fashion rather than (apparently safely) via the 6/8-pin?
> 
> Not going to stop me doing it, but I'm curious about the ramifications. More than happy to lose a motherboard, less than happy to lose a $1200 GPU lol


I didn't mean to talk about AMD's overdraw fix or anything, just that in that post I go over the safe limits used by manufacturers for how much power can be put through the 6- and 8-pin PCI-E connectors, which are well above the 75W and 150W the spec says. So we can easily go over 300W on the GPU without stressing the PCI-E power connectors.

As for draw from the PCI-E slot, I have not yet measured how the Titan X draws power through the slot, so I cannot say how much it draws from the slot or where the slot power goes. *Normally*, slot power is used for the video memory and display outputs. I do not know for sure that is the case on this Titan X, though. I'd be willing to say the display outputs are definitely fed through PCI-E, but the memory I am not sure about; it looks like it may be wired to the PCI-E connectors. I won't have a chance to measure the traces for at least a few weeks, as I'm too busy to go that in-depth with the card.

Either way, overclocking always pushes beyond what the specs call "safe limits"; that's why they are the spec. We do it anyway, and as long as you stick to common sense plus the circuit power ratings, voltage limits, and temperature limits, things always turn out fine in overclocking. So as long as we stick to the same rules here, overclocking the Titan X will be fine too. Knowing what parts are used in this GPU's hardware, I would say the safe limits are 1.1V or under on GPU VCore and 350W or less power draw. Technically the core VRM is rated for *about* 560 watts, but we need to take transient spikes into account, so you usually want to leave a couple hundred watts of headroom for the VRM section. And remember that the PCI-E connectors on this GPU can only safely put out about 470 watts anyway.
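As a rough sanity check, the rule-of-thumb limits above can be expressed as a small sketch. Every threshold here (1.1V core, 350W draw, ~560W VRM rating, ~470W connector capability, ~200W transient headroom) is this post's estimate, not an official spec:

```python
# Rule-of-thumb overclocking limits from the post above. These numbers
# are the poster's estimates for the Titan X (Pascal), not NVIDIA specs.

VRM_RATED_W = 560           # approximate core VRM rating
CONNECTOR_SAFE_W = 470      # approx. safe output of the PCI-E connectors
TRANSIENT_HEADROOM_W = 200  # headroom left for transient spikes

def is_safe(vcore: float, draw_w: float) -> bool:
    """Apply the post's suggested limits to a proposed operating point."""
    if vcore > 1.1:         # keep GPU VCore at or under 1.1 V
        return False
    if draw_w > 350:        # keep sustained draw at or under 350 W
        return False
    # Sanity check: stay under the VRM rating minus transient headroom,
    # and under what the connectors can safely deliver.
    return draw_w <= min(VRM_RATED_W - TRANSIENT_HEADROOM_W, CONNECTOR_SAFE_W)

print(is_safe(1.093, 340))  # within the suggested limits -> True
print(is_safe(1.15, 340))   # over 1.1 V -> False
```

The `min(...)` term is why 350W comes out as the practical ceiling: 560W rated minus ~200W of transient headroom lands at roughly the same place.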


----------



## nyk20z3

When is NVIDIA expecting more stock? I am considering finally taking the plunge on a ridiculous card at a ridiculous price.


----------



## HaniWithAnI

Quote:


> Originally Posted by *EniGma1987*
> 
> I didn't mean to talk about AMD's overdraw fix or anything, just that in that post I go over the safe limits used by manufacturers for how much power can be put through the 6- and 8-pin PCI-E connectors, which are well above the 75W and 150W the spec says. So we can easily go over 300W on the GPU without stressing the PCI-E power connectors.
> 
> As for draw from the PCI-E slot, I have not yet measured how the Titan X draws power through the slot, so I cannot say how much it draws from the slot or where the slot power goes. *Normally*, slot power is used for the video memory and display outputs. I do not know for sure that is the case on this Titan X, though. I'd be willing to say the display outputs are definitely fed through PCI-E, but the memory I am not sure about; it looks like it may be wired to the PCI-E connectors. I won't have a chance to measure the traces for at least a few weeks, as I'm too busy to go that in-depth with the card.


Understood. I'm planning to apply liquid metal when it shows up on the 19th, so it got me overthinking, I guess. I actually knew the cables could handle more but wasn't sure how the card would decide where to draw from.

Thanks, and speaking for pretty much everyone in this thread, looking forward to reading what you find when you do get the time!


----------



## DADDYDC650

Clash of the Titans

http://www.babeltechreviews.com/battle-titans-pascal-titan-x-vs-maxwell-titan-x/


----------



## bee144

Just received my two Titan Xp yesterday. Starting to overclock.

Is my Corsair AX1200i enough for the rig in my signature?


----------



## axiumone

Quote:


> Originally Posted by *bee144*
> 
> Just received my two Titan Xp yesterday. Starting to overclock.
> 
> Is my Corsair AX1200i enough for the rig in my signature?


I'd say more than enough. You'll probably top out around 800 watts.


----------



## EniGma1987

Did the CLU mod since my CLU just came in today. Pictures and tutorial to follow. Ran a quick test to prove it works.

Before CLU:
http://www.3dmark.com/fs/9627962

after CLU:
http://www.3dmark.com/fs/9724823



As you can see, the detected core frequency went up a bit, since the GPU tends to stay clocked higher, and the graphics score went up 220 points. These are the same settings in Afterburner; the only difference is that one run is before the CLU went on and one is right after the CLU went on the shunts. Pictures and a tutorial for the mod will come either later tonight or tomorrow.


----------



## carlhil2

Quote:


> Originally Posted by *EniGma1987*
> 
> Did the CLU mod since my CLU just came in today. Pictures and tutorial to follow. Ran a quick test to prove it works.
> 
> Before CLU:
> http://www.3dmark.com/fs/9627962
> 
> after CLU:
> http://www.3dmark.com/fs/9724823
> 
> 
> 
> As you can see, the detected core frequency went up a bit, since the GPU tends to stay clocked higher, and the graphics score went up 220 points. These are the same settings in Afterburner, no change whatsoever; the only difference is that one run is before the CLU went on and one is right after the CLU went on the shunts. Pictures and a tutorial for the mod will come either later tonight or tomorrow.


Thanks for sharing your results..


----------



## KillerBee33

Quote:


> Originally Posted by *Sheyster*
> 
> Has anyone been able to extract the BIOS from the card? I've tried using the latest NVflash (Joe Dirt Edition) and GPU-Z, but no luck so far. Please post it up if you have it!


Yeah, same thing here.


----------



## toncij

How does warranty work with NVIDIA? This is the first time I'll be buying from them directly and I'd like to know how it works. From experience so far, NVIDIA doesn't reply on their Twitter customer support account, and that is a bit worrying... I don't expect them to be as fast and good as Amazon, but still...


----------



## pompss

Quote:


> Originally Posted by *EniGma1987*
> 
> Did the CLU mod since my CLU just came in today. Pictures and tutorial to follow. Ran a quick test to prove it works.
> 
> Before CLU:
> http://www.3dmark.com/fs/9627962
> 
> after CLU:
> http://www.3dmark.com/fs/9724823
> 
> 
> 
> As you can see, the detected core frequency went up a bit, since the GPU tends to stay clocked higher, and the graphics score went up 220 points. These are the same settings in Afterburner, no change whatsoever; the only difference is that one run is before the CLU went on and one is right after the CLU went on the shunts. Pictures and a tutorial for the mod will come either later tonight or tomorrow.


Thanks for sharing.
Waiting for your tutorial


----------



## Jpmboy

Quote:


> Originally Posted by *HaniWithAnI*
> 
> So I just had a thought regarding power limits.
> 
> TDP of Titan XP is 250W.
> 
> 120% of 250W is 300W.
> 
> 6+8pin power = 75W + 150W, +75W base from the PCIE slot = 300W.
> 
> Does this pose an issue if we attempt a shunt mod to raise the power limit or am I missing something? Where would the card draw the extra wattage from?


Both PCIe rails can, will, and do deliver well beyond the rated spec. Look at your PSU: if it's a multi-rail unit, most PCIe rails are 20A or more, which is >200W from that rail alone. Don't worry, you're not going to melt your rig with the TXP; it's not even close to the power draw of a 780 Ti KPE, or even a Titan XM with mods.
Quote:


> Originally Posted by *bee144*
> 
> Just received my two Titan Xp yesterday. Starting to overclock.
> 
> Is my Corsair AX1200i enough for the rig in my signature?


Plenty!


----------



## cookiesowns

Getting ready to put these under water. For first-time water coolers, this card is not for the faint of heart.

I had to be super, super careful with these. The KPEs I could manhandle and they'd be fine (not saying I did).


----------



## fernlander

Quote:


> Originally Posted by *Stateless*
> 
> CSM was already enabled.
> 
> How can installing a new GPU cause issues like this? I was using the computer a few hours before this with zero issues at all. I have only 3 SSDs, with one obviously having Windows on it. I just don't get how, after removing 2 Titan X Maxwells and inserting the new Titan XP, I still cannot get into Windows. I am at a complete loss. All 3 SSDs are being seen by the BIOS. I am not sure what else to do. Can the Titan XP be the cause of this? Can inserting a new GPU kill the SSD or the boot sector, preventing Windows from loading at all? Thanks in advance for any help.
> 
> I have inserted new GPU's into rigs for years without ever having an issue. I really thought last night after re-routing my loop and leak testing that I could just turn it on, download the Titan XP drivers and be bragging or crying about my overclocks.
> 
> 
> 
> 
> 
> 
> 
> Right now, I just want to get into Windows.


FWIW, I had to reset my CMOS to get it to boot. I can't explain why, but that's what I had to do. I simply took out my Maxwell Titan and put in the TXP.


----------



## fernlander

Quote:


> Originally Posted by *HaniWithAnI*
> 
> Interesting. But the "fix" mentioned was a deliberate change made by AMD, correct? Is it possible the Titan will overdraw from the PCIe slot in a similar fashion rather than (apparently safely) via the 6/8-pin?
> 
> Not going to stop me doing it, but I'm curious about the ramifications. More than happy to lose a motherboard, less than happy to lose a $1200 GPU lol


I was able to draw close to 400W on the TXM with a modded BIOS; those numbers are just indicative. More power can be drawn from each pin.


----------



## fernlander

Quote:


> Originally Posted by *Jpmboy*
> 
> Both PCIe rails can, will, and do deliver well beyond the rated spec. Look at your PSU: if it's a multi-rail unit, most PCIe rails are 20A or more, which is >200W from that rail alone. Don't worry, you're not going to melt your rig with the TXP; it's not even close to the power draw of a 780 Ti KPE, or even a Titan XM with mods.
> Plenty!


I must say, though, at some point my BIOS-flashed 780 Ti did something weird and burnt itself out along with my motherboard. I didn't really troubleshoot the damage so much as just get a 980 and a new motherboard. I didn't want to risk the 980 by testing the old board, or the new board by trying the 780 Ti on it. But yeah, the 780 Ti was drawing ridiculous amounts of power; I clearly had no idea how much.


----------



## Menthol

Quote:


> Originally Posted by *Jpmboy*
> 
> Both PCIe rails can, will, and do deliver well beyond the rated spec. Look at your PSU: if it's a multi-rail unit, most PCIe rails are 20A or more, which is >200W from that rail alone. Don't worry, you're not going to melt your rig with the TXP; it's not even close to the power draw of a 780 Ti KPE, or even a Titan XM with mods.
> Plenty!


It's kind of nice benching two big cards without having to add a second power supply; with two over-volted 780 Ti KPEs on an overclocked X79 or X99, one wasn't enough to bench.


----------



## Stateless

Quote:


> Originally Posted by *fernlander*
> 
> FWIW I had to reset my CMOS to get it to boot. I can't explain why but that's what I had to do. I simply took out my Maxwell Titan and put in the TXP.


Thanks for offering help, and to Hyper for the suggestion. By the time I read about the CMOS reset that helped you, I was so frustrated that I had already done a fresh install of Win10. I still had some odd issues after the fresh install; it appears one of the SSDs, the port it was connected to, or even the cable was bad. After I did the install I wiped that drive and removed the MBR from it, but when I would try to add that SSD back to my system, it would just hang, stuck on the Windows loading screen forever. I was able to get my 1TB SSD to work without issues. Anyways, the system is up and running; still doing some tweaks to settings and such.

But the good/great news is I was finally able to mess around with my new Titan. My card seems to be running well. I was able to hit 2012MHz without going higher than 65°C in Fire Strike Ultra. I have a fan profile that matches speed with temp. I was able to hit rank #40 in the world on the Fire Strike Ultra single-card Hall of Fame. I was originally at 50, but I ended up adding +200 to the memory, which stabilized the clocks. I noticed that when I had the core at +200 with the memory untouched, it would fluctuate and bounce between 1785 and 2012. Again, the temp never breached 65°C. However, when I added +200 to the memory, the clocks stayed in the 1945-2000 range pretty consistently and never dropped below the mid-1900s. It seems like +200 on the memory helped sustain the clocks at a more consistent speed. I don't think this happened with my Titan Maxwell.

I did try +250 on the core, but that was a no-go; it would crash within 30 seconds of the run. I am going to do a few more runs, going up +10 at a time, to see where I top out.

Last question, because I am still installing stuff: whenever MSI Afterburner runs, I get an error that it can't reach the RivaTuner servers or something like that. This is with the latest beta version. I am still able to use AB without issue, but whenever I load it up, about 15 seconds later I get this pop-up error about it not being able to reach the RT servers. Any idea?

I also did a run of the new Time Spy and was ranked 29th. http://www.3dmark.com/hall-of-fame-2/timespy+3dmark+score+performance+preset/version+1.0/1+gpu


----------



## EniGma1987

*Shunt mod tutorial:*

*Warning:* Coollaboratory Liquid Ultra (and Pro) are conductive thermal interface materials. We use this conductivity to lower the resistance on the card for this mod, but it also means the material is dangerous and can destroy your graphics card if it spills and touches another component on the GPU. I take no responsibility if you break your own (or someone else's) GPU with this mod. If you choose to do the mod, be careful.
NOTE: You can use other liquid metal TIMs as well; Thermal Grizzly Conductonaut works fine too.

EDIT: Criminal gave me a link this morning to a video where der8auer does this mod as well. I posted it at the end of this tutorial if you want to see someone apply the liquid metal on video rather than just in pictures.

1) Start by removing the backplate and cooler from your GPU. You will need a #0 Phillips screwdriver and a 4mm socket wrench. Once all the screws and bolts are out, the cooler should come off relatively easily, so if it is not coming off you probably forgot a screw somewhere. The likely spots to forget are on the end of the card with the display outputs.

2) Once the cooler is removed your card should look like this:


The three resistors on the card are circled in red. We are only going to mod the top two resistors.

3) Place a small bit of Liquid Ultra on the top two shunt resistors. These resistors have a case label of "5MO". You don't need a ton of CLU, just a small dab that you will spread out. See the picture here for how much I applied:



4) Now use the brush included in the CLU package to spread the CLU over the whole top of the resistors. Be careful not to spill it off the resistor; if the CLU spills off the top of the resistor, it will most likely destroy your graphics card. Also be careful not to use too much CLU, as too much will cause a spill when you put the graphics card back together. There is an inductor very close to resistor #2 (RS2) that is easy to spill onto. Be very careful with this step.



Your graphics card should look like the above picture now. Put the cooler back onto the card and you are done!

To remove this mod from your graphics card, simply wipe away the CLU. If the Liquid Ultra has hardened, you may be able to use a hair dryer very briefly to liquefy the material again so you can remove it.

I showed before and after 3DMark screenshots earlier to show the mod works; the graphics score rose by 223 points in FS Ultra with the mod. I have now been playing a game for the 3 hours since doing the mod and posting this, with no problems.

By covering both resistors on the Titan X you should increase your power limit by about 30-35%. This gives you additional headroom for overclocking so the card doesn't throttle down due to the power limit. However, the Titan X (Pascal) has a very weak VRM setup for this class of GPU; it is not rated very high, and adding 30% more power from this mod, combined with setting your PT slider to 120%, will let the GPU draw what I consider unsafe levels of power to the core. This mod should let you draw around 390 watts, but I don't consider anything over 350W safe on the Titan X. Just because you *allow* the card to draw that much power doesn't mean it will, of course. However, to be safe you may want to limit your power target slider to 110%; that, combined with this mod, should let you draw around 350 watts max before the limit kicks in, which will keep your card's VRM from frying.
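For the curious, the reason covering the shunts raises the limit can be sketched numerically. This assumes the card senses current from the voltage drop across its ~5 mΩ shunt resistors and that the mod lowers the effective resistance to about 3.75 mΩ; that modded value is a hypothetical figure chosen to match the ~30-35% increase reported above, not a measurement:

```python
# Why the shunt mod raises the effective power limit: liquid metal on
# top of a shunt adds a parallel conduction path, lowering its effective
# resistance. The controller infers current from V-drop / R, so with a
# lower real R it under-reads current (and power) and allows more draw.

R_STOCK = 5.0e-3    # ohms; stock shunt value (the "5MO" parts)
R_MODDED = 3.75e-3  # ohms; ASSUMED effective value after the CLU mod

def actual_power(reported_w: float) -> float:
    """True draw when the controller believes it is at reported_w."""
    # The controller still divides the measured V-drop by R_STOCK, so the
    # real current is higher than reported by the ratio R_STOCK/R_MODDED.
    return reported_w * (R_STOCK / R_MODDED)

limit_w = 250 * 1.20  # 120% power target on a 250 W TDP -> 300 W enforced
print(f"card enforces {limit_w:.0f} W, real draw up to ~{actual_power(limit_w):.0f} W")
```

With these assumed values the enforced 300W limit corresponds to roughly 400W of real draw, which is in the same ballpark as the ~390W figure above and well over the 350W the post recommends staying under.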

As you can see, this mod is for the Titan X, while der8auer's video covers the GTX 1070 and 1080. These mods work on any NVIDIA card going back to at least the 700 series, and probably on future generations as well. Not all cards have 3 shunt resistors; you may only have 2 because your GPU has a single power connector input. If you only have 1 power connector, only apply CLU to the one resistor at the top.

Der8auer's video tutorial for modding the power limit on these cards:


----------



## magbarn

Just got one to replace my Maxwell Titan X


----------



## renejr902

If possible, I'd like your opinion on a few questions, guys. Thanks so much. I'm trying to get as much speed as possible from my PC to help my Titan X Pascal's performance.

I ran some tests in The Witcher 3 with my DDR3 downclocked from 1600 to 1066MHz, and at 1066MHz some people were proven right: the FPS drops were worse. For example, a spot that dropped to 55-56 FPS at 1600MHz dropped to 49-50 at 1066.
It's not a night-and-day difference, but 1600 is clearly better than 1066, and in some places 1600 had no FPS drop where 1066 did. So to help with my FPS drop problem, I made a move...

I got a deal: I sold my H97-Plus and 32GB of DDR3-1600 for $200 Canadian, kept my 4790K, and in a big sale I bought 32GB of DDR3-2400 CAS 11 from Team Dark (I will overclock it to CAS 9 or 10, or to 2666MHz; which is better, 2666 or CAS 9-10?) plus an MSI Z97 PC Mate. I bought the MSI board and the Team Dark memory because they were very cheap compared to the other hardware options. In Canadian money it cost me $280 CAD (roughly $180-190 US), so for $80 more I'm OK with changing my DDR3, and I'm happy to get a Z97 board too. I trust MSI a lot; I've always loved their boards and custom video cards. Is Team Dark memory reliable enough?

What do you think, guys? Is it a good deal, a good upgrade for $80?

I have another question. I've checked around the web a lot. I have a 1TB WD Black for Windows 10 and gaming, and it seems an SSD only offers faster loading, with no FPS gain. Is that true? Otherwise I'd buy a cheap one, but I had one in a laptop and I don't care about the OS or programs loading faster. If it can't offer better in-game FPS, I don't want it. BUT if it can help with FPS drops or stuttering, I will buy one. Thanks for your opinions, always very appreciated, guys!

I made these changes to get the most power possible out of my Titan X Pascal.


----------



## xarot

Quote:


> Originally Posted by *Lennyx*
> 
> Are you sure? On my order the payment went to a company in Ireland but the card shipped from Sweden.


My card was shipped last night, but I got a tracking number for a French delivery company. However, the tracking number doesn't work yet.
Quote:


> Originally Posted by *Lennyx*
> 
> Everything ordered from outside Norway needs to go through customs. Sometimes things get stuck in customs for days before they get cleared. Nvidia had a store page for Norwegian customers, but they made an error somewhere with the export documents, which slowed down the process; in some cases the shipment was returned.


Thanks for clearing that up. Out of curiosity, do you need to pay any customs charges from the EU too?


----------



## Lennyx

Quote:


> Originally Posted by *xarot*
> 
> My card was shipped last night, but I got tracking number for a French delivery company. However the tracking number doesn't work yet.
> Thanks for clearing that up. Out of curiosity, do you need to pay any custom charges too from EU?


I believe we all got a tracking number for that French site that did not work. I checked a local delivery company and found out the card was in Sweden, then checked the same tracking number with a Swedish delivery company. My card arrived in Norway this morning and I can now track it here. You can also try checking your local delivery company; see if they have a global search function and just enter your tracking number there.

There is a customs fee on some things. For automotive parts there is no customs fee. It's probably 3 years since I last ordered PC parts from outside Norway, so I don't remember if there is a fee on those.
We need to make sure we don't pay local tax in the country we order from. Norway will charge 25% no matter what.

Price of the item + shipping + 25%, and maybe a customs fee on top of that.


----------



## toncij

Quote:


> Originally Posted by *Lennyx*
> 
> I believe we all got a tracking number for that French site that did not work. I checked a local delivery company and found out the card was in Sweden, then checked the same tracking number with a Swedish delivery company. My card arrived in Norway this morning and I can now track it here. You can also try checking your local delivery company; see if they have a global search function and just enter your tracking number there.
> 
> There is a customs fee on some things. For automotive parts there is no customs fee. It's probably 3 years since I last ordered PC parts from outside Norway, so I don't remember if there is a fee on those.
> We need to make sure we don't pay local tax in the country we order from. Norway will charge 25% no matter what.
> 
> Price of the item + shipping + 25%, and maybe a customs fee on top of that.


But the price on the site should reflect the total cost with tax...


----------



## xarot

Quote:


> Originally Posted by *Lennyx*
> 
> I believe all of us got a tracking number for that French site that did not work. I checked a local delivery company and found out the card was in Sweden, then checked the same tracking number with a Swedish delivery company. My card arrived in Norway this morning and I can now track it here. You can also check whether your local delivery company has a global search function and just enter your tracking number there.
> 
> There is a customs fee for some things; automotive parts, for example, have none. It is probably three years since I last ordered PC parts from outside Norway, so I don't remember if there is a fee on those. We need to make sure we don't pay local tax in the country we order from, since Norway will charge 25% VAT no matter what.
> 
> Total: price of the item + shipping + 25%, and maybe a customs fee on top of that.


Woot. Went to local UPS site and I can track it there.







Thanks a lot!


----------



## cookiesowns

Wow... under water these cards are NUTS. Holding 2GHz+ throughout Time Spy now at the same offset/clocks. My temps are a bit high though; I might have to redo the TIM on the first card, as it's 6C higher than the second card.

Still hitting roughly 50C, and 44C on the second card.


----------



## fernlander

Quote:


> Originally Posted by *Stateless*
> 
> Thanks for offering help, and to Hyper for the suggestion. By the time I read about the CMOS fix that helped you, I was so frustrated that I ended up doing a fresh install of Win10. I still had some odd issues after the fresh install; it appears one of the SSDs, the port it was connected to, or even the cable was bad. After the install I wiped that drive and removed the MBR from it, but whenever I tried to add that SSD back to my system, it would just hang, stuck on the Windows loading screen forever. I was able to get my 1TB SSD to work without issues. Anyway, the system is up and running, and I'm still tweaking settings.
> 
> But the good/great news is I was finally able to mess around with my new Titan, and my card seems to be running well. I was able to hit 2012 without going above 65C in Fire Strike Ultra, using a fan profile that matches speed to temp, and reached rank #40 in the world on the Fire Strike Ultra single-card Hall of Fame. I was originally at #50, but I ended up adding +200 to the memory, which stabilized the clocks. I noticed that with the core at +200 and memory untouched, the clock would bounce around between 1785 and 2012 (again, temps never breached 65C). With +200 on the memory, the clocks stayed in the 1945-2000 range pretty consistently and never dropped below the mid-1900s. It seems +200 on the memory helped sustain the clocks at a more consistent speed; I don't think this happened with my Titan Maxwell.
> 
> I did try +250 on the core, but that was a no-go: it would crash within 30 seconds of the run. I am going to do a few more runs, going up +10 at a time, to see where I top out.
> 
> Last question, because I am still installing things: whenever MSI Afterburner runs, I get an error that it can't reach the RivaTuner servers or something like that. This is with the latest beta version. I am still able to use AB without issue, but about 15 seconds after I load it, I get this pop-up error about not being able to reach the RT servers. Any idea?
> 
> I also did a run of the new Time Spy and was ranked 29th. http://www.3dmark.com/hall-of-fame-2/timespy+3dmark+score+performance+preset/version+1.0/1+gpu


Sorry about your SSD, just bad luck. I am able to run at +200, but since my 780 Ti incident I always keep my fan at 100% manually. With that it stays at 2100 in Heaven at 70C with no throttling; anything higher and I get artifacts and freezes. I will need to put a hybrid cooler on it for actual use though, as I can't tolerate the fan noise, and I hope it will give me a little more room in my OC. I haven't tested memory overclocking yet, but I'll give it a try. If it stabilizes the clock, great, but I don't think we are memory-bandwidth limited in games; I could be wrong. I know I would get a few more frames in Heaven, but if you take it too far it actually lowers your score, and with the Titan Maxwell at least, I felt it reduced my clock stability.

I use Nvidia Inspector to overclock. For some reason it is not popular here, but I believe it uses the same RivaTuner-based overclocking backend as everything else. I like it because it is clean, simple, and light, and does not need to connect to a server.


----------



## Lennyx

Quote:


> Originally Posted by *toncij*
> 
> But the price on the site should reflect the total cost with tax...


Yes, that's because they have a .no site. Some online stores based in an EU country have a Norwegian branch, and those stores take care of the taxes. A lot of other stores do not, and then it is up to the customer to make sure they don't pay local tax in another country.

For the waterblock I ordered from EK, I had to pay tax on both the item and the shipping.


----------



## Gary2015

Getting this error when running Firestrike Ultra on my 2X TXP. Anyone help?


----------



## Tideman

Quote:


> Originally Posted by *Lennyx*
> 
> Are you sure? On my order the payment went to a company in Ireland, but the card shipped from Sweden.
> 
> First upgrade in almost 3 years. I also ordered a new monitor and a waterblock. Next week is gonna be a lot of fun


Seems you might be right. Depends on your location but I know that for the UK, they ship from Ireland.


----------



## Lennyx

Quote:


> Originally Posted by *Tideman*
> 
> Seems you might be right. Depends on your location but I know that for the UK, they ship from Ireland.


You are probably right. They most likely have several warehouses spread around that they ship from. Stock is out in Norway and Sweden now, but it seems the card is still in stock in the UK.


----------



## fisher6

Quote:


> Originally Posted by *Lennyx*
> 
> I believe all of us got a tracking number for that French site that did not work. I checked a local delivery company and found out the card was in Sweden, then checked the same tracking number with a Swedish delivery company. My card arrived in Norway this morning and I can now track it here. You can also check whether your local delivery company has a global search function and just enter your tracking number there.
> 
> There is a customs fee for some things; automotive parts, for example, have none. It is probably three years since I last ordered PC parts from outside Norway, so I don't remember if there is a fee on those. We need to make sure we don't pay local tax in the country we order from, since Norway will charge 25% VAT no matter what.
> 
> Total: price of the item + shipping + 25%, and maybe a customs fee on top of that.


I see you are in Norway too! Out of curiosity, do you expect to pay any customs (toll) fees on top of the 13k NOK for the card itself, provided you ordered it from the Norwegian Nvidia website? I thought the price included all that, since everything is in Norwegian and it would count as a normal local purchase.


----------



## Lennyx

Quote:


> Originally Posted by *fisher6*
> 
> I see you are in Norway too! Out of curiosity, do you expect to pay any customs (toll) fees on top of the 13k NOK for the card itself, provided you ordered it from the Norwegian Nvidia website? I thought the price included all that, since everything is in Norwegian and it would count as a normal local purchase.


There should not be anything extra to pay; the 13k NOK should cover it all. I was just explaining how taxes and customs work for us in Norway when we buy from outside the country. Fun fact: two years ago, 13k NOK was over 2,000 USD.


----------



## superkyle1721

Does anyone know if they will be releasing a Ti version of the 10 series? I'm looking to upgrade my 980 Ti SLI setup; I had planned to get two 1080 Tis, but now I feel there isn't a large enough gap left for a 1080 Ti, so I may just pick up a single Titan. The problem is that, judging from benchmarks, that would essentially be an expensive side-grade at best, right?

Sent from my iPhone using Tapatalk


----------



## KillerBee33

Best one yet








http://www.3dmark.com/3dm/14035409


----------



## combat fighter

Quote:


> Originally Posted by *superkyle1721*
> 
> Does anyone know if they will be releasing a Ti version of the 10 series? I'm looking to upgrade my 980 Ti SLI setup; I had planned to get two 1080 Tis, but now I feel there isn't a large enough gap left for a 1080 Ti, so I may just pick up a single Titan. The problem is that, judging from benchmarks, that would essentially be an expensive side-grade at best, right?
> 
> Sent from my iPhone using Tapatalk


You'll basically have at least the power of two of your cards, and quite possibly more (especially under water), in one card.

I think it's an amazing card!

It can only get better under water and once voltage control is available.


----------



## superkyle1721

Quote:


> Originally Posted by *combat fighter*
> 
> Quote:
> 
> 
> 
> Originally Posted by *superkyle1721*
> 
> Does anyone know if they will be releasing a Ti version of the 10 series? I'm looking to upgrade my 980 Ti SLI setup; I had planned to get two 1080 Tis, but now I feel there isn't a large enough gap left for a 1080 Ti, so I may just pick up a single Titan. The problem is that, judging from benchmarks, that would essentially be an expensive side-grade at best, right?
> 
> Sent from my iPhone using Tapatalk
> 
> 
> 
> You'll basically have at least the power of two of your cards, and quite possibly more (especially under water), in one card.
> 
> I think it's an amazing card!
> 
> Can only get better under water and when voltage control is available.
Click to expand...

I don't think a Titan X matches two of my cards OC vs. OC; my graphics score is around 41,200, and from what I've seen so far I'm not sure it gets there. In the real world, though, I might only lose a couple of FPS. The problem is that if I make the upgrade, I will only be able to afford a single card. If they do release a 1080 Ti, I can buy two; the question is whether they will...

Sent from my iPhone using Tapatalk


----------



## HyperMatrix

Quote:


> Originally Posted by *Gary2015*
> 
> Getting this error when running Firestrike Ultra on my 2X TXP. Anyone help?


Does it happen at stock clocks too?


----------



## HyperMatrix

Quote:


> Originally Posted by *superkyle1721*
> 
> I don't think a Titan X matches two of my cards OC vs. OC; my graphics score is around 41,200, and from what I've seen so far I'm not sure it gets there. In the real world, though, I might only lose a couple of FPS. The problem is that if I make the upgrade, I will only be able to afford a single card. If they do release a 1080 Ti, I can buy two; the question is whether they will...
> 
> Sent from my iPhone using Tapatalk


Real-world you'll probably gain, due to no loss from SLI scaling issues. With a single Titan XP I can run Dying Light maxed out at 1440p, locked to 165fps.


----------



## superkyle1721

Quote:


> Originally Posted by *HyperMatrix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *superkyle1721*
> 
> I don't think a Titan X matches two of my cards OC vs. OC; my graphics score is around 41,200, and from what I've seen so far I'm not sure it gets there. In the real world, though, I might only lose a couple of FPS. The problem is that if I make the upgrade, I will only be able to afford a single card. If they do release a 1080 Ti, I can buy two; the question is whether they will...
> 
> Sent from my iPhone using Tapatalk
> 
> 
> 
> Real-world you'll probably gain, due to no loss from SLI scaling issues. With a single Titan XP I can run Dying Light maxed out at 1440p, locked to 165fps.
Click to expand...

True; with games like Doom I'm sure I will gain due to crappy scaling, while games like The Witcher, where scaling is pretty good, should be about the same. I'll probably wait until non-reference PCBs launch, go full custom-board watercooling, and then wait for voltage mods to be released.

Sent from my iPhone using Tapatalk


----------



## Snaporz

Quote:


> Originally Posted by *superkyle1721*
> 
> True; with games like Doom I'm sure I will gain due to crappy scaling, while games like The Witcher, where scaling is pretty good, should be about the same. I'll probably wait until non-reference PCBs launch, go full custom-board watercooling, and then wait for voltage mods to be released.
> 
> Sent from my iPhone using Tapatalk


It's my understanding you'll only ever see Titan X Pascal direct from Nvidia.


----------



## superkyle1721

Quote:


> Originally Posted by *Snaporz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *superkyle1721*
> 
> True; with games like Doom I'm sure I will gain due to crappy scaling, while games like The Witcher, where scaling is pretty good, should be about the same. I'll probably wait until non-reference PCBs launch, go full custom-board watercooling, and then wait for voltage mods to be released.
> 
> Sent from my iPhone using Tapatalk
> 
> 
> 
> It's my understanding you'll only ever see Titan X Pascal direct from Nvidia.
Click to expand...

You mean no custom PCBs? I knew Gigabyte and the others were selling the reference card, but I didn't know there wouldn't be any custom boards.

Sent from my iPhone using Tapatalk


----------



## fisher6

Quote:


> Originally Posted by *superkyle1721*
> 
> You mean no custom PCBs? I knew Gigabyte and the others were selling the reference card, but I didn't know there wouldn't be any custom boards.
> 
> Sent from my iPhone using Tapatalk


Only Nvidia will be making the Titan XP; there will be no custom boards or coolers, and it will only be sold directly from Nvidia. The only thing you can do is slap a waterblock on it.


----------



## HyperMatrix

Quote:


> Originally Posted by *superkyle1721*
> 
> You mean no custom PCBs? I knew Gigabyte and the others were selling the reference card, but I didn't know there wouldn't be any custom boards.
> 
> Sent from my iPhone using Tapatalk


There has never been a non-Nvidia-made Titan. In previous generations other companies would just brand them, but they'd be selling the same thing. Even when EVGA swapped out the reference blower and put a hybrid unit on, it was still the standard reference PCB.


----------



## toncij

It's even worse this time: they don't allow third-party sales either.


----------



## Gary2015

Quote:


> Originally Posted by *HyperMatrix*
> 
> Does it happen at stock clocks too?


I am at stock. I haven't started OCing yet.


----------



## Drag-On

Quote:


> Originally Posted by *Gary2015*
> 
> Getting this error when running Firestrike Ultra on my 2X TXP. Anyone help?


I ran into the same problem the other day after I did the Windows 10 Anniversary Update. The issue comes from Windows removing the required Futuremark SystemInfo service when updating to a newer version. Reinstalling 3DMark should correct the issue.


----------



## Jpmboy

Quote:


> Originally Posted by *EniGma1987*
> 
> The two R22 "thingys" are chokes (inductors). That is the memory VRM up there. Not surprising that they run so hot, given that we are overclocking all those dense memory chips into the 11GHz range.
> 
> You can easily solve your power limit problem, especially since your cards are already naked like that. Just put some Coollaboratory Liquid Ultra over the 5MO shunt resistors near the power connectors, covering each whole resistor so its two sides are connected together. That should roughly double the power limit on the GPUs.
> The MOSFETs (typically the limiting factor of a GPU VRM) in this card's VRM have basically no de-rating at higher temps: the same rating at 25C as at 100C. That is odd for a MOSFET, but the reason is that the package rating is extremely low and is the limiting factor. They do become a little less power-efficient at higher temps, but they don't lose performance or anything like that on this model. The good news is that the memory VRM's current capability is way overbuilt for the memory; it could easily put 150 watts through the MOSFETs (though 150W would probably saturate the memory chokes). Just buckets more power than you would ever use. It is surprising Nvidia would build up the memory VRM so much, considering how badly they cut costs on the core VRM of all their cards, and given that GDDR5X is supposed to use even less power than normal GDDR5. Maybe the new X RAM requires a more intricate power delivery to achieve its lower draw and higher bandwidth.
> 
> I am curious, though: how can you claim the parts are not near their T_max when you don't even know what the parts you are referring to are called? It seems you would not be able to look up the spec sheet for something you can't identify.
> Watercool told me yesterday that the Titan Heatkiller block is being designed. My card performs well enough as-is for the time being, so I am just going to leave it and wait for the Heatkiller block.


lol - yes, I know they are chokes (I think these are from ON Semi). I've never seen a part where the AOR and NOR ratings don't separate at higher temps; above the NOR the part may provide a less stable output - but that is neither here nor there. I did the CLU mod on my 1080.
No need for an allen wrench - right?









Good to see you did the CLU mod. With my bench mounting the cards vertically, the CLU can (and will) just run off and short the card in spectacular fashion. I tried drying the CLU with a heat gun... no help. We'll (hopefully) get a Pascal BIOS editor, and these will open up just like the Titan X Maxwell did.








Quote:


> Originally Posted by *cookiesowns*
> 
> Getting ready to throw these on water. For first-time water coolers, this card is not for the faint of heart.
> 
> *I had to be super, super careful with these.* On the KPE I can manhandle them and they'd be fine (not saying I did)


Nice Unis! I agree - TXM and KPE - or any maxwell card had a much less fragile PCB.
Quote:


> Originally Posted by *Menthol*
> 
> It's kind of nice benching two big cards and not having to add a second power supply; with two overvolted 780 Ti KPEs on an overclocked X79 or X99, one PSU wasn't enough to bench


Oh right? Way back I had three 780 Ti KPEs, and they required both a 1200W and a 1500W power supply to run (juiced with the EVBOT). AlanCsalt had the same issue with them. I really came to like the "Add2PSU" gadget.








Quote:


> Originally Posted by *EniGma1987*
> 
> *Shunt mod tutorial:*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Warning: Coollaboratory Liquid Ultra (and Pro) are conductive thermal interface materials. We use this conductivity to lower the resistance on the card for this mod, but it also means the material is dangerous and can destroy your graphics card if it spills and touches another component. I take no responsibility if you break your own (or someone else's) GPU with this mod. If you choose to do it, be careful.
> 
> 1) Start by removing the backplate and cooler from your GPU. You will need a size #0 Phillips screwdriver and a 4mm socket wrench. Once all the screws and bolts are out, the cooler should come off relatively easily; if it is not coming off, you probably forgot a screw somewhere. Likely places to forget are on the end of the card with the display outputs.
> 
> 2) Once the cooler is removed, your card should look like this:
> 
> 
> The three resistors on the card are circled in red. We are only going to mod the top two.
> 
> 3) Place a small dab of Liquid Ultra on the two top shunt resistors (case label "5MO"). You don't need a ton of CLU, just a small dab that you will spread out. See the picture here for how much I applied:
> 
> 
> 
> 4) Now use the brush included in the CLU package to spread the CLU over the whole top of each resistor. Be careful not to let it spill off the resistor; if it does, it will most likely destroy your graphics card. Also be careful not to use too much CLU, since an excess will squeeze out and spill when you put the card back together. There is an inductor very close to resistor #2 (RS2) that is easy to spill onto, so be very careful with this step.
> 
> 
> 
> Your graphics card should look like the above picture now. Put the cooler back onto the card and you are done!
> 
> To remove the mod, simply wipe away the CLU. If it has hardened, you may be able to use a hair dryer for a very brief period to liquefy the material again so you can remove it.
> 
> I posted before-and-after 3DMark screenshots earlier to show the mod works: the graphics score rose by 223 points in FS Ultra. I have now been playing a game for the past 3 hours since doing the mod with no problems.
> 
> 
> 
> @GunnzAkimbo want to add this to the OP?


well done.
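For anyone wondering why covering the shunt resistors raises the power limit: the card estimates current from the voltage drop across those 5 mΩ shunts. Bridging a shunt with conductive CLU lowers its effective resistance, so the measured drop shrinks and the card under-reports power, hitting its limit later. A toy model (the bridged resistance value is an assumption for illustration, not a measurement):

```python
# Toy model of why the shunt mod raises the effective power limit.
# The monitoring circuit computes current as I = V_drop / R_assumed,
# where R_assumed is the nominal 5 mOhm shunt value it was designed for.

R_NOMINAL = 0.005    # nominal 5 mOhm shunt the card assumes
R_BRIDGED = 0.0025   # assumed effective resistance after a CLU bridge (illustrative)

def reported_power(actual_current, rail_voltage, r_effective, r_assumed=R_NOMINAL):
    v_drop = actual_current * r_effective  # real voltage drop across the shunt
    i_reported = v_drop / r_assumed        # current the card *thinks* is flowing
    return i_reported * rail_voltage       # power the card reports (watts)

# 20 A on a 12 V rail = 240 W actually drawn:
print(reported_power(20, 12, R_NOMINAL))  # 240.0 -> reported accurately
print(reported_power(20, 12, R_BRIDGED))  # 120.0 -> under-reported after the mod
```

Halving the sensed resistance halves the reported power, which is why the mod "roughly doubles" the limit; the card still draws the full amount, so cooling and PSU headroom matter.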


----------



## Gary2015

Quote:


> Originally Posted by *Drag-On*
> 
> I ran into the same problem the other day after I did the Windows 10 Anniversary Update. The issue comes from Windows removing the required Futuremark SystemInfo service when updating to a newer version. Reinstalling 3DMark should correct the issue.


Worked. Thx. Repped.


----------



## Gary2015

Stock clocks. Boosts to 1823.


----------



## Gary2015

Quote:


> Originally Posted by *toncij*
> 
> I doubt there is any space for a Ti... when? What kind of perf and price?


Agreed. TXP will be the best offering for at least 12 months.


----------



## CallsignVega

Quote:


> Originally Posted by *Jpmboy*
> 
> good to see you did the CLU mod. With my bench mounting the cards vertically, the CLU can (and will) just run off and short the card in a spectacular manner. I tried drying the CLU with a hot gun... no help. We'll get a pascal bios editor (hopefully) and these will open up just like TitanXM did.


Well, that sucks, as my Liquid Ultra is arriving today. My cards are mounted vertically, so I guess it is a no-go. I hadn't realized Liquid Ultra stays liquid enough to literally roll off the resistors. I remember using it on CPUs: it would dry hard as a rock and I had to sandblast the stuff off (which is why I don't use it on CPUs anymore).


----------



## EniGma1987

Quote:


> Originally Posted by *CallsignVega*
> 
> Well, that sucks, as my Liquid Ultra is arriving today. My cards are mounted vertically, so I guess it is a no-go. I hadn't realized Liquid Ultra stays liquid enough to literally roll off the resistors. I remember using it on CPUs: it would dry hard as a rock and I had to sandblast the stuff off (which is why I don't use it on CPUs anymore).


It will dry out eventually, given enough thermal cycles and time. You probably don't have that kind of time or patience for drying it out on a resistor, though, and I'm not even sure those resistors would naturally get hot enough for a proper thermal cycle. You could experiment a bit: put some CLU on a little piece of scrap, then use a blow dryer and some ice to cycle it a few times and see how it behaves after 5-10 minutes. Probably not nearly enough time, but it's worth a shot. If you don't want to try it yourself, I will test the CLU's behavior like that sometime next week just to satisfy my own curiosity.


----------



## Gary2015

OC: +180 core, +400 mem. Boost 2050MHz. Temps 65C/55C at 100% fan.


----------



## toncij

Quote:


> Originally Posted by *Gary2015*
> 
> Agreed. TXP will be the best offering for at least 12 months.


I seriously doubt that too. We are going to see a TXP v2 with a fully enabled chip (3840 cores and possibly HBM) very soon: maybe a GDDR5X 1080 Ti as early as December, with a new 16GB HBM TXP v2 in 2017 Q1.
Maybe they'll move to Volta early.


----------



## Gary2015

Quote:


> Originally Posted by *toncij*
> 
> I seriously doubt that too. We are going to see a TXP v2 with a fully enabled chip (3840 cores and possibly HBM) very soon: maybe a GDDR5X 1080 Ti as early as December, with a new 16GB HBM TXP v2 in 2017 Q1.
> Maybe they'll move to Volta early.


I disagree. They won't release another flagship within 12 months.


----------



## DNMock

So how easy or difficult is this shunt mod? It looks like a median of about a 3% to 5% increase.

What are the odds of it damaging your GPU with long-term use, assuming proper application?


----------



## EniGma1987

Quote:


> Originally Posted by *DNMock*
> 
> So how easy or difficult is this shunt mod? It looks like a median of about a 3% to 5% increase.
> 
> What are the odds of it damaging your GPU with long-term use, assuming proper application?


If you know how to apply CLU to a CPU, then I'd put the mod difficulty at a 2 out of 10.
The performance increase comes from the card no longer throttling at the power limit. That extra 223 points of Fire Strike Ultra graphics score came without increasing my overclock at all, just from the mod. I played a game yesterday after the mod; it used to constantly throttle down to 1750-1900, but afterwards the lowest the GPU dropped was 1978MHz, spending 95+% of the time at 2012 or above. So it's definitely worth it if you are overclocking the card.


----------



## toncij

Quote:


> Originally Posted by *Gary2015*
> 
> I disagree. They wont release another flagship within 12 months.


So you bet that 1080Ti won't be coming soon?


----------



## Testier

Quote:


> Originally Posted by *toncij*
> 
> I seriously doubt that too. We are going to see a TXP v2 with a fully enabled chip (3840 cores and possibly HBM) very soon: maybe a GDDR5X 1080 Ti as early as December, with a new 16GB HBM TXP v2 in 2017 Q1.
> Maybe they'll move to Volta early.


GP102 can't mount HBM2, and it looks like GP100 is staying in HPC.


----------



## criminal

Quote:


> Originally Posted by *CallsignVega*
> 
> Well, that sucks, as my Liquid Ultra is arriving today. My cards are mounted vertically, so I guess it is a no-go. I hadn't realized Liquid Ultra stays liquid enough to literally roll off the resistors. I remember using it on CPUs: it would dry hard as a rock and I had to sandblast the stuff off (which is why I don't use it on CPUs anymore).


I ordered the Thermal Grizzly Conductonaut referenced in this video to try the shunt mod on my 1070. Strange that nothing was mentioned in his video about the cards having to remain horizontal after the mod; maybe Conductonaut is thicker? Anyway, I am thinking I will just cover the surrounding area on my card with LET, just in case.
Quote:


> Originally Posted by *EniGma1987*
> 
> It will dry out eventually, given enough thermal cycles and time. You probably don't have that kind of time or patience for drying it out on a resistor, though, and I'm not even sure those resistors would naturally get hot enough for a proper thermal cycle. You could experiment a bit: put some CLU on a little piece of scrap, then use a blow dryer and some ice to cycle it a few times and see how it behaves after 5-10 minutes. Probably not nearly enough time, but it's worth a shot. If you don't want to try it yourself, I will test the CLU's behavior like that sometime next week just to satisfy my own curiosity.


So your card isn't mounted vertically?


----------



## EniGma1987

Quote:


> Originally Posted by *criminal*
> 
> So your card isn't mounted vertically?


Nope, just using a regular tower.


----------



## nyk20z3

Just went to pick up a Titan X, and that damn tax, lol ($115). Adds insult to injury; I just might pick up a 1080 and call it a day.


----------



## Creator

Are the TXPs CPU-bottlenecked by a 4.5GHz Haswell-class CPU at 1440p?


----------



## Kana Chan

What temp does liquid electrical tape melt at?


----------



## DADDYDC650

Quote:


> Originally Posted by *toncij*
> 
> I seriously doubt that too. We are going to see a TXP v2 with a fully enabled chip (3840 cores and possibly HBM) very soon: maybe a GDDR5X 1080 Ti as early as December, with a new 16GB HBM TXP v2 in 2017 Q1.
> Maybe they'll move to Volta early.


Nope. You'll possibly see a 1080 Ti in the next couple of months, but no full-chip Pascal Titan.


----------



## Gary2015

Quote:


> Originally Posted by *DADDYDC650*
> 
> Nope. You'll possibly see a 1080 Ti in the next couple of months, but no full-chip Pascal Titan.


Yes, if there is a Ti variant this time. Given the TXP is 30% faster than a 1080, how much faster would a Ti be: 28% or 15%?


----------



## KillerBee33

Quote:


> Originally Posted by *Gary2015*
> 
> Yes, if there is a Ti variant this time. Given the TXP is 30% faster than a 1080, how much faster would a Ti be: 28% or 15%?


Even mathematically it doesn't make sense to release a "Ti". I doubt there'll be one.


----------



## criminal

Quote:


> Originally Posted by *Kana Chan*
> 
> What temps do the liquid electrical tape melt at?


Can't find that, but found this on the spec sheet:

Temperature use range: -30˚F to 200˚F
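For anyone who thinks in Celsius, that spec-sheet range converts as follows:

```python
# Convert the liquid electrical tape's rated use range from Fahrenheit to Celsius.
def f_to_c(f):
    return (f - 32) * 5 / 9

# Spec sheet range quoted above: -30 F to 200 F
print(round(f_to_c(-30), 1))  # -34.4 C
print(round(f_to_c(200), 1))  # 93.3 C
```

So the 200°F ceiling is roughly 93°C, which is well above the temperatures the shunt area should ever reach in normal use.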


----------



## Gary2015

Quote:


> Originally Posted by *KillerBee33*
> 
> Even Mathematically it doesn't make sense releasing a "Ti" Doubt there'll be one.


Judging from another thread, there are a few people scoffing at TXP owners while waiting on the 1080 Ti they are sure will be released soon.


----------



## DNMock

I've zero experience with CLU; I've always used traditional paste.

I've seen the one mention of Conductonaut - does anyone know if that's the best to use, or if another would be more viable?

I figured I'd get a bunch and do multiple practice runs on random old stuff before trying it on the Titans.


----------



## KillerBee33

Quote:


> Originally Posted by *Gary2015*
> 
> Judging from another thread, there are a few people scoffing at TXP owners while waiting on the 1080 Ti they are sure will be released soon.


Heh, I can only imagine the fuss it'll cause if they decide to test HBM on a 1080 Ti, if for some reason it does show up.


----------



## mbze430

Quote:


> Originally Posted by *superkyle1721*
> 
> Does anyone know if they will be releasing a Ti version of the 10 series? I'm looking to upgrade my 980 Ti SLI setup; I had planned to get two 1080 Tis, but now I feel there isn't a large enough gap left for a 1080 Ti, so I may just pick up a single Titan. The problem is that, judging from benchmarks, that would essentially be an expensive side-grade at best, right?
> 
> Sent from my iPhone using Tapatalk


I think it really depends what you want to do with the Titan XP. I had the SLI 980 Ti setup, and it was pretty good at 4K as long as the game supported SLI. But after getting an Oculus Rift, I realized that, yes, it's OK, but I wanted better. I looked at the 1080, but people were still not able to run max graphics settings in VR on a single 1080. When the Titan XP was announced and I looked at the specs, I knew it would do max graphics at 90fps and then some. The choice was clear.

My second reason for jumping to the Titan XP was my 3x 4K monitors.


----------



## criminal

Quote:


> Originally Posted by *DNMock*
> 
> I've zero experience with CLU always used traditional paste.
> 
> I've heard the one mention of Conductonaut, anyone know if that's the best to use or if another would be more viable.
> 
> I figured I'd get a bunch and do multiple practice attempts on random old stuff before trying on the Titans.


Conductonaut is supposed to be more conductive than CLU, from what I have read.

I wonder: after applying CLU/Conductonaut to the shunt, could you cover it with liquid electrical tape so it doesn't run off?


----------



## Gary2015

Quote:


> Originally Posted by *mbze430*
> 
> I think it really depends on what you want to do with the Titan XP. I had the SLI 980 Ti setup. It was pretty good at 4K as long as the game supported SLI. But after getting an Oculus Rift, I realized: yes, it's OK, but I want better. I looked at the 1080, but people were still not able to get max graphics settings with a single 1080 in VR. When the Titan XP was announced, looking at the specs, I knew it would do max graphics @ 90fps and then some. The choice was clear.
> 
> My second reason for jumping to the Titan XP was the 3x 4K monitors.


I've had mine a few days, and I'd say it's worth the upgrade from GTX 1080 SLI on my Acer X34. Games like ESO are now butter smooth, which wasn't the case on GTX 1080 SLI (settings maxed). Even with the photorealistic mod on GTAV (settings maxed) I'm still getting a solid 70fps overall. Forza 6 Beta with all settings maxed looks great, with frame rates rarely dipping below 100fps.


----------



## mbze430

Quote:


> Originally Posted by *toncij*
> 
> I seriously doubt that too. We are going to see TXPv2 with a fully enabled chip (3840 cores and possibly HBM) very soon. Maybe even December as GDDR5X 1080Ti with 2017Q1 new TXPv2 16GB HBM.
> Maybe they'll move to Volta early.


They would only move to Volta early if 1) AMD has something up their sleeve to compete, or 2) for some magical reason they are getting much higher yields in the last two months. It's 50/50 on a 1080 Ti.


----------



## KillerBee33

Quote:


> Originally Posted by *mbze430*
> 
> I think it really depends on what you want to do with the Titan XP. I had the SLI 980 Ti setup. It was pretty good at 4K as long as the game supported SLI. But after getting an Oculus Rift, I realized: yes, it's OK, but I want better. I looked at the 1080, but people were still not able to get max graphics settings with a single 1080 in VR. When the Titan XP was announced, looking at the specs, I knew it would do max graphics @ 90fps and then some. The choice was clear.
> 
> My second reason for jumping to the Titan XP was the 3x 4K monitors.


May I ask what VR game you were trying to run that two 980 Tis couldn't handle?


----------



## junknown

Long time lurker, finally decided to sign up and see if I can contribute to the knowledge-base.

Quick question for people who've managed to use their EVGA Hybrid (TXM, 980 Ti) coolers on the Titan X Pascal: how difficult was the setup process on the new Titan vs. the old? I'm not very experienced with custom loops and things like that, but I do have a Titan XM with an EVGA 980 Ti Hybrid kit on it. How difficult was the transition? Anything delicate I would need to worry about compared to the older Titan?

TXP is a beast even without the OC, but the fan noise is too much.


----------



## Gary2015

Quote:


> Originally Posted by *junknown*
> 
> Long time lurker, finally decided to sign up and see if I can contribute to the knowledge-base.
> 
> Quick questions to people who've managed to use their EVGA Hybrid (TXM,980 Ti) Coolers on the Titan X Pascal. How difficult was the setup process on the new Titan VS the old? I'm not very experienced with custom loops and things like that, but I do have a Titan XM with a EVGA 980 Ti Hybrid Kit on it? How difficult was the transition? Anything delicate that I would need to worry about compared to the older Titan?
> 
> TXP is a beast even without the OC, but the fan noise is too much.


It won't be as hard as a custom loop, but you will still need to take the time to make sure you install it properly. A bad installation may mean artifacts and other problems.


----------



## mbze430

Quote:


> Originally Posted by *KillerBee33*
> 
> May i ask what VR Game you were trying to run that 2 980tis couldnt handle?


There are NO VR SLI games, so any game that is VR-only can use just 1 GPU.

The only "game" that runs VR SLI is VR Funhouse from Nvidia, but you need two 10-series cards + a dedicated PhysX card. (That is exactly what I have now: SLI Titan XPs + a 980 Ti.)

In a game like PCars in VR with a 980 Ti, I had to pretty much turn everything down to medium with a pixel density setting of 1.3, which averaged about 68fps, but the Rift's ATW helps get it back to 90fps.


----------



## mouacyk

Even if they drop a Ti, how many of you would actually pay $900-1000 for a DX11 card when Volta will be out in less than a year with full DX12 compliance? Or even Vega will be out soonish...

All this pricing makes the 980 Ti look godly at $650 on release for its performance. That's not gonna happen again.


----------



## junknown

Quote:


> Originally Posted by *Gary2015*
> 
> It won't be as hard as a custom loop but you will still need to take the time to make sure you install it properly. A bad installation may mean artifacts and other problems,


So no harder than the Maxwell Titan X? That installation was easy enough, just trying to find out if there will be any surprises along the way.


----------



## DNMock

Quote:


> Originally Posted by *criminal*
> 
> Conductonaut is supposed to be more conductive than CLU from what I have read.
> 
> I wonder after applying clu/conductonaunt to shunt, could you cover it with liquid electrical tape so it doesn't run off?


Eraser putty?

edit: what about a thin adhesive thermal pad?


----------



## chronicfx

They should make copper-tipped pencils (insert soft conducting metal of choice here)... so you can reverse the pencil mod.

Like this, but I wonder how hard it is and what the metal is:

http://www.wired.com/2010/09/neither-pen-nor-pencil-write-endlessly-in-metal/

edit: watched the video; it seems to be lead-based. Was hoping for a winner!


----------



## KillerBee33

Quote:


> Originally Posted by *mbze430*
> 
> There are NO VR SLI games. So any games that is in VR only can use 1 GPU.
> 
> The only "game" that run VR SLI is VR Funhouse from Nvidia, but you need 2 10-Series card + a dedicated PhysX card. (that is exactly what I have now, a SLI Titan XP + 980TI)
> 
> Game like PCars, in VR with a 980TI, I had to pretty much run every down to medium with a Pixel Density setting of 1.3, which avg about 68fps, but since the Rift ATW it helps get it back to 90fps


I gave up on PCars in VR, too many confusing video settings. BTW, the Titan X can't run it at medium, AA off and DPO @ 1.5.


----------



## Gary2015

Quote:


> Originally Posted by *junknown*
> 
> So no harder than the Maxwell Titan X? That installation was easy enough, just trying to find out if there will be any surprises along the way.


Shouldn't be.


----------



## mbze430

Quote:


> Originally Posted by *KillerBee33*
> 
> I gave up on Pcars in VR , too many confusing Video settings, btw TitanX cant run it Medium ,AA off and DPO @ 1.5


On my old 980 Ti setup I was at least able to get D2X... without AA in VR... it's like going back to the dark ages.


----------



## toncij

I never said GP1xx will do it, but it wouldn't surprise me to see a GP200 Titan X Black in April 2017. The 1080 Ti could be an 8GB, 3,000-ish core chip to fill the gap between $700 and $1,200, like a $900 card for AIBs.
I don't think Nvidia will let the Ti market slide; and it's a large market, because AIBs could do miracles with custom 3,000-3,200 core chips.


----------



## KillerBee33

Quote:


> Originally Posted by *mbze430*
> 
> on my old 980TI setup I at least was able to to get D2X... without AA in VR..... it's like.... going back to the dark ages. *****


Have you tried VorpX yet?


----------



## mbze430

Quote:


> Originally Posted by *KillerBee33*
> 
> Have you tried VorpX yet?


No I haven't. I wasn't sure it was something I would use. The problem with me is that I really don't have that much time to actually play through a game, like start to finish.

I head the IT department at a law firm as my day job, so I get lots of computer time during weekdays. Week nights I get about 2-3hrs of free time, but that splits between gaming and "other" activities. During the weekends I go skydiving/scuba/racing cars (HPDE) or anything more adventurous. So gaming, I get maybe... 2hrs at most per week?


----------



## CallsignVega

Quote:


> Originally Posted by *EniGma1987*
> 
> It will dry out eventually, given enough thermal cycles and time. You probably dont have that kind of time or patience for making it dry out on a resistor though. Not even sure those resistors would ever naturally get hot enough to do a proper thermal cycle. You could try just experimenting a bit, put some CLU on a little piece of scrap somewhere and then use a blow drier and some ice to cycle it a few times and see how it behaves after 5-10 minutes of doing that. Probably not near enough time but its probably worth a shot to see how it behaves. If you dont want to try it yourself, I will test out the behavior of the CLU like that next week sometime just to satisfy my own curiosity.


Quote:


> Originally Posted by *criminal*
> 
> I ordered Thermal Grizzly conductonaunt referenced in this
> 
> 
> 
> to try the shunt mod on my 1070. Strange nothing was mentioned in his video about the cards having to remain horizontal after the mod. Maybe conductonaunt is thicker? Anyway, I am thinking I will just cover the surrounding area on my card with LET just in case.
> So your card isn't mounted vertically?


I think CLU is just too unpredictable to use on a vertical surface. I've had to take it off with a jackhammer before, and now it may roll off the resistor...


----------



## KillerBee33

Quote:


> Originally Posted by *mbze430*
> 
> No I haven't. Wasn't sure it was something I will use. Problem with me is that I really don't have that much time to actually play through a game. like start to finish.
> 
> I head the IT department at a law firm as my day job, so I get lots of computer time during weekdays, week nights I get about 2-3hrs of free time, but that splits between gaming and "other" activities. During the weekends, I go skydiving/scuba/racing cars (HPDE) or anything more adventurous. So gaming, I get maybe... 2hrs at most per week?


i see.


----------



## combat fighter

Quote:


> Originally Posted by *toncij*
> 
> I never said GP1xx will do it, but it wouldn't surprise me to see GP200 Titan X Black in April 2017. 1080Ti could be an 8GB 300ish core chip to fill the gap between $700 & $1200, like a $900 card for AIBs.
> I don't think Nvidia will let the Ti market slide; and it's a large market because AIBs could do miracles with custom 3000-3200 core chips.


Can't see it happening this round.

Nvidia have no competition this time.


----------



## unreality

Quote:


> Originally Posted by *Creator*
> 
> Are the TXPs CPU bottlenecked by a 4.5ghz Haswell class CPU at 1440p?


Actually, in some cases yes! I did three benchmarks with GTA V at max settings with my 5960X. I can't remember how much AA, but I think it was 4x MSAA.

5960X 8 x 4.0 GHz / 3.0 GHz Cache


Spoiler: Warning: Spoiler!



Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 20.019939, 104.065613, 88.805367
Pass 1, 55.099808, 118.970184, 91.187744
Pass 2, 57.472954, 148.144653, 98.961647
Pass 3, 72.733543, 150.134537, 115.904945
Pass 4, 40.494995, 183.099274, 96.914566



5960X 8 x 4.8 GHz / 3.0 GHz Cache


Spoiler: Warning: Spoiler!



Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 20.295830, 109.696838, 98.428856
Pass 1, 51.555618, 119.189049, 97.331108
Pass 2, 75.502701, 158.346664, 106.262833
Pass 3, 67.805008, 166.358932, 122.071167
Pass 4, 60.490322, 269.635406, 104.462173



5960X 8 x 4.8 GHz / 4.5 GHz Cache


Spoiler: Warning: Spoiler!



Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 20.680553, 113.721191, 102.476326
Pass 1, 52.432949, 119.809387, 99.176872
Pass 2, 82.153358, 249.020477, 111.173126
Pass 3, 76.648148, 162.067535, 124.879341
Pass 4, 57.243412, 230.010925, 108.521515



Also note that this is with a SINGLE TXp
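For anyone wanting to compare their own runs, the per-pass averages and CPU-scaling percentages can be recomputed in a few lines. The FPS values are copied from the spoilers above; the averaging and the percentages are my own arithmetic, not from the post:

```python
# Average the five GTA V benchmark passes for each 5960X configuration
# and compute the uplift relative to the 4.0 GHz baseline run.

def mean(xs):
    return sum(xs) / len(xs)

# Avg FPS of passes 0-4, copied from the spoiler blocks above.
runs = {
    "8x4.0 / 3.0 cache": [88.805367, 91.187744, 98.961647, 115.904945, 96.914566],
    "8x4.8 / 3.0 cache": [98.428856, 97.331108, 106.262833, 122.071167, 104.462173],
    "8x4.8 / 4.5 cache": [102.476326, 99.176872, 111.173126, 124.879341, 108.521515],
}

baseline = mean(runs["8x4.0 / 3.0 cache"])
for name, passes in runs.items():
    avg = mean(passes)
    print(f"{name}: {avg:.1f} FPS avg ({(avg / baseline - 1) * 100:+.1f}% vs 4.0 GHz)")
```

So on these passes, 4.8 GHz is worth roughly 7-8% average FPS over 4.0 GHz, and the faster cache adds a few percent more, which supports the "yes, in some cases CPU-bottlenecked" answer.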


----------



## CallsignVega

Hmm, looks like I can put on the CLU very thin, so it should stay put I think.


----------



## criminal

Quote:


> Originally Posted by *DNMock*
> 
> Eraser putty?
> 
> edit: what about a thin adhesive thermal pad?


Both good ideas i think.








Quote:


> Originally Posted by *CallsignVega*
> 
> Hmm, looks like I can put on the CLU very thin, so it should stay put I think.


Awesome.


----------



## CRITTY




----------



## aylan1196

Hi, is voltage unlocked for the Titan X Pascal in Afterburner? I can't seem to use it.


----------



## Seyumi

Just got my Titan XP, even though I ordered it on Monday AM and chose overnight AM delivery (2 days for Nvidia to ship and 2 days for FedEx to deliver). First world problems I guess lol.

Interestingly enough, my Titan XP benefits more from memory overclocks than core overclocks.

Unigine Valley 4K Ultra x8 AA. 100% fan speed. Computer specs below.

Stock boost speeds: 1822 boost, 1784 final, 67C, 44.2 FPS
+270 max overclock: 2100 boost, 2050 final, 70C, 46.8 FPS
+2.6 FPS or +5.9% increase

+800 memory overclock (11.6 Gbps effective): 2100 boost, 2037 final, 73C, 51.1 FPS
+4.3 FPS or +9.2% increase

Can't wait to get an EVGA AIO water cooler on this thing. I used MSI Afterburner, but I've been using EVGA Precision for years. I couldn't find a way to unlock the slider bar to increase core voltage. I even went into the settings and checked the "unlock core voltage" box.
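The memory figure works out like this. A minimal sketch, assuming Afterburner displays the TXP's GDDR5X at ~5005 MHz with an effective data rate of double the displayed clock — the 5005 MHz baseline is my assumption about how Afterburner reports Pascal memory clocks, not something stated in the post:

```python
# Convert an Afterburner memory-offset to an effective GDDR5X data rate,
# and recompute the FPS uplifts from the posted Valley numbers.

STOCK_MEM_MHZ = 5005  # Afterburner-displayed memory clock (assumed)

def effective_gbps(offset_mhz):
    """Effective GDDR5X data rate for a given Afterburner memory offset."""
    return (STOCK_MEM_MHZ + offset_mhz) * 2 / 1000

def uplift(new_fps, old_fps):
    """Percent FPS gain of new_fps over old_fps."""
    return (new_fps / old_fps - 1) * 100

print(f"+800 MHz offset -> {effective_gbps(800):.1f} Gbps effective")
print(f"core OC:   {uplift(46.8, 44.2):+.1f}% over stock")
print(f"memory OC: {uplift(51.1, 46.8):+.1f}% over the core OC alone")
```

Under that assumption, +800 MHz lands on the ~11.6 Gbps figure quoted above, and the memory overclock is indeed the larger of the two gains.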


----------



## DNMock

Quote:


> Originally Posted by *criminal*
> 
> Both good ideas i think.
> 
> 
> 
> 
> 
> 
> 
> 
> Awesome.


I think I'll use some eraser putty to make a small form around it to prevent any overspill to places it shouldn't go, then use an adhesive thermal pad to cover it up, now that I think about it.


----------



## NoDoz

Quote:


> Originally Posted by *aylan1196*
> 
> Hi is voltage unlocked for Titan x p in afterburner ? Can't seem to use it ?


I don't think it works yet. The option is there but it does not turn it on.


----------



## aylan1196

EVGA Precision doesn't support the Titan X Pascal yet. I couldn't start the program; I get a message that only the GTX 1060/1070/1080 is supported.


----------



## DADDYDC650

1080 Ti to be announced very soon? Youtubers are at an Nvidia event in Europe. Notice the Nvidia logo and the number 10 in the background in the image below.


----------



## DNMock

Questions:

1. Memory modules on the back of the card like old titan or all on the front this time?

2. Hunting down some Fujipoly Extreme, and I can't remember for the life of me how much I need per card. Something like 350 mm^2 of the 1mm thickness for the memory modules and another 200 mm^2 of the 0.5mm for the VRM; then (since I'm using the stock backplate and not waiting another month) another 300 mm^2 of the 3mm thickness per card. Does that sound about right?


----------



## bee144

Quote:


> Originally Posted by *Seyumi*
> 
> Just got my Titan XP, even though I ordered it on Monday AM and chose overnight AM delivery (2 days for Nvidia to ship and 2 days for FedEx to deliver). First world problems I guess lol.
> 
> Interestingly enough, my Titan XP benefits more from memory overclocks than core overclocks.
> 
> Unigine Valley 4K Ultra x8 AA. 100% fan speed. Computer specs below.
> 
> Stock boost speeds: 1822 boost, 1784 final, 67C, 44.2 FPS
> +270 max overclock: 2100 boost, 2050 final, 70C, 46.8 FPS
> +2.6 FPS or +5.9% increase
> 
> +800 memory overclock (11.6 Gbps effective): 2100 boost, 2037 final, 73C, 51.1 FPS
> +4.3 FPS or +9.2% increase
> 
> Can't wait to get an EVGA AIO water cooler on this thing. I used MSI Afterburner, but I've been using EVGA Precision for years. I couldn't find a way to unlock the slider bar to increase core voltage. I even went into the settings and checked the "unlock core voltage" box.


+1 for a couple of your points.

If NVIDIA isn't going to allow EVGA to sell the Titan XP, then they should develop their own software and include it with the card. I can't imagine EVGA being motivated to include support for a card they don't even sell. That being said, I've used Precision since I bought my GTX 275, so it's hard switching over to Afterburner. Hopefully EVGA will add support out of the kindness of their hearts?

I'm also waiting for the official EVGA Titan XP Hybrid upgrade kit. They're just now getting the 1080/1070 Hybrids out, so we probably still have another month or two. This card gets incredibly hot compared to the OG Titan.

Also, lucky you! I have two Titan XPs in SLI, but I'm maxing out at +175MHz on the core and +400MHz on the memory. Anything 50MHz higher and BF4 and TimeSpy crash.


----------



## bee144

Quote:


> Originally Posted by *DADDYDC650*
> 
> 1080 Ti to be announced very soon? Youtubers at a Nvidia event in Europe. Notice the Nvidia logo and the number 10 in the background in the image below.


No, NVIDIA will wait until at least September 2nd to keep all of the Titan XP owners from returning their cards. Also, NVIDIA has had multiple Order of 10 events taking place; they don't seem to be an indicator of a new product, just another way to create marketing hype and give away free stuff.


----------



## bee144

Quote:


> Originally Posted by *DNMock*
> 
> Questions:
> 
> 1. Memory modules on the back of the card like old titan or all on the front this time?
> 
> 2. Hunting down some fujipoly extreme and I can't remember for the life of me how much I need per card. Something like 350 mm^2 of the 1mm thickness for the memory modules and another 200 mm^2 of the .5 for the VRM, then (since I'm using the stock backplate and not waiting another month, that would be another 300mm^2 of the 3mm thickness per card, that sound about right?


I believe someone earlier in the thread stated the memory is only on the front this time.


----------



## DADDYDC650

Quote:


> Originally Posted by *bee144*
> 
> No, NVIDIA will wait till at least September 2nd to avoid all of the Titan Xp owners from returning their cards. Also, NVIDIA has had multiple order of 10 events taking place. They don't seem to be an indicator of a new product. Just another way to create marketing hype and giveaway free stuff.


So you think those Youtubers flew all the way to Europe to celebrate existing cards and to get free stuff?


----------



## cookiesowns

Quote:


> Originally Posted by *DADDYDC650*
> 
> 1080 Ti to be announced very soon? Youtubers at a Nvidia event in Europe. Notice the Nvidia logo and the number 10 in the background in the image below.


Launch of the 10-series mobile GPUs is my guess.


----------



## mouacyk

Quote:


> Originally Posted by *DADDYDC650*
> 
> So you think those Youtubers flew all the way to Europe to celebrate existing cards and to get free stuff?


Yes. They are being treated for helping to sell the 1080 and Titan XP at exorbitant prices.


----------



## EniGma1987

Quote:


> Originally Posted by *DNMock*
> 
> Questions:
> 
> 1. Memory modules on the back of the card like old titan or all on the front this time?


All on the front:


----------



## bee144

Quote:


> Originally Posted by *mouacyk*
> 
> Yes. They are being treated for helping to sell the 1080 and Titan XP at exorbitant prices.


Yep, agreed. NVIDIA hasn't hosted launch events outside of the US, so I'd say this is a way of buttering up the reviewers for giving positive reviews and creating a huge hype train for the 1080, which they still can't keep in stock.


----------



## DADDYDC650

Quote:


> Originally Posted by *cookiesowns*
> 
> Launch of 10 series mobile GPU's is my guess.


That would be my guess as well but a 1080 Ti announcement sounds more exciting.


----------



## DNMock

Quote:


> Originally Posted by *EniGma1987*
> 
> All on the front:


Sweet, thanks. Really no need for extra cooling on the backplate then; maybe the VRM, but that's about it.

Anyone other than PPCs and eBay selling Fujipoly online?


----------



## DADDYDC650

Quote:


> Originally Posted by *mouacyk*
> 
> Yes. They are being treated for helping to sell the 1080 and Titan XP at exorbitant prices.


Sounds like a conspiracy theory.


----------



## mouacyk

Quote:


> Originally Posted by *DADDYDC650*
> 
> Sounds like a conspiracy theory.





Spoiler: Warning: Spoiler!



That would be my guess as well but a 1080 Ti announcement sounds more exciting.



Nvidia: "How much can we charge for the 1080 Ti, guys? C'mon, you know it's irresistible!"
All: "Over 9,000"


----------



## cg4200

Quote:


> Originally Posted by *DNMock*
> 
> sweet, thanks, really no need for extra cooling on the back plate then, maybe the VRM, but that's about it.
> 
> Anyone other than PPC's and Ebay that sell fujipoly online?


I get mine on Amazon. And I believe it is all 1mm; someone correct me if I am wrong...


----------



## Seyumi

Quote:


> Originally Posted by *bee144*
> 
> +1 for a couple of your points.
> 
> If NVIDIA isn't going to allow Evga to sell the Titan xp then they should develop their own software and include it with the card. I can't imagine Evga being motivated to include support for a card they don't even sell. With the being said, I've used precision since I bought my gtx 275 so it's hard switching over to afterburner. Hopefully Evga will add support out of the kindness of their hearts?
> 
> I'm also waiting for the official Evga Titan xp hybrid upgrade kit. They're just now getting the 1080/1070 hybrids out so we probably still have another month or two. This card gets incredibly hot compared to the OG Titan.
> 
> Also, lucky you! I have two Titan Xp in sli but I'm maxing out at +175mhz on the core and +400mhz on the memory. Anything 50mhz higher and BF4 and TimeSpy crash.


I spoke too soon. Looks like +240 core and +450 memory for stability. I ran a few games and was crashing and artifacting like crazy on my previous overclock. Strange, since usually as long as Valley runs without any problems, my games do as well.


----------



## KillerBee33

Has anyone else noticed different boost clocks when using the same exact AB settings? I'm doing +215 on the core all the time, but I've seen 3DMark reporting 2073-2088, and the last run 2101.


----------



## DooRules

Quote:


> Originally Posted by *aylan1196*
> 
> Evga precision doesn't support Titan x pascal yet I couldn't start the programs I get this msg only gtx 1060 1070 1080 is supported


That is only the case for the latest version, which is for the 10-series cards. The previous version does see the card and allows overclocking.


----------



## cg4200

Quote:


> Originally Posted by *EniGma1987*
> 
> *Shunt mod tutorial:*
> 
> *Warning:* Coollaboratory Liquid Ultra (and Pro) are conductive thermal interface materials. We use this conductivity to lower the resistance in the card to do this mod, but it also means the material is dangerous and can destroy your graphics card if it spills and touches another component on the GPU. I take no responsibility if you break your own (or someone else's) GPU with this mod. If you choose to do the mod, be careful.
> NOTE: you can use other liquid metal TIMs as well. Thermal Grizzly Conductonaut works fine too
> 
> EDIT: Criminal gave me a link this morning to a video where der8auer does this mod as well. I posted it at the end of this tutorial if you want to see someone apply the liquid metal in a video rather than just pictures.
> 
> 1) Start off by removing the backplate and cooler from your GPU. To do this you will need a size #0 Phillips screwdriver and a 4mm socket wrench. Once all the screws and bolts are out, the cooler should come off relatively easily, so if it is not coming off then you probably forgot a screw somewhere. Likely places that get forgotten are on the end of the card with the display outputs.
> 
> 2) Once the cooler is removed your card should look like this:
> 
> 
> The three resistors on the card are circled in red. We are only going to mod the top two resistors.
> 
> 3) Place a small bit of Liquid Ultra on the two top shunt resistors. These resistors have a case label of "5MO". You don't need a ton of CLU, just a small dab that you will spread out. See the picture here for how much I applied:
> 
> 
> 
> 4) Now use the brush included in the CLU package to spread the CLU over the whole top of the resistors. Be careful not to spill it off the resistor. If the CLU spills off the top of the resistor it will most likely destroy your graphics card. Be careful not to use too much CLU as using too much will cause a spill as well when you put the graphics card back together. There is an inductor very close to resistor #2 (RS2) that can easily be spilled onto. Be very careful with this step.
> 
> 
> 
> Your graphics card should look like the above picture now. Put the cooler back onto the card and you are done!
> 
> To remove this mod from your graphics card, simply wipe away the CLU. If the Liquid Ultra has hardened, you may be able to use a hair dryer for a very brief period to liquefy the material again so that you can remove it.
> 
> I showed a screenshot of before and after the mod in 3DMark earlier to show the mod works. Graphics score rose by 223 points in FS Ultra with the mod. I have now been playing a video game for the past 3 hours since doing the mod and posting this with no problems.
> 
> By covering both the resistors on the Titan X you should increase your power limit by about 30-35%. This will give you some additional headroom for overclocking so the card doesn't throttle down due to the power limit. However, the Titan X (Pascal) has a very poor VRM setup on the GPU. It is not rated very high, and adding 30% more power from this mod, combined with setting your PT slider to 120%, will allow the GPU to draw what I consider to be unsafe levels of power to the core. This mod should let you draw around 390 watts of power, but I don't consider anything over 350w on the Titan X to be safe. Just because you *allow* the card to draw that much power doesn't mean it will, of course. However, to be safe, you may want to limit your power target slider to only 110%. That, combined with this mod, should let you draw around 350 watts max before the limit kicks in, which will keep your card's VRM from frying.
> 
> As you can see, this mod is for the Titan X, and der8auer's video is for the GTX 1070 and 1080. These mods work for any Nvidia card going back to at least the 700 generation, and probably future generations as well. Not all cards will have 3 shunt resistors; you may only have 2 because of a single power connector input on your GPU. If you only have 1 power connector, only apply CLU to the one resistor at the top.
> 
> Der8auer's video tutorial for modding the power limit on these cards:


Great write-up, man. I have a couple of questions: why did you only do two and not mod the third 5MO also?
Is it to not put too much power through the card?
Also, after reading that some people say CLU should not be used on a vertical card for fear of it rolling off, what is your take? I usually use it on delids and it gets hard; I've never tried a GPU.
I tried an old resistor with Arctic Silver 5 and then a blow dryer; it tacked up pretty well. Do you think, being polysynthetic silver, it would work?
I am getting my waterblock next week and I have more CLU on order, but my card is vertical and I don't want to take a chance on shorting it out. Thanks.
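The power-limit arithmetic in the tutorial above can be sketched in a few lines. This is an illustration only: the 5 mΩ shunt value and the ~250 W reference board power come from the tutorial and the card's spec, but the parallel-resistance value for the liquid metal path is a made-up number chosen to reproduce the ~30% figure, not a measurement:

```python
# Why the shunt mod raises the power ceiling: the card estimates current
# as V/R across its 5 mOhm shunt resistors. Liquid metal painted over a
# shunt adds a parallel conduction path, lowering the effective resistance,
# so the controller under-reads power by R_shunt / R_effective.

BASE_TDP_W = 250       # Titan X (Pascal) reference board power
R_SHUNT = 0.005        # 5 mOhm shunt resistor, per the tutorial
R_CLU = 0.0167         # hypothetical parallel path through the liquid metal

r_eff = 1 / (1 / R_SHUNT + 1 / R_CLU)   # parallel combination
underread = R_SHUNT / r_eff              # actual power / reported power

def actual_power(pt_slider_pct):
    """Real draw when the card believes it is at its power target."""
    return BASE_TDP_W * pt_slider_pct / 100 * underread

print(f"under-read factor: {underread:.2f}x")
print(f"PT 120%: ~{actual_power(120):.0f} W actual")
print(f"PT 110%: ~{actual_power(110):.0f} W actual")
```

With that assumed parallel path, the under-read factor comes out to about 1.30x, so a 120% slider corresponds to roughly 390 W of real draw and a 110% slider to roughly 357 W, matching the tutorial's advice to back the slider off to 110% for safety.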


----------



## CallsignVega

Alright, this CLU spreads VERY thin! That is a good thing for us vertical mount people. I used a very tiny amount of CLU, about the size of a needle eye's worth. Used the included brush to ever so carefully brush the whole resistor. Basically, the incredibly small amount of weight of the used CLU could never overcome the surface tension and "fall off". Only way it should come off is to wipe it.


----------



## mouacyk

Quote:


> Originally Posted by *CallsignVega*
> 
> Alright, this CLU spreads VERY thin! That is a good thing for us vertical mount people. I used a very tiny amount of CLU, about the size of a needle eye's worth. Used the included brush to ever so carefully brush the whole resistor. Basically, the incredibly small amount of weight of the used CLU could never overcome the surface tension and "fall off". Only way it should come off is to wipe it.


Sounds reasonable. Now, get us some clocks


----------



## criminal

Quote:


> Originally Posted by *CallsignVega*
> 
> Alright, this CLU spreads VERY thin! That is a good thing for us vertical mount people. I used a very tiny amount of CLU, about the size of a needle eye's worth. Used the included brush to ever so carefully brush the whole resistor. Basically, the incredibly small amount of weight of the used CLU could never overcome the surface tension and "fall off". Only way it should come off is to wipe it.


Holy heatsinks Batman!









Looks like a winner to me. Curious to see your numbers.


----------



## junknown

Update: Got the EVGA Hybrid cooler installed. Easier than expected. Nvidia's stock thermal solution is basically moist clay; it might be worth it to some to crack the card open just to replace it.

Heaven used to shoot my GPU straight into the 80s on a pretty steady incline; now it's struggling, as I type this, to break 55°C at 50% fan speed. Now to try some overclocking B-). Thanks to Gary2015 for the quick responses.


----------



## Menthol

Quote:


> Originally Posted by *DooRules*
> 
> That is only for the lastest version for 10 series cards. The previous version does see the card and allow overclocking.


I tried using this version to see if I could enable K-Boost, to no avail. Does it work for you?


----------



## DooRules

No, K-Boost will not work for me either; all else is good, same as AB.


----------



## cg4200

Quote:


> Originally Posted by *CallsignVega*
> 
> Alright, this CLU spreads VERY thin! That is a good thing for us vertical mount people. I used a very tiny amount of CLU, about the size of a needle eye's worth. Used the included brush to ever so carefully brush the whole resistor. Basically, the incredibly small amount of weight of the used CLU could never overcome the surface tension and "fall off". Only way it should come off is to wipe it.


Thanks! Great to see. That's the way I will go when I get my waterblock: thin layer, call it a day...
Hopefully at some point we will be able to mod the BIOS.


----------



## toncij

Quote:


> Originally Posted by *junknown*
> 
> Update: Got the EVGA Hybrid Cooler installed. Easier than expected. nVidia's stock thermal solution is basically moist clay, it might be worth it to some to crack it open just to replace it.
> 
> Heaven used to shoot my GPU straight into the 80's pretty much on a steady incline, now it's struggling as I type this to break 55c at 50% fan speed. Now to try some overclocking B-). Thanks to Gary2015 for the quick responses.


Well, it's water... My TX Maxwell can't go above 65°C with an H115i on it running 1.55GHz.














I presume the TXP can do much better.

The Youtubers came for their free cards, of course. Mobile Pascal announcement pending.


----------



## DarkIdeals

Just for the record, those of you who are actually going to use the Shunt mod; PLEASE for the love of god only short ONE shunt at a time and then check your system after each one. If one shunt shorted, or two shunts shorted is enough to supply you the power you need then you really don't want to do more. I've had personal experience with these type of mods and i can say that each shunt you mod increases the risk exponentially of frying the relatively weak VRM's. I've found in the Maxwell TITAN X's that using just two shunts was more than enough for me to get 1,550mhz on both cards with stock bios; doing three often provides little to no benefit but increases risk.

The Pascal chips, despite having a nicer 7+2 phase design, use weaker E6930's made by A&O rather than the older chips that had a much better low and high end amp maximum. iirc these are rated for only ~20 amp low and like 75 amp high, which means that using more than maybe 340-350 watt will give trouble; and using any more than 1.1v - 1.125v or so core voltage is risking blowing the VRM's as well.

You also risk the card being forced out of 3d mode by Nvidia's software detecting too low power if you raise the TDP limit too much; meaning your card could permanently be stuck in "2d mode" with very low mhz frequency cap.

Jus' sayin...
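For anyone doing the math before deciding, here's a rough sketch of why the mod works and how fast the real numbers climb. The 5mΩ shunt value and 250W board limit below are assumptions for illustration, not measured figures from this card:

```python
# The controller infers current from the voltage drop across each shunt,
# so lowering the effective shunt resistance scales its power reading down
# and raises the real ceiling by the same factor.

def reported_power(actual_watts, r_nominal=0.005, r_effective=0.005):
    """Power the controller *thinks* is flowing (assumed 5 mOhm shunts)."""
    return actual_watts * (r_effective / r_nominal)

def real_limit(board_limit_watts, r_nominal=0.005, r_effective=0.005):
    """Actual draw at which the controller finally sees its limit."""
    return board_limit_watts * (r_nominal / r_effective)

# Assumed 250W board limit at a 120% power target: halving the effective
# shunt resistance lets through roughly double the real power.
print(real_limit(250 * 1.20, r_effective=0.0025))  # 600.0
```

Which is exactly why checking after each shunt matters: the VRM sees the real watts, not the reported ones.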

On a side note, I'm trying to decide whether to get my 2nd TITAN XP or order waterblocks first. I won't have the money for both for another week or so, so I'm wondering if I should just order the two waterblocks so I won't have to deal with the crazy temps and noise on my current T-XP, then get the 2nd one later. I mean, I'll have to buy one of the EK terminals too, so it's definitely gonna cut into the cash I have currently. I'll have ~$1,380 free in a few days, which, if I get those blocks, won't leave me enough for the 2nd T-XP... decisions decisions lol


----------



## mbze430

This would be a reason why the 1080 Ti doesn't need to exist:

*Nvidia - $1.43 Billion revenue beats expectations with Pascal Graphics chips*.

http://venturebeat.com/2016/08/11/nvidias-1-43-billion-in-revenue-beats-expectations-as-pascal-graphics-chips-launch/


----------



## EniGma1987

Quote:


> Originally Posted by *KillerBee33*
> 
> Has anyone else noticed different Boost Clocks when using same exact AB settings? Im doing +215 on the core all the time but i've seen 3DMark reporting 2073-2088 and last one 2101


Your card is throttling, most likely from the power target. My card used to be like that too. 3DMark was mostly power-target throttle for me, and some other benchmarks were a temperature throttle. Games seemed to be a combination of both, but hit the temp limit a bit harder. Since I replaced the default TIM with Kryonaut and did the shunt mod, games don't normally reach the 80c mark and start temp throttling, and they no longer PT throttle either.

Quote:


> Originally Posted by *cg4200*
> 
> Great write-up man.. I have a couple of questions: why did you only do two and not mod the third 5mΩ also?
> Is it to not put too much power through the card?
> Also, after reading, some people say CLU should not be used vertically for fear of it rolling off.. what is your take? I usually use it on delids and it gets hard.. never tried a GPU.
> I tried an old resistor with Arctic Silver 5 and then a blow dryer; it tacked up pretty good. Do you think, being polysynthetic silver, it would work?
> I am getting my WB next week and I have more CLU on order, but my card is vertical and I don't want to take the chance of shorting out.. thanks


Covering only the top 2 resistors already results in a power limit capable of damaging the card when you turn the PT slider up to 120%. No need to cover the 3rd resistor when we are already at those kinds of power levels. Not modding the last resistor also helps make sure resistance doesn't get too low on the circuit and kick the card into fault mode, locking you at 135MHz. This fault mode is also why you cannot just solder little wires across the resistors for a more permanent and safer mod: the resistance gets too low and the card faults.

I see you already read Vega's post right below yours for his thoughts on the vertical mounting of the card.

CLU is made from some sort of gallium mix, or maybe just straight gallium, if I am not mistaken. I do know it dries out over time, but I have no idea how fast that will happen on these resistors. I did suggest using a hair dryer and maybe some ice to do a bunch of forced thermal cycles to see if that might help the process along quickly.

Quote:


> Originally Posted by *CallsignVega*


Decided to do all 3 resistors huh? I guess maybe since you are using a bit thinner layer that might be fine because a thinner layer would also mean each resistor still has more resistance on it (theoretically anyway). I still only recommend shorting 2 of them as that will let you reach up to almost 400 watts of power target which is past what I consider the safe level on this card. It has a VERY weak VRM.
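As a numeric aside on the thin-vs-thick point (sketch only — the bridge resistances below are made-up values, since nobody here has measured a CLU layer): the CLU sits in parallel with the shunt, so a thinner, more resistive bridge pulls the effective resistance, and therefore the power reading, down less:

```python
def parallel(r1, r2):
    """Effective resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

R_SHUNT = 0.005  # assumed 5 mOhm nominal shunt

for r_bridge in (0.05, 0.01, 0.005):  # thin -> thick CLU layer (made-up values)
    ratio = parallel(R_SHUNT, r_bridge) / R_SHUNT
    print(round(ratio, 2))  # fraction of the true power the controller reads
```

So a near-solid "hump" of CLU cuts the reading roughly in half, while a thin smear may only shave ~10% off, which lines up with a thin layer being the gentler option.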


----------



## KillerBee33

Quote:


> Originally Posted by *EniGma1987*
> 
> Your card is throttling, most likely from power target. My card used to be like that too. 3DMark was mostly power target throttle for me and some other benchmarks were a temperature throttle. Games seemed to be a combination of both but used to be hitting temp a bit harder. Since I replaced the default TIM with Kryonaut and did the shunt mod games do not normally reach the 80c mark and start temp throttling and they no longer PT throttle either.


In any of the 4 3DMark tests it won't go higher than 70 degrees, but how can it get to 2101, or even to 2088, which is what I usually see from just +215 in AB?
I see people post +250 and even +275 on the core, so I decided to ask.


----------



## CRITTY

Quote:


> Originally Posted by *Gary2015*
> 
> Judging from another thread, there are a few people scoffing at TXP owners and waiting on their 1080Ti which they are sure will be released soon.


Yeah, a lot of people want to tell "us" what something is worth to "us". I am enjoying my cards, and if the Ti comes out, I hope they enjoy it as well.


----------



## lilchronic

Quote:


> Originally Posted by *KillerBee33*
> 
> In any of the 4 3DMark tests it wont go higher than 70 Degrees, but how can it get to 2101 or even to 2088 which is what i usually see from just +215 in AB ?
> I see people post +250 and even +275 , on the core so decided to ask


What is your max boost clock with no overclock, just the 120% power limit?

If it's 1886MHz and you add +215MHz, you should get a max boost clock of 2101MHz.
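Spelled out as a trivial sketch (the 1886MHz stock boost is just an example figure; every card's bin differs):

```python
# The offset slider adds on top of whatever the card's own stock
# max boost happens to be -- it is not an absolute clock target.
def expected_boost(stock_max_boost_mhz, offset_mhz):
    return stock_max_boost_mhz + offset_mhz

print(expected_boost(1886, 215))  # 2101
```

That's why the same +200 gives different final clocks on different cards.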


----------



## KillerBee33

Quote:


> Originally Posted by *lilchronic*
> 
> What is your max boost clock with no overclock just 120% power limit ?
> 
> So if it's 1886Mhz and you add +215Mhz you should get a max boost clock of 2101Mhz


Honestly I only ran a single test @ stock and I can't remember what it was; will check that tonight.








This Boost 3.0 is friggin' confusing, especially if you compare it to a 1080 @ 1600 stock.


----------



## lilchronic

Quote:


> Originally Posted by *KillerBee33*
> 
> Honestly i only ran a single test @ stock and i cant remember what it was , will check that tonight


As long as the card is not power throttling or temperature throttling, that's how it should work.


----------



## KillerBee33

Quote:


> Originally Posted by *lilchronic*
> 
> As long as the card is not power throttling or temperature throttling, that's how it should work.


VRel and Pwr most of the time even with Power set to 120


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> VRel and Pwr most of the time even with Power set to 120


For example, +200 on one card won't give the same OC as on another; it depends on your stock max boost.... I think that gives an idea as far as ASIC quality. Mine boosts to 1860, which I think is about average....


----------



## Jpmboy

Quote:


> Originally Posted by *cg4200*
> 
> Thanks Great to see that's the way I will go when I get my wb thin layer call it a day...
> Hopefully at some point we will be able to mod bios


let us know how it works.. the original guys that did this on the 1080 still say that you need quite a large amount (like a "hump" of CLU) to actually lower the resistance enough to make a difference... but hey, it's pretty empirical at this point. May the PCB gods smile upon you!








Quote:


> Originally Posted by *KillerBee33*
> 
> In any of the 4 3DMark tests it wont go higher than 70 Degrees, but how can it get to 2101 or even to 2088 which is what i usually see from just +215 in AB ?
> I see people post +250 and even +275 , on the core so decided to ask


lower the card temp (somehow). Air cooled I could not run Timespy over 2050 (steady clock). Water cooled - they both do 2010 as a solid line. Unfortunately, I pulled the 950 NVMe from this rig to install W7 on a RAID 0 and didn't copy the pictures folder over to a shared drive.








The TXP responds very well to temps below 35C.


----------



## lilchronic

Quote:


> Originally Posted by *KillerBee33*
> 
> VRel and Pwr most of the time even with Power set to 120


Quote:


> Originally Posted by *Jpmboy*
> 
> let us know how it works.. the original guys that did this on the 1080 still say that you need quite a large amount (like a "hump" of CLU) to actually lower the resistance enough to make a difference... but hey, it's pretty empirical at this point. May the PCB gods smile upon you!
> 
> 
> 
> 
> 
> 
> 
> 
> lower the card temp (somehow). Air cooled I could not run Timespy over 2050 (steady clock). Water cooled - they both do 2010 as a solid line. Unfortunately, I pulled the 950 NVMe from this rig to install W7 on a RAID 0 and didn't copy the pictures folder over to a shared drive.
> 
> 
> 
> 
> 
> 
> 
> 
> The TXP responds very well to temps below 35C.


Well, I guess he's hitting the power target. ^^

As well as throttling from temps at anything over 45c? I'm pretty sure I remember you saying that. Is that really what they did?


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> .


Got a brand new Hybrid kit for the 10 series; just feel like 70 degrees and limited OC options aren't enough reason to use it, just to get, let's say, 20MHz more stability.








This particular run made me ask questions http://www.3dmark.com/spy/254233


----------



## pompss

Quote:


> Originally Posted by *Jpmboy*
> 
> let us know how it works.. the original guys that did this on the 1080 still say that you need quite a large amount (like a "hump" of CLU) to actually lower the resistance enough to make a difference... but hey, it's pretty empirical at this point. May the PCB gods smile upon you!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> lower the card temp (somehow). Air cooled I could not run Timespy over 2050 (steady clock). Water cooled - they both do 2010 as a solid line. Unfortunately, I pulled the 950 NVMe from this rig to install W7 on a RAID 0 and didn't copy the pictures folder over to a shared drive.
> 
> 
> 
> 
> 
> 
> 
> 
> The TXP responds very well to temps below 35C.


One question:

The 2050MHz - are you reading it in GPU-Z or MSI Afterburner?
In GPU-Z I see 1800MHz, but in MSI around 2020MHz.

Which one is correct?

Trying to figure out how high my card can go.

Thanks jpmboy


----------



## lilchronic

Quote:


> Originally Posted by *pompss*
> 
> One question
> 
> 2050 mhz you reading it on gpu z or on msi afterburn ?
> On gpuz i see 1800 mhz but on msi around 2020 mhz
> 
> Which one is correct ?
> 
> Try to figure out how high my card can go.
> 
> Thanks jpmboy


Are you using the Sensors tab to see the max boost clock in GPU-Z? That should be the same as MSI Afterburner.


----------



## steponz

Guys.... Show TDP in GPU-Z to show what you're actually getting for power.

Showing different FS scores gives a bit of info, but to prove the mod works like you think, you need to track your max throttling in GPU-Z or Afterburner.
You could possibly only be dropping power a small amount.

I'll show a mod in the next couple of days that will drop your power to 40 percent. Which will be great for water and air.

I've been running this mod for the past couple of days while benching FS/FSE/FSU. I'm beating all the scores posted with chilled water, while on air......

The shunt mod will not kill the VRM.... There are extra protections that take care of overheating.... I've done the shunts on 40-something cards benching professionally and never lost one because of that. The reason you don't want to short more than 2 is that the controller will fault on boot and will either not boot or lock the MHz at a lower value. The Titan X Pascal seems to fault and just not boot.

Also, the VRM is weak, but not as weak as people are saying. If anybody has actually looked at what voltage is actually going to the card... it can go up to 1.12-1.13V. 1.2 and 1.3 should be fine on a card, but won't give you anything on air or water.. maybe on extremely chilled water/DICE and LN2. Remember, adding voltage bumps up the power percentage and also makes more heat, which lowers overclocking ability.

I'll be EPOWERing one in the next day or two.. working on MEM and PLL mods at the moment.
I'll test what giving it voltage actually does. I don't expect it will do much, same as with the 1080.. even on LN2, 1.4V didn't do anything.

Anyways.. Enjoy....
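In that spirit, one way to track max power over a whole run is to scan GPU-Z's sensor log, which is a plain CSV. A minimal sketch — the exact column header varies by GPU-Z version and card, so check your own log file and adjust the name:

```python
# Hedged sketch: pull the peak of one column out of a GPU-Z sensor log.
# The column name below is an assumption; match it to your log's header.
import csv

def column_peak(log_path, column="Power Consumption (% TDP)"):
    """Return the highest numeric value seen in the named log column."""
    peak = None
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f, skipinitialspace=True):
            cell = (row.get(column) or "").strip()
            try:
                value = float(cell)
            except ValueError:
                continue  # skip blank cells, "-" markers, etc.
            peak = value if peak is None else max(peak, value)
    return peak
```

If the peak sits pinned at your power-target slider value, you're still PT throttling no matter what the score says.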


----------



## pompss

Quote:


> Originally Posted by *lilchronic*
> 
> Are you using the Sensor Tab to see the max boost clock in GPU-z? That should be the same as msi afterburner.


Yep, the same.
Long time since I overclocked; have to learn it all again.


----------



## KillerBee33

Quote:


> Originally Posted by *lilchronic*
> 
> What is your max boost clock with no overclock just 120% power limit ?
> 
> So if it's 1886Mhz and you add +215Mhz you should get a max boost clock of 2101Mhz


Stock +120Pwr FS run


----------



## lilchronic

Quote:


> Originally Posted by *KillerBee33*
> 
> Stock +120Pwr FS run


nice









Wish I had one. Next month or two for sure I will, unless they announce a 1080 Ti or something.


----------



## KillerBee33

Quote:


> Originally Posted by *lilchronic*


Uhumm, I still don't get Boost 3.0. First it boosts itself ~40% over stock, and adding core also adds 1-to-1 on top of the boost; it didn't work the same in the 9 series, or I'm missing something.


----------



## lilchronic

Quote:


> Originally Posted by *KillerBee33*
> 
> Uhumm i still don't get the 3.0 Boost , First it Boosts itself to 40% over stock and adding Core is also adding 1 to 1 Boost, didn't work the same in 9Series or i'm missing something.


It's been the same way since the 600 series: whatever your max boost clock is, you add on top of that.

Each card is slightly different, boosting somewhere between 1800-1900MHz.

The Boost 1.0/2.0/3.0 bull crap is all just so the card can perform at its best while staying within the power/temp limits.
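A toy model of that "staying within the limits" behavior: when the card hits the power or temp cap it drops whole clock bins below the requested boost. The ~13MHz bin size is an approximation for Pascal, not an official spec:

```python
BIN_MHZ = 13  # approximate Pascal clock-bin size (assumption)

def effective_clock(requested_mhz, over_power=False, over_temp=False):
    """Requested boost minus one bin per limit currently being hit."""
    bins_dropped = int(over_power) + int(over_temp)
    return requested_mhz - bins_dropped * BIN_MHZ

print(effective_clock(2101))                  # no limit hit -> 2101
print(effective_clock(2101, over_power=True)) # one bin down -> 2088
```

Which would explain seeing 2101 in one run and 2088 in the next with identical slider settings.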


----------



## Jpmboy

Quote:


> Originally Posted by *lilchronic*
> 
> Well i guess he's hitting the power target.^^
> 
> as well as throttling from temp's at anything over 45c? Im pretty sure i remember you saying that. Is that really what they did?


Eh - excuse my typo... I meant steady at 2101. (Left hand is wrapped after smashing it up pretty good this morning in the garage.)








Quote:


> Originally Posted by *pompss*
> 
> One question
> 2050 mhz you reading it on gpu z or on msi afterburn ?
> On gpuz i see 1800 mhz but on msi around 2020 mhz
> Which one is correct ?
> Try to figure out how high my card can go.
> Thanks jpmboy


AB 4.3 beta and GPU-Z are reading the same.
Quote:


> Originally Posted by *steponz*
> 
> Guys .... Show TDP in GPUz to show what your actually getting for power.
> 
> Showing different FS scores.. gives a bit of info.. but to prove the mod works like you think you need to track your max throttling in GPUz or Afterburner.
> You could possibly only drop the a small amount.
> 
> Ill show a mod in the next couple of days that will drop your power to 40 percent. Which will be great for water and air.
> 
> Ive been running this mod for the past couple of days while benching FS/FSE/FSU Im beating all the scores posted with chilled water on air......
> 
> The shunt mod will not kill the vrm.... There are extra protections that take care of overheating.... Ive done the shunts on over 40 plus something cards benching professionally and never lost one because of that. The reason why you don't want to short more than 2 is that the controller will fault on boot and will either not boot or lock the mhz at a lower value. For Titan X Pascal it seems to fault and just not boot.
> 
> Also the vrm is weak, but not as weak as people are saying, If anybody has actually looked at what voltage is actually going to the card... it can go up to 1.12 to 1.13... 1.2 and 1.3 should be fine on a card, but won't give you anything on air or water.. maybe extremely chilled water/DICE and LN2. Remember adding voltage bumps up the power percentage and also makes more heat.. which lowers overclocking ability.
> 
> Ill be epowering one in the next day or two.. working on MEM and PLL mods at the moment.
> Ill test what giving voltage actually will do.. I don't expect it will do much as with 1080.. even on ln2.. 1.4 didn't do anything.
> 
> Anyways.. Enjoy....


thanks man! Looking forward to seeing how you are doing this.


----------



## KillerBee33

Quote:


> Originally Posted by *lilchronic*
> 
> It's been the same way since the 600series. what ever you max boost clock is you add on top of that.
> 
> Each card is slightly different boosting between 1800-1900Mhz
> 
> The boost 1.0,2.0,3.0 bull crap is all just so the card can perform at it's best while staying withing the power / temp limits.


Well, I'm still very skeptical about these brutal power mods ("shunt mod"); feel like I should really wait for at least a single modded BIOS to show up.


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> Stock +120Pwr FS run


My FS Ultra at stock. 
And FS... 
How is your memory usage lower than mine?


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> My FS Ultra at stock.
> and, FS ..
> how is you memory usage lower than mine?


You got something else running that takes up VRAM? What's your idle usage?


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> You got something else runing that takes up Vram? whats your Idle usage?


571, the same things running as when I bench....I don't do "tweaks" when I bench...


----------



## Stateless

I know I will get various answers to this, but what is the overall best thermal paste to use on the GPU? My waterblock should be here next Friday and I want to order some new thermal paste, but it has been a while since I had to use any, so I'm not sure what the best stuff around is anymore. Thanks for any and all feedback.

Also, other than protection and looks, since there are no memory chips on the back of the card, is there a need for the backplate? EK won't have backplates till the end of the month, but the blocks are out next week. It would be a pain in the ass to add the block and then two weeks later have to drain the loop to add the backplate.


----------



## carlhil2

Quote:


> Originally Posted by *Stateless*
> 
> I know I will get various answers to this, but what is the overall best thermal paste to use on the GPU? My Waterblock should be here next Friday and I want to order some new thermal paste, but it has been a while since I had to use some, so no sure what the best stuff around is anymore. Thanks for any and all feedback.
> 
> Also other than protection and looks, since there are no memory chips on the back of the card, is there a need for the back plate? EK wont have back plates till the end of the month, but the blocks are out next week. It would be a pain in the ass to add the block and then 2 weeks later have to drain to add the back plate.


I use PK-3...


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> 571, the same things running as when I bench....I don't do "tweaks" when I bench...


----------



## NoDoz

So what's the overall verdict on getting an AIO cooler for these? My max temp stays around 65-68 during benchmarking. Does throttling happen at those temps? I'm just not sure an AIO is going to help us get much better OCs; it would help with fan noise, but IMO that may be it.


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*


Lol, I have about 90 processes, that would explain it, thanks....


----------



## CallsignVega

Quote:


> Originally Posted by *steponz*
> 
> Guys .... Show TDP in GPUz to show what your actually getting for power.
> 
> Showing different FS scores.. gives a bit of info.. but to prove the mod works like you think you need to track your max throttling in GPUz or Afterburner.
> You could possibly only drop the a small amount.
> 
> Ill show a mod in the next couple of days that will drop your power to 40 percent. Which will be great for water and air.
> 
> Ive been running this mod for the past couple of days while benching FS/FSE/FSU Im beating all the scores posted with chilled water on air......
> 
> The shunt mod will not kill the vrm.... There are extra protections that take care of overheating.... Ive done the shunts on over 40 plus something cards benching professionally and never lost one because of that. The reason why you don't want to short more than 2 is that the controller will fault on boot and will either not boot or lock the mhz at a lower value. For Titan X Pascal it seems to fault and just not boot.
> 
> Also the vrm is weak, but not as weak as people are saying, If anybody has actually looked at what voltage is actually going to the card... it can go up to 1.12 to 1.13... 1.2 and 1.3 should be fine on a card, but won't give you anything on air or water.. maybe extremely chilled water/DICE and LN2. Remember adding voltage bumps up the power percentage and also makes more heat.. which lowers overclocking ability.
> 
> Ill be epowering one in the next day or two.. working on MEM and PLL mods at the moment.
> Ill test what giving voltage actually will do.. I don't expect it will do much as with 1080.. even on ln2.. 1.4 didn't do anything.
> 
> Anyways.. Enjoy....


Since voltage is still locked to stock, shorting 3 resistors shouldn't really pull that much more power eh? I could see the draw getting pretty hefty with max power target raised _and_ more voltage applied.


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> Lol, I have about 90 processes, that would explain it, thanks....










Time to clean up


----------



## KillerBee33

Quote:


> Originally Posted by *NoDoz*
> 
> So whats the overall verdict of getting a AIO cooler for these? My max temp stays around 65-68 during benchmarking. Does throttling happen at those temps? I'm just not sure a AIO is going to help us get alot better OCs, they would help with fan noise but imo that may be it.


Some claim unbearable noise from EVGA's kit, but I haven't heard a peep out of it. Easy, clean, silent, effective.


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> 
> 
> 
> 
> 
> 
> 
> Time to clean up


Or, I need to use a CLEAN Windows install to bench on...which I had, til I pulled my intel 750 drive...


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> Or, I need to use a CLEAN Windows install to bench on...


Just did one right after the Win10 Anniversary Update went live. Don't believe in OS upgrades.
https://www.microsoft.com/en-us/software-download/windows10
Anniversary included.


----------



## pompss

Finally I'm able to test one of my two Titans.

First card running Firestrike at 2075MHz on air. Seems pretty good, right?

Testing the second card now.


----------



## Maintenance Bot

Quote:


> Originally Posted by *NoDoz*
> 
> So whats the overall verdict of getting a AIO cooler for these? My max temp stays around 65-68 during benchmarking. Does throttling happen at those temps? I'm just not sure a AIO is going to help us get alot better OCs, they would help with fan noise but imo that may be it.


Just installed a 980 Ti Hybrid pump/rad tonight and instantly gained 76MHz.


----------



## Maintenance Bot

Quote:


> Originally Posted by *pompss*
> 
> Finally i'm able to test one of my two titan's
> 
> First card running firestrike 2075 mhz on air Seems pretty good right ?
> 
> testing Second card now


Yeah that is pretty fast.


----------



## pompss

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Yeah that is pretty fast.


Running the Titans on an old X79 system, so the scores are not the best.
Still waiting for EK to ship the waterblock; then I can run them on X99.

The second card seems to score higher.









2088MHz right now









I was really skeptical spending $1,200 on these cards, but I have to admit they are worth the money.

Can't wait to put them under water.


----------



## Stateless

Quote:


> Originally Posted by *pompss*
> 
> Running the titans on old system x79. score are not the best
> Still waiting for Ek to ship the waterblock then i can run it on X99
> 
> Second card seems score higher
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2088 mhz right now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was really skeptic spenditing $1200 for those card but i have to admit they are worth the money
> 
> Cant wait to put them underwater


I am running my Titan on X79; how much better would X99 be? I have a 3930K OC'd to 4.7.


----------



## CallsignVega

Ok, the very thin layer of CLU gave me a 10-15% bonus on the power target. Before, I was bouncing off the limit in the 115-120 range; I'm in the 100-108 range now. Makes for a more stable overclock.

Testing:


----------



## steponz

Quote:


> Originally Posted by *CallsignVega*
> 
> Since voltage is still locked to stock, shorting 3 resistors shouldn't really pull that much more power eh? I could see the draw getting pretty hefty with max power target raised _and_ more voltage applied.


Well, I'm running 42 max power.. 2100 with +700 memory with no issues on air.... Temps become the issue as I go up to 67.. If I can keep the temps down, like with water, I could gain some clocks.

Voltage goes up to 1.13, monitored by multimeter..


----------



## steponz

Quote:


> Originally Posted by *pompss*
> 
> Running the titans on old system x79. score are not the best
> Still waiting for Ek to ship the waterblock then i can run it on X99
> 
> Second card seems score higher
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2088 mhz right now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was really skeptic spenditing $1200 for those card but i have to admit they are worth the money
> 
> Cant wait to put them underwater


Likely throttling.. anything on air will throttle over 2000.. It will seem like you're getting a decent score, but it could be better... If you look at power, you're constantly hitting over 120 and it's dropping clocks.


----------



## steponz

Quote:


> Originally Posted by *CallsignVega*
> 
> Ok, the very thin layer of CLU gave me 10-15% bonus on the power target. Before when I was bouncing off the limit in the 115-120 range, I'm in the 100-108 range now. Makes for more stable overclock.
> 
> Testing:


Why don't you show GPUz?

Click on it twice to see max.. your likely still throttling a little bit.


----------



## carlhil2

WT..look at my TDP during my last FSU run..  131.3...


----------



## pompss

Quote:


> Originally Posted by *steponz*
> 
> Likely throttling.. anything on air will throttle over 2000.. it will seem like your getting a decent score.. but it could be better... if you look at power.. your constantly hitting over 120 and its dropping clocks.


Yes, it's throttling, but I can hit 2126MHz on air and keep going, while my other Titan was crashing over 2075.

With a waterblock and the power limit unlocked, I'm sure I will hit 2200.

I haven't even tried the mod yet.

This card is a beast, no doubt.


----------



## pompss

Quote:


> Originally Posted by *CallsignVega*
> 
> Ok, the very thin layer of CLU gave me 10-15% bonus on the power target. Before when I was bouncing off the limit in the 115-120 range, I'm in the 100-108 range now. Makes for more stable overclock.
> 
> Testing:


Nice Vega, I will try the mod as soon as I get the waterblock.

Getting 2138 on air and still no crash in Firestrike.

Will keep pushing.


----------



## carlhil2

Quote:


> Originally Posted by *pompss*
> 
> nice vega i will try the mod as soon i get the waterblock.
> 
> getting 2138 on air and still no crash on firestrike.
> 
> Will Keep pushing


Nice, I am jelly. you should be #1 with that OC..


----------



## Gary2015

Quote:


> Originally Posted by *DADDYDC650*
> 
> That would be my guess as well but a 1080 Ti announcement sounds more exciting.


No way, man. They wouldn't announce a Ti a month after the XP launch.


----------



## Gary2015

Quote:


> Originally Posted by *pompss*
> 
> yes it throttling but i can hit 2126 mhz on air and stil going while my other titan over 2075 was crashing.
> 
> with a waterblock and unlocking the power limit im sure i will hit 2200 .
> 
> I dont even try the mod yet.
> 
> This card is a beast no doubt


Real world increase ?


----------



## Jpmboy

thought this group might be interested in this: http://overclocking.guide/nvidia-hb-sli-bridge-technical-review/


----------



## pompss

Quote:


> Originally Posted by *carlhil2*
> 
> Nice, I am jelly. you should be #1 with that OC..


I hit my max, which is 2151MHz. This card is a beast.

Still waiting for the waterblock to be able to install the card in my main X99 rig.

I should upgrade to a 6800K or 6900K to score higher.


----------



## carlhil2

Quote:


> Originally Posted by *pompss*
> 
> I hit my max which is 2151 mhz .This card its a beast
> 
> Still waiting for the waterblock to be able to install the card on my main x99 rig
> 
> I should upgrade to a 6800k or 6900k to score high.


My vote goes to the 6900K. If my local MC had had them in stock a month ago, I would be pushing one now...


----------



## pompss

Quote:


> Originally Posted by *Gary2015*
> 
> Real world increase ?


Can't really say right now, since the card is throttling and it's installed in an old X79 rig, with no games installed on that rig either.

Also, my paid version of 3DMark is installed on my main X99 rig, but with Firestrike I score 16780 on X79, and memory is still at stock.

Tomorrow I will get the PCI Express riser, so maybe I will be able to run some tests on my main rig and post some scores, since I only have one PCI Express slot on my X99 ITX board.


----------



## Gary2015

Quote:


> Originally Posted by *Stateless*
> 
> I know I will get various answers to this, but what is the overall best thermal paste to use on the GPU? My Waterblock should be here next Friday and I want to order some new thermal paste, but it has been a while since I had to use some, so no sure what the best stuff around is anymore. Thanks for any and all feedback.
> 
> Also other than protection and looks, since there are no memory chips on the back of the card, is there a need for the back plate? EK wont have back plates till the end of the month, but the blocks are out next week. It would be a pain in the ass to add the block and then 2 weeks later have to drain to add the back plate.


That's why I cancelled and am waiting.


----------



## Gary2015

Quote:


> Originally Posted by *carlhil2*
> 
> My vote goes to the 6900k. if my local MC had them in a month ago, I would be pushing one now...


A 6800K OC'd at 4.5GHz is better for gaming.


----------



## ChrisxIxCross

Does anyone have any kind of update or know the current state of Pascal Bios Tweaker development? I know NVFlash has Pascal support but that's pretty much it.


----------



## carlhil2

Quote:


> Originally Posted by *Gary2015*
> 
> 6800k oc at 4,5hhz is better for gaming .


If he is just gaming, he should just go for the 6700K.... The 6800K and 6900K are the same chip, just one has more cores than the other. What's the difference between the two at 4.5, gaming-wise? Shouldn't the 6900K be as fast, or faster?


----------



## pompss

Quote:


> Originally Posted by *carlhil2*
> 
> If he is just gaming, he should just go for the 6700k....the 6800k and 6900k are the same chip, just one has more cores than the other. what's the difference between the 2 at 4.5, gaming wise? shouldn't the 6900k be as fast, or, faster







Not really.
I think the best option is the 6850K, which is $400 cheaper than the 6900K but scores higher in games and 3DMark. It's also only $200 more expensive than the i7 6800K, which is basically the i7 5820K I already have.

I think the 6850K is the best price-value option.

If you bench, then the 6950X, no question.

If you're gaming, then the i7 6700K is the best choice.


----------



## Godsarmy

Waiting on waterblock


----------



## CallsignVega

Quote:


> Originally Posted by *Jpmboy*
> 
> thought this group might be interested in this: http://overclocking.guide/nvidia-hb-sli-bridge-technical-review/


The difference between the bridges is frame time, not FPS. Not sure why he only tested for FPS.


----------



## Gary2015

Quote:


> Originally Posted by *carlhil2*
> 
> If he is just gaming, he should just go for the 6700k....the 6800k and 6900k are the same chip, just one has more cores than the other. What's the difference between the two at 4.5, gaming-wise? Shouldn't the 6900k be as fast, or faster?


The 6800k can support faster DDR4 speeds.


----------



## Gary2015

Quote:


> Originally Posted by *Godsarmy*
> 
> Waiting on waterblock


And backplate....


----------



## Gary2015

Quote:


> Originally Posted by *pompss*
> 
> 
> 
> 
> 
> 
> Not really.
> I think the best option is the 6850k, which is $400 cheaper than the 6900k but scores higher in games and 3DMark. It's also only $200 more expensive than the i7 6800k, which is basically the i7 5820k I already have.
> 
> I think the 6850k is the best option price/value-wise.
> 
> If you bench, then the 6950X, no question.
> 
> If you're gaming, then the i7 6700k is the best choice.


I have an X99 board, so a 6700k isn't doable.


----------



## mbze430

Quote:


> Originally Posted by *pompss*
> 
> I hit my max, which is 2151MHz. This card is a beast.
> 
> Still waiting for the waterblock to be able to install the card on my main X99 rig.
> 
> I should upgrade to a 6800k or 6900k to score higher.


6900k all the way, loving mine


----------



## DADDYDC650

Quote:


> Originally Posted by *pompss*
> 
> 
> 
> 
> 
> 
> Not really.
> I think the best option is the 6850k, which is $400 cheaper than the 6900k but scores higher in games and 3DMark. It's also *only $200 more expensive* than the i7 6800k, which is basically the i7 5820k I already have.
> 
> I think the 6850k is the best option price/value-wise.
> 
> If you bench, then the 6950X, no question.
> 
> If you're gaming, then the i7 6700k is the best choice.


Why pay $200 more for the exact same performance? The only difference is that you'll be able to run M.2 and NVMe at the same time, plus 16x/16x PCI-E, which performs the same as 16x/8x. If you buy a Gigabyte Designare board, you can run pretty much anything plus 16x/16x using a 6800k.


----------



## mbze430

Quote:


> Originally Posted by *carlhil2*
> 
> If he is just gaming, he should just go for the 6700k....the 6800k and 6900k are the same chip, just one has more cores than the other. what's the difference between the 2 at 4.5, gaming wise? shouldn't the 6900k be as fast, or, faster


Not quite. The 6800k only has 28 PCIe lanes, the 6900k 40 lanes... so for people like myself with a PCIe SSD and/or NVMe SSD, it's a big difference.
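The lane math is easy to sanity-check: with 28 lanes, two x16 GPUs plus an x4 NVMe drive don't fit, so the 6800k has to drop a slot to x8, while 40 lanes covers it. A quick sketch (the x16/x16/x4 widths are just an example allocation):

```python
# PCIe lane budget check: Broadwell-E 6800k has 28 CPU lanes, 6900k has 40.
# The x16/x16/x4 widths below are an example allocation, not a fixed rule.
def fits(cpu_lanes, *device_widths):
    """Return (whether the devices fit at full width, lanes requested)."""
    used = sum(device_widths)
    return used <= cpu_lanes, used

# Two GPUs at x16 plus one NVMe SSD at x4:
print(fits(28, 16, 16, 4))  # (False, 36) -> the 6800k drops a GPU to x8
print(fits(40, 16, 16, 4))  # (True, 36)
```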


----------



## ChrisxIxCross

Quote:


> Originally Posted by *Godsarmy*
> 
> 
> 
> Waiting on waterblock


Nice. I can't wait for my EK Nickel block to come in next week! Although their delay with the backplates is lame...


----------



## carlhil2

Quote:


> Originally Posted by *mbze430*
> 
> Not quite. The 6800k only has 28 PCIe lanes, the 6900k 40 lanes... so for people like myself with a PCIe SSD and/or NVMe SSD, it's a big difference.


You have it confused, I am not pushing the 6800k, I prefer the 6900k, but, if ONLY gaming, I would go 6700k. I was responding to someone who said to go 6800k. As someone with an Intel 750 SSD, I feel you..


----------



## ChrisxIxCross

Quote:


> Originally Posted by *Gary2015*
> 
> No way man. They wouldn't announce a Ti a month after the XP launch .


I'm not so sure there even will be a Ti released at all considering the TXP is already a cut down chip. Imo the TXP IS the Ti of this generation that's being sold as a Titan


----------



## pompss

Quote:


> Originally Posted by *DADDYDC650*
> 
> Why pay $200 more for the exact same performance? The only difference is that you'll be able to run M.2 and NVMe at the same time, plus 16x/16x PCI-E, which performs the same as 16x/8x. If you buy a Gigabyte Designare board, you can run pretty much anything plus 16x/16x using a 6800k.


If you're speaking about gaming performance then I agree the 6800k is more than enough, but since I have an i7 5820K it doesn't make much sense spending that money to get the same performance.

The 6850k would only be an 8% increase, the 6800k 4%.

Question is, how well do they overclock??

Honestly, neither is worth upgrading to if I can't push it over 4.7 or 4.8GHz.


----------



## DADDYDC650

Quote:


> Originally Posted by *pompss*
> 
> If you're speaking about gaming performance then I agree the 6800k is more than enough, but since I have an i7 5820K it doesn't make much sense spending that money to get the same performance.
> 
> The 6850k would only be an 8% increase, the 6800k 4%.
> 
> Question is, how well do they overclock??
> 
> Honestly, neither is worth upgrading to if I can't push it over 4.7 or 4.8GHz.


Where are you getting those percentage increases? The 6800k and 6850k perform exactly the same. The 200MHz difference is simply set by Intel to confuse noobs. Best options for gaming, if not encoding, are the 6700k, or the 6800k if you aren't taking advantage of running M.2 and NVMe at the same time. The Gigabyte Designare would fix that problem though.

BTW, the 6800k/6850k both max out at 4.5GHz. 4.6 if you are really lucky, but I'm sure that requires a lot of voltage.


----------



## steponz

Here's the new Firestrike Ultra Single Record up on Hall of Fame.

Only plus 200..... When you get to a certain point with clock, it doesn't get you a better score....

http://www.3dmark.com/fs/9740770

All Air.. beating guys on water and chilled water... Look at the gpu score.. thats what really matters........


----------



## carlhil2

Quote:


> Originally Posted by *steponz*
> 
> Here's the new Firestrike Ultra Single Record up on Hall of Fame.
> 
> Only plus 200..... When you get to a certain point with clock, it doesn't get you a better score....
> 
> http://www.3dmark.com/fs/9740770
> 
> All Air.. beating guys on water and chilled water... Look at the gpu score.. thats what really matters........


Dayum...


----------



## markklok

For the ones who fitted a EK universal gpu block..

Will just removing the default heatsink + backplate be enough, or do I have to strip it entirely to get it fitted?


----------



## carlhil2

Quote:


> Originally Posted by *markklok*
> 
> For the ones who fitted a EK GPU (mono) block..
> 
> Will just removing the default heatsink + backplate be enough, or do I have to strip it entirely to get it fitted?


Do you mean the universal GPU block? If so, I was going to go that route, but, looking at it, I wasn't sure that it would work with what my setup was going to be, so I ended up removing the entire thing. I am going FC soon, so I didn't mind...


----------



## DADDYDC650

Quote:


> Originally Posted by *steponz*
> 
> Here's the new Firestrike Ultra Single Record up on Hall of Fame.
> 
> Only plus 200..... When you get to a certain point with clock, it doesn't get you a better score....
> 
> http://www.3dmark.com/fs/9740770
> 
> All Air.. beating guys on water and chilled water... Look at the gpu score.. thats what really matters........


Nice score. What was your vram running at?


----------



## craftyhack

Quote:


> Originally Posted by *markklok*
> 
> For the ones who fitted a EK GPU (mono) block..
> 
> Will just removing the default heatsink + backplate be enough, or do I have to strip it entirely to get it fitted?


There really isn't anything to remove other than the OEM blower cooler and back plate, so that's it. I haven't opened my Titan XP yet, but if it is like the last 4 or 5 generations, as long as you only remove the screws you need to, the PCI bracket can stay on (the only other thing I can think of that you may be asking about).


----------



## pompss

Quote:


> Originally Posted by *steponz*
> 
> Here's the new Firestrike Ultra Single Record up on Hall of Fame.
> 
> Only plus 200..... When you get to a certain point with clock, it doesn't get you a better score....
> 
> http://www.3dmark.com/fs/9740770
> 
> All Air.. beating guys on water and chilled water... Look at the gpu score.. thats what really matters........


nice score








Tomorrow I will try to install the Titan on my X99 rig to see what graphics score I get.
For sure I can't beat your overall score, since my 5820 can't beat your 6950X.


----------



## markklok

Quote:


> Originally Posted by *craftyhack*
> 
> There really isn't anything to remove other than the OEM blower cooler and back plate, so that's it. I haven't opened my Titan XP yet, but if it is like the last 4 or 5 generations, as long as you only remove the screws you need to, the PCI bracket can stay on (the only other thing I can think of that you may be asking about).


I'll just order it today so I can play with it tomorrow









Yesterday I opened up my XP to repaste it




After the repaste job (Arctic Silver 5) it got like 8 / 9 degrees warmer








So prob a poor job on my side.
But the default paste was actually very good.

100% load, 100% fan, about 66 degrees (so I messed that up *whoops*)


----------



## craftyhack

Quote:


> Originally Posted by *markklok*
> 
> I'll just order it today so i can play with it tomorrow
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yesterday i opened up my XP to repaste it
> 
> After the repaste job (artic5) it got like 8 / 9 degrees warmer
> 
> 
> 
> 
> 
> 
> 
> So prob a poor job on my side.
> But the default paste was actually very good.
> 
> 100% load, 100% fan, about 66 degrees (so I messed that up *whoops*)


OK, wow, not even sure if previous generations could come apart like that, and either way if this one has to come apart this way to replace the block... maybe this one is quite a bit different in more ways than just aesthetics.


----------



## ChrisxIxCross

Quote:


> Originally Posted by *craftyhack*
> 
> OK, wow, not even sure if previous generations could come apart like that, and either way if this one has to come apart this way to replace the block... maybe this one is quite a bit different in more ways than just aesthetics.


That's just how he took it apart, you can 100% take the entire cooler off without having to disassemble the stock cooler piece by piece.


----------



## cookiesowns

Quote:


> Originally Posted by *carlhil2*
> 
> Do you mean the universal gpu block? if so, I was going to go that route, but, looking at it, wasn't sure that it would work with what my setup was going to be, so, I ended up removing the entire thing. I am going FC soon, so, I didn't mind...


Quote:


> Originally Posted by *markklok*
> 
> For the ones who fitted a EK universal gpu block..
> 
> Will just removing the default heatsink + backplate be enough, or do I have to strip it entirely to get it fitted?


Yes, you have to remove everything. You don't need to remove the acrylic cover by itself to get the heatsink off. Just take off all the screws: 2 on the IO shield, and then all the tiny ones plus the hex screws. Pain in the butt. The tiny screws are REALLY easy to lose, also.

You won't be able to fit a uni block without some heavy dremeling or a very thick shim; you also need to remove the NVIDIA LED logo if you decide to mod. I don't think it's worth it.

Some good airflow over the vMEM and VRM is enough to keep them within reason. I've been measuring them at roughly 65C with a Panaflo running at 1500RPM on top of it, or around 80C with just passive airflow from my top rads. I wouldn't push vMEM or over-volt without some serious active cooling or heatsinks on bare mem or VRM, though.

Card is also VERY fragile without the backplate.


----------



## toncij

Quote:


> Originally Posted by *steponz*
> 
> Here's the new Firestrike Ultra Single Record up on Hall of Fame.
> 
> Only plus 200..... When you get to a certain point with clock, it doesn't get you a better score....
> 
> http://www.3dmark.com/fs/9740770
> 
> All Air.. beating guys on water and chilled water... Look at the gpu score.. thats what really matters........


OK, I'm all ears - how do you manage such a score with the clocks being pretty much identical to other scores? I don't really get the trick here. I've noticed that, even when I have a 100% rock solid clock, results differ... by a large amount.


----------



## tin0

He's an overclocker, he knows his benchmark/hardware/software tweaks


----------



## markklok

Quote:


> Originally Posted by *cookiesowns*
> 
> Yes, you have to remove everything. You can just avoid removing the acrylic cover by itself to remove the heatsink. Just take off all screws. 2 on IO shield, and then all the tiny ones plus the hex screws. Pain in the butt. The tiny screws are REALLY easy to lose also.
> 
> You won't be able to fit a uni block without some heavy dremeling or a very thick shim, you also need to remove the nvidia led logo too if you decide to mod. I don't think it's worth it.
> 
> Card is also VERY fragile without backplate.


Ok soo... it won't work when I remove the heatsink + backplate + nvidia logo... Is it the plate surrounding the GPU that will prevent it? (small holes)


----------



## toncij

Quote:


> Originally Posted by *tin0*
> 
> He's an overclocker, he knows his benchmark/hardware/software tweaks


Every sufficiently advanced technology can be referred to as magic.








Joke aside, I see his test #2 score is high, his memory clock is very nice.


----------



## cookiesowns

Quote:


> Originally Posted by *steponz*
> 
> Here's the new Firestrike Ultra Single Record up on Hall of Fame.
> 
> Only plus 200..... When you get to a certain point with clock, it doesn't get you a better score....
> 
> http://www.3dmark.com/fs/9740770
> 
> All Air.. beating guys on water and chilled water... Look at the gpu score.. thats what really matters........


Solid stuff step. Then again your max boost is also 2083 @ +200, must be a solid card as well =)


----------



## marc0053

Quote:


> Originally Posted by *toncij*
> 
> OK, I'm all ears - how do you manage such a score with the clocks being pretty much identical to other scores. I don't really get the joke here. I've noticed that, even when I have a 100% rock solid clock, results differ... by a large amount.


This is what Steponz was explaining in earlier posts, where it's important to remove the power limit throttle. Once the power limit is removed, the clocks will remain steady and provide a much better score. A shunt mod is a good start for air/water, but Steponz will be pushing his card under LN2 soon, and that's why he did a hard mod on his Titan X.
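For anyone wondering why a shunt mod defeats the power limit: the card estimates its draw from the voltage drop across shunt resistors of a known value, so lowering the effective shunt resistance (stacking a parallel shunt, or the CLU trick) makes it under-read its own power and never hit the cap. A rough sketch of the arithmetic, with illustrative numbers (the resistances and wattages below are made-up examples, not measured Titan X values):

```python
# Sketch of why a shunt mod defeats the power limit: the controller computes
# power from the voltage drop across a shunt resistor it assumes is stock.
# All numbers here are illustrative examples, not measured Titan X values.

def reported_power(actual_power_w, rail_voltage_v, r_true_ohm, r_assumed_ohm):
    """Power the card *thinks* it is drawing when the real shunt
    resistance differs from what the monitoring circuit assumes."""
    current_a = actual_power_w / rail_voltage_v      # real current on the rail
    v_shunt = current_a * r_true_ohm                 # real drop across the shunt
    sensed_current_a = v_shunt / r_assumed_ohm       # controller assumes stock R
    return sensed_current_a * rail_voltage_v

# Stock card: assumed resistance matches reality, so it reads true power.
print(reported_power(300, 12.0, 0.005, 0.005))   # 300.0

# Modded: paralleling an equal shunt halves the resistance, so a 300W real
# draw reads as 150W and never trips the limit.
print(reported_power(300, 12.0, 0.0025, 0.005))  # 150.0
```

Which is also the risk for 24/7 use: once the card under-reads, the real draw goes completely unmonitored.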


----------



## KillerBee33

Any real changes in this 4.3.0?
https://gaming.msi.com/features/afterburner


----------



## MrTOOSHORT

Got an email from EK that my block has shipped. A nice surprise since it was supposed to be released on the 16th.







Wish I had it for this weekend though as I'm off from work.


----------



## Jpmboy

Quote:


> Originally Posted by *pompss*
> 
> Nice Vega, I will try the mod as soon as I get the waterblock.
> Getting 2138 on air and still no crash in Firestrike.
> Will keep pushing


show the graphics score... clocks do not always = higher "productivity" with the leash(es) NV has on these cards. Also, Timespy is DX12 - run that.
Quote:


> Originally Posted by *pompss*
> 
> Can't really say right now, since the card is throttling and it's installed on an old X79 rig, and no games are installed on that rig either
> 
> Also my paid version of 3DMark is installed on my main X99 rig, but with Firestrike I score 16780 on X79, and memory is still at stock
> 
> Tomorrow I will get the PCI Express riser, so maybe I will be able to run some tests with my main X99 rig and post some scores, since I only have one PCI Express slot on my X99 ITX MB.


X79 or X99, the graphics score will be nearly the same.

Quote:


> Originally Posted by *markklok*
> 
> I'll just order it today so i can play with it tomorrow
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yesterday i opened up my XP to repaste it
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> After the repaste job *(Arctic Silver 5)* it got like 8 / 9 degrees warmer
> 
> 
> 
> 
> 
> 
> 
> So prob a poor job on my side.
> But the default paste was actually very good.
> 
> 100% load, 100% fan, about 66 degrees (so I messed that up *whoops*)


Really? That TIM is from the Cretaceous period. Try one of the PKs, or Gelid, or Grizzly.
Quote:


> Originally Posted by *steponz*
> 
> Here's the new Firestrike Ultra Single Record up on Hall of Fame.
> Only plus 200..... When you get to a certain point with clock, it doesn't get you a better score....
> http://www.3dmark.com/fs/9740770
> All Air.. beating guys on water and chilled water... Look at the gpu score.. thats what really matters........


Nice one S... here come the *FrankenTitans*!
Quote:


> Originally Posted by *marc0053*
> 
> This is what Steponz was explaining in earlier posts where it's important to remove power limit throttle. Once power limit is removed the clocks will remain steady and provide a much better score. Shunt mod is a good start for air/water but Steponz will be pushing his card under LN2 soon and that's why he did a hard mod on his Titan X.


For benching, absolutely - but do you guys really think painting CLU on the resistors is a good thing to do to a card before covering it with a waterblock and using it for gaming?


----------



## toncij

....


----------



## Gary2015

Quote:


> Originally Posted by *cookiesowns*
> 
> Yes, you have to remove everything. You can just avoid removing the acrylic cover by itself to remove the heatsink. Just take off all screws. 2 on IO shield, and then all the tiny ones plus the hex screws. Pain in the butt. The tiny screws are REALLY easy to lose also.
> 
> You won't be able to fit a uni block without some heavy dremeling or a very thick shim, you also need to remove the nvidia led logo too if you decide to mod. I don't think it's worth it.
> 
> Some good airflow over the vmem and VRM is enough to keep them within reason. I've been measuring them roughly 65C with a panaflo running at 1500RPM on the top of it, or around 80C with just passive airflow from my top rads. I wouldn't push vMEM or over-volt without some serious active cooling on bare mem or VRM, or heatsinks though.
> 
> Card is also VERY fragile without backplate.


What do you use to remove the HEX screws?


----------



## Jpmboy

Quote:


> Originally Posted by *Gary2015*
> 
> What do you use to remove the HEX screws?


4mm socket. DO NOT TRY TO DO IT WITH PLIERS!!


----------



## Jpmboy

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Got an email from EK that my block has shipped. A nice surprise since it was supposed to be released on the 16th.
> 
> 
> 
> 
> 
> 
> 
> Wish I had it for this weekend though as I'm off from work.


oh man! jelly. I'm hoping to get blocks on these before the August "foldathon".







TXP folds silly.


----------



## toncij

Has anyone with an overclocked old Titan X compared it to an overclocked TXP, maybe? My results are worse than expected ([email protected] vs [email protected]); the difference on average is not the expected +60%...
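For reference, a +60% expectation falls out of the cores x clock ratio: the Maxwell Titan X has 3072 CUDA cores, the Pascal one 3584. The sketch below ignores memory bandwidth and architectural differences, and the clock values are just assumed examples:

```python
# Theoretical shader-throughput scaling, Maxwell Titan X -> Pascal Titan X.
# Core counts are the published specs; the clocks are assumed examples,
# and memory bandwidth / architectural gains are ignored.
TXM_CORES, TXP_CORES = 3072, 3584

def uplift(txm_mhz, txp_mhz):
    """Fractional gain from the cores x clock ratio."""
    return (TXP_CORES * txp_mhz) / (TXM_CORES * txm_mhz) - 1.0

print(f"{uplift(1400, 2000):+.0%}")  # +67%
```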


----------



## DADDYDC650

Quote:


> Originally Posted by *Gary2015*
> 
> What do you use to remove the HEX screws?


Warm up the stubborn screws using a blow dryer. Don't get too close, because you don't want it getting too hot. Just warm enough to loosen them up.


----------



## pompss

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Got an email from EK that my block has shipped. A nice surprise since it was supposed to be released on the 16th.
> 
> 
> 
> 
> 
> 
> 
> Wish I had it for this weekend though as I'm off from work.


Mine too. Should be here around Tuesday or Wednesday.
Unfortunately I'm going on vacation tomorrow, coming back after a week.


----------



## pompss

Quote:


> Originally Posted by *Jpmboy*
> 
> show the graphics score... clocks do not always = higher "productivity" with the leash(es) NV has on these cards. Also, Timespy is DX12 - run that.
> x79 or x99, th4e graphics score will be nearly the same.


I played with the core clock. The memory was stock, and if I remember correctly my graphics score was around 7800, maybe more.
I have very little time, since I'm going on vacation tomorrow.
I will try to install the Titan on my rig today and run a fast test to see what score I get.

My waterblock has been shipped, can't wait to put this beast underwater.


----------



## unreality

Quote:


> Originally Posted by *toncij*
> 
> Anyone had old Titan X overclocked to compare to TXP overclocked maybe? My results are worse than expected ([email protected] vs [email protected]) difference on average is not as expected +60%...


I do. Watercooled TXm (1452/8000) vs TXp under air (probably avg below 2000 because of high ambients here)
CPU 5960X @ 4.8 GHz

Timespy


Spoiler: Warning: Spoiler!



TXm http://www.3dmark.com/3dm/13883601 Graphics score 6 299

TXp http://www.3dmark.com/3dm/13968623 Graphics score 10 169



Firestrike


Spoiler: Warning: Spoiler!



TXm http://www.3dmark.com/3dm/13883732 Graphics score 21 558

TXp http://www.3dmark.com/3dm/13967551 Graphics score 31 930



Firestrike Extreme


Spoiler: Warning: Spoiler!



TXm http://www.3dmark.com/3dm/13883844 Graphics score 9 904

TXp http://www.3dmark.com/3dm/13967318 Graphics score 15 460



Fire Strike Ultra


Spoiler: Warning: Spoiler!



TXm http://www.3dmark.com/3dm/13883973 Graphics score 5 037

TXp http://www.3dmark.com/3dm/13967179 Graphics score 7662



Heaven WQHD


Spoiler: Warning: Spoiler!



TXm
FPS: 67.1
Score: 1691
Min FPS: 36.6
Max FPS: 133.3

TXp
FPS: 103.8
Score: 2614
Min FPS: 40.5
Max FPS: 220.9



Valley WQHD


Spoiler: Warning: Spoiler!



TXm
FPS: 68.4
Score: 2862
Min FPS: 26.4
Max FPS: 125.6

TXp
FPS: 105.3
Score: 4406
Min FPS: 46.4
Max FPS: 198.2



Biggest gain I've seen is in GTA V, but I can't find the right benchmark file. Was about 75-80% above TITAN Xm results though. Also I've read TW3 has gains >80%.
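Running the graphics scores above through the math, the gains land between roughly +48% (Firestrike) and +61% (Timespy). A quick check:

```python
# TXm vs TXP graphics scores copied from the runs linked above.
scores = {
    "Time Spy":            (6299, 10169),
    "Fire Strike":         (21558, 31930),
    "Fire Strike Extreme": (9904, 15460),
    "Fire Strike Ultra":   (5037, 7662),
    "Heaven WQHD":         (1691, 2614),
    "Valley WQHD":         (2862, 4406),
}
for bench, (txm, txp) in scores.items():
    print(f"{bench:<20} +{txp / txm - 1:.0%}")
```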


----------



## toncij

Quote:


> Originally Posted by *unreality*
> 
> I do. Watercooled TXm (1452/8000) vs TXp under air (probably avg below 2000 because of high ambients here)
> CPU 5960X @ 4.8 GHz
> 
> Timespy
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> TXm http://www.3dmark.com/3dm/13883601 Graphics score 6 299
> 
> TXp http://www.3dmark.com/3dm/13968623 Graphics score 10 169
> 
> 
> 
> Firestrike
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> TXm http://www.3dmark.com/3dm/13883732 Graphics score 21 558
> 
> TXp http://www.3dmark.com/3dm/13967551 Graphics score 31 930
> 
> 
> 
> Firestrike Extreme
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> TXm http://www.3dmark.com/3dm/13883844 Graphics score 9 904
> 
> TXp http://www.3dmark.com/3dm/13967318 Graphics score 15 460
> 
> 
> 
> Fire Strike Ultra
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> TXm http://www.3dmark.com/3dm/13883973 Graphics score 5 037
> 
> TXp http://www.3dmark.com/3dm/13967179 Graphics score 7662
> 
> 
> 
> Heaven WQHD
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> TXm
> FPS: 67.1
> Score: 1691
> Min FPS: 36.6
> Max FPS: 133.3
> 
> TXp
> FPS: 103.8
> Score: 2614
> Min FPS: 40.5
> Max FPS: 220.9
> 
> 
> 
> Valley WQHD
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> TXm
> FPS: 68.4
> Score: 2862
> Min FPS: 26.4
> Max FPS: 125.6
> 
> TXp
> FPS: 105.3
> Score: 4406
> Min FPS: 46.4
> Max FPS: 198.2
> 
> 
> 
> Biggest gain I've seen is in GTA V, but I can't find the right benchmark file. Was about 75-80% above TITAN Xm results though. Also I've read TW3 has gains >80%.


Thank you very much. I have 30°C ambients here, so it's hard to actually measure. My FTW 1080s go just fine up to 2114, cool, but the TXP heats up a lot. The cooler is really garbage.


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> Thank you very much. I have 30°C ambients here, so it's hard to actually measure. My FTW 1080s go just fine up to 2114, cool, but the TXP heats up a lot. The cooler is really garbage.


On the TXP? Same reference cooler as always, a bit loud but won't let it go over 75.


----------



## fernlander

Quote:


> Originally Posted by *cookiesowns*
> 
> Yes, you have to remove everything. You can just avoid removing the acrylic cover by itself to remove the heatsink. Just take off all screws. 2 on IO shield, and then all the tiny ones plus the hex screws. Pain in the butt. The tiny screws are REALLY easy to lose also.
> 
> You won't be able to fit a uni block without some heavy dremeling or a very thick shim, you also need to remove the nvidia led logo too if you decide to mod. I don't think it's worth it.
> 
> Some good airflow over the vmem and VRM is enough to keep them within reason. I've been measuring them roughly 65C with a panaflo running at 1500RPM on the top of it, or around 80C with just passive airflow from my top rads. I wouldn't push vMEM or over-volt without some serious active cooling on bare mem or VRM, or heatsinks though.
> 
> Card is also VERY fragile without backplate.


More fragile than the TXM was? I wish the 980ti hybrid cooler could go on while leaving the backplate on.


----------



## fernlander

Quote:


> Originally Posted by *toncij*
> 
> Anyone had old Titan X overclocked to compare to TXP overclocked maybe? My results are worse than expected ([email protected] vs [email protected]) difference on average is not as expected +60%...


From memory, my TXM would get around 2600 in Heaven with a hybrid cooler. This card on air gets almost to 4000. That's about 53%. Maybe - a big maybe - it would go a bit higher with a hybrid cooler, but I'm not expecting much.


----------



## toncij

Quote:


> Originally Posted by *fernlander*
> 
> More fragile than the TXM was? I wish the 980ti hybrid cooler could go on while leaving the backplate on.


I doubt they've started making worse PCBs...

Quote:


> Originally Posted by *KillerBee33*
> 
> On TXP? Same reference cooler as always , a bit loud but wont let it go over 75.


Very, very loud.


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> Very, very loud.


Try this for a change: 30% from 0-40 degrees, 70% @ 65 degrees, 100% @ 75 degrees

Don't mind the BIOS part, didn't feel like editing this image
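A curve like that is just linear interpolation between set points, which is what Afterburner's curve editor applies. A minimal sketch of how such a curve evaluates (the points mirror the settings described above):

```python
# Evaluate a custom fan curve by linear interpolation between set points.
# Points mirror the curve described above: (temp C, fan %).
CURVE = [(0, 30), (40, 30), (65, 70), (75, 100)]

def fan_percent(temp_c):
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # Interpolate linearly within this segment.
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pin at 100% past the last point

print(fan_percent(40))  # 30.0
print(fan_percent(70))  # 85.0  (halfway between the 65C and 75C points)
```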


----------



## KillerBee33

Got soundproof pads all over inside this NZXT H440, can barely hear it


----------



## Gary2015

Quote:


> Originally Posted by *pompss*
> 
> Mine too. Should be here around Tuesday or Wednesday.
> Unfortunately I'm going on vacation tomorrow, coming back after a week.


You need the backplates though. Don't install the block without them.


----------



## Gary2015

Quote:


> Originally Posted by *KillerBee33*
> 
> Got sound proof pads all over inside this NZXT H440 , can barely hear it


The fan isn't that loud. Finished a 3 hour session of ESO. Could barely hear them. No Man's Sky is out in 15 mins!!!!!!!


----------



## KillerBee33

Quote:


> Originally Posted by *Gary2015*
> 
> The fan isn't that loud. Finished a 3 hour session of ESO. Could barely hear them. No Man's Sky is out in 15 mins!!!!!!!


NMS should've had native VR support


----------



## Gary2015

Quote:


> Originally Posted by *KillerBee33*
> 
> NMS should've had native VR support


Don't care much for VR. Sold my Oculus this week.


----------



## Snaporz

Quote:


> Originally Posted by *Gary2015*
> 
> You need the backplates though. Don't install the block without them.


Yeah. Real bummer: my EK block has shipped, but I have to wait 2 weeks for the backplate to arrive before I can install the thing. Major buzzkill.


----------



## jaminiah

Have any of you been experiencing stuttering/latency issues similar to these guys?

https://forums.geforce.com/default/topic/954602/geforce-1000-series/titan-x-pascal-high-dpc-latency-and-stuttering-/1/


----------



## Gary2015

Quote:


> Originally Posted by *Snaporz*
> 
> Yeah. Real bummer: my EK block has shipped, but I have to wait 2 weeks for the backplate to arrive before I can install the thing. Major buzzkill.


They should have released them together. No way I'm going to drain the loop and redo it twice.


----------



## KillerBee33

Quote:


> Originally Posted by *Gary2015*
> 
> Don't care much for VR. Sold my Oculus this week.


Keeping mine , Wife loves PCars







BumperCars without Head injury


----------



## Gary2015

Quote:


> Originally Posted by *KillerBee33*
> 
> Keeping mine , Wife loves PCars
> 
> 
> 
> 
> 
> 
> 
> BumperCars without Head injury


Could never get into it. My X34 does the job.


----------



## DNMock

So has there been any conclusion on shunting yet? Would prefer not to burn it up lol. Seems like Vega going with a light layer on all 3 is the safest bet so far, correct?


----------



## tpwilko08

Quote:


> Originally Posted by *Gary2015*
> 
> You need the backplates though. Don't install the block without them.


Why can't you install them without the backplate? I don't mind draining my loop 2 times; it only takes 5 mins to drain, it's a dual loop setup anyway.


----------



## steponz

Quote:


> Originally Posted by *DNMock*
> 
> So has there been any conclusion on shunting yet? Would prefer not to burn it up lol. Seems like Vega going with a light layer on all 3 is the safest bet so far, correct?


I think Vega needs to show a detailed GPU-Z to make sure. I can show ya soldering mods that will do the same, but of course it's harder to do.

I'm pretty sure he's still throttling; I need to see the GPU-Z log after running a Firestrike Ultra run.


----------



## Testier

What's a good voltmeter and/or soldering iron that I should get?


----------



## Creator

Quote:


> Originally Posted by *jaminiah*
> 
> Have any of you been experiencing stuttering/latency issues similar to these guys?
> 
> https://forums.geforce.com/default/topic/954602/geforce-1000-series/titan-x-pascal-high-dpc-latency-and-stuttering-/1/


I'll have to check this out later. At first I thought it was a CPU bottleneck. I seem to feel noticeable stuttering in benchmarks. I'm not sure about games yet, as I've only had about an hour to test the card this morning before work.


----------



## stefxyz

BS comment. Of course you can install it without a backplate. It's 99% optics only...


----------



## ChrisxIxCross

Yeah you can install the block with no backplate no problem, it's just pure aesthetics. Jayztwocents even had a full tutorial on the installation w/ no backplate with the 1080


----------



## craftyhack

Yeah, I haven't read anything anywhere, nor do I see why a back plate would be NEEDED when it doesn't have a functional purpose other than protection (important in my case, with all of the crap stuffed in my NCase, I have a few bits mounted to my back plate, I ain't doin that to the PCB!). Maybe that is what was meant.

For you guys that have the EK block on the way, did any of you order back plates in the same order? Mine hasn't shipped and I did order a back plate at the same time. I just assumed they would ship together. If not, I may just put it in another rig, it isn't that hard to resist playing with it when aircooled, but if I had a block too, I might be stupid and deal with redoing a loop twice.

Also, that stutter thread just posted is the first I heard of 10 series/Titan XP issues. Not a pleasant read, but I can't imagine that NVidia won't get this sorted as quick as they can if it is affecting a significant group of users. It looks like they have already acknowledged it and deployed a hotfix, it just wasn't effective. That is still good news IMHO; they acknowledge the issue and have people assigned to work on it, normally the hardest part.


----------



## tpwilko08

Quote:


> Originally Posted by *ChrisxIxCross*
> 
> Yeah you can install the block with no backplate no problem, it's just pure aesthetics. Jayztwocents even had a full tutorial on the installation w/ no backplate with the 1080


Thanks for this, that saves me from listening to this blower cooler till the end of the month.


----------



## Snaporz

Quote:


> Originally Posted by *craftyhack*
> 
> Yeah, I haven't read anything anywhere, nor do I see why a back plate would be NEEDED when it doesn't have a functional purpose other than protection (important in my case, with all of the crap stuffed in my NCase, I have a few bits mounted to my back plate, I ain't doin that to the PCB!). Maybe that is what was meant.
> 
> For you guys that have the EK block on the way, did any of you order back plates in the same order? Mine hasn't shipped and I did order a back plate at the same time. I just assumed they would ship together. If not, I may just put it in another rig, it isn't that hard to resist playing with it when aircooled, but if I had a block too, I might be stupid and deal with redoing a loop twice.
> 
> Also, that stutter thread just posted is the first I'd heard of 10 series/Titan XP issues. Not a pleasant read, but I can't imagine that NVidia won't get this sorted as quick as they can if it is affecting a significant group of users. It looks like they have already acknowledged it and deployed a hotfix; it just wasn't effective. That is still good news IMHO: they acknowledge the issue and have people assigned to work on it, which is normally the hardest part.


Link to the stutter thread? I wonder if it's similar to what I've experienced when attempting to play Diablo 3 in Windowed Fullscreen on my Acer X34.


----------



## Dr Mad

Quote:


> Originally Posted by *toncij*
> 
> Anyone had old Titan X overclocked to compare to TXP overclocked maybe? My results are worse than expected ([email protected] vs [email protected]) difference on average is not as expected +60%...


Seriously, how much did you expect from a card being 30% faster than 1080 ?

I switched from 980ti SLI (1520/2000 each, watercooled) to one TX-P and at 2025, results are what I expected, between 50 & 60% more performance from a single 980ti.

3440x1440 (Acer X34) --> GTA5 (55%) / Fallout 4 (55%) / Witcher 3 (60%)
In these games SLI scaling is ~1.7 so the TX-P is doing a great job.

Waiting for the waterblock to arrive and get 2050/2075 full stable without throttling.
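To sanity-check the comparison above: if a single TX-P is ~1.55x one 980 Ti and the SLI pair scales at ~1.7x, the single card lands just shy of the old pair. Quick illustrative arithmetic, using only the figures quoted in this post:

```python
# Illustrative: where a single TX-P lands relative to a 980 Ti SLI pair.
txp_vs_single_980ti = 1.55   # single TX-P vs one 980 Ti (from the post above)
sli_scaling = 1.70           # 980 Ti SLI scaling in GTA5 / FO4 / Witcher 3

txp_vs_sli = txp_vs_single_980ti / sli_scaling
print(f"single TX-P ~= {txp_vs_sli:.0%} of the 980 Ti SLI pair")  # ~91%
```

So "doing a great job" checks out: one card, no SLI profiles needed, for roughly nine-tenths of the pair.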


----------



## toncij

Quote:


> Originally Posted by *Dr Mad*
> 
> Seriously, how much did you expect from a card being 30% faster than 1080 ?
> 
> I switched from 980ti SLI (1520/2000 each, watercooled) to one TX-P and at 2025, results are what I expected, between 50 & 60% more performance from a single 980ti.
> 
> 3440x1440 (Acer X34) --> GTA5 (55%) / Fallout 4 (55%) / Witcher 3 (60%)
> In these games SLI scaling is ~1.7 so the TX-P is doing a great job.
> 
> Waiting for the waterblock to arrive and get 2050/2075 full stable without throttling.


Mathematically I expected up to 60% in the case of my old [email protected] vs [email protected] Now, 1080 is a different story (I have two in SLI) where these at 2114 exhibit (single) 15% better performance at best, which is very close to theoretical 20%. Theoretical TXP advantage is 33%, which is fine. But my first tests showed only 10% better performance than 1080 and about 25% more than TXM, which is significantly worse.

The reason was that the idiotic NV driver automatically resets the 3D-settings VSync to ON, which decimates performance in benchmarks. It's hard to notice if you benchmark with FS Ultra, which never gets up to 60 FPS to make the VSync obvious.
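For anyone wondering where figures like "up to 60%" and "theoretical 33%" come from, it's just shader count times clock. A rough sketch (core counts are from the spec sheets; the clocks are illustrative examples, since the exact figures above were mangled by the forum's email filter):

```python
# Back-of-envelope scaling: perf ~ shader cores x core clock.
def theoretical_gain(cores_old, clock_old, cores_new, clock_new):
    return cores_new * clock_new / (cores_old * clock_old) - 1.0

# Titan X Maxwell: 3072 cores, Titan X Pascal: 3584 cores, GTX 1080: 2560 cores.
# Clocks here are illustrative overclocks, not the poster's exact numbers.
print(f"TXM -> TXP:  {theoretical_gain(3072, 1500, 3584, 2000):+.0%}")  # ~+56%
print(f"1080 -> TXP: {theoretical_gain(2560, 2114, 3584, 2025):+.0%}")  # ~+34%
```

Actual game results land below this ceiling, of course, since memory bandwidth and CPU limits don't scale with it.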


----------



## l88bastar

Ordered two blocks from EK..... are there any clearance issues with the funky Nvidia HB bridge touching the EK water bridge between two cards?


----------



## craftyhack

Quote:


> Originally Posted by *Snaporz*
> 
> Link to the stutter thread? I wonder if its similar to what I've experienced when attempting to play Diablo 3 in Windowed Fullscreen on my Acer X34.


Here you go: https://forums.geforce.com/default/topic/954602/geforce-1000-series/titan-x-pascal-high-dpc-latency-and-stuttering-/1/


----------



## l88bastar

Quote:


> Originally Posted by *Baasha*
> 
> Just installed my 3rd and 4th Titan X Pascal in the Asus RoG Swift rig:
> 
> 
> 
> 
> 
> The CPU (3970X @ 4.50Ghz vs. 6950X @ 4.30Ghz) seems to have a HUGE impact in terms of performance in synthetic benchmarks (haven't tried gaming yet).
> 
> Anyway, real world benchmarks of the Titan X Pascal in SLI (2x GPUs):


You need to ditch those coolers sucka


----------



## toncij

Quote:


> Originally Posted by *craftyhack*
> 
> Here you go: https://forums.geforce.com/default/topic/954602/geforce-1000-series/titan-x-pascal-high-dpc-latency-and-stuttering-/1/


No DPC problems here, but I'm using Asus Xonar for audio....


----------



## Snaporz

Quote:


> Originally Posted by *craftyhack*
> 
> Here you go: https://forums.geforce.com/default/topic/954602/geforce-1000-series/titan-x-pascal-high-dpc-latency-and-stuttering-/1/


Thx!


----------



## jaminiah

In regards to the stuttering issue, I appreciate most people on here are benchmarking/overclocking etc, but it would be interesting to get some feedback on whether you guys have experienced any stuttering whilst casually playing games. I've only played 2 games since I installed the Titan XP, both have the stuttering! Hitman (2016) and Battlefield Hardline.
It's not a constant stutter, the games are smooth 99% of the time, there's just a huge dip in fps every now and again.

The Titan XP replaced my Titan X (Maxwell) SLI and I had no issues like this with them on the same system.

Thanks to anyone who has any feedback!!


----------



## l88bastar

Quote:


> Originally Posted by *l88bastar*
> 
> Ordered two blocks from EK..... are there any clearance issues with the funky Nvidia HB bridge touching the EK water bridge between two cards?


Answered my own question.... DANG IT, WHY ARE THESE COMPANIES SO STUPID!


----------



## Dr Mad

Quote:


> Originally Posted by *toncij*
> 
> Mathematically I expected up to 60% in the case of my old [email protected] vs [email protected] Now, 1080 is a different story (I have two in SLI) where these at 2114 exhibit (single) 15% better performance at best, which is very close to theoretical 20%. Theoretical TXP advantage is 33%, which is fine. But my first tests showed only 10% better performance than 1080 and about 25% more than TXM, which is significantly worse.
> 
> The reason was that the idiotic NV driver automatically resets the 3D-settings VSync to ON, which decimates performance in benchmarks. It's hard to notice if you benchmark with FS Ultra, which never gets up to 60 FPS to make the VSync obvious.


Sorry, I misread your post; I thought you said you were unhappy with +60%, but that's what you expected








So everything's fine since you disabled vsync now









What is your max boost?


----------



## mouacyk

Quote:


> Originally Posted by *l88bastar*
> 
> Answered my own question.... DANG IT, WHY ARE THESE COMPANIES SO STUPID!


----------



## toncij

Quote:


> Originally Posted by *Dr Mad*
> 
> Sorry I did misread your post, I thought you said you're unhappy with +60% but this is what you expected
> 
> 
> 
> 
> 
> 
> 
> 
> So everything's fine since you disabled vsync now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What is your max boost?


2012 @ 70% fan ~85°C (ambient 29°C).


----------



## EniGma1987

Quote:


> Originally Posted by *DNMock*
> 
> So has there been any conclusion on shunting yet? Would prefer not to burn it up lol. Seems like Vega going with a light layer on all 3 is the safest bet so far, correct?


You should get about a 30% power target headroom increase doing the shunt mod with CLU covering 2 resistors. But that is only for people with normal towers. If you have a vertical setup like Vega or Criminal then you have to be a lot more careful and put far less CLU on, which makes the mod less effective but still worth doing IMO. I made the mod post into its own thread so people can find it easier. Someone posted in the thread that they got the same ~30% I did.

Overall I think it is a great easy mod for those who are doing just standard overclocking or putting an AIO on the card. It is definitely not the best mod to do and does not give the biggest increase, but it is a very easy mod to do and a very easy mod to remove. It also lets you possibly not void the warranty since it can be wiped away with no trace, which is a plus. Really I posted it to help get more people excited about hardware mods and to help show people how easy they can be to do. I am glad about a dozen people have tried it (that I know of anyway) so far and all are success stories. I do plan on modding mine in a much more hardcore way once I see some other people (like steponz) put out some guides or helpful tips, but the CLU mod on the shunts is a great starting place for most and it does help make the card run much more stable.
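For background on why smearing CLU over a shunt raises the power target: the card senses current as the voltage drop across the shunt resistor, so a conductive layer in parallel lowers the effective resistance and makes the card under-report its own draw. A rough sketch of the arithmetic (the 5 mOhm value matches the 5m0 shunts on the card; the resistance of the CLU path is a made-up number chosen only to reproduce the ~30% figure, since the real value is uncontrolled):

```python
# Effective resistance of the stock shunt with a hypothetical CLU bridge in parallel.
def parallel(r1, r2):
    return r1 * r2 / (r1 + r2)

R_SHUNT = 0.005    # 5 mOhm stock shunt resistor
R_CLU = 0.0167     # hypothetical resistance of the CLU smear (unknown in practice)

r_eff = parallel(R_SHUNT, R_CLU)
# Sensed power scales with r_eff / R_SHUNT, so real headroom grows by the inverse.
headroom_gain = R_SHUNT / r_eff - 1.0
print(f"{headroom_gain:.0%}")  # ~30%
```

This is also why the amount of CLU matters so much: a thicker bridge means a lower parallel resistance and a bigger (less predictable) under-report.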

Quote:


> Originally Posted by *Testier*
> 
> Whats a good volt meter and/or soldering iron that I should get?


If you want to stay cheap on the soldering iron, I buy mine from McMaster because they have a good power range and some very tiny tips, and very quick shipping so I can get supplies fast. Of course a real soldering station is a much better choice if you will be continuing to solder things in the future. But if all you want is to try 1 or 2 things then getting a cheap iron is best and you can upgrade later if you want to do it more seriously.

For a DMM, I like the ExTech EX430 for a cheap but good quality multimeter. It has fairly good accuracy for a sub $80 DMM. 0.5% accuracy of DC voltage measurement and 0.8-1% accuracy of ohms measurement. Not top grade like the $500+ models, but good enough for a cheap unit.


----------



## Glzmo

Quote:


> Originally Posted by *jaminiah*
> 
> In regards to the stuttering issue, I appreciate most people on here are benchmarking/overclocking etc, but it would be interesting to get some feedback on whether you guys have experienced any stuttering whilst casually playing games. I've only played 2 games since I installed the Titan XP, both have the stuttering! Hitman (2016) and Battlefield Hardline.
> It's not a constant stutter, the games are smooth 99% of the time, there's just a huge dip in fps every now and again.
> 
> The Titan XP replaced my Titan X (Max) SLI an I had no issues like this with them on the same system.
> 
> Thanks to anyone who has any feedback!!


I haven't been experiencing any stuttering or large dips in framerate myself. I've only been using the 369.09 drivers on Windows 10 64 bit Pro 1607 with this card, in case that matters.


----------



## mouacyk

Anyone who is stuttering ("Pascal stutter") in a game where at least one other person isn't has to be bottlenecked by something in their system. I see more X99 systems with DDR4-2133 and 2400 than I would like, which exacerbates the stutter if the hexa/octa-core CPU is also clocked low. When you run a GPU like the TXP, you essentially open up your fps ceiling, which is good. The problem is when your slowest parts choke on a scene in the game, making the fps plummet. There goes your fps, swinging wildly on a $3000 unbalanced rig, where a balanced and tuned rig can maintain a narrow margin of fps variation.

High-bandwidth, high-latency, low-speed DDR4 is just one issue -- great for productivity, poor for realtime interactive rendering. These guys typically put a million other things in their systems too -- streaming hardware, extra sound card(s), 20 million SSDs, etc... all of it takes interrupt time away from the CPU scheduler.
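One way to see why a rig can feel stuttery while still posting a healthy average fps: a single long frame barely moves the average but is very visible on screen. A minimal illustration (the 100 ms hitch is invented):

```python
# Average fps hides an isolated slow frame; the worst frame time exposes it.
frame_times_ms = [16.7] * 99 + [100.0]  # 99 smooth frames plus one invented hitch

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
worst_frame = max(frame_times_ms)       # worst frame time in the window
print(f"avg {avg_fps:.0f} fps, worst frame {worst_frame:.0f} ms "
      f"(a momentary {1000 / worst_frame:.0f} fps)")
```

That's why frame-time graphs (or 99th-percentile numbers) are more useful than average fps when chasing this kind of problem.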


----------



## toncij

[email protected],7, 128GB DDR4 @2800 CL15 1T, SM961, Win10 - no stuttering with either 1080 SLI or TXP or the old TXM.


----------



## gamingarena

Anyone have a problem where the power usage drops to 60-65% and will not come back? The clocks stay the same and GPU usage jumps to 99% since it has less power to use, and only a reboot fixes it.

I'm using Afterburner and my power is set to 120%, but it just randomly drops to 60-65% in games and will not come back until I reboot. FPS are cut in half as soon as the power drop happens.

Not sure if it's the drivers (369.09), Afterburner, or the card itself.
Any ideas?


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Any real changes in this 4.3.0.?
> https://gaming.msi.com/features/afterburner


4.3 beta has been out since the 1080 launch. Use it with your TXP for sure. If you hit Ctrl+F you get the V/F window:


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> 4.3 beta has been out since the 1080 launch. Use it with your TXP for sure. If you hit Ctrl+F you get the V/F window:


Uhummm

















Curve kinda blows







Clocks are higher, performance is lower; just tried that once on Titan with the same results.
On my way home will try the 4.3. ...wait that's not what I meant LOL, 4.3.11 not 4.3.4, sorry forgot to mention those numbers


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Uhummm
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Curve kinda blows
> 
> 
> 
> 
> 
> 
> 
> Clocks are higher, performance is lower; just tried that once on Titan with the same results.
> On my way home will try the 4.3. ...wait that's not what I meant LOL, 4.3.11 not 4.3.4, sorry forgot to mention those numbers


oops...
4.3.11 is not out yet AFAIK.


----------



## MunneY

Quote:


> Originally Posted by *Jpmboy*
> 
> oops...
> 4.3.11 is not out yet AFAIK.


You have better results with AB than PX? I've always used PX.

My card seems to top out at +200 (2068). It's just smashing against the power limit. I'm really hoping when my waterblock gets here we can figure out a way to get some more voltage and power limit


----------



## DNMock

Quote:


> Originally Posted by *mouacyk*




Got my Nvidia HB bridge in today. Forgot my dremel at the shop so I used a grinding wheel, and by hand just sanded off the tips, fits like a champ on my old Maxwell Titans with EKWB blocks, so hopefully it should fit fine when my other blocks show up.

Note, don't actually power it up, just use the wheel like you would sandpaper. Took all of 10 minutes to grind it down enough to fit.


----------



## HyperMatrix

You guys read the news about Nvidia switching to Samsung and 14nm for Pascal? A Titan X Black with a fully unlocked GP102 on 14nm, with a 1080 Ti launching with 3072 cores, is a possibility now.


----------



## sgs2008

Does the 980ti or titanx hybrid cooler fit on these ?


----------



## Glzmo

Quote:


> Originally Posted by *sgs2008*
> 
> Does the 980ti or titanx hybrid cooler fit on these ?


Yes. The EVGA Hybrid cooler itself fits (the cooler is the same for all the Maxwell cards), but not the plastic shroud. You'll have to run it without the front part of the original Titan X (Pascal)'s shroud. You can put the back one around the fan back on after plugging in the cooler's power cable.

1. Unscrew the shroud on the sides and top and take the shroud off.
2. Unscrew the four spring screws on the bottom of the card.
3. Place the Hybrid cooler on the card and screw it on with the spring screws.
4. Unplug the fan cable, plug in the Hybrid's power cable and connect the fan cable to it.
5. Optionally screw the back part of the shroud (including the LED part) back on.

You can leave the backplates on during the whole process without issues, too.


----------



## DADDYDC650

Quote:


> Originally Posted by *HyperMatrix*
> 
> You guys read the news about Nvidia switching to Samsung and 14nm for Pascal? A Titan X Black with a fully unlocked GP102 on 14nm, with a 1080 Ti launching with 3072 cores, is a possibility now.


Return our Titan XP's confirmed.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> oops...
> 4.3.11 is not out yet AFAIK.


Looks like it isn't available to the public. Didn't want to read into it, decided to see if anyone got it yet








http://www.guru3d.com/articles-pages/nvidia-titan-x-(pascal)-overclock-guide,2.html


----------



## cookiesowns

Quote:


> Originally Posted by *Gary2015*
> 
> What do you use to remove the HEX screws?


4mm socket/bit included in the iFixit toolkit. Very useful. No need to heat the screws, just be very gentle. Also don't rest the socket on the PCB; leave a bit of gap, it helps prevent scratches in case you ever RMA =)


----------



## Testier

Quote:


> Originally Posted by *EniGma1987*
> 
> You should get about a 30% power target headroom increase doing the shunt mod with CLU covering 2 resistors. But that is only for people with normal towers. If you have a vertical setup like Vega or Criminal then you have to be a lot more careful and put far less CLU on, which makes the mod less effective but still worth doing IMO. I made the mod post into its own thread so people can find it easier. Someone posted in the thread they got the same 30%~ I did.
> 
> Overall I think it is a great easy mod for those who are doing just standard overclocking or putting an AIO on the card. It is definitely not the best mod to do and does not give the biggest increase, but it is a very easy mod to do and a very easy mod to remove. It also lets you possibly not void the warranty since it can be wiped away with no trace, which is a plus. Really I posted it to help get more people excited about hardware mods and to help show people how easy they can be to do. I am glad about a dozen people have tried it (that I know of anyway) so far and all are success stories. I do plan on modding mine in a much more hardcore way once I see some other people (like steponz) put out some guides or helpful tips, but the CLU mod on the shunts is a great starting place for most and it does help make the card run much more stable.
> If you want to stay cheap on the soldering iron, I buy mine from McMaster because they have a good power range and some very tiny tips, and very quick shipping so I can get supplies fast. Of course a real soldering station is a much better choice if you will be continuing to solder things in the future. But if all you want is to try 1 or 2 things then getting a cheap iron is best and you can upgrade later if you want to do it more seriously.


For someone like me with the card in a vertical position, do you just recommend putting a layer with the brush on all 3 resistors?


----------



## EniGma1987

Quote:


> Originally Posted by *Testier*
> 
> For someone like me with the card in a vertical position, do you just recommend putting a layer with the brush on all 3 resistors?


Ya, for vertically mounted GPUs keep the amount of CLU or Conductonaut you use down to a very small, thin layer. If you do that super thin layer on all 3 you will probably drop the power target calculations about 15%.


----------



## Gary2015

Quote:


> Originally Posted by *stefxyz*
> 
> BS comment. Of course you can install without backplate. Its 99% optics only...


Lol hope your PCB is strong.


----------



## jaminiah

Quote:


> Originally Posted by *mouacyk*
> 
> Anyone who is sputtering (Pascal Stutter) in a game, where there is at least one person who hasn't, has to be bottle-necked by something in their system. I see too many X99 systems with DDR4-2133 and 2400 than I would like, which exacerbates the sputter if the hexa/octa-core CPU is also clocked low. When you run a GPU like the TXP, you essentially opened up your fps ceiling, which is good. The problem is when your slowest parts choke on a scene in the game, making the fps plummet. There goes your fps swinging wildly on a $3000 unbalanced rig, where otherwise a balanced and tuned rig can maintain a narrow margin of fps variation.
> 
> High-bandwidth high-latency low-speed DDR4 is just one issue -- great for productivty, poor for realtime interactive rendering. These guys typically put a million other things in their systems too -- streaming hardware, extra sound card(s), 20 million SDD's, etc... takes interrupt time away from the CPU scheduler.


So looking at my sig what do you think is causing the bottleneck? The DDR4? The CPU?
I don't have an additional soundcard and only 2 SSD's!

Thanks for your help.


----------



## chronicfx

Quote:


> Originally Posted by *jaminiah*
> 
> So looking at my sig what do you think is causing the bottleneck? The DDR4? The CPU?
> I don't have an additional soundcard and only 2 SSD's!
> 
> Thanks for your help.


Are you actually stuttering with that setup? It is a good setup; my only difference is that I bought 3 Samsungs for a 1.5TB RAID 0 and have my games on the same drive, so that is a thought, although I am aware that people have been doing this for a long time, just not M.2 and SATA drives together. Your RAM is fine, 16GB of 3000 is a recommended speed; I run mine at 3733 but started out at 3000. The third thing is that it is actually beneficial to have a sound card, since it has its own sound processing chip and takes those cycles off of your CPU. Other than that, stuttering shouldn't happen. What is your cache at? With this type of high-performance system you may want to run your cache at the same multiplier as the CPU. Nice rig; nothing screams bottleneck, and hopefully a few new drivers and everything will smooth out.


----------



## Gary2015

Quote:


> Originally Posted by *chronicfx*
> 
> Are you actually stuttering with that setup? It is a good setup; my only difference is that I bought 3 Samsungs for a 1.5TB RAID 0 and have my games on the same drive, so that is a thought, although I am aware that people have been doing this for a long time, just not M.2 and SATA drives together. Your RAM is fine, 16GB of 3000 is a recommended speed; I run mine at 3733 but started out at 3000. The third thing is that it is actually beneficial to have a sound card, since it has its own sound processing chip and takes those cycles off of your CPU. Other than that, stuttering shouldn't happen. What is your cache at? With this type of high-performance system you may want to run your cache at the same multiplier as the CPU. Nice rig; nothing screams bottleneck, and hopefully a few new drivers and everything will smooth out.


He has faulty cards. I had this with my old TXMs. RMA them. I have SLI and it's butter smooth.


----------



## HyperMatrix

For those ordering aqua computer blocks, they just added the active cooled back plate. http://shop.aquacomputer.de/product_info.php?products_id=3463

Also don't forget to order some Grizzly Kryonaut from them since these blocks sit directly on the memory. And I recommend ordering some good Fujipoly thermal pads for the VRMs, since they're sensitive enough as is and any extra cooling will help.


----------



## pompss

Quote:


> Originally Posted by *HyperMatrix*
> 
> For those ordering aqua computer blocks, they just added the active cooled back plate. http://shop.aquacomputer.de/product_info.php?products_id=3463
> 
> Also don't forget to order some Grizzly Kryonaut from them since these blocks sit directly on the memory. And I recommend ordering some good Fujipoly thermal pads for the VRMs, since they're sensitive enough as is and any extra cooling will help.


That's good news, but right now, without a mod to unlock the power limit, increasing the memory (VRAM) clock will take more power from the core, causing the core MHz to drop down when reaching the power limit.
I did some tests today and I couldn't reach 2150 anymore on the core after increasing the VRAM clock. More than +66 on VRAM caused my Titan to crash.

Hopefully we can get voltage control and more power limit soon .

by the way i love that backplate


----------



## HyperMatrix

Quote:


> Originally Posted by *pompss*
> 
> Thats good news but for right now without a mod to unlock power limit increasing vrm clock will remove more power from the core causing the core mhz to drop down when reaching the power limit.
> I did some test today and i couldnt reach 2150 anymore on the core by increasing the vrm clock. More then +66 on vrm caused my titan to crash.
> 
> Hopefully we can get voltage control and more power limit soon .
> 
> by the way i love that backplate


One of the things I love about the Aqua Computer blocks is that they leave part of the PCB exposed. So any CLU or voltage hard mods you may need to do, you still have access to it even with the block on there.


----------



## pompss

Quote:


> Originally Posted by *HyperMatrix*
> 
> One of the things I love about the Aqua Computer blocks is that they leave part of the PCB exposed. So any CLU or voltage hard mods you may need to do, you still have access to it even with the block on there.


Yep, I just ordered both the block and the backplate. Couldn't resist








Will resell the EK block here on Overclock or eBay.

Like the Aqua better than EK.


----------



## Gary2015

Quote:


> Originally Posted by *HyperMatrix*
> 
> One of the things I love about the Aqua Computer blocks is that they leave part of the PCB exposed. So any CLU or voltage hard mods you may need to do, you still have access to it even with the block on there.


I had Aqua blocks for TXM. Never went above 40°C with 23°C ambient. I'm waiting for Heatkiller blocks.


----------



## toncij

Quote:


> Originally Posted by *HyperMatrix*
> 
> For those ordering aqua computer blocks, they just added the active cooled back plate. http://shop.aquacomputer.de/product_info.php?products_id=3463
> 
> Also don't forget to order some grizzly kryonaut from them since these blocks sit directly on the memory. And I recommend ordering some f
> Good fujipoly thermal pads for the VRMs since they're sensitive enough as is and any extra cooling will help.


How are EK pads? Also, 0.5mm is enough?


----------



## Gary2015

Quote:


> Originally Posted by *toncij*
> 
> How are EK pads? Also, 0.5mm is enough?


Fujipoly 0.5mm are the best. But I understand EK pads are from Fujipoly anyway.


----------



## combat fighter

Quote:


> Originally Posted by *pompss*
> 
> Yep i just ordered both the block and the backplate Couldnt resist
> 
> 
> 
> 
> 
> 
> 
> 
> Will resell the ek block here in overclock or ebay.
> 
> Like the aqua better then ek.


I've got the EK block and backplate on order. Don't know why but I'm tempted by the AQ block. I think it's because I like the look of the active backplate.

Thing is, they are out of stock and not ready to ship for 21 days though.

I could just keep carrying on with air I suppose, it's not like the fun and games start until voltage control becomes available.

I've still got time to cancel my EK order. . .

EDIT:

Just ordered the black nickel block with the active backplate.









Nice to have a change from EK, which is what I normally buy. The fact that I would have had to wait for the backplate always grated on me. At least with the AC both products have the same lead time.


----------



## DADDYDC650

Need BIOS to remove power limit!


----------



## markklok

What do you do when you got a spare fan laying around...











Repasted the GPU for the 2nd time because I put a little too much on it the first time


----------



## HyperMatrix

Quote:


> Originally Posted by *Gary2015*
> 
> Fujipoly 0.5mm are the best. But I understand EK pads are from Fujipoly anyway.


The difference is that the EK pads are low-end, rated for 3-5 W/mK, while the high-end Fujipoly ones are rated for 17 W/mK.
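To put the 3-5 vs 17 W/mK ratings in perspective: steady-state conduction through a pad follows q = k*A*dT/t, so at a fixed heat load the temperature drop across the pad scales as 1/k. A hypothetical example (the wattage, pad area, and thickness are invented for illustration, not measured from the card):

```python
# Temperature drop across a thermal pad: dT = q * t / (k * A)
def pad_delta_t(watts, thickness_m, k_w_per_mk, area_m2):
    return watts * thickness_m / (k_w_per_mk * area_m2)

W, T, A = 10.0, 0.0005, 1e-4   # 10 W through a 0.5 mm pad over 1 cm^2 (invented)
for k in (4.0, 17.0):          # low-end vs high-end Fujipoly rating
    print(f"{k} W/mK -> {pad_delta_t(W, T, k, A):.1f} C across the pad")
```

Same heat, same pad geometry; the higher-rated material simply drops several times fewer degrees across itself, which is the whole argument for paying up on VRM pads.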


----------



## cg4200

So I did the shunt mod last night, could not wait for the waterblock. My card is vertical, so I did a light coat of CLU on two of the three 5m0 shunts. Ran Firestrike Ultra at the same 205/585 and the score went up 130 points.
So not as good as some, but I did not want to take chances. In the pic you might see the 1R0; I put a small cut piece of electrical tape over its solder, so if there is any run-off it should be fine.
I cleaned the 5m0s, put painters tape around them, applied a light coat, then a couple of hair dryer cycles, and good to go.
I still get power and VREG readings in GPU-Z. I was thinking when the waterblock comes in I'll put one more small layer on, which should give a little more power.
Just wanted to share that in case someone was thinking about doing it. It took 30 minutes with the right tools.


----------



## toncij

Quote:


> Originally Posted by *HyperMatrix*
> 
> Difference is the EK pads are low end, and rated for 3-5W/mK while the high end Fujipoly ones are rated for 17W/mK.


Now to see if anyone sells Fujipoly in Europe


----------



## Jpmboy

Quote:


> Originally Posted by *MunneY*
> 
> You have better results with AB than PX? I've always used PX.
> 
> My card seems to top out at +200 (2068) Its just smashing against the power limit. I'm really hoping when my waterblock gets here we can figure out a way to get some more voltage and power limit


yeah - only thing PX is good for is K-boost (but not working for txp right now). lowering the temps on these cards helps a lot, nearly as much as a shunt. combine the two and ...wooo!
Quote:


> Originally Posted by *toncij*
> 
> Now to see if anyone sells Fujipoly in Europe


Fuji SARCON is now the brand.

EK has not published an install instruction sheet yet AFAIK, so you do not know which pad thicknesses to get.


----------



## Snaporz

Quote:


> Originally Posted by *HyperMatrix*
> 
> Difference is the EK pads are low end, and rated for 3-5W/mK while the high end Fujipoly ones are rated for 17W/mK.


Recommending pads other than what would come with the EK waterblock?


----------



## DNMock

Quote:


> Originally Posted by *Snaporz*
> 
> Recommending pads other than what would come with the EK waterblock?


Historically, 1mm and 0.5mm thicknesses.

Just to add on the thermal pads:

Fujipoly makes thermal pads ranging from 4 to 17 W/mK in heat transfer rate. So just because EK uses Fuji pads (if they do; I can neither confirm nor deny that), it doesn't mean they're the high-end 17-rated pads; most likely the cheaper ones, somewhere between 4 and 11 in thermal conductivity.


----------



## MunneY

Quote:


> Originally Posted by *Jpmboy*
> 
> yeah - only thing PX is good for is K-boost (but not working for txp right now). lowering the temps on these cards helps a lot, nearly as much as a shunt. combine the two and ...wooo!
> Fuji SARCON is now the brand.
> 
> EK has not published an install instruction sheet yet AFAIK, so you do not know which pad thicknesses to get.


Yea, it's a bit annoying having the clock bounce all over the place. Congrats on the 6950X btw


----------



## bl4ckdot

Hi, what thermal paste should I use ? At the moment with my 980 Ti (also on my 4790k) I have the EK one but I have the feeling it is really not the best. Should I repaste my cpu also ?


----------



## DNMock

Grizzly Kryonaut won the last thermal paste round-up.

If you wanna play it safe, the #2, Gelid Extreme, won the previous 5 or so over on extremerigs.net


----------



## Jpmboy

Quote:


> Originally Posted by *DNMock*
> 
> Grizzly Kryonaut won the last thermal paste round-up.
> 
> If you wanna play it safe, the #2 Gelid-Extreme won like the previous 5 over on extremerigs.net


^^ This.. or even PK-1 or PK-3 are plenty good for Pascal.


----------



## Gary2015

Quote:


> Originally Posted by *bl4ckdot*
> 
> Hi, what thermal paste should I use ? At the moment with my 980 Ti (also on my 4790k) I have the EK one but I have the feeling it is really not the best. Should I repaste my cpu also ?


Grizzly Kryonaut. Just installed, temps are 10°C lower.


----------



## toncij

Quote:


> Originally Posted by *Gary2015*
> 
> Grizzly Kryonaut. Just installed, temps are 10°C lower.


On the stock cooler?


----------



## HyperMatrix

Quote:


> Originally Posted by *Snaporz*
> 
> Recommending pads other than what would come with the EK waterblock?


I was recommending it for Aqua Computer. But yes, the same would apply to EK. No block manufacturer is going to put top-shelf materials in the package. Costs too much. And they'd lose a sales opportunity at the same time.








Quote:


> Originally Posted by *DNMock*
> 
> Grizzly Kryonaut won the last thermal paste round-up.
> 
> If you wanna play it safe, the #2 Gelid-Extreme won like the previous 5 over on extremerigs.net


Grizzly Kryonaut and Phobya Nanogrease Extreme are about tied, and their performance varies with the temperature the parts are operating at. Both of these, however, are non-conductive pastes. In areas where electrical conductivity isn't a risk, like directly on a CPU/GPU die, an all-metal paste is a must, as it provides several times more heat transfer. This is why I'm a huge fan of Coollaboratory Liquid Ultra.


----------



## Gary2015

Quote:


> Originally Posted by *toncij*
> 
> On the stock cooler?


Yes, just finished and ran through FS Ultra. Was getting 75C at 100% fan, now 64C.


----------



## toncij

Quote:


> Originally Posted by *HyperMatrix*
> 
> I was recommending it for Aqua Computer. But yes, the same would apply to EK. No block manufacturer is going to put top shelf materials in the package. Costs too much. And they lose a sales opportunity at the same time.
> 
> 
> 
> 
> 
> 
> 
> 
> Grizzly Kryonaut and Phobya Nanogrease Extreme are about tied, and their performance varies based on temperature the parts are operating at. However, both of these are non-conductive pastes. In areas where conductivity doesn't matter, like on a CPU/GPU die itself, an all-metal paste is a must, as it provides several times more heat transfer. This is why I'm a huge fan of Coollaboratory Liquid Ultra.


"Several times"? Aren't you exaggerating a bit?


----------



## MunneY

Quote:


> Originally Posted by *toncij*
> 
> "Several times"? Aren't you exaggerating a bit?


slightly


----------



## DADDYDC650

Quote:


> Originally Posted by *Gary2015*
> 
> Yes, just finished and went through FS Ultra. Was getting 75c at 100% fan, now 64c
> 
> .


Could be that the heatsink wasn't totally flush against the chip. It's happened to me before.


----------



## HyperMatrix

Quote:


> Originally Posted by *toncij*
> 
> "Several times"? Aren't you exaggerating a bit?


Quote:


> Originally Posted by *MunneY*
> 
> slightly


Not at all. Grizzly Kryonaut = 12.5 W/mK. CLU = 38.4 W/mK. That's just over 3x. Grizzly also recently came out with a liquid metal of its own, and I'm thinking I want to try it out: http://www.thermal-grizzly.com/en/products/26-conductonaut-en

Looking at some user benches, it seems to beat out CLU by 2-3C... which is actually pretty huge considering how great CLU already is...
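For a rough sense of how much those bulk-conductivity numbers matter at the die, the temperature drop across the TIM layer can be sketched as dT = P * t / (k * A). The sketch below uses the W/mK figures quoted in this post; the 25 um bond line, 471 mm^2 die area and 250 W load are illustrative assumptions, not measurements.

```python
# Temperature drop across a TIM layer: dT = P * t / (k * A).
# k values are the spec-sheet numbers quoted above; the bond-line
# thickness, die area and power draw are assumed for illustration.

def tim_delta_t(power_w, area_m2, thickness_m, k_w_per_mk):
    """Conductive temperature drop (C) across the TIM layer."""
    resistance_k_per_w = thickness_m / (k_w_per_mk * area_m2)
    return power_w * resistance_k_per_w

POWER_W = 250.0   # assumed GPU load
AREA_M2 = 471e-6  # GP102 die, ~471 mm^2
BOND_M = 25e-6    # assumed 25 um paste layer

for name, k in [("Kryonaut", 12.5), ("CLU", 38.4), ("Conductonaut", 73.0)]:
    dt = tim_delta_t(POWER_W, AREA_M2, BOND_M, k)
    print(f"{name:12s} k={k:5.1f} W/mK -> ~{dt:.2f} C across the TIM")
```

Run this way, the 3x gap in bulk conductivity works out to well under 1C at the die, which lines up with the "within margin of error" results reported later in the thread; the bench deltas people see come mostly from thinner bond lines and better surface wetting, not bulk k.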


----------



## carlhil2

Quote:


> Originally Posted by *Jpmboy*
> 
> ^^ This.. or even PK1 or PK-3 are plenty good for pascal.


+1 on the PK-3, has my uni working like a champ..


----------



## Sheyster

Quote:


> Originally Posted by *Jpmboy*
> 
> ^^ This.. or even PK1 or PK-3 are plenty good for pascal.


I used to swear by PK-3 but I've been using Kryonaut since it was released and I'm very happy with it.


----------



## Sheyster

Quote:


> Originally Posted by *HyperMatrix*
> 
> Not at all. Grizzly Kryonaut = 12.5 W/mK. CLU = 38.4 W/mK. That's just over 3x. Grizzly recently came out with a liquid metal
> 
> Although I'm thinking I want to try this out: http://www.thermal-grizzly.com/en/products/26-conductonaut-en
> 
> looking at some user benches, it seems to beat out CLU by 2-3c....which is actually pretty huge considering how great CLU already is...


Most liquid metal TIMs are very close in performance, within 0.5 deg C typically. I have a tube of Phobya LM. It was very cheap ($6.99 I think) and works great! I use it between the die and IHS on my 5 GHz 4790K.


----------



## carlhil2

Quote:


> Originally Posted by *Sheyster*
> 
> Most liquid metal TIMs are very close in performance, within 0.5 deg C typically. I have a tube of Phobya LM. It was very cheap ($6.99 I think) and works great! I use it between the die and IHS on my 5 GHz 4790K.


Any luck with the bios?


----------



## Sheyster

Quote:


> Originally Posted by *carlhil2*
> 
> Any luck with the bios?


I posted in the JoeDirt thread a bit ago. I'm not able to extract and save it.







Once I have the file I can attempt to mod it.


----------



## KillerBee33

Quote:


> Originally Posted by *Sheyster*
> 
> I posted in the JoeDirt thread a bit ago. I'm not able to extract and save it.
> 
> 
> 
> 
> 
> 
> 
> Once I have the file I can attempt to mod it.


Possibly with the Next GPU-Z and hopefully very soon.


----------



## enfluence

So I've just purchased a Titan X, and I'll be waiting until I move into my new place to install it. Before I do, I thought I'd see what's what, and it seems I should buy Grizzly Kryonaut to get some extra cooling out of the standard air cooler. Is this a good idea?


----------



## HyperMatrix

Quote:


> Originally Posted by *Sheyster*
> 
> Most liquid metal TIMs are very close in performance, within 0.5 deg C typically. I have a tube of Phobya LM. It was very cheap ($6.99 I think) and works great! I use it between the die and IHS on my 5 GHz 4790K.


Most are close, because most are in the 30-40 W/mK range. The reason I'm curious about the Grizzly Conductonaut is that it has roughly 2x the rated thermal conductivity of the rest of them. What that means in terms of real-world performance I don't know, other than a bench I've seen that showed a guy at 86-89C across all cores with CLU and down to 84-86C with Conductonaut. Makes me curious enough to try it.


----------



## HyperMatrix

Quote:


> Originally Posted by *enfluence*
> 
> So I've just purchased a Titan x and will be waiting until I move into my new place to install it but before I do that I thought I'd see whats what and it seems that I should buy Grizzly Kryonaut to give it that extra cooling with the standard air cooler it has. Is this a good idea?


Get a liquid metal for the GPU die: CLU or Grizzly Conductonaut, depending on what is available to you.


----------



## enfluence

Seems I can get both in the UK. Which one would you guys recommend?

https://www.overclockers.co.uk/thermal-grizzly-conductonaut-thermal-paste-1g-th-021-tg.html

https://www.amazon.co.uk/Coollaboratory-Liquid-Ultra-Cooling-Cleaning/dp/B0039RY3MM


----------



## HyperMatrix

Quote:


> Originally Posted by *enfluence*
> 
> Seems I can get both in the UK. Which one would you guys recommend?
> 
> https://www.overclockers.co.uk/thermal-grizzly-conductonaut-thermal-paste-1g-th-021-tg.html
> 
> https://www.amazon.co.uk/Coollaboratory-Liquid-Ultra-Cooling-Cleaning/dp/B0039RY3MM


I've only ever used CLU. Conductonaut is new and seems to be better; I'm ordering some for my Titans when my blocks arrive. Not sure if it makes a big difference on air, but since Conductonaut is the same price as CLU or even cheaper, I'd go with that.


----------



## carlhil2

Quote:


> Originally Posted by *Sheyster*
> 
> I posted in the JoeDirt thread a bit ago. I'm not able to extract and save it.
> 
> 
> 
> 
> 
> 
> 
> Once I have the file I can attempt to mod it.


Oh, my bad, missed that. sounds good..


----------



## axiumone

Quote:


> Originally Posted by *HyperMatrix*
> 
> Most are close. Because most are in the 30-40 W/mk range. The reason I'm curious about the grizzly conductonaut is that it has 2x better thermal conductivity than the rest of them. What that means I'm terms of real world performance I don't know, other than a bench I've seen that showed a guy at 86-89c across all cores with CLU and down to 84-86c with conductonaut. Makes me curious enough to try it.


Judging by the folks that have tested Conductonaut, CLU is still within margin of error and is substantially easier to apply. Supposedly there are only three metals that get combined for these liquid metal TIMs, and they're the same across the consumer-available products. Nobody has independently tested the actual thermal conductivity of the available materials either, so any company can essentially make up whatever rating they prefer.

Having said that, I have Conductonaut on hand and will apply it for the shunt mod on the 1080's, as well as the die on the delidded CPU. I may also be brave enough to try it directly on one of the GPUs. Let's see how that goes in a month or so.


----------



## Testier

Quote:


> Originally Posted by *axiumone*
> 
> Having said that, I have conductonaut on hand and will apply it for the shunt mod on the 1080's, as well as the die on the delided cpu. I may also be brave enough to try it directly on one of the GPU's. Let's see how that goes in a month or so.


Should I be worried about applying CLU on the GPU die if I am mounting the GPU vertically?


----------



## axiumone

Quote:


> Originally Posted by *Testier*
> 
> Should I be worried about applying CLU on the GPU die if I am mounting the GPU vertically?


If you think you may need to RMA the GPU sometime in the future, I probably would be.


----------



## Stateless

When I put blocks on my Maxwell Titans, someone suggested that before putting the thermal pads on top of the memory modules and other components, I should put a small amount of thermal paste on them first, lay the pad on, and then put a little more on top of the pad. Is this something that is typically recommended?


----------



## opt33

Quote:


> Originally Posted by *axiumone*
> 
> Judging by the folks that have tested conductonaut, clu is still within margin of error and is substantially easier to apply. Supposedly there's only three metals that are combined for the metal thermal materials and they're all the same for the consumer available pastes. There's also no one that has tested the actual thermal conductivity of the available materials privately, so any company can essentially make up whatever ratings they prefer.
> 
> Having said that, I have conductonaut on hand and will apply it for the shunt mod on the 1080's, as well as the die on the delided cpu. I may also be brave enough to try it directly on one of the GPU's. Let's see how that goes in a month or so.


Yep, agreed... CLP is listed as 80 W/mK on the Coollaboratory website under additional info, and CLU as 38 W/mK, yet both are mostly gallium (40 W/mK) with small amounts of indium, rhodium, silver, zinc and tin. CLU is just suspended in a graphite-copper matrix, which increases spreadability but slightly lowers thermal conductance. Many, including myself, have tested both on bare-die CPUs (which would show the most difference), and both test within margin of error. Not surprising, given both are mostly gallium and both will have similar bondline thickness and interface resistance.

Grizzly Conductonaut is listed as 73 W/mK on their website, and it's the same story: mostly gallium (40 W/mK) with indium, tin, etc.

I would bet that if all 3 were tested under the same rigid conditions with calibrated probes, you wouldn't be able to tell them apart.

That being said, CPU bare-die applications, with very low surface area and high power density, get the most benefit from a metal TIM. On a large-surface-area GPU with lower power density, it will be more like the difference between IHS and block on a CPU, i.e. much smaller temperature differences. If I were on air, I might be willing to gain a few C with liquid metal vs paste. But since I will be on water and temps are never a problem, I will only use non-conductive paste on the GPU. My Titan X never gets above 43C with a water temp of 29C, so there's no chance a metal TIM would lower temps more than a few C on water.
Quote:


> Originally Posted by *Stateless*
> 
> When I put blocks on my Maxwell Titan's, someone suggested that before putting the thermal pads on top of the memory modules and other components, that I should put a small amount of thermal paste on them first, lay the pad on them and then put a little more on top of the pad. Is this something that is typically recommended?


If the pads are dry, yes, as they won't fill in micro-crevices as well as paste does. If the pads have a wet surface it may be OK not to, but even then I dust both mating surfaces (not the pads) with a high-conductance paste.
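The power-density argument above can be put into rough numbers: the temperature drop across the TIM layer scales with watts per unit die area, so a small, hot bare CPU die gains more from liquid metal than a big GPU die does. Everything below is an illustrative sketch; the die areas, power draws, bond-line thickness and paste conductivity are all assumptions, not measurements.

```python
# Sketch of the power-density point: dT across the TIM = P * t / (k * A),
# so higher W/mm^2 means a bigger drop and a bigger gain from a better TIM.
# All numbers are assumed for illustration.

def tim_delta_t(power_w, area_mm2, bondline_um, k_w_per_mk):
    """Conductive temperature drop (C) across a TIM layer."""
    return power_w * (bondline_um * 1e-6) / (k_w_per_mk * area_mm2 * 1e-6)

PASTE_K = 8.5   # W/mK, a good non-metal paste (assumed)
METAL_K = 38.4  # W/mK, CLU per its spec sheet
BOND_UM = 25.0  # assumed bond-line thickness

# (label, power in W, die area in mm^2) -- illustrative values
dies = [
    ("bare CPU die (OC'd quad)", 150.0, 177.0),
    ("GP102 GPU die",            250.0, 471.0),
]

for label, power, area in dies:
    gain = (tim_delta_t(power, area, BOND_UM, PASTE_K)
            - tim_delta_t(power, area, BOND_UM, METAL_K))
    print(f"{label:24s}: paste -> LM saves ~{gain:.1f} C "
          f"({power / area:.2f} W/mm^2)")
```

The absolute numbers are crude, but the shape matches the post: the hotter-per-mm^2 CPU die sees roughly half again the benefit of the GPU die, and on water, where the whole stack is already cold, a degree or two is all that's on the table.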


----------



## dante`afk

My GPU is at about 28C idle and 42C load after hours of gaming, currently with Kryonaut.

I'll put the Conductonaut on it in the next few days and see if that lowers anything.

*question about the shunt mod:*
I have the kraken G10 blowing on the right side of the card, will the liquid metal stay on the shunts or not?


----------



## HyperMatrix

Quote:


> Originally Posted by *Testier*
> 
> Should I be worried about applying CLU on the GPU die if I am mounting the GPU vertically?


Not if you don't over-apply. I've run it for years on an (obviously) vertically mounted CPU without any issue/leakage/drip. The key to CLU is really "less is more." Spread it as thin as you can; as long as you see visible coverage you're good to go. Personally, I brush it onto both surfaces, so both the die and the block. But again... a super light layer.


----------



## Gary2015

Quote:


> Originally Posted by *enfluence*
> 
> So I've just purchased a Titan x and will be waiting until I move into my new place to install it but before I do that I thought I'd see whats what and it seems that I should buy Grizzly Kryonaut to give it that extra cooling with the standard air cooler it has. Is this a good idea?


Yes it is; my temps are 10C lower.


----------



## DarkIdeals

Quote:


> Originally Posted by *Gary2015*
> 
> Yes it is my temps are 10c lower.


This is so bizarre. I changed the TIM on my TITAN X Pascal to Grizzly Kryonaut and it didn't do squat for temps. Still getting mid-to-high 80s under load, even with ~85% fan speed. This is starting to worry me. My 2nd TITAN X comes on Monday; I'll have to compare and see if the temps are better on that one. I REALLY hope there isn't something wrong with the card. I suppose even if there is, a waterblock would likely fix it, as I can't imagine it being anything other than the COOLER causing that kind of issue, but still...


----------



## DarkIdeals

Ugh... I wanna throw something. I figured what the heck, I'll try changing my thermal paste again and apply the Kryonaut using the special rubber "brush" the instructions tell you to use. Just put the card back together afterwards, and now the frickin fan is making a HORRID squealing sound that gets louder the faster the fan spins. Putting my finger on the fan to slow it down reduces the noise. It's absolutely ATROCIOUS sounding, worse than any coil whine I've ever heard. What could possibly cause this? It's not like I took the damn fan apart!!


----------



## tpwilko08

Just wondering what people's results are like on water: are the stock clocks more stable, and do you gain overclocking headroom on water? My WB should hopefully arrive next week, so I'll post my findings when it does.


----------



## toncij

Has anyone from Ireland managed to purchase a card? It seems co.uk retailers ship exclusively to the UK...


----------



## Jpmboy

Quote:


> Originally Posted by *carlhil2*
> 
> +1 on the PK-3, has my uni working like a champ..


yeah - it's plenty for pascal. with the Uni's, temps are always in the 30's? (max of like +10 over the water temperature.)
Quote:


> Originally Posted by *Sheyster*
> 
> I used to swear by PK-3 but I've been using Kryonaut since it was released and I'm very happy with it.


once you get into the higher quality TIMs it is more the mount quality that affects the outcome.


----------



## DADDYDC650

Quote:


> Originally Posted by *DarkIdeals*
> 
> This is so bizarre. I changed my TIM on my TITAN X pascal to Grizzly Kryonaut and it didn't do squat for temps. Still getting mid to high 80's under load even with ~85% fan speed. This is starting to worry me. My 2nd TITAN X comes on monday, i'll have to compare it and see if the temps are better on that one. I REALLY hope there isn't something wrong with the card, i suppose even if there is a waterblock would likely fix it as i can't imagine it being anything other than the COOLER causing that kind of issue, but still...


Highly doubt changing the TIM will lower temps by 10c. Maybe 1-3c. Something is amiss.


----------



## hotrod717

Quote:


> Originally Posted by *DADDYDC650*
> 
> Highly doubt changing the TIM will lower temps by 10c. Maybe 1-3c. Something is amiss.


1-3C would hardly be worth the effort. A 10C variance between stock and high-end TIM is not unheard of, and is quite common with CLU, Kryonaut and Gelid. More important, as JPM points out, is the mount.


----------



## DADDYDC650

Quote:


> Originally Posted by *hotrod717*
> 
> 1-3* would hardly be worth the effort. 10* variance between stock and high-end tim is not unheard of and quite common with clu, kryonaut and gelid. More important, as JPM points out, is the mount.


I've mentioned before that the cooler was probably not mounted correctly hence a big shift in temps.


----------



## Gary2015

Quote:


> Originally Posted by *DarkIdeals*
> 
> This is so bizarre. I changed my TIM on my TITAN X pascal to Grizzly Kryonaut and it didn't do squat for temps. Still getting mid to high 80's under load even with ~85% fan speed. This is starting to worry me. My 2nd TITAN X comes on monday, i'll have to compare it and see if the temps are better on that one. I REALLY hope there isn't something wrong with the card, i suppose even if there is a waterblock would likely fix it as i can't imagine it being anything other than the COOLER causing that kind of issue, but still...


You put too much on. Spread it thin and even.


----------



## Gary2015

Quote:


> Originally Posted by *DADDYDC650*
> 
> Highly doubt changing the TIM will lower temps by 10c. Maybe 1-3c. Something is amiss.


I was getting 75-80c on my top card, now 65/66c. I don't put too much and I use the spatula. A little bit and thin.


----------



## hotrod717

Quote:


> Originally Posted by *DADDYDC650*
> 
> I've mentioned before that the cooler was probably not mounted correctly hence a big shift in temps.


That is not what I am saying. A 10* variance between stock and a high-end tim is not at all uncommon.


----------



## DADDYDC650

Quote:


> Originally Posted by *hotrod717*
> 
> That is not what I am saying. A 10* variance between stock and a high-end tim is not at all uncommon.


Got anything to prove a 10 degree difference between Nvidia's stock TIM and "high end TIM"? I need consistent results and not just hearsay.


----------



## Tideman

Quote:


> Originally Posted by *toncij*
> 
> Has anyone from Ireland managed to purchase a card? It seems co.uk ships exclusively to UK..


Yes I purchased 2 on launch. I just used parcel motel and it worked, got my cards in 4 days.


----------



## Difunto

Quote:


> Originally Posted by *DarkIdeals*
> 
> Ugh....i wanna throw something. I figured what the heck, i'll try changing my thermal paste again and apply the kryonaut using the special rubber "brush" that the instructions tell you to use. Just put the card back together afterwards and now the frickin fan is making a HORRID squealing sound that gets louder the faster the fan moves. Putting my finger on the fan to slow it down reduces the noise. It's absolutely ATROCIOUS sounding, worse than any coil whine etc.. i've ever heard. what could possibly cause this? It's not like i took the damn fan apart!!


I know what you did... it's the cable that powers the LED; it's probably inside the fan chamber. Open it up again and fix that, and once you put it back together, move the fan with your hand to make sure the cable is out of the way.


----------



## toncij

Quote:


> Originally Posted by *Tideman*
> 
> Yes I purchased 2 on launch. I just used parcel motel and it worked, got my cards in 4 days.


What's a parcel motel?








I presume you ordered to that "parcel motel" in UK and shipped to yourself from there? Which one?


----------



## Steven185

Quote:


> Originally Posted by *Gary2015*
> 
> I was getting 75-80c on my top card, now 65/66c. I don't put too much and I use the spatula. A little bit and thin.


Your results are great, so I'm willing to postpone buying hybrid cooling for the time being and try repasting first. My only concern is that there are reports of damage done to the heatsink/retention system by some people who improperly tried to remove it on this generation of cards.

Any tips to avoid said damage? For example, do I need a certain type of tool/screwdriver?


----------



## Testier

Quote:


> Originally Posted by *Steven185*
> 
> Your results are great so I'm willing to postpone buying hybrid cooling for the time being and try repasting first. My only concern is that there are reports of damage done to heatsink/retension system by some people who improperly tried to remove it in this generation of cards.
> 
> Any tips to avoid said damage? For example do I need a certain type of tool/screwdriver?


You need a 4mm hex socket wrench I believe.


----------



## Sheyster

Quote:


> Originally Posted by *HyperMatrix*
> 
> Get a liquid metal for the gpu die. CLU or grizzly conductonaut depending on what is available to you.


I would avoid an LM TIM if your card is vertically mounted (test bench, desktop-style case like mine, etc.).

Just too risky IMHO with a $1200 card.


----------



## Testier

Quote:


> Originally Posted by *Sheyster*
> 
> I would avoid a LM TIM if your card is vertically mounted (test bench, desktop style case like mine, etc.)
> 
> Just too risky IMHO with a $1200 card.


I am planning on using Gelid Extreme, probably the best non-conductive paste. I do like these test-bench-style cases though.


----------



## Sheyster

Hey guys, regrading LM TIMs and the new Thermal Grizzly LM (Conductonaut), check this out:

http://overclocking.guide/thermal-paste-roundup-2015-47-products-tested-with-air-cooling-and-liquid-nitrogen-ln2/6/

I would be shocked if any LM is much better than the Phobya, which I use. It's probably within 0.5 deg C max. Get Phobya LM, it's cheap!

LINK: http://www.performance-pcs.com/phobya-liquid-metal-thermal-compound-paste-lm-1g.html

6 dolla!!


----------



## Sheyster

Quote:


> Originally Posted by *Testier*
> 
> I am planning on using Gelid Extreme, probably the best non conductive paste. I do like these test bench style cases though.


Kryonaut is better, if you want the best non LM TIM.


----------



## Sheyster

Quote:


> Originally Posted by *Jpmboy*
> 
> once you get into the higher quality TIMs it is more the mount quality that affects the outcome.


True dat! My built-in "bad mount" meter is finely tuned at this point! I assume yours is too!


----------



## toncij

Can someone please tell dimensions and weight of the TXP with the box?


----------



## Jpmboy

Quote:


> Originally Posted by *Sheyster*
> 
> Kryonaut is better, if you want the best non LM TIM.


TGK is the best IMO... but also recognize that the author of that review has a major stake in Kryonaut. Not sayin' he's biased... lol, for any business in the US, the assumption of bias would be the default.


----------



## DarkIdeals

Quote:


> Originally Posted by *Difunto*
> 
> i know what you did... its the cable that connects to power the LED its probably inside the fan chamber... open it again and fix that once you put it back move the fan with ur hand to make sure its out of the way.


Tried re-doing everything again. Took it all apart and put it together again. Still makes the fan noise. It's just a HORRIBLE high-pitched screeching coming directly from the fan that stops when I put my finger on the fan, and lessens when you turn the fan speed down. I tried to make sure the cord was out of the way, but idk. When I have the card in my hand and turn the blade it doesn't make any sound at all, only when it's powered on.

ANYONE have any idea what this is? It's driving me crazy! I didn't do ANYTHING to the fan!


----------



## KillerBee33

Got Gelid Extreme laying around; will it make any difference changing from EVGA's Hybrid factory paste?


----------



## Sheyster

Quote:


> Originally Posted by *Jpmboy*
> 
> TGK is the best IMO.. but also recognize that the author of that review is the has a major stake in Kyronaut. Not sayin' he's biased... lol for any business in the US, the assumption of bias would be the default.


Good point! We need a Devil's Advocate POV every now and then, thanks!


----------



## Sheyster

Quote:


> Originally Posted by *KillerBee33*
> 
> Got GELID Extreme laying around , will it make any difference changing from EVGAs Hybrid Factory paste?


It's one of the top non LM TIMs, probably #3. Use it if you have it!


----------



## Stateless

I ordered some of the Grizzly thermal paste. I never used it before, but I assume it should be applied like other pastes: thin and even? Should I also apply a thin, even layer on the water block, or is just covering the surface of the GPU itself fine?


----------



## KillerBee33

Quote:


> Originally Posted by *Sheyster*
> 
> It's one of the top non LM TIMs, probably #3. Use it if you have it!


Good to know







Also, since the Titan X on air is working against the NZXT Kraken, I had to clock the 6700 back down to stock, because even at 4.6 temps rise to 82. Should I also use Gelid on the CPU?


----------



## Sheyster

Quote:


> Originally Posted by *KillerBee33*
> 
> Good to know
> 
> 
> 
> 
> 
> 
> 
> Also, since the Titan X on air is working against the NZXT Kraken, I had to clock the 6700 back down to stock, because even at 4.6 temps rise to 82. Should I also use Gelid on the CPU?


Might as well try it!


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Got GELID Extreme laying around , will it make any difference changing from EVGAs Hybrid Factory paste?


Gelid Extreme is a top 3, and has been for a long time - great stuff. You will not detect any difference between it and.. well, any of the top 5 for that matter.


----------



## dante`afk

So I've changed the TIM from Kryonaut to Conductonaut.

Kryonaut / Conductonaut
idle: 28C / 26C
load: 48C / 50C

meh, liquid metal not better


----------



## aylan1196

Quote:


> Originally Posted by *DarkIdeals*
> 
> Tried re-doing everything again. Took it all apart and put together again. Still makes the fan noise. It's just a HORRIBLE high pitch screaching coming directly from the fan that stop when i put my finger on the fan to stop it, and lessens when you turn the fan down. I tried to make sure the cord was out of the way but idk. When i have the card in my hand and turn the blade it doesn't make any sound at all, only when it's on.
> 
> ANYONE have any idea what this is? It's driving me crazy! I didn't do ANYTHING to the fan!


Happened to me before on a Titan X Maxwell. The fan is faulty; probably the clip/small pin is damaged, so check that. What I did: I had an old reference 980, swapped the fans over, and voila, working again.
Good luck


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> Gelid Extreme is a top 3, and has been for a long time - great stuff. You will not detect any difference between it and.. well, any of the top 5 for that matter.


That's the thing, I'm not sure what EVGA uses.

Don't have much time to reassemble everything twice.


----------



## DarkIdeals

Quote:


> Originally Posted by *aylan1196*
> 
> Happend to me b4 on Titan xm the fan is faulty probably the clip is damaged the small pin check it out and what I did is I had an old 980 reference gpu swapped the fans and viola working again
> Good luck


So I managed to fix it somehow. I noticed that a small piece of thermal pad had fallen into the fan cavity and was JUST barely within reach of the fan blades, so I removed it. Not sure if that's what did it, as I also changed back to the stock pads for the rest of the card instead of the EK ones I'd stuck on. But regardless of which one did it, the fan is now normal again... for now at least lol.


----------



## Jpmboy

Quote:


> Originally Posted by *dante`afk*
> 
> so I've changed the TIM from kryonaut to condactonaut.
> kryonaut / condactonaut
> idle: 28c / 26c
> load: 48c / 50c
> meh, liquid metal not better


The thing with LM is that it really can't fill gaps caused by surface irregularities on the heat-sink side. Works great for a delid or with a very high quality water block (nickel plate only!).. but a stock air cooler?
Quote:


> Originally Posted by *KillerBee33*
> 
> That's the thing , i'm not sure what EVGA uses
> 
> 
> 
> 
> 
> 
> 
> Don't have much time reassemble everything twice .


EVGA is not using a top TIM, that's for sure. I changed the TIM on one of my TXPs before putting uniblocks on them... used Gelid or Kryo, can't remember. Brought temps down about 5-10C. Ya never know; sometimes it looks like the OEM put the TIM on with a piping bag.


----------



## dante`afk

Quote:


> Originally Posted by *Jpmboy*
> 
> the thing with LM is that it really can't fill gaps caused by surface irregularities in the heat sink side. Works great for a delid or with a very high quality water block (nickel plate only!).. but a stock aircooler?


yea, figured. not stock cooler, using the H90 + kraken G10


----------



## CallsignVega

Quote:


> Originally Posted by *dante`afk*
> 
> so I've changed the TIM from kryonaut to condactonaut.
> 
> kryonaut / condactonaut
> idle: 28c / 26c
> load: 48c / 50c
> 
> meh, liquid metal not better


Something like Gelid Extreme is so good that the liquid metal stuff is rather pointless, and not worth the trouble for a degree or two.


----------



## Tideman

Quote:


> Originally Posted by *toncij*
> 
> What's a parcel motel?
> 
> 
> 
> 
> 
> 
> 
> I presume you ordered to that "parcel motel" in UK and shipped to yourself from there? Which one?


http://www.parcelmotel.com/

When you go to order your card, you just use the address for their UK depot in Northern Ireland. Make sure you've registered with them first though, otherwise they might not accept your parcel. Then once it's arrived with them, they bring it over the border to your local depot down south (I chose to have mine delivered).


----------



## lilchronic

Yeah, I remember about 4 years ago I put LM on my 670's thinking I was going to get a nice temp drop like my 3570K when delidded.

It was a sad day. Took me all day to tear down my loop and redo the paste, just for a 1-2° temp drop.


----------



## HyperMatrix

Quote:


> Originally Posted by *Sheyster*
> 
> I would avoid a LM TIM if your card is vertically mounted (test bench, desktop style case like mine, etc.)
> 
> Just too risky IMHO with a $1200 card.


Out of curiosity, are you saying you don't use CLU for your CPU then? Or are you saying that you don't trust others to not over-do it with the amount of TIM they apply?


----------



## gamingarena

OK, I asked this before but got no answer, so I'll try again.

Does no one else have a problem with the TDP limit dropping randomly and not coming back until a reboot?

Busting my head, can't figure out what it is. It's basically like moving the TDP slider to 80% in Afterburner: it sits there and won't come back to 120%. The clocks all stay the same, they don't move; just the TDP limit drops to 70-80% (it varies), and it needs a reboot to get back to 120%.

It's really annoying; there is no driver crash or anything, just a random drop.

The only thing I can think of is that the Titan X is not properly supported by Afterburner, but then again it looks like I'm the only one with the problem.

Anyone have any idea what could be dropping the TDP limit like that, or have a similar problem?

Or is it just a faulty card? Other than that it's rock solid.


----------



## HyperMatrix

Quote:


> Originally Posted by *gamingarena*
> 
> OK, I asked this before but got no answer, so I'll try again.
> 
> Does no one else have a problem with the TDP limit dropping randomly and not coming back until a reboot?
> 
> It's basically like moving the TDP slider to 80% in Afterburner: it sits there and won't come back to 120%. The clocks all stay the same; just the TDP limit drops to 70-80%, and it needs a reboot to get back to 120%.
> 
> It's really annoying; there is no driver crash or anything, just a random drop.
> 
> The only thing I can think of is that the Titan X is not properly supported by Afterburner, but then again it looks like I'm the only one with the problem.
> 
> Anyone have any idea what could be dropping the TDP limit like that, or have a similar problem?
> 
> Or is it just a faulty card? Other than that it's rock solid.


Did you check Windows Event Viewer to make sure there was no quick display driver crash/recovery?


----------



## gamingarena

Quote:


> Originally Posted by *HyperMatrix*
> 
> Did you check windows event viewer to make sure there was no quick display driver crash/recovery?


No, I forgot. I'll do that when I get home, but honestly I would expect to see a driver crash visually, unless they made it so you can't really tell the driver crashed in front of you.

Or did they? Is the recovery so fast now that you can't tell it crashed?

Thanks for the input


----------



## HyperMatrix

Quote:


> Originally Posted by *gamingarena*
> 
> No, I forgot. I'll do that when I get home, but honestly I would expect to see a driver crash visually, unless they made it so you can't really tell the driver crashed in front of you.
> 
> Or did they? Is the recovery so fast now that you can't tell it crashed?
> 
> Thanks for the input


Depends on the crash. I've had times where it was nothing more than an instant screen flicker, and I didn't realize anything had happened until I noticed the FPS in game had dropped. Of course in that situation my GPU clock would be running lower as well. So I made a batch file to unload and reload the display driver without having to reboot the PC to get things running 100% again.


----------



## bee144

Quote:


> Originally Posted by *KillerBee33*
> 
> That's the thing , i'm not sure what EVGA uses
> 
> 
> 
> 
> 
> 
> 
> Don't have much time reassemble everything twice .


EVGA Jacob shared on their board what thermal paste is used on the hybrid coolers. It was some off-brand, Chinese-made stuff. It had 5 stars on Newegg. Can't remember the name though.


----------



## KillerBee33

Quote:


> Originally Posted by *bee144*
> 
> EVGA Jacob shared on their board what thermal paste is used on the hybrid coolers. It was some off brand Chinese based stuff. It had 5 stars on Newegg. Can't remember the name though.


Is that a good or a bad thing? Things I bought from Newegg were not based on stars.


----------



## gamingarena

Quote:


> Originally Posted by *HyperMatrix*
> 
> Depends on the crash. I've had times where it was nothing more than an instant screen flicker, and I didn't realize anything had happened until I noticed the FPS in game had dropped. Of course in that situation my GPU clock would be running lower as well. So I made a batch file to unload and reload the display driver without having to reboot the PC to get things running 100% again.


Oh, can you give me instructions for that batch file,
so I don't need to reboot my system?

Thanks in advance


----------



## HyperMatrix

Quote:


> Originally Posted by *gamingarena*
> 
> Oh, can you give me instructions for that batch file,
> so I don't need to reboot my system?
> 
> Thanks in advance


The parameters might be different on your PC. But it's basically just using Devcon to do it. So for mine, I have it set to identify all display drivers with the identifier VEN_10DE in it. So just create a .bat file anywhere. Like in your C:\ root. Make sure Devcon.exe is placed in that same spot. Use a similar command:

c:\devcon.exe restart =display *VEN_10DE*

Then just create a link to your NAME.bat file and place it on your desktop for easy access.
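If you'd rather script it than keep a raw one-liner, here's a minimal Python sketch of the same idea (assumptions: the `C:\devcon.exe` path mirrors the post above, adjust it to wherever you keep the tool; `VEN_10DE` is NVIDIA's PCI vendor ID; devcon is a Windows-only Microsoft utility, so the sketch no-ops elsewhere):

```python
import shutil
import subprocess

# Assumed location of devcon.exe, matching the .bat example; adjust as needed.
DEVCON = r"C:\devcon.exe"

# Restart every display device whose hardware ID contains NVIDIA's
# PCI vendor ID (VEN_10DE), mirroring: c:\devcon.exe restart =display *VEN_10DE*
CMD = [DEVCON, "restart", "=display", "*VEN_10DE*"]

def restart_nvidia_display():
    """Run the devcon restart only if the tool is actually present (Windows only)."""
    if shutil.which(DEVCON):
        subprocess.run(CMD, check=True)
    else:
        print("devcon.exe not found; nothing done")
```

Same caveat as the batch version: the vendor ID filter catches all NVIDIA display devices, so on a multi-GPU box this restarts every card.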


----------



## gamingarena

Quote:


> Originally Posted by *HyperMatrix*
> 
> The parameters might be different on your PC. But it's basically just using Devcon to do it. So for mine, I have it set to identify all display drivers with the identifier VEN_10DE in it. So just create a .bat file anywhere. Like in your C:\ root. Make sure Devcon.exe is placed in that same spot. Use a similar command:
> 
> c:\devcon.exe restart =display *VEN_10DE*
> 
> Then just create a link to your NAME.bat file and place it on your desktop for easy access.


Thanks a lot
Cheers


----------



## Stateless

My wife just gave me the green light to order a 2nd Titan X Pascal. My original goal was to go to a single card due to some of the recent games not supporting SLI. However, playing The Division and not being able to max it out on a single Titan XP at 4k/60fps was bumming me out a little (I have my TXP boosting to 1974). I was able to hit around 52fps in gameplay with everything set to max with the exception of shadows, fog level and particle effects. In Witcher 3 at 4k, I am able to max it out with the exception of Hairworks and AA (both off), and in fights I get around 52fps or so.

My questions, actually several, are as follows:

1.) For those that have SLI Titan XPs, are you able to max out Witcher 3 at 4k (I am talking complete max, with Hairworks fully on and AA) and get 60fps?
2.) For The Division, same thing, completely maxed out at 4k/60fps?
3.) I know from reading that the EK blocks won't allow the Nvidia HB Bridge to work, but someone said they used a sander to sand down the pointed parts. I can't find that post, but is it possible that this would work?

Don't get me wrong, I am getting tremendous performance out of my single card now; being able to play Doom 100% maxed out at 4k/60fps has been great. On my Titan Maxwell this would not be remotely close due to that game only supporting single GPUs. I also know that my single Titan will go faster once I have water cooling and once Afterburner allows voltage control. But with the wife giving me the green light, I just want to know that with 2 Titan XPs I should be able to max out Witcher 3 and The Division with ease at 4k/60fps.

Thanks to anyone who can answer some of these. I am on Nvidia's site ready to order, and on EKWB to order a 2nd block.


----------



## HyperMatrix

Quote:


> Originally Posted by *Stateless*
> 
> My wife just gave me the green light to order a 2nd Titan X Pascal. My original goal was to go to a single card due to some of the recent games not supporting SLI. However, playing the Division and not being able to max it out on a single Titan XP at 4k/60fps was bumming me out a little (I have my TXP boosting to 1974). I was able to hit around 52fps in gameplay with everything set to max with the exception of shadows, fog level and particle effects. In Witcher 3 at 4k. I am able to max it out with the exception of Hairworks off and AA off and in fights I would get around 52fps or so.
> 
> My questions, actually several, are as follows:
> 
> 1.) For those that have SLI Titan XPs, are you able to max out Witcher 3 at 4k (I am talking complete max, with Hairworks fully on and AA) and get 60fps?
> 2.) For The Division, same thing, completely maxed out at 4k/60fps?
> 3.) I know from reading that the EK blocks won't allow the Nvidia HB Bridge to work, but someone said they used a sander to sand down the pointed parts. I can't find that post, but is it possible that this would work?
> 
> Don't get me wrong, I am getting tremendous performance out of my single card now, being able to play doom 100% maxed out at 4k/60fps has been great. On my Titan Maxwell this would not be remotely close due to that game only supporting single GPU's. I also know that my single Titan will go faster once I have Water Cooling and when After Burner allows voltage. But with the Wife giving me the green light, I just want to know that with 2 Titan XP's that I should be able to max out Witcher 3 and Division with ease at 4k/60fps.
> 
> Thanks to anyone that can answer some of these. I am on Nvidia's site ready to order and on EKWB to order a 2nd block.


On average, I find 2 cards to be sufficient for maxed out 1440p at 140-180 fps, barring CPU limitations. Putting them under water will push that up to about 155-200fps. 4k is 2.2x more taxing. So if you divide 155-200 by 2.2, you'll end up with 70-90fps at 4k. With a little tweaking you could probably even get away with 4k at 100Hz when monitors come out. But sadly, 4k 144Hz will require 2 next gen titans.
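For what it's worth, that "2.2x more taxing" factor lines up with the raw pixel counts (4K pushes 2.25x the pixels of 1440p). A quick sketch of the estimate, assuming fps scales inversely with pixel count, which is a rough rule of thumb rather than a guarantee for any given game:

```python
# Pixel counts for each resolution
PIX_1440P = 2560 * 1440   # 3,686,400
PIX_4K = 3840 * 2160      # 8,294,400

# 4K is exactly 2.25x the pixels of 1440p, close to the ~2.2x quoted above
FACTOR = PIX_4K / PIX_1440P

def est_4k_fps(fps_1440p, factor=FACTOR):
    """Naive estimate: assume fps scales inversely with pixel count."""
    return fps_1440p / factor

# The 155-200fps 1440p range maps to roughly 69-89fps at 4K,
# matching the 70-90fps ballpark in the post above.
low, high = est_4k_fps(155), est_4k_fps(200)
```

Real scaling is usually a bit better than this, since not all GPU work (geometry, CPU-bound frames) grows with resolution.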


----------



## Sheyster

Quote:


> Originally Posted by *HyperMatrix*
> 
> Out of curiosity, are you saying you don't use CLU for your CPU then? Or are you saying that you don't trust others to not over-do it with the amount of TIM they apply?


On my delidded CPU, I'm using Phobya Liquid Metal between the die and the IHS. Between the IHS and the block I use Kryonaut.

I was referring to avoiding the use of a liquid metal TIM on video cards if they're mounted vertically, as in a test bench.


----------



## Sheyster

Quote:


> Originally Posted by *bee144*
> 
> EVGA Jacob shared on their board what thermal paste is used on the hybrid coolers. It was some off brand Chinese based stuff. It had 5 stars on Newegg. Can't remember the name though.


Probably Shin-Etsu. They make different TIMs, but nothing on the same level as Kryonaut, PK-3 or Gelid GC Extreme.


----------



## Gary2015

Quote:


> Originally Posted by *Steven185*
> 
> Your results are great so I'm willing to postpone buying hybrid cooling for the time being and try repasting first. My only concern is that there are reports of damage done to heatsink/retension system by some people who improperly tried to remove it in this generation of cards.
> 
> Any tips to avoid said damage? For example do I need a certain type of tool/screwdriver?


I would say the hex screws are the most finicky. Use the proper hex key and don't use pliers.
https://www.ekwb.com/shop/hex-socket-4mm


----------



## Gary2015

Quote:


> Originally Posted by *HyperMatrix*
> 
> On average, I find 2 cards to be sufficient for maxed out 1440p at 140-180 fps, barring CPU limitations. Putting them under water will push that up to about 155-200fps. 4k is 2.2x more taxing. So if you divide 155-200 by 2.2, you'll end up with 70-90fps at 4k. With a little tweaking you could probably even get away with 4k at 100Hz when monitors come out. But sadly, 4k 144Hz will require 2 next gen titans.


Quote:


> Originally Posted by *Stateless*
> 
> My wife just gave me the green light to order a 2nd Titan X Pascal. My original goal was to go to a single card due to some of the recent games not supporting SLI. However, playing the Division and not being able to max it out on a single Titan XP at 4k/60fps was bumming me out a little (I have my TXP boosting to 1974). I was able to hit around 52fps in gameplay with everything set to max with the exception of shadows, fog level and particle effects. In Witcher 3 at 4k. I am able to max it out with the exception of Hairworks off and AA off and in fights I would get around 52fps or so.
> 
> My questions, actually several, are as follows:
> 
> 1.) For those that have SLI Titan XPs, are you able to max out Witcher 3 at 4k (I am talking complete max, with Hairworks fully on and AA) and get 60fps?
> 2.) For The Division, same thing, completely maxed out at 4k/60fps?
> 3.) I know from reading that the EK blocks won't allow the Nvidia HB Bridge to work, but someone said they used a sander to sand down the pointed parts. I can't find that post, but is it possible that this would work?
> 
> Don't get me wrong, I am getting tremendous performance out of my single card now, being able to play doom 100% maxed out at 4k/60fps has been great. On my Titan Maxwell this would not be remotely close due to that game only supporting single GPU's. I also know that my single Titan will go faster once I have Water Cooling and when After Burner allows voltage. But with the Wife giving me the green light, I just want to know that with 2 Titan XP's that I should be able to max out Witcher 3 and Division with ease at 4k/60fps.
> 
> Thanks to anyone that can answer some of these. I am on Nvidia's site ready to order and on EKWB to order a 2nd block.


Yes to 1) and 2).

3) Use the EVGA HB bridge.



See thread
http://www.overclock.net/t/1607133/titan-x-pascal-waterblock-release-date-expectations/150#post_25436007


----------



## HyperMatrix

Quote:


> Originally Posted by *Sheyster*
> 
> On my delidded CPU, I'm using Phobya Liquid Metal between the die and the IHS. Between the IHS and the block I use Kryonaut.
> 
> I was referring to avoiding use of a liquid metal TIM on video cards if they're mounted in a test bench (vertical mount) as follows:


Have there been any tests showing that there is any concern with proper application in a vertical setting? I used CLU both between die and IHS, and between IHS and CPU block, for over 2 years. The CPU block coverage area is obviously bigger than the IHS, so even if there was dripping of any kind, it would run down the block as opposed to anything else. But when I removed the block and re-applied CLU, I didn't notice any out-of-place spread of the liquid metal. I just can't imagine CLU dripping at all if you don't over-apply it.

I'm just wondering if you read that this is an issue somewhere, or if it's just a personal worry. Thanks.


----------



## Menthol

Quote:


> Originally Posted by *Sheyster*
> 
> Probably Shin-Etsu. They make different TIMs, but nothing on the same level as Kryonaut, PK-3 or Gelid GC Extreme.


Like everyone has been saying: CLU between die and lid, and any of the top five TIMs anywhere else. Especially if, like most here, you swap hardware frequently for testing and benching, PK-1 is the easiest paste to spread; I use it when swapping hardware constantly, and one of the three above for a more permanent mount.

Hex drivers are easier to use than a socket, they just are.
https://www.amazon.com/gp/product/B000BQ4XPQ/ref=oh_aui_detailpage_o02_s00?ie=UTF8&psc=1


----------



## jaminiah

Quote:


> Originally Posted by *Gary2015*
> 
> He has faulty cards. I had this with my old TXMs. RMA them. I have SLI and it's butter smooth.


Managed to spend a few hours investigating this over the weekend and noticed the stutter occurred during an autosave in each game.
So I moved a few games from the Samsung Evo 840 to the Samsung SM951 and now it's super smooth.

So the card isn't faulty. Even though the firmware is up to date on the Evo 840, I assume it still has problems. I'll order a new SSD for game storage!


----------



## DarkIdeals

Quote:


> Originally Posted by *Stateless*
> 
> My wife just gave me the green light to order a 2nd Titan X Pascal. My original goal was to go to a single card due to some of the recent games not supporting SLI. However, playing the Division and not being able to max it out on a single Titan XP at 4k/60fps was bumming me out a little (I have my TXP boosting to 1974). I was able to hit around 52fps in gameplay with everything set to max with the exception of shadows, fog level and particle effects. In Witcher 3 at 4k. I am able to max it out with the exception of Hairworks off and AA off and in fights I would get around 52fps or so.
> 
> My questions, actually several, are as follows:
> 
> 1.) For those that have SLI Titan XPs, are you able to max out Witcher 3 at 4k (I am talking complete max, with Hairworks fully on and AA) and get 60fps?
> 2.) For The Division, same thing, completely maxed out at 4k/60fps?
> 3.) I know from reading that the EK blocks won't allow the Nvidia HB Bridge to work, but someone said they used a sander to sand down the pointed parts. I can't find that post, but is it possible that this would work?
> 
> Don't get me wrong, I am getting tremendous performance out of my single card now, being able to play doom 100% maxed out at 4k/60fps has been great. On my Titan Maxwell this would not be remotely close due to that game only supporting single GPU's. I also know that my single Titan will go faster once I have Water Cooling and when After Burner allows voltage. But with the Wife giving me the green light, I just want to know that with 2 Titan XP's that I should be able to max out Witcher 3 and Division with ease at 4k/60fps.
> 
> Thanks to anyone that can answer some of these. I am on Nvidia's site ready to order and on EKWB to order a 2nd block.


1) Yup. It's basically guaranteed that in games that support SLI you WILL get a constant 60fps, even in minimums. Go look up a guy called "thirty IR" on YouTube; he did 4K testing with SLI TITAN XPs in like FIFTEEN different games and EVERY single game had well over 60fps minimums! In Witcher 3 maxed out with Hairworks etc. he got an average of 86fps iirc, minimums were like 75fps and the max was ~96fps. My 2nd card comes tomorrow so I'll confirm it for you, but I'm EXTREMELY confident that it'll easily hold a constant 60fps even for minimums.

2) Yes. Same thing.

3) Idk about sanding; the only working mod I've seen done to these is with a Dremel to saw off the pointed bits. It did work, for the record. However it kinda looks fugly in my opinion... just saying. I'm not a fan of the EVGA HB bridge either though; really hoping the EK HB bridge comes out end of August like they told us it would. I really wanna see what it looks like...


----------



## mbze430

I don't own Witcher 3, but I am now wondering how well it will play with SLI TXP and a dedicated 980TI PhysX


----------



## DarkIdeals

Quote:


> Originally Posted by *mbze430*
> 
> I don't own Witcher 3, but I am now wondering how well it will play with SLI TXP and a dedicated 980TI PhysX


iirc, Witcher 3 wasn't very big on PhysX. Could be wrong, I guess, but I don't think I am. I doubt even a 980 Ti as PPU would see an appreciable increase, especially on cards as powerful as these TITANs. Maybe try it out on Metro LL/2033, or the Arkham games, etc. Too bad I sold my MSI Gaming X 1080 or I'd throw that in as a dedicated PhysX PPU card and see how it worked. Meh...

Oh, and here's the video I mentioned showing TITAN XP SLI gaming results @Stateless


----------



## habu58

That's cool


----------



## atreides

Got mine last week! I am running 3440x1440 at 100Hz with my Titan X. It's been amazing so far. Currently loving Fallout 4 with the Olympus ENB mod and 120 other mods loaded. I can't wait to try out Battlefield 1 and all the other hit games coming out. I am considering grabbing another one for SLI, but I'm going to wait to see how the new games perform with just one.


----------



## Gary2015

Quote:


> Originally Posted by *atreides*
> 
> Got mine last week! I am running 3440x1440 at 100Hz with my Titan X. It's been amazing so far. Currently loving Fallout 4 with the Olympus ENB mod and 120 other mods loaded. I can't wait to try out Battlefield 1 and all the other hit games coming out. I am considering grabbing another one for SLI, but I'm going to wait to see how the new games perform with just one.


I have the Acer X34. You need two for max settings.


----------



## atreides

Quote:


> Originally Posted by *Gary2015*
> 
> I have the ACER X34. You need two for max settings .


You're talking about Battlefield 1?


----------



## Kaapstad

What a dreadful owners thread, the OP has made no effort at all.


----------



## Gary2015

Quote:


> Originally Posted by *atreides*
> 
> You're talking about Battlefield 1?


Yes


----------



## atreides

Okay thank you!


----------



## toncij

Quote:


> Originally Posted by *Tideman*
> 
> http://www.parcelmotel.com/
> 
> When you go to order your card, you just use the address for their UK depot in Northern Ireland. Make sure you've registered with them first though, otherwise they might not accept your parcel. Then once it's arrived with them, they bring it over the border to your local depot down south (I chose to have mine delivered).


Yep, googled and found it.









Quote:


> Originally Posted by *Stateless*
> 
> My wife just gave me the green light to order a 2nd Titan X Pascal. My original goal was to go to a single card due to some of the recent games not supporting SLI. However, playing the Division and not being able to max it out on a single Titan XP at 4k/60fps was bumming me out a little (I have my TXP boosting to 1974). I was able to hit around 52fps in gameplay with everything set to max with the exception of shadows, fog level and particle effects. In Witcher 3 at 4k. I am able to max it out with the exception of Hairworks off and AA off and in fights I would get around 52fps or so.
> 
> My question, actual several are as follows:
> 
> 1.) For those that have SLI Titan XP's are you able to max out (I am talking complete max with hairworks full on/AA) at 4k Witcher 3 and get 60fps?
> 2.) For Division, same thing, completely maxed out 4k/60fps?
> 3.) I know from reading that the EK Blocks wont allow the Nvidia HB Bridge to work, but someone said they used a sander to sand down the pointing parts. I can't find that post, but is it possible that this would work?
> 
> Don't get me wrong, I am getting tremendous performance out of my single card now, being able to play doom 100% maxed out at 4k/60fps has been great. On my Titan Maxwell this would not be remotely close due to that game only supporting single GPU's. I also know that my single Titan will go faster once I have Water Cooling and when After Burner allows voltage. But with the Wife giving me the green light, I just want to know that with 2 Titan XP's that I should be able to max out Witcher 3 and Division with ease at 4k/60fps.
> 
> Thanks to anyone that can answer some of these. I am on Nvidia's site ready to order and on EKWB to order a 2nd block.


Keep in mind that The Division does not scale in SLI. Your second card is useless for it.


----------



## guttheslayer

Guys, can I ask how you spread LM or new TIM on your Titan GPU?

1) Credit card spreading
2) Big X
3) Pea-sized drop at the centre, compressed by the heatsink

Which one?


----------



## cookiesowns

Quote:


> Originally Posted by *guttheslayer*
> 
> Guys, can I ask how you spread LM or new TIM on your Titan GPU?
> 
> 1) Credit card spreading
> 2) Big X
> 3) Pea-sized drop at the centre, compressed by the heatsink
> 
> Which one?


I ended up doing pea + line, plus some dabs (lol) at the far edges if I feel the heatsink isn't that even. I finally realized... more is always better than less; it just makes a mess.

EDIT: This is for TIM. I don't use liquid metal or anything of the sort.


----------



## lilchronic

Quote:


> Originally Posted by *guttheslayer*
> 
> Guys, can I ask how you spread LM or new TIM on your Titan GPU?
> 
> 1) Credit card spreading
> 2) Big X
> 3) Pea-sized drop at the centre, compressed by the heatsink
> 
> Which one?


----------



## HyperMatrix

Quote:


> Originally Posted by *lilchronic*


Very important to note: do not over-apply. Also, I like to use the pads that come with CLU to prepare my block, since CLU fills it in and bonds to it. Personally, I apply as thin a layer as possible, and I apply it to both surfaces.


----------



## Kaapstad

Quote:


> Originally Posted by *guttheslayer*
> 
> Guys, can I ask how you spread LM or new TIM on your Titan GPU?
> 
> 1) Credit card spreading
> 2) Big X
> 3) Pea-sized drop at the centre, compressed by the heatsink
> 
> Which one?


----------



## guttheslayer

Hmm, how about for a non-conductive TIM? Is the application method the same as for CLU?


----------



## markklok

I first went for the X (totally overdid it).

The 2nd attempt was OK.

I used Arctic 5 because I didn't have anything better.


----------



## lilchronic

Quote:


> Originally Posted by *guttheslayer*
> 
> Hmmm how about for non-conductive tim? Method applied is same as CLU?


With regular thermal paste I use a small X method.


----------



## Gary2015

Quote:


> Originally Posted by *markklok*
> 
> I first went for the X (totally overdid it).
> 
> The 2nd attempt was OK.
> 
> I used Arctic 5 because I didn't have anything better.


I prefer a thin vertical line and then use a spatula to spread it on both sides.


----------



## axiumone

Quote:


> Originally Posted by *toncij*
> 
> Keep in mind that The Division does not scale in SLI. your second card is useless for it.


Well, that's totally wrong. It has other issues with frame pacing in sli, but scaling is not one of them.


----------



## Glzmo

By the way, will we see a proper owner's club thread for this card some time? I think many people own it by now.


----------



## Silent Scone

Quote:


> Originally Posted by *Glzmo*
> 
> By the way, will we see a proper owner's club thread for this card some time? I think many people own it by now.


What would you call this thread, then? lol at having the box art in your avatar.


----------



## Gary2015

Quote:


> Originally Posted by *Silent Scone*
> 
> What would you call this thread, then?


Titan X P Owners thread.


----------



## Kaapstad

Anyone know when Nvidia is going to enable 4-way SLI for benching with the Pascal Titan?


----------



## Gary2015

Quote:


> Originally Posted by *axiumone*
> 
> Well, that's totally wrong.


Lots of games don't have SLI support.


----------



## axiumone

Quote:


> Originally Posted by *Gary2015*
> 
> Lots of games dont have SLI support.


Right, but The Division is not one of those games. It supports SLI just fine.


----------



## unreality

I really don't like the spreading technique because of the air trapped inside. But for bigger chips like GPUs and 2011-3, I guess the pea isn't that perfect either. I'll try a very thin X for both when my waterblock arrives.

Also got some Kryonaut!


----------



## Kaapstad

Quote:


> Originally Posted by *GunnzAkimbo*
> 
> 
> 
> 
> http://www.geforce.com/hardware/10series/titan-x-pascal
> 
> Elite members welcome 24/7.
> 
> 
> 
> 
> 
> 
> 
> 
> Peasants opening times 4:45pm - 5:00pm.
> Peasants must remain behind barriers at all times.
> DO NOT TOUCH IT. DO NOT SNEEZE EVER!. DO NOT BREATHE ON IT...
> I SAID DON'T BREATHE ON IT!
> 
> 
> 
> 
> 
> 
> 
> 
> and no RGB's... don't you even think about it:1eyed2


Quote:


> Originally Posted by *Silent Scone*
> 
> What would you call this thread, then? lol at having the box art in your avatar.


Probably the worst owners thread I have ever seen.


----------



## Gary2015

Quote:


> Originally Posted by *Kaapstad*
> 
> Probably the worst owners thread I have ever seen.


OK


----------



## toncij

Quote:


> Originally Posted by *axiumone*
> 
> Well, that's totally wrong. It has other issues with frame pacing in sli, but scaling is not one of them.


Quote:


> Originally Posted by *axiumone*
> 
> Right, but the division is not lots of those games. It supports sli just fine.


We can say it does scale, but the scaling is so bad (17% and 26%) that it's not worth it.

The difference between single and SLI 1080s is 85 vs 100 FPS at 1440p and 35 vs 40 FPS at 5K. Both cards hover at about 40-45% usage. Occasionally at 5K one card jumps to 95%, but the other one dips to 30%. I did not have a chance to test Titan X SLI, but if it does not scale on the 1080, it does not scale on the Titan XP either.

I'd never buy a second $1200 card for The Division to get 26% scaling... that's using $300 out of $1200 if you have a 5K screen, and $200 out of $1200 if you have a 1440p screen.

But if one has an endless pit of cash, why not. I'd buy 2 in that case. Unfortunately I still work for my money, so I can't be totally unreasonable.
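The dollar figures above follow from a simple "price times scaling gain" estimate. A rough sketch (the 17%/26% scaling numbers are the poster's own measurements, and treating fps gain as linearly worth its share of the price is an assumption, not a real pricing model):

```python
CARD_PRICE = 1200  # USD, Titan X Pascal price

def second_card_value(scaling, price=CARD_PRICE):
    """Rough 'effective value' of a second card: price times the fps gain it delivers."""
    return price * scaling

# The quoted 26% (5K) and 17% (1440p) SLI scaling in The Division:
value_5k = second_card_value(0.26)    # roughly the "$300 out of $1200" figure
value_1440 = second_card_value(0.17)  # roughly the "$200 out of $1200" figure
```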


----------



## KillerBee33

Not sure where the H8 comes from for this game; it looks and handles great. Just not sure how to take a 4K DSR screenshot.


----------



## Silent Scone

Quote:


> Originally Posted by *Kaapstad*
> 
> Probably the worst owners thread I have ever seen.


Maybe it needs a roll of honour?


----------



## KillerBee33

Quote:


> Originally Posted by *Silent Scone*
> 
> Maybe it needs a roll of honour?


----------



## DADDYDC650

Quote:


> Originally Posted by *Kaapstad*
> 
> What a dreadful owners thread, the OP has made no effort at all.


Everyone in this thread is rich. I'm guessing the OP's butler hasn't bothered to update the thread for him. Hard to find good help these days.


----------



## KillerBee33

Quote:


> Originally Posted by *DADDYDC650*
> 
> Everyone in this thread is rich. I'm guessing the OP's butler hasn't bothered to update the thread for him. Hard to find good help these days.


Can't take that Ca$h to the ground, Bud! I live in an 800sqft apartment and this Titan was half of what I pay for rent.


----------



## DADDYDC650

Quote:


> Originally Posted by *KillerBee33*
> 
> Can't take that Ca$h to the ground, Bud! I live in an 800sqft apartment and this Titan was half of what I pay for rent.


Your sarcasm detector is broken.


----------



## KillerBee33

Quote:


> Originally Posted by *DADDYDC650*
> 
> Your sarcasm meter is broken.


I'll ask my 2 foot tall Butler to fix it


----------



## Gary2015

Quote:


> Originally Posted by *DADDYDC650*
> 
> Everyone in this thread is rich. I'm guessing the OP's butler hasn't bothered to update the thread for him. Hard to find good help these days.



lolololololololololololololololololololol


----------



## DNMock

Quote:


> Originally Posted by *KillerBee33*
> 
> Can't take that Ca$h to the ground, Bud! I live in an 800sqft apartment and this Titan was half of what I pay for rent.


I just hope my parents let me know when my pair of EK blocks show up for my SLI TXP









P.S. I'm too lazy to look, but where the hell do you live that $2,000 only gets you an 800 sq. ft. apartment?

edit: New York, that explains it.


----------



## carlhil2

A TXP is two months' condo fee for me. Yeah, I know, I am being robbed. They raised it when oil prices were high, then refused to lower it when the price dropped... thieves...


----------



## KillerBee33

Quote:


> Originally Posted by *DNMock*
> 
> I just hope my parents let me know when my pair of EK blocks show up for my SLI TXP
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S. I'm too lazy to look, where the hell do you live that 2,000 only gets you an 800 sq. ft. apt?
> 
> edit: new york, that explains it.


That's the time we live in: if I don't spend it, someone else will!


----------



## Snaporz

Now I just need that backplate to come.


----------



## Zurv

Quote:


> Originally Posted by *KillerBee33*
> 
> Can't take that Ca$h to the ground, Bud! I live in an 800sqft apartment and this Titan was half of what I pay for rent.


nod nod... 900sq here and 4 titan xp







(for 2 computers) - 2k a month rent is HELLA cheap. nice find.


----------



## Gary2015

Quote:


> Originally Posted by *Zurv*
> 
> nod nod... 900sq here and 4 titan xp
> 
> 
> 
> 
> 
> 
> 
> (for 2 computers) - 2k a month rent is HELLA cheap. nice find.


I pay $6k for 1200sf.


----------



## KillerBee33

Quote:


> Originally Posted by *Zurv*
> 
> nod nod... 900sq here and 4 titan xp
> 
> 
> 
> 
> 
> 
> 
> (for 2 computers) - 2k a month rent is HELLA cheap. nice find.


Only in Krooklyn


----------



## Zurv

Quote:


> Originally Posted by *Snaporz*
> 
> 
> 
> Now I just need that backplate to come.


Waaaaaa... what part of the world are you in? I ordered my blocks on the 3rd and they are just sitting in process.








screw that pointless backplate! put that card on water now!


----------



## Snaporz

Quote:


> Originally Posted by *Zurv*
> 
> waaaaaa... what part of the world are you in? i ordered my blocks on the 3rd and they are just sitting in process
> 
> 
> 
> 
> 
> 
> 
> 
> screw that pointless backplate! put that card on water now!


Virginia, USA lol


----------



## stefxyz

Tonight: watercooling assembly and benchmark party!


----------



## DNMock

Quote:


> Originally Posted by *Zurv*
> 
> waaaaaa... what part of the world are you in? i ordered my blocks on the 3rd and they are just sitting in process
> 
> 
> 
> 
> 
> 
> 
> 
> screw that pointless backplate! put that card on water now!


Mine is on the truck out on delivery as we speak in Dallas...

Gonna go ahead and clean the loop while I'm at it and mod the current backplate to work with the EK block so it will be a couple days before it's ready to go.


----------



## Zurv

did you guys order before the 3rd or something? *cry*
directly from EK?


----------



## DNMock

whatever day they went up for pre-order from EK. shipped friday, showing up today. didn't pay for express shipping either lol.


----------



## KillerBee33

Can you guys suggest an EK kit for a first-timer?


----------



## Edge0fsanity

my block comes in today too







I ordered as soon as they were available for pre order.


----------



## stefxyz

1pm CET on release day was my order. Only advice: do soft tubing. Get a normal D5 and as big a radiator as fits in your case.


----------



## Silent Scone

I tried on the same day also, but their site took my money twice and didn't process the order. So I ordered a few hours later. Poxy website...

Mine is still sat at processing also.


----------



## Zurv

i ordered a terminal bridge that is out of stock. (which it wasn't when i ordered!) argh. i don't care about that.. *sigh*


----------



## Lobotomite430

Sorry if this has been asked on this page before, but has anyone tried EVGA's 1080 Hybrid cooler on the Titan X? How about the Corsair HG10? I really want to put my Titan on an AIO, but no ETA from EVGA is driving me nuts. I don't like the card running at 84C all the time with the fan at very high speed. I can't wait for winter! Thanks.


----------



## DNMock

Quote:


> Originally Posted by *KillerBee33*
> 
> Can you guys suggest an EK KIT for first timer ?


Screw ek kit.

Get soft tubing from Mayhem
Fittings from Monsoon
Gentle Typhoon fans
Heatkiller 4.0 CPU block
EK d5 res pump combo and rads dependent on your case


----------



## toncij

Quote:


> Originally Posted by *KillerBee33*
> 
> Thats the time we live in , if i don't spend it , someone else will!


For that money you can get a 2000 sqf house by the sea here.
Quote:


> Originally Posted by *Lobotomite430*
> 
> Sorry if this has been asked on this page before but has anyone tried EVGA's 1080 Hybrid cooler on the Titan X? How about Corsair HG10? I really want to put my Titan on AIO option but no ETA from EVGA is driving me nuts. I dont like the card running at 84c all the time with the fan very high speed, I can't wait for winter! Thanks.


I've seen it on a 1080, so it should fit the TXP too.


----------



## Creator

How are everyone's cards overclocking on the stock cooler? I seem to be able to run close to 2000/11000, but after longer sessions the lower end of the throttling zone ends up around 1950-1975. This is with +200/+500 clocks. Temperatures also reach 85-90C with the fan going as high as 80% speed.

This card is pretty amazing. I tried Alien Isolation and Cities Skylines with 5K DSR (both of these games really need the super sampling), and AI was running around 70fps, and CS around 60fps. Haven't tried much else but it's going to be a little bit painful going back to OG Titans if I end up being too lazy to take apart my loop and returning within the 30 day window.


----------



## KillerBee33

Quote:


> Originally Posted by *DNMock*
> 
> Screw ek kit.
> 
> Get soft tubing from Mayhem
> Fittings from Monsoon
> Gentle Typhoon fans
> Heatkiller 4.0 CPU block
> EK d5 res pump combo and rads dependent on your case


That sounds like a month of Weekends


----------



## DADDYDC650

Quote:


> Originally Posted by *Creator*
> 
> How are everyone's cards overclocking on the stock cooler? I seem to be able to run close to 2000/11000, but after longer sessions the lower end of the throttling zone ends up around 1950-1975. This is with +200/+500 clocks. Temperatures also reach 85-90C with the fan going as high as 80% speed.
> 
> This card is pretty amazing. I tried Alien Isolation and Cities Skylines with 5K DSR (both of these games really need the super sampling), and AI was running around 70fps, and CS around 60fps. Haven't tried much else but it's going to be a little bit painful going back to OG Titans if I end up being too lazy to take apart my loop and returning within the 30 day window.


Pretty much my results as well give or take a couple Mhz.


----------



## The-Real-Link

Haven't even had to push it much past stock so far because of its capabilities, but I've seen it hold a mild +200/200 so far easy. Has certainly scared me into being CPU bound in some games (*cough WoW cough*) even at 4K. It's a beast.

Seeing perfect scaling here in some games too, and is generally holding 4K60 99% of the time in things I've tested, benchmarks aside of course.


----------



## Jpmboy

Quote:


> Originally Posted by *HyperMatrix*
> 
> Out of curiosity, are you saying you don't use CLU for your CPU then? Or are you saying that you don't trust others to not over-do it with the amount of TIM they apply?


Under the IHS for sure. It is really unnecessary between the IHS and block.
Quote:


> Originally Posted by *bee144*
> 
> EVGA Jacob shared on their board what thermal paste is used on the hybrid coolers. It was some off brand Chinese based stuff. It had 5 stars on Newegg. Can't remember the name though.


right below AS5 on this chart:
http://overclocking.guide/thermal-paste-roundup-2015-47-products-tested-with-air-cooling-and-liquid-nitrogen-ln2/6/

being below AS5 says a lot.








Quote:


> Originally Posted by *Sheyster*
> 
> On my delidded CPU, I'm using Phobya Liquid Metal between the die and the IHS. Between the IHS and the block I use Kryonaut.
> I was referring to avoiding use of a liquid metal TIM on video cards if they're mounted in a test bench (vertical mount) as follows:


This is the reason I have not done this reversible mod.


Quote:


> Originally Posted by *Menthol*
> 
> Like everyone has been saying, *CLU between die and lid, any of the top 5 TIM's anywhere else*, especially if, like most here, you swap hardware frequently for testing and benching. *PK-1 is the easiest paste to spread* and I use it when swapping hardware constantly, and one of the 3 above for a more permanent mount.
> 
> HEX drivers are easier to use than a socket, they just are
> https://www.amazon.com/gp/product/B000BQ4XPQ/ref=oh_aui_detailpage_o02_s00?ie=UTF8&psc=1


^^ This!
And PK1 or PK3 are plenty good for any application. Guys look at a 0.25C difference measured in a review where the reviewer does not show the variance they get when doing multiple mounts with the same TIM. Daaum, I miss Skinnielabs.







Quote:


> Originally Posted by *KillerBee33*


would be nice if some key info was added to the OP tho...
Quote:


> Originally Posted by *Snaporz*
> 
> 
> 
> Now I just need that backplate to come.


very jelly... mine have not shipped yet.








Quote:


> Originally Posted by *Zurv*
> 
> did you guys order before the 3rd or something? *cry*
> directly from EK?


yeah, me too! Well, at least there won't be a big gap in time between blocks and backplates.


----------



## toncij

Quote:


> Originally Posted by *Creator*
> 
> How are everyone's cards overclocking on the stock cooler? I seem to be able to run close to 2000/11000, but after longer sessions the lower end of the throttling zone ends up around 1950-1975. This is with +200/+500 clocks. Temperatures also reach 85-90C with the fan going as high as 80% speed.
> 
> This card is pretty amazing. I tried Alien Isolation and Cities Skylines with 5K DSR (both of these games really need the super sampling), and AI was running around 70fps, and CS around 60fps. Haven't tried much else but it's going to be a little bit painful going back to OG Titans if I end up being too lazy to take apart my loop and returning within the 30 day window.


Quote:


> Originally Posted by *The-Real-Link*
> 
> Haven't even had to push it much past stock so far because of it's capabilities but I've seen it hold a mild +200 / 200 so far easy. Has certainly scared me into being CPU bound in some games (*cough WoW cough*) even at 4K. It's a beast.
> 
> Seeing perfect scaling here in some games too, and is generally holding 4K60 99% of the time in things I've tested, benchmarks aside of course.


What's "OG Titans"?

Btw, there are still many games not hitting 60 at 4K.



Spoiler: Warning: Spoiler!



Latest rumours from Gamescom are that 1080Ti is going to be a full Pascal die (3840 cores) with 8GB of memory and sell for $899/$999 MSRP/FE.


----------



## mbze430

Quote:


> Originally Posted by *guttheslayer*
> 
> Hmmm how about for non-conductive tim? Method applied is same as CLU?


It all depends on the viscosity of the TIM. With the watery type you can use the droplet/"pea-size" method; with the heavier, thicker pastes (Gelid Extreme) I prefer the spread method.

CLU is easy to apply if you know how to apply it; it just needs to be a thin layer.


----------



## mbze430

Quote:


> Originally Posted by *DarkIdeals*
> 
> iirc, Witcher 3 wasn't very big on PhysX. Could be wrong i guess, but i don't think i am. I doubt even a 980 TI as PPU would see an appreciable increase, especially on cards as powerful as these TITAN's. Maybe try it out on Metro LL/2033, or Arkham games etc.. too bad i sold my MSI Gaming X 1080 or i'd throw that in as physx dedicated PPU card and see how it worked. Meh...


I do have Arkham Knight, and from what I have seen on my system, whatever FPS you gain/lose by turning Gameworks features on/off is what you gain/lose with a dedicated PhysX card. But since Arkham Knight doesn't support SLI, I didn't get to see the full "glory". Plus I think Hairworks uses the GPU harder than Gameworks.

I think I might have to spend 10 minutes with ROTTR a couple of weekends from now.

I heard that Witcher 3 is a very GPU-demanding title, so I figured anyone who has SLI + dedicated PhysX would give it a whirl. It's just not my type of game; any game that requires many hours of playing = not my type of game.


----------



## Testier

Quote:


> Originally Posted by *toncij*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Latest rumours from Gamescom are that 1080Ti is going to be a full Pascal die (3840 cores) with 8GB of memory and sell for $899/$999 MSRP/FE.


How can that work with a 384-bit controller?


----------



## unreality

Come on, i wanna see Water vs Air results already!!!


----------



## Kaapstad

Quote:


> Originally Posted by *Silent Scone*
> 
> Maybe it needs a roll of honour?


What a great idea !!!!

Perhaps you could do one as you are famous for your people skills.


----------



## Gary2015

Quote:


> Originally Posted by *unreality*
> 
> Come on, i wanna see Water vs Air results already!!!


Me too


----------



## toncij

Quote:


> Originally Posted by *Testier*
> 
> How can that work with 384bit controller?


I presume it would be 512-bit in that case (64 bits x 8)? I'm not sure such modules are produced either, but that's what's floating around.


----------



## Testier

Quote:


> Originally Posted by *toncij*
> 
> I presume it will be 512 in that case (64-bits x8)? I'm not sure if such modules are produces either, but that's what's floating around.


So you are suggesting there is an extra memory controller in GP102, or that Nvidia has a different die for the 1080 Ti? I doubt either is true.

Most likely a 320-bit memory controller with 10GB of GDDR5X and 3072 to 3584 CUDA cores. Of course the full die is possible, but somehow I doubt it. Not with the way the Titan XP is selling. Why piss off everyone who just bought your $1200 card when you can milk them and another portion of the market at the same time?

If they launch this year, Titan XP owners would be too pissed off to upgrade. Also, going from 3584 to 3840 cores is at best 3-5% of actual performance.

Edit: One last possibility for an 8GB 3840-core 1080 Ti: if Nvidia chops the bus down to 256-bit and uses something like 14Gbps GDDR5X, I guess it's possible....
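A quick back-of-the-envelope sketch (my own numbers, not an official spec) of why capacity and bus width are tied together on GDDR5/GDDR5X: each memory chip has a 32-bit interface, so the bus width fixes the chip count, and chip count times chip density gives the VRAM size:

```python
# Toy model of GDDR5X configurations being debated above.
def vram_config(bus_bits, chip_gbit=8, data_rate_gbps=10):
    """Return (chip count, capacity in GB, bandwidth in GB/s)
    assuming one 32-bit, 8Gbit chip per channel slice."""
    chips = bus_bits // 32                       # each chip drives 32 bits of the bus
    capacity_gb = chips * chip_gbit // 8         # 8Gbit chips = 1GB each
    bandwidth_gbs = bus_bits * data_rate_gbps / 8
    return chips, capacity_gb, bandwidth_gbs

print(vram_config(384))  # (12, 12, 480.0) -> the Titan XP's 12GB / 480GB/s
print(vram_config(320))  # (10, 10, 400.0) -> the rumored 10GB cut-down config
print(vram_config(256))  # (8, 8, 320.0)   -> 8GB only falls out of a 256-bit bus
```

This ignores clamshell mode (two chips sharing one 32-bit slice), which doubles capacity at the same bus width and is the other way around the constraint.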


----------



## DNMock

Quote:


> Originally Posted by *toncij*
> 
> What's "OG Titans"?
> 
> Btw, there are many games not catching 60 at 4K, still.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Latest rumours from Gamescom are that 1080Ti is going to be a full Pascal die (3840 cores) with 8GB of memory and sell for $899/$999 MSRP/FE.


No chance.

TI will be identical to the Titan with a few less cores (3200 or 3300 +/-). The only way they drop a full 3840 core 1080 ti is if Volta is behind and we see a new Titan XP Black based on the Tesla cards. Otherwise no one would have a reason to buy a titan after the 1080ti was released and Nvidia would lose out on potential profit from the mark-up.

Not gonna change the memory controller either to do 8GB VRAM. Those changes would cost more to implement than just rolling out 6 or 12GB of RAM with the already-designed setup (I believe that's what the 384-bit controller needs, but I may be wrong here)

edit: This is all moot if Vega is more powerful than TXP, in which case we will definitely see a full 3840 core 1080ti


----------



## Jpmboy

Quote:


> Originally Posted by *Kaapstad*
> 
> What a great idea !!!!
> 
> Perhaps you could do one as you are famous for your people skills.


Quote:


> Originally Posted by *Gary2015*
> 
> Me too


huh? I already posted results using a uniblock... and callsignvega has AIO results. My results are basically that clocks hold steady at +30 to ~+100 depending on the benchmark. Max core temp is always <40C.


----------



## unreality

There's still a difference with a real waterblock cooling the VRMs and VRAM modules. At least I hope so.


----------



## KillerBee33

I wonder if it's possible to keep both parts of the Reference Shroud with any AIO units.


----------



## DNMock

Quote:


> Originally Posted by *unreality*
> 
> Theres still a difference to a real waterblock cooling VRMs and VRAM modules. At least i hope so.


not really, JPM basically uses an industrial leaf blower as a fan to cool the VRM and memory, so their temps are never an issue.


----------



## Jpmboy

Quote:


> Originally Posted by *unreality*
> 
> Theres still a difference to a real waterblock cooling VRMs and VRAM modules. At least i hope so.


well sure... that's why I ordered 2 EK blocks... but:
Quote:


> Originally Posted by *DNMock*
> 
> not really, JPM basically uses an industrial leaf blower as a fan to cool the VRM and memory, so their temps are never an issue.


you found out! lol - nothing like the sound of two screaming Delta fans to make the wife close my office door.


----------



## DADDYDC650

Quote:


> Originally Posted by *Testier*
> 
> How can that work with 384bit controller?


No Way in hell Nvidia releases a 1080 TI with 3840 cores for $900, lol!


----------



## MunneY

Quote:


> Originally Posted by *Jpmboy*
> 
> under the IHS for sure. it is reall unnecessary between the IHS and block.
> right below AS5 on this chart:
> http://overclocking.guide/thermal-paste-roundup-2015-47-products-tested-with-air-cooling-and-liquid-nitrogen-ln2/6/
> 
> being below AS5 says a lot.
> 
> 
> 
> 
> 
> 
> 
> 
> This is the reason I have not done this reversible mod.
> 
> ^^ This!
> and PK1 or PK3 are plenty good for any application. Guys look at a 0.25C degree ddiffierence measured in a review where the reviewer does not show the variance they get when doing multiple mounts with the same TIM. Daaum, I miss Skinnielabs.
> 
> 
> 
> 
> 
> 
> 
> 
> would be nice if some key info was added to the OP tho.,,
> very jelly... mine have not shipped yet.
> 
> 
> 
> 
> 
> 
> 
> 
> yeah, me too! well, at least there won;t be a big gap in time between blocks and backplates.


I'll make you really jealous... My block will be here today (at the farm house) and the card is at my actual house 60 minutes away.


----------



## Artah

Quote:


> Originally Posted by *MunneY*
> 
> I'll make you really jealous... My block will be here today (at the farm house) and the card is at my actual house 60 minutes away.


I'll have my EK blocks at my house before I get home from work also but sadly the back plates will not even ship until the end of the month. I'm contemplating installing the blocks and the GPUs and then later on disassembling it again to install the back plates.


----------



## DADDYDC650

Thinking of buying another Titan XP for the hell of it. My Titan XP is single and ready to mingle.


----------



## Kielon

Quote:


> Originally Posted by *DADDYDC650*
> 
> No Way in hell Nvidia releases a 1080 TI with 3840 cores for $900, lol!


http://wccftech.com/amd-vega-flagship-gpu-launch-teaser/


----------



## DADDYDC650

Quote:


> Originally Posted by *Kielon*
> 
> http://wccftech.com/amd-vega-flagship-gpu-launch-teaser/


Ok? 1080 Ti with 3840 cores isn't happening bud.


----------



## toncij

Quote:


> Originally Posted by *Testier*
> 
> So you are suggesting there is extra memory controller in GP102 or nvidia have a different die for 1080 TI? I doubt neither are true.
> 
> Most likely 320 bit memory controller with 10gb gddr5x with 3072 to 3584 CUDA cores. Ofcourse the full die is possible but somehow I doubt it. Not with the way Titan XP is selling. Why piss off everyone who just bought your 1200USD card when you can milk them and another portion of the market at the same time?
> 
> If they launches this year, Titan XP would be too pissed off to upgrade. Also, from 3584 to 3840 is at best 3-5% of actual performance.
> 
> Edit: One last possibility for 8GB 3840 1080 TI
> If nvidia chop the bus down to 256bit and uses something like 14ghz GDDR5x, I guess its possible....


About pissing off people: well, that didn't bother them before with 780Ti or 980Ti, so why would it now?

Quote:


> Originally Posted by *DNMock*
> 
> No chance.
> 
> TI will be identical to the Titan with a few less cores (3200 or 3300 +/-). The only way they drop a full 3840 core 1080 ti is if Volta is behind and we see a new Titan XP Black based on the Tesla cards. Otherwise no one would have a reason to buy a titan after the 1080ti was released and Nvidia would lose out on potential profit from the mark-up.
> 
> Not gonna change the memory controller either to do 8gb Vram. Those changes would cost more to implement than just rolling out the 6 or 12 gb of ram with the already designed set-up (I believe that's what the 384 bit controller needs, but I may be wrong here)
> 
> edit: This is all moot if Vega is more powerful than TXP, in which case we will definitely see a full 3840 core 1080ti


Quote:


> Originally Posted by *Kielon*
> 
> http://wccftech.com/amd-vega-flagship-gpu-launch-teaser/


The word is they're not only on a fast track for a Pascal refresh (or Volta), but are also moving manufacturing entirely to Samsung's 14nm fabs, because Nvidia is allegedly unhappy with the yields of the current 16nm process, which are not improving much or at the desired rate.

I'm also really interested in the financials of this alleged news. It seems very strange, but then, they released a Titan way earlier than I expected.


----------



## stefxyz

Guys, got a first Time Spy benchmark result from my Titan under water at a constant 34 Celsius:

http://www.3dmark.com/3dm/14145671?

10564 GPU score, not so bad I guess


----------



## KillerBee33

Quote:


> Originally Posted by *stefxyz*
> 
> Guys got some first TImespy Benchmark result from my Titan under water at constant 34 Clesius:
> 
> http://www.3dmark.com/3dm/14145671?
> 
> 10564 GPU not so bad I guess


Here's one on air with a somehow lower Physics score at the same 4.6GHz: http://www.3dmark.com/spy/254233


----------



## mbze430

384bit, full core Pascal for 1080TI... guess I'll be selling my Titan XP soon *****


----------



## stefxyz

And some Firestrike Ultra for the statistics:

http://www.3dmark.com/3dm/14146087?


----------



## CRITTY

Just got a notification the "EVGA PRO SLI BRIDGE HB" is in stock.


----------



## toncij

Quote:


> Originally Posted by *mbze430*
> 
> 384bit, full core Pascal for 1080TI... guess I'll be selling my Titan XP soon *****


Don't rush. It's only rumours. Very, very vague.







We can only hope.


----------



## Testier

Quote:


> Originally Posted by *toncij*
> 
> About pissing off people: well, that didn't bother them before with 780Ti or 980Ti, so why would it now?
> 
> The word is they're not only fast on track for Pascal refresh (or Volta), but are also changing complete manufacturing to Samsung 14nm fabs because Nvidia is allegedly pissed off about the yields of the current 16nm which are not improving much or at desired rate.
> 
> I'm also really interested in the financials of this alleged news, It seems very strange, but then, they've released a Titan way earlier than I expected.


The 780 Ti was reactive to the 290X; as for the 980 Ti, it had fewer shaders.

As for 14nm at Samsung, that's probably for the new console they are powering. Samsung has lots of experience with mobile chips, less so with big-die GPUs.


----------



## toncij

Quote:


> Originally Posted by *Testier*
> 
> The 780 Ti was reactive to the 290X; as for the 980 Ti, it had fewer shaders.
> 
> As for 14nm at Samsung, that's probably for the new console they are powering. Samsung has lots of experience with mobile chips, less so with big-die GPUs.


Tell me more about the new console? New, 3rd player?


----------



## Z0eff

Quote:


> Originally Posted by *toncij*
> 
> Don't rush. It's only rumours. Very, very vague.
> 
> 
> 
> 
> 
> 
> 
> We can only hope.


Rumors from where though...


----------



## axiumone

Quote:


> Originally Posted by *toncij*
> 
> We can say it does scale, but scaling is so bad (17% and 26%) that it's not worth it.
> 
> The difference in single vs SLI of the 1080 is 85 vs 100 FPS on 1440 and 35 vs 40 FPS in 5K. Both cards hover at about 40-45% usage. Occasionally at 5K one card jumps to 95% but the other one dips to 30%. I did not have a chance to test TitanX SLI, but if it does not scale on 1080, it does not scale on TitanXP either.
> 
> I'd never buy a second $1200 card for The Division to get 26% scaling... that's using $300 out of $1200 if you have a 5K screen, and $200 out of $1200 if you have a 1440p screen.
> 
> But, if one has endless pit of cash, why not. I'd buy 2 in that case. Unfortunately I still work for my money so can't be totally unreasonable.


Really? 17%-26%? I think you may need to see if there's something wrong on your end.



Spoiler: Warning: Spoiler!








Looks like 57% to me. This is with 1080's, mind you, but I don't expect the scaling to be any different on the Titan XP. Same settings, same resolution, one card vs. two. No one in their right mind would ever argue getting a top-end GPU with a top-end display as a value proposition.
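For reference, a trivial helper (my own, hypothetical) to turn the single-card vs. SLI FPS figures being thrown around into a scaling percentage:

```python
# SLI "scaling" as used in this thread: the percentage FPS gain from adding
# a second card. Perfect scaling would be 100 (double the framerate).
def sli_scaling(single_fps, sli_fps):
    return (sli_fps / single_fps - 1.0) * 100.0

print(round(sli_scaling(85, 100), 1))  # 17.6 -- the quoted 1440p Division numbers
print(round(sli_scaling(35, 40), 1))   # 14.3 -- the quoted 5K numbers
```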


----------



## toncij

Quote:


> Originally Posted by *Z0eff*
> 
> Rumors from where though...


Köln


----------



## toncij

Quote:


> Originally Posted by *axiumone*
> 
> Really? 17%-26%? I think you may need to see if there's something wrong on your end.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Looks like 57% to me. This is with 1080's mind you, but I don't expect the scaling to be any different on the titan xp. Same settings, same resolution. One card vs two. No one in their right mind would ever argue getting a top end gpu with a top end display as a value proposition.


At what exact settings did you run that? What driver, what card settings, and what CPU?

Regarding 1 vs. 2, that's not what I'm saying.







Just that it doesn't scale really well in my experience. Ahh, is it maybe that I use HFTS, PCSS and VXAO?


----------



## Stateless

Quote:


> Originally Posted by *DarkIdeals*
> 
> 1) Yup. Basically guaranteed that in games that support SLI you WILL get 60fps constant, even in minimum's. Go look up a guy called "thirty IR" on youtube, he did 4K testing with SLI TITAN XP's in like FIFTEEN different games and EVERY single game had well over 60fps minimum's! In witcher 3 maxed out with hairworks etc.. he got average of 86fps iirc, minimum's were like 75fps and max was ~96fps. My 2nd card comes tomorrow so i'll confirm it for you but i'm EXTREMELY confident that it'll easily pass 60fps constant even for minimum fps.
> 
> 2) Yes. Same thing.
> 
> 3) idk about sanding, the only working mod i've seen done to these is with a Dremel to saw off the pointed bits. It did work for the record. However if kinda looks fugly in my opinion...just saying. I'm not a fan of the EVGA HB bridge either though, really hoping the EK HB bridge comes out end of August like they told us it would; i really wanna see what it looks like...


Thank you very much. It was funny you mentioned "thirty IR"; I happened to find that yesterday and watched it. What I love about the Titan XP is that in games I have tested that don't support SLI, like Doom, it tears the game apart. I have everything set to max and it never dips below 60fps at 4K. In games like Witcher 3, even without Hairworks and no AA, I get dips to the low 50s in combat. I know when I get my water blocks I can push the card further, but I doubt it can be pushed far enough where a single Titan XP can do Witcher 3 with Hairworks and AA at 60fps. The game is just too demanding with all that stuff. But watching that video, it seems like having 2 Titans would be the ticket for me.

I definitely still look forward to your testing. I just ordered my 2nd card and ordered an SLI bridge as well. I might have my card on Wednesday. Still need to put an order in for a 2nd water block, but I will wait till the back plates are available as well before putting them under water. I will be testing on air for a few weeks.


----------



## Stateless

Quote:


> Originally Posted by *habu58*
> 
> Here is my division bench at 4K with max AA and settings. Titan XP's in SLI are insanely fast. I am even able to play at 8K in some games around 60 fps.


Thanks. That looks impressive. And that is with every possible setting set to the max as well? The highest shadow setting, highest AA setting etc?


----------



## axiumone

Quote:


> Originally Posted by *toncij*
> 
> At what exact settings you ran that? What driver and what card settings and a CPU?
> 
> Regarding 1 vs2, I don't say.
> 
> 
> 
> 
> 
> 
> 
> Just that it doesn't scale really well from my experience. Ahh, is it maybe that I use HTFS, PCSS and VXAO?


How would using the gameworks features affect scaling? That doesn't make any sense. Same settings with the only variable being one or two cards as the only difference. Also, 1 vs 2, you don't say? What? That's what sli is, that's what scaling is referring to.

3440x1440, ultra preset, 6700k.


----------



## Z0eff

Quote:


> Originally Posted by *toncij*
> 
> Köln


I guess nvidia claiming that the TXP isn't a gaming card and barely doing any advertisement for it would make sense if they had a different product in mind for us.


----------



## DarkIdeals

Something is seriously wrong in my setup for some reason. Put in my 2nd TITAN X and i'm getting nearly NO improvement over a single card in every game i try. In witcher 3 at 3440x1440 with ultra settings and hairworks on with 4x AA etc.. i was getting average of ~65-70fps on one TITAN X. And now with TWO of them i'm barely getting ~70-75fps. Fallout 4 is a similar story, so are other games. I tried using two flex bridges thinking that maybe not having the HB bridge had something to do with it (i doubted it but you never know) and still no change.

Anyone else having this? Anyone know a fix? I tried disabling and re-enabling SLI with no luck, it doesn't make sense because in witcher 3 i'm getting ~75% usage on both cards, which should be vastly outperforming 100% usage on one card. In fallout i'm only getting ~55-60% on each card but my fps seems even LOWER than it was with just a single card...


----------



## toncij

Quote:


> Originally Posted by *axiumone*
> 
> How would using the gameworks features affect scaling? That doesn't make any sense. Same settings with the only variable being one or two cards as the only difference. Also, 1 vs 2, you don't say? What? That's what sli is, that's what scaling is referring to.
> 
> 3440x1440, ultra preset, 6700k.


Actually, many features in games directly affect scaling. It really matters how each feature works, because many features can reduce scaling. SLI scaling works via alternate-frame rendering (AFR): the GPUs take turns rendering the frames a single card would otherwise render. Games see SLI as a single GPU, but you can code your game in a way that enables or hurts scaling, sometimes unintentionally. Many rendering features are designed in a way that reduces scaling or even completely disables it (specifically, many post-processing algorithms that rely on the previous frame), and that's why some game engines will never support it and some need specific effects disabled for it to work.

I can't test 21:9 atm, but my GPU usage is at 81% - running 3840x2160, Ultra (no custom), 4.45GHz 5960X, 2.8GHz [email protected] CL15, 1080 SLI @ 2114/5524. The framerates are right around 45 and CPU usage is 24% for SLI.
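A toy time model (entirely my own sketch, not anything from a driver) of that AFR behavior: when every frame is independent, two GPUs nearly double throughput; when an effect needs the previous frame's output, the cross-frame wait caps the speedup.

```python
# Toy steady-state model of alternate-frame rendering (AFR) scaling.
def afr_speedup(frame_ms, dep_ms, gpus=2):
    """Speedup of AFR over a single GPU.

    frame_ms: time for one GPU to render one frame
    dep_ms:   time frame n must wait on frame n-1's result
              (0 = fully independent frames)
    """
    single_interval = frame_ms                # one GPU: frames back to back
    # With AFR each GPU starts every `gpus`-th frame, so frames can complete
    # every frame_ms/gpus -- unless a cross-frame dependency forces frame n
    # to trail frame n-1 by at least dep_ms.
    afr_interval = max(frame_ms / gpus, dep_ms)
    return single_interval / afr_interval

print(afr_speedup(16.0, dep_ms=0.0))   # 2.0: near-perfect scaling
print(afr_speedup(16.0, dep_ms=14.0))  # ~1.14: a prev-frame effect kills scaling
```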
Quote:


> Originally Posted by *DarkIdeals*
> 
> Something is seriously wrong in my setup for some reason. Put in my 2nd TITAN X and i'm getting nearly NO improvement over a single card in every game i try. In witcher 3 at 3440x1440 with ultra settings and hairworks on with 4x AA etc.. i was getting average of ~65-70fps on one TITAN X. And now with TWO of them i'm barely getting ~70-75fps. Fallout 4 is a similar story, so are other games. I tried using two flex bridges thinking that maybe not having the HB bridge had something to do with it (i doubted it but you never know) and still no change.
> 
> Anyone else having this? Anyone know a fix? I tried disabling and re-enabling SLI with no luck, it doesn't make sense because in witcher 3 i'm getting ~75% usage on both cards, which should be vastly outperforming 100% usage on one card. In fallout i'm only getting ~55-60% on each card but my fps seems even LOWER than it was with just a single card...


It seems that Fallout 4 does not support SLI correctly:
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/11.html

What other games have you tested? Something like 30% of popular AAA games don't support SLI.


----------



## axiumone

Quote:


> Originally Posted by *DarkIdeals*
> 
> Something is seriously wrong in my setup for some reason. Put in my 2nd TITAN X and i'm getting nearly NO improvement over a single card in every game i try. In witcher 3 at 3440x1440 with ultra settings and hairworks on with 4x AA etc.. i was getting average of ~65-70fps on one TITAN X. And now with TWO of them i'm barely getting ~70-75fps. Fallout 4 is a similar story, so are other games. I tried using two flex bridges thinking that maybe not having the HB bridge had something to do with it (i doubted it but you never know) and still no change.
> 
> Anyone else having this? Anyone know a fix? I tried disabling and re-enabling SLI with no luck, it doesn't make sense because in witcher 3 i'm getting ~75% usage on both cards, which should be vastly outperforming 100% usage on one card. In fallout i'm only getting ~55-60% on each card but my fps seems even LOWER than it was with just a single card...


Are you using gsync by any chance?


----------



## toncij

Quote:


> Originally Posted by *axiumone*
> 
> Are you using gsync by any chance?


Shouldn't cause problems I presume. Many games work flawlessly with GSync.


----------



## DarkIdeals

Quote:


> Originally Posted by *axiumone*
> 
> Are you using gsync by any chance?


Yes i am. Acer X34. G-Sync has nothing to do with it really though, i've had G-Sync with SLI for years in the past without a single issue.

Quote:


> Originally Posted by *toncij*
> 
> Actually, many features in games directly affect scaling. It is really important how each feature works, because many features can reduce scaling. The SLI scaling works in a way that it AFRs what would otherwise do a single card. Games see SLI as a single GPU, but you can specifically code your game that way that you enable or disable scaling, unintentionally. Many rendering features are designed in such a way that reduces scaling or even completely disables it (specifically many post processing algorithms that rely on prev-frame) and that's why some game engines will never support it and some need specific effects disabled for it to work.
> 
> I can't test 21:9 atm, but my GPU usage is at 81% - running 3840x2160, Ultra (no custom), 4,45GHz 5960X, 2,8GHz [email protected] CL15, 1080 SLI @ 2114/5524. The framerates are exactly around 45 and CPU usage is 24% for SLI.
> It seems that Fallout 4 does not support SLI correctly
> https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/11.html
> 
> . What other games you have tested? There is like 30% of popular AAA games that don't support SLI.


I've tested Witcher 3, Far Cry 4, Fallout 4, Dark Souls 3, etc., and it's always the same story. Witcher 3 is the most surprising, as that game has always had nearly PERFECT scaling in SLI for me no matter the cards. And other people in reviews etc. are reporting massive gains in SLI with these cards. The Thirty IR video I linked a few pages back had him using two of these same cards in Witcher 3 and getting an average of about 85fps, with even his minimum staying over 75fps...and he was at 4K RES!! I'm only doing 3440x1440!! So there's just no reason why I shouldn't be maxing out the 100Hz of this monitor. I've tried everything: turning V-Sync off, updating Windows, turning SLI on and off, making sure "maximum performance" was selected in the control panel, etc. NOTHING works! It's getting ridiculous....

Quote:


> Originally Posted by *toncij*
> 
> Shouldn't cause problems I presume. Many games work flawlessly with GSync.


Yeah, I had zero problems with my Maxwell TITAN X SLI setup on a G-Sync monitor, so I doubt that has anything to do with it.


----------



## axiumone

Just give it a shot. Disable gsync and try again. I'm on an x34 as well and I've noticed a loss in performance running the latest 369.09 with gsync vs no gsync.


----------



## cookiesowns

Quote:


> Originally Posted by *Jpmboy*
> 
> well sure... that's why I ordered 2 EK blocks... but:
> you found out! lol - nothing like the sound of two screaming Delta fans to make the wife close my office door.


Haha.. I can barely stand a thick Panaflo sitting on top of the cards; don't know how you stand the Deltas, plus what I'm assuming are high-RPM fans over your TridentZs.

Either way


----------



## DarkIdeals

Quote:


> Originally Posted by *axiumone*
> 
> Just give it a shot. Disable gsync and try again. I'm on an x34 as well and I've noticed a loss in performance running the latest 369.09 with gsync vs no gsync.


Nope, doesn't change anything. Fps is still abysmal.

This just doesn't make sense. On top of getting half the framerate I should, I also get weird quirks in how the fps changes. Like when I first start Witcher 3 it always gives me just ~60-65fps, but then after a while, even sitting in the same spot or fiddling with settings, the fps will suddenly jump up to ~75fps. This is just bizarre.


----------



## toncij

Quote:


> Originally Posted by *DarkIdeals*
> 
> Nope, doesn't change anything. Fps is still abysmal.
> 
> This just doesn't make sense. On top of me getting half the framerate i should i also get weird quirks in how the fps changes. Like when i first start witcher 3 it always gives me just ~60-65fps. But then after a while even sitting in the same spot, or fiddling with settings etc.. the fps will suddenly jump up to ~75fps. This is just bizarre.


I'll take this as a cautionary tale and skip SLI for my Titan XPs. Volta is out soon anyway.


----------



## axiumone

Quote:


> Originally Posted by *toncij*
> 
> Actually, many features in games directly affect scaling. It is really important how each feature works, because many features can reduce scaling. The SLI scaling works in a way that it AFRs what would otherwise do a single card. Games see SLI as a single GPU, but you can specifically code your game that way that you enable or disable scaling, unintentionally. Many rendering features are designed in such a way that reduces scaling or even completely disables it (specifically many post processing algorithms that rely on prev-frame) and that's why some game engines will never support it and some need specific effects disabled for it to work.
> 
> I can't test 21:9 atm, but my GPU usage is at 81% - running 3840x2160, Ultra (no custom), 4,45GHz 5960X, 2,8GHz [email protected] CL15, 1080 SLI @ 2114/5524. The framerates are exactly around 45 and CPU usage is 24% for SLI.
> It seems that Fallout 4 does not support SLI correctly
> https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/11.html
> 
> . What other games you have tested? There is like 30% of popular AAA games that don't support SLI.


Le sigh. You said that The Division doesn't scale. I said it sure does and posted screenshots. Could the scaling be worse at 4K? Sure, but a 27% swing is crazy. Bringing up titles that have no scaling is completely irrelevant because, well, they don't scale to begin with. As I stated, comparing systems with the same settings will produce the same scaling results.

I took a look at guru3d's 1070 SLI review, as they're the only ones that tested The Division in Pascal SLI. Their results at 4K were 33% scaling. You're below that; probably worth investigating on your end.

What other games have I tested in sli? Take a look - http://www.overclock.net/t/1608309/x99-5960x-4-6-vs-z170-6700k-4-8-w-1080-sli-3440x1440/0_100

Fruitless argument anyway. All we've accomplished is the same as always. Does game X scale well in sli? Yes, but it varies from person to person!


----------



## DarkIdeals

Quote:


> Originally Posted by *toncij*
> 
> I'll take this as a cautionary tale and skip SLI for my Titan XPs. Volta soon out anyway.


Except this obviously isn't happening to other people. Go watch this video:






The guy is getting MORE fps than i am with his TITAN XP sli setup and he's running at 4K when i'm only running at 3440x1440. There's just NO reason for that.


----------



## axiumone

DDU the drivers. If that doesn't work, then maybe it's time for a fresh windows install.


----------



## guttheslayer

Quote:


> Originally Posted by *DNMock*
> 
> No chance.
> 
> TI will be identical to the Titan with a few less cores (3200 or 3300 +/-). The only way they drop a full 3840 core 1080 ti is if Volta is behind and we see a new Titan XP Black based on the Tesla cards. Otherwise no one would have a reason to buy a titan after the 1080ti was released and Nvidia would lose out on potential profit from the mark-up.
> 
> Not gonna change the memory controller either to do 8gb Vram. Those changes would cost more to implement than just rolling out the 6 or 12 gb of ram with the already designed set-up (I believe that's what the 384 bit controller needs, but I may be wrong here)
> 
> edit: This is all moot if Vega is more powerful than TXP, in which case we will definitely see a full 3840 core 1080ti


We all know that Nvidia can estimate Vega's performance better than we can. If 3840 cores is true... then Vega is something really powerful.


----------



## Bradum

Hey Guys,

Picked up a Titan XP a couple weeks ago, and the water block is in the mail. Is there any kind of custom BIOS to unlock the voltage yet?

Sorry if it was already posted, but I don't have time to read through 223 pages. lol


----------



## Yuhfhrh

Quote:


> Originally Posted by *Bradum*
> 
> Hey Guys,
> 
> Picked up a Titan XP a couple weeks ago, and the water block is in the mail. Is there any kind of custom BIOS to unlock the voltage yet?
> 
> Sorry if it was already posted, but I don't have time to read through 223 pages. lol


Nope, we're locked down hard.


----------



## Bradum

Quote:


> Originally Posted by *Yuhfhrh*
> 
> Nope, we're locked down hard.


Just found this, which says they're using an internal version of afterburner which allows voltage overclocking.

http://www.guru3d.com/articles_pages/nvidia_titan_x_(pascal)_overclock_guide,1.html


----------



## Yuhfhrh

Quote:


> Originally Posted by *Bradum*
> 
> Just found this, which says they're using an internal version of afterburner which allows voltage overclocking.
> 
> http://www.guru3d.com/articles_pages/nvidia_titan_x_(pascal)_overclock_guide,1.html


You can set up the current afterburner to allow voltage adjustment on the card, but the card's bios limits it to around 1.08V.


----------



## DarkIdeals

Quote:


> Originally Posted by *Yuhfhrh*
> 
> You can set up the current afterburner to allow voltage adjustment on the card, but the card's bios limits it to around 1.08V.


How do you do that? Unlock the voltage, I mean? I've tried everything and haven't had any luck. Using the latest beta, I went into settings, checked the "allow voltage modification" option, and tried the reference, standard MSI, extended MSI, etc. profiles, with no luck on any of them. So I'm very curious how you're getting that to work.

Even under full 100% load my cards won't go any higher than about 1.02v for some reason.


----------



## Bradum

Quote:


> Originally Posted by *DarkIdeals*
> 
> How do you do that? Unlock the voltage i mean? I've tried everything and haven't had any luck. Using the latest beta, went into settings and checked the "allow voltage modification" option and tried the reference, standard MSI, extended MSI, etc.. and no luck with any of them. So i'm very curious of how you are getting that to work.
> 
> Even under full 100% load my cards won't go any higher than like 1.02v for some reason.


I'm in the same boat. I've unlocked voltage control, but it still won't let me touch it in Afterburner.


----------



## CallsignVega

Quote:


> Originally Posted by *DarkIdeals*
> 
> Except this obviously isn't happening to other people. Go watch this video:
> 
> 
> 
> 
> 
> 
> The guy is getting MORE fps than i am with his TITAN XP sli setup and he's running at 4K when i'm only running at 3440x1440. There's just NO reason for that.


Your symptoms almost seem like you are running the games in windowed mode. SLI works terrible in windowed mode versus full screen.


----------



## DarkIdeals

Quote:


> Originally Posted by *CallsignVega*
> 
> Your symptoms almost seem like you are running the games in windowed mode. SLI works terrible in windowed mode versus full screen.


I figured that too. But I double-checked, and I was running every game in fullscreen and still getting bad fps. The GPUs were averaging ~75% usage on both cards. That equates to 1.5x a single card, so I SHOULD at the very least be getting 1.5x my single-card framerate. Meaning my ~65fps average should be ~98fps. But instead I can't get it over 80fps, and it drops to 65-70 in a lot of situations. It's downright infuriating. I just did a complete clean install of all display drivers and it STILL doesn't do anything!

I'm at my wits' end here; this isn't how $2,500 worth of hardware is supposed to perform! I was getting almost this same fps on two Titan X MAXWELL cards for crying out loud!
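The usage-to-framerate expectation in the post above can be sketched as a quick back-of-envelope. This assumes fps scales linearly with combined GPU usage, which ignores CPU bottlenecks, AFR overhead, and frame pacing, so it's an upper-bound estimate, not a guarantee:

```python
# Rough back-of-envelope: expected SLI framerate from per-GPU usage.
# Assumes fps scales linearly with total GPU throughput; ignores CPU
# bottlenecks, AFR overhead, and frame-pacing effects.

def expected_sli_fps(single_card_fps, usage_per_card, num_cards=2):
    """Estimate SLI fps as total usage relative to one fully loaded card."""
    effective_cards = usage_per_card * num_cards  # e.g. 0.75 * 2 = 1.5x
    return single_card_fps * effective_cards

# ~65 fps on one Titan X at 100%, both cards sitting at ~75% in SLI:
print(round(expected_sli_fps(65, 0.75)))  # 98 -- vs the ~80 fps observed
```

The gap between the ~98 fps this predicts and the ~80 fps observed is what makes the setup look broken rather than merely CPU-limited.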


----------



## CallsignVega

Quote:


> Originally Posted by *DarkIdeals*
> 
> I figured that too. But i double checked and i was running every game in fullscreen but still getting bad fps. The GPUs were getting ~75% usage on average for both cards. That equates to 1.5x a single card; so i SHOULD at the very least be getting 1.5x my single card framerate. Meaning my ~65fps average should be ~98fps. But instead i can't get it over 80fps, and it goes down to 65-70 in a lot of situations. It's downright infuriating. I just tried doing a complete clean install of all display drivers and it STILL doesn't do anything!
> 
> I'm at the end of my wits here, this isn't how $2,500 worth of hardware is supposed to perform! I was getting almost this same fps on two titan-x MAXWELL cards for crying out loud!


What happens when you ALT+ENTER in a game out to the desktop and then ALT+ENTER back in? Does the screen go black for a second which means full-screen or does it instantly switch back and forth which would be windows behavior? When you ALT+ENTER in and out does the SLI usage change?


----------



## Yuhfhrh

Quote:


> Originally Posted by *DarkIdeals*
> 
> How do you do that? Unlock the voltage i mean? I've tried everything and haven't had any luck. Using the latest beta, went into settings and checked the "allow voltage modification" option and tried the reference, standard MSI, extended MSI, etc.. and no luck with any of them. So i'm very curious of how you are getting that to work.
> 
> Even under full 100% load my cards won't go any higher than like 1.02v for some reason.


Quote:


> Originally Posted by *Bradum*
> 
> I'm in the same boat. I've unlocked voltage control, but it still won't let me touch it in Afterburner.


For afterburner, open up the file under MSI/Profiles starting with 10DE&DEV... with notepad (admin rights) and add this in:

[Settings]
VDDC_Generic_Detection = 1


----------



## MrTOOSHORT

Got the block today, but I'm at work now. No CLU yet, though. Will probably install the block in the morning after work, without the shunt mod for now.


----------



## CallsignVega

Quote:


> Originally Posted by *Yuhfhrh*
> 
> For afterburner, open up the file under MSI/Profiles starting with 10DE&DEV... with notepad (admin rights) and add this in:
> 
> [Settings]
> VDDC_Generic_Detection = 1


That doesn't do anything for my Afterburner.

On a side note, NVIDIA drivers getting worse and worse as time goes on. Stupid crap constantly turning V-Sync ON by itself.


----------



## Yuhfhrh

Quote:


> Originally Posted by *CallsignVega*
> 
> That doesn't do anything for my Afterburner.
> 
> On a side note, NVIDIA drivers getting worse and worse as time goes on. Stupid crap constantly turning V-Sync ON by itself.


It does for me, make sure you're on the latest beta (4.3.0?) and apply to both profile files if you have two cards. Close afterburner then reopen, go under afterburner settings and check unlock voltage.


----------



## DarkIdeals

Quote:


> Originally Posted by *CallsignVega*
> 
> What happens when you ALT+ENTER in a game out to the desktop and then ALT+ENTER back in? Does the screen go black for a second which means full-screen or does it instantly switch back and forth which would be windows behavior? When you ALT+ENTER in and out does the SLI usage change?


Yes, it goes black because it's in fullscreen. Tried it in about 5 games so far.

The weirdest thing by far though is that when I load up Witcher 3 it starts at ~60fps, but if I just sit there for ~2-3 minutes it will suddenly jump to ~75-80fps for no reason and stay that way. SO bizarre!
Quote:


> Originally Posted by *Yuhfhrh*
> 
> It does for me, make sure you're on the latest beta (4.3.0?) and apply to both profile files if you have two cards. Close afterburner then reopen, go under afterburner settings and check unlock voltage.


Yeah, it works for me. Don't forget to add the [Settings] part too.

Unfortunately even increasing voltage does NOTHING to fix this horrid SLI issue. Just tried Far Cry 4 with +60 voltage added to both cards and they wouldn't go over a pitiful 35% usage per card.....like seriously WHAT THE HELL?!?!?


----------



## Testier

Quote:


> Originally Posted by *toncij*
> 
> Tell me more about the new console? New, 3rd player?


Nintendo's console is using an Nvidia chip; I believe it's some form of Tegra. Samsung has mobile chip experience, and TSMC is probably full.


----------



## stefxyz

Dark, something is wrong. I would check each card alone first and try to detect if one is faulty. Wrong PCIe slots probably aren't the problem, since you had two Titans before. To be 100 percent sure I would also do a completely fresh Windows installation with the bare minimum: chipset drivers and GPU drivers only, with just your two GPUs, a mouse, and a keyboard plugged in. Start testing with no overclocking at all on CPU and RAM and no Afterburner installed; next with Afterburner, and if that looks good, start to OC the CPU.


----------



## DarkIdeals

Quote:


> Originally Posted by *stefxyz*
> 
> Dark something is wrong. I would check both cards alone first and try to detect if one is faulty. Having wrong PCI Slots I guess is not the problem as you had two titans before. Also to be 100 percent sure i would do a complete fresh windows installation and install with the bare minimum: chipset drivers and gpu drivers only and plug in only your 2 gpus and a mouse and keyboard. Start testing with no overclocking at all on cpu and ram and no afterburner installed. next with afterburner and if it seems good start to oc cpu.


Well actually, now that you mention it, the PCIe slot DID have somewhat of an effect. I had the cards in slots 1 and 2 on my Rampage V Edition 10, and I thought there was just a TINY possibility that the 2nd card being in an x8 slot was affecting it. Swapping it to the 3rd slot, which is an x16 slot, gave me an extra ~10fps or so.

I was getting ~75-80fps at most, and now it's bouncing around ~90-95fps most of the time, every once in a while hitting ~98-100fps or so. At 3440x1440. It still seems kinda low, but it's definitely some kind of improvement.

That's honestly big news in my opinion. It's evidence (anecdotal, yes, but still evidence) that x16 vs x8 PCIe lanes is ACTUALLY starting to make a decent difference now. These cards are getting to the point where the extra bandwidth appears to be helpful.

I did check each card separately, by the way. Both performed roughly the same, getting ~65-70fps at 3440x1440 by themselves, and when I used DSR to get 5120x2160 "4K Ultra-Wide" it averaged ~48fps, which isn't bad considering most benchmarks at regular 4K show ~55fps; the ~6-7fps drop is easily attributable to the ~33% more pixels that 5120x2160 has over normal 4K. It's only when I get to SLI that things look odd.
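The pixel-count reasoning above is easy to verify; a quick sketch using the resolutions mentioned in the post:

```python
# Quick pixel-count check: 4K ultra-wide (via DSR) vs standard 4K vs the
# native 3440x1440 ultra-wide panel.
def pixels(w, h):
    return w * h

uw4k = pixels(5120, 2160)   # "4K Ultra-Wide" through DSR
uhd  = pixels(3840, 2160)   # standard 4K
uw   = pixels(3440, 1440)   # native ultra-wide resolution

print(uw4k / uhd)  # ~1.33 -> ~33% more pixels than standard 4K
print(uhd / uw)    # ~1.67 -> standard 4K pushes ~67% more pixels than 3440x1440
```

So a drop from ~55fps at 4K to ~48fps at 5120x2160 is roughly in line with the extra pixel load.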


----------



## Z0eff

Quote:


> Originally Posted by *DarkIdeals*
> 
> That's honestly big news in my opinion. That's evidence (anecdotal yes, but still evidence) that 16x vs 8x pci lanes is ACTUALLY starting to make a decent difference now. These cards are getting to the point that the extra bandwidth appears to be helpful.


Yeah, based on reading various reports in this thread, there does now seem to be a reason to be on PCIe Gen3 x16. On top of that, it looks like having a more recent high-end CPU actually makes a difference now too.

I do wonder, though: does the TXP really saturate a PCIe Gen3 x16 slot, or is it something else, like the decrease in latency that comes with faster link speeds?

Being on Z170, I do wonder, if I finally end up going SLI for the first time, whether the x8 speeds will reduce my FPS by a small but noticeable amount....
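For reference, the raw link bandwidth is easy to ballpark. This is a rough sketch that only accounts for the 128b/130b line encoding and ignores packet/protocol overhead; whether the card actually saturates the link is a separate question:

```python
# Back-of-envelope PCIe 3.0 link bandwidth per direction.
# Only the 128b/130b line code is accounted for; TLP/DLLP packet
# overhead would shave a bit more off in practice.
GT_PER_S = 8.0        # PCIe 3.0 raw transfer rate per lane
ENCODING = 128 / 130  # 128b/130b line-code efficiency

def pcie3_bandwidth_gbs(lanes):
    """Approximate one-direction bandwidth in GB/s for a PCIe 3.0 link."""
    return lanes * GT_PER_S * ENCODING / 8  # bits -> bytes

print(round(pcie3_bandwidth_gbs(16), 2))  # 15.75 GB/s for x16
print(round(pcie3_bandwidth_gbs(8), 2))   # 7.88 GB/s for x8
```

So x8 still leaves ~7.9 GB/s each way; the question in this thread is whether AFR frame transfers at high resolution are starting to exceed that.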


----------



## DADDYDC650

Quote:


> Originally Posted by *DarkIdeals*
> 
> Well actually, now that you mention it. PCI slot DID have somewhat of an effect. I had the cards in slot 1 and slot 2 on my Rampage V Edition 10; and i thought there was just a TINY possibility that the 2nd card being in an 8x slot was effecting it. Swapping it to the 3rd slot, which is a 16x slot, gave me an extra ~10fps or so.
> 
> I was getting ~75-80fps at most, and now it's bouncing around ~90-95fps most of the time, every once in a while hitting ~98-100fps or so. At 3440x1440. It still seems kinda low, but definitely some kind of improvement.
> 
> That's honestly big news in my opinion. That's evidence (anecdotal yes, but still evidence) that 16x vs 8x pci lanes is ACTUALLY starting to make a decent difference now. These cards are getting to the point that the extra bandwidth appears to be helpful.
> 
> I did check each card separately by the way. Both of them performed roughly the same, getting ~65-70fps at 3440x1440p by themselves, and when i used DSR to get 5120x2160 "4K Ultra-Wide" it reached an average of ~48fps which isn't bad considering most benchmarks for regular 4K show ~55fps, ~6-7fps is easily attributable to the increase of pixels by 34% that 5120x2160 has over normal 4K. It's only when i get to SLI that things appear odd.


Would it be possible for you to run Firestrike and Time Spy with 16x/16x and 16x/8x? Do you own a HB bridge?


----------



## DarkIdeals

Quote:


> Originally Posted by *DADDYDC650*
> 
> Would it be possible for you to run Firestrike and Time Spy with 16x/16x and 16x/8x? Do you own a HB bridge?


I might be able to run it later; my system takes forever to re-seat the cards, so I'm not moving them again tonight lol. And no, I don't have the HB bridge. I've tried two flexible bridges, the Asus double-finger 3-slot bridge, and an Nvidia LED hard bridge (the older style with the claw logo). Not sure how much of a difference the HB bridge will actually make. I'm definitely going to get one, but after spending so much on the cards I have to wait a bit. Plus my EK blocks won't fit with the Nvidia HB bridge, and I hate the look of the EVGA one, so I've been waiting for the EK HB bridge to finally come out...


----------



## steponz

Guys some little updates....

Futuremark Firestrike Ulta and Extreme Hall of Fame Records..

http://www.3dmark.com/fs/9792075

http://www.3dmark.com/fs/9792072

Card will go to Epower next... Enjoy...


----------



## steponz

Quote:


> Originally Posted by *DADDYDC650*
> 
> Would it be possible for you to run Firestrike and Time Spy with 16x/16x and 16x/8x? Do you own a HB bridge?


16x vs 8x makes a pretty big difference. With newer, more power-hungry architectures it should make even more of a difference, because you're really using the bandwidth.


----------



## DADDYDC650

Quote:


> Originally Posted by *steponz*
> 
> 16x vs 8x makes a pretty big difference.. with newer more power hungry architectures.. it should make more of a difference because your really using the bandwidth.


Any proof? I haven't seen any 16x/16x vs 16x/8x pci-e with HB bridge benchmarks.


----------



## steponz

Quote:


> Originally Posted by *DADDYDC650*
> 
> Any proof? I haven't seen any 16x/16x vs 16x/8x pci-e with HB bridge benchmarks.


I've done it quite a bit, actually, to test the difference.... Just run your setup in a different slot, run a bunch of benches, and watch the GPU score change..

Best way is to see for yourself and test.............


----------



## DADDYDC650

Quote:


> Originally Posted by *steponz*
> 
> Ive done it quite a bit actually to test the difference.... Just run your setup in a different slot and run a bunch.... watch gpu score change..
> 
> Best way is to see for yourself and test.............


I have a 6800K + only 1 Titan XP, so no go at 16x/16x. I'd like to believe you, but I need to see some actual benches/proof. All the benchmarks for the HB bridge were on a 6700K system (16x/8x), with no mention of performance loss or lane saturation.


----------



## toncij

Quote:


> Originally Posted by *axiumone*
> 
> Le sigh. You said that the division doesn't scale. I said it sure does and posted screen shots. Could the scaling be worse in 4k? Sure, but a 27% swing is crazy. Bringing up titles that have no scaling is completely irrelevant, because, well they don't scale to begin with. As I stated, comparing systems with the same settings will produce the same scaling results.
> 
> I took a look at guru3d's 1070 sli, as they're the only ones that tested the division in pascal sli. They're results at 4k were 33% scaling. You're below that, probably worth investigating for you.
> 
> What other games have I tested in sli? Take a look - http://www.overclock.net/t/1608309/x99-5960x-4-6-vs-z170-6700k-4-8-w-1080-sli-3440x1440/0_100
> 
> Fruitless argument anyway. All we've accomplished is the same as always. Does game X scale well in sli? Yes, but it varies from person to person!


Now I'm really wondering why there is such a difference. Also, 33% scaling is not something to be proud of; I don't find SLI worth it below 50% scaling, and 33% I find very bad.

Not sure what the problem could be. The driver is the newest one, settings have been reset, the machine is clean; this is really very strange behavior.

Quote:


> Originally Posted by *DarkIdeals*
> 
> Except this obviously isn't happening to other people. Go watch this video:
> 
> 
> 
> 
> 
> 
> The guy is getting MORE fps than i am with his TITAN XP sli setup and he's running at 4K when i'm only running at 3440x1440. There's just NO reason for that.


This is even stranger:








Quote:


> Originally Posted by *guttheslayer*
> 
> We all know that nvidia will estimate the perf of vega better than us. If 3840 is true... Den vega is smth really powerful


Just assuming 4096 cores at a similar per-core speed to the RX 480, I don't see Vega 10 being that good. 1.77x the RX 480 places Vega just above a 1070 and similar to a 1080. Not something that would prompt Nvidia into anything radical. Unless, of course, Vega is planned to take advantage of the 14nm FinFET process and pack many more transistors in...
Quote:


> Originally Posted by *Z0eff*
> 
> Yeah based on reading various reports in this thread there does now seem to be a reason to be on PCIe Gen3 x16. On top of that it looks like having a more recent high end CPU actually makes a difference now too.
> 
> I do wonder though, does the TXP really saturate a PCIe Gen3 x16 slot or is it something else like a decrease in latency that come with faster speeds.
> 
> Being on Z170 I do wonder if I do finally end up going SLI for the first time that the x8 speeds will reduce my FPS by a small but noticeable amount....


You can get a board with a PLX chip, and then you can have x16/x16/x16...


----------



## escalibur

Has any of you put an AIO cooler on one of these? I think it should work the same as on reference 1080s, right?


----------



## carlhil2

My card hit a TDP of 134% in FSU. I need some CLU....


----------



## bee144

FYI, the new NVIDIA WHQL driver 372.54 was just released, which brings the Titan XP into the regular branch of NVIDIA drivers.


----------



## guttheslayer

Quote:


> Originally Posted by *carlhil2*
> 
> My card hit TDP of 134%, in FSU, I needs some CLU....


How do you know how much TDP you will hit?

And regarding Vega: what happens if there is an IPC improvement as well as power savings (which is inferred from the chart)? It could stand a chance of being faster than the 1080. At least one can hope.


----------



## DADDYDC650

Quote:


> Originally Posted by *bee144*
> 
> FYI, new NVIDIA WHQL driver 372.54 was released just now which bring the Titan Xp into the regular branch of NVIDIA drivers.


Downloading! Thanks.


----------



## HyperMatrix

Don't download the 372.54 drivers. They break overclocking. I have no idea why, but they're causing artifacting immediately (even at 10% usage under pre-boost clocks). I turned off the memory overclock and it's still happening; it appears to be tied to the GPU clock. It happens even at just a +150 offset (even though the card is still running at the stock 1417MHz), while temps are at 36C.

Or someone tell me that there's something wrong with my system.


----------



## carlhil2

Quote:


> Originally Posted by *guttheslayer*
> 
> How u know how much tdp you will hit?
> 
> And regarding vega what happen if there is improvement ipc as well as power saving (which is infer from the chart). It could stand a chance to be faster than 1080. At least one can hope


GPU-Z. Also, I have a TXP; I'm not concerned with any other GPU unless it is much faster than mine....







my latest Time Spy run..  check the TDP...


----------



## HyperMatrix

Quote:


> Originally Posted by *DarkIdeals*
> 
> Yes it goes black because it's in fullscreen. Tried in in about 5 games so far.
> 
> The weirdest thing by far though is that when i load up Witcher 3 it starts at ~60fps but if i just sit there for ~2-3 minutes it will suddenly jump to ~75-80fps for no reason and stay that way. SO bizarre!
> Yeah it works for me. Don't forget to add the [Settings] part too.
> 
> Unfortunately even increasing voltage does NOTHING to fix this horrid SLI issue. Just tried Far Cry 4 with both cards with +60 voltage added and it wouldn't go over a pitiful 35% usage per card.....like seriously WHAT THE HELL?!?!?


I tried Far Cry 4 out just a couple days ago. Generally ran around 90% usage across both cards. Only time it'd go lower is when it was being bottlenecked by CPU/DX11 draw calls


----------



## HyperMatrix

I figured out the bug with the new 372.54 drivers. Under pre-boost clocks, if you have the GPU OC'd to about +150 (probably different based on your card) it'll artifact like crazy and then crash. GPU-Z reports this as a voltage reliability issue. But once GPU usage picks up and the card actually switches to the boost clock, the problem goes away. So: Unstable OC with new driver at pre-boost clocks. Perfectly fine once boosted.

Stupid annoying bug.


----------



## DADDYDC650

Quote:


> Originally Posted by *HyperMatrix*
> 
> Don't download 372.54 drivers. They break overclocking. I have no idea why but it's causing artifacting immediately (even at 10% usage under pre-boost clocks). Turned off memory clock and it's still happening. Appears to be tied to GPU clock. Happening even at just +150 offset (even though it's running at stock 1417MHz still), while temps are at 36c.
> 
> Or someone tell me that there's something wrong with my system.


No problems on my end. Overclocking is a go.


----------



## HyperMatrix

Quote:


> Originally Posted by *DADDYDC650*
> 
> No problems on my end. Overclocking is a go.


Quote:


> Originally Posted by *carlhil2*
> 
> Same here, I get a slightly higher score with same settings...


Do me a favour? Test with super low windowed resolution and low settings, keeping the Titan X at base clock of 1417MHz. See if you have any artifacting when GPU is OC'd at least +150 then.


----------



## carlhil2

Same here, I get a slightly higher score with same settings...


----------



## Kaapstad

4 SLI is now working for benches.













http://www.3dmark.com/3dm/14156025?


----------



## carlhil2

Quote:


> Originally Posted by *Kaapstad*
> 
> 4 SLI is now working for benches.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/14156025?


Nice score, should make for a lot of happy campers ..


----------



## DADDYDC650

Quote:


> Originally Posted by *HyperMatrix*
> 
> Do me a favour? Test with super low windowed resolution and low settings, keeping the Titan X at base clock of 1417MHz. See if you have any artifacting when GPU is OC'd at least +150 then.


720p res.


----------



## HyperMatrix

Quote:


> Originally Posted by *DADDYDC650*
> 
> 720p res.


That does nothing for me because your card is still using boost clocks. The problem with the driver is that it causes instability at pre-boost clocks. Run any type of game or benchmark that has low GPU usage, which results in the card using the stock clock of 1417MHz. I noticed this problem at the menu of a game I was playing. Once in game, no issues. But the menu is locked to 60fps and uses very little GPU, which keeps it at 1417MHz, resulting in artifacting and crashing.


----------



## DADDYDC650

Quote:


> Originally Posted by *HyperMatrix*
> 
> That does nothing for me because your card is still using boost clocks. The problem with the driver is that it causes instability at pre-boost clocks. Run any type of game or benchmark that has low GPU usage, which results in the card using the stock clock of 1417MHz. I noticed this problem at the menu of a game I was playing. Once in game, no issues. But the menu is locked to 60fps and uses very little GPU, which keeps it at 1417MHz, resulting in artifacting and crashing.


720p, low settings, window mode. MSI AB set to +200 core and +500 VRAM. No issues.


----------



## HyperMatrix

Quote:


> Originally Posted by *DADDYDC650*


Thanks. Well, I'm thoroughly confused. I swapped my monitor to my second Titan and there is no problem. Now to troubleshoot why this is happening. Fuuuuu.

Reverted drivers back to 369.05 and everything is fine. How on earth can a driver update break just 1 card....

Wondering if I should replace my card while I still can. Not a good sign that a driver can break its voltage control.


----------



## tpwilko08

Anyone experiencing screen flicker on the new nvidia drivers? May have to revert back to old ones.


----------



## carlhil2

Are these cards ever going to be able to push past 1.093 volts without a bios flash?


----------



## DADDYDC650

Quote:


> Originally Posted by *tpwilko08*
> 
> Anyone experiencing screen flicker on the new nvidia drivers? May have to revert back to old ones.


No flickering.


----------



## toncij

Quote:


> Originally Posted by *HyperMatrix*
> 
> Thanks. Well I'm thoroughly confused. I swapped my monitor to my second titan and there is no problem. Now to trouble shoot why this is happening. Fuuuuu.
> 
> Reverted drivers back to 369.05 and everything is fine. How on earth can a driver update break just 1 card....
> 
> Wondering if I should replace my card while I still can. Not a good sign that a driver can break its voltage control.


SLI fun I guess?


----------



## HyperMatrix

Quote:


> Originally Posted by *toncij*
> 
> SLI fun I guess?


SLI is disabled. This is happening on a single-card basis.


----------



## tpwilko08

Quote:


> Originally Posted by *DADDYDC650*
> 
> No flickering.


Did a clean install and everything seems to be OK now; the screen flickering is gone. Must have been a conflict with the old drivers. Now on to some game testing.


----------



## KillerBee33

Getting better results with the new driver:








http://www.3dmark.com/spy/283962 Driver 372.54
http://www.3dmark.com/spy/254233 Driver 369.05


----------



## dante`afk

and the before/after comparison is where?

jesus.

btw, the W10 Anniversary Update build scores lower than 1511


----------



## dante`afk

Quote:


> Originally Posted by *DADDYDC650*
> 
> Any proof? I haven't seen any 16x/16x vs 16x/8x pci-e with HB bridge benchmarks.


there is no difference; just Google TechPowerUp's comparisons of the 1080 with the HB bridge vs the regular bridge.


----------



## stefxyz

But maybe with the Titan it's different. We haven't seen new tests yet; only words and speculation... GamersNexus did a good video on PCIe lanes for the 1080 and found very small impact, but that might have changed now that we have 25 to 35% more performance.


----------



## Kaapstad

Quote:


> Originally Posted by *dante`afk*
> 
> there is no difference, just google up techpowerup comparisons with 1080 and HB bridge/regular bridge.


In some games like Witcher 3 there is a big difference @2160p.

I have tested this myself.


----------



## Menthol

Quote:


> Originally Posted by *DNMock*
> 
> No chance.
> 
> TI will be identical to the Titan with a few less cores (3200 or 3300 +/-). The only way they drop a full 3840 core 1080 ti is if Volta is behind and we see a new Titan XP Black based on the Tesla cards. Otherwise no one would have a reason to buy a titan after the 1080ti was released and Nvidia would lose out on potential profit from the mark-up.
> 
> Not gonna change the memory controller either to do 8gb Vram. Those changes would cost more to implement than just rolling out the 6 or 12 gb of ram with the already designed set-up (I believe that's what the 384 bit controller needs, but I may be wrong here)
> 
> edit: This is all moot if Vega is more powerful than TXP, in which case we will definitely see a full 3840 core 1080ti


Quote:


> Originally Posted by *steponz*
> 
> Guys some little updates....
> 
> Futuremark Firestrike Ulta and Extreme Hall of Fame Records..
> 
> http://www.3dmark.com/fs/9792075
> 
> http://www.3dmark.com/fs/9792072
> 
> Card will go to Epower next... Enjoy...


Nice, waiting to see your results. Thanks for sharing.

Quote:


> Originally Posted by *steponz*
> 
> 16x vs 8x makes a pretty big difference.. with newer more power hungry architectures.. it should make more of a difference because your really using the bandwidth.


Not so much with the 1080s, but with the TXP it does seem to make a difference in benches


----------



## Silent Scone

Quote:


> Originally Posted by *steponz*
> 
> 16x vs 8x makes a pretty big difference.. with newer more power hungry architectures.. it should make more of a difference because your really using the bandwidth.


Quote:


> Originally Posted by *steponz*
> 
> Ive done it quite a bit actually to test the difference.... Just run your setup in a different slot and run a bunch.... watch gpu score change..
> 
> Best way is to see for yourself and test.............


Quote:


> Originally Posted by *Kaapstad*
> 
> In some games like Witcher 3 there is a big difference @2160p.
> 
> I have tested this myself.


You would need to display these results with the correct testing to prove there is any disparity strictly between lane speeds.


----------



## Jpmboy

Quote:


> Originally Posted by *cookiesowns*
> 
> Haha.. I can barely stand a thick Panaflo sitting on top of the cards, don't know how you stand the Delta's, plus what I'm assuming to be high RPM fans over your TridentZ's either.
> 
> Either way


lol - the deltas are on a switch and pot so I can use them when needed. All other fans on the bench are straight PWM, so not loud at all... but the 90mm delta fans are simply sirens.








Not quite a klaxon, more like the 2PM Tuesday siren test.
Quote:


> Originally Posted by *Yuhfhrh*
> 
> You can set up the current afterburner to allow voltage adjustment on the card, but the card's bios limits it to around 1.08V.


really odd that a pascal bios editor has not shown up yet.
Quote:


> Originally Posted by *DarkIdeals*
> 
> Yes it goes black because it's in fullscreen. Tried in in about 5 games so far.
> 
> The weirdest thing by far though is that when i load up Witcher 3 it starts at ~60fps but if i just sit there for ~2-3 minutes it will suddenly jump to ~75-80fps for no reason and stay that way. SO bizarre!
> Yeah it works for me. Don't forget to add the [Settings] part too.
> 
> Unfortunately even increasing voltage does NOTHING to fix this horrid SLI issue. Just tried Far Cry 4 with both cards with +60 voltage added and it wouldn't go over a pitiful 35% usage per card.....like seriously WHAT THE HELL?!?!?


if you have "optimal power" set in NVCP (which is the default) it seem to be able to sleep a card or something. Set to adaptive or max performance.
Quote:


> Originally Posted by *DADDYDC650*
> 
> Any proof? I haven't seen any 16x/16x vs 16x/8x pci-e with HB bridge benchmarks.


It's unlikely you will notice or even measure any difference at lower than 1440P and even then the PCIE bandwidth is still well above the data transit bandwidth. 4K and 5K... and an HB bridge - yes, lane bandwidth will matter.

at x8 the bandwidth is half... still well above what a game can throw at the lanes.
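For context on the "x8 is still plenty" point, the theoretical numbers can be sketched from the PCIe 3.0 spec figures (8 GT/s signaling, 128b/130b encoding). This is back-of-envelope arithmetic, not a measurement from this thread:

```python
# Theoretical PCIe 3.0 bandwidth per direction.
GT_PER_SEC = 8.0        # giga-transfers per second, per lane (PCIe 3.0)
ENCODING = 128 / 130    # 128b/130b line-code efficiency

def pcie3_bandwidth_gbs(lanes: int) -> float:
    """Usable one-way bandwidth in GB/s for a PCIe 3.0 link."""
    gbit_per_sec = GT_PER_SEC * ENCODING * lanes
    return gbit_per_sec / 8     # bits -> bytes

x16 = pcie3_bandwidth_gbs(16)   # ~15.75 GB/s
x8 = pcie3_bandwidth_gbs(8)     # ~7.88 GB/s
print(f"x16: {x16:.2f} GB/s, x8: {x8:.2f} GB/s")
```

Even the x8 figure (~7.9 GB/s each way) is well above what most games stream over the bus per frame, which is why any difference tends to show up only at 4K/5K with SLI traffic on top.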


----------



## dante`afk

everywhere people say/believe statements without any tests, wishful thinking.

just tested new nvidia drivers, pretty unstable. the OC I was running before is not stable any longer, switching back.


----------



## marc0053

Quote:


> Originally Posted by *steponz*
> 
> Guys some little updates....
> 
> Futuremark Firestrike Ulta and Extreme Hall of Fame Records..
> 
> http://www.3dmark.com/fs/9792075
> 
> http://www.3dmark.com/fs/9792072
> 
> Card will go to Epower next... Enjoy...


Great job Steponz!! Looking forward to your Epowah!


----------



## SuprUsrStan

What are the "max" overclocks people have been getting under water? I saw some 2100 on air, but has anyone pushed further than that?

I looked 40 pages back but didn't see anything


----------



## Jpmboy

Quote:


> Originally Posted by *dante`afk*
> 
> everywhere people say/believe statements without any tests, wishful thinking
> just tested new nvidia drivers, pretty unstable. the OC I was running before is not stable any longer, switching back.


well - he likely has some testing I'm sure, but under conditions you are not gonna have on your day-driver gaming rig.








Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Got the block today. But I'm at work now. No CLU though yet. Will probably install the block in the morning after work with out the shunt mod for now.


Before you do the mod, check out the major improvements you get just by keeping the temps below 40C.








Quote:


> Originally Posted by *steponz*
> 
> Guys some little updates....
> Futuremark Firestrike Ulta and Extreme Hall of Fame Records..
> http://www.3dmark.com/fs/9792075
> http://www.3dmark.com/fs/9792072
> Card will go to Epower next... Enjoy...


whoa - 100pts above my FSE score. Nice!

hey Steponz - I know W8.1 is worth a bunch of pts... what does the card do on W10, just for comparison's sake?


----------



## KillerBee33

Quote:


> Originally Posted by *dante`afk*
> 
> everywhere people say/believe statements without any tests, wishful thinking.
> 
> just tested new nvidia drivers, pretty unstable. the OC I was running before is not stable any longer, switching back.


Edited two runs 372 VS 369 TimeSpy run


----------



## CallsignVega

Quote:


> Originally Posted by *Yuhfhrh*
> 
> It does for me, make sure you're on the latest beta (4.3.0?) and apply to both profile files if you have two cards. Close afterburner then reopen, go under afterburner settings and check unlock voltage.


Ah, my files already had a [Settings] header, so that's why it didn't take. Make sure not to have two of the same header.


----------



## dante`afk

what's a day-driver rig? lmao.

10563 gpu score with old drivers and less on core. http://www.3dmark.com/spy/284774


----------



## Jpmboy

Quote:


> Originally Posted by *dante`afk*
> 
> whats a day-driver rig? lmao.
> 
> 10563 gpu score with old drivers and less on core. http://www.3dmark.com/spy/284774


that's pretty good!









sub *here*


----------



## Snaporz

I have another question going back to a previous topic about thermal compounds. I saw some discussion of liquid-metal vs non-LM compounds. What should I use for this card on a waterblock instead of the standard EK thermal compound?


----------



## Jpmboy

Quote:


> Originally Posted by *Snaporz*
> 
> I have another question going back to a previous topic about thermal compounds. I saw some discussing Liquid Metal and Non-LM compounds. What to use for this card on a waterblock instead of standard EK thermal compound?


If you have a nickel-plated block, then you CAN use LM; if it is a bare copper block, do not use LM. Stick with Thermal Grizzly, Gelid, PK-1, PK-3, etc.


----------



## Snaporz

Quote:


> Originally Posted by *Jpmboy*
> 
> if you have a Nickel-plated block, then you CAN use LM, if it is a copper block do not use a LM. Stick with Grizzly or Gelid, PK-1, PK-3..etc.


I got the all nickel, so it's nickel plated then. Slightly further off topic, but does the same apply to CPU blocks? So I could dump my Arctic Silver 5? lol


----------



## KillerBee33

Anyone else noticed this?


----------



## EniGma1987

Quote:


> Originally Posted by *Kaapstad*
> 
> What a dreadful owners thread, the OP has made no effort at all.


This didn't start as an owners thread at all, and the OP had no desire for it to be one; the mods changed the thread name. It also isn't an owners club like the other threads in your sig.

Quote:


> Originally Posted by *steponz*
> 
> Guys some little updates....
> 
> Futuremark Firestrike Ulta and Extreme Hall of Fame Records..
> 
> http://www.3dmark.com/fs/9792075
> 
> http://www.3dmark.com/fs/9792072
> 
> Card will go to Epower next... Enjoy...


Can we get some pictures of your card and maybe a little write up on some of the mods you did?


----------



## Jpmboy

Quote:


> Originally Posted by *Snaporz*
> 
> I got the all nickel, so it's nickel plated then. Slightly further off topic, but does the same apply to CPU blocks? So I could dump my Arctic Silver 5? lol


You're good to go.
All blocks are copper core; the nickel is just plating. CLU will form an amalgam with a bare copper block surface that is a real PIA to restore to "like-new". Nickel will not do this chemistry.
AS5? Yeah, dump the AS5.


----------



## Gary2015

Quote:


> Originally Posted by *Jpmboy*
> 
> You're good to go.
> All blocks are copper core... nickel is plated. CLU will form an amalgam with the copper block surface that is a real PIA to restore to "like-new". NIckel will not do this chemistry.
> AS5? yeah, dump the AS5.


Pics please..


----------



## MrTOOSHORT

Stuck the water block on and got some pretty good results!









*http://www.3dmark.com/fs/9797451*

*http://www.3dmark.com/spy/285350*


----------



## criminal

Quote:


> Originally Posted by *Jpmboy*
> 
> if you have a Nickel-plated block, then you CAN use LM, if it is a copper block do not use a LM. Stick with Grizzly or Gelid, PK-1, PK-3..etc.


What does it do to a copper block?


----------



## dante`afk

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Stuck the water block on and got some pretty good results!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *http://www.3dmark.com/fs/9797451*
> 
> *http://www.3dmark.com/spy/285350*


stock clocks?


----------



## Snaporz

Quote:


> Originally Posted by *Jpmboy*
> 
> You're good to go.
> All blocks are copper core... nickel is plated. CLU will form an amalgam with the copper block surface that is a real PIA to restore to "like-new". NIckel will not do this chemistry.
> AS5? yeah, dump the AS5.


Have some Thermal Grizzly Kryonaut en route now to use on both!


----------



## Creator

I can't monitor or change voltage on my TXP. I have the latest drivers, the latest AB 4.3.0 Beta 4, and am running Win10 64-bit. I've checked all the voltage options and tried all the different voltage-control modes, but nothing: the slider is grayed out and the voltage reads 0 mV. Anyone know why? Am I missing something?

Edit: Nvm just got it working... I think.

Edit again: Nope, I didn't. The slider is unlocked but it won't let me change anything, and it still reads 0 mV.


----------



## Jpmboy

Quote:


> Originally Posted by *criminal*
> 
> What does it do to a copper block?


the indium will do a little chemistry with bare copper, forming an intermetallic (often loosely called an amalgam). It is chemically reversible, but most folks just "polish" it off, thinking it is a stain. Only affects the first few microns of the cold plate.


----------



## DADDYDC650

Quote:


> Originally Posted by *Kaapstad*
> 
> In some games like Witcher 3 there is a big difference @2160p.
> 
> I have tested this myself.


Doubt it.


----------



## chronicfx

http://www.indium.com/blog/indium-copper-intermetallics-in-soldering.php

Just some reference @criminal


----------



## Jpmboy

Hey guys - I have an odd situation here. Two TXPs run very well in Windows 10 with the HB bridge, or with the 3-way hard bridge included with the ASUS MB. I wanted to run the 2 cards under Windows 7, so I did a fresh install on a RAID 0. After all the updates and driver installs - with Windows Device Manager showing both cards - I installed either of the 2 drivers which support the TXP... several times. Each time, right after loading the driver, the required restart results in a complete loss of the video signal (e.g., this ASUS Swift just turns off), and a reset goes to "repair Windows or boot normally". Repair repairs nothing, and a normal start repeats the signal drop. Did 3 fresh OS installs and DDU/driver install cycles to no avail. Each card will run solo in the slot it is in (e.g., by switching off the other card with the MB switches), and the OS install with only one card active works with either card. WTH?









_ANY_ ideas are appreciated. (well, except to forget W7)
Quote:


> Originally Posted by *chronicfx*
> 
> http://www.indium.com/blog/indium-copper-intermetallics-in-soldering.php
> 
> Just some reference @criminal


Nice. +1


----------



## ottoore

Quote:


> Originally Posted by *Jpmboy*
> 
> the indium will do a little chemistry with straight copper - it's called an amalgam. It is chemically reversible, but most folks just "polish" it off, thinking it is a stain. Only affects the first few microns on the cold plate.











That's why I prefer pure gallium (you can find it on eBay) rather than CLP/CLU. And it's cheaper than "brand products".


----------



## Jpmboy

Quote:


> Originally Posted by *ottoore*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's why i prefer pure gallium ( you can find it on ebay) rather than CLP/CLU. And it's cheaper than " brand products".


Erm... gallium has the same issue with copper. Folks, copper and aluminum are very prone to forming alloys - lol, humanity practically tripped over the effect going from the Copper Age to the Bronze Age.


----------



## Kaapstad

Quote:


> Originally Posted by *DADDYDC650*
> 
> Doubt it.


Believe what you want to, it is your loss.


----------



## Kaapstad

Quote:


> Originally Posted by *EniGma1987*
> 
> This didnt start as an owners thread at all and the OP had no desire for it to be one. Mods changed the thread name. It also isnt an owners club, like the other threads in your sig.
> Can we get some pictures of your card and maybe a little write up on some of the mods you did?


There are some very clever people on these forums (I'm not one of them), perhaps one of them could do a proper owners thread.


----------



## Jpmboy

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Stuck the water block on and got some pretty good results!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *http://www.3dmark.com/fs/9797451*
> 
> *http://www.3dmark.com/spy/285350*


MrT is ready to take over the leader boards!









no resistor mod?


----------



## DADDYDC650

Quote:


> Originally Posted by *Kaapstad*
> 
> Believe what you want to, it is your loss.


I believe in proof.


----------



## mbze430

Quote:


> Originally Posted by *Gary2015*
> 
> Pics please..


Here is my 980 Ti block with the CLU still on it. Honestly, there is NO reason you need to remove it if you applied it correctly, as it fills in all the micro-scratches in the copper (which is exactly the whole point of a TIM in the first place).

Just look up CuGa2 solder.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Jpmboy*
> 
> MrT is ready to take over the leader boards!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> no resistor mod?


No mod yet; the CLU hasn't arrived. Can't believe the card does this well with just a block and the stock BIOS.


----------



## Kaapstad

Quote:


> Originally Posted by *DADDYDC650*
> 
> I believe in proof.


What I said about Witcher 3 has been repeated by a number of people including me.

There is only so much proof you can have.


----------



## DADDYDC650

Quote:


> Originally Posted by *Kaapstad*
> 
> What I said about Witcher 3 has been repeated by a number of people including me.
> 
> There is only so much proof you can have.


Like I said, no proof.


----------



## criminal

Quote:


> Originally Posted by *Jpmboy*
> 
> the indium will do a little chemistry with straight copper - it's called an amalgam. It is chemically reversible, but most folks just "polish" it off, thinking it is a stain. Only affects the first few microns on the cold plate.


Quote:


> Originally Posted by *chronicfx*
> 
> http://www.indium.com/blog/indium-copper-intermetallics-in-soldering.php
> 
> Just some reference @criminal


Okay. Thanks to you both!


----------



## DarkIdeals

Quote:


> Originally Posted by *HyperMatrix*
> 
> I tried Far Cry 4 out just a couple days ago. Generally ran around 90% usage across both cards. Only time it'd go lower is when it was being bottlenecked by CPU/DX11 draw calls


That's why it's so weird that mine is only putting up ~30% usage in that game per card.

Quote:


> Originally Posted by *stefxyz*
> 
> But may be with the Titan its different. We havent see a new tests yet. I see only words or speculations... Gamernexus did a good video on Lanes for the 1080 and they found very small impact but this might have changed now that we have 25 to 35 % more performance.


Don't forget that the memory bandwidth is also higher. On the 1080 we actually had LESS bandwidth than a 980 Ti (320GB/s on the 1080 vs 336GB/s on the 980 Ti), but now with the Titan XP we're working with a stock 480GB/s, which is very high; if you overclock the memory to 11,000MHz effective you actually get 528GB/s, which is MORE than the HBM used in cards like the Fury X!

That combined with the raw horsepower might just be enough to make it matter. My analogy for bandwidth: your GPU is the pump, the data is the water, and the PCIe link is the "size of the pipe". If your GPU pump can push 2 square inches of water per second and the pipe's cross-section is only 1 square inch, you aren't getting the full flow to its destination. Widening the pipe to 2 square inches gives you a definite improvement, but doubling it AGAIN to 4 square inches will not, because now you have MORE pipe than water flow, with empty space at the top. So to take advantage of an even larger "pipe" you need a more powerful "pump" (GPU).

This is how my theory on the correlation between GPU power/memory and the PCI-e bandwidth goes.
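The bandwidth figures quoted above fall straight out of bus width times effective data rate. A minimal sketch (assuming the commonly listed 384-bit bus, and treating 10,000/11,000MHz as the effective GDDR5X data rate):

```python
def mem_bandwidth_gbs(bus_bits: int, effective_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x effective data rate."""
    return bus_bits * effective_mhz * 1e6 / 8 / 1e9

stock = mem_bandwidth_gbs(384, 10_000)   # 480 GB/s, the stock figure quoted
oc = mem_bandwidth_gbs(384, 11_000)      # 528 GB/s with the memory OC
print(stock, oc)
```

The same formula reproduces the 1080's 320GB/s (256-bit at 10,000MHz effective) and the 980 Ti's 336GB/s (384-bit at 7,000MHz effective).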

Quote:


> Originally Posted by *Jpmboy*
> 
> if you have "optimal power" set in NVCP (which is the default) it seem to be able to sleep a card or something. Set to adaptive or max performance


Yeah, tried that when it first started happening. I've had it set to Maximum Performance mode since I first got the cards, and it's still there now. Just can't figure it out...

Quote:


> Originally Posted by *Creator*
> 
> I can't monitor or change voltage on my TXP. I have the latest drivers, latest AB 4.3.0 Beta 4, and am running Win10 64-bit. I've checked all the options for voltage, tried all the difference voltage control, but nothing - it's grayed out and I see 0 mV as the voltage read. Anyone know why? Am I missing something?
> 
> Edit: Nvm just got it working... I think.
> 
> Edit again: Nope I didn't. Slider is unlocked but not letting me change and still reading 0 mV.


Go into the drive you installed to (C:\ or whatever), then Program Files (x86), then the MSI Afterburner folder, then the Profiles folder. Inside the Profiles folder you should see up to 4 config files: one named MSIAfterburner, and more named "VEN_10DE&SUBSYS_..." etc. If you have only one card you will have up to two of these files (I have three for some reason, not sure why); if you have two cards for SLI you will have two. Open the top card's file and add lines saying this:

[Settings]
VDDC_Generic_Detection = 1

Then if you are using SLI go to the next file down and add that same line. You should now be able to control voltage in MSI Afterburner.
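For anyone who'd rather script the edit than hand-modify each profile, here's a minimal sketch. The install path and the `VEN_10DE*.cfg` naming are taken from the post above as assumptions (adjust for your install); back up the Profiles folder first, and keep Afterburner closed while editing:

```python
import configparser
from pathlib import Path

# Assumed default Afterburner install path - adjust if yours differs.
PROFILE_DIR = Path(r"C:\Program Files (x86)\MSI Afterburner\Profiles")

def unlock_voltage(profile_dir: Path) -> None:
    """Add VDDC_Generic_Detection = 1 under [Settings] in each card profile.

    Assumes the .cfg files are plain INI text, as they appeared to be
    for Afterburner 4.3.0 beta.
    """
    for cfg_path in profile_dir.glob("VEN_10DE*.cfg"):  # 10DE = NVIDIA vendor ID
        cfg = configparser.ConfigParser()
        cfg.optionxform = str            # preserve key capitalization
        cfg.read(cfg_path)
        if not cfg.has_section("Settings"):
            cfg.add_section("Settings")
        cfg.set("Settings", "VDDC_Generic_Detection", "1")
        with cfg_path.open("w") as f:
            cfg.write(f)

if PROFILE_DIR.is_dir():
    unlock_voltage(PROFILE_DIR)
```

This touches every NVIDIA card profile it finds, so the SLI case is covered automatically.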

Quote:


> Originally Posted by *Jpmboy*
> 
> MrT is ready to take over the leader boards!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> no resistor mod?


I pity the fool who messes with those cards once he does the mod.









Quote:


> Originally Posted by *Kaapstad*
> 
> What I said about Witcher 3 has been repeated by a number of people including me.
> 
> There is only so much proof you can have.


You're talking about the HB bridge affecting performance, right? Sorry, this conversation is spread out so I'm not following it super well.


----------



## Kaapstad

Quote:


> Originally Posted by *DADDYDC650*
> 
> Like I said, no proof.


I think even if I posted proof you would not believe it.


----------



## DADDYDC650

Quote:


> Originally Posted by *Kaapstad*
> 
> I think even if I posted proof you would not believe it.


Wrong.


----------



## habu58

Since people are looking for benchies: ROTTR at 4K with max settings and FXAA. The second one is an early 8K benchmark with max settings. By comparison, the previous Titan X (Maxwell) SLI was only able to hold around 30-45fps in ROTTR. Running 2x Titan XPs at +200/+600.


----------



## Jpmboy

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> No mod yet, CLU hasn't come yet. Can't believe the card does this good with just a block and stock bios.


Yeah man, very sensitive to temps... check GPU-Z; clocks hold solid with only very minor power throttling in most benchmarks.








Quote:


> Originally Posted by *criminal*
> 
> Okay. Thanks to you both!


you're welcome bro.








CLU is great stuff... just good to know when and where it makes a meaningful difference. Folks tend to think it's magic stuff - basically the same eutectic used in today's "mercury" thermometers.
Quote:


> Originally Posted by *Kaapstad*
> 
> I think even if I posted proof you would not believe it.


believe what?


----------



## ottoore

Quote:


> Originally Posted by *Jpmboy*
> 
> erm... has the same issue with copper. Folks, copper and aluminum are very prone to form alloys, lol as easily as tripping over the effect in going from the copper age to the bronze age.


We ran some tests on CLP and pure gallium on an Italian forum. After a couple of years (we wanted to check the long-term effects), we verified that gallium does not leave any "mark" on copper.


----------



## Kaapstad

Quote:


> Originally Posted by *Jpmboy*
> 
> believe what?


Scaling in Witcher 3 @2160p using HB SLI bridge and Pascal Titans.

There is up to about a 30% gain using the HB bridge in that game, and the fps are very consistent.


----------



## Testier

I probably have one of the worst Titan XPs. It can't hold 2GHz at all....


----------



## Zurv

Quote:


> Originally Posted by *Jpmboy*
> 
> .
> believe what?


Quote:


> Originally Posted by *Testier*
> 
> I have probably one of the worse titan XP. It cant hold clock at 2ghz at all....


"Hold" is the key word. The cards get hot; don't expect them to stay at that clock under load unless you're under water or rocking 100% fans.


----------



## Testier

Quote:


> Originally Posted by *Zurv*
> 
> "hold" is the key. The cards get hot. don't expect it to stay at that under load unless you are under water or rocking 100% fans


I'm running a very aggressive fan profile.... oh well, I probably need a custom cooler.


----------



## EniGma1987

Quote:


> Originally Posted by *ottoore*
> 
> We made some tests on clp and pure gallium in an italian forum. After a couple of year, we wanted to prove long-term effects, we verified that gallium does not leave any " mark " on copper.


Pure gallium may not, but CLP and CLU do, most likely due to the other metals mixed with the gallium in those liquid-metal TIMs. Did your heatsink not show any sign of the CLP fusing to it? I know for a fact CL Ultra does, and I thought the one time I used CL Pro it did as well, but perhaps the Pro is pure enough that it doesn't fuse.


----------



## chronicfx

@HyperMatrix when you run Far Cry 4, do you go into Task Manager, disable the "pegged" core, and then re-enable it via Set Affinity? For me Far Cry runs a single core at 90-95% until I do that. You may see better frames after.


----------



## toncij

Quote:


> Originally Posted by *Kaapstad*
> 
> Scaling in Witcher 3 @2160p using HB SLI bridge and Pascal Titans.
> 
> There is upto about 30% gain using HB bridge in the game and the fps are very constant.


30% compared to a single flex? I've tried dual flex and it's the same as the HB one - aside from the HB's incompatibility with EKWB blocks.


----------



## chronicfx

Quote:


> Originally Posted by *toncij*
> 
> 30% compared to a single flex? I've tried dual flex and it's the same as HB one - sans incompatibility with EKWB blocks.


Yeah I suspected as much. The HB bridges were not initially available for the 1080 because it was the titans they were really meant for.


----------



## Jpmboy

Quote:


> Originally Posted by *ottoore*
> 
> We made some tests on clp and pure gallium in an italian forum. After a couple of year, we wanted to prove long-term effects, we verified that gallium does not leave any " mark " on copper.


great - would love to see the data etc.


----------



## enfluence

Anyone using an NZXT G10 bracket on the Titan? Just bought one with a Kraken X41.


----------



## DNMock

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> No mod yet, CLU hasn't come yet. Can't believe the card does this good with just a block and stock bios.


So has a general consensus been reached on the resistor mod? Thin layer on 2 or thin layer on all 3? Gonna give it a go this weekend.
Quote:


> Originally Posted by *toncij*
> 
> 30% compared to a single flex? I've tried dual flex and it's the same as HB one - sans incompatibility with EKWB blocks.


Just cut the tips off. Don't even have to go past where the screws are to get it to fit. It's literally like .25 inches that needs to be removed tops.

If you are uncomfortable cutting it, just use a decent metal file and sand the corners off.


----------



## Menthol

I added the info necessary to control voltage in AB, but it doesn't appear to add any extra voltage. I also tried Ctrl+L to lock boost clocks; it works on a single card, but in SLI it still only applies to one card.

The mod didn't help me much (about 50MHz higher), and on one card it made things worse: GPU-Z would show a max TDP of like 24% and the card would not clock worth a crap. Hope you have better luck than I did.


----------



## Kaapstad

Quote:


> Originally Posted by *toncij*
> 
> 30% compared to a single flex? I've tried dual flex and it's the same as HB one - sans incompatibility with EKWB blocks.


In a lot of things there is no difference between a HB and Flexi bridge.

Witcher 3 and a few others seem to be the exception rather than the rule.

If you do want to use a HB bridge with waterblocks EVGA also sell them and they don't need cutting down.


----------



## toncij

Quote:


> Originally Posted by *Kaapstad*
> 
> In a lot of things there is no difference between a HB and Flexi bridge.
> 
> Witcher 3 and a few others seem to be the exception rather than the rule.
> 
> If you do want to use a HB bridge with waterblocks EVGA also sell them and they don't need cutting down.


Dual flexi at 5K: I saw no difference outside the 3% margin of error (i.e., run-to-run variance). A single flexi is another matter, but after trying both the HB and 2x flexi, I think the connections carry the same bandwidth, so performance is the same.


----------



## lilchronic

Quote:


> Originally Posted by *Menthol*
> 
> I added the info necessary to control voltage in AB but it doesn't appear to add any extra voltage, I also tried the CTRL L to lock boost clocks, seems to work on one card but in SLI it still only works on one card
> 
> The mod didn't help me much (about 50mhz higher) *and on one card it made things worse, GPU-Z would show max TDP of like 24% and card would not clock worth a crap*, hope you have better luck than I did


Have you tried doing one at a time? I heard if you lower the power sense too far it will lock the card into 2D clocks or something like that.


----------



## GlowingBurrito

So mine just came in. Do I install it now or wait for my block to come in


----------



## chronicfx

Quote:


> Originally Posted by *GlowingBurrito*
> 
> So mine just came in. Do I install it now or wait for my block to come in


Is that even a question?







No, but seriously: you can always stress test and bench it to see if it's all good before you get the block.


----------



## Yuhfhrh

Quote:


> Originally Posted by *DNMock*
> 
> So has a general consensus been reached on the resistor mod? Thin layer on 2 or thin layer on all 3? Gonna give it a go this weekend.


I did all three, under a furmark test where I was hitting 115-120% before, I'm now around 80-84%. Even setting the power limit to 120% now, the card seems to stop feeding more power at 400W. The power limit will show going up to 120%, but everything from 105% to 120% is pulling 400W from my rough math.
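As a rough cross-check of that math (a sketch only: the 250W stock power limit and a linear scale-down of the sensed power from the resistor mod are assumptions, not measurements):

```python
STOCK_TDP_W = 250.0   # assumed stock power limit (100% on the slider)

# Reported utilization for the same FurMark load, before and after the mod.
before_pct = (115 + 120) / 2    # ~117.5% reported pre-mod
after_pct = (80 + 84) / 2       # ~82% reported post-mod

# If the mod scales the sensed power linearly, the sensor now reads low:
scale = after_pct / before_pct          # ~0.70
# so a 120% slider reading post-mod corresponds to roughly:
actual_w = 1.20 * STOCK_TDP_W / scale
print(f"sense scale ~{scale:.2f}, 120% post-mod ~{actual_w:.0f} W actual")
```

That lands around 430W, the same ballpark as the ~400W Kill-A-Watt estimate; the gap is within the uncertainty of the assumed TDP and the averaged readings.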


----------



## Menthol

Quote:


> Originally Posted by *lilchronic*
> 
> Have you tried doing one at a time? I heard if you lower the power sense to low it will lock the card in 2d clock's or something like that.


I tried all 3 first, then removed the CLU from the bottom resistor, then the middle, then the last. It's much harder to remove than to apply; that crap will end up all over the place if you're not extremely careful.
Using universal blocks and a chiller for benching, both cards hold 2100MHz; the CLU on the one card gave me another 50MHz. Would never do this for daily use.


----------



## Jpmboy

Quote:


> Originally Posted by *lilchronic*
> 
> Have you tried doing one at a time? I heard if you lower the power sense too low it will lock the card in 2D clocks or something like that.


^^ This is what was happening with the 1080. The CLU mod is very "iffy" IMO.
Quote:


> Originally Posted by *Yuhfhrh*
> 
> I did all three, under a furmark test where I was hitting 115-120% before, I'm now around 80-84%. Even setting the power limit to 120% now, the card seems to stop feeding more power at 400W. The power limit will show going up to 120%, but everything from 105% to 120% is pulling 400W from my rough math.


Do you have a Kill-A-Watt meter?
Quote:


> Originally Posted by *Menthol*
> 
> I tried all 3 first, then removed CLU from the bottom res, then removed the middle, then removed the last, much harder to remove than to apply, that crap will end up all over the place if your not extremely careful
> using universal blocks and a chiller for benching, they both hold 2100mhz, CLU on the one card gave me another 50mhz, *would never do this for daily use*


^^ for sure.


----------



## DNMock

Quote:


> Originally Posted by *Yuhfhrh*
> 
> I did all three, under a furmark test where I was hitting 115-120% before, I'm now around 80-84%. Even setting the power limit to 120% now, the card seems to stop feeding more power at 400W. The power limit will show going up to 120%, but everything from 105% to 120% is pulling 400W from my rough math.


Isn't that about the cap on a 6 + 8 pin connector + PCIE?

What were you pulling at 120% prior to the mod?


----------



## Yuhfhrh

Quote:


> Originally Posted by *Jpmboy*
> 
> do you have a Killawatt meter?


Yup.
Quote:


> Originally Posted by *DNMock*
> 
> Isn't that about the cap on a 6 + 8 pin connector + PCIE?
> 
> What were you pulling at 120% prior to the mod?


Roughly 300W from the card before the mod at 120%.
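For anyone reproducing this kind of estimate: a Kill-A-Watt reads AC draw at the wall, so the card's DC draw is roughly the load/idle delta scaled by the PSU's efficiency. A minimal sketch of that arithmetic, where the wall readings and the 92% efficiency figure are illustrative assumptions, not the actual numbers from this post:

```python
# Rough estimate of GPU power from Kill-A-Watt wall readings.
# The 0.92 PSU efficiency (a Platinum-class unit at this load) is an
# assumption; substitute your own PSU's rating.

def gpu_power_estimate(wall_load_w, wall_idle_w, psu_efficiency=0.92):
    """Estimate the extra DC power the card draws under load.

    wall_load_w: Kill-A-Watt reading while the GPU is loaded (Furmark etc.)
    wall_idle_w: reading with the system idle (GPU near 0% usage)
    """
    delta_ac = wall_load_w - wall_idle_w   # extra draw at the wall (AC watts)
    return delta_ac * psu_efficiency       # convert to DC watts at the card

# Example (made-up readings): 520 W under Furmark vs. 195 W idle
print(round(gpu_power_estimate(520, 195)))   # 299, i.e. "roughly 300 W"
```

This ignores the small extra CPU/fan load a GPU stress test adds, so treat the result as a ballpark, not a measurement.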


----------



## DNMock

Quote:


> Originally Posted by *Yuhfhrh*
> 
> Yup.
> Roughly 300W from the card before the mod at 120%.


dang, so a 25% increase only equated to 50 mhz? Sounds like it's not worth it to even bother with.


----------



## Yuhfhrh

Quote:


> Originally Posted by *DNMock*
> 
> dang, so a 25% increase only equated to 50 mhz? Sounds like it's not worth it to even bother with.


It stops nearly all of the power throttling in benchmarks, better performance at the same clocks.


----------



## EniGma1987

Quote:


> Originally Posted by *DNMock*
> 
> Isn't that about the cap on a 6 + 8 pin connector + PCIE?


Quote:


> 6-pin connector specified for 17A of current (3 contacts for +12V power, 2 contacts for GND return, one contact for detect). 8-pin have 25.5A current specification (3 contacts for +12V power, 3 contacts for GND return and 2 contacts for detection). This is 204W at +12.0V level or 306W for 8-pin accordingly.


Quote:


> Originally Posted by *DNMock*
> 
> dang, so a 25% increase only equated to 50 mhz? Sounds like it's not worth it to even bother with.


The shunt mod doesn't add any MHz to your overclock; that was never the point of the mod. The point is to stop the driver from throttling down your overclock once you hit the "power limit" programmed into the GPU.
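As a sanity check on those numbers: the quoted figures are per-connector contact ratings, which sit well above the official PCI-SIG budgets (75W slot, 75W 6-pin, 150W 8-pin). A small sketch of the arithmetic at a +12.0V rail:

```python
# Wattage implied by the connector current ratings quoted above, at +12.0 V.
# The 75 W slot figure is the official PCIe slot budget, not a contact rating.

RAIL_V = 12.0

connectors = {
    "PCIe slot": 75.0,        # official slot power budget
    "6-pin": 17.0 * RAIL_V,   # 17 A   -> 204 W contact rating
    "8-pin": 25.5 * RAIL_V,   # 25.5 A -> 306 W contact rating
}

total = sum(connectors.values())
print(connectors["6-pin"], connectors["8-pin"], total)   # 204.0 306.0 585.0
```

So a 400W draw exceeds the 300W official spec for slot + 6-pin + 8-pin, but is still comfortably under the roughly 585W the contacts themselves are rated for.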


----------



## Jpmboy

Quote:


> Originally Posted by *Yuhfhrh*
> 
> *Yup.*
> Roughly 300W from the card before the mod at 120%.


lol - when you said "math" I figured you were doing something more. You meant arithmetic.


----------



## opt33

Mine is up on water... 5 runs of Time Spy and the hottest spot on the back of my card read 45.1C (VRM area).

2100 core and +200 mem works so far in Time Spy. The 1.06V max is a bummer; haven't tried the MSI voltage unlock yet.
http://www.3dmark.com/3dm/14173901?


----------



## Jpmboy

Quote:


> Originally Posted by *opt33*
> 
> mine is up on water....5 runs of time spy and the hottest spot on back of my card read 45.1C (vrm area).
> 
> 2100 core and +200 mem works so far in time spy. 1.06v max is bummer, havent tried the unlock msi volt yet
> http://www.3dmark.com/3dm/14173901?


ram should be able to do +500 or more.


----------



## HyperMatrix

Quote:


> Originally Posted by *chronicfx*
> 
> @HyperMatrix when you run far cry 4 do you go into task manager and disable the "pegged" core and re-enable it by going to task
> Manager --> set affinity? For me far cry runs a single core at 90-95% until I do that. You may see better frames after.


I'll definitely give this a try! Because performance in some areas drops down to 70fps when maxed out, all from a CPU bottleneck, and that's painful for an FPS. Thanks.


----------



## HyperMatrix

So no one else is getting overclock instability with the new drivers? One of my cards is essentially in "gimp" mode right now. The other one is fine. Reverting to old drivers resolved it. But I'm wondering if this is a hardware issue that the driver exposed. If so, I'll need to RMA the card.


----------



## dante`afk

Quote:


> Originally Posted by *HyperMatrix*
> 
> So no one else is getting overclock instability with the new drivers? One of my cards is essentially in "gimp" mode right now. The other one is fine. Reverting to old drivers resolved it. But I'm wondering if this is a hardware issue that the driver exposed. If so, I'll need to RMA the card.


You should re-evaluate your OC stability using the new driver.

New drivers push the GPUs in different/harder ways, and the code goes through different paths to produce supposedly-faster results. So, if your overclock becomes unstable, parts of the GPUs are working harder, and you need to re-evaluate to find the new sweet spot.

I had to lower my core clock by 30Mhz to regain stability with those drivers, but overall my 3DMark score and FPS is higher now.


----------



## HyperMatrix

Quote:


> Originally Posted by *dante`afk*
> 
> You should re-evaluate your OC stability using the new driver.
> 
> New drivers push the GPUs in different/harder ways, and the code goes through different paths to produce supposedly-faster results. So, if your overclock becomes unstable, parts of the GPUs are working harder, and you need to re-evaluate to find the new sweet spot.
> 
> I had to lower my core clock by 30Mhz to regain stability with those drivers, but overall my 3DMark score and FPS is higher now.


I documented earlier that there is an actual serious problem with the driver. At pre-boost clocks, it'll artifact and crash even with a +100MHz offset. This happens when there is low GPU usage. But if you're running a program that utilizes the GPU more, and pushes it into boost clocks, it'll go over 2000MHz+ and all artifacting/etc issues will be gone. GPU-Z is reporting it as a VRel issue during pre-boost clocks.


----------



## DADDYDC650

Quote:


> Originally Posted by *HyperMatrix*
> 
> I documented earlier that there is an actual serious problem with the driver. At pre-boost clocks, it'll artifact and crash even with a +100MHz offset. This happens when there is low GPU usage. But if you're running a program that utilizes the GPU more, and pushes it into boost clocks, it'll go over 2000MHz+ and all artifacting/etc issues will be gone. GPU-Z is reporting it as a VRel issue during pre-boost clocks.


I"ve had 0 issues so far with this driver. Are you guys running WIndows 10 anniversary update?


----------



## HyperMatrix

Quote:


> Originally Posted by *DADDYDC650*
> 
> I"ve had 0 issues so far with this driver. Are you guys running WIndows 10 anniversary update?


Yeah. But again, the weirdest thing is that the problem only exists with one of my cards. Which makes it impossible for me to understand and try to figure out.


----------



## dante`afk

I went back to 1511 after testing the anniversary update last week. sucks. blocked the update until they fix their ****.


----------



## DADDYDC650

Quote:


> Originally Posted by *dante`afk*
> 
> I went back to 1511 after testing the anniversary update last week. sucks. blocked the update until they fix their ****.


What issues?


----------



## KillerBee33

Any difference between the 372 Win 10 driver and the 372 Win 10 AU driver?
@ dante`afk , even on a clean install?


----------



## Jpmboy

Quote:


> Originally Posted by *HyperMatrix*
> 
> So no one else is getting overclock instability with the new drivers? One of my cards is essentially in "gimp" mode right now. The other one is fine. Reverting to old drivers resolved it. But I'm wondering if this is a hardware issue that the driver exposed. If so, I'll need to RMA the card.


yeah - I'm getting some weird behavior with the card in slot 3... I'm running 1511 also. Gotta dig a little to see what's up.
Quote:


> Originally Posted by *DADDYDC650*
> 
> I"ve had 0 issues so far with this driver . Are you guys running WIndows 10 anniversary update?


no, 1511. 1607 is not ready yet IMO.








Quote:


> Originally Posted by *dante`afk*
> 
> I went back to 1511 after testing the anniversary update last week. sucks. blocked the update until they fix their ****.


Same here.


----------



## opt33

Quote:


> Originally Posted by *Jpmboy*
> 
> ram should be able to do +500 or more.


Yeah, my 2100 core is the max for Time Spy. The max memory I've been able to run at that core speed is +200; any higher on the mem and it crashes during the CPU test. I backed off to 2070 core and ran +500 mem with no issue, still testing the mem upward. But so far my best score is 2100 with +200 mem, though 2070 / +500 mem was within the margin of error. And both runs show some minor power throttling.

Every crash I have had has been on the CPU test, either loading it or running it; it seems harder on the GPU than the GPU tests are.


----------



## Jpmboy

Quote:


> Originally Posted by *opt33*
> 
> yeah, my 2100core is max for timespy, max memory I have been able to run at that core speed is+ 200, any higher mem and crashes during cpu test. I backed off to 2070 core, and +500mem no issue, still testing up mem. But so far best score is 2100 with 200 mem, though 2070/500 mem was within margin of error. And both runs have some minor power throttling..
> 
> Every crash I have had has been on cpu test...either loading or running it, seems harder on gpu than gpu tests.


lol - yeah - most hangs on the "cpu" test are not cpu related. It's TimeSpy version 1.0...









my card in slot 1 is working fine: http://www.3dmark.com/3dm/14175774


----------



## HyperMatrix

As a side note to the CPU discussion: I recently noticed I get higher bench numbers on CPU-intensive tasks at a lower clock speed. It seems that even at 80C, the 5960X likes to throttle performance down, even though no WHEA errors or visible throttling show up. So I dropped from 1.41V at 4.7GHz to 1.28V at 4.625GHz and actually get better performance in Linpack.


----------



## Jpmboy

Quote:


> Originally Posted by *HyperMatrix*
> 
> As a side note for the CPU discussion, I recently noticed I get higher bench numbers on CPU intensive tasks at a lower clockspeed. It seems like even at 80c, the 5960x likes to throttle down performance even though no WHEA errors or actual visible throttling is present. So I dropped from 1.41v at 4.7GHz to 1.28v at 4.625GHz and actually get better performance on linpack.


At or near Tcase, error correction can make things pretty inefficient (looping until checksums match, or, if they never do, it becomes an uncorrectable error). Not all correctable errors (machine-check errors / WHEA) will be recorded.


----------



## opt33

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - yeah - most hangs on the "cpu" test are not cpu related. It's TimeSpy version 1.0...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> my card in slot 1 is working fine: http://www.3dmark.com/3dm/14175774


Nice run on Time Spy... power throttling is killing mine, and I'm not sure the MSI power slider does anything. As I increase the mem further my score goes lower; the higher mem causes the GPU core to throttle down further. But +550 mem is stable so far. We really need access to the BIOS.


----------



## Neb9

Definitely getting a Titan XP, but also considering going SLI. I'm using a 1440p 144Hz monitor and am wondering what the scaling is like at 1440p?


----------



## DADDYDC650

What issues are people running into with the latest version of Windows 10? I haven't had any problems so pretty curious.


----------



## Fiercy

Quote:


> Originally Posted by *Neb9*
> 
> Definitely getting a Titan XP, but also considering going SLI. I am using a 1440p 144hz monitor and am wondering about what the scaling is like at 1440p?


Don't bother with SLI; more problems than gain, IMHO. Get the Titan though.


----------



## HyperMatrix

Quote:


> Originally Posted by *Fiercy*
> 
> Don't bother with SLI more problems than gain IMHO. Get Titan though


I would disagree. Two Titans are almost perfect for 165Hz 1440p. Just a little shy, which will be solved when I put them under water. But for 144Hz they should be exactly what you need to run maxed out, barring any games that are cpu bottlenecked.


----------



## Jpmboy

Quote:


> Originally Posted by *DADDYDC650*
> 
> What issues are people running into with the latest version of Windows 10? I haven't had any problems so pretty curious.


I don't think they have fully optimized the core preferencing with BWE processors (e.g. the * core thing). It seems to compromise performance in benchmarks, but 1607 should be fine for most uses.








Quote:


> Originally Posted by *opt33*
> 
> nice run on timespy....power throttling is killing mine, not sure the msi power slider does anything. as i increase the mem further my score is going lower, higher mem is causing gpu core to throttle down farther. but 550 mem so far stable. really need access to bios.


same here.. the PL is just slamming into 120%


----------



## Gary2015

Quote:


> Originally Posted by *DADDYDC650*
> 
> Would it be possible for you to run Firestrike and Time Spy with 16x/16x and 16x/8x? Do you own a HB bridge?


I'm running x16/x8 and have no problems getting 100fps on my X34.


----------



## Gary2015

Quote:


> Originally Posted by *Fiercy*
> 
> Don't bother with SLI more problems than gain IMHO. Get Titan though


Sli hasn't had any problems for me.


----------



## Gary2015

Quote:


> Originally Posted by *DADDYDC650*
> 
> What issues are people running into with the latest version of Windows 10? I haven't had any problems so pretty curious.


None, except the Realtek driver doesn't work on my mobo, so I had to roll back to an old one.


----------



## DADDYDC650

Quote:


> Originally Posted by *Gary2015*
> 
> Im running x16/x8 and having no problems getting 100fps on my X34.


I'm pretty sure that Titan XP's aren't saturating pci-e 16x/8x 3.0. There would be benchmarks popping up all over if that was the case.


----------



## Sheyster

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - when you said "math" I figured you were doing something more. You meant arithmetic.


LOL, no PDE's to be found here I'm afraid. (Applied Math major)


----------



## Jpmboy

Quote:


> Originally Posted by *Sheyster*
> 
> LOL, no PDE's to be found here I'm afraid. (Applied Math major)


thanks for the memories.


----------



## Sheyster

Quote:


> Originally Posted by *Jpmboy*
> 
> thanks for the memories.


LOL, that freaking class was a bish! I took it in 1986.







Definitely separated the men from the boys.


----------



## Gary2015

So are there any performance gains with water cooling ?


----------



## Gary2015

Quote:


> Originally Posted by *DADDYDC650*
> 
> I'm pretty sure that Titan XP's aren't saturating pci-e 16x/8x 3.0. There would be benchmarks popping up all over if that was the case.


Yeah so people are wrong saying x16/x16 is better.


----------



## EQBoss

Got my 2 Titans today; my EK water blocks came yesterday. I'm too impatient to wait for backplates, so I'm considering not installing them, period. Are they really necessary? Especially when my rig is horizontal, so the weight of the GPUs won't be an issue. I've never had backplates personally, but the Titan coming with one by default makes me curious. Also, anyone know when the EVGA HB SLI bridge releases? The default ones sadly aren't compatible, so I have to wait for EK or EVGA; for now I'll use 2 ribbons.


----------



## stefxyz

Quote:


> Originally Posted by *Gary2015*
> 
> So are there any performance gains with water cooling ?


Actually, for me I don't see benefits besides much lower noise, of course. Performance-wise, in real games like Witcher and Rise of the Tomb Raider, the limit is always power for me. So while I can crank the clock to slightly above 2100MHz and the RAM to close to +600, the truth is the clocks fluctuate way too much for me and dip as low as 1987 with the driver from yesterday. Before, the lowest was 2000 spot-on for me in Witcher, which so far is the most power-hungry game I've tried. Other games hold higher clocks stably, but I see no benefit in fluctuating clocks except a higher average FPS.

But except for having nicer numbers, who cares about the average in games? The lowest 5% is what defines the gaming experience, so my experience will not be any better with anything above the lowest, which in Witcher 3 is 1987MHz for me.

And to come back to the question: 1987 you can achieve on air too, but at the cost of high noise to avoid thermal throttling below that. Just for benchmarking you can squeeze a few points more here and there, but I don't see a huge impact; Time Spy goes up by a couple hundred points, that's it. Still, I would always go water cooling for the nice temps and the super low noise at below 759 RPM fan speed...


----------



## Gary2015

Quote:


> Originally Posted by *stefxyz*
> 
> Actually, for me I don't see benefits besides much lower noise, of course. Performance-wise, in real games like Witcher and Rise of the Tomb Raider, the limit is always power for me. So while I can crank the clock to slightly above 2100MHz and the RAM to close to +600, the truth is the clocks fluctuate way too much for me and dip as low as 1987 with the driver from yesterday. Before, the lowest was 2000 spot-on for me in Witcher, which so far is the most power-hungry game I've tried. Other games hold higher clocks stably, but I see no benefit in fluctuating clocks except a higher average FPS.
> 
> But except for having nicer numbers, who cares about the average in games? The lowest 5% is what defines the gaming experience, so my experience will not be any better with anything above the lowest, which in Witcher 3 is 1987MHz for me.
> 
> And to come back to the question: 1987 you can achieve on air too, but at the cost of high noise to avoid thermal throttling below that. Just for benchmarking you can squeeze a few points more here and there, but I don't see a huge impact; Time Spy goes up by a couple hundred points, that's it. Still, I would always go water cooling for the nice temps and the super low noise at below 759 RPM fan speed...


Thanks . Let's hope the voltage can be unlocked soon.


----------



## Fiercy

I don't know, a proper water loop is a lot of maintenance. Never before have I had to leave my PC open for 2 weeks while waiting for a block... If I had air, problem solved day one.

I like to mess with this stuff, but I'm also certain a lot of people won't.


----------



## Gary2015

Quote:


> Originally Posted by *Fiercy*
> 
> I don't know, a proper water loop is a lot of maintenance. Never before have I had to leave my PC open for 2 weeks while waiting for a block... If I had air, problem solved day one.
> 
> I like to mess with this stuff, but I'm also certain a lot of people won't.


Don't understand what you are saying.


----------



## Neb9

Quote:


> Originally Posted by *Fiercy*
> 
> I don't know, a proper water loop is a lot of maintenance. Never before have I had to leave my PC open for 2 weeks while waiting for a block... If I had air, problem solved day one.
> 
> I like to mess with this stuff, but I'm also certain a lot of people won't.


In my experience it takes a while to set up, mostly getting rid of air, but after that I have never had to touch it. This goes for my current loop, which I have had for over a year, and the setup I had before that, which I ran for around the same amount of time.


----------



## HyperMatrix

Quote:


> Originally Posted by *chronicfx*
> 
> @HyperMatrix when you run far cry 4 do you go into task manager and disable the "pegged" core and re-enable it by going to task
> Manager --> set affinity? For me far cry runs a single core at 90-95% until I do that. You may see better frames after.


Dude...***...ok so core 2 was at 99%. Disabling it made the core usage jump to 99% on core 3. So on and so forth for all the cores. So it got to 99% usage on core 7. And for shoots and giggles I decided to repeat the process yet again for the final core. Then it switched to Core 0 being the primary core, but only using about 60%. So no more cores pegged at 99%. And increased frame rate. Why on earth is this something we have to do?
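For reference, the same affinity bounce can be scripted instead of clicked through Task Manager. A minimal sketch using the Linux-only `os.sched_*` calls on the current process; on Windows the equivalent would be Task Manager's Set Affinity dialog or psutil's `cpu_affinity()`, and the `drop_core` parameter name is just illustrative:

```python
# Scriptable version of the Task Manager "set affinity" trick described
# above: briefly remove one core from the process's affinity mask, then
# restore the full mask. Linux-only (os.sched_* is not on Windows).
import os

def bounce_affinity(drop_core=0):
    """Drop one core from the current process's mask, then restore it."""
    original = os.sched_getaffinity(0)    # set of cores we may run on
    reduced = original - {drop_core}
    if reduced:                           # never set an empty mask
        os.sched_setaffinity(0, reduced)  # "disable" the pegged core
        os.sched_setaffinity(0, original) # re-enable it
    return os.sched_getaffinity(0) == original

print(bounce_affinity())   # True: the original mask is restored
```

To poke at another process (like a game) rather than your own, you would pass its PID instead of 0, which needs appropriate privileges.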


----------



## HyperMatrix

Quote:


> Originally Posted by *Neb9*
> 
> In my experience it takes a while to set up, mostly getting rid of air, though after that I have never had to touch it. This goes for for my current loop which I have had for over a year, and the setup which I had before that which I had for around the same amount of time.


Yeah I switched from water to Ice Dragon Nano Coolant. That along with quick-disconnect fittings, means this is all maintenance free for the most part, and doesn't even need to be drained when adding/removing hardware. I installed a brand new motherboard/cpu without having to drain anything.


----------



## Neb9

Quote:


> Originally Posted by *HyperMatrix*
> 
> Yeah I switched from water to Ice Dragon Nano Coolant. That along with quick-disconnect fittings, means this is all maintenance free for the most part, and doesn't even need to be drained when adding/removing hardware. I installed a brand new motherboard/cpu without having to drain anything.


I made the mistake of using no quick disconnects, with one tube going behind the motherboard (the cable management area, for lack of a better name), which connects to a rad set up in the top of my case above the res. There is still a tiny bit of air in there, and the only way I got the majority of it out was to literally turn my case upside-down. If I had a quick disconnect on that tube I could have moved that rad down to get the air out and saved myself quite a few hours.


----------



## Gary2015

Quote:


> Originally Posted by *HyperMatrix*
> 
> Yeah I switched from water to Ice Dragon Nano Coolant. That along with quick-disconnect fittings, means this is all maintenance free for the most part, and doesn't even need to be drained when adding/removing hardware. I installed a brand new motherboard/cpu without having to drain anything.


If it's the same as when I switched from the RIV BE to the RVE10 and an i7-6800K, it's easy. Running my 6800K at 4.5GHz at 28C with 23C ambient.


----------



## HyperMatrix

Quote:


> Originally Posted by *Gary2015*
> 
> If the CPU is the same like when I switched from RIV BE to RVE10 and i76800k , it's easy. Running my 6800k at 4.5ghz at 28c with 23c ambient.


With quick disconnects it doesn't even matter. I switched from a 3770k to a 5960x. Just attached quick disconnect fittings to the new block. Quick-disconnected the tubing from the previous block, and plugged them into the new block. And voila. Everything was done in about 30 seconds.


----------



## bee144

I'm confused by
Quote:


> Originally Posted by *Gary2015*
> 
> If the CPU is the same like when I switched from RIV BE to RVE10 and i76800k , it's easy. Running my 6800k at 4.5ghz at 28c with 23c ambient.


are you looking to sell your rampage black? PM me if you are.


----------



## unreality

Has anyone spoken with Aqua Computer about when the blocks are going out? In the forums someone said the first were going out Monday, but nothing has happened so far.


----------



## toncij

Quote:


> Originally Posted by *HyperMatrix*
> 
> Yeah I switched from water to Ice Dragon Nano Coolant. That along with quick-disconnect fittings, means this is all maintenance free for the most part, and doesn't even need to be drained when adding/removing hardware. I installed a brand new motherboard/cpu without having to drain anything.


Tell me more about those quick-disconnect fittings? Does EK have them, or Aqua? Or someone else?
I'm looking into a proper water-cooling setup, but I'm hesitating because of the pain of setting it up and of replacing GPUs...
Quote:


> Originally Posted by *Gary2015*
> 
> If the CPU is the same like when I switched from RIV BE to RVE10 and i76800k , it's easy. Running my 6800k at 4.5ghz at 28c with 23c ambient.


From RVE to RVE10? What did you gain?


----------



## HyperMatrix

Quote:


> Originally Posted by *toncij*
> 
> Tell me more about those quick-disconnect fittings? EK has them or Aqua? Or someone else?
> I'm looking into a proper water-cooling setup, but I'm hesitating because it's the pain of setting up and when replacing GPUs...
> From RVE to RVE10? What did you gain?


Koolance has the best ones I could find. Quality is amazing. And they look freaking great.


----------



## Gary2015

Quote:


> Originally Posted by *toncij*
> 
> Tell me more about those quick-disconnect fittings? EK has them or Aqua? Or someone else?
> I'm looking into a proper water-cooling setup, but I'm hesitating because it's the pain of setting up and when replacing GPUs...
> From RVE to RVE10? What did you gain?


Didn't have the RVE; I had the RIV Black Edition.


----------



## Silent Scone

Quote:


> Originally Posted by *HyperMatrix*
> 
> Koolance has the best ones I could find. Quality is amazing. And they look freaking great.


Black ones don't tend to last too long without contaminant build up. Nickel is always best.


----------



## HyperMatrix

Quote:


> Originally Posted by *Silent Scone*
> 
> Black ones don't tend to last too long without contaminant build up. Nickel is always best.


Not these ones. You'll notice the nickel plating on the revised model (as pictured above), as opposed to the old model which was all black.


----------



## Silent Scone

Quote:


> Originally Posted by *HyperMatrix*
> 
> Not these ones. You'll notice the nickel plating on the revised model (as pictured above), as opposed to the old model which was all black.


Ah the internals aren't painted. Nice, they obviously wised up, then


----------



## profundido

Quote:


> Originally Posted by *Silent Scone*
> 
> Black ones don't tend to last too long without contaminant build up. Nickel is always best.


I got the nickel ones in G4 and cable-to-cable versions in different places, and I'm madly happy with them. I use them to drain easily in 3 places, straight back into the bottles the coolant came from. It makes the draining process so easy that I don't even hesitate over a quick hardware/loop change anymore.


----------



## Silent Scone

Quote:


> Originally Posted by *profundido*
> 
> I got the nickel ones in G4 and cable-to-cable versions in different places and I'm madly happy with them. Use them to easy drain in 3 places straight back into the bottles they came from. Makes the draining process so easy I don't even hesitate or look up to a quick hardware/loop change anymore


I have 6 sets or more on my loop. Zero toots given to aesthetics, they make changing components so easy (I do often).


----------



## ottoore

Quote:


> Originally Posted by *EniGma1987*
> 
> Pure Gallium may not, but CLP and CLU do. Most likely due to the other materials mixed with the gallium in the liquid metal TIM. Did your heatsink not show any sign of the CLP fusing to the heatsink? I know for a fact CL Ultra does, and I thought the one time I used CL Pro it did as well, but perhaps the Pro is pure enough that it doesnt fuse.


Just read my previous post; I was talking about the difference between CLP/CLU and pure gallium. That's why you should use pure gallium.
CLP interacts with copper, as you and Jpmboy said; this is an axiom.


----------



## tpwilko08

Is anyone else still waiting on their EK water block to be shipped? Mine is still in processing status. I ordered it on 9/8/16 :-(


----------



## stefxyz

Did you order it with the backplate? If yes, that's the reason; if not, I would write to support. They respond super quick.


----------



## HyperMatrix

Quote:


> Originally Posted by *Silent Scone*
> 
> I have 6 sets or more on my loop. Zero toots given to aesthetics, they make changing components so easy (I do often).


I replaced every single connector and tube in my system with them. So even all connecting tubes have quick disconnect on both ends. Haha. Waste of money...perhaps...but so much convenience it's crazy.


----------



## tpwilko08

Quote:


> Originally Posted by *stefxyz*
> 
> Did you order it with the backplate? If yes thats the reason if not I would write to support. They respond super quick.


No, I was just going to install without the backplate for now and order it later. I will get in touch with EK and see what they have to say.


----------



## Phoenix81

Quote:


> Originally Posted by *tpwilko08*
> 
> Is anyone still waiting on there ek waterblock to be shipped mine is still in the processing status. I ordered it on 9/8/16 :-(


Mine is also in processing. Ordered on the same day as yours. Already wrote to support and they said they try to ship the order by the end of this week.


----------



## stefxyz

Btw, I checked the manual of my 1080 backplate again, and I am pretty sure I can install the Titan backplate without taking my Titan with the EK water block out of the PC.

If it is like the 1080, and with 99% probability it will be, you just have to unscrew the 6 outer screws; the important 7 inner screws are still in place and will easily hold the block. Then just mount the backplate, screw everything back in, and the job is done. It can all be done in the case if you have the space, so no need to take all the water out again, disrupt the loop and so on.

Correct me if I'm wrong, but it even looks pretty safe; the important components are secured by the inner screws anyway.


----------



## tpwilko08

Quote:


> Originally Posted by *Phoenix81*
> 
> Mine is also in processing. Ordered on the same day as yours. Already wrote to support and they said they try to ship the order by the end of this week.


That is good news for you; I'm just waiting on a reply from EK.
Quote:


> Originally Posted by *stefxyz*
> 
> Btw, I checked the manual of my 1080 backplate again, and I am pretty sure I can install the Titan backplate without taking my Titan with the EK water block out of the PC.
> 
> If it is like the 1080, and with 99% probability it will be, you just have to unscrew the 6 outer screws; the important 7 inner screws are still in place and will easily hold the block. Then just mount the backplate, screw everything back in, and the job is done. It can all be done in the case if you have the space, so no need to take all the water out again, disrupt the loop and so on.
> 
> Correct me if I'm wrong, but it even looks pretty safe; the important components are secured by the inner screws anyway.


Thanks for this info that would certainly save a lot of time and effort and not having to fill and bleed the loop twice.


----------



## dante`afk

Quote:


> Originally Posted by *Gary2015*
> 
> So are there any performance gains with water cooling ?


Nope, it's all power and Vcore limited. And since the BIOS can't be modded, it will stay like this.


----------



## profundido

Quote:


> Originally Posted by *tpwilko08*
> 
> Is anyone still waiting on there ek waterblock to be shipped mine is still in the processing status. I ordered it on 9/8/16 :-(


Same boat; the estimated delivery date here is 23 August for 2 Titan X water blocks.


----------



## dante`afk

Quote:


> Originally Posted by *KillerBee33*
> 
> Any difference btwn. 372 Win 10 and 372 Win 10 AU driver?
> @ dante`afk , Even on a clean install?


yep, even after clean install with a new 1607 ISO from technet
Quote:


> Originally Posted by *DADDYDC650*
> 
> What issues?


Quote:


> Originally Posted by *DADDYDC650*
> 
> What issues are people running into with the latest version of Windows 10? I haven't had any problems so pretty curious.


https://www.google.com/webhp?sourceid=chrome-instant&rlz=1C1PRFE_enUS705US705&ion=1&espv=2&ie=UTF-8#q=low+fps+windows+10+anniversary+update
http://steamcommunity.com/app/730/discussions/0/360671247405939095/
http://forum.blackdesertonline.com/index.php?/topic/106461-windows-10-anniversary-update-nvidia-50-fps-drop/

https://www.reddit.com/r/4vzv5h/psa_windows_10_anniversary_update_messing_with/

On top of that I had those FPS issues regardless of the fixes; I had to restart to make them go away and then they would reappear all of a sudden. Some apps like Lync also stopped working with the update. If, for example, Skype for Business 2016 was running in the background, all window dragging etc. would feel like 30 Hz, even though the screens here run at 165 or 120 Hz, and the FPS drop in games was then guaranteed. Overall it felt way more sluggish.


----------



## dante`afk

Quote:


> Originally Posted by *Gary2015*
> 
> Sli hasn't had any problems for me.


'microstutters' and don't tell anyone that there are none. there *are*.


----------



## tpwilko08

Quote:


> Originally Posted by *dante`afk*
> 
> 'microstutters' and don't tell anyone that there are none. there *are*.


I agree with this. Coming from 980 Ti SLI to Titan XP, the experience is so much smoother.


----------



## HyperMatrix

Quote:


> Originally Posted by *dante`afk*
> 
> 'microstutters' and don't tell anyone that there are none. there *are*.


It doesn't exist. http://www.guru3d.com/articles_pages/geforce_gtx_1080_2_way_sli_review,8.html

Show me any benching that uses 16x/16x config with Pascal and HB Bridge, that shows any real difference between single gpu and sli, with the exception of the odd game that may have sli issues in general.


----------



## tpwilko08

Quote:


> Originally Posted by *profundido*
> 
> same boat, estimated date of delivery here is 23 august for 2 titan X waterblocks


Where are you situated, roughly? I am from the UK.


----------



## DADDYDC650

Quote:


> Originally Posted by *dante`afk*
> 
> yep, even after clean install with a new 1607 ISO from technet
> 
> https://www.google.com/webhp?sourceid=chrome-instant&rlz=1C1PRFE_enUS705US705&ion=1&espv=2&ie=UTF-8#q=low+fps+windows+10+anniversary+update
> http://steamcommunity.com/app/730/discussions/0/360671247405939095/
> http://forum.blackdesertonline.com/index.php?/topic/106461-windows-10-anniversary-update-nvidia-50-fps-drop/
> 
> https://www.reddit.com/r/4vzv5h/psa_windows_10_anniversary_update_messing_with/
> 
> On top of that I had those FPS issues regardless of the fixes; I had to restart to make them go away and then they would reappear all of a sudden. Some apps like Lync also stopped working with the update. If, for example, Skype for Business 2016 was running in the background, all window dragging etc. would feel like 30 Hz, even though the screens here run at 165 or 120 Hz, and the FPS drop in games was then guaranteed. Overall it felt way more sluggish.


Weird. That sucks bro. Hope they release a fix for you soon. I'd be annoyed like no other if I had that problem.


----------



## Silent Scone

Quote:


> Originally Posted by *tpwilko08*
> 
> Where are situated roughly i am from UK.


I'm UK side, and ordered day after release. Looking at the block leaving them this week, but haven't been reassured anything. Getting bored waiting.


----------



## Gary2015

Quote:


> Originally Posted by *dante`afk*
> 
> Nope, it's all Power and Vcore limited. And since bios can't be modded it will stay like this.


Ouch! I don't mind a little fan noise, so I'll probably hold off on my block purchase.


----------



## Tideman

Alright, something's not right here...

Ran Timespy yesterday after updating to latest drivers.. got 12886 total (that's about the score I was getting on previous drivers). Always seemed kinda low to me for 2 titans.

Then I ran it twice today and got 2000 points more! Same clocks.. Doesn't seem normal to me


----------



## Gary2015

Quote:


> Originally Posted by *Tideman*
> 
> Alright, something's not right here...
> 
> Ran Timespy yesterday after updating to latest drivers.. got 12886 total (that's about the score I was getting on previous drivers). Always seemed kinda low to me for 2 titans.
> 
> Then I ran it twice today and got 2000 points more! Same clocks.. Doesn't seem normal to me


If it walks like a duck, talks like a duck...it is a...


----------



## toncij

Quote:


> Originally Posted by *HyperMatrix*
> 
> It doesn't exist. http://www.guru3d.com/articles_pages/geforce_gtx_1080_2_way_sli_review,8.html
> 
> Show me any benching that uses 16x/16x config with Pascal and HB Bridge, that shows any real difference between single gpu and sli, with the exception of the odd game that may have sli issues in general.


There are no microstutters, frametimes are consistent and stable with 1080. I don't see why a TitanX would be worse.

Quote:


> Originally Posted by *dante`afk*
> 
> Nope, it's all Power and Vcore limited. And since bios can't be modded it will stay like this.


At least you can reduce the insane noise and working temperatures. Also, BIOSes have been modded and verification bypassed before; we can't be sure it won't happen again.


----------



## toncij

Quote:


> Originally Posted by *Tideman*
> 
> Alright, something's not right here...
> 
> Ran Timespy yesterday after updating to latest drivers.. got 12886 total (that's about the score I was getting on previous drivers). Always seemed kinda low to me for 2 titans.
> 
> Then I ran it twice today and got 2000 points more! Same clocks.. Doesn't seem normal to me


Did your driver force VSync maybe?


----------



## dante`afk

Quote:


> Originally Posted by *toncij*
> 
> There are no microstutters, frametimes are consistent and stable with 1080. I don't see why a TitanX would be worse.


Quote:


> Originally Posted by *HyperMatrix*
> 
> It doesn't exist. http://www.guru3d.com/articles_pages/geforce_gtx_1080_2_way_sli_review,8.html
> 
> Show me any benching that uses 16x/16x config with Pascal and HB Bridge, that shows any real difference between single gpu and sli, with the exception of the odd game that may have sli issues in general.


Anyone who switched from SLI to a single card will tell you that the single-GPU experience is much smoother. I had been running SLI since the 580 series and switched just recently, because someone pointed that out. I never paid attention to those details, but after having them pointed out I could not look away, damn.

You see it best here; just look at the shoulder area and how the screen behaves - single GPU is much smoother than SLI. The whole screen is a lot smoother than the SLI one, even though it can't even hold 60 fps like SLI can. I see the microstuttering all around there.

And I have G-Sync; G-Sync does not remove that.
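Average FPS hides this; frame-time consistency is what "microstutter" actually measures. A rough Python sketch of the idea, using made-up frame times (the function and numbers are purely illustrative, not captured data):

```python
import statistics

def microstutter_index(frame_times_ms):
    """Return (avg_fps, 99th-percentile frame time, stutter ratio).

    The stutter ratio compares the 99th-percentile frame time to the
    median: values well above 1.0 mean inconsistent frame pacing even
    when the average FPS looks fine.
    """
    ordered = sorted(frame_times_ms)
    median = statistics.median(ordered)
    p99 = ordered[min(len(ordered) - 1, int(0.99 * len(ordered)))]
    avg_fps = 1000.0 / statistics.mean(ordered)
    return avg_fps, p99, p99 / median

# Smooth single-GPU run: almost every frame ~16.7 ms
smooth = [16.7] * 99 + [17.0]
# AFR-style run: same average frame time, but alternating short/long frames
stutter = [10.0, 23.4] * 50

print(microstutter_index(smooth))
print(microstutter_index(stutter))
```

Two runs with nearly identical average FPS can differ sharply in the stutter ratio, which is the whole argument here.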


----------



## dante`afk

Quote:


> Originally Posted by *toncij*
> 
> At least you can reduce insane noise and working temperatures. Also, BIOS could once be modded and verifications avoided. We can't be sure it won't.


Sure, the noise and temperatures are better. But how long has the 1080 been out now, and still no BIOS mods?


----------



## Tideman

Quote:


> Originally Posted by *Gary2015*
> 
> If it walks like a duck, talks like a duck...it is a...


Not following you. Maybe try typing a proper response.

Quote:


> Originally Posted by *toncij*
> 
> Did your driver force VSync maybe?


Not likely, I was getting over 60 in that 'bad' run.


----------



## HyperMatrix

Quote:


> Originally Posted by *dante`afk*
> 
> Anyone who switched from SLI to single card will tell you that single gpu experience is much smoother. I was running SLI since 580 series and switched just recently, because someone pointed that out. I never paid attention to those details, but after having them pointed out I could not look away, damn.
> 
> You see it the best here, just look at the shoulder area and how the screen behaves, single gpu is much smoother than SLI.
> 
> And I'm having Gsync, Gsync does not remove that.


Want me to make a video just like this with sli enabled so you can see there is no microstutter? I see the microstutter in this video. But I don't get that when gaming. The only microstutter I ever get is a result of content being streamed from the hard drive. For example it happens sometimes in black ops 3. But doesn't happen in the previous call of duty game because it preloaded all content into vram.

Besides. Even if we accepted what you're saying, dx12 multi gpu won't have that problem at all.


----------



## toncij

Quote:


> Originally Posted by *dante`afk*
> 
> sure the noise and temperatures are better. but how long is the 1080 already out and no bios mods?


Not sure why. As far as I know the problem is not editing BIOS, but verifying it to be able to flash it...
Quote:


> Originally Posted by *HyperMatrix*
> 
> Want me to make a video just like this with sli enabled so you can see there is no microstutter? I see the microstutter in this video. But I don't get that when gaming. The only microstutter I ever get is a result of content being streamed from the hard drive. For example it happens sometimes in black ops 3. But doesn't happen in the previous call of duty game because it preloaded all content into vram.
> 
> Besides. Even if we accepted what you're saying, dx12 multi gpu won't have that problem at all.


DX12 mGPU is probably DOA, since game consoles don't have multiple GPUs and the old SLI AFR worked as-is, with not much additional effort. DX12 EMA requires explicit programming to enable it.

Regarding micro-stutter, I don't have this problem even at 5K in Witcher 3, but then again, I'm using an HB bridge and have an SM961 SSD with 128 GB of RAM - hardly any reason to stutter.


----------



## DooRules

Mine is also waiting in processing line. Hope I am getting mine in next batch to ship out.


----------



## st0necold

i'll stick with SLI. I don't play games where you ride horses and pretend to be a character from a teenage novel.

Battlefield + SLI = Win.

Done.


----------



## HyperMatrix

Quote:


> Originally Posted by *toncij*
> 
> Not sure why. As far as I know the problem is not editing BIOS, but verifying it to be able to flash it...
> DX12 mGPU is probably DOA since game consoles don't have multiple GPUs and old SLI AFR worked as-is, with no much additional effort. DX12 EMA requires explicit programming to enable it.
> 
> Regarding micro-stutter, I don't have this problem not even on 5K in Witcher 3, but then again, I'm using a HB bridge and have SM961 SSD with 128GB of RAM - hardly and reason to stutter.


Dx12 multi gpu works on tomb raider and gears of war and ashes. Not bad considering how new it is. All frostbite games should have it as well. Outlook isn't all that bad.


----------



## toncij

This is 1080 SLI scaling I have on [email protected]



http://imgur.com/j52qr


Not really impressive tbh.

Yes it works on ROTTR and Ashes, but both scaling numbers are awful.
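For reference, "scaling" here is just the ratio of SLI to single-card frame rate; a tiny sketch (the numbers are hypothetical, not taken from the screenshot above):

```python
def sli_scaling(fps_single, fps_sli):
    """Scaling factor and the percentage gain from the second card.

    Perfect AFR scaling would be 2.0x (a 100% gain); real-world
    numbers are usually well below that.
    """
    factor = fps_sli / fps_single
    gain_pct = (factor - 1.0) * 100.0
    return factor, gain_pct

# Hypothetical numbers for illustration only:
factor, gain = sli_scaling(fps_single=45.0, fps_sli=70.0)
print(f"{factor:.2f}x scaling, second card adds {gain:.0f}%")
```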


----------



## Jpmboy

Quote:


> Originally Posted by *Silent Scone*
> 
> Ah the internals aren't painted. Nice, they obviously wised up, then


I noticed that too. I still like the shiny ones tho. It's like black paint and chrome "accents" - always looks good.








Quote:


> Originally Posted by *tpwilko08*
> 
> Is anyone still waiting on their EK waterblock to be shipped? Mine is still in the processing status. I ordered it on 9/8/16 :-(


Me.







Quote:


> Originally Posted by *dante`afk*
> 
> Nope, it's all Power and Vcore limited. And since bios can't be modded it will stay like this.


So... not quite accurate. Keeping the card below 40C results in steady clocks with much less power-limit throttling. Voltage/current, load and temperature all impact the power limit circuit. Both of my cards run steady higher clocks for any given load. Water cooling these cards helps a lot IF you have a decent cooling loop. If your water temperature is hitting the mid 30s on the cold side, it might not help all that much.
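The interplay described above can be caricatured in a few lines. This is a toy model, not NVIDIA's actual GPU Boost algorithm; the 13 MHz step and the linear power-vs-clock assumption are simplifications:

```python
def boost_clock_mhz(requested_mhz, power_at_requested_w, power_limit_w,
                    step_mhz=13.0):
    """Toy model of power-limit throttling: step the clock down in
    ~13 MHz bins until the estimated board power fits under the limit.

    Assumes power scales linearly with clock, which is a crude
    simplification; voltage, load and temperature feed the real
    power-limit circuit too. Illustrative only.
    """
    clock = requested_mhz
    while clock > step_mhz and \
            power_at_requested_w * clock / requested_mhz > power_limit_w:
        clock -= step_mhz
    return clock

# A hot card drawing 280 W against a 250 W limit has to shed bins:
print(boost_clock_mhz(2000.0, 280.0, 250.0))  # 1779.0
# The same chip under water, drawing 240 W, holds its requested boost:
print(boost_clock_mhz(2000.0, 240.0, 250.0))  # 2000.0
```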
Quote:


> Originally Posted by *Gary2015*
> 
> Ouch! I dont mind a little fan noise, so probably will hold off my block purchase.


Running these cards on air is like putting a blower on a motor without an intercooler.








Quote:


> Originally Posted by *st0necold*
> 
> i'll stick with SLI. I don't play games where you ride horses and pretend to be a character from a teenage novel.
> Battlefield + SLI = Win.
> Done.


----------



## Jpmboy

oops - derped.


----------



## xarot

Has anyone sent their cards back to NVIDIA if they were faulty and if you had used other than stock cooling? I'd be curious if they deny the warranty or not if you put everything back carefully.

You can buy other cards directly from NVIDIA too, maybe someone has tried?


----------



## DooRules

just got email, block on the way


----------



## Gary2015

Quote:


> Originally Posted by *dante`afk*
> 
> Anyone who switched from SLI to single card will tell you that single gpu experience is much smoother. I was running SLI since 580 series and switched just recently, because someone pointed that out. I never paid attention to those details, but after having them pointed out I could not look away, damn.
> 
> You see it the best here, just look at the shoulder area and how the screen behaves, single gpu is much smoother than SLI. The whole screen is a lot smoother than the SLI one, even though it can't even hold 60 fps like SLI can. I see the microstuttering all around there.
> 
> And I'm having Gsync, Gsync does not remove that.


I'm having none of these problems with my TXP SLI.


----------



## Gary2015

Quote:


> Originally Posted by *xarot*
> 
> Has anyone sent their cards back to NVIDIA if they were faulty and if you had used other than stock cooling? I'd be curious if they deny the warranty or not if you put everything back carefully.
> 
> You can buy other cards directly from NVIDIA too, maybe someone has tried?


Has anyone checked this? Will replacing the stock cooler void the warranty?


----------



## Gary2015

Quote:


> Originally Posted by *dante`afk*
> 
> sure the noise and temperatures are better. but how long is the 1080 already out and no bios mods?


With no BIOS mods out or on the horizon, I'm going to hold off on water for a while...


----------



## Gary2015

Quote:


> Originally Posted by *Jpmboy*
> 
> I noticed that too. I still like the shiny ones tho. It's like black paint and chrome "accents" - always looks good.
> 
> 
> 
> 
> 
> 
> 
> 
> Me.
> 
> 
> 
> 
> 
> 
> 
> 
> So... not quite accurate. Keeping the card below 40C results in steady clocks with much less power-limit throttling. Voltage/current, load and temperature all impact the power limit circuit. Both of my cards run steady higher clocks for any given load. Water cooling these cards helps a lot IF you have a decent cooling loop. If your water temperature is hitting the mid 30s on the cold side, it might not help all that much.
> Running these cards on air is like putting a blower on a motor without an intercooler.


I dunno man. Really not noticing the fans, especially after the latest drivers. ESO runs 95fps maxed out constantly. I was only getting 65fps on average, fluctuating a lot, with 2x 1080's. For me, TXP SLI is the holy grail.


----------



## MrTOOSHORT

Gary please use the edit button instead of multi posting.









If you have a loop already, I suggest a block for your card. It really stretches its legs even without a modded bios. So far so good with my EK block.


----------



## Dr Mad

Quote:


> Originally Posted by *Jpmboy*
> 
> So.. . not quite accurate. keeping the card below 40C results in steady clocks with much lees power ceiling throttle. Voltage/current, load and temperature all impact the power limit circuit. Both of my cards run steady higher clocks for any given load. Water cooling these cards helps a lot IF you have a decent cooling loop. If your water temperature is hitting the mid 30s on the cold side, it might not help all that much.


So you confirm that a card at <40°C eats less power with the same settings/overclock than at >60°C?


----------



## stefxyz

I'm not sure this really makes a significant difference in power consumption. I didn't check 1:1, but I don't see much power advantage now at 35 degrees Celsius under load, with 3.8 litres per minute and 2x 560 mm radiators in push/pull.


----------



## DADDYDC650

Would anyone else be willing to return their Titan XP's if Vega was strongly rumored to be out in October with BF1 for free?


----------



## Ninjawithagun

Quote:


> Originally Posted by *Gary2015*
> 
> Has anyone checked this? Will replacing the stock cooler void the warranty?


Here's the link to the warranty policy provided on Nvidia's website:

http://www.nvidia.com/object/manufacturer_warranty.html

I don't see any specifics regarding the warranty being voided by adding a 3rd-party cooling solution (AiO, water block, etc). However, there is specific wording, "misapplication of service by a party other than an authorized service representative", which suggests that anyone (including the owner) other than an authorized service representative performing any kind of service on the card itself will automatically void the 3-year factory warranty.

I will attempt to contact an Nvidia customer service representative to get a more detailed answer to the question - and it is a great question!


----------



## Ninjawithagun

Quote:


> Originally Posted by *DooRules*
> 
> just got email, block on the way


I decided to combine my Titan XP water block and the backplate into the same order to save on shipping costs. As a result, I won't get my stuff until after 29 August, which also happens to be the official availability date for the backplate. Oh well. Patience really is a virtue!


----------



## Ninjawithagun

Quote:


> Originally Posted by *Dr Mad*
> 
> So you confirm that a card at <40°C eats less power with same settings/overclock than at >60°?


Yes, this is correct. Conductor resistance rises with temperature, so at lower temperatures less power is lost as heat for the same current, and transistor leakage in the silicon drops as well. However, for this particular test case the differences would be fairly small (likely a few watts at most) given the modest delta between water-cooled and air-cooled operating temperatures.


----------



## Silent Scone

Quote:


> Originally Posted by *Dr Mad*
> 
> So you confirm that a card at <40°C eats less power with same settings/overclock than at >60°?


Yes, this isn't exclusive to these cards. This is how all semiconductors behave.
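For a back-of-the-envelope feel: static (leakage) power in silicon is often said to roughly double for every ~10 °C rise. A hedged sketch using assumed numbers, not measured Titan X figures:

```python
def leakage_power_w(p_leak_ref_w, temp_c, temp_ref_c=40.0, doubling_c=10.0):
    """Rule-of-thumb leakage estimate: static power roughly doubles
    for every ~10 C rise in die temperature. The reference power and
    doubling interval are assumed values for illustration only.
    """
    return p_leak_ref_w * 2.0 ** ((temp_c - temp_ref_c) / doubling_c)

print(leakage_power_w(20.0, 40.0))  # 20.0 (at the reference temperature)
print(leakage_power_w(20.0, 60.0))  # 80.0 (two doublings later)
```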


----------



## EniGma1987

Quote:


> Originally Posted by *DADDYDC650*
> 
> Would anyone else be willing to return their Titan XP's if Vega was strongly rumored to be out in October with BF1 for free?


No. Not unless Vega somehow magically provides 5-10fps more than a Titan XP, which obviously isn't going to happen. At best Vega will match a GTX 1080, and even that is being a bit hopeful IMO.


----------



## Jpmboy

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Gary please use the edit button instead of multi posting.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you have a loop already, I suggest a block for your card. It really stretches its legs even without a modded bios. So far so good with my EK block.


^^ This. Really changes the behavior of the card.








Quote:


> Originally Posted by *Dr Mad*
> 
> So you confirm that a card at <40°C eats less power with same settings/overclock than at >60°?


In addition to what others have responded with... my experience is that the card will actually "eat _more_ power" overall - especially at non-cryogenic temperatures - since it is not throttling as much and holds a higher frequency than it can above 40C, instead of dropping clocks to lower the power-temp-current component of the PL calc. I guess it's just a different perspective on the same effect that MrT and Scone pointed out.
Quote:


> Originally Posted by *DooRules*
> 
> just got email, block on the way


----------



## DADDYDC650

Quote:


> Originally Posted by *EniGma1987*
> 
> No. Not unless Vega somehow magically provides 5-10fps more than a Titan XP, which obviously isnt going to happen. At best Vega will match a GTX 1080, and even that is being a bit hopeful IMO.


Vega probably won't beat a Titan in DX11 games but it might in DX12.


----------



## Silent Scone

Quote:


> Originally Posted by *Jpmboy*
> 
> ^^ This
> In addition to what others have responded with... my experience is that the card will actually "eat _more_ power" overall - especially at non-cryogenic temperatures - since it is not throttling as much and holds a higher frequency than it can above 40C, instead of dropping clocks to lower the power-temp-current component of the PL calc. I guess it's just a different perspective on the same effect that MrT and Scone pointed out.


Nobody here is going to be able to touch the subject with any real substance. These smaller parts will have adopted new power-gating methods to reduce leakage, but as long as you understand the relationship, that's enough for the purpose of this thread.


----------



## toncij

Quote:


> Originally Posted by *DADDYDC650*
> 
> Would anyone else be willing to return their Titan XP's if Vega was strongly rumored to be out in October with BF1 for free?


Why would you do such a thing?


----------



## stefxyz

The guy with the Cristall Ball has been spoken ....


----------



## Jpmboy

Quote:


> Originally Posted by *Silent Scone*
> 
> Nobody here is going to be able to touch the subject with any real substance. These smaller parts will have adopted new power gating methods to reduce leakage, but as long you understand the relationship that's enough for the purpose of this thread


lol - I'm not sure I expressed that clearly.








anyway - once you put the water blocks on these cards they just run better - simply said?
I have one card that just begs for a bios mod.. the other is running in the high 2100s without throttling (with chilled water tho).


----------



## Ninjawithagun

Quote:


> Originally Posted by *Silent Scone*
> 
> Nobody here is going to be able to touch the subject with any real substance. These smaller parts will have adopted new power gating methods to reduce leakage, but as long you understand the relationship that's enough for the purpose of this thread


^^


----------



## Silent Scone

That's not how you spell crystal









Quote:


> Originally Posted by *Jpmboy*
> 
> lol - I'm not sure I expressed that clearly.
> 
> 
> 
> 
> 
> 
> 
> 
> anyway - once you put the water blocks on these cards they just run better - simply said?
> I have one card that just begs for a bios mod.. the other is running in the high 2100s without throttling (with chilled water tho).


Yeah, I got what you were saying! Wasn't really in direct reply to your comment, more to the ones you were replying to. The cards are not hitting their thermal target, so the clocks are more constant.


----------



## Ninjawithagun

...most likely a spellcheck operator error and/or language barrier...


----------



## EniGma1987

Quote:


> Originally Posted by *DADDYDC650*
> 
> Vega probably won't beat a Titan in DX11 games but it might in DX12.


Fine with me since 95% of the games I play are DX11 and that will still hold true for the 2 years I have this titan in my gaming rig.


----------



## DADDYDC650

Quote:


> Originally Posted by *EniGma1987*
> 
> Fine with me since 95% of the games I play are DX11 and that will still hold true for the 2 years I have this titan in my gaming rig.


2 years? Be a miracle if I own my Titan for 9 months.


----------



## dante`afk

Quote:


> Originally Posted by *st0necold*
> 
> i'll stick with SLI. I don't play games where you ride horses and pretend to be a character from a teenage novel.
> 
> Battlefield + SLI = Win.
> 
> Done.


enjoy your microstuttering. Denying that you have any is a lie.

so you play teenage shooting games and think of being Rambo with a cowboy hat and shotgun? cool guy.
Quote:


> Originally Posted by *Gary2015*
> 
> Im having none of these problems with my TXP SLI


oh you have. I was posting the same statements as you when I still had SLI and denied seeing it.


----------



## Jpmboy

Quote:


> Originally Posted by *dante`afk*
> 
> enjoy your microstuttering. denying you don't have any is a lie.
> 
> so you play teenage shooting games and think of being rambo with a cowboy hat and shotgun? cool guy.


some folks are very sensitive to micro stutter and others do not "sense" it at all. I'm one of the latter.


----------



## Fiercy

I feel like some SLI users are living in denial. I've actually already discussed this and pointed out that you only have zero problems with SLI if you play 1-10 big AAA games, and mostly run benchmarks, which I see a lot of people here do.

Where SLI is completely useless is reliable performance across all games. I mean, what help will SLI give me in No Man's Sky, Deus Ex: Mankind Divided or the new World of Warcraft: Legion? Three games coming out this month, and none has SLI support.

So from my point of view, spending another $1200 is useless if you can't really use it. If you think running benchmarks is not fun, you play more than 10 games, and you don't think bragging rights matter, don't get SLI.


----------



## EniGma1987

Quote:


> Originally Posted by *DADDYDC650*
> 
> 2 years? Be a miracle if I own my Titan for 9 months.


Yep. Had 670's and moved to 980s (about 2 years), and replaced my 980s with a new titan which was just barely under 2 years. So that seems about when I finally give in to wanting more performance. And now looking back at all my past GPUs purchased I seem to have continued that pattern for the past 10 years, which is as far back as I can remember what GPUs I have owned. I guess I am pretty consistent


----------



## Gary2015

Quote:


> Originally Posted by *toncij*
> 
> Why would you do such a thing?


I would never go back to AMD. Drivers are horrible.


----------



## Silent Scone

Quote:


> Originally Posted by *st0necold*
> 
> i'll stick with SLI. I don't play games where you ride horses and pretend to be a character from a teenage novel.
> 
> Battlefield + SLI = Win.
> 
> Done.


That description of Witcher 3 is almost as weak as your CPU cooling


----------



## Ninjawithagun

Quote:


> Originally Posted by *Fiercy*
> 
> I feel like some SLI users are living in denial. I've actually already discussed this and pointed out that you only have zero problems with SLI if you play 1-10 big AAA games, and mostly run benchmarks, which I see a lot of people here do.
> 
> Where SLI is completely useless is reliable performance across all games. I mean, what help will SLI give me in No Man's Sky, Deus Ex: Mankind Divided or the new World of Warcraft: Legion? Three games coming out this month, and none has SLI support.
> 
> So from my point of view, spending another $1200 is useless if you can't really use it. If you think running benchmarks is not fun, you play more than 10 games, and you don't think bragging rights matter, don't get SLI.


I have been an SLI owner/user for over 10 years and can tell everyone now that SLI has improved significantly. I can also tell you that SLI still has issues to this day, including microstutter (whether you want to see it or not, Nvidia has formally acknowledged its existence) and performance scaling inconsistencies (depending on the game's code and/or graphics driver issues).

Needless to say, based upon 10+ years of gaming in SLI, it is truly a great experience to finally have a single card that I can use to game at 4K/UHD and UQHD (3440 x 1440) resolutions without having to use a second card.

I can only imagine how much the Volta Titan card will leap over the Pascal Titan...


----------



## Silent Scone

I've been in a similar boat; I'd rather not have to use SLI. I have been using it since its inception with the PCIe variant of the 6800 Ultra, up until the Maxwell Titan X.


----------



## dante`afk

Quote:


> Originally Posted by *Fiercy*
> 
> I feel like some SLI users are living in denial. I've actually already discussed this and pointed out that you only have zero problems with SLI if you play 1-10 big AAA games, and mostly run benchmarks, which I see a lot of people here do.
> 
> Where SLI is completely useless is reliable performance across all games. I mean, what help will SLI give me in No Man's Sky, Deus Ex: Mankind Divided or the new World of Warcraft: Legion? Three games coming out this month, and none has SLI support.
> 
> So from my point of view, spending another $1200 is useless if you can't really use it. If you think running benchmarks is not fun, you play more than 10 games, and you don't think bragging rights matter, don't get SLI.


this should be a sticky post.


----------



## DNMock

Quote:


> Originally Posted by *Fiercy*
> 
> I feel like some SLI users are living in denial. I've actually already discussed this and pointed out that you only have zero problems with SLI if you play 1-10 big AAA games, and mostly run benchmarks, which I see a lot of people here do.
> 
> Where SLI is completely useless is reliable performance across all games. I mean, what help will SLI give me in No Man's Sky, Deus Ex: Mankind Divided or the new World of Warcraft: Legion? Three games coming out this month, and none has SLI support.
> 
> So from my point of view, spending another $1200 is useless if you can't really use it. If you think running benchmarks is not fun, you play more than 10 games, and you don't think bragging rights matter, don't get SLI.


That's an argument to have in the 1070 or 1080 owners thread. At the T-XP level it's akin to telling someone not to buy a ski boat unless they like being on the water...

This is the high end of the enthusiast food chain here and quite frankly if you can't afford to buy 2 Titans, you probably shouldn't be buying the first one to begin with.

edit: I am saying you figuratively, not literally you. Sweeping generalizations are one thing, but I'm not in the habit of telling individuals how or what they should spend their money on.


----------



## Ninjawithagun

Quote:


> Originally Posted by *Silent Scone*
> 
> I've been in a similar boat, would rather not have to use SLI. Have been using it since it's conception with the PCIE variant of the 6800 Ultra up until Maxwell Titan X.


I will honestly admit that I almost purchased two GTX 1080 cards instead of the Titan X. Part of 'the save' for me was the fact that all GTX 1080s were sold out for so long, and then the Titan X was released and was in stock. I read as many early reviews as possible to see how well the new Titan X compared to GTX 1080s in SLI, as well as to the two 'old' GTX 980 Ti cards I had in my main gaming rig.

Needless to say, it ended up being a great decision. The Titan X performs on par with my two GTX 980 Ti cards in SLI while they were overclocked - and that comparison is with the Titan X at stock speeds, so an incredible upgrade IMHO. I have played several games with all settings maxed @ 4K/UHD and it's just amazing how smooth the gameplay is compared to the SLI setup. Also, my minimum frame rates actually increased on average over the SLI configuration. Microstutter for me is noticeable when using the GTX 980 Ti SLI configuration; I'm not quite as sensitive to it as others, but still, it's there.

The Titan X offers something truly unique along with that nasty price tag - a true single-card gaming experience. I do admit that the GTX 1080 gets oh so close in some games, especially when heavily overclocked. But for me, the extra shaders (1024+ over the GTX 1080), the extra 4GB of VRAM, and the 384-bit bus were enough to convince me to take the dive into the Titan universe. If I were asked to do it all over again, I wouldn't change a thing. The Titan X Pascal is the answer to your single-card gaming wishes.


----------



## cisco0623

What's the consensus on modding the bios at this point? Is it possible?


----------



## DNMock

Quote:


> Originally Posted by *cisco0623*
> 
> What's the consensus on modding the bios at this point? Is it possible?


Possible? yes

Probable? possibly, but not for a while.


----------



## cisco0623

Quote:


> Originally Posted by *DNMock*
> 
> Possible? yes
> 
> Probable? possibly, but not for a while.


Hah- figures. A custom bios will be sweet on these cards.


----------



## Ninjawithagun

There is definitely potential for a modded BIOS in the future. Most hardware reviews are finding a solid 2100MHz / 1.25V cap on the GPU overclock, even on water. I find this a bit odd, as GTX1080s have been pushed past 2100MHz on air and 2800MHz on LN2.


----------



## cisco0623

Quote:


> Originally Posted by *Ninjawithagun*
> 
> There is definitely potential for a modded BIOS in the future. Most hardware reviews are finding a solid 2100Mhz & 1.25V cap on the GPU overclock, even on water. I find this a bit odd as the GTX1080s have been able to be pushed up past 2800Mhz on LN2.


I've read that as well and believe the bios is set to be very conservative. I guess when you're selling a $1200 card with a 3 year warranty you want to play it smart as well lol

If a custom bios gets us another 500MHz on water that would be beastly.


----------



## stefxyz

No, it's not odd. Nearly all 1080s also go unstable at around 2100; the Pascal chips all seem to behave the same...


----------



## Silent Scone

Quote:


> Originally Posted by *Ninjawithagun*
> 
> There is definitely potential for a modded BIOS in the future. Most hardware reviews are finding a solid 2100Mhz & 1.25V cap on the GPU overclock, even on water. I find this a bit odd as the GTX1080s have been able to be pushed up past 2100Mhz on air and 2800Mhz on LN2.


1.25v is a limit placed at resistor level to stop users from damaging the cards. This has been in force since the 780Ti on all Nvidia and partner boards manufactured by Flextronics


----------



## xarot

I too have been using SLI since the 8800 series. In some rare exceptions, like Witcher 3, SLI can shine. But there have been tens if not hundreds of games I had actually already finished before SLI support for them arrived. On the other hand, I have no trouble playing a graphics-intensive game like Witcher 3 and, immediately after that, finishing an indie adventure with 320x240 graphics. All that matters is whether the game is good. I say going SLI with the Maxwell Titan Xs for Witcher 3 was worth it. Don't know if the same is true anymore; I have to check my unfinished games list.


----------



## Ninjawithagun

Quote:


> Originally Posted by *Silent Scone*
> 
> 1.25v is a limit placed at resistor level to stop users from damaging the cards. This has been in force since the 780Ti on all Nvidia and partner boards manufactured by Flextronics


Thanks Silent Scone - good info


----------



## combat fighter

Quote:


> Originally Posted by *Silent Scone*
> 
> 1.25v is a limit placed at resistor level to stop users from damaging the cards. This has been in force since the 780Ti on all Nvidia and partner boards manufactured by Flextronics


We don't even have voltage control in AB yet though?

So I guess all our TXP's are running lower volts than 1.25v at the moment.


----------



## DNMock

Quote:


> Originally Posted by *combat fighter*
> 
> We don't even have voltage control in AB yet though?
> 
> So I guess all our TXP's are running lower volts than 1.25v at the moment.


Pulling this from memory, but I think 1.06 or 1.08 is around where we are capping atm.

Those 2800MHz 1080's are Frankensteined up with a second PCB soldered on to directly control the chip's power, allowing a ton more voltage to be dumped into it with LN2 cooling.


----------



## mouacyk

Quote:


> Originally Posted by *Silent Scone*
> 
> 1.25v is a limit placed at resistor level to stop users from damaging the cards. This has been in force since the 780Ti on all Nvidia and partner boards manufactured by Flextronics


980 TI BIOS edit opens up 1.275v? Oh -- Flextronics.


----------



## EniGma1987

Quote:


> Originally Posted by *cisco0623*
> 
> I've read that as well and believe the bios is set to be very conservative. I guess when you're selling a $1200 card with a 3 year warranty you want to play it smart as well lol
> 
> If a custom bios gets gets us another 500mhz on water that would be beastly.


LOL. You are hoping and dreaming way too much there my friend. Best anyone should hope for is a couple hundred MHz on the Titan.


----------



## KillerBee33

Quote:


> Originally Posted by *mouacyk*
> 
> 980 TI BIOS edit opens up 1.275v? Oh -- Flextronics.


Made my co-worker's 980 G1 run @ 1.312V, however GPU-Z only reports up to 1.275


----------



## toncij

It's not about the BIOS. Pascal chips can't go to 2.5GHz unless sub-zero.


----------



## unreality

Aquacomputer blocks won't dispatch till next week.

Also, the first batch will be only a few going out at all... I'm almost thinking about ordering from EK now; I'm pretty disappointed by AQ


----------



## EniGma1987

Quote:


> Originally Posted by *unreality*
> 
> Aquacomputer blocks won't dispatch till next week.
> 
> Also, the first batch will be only a few going out at all... I'm almost thinking about ordering from EK now; I'm pretty disappointed by AQ


The AQ blocks should be better. Direct VRAM contact instead of a cheap thermal pad, and an active backplate? Yes please.

I'm still waiting on the Heatkiller block myself.


----------



## Dr Mad

Quote:


> Originally Posted by *unreality*
> 
> Aquacomputer blocks won't dispatch till next week.
> 
> Also, the first batch will be only a few going out at all... I'm almost thinking about ordering from EK now; I'm pretty disappointed by AQ


Yes, but I think that if you order from EK, it will ship even later than Aquacomputer.

I placed an order with EK almost a week ago and was told by Igor that my waterblock is still in production, so nothing at home for at least 10 days :/

That's why I'm thinking about cancelling the order and giving my money to AC, but still, I'm not fond of basic copper waterblocks, and the nickel / black glass one won't be available for 3 weeks.

Overall, my preference would go to Heatkiller, but I haven't heard anything about a Titan X waterblock from them yet.


----------



## mbze430

Sheeeeit, I have been SLI'n since before Nvidia bought 3dfx, back with Voodoo SLI (guess that hints how old I am), and I have gone SLI since that day. Microstutter is there, and it's there even with my TXP setup, but you seriously need to stare and look for it.

As for the AquaComputer blocks: they are only offering the full copper block, hence I haven't ordered mine. Plus, the 1080/1070 blocks come in different colors, so I am going to wait for those options for the TXP. Without a bios mod I am not in a hurry. You guys can always do the pencil vmod


----------



## EniGma1987

Quote:


> Originally Posted by *Dr Mad*
> 
> Overall, my preference would go to Heatkiller but I didn't heard anything for Titan X waterblock yet.


Nothing was posted, but if you contact them, the company will tell you it is in development right now. Their rep on the forums said late August/early September, so the same timeframe as AC's alternative blocks.


----------



## Woundingchaney

Quote:


> Originally Posted by *Fiercy*
> 
> I feel like some SLI users are living in denial. I've actually already discussed this and pointed out that you only have zero problems with SLI if you play 1-10 big AAA games, and you mostly play benches, which I see a lot of people here do.
> 
> Where SLI is completely useless is in reliable performance across all games. I mean, what help will SLI give me in No Man's Sky, Deus Ex: Mankind Divided, or the new World of Warcraft: Legion? Three games coming out this month, and none has SLI support.
> 
> So from my point of view, spending another $1200 is useless if you can't really use it. If you think playing benches is not fun, you play more than 10 games, and you don't think bragging rights matter, don't get SLI.


I'm not sure where you're getting your information, but a driver has just been released with SLI support for Deus Ex MD and No Man's Sky. I do believe that Legion is lacking official support, but there has been some success with AFR workarounds. Given the history of WoW, I don't think there will ever be official support, not just for Legion but for WoW in general.

I think perhaps you are in denial in regards to SLI, or simply uninformed. The vast majority of AAA releases support SLI and have positive scaling. I can think of maybe two or three titles that didn't support SLI in the last year. Indie titles generally don't support SLI, but it's also important to note that those titles generally don't need the technology.

My TXP is literally the first time in years that I haven't run an mGPU configuration.


----------



## Jpmboy

Quote:


> Originally Posted by *Silent Scone*
> 
> 1.25v is a limit placed at resistor level to stop users from damaging the cards. This has been in force since the 780Ti on all Nvidia and partner boards manufactured by Flextronics


with the exception of Classified and Kingpin cards in both the Kepler and Maxwell series... (maybe not Flextronics?)


----------



## lilchronic

Quote:


> Originally Posted by *Jpmboy*
> 
> with the exception of classified and kingpin cards in both kepler and maxwell series.. (maybe not Flextronics?)


MSI Lightning and ASUS Strix / Matrix

.... didn't the original titan have a mod for more voltage? That's Kepler though


----------



## Silent Scone

Quote:


> Originally Posted by *mouacyk*
> 
> 980 TI BIOS edit opens up 1.275v? Oh -- Flextronics.


If you had the proper equipment you could see what the voltage actually does at this level, but you don't so I don't expect you'll answer this.
Quote:


> Originally Posted by *lilchronic*
> 
> MSI Lightning and ASUS Strix / Matrix
> 
> .... didn't the original titan have a mod for more voltage?


Yes, the OG TITAN was before NVIDIA introduced their greenlight program


----------



## lilchronic

Quote:


> Originally Posted by *Silent Scone*
> 
> If you had the proper equipment you could see what the voltage actually does at this level, but you don't so I don't expect you'll answer this.
> Yes, the OG TITAN was before NVIDIA introduced their greenlight program


Yeah just remembered that it's kepler arch.


----------



## Jpmboy

Quote:


> Originally Posted by *lilchronic*
> 
> MSI Lightning and ASUS Strix / Matrix
> 
> .... didn't the original titan have a mod for more voltage? That's Kepler though


yeah, OG Titans were above 1.3V (measured) if I remember correctly. Strix and Lightning for sure; the 980 Strix really liked voltage for some reason. Nothing done so far would indicate Pascal scales with voltage at ambient temps.
Quote:


> Originally Posted by *Silent Scone*
> 
> If you had the proper equipment you could see what the voltage actually does at this level, but you don't so I don't expect you'll answer this.
> Yes, the OG TITAN was before NVIDIA introduced their greenlight program


yeah - measured off the Titan X, no matter what voltage table was put into the bios, it maxed out at ~1.265V at load (vdroop).


----------



## Lobotomite430

Does anyone know if the items marked with the white lines are flush with each other? Just wondering if I could attach a Corsair Hydro AIO directly to the GPU stock cooler without having to buy a bracket and just reuse the stock cooler but have the gpu on water?


----------



## HyperMatrix

Curious if all these people with microstutter are using x8/x8 config with poor ram/ssd, using a quad core cpu, without hb bridge.


----------



## criminal

Quote:


> Originally Posted by *Lobotomite430*
> 
> 
> 
> Does anyone know if the items marked with the white lines are flush with each other? Just wondering if I could attach a Corsair Hydro AIO directly to the GPU stock cooler without having to buy a bracket and just reuse the stock cooler but have the gpu on water?


You will probably need a shim on the gpu or just get a 980Ti Hybrid kit.


----------



## cisco0623

Quote:


> Originally Posted by *EniGma1987*
> 
> LOL. You are hoping and dreaming way too much there my friend. Best anyone should hope for is a couple hundred MHz on the Titan.


Lol, now that I re-read it, I didn't mean 2500, more like 2100 like a 1080, but on water it'll be ice cold


----------



## Z0eff

Quote:


> Originally Posted by *HyperMatrix*
> 
> Curious if all these people with microstutter are using x8/x8 config with poor ram/ssd, using a quad core cpu, without hb bridge.


I remember seeing a slide from nvidia themselves explaining what causes the microstutter. I don't remember exactly what it was nor can I find it again but it boiled down to the 2 cards not syncing their frame drawing. So on average you'd have maybe 60 fps, but in reality you might end up with 2 frames that are only a millisecond apart then a gap of 16 milliseconds before another 2 frames are delivered a millisecond apart.

I wish I had that slide bookmarked... -_-
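A back-of-the-envelope sketch of that effect, with made-up frame times (the specific numbers are illustrative, not from the NVIDIA slide):

```python
# Hypothetical AFR frame times (ms): two cards delivering frames in
# uneven pairs can hide behind a healthy average fps. The single GPU
# paces one frame every ~16.7 ms; the SLI case alternates a 1 ms gap
# with a 32.4 ms gap, yet both windows contain the same frame count.
single = [16.7] * 60
sli = [1.0, 32.4] * 30

def avg_fps(frame_times_ms):
    # frames delivered divided by total wall time in seconds
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

print(round(avg_fps(single)), round(avg_fps(sli)))  # 60 60 -- same average
print(max(single), max(sli))                        # 16.7 vs 32.4 ms worst gap
```

Same "60 fps" on the counter, very different worst-case gap between frames.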


----------



## paxw




----------



## HyperMatrix

Quote:


> Originally Posted by *Z0eff*
> 
> I remember seeing a slide from nvidia themselves explaining what causes the microstutter. I don't remember exactly what it was nor can I find it again but it boiled down to the 2 cards not syncing their frame drawing. So on average you'd have maybe 60 fps, but in reality you might end up with 2 frames that are only a millisecond apart then a gap of 16 milliseconds before another 2 frames are delivered a millisecond apart.
> 
> I wish I had that slide bookmarked... -_-


But this 16ms is based on 60Hz gameplay. I'm playing at 165Hz, which doesn't leave much room to notice any potential microstutter.


----------



## EniGma1987

Quote:


> Originally Posted by *HyperMatrix*
> 
> Curious if all these people with microstutter are using x8/x8 config with poor ram/ssd, using a quad core cpu, without hb bridge.


I had microstutter on my last setup, and I have a pretty fast computer. It was x8/x8, but that shouldn't matter, as the cards don't come close to saturating such a config yet. The problem is the SLI setup itself, not the other parts of the computer. You generally don't notice you have it till you pull a card out and notice how much smoother it seems. I guess you have to be sensitive to stutter and framerates, which I very much am. So you either notice it or you don't. The weird part is that reviews with FCAT don't seem to show these inconsistencies, yet they are so clearly there. Maybe Nvidia is just playing dirty again, given that they make FCAT...


----------



## mouacyk

Quote:


> Originally Posted by *EniGma1987*
> 
> I had microstutter on my last setup and I have a pretty fast computer. It was 8x8 but that shouldn't matter as the cards dont come close to saturating such a config yet. The problem is just the setup itself, not the other parts of the computer. You generally dont notice you have it till you pull a card out and just notice how much smoother it seems. I guess maybe you just have to be sensitive to stutter or framerates though, which I am very much. So ya either notice it or you dont. The weird part is that reviews with FCAT dont seem to show these inconsistencies, yet they are so clearly there. Maybe Nvidia is just playing dirty again given that they make FCAT...


oO... the towel is thrown down!


----------



## HyperMatrix

Quote:


> Originally Posted by *EniGma1987*
> 
> I had microstutter on my last setup and I have a pretty fast computer. It was 8x8 but that shouldn't matter as the cards dont come close to saturating such a config yet. The problem is just the setup itself, not the other parts of the computer. You generally dont notice you have it till you pull a card out and just notice how much smoother it seems. I guess maybe you just have to be sensitive to stutter or framerates though, which I am very much. So ya either notice it or you dont. The weird part is that reviews with FCAT dont seem to show these inconsistencies, yet they are so clearly there. Maybe Nvidia is just playing dirty again given that they make FCAT...


Considering I play with SLI disabled for some of my FPS games that don't support it, I'm using both single-card and SLI setups on a daily basis. Yet I can't notice any microstutter playing BF4 in SLI vs. playing PlanetSide 2 with a single GPU. I can see the microstutter even in the recorded video someone posted a few pages back, but I don't see it when gaming on my computer. Because, again, I think the higher the refresh rate and subsequent fps, the lower the chance of visible microstutter. Some games also smooth out with more cores/threads. For example, in Black Ops 3 with HT disabled on a 5960X, I'd get some microstutter during panning/scanning. I attribute that to data streaming from the SSD, but nonetheless enabling HyperThreading eliminated the greater majority of even that microstutter. So I think a lot of perceived microstutter is dependent on your gaming rig and your monitor. And if you're solidly pushing high enough fps, then microstutter becomes a non-issue.


----------



## HyperMatrix

Quote:


> Originally Posted by *mouacyk*
> 
> oO... the towel is thrown down!


I can't quite tell if you're referencing a mic drop or throwing in the towel. Both of which mean completely different things. Haha.


----------



## mbze430

Quote:


> Originally Posted by *HyperMatrix*
> 
> Curious if all these people with microstutter are using x8/x8 config with poor ram/ssd, using a quad core cpu, without hb bridge.


I'm on an i7-6900K, OC'd, on a ROG Strix X99 (x16/x16), my boot drive is a Samsung 950 NVMe 512GB, and my games are stored on 3x Samsung 850 512GB in RAID0. My RAM is 3000MHz, and I have the HB bridge. Like I said before, I have to really sit there and look for it, but it does happen. But I don't worry about it


----------



## Z0eff

As was mentioned before in this thread, how problematic it is depends on your sensitivity to varying frame times. Annoyingly, I'm quite sensitive to frame rate issues.


----------



## dante`afk

Log into a game with SLI and turn the camera very slowly around, walk around and pay attention to every corner, how movement behaves, how smooth or not smooth the panning, turning etc everything is.

Then do the same with single GPU.

Let's say both setups would run with 200 fps, Gsync etc. The single GPU solution will feel smoother.

Then saying both are the same is denial


----------



## combat fighter

Quote:


> Originally Posted by *DNMock*
> 
> Pulling this from memory, but I think 1.06 or 1.08 is around where we are capping atm.
> 
> Those 2800mhz 1080's are Frankensteined up with a second PCB soldered to directly control the chips power, allowing a ton more voltage to be dumped into it with LN2 cooling.


Just checked with GPU-Z and you're right; it currently tops out at 1.06V, so there's plenty more voltage to come even without modding the bios.

Best is yet to come!


----------



## EniGma1987

Quote:


> Originally Posted by *dante`afk*
> 
> Log into a game with SLI and turn the camera very slowly around, walk around and pay attention to every corner, how movement behaves, how smooth or not smooth the panning, turning etc everything is.
> 
> Then do the same with single GPU.
> 
> Let's say both setups would run with 200 fps, Gsync etc. The single GPU solution will feel smoother.
> 
> Then saying both are the same is denial


eh, IDK if you would notice a serious difference at 200 fps. The frame times are a max of 5 milliseconds apart then, which is pretty short. I think that is about at the limit of where humans can perceive timing differences.
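For reference, the frame-time arithmetic behind that claim is just the 1000/fps conversion (a trivial Python sketch):

```python
# Frame time in milliseconds at a steady frame rate: 1000 / fps.
# At 200 fps, successive frames are at most ~5 ms apart, so any
# pacing error between GPUs is bounded by that window.
def frame_time_ms(fps):
    return 1000.0 / fps

print(frame_time_ms(200))            # 5.0
print(round(frame_time_ms(60), 1))   # 16.7
```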


----------



## CallsignVega

I'm pretty sensitive to input lag and stutters being a long time competitive FPS player. I've hardly ever had an issue with SLI micro-stutter. But then again, I usually have extremely quick/optimized systems.

With my 4.5 GHz 6950X system with two Titan-XP's at 2050+ core with their own each 16x PCI-E 3.0 slots and HB SLI bridge and 3440x1440 100 Hz G-Sync monitor, all games have been butter smooth in SLI. Crysis 3 is running just blazing maxed out on this system. Some levels upwards of 85% CPU usage on a 10-core!


----------



## dante`afk

In the end its all in the eye of the beholder. If I'd play a game on your machine CallsignVega, I'd see MS too.


----------



## Lobotomite430

Quote:


> Originally Posted by *criminal*
> 
> You will probably need a shim on the gpu or just get a 980Ti Hybrid kit.


Any ideas on where to buy one? Amazon.com shows 1-2 months to ship


----------



## besthijacker

So did anyone OC their titan yet? Just installed the EK block on it.


----------



## HyperMatrix

Quote:


> Originally Posted by *mbze430*
> 
> I'm on a i7-6900k OC on a ROG Strix x99 (16/16), my boot drive is a Samsung 950 NVMe 512GB, and my games are stored on 3x Samsung 850 512GB in a RAID0. RAM I have are 3000mhz and HB bridge. Like I said before, I have to really sit there and look for it, but it does happen. But i don't worry about it


And you didn't mention what monitor refresh rate/fps you're running, and whether you have GSYNC.


----------



## CallsignVega

Quote:


> Originally Posted by *dante`afk*
> 
> In the end its all in the eye of the beholder. If I'd play a game on your machine CallsignVega, I'd see MS too.


Saying you would see micro-stuttering on my system doesn't mean anything. Subjective "I have awesome powers that no one else possesses" statements are pretty silly. Micro-stuttering would need to be measured objectively.


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> Saying you would see micro-stuttering on my system doesn't mean anything. Subjective "I have awesome powers that no one else possesses" statements are pretty silly. Micro-stuttering would need to be measured objectively.


He could be positive that he sees microstutter just like some people go to the doctor for a full checkup every month convinced that they have a disease they're going to die from even when the doctor tells them they're fine. So I'm not going to judge.


----------



## mbze430

Quote:


> Originally Posted by *HyperMatrix*
> 
> And you didn't mention what monitor refresh rate/fps you're running, and whether you have GSYNC.


3x ASUS PB287Q


----------



## HyperMatrix

Quote:


> Originally Posted by *mbze430*
> 
> 3x ASUS PB287Q


60Hz displays. Again, going back to my statement about microstutter being non-existent at high refresh/fps. The fact that you can play a game at 60Hz is impressive.


----------



## criminal

Quote:


> Originally Posted by *Lobotomite430*
> 
> Any ideas on where to buy one? Amazon.com shows 1-2 months to ship


This should work: http://www.evga.com/Products/Product.aspx?pn=400-HY-5188-B1


----------



## CRITTY

With two cards I can play EVERY game in my library in 4K maxed out. That is what is important to me; not debating about micro stuttering. I have the option to use one or two cards and you don't. That's a fact jack!


----------



## Jpmboy

Quote:


> Originally Posted by *CallsignVega*
> 
> I'm pretty sensitive to input lag and stutters being a long time competitive FPS player. I've hardly ever had an issue with SLI micro-stutter. But then again, I usually have extremely quick/optimized systems.
> 
> *With my 4.5 GHz 6950X system with two Titan-XP's* at 2050+ core with their own each 16x PCI-E 3.0 slots and HB SLI bridge and 3440x1440 100 Hz G-Sync monitor, all games have been butter smooth in SLI. Crysis 3 is running just blazing maxed out on this system. Some levels upwards of 85% CPU usage on a 10-core!


what's the verdict? Better gaming on the 6700K or the HEDT rig?









Quote:


> Originally Posted by *CRITTY*
> 
> With two cards I can play EVERY game in my library in 4K maxed out. That is what is important to me; not debating about micro stuttering. I have the option to use one or two cards and you don't. *That's a fact jack*!



Quote:


> Originally Posted by *besthijacker*
> 
> So did anyone OC their titan yet? Just installed the EK block on it.


----------



## Menthol

Quote:


> Originally Posted by *CallsignVega*
> 
> Saying you would see micro-stuttering on my system doesn't mean anything. Subjective "I have awesome powers that no one else possesses" statements are pretty silly. Micro-stuttering would need to be measured objectively.


My brain microstutters so sli helps everything look smooth


----------



## Jpmboy

Quote:


> Originally Posted by *Menthol*
> 
> My brain microstutters so sli helps everything look smooth


That off-phase stutter! Comes in handy.


----------



## dante`afk

Quote:


> Originally Posted by *CallsignVega*
> 
> Saying you would see micro-stuttering on my system doesn't mean anything. Subjective "I have awesome powers that no one else possesses" statements are pretty silly. Micro-stuttering would need to be measured objectively.


Alright let me rephrase it, there are microstutters with SLI.

Fact.

If you see them or not, that is another story. Saying you don't have any is plain denial.


----------



## dVeLoPe

Can anyone help me decide? I have an EVGA ACX3 1080 on the way.

Should I sell it and buy one of these, or buy another 1080? Can't choose!


----------



## Jpmboy

Quote:


> Originally Posted by *dVeLoPe*
> 
> can anyone help me decide i have an evga acx3 1080 on the way
> 
> should i sell it and buy one of these or buy another 1080 cant choose!


do you see mimimicrostustustutter?


----------



## CallsignVega

Quote:


> Originally Posted by *Jpmboy*
> 
> what's the verdict? Better gaming on the 6700K or the HEDT rig?


Well, the 6700K system puts up the highest overall FPS in most games, but in CPU-demanding games the 6950X puts up better minimum FPS. Since BF1 will be DX12, I'll probably stick with the 6950X.
Quote:


> Originally Posted by *dante`afk*
> 
> Alright let me rephrase it, there are microstutters with SLI.
> 
> Fact.
> 
> If you see them or not, that is another story. Saying you don't have any is plain denial.


Ya, and a single GPU's frame-time and pacing aren't perfect either and can spike. Fact. See how that works? The point is that on a properly setup system the frame-time variance in SLI is at the insignificant/imperceptible level. It's like arguing you can feel the difference in input lag between a 2ms lag monitor and a 4ms input lag monitor.







Not to mention the faster your FPS and refresh rate, the less and less any small frame time variance matters.

Oh, and putting a large gap between two sentences doesn't add any credence to your point.


----------



## Stateless

Quote:


> Originally Posted by *CRITTY*
> 
> With two cards I can play EVERY game in my library in 4K maxed out. That is what is important to me; not debating about micro stuttering. I have the option to use one or two cards and you don't. That's a fact jack!


I received my second Titan XP today. After testing it on its own, I threw in its brother and tried some games out. FANTASTIC! Everything I have tested runs at 4K/60fps without a dip at 100% maxed-out settings. Witcher 3, with HairWorks at the highest settings and everything else maxed, is rock solid 60fps. Whoever said The Division does not scale in SLI was full of it: at max settings, my benchmark averaged 46.1 fps on a single card, and in SLI I am hitting 87.3 fps. So far it has been great. Both cards are at +200 core and +200 on memory, boosting to about 1946 or so. Temps hit a high of 82C on one card and 78C on the other after gaming for a while in The Division. In Witcher 3, temps don't go up as high; I think the highest I hit was 70C. I know the cards would hit a higher core if the temps were more under control. I have the fans at a 1:1 match with the temp until they hit 60, then at 80% when temps hit 70C.

The other thing is that both cards are synced 100% in core speed. On my Maxwell Titans, one card would usually run slightly below the other, but so far the XPs have been perfect at staying at the same speed, even though the usage is slightly different. I will be receiving my first EK block tomorrow, but will wait until the second block and backplates arrive before going water on them. But I'm glad I did get a second card. My plan was to try to stick with one card, but some of my games one card alone could not handle unless I turned things down, and what is the fun in turning things down when gaming at 4K? lol. In games like Doom where only one card is used, I am locked dead at 60fps at max settings; so far, games that don't use SLI have been fine. I have not tried a lot of games that don't support SLI, but when I have more time I will.
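The fan curve described above can be sketched as a simple piecewise function (the hold between 60C and 70C is my assumption; the post doesn't specify that range):

```python
# Sketch of the fan curve described above: fan % tracks GPU temperature
# 1:1 up to 60C, then a fixed 80% once temps reach 70C. The behavior
# between 60C and 70C is assumed (held at 60%), not stated in the post.
def fan_percent(temp_c):
    if temp_c <= 60:
        return temp_c      # 1:1 match with temperature
    elif temp_c < 70:
        return 60          # assumed hold until the 70C step
    else:
        return 80          # fixed 80% at 70C and above

for t in (50, 65, 75):
    print(t, fan_percent(t))  # 50 50 / 65 60 / 75 80
```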


----------



## Gary2015

I get around 50-60% scaling on my 2nd card in ESO. Maxed out settings on X34 at solid 95fps.


----------



## mbze430

Quote:


> Originally Posted by *HyperMatrix*
> 
> 60Hz displays. Again going back to my statement about microstutter being non existent at high refresh/fps. The fact that you can play a game at 60Hz is impressive.


Actually, I am not a gamer; if I were, I'd probably get a 144Hz+ monitor. I play at most 15 mins and turn it off.

The inherent fact is that microstutter exists because not every frame lands exactly at the end of a refresh. When the frame rate is peaking and dipping, eventually there will be a frame that lands mid-refresh, no matter how high the refresh rate is.


----------



## HyperMatrix

Quote:


> Originally Posted by *mbze430*
> 
> actually I am not a gamer, if I was I probably get a 144hz+ monitor. I play at most 15mins and turn it off.
> 
> The inherent fact is that microstutter exist because not every frame is exactly at the end of a refresh. When frame rate is peaking and dipping eventually there will be a frame that is within the refresh cycle of the monitor no matter how high refresh rate is.


Yes, at 60Hz that variance can be as high as 16.66ms, with a median of 8.33ms. But even at just 144Hz, the frame variance can only be as high as 6.94ms, with a median of 3.47ms. And that's not even taking G-Sync into account.
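Those bounds are just the refresh interval and half of it; a quick sketch:

```python
# Worst-case frame-delivery variance at a fixed refresh rate: a frame
# can land up to one full refresh interval (1000/Hz ms) away from the
# ideal scanout, and half of one on average.
def refresh_interval_ms(hz):
    return 1000.0 / hz

for hz in (60, 144):
    worst = refresh_interval_ms(hz)
    print(f"{hz} Hz: worst {worst:.2f} ms, median {worst / 2:.2f} ms")
# 60 Hz: worst 16.67 ms, median 8.33 ms
# 144 Hz: worst 6.94 ms, median 3.47 ms
```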


----------



## atreides

Hi everyone,

I own a Titan XP, and I received a new case from my friends as a birthday gift. Right now I am using my Cosmos II for my rig. I've found that the Titan X throws out lots of hot air while I am playing games, which is no problem; better out than in. My friends got me an In Win 805i, a mid-tower case. It has two 120mm intakes on the bottom and one 120mm exhaust on the back, and I am worried about what would happen if I use this case. So I wanted to get some advice and ask whether it would be a bad idea to switch to this case with the Titan XP installed. My cousin owns this case and is using a 1080 FTW edition; he says he is getting 60-70C playing Witcher 3 on ultra at 3440x1440 at 100Hz. I humbly ask for advice, guys. Thank you!


----------



## atreides

double post srry


----------



## Gary2015

Quote:


> Originally Posted by *atreides*
> 
> Hi everyone,
> 
> I own a titan xp and I received a new case from my friends, birthday gift. Right now I am using my Cosmos II for my rig. I found that the Titan X is throwing out lots of hot air while I am playing my games which is no problem better out then in. My friends got me an in win 805i this is a med tower case. It has two 120mm intakes on the bottom and 1 120mm exhaust on the back and I am worried about what would happen if I do use this case? So I wanted to get some advice and ask if it would be a bad idea if I switch to this case while having the Titan XP installed?? My cousin owns this case and he is using a 1080 ftw edition and it says he is getting 60-70c playing the witcher 3 on ultra at 3440x1440p resolution going 100hz. I humbly ask for advice guys thank you!


Should be no problem since you have proper airflow in the new case.


----------



## dVeLoPe

SLI 1080s or Pascal Titan XP?

I can do either, but the XP requires me selling off the 1080.

Discuss. I'm literally stuck trying to decide, and I will be using 120Hz @ 1080p.

Whenever the X34P comes out I will go that route, so what should I do?


----------



## atreides

Quote:


> Originally Posted by *Gary2015*
> 
> Should be no problem since you have proper airflow in the new case.


The only intakes are the two bottom fans and the only exhaust is the one 120mm on the back of the case. Does your Titan XP warm up your case?

here is an example of the case


----------



## Gary2015

Quote:


> Originally Posted by *atreides*
> 
> 
> The only intakes are the two bottom fans and the only exhaust is the one 120mm on the back of the case. Does your Titan XP warm up your case?
> 
> here is an example of the case


What is your ambient temp?


----------



## atreides

Quote:


> Originally Posted by *Gary2015*
> 
> What is your ambient temp?


My ambient temperature in my loft room is usually 76-78°F (~25°C). It is summer right now, but I usually have my AC on when I'm using my PC.


----------



## Gary2015

Quote:


> Originally Posted by *atreides*
> 
> My ambient temperature in my loft room is usually 76-78°F (~25°C). It is summer right now, but I usually have my AC on when I'm using my PC.


That should be enough cooling for you. What RPM are your fans running at?


----------



## atreides

Quote:


> Originally Posted by *Gary2015*
> 
> That should be enough cooling for you. What RPM are your fans running at?


Currently I'm running all my fans at max in the Cosmos II case. I don't mind the sound at all, so if I use the 805i I'll be maxing out the fan speeds. When I play games I raise the Titan XP fan to 85%, just to keep things cool. Do you think it would be okay using that 805i?


----------



## Gary2015

Quote:


> Originally Posted by *atreides*
> 
> Currently I'm running all my fans at max in the Cosmos II case. I don't mind the sound at all, so if I use the 805i I'll be maxing out the fan speeds. When I play games I raise the Titan XP fan to 85%, just to keep things cool. Do you think it would be okay using that 805i?


Shouldn't be a problem


----------



## atreides

Quote:


> Originally Posted by *Gary2015*
> 
> Shouldn't be a problem


okay great thank you!


----------



## DNMock

So I had to go full-on MacGyver to get things going tonight...

I put one of the screws in a little too deep and cracked the acrylic on the block.

Rather than waiting two weeks, I noticed that the original Titan X acetal block was the exact same size with the same hole pattern.

I took the cracked acrylic off the T-X and, with the power of carving and fire to clear enough space for the capacitors, I now have a fully functioning Titan X-P with the faceplate of a Titan X.

Not sure if that makes me a genius or insane.

edit: I definitely see what y'all are talking about, slamming head-first into the power limit wall. Max GPU temp 28°C, max clock 2050 regardless of how I clock it.

Tried cranking the memory up to +550. TDP limit 122%, crash... still below 30°C.


----------



## markklok

Quote:


> Originally Posted by *DNMock*
> 
> So I had to go full-on MacGyver to get things going tonight...
> 
> I put one of the screws in a little too deep and cracked the acrylic on the block.
> 
> Rather than waiting two weeks, I noticed that the original Titan X acetal block was the exact same size with the same hole pattern.
> 
> I took the cracked acrylic off the T-X and, with the power of carving and fire to clear enough space for the capacitors, I now have a fully functioning Titan X-P with the faceplate of a Titan X.
> 
> Not sure if that makes me a genius or insane.
> 
> edit: I definitely see what y'all are talking about, slamming head-first into the power limit wall. Max GPU temp 28°C, max clock 2050 regardless of how I clock it.
> 
> Tried cranking the memory up to +550. TDP limit 122%, crash... still below 30°C.


Would love to see some pictures... everybody is talking about putting on the blocks, but very little graphical porn.


----------



## Silent Scone

Still waiting on EK... Order's been stuck at processing since 4th August.


----------



## markklok

First I wanted to order an EK uni block, then people said I had to strip the entire PCB... so I cancelled it.

Now my mind is going... c'mon, you know you want to...

So how much cooling would be enough besides the block itself?

Would an 80mm fan (front and back) be enough?


----------



## MrKenzie

Hi guys, I've been following this thread from the beginning and you've talked me into getting a Titan XP! But as I'm in Australia I'm having trouble sourcing one. Have you noticed a time of day when they seem to become available? In two weeks I have only seen them listed as in stock on the Nvidia USA website once, and that was at around midnight here. I don't want to buy off eBay as they're asking US$1,600 or more!

I have an aquarium cooler setup similar to a couple of you guys, I think; it should keep a Titan XP at around 30°C hopefully!

Cheers


----------



## toncij

I'm in the heart of Europe and can't get one officially; I have to physically send people around to pick one up.


----------



## cookiesowns

Quote:


> Originally Posted by *markklok*
> 
> First I wanted to order an EK uni block, then people said I had to strip the entire PCB... so I cancelled it.
> 
> Now my mind is going... c'mon, you know you want to...
> 
> So how much cooling would be enough besides the block itself?
> 
> Would an 80mm fan (front and back) be enough?


My cards sit vertical. I have a Panaflo 120mm running around 1200-1600 RPM on top of them, plus passive airflow from my top rads. It seems fine. Using a Fluke IR, I've seen them go up to around 75°C max with minimal airflow; with the fans at 1200-1600 RPM it's around 65°C. The RAM chips run at most 50°C.


----------



## Gary2015

Quote:


> Originally Posted by *MrKenzie*
> 
> Hi guys, I've been following this thread from the beginning and you've talked me into getting a Titan XP! But as I'm in Australia I'm having trouble sourcing one. Have you noticed a time of day when they seem to become available? In two weeks I have only seen them listed as in stock on the Nvidia USA website once, and that was at around midnight here. I don't want to buy off eBay as they're asking US$1,600 or more!
> 
> I have an aquarium cooler setup similar to a couple of you guys, I think; it should keep a Titan XP at around 30°C hopefully!
> 
> Cheers


They are out of stock everywhere. You may find you need to pay the premium or wait a few months.


----------



## markklok

Quote:


> Originally Posted by *cookiesowns*
> 
> My cards sit vertical. I have a Panaflo 120mm running around 1200-1600 RPM on top of them, plus passive airflow from my top rads. It seems fine. Using a Fluke IR, I've seen them go up to around 75°C max with minimal airflow; with the fans at 1200-1600 RPM it's around 65°C. The RAM chips run at most 50°C.


OK, sounds good... I'm also in the vertical (upside down) league:
https://postimg.org/image/n4m1rsm27/

Also, I would love to test the CLU shunt mod... (very thin layer)


----------



## MrKenzie

Quote:


> Originally Posted by *Gary2015*
> 
> They are out of stock everywhere. You may find you need to pay the premium or wait a few months.


I'm happy to wait; I just assumed you guys were buying them during the day (when I'm sleeping). I have a 1080, but I'm not going to bother water cooling it for a few % better performance.


----------



## Phoenix81

Quote:


> Originally Posted by *MrKenzie*
> 
> Hi guys, I've been following this thread from the beginning and you've talked me into getting a Titan XP! But as I'm in Australia I'm having trouble sourcing one. Have you noticed a time of day when they seem to become available? In two weeks I have only seen them listed as in stock on the Nvidia USA website once, and that was at around midnight here. I don't want to buy off eBay as they're asking US$1,600 or more!
> 
> I have an aquarium cooler setup similar to a couple of you guys, I think; it should keep a Titan XP at around 30°C hopefully!
> 
> Cheers


This site helped a lot. You can set an email notification for when the Titan X is in stock (they check the Nvidia page every 5 minutes). But you have to hurry anyway; the stock only lasts about 10 minutes.

https://www.nowinstock.net/computers/videocards/nvidia/gtxtitanp/
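For anyone who'd rather not rely on a third-party site, the same idea (poll the product page every few minutes, alert when the sold-out marker disappears) is only a few lines of Python. This is just a sketch: the marker text "Out of Stock", the URL you'd watch, and the 5-minute interval are all assumptions you'd have to adapt to the actual page.

```python
import time
import urllib.request

def page_in_stock(page_text: str, sold_out_marker: str = "Out of Stock") -> bool:
    """Treat the page as in stock when the sold-out marker is absent."""
    return sold_out_marker not in page_text

def watch(url: str, interval_s: int = 300) -> None:
    """Poll the page every 5 minutes (like nowinstock) until it shows stock."""
    while True:
        with urllib.request.urlopen(url, timeout=30) as resp:
            text = resp.read().decode("utf-8", errors="replace")
        if page_in_stock(text):
            print("In stock - go, you may only have about 10 minutes!")
            return
        time.sleep(interval_s)
```

Swap the print for an email or push notification if you won't be staring at a terminal.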


----------



## MrKenzie

Quote:


> Originally Posted by *Phoenix81*
> 
> This site helped a lot. You can set an email notification for when the Titan X is in stock (they check the Nvidia page every 5 minutes). But you have to hurry anyway; the stock only lasts about 10 minutes.
> 
> https://www.nowinstock.net/computers/videocards/nvidia/gtxtitanp/


Thank you! I will try that; it can only be better than checking a few times a day when I'm at a computer!


----------



## dante`afk

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, and a single GPU's frame-time and pacing aren't perfect either and can spike. Fact. See how that works? The point is that on a properly setup system the frame-time variance in SLI is at the insignificant/imperceptible level. It's like arguing you can feel the difference in input lag between a 2ms lag monitor and a 4ms input lag monitor.
> 
> 
> 
> 
> 
> 
> 
> Not to mention the faster your FPS and refresh rate, the less and less any small frame time variance matters.


Yet better than SLI.
Quote:


> Originally Posted by *CallsignVega*
> 
> Oh, and putting a large gap between two sentences doesn't add any credence to your point.


Obviously enough credence for noticing it.


----------



## pez

Quote:


> Originally Posted by *CallsignVega*
> 
> Well, the 6700K system puts up overall highest FPS in most games but in demanding CPU games the 6950X puts up better minimum FPS. Since BF1 will be DX12, I'll probably stick with the 6950X.
> Ya, and a single GPU's frame-time and pacing aren't perfect either and can spike. Fact. See how that works? The point is that on a properly setup system the frame-time variance in SLI is at the insignificant/imperceptible level. It's like arguing you can feel the difference in input lag between a 2ms lag monitor and a 4ms input lag monitor.
> 
> 
> 
> 
> 
> 
> 
> Not to mention the faster your FPS and refresh rate, the less and less any small frame time variance matters.
> 
> Oh, and putting a large gap between two sentences doesn't add any credence to your point.


He consistently makes arguments and states 'fact' in several threads without ever posting any actual info... so unfortunately arguing with someone like that (especially in the GPU subforum on OCN) is rather futile.


----------



## Silent Scone

Which is unfortunate, as otherwise there are grounds for an argument there. Depending on the resolution and frame rate you are targeting, SLI is still needed - I don't think there is any arguing with that.

That said, I don't think anyone can argue the point that if it were possible to hit their frame target consistently without SLI, they would rather not have SLI. NVIDIA dropped three- and four-way card support with good reason. Here are just a few that come to mind without really racking my brain:

1) It's costly and time-intensive to maintain support for.

2) It requires a constant communicative relationship between the developer and vendor to keep performance at an acceptable level.

3) Widely adopted engines such as UE are a poor fit for AFR (they require the previous frame to render the next).

4) DX12 is moving away from external pacing methods and supports explicit GPU scaling natively, so winding down support makes perfect sense.

When these features work to the standard one would expect, there is no issue. Even for those who are susceptible to latency, it's possible to have SLI and have just as good an experience as without, but with the benefit of higher framerates. The real question is how often people genuinely find that to be the case - and in my experience, the answer is foggier than one would like.


----------



## Dr Mad

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *DNMock*
> 
> So I had to go full-on MacGyver to get things going tonight...
> 
> I put one of the screws in a little too deep and cracked the acrylic on the block.
> 
> Rather than waiting two weeks, I noticed that the original Titan X acetal block was the exact same size with the same hole pattern.
> 
> I took the cracked acrylic off the T-X and, with the power of carving and fire to clear enough space for the capacitors, I now have a fully functioning Titan X-P with the faceplate of a Titan X.
> 
> Not sure if that makes me a genius or insane.
> 
> edit: I definitely see what y'all are talking about, slamming head-first into the power limit wall. Max GPU temp 28°C, max clock 2050 regardless of how I clock it.
> 
> Tried cranking the memory up to +550. TDP limit 122%, crash... still below 30°C.






Just in case: do you think the original Titan X waterblock could fit the Titan X Pascal?
It seems there are some differences in the VRM area, but for the GPU & memory it could make good contact.


----------



## cisco0623

Quote:


> Originally Posted by *MrKenzie*
> 
> Thank you! I will try that; it can only be better than checking a few times a day when I'm at a computer!


Honestly they have stock every weekday, often both morning and afternoon. Nowinstock.net shows the history to support what I'm saying. I don't think they restock on weekends, but you should be able to snag one Monday morning (around 10:30-11 Eastern time) pretty easily.


----------



## Gary2015

Quote:


> Originally Posted by *Dr Mad*
> 
> 
> Just in case: do you think the original Titan X waterblock could fit the Titan X Pascal?
> It seems there are some differences in the VRM area, but for the GPU & memory it could make good contact.


You can try... let us know please.


----------



## MrKenzie

Quote:


> Originally Posted by *cisco0623*
> 
> Honestly they have stock every weekday, often both morning and afternoon. Nowinstock.net shows the history to support what I'm saying. I don't think they restock on weekends, but you should be able to snag one Monday morning (around 10:30-11 Eastern time) pretty easily.


Hopefully I can get one fairly soon - I have an EK water block due to be delivered next week and I won't even have the card yet!


----------



## Gary2015

Quote:


> Originally Posted by *MrKenzie*
> 
> Hopefully I can get one fairly soon - I have an EK water block due to be delivered next week and I won't even have the card yet!


So technically you aren't an owner yet and shouldn't be posting here?


----------



## Woundingchaney

Quote:


> Originally Posted by *Gary2015*
> 
> So technically you aren't an owner yet and shouldn't be posting here?


That was more than a bit abrasive..........

He obviously intends to purchase the item and this thread is his best place to go on the site for information regarding his intended purchase.


----------



## cisco0623

Quote:


> Originally Posted by *pez*
> 
> He consistently makes arguments and states 'fact' in several threads without ever posting any actual info... so unfortunately arguing with someone like that (especially in the GPU subforum on OCN) is rather futile.


My two cents on this: I had a 9800 GX2 - bad microstutter. I then went to GTX 295 SLI - slight microstutter, but it didn't bother me at all. I then had tri-SLI Titans until I got the Titan X just now, and I hardly had any issues with them - I can't even say I had microstutter. They were great cards and I was reluctant to upgrade even after three years. (I kept losing one card because Nvidia's new drivers only cater to two-way SLI and kept omitting the third!) Basically no noticeable microstutter like I originally experienced on the old card. Also, while my Titan X is obviously more powerful on its own compared to my Titan tri-SLI, I don't really "see" or "feel" that it's any smoother, other than higher frame rates.


----------



## cisco0623

Quote:


> Originally Posted by *MrKenzie*
> 
> Hopefully I can get one fairly soon - I have an EK water block due to be delivered next week and I won't even have the card yet!


Hah! You did it right. I'm waiting on all my EK parts (backplates) before I get started. You will definitely get one Monday morning. Not sure if you work or not; just have the webpage ready on your phone and keep refreshing it lol. I literally ordered mine while sitting in a meeting. Heheh


----------



## MrKenzie

Quote:


> Originally Posted by *Woundingchaney*
> 
> That was more than a bit abrasive..........
> 
> He obviously intends to purchase the item and this thread is his best place to go on the site for information regarding his intended purchase.


It's OK, it takes more than that to wind up an Aussie. But yes, technically I'm not an owner - though neither are plenty of other people who have posted.


----------



## Gary2015

Quote:


> Originally Posted by *Woundingchaney*
> 
> That was more than a bit abrasive..........
> 
> He obviously intends to purchase the item and this thread is his best place to go on the site for information regarding his intended purchase.


LOL, it was a joke. Why so serious?


----------



## Jpmboy

Quote:


> Originally Posted by *dVeLoPe*
> 
> SLI 1080s or Pascal Titan XP?
> 
> I can do either, but the XP requires me selling off the 1080.
> 
> Discuss. I'm literally stuck trying to decide, and I will be using 120Hz @ 1080p.
> 
> Whenever the X34P comes out I will go that route, so what should I do?


If you are staying at 1080p/120Hz, you only need a single 1080 that can OC. But... "want" is a different problem.








Quote:


> Originally Posted by *markklok*
> 
> First I wanted to order an EK uni block, then people said I had to strip the entire PCB... so I cancelled it.
> Now my mind is going... c'mon, you know you want to...
> So how much cooling would be enough besides the block itself?
> *Would an 80mm fan (front and back)* be enough?


yes
Quote:


> Originally Posted by *cookiesowns*
> 
> My cards sit vertical. I have a Panaflo 120mm running around 1200-1600 RPM on top of them, plus passive airflow from my top rads. It seems fine. Using a Fluke IR, I've seen them go up to around 75°C max with minimal airflow; with the fans at 1200-1600 RPM it's around 65°C. The RAM chips run at most 50°C.


Yeah - I get roughly the same temps with a Fluke IR. I'm hoping the EK block actually provides cooling to the entire R22 choke assembly.


----------



## Woundingchaney

Quote:


> Originally Posted by *Gary2015*
> 
> LOL, it was a joke. Why so serious?


I apologize, but surely you realize that text is not the ideal form of communication to relay sarcasm.


----------



## Jpmboy

Our @strong island 1 hit a *milestone*! Dude has run many owner threads here and is a benchmark editor for OCN.


----------



## pez

Quote:


> Originally Posted by *Silent Scone*
> 
> Which is unfortunate, as otherwise there are grounds for an argument there. Depending on the resolution and frame rate you are targeting, SLI is still needed - I don't think there is any arguing with that.
> 
> That said, I don't think anyone can argue the point that if it were possible to hit their frame target consistently without SLI, they would rather not have SLI. NVIDIA dropped three- and four-way card support with good reason. Here are just a few that come to mind without really racking my brain:
> 
> 1) It's costly and time-intensive to maintain support for.
> 
> 2) It requires a constant communicative relationship between the developer and vendor to keep performance at an acceptable level.
> 
> 3) Widely adopted engines such as UE are a poor fit for AFR (they require the previous frame to render the next).
> 
> 4) DX12 is moving away from external pacing methods and supports explicit GPU scaling natively, so winding down support makes perfect sense.
> 
> When these features work to the standard one would expect, there is no issue. Even for those who are susceptible to latency, it's possible to have SLI and have just as good an experience as without, but with the benefit of higher framerates. The real question is how often people genuinely find that to be the case - and in my experience, the answer is foggier than one would like.


Very well said. I always anticipate an argument and am always glad to see myself proven wrong, but after personal experience, and knowing others personally with similar experiences, I'm going to be hard-pressed to take the word of a 'random' user on OCN.
Quote:


> Originally Posted by *cisco0623*
> 
> My two cents on this: I had a 9800 GX2 - bad microstutter. I then went to GTX 295 SLI - slight microstutter, but it didn't bother me at all. I then had tri-SLI Titans until I got the Titan X just now, and I hardly had any issues with them - I can't even say I had microstutter. They were great cards and I was reluctant to upgrade even after three years. (I kept losing one card because Nvidia's new drivers only cater to two-way SLI and kept omitting the third!) Basically no noticeable microstutter like I originally experienced on the old card. Also, while my Titan X is obviously more powerful on its own compared to my Titan tri-SLI, I don't really "see" or "feel" that it's any smoother, other than higher frame rates.


Yeah, the issue is blown out of proportion these days. I'm not claiming SLI never had microstutter - I remember it happening in titles myself with much earlier cards. It's just the same baseless arguments and regurgitated crap I see posted on Reddit.


----------



## Fiercy

As was discussed previously, microstuttering is only the tip of the iceberg; the main problem with SLI is simply that it doesn't work in many games.

I hear this excuse of being an enthusiast, but since when does enthusiast mean wasting money on volatile results?

I have no problem with SLI owners - if you like it, that's good - but don't say SLI has no problems and inspire other people to get it, because in the end it's always going to be a second-rate solution.

Too many people refuse to accept that they have only a limited perception of SLI because they only play 1-10 proven games that have SLI support. Try venturing into other games. Some old games run 50% slower with SLI active.


----------



## cisco0623

Quote:


> Originally Posted by *Fiercy*
> 
> As was discussed previously, microstuttering is only the tip of the iceberg; the main problem with SLI is simply that it doesn't work in many games.
> 
> I hear this excuse of being an enthusiast, but since when does enthusiast mean wasting money on volatile results?
> 
> I have no problem with SLI owners - if you like it, that's good - but don't say SLI has no problems and inspire other people to get it, because in the end it's always going to be a second-rate solution.
> 
> Too many people refuse to accept that they have only a limited perception of SLI because they only play 1-10 proven games that have SLI support. Try venturing into other games. Some old games run 50% slower with SLI active.


I'm not sure I understand your argument. Are you saying people who have SLI only play SLI-supported games? How is it a second-rate solution? What's the first-rate solution? Why is it a waste of money? A Titan XP (or any new card, to be honest) will obviously own any game that's more than a few years old, so why would SLI matter in that case?


----------



## Woundingchaney

Quote:


> Originally Posted by *Fiercy*
> 
> As was discussed previously, microstuttering is only the tip of the iceberg; the main problem with SLI is simply that it doesn't work in many games.
> 
> I hear this excuse of being an enthusiast, but since when does enthusiast mean wasting money on volatile results?
> 
> I have no problem with SLI owners - if you like it, that's good - but don't say SLI has no problems and inspire other people to get it, because in the end it's always going to be a second-rate solution.
> 
> Too many people refuse to accept that they have only a limited perception of SLI because they only play 1-10 proven games that have SLI support. Try venturing into other games. Some old games run 50% slower with SLI active.


SLI users typically are not interested in playing older games. It's also important to note that older games and indie titles don't require the additional processing power to run.

Realistically there are typically only a couple of titles a year that don't support SLI, if that. The last high-profile title I recall that didn't support SLI was JC3 (and given how broken Batman was at launch last year, it's hard to fault a lack of SLI support).


----------



## profundido

Quote:


> Originally Posted by *tpwilko08*
> 
> Where are situated roughly i am from UK.


I live in Belgium and I just got the EK shipment confirmation email today, so I should be getting the blocks within the next 2-3 days.


----------



## Fiercy

Well, if they say they encounter no problems using SLI, the only thing that comes to mind is that yes, they only play SLI-supported games.

Any solution that produces volatile results is, in my opinion, a second-rate one.

It's a waste of money when so many games come out without SLI support. Some games get SLI support much later, when I've already finished them - e.g. Fallout 4, The Division...

SLI is like a roadster: it's cool and fun, but not usable for daily use - in our case, gaming.


----------



## pez

Quote:


> Originally Posted by *Fiercy*
> 
> As was discussed previously, microstuttering is only the tip of the iceberg; the main problem with SLI is simply that it doesn't work in many games.
> 
> I hear this excuse of being an enthusiast, but since when does enthusiast mean wasting money on volatile results?
> 
> I have no problem with SLI owners - if you like it, that's good - but don't say SLI has no problems and inspire other people to get it, because in the end it's always going to be a second-rate solution.
> 
> Too many people refuse to accept that they have only a limited perception of SLI because they only play 1-10 proven games that have SLI support. Try venturing into other games. Some old games run 50% slower with SLI active.


Who is saying it's 100% problem-free? No one is claiming that. If you go into it not knowing you may have to deal with tweaking or non-support for certain games, then you're either an uninformed consumer or just mindlessly spending money. But if you know these things, accept them, and either work around them or have the patience to wait for support, then it's by no means a 'second-rate' solution.
Quote:


> Originally Posted by *cisco0623*
> 
> I'm not sure I understand your argument. Are you saying people who have SLI only play SLI-supported games? How is it a second-rate solution? What's the first-rate solution? Why is it a waste of money? A Titan XP (or any new card, to be honest) will obviously own any game that's more than a few years old, so why would SLI matter in that case?


Quote:


> Originally Posted by *Woundingchaney*
> 
> SLI users typically are not interested in playing older games. It's also important to note that older games and indie titles don't require the additional processing power to run.
> 
> Realistically there are typically only a couple of titles a year that don't support SLI, if that. The last high-profile title I recall that didn't support SLI was JC3 (and given how broken Batman was at launch last year, it's hard to fault a lack of SLI support).


These guys got it^.
Quote:


> Originally Posted by *Fiercy*
> 
> Well, if they say they encounter no problems using SLI, the only thing that comes to mind is that yes, they only play SLI-supported games.
> 
> Any solution that produces volatile results is, in my opinion, a second-rate one.
> 
> It's a waste of money when so many games come out without SLI support. Some games get SLI support much later, when I've already finished them - e.g. Fallout 4, The Division...
> 
> SLI is like a roadster: it's cool and fun, but not usable for daily use - in our case, gaming.


The fact that there are examples like the last two BF games, the Crysis series, etc. shows how ignorant your argument is.


----------



## DNMock

Quote:


> Just in case: do you think the original Titan X waterblock could fit the Titan X Pascal?
> It seems there are some differences in the VRM area, but for the GPU & memory it could make good contact.


The block itself, probably not - the mounting holes are located differently. As far as contact goes, it's identical to the Titan X Maxwell's. If you're willing to mill new screw mounts into it, it shouldn't be an issue.

The front plate - well, there are a couple of capacitors that aren't quite in the exact same spot, so it just needs to be mangled a bit:


Spoiler: Warning: Spoiler!







Where it mounts to the copper block itself and the gasket layout are identical, though.


----------



## stefxyz

It's really not appropriate to argue in a Titan X forum about what is a waste of money. For one person, buying a 1080 over a 1070 is a waste of money because frames per dollar are lower; for another, a 30% performance increase in 2 out of 10 games they play 30 minutes per week is worth the extra Titan, because $1,200 is not much money for them and won't impact their financial situation.

It's like going to McDonald's and arguing with other customers about why they ordered a Big Mac, because it's a waste of money when two cheeseburgers are cheaper and have more calories...

Don't project your personal situation onto others...


----------



## Fiercy

Quote:


> Originally Posted by *pez*
> 
> The fact that there are examples like the last 2 BF games, Crysis series, etc. proves how ignorant your argument is.


I mean, yes, if you only play Battlefield - which I think every SLI owner here plays, and which is the main source of your impressions of how it works. I don't see how that makes anything ignorant.

Let's take a quick look at notable games released in 2016:

FarCry Primal:

https://www.reddit.com/r/448jf6/xcom2_day_1_nvidia_sli_issues_any_temp_fixes/
(Not sure if support was released yet, but it wasn't there when I finished the game)

Quantum Break: Windows 10 Store exclusive, no SLI support.

Mirror's Edge Catalyst: https://forums.mirrorsedge.com/discussion/1930/sli-support

No Man's Sky and Deus Ex: Mankind Divided also have no support.

The list can go on forever if I include 2015, 2014, etc...

What is really ignorant is ignoring how bad SLI actually is for a gamer who plays games.

Quote:


> Originally Posted by *stefxyz*
> 
> It's really not appropriate to argue in a Titan X forum about what is a waste of money. For one person, buying a 1080 over a 1070 is a waste of money because frames per dollar are lower; for another, a 30% performance increase in 2 out of 10 games they play 30 minutes per week is worth the extra Titan, because $1,200 is not much money for them and won't impact their financial situation.
> 
> It's like going to McDonald's and arguing with other customers about why they ordered a Big Mac, because it's a waste of money when two cheeseburgers are cheaper and have more calories...
> 
> Don't project your personal situation onto others...


It's not about more frames; it's a waste of money when you get problems in games or you just can't use the second card.


----------



## DNMock

Quote:


> Originally Posted by *stefxyz*
> 
> It's really not appropriate to argue in a Titan X forum about what is a waste of money. For one person, buying a 1080 over a 1070 is a waste of money because frames per dollar are lower; for another, a 30% performance increase in 2 out of 10 games they play 30 minutes per week is worth the extra Titan, because $1,200 is not much money for them and won't impact their financial situation.
> 
> It's like going to McDonald's and arguing with other customers about why they ordered a Big Mac, because it's a waste of money when two cheeseburgers are cheaper and have more calories...
> 
> Don't project your personal situation onto others...


No, it's like going to the most expensive steakhouse in town and doing something similar.


----------



## Fiercy

Quote:


> Originally Posted by *DNMock*
> 
> No, its like going to the most expensive steak house in town and doing something similar.


I mean, it's the same thing: if you order two cheeseburgers and can only eat one, or you order two steaks but can only eat one, isn't that a waste of money to you?


----------



## Jpmboy

This discussion of SLI or not, frametimes, microstutter, or whatever is off topic... start another thread on the subject - stop hijacking the TXP owner thread with this... PLEASE!


----------



## stefxyz

But there is a benefit to having 2 Titans in some cases, even if the benefit is relatively small and not available in every application, while there is no disadvantage in any other situation, as you can just deactivate the 2nd card. So it's not a complete waste.


----------



## Woundingchaney

Quote:


> Originally Posted by *Fiercy*
> 
> I mean yes if you only play battlefield which i think every SLI owner here plays and that's the main source of your impressions of how it works. I don't see how it makes anything ignorant.
> 
> Le'ts take a quick look at game released in 2016 that are notable:
> 
> FarCry Primal:
> 
> https://www.reddit.com/r/448jf6/xcom2_day_1_nvidia_sli_issues_any_temp_fixes/
> (Not sure if it was released or not yet but it wasn't there when i finished the game)
> 
> Quantum Break
> Windows 10 Store exclusive no SLI support
> 
> Mirror's Edge Catalyst
> https://forums.mirrorsedge.com/discussion/1930/sli-support
> 
> No Man's Sky, Deus Ex Mankind Divided also no support.
> 
> List can go on forever if I include 2015 and 2014 etc...
> 
> What is really ignorant is ignoring how bad SLI actually is for a gamer that plays games.
> It's not about more frames its a waste of money when you get problems in games or you just can't use the second card.


NMS and Deus Ex SLI drivers were released earlier this week.

FC Primal: the game is CPU bottlenecked. Once settings are high enough, scaling is good. I played the game @4K with good scaling.

Windows UWP lacks many basic functions of PC gaming. A lack of SLI support shouldn't be a surprise given just how badly the platform supports PC features.

XCOM 2 I am honestly not familiar with, so I can't say either way. I don't recall the game ever being graphically intensive, though.

As a gamer that plays games, I can't help but feel as if your opinion is very uninformed.


----------



## cisco0623

Quote:


> Originally Posted by *Jpmboy*
> 
> This discussion of SLI or not frametimes, microstutter or what ever is off topic.. start another thread on th esubject - stop hijacking the TXP owner thread with this... PLEASE!


You're right.


----------



## DNMock

Quote:


> Originally Posted by *Fiercy*
> 
> What is really ignorant is ignoring how bad SLI actually is for a gamer that plays games.
> It's not about more frames its a waste of money when you get problems in games or you just can't use the second card.


Did you really just post in the Titan-X owners thread that SLI is a waste of money? Titan-X is a waste of money, you think anyone here gives a crap? What, are you going to go to a car forum scroll down to the exotic section and start saying how getting different features on a Lambo is a waste of money next?


----------



## Silent Scone

I think the discussion is misinformed on both sides. Plus, you know it's time to move things along when the car analogies come out.


----------



## DNMock

Quote:


> Originally Posted by *Silent Scone*
> 
> I think the discussion is misinformed on both sides. Plus, you know it's time to move things along when the car analogies come out.


yar!

back to using a lighter to catch your old acetal Titan X faceplate on fire and scraping away the molten burning parts to get it to fit on the new Titan XP, after going full retard and breaking the Titan XP faceplate by overtightening the screws.


----------



## markklok

Come on guys... please just take it outside









For the ones who already received their EK block..... Isn't it possible to somehow use the factory default backplate ? (maybe longer screws)


----------



## Fiercy

Quote:


> Originally Posted by *Woundingchaney*
> 
> NMS and Deus SLI drivers released earlier this week.
> 
> FC Primal, game is cpu bottle necked. Once settings are high enough scaling is good. I played the game @4k with good scaling.
> 
> Windows UWP lacks many basic functions of PC gaming. A lack of SLI support shouldn't be a surprise given just how bad the platform support PC features.
> 
> XCOM 2 I am honestly not familiar with so I honestly can't say either way. I don't recall the game every being graphically intensive though.
> 
> As a gamer that plays games, I can't help but feel as if your opinion is very uninformed.


I mean, I read a lot of excuses here and there, but facts are facts: lots of issues are there.

Also, it's funny that you think releasing a driver and stating SLI support means it's OK. Fallout 4 has SLI support that's completely useless, as do many other games.

Deus Ex has a DX12 renderer, and looking at Total War: Warhammer and Rise of the Tomb Raider, there is a high chance SLI support there is DX11 only.


----------



## CallsignVega

Quote:


> Originally Posted by *pez*
> 
> He consistently makes arguments and states 'fact' in several threads without ever posting any actual info....so...unfortunately arguing with someone like that (especially in the GPU subforum on OCN) is rather futile
> 
> 
> 
> 
> 
> 
> 
> .


Yes, I run into people's subjective thoughts presented as "facts" in the audio world too. People who claim to be able to tell the difference between a few ms of input lag, or minuscule frame time variances, or that speakers sound better with X cable over Y cable, virtually always get wrecked in blind A-B comparison tests.


----------



## dante`afk

The denial is real.


----------



## mouacyk

Quote:


> Originally Posted by *DNMock*
> 
> Did you really just post in the Titan-X owners thread that SLI is a waste of money? Titan-X is a waste of money, you think anyone here gives a crap? What, are you going to go to a car forum scroll down to the exotic section and start saying how getting different features on a Lambo is a waste of money next?


Why not? It's not like anyone needs an exclusive membership passphrase to get in here and see the 50% average performance increase over a GPU that can now be acquired for merely 1/4 of the cost. If someone else feels like going to a car forum, they probably will, and hopefully they have good points to counter the praise of marginal gains there as well. There's one BIG exception, though: those cars retain or even increase their value, making such a car analogy irrelevant here.
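The perf-per-dollar argument above can be sketched numerically. These are only the post's ballpark figures (a ~$1200 Titan X Pascal that is ~50% faster than a card obtainable for ~1/4 of the cost), not measured benchmarks, and the helper name is just illustrative:

```python
# Rough perf-per-dollar sketch of the argument above. Figures are the
# post's ballpark numbers, not benchmarks: Titan X Pascal at ~$1200 with
# ~1.5x the performance of a ~$300 (1/4 the cost) card.

def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Relative performance delivered per dollar spent."""
    return relative_perf / price_usd

titan = perf_per_dollar(1.5, 1200.0)  # ~50% faster, full price
older = perf_per_dollar(1.0, 300.0)   # baseline performance, 1/4 the cost

print(f"Titan X:    {titan:.5f} perf/$")
print(f"older card: {older:.5f} perf/$")
print(f"older card gives {older / titan:.1f}x the performance per dollar")
```

Which is the usual halo-product trade-off: the cheaper card wins on value, the Titan wins on absolute performance.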


----------



## DNMock

Quote:


> Originally Posted by *markklok*
> 
> Common guys.. please just take it outside
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For the ones who already received their EK block..... Isn't it possible to somehow use the factory default backplate ? (maybe longer screws)


I played with it a bit. You can, but the trick is that those little bolts you need the 4mm socket to remove act as standoffs for the stock backplate; it would take the right combination of thermal pads to get it to work. The screws Nvidia used are smaller as well, so you will have to bore the holes out a bit larger.
Quote:


> Originally Posted by *mouacyk*
> 
> Why not? It's not like anyone needs an exclusive membership passphrase to get in here to see the 50% avg performance increase over a GPU that can now be acquired for merely 1/4 of the cost. If someone else feels like going to a car forum, they probably will and hopefully they have good points to counter marginal gains praising there as well. There's one BIG exception though, those cars retain or even increase their value, making such an car antagonist irrelevant versus here.


The point is, these folks are the wrong target audience. People who get T-X gpu's are folks who want top of the line performance, and are willing to pay big bucks for even marginal gains, enjoy benchmarks and the thrill of getting that max 3dmark score, or enthusiasts who just enjoy building nice rigs. All this "Micro stutter" "poor scaling" and "only available in AAA titles" stuff means nothing to this target audience.

I'm not arguing the statements about SLI. I'm simply saying no one who is using SLI _here_ in this thread cares, so it's wasted time and energy.

Now I'm done on the subject.


----------



## Jpmboy

Quote:


> Originally Posted by *DNMock*
> 
> yar!
> 
> back to using a lighter to catch your old acetal Titan X faceplate on fire and scraping away the molten burning parts to get it to fit on the new Titan-XP after going full retrard and breaking the Titan XP faceplate by overtightening the screws.


Now ^^ this is what a TXP owners thread should be talking about!









@dante`afk - take this quixotic argument elsewhere... politely asked.


----------



## DNMock

Quote:


> Originally Posted by *markklok*
> 
> I deff would have picked up my dremel and made a sculpture out of it.....


Left the dremel up at the shop, was tired of taking the loop apart, putting it back together, seeing it leak, back apart, etc. etc. so I just said screw it and threw up a hail mary lol.


----------



## cisco0623

I get the feeling a lot of these negative Titan X posts are from users who don't actually own a Titan X, but subconsciously wish they did







. I see the same cycle of trolls, going on twenty years now, every time a new card comes out. And before any stupid flame war starts - if you even feel the need to argue my post, that is proof alone. If you don't own a Titan X or plan on owning one, why bother posting about it? Do you think people who've purchased the Titan X are going to say, "You're right, I'm going to return my card and buy a Jaton"?


----------



## Jpmboy

Quote:


> Originally Posted by *markklok*
> 
> I deff would have picked up my dremel and made a sculpture out of it.....


I know - right? But you gotta admit, never seen someone "flame-trim" an acetal block top before.


----------



## markklok

Quote:


> Originally Posted by *Jpmboy*
> 
> I know - right? But you gotta admit, never seen someone "flame-trim" an acetal block top before.


Other option... just saw it off







Bet it would have been quicker.. + you can reach the points for the shunt mod


----------



## DNMock

Honestly, I didn't intend to catch it on fire originally. I was using a file to trim it down, and without thinking I tried to heat it up a bit to speed things along.

Worst case scenario, I had already seen it would fit over the primary copper block area, so I figured if I screwed it up badly enough I'd just saw that section off and go with a non-full-cover block.

edit: beat me to it lol, that was the back-up plan. I was so impressed that it worked given how absurd it was, I decided to roll with it.


----------



## criminal

Quote:


> Originally Posted by *Jpmboy*
> 
> Our @strong island 1 hit a *milestone*! Dude has run many owner threads here and is a benchmark editor for OCN.


That's pretty amazing. I remember when he started the 780 Classified owners thread. He was just getting into extreme overclocking at the time. He has moved up fast.


----------



## cisco0623

Does the oem backplate work with the EK block?


----------



## DNMock

Quote:


> Originally Posted by *cisco0623*
> 
> Does the oem backplate work with the EK block?


Not without some pretty heavy modding, and it's a POS anyway. A better bet would be using any old EK backplate you may have and just drilling holes in it.

I was going to try to use the Titan X Maxwell backplate originally, but due to the other issues stated above, my patience waned.

Edit: btw, from what I was looking at, the only real trick necessary to use the Titan X backplate was to use thicker thermal pads (there are no memory chips on the back of the new Titan X) and maybe a couple of holes for screws if you want it really snug.


----------



## pez

Quote:


> Originally Posted by *Fiercy*
> 
> I mean yes if you only play battlefield which i think every SLI owner here plays and that's the main source of your impressions of how it works. I don't see how it makes anything ignorant.
> 
> Le'ts take a quick look at game released in 2016 that are notable:
> 
> FarCry Primal:
> 
> https://www.reddit.com/r/448jf6/xcom2_day_1_nvidia_sli_issues_any_temp_fixes/
> (Not sure if it was released or not yet but it wasn't there when i finished the game)
> 
> Quantum Break
> Windows 10 Store exclusive no SLI support
> 
> Mirror's Edge Catalyst
> https://forums.mirrorsedge.com/discussion/1930/sli-support
> 
> No Man's Sky, Deus Ex Mankind Divided also no support.
> 
> List can go on forever if I include 2015 and 2014 etc...
> 
> What is really ignorant is ignoring how bad SLI actually is for a gamer that plays games.
> It's not about more frames its a waste of money when you get problems in games or you just can't use the second card.


Quote:


> Originally Posted by *Woundingchaney*
> 
> NMS and Deus SLI drivers released earlier this week.
> 
> FC Primal, game is cpu bottle necked. Once settings are high enough scaling is good. I played the game @4k with good scaling.
> 
> Windows UWP lacks many basic functions of PC gaming. A lack of SLI support shouldn't be a surprise given just how bad the platform support PC features.
> 
> XCOM 2 I am honestly not familiar with so I honestly can't say either way. I don't recall the game every being graphically intensive though.
> 
> As a gamer that plays games, I can't help but feel as if your opinion is very uninformed.


And all it takes is this guy's anecdotal evidence below to counter your anecdotal evidence. You see how that works? The main point you didn't seem to get is that no one said it was perfect, and we provided you with the flaws it has. However, you're entitled to your own _opinion_.
Quote:


> Originally Posted by *dante`afk*
> 
> Yea? show me all the threads where I say that other than this one here.
> 
> I'm gonna wait until next year then for your reply.
> We can also add UE4 engine games to the list. Probably the best engine currently, with no SLI support.


If it wasn't here earlier on, it was in the 1080 thread when you tried to argue with me (without proof again) something SLI related.
Quote:


> Originally Posted by *CallsignVega*
> 
> Yes, I run into peoples subjective thoughts as "facts" in the audio world too. People who claim to be able to tell the difference between a few ms of input lag, or minuscule frame time variances, or that speakers sound better with X cable over Y cable virtually always get wrecked in blind A-B comparison tests.


It's a shame, because I've read quite a few of your 'mini-reviews' and your rather thorough posting has always impressed me







.


----------



## cisco0623

Quote:


> Originally Posted by *DNMock*
> 
> not without some pretty heavy modding and it's a pos anyway. Better bet would be using any old EK backplate you may have and just drilling holes in it.
> 
> Was going to try and use the Titan-X Maxwell back plate originally, but due to other issues stated above, my patience waned.
> 
> Edit, btw, from what I was looking at, the only real trick I saw necessary to use the Titan-X back plate was to use a thicker thermal pads (no memory chips on the back of new titan-x) and maybe a couple holes for screws if you want it really snug.


I'll just wait for my proper ek back plates lol.


----------



## mbze430

Quote:


> Originally Posted by *HyperMatrix*
> 
> Yes, at 60Hz that variance can be as high as 16.66ms, with a median of 8.4ms. But even at just 144Hz, that frame variance can only be as high as 6.94ms, with a median of 3.47ms. And that's not even taking into account GSYNC.


Right, so even with your 166Hz+ monitor, there is still frame time variance.
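As a side note, the refresh-rate arithmetic quoted above works out like this (a minimal sketch; the helper names are illustrative, not from any real API):

```python
# Sketch of the frame-time arithmetic quoted above: with a fixed refresh
# rate, the worst-case frame-time variance is one full refresh interval
# (1000 ms / Hz), and the quoted "median" is half of that interval.

def frame_interval_ms(refresh_hz: float) -> float:
    """Worst-case frame-time variance: one full refresh interval."""
    return 1000.0 / refresh_hz

def median_variance_ms(refresh_hz: float) -> float:
    """Half the refresh interval, as used in the quote above."""
    return frame_interval_ms(refresh_hz) / 2

for hz in (60, 144):
    print(f"{hz} Hz: max {frame_interval_ms(hz):.2f} ms, "
          f"median {median_variance_ms(hz):.2f} ms")
```

At 60 Hz that gives ~16.67 ms max; at 144 Hz, ~6.94 ms max and ~3.47 ms median, matching the quoted numbers (G-Sync, which decouples refresh from a fixed interval, isn't modeled here).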
Quote:


> Originally Posted by *cisco0623*
> 
> My two cents on this: I had a 9800gx2 - bad micro stutter. I then went to 295gtx sli. Slight micro stutter, but it didn't bother me at all. I then had tri-sli Titans until I got the Titan x just now and I hardly had any issues with it- I cant even say I had micro stutter. They were great cards and I was even reluctant to upgrade after three years. (I kept losing one card because nvidia's new drivers only cater to SLI and kept omitting the third!) Basically no micro stutter that's noticeable like I originally experienced on the old card. Also while my Titan x is obviously more powerful on its own compared to my Titan tri sli I don't really "see" or "feel" like it's any smoother other than higher frame rates.


Tri-card, for whatever reason, completely solved microstutter in SLI setups... that was well documented.


----------



## mouacyk

Quote:


> Originally Posted by *DNMock*
> 
> I played with it a bit, you can, but the trick is those little bolts you need the 4mm socket to remove act as standoffs for the stock back plate, would take the right combination of thermal pads to get it to work. The screw Nvidia used are smaller as well so you will have to bore the holes out a bit larger as well.
> *The point is, these folks are the wrong target audience.* People who get T-X gpu's are folks who want top of the line performance, and are willing to pay big bucks for even marginal gains, enjoy benchmarks and the thrill of getting that max 3dmark score, or enthusiasts who just enjoy building nice rigs. All this "Micro stutter" "poor scaling" and "only available in AAA titles" stuff means nothing to this target audience.
> 
> I'm not arguing the statements about SLI, I'm simply saying no one who is using SLI _here_ in this thread care so it's wasted time and energy.
> 
> Now I'm done on the subject.


The owners may or may not be the right target audience for the antagonism. However, there are other groups of people who visit this thread and deserve a balanced view of the product (whether for potential purchasing or passing on information), because it can't all be peaches and sunshine in Titan XP land. There is a healthy ratio to any product criticism.


----------



## Woundingchaney

Quote:


> Originally Posted by *pez*
> 
> And all it takes is this guys anecdotal evidence below to counter your anecdotal evidence. You see how that works? The main point you didn't seem to get is that no one said it was perfect, and we provided you with flaws that it has. However, you're entitled to your own _opinion
> _.
> .


The conversation was regarding SLI support; I am not sure what part of this is anecdotal. The majority of the games he listed do in fact support SLI.

I don't think anyone here has ever said that there weren't issues with SLI, rather that many of these bullet points from your stance aren't representative of reality, nor are they quite the issue that some make them out to be.


----------



## cisco0623

Is there a forum or thread specific to SLI for you guys to hammer out the epeen?


----------



## pez

Quote:


> Originally Posted by *Woundingchaney*
> 
> The conversation was regarding SLI support, I am not sure what part of this is anecdotal. The majority of the games he listed do in fact support SLI.
> 
> I don't think anyone here has ever said that there wasn't issues with SLI, rather that many of these bullet points from your stance aren't representative of the reality nor are they quite the issue that some make them out to be.


Yes, I'm letting him know that your argument was a direct counter to his argument. I'm not sure if your whole post is directed at me... you reiterated part of my post in different words.
Quote:


> Originally Posted by *cisco0623*
> 
> Is there a forum or thread specific to SLI for you guys to hammer out the epeen?


Sure, click the 'NVIDIA' link at the top and go for it. This isn't an e-peen battle.

Also, this may be relevant for you:
http://www.overclock.net/rigbuilder


----------



## Jpmboy

Quote:


> Originally Posted by *pez*
> 
> Yes, I'm letting him know that your argument was a direct counter to his argument. I'm not sure if your whole post is directed at me...you re-iterated part of my post in different words.


***? is this a troll feeding fest or can folks not understand how off topic this BS is?


----------



## pez

Quote:


> Originally Posted by *Jpmboy*
> 
> ***? is this a troll feeding fest or can folks not understand how off topic this BS is?


Report the posts then. It started as a discussion about SLI and was relevant to the thread at first.


----------



## Jpmboy

Quote:


> Originally Posted by *pez*
> 
> Report the posts then. It started as a discussion about SLI and was relevant to the thread at first.


and wandered off into a battle between very few users in an Owners thread NOT an SLI Debate thread.


----------



## cisco0623

Quote:


> Originally Posted by *pez*
> 
> Report the posts then. It started as a discussion about SLI and was relevant to the thread at first.


Yes, and even after everyone said what needed to be said, it continued and got worse. That's why I'm saying take it elsewhere or ignore them - it is rude to folks coming in here looking for Titan X-specific and relevant info who instead see this back-and-forth BS that means nothing. Nothing personal.


----------



## pez

Quote:


> Originally Posted by *Jpmboy*
> 
> and wandered off into a battle between very few users in an Owners thread NOT an SLI Debate thread.


Great. Sorry we didn't cater to your interests for a few pages. I personally am glad the thread wasn't about the 30th person asking about the progress of custom voltage control, modded BIOSes, and the same rehashed questions about OCs and benchmarks because people aren't competent enough to use the search feature







.


----------



## Silent Scone

Lol yet not a single piece of data presented. I've more experience with SLI than all of you put together, therefore my word is final - back OT.

Good evening.


----------



## opt33

Quote:


> Originally Posted by *markklok*
> 
> For the ones who already received their EK block..... Isn't it possible to somehow use the factory default backplate ? (maybe longer screws)


I was going to do that until my EK backplate arrived, but the factory backplate is in 2 pieces, and all the screw holes you would use are on the 2 ends... nothing to hold the middle down without doing something risky.

btw... the max temp in 20+ 3DMark benching runs on the back of my card was 46.4°C (VRM area), measured with an IR thermometer. Any expensive thermal pads belong on the waterblock side.


----------



## cisco0623

Quote:


> Originally Posted by *pez*
> 
> Great. Sorry we didn't appease to your interests for a few pages. I personally am glad the thread wasn't about the 30th person asking about the progress of custom voltage control, modded BIOS' and the same rehashed questions about OCs and benchmarks because people aren't competent enough to use the search feature
> 
> 
> 
> 
> 
> 
> 
> .


lol I get your point, but that is technically the point of an owners thread, especially at this early stage of the release.


----------



## cisco0623

Quote:


> Originally Posted by *Silent Scone*
> 
> Lol yet not a single piece of data presented. I've more experience with SLI than all of you put together, therefore my word is final - back OT.
> 
> Good evening.


Damn, you must always be the life of the party.


----------



## pez

Quote:


> Originally Posted by *cisco0623*
> 
> lol I get your point, but that is technically the point of an owners thread. Especially at this early stage of its release.


Yeah, I get it. The SLI arguments annoy me, too. But it also annoys me that this thread moves faster due to half of the content being those questions over and over and over and over....and over.


----------



## cisco0623

Quote:


> Originally Posted by *pez*
> 
> Yeah, I get it. The SLI arguments annoy me, too. But it also annoys me that this thread moves faster due to half of the content being those questions over and over and over and over....and over.


I agree with you, I'm just a little more patient lol


----------



## Jpmboy

Quote:


> Originally Posted by *pez*
> 
> Yeah, I get it. The SLI arguments annoy me, too. But it also annoys me that this thread moves faster due to half of the content being those questions over and over and over and over....and over.


Or that any actually useful information, rare as it is, is diluted (polluted?) with repetitive posts regarding a subjective effect/observation that is nearly impossible to quantify. It's like debating the fact that some individuals do not get poison ivy. It generally leads to an accumulation of aphasic posts... but like you say, this is a free/open forum; folks can post whatever they like within the TOS.


----------



## lilchronic

There has always been slightly noticeable microstutter when going SLI. I've had 480 SLI, 670 SLI, and 780 Ti SLI running smoothly, but compared to a single card you can notice a slight difference; nothing new here.


----------



## Jpmboy

Quote:


> Originally Posted by *lilchronic*
> 
> There has always been a slightly noticeable micro stutter when going sli. I've had 480 sli, 670 sli and 780Ti sli running smoothly but compared to a single card you can notice a slight difference, nothing new here.


lol - this oughta get the posts going... I never noticed any stutter when running 2 HD 7970s in CFX.


----------



## Testier

We all see things differently and thus no one is able to make a statement on this for everyone.

There is very little point in discussing essentially a personal preference. How about we talk about something else?


----------



## lilchronic

Quote:


> Originally Posted by *Jpmboy*
> 
> lol- this outta get the posts going... I never noticed any stutter when running 2 HD 7970s in CFX.


Well, I've never owned AMD GPUs, so.....

Anyway, just like you said before, some people are just more susceptible to noticing the microstutter.

...Even a single card can have microstutter sometimes; add another and you have twice as much.


----------



## stangflyer

Hi all, not an owner (yet). I want to ask a serious question to see which way to go.

My current PC is an i5 3570 non-K (I made a mistake when ordering and did not notice it was a non-K until after delidding it). It runs at 4.2, with 16 GB of PC1600 DDR3, an ASRock OC Formula Z77 mobo, 2 SSDs, and a 980 Ti Hybrid that runs at 1520/7600 on the stock BIOS. (I have a buyer at $375 for the 980 Ti if I want to sell it.)

*I game at 7680x1440 60Hz*. I had 970s in SLI but was not fond of SLI anymore after years of having it and CrossFire, so I went with the 980 Ti.

My choices are:

1. Make this PC my backup, sell my current backup (i7 950 D0 / X58 mobo / GTX 780), and build a new PC.
2. Just sell my 980 Ti and get a Titan XP, and keep the 3570 PC until Skylake-E drops.
3. Get a cheap 980 Ti and go back to SLI. *Some of the reading in this thread makes me think that 980 Tis in SLI would push enough frames that the SLI would be less noticeable than with the 970s.*
4. Sell my 3 monitors and get a 3440x1440 100Hz G-Sync monitor. Then my 980 Ti should be good enough until Volta, etc. I am not an AA hog and I am OK with knocking down a couple of settings to get the FPS where I want them. I want to get the Acer Predator X34P that comes out in a few months, but that does me no good now, which makes #4 hard.
5. I could sell the 980 Ti and stick the GTX 780 in with the 3570, because I am still on 1080p for a TV. Then put the Sapphire 7950 Boost Flex from my backup PC in the X58 machine to sell cheap. Then I have more funds for a complete PC overhaul.

I am sorry this is so long-winded, but I do have a number of choices here. I can spend around $1,500 or so on the upgrade.

Thanks, and I appreciate any thoughts or insight you all have.


----------



## CRITTY

Quote:


> Originally Posted by *Fiercy*
> 
> I mean same thing if you order 2 cheeseburgers and can only eat one and if your order two steaks but you can only eat one isn't that a waste of money to you?


No, it's called Breakfast.


----------



## Fiercy

Quote:


> Originally Posted by *stangflyer*
> 
> Hi all, Not an owner (Yet). I want to ask a serious question to see which way to go.
> 
> My current pc is a i5 3570 non k- )I made a mistake when ordering and did not notice it was a non k till after delidding it)
> 
> 
> 
> 
> 
> 
> 
> . Runs at 4.2. 16 gigs of pc1600 ddr3. Asrock OC Formula Z77 mobo. 2 ssd's and a 980TI hybrid that runs at 1520/7600 on stock bios. (I have a buyer for $375 for the 980ti if I want to sell it).
> 
> *I game at 7680x1440 60hz*. I had 970's in SLI but was not fond of the SLI anymore after years of having it and crossfire. So I went with the 980ti.
> 
> My choices are:
> 
> 1: Make this pc my backup and sell my current backup (I7 950DO /X58 mobo / GTX 780) and build a new pc.
> 2.Just sell my 980ti and get a Titan XP and keep the 3570 pc untill Skylake E drops.
> 3. Get a cheap 980ti and go back to SLI. *Some of the reading on the thread makes me think that the 980ti in sli would push enough frames where the SLI would be less noticeable then with the 970's.
> *4. Sell my 3 monitors and get a 3440x1440 100hz Gsync monitor. Then my 980ti should be good enough until volta etc. I am not an AA hog and I am ok with knocking down a couple of settings to get the fps where I want them. I want to get the Acer Predator X34P that comes out in a few months but that does me no good now which makes #4 hard.
> 5. I could sell the 980ti and stick the gtx780 in with the 3570 because I am still on 1080p for a TV. Then on my backup pc put my sapphire 7950 boost flex in the x58 machine to sell cheap. Then I have more funds for a complete pc overhaul.
> 
> I am sorry this is so long winded but I do have a number of choices here. I can spend around $1500 or so on the upgrade.
> 
> Thanks and appreciate and thoughts or insight you all have.


At this resolution you definitely don't need a better processor. I would say the best option is to get the 3440x1440 display. The 980 Ti will give you better performance on it, and I think the immersion might be better with it compared to three monitors.


----------



## bee144

Quote:


> Originally Posted by *Silent Scone*
> 
> Lol yet not a single piece of data presented. I've more experience with SLI than all of you put together, therefore my word is final - back OT.
> 
> Good evening.


LOL you're full of yourself. BYE


----------



## DNMock

Quote:


> Originally Posted by *stangflyer*
> 
> Hi all, Not an owner (Yet). I want to ask a serious question to see which way to go.
> 
> My current pc is a i5 3570 non k- )I made a mistake when ordering and did not notice it was a non k till after delidding it)
> 
> 
> 
> 
> 
> 
> 
> . Runs at 4.2. 16 gigs of pc1600 ddr3. Asrock OC Formula Z77 mobo. 2 ssd's and a 980TI hybrid that runs at 1520/7600 on stock bios. (I have a buyer for $375 for the 980ti if I want to sell it).
> 
> *I game at 7680x1440 60hz*. I had 970's in SLI but was not fond of the SLI anymore after years of having it and crossfire. So I went with the 980ti.
> 
> My choices are:
> 
> 1. Make this pc my backup, sell my current backup (i7 950 D0 / X58 mobo / GTX 780), and build a new pc.
> 2. Just sell my 980 Ti and get a Titan XP, and keep the 3570 pc until Skylake-E drops.
> 3. Get a cheap 980 Ti and go back to SLI. *Some of the reading on the thread makes me think that the 980 Ti in SLI would push enough frames that the SLI would be less noticeable than with the 970's.
> *4. Sell my 3 monitors and get a 3440x1440 100Hz G-Sync monitor. Then my 980 Ti should be good enough until Volta etc. I am not an AA hog and I am ok with knocking down a couple of settings to get the fps where I want them. I want the Acer Predator X34P that comes out in a few months, but that does me no good now, which makes #4 hard.
> 5. I could sell the 980 Ti and stick the GTX 780 in with the 3570, because I am still on 1080p for a TV. Then put my Sapphire 7950 Boost Flex in the X58 machine to sell cheap. That leaves more funds for a complete pc overhaul.
> 
> I am sorry this is so long winded but I do have a number of choices here. I can spend around $1500 or so on the upgrade.
> 
> Thanks, and I appreciate any thoughts or insight you all have.


1500 dollar budget?

Are you willing to buy used?

If so then you may be able to upgrade to a 5930K Haswell CPU with the extra PCIE lanes and even get a pair of Titan-X maxwell cards for the added VRAM.

As long as it's all used stuff (I prefer used from the OCN trader forum because I know the stuff is well cared for) you should be able to get a 5930k, X99 Mobo, 32gb of 2133 DDR4 ram and a pair of Maxwell T-X cards for around what your budget is.

Edit: at that resolution I would definitely think the Titan-X Maxwell SLI used set-up would be the way to go for the VRAM since that high of a resolution is gonna eat up a lot of VRam


----------



## stangflyer

Quote:


> Originally Posted by *DNMock*
> 
> 1500 dollar budget?
> 
> Are you willing to buy used?
> 
> If so then you may be able to upgrade to a 5930K Haswell CPU with the extra PCIE lanes and even get a pair of Titan-X maxwell cards for the added VRAM.
> 
> As long as it's all used stuff (I prefer used from the OCN trader forum because I know the stuff is well cared for) you should be able to get a 5930k, X99 Mobo, 32gb of 2133 DDR4 ram and a pair of Maxwell T-X cards for around what your budget is.
> 
> Edit: at that resolution I would definitely think the Titan-X Maxwell SLI used set-up would be the way to go for the VRAM since that high of a resolution is gonna eat up a lot of VRam


Thanks for the reply. I do not think I need more VRAM, as I use little or no AA. The exception is older games, where performance is a non-issue. I keep AB open and do not see VRAM usage getting above 4.5-5GB. I do not play GTA V, Witcher 3, etc.


----------



## Jpmboy

Quote:


> Originally Posted by *stangflyer*
> 
> Thanks for the reply. I do not think I need more vram as I do not use much AA or none at all. Except older games that performance is a non issue. I keep AB open and do not see vram usage getting above 4.5-5gigs. I do not play GTA V, Witcher 3 etc.


if it's running the way you like... stand pat. If you want to upgrade in stages, build on the best base you can for your budget: either go Z170/6700K, or X99 and a 6-core. SKL-E will use a different socket (LGA 2066), so there is nothing to do there. Absent an intermediate mobo/CPU, a TXP on what is likely a PCIe Gen2 x16 slot (though Gen3 x16 is common on these boards), coupled to a 4.2GHz Ivy, is just going to choke the TXP IMO. A 1080 may better suit your needs. IDK.


----------



## Falknir

Anyone else getting "Payment authorization failed, please try again or use another payment method." when trying to complete checkout? I've been trying to complete the order over the last couple of hours. No transactions/blocks are reported by my bank, credit card, or PayPal today.


----------



## Creator

I don't think many people know what microstutter is. Here's an old but good example:





And here's it still existing today:




Without the side-by-side comparisons, one may not realize it's there. But it is, as it's an inherent problem with AFR. It's too bad SFR never really took off. Hopefully with DX12...

*Edit: Oh and 1000th post!*
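To put numbers on what AFR microstutter looks like, here is a minimal sketch (the frametime lists are made-up illustrative values, not measurements): two traces with the same average fps, where only the frame-to-frame delta reveals the stutter.

```python
# Toy illustration of AFR microstutter: two frametime traces (milliseconds)
# with the same average fps, distinguishable only by frame-to-frame deltas.
def stutter_stats(frametimes_ms):
    """Return (average frame time, worst consecutive-frame delta)."""
    avg = sum(frametimes_ms) / len(frametimes_ms)
    deltas = [abs(b - a) for a, b in zip(frametimes_ms, frametimes_ms[1:])]
    return avg, max(deltas)

smooth = [16.7] * 8          # steady ~60 fps
afr    = [12.0, 21.4] * 4    # same ~16.7 ms average, alternating short/long

print(stutter_stats(smooth))  # worst delta 0.0 ms
print(stutter_stats(afr))     # worst delta ~9.4 ms despite identical avg fps
```

Tools like FRAPS or PresentMon can export real per-frame times to feed into something like this.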


----------



## dante`afk

Dam you can't post that, that will shake the belief of the denial-army.

'B-b-but I don't have that on my screen, it-t-t's only in those videos like that'
'My CPU is quadrillion ghz with SLI and that awesome LED HB bridge uberduper. I have 5000 fps and therefore the microstuttering is neglectable!!11'


----------



## mbze430

someone needs to do a video in slow motion on a 160Hz+ monitor.... then MAYBE... just MAYBE.... nah, still in denial


----------



## bee144

The micro stutter talk is great but it needs to be moved to a separate thread.


----------



## Silent Scone

Witcher 3 isn't an SLI friendly game. The engine has too much post processing.


----------



## stangflyer

Quote:


> Originally Posted by *Jpmboy*
> 
> if it's running the way you like... stand pat. If you want to upgrade in stages, Build upon the best base you can for your budget. Either go z170/6700K or x99 and a 6-core. SKL-E will use a different socket (LGA 2066) so there is nothing to do there. Absent an intermediate mobo/cpu, upgrading to a TXP with what is likely a PCIEgen2x16 (tho z87 G3x16 is common) coupled to a 4.2GHz ivy is just going to choke the TXP IMO. A 1080 may better suit your needs. IDK.


My mobo is Gen3 x16. Up until 1-2 years ago, at big monitor resolutions, CPU/memory speeds did not make a big difference. Now I am starting to see that even upgrading from PC1600 to PC2400 can make a 4-5 fps difference in some games. I think I am going to see if I can get any bites on our company bulletin board for my pc and see what I can get for it.


----------



## HyperMatrix

Quote:


> Originally Posted by *mbze430*
> 
> Right, so even with your 166hz+ monitor, there is still a variance of frame time.


No. No there isn't. You just don't get it. An undetectable variance is a non-existent variance as far as your eyes are concerned.

I don't understand why some people are so dense.


----------



## HyperMatrix

Quote:


> Originally Posted by *mouacyk*
> 
> The owners may or may not be the right target audience for the antagonism. However, there are other groups of people who visit this thread and deserve a balanced view over the product (whether it be for potential purchasing or passing on information), because it can't be all peaches and sunshine in Titan XP land. There is a healthy ratio to any product criticism.


Create a new thread for that. This is for owners for a reason: video card comments on any website, if you follow them, split into two groups - people who have the card, and people who can't afford the card and spend all their time bashing it to feel better about themselves. You can complain about the Titan XP, but you should do it in another thread.


----------



## CallsignVega

Quote:


> Originally Posted by *Silent Scone*
> 
> I've more experience with SLI than all of you put together, therefore my word is final


Um, no.
Quote:


> Originally Posted by *Creator*
> 
> I don't think many people know what microstutter is. Here's an old but good example:


No one is debating what micro-stutter _is_. A video from GTX 580 days at 30 FPS is hardly a catch-all for today. The point is that with a well-tuned system and components, micro-stutter is by and large a non-issue.

Spikes of 2-3ms off baseline which can even happen to single GPU:


----------



## bee144

Quote:


> Originally Posted by *CallsignVega*
> 
> Um, no.
> No one is debating what micro-stutter _is_. A video from GTX 580 days at 30 FPS is hardly a catch-all for today. The point is that with a well-tuned system and components, micro-stutter is by and large a non-issue.
> 
> Spikes of 2-3ms off baseline which can even happen to single GPU:


Stop, the truth hurts.


----------



## besthijacker

This thread is just getting better and better.




Here is my setup and 3DMark. CPU runs at 4.6GHz.


----------



## Jpmboy

Quote:


> Originally Posted by *stangflyer*
> 
> My mobo is a Gen 3x16. Up till 1-2 years ago when you used big resolutions on monitors the CPU/memory speeds did not make a big difference. Know I am starting to see where even upgrading from PC1600 to PC2400 can make 4-5 fps difference in some games. I think I am going to see if I can get any bites on our company bulletin board for my pc and see what I can get for it.


good move... and then get what?
Quote:


> Originally Posted by *CallsignVega*
> 
> Um, no.
> No one is debating what micro-stutter _is_. A video from GTX 580 days at 30 FPS is hardly a catch-all for today. The point is that with a well-tuned system and components, micro-stutter is by and large a non-issue.
> 
> Spikes of 2-3ms off baseline which can even happen to single GPU:
> 
> 
> Spoiler: Warning: Spoiler!


Finally... data. As opposed to trying to get everyone to see the same thing in a Rorschach blot.


----------



## unreality

I put my old 2015 Titan X under air again (going to sell it) and ran some air-vs-air benchmarks: TX Maxwell at 1405/7000 (no-throttle BIOS) vs TXP at +210/+400 (throttling).

Timespy
http://www.3dmark.com/compare/spy/298969/spy/299754 (gotta see this!)

Firestrike Ultra
http://www.3dmark.com/compare/fs/9826860/fs/9828802

GTA 5 5K:
72-79% Improvement

Metro Last Light 5K:
23.66 avg vs 40.20 avg ~70% improvement
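The ~70% figure above is easy to verify; a quick sketch using the Metro averages from this post (assuming a simple avg-fps ratio):

```python
# Percentage gain from old vs new average fps (numbers from the post above).
def pct_gain(old_fps, new_fps):
    return (new_fps / old_fps - 1.0) * 100.0

print(f"Metro Last Light 5K: {pct_gain(23.66, 40.20):.1f}% faster")  # ~69.9%
```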


----------



## DNMock

Quote:


> Originally Posted by *stangflyer*
> 
> Thanks for the reply. I do not think I need more vram as I do not use much AA or none at all. Except older games that performance is a non issue. I keep AB open and do not see vram usage getting above 4.5-5gigs. I do not play GTA V, Witcher 3 etc.


If you are currently using 970's, you effectively only have 4GB of VRAM. If you are using anything over that, it's being offloaded into your system RAM and significantly slowing you down.

That said, given your system and budget, either go quick and dirty for an upgrade (a pair of used Titan X/980 Ti cards or equivalent on the cheap), or spend your money upgrading your platform so you can build on it later. If used cards are still pricey, you could always go for Polaris 10 GPUs, which should be faster than the 970s and have more RAM.


----------



## cisco0623

Quote:


> Originally Posted by *unreality*
> 
> I put my old 2015 Titan X under air again (going to sell it) and made some Air vs Air Benchmarks (TXm 1405/7000 no throttle bios) vs TXp +210/+400 (throttle)
> 
> Timespy
> http://www.3dmark.com/compare/spy/298969/spy/299754 (gotta see this!)
> 
> Firestrike Ultra
> http://www.3dmark.com/compare/fs/9826860/fs/9828802
> 
> GTA 5 5K:
> 72-79% Improvement
> 
> Metro Last Light 5K:
> 23.66 avg vs 40.20 avg ~70% improvement


Impressive. I don't upgrade as often as I used to, but when I do I like to see at least a 40-50% jump. That's one hell of an increase after one gen! It reminds me of the late 90's / early 2000's.


----------



## DADDYDC650

Quote:


> Originally Posted by *DNMock*
> 
> If you are currently using 970's you only have 4gb of VRAM in your system. If you are using anything over 4, it's being off loaded into your system ram and significantly slowing you down.
> 
> That said, given your system and budget, either go quick and dirty for an upgrade (a pair of used Titan-X/980ti cards or equivalent on the cheap), or spend your money upgrading your platform so you can build on it later. If the used cards are still pricey, you could always go for polaris 10 GPU's which should be faster than the 970s with more RAM.


970 has 3.5GB.


----------



## Woundingchaney

Quote:


> Originally Posted by *DADDYDC650*
> 
> 970 has 3.5GB.


Actually it has 4GB, but in a rather unique configuration. RAM access is essentially 3.5 + 0.5.


----------



## DADDYDC650

Quote:


> Originally Posted by *Woundingchaney*
> 
> Actually it has 4GB but a rather unique configuration. Ram access is essentially 3.5 + .5


Only 3.5GB works like it should. It slows down when trying to access the 0.5GB portion.

http://www.pcgamer.com/why-nvidias-gtx-970-slows-down-using-more-than-35gb-vram/


----------



## Woundingchaney

Quote:


> Originally Posted by *DADDYDC650*
> 
> Only 3.5 works like it should. It slows down when trying to access the .5 portion.
> 
> http://www.pcgamer.com/why-nvidias-gtx-970-slows-down-using-more-than-35gb-vram/


Yes, but nonetheless it still has 4GB and access to all 4GB. There are latency issues when accessing the final 0.5GB.
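For reference, a naive sketch of why spilling past 3.5GB hurts. The bandwidth figures are the widely reported review numbers, not official specs, and the model is deliberately simplistic:

```python
# Rough model of the GTX 970's partitioned VRAM. Review figures put the
# 3.5 GB segment at ~196 GB/s and the last 0.5 GB segment at ~28 GB/s;
# treat both as approximate.
FAST_GB, FAST_BW_GBS = 3.5, 196.0
SLOW_GB, SLOW_BW_GBS = 0.5, 28.0

def segment_bandwidth(alloc_gb):
    """Bandwidth of the segment a given allocation high-water mark lands in."""
    if alloc_gb > FAST_GB + SLOW_GB:
        raise ValueError("GTX 970 only has 4 GB of VRAM")
    return FAST_BW_GBS if alloc_gb <= FAST_GB else SLOW_BW_GBS

print(segment_bandwidth(3.0))  # stays in the fast segment
print(segment_bandwidth(3.8))  # spills into the slow segment: 7x less bandwidth
```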


----------



## DADDYDC650

Quote:


> Originally Posted by *Woundingchaney*
> 
> Yes, but none the less it still has 4GB and access to 4GB. There is latency issues when accessing the final .5 GB.


Yes, just like I stated above. Garbage move by Nvidia.


----------



## ryanallan

I haven't heard anything regarding system memory requirements this time around. No 24GB min news. Guess that whole ordeal was a non issue...


----------



## Jpmboy

Quote:


> Originally Posted by *ryanallan*
> 
> I haven't heard anything regarding system memory requirements this time around. No 24GB min news. *Guess that whole ordeal was a non issue*...


to say the least.


----------



## Snaporz

Just got TimeSpy and the other benches. Stock TXP still (just being lazy until I install my waterblock...waiting on backplate) and haven't updated to new drivers yet.

http://www.3dmark.com/spy/300573


----------



## HyperMatrix

Quote:


> Originally Posted by *ryanallan*
> 
> I haven't heard anything regarding system memory requirements this time around. No 24GB min news. Guess that whole ordeal was a non issue...


Maximum RAM usage I've seen is about 13GB total, so it should be fine. The more VRAM you have, the less system RAM you really need for swapping data back and forth anyway. But I don't see why you wouldn't want at least 32GB of RAM for multitasking anyway. Heck, if the IMC in my 5960X didn't suck so bad I'd have gone with 128GB and set up a massive RAM drive. Haha.


----------



## CallsignVega

This is quite odd. I did a thin layer of CLU on both Titan XPs. On one, the power usage % dropped around 15%. On the other, it is as if I didn't put any CLU on at all, and max power usage is the same as OEM. Both have a thin coating of the stuff and I did them as identically as possible.


----------



## Testier

Quote:


> Originally Posted by *CallsignVega*
> 
> This is quite odd. Did the thin layer of CLU on both Titan-XP's. On one, the power usage % dropped around 15%. The other, it is like I didn't even put any CLU on and power usage max is the same as OEM. Both have a thin coating of the stuff and I did both as identical as possible.


Can you measure the resistance?


----------



## Yuhfhrh

Quote:


> Originally Posted by *CallsignVega*
> 
> This is quite odd. Did the thin layer of CLU on both Titan-XP's. On one, the power usage % dropped around 15%. The other, it is like I didn't even put any CLU on and power usage max is the same as OEM. Both have a thin coating of the stuff and I did both as identical as possible.


You probably need a thicker coating; make sure it goes completely from end to end.


----------



## DarkIdeals

So i THINK i figured out what the whole weird low fps issue was. Maybe.

In Witcher 3 i was getting an especially low framerate. With two TITAN X's at 2,000MHz i was only getting ~90fps at 3440x1440 with only 2x Hairworks AA, only regular SSAO, shadows turned down to high etc... and at the closest i could get to 4K (4587x1920, which is 8.8 million pixels, very close to the 8.3 million of 4K. Can't do exact 3840x2160 since i'm using an X34 ultra-wide.) my fps was down to high 50's to low 60's at most!

Tried EVERYTHING. I updated drivers, i changed every setting imaginable, i did DDU clean installs etc... and even did a whole new install of Windows 10. And it just HAPPENED today, after still having NO solution: i happened to turn off anti-aliasing in Witcher 3 and all of a sudden my fps in the starting Kaer Morhen area at 3440x1440 jumped from ~90fps to 130fps!!! Seriously... why the HELL is the generic W3 AA giving me a FORTY FPS DROP!!!

I'm now able to get ~95-100fps with 8x Hairworks AA, HBAO+, shadows and everything else on Ultra etc... at 3440x1440, and at 4587x1920 (4K equivalent) i'm getting ~80fps average.

The weirdest thing of all is that turning on AA caused a MASSIVE drop in GPU usage in this game! My GPU usage went from ~90-95% on card 1 and ~80-85% on card 2 down to ~80% on card 1 and ~65-70% on card 2 just by enabling AA. Turning off AA would cause my GPU usage to rise back to 90 and 80 again, respectively.

Anyone have a clue about this? My GPU usage still isn't "quite" what i should be getting (there's videos of people with TITAN XP in SLI getting 98-99% usage on both cards all the time in this game), but it's interesting to know that i found SOME kind of cause.

On a side note, i think i may also have figured out why G-Sync is causing low fps for some people using SLI. I noticed that when i would enable G-Sync, the game would force Alternate Frame Rendering mode 1 in NV Control Panel, and it REFUSED to change back to the default SLI setting. I would change to default and hit Apply and it'd instantly pop back to Alternate Frame Rendering 1. And with AFR1 my GPU usage in Witcher 3 without the AA went down from ~90% on card 1 and ~80% on card 2 to more like ~82% on card 1 and ~72% on card 2. Which may explain the low fps, especially if usage is lower for other people.
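The pixel math behind that "4K equivalent" custom resolution, as a quick sketch:

```python
# Pixel counts behind the "4K equivalent" custom resolution discussed above.
def megapixels(width, height):
    return width * height / 1e6

for w, h in [(3440, 1440), (4587, 1920), (3840, 2160)]:
    print(f"{w}x{h}: {megapixels(w, h):.1f} MP")
```

4587x1920 works out to ~8.8 MP, within about 6% of true 4K's ~8.3 MP, while native 3440x1440 is only ~5.0 MP.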


----------



## CallsignVega

Hmm, definitely strange. I've been using Titan-XP SLI with G-Sync and it's been working buttery smooth with maxed GPU usage.


----------



## MrKenzie

Quote:


> Originally Posted by *Phoenix81*
> 
> This site helped a lot. You can set an email notification to notify when the titan x is in stock (they check the nvidia page every 5 mins). But you have to hurry anyway. The stock only last like 10 mins.
> 
> https://www.nowinstock.net/computers/videocards/nvidia/gtxtitanp/


Within a few hours of subscribing to nowinstock I was able to order a TitanXP, if anyone else is having trouble definitely try it!


----------



## Gary2015

Quote:


> Originally Posted by *cisco0623*
> 
> I get the feeling a lot of these negative Titan X posts are from users who don't actually own a Titan X, but subconsciously wish they could. I have seen the same cycle of trolls for twenty years, every time a new card comes out. And before any stupid flame war starts - if you even feel the need to argue my post, that is proof alone. If you don't own a Titan X or plan on owning one, why bother posting about it? Do you think people who've purchased the Titan X are going to say "you're right, I'm going to return my card and buy a Jaton"?


But isn't that exactly the reason we have an owners-only thread? Not people planning or thinking about it, but REAL owners only?


----------



## Gary2015

Quote:


> Originally Posted by *MrKenzie*
> 
> Within a few hours of subscribing to nowinstock I was able to order a TitanXP, if anyone else is having trouble definitely try it!


I say this in a polite manner, but we don't really need a play-by-play chronicle of how you obtained the card. This is old news.


----------



## HyperMatrix

Quote:


> Originally Posted by *Gary2015*
> 
> I say this in a polite manner but we don't really need a play by play chronicle of how you obtain the card. This is old news .


He's an owner (or soon will be). He's sharing his excitement on his first day of becoming an OCN member. Come on man.


----------



## mustrum

My EK fullcover arrived yesterday. Did the shunt mod while being at it.
This card is so damn good!
So far running at 2050mhz but trying 2100 tonight.
It does not run into the power target so far. Max temperature while gaming for hours: 42 degrees Celsius.


----------



## bee144

Quote:


> Originally Posted by *HyperMatrix*
> 
> He's an owner (or soon will be). He's sharing his excitement on his first day of becoming an OCN member. Come on man.


Agreed, shaming a new fellow Titan XP member like that is very harsh. Sheesh.


----------



## HaniWithAnI

Quote:


> Originally Posted by *Gary2015*
> 
> I say this in a polite manner but we don't really need a play by play chronicle of how you obtain the card. This is old news .


I say this in a polite manner but we don't really need you speaking on behalf of everyone in the thread, especially if you're going to be such a downer. He's excited to be joining the ranks, why spoil that?


----------



## MrKenzie

Quote:


> Originally Posted by *Gary2015*
> 
> I say this in a polite manner but we don't really need a play by play chronicle of how you obtain the card. This is old news .


I wasn't aware it was old news, because I didn't see it mentioned in the previous 260+ pages. I will refrain from posting until I have something more relevant to share.


----------



## HyperMatrix

Quote:


> Originally Posted by *MrKenzie*
> 
> I wasn't aware it's old news because I didn't see it mentioned in the previous 260+ pages. I will refrain from posting until I have something more relevant to share.


Don't worry about what he said. Welcome to OCN.


----------



## aylan1196

Great card. 2088MHz with the hybrid cooler, max temps 50°C. The funny thing is my 1080 is dedicated to PhysX lol, need to sell it soon.


----------



## toncij

Quote:


> Originally Posted by *besthijacker*
> 
> This thread is just getting better and better.
> 
> 
> 
> 
> Here is setup and 3DMark. CPU runs at 4.6ghz


How satisfied are you with the Predator? Enough thermal capacity?

Quote:


> Originally Posted by *Silent Scone*
> 
> Lol yet not a single piece of data presented. I've more experience with SLI than all of you put together, therefore my word is final - back OT.
> 
> Good evening.


Hardly. I, for example, have used SLI since 1998. I guess you weren't an avid user back then?

No, micro-stutter is not a problem lately, especially with HB bridge (or dual flex). Frametimes are consistent and smooth.

The only real problem in an SLI debate is when comparing 1080 SLI to the Titan X (Pascal), since with good scaling its results come very close to the Titan X (Pascal).


----------



## xarot




----------



## Tideman

14600 total / 17582 graphics
http://www.3dmark.com/3dm/14234487

Same stuff going on though. I actually did a clean install of Windows, and when I first ran Time Spy I got a GPU score in the 14000 range. Noted my top card sat at 80% usage. Then when I started up my system this morning and ran it, I got the score linked above. This time both GPUs were at 99% usage.


----------



## DADDYDC650

Quote:


> Originally Posted by *Gary2015*
> 
> I say this in a polite manner but we don't really need a play by play chronicle of how you obtain the card. This is old news .


Damn dude. Let the guy share his experience. Not everyone can afford a TItan XP.


----------



## Lennyx

Just got my card. The waterblock is estimated to arrive on Monday, so I've got some planning to do concerning the loop. Need to go through my WC stuff; I've got some quick disconnect fittings somewhere.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Tideman*
> 
> 14600 total / 17582 graphics
> http://www.3dmark.com/3dm/14234487
> 
> Same stuff going on though. I actually did a clean install of windows and when I first ran timespy I got a gpu score in the 14000 range. Noted my top card sat at 80 usage. Then when I started up my system this morning and ran it I got the score linked above. This time both gpus were 99 usage..


A fresh start-up (and in the morning) will have temperatures lower overall and prevent some throttling. That's why the better score when booting up. Open up a window by your rig and set your cards' fans to 100% speed. Stick a house fan into your open rig. Do this early in the morning when it's cooler outside. Should get a better score.


----------



## dante`afk

Buttery smooth with microstutters


----------



## Jpmboy

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Fresh start up(and in morning time) will have temperatures lower overall and prevent some throttling. That's why the better score when booting up. Open up a window by your rig and set your cards to 100% speed. Stick a house fan into your open rig. Do this early in the morning when it's cooler outside. Should get a better score.


.... MrT reveals his secret special sauce.


----------



## Silent Scone

Quote:


> Originally Posted by *Jpmboy*
> 
> .... MrT reveals his secret special sauce.


lol you never tried that? I used to do it all the time in the winter periods. There will always be a variation in score if running the more sensitive tests straight from power on. Everything's nice and cold.

Don't have to wait for the oil pressure to rise or anything to rag it


----------



## markklok

As of today i'm a proud owner of a glue gun (all those lost years without one)

Extreme example.


Would something like this work to keep the CLU from running off, with the shunt mod?

Since my GPU is upside down


Spoiler: Warning: Spoiler!







Or would this be a bad idea (temperature wise) ?


----------



## DNMock

Quote:


> Originally Posted by *Silent Scone*
> 
> lol you never tried that? I used to do it all the time in the winter periods. There will always be a variation in score if running the more sensitive tests straight from power on. Everything's nice and cold.
> 
> Don't have to wait for the oil pressure to rise or anything to rag it


I've done that in the winter before, close off the room, leave the window open until the coolant temps are reading in the single digits and benchmark with a coat on.


----------



## Dr Mad

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *DarkIdeals*
> 
> So i THINK i figured out what the whold weird low fps issue was. Maybe.
> 
> In Witcher 3 i was getting especially low framerate. With two TITAN X's at 2,000mhz i was only getting ~90fps at 3440x1440p with only 2x Hairwworks AA, only regular SSAO, and shadows turned down to high etc.. and at the closest i could get to 4K (4587 x 1920, which is 8.8 million pixels, very close to the 8.3 mil of 4K. Can't do exact 3840x2160 since i'm using an X34 ultra-wide.) my fps was down to high 50's to low 60's at most!
> 
> Tried EVERYTHING. I updated drivers, i changed every setting imaginable, i did DDU clean installs etc.. and even did a whole new install of Windows 10. And just HAPPENED today, after still having NO solution; i just happened to turn off Anti-aliasing in Witcher 3 and all of a sudden my fps in the starting Kaer Morhen area at 3440x1440 jumped from ~90fps to 130fps!!! Seriously...why the HELL is the generic W3 AA giving me a FORTY FPS DROP!!!
> 
> I'm now able to get ~95-100fps with 8x hairworks AA, HBAO+, shadows and everything else on Ultra etc.. at 3440x1440, and at 4587x1920 (4K equivalent) i'm getting ~80fps average.
> 
> The weirdest thing of all is that i noticed turning on AA caused a MASSIVE drop in GPU usage in this game! My GPU usage went from ~90-95% on card 1 and ~80-85% on card 2; down to ~80% on card 1 and ~65-70% on card 2 just by enabling AA. Turning off AA would cause my GPU usage to raise back to 90 and 80 again, respectively.
> 
> Anyone have a clue about this? My GPU usage still isn't "quite" what i should be getting (There's videos of people with TITTAN XP in SLI getting 98-99% usage on both cards all the time in this game) but it's interesting to know th at i found SOME kind of cause.
> 
> On a side note, i think i also may have figured out why G-Sync is causing low fps for some people using SLI. I noticed that when i would enable G-Sync that the game would force Alternate Frame Rendering mode 1 in NV Control Panel; and it REFUSED to change back to default SLI setting. I would change to default and hit Apply and it'd instantly pop back to Alternate Frame Rendering 1. And with AF1 my GPU usage in Witcher 3 without the AA went down from ~90% on card 1 and ~80% on card 2 to more like ~82% on card 1 and ~72% on card 2. Which may explain the low fps, especialy if usage is lower for other people.






I had the same problem with 980 Ti SLI.
GPU usage was at 98/99% for both, but after a few seconds the first GPU's usage dropped to 80/84%, meaning a ~10/13 fps loss (Woesong Bridge at White Orchard - 75 fps average at 3440x1440, everything maxed + user tweaks + mods).
Disabling AA solved the problem.
I installed UGOM from Nexus and discovered that the game uses temporal AA, which I knew usually breaks SLI scaling (TXAA is one of these temporal AA techniques too).
So I enabled AA again but disabled the temporal AA, and it's fine now.


----------



## Jpmboy

Quote:


> Originally Posted by *Silent Scone*
> 
> lol you never tried that? I used to do it all the time in the winter periods. There will always be a variation in score if running the more sensitive tests straight from power on. Everything's nice and cold.
> 
> Don't have to wait for the oil pressure to rise or anything to rag it


lol - during winter my wife is dialing up the wood stove as I keep opening windows... a vicious cycle.... hence the water chiller.
Funny you should mention the oil pressure thing... was at Pocono raceway for a track morning a few weeks ago, I've always just verified oil pressure but would never spin up a motor until the oil temp was up >180F. Poo Poo from my buddies running 18PSI in the merge (all of which have blueprinted at least one motor) and then... this GM engineer steps in to see what we're running and says " oil pressure is whether you leave the pit lane, oil temp is whether you redline in the black lane".


----------



## cisco0623

Haha! At my old house I put the entire PC outside one time and ran the monitor and wireless keyboard/mouse inside through the patio door lol. Good to know I'm in the right forum now!


----------



## JackCY

Quote:


> Originally Posted by *bee144*
> 
> Agreed, the micro stutter's should be shamed before a new fellow Titan Xp member. Very harsh. Seesh


Quote:


> Originally Posted by *dante`afk*
> 
> Buttery smooth with microstutters


So the Titan XP suffers the same issues as the lower 10x0 cards lately? Stutter and audio pops on some systems no matter what driver is used, and whatever wannabe patch NV gives out?


----------



## xarot

I once put my whole PC out on the balcony, on some newspapers on the bench, during winter. I was using a Corsair H100 AIO back then, and unfortunately the liquid in the AIO froze and suddenly temps skyrocketed. It was only around -17°C outside at the time. The pump in that AIO still works though.

I once tried the same with my i7-990X during the winter and idle temps went to -7°C in RealTemp. I guess it was almost freezing then too... heh.


----------



## Gary2015

Quote:


> Originally Posted by *DADDYDC650*
> 
> Damn dude. Let the guy share his experience. Not everyone can afford a TItan XP.


Apologies, my bad. NMS ruined my week.


----------



## JackCY

Quote:


> Originally Posted by *Gary2015*
> 
> Apologies, my bad. NMS ruined my week.


How many FPS in One Man's Lie with a Titan XP? Anyone know?

That piece-of-junk engine probably doesn't run fast on anything, let alone stably.


----------



## toncij

Quote:


> Originally Posted by *JackCY*
> 
> How much FPS in One Man's Lie with Titan XP? Anyone knows?
> 
> 
> 
> 
> 
> 
> 
> 
> That piece of junk engine probably doesn't run on anything fast let alone stable.


Not much. It lags a lot. But there is a trick: set max FPS to MAX. Might help a bit.


----------



## Testier

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Fresh start up(and in morning time) will have temperatures lower overall and prevent some throttling. That's why the better score when booting up. Open up a window by your rig and set your cards to 100% speed. Stick a house fan into your open rig. Do this early in the morning when it's cooler outside. Should get a better score.


Are you gonna try that in our winter, or are you scared of your WC pipes bursting?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Testier*
> 
> Are you gonna try that in our winter or you scared of your WC pipes bursting?


not afraid, look:


----------



## MunneY

So I'm sitting here... with my Titan XP and my EK block and I just can't make myself get motivated to put it on! Whats wrong with me guys?!


----------



## deafmetal

I'm just waiting for their backplates to be available first (and hopefully PPC carrying both the EKWB blocks and plates).


----------



## JackCY

Quote:


> Originally Posted by *MunneY*
> 
> So I'm sitting here... with my Titan XP and my EK block and I just can't make myself get motivated to put it on! Whats wrong with me guys?!


You can always donate it to me if you don't like it.


----------



## lilchronic

Quote:


> Originally Posted by *Testier*
> 
> Are you gonna try that in our winter or you scared of your WC pipes bursting?


Quote:


> Originally Posted by *MrTOOSHORT*
> 
> not afraid, look:


It's tough to beat MrTOOSHORT; he's got that Canadian cold air.









Down here in Florida I'll be lucky to have one night where it's 0°C.


----------



## GnarlyCharlie

Quote:


> Originally Posted by *MunneY*
> 
> So I'm sitting here... with my Titan XP and my EK block and I just can't make myself get motivated to put it on! Whats wrong with me guys?!


----------



## toncij

Quote:


> Originally Posted by *MunneY*
> 
> So I'm sitting here... with my Titan XP and my EK block and I just can't make myself get motivated to put it on! Whats wrong with me guys?!


Can't even buy it normally over here, so if you don't need both, I'll take it off your hands...


----------



## Zurv

ugh... only using 2 cards ... i have some kinda baby system now!
I think the last time i had less than 3 cards in my system was sometime in 2011!!



now for project two.. capturing high quality game play at 4K 60fps. i have 3 NVMe drives (system, game, and capture drive) - dxtory using MagicYUV.. then x264 it for 10 hours.. (then have youtube murder the quality







)
The main problem @ 60 fps is the HD can't write fast enough for lossless RGB 4:4:4:4

i just picked up the OCZ RD400 1TB (my other drives are an Intel 750 1.2TB and a 950 Pro). Expect some random Witcher 3 video!!


----------



## toncij

Quote:


> Originally Posted by *Zurv*
> 
> ugh... only using 2 cards ... i have some kinda baby system now!
> I think the last time i had less than 3 cards in my system was sometime in 2011!!
> 
> 
> 
> now for project two.. capturing high quality game play at 4K 60fps. i have 3 NVMe drives (system, game, and capture drive) - dxtory using MagicYUV.. then x264 it for 10 hours.. (then have youtube murder the quality
> 
> 
> 
> 
> 
> 
> 
> )
> The main problem @ 60 fps is the HD can't write fast enough for lossless RGB 4:4:4:4
> 
> i just picked up the OCZ 400 1TB (my other drivers are intel 750 1.2TB and the 950 pro). expect some random witcher 3 video!!


The 750 should handle a lot - ~1.2GB/s sequential writes. That should be enough for 60Hz 1080p at 8-bit, with some room to spare...


----------



## DarkIdeals

Quote:


> Originally Posted by *xarot*
> 
> I once put my whole PC to the balcony over some newspapers on the bench during winter. I was using Corsair H100 AIO back then and unfortunately the liquid in the AIO froze and suddendly temps skyrocketed. It was only around -17c outside at that time. The pump in that AIO still works though.
> 
> I once tried the same with my i7-990X during the winter and idle temps went to -7c in RealTemp. I guess it was almost freezing at that time too...heh.


uh-huh.....

*only* -17C....

"*almost* freezing" he says...









Quote:


> Originally Posted by *toncij*
> 
> 750 should handle a lot - 1.2Gbps. That should amount as enough for 60Hz 1080 at 8-bit and some room left....


Speaking of fast stuff: anyone read the article about that new crazy 100GB/s PCI-e 4.0 x16 device? PCI-e 4.0 just got announced, which is likely coming around the Z270 era, so that'd be an interesting combo. I'll have to see if I can find the article again; I just read it yesterday.

EDIT: Found it: http://videocardz.com/63305/pci-express-4-0-to-arrive-next-year. Edited some typos/errors too. It was a 100Gb/s NIC, rather than a 1000Gb/s storage drive like I originally thought, my bad. Still interesting though.
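For a rough sense of scale, PCI-e per-direction bandwidth can be sanity-checked from the transfer rate and the 128b/130b line encoding (a back-of-the-envelope sketch; the helper function name is my own):

```python
# Per-direction PCI-e bandwidth: rate (GT/s) * encoding efficiency * lanes / 8 bits
def pcie_gb_per_s(gt_per_s: float, lanes: int) -> float:
    return gt_per_s * (128 / 130) * lanes / 8

gen3_x16 = pcie_gb_per_s(8, 16)    # PCIe 3.0 x16: ~15.8 GB/s
gen4_x16 = pcie_gb_per_s(16, 16)   # PCIe 4.0 x16: ~31.5 GB/s
nic_gb = 100 / 8                   # a 100 Gb/s NIC needs ~12.5 GB/s

print(f"gen4 x16: {gen4_x16:.1f} GB/s, 100Gb NIC: {nic_gb:.1f} GB/s")
```

So a 100Gb/s NIC would fit comfortably in a 4.0 x16 slot, with bandwidth to spare.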


----------



## Zurv

Quote:


> Originally Posted by *toncij*
> 
> 750 should handle a lot - 1.2Gbps. That should amount as enough for 60Hz 1080 at 8-bit and some room left....


i think you missed the part with the 4k









the OCZ RD400 NVMe has faster writes than the Intel - I'm still bumping into "(!) disk" warnings, but it isn't too bad. I still take like a 20+ FPS hit.. but it is mostly over 60 when cap'n.
review of the SSD (OCZ name was bought by Toshiba) : http://www.thessdreview.com/featured/ocz-rd400-nvme-ssd-review-256gb512gb1tb/

but 4K lossless cap'n is about 200+ gigs every 5 min (with MagicYUV), then over an hour of encoding for every 5 min... after encoding it is about 3.5 gigs per 5 min. (I have this odd feeling that one can't tell the diff once it is on YouTube. Thoughts? Does crappy GeForce Experience capture look the same as doing it the "right" way?)
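Those numbers line up with some quick math on raw 4K 60 fps RGB capture (a sketch - the ~2.2:1 compression figure is inferred from the ~200 GB per 5 min reported above, not a MagicYUV spec):

```python
# Rough throughput math for lossless 4K 60 fps 8-bit RGB capture.
WIDTH, HEIGHT, FPS = 3840, 2160, 60
BYTES_PER_PIXEL = 3  # 8-bit RGB 4:4:4

raw_bytes_per_s = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS
raw_gb_per_s = raw_bytes_per_s / 1e9           # ~1.49 GB/s sustained
raw_gb_per_5min = raw_bytes_per_s * 300 / 1e9  # ~448 GB uncompressed

print(f"raw: {raw_gb_per_s:.2f} GB/s, {raw_gb_per_5min:.0f} GB per 5 min")
```

~448 GB raw vs the ~200 GB actually written implies MagicYUV is getting roughly 2.2:1 lossless compression; even so, the ~0.67 GB/s average leaves little headroom on a ~1.2 GB/s drive once write bursts and game I/O share it - which would explain the disk warnings.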


----------



## habu58

Quote:


> Originally Posted by *Zurv*
> 
> ugh... only using 2 cards ... i have some kinda baby system now!
> I think the last time i had less than 3 cards in my system was sometime in 2011!!
> 
> now for project two.. capturing high quality game play at 4K 60fps. i have 3 NVMe drives (system, game, and capture drive) - dxtory using MagicYUV.. then x264 it for 10 hours.. (then have youtube murder the quality
> 
> 
> 
> 
> 
> 
> 
> )
> The main problem @ 60 fps is the HD can't write fast enough for lossless RGB 4:4:4:4
> 
> i just picked up the OCZ 400 1TB (my other drivers are intel 750 1.2TB and the 950 pro). expect some random witcher 3 video!!


Have you ever tried using Bandicam? The footage doesn't look like uncompressed footage, but it's pretty darn close, with much more reasonable file sizes.


----------



## MunneY

Alright, I've caved











This whole, tiny screw into a hex bolt is quite possibly the stupidest thing I've ever seen. Seriously.

Not bad temps though, and that 2100mhz is 100% flat line, game stable :-D


----------



## toncij

Quote:


> Originally Posted by *MunneY*
> 
> Alright, I've caved
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This whole, tiny screw into a hex bolt is quite possibly the stupidest thing I've ever seen. Seriously.
> 
> Not bad temps though, and that 2100mhz is 100% flat line, game stable :-D


Is that under load?
Quote:


> Originally Posted by *Zurv*
> 
> i think you missed the part with the 4k
> 
> 
> 
> 
> 
> 
> 
> 
> 
> the OCZ 400 NVMe has faster write than the intel - i'm still bumping into "(!) disk" warnings but it isn't to bad. I still take like a 20+ FPS hit.. but it is mostly over 60 when cap'n.
> review of the SSD (OCZ name was bought by Toshiba) : http://www.thessdreview.com/featured/ocz-rd400-nvme-ssd-review-256gb512gb1tb/
> 
> but 4k cap'n lossless is about 200+gigs every 5min (with magicyuv) then over an hour to encode for every 5min... after encoded it is about 3.5gigs for every 5min. (i have this odd feeling that one can't tell the diff once it is on youtube. thoughts? crappy geforce exp capture look the same as doing it the "right" way?)


I did, yes. Sorry. Doesn't Shadowplay compress on-the-fly?


----------



## Zurv

shadowplay looks real bad.. but it might not matter after YT screws it up real good







Just 5min of 4K@60 x264 encoding takes me an hour (on a 10-core CPU @ 4.5GHz)


----------



## MunneY

Quote:


> Originally Posted by *toncij*
> 
> Is that under load?


Yup. It bounces between that and 34. I have a 480mm Monsta in front of the card.


----------



## toncij

Quote:


> Originally Posted by *MunneY*
> 
> Yup. It bounces between that and 34. I have a 480mm Monsta in front of the card.


And that's a single card? And at what clock is the 5960X? I have a problem with my 5960X - it easily runs 4.45GHz at 1.26V, which is not a lot, but the H115i can't cool it and, under load, it goes to 80-ish degrees - which is awful.
I'm planning to move to an EK loop, but I'm still wondering whether a 360 would be enough for that and dual Titans...


----------



## HyperMatrix

Quote:


> Originally Posted by *Zurv*
> 
> ugh... only using 2 cards ... i have some kinda baby system now!
> I think the last time i had less than 3 cards in my system was sometime in 2011!!
> 
> 
> 
> now for project two.. capturing high quality game play at 4K 60fps. i have 3 NVMe drives (system, game, and capture drive) - dxtory using MagicYUV.. then x264 it for 10 hours.. (then have youtube murder the quality
> 
> 
> 
> 
> 
> 
> 
> )
> The main problem @ 60 fps is the HD can't write fast enough for lossless RGB 4:4:4:4
> 
> i just picked up the OCZ 400 1TB (my other drivers are intel 750 1.2TB and the 950 pro). expect some random witcher 3 video!!


Just wait for OBS to support HEVC for hardware encoded 444 60fps 4K. Right now it's broken with h264 nvenc where it looks brilliant on the top half of the screen, but has a green tint/overlay on the bottom half.

The new encoder in pascal is one of the reasons I upgraded.


----------



## MunneY

Quote:


> Originally Posted by *toncij*
> 
> And it's a single card and a t what clock is 5960X? I have a problem with my 5960X - it easily runs 4,45GHz at 1.26V, which is not a lot, but H115i can't cool it and it, under load, goes to 80ish deg.-- which is awful.
> I'm planing to move to EK loop but still thinking if 360 would be enough for that and dual Titans...


What I have is massive overkill honestly, but I'm a temp/quiet freak.

I have an external Monsta 480, GPU, CPU, then 3 240mm Darkside slims. I ran 2 Titan XM's and a 5960x and never had any issues with the 3 internal rads. I really need to get a new case cut and re-do my loop.

As for a 360, I think you'll be pushing the limits of the cooling, especially if you are doing a big OC. This chip can do 4.8 @ 1.4V or 4.5 @ 1.25V and stays in the 60s with this setup.


----------



## spyui

Does anyone have benchmarks for a Titan XP vs GTX 1080 SLI, both at max OC? How much slower is the Titan XP than GTX 1080 SLI?


----------



## toncij

Quote:


> Originally Posted by *spyui*
> 
> Does anyone have benchmark for titan xp vs gtx 1080 sli both max OC ? How much is titan xp slower than gtx 1080 sli ?


I do.







Give me a moment to dig up my excel..



http://imgur.com/lKA1V


----------



## Zurv

Quote:


> Originally Posted by *spyui*
> 
> Does anyone have benchmark for titan xp vs gtx 1080 sli both max OC ? How much is titan xp slower than gtx 1080 sli ?


Clearly the 1080 SLI will be much faster... and the answer is also clear: get two Titan XPs! (And water cool them... IMO anyone dropping cash for an XP is losing too much perf to thermal throttling not to water cool. It isn't like money is holding you back.)


----------



## spyui

Quote:


> Originally Posted by *Zurv*
> 
> Clearly the 1080 sli will be much faster... it is also clear the answer. Get two titan xp! (and water cool... IMO anyone dropping for a xp is losing to much perf (thermal throttle) not to water cool. It isn't like money is holding you back.


I know, but I plan to sell my GTX 1080 SLI for a Titan XP. 50% of the games I play don't support SLI, and I think letting my second GTX 1080 sit idle half the time is a waste.


----------



## toncij




----------



## spyui

Can anyone post a Time Spy score for one Titan XP running at 2100MHz?


----------



## Zurv

Quote:


> Originally Posted by *spyui*
> 
> I know but I plan to sell my gtx 1080 sli for titan xp. 50 % of my games i play don't support sli and i think letting my second gtx 1080 sitting idle 50% of the time is a waste.


Which games that you're playing don't support SLI (and would need it)? Most games with fancy grafix over the last few years support SLI. You should be able to get about $600 for your 1080s. That is what I sold my billion 1080s for.


----------



## spyui

Quote:


> Originally Posted by *Zurv*
> 
> Which games don't support SLI that you are playing (that would need it) - most games with fancy grafix over the last few years support SLI. You should be able to get about $600 for your 1080s. That is what i sold my billion 1080s for.


Total War: Warhammer and World of Warcraft. I have an offer of $1350 for my 1080 SLI, so I'm going to see if it's worth it to swap. If a max-OC Titan XP is only 15-20% slower than 1080 SLI, then I'm definitely going to get the Titan.


----------



## toncij

Quote:


> Originally Posted by *spyui*
> 
> Total War warhammer and world of warcraft . I have an offer $1350 for my 1080 sli so i am going to see if its worth it to swap. If titan xp max OC is only 15-20% slower than 1080sli than i am definitely going to get the Titan.


But WoW supports SLI... sort of. Judging by GPU usage, it doesn't even support a single card correctly.


----------



## Zurv

Quote:


> Originally Posted by *spyui*
> 
> Total War warhammer and world of warcraft . I have an offer $1350 for my 1080 sli so i am going to see if its worth it to swap. If titan xp max OC is only 15-20% slower than 1080sli than i am definitely going to get the Titan.


both support SLI.. but wow still loves to hump the CPU more than the GPU.


----------



## spyui

Total War: Warhammer SLI scaling is really poor; I only get 10 more fps compared to a single 1080. I heard the upcoming games that use the Unity engine won't support SLI.


----------



## Jpmboy

Quote:


> Originally Posted by *MunneY*
> 
> Alright, I've caved
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This whole, tiny screw into a hex bolt is quite possibly the stupidest thing I've ever seen. Seriously.
> 
> Not bad temps though, and that 2100mhz is 100% flat line, game stable :-D


nice card bro! Flat 2100 is very strong!
Quote:


> Originally Posted by *spyui*
> 
> Can anyone post Time Spy score for 1 titan XP running at 2100mhz ?


look here: http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/0_20


----------



## cisco0623

Quote:


> Originally Posted by *MunneY*
> 
> What I have is massive overkill honestly, but I'm a temp/quiet freak.
> 
> I have an external Monsta 480, GPU, CPU, then 3 240mm Darkside slims. I ran 2 Titan XM's and a 5960x and never had any issues with the 3 internal rads. I really need to get a new case cut and re-do my loop.
> 
> As for a 360, i think you'll be pushing the limits of the cooling especially if you are doing big OC. This chip can do 4.8 @1.4 or 4.5 at 1.25 and stays in the 60s with this setup


We are brothers from another mother! I have the original Feser Monsta and two Alphacool Monstas (120.3 and 120.2). I haven't put the new Titan under water yet as I'm waiting for the backplate, but my system is super quiet and basically keeps everything just above room temp. It was a crazy investment, but now I love it.


----------



## toncij

Quote:


> Originally Posted by *spyui*
> 
> Total war Warhammer SLi scaling is really poor, I only get 10 more fps compare to single 1080. I heard the upcoming games that use Unity engine won't support SLI.


There is no reason for that. Unity engine developers can carefully code the game to support it. SLI (in DX11) is mostly about not breaking it, rather than having to support it actively.


----------



## axiumone

Gamers Nexus hybrid build testing. Pretty much what we all figured out about Pascal: it's limited by thermals, and when thermals are under control it's severely starved for power.


----------



## MunneY

Quote:


> Originally Posted by *axiumone*
> 
> Gamers nexus hybrid build testing. Pretty much what we all figured out about pascal. It's limited by thermals and when thermals are under control it's severely starved for power.


I concur... I could tame the temps on air and have them down further with water, but until I can get past 120% power, I'm just stuck.

Where oh where is the Pascal BIOS editor?


----------



## DNMock

Quote:


> Originally Posted by *MunneY*
> 
> I concur... I could tame the temps on air and have them down with water, but until I can get past 120% power then I'm just stuck.
> 
> Where o where is the Pascal Bios editor


I wish I knew. 2050 puts me face first into the 120% wall.


----------



## MunneY

Quote:


> Originally Posted by *DNMock*
> 
> I wish I knew. 2050 puts me face first into the 120% wall.


I basically got nothing out of watercooling except for not sitting beside a jet engine.


----------



## dante`afk

Quote:


> Originally Posted by *MunneY*
> 
> Not bad temps though, and that 2100mhz is 100% flat line, game stable :-D


doubt it.


----------



## MunneY

Quote:


> Originally Posted by *dante`afk*
> 
> doubt it.


----------



## markklok

If watercooling gets me a card with no noise and a stable 2050 core... then I'm happy. The last 2-3% are nice, but no showstopper.


----------



## axiumone

Quote:


> Originally Posted by *dante`afk*
> 
> doubt it.


_In the middle of the journey of our life I came to myself within a dark wood where the straight way was lost._

-Dante

Seems fitting.


----------



## Godm0de

Quote:


> Originally Posted by *DADDYDC650*
> 
> I'll download it later today. What's ur max stable OC? I'd imagine it would throttle so it's hard to pin down max stable OC.


I don't know why, but since I mounted the water block, my card doesn't hit the power limit as often as it did on air. It seems that a high core temperature increases power draw a lot.


----------



## stefxyz

Quote:


> Originally Posted by *dante`afk*
> 
> doubt it.


Actually, I doubt that too. Maybe in some less power-hungry games, but I have yet to see a card that holds 2100 in Witcher 3 without fluctuating below it. Maybe if your ambient is below 2 degrees Celsius...


----------



## stefxyz

Quote:


> Originally Posted by *spyui*
> 
> Can anyone post Time Spy score for 1 titan XP running at 2100mhz ?


Yes, mine:

http://www.3dmark.com/3dm/14256614?

Don't be mistaken though - 2101 doesn't mean it keeps that clock all the time... Here I got a slightly higher score even though max clocks are slightly lower:

http://www.3dmark.com/spy/299262

A 10774 GPU score is, I think, a very good result.

I am still No. 47 in the Hall of Fame, and I think no one above me has only a 4-core CPU.


----------



## mustrum

Quote:


> Originally Posted by *stefxyz*
> 
> Yes mine::
> 
> http://www.3dmark.com/3dm/14256614?
> 
> Dont be mistaken tho. 2101 doesnt mean it keeps the clock all the time.... Here I got slightly higher even tho max clocks are slightly lower:
> 
> http://www.3dmark.com/spy/299262
> 
> 10774 GPU score I think is a very good result.
> 
> I am still Nr 47 Hall of Fame and I think no one above me has only a 4 Core CPU.


Benchmarks mean nothing. Play games. With water and a shunt mod, mine still goes down to 2050 in W3 and No Man's Sky.


----------



## stefxyz

No, I disagree. Benchmarks are a very good indicator of where your performance really is. Thanks to the controlled environment, they are also comparable across users, while games are not.


----------



## lilchronic

Quote:


> Originally Posted by *Godm0de*
> 
> I don't know why, but since I have mounted the water block, my card doesn't reach the power limit as often as with air. It seems that the high temperature of the core increases power draw a lot.


The fan probably uses 5%-10% of GPU TDP. Although at lower temps it should be more efficient.


----------



## Gary2015

Quote:


> Originally Posted by *JackCY*
> 
> How much FPS in One Man's Lie with Titan XP? Anyone knows?
> 
> 
> 
> 
> 
> 
> 
> 
> That piece of junk engine probably doesn't run on anything fast let alone stable.


I got like 35 fps with dual TXPs. I heard they patched it, but I got a refund already.


----------



## Gary2015

Quote:


> Originally Posted by *MunneY*
> 
> I basically got nothing out of watercooling except for not sitting beside a jet engine.


Not to mention spending a couple of hundred bucks


----------



## Baasha

A glorious sight:










...but 2 of them are absolutely useless since games are gimped to 2-Way SLI!









behold, 4-Way SLI using 2 GPUs:


----------



## Silent Scone

Quote:


> Originally Posted by *Baasha*
> 
> A glorious sight:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ...but 2 of them are absolutely useless since games are gimped to 2-Way SLI!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> behold, 4-Way SLI using 2 GPUs:


----------



## HyperMatrix

Quote:


> Originally Posted by *Baasha*
> 
> A glorious sight:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ...but 2 of them are absolutely useless since games are gimped to 2-Way SLI!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> behold, 4-Way SLI using 2 GPUs:


Have you tried it with Tomb Raider? It uses DX12 multi-GPU, which doesn't care about all that SLI hoopla. It's very taxing on the GPU, and I run out of GPU power before I hit a CPU bottleneck, which is rare. It could give you some hope for awesome performance in DX12 titles in the future.

Also...what's that fan hanging out over top of your CPU block? I'm interested...


----------



## Gary2015

Quote:


> Originally Posted by *Baasha*
> 
> A glorious sight:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ...but 2 of them are absolutely useless since games are gimped to 2-Way SLI!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> behold, 4-Way SLI using 2 GPUs:


Huh? Are the other two paperweights ?


----------



## besthijacker

Quote:


> Originally Posted by *toncij*
> 
> How are you satisfied with the Predator? Enough thermal capacity?


It's pretty great. GPU is around 50C at full load.


----------



## Jpmboy

Quote:


> Originally Posted by *dante`afk*
> 
> doubt it.


lol - a thermal-leakage denier?
I see a bunch of folks with TXPs under water suddenly running clocks over 2100, steady, through any number of different loads. Or, in virus-mode loads (like 3DMark), running at higher clocks than stock coolers can manage.


----------



## Glzmo

Quote:


> Originally Posted by *Gary2015*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JackCY*
> 
> How much FPS in One Man's Lie with Titan XP? Anyone knows?
> 
> 
> 
> 
> 
> 
> 
> 
> That piece of junk engine probably doesn't run on anything fast let alone stable.
> 
> 
> 
> I got like 35 fps with dual TXP. I heard they patched it but I got refund already .

Even without a patch, the release version of the game ran at 100+ FPS with maxed-out details at 3840x2160 on one Titan X (Pascal) for me (GOG.com version). I had to turn off Vsync and set MaxFPS to unlimited, though; otherwise the framerate was abysmal, just like Gary's. I haven't tried a patch yet.


----------



## KillerBee33

So I slapped on EVGA's AIO, used GELID Extreme, and I'm not very happy. During Time Spy I get 50 degrees, but VRel throttling still shows up as soon as GPU usage is over 20%.








EDIT: the result is worse than the AIO on my 980 using factory TIM.


----------



## DADDYDC650

Quote:


> Originally Posted by *KillerBee33*
> 
> So i slapped EVGAs AIO , used GELID Extreme and i'm not very happy , During TimeSpy i get 50 degrees Vrel still exists as soon as GPU usage is over 20%
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT : the result is worse than AIO on 980 using factory TIM


Meh. The Hybrid will lower temps and noise levels, but that's pretty much it.


----------



## Glzmo

Quote:


> Originally Posted by *lilchronic*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Godm0de*
> 
> I don't know why, but since I have mounted the water block, my card doesn't reach the power limit as often as with air. It seems that the high temperature of the core increases power draw a lot.
> 
> 
> 
> The fan probably uses 5%-10% of GPU TDP. Although at lower temps it should be more efficient.

Indeed. With the card sitting idle and the fan at 23% speed, reported power is 10% for me; 80% fan speed uses 15% power, and if I crank the fan up to 100% it's 21% power. So running the fan at full speed costs around 11% more power than the lowest speed.
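Those readings can be framed as TDP headroom the stock fan eats (a toy sketch; the percentages are the readings above from one card, not a general rule):

```python
# Fan speed (%) -> reported board power (% of TDP), from the readings above.
readings = {23: 10, 80: 15, 100: 21}

idle_fan_power = readings[23]
for speed in sorted(readings):
    extra = readings[speed] - idle_fan_power
    print(f"fan {speed:3d}%: {readings[speed]:2d}% TDP (+{extra}% vs minimum fan speed)")
```

On a power-limited Pascal card, that ~11% is budget the GPU itself could otherwise spend, which is one plausible reason water-blocked cards hit the power limit less often.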


----------



## MrTOOSHORT

Some EK TXP pics:







Still getting a nickel back plate later next month.


----------



## Celcius

Looking good there


----------



## toncij

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Some EK TXP pics:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Still getting a nickel back plate later next month.


What's that you're cooling below the TXP?


----------



## unreality

Got my Aquacomputer block. Pretty sweet! It didn't go above a 12°C delta over room temperature.









Besides the clocks being steady and stable now, I couldn't squeeze more than +20 on the core, though. Let's hope for voltage adjustments and/or BIOS mods soon.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Celcius*
> 
> Looking good there


Thanks!

Quote:


> Originally Posted by *toncij*
> 
> What's that you're cooling below the TXP?


Intel 750 NVMe drive.


----------



## toncij

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Thanks!
> Intel 750 NVMe drive.


Ahh, thought so, but I wasn't aware you could water cool it. I know these get hot, but I didn't know it was to that point. Does EK sell a 750 block?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *toncij*
> 
> Ahh,. thought so, but wasn't aware you could water cool it. I know these get hot, but didn't know it's up to that point. EK sells 750's block?


Yes, it does keep the drive cooler, but I got the block just because and for the looks.

Here is a thread on it:

*http://www.overclock.net/t/1576721/ek-is-releasing-intel-ssd-750-series-water-block*


----------



## KillerBee33

Can you guys suggest a better way of applying TIM? I used about two matchstick heads' worth in the middle and the results aren't great.


----------



## Seyumi

Quote:


> Originally Posted by *KillerBee33*
> 
> Can you guys suggest a better way for applying TIM, used about 2 matchstick heads worth in the middle and results aren't great.


Question - Are you able to squeeze another GPU with the hoses oriented like that or would you need to take more of the shroud off? (0 slot spacing)


----------



## KillerBee33

Quote:


> Originally Posted by *Seyumi*
> 
> Question - Are you able to squeeze another GPU with the hoses oriented like that or would you need to take more of the shroud off? (0 slot spacing)


I'd say it's about 2-3mm thinner with the pump and tubes and without that shroud piece. BTW, I left the reference blower on.
I'll take it apart later; gonna try to re-paste it.


----------



## unreality

Quote:


> Originally Posted by *KillerBee33*
> 
> Can you guys suggest a better way for applying TIM, used about 2 matchstick heads worth in the middle and results aren't great.


I used a pea-sized amount, but in a slightly vertical line because the chip isn't a perfect square. Getting good results with the waterblock so far!


----------



## DADDYDC650

Anyone get the DAC to work running Windows 10 Anniversary Update?


----------



## dante`afk

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - a thermal-leakage denier?
> I see a bunch of folks with TXPs under water suddenly running clocks over 2100, steady, through any number of different loads. Or in virus-mode loads (like 3D Mark), run at higher clocks than stock coolers can manage.


Quote:


> Originally Posted by *stefxyz*
> 
> Actually I doubt that too. May be in some not so power hungry games but I yet have to see a card that does Witcher 3 2100 *without fluctuating below*. May be if your ambient is below 2 degrees celsius....


this

but hey, he can surely make a video in Witcher 3 showing how the 2100 holds forever, right?


----------



## MunneY

Quote:


> Originally Posted by *dante`afk*
> 
> this
> 
> but hey, he can surely make a video in witcher 3 how the 2100 stays forever, right?


Since I don't play or own Witcher 3, no, I can't.


----------



## Silent Scone

Quote:


> Originally Posted by *dante`afk*
> 
> this
> 
> but hey, he can surely make a video in witcher 3 how the 2100 stays forever, right?


What is it you're not understanding about his statement, and why do you doubt the claims? It sounds like you don't understand how GPU Boost works, given the temperatures some of these guys are playing at. Including me.
Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Some EK TXP pics:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Still getting a nickel back plate later next month.


Mine's a little more industrial-looking at the moment, lol.


----------



## Ascendor81

Quote:


> Originally Posted by *Glzmo*
> 
> Indeed. With the card sitting idle and the fan speed at 23% the power used is 10% for me, 80% fan speed uses 15% power, if I crank the fan up to 100% speed it's 21% power. So running the fan at full speed costs around 11% more power than the lowest speed.


So, if we disconnect the fan from the card's PCB and route it to the motherboard in our EVGA Hybrid kit, will that allow more power to the GPU?


----------



## Jpmboy

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Some EK TXP pics:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Still getting a nickel back plate later next month.


I was hoping EK would come out with a staggered bridge of some kind for the 750 block so it could be looped in as part of a parallel unit - too many issues with alignment and swaps.


----------



## dante`afk

Quote:


> Originally Posted by *Silent Scone*
> 
> What is it you are not understanding about his statement, or why do you doubt the claims? Sounds like you don't understand how GPU Boost works given the temperatures some of these guys are playing at. Including me.


No, I want to believe, 2100 stable without fluctuating in games would be awesome.

Quote:


> Originally Posted by *KillerBee33*
> 
> Can you guys suggest a better way for applying TIM, used about 2 matchstick heads worth in the middle and results aren't great.


Your GPU radiator does not get any fresh air - what are your temps? Either put a blob in the middle or spread it thin over the whole die.

I have mine at the front.


----------



## Jpmboy

Quote:


> Originally Posted by *jscheema*
> 
> So, if we disconnect the FAN from the cards' PCB, and route it to the motherboard in our EVGA Hybrid kit. Will that allow more power to the GPU?


AFAIK, the voltage rail for the fans and lights is independent of the rails on which TDP is measured. The power limit that causes a drop in clock bins is separate (in every BIOS I have modded) from the TDP summation.
Quote:


> Originally Posted by *dante`afk*
> 
> No, I want to believe, 2100 stable without fluctuating in games would be awesome.
> Your GPU radiator does not get any fresh air, what are you temps? Either a blob in the middle or apply thin over the whole DIE.
> 
> I have mine at the front.


So this is where there is a bit of confusion: thermal throttling and power-limit throttling are not independent, as you know... and whether or not a card can hold a steady frequency under load for extended periods of time (e.g. while gaming) is only testable once K-Boost is enabled (or, for now, if you set all the P-states in NVI to the frequency you want held steady - essentially locking the card in the P0 state). Then, load variations - which ALL GAMES HAVE - will cause load-based downclocking, which can be distinguished from thermal and power throttling. AFAIK, there's no other way to deconvolute the contributing factors without access to the BIOS settings. Some games, and several Futuremark benchmarks, trip the low-level virus-mode protection trap and cause clock-bin drops (e.g., the same way E-series CPUs drop frequency when AVX is detected in the execution stack). I would bet that some games have procedure calls that NV has labeled as a virus-mode instruction set... leading to clock-bin drops. AFAIK, this execution-stack instruction trap is new with Pascal.
Just my two cents.


----------



## lilchronic

Quote:


> Originally Posted by *Jpmboy*
> 
> AFAIK, the voltage rail for the fans and lights is independent of the rails on which TDP is measured. The power limit that causes a drop in clock bins is separate (in every bios I have modded) from the TDP summation.
> So this is where there is a bit of confusion: thermal throttling and power limit throttling are not independent, as you know... and whether or not a card can hold a steady frequency under load for extended periods of time (e.g. while gaming) is only testable once K-boost is enabled (or if you set all the P-states to the frequency you want held steady in NVI for now - essentially locking the card in the P0 state). Then, load variations (which ALL GAMES HAVE) will cause load-based down-clocking, which can be distinguished from thermal and power throttling. AFAIK, there's no other way to deconvolute the contributing factors without access to the bios settings. Some games, and several Futuremark benchmarks, trip the low-level virus-mode protection trap and cause clock bin drops (e.g. the same way E-class CPUs drop frequency when AVX is detected in the execution stack). I would bet that some games have procedure calls that NV has labeled as a virus-mode instruction set... leading to clock bin drops. AFAIK, this execution-stack instruction-set trap is new with Pascal.
> Just my


This has been around since the 600 series cards. You hit the power limit, you throttle.

The fan is still pulling power through the card. EVGA was even using the low power draw fans on the cards for more overclocking headroom.


----------



## cookiesowns

Quote:


> Originally Posted by *lilchronic*
> 
> This has been around since 600 series cards. You hit the power limit you throttle.
> 
> The fan is still pulling power through the card. EVGA was even using the low power draw fans on the cards for more overclocking headroom.


JPM is referring to the new behavior where the cards throttle even before the power or thermal limits are triggered.

Nvidia implemented this before in the driver stack, but it seems the XP, and maybe the other Pascals, do this at the HW level.

I've definitely seen some workloads cause clock bin drops without hitting temp- or power-induced throttling.


----------



## Jpmboy

Quote:


> Originally Posted by *lilchronic*
> 
> This has been around since 600 series cards. *You hit the power limit you throttle*.
> 
> The fan is still pulling power through the card. EVGA was even using the low power draw fans on the cards for more overclocking headroom.


and before....
lol , of course.







The point being made is that the card will "throttle" - drop clock bins - even when the PL is not exceeded, due to additional (newer) causes.
Don't know where else the fan would pull power from... but you do know there is more than a single rail on these (and older) cards, right? lol, was that a Capt Obvious moment?
The low-power fan thing had, and has, nothing to do with the power limiter... neither does TDP, as you should know from modding the TXM bios.


----------



## lilchronic

Quote:


> Originally Posted by *cookiesowns*
> 
> JPM is referring to the new behavior where the cards throttle even before the power or thermal limits are triggered.
> 
> Nvidia implemented this before in the driver stack, but it seems the XP, and maybe the other Pascals, do this at the HW level.
> 
> I've definitely seen some workloads cause clock bin drops without hitting temp- or power-induced throttling.


Alright, I see.

But it could be that whatever monitoring tool is being used isn't reading the sensors accurately.

I guess once we get power limits raised we should find out more about what's really going on.


----------



## Sheyster

Has EVGA indicated when the TXP Hybrid kit will be released? I don't want a 980Ti kit.


----------



## KillerBee33

Quote:


> Originally Posted by *dante`afk*
> 
> Your GPU radiator does not get any fresh air, what are you temps? Either a blob in the middle or apply thin over the whole DIE.
> I have mine at the front.


Did a line in the middle diagonally, same exact results, overall 20 degrees less than aggressive fan profile.


----------



## Testier

Any recommendation on hybrid kits?


----------



## dVeLoPe

can anyone show me FASEST OC 1080sli vs titan xp sli oc

not benchmarks real world SIIDE BY SIDE gaming performanc


----------



## cookiesowns

Quote:


> Originally Posted by *dVeLoPe*
> 
> can anyone show me FASEST OC 1080sli vs titan xp sli oc
> 
> not benchmarks real world SIIDE BY SIDE gaming performanc


Want to send me some "FASEST OC 1080sli"? Then I can give you a side-by-side gaming performance comparison.


----------



## dVeLoPe

OC or stock doesn't really matter too much, just need to see the comparison of both cards in SLI.


----------



## Maintenance Bot

Quote:


> Originally Posted by *Sheyster*
> 
> Has EVGA indicated when the TXP Hybrid kit will be released? I don't want a 980Ti kit.


The 980ti kit is a bit noisy here.
Quote:


> Originally Posted by *Testier*
> 
> Any recommendation on hybrid kits?


Besides the 980ti kit being noisy, it cools OK; the max I've seen has been 45C.


----------



## PowerK

Count me in.
Two Titan X Pascals on the way from Nvidia. :-D
1080s will be moved to my sons.


----------



## MunneY

Quote:


> Originally Posted by *PowerK*
> 
> Count me in.
> Two Titan X Pascals on the way from Nvidia. :-D
> 1080s will be moved to my sons.


----------



## pompss

Quote:


> Originally Posted by *Jpmboy*
> 
> AFAIK, the voltage rail for the fans and lights is independent of the rails on which TDP is measured. The power limit that causes a drop in clock bins is separate (in every bios I have modded) from the TDP summation.
> So this is where there is a bit of confusion: thermal throttling and power limit throttling are not independent, as you know... and whether or not a card can hold a steady frequency under load for extended periods of time (e.g. while gaming) is only testable once K-boost is enabled (or if you set all the P-states to the frequency you want held steady in NVI for now - essentially locking the card in the P0 state). Then, load variations (which ALL GAMES HAVE) will cause load-based down-clocking, which can be distinguished from thermal and power throttling. AFAIK, there's no other way to deconvolute the contributing factors without access to the bios settings. Some games, and several Futuremark benchmarks, trip the low-level virus-mode protection trap and cause clock bin drops (e.g. the same way E-class CPUs drop frequency when AVX is detected in the execution stack). I would bet that some games have procedure calls that NV has labeled as a virus-mode instruction set... leading to clock bin drops. AFAIK, this execution-stack instruction-set trap is new with Pascal.
> Just my


You damn right!!
Mine hit 2150MHz on air, and tomorrow I will install the EK waterblock, but without a modded bios I'm sure I will hit power limit throttling.

Is the CLU mod worth the effort? I have CLU but I've never used it.


----------



## bee144

Quote:


> Originally Posted by *Sheyster*
> 
> Has EVGA indicated when the TXP Hybrid kit will be released? I don't want a 980Ti kit.


See my post on the top of page 3. Kind of worried.

http://forums.evga.com/Petion-for-EVGA-TITAN-X-HYBRID-m2518532-p3.aspx


----------



## dante`afk

Quote:


> Originally Posted by *KillerBee33*
> 
> Did a line in the middle diagonally , same exact results , overall 20 degrees less than aggressive fan profile .


which AiO is that? I have 28 idle and about 45 load in games.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *pompss*
> 
> You damn right!!
> Mine hit 2150MHz on air, and tomorrow I will install the EK waterblock, but without a modded bios I'm sure I will hit power limit throttling.
> 
> Is the CLU mod worth the effort? I have CLU but I've never used it.


That sounds like the best TXP I've read about. I'm not sure it's worth the risk to do the shunt mod, I haven't myself even though I have some too. Still debating on doing the mod. Hoping a nice bios comes to light that I can use instead.

When you get the block on, make sure you post some benchies in the benchmark section.


----------



## pompss

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> That sounds like the best TXP I've read about. I'm not sure it's worth the risk to do the shunt mod, I haven't myself even though I have some too. Still debating on doing the mod. Hoping a nice bios comes to light that I can use instead.
> 
> When you get the block on, make sure you post some benchies in the benchmark section.


Yeah, I'd rather wait for a bios mod instead of going with the CLU mod. Never really liked these kinds of mods.
But I'm really tempted to push this card high.

Will wait another month to see if there's a bios mod before trying the CLU mod.


----------



## DNMock

Quote:


> Originally Posted by *pompss*
> 
> Yeah, I'd rather wait for a bios mod instead of going with the CLU mod. Never really liked these kinds of mods.
> But I'm really tempted to push this card high.
> 
> Will wait another month to see if there's a bios mod before trying the CLU mod.


Best I hit is 2050 before getting curbstomped by power limit on water. Sounds like you won the silicon lottery!


----------



## Neon01

Xpost from [H], but I thought I'd share my own thoughts on the Titan here too (it's long, sorry):

*TL;DR*: FPS numbers down slightly going from 980ti SLI setup to a single Titan XP, but play experience just as good, if not (circumstantially) better. I could easily make a case for sticking with the older setup, however, if money were tight.

Got my Titan X today. As sad as it seems, it was sitting in the box, ready to be installed, and I was thinking about sending it back (or selling it). The reason is that I was running "before" benchmarks most of the afternoon with my 980ti SLI setup, and quite frankly, it had never run so well.

Thanks to what I had read in one of the threads on Overclock.net, I found that most of my issues with Witcher 3 and other games were due to a weird problem rendering in-game AA. It was giving me horrible GPU utilization rates (80-90% on one card, 50-60% on the other in TW3) and therefore a terrible experience. When I disabled it, all of the games I've been playing worked flawlessly.

The Witcher 3 ran at slightly better than 60 fps, and that was with almost every single bell and whistle enabled in 4k (including maxed out hairworks and 4x hairworks AA), OTHER than standard AA, which was disabled. I was blown away, quite frankly. Same experience when I loaded up Dragon Age Inquisition (but the frames were slightly better there). Emerald Graves area was giving me better than 70 fps, and everywhere else was routinely hitting >90. Even Crysis 3 (fully maxed) ran anywhere from 56-61 in the very unscientific "benchmark" I tried to run by just playing the game for a bit from a certain save. Heaven benchmark gave me an average of 64 fps.

The single game that still gave me issues was Black Desert Online. In this, with "high end mode" enabled, I was still getting anywhere from 33-40 fps in Calpheon (pretty heavy graphically), and my GPU utilization was pretty poor. I place the blame for this on the game itself, which is likely a crappy port of the original Korean developer's work.

It's blasphemy, I realize, but I started thinking maybe I should just stick with my current setup for now. My primary motivation with this move was to get to a single card solution that would take care of my needs, and give me enough rendering power to deliver a solid 4k experience.

But I reasoned that, even if the 980ti's are giving me solid numbers, I'd really love to get rid of the headache and inconvenience of a dual card setup, so I cracked open the Titan and fired it up for a comparison. Here were my results:

I didn't do a whole lot of bench testing in games with the Titan at stock clocks. The only reason I didn't install the EK block on it before even putting it in the PC was to make sure the card isn't defective (and not a complete OCing dud). After some tweaking, I was able to get +180-185MHz stable in the Heaven benchmark, but later had to nudge that down to about +175 to be stable in all games (The Witcher 3 crashed after about 20 minutes at +180). I ran at 100% fan to enable this OC, and even at that it was clearly begging for more power, cooler temps, and more voltage. All things that a water block and custom BIOS will give me.

So here were my results, juxtaposed with the 980ti SLI numbers. Keep in mind these tests weren't very scientific, and I didn't exactly follow prescribed courses through the game while benching. Vsync always off for tests.

980ti SLI (1477mhz each)/Titan XP (+175mhz offset)

Heaven 4.0 Bench (UHD, Quality Ultra, Tessellation Normal, AA x2): 64 fps | 55 fps
The Witcher 3 (UHD, all settings maxed, 4x Hairworks AA, no standard AA, low sharpening, no blur, no motion blur, no vignette): 61-73 fps | 54-60 fps
Black Desert Online (All settings maxed): 33-40 fps | 39-44 fps
Crysis 3 (All settings maxed, FXAA): 56-61 fps | 44-55 fps
Dragon Age Inquisition (All settings maxed, AA off): 68-72 fps | 63-65 fps

So, looking at the numbers alone, this is a little depressing. Not that I didn't realize that I was going to be taking a slight FPS hit across the board with this move (for games that support SLI). What really blew me away, however, was that the actual gameplay experience seemed just as good, if not better, on the Titan.

You may be saying that this is simply confirmation bias, and the scientist in me would affirm that you are probably correct. However, if I were to show gameplay to someone who knows nothing of video games or framerates, I would wager they would be very hard pressed to tell the difference between the two experiences. Frankly, in a lot of situations, though the framerate was worse with the Titan, motion seemed smoother in general. Maybe this is microstutter at work... or maybe it's just the aforementioned confirmation bias. I've been using dual GPU setups since around 2010, so it's been a minute since I really had a single GPU play experience.

The long and the short of this is that there really isn't a massive difference between the two. With the exception of Black Desert Online (which was noticeably smoother with the Titan), most of the games really played pretty much the same on the two setups.

So the question is, if I knew what I know now before I ordered the Titan, would I do it again? That's a tough one to answer. After selling my 980ti's used, assuming I can get at least $800 with the water blocks included, it's still a $470 "upgrade" (with tax on the Titan). The thought that a 1080ti could come along in 3 months and give me 99% of the Titan X performance for 75% of the price makes me absolutely cringe. But in the end, I'm very happy with the Titan, and though the numbers may be slightly lower than I was hoping (I really wanted to be able to hit a constant 60 fps in TW3 with the OC), the Titan is butter smooth, and I'm keeping it.

My 2c


----------



## ablangc

I came from the exact same setup (funny, I also play those exact games, including BDO). I had 980 Ti's in SLI. Loved them, but also wanted to try a single card solution. I noticed the slight performance hit in some games, but the overall experience was smooth. I don't know if it was microstutter or not, but in all the games I threw at it, it just seemed like the gameplay was better. I especially noticed it in GTA V, BDO, and Witcher 3.


----------



## Jpmboy

Quote:


> Originally Posted by *pompss*
> 
> You damn right !!
> Mine hit 2150 mhz on air and tomorrow i will install the ek wateblock but i without a mod bios im sure i will hit power limit throttling.
> 
> The clu mod its worth the effort ? i Have clu but i never used it


THat's quite the TXP you got there!








Quote:


> Originally Posted by *MrTOOSHORT*
> 
> That sounds like the best TXP I've read about. *I'm not sure it's worth the risk to do the shunt mod,* I haven't myself even though I have some too. Still debating on doing the mod. Hoping a nice bios comes to light that I can use instead.
> 
> When you get the block on, make sure you post some benchies in the benchmark section.


Yeah - I've been reluctant to defeat the PL with the CLU mod also... no micro-ohm meter, and without one it really is a blind mod as far as I can tell. Much rather do it with some control using a bios editor. Which probably won't happen until/if 3rd party manufacturers get to sell the card. Maybe Afterburner beta 11 can help a little?
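As an aside, the "blind" part can be shown with a bit of resistor math. A minimal sketch, assuming a hypothetical 5 mOhm sense resistor and an equally hypothetical resistance for the liquid-metal blob (the real values are exactly what you'd need the micro-ohm meter for):

```python
# Illustrative only: the controller infers current from the voltage across a
# sense (shunt) resistor of assumed value. Smearing liquid metal across it
# adds an unknown parallel resistance, so the reported power drops by the
# factor R_eff / R_nominal. All resistor values below are assumptions.

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

R_NOMINAL = 0.005   # assumed 5 mOhm sense resistor
r_clu = 0.005       # unknown in practice; assume the CLU blob is also ~5 mOhm

r_eff = parallel(R_NOMINAL, r_clu)
report_factor = r_eff / R_NOMINAL   # fraction of real power the card reports

actual_power = 300.0                # watts really being drawn
reported_power = actual_power * report_factor
print(round(report_factor, 3), round(reported_power, 1))  # 0.5 150.0
```

With those assumed values the card would report only half the real draw; without measuring the blob's actual resistance, you have no idea how far past the PL you've pushed it.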


----------



## cisco0623

Unfortunately I highly doubt anyone but nvidia ever sells this card. I was so used to having a custom bios available I took it for granted when I purchased my card. Hopefully someone figures it out!


----------



## KillerBee33

Quote:


> Originally Posted by *dante`afk*
> 
> which AiO is that? I have 28 idle and about 45 load in games.


EVGA's for the 10 Series; 24 idle - 50 in games. Also set that AIO to run as intake, getting fresh air; only gained 1 degree, from 25 idle to 24.


----------



## Zurv

While I'm still sad I can't run 3/4-way SLI, finally, with waterblocks on, I'm getting the perf I had (kinda) with 4-way Titan X(M). Even in The Witcher 3 conversations I'm getting 80fps+ (all hairworks, AA, etc.).

I'm still confused why capturing takes such a massive hit on perf (i.e., it even lowers GPU usage... which is odd)

but... rock solid perf (note: that 20fps went away because of dxtory..)



I played it for a few hours (like 8) as I needed to finish Blood and Wine before Deus Ex comes out!







(I'm also playing Hitman with everything maxed and supersampling @ 1.6). Cards were cool and ran with no problem at +175, +450.


----------



## Nizzen

Quote:


> Originally Posted by *Zurv*
> 
> while i'm still sad i can't run 3/4 way SLI. Finally with waterblocks on in getting the perf i had (kinda) with 4way titan X(M). Even in the witcher 3 conversations i'm getting 80fps+ (all hairworks AA etc..)
> 
> i'm still confused why capturing does such massive hits to perf (ie, even lowers GPU usage.. which is odd)
> 
> but .. rock solid perf (note, that 20fps went away because of dxtory..)
> 
> 
> 
> I played it for a few hours (like 8) as i needed to finish Blood and wine before deux ex comes out!
> 
> 
> 
> 
> 
> 
> 
> (i also playing hitman with everything maxed and super sampling @ 1.6). Cards were cool and run with no problem with +175, +450


Why not use Shadowplay?


----------



## dante`afk

Quote:


> Originally Posted by *Neon01*
> 
> Xpost from [H], but I thought I'd share my own thoughts on the Titan here too (it's long, sorry):


Quote:


> Originally Posted by *ablangc*
> 
> I came from the exact same setup (funny, I also play those exact games including BDO). I had 980 Ti's in SLI. Loved them but also wanted to try a single card solution. I notice the slight performance hit in some games but the overall experience was smooth. I do not know if it was microstutter or not but all the games I threw at it just seemed like the gameplay was better. I especially noticed it in GTA V, BDO, and Witcher 3


That's what you call microstutter, which so many SLI owners here deny having. You see, even with maybe lower fps on the Titan, your games run much smoother than on your old SLI setup. Win.

Quote:


> Originally Posted by *KillerBee33*
> 
> EVGAs for the 10 Series, 24 idle -50 games , also set that aio to run as intake , getting fresh air , won only 1 degree from 25 idle to 24


Normal temperatures, there's nothing wrong then?


----------



## dreamcat4

This is just an idea, but for AIO cooling it might be possible to use this other type of AIO solution. For example, the Corsair HG10 980ti adapter. Here is the 1080 modification:

http://forum.corsair.com/v3/showpost.php?p=856494&postcount=11

Then attach a Corsair H45 (or the bigger H100i, which fits too), and finally swap out the 120mm fans for better ones, e.g. Scythe GTs. Of course the drawback is not fully water-cooling the VRMs too, as on a full water block / closed loop.

Don't have a Titan XP myself to try that on, but it makes a lot of sense to me, as others do it for the 1080. That is, if EVGA's AIO solution does not appeal as much for whatever reason(s).


----------



## KillerBee33

Quote:


> Originally Posted by *dante`afk*
> 
> Normal temperatures, there's nothing wrong then?


AC off, room temperature over 80, about 30 minutes gaming on stock clocks and power; this is what it recorded.


----------



## Zurv

Quote:


> Originally Posted by *Nizzen*
> 
> Why not using shadowplay?


because shadow play quality sucks








like super sucks








(and of course you need to install GeForce experience - which installs a crap load of unneeded services)


----------



## MunneY

Quote:


> Originally Posted by *dante`afk*
> 
> That's what you call microstutter which so many SLI owners here deny to have. You see, even with maybe lower fps on the Titan, your games run much smoother than on your old SLI setup. Win.
> Normal temperatures, there's nothing wrong then?


You sure are salty about a lot... Why the bad attitude?


----------



## CallsignVega

Quote:


> Originally Posted by *Zurv*
> 
> while i'm still sad i can't run 3/4 way SLI.


As a long time 4-way GPU enthusiast, I will say its time has come and gone.


----------



## dante`afk

Quote:


> Originally Posted by *MunneY*
> 
> You sure are salty about a lot... Why the bad attitude?


Being objective != being salty.

You must learn to differentiate.


----------



## Zurv

Quote:


> Originally Posted by *CallsignVega*
> 
> As a long time 4-way GPU enthusiast, I will say it's time has come and gone.


Yep... this is the first time in about 5-6 years I don't have a ton of cards in my system. I was unhappy with the 1080s, they weren't powerful enough - but I'm happy now with the Titan XPs.


----------



## Baasha

Quote:


> Originally Posted by *Zurv*
> 
> yep.. this is the first time for me in about 5-6 years i don't have ton ' cards in my system. I was unhappy with the 1080s, they weren't powerful enough - but i'm happy now with the Titan XPs.


Agreed - even at 5K, the Titan XP SLI perform phenomenally well (equivalent and sometimes even better than 4-Way Titan X Maxwell).

Here's 4-Way SLI Titan XP in GTA V (worse performance than using just 2x Titan XP in the rig): lol


----------



## MunneY

Quote:


> Originally Posted by *Baasha*
> 
> Agreed - even at 5K, the Titan XP SLI perform phenomenally well (equivalent and sometimes even better than 4-Way Titan X Maxwell).
> 
> Here's 4-Way SLI Titan XP in GTA V (worse performance than using just 2x Titan XP in the rig): lol


I'm really hoping there is a way to get the scaling for 3/4-way back. I know in DX12 and mGPU games it is still phenomenal. Try Tomb Raider and see if it scales!


----------



## Zurv

Quote:


> Originally Posted by *MunneY*
> 
> I'm really hoping there is a way to get the scaling for 3/4 way back. I know in DX12 an mGPU games it is still phenomenal. Try Tomb Raider and see if it scales!


It does not. I tried with SLI off too (with 4-way Titan XP and 4-way 1080).


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> As a long time 4-way GPU enthusiast, I will say it's time has come and gone.


It's just unfortunate because I find 2 cards to be a little underpowered right now for maxed out 165Hz. I even set my monitor back to 144Hz to help lower gpu usage. But even then there are times where FPS will drop to 110~ FPS in GTA V. Or even dip as low as 80fps~ in Witcher 3. With blocks and a modified bios, best I can expect is another 10% performance, which is ok for 120Hz gaming at 1440p, but ends up being shy of 144-165Hz. If only the Titan XP was a full 610mm2 die like GP100. That extra 30% would have been amazing. I do like the simplicity of 2-card SLI. And much happier with heat management as a result.

Ah well. Here's hoping for 600mm2 Volta with HBM2 next year.


----------



## Zurv

Quote:


> Originally Posted by *HyperMatrix*
> 
> It's just unfortunate because I find 2 cards to be a little underpowered right now for maxed out 165Hz. I even set my monitor back to 144Hz to help lower gpu usage. But even then there are times where FPS will drop to 110~ FPS in GTA V. Or even dip as low as 80fps~ in Witcher 3. With blocks and a modified bios, best I can expect is another 10% performance, which is ok for 120Hz gaming at 1440p, but ends up being shy of 144-165Hz. If only the Titan XP was a full 610mm2 die like GP100. That extra 30% would have been amazing. I do like the simplicity of 2-card SLI. And much happier with heat management as a result.
> 
> Ah well. Here's hoping for 600mm2 Volta with HBM2 next year.


Seeing how minor the effect of increasing mem speed is on game performance, I'm not holding my breath or even caring about HBM2.


----------



## HyperMatrix

Quote:


> Originally Posted by *Zurv*
> 
> Seeing how minor the effect of increasing mem speed is on game performance, I'm not holding my breath or even caring about HBM2.


Just want it for the reduction in heat and power draw. More overclocking headroom.


----------



## CallsignVega

Quote:


> Originally Posted by *HyperMatrix*
> 
> It's just unfortunate because I find 2 cards to be a little underpowered right now for maxed out 165Hz. I even set my monitor back to 144Hz to help lower gpu usage. But even then there are times where FPS will drop to 110~ FPS in GTA V. Or even dip as low as 80fps~ in Witcher 3. With blocks and a modified bios, best I can expect is another 10% performance, which is ok for 120Hz gaming at 1440p, but ends up being shy of 144-165Hz. If only the Titan XP was a full 610mm2 die like GP100. That extra 30% would have been amazing. I do like the simplicity of 2-card SLI. And much happier with heat management as a result.
> 
> Ah well. Here's hoping for 600mm2 Volta with HBM2 next year.


I've been doing some tests to see if two highly clocked Titan-XPs would be enough for 4K @ 120 FPS/Hz, as I feel that will be the benchmark in 2017 and beyond. 4K @ 120 Hz is 67% more demanding than 3440x1440 @ 120 Hz. Basically, barring any CPU limits, I need enough GPU horsepower to push 3440x1440 at 200 FPS to simulate the load of 4K @ 120 Hz, since I am currently without a 4K monitor.

In many games I can reach it, but in many AAA titles two Titan-XPs aren't enough without turning settings down. What stinks is that the upcoming 4K @ 120 Hz displays won't be G-Sync, so you would want an even larger performance buffer than with a G-Sync display to keep the min FPS good. I think Volta next year will be the first SLI setup truly capable of 4K @ 120 Hz in AAA titles.
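The arithmetic behind those figures checks out; a quick sketch (standard resolutions, nothing assumed beyond the post):

```python
# Compare raw pixel throughput of 4K @ 120 Hz vs 3440x1440 @ 120 Hz.

def pixel_rate(width, height, hz):
    """Pixels pushed per second at a given resolution and refresh rate."""
    return width * height * hz

uhd_120 = pixel_rate(3840, 2160, 120)    # 4K UHD @ 120 Hz
uwqhd_120 = pixel_rate(3440, 1440, 120)  # ultrawide 1440p @ 120 Hz

extra = uhd_120 / uwqhd_120 - 1
print(f"{extra:.0%}")  # 67% -- matching the "67% more demand" figure

# FPS needed at 3440x1440 to simulate the 4K @ 120 Hz pixel load:
sim_fps = uhd_120 / (3440 * 1440)
print(round(sim_fps))  # 201 -- close to the ~200 FPS target in the post
```

So pushing ~200 FPS at 3440x1440 really is a fair stand-in for 4K @ 120 Hz, pixel-count-wise (assuming per-pixel cost scales roughly linearly, which real games only approximate).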


----------



## skypine27

Ok, somewhat paranoid post coming up.

On opening day (Aug 2nd I think), I ordered what I thought were 2 x Titan X (Pascals) to replace my current 2 x Titan X (original/maxwell) set up.

I've been sitting on them, still in the boxes, since they arrived as I was waiting for my EK water blocks to show up. The water blocks are due to arrive today so I opened up one of the Titan X boxes just to check it out.

I'm looking at it thinking "it looks cool," and then I see the green GEFORCE stuff on it and thought "didn't Nvidia drop the GeForce thing for the Pascal model..."

PIC:
http://s82.photobucket.com/user/skypine27/media/DSC_1086.jpg.html

And now I'm super paranoid and searching for pics of the Titan X Pascal vs the original Titan X. I can't compare my original Titan X's to my new cards because mine are under EK water blocks and I didn't save the original air coolers when I installed them. So I can't see whether the original Titan X cooler looks exactly like my Titan X (P) ones.

The part number on the new card's backplate reads 900-1G611-2500-000, and when I google it, the top hits come up as the Pascal version:
2016 Nvidia Titan X (Pascal Architecture) 900-1G611-2500-000



Just like with the water block issue, I can't compare the part number of my original Titan X's because I have EK backplates on them and didn't save the original OEM backplates (why don't I save this stuff!?!)

I paid for them via PayPal but the charge reads:
Seller info
DigitalRiver US Inc
952-646-5037
http://www.digitalriver.com
[email protected]

And now I'm on the Nvidia store order history page, and this order is NOT showing up in my order history. I didn't use a different email address than my Nvidia one when I placed the order.

Can someone here confirm their Titan X (P) OEM cooler has the green GEFORCE GTX logo on it?

Does your card look like this:
http://s82.photobucket.com/user/skypine27/media/DSC_1087.jpg.html

http://s82.photobucket.com/user/skypine27/media/DSC_1085.jpg.html

I don't want to tear apart my 2 new cards and put the blocks on them today if they actually aren't real!

Thanks for helping out a paranoid member!


----------



## stefxyz

Looks like your paranoia hindered your Google skills. It's well known the new Titan kept the GeForce branding on the card. It's all fine... you can relax. The old Titan X, btw, only has the name Titan on it, as far as I recall.


----------



## MrTOOSHORT

Quote:


> Can someone here confirm their Titan X (P) OEM cooler has the green GEFORCE GTX logo on it?


Yes, mine and all the TXPs look like yours.


----------



## skypine27

Thx guys. I'm a US expat working in China, so I'm primed to immediately scream SCAM SCAM when I have the slightest strange feeling. (I bought the cards while I was back in the States on vacation.)

The new blocks should show up today.


----------



## pompss

I read that the Intel Xeon E5-2683 v4 CPU is beating the i7 5960X big time.

Found one CPU for $399.

What do you guys think?


----------



## axiumone

Quote:


> Originally Posted by *pompss*
> 
> i read that the Intel Xeon E5 2683 V4 cpu its beating the i7 5960x big time
> 
> Found one cpu for $399
> 
> What you guys think?


Any chance you have links to any game benches comparing the two? Would be very curious to see. Off the top of my head the 2xxx xeons will have a locked multiplier, so the only way to overclock one would be through the bus.


----------



## HyperMatrix

Quote:


> Originally Posted by *pompss*
> 
> i read that the Intel Xeon E5 2683 V4 cpu its beating the i7 5960x big time
> 
> Found one cpu for $399
> 
> What you guys think?


For any application that uses an unlimited number of threads, yes, it will be better. It's vastly superior to the 5960x for raw compute performance. But it should be substantially worse in comparison to the 5960x for gaming.


----------



## Jpmboy

Quote:


> Originally Posted by *pompss*
> 
> i read that the Intel Xeon E5 2683 V4 cpu its beating the i7 5960x big time
> 
> Found one cpu for $399
> 
> What you guys think?


You'll likely be disappointed in the gaming performance of that particular E-class server processor due to the low base/turbo clocks and limited BCLK headroom. Lol - lots of users in this thread think 4 (and maybe even 2) cores would do better.








Besides, once it sees AVX in the stack it will downclock to the highest non-turbo multiplier (= 2.1 GHz).
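A rough back-of-the-envelope sketch of that tradeoff (core counts are from Intel's specs; the 2.1 GHz under-AVX figure is from the post above; the 5960X overclock is an assumed typical value, not a guarantee):

```python
# Aggregate throughput vs per-thread speed: why a 16-core server chip can
# win compute benches while losing in games, which mostly ride a few threads.

xeon = {"cores": 16, "base_ghz": 2.1, "avx_ghz": 2.1}   # E5-2683 v4
hedt = {"cores": 8,  "base_ghz": 3.0, "oc_ghz": 4.4}    # i7-5960X, assumed OC

# Crude aggregate throughput in core-GHz (ignores IPC differences).
xeon_agg = xeon["cores"] * xeon["base_ghz"]   # 33.6
hedt_agg = hedt["cores"] * hedt["oc_ghz"]     # 35.2

# Per-thread clock advantage of the overclocked HEDT chip.
per_thread_gap = hedt["oc_ghz"] / xeon["avx_ghz"]

print(round(xeon_agg, 1), round(hedt_agg, 1), round(per_thread_gap, 2))
```

Even before IPC is considered, the overclocked 5960X runs each thread more than twice as fast as the Xeon under AVX, which is what matters for game framerates.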


----------



## pompss

Quote:


> Originally Posted by *HyperMatrix*
> 
> For any application that uses an unlimited number of threads, yes, it will be better. It's vastly superior to the 5960x for raw compute performance. But it should be substantially worse in comparison to the 5960x for gaming.


I saw this video






In which they said it's pretty much the same fps in gaming, but I can't find anything about the Xeon E5-2683 v4's gaming performance.

Quote:


> Originally Posted by *axiumone*
> 
> Any chance you have links to any game benches comparing the two? Would be very curious to see. Off the top of my head the 2xxx xeons will have a locked multiplier, so the only way to overclock one would be through the bus.


In CPU benches the Xeon wins, but it seems I can't find any gaming benches to compare.


----------



## HyperMatrix

Quote:


> Originally Posted by *pompss*
> 
> I saw this video
> 
> 
> 
> 
> 
> 
> Which they said pretty much same fps in gaming but i cant find anything about the xeon e5 2683 v4 gaming perfomance
> cpu bench the xeon wins but seems i cant find any gaming bench to compare .


Pro tip: never attempt to "learn" anything from Linus.


----------



## Ascendor81

eVGA 1070/1080 Hybrid Kit mod for Titan X(P). I have my TXP and my kit, and I'm waiting on a Dremel tool; it comes in the mail tomorrow, so I'm going AIO tomorrow night.

https://hardforum.com/threads/nvidia-announces-the-new-titan-x.1905829/page-19#post-1042460620

Instructions: cut the tab to make room for the 6-pin, and cut out a small square for the extra phase chip.


----------



## Steven185

Quote:


> Originally Posted by *HyperMatrix*
> 
> I documented earlier that there is an actual serious problem with the driver. At pre-boost clocks, it'll artifact and crash even with a +100MHz offset. This happens when there is low GPU usage. But if you're running a program that utilizes the GPU more, and pushes it into boost clocks, it'll go over 2000MHz+ and all artifacting/etc issues will be gone. GPU-Z is reporting it as a VRel issue during pre-boost clocks.


Same here. Any solutions thus far? At low volts, heavy artifacting; at 2000 MHz, just fine...


----------



## HyperMatrix

Quote:


> Originally Posted by *Steven185*
> 
> Same here. Any solutions thus far? At low volts, heavy artifacting; at 2000 MHz, just fine...


Since it's only happening to one card, I'm inclined to believe that there is some sort of hardware fault with this card. I'm probably going to request an RMA from Nvidia tomorrow unless they can give me a solid explanation and promise of a fix. I don't want to touch the card with the CLU mod and put a block on it if there's a chance that this hardware fault will result in a lower top end OC as well. I just have absolutely no idea how the driver could break one card, but not the other, unless there is faulty hardware involved.

Would be nice if GPU-Z could read ASIC values off these cards so we could see if there's any connection between ASIC quality and the driver causing failures.


----------



## Steven185

So you think it's a hardware issue? Mine does not crash, but I do get artifacts. I'd guess a BIOS mod would fix that right up (just boost the volts for the low clocks). Since you're only getting it during overclock, do you think they will accept the RMA (or do you get it at stock clocks too)? Sorry for being so inquisitive, but I'm looking to find a solution for my card too.


----------



## HyperMatrix

Quote:


> Originally Posted by *Steven185*
> 
> So you think it's a hardware issue? Mine does not crash, but I do get artifacts. I'd guess a BIOS mod would fix that right up (just boost the volts for the low clocks). Since you're only getting it during overclock, do you think they will accept the RMA (or do you get it at stock clocks too)? Sorry for being so inquisitive, but I'm looking to find a solution for my card too.


Well considering it's exhibiting abnormal behavior, I don't see why not. This isn't an issue of "My card doesn't OC to 2100MHz." Rather, it's a problem where the card may indeed OC to 2100MHz, but due to some kind of problem, prior to activating the boost clock, it fails to provide enough voltage to run at stock clocks, causing artifacting and crashing depending on how high your OC is. Right now I'm running the old drivers and have no problem playing at 2050MHz after hours on end of 90%~ usage in GTA5.

Unless someone can explain how this problem can be brought on by a driver, and have it only affect one card in a system, there's nothing else that would make sense. Because if it were a simple driver issue, it should be a problem that presents itself in both cards. Even if they didn't want to do an RMA, we're within the 30 day return period. So you can return it without question, then just order another one. Not really in the mood to take a gamble like this on a potentially faulty $1200 GPU that they may deny for RMA/Warranty service after it's been put under water.


----------



## jodasanchezz

Hi there,
is there any official information about replacing the stock cooler?
Is the warranty still OK if I replace this piece of crap?

Is there a chance to get extended warranty from Nvidia, like EVGA offers?

Thanks in advance


----------



## Z0eff

Quote:


> Originally Posted by *HyperMatrix*
> 
> Pro tip: never attempt to "learn" anything from Linus.


What's so bad about Linus?


----------



## HyperMatrix

Quote:


> Originally Posted by *Z0eff*
> 
> What's so bad about Linus?


It's oversimplified content for the general masses. A lot of conclusions are drawn that aren't accurate, or are based on improper testing techniques and comparisons, which is fine for the layman. But I wouldn't put any stock into their analysis of the feasibility of using a Xeon processor as a gaming CPU, for example.


----------



## skypine27

Add me to the old enough to know better but still too young to care crowd....

2 x Titan X (P)s in SLI. EK waterblocks, factory backplates (with a little Rube Goldberg know-how and thermal paste). Asus ROG 3-way SLI bridge (due to slot spacing, and because I wanted both SLI tabs on each card connected, since the Nvidia HB bridges won't fit EK blocks). +150 on the clocks, no change to memory timings.

13598 Fire Strike Ultra score:
http://www.3dmark.com/3dm/14308478

http://s82.photobucket.com/user/skypine27/media/IMG_3268.jpg.html


----------



## toncij

Quote:


> Originally Posted by *skypine27*
> 
> Add me to the old enough to know better but still too young to care crowd....
> 
> 2 x Titan X (P)s in SLI. EK waterblocks, factory backplates (with a little Rube Goldberg know-how and thermal paste). Asus ROG 3-way SLI bridge (due to slot spacing, and because I wanted both SLI tabs on each card connected, since the Nvidia HB bridges won't fit EK blocks). +150 on the clocks, no change to memory timings.
> 
> 13598 Fire Strike Ultra score:
> http://www.3dmark.com/3dm/14308478
> 
> http://s82.photobucket.com/user/skypine27/media/IMG_3268.jpg.html


You can cut HB bridges easily. No damage. Then those fit too.


----------



## skypine27

Quote:


> Originally Posted by *toncij*
> 
> You can cut HB bridges easily. No damage. Then those fit too.


Thx for the info (I also found a PCWorld article where they completely removed the plastic shroud from the bridge to make it fit). For me it doesn't matter, since I ordered the wrong length HB bridge anyway. I thought the "3 slot" model actually meant Tri-SLI. The reason I wanted a Tri-SLI length bridge is that my slot spacing for 2x SLI is equal to that of 3x SLI setups (i.e. 4 slots).

I think it's officially called "spaced SLI". However, when I ordered the HB bridge on opening day, it wasn't in the forefront of my mind that, since Nvidia had officially given up on 3- and 4-way SLI, a 3 "slot" bridge actually means 3 slots and not 3x cards.



So it's too short.

I talked to EK tech support and they confirmed they are working on their own custom HB bridges, and also mentioned that the EVGA PRO HB bridges will fit. I would need this one:
http://www.evga.com/Products/Product.aspx?pn=100-2W-0028-LR

But I'll wait a while and see if any better looking ones come out.

Thx for the info.


----------



## toncij

Quote:


> Originally Posted by *HyperMatrix*
> 
> It's just unfortunate because I find 2 cards to be a little underpowered right now for maxed out 165Hz. I even set my monitor back to 144Hz to help lower gpu usage. But even then there are times where FPS will drop to 110~ FPS in GTA V. Or even dip as low as 80fps~ in Witcher 3. With blocks and a modified bios, best I can expect is another 10% performance, which is ok for 120Hz gaming at 1440p, but ends up being shy of 144-165Hz. If only the Titan XP was a full 610mm2 die like GP100. That extra 30% would have been amazing. I do like the simplicity of 2-card SLI. And much happier with heat management as a result.
> 
> Ah well. Here's hoping for 600mm2 Volta with HBM2 next year.


30%? 3840 cores vs 3584 is only 7% more; hardly worth waiting if you think that would be Volta. Volta needs to be in the range of 5k cores to be a meaningful upgrade. 5k can fit on a chip the size of GP100, since you don't need FP64 cores etc.

However, I think we'll first see a 14nm (Samsung fabs) refresh of Pascal: the same 3840 cores, 1.8/2GHz official base/boost, and the same power envelope. That would probably happen in early 2017 as a 1080 Ti and possibly a "Titan X Black (Pascal)".




Here is my latest testing:


http://imgur.com/XPoK3


Titan X (Pascal) is a projected, calculated result, taken as the average of 60% faster than Titan X (Maxwell) and 33% faster than a 1080 (the maximum theoretical advantage), but the 1080, 1080 SLI and the old Titan X were measured in exactly the same environment and setup.

Now, apart from the fact that, theoretically, Titan X (Pascal) beats 1080 SLI in most situations, I've run across a very strange thing: 3 different 1080 cards (EVGA, EVGA, FE) in single and SLI configs exhibit strange behavior in the Rise of the Tomb Raider benchmark - Syria (2nd test) has a hiccup at the beginning, and the 3rd test is abysmally slow too, dropping the minimum framerate to 3-4 in the 2nd test and 15-16 in the 3rd. The old Titan X (Maxwell) does not have that problem; its minimum is very high.

Can anyone test on their own Titan X (Pascal)?

Thx.


----------



## HyperMatrix

Quote:


> Originally Posted by *toncij*
> 
> 30%? 3840 cores vs 3584 is only 7% more, hardly worth waiting if you think that would be the Volta. .


Die size of GP100. Just as Nvidia did with the Titan X over the OG Titan by dropping fp64.


----------



## atreides

hi everyone,

I'm not particularly tech savvy, but my Titan XP seems to be generating massive heat. I know it's a good thing the heat is being exhausted, but I'd like to know whether it's the same experience for users on the stock cooler. If anyone could chime in I'd appreciate it, thank you.


----------



## toncij

Quote:


> Originally Posted by *HyperMatrix*
> 
> Die size of GP100. Just as Nvidia did with the Titan X over the OG Titan by dropping fp64.


Current Titan X (Pascal) already has 3840 cores; just not all are active - 4 SMs have been disabled due to yield problems. As I said in the same post, with a die the size of GP100 we'd get something like 5760 cores, which is a full 60-SM chip: a fully enabled GP100 with FP64 replaced by FP32.

Unless they make a refresh with better yields, we won't see anything new soon.

That kind of chip would be 60% faster than the Titan X (Pascal) and 125% faster than a 1080. I presume they'll again have yield problems and won't make current cards obsolete that fast - probably not before summer 2017 or later.
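For what it's worth, the 60%/125% figures above do check out as pure core-count ratios. A quick sketch (the 5760-core chip is hypothetical, as described above; this ignores clocks, bandwidth and everything else):

```python
# Back-of-envelope check of the core-count claims above. The published FP32
# counts are real; the 5760-core full-die part is purely hypothetical.
txp_cores = 3584       # Titan X (Pascal)
gtx1080_cores = 2560   # GTX 1080
full_die_cores = 5760  # hypothetical GP100-sized chip, FP64 swapped for FP32

vs_txp = full_die_cores / txp_cores - 1       # ~0.61, i.e. the "60% faster" claim
vs_1080 = full_die_cores / gtx1080_cores - 1  # 1.25, i.e. "125% faster"
print(f"vs Titan X (Pascal): +{vs_txp:.0%}")
print(f"vs GTX 1080: +{vs_1080:.0%}")
```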


----------



## HyperMatrix

Quote:


> Originally Posted by *toncij*
> 
> Current Titan X (Pascal) already has 3840 cores, just not all are active. 4 SMs have been disabled due to yield problems. As I've said in the same post, with a die size of GP100 we'd get something like 5760 cores, which is a full 60 SM chip, a fully enabled GP100 with FP64 replaced by FP32.
> 
> Unless they make a better yielded refresh, we won't see anything new soon.
> 
> That kind of chip would be 60% faster than Titan X (Pascal) and 125% faster than 1080. I presume they'll again have yield problems and will not make current cards obsolete that fast - probably not before summer 2017 or later.


All I ever said was that I wish the Pascal Titan X was a full sized 610mm2 die without fp64, because 2 of the current card just aren't enough for maxed out gaming.


----------



## skypine27

Quote:


> Originally Posted by *atreides*
> 
> hi everyone,
> 
> I'm not particularly tech savvy but my Titan XP seems to be generating massive heat. I know it's a good thing the heat is being exhausted but I would like to know if it's the same experience for the users who are just using the stock cooler. Please if anyone could chime in I'd appreciate it thank you.


Skip to about 10 minutes into this guy's review:
http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Titan-X-Pascal-12GB-Graphics-Card-Review

He's using a single card and the stock cooler, of course. Yes, it runs hot under load: 85 degrees C on his.

Get some temp monitoring software like AIDA64, or just use EVGA Precision X (it's free), enable the OSD and select temp monitoring in the options (be warned, the Precision X interface is not very easy to figure out the first time you use it).


----------



## toncij

Quote:


> Originally Posted by *HyperMatrix*
> 
> All I ever said was that I wish the Pascal Titan X was a full sized 610mm2 die without fp64, because 2 of the current card just aren't enough for maxed out gaming.


I agree and unfortunately, I don't think we'll see it before 2017.


----------



## unreality

I can't say 2x Titan XPs are not enough for 165Hz 1440p.

I'm running one Titan XP @ 2050/11000 (with an AC waterblock) and GTA V is spitting out around 144 fps when not running into a CPU limit ([email protected] btw)

Most of the time I even play at 5120x2880 with a healthy 50-80 fps.

Your SLI scaling must be faulty!

Single card ftw


----------



## HyperMatrix

Quote:


> Originally Posted by *unreality*
> 
> I can't say 2x Titan XPs are not enough for 165Hz 1440p.
> 
> I'm running one Titan XP @ 2050/11000 (with an AC waterblock) and GTA V is spitting out around 144 fps when not running into a CPU limit ([email protected] btw)
> 
> Most of the time I even play at 5120x2880 with a healthy 50-80 fps.
> 
> Your SLI scaling must be faulty!
> 
> Single card ftw


I have no CPU bottleneck. That only pops up if I activate the extended distance scaling, which I do not do. You're just not running your settings high enough. Which is understandable since you only have a single card and are used to not enabling certain features because you feel they're too taxing on your system.


----------



## toncij

To be honest, I don't think he's doing it wrong: maxed-out settings, 5960X @ 4.7, 128GB of 2.8GHz CL15 1T - these are the settings for 2152MHz 1080s in SLI compared to projected TXP performance across the board:


http://imgur.com/RIRcH




When SLI works, which is rare nowadays, its scaling is erratic, while the TXP offers consistent performance all the time, compared to dual 1080s where the 2nd card is an expensive paperweight half the time. Even worse for Titan X (Pascal) SLI.

If we could guarantee that 1080 SLI worked in more than 50% of situations, we'd have something like Titan X (Pascal) performance in some games and higher in others, but at the moment it's 20% of cases scaling well, 30% worse than a Titan X (Pascal), and 50% not scaling at all or worse than a single card.


----------



## skypine27

I'm not sure about all the SLI hate.

It works a hell of a lot better than CrossFire in AAA titles, which is what SLI is for in the first place.

You don't need an SLI profile for Kerbal Space Program, but for Far Cry Primal or, dear God, Elite Dangerous... it works a lot better than CrossFire.


----------



## HyperMatrix

Quote:


> Originally Posted by *toncij*
> 
> To be honest, I don't think he does it wrong: maxed out settings, 5960X @ 4.7, 128GB of 2.8GHz CL15 1T - these are the settings for 2152MHz 1080s in SLI compared to projected TXP performance across the board:
> 
> 
> http://imgur.com/RIRcH
> 
> 
> 
> 
> When SLI works, which is rare nowdays, its scaling is so erratic, while TXP offers consistent performance all the time compared to dual 1080 where the 2nd card is an expensive paperweight half the time. Even worse for Titan X (Pascal).
> 
> If we could guarantee that 1080 SLI worked more than in 50% of situations, we'd have something like Titan X (Pascal) performance in some games and high scaling in others, but atm it's 20% scaling high, 30% worse than Titan X (Pascal) and 50% not at all or worse than a single card.


I don't feel like going over the SLI fight again, where people who buy a single card ridicule the ones who get more, because "SLI is bad and it's a waste of money", just like non-Titan XP members sometimes ridicule us for buying a card that is way overpriced and makes no logical sense to buy. Forget the scaling portion. The problem here wasn't even about SLI. It was about SLI Titan XP's not having enough to max out 1440p at the moment. At which point, he decided to take an example of the FPS I was getting, by saying he's getting 144fps all the time except when he gets a cpu bottleneck, and concluding that SLI is bad and he is smarter by having just 1 card.

Unless a game is seriously broken, on average you can expect 70-80% scaling with SLI. Some games are even 90%+. The problem with a lot of SLI scaling tests, is that they don't take into account cpu bottlenecking. If his claim that he gets 144fps at all times except under cpu bottleneck are true, then it means 2 things. First, he doesn't know what a cpu bottleneck is, and second, he's probably not running the game maxed out.


----------



## meson1

So that suggests SLI is not the way to go, and that blowing a wad on a Titan XP is likely to be better than blowing a slightly bigger wad on two 1080s, if one hopes to get anywhere close to 60fps @ 4K.

Whatever I choose will be going underwater and OC'd, fronted with an i7-6900K (also OC'd).

Even then, it sounds like we have to wait for Volta to get something that can do 60fps @ 4K with (performance) headroom to spare.

I'm weighing up my options. TXP now? Or a 1080 now and Volta later? Help! Confused. Can't decide.


----------



## Nizzen

Always buy 2 of the best cards, always


----------



## profundido

Quote:


> Originally Posted by *Nizzen*
> 
> Always buy 2 of the best cards, always


I couldn't decide between single TXP or 1080 SLI so yeah I went with TXP SLI


----------



## unreality

Quote:


> Originally Posted by *HyperMatrix*
> 
> I have no CPU bottleneck. That only pops up if I activate the extended distance scaling, which I do not do. You're just not running your settings high enough. Which is understandable since you only have a single card and are used to not enabling certain features because you feel they're too taxing on your system.


I do know what a CPU bottleneck is, since it's exactly what you described: extended distance scaling is what brings it into a CPU bottleneck. That's why I play with 0% extended distance @ 1440p and 100% @ 5K.

And I do have some settings tweaked for personal preference with no visual decrease (even an improvement for me).

Settings I adjusted:
Shader Very High -> High (no visual decrease, test it yourself)
PostFX Ultra -> High (no visual decrease, since I don't want to play with motion blur **** anyway, which destroys image quality)
Grass Ultra -> High
Also using a lot of texture mods and VisualV, which should tank fps a bit.

As I said, I have a solid 120+ fps all the time, and even 50+ @ 5K. So if you don't get 165 with 2 cards, your SLI scaling is just bad! That's simple logic. Microstuttering neglected beforehand.

Imo you should try playing with 4xDSR, 0% smoothness and 100% extended distance detail, where your scaling should be much better, and for a crystal clear image.

Edit: by the way, even at 0% extended distance GTA5 runs into a CPU limit with a [email protected] and ONE Titan XP, see pic:


----------



## toncij

Quote:


> Originally Posted by *HyperMatrix*
> 
> I don't feel like going over the SLI fight again, where people who buy a single card ridicule the ones who get more, because "SLI is bad and it's a waste of money", just like non-Titan XP members sometimes ridicule us for buying a card that is way overpriced and makes no logical sense to buy. Forget the scaling portion. The problem here wasn't even about SLI. It was about SLI Titan XP's not having enough to max out 1440p at the moment. At which point, he decided to take an example of the FPS I was getting, by saying he's getting 144fps all the time except when he gets a cpu bottleneck, and concluding that SLI is bad and he is smarter by having just 1 card.
> 
> Unless a game is seriously broken, on average you can expect 70-80% scaling with SLI. Some games are even 90%+. The problem with a lot of SLI scaling tests, is that they don't take into account cpu bottlenecking. If his claim that he gets 144fps at all times except under cpu bottleneck are true, then it means 2 things. First, he doesn't know what a cpu bottleneck is, and second, he's probably not running the game maxed out.


No need to go through it again. I just picked a bunch of games and tested them on TXM, 1080 and 1080 SLI, then calculated TXP ratios. It happens that scaling is good only in the Frostbite engine and the new Tomb Raider in DX12. All other times, a single TXP wins.

A 5960X @ 4.7GHz does not and cannot bottleneck anything here. It's down to those games - and there are obviously more and more of them - that don't support SLI at all, or support it very badly.

If money is of no concern, of course you'll get 2x TXP. But if you're trying to stay under 2k, a single TXP at 1200 is a much better choice than 2x 1080 at 1400 or more.
Quote:


> Originally Posted by *profundido*
> 
> I couldn't decide between single TXP or 1080 SLI so yeah I went with TXP SLI



- the best choice, ofc.


----------



## profundido

To ease the minds of those thinking "why on earth would you take TXP SLI, useless, OMG!", these were my personal reasons:

-The games I typically play (RPG eyecandy mostly), such as Dragon Age, The Witcher, Tomb Raider and ESO, on my G-Sync 4K screen all happen to benefit greatly from SLI, as I noticed on my previous GTX 780 Ti setup in past years. So I'm guaranteed the same personally proven benefit, with simply loads more horsepower under the belt.

-The step from a 780 Ti to a Titan X Pascal (essentially a future 1080 Ti with more memory, I guess) is a considerable, big upgrade, since I jump 2 generations forward.

-Money didn't matter, only performance.

I admit that in most other situations (money concerns, different types of games, ...) it would make perfect sense to go for 1 TXP or 1 GTX 1080 as well, so I respect all the people who make that decision just as much as my own.


----------



## unreality

Imo its pretty easy.

WQHD 144Hz+ -> One Titan Xp is more than enough
4K/5K/DSR users -> Two Titan Xp can bring some extra power in some games


----------



## pez

I'd say for max eye candy in Ultrawide 1440p that Titan XP SLI is desirable as well. A single one is super nice, but when those 144hz UWs come out, it'll be a good bit of heft to have.


----------



## profundido

Quote:


> Originally Posted by *atreides*
> 
> hi everyone,
> 
> I'm not particularly tech savvy but my Titan XP seems to be generating massive heat. I know it's a good thing the heat is being exhausted but I would like to know if it's the same experience for the users who are just using the stock cooler. Please if anyone could chime in I'd appreciate it thank you.


Confirmed on both my cards. These cards scream for watercooling, as multiple reviewers/users have concluded. Massive thermal throttling and lowered performance. Can't wait till my waterblocks arrive to rerun my tests after I put them on.


----------



## MrKenzie

Quote:


> Originally Posted by *profundido*
> 
> to ease the minds of those thinking " why on earth would you take TXP SLI useless OMG!". These were my personal reasons:
> 
> -The games that I typically play (RPG eyecandy mostly) such as dragon age, The Witcher, tomb raider, ESO on my g-sync 4K screen all happen to greatly benefit from SLI as I noticed on my previous GTX 780 Ti setup in the passed years. So I'm guaranteed the same personally proven benefit with simply loads more horsepower under the belt.
> 
> -the step from a 780 Ti to TXP Pascal (that's essentially a future 1080 Ti with more memory I guess) is a considerable, big upgrade in fact since I jump 2 generations forward
> 
> -money didn't matter, only performance
> 
> I admit that in most other situations (money concerns, different type of games,....) it would make prefect sense to go for 1 TXP or 1 GTX 1080 as well so I respect all the people that make that decision just as much as I do my own


Do you play at 4K 60Hz? The reason I ask is that I've gone from a single 780 Ti to a 1080, and am now waiting on my Titan to arrive, because the 1080 can only manage 40fps in some games, whereas the Titan should handle it OK until some decent 4K 100Hz+ or OLED monitors become available. Wouldn't your two Titans be throttled way back to only push 60fps on G-Sync?


----------



## pez

Quote:


> Originally Posted by *MrKenzie*
> 
> Do you play at 4K 60Hz? The reason I ask is that I've gone from a single 780 Ti to a 1080, and am now waiting on my Titan to arrive, because the 1080 can only manage 40fps in some games, whereas the Titan should handle it OK until some decent 4K 100Hz+ or OLED monitors become available. Wouldn't your two Titans be throttled way back to only push 60fps on G-Sync?


GTX 1080 SLI did great at 4K, IMO, though most of it was a task to get running at 4K @ 60+ FPS. Very doable in most titles. The only title I got around to testing and had serious trouble with was Crysis 3. Disabling AA and turning shaders to High did the trick, IIRC. The Titan XP should only improve on that (especially since Crysis 3 scales great in SLI). I'd love to see 4K 100Hz panels with G-Sync this year.


----------



## profundido

Quote:


> Originally Posted by *MrKenzie*
> 
> Do you play at 4K 60hz? The reason I ask is because I have gone from a single 780Ti to a 1080, and am waiting on my Titan to arrive now because the 1080 can only handle 40fps in some games, whereas the Titan should handle it OK until some decent 4K 100hz+ or OLED monitors become available. Wouldn't your two Titan's be throttled way back to only push 60pfs on g-sync?


Yes, I play on the Asus ROG Swift PG27AQ. Currently, 4K@60Hz is still the maximum eyecandy possible gaming-wise, until its successor with 4K@120/144Hz comes out at the beginning of 2017 (they've already had it running as a proof of concept for 3 months). A single Titan XP handles modern games very decently when put under water, with averages close to 60fps, but it still drops below that. On air, due to thermal throttling, you can only reach good results with lots of airflow and noise though!

The 2 Titans will not be throttled at all, or even affected. The monitor's 60Hz simply means it can only show 60fps; it does not affect the cards in any way. One can choose to limit the cards to 60fps in software/drivers, but again, that is the user's choice, not something the monitor imposes on the hardware. On the contrary: besides flat-out continuous 60fps eyecandy with no drops below 60 anymore, they create extra leeway for the upgrade to my new monitor in half a year. I'm hoping the new ROG Swift will be a 32" 4K@120Hz, which would unlock the full potential of a TXP SLI setup.

Since I'm really happy with a solid 60 frames for the sort of eyecandy games I play, I'm simply maxed out hardware-wise currently, but I understand that fps-game people will never settle for 60Hz monitors, so they will prefer to stay at lower resolutions to keep the fps at 100 and up. They get a lot less scaling from SLI setups though.


----------



## Bloodymight

Quote:


> Originally Posted by *unreality*
> 
> WQHD 144Hz+ -> One Titan Xp is more than enough


What?!

Not sure why you would say that.

Go read ONE review of the Titan X Pascal and you will see there are quite a lot of games that barely hit 100fps (not 144) at 1440p, and a lot of upcoming games will be even more demanding, especially with the new consoles being released soon.

One Titan XP is certainly not "more than enough" for WQHD 144Hz.


----------



## KillerBee33

Got a dilemma: running the 6700K at stock gives me a better performing Titan.

Stock 6700K http://www.3dmark.com/spy/321436
6700K @ 4.6 http://www.3dmark.com/spy/313743
Anyone have any ideas why this may be happening?


----------



## unreality

Quote:


> Originally Posted by *Bloodymight*
> 
> What?!
> 
> not sure why you would say that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Go read ONE review about the Titan X Pascal and you will see there are quite a lot of games that barely run 100fps(not 144) at 1440p and a lot of games are to come which will be even more demanding, especially with the new consoles being released soon.
> 
> One Titan XP is certainly not "more than enough" for wqhd 144hz.


Well, I'm already running into a CPU limit a lot with one card because it's so powerful. You show me how much a second card provides at 1440p... oh wait, I already have some numbers here:

Witcher 3 27% more performance..
Tomb Raider 14%....
Hitman 35%
Division 30%
Fallout 4 28%
Doom 29%

That's non-OC, so with a good OC under water the gains diminish even further. Not very good scaling if you ask me. Btw, in 4K SLI scaling goes up a lot, which is why I think it's fine there.
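As a rough way to read those numbers: with perfect scaling a second card would add +100%, so the quoted gain is also the second card's efficiency. A quick sketch using the figures above:

```python
# Second-card SLI "efficiency" implied by the 1440p gains quoted above:
# a perfect second card would add +100%, so gain/100 is its efficiency.
gains_pct = {
    "Witcher 3": 27, "Tomb Raider": 14, "Hitman": 35,
    "Division": 30, "Fallout 4": 28, "Doom": 29,
}
for game, gain in gains_pct.items():
    print(f"{game}: second card delivers {gain}% of a full card")

average = sum(gains_pct.values()) / len(gains_pct)
print(f"average gain: {average:.0f}%")  # ~27%
```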


----------



## unreality

Quote:


> Originally Posted by *KillerBee33*
> 
> Got a delema , running stock 6700K gives me better performing Titan
> 
> 
> 
> 
> 
> 
> 
> 
> Stock 6700K http://www.3dmark.com/spy/321436
> 6700K @ 4.6 http://www.3dmark.com/spy/313743
> Any1 have any ideas why this may be happening>?


The graphics score is roughly the same, so it's not a problem with your Titan. A 0.2% difference between runs is totally normal.


----------



## profundido

Quote:


> Originally Posted by *KillerBee33*
> 
> Got a delema , running stock 6700K gives me better performing Titan
> 
> 
> 
> 
> 
> 
> 
> 
> Stock 6700K http://www.3dmark.com/spy/321436
> 6700K @ 4.6 http://www.3dmark.com/spy/313743
> Any1 have any ideas why this may be happening>?


Quote:


> Originally Posted by *unreality*
> 
> graphics score is roughly the same so not a problem with your titan. a 0.2% difference in a run is totally normal


Besides the error margin (rerun your tests enough times to rule that out), there's one thing I would think about immediately: if you still have that Kraken CPU radiator mounted up front instead of on top, it will dump more heat into your case when the CPU is overclocked, causing more thermal throttling on your new Titan XP in maxed-out benchmarks than when running the CPU at stock.


----------



## toncij

Quote:


> Originally Posted by *profundido*
> 
> Yes, I play on the Asus ROG Swift PG27AQ. Currently, 4K@60Hz is still the maximum eyecandy possible gaming-wise, until its successor with 4K@120/144Hz comes out at the beginning of 2017 (they've already had it running as a proof of concept for 3 months). A single Titan XP handles modern games very decently when put under water, with averages close to 60fps, but it still drops below that. On air, due to thermal throttling, you can only reach good results with lots of airflow and noise though!
> 
> The 2 Titans will not be throttled at all, or even affected. The monitor's 60Hz simply means it can only show 60fps; it does not affect the cards in any way. One can choose to limit the cards to 60fps in software/drivers, but again, that is the user's choice, not something the monitor imposes on the hardware. On the contrary: besides flat-out continuous 60fps eyecandy with no drops below 60 anymore, they create extra leeway for the upgrade to my new monitor in half a year. I'm hoping the new ROG Swift will be a 32" 4K@120Hz, which would unlock the full potential of a TXP SLI setup.
> 
> Since I'm really happy with a solid 60 frames for the sort of eyecandy games I play, I'm simply maxed out hardware-wise currently, but I understand that fps-game people will never settle for 60Hz monitors, so they will prefer to stay at lower resolutions to keep the fps at 100 and up. They get a lot less scaling from SLI setups though.


The 4K@120Hz monitor won't be ready before summer 2017, making Volta a reality well before you're able to fully enjoy your Titan X (Pascal) SLI. The TXP will be obsolete against Volta with HBM, proper async support, etc.

Quote:


> Originally Posted by *Bloodymight*
> 
> What?!
> 
> not sure why you would say that
> 
> Go read ONE review about the Titan X Pascal and you will see there are quite a lot of games that barely run 100fps(not 144) at 1440p and a lot of games are to come which will be even more demanding, especially with the new consoles being released soon.
> 
> One Titan XP is certainly not "more than enough" for wqhd 144hz.


Now, I didn't have the chance to test it myself, but mathematically the Titan X (Pascal) has a theoretical advantage of at least 60% over the Titan X (Maxwell) when run at 2GHz versus the TXM's 1.465GHz. It also has roughly a 33% advantage over a 1080 at 2.15GHz.

Since a single TXP has no SLI scaling losses, we can assume it achieves that advantage consistently. In that case, at 1440p, I have yet to find a game where a TXP wouldn't give 100 FPS as a minimum...
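Those percentages can be sanity-checked with back-of-the-envelope FP32 throughput arithmetic. This is only a rough sketch: it uses the public CUDA core counts (TXP 3584, TXM 3072, GTX 1080 2560) and the clocks claimed above, and it ignores memory bandwidth, architecture differences, and boost behavior.

```python
# Rough theoretical FP32 throughput comparison (cores x clock).
def fp32_tflops(cores: int, clock_ghz: float) -> float:
    # 2 FLOPs per core per cycle (fused multiply-add)
    return cores * clock_ghz * 2 / 1000

txp = fp32_tflops(3584, 2.0)       # Titan X Pascal at 2.0 GHz
txm = fp32_tflops(3072, 1.465)     # Titan X Maxwell at 1.465 GHz
gtx1080 = fp32_tflops(2560, 2.15)  # GTX 1080 at 2.15 GHz

print(f"TXP vs TXM:  +{(txp / txm - 1) * 100:.0f}%")
print(f"TXP vs 1080: +{(txp / gtx1080 - 1) * 100:.0f}%")
```

This lands right at the ~60% figure over the Maxwell card; the 1080 comparison comes out closer to 30% than 33%, which is within the hand-waving involved here.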


----------



## KillerBee33

Quote:


> Originally Posted by *profundido*
> 
> Besides error margin (rerun your tests enough times to rule that out) there's 1 thing I would think about immediately. If you still have that cpu kraken radiator mounted up front instead of on top it will dump more heat into your case when the cpu is overclocked, causing more thermal throttling on your new titan xp in maximized benchmarks than when running the cpu at stock


Both radiators work as exhaust, and changing anything makes it much worse.

Temperatures are not horrible with this: 49-50 on the GPU and 59-65 on the CPU running Time Spy at +240/+650, and 5 degrees lower on the GPU in Firestrike.


----------



## kvickstick

Quote:


> Originally Posted by *Bloodymight*
> 
> What?!
> 
> not sure why you would say that
> 
> Go read ONE review about the Titan X Pascal and you will see there are quite a lot of games that barely run 100fps(not 144) at 1440p and a lot of games are to come which will be even more demanding, especially with the new consoles being released soon.
> 
> One Titan XP is certainly not "more than enough" for wqhd 144hz.


Tweak a few settings and you have 144fps with one Titan X Pascal. As an example, I run Ultra in Overwatch instead of Epic and I get 165 instead of 130. Can't tell the difference in fidelity. This is with a 1440p 165Hz G-Sync panel.


----------



## profundido

Quote:


> Originally Posted by *KillerBee33*
> 
> Both radiators work as exhaust and changing anything makes it much worse ,


Your setup is indeed the most efficient possible, case-position-wise. Yet your Titan can still thermal throttle easily if it's running on a single 12/14cm rad. I would run tests with the Afterburner graph open to make sure it never gets close to 80 degrees Celsius. One test after another, from cold (first run) to second and subsequent runs (heated-up environment), could already yield different results; then it might have nothing to do with your CPU overclock at all. Note that you would still see the same max temps in Afterburner but lower performance due to more/faster throttling.

Also, try running high-resolution benchmarks to rule out any possible CPU bottlenecking.
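For anyone who would rather log this than eyeball the Afterburner graph, a small polling loop over `nvidia-smi` does the same job. A minimal sketch: the query fields are standard `nvidia-smi` ones, and the 80°C flag is just the throttle point discussed above, not an official limit.

```python
import subprocess
import time

def parse_sample(line: str) -> tuple[int, int]:
    """Parse one 'temperature, sm_clock' CSV line from nvidia-smi."""
    temp, clock = (int(x.strip()) for x in line.split(","))
    return temp, clock

def log_gpu(seconds: int = 60, interval: float = 1.0) -> None:
    """Print GPU temperature and SM clock once per interval."""
    cmd = ["nvidia-smi",
           "--query-gpu=temperature.gpu,clocks.sm",
           "--format=csv,noheader,nounits"]
    for _ in range(int(seconds / interval)):
        out = subprocess.check_output(cmd, text=True).strip()
        temp, clock = parse_sample(out.splitlines()[0])
        flag = "  <-- near throttle" if temp >= 80 else ""
        print(f"{temp} C  {clock} MHz{flag}")
        time.sleep(interval)
```

Run it in a second window during back-to-back benchmark passes; if the clock column sags on later runs while temps plateau, that's the throttling showing up even though the max temp looks the same.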


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Both radiators work as exhaust and changing anything makes it much worse ,
> 
> Temperatures are not horrible with this , 49-50 GPU and 59-65 CPU Running Time Spy +240 +650 and 5 degrees lower on the GPU in Firestrike


You can easily determine whether the GPU is hitting the temp, power, and voltage limits in AB 4.3 beta; they are in the graph.
Regarding the CPU thing: if "stock" is actually "load optimized defaults" with manual adjustment of the RAM frequency, then BCLK spread spectrum may be different, and the PCIe gen as well (actually, you should try TS with PCIe set to Gen2 for lower CPU overhead).


----------



## KillerBee33

Quote:


> Originally Posted by *profundido*
> 
> Your setup is indeed the most efficient possible, case-position-wise. Yet your Titan can still thermal throttle easily if it's running on a single 12/14cm rad. I would run tests with the Afterburner graph open to make sure it never gets close to 80 degrees Celsius. One test after another, from cold (first run) to second and subsequent runs (heated-up environment), could already yield different results; then it might have nothing to do with your CPU overclock at all. Note that you would still see the same max temps in Afterburner but lower performance due to more/faster throttling.
> 
> Also, try running high-resolution benchmarks to rule out any possible CPU bottlenecking.


Ran a few games yesterday for a few hours in an 80°F+ room temperature.

This is with a different setup, GPU radiator as intake, which didn't really work, as you can see.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> You can easily determine if the gpu is hitting the temp, power and voltage limits in AB 4.3beta. They are in the graph
> regarding the cpu thing.. if "stock" is actually "load optimizezed defaults with manual adjustment of the ram freq, bclk spreadspectrum may be different and PCIE gen also (actually, you should try TS with PCIE set to Gen2 - lower CPU overhead).


Tried Gen2, Gen3, and Auto; Auto still works better.
Keeping GPU-Z running, and it stays steady @ 1844, I think, while gaming, with core/memory and power at stock.


----------



## dante`afk

Quote:


> Originally Posted by *KillerBee33*
> 
> Ran few games yesterday for few hours in a 80+ room temperature
> 
> This is with different setup GPU Radiator as Intake which didnt really work as you can see.


As I said yesterday, your temps are normal. And 80F room temperature is really hot.


----------



## KillerBee33

Quote:


> Originally Posted by *dante`afk*
> 
> As I said yesterday, your temps are normal. And 80F room temperature is really hot.


I know, just wanted to test in a HOT environment. But the Titan X P still runs hotter than the 980.


----------



## RedM00N

No bios files yet for me to play with?

Tis a sad thing indeed.

And the cards are still out of stock


----------



## dante`afk

Quote:


> Originally Posted by *KillerBee33*
> 
> I know , just wanted to test in HOT environment . But still TitanX P runs hotter than the 980


It sure generates more heat than a 980, and that's normal too


----------



## profundido

Quote:


> Originally Posted by *KillerBee33*
> 
> Both radiators work as exhaust and changing anything makes it much worse ,
> 
> Temperatures are not horrible with this , 49-50 GPU and 59-65 CPU Running Time Spy +240 +650 and 5 degrees lower on the GPU in Firestrike


I would like to see a screenshot of your expanded Afterburner graph while running your Firestrike test multiple times.


----------



## KillerBee33

Quote:


> Originally Posted by *dante`afk*
> 
> It sure generates more heat than a 980, and that's normal too


Should've bought the 980 Ti kit for $69 instead.


----------



## KillerBee33

Quote:


> Originally Posted by *profundido*
> 
> I would like to see a screenshot of your expanded afterburner graph while running your Firestrike test multiple times


I can do a few when I get home.


----------



## Lobotomite430

Quote:


> Originally Posted by *KillerBee33*
> 
> Should've bought 980Ti kit for 69$ instead


Where are people finding these kits at this price? Everything I am finding is around $110.


----------



## KillerBee33

Quote:


> Originally Posted by *Lobotomite430*
> 
> Where are people finding these kits for this price? Every thing I am finding is around $110.


Heh, I got my Ti kit a while back from this seller for $74, and he keeps changing the price between $69 and $117, so just keep checking this link:

https://www.amazon.com/gp/product/B00ZQ4PFX2/ref=oh_aui_detailpage_o02_s00?ie=UTF8&psc=1


----------



## lyang238

I have one I bought but decided not to use for $70 shipped if you want.


----------



## Lobotomite430

Quote:


> Originally Posted by *lyang238*
> 
> I have one I bought but decided not to use for $70 shipped if you want.


I will think about it. I already ordered an Arctic Accelero this weekend since it was way cheaper than the EVGA 1080 hybrid kit, and considering they don't have any more.


----------



## DNMock

So if PCIe 4.0 GPU power is on the motherboard, does that mean the system BIOS will control the voltage on the GPU, so we won't need modded BIOSes ever again?


----------



## graymoon

Quote:


> Originally Posted by *KillerBee33*
> 
> Ran few games yesterday for few hours in a 80+ room temperature
> 
> This is with different setup GPU Radiator as Intake which didnt really work as you can see.


what is this program's name?


----------



## KillerBee33

Quote:


> Originally Posted by *graymoon*
> 
> what is this program's name?


NZXT CAM


----------



## axiumone

Quote:


> Originally Posted by *DNMock*
> 
> So if PCIE 4.0 GPU power is on the motherboard, that means that the system Bios will control the Voltage on the GPU so we won't need modded bios ever again, right?


No, that doesn't mean that at all. The voltage of the GPU and the source it's getting power from have very little correlation. Otherwise you'd already see PSUs that could directly control GPU and CPU voltage.


----------



## mattlach

Quote:


> Originally Posted by *RedM00N*
> 
> No bios files yet for me to play with?
> 
> Tis a sad thing indeed.
> 
> And the cards are still out of stock


How long does it typically take until we have BIOS modding for a new generation of cards?

My Pascal Titan X is screaming for some added voltage and an edited power limit.

My max stable overclock is 2063MHz, and with my EK full-cover block it can sit there under full load at 34C all day...


----------



## mattlach

By the way...

Has anyone else had trouble with the 372.54 drivers?

After I completed my water loop, I upgraded to these using the GeForce Experience app. They seemed to work well, but when I tried to play Fallout New Vegas it kept crashing.

I figured, huh, maybe my drivers somehow had become corrupt, so I went to Geforce.com to manually download the drivers and reinstall, doing a clean install.

I saw a new section on their website for "Windows 10 Anniversary Update", so I used that. The drivers kept failing to install, though, for some reason.

So I did some cleaning up, uninstalling all Nvidia components already installed, and went to try the install again; the driver install still failed.

Immediately afterward, however, Windows auto-detected the drivers and installed them for me again. After checking, I was now back on the 369.05 launch-day drivers. I thought this was odd, but Fallout New Vegas runs fine with these drivers, so I think I'll keep them until I'm done playing it, or until the next driver release after 372.54 comes out.


----------



## toncij

Quote:


> Originally Posted by *mattlach*
> 
> By the way...
> 
> Has anyone else had trouble with the 372.54 drivers?


Yes. Disastrous driver.


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> Yes. Disastrous driver.


Everything is smooth here, much better benchmark results than 369.


----------



## toncij

Quote:


> Originally Posted by *KillerBee33*
> 
> Everything smooth here , much better Benchmark results than 369.


Completely killed my 1080s machine. Crashes, overclock artifacts... awful. :/


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> Completely killed my 1080s machine. Crashes, overclock artifacts... awful. :/


I start seeing artifacts after 1400 on the memory. Haven't really checked any games other than NFS 2016.
Might be silly, but have you tried this?


----------



## axiumone

Quote:


> Originally Posted by *toncij*
> 
> Completely killed my 1080s machine. Crashes, overclock artifacts... awful. :/


No issues whatsoever on my end with 1080s.


----------



## toncij

Quote:


> Originally Posted by *KillerBee33*
> 
> Start seeing artifacts after 1400 on the memory. Haven't checked any games really other than NFS 2016
> Might be silly but have you tried this?


Older driver? Yes, 369.09 works perfectly.

Btw, can anyone please tell me dimensions of the box Nvidia sends Titan X in? Thanks. Got some packages and need to repackage them and tell my courier up front the dimensions.


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> Older driver? Yes, 369.09 works perfectly.
> 
> Btw, can anyone please tell me dimensions of the box Nvidia sends Titan X in? Thanks. Got some packages and need to repackage them and tell my courier up front the dimensions.


Forget the numbers. Extract the downloaded driver to a folder and delete everything except the files shown in this image, then install.

Also try DDU.
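If you'd rather script the "delete everything except" step, something like this works on an already-extracted driver folder. A sketch only: the keep-list below is the commonly circulated minimal set for a display-driver-only install, it can vary by driver version, so verify it against your own extracted package before deleting anything.

```python
import shutil
from pathlib import Path

# Commonly circulated minimal set for a display-driver-only install.
# Treat this list as an assumption; check it against your package.
KEEP = {"Display.Driver", "NVI2", "EULA.txt",
        "ListDevices.txt", "setup.cfg", "setup.exe"}

def slim_driver_folder(folder: str) -> list[str]:
    """Remove everything in an extracted driver folder except KEEP.

    Returns the sorted names of the entries that were removed.
    """
    removed = []
    for entry in Path(folder).iterdir():
        if entry.name not in KEEP:
            if entry.is_dir():
                shutil.rmtree(entry)
            else:
                entry.unlink()
            removed.append(entry.name)
    return sorted(removed)
```

Run it on the extraction folder, then launch `setup.exe` from what's left for a driver-only install with no GeForce Experience baggage.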


----------



## HyperMatrix

Quote:


> Originally Posted by *dante`afk*
> 
> 
> Someone's only hearing what he wants to hear. It's like those Trump voters.


I still haven't figured out why you're even on this site.


----------



## mattlach

Quote:


> Originally Posted by *KillerBee33*
> 
> Everything smooth here , much better Benchmark results than 369.


Yeah, I tried again today.

DDU in safe mode, followed by a reboot and installing 372.54 drivers results in a driver installation failed message.

If I go into the device manager, right click and manually point it at the 372.54 drivers it will install. After that I can run the Nvidia installer package again and it will install the rest of the Nvidia components as well, but upon a reboot I'll have a code 43 error.

The 372.54 drivers are _completely_ broken on my system. The only way I can get it to work is to go back to 369.05

The 369.05 drivers work perfectly though.


----------



## KillerBee33

Quote:


> Originally Posted by *mattlach*
> 
> Yeah, I tried again today.
> 
> DDU in safe mode, followed by a reboot and installing 372.54 drivers results in a driver installation failed message.
> 
> If I go into the device manager, right click and manually point it at the 372.54 drivers it will install. After that I can run the Nvidia installer package again and it will install the rest of the Nvidia components as well, but upon a reboot I'll have a code 43 error.
> 
> The 372.54 drivers are _completely_ broken on my system. The only way I can get it to work is to go back to 369.05
> 
> The 369.05 drivers work perfectly though.


Must be the OS then; nothing wrong with this driver. In fact, I posted two results a bit earlier, 369 vs 372.
Uninstall with DDU, let Windows install the driver for you, then just install standalone PhysX and uninstall Nvidia Update. I hear the Windows AU installs 372.


----------



## mattlach

Quote:


> Originally Posted by *HyperMatrix*
> 
> I don't feel like going over the SLI fight again, where people who buy a single card ridicule the ones who get more, because "SLI is bad and it's a waste of money", just like non-Titan XP members sometimes ridicule us for buying a card that is way overpriced and makes no logical sense to buy. Forget the scaling portion. The problem here wasn't even about SLI. It was about SLI Titan XP's not having enough to max out 1440p at the moment. At which point, he decided to take an example of the FPS I was getting, by saying he's getting 144fps all the time except when he gets a cpu bottleneck, and concluding that SLI is bad and he is smarter by having just 1 card


To each their own. For me, however, I'd sacrifice my left nut to never have to deal with SLI ever again.

I had dual Radeon 6970's back in late 2010. Got them because I had just gone 2560x1600 and no single card was fast enough. The crossfire experience was awful. Poor minimum framerates, stutter, input lag, game compatibility and crashing, you name it.

I didn't learn from my mistakes. When I again was a high resolution early adopter last year when I got my 4K screen, the only solution that was fast enough was to go with dual cards. I got dual 980ti's. I was willing to try it because everyone always says SLI is so much better than Crossfire.

To be fair, my SLI experience was indeed better than Crossfire, but that isn't saying much. I still had many titles that didn't work well with SLI, I had poor minimum frame rates, high input lag and stutter, and had to constantly fight with the SLI profiles every time a new driver came out.

I understand that for certain AAA titles, the SLI experience isn't too bad because they put a ton of time into optimizing their games for SLI, but I'm not really into "Call of Modern Battlefield: Online" so this was not my experience at all.

My SLI experience was what was behind my motivation to spend $1200 for a Titan X and an additional $1000 on a custom water loop this time around. I wanted to make sure I could get minimum framerates of 60 fps or above at 4K with eye candy turned on, in the games that I play, without ever having to go anywhere near SLI again.

Suffice it to say, I HATE SLI. If it works for you, that's great. I'm not going to tell you or anyone else what to do. But I'm never touching it again.


----------



## mattlach

Quote:


> Originally Posted by *KillerBee33*
> 
> Must be the OS then , nothing wrong with this driver, infact i posted two results a bit earlier 369 vs 372
> Uninstall with DDU , let Windows install it for you then just install PhysX standalone and Uninstall Nvidia Update , i hear Windows AU installs 372 .


OS is Windows 10 Pro 64bit with Anniversary Update installed. What are you on?


----------



## KillerBee33

Quote:


> Originally Posted by *mattlach*
> 
> OS is Windows 10 Pro 64bit with Anniversary Update installed. What are you on?


I tried the upgrade and a few things went south, so I just did a clean install. It works like it should, while many claim the Anniversary Update is a headache.
You can get an ISO from Microsoft with the AU in it.


----------



## mattlach

Quote:


> Originally Posted by *KillerBee33*
> 
> I tried Upgrade and few things went south so i just did a clean install , works like it should be while many claim Anniversary Update is a headache .
> You can get ISO from Microsoft with AU in it .


You could be right, but I don't think it's the Anniversary Update that is causing the problems. Everything else works perfectly, and 369.05 drivers work perfectly.

I suspect there is some sort of incompatibility in the 372.54 drivers that is not working well with some other installed software or hardware in my system.

I'll stay on 369.05 for now and wait for the next driver release to see if that fixes it. If it doesn't, then I'll consider doing the clean install.

With Microsoft's new model of doing a major build release every 6-8 months, there is no way I want to do a clean install every time, especially for an OS I only use for games...


----------



## KillerBee33

Quote:


> Originally Posted by *mattlach*
> 
> You could be right, but I don't think it's the Anniversary Update that is causing the problems. Everything else works perfectly, and 369.05 drivers work perfectly.
> 
> I suspect there is some sort of incompatibility in the 372.54 drivers that is not working well with some other installed software or hardware in my system.
> 
> I'll stay on 369.05 for now and wait for the next driver release to see if that fixes it. If it doesn't, then I'll consider doing the clean install.
> 
> With Microsoft's new model of doing a major build release every 6-8 months, there is no way I want to do a clean install every time, especially for an OS I only use for games...


I've got big things like games installed on a 2nd SSD, so a clean install, fully loaded, is a bit under an hour to do.


----------



## mattlach

Quote:


> Originally Posted by *KillerBee33*
> 
> I got big things like Games installed on 2nd SSD , basically Clean install fully loaded is a bit under an hour to do.


My Linux installs usually take me 10 minutes or less, but I always wind up having problems with Windows. Finding all the latest drivers for everything, disabling all the Windows 10 junk, getting everything set up the way I like it. It takes a while.


----------



## mbze430

372 is giving me issues with VR stuff. Juddering and all. Everyone is reporting the same in the Oculus forum.


----------



## KillerBee33

Quote:


> Originally Posted by *mattlach*
> 
> My Linux installs usually take me 10 minutes or less, but I always wind up having problems with windows. Finding all the latest drivers for everything, disabling all the Windows 10 junk, getting everything set up the way I like it. It takes a while.


Ehh, I used to mess around customizing Win7; now I leave it mostly as is.


----------



## KillerBee33

Quote:


> Originally Posted by *mbze430*
> 
> 372 is giving me issues with VR stuff. Juddering and all. Everyone is reporting the same in the Oculus forum.


I've had judder in VR using Virtual Desktop ever since the Titan was installed; haven't noticed anything new with 372.


----------



## mbze430

Quote:


> Originally Posted by *KillerBee33*
> 
> I have judder in VR using VirtualDesktop all the way from when Titan was installed , havent noticed anything new with 372


In Pcars, even at minimum settings, looking directly left or right at race speed I can see juddering; going back to the 36x drivers I don't have that. Similar with Dirt Rally and Assetto Corsa.


----------



## KillerBee33

Quote:


> Originally Posted by *mbze430*
> 
> In Pcars, even at minimum settings, looking directly left or right at race speed I can see juddering; going back to the 36x drivers I don't have that. Similar with Dirt Rally and Assetto Corsa.


I was gonna mention Pcars; tried it Saturday and it runs flawlessly at high settings with DX2, without DPO though. I always check on Cadwell Park; it seems to be the worst-optimized track, plus it drops no matter what around the bridge when sun flare is on.


----------



## ratzofftoya

Darn it. EVGA doesn't make a 3-slot HB bridge appropriate for the RVE/RVE10 SLI spacing. Guess I'll have to go with the NVIDIA bridge and saw off the stupid pointy parts to make room for the EK waterblocks.


----------



## KillerBee33

Quote:


> Originally Posted by *profundido*
> 
> I would like to see a screenshot of your expanded afterburner graph while running your Firestrike test multiple times


6700 stock
6700 overclocked
Same clocks, +225/+600, Firestrike runs


----------



## Menthol

Quote:


> Originally Posted by *ratzofftoya*
> 
> Darn it. EVGA doesn't make a 3-slot HB bridge appropriate for the RVE/RVE10 SLI spacing. Guess I'll have to go with the NVIDIA bridge and saw off the stupid pointy parts to make room for the EK waterblocks.


Yes they do; the 2-slot spacing is the one you want. You order by the number of empty PCIe slots in between your cards, regardless of whether there is an actual PCIe slot in the space. This one here:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814998131


----------



## mattlach

Quote:


> Originally Posted by *KillerBee33*
> 
> Ehh, I used to mess around customizing Win7; now I leave it mostly as is.


I'm the opposite. I left Windows 7 as it was. It didn't bother me.

Windows 10 is awful IMHO. I refuse to use it without "de-clouding" it: removing Cortana, disabling all the apps and the Windows Store, and setting all privacy settings to their strictest levels.


----------



## mattlach

Quote:


> Originally Posted by *mattlach*
> 
> You could be right, but I don't think it's the Anniversary Update that is causing the problems. Everything else works perfectly, and 369.05 drivers work perfectly.
> 
> I suspect there is some sort of incompatibility in the 372.54 drivers that is not working well with some other installed software or hardware in my system.
> 
> I'll stay on 369.05 for now and wait for the next driver release to see if that fixes it. If it doesn't, then I'll consider doing the clean install.
> 
> With Microsoft's new model of doing a major build release every 6-8 months, there is no way I want to do a clean install every time, especially for an OS I only use for games...


Well, it looks like I am far from the only one with trouble. They are up to 40 pages in no time in the 372.54 driver feedback thread over in the Nvidia forums.

Makes you wonder what has been going on. Their flawless drivers have been one of the biggest selling points for buying Nvidia products for a long time, but they seem to really be dropping the ball with Pascal: first the DPC latency issues and now this abysmal driver release.

Almost makes you wonder if there is a new team developing drivers over at Nvidia or something.


----------



## KillerBee33

Quote:


> Originally Posted by *mattlach*
> 
> I'm the opposite. I left Windows 7 as it was. It didn't bother me.
> 
> Windows 10 is awful IMHO. I refuse to use it without "de-clouding", removing Cortana, disabling all the apps and windows store, and setting all privacy settings to their strictest levels.


No need for all that; just sign in as Admin and delete any other users. None of the modern apps can run under the Administrator account. That is, if you're that paranoid.


----------



## axiumone

Quote:


> Originally Posted by *mattlach*
> 
> Well, it looks like I am far from the only one with trouble. They are up to 40 pages in no time in the 372.54 driver feedback thread over in the Nvidia forums.
> 
> Makes you wonder what has been going on. Their flawless drivers has been one of the biggest selling points for buying Nvidia products for a long time, but they seem to really be dropping the ball with Pascal. First the DPC Latency issues and now this abysmal driver release.
> 
> Almost makes you wonder if there is a new team developing drivers over at Nvidia or something.


Where have you been? The drivers haven't been great for a year and a half easily. Ever since gameworks, gsync, shield, deep learning, geforce experience, the touted army of engineers have been working on anything but the actual drivers.

The deeper you buy into the ecosystem, the worse time you'll have, because half the features conflict with others. To add insult to injury, there's close to no communication with the fan base. The only way to get a resolution is to get a media outlet to cover it. Then nvidia will post a "We are looking into it".


----------



## mattlach

Quote:


> Originally Posted by *KillerBee33*
> 
> No need for all that, just sign in as Admin and delete any other users. None of the modern apps can run under Administrators account. That is if you that paranoid


Bad idea to run as an administrator as your primary account.

It is best practice to have your main user account separate from your admin account, and only log in to the admin account when - you know - performing admin tasks.

Also, UAC must always remain on, and patches must always be installed promptly.

Computer security is important!


----------



## KillerBee33

Quote:


> Originally Posted by *mattlach*
> 
> Bad idea to run as an administrator as your primary account.
> 
> It is best practice to have your main user account separate from your admin account, and only log in to the admin account when - you know - performing admin tasks.
> 
> Also, UAC must always remain on, and patches must always be installed promptly.
> 
> Computer security is important!


Humm, you believing all these things you just mentioned might be the reason why you have issues with the Windows AU and driver installation.

I have no better way of putting this; it just had to be said.

Activate the Admin account, or even make a new user with admin privileges, and try to install that 372; you can delete that user after.


----------



## mattlach

Quote:


> Originally Posted by *KillerBee33*
> 
> Humm, you believing all these things you just mentioned might be the reason why you have issues with the Windows AU and driver installation.
> 
> I have no better way of putting this; it just had to be said.
> 
> Activate the Admin account, or even make a new user with admin privileges, and try to install that 372; you can delete that user after.


Of course. I log in with my admin account when updating drivers. Those are the "admin tasks" I was referring to.

You don't run games/programs/browse the web etc. in an admin account though. That is very inadvisable behavior from a security standpoint. We aren't on Windows XP anymore.


----------



## mbze430

Quote:


> Originally Posted by *mattlach*
> 
> Bad idea to run as an administrator as your primary account.
> 
> It is best practice to have your main user account separate from your admin account, and only log in to the admin account when - you know - performing admin tasks.
> 
> Also, UAC must always remain on, and patches must always be installed promptly.
> 
> Computer security is important!


Unless you're MCSE trained, I don't think a whole lot of people know that.


----------



## EQBoss

Put two Titans under water. 2.1GHz seems to be fairly easy on water with voltage at 1.09V. The cards are definitely starved for power; I wish it were two 8-pins. I hit 126% TDP in ROTTR pretty easily. Increasing the memory clock seems to take away from the GPU clock due to power limits. The cards stay in the high 30s/low 40s under load. Pretty happy with the performance overall; still need my HB bridge, and the EVGA ones are hard to come by.


----------



## Gary2015

Quote:


> Originally Posted by *toncij*
> 
> To be honest, I don't think he does it wrong: maxed out settings, 5960X @ 4.7, 128GB of 2.8GHz CL15 1T - these are the settings for 2152MHz 1080s in SLI compared to projected TXP performance across the board:
> 
> 
> http://imgur.com/RIRcH
> 
> 
> 
> 
> When SLI works, which is rare nowadays, its scaling is so erratic, while the TXP offers consistent performance all the time, compared to dual 1080s where the 2nd card is an expensive paperweight half the time. Even worse for Titan X (Pascal).
> 
> If we could guarantee that 1080 SLI worked more than in 50% of situations, we'd have something like Titan X (Pascal) performance in some games and high scaling in others, but atm it's 20% scaling high, 30% worse than Titan X (Pascal) and 50% not at all or worse than a single card.


I'm doing fine with SLI, getting an extra 25fps consistently in ESO.


----------



## Gary2015

Quote:


> Originally Posted by *mattlach*
> 
> Bad idea to run as an administrator as your primary account.
> 
> It is best practice to have your main user account separate from your admin account, and only log in to the admin account when - you know - performing admin tasks.
> 
> Also, UAC must always remain on, and patches must always be installed promptly.
> 
> Computer security is important!


Never ran a second account . Never had any problems.


----------



## Gary2015

Quote:


> Originally Posted by *mattlach*
> 
> Well, it looks like I am far from the only one with trouble. They are up to 40 pages in no time in the 372.54 driver feedback thread over in the Nvidia forums.
> 
> Makes you wonder what has been going on. Their flawless drivers has been one of the biggest selling points for buying Nvidia products for a long time, but they seem to really be dropping the ball with Pascal. First the DPC Latency issues and now this abysmal driver release.
> 
> Almost makes you wonder if there is a new team developing drivers over at Nvidia or something.


Hotfix out?


----------



## CallsignVega

Just got done with some testing single Titan-XP vs SLI:



All games tested at max graphics with the system in my sig. Crysis 3, SWBF and Metro LL can all be played single GPU at 4K @ 60 Hz. Witcher 3 is close.

Crysis 3, SWBF, and Metro LL would all be pushed properly on the 4K @ 120 Hz displays coming this winter. A single Titan-XP won't even be close for 4K @ 120 Hz monitors. I couldn't get Witcher 3 SLI usage above 80% on each GPU no matter what I did, hence only a 47% gain. The rest of the games averaged an 83% FPS gain, which is quite nice. Star Citizen would be playable with a Titan-XP at 4K @ 120 Hz if you lower some settings.
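That ~83% average is just the per-game SLI gain averaged. For anyone checking their own runs, the arithmetic is simply the following (a sketch with made-up FPS values, not the actual results quoted above):

```python
def sli_gain_pct(single_fps: float, dual_fps: float) -> float:
    """Percent FPS gain from adding the second card."""
    return (dual_fps / single_fps - 1) * 100

# Hypothetical example numbers for illustration only:
runs = [(52, 96), (60, 108), (45, 84)]
gains = [sli_gain_pct(s, d) for s, d in runs]
print([f"{g:.0f}%" for g in gains])           # per-game gain
print(f"avg {sum(gains) / len(gains):.0f}%")  # average scaling
```

Anything approaching +100% would be perfect scaling; the Witcher 3 case above, stuck at 80% GPU usage per card, is what sub-50% scaling looks like in these terms.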


----------



## Gary2015

Quote:


> Originally Posted by *Bloodymight*
> 
> What?!
> 
> not sure why you would say that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Go read ONE review about the Titan X Pascal and you will see there are quite a lot of games that barely run 100fps(not 144) at 1440p and a lot of games are to come which will be even more demanding, especially with the new consoles being released soon.
> 
> One Titan XP is certainly not "more than enough" for wqhd 144hz.


Agree. You're kidding yourself if you say one XP is enough for 144 fps.


----------



## Gary2015

Quote:


> Originally Posted by *CallsignVega*
> 
> Just got done with some testing single Titan-XP vs SLI:
> 
> 
> 
> All games tested at max graphics with the system in my sig. Crysis 3, SWBF and Metro LL can all be played single GPU at 4K @ 60 Hz. Witcher 3 is close.
> 
> Crysis 3, SWBF, Metro LL would all be pushed properly for 4K @ 120 Hz displays this winter. A single Titan-XP won't even be close for 4K @ 120 Hz monitors. I couldn't get Witcher 3 SLI usage above 80% on each GPU no matter what I did, hence only 47% gain. The rest of the games averaged 83% FPS gain which is quite nice. Star Citizen would be playable with Titan-XP 4K @ 120 Hz if you lower some settings.


Well, there you go. So, naysayers of SLI, better save your pennies for another card...


----------



## Gary2015

Quote:


> Originally Posted by *mattlach*
> 
> To each their own. I'd say that for me - however - I'd sacrifice my left nut to never have to deal with SLI ever again.
> 
> I had dual Radeon 6970's back in late 2010. Got them because I had just gone 2560x1600 and no single card was fast enough. The crossfire experience was awful. Poor minimum framerates, stutter, input lag, game compatibility and crashing, you name it.
> 
> I didn't learn from my mistakes. When I again was a high resolution early adopter last year when I got my 4K screen, the only solution that was fast enough was to go with dual cards. I got dual 980ti's. I was willing to try it because everyone always says SLI is so much better than Crossfire.
> 
> To be fair, my SLI experience was indeed better than Crossfire, but that isn't saying much. I still had many titles that didn't work well with SLI, I had poor minimum frame rates, high input lag and stutter, and had to constantly fight with the SLI profiles every time a new driver came out.
> 
> I understand that for certain AAA titles, the SLI experience isn't too bad because they put a ton of time into optimizing their games for SLI, but I'm not really into "Call of Modern Battlefield: Online" so this was not my experience at all.
> 
> My SLI experience was what was behind my motivation to spend $1200 for a Titan X and an additional $1000 on a custom water loop this time around. I wanted to make sure I could get minimum framerates of 60 fps or above at 4K with eye candy turned on, in the games that I play, without ever having to go anywhere near SLI again.
> 
> Suffice it to say, I HATE SLI. If it works for you, that's great. I'm not going to tell you or anyone else what to do. But I'm never touching it again


Nonsense. My SLI has 50% scaling in ALL my games except No Man's Sky.


----------



## Gary2015

Quote:


> Originally Posted by *unreality*
> 
> Imo its pretty easy.
> 
> WQHD 144Hz+ -> One Titan Xp is more than enough
> 4K/5K/DSR users -> Two Titan Xp can bring some extra power in some games


Vega just debunked what you said! SLI FTW!!!


----------



## CallsignVega

I just basically confirmed Titan-XP SLI will almost be required for the upcoming 4K @ 120 Hz displays. Even Volta isn't going to have the power to drive such a display single GPU IMO.

We all know 4k @ 120 Hz displays will be the next "big thing" in gaming.







Actually, 4K @ 144 Hz is already said to be coming down the line. Some _serious_ GPU power required!


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> Just got done with some testing single Titan-XP vs SLI:
> 
> 
> 
> All games tested at max graphics with the system in my sig. Crysis 3, SWBF and Metro LL can all be played single GPU at 4K @ 60 Hz. Witcher 3 is close.
> 
> Crysis 3, SWBF, Metro LL would all be pushed properly for 4K @ 120 Hz displays this winter. A single Titan-XP won't even be close for 4K @ 120 Hz monitors. I couldn't get Witcher 3 SLI usage above 80% on each GPU no matter what I did, hence only 47% gain. The rest of the games averaged 83% FPS gain which is quite nice. Star Citizen would be playable with Titan-XP 4K @ 120 Hz if you lower some settings.


This is clearly fake. I've been reassured repeatedly on this thread that SLI only scales 30% and is a total waste of time and money.


----------



## mattlach

Quote:


> Originally Posted by *EQBoss*
> 
> Put 2 titans under water 2.1 ghz seems to be fairly easy on water with voltage at 1.09. Cards are definitely starved for power, wish it was 2 8 pins I hit 126 % tdp in ROTTR pretty easy. Increasing the memory clock seems to take away from gpu clock due to power limits. Cards stay in high 30s low 40s on load. Pretty happy with performance overall, still need my hb bridge, evga ones are hard to come by.


How are you adjusting voltage? I haven't found any tool that can do that yet...


----------



## HaniWithAnI

Quote:


> Originally Posted by *mattlach*
> 
> How are you adjusting voltage? I haven't found any tool that can do that yet...


Afterburner 4.3.0 beta. Search the thread or my post history for the [setting]; I posted it earlier.


----------



## EQBoss

Quote:


> Originally Posted by *mattlach*
> 
> How are you adjusting voltage? I haven't found any tool that can do that yet...


I tried MSI Afterburner but couldn't get voltage control to work and was too lazy to fix it; Asus GPU Tweak II worked right out of the box. The program's a POS though, gonna see if I can get Afterburner running later.


----------



## toncij

Quote:


> Originally Posted by *Gary2015*
> 
> I'm doing fine with SLI getting an extra 25fps consistent on ESO.


25 FPS from how much before?
Quote:


> Originally Posted by *CallsignVega*
> 
> Just got done with some testing single Titan-XP vs SLI:
> 
> 
> 
> All games tested at max graphics with the system in my sig. Crysis 3, SWBF and Metro LL can all be played single GPU at 4K @ 60 Hz. Witcher 3 is close.
> 
> Crysis 3, SWBF, Metro LL would all be pushed properly for 4K @ 120 Hz displays this winter. A single Titan-XP won't even be close for 4K @ 120 Hz monitors. I couldn't get Witcher 3 SLI usage above 80% on each GPU no matter what I did, hence only 47% gain. The rest of the games averaged 83% FPS gain which is quite nice. Star Citizen would be playable with Titan-XP 4K @ 120 Hz if you lower some settings.


Now, I seriously doubt 4K @ 120 Hz will come before mid-2017.

Your tested games are the SLI-friendly ones. It always comes down to which games you're going to play. My selection is the set of games I actually own and play sometimes.
It seems that in my case, and in the case of reviewers going with a bunch of popular games, SLI usually works in about 50% of titles. That's what I don't like in particular.

Today we'll see how the new Deus Ex works.
Quote:


> Originally Posted by *Gary2015*
> 
> Well there you go. So naysayers of SLI , better save your pennies for another card...


I've ordered my second TXP to see for myself, but my 1080 SLI doesn't offer much hope.
Quote:


> Originally Posted by *Gary2015*
> 
> Nonsense. My SLI has 50% scaling in ALL my games except No Mans Sky.


Can you list all your games?
Quote:


> Originally Posted by *CallsignVega*
> 
> I just basically confirmed Titan-XP SLI will almost be required for the upcoming 4K @ 120 Hz displays. Even Volta isn't going to have the power to drive such a display single GPU IMO.
> 
> We all know 4k @ 120 Hz displays will be the next "big thing" in gaming.
> 
> 
> 
> 
> 
> 
> 
> Actually, 4K @ 144 Hz is already said to be coming down the line. Some _serious_ GPU power required!


If Volta comes with 6K cores, it might work up to 100...









Let's see...


----------



## sena

Any word on the EK HB bridge?


----------



## pez

Quote:


> Originally Posted by *HyperMatrix*
> 
> This is clearly fake. I've been reassured repeatedly on this thread that SLI only scales 30% and is a total waste of time and money.


Same here....so strange....


----------



## Silent Scone

Quote:


> Originally Posted by *CallsignVega*
> 
> I just basically confirmed Titan-XP SLI will almost be required for the upcoming 4K @ 120 Hz displays. Even Volta isn't going to have the power to drive such a display single GPU IMO.
> 
> We all know 4k @ 120 Hz displays will be the next "big thing" in gaming.
> 
> 
> 
> 
> 
> 
> 
> Actually, 4K @ 144 Hz is already said to be coming down the line. Some _serious_ GPU power required!


I've had 4K 40" all the way down to 28", with G-Sync, 1440p, 3440, 2560x1080, curved. All in all, over periods of a few months at a time, I've tried about 15 panels.

4K at 120 Hz, in my honest, wholehearted opinion, is a fool's dream. Yes, you do need two of these cards to hit that performance at those pixels, but do you need that many pixels? Unless you're looking at the density of something like the 40" VA Philips panel (soon to be discontinued), it's an utterly pointless show of hardware muscle. Even then, the only time I was really impressed was with Witcher 3 at that scale; some of the scenic views when standing still were amazing. Sadly, I don't tend to stand still for too long in games.

I'm not sure why the spate of HB bridges has suddenly gotten people raving about SLI, either. I've been using it for over 15 years, and the industry has been trying to move away from it for at least 6 of those. That's not to say I would not recommend having two of these cards at that resolution, because that would be lying. What I would do is chuckle a wee bit and ask why you are at that resolution at all.


----------



## profundido

Quote:


> Originally Posted by *KillerBee33*
> 
> I know , just wanted to test in HOT environment . But still TitanX P runs hotter than the 980


it does
Quote:


> Originally Posted by *sena*
> 
> Any word on ek hb bridge?


I asked the question again of two EK representatives on this forum, Akira and a colleague, and got word back that they're still requesting info on this internally and have nothing. So no, nothing yet.


----------



## DADDYDC650

Quote:


> Originally Posted by *Silent Scone*
> 
> I've had 4K 40" all the way down to 28" with G-Sync, 1440p, 3440, 2560x1080, curved. All in all over period of a few months at a time, I've tried about 15 panels.
> 
> 4K at 120hz in my honest, wholehearted opinion is a fools dream. Yes, you do need two of these cards to hit the performance at those pixels, but do you need that many pixels? Unless you're looking at a density on such as a 40" panel like the VA Phillips (soon to be discontinued), it's an utterly pointless show of hardware muscle. Even then, the only time I really was impressed was with Witcher 3 at that scale, some of the scenic views when standing still were amazing. Sadly, I don't tend to stand still for too long in games.
> 
> I'm not sure why the spur of HB bridges has suddenly gotten people raving about SLI, either. I've been using it for over 15 years, and the industry has been trying to move away from it for at least 6 of those. That's not to say I would not recommend having two of these cards at that resolution, because that would be lying. What I would do is chuckle a wee bit and ask why you are at that resolution at all.


Since you've tried pretty much every monitor, what would you say is the best?


----------



## markklok

Quote:


> Originally Posted by *Silent Scone*
> 
> I've had 4K 40" all the way down to 28" with G-Sync, 1440p, 3440, 2560x1080, curved. All in all over period of a few months at a time, I've tried about 15 panels.
> 
> 4K at 120hz in my honest, wholehearted opinion is a fools dream. Yes, you do need two of these cards to hit the performance at those pixels, but do you need that many pixels? Unless you're looking at a density on such as a 40" panel like the VA Phillips (soon to be discontinued), it's an utterly pointless show of hardware muscle. Even then, the only time I really was impressed was with Witcher 3 at that scale, some of the scenic views when standing still were amazing. Sadly, I don't tend to stand still for too long in games.
> 
> I'm not sure why the spur of HB bridges has suddenly gotten people raving about SLI, either. I've been using it for over 15 years, and the industry has been trying to move away from it for at least 6 of those. That's not to say I would not recommend having two of these cards at that resolution, because that would be lying. What I would do is chuckle a wee bit and ask why you are at that resolution at all.


Single XP + 40" VA Philips *check*


----------



## Silent Scone

Quote:


> Originally Posted by *DADDYDC650*
> 
> Since you've tried pretty much every monitor, what would you say is the best?


For 4K? The Philips, by a country mile. The perfect balance, somewhere between the Acer Z35 and X34.


----------



## DADDYDC650

Quote:


> Originally Posted by *Silent Scone*
> 
> For 4k? Phillips by a country mile. The perfect balance, somewhere between the Acer Z35 and x34.


I thought the Philips had a bunch of issues?


----------



## profundido

Quote:


> Originally Posted by *KillerBee33*
> 
> 6700 Stock
> [email protected]
> 
> Same Clocks +225+600 Firestrike runs


OK, from these screenshots you posted we can see that there's no thermal throttling on the GPU, so you're good and can sleep easy.

This leads me to believe that being 200 lower on a score of almost 10000 is nothing but random benchmarking error margin, which is what user Unreality already suggested in an earlier post.

Just to sleep easy, you could rerun your 3DMark benchmark 10 times at stock and 10 times with the OC to see if there is a definite, reproducible performance loss when overclocking the CPU.

But honestly, for 200 points more or less I wouldn't even bother investigating much further, seeing nothing wrong in AB. Then again, something tells me you probably will anyway.
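The "rerun it 10 times each way" advice can be sketched as a crude noise check: is the stock-vs-OC gap bigger than a couple of standard deviations of run-to-run scatter? This is a toy illustration only — the scores, the 2-sigma threshold, and the helper name are all made up, not anyone's actual data:

```python
import statistics

def reproducible_loss(stock_scores, oc_scores, sigmas=2.0):
    """Crude check: is the OC mean lower than the stock mean by more
    than `sigmas` times the run-to-run spread seen at stock?"""
    gap = statistics.mean(stock_scores) - statistics.mean(oc_scores)
    noise = statistics.pstdev(stock_scores)
    return gap > sigmas * noise

# Made-up scores: a ~170-point gap on ~10000 with ~75 points of run noise
stock = [9980, 10120, 10050, 9900, 10060]
oc = [9850, 9900, 9780, 9930, 9820]
print(reproducible_loss(stock, oc))     # True  -- gap exceeds the noise
print(reproducible_loss(stock, stock))  # False -- no gap at all
```

With only a 2x margin over single-run noise this is just a rule of thumb, which is why doing 10 runs per configuration helps.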


----------



## KillerBee33

Quote:


> Originally Posted by *profundido*
> 
> ok, from these screenshots you posted we see that there's no thermal throtlling on the GPU so you're good and can sleep easy
> 
> This leads me to believe that the 200 less on a score of almost 10000 is nothing but random benchmarking error margin which is what user Unreality already suggested earlier in a post
> 
> Just to sleep easy you could rerun your 3dmark benchmark 10 times on stock and 10 times on OC just to see if there is a definate reproducable performance loss when overlocking cpu
> 
> But honestly, for 200 score more or less I wouldn't even bother investigating much further seeing nothing wrong in AB. Then again something tells me you probably will anyway


Meh, maybe when a new driver or a non-beta AB shows up. Not running anything demanding or 10 monitors. I might be looking into a Swiftech instead of the Kraken in the near future though.


----------



## profundido

Quote:


> Originally Posted by *KillerBee33*
> 
> Meh, maybe when new driver or none BETA AB show up , not runing anything demanding or 10 monitors , i might be looking into SWIFTECH instead of KRAKEN in the near future though


You seem like a real enthusiast. I'm surprised you have not made the jump to a custom water loop yet. You would not regret it and would probably like the added tech part of it a lot. It would for sure give you more leeway to play around with.


----------



## KillerBee33

Quote:


> Originally Posted by *profundido*
> 
> you seem like a real enthousiast. I'm surprised you have not made the jump to a custom waterloop yet. You would not regret and probably like the added tech part of it alot. It wouls for sure give you more leeway to play around with


Honestly, I have no time for a water loop, although I'd love to try one day. Over 80 hours a week at work leaves me no desire to spend a day or more just putting things together. Heh, got so lazy last time I slapped TIM on with the Hybrid kit still attached to the case.


----------



## toncij

Now that you mention hybrid, has anyone tried the HG10 N980 on a TXP?


----------



## piee

Ordered a TXP; it was in stock for a few hours yesterday. I'll be selling my 980 Ti Classified (81.5 ASIC, EK blocked) as soon as it arrives. Eating beans and rice for a few months, LOL. Was going to wait for Volta but figured HBM isn't a major increase in FPS. At least the TXP hits 60 FPS in most games at 4K.


----------



## markklok

For the EK block owners...

Could anyone take a picture from the side to show how much space there is between the PCB and the EK block?

I would love to know how much clearance there is between the shunt mod and the EK block.


----------



## eliau81

Has anyone tried to fit the EVGA 980 Ti Hybrid shroud?


----------



## meson1

Quote:


> Originally Posted by *Silent Scone*
> 
> For 4k? Phillips by a country mile. The perfect balance, somewhere between the Acer Z35 and x34.


The Philips BDM4065UC?


----------



## Silent Scone

Quote:


> Originally Posted by *meson1*
> 
> The Philips BDM4065UC?


Yes


----------



## meson1

Quote:


> Originally Posted by *Silent Scone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *meson1*
> 
> The Philips BDM4065UC?
> 
> 
> 
> Yes
Click to expand...

Woo-hoo!









I got the same one. Presently I have a 780 Ti trying to drive it. Poor wee thing, it's trying its best. It's great for desktop real estate, but at the moment I have to play games in windowed mode or scaled. That's why I'm putting together a new rig, and why I'm now down to the last and most important piece of the puzzle (the GPU), which looks increasingly like it's going to have to be a TXP.


----------



## Gary2015

Quote:


> Originally Posted by *Silent Scone*
> 
> For 4k? Phillips by a country mile. The perfect balance, somewhere between the Acer Z35 and x34.


Philips? You're joking?


----------



## toncij

Quote:


> Originally Posted by *Gary2015*
> 
> Philips? You're joking?


He must be. There is no other explanation.


----------



## bl4ckdot

Can someone take a picture of every different type of screw I need to remove in order to install the EKWB block? What screwdriver do I need? Any advice?
Thank you


----------



## CallsignVega

Quote:


> Originally Posted by *Silent Scone*
> 
> I've had 4K 40" all the way down to 28" with G-Sync, 1440p, 3440, 2560x1080, curved. All in all over period of a few months at a time, I've tried about 15 panels.
> 
> 4K at 120hz in my honest, wholehearted opinion is a fools dream. Yes, you do need two of these cards to hit the performance at those pixels, but do you need that many pixels? Unless you're looking at a density on such as a 40" panel like the VA Phillips (soon to be discontinued), it's an utterly pointless show of hardware muscle. Even then, the only time I really was impressed was with Witcher 3 at that scale, some of the scenic views when standing still were amazing. Sadly, I don't tend to stand still for too long in games.
> 
> I'm not sure why the spur of HB bridges has suddenly gotten people raving about SLI, either. I've been using it for over 15 years, and the industry has been trying to move away from it for at least 6 of those. That's not to say I would not recommend having two of these cards at that resolution, because that would be lying. What I would do is chuckle a wee bit and ask why you are at that resolution at all.


Try a 4K LG 55" curved OLED as a monitor, pushed back a little on your desk. The resolution/quality is just staggering. Yes, 4K is really good. Ten years ago people asked why you would need anything more than 1080p; today 1080p makes my eyes want to bleed.

Hell, there are already 8K panels in the pipeline, which is four times the resolution of 4K. Now, the big deal with 4K @ 120 Hz will be OLED. OLED pixels are so fast that the boost from 60 Hz to 120 Hz would just be epic. If "flexing hardware muscle" comes along with the most amazing and immersive gaming experience, I guess I'll flex away...


----------



## DADDYDC650

Quote:


> Originally Posted by *CallsignVega*
> 
> Try a 4K LG 55" curved OLED as a monitor pushed pack a little on your desk. The resolution/quality is just staggering. Yes, 4K is really good. Ten years ago people thought why would you need anything more than 1080p? Today 1080p makes my eyes want to bleed.
> 
> Hell, there are already 8K panels in the pipeline which is four times the resolution of 4K. Now the big deal with 4K @ 120 Hz will be OLED. OLED pixels are so fast than the boost to 120 Hz from 60 Hz would just be epic. If "flexing hardware muscle" comes along with the most amazing and immersive gaming experience, I guess I'll flex away...


A 55" TV a few feet in front of you is a good way to go blind and get a tan!


----------



## CallsignVega

Quote:


> Originally Posted by *DADDYDC650*
> 
> A 55" TV a few feet in front of you is a good way to go blind and get a tan!


Don't knock it until you try it, it's like IMAX!










DOOM on it is the best looking game I've ever seen; pictures of it don't do it justice.

If Dell ever releases the 30" 4K @ 120 Hz OLED with a DP 1.3/1.4 connection, even at $5K it's an instant buy for me.


----------



## KillerBee33

Quote:


> Originally Posted by *DADDYDC650*
> 
> A 55" TV a few feet in front of you is a good way to go blind and get a tan!


8 feet away 50"


----------



## Fiercy

Quote:


> Originally Posted by *CallsignVega*
> 
> Don't knock it until you try it, it's like IMAX!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> DOOM on it is the best looking game I've ever seen, pictures of it doesn't do it justice.
> 
> If Dell ever releases the 30" 4K @ 120 Hz OLED with a DP 1.3/1.4 connection, even at $5K it's an instant buy for me.



I don't think it's good for your eyes to play like that, especially for very long gaming sessions. Another problem would be focusing on things because of how big the screen is, which makes it bad for any competitive play.


----------



## combat fighter

Quote:


> Originally Posted by *CallsignVega*
> 
> Don't knock it until you try it, it's like IMAX!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> DOOM on it is the best looking game I've ever seen, pictures of it doesn't do it justice.
> 
> If Dell ever releases the 30" 4K @ 120 Hz OLED with a DP 1.3/1.4 connection, even at $5K it's an instant buy for me.


Shame it's curved. It really adds nothing but distortion and ruins the TV.

Fortunately the fad is dying.

You only have to look at the new lineup of LG OLEDs; nearly all are flat.

55" is too small for 4K when used as a TV as well.


----------



## DADDYDC650

Quote:


> Originally Posted by *CallsignVega*
> 
> Don't knock it until you try it, it's like IMAX!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> DOOM on it is the best looking game I've ever seen, pictures of it doesn't do it justice.
> 
> If Dell ever releases the 30" 4K @ 120 Hz OLED with a DP 1.3/1.4 connection, even at $5K it's an instant buy for me.


I'm sure it looks amazing but I love my eyes too much to abuse them like that, lol! Beautiful TV nonetheless.


----------



## DADDYDC650

Quote:


> Originally Posted by *combat fighter*
> 
> *Shame it's curved. Really adds nothing but distortion* and ruins the TV.
> 
> Fortunately the fad is dying.
> 
> Only have to look at the new line up of LG OLED's nearly all are flat.
> 
> 55" is too small for 4K when used as a TV as well.


Truth. Curved monitors and TVs add nothing to the picture. Stop adding crap and give me better picture quality!


----------



## CallsignVega

Quote:


> Originally Posted by *Fiercy*
> 
> '
> 
> I don't think it's good for eyes to play like that especially for a very long gaming session. Another problem would be focusing on something because of how big is the screen which makes it bad for any competitive play.


Not an issue. Hell, VR displays are like one inch from your eyes. I do great in FPS's with it. Don't forget about the benefit of how much easier it is to see and target people at medium to far distances in a game.

Quote:


> Originally Posted by *combat fighter*
> 
> Shame it's curved. Really adds nothing but distortion and ruins the TV.
> 
> Fortunately the fad is dying.
> 
> Only have to look at the new line up of LG OLED's nearly all are flat.
> 
> 55" is too small for 4K when used as a TV as well.


A curve absolutely makes sense for a computer monitor. It isn't going anywhere; every single large display manufacturer has announced new curved panels coming for computer monitors.


----------



## Gary2015

Quote:


> Originally Posted by *toncij*
> 
> He must be. There is no other explanation.


Sheesh. I only bought one Philips thing in my life.


----------



## Gary2015

Quote:


> Originally Posted by *CallsignVega*
> 
> Don't knock it until you try it, it's like IMAX!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> DOOM on it is the best looking game I've ever seen, pictures of it doesn't do it justice.
> 
> If Dell ever releases the 30" 4K @ 120 Hz OLED with a DP 1.3/1.4 connection, even at $5K it's an instant buy for me.


Now that is DA BOMB!!!


----------



## unreality

Quote:


> Originally Posted by *Gary2015*
> 
> Agree. You're kidding yourself if you say one XP is enough at 144fps.


Quote:


> Originally Posted by *Gary2015*
> 
> Well there you go. So naysayers of SLI , better save your pennies for another card...


Quote:


> Originally Posted by *Gary2015*
> 
> Nonsense. My SLI has 50% scaling in ALL my games except No Mans Sky.


Quote:


> Originally Posted by *Gary2015*
> 
> Vega just debunked what you just said ! SLI FTW!!!


You seem to have some kind of disorder, replying three times in a row after your own posts.

I already proved you wrong several pages ago by posting benchmarks of SLI TXPs @ 1440p.

For 4K+ this changes, of course.

Still, each to their own; I do know I will have the better experience with a single heavily OCed TXP under water.


----------



## Woundingchaney

Quote:


> Originally Posted by *combat fighter*
> 
> Shame it's curved. Really adds nothing but distortion and ruins the TV.
> 
> Fortunately the fad is dying.
> 
> Only have to look at the new line up of LG OLED's nearly all are flat.
> 
> 55" is too small for 4K when used as a TV as well.


Why is 55" too small for 4K when used as a TV? Wouldn't this primarily be a matter of viewing distance rather than any general TV-vs-monitor distinction?
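The viewing-distance point can be made concrete with angular pixel density: the same 55" 4K panel is razor-sharp from a couch but coarse at desk range. A rough sketch — the function name, the example distances, and the ~60 px/deg "20/20 vision" rule of thumb are my own additions, not anything from the thread:

```python
import math

def pixels_per_degree(diag_in, res_w, res_h, distance_in):
    """Angular pixel density: higher means individual pixels are harder
    to resolve. ~60 px/deg is a common 20/20-vision rule of thumb."""
    # Physical panel width from the diagonal and aspect ratio
    width_in = diag_in * res_w / math.hypot(res_w, res_h)
    px_per_in = res_w / width_in
    # Width of one degree of visual angle at the given distance, in inches
    one_deg_in = 2 * distance_in * math.tan(math.radians(0.5))
    return px_per_in * one_deg_in

# 55" 4K panel: 8 ft (96") couch distance vs. 2.5 ft (30") desk distance
print(round(pixels_per_degree(55, 3840, 2160, 96)))  # ~134 px/deg -- plenty sharp
print(round(pixels_per_degree(55, 3840, 2160, 30)))  # ~42 px/deg -- pixels visible
```

So at couch distance 55" 4K is already beyond what most eyes can resolve, while at desk distance the same panel falls short — which is why "too small for 4K" only makes sense with a distance attached.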


----------



## DNMock

Quote:


> Originally Posted by *bl4ckdot*
> 
> Can someone take a picture of the every different type of screws I need to remove in order to install the EKWB block ? What screwdriver do I need ? Any advices ?


A 4mm socket (if you have a full socket set you should have one of these already)

An Allen wrench (should come with the EK block)

A small Phillips-head screwdriver, the same kind you would use for repairing glasses.


----------



## toncij

Quote:


> Originally Posted by *CallsignVega*
> 
> Not an issue. Hell, VR displays are like one inch from your eyes. I do great in FPS's with it. Don't forget about the benefit of how much easier it is to see and target people at medium to far distances in a game.
> Curve absolutely makes sense for a computer monitor. It isn't going anywhere. Every single large display manufacturer has announced new curved panels coming out for computer monitors.


I will always choose a curved display if I can, a 21:9 one of course. Too bad nobody produces high-PPI ones yet, and I can't really stand a pixelated desktop and apps any more...


----------



## Lennyx

Quote:


> Originally Posted by *bl4ckdot*
> 
> Can someone take a picture of the every different type of screws I need to remove in order to install the EKWB block ? What screwdriver do I need ? Any advices ?
> Thank you


I just finished installing the block myself. You need a really small Phillips screwdriver and a 4mm socket. First remove the 2 screws on the I/O shield, then all the small screws on the back with the screwdriver. Then remove the remaining screws on the back with the 4mm socket.

EK includes great instructions with pictures in the package.


----------



## Gary2015

One more hour until Deus Ex. Wanna see how DX12 runs on the XP.


----------



## Woundingchaney

OK guys, I've been looking for an AIO cooler that fits the TXP, but I can't seem to find any in stock that I know will fit.

Any help would be appreciated.

Before anyone asks, I don't have the gear for a custom water-cooling solution and don't quite have that level of dedication these days.


----------



## mattlach

Quote:


> Originally Posted by *Silent Scone*
> 
> For 4k? Phillips by a country mile. The perfect balance, somewhere between the Acer Z35 and x34.


For what it's worth, a year later I am still VERY happy with my 48" Samsung JS9000.

I wish it could hit above 60 Hz, and it is just a tad too large (it probably would have been better in the 42"-44" range), but I am still very happy.

I don't see the point of 4K under 40". I don't buy all this extra resolution only to have to scale it up.


----------



## mattlach

Quote:


> Originally Posted by *Woundingchaney*
> 
> Ok guys been looking for an AIO cooler that fits the TXp, cant seem to find any in stock that I know will fit.
> 
> Any help would be appreciated.
> 
> Before anyone asks, I dont have the gear for a custom water cooling solution and dont quite have that level of dedication these days.


I don't think there are any official solutions yet. I have seen modifications of older-design EVGA Hybrid kits though (they involved Dremeling out some metal to make way for a capacitor, but otherwise fit just fine).

Look at this and this.


----------



## mattlach

Quote:


> Originally Posted by *combat fighter*
> 
> Shame it's curved. Really adds nothing but distortion and ruins the TV.
> 
> Fortunately the fad is dying.
> 
> Only have to look at the new line up of LG OLED's nearly all are flat.
> 
> 55" is too small for 4K when used as a TV as well.


My 48" JS9000 is curved, and I think it is great as a monitor.

When sitting 2 - 2.5 ft (60 - 75 cm) away from it, I find it helps me see the corners.

I wouldn't get curved if I were using it at normal TV watching distances though.


----------



## Woundingchaney

Quote:


> Originally Posted by *mattlach*
> 
> I don't think there are any official solutions yet. I have seen modifications of older design EVGA hybrid kits though (involved dremeling out some metal to make way for a Capacitor, but otherwise fit just fine)
> 
> Look at this and this.


Yeah, I was looking into it and found this video; only problem is I can't find anything in stock.

http://www.gamersnexus.net/guides/2568-titan-x-pascal-hybrid-results-clock-throttling-on-reference


----------



## mattlach

Quote:


> Originally Posted by *Woundingchaney*
> 
> Yeah was looking into it and found this video, only problem is I cant find anything in stock.
> 
> http://www.gamersnexus.net/guides/2568-titan-x-pascal-hybrid-results-clock-throttling-on-reference


Yeah, it's often difficult to find these for previous-gen cards. They stop making them after a while, because most people buy them for their shiny new GPUs, and more than a few months after the initial launch there is a very small market for them.

I'd try eBay or the for-sale/for-trade sections on hardware enthusiast forums. With many people upgrading to 1080s and Pascal Titan Xs, there might be some good deals out there.


----------



## DNMock

Quote:


> Originally Posted by *mattlach*
> 
> For what it's worth, a year later I am still VERY happy with my 48" Samsung JS9000.
> 
> I wish it could hit above 60hz, and it is just a tad bit too large (probably would have been better in the 42"-44" range, but I am still very happy.
> 
> I don't see the point of 4K under 40". I don't buy all this extra resolution only to have to scale it up


A 42" monitor was the sweet spot for me at the 2-to-3-foot range at 4K. Perfect size for immersion, yet not so overbearing that I literally have to turn my head to see what's going on at the edges of the screen.


----------



## mattlach

Quote:


> Originally Posted by *axiumone*
> 
> Where have you been? The drivers haven't been great for a year and a half easily. Ever since gameworks, gsync, shield, deep learning, geforce experience, the touted army of engineers have been working on anything but the actual drivers.
> 
> The deeper you buy into the ecosystem, the worse time you'll have, because half the features conflict with others. To add insult to injury, there's close to no communication with the fan base. The only way to get a resolution is to get a media outlet to cover it. Then nvidia will post a "We are looking into it".


While I pride myself on not being a fanboy and going back and forth between AMD and Nvidia, my latest green stretch has been kind of long.

It started when I wrecked my 7970 by slipping with a screwdriver (I was trying to custom mod a Corsair AIO onto the GPU) and bought the 2GB GTX 680 on launch.

GTX680 (launch week) -> Original Kepler Titan (launch week) -> SLI 980TI's (about a year ago) -> GTX 1080 (only owned it for a week before selling after the Titan X announcement) -> Pascal Titan X (launch day)

In that time, the first time I noticed Nvidia driver problems was with the DPC latency issues at the GTX 1080 launch.


----------



## combat fighter

Quote:


> Originally Posted by *CallsignVega*
> 
> Not an issue. Hell, VR displays are like one inch from your eyes. I do great in FPS's with it. Don't forget about the benefit of how much easier it is to see and target people at medium to far distances in a game.
> Curve absolutely makes sense for a computer monitor. It isn't going anywhere. Every single large display manufacturer has announced new curved panels coming out for computer monitors.


But the question was about a TV.

Not a monitor.

Most new OLED TV's are flat, hence the curved fad is ending which is great news as I really, really hate curved TV's!


----------



## jcde7ago

Quick question after trying to look through this thread for an answer....

*Will EK's Titan XM waterblocks fit on the XP by any chance?*









I told myself I wouldn't pull the trigger on the XP but with a bonus on the way + finding a buyer for my XMs, it was a no-brainer....ordered 2x XPs an hour ago...


----------



## Jpmboy

Quote:


> Originally Posted by *CallsignVega*
> 
> Just got done with some testing single Titan-XP vs SLI:
> 
> 
> 
> All games tested at max graphics with the system in my sig. Crysis 3, SWBF and Metro LL can all be played single GPU at 4K @ 60 Hz. Witcher 3 is close.
> 
> Crysis 3, SWBF, Metro LL would all be pushed properly for 4K @ 120 Hz displays this winter. A single Titan-XP won't even be close for 4K @ 120 Hz monitors. I couldn't get Witcher 3 SLI usage above 80% on each GPU no matter what I did, hence only 47% gain. The rest of the games averaged 83% FPS gain which is quite nice. Star Citizen would be playable with Titan-XP 4K @ 120 Hz if you lower some settings.


nice work CSV! +1
Quote:


> Originally Posted by *Gary2015*
> 
> Philips? You're joking?


no - it is a very good monitor.
Quote:


> Originally Posted by *CallsignVega*
> 
> Don't knock it until you try it, it's like IMAX!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> DOOM on it is the best looking game I've ever seen, pictures of it doesn't do it justice.
> If Dell ever releases the 30" 4K @ 120 Hz OLED with a DP 1.3/1.4 connection, *even at $5K it's an instant buy for me*.


very nice! - unfortunately there are no treatments for your affliction.


----------



## meson1

My former monitor was a Samsung 24-inch (1920x1200). Working out from the DPI of that and scaling up, I didn't want one any smaller than 36 inches. That way the pixel density wouldn't make text too small to read without scaling - and let's face it, scaling can be hit and miss, especially with legacy applications.

The Philips 40inch 4K is nicely large, but not too big. And looks fine to my eyes. Also it was a quarter the price of anything else.

I'm sure there are better monitors out there. I'm sure one day someone will produce a G-sync 40inch 4K one that doesn't cost an arm and a leg. I'll think about such a beast then.

But for now, the Philips is fine. I just need a graphics card that will drive it at 60fps.
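The pixel-density arithmetic behind that sizing choice is easy to redo. A quick Python sketch (the display sizes come from the posts above; the PPI figures are just geometry, not measurements):

```python
import math

def ppi(w_px, h_px, diagonal_in):
    # Pixel density (pixels per inch) from resolution and diagonal size
    return math.hypot(w_px, h_px) / diagonal_in

print(f'24" 1920x1200: {ppi(1920, 1200, 24):.0f} PPI')  # the old Samsung
print(f'40" 4K UHD:    {ppi(3840, 2160, 40):.0f} PPI')  # the Philips
print(f'36" 4K UHD:    {ppi(3840, 2160, 36):.0f} PPI')
```

The 24" 16:10 panel works out to about 94 PPI; the 40" 4K is about 110 PPI and a 36" about 122 PPI, so the 40" stays close enough to the old density that 100% scaling remains readable.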


----------



## mattlach

Quote:


> Originally Posted by *meson1*
> 
> My former monitor was a Samsung 24inch (1920x1200). Working out from the dpi of that and scaling up, I didn't want one any smaller than 36 inch. That way the pixel density wouldn't make it too small to read without scaling - and let's face it, scaling can be hit and miss especially with legacy applications.
> 
> The Philips 40inch 4K is nicely large, but not too big. And looks fine to my eyes. Also it was a quarter the price of anything else.
> 
> I'm sure there are better monitors out there. I'm sure one day someone will produce a G-sync 40inch 4K one that doesn't cost an arm and a leg. I'll think about such a beast then.
> 
> But for now, the Philips is fine. I just need a graphics card that will drive it at 60fps.


That's kind of how I feel too.

As it stands, maintaining minimum framerates at 60fps or above is a challenge at 4K in many titles, even with an overclocked Pascal Titan X (unless you want to drop quality settings, but what's the fun in that?).

At some point, if an affordable 42"-44" G-Sync compatible monitor appears with higher refresh rates I'll likely be interested, but probably not until the next Titan (or whatever video card is able to push, say, 144Hz on a single GPU at 4K).









There is no way I'm going SLI again. Too much trouble and poor overall gaming experience.

Then I can move my 48" Samsung to TV duty.


----------



## toncij

Complaining about monitor price in a thread of owners of the $1200 GPU...


----------



## Lennyx

Quote:


> Originally Posted by *jcde7ago*
> 
> Quick question after trying to look through this thread for an answer....
> 
> *Will EK's Titan XM waterblocks fit on the XP by any chance?*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I told myself I wouldn't pull the trigger on the XP but with a bonus on the way + finding a buyer for my XMs, it was a no-brainer....ordered 2x XPs an hour ago...


No they do not fit. Grats on new cards


----------



## mattlach

Quote:


> Originally Posted by *Gary2015*
> 
> Nonsense. My SLI has 50% scaling in ALL my games except No Mans Sky.


Lol.

Re-read my post. I didn't even mention scaling.









Scaling based on average framerates might be there, but the problem is usually minimum framerates.

SLI (and CrossFire as well) tends to work well until that one problem scene where it doesn't. In Red Orchestra 2 it used to be when the battlefield was smoked up, artillery was falling on top of you and you were trying to rush for cover. Then all of a sudden the framerates would drop really low.

The average still looked fine, and appeared to have great scaling, but the minimums were unsatisfactory, and usually occurred at the most important high intensity times when you needed high framerates the most.

I don't even look at average framerates in reviews anymore, and I haven't for over 10 years. In every review I read, I skip directly to minimum framerates. If the review doesn't have minimum framerates, then I read a different review. I find average framerates to be of little to no value in judging playability and overall gaming experience. The experience is determined almost entirely by minimum framerates.

This is true for all GPU combinations, but especially so in SLI builds, where the minimums seem to be much lower relative to the averages than they do on single GPUs.
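The average-vs-minimum point is easy to demonstrate with made-up numbers. A tiny Python sketch with two invented frame-time traces (purely illustrative data, not from any real benchmark):

```python
# Two invented frame-time traces (milliseconds per frame). The point:
# the "stutter" run has the HIGHER average FPS but a far worse minimum.
def fps_stats(frame_times_ms):
    fps = [1000.0 / t for t in frame_times_ms]
    return sum(fps) / len(fps), min(fps)

smooth  = [16.7] * 100                # steady ~60 fps throughout
stutter = [14.0] * 95 + [60.0] * 5    # fast overall, with heavy spikes

for name, run in (("smooth", smooth), ("stutter", stutter)):
    avg, worst = fps_stats(run)
    print(f"{name}: avg {avg:.0f} fps, min {worst:.0f} fps")
```

The stutter run averages ~69 fps against the smooth run's ~60, yet its minimum is ~17 fps: exactly the case where an average-only review chart misleads.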


----------



## jcde7ago

Quote:


> Originally Posted by *Lennyx*
> 
> No they do not fit. Grats on new cards


Figured...thanks for the response!









For realsies though, this is my last high-end GPU go-around...I thought I could hold off from the upgrade itch, but then I look back and it's been 18-19 months already, and the value of the TXMs I have is going to continue to sink the longer I wait to switch...figured that now is as good a time as any given how beastly these TXPs are....


----------



## toncij

Quote:


> Originally Posted by *mattlach*
> 
> Lol.
> 
> Re-read my post. I didn't even mention scaling
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Scaling based on average framerates might be there, but the problem is usually minimum framerates.
> 
> SLI (and CrossFire as well) tends to work well until that one problem scene where it doesn't. In Red Orchestra 2 it used to be when the battlefield was smoked up, artillery was falling on top of you and you were trying to rush for cover. Then all of a sudden the framerates would drop really low.
> 
> The average still looked fine, and appeared to have great scaling, but the minimums were unsatisfactory, and usually occurred at the most important high intensity times when you needed high framerates the most.
> 
> I don't even look at average framerates in reviews anymore, and I haven't for over 10 years. In every review I read, I skip directly to minimum framerates. If the review doesn't have minimum framerates, then I read a different review. I find average framerates to be of little to no value in judging playability and overall gaming experience. The experience is determined almost entirely by minimum framerates.
> 
> This is true for all GPU combinations, but especially so in SLI builds, where the minimums seem to be much lower relative to the averages than they do on single GPUs.


The minimums <3


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> The minimums <3


I gotta check out The Division numbers. Boring game imo, haven't touched it in a while, but I doubt a single Titan can do 48FPS maxed out @ 4K, it's a demanding game.


----------



## mattlach

Quote:


> Originally Posted by *toncij*
> 
> The minimums <3


Yeah,

I haven't played any of those titles, but that seems to support my subjective experiences quite nicely.

Notably, in some titles the 1080SLI actually has LOWER minimums than a single 1080, and an overclocked Pascal Titan absolutely trouncing everything else.









These things always vary from title to title though. Some AAA titles do a great job of optimizing for SLI, but if you get out of AAA and into the more interesting titles (IMHO) I like to play SLI tends to perform worse and worse.


----------



## toncij

Quote:


> Originally Posted by *KillerBee33*
> 
> I gotta check out The Division numbers. Boring game imo, haven't touched it in a while, but I doubt a single Titan can do 48FPS maxed out @ 4K, it's a demanding game.


Titan XP numbers are a projection only: a pure calculation from the raw force these have compared to the TXM and 1080. The end result is an average of 60% over the TXM and 33% over the 1080, which lines up with the core counts. So far I haven't tested it all, but that's about similar to what I have experienced with a real card.

Would love to have time to test again with TXP for real, but mine is still stuck in "shipping"...
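For anyone who wants to sanity-check a projection like that, a rough cores-times-clock throughput proxy can be sketched in Python. The core counts are the published specs, but the boost clocks below are assumed round figures, so the percentages will not match toncij's exactly:

```python
# Rough FP32 throughput proxy: 2 ops per core per clock (FMA).
# Core counts are published specs; the boost clocks are assumed
# round figures for illustration, not measured values.
cards = {
    "Titan X Maxwell": (3072, 1.30),   # (CUDA cores, assumed boost GHz)
    "GTX 1080":        (2560, 1.85),
    "Titan X Pascal":  (3584, 1.85),
}

def tflops(cores, ghz):
    return 2 * cores * ghz / 1000.0

txp = tflops(*cards["Titan X Pascal"])
for name, spec in cards.items():
    print(f"TXP vs {name}: {txp / tflops(*spec) - 1:+.0%}")
```

With these assumed clocks the proxy says roughly +66% over the TXM and +40% over the 1080; real gaps land lower because actual boost clocks vary per card and per load.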


----------



## Gary2015

Man, only getting 35fps MAXED on Deus Ex.. what gives?


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> Titan XP numbers are a projection only: a pure calculation from the raw force these have compared to the TXM and 1080. The end result is an average of 60% over the TXM and 33% over the 1080, which lines up with the core counts. So far I haven't tested it all, but that's about similar to what I have experienced with a real card.
> 
> Would love to have time to test again with TXP for real, but mine is still stuck in "shipping"...


I got it installed, will give you a few numbers later tonight. There are a few options though: OC Titan and stock, Ultra in-game preset or manually maxed out.


----------



## toncij

Quote:


> Originally Posted by *Gary2015*
> 
> Man, only getting 35fps MAXED on Deus Ex.. what gives?


lol?








Quote:


> Originally Posted by *KillerBee33*
> 
> I got it installed, will give you a few numbers later tonight. There are a few options though: OC Titan and stock, Ultra in-game preset or manually maxed out.


Let me check that, brb.


----------



## Woundingchaney

Quote:


> Originally Posted by *Gary2015*
> 
> Man, only getting 35fps MAXED on Deus Ex.. what gives?


This seems to be the case with the game as it is very demanding or poorly coded. What settings are you using?


----------



## jodasanchezz

Hi there,
Just wanted to let you guys know!

*I asked NVIDIA Support whether replacing the cooler will void the warranty...*
This is their response.

Sorry for the German, but it's my native language...

*Short version: removing/changing the stock cooler will void the warranty....*


----------



## toncij

Quote:


> Originally Posted by *KillerBee33*
> 
> I got it installed, will give you a few numbers later tonight. There are a few options though: OC Titan and stock, Ultra in-game preset or manually maxed out.


All maxed out. OC of course. (hopefully at 2.0GHz)

Let me check that, brb.
Quote:


> Originally Posted by *Woundingchaney*
> 
> This seems to be the case with the game as it is very demanding or poorly coded. What settings are you using?


Same engine as Rise of the Tomb Raider, I presume. Lacks SLI, of course.


----------



## Woundingchaney

Quote:


> Originally Posted by *toncij*
> 
> All maxed out. OC of course. (hopefully at 2.0GHz)
> 
> Let me check that, brb.
> Same engine as Rise of the Tomb Raider, I presume. Lacks SLI, of course.


Tomb Raider and this title support SLI, of course...

It may be that SqEx is still intending to release performance updates/patches.


----------



## mattlach

Quote:


> Originally Posted by *jodasanchezz*
> 
> Hi there,
> Just wanted to let you guys know!
> 
> *I asked NVIDIA Support whether replacing the cooler will void the warranty...*
> This is their response.
> 
> Sorry for the German, but it's my native language...
> 
> *Short version: removing/changing the stock cooler will void the warranty....*


Possible in Germany, but per the Magnuson-Moss Warranty Act it is illegal to void a warranty based on servicing your own hardware in the U.S.

That being said, they might still fight you on it, and they have staff and retained lawyers, which most of us don't, but that is the law.

In reality I think what happens is, as long as you do a good job of replacing the cooler, and nothing looks physically damaged or out of place, they neither check for, nor refuse RMA's based on the cooler being removed.

If you were to damage the video card while removing/replacing the cooler, then - of course - the warranty is void.


----------



## mattlach

Quote:


> Originally Posted by *Gary2015*
> 
> Man, only getting 35fps MAXED on Deus Ex.. what gives?


Which drivers are you using?

372.54 are the official drivers for Deus Ex support, but these drivers don't work for me, so I have to use the launch drivers 369.05.

35 fps seems surprisingly low given the screenshot quality I've seen, but then again, this is day one of the game launch.

Eidos Montreal tend to be better than most studios in this regard, but most games launch broken these days. That's why I always give it a few months to a year before buying any new title these days. The first few weeks to the first few months amount to paid beta testing in most cases.


----------



## eliau81

Quote:


> Originally Posted by *eliau81*
> 
> Has anyone tried to fit the EVGA 980TI hybrid shroud ?


anybody ???


----------



## dante`afk

Isn't mankind divided the same engine as dx:hr? Looks ****ty to be so taxing, should have def more fps.


----------



## KillerBee33

Quote:


> Originally Posted by *eliau81*
> 
> anybody ???


Will not work. The 9-series had a separate heatplate and a two-piece shroud; on the 10-series the second, small part of the shroud is the same piece as the heatplate.
Screws are in different places, and why would you want to do that? It's a plastic cover, nothing else...

EDIT: 10series kit has a shroud and a heatplate for that reason.


----------



## mattlach

Quote:


> Originally Posted by *eliau81*
> 
> anybody ???


You could always ghetto mount them.

I'm not sure which cooler Vega (on the HardForum) used (not sure if it's the same Vega as CallsignVega here) but the four screws around the GPU look like they are in the same place.

Just take the window and the heatsink out of the stock cooler, and you have this.

Apparently this cooled better than the stock cooler, but he must not have been happy with it, since he then installed Arctic Accelero coolers on both of them, and after that apparently changed his mind again and built a custom loop with EK blocks (at least according to his sig).

So he modded his GPUs with the EVGA coolers, replaced the EVGA coolers with Arctic Accelero coolers, and then installed a custom water loop, all in less than half the time it took me to order parts for, plan and build my custom loop.


----------



## eliau81

Quote:


> Originally Posted by *KillerBee33*
> 
> Will not work , 9series had heatplate separate and and 2 piece shroud , 10series second , small part of the shroud is the same piece as the heatplate
> Screws are in different places , and why would you want to do that? Its plastic cover nothing else...
> 
> EDIT: 10series kit has a shroud and a heatplate for that reason.


To get much better airflow out of the case.


----------



## KillerBee33

Quote:


> Originally Posted by *eliau81*
> 
> To get much better airflow out of the case.


This is a 980, see the difference?

Just keep the small part where the fan is; it will not make any difference since the chip is already under water.


----------



## eliau81

Quote:


> Originally Posted by *mattlach*
> 
> You could always ghetto mount them.
> 
> I'm not sure which cooler Vega (on the HardForum) used (not sure if it's the same Vega as CallsignVega here) but the four screws around the GPU look like they are in the same place.
> 
> Just take the window and the heatsink out of the stock cooler, and you have this.
> 
> Apparently this cooled better than the stock cooler, but he must not have been happy with it, since he then installed Arctic Accelero coolers on both of them, and after that apparently changed his mind again and built a custom loop with EK blocks (at least according to his sig).
> 
> So he modded his GPUs with the EVGA coolers, replaced the EVGA coolers with Arctic Accelero coolers, and then installed a custom water loop, all in less than half the time it took me to order parts for, plan and build my custom loop.


He probably didn't realize that the clocks are crap with this Boost 3.0: every temperature step up causes the clock to drop by ~12MHz. At 50C I was at 2105MHz;
by the time I got to 75C I was at 1987MHz!!! What in the hell is Nvidia doing??
We need a BIOS mod!!!
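Those numbers (2105MHz at 50C falling to 1987MHz at 75C) are consistent with Boost 3.0 stepping the clock down one bin at a time as the core heats up. A toy model in Python; the ~13MHz bin size and the degrees-per-bin step are assumptions for illustration, not Nvidia specs:

```python
# Toy model of GPU Boost 3.0 thermal binning as described above:
# the clock steps down one bin each time the core warms a few degrees.
BIN_MHZ = 13          # assumed size of one boost bin
DEG_PER_BIN = 3       # assumed: one bin lost every ~3 C

def boost_clock(cold_clock_mhz, cold_temp_c, temp_c):
    # Bins lost so far, never negative (clock doesn't rise past cold state)
    bins_lost = max(0, (temp_c - cold_temp_c) // DEG_PER_BIN)
    return cold_clock_mhz - bins_lost * BIN_MHZ

for t in (50, 60, 70, 75):
    print(f"{t}C -> {boost_clock(2105, 50, t)} MHz")
```

With these assumed values the model lands at ~2001MHz at 75C, in the same ballpark as the 1987MHz reported; keeping the core in the 50s is what holds the top bins.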


----------



## Zurv

argh! scotty i need more powah!





(The video isn't 4K yet.. YT is still baking it.)

Without capping it is mostly above 60.. but barely. It would have been nice to add a 3rd card.


----------



## mattlach

Quote:


> Originally Posted by *eliau81*
> 
> He probably didn't realize that the clocks are crap with this Boost 3.0: every temperature step up causes the clock to drop by ~12MHz. At 50C I was at 2105MHz;
> by the time I got to 75C I was at 1987MHz!!! What in the hell is Nvidia doing??
> We need a BIOS mod!!!


Well, they do have to be more careful with voltage and temps at 16nm than they were at 28nm. Whenever the die shrinks, it takes less voltage to do damage, and that is exacerbated by heat.

I think that's why we are seeing signed BIOSes and less voltage control this time around.


----------



## eliau81

Quote:


> Originally Posted by *KillerBee33*
> 
> this is a 980 , see the difference?
> 
> Just keep the small part where the fan is , will not make any difference since the chip is already under water.


An NG youtuber has a tutorial, but I don't think it's going to do any good.
From what I've learned, we need to keep the GPU at 60~65C max to keep the clocks high.
Will a CLU mod fix it?
Will the card be safe with an AIO and a CLU mod?


----------



## mattlach

Quote:


> Originally Posted by *Zurv*
> 
> argh! scotty i need more powah!
> 
> 
> 
> 
> 
> (The video isn't 4K yet.. YT is still baking it.)
> 
> Without capping it is mostly above 60.. but barely. It would have been nice to add a 3rd card.


Wow.. That's the type of performance I was expecting from a Single Pascal Titan X, not from two...

I didn't expect this title to be particularly graphically intense, considering Human Revolution wasn't when it launched compared to other titles.

Hopefully this will improve significantly after launch day through game and driver patches.

Don't get me wrong, it looks nice and all, but not "dual Pascal Titans Aren't Fast Enough" nice.


----------



## jcde7ago

Quote:


> Originally Posted by *Zurv*
> 
> argh! scotty i need more powah!
> 
> 
> 
> 
> 
> (The video isn't 4K yet.. YT is still baking it.)
> 
> Without capping it is mostly above 60.. but barely. It would have been nice to add a 3rd card.


Honestly, this screams "less than stellar optimization for a PC port" from Nixxes more than it does "2x TXPs aren't strong enough..."

That, or the latest 372.54 drivers aren't good at all for DE: MD.

I bumped all settings to the highest on my X34 (3440x1440 @100hz) and I can barely maintain ~45fps on 3x TXMs...I wonder how my incoming 2x TXPs will do on 3440x1440....


----------



## KillerBee33

Quote:


> Originally Posted by *eliau81*
> 
> An NG youtuber has a tutorial, but I don't think it's going to do any good.
> From what I've learned, we need to keep the GPU at 60~65C max to keep the clocks high.
> Will a CLU mod fix it?
> Will the card be safe with an AIO and a CLU mod?


The Titan XP runs under 60C with EVGA's kit. I just don't see the reason for slapping a plastic shroud on it: the chip is already cooled and the stock fan sits on top of the VRM, so a shroud would not make any difference. It's basically the same idea as the NZXT or Corsair kits.


----------



## 8472

Quote:


> Originally Posted by *eliau81*
> 
> anybody ???


You could always use a nzxt kraken g10. That's what I'm using on mine.


----------



## mattlach

Quote:


> Originally Posted by *jcde7ago*
> 
> Honestly this screams "less than stellar optimization for a PC port" from Nixxes than it does "2x TXPs aren't strong enough..."


That pisses me off.

The original from 2000 was a PC title. They shouldn't even have released it on console, or if they did, only ported it once the PC title was complete and released.

They need to start doing "PC First and Foremost" titles so we don't have this inefficient poor console port garbage anymore.


----------



## jcde7ago

Quote:


> Originally Posted by *mattlach*
> 
> That pisses me off.
> 
> The original from 2000 was a PC title. They shouldn't even have released it on console, or if they did, only ported it once the PC title was complete and released.
> 
> They need to start doing "PC First and Foremost" titles so we don't have this inefficient poor console port garbage anymore.


The only game that comes to mind that is focusing solely on PC and advancing PC tech is Star Citizen....

This is the modern day conundrum; display tech is still a bit behind (4k @60 FPS just isn't sexy enough) and 99% of games that come out are either a) unoptimized or b) don't really ever take full advantage of the top of the line hardware most enthusiasts have...

Maybe I should just cancel my TXPs before they ship if all I have to look forward to is Star Citizen in 2018...


----------



## axiumone

Nixxes is usually pretty decent at porting games. Look at Tomb Raider and RoTR.

I see that there's an sli profile for Deus Ex, but it's possible that it's not completely active. Can you test the game with a single card to see if there's a difference in FPS?


----------



## jcde7ago

Quote:


> Originally Posted by *axiumone*
> 
> Nixxes is usually pretty decent at porting games. Look at Tomb Raider and RoTR.
> 
> I see that there's an sli profile for Deus Ex, but it's possible that it's not completely active. Can you test the game with a single card to see if there's a difference in FPS?


I agree, Nixxes usually is right on the ball with PC ports, which is why i'm mind blown by people saying that TXPs are struggling...

I'm about to disable tri-SLI on my TXMs....will report back in a bit to see how it is on a single card.


----------



## unreality

Quote:


> Originally Posted by *toncij*
> 
> The minimums <3


Thank you! Exactly my gaming experience so far.


----------



## jcde7ago

Mankind Divided on 1x Titan X Maxwell = ~45 FPS average @ 3440x1440p all settings highest, MSAA/Motion Blur/DOF off (1400mhz/3800mhz)
Mankind Divided on 3x Titan X Maxwell = ~ 52 FPS average @ 3440x1440p all settings highest, MSAA/Motion Blur/DOF off (1400mhz/3800mhz)

Ugh, those TXPs can't come soon enough....either DE: MD is stupid taxing in ways that I don't notice or 372.54 drivers suck + Nixxes finally has a terrible PC port on their hands.... -_-


----------



## Lobotomite430

So I ordered an Arctic Accelero III and an EVGA 1080/1070 Hybrid kit. Anyone know which might be the better cooler for the Titan XP?


----------



## Zurv

Quote:


> Originally Posted by *axiumone*
> 
> Nixxes is usually pretty decent at porting games. Look at Tomb Raider and RoTR.
> 
> I see that there's an sli profile for Deus Ex, but it's possible that it's not completely active. Can you test the game with a single card to see if there's a difference in FPS?


i'm running 99% on my TXPs and getting 80-90fps in game at 4k. (with AA off)


----------



## Fiercy

1440p Ultra 2X MSAA

Titan XP (2062MHZ) Water cooled



Well, this looks horribly unoptimised. Maybe DX12 will make things better, but it's promised for September.


----------



## Zurv

Just because something is slow with maxed settings doesn't mean it isn't optimized. It just means they give you more options than a system can handle.
It seems lazy to assume it isn't running well vs. the options just being too aggressive. (It could be.. but it doesn't run poorly, and it doesn't ignore resources. SLI is working great with 99% usage.)

Was Crysis 3 unoptimized?


----------



## jcde7ago

Quote:


> Originally Posted by *jcde7ago*
> 
> Mankind Divided on 1x Titan X Maxwell = ~45 FPS average @ 3440x1440p all settings highest, MSAA/Motion Blur/DOF off (1400mhz/3800mhz)
> Mankind Divided on 3x Titan X Maxwell = ~ 52 FPS average @ 3440x1440p all settings highest, MSAA/Motion Blur/DOF off (1400mhz/3800mhz)
> 
> Ugh, those TXPs can't come soon enough....either DE: MD is stupid taxing in ways that I don't notice or 372.54 drivers suck + Nixxes finally has a terrible PC port on their hands.... -_-


After some somewhat-extensive testing...found the culprit, at least for me:

Screenspace Reflections set to (Ultra): ~ 52 FPS average @ 3440x1440p all settings highest, MSAA/Motion Blur/DOF off (1400mhz/3800mhz)
Screenspace Reflections set to (On): ~ 82 FPS average @ 3440x1440p all settings highest, MSAA/Motion Blur/DOF off (1400mhz/3800mhz)

So, that setting is a -30 FPS hit for my TXMs....

Can you guys with TXPs please test changing only that setting and see if it makes such a drastic difference for you?
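One way to put that 30 FPS hit in perspective is to convert it to frame time, since an FPS delta means different things at different baselines. A quick Python sketch using the averages reported above:

```python
# Convert the Screenspace Reflections hit from an FPS delta into
# milliseconds per frame, which is what the GPU actually spends.
# The 52 and 82 FPS figures are the averages reported above.
def ms_per_frame(fps):
    return 1000.0 / fps

ssr_ultra, ssr_on = 52.0, 82.0
cost_ms = ms_per_frame(ssr_ultra) - ms_per_frame(ssr_on)
print(f"SSR Ultra costs ~{cost_ms:.1f} ms per frame "
      f"({ssr_ultra / ssr_on - 1:.0%} FPS)")
```

That works out to roughly 7ms per frame for one setting, which is enormous: at a 60fps target the whole frame budget is only 16.7ms.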


----------



## eliau81

Quote:


> Originally Posted by *Lobotomite430*
> 
> So I ordered an Arctic Accelero III and an EVGA 1080/1070 Hybrid kit. Anyone know which might be the better cooler for the Titan XP?


In my experience with the 980, the EVGA is much better and gives much more than the Arctic.
Try to fit the whole AIO 1080 kit to the Titan and let us know if it fits with the shroud.


----------



## Zurv

Quote:


> Originally Posted by *jcde7ago*
> 
> After some somewhat-extensive testing...found the culprit, at least for me:
> 
> Screenspace Reflections set to (Ultra): ~ 52 FPS average @ 3440x1440p all settings highest, MSAA/Motion Blur/DOF off (1400mhz/3800mhz)
> Screenspace Reflections set to (On): ~ 82 FPS average @ 3440x1440p all settings highest, MSAA/Motion Blur/DOF off (1400mhz/3800mhz)
> 
> So, that setting is a -30 FPS hit for my TXMs....
> 
> Can you guys with TXPs please test changing only that setting and see if it makes such a drastic difference for you?


i'm heading to the gym now but i'll try it when i get back (unless someone else does)


----------



## CallsignVega

Quote:


> Originally Posted by *combat fighter*
> 
> But the question was about a TV.
> 
> Not a monitor.
> 
> Most new OLED TV's are flat, hence the curved fad is ending which is great news as I really, really hate curved TV's!


That's nice. They offer both flat and curved. It's never a bad thing to have more options.

I'm not going to go over all the pros vs. cons of both, that crap has been beat to death on the AVS Forum.

Curved has picked up more in Asia and Europe. Historically though, they have been more willing and apt to change/adjust than Americans are. I'll probably never own a flat display again.

As a matter of fact, I'd buy a 55" version of this for my computer:


----------



## Zurv

holy crap that is SUPER curved... i too would get that for my desk .. but they would have to fix that god damn auto-dim when stuff isn't moving!
another thing for my wishlist is a 70" that isn't freakn' nutty costly. the 65" oled i have now is a little "small"


----------



## Lobotomite430

Quote:


> Originally Posted by *eliau81*
> 
> In my experience with the 980, the EVGA is much better and gives much more than the Arctic.
> Try to fit the whole AIO 1080 kit to the Titan and let us know if it fits with the shroud.


Will do, I was thinking of doing the EVGA 1080 kit just because it looks cleaner and will look nicer in my case.


----------



## eliau81

Quote:


> Originally Posted by *Lobotomite430*
> 
> Will do, I was thinking of doing the EVGA 1080 kit just because it looks cleaner and will look nicer in my case.


agree


----------



## axiumone

Quote:


> Originally Posted by *CallsignVega*
> 
> That's nice. They offer both flat and curved. It's never a bad thing to have more options.
> 
> I'm not going to go over all the pro's vs con's of both, that crap has been beat to death on the AVS Forum.
> 
> Curved has picked up more in Asia and Europe. Historically though, they have been more willing and apt to change/adjust than Americans are. I'll probably never own a flat display again.
> 
> As a matter of fact, I'd buy a 55" version of this for my computer:
> 
> 
> Spoiler: Warning: Spoiler!


Holy! I want that yesterday!


----------



## toncij

Hey another game making my 2nd TXP an expensive paperweight...


----------



## bl4ckdot

Quote:


> Originally Posted by *Lennyx*
> 
> I just finished installing the block myself. You need a really small Phillips screwdriver and a 4mm socket. First remove the 2 screws on the I/O shield, then all the screws on the back with the screwdriver. Then remove all the screws on the back with the 4mm socket.
> 
> EK have great instructions that come in the package, with pictures.


Thank you. Did you use EK's thermal pad and paste ? I'm trying to search where I can buy Fujipoly pads in France, and I have yet to succeed at that.


----------



## mattlach

Quote:


> Originally Posted by *HaniWithAnI*
> 
> Afterburner 4.3.0 beta. Search the thread or my post history for the [setting], I posted it earlier.


Thanks for that. I found it and was able to test it, and making that edit definitely gives me a voltage slider.

It goes all the way up to +100% though









Does this really mean +100% as in double the voltage? That would seem it would kill the chip instantly.

How do I interpret the percentage value on this slider, so I know what I am doing?


----------



## mattlach

Quote:


> Originally Posted by *bl4ckdot*
> 
> Thank you. Did you use EK's thermal pad and paste ? I'm trying to search where I can buy Fujipoly pads in France, and I have yet to succeed at that.


I used the included thermal pads. They seemed OK, though the blue side of the backing was a bit difficult to peel off. I wore gloves so I wouldn't get finger grease on them.

I did not use the included EK paste. Apparently it's just relabeled Gelid GC Extreme, which is good stuff; I just didn't know that at the time.

Instead I used some Thermal Grizzly Kryonaut.

I wound up with overclocked (+170, resulting in 2063MHz) load temps of 34°C at 23°C ambient, so I am very happy.


----------



## Zurv

Quote:


> Originally Posted by *toncij*
> 
> Hey another game making my 2nd TXP an expensive paperweight...


What game? SLI is working great in Deus Ex. I'm getting 99% usage on both cards (even my video a page back showed that, with some % eaten by Dxtory).


----------



## toncij

Quote:


> Originally Posted by *Zurv*
> 
> What game? Sli is working great is Deus ex. I'm getting 99% usage on both cards (even in my video a page back showed that... with some % eaten by dxtory)


Drivers?


----------



## Zurv

Quote:


> Originally Posted by *bl4ckdot*
> 
> Thank you. Did you use EK's thermal pad and paste ? I'm trying to search where I can buy Fujipoly pads in France, and I have yet to succeed at that.


Yeah, screw EK and those crazy hard-to-peel pads ☺

I normally use the Fuji for the RAM, the blue nightmare for the other stuff.


----------



## Zurv

Quote:


> Originally Posted by *toncij*
> 
> Drivers?


The one from last week, with the Deus Ex SLI profile.









I always DDU first. I'm also running a clean install of the latest version of Windows 10, 1607. (Every new version of Windows 10 is effectively a new OS; 1607 is the third release.)


----------



## DNMock

Quote:


> Originally Posted by *Zurv*
> 
> Yeah, screw ek and those crazy hard to peel pads ☺
> 
> I normally use the Fuji for the ram, the blue nightmare for the other stuff.


I do it the opposite: Fuji on the VRM, EK on the mem.


----------



## CallsignVega

Quote:


> Originally Posted by *axiumone*
> 
> Holy! I want that yesterday!


Did you guys also notice it's 120Hz 4K OLED?









I keep throwing money at the screen but nothing happens.


----------



## toncij

OK... Deus Ex MD: 1440p, all maxed out - 75 FPS.

I don't think the game looks good enough to justify only 75 FPS at 1440p. Dual cards: 90 FPS average, both cards at 95-98% usage.

So: new drivers work, SLI works, but the game has very erratic FPS. At one point dual Titans give 70-ish, and at another 120-ish FPS at 1440p.


----------



## combat fighter

Quote:


> Originally Posted by *CallsignVega*
> 
> That's nice. They offer both flat and curved. It's never a bad thing to have more options.
> 
> I'm not going to go over all the pro's vs con's of both, that crap has been beat to death on the AVS Forum.
> 
> Curved has picked up more in Asia and Europe. Historically though, they have been more willing and apt to change/adjust than Americans are. I'll probably never own a flat display again.


Fact is, curved TVs add nothing to the experience apart from distortion. It's a failed fad (thankfully!)

You only have to look at LG's 2016 OLED lineup to see they are backtracking on the whole curved-TV idea and focusing on flat versions from now on.

Apart from one model, all the others are flat.

I would not be surprised one bit if the 2017 lineup is all flat.

Most people on the AVS Forum hate curved TVs as well, and I'm one of them.


----------



## Zurv

Quote:


> Originally Posted by *combat fighter*
> 
> Fact is curved TV's add nothing to the experience apart from distortion. It's a failed fad (thankfully!)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Only have to look at LG's 2016 oled line up to see they are back tracking with the whole curved TV idea and focusing on flat versions from now on.
> 
> Apart from one model all the others are flat.
> 
> I would not be surprised one bit to see the 2017 line up to be all flat.
> 
> Most people on AVS Forum hate curved TV's as well, and I'm one of them.


Yeah... for a TV!! (All of us drooling already have flat 4K OLEDs too.)

But we are talking about having it on our desk as a monitor.

----------



## dante`afk

Quote:


> Originally Posted by *jcde7ago*
> 
> After some somewhat-extensive testing...found the culprit, at least for me:
> 
> Screenspace Reflections set to (Ultra): ~ 52 FPS average @ 3440x1440p all settings highest, MSAA/Motion Blur/DOF off (1400mhz/3800mhz)
> Screenspace Reflections set to (On): ~ 82 FPS average @ 3440x1440p all settings highest, MSAA/Motion Blur/DOF off (1400mhz/3800mhz)
> 
> So, that setting is a -30 FPS hit for my TXMs....
> 
> Can you guys with TXPs please test changing only that setting and see if it makes such a drastic difference for you?


MSAA 8x is the killer in this game; pretty sure it would bring even SLI setups to their knees.

CHS on/off/ultra is only a 2-5 FPS difference, and Screenspace Reflections on/off/ultra is likewise only a 2-5 FPS difference.


----------



## jcde7ago

Quote:


> Originally Posted by *dante`afk*
> 
> MSAA 8x is the killer in this game. pretty sure even with SLI it would kill them easily
> 
> CHS on/off/ultra only 2-5 fps difference
> also screenspace reflect on/off/ultra only 2-5 difference.


I don't use MSAA; I mentioned specifically that I had it set to off.


----------



## HyperMatrix

Quote:


> Originally Posted by *dante`afk*
> 
> Cute to take almost only games that scale well with SLI and leave other games out of it.
> 
> The microstuttering and unsmooth screen must be lovely, but hey you don't see it, right
> 
> 
> 
> 
> 
> 
> 
> .
> 
> You need SLI for 4k and above to have an acceptable framerate though.


Is there a way to remove this guy from the thread? He's constantly trolling and never contributing anything useful to the thread or to the site as a whole.


----------



## axiumone

Quote:


> Originally Posted by *combat fighter*
> 
> *Fact is curved TV's add nothing to the experience* apart from distortion. It's a failed fad (thankfully!)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Only have to look at LG's 2016 oled line up to see they are back tracking with the whole curved TV idea and focusing on flat versions from now on.
> 
> Apart from one model all the others are flat.
> 
> I would not be surprised one bit to see the 2017 line up to be all flat.
> 
> Most people on AVS Forum hate curved TV's as well, and I'm one of them.


Bud, that's an opinion, not a fact.


----------



## CallsignVega

Quote:


> Originally Posted by *combat fighter*
> 
> Fact is curved TV's add nothing to the experience apart from distortion. It's a failed fad (thankfully!)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Only have to look at LG's 2016 oled line up to see they are back tracking with the whole curved TV idea and focusing on flat versions from now on.
> 
> Apart from one model all the others are flat.
> 
> I would not be surprised one bit to see the 2017 line up to be all flat.
> 
> Most people on AVS Forum hate curved TV's as well, and I'm one of them.


What a silly way to use the word fact for a completely subjective opinion. There are only two 2016 LG panels, 55" and 65"; it doesn't matter how many models they release with a different silly built-in sound bar. I guarantee their 2017 lineup will have a curved version.

Oh, and saying "most people" adds zero credence to your point of view. "Most people" in this world are mouth breathers. I guess you'd better go tell the designers of the ten-million-dollar military flight simulators I fly, and the IMAX designers, to switch to "flat" displays.


----------



## HyperMatrix

Quote:


> Originally Posted by *toncij*
> 
> OK... Deus Ex MD: 1440, all maxed out - 75 FPS.
> 
> I don't think the game looks that good to mandate 75 FPS on 1440. Dual cards 90 FPS average, both cards 95-98% usage.
> 
> OK, so, new drivers work, SLI works, but the game has very erratic FPS. Dual Titans at one point give 70ish, and at another 120ish FPS on 1440.


That's DX11 draw-call limitations. The same thing happened in Rise of the Tomb Raider, where you'd drop as low as 60 FPS in parts of the Soviet area. As soon as DX12 came out, my FPS in those areas jumped to 140+. Gotta wait for that patch, and hope multi-GPU support comes along with it.


----------



## DADDYDC650

Quote:


> Originally Posted by *CallsignVega*
> 
> That's nice. They offer both flat and curved. It's never a bad thing to have more options.
> 
> I'm not going to go over all the pro's vs con's of both, that crap has been beat to death on the AVS Forum.
> 
> Curved has picked up more in Asia and Europe. Historically though, they have been more willing and apt to change/adjust than Americans are. I'll probably never own a flat display again.
> 
> As a matter of fact, I'd buy a 55" version of this for my computer:


Now THAT'S a curve. None of that weak sauce available in the States. I wouldn't want that on my TV, though, only on a monitor.


----------



## HaniWithAnI

Quote:


> Originally Posted by *mattlach*
> 
> Thanks for that. I found it and was able to test it, and making that edit definitely gives me a voltage slider.
> 
> It goes all the way up to +100% though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Does this really mean +100% as in double the voltage? That would seem it would kill the chip instantly.
> 
> How do I interpret the percentage value on this slider, so I know what I am doing?


It's 100% extra of whatever offset the card is currently using, not of the base value. Every TITAN XP is locked to 1.0925V max no matter what you do with this slider.

For example:

My card's auto settings are 1.05V, with an offset of 0.02V based on thermal and power headroom. +100 on this slider allows an offset of 0.04V instead, for a total of around 1.09V. Nothing major, but it's all we have for now.

To test that it's working, get your card VRel-limited (overclocking it and running a light 3D workload while fan and power targets are maxed usually does it), then play with the slider. You should see your voltage change slightly.

Hope that helps.
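If it helps anyone sanity-check the math, here's a tiny sketch of that slider behavior as I understand it. The function name and the clamping model are just illustrative assumptions based on the explanation above, not any real Afterburner or NVIDIA API:

```python
# Hypothetical model of the TXP voltage slider described above.
# Assumption: the slider scales the card's *current auto offset* (not the
# base voltage), and the result is hard-capped at 1.0925 V on every card.

VOLTAGE_CAP = 1.0925  # hard limit per the post above

def effective_voltage(base_v, auto_offset_v, slider_percent):
    """Voltage after applying the slider; +100 doubles the auto offset."""
    extra = auto_offset_v * (slider_percent / 100.0)
    return min(base_v + auto_offset_v + extra, VOLTAGE_CAP)

# The example from the post: 1.05 V base + 0.02 V auto offset, slider at +100
print(effective_voltage(1.05, 0.02, 100))  # ~1.09 V
# Even an absurdly large offset can't push past the cap:
print(effective_voltage(1.05, 0.10, 100))  # clamped to 1.0925 V
```

So "+100%" never means double the total voltage; the cap makes the slider a fairly safe knob.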


----------



## mattlach

Quote:


> Originally Posted by *HaniWithAnI*
> 
> It's 100% extra of whatever the card is currently using as an offset, not of the base value. Every TITAN XP is locked to 1.0925V max no matter what you do with this slider.
> 
> For example:
> 
> My card auto settings are 1.05v, with offset of 0.02v based on thermal and power availability. +100 on this slider will allow an offset of 0.04v instead, for a total of around 1.09V instead. Nothing major, but all we have for now.
> 
> To test it's working, get your card VRel limited (OC it and run a light 3d workload while fan and power targets are maxed usually does it) then play with the slider. you should see your voltage change slightly.
> 
> Hope that helps.


Thanks, it does.

For the last few years I had a video card fast enough for my resolution, so once I vsynced it to 60Hz I had no need to overclock. I haven't actually overclocked much since the "new" dynamic overclocking style introduced with the GTX 680; most of my GPU overclocking was with the old static style before that.

I wonder how much higher a clock this will get me before I run into the power limit.


----------



## mattlach

Quote:


> Originally Posted by *EQBoss*
> 
> Put 2 Titans under water; 2.1GHz seems to be fairly easy on water with voltage at 1.09. The cards are definitely starved for power, wish it was 2x 8-pin; I hit 126% TDP in ROTTR pretty easily. Increasing the memory clock seems to take away from GPU clock due to power limits. Cards stay in the high 30s/low 40s under load. Pretty happy with the performance overall; still need my HB bridge, the EVGA ones are hard to come by.


How are you exceeding the 120% power limit? Have you done a shunt mod?


----------



## Baasha

Max wattage I've seen so far is 1320W with Power Limit set to 120%.

But, the real interesting thing is:


----------



## skypine27

Quote:


> Originally Posted by *Baasha*
> 
> Max wattage I've seen so far is 1320W with Power Limit set to 120%.
> 
> But, the real interesting thing is:


System pics and a link to your 3D Mark score please.


----------



## pez

Quote:


> Originally Posted by *HyperMatrix*
> 
> Is there a way to remove this guy from the thread? He's constantly trolling and never contributing anything useful to the thread or to the site as a whole.


Just report the post and/or block and move on. It's worked at least once so far.


----------



## Baasha

Quote:


> Originally Posted by *skypine27*
> 
> System pics and a link to your 3D Mark score please.




3DMark Fire Strike Ultra: *20,231* http://www.3dmark.com/3dm/14260635


----------



## skypine27

I have those same power cables on my AX1200i.

That score should land you well up the Hall of Fame top 100.

I can't crack the 15K barrier with my 2x XPs to make it into the top 100.


----------



## BrainSplatter

Quote:


> Originally Posted by *jcde7ago*
> 
> found the culprit, at least for me:
> 
> Screenspace Reflections set to (Ultra): ~ 52 FPS average @ 3440x1440p all settings highest, MSAA/Motion Blur/DOF off (1400mhz/3800mhz)
> Screenspace Reflections set to (On): ~ 82 FPS average @ 3440x1440p all settings highest, MSAA/Motion Blur/DOF off (1400mhz/3800mhz)


That's probably the setting which is sabotaging the SLI scaling. It's not unusual that a specific setting turns out to kill SLI performance. You could check that by testing reflections ultra/on with a single card.

You might want to try the 0x080002F5 SLI flag, since somebody mentioned that Deus Ex seems to behave similarly to Tomb Raider (same developer) in that regard:
http://www.forum-3dcenter.org/vbulletin/showthread.php?s=9da67ffb564937b84bbe1e2c8dece9cc&p=10927377#post10927377


----------



## HyperMatrix

Quote:


> Originally Posted by *BrainSplatter*
> 
> That's probably the setting which is sabotaging the SLI scaling. It's not unusual that a specific setting turns out to kill SLI performance. You could check that by testing reflections ultra/on with a single card.
> 
> You might want to try the 0x080002F5 SLI flag since somebody mentioned that Deus Ex seems to behave similar to Tomb Raider (same developer) in that regard:
> http://www.forum-3dcenter.org/vbulletin/showthread.php?s=9da67ffb564937b84bbe1e2c8dece9cc&p=10927377#post10927377


The game just runs so poorly. I had to turn so many features on/off to find the best looking combination that could give a 100fps average. 2x MSAA with Temporal AA on and Sharpening disabled was probably the biggest graphical change I could see. I'm not sure DX12 would do much to increase performance considering how problematic it is right now. Seems like a terrible port compared to Rise of The Tomb Raider, which is so beautifully done. And worst of all, the game doesn't look good enough to justify the huge GPU usage.


----------



## webmi

Can't compete with you 10-core guys... but it's not bad: HOF #8

http://www.3dmark.com/spy/325044


----------



## unreality

4.9? Wow

My 5960X does [email protected] but I didn't dare go any higher. What voltage did you use? Is the card also stable at 2088?


----------



## toncij

Quote:


> Originally Posted by *skypine27*
> 
> System pics and a link to your 3D Mark score please.


Quote:


> Originally Posted by *Baasha*
> 
> 
> 
> 3DMark Fire Strike Ultra: *20,231* http://www.3dmark.com/3dm/14260635


Love it when people ask Baasha for sys pics to prove it


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*


Checked The Division:
Titan @ 2100/1377, 6700K @ stock
2160p, everything maxed out, FXAA off, supersampling on; Nvidia HFTS on: 30-45 FPS, HFTS off: 50-60 FPS
VSync off, no frame limit
This is surprisingly good


----------



## combat fighter

Quote:


> Originally Posted by *axiumone*
> 
> Bud, that's an opinion, not a fact.


I disagree.

Plenty of hardware review programmes on TV come to the conclusion that curved adds nothing to a TV set as well.

That's a fact.

Anyway, enough of this curved-TV bollocks; back on track with the Titan X Pascal...


----------



## skypine27

Quote:


> Originally Posted by *webmi*
> 
> cant compete with you 10-core guys ... but its not bad, HOF #8
> 
> http://www.3dmark.com/spy/325044


Hey bro, how are you HOF #8?

I'm running 2 XPs in SLI and nipping at 14k but not even in the top 100?


----------



## skypine27

Any mildly modded BIOS yet?

On my previous Titan X setup I was using a very simple modded BIOS that raised the power slider's factory 110% limit to a modded 120% limit. This let my cards hold a slightly higher clock consistently in very heavy gaming scenes. The mod was so simple in the BIOS-editing tool that I could see exactly what had been changed, and with basic high school math I could have changed the numbers to allow 125% etc. (though I didn't).

Any chance of something like that for the XPs? I'd be curious whether a modded power limit of, say, 125-130% (since the XP seems to allow 120% by default) would have the same effect.
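For context on what those percentages mean in watts, here's a back-of-the-envelope sketch assuming the Titan X Pascal's 250 W reference board power (an assumption on my part; the real BIOS stores power-target tables rather than doing this arithmetic):

```python
# Convert power-limit slider percentages into absolute board-power targets,
# assuming the TITAN X Pascal's 250 W reference TDP.

REFERENCE_TDP_W = 250  # assumed reference board power

def power_target_watts(limit_percent):
    """Board-power target implied by a given power-limit percentage."""
    return REFERENCE_TDP_W * limit_percent / 100.0

for pct in (100, 110, 120, 130):
    print(f"{pct}% -> {power_target_watts(pct):.0f} W")
# 120% (the stock XP cap) works out to 300 W; a 130% mod would allow 325 W.
```

Which is why the "starved for power" complaint comes up: the stock 120% cap already pushes roughly 300 W through one 8-pin, one 6-pin, and the slot.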


----------



## toncij

Quote:


> Originally Posted by *KillerBee33*
> 
> Cheked The Division ,
> Titan @ 2100/1377 6700K @ Stock
> 2160p everything Maxed out FX AA OFF , Supersampling ON , Nvidia HFTS ON 30-45FPS-HFTS OFF 50-60FPS
> Vsync OFF -No frame limit
> This is surprisingly good


Hmm, 30-45? I get 45 on a 1080 overclocked to 2.15. I'll screenshot my settings and get back to you. 30-45 is seriously low for a TXP. But 50-60 with HFTS off is very similar to my projection of 48-62 FPS for a TXP, so I guess my results assumed the Ultra preset (HFTS off). Will double-check.

My TXPs are still stuck in transit


----------



## DADDYDC650

Quote:


> Originally Posted by *skypine27*
> 
> Hey bro how are you HOF number 8 ?
> 
> I'm running 2 XPs in SLI and nipping at 14k but not even in the top 100?


That's because he's number 8 with 1 card.


----------



## pez

Quote:


> Originally Posted by *combat fighter*
> 
> I disagree.
> 
> Plenty of hardware review programmes on TV come to the conclusion curved adds nothing to the TV set as well.
> 
> That's a fact.
> 
> Anyway enough of this curved TV bollocks, back on track with Titan X Pascal. . .


I'm confused how many people's opinions make something a fact... that's not quite how it works.

Also, your avatar has a picture of what I'm assuming is your setup, with a curved monitor. I know it's not a TV... but... lol.


----------



## skypine27

Quote:


> Originally Posted by *DADDYDC650*
> 
> That's because he's number 8 with 1 card.


Ahhhhh, OK. I'm not at home right now to look closer at the lists.

But I can't crack the 14K barrier needed to get into the top 100 on the list I saw.

I have 2 XPs with a 6950X, and 13,700 is my best so far. I didn't realize the lists were segregated by number of cards; the one that shows by default for me is dominated by 4x Titan X (Maxwell) setups blowing my score away.

Thx for the info.


----------



## dante`afk

DX12 is now available after a patch in DX: Mankind Divided.

The game is however boring as f*ck; considering refunding.


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> Hmm, 30-45? I get 45 in a 1080 overclocked to 2.15. I'll screenshot my settings and get back to you. 30-45 is seriously low for a TXP. But, 50-60 with HFTS OFF is very similar to my projection of 48-62 FPS for a TXP. I guess my results have Ultra preset (HFTS OFF) then. Will double check.
> 
> My TXPs still stuck in transit


Not the Ultra in-game preset; I manually maxed the game.


----------



## skypine27

Quote:


> Originally Posted by *dante`afk*
> 
> DX12 is now available after a patch in dx:mankind divided.
> 
> Game is however boring as f*ck, considering refunding.


I'm on the same page. Frame rates for me are pushing 100, which is what my X34 runs at (2x Titan XPs and a 6950X), but I don't like the gameplay. The shooting especially feels "fake" (I'm an avid shooter in real life, though obviously I have never fired full auto at a person!); it really feels off. Games like Arma really nail the feel of firing a rifle, and even Far Cry 4 is miles ahead of Mankind Divided with respect to gun mechanics.

I realize gunplay isn't supposed to be its main pull, since it's set that far into the future. But the "hold space to move behind this box" mechanic isn't exciting to me either.

I've only requested one refund in my entire Steam career, and they approved it the same day (No Man's Sky), but Mankind Divided feels like it's going to be my second one very soon.


----------



## GunnzAkimbo

Quote:


> Originally Posted by *Baasha*
> 
> 
> 
> 3DMark Fire Strike Ultra: *20,231* http://www.3dmark.com/3dm/14260635


20000 and no LED rainbow pulsing breathing lightning music sync effect... we cannot accept this. Your score has been removed due to noncompliant system specs.


----------



## skypine27

Quote:


> Originally Posted by *GunnzAkimbo*
> 
> 20000 and no LED rainbow pulsing breathing lightning music sync effect... we cannot accept this. Your score has been removed due to noncompliant system specs.


****.

You likely don't want to see the R5E v10 and thermaltake RGB LED fans I have ready to go as soon as the EK acetal monoblock is available for the v10.....


----------



## combat fighter

Quote:


> Originally Posted by *pez*
> 
> I'm confused how many peoples opinions make something a fact...that's not quite how it works.
> 
> Also, your avatar has a picture of what I'm assuming is your setup with a curved monitor. I know it's not a TV...but...lol.


Not sure what the lol is for, but have one back: your avatar looks gay! (LOL!)

Yes, you're right about my avatar; it's an X34 with a very, very slight curve (3800R), and you sit 12" away from it...

A TV is a totally different thing; if you can't work that out, I really can't be bothered.


----------



## MrKenzie

Quote:


> Originally Posted by *toncij*
> 
> Hmm, 30-45? I get 45 in a 1080 overclocked to 2.15. I'll screenshot my settings and get back to you. 30-45 is seriously low for a TXP. But, 50-60 with HFTS OFF is very similar to my projection of 48-62 FPS for a TXP. I guess my results have Ultra preset (HFTS OFF) then. Will double check.
> 
> My TXPs still stuck in transit


Can you try it with the in-game benchmark? I get a 26.6 FPS average with 1x 1080 @ 2101MHz.
3840x2160, MSAA off, all other settings maxed.

I doubt anyone would manage over 35 FPS on a 1080.


----------



## st0necold

Quote:


> Originally Posted by *unreality*
> 
> You seem to have some kind of disorder by replying 3 times after your own posts
> 
> I already proved you wrong several pages ago, by posting benchmarks of SLI TXps @ 1440p
> 
> For 4k+ this changes of course.
> 
> Stil leach to their own, i do know i will have the better experience with a single heavy OCed TXp under water


You'll have a great experience, but one TXP is not better than 2... come on, man.


----------



## Neon01

Quote:


> Originally Posted by *combat fighter*
> 
> Fact is curved TV's add nothing to the experience apart from distortion. It's a failed fad (thankfully!)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Only have to look at LG's 2016 oled line up to see they are back tracking with the whole curved TV idea and focusing on flat versions from now on.
> 
> Apart from one model all the others are flat.
> 
> I would not be surprised one bit to see the 2017 line up to be all flat.
> 
> Most people on AVS Forum hate curved TV's as well, and I'm one of them.


I don't know about that. I just picked up the 2016 LG OLED55C6P (curved), and I only wish it were a little more curved. I agree it's not something I enjoy from the other side of the room, where the curvature radius of a typical 65" is completely nonsensical, but from a PC-monitor standpoint it's fantastic. I was using a 49" flat UHDTV (a 2014 LG) as my PC monitor before the curved OLED, and seeing the edges was actually a little tough since they were so oblique to my perpendicular sight line. With even the slight curve on the OLED, I can see right to the edge perfectly. And that's a larger display, which would normally only exacerbate the problem!


----------



## xzamples

What is the best aftermarket AIR cooler for this GPU?

Is the Accelero Xtreme IV the only one available for it right now?


----------



## Gary2015

Quote:


> Originally Posted by *Woundingchaney*
> 
> This seems to be the case with the game as it is very demanding or poorly coded. What settings are you using?


Maxed, but even when I turn off MSAA and lower to medium I only get 45 FPS...


----------



## Gary2015

Quote:


> Originally Posted by *toncij*
> 
> Hmm, 30-45? I get 45 in a 1080 overclocked to 2.15. I'll screenshot my settings and get back to you. 30-45 is seriously low for a TXP. But, 50-60 with HFTS OFF is very similar to my projection of 48-62 FPS for a TXP. I guess my results have Ultra preset (HFTS OFF) then. Will double check.
> 
> My TXPs still stuck in transit


That's everything on Ultra with MSAA at 4x. But even if I lower to medium, turn off MSAA, and go exclusive fullscreen, it's only 45-50 FPS. I didn't pay $2,500 to play at this frame rate.


----------



## Woundingchaney

Quote:


> Originally Posted by *Gary2015*
> 
> Maxed but even when I turn off MSAA and lower to medium I only get 45fps...


Yeah, I have been playing for a few hours. Many of the settings (even the ones you'd suspect of being performance-impacting) don't have a dramatic effect on performance. There is a good bit of speculation that it's a sub-par port.


----------



## Gary2015

Quote:


> Originally Posted by *Woundingchaney*
> 
> Yeah I have been playing for a few hours. It seems that many of the setting (even the ones that would be suspect as being performance impacting) don't have a dramatic impact on performance. There is a good bit of speculation as it being a sub-par port.


I'm going to get a refund on Steam if they don't patch it next week. I only played 30 minutes, so I still have time left.


----------



## Gary2015

Quote:


> Originally Posted by *dante`afk*
> 
> DX12 is now available after a patch in dx:mankind divided.
> 
> Game is however boring as f*ck, considering refunding.


At least you got as far as playing it. I'm not used to playing at anything below 60 FPS. Reminds me of those Wing Commander days when 20 FPS was the norm.


----------



## pez

Quote:


> Originally Posted by *combat fighter*
> 
> Not sure what the lol is for but have one back, your avatar looks gay! (LOL!)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes your right about my avatar, it's a X34 with a very, very slight curve (3800R) and you sit 12" away from it. . .
> 
> TV is a totally different thing, if you can't work that out I really can't be bothered.


Just the irony of your side of the argument. I get the distortion argument, but you presented 'many people prefer x' as 'this is a fact'. That's not true.


----------



## Gary2015

Quote:


> Originally Posted by *skypine27*
> 
> I'm on the same page. Frame rates for me are pushing 100 which is what my X34 runs at (2 X Titan XPs and a 6950x) but I don't like the gameplay. The shooting especially feels "fake" (I'm an avid shooter in RL though obviously I have never fired full auto at a person!) but it really feels off. Games like Arma really nail the feel of firing a rifle and even Far Cry 4 is miles ahead of mankind with respect to the gun mechanics.
> 
> I realize gun play isn't supposed to be its main pull since it's set that far into the future. But the "hold space down to move to behind this box" isn't exciting to me either.
> 
> I've only requested one refund my entire Steam career and they approved it the same day (No Mans Sky) but Mankind fees like it's going to be my 2nd one very soon.


You know what they say... you wait an hour for a bus, then 2 show up! Just requested a refund. Not wasting any more time on this.


----------



## Lobotomite430

Quote:


> Originally Posted by *xzamples*
> 
> what is the best aftermarket AIR cooler for this gpu?
> 
> is the Accelero Xtreme IV the only one available for it right now?


I ordered the Accelero Xtreme III and an EVGA 1080/1070 hybrid kit. Both will be here by this weekend, but I think I'll go with the EVGA for aesthetic reasons.


----------



## combat fighter

Quote:


> Originally Posted by *pez*
> 
> Just the irony of your side of the argument. I get the distortion argument, but you provided that 'many people prefer x' = 'this is a fact'. That's not true.


http://smg.photobucket.com/user/scoobiedave/media/5149210491_8fbc4fa1e8_zpski9yvvnb.jpg.html


----------



## pez

And another troll with nothing useful to say is blocked.


----------



## Zurv

Oh blah! Here is a Rampage V / SLI question:

Do I *NEED* to use PCIe slots 1 and 3, or is that just suggested? Yes, using 1 and 3 would give me 16x on both (I can live with 16x/8x).
I just put 2 TXPs in that system (waterblocked, etc.) and the performance is a little wacky. I'm not sure if the issue is that I'm using PCIe slots 1 and 2, or something else.
My gut tells me it's something else, as that computer ran 3 Titan XMs for a long time and 3 1080s for a short time (and the 1080s were gimped to only 2-way SLI).

I'd rather just throw in a 3rd card than redo the water loop.

That said, what are people doing for the bridge for 2-way SLI on the Rampage V? I really like the solid terminal bridges from EK and don't want to go back to the leaky nightmares of using G1/4 fittings.

(I'm thinking of upgrading my main box from the X99-E WS to the Rampage V Edition 10... so I'll need a better water-bridge option for that too, I guess... ugh)


----------



## combat fighter

Quote:


> Originally Posted by *pez*
> 
> And another troll with nothing useful to say is blocked
> 
> 
> 
> 
> 
> 
> 
> .


Never been called a troll before! LOL

Anyway,

I'd rather get back on track with Titan X Pascal overclocking...


----------



## DADDYDC650

Quote:


> Originally Posted by *combat fighter*
> 
> I'd rather get back on track with Titan X Pascal over clocking. . .


He blocked you pal. It's over!


----------



## combat fighter

Quote:


> Originally Posted by *DADDYDC650*
> 
> He blocked you pal. It's over!


Oh no!

Don't know how I am going to sleep now!

LOL


----------



## CallsignVega

Quote:


> Originally Posted by *Zurv*
> 
> oh blah! here is a rampage V/SLI question
> 
> Do I *NEED* to use PCI 1 and 3 or is that just suggested? yes, using 1 and 3 would give me 16x on both. (I can live with 16/8.)
> I just put 2 TXP in that system (waterblocked/etc) and the perf is a little wacky. I'm not sure if the issue is because i'm using pci 1 and 2 or something else.
> My gut tells me it is something else as that computer has been running 3 titan XM for a long time and 3 1080s for a short time (and the 1080s were gimped to only 2 way SLI)
> 
> I'd rather just throw in a 3rd card then redo the water
> 
> 
> 
> 
> 
> 
> 
> 
> 
> that said, what are people doing for the bridge for 2 way SLI on the rampage V? I really like the solid terminal bridges from EK and don't want to go back to the leaky nightmares of using the g1/4.
> 
> (i'm thinking of upgrading my main box from the x99-e ws to the 10 year rampage v... so i'll need a better water bridge option for that i guess... ugh)


You just need the longest bridge like I have here:



The RVE is actually pretty good for spacing and air cooling.


----------



## Lobotomite430

Quote:


> Originally Posted by *CallsignVega*
> 
> You just need the longest bridge like I have here:
> 
> 
> 
> The RVE is actually pretty good for spacing and air cooling.


I have the cooler you're using on the way. Which is better, the Arctic or the EVGA hybrid?


----------



## Tideman

Anybody else getting a bugged GPU memory usage reading in AB? It constantly shows something like 44000 for me in Windows 10...


----------



## Zurv

Quote:


> Originally Posted by *CallsignVega*
> 
> You just need the longest bridge like I have here:
> 
> 
> 
> The RVE is actually pretty good for spacing and air cooling.


I should have been more clear... WATER bridge.

I wonder if I can just close up the middle ports on a 3-way EK terminal bridge and just use the ends... hrmm...


----------



## pez

Zurv,

People are actually cutting the edges off of the HB bridge with good success. I'm sure the more time taken to shave/cut it, the better it will look.









----------



## Zurv

Quote:


> Originally Posted by *pez*
> 
> Zurv,
> 
> People are actually cutting the edges off of the HB with good success. I'm sure the more time taken to shave/cut it, the better it will look
> 
> 
> 
> 
> 
> 
> 


i'm not talking about SLI bridges.. but water bridges.


----------



## mattlach

Quote:


> Originally Posted by *pez*
> 
> Just the irony of your side of the argument. I get the distortion argument, but you provided that 'many people prefer x' = 'this is a fact'. That's not true.


It's a sad state of affairs when people don't know the difference between the objective and the subjective. And we let these people vote...









The statement that 9 out of 10 dentists recommend something may be a fact, but only insofar as it is a fact that the OPINION of 9 out of 10 dentists favors it. The most you are doing here is making an objective statement about the dentists' subjective opinion.

To make it an objective, factual statement you need some sort of performance metric, but even when you do have an objective measure of something, it's almost impossible to show that one thing is objectively better than another across the board. There is no best anything. "Best" is always an opinion. There isn't even a "better" anything. The most you can hope for is "better at task X" or "better in situation Y". Across the board, "best" does not exist. Across the board, "better" is extremely rare.

Even when you have an objective performance metric, let's say brightness: Screen A has a higher measured brightness than Screen B. That doesn't necessarily mean that Screen A has "better" brightness than Screen B; it just means it is brighter. This might be great for some people or in some situations (sunlight) but worse for others, as it might be too bright or hurt their eyes.

Best does not exist, and better only rarely exists unless you really narrow things down to a very specific setting or task.

Now back to screens.

I have a curved screen, and for my use it is better than a flat screen: I sit very close to a large screen, and the curve lets me see the corners without viewing-angle distortions, whereas sitting this close to a flat screen of similar size would be a problem. It might have had some impact on light uniformity, but I can't really tell in my use.

I wouldn't use a curved screen as a TV for viewing across a room though. They don't seem well suited to that.


----------



## CallsignVega

Quote:


> Originally Posted by *Lobotomite430*
> 
> I have this cooler you are using on the way which is better Arctic or EVGA hybrid?


I prefer the Arctic. The EVGA Hybrid kept my cards a few degrees cooler, but I'll take the simplicity and zero risk of air over those cheap AIO water coolers that make strange noises.









Quote:


> Originally Posted by *Zurv*
> 
> i should have been more clear.. WATER bridge
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i wonder if i can just close up the middle ports on a 3 way EK terminal bridge and just use the ends... hrmm...


Yes, you can do that. I did it on my 4-way EK bridge.


----------



## Zurv

Quote:


> Originally Posted by *CallsignVega*
> 
> I prefer the Arctic. The EVGA Hybrid kept my cards a few degrees cooler, but I'll take the simplicity and zero risk of air over those cheap AIO water coolers that make strange noises.
> 
> 
> 
> 
> 
> 
> 
> 
> Yes, you can do that. I did it on my 4-way EK bridge.


what did you plug them with? can i use the tops of the waterblocks that are taken off the cards to connect them to terminals? That should work.. the only downside might be the grooves for the rubber on both sides...


----------



## mattlach

Quote:


> Originally Posted by *Baasha*
> 
> 
> 
> 3DMark Fire Strike Ultra: *20,231* http://www.3dmark.com/3dm/14260635


But why? Do you get off on benchmarks or something?


----------



## pez

Quote:


> Originally Posted by *Zurv*
> 
> i'm not talking about SLI bridges.. but water bridges.


Ah, I must have confused this conversation with another. My bad.







Quote:


> Originally Posted by *mattlach*
> 
> It's a sad state of affairs when people don't know the difference between the objective and the subjective. and we let these people vote...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The statement that 9 out of 10 dentists recommend may be a fact, but only in as much as it is a fact that the OPINION of 9 out of 10 dentists think something. The most you are doing here is making an objective statement about the dentists subjective opinion.
> 
> To make it an objective, factual statement you need some sort of performance metric, but even when you do have an objective measure of something, it's almost impossible to show that one thing is objectively better than another across the board. There is no best anything. "Best" is always an opinion. There isn't even a "better" anything. The most you can hope for is "better at task X" or "better in situation Y". Across the board, "best" does not exist. Across the board, "better" is extremely rare.
> 
> Even when you have an objective performance metric, let's say brightness: Screen A has a higher measured brightness than Screen B. That doesn't necessarily mean that Screen A has "better" brightness than Screen B; it just means it is brighter. This might be great for some people or in some situations (sunlight) but worse for others, as it might be too bright or hurt their eyes.
> 
> Best does not exist, better only rarely exists unless you really narrow things down to a very specific setting or task.
> 
> Now back to screens.
> 
> I have a curved screen, and for my use it is better than a flat screen: I sit very close to a large screen, and the curve lets me see the corners without viewing-angle distortions, whereas sitting this close to a flat screen of similar size would be a problem. It might have had some impact on light uniformity, but I can't really tell in my use.
> 
> I wouldn't use a curved screen as a TV for viewing across a room though. They don't seem well suited to that.


Yeah... I dunno. I can only smile at ignorance and move on.









----------



## mattlach

Quote:


> Originally Posted by *combat fighter*
> 
> Plenty of hardware review programmes on TV come to the conclusion curved adds nothing to the TV set as well.
> 
> That's a fact.


It's a fact that that is THEIR OPINION (or it may be; I haven't verified it).

I would probably even agree with that opinion IF USED AS A TV.

When sitting really close to a very large monitor, a mild curve is almost essential!

Again, subjective vs objective. The distinctions are very important.


----------



## mattlach

Quote:


> Originally Posted by *skypine27*
> 
> Any mildly modded BIOS yet?
> 
> On my previous Titan X setup I was using a very simple modded BIOS that let you increase the "power" slider from the factory 110% limit up to a modded 120% limit. This allowed my cards to hold a slightly higher clock consistently in very heavy gaming scenes. The mod was so simple, using whatever the program was called to edit the BIOS, that I could see exactly what he'd done, and with basic high-school math I could have changed the numbers to allow 125%, etc. (though I didn't)
> 
> Any chance of something like that for the XPs? I would be curious whether a modded power limit of, say, 125-130% (since the XP seems to allow 120% by default) would have the same effect.


It's probably going to be a while, if we ever get a modded BIOS.

The BIOS is signed this time around, so essentially it's like waiting for a root or a jailbreak. Someone might be able to figure it out, someone might not. It's not going to be as straightforward as previous generations.


----------



## DADDYDC650

Ugh, where are the BIOS mods? Now we're arguing about monitors and their curves!

BTW, I like my curves on women.


----------



## mattlach

Quote:


> Originally Posted by *DADDYDC650*
> 
> Ugh, where are the BIOS mods? Now we're arguing about monitors and their curves!
> 
> BTW, I like my curves on women.


I wouldn't stay up at night waiting. As I mentioned in the post above yours, the BIOS files are digitally signed this time around.

It's going to take A LOT of effort to break them.

In the past it was an easier task, and those trying could rely on leaks from inside board partners to get the tools and information they needed for it.

This time around the task is MUCH more difficult due to the digital signing; there are no board partners for this card, and Nvidia is much better at stopping leaks than the board partners were.

It's going to be more like waiting for an Android root / unlocked bootloader or an iPhone jailbreak, except the volume of Titan X buyers is much smaller than the volume of phone buyers, so there will be far fewer people working on it.

Someone might have a breakthrough at some point and succeed, but it is just as possible that there will never be a successfully modded BIOS for the new Titan.


----------



## Gary2015

Quote:


> Originally Posted by *combat fighter*
> 
> Oh no!
> 
> Don't know how I am going to sleep now!
> 
> LOL


Quote:


> Originally Posted by *DADDYDC650*
> 
> Ugh, where are the BIOS mods? Now we're arguing about monitors and their curves!
> 
> BTW, I like my curves on women.


I have a feeling we won't get one, bro.


----------



## mattlach

Quote:


> Originally Posted by *CallsignVega*
> 
> You just need the longest bridge like I have here:
> 
> 
> 
> The RVE is actually pretty good for spacing and air cooling.


Ahh, so you ARE the same Vega as over on the [H]


----------



## KillerBee33

Quote:


> Originally Posted by *Gary2015*
> 
> I have a feeling we won't get one, bro.


The Titan XP boosting itself 40% over stock from the factory is exactly what we were able to do with most Maxwell cards using a custom BIOS: about 35% over stock, and 40% in rare cases.
It really is doubtful we're gonna get BIOS tools


----------



## mattlach

Quote:


> Originally Posted by *KillerBee33*
> 
> The Titan XP boosting itself 40% over stock from the factory is exactly what we were able to do with most Maxwell cards using a custom BIOS: about 35% over stock, and 40% in rare cases.
> It really is doubtful we're gonna get BIOS tools


Well, there's always the shunt mod.

I just wish there was a way to give me better warm fuzzies about it.

I neither want to put liquid metal paste on the resistors nor solder over them. Both seem too risky to me.

I wonder if a conductive silver trace pen might work better. Anyone know if those can be wiped off if you change your mind?


----------



## KillerBee33

Quote:


> Originally Posted by *mattlach*
> 
> Well, there's always the shunt mod.
> 
> I just wish there was a way to give me better warm fuzzies about it.
> 
> I neither want to put liquid metal paste on the resistors nor solder over them. Both seem too risky to me.
> 
> I wonder if a conductive silver trace pen might work better. Anyone know if those can be wiped off if you change your mind?


The SHUNT MOD gives you stability, not a higher OC. We do need that stability, but not this way. It's just brutal IMO


----------



## mattlach

Quote:


> Originally Posted by *KillerBee33*
> 
> SHUNT MOD gives you stability , not higher OC. We do need that stability but not this way . It's just brutal IMO


Well, as I understand it (and I could be wrong) at least the shunt mod will allow us to take full advantage of the limited voltage control we currently have.

Right now, you can turn up the voltage slider to +100%, which gives you at most 1.0925 V, but in most cases you are going to hit the +120% power limit WAY before you take full advantage of that voltage boost.

The shunt mod tricks the card into thinking it's using less power, circumventing the power limit, so - in theory - with sufficient cooling you will at least be able to take full advantage of the 1.0925 V.

That is nowhere near as much flexibility as you'd get with a custom BIOS, but it is better than nothing.
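For intuition, here is a rough sketch of the trick. All numbers below are illustrative assumptions, not measurements from the card: the controller estimates current from the voltage drop across a tiny sense resistor, so lowering the effective resistance makes it under-report power.

```python
# Hedged sketch of how a shunt mod fools the power limiter.
# Resistor value and currents are assumptions, not measured from a Titan X.

SHUNT_OHMS = 0.005   # assume a 5 milliohm current-sense resistor
RAIL_V = 12.0        # 12 V input rail

def reported_power(actual_current_a, effective_shunt_ohms):
    """Power the card *thinks* it draws; it assumes the stock shunt value."""
    v_drop = actual_current_a * effective_shunt_ohms
    sensed_current = v_drop / SHUNT_OHMS
    return sensed_current * RAIL_V

stock = reported_power(25.0, SHUNT_OHMS)        # 25 A -> 300 W reported
modded = reported_power(25.0, SHUNT_OHMS / 2)   # same 25 A -> 150 W reported

# With the sensed value halved, the card stays under its power cap at
# nearly twice the real draw -- which is why cooling becomes your problem.
print(stock, modded)
```

Same real current, half the reported power: the limiter stops tripping, so the voltage ceiling actually becomes usable.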


----------



## mattlach

Side note.

Does anyone know if the power limit includes ALL power the board uses (including RAM, fan, etc.) or if it is just the core?


----------



## fernlander

Quote:


> Originally Posted by *KillerBee33*
> 
> The Titan XP boosting itself 40% over stock from the factory is exactly what we were able to do with most Maxwell cards using a custom BIOS: about 35% over stock, and 40% in rare cases.
> It really is doubtful we're gonna get BIOS tools


On the TXM I got about an extra 20 MHz on the core. However, the whole deal was more stable, and that's what I'm looking for in a BIOS mod. The overall clock-speed increase for someone on air or water isn't going to be much different.

The thing I think we need is to uncap the power limit.


----------



## KillerBee33

Quote:


> Originally Posted by *fernlander*
> 
> On TXM I got about an extra 20MHz on the core. However the whole deal was more stable and that's what I'm looking for in a BIOS mod. The overall clock speed increase for someone on air or water isn't going to be much different.
> 
> The thing I think we need is to uncap the power limit.


Yeap, saw this on the 1080: raising voltage isn't beneficial unless we get around the power limit.
EDIT: if not a BIOS, then hopefully the OC software will get out of beta and finally give us fuller control


----------



## CallsignVega

Quote:


> Originally Posted by *mattlach*
> 
> Ahh, so you ARE the same Vega as over on the [H]


----------



## Jpmboy

Quote:


> Originally Posted by *Zurv*
> 
> i should have been more clear.. WATER bridge
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i wonder if i can just close up the middle ports on a 3 way EK terminal bridge and just use the ends... hrmm...


I use the EK 3-way terminal bridge (or whatever the proper name is) with the metal plate to block the slot-2 opening. Whether to use slots 1-2 or 1-3 can depend on whether you have an x4 card in slot 4 _AND_ are running an M.2 NVMe drive (vs AHCI). For most any gaming purpose, 1-2 will be indistinguishable from 1-3. But, at the limit, any card running at x8 forces all its stablemates to communicate at x8 in reality. The "true" concurrent bandwidth is driven by the slowest participant.
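For a sense of the numbers behind the x8 vs x16 question, a quick back-of-envelope calculation (theoretical PCIe 3.0 line rates; real-world throughput is lower):

```python
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding.
GT_PER_LANE = 8.0
ENCODING = 128 / 130

def gb_per_s(lanes):
    """Theoretical one-direction bandwidth in GB/s for a PCIe 3.0 link."""
    return lanes * GT_PER_LANE * ENCODING / 8

x16 = gb_per_s(16)  # ~15.75 GB/s
x8 = gb_per_s(8)    # ~7.88 GB/s

# As noted above, concurrent traffic runs at the slowest participant's rate:
effective = min(x16, x8)
print(f"x16: {x16:.2f} GB/s, x8: {x8:.2f} GB/s, mixed pair: {effective:.2f} GB/s")
```

Which is why, for gaming, the difference between 16/16 and 16/8 rarely shows up: neither game traffic nor SLI transfers saturate even the x8 figure for long.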








Quote:


> Originally Posted by *DADDYDC650*
> 
> Ugh, where are the BIOS mods? Now we're arguing about monitors and their curves!
> 
> BTW, I like my curves on women.


Just no 60" curves please...


Spoiler: Warning: Spoiler!






That oughta lock up a few neurons.








Quote:


> Originally Posted by *KillerBee33*
> 
> SHUNT MOD gives you stability , not higher OC. We do need that stability but not this way . It's just brutal IMO


Low temps also give stability.. and higher-steady clocks.


----------



## KillerBee33

I know most Titan owners are on custom loops; just wanted to share this here for the others


----------



## ottoore

Quote:


> Originally Posted by *bl4ckdot*
> 
> Thank you. Did you use EK's thermal pad and paste ? I'm trying to search where I can buy Fujipoly pads in France, and I have yet to succeed at that.


Aquatuning sells them: 17 W/mK thermal pads (Sarcon XR-m)


----------



## mattlach

Quote:


> Originally Posted by *Jpmboy*
> 
> Low temps also give stability.. and higher-steady clocks.


That they do.

I haven't started playing with voltage adjustments yet, because I want to get my new custom fan controller in there first, but here's where I stand.

With my EK block and custom loop, under load (I used the Heaven benchmark looping as my load), my max OC without touching voltages is +170 MHz, which results in 2063 MHz.

My GPU temp flatlines at 34-35C. It's funny to look at the chart - completely straight. So is the core clock line. It never throttles down below 2063 MHz.

Unfortunately I didn't win the silicon lottery this time around. This is an OK result but not the best out there.

I'm hoping that by adding voltage I can get it up to 2100 or above, but I'm very rapidly going to hit the power limit when I do, which is why I am thinking about shunt mods, because an unlocked BIOS is unlikely to come our way anytime soon. They just scare me a little. I'm trying to be creative and come up with safer ways of doing it.

I've killed an expensive (but not this expensive) GPU before within weeks of launch. I don't want to do it again


----------



## EQBoss

Quote:


> Originally Posted by *mattlach*
> 
> How are you exceeding the +120% limit? Have you done a shunt mod?


When I monitor power usage, Afterburner reports 126% at some points in the game even though I set it to 120%. No hard mods done.


----------



## Jpmboy

Quote:


> Originally Posted by *mattlach*
> 
> That they do.
> 
> I haven't started playing with voltage adjustments yet, because I want to get my new custom fan controller in there first, but here's where I stand.
> 
> With my EK block and custom loop, under load (I used Heaven benchmark on a loop as my load) my max OC without touching voltages is +170Mhz which results in 2063Mhz.
> 
> My GPU temp flatlines at 34-35C. It's funny to look at the chart. Completely straight. So is the core clock line. It never throttles down below 2063Mhz.
> 
> Unfortunately I didn't win the silicon lottery this time around. This is an OK result but not the best out there.
> 
> I'm hoping that by adding voltage I can get it up to 2100 or above, but I'm very rapidly going to hit the power limit when I do, which is why I am thinking about shunt mods, because an unlocked BIOS is unlikely to come our way anytime soon. They just scare me a little. I'm trying to be creative and come up with safer ways of doing it.
> 
> I've killed an expensive (but not this expensive) GPU before within weeks of launch. I don't want to do it again


Yeah - I'm not too optimistic about the effect of adding voltage at ambient temperatures. Raising the power limit? Sure... that will unlock the potential in a lot of cards, if they are kept below 40C.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> Yeah - I'm not too optimistic about the effect of adding voltage at ambient temperatures. Raising the power limit? Sure... that will unlock the potential in a lot of cards, if they are kept below 40C.


Stop it with the UNDER 40 talk, that really makes me want to get a custom loop. Tried The Division @ 4K last night: 10 min in and already 63 degrees


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Stop it with UNDER 40 talk , that really makes me want to get custom loop, tried The Division @ 4K last night , 10 min in and already 63 degrees


Eh - winter in a few months. Your card will be very comfortable.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> Eh - winter in a few months. Your card will be very comfortable.


Top floor of the building, about a mile from the ocean. Who cares about the card - I'll be freezing my fingers off








But yes, yesterday: 86 degrees, AC off, windows open, liquid temps 40 @ idle


----------



## toncij

Quote:


> Originally Posted by *KillerBee33*
> 
> Stop it with UNDER 40 talk , that really makes me want to get custom loop, tried The Division @ 4K last night , 10 min in and already 63 degrees


Quote:


> Originally Posted by *KillerBee33*
> 
> Last floor in the building about a mile away from the ocean , who cares about that card , i'll be freezing my fingers off
> 
> 
> 
> 
> 
> 
> 
> 
> But yes yesterday 86 degrees AC OFF windows Open , Liquid Temps 40 @ idle


I'm about 300 m from the sea to the north and south (small peninsula; the northern wind is extremely cold here, with gusts of about 100-150 km/h), also open from the south... 8th floor. Nothing on either side (yes, fantastic view)... It takes a lot of effort to keep 19°C during winter and prevent it from sliding lower.









But summers (now)... 35°C is a norm.

Now, that custom loop really sounds fantastic...


----------



## KillerBee33

Quote:


> Originally Posted by *toncij*
> 
> I'm about 300m from the sea north and south (small peninsula) (northern wind is extremely cold here, gusts about 100-150kmph), also open from the south... 8th floor. Nothing from either side (yes, fantastic view)... Need a lot of effort to keep 19°C during winter and prevent it from sliding lower.
> 
> 
> 
> 
> 
> 
> 
> 
> But summers (now)... 35°C is a norm.
> 
> Now, that custom loop really sounds fantastic...


Uhummm, I might go for that just as soon as the 1080 sells. Just not sure which; Swiftech will be a faster option, I think.


----------



## Edge0fsanity

Anyone know how to add the option for voltage adjustment to the Afterburner beta? I seem to recall seeing it in this thread at some point but can't find the post.


----------



## mattlach

Quote:


> Originally Posted by *KillerBee33*
> 
> Stop it with UNDER 40 talk , that really makes me want to get custom loop, tried The Division @ 4K last night , 10 min in and already 63 degrees


DOOO EEEEEHT!

Just finished building my first. GPU is nice and happy at 34-35C under full load. 74F/24C ambient.


----------



## mattlach

Quote:


> Originally Posted by *Edge0fsanity*
> 
> anyone know how to add the option for voltage adjustment to afterburner beta? I seem to recall seeing it in this thread at some point but can't find the post.


See below:
Quote:


> Originally Posted by *HaniWithAnI*
> 
> Unwinder replied to me in the Afterburner support thread and confirmed that to enable Voltage offset in Afterburner for TITAN XP you just need to do the following:
> In my experience it doesn't do much as the TL/PL seems to be causing the card to choose a lower voltage regardless of offset (and despite me being under targets for both??? :S), so I end up at the same voltage as before under load. Will likely be most useful to those who have already performed shunt mod or watercooling. Will try it again once I have my hybrid applied, still haven't had the chance to fit it yet.


Quote:


> Originally Posted by *HaniWithAnI*
> 
> open the file that looks like "VEN_10DE&DEV_....cfg" in the afterburner install directory -> profiles folder - you can open it in notepad
> 
> paste the following lines (as is, both of them) to the end of it then save. That's all.
> 
> [Settings]
> VDDC_Generic_Detection = 1


----------



## KillerBee33

Quote:


> Originally Posted by *mattlach*
> 
> DOOO EEEEEHT!
> 
> Just finished building my first. GPU is nice and happy at 34-35C under full load. 74F/24C ambient.


Dilemma: the Titan is not enough for 100% of AAA titles @ 4K, so the next best thing is 1620p Ultra, which really isn't much of a visual difference, and @ 1620p it sits nicely under 55 degrees and without the slightest noise


----------



## mattlach

Quote:


> Originally Posted by *KillerBee33*
> 
> Dilemma: the Titan is not enough for 100% of AAA titles @ 4K, so the next best thing is 1620p Ultra, which really isn't much of a visual difference, and @ 1620p it sits nicely under 55 degrees and without the slightest noise


I don't think I've ever seen a 1620P screen.

I used to have a 2560x1600 16:10 screen.

I really miss the 16:10 aspect ratio. So much nicer than 16:9.


----------



## KillerBee33

Quote:


> Originally Posted by *mattlach*
> 
> I don't think I've ever seen a 1620P screen.
> 
> I used to have a 2560x1600 16:10 screen.
> 
> I really miss the 16:10 aspect ratio. So much nicer than 16:9.


I'm not on a 144hz Train







Simple TV+DSR is just fine here


----------



## Xanvast

I'm having a big issue with my Titan X cards in SLI. Card #2 crashes as soon as VRAM usage hits 12 GB, and with my usage that happens all the time in certain games.
Anyone else experiencing this? I didn't have this problem on the Maxwell TXs...
I'm on Windows 7.


----------



## mattlach

Quote:


> Originally Posted by *KillerBee33*
> 
> I'm not on a 144hz Train
> 
> 
> 
> 
> 
> 
> 
> Simple TV+DSR is just fine here


Me either.

I have a JS9000 TV at 60 Hz and use dynamic vsync. I don't have any of that FreeSync/G-Sync stuff.

I'm trying to figure out what to do in Deus Ex though. I may wind up having to do dynamic vsync at half the refresh rate and play at 30 Hz :/


----------



## KillerBee33

Quote:


> Originally Posted by *mattlach*
> 
> Me either.
> 
> I have a JS9000 TV at 60 Hz and use dynamic vsync. I don't have any of that FreeSync/G-Sync stuff.
> 
> I'm trying to figure out what to do in Deus Ex though. I may wind up having to do dynamic vsync at half the refresh rate and play at 30 Hz :/


Was gonna get this Sony X810C, but other than installing it by the toilet I've got no place to put my Toshiba, which is surprisingly good for its low price.
Really don't want good tech lying around, and selling it is a pain.








Buying something is not an issue, it's the idea of letting something go for Dirt Cheap that bothers me


----------



## HyperMatrix

Quote:


> Originally Posted by *mattlach*
> 
> Me either.
> 
> I have a JS9000 TV at 60 Hz and use dynamic vsync. I don't have any of that FreeSync/G-Sync stuff.
> 
> I'm trying to figure out what to do in Deus Ex though. I may wind up having to do dynamic vsync at half the refresh rate and play at 30 Hz :/


You should get G-Sync, man. 100 FPS average with temporal AA and 2x MSAA, and the game is not only playable but looks pretty damn good too.


----------



## skypine27

Quote:


> Originally Posted by *Gary2015*
> 
> You know what they say...you spend an hour waiting for a bus, then 2 show up! Just requested a refund. Not wasting anymore time on this.


Steam approved my Mankind refund yesterday.
2 X refunds in 2 X days. I'm starting to seem like a bad customer!


----------



## cisco0623

Quote:


> Originally Posted by *CallsignVega*
> 
> You just need the longest bridge like I have here:
> 
> 
> 
> The RVE is actually pretty good for spacing and air cooling.


Damn, I still have that original CPU cooler from ten years ago. Thermalright fix-14? Thing is a monster. I have it in storage lol. I went water after that and never looked back.


----------



## mattlach

Quote:


> Originally Posted by *HyperMatrix*
> 
> Should get GSYNC man. 100Fps average with temporal AA and 2x msaa and the game is not only playable, but looks pretty damn good too.


But then I have to give up this:



Unless you are aware of any good 42"-44" 4k Gsync screens


----------



## pompss

Will 1 mm Fujipoly thermal pads do the job for the RAM and VRMs of the Titan X, or do I need 0.5 mm?


----------



## bl4ckdot

Quote:


> Originally Posted by *pompss*
> 
> Will 1 mm Fujipoly thermal pads do the job for the RAM and VRMs of the Titan X, or do I need 0.5 mm?


According to EKWB ( https://www.ekwb.com/shop/EK-IM/EK-IM-3831109831670.pdf ) you need 0.5 mm for those; 1 mm is for the MOSFETs.


----------



## skypine27

FYI, even though EK officially says their block isn't compatible with the factory backplates, you can use them just fine.

The left half you can secure properly with the 2 x screws on the far left. Problem is, you are still left with 4 x kinda ugly "holes" where the main screws from the factory cooler were; these aren't used with the EK blocks. The right side you can't really secure properly, but you can just put it on top of the card and it stays put just fine. Here's mine:
http://s82.photobucket.com/user/skypine27/media/IMG_3268.jpg.html

On the bottom card, you can see one of the 4 x ugly holes I'm talking about, but you can also see the backplates will fit just fine until you decide to splurge on some custom EK ones.

EDIT: JPMBOY, ya that would work. So would a few dabs of thermal paste you have laying around. It won't do anything temp-wise, since the OEM backplates appear to be some kind of ABS plastic, but it would keep it secure. Honestly the loose half (i.e. the right) sits perfectly in place without any effort at all in a conventional case where the cards lay flat. If you had one of those cases where your cards are vertical, yeah, a small piece of double-sided servo tape from a hobby store would be perfect.


----------



## Jpmboy

Quote:


> Originally Posted by *skypine27*
> 
> FYI even though EK officially say their block isn't compatible with the factory backplates, you can use them just fine.
> 
> The left half you can secure properly with the 2 x screws on the far left. Problem is, you are still left with 4 x kinda ugly "holes" where the main screws from the factory cooler were; these aren't used with the EK blocks. The right side you can't really secure properly, but you can just put it on top of the card and it stays put just fine. Here's mine:
> http://s82.photobucket.com/user/skypine27/media/IMG_3268.jpg.html
> 
> On the bottom card, you can see one of the 4 x ugly holes I'm talking about but you can also see the backplates will fit just fine until you decide to splurge on some custom EK ones.


Just use some 2-sided tape on the loose half.


----------



## Dr Mad

Hello folks ^^

I installed the EK waterblock and did the shunt mod with Grizzly Conductonaut on only the first resistor (near the power connectors).
I surrounded the resistor with electrical tape and applied a thin layer.
Application is not very easy with the "cotton bud" provided by Thermal Grizzly....

With air cooling, the card OC'd to 2050 (+500 memory) hit ~110-115% on Firestrike and obviously could not maintain 2050 due to thermal throttling.
Pushing the voltage slider to the max resulted in 1.093 V, but that's useless since the card throttles almost instantly due to the power limit.

Once watercooled, Firestrike eats between 80 and 90% TDP, and up to 100% with 1.093 V at a constant 2100 MHz GPU frequency (+500 memory).

So the shunt mod on only one resistor provides ~20-25% more power headroom.

Concerning GPU temperature, it's ~8-9°C above water temp, as usual with EK blocks, I'd say.

25°C ambient temp --> 34°C GPU temp after 30 min of intensive load (custom loop with 3x RX480 v3 rads / 2x MCP35X, and a 5960X at 4500 on an R5E)
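A hedged bit of arithmetic on why bridging a single sense resistor can buy roughly that much headroom (the rail split and attenuation below are pure assumptions for illustration): if the controller sums several sensed rails and the mod attenuates one of them, total reported power drops by that rail's share of the attenuation.

```python
# Illustrative only: assume the real draw splits evenly over three sensed
# rails, and the liquid-metal bridge makes rail 0 read 40% of its true value.
true_rails_w = [100.0, 100.0, 100.0]
attenuation = [0.4, 1.0, 1.0]

reported = sum(w * a for w, a in zip(true_rails_w, attenuation))  # 240 W
headroom = sum(true_rails_w) / reported - 1

# ~25% more real power before the limiter trips -- the same ballpark as
# the ~20-25% observed above.
print(f"{headroom:.0%}")
```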


----------



## dVeLoPe

Can anyone who is neutral towards SLI post 1080 SLI vs. Titan XP side-by-side game benchmarks?


----------



## HyperMatrix

Quote:


> Originally Posted by *dVeLoPe*
> 
> Can anyone who is neutral towards SLI post 1080 SLI vs. Titan XP side-by-side game benchmarks?


I'm in the pro-SLI camp, and I would still take a single Titan XP over 1080 SLI. A single card is always the better option. And having a single Titan XP means you can always add a second card down the line if needed.


----------



## CallsignVega

Quote:


> Originally Posted by *cisco0623*
> 
> Damn I have that original cpu cooler still from ten years ago. Thermalright fix-14? Thing is a monster. I have it in storage lol I went water after that and never looked back.


No, it is a Noctua NH-D15. Water these days is losing a bit of its luster since chips are getting so power efficient. For example, this air cooler can keep my 6950X cool enough at 4.5 GHz; water really wouldn't raise that ceiling since you are voltage limited anyway.


----------



## Gary2015

Ok, so I spoke to a customer service rep today and he said any modifications to the card will void the warranty. I asked him about changing the stock cooler and he said, in an obtuse way, that that counts as a modification, so it will invalidate the warranty. But I guess they won't know if you put the stock cooler back on.


----------



## skypine27

Quote:


> Originally Posted by *dVeLoPe*
> 
> Can anyone who is neutral towards SLI post 1080 SLI vs. Titan XP side-by-side game benchmarks?


I don't know if you'll find any 1080 owners in this thread. We are pretty much all Titan XP owners.

Check the 1080 owners club and ask there for someone with 1080 SLI to post some game benches then come back to this thread and ask us for the same benches (assuming some of us have some of the same games)


----------



## toncij

Quote:


> Originally Posted by *dVeLoPe*
> 
> can anyone who is neutral towards SLI post a 1080 sli vs titan XP GAME side by side benchs?


I owned both and posted my test results already. Unfortunately not SLI TXP, just a single card.


----------



## willmaltby

Just noticed that the 1080 volt-modding guide at xdevs has been updated regarding the shunt mod:

https://xdevs.com/guide/pascal_oc/#step3

They now say NOT to shunt the 5 mΩ resistors directly, but to add 10 Ω resistors to a corresponding set of 3 capacitors located nearby:



I presume this applies directly to the Titan X?


----------



## Lissandro

Quote:


> Originally Posted by *toncij*
> 
> I owned both and posted my test results already. Unfortunately not SLI TXP, just a single card.


+REP!

Been looking for something like this, thank you!

Btw guys, any new rumors about 1080ti or vega coming this year?


----------



## cookiesowns

Quote:


> Originally Posted by *skypine27*
> 
> FYI even though EK officially say their block isn't compatible with the factory backplates, you can use them just fine.
> 
> EDIT: JPMBOY, ya that would work. So would a few dabs of thermal paste you have laying around. It won't do anything temp wise since the OEM backplates appear to be some kind of ABS plastic but it would keep it secure. Honestly the loose half (i.e. the right) sit perfectly in place without any effort at all, in a conventional case where the cards lay flat. If you had one of those cases where your cards were vertical, yeah a small piece of double sided servo tape from a hobby store would be perfect.


I don't think they are ABS plastic, but rather a very thin piece of alloy. Seems rather light for ABS plastic.


----------



## HyperMatrix

Someone try out these settings in Deus Ex. They're the best I could find that looked good while still maintaining an average fps of at least 100. Admittedly, in some areas I drop to 90fps, because the latest drivers broke overclocking on my cards and I have to run them under 2GHz; for the rest of you, that shouldn't be a problem. In buildings etc. it'll go up to 120-144fps. GPU usage generally sits at around 85% across both cards, meaning that with water blocks, a proper OC, and DX12, we can probably look forward to another 25-30% FPS compared to now. But it's honestly playable right now.

This is for 1440p, btw.

Couple things to note: screen space reflections. They're nice, but they're responsible for a solid 10% reduction in FPS, even in the normal "on" mode. Ultra is even worse. "Very High" shadows are also less taxing than "High" shadows. Noooo idea why.

edit: One mistake I made. You should keep shadows on Medium because Very High seems to introduce some hitching during many scene transitions. The screenshots below are all with Medium shadows.


----------



## mattlach

Quote:


> Originally Posted by *Gary2015*
> 
> OK, so I spoke to a customer service rep today and he said any modifications to the card will void the warranty. I asked him about changing the stock cooler and he said, in an obtuse way, that it's a modification, so it will invalidate the warranty. But I guess they won't know if you put the stock cooler back on.


If you fight them on this in the US you will eventually win, but it may take time, effort, and money.

Voiding the warranty over things like this is illegal in the US under the Magnuson-Moss Warranty Act.


----------



## dante`afk

Quote:


> Originally Posted by *Dr Mad*
> 
> Hello folks ^^
> 
> I installed EK waterblock and did the shunt mod with Grizzly Conductonaut only to the first resistor (near power connectors).
> I surrounded the resistor with electrical tape and applied a thin layer.
> Application is not very easy with the "cotton bud" provided by Thermal Grizzly....
> 
> With aircooling, the card oc'ed to 2050 (+500 memory) hits ~110/115% on Firestrike and obviously, it could not maintain 2050 due to temperature throttle.
> Pushing voltage slider to the max resulted in 1.093v but this is useless since the card throttles almost instantly due to Power Limit.
> 
> Once watercooled, Firestrike eats between 80 & 90% TDP and up to 100% with 1.093v and constant 2100 GPU frequency (+500 memory).
> 
> So shunt mod on only one resistor provides ~20/25% more power.
> 
> Concerning GPU temperature, it's ~8-9°C above water temp, as usual with EK blocks, I'd say.
> 
> 25°C ambient temp --> 34°C GPU temp after 30 min of intensive load (custom loop with 3 RX480v3 / 2xMCP35X and 5960X at 4500 + R5E)


You did not apply enough LM. Mine stays at about 40% power.


----------



## Neon01

Quote:


> Originally Posted by *mattlach*
> 
> But then I have to give up this:
> 
> 
> 
> Unless you are aware of any good 42"-44" 4k Gsync screens


Is that a 49incher? And the side monitors, are they the old 20" IPS Dells? I used to have two of those in portrait next to a Dell 3011 30". Now I've got a 55" monstrosity on my desk next to my RoG Swift, and I couldn't be happier. It's amazing how quickly you can get used to the size.


----------



## toncij

So you gave up on 144Hz for the diagonal size?


----------



## Dr Mad

Quote:


> Originally Posted by *dante`afk*
> 
> You did not apply enough LM. Mine stays at about 40% power.


If I'm right, you applied LM to all 3 resistors.

Just one for me, so a 25% gain in TDP headroom is not only normal but also enough, since my card doesn't even reach 100% with a max overclock in Witcher 3.
Because of the weakness of the VRM on this card, I was reluctant to apply LM to more than 2 resistors, and didn't even consider doing it on all 3.


----------



## DADDYDC650

Just set up my LG 34UM88 monitor. It's a 34" UltraWide. It's their newest flat-screen monitor and it looks great! I believe they fixed their backlight bleeding issues. Anyway, my Titan XP is beasting away at 3440x1440 max settings in DOOM and Rainbow Six Siege. Can't wait to see what BF1 looks like on this bad boy.









Pic of my panel. Brightness @25 percent/contrast @70.


----------



## mattlach

Quote:


> Originally Posted by *Neon01*
> 
> Is that a 49incher? And the side monitors, are they the old 20" IPS Dells? I used to have two of those in portrait next to a Dell 3011 30". Now I've got a 55" monstrosity on my desk next to my RoG Swift, and I couldn't be happier. It's amazing how quickly you can get used to the size.


It's a 48" js9000, and yep, the two sides are the 20" IPS dells. I too used to have a U3011 in the center!


----------



## mattlach

Quote:


> Originally Posted by *toncij*
> 
> So you gave up on 144Hz for the diagonal size?


I haven't had a monitor capable of above 60Hz since the CRT days. I used to run my 22" Iiyama Vision Master Pro at 100Hz, but it's been a while.

I know it's a statement many would disagree with, but I consider frame rates above 60fps to be in placebo territory.









I also don't see the point of 4k under 40". I don't buy all these extra pixels just so I have to scale everything up


----------



## chronicfx

Quote:


> Originally Posted by *mattlach*
> 
> I haven't had a monitor capable of above 60hz since the CRT days. I used to run my 22" IIyama Vision master Pro at 100hz but it's been a while.
> 
> I know it's a statement many would disagree with, but I consider frame rates above 60fps to be in placebo territory


Visually yes, but you will feel it in your mouse.


----------



## webmi

30% PT @ 2050 @ Water


----------



## Yuhfhrh

Quote:


> Originally Posted by *webmi*
> 
> 
> 
> 30% PT @ 2050 @ Water
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 


You sure loaded up on that CLU!


----------



## mattlach

Quote:


> Originally Posted by *chronicfx*
> 
> Visually yes, but you will feel it in your mouse.


That's the way I feel about going from 30fps to 60fps, but above 60fps I don't feel a difference.

Or at least I don't remember feeling a difference, but it's been 11+ years since I last played a game above 60fps.


----------



## mattlach

Quote:


> Originally Posted by *willmaltby*
> 
> Just noticed that the 1080 volt-modding guide at xdevs has been updated regarding the shunt mod:
> 
> https://xdevs.com/guide/pascal_oc/#step3
> 
> They now say NOT to shunt the 5 mΩ resistors directly, but to add 10 Ω resistors to a corresponding set of 3 capacitors located nearby:
> 
> 
> 
> I presume this applies directly to the Titan X?


Possibly, but there is no way in hell I'm taking a soldering iron to my Titan.

Manual PCB soldering is extremely difficult even for professionals. Huge risks of doing permanent damage here, and after this there definitely won't be any warranty.


----------



## Luke212

There are about 10 reviews out but no one is testing the compute performance, which is THE PRIMARY USE OF THIS CARD! What is wrong with reviewers these days? Only AnandTech tests for it, and they weren't given a Titan X Pascal. Shame on all the others. Give the card to a real review site.


----------



## toncij

Quote:


> Originally Posted by *mattlach*
> 
> I haven't had a monitor capable of above 60hz since the CRT days. I used to run my 22" IIyama Vision master Pro at 100hz but it's been a while.
> 
> I know it's a statement many would disagree with, but I consider frame rates above 60fps to be in placebo territory
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I also don't see the point of 4k under 40". I don't buy all these extra pixels just so I have to scale everything up


Not trying to start an off-topic flame war, but you're wrong. Give it a try. Above 60Hz will give you smoother animations on the desktop, will be easier on your eyes if you look at it for hours, and in games there certainly is a huge difference. I'm not a teenager with 20/20 vision but double that age, and I see a difference like night and day. 100 to 120 or 120 to 144 and above might be less of a difference, but going to just 100Hz is huge in my experience (better than my old CRTs ever were at 85Hz).

Regarding pixel count, yes, more pixels on 40" is nice; it actually keeps the quality in check when going big. But the most important advantage I have with my 5K display is fantastic image quality. You can see the difference at first sight. Try SW: BF at 5K and you'll be amazed. Also, the desktop scaled to 200% is so clean, especially text, which is fantastic to read even at small font sizes.
Quote:


> Originally Posted by *Luke212*
> 
> theres about 10 reviews out but noone testing the compute performance, which is THE PRIMARY USE OF THIS CARD! what is wrong with reviewers these days? only Anandtech tests for it, and they werent given a Titan X Pascal. shame on all the others. give the card to a real review site.


I use Titans for compute (my primary use and my job). You get pretty much the full theoretical advantage considering ALU throughput, and if you need memory bandwidth, ~500GB/s is beautiful compared to the old ~300GB/s. It's especially visible when you overclock the monster to 2GHz core and 5.5GHz VRAM.
Compared to the old Titan, the TXP is, clock for clock, 60-70% faster. That's huge. 6 TXPs will give you the computing performance you previously needed 9 TXMs for. That's 3 "cards" more for only about $600 more than the TXMs would have cost.
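For what it's worth, the card-count arithmetic above checks out in a couple of lines. The per-card factors below are just the rough figures quoted in this post (an implied ~1.5x for "6 TXPs ~ 9 TXMs", and the 60-70% clock-for-clock claim), not measured numbers:

```python
def txm_equivalents(txp_count, per_card_speedup):
    """How many Titan X (Maxwell) cards a stack of TXPs would replace,
    given an assumed per-card speedup factor."""
    return txp_count * per_card_speedup

# An implied ~1.5x per card already gives 9 TXM-equivalents from 6 TXPs;
# the quoted 60-70% clock-for-clock speedup would put it even higher.
print(txm_equivalents(6, 1.5))  # 9.0
print(txm_equivalents(6, 1.6))
```

So the "9 TXMs" figure is the conservative end of the quoted speedup range.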


----------



## mattlach

Quote:


> Originally Posted by *Luke212*
> 
> theres about 10 reviews out but noone testing the compute performance, which is THE PRIMARY USE OF THIS CARD! what is wrong with reviewers these days? only Anandtech tests for it, and they werent given a Titan X Pascal. shame on all the others. give the card to a real review site.


Most (all?) of the sites I have seen reviews on are oriented towards games. It makes sense they would focus on the topics their readers are interested in.

If there are compute-oriented review sites, that should be their priority.

I agree that seeing compute results would be nice, but it really doesn't matter to most of their readers. What matters more is getting the review up as quickly as they can, which means skipping some of the extras.


----------



## figgie

Hi folks,

Long-time computer guy; I've been over at the [H] since the early 2000s (since their 1st reset).

I just received my 2 x Titan XP. Waiting on my EK water blocks and other goodies from the EU (EK terminal blocks and a ton of Aqua Computer stuff, including a brand new 480mm radiator).

I currently have a liquid-cooled setup on a 5960X: 64GB Corsair Dominator Platinum 2666, a 512GB M.2 Samsung 950 Pro, and an ASUS ARES III card. The monitor I was running was the original LG 34UC97.


----------



## profundido

By the way,

starting a totally different topic here: Did anyone choose to reuse the original backplates when putting on the EK blocks instead of ordering the EK backplates ?

I reused mine on both cards:


----------



## bl4ckdot

Quote:


> Originally Posted by *profundido*
> 
> By the way,
> 
> starting a totally different topic here: Did anyone choose to reuse the original backplates when putting on the EK blocks instead of ordering the EK backplates ?
> 
> I reused mine on both cards:


Is it easy? What did you have to use to make this work?


----------



## profundido

Quote:


> Originally Posted by *bl4ckdot*
> 
> Is it easy? What did you have to use to make this work?


It's easy.

I used my Dremel 4200 to widen some of the existing holes I decided to use, with the default drill bit that comes with the device (it happens to be the right size).

Use the small new screws that come with the EK block, with a washer directly on the PCB, for 1-3 screws (typically the inner ones).

Use the large screws that come with the EK block to attach the backplate. So: washers on all holes on the PCB, then the plate on top, then bolts through the whole thing for whatever holes you decide to use.

I marked the modified holes (the ones with larger screws) with green arrows for your convenience.


----------



## Fiercy

I don't think it's a good idea, because one day when a new Titan comes out you'll need to sell this one, and people would notice what you did with the backplate.


----------



## eliau81

So I decided to try the shunt mod,
but I don't have any liquid metal.
Will pencil on the 5 mΩ resistors work?
How many of the 5 mΩ resistors do I need to shunt? There are 3.


----------



## Slushpup

I purchased my Titan X (Pascal) today guys! Ready to get back to one GPU!


----------



## Neon01

Quote:


> Originally Posted by *Fiercy*
> 
> I don't think it's a good idea, because one day when a new Titan comes out you'll need to sell this one, and people would notice what you did with the backplate.


I do think it looks really slick, but they are plastic, so if anything I think they insulate the board more than cool it. It probably adds some rigidity, but the EK block isn't tremendously heavy (not like the Aqua Computer blocks I had on my 980... whew!).

Sent from my XT1575 using Tapatalk


----------



## mattlach

Quote:


> Originally Posted by *toncij*
> 
> Not trying to start an off-topic flame war, but you're wrong. Give it a try. Above 60Hz will give you smoother animations in desktop, will be easier on your eyes if you look at it for hours, and in games there certainly is a huge difference. I'm not a teenager with 20/20, but double that age and I see a difference like night and day. 100 to 120 or 120 to 144 and above might be less of a difference, but going only 100Hz is huge from my experience (better than my old CRTs ever were (85Hz).
> 
> Regarding pixel count, yes, more pixels on 40" is nice, it actually keeps the quality in check when going big. But, the most important advantage I have with my 5K display is a fantastic image quality. You can see the difference at first sight. Try SW: BF on 5K and you'll be amazed. Also, desktop scaled 200% is so clean, mostly text which is fantastic to read even at small font sizes.
> I use Titans for compute (my primary use and my job). It's pretty much almost full theoretical advantage considering ALU and if you need memory bandwidth, 500ish is beautiful compared to old 300ish. Especially visible when you overclock the monster to 2GHz and 5.5GHz (VRAM).
> Compared to the old Titan, TitanXP is, clock for clock, faster 60-70%. That's huge. 6 TXPs will give you the computing performance you could get with 9 TXMs before. Thats 3 "cards" more for the cost of $600 more compared to TXM and their price.


Well, as I said, I knew people would disagree.

IMHO, much like with the audiophile community, I believe that you truly believe you are seeing a difference, but that it is all placebo effect. Human brains are silly things and cannot be trusted, not even your own.









Personally, in modern times on LCD panels I can't visually tell the difference above 30fps. 30fps "feels" off when I move the mouse, though, and it keeps feeling off up until about 60fps, above which I can't tell a difference at all. I couldn't when I was young and nimble 15 years ago either, when I had a 100Hz CRT.









I absolutely don't believe in scaling. The #1 reason I get larger monitors is the extra screen real estate. Scaling the display completely counteracts that benefit. IMHO, nothing above ~100-120 DPI makes sense on the desktop. For higher resolution, a larger screen is the way to go.

IMHO, my 48" 4K screen is too big. It only works out to ~92 DPI, which is a little too low for me. At 42" to 44", 4K would be perfect though.
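The ~92 DPI figure is easy to verify with the standard diagonal-pixels-over-diagonal-inches calculation (panel sizes below are nominal diagonals):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a display, from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 48), 1))  # 91.8 for a 48" 4K panel
print(round(ppi(3840, 2160, 43), 1))  # 102.5 for a 43" 4K panel
```

So a 42-44" 4K panel does land right in the ~100-105 DPI band mentioned above.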


----------



## DNMock

Quote:


> Originally Posted by *mattlach*
> 
> That's the way I feel about going from 30fps to 60fps, but above 60fps I don't feel a difference.
> 
> Or at least I don't remember feeling a difference, but it's been 11+ years since I last played a game above 60fps.


I can never tell a difference going to a higher refresh rate or resolution. Then, when I go back to what I was previously using it looks like utter trash. Standard definition TV just looks like a fuzzy blob and gives me a headache now, but when 1080p first came out, I couldn't tell a difference between the two.


----------



## markklok

Pfff, CLU is not my favorite stuff to apply.

Since my card hangs vertically, I took some safety precautions for when the stuff decides to run.

I made 3 plastic squares which I put around the resistors.




Then, with slightly sweaty hands, I applied the godly stuff.

I must have lost at least 5 pounds of weight, but it is working.

Tomorrow I'm going to install my EK block and will see how the CLU is *hanging*, and maybe put a little roof over the resistors.

*update*
After installing the EK block, it's nice that you can clearly see the resistors and thus check up on them.


----------



## eliau81

Quote:


> Originally Posted by *eliau81*
> 
> So I decided to try the shunt mod,
> but I don't have any liquid metal.
> Will pencil on the 5 mΩ resistors work?
> How many of the 5 mΩ resistors do I need to shunt? There are 3.


Please help, guys.
I'm going to use an EVGA 980 AIO Hybrid.


----------



## mattlach

Quote:


> Originally Posted by *eliau81*
> 
> So I decided to try the shunt mod,
> but I don't have any liquid metal.
> Will pencil on the 5 mΩ resistors work?
> How many of the 5 mΩ resistors do I need to shunt? There are 3.


Nope. Pencil will have too high a resistance.

Even a conductive silver pen will have too high a resistance to work.

Soldering over it or running copper wire may not work either.

The problem is you need to hit a very specific resistance range. It has to be less than the tiny shunt resistor that's on there, but not zero, because then the card goes into error mode and protects itself by turning the power way down instead of up.

I don't know how someone figured out that CLU had just the right resistance, but that's really the only solution I've heard when asking around.

You can ask more in the thread for it.
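The "specific resistance range" logic is just parallel-resistor math. This is a back-of-the-envelope model only (assuming a nominal 5 mΩ current-sense shunt), not measurements of an actual card:

```python
def parallel(r1, r2):
    """Effective resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

SHUNT = 0.005  # assumed nominal 5 milliohm current-sense shunt

# A conductive bridge across the shunt lowers the resistance the sense
# circuit sees, so the card under-reads power by the factor r_eff/SHUNT,
# which effectively raises the power limit by the inverse of that factor.
for added in (0.015, 0.005, 0.001):
    r_eff = parallel(SHUNT, added)
    reported = r_eff / SHUNT     # fraction of true power the card reads
    headroom = 1.0 / reported    # effective power-limit multiplier
    print(f"{added * 1000:5.1f} mOhm bridge -> reads {reported:.0%}, "
          f"~{headroom:.2f}x power limit")
```

A very low-resistance bridge pushes the reading toward zero, which is where the card's protection kicks in; hence the need for a "less than the shunt, but not zero" range.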


----------



## Z0eff

Quote:


> Originally Posted by *mattlach*
> 
> Well, as I said, I knew people would disagree.
> 
> IMHO, much like the audiophile community, I believe that you truly believe you are seeing a difference, but that it is all placebo effect. Human brains are silly things and can not be trusted, not even your own
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Personally, in modern times on LCD panels I cant visually tell the difference of anything above 30fps. 30fps "feels" off when I move the mouse though, and it keeps feeling off up until about 60fps, above which I can't tell a difference at all, and I couldn't when I was young and nimble either 15 years ago, when I had a 100hz CRT
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I absolutely don't believe in scaling. The #1 reason I get larger monitors is for the extra screen real estate. Scaling the display completely counteracts that benefit. IMHO, nothing above ~100-120DPI makes sense on the desktop. For higher resolution a larger screen is the way to go.
> 
> IMHO, my 48" 4k screen is too big. It only results in ~92DPI which is a little bit too low to me. At 42" to 44" 4k would be perfect though.


What about examples like this? http://30vs60.com/bf4running.php

If you want to be more rigorous, don't read the text above the 2 running videos so you won't know which is 30 and which is 60.


----------



## CRITTY

Quote:


> Originally Posted by *HyperMatrix*
> 
> Someone try out these settings in Deus Ex. It's the best I could find that looked good while still maintaining an average fps of at least 100. Although, admittedly, some areas I drop to 90fps because the latest drivers broke overclocking on my cards and I have to run them under 2GHz. For the rest of you, that shouldn't be a problem. In buildings/etc it'll go up to 120-144fps. GPU usage generally sits at around 85% across both cards. Meaning with water blocks and a proper OC, and DX12, we can probably look forward to another 25-30% FPS compared to now. But it's honestly playable right now.
> 
> This is for 1440p, btw.
> 
> Couple things to note: screen space reflections. They're nice, but they're responsible for a solid 10% reduction in FPS, even in the normal "on" mode. Ultra is even worse. "Very High" shadows are also less taxing than "High" shadows. Noooo idea why.
> 
> edit: One mistake I made. You should keep shadows on Medium because Very High seems to introduce some hitching during many scene transitions. The screenshots below are all with Medium shadows.


I just can't pull the trigger on a game that I can't play maxed out at 4K 60fps with 2 Titan XPs. It's not like this game has the best graphics of any game to date; there's no reason for it to run so "poorly".


----------



## DNMock

Quote:


> Originally Posted by *CRITTY*
> 
> I just can't pull the trigger on a game that I can't play maxed out at 4k 60fps with 2 Titan XP's. It's not like this game has the best graphics of any game to date; no reason for it to run so "poorly".


Even with Disabled AA and disabled Trashworks stuff like god hairs or whatever?


----------



## Fiercy

Quote:


> Originally Posted by *CRITTY*
> 
> I just can't pull the trigger on a game that I can't play maxed out at 4k 60fps with 2 Titan XP's. It's not like this game has the best graphics of any game to date; no reason for it to run so "poorly".


Sometimes I wonder why people actually buy these Titans if they have reasons like that not to play an awesome game.


----------



## CRITTY

Quote:


> Originally Posted by *Fiercy*
> 
> Sometimes I wonder why people actually buy these Titans if they have reasons like that not to play an awesome game.


Awesome game? Must be one of those "facts" they throw around here. I have plenty of awesome games in my backlog to play and refuse to support that game in its current state. I don't have time for that or your drivel.


----------



## CRITTY

Quote:


> Originally Posted by *DNMock*
> 
> Even with Disabled AA and disabled Trashworks stuff like god hairs or whatever?


That was with AA off. There was an update so that may have changed, but I have not checked it out as of today.


----------



## DNMock

Quote:


> Originally Posted by *Fiercy*
> 
> Sometimes I wonder why people actually buy these Titans if they have reasons like that not to play an awesome game.


Do you buy a fridge to store rotten, moldy food, or fresh food?

And yeah, any title (sans heavy mods or .ini files cranked up to 11) that can't run at 60fps @ 4K on SLI TXPs (assuming it supports SLI), even with AA disabled, is moldy and rotten.


----------



## HyperMatrix

I still think people are making the jump to 4K prematurely. 144Hz+ 1440p is where it's at right now, as that is all you can expect to drive with current-generation technology if you like maxing out your games. Higher pixel density shows additional detail, but lower refresh rates make that detail hard to see when there's motion of any kind.

Too many people don't understand how different a game feels when you're playing with 165Hz GSYNC. Everything is just so buttery smooth and responsive that you feel a connection to the game world.


----------



## mouacyk

Quote:


> Originally Posted by *HyperMatrix*
> 
> I still think people are prematurely making the jump to 4K. 144Hz+ 1440p is where it's at right now as that is all you can expect to get with current generation technology if you like maxing out your games. Higher pixel density shows additional detail. But lower refresh rates make seeing that detail hard to see when there's motion of any kind.
> 
> Too many people don't understand how different a game feels when you're playing with 165Hz GSYNC. Everything is just so buttery smooth and responsive that you feel a connection to the game world.


Too many people don't know that G-Sync is capped at 144Hz. Going over that actually defeats the purpose of G-sync in the first place.


----------



## HyperMatrix

Quote:


> Originally Posted by *mouacyk*
> 
> Too many people don't know that G-Sync is capped at 144Hz. Going over that actually defeats the purpose of G-sync in the first place.


Too many people don't know that GSYNC on a 165Hz monitor is capped at 165Hz. And that's why you set an fps cap 5fps below the GSYNC cap so it's always active.


----------



## mouacyk

Quote:


> Originally Posted by *HyperMatrix*
> 
> Too many people don't know that GSYNC on a 165Hz monitor is capped at 165Hz. And that's why you set an fps cap 5fps below the GSYNC cap so it's always active.


Proof plz. TFTCentral seems to be outdated. 165Hz isn't official either; it's an overclock mode.


----------



## DNMock

Quote:


> Originally Posted by *HyperMatrix*
> 
> I still think people are prematurely making the jump to 4K. 144Hz+ 1440p is where it's at right now as that is all you can expect to get with current generation technology if you like maxing out your games. Higher pixel density shows additional detail. But lower refresh rates make seeing that detail hard to see when there's motion of any kind.
> 
> Too many people don't understand how different a game feels when you're playing with 165Hz GSYNC. Everything is just so buttery smooth and responsive that you feel a connection to the game world.


In my opinion it depends on the games played, really. If it's games like Skyrim, The Witcher, Fallout, Dragon Age, etc., then 4K 60fps is just fine, since there isn't a lot of fast movement to disturb things. If it's racing or FPS-type games with a lot of quick movements, then I totally agree: the added fps helps a lot more than the higher pixel density.

The 3440x1440 100Hz monitors are a nice halfway point between the two, I think.
Quote:


> Originally Posted by *HyperMatrix*
> 
> Too many people don't know that GSYNC on a 165Hz monitor is capped at 165Hz. And that's why you set an fps cap 5fps below the GSYNC cap so it's always active.


If you are running a fairly steady 165fps, what's the point of GSYNC?

edit: as in, who cares if GSync quits working when you are above 160fps, as long as it kicks back in when you drop below it.


----------



## mouacyk

Quote:


> Originally Posted by *DNMock*
> 
> In my opinion it depends on the games played really, if it's games like Skyrim, the witcher, fallout, dragonage, etc. etc. then 4k 60 fps is just fine since there isn't a lot of fast movement to disturb things. If it's racing or fps type games with a lot of quick movements, then I totally agree, the added fps really helps a lot more than the higher pixel density.
> 
> The 3440x1440 100hz monitors are a nice half way meeting point between the two I think.


Bingo. I love the extra FOV also. Too bad LightBoost is not available; 100Hz LightBoost would be acceptable.


----------



## Neon01

Quote:


> Originally Posted by *mouacyk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HyperMatrix*
> 
> I still think people are prematurely making the jump to 4K. 144Hz+ 1440p is where it's at right now as that is all you can expect to get with current generation technology if you like maxing out your games. Higher pixel density shows additional detail. But lower refresh rates make seeing that detail hard to see when there's motion of any kind.
> 
> Too many people don't understand how different a game feels when you're playing with 165Hz GSYNC. Everything is just so buttery smooth and responsive that you feel a connection to the game world.
> 
> 
> 
> Too many people don't know that G-Sync is capped at 144Hz. Going over that actually defeats the purpose of G-sync in the first place.

It's only premature if you need 100+ fps. Personally, I think demanding that everything run at >144Hz speeds is ridiculous... and I have a RoG Swift. But then my refresh-rate needs aren't what others' are, so I won't judge.

Seriously, I get 60 fps on just about every game I play with max or very-close-to max settings, and I've pretty much transitioned to playing everything at 4k. My RoG Swift is basically just a supplemental screen at this point.

Sent from my XT1575 using Tapatalk


----------



## dallas1990

I'm building a new computer for VR and 4K gaming. Well, I shouldn't say 4K, more like 2560x1440 with a little supersampling. But I'm looking at either a Titan X (Pascal) or a GTX 1080. Price isn't an issue and I'm just running a single card.


----------



## HyperMatrix

Quote:


> Originally Posted by *DNMock*
> 
> If you are running a fairly steady 165 fps, what point is there to GSYNC?
> 
> edit, as in who cares if Gsync quits working when you are above 160 fps as long as it kicks back in when you drop below it.


It helps when the fps fluctuates. If I had 160fps guaranteed/locked in games all the time, I'd be a happy camper, but that's not possible with 2 Titans, unfortunately. GSYNC also reduces input lag: vsync-off at 165Hz/165fps still has higher input lag than 165Hz GSYNC with a 160fps cap, since GSync matches every frame generated to a screen refresh.
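The cap-below-refresh advice is really just frame-interval arithmetic; here's a minimal sketch using the 165Hz/160fps numbers from this exchange:

```python
def frame_time_ms(fps):
    """Frame interval in milliseconds at a given frame rate."""
    return 1000.0 / fps

# Capping a few fps below the 165Hz ceiling keeps each frame's interval
# longer than the panel's minimum refresh interval, so every frame gets
# its own refresh and G-Sync stays engaged instead of hitting the cap.
print(round(frame_time_ms(165), 3))  # 6.061 ms: panel's minimum interval
print(round(frame_time_ms(160), 3))  # 6.25 ms: capped frame interval
```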

Quote:


> Originally Posted by *mouacyk*
> 
> Proof plz. Tftcentral seems to be outdated. 165hz isn't official either. It's in overclock mode.


Looks like we have another contrarian in our midst. Mentioning TFTCentral, then ignoring that TFTCentral clearly lists:

"Refresh Rate
144Hz native
Up to 165Hz max overclocked
G-sync range 30 - 165Hz"


----------



## HyperMatrix

Quote:


> Originally Posted by *Neon01*
> 
> It's only premature if you need 100± fps. Personally, I think the demand for passing everything at >144hz speeds is ridiculous... And I have a RoG Swift. But then my refresh rate needs aren't what others' are, so I won't judge.
> 
> Seriously, I get 60 fps on just about every game I play with max or very-close-to max settings, and I've pretty much transitioned to playing everything at 4k. My RoG Swift is basically just a supplemental screen at this point.
> 
> Sent from my XT1575 using Tapatalk


The only way I could imagine doing 60Hz is if it were on plasma or OLED; LCD at 60Hz is just painful to watch. It's like when I try watching a hockey game at a bar or at a friend's place on their LCD. It's just painful, unless they're using interpolated 120/240Hz simulation, which is really just a hack fix. Also, a lower refresh rate is fine for some games. I could play Witcher 3 at 80fps and be a happy camper; not sure I'd be able to do 60. But for me to admit to being able to play anything at 80fps is saying a lot. The higher refresh rate matters more for games where input lag/responsiveness is important. In an FPS game, for example, a higher refresh rate will actually increase your KDR. The more physical distance pixels have to move on screen, the more you will see a benefit from high refresh rates. But if you're playing Diablo 3, where it's top-down and there's not a lot of pixel movement (unlike doing a 180 in an FPS), then you can generally get by. So a lot of it depends on the type of games you play.


----------



## Jquala

I have these cards in SLI, but I've run into a weird issue. The top card is stuck at 1.062V and the bottom card is stuck at 1.5V no matter how I tweak the power slider. I can't seem to get my cards to clock higher in AB.


----------



## Vellinious

I've never been a big fan of MSI AB...what's everyone using for an oc tool?

Also...any news on the pascal bios tweaker? These power limits are killin me.....

OH...Can I join your little club? lol


----------



## DooRules

AB seems to be the most popular. Precision will work, albeit without the KBoost function. You would need the version of Precision that was in use before the 10-series cards; the very latest version will not see the GPU.


----------



## Yuhfhrh

Quote:


> Originally Posted by *Vellinious*
> 
> I've never been a big fan of MSI AB...what's everyone using for an oc tool?
> 
> Also...any news on the pascal bios tweaker? These power limits are killin me.....
> 
> OH...Can I join your little club? lol


AB is popular, it's the only one letting me use the voltage slider right now. We may never be able to flash a non-signed bios to the card.


----------



## Vellinious

Quote:


> Originally Posted by *Yuhfhrh*
> 
> AB is popular, it's the only one letting me use the voltage slider right now. We may never be able to flash a non-signed bios to the card.


How did you get it to unlock the voltage slider? And...is it actually doing anything?


----------



## mbze430

Need to put out a bounty for a Bios mod.......


----------



## Yuhfhrh

Quote:


> Originally Posted by *Vellinious*
> 
> How did you get it to unlock the voltage slider? And...is it actually doing anything?


I wish somebody would put this in the OP. It lets you use up to ~1.08-1.09V.
Quote:


> Originally Posted by *Yuhfhrh*
> 
> For afterburner, open up the file under MSI/Profiles starting with 10DE&DEV... with notepad (admin rights) and add this in:
> 
> [Settings]
> VDDC_Generic_Detection = 1
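For reference, the resulting section in the profile file would look something like this (the exact profile filename varies per card, and the install path below is the typical Afterburner location, not something stated in the post — treat both as assumptions):

```ini
; MSI Afterburner hardware profile, i.e. the file under
; <Afterburner install dir>\Profiles\ whose name starts with 10DE&DEV...
; Adding this section enables generic voltage-control detection,
; which unlocks the core voltage slider on these cards (~1.08-1.09V max).
[Settings]
VDDC_Generic_Detection = 1
```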


----------



## tonnytech

Quote:


> Originally Posted by *profundido*
> 
> It's easy,
> 
> I used my dremel 4200 to widen some of the existing holes I decided to use using the default drill bit that comes with the device (happens to be the right size).
> 
> Use the small new screws that come with the EK block with a washer directly on the PCB for 1-3 screws (typically inner ones)
> 
> Use the large screws that come with the EK block to attach the backplate. So washers on all holes on the PCB, then plate on it, then bolts into the whole thing for whatever holes you decide to use
> 
> I marked the modified holes (with larger screws on) with green arrows for your convenience.


That's a lot of work getting the Dremel out, when you can just put the original Nvidia hex bolts into the EK block along with the smaller screws.




----------



## Vellinious

Quote:


> Originally Posted by *Yuhfhrh*
> 
> I wish somebody would put this in the OP. It lets you use up to ~1.08-1.09V.


Hmm...pretty much as I expected. Just makes it slam against the power limit even harder. lol

Thanks for the info, though. Will come in handy....someday


----------



## eliau81

Ohhh god, what have I done????!!!!!
Installed the EVGA 980 Hybrid and the card doesn't work!!!! Oh god.
I'm uploading some photos. Only the pump works; the GTX LED logo doesn't light up.
Please help, somebody


----------



## Yuhfhrh

Quote:


> Originally Posted by *eliau81*
> 
> Ohhh goddd what did i done????!!!!!
> Instaled the hybrin evga 980 and the card dosnt work!!!! Oh god
> Im uploadin some photo only the pump works the gtx led logo dosent work
> Please help somebody


Ouch. Did you cut traces by screwing it in too tight?


----------



## jcde7ago

Argh....the wait for EK waterblocks/backplates is gonna be excruciating from here on out.


----------



## eliau81

Quote:


> Originally Posted by *Yuhfhrh*
> 
> Owch. Did you cut traces by screwing in too tight?


Umm, I don't know.
The PCB may be damaged.


----------



## CallsignVega

Quote:


> Originally Posted by *HyperMatrix*
> 
> Too many people don't know that GSYNC on a 165Hz monitor is capped at 165Hz. And that's why you set an fps cap 5fps below the GSYNC cap so it's always active.


You only need to set the FPS cap one FPS below the max refresh rate.


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> You only need to set the FPS cap one FPS below the max refresh rate.


If I remember correctly, when the fps cap was too close to the GSYNC limit it would cause GSYNC to toggle on/off repeatedly, hurting input times. I think Mark from Blur Busters did the test.


----------



## CallsignVega

Quote:


> Originally Posted by *HyperMatrix*
> 
> If I remember correctly when the fps cap was too close to the GSYNC limit it would cause it to go on/off repeatedly, hurting input times. I think mark from blur busters did the test.


That is quite old info right from when G-Sync came out. It doesn't have that polling problem anymore. Give 164 Hz a try and see if you notice any unusual behavior.


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> That is quite old info right from when G-Sync came out. It doesn't have that polling problem anymore. Give 164 Hz a try and see if you notice any unusual behavior.


Appreciate the update. Will check it out later tonight.


----------



## 5150 Joker

Quote:


> Originally Posted by *eliau81*
> 
> Ohhh goddd what did i done????!!!!!
> Instaled the hybrin evga 980 and the card dosnt work!!!! Oh god
> Im uploadin some photo only the pump works the gtx led logo dosent work
> Please help somebody


RIP Titan X. I hope you don't try to claim warranty for something you did. Should've left it with stock cooling, it's not like you gain any worthwhile performance with water or these mods some are doing.


----------



## eliau81

Quote:


> Originally Posted by *5150 Joker*
> 
> RIP Titan X. I hope you don't try to claim warranty for something you did. Should've left it with stock cooling, it's not like you gain any worthwhile performance with water or these mods some are doing.


Stock was very noisy, and temps went up to 89°C, so I had to.
Still gonna try to fix it; it's probably a trace that I've damaged.


----------



## HyperMatrix

Speaking of warranty....I was on the phone with Nvidia today regarding one of my cards which is acting up under the new drivers. The rep tried telling me that overclocking voids my warranty and that I should only run my card at stock settings. At this point I yelled a bunch of expletives and told him to transfer me to a supervisor. I've never been under the impression that video card overclocking would void warranty since it's blocked in the driver by thermal and power limits. Unless, of course, you're using a custom bios. So this did tick me off.


----------



## skypine27

Eliau81:

I'm sorry to say I'd have to kind of side with Joker on this one. If you aren't going to go full custom loop and get respectable, tested full blocks from EK or aqua-computer.de, then stick with the stock coolers. Taking an AIO off a different card and going that route doesn't seem worth the risk.

I really hope you get your money back and nvidia doesn't read this thread !

Side comment:

So I was running more Fire Strike Ultra benches, and this time had the power limit displayed in the Precision X OSD.

I have the power slider set to its max of 120%, with +175MHz on the GPU clock and +500MHz on the memory.

I was watching clock speeds and power usage (the temps are fine to me under the EK blocks) and noticed the cards seemed to max out at 2000, and then one card or the other would smack into the 120% limit (briefly I'd see 121%), and then one or both would rapidly drop down to 1975, 1987, 1974, 1962, etc. The highest I would see was 2000, and it wasn't sustained very long before the power limit hit. I ran FSU back to back in this config and got:
Run 1: 13817
Run 2: 13749

I really hope someone can crack the BIOS and let us crank that slider to 125% or 130% and not have to deal with strange PCB mods !!


----------



## Glerox

I created an account just to write this comment. I hope I can help you eliau81.
I've been following this thread a lot!

I had two GTX 1080 EVGA FE cards in SLI and I just switched to one Titan X Pascal, because I noticed stuttering on my 4K television with SLI (I also have a 165Hz G-Sync monitor). And one card is just way simpler, IMO.

I installed AIO watercooling with Kraken G10s on my GTX 1080s. The first time, I damaged the card (I don't know why, because I was really careful) and it wasn't even detected in my system. I had to RMA it, and the reason I chose EVGA is because they have the best warranty. I had replaced the original cooler, and without a single question they gave me a brand new GTX 1080 FE without any fee (for this reason I will always buy EVGA when I can).

Now, the Titan X Pascal is from Nvidia directly, and they don't seem to have a really good warranty... but it's not our fault if a $1800 CAD GPU has a cooler that SUCKS REALLY BAD. So it's totally legitimate to change the cooler...

I guess your best luck is to put the original cooler back on as cleanly as you can and tell them the product was dead on arrival... it was possibly already damaged, and just changing the cooler killed the card... who knows!

I hope you will get a replacement...

I just received my EK custom loop kit today. Will be installing my first custom loop this weekend!
I'm a bit stressed now that I see someone has broken his card...


----------



## Phoenix81

Quote:


> Originally Posted by *HyperMatrix*
> 
> Speaking of warranty....I was on the phone with Nvidia today regarding one of my cards which is acting up under the new drivers. The rep tried telling me that overclocking voids my warranty and that I should only run my card at stock settings. At this point I yelled a bunch of expletives and told him to transfer me to a supervisor. I've never been under the impression that video card overclocking would void warranty since it's blocked in the driver by thermal and power limits. Unless, of course, you're using a custom bios. So this did tick me off.


As long as they don't have any proof that you have overclocked your card (which they don't), you don't have to worry about that, I guess.


----------



## pompss

Just finished installing the Titan X with the waterblock.

It's pretty late, 2:00 AM here. Will try to make a quick run to see how it goes.


----------



## Gary2015

Quote:


> Originally Posted by *eliau81*
> 
> Ohhh goddd what did i done????!!!!!
> Instaled the hybrin evga 980 and the card dosnt work!!!! Oh god
> Im uploadin some photo only the pump works the gtx led logo dosent work
> Please help somebody


Pcb cracked


----------



## tonnytech

Quote:


> Originally Posted by *eliau81*
> 
> Ohhh goddd what did i done????!!!!!
> Instaled the hybrin evga 980 and the card dosnt work!!!! Oh god
> Im uploadin some photo only the pump works the gtx led logo dosent work
> Please help somebody


What tool did you use to unscrew the hex bolts? I can see lots of worn circles on the PCB around the screw holes. It seems like you used a little too much force, and it would be noticeable should you return it and they disassemble it to inspect the PCB.

The worn circles look like you went in with a socket that is too big and pushed down too hard as you unscrewed the hex bolts, damaging various parts on the PCB.


----------



## skypine27

Quote:


> Originally Posted by *tonnytech*
> 
> what tool did you use to unscrew hex bolts can see lots of worn circles on the pcb around screw holes , seems like used a little to much force and would be noticable should you return it and they dissasemble to inspect pcb.


Funny you mention the hex bolts.

I put EK full blocks on my XPs. EK does NOT include a 4.5mm socket or wrench. In their PDF instructions they say to use a 4.5mm socket (which isn't included) or use pliers but be careful of damaging the area around each hex screw!

I had a 4.5mm tiny wrench with an inclined angle and it was fine. But to remove all those with a pair of pliers might have been lame, especially on 2 X cards.

The good news is you don't use those hex screws when reassembling the card with the EK blocks.


----------



## tonnytech

Quote:


> Originally Posted by *skypine27*
> 
> Funny you mention the hex bolts.
> 
> I put EK full blocks on my XPs. EK does NOT include a 4.5mm socket or wrench. In their PDF instructions they say to use a 4.5mm socket (which isn't included) or use pliers but be careful of damaging the area around each hex screw!
> 
> I had a 4.5mm tiny wrench with an inclined angle and it was fine. But to remove all those with a pair of pliers might have been lame, especially on 2 X cards.
> 
> The good news is you don't use those hex screws when reassembling the card with the EK blocks.


The hex bolts are actually 4mm; the instructions included with the block tell you to use 4mm. If you're using 4.5mm at an angle, you're setting yourself up for an accident on a very expensive card. The same goes for anyone using pliers, which was in the first batch of instructions from EK; those have since been updated to instruct users not to use pliers.

The best thing for anyone is to get down to the local hardware store, get the correct hex socket driver (i.e. 4mm), and use that.


----------



## inoran81

Joining this Titan X Pascal owners club here with my 2016 rig... need to wait for my AQC waterblocks and other WC stuff now... hope it will be ready to ship out ASAP...









First of caselabs SM8 with peds

Acer Predator XB321HK

Parts and peripherals

'The Band of extreme CPUs'


----------



## Gary2015

Quote:


> Originally Posted by *HyperMatrix*
> 
> Speaking of warranty....I was on the phone with Nvidia today regarding one of my cards which is acting up under the new drivers. The rep tried telling me that overclocking voids my warranty and that I should only run my card at stock settings. At this point I yelled a bunch of expletives and told him to transfer me to a supervisor. I've never been under the impression that video card overclocking would void warranty since it's blocked in the driver by thermal and power limits. Unless, of course, you're using a custom bios. So this did tick me off.


Changing the stock cooler voids the warranty .


----------



## Feklar

Quote:


> Originally Posted by *tonnytech*
> 
> 
> what tool did you use to unscrew hex bolts can see lots of worn circles on the pcb around screw holes , seems like used a little to much force and would be noticable should you return it and they dissasemble to inspect pcb.
> 
> The worn circles look like you gone in with a socket which is to big and pushed down to hard as you have unscrewed the hex bolts , damaging various parts on the pcb.


Looks like pliers were used and not carefully to be sure. Nvidia will never honor that card for warranty after they inspect it. Very expensive lesson.


----------



## Gary2015

Quote:


> Originally Posted by *inoran81*
> 
> Joining this Titan x pascal owner club here with my 2016 rig... need to wait for my aqc waterblocks and other wc stuffs now... hope it will be ready to ship out ASAP...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> First of caselabs SM8 with peds
> 
> Acer Predator XB321HK
> 
> Parts and peripherals
> 
> 'The Band of extreme CPUs'


The extended top on the sm8 looks ugly .


----------



## HyperMatrix

Quote:


> Originally Posted by *Gary2015*
> 
> Changing the stock cooler voids the warranty .


As far as I'm aware, that rule doesn't apply in the US and Canada. But that's beside the point. I'm still running everything at stock. No blocks, no CLU mod. Just want to make sure I don't have a bum card before getting into that.


----------



## skypine27

Quote:


> Originally Posted by *tonnytech*
> 
> The hex bolts are actually 4 mm , the instructions including with block tell you to use 4mm ... if your using 4.5 mm at a angle your setting yourself up for a accident on a very expensive card. Same as anyone using pliers which was on the first batch of instructions from ek, which has since been further updated instructing users not to use pliers.
> 
> Best thing for anyone is to get down local hardware store and get correct hex socket bolt / driver ie 4mm an use that.


I couldn't remember if the hexes were 4 or 4.5mm. Either way, using the little wrench I had, or using pliers, it was fine. But with two X cards you're talking close to 24 of them, so the potential to scratch something mission-critical goes up a bit with fatigue.


----------



## Gary2015

Quote:


> Originally Posted by *HyperMatrix*
> 
> As far as I'm aware that law doesn't apply to the US and Canada. But that's besides the point. I'm still running everything at stock. No blocks. No clu mod. Just want to make sure I don't have a bum card before getting into that.


Wise move. Since this is my first Nvidia-direct purchase, I'm not sure what they're going to pull if I have a problem. EVGA I can depend upon... Nvidia???


----------



## eliau81

Can Nvidia fix the card for me?
I will pay if I have to.


----------



## BURGER4life

Quote:


> Originally Posted by *eliau81*
> 
> Can nvidia fix the card for my?
> I will pay if i have to


They probably won't repair it. You're gonna have to buy a new one.


----------



## combat fighter

Quote:


> Originally Posted by *Gary2015*
> 
> The extended top on the sm8 looks ugly .


The SM8 looks ugly full stop.

Has the looks from something about 15 years ago.

Dated and boring.


----------



## profundido

Quote:


> Originally Posted by *tonnytech*
> 
> The hex bolts are actually 4 mm , the instructions including with block tell you to use 4mm ... if your using 4.5 mm at a angle your setting yourself up for a accident on a very expensive card. Same as anyone using pliers which was on the first batch of instructions from ek, which has since been further updated instructing users not to use pliers.
> 
> Best thing for anyone is to get down local hardware store and get correct hex socket bolt / driver ie 4mm an use that.


In that boat myself. I couldn't find a hex screwdriver in any of the DIY shops around me, or even in the only hobby store. I ultimately bought mini pliers and started the monk's job, telling myself "slow...slow...don't let that steady hand slip for one second."

And after going through the pain of doing this for two cards, I was so happy that EK put simple Phillips screws in. Not for the love of my life would I put those hex bolts back in!!

Yes, it's kinda scary:


----------



## eliau81

Fu€k my life.
Used a wrench on those hex bolts.


----------



## unreality

The Aquacomputer backplate arrived today, and I assembled it just now. While mounting it, the waterblock lifted off the card by a few millimeters. Do you think I trapped air in the thermal paste?

What delta between GPU and water temperature are you seeing while running Heaven? I'm sitting at a 15°C delta to water temp here at 2050/1350, running Heaven in a loop.


----------



## profundido

Quote:


> Originally Posted by *unreality*
> 
> Aquacomputer Backplate arrived today and i assembled it just now. While mounting it the watercooler lost contact to the card a few millimeters. Do you think i trapped air inside the thermal paste now?
> 
> What are your delta temperatures to water temperatures while running heaven? Im sitting at 15°C ∆ to water temp here @ 2050/1350 running heaven in loop.


Room at 26°C; both cards don't exceed 40°C in Fire Strike/Time Spy loops, mostly staying around 35-40°C.

+200/+500, resulting in a 2050MHz clock.


----------



## Gary2015

Quote:


> Originally Posted by *eliau81*
> 
> Can nvidia fix the card for my?
> I will pay if i have to


If you modified it, they won't.


----------



## toncij

Quote:


> Originally Posted by *mattlach*
> 
> Well, as I said, I knew people would disagree.
> 
> IMHO, much like the audiophile community, I believe that you truly believe you are seeing a difference, but that it is all placebo effect. Human brains are silly things and can not be trusted, not even your own
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Personally, in modern times on LCD panels I cant visually tell the difference of anything above 30fps. 30fps "feels" off when I move the mouse though, and it keeps feeling off up until about 60fps, above which I can't tell a difference at all, and I couldn't when I was young and nimble either 15 years ago, when I had a 100hz CRT
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I absolutely don't believe in scaling. The #1 reason I get larger monitors is for the extra screen real estate. Scaling the display completely counteracts that benefit. IMHO, nothing above ~100-120DPI makes sense on the desktop. For higher resolution a larger screen is the way to go.
> 
> IMHO, my 48" 4k screen is too big. It only results in ~92DPI which is a little bit too low to me. At 42" to 44" 4k would be perfect though.


I'm really going to limit this discussion as much as possible. There is a significant difference between the two things you mention. First, medically/biologically there are certain limits to human hearing that prevent people, no matter how trained or refined their ear, from hearing something. It's simply something that can be measured, just as human visual perception can be.

Now, human vision. People don't see in frames or pixels, but there are time windows within which people can see something, and there are dot sizes, relative to viewing distance, which humans can or cannot resolve. To cut this short: humans can see a brief frame even at refresh rates above 200Hz; 240Hz is considered the upper limit at which people will no longer perceive flickering, even subconsciously. Every normal human can see a difference between 60Hz refreshed motion and 1xxHz refreshed motion. Period. Unless you have a medical condition that would render you unable to see it, like some people have issues with 3D, VR, etc. As the refresh rate goes up from 60, the change required for you to perceive it gets larger. You need a much bigger change to notice it at high framerates than at low ones; e.g. you'll more easily see a change from 30 to 50 FPS than from 100 to 165 FPS.

Now, the resolution. Scaling is not something to believe in or not; like framerates and motion, it's science, not something tied to personal preference. There is a term, "retina", you may have heard, which is one of the terms related to human resolution perception; it says that at a certain distance and dot (pixel) density, you should not be able to see individual dots.
At a normal 40cm distance from a display, a resolution of 2560x1440 on a 27" panel still makes it possible to see individual pixels and jagged edges or lines. The point of high-density displays is not to run at 100% scaling and get more real estate, but to scale content up to make text and other elements clearer and closer to paper print (usually 300 DPI). A 5K display comes in at 217 PPI, still short of the 300 DPI of a printed magazine.
By increasing the panel density, making pixels and the dot pitch smaller, our perception of the content improves. We're capable of reading smaller text and noticing tinier details at the same distance on the same diagonal panel. With a 5K display at 27" you're able to see much more and finer detail than with a normal 2560x1440 display showing the same content, scaled up to 200% so that one content pixel is made up of 4 physical pixels.
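To put quick numbers on the density comparison, here is a back-of-the-envelope PPI check (the panel sizes are taken from the posts in this exchange: 27" for the 5K and 1440p examples, 48" for the 4K screen mentioned earlier):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# 27" 5K    -> ~217 PPI (the figure quoted above)
# 27" 1440p -> ~109 PPI
# 48" 4K    -> ~92 PPI (matching the ~92 DPI mentioned in the quote)
for w, h, d in [(5120, 2880, 27), (2560, 1440, 27), (3840, 2160, 48)]:
    print(f'{w}x{h} @ {d}": {ppi(w, h, d):.1f} PPI')
```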

DSR technology that Nvidia promotes and people tend to use to "try high resolution" has nothing to do with it. Higher fidelity of high resolution displays is a physical property and physical feat, not something you could ever simulate on a normal resolution display since all the benefits come from the higher physical pixel density of the display. DSR can only provide a better quality source for content sampling, but not much more.

Hopefully you'll read this wall of text and consider learning from it, not attacking it.


----------



## willmaltby

Quote:


> Originally Posted by *eliau81*
> 
> Ohhh goddd what did i done????!!!!!
> Instaled the hybrin evga 980 and the card dosnt work!!!! Oh god
> Im uploadin some photo only the pump works the gtx led logo dosent work
> Please help somebody


Please tell me you didn't actually install it with only the waterblock attached as pictured!?

That'd certainly explain why the logo didn't turn on, lol.


----------



## eliau81

So I have spoken with an electronics lab, and they said it can be fixed.
Hopefully.
Just reinstalled my GTX 980, and I swear I heard my PC saying "are you kidding me?!..."


----------



## profundido

Quote:


> Originally Posted by *eliau81*
> 
> so i have spoken with electronic lab and they said that it can be fix
> hopefully
> just reinstated my gtx 980 and i can swar i heard my pc saying "are you kidding [email protected][email protected]!..."


In case they cannot fix it and the card is lost, just swallow the bitter pill, since you learned a valuable lesson here. You won't be so careless next time with a card of this value: stick to safer, more proven cooling methods like EK blocks, read and follow the guidelines carefully, use the plastic washers everywhere, don't overtighten screws, etc. I sincerely hope it won't eventually cost you the card to learn that, but in the long run it will benefit you (and the industry, I'm sure). If it comes down to that, just realize that all you lost was some numbers in your bank account.


----------



## eliau81

Quote:


> Originally Posted by *skypine27*
> 
> Eliau81:
> 
> I'm sorry to say I'd have to kind of side with Joker on this one. If you aren't going to go full custom loop and get respectable and tested full blocks from EK or aqua-comptuer.de, then stick with the stock coolers. Taking an AIO off a different card and going that route doesn't seem worth the risk.
> 
> I really hope you get your money back and nvidia doesn't read this thread !
> 
> Side comment:
> 
> So I was running more Fire Strike Ultra Benches and this time had the Power Limit displayed in the Precsion X OSD.
> 
> I have the power slider set to its max of 120% and + 175 Mhz on the GPU clock and +500 on the Mhz.
> 
> I was watching clock speeds and power usage (the temps are fine to me under the EK blocks) and noticed the cards seemed to max out at 2000 and then one card or the other would smack into the 120% (briefly would see 121%) and then one or both would rapidly drop down to 1975, 1987, 1974, 1962, etc. The highest I would see was 2000 and it wasn't sustained very long before that power limit hit. I ran FSU back to back in this config and got:
> Run 1: 13817
> Run 2: 13749
> 
> I really hope someone can crack the BIOS and let us crank that slider to 125% or 130% and not have to deal with strange PCB mods !!


Quote:


> Originally Posted by *Glerox*
> 
> I created an account just to write this comment. I hope I can help you eliau81.
> I've been following this thread a lot!
> 
> I had two gtx 1080 EVGA FE in SLI and I just switched to one titan x pascal becase I noticed stuttering on my 4k television with SLI (also have a 165Hz g-sync). And one card is just way simpler IMO.
> 
> I installed AIO watercooling with kraken G10s on my gtx 1080. The first time I damaged the card (don't know why because I was really careful) and it wasn't even detected in my system. I had to RMA it and the reason I chose EVGA is because they have the best warranty. I replaced the original cooler and without a single question they gave me a brand new gtx 1080 FE without any fee (for this reason I will always buy EVGA when I can).
> 
> Now, titan X Pascal is from Nvidia directly and they don't seem to have a really good warranty... but it's not our fault if a 1800$CAN GPU has a cooler that SUCKS REALLY BAD. So it's totally legitimate to change the cooler...
> 
> I guess your best luck is to replace the original cooler as clean as you can and tell them the product was dead on arrival... it was possibly already damaged and just by changing the cooler it killed the card... who knows!
> 
> I hope you will get a replacement...
> 
> I just received my EK custom loop kit today. Will be installing my first custom loop this weekend!
> I'm a bit stressed now that I see someone has broken his card...


Quote:


> Originally Posted by *tonnytech*
> 
> 
> what tool did you use to unscrew hex bolts can see lots of worn circles on the pcb around screw holes , seems like used a little to much force and would be noticable should you return it and they dissasemble to inspect pcb.
> 
> The worn circles look like you gone in with a socket which is to big and pushed down to hard as you have unscrewed the hex bolts , damaging various parts on the pcb.


I'm not gonna send it to Nvidia. It's obvious and very noticeable that I've scratched the PCB, so it would be a waste of time.
Thanks for the help, guys.
I will send the card to the electronics lab and keep updating y'all.


----------



## Gary2015

Quote:


> Originally Posted by *eliau81*
> 
> im not gonna send it to nvidia its Obviously and very noticeable that i have scratch the PCB so it gonna be waste of time
> thanks guys for helping
> i will send the card to the electronic lab and keep updating y'll


Hope you get it sorted.


----------



## willmaltby

Quote:


> Originally Posted by *eliau81*
> 
> didn't mean to insult anybody
> and I really appreciate the help


It's cool, I feel for you man, been there, done that!

Hopefully this place can fix the PCB for you...


----------



## inoran81

Quote:


> Originally Posted by *Gary2015*
> 
> The extended top on the sm8 looks ugly .


Quote:


> Originally Posted by *combat fighter*
> 
> The SM8 looks ugly full stop.
> 
> Has the looks from something about 15 years ago.
> 
> Dated and boring.


Lol, one man's meat is another man's poison... but the flexibility to change the size of this SM8 sealed the deal. I previously contemplated the STH10, but you can't 'resize' that case compared to this option...

I can also always change to my InWin D-Frame and use the AQC Gigant for it... if I don't like this setup after a while.


----------



## PowerK

TITAN X Pascals arrived yesterday. However, I ordered 3-Slot (instead of 4-slot) HB Bridge by mistake. :-(
Installing today.


----------



## Woundingchaney

Quote:


> Originally Posted by *eliau81*
> 
> Ohhh goddd what did i done????!!!!!
> Instaled the hybrin evga 980 and the card dosnt work!!!! Oh god
> Im uploadin some photo only the pump works the gtx led logo dosent work
> Please help somebody


I'm somewhat confused as to how you ended up like this. I recently installed the 1080 EVGA Hybrid cooler and it took maybe 5 minutes. There is no reason to remove the backplate or the fan and the metal plate it attaches to. The AIO pump will literally mount directly once you remove the stock cooling chamber and outer plastic shroud.


----------



## toncij

Quote:


> Originally Posted by *PowerK*
> 
> TITAN X Pascals arrived yesterday. However, I ordered 3-Slot (instead of 4-slot) HB Bridge by mistake. :-(
> Installing today.


Is that a satellite dish inside the house? How does that go?









You can use dual flex bridges instead, don't worry.


----------



## KillerBee33

Quote:


> Originally Posted by *Woundingchaney*
> 
> Im somewhat confused as to how you ended up like this. I recently installed the 1080 evga hybrid cooler and it took maybe 5 minutes. There is not reason to remove the back plate or the fan and metal plate it attaches to. The aio pump will literally mount directly once you remove the stock cooling chamber and outer plastic shroud.


How did you manage to connect the AIO pump without taking the backplate and heatplate off?
I mean, how did you manage to get to the power connector header?


----------



## PowerK

Quote:


> Originally Posted by *toncij*
> 
> Is that a satellite dish inside the house? How does that go?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can use dual flex bridges instead, don't worry.


As for the dish inside, it's a B&O A9 speaker. hehe
Yeah, I'm using the ASUS ROG 3-Way SLI Bridge (LED) at the moment.


----------



## Woundingchaney

Quote:


> Originally Posted by *KillerBee33*
> 
> How did you manage to connect the AIO pump without taking the backplate and heatplate off?
> I mean, how did you get to the power connector header?


I connected the power connector to the pump with tweezers. It was a relatively simple thing to do, honestly. I removed the stock fan power connector, attached it to the AIO pump, and connected it back to the power connector on the card.


----------



## KillerBee33

Quote:


> Originally Posted by *Woundingchaney*
> 
> I connected the power connector to the pump with tweezers. It was a relatively simple thing to do, honestly. I removed the stock fan power connector, attached it to the AIO pump, and connected it back to the power connector on the card.


Tried that, but couldn't even unplug it, so I ended up taking the whole thing apart.


----------



## KillerBee33

Quote:


> Originally Posted by *Woundingchaney*
> 
> I connected the power connector to the pump with tweezers. It was a relatively simple thing to do, honestly. I removed the stock fan power connector, attached it to the AIO pump, and connected it back to the power connector on the card.


Daaamn, double posts again


----------



## Woundingchaney

Quote:


> Originally Posted by *KillerBee33*
> 
> Tried that , couldn't even unplug it first so ended up taking the whole thing apart


Huh that's odd, yeah I had no problem. Hardest part of the whole procedure was routing the pump power line.

Honestly I would rather connect the pump to an external power source, but didn't have anything compatible.


----------



## KillerBee33

Quote:


> Originally Posted by *Woundingchaney*
> 
> Huh that's odd, yeah I had no problem. Hardest part of the whole procedure was routing the pump power line.


Ehh, what pissed me off is that because of the much smaller shroud, we can't have the last piece on top of the pump for a nice clean look like the 900 series. Dremeling one of the biggest parts on a $1200 card is unacceptable, IMO.


----------



## Woundingchaney

Quote:


> Originally Posted by *KillerBee33*
> 
> Ehh, what pissed me off is that because of the much smaller shroud, we can't have the last piece on top of the pump for a nice clean look like the 900 series. Dremeling one of the biggest parts on a $1200 card is unacceptable, IMO.


Yeah, I know what you mean. Hopefully EVGA releases an official AIO for the Titan XP. I bought the 1080 one just as a stopgap solution to see if I wanted to go full water block; ultimately my temps never go above 55C, so I can't see investing in a custom solution.


----------



## KillerBee33

Quote:


> Originally Posted by *Woundingchaney*
> 
> Yeah, I know what you mean. Hopefully EVGA releases an official AIO for the Titan XP. I bought the 1080 one just as a stopgap solution to see if I wanted to go full water block; ultimately my temps never go above 55C, so I can't see investing in a custom solution.


Nope... and to be honest I wouldn't want EVGA's shroud.

Also waiting for this to be released for the TXP, then I'll go nuts and finally spend some time on a better cooling solution:








http://www.swiftech.com/KOMODONV-LEGTX1080.aspx


----------



## eliau81

Quote:


> Originally Posted by *Woundingchaney*
> 
> I connected the power connector to the pump with tweezers. It was a relatively simple thing to do, honestly. I removed the stock fan power connector, attached it to the AIO pump, and connected it back to the power connector on the card.


There was no access to the stock fan power connector. I had to remove all the hex screws and the PCB plate to gain access.
Believe me, I tried.


----------



## KillerBee33

Quote:


> Originally Posted by *eliau81*
> 
> There was no access to the stock fan power connector. I had to remove all the hex screws and the PCB plate to gain access.
> Believe me, I tried.


Uhumm, I did try to use tweezers here, but it's a no-go.


----------



## Woundingchaney

Quote:


> Originally Posted by *KillerBee33*
> 
> Uhumm, I did try to use tweezers here, but it's a no-go.


Quote:


> Originally Posted by *eliau81*
> 
> There was no access to the stock fan power connector. I had to remove all the hex screws and the PCB plate to gain access.
> Believe me, I tried.


Guys I swear it was really easy to do.









For the cabling, I just routed it around the fan via the SLI connector shroud.


----------



## piee

Gamers Nexus shows a complete install of this hybrid GPU cooler on YouTube.


----------



## KillerBee33

Quote:


> Originally Posted by *piee*
> 
> Gamers Nexus shows complete install of this hybrid gpu cooler on youtube


I did not open up the 1080, but there might be a difference on the TXP, possibly a smaller opening.


----------



## Jpmboy

Quote:


> Originally Posted by *profundido*
> 
> In that boat myself. I couldn't find a hex screwdriver in any of the do-it-yourself shops around me, or even in the only hobby store. I ultimately bought mini pliers and started the monk's job, telling myself "slow... slow... don't take the risk of leaving that steady hand for 1 sec"
> 
> and after going through the pain of having to do this for 2 cards, I was so happy that EK put simple Phillips screws in. Not for the love of my life would I put those hex bolts back in!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes it's kinda scary:


lol - Even PEPboys carries 4mm sockets.


----------



## DNMock

Quote:


> Originally Posted by *HyperMatrix*
> 
> It helps when the fps fluctuates. If I had 160fps guaranteed/locked-down FPS in games all the time, I'd be a happy camper. But that's not possible with 2 Titans, unfortunately. GSYNC also reduces input lag. Vsync-off at 165Hz/165fps still has higher input lag than 165Hz GSYNC with a 160fps cap, since GSYNC matches every frame generated to a screen refresh.
> Looks like we have another contrarian in our midst. Mentioning tftcentral, then ignoring that TFT Central clearly lists
> 
> "Refresh Rate
> 144Hz native
> Up to 165Hz max overclocked
> G-sync range 30 - 165Hz"


Wasn't aware Gsync helped with input lag, makes sense now, thank ya
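The refresh-rate argument quoted above comes down to frame-time deltas, which are easy to check yourself (a quick sketch, assuming nothing beyond the numbers quoted in the thread; it says nothing about what any individual can perceive):

```python
# Frame time in milliseconds for a given frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# Going from 30 to 60 fps halves the frame time (a ~16.7 ms change),
# while 144 -> 165 Hz shaves off less than 1 ms per frame. This is why
# the low-end jump is far easier to notice than the high-end one.
for fps in (30, 60, 144, 165):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.2f} ms/frame")
```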


----------



## Nunzi

Quote:


> Originally Posted by *Woundingchaney*
> 
> I connected the power connector to the pump with tweezers. It was a relatively simple thing to do honeslty, I removed the stock fan power connector attached it to the aio pump and connected it back to the power connector on the card.


I did the same thing ........


----------



## profundido

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - Even PEPboys carries 4mm sockets.


Since the name PEPboys didn't ring a bell here, I googled it. It turns out to be some sort of car shop or chain, maybe in America? I'm guessing from your comment that people who live over there have these things available in many stores, easily accessible.

If this is the case, my friend, might I remind you that there exist other countries outside of the US/Canada where people like me don't have this luxury, and also have to pay 2620 euro (= 2960 US dollars) for 2 of these cards?


----------



## Difunto

Quote:


> Originally Posted by *Woundingchaney*
> 
> I connected the power connector to the pump with tweezers. It was a relatively simple thing to do honeslty, I removed the stock fan power connector attached it to the aio pump and connected it back to the power connector on the card.


Yup! That's the same thing I did... it's easy.


----------



## Woundingchaney

Quote:


> Originally Posted by *profundido*
> 
> since the name PEPboys didn't ring any bell here, I googled it. It turns out to be some sort of car shop or chain maybe in America ? I'm guessing from your comment that people that live over there have these things available in many stores, easily accessible.
> 
> If this is the case my friend, might I remind you that there exist other countries outside of the US/Canada where people like me don't have this luxury and also have to pay 2620euro (=2960 US dollars) for 2 of these cards ?


I'm familiar with PEPboys, but we do not have any in my city. I suppose the part that people find odd is that these bits/sockets/drivers (whatever) are extremely common in just about any general store here. Virtually every multi-piece mechanical set I have seen has this 4mm attachment (though I'm assuming it is typically easier to find in screwdriver sets than, say, a socket set).

Imagine living in a place where 60,000 people a year file for bankruptcy over medical bills...


----------



## Fiercy

Quote:


> Originally Posted by *DNMock*
> 
> Do you buy a fridge to store rotten, moldy food in, or fresh food in?
> 
> And yeah, any title (sans heavily modded or .ini files cranked up to 11) that can't run at 60 fps @4k on SLI TXP (assuming it supports SLI) even with AA disabled is moldy and rotten.


See, that's very flawed logic. Say they decreased the demands before release so the top settings performed better; does that make it fresh and nice?

That is so stupid. Witcher 2 had a supersampling setting that destroyed everything back in its day, so it's rotten too?! Or was Crysis 1 rotten?

If you are buying Titans just to get 60 in 4K, and you go completely blind to a game people spent 5 years developing just because you don't get 60, that makes me really feel sorry for you.


----------



## profundido

The furthest I got on my search was the guy in the hobby store digging up a catalog of electronic instruments and toolsets he "could order". My colleague at work waited 6 months for a sort of measuring tool he ordered from a store in the US. Eventually the shop reimbursed him the money because Customs kept blocking it for no reason at all. It never arrived. The free and equal world at its best, hahaha.


----------



## DNMock

Quote:


> Originally Posted by *profundido*
> 
> since the name PEPboys didn't ring any bell here, I googled it. It turns out to be some sort of car shop or chain maybe in America ? I'm guessing from your comment that people that live over there have these things available in many stores, easily accessible.
> 
> If this is the case my friend, might I remind you that there exist other countries outside of the US/Canada where people like me don't have this luxury and also have to pay 2620euro (=2960 US dollars) for 2 of these cards ?


Every full socket set I have ever seen contains a 4mm socket.


Spoiler: Warning: Spoiler!







If you don't own a full socket set, you need one immediately, considering almost everything that you need to assemble/disassemble on the planet is gonna require a socket of some form.


----------



## profundido

Quote:


> Originally Posted by *DNMock*
> 
> Every full socket set I have ever seen contains a 4mm socket.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> If you don't own a full socket set, you need one immediately, considering almost everything that you need to assemble/disassemble on the planet is gonna require a socket of some form.


I have such a set, but with far fewer pieces in total, and of course nothing close to that small. Yours looks nice though!


----------



## DNMock

Quote:


> Originally Posted by *profundido*
> 
> I have such a set but much fewer pieces in total. And ofc nothing close to that small. Yours looks nice though !


lol, mine is covered in dirt and muck with one of the doors snapped off. That's a stock photo from www.homedepot.com

I've got like 3 of them, since I have a tendency to lose sockets. I have the one fully assembled kit and a drawer filled with a bunch of rogue sockets and drivers, haha.
Quote:


> Originally Posted by *Fiercy*
> 
> See that's a very flaud logic say they decrease the demands before the release and make top settings perform better does that make it fresh and nice?
> 
> That is so stupid. Witcher 2 had a super sampling setting that destroyed everything back in its day so it's rotten too!? Or Crysis 1 was rotten?
> 
> If you are buying Titans just to get 60 in 4k and go completely blind to the game people were developing for 5 years just because you don't get 60 makes me really feel sorry for you.


You know what both of those games you listed have in common? Both were built for PC, both were head and shoulders above anything else of their day graphically, and both could be tweaked to the moon in settings.

There's an exception to every rule, but this doesn't fit the bill here.


----------



## Jpmboy

Quote:


> Originally Posted by *profundido*
> 
> since the name PEPboys didn't ring any bell here, I googled it. It turns out to be some sort of car shop or chain maybe in America ? I'm guessing from your comment that people that live over there have these things available in many stores, easily accessible.
> 
> If this is the case my friend, might I remind you that there exist other countries outside of the US/Canada where people like me don't have this luxury and also have to pay 2620euro (=2960 US dollars) for 2 of these cards ?


So... since you have no location listed in your avatar/sig, and no access to METRIC tools from any shop wherever you are... my friend, I guess you are just SOL.









And the price of the cards is all the more reason to use the proper tools, and not to touch the thing with a pitchfork or whatever "handy" tools are around, right?


----------



## profundido

Quote:


> Originally Posted by *Jpmboy*
> 
> So... since you have no location listed in your avatar/sig, and no access to METRIC tools from any shop where ever you are... my friend. I guess you are just SOL.


Please read... I went to several shops that sell nothing but tools, including metric tools, and they did not have anything so small. Even the hobby shop that specializes in small parts and tools had small screwdrivers for Phillips and flat types, but not these particular hex ones....

But indeed, I was definitely out of luck, so small pliers it was.


----------



## KillerBee33

Quote:


> Originally Posted by *profundido*
> 
> Please read... I went to several shops that sell nothing but tools to do stuff and they did not have anything so small, even a hobby shop that specializes in small parts and tools and they had small screwdrivers for philips and flat types but this particular hex ones they didn't have....


Get this kit for the future








https://www.amazon.com/gp/product/B016Q3D4AC/ref=oh_aui_detailpage_o01_s00?ie=UTF8&psc=1


----------



## profundido

Quote:


> Originally Posted by *KillerBee33*
> 
> Get this kit for the future
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.amazon.com/gp/product/B016Q3D4AC/ref=oh_aui_detailpage_o01_s00?ie=UTF8&psc=1


Owww, that is very nice indeed! I have a similar one in blue, without the bits in the lowest row. That seems to be super rare here.

I immediately clicked on order, selected my country (Belgium), and got:

In Stock.
This item does not ship to Belgium

roflmao


----------



## profundido

Guess I will browse online stores in Europe for something similar. Probably my best bet for the future.


----------



## mattlach

Quote:


> Originally Posted by *toncij*
> 
> I'm really going to limit this discussion as much as possible. There is a significant difference in the two things you mention. First is that medically/biologically, there are certain limits to human hearing that prevent people, no matter how trained or fine-tasted, to hear something. It's simply something that can be measured, just as human vision perception can be.
> 
> Now, the human vision. People don't see in frames or pixels, but there are time frames where people can see something and there are dots related to distances which humans can or can not see. To cut this short: humans can see a short frame even at high refresh rates above 200Hz. 240Hz considered the upper limit where people will not see flickering, even subconsciously. Every normal human can see a difference between 60Hz refreshed motion and 1xx Hz refreshed motion. Period. Unless you have a medical condition that would render you unable to see it, like some people do have issues with 3D, VR etc. As the refresh rate goes up from the 60, the change required for you to perceive it, is larger. You need a much bigger change to see it at high, than low framerates e.g. you'll easier see 30 to 50 FPS change than 100 to 165Hz.


Dead wrong.

This is true of traditional CRT screens, where 60Hz is bothersome because the entire screen is redrawn on every refresh and this results in a noticeable flicker. LCD screens don't work this way: each pixel holds its state between refreshes (sample-and-hold), so there is no flicker to notice in the first place.

I looked at the 30fps vs 60fps video someone linked above. I stared at them for a while before reading which was which. In the end I could kind of tell that the 30fps one was slightly worse, but it was so marginal as to almost not be worth mentioning. I had to REALLY focus to see which was which. If I weren't trying to compare them side by side, I would never have known.

I would have felt it in a mouse, as 30fps feels much less responsive than 60fps, but visually? Almost no difference at all. Going above 60fps is completely and utterly pointless. All it does is create more cost, more heat and more noise in games.
Quote:


> Originally Posted by *toncij*
> 
> Now, the resolution. Scaling is not something to believe in or not. It's also, like framerates and motion, science. Not something tied to personal preference. There is a term "retina" you may have heard, which is one of the terms related to human resolution perception and which says that at a certain distance and dot (pixel) density, you should not be able to see individual dots.
> At a normal, 40cm distance from a display


Well, there's your problem right there. 40cm is super close for desktop use. If you sit ergonomically correctly at a screen, viewing distance should be about arm's length, almost 80cm away.

This is why high DPI on phones is very useful, because people hold them much closer to their face, but on a desktop it is a waste and only results in scaling.
Quote:


> Originally Posted by *toncij*
> 
> , the resolution of 2560x1440 on a 27" display still makes it possible to see individual pixels and jagged edges or lines.


This is not a real-world test. Look at a digital photograph instead, without the artificial high contrast of a black line on a white background. With the exception of some corner cases of poorly designed displays and weird pictures, you'd have to go down to 90 DPI or below to see individual pixels at a normal 80cm viewing distance.

Quote:


> Originally Posted by *toncij*
> 
> The point of high density displays is not to make it run at 100% scaling and have more real estate, but to scale content up to make text and other elements more clear and close to paper print (300 DPI usually).


I still consider that a total waste of pixels. The difference is marginal at best.

Also, you can't compare screen DPI to print DPI. They are completely different animals. Print DPI counts the four color dots (CMYK) separately, whereas with screen PPI, each pixel has an RGB value made up from components of its subpixels.

Essentially, the print DPI value is referring to subpixels. So a print at 300 DPI has a MUCH lower effective resolution than a screen at the same nominal figure. You could probably estimate it by dividing by 3 or 4, but it's tough to do because there is no direct correlation.

This is why when you - for example - put a picture from a magazine into a scanner, and look at it on screen, it usually looks kind of bad, with a bunch of "noise" introduced by the printing process. (you can of course try to improve this through various blurring filters, but still)

Many people hold printed materials closer when they read them than the optimal screen distance. However, this is why professional printers - magazines, etc. - usually print at much higher resolutions, like 1200 to 2400 DPI.

Quote:


> Originally Posted by *toncij*
> 
> A 5K display exhibits a 217 PPI measure, still short of 300 DPI a printed magazine has.
> By increasing the panel density making pixels and dot pitch smaller, our perception of the content improves. We're capable of reading smaller text and noticing tinier details at the same distance of the same diagonal panel we're looking at usually. With a 5K display on 27" you're able to see much more and finer detail than with a normal 2560x1440 display, at the same content, but scaled up to 200% making one content pixel be made up of 4 physical pixels.


If you look at the science on this (which I did a while back) there are calculations that show perceptible resolution based on viewing distance.

In those calculations, when I did them, I recall there being a perceptible difference at a standard 80cm viewing distance all the way up to just under 200 DPI (if I recall, it maxed out somewhere in the 190s), but in practical use you quickly run into the law of diminishing returns, where going much above 110 didn't help much.

But yes, using high-contrast worst-case techniques at a typical viewing distance of 80cm, it is possible to tell the difference up to just under 200 DPI. These aren't real-world tests, though; real-world tests use final content, like a digital photograph.
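Those viewing-distance calculations can be sketched with the standard visual-acuity model (one assumption here: 20/20 vision resolves roughly 1 arcminute; sharper acuity assumptions push the limit higher, which is where figures approaching 200 DPI come from):

```python
import math

# Max PPI a viewer can resolve at a given distance, assuming a given
# visual acuity in arcminutes (1.0 arcmin is roughly 20/20 vision).
def perceptible_ppi(distance_cm: float, acuity_arcmin: float = 1.0) -> float:
    distance_in = distance_cm / 2.54
    # Width (in inches) of one just-resolvable dot at this distance.
    dot_in = distance_in * math.tan(math.radians(acuity_arcmin / 60.0))
    return 1.0 / dot_in

print(perceptible_ppi(80))  # ~109 PPI at a typical 80 cm desktop distance
print(perceptible_ppi(40))  # ~218 PPI at 40 cm, near "retina" 5K density
```

Note how the same formula supports both sides of this argument: at 80cm the limit lands near 110 PPI, while at 40cm it lands near the 217 PPI of a 27" 5K panel.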

Quote:


> Originally Posted by *toncij*
> 
> DSR technology that Nvidia promotes and people tend to use to "try high resolution" has nothing to do with it. Higher fidelity of high resolution displays is a physical property and physical feat, not something you could ever simulate on a normal resolution display since all the benefits come from the higher physical pixel density of the display. DSR can only provide a better quality source for content sampling, but not much more.


DSR is essentially just better anti-aliasing at a huge frame-rate cost.

Quote:


> Originally Posted by *toncij*
> 
> Hopefully you'll read this wall of text and consider learning from it, not attacking it.


Not an attack, just setting the record straight.

Now, I understand everyone is different. You - for instance - sit unusually close to your screen, apparently, and higher resolution is definitely more useful the closer you get, but a lot of the rest of this is plain old placebo effect, just like with the audiophile conversations that often get so heated.

Personally, I consider my money wasted if I need to scale the desktop to read normal text. It might result in slightly higher-fidelity text, but with proper font antialiasing settings, text is just fine at 100 DPI. I might go slightly higher, to 110, but that's the extent of it. I consider this all to be Apple's fault, using their bull**** reality distortion field to market the retina crap no one needs and charge much more money for it.

I typed this on my 30" 2560x1600 screen (Dell U3011) at work at 100.6 DPI and it is great.

Similarly, I consider the money spent on both GPUs and screens, plus the added annoyance of heat, noise and power use, a waste above 60fps. I always vsync everything to 60fps. My bigger goal is to make sure my minimums never drop below 60; that's where I find I get real return on my computing investments.
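The PPI figures quoted in this post are straightforward to verify from resolution and diagonal size alone (plain arithmetic, nothing more):

```python
import math

# Pixels per inch from horizontal/vertical resolution and diagonal size.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(ppi(2560, 1600, 30.0))  # ~100.6 (30" Dell U3011)
print(ppi(5120, 2880, 27.0))  # ~217.6 (27" 5K)
print(ppi(2560, 1440, 27.0))  # ~108.8 (27" 1440p)
```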


----------



## meson1

Quote:


> Originally Posted by *profundido*
> 
> guess I will browse online stores in europe for something similar. Probably my best bet for the future.


Here in the UK I got one of these: https://www.amazon.co.uk/Draper-02349-40-Piece-Imperial-Combined/dp/B004QXDHGG/ref=sr_1_1?ie=UTF8&qid=1472225816&sr=8-1&keywords=Draper+02349


Spoiler: Warning: Spoiler!







I got it for working on my computer; specifically, I could use the extensions to get at and tighten the nuts on my Thermalright Silver Arrow SB-E in my current rig. And I used it to help assemble my new CL TH10A. I just checked, and there is a 4mm socket included.


----------



## profundido

Quote:


> Originally Posted by *meson1*
> 
> Here in the UK I got one of these: https://www.amazon.co.uk/Draper-02349-40-Piece-Imperial-Combined/dp/B004QXDHGG/ref=sr_1_1?ie=UTF8&qid=1472225816&sr=8-1&keywords=Draper+02349
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I got it for working on my computer, specifically I could use the extensions to get at and tighten the nuts on my Thermalright Silverarrow SB-E in my current rig. And I used it to help assemble my new CL TH10A. I just checked and there is a 4mm socket included.


Looks really good, and I can actually order from there too. Ty for the tip.


----------



## CRITTY

Quote:


> Originally Posted by *Fiercy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DNMock*
> 
> Do you buy a fridge to store rotten, moldy food in, or fresh food in?
> 
> And yeah, any title (sans heavily modded or .ini files cranked up to 11) that can't run at 60 fps @4k on SLI TXP (assuming it supports SLI) even with AA disabled is moldy and rotten.
> 
> 
> 
> See that's a very flaud logic say they decrease the demands before the release and make top settings perform better does that make it fresh and nice?
> 
> That is so stupid. Witcher 2 had a super sampling setting that destroyed everything back in its day so it's rotten too!? Or Crysis 1 was rotten?
> 
> If you are buying Titans just to get 60 in 4k and go completely blind to the game people were developing for 5 years just because you don't get 60 makes me really feel sorry for you.


----------



## toncij

Quote:


> Originally Posted by *PowerK*
> 
> As for the dish inside, it's B&O A9 speaker. hehe
> Yeah, I'm using ASUS ROG 3-Way SLI Bridge (LED) at the monent.


Well, I'd rather go with two flex bridges. The 3-way doesn't have full interconnects, just zig-zags; it actually connects two cards on only two pins. Dual flex bridges actually work like an HB bridge, connecting two sets of two pins.


----------



## toncij

Quote:


> Originally Posted by *mattlach*
> 
> ...


You're wrong about almost everything you wrote, confusing things you don't completely understand and mixing misinterpreted facts and correlations with correct information. Probably unintentionally, you're presenting correct but irrelevant, unrelated or only semi-related information as arguments for your ideas.
Quote:


> Originally Posted by *mattlach*
> 
> I looked at the 30fps vs 60fps video someone linked above. I stared at them for a while before reading which was which. In the end I could kind of tell that the 3fps one was slightly worse, but it was so marginal as to almost not be worth mentioning. I had to REALLY focus to see which was which. If I weren't trying to compare them side by side, I would never have known.
> I would have felt it in a mouse, as 30fps feels much less responsive than 60fps, but visually? Almost no difference at all.
> I still consider that a total waste of pixels. The difference is marginal at best.


Don't get me wrong, but it is evident you have a medical problem with your eyes. Sincerely, get that checked.
Quote:


> Originally Posted by *mattlach*
> 
> This is why when you - for example - put a picture from a magazine into a scanner, and look at it on screen, it usually looks kind of bad, with a bunch of "noise" introduced by the printing process. (you can of course try to improve this through various blurring filters, but still)


A typical screen is low-resolution crap. A 22-megapixel photo looks stunningly better on a 5K display than on a 1440p one. Of course, you can't see it because of your medical condition, but there's a significant difference for average people.

Anyway. Someone's wrong on the Internet, and I don't keep false hopes of proving anyone wrong or showing you "the truth". Just, really, get your eyesight tested. Telling the difference between 144Hz and 165Hz is very hard, but if you can't tell obvious stuttering at 30 FPS from 60, you have a serious problem: you're going almost completely motion-blind.


----------



## mattlach

Quote:


> Originally Posted by *toncij*
> 
> You're wrong in almost everything you wrote, just confusing things you don't completely understand and mixing misinterpreted facts and correlations with correct information. Probably unintentionally, you're presenting correct, but irrelevant and unrelated or semi-related information as arguments for your ideas, but that's incorrect.


Right back at you









Quote:


> Originally Posted by *toncij*
> 
> Don't get me wrong, but it is evident you have a medical problem with your eyes. Sincerely, get that checked.
> A typical screen is low resolution crap. A 22 megapixel photo looks stunningly better on an 5K display than a 1440 one. Of course, you can't see it because of your medical condition, but there's significant difference for average people.


My vision is - once corrected by lenses - 20/20, so I have no problem with my vision, thank you very much.

Just keep in mind how the human brain works. When a human being expects something to look/feel/taste/sound/whatever better, it does. It's called the placebo effect, and it affects ALL of us. So when you say "there is a significant difference to average people" I believe you. It looks significantly better to them because they believe it will.

Then again, if the placebo effect works, why not take advantage of it?


----------



## Kyouki

I used a kit like this, which you can get off Amazon for around $15. I bought it for working on my quads, but it has been used for many other hobby-related things.


https://www.amazon.com/helicopter-Maintenance-Driver-Screwdriver-Shipping/dp/B01FVYISI0/ref=sr_1_3?s=toys-and-games&ie=UTF8&qid=1472234121&sr=1-3&keywords=7+in+1+Hex+Screw+Driver+Tool


----------



## bizplan

Same here, I installed the EVGA 980ti hybrid kit by only taking the front cooler and fan shroud off, using tweezers to remove and reinstall the power connector.


----------



## bizplan

Quote:


> Originally Posted by *KillerBee33*
> 
> How did you manage to connect the AIO pump without taking backplate and heatplate off?
> I mean how did you manage to get to Power Connector Heather?


Same here. I installed the EVGA 980 Ti hybrid kit on my Titan XP by only taking the front cooler and fan shroud off, using tweezers and needle-nose pliers to remove the power connector, patch in the pump and fan connector, and then reinstall it. I might add that it was VERY DIFFICULT and I do NOT recommend doing it this way, as the tweezers/pliers could cut into the (twisted) power wires (about 4 of them connect to the connector), or damage the connector and create a short. You have to be very patient and careful if you do it this way. Again: HIGHLY NOT RECOMMENDED!!


----------



## HyperMatrix

Quote:


> Originally Posted by *inoran81*
> 
> Lol, one man's meat is another man's poison... but the flexibility to change the size of this SM8 sealed the deal. I was previously contemplating the STH10, but you can't 'resize' that case compared to this option.
> 
> I can also always change to my InWin D-Frame and use the AQC Gigant for that... if I don't like this setup after a while.


I am incredibly attracted to this case...


----------



## HyperMatrix

Quote:


> Originally Posted by *toncij*
> 
> You're wrong in almost everything you wrote, just confusing things you don't completely understand and mixing misinterpreted facts and correlations with correct information. Probably unintentionally, you're presenting correct, but irrelevant and unrelated or semi-related information as arguments for your ideas, but that's incorrect.
> Don't get me wrong, but it is evident you have a medical problem with your eyes. Sincerely, get that checked.
> A typical screen is low resolution crap. A 22 megapixel photo looks stunningly better on an 5K display than a 1440 one. Of course, you can't see it because of your medical condition, but there's significant difference for average people.
> 
> Anyway. Someone's wrong on The Internet and I don't keep false hopes of proving anyone wrong or showing you "the truth". Just, really, get your eye sight tested. Telling difference from 144Hz to 165Hz is very hard, but if you can't tell an obvious stuttering on 30 FPS from 60, you have a serious problem of going almost completely motion-blind.


The guy is clearly blind. I couldn't even bring myself to reply to his ridiculous commentary and statements of facts that are just so incredibly wrong that they're not even funny. Don't bother trying to convince him otherwise. Some goldfish prefer sitting in a fish bowl and never knowing about the ocean.


----------



## KillerBee33

Quote:


> Originally Posted by *bizplan*
> 
> Same here, I installed the EVGA 980ti hybrid kit on my Titan XP by only taking the front cooler and fan shroud off, using tweezers or needle-nose pliers to remove, patch in the pump and fan connector, and then reinstall the power connector. I might add that it was VERY DIFFICULT and I do NOT recommend doing it this way as those tweezers/pliers could cut into the [twisted] power wires (about 4 of them that connect to the connector) or the tweezers could damage the connector and create a short. You have to be very patient and careful if you do it this way, however, HIGHLY NOT RECOMMENDED!!


OKAY, you may relax now







it was DONE the right WAY last WEEK !


----------



## Testier

Wait, how did someone kill their card by doing the hybrid mod? I thought it was fairly simple. How exactly do you break the traces by screwing things in too tight?


----------



## criminal

Quote:


> Originally Posted by *Testier*
> 
> Wait, how did someone kill their card by doing the hybrid mod? I thought it was fairly simple. How exactly do you broke the traces by screwing things in too tight?


There are always dangers when installing any aftermarket cooler. If I had to guess, they broke a trace using the pliers to remove the hex nuts, or they damaged the GPU core itself by over-tightening. These Pascal cards seem to be more delicate than Kepler or Maxwell IMHO.


----------



## Testier

Quote:


> Originally Posted by *criminal*
> 
> There are always dangers when installing any aftermarket coolers. If I had to guess they broke one using the pliers to remove the hex nuts or they damaged the gpu core itself by over tightening. These Pascal cards seem to be more delicate than Kepler or Maxwell IMHO.


It looks like the GP102 core has a metal bracket around it. Would that help prevent cracking the core?


----------



## Kyouki

Question: when installing the EK backplate, do I need to use the clear plastic washers between the backplate and PCB? The instructions don't show them being used, but the kit came with some. If I do use them, I don't see the pads making firm contact with the PCB.

Answer: lol, I could not wait. I installed it without them, and it looks like the screws fit perfectly and there is now better contact between the PCB and the thermal pads.


----------



## toncij

Quote:


> Originally Posted by *mattlach*
> 
> Right back at you
> 
> 
> 
> 
> 
> 
> 
> 
> My vision is - once corrected by lenses - 20/20, so I have no problem with my vision, thank you very much.


I was not trying to insult you. I'm very serious: there are many eye diseases and defects beyond short- or long-sightedness. Motion, depth, color perception, etc. may be affected.

Quote:


> Originally Posted by *HyperMatrix*
> 
> I am incredibly attracted to this case...


I see it can be done with handcuffs...









I doubt he could've cracked the core itself. It's more likely one of the connections between components where he applied pressure...


----------



## eliau81

Quote:


> Originally Posted by *toncij*
> 
> I was not trying to insult you. I'm very serious - there are many eye diseases and defects that are not short or long sight. Motion, depth, color, etc. it may be broken.
> I see it can be done with handcuffs...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I doubt he could've cracked the core itself. It's rather one of the connects from component to component where he applied pressure...


I used a wrench and the PCB is probably damaged


----------



## Kyouki

Alright got my backplate in and installed!



Added a touch of vinyl!




Everything installed and ready for leak test.


----------



## hanzy

Kyouki, looks good man. Like the color and the way you applied the vinyl to the backplate.


----------



## Kyouki

Quote:


> Originally Posted by *hanzy*
> 
> Kyouki, looks good man. Like the color and the way you applied the vinyl to the backplate.


Thank you


----------



## carlhil2

Anyone try using a conductive adhesive tape for the shunt mod yet?
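For anyone weighing the risk, the arithmetic behind the mod is simple: putting a conductive path in parallel with the stock shunt lowers the resistance the card's controller senses, so it under-reports current draw and the power limiter kicks in later. A rough sketch of the math (the 5 mOhm stock shunt and the added-path resistance are assumed values for illustration, not measurements of the Titan XP):

```python
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

stock_shunt = 0.005  # ohms; typical 5 mOhm current-sense resistor (assumed)
added_path = 0.005   # ohms; hypothetical resistance of the tape/bridge

effective = parallel(stock_shunt, added_path)
# The controller computes current as V_drop / R_assumed, so the card
# now under-reads power by the ratio of effective to stock resistance.
reported_fraction = effective / stock_shunt
print(f"effective shunt: {effective * 1000:.2f} mOhm")
print(f"card reports {reported_fraction:.0%} of actual power draw")
```

With equal resistances the card would only see half its real draw, which is why people doing this mod keep a close eye on actual wall power and temperatures.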


----------



## HyperMatrix

Quote:


> Originally Posted by *Kyouki*
> 
> Alright got my backplate in and installed!
> 
> 
> 
> Added a touch of vinyl!
> Everything installed and ready for leak test.


Looks great man. Curious though. Doesn't the vinyl reduce heat dissipation?


----------



## Kyouki

Quote:


> Originally Posted by *HyperMatrix*
> 
> Looks great man. Curious though. Doesn't the vinyl reduce heat dissipation?


It could, but I doubt the difference in temps would be that noticeable. After I get it up and running we'll see where I'm coming in at and whether I need to make changes.


----------



## Stateless

Quote:


> Originally Posted by *Gary2015*
> 
> Wise move. Since this is my first all Nvidia release , not sure what they going to pull if I have a problem . EVGA I can depend upon.. Nvidia ???


When I had an issue with one of my Titan X Maxwells, Nvidia was very good about it. I was upfront and honest with them and told them that while I was removing the stock cooler, one of the screws was tightened down too much, and when I used a little force my screwdriver slipped and broke a part off the PCB. The card still worked, but I was unsure if it would work long term. They said no problem; I returned it, and within a few days of receiving it they refunded my money. At the time the cards were still fairly new and going in and out of stock, so they suggested I just get a refund and re-buy one when they became available. While I prefer to go with EVGA, I went Nvidia because at the time the Maxwell Titan X was only available from Nvidia, much like the Titan X Pascal.


----------



## Stateless

Quote:


> Originally Posted by *inoran81*
> 
> Joining this Titan x pascal owner club here with my 2016 rig... need to wait for my aqc waterblocks and other wc stuffs now... hope it will be ready to ship out ASAP...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> First of caselabs SM8 with peds
> 
> 
> 
> 
> 
> 
> 
> 
> 

Would you mind listing your parts list? I cannot make out which parts you are using. Looks like a fun build, however!!!


----------



## KillerBee33

Thinking of getting this Wblock and Dremel a way for extra Power connector, http://www.swiftech.com/KOMODONV-LEGTX1080.aspx


----------



## DNMock

Quote:


> Originally Posted by *toncij*
> 
> You're wrong in almost everything you wrote, just confusing things you don't completely understand and mixing misinterpreted facts and correlations with correct information. Probably unintentionally, you're presenting correct, but irrelevant and unrelated or semi-related information as arguments for your ideas, but that's incorrect.
> Don't get me wrong, but it is evident you have a medical problem with your eyes. Sincerely, get that checked.
> A typical screen is low resolution crap. A 22 megapixel photo looks stunningly better on an 5K display than a 1440 one. Of course, you can't see it because of your medical condition, but there's significant difference for average people.
> 
> Anyway. Someone's wrong on The Internet and I don't keep false hopes of proving anyone wrong or showing you "the truth". Just, really, get your eye sight tested. Telling difference from 144Hz to 165Hz is very hard, but if you can't tell an obvious stuttering on 30 FPS from 60, you have a serious problem of going almost completely motion-blind.


umm, if he was watching the videos on a 30 hz monitor, there probably isn't a difference between the two videos....

As a side note (this is 100% pure guess work with literally nothing to support it), but the contacts or glasses themselves may have enough distortion to them as to negate the visible stuttering.


----------



## HyperMatrix

I have a question for those running more than 2 cards right now. I'm finding 2 Titan XP's not to be enough for 165Hz 1440p. So before I go about selling my old Maxwell Titan X cards, I was wondering if I should keep one of them for use with DX12 multi-GPU. Is it possible to connect the 2 Titan XP's with the SLI bridge and use 2-card SLI for DX11 games, but then let DX12 take care of using all 3 under explicit multi-GPU?


----------



## HyperMatrix

Quote:


> Originally Posted by *DNMock*
> 
> umm, if he was watching the videos on a 30 hz monitor, there probably isn't a difference between the two videos....
> 
> As a side note (this is 100% pure guess work with literally nothing to support it), but the contacts or glasses themselves may have enough distortion to them as to negate the visible stuttering.


Also to contribute...I agree that it's harder to pick up the difference between 144Hz and 165Hz (although you can feel the more immediate responsiveness at 165Hz). But comparing 120Hz to 165Hz is huge and very noticeable. You can close all fps overlays, and even with G-SYNC, if my fps drops from 165 to 130, I can feel it.


----------



## inoran81

Quote:


> Originally Posted by *Stateless*
> 
> Quote:
> 
> 
> 
> Originally Posted by *inoran81*
> 
> Joining this Titan x pascal owner club here with my 2016 rig... need to wait for my aqc waterblocks and other wc stuffs now... hope it will be ready to ship out ASAP...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> First of caselabs SM8 with peds
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Would you mind listing your parts list. I cannot make out which parts you are using? Looks like a fun build however!!!
Click to expand...

thanks dude....

oops... i forgot to mention the parts in here

*Processor*
Intel i7-6950X Extreme
*Mainboard*
Rampage V Extreme Edition 10
*Graphics Card*
Nvidia Titan X Pascal (2-Way)
*Memory*
G.Skill TridentZ 3200 (16GBx8) 128GB
*Display*
Acer Predator XB321HK
*Storage*
Intel 750 1.2TB | Samsung 950 Pro 512GB
*Casing*
CaseLabs SM8 + Dual Peds (Two-Tones)
*PSU*
Corsair AX1500i
*Peripherals*
Steelseries Sensei Wireless Mouse | Steelseries Apex M800 Keyboard | Steelseries Dex Mousepad


----------



## HyperMatrix

Quote:


> Originally Posted by *HyperMatrix*
> 
> I have a question for those running more than 2 cards right now. I'm finding 2 Titan XP's to not be enough for 165Hz 1440p. So before I go about selling my old maxwell Titan x cards, I was wondering if I should keep one of them for use with dx12 multi gpu. Is it possible to just put 2 Titan XP's connected wth the sli bridge and use 2-card sli with dx11 games, but then just let dx12 take care of using all 3 under explicit multi gpu?


update: I dropped one of the old Maxwell Titan cards into my system and while it is detected, it isn't being used under DX12 multi-GPU. I can enable SLI between the 2 Pascal cards, and when running Tomb Raider, only the 2 Pascal cards show usage. If I plug the monitor into the Maxwell card, then only that card is used. From what I'd read about AotS even supporting cards from different vendors, I thought for sure I'd be able to use the old Maxwell Titan for a little extra juice. Sadly, that is not the case. Any info regarding this issue would be appreciated.

Turns out Tomb Raider's mGPU is homogeneous-only. I wonder if a 3rd Pascal Titan is in my future....since I doubt heterogeneous mGPU will catch on.


----------



## eliau81

I was thinking of contacting Nvidia, telling them the whole story, and hopefully they will help me.
Should I?
Can someone help me formulate a good email? I'm not from the US and putting the right words together isn't my strong side....


----------



## eliau81

Quote:


> Originally Posted by *Stateless*
> 
> When I had an issue with one of my Titan X Maxwell's Nvidia was very good about it. I was upfront and honest with them and told them that while I was removing the stock cooler, one of the screws were tightened down too much and when I used a little force, my screwdriver moved and broke a part off the PCB. The card still worked, but I was unsure if it would work long term. They said no problem, I returned it and within a few days of them receiving it they refunded my money. At the time, the cards were still fairly new and going in and out of stock, so they suggested to just get a refund and re-buy one when they became available. While I prefer to go with EVGA, I did go Nvidia because at the time the Maxwell Titan X were only from Nvidia much like the Titan X Pascal.


maybe i will try to contact them and be honest


----------



## Lobotomite430

My Titan Xp is on water now!


----------



## jcde7ago

Quote:


> Originally Posted by *HyperMatrix*
> 
> I wonder if a 3rd pascal titan is in my future....


DON'T DO IT!

Seriously though, unless you're a benchmark freak, wait to see how the DX12 landscape pans out with homogeneous mGPU...and even then, Nvidia's going to have to show that it's serious about supporting developers who choose to support more than 2-way GPU configs...if they're basically abandoning general support for 3/4-way SLI configs by not optimizing their own drivers for those setups (pretty obvious with how they approached the 'enthusiast key' situation and then even ditched that starting with the GTX 10xx series), then I'm skeptical about how developers are going to want to undertake that kind of challenge mostly on their own.

I went 3x Titan X Maxwell for this last build (in the middle of switching out to 2x TXPs now) and if there's one thing I learned over the last ~18 months of Tri-SLI gaming, it's that it is NOT an experience I want to undertake again...the micro-stuttering and mediocre support past 2-way SLI in most games is too much to handle. Heck, I went back and forth on 1 or 2 TXPs as I'm just playing on an X34, and a single TXP might be enough to do the job for most games @ 3440x1440p/100hz.

Anyway, even if you were hemorrhaging money I would still discourage anything past 2-way SLI for anyone that is looking for anything more than pretty benchmark numbers....the gaming performance just truly isn't there (and I really, REALLY wanted the performance to be there).


----------



## HyperMatrix

Quote:


> Originally Posted by *jcde7ago*
> 
> DON'T DO IT!
> 
> Seriously though, unless you're a benchmark freak, wait to see how the DX12 landscape pans out with homogeneous mGPU...and even then, Nvidia's going to have to show that it's serious about supporting developers who choose to support more than 2-way GPU configs...if they're basically abandoning general support for 3/4-way SLI configs by not optimizing their own drivers for those specs (pretty obvious with how they approached the 'enthusiast key' situation and then even ditched that) then i'm skeptical about how developers are going to want to undertake that kind of a challenge mostly on their own.
> 
> I went 3x Titan X Maxwell for this last build (in the middle of switching out to 2x TXPs now) and if there's one thing I learned over the last ~18 months of Tri-SLI gaming is that it is NOT an experience I want to undertake again...the micro-stuttering and mediocre support for most games past 2-way SLI is too much to handle. Heck, I went back and forth on 1 or 2 TXPs as i'm just playing on an X34 and a single TXP might be enough to do the job for most games @ 3440x1440p/100hz.
> 
> Anyway, even if you were hemorrhaging money I would still discourage anything past 2-way SLI for anyone that is looking for anything more than pretty benchmark numbers....the gaming performance just truly isn't there (and I really, REALLY wanted the performance to be there).


See, the only reason I'm possibly interested is because I saw how well Tomb Raider DX12 mGPU worked with 3 Maxwell Titans. Maxing them out completely for some epic fps. I just can't get 165fps reliably with 2 Pascal Titans. If I knew for sure all new games would support DX12 mGPU with 3 cards, I'd be all over it. But $1500 to have it work in just a couple of games, yeah, definitely wouldn't be worth it.


----------



## jcde7ago

Quote:


> Originally Posted by *HyperMatrix*
> 
> See, the only reason I'm possibly interests is because I saw how well tomb raider dx12 mGPU worked with 3 maxwell Titans. Maxing them out completely for some epic fps. I just can't get 165fps reliably with 2 pascal Titans. If I knew for sure all new games would support dx12 mGPU with 3 cards, I would be all over it. But $1500 to have it work in just a coupe games, yeah, definitely wouldn't be worth it.


Yeah for sure...if DX12 mgpu scaling is there...it's definitely going to be worth it...but it likely won't be there for anything but a single game or two every year it seems like, which is terrible value for anything past 2 cards.

GPU hardware is far outpacing graphical advances in game development, so it'll likely make more "sense" to keep upgrading to newer generation GPUs than to tack on more and more GPUs from the previous generation.

Money is likely not an issue for enthusiasts like us in this thread, as there's a certain level of masochism involved in upgrading components when MAYBE 1-2 games a year will even take advantage of them...but I'm never going back to more than a 2-way GPU config until developers/Nvidia prove it's actually a worthwhile investment....for me, it's been anything but that. I'll upgrade to 2x Titan X Voltas or whatever the high-end card is 18 months from now before I tack on a 3rd TXP, that's for sure.


----------



## Gary2015

Quote:


> Originally Posted by *inoran81*
> 
> thanks dude....
> 
> oops... i forgotten about mentioning the parts in here
> 
> *Processor*
> Intel i7-6950X Extreme
> *Mainboard*
> Rampage V Extreme Edition 10
> *Graphics Card*
> Nvidia Titan X Pascal (2-Way)
> *Memory*
> G.Skill TridentZ 3200 (16GBx8) 128GB
> *Display*
> Acer Predator XB321HK
> *Storage*
> Intel 750 1.2TB | Samsung 950 Pro 512GB
> *Casing*
> CaseLabs SM8 + Dual Peds (Two-Tones)
> *PSU*
> Corsair AX1500i
> *Peripherals*
> Steelseries Sensei Wireless Mouse | Steelseries Apex M800 Keyboard | Steelseries Dex Mousepad


G.Skill Ripjaws V has faster timings. 3000 will be faster than your Tridents.

The SM961 SSD is faster.


----------



## Gary2015

Quote:


> Originally Posted by *eliau81*
> 
> maybe i will try to contact them and be honest


I'm not disputing the validity of the OP's situation, but when I spoke to them they said any modifications will void the warranty. So I am quite surprised to hear they offered the OP a refund.


----------



## toncij

Quote:


> Originally Posted by *jcde7ago*
> 
> Yeah for sure...if DX12 mgpu scaling is there...it's definitely going to be worth it...but it likely won't be there for anything but a single game or two every year it seems like, which is terrible value for anything past 2 cards.
> 
> GPU hardware is far outpacing graphical advances in game development, so it'll likely make more "sense" to keep upgrading to newer generation GPUs than to tack on more and more GPUs from the previous generation.
> 
> Money is likely not an issue for enthusiasts like us in this thread, as there's a certain level of masochism involved to keep upgrading components even when MAYBE 1-2 games will even take advantage of the upgrades every year...but i'm never going back to more than a 2-way GPU config. until developers/Nvidia can prove that it's actually a worthwhile investment....for me, it's been anything but that. I'll upgrade to 2x Titan X Voltas or whatever else is the high-end card 18 months from now before I tack on a 3rd TXP, that's for sure.


"GPU hardware is far outpacing graphical advances"

I disagree with this specific part. Monitors are advancing; GPUs are stalling and late to the party.
Graphics is advancing, but there are no GPUs that can handle it. For most games a single TXP is not enough at 4K or high-refresh 1440p.
Games next year will not be limited by the API; DX12 and Vulkan are here.

What concerns me is that developers (I'm one and I know many) mostly don't feel like bothering with multi-GPU in DX12/Vulkan (it's not even possible in Vulkan yet, and may not be until Vulkan 1.2).
I'd do it, but usually it's not up to one person, but a budget.


----------



## eliau81

Quote:


> Originally Posted by *Gary2015*
> 
> I'm not disputing the validity of the OPs situation but when I spoke to them they said any modifications will void warranty. So am I quite surprised to hear they offered the OP a refund.


Just finished chatting with Nvidia and was told that he has to check this with the RMA team, and that basically this voids the warranty. Then he gave me an email address to send all kinds of details to.
So that's done; now waiting with fingers crossed.


----------



## jcde7ago

Quote:


> Originally Posted by *toncij*
> 
> "GPU hardware is far outpacing graphical advances"
> 
> I disagree with this specific part. Monitors are advancing, GPUs are stalling and late to party.
> Graphics is advancing but there are no GPUs that can handle it. For most games a single TXP is not enough at 4K or high-refresh 1440p.
> Games next year will not be limited with the API, DX12 and Vulkan are here.
> 
> What concerns me is that developers (I'm one and I know many) mostly don't feel like bothering about multi-GPU in DX12/Vulkan (it's not even possible in Vulkan yet and it may not be up to Vulkan 1.2).
> I'd do it, but usually it's not up to one person, but a budget.


Eh, we can agree to disagree then.

4K has been around for a *few* years now, and yeah, it's still a hard resolution to drive, but primarily due to the sheer millions of pixels involved and not necessarily due to a game's graphical fidelity.

The vast majority of games are console ports/games that are multiplatform that aren't focused solely on milking every tier of PC hardware.

"Graphics is advancing but there are no GPUs that can handle it." Really? Can you even put together a list of more than ~10 overwhelmingly popular games that can't be run on 1440p @ 60hz? Because that's what the "new standard" is for probably 98% of PC gamers...the rest of the PC gaming population is lumped into the [email protected], 3440x1440p @ 100hz and 4k @ 60hz crowd.

You're also underestimating game developers' willingness to take advantage of DX12 and Vulcan; RoTR had "okay" DX12 improvements, nothing too spectacular, and we have yet to see how other games like Deus Ex: Mankind Divided will fare.

Gaming as a whole is a money-first business like everything else, so if you think that developers are going to come out of the woodwork to take advantage of DX12 and Vulcan so that the top 2% of PC gamers with $1,200 GPUs and $1-2K CPUs are getting the absolute BEST graphical fidelity at the most optimal and efficient performance at the best resolutions and framerates....lol....prepare to be disappointed.

We're lucky to get developers putting in the time for DEDICATED PC games that are designed and developed with PCs in mind....I mean, really, how many games coming out besides Star Citizen can you actually say are being built to advance graphics technology and take full advantage of PCs? And even then, DX12 is still just in the exploratory phase with Star Citizen...they've already mentioned it will come after launch and will likely require a code revamp for most of the things they've done (since they're turned CryEngine inside out basically).

I appreciate you trying to justify our hardware but when 98% of PC gamers are in a much different places than those of us running TXPs, then pretty much most games are ALWAYS going to be behind the hardware curve.


----------



## toncij

Quote:


> Originally Posted by *jcde7ago*
> 
> Eh, we can agree to disagree then.
> 
> 4K has been around for a *few* years now, and yeah, it's still a hard resolution to drive, but primarily due to the sheer millions of pixels involved and not necessarily due to a game's graphical fidelity.
> 
> The vast majority of games are console ports/games that are multiplatform that aren't focused solely on milking every tier of PC hardware.
> 
> "Graphics is advancing but there are no GPUs that can handle it." Really? Can you even put together a list of more than ~10 overwhelmingly popular games that can't be run on 1440p @ 60hz? Because that's what the "new standard" is for probably 98% of PC gamers...the rest of the PC gaming population is lumped into the [email protected], 3440x1440p @ 100hz and 4k @ 60hz crowd.
> 
> You're also underestimating game developers' willingness to take advantage of DX12 and Vulcan; RoTR had "okay" DX12 improvements, nothing too spectacular, and we have yet to see how other games like Deus Ex: Mankind Divided will fare.
> 
> Gaming as a whole is a money-first business like everything else, so if you think that developers are going to come out of the woodwork to take advantage of DX12 and Vulcan so that the top 2% of PC gamers with $1,200 GPUs and $1-2K CPUs are getting the absolute BEST graphical fidelity at the most optimal and efficient performance at the best resolutions and framerates....lol....prepare to be disappointed.
> 
> We're lucky to get developers putting in the time for DEDICATED PC games that are designed and developed with PCs in mind....I mean, really, how many games coming out besides Star Citizen can you actually say are being built to advance graphics technology and take full advantage of PCs? And even then, DX12 is still just in the exploratory phase with Star Citizen...they've already mentioned it will come after launch and will likely require a code revamp for most of the things they've done (since they're turned CryEngine inside out basically).
> 
> I appreciate you trying to justify our hardware but when 98% of PC gamers are in a much different places than those of us running TXPs, then pretty much most games are ALWAYS going to be behind the hardware curve.


I have no problem with someone disagreeing.







Sure, go ahead, but: the sheer millions of pixels are not the real problem. I can make you a game-looking demo that runs 2000 FPS at 4K. It is a combination. You can today run some Blizzard games in 5K maxed out on a Titan X Maxwell, but then running SW: Battlefront at 4K is hard for a TitanXP. Higher-refresh 1440p? With what card? A 1080 can do it... I think. But not 144 or 165Hz at 1440p.

You read my comment wrong. I never commented on DX12/Vulkan usage, but on multi-GPU usage. It is a very different thing. The DeusEx DX12 patch came; I have yet to see if it fixed the DX11 problems it had (API inefficiencies). DX12 and Vulkan are coming fast and will take over AAA games in a very short time. Also, DX12 and Vulkan are *not* about expensive GPUs. A $200 RX 480 and cheaper cards benefit from them. Both APIs are supported by most of the old and new hardware in use. It has nothing to do with $1200 GPUs.
Now that you mention putting time into dedicated PC games: yes, DX12 and Vulkan are actually making it drastically easier to port console games to PC and to write cross-platform games.

Regarding Star Citizen, I suggest you refund and take your money elsewhere. I think the game will never be released, at least not nearly as described. It will most probably end in a class action lawsuit. Don't hold your breath.


----------



## GosuPl

SLI TX Pascall vs SLI TX Maxwell.

Witcher 3 all maxed - 1080p/1440p/4k


----------



## piee

TXP arrived yesterday. Stock vs +160 core OC / 120% power / 100% fan (no mem OC):

Stock: 49C @ 1936 | OC: 49-54C @ 2112(?)
Stock: 62C @ 1987 | OC: 53C @ 2012
Stock: 66C @ 1974 | OC: 54-68C @ 1774
Stock: 71-72C @ 1797 | OC: 60C @ 1987

Noticed that after leaving BF4 the idle clock stayed at 1419 with memory at 5000; after a restart it was 139 core / 405 memory. I like low idle.


----------



## piee

Those are two columns: first stock, then the OC run with 100% fan and no memory OC.


----------



## TurricanM3

Are we still unable to flash the card with a mod bios?
No updated nvflash out there?
5.306 doesn't work?


----------



## cisco0623

Quote:


> Originally Posted by *TurricanM3*
> 
> Are we still unable to flash the card with a mod bios?
> No updated nvflash out there?
> 5.306 doesn't work?


I haven't come across anything yet. My guess is Nvidia won't update it, but someone smart will figure it out!


----------



## Jpmboy

Quote:


> Originally Posted by *carlhil2*
> 
> Anyone try using a conductive adhesive tape for the shunt mod yet?


Which tape are you thinking of?
Quote:


> Originally Posted by *Gary2015*
> 
> *Gskill ripjaws v has faster timings . 3000 will be faster than your Tridents.*
> 
> Sm 961 ssd is faster.


lol - where did you get this idea from?? The R5E10 likes a full house - 8 sticks - due to the T-topology in the memory trace sublayer. There are no optimized settings for 3000 on X99 that will outperform optimized 3200 on the same platform, for two reasons: 1) X99 quad channel is bandwidth oriented, 2) the memory divider for 3200 is the strongest on that platform. With his 6950X and R5E10, 3400c14 or c13 may be possible depending on the CPU's IMC.
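To put rough numbers on why a tuned 3200 kit shouldn't lose to a 3000 kit: first-word latency in nanoseconds is CAS x tCK, and tCK = 2000 / (MT/s) for DDR, so the higher clock cancels an equal-proportion CAS penalty while bandwidth goes up regardless. A quick sketch (the CL values below are illustrative, not the actual timings of either kit mentioned):

```python
def first_word_latency_ns(mt_per_s: int, cas: int) -> float:
    """Approximate CAS latency in ns for DDR memory.
    tCK = 2000 / MT/s ns, since DDR transfers twice per clock cycle."""
    return cas * 2000 / mt_per_s

# Hypothetical timing combinations for comparison
for rate, cl in [(3000, 14), (3200, 14), (3200, 16)]:
    print(f"DDR4-{rate} CL{cl}: {first_word_latency_ns(rate, cl):.2f} ns")
```

At equal CAS, 3200 is strictly lower latency than 3000 (8.75 ns vs 9.33 ns at CL14 here) and carries more bandwidth on top, which is the paper version of the point above.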
Quote:


> Originally Posted by *TurricanM3*
> 
> Are we still unable to flash the card with a mod bios?
> No updated nvflash out there?
> 5.306 doesn't work?


Can't read the BIOS and can't modify it yet. NV has locked us out.
Quote:


> Originally Posted by *jcde7ago*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Eh, we can agree to disagree then.
> 
> 4K has been around for a *few* years now, and yeah, it's still a hard resolution to drive, but primarily due to the sheer millions of pixels involved and not necessarily due to a game's graphical fidelity.
> 
> The vast majority of games are console ports/games that are multiplatform that aren't focused solely on milking every tier of PC hardware.
> 
> "Graphics is advancing but there are no GPUs that can handle it." Really? Can you even put together a list of more than ~10 overwhelmingly popular games that can't be run on 1440p @ 60hz? Because that's what the "new standard" is for probably 98% of PC gamers...the rest of the PC gaming population is lumped into the [email protected], 3440x1440p @ 100hz and 4k @ 60hz crowd.
> 
> You're also underestimating game developers' willingness to take advantage of DX12 and Vulcan; RoTR had "okay" DX12 improvements, nothing too spectacular, and we have yet to see how other games like Deus Ex: Mankind Divided will fare.
> 
> Gaming as a whole is a money-first business like everything else, so if you think that developers are going to come out of the woodwork to take advantage of DX12 and Vulcan so that the top 2% of PC gamers with $1,200 GPUs and $1-2K CPUs are getting the absolute BEST graphical fidelity at the most optimal and efficient performance at the best resolutions and framerates....lol....prepare to be disappointed.
> 
> We're lucky to get developers putting in the time for DEDICATED PC games that are designed and developed with PCs in mind....I mean, really, how many games coming out besides Star Citizen can you actually say are being built to advance graphics technology and take full advantage of PCs? And even then, DX12 is still just in the exploratory phase with Star Citizen...they've already mentioned it will come after launch and will likely require a code revamp for most of the things they've done (since they're turned CryEngine inside out basically).
> 
> 
> I appreciate you trying to justify our hardware but when 98% of PC gamers are in a much different places than those of us running TXPs, *then pretty much most games are ALWAYS going to be behind the hardware curve*.


And when it reverses, the consumer/market revolts and will not buy games that can't be played on mainstream hardware. Enthusiasts buy halo products... for the halo.


----------



## DADDYDC650

No competition from AMD for another 6-10 months. No wonder the Titan XP costs $1200.

http://www.overclock.net/t/1609897/amd-investor-presentation-august-2016-vega-still-6-10-months-away#post_25471497


----------



## MikeSanders

Some overclockers have a Titan X Pascal nvflash version, but no one has shared it yet. =(


----------



## dante`afk

Quote:


> Originally Posted by *GosuPl*
> 
> SLI TX Pascall vs SLI TX Maxwell.
> 
> Witcher 3 all maxed - 1080p/1440p/4k


Great comparison; it nicely shows how bad the microstutter still is for the Maxwell generation, and how it's far less pronounced on Pascal, though still not gone.


----------



## st0necold

Quote:


> Originally Posted by *HyperMatrix*
> 
> I have a question for those running more than 2 cards right now. I'm finding 2 Titan XP's to not be enough for 165Hz 1440p. So before I go about selling my old maxwell Titan x cards, I was wondering if I should keep one of them for use with dx12 multi gpu. Is it possible to just put 2 Titan XP's connected wth the sli bridge and use 2-card sli with dx11 games, but then just let dx12 take care of using all 3 under explicit multi gpu?


Are you sure you are not doing something wrong?

I have 2 980ti's and max every single game at 1440p...


----------



## Jpmboy

Quote:


> Originally Posted by *MikeSanders*
> 
> Some overclockers have a titan x pascal nvflash version. But no one shared it yet. =(


----------



## cisco0623

Quote:


> Originally Posted by *Jpmboy*


For real? Lame lol


----------



## DNMock

Quote:


> Originally Posted by *Jpmboy*


----------



## Nunzi

wow almost got excited ..........lol


----------



## DNMock

Quote:


> Originally Posted by *Nunzi*
> 
> wow almost got excited ..........lol


Actually I wouldn't be surprised if JPM was helping work out the kinks in the code, alpha, or beta testing something. I was just razzing him a bit.

edit:

As a side note, has anyone tried actually slightly reducing the voltage on their cards to avoid the power limit wall? I wonder if I could drop them down to 1.05, set the memory to stock clocks and get a higher core clock speed that way without being curbstomped by PL.
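For a rough sense of the tradeoff being asked about here, a back-of-the-envelope sketch (not a measurement; it assumes the common first-order model that dynamic GPU power scales with frequency times voltage squared, and the 1900MHz / 1.093v baseline figures are made-up placeholders, not real TXP readings):

```python
# First-order power model: P ~ f * V^2 (dynamic power only; real cards
# also have static leakage, so treat these numbers as illustrative).
# The baseline operating point below is a placeholder, not a measured value.

def relative_power(freq_mhz, volts, base_freq=1900.0, base_volts=1.093):
    """Estimated power draw relative to the baseline operating point."""
    return (freq_mhz / base_freq) * (volts / base_volts) ** 2

def max_freq_at_same_power(volts, base_freq=1900.0, base_volts=1.093):
    """Highest clock that fits the baseline power envelope at a given voltage."""
    return base_freq * (base_volts / volts) ** 2

# Dropping from 1.093v to 1.05v frees roughly 8% of the power budget,
# which this model would let you spend on ~150MHz of extra core clock:
print(round(relative_power(1900, 1.05), 3))
print(round(max_freq_at_same_power(1.05)))
```

By this crude model, undervolting does buy clock headroom under a fixed power limit; whether the silicon is actually stable at the lower voltage is a separate question.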


----------



## markklok

After half a day of testing, my Titan came up with the following results.

Gaming / Heaven stable:

Core 2038MHz (+178)
Mem 5580MHz (+575)
AB settings


Spoiler: Warning: Spoiler!







3DMark Ultra

Graphics score of 7805
Time Spy

Graphics score of 10354



Spoiler: Warning: Spoiler!







Temperature is max 45°C, but that's because it's freaking hot in my room (attic) and there's no AC.


Not a golden ticket, but it's OK, right?


----------



## Yuhfhrh

Quote:


> Originally Posted by *Jpmboy*


Triggered!!!


----------



## Nunzi

Quote:


> Originally Posted by *DNMock*
> 
> Actually I wouldn't be surprised if JPM was helping work out the kinks in the code, alpha, or beta testing something. I was just razzing him a bit.
> 
> yeah he's a good guy ....


----------



## Jpmboy

Quote:


> Originally Posted by *DNMock*
> 
> Actually I wouldn't be surprised if JPM was helping work out the kinks in the code, alpha, or beta testing something. I was just razzing him a bit.
> 
> edit:
> 
> As a side note, has anyone tried actually slightly reducing the voltage on their cards to avoid the power limit wall? I wonder if I could drop them down to 1.05, set the memory to stock clocks and get a higher core clock speed that way without being curbstomped by PL.


yeah - lets not get our hopes up yet.
unfortunately, nvflash is only half the solution... we've had a working nvflash for the 1080 for a long time, but no bios editor for the 1080 afaik.


----------



## Gary2015

Quote:


> Originally Posted by *Jpmboy*
> 
> which tape you thinking of?
> lol - where did you get this idea from ?? The R5E10 likes a full house - 8 sticks - due to T-topology in the memory trace sublayer. There are no optimized settings for 3000 on x99 that will out perform optimized 3200 on the same platform for two reasons... 1) x99 quad channel is bandwidth oriented, 2) the memory divider for 3200 is the strongest on that platform. With his 6950X and R5E10, 3400c14, or c13 may be possible depending on the CPU's IMC.
> can;t read the bios and can't modify it yet. NV has locked us out.
> and when it reverses the consumer/market revolts and will not buy games that can't be played on mainstream hardware. Enthusiasts buy halo products... for the halo.


Lololol
Typo, I meant 3200 Ripjaws.


----------



## jcde7ago

Quote:


> Originally Posted by *toncij*
> 
> Regarding Star Citizen, I suggest you refund and take your money elsewhere. I think the game will never be released, at least not nearly as described. It will most probably end in a class action lawsuit. Don't hold your breath.


Lol...and I think this is for sure where we part ways, friend.








Quote:


> Originally Posted by *Jpmboy*
> 
> and when it reverses the consumer/market revolts and will not buy games that can't be played on mainstream hardware. Enthusiasts buy halo products... for the halo.


I would love for it to reverse but sadly most studios with the budget to truly take advantage of PC gaming hardware are the ones who want to put out games with as minimal effort as possible and market the crap out of it .....and most gamers will eat it up...too many people still pre-order, etc.


----------



## inoran81

Quote:


> Originally Posted by *Jpmboy*
> 
> which tape you thinking of?
> *lol - where did you get this idea from ?? The R5E10 likes a full house - 8 sticks - due to T-topology in the memory trace sublayer. There are no optimized settings for 3000 on x99 that will out perform optimized 3200 on the same platform for two reasons... 1) x99 quad channel is bandwidth oriented, 2) the memory divider for 3200 is the strongest on that platform. With his 6950X and R5E10, 3400c14, or c13 may be possible depending on the CPU's IMC.*
> can;t read the bios and can't modify it yet. NV has locked us out.
> and when it reverses the consumer/market revolts and will not buy games that can't be played on mainstream hardware. Enthusiasts buy halo products... for the halo.


finally someone who knows his stuff and doesn't bull his way through.... thanks for the detailed explanation on this... seems like we are on the same page...


----------



## inoran81

Quote:


> Originally Posted by *Gary2015*
> 
> Lololol
> Typo I meant 3200 ripjaws.


Nice try flipping your claims...lol

Though I'm really curious: when did G.Skill come out with a Ripjaws V *3200* 128GB kit?









http://www.gskill.com/en/finder?cat=31&prop_2=128GB+%2816GBx8%29&prop_3=0&prop_4=0&series=2481&prop_6=Quad+Channel+Kit


----------



## Jpmboy

Quote:


> Originally Posted by *inoran81*
> 
> nice try to flip your claims...lol
> 
> though i'm really curious when g.skills came out with Ripjaws V *3200* 128GB kit?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.gskill.com/en/finder?cat=31&prop_2=128GB+%2816GBx8%29&prop_3=0&prop_4=0&series=2481&prop_6=Quad+Channel+Kit


I've heard to expect them this fall, with 3333c13 leading the way... but frankly, with 128GB of RAM, unless you use lots of VMs or RAM disks, 90% of the capacity goes unaddressed (and it's not much better with 64GB







). But for VMs and the like, 128GB is an amazing amount for a daily-driver home rig!








http://www.anandtech.com/show/10582/gskill-shows-off-trident-z-kits


----------



## inoran81

Quote:


> Originally Posted by *Jpmboy*
> 
> I heard to expect them this fall, with 3466c13 leading the way... but frankly, with 128GB of RAM unless you use lots of VMs or ram disks, 90% of the capacity goes unaddressed (and not much better with 64GB
> 
> 
> 
> 
> 
> 
> 
> ). But, for VMs and the like, 128GB is an amazing amount for a day/driver - home rig!


haha... I'd rather wait for the SM961 or 960 Pro 1TB to be easily available for us consumers than wait for faster RAM...









but if the upcoming 3466c13 can make my system fly... I don't mind buying another set....

I already got a set of Ripjaws 4 2800 64GB when it initially launched early last year....











you're right, I will be using a type-2 hypervisor on this rig as my home test lab.... not sure if anyone has tried running ESXi 6 on the R5E10 yet.... maybe I can give that a try as well...


----------



## Jpmboy

Quote:


> Originally Posted by *inoran81*
> 
> haha... i rather wait for SM961 or 960 Pro 1TB to be easily available for us consumers soon than to wait for faster RAM...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> but if the upcoming 3466c13 can make my system fly... i don't mind buying another set....
> 
> I already got a set of Ripjaw 4 2800 64GB when its initially launched early last year....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> you're right, i will be using type-2 hypervisor on this rig as my home test lab.... not sure if anyone did try running esxi 6 on R5E10 yet.... maybe can give that a try as well...


3200c14s at 3400c13


----------



## inoran81

Quote:


> Originally Posted by *Jpmboy*
> 
> 3200c14s at 3400c13


nice









you running on 128GB as well?


----------



## scgeek12

Just bought 2







should be here on the 1st; got a pair of EK blocks and backplates to go with them


----------



## toncij

Quote:


> Originally Posted by *Jpmboy*
> 
> yeah - lets not get our hopes up yet.
> unfortunately, nvflash is only half the solution... we've had a working nvflash for the 1080 for a long time, but no bios editor for the 1080 afaik.


Quote:


> Originally Posted by *Jpmboy*
> 
> which tape you thinking of?
> lol - where did you get this idea from ?? The R5E10 likes a full house - 8 sticks - due to T-topology in the memory trace sublayer. There are no optimized settings for 3000 on x99 that will out perform optimized 3200 on the same platform for two reasons... 1) x99 quad channel is bandwidth oriented, 2) the memory divider for 3200 is the strongest on that platform. With his 6950X and R5E10, 3400c14, or c13 may be possible depending on the CPU's IMC.
> can;t read the bios and can't modify it yet. NV has locked us out.
> and when it reverses the consumer/market revolts and will not buy games that can't be played on mainstream hardware. Enthusiasts buy halo products... for the halo.


Haven't tried, but - is it possible to remove the BIOS chip and read & reprogram it externally? Would that work?


----------



## HyperMatrix

Quote:


> Originally Posted by *st0necold*
> 
> Are you sure you are not doing something wrong?
> 
> I have 2 980ti's and max every single game at 1440p...


You're not using high enough game settings. Or missed the part about 165Hz.


----------



## Jpmboy

Quote:


> Originally Posted by *inoran81*
> 
> nice
> 
> 
> 
> 
> 
> 
> 
> 
> 
> you running on 128GB as well?


_only_ 64GB.








Quote:


> Originally Posted by *toncij*
> 
> Haven't tried, but - is it possible to remove the BIOS chip and read & reprogram it externally? Would that work?


whoa... that's reflow-bench territory. (it's not a socketed BIOS chip)


----------



## toncij

Quote:


> Originally Posted by *Jpmboy*
> 
> _only_ 64GB.
> 
> 
> 
> 
> 
> 
> 
> 
> whoa... that's flow bench territory. (it's not a socketed bios chip)


Just asking. Never bothered to even look.









How is reading blocked?


----------



## cisco0623

Quote:


> Originally Posted by *scgeek12*
> 
> Just bought 2
> 
> 
> 
> 
> 
> 
> 
> should be here on the 1st, got a pair ok EK blocks and backplates to go with them


Congrats! Post some pics when you have it all setup.


----------



## scgeek12

Quote:


> Originally Posted by *cisco0623*
> 
> Congrats! Post some pics when you have it all setup.


Gladly







just shipped out my Maxwell Titan X Hydro Coppers that I sold on eBay to help fund these lol, sad to see those cards go but something tells me it will be well worth it!


----------



## Stateless

Quote:


> Originally Posted by *scgeek12*
> 
> Gladly
> 
> 
> 
> 
> 
> 
> 
> just shipped out my maxwell Titan X hydro coppers that I sold on eBay to help fund these lol, sad to see those cards go but something tells me it will be well worth it!


How much did you get for them? I am going to be putting my Titan Maxwells up soon. I have EK Water Blocks and Backplates, the box and the original cooler with both of my cards, but have been unsure at what price to put them up at? Thanks for any input you can provide.


----------



## Woundingchaney

Quote:


> Originally Posted by *Stateless*
> 
> How much did you get for them? I am going to be putting my Titan Maxwells up soon. I have EK Water Blocks and Backplates, the box and the original cooler with both of my cards, but have been unsure at what price to put them up at? Thanks for any input you can provide.


I sold mine for 550 each (buy now) and they sold in literally 15 minutes. I would imagine I could have gotten more.


----------



## scgeek12

Quote:


> Originally Posted by *Stateless*
> 
> How much did you get for them? I am going to be putting my Titan Maxwells up soon. I have EK Water Blocks and Backplates, the box and the original cooler with both of my cards, but have been unsure at what price to put them up at? Thanks for any input you can provide.


$1200 for both and they sold in like 6 hours


----------



## Stateless

Quote:


> Originally Posted by *scgeek12*
> 
> $1200 for both and they sold in like 6 hours


Thanks. Did you sell them together? I am going to post mine tonight, but not sure if I should do them individually or as a set?


----------



## scgeek12

Quote:


> Originally Posted by *Stateless*
> 
> Thanks. Did you sell them together? I am going to post mine tonight, but not sure if I should do them individually or as a set?


I posted just 1 and said I have 2 and would accept offers on them together


----------



## Fiercy

Quote:


> Originally Posted by *Woundingchaney*
> 
> I sold mine for 550 each (buy now) and they sold in literally 15 minutes. I would imagine I could have gotten more.


I sold mine a week ago for 750


----------



## Stateless

Quote:


> Originally Posted by *Fiercy*
> 
> I sold mine a week ago for 750


WOW. I have EK Water Block Full Cover block and Back plate, the original Box and even the original Titan X Air Cooler. I wonder if I should list them for $750 or so as well? Is there more value if they have the box and air cooler in your opinion?


----------



## Gary2015

Quote:


> Originally Posted by *scgeek12*
> 
> $1200 for both and they sold in like 6 hours


I got $850 for one of mine a few weeks ago on fleabay.


----------



## HyperMatrix

I need to get around to selling mine. Wondering if anyone would even want my cards. Full cover Aqua Computer block + Active cooled backplate. But they're voltage hardmodded which I'm guessing might turn some people off, even if the cards are capable of 1.5GHz+ with proper cooling.


----------



## xarot

Anyone tried Linux (Ubuntu or Mint) with Titan X Pascal?

I have had a lot of headaches. No success yet installing either distro with the NVIDIA drivers. I can't recall a Linux distro install being this troublesome even 10 years ago. Either Ubuntu hates my TXP or my secondary platform (RIVE BE) does.


----------



## carlhil2

Quote:


> Originally Posted by *Jpmboy*
> 
> which tape you thinking of?
> lol - where did you get this idea from ?? The R5E10 likes a full house - 8 sticks - due to T-topology in the memory trace sublayer. There are no optimized settings for 3000 on x99 that will out perform optimized 3200 on the same platform for two reasons... 1) x99 quad channel is bandwidth oriented, 2) the memory divider for 3200 is the strongest on that platform. With his 6950X and R5E10, 3400c14, or c13 may be possible depending on the CPU's IMC.
> can;t read the bios and can't modify it yet. NV has locked us out.
> and when it reverses the consumer/market revolts and will not buy games that can't be played on mainstream hardware. Enthusiasts buy halo products... for the halo.


Some aluminum.... I just ordered some Thermal Grizzly Conductonaut; I'll go with that and cover the area around the shunts with electrical tape or liquid tape...


----------



## tonnytech

Does anyone here play Project CARS? It seems to still be suffering from the VR bug on my Titan Pascal; the Titan will not engage max boost clocks despite my trying a number of things.


----------



## skypine27

Quote:


> Originally Posted by *HyperMatrix*
> 
> I need to get around to selling mine. Wondering if anyone would even want my cards. Full cover Aqua Computer block + Active cooled backplate. But they're voltage hardmodded which I'm guessing might turn some people off, even if the cards are capable of 1.5GHz+ with proper cooling.


List them on eBay individually and offer free shipping to the USA.

I just sold my 2 x Titan X (Maxwells) w/ EK blocks and EK backplates to the same buyer. (I listed them as separate auctions and the same guy bought both of them.) He paid 590 USD for one and 595 for the other. Yes, he paid 1,185 USD for the two cards, and I don't even have the OEM coolers or boxes; threw all that stuff out.

People sometimes have really weird reasons for buying what they buy. Obviously this guy isn't a gamer; any gamer knows that for much less than 1,185 dollars he could have bought one card, or even two, that would blow away the gaming performance of two original Titan X's in SLI. Maybe he's a benchmark guy who already has 2 x water-cooled Titan X's and wants 4-way SLI? Who knows. I don't ask questions.

Just list your items very very honestly and you'll be surprised at what you can get back on Ebay.


----------



## MikeSanders

Can you tell us the nvflash version at least, if you cannot share it?
Quote:


> Originally Posted by *Jpmboy*
> 
> yeah - lets not get our hopes up yet.
> unfortunately, nvflash is only half the solution...


----------



## Jpmboy

^^ nope.. there is no version. The version that works for the 1080 series does not address the lockout. Best guy to ask about this is JoeDirt.
Quote:


> Originally Posted by *xarot*
> 
> Anyone tried Linux (Ubuntu or Mint) with Titan X Pascal?
> 
> I have had a lot of headaches. No success yet in installing either distros with nvidia drivers. I can not recall getting a Linux distro installed was this troublesome 10 years ago. Either Ubuntu hates my TXP or my secondary platform (RIVE BE).


The only way I've found is to use an older Maxwell card, load the NV drivers (Pascal-compatible) on it, and then swap in the TXP. Works with Linux Mint, anyway.


----------



## MikeSanders

That's wrong. There is a working version. Seems like you also don't have it. =(


----------



## piee

Looks like there is a market for Titan X Maxwells w/ waterblock. I'll be selling my 980 Ti Classified (81.5) w/ EK block this week on eBay; boosts 1445 stock, OC 1545 easy. Got a golden TXP: stock [email protected], [email protected]@74c, 0 mem, may check for more OC. Happy camper.


----------



## Baasha

Let the matter be settled once and for all!


----------



## jcde7ago

Quote:


> Originally Posted by *Gary2015*
> 
> I got $850 for one of mine a few weeks ago on fleabay.


Wow, very nice!

My 3x Titan XMs will be going on fleabay in a couple of weeks (still waiting on those darned EK waterblocks and backplates to get in before I redo this loop).

If I can get a minimum of ~$650 each I'll be an extremely happy camper (they have EK blocks/backplates + original box and cooler etc). Most likely I will list around $700-750 with a 'best offer' option and see how they do... I might even sell just two and keep one as a backup card... but doubtful.


----------



## Artah

Quote:


> Originally Posted by *Baasha*
> 
> Let the matter be settled once and for all!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


That's sick dude but he didn't mention at all how he did it, what a bummer.


----------



## Jpmboy

Quote:


> Originally Posted by *MikeSanders*
> 
> Thats wrong. There is a working version. Seems like you also don't have it. =(


lol - wrong.. but works right now.








The native drivers in the most recent Linux Mint will not work with the TXP. I just loaded the proper NV driver with a different card in the rig. I only use it for Google Stressapp Test, so I'm not really interested in looking for the "correct" way.
Good hunting.


----------



## kx11

Anyone know what I need to watercool one Titan X with a Predator 360 AIO? Specific names/brands please.


----------



## HyperMatrix

Quote:


> Originally Posted by *Baasha*
> 
> Let the matter be settled once and for all!


So I just finished watching some dude blow himself for 7 minutes straight on YouTube.

Hope more details come out. I'd love to add a 3rd card to my rig to max out 1440p/165Hz.


----------



## toncij

Quote:


> Originally Posted by *HyperMatrix*
> 
> So I just finished watching some dude blow himself for 7 minutes straight on YouTube.
> 
> Hope more details come out. I'd love to add a 3rd card to my rig to max out 1440p/165Hz.


I'm guessing he had modified profiles to make the driver think GTA5 is 3DMark.







NVidia allows 4-way for benchmarks.









The scaling, on the other hand, is really, really nice. Now, one should wonder how he sorted out the bridges, since for 4-way we're missing pins, and only 2 don't feel like enough for 8K (which I think is actually what makes his FPS drop).


----------



## HyperMatrix

Quote:


> Originally Posted by *toncij*
> 
> I'm guessing he had modified profiles to make the driver think GTA5 is 3DMark.
> 
> 
> 
> 
> 
> 
> 
> NVidia allows 4-way for benchmarks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The scaling, on the other hand, is really, really nice. Now, one should wonder how did he sort out the bridges since for 4-way we're missing pins and only 2 pins don't feel enough for 8K (which I think makes his FPS drop actually).


It's so silly that Nvidia blocks it in the driver, as opposed to just saying it's not officially supported and your mileage may vary.


----------



## shapin

Baasha, would you mind sharing your knowledge with us?
How did you make it work?


----------



## Mike211

Yes, Monster Baasha, share with us. LoL


----------



## st0necold

Quote:


> Originally Posted by *HyperMatrix*
> 
> So I just finished watching some dude blow himself for 7 minutes straight on YouTube.
> 
> Hope more details come out. I'd love to add a 3rd card to my rig to max out 1440p/165Hz.


Dude, 2 980 Ti's can max every single game at 1440p..

If you are having issues I'd definitely see what's going on; a single 980 Ti can hold damn near 144 by itself (980 Ti Classified).

2 TXPs should have you well over 200fps in all newer games.. something is wrong with your setup.


----------



## HyperMatrix

Quote:


> Originally Posted by *st0necold*
> 
> Dude 2 980ti's can max every single game at 1440p..
> 
> If you are having issues i'd definitely see whats going on a single 980ti can hold damn near 144 by itself (980ti classified)
> 
> 2 TXP's should have you well over 200fps in all newer games.. something is wrong with your setup.


You play games on low settings.


----------



## st0necold

Quote:


> Originally Posted by *HyperMatrix*
> 
> You're not using high enough game settings. Or missed the part about 165Hz.


Battlefield 4, I hold a steady 200fps... ULTRA.

Can you please tell me the games you're playing? Saying you can't max out 1440p with 2 Titan XPs is sort of nuts..


----------



## bujao

The Titan XM is a great card for 3D rendering and Photoscan.
Many people are taking the opportunity to finally own one (or two) of them.

I'm one of those, but I went for the Titan X. For now I own a GTX 1080, a 970, and my Titan X is coming this week. But I really thought of buying the Titans instead of the GTX 1080.
Memory size is key, as the entire scene (including textures) must fit in GPU memory.

Titan XM = 12GB. GTX 1080 = 8GB.
So the choices, for a similar price, are two Titan XMs or one Titan XP, or just one Titan XM for "cheap".
Let's not forget that not many 3D renderers support Pascal yet.

My GTX 1080 plus 970 working together are faster than a Titan XP for Photoscan.

One small/medium size Photoscan project (70 photos) took almost 4GB of my GPU memory. A medium project (150-200 photos) would get close to the GTX 1080's memory size.

There are many other things than just gaming. Granted, those people seldom overclock, much less do voltage modding, as they depend on 100% reliability.

Now that I have a Titan XP, I may sell my 1080 and 970 for a used Titan XM so both have the same memory size and can render 12GB scenes instead of being limited to the smaller card.


----------



## HyperMatrix

Quote:


> Originally Posted by *st0necold*
> 
> Battlefield 4 I hold a steady 200fps... ULTRA.
> 
> Can you please tell me the games your playing? Saying you can't max out 1440p with 2 Titan XP's is sort of nuts..


Battlefield 4 isn't a demanding game. I come from 3 Maxwell Titan X's at 1.5GHz. Prior to that, 3 OG Titans at 1.3GHz. And before that, 4 GTX 680 Classifieds. So I think I have a pretty good handle on what type of GPU power is required for what type of gaming performance. GTA 5, Deus Ex, Rise of the Tomb Raider, etc... all max out the cards without giving a consistent 165fps. Some scenes, sure. But in other scenes they can drop down significantly.


----------



## st0necold

Quote:


> Originally Posted by *HyperMatrix*
> 
> Battlefield 4 isn't a demanding game. I come from 3 Maxwell Titan X's at 1.5GHz. Prior to that 3 OG Titans at 1.3GHz. And before that 4 gtx 680 Classifieds. So I think I have a pretty good handle on what type of GPU power is required for what type of gaming performance. GTA 5, Deux Ex, Rise of the Tomb Raider, etc...all maxing out the cards without giving a consistent 165fps. Some scenes, sure. But other scenes they can drop down significantly.


...okay


----------



## bujao

Photoscan:
GTX Titan XM = 1004.91 million samples/sec
GTX 1080 stock = 1000 million samples/sec; 1250 is the max I get overclocked
You see they are about the same speed, but the Titan has 12GB and the 1080 "only" 8GB.

2 cards are very linear, with just a very small loss (less than 10%). Two 1080s/Titan XMs would be around 1900 million samples/sec (stock clocks).

I don't know about the Titan XP, as nobody ever did the benchmark I asked for a while ago









2x Titan X EVGA Superclocked. Dense cloud High.
finished depth reconstruction in 322.303 seconds
Device 1 performance: 1353.73 million samples/sec (GeForce GTX TITAN X)
Device 2 performance: 1350.39 million samples/sec (GeForce GTX TITAN X)
Total performance: 2704.12 million samples/sec

This is with only 1 card

1x Titan X EVGA Superclocked.Dense cloud High.
finished depth reconstruction in 541.615 seconds
Device 1 performance: 1534.46 million samples/sec (GeForce GTX TITAN X)
Total performance: 1534.46 million samples/sec

The Titan XM overclocked even beats the GTX 1080 (at least mine).
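Taking the dual-card and single-card figures from the post above at face value, the 2-card scaling works out like this (simple arithmetic on the quoted numbers, nothing more):

```python
# Multi-GPU scaling efficiency from the Photoscan figures quoted above:
# efficiency = measured combined throughput / (ideal N x single-card throughput).

def scaling_efficiency(single_card, total_multi, n_cards):
    return total_multi / (single_card * n_cards)

single = 1534.46      # 1x Titan X SC, million samples/sec
dual_total = 2704.12  # 2x Titan X SC, combined

speedup = dual_total / single
eff = scaling_efficiency(single, dual_total, 2)
print(f"2-card speedup: {speedup:.2f}x, {eff:.0%} of ideal")
```

By these particular runs the loss is closer to 12% than 10%, perhaps because the quoted single-card run boosts a bit higher than each card does when both are loaded.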


----------



## 5150 Joker

Quote:


> Originally Posted by *Baasha*
> 
> Let the matter be settled once and for all!


Eh nm, mods wont' be happy.


----------



## CallsignVega

Quote:


> Originally Posted by *Baasha*
> 
> Let the matter be settled once and for all!


A few things:

GPU usage does not equal scaling. Scaling means what multiple of a single card's performance the setup attains. "Perfect scaling" would mean you are getting exactly 400% of the performance of a single GPU, which is 100%. Run a benchmark with a single card and then with four cards to attain scaling numbers. Plenty of games have high GPU usage but poor scaling.
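The distinction can be made concrete in a few lines (the FPS numbers here are invented for illustration, not from any benchmark in this thread):

```python
# Scaling in the sense described above: the multiple of single-card
# performance a multi-GPU setup attains, versus perfect N-way scaling.
# GPU usage % is a different metric: all GPUs can show 99% usage while
# the actual frame-rate scaling is poor.

def sli_scaling(single_gpu_fps, multi_gpu_fps, n_gpus):
    """Return (scaling multiple, fraction of perfect n-way scaling)."""
    multiple = multi_gpu_fps / single_gpu_fps
    return multiple, multiple / n_gpus

# Hypothetical: one card does 30 fps, four cards together do 102 fps.
mult, eff = sli_scaling(30.0, 102.0, 4)
print(f"{mult:.1f}x a single card, {eff:.0%} of perfect 4-way scaling")
```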

Why are the core clocks so low? Are you running stock air coolers sandwiched together sucking in each others hot air?


----------



## Mike211

My NVIDIA Titan X Pascal SLI


----------



## Baasha

Quote:


> Originally Posted by *HyperMatrix*
> 
> Hope more details come out. I'd love to add a 3rd card to my rig to max out 1440p/165Hz.


Adding a 3rd Titan XP to a 1440P monitor? No details for you...









Quote:


> Originally Posted by *toncij*
> 
> I'm guessing he had modified profiles to make the driver think GTA5 is 3DMark.
> 
> 
> 
> 
> 
> 
> 
> NVidia allows 4-way for benchmarks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The scaling, on the other hand, is really, really nice. Now, one should wonder how did he sort out the bridges since for 4-way we're missing pins and only 2 pins don't feel enough for 8K (which I think makes his FPS drop actually).


Changing profiles doesn't do a thing. Getting 4-way SLI to work in games is a huge milestone. Reading some of the responses in this thread, however... nvm.
Quote:


> Originally Posted by *CallsignVega*
> 
> A few things:
> 
> GPU usage does not equal scaling. Scaling means at what multiplication of a single card's performance does it attain. "Perfect scaling" would mean you are getting exactly 400% performance over a single GPU which is 100%. Run a benchmark with a single card and then with four cards to attain scaling numbers. Plenty of games have high GPU usage but poor scaling.
> 
> Why are the core clocks so low? Are you running stock air coolers sandwiched together sucking in each others hot air?


Yea scaling in terms of GPU usage is what I meant. Over 95% usage across all 4 is awesome.

Running a GPU sammich - temps of the top card get to 90C.


----------



## HyperMatrix

Quote:


> Originally Posted by *Baasha*
> 
> Adding a 3rd Titan XP to a 1440P monitor? No details for you...
> 
> 
> 
> 
> 
> 
> 
> 
> .


But 165Hz 1440p is more demanding than 4K 60Hz....
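The raw pixel-throughput math backs this up (pixels per second only; a crude proxy, since per-frame CPU and shading costs don't scale perfectly with resolution):

```python
# Pixels pushed per second at each mode. 1440p @ 165Hz asks the GPUs
# for more raw pixel throughput than 4K (2160p) @ 60Hz.

def pixel_rate(width, height, hz):
    return width * height * hz

qhd_165 = pixel_rate(2560, 1440, 165)  # 608,256,000 px/s
uhd_60 = pixel_rate(3840, 2160, 60)    # 497,664,000 px/s

print(qhd_165 > uhd_60)            # True
print(f"{qhd_165 / uhd_60:.2f}x")  # about 1.22x
```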


----------



## CallsignVega

Quote:


> Originally Posted by *Baasha*
> 
> Yea scaling in terms of GPU usage is what I meant. Over 95% usage across all 4 is awesome.
> 
> Running a GPU sammich - temps of the top card gets to 90C.


There is a 31.5" 8K panel going into production in Q1 2017.

For testing, keep GTA5 at 8K supersampling and remove GPUs 2 and 4 so 1 and 3 can breathe and boost to 2000+ core. I wouldn't be surprised if you get 70-80% of the performance of the down-clocked 4-way. It may be interesting to see how 4-way is really scaling versus 2-way.

EDIT: Actually anyone with a 1440p monitor can test 8K resolution. Just set DSR factors to 3.0x.

Baasha/Hyper game for comparison?


----------



## HyperMatrix

Quote:


> Originally Posted by *CallsignVega*
> 
> There is a 31.5" 8K panel going into production in Q1 2017.
> 
> For testing, keep GTA5 at 8K supersampling and remove GPU's 2 and 4 so 1 and 3 can breath and boost to 2000+ Core. I wouldn't be surprised if you get 70-80% of the performance of the down-clocked 4-way. May be interesting to see how 4-Way is really scaling versus 2-way.
> 
> EDIT: Actually anyone with a 1440p monitor can test 8K resolution. Just set DSR factors to 3.0x.
> 
> Baasha/Hyper game for comparison?


CPU difference might mess up results. And latest drivers broke my card OC. Returning one and getting another. Stuck at under 2k atm.


----------



## CallsignVega

CPU shouldn't matter much at all; the FPS will be quite low.

Whoops, DSR only goes up to 4x, which is actually only 2-times frame scaling (2x2). To get above that you would need to use an in-game frame scaler.
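A small sketch of why DSR falls short of true 8K from 1440p: the DSR factor multiplies the pixel count, so the linear resolution only grows by its square root.

```python
# DSR factor k renders at sqrt(k) times the linear resolution in each axis.

def dsr_resolution(width, height, factor):
    scale = factor ** 0.5
    return round(width * scale), round(height * scale)

print(dsr_resolution(2560, 1440, 4.0))  # (5120, 2880): 2x2, i.e. "5K"
print(dsr_resolution(2560, 1440, 9.0))  # (7680, 4320): what true 8K would need
```

Since DSR tops out at 4x, reaching true 8K (a 9x factor) from a 1440p panel needs an in-game resolution scaler instead.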


----------



## Baasha

Quote:


> Originally Posted by *HyperMatrix*
> 
> But 165Hz 1440p is more demanding than 4K 60Hz....


I have a 144Hz 1440P panel and tried 2x Titan XP on it - both GPUs were at around 50% scaling - this is on my 2nd rig.

Also, you do realize I mainly game on my 5K monitor, right? That's nearly 80% more pixels than 4K. I'm pretty sure 5K @ 60Hz is more demanding than 1440P @ 165Hz. Vega, got that data transfer calculation memorized?









Do you have in-game pics to show 95 - 99% scaling on both GPUs on your 1440P panel? I had the refresh rate set to 144Hz as well but the GPU usage was horrid (for 2 Titan XP) and I was getting only around 70 FPS (with the ENB mod etc.).
Quote:


> Originally Posted by *CallsignVega*
> 
> *There is a 31.5" 8K panel going into production in Q1 2017.*
> 
> For testing, keep GTA5 at 8K supersampling and remove GPU's 2 and 4 so 1 and 3 can breath and boost to 2000+ Core. I wouldn't be surprised if you get 70-80% of the performance of the down-clocked 4-way. May be interesting to see how 4-Way is really scaling versus 2-way.
> 
> EDIT: Actually anyone with a 1440p monitor can test 8K resolution. Just set DSR factors to 3.0x.
> 
> Baasha/Hyper game for comparison?


8K panel coming out? By Dell? o_0

What about the 4K 120Hz panel by Dell? Any news on that?









2-way Titan XP benchmarks were posted several pages back - tested 5K on my native 5K monitor - no SS or DSR etc.

benchies again:





Playing at 5K in 4-Way is simply astounding - I've only had this going since Friday night. You won't believe how incredible the performance is when the scaling is there.

Shadow of Mordor @ 8K w/ everything maxed out including AA:



pCARS @ 8K w/ everything maxed out (no AA but SMAA setting on Ultra):


----------



## CallsignVega

http://www.tftcentral.co.uk/news_archive/35.htm#panels_update

31.5" 8K is going to be a pretty epic screen. It may require two DisplayPort inputs like the Dell 5K, but this time two DP 1.3/1.4 ports instead of DP 1.2s, if the display doesn't support DSC.
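The two-cable requirement follows from the link budget. A back-of-the-envelope sketch, assuming 8 bpc RGB and ignoring blanking/transport overhead:

```python
def video_bitrate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    # Uncompressed video payload in Gbps, ignoring blanking overhead.
    return width * height * refresh_hz * bits_per_pixel / 1e9

DP13_PAYLOAD_GBPS = 25.92  # HBR3: 32.4 Gbps raw minus 8b/10b encoding overhead

needed = video_bitrate_gbps(7680, 4320, 60)  # ~47.8 Gbps for 8K60 at 8 bpc
print(needed > DP13_PAYLOAD_GBPS)            # True: one DP 1.3/1.4 link isn't enough
print(needed < 2 * DP13_PAYLOAD_GBPS)        # True: two links (or DSC) would cover it
```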

4K @ 120 Hz is still supposed to come out around the same time; I'd suspect late winter.

I had the 5K Dell a long time ago. The resolution is super crisp but the motion blur is quite bad. For static bright images, though, it's pretty sweet.

Do you have SWBF? That game scales really well in SLI and has a built in resolution scale slider up to 200% if you want to compare 4-way vs my 2-way numbers.

So more people can play:

1. Launch SWBF and set resolution to 2560x1440.
2. Full Screen / V-Sync off / Field of view 55 / motion blur = 0 / Film grain = 0 / Resolution Scale = 200% / Graphics quality preset = Ultra
3. Go to Missions -> Training -> Endor Chase -> Solo play.
4. As soon as you get dumped into the game world and the storm trooper stops talking on the top left, note FPS.

On my system it is 92.


----------



## pompss

I still don't understand why people spend $1300 on a 32-inch 4K 60Hz monitor from Acer when you can buy a 55-inch LG OLED for $1600.

Just for the G-Sync, it's not worth the money in my opinion.

I'm still debating whether to buy the 55-inch LG OLED or another 27-inch 144Hz IPS to complete my 3-monitor 7680x1440 configuration.

What do you guys think?


----------



## HyperMatrix

Quote:


> Originally Posted by *Baasha*
> 
> I have a 144Hz 1440P panel and tried 2x Titan XP on it - both GPUs were at around 50% scaling - this is on my 2nd rig.
> 
> Also, you do realize I mainly game on my 5K monitor right? That's 70% more pixels than 4K. I'm pretty sure 5K @ 60Hz is more demanding than 1440P @ 165Hz. Vega, got that data transfer calculation memorized?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do you have in-game pics to show 95 - 99% scaling on both GPUs on your 1440P panel? I had the refresh rate set to 144Hz as well but the GPU usage was horrid (for 2 Titan XP) and I was getting only around 70 FPS (with the ENB mod etc.).


I'm not running any mods so I don't know about any of that. GPU usage between cards is also a little lop-sided due to TXAA, since temporal AA is pushed through just 1 card.


----------



## pez

Quote:


> Originally Posted by *pompss*
> 
> i still dont understand why people spending $1300 for a 32 inch 4k 60 hz monitor from acer when you can buy 55 inch lg oled for $1600
> 
> 
> 
> 
> 
> 
> 
> 
> only for the g-sync its not worth the money in my opinion
> 
> im still debating with myself to buy lg oled 55 or another 27 inch 144hz ips to complete my 3 monitor 7680x1440 configuration
> 
> What you guys think ?


Not everyone wants to sit in front of a 55" TV at their PC. That's nowhere near practical for me gaming the way I do.


----------



## toncij

I'm running 2 Dell 5Ks since their release (that's why I had to use 4 TITAN X (Maxwell)s), but in some situations you can't replace the beauty of motion at 144Hz.


----------



## skypine27

Quote:


> Originally Posted by *pompss*
> 
> i still dont understand why people spending $1300 for a 32 inch 4k 60 hz monitor from acer when you can buy 55 inch lg oled for $1600
> 
> 
> 
> 
> 
> 
> 
> 
> only for the g-sync its not worth the money in my opinion
> 
> im still debating with myself to buy lg oled 55 or another 27 inch 144hz ips to complete my 3 monitor 7680x1440 configuration
> 
> What you guys think ?


Me too.

I'm also not sure why someone gives a **** about 5K or 8K when normal 4K or 3440x1440 at 144Hz would make for a much better gaming experience.

But to each their own.


----------



## AdamK47

Quote:


> Originally Posted by *pompss*
> 
> i still dont understand why people spending $1300 for a 32 inch 4k 60 hz monitor from acer when you can buy 55 inch lg oled for $1600
> 
> 
> 
> 
> 
> 
> 
> 
> only for the g-sync its not worth the money in my opinion
> 
> im still debating with myself to buy lg oled 55 or another 27 inch 144hz ips to complete my 3 monitor 7680x1440 configuration
> 
> What you guys think ?


LG OLED TVs employ automatic dimming similar to the older plasma TVs. This cannot be turned off either. Believe me, I've tried.


----------



## skypine27

Quote:


> Originally Posted by *AdamK47*
> 
> LG OLED TVs employ automatic dimming similar to the older plasma TVs. This cannot be turned off either. Believe me, I've tried.


I don't get why someone cares about 5K or 8K when they don't even play at 100+Hz on 3440x1440. I guess your gaming consists of looking at a static-picture slide show?


----------



## AdamK47

Quote:


> Originally Posted by *skypine27*
> 
> I dont get why someone cares what 5k or 8k is and they dont even play at 100+hz on 3440 x 1440? I guess your game consists of looking at a static picture slide show?


I'm not sure what that has to do with what I said.


----------



## GunnzAkimbo

Try this tweak, see if it pushes scores up a bit...?

http://www.overclock.net/t/1516449/official-the-windows-10-club/7800_20#post_25473847


----------



## toncij

Quote:


> Originally Posted by *skypine27*
> 
> Me too.
> 
> I'm also not sure why someone gives a ****about 5K or 8K when normal 4K or 3440 x 1440 at 144 hz would make a much better gaming experience.
> 
> But to each their own.


Because at higher resolution, the quality is significantly better. Crisp, sharp, clear image, incredibly detailed textures in modern games (SW: Battlefront, Battlefield 1). I can't wait until 4K@120Hz is out!


----------



## Gary2015

Quote:


> Originally Posted by *jcde7ago*
> 
> Wow, very nice!
> 
> My 3x Titan XMs will be going on fleabay in a couple of weeks (still waiting on those darned EK waterblocks and backplates to get in before I redo this loop).
> 
> If I can get a minimum of ~$650 each i'll be an extremely happy camper (they have EK blocks/backplates + original box and cooler too etc). Most likely will list around $700-750 with a 'best offer' option and see how they do...I might even sell just two and keep one as a backup card...but doubtful.


Sad fact: I also sold my GTX 1080 on fleabay and sent it to someone in Azerbaijan; it still hasn't been delivered and is now missing.


----------



## kx11

Quote:


> Originally Posted by *Zurv*
> 
> oh god.. i have soooo many pointless video cards now.
> After i upgrade to these new titans.. i'll have 8! gtx 1080s and 7 Titan X...
> 
> anyone looking to buy some cards


me too









i have 4 now , 2 1080s strix w/HBsli bridge + 2 Galax HOF 1080s w/Galax HOF SLi bridge

trying to sell them but no luck so far


----------



## Gary2015

Quote:


> Originally Posted by *kx11*
> 
> me too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i have 4 now , 2 1080s strix w/HBsli bridge + 2 Galax HOF 1080s w/Galax HOF SLi bridge
> 
> trying to sell them but no luck so far


Don't sell on fleabay to anyone in Azerbaijan. I think mine must have been stolen.


----------



## xarot

Quote:


> Originally Posted by *Gary2015*
> 
> Sad fact is that I also sold my 1080GTX on fleabay and sent to someone in Azerbaijan which has not yet been delivered and is now missing


Bummer...wouldn't have expected anything else though.


----------



## PowerK

Wow, HB SLI Bridge does provide better performance.
http://www.hardwareunboxed.com/nvidias-hb-sli-bridge-surprising-gains-gtx-1080-sli-testing-inside/

Because I ordered 3-Slot (instead of 4-Slot) HB Bridge by mistake, I'm currently using 3-Way SLI Bridge (LED) from ASUS.
https://www.asus.com/Graphics-Cards-Accessory/ROG_Enthusiast_SLI_Bridge/

Maybe I should place an order for the 4-Slot HB Bridge?


----------



## dante`afk

Quote:


> Originally Posted by *PowerK*
> 
> Wow, HB SLI Bridge does provide better performance.
> http://www.hardwareunboxed.com/nvidias-hb-sli-bridge-surprising-gains-gtx-1080-sli-testing-inside/
> 
> Because I ordered 3-Slot (instead of 4-Slot) HB Bridge by mistake, I'm currently using 3-Way SLI Bridge (LED) from ASUS.
> https://www.asus.com/Graphics-Cards-Accessory/ROG_Enthusiast_SLI_Bridge/
> 
> Maybe, I should place an order for 4-Slot HB Bridge?


No it doesn't.

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/22.html
Quote:


> Originally Posted by *HyperMatrix*
> 
> You play games on low settings.


----------



## unreality

Quote:


> Originally Posted by *st0necold*
> 
> Dude 2 980ti's can max every single game at 1440p..
> 
> If you are having issues i'd definitely see whats going on a single 980ti can hold damn near 144 by itself (980ti classified)
> 
> 2 TXP's should have you well over 200fps in all newer games.. something is wrong with your setup.


I told him the same. But instead of accepting that he has a faulty SLI setup, he's ignorant af.


----------



## jodasanchezz

Quote:


> Originally Posted by *pompss*
> 
> i still dont understand why people spending $1300 for a 32 inch 4k 60 hz monitor from acer when you can buy 55 inch lg oled for $1600
> 
> 
> 
> 
> 
> 
> 
> 
> only for the g-sync its not worth the money in my opinion
> 
> im still debating with myself to buy lg oled 55 or another 27 inch 144hz ips to complete my 3 monitor 7680x1440 configuration
> 
> What you guys think ?


What LG Monitor are u talking about?
can u share a link please?


----------



## st0necold

Quote:


> Originally Posted by *HyperMatrix*
> 
> But 165Hz 1440p is more demanding than 4K 60Hz....


*No* it's not.

Lol.

3440x1440p (acer x34/swift) is almost as demanding as 4k...

1440p/144hz-165hz can be maxxed out with 780ti's...


----------



## RedM00N

Quote:


> Originally Posted by *Jpmboy*
> 
> yeah - lets not get our hopes up yet.
> unfortunately, nvflash is only half the solution... we've had a working nvflash for the 1080 for a long time, but no bios editor for the 1080 afaik.


So is it safe to assume the 1000-series GPU and TXP BIOSes would have the same structure, line for line? How was it with Maxwell? This would probably follow the same scenario.

I'm more of a command-line-programs guy than a GUI guy, but maybe I could try to whip something up (with messy code of course, 'cause that's just me







). Or at least help contribute to the making of said program in some way.


----------



## st0necold

Guys, I'm looking to get back into Starcraft but this thread has me worried I may not have enough juice to run it at its full potential.

Does anyone know when Nvidia will unlock 3-4 way SLI for the TXP? Just want to make sure I can max it out, and from what I'm reading on here it looks like I may have to wait for software support before I jump back on Starcraft.

(RIG IN SIG.)


----------



## dante`afk

Sick Trollpost mate ^


----------



## DNMock

Quote:


> Originally Posted by *st0necold*
> 
> Guys i'm looking to get back into Starcraft but this thread has me worried I may not have enough juice to run it at it's full potential.
> 
> Does anyone know when Nvidia will unlock 3-4way SLI for the TXP? Just want to make sure I can max it out and from what i'm reading on here it looks like I may have to wait for software support before I jump back on Starcraft.
> 
> (RIG IN SIG.)


Starcraft is more CPU limited than GPU limited.
Quote:


> Originally Posted by *dante`afk*
> 
> Sick Trollpost mate ^


----------



## profundido

Quote:


> Originally Posted by *HyperMatrix*
> 
> I have a question for those running more than 2 cards right now. I'm finding 2 Titan XP's to not be enough for 165Hz 1440p. So before I go about selling my old maxwell Titan x cards, I was wondering if I should keep one of them for use with dx12 multi gpu. Is it possible to just put 2 Titan XP's connected wth the sli bridge and use 2-card sli with dx11 games, but then just let dx12 take care of using all 3 under explicit multi gpu?


Out of curiosity, what sort of GPU power are we talking about here?

I finally finished setting up and testing my new system this weekend with the 2 TXP cards, and although I don't have an HB SLI bridge yet, I couldn't resist running some early benchmarks: about 16.5K in Time Spy and 33K in (normal) Fire Strike. No shunt mod or any other hardware modifications. I'm wondering what everyone else is getting.

I also noticed, to my great surprise, that when I installed one of my favorite games, Dragon Age, and cranked the visuals up beyond Ultra (yes, some settings have an even higher level called "fade touched"), both cards ran at 80-90% usage constantly when exploring and fighting out in the wilds, with occasional jumps to 100% on both. The visuals were unearthly on my 4K screen, though. Never imagined something like that would be possible at a constant, fluid 60 frames. Hah, I guess I've finally reached the personal Valhalla I've been chasing for so many years. My 27" screen is about 50-60cm from my eyes, and at this level the game takes on a ridiculous sense of realism. I swear I could almost touch some of these things.

Also noticed that with this sort of intensive next-gen gaming for a few hours, the system itself stays completely cool, pulls 600-750W from the wall nonstop, and heats up the room by 1-2 degrees per hour. So in summer I might need an airco now... =P =P
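The room-heating observation checks out: essentially every watt a PC draws from the wall ends up as heat in the room. A rough space-heater-equivalence conversion (actual temperature rise depends on room size and insulation):

```python
BTU_PER_HR_PER_WATT = 3.412  # 1 W dissipated continuously ~= 3.412 BTU/h

def heat_output_btu_per_hr(watts):
    # All electrical power drawn by the PC is ultimately dissipated as heat.
    return watts * BTU_PER_HR_PER_WATT

for w in (600, 750):
    print(f"{w} W ~= {heat_output_btu_per_hr(w):.0f} BTU/h")
# 600 W ~= 2047 BTU/h, 750 W ~= 2559 BTU/h: comparable to a small space heater
```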


----------



## EniGma1987

Quote:


> Originally Posted by *AdamK47*
> 
> LG OLED TVs employ automatic dimming similar to the older plasma TVs. This cannot be turned off either. Believe me, I've tried.


"Automatic dimming", lol. I guess if you consider that the underlying technology doesn't emit any light when the cells are off, then yes, it would be automatic dimming. Kinda like how a light bulb auto-dims when you turn it off.


----------



## Yuhfhrh

Quote:


> Originally Posted by *EniGma1987*
> 
> "automatic dimming". lol. I guess if you consider the underlying technology doesnt emit any light when the cells are off then yes it would be automatic dimming. Kinda like how a light bulb auto dims when you turn it off.


I think they're referring to ABL (automatic brightness limiting): if there is a lot of white on the screen, brightness will be reduced.


----------



## Jpmboy

Quote:


> Originally Posted by *RedM00N*
> 
> So is it safe to assume the 1*000 gpu and TXP bios' would be the same structure line for line*? How was it with Maxwell, since it would probably follow the same scenario.
> 
> I'm more of a making programs that run through command line than GUI guy, but maybe I could try to whip something up (with messy code of course cause that's just me
> 
> 
> 
> 
> 
> 
> 
> ). Or at least help contribute to the making of said program in some way.


Thanks for the offer.. we can always use abilities such as yours.







But actually, the two BIOSes would not be very similar. The RAM size and timings (different ICs), the GPU, and even the power planes are not the same between the 1080 and TXP. So without a jailbreak (and then a hex edit) we're stuck. I mean, I have several 1080 ROM files and did a bunch of cross-flashing with my 1080... hoping to find a more efficient BIOS. The Strix XOC BIOS (out now) may be the best - this came from the man himself (shimino). With NV keeping the TXP out of 3rd-party hands, it's gonna be a tough road AFAIK. Guys like Marc0053 really keep an ear on the Bot rail, so if a TXP BIOS mod becomes available I think it will happen there first. As for now, we only have the resistor mod. I have not done this to either of my cards at this point. Not desperate enough - I mean c'mon, these cards have incredible performance OOB.


----------



## Lobotomite430

I installed the EVGA 1080/1070 hybrid kit on my Titan and my temps have been hitting the mid-60s C, sometimes 70C - just curious if this seems right. Ambient temp is 77F and my Titan is running +208 core / +515 memory. I almost feel like the little radiator is working pretty hard, lol, and I'm sure it's impressive considering I was hitting 85C on the stock cooler with no overclock.


----------



## Baasha

Quote:


> Originally Posted by *CallsignVega*
> 
> http://www.tftcentral.co.uk/news_archive/35.htm#panels_update
> 
> 31.5" 8K is going to be a pretty epic screen. It may require two Displayport inputs like the Dell 5K, but this time two DP 1.3/1.4 ports instead of DP 1.2's if the display doesn't support DSC.
> 
> 4K @ 120 Hz still suppose to come out around the same time, I'd suspect late winter.
> 
> I've had the 5K Dell a long time ago. The resolution is super crisp but the motion blur is quite bad. For static bright images though its pretty sweet.
> 
> Do you have SWBF? That game scales really well in SLI and has a built in resolution scale slider up to 200% if you want to compare 4-way vs my 2-way numbers.
> 
> So more people can play:
> 
> 1. Launch SWBF and set resolution to 2560x1440.
> 2. Full Screen / V-Sync off / Field of view 55 / motion blur = 0 / Film grain = 0 / Resolution Scale = 200% / Graphics quality preset = Ultra
> 3. Go to Missions -> Training -> Endor Chase -> Solo play.
> 4. As soon as you get dumped into the game world and the storm trooper stops talking on the top left, note FPS.
> 
> On my system it is 92.


I do have SWBF. Haven't tried it yet but can you post a screenshot? Also, I think it would be best, for consistency, to use the 60 sec. Fraps Benchmark tool to get FPS - post the AVERAGE FPS number for the Endor Chase level.


----------



## DNMock

Quote:


> Originally Posted by *Jpmboy*
> 
> Thanks for the offer.. we can always use abilities such as yours.
> 
> 
> 
> 
> 
> 
> 
> But, actually the two bioses would not be very similar. the ram size and timings (different ICs), gpu and even the power planes are not the same between the 1080 and TXP. So without a jailbreak (and then hex edit) we're stuck. I mean, I have several 1080 rom files and did a bunch of cross flashing with my 1080... hoping to find a more efficient bios. The Strix XOC bios (out now) may be the best - this came from the man himself (shimino). With NV keeping the TXP out of 3rd party hands, it's gonna be a tough road AFAIK. GUys like Marc0053 really keep an ear on the Bot rail, so if a TXP bios mod becomes available I think it will happen there first. As for now, we only have the resistor mod. I have not done this to either of my cards at this point. Not desperate enough, I mean c'mon these cards have incredible performance OOB.


Doesn't anyone know someone in "Anonymous", or someone who works for the Chinese government or the NSA, who can pull the engineers' flash utility off their servers?


----------



## Baasha

Quote:


> Originally Posted by *Jpmboy*
> 
> Thanks for the offer.. we can always use abilities such as yours.
> 
> 
> 
> 
> 
> 
> 
> 
> But, actually the two bioses would not be very similar. the ram size and timings (different ICs), gpu and even the power planes are not the same between the 1080 and TXP. So without a jailbreak (and then hex edit) we're stuck. I mean, I have several 1080 rom files and did a bunch of cross flashing with my 1080... hoping to find a more efficient bios. The Strix XOC bios (out now) may be the best - this came from the man himself (shimino). With NV keeping the TXP out of 3rd party hands, it's gonna be a tough road AFAIK. GUys like Marc0053 really keep an ear on the Bot rail, so if a TXP bios mod becomes available I think it will happen there first. As for now, we only have the resistor mod. I have not done this to either of my cards at this point. Not desperate enough, I mean c'mon these cards have incredible performance OOB.


JPM,

Quick question about scaling - Fire Strike Ultra runs great w/ 4 Titan XP with scaling near 99% across all GPUs.

When I run Fire Strike Extreme (Time Spy is garbage for me - not sure why), scaling drops to ~ 40% across the 4 GPUs. I saw on the HoF that a couple of people ran the Extreme preset w/ 4x Titan XP but their scores were really high which means their GPUs must have scaled properly.

It seems like for me, only higher resolutions, both in benchmarks and in games, offer proper scaling. Any idea why? How do I "fix" this? I would like to run Fire Strike Extreme and get a nice score.

Time Spy is another weird thing for me - my 4-Way score is lower than my 2-Way score!


----------



## DNMock

Quote:


> Originally Posted by *Baasha*
> 
> JPM,
> 
> Quick question about scaling - Fire Strike Ultra runs great w/ 4 Titan XP with scaling near 99% across all GPUs.
> 
> When I run Fire Strike Extreme (Time Spy is garbage for me - not sure why), scaling drops to ~ 40% across the 4 GPUs. I saw on the HoF that a couple of people ran the Extreme preset w/ 4x Titan XP but their scores were really high which means their GPUs must have scaled properly.
> 
> It seems like for me, only higher resolutions, both in benchmarks and in games, offer proper scaling. Any idea why? How do I "fix" this? I would like to run Fire Strike Extreme and get a nice score.
> 
> Time Spy is another weird thing for me - my 4-Way score is lower than my 2-Way score!


You didn't accidentally forget to set your GPU's to "Prefer Max performance" or whatever in the control panel did you?


----------



## Baasha

Quote:


> Originally Posted by *HyperMatrix*
> 
> I'm not running any mods so I don't know about any of that. GPU usage between cards is also a little lop-sided due to TXAA, since temporal AA is pushed through just 1 card.


Well, that's not a good benchmark - I'm running a really intense ENB with a ton of texture/car mods and so vanilla GTA V is really no comparison.

How about some other games like Shadow of Mordor, BF4, Tomb Raider etc.?


----------



## Baasha

Quote:


> Originally Posted by *DNMock*
> 
> You didn't accidentally forget to set your GPU's to "Prefer Max performance" or whatever in the control panel did you?


Nope, definitely on 'max performance' in NVCP.


----------



## stangflyer

Quote:


> Originally Posted by *Lobotomite430*
> 
> I installed the EVGA 1080/1070 hybrid kit on my Titan and my temps have been getting mid 60c and sometimes to 70c just curious if this seems right. Ambient temp is 77F and my Titan is running +208 core +515 memory. I almost feel like the little radiator is working pretty hard lol and Im sure its impressive considering I was hitting 85c on the stock cooler no overclock.


I would check and see if the GPU is seated properly or thermal paste is correct. I have the same ambient temp as you with my 980ti hybrid at 1525/7600 and my temp never goes above 55-56 and that is playing at 7680x1440.


----------



## RedM00N

Quote:


> Originally Posted by *Jpmboy*
> 
> Thanks for the offer.. we can always use abilities such as yours.
> 
> 
> 
> 
> 
> 
> 
> But, actually the two bioses would not be very similar. the ram size and timings (different ICs), gpu and even the power planes are not the same between the 1080 and TXP. So without a jailbreak (and then hex edit) we're stuck. I mean, I have several 1080 rom files and did a bunch of cross flashing with my 1080... hoping to find a more efficient bios. The Strix XOC bios (out now) may be the best - this came from the man himself (shimino). With NV keeping the TXP out of 3rd party hands, it's gonna be a tough road AFAIK. GUys like Marc0053 really keep an ear on the Bot rail, so if a TXP bios mod becomes available I think it will happen there first. As for now, we only have the resistor mod. I have not done this to either of my cards at this point. Not desperate enough, I mean c'mon these cards have incredible performance OOB.


Figured they would, just wanted to make sure







That's needed info for if I can get anything going.

So about bios'. Do the official ones pulled from cards flash without errors to other cards and only hex edited bios have the certification issue, or has that issue been resolved?


----------



## Jpmboy

Quote:


> Originally Posted by *Baasha*
> 
> JPM,
> 
> Quick question about scaling - Fire Strike Ultra runs great w/ 4 Titan XP with scaling near 99% across all GPUs.
> 
> When I run Fire Strike Extreme (Time Spy is garbage for me - not sure why), scaling drops to ~ 40% across the 4 GPUs. I saw on the HoF that a couple of people ran the Extreme preset w/ 4x Titan XP but their scores were really high which means their GPUs must have scaled properly.
> 
> It seems like for me, only higher resolutions, both in benchmarks and in games, offer proper scaling. Any idea why? How do I "fix" this? I would like to run Fire Strike Extreme and get a nice score.
> 
> Time Spy is another weird thing for me - my 4-Way score is lower than my 2-Way score!


That's an odd problem - unless the CPU is bottlenecking the lower-resolution benchmark? If you have SpeedStep enabled, make sure the Windows power plan is High Performance and that the minimum processor state = 100%. Or... disable SpeedStep.
If you look in GPU-Z (after launching the 3DMark GUI), are all 4 cards idling at around 1450 or 1500, e.g. all 4 in P0? See the 2 snips below... card 0 is in P0, and card 1 is in P8. You can use NVI to check that all the cards are entering the proper clock state for "Max Perf" in NVCP.
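If you'd rather script the check than eyeball GPU-Z, `nvidia-smi` can report per-card P-states. A minimal sketch that parses its CSV output - the sample text below stands in for a live query, which would be something like `nvidia-smi --query-gpu=index,pstate,clocks.gr --format=csv`:

```python
import csv
import io

def parse_pstates(csv_text):
    """Parse nvidia-smi CSV output into per-GPU dicts."""
    rows = list(csv.reader(io.StringIO(csv_text), skipinitialspace=True))
    return [
        {"index": int(r[0]), "pstate": r[1], "clock_mhz": int(r[2].split()[0])}
        for r in rows[1:]  # skip the header row
    ]

# Sample output illustrating the problem case: card 1 stuck idling in P8.
sample = """index, pstate, clocks.current.graphics [MHz]
0, P0, 1455 MHz
1, P8, 139 MHz
"""

gpus = parse_pstates(sample)
laggards = [g["index"] for g in gpus if g["pstate"] != "P0"]
print(laggards)  # [1] - any card not in P0 under load is worth investigating
```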


----------



## habu58

What were you getting with that setup?


----------



## EniGma1987

Quote:


> Originally Posted by *Baasha*
> 
> JPM,
> 
> Quick question about scaling - Fire Strike Ultra runs great w/ 4 Titan XP with scaling near 99% across all GPUs.
> 
> When I run Fire Strike Extreme (Time Spy is garbage for me - not sure why), scaling drops to ~ 40% across the 4 GPUs. I saw on the HoF that a couple of people ran the Extreme preset w/ 4x Titan XP but their scores were really high which means their GPUs must have scaled properly.
> 
> It seems like for me, only higher resolutions, both in benchmarks and in games, offer proper scaling. Any idea why? How do I "fix" this? I would like to run Fire Strike Extreme and get a nice score.
> 
> Time Spy is another weird thing for me - my 4-Way score is lower than my 2-Way score!


What is your GPU usage on the 4 cards in FSE? My thought is that because the cards are so strong and the resolution is fairly low, the load on each GPU might not be enough to push the cards into boost mode. This happens when a GPU doesn't get enough utilization. So maybe 2 cards are utilized enough to hit boost, but split the load 4 ways and it's too low? Just a thought.


----------



## Lobotomite430

Quote:


> Originally Posted by *stangflyer*
> 
> I would check and see if the GPU is seated properly or thermal paste is correct. I have the same ambient temp as you with my 980ti hybrid at 1525/7600 and my temp never goes above 55-56 and that is playing at 7680x1440.


Ok I will try that tonight, thanks!


----------



## AdamK47

Quote:


> Originally Posted by *EniGma1987*
> 
> "automatic dimming". lol. I guess if you consider the underlying technology doesnt emit any light when the cells are off then yes it would be automatic dimming. Kinda like how a light bulb auto dims when you turn it off.


It does auto dimming. Load up a browser with a blank white page. Set the browser to half size against a dark background and then maximize it. The brightness will automatically be reduced. It's LG's way of increasing the longevity of the display. There is no way to turn it off and it looks terrible. Plasmas did the same thing.


----------



## DNMock

Quote:


> Originally Posted by *Baasha*
> 
> Nope, definitely on 'max performance' in NVCP.


Figured, just ruling out the derps.


----------



## cookiesowns

Quote:


> Originally Posted by *AdamK47*
> 
> It does auto dimming. Load up a browser with a blank white page. Set the browser to half size against a dark background and then maximize it. The brightness will automatically be reduced. It's LGs way of increasing the longevity of the display. There is no way to turn it off and it looks terrible. Plasmas did the same thing.


It's also called dynamic contrast. By reducing the overall brightness you'll get better blacks on a mixed black/white background. But if the display is any good it should have local dimming, no?

I hate this behavior - really annoying. It basically ruined this Sharp Quattron panel for us. RGBY sucks for PC use though, so whatever.


----------



## AdamK47

Quote:


> Originally Posted by *cookiesowns*
> 
> It's also called dynamic contrast. Be reducing the overall brightness you'll get better blacks on a black : white background. But if the display is any good it should have local dimming no?
> 
> Hate this behavior really annoying. Basically ruined this sharp quatron panel for us. RGBY sucks for PC though so whatever.


That would make sense if we were talking about an LCD. There is no need for it with OLED since black is always going to be black. It's used on the LG OLED TVs for power reduction and panel longevity. Personally, I'd sacrifice that longevity and power savings to have a display with constant contrast and brightness.


----------



## Fredthehound

Hi all,

Single Titan at 225/0. Coming to Nvidia from a pair of tweaked/liquid cooled AMD 390s. 1080P monitor but bought the titan for modded Skyrim in VR through VorpX.

That said, if you want realism / your mind blown, crank a Vive up in VorpX to 1920x1440 supersampled and you will see God. However, you will want that 2nd Titan.

Since getting my TXP I have been devoting all my efforts to sorting out Skyrim/VR, but yesterday I reconfigured the ini files and ran it on my 1080p 65-inch LG (not a high-end part but I like it). Absolutely incredible to see that Titan pin heavily modded Skyrim at 60FPS with 2x supersampling and all settings - including lighting mods, weather mods, etc. - fully maxed. No skips, stutters or grief... all while never going over 30% usage.

This is my first taste of how big boys play so excuse my amazement, but damn. This thing is impressive.


----------



## bizplan

FOR THOSE OF YOU WHO ARE INTERESTED, EVGA PRECISION XOC VERSION 6.05 NOW SUPPORTS TITAN XP!

http://www.evga.com/precisionxoc/


----------



## CRITTY

Quote:


> Originally Posted by *bizplan*
> 
> FOR THOSE OF YOU WHO ARE INTERESTED, EVGA PRECISION XOC VERSION 6.05 NOW SUPPORTS TITAN XP!
> 
> http://www.evga.com/precisionxoc/


Voltage is unlocked.


----------



## scgeek12

No cards yet but waterblocks and backplates got here today


----------



## bl4ckdot

Quote:


> Originally Posted by *bizplan*
> 
> FOR THOSE OF YOU WHO ARE INTERESTED, EVGA PRECISION XOC VERSION 6.05 NOW SUPPORTS TITAN XP!
> 
> http://www.evga.com/precisionxoc/


Can you control voltage ?


----------



## Fredthehound

Just power target to 120 from what I see.


----------



## xarot

Quote:


> Originally Posted by *Baasha*
> 
> JPM,
> 
> Quick question about scaling - Fire Strike Ultra runs great w/ 4 Titan XP with scaling near 99% across all GPUs.
> 
> When I run Fire Strike Extreme (Time Spy is garbage for me - not sure why), scaling drops to ~ 40% across the 4 GPUs. I saw on the HoF that a couple of people ran the Extreme preset w/ 4x Titan XP but their scores were really high which means their GPUs must have scaled properly.
> 
> It seems like for me, only higher resolutions, both in benchmarks and in games, offer proper scaling. Any idea why? How do I "fix" this? I would like to run Fire Strike Extreme and get a nice score.
> 
> Time Spy is another weird thing for me - my 4-Way score is lower than my 2-Way score!


I would have guessed G-Sync is on but I guess from your sig that you don't use G-Sync monitor.


----------



## bizplan

Quote:


> Originally Posted by *bl4ckdot*
> 
> Can you control voltage ?


Yes, you have to click on the comb above the default button and to the left of the lightning bolt.


----------



## bizplan

Quote:


> Originally Posted by *Fredthehound*
> 
> Just power target to 120 from what I see.


You have to click on the comb above the default button and to the left of the lightning bolt.


----------



## Fredthehound

That goes no higher than 100% for me, and into the red range.


----------



## DNMock

Quote:


> Originally Posted by *bizplan*
> 
> You have to click on the comb above the default button and to the left of the lightning bolt.


So you can increase the power limit above 120?

Increasing the voltage just makes you face smash into the 120 PL even harder


----------



## Jpmboy

Quote:


> Originally Posted by *bizplan*
> 
> You have to click on the comb above the default button and to the left of the lightning bolt.


thanks for the link... UGH! no K-boost.


----------



## bizplan

Quote:


> Originally Posted by *DNMock*
> 
> So you can increase the power limit above 120?
> 
> Increasing the voltage just makes you face smash into the 120 PL even harder


You can't increase the PL this way; however, more voltage makes your OCs more stable.
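For reference, the power limit can also be read (and, within whatever min/max the vBIOS exposes, set) outside of Precision with nvidia-smi — a sketch, assuming a reasonably recent driver; the 300 W figure below is just the 120% target people are reporting here, not something every vBIOS will accept:

```shell
# Read the current / default / min / max power limits the vBIOS exposes
nvidia-smi -q -d POWER

# Optionally raise the software power limit toward the vBIOS maximum
# (needs admin/root; the driver rejects values outside the range above)
nvidia-smi -pl 300
```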


----------



## CRITTY

Quote:


> Originally Posted by *bizplan*
> 
> You can't increase the PL this way, however, more voltage makes your OC's more stable.


Any negatives to putting the voltage to 100%?


----------



## kx11

Quote:


> Originally Posted by *scgeek12*
> 
> 
> 
> No cards yet but waterblocks and backplates got here today


i might be in your shoes in a couple of days


----------



## DNMock

Quote:


> Originally Posted by *bizplan*
> 
> You can't increase the PL this way, however, more voltage makes your OC's more stable.


Stock voltage is enough to make me face-slam into the power limit while stable.


----------



## Evo X

Is there any benefit to using EVGA Precision X instead of MSI Afterburner?

I already have a stable OC and fan curve in AB.

I see that Precision offers voltage control, but is that necessary when I'm already hitting the temp and power limit?


----------



## Jpmboy

Quote:


> Originally Posted by *CRITTY*
> 
> Any negatives to putting the voltage to 100%?


does not add any voltage that the bios wouldn't provide otherwise... it will let you undervolt the card tho. Lack of K-boost is an "uninstall" for me.


----------



## CRITTY

Quote:


> Originally Posted by *Jpmboy*
> 
> does not add any voltage that the bios wouldn;t provide otherwise... it will let you undervolt the card tho. Lack of K-boost is an "uninstall" for me.


It maxes out at 1.0930 and my OC seems to be more "stable"; meaning my clocks stay higher and fluctuate less. My benchmarks are not higher, but in game is where I see the positive attributes I spoke of.


----------



## dante`afk

wow, Legion makes my card run hotter than hours of playing Overwatch or Doom/Witcher 3


----------



## Baasha

Quote:


> Originally Posted by *Jpmboy*
> 
> that's an odd problem, unless the cpu is bottlenecking the lower resolution benchmark? If you have speedstep enabled, make sure windows power plan is High Perf, and that min proc state = 100%. Or.. disable speedstep.
> If you look in gpuZ (after launching the GUI for 3DMark) are all 4 cards idling at around 1450 or 1500? eg, all 4 in P0? See in the 2 snips below... Card 0 is in P0, and card 1 is in P8. You can use NVI to check that all the cards are entering the proper clock state for "Max Perf" on NVCP.


Here's the Inspector states - all seem to be in P0 when the 3DMark GUI is open:

[screenshot]

EDIT: here's my 3DMark FSE run:

[screenshot]

EDIT #2: Here's the GPU-Z screenshot - all 4 seem to be idling at 1417MHz:

[screenshot]


Quote:


> Originally Posted by *EniGma1987*
> 
> What is your GPU usage on the 4 cards in FSE? My thought is that because the cards are so strong, and the resolution is kinda low, then the load on each GPU might not be causing the cards to get into boost mode. This happens when the GPU doesnt get enough utilization. So maybe 2 cards are utilized enough to hit boost, and split the load into 4 and it is too low? Just a thought.


In FSE, the usage is <50% which is pathetic.
Quote:


> Originally Posted by *xarot*
> 
> I would have guessed G-Sync is on but I guess from your sig that you don't use G-Sync monitor.


I have a G-Sync monitor on my 2nd rig but I'm not talking about that one.
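For anyone else chasing this kind of problem, the per-card P-state and clocks can also be spot-checked from the command line instead of GPU-Z/Inspector — a sketch, assuming nvidia-smi from a recent driver is on the PATH:

```shell
# One CSV row per GPU: every card should report P0 (not P8)
# while the 3DMark GUI is open with Max Performance set in NVCP
nvidia-smi --query-gpu=index,name,pstate,clocks.gr,power.draw --format=csv
```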


----------



## bizplan

Quote:


> Originally Posted by *CRITTY*
> 
> Any negatives to putting the voltage to 100%?


I've run it at max voltage (100%) for hours on end, the increase in voltage does seem to keep the clock speeds maxed out (between 2088-2100) and there is less fluctuation at these higher clocks assuming there is a consistent load on the GPU.


----------



## Glerox

Hi folks! Maybe some of you can help me! I have two questions.
First, I just finished my first custom loop for my titan XP. All with EKWB products.



I have a 6800K OC'd to 4.2 GHz at 1.4 V.
I have a 240mm top rad 25mm thick and a 360mm front rad 40mm thick.
All 5 of my fans are in push configuration set as intake, and I have a 120mm exhaust at the back.
They are EK Vardar 1150 rpm fans running at max speed.
When I game, I'm getting temps around 54-55 degrees... which is way too high IMO for a custom loop...

Is it simply because my fans are not fast enough? Or because something is wrong with the loop or the rads are not enough?

My other question is this :



Since I installed the EKWB GPU block, I've lost voltage monitoring in Afterburner... did I damage the PCB? :S

Thanks!


----------



## Glerox

Ok, for my voltage question: I installed GPU-Z and it shows VDDC, so my problem is probably with AB. Nevermind!
My bad-temps question still stands though


----------



## Jpmboy

Quote:


> Originally Posted by *Baasha*
> 
> Here's the Inspector states - all seem to be in P0 when the 3DMark GUI is open:
> 
> EDIT: here's my 3DMark FSE run:
> 
> EDIT #2: Here's the GPU-Z screenshot - all 4 seem to be idling at 1417MHz:
> 
> In FSE, the usage is <50% which is pathetic.
> I have a G-Sync monitor on my 2nd rig but I'm not talking about that one.


baasha - that's a driver issue. look at the ram utilization

[screenshot]

when you run Firestrike (1080P) is the util even lower?


----------



## Maintenance Bot

Quote:


> Originally Posted by *Glerox*
> 
> Ok for my voltage question, I installed gpu-z and it shows VDDC so my problem is probably with AB. nevermind!
> Still my bad temps question is valid


If I read your first post correctly, you have 5 intakes forcing air past the rads into the case. Maybe too much heat builds up in there?


----------



## Baasha

Quote:


> Originally Posted by *Jpmboy*
> 
> baasha - that's a driver issue. look at the ram utilization
> 
> 
> 
> 
> 
> 
> 
> 
> 
> when you run Firestrike (1080P) is the util even lower?


Hmm.. that's odd - will have to try 1080P. Will report back.

Btw, the "RAM" reading in the OSD is 'system RAM' first and then 'Pagefile'. The VRAM is all screwed up since I started running 4-Way SLI. Any idea on how to fix that - since the max VRAM is 12GB?

Or, are you talking about the numbers in GPU-Z where it shows negative utilization?

I installed the driver after DDU with Driver & PhysX only. o_0


----------



## Glerox

Quote:


> Originally Posted by *Maintenance Bot*
> 
> If I read your first post correctly, you have 5 intake forcing air past the rads into the case. Maybe too much heat in there?


Yeah, at the beginning my top rad was exhaust but the temps were even worse, like up to 80 degrees :S

I'm waiting for five corsair SP120 high performance fans (2500 rpm) to see if the low rpm fans are the problem...


----------



## Maintenance Bot

Quote:


> Originally Posted by *Glerox*
> 
> Yeah at the beginning my top rad was exhaust but the temps were even worst like up to 80 degrees :S
> 
> I'm waiting for five corsair SP120 high performance fans (2500 rpm) to see if the low rpm fans are the problem...


SP120s are nice, they should drop your temps a lot.


----------



## Render33

1.4 volts for a 4.2 overclock on your 6800k seems high. I run mine at 1.39 volts for a 4.8 overclock. Your temps don't seem that high imo for gaming.


----------



## SlammiN

My Titan XP has been used in 2 totally different systems.

The problem is that on random days the screen, or one of the screens I am using, will just cut out.

If the system only has one screen, I can't get it back on without rebooting.

If it has 2 screens, then unplugging the screen and plugging it back in seems to work.

Any ideas here?

I am about to install an EK kit this week so I hope it's not f**ked

Thanks


----------



## Nizzen

Quote:


> Originally Posted by *Render33*
> 
> 1.4 volts for a 4.2 overclock on your 6800k seems high. I run mine at 1.39 volts for a 4.8 overclock. Your temps don't seem that high imo for gaming.


But you don't have Broadwell-e


----------



## toncij

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Sp120's are nice, should drop your temps alot.


SP120s seem louder than Noctuas, even Noctua PPCs 2000...


----------



## Leyaena

Quote:


> Originally Posted by *SlammiN*
> 
> My Titan XP has been used in 2 totally different systems
> 
> The problem is on random days the screen will just cut out, or the screen I am using
> 
> If the system used only has one screen I cant get it back on without rebooting
> 
> If it has 2 screens then unplugging the screen and plugging back in seems to work
> 
> Any ideas here?
> 
> I am about to install an EK kit this week so I hope it's not f**ked
> 
> Thanks


Have you tried with different drivers?
Or DDU and reinstalling the latest?


----------



## SlammiN

Quote:


> Originally Posted by *Leyaena*
> 
> Have you tried with different drivers?
> Or DDU and reinstalling the latest?


The 2 machines do have different versions of the driver.

The previous machine would actually sometimes not come back on when I restarted it, even though I could hear Windows starting up.

I have never tried DDU, so I will give that a go too, thanks


----------



## Leyaena

Quote:


> Originally Posted by *SlammiN*
> 
> The 2 machines do have different versions of the driver.
> 
> The previous machine would actually sometimes not come back on when I restarted the machine, could hear windows coming on
> 
> I have never tried DDU so I will give that a go too thanks


It's worth a shot, remnants of the previous config might be messing with the Titan XP that's in there now.
DDU is an amazing tool, it's definitely not a bad idea to run it in between swapping GPUs


----------



## SlammiN

Quote:


> Originally Posted by *Leyaena*
> 
> It's worth a shot, remnants of the previous config might be messing with the Titan XP that's in there now.
> DDU is an amazing tool, it's definitely not a bad idea to run it in between swapping GPUs


Ok brilliant thanks for advice on that


----------



## PowerK

Quote:


> Originally Posted by *dante`afk*
> 
> No it doesn't.
> 
> https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/22.html


http://www.pcworld.com/article/3087524/hardware/tested-the-payoff-in-buying-nvidias-40-sli-hb-bridge.html


----------



## profundido

Quote:


> Originally Posted by *habu58*
> 
> I thought I would share some of my 4K benchmark results I got for a video im making. Every single setting in each game is maxed out except for AA which is noted below. I used Fraps and took benchmarks in three different areas for each game and then took the average for all three combined. I have since installed EK water blocks on both cards so I will have to do these benchmarks again.
> 
> Rig Specs:
> Rampage V
> I7 5960X 4.2Ghz
> 32GB Gskill Ripjaw
> 2x Titan X SLI (200/600 OC)
> Win 8.1
> 
> The Division - SMAA 1X Ultra
> Low 49
> Avg 71
> High 102
> 
> SWBF - TAA
> Low 105
> Avg 137
> High 162
> 
> Crysis 3 - FXAA
> Low 65
> Avg 107
> High 145
> 
> The Witcher 3 - AA Enabled
> Low 70
> Avg 92
> High 102
> 
> ROTTR - FXAA
> Low 51
> Avg 99
> High 118
> 
> Assassins Creed Syndicate - TXAA4X + FXAA
> Low 65
> Avg 102
> High 132
> 
> DOOM - TSAA 8TX
> Low 85
> Avg 110
> High 132
> 
> GTAV - FXAA/4xMSAA - 2xMSAA Reflections
> Low 71
> Avg 87
> High 115


What do you get in Time Spy or Firestrike normal (as a personal basis of comparison to my system)?


----------



## st0necold

Quote:


> Originally Posted by *toncij*
> 
> SP120s seem louder than Noctuas, even Noctua PPCs 2000...


SP120's are great but brutal on noise and I am not one who really cares about fan noise--


----------



## habu58

Quote:


> Originally Posted by *profundido*
> 
> What do you got in timespy or firestrike normal (for personal base of comparison to my system) ?


I've only ever run FSU so I'll have to do another with normal. Ultra always crashes with my OC though, but hopefully normal won't. And I'm on Win 8 so I can't run Time Spy.


----------



## profundido

Quote:


> Originally Posted by *Glerox*
> 
> Yeah at the beginning my top rad was exhaust but the temps were even worst like up to 80 degrees :S
> 
> I'm waiting for five corsair SP120 high performance fans (2500 rpm) to see if the low rpm fans are the problem...


I read your first post. You need more rad surface for your current target fan speeds. In other words: your temps are perfectly normal for your setup, and there's no real reason to bring them further down without a cause such as more thermal headroom or more epeen. Remember that on stock coolers these cards can easily go up to 80°C and be safe. If you really still want to bring the temps down and you don't mind the noise, your best bet is indeed high performance fans, and don't forget to beef up the exhaust accordingly too, or the heat will build up on the motherboard and the rads (heat pushback).

also keep in mind that the TXP runs very hot by default if you overclock it. If you consider more/thicker rads to increase the 3D surface, here's my setup for a base of comparison:

1*420 SR2 Blackice rad with thermaltake 140 riiing RGB fans (front)
1*360 SR2 Blackice rad with EK Vardar's on it (top)
1*360 PE EK rad with thermaltake 120 riing RGB fans(side)

[email protected] Ghz. 2*TXP SLI. Both cards run at 41° C while gaming with +200/+500 OC or benchmarking Firestrike/Timespy

All fans @ 600 RPM, top and rear exhaust, bottom + side intake. Case internal temp: 26° idle, 36° under load. Room temp 26°.

I'm glad I overdimensioned the WC on this new machine from the start, because now I have a lot of thermal room left with this rig for when I want to crank things up in the future or put in extra-hot components
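As a rough sanity check on rad area vs. heat load, the old enthusiast rule of thumb of very roughly 100 W of heat per 120 mm of radiator at moderate fan speed can be sketched out — the wattage figures below are assumptions for illustration, not measurements:

```python
# Rule-of-thumb radiator sizing check (illustrative numbers only).
# Assumption: ~100 W dissipated per 120 mm of rad at moderate fan speed;
# thicker rads and faster fans push this up, slow/quiet fans pull it down.

def rad_capacity_watts(total_mm_of_rad, watts_per_120mm=100):
    """Rough dissipation capacity for a given total radiator length."""
    return total_mm_of_rad / 120 * watts_per_120mm

# Glerox's loop: 240 mm + 360 mm of rad cooling a ~250 W TXP
# plus a ~140 W overclocked 6800K (both wattages are assumed figures).
capacity = rad_capacity_watts(240 + 360)   # 500 W of rad capacity
heat_load = 250 + 140                      # ~390 W into the loop

print(capacity, heat_load)
```

With slow fans the real capacity sits well below the rule-of-thumb number, which is consistent with the mid-50s temps being normal for that loop rather than a fault.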


----------



## Glerox

Quote:


> Originally Posted by *Nizzen*
> 
> But you don't have Broadwell-e


Yeah, Broadwell-E OCs very badly... I tried 4.3 GHz and even at 1.48 V it wasn't stable... so it's not worth it for me beyond 4.2 GHz... unlucky in the lottery!


----------



## profundido

Quote:


> Originally Posted by *Nizzen*
> 
> But you don't have Broadwell-e


and yet he's right.

[email protected] here.

Stable OC with no more than 1.36 V on the CPU core under load in Windows with all 10 cores enabled. I might have gotten a bit lucky in the silicon lottery though. 4.3 GHz works but isn't worth it; it needs sick voltage and thermals


----------



## pez

Got my BF1 Beta Invite this morning, so going to try and get some game time in once I get home. Hoping to see some great results at UW 1440p


----------



## Maintenance Bot

Quote:


> Originally Posted by *pez*
> 
> Got my BF1 Beta Invite this morning, so going to try and get some game time in once I get home. Hoping to see some great results at UW 1440p
> 
> 
> 
> 
> 
> 
> 
> .


Runs damn good here at 4K


----------



## KillerBee33

Any one try this yet?
http://www.nvidia.com/download/driverResults.aspx/107012/en-us


----------



## Lobotomite430

Quote:


> Originally Posted by *pez*
> 
> Got my BF1 Beta Invite this morning, so going to try and get some game time in once I get home. Hoping to see some great results at UW 1440p
> 
> 
> 
> 
> 
> 
> 
> .


It ran great on my UW 1440 with my 980 ti in alpha, try turning up the resolution scaling and see if that makes full use of the card!


----------



## pez

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Runs damm good here at 4k


Single card on 4K? That's exciting. I'm hoping for 100FPS+ with some setting tweaks for my setup... though that may be a far-fetched dream


----------



## Jpmboy

Quote:


> Originally Posted by *Baasha*
> 
> Hmm.. that's odd - will have to try 1080P. Will report back.
> 
> Btw, the "RAM" reading in the OSD is 'system RAM' first and then 'Pagefile'. The VRAM is all screwed up since I started running 4-Way SLI. Any idea on how to fix that - since the max VRAM is 12GB?
> 
> Or, are you talking about the numbers in GPU-Z where it shows negative utilization?
> 
> I installed the driver after DDU with Driver & PhysX only. o_0


yeah - the GPU-Z "negative" RAM utilization. Something ain't right there. I know 4-way will work in FSE... all I can suggest is to use driver version 372.54 or reinstall it. Disable SLI before installing.
Frankly, these TX Pascal drivers are just crap IMO.


----------



## axiumone

Quote:


> Originally Posted by *pez*
> 
> Single card on 4K? That's exciting. I'm hoping for 100FPS + with some setting tweaks for my setup...though that may be a far fetched dream
> 
> 
> 
> 
> 
> 
> 
> .


That should be doable for you. I have a very similar set up to yours. 1080 sli nets me around 130-160 fps on all ultra.


----------



## EniGma1987

Quote:


> Originally Posted by *SlammiN*
> 
> My Titan XP has been used in 2 totally different systems
> 
> The problem is on random days the screen will just cut out, or the screen I am using
> 
> If the system used only has one screen I cant get it back on without rebooting
> 
> If it has 2 screens then unplugging the screen and plugging back in seems to work
> 
> Any ideas here?
> 
> I am about to install an EK kit this week so I hope it's not f**ked
> 
> Thanks


You have a GSync display? Mine did that and I got so fed up with the monitor I got rid of it after a month. Most expensive monitor I ever bought and it gave me more headaches than any monitor I have ever bought too. lol


----------



## Maintenance Bot

Quote:


> Originally Posted by *pez*
> 
> Single card on 4K? That's exciting. I'm hoping for 100FPS + with some setting tweaks for my setup...though that may be a far fetched dream
> 
> 
> 
> 
> 
> 
> 
> .


Yeah, I was getting around 75-80fps @ 3865x2174.

Their resolution scale works a bit differently; I had to open up the console to see and adjust the setting.


----------



## dante`afk

bf1 runs about as well as sw:bf does, because it's the same engine.

however, the game gets utterly boring after 1-3 rounds; it's for casuals again. Same UI, interface, and gameplay as sw:bf, only with different weapons.


----------



## GunnzAkimbo

Quote:


> Originally Posted by *dante`afk*
> 
> bf1 runs as good as sw:bf does, because it's the same engine.
> 
> however the game gets utterly boring after 1-3 rounds, it's for casuals again. same UI, interface, and gameplay like sw:bf only with different weapons.


The gaming industry has turned into a cash cow. Glad I was part of the innovative Golden Age of computers, which had none of this BS marketing / recycled BS with a new haircut.


----------



## pez

Quote:


> Originally Posted by *axiumone*
> 
> That should be doable for you. I have a very similar set up to yours. 1080 sli nets me around 130-160 fps on all ultra.


That's great to hear! I need a casual MP game for taking a break from OW and CS:GO, so this sounds like it should be great if I end up liking the beta. Many thanks for the input!


----------



## profundido

Quote:


> Originally Posted by *EniGma1987*
> 
> You have a GSync display? Mine did that and I got so fed up with the monitor I got rid of it after a month. Most expensive monitor I ever bought and it gave me more headaches than any monitor I have ever bought too. lol


Either you simply had a defective monitor or you're doing something wrong, imho. I've had 2 different Asus ROG Swifts with G-Sync so far (PG27AQ and PG278Q). I've used G-Sync since it was released and haven't seen a single problem or anything similar to what you describe. One of the most stable game-changing new technologies I've ever seen implemented, tbh.


----------



## toncij

Quote:


> Originally Posted by *dante`afk*
> 
> bf1 runs as good as sw:bf does, because it's the same engine.
> 
> however the game gets utterly boring after 1-3 rounds, it's for casuals again. same UI, interface, and gameplay like sw:bf only with different weapons.


It runs a bit slower at max settings and SLI doesn't work very well (10-30% scaling at best). Regarding quality, it's very far from boring. It's very good and fun. Like all Battlefield games it's for casuals, of course, but for professional FPS you should look elsewhere, like Counter-Strike.


----------



## axiumone

Quote:


> Originally Posted by *toncij*
> 
> No, it does not. It runs a bit slower for the max settings. Regarding quality, it's very far from boring. It's very good and fun. Like all Battlefield games it's for casuals, of course, but for professional FPS you should look elsewhere, like Counter Strike.


Really? It runs way better than Battlefront for me. All ultra, I see about 30fps more in BF1. Tough to judge by just one map, however.


----------



## bl4ckdot

New drivers out : http://www.geforce.com/whats-new/articles/battlefield-1-open-beta-world-of-warcraft-legion-game-ready-driver-released

Release notes : http://us.download.nvidia.com/Windows/372.70/372.70-win10-win8-win7-desktop-release-notes.pdf


----------



## kx11

Deus Ex runs much better with these drivers


----------



## scgeek12

Look what came in the mail today!


----------



## jhowell1030

Quote:


> Originally Posted by *kx11*
> 
> Deus ex runs much better with this drivers


How much better? Any before/after specs?


----------



## mouacyk

Quote:


> Originally Posted by *scgeek12*
> 
> 
> 
> Look what came in the mail today!


$3k worth of enthusiasm? Gratz.


----------



## bizplan

Re: the new version (6.05) of EVGA Precision XOC for the Titan XP, I noticed that [one] benchmark was significantly lower using XOC vs. MSI AB at the same power (120), temp (90), clock (+195), memory (+595) and voltage settings (100%).

I only tested (2x) 3DMark 11, and my score was ~600 points lower with XOC (~27,200 XOC vs. ~27,800 AB, 6700K @ 4.7 & single Titan XP).

Can someone else test & confirm with other benchmarks?

Thanks!


----------



## EniGma1987

Quote:


> Originally Posted by *kx11*
> 
> Deus ex runs much better with this drivers


Much better than the last game Ready drivers for Deus Ex? Or much better than the Titan X launch drivers? Cause ya, I saw a huge difference between the launch drivers and the last game ready ones. I will give the newest release a try later today.


----------



## cisco0623

Quote:


> Originally Posted by *scgeek12*
> 
> 
> 
> Look what came in the mail today!


Congrats!!


----------



## Glerox

Quote:


> Originally Posted by *profundido*
> 
> I read your first post. You need more 3D rad surface for your current aimed fan speeds. In other words: your temps are perfectly normal for your current setup and there's no real reason to try and bring them further down without an ulterior cause such as more thermal headroom or more epeen. Remember that on stock coolers these can easy go up to 80° Celsius and be safe. If you really still want to bring the temps further down and you don't mind the noise, your best bet is indeed high performance fans and don't forget to beef up the exhaust accordingly too or the heat will build up on the motherboard and the rads (heat pushback)
> 
> also keep in mind that the TXP runs very hot by default if you overclock it. If you consider more/thicker rads to increase the 3D surface, here's my setup for a base of comparison:
> 
> 1*420 SR2 Blackice rad with thermaltake 140 riiing RGB fans (front)
> 1*360 SR2 Blackice rad with EK Vardar's on it (top)
> 1*360 PE EK rad with thermaltake 120 riing RGB fans(side)
> 
> [email protected] Ghz. 2*TXP SLI. Both cards run at 41° C while gaming with +200/+500 OC or benchmarking Firestrike/Timespy
> 
> All fans @ 600 RPM, top and rear exhaust, bottom + side intake, case inside room temp 26 idle, 36 under load. Room temp 26°
> 
> I'm glad I overdimensioned WC on this new machine from the start because now I have alot of thermal room left with this new rig for when I want to crank up future-wise or put in extra-hotter components


Awesome! Thanks for your detailed answer! I should have more rad surface but no more room in my nzxt h440...

I'll try high performance fans and control the speed to be quiet in idle mode with nzxt grid v2 fan controller...

Just want to keep a load temp below 50 degrees because this is where I start to see thermal throttling with Pascal.

My max clock is 2063mhz without modifying voltage. Trying to get to a stable 2100mhz


----------



## toncij

Quote:


> Originally Posted by *bl4ckdot*
> 
> New drivers out : http://www.geforce.com/whats-new/articles/battlefield-1-open-beta-world-of-warcraft-legion-game-ready-driver-released
> 
> Release notes : http://us.download.nvidia.com/Windows/372.70/372.70-win10-win8-win7-desktop-release-notes.pdf


BF1 Beta - no SLI... it seems like SLI is more and more wasted money.


----------



## scohen158

Congrats! I just returned one of my Titan X Pascals and the SLI bridge. I realized I didn't need two, and although it was fun, I can always get one again in a while for less. I plan to put the EVGA Hybrid cooler on it when they release it, as I don't want a custom loop.

What processor are you using the Titans with? I have a 6850K.


----------



## Nizzen

Quote:


> Originally Posted by *toncij*
> 
> BF 1 Beta - no SLI... seems like SLI is pretty much wasted money more and more.


Beta...

Sli is coming to Bf1 DX12 too


----------



## Artah

Quote:


> Originally Posted by *scohen158*
> 
> Congrats I just returned one of my Titan X Pascals and the SLI bridge. I realized I didn't need two and although fun I can always get one in awhile for less. I plan to put the EVGA Hybrid Cooler on it when they release it as I don't want a custom loop.
> 
> What processor are you using the Titan's with? I have a 6850K.


They let you return it opened?


----------



## scohen158

Quote:


> Originally Posted by *Artah*
> 
> They let you return it opened?


Yes within 30 days


----------



## Glerox

Has anyone else observed decreased performance with overvolting (even with watercooling)?
Using MSI Afterburner with overvolting unlocked:

Temps around 44 degrees.
Firestrike Ultra 7707 with no overvolting.
Firestrike Ultra 7528 with 100% overvolting. I had to reduce the GPU boost offset from +225 to +220 because it would make the driver crash (but it works with no overvolting, which I don't understand).
Even though I reached a higher max clock speed, the clock seems to fluctuate more with overvolting.

I observed the same thing when I had a GTX 1080. It was way more unstable with overvolting. I finally left it at no overvolting for the best performance/stability!

Maybe Pascal doesn't like overvolting?


----------



## Vellinious

Quote:


> Originally Posted by *Glerox*
> 
> Does anyone else observed decreased performance with overvolting? (even with watercooling)
> Using MSI afterburner with overvolting unlocked :
> 
> Temps around 44 degrees.
> Firestrike ultra 7707 with no overvolting.
> Fitestrike ultra 7528 with 100% overvolting. I had to diminish the gpu boost from +225 to +220 because it would make the driver crash (but it's working with no overvolting, which i don't understand).
> Even if i reached a higher max clockspeed, clock seem to fluctuate more with overvolting
> 
> I observed the same results when I had a gtx 1080. Was way more unstable with overvolting. I finally let it to no overvolting for the best performance/stability!
> 
> Maybe Pascal doesn't like overvolting?


It's more about the temps than the volts... it was starting to shift that way with Maxwell, and Pascal just moved further in that direction. Increasing voltage on Pascal is going to do nothing without getting temps ridiculously low.


----------



## pompss

Quote:


> Originally Posted by *Glerox*
> 
> Does anyone else observed decreased performance with overvolting? (even with watercooling)
> Using MSI afterburner with overvolting unlocked :
> 
> Temps around 44 degrees.
> Firestrike ultra 7707 with no overvolting.
> Fitestrike ultra 7528 with 100% overvolting. I had to diminish the gpu boost from +225 to +220 because it would make the driver crash (but it's working with no overvolting, which i don't understand).
> Even if i reached a higher max clockspeed, clock seem to fluctuate more with overvolting
> 
> I observed the same results when I had a gtx 1080. Was way more unstable with overvolting. I finally let it to no overvolting for the best performance/stability!
> 
> Maybe Pascal doesn't like overvolting?


You are hitting the power limit sooner by adding more voltage.
Without a BIOS that unlocks the power limit, adding more voltage will only make it worse.
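The back-of-the-envelope reason: dynamic power scales roughly with V² at a fixed clock, so even a small voltage bump eats into a fixed power limit quickly and the boost algorithm has to drop clocks to compensate — a sketch with illustrative voltages, not measured TXP figures:

```python
# Dynamic power scales roughly as C * V^2 * f, so at a fixed power limit
# extra voltage means GPU Boost must shed clocks instead of gaining them.

def relative_power(v, f, v0=1.062, f0=2000.0):
    """Power relative to an (assumed) 1.062 V / 2000 MHz baseline."""
    return (v / v0) ** 2 * (f / f0)

# A ~3% voltage bump at the same clock costs ~6% more power
print(round(relative_power(1.093, 2000.0), 3))  # → 1.059
```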


----------



## Glerox

Quote:


> Originally Posted by *pompss*
> 
> You are hitting power limit soner by adding more voltage.
> Without a bios to unlock power limit adding more voltage will only make it worst.


Quote:


> Originally Posted by *Vellinious*
> 
> It's more about the temps than the volts....it was starting to shift that way with maxwell. Pascal just moved further that direction. Increasing voltage on Pascal is going to do nothing without getting temps ridiculously low.


Makes sense! I guess I'll stay with stock voltage. Waiting for a Bios mod because don't think it's worth it to hard power mod with CLU...


----------



## Stateless

I have a question for those who have installed EK blocks. When using the thermal pads that come with them, did you cut down the pieces that go over sections 2 and 3, or did you just lay the entire piece over those sections? I know they come pre-cut for the memory modules, but I'm not sure whether the other pieces need to be cut down or just laid over those sections as-is.


----------



## Yuhfhrh

Quote:


> Originally Posted by *Stateless*
> 
> I have a question for those that have installed EK blocks. When using the thermal pads that come them, did you cut down the pieces to go over sections 2 and 3 or did you just lay the entire thing over the sections for 2 and 3? I know they come pre-cut for the memory modules, but not sure if I need to cut down the other pieces to over the other parts or just lay the entire thing over those sections?


Just lay them down, no need to cut.


----------



## Artah

Quote:


> Originally Posted by *Stateless*
> 
> I have a question for those that have installed EK blocks. When using the thermal pads that come them, did you cut down the pieces to go over sections 2 and 3 or did you just lay the entire thing over the sections for 2 and 3? I know they come pre-cut for the memory modules, but not sure if I need to cut down the other pieces to over the other parts or just lay the entire thing over those sections?


For me I cut the #2 thinner so it fits better on top of the VRMs. #3 I just shortened it so it's not too long.


----------



## pez

Quote:


> Originally Posted by *bl4ckdot*
> 
> New drivers out : http://www.geforce.com/whats-new/articles/battlefield-1-open-beta-world-of-warcraft-legion-game-ready-driver-released
> 
> Release notes : http://us.download.nvidia.com/Windows/372.70/372.70-win10-win8-win7-desktop-release-notes.pdf


I will be trying these later today hopefully. Thanks for posting this.

Game is running 120-150 normally with the lowest FPS I think I've seen (DX11) at 88. I turned down AA and don't think I noticed a huge minimum dip, but rather an average of probably 95-110. Would love to see 100+ at full tilt or 100+ without AA. Game is a blast for me so far.
Quote:


> Originally Posted by *toncij*
> 
> BF 1 Beta - no SLI... seems like SLI is pretty much wasted money more and more.


It is still beta









----------



## markklok

Quote:


> Originally Posted by *Stateless*
> 
> I have a question for those who have installed EK blocks. When using the thermal pads that come with them, did you cut down the pieces that go over sections 2 and 3, or did you just lay the entire thing over those sections? I know they come pre-cut for the memory modules, but I'm not sure if I need to cut down the other pieces to cover the other parts or just lay the entire thing over those sections.


I cut nr. 2 on the sides for a better fit. On top, you can see I also made a short pad (also nr. 2).
Nr. 3 I just cut a bit shorter.


----------



## iTurn

Nice "perspective" vid from digital foundry.


----------



## MrKenzie

Quote:


> Originally Posted by *EniGma1987*
> 
> Much better than the last game Ready drivers for Deus Ex? Or much better than the Titan X launch drivers? Cause ya, I saw a huge difference between the launch drivers and the last game ready ones. I will give the newest release a try later today.


I can confirm there is ZERO fps change on a 1080 with the new drivers. The game reverts back to "low" settings after the driver update, but once you stick it back where it was, there is no increase to fps.


----------



## profundido

Quote:


> Originally Posted by *Stateless*
> 
> I have a question for those who have installed EK blocks. When using the thermal pads that come with them, did you cut down the pieces that go over sections 2 and 3, or did you just lay the entire thing over those sections? I know they come pre-cut for the memory modules, but I'm not sure if I need to cut down the other pieces to cover the other parts or just lay the entire thing over those sections.


Thermal pads 2 and 3 are too long, so you'll need to cut those. When you do, take into account that on the top side you want to cover just enough and not 1 mm more, because if you overshoot (like I did at first by mistake, in the attached picture below) you'll see the thermal pad sticking out of the block once the card is installed in your PC. I marked the spot for your convenience.


----------



## bl4ckdot

Quote:


> Originally Posted by *profundido*
> 
> Thermal pads 2 and 3 are too long, so you'll need to cut those. When you do, take into account that on the top side you want to cover just enough and not 1 mm more, because if you overshoot (like I did at first by mistake, in the attached picture below) you'll see the thermal pad sticking out of the block once the card is installed in your PC. I marked the spot for your convenience.


Did you keep the weird white Nvidia pads?


----------



## MikeSanders

Still no modded BIOS? =(


----------



## profundido

Quote:


> Originally Posted by *bl4ckdot*
> 
> Did you keep the weird white Nvidia pads?


Yes I did, but only on the extra spots that EK doesn't cover. It's not required, but it sure doesn't hurt. My thermals are great on both cards.


----------



## Lobotomite430

I fixed my cooling issue with the EVGA 1080/1070 hybrid kit I'm using on my Titan. Now my games are running at 55°C rather than 70+. I am very happy with the EVGA hybrid kit!


----------



## RedM00N

Quote:


> Originally Posted by *toncij*
> 
> BF 1 Beta - no SLI... seems like SLI is pretty much wasted money more and more.


372.70 removed SLI support due to some rare issue or something. You could always downgrade to the 372.54 driver (which had SLI support), and export the BF1 profile with Nvidia inspector, then install 372.70 and import that profile if you really want SLI on the newest driver.


----------



## toncij

Quote:


> Originally Posted by *RedM00N*
> 
> 372.70 removed SLI support due to some rare issue or something. You could always downgrade to the 372.54 driver (which had SLI support), and export the BF1 profile with Nvidia inspector, then install 372.70 and import that profile if you really want SLI on the newest driver.


I sincerely hope that 372.54's "SLI support" is not what we can expect. The second card had 40-70% usage, and the FPS gain was marginal at best, from 5% to 30% max in 1440p average framerates.


----------



## Snaporz

Quote:


> Originally Posted by *profundido*
> 
> The thermal pads 2 and 3 are too long so you'll need to cut those. When you do, take into account that on the top side you want to cover just enough and not 1 mm more because when you do (like I did first by mistake in attached picture below) you'll see that thermal pad sticking out of the block once the card is installed in your pc. I marked the spot for your convenience.


I have part of that top module uncovered as well, where your green arrow is. Glad I didn't royally mess up. I took off all of the white stock stuff, though. Need to finally get around to OC'ing, but my idle temps are 25°C and didn't go higher than 34°C in the BF1 beta last night. Love it.


----------



## RepTexas

Hey guys, quick question.

My OC is +250 on the core with an EK water block, and it was averaging in the 2050 to 2100 range at boost.

Everything has been fine, lots of gaming and testing for weeks with no previous issues.

Yesterday while playing the BF1 beta I noticed my boost doesn't go over 1800 now. I tried other games and sure enough, same thing: it shows my clock at +250 but stops at 1800 in The Witcher, ARK and GTA V as well.

Anyone run into this? Is it the new driver (I did update), or did I break something? Could it be Afterburner?

I'm asking because I'm at work all day and can't troubleshoot till late tonight, and it's bugging me lol.


----------



## CallsignVega

My cards actually run a little bit faster with the voltage slider at default instead of increased. This is with the CLU mod. I wonder what you guys on water would show?


----------



## DADDYDC650

Kinda bored with my XP. Should I return it and try the Titan lottery again? My current XP is stable around 2000-2030 MHz core and +500-600 memory.


----------



## profundido

Quote:


> Originally Posted by *RepTexas*
> 
> Hey guys, quick question.
> 
> My OC is +250 on the core with an EK water block, and it was averaging in the 2050 to 2100 range at boost.
> 
> Everything has been fine, lots of gaming and testing for weeks with no previous issues.
> 
> Yesterday while playing the BF1 beta I noticed my boost doesn't go over 1800 now. I tried other games and sure enough, same thing: it shows my clock at +250 but stops at 1800 in The Witcher, ARK and GTA V as well.
> 
> Anyone run into this? Is it the new driver (I did update), or did I break something? Could it be Afterburner?
> 
> I'm asking because I'm at work all day and can't troubleshoot till late tonight, and it's bugging me lol.


Sounds like Nvidia driver growing pains. Did you update your Nvidia drivers or anything else recently, by chance? If so, have you tried reverting? Also, have your thermals changed, maybe causing your card to throttle? Those are the first things I could think of.


----------



## KillerBee33

@CallsignVega
Haven't tried it on the TXP, but the 1080 FE had the exact same results judging by 3DMark runs: around 200 points less with the voltage raised through OC software.


----------



## RepTexas

Quote:


> Originally Posted by *profundido*
> 
> Sounds like Nvidia driver growing pains. Did you update your Nvidia drivers or anything else recently, by chance? If so, have you tried reverting? Also, have your thermals changed, maybe causing your card to throttle? Those are the first things I could think of.


I did update the Nvidia driver before playing the Battlefield 1 beta, so I'm guessing that has to be it. Nothing else has changed, and my temps are still 33-45°C.

I guess I will have to revert and test some more when I get home tonight. If anyone else has the same issue in the meantime, please let me know.


----------



## DNMock

Quote:


> Originally Posted by *RepTexas*
> 
> I did update the Nvidia driver before playing the Battlefield 1 beta, so I'm guessing that has to be it. Nothing else has changed, and my temps are still 33-45°C.
> 
> I guess I will have to revert and test some more when I get home tonight. If anyone else has the same issue in the meantime, please let me know.


How is the power limit doing? Are you still at the same spot there, or did that go down as well?

It may just be that the boost is less aggressive with the latest driver and you need to push your base core clock higher to compensate.

Comically, my clocks used to redline at 2050 whether I was at +100 or +200 OC; now what I set my core to actually has an influence on what the final boost is.


----------



## jhowell1030

Quote:


> Originally Posted by *Lobotomite430*
> 
> I fixed my cooling issue with the EVGA 1080/1070 hybrid kit I'm using on my Titan. Now my games are running at 55°C rather than 70+. I am very happy with the EVGA hybrid kit!


If you don't mind me asking, what was it you needed to do to fix the issue? I have a Kraken G10 coming tomorrow and I plan on putting it on my Titan this weekend.


----------



## jhowell1030

Why so high on the memory? Lots of folks are posting that anything more than 450-475ish doesn't show any improvement in FPS.


----------



## Lobotomite430

I'm pretty sure I was over-tightening the screws that hold the pump on; I turned them until they were hand-tight, which I think was too much. Every time I took the pump off there was almost no paste on the GPU/pump. I think I'm just not used to the spring screws, but this time I didn't tighten them much and I would consider them to be loose. I also added a push fan on the radiator, as I wasn't using one before. Now the card is running cool and I am happy.


----------



## jhowell1030

Cool deal. Yeah, I've never put an AIO on a GPU, so this will be my first go-around until I convince the wife to let me drop some cash on a custom loop. I've got a Kraken X61 that I'll set up in push/pull; we'll see how that works out. This will also be my first time tearing apart a GPU, so that'll be fun as well.


----------



## Lobotomite430

It was pretty easy, just go slow, and if a piece doesn't come off right away, look for a hidden screw. Also, use a magnetic tray to hold the screws if you have pets like mine that love to interfere with projects; it's easy to lose these tiny screws!


----------



## marc0053

Hi all, a few things I have observed:
- Applying a thin coat of Coollaboratory Liquid Ultra on the 3x shunt resistors helped reduce the power % in Firestrike to a maximum of around 85%, while without the shunt mod I was hitting 120%. I applied the CLU using the tip of the tube and thinned it out using Q-tips. I mount my card horizontally with a fan blasting air on the VRMs for cooling, as I am using an EK Thermosphere waterblock.
- Although the shunt mod eliminates most of the core clock throttling, there is still throttling due to GPU temps. On air, clocks would throttle down by as much as 200 MHz, while on water it was around 50 MHz; doing the shunt mod with watercooling brings the throttle down to about 20 MHz or so. My GPU temps are 21°C unloaded and about 31°C when loaded (Firestrike). This helped most benchmark scores by a small amount.





Yesterday I put the TX on LN2. The clocks scale very well with cold: for example, in GPUPI I could clock up to 2139 MHz on water at 21°C/31°C, but on LN2 at -38°C I got up to 2316 MHz. For those interested in cooling the TX below ambient: the card drops clocks by 200 MHz at around 0°C or colder, and for me at -38°C or colder the GPU crashes under any load. The only way to avoid the 200 MHz downclock below 0°C was to set the core clocks using the "Curve" as opposed to the offset. When you press CTRL+F in MSI Afterburner you will get the clock curve. Click on the point assigned to 1.075V and hold the CTRL button while moving the point up to your desired clock (for me it was 2316 MHz for GPUPI @ 1.075V). http://hwbot.org/submission/3301662_marc0053_gpupi___1b_titan_x_pascal_10sec_144ms
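For anyone curious about the numbers behind the shunt mod, here's a rough sketch (Python; the resistor values are illustrative assumptions, not measured from a Titan X Pascal) of why liquid metal across a shunt lowers the card's reported power: the card estimates current from the voltage drop across the shunt (Ohm's law), so a parallel conductive path shrinks the drop and therefore the reading.

```python
# Toy model of the shunt mod's effect on reported power.
# The card measures V_shunt and infers current as I = V_shunt / R_shunt,
# so reported power scales with the effective shunt resistance.

def reported_power_pct(true_power_w, r_shunt_ohm, r_mod_ohm, tdp_w=250.0):
    """Reported power % when a parallel path (the CLU) of resistance
    r_mod_ohm is added across a shunt of resistance r_shunt_ohm.
    All resistor values here are assumptions for illustration."""
    r_eff = (r_shunt_ohm * r_mod_ohm) / (r_shunt_ohm + r_mod_ohm)  # parallel combination
    # The card still divides the (now smaller) shunt voltage by the stock
    # R_shunt, so it under-reports by the ratio r_eff / r_shunt.
    return 100.0 * true_power_w * (r_eff / r_shunt_ohm) / tdp_w

# Unmodified 5 mOhm shunt (parallel path effectively open): ~120% reported
print(round(reported_power_pct(300.0, 0.005, 1e9), 1))    # 120.0
# Same 300 W true draw with an assumed 15 mOhm parallel path: ~90% reported
print(round(reported_power_pct(300.0, 0.005, 0.015), 1))  # 90.0
```

Of course the card never actually draws less power; it just thinks it does, which is why the power limit stops kicking in and why you still want good VRM cooling when doing this.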


----------



## TK421

Is it possible to buy this from Micro Center or another shop?


----------



## DADDYDC650

Quote:


> Originally Posted by *DADDYDC650*
> 
> Kinda bored with my XP. Should I return it and try the Titan lottery again? My current XP is stable around 2000-2030 MHz core and +500-600 memory.


Anyone?


----------



## profundido

Thanks for sharing that nice info, Marc. Nice to see you keep pushing every corner to squeeze out more, and to hear about your latest findings.









Honestly, I don't have the guts to do any hardware mod to my ridiculously expensive, hard-earned TXP cards, but I can't deny the itch to do the shunt mod... =P


----------



## axiumone

Quote:


> Originally Posted by *TK421*
> 
> Is it possible to buy this from Micro Center or another shop?


No, only from the Nvidia website.


----------



## profundido

Quote:


> Originally Posted by *DADDYDC650*
> 
> Anyone?


Seriously? Would you send back a product like that just in the hope of getting a few more MHz on the core overclock? Are we becoming that spoiled?









I personally would be very happy with what I got and call it a day. Even if you got lucky, the difference would be marginal after all, right?


----------



## DADDYDC650

Quote:


> Originally Posted by *profundido*
> 
> Seriously? Would you send back a product like that just in the hope of getting a few more MHz on the core overclock? Are we becoming that spoiled?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I personally would be very happy with what I got and call it a day. Even if you got lucky, the difference would be marginal after all, right?


Yes, I'm serious, and yes, I'm spoiled.


----------



## Baasha

The GPU sammich is causing the top card to get to 90°C, which then causes thermal throttling, so that's annoying.

Nonetheless, performance is staggering since my memory is stable across all 4 GPUs at +650 and OC +150.

Battlefield 4 in Ultra @ 8K:


----------



## profundido

Quote:


> Originally Posted by *Baasha*
> 
> The GPU sammich is causing the top card to get to 90C which then causes thermal throttling so that's annoying.
> 
> Nonetheless, performance is staggering since my memory is stable across all 4 GPUs at +650 and OC +150.
> 
> Battlefield 4 in Ultra @ 8K:


I read and followed up on your setup, and I remember thinking exactly that. In fact, even far below 90°C you have gradual throttling, so I would expect the total performance of your 4-way SLI setup to skyrocket if you put it on water with some oversized rads behind it. I can hear them screaming for water like flowers in the desert already.







What's holding you back, mate? I for one am dying to see the difference that would make.


----------



## Gary2015

Just played 3 hours of the BF1 beta. It's amazing with DX12.


----------



## Gary2015

Quote:


> Originally Posted by *Baasha*
> 
> The GPU sammich is causing the top card to get to 90C which then causes thermal throttling so that's annoying.
> 
> Nonetheless, performance is staggering since my memory is stable across all 4 GPUs at +650 and OC +150.
> 
> Battlefield 4 in Ultra @ 8K:


Why are you showing vids of BF4? Do some of the BF1 beta!!!


----------



## Jpmboy

Quote:


> Originally Posted by *marc0053*
> 
> Hi all, a few things I have observed:
> - Applying a thin coat of Coollaboratory Liquid Ultra on the 3x shunt resistors helped reduce the power % in Firestrike to a maximum of around 85%, while without the shunt mod I was hitting 120%. I applied the CLU using the tip of the tube and thinned it out using Q-tips. I mount my card horizontally with a fan blasting air on the VRMs for cooling, as I am using an EK Thermosphere waterblock.
> - Although the shunt mod eliminates most of the core clock throttling, there is still throttling due to GPU temps. On air, clocks would throttle down by as much as 200 MHz, while on water it was around 50 MHz; doing the shunt mod with watercooling brings the throttle down to about 20 MHz or so. My GPU temps are 21°C unloaded and about 31°C when loaded (Firestrike). This helped most benchmark scores by a small amount.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yesterday I put the TX on LN2. The clocks scale very well with cold: for example, in GPUPI I could clock up to 2139 MHz on water at 21°C/31°C, but on LN2 at -38°C I got up to 2316 MHz. For those interested in cooling the TX below ambient: the card drops clocks by 200 MHz at around 0°C or colder, and for me at -38°C or colder the GPU crashes under any load. The only way to avoid the 200 MHz downclock below 0°C was to set the core clocks using the "Curve" as opposed to the offset. When you press CTRL+F in MSI Afterburner you will get the clock curve. Click on the point assigned to 1.075V and hold the CTRL button while moving the point up to your desired clock (for me it was 2316 MHz for GPUPI @ 1.075V). http://hwbot.org/submission/3301662_marc0053_gpupi___1b_titan_x_pascal_10sec_144ms
> 
> 


Thanks for sharing this, Marc.








Quote:


> Originally Posted by *Baasha*
> 
> The GPU sammich is causing the top card to get to 90C which then causes thermal throttling so that's annoying.
> 
> Nonetheless, performance is staggering since my memory is stable across all 4 GPUs at +650 and OC +150.
> 
> Battlefield 4 in Ultra @ 8K:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 


4 air-cooled TXPs in a $5000 heat sammich... the shame!


----------



## Gary2015

Quote:


> Originally Posted by *DADDYDC650*
> 
> Anyone?


I'm going to return one of mine.


----------



## markklok

Quote:


> Originally Posted by *DADDYDC650*
> 
> Anyone?


You're not alone... My max stable core is 2037 and mem is +575.

But how much time do you want to spend chasing that extra 2 to 4%?

I just accepted it... maybe I'll get some extra speed with custom firmware (if that ever becomes possible later on).


----------



## fuzzybass

Quote:


> Originally Posted by *Baasha*
> 
> The GPU sammich is causing the top card to get to 90C which then causes thermal throttling so that's annoying.
> 
> Nonetheless, performance is staggering since my memory is stable across all 4 GPUs at +650 and OC +150.
> 
> Battlefield 4 in Ultra @ 8K:


Dude, are you sure you want to keep doing that? I've had GTX 680s in SLI for more than four years now, and my main GTX 680 ran pretty hot (high 80s), my fault of course, and it finally gave up the ghost a few months ago.

I didn't think running it that hot would cause it to die at the time, so I just ran it that way. But when I get new video cards, I'm going to try to run them cooler.


----------



## TK421

Quote:


> Originally Posted by *axiumone*
> 
> No, only the nvidia website.


god dammit


----------



## SuCCEzz

Quote:


> Originally Posted by *dante`afk*
> 
> BF1 runs as well as SW:BF does, because it's the same engine.
> 
> However, the game gets utterly boring after 1-3 rounds; it's for casuals again. Same UI, interface, and gameplay as SW:BF, only with different weapons.


This is sad news. Why does everything go casual now? I left consoles almost exclusively to get away from casual marketing.


----------



## kx11

So my Titan X is on the way now, and the Nvidia store provided me a tracking number, but which carrier does Nvidia use for shipping?!


----------



## TK421

Quote:


> Originally Posted by *SuCCEzz*
> 
> This is sad news. Why does everything go Casual now. I left consoles almost exclusively to get away from the Casual Marketing.


People play to have fun.


----------



## RepTexas

Quote:


> Originally Posted by *RepTexas*
> 
> Hey guys, quick question.
> 
> My OC is +250 on the core with an EK water block, and it was averaging in the 2050 to 2100 range at boost.
> 
> Everything has been fine, lots of gaming and testing for weeks with no previous issues.
> 
> Yesterday while playing the BF1 beta I noticed my boost doesn't go over 1800 now. I tried other games and sure enough, same thing: it shows my clock at +250 but stops at 1800 in The Witcher, ARK and GTA V as well.
> 
> Anyone run into this? Is it the new driver (I did update), or did I break something? Could it be Afterburner?
> 
> I'm asking because I'm at work all day and can't troubleshoot till late tonight, and it's bugging me lol.


In case anyone runs into the same issue: I traced it back to RivaTuner. After trying a bunch of stuff, including rolling back drivers, what worked was installing Afterburner without RivaTuner. I don't know why, but I'm back to 2068 MHz average.

Thanks all. While I'm here, here's a pic of my new backplate that was waiting in my mailbox when I got home. Thanks V1Tech <3


----------



## jhowell1030

Quote:


> Originally Posted by *RepTexas*
> 
> In case anyone runs into the same issue: I traced it back to RivaTuner. After trying a bunch of stuff, including rolling back drivers, what worked was installing Afterburner without RivaTuner. I don't know why, but I'm back to 2068 MHz average.
> 
> Thanks all. While I'm here, here's a pic of my new backplate that was waiting in my mailbox when I got home. Thanks V1Tech <3


I was looking at the same backplate at Mainframe Customs. Is it for the original Titan X? I had no idea if it would work or not.


----------



## RepTexas

Quote:


> Originally Posted by *jhowell1030*
> 
> I was looking at the same backplate at mainframe customs. Is it for the original Titan X? I had no idea if it would work or not.


Yup, it's from the Maxwell version and it fit fine.


----------



## Artah

What are you guys using to OC these new cards, and are you adding voltage without the resistor mod? I'm finally ready to do some OCing.


----------



## RepTexas

Quote:


> Originally Posted by *Artah*
> 
> What are you guys using to OC these new cards with and are you adding voltage without the resistor mod? I'm ready to finally do some OCing.


I was able to get to 2060 MHz (+225) on the core and +200 on the memory by simply moving the power and temp limits to max. No added voltage, but I'm on the EK block. Not really tempted to up the voltage till I need it; right now everything runs well except ARK, lmao. But I know you can add a slight amount of voltage by modifying the config file in AB.


----------



## scgeek12

Both my Pascal Titan X cards are installed with EK blocks, time for some overclocking!


----------



## Artah

Quote:


> Originally Posted by *RepTexas*
> 
> I was able to get to 2060 MHz (+225) on the core and +200 on the memory by simply moving the power and temp limits to max. No added voltage, but I'm on the EK block. Not really tempted to up the voltage till I need it; right now everything runs well except ARK, lmao. But I know you can add a slight amount of voltage by modifying the config file in AB.


Nice, thanks. I'm getting 2012 with +215, and 5200 with +200. I have EK blocks also. Temps are 54ish.


----------



## Testier

Those of you on the stock air cooler: are your cards holding your max stable boost clock? My card can do 2025/2012 MHz for a bit before clocking down quite a bit once it heats up.


----------



## CallsignVega

Yes, even on water the cards will down-clock while under heavy load.

I miss the old days of the BIOS-modded Titan X with those babies' cores locked rock solid.

This new boost version is crap.


----------



## Testier

Quote:


> Originally Posted by *CallsignVega*
> 
> Yes, even on water the cards will down-clock while under heavy load.
> 
> I miss the old days of the BIOS-modded Titan X with those babies' cores locked rock solid.
> 
> This new boost version is crap.


How much does your card clock down? Mine can get into the mid/low 1900s at times, or even lower under heat.


----------



## CallsignVega

Quote:


> Originally Posted by *Testier*
> 
> How much does your card clock down? Mine can get into the mid/low 1900s at times, or even lower under heat.


It all depends on the load. If you guys REALLY want to test how far your card(s) will down-clock, go into the NVIDIA Control Panel, select 4.0x DSR, and then run a benchmark at the new, higher resolution.
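For reference, a quick sketch of what resolution the card actually renders at with DSR (DSR factors multiply the total pixel count, so each axis scales by the square root of the factor):

```python
import math

def dsr_resolution(width, height, factor=4.0):
    """Rendered resolution for a given DSR factor.
    The factor multiplies total pixels, so each axis scales by sqrt(factor)."""
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

print(dsr_resolution(2560, 1440, 4.0))  # (5120, 2880)
print(dsr_resolution(3840, 2160, 4.0))  # (7680, 4320), basically 8K on a 4K panel
```

So 4.0x DSR on a 1440p panel means the GPU is pushing a 5K image, which is why it makes such a good throttle test.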


----------



## DADDYDC650

Got approved for an RMA. Hope my next card is even better. If not, whatevs. Are all XPs pretty much hitting 2 GHz / +500 memory game-stable?


----------



## Lennyx

Quote:


> Originally Posted by *DADDYDC650*
> 
> Got approved for an RMA. Hope my next card is even better. If not, whatevs. Are all XP's pretty much hitting 2Ghz/+500 memory game stable?


It's not only the XP cards; it's the same for the other Pascal cards released. It comes down to double-digit clock differences only, so I don't think it's worth playing any lottery with these cards.

I did some Heaven benching yesterday and got +200/+800 stable there. In Valley, however, I would see artifacts at the end of the run at those clocks. Right now I run +200/+750 and it seems stable. Keeping temps below 42°C helped in the tests I did; once the temps got higher, the clock would fluctuate more.


----------



## DADDYDC650

Quote:


> Originally Posted by *Lennyx*
> 
> It's not only the XP cards; it's the same for the other Pascal cards released. It comes down to double-digit clock differences only, so I don't think it's worth playing any lottery with these cards.
> 
> I did some Heaven benching yesterday and got +200/+800 stable there. In Valley, however, I would see artifacts at the end of the run at those clocks. Right now I run +200/+750 and it seems stable. Keeping temps below 42°C helped in the tests I did; once the temps got higher, the clock would fluctuate more.


I figured as much. My card had pretty bad coil whine at high frame rates, which I don't think is normal, so it had to go back regardless.


----------



## Lennyx

Quote:


> Originally Posted by *DADDYDC650*
> 
> I figured that much. My card had pretty bad coil whine with high frames which I don't think is normal so it had to go back regardless.


I also got pretty bad coil whine. I didn't notice it until I put on the waterblock; at first I thought it was the reference cooler making the noise. The noise gets louder the higher the frame rate gets. I decided to wait it out for a little while; I've heard the coil whine sometimes disappears after some use. If it's not gone in a week or two I will RMA as well.


----------



## DADDYDC650

Quote:


> Originally Posted by *Lennyx*
> 
> I also got pretty bad coil whine. I didn't notice it until I put on the waterblock; at first I thought it was the reference cooler making the noise. The noise gets louder the higher the frame rate gets. I decided to wait it out for a little while; I've heard the coil whine sometimes disappears after some use. If it's not gone in a week or two I will RMA as well.


The whine is pretty annoying, isn't it? I don't remember the last time I had a card with as much coil whine as my XP. Can't be normal. I haven't read much about coil whine on here either.


----------



## atreides

Quote:


> Originally Posted by *Gary2015*
> 
> Just played 3 hours of BF1 beta. Its amazing with DX12..


Hello there, I also played some BF1 today. I run everything on ultra and I can only go up to 60% resolution scale; any higher and I'll drop below 60 fps. At 100% I'm running at 30 fps. What kind of performance are you getting? I'm not watercooling my Titan, and with BF1 I'm hitting 70°C with 90% fan speed. I'm using a 3440x1440 100Hz panel.

I've been contemplating going SLI to see if it would be possible to have every setting maxed out and still get a constant 60 fps. Could anyone give me advice? Would two Titan XPs achieve that perfect 60 fps at maxed-out settings? Thanks.


----------



## Lennyx

Quote:


> Originally Posted by *DADDYDC650*
> 
> The whine is pretty annoying, isn't it? I don't remember the last time I had a card with as much coil whine as my XP. Can't be normal. I haven't read much about coil whine on here either.


Yes, it's a nightmare. I've never experienced whine this bad before. I'm gonna try putting it under heavy load this weekend while I'm out doing other things; hopefully a few 10-hour sessions will remove the whine.
Quote:


> Originally Posted by *atreides*
> 
> Hello there, I also played some BF1 today. I run everything on ultra and I can only go up to 60% resolution scale; any higher and I'll drop below 60 fps. At 100% I'm running at 30 fps. What kind of performance are you getting? I'm not watercooling my Titan, and with BF1 I'm hitting 70°C with 90% fan speed. I'm using a 3440x1440 100Hz panel.
> 
> I've been contemplating going SLI to see if it would be possible to have every setting maxed out and still get a constant 60 fps. Could anyone give me advice? Would two Titan XPs achieve that perfect 60 fps at maxed-out settings? Thanks.


42% is your native resolution in BF1. With the Ultra preset at native res on a 1440p monitor, I never dropped below 150 fps in BF1. I ran it in DX11, though.


----------



## DADDYDC650

Quote:


> Originally Posted by *Lennyx*
> 
> Yes, it's a nightmare. I've never experienced whine this bad before. I'm gonna try putting it under heavy load this weekend while I'm out doing other things; hopefully a few 10-hour sessions will remove the whine.


You should not have to stress your $1200 GPU to try and fix coil whine.


----------



## pez

My 1080s both had some rather annoying whine. My FE 1070 and TXP both seem to be fine, however. My 1080s were G1s... go figure.


----------



## ypaul123

Hey guys,

I was wondering if getting around 100-120 fps in the BF1 beta using a 6900K and a Titan X (Pascal) at 4K resolution is decent or not.

Thanks.


----------



## Lennyx

Quote:


> Originally Posted by *DADDYDC650*
> 
> You should not have to stress your $1200 GPU to try and fix coil whine.


I agree with you. I'm still gonna try, 'cause I don't want to wait weeks for a new card.


----------



## atreides

Quote:


> Originally Posted by *Lennyx*
> 
> Yes, it's a nightmare. I've never experienced whine this bad before. I'm gonna try putting it under heavy load this weekend while I'm out doing other things; hopefully a few 10-hour sessions will remove the whine.
> 42% is your native resolution in BF1. With the Ultra preset at native res on a 1440p monitor, I never dropped below 150 fps in BF1. I ran it in DX11, though.


Oh, I see. I forgot to mention I'm running it on DX12. If I increase the resolution to 100%, I get 30 fps with every setting maxed.


----------



## Lennyx

Quote:


> Originally Posted by *atreides*
> 
> Oh, I see. I forgot to mention I'm running it on DX12. If I increase the resolution to 100%, I get 30 fps with every setting maxed.


100% is 2x your native resolution; that's why you only get 30 fps. Play around with the slider till you hit the fps target you're looking for, or set it to 42% for your native resolution.


----------



## MrKenzie

Quote:


> Originally Posted by *kx11*
> 
> So my Titan X is on the way now, and the Nvidia store provided me a tracking number, but which carrier does Nvidia use for shipping?!


You should be able to click on the tracking number... It was FedEx for mine.


----------



## unreality

I also had some terrible coil whine after putting my Aqua Computer backplate on. It turned out I had overtightened the screws. I loosened them and was gentler the next time. It's gone for now.


----------



## Lennyx

Quote:


> Originally Posted by *unreality*
> 
> I also had some terrible coil whine after putting my Aqua Computer backplate on. It turned out I had overtightened the screws. I loosened them and was gentler the next time. It's gone for now.


I don't use a backplate, but I'm gonna try to reseat my block and see what happens.


----------



## profundido

Quote:


> Originally Posted by *Glerox*
> 
> Has anyone else observed decreased performance with overvolting (even with watercooling)?
> Using MSI afterburner with overvolting unlocked :
> 
> Temps around 44 degrees.
> Firestrike ultra 7707 with no overvolting.
> Fitestrike ultra 7528 with 100% overvolting. I had to diminish the gpu boost from +225 to +220 because it would make the driver crash (but it's working with no overvolting, which i don't understand).
> Even if i reached a higher max clockspeed, clock seem to fluctuate more with overvolting
> 
> I observed the same results when I had a gtx 1080. Was way more unstable with overvolting. I finally let it to no overvolting for the best performance/stability!
> 
> Maybe Pascal doesn't like overvolting?


There seems to be a continuous, gradual throttling effect with Pascal even at much lower temperatures than with Maxwell. It may not be noticeable at low temps, but higher voltage leads directly to higher internal temperatures, which causes more (or faster) throttling, which lowers benchmark results while producing more heat. So maxing out the voltage is no longer a guaranteed win. It's more about finding the golden balance where everything still runs cool and stable with just enough extra voltage and no more.
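A toy model of that tradeoff (all numbers below are invented for illustration; this is not NVIDIA's actual algorithm): GPU Boost drops the clock one bin (~13 MHz) every few degrees past some threshold, so voltage that only adds heat can cost more in throttling than it gains:

```python
# Toy model of Pascal-style temperature throttling. THRESHOLD and
# DEGREES_PER_BIN are hypothetical illustration values, not real specs.

BIN_MHZ = 13           # approximate size of one boost bin
THROTTLE_START_C = 37  # hypothetical temp where bins start dropping
DEGREES_PER_BIN = 5    # hypothetical: one bin lost per 5 C above start

def sustained_clock(target_mhz, temp_c):
    """Clock after temperature throttling in this toy model."""
    bins_lost = max(0, (temp_c - THROTTLE_START_C) // DEGREES_PER_BIN)
    return target_mhz - bins_lost * BIN_MHZ

print(sustained_clock(2050, 35))  # 2050 -- below threshold, full clock
print(sustained_clock(2050, 55))  # 2011 -- hotter (e.g. overvolted), lower sustained clock
```

In this model a voltage bump that raises steady-state temps by 10-20°C can easily erase the extra stability headroom it was supposed to buy.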


----------



## fernlander

Quote:


> Originally Posted by *jhowell1030*
> 
> Why so high on the memory? Lot's of folks are posting about anything more than 450-475ish not showing any improvement to FPS.


I don't understand it either. I do get higher benches, but both this TXP and my TXM became unstable past about a +300 MHz OC on memory. It also lowers my stable core clock ceiling.
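Part of why big memory offsets show so little is just the raw numbers. A quick bandwidth calculation for the Titan XP's GDDR5X (384-bit bus, 10 Gbps stock); whether a +500 Afterburner offset really maps to +1000 MT/s effective is an assumption on my part:

```python
# Memory bandwidth for the Titan X Pascal's 384-bit GDDR5X bus.
# Assumption: a +500 Afterburner offset adds ~1000 MT/s effective rate.

BUS_BITS = 384

def bandwidth_gbs(data_rate_mtps):
    """Bandwidth in GB/s for a given effective data rate in MT/s."""
    return data_rate_mtps * BUS_BITS / 8 / 1000

stock = bandwidth_gbs(10008)         # ~480 GB/s at stock
oced = bandwidth_gbs(10008 + 1000)   # ~528 GB/s with the assumed +500 offset
print(stock, oced, oced / stock - 1) # only ~10% more raw bandwidth at best
```

A ~10% bandwidth bump only helps in the (fairly rare) cases where a game is actually bandwidth-bound, which would fit the reviewer reports of flat or even worse FPS.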


----------



## KillerBee33

Does anyone know if fittings like these exist, or which company might make them, for vinyl tubing with 5/8" (16 mm) OD x 3/8" (10 mm) ID?


----------



## EniGma1987

Quote:


> Originally Posted by *DADDYDC650*
> 
> The whine is pretty annoying isn't it? I don't remember the last time I had a card with as much coil whine as my XP. Can't be normal. Haven't read much about coil whine on here either.


All my Nvidia cards going back 5 years have had just as much coil whine at high FPS as my Titan XP does :/ Waterblocks make it more noticeable since you don't have the fan noise covering it up as much.

Quote:


> Originally Posted by *KillerBee33*
> 
> Does anyone know if these things exist or what company might make these for Vinyl OD: 5/8" (16 mm) x ID: 3/8" (10 mm) tube?


https://www.amazon.com/XSPC-Barb-Fitting-Black-Chrome/dp/B00NODEEBQ/ref=sr_1_4?ie=UTF8&qid=1472737697&sr=8-4&keywords=10mm+barb+G1%2F4
+
https://www.amazon.com/Barrow-Degree-Fitting-Connector-TDWT90SN-V2/dp/B01IDRX79C/ref=sr_1_15?ie=UTF8&qid=1472737725&sr=8-15&keywords=G1%2F4+90+degree

alternatives:

https://www.amazon.com/Barrow-Degree-Fitting-Rotatable-Adapter/dp/B01HOSO952/ref=sr_1_27?ie=UTF8&qid=1472737748&sr=8-27&keywords=G1%2F4+90+degree

https://www.amazon.com/Bitspower-Female-Extender-Fitting-Shining/dp/B017GLWXL4/ref=pd_day0_147_3?ie=UTF8&psc=1&refRID=PJSKY0WKDMFZBVKWA7J0

https://www.amazon.com/Bitspower-Female-Extender-Fitting-4-pack/dp/B00WS5BEX2/ref=pd_day0_147_1?ie=UTF8&psc=1&refRID=PJSKY0WKDMFZBVKWA7J0

https://www.amazon.com/XSPC-Barb-Fitting-Chrome-4-pack/dp/B00NODEC8Q/ref=sr_1_1?ie=UTF8&qid=1472737850&sr=8-1&keywords=10mm+barb+G1%2F4

I used a setup kinda like that but with a "T" piece instead of just an elbow to create my drain valve area of the loop. I used a piece like this:
https://www.amazon.com/Monsoon-Rotary-Fitting-Matched-Chrome/dp/B017ZHBB58/ref=sr_1_34?ie=UTF8&qid=1472738049&sr=8-34&keywords=G1%2F4+90+degree
with these barbs:
https://www.amazon.com/XSPC-Barb-Fitting-Chrome-4-pack/dp/B00NODEC8Q/ref=sr_1_1?ie=UTF8&qid=1472737850&sr=8-1&keywords=10mm+barb+G1%2F4
and this valve:
https://www.amazon.com/Phobya-2-way-Ball-Valve-Nickel/dp/B009KBHW7O/ref=pd_sim_147_5?ie=UTF8&psc=1&refRID=WKN5958MAP75WBFNCSKA
Then I connected another barb on the output of the valve. I leave the valve output bare most of the time, and when it's time to drain I simply slip a piece of tubing onto it and run the big tube straight into a bucket.
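As a quick sanity check that barbs like the ones above match that tubing: the fractional-inch and metric size labels describe (approximately) the same tube:

```python
# The "5/8 inch" and "16 mm" (or "3/8 inch" and "10 mm") labels on
# tubing and barbs refer to roughly the same physical size.

def inches_to_mm(inches):
    return inches * 25.4

print(inches_to_mm(5 / 8))  # ~15.9 mm OD, sold as "16 mm"
print(inches_to_mm(3 / 8))  # ~9.5 mm ID, sold as "10 mm"
```

So any 3/8" barb with G1/4 threads should grip that tubing fine.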


----------



## KillerBee33

Quote:


> Originally Posted by *EniGma1987*


Nice, thank you.


----------



## DADDYDC650

Quote:


> Originally Posted by *EniGma1987*
> 
> All my Nvidia cards going back 5 years have had just as much coil whine at high FPS at my Titan XP does :/ Waterblocks make it more noticeable since you dont have the fan noise covering it up as much.


None of my Nvidia cards have had coil whine like my Titan XP.


----------



## KillerBee33

Quote:


> Originally Posted by *DADDYDC650*
> 
> None of my Nvidia cards have had coil whine like My Titan XP.


Mine sounded like a Jet Engine after 4200 RPM







These Titans go up to 5000 RPM.


----------



## DADDYDC650

Quote:


> Originally Posted by *KillerBee33*
> 
> Mine sounded like a Jet Engine after 4200 RPM
> 
> 
> 
> 
> 
> 
> 
> these titans go up to 5000


I'm talking about coil whine at high framerates, not the fan noise.


----------



## jhowell1030

Quote:


> Originally Posted by *DADDYDC650*
> 
> I figured that much. My card had pretty bad coil whine with high frames which I don't think is normal so it had to go back regardless.


I've seen multiple reviewers comment that going +500 on the memory (regardless of stability) not only showed no FPS gains but in some cases even hurt it.


----------



## DADDYDC650

Quote:


> Originally Posted by *jhowell1030*
> 
> I've seen multiple reviewers comment that going +500 on the memory (regardless of stability) not only had no diminishing returns for FPS but in some cases even hurt it.


I figured as much, but +500 sure looks like a cool number, doesn't it?


----------



## KillerBee33

Quote:


> Originally Posted by *DADDYDC650*
> 
> I'm talking about coil whine with high frames not the fan noise.


Isn't that kind of a whistle sound?


----------



## Lobotomite430

Quote:


> Originally Posted by *jhowell1030*
> 
> I've seen multiple reviewers comment that going +500 on the memory (regardless of stability) not only had no diminishing returns for FPS but in some cases even hurt it.


Well then, I will change my OC when I get home!


----------



## KillerBee33

Quote:


> Originally Posted by *EniGma1987*


Decided to go with 2 or 3 of these









https://www.amazon.com/gp/product/B004CLDI0M/ref=ox_sc_act_title_2?ie=UTF8&psc=1&smid=A3GO5VFCNOM5I7
Plus one on each end
http://www.swiftech.com/3-8x5-8inch-LokSeal-Compression-fitting.aspx


----------



## DADDYDC650

Quote:


> Originally Posted by *KillerBee33*
> 
> Isn't that kind of a whistle sound?


Kinda, I guess. I sent the card back already, so I can't remember 100 percent.


----------



## jodasanchezz

Hi there,
I got my Titan yesterday.

But I have no idea how to unlock voltage in Afterburner.

Is Precision X incompatible?


----------



## KillerBee33

Quote:


> Originally Posted by *DADDYDC650*
> 
> Kinda I guess. I sent back the card already so can't remember 100 percent.


Hmm...







I always thought coil whine was from a cheap EVGA blower fan.


----------



## KillerBee33

Quote:


> Originally Posted by *jodasanchezz*
> 
> Hi There,
> got my Titan yesterday
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But i have no idea how to unlock voltage? in Afterburner
> 
> Is Precision X incompatible?


Adding voltage on Pascal is a loss, not a gain. You'll hit the power limit (PL) faster and throttle sooner.


----------



## SuCCEzz

Anybody here who has upgraded to an EKWB block got an AIO hybrid cooler for sale?


----------



## KillerBee33

Quote:


> Originally Posted by *SuCCEzz*
> 
> Anybody here got an AIO hybrid cooler for sale that has upgraded to an EKWB?


I'll be selling mine (EVGA 10 Series) by the end of this month.


----------



## dante`afk

Quote:


> Originally Posted by *DADDYDC650*
> 
> Got approved for an RMA. Hope my next card is even better. If not, whatevs. Are all XP's pretty much hitting 2Ghz/+500 memory game stable?


Your +500 memory gives you about 0.0001 more fps and 10 more points in 3dmark.

welp.


----------



## DADDYDC650

Yeah, you forgot the +20 percent coolness factor.


----------



## SuCCEzz

Quote:


> Originally Posted by *KillerBee33*
> 
> Will sell mine by the end of this month EVGA 10Series


Keep me posted, I will pick it up!


----------



## atreides

Quote:


> Originally Posted by *Lennyx*
> 
> 100% is 2x your native resolution, thats why you only get 30fps. Play around some with the slider til you get the fps target you are looking for or set it to 42% for your native resolution.


Okay, thank you for the information. I really appreciate it.


----------



## habu58

Quote:


> Originally Posted by *atreides*
> 
> Hello there, I also played some bf1 today. I run everything on ultra and I can only go up to 60% resolution any higher I'll drop below 60 fps., at 100% I'm going at 30fps. What kind of performance are you getting? Im not water cooling my Titan and with bf1 I'm hitting 70c with 90% fan speed. I'm using a 3440x1440p 100hz panel.
> 
> I've been contemplating going sli to see if it would be possible to have every setting maxed out and see if I could get that constant 60fps. Could anyone give me advice? Will two Titan XP achieve that perfect 60fps at maxed or settings? Thanks


In SLI at 4K with every setting maxed out and 100% scaling I was getting a solid 60 (I didn't try it without vsync). Even while I was recording gameplay it only ever dropped to 59 every so often. I'm overclocking the cards at +200/+600 and have them on water with peak temps of 43°C.


----------



## bl4ckdot

Random 3dmark FS of the day : http://www.3dmark.com/fs/10006570
+200 on core only, air cooling for now. Is it "good" ?


----------



## KillerBee33

Quote:


> Originally Posted by *bl4ckdot*
> 
> Random 3dmark FS of the day : http://www.3dmark.com/fs/10006570
> +200 on core only, air cooling for now. Is it "good" ?


Try +230 core / +650 memory with these fan settings.


----------



## bl4ckdot

Quote:


> Originally Posted by *KillerBee33*
> 
> Try +230+650 with these fan settings


http://www.3dmark.com/fs/10007085

+220 / +600
Any more isn't stable, or I get artifacts.

This is the #11 best score among everyone with a 4790K + Titan XP. I guess this is OK, since the OC on my CPU isn't high at all.


----------



## KillerBee33

Quote:


> Originally Posted by *bl4ckdot*
> 
> http://www.3dmark.com/fs/10007085
> 
> +220 / 600
> More isn't stable or I have artifacts.
> 
> This is the #11 best score compared to everyone with a 4790k + Titan XP. I guess this is OK since my OC on my CPU isn't high at all.


Depends on what you are looking for. I just go by the graphics score; that makes more sense, since some people are running 6950Xs.








http://www.3dmark.com/fs/9866037


----------



## bl4ckdot

Quote:


> Originally Posted by *KillerBee33*
> 
> Depends on what you are looking for. I just go by GFX score , makes more sense since some have 6950Xs
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/9866037


Yeah sure, I meant that I wasn't underperforming.
Now it seems that, at least on air, +220 (boost @ 2038) is the max it can do. Luck of the draw.








I'm more than happy though. Now I'm looking for a good 1440p monitor









Edit : your score is very good







2114 boost is insane


----------



## CallsignVega

Quote:


> Originally Posted by *Lennyx*
> 
> Yes its a nightmare. I never experienced whine this bad before. Gonna try put it on heavy load this weekend while im out doing other things. Hopefully few 10hour sessions will remove the whine.
> 42% is ur native resolution in BF1. Ultra preset and native res on a 1440p monitor i never dropped below 150fps in BF1. I ran it in DX11 though.


I noticed too that the game starts off at a "42%" resolution factor. What an odd number. I set it to 50% like in every other game, but I guess I was running it upscaled a bit?


----------



## KillerBee33

Quote:


> Originally Posted by *bl4ckdot*
> 
> Yeah sure, I meant that I wasn't underperforming.
> Now it seems that, at least on air, +220 (boost @2038) is the max it can do. Luck of the draw
> 
> 
> 
> 
> 
> 
> 
> 
> I'm more than happy though. Now I'm looking for a good 1440p monitor
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit : your score is very good
> 
> 
> 
> 
> 
> 
> 
> 2114 boost is insane


Ehh, it usually boosts to 2126. This was with an old driver; 372.70 did magic, at least in Time Spy.


----------



## fernlander

Quote:


> Originally Posted by *AdamK47*
> 
> That would make sense if we were talking about an LCD. There is no need for it with OLED since black is always going to be black. It's used on the LG OLED TVs for power reduction and panel longevity. Personally, I'd sacrifice that longevity and power savings to have a display with constant contrast and brightness.


The overall benefits of these OLEDs are so great that the dimming thing doesn't bother me. I could never go back to LCD. Ever. It's much less annoying than the dynamic contrast LCDs do, among all the other issues they have.

A time will come when they release OLEDs without this dimming, but IMHO even the current OLEDs blow away any LCD I've seen.


----------



## mouacyk

Quote:


> Originally Posted by *CallsignVega*
> 
> I noticed too that the game starts off at "42%" resolution factor. What an odd number? I set it to 50% like every other game, but I guess I was running it up-scaled a bit?


To see the internal rendering resolution, type the console command: Render.DrawScreenInfo 1

Manually setting 42% usually doesn't get 1:1 scaling back. To do that you need to reset the video settings.


----------



## DNMock

Quote:


> Originally Posted by *bl4ckdot*
> 
> Yeah sure, I meant that I wasn't underperforming.
> Now it seems that, at least on air, +220 (boost @2038) is the max it can do. Luck of the draw
> 
> 
> 
> 
> 
> 
> 
> 
> I'm more than happy though. Now I'm looking for a good 1440p monitor
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit : your score is very good
> 
> 
> 
> 
> 
> 
> 
> 2114 boost is insane


The way their new boost clock works is confusing as all get out; not a big fan of it. I'm under water, so the core temp doesn't go over 40°C. If I put in an OC of +175, the cards boost to only 1850 or so; at +195 they boost to 2050, and at +200 to 2062 before the cards strap on their ****** helmets and wreck their tricycles into the power-limit wall.


----------



## willverduzco

Goodbye 1080 Waterforce, hello Titan XP. Now just a few hours until my GPU watercooling gear arrives!


----------



## tpwilko08

Quote:


> Originally Posted by *DNMock*
> 
> The way their new boost clock works is confusing as all get out, not a big fan of it. Under water so core temp doesn't go over 40 C, and if I put an OC of +175 in the cards boost to 1850 or so only, if I put it at +195 it will boost to 2050 and at 200 boost to 2062 before the cards strap on their ****** helmet and wreck their tricycles into the PL limit wall.


It seems all these Titan XPs are different. If I put +175 core on mine it boosts to 2050, and I'm also on water. It could be related to ASIC quality, even though we can't see that to confirm; hopefully eventually.


----------



## Z0eff

Quote:


> Originally Posted by *Gary2015*
> 
> Just played 3 hours of BF1 beta. Its amazing with DX12..


I don't see any gains with DX12 here =/


----------



## atreides

Quote:


> Originally Posted by *habu58*
> 
> In SLI at 4K with every setting maxed out and 100% scaling I was getting a solid 60 (I didn't try it without vsync). Even while I was recording gameplay it only ever dropped to 59 ever so often. I'm overclocking the cards at 200/600 and have them on water with peak temps of 43.


Thanks for the information. Still, I wonder what kind of results I would get with the stock cooler.


----------



## cisco0623

Quote:


> Originally Posted by *tpwilko08*
> 
> It seems all these titan xps are different. If I put +175 core on mine it boosts to 2050 am also on water. Seems it could be related to asics score even though we can not see it to confirm hopefully eventually.


I just started messing with my OC last night, as I finally have it under water. I had +99 already set by accident and it was boosting to 1950?!? Glad I caught this, as I thought something was wrong. (I was thinking to myself: I know I hate math, but 1531 + 100 isn't 1950!)

I'm curious whether it now boosts higher than the rated boost even at stock settings. I don't recall that when I had it on air, but I'm going to try later and watch closely.
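A hedged sketch of what seems to be going on (the real bin table isn't public, and the "typical boost" figure below is a guess for illustration): the Afterburner offset appears to shift the whole boost curve, so the observed clock is roughly "what the card typically sustains + offset", not "rated boost + offset":

```python
# Illustrative sketch, not NVIDIA's actual algorithm. The offset shifts
# the boost/voltage curve; GPU Boost then runs at the highest shifted
# bin that temperature and power allow.

BASE_CLOCK = 1417     # Titan X Pascal base clock, MHz
RATED_BOOST = 1531    # advertised boost clock, MHz
TYPICAL_BOOST = 1850  # hypothetical: what a cool card actually sustains

def observed_clock(offset_mhz):
    """Rough in-game clock under this toy model."""
    return TYPICAL_BOOST + offset_mhz

print(observed_clock(99))  # 1949 -- close to the 1950 seen above, not 1531 + 99
```

That would also explain why cards with different out-of-the-box boost behavior land at different clocks for the same offset.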


----------



## tpwilko08

Quote:


> Originally Posted by *cisco0623*
> 
> I just started messing with my OC last night as I finally have it under water. I had +99 already set by accident and it was boosting to 1950???? Glad I caught this as I thought something was wrong. (I was thinking to myself I know I hate math but 1531 + 100 isn't 1950!)
> 
> I'm curious if it boosts higher than stock at stock now. I don't recall that when I had it on air, but now I'm going to try later and watch closely.


Mine, at stock with a 100% power target, boosts to 1886 MHz while gaming.


----------



## atreides

Quote:


> Originally Posted by *HyperMatrix*
> 
> As far as I'm aware that law doesn't apply to the US and Canada. But that's besides the point. I'm still running everything at stock. No blocks. No clu mod. Just want to make sure I don't have a bum card before getting into that.


This is distressing. Hopefully the customer support rep was incorrect or trolling.


----------



## TK421

Quote:


> Originally Posted by *Gary2015*
> 
> Changing the stock cooler voids the warranty .


are you serious?


----------



## cisco0623

Quote:


> Originally Posted by *TK421*
> 
> are you serious?


There's no warranty seal so you can put the stock one right back on.


----------



## Stateless

I was able to get my EK blocks and backplate on, and things were going great. Using my +200 core/+200 memory OC I was hitting a pretty constant 2000 MHz on each card, with a drop to 1974 every once in a while. Heat was fine, with the top card about 3-4°C higher than the lower card. Through several runs of 3DMark and Valley, the highest temp was 52°C on the top card. I then ran some games and things also stayed pretty steady. But then I played Crysis 3 at max settings/4K for only about 15 minutes, and my top GPU hit 68°C! My bottom card stayed steady at 42-44°C, no higher.

I am not sure why it would shoot up that high. I have run water loops on my GPUs since the original Titan and never hit temps higher than 58°C even during the summer, so this is freaking me out. The only thing I can think of is that an air bubble happened to get stuck there for a bit. I did about 20 minutes of Valley, 2-3 runs of Firestrike and 2 runs of Time Spy and never broke 52°C, but 20 minutes of Crysis 3 hit 68°C. I am going to run my pumps for a while; I turned up the flow speed and could see some air bubbles, some large, pushing through the top card's block, so maybe that is what caused it. If it continues, my only option is to drain, remove and reseat the block, but I did both cards back to back and had great contact when I tested, so I am hoping it was just air bubbles hitting the wrong spot during my Crysis 3 tests.


----------



## cisco0623

Quote:


> Originally Posted by *Stateless*
> 
> I was able to get my EK blocks and back plate on and things were going great. Using my +200 Core/+200 Memory I was hitting pretty constant 2000 on each card with a drop to 1974 every once in a while. Heat was fine with the top card about 3-4c higher than the lower card. Through several runs of 3dmark and Valley, the highest temp was 52c on the top card. I then ran some games and things were also staying pretty steady. I then played Crysis 3 at max settings/4k and for only about 15 minutes or so and my top GPU was hitting 68c!!! My bottom card was staying steady at 42-44c no higher.
> 
> I am not sure why it would shoot up that high. I have run water loops on my GPU's since the initial Titan and never hit temps higher that 58c during the summer, so this is freaking me out. The only think I can think of is that a air bubble happen to hit and get perhaps stuck there for a bit. I did about 20 min of Valley, 2-3 runs Firestrike, 2 Runs of Timespy and never broke 52c, but 20 min of Crysis 3 made it hit 68c. I am going to run my water pumps for a while and I turned up the flow speed and I could see some air bubbles, some large pushing through the top card block, so maybe that is what caused it. I know that if it continues, my only option is to drain, remove and reseat the block, but I did both cards back to back and had great contact when I tested, so I am hoping it was just air bubbles that happened to hit the right spot during my Crysis 3 tests.


Sounds like SLI wasn't being used. Or maybe the block didn't mount well on the GPU. Or air, like you said. How many blocks/rads are in your loop?


----------



## Stateless

Quote:


> Originally Posted by *cisco0623*
> 
> Sounds like SLI wasn't being used. Also maybe the block didn't mount great on the gpu. Also air like you said. How many blocks/rads are in your loop?


SLI was in use; I was monitoring it, and I would not have been hitting the frame rates I was with settings maxed otherwise. I have 3 rads and have used original Titans in SLI, Titan Blacks, and Titan X Maxwells with the same loop setup. Also, my second card was always 42°C or under, even under load. When I came home today, I noticed lots of air bubbles in the first GPU, and when I turned my pumps to max I saw a lot of bubbles running through the block. I am hoping it was the bubbles, since it never hit temps that high during benchmarking. I am letting it run now for about 2.5 hours at high flow speed and then will do some more testing.


----------



## azzazel99

How do you guys feel about purchasing the Titan X Pascal? I'm in the market for a new GPU to match my PG348Q ultrawide 100Hz monitor. I thought about doing 1080 SLI, but now I'm considering a Titan X instead. Thoughts? Honest thoughts and feelings, not just how you think you should feel since you already dropped a boatload on a single card. One card goes for basically what 1080 SLI goes for... Should I pull the trigger on one and MAYBE another later on? I currently run a 6700K, and a 5820K in a spare rig. Thoughts?


----------



## theshadowofsam

Sort of in the same boat as Azzazel. I'm considering picking up a TXP (maybe used, once they show up) but need some more good reasons to nab one. I'm going to be upgrading to either a 1440p ultrawide or 4K soon, and I need to know whether I should nab a Pascal Titan or wait for Volta.


----------



## azzazel99

This is my dilemma: suck it up with my 780 Ti Classifieds till Volta, or maybe a Pascal refresh in 2017 with HBM2... I've read rumors of Volta next year. I don't want to drop $1200-$2400 now and then have something better come out in just a few months. Normally I don't play the waiting game, but I feel something else is coming at the end of this year or early next year. Maybe a 1080 Ti or something, but that's highly unlikely.


----------



## Stateless

Quote:


> Originally Posted by *theshadowofsam*
> 
> Sort of in the same boat as Azzazel. I'm considering picking up a txp (maybe used once they get there) but need some more good reasons to nab one. I'm going to be upgrading to either 1440p ultra wide or 4k soon, and I need to know if I should nab a Pascal Titan, or wait for Volta.


Well, playing the waiting game you can end up waiting forever, because there will be something after Volta, or a Titan Volta variant, etc. Right now, if you need the memory or you want the best single-card performance, the Titan XP is a good choice. I started with one, then got a second, and am enjoying the performance they provide at 4K. Being able to max out something like Witcher 3 at 4K and a rock-solid 60fps is bliss. When I say max, I mean every possible setting at its highest, even HairWorks.


----------



## theshadowofsam

Quote:


> Originally Posted by *Stateless*
> 
> Well, playing the waiting game you can end up doing that forever because there will be something after Volta or the Titan Volta variant etc. Right now, if you nee the memory or you want to drive the best performance with a single card, the Titan X P is a good choice. I started with one and then got a second and am enjoying the performance they prove at 4k. Being able to max out something like Witcher 3 at 4k and rock solid 60fps is bliss. When I say max, I mean every possible setting at it's highest, even hairworks.


Thank you. That's pretty much what I needed to hear. Sitting on a 980ti right now is nice, but there's always that need for... More.


----------



## willverduzco

So apparently I couldn't even last 4 hours with the default cooling...




Full Specs:

Core i7 Haswell-E 5930k - 6 Core at 5 GHz (watercooled with 480mm double-width radiator)
Titan X Pascal (CLLC with dedicated 120mm radiator)
Asus x99 Sabertooth TUF motherboard with active VRM cooling
32 gigs Corsair Vengeance LPX DDR4 at 3200 MHz, CL16
512 GB Samsung 950 Pro (M.2 NVMe connection) for apps and games
512 GB Samsung 850 Pro (SATA) for downloads and documents
2x 1 TB WD Blue (RAID 1) for monthly backups of critical data
I'm in a happy place now... well, until I feel the need to get a second Titan XP


----------



## Glerox

Quote:


> Originally Posted by *Glerox*
> 
> Hi folks! Maybe some of you can help me! I have two questions.
> First, I just finished my first custom loop for my titan XP. All with EKWB products.
> 
> 
> 
> I have a 6800k OC to 4,2 GHz/1,4v.
> I have a 240mm top rad 25mm thick and a 360mm front rad 40mm thick.
> All my 5 fans are push configuration set as intake and I have a 120mm exhaust at the back.
> They are EK-vardar 1150 rpm running at max speed.
> When I game, I'm getting temps around 54-55 degrees... which is way too high IMO for a custom loop...
> 
> Is it simply because my fans are not fast enough? Or because something is wrong with the loop or the rads are not enough?


Maybe this will interest newcomers in custom looping their rigs!

I just installed my new SP120 high-performance fans (2400 rpm). My temps dropped a lot, and I now stay at 44 degrees at full load!
I was at 55 degrees with the 1150 rpm fans, so a huge difference just from changing the fans.

For the noise I can control the speed for when I'm on desktop or even lower it during gaming.
It just gives you more options!

I wasted $100 on low-speed fans, damn haha.


----------



## Glerox

Quote:


> Originally Posted by *azzazel99*
> 
> How do you guys feel about purchasing the titanX pascal? Im in the market for a new gpu to match up to my pg348q ultra wide 100hz monitor. I thought about doing 1080sli but now im considering doing a titanX instead. Thoughts? Honest thoughts and feelings not just how you think you should feel since you already dropped a but load on a single card. 1 card goes for basically what 1080 sli goes for.......Should i pull the trigger on one and MAYBE another later on? I currently run a 6700k and a 5820k in a spare rig. Thoughts?


I had 1080 sli watercooled with AIO and I switched to a titan x pascal with a custom loop.
It was quite a struggle to sell my two modded 1080s without losing too much money but I'm glad I did it!

1080s in SLI were more powerful, but you have the SLI problem. YES, it works in most games. BUT it's more complicated, and it took me more time (which I don't have) because I was always checking my scaling etc. etc... and some games just don't work, you definitely get stuttering, and VR still doesn't support it... Anyways, I don't want to start the SLI debate lol.

I find that the Titan X Pascal is perfect for 4K@60 and 1440p@165 with G-Sync, because you'll get all games at 50-60 fps in 4K and 100-165 fps in 1440p. G-Sync does the rest of the job.

I'll keep my rig for a while!


----------



## Baasha

BF1 @ 5K (everything on Ultra, no AA):


----------



## willverduzco

Quote:


> Originally Posted by *Baasha*
> 
> BF1 @ 8K (100% resolution scale) w/ everything on Ultra:
> 
> http://i.imgur.com/ha4NLxW.jpg


That is seriously remarkable performance. Beyond jealous!


----------



## RepTexas

Quote:


> Originally Posted by *azzazel99*
> 
> How do you guys feel about purchasing the titanX pascal? Im in the market for a new gpu to match up to my pg348q ultra wide 100hz monitor. I thought about doing 1080sli but now im considering doing a titanX instead. Thoughts? Honest thoughts and feelings not just how you think you should feel since you already dropped a but load on a single card. 1 card goes for basically what 1080 sli goes for.......Should i pull the trigger on one and MAYBE another later on? I currently run a 6700k and a 5820k in a spare rig. Thoughts?


My 2 cents. I'm not rich by any means.

I game on a 3440x1440 monitor and on the HTC Vive. I had 2 980 Tis in SLI on water. Games that support SLI played great; games that don't launch with SLI, or never support it, ran just short of comfortable. I decided to give one of my 980 Tis to my son, sold his 970 and my other 980 Ti, and ended up paying about $800 for the Titan with a waterblock.

Every game now except for 2 (Arma 3 and Ark) runs above 60fps maxed. All of the games I play run 90+ fps with mixed ultra/high settings. Also, I can upscale all my HTC Vive games and stay above the target framerate, which is nice.

The performance above my 980 Tis was not huge... but I love not having to worry about SLI anymore. My TXP will be a long-term single-card solution for me and should carry me into the next generation of VR. DX12 will also help extend the lifespan of the TXP, hopefully.

I have zero regrets. Super happy. It really depends why and what you will use it for, tbh. It would be a waste at lower resolutions or in well-supported SLI games. Or if money is an issue, the 1080 is definitely a better bang for your buck.


----------



## krizby

Quote:


> Originally Posted by *Glerox*
> 
> Maybe this will interest newcomers in custom looping their rigs!
> 
> I just installed my new sp120 high performance fans (2400 rpm). My temps dropped a lot and I stay now at 44 degrees at full load!
> I was at 55 degrees with 1150 rpm fans. So huge difference just by changing the fans.
> 
> For the noise I can control the speed for when I'm on desktop or even lower it during gaming.
> It just gives you more options!
> 
> I wasted 100$ on low speed fans damn haha


With low-rpm fans I think a push/pull config (doubling your fans) would have been preferable. It costs more, but then you don't have to deal with the industrial vacuum cleaner that is the Corsair SP high-performance fans lol.

Also, with such positive-pressure airflow I would like to know the temp inside your case; do you have a temperature probe? And is that 44°C at full load after 10 min or 30 min? Temps slowly build up inside a positive-pressure setup.


----------



## Glerox

Quote:


> Originally Posted by *krizby*
> 
> With low rpm fans I think push pull config (doubling your fans) would have been more preferable, cost more but then you don't have to deal with the industrial vacuum cleaner that is the corsair SP high performance fans lol.
> 
> Also with such positive pressure air flow i would like to know the temp inside your case, do you have a temperature probe ? also 44c at full load after 10min or 30min ? since temp will slowly build up inside positive pressure air flow.


I've read that push/pull doesn't make a big difference vs. push alone. Anyways, I don't have space in my case for push/pull, so I'm happy with these SP120s!

I have changed the setup to three intakes and three exhausts, so there is minimal heat build-up. 44 degrees was after more than 30 minutes at full load!


----------



## Asmodian

My EK waterblock got here so I got to test max overclocks. I haven't done any power mods, voltage adjustments, or touched the memory yet.

I seem to be stable at +190 MHz in Afterburner, for 2037 MHz at today's steady state of 42°C. It doesn't like much more at all, but I'm not sure how boost works now; it seems to run at 2050 MHz a lot, and I have seen a bit higher for brief periods. At least it stays below the 120% power limit at these clocks.









9526 in Time Spy




edit: The EK backplate gets quite hot (~45°C)! It is fairly thick aluminum and comes with milled features and thermal pads to attempt to cool the back side of the PCB, especially the voltage regulators, but it contacts behind the memory as well. I like it a lot.


----------



## jodasanchezz

Hi,

I will install a single loop for the Titan.
Does anyone have a Titan in a separate loop with a 360 rad?

What temps should I expect? I'm looking at the EK-CoolStream PE 360.

Thanks in advance.


----------



## DADDYDC650

Anyone else get high coil whine on their XPs at high frame rates, like when viewing game menus?


----------



## xarot

Quote:


> Originally Posted by *DADDYDC650*
> 
> Anyone else get high coil whine on their XP's at high frames like when viewing game menu's?


Yes.


----------



## azzazel99

Well, I'm torn. Do I do dual Titan XPs on this 6700K? Or do I do one Titan XP (or dual 1080s) and upgrade under the Micro Center extended warranty to a 6850K and X99, so I have 40 lanes in case I do get another Titan XP?


----------



## azzazel99

I've never had this much trouble figuring out what to do. I don't have and haven't ever had SLI issues with my dual 780 Ti Classifieds, so I'm not scared of SLI, but I'm just so torn. This 3440x1440 monitor runs at 100 Hz, so to fully enjoy it I want to be as close to that 100 Hz as I can get.


----------



## willverduzco

Quote:


> Originally Posted by *DADDYDC650*
> 
> Anyone else get high coil whine on their XP's at high frames like when viewing game menu's?


I only noticed the coil whine after I put mine under water, as the stock hairdryer is way louder than any coil noise. Now that I've done so, the coil whine is by far the loudest part of my computer when playing a game. I notice the whine most when I am benchmarking the card--even louder than in game menus.


----------



## Leyaena

I just sold my second Titan X (Maxwell).
When it comes right down to it, SLI wasn't really worth it anymore in my book, especially not with the state games have been releasing in lately.
And that's coming from someone who had 2-way SLI with the top-of-the-line cards every year since the 680s came out.


----------



## willverduzco

Quote:


> Originally Posted by *Leyaena*
> 
> I just sold my second Titan X (Maxwell).
> When it comes right down to it, SLI wasn't really worth it anymore in my book, especially not with the state games have been releasing in lately.
> And that's coming from someone who had 2-way SLI with the top-of-the-line cards every year since the 680s came out.


Couldn't agree more. I've had dual (relatively high-end) cards for ages (2x Voodoo 2 12 MB SLI, 2x GTX460, 2x GTX570, 2x GTX670, 2x Radeon 7970, 2x Radeon 290x, 2x 980TI). Now, it's just no longer a good investment because half of the time, a new game release doesn't support SLI/CFx--or it takes ages for the game developers to add it in.

With the lack of game support, multi-GPU stutters when under 60 fps, and the proportional mandatory increase in input latency as you add more cards, I don't believe that multi-GPU always provides a good experience if you just want to get in and play on a game's launch day. I imagine that in the transition between driver-based multi-GPU in DX11 and EMA (explicit multi-adapter) in DX12, there will be lots of triple-A titles that don't end up getting multi-GPU support. Hell, Microsoft hasn't even added mGPU support yet to UWP apps, and now that they're set to bring more Xbox exclusives to the PC through the Win10 store, UWP is actually a viable way to get games.


----------



## Leyaena

Quote:


> Originally Posted by *willverduzco*
> 
> Now, it's just no longer a good investment because half of the time, a new game release doesn't support SLI/CFx--or it takes ages for the game developers to add it in.


Don't forget scenarios where there is SLI support, but both GPUs just kinda hover around 60ish percent, and you actually get worse framerates than you did with SLI off.
I'm looking at you, Deus Ex!


----------



## st0necold

Quote:


> Originally Posted by *DADDYDC650*
> 
> Anyone?


Bro, if you are bored with your $1,200 TXP, that's on you! Returning it for another one makes absolutely zero sense, but if it makes you feel better, go for it!


----------



## Jpmboy

Quote:


> Originally Posted by *DADDYDC650*
> 
> Anyone else get high coil whine on their XP's at high frames like when viewing game menu's?


nope, no coil whine at all.


----------



## axiumone

The new msi afterburner beta is out.

http://www.guru3d.com/news-story/download-msi-afterburner-4-3-beta-14.html


----------



## willverduzco

Quote:


> Originally Posted by *axiumone*
> 
> The new msi afterburner beta is out.
> 
> http://www.guru3d.com/news-story/download-msi-afterburner-4-3-beta-14.html


"Not just that, we have also updated RTSS, our statistics server that enables the overlay with MSI AfterBurner, yes we now offer DirectX 12 overlay support (albeit this build again is in BETA stages and should be treated as such)."

Oh, hell yea!


----------



## KillerBee33

Quote:


> Originally Posted by *axiumone*
> 
> The new msi afterburner beta is out.
> 
> http://www.guru3d.com/news-story/download-msi-afterburner-4-3-beta-14.html


Now all we need is GPU-Z support for reading the BIOS on the TXP.


----------



## axiumone

Quote:


> Originally Posted by *willverduzco*
> 
> "Not just that, we have also updated RTSS, our statistics server that enables the overlay with MSI AfterBurner, yes we now offer DirectX 12 overlay support (albeit this build again is in BETA stages and should be treated as such)."
> 
> Oh, hell yea!


Welp, both Time Spy and Tomb Raider crash on launch if the overlay is active. I guess it's a step. lol


----------



## jhowell1030

Quote:


> Originally Posted by *Glerox*
> 
> I had 1080 sli watercooled with AIO and I switched to a titan x pascal with a custom loop.
> It was quite a struggle to sell my two modded 1080s without losing too much money but I'm glad I did it!
> 
> 1080s in SLI were more powerful but you have the SLI problem. YES it works in most games. BUT it's more complicated, and it took me more time (which I don't have) because i was always cheking my scaling etc etc... and some games are just not working and you definitely have stuttering and VR still don't support it... Aneways don't want to start the SLI debate lol.
> 
> I find that the Titan X Pascal is perfect for 4K@60 and 1440p@165 with G-Sync, because you'll get all games at 50-60 fps in 4K and 100-165 fps in 1440p. So G-Sync does the rest of the job.
> 
> I'll keep my rig for a while!


As someone who went from two 980s to a TXP for my Predator X34, I agree with most of this. Just keep in mind 1440p vs. ultrawide 1440p. I've only been able to test a couple of games, but right now The Witcher 3 averages 75 FPS for me and Deus Ex averages 55-60 FPS.


----------



## DADDYDC650

Quote:


> Originally Posted by *st0necold*
> 
> bro if you are bored with your 1200$ TXP-- that's you! If you want to return it for another one that makes absolutely zero sense but if it makes you feel better go for it!


Don't remember asking you for an opinion.... card has coil whine issues regardless.


----------



## Leyaena

Quote:


> Originally Posted by *DADDYDC650*
> 
> Don't remember asking you for an opinion.... card has coil whine issues regardless.


You kinda did, though...
Quote:


> Should I return it and try the Titan lottery again?


----------



## jhowell1030

Quote:


> Originally Posted by *Leyaena*
> 
> You kinda did, though...


Yeah...you did. Don't ask for opinions if you don't like the answers bud. There's always gonna be that one guy that gives it to ya.


----------



## DADDYDC650

Quote:


> Originally Posted by *Leyaena*
> 
> You kinda did, though...


Was kinda talking to myself since I was doing it regardless. Oh wellz.

BTW, Nvidia's return process is garbage.


----------



## mikehii

Showing off mine


----------



## KillerBee33

Quote:


> Originally Posted by *mikehii*
> 
> Showing off mine


Those cable combs: clip-ons, or something else?


----------



## mikehii

Quote:


> Originally Posted by *KillerBee33*
> 
> Those Cable Combs, clipons or?


https://mainframecustom.com/product-category/cable-management/lc-stealth-cable-combs/


----------



## KillerBee33

Quote:


> Originally Posted by *mikehii*
> 
> https://mainframecustom.com/product-category/cable-management/lc-stealth-cable-combs/


Ordering E22s Acid Green


----------



## mikehii

Quote:


> Originally Posted by *KillerBee33*
> 
> Ordering E22s Acid Green


----------



## CallsignVega

Ah nice, the new MSI AB overlay is working just fine in BF1 under DX12.


----------



## jodasanchezz

Is a 10k+ graphics score good in Time Spy on air?

http://www.3dmark.com/3dm/14577209?


----------



## KillerBee33

Quote:


> Originally Posted by *jodasanchezz*
> 
> Is a 10k+ grafics scorew good in Timespay @ air?
> 
> http://www.3dmark.com/3dm/14577209?


I'd say 10.5k should be normal.


----------



## shapin

Quote:


> Originally Posted by *mikehii*
> 
> Showing off mine


You didn't get the memo from Nvidia?
Every Titan owner has to buy a CaseLabs case; it's much better for your card.


----------



## Ostrava

Finally got some time to install full-cover EK waterblocks on my Titan X(P)s, but I noticed that the screws required to install EK's backplate aren't provided.
The instructions state that six M2.5 x 7mm screws are needed, but the full-cover block doesn't include them, nor do any screws come with the backplate.
I figure I'll just order them from Amazon, but did anyone else have this problem?

Also for anyone else watercooling these cards with ek blocks, did any of you mod the Nvidia HB SLI bridge to make it fit?
I noticed when I purchased the blocks that the HB bridge wasn't compatible with the blocks, but I figured there must be some way to make them fit.

Wasn't sure if I should post here or in the EK club (that thread seems to be a clusterf**k currently). Any help is appreciated.


----------



## Stateless

Quote:


> Originally Posted by *Ostrava*
> 
> Finally got some time to install fullcover ek waterblocks on my titan x(p)s, but I noticed that the required screws to install ek's backplate aren't provided.
> The instructions state that 6 m2.5 x 7mm screws are needed, but the fullcover block doesn't provide those nor do any screws come with the backplate.
> I figure I'll just order them from Amazon, but did anyone else have this problem?
> 
> Also for anyone else watercooling these cards with ek blocks, did any of you mod the Nvidia HB SLI bridge to make it fit?
> I noticed when I purchased the blocks that the HB bridge wasn't compatible with the blocks, but I figured there must be some way to make them fit.
> 
> Wasn't sure if I should post here or in the EK club (that thread seems to be a clusterf**k currently). Any help is appreciated.


They do come with the screws. Rip open the bottom of the package the backplate came in; they're tucked under the flap of the box. I was thinking the same thing when I installed mine and was going mad that I did not get them. That is when I tore up the box and found the screws under one of the flaps at one end. I have two blocks and two backplates, and the screws were in both packages, just nicely hidden.

For the SLI bridge, I bought the EVGA HB Bridge, which is the same thing but designed differently and with colored lights!!! Works great!


----------



## Grubdog

Made an account just to post this. First of all, I love my Titan! Much-needed upgrade from a 290X.







But anyways haha. I have an EK Predator 360 with a 6850K at 4.3 GHz, 1.3 V, and my Titan X is in the same loop with the EK block and QDC. At 1150 rpm my overclocked Titan X temps were 75°C (120% power, +225 core, +200 mem). Even with the fans at 95% (around 2050 rpm) my Titan hits 70°C; is that normal? I don't know what rpm the pump is at; that was my next step: plugging the pump directly into the motherboard and setting it to 100%. Should I possibly look into getting another rad? If so, the max I could fit in my case would be a 280mm.


----------



## Ostrava

Thanks so much, Stateless, turns out they're exactly where you said!
It's as if they were deliberately trying to hide them!
Also thanks for letting me know about EVGA's HB bridge. +Rep!


----------



## Stateless

Quote:


> Originally Posted by *Ostrava*
> 
> Thanks so much stateless, turns out they're exactly where you say!
> Its as if they were deliberately trying to hide them!
> Also thanks for letting me know about evga's HB bridge. +Rep!


No problem. I thought it was a very weird place for them to put the screws. Enjoy putting your blocks on. So far it is working great for me. I had a scare when one card hit 68°C while the other was at 42°C. Turned out I had some major air bubbles that happened to get stuck in the wrong place and caused it to heat up. I ran my pumps for a while to bleed out some more air bubbles, re-ran the same game that caused it to hit 68°C, and it only hit 51°C. This was at +200 core, +200 memory, holding 2,000 MHz rock steady.


----------



## Cuylertech

Haven't been on here in forever... sitting in the top 10 or so slots for a single XP + 6850K on Fire Strike. Boosts safely under an EK block to 2.1 GHz, running +245 core / +700 memory.

http://www.3dmark.com/fs/9787592


----------



## Stateless

Quote:


> Originally Posted by *Cuylertech*
> 
> Havnt been on here in forever... sitting in the first 10 or so slots for single XP + 6850K on Firestrike. Boosts Safe under a EK block to 2.1Ghz, running + 245 core/ +700 memory.
> 
> http://www.3dmark.com/fs/9787592


WOW, that is great. I am at +200 core / +200 memory and I boost to a flat 2,000 MHz. Are you adding any voltage under Afterburner or anything else? I have not used the new AB yet.


----------



## Cuylertech

No voltage has been added at all. I want to try the shunt mod, as that's my cap currently, but I just completed a new loop last night and forgot to do it while everything was apart. I think I could push to 2.2 GHz once the shunt mod is done.


----------



## MrKenzie

Quote:


> Originally Posted by *Ostrava*
> 
> Thanks so much stateless, turns out they're exactly where you say!
> Its as if they were deliberately trying to hide them!
> Also thanks for letting me know about evga's HB bridge. +Rep!


Yes, if you read the end of the box (nobody would, it's stupid) it says "open for the screws".


----------



## MrTOOSHORT

Shunt mod doesn't give you more clocks, just more stable boosts at set clocks.


----------



## HaniWithAnI

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Shunt mod doesn't give you more clocks, just more stable boosts at set clocks.


If under full load he is hitting PRel (and only PRel) the shunt would give him more OC headroom.
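For anyone wondering why the shunt mod helps with the power cap: the card infers current from the voltage drop across tiny shunt resistors on the PCB, so lowering the effective shunt resistance makes it under-report its real draw. A rough sketch of the arithmetic, with purely illustrative values (not measured from a TXP):

```python
# The VRM controller infers current from the drop across a shunt resistor:
# I = V_drop / R_shunt. Stacking an equal resistor in parallel halves the
# effective resistance, so the same real current produces half the drop and
# the card reports half the power, hitting the 120% limiter much later.
# All numbers below are illustrative assumptions, not measured TXP values.

R_STOCK = 0.005     # ohms, assumed stock shunt value
R_PARALLEL = 0.005  # ohms, resistor soldered on top of the stock shunt
r_modded = (R_STOCK * R_PARALLEL) / (R_STOCK + R_PARALLEL)  # parallel combo

real_current = 20.0                   # amps actually flowing (example)
v_drop = real_current * r_modded      # what the controller measures
reported_current = v_drop / R_STOCK   # controller still assumes stock R
print(reported_current)               # half the real current is reported
```

The flip side is that the limiter no longer protects anything accurately, which is why people treat the shunt mod as strictly at-your-own-risk.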


----------



## KillerBee33

Quote:


> Originally Posted by *Cuylertech*
> 
> Havnt been on here in forever... sitting in the first 10 or so slots for single XP + 6850K on Firestrike. Boosts Safe under a EK block to 2.1Ghz, running + 245 core/ +700 memory.
> 
> http://www.3dmark.com/fs/9787592


That's weird; this is +230 / +650, 120% power, no added voltage, under EVGA's Hybrid:
http://www.3dmark.com/fs/10011279


----------



## Cuylertech

Quote:


> Originally Posted by *KillerBee33*
> 
> That's weird , this is +230+650 P120 no Voltage under EVGAs Hybrid
> http://www.3dmark.com/fs/10011279


That is really weird... I did a new build with the GPU and the CPU, so I guess I'll see what it does. The CPU at least is highest on HWBot and 3DMark for clocks...


----------



## KillerBee33

Quote:


> Originally Posted by *Cuylertech*
> 
> That is really weird... I did a new build with the GPU and the CPU.. guess ill see what it does. CPU at least is highest on hwb and 3dmark for clocks...


Heh, the weirdest part is that I can get these GFX scores only with the stock 6700 @ 4.2; I tried 4.6, 4.7 and 4.8 and it drops to the 32000s.


----------



## Asmodian

Quote:


> Originally Posted by *Ostrava*
> 
> Thanks so much stateless, turns out they're exactly where you say!
> Its as if they were deliberately trying to hide them!
> Also thanks for letting me know about evga's HB bridge. +Rep!


Quote:


> Originally Posted by *Stateless*
> 
> No problem. I thought it was a very weird place for them to put the screws. Enjoy putting your blocks on. So far it is working great for me. I had a scare when one card hit 68c but the other was at 42c. Turned out I had some major air bubbles that happen to get stuck at the wrong place and caused it to heat up. I ran my pumps for a while to bleed out some more air bubbles and re-ran the same game that caused it to hit 68c and it only hit 51c. This was at +200 to core, +200 to memory and it hitting 2,000 rock steady.


I contacted EK's support about this the first time I bought a backplate from them. They probably have to answer this a lot.








Quote:


> Originally Posted by *KillerBee33*
> 
> Heh the weirdest part is tht i can get these GFX scores only with Stock 6700 @ 4.2 , tried with 4.6 , 4.7 and 4.8 and it drops to 32000's


I have a very similar system, but at 4.7 GHz; I got 31876.








http://www.3dmark.com/fs/10026489

edit:
I get the same graphics score at stock: 31964
http://www.3dmark.com/fs/10026245
The GPU was at the same +190 MHz 120% power for both runs, 2050 MHz usually, with short moments of 2063 and 2038.


----------



## Bramadan

I got a few questions for Vega as he seems to be running a more advanced version of what I am aiming for, although obviously I would welcome input from the rest of you guys as well.

My only goal here is high-visual-fidelity gaming. I have no particular interest at all in benchmarking. Also, I value resolution more than ultra-smooth animation, so I will be going 4K@60 for a while.

I set up my 4K rig about two years ago. It consists of 3x 980 in SLI and an X99 board with an i7-5930 (and a Corsair 1200W power supply). The monitor I use is the ASUS PQ32. It has worked more or less fine (occasional annoyance with SLI-related artifacts etc.), but it's starting to get a bit long in the tooth (I could not quite get 60 Hz in most recent games, and The Division was borderline unplayable), and I feel it's time for upgrades. Also, I am getting tired of only 32 inches of screen and am seriously considering switching to the LG 55" OLED.

I got money but I do not like wasting it. Also, I am not particularly confident in my mechanical abilities so any major modifications are out of the question (though obviously I can assemble components etc...)

My current idea was to get LG55E6 and a pair of Titan XPs in SLI.
My questions are:

a) Is it true that 55B6 is the same TV as 55E6 except for the soundboard (for which I don't care as I have proper surround) and the screen mounting?

b) Is the second Titan overkill? I would like to run everything at 60 Hz near or at max settings for the next two to three years, and I figure I may actually need some AA at 55 inches. On the other hand, not having to worry about SLI issues also has its advantages. Also, with only one card it would be easier to justify upgrading again much sooner.

c) Am I better off upgrading the board and the CPU? I don't think so, because so far I have never felt I was running into a CPU bottleneck, but maybe I am not measuring things correctly.

d) Should I maybe just get a TV and wait one more generation and upgrade whole rig at once?


----------



## ypaul123

Hey, I was wondering how many FPS you guys who run the Titan X coupled with an X99 board are getting in 4K.
I'm getting 95+ FPS running the Titan X and a 6900K... is that normal?

Thanks for your help.


----------



## DADDYDC650

Worth going 4K with a single XP? There's a 40-inch 4K Samsung on sale for $380.


----------



## MrKenzie

Quote:


> Originally Posted by *DADDYDC650*
> 
> Worth going 4k with a single XP? There's a 40 inch 4k Samsung on sale for $380.


4K is usable on a single 1080, so yes, definitely worth it with a Titan XP. If you expect to have all settings at max in every game, even SLI Titan XPs can't handle that, so don't be too concerned with it!


----------



## cisco0623

Quote:


> Originally Posted by *tpwilko08*
> 
> mine at stock at 100% power target boosts to 1886mhz while gaming.


Absolutely the same deal for me. I didn't even notice it! I did +200 and it goes to 2063 no sweat. I'm going to try more tonight. I really hope a custom BIOS comes out for these bad boys.


----------



## DADDYDC650

Quote:


> Originally Posted by *MrKenzie*
> 
> 4K is usable on a single 1080, so yes definitely worth it with a titan xp. If you expect to have all settings at max in every game even SLI titan xp cant handle it so don't be too concerned with that!


I'm mainly going to be playing DOOM, BF1, the new COD, Rocket League, Forza Horizon 3 and Gears 4. I'm guessing Forza and Gears 4 will give my XP a hard time. It's either the 4K Sammy, the 34" LG UltraWide, or the Acer X34P, but that costs a ton for what you get...


----------



## NYSE

Incredible how much better these cards are than the prior Titan.


----------



## Nizzen

I have 150-160 FPS with 2x 1080 with the HB bridge in the BF1 beta. SLI works in DX11 with the second-newest driver. Ultra settings at 3440x1440.








6900K CPU.


----------



## KillerBee33

Quote:


> Originally Posted by *Asmodian*
> 
> I contacted EK's support about this the first time I bought a backplate from them. They probably have to answer this a lot.
> 
> 
> 
> 
> 
> 
> 
> 
> I have a very similar system, but at 4.7 GHz; I got 31965.
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/10026245
> 
> edit:
> I get the same graphics score at stock: 31964
> http://www.3dmark.com/fs/10026370
> The GPU was at the same +190 MHz 120% power for both runs, 2050 MHz usually, with short moments of 2063 and 2038.


Try raising to +220 / +600 and a more aggressive fan profile.


----------



## toncij

Quote:


> Originally Posted by *NYSE*
> 
> Incredible how much better these cards are than the prior Titan


In what way? Sheer performance is 60-70% higher. That's a lot.








Quote:


> Originally Posted by *Nizzen*
> 
> I have 150-160 fps with 2x 1080 with hb bridge in BF1 beta. Sli works in dx11 with 2. newest driver. Ultrasettings and 3440x1440
> 
> 
> 
> 
> 
> 
> 
> 
> 6900k cpu.


What driver would that be? On 372.54 or 372.70 it does not work.


----------



## KillerBee33

This block is on its way; decided to just Dremel that 6-pin opening.


----------



## Nizzen

Quote:


> Originally Posted by *toncij*
> 
> In what way? Sheer power is 60-70%. That's a lot.
> 
> 
> 
> 
> 
> 
> 
> 
> What driver would that be? On 372.54 or 372.70 it does not work.


372.54









80-99% GPU load on both GPUs.

One GTX 1080 is 80-90 FPS;
two GTX 1080s are 150-160 FPS.

At first it did not work, but I ran DDU and installed 372.54 again.
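For the curious, those numbers work out to roughly 80%+ scaling from the second card. Rough math using the midpoints of the reported ranges:

```python
# Rough SLI scaling estimate from the BF1 beta numbers above,
# using the midpoints of the reported FPS ranges.
single = (80 + 90) / 2   # one GTX 1080: 85 fps midpoint
dual = (150 + 160) / 2   # two GTX 1080s: 155 fps midpoint
scaling = (dual / single - 1) * 100  # extra performance from card #2
print(round(scaling, 1))             # second card adds ~82% on top
```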


----------



## Grubdog

So, for the guys who are getting 50s and 40s on their Titans: how many rads do y'all have, and do you have a CPU in the loop?


----------



## aylan1196

I average 50 to 54°C with an EK Predator 360, CPU included in the loop.
X99 ASUS rev10, 5960X.


----------



## tpwilko08

Max temp 38°C at 26°C ambient; this is with two 60mm-thick rads (a 480 and a 240) in its own loop.


----------



## Buford458

How does one increase the fan speed on an Nvidia Titan X? I'm running 83.0°C under full load.


----------



## combat fighter

Quote:


> Originally Posted by *Cuylertech*
> 
> Havnt been on here in forever... sitting in the first 10 or so slots for single XP + 6850K on Firestrike. Boosts Safe under a EK block to 2.1Ghz, running + 245 core/ +700 memory.
> 
> http://www.3dmark.com/fs/9787592


I only need +220 to get a flat 2.1 GHz on the core. Memory +700-800.


----------



## Cuylertech

That was on air, which I suspect was throttling. Gonna try again later today under water.


----------



## Grubdog

Quote:


> Originally Posted by *aylan1196*
> 
> I average 50 to 54 with ek predator 360 CPU included in the loop
> X99 asus rev10 5960x


Is your 5960X overclocked? I have an EK Predator 360 and a 6850K overclocked to 4.3 at 1.3 V in the same loop, and my Titan hits 70°C with 100% fan.


----------



## CallsignVega

Quote:


> Originally Posted by *Bramadan*
> 
> I got a few questions for Vega as he seems to be running a more advanced version of what I am aiming for, although obviously I would welcome input from the rest of you guys as well.
> 
> My only goal here is a high visual fidelity gaming. I have no particular interest at all in benchmarking. Also I value resolution more then ultra smooth animation so I will be going [email protected] for a while.
> 
> I set up my 4k rig about two years ago. It consists of 3x980 in SLI and x99 board with i7-5930 (and Corsair 1200W power supply). Monitor I use is ASUS PQ32. It worked more or less fine (occasional annoyance with SLI related artifacts etc...) but it's starting to get a bit long in a tooth (could not quite get 60hz in most recent games and Division was borderline unplayable) and I feel there is time for upgrades. Also, I am getting tired of only 32 inches screen and am seriously considering switching to LG 55" Oled.
> 
> I got money but I do not like wasting it. Also, I am not particularly confident in my mechanical abilities so any major modifications are out of the question (though obviously I can assemble components etc...)
> 
> My current idea was to get LG55E6 and a pair of Titan XPs in SLI.
> My questions are:
> 
> a) Is it true that 55B6 is the same TV as 55E6 except for the soundboard (for which I don't care as I have proper surround) and the screen mounting?
> 
> b) Is the second Titan an overkill? I would like to run everything at 60hz near or at max settings for next two to three years and I figure I may actually need some AA at 55 inches. On the other hand not having to worry about SLI issues also has its advantages. Also, with only one card it would be easier to justify upgrading again much sooner.
> 
> c) Am I better off upgrading board and the CPU? I don't think I am because so far I never felt I am running into CPU bottleneck but maybe I am not measuring things correctly.
> 
> d) Should I maybe just get a TV and wait one more generation and upgrade whole rig at once?


a. LG only makes two OLED panels for TVs in 2016: a 55" and a 65". All the sets with those panels use identical screens; it just depends on whether you want flat or curved, and which speaker/bezel setup. I personally think the 55OLEDC6P is the best for a computer monitor.

b. The second Titan-XP is needed if you never want to drop below 60 FPS in _all_ games. A single Titan-XP can run 4K @ 60 Hz in virtually all games, a few more recent ones you will have to turn down some settings or get a second card if you want to keep everything maxed out in all titles.

c. Your MB/CPU combo is fine. They don't have to work too hard for only 60 FPS.

d. You could always get one Titan-XP and the OLED now, and then get the second Titan-XP down the line if and when you/the games you play need it.


----------



## aylan1196

Quote:


> Originally Posted by *Grubdog*
> 
> Is your 5960x overclocked? I have an ek predator 360 and a 6850k overclocked to 4.3 1.3v in the same loop and my titan hits 70c with 100% fan.


Yes, it's overclocked to 4.4 at 1.35 V. I have the EK Predator in push/pull as intake in an Evolv ATX.


----------



## Asmodian

Quote:


> Originally Posted by *KillerBee33*
> 
> Try raising +220+600 and more aggressive Fan Profile


My Titan is actually under water (max 42°C). It also seems to be a bad clocker, because it isn't stable when stress testing at even +200 MHz. It is also quite close to the power limit, so overclocking the memory seems counterproductive.


----------



## jcde7ago

Finally finished the 3x Titan XM > 2x Titan XP swap out last night!

The loop's been bleeding for over 12 hours now, zero problems... that said, I did have a near heart attack last night during the first leak test. It turned out one of the EK parallel-connector o-rings was missing from one side, causing a very slow leak over ~20 minutes. I spotted it, fixed it, and everything's been perfect since.

Got super lucky as the 2x SLI configuration on the X-99 Deluxe aligns perfectly with the height of the new 3-slot parallel connector, so I got to skip out on bending more acrylic tubing to make things fit! This was literally a drain + GPU swap out and I couldn't be happier.









Running a +190 core / +300 mem OC right now, no voltage adjustments, just the 120% power limit in AB... performance has been AMAZING at 3440x1440 @ 100 Hz in The Witcher 3 with EVERYTHING cranked up... wow. Deus Ex: Mankind Divided still has issues with SLI but runs really, really well on just a single TXP.





On a side note, I ended up dremeling the Nvidia HB SLI bridge's tips to fit with the EK waterblock connector....so far, it seems to be working just fine...is everyone else doing this or just going for the EVGA HB bridge? The piece I trimmed off seemed to just be empty PCB and I haven't seen any negative effects playing games in SLI.

Edit: The 3x Titan X Maxwells all sold (to the same guy) on eBay for $685 each with the EK blocks attached, just in case other TXM owners are looking to sell theirs and move on to the TXP. Getting ~60% back of what I paid is not a bad ROI, considering I enjoyed the TXMs for ~18-19 months and paid the tech-creep/early-adopter fees.


----------



## Grubdog

Quote:


> Originally Posted by *aylan1196*
> 
> Yes it's overclocked to 4.4 1.35v I have the ek predator in push pull intake in evolve atx


Hmmm, mine is set as exhaust on the top of my Corsair 760T. I wonder if I need to reseat my waterblock... Did you leave the pump plugged into the Predator, or are you controlling it from the mobo?


----------



## toncij

How many FPS are you getting on your Titans in BF1 maxed out?


----------



## pez

Quote:


> Originally Posted by *DADDYDC650*
> 
> I'm mainly going to be playing DooM, BF1, new COD, Rocket League, Forza 3 Horizon and Gears 4. I'm guessing perhaps Forza and Gears 4 will give my XP a hard time. It's either the 4k Sammy, 34" LG UltraWide or Acer x34p but that costs a ton for what you get...


I had 4K with 2x 1080s and felt it was very nice, but I wouldn't recommend it with a single Titan. Too many compromises would have to be made to get 60+ FPS at 4K. 2x Titan XPs, on the other hand, would be a great setup for 4K.


----------



## eliau81

Quote:


> Originally Posted by *eliau81*
> 
> maybe i will try to contact them and be honest


Quote:


> Originally Posted by *profundido*
> 
> in case they cannot fix it and the card is lost, just swallow the sore pill since you learned a valuable lesson here. You won't be so careless and daunting next time with a card of this value and stick to safer more proven methods of cooling like e.g EK blocks, read and follow guidelines better, use the plastic washers everywhere, don't overtighten screws etc.... I sincerely hope it won't cost you the card eventually to learn that but in the long run it will benefit you and the industry I'm sure so if it comes down to that just realize that all you lost was some numbers on your bank account


Quote:


> Originally Posted by *Feklar*
> 
> Looks like pliers were used and not carefully to be sure. Nvidia will never honor that card for warranty after they inspect it. Very expensive lesson.


Quote:


> Originally Posted by *HyperMatrix*
> 
> Speaking of warranty....I was on the phone with Nvidia today regarding one of my cards which is acting up under the new drivers. The rep tried telling me that overclocking voids my warranty and that I should only run my card at stock settings. At this point I yelled a bunch of expletives and told him to transfer me to a supervisor. I've never been under the impression that video card overclocking would void warranty since it's blocked in the driver by thermal and power limits. Unless, of course, you're using a custom bios. So this did tick me off.


Quote:


> Originally Posted by *eliau81*
> 
> Oh god, what have I done?!
> Installed the Hybrid EVGA 980 cooler and the card doesn't work! Oh god.
> I'm uploading some photos; only the pump works, the GTX LED logo doesn't light up.
> Please help, somebody


Quote:


> Originally Posted by *5150 Joker*
> 
> RIP Titan X. I hope you don't try to claim warranty for something you did. Should've left it with stock cooling, it's not like you gain any worthwhile performance with water or these mods some are doing.


Quote:


> Originally Posted by *skypine27*
> 
> Eliau81:
> 
> I'm sorry to say I'd have to kind of side with Joker on this one. If you aren't going to go full custom loop and get respectable, tested full blocks from EK or aqua-computer.de, then stick with the stock coolers. Taking an AIO off a different card and going that route doesn't seem worth the risk.
> 
> I really hope you get your money back and nvidia doesn't read this thread !
> 
> Side comment:
> 
> So I was running more Fire Strike Ultra benches and this time had the power limit displayed in the Precision X OSD.
> 
> I have the power slider set to its max of 120%, +175 MHz on the GPU clock, and +500 MHz on the memory.
> 
> I was watching clock speeds and power usage (the temps are fine to me under the EK blocks) and noticed the cards seemed to max out at 2000 and then one card or the other would smack into the 120% (briefly would see 121%) and then one or both would rapidly drop down to 1975, 1987, 1974, 1962, etc. The highest I would see was 2000 and it wasn't sustained very long before that power limit hit. I ran FSU back to back in this config and got:
> Run 1: 13817
> Run 2: 13749
> 
> I really hope someone can crack the BIOS and let us crank that slider to 125% or 130% and not have to deal with strange PCB mods !!


RMA!!!








the green team approved a replacement for my broken card








after explaining the whole story (my unprofessional work) and how I scratched the PCB, because honestly it was my fault,
they told me that what I did voids the warranty, and then the next thing I got was a stunning email from the eRMA team that my RMA was approved








I was in shock and didn't believe it at first, but then I sent in my broken card,
and a few days later FedEx was at my door with a brand new unopened card.
Wow!!!

14215380_10208650611757477_1701708751_o.jpg 181k .jpg file


----------



## piee

4MM HEX WRENCH


----------



## s1rrah

Is there any official word on when either a hybrid water cooled version or, even just a Hybrid cooler sold separately, might be available?

I'm chomping at the bit to buy one of these to replace my dual GTX 980 Hybrids ... but not interested in air cooling and am not going to invest in a custom loop.

The hybrid kits on my 980's have been one of the best investments I've ever made and really want a Titan X Pascal hybrid ( and not really interested in modding/hacking a 1080 hybrid kit, though I will if it ends up the only option) ...

???


----------



## profundido

Quote:


> Originally Posted by *eliau81*
> 
> O M G big gratz !! You lucky son of a gun. Can't believe they gave you a new one lol =P =P
> 
> RMA!!!
> 
> 
> 
> 
> 
> 
> 
> 
> the green team approved a replacement for my broken card
> 
> 
> 
> 
> 
> 
> 
> 
> after explaining the whole story (my unprofessional work) and how I scratched the PCB, because honestly it was my fault,
> they told me that what I did voids the warranty, and then the next thing I got was a stunning email from the eRMA team that my RMA was approved
> 
> 
> 
> 
> 
> 
> 
> 
> I was in shock and didn't believe it at first, but then I sent in my broken card,
> and a few days later FedEx was at my door with a brand new unopened card.
> Wow!!!
> 
> 14215380_10208650611757477_1701708751_o.jpg 181k .jpg file


----------



## Lennyx

Quote:


> Originally Posted by *eliau81*
> 
> RMA!!!
> 
> 
> 
> 
> 
> 
> 
> 
> the green team approved a replacement for my broken card
> 
> 
> 
> 
> 
> 
> 
> 
> after explaining the whole story (my unprofessional work) and how I scratched the PCB, because honestly it was my fault,
> they told me that what I did voids the warranty, and then the next thing I got was a stunning email from the eRMA team that my RMA was approved
> 
> 
> 
> 
> 
> 
> 
> 
> I was in shock and didn't believe it at first, but then I sent in my broken card,
> and a few days later FedEx was at my door with a brand new unopened card.
> Wow!!!
> 
> 14215380_10208650611757477_1701708751_o.jpg 181k .jpg file


Congrats. It's nice to see that it can pay off to be honest and tell the truth. Have fun with your new card, but please stay away from the violence this time if you're gonna do any custom work on it.


----------



## Cuylertech

Quote:


> Originally Posted by *s1rrah*
> 
> Is there any official word on when either a hybrid water cooled version or, even just a Hybrid cooler sold seperately, might be available?
> 
> I'm chomping at the bit to buy one of these to replace my dual GTX 980 Hybrids ... but not interested in air cooling and am not going to invest in a custom loop.
> 
> The hybrid kits on my 980's have been one of the best investments I've ever made and really want a Titan X Pascal hybrid ( and not really interested in modding/hacking a 1080 hybrid kit, though I will if it ends up the only option) ...
> 
> ???


Yes. EVGA confirmed they will be releasing one in the future, but as of now it's in the early R&D stages.


----------



## Asmodian

Quote:


> Originally Posted by *s1rrah*
> 
> Is there any official word on when either a hybrid water cooled version or, even just a Hybrid cooler sold seperately, might be available?
> 
> I'm chomping at the bit to buy one of these to replace my dual GTX 980 Hybrids ... but not interested in air cooling and am not going to invest in a custom loop.
> 
> The hybrid kits on my 980's have been one of the best investments I've ever made and really want a Titan X Pascal hybrid ( and not really interested in modding/hacking a 1080 hybrid kit, though I will if it ends up the only option) ...
> 
> ???


We are currently under the assumption that this is the only Titan X (Pascal) card that is going to be released. There has been no news, or even rumors, of anything else.

A custom loop is also a great investment; you usually only need to get a new block when upgrading, and a well-maintained loop can last a long time. My experience this time was excellent: I simply swapped EK's hose mount from my old 980 Ti to the Titan X, so I didn't even have to disconnect any tubing. Still, it is probably around $500 to start a decent custom loop, and they do take some work to install the first time.
Quote:


> Originally Posted by *Cuylertech*
> 
> Yes. EVGA confirmed they will be releasing one in the future but as of now its in early R&D stages.


Oops! They are releasing the kit after all? Nice.


----------



## Stateless

Quote:


> Originally Posted by *MrKenzie*
> 
> 4K is usable on a single 1080, so yes definitely worth it with a titan xp. If you expect to have all settings at max in every game even SLI titan xp cant handle it so don't be too concerned with that!


What game can't SLI Titan X Pascal handle at 4K/60fps max settings? Pretty much every game I've tested so far can do it. The Witcher 3, even with the highest HairWorks settings, easily holds 4K/60fps. Crysis 3 is the same, even using the highest AA settings (at some points it dips a little), but it's mainly 60fps throughout.


----------



## Stateless

Quote:


> Originally Posted by *cisco0623*
> 
> Absolutely the same deal for me. I didn't even notice it! I did +200 and it goes to 2063 no sweat. I'm going to try more tonight. I really hope the custom bios comes out for these bad boys.


WOW. My cards must suck then. For me, under water with the max power setting and +200 on the core, I max boost to 2025 and then it drops to a steady 2000. Temps never get higher than 45°C on the top card.


----------



## aylan1196

Quote:


> Originally Posted by *Grubdog*
> 
> Hmmm, mine is exhaust on the top of my corsair 760t. Wonder if I need to reseat my waterblock... did you leave the pump plugged into the predator or are you controlling it from the mobo?


Pump connected to the mobo


----------



## Kyouki

Quote:


> Originally Posted by *toncij*
> 
> How much FPS you're getting on your Titans on BF1 maxed out?


I'm getting a solid 200, maybe dropping to 190 a few times, on ultra at 2560x1440 with no overclock on my Titan yet. Temps with the EK block while gaming over a few hours never passed 50°C now that I fixed my airflow.

I'm basing this off the green 200 number in the top right corner; not sure if that's the MSI Afterburner overlay because I've never used it before.


----------



## Asmodian

Quote:


> Originally Posted by *Stateless*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cisco0623*
> 
> Absolutely the same deal for me. I didn't even notice it! I did +200 and it goes to 2063 no sweat. I'm going to try more tonight. I really hope the custom bios comes out for these bad boys.
> 
> 
> 
> WOW. My cards must suck then. For me under water and max power setting and +200 on the core I max boost to 2025 and then it drops to a steady 2000. Temps never get higher than 45c on the top card.

Can you run stable higher than 2025 MHz?

My card boosts to 2063 when at +190, quickly falling to 2050, then even to 2037 if I am pushing it (but it is still not power throttling). It is never above 42°C. It is not stable at +200.

I have heard they boost to different max clocks at stock so it makes sense. I really want BIOS editing too.
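Side note on the numbers in this ladder: 2063, 2050, 2037, 2025 are spaced roughly 13 MHz apart, which matches the common observation that GPU Boost moves the clock in fixed bins rather than continuously. A minimal Python sketch of that behavior (the ~12.7 MHz bin size is an observed approximation from cards in this thread, not an official NVIDIA figure):

```python
# Rough model of Pascal boost behavior: the clock moves in fixed ~12.7 MHz
# bins, so throttling one bin at a time produces the familiar ladder of values.
BIN_MHZ = 12.7  # approximate bin size (observation/assumption, not a spec value)

def boost_ladder(top_clock_mhz, steps):
    """Clocks reached by dropping `steps` boost bins from the top clock."""
    return [round(top_clock_mhz - i * BIN_MHZ) for i in range(steps)]

# Close to the 2063 -> 2050 -> 2037 -> 2025 ladder reported in this thread.
print(boost_ladder(2063, 4))
```

So a card that "drops from 2063 to 2050" hasn't become unstable; it has just shed one power or thermal bin.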


----------



## combat fighter

Quote:


> Originally Posted by *Asmodian*
> 
> I really want BIOS editing too.


We all want that!


----------



## Vellinious

Quote:


> Originally Posted by *Asmodian*
> 
> Can you run stable higher than 2025 MHz?
> 
> My card boosts to 2063 when at +190, quickly falling to 2050, then even to 2037 if I am pushing it (but it is still not power throttling). It is never above 42°C. It is not stable at +200.
> 
> I have heard they boost to different max clocks at stock so it makes sense. I really want BIOS editing too.


This is exactly what mine does.


----------



## s1rrah

Quote:


> Originally Posted by *Stateless*
> 
> What game can't SLI Titan X Pascal handle at 4K/60fps max settings? Pretty much every game I've tested so far can do it. The Witcher 3, even with the highest HairWorks settings, easily holds 4K/60fps. Crysis 3 is the same, even using the highest AA settings (at some points it dips a little), but it's mainly 60fps throughout.


LMAO at Crysis 3 still being mentioned as one of the heaviest GPU killers. And it's true ... I still play Crysis 3 a few times a week (2nd or 3rd playthrough) at 1440p and it's still heavier than 99% of the other titles I own (the Metro games are a really close second!) ... and Witcher 3 actually trumps Crysis 3 as a GPU killer, I think ... (could be wrong) ...

But seeing Crysis 3 in current-day GPU benchmarks is always fun ... "way ahead of its time" doesn't even begin to describe that tech ...


----------



## CRITTY

Quote:


> Originally Posted by *eliau81*
> 
> RMA!!!
> 
> 
> 
> 
> 
> 
> 
> 
> the green team approved a replacement for my broken card
> 
> 
> 
> 
> 
> 
> 
> 
> after explaining the whole story (my unprofessional work) and how I scratched the PCB, because honestly it was my fault,
> they told me that what I did voids the warranty, and then the next thing I got was a stunning email from the eRMA team that my RMA was approved
> 
> 
> 
> 
> 
> 
> 
> 
> I was in shock and didn't believe it at first, but then I sent in my broken card,
> and a few days later FedEx was at my door with a brand new unopened card.
> Wow!!!
> 
> 14215380_10208650611757477_1701708751_o.jpg 181k .jpg file


Glad to hear that.


----------



## xarot

If that user-damaged card was approved for RMA, there still might be hope for us who like to change the stock cooler to a waterblock but not damage anything.


----------



## CRITTY

Quote:


> Originally Posted by *eliau81*
> 
> RMA!!!
> 
> 
> 
> 
> 
> 
> 
> 
> the green team approved a replacement for my broken card
> 
> 
> 
> 
> 
> 
> 
> 
> after explaining the whole story (my unprofessional work) and how I scratched the PCB, because honestly it was my fault,
> they told me that what I did voids the warranty, and then the next thing I got was a stunning email from the eRMA team that my RMA was approved
> 
> 
> 
> 
> 
> 
> 
> 
> I was in shock and didn't believe it at first, but then I sent in my broken card,
> and a few days later FedEx was at my door with a brand new unopened card.
> Wow!!!
> 
> 14215380_10208650611757477_1701708751_o.jpg 181k .jpg file


Glad to hear that.
Quote:


> Originally Posted by *Stateless*
> 
> What game can't SLI Titan X Pascal handle at 4K/60fps max settings? Pretty much every game I've tested so far can do it. The Witcher 3, even with the highest HairWorks settings, easily holds 4K/60fps. Crysis 3 is the same, even using the highest AA settings (at some points it dips a little), but it's mainly 60fps throughout.


Poopy Deus Ex, which supposedly "supports" SLI, can't do 4K/60fps maxed even with AA off.


----------



## eliau81

Quote:


> Originally Posted by *xarot*
> 
> If that user-damaged card was approved for RMA, there still might be hope for us who like to change the stock cooler to a waterblock but not damage anything.


Hmm, I think I was just lucky. They did mention that what I did voids the warranty ...


----------



## Maintenance Bot

Quote:


> Originally Posted by *toncij*
> 
> How much FPS you're getting on your Titans on BF1 maxed out?


DX12 average around 70fps at 4k, 130fps at 1440p.


----------



## cisco0623

Quote:


> Originally Posted by *Stateless*
> 
> WOW. My cards must suck then. For me under water and max power setting and +200 on the core I max boost to 2025 and then it drops to a steady 2000. Temps never get higher than 45c on the top card.


What are you using to test? I'd like to compare. I don't go over 34c lol (radiator overkill and AC pumping)


----------



## DooRules

The warmest my Titan gets on water is 32-33°C running Heaven. It idles around 20-21°C. It is on its own loop with a 4-fan rad.


----------



## Bramadan

Quote:


> Originally Posted by *CallsignVega*
> 
> a. LG only makes two OLED panels in 2016 for TV's. A 55" and a 65". All of the screens are identical with those panels, just depends on if you want flat or curved, or what speaker/bezel setup. I personally think the 55OLEDC6P is the best for a computer monitor.
> 
> b. The second Titan-XP is needed if you never want to drop below 60 FPS in _all_ games. A single Titan-XP can run 4K @ 60 Hz in virtually all games, a few more recent ones you will have to turn down some settings or get a second card if you want to keep everything maxed out in all titles.
> 
> c. Your MB/CPU combo is fine. They don't have to work too hard for only 60 FPS.
> 
> d. You could always get one Titan-XP and the OLED now, and then get the second Titan-XP down the line if and when you/the games you play need it.


Thanks a lot!
I appreciate it. I will do exactly that, except I will probably go for 55B6 cause it may get some use as TV as well and I am not sure how well curved screen would work for that.

Thanks again!




----------



## jcde7ago

Quote:


> Originally Posted by *Vellinious*
> 
> This is exactly what mind does.


Same here.

I've had to drop it down to +180 MHz core / +250 mem in AB with the 120% power limit to get my TXPs rock solid stable... even then the core boosts past 2000 MHz quite easily. Pushing 90+ FPS in The Witcher 3 @ 3440x1440 with absolutely everything cranked to the max.


----------



## CallsignVega

Quote:


> Originally Posted by *Stateless*
> 
> What game can't SLI Titan X Pascal handle at 4K/60fps max settings? Pretty much every game I've tested so far can do it. The Witcher 3, even with the highest HairWorks settings, easily holds 4K/60fps. Crysis 3 is the same, even using the highest AA settings (at some points it dips a little), but it's mainly 60fps throughout.


I've never seen any game even come close to dropping below 60 FPS at 4K with Titan-XP SLI. The real test will be once 4K 120 Hz displays hit.


----------



## cisco0623

Quote:


> Originally Posted by *CallsignVega*
> 
> I've never seen any game even come close to dropping below 60 FPS at 4K with Titan-XP SLI. The real test will be once 4K 120 Hz displays hit.


Since I'm not well informed on the monitor industry are we expecting to see these displays soon?


----------



## jcde7ago

Quote:


> Originally Posted by *cisco0623*
> 
> Since I'm not well informed on the monitor industry are we expecting to see these displays soon?


A year away at least...18 months is a safe bet.

The real question is how long until it's 4K@144Hz, which is probably a $2,000 minimum investment, not to mention the GPU power required to drive it.

Personally I'm waiting on 21:9 monitors @144Hz+....don't think I'll ever go back from an ultrawide.


----------



## cisco0623

Quote:


> Originally Posted by *jcde7ago*
> 
> A year away at least...18 months is a safe bet.
> 
> The real question is how long until it's 4K@144Hz, which is probably a $2,000 minimum investment, not to mention the GPU power required to drive it.
> 
> Personally i'm waiting on 21:9 monitors @144hz+....don't think i'll ever go back from an Ultrawide.


Thanks, good to know! I'm gaming at 1600p. I've honestly never had anything over 60Hz, which is why I'm leaning toward Acer's 32" 4K screen.


----------



## CallsignVega

Quote:


> Originally Posted by *cisco0623*
> 
> Since I'm not well informed on the monitor industry are we expecting to see these displays soon?


A 144 Hz 3440x1440 panel goes into production in Q1. That requires DP 1.3/1.4 as does 4K @ 120 Hz. I'd say spring at the earliest.
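For context on why DP 1.3/1.4 is the gating factor, the bandwidth arithmetic is easy to check. A rough Python sketch (the link rates are the standard effective 4-lane figures; the 7% blanking overhead is an assumption standing in for real CVT-R2 timing):

```python
# Why 4K@120Hz needs DisplayPort 1.3/1.4: the stream no longer fits in DP 1.2.
DP12_GBPS = 17.28  # effective 4-lane HBR2 payload, Gbit/s
DP13_GBPS = 25.92  # effective 4-lane HBR3 payload, Gbit/s

def video_gbps(width, height, hz, bpp=24, blanking=1.07):
    """Approximate stream bandwidth in Gbit/s; `blanking` is a rough overhead factor."""
    return width * height * hz * bpp * blanking / 1e9

need = video_gbps(3840, 2160, 120)
print(f"4K@120Hz needs ~{need:.1f} Gbit/s (DP 1.2: {DP12_GBPS}, DP 1.3: {DP13_GBPS})")
```

By this estimate 4K@120Hz just squeezes under HBR3's limit, which is why it had to wait for DP 1.3-capable panels and GPUs.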


----------



## Testier

How important is it to monitor liquid temperature in a custom loop, and how can I add a temperature sensor? It seems most of them use a G1/4" port, and I am not sure I can find any spare ports on my components.


----------



## Asmodian

Quote:


> Originally Posted by *KillerBee33*
> 
> Try raising +220+600 and more aggressive Fan Profile


Quote:


> Originally Posted by *Asmodian*
> 
> My Titan is actually under water (max 42°C). It also seems to be a bad clocker because it isn't stable when stress testing even at only +200MHz. It is also quite close to the power limit so overclocking the memory seems counter productive.


I did try it. At +190/+600 the power limit seems about the same, maybe it de-clocks one step (e.g. 2025 instead of 2037), but benchmark scores are always better.








Firestrike Graphics score at 32999
Timespy graphics score at 10858

Thanks for the advice.


----------



## cisco0623

Quote:


> Originally Posted by *CallsignVega*
> 
> A 144 Hz 3440x1440 panel goes into production in Q1. That requires DP 1.3/1.4 as does 4K @ 120 Hz. I'd say spring at the earliest.


Thanks. I'm going back and forth between the 32" 4K and the 34" 100Hz 1440p Acer as well. 120Hz 4K and a 144Hz ultrawide will probably demand Volta lol


----------



## Stateless

Quote:


> Originally Posted by *Asmodian*
> 
> Can you run stable higher than 2025 MHz?
> 
> My card boosts to 2063 when at +190, quickly falling to 2050, then even to 2037 if I am pushing it (but it is still not power throttling). It is never above 42°C. It is not stable at +200.
> 
> I have heard they boost to different max clocks at stock so it makes sense. I really want BIOS editing too.


Quote:


> Originally Posted by *cisco0623*
> 
> What are you using to test? I'd like to compare. I don't go over 34c lol (radiator overkill and AC pumping)


Valley and 3dmark Fire Strike and gaming (Witcher 3, Crysis 3, The Division).


----------



## pompss

Playing Doom with all settings at ultra and all filters set to max, I get a stable 144 fps at 1440p








This Titan is a beast!!

Now I'm waiting for my third monitor for some 7680x1440 testing.

Decided to go with 3 144Hz monitors instead of the 65-inch LG OLED 4K TV.


----------



## Rayge

First post on here. Just finished my build a few days ago. I haven't been able to build a rig in the last 8 years due to traveling; I've always had to have a laptop. I have never fooled with water before, so this is my first time. I'm waiting on some custom tempered glass since the panel that came with the case shattered during shipping, and an RMA would have been too much of a hassle because I put everything together before unpacking the glass to check it. The card clocks in at 2114 MHz in 3DMark.


----------



## jcde7ago

Quote:


> Originally Posted by *Rayge*
> 
> First post on here. Just finished my build a few days ago. I haven't been able to build a rig in the last 8 years due to traveling; I've always had to have a laptop. I have never fooled with water before, so this is my first time. I'm waiting on some custom tempered glass since the panel that came with the case shattered during shipping, and an RMA would have been too much of a hassle because I put everything together before unpacking the glass to check it. The card clocks in at 2114 MHz in 3DMark.


Heya, welcome to OCN and grats on the build! The TXP is one heck of a card, enjoy it!


----------



## dureiken

Hi mates,

I just bought a TXP







I have 2 questions :

are backplates needed and useful for watercooling with an EK block?
is there a voltage mod like on the 1080 FE?
Thanks a lot !


----------



## skypine27

Quote:


> Originally Posted by *dureiken*
> 
> Hi mates,
> 
> I just bought a TXP
> 
> 
> 
> 
> 
> 
> 
> I have 2 questions :
> 
> are backplates needed and useful for watercooling with an EK block?
> is there a voltage mod like on the 1080 FE?
> Thanks a lot !


EK backplates aren't needed but look nicer than the factory ones. You can actually use the factory backplate just fine even though EK officially claims it's not compatible with their block. The left "half" of the backplate (the factory backplate is divided into 2 sections) can be secured with the two screw holes on the far "left" of the card (the side by the outputs). The right half of the backplate can't be secured using any of the stock screws, and for this reason EK declared it officially incompatible. However, if you mount your card the normal "lie flat" way, you can just drop the backplate on top of the card and it stays just fine. If you are mounting your card vertically, a small piece of double-sided hobby tape secures that half of the backplate just fine.

EK claims their backplate provides a bit of passive cooling (since it's metal, and I'm guessing EK provides some thermal pads with it). The factory backplates are ABS plastic, I believe, and don't provide any cooling (and may actually make the top of the card a tiny bit hotter by trapping heat beneath it).

Hope this helps

EDIT:

Here is a quick pic of my water-cooled TXP's using the factory back plates, you can see they "sit" just fine:


----------



## dureiken

Thanks for your answer. I think I will run without a backplate, as the passive VRM cooling seems to be mostly marketing, no?


----------



## KillerBee33

Quote:


> Originally Posted by *Asmodian*
> 
> I did try it, at +190+600 the power limit seems about the same, maybe it de-clocks one step (e.g. 2025 instead of 2037), but benchmark scores are always better.
> 
> 
> 
> 
> 
> 
> 
> 
> Firestrike Graphics score at 32999
> Timespy graphics score at 10858
> 
> Thanks for the advice.


32.9 in FS
10.5 in TS
It's doing well


----------



## cisco0623

Quote:


> Originally Posted by *Stateless*
> 
> he fdfd
> Valley and 3dmark Fire Strike and gaming (Witcher 3, Crysis 3, The Division).


Nice. I'm at 2063 and it drops to 2050-2037 as well. Still good stuff! These cards demand a custom BIOS. Hopefully NVIDIA lets us do it!


----------



## kx11

some stuff got here , the rest in 2 days hopefully


----------



## CRITTY

Two is better than one. If you have two; you can always use one.


----------



## Yeshi

Quote:


> Originally Posted by *skypine27*
> 
> Here is a quick pic of my water-cooled TXP's using the factory back plates, you can see they "sit" just fine:


That is a brutally beautiful build. I am not ashamed to admit I am somewhat envious!


----------



## jcde7ago

People with the HB SLI bridge from Nvidia + EK waterblocks: did you cut off the bridge's tips as well to get it to fit, or resort to the EVGA HB bridge?

Been gaming all weekend so far and have had a flawless SLI experience in The Witcher 3...GPU-Z is reporting high-bandwidth SLI config. + 499.6 GB/s as well. Bridge LED works and the tips I Dremel'd off looked like it was just empty PCB anyways. Curious to know what others did, as I don't want to drop another $30+ on another HB SLI bridge if I don't absolutely have to.


----------



## Ghostface

Just finished the EK water cooling setup for a new Titan X Pascal build. Amazed at the temperatures. Was running 90 degrees on the stock cooler with the awful fan noise drowning everything out. It now hits 36 degrees under full load. Couldn't be happier with it!

I say it's finished, however I have ordered the backplate, just to protect the back of the card more than anything so will install that when it arrives.

I have got a 200mhz boost on core and 500mhz boost on memory, which sits nicely around the 120% power limit and keeps the clocks around 2070-2090mhz on boost.


----------



## MikeSanders

A new nvflash version is out. Can someone check whether it supports the TX Pascal?
https://www.techpowerup.com/downloads/2786/nvflash-5-319-0-for-windows


----------



## Baasha

This game at 8K (on a 5K monitor) looks phenomenal!


----------



## Jpmboy

Quote:


> Originally Posted by *MikeSanders*
> 
> New nvflash version is out. Can someone check if it does support the tx pascal?
> https://www.techpowerup.com/downloads/2786/nvflash-5-319-0-for-windows


It will flash and save the on-board BIOS... you will need to disable the driver in Device Manager for this version to work. Now all we need is a Pascal BIOS TWEAKER!!!
Thx +1


----------



## pompss

Here's mine, just finished adding the Darkside LED and the new Titan X.


----------



## x3sphere

Quote:


> Originally Posted by *Baasha*
> 
> This game at 8K (on a 5K monitor) looks phenomenal!


Nice. GPU usage seems to be jumping around a lot with 4-Way though. I would be curious to see a comparison against 2-Way at max OC.


----------



## KillerBee33

Quote:


> Originally Posted by *pompss*
> 
> here its mine just finished adding darkside led and new titan X


This is pure PC porn







Eye candy, may I add


----------



## Stateless

Quote:


> Originally Posted by *skypine27*
> 
> EK backplates aren't needed but look nicer than the factory ones. You can actually use the factory backplate just fine even though EK officially claims it's not compatible with their block. The left "half" of the backplate (the factory backplate is divided into 2 sections) can be secured with the two screw holes on the far "left" of the card (the side by the outputs). The right half of the backplate can't be secured using any of the stock screws, and for this reason EK declared it officially incompatible. However, if you mount your card the normal "lie flat" way, you can just drop the backplate on top of the card and it stays just fine. If you are mounting your card vertically, a small piece of double-sided hobby tape secures that half of the backplate just fine.
> 
> EK claims their backplate provides a bit of passive cooling (since it's metal, and I'm guessing EK provides some thermal pads with it). The factory backplates are ABS plastic, I believe, and don't provide any cooling (and may actually make the top of the card a tiny bit hotter by trapping heat beneath it).
> 
> Hope this helps
> 
> EDIT:
> 
> Here is a quick pic of my water-cooled TXP's using the factory back plates, you can see they "sit" just fine:


That looks great. I have a few questions about your loop that I hope you can answer.

On my loop, the water goes from the res to 3 different rads, then into the CPU, and then to the GPUs. I know the water is getting warmed since it hits my CPU before entering the GPUs. On your build, from what I can tell, the water warmed by the CPU goes into that small exhaust rad and then into the GPU. Are you finding that the small rad cools it enough to make a difference? Pascals seem to run a bit warmer in my loop than my Maxwells did. I am hitting a high of about 58°C (after hours of gaming and benchmarking); under normal use I probably hit about 52-54°C on my top card. I am either thinking of adding that same small rad, or debating putting my GPUs on their own loop since I have an extra pump and radiator from an older build.


----------



## MrTOOSHORT

Used the newest nvflash, here is the stock TITAN XP bios:

TITANXP.zip 147k .zip file


----------



## jcde7ago

Quote:


> Originally Posted by *Stateless*
> 
> That looks great. I have a few questions about your loop that I hope you can answer.
> 
> On my loop, the water goes from the res to 3 different rads, then into the CPU, and then to the GPUs. I know the water is getting warmed since it hits my CPU before entering the GPUs. On your build, from what I can tell, the water warmed by the CPU goes into that small exhaust rad and then into the GPU. Are you finding that the small rad cools it enough to make a difference? Pascals seem to run a bit warmer in my loop than my Maxwells did. I am hitting a high of about 58°C (after hours of gaming and benchmarking); under normal use I probably hit about 52-54°C on my top card. I am either thinking of adding that same small rad, or debating putting my GPUs on their own loop since I have an extra pump and radiator from an older build.


Honestly, loop order shouldn't really matter if you have enough rads and enough pump flow...58c seems pretty high. Are you running a single TXP, or two?

I'm running dual TXPs and a 5960x @ 4.5 Ghz @ 1.32v, and my loop order is this:

dual pump + res > GPU 1 + 2 > RAM > CPU > RAM > 480 rad top > 480 rad bottom > 240 rad bottom > 240 rad front > back to dual pump + res. That's 3x 480 worth of rads, one for each GPU and one for the CPU, basically (the RAM block is pretty inconsequential and is there for aesthetics)...



Was a pretty hot day today, too, and my max temps were 45°C for the top TXP and 44°C for the bottom TXP; the 5960X's max was 59°C.
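The point that loop order shouldn't really matter can be sanity-checked with a quick heat-balance estimate: the water only warms a couple of degrees across the entire loop at typical flow rates. A minimal Python sketch (the 600 W load and 1 GPM flow are illustrative assumptions, not measurements from this build):

```python
# Coolant temperature rise across a heat load: delta_T = P / (mass_flow * c_p).
CP_WATER = 4.18  # J/(g*K), specific heat of water

def coolant_delta_t(heat_watts, flow_gpm):
    """Temperature rise (in C) of water absorbing `heat_watts` at `flow_gpm`."""
    grams_per_s = flow_gpm * 3785.0 / 60.0  # 1 US gallon of water ~ 3785 g
    return heat_watts / (grams_per_s * CP_WATER)

# A CPU plus two Titan XPs (~600 W total) at 1 GPM warms the water by only
# ~2.3 C, so the last block in the loop sees water barely warmer than the first.
print(f"{coolant_delta_t(600, 1.0):.1f} C")
```

In other words, block order changes component temperatures by at most a degree or two; total radiator area and block contact matter far more.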


----------



## MunneY

Quote:


> Originally Posted by *Jpmboy*
> 
> will flash and save the on-board bios... you will need to disable the driver in device manager for this version to work. Now all we need is Pascal bios TWEAKER !!!
> Thx +1


Thank you 8lb 6oz baby jesus.

Now where is @OccamRazor when you need him :-D


----------



## pompss

Great News
Can't wait to see some modded BIOS to fix the power limit issue without the CLU mod


----------



## Stateless

Quote:


> Originally Posted by *jcde7ago*
> 
> Honestly, loop order shouldn't really matter if you have enough rads and enough pump flow...58c seems pretty high. Are you running a single TXP, or two?
> 
> I'm running dual TXPs and a 5960x @ 4.5 Ghz @ 1.32v, and my loop order is this:
> 
> dual pump + res > GPU 1 + 2 > RAM > CPU > RAM > 480 rad top > 480 rad bottom > 240 rad bottom > 240 rad front > back to dual pump + res. That's 480 x3 worth of rads, one for each GPU and one for the CPU, basically (the RAM block is pretty inconsequential and is there for aesthetics)...
> 
> 
> 
> Was a pretty hot day today, too, and my max temps are at 45c for my top TXP and 44c for the bottom TXP; the 5960x's max temp was 59c.


Thanks. I have a 3930k OC'd to 4.7 and 2 Titan XPs in my loop. I have a dual-pump system, but am only using 3 360 RADs. I think I might have an issue with my top card, however. It usually runs 10-15c higher than the bottom card. My bottom card has never broken higher than 44c no matter what, and the highest my top card has hit was 58c, both running +200 core and +200 memory with no extra voltage. So it could be that when I installed the block it did not get good contact or a good thermal paste application. I had the same CPU and 2 Titan Maxwells, and they never hit higher than 46c on either card.

I wonder if I should add another RAD or change 2 of my 360s to 480s. I was also thinking of putting the GPUs on their own loop since I have an extra pump/res lying around. I am not sure what to do. I have a vacation coming up, but my wife is working, so I am free and have time to make changes; I'm just not sure what changes I should make.


----------



## jcde7ago

Quote:


> Originally Posted by *Stateless*
> 
> Thanks. I have a 3930k OC'd to 4.7 and 2 Titan XPs in my loop. I have a dual-pump system, but am only using 3 360 RADs. I think I might have an issue with my top card, however. It usually runs 10-15c higher than the bottom card. My bottom card has never broken higher than 44c no matter what, and the highest my top card has hit was 58c, both running +200 core and +200 memory with no extra voltage. So it could be that when I installed the block it did not get good contact or a good thermal paste application. I had the same CPU and 2 Titan Maxwells, and they never hit higher than 46c on either card.
> 
> I wonder if I should add another RAD or change 2 of my 360s to 480s. I was also thinking of putting the GPUs on their own loop since I have an extra pump/res lying around. I am not sure what to do. I have a vacation coming up, but my wife is working, so I am free and have time to make changes; I'm just not sure what changes I should make.


If you have the time I would definitely drain the loop and remove + reattach the block on your top card one more time...you can even swap blocks with your other card as well and see if the symptoms carry over.

3x 360 rads should be plenty for your setup. If the top card is STILL having issues, I'd probably replace the problematic waterblock outright or, heck, even RMA that GPU first before getting bigger rads (I know how bad that sounds)...there shouldn't be a discrepancy of more than 2-3c at most between the top and bottom cards. If you've reattached/swapped the blocks between cards and the issue persists, then either you have a bad block that makes really poor contact with your cards, or something's up with that particular GPU.


----------



## Jpmboy

Quote:


> Originally Posted by *MunneY*
> 
> Thank you 8lb 6oz baby jesus.
> 
> Now where is @OccamRazor when you need him :-D


lol - @skyn3t pinged the Valley Heaven thread.









@MikeSanders posted the nvflash DL link.


----------



## MunneY

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - @skyn3t pinged the Valley Heaven thread.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @MikeSanders posted the nvflash DL link.


oh boy!

He updated EZFlash, so maybe, just maybe, we can get him to play with these BIOSes and get us some real freaking power


----------



## Stateless

Quote:


> Originally Posted by *jcde7ago*
> 
> If you have the time I would definitely drain the loop and remove + reattach the block on your top card one more time...you can even swap blocks with your other card as well and see if the symptoms carry over.
> 
> 3x 360 rads should be plenty for your setup. If the top card is STILL having issues, I'd probably replace the problematic waterblock outright or, heck, even RMA that GPU first before getting bigger rads (I know how bad that sounds)...there shouldn't be a discrepancy of more than 2-3c at most between the top and bottom cards. If you've reattached/swapped the blocks between cards and the issue persists, then either you have a bad block that makes really poor contact with your cards, or something's up with that particular GPU.


My first plan is to reseat the waterblock on the top card. At idle, the top card is usually 2-3c cooler than the bottom card, and they usually stay within about 1c of each other until it hits around 44c or so; then the top card continues to climb in temp while the bottom card pretty much maintains its 44c or so. I am thinking it has to be something with how I applied the thermal paste, or the block just seated poorly. I did test the 2 cards on air, and at the same clock they pretty much stayed at the same temp when I tested each individually, so I don't think it is an issue with the card itself.


----------



## jcde7ago

Quote:


> Originally Posted by *Stateless*
> 
> My first plan is to reseat the waterblock on the top card. At idle, the top card is usually 2-3c cooler than the bottom card, and they usually stay within about 1c of each other until it hits around 44c or so; then the top card continues to climb in temp while the bottom card pretty much maintains its 44c or so. I am thinking it has to be something with how I applied the thermal paste, or the block just seated poorly. I did test the 2 cards on air, and at the same clock they pretty much stayed at the same temp when I tested each individually, so I don't think it is an issue with the card itself.


Sounds like a plan - report back and let us know how it goes!


----------



## markklok

Question for those who watercool their Titan...

My idle temperature is 26 Celsius (room temp is also 26).
When I run Heaven, the temperature will go up from 26 to 32 and then slowly climb towards its stable point.


Is this 6-degree leap normal, or do I have to repaste my EK block?


----------



## jcde7ago

Quote:


> Originally Posted by *markklok*
> 
> Question for those who watercool their Titan...
> 
> My idle temperature is 26 Celsius (room temp is also 26).
> When I run Heaven, the temperature will go up from 26 to 32 and then slowly climb towards its stable point.
> 
> 
> Is this 6-degree leap normal, or do I have to repaste my EK block?


Heaven is supposed to be an extremely taxing benchmark and high temps are to be expected (expect even higher temps if you're benchmarking with Fire Strike). Your GPU usage is going to be topped out while running Heaven, and that's normal. How are your temps while gaming?


----------



## jodasanchezz

Holy....is there now a working version of nvflash for the Titans (P)?


----------



## Nunzi

OccamRazor & skyn3t did amazing work on the OG Titan; hope they can do something with the TXP


----------



## willverduzco

Quote:


> Originally Posted by *pompss*
> 
> Great News
> Can't wait to see some modded BIOS to fix the power limit issue without the CLU mod


+1 to that. Let's hope someone is able to tackle the whole signature verification issue on modified Pascal BIOSes (I think that's still missing, right?) and also create a BIOS editor--or cobble together a modified BIOS with a raised power limit.


----------



## markklok

Quote:


> Originally Posted by *jcde7ago*
> 
> Heaven is supposed to be an extremely taxing benchmark and high temps are to be expected (expect even higher temps if you're benchmarking with Fire Strike). Your GPU usage is going to be topped out while running Heaven, and that's normal. How are your temps while gaming?


It depends on whether v-sync is on or off (I'm on a 60Hz screen).

Usually I cap my frame rate at 58-60 so I don't have any tearing.
Temperature in games: 38 to 40 Celsius.

Heaven, 60 min = 43 Celsius.


----------



## jodasanchezz

Hello, I need some advice;
I want to go under water with the Titan.

I was testing Fallout 4 (maxed @ 4K) at stock settings...the Titan clocks down to 1418MHz @ 84°C @ ~50% fan speed. Is this heavy throttling normal? Below the advertised boost clock?

Should I return the Titan, or is this just because of the heavy load?
(Normally I'm running the fan @ 100%)


----------



## Baasha

Are the backplates causing the cards to run hotter? How easy/difficult is it to remove the backplates?

My GPU sammich is making the top two cards hit 90C - of course, that's probably because all 4 cards are at 95 - 99% usage!









BF4 in 8K w/ everything on Ultra (Zavod Graveyard Shift) - here's an 8K screenshot (native but reduced quality to be able to upload to Imgur):


----------



## jcde7ago

Quote:


> Originally Posted by *markklok*
> 
> It depends on whether v-sync is on or off (I'm on a 60Hz screen).
> 
> Usually I cap my frame rate at 58-60 so I don't have any tearing.
> Temperature in games: 38 to 40 Celsius.
> 
> Heaven, 60 min = 43 Celsius.


That sounds perfectly normal.








Quote:


> Originally Posted by *Baasha*
> 
> Are the backplates causing the cards to run hotter? How easy/difficult is it to remove the backplates?
> 
> My GPU sammich is making the top two cards hit 90C - of course, that's probably because all 4 cards are at 95 - 99% usage!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BF4 in 8K w/ everything on Ultra (Zavod Graveyard Shift) - here's an 8K screenshot (native but reduced quality to be able to upload to Imgur):


Yeah - the plastic stock backplates are going to raise temps for you inherently, due to even more restricted airflow from less space between the cards...and naturally, hot air rises, of course.









They're purely aesthetic and likely keep more heat in between themselves and the PCB. Personally I'd remove them if I were on air.


----------



## CallsignVega

The back plates aren't plastic.


----------



## KillerBee33

Quote:


> Originally Posted by *CallsignVega*
> 
> The back plates aren't plastic.


It's a plastic backplate on top of an aluminum backplate








Or it might even be ceramic...


----------



## CallsignVega

The back-plate is aluminum with a thin clear plastic sheet so that the back plate doesn't contact any electrical components besides where the cutouts are.

I've removed my back plates.


----------



## jcde7ago

Quote:


> Originally Posted by *CallsignVega*
> 
> The back plates aren't plastic.


Hmm, they certainly feel plastic-y, with a thin film of...something else on the bottom side where it meets the PCB (I just took mine out to look/feel again).

Either way, my point stands that they're on the stock cards for aesthetics, not for passive cooling...and in a 4-way SLI setup like Baasha's, they are going to needlessly raise temps, even if only by the most minute amounts, because he's further restricting airflow and keeping hot air in between the cards more than he would if he took them off. If he has an IR thermometer he could easily measure this.

EDIT: Didn't see your latest comment; yeah, with that thin film of plastic that's there to keep the components from touching, I'd definitely remove the backplates due to heat concerns...that film will do no good for temps, even if it helps prevent component contact.


----------



## Jquala

Just finished my build! 14k on FSU w/6700k +225/500


----------



## azzazel99

How is he using 4-way SLI when the beta has no SLI support? My cards will not run in SLI; only 1 card gets used, and if I force alternate frame rendering I get like 20 fps


----------



## SlammiN

Still having issues with my card. One screen will go fuzzy, so I will try to fix it; then the other will flicker like mad, all one color.

Next thing, all the ports won't work when unplugging and plugging back in; the only fix is a reboot of the system.

Worried, before I fit my EK block, that there is an actual hardware fault.


----------



## willverduzco

Quote:


> Originally Posted by *SlammiN*
> 
> Still having issues with my card. One screen will go fuzzy, so I will try to fix it; then the other will flicker like mad, all one color.
> 
> Next thing, all the ports won't work when unplugging and plugging back in; the only fix is a reboot of the system.
> 
> Worried, before I fit my EK block, that there is an actual hardware fault.


That definitely does not sound like a software issue, as I've never heard anyone else complain about anything similar with recent drivers. Are you using DP? If so, it COULD be a poor DP handshake between one of your screens and the GPU. Those are notoriously prone to issues with poor-quality cables.

I recently had something like that happen to me on my Dell P2715Q 27" 4K monitor (random flickers and picture dropouts for a second or two, once or twice a week). This was due to the supplied (and poorly shielded) DP cable, back when I was running 2x 980 Ti SLI. I have since upgraded to a better cable, and I've never had any issues on that setup or my 2x 290X, 1x 1080, or 1x Titan XP configs. Then again, the fuzzy screen while the other is all one color sounds like an actual hardware problem with the card.

Personally, I'd try a different set of cables first. If that doesn't fix it, do that RMA before voiding your warranty--not like they'd ever know, but still.


----------



## axiumone

Another user figured out how to use 4 way sli with pascal. Actually, this was a month ago now. So it seems it's just messing around with custom bits and profiles. Nothing special. Just need the display technology to catch up now to make it worthwhile.

https://forums.geforce.com/default/topic/961669/geforce-1000-series/fully-functional-4-way-sli-titan-x-pascal/


----------



## jcde7ago

Quote:


> Originally Posted by *axiumone*
> 
> Another user figured out how to use 4 way sli with pascal. Actually, this was a month ago now. So it seems it's just messing around with custom bits and profiles. Nothing special. Just need the display technology to catch up now to make it worthwhile.
> 
> https://forums.geforce.com/default/topic/961669/geforce-1000-series/fully-functional-4-way-sli-titan-x-pascal/


Yep - 'venturi' was the first one to get 4-way SLI working on Titan X Pascals (or any 10 Series GPUs, for that matter, I believe). Here is his original thread:

http://forums.guru3d.com/showthread.php?s=8870351f9f44e2dfa938b839a2f232e0&t=409468


----------



## 295033

Hello all, I have a very strange problem with my new build centered around a Titan X. I'm getting a 1-frame stutter once per second in all games when A) playing in fullscreen and B) playing on my Samsung JS9500's game mode. If I'm playing on my monitor or on any other mode on my TV or if I'm playing in windowed mode, there's no stutter.

Here are the things I've tried to fix the issue:

1. Different HDMI cable
2. Hooking a different display up to the same HDMI port (result was no stutter, so I ruled out a faulty port)
3. Resetting BIOS to factory default settings
4. Updating my TV's firmware
5. Reset TV settings
6. Bypassing my A/V receiver and hooking up straight to the TV
7. Reinstalling/rolling back drivers
8. Unhooking all peripherals, including my monitor
9. And finally, a fresh install of Win10 Pro x64 with only the necessary drivers, Steam, and a few games.

This didn't start until after I put the system on water. I had it running on air for about a week prior and there was no stutter. At least none that I noticed, and I'm pretty sensitive to dropped frames. Part of me wants to say it's the TV that's faulty since gaming is smooth on other displays/TV modes, but the other part of me thinks I screwed something up when installing my loop since it was working fine before.

I contacted Samsung today and they're going to send me a new One Connect box (the box that does the connections as well as processing), but something tells me that's not going to work.

Specs:
Maximus Gene VIII
6700k
Titan X Pascal
32gb 3.2ghz DDR4
500gb Samsung 950 Pro
Corsair RM850i
EK pump/res + blocks.

Any help or advice would be greatly appreciated. This is giving me some serious anxiety.


----------



## cookiesowns

Quote:


> Originally Posted by *Iorek*
> 
> Hello all, I have a very strange problem with my new build centered around a Titan X. I'm getting a 1-frame stutter once per second in all games when A) playing in fullscreen and B) playing on my Samsung JS9500's game mode. If I'm playing on my monitor or on any other mode on my TV or if I'm playing in windowed mode, there's no stutter.
> 
> Here are the things I've tried to fix the issue:
> 
> 1. Different HDMI cable
> 2. Hooking a different display up to the same HDMI port (result was no stutter, so I ruled out a faulty port)
> 3. Resetting BIOS to factory default settings
> 4. Updating my TV's firmware
> 5. Reset TV settings
> 6. Bypassing my A/V receiver and hooking up straight to the TV
> 7. Reinstalling/rolling back drivers
> 8. Unhooking all peripherals, including my monitor
> 9. And finally, a fresh install of Win10 Pro x64 with only the necessary drivers, Steam, and a few games.
> 
> This didn't start until after I put the system on water. I had it running on air for about a week prior and there was no stutter. At least none that I noticed, and I'm pretty sensitive to dropped frames. Part of me wants to say it's the TV that's faulty since gaming is smooth on other displays/TV modes, but the other part of me thinks I screwed something up when installing my loop since it was working fine before.
> 
> I contacted Samsung today and they're going to send me a new One Connect box (the box that does the connections as well as processing), but something tells me that's not going to work.
> 
> Specs:
> Maximus Gene VIII
> 6700k
> Titan X Pascal
> 32gb 3.2ghz DDR4
> 500gb Samsung 950 Pro
> Corsair RM850i
> EK pump/res + blocks.
> 
> Any help or advice would be greatly appreciated. This is giving me some serious anxiety.


Have you tried monitoring the card with GPU-Z logging to see if maybe the clocks are fluctuating, causing frame stutters?
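A quick script can scan the log for dips instead of eyeballing it. This is only a sketch: GPU-Z sensor logs are comma-separated, but the column names below (`Date`, `GPU Core Clock [MHz]`) are assumptions you should match against the header line of your own log.

```python
import csv
import io

def find_clock_dips(log_text, clock_col="GPU Core Clock [MHz]", drop_mhz=100.0):
    """Return (timestamp, clock) pairs where the core clock sits more than
    drop_mhz below the highest clock seen anywhere in the log."""
    reader = csv.DictReader(io.StringIO(log_text), skipinitialspace=True)
    rows = []
    for raw in reader:
        # GPU-Z pads fields with spaces, so normalize keys and values.
        rows.append({k.strip(): v.strip() for k, v in raw.items() if k})
    clocks = [float(r[clock_col]) for r in rows]
    peak = max(clocks)
    return [(r["Date"], c) for r, c in zip(rows, clocks) if peak - c > drop_mhz]
```

If stutter really is clock-related, the dips should line up once per second with the hitches you're seeing.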


----------



## 295033

Quote:


> Originally Posted by *cookiesowns*
> 
> Have you tried monitoring the card with GPU-Z logging to see if maybe the clocks are fluctuating, causing frame stutters?


I gave it a try, but nothing seems out of the ordinary? I attached it so you can take a look... Just a few seconds in Dark Souls III. The dip at the end is probably me alt-tabbing.

GPU-ZSensorLog.txt 12k .txt file


----------



## azzazel99

I had this issue, and I moved my HDMI input from HDMI 1 on my TV to HDMI 2 and it fixed it. I was going mad trying everything; that was the last thing I tried, and it worked.


----------



## axiumone

Quote:


> Originally Posted by *jcde7ago*
> 
> Yep - 'venturi' was the first one to get 4-way SLI working on Titan X Pascals (or any 10 Series GPUs, for that matter, I believe). Here is his original thread:
> 
> http://forums.guru3d.com/showthread.php?s=8870351f9f44e2dfa938b839a2f232e0&t=409468


Oh that's so great! That thread really explains all of the details to get 4 way sli working on pascal. So our certain resident just stole all of the thunder without any of the credit. Figures.


----------



## Jpmboy

Baasha has been running 4-way since launch, and in the last few weeks getting all cards to scale well.








So much for the "SLI don't work" crowd.


----------



## axiumone

Quote:


> Originally Posted by *Jpmboy*
> 
> Baasha has been running 4-way since launch, and in the last few weeks getting all cards to scale well.
> 
> 
> 
> 
> 
> 
> 
> 
> So much for the "SLI don't work" crowd.


Benchmarking, not gaming. He literally asks for help in that thread jcde posted, from the user who actually had it working in games already. Anyway, it doesn't really matter. It just reaffirms what I initially believed: not a team player. The enthusiast internet is a very small place. It's easier to be honest.


----------



## Fredthehound

Hi all,

Got a question that would normally be better asked elsewhere, but since TXPs aren't mainstream, this is probably my best hope. Bear with me, as this isn't exactly a normal scenario. Currently my rig is a TXP on air @ +225/0, a Z87/4790K on water clocked to 4.7GHz, 16 gigs of 2400MHz TridentX RAM, and a pair of HyperX 240 gig SSDs in RAID 0 as my game drive. It's all stable and works well overall.

I play a LOT of Skyrim/Fallout in Vive/VorpX and with the Skyrim remaster coming out next month (basically the F4 engine) I am looking to maximize performance. These are the only games I play or even care about so the system is being tailored to them exclusively. Some guys buy $500 golf clubs, I play modded Skyrim and Fallout in VR. Normal it ain't but it makes me happy.

What I need to learn is: since Skyrim remastered and F4 are basically the same game, would I be better served with an X99 system and 6-8 cores, or a second TXP? The load VorpX puts on games in geometry mode is significant and has been likened to 4K. I play at 1920x1444 res on the Vive with 2x supersampling. I can hold 60FPS in most areas now, but draw calls in areas like Markarth are a killer. So I am thinking about an X99 clocked up and using Process Lasso to assign 4 cores to each (game and VorpX), or a 2/4 split depending on a 6- or 8-core CPU.

I have a question in with Ralf (the dev) on the VorpX forum on whether VorpX is more CPU- or GPU-hungry, but I haven't heard back yet.

Can anyone give me any guidance on these things? Am I headed in the right direction or off a cliff?

Any insight or a swift kick to the head appreciated ;)


----------



## 295033

Quote:


> Originally Posted by *azzazel99*
> 
> I had this issue and i moved my hdmi input from hdmi 1 on my tv to hdmi 2 and it fixed it. I was going mad trying everything and that was the last thing i tried and it worked.


Unfortunately I tried all 4 ports and got the same result. If only it were that easy...


----------



## jodasanchezz

Please can someone tell me if this is normal?

Heavy downclocking to near base clock at stock settings on air?









Should I return the card to get another one?

Fallout4

GTA 5

Witcher 3


----------



## jcde7ago

Quote:


> Originally Posted by *jodasanchezz*
> 
> Please can someone tell me if this is normal?
> 
> Heavy downclocking to near base clock at stock settings on air?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Should I return the card to get another one?
> 
> Fallout4
> 
> GTA 5
> 
> Witcher 3


I'm not on air, so I can't give you any definitive help other than to say that you can expect some severe throttling given that your card is reaching 84c-88c according to your pics...but can you force 100% fan speed, play the same games, and report back what your usage looks like? I can't imagine that this is anything more than thermal throttling, assuming your hardware is fine.


----------



## chronicfx

@jcde7ago That seems to be a severe case. There is definitely some compromised cooling going on. Your best bet would be to contact support; maybe they will just offer to replace it


----------



## scgeek12

I accidentally ordered a 2-slot spacing EVGA HB SLI bridge and I need 1-slot spacing *facepalm*. Anyone need a 2-slot spacing one? If you can send me a pic in a message showing you can use it, I'll send this SLI bridge to you free


----------



## jcde7ago

Quote:


> Originally Posted by *chronicfx*
> 
> @jcde7ago That seems to be a severe case. There is definitely some compromised cooling going on. Your best bet would be to contact support; maybe they will just offer to replace it


Pretty sure you meant to respond to @jodasanchezz ...


----------



## SlammiN

Quote:


> Originally Posted by *willverduzco*
> 
> That definitely does not sound like a software issue, as I've never heard anyone else complain about anything similar with recent drivers. Are you using DP? If so, it COULD be a poor DP handshake between one of your screens and the GPU. Those are notoriously prone to issues with poor-quality cables.
> 
> I recently had something like that happen to me on my Dell P2715Q 27" 4K monitor (random flickers and picture dropouts for a second or two, once or twice a week). This was due to the supplied (and poorly shielded) DP cable, back when I was running 2x 980 Ti SLI. I have since upgraded to a better cable, and I've never had any issues on that setup or my 2x 290X, 1x 1080, or 1x Titan XP configs. Then again, the fuzzy screen while the other is all one color sounds like an actual hardware problem with the card.
> 
> Personally, I'd try a different set of cables first. If that doesn't fix it, do that RMA before voiding your warranty--not like they'd ever know, but still.


Thanks, I think I may have to!

There have been times when the screens just didn't come on, way before Windows loads drivers; they didn't even register. It's been like this since day 1.

Do you know if Nvidia cross-ships? Otherwise I imagine I'm screwed waiting.


----------



## jcde7ago

Quote:


> Originally Posted by *scgeek12*
> 
> I accidentally ordered a 2-slot spacing EVGA HB SLI bridge and I need 1-slot spacing *facepalm*. Anyone need a 2-slot spacing one? If you can send me a pic in a message showing you can use it, I'll send this SLI bridge to you free


That's super kind of you, but honestly, if you contact EVGA they will totally swap that SLI bridge out for the correct one you need without any hassle.


----------



## pez

Quote:


> Originally Posted by *CallsignVega*
> 
> The back-plate is aluminum with a thin clear plastic sheet so that the back plate doesn't contact any electrical components besides where the cutouts are.
> 
> I've removed my back plates.


I didn't know this and assumed they were definitely just plastic. That's nice to know.







Quote:


> Originally Posted by *Iorek*
> 
> Unfortunately I tried all 4 ports and got the same result. If only it were that easy...


Sounds like you've tried it all, but have you tried putting your TV into a 'PC mode' if it has it or just removing all processing technologies in general in the TV settings?


----------



## jodasanchezz

Quote:


> Originally Posted by *jcde7ago*
> 
> I'm not on air, so I can't give you any definitive help other than to say that you can expect some severe throttling given that your card is reaching 84c-88c according to your pics...but can you force 100% fan speed, play the same games, and report back what your usage looks like? I can't imagine that this is anything more than thermal throttling, assuming your hardware is fine.


I can't send you any pics right now (I'm at work), but usually when I force the fan to 100% (BF1) the Titan stays @ ~1750mhz [email protected]; that is not impressive IMO.

Also, the driver crashes if I try to OC +200 on the core (Fire Strike).

I'm able to return the card by law in Germany; I just want to figure out, before I put a waterblock on it, whether it's worth it or they should send me a new one.

Sorry for the bad English.


----------



## jodasanchezz

Quote:


> Originally Posted by *chronicfx*
> 
> @jcde7agoThat seems to be a severe case. There is definitely some compromised cooling going on. Your best bet would be to contact support and maybe they will just offer to replace it


I wrote Nvidia support; I'll let you know what they say about it.

I normally wouldn't use the Titan at stock settings; this was just for testing.
I think if the GPU isn't able to perform well at factory settings, it won't be a good OC card either...


----------



## 295033

Quote:


> Originally Posted by *pez*
> 
> Sounds like you've tried it all, but have you tried putting your TV into a 'PC mode' if it has it or just removing all processing technologies in general in the TV settings?


Yes, I've tried PC mode on the TV, and like all the other modes there's no stutter. Game mode is the only one that has the problem, and only in fullscreen...if I run a game in a window in Game mode, it's fine. Unfortunately, PC mode comes in at about 56ms input lag compared to Game mode's 23ms, so I'd really rather not use it. Really though, what the heck could possibly be going on in Game mode that causes something like this? It has to be a problem with the TV, right??


----------



## pez

Quote:


> Originally Posted by *Iorek*
> 
> Yes, I've tried PC mode on the TV, and like all the other modes there's no stutter. Game mode is the only one that has the problem, and only in fullscreen...if I run a game in a window in Game mode, it's fine. Unfortunately, PC mode comes in at about 56ms input lag compared to Game mode's 23ms, so I'd really rather not use it. Really though, what the heck could possibly be going on in Game mode that causes something like this? It has to be a problem with the TV, right??


Hmm, that's slight progress, but like you, I am wondering the same thing. What is your TV's exact model?


----------



## jcde7ago

Quote:


> Originally Posted by *Iorek*
> 
> Yes, I've tried PC mode on the TV, and like all the other modes there's no stutter. Game mode is the only one that has the problem, and only in fullscreen...if I run a game in a window in Game mode, it's fine. Unfortunately, PC mode comes in at about 56ms input lag compared to Game mode's 23ms, so I'd really rather not use it. Really though, what the heck could possibly be going on in Game mode that causes something like this? *It has to be a problem with the TV, right*??


I'm inclined to say that yes, it's likely your TV and not your Titan XP that's the issue based on your original comment:
Quote:


> Originally Posted by *Iorek*
> 
> If I'm playing on my monitor or on any other mode on my TV or if I'm playing in windowed mode, there's no stutter.


If playing on a monitor is 100% fine in fullscreen, windowed, etc., then your TXP is working as expected, I'd say...and it working just fine in every other mode on your TV but "Game Mode" is further proof that it's something to do with the combination of that TV mode and your GPU, nothing else.

Unfortunately, it's probably hard to find similar circumstances with your exact TV model and a Titan XP exhibiting the same behavior, as it's not likely that many people are opting for Titan XPs as a primary GPU option, let alone pairing one with a TV.


----------



## 295033

Quote:


> Originally Posted by *pez*
> 
> Hmm, that's slight progress, but I like you am wondering the same thing. What is your TV's model to be exact?


Samsung JS9500.

One other quirk I just discovered...as I mentioned before, running a game in a window is fine even in the TV's Game mode, but apparently that's only the case at 1080p. If I set my TV to 4K and run a 4K window, the stutter returns. That makes it seem like a bandwidth problem, but again, why is it OK in other modes at 4K?
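For what it's worth, the raw numbers do make the bandwidth hunch plausible. A rough back-of-the-envelope sketch (assuming an HDMI 2.0 link, whose 18 Gbit/s TMDS rate leaves roughly 14.4 Gbit/s of usable data after 8b/10b encoding):

```python
# Rough uncompressed-video bandwidth check, in Gbit/s.
def video_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

# 4K60 at 8-bit RGB (24 bits per pixel) sits close to the ~14.4 Gbit/s
# usable data rate of an HDMI 2.0 link; 1080p60 is nowhere near it.
print(video_gbps(3840, 2160, 60, 24))   # ~11.94 Gbit/s
print(video_gbps(1920, 1080, 60, 24))   # ~2.99 Gbit/s
```

So a mode that handles the signal slightly differently at 4K60 (chroma subsampling on or off, for example) has far less headroom than at 1080p, which would fit the symptom of the stutter only appearing at 4K.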


----------



## scgeek12

Quote:


> Originally Posted by *jcde7ago*
> 
> That's super kind of you, but honestly, if you contact EVGA they will totally swap that SLI bridge out for the correct one you need without any hassle.


I bought the 2-slot one on eBay :/ and already ordered a single-slot one; it should be here soon.







Just seeing if anyone actually needs a 2-slot; if not, I'll just toss it up on eBay.


----------



## pez

Quote:


> Originally Posted by *Iorek*
> 
> Samsung JS9500.
> 
> One other quirk I just discovered...as I mentioned before, running a game in a window is fine even in the TV's Game mode, but apparently that's only the case at 1080p. If I set my TV to 4K and run a 4K window, the stutter returns. That makes it seem like a bandwidth problem, but again, why is it OK in other modes at 4K?


http://www.rtings.com/tv/reviews/samsung/js9500

The section on Motion and Inputs is what I looked at. I'm no TV expert, but I did see that it says:
Quote:


> Watching movies over a Blu-ray player in 24p has no judder, but you might see some when watching movies over a 60p or 60i source, because it can't always do the reverse 3:2 pulldown (sometimes it works, but not always). 'Auto Motion Plus' gets rid of this, but it comes with the soap opera effect.


This is talking about video playback, but I'm not sure if the 'Auto Motion Plus' setting has anything to do with it.

Maybe if someone else is more familiar with TVs + PC gaming, they can chime in.


----------



## 295033

Quote:


> Originally Posted by *pez*
> 
> http://www.rtings.com/tv/reviews/samsung/js9500
> 
> The section on Motion and Inputs is what I looked to. I'm no TV expert, but I did see that it says:
> This is talking about video playback, but I'm not sure if the 'Auto Motion Plus' setting has anything to do with it.
> 
> Maybe if someone else is more familiar with TVs + PC gaming, they can chime in.


I believe that's talking about watching 24 FPS content over a 60Hz source... like watching a Blu-ray on your computer. 60 FPS content on a 60Hz source shouldn't be an issue.


----------



## profundido

Quote:


> Originally Posted by *Stateless*
> 
> Well, playing the waiting game you can end up doing that forever, because there will be something after Volta, or the Titan Volta variant, etc. Right now, if you need the memory or you want to drive the best performance with a single card, the Titan XP is a good choice. I started with one and then got a second, and am enjoying the performance they provide at 4K. Being able to max out something like Witcher 3 at 4K with a rock-solid 60 FPS is bliss. When I say max, I mean every possible setting at its highest, even HairWorks.


Amen to that. I'm on that boat too and loving it. My PG27AQ is finally being done full justice, and the experience is so photorealistically amazing. All this eye candy really pops.









To test higher framerates I connected my first ROG Swift 144Hz monitor, but I just couldn't look at it for longer than 30 minutes anymore. I was missing so much detail (read: realism) in those high-res textures. I now realize I can't go back to 1440p, and it can only get better from here. Bring on 4K@144Hz already to raise the ceiling!


----------



## KillerBee33

Quote:


> Originally Posted by *pez*
> 
> http://www.rtings.com/tv/reviews/samsung/js9500
> 
> The section on Motion and Inputs is what I looked to. I'm no TV expert, but I did see that it says:
> This is talking about video playback, but I'm not sure if the 'Auto Motion Plus' setting has anything to do with it.
> 
> Maybe if someone else is more familiar with TVs + PC gaming, they can chime in.


Had my Toshiba 50L3400U 50-inch 1080p 60Hz Smart LED TV for a little over a year. I run most games in DSR with VSYNC always on and DSR smoothness at 28, and instead of 2160p I run 1620p, which visually makes absolutely no difference, and I get no stutter. The TV also runs off a DisplayPort-to-HDMI adapter, with the Oculus connected through HDMI.
EDIT: It does have ClearScan 120 (Effective), which I still have no idea what it does, and 24-60 FPS video is also smooth.


----------



## Lobotomite430

Anyone able to offer a possible explanation as to why my drivers crash while overclocking my Titan, but only when playing Fallout 4? My other games don't seem to be bothered by the overclock, but Fallout 4 crashes even if the overclock is very conservative. Thanks.


----------



## profundido

Quote:


> Originally Posted by *jcde7ago*
> 
> People with the HB SLI bridge from Nivida + EK waterblocks - did you guys cut off the bridge's tips as well to get it to fit, or resort to the EVGA HB bridge?
> 
> Been gaming all weekend so far and have had a flawless SLI experience in The Witcher 3...GPU-Z is reporting high-bandwidth SLI config. + 499.6 GB/s as well. Bridge LED works and the tips I Dremel'd off looked like it was just empty PCB anyways. Curious to know what others did, as I don't want to drop another $30+ on another HB SLI bridge if I don't absolutely have to.


Yes, same as you, tips off. Just enough so you can get it in, and the modified side becomes invisible behind the waterblock anyway. I really like the end result, and I got lucky with how the green LED happens to fit in my new green/carbon-black build (see my avatar picture).


----------



## jodasanchezz

Quote:


> Originally Posted by *jcde7ago*
> 
> I'm not on air so I can't give you any definitive help other than to say that you can expect some severe throttling given that your card is reaching 84c--88c according to your pics....but can you force 100% fan speed and play the same games and report back what your usage looks like? I can't imagine that this is anything more than thermal throttling assuming your hardware is fine.


So Nvidia support says that as long as the GPU stays above the base clock, everything is fine...***
€1,300 for a GPU, and the stock cooler is a big piece of crap!

I'm not sure if I should send my card back and order a new one...


----------



## Lennyx

Quote:


> Originally Posted by *jodasanchezz*
> 
> So Nvidia support says that as long as the GPU stays above the base clock, everything is fine...***
> €1,300 for a GPU, and the stock cooler is a big piece of crap!
> 
> I'm not sure if I should send my card back and order a new one...


My card crashed at +200 with the blower. On water I use +200/+750 in games and +220/+800 when benching. So far no issues other than coil whine, which is bad, and I might RMA it.


----------



## Lobotomite430

Quote:


> Originally Posted by *Lennyx*
> 
> My card crashed at +200 with the blower. On water I use +200/+750 in games and +220/+800 when benching. So far no issues other than coil whine, which is bad, and I might RMA it.


Why +800? From everything I have read on here, anything over +500 is pointless and may hurt performance or the card.


----------



## markklok

Quote:


> Originally Posted by *Lobotomite430*
> 
> Why +800? Everything I have read on here anything over +500 is pointless and may hurt performance/card.


Same here... I can bench with +727, but past +525 it's not stable anymore.


----------



## Lennyx

Quote:


> Originally Posted by *Lobotomite430*
> 
> Why +800? Everything I have read on here anything over +500 is pointless and may hurt performance/card.


From the benchmarks I did, higher memory = higher FPS and higher scores.


----------



## Jpmboy

Quote:


> Originally Posted by *axiumone*
> 
> Benchmarking, not gaming. He literally asks for help from the user in the thread jcde posted, from the user who actually had it working in games already. Anyway, doesn't really matter. Just reaffirms what I initially believed, not a team player. The enthusiast internet is a very small place. It's easier to be honest.


not placing credit... but lol, AFR2 is not new. If that's all it took then...









(not that I need 4 TXPs for _anything_ )


----------



## jodasanchezz

Quote:


> Originally Posted by *Lobotomite430*
> 
> Why +800? Everything I have read on here anything over +500 is pointless and may hurt performance/card.


My goal is to reach 2000MHz stable under water, and if I can run 2050 I'll be really happy. Do you think that's possible?
I will run the Titan in a separate loop on a 360 rad.


----------



## Lobotomite430

Quote:


> Originally Posted by *jodasanchezz*
> 
> My goal is to reach 2000MHz stable under water, and if I can run 2050 I'll be really happy. Do you think that's possible?
> I will run the Titan in a separate loop on a 360 rad.


With my EVGA 1080/1070 Hybrid kit and a +220 core my card hits 2100MHz; with +200 core it hovers around 2038-2070 according to Afterburner.


----------



## carlhil2

I still can't believe how cool these cards run under water. I am using an EK universal block and max out at +10°C over ambient at +200/+600, 24/7 in games....


----------



## habu58

Any benchmarks?


----------



## SlammiN

Pretty sure I've got a hardware fault, as sometimes displays won't even register when the PC turns on, and they cut out, go fuzzy after wake, etc.

Does NVIDIA cross ship the Titan X?


----------



## jhowell1030

Quote:


> Originally Posted by *CRITTY*
> 
> Glad to hear that.
> Poopy Deus Ex, which supposedly "supports" SLI, can't do 4k/60fps Maxed with AA off.


I only have a single TXP and have tried adjusting multiple settings to Med/High, with reflections off and on, lighting effects off, on, low, high... it seems to make no difference.

At 3440x1440 I get 55-60 FPS regardless of the options.

Does this make sense to anyone else?


----------



## Kyouki

I haven't done any real OC or bench testing on the card, but this weekend I played hours upon hours of Battlefield 1 at 2560x1440 at 60Hz with the card at +200MHz core / +200MHz memory, and it was holding a stable 2050MHz; I even saw it jump to the 2070s a few times. I've only put a mild OC on the card since I haven't bench tested it yet. So far really happy with the card under water.


----------



## pez

Quote:


> Originally Posted by *Iorek*
> 
> I believe that's talking about watching 24 fps content over a 60 hz source... like watching a blu-ray on your computer. 60 fps content on a 60 fps source shouldn't be an issue.


Yeah, I think you're correct. It sounds like you've possibly tried this, but have you tried what I bolded below? I'm sure you've tried multiple HDMI cables at this point, but an official HDMI 2.0 to DisplayPort adapter might be a good test. It still doesn't sound like this should be an issue related to that, however.
Quote:


> Originally Posted by *KillerBee33*
> 
> Had my Toshiba 50L3400U 50-inch 1080p 60Hz Smart LED TV for a little over a year. I run most games in DSR with VSYNC always on and DSR smoothness at 28, and instead of 2160p I run 1620p, which visually makes absolutely no difference, and I get no stutter. The TV also runs off a *DisplayPort-to-HDMI adapter*, with the Oculus connected through HDMI.
> EDIT: It does have ClearScan 120 (Effective), which I still have no idea what it does, and 24-60 FPS video is also smooth.


----------



## 295033

Quote:


> Originally Posted by *pez*
> 
> Yeah, I think you're correct. It sounds like you've possibly tried this, but have you tried what I bolded below? I'm sure you've tried multiple HDMI cables at this point, but an official HDMI 2.0 to DisplayPort adapter might be a good test. It still doesn't sound like this should be an issue related to that, however.


I have not tried that. I've got a beefier active HDMI cable on the way, but I suppose I can order one of those as well. Maybe a DVI-to-HDMI adapter while I'm at it. Might as well try everything. One thing I thought of on the way to work is to try enabling the TV's UHD color mode, which you typically only need for 4K@60Hz 4:4:4... maybe that opens up more bandwidth or something. It wasn't needed before, but who knows.
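For anyone wondering why full-chroma 4K tends to hide behind a "UHD color" setting: the raw pixel data alone gets close to what the HDMI link can carry. A rough back-of-the-envelope sketch (ignoring blanking intervals and protocol overhead, so real link requirements are somewhat higher):

```python
# Rough estimate of uncompressed video bandwidth. This deliberately ignores
# HDMI blanking intervals and encoding overhead, so the real link requirement
# is higher than the number printed here.
def video_gbps(width, height, refresh_hz, bits_per_channel=8, channels=3):
    bits_per_frame = width * height * bits_per_channel * channels
    return bits_per_frame * refresh_hz / 1e9  # gigabits per second

uhd_444 = video_gbps(3840, 2160, 60)  # 4K, 60Hz, 8-bit, full 4:4:4 chroma
print(f"4K@60 8-bit 4:4:4 pixel data: {uhd_444:.1f} Gbit/s")
```

At roughly 11.9 Gbit/s of pixel data before overhead, 4K@60 8-bit 4:4:4 needs HDMI 2.0's higher TMDS rate, while 4:2:0 halves the chroma data and fits in HDMI 1.4-class bandwidth - consistent with the TV only exposing full 4:4:4 behind its UHD color setting.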


----------



## Jpmboy

Quote:


> Originally Posted by *carlhil2*
> 
> I still can't believe how cool these cards run under water. I am using an EK universal block and max out at +10°C over ambient at +200/+600, 24/7 in games....


Right? I'm getting the same with the uniblock. Will be mounting full cover blocks as soon as my backplates get here...








(need these Uni's for some other cards).


----------



## CRITTY

Quote:


> Originally Posted by *Iorek*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pez*
> 
> Sounds like you've tried it all, but have you tried putting your TV into a 'PC mode' if it has it or just removing all processing technologies in general in the TV settings?
> 
> 
> 
> Yes, I've tried PC mode on the TV, and like all the other modes there's no stutter. Game mode is the only one that has the problem, and only in fullscreen... if I run a game in a window in Game mode it's fine. Unfortunately, PC mode comes in at about 56ms of input lag compared to Game mode's 23ms, so I'd really rather not use it. Really though, what the heck could possibly be going on in Game mode that causes something like this? It has to be a problem with the TV, right?
Click to expand...

I have the same TV and don't have this issue at all. I am sorry I don't have an answer for you.


----------



## 295033

Quote:


> Originally Posted by *CRITTY*
> 
> I have the same TV and don't have this issue at all. I am sorry I don't have an answer for you.


Yeah I didn't have the issue until a few days ago either; I was implying there might be something wrong with my TV in particular


----------



## Glzmo

Quote:


> Originally Posted by *Iorek*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CRITTY*
> 
> I have the same TV and don't have this issue at all. I am sorry I don't have an answer for you.
> 
> 
> 
> Yeah I didn't have the issue until a few days ago either; I was implying there might be something wrong with my TV in particular
Click to expand...

Did it auto-update the firmware, perhaps? I've seen silent firmware updates cause bugs on TVs.


----------



## 295033

Quote:


> Originally Posted by *Glzmo*
> 
> Did it auto-update the firmware, perhaps? I've seen silent firmware updates cause bugs on TVs.


No, but I updated the firmware after the fact to try and fix the problem.


----------



## jodasanchezz

Solved my problem with the Titan not boosting at all...

Before the fix


After


I just removed the GPU, booted once with the iGPU, and put the GPU back in....

Has anyone had something like this before? Any idea what could have caused it?


----------



## jhowell1030

Quote:


> Originally Posted by *jodasanchezz*
> 
> Solved my problem with the Titan not boosting at all...
> 
> Before the fix
> 
> 
> After
> 
> 
> I just removed the GPU, booted once with the iGPU, and put the GPU back in....
> 
> Has anyone had something like this before? Any idea what could have caused it?


Can't see the pictures at the office and can't understand your question with all of the typos.







Not trying to be facetious I just didn't understand...


----------



## jodasanchezz

Quote:


> Originally Posted by *jhowell1030*
> 
> Can't see the pictures at the office and can't understand your question with all of the typos.
> 
> 
> 
> 
> 
> 
> 
> Not trying to be facetious I just didn't understand...


My Titan wasn't boosting at all.

All Titans were running at 1418MHz... I spent nearly two hours in chat with Nvidia... they had no idea...
Finally I just unplugged the GPU, plugged it back in, and boom... I was able to boost up to 1750 at stock settings and 2000 with +215 on the core... I've never had a similar situation with any GPU before.


----------



## jhowell1030

That does sound like an odd issue. Never ran into that one before.


----------



## piee

I'm getting a peak of 1860 at 52C and below; that's stock out of the box, no OC.


----------



## Lobotomite430

Quote:


> Originally Posted by *piee*
> 
> I'm getting a peak of 1860 at 52C and below; that's stock out of the box, no OC.


1860 is normal, but the 52C doesn't seem right unless you live in Antarctica.


----------



## CallsignVega

Quote:


> Originally Posted by *Lobotomite430*
> 
> Why +800? Everything I have read on here anything over +500 is pointless and may hurt performance/card.


I run +850. I've tested virtually every memory step above stock and there is some gain. The largest jump is up to the +500 range, but I still squeeze a tiny bit more performance at +850 in multiple tests. It's all about finding the right memory step that keeps tight timings at the highest frequency. Just one step can make a difference; e.g., if I go to +840 or +860 I lose performance.
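The sweep-every-step-and-keep-the-best approach described above boils down to a tiny selection routine. The scores below are hypothetical placeholder numbers purely to illustrate the non-monotonic behavior being described (performance peaking at one step and dropping at the next), not real measurements; in practice you'd fill the table by running a fixed benchmark at each offset and recording its score.

```python
# Sketch of sweeping memory offsets and keeping the best-scoring one.
def best_offset(scores):
    """Return (offset, score) for the highest-scoring memory offset."""
    return max(scores.items(), key=lambda kv: kv[1])

scores = {  # offset (MHz) -> benchmark score; HYPOTHETICAL numbers, for illustration only
    500: 9800,
    840: 9905,
    850: 9950,  # peak: highest stable step before timings loosen
    860: 9870,  # one step higher and the score drops
}

offset, score = best_offset(scores)
print(f"best memory offset: +{offset} (score {score})")
```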


----------



## Yuhfhrh

Quote:


> Originally Posted by *CallsignVega*
> 
> I run +850. I've tested virtually every memory step above stock and there is some gain. The largest jump is up until the +500 range, but I still squeeze a tiny bit more performance at +850 in multiple tests.


Man, +500 is my max. I get artifacting at +550 and crashes at +650.


----------



## jhowell1030

Quote:


> Originally Posted by *CallsignVega*
> 
> I run +850. I've tested virtually every memory step above stock and there is some gain. The largest jump is up until the +500 range, but I still squeeze a tiny bit more performance at +850 in multiple tests. It's all about finding that right memory step that keeps the tight timings and the highest frequency. Just one step difference can change, IE if I go to +840 or +860 I lose performance.


For me, anything higher than +475 gets me better scores in benchmarks, but it either does not add to in-game FPS or hurts it. I've also seen this in multiple reviews. I suppose each card is different, though. To each their own.


----------



## chronicfx

Quote:


> Originally Posted by *jcde7ago*
> 
> Pretty sure you meant to respond to @jodasanchezz ...


Your assumption would be accurate


----------



## cisco0623

Quote:


> Originally Posted by *Jpmboy*
> 
> Right? I'm getting the same with the uniblock. Wil lbe mounting full cover blocks as soon as my backplates get here...
> 
> 
> 
> 
> 
> 
> 
> 
> (need these Uni's for some other cards).


Totally agree. I was playing some Civ 4 (lol, I know, but I always play some; it's just a game I love) and it uses the regular clock speed, no boost. Cards are around 27C for me, 32C after 40 minutes of Doom. Very happy with how cool these run.


----------



## piee

It only shows that when a game first loads, then it downclocks immediately to about 1740-1838 at 74C at full fan. Eventually I'll get a block to keep it at 52C and below for the full 1860 stock; it peaks at 2070 and steadies at 1948-1978 with 190 PL / full fan at 74C.
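When a card downclocks like this, NVML exposes the active throttle reasons as a bitmask. Below is a minimal decoding sketch; the bit values are the ones documented for nvml.h (worth double-checking against your driver's headers), and on a live system you'd read the mask itself with pynvml's nvmlDeviceGetCurrentClocksThrottleReasons.

```python
# Decode an NVML clocks-throttle-reasons bitmask into human-readable causes.
# Constant values follow the NVML documentation; verify against your driver.
THROTTLE_REASONS = {
    0x01: "GPU idle",
    0x02: "Applications clocks setting",
    0x04: "SW power cap (power limit)",
    0x08: "HW slowdown (thermal or power brake)",
    0x20: "SW thermal slowdown",
    0x40: "HW thermal slowdown",
}

def decode_throttle(mask):
    """Return the list of active throttle reasons for a bitmask."""
    return [name for bit, name in sorted(THROTTLE_REASONS.items()) if mask & bit]

# Example: a card pinned at its power limit while also thermally limited.
print(decode_throttle(0x04 | 0x20))
```

A card that boosts high at load start and then settles lower, as described above, will typically show the power-cap or thermal-slowdown bits set once it warms up.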


----------



## bee144

Quote:


> Originally Posted by *jodasanchezz*
> 
> My Titan wasn't boosting at all.
> 
> All Titans were running at 1418MHz... I spent nearly two hours in chat with Nvidia... they had no idea...
> Finally I just unplugged the GPU, plugged it back in, and boom... I was able to boost up to 1750 at stock settings and 2000 with +215 on the core... I've never had a similar situation with any GPU before.


Sounds like your card wasn't seated properly in the PCIe slot, or one of the power cables wasn't inserted fully. Glad to hear it's working again.

Not having the card fully seated in the PCIe slot could have caused it not to receive the full 75W from the slot, thus preventing it from boosting. That's my best root-cause analysis.
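The 75W figure is the PCIe slot's power allotment; combined with the Titan X Pascal's 6-pin and 8-pin auxiliary connectors, the nominal budget comfortably covers the card's 250W rated board power:

```python
# Nominal power budget for a Titan X Pascal per PCIe conventions:
# an x16 slot supplies up to 75W, a 6-pin connector 75W, an 8-pin 150W.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
tdp = 250  # Titan X Pascal rated board power in watts

print(f"connector budget: {budget} W, rated TDP: {tdp} W, headroom: {budget - tdp} W")
```

Losing the slot's 75W (bad seating) or a connector's share (a half-inserted cable) eats that headroom, which is why boost can collapse even though the card still posts.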


----------



## eliau81

Quote:


> Originally Posted by *Lobotomite430*
> 
> I fixed my cooling issue with the EVGA 1080/1070 hybrid kit Im using on my Titan. Now my games are running at 55c rather than 70+ I am very happy with the EVGA hybrid kit!


Does the shroud of the 1080/1070 kit fit the Titan?


----------



## Baasha

Quote:


> Originally Posted by *axiumone*
> 
> Oh that's so great! That thread really explains all of the details to get 4 way sli working on pascal. So our certain resident just stole all of the thunder without any of the credit. Figures.


Uhh... I was told not to post about it online, so I never did - but the video gives credit (as well as the comments); after some initial mix-up, credit is given fully.
Quote:


> Originally Posted by *Jpmboy*
> 
> Baasha has been running 4-way since launch, and in the last few weeks getting all cards to scale well.
> 
> 
> 
> 
> 
> 
> 
> 
> So much for the "SLI don't work" crowd.


4-Way SLI FTW!


----------



## Jpmboy

Quote:


> Originally Posted by *Baasha*
> 
> Uhh.. was told not to post about it online so never did - but the video gives credit (as well as the comments) after some initial mix up - credit is given fully.
> 4-Way SLI FTW!


At least it's working. AFR has its issues, but ya gotta do what ya gotta do.


----------



## Gary2015

Quote:


> Originally Posted by *Artah*
> 
> nice thanks. I'm getting 2012 with +215 and 5200 on +200. I have EK blocks also. Temp 54ish


I'm getting the same on air.


----------



## Gary2015

Quote:


> Originally Posted by *Jpmboy*
> 
> Baasha has been running 4-way since launch, and in the last few weeks getting all cards to scale well.
> 
> 
> 
> 
> 
> 
> 
> 
> So much for the "SLI don't work" crowd.


Just returned one of my cards .... Oops


----------



## pez

Quote:


> Originally Posted by *Iorek*
> 
> I have not tried that. I've got a beefier active HDMI cable on the way, but I suppose I can order one of those as well. Maybe a DVI -> HDMI adapter while I'm at it. Might as well try everything. One thing I thought of on the way to work is to try enabling the TV's UHD color mode, which you typically only need for [email protected] 4:4:4... maybe that opens up more bandwidth or something. It wasn't needed before, but who knows.


Quote:


> Originally Posted by *Iorek*
> 
> Yeah I didn't have the issue until a few days ago either; I was implying there might be something wrong with my TV in particular


Definitely curious to know what fixes it if you do find out









----------



## Artah

Quote:


> Originally Posted by *Gary2015*
> 
> I'm getting the same on air.


you're not getting throttled at all?


----------



## jodasanchezz

For those who don't know yet:
*
MSI AfterBurner 4.3.0 Beta 14*



http://www.guru3d.com/news-story/download-msi-afterburner-4-3-beta-14.html


----------



## jodasanchezz

Quote:


> Originally Posted by *Artah*
> 
> you're not getting throttled at all?


After I solved my problems with the Titan, I'm also able to hit 2000-2025MHz on air (100% fan) in GTA 5, and 1987-2012 in Witcher 3, all at 74°C, with core +220 and mem +200.


----------



## Artah

Quote:


> Originally Posted by *jodasanchezz*
> 
> After I solved my problems with the Titan, I'm also able to hit 2000-2025MHz on air (100% fan) in GTA 5, and 1987-2012 in Witcher 3, all at 74°C, with core +220 and mem +200.


That's very good on air. I have not tried to push my cards further because I don't need the extra juice ATM I'm only running 4K.


----------



## jcde7ago

Quote:


> Originally Posted by *jodasanchezz*
> 
> After I solved my problems with the Titan, I'm also able to hit 2000-2025MHz on air (100% fan) in GTA 5, and 1987-2012 in Witcher 3, all at 74°C, with core +220 and mem +200.


EVGA's making a Hybrid unit for the Titan X Pascal... I'm guessing it'll release sometime in October... definitely pick one of those up if you're not gonna do a custom loop... gotta take care of that TXP, and 100% fan must sound like a jet engine!

Besides, if/when we get proper BIOS modding with higher voltages, you're gonna need to be under water to push towards 2200-2300MHz+.


----------



## CreepinD

Been using my Titan XP for a few weeks now, and I have to say that I'm very pleased with the performance of this card. I have a 40" 4K monitor, and I can run all my games at max settings while staying above 60 FPS. The card is rock solid at 2075MHz on the core and +500 on the memory, with temps between 35-38C. I'm out of town for the next week, so I have yet to get a chance to try the new beta version of AB. Hopefully I can reach the magical 2100MHz.









http://s2.photobucket.com/user/Creepin_D/media/20160811_203817.jpg.html
http://s2.photobucket.com/user/Creepin_D/media/thumbnail_20160817_190655.jpg.html
http://s2.photobucket.com/user/Creepin_D/media/20160829_161912.jpg.html
http://s2.photobucket.com/user/Creepin_D/media/20160829_174335.jpg.html
http://s2.photobucket.com/user/Creepin_D/media/20160829_174320.jpg.html


----------



## jodasanchezz

Quote:


> Originally Posted by *jcde7ago*
> 
> EVGA's making a Hybrid unit for the Titan X Pascal...i'm guessing it'll release sometime in October...definitely pick one of those up if you're not gonna do a custom loop...gotta take care of that TXP and 100% fan must sound like a jet engine!
> 
> Besides, if/when we get proper bios modding w/ higher voltages you're gonna need to be underwater to push towards 2200-2300mhz+.


Yes, this GPU is horrible at 100% fan.







My GF already asked me when I'm going to send back this loud piece of sh... Hahaha, I said never.

I hope my EK block comes next week;
I will set up an extra 360 rad loop for the card, but I need time for that...

I haven't added any voltage to the GPU (it's always running into the power limit). Should a lower temp reduce the power consumption, or am I wrong on that point?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *jodasanchezz*
> 
> Yes, this GPU is horrible at 100% fan.
> 
> 
> 
> 
> 
> 
> 
> 
> My GF already asked me when I'm going to send back this loud piece of sh... Hahaha, I said never.
> 
> I hope my EK block comes next week;
> I will set up an extra 360 rad loop for the card, but I need time for that...
> 
> I haven't added any voltage to the GPU (it's always running into the power limit). Should a lower temp reduce the power consumption, or am I wrong on that point?


Lower temps equal lower power consumption, yes; leakage current in the silicon drops as it cools.


----------



## KillerBee33

Does anyone have an image of the inside of the EK TXP block with the thermal pad positions?


----------



## scgeek12

Double post


----------



## scgeek12

@KillerBee33
This is for a 1080, but I used this video and it was exactly the same. These were the first two waterblocks I ever put on video cards, and they both work perfectly.


----------



## KillerBee33

Quote:


> Originally Posted by *scgeek12*
> 
> @KillerBee33
> This is for a 1080, but I used this video and it was exactly the same. These were the first two waterblocks I ever put on video cards, and they both work perfectly.


Doing a small mod to a 1080 block from Swiftech to fit the TXP; just trying to get as much info as I can before I actually cut it.
Other than the extra 6-pin opening on the TXP, I can't seem to spot any major differences.


----------



## DNMock

Quote:


> Originally Posted by *KillerBee33*
> 
> Doing a small mod to a 1080 block from Swiftech to fit the TXP; just trying to get as much info as I can before I actually cut it.
> Other than the extra 6-pin opening on the TXP, I can't seem to spot any major differences.


The GPU die is larger; make sure the Swiftech block actually makes full contact and that the PCBs of the two cards are the same.


----------



## KillerBee33

Quote:


> Originally Posted by *DNMock*
> 
> The GPU die is larger; make sure the Swiftech block actually makes full contact and that the PCBs of the two cards are the same.


I was thinking of doubling up on the thermal pads for the memory chips that aren't covered by the 1080 block,

or just covering them on the PCB.


----------



## piee

It seems that every 100MHz above stock nets 2-3 FPS in games, so +400MHz = 8-12 FPS above stock.
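That rule of thumb is just a linear extrapolation; a quick sketch (the 2-3 FPS per +100MHz figure is the poster's observation on their own setup, not a general constant):

```python
# Linear rule-of-thumb FPS gain from a core overclock, using the quoted
# 2-3 FPS per +100MHz range (an observation, not a guarantee).
def fps_gain_range(offset_mhz, low_per_100=2.0, high_per_100=3.0):
    steps = offset_mhz / 100
    return (steps * low_per_100, steps * high_per_100)

low, high = fps_gain_range(400)
print(f"+400MHz: roughly {low:.0f}-{high:.0f} extra FPS")
```

In practice the scaling flattens once the card hits its power or thermal limit, so the linear estimate is best treated as an upper bound.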


----------



## Baasha

Is it feasible to use a hybrid cooler on four cards sandwiched together?

I'm going to first try to remove the backplates this weekend. The temps are crazy - this has never happened before - the hottest card in 4-way SLI was always around 80-82C. Now both the top card and the one right below it are getting to 90C.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Does ayone have an image , inside EK TXP block with termal pads positions?


DL the install PDF from EK. it shows what pads go where.







Quote:


> Originally Posted by *Baasha*
> 
> Is it feasible to use a hybrid cooler on four cards sandwiched together?
> 
> I'm going to first try to remove the backplates this weekend. The temps are crazy - this has never happened before - the hottest card in 4-way SLI was always around 80-82C. Now both the top card and the one right below it are getting to 90C.


No... where are you gonna mount all the little rads? Just water cool the rig already.









Or better yet - go external and get an Aquacomputer 720 stand-alone or a GiGant. Works great!
My 720 has been running 24/7 for more than 4 years now. It's easy to move between rigs, and you just pass two tubes into your case.


Spoiler: Warning: Spoiler!


----------



## Silent Scone

Just picked up second TX. Need to order the block


----------



## Jpmboy

Quote:


> Originally Posted by *Silent Scone*
> 
> Just picked up second TX. Need to order the block


my backplates should arrive today, full cover ASAP!


----------



## Silent Scone

Quote:


> Originally Posted by *Jpmboy*
> 
> my backplates should arrive today, full cover ASAP!


The EK ones? Nah, bareback here.


----------



## MunneY

Quote:


> Originally Posted by *Jpmboy*
> 
> DL the install PDF from EK. it shows what pads go where.
> 
> 
> 
> 
> 
> 
> 
> 
> no.. where you gonna mount all the little rads? Just water cool the rig already.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> or better yet - go external and get a Aquacomputer 720 stand alone or a GiGant. works great!
> my 720 has been running 24/7 for more than 4 years now. easy to move between rigs and you just pass two tubes into your case.
> 
> 
> Spoiler: Warning: Spoiler!


What he said, @baasha.

Just block them and run an external rad with QDCs.


----------



## 295033

Quote:


> Originally Posted by *pez*
> 
> Definitely curious to know what fixes it if you do find out
> 
> 
> 
> 
> 
> 
> 
> .


Well, none of the cables/adapters worked, nor did UHD color mode. My last hope is the box Samsung is sending, which I won't get until next week.


----------



## ragingnomad

Does anyone have an idea why my TXP SLI setup is only running the core at 570 or 608MHz max, at idle and in games? I can't get it to run higher, and I'm thinking this probably isn't a good time to start OCing if the card won't even kick into boost mode on its own.


----------



## Woundingchaney

Quote:


> Originally Posted by *ragingnomad*
> 
> Does anyone have an idea why my TXP SLI setup is only running the core at 570 or 608MHz max, at idle and in games? I can't get it to run higher, and I'm thinking this probably isn't a good time to start OCing if the card won't even kick into boost mode on its own.


The cards will only boost to whatever core frequency is required. It's possible the games you are playing aren't stressing them. Try turning off vsync and playing demanding titles.


----------



## ragingnomad

Quote:


> Originally Posted by *Woundingchaney*
> 
> The cards will only boost to whatever core freq is required. It's possible the games you are playing aren't stressing them. Try turning off vsync and playing demanding titles.


I was playing Solitaire and Minesweeper at around 50 FPS, but the clock speed was still really low. V-sync isn't the issue since the framerate isn't even getting up to 60. I think it may be a software issue, because I'm now seeing in GPU-Z that the clock speed looks accurate. MSI AB and HWMonitor give incorrect readings for some reason.


----------



## Woundingchaney

Quote:


> Originally Posted by *ragingnomad*
> 
> I was playing Solitaire and Minesweeper at around 50 FPS, but the clock speed was still really low. V-sync isn't the issue since the framerate isn't even getting up to 60. I think it may be a software issue, because I'm now seeing in GPU-Z that the clock speed looks accurate. MSI AB and HWMonitor give incorrect readings for some reason.


Did you just say Solitaire and Minesweeper? I'm not sure if you're being serious at this point. Regardless, you need to actually play games that stress your cards. Those titles may simply lock at 50, particularly given that they are ancient, simple games embedded in the OS for decades.


----------



## KillerBee33

Quote:


> Originally Posted by *ragingnomad*
> 
> I was playing Solitaire and Minesweeper at around 50 FPS, but the clock speed was still really low. V-sync isn't the issue since the framerate isn't even getting up to 60. I think it may be a software issue, because I'm now seeing in GPU-Z that the clock speed looks accurate. MSI AB and HWMonitor give incorrect readings for some reason.


lol - none of the GIFs work here, but that's friggin' funny


----------



## LongtimeLurker

Quote:


> Originally Posted by *ragingnomad*
> 
> I was playing Solitaire and Minesweeper at around 50 FPS, but the clock speed was still really low. V-sync isn't the issue since the framerate isn't even getting up to 60. I think it may be a software issue, because I'm now seeing in GPU-Z that the clock speed looks accurate. MSI AB and HWMonitor give incorrect readings for some reason.


----------



## Fredthehound

This is funny since just yesterday I sent my kid the following email

"I like fallout shelter. It's a very fun game. Rather addictive and a nice break from all the crap one goes through to mod Sky and F4. Just click an icon and play... Simple, no thought fun. And suddenly I realized I was enjoying the game...On a liquid cooled i7, 2400mhz ram, a raid array and a freaking Titan XP.

"I thought I had broken the space-time continuum with all-time nuclear levels of absurdity. But no. Then I realized I still had my FPS counter running. As if a ported iPhone game is going to somehow fall below 60... At which point the singularity of PCMR idiocy was achieved. I have now ascended to the height of glorious overkill in the most asinine way imaginable, and I just had to shut it all off and go to sleep. For an encore I might see if Minesweeper is somewhere on this computer."


----------



## eliau81

Can someone confirm whether the EVGA 1080 Hybrid kit shroud fits the Titan XP?


----------



## KillerBee33

Quote:


> Originally Posted by *eliau81*
> 
> can someone confirm if the EVGA 1080 HYBRID KIT shroud fits to titan xp?


Never mind, "shroud" just caught my eye.


----------



## willverduzco

Quote:


> Originally Posted by *eliau81*
> 
> can someone confirm if the EVGA 1080 HYBRID KIT shroud fits to titan xp?


It does not fit the Titan XP. You either have to cut a hole in the heatspreader to make room for the top VRM choke, or you can simply use the CLLC pump/waterblock on the stock heatspreader (what I did). Click on my rig pics for more detail of how you have to do it.


----------



## eliau81

Quote:


> Originally Posted by *willverduzco*
> 
> It does not fit the Titan XP. You either have to cut a hole in the heatspreader to make room for the top VRM choke, or you can simply use the CLLC pump/waterblock on the stock heatspreader (what I did). Click on my rig pics for more detail of how you have to do it.


nice rig

but I didn't get much info from the pics.

BTW, love the copper spreaders, very good idea.
Wish I could do that too....


----------



## willverduzco

Quote:


> Originally Posted by *eliau81*
> 
> nice rig
> 
> 
> 
> 
> 
> 
> 
> 
> 
> but didn't got much info from pic
> 
> BTW love the copper spreaders very good idea
> wish i could do that also ....


Ah, apologies for the bad pictures (and thanks for the compliment on my baby). Basically, what you have to do (to do it my way) is the following:


1. Take off the stock outer-shroud components (hex screws on the shroud itself).
2. Unscrew the GPU heatsink using the 4 rear spring-mounted screws.
3. Remove the entire stock heatspreader assembly. First remove the back plate (be very careful with the tiny screws, which can't stand any real torque before they snap), then unscrew the ~4mm hex standoff nuts that hold them in place. This frees the stock heatspreader, which has to come off in order to access the PWM fan lead that the CLLC pump will attach to.
4. Attach the pump to the PWM fan lead using the attached splitter, and plug the stock fan into the cable included with the Hybrid kit.
5. Rebuild everything, making sure you have enough clearance with cables and the stock fan, which still has to spin to cool the heatspreader for the input and output MOSFETs (small one in the rear), as well as the heatspreader for the RAM and late-stage VRM components (big one combined with the memory heatspreader). If you want to add copper ramsinks to cool the RAM and some of the VRM components, you can do so before replacing the fan. Just make sure to get low-profile ones for under the tubing and under the fan shroud.
Gamers Nexus did a video series on the process using the 980 Ti Hybrid kit, but it's essentially the same with the 1080 kit since we're only using the CLLC rather than the shroud and heatspreader it comes with. The only thing really worth mentioning from that series is that it shows the stock heatspreader and rear area of the shroud (supplying air to the heatsinks that indirectly cool the input and output MOSFETs of the VRM) can still be used with the mod. This last part is essential, since you don't have to worry about active cooling for the VRMs. The memory bling that I added really gives you nothing since GDDR5X isn't too warm to begin with; I just figured I'd try to remove as much heat from that heatspreader as possible, since it still ends up warm (but not hot) to the touch after a prolonged benchmark stress test.

This method unfortunately results in a shroud-less look, which some may not like. However, it's quite functional, and delivers exceptional temps on my fairly high overclock. All in all, this process was better than spending the time draining my main loop, cutting more tubing, filling, leak testing, and bleeding, which I like to avoid as much as possible.









As mentioned before, you could also modify the 1080 hybrid kit's heatspreader to work, which involves making a small cut for the uppermost VRM choke. This would look better if you care about that... But I had no interest in cutting any metal for this mod on my personal rig.









PS. I highly recommend using a different fan once you set everything up. I had a few 120mm Gentle Typhoons laying around so I am using that instead of the stock EVGA fan. Not only did my temps improve, but it's much quieter. I now remember why I have 8 of these babies in push-pull cooling my 480mm double-width radiator at super low RPM for my CPU loop. Godly fans for low noise, high static pressure applications like watercooling.


----------



## ragingnomad

Quote:


> Originally Posted by *LongtimeLurker*


You don't think Solitaire is a good way to test the cards??


----------



## pompss

I know I'm going off topic, but I need a fast answer from you guys: which case should I buy for an Asus Rampage V Edition 10 and one Titan X? Of course I will watercool the CPU and GPU.

I almost forgot to mention that the case has to be mounted on a wall, so there are only 2 cases I can find on Google.

First option: Lian Li PC-O7SX

It's like my old build, but this case is bigger, and I can mount a 360 rad and have more space for the PSU, D5 pump and reservoir.

Second option: Thermaltake Core P5

I can install a 480 rad, which would be awesome, and still have a lot of space to install another rad on the back.

Now, I overclock the GPU and CPU; silence is not essential but preferable, so a 420 or 480 60mm-thick rad would be perfect for that, which pushes me toward the Core P5. The problem is that I like the GPU mounted vertically, and with the Core P5 the GPU would cover half of the Rampage V Edition 10 and also half of the LED lights. A solution is to install the GPU horizontally at the bottom of the motherboard and forget about it.

The Lian Li PC-O7SX would be perfect for mounting the GPU vertically without covering the motherboard aesthetics, but I can only install a 360 rad 50mm thick, which would be enough but noisier. Also, getting access to the rad for cleaning would not be flexible, but possible.

Sooo...... what do you guys think I should buy?
If anyone knows another case that can be mounted on a wall, I will look into it.
Thanks


----------



## piee

From 28nm to 16/14nm is a 12-14nm difference; from 16/14nm to 7nm is a 9-7nm difference. Does that mean the performance increase from 28nm (8 TFLOPs) to 16nm (12 TFLOPs) will be bigger than from 16nm to 7nm (15-16 TFLOPs?)? 28nm came out in 2010, 16nm in 2013, GPUs on it in 2016, so maybe 7nm GPUs in 3 years? Just got an EK block for the TXP; it hits 76C on stock air (max fan) at a 1860 peak. I'd like it cool/quiet.
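One nit on the arithmetic: node names don't subtract linearly. Transistor density scales roughly with the inverse square of the feature size, so 16nm to 7nm is actually a bigger relative jump than 28nm to 16nm, marketing labels notwithstanding. A rough sketch:

```python
# Node "names" are largely marketing labels, but as a first approximation
# transistor density scales with the inverse square of the feature size,
# not with the linear difference in nanometers.
def density_scaling(old_nm, new_nm):
    return (old_nm / new_nm) ** 2

print(round(density_scaling(28, 16), 2))  # ~3.06x
print(round(density_scaling(16, 7), 2))   # ~5.22x
```

So by this crude measure the 16nm-to-7nm transition packs a bigger density jump than 28nm-to-16nm did, even though "9nm" sounds smaller than "12nm". Real performance gains depend on clocks, power, and architecture, of course.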


----------



## eliau81

Quote:


> Originally Posted by *willverduzco*
> 
> Ah, apologies for the bad pictures (and thanks for the compliment on my baby). Basically, what you have to do (to do it my way) is the following:
> 
> 
> Take off the stock outer-shroud components (hex screws on shroud itself)
> Unscrew the GPU heatsink using the 4 rear spring-mounted screws
> Remove the entire stock heatspreader assembly. First remove the back plate (be very careful with the tiny screws that can't stand any real torque before they snap), then unscrew the ~4mm hex standoff nuts that hold them in place. This will free the stock heat spreader, which has to be done in order to access the PWM fan lead that the CLLC pump will attach to.
> Attach the pump to the PWM fan lead using the attached splitter, and plug the stock fan into the cable included with the Hybrid kit.
> Rebuild everything, making sure that you have enough clearance with cables and the stock fan, which still has to spin to cool the heatspreader for the input and output mosfets (small one in the rear), as well as the heatspreader for the RAM and late stage VRM components (big one combined with the memory heatspreader). If you want to add copper ramsinks to cool the RAM and some of the VRM components, you can do so before replacing the FAN. You just have to make sure to get low profile ones for under the tubing and under the fan shroud.
> Gamers Nexus did a series (
> 
> 
> 
> ,
> 
> 
> 
> ,
> 
> 
> 
> ) on the process using the 980TI Hybrid kit, but it's essentially the same with the 1080 kit since we're only using the CLLC rather than using the shroud and heatspreader it comes with. The only thing really worth mentioning from that video is that it shows that the stock heatspreader and rear area of the shroud (supplying air to the heatsinks that indirectly cool the input and output mosfets of the VRM) can still be used with the mod. This last part is essential, since you don't have to worry about active cooling for the VRMs. The memory bling that I added really gives you nothing since GDDR5X isn't too warm to begin with, I just figured that I'd try to remove as much heat from that heatspreader as possible, since it still ends up warm (but not hot) to the touch after a prolonged benchmark stress test.
> 
> This method unfortunately results in a shroud-less look, which some may not like. However, it's quite functional, and delivers exceptional temps on my fairly high overclock. All in all, this process was better than spending the time draining my main loop, cutting more tubing, filling, leak testing, and bleeding, which I like to avoid as much as possible.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As mentioned before, you could also modify the 1080 hybrid kit's heatspreader to work, which involves making a small cut for the uppermost VRM choke. This would look better if you care about that... But I had no interest in cutting any metal for this mod on my personal rig.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PS. I highly recommend using a different fan once you set everything up. I had a few 120mm Gentle Typhoons laying around so I am using that instead of the stock EVGA fan. Not only did my temps improve, but it's much quieter. I now remember why I have 8 of these babies in push-pull cooling my 480mm double-width radiator at super low RPM for my CPU loop. Godly fans for low noise, high static pressure applications like watercooling.


thanks a lot for the tutorial








So no need for copper on the VRAM & MOSFETs?
The problem with replacing the fan is that the connector from the pump to the fan is 2-pin, while most fans come with 3/4-pin connectors (like Noctua, the best IMO). I had this issue with my 980; the EVGA fan was unbearable, so I had to replace it with a Noctua 120mm fan and connect it to the mobo (didn't have a Molex from the PSU).
I'd really like to put a shroud on for aesthetics and to exhaust the air from the card out of the case.


----------



## DADDYDC650

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *pompss*
> 
> i know im going off thread but i need a fast answer for you guys
> Which case i should buy for asus rampage V edition 10 and One Titan X .OF Course i will watercool cpu and gpu .
> 
> Almost forget to mention that the case as too be mounted on wall so there is only 2 cases i can find on google.
> 
> First option LIAN LI PC-O7SX
> 
> 
> 
> 
> 
> its like my old build but this case its bigger and i can mount 360 rad and have more space for Psu , D5 pump and reservoir.
> 
> Second Option Termaltake core p5
> 
> 
> 
> 
> 
> i can install 480 rad which will be awesome and still have a lot of space to install another rad on the back
> 
> Now i overclock gpu and cpu and silent its not essential but preferable so a 420 or 480 60mm thick rad would be prefect for that which push me to buy the core p5 case. Problem is the i like the Gpu to be mounted vertically and with the core p5 the gpu will cover half of the ASUS RAMPAGE V 10 EDITION and also cover half of the led lights. A Solution is to install the gpu horizontal at the bottom of the motherboard end forget about it.
> 
> The LIAN LI PC-O7SX will be perfect for mounting the gpu vertically and will not cover the motherboard aesthetics but i can install only a 360 rad 50mm thick which will be enough but more noise.Also getting access to the rad for cleaning purpose would be not flexible but possible.
> 
> Sooo...... what you guys think i should buy.
> If anyone know another case that can be mount on a wall then i will look into it
> Thanks





P5 all day dude. Why spend more than you have to?


----------



## jodasanchezz

New Afterburner seems to be great









Running Heaven for 15 min at 1987-2012 MHz on air.
Quote:


> Originally Posted by *piee*
> 
> From 28nm to 16/14nm is a 12-14nm difference,from 16/14nm to 7nm is a 9/7nm difference, does that mean the increase in performance from 28nm(8terraflops) to 16nm(12 terraflops) is going a bigger increase performance than 16nm to 7nm(15-16terraflops?).28nm came out in 2010,16nm in 2013, Gpu in 2016, so maybe 7nm gpu in 3years? Just got a EK block for TXP, it hits 76c stk air(maxfan) at 1860peak, like cool/quiet.


I work at the company that is the leading developer for SMT. The technique to produce 5nm has been available for around 4 years now. The problem is just the mass-production process; it is/was not stable.
There is development ongoing and we were able to improve the process step by step, but until this hits the market it will need another 2 years (10nm) or even more. We just don't know yet.

The problem at this point is the light source.

FYI


----------



## pompss

Quote:


> Originally Posted by *DADDYDC650*
> 
> P5 all day dude. Why spend more than you have to?


Thanks, I agree.
I'm also thinking of going with the MSI Z170A XPower and an i7 6700K, since I won't do SLI in the future, and saving another $500.
Looks like the 6700K performs better than most high-end CPUs in games.


----------



## DADDYDC650

Quote:


> Originally Posted by *pompss*
> 
> Thanks i agree.
> I also think to go with MSI MSI Gaming Z170A XPOWER and i7 6700k since will not do sli in the future and save another $500
> Look like 6700 perform better then most high end cpu in games.


Why not get an x99 board for around $220 and a 6800k? Should come out to a couple bucks more than that msi board and 6700k.

I see that you have a 5820 as mentioned below. Stick with that and upgrade in a year or two.


----------



## jcde7ago

Quote:


> Originally Posted by *pompss*
> 
> Thanks i agree.
> I also think to go with MSI MSI Gaming Z170A XPOWER and i7 6700k since will not do sli in the future and save another $500
> Look like 6700 perform better then most high end cpu in games.


What's the reason for not just sticking with the 5820K if you aren't going to go with SLI in the future? If all you will be doing is gaming it honestly makes no sense to switch, imo.


----------



## pez

Quote:


> Originally Posted by *Iorek*
> 
> Well none of the cables/adapters worked, nor did UHD color mode. My last hope is the box Samsung is sending which I won't get until next week.


What box might that be?


----------



## DarkIdeals

Replied to wrong post apparently. My bad. Quoted proper post in next comment.


----------



## DarkIdeals

Quote:


> Originally Posted by *piee*
> 
> From 28nm to 16/14nm is a 12-14nm difference,from 16/14nm to 7nm is a 9/7nm difference, does that mean the increase in performance from 28nm(8terraflops) to 16nm(12 terraflops) is going a bigger increase performance than 16nm to 7nm(15-16terraflops?).28nm came out in 2010,16nm in 2013, Gpu in 2016, so maybe 7nm gpu in 3years? Just got a EK block for TXP, it hits 76c stk air(maxfan) at 1860peak, like cool/quiet.


From what I've heard, insiders are saying that Nvidia is VERY unhappy with the yields and rollout of the TSMC 16nm FinFET node that Pascal runs on. Despite it overall being a much more solid process than the GloFo 14nm that AMD Polaris is on, there were apparently some pretty significant issues, which definitely seem to have contributed to the huge stock problems the Pascal 1080/1070 etc. had when they first launched.

The general consensus from people on the inside working with Nvidia is that they are converting their future plans to 14nm Samsung node, in fact the new Tegra X2 (some people are calling it Tegra N1 lately, unsure if there was a name change) is using the very same 14nm Samsung process.

Honestly my opinion is that this is VERY likely to be the node that Volta will launch on. People were saying that Volta was far off because of the delays and problems with 10nm, and since 10nm has been all but confirmed to be dumped in favor of moving to 7nm it's pretty obvious that Nvidia is going to want something "inbetween" 16nm and 7nm to hold them over till they can produce 7nm effectively. And 14nm Samsung is exactly that; it provides a moderate amount of perf/watt and efficiency increase and a slight bump in transistor density etc.. akin to the 14nm GloFo that Polaris uses, but is also MUCH more stable of a process overall, giving less yield problems and lower defect rate in theory.

Volta has MANY parallels to Maxwell too: 1) Pascal followed the Kepler release schedule nearly IDENTICALLY (the GTX 1080/680 were the first 16nm/28nm chips, the x70 and x60 soon followed, then a surprise TITAN launch, etc.). 2) Maxwell was kind of shoehorned into the "map" in order to provide an in-between product bridging the gap between 28nm Kepler and 16nm; originally Maxwell was planned for 20nm, but we all know how that went. This means we are likely to see 14nm Samsung bridging the gap between 16nm and 7nm, just as "20nm Maxwell" was supposed to bridge 28nm and 16nm. I can imagine us getting a Volta GV104 as the new GTX 1180 (~2,880-3,072 CUDA cores, ~8GB 3072-bit HBM2; around 5-10% slower than the TITAN XP) and GTX 1170 (~2,304-2,560 CUDA cores, 8GB GDDR5X, around overclocked GTX 1080 levels), then a GV102 featuring a GTX 1180 Ti (~3,200-3,584 CUDA cores, ~12GB 3072-bit HBM2, ~15% faster than the TITAN XP), and then possibly a GV100 TITAN model (~3,840-4,096 CUDA cores, ~16-24GB 4096-bit HBM2) and a full-on Tesla GV100 with DP units etc.
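As a sanity check on core-count speculation like the above: peak FP32 throughput is roughly 2 FLOPs per CUDA core per clock (one fused multiply-add), so you can ballpark any rumored config. A quick sketch; the hypothetical "GV104" numbers are pure speculation, not specs:

```python
def peak_tflops(cuda_cores, clock_ghz, flops_per_core_per_clock=2):
    """Peak FP32 throughput: an FMA counts as 2 FLOPs per core per clock."""
    return cuda_cores * clock_ghz * flops_per_core_per_clock / 1000.0

# Titan X Pascal at its nominal 1531 MHz boost clock:
print(round(peak_tflops(3584, 1.531), 1))  # ~11.0 TFLOPs

# Hypothetical "GV104" config from the speculation above (assumption, not a spec):
print(round(peak_tflops(3072, 1.6), 1))
```

That's why a 3,072-core part would need a healthy clock bump to land only 5-10% behind the TITAN XP.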


----------



## jcde7ago

Quote:


> Originally Posted by *pez*
> 
> What box might that be?


The 'OneConnect' box that ships with pretty much all of Samsung's high-end TVs...it's a separate hub for multimedia connectivity + Smart TV functionality. I doubt that replacing this will fix his problems though...it's probably more with the panel hardware of the TV itself or the firmware managing it as opposed to this OneConnect box being the root cause.


----------



## pez

Interesting. Nice to see Samsung being cooperative in assisting him, then.


----------



## Overfiend1981

Guys, without going through 300 pages of the thread... can someone enlighten me: is there any news on a BIOS mod being in the works?

Kind regards


----------



## 295033

Quote:


> Originally Posted by *jcde7ago*
> 
> The 'OneConnect' box that ships with pretty much all of Samsung's high-end TVs...it's a separate hub for multimedia connectivity + Smart TV functionality. I doubt that replacing this will fix his problems though...it's probably more with the panel hardware of the TV itself or the firmware managing it as opposed to this OneConnect box being the root cause.


I was under the impression that the One Connect box houses all the processing functionality as well as the OS/firmware/etc for the TV. It does all the work and then sends the picture to the TV. This post sums it up better than I can.


----------



## profundido

Quote:


> Originally Posted by *MunneY*
> 
> What he said @baasha
> 
> Just block them and run an external rad with qdcs


@baasha

+1

You can also easily construct your own external shroud with 3 blackice SR2 560 rads and a bunch of low rpm fans on it. It will be whisper quiet, you'll have lots of thermal headroom and within a week you'll be wondering how you ever went without.

In case you're scared to get your feet wet in custom-loop watercooling, read up on the net, inform yourself well, and start with a starter kit from EK plus extra rads. Lots of people here to help you too. It's more than worth it in the end, and you will have built an infrastructure that will serve you well for future upgrades. Money can't be the problem if you're running 4 TXPs.


----------



## Jpmboy

Quote:


> Originally Posted by *Silent Scone*
> 
> The EK ones? Nah, bareback here.


yeah - there's just too many fragile little thingys on the backside of this very thin PCB to not use a backplate IMO. Lol - I've been running these naked except for the uniblock for a while on a bench rig - very fragile PCBs.








Quote:


> Originally Posted by *profundido*
> 
> @baasha
> 
> +1
> 
> You can also easily construct your own external shroud with 3 blackice SR2 560 rads and a bunch of low rpm fans on it. It will be whisper quiet, you'll have lots of thermal headroom and within a week you'll be wondering how you ever went without.
> 
> In case you're scared to get your feet wet in custom loop watercooling, read up on the net, inform yourself well and get your feet wet with a starter kit from EK and extra rads. Lotsa people here to help you too ? It's more than worth it in the end and you will have built an infrastructure that will serve you well for future upgrades as well. Money cannot be the problem if you're running 4 TXP's


there's a watercooling Section at OCN. lot's of help and Hardware Reps too.


----------



## Steven185

Quote:


> Originally Posted by *CreepinD*
> 
> Been using my Titan XP for a few weeks now, and I have to say that i'm very pleased with the performance of this card. I have a 40" 4K monitor, and a can run all my games at max settings while staying above 60 fps. Card is rock solid at 2075Mhz on the core and +500 on the memory, with temps between 35-38C. I'm out of town for the next week, so I have yet to get a chance to try the new beta version of AB. Hopefully I can reach the magical 2100Mhz
> 
> 
> 
> 
> 
> 
> 
> 
> [/URL]


Do you get a stable 2075MHz? I mean, does it never downclock to, say, 2000MHz or below?
Also, what is that in Afterburner? +200?

Thanks and congratz for the rocking setup.


----------



## profundido

Quote:


> Originally Posted by *Steven185*
> 
> Do you get stable 2075Mhz? I mean does it never downclock to -say- 2000 Mhz or below.
> Also what is that on After burner? + 200?
> 
> Thanks and congratz for the rocking setup.


yes, same here. For daily use I settled for +200core/+500mem/120%power/autovoltage and it's solid at 2063core on both cards during very heavy gaming sessions. Manual volting makes it worse for me as the temp throttling seems to be the biggest factor on these cards, not the voltage. It seems as long as you can keep the full load temps for these cards at 40°C the core clock will be stable and not throttle.
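If you want to verify the "never downclocks" claim rather than eyeball it, one option is to log core-clock samples (GPU-Z and Afterburner can both dump logs) and flag any dips below your target. A minimal sketch with made-up sample data; the helper name is mine, not from any tool:

```python
def throttle_events(samples_mhz, floor_mhz):
    """Return (index, clock) pairs where the logged core clock dipped below floor."""
    return [(i, c) for i, c in enumerate(samples_mhz) if c < floor_mhz]

# Made-up session log (MHz); in practice dump these from GPU-Z or Afterburner logging:
log = [2063, 2063, 2050, 2063, 1999, 2063, 2063]
print(throttle_events(log, 2000))  # [(4, 1999)]
```

An empty result over a long gaming session is much stronger evidence of a stable clock than watching the OSD for a few minutes.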


----------



## Steven185

Quote:


> Originally Posted by *profundido*
> 
> yes, same here. For daily use I settled for +200core/+500mem/120%power/autovoltage and it's solid at 2063core on both cards during very heavy gaming sessions. Manual volting makes it worse for me as the temp throttling seems to be the biggest factor on these cards, not the voltage. It seems as long as you can keep the full load temps for these cards at 40°C the core clock will be stable and not throttle.


What's your full load temps and cooling system if you don't mind my asking.


----------



## profundido

Quote:


> Originally Posted by *Steven185*
> 
> What's your full load temps and cooling system if you don't mind my asking.


1× 420 SR2 Black Ice rad with Thermaltake 140 Riing RGB fans (front)
1× 360 SR2 Black Ice rad with 120 Riing Vardars on it (top)
1× 360 EK PE rad with Thermaltake 120 Riing RGB fans (side)

[email protected] Ghz, 2× TXP SLI. Both cards run at a constant 41°C while gaming or benchmarking Firestrike/Timespy (with the +200/+500 OC).

All fans @ 600 RPM; top and rear exhaust, bottom + side intake. Case interior 26°C idle, up to 36°C under load. Room temp 26°C.


----------



## Jpmboy

EK Copper/Plexi Blocks, Backplates and Plexi Terminal


----------



## MrTOOSHORT

gorgeous!


----------



## Jpmboy

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> gorgeous!


Thanks 'T.


----------



## pompss

Quote:


> Originally Posted by *DADDYDC650*
> 
> Why not get an x99 board for around $220 and a 6800k? Should come out to a couple bucks more than that msi board and 6700k.
> 
> I see that you have a 5820 as mentioned below. Stick with that and upgrade in a year or two.


Quote:


> Originally Posted by *jcde7ago*
> 
> What's the reason for not just sticking with the 5820K if you aren't going to go with SLI in the future? If all you will be doing is gaming it honestly makes no sense to switch, imo.


You guys are 100% right, but I sold my main rig yesterday, so I need a new one.









----------



## DooRules

Very nice Jpmboy.









I see the Thermal Grizzly there in the pic. Trying that now for the first time; easy to use, liking it so far. We'll see how it stands up over time.


----------



## willverduzco

Quote:


> Originally Posted by *eliau81*
> 
> thanks a lot for the tutorial
> 
> 
> 
> 
> 
> 
> 
> 
> so no need to copper the Vram & mofset?
> problem with fan replacing is the connector from pump to the fan as a 2 pin, while most fans comes with 3/4 pin (like nactua the best IMO) connector, i had issue with my 980 and the EVGA fan was unbearable, so i had to replace to nactua 120mm fan and connect to the MOBO (dident had molex for PSU)
> i really like to put a shroud for aesthetic looks and to exhaust the air from card out.


My pleasure. Also, you are correct: there's really no need for the copper ramsinks that I put on the heat spreader. It won't help your overclock one bit. If you think about it, the stock vapor-chamber cooler does not have any thermal compound linking it to the heat spreader; it's not even really touching it. If anything, simply removing the hot object that ordinarily radiates a small amount of heat toward the RAM heat spreader would actually improve that heat spreader's dissipation, because it gets a tiny bit more airflow and no longer has a hot object millimeters away.

Also keep in mind that the hottest parts of the VRMs (the high and low side MOSFETs that are placed to the right of the chokes, away from the GPU core) are actively cooled by the right side of the heat spreader, which is directly connected to a small heat sink that's cooled by the stock blower fan and exhausts out the rear of the card near the power connectors, just like in the stock setup. I just wanted to keep the parts that don't really get very warm (RAM, left side components of the VRMs) a bit cooler than they already were. Totally unneeded, though in the long run it may help these components last marginally longer (though if Nvidia designed the card with any real cooling of these components to begin with, you have to assume it can run safely without improving this).

As for fan control, I definitely agree that you should replace the fan. Noctua fans are great, but usually not ideal for high static pressure applications (water-cooling radiators), which is why I suggested the venerable Gentle Typhoon, which is generally regarded as the best radiator fan, even now that there are newer alternatives like the Vardar fans by EK that offer similar static pressure at a higher noise level. That said, anything would be an improvement over the loud stock fan, and as you said, you can control the RPM with a motherboard fan header.

If you really, really want to have the shroud for aesthetics, it's still possible. You just need to make the tiny cut in the 1080 hybrid's heat spreader to accommodate that last VRM choke at the top. I would be very careful though if you do this to make sure that you fully clear that VRM and don't have a raised part of the heat spreader. If you do have a raised part, the water block on the GPU core won't make full contact with the die, and you'll instantly kill your chip. So if you do this, just make sure that when you place the block on the GPU core that you see thermal compound make its way to all parts of the core and a corresponding part of the water block.


----------



## Jpmboy

Quote:


> Originally Posted by *DooRules*
> 
> Very nice Jpmboy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I see the thermal grizzly there in pic. Trying that now for the first time. Easy to use, liking it so far. See how it stands up over time now.


seems very good. It's been on a bunch of cards and cpus here. Good stuff. It's either that, PK-1, PK-3, Gelid Ex, or... yes, NT-H1 for me.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> seems very good. It's been on a bunch of cards and cpus here. Good stuff. It's either that, PK-1, PK-3, Gelid Ex, or... yes, NT-H1 for me.


Gelid Ex is not treating my Titan any better than EVGA's factory paste did








Gonna try an X when the Block gets here.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Gelid-Ex is not treating my titan any better that EVGAs Factory paste did
> 
> 
> 
> 
> 
> 
> 
> 
> Gonna try an X when the Block gets here.


The EK blocks do a great job... loops of Heaven 4.0 at 2063/5500 and the core temp is less than 10C higher than the coolant.
Coolant "cold" side 29-30C, core temps only in the mid 30s.


----------



## eliau81

Quote:


> Originally Posted by *willverduzco*
> 
> My pleasure. Also, you are correct: there's really no need for the copper ramsinks that I put on the heat spreader. It won't help your overclock one bit. If you think about it, the stock vapor-chamber cooler does not have any thermal compound linking it to the heat spreader; it's not even really touching it. If anything, simply removing the hot object that ordinarily radiates a small amount of heat toward the RAM heat spreader would actually improve that heat spreader's dissipation, because it gets a tiny bit more airflow and no longer has a hot object millimeters away.
> 
> Also keep in mind that the hottest parts of the VRMs (the high and low side MOSFETs that are placed to the right of the chokes, away from the GPU core) are actively cooled by the right side of the heat spreader, which is directly connected to a small heat sink that's cooled by the stock blower fan and exhausts out the rear of the card near the power connectors, just like in the stock setup. I just wanted to keep the parts that don't really get very warm (RAM, left side components of the VRMs) a bit cooler than they already were. Totally unneeded, though in the long run it may help these components last marginally longer (though if Nvidia designed the card with any real cooling of these components to begin with, you have to assume it can run safely without improving this).
> 
> As for fan control, I definitely agree that you should replace the fan. Noctua fans are great, but usually not ideal for high static pressure applications (water-cooling radiators), which is why I suggested the venerable Gentle Typhoon, which is generally regarded as the best radiator fan, even now that there are newer alternatives like the Vardar fans by EK that offer similar static pressure at a higher noise level. That said, anything would be an improvement over the loud stock fan, and as you said, you can control the RPM with a motherboard fan header.
> 
> If you really, really want to have the shroud for aesthetics, it's still possible. You just need to make the tiny cut in the 1080 hybrid's heat spreader to accommodate that last VRM choke at the top. I would be very careful though if you do this to make sure that you fully clear that VRM and don't have a raised part of the heat spreader. If you do have a raised part, the water block on the GPU core won't make full contact with the die, and you'll instantly kill your chip. So if you do this, just make sure that when you place the block on the GPU core that you see thermal compound make its way to all parts of the core and a corresponding part of the water block.


thanks again for the post

as i see it i've got 2 options:

1. wait for a proper EVGA HYBRID TITAN XP release (which could take some time, if it happens at all).
2. make the mod you mentioned with the 1080 kit

some user on reddit has done so
he said he had to make 3 aggressive changes to the EVGA heatspreader

https://www.reddit.com/r/4ziyr1/evga_hybrid_kit_modded_for_titan_x_pascal_album/
thinking of doing that too, but wondering what kind of tool he used to grind?

can you give me an amazon link for the fan? i want to make sure i buy the one you recommend
did you do the shunt mod?


----------



## KillerBee33

Quote:


> Originally Posted by *eliau81*
> 
> thanks again for the post
> 
> as i see it i've got 2 options:
> 
> 1. wait for a proper EVGA HYBRID TITAN XP release (which could take some time, if it happens at all).
> 2. make the mod you mentioned with the 1080 kit
> 
> some user on reddit has done so
> he said he had to make 3 aggressive changes to the EVGA heatspreader
> 
> https://www.reddit.com/r/4ziyr1/evga_hybrid_kit_modded_for_titan_x_pascal_album/


----------



## eliau81

Quote:


> Originally Posted by *KillerBee33*


just finished chatting with them

''Welcome to EVGA Web Chat. You are now being assisted by Parker M.
Hello, how can I assist you today?
Eli
hi
is EVGA planning to make hybrid kit for titan x pascal
Parker M
There was rumored to be created however we have not heard a update on that as of now.
Eli
so you cant confirm that?
Parker M
I cannot at this time since nothing has been 100% confirmed by our marketing team. From a older post in the forums I was stated that we were going to eventually, but no update has been issued. I do apologize for the inconvenience.
Eli
that's ok
thank you''

that rumor started with some guy named ''jacob'' from evga


----------



## Lobotomite430

Just buy the Hybrid 1080 kit like I did. I made 2 cuts to the baseplate and my Titan is happy and cool! EVGA might not even make a kit, seeing how many people are doing water blocks already.


----------



## eliau81

Quote:


> Originally Posted by *Lobotomite430*
> 
> Just buy the Hybrid 1080 kit like I did. I made 2 cuts to the baseplate and my Titan is happy and cool! EVGA might not even make a kit, seeing how many people are doing water blocks already.


did you use the Titan baseplate or the 1080's? the 1080's needs grinding, according to that user from reddit


----------



## cisco0623

Quote:


> Originally Posted by *Jpmboy*
> 
> THe EK blocks do a great job... loops of Heaven 4.0 at 2063/5500 and the core temp is less than 10C higher than the coolant.
> Coolant "Cold" side 29-30C. Core temps only in the mid 30s.


Beautiful. They are great blocks. I have two questions for you - what monitor is that? And where did you get the awesome desktop image?!


----------



## Jpmboy

Quote:


> Originally Posted by *cisco0623*
> 
> Beautiful. They are great blocks. I have two questions for you - what monitor is that? And where did you get the awesome desktop image?!


RoG Swift. 1440P 144Hz.
Lol - I can't remember where I got this image from...

earth_moon_and_other_planets-wallpaper-3840x2160.jpg 1410k .jpg file


I use it mostly on my 50inch 4K panel... then it really looks amazing.


----------



## Lobotomite430

Quote:


> Originally Posted by *eliau81*
> 
> did you use the Titan baseplate or the 1080's? the 1080's needs grinding, according to that user from reddit


Used the EVGA baseplate. I only cut out the spot for an extra VRM and the 6-pin power connector. Maybe his fits more flush than mine, but mine looks and fits great. I'm happy with my 50C with my overclock. I've been thinking about doing the shunt mod. I've also taken the card apart a few times to reapply TIM that I didn't apply properly at first, so it's easy now and no longer scary.


----------



## eliau81

Quote:


> Originally Posted by *Lobotomite430*
> 
> Used the EVGA baseplate. I only cut out the spot for an extra VRM and the 6-pin power connector. Maybe his fits more flush than mine, but mine looks and fits great. I'm happy with my 50C with my overclock. I've been thinking about doing the shunt mod. I've also taken the card apart a few times to reapply TIM that I didn't apply properly at first, so it's easy now and no longer scary.


thanks man will do that


----------



## cisco0623

Quote:


> Originally Posted by *Jpmboy*
> 
> RoG Swift. 1440P 144Hz.
> Lol - I can't remember where I got this image from...
> 
> earth_moon_and_other_planets-wallpaper-3840x2160.jpg 1410k .jpg file
> 
> 
> I use it mostly on my 50inch 4K panel... then it really looks amazing.


Awesome! Thank you. I'm undecided whether I go 1440p 100hz or 4K.


----------



## Lobotomite430

Quote:


> Originally Posted by *eliau81*
> 
> thanks man will do that


I might take it apart again anyway and borrow a friends dremel tool and make it fit even better(like the link on reddit) but I am quite happy with it for now.


----------



## willverduzco

Quote:


> Originally Posted by *eliau81*
> 
> thanks again for the post
> 
> as i see it i've got 2 options:
> 
> 1. wait for a proper EVGA HYBRID TITAN XP release (which could take some time, if it happens at all).
> 2. make the mod you mentioned with the 1080 kit
> 
> some user on reddit has done so
> he said he had to make 3 aggressive changes to the EVGA heatspreader
> 
> https://www.reddit.com/r/4ziyr1/evga_hybrid_kit_modded_for_titan_x_pascal_album/


Hmm. I'm unable to get on Reddit right now (blocked on my current domain), but this post on hardforum shows the single, relatively minor modification that one user had to do to fit his 1080 hybrid kit onto the TXP.

As for the fan, I like the 1850 RPM GTs. They are pretty quiet (but not silent) at the full 1850 RPM. However, with a 7V adapter or a 60% fan profile setting on the motherboard, you can get it to around 1200 RPM, which is inaudible at basically any distance. IMO the 1850 is the best of both worlds, since you can run it faster if needed, which makes it slightly audible, or run it a tiny bit slower and make it inaudible.

Regarding the shunt mod, I haven't done that yet. Despite not doing it, my clocks never drop to 2000 MHz even when hitting the power limiter. The lowest I ever see is about 2026 (extremely rare, doesn't stay there, and only when looping benchmarks), so I don't mind waiting for the BIOS modification. There seems to be great progress there, since there's already an NVFlash with a certificate-check bypass for Pascal (doesn't support the TXP), and an official TXP-enabled NVFlash (without the bypass). It's only a matter of time before someone combines the two and hex-edits a BIOS with a higher power limit.


----------



## Jpmboy

Quote:


> Originally Posted by *cisco0623*
> 
> Awesome! Thank you. I'm undecided whether I go 1440p 100hz or 4K.


buying a monitor today.. 4K with as big a screen as you can handle (eg, sitting close to a 50 inch 4K panel is not what you want... you need to sit back at least 2x the panel width.) Guys like Callsignvega are very up-to-date on the best monitors atm.
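The "sit back at least 2x the panel width" rule of thumb is easy to sanity-check with a few lines of back-of-the-envelope math. A rough sketch, assuming a 16:9 aspect ratio (the panel sizes here are just example inputs, not anyone's actual monitor):

```python
import math

def panel_width_in(diagonal_in: float, ar_w: int = 16, ar_h: int = 9) -> float:
    """Panel width from the advertised diagonal and aspect ratio."""
    return diagonal_in * ar_w / math.hypot(ar_w, ar_h)

for diag in (27, 40, 50):
    width = panel_width_in(diag)
    print(f'{diag}" 16:9 panel: ~{width:.1f}" wide, '
          f'so sit back ~{2 * width:.0f}" (~{2 * width * 2.54 / 100:.1f} m)')
```

For a 50" panel that works out to roughly 2.2 m of viewing distance, which is why a desk setup usually wants a smaller 4K screen.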


----------



## MunneY

Quote:


> Originally Posted by *cisco0623*
> 
> Awesome! Thank you. I'm undecided whether I go 1440p 100hz or 4K.


Owning both, I prefer my 3440x1440 100Hz over 4K60. They each have their merits tho. My 4K is a 40" too, so they are roughly the same size. 4K at anything other than 32" is worthless to me personally.


----------



## msp1609

Put a 1080 hybrid cooler on my Pascal Titan last night and did the shunt mod. While it did drop my reported power draw, it seems my Pascal Titan doesn't like anything above 1.05V. I'll have to spend some more time dialing it in tonight, but GPU Boost 3.0 seems to have no rhyme or reason to how it adds or subtracts voltage, and the best clock speed I was able to manage last night was low 21xx. I miss GPU Boost 2.0 lol.
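For anyone wondering why the shunt mod lowers the *reported* power draw: the card infers current from the voltage across a known sense resistor, so paralleling that shunt with another resistor makes the controller under-read current, which effectively raises the power limit. A minimal sketch of the arithmetic, with example resistor values only (not measured from a Titan X Pascal):

```python
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

# Example values only -- NOT actual Titan X Pascal shunt values.
stock_shunt = 0.005       # ohms: a typical current-sense shunt
added_resistor = 0.005    # ohms: resistor soldered across it

effective = parallel(stock_shunt, added_resistor)
underread = effective / stock_shunt        # fraction of current the controller "sees"

actual_power = 300.0                        # watts really drawn (example)
reported_power = actual_power * underread   # what the power limiter acts on

print(f"effective shunt: {effective * 1000:.2f} mOhm")
print(f"{actual_power:.0f} W actual reads as {reported_power:.0f} W")
```

With equal resistors the controller sees half the real current, so the limiter lets the card draw roughly twice as much before throttling, which is exactly why people pair this mod with serious cooling.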


----------



## Difunto

I love my temps with my push pull hybrid on! BTW my room is always cold










starts at 2114 then drops to 2104 when it goes past 15C; max temp on a 30min Valley loop was 30C


----------



## piee

NASA has great wallpapers: nebulas, galaxies, etc. Awesomeness. Check out the live view from the ISS.


----------



## Lobotomite430

Quote:


> Originally Posted by *msp1609*
> 
> Put a 1080 hybrid cooler on my Pascal Titan last night and did the shunt mod. While it did drop my reported power draw it seems my pascal titan doesn't like anything above 1.05v. I'll have to spend some more time dialing it in tonight, but gpu boost 3.0 seems to have no rhyme or reason to add or subtract voltage and the best clock speed i was able to manage last night was low 21xx. i miss gpu boost 2.0 lol.


Was that stable at all? My card hovers around 2050-2088 and seems stable. No shunt mod though.


----------



## bee144

Quote:


> Originally Posted by *eliau81*
> 
> just finish to chat with them
> 
> ''Welcome to EVGA Web Chat. You are now being assisted by Parker M.
> Hello, how can I assist you today?
> Eli
> hi
> is EVGA planning to make hybrid kit for titan x pascal
> Parker M
> There was rumored to be created however we have not heard a update on that as of now.
> Eli
> so you cant confirm that?
> Parker M
> I cannot at this time since nothing has been 100% confirmed by our marketing team. From a older post in the forums I was stated that we were going to eventually, but no update has been issued. I do apologize for the inconvenience.
> Eli
> that's ok
> thank you''
> 
> that rumor started with some guy named ''jacob'' from evga


That "some guy named Jacob" is the lead EVGA product manager, so his word typically carries a lot of weight. He's also largely the face of EVGA from a social media perspective.


----------



## Vellinious

Yup. Usually if you hear something from Jacob, it can be taken as gospel.


----------



## CreepinD

Quote:


> Originally Posted by *Steven185*
> 
> Do you get stable 2075Mhz? I mean does it never downclock to -say- 2000 Mhz or below.
> Also what is that on After burner? + 200?
> 
> Thanks and congratz for the rocking setup.


Thanks!

Yes, I have yet to see it drop below that clock speed, while playing the new tomb raider, and doom.

I think its +220 on AB, but i'm not 100% sure. I'm out of town for the next week, so I cannot confirm.


----------



## unreality

Besides overclocking, I just tested some undervolting using the newest Afterburner beta. You press CTRL+F there and can change the voltage at the different boost steps. Since I'm playing WoW Legion atm, which doesn't push the card really hard, I adjusted the curve so I play at the default 1531 boost @ 0.825V (default was 1.06V). PCGH was using 0.850V at 1800 boost btw, which is really nice!

Still getting beautiful [email protected] and card is as cool as it could possibly be
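When dialing in a curve point like this, it helps to log clocks and power while a benchmark loops, so you can spot power-limit throttling or instability-induced clock droop afterwards. A minimal sketch around `nvidia-smi` (the queried fields and polling interval are just one reasonable choice; nvidia-smi does not expose core voltage, but clocks, power, and temperature are usually enough):

```python
import subprocess
import time

FIELDS = "clocks.sm,power.draw,temperature.gpu,utilization.gpu"

def parse_sample(line: str):
    """Parse one nvidia-smi CSV line into (MHz, W, C, % load)."""
    sm, power, temp, util = (s.strip() for s in line.split(","))
    return int(sm), float(power), int(temp), int(util)

def sample() -> str:
    """Query the first GPU once; requires nvidia-smi on PATH."""
    return subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip().splitlines()[0]

def log(samples: int = 30, interval: float = 2.0) -> None:
    """Print one sample per interval; watch for clocks dipping below your
    curve point while utilization stays pegged (a sign of throttling)."""
    for _ in range(samples):
        mhz, watts, temp, util = parse_sample(sample())
        print(f"{mhz} MHz  {watts:6.1f} W  {temp} C  {util}% load")
        time.sleep(interval)

# Run log() in one terminal while looping Valley/Heaven in another, e.g.:
# log(samples=90, interval=2)   # roughly 3 minutes of data
```

If the logged clock holds steady at your curve point under full load, the undervolt is at least power-stable; game-stability still needs longer soak testing.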


----------



## CRITTY

So true and funny!


----------



## Stateless

I am using the new AB and adding voltage does nothing for me. The highest I can add to the core is +220; anything above that and it crashes. I figured I would max out the voltage that AB allows, and even going to +225 with the added voltage it crashes. My card runs all day/night at +220 with or without added voltage, so it seems like the voltage slider is not doing anything. I read a few pages back that someone is using "autovoltage"; can anyone explain what that is and the benefits?

Thanks!


----------



## willverduzco

Quote:


> Originally Posted by *CRITTY*
> 
> 
> 
> 
> 
> 
> So true and funny!


And with the best song from the Doom 2016 soundtrack = Winning!


----------



## Baasha

These cards are so epic! I just absolutely love them!









Dat scaling tho!









GTA V w/ a ton of mods (plus crazy ENB) @ 8K maxed out (no in-game AA): [8k screenshot]



Battlefield 1 (Beta) @ 8K everything on Ultra (including HBAO): [8k screenshot]


----------



## Menthol

Quote:


> Originally Posted by *Jpmboy*
> 
> EK Copper/Plexi Blocks, Backplates and Plexi Terminal
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Is this in your Caslabs build


----------



## Sowah

*Hello dear Friends!*

I have as Gift become a new EVGA HB SLi Bridge 4x Slots, same like this one on the right https://www.techpowerup.com/img/16-06-22/151b.jpg .

BUT, I still have 2x Titan X Maxwell inside. So, my question is: _"Can I use this new SLI Bridge on Maxwell Titan X or not? And when yes, what is different to old SLI?"_

*Thank you for help and Greetings from Spain & Germany*








Christian


----------



## DADDYDC650

Quote:


> Originally Posted by *Baasha*
> 
> These cards are so epic! I just absolutely love them!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Dat scaling tho!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GTA V w/ a ton of mods (plus crazy ENB) @ 8K maxed out (no in-game AA): [8k screenshot]
> 
> 
> 
> Battlefield 1 (Beta) @ 8K everything on Ultra (including HBAO): [8k screenshot]


So why can't Nvidia provide proper SLI support?


----------



## Jpmboy

Quote:


> Originally Posted by *Menthol*
> 
> 
> Is this in your Caslabs build


Still deciding. I really like the look of this on an open bench ... I may put the 5960X/2 TXMs kit in the SM8 - can always swap it out later, I guess.
I just put the Aquaero 6 in the Mercury 8, leak tested (with Dr. Drop), filled with coolant mix, and ran the pump in a loop for a few hours. Everything is ready except for 3 fan splitters (Monday delivery). Just need to decide which kit to put in there.


----------



## Menthol

I got a TH10 Magnum when they first came out. I was so glad to see someone selling American craftsmanship that I felt compelled to support them, not to mention they sell brackets, plates, spare parts, etc.
It is kind of large for a desktop case; I use it for benching since it has radiators and pumps installed. I keep planning on getting an external radiator/pump setup but have never gotten around to it. I'm thinking of one of their bottom expansion cases for this purpose, since it could also house a power supply.


----------



## nwkrep82

Quote:


> Originally Posted by *willverduzco*
> 
> Hmm. I'm unable to get on Reddit right now (blocked on my current domain), but this post on hardforum shows the single, relatively minor modification that one user had to do to fit his 1080 hybrid kit onto the TXP.
> 
> As for the fan, I like the 1850 RPM GTs. They are pretty quiet (but not silent) at the full 1850 RPM. However, with a 7V adapter or 60% fan profile setting on the motherboard, you can get it to around 1200 RPM, which is inaudible at basically any distance. IMO the 1850 is the best of both worlds since you can get it to be faster if needed, which makes it slightly audible, or run it a tiny bit slower and make it inaudible.
> 
> Regarding the shunt mod, I haven't done that yet. Despite not doing it, my clocks never touch 2000 MHz even when hitting the power limiter. The lowest I ever see are about 2026 (extremely rare, doesn't stay there, and only when looping benchmarks), so I don't mind waiting for the BIOS modification. There seems to be great progress there, since there's already a NVFlash with a bypass point for pascal (doesn't support TXP), and an official TXP-enabled NVFlash (without certificate check bypass). Only a matter of time for the NVFlash and someone to hexedit a BIOS with a higher power limit.


Yes, that was me









2 relatively easy mods to get it up and running.


----------



## Jpmboy

Quote:


> Originally Posted by *Menthol*
> 
> I got a TH10 Magnum when they first came out, I *was so glad to see someone selling American craftsmanship* I felt compelled to support them, not to mention they sell brackets, plates spare parts etc.
> It is kind of large for a desktop case, I use it for benching since it has radiators and pumps installed, I keep planning on getting an external radiator/pump setup just have never gotten around to doing it, thinking one of there bottom expansion cases for this purpose since it could also house a power supply


^^ This. The case is incredibly well made and (too) expandable. As far as a portable external cooling kit goes, take a look at the Aquacomputer GiGant rad setup (with internal pump/res and 12V converter). With Koolance QDCs it is a simple plug-and-run unit. Then there is always something like the Koolance EXC-800.


----------



## RobotDevil666

Hey guys I've been reading through this thread a bit as I have a manly decision to make








So the question is: Titan X Pascal or 1080 SLI?
Currently I have 980Ti SLI. Originally I was going to wait for the 1080Ti, but I caught the upgrade itch and I'm afraid I can't wait any longer.
I know the downsides of SLI, as I've been using it since the 9800GX2; yes, every single gen I've had an SLI setup at some point








So, has anybody come from 980Ti SLI to the Titan X Pascal? Was it worth it? Looking at benchmarks, the new Titan should be roughly equal to 980Ti SLI:
faster in quite a few games with weak or no SLI support, and slower in the few games that have great SLI support like Battlefield or Crysis 3.
Also, would the EK backplate from the original Titan X work on the new one, or do I need to buy a new one?


----------



## profundido

If you've been running SLI all the way until now, it means you play games that benefit from it. So you should either go for 1080 SLI, or buy 1 Titan now and save up for your second.

I don't know if the original EK backplate can be used, but you can certainly remount the stock backplate, which works really well and looks gorgeous. So there's no real need to buy a new EK one if you don't want to.

The rest of what you assumed seems pretty accurate.

I personally went for TXP SLI and I'm really happy with it. No issues whatsoever in the games I play (Witcher, Dragon Age, ESO, Tomb Raider)


----------



## RobotDevil666

Quote:


> Originally Posted by *profundido*
> 
> If you've been having SLI all the way until now it means you play games that benefit from it. So you should either go for 1080SLI or buy 1 titan now and save up for your second.
> 
> I don't know if the original EK backplate can be used but for sure you can remount the stock backplate which works really well and looks gorgeous. So there's no real need to buy a new EK if you don't want to
> 
> All the rest you assumed seems pretty accurate
> 
> I personally went for TXP SLI and I'm really happy with it. No issues whatsoever for the games I play (witcher, dragon's age, eso, tombs raider)


Well, TXP SLI is not really happening due to the price, so it's one or the other.
What are average OC clocks on a watercooled TXP? Is 2000 realistic?


----------



## profundido

TXP base clock is only around 1200MHz but it boosts all the way up to 2100MHz, hence a serious amount of extra performance.

But if your budget allows for 1080 sli you will for sure be able to squeeze more juice out of that in games that support sli. I would for sure take that route. Only go single TXP if you really play games that don't support SLI properly


----------



## RobotDevil666

Quote:


> Originally Posted by *profundido*
> 
> TXP base clock is only around 1200Mhz going all the way up to 2100Mhz hence creating a serious amount of more power.
> 
> But if your budget allows for 1080 sli you will for sure be able to squeeze more juice out of that in games that support sli. I would for sure take that route. Only go single TXP if you really play games that don't support SLI properly


Well, 2x 1080 is not much more than a single Titan XP. As for the games... most of them would be faster with 1080 SLI, but the influx of new DX12 titles could change that quickly.
Plus 1080 SLI would be on air (at least for now), so there's a noise/heat issue


----------






## profundido

Quote:


> Originally Posted by *RobotDevil666*
> 
> Well 2x 1080 is not much more than single Titan XP, as for the games .... well most of them would be faster with 1080SLI but influx of new DX12 titles can change that quickly.
> Plus 1080 SLI would be on air (at least for now) so there's noise/heat issue


Good points! I was assuming some sort of watercooling, but indeed thermal throttling is a real issue even with 1 card on air, so yes, your 1080 SLI will throttle down constantly if on air. That would be a trigger to go single TXP instead.

Future games and SLI support? I wish I had a crystal ball...

A single TXP is the sure bet in this case, I guess. Can't go wrong with it


----------



## RobotDevil666

Quote:


> Originally Posted by *profundido*
> 
> good points ! I was assuming any sort of watercooling but indeed thermal throttling is a real issue even with 1 card on air so yes your 1080 SLI will throttle down constantly if on air. This would be a trigger to go single TXP instead.
> 
> Future games and SLI support ? I whish I had a crystal ball...
> 
> single TXP is the sure bet in this case I guess. Can't go wrong with it


Plus it eliminates all the SLI-related issues, which I have to say are getting worse lately. I'm running 3440x1440, so a single Titan XP shouldn't have problems squeezing out 100fps in most games, apart from maybe a few more demanding titles.


----------



## jhowell1030

Quote:


> Originally Posted by *RobotDevil666*
> 
> Plus it eliminates all the SLI related issues which I have to say are getting worse lately, I'm running 3440x1440 so single Titan XP shouldn't have problems squeezing 100fps in most games apart from maybe few more demanding titles.


As someone who went from an SLI 980 setup to a single TXP for gaming on my X34, I'm really happy. SLI was too buggy for me in terms of which games would scale well and which wouldn't.

Another thing worth noting... Nvidia is barely supporting SLI as it is now. Eventually it will be up to the developers of titles as far as SLI performance in the future goes.


----------



## mbze430

Quote:


> Originally Posted by *RobotDevil666*
> 
> Hey guys I've been reading through this thread a bit as I have a manly decision to make
> 
> 
> 
> 
> 
> 
> 
> 
> So the question is Titan X Pascal or 1080 SLI ?
> Currently I have 980Ti SLI and originally I was going to wait for 1080Ti but I caught upgrade itch and I'm afraid I can't wait any longer.
> I know the downsides of SLI as I've been using it since 9800GX2, yes every single gen I had SLI setup at some point
> 
> 
> 
> 
> 
> 
> 
> 
> So anybody came from 980Ti SLI to Titan X Pascal ? was it worth it ? looking at benchmarks new Titan should be roughly equal to 980Ti SLi
> faster in quite few games with weak or no SLI support and slower in few games that have great SLI support like Battlefield or Crysis 3
> Also would EK backplate from original Titan X work on the new one or do I need to buy a new one ?


I came from a 980TI SLI to a Titan XP SLI. BUT! I forced myself to use Titan XP single for 1 week, to see if I can just "deal" with it. Nope, can't do it lol


----------



## cisco0623

I can't wait to see how much more peeps push these cards once a custom bios is out.


----------



## cisco0623

Quote:


> Originally Posted by *Jpmboy*
> 
> buying a monitor today.. 4K with as big of a screen as you can handle (eg, sittiing close to a 50 inch 4K panel is not what you want... need to sit back at least 2x the panel width.) Guys like Callsignvega are very up-to-date on the best monitors atm.


Thanks again. Yeah, I sit about two feet away max at my desk. After splurging on the Titans, my plan is to save up and treat myself to a new monitor for Xmas. It's only 3 months away lol


----------



## jcde7ago

Quote:


> Originally Posted by *RobotDevil666*
> 
> Plus it eliminates all the SLI related issues which I have to say are getting worse lately, I'm running 3440x1440 so single Titan XP shouldn't have problems squeezing 100fps in most games apart from maybe few more demanding titles.


3440x1440 @ 100Hz here as well... you will definitely still "struggle" with a single TXP in most of the demanding games, if by "struggle" you mean having to "settle" for ~75FPS instead of 100FPS... lol.









If you're looking to crank up absolutely all the graphical settings that demanding games have to offer, then a single TXP isn't quite there yet in terms of being able to maintain 95+FPS in a lot of the more recent demanding titles (Witcher 3, GTA V, RoTR, etc).


----------



## RobotDevil666

Quote:


> Originally Posted by *jhowell1030*
> 
> As someone who went from an SLI 980 setup to a single TXP for gaming on my X34, I'm really happy. SLI was too buggy for me in terms of which games would scale well and which wouldn't.
> 
> Another thing worth noting... Nvidia is barely supporting SLI as it is now. Eventually it will be up to the developers of titles as far as SLI performance in the future goes.


Yea I've noticed that SLI support has been patchy lately, more hit or miss than ever, and with DX12 not supporting it at all it's looking even worse.
Quote:


> Originally Posted by *mbze430*
> 
> I came from a 980TI SLI to a Titan XP SLI. BUT! I forced myself to use Titan XP single for 1 week, to see if I can just "deal" with it. Nope, can't do it lol


What do you mean by 'can't do it' ?

Quote:


> Originally Posted by *jcde7ago*
> 
> 3440x1440p @100hz here as well...you will definitely still "struggle" with a single TXP in most of the demanding games if by "struggle" you mean having to "settle" for ~75FPS instead of 100FPS...lol.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you're looking to crank up absolutely all graphical settings that demanding games have to offer then a single TXP is quite there yet in terms of being able to maintain 95+FPS in a lot of the more recent demanding titles (Witcher 3, GTA V, RoTR, etc).


meh way to ruin it








Thanks for that. I was looking at benchmarks, but most of them are for 2560x1440 which is easier to run; my 980Ti SLI is also struggling with more demanding titles. Witcher 3 gets around 60/70 fps and The Division is even worse, hovering around 60; GTA V after some tweaking is around 80/90, but that's mostly maxed, not 100% maxed.
From what you said, a single Titan X still appears to be faster than 980Ti SLI
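The "2560x1440 is easier to run" point is mostly raw pixel count. A quick comparison of the resolutions discussed in this thread (pixel count is only a first-order proxy for GPU load, since shader cost per pixel varies by game, so treat the ratios as ballpark):

```python
# Pixel counts for the resolutions being compared in this thread.
resolutions = {
    "2560x1440": 2560 * 1440,
    "3440x1440": 3440 * 1440,
    "3840x2160": 3840 * 2160,
}
base = resolutions["2560x1440"]
for name, pixels in sorted(resolutions.items(), key=lambda kv: kv[1]):
    print(f"{name}: {pixels:>9,} px  ({pixels / base:.2f}x of 2560x1440)")
```

So 3440x1440 pushes about 34% more pixels than 2560x1440, and 4K pushes 2.25x as many, which is why 2560x1440 benchmark numbers flatter an ultrawide setup.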


----------



## jcde7ago

Quote:


> Originally Posted by *RobotDevil666*
> 
> Yea I've noticed that SLI support has been patchy lately, more hit or miss than ever, and with DX12 not supporting it at all it's looking even worse.
> What do you mean by 'can't do it' ?
> meh way to ruin it
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for that. I was looking at benchmarks, but most of them are for 2560x1440 which is easier to run; my 980Ti SLI is also struggling with more demanding titles. Witcher 3 gets around 60/70 fps and The Division is even worse, hovering around 60; GTA V after some tweaking is around 80/90, but that's mostly maxed, not 100% maxed.
> From what you said, a single Titan X still appears to be faster than 980Ti SLI


This is a good comparison of a single TXM/980Ti vs a single TXP:

http://www.babeltechreviews.com/battle-titans-pascal-titan-x-vs-maxwell-titan-x/3/

Coming from 3-way Titan XMs I have to say that yes, a single TXP is going to be a better experience than SLI 980 Tis.









SLI support has been pretty decent, tbh... it's just that the initial drivers are usually not refined enough to make the experience seem worthwhile at first. Eventually, most games get it right and the SLI scaling is there. With the high-bandwidth capability of the Pascals (though you need an HB bridge) and Nvidia essentially abandoning official support for 3- and 4-way SLI (they can focus strictly on 2-way SLI now), I actually think SLI support will get much better from here on out.


----------



## mbze430

Quote:


> Originally Posted by *RobotDevil666*
> 
> What do you mean by 'can't do it' ?
> I


I couldn't handle the fact that I wasn't running SLI. I have been doing SLI since the 3dfx Voodoo2 days.

I have a plain old [email protected] monitor. but really looking forward to [email protected]+ for next year


----------



## RobotDevil666

Quote:


> Originally Posted by *jcde7ago*
> 
> This is a good comparison of a single TXM/980Ti vs a single TXP:
> 
> http://www.babeltechreviews.com/battle-titans-pascal-titan-x-vs-maxwell-titan-x/3/
> 
> Coming from 3-way Titan XMs I have to say that yes, a single TXP is going to be a better experience than SLI 980 Tis.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> SLI support has been pretty decent, tbh...it's just the initial drivers are never usually refined enough to make the experience seem worthwhile at first. Eventually, most games get it right and SLI scaling is going to be there...with the high-bandwidth capability of the Pascals (though you need an HB bridge) and Nvidia essentially abandoning official support for 3 and 4-way SLI (they can focus more on 2-way SLI strictly now)I actually think SLI support will get much better from here on out.


Well this is harder than I thought, 1080 SLI would be faster easily given decent support ...... eh choices choices

Quote:


> Originally Posted by *mbze430*
> 
> I couldn't handle the fact that I wasn't doing SLI. I have been doing SLI since 3Dfx Voodoo2 days.
> 
> I have a plain old [email protected] monitor. but really looking forward to [email protected]+ for next year


Yeah, I know what you mean. I usually go with SLI sooner or later, but with the Titan XP it's just way too much money, so I need to make my choice now.


----------



## jcde7ago

Quote:


> Originally Posted by *RobotDevil666*
> 
> Well this is harder than I thought, 1080 SLI would be faster easily given decent support ...... eh choices choices
> Yeach I know what you mean I usually go with SLI sooner or later but With Titan XP It's just way too much money so I need to make my choice now.


Yeah, if 1080 SLI is a choice then it's definitely going to be faster than a single TXP, no doubt.


----------



## RobotDevil666

Quote:


> Originally Posted by *jcde7ago*
> 
> Yeah, if 1080 SLI is a choice then it's definitely going to be faster than a single TXP, no doubt.


Well, that's the choice I'm facing: 1080 SLI on air or a watercooled Titan XP.
I was considering 2 of these

https://www.scan.co.uk/products/zotac-geforce-gtx-1080-amp-edition-8gb-gddr5x-vr-ready-graphics-card-2560-core-1683mhz-gpu-1822mhz-b

or these

https://www.scan.co.uk/products/palit-geforce-gtx-1080-super-jetstream-8gb-gddr5x-vr-ready-graphics-card-2560-core-1708mhz-gpu-1847m

and potentially watercooling them a bit later, maybe... depends on the noise really


----------



## jcde7ago

Quote:


> Originally Posted by *RobotDevil666*
> 
> Well that's the choice I'm facing 1080SLI on air or watercooled Titan XP
> I was considering 2 of those
> 
> https://www.scan.co.uk/products/zotac-geforce-gtx-1080-amp-edition-8gb-gddr5x-vr-ready-graphics-card-2560-core-1683mhz-gpu-1822mhz-b
> 
> or those
> 
> https://www.scan.co.uk/products/palit-geforce-gtx-1080-super-jetstream-8gb-gddr5x-vr-ready-graphics-card-2560-core-1708mhz-gpu-1847m
> 
> and potentially watercool them bit later, maybe ... depends on the noise really


In 18-24 months' time, the Titan X Pascal's price drop probably won't be enough to get you to pull the trigger on another one to prolong the life of your GPU setup (I mean, even if it were a 50% price drop outright, that's still $600, probably close to that future generation's equivalent of the GTX 1080).

In that case, I'd go with 1080 SLI every time to get the better performance that you want NOW, and for the next couple of years, for the same price as a single TXP + waterblock/hybrid cooler, which would be the lesser performer...


----------



## RobotDevil666

Quote:


> Originally Posted by *jcde7ago*
> 
> In 18-24 months' time, the Titan X Pascal's price drop probably won't be enough to get you to pull the trigger on another one to prolong the life of your GPU setup (i mean, even if it were a 50% price drop outright, that's still $600, probably close to that future generation's equivalent GTX 1080 solution).
> 
> In that case, i'd go with 1080 SLI every time to get that better performance that you want NOW and for the next couple of years for the same price as a single TXP + waterblock/Hybrid cooler, which will be a lesser performer...


Aaaaaand I think this is pretty much my conclusion too








Titan XP SLI is definitely out of my budget, and with 1080 SLI I get better performance now; even in the games that don't support SLI, it's still better than a single 980 Ti.
Thank you for the input, well-deserved +REP for you mate


----------



## jcde7ago

Quote:


> Originally Posted by *RobotDevil666*
> 
> Aaaaaand I think this is pretty much my conclusion too
> 
> 
> 
> 
> 
> 
> 
> 
> Titan XP SLI is definitely out of my budget and with 1080 SLI I get better performance now and even in the games that do not support SLI it's still better than single 980Ti.
> Thank you for the input well deserved +REP for you mate


No problem, best of luck! You're going to have a beast rig no matter which way you go.


----------



## willverduzco

Like many others here, I've been running dual cards for ages. I started with 2x Voodoo 2 12 MB SLI and then continued my experiences with 2x GTX 460, 2x 570, 2x 670, 2x 7970, 2x 290x, and 2x 980Ti. However starting with my GTX 1080 (Gigabyte Xtreme Gaming Waterforce), I started experimenting with single GPU and saw that it was sort of doable. Now with my Titan XP (also water-cooled), I just can't justify ever going back to the nightmare of support that is multi-GPU. When it works, it'll usually work great. But if you like playing AAA titles on launch day, the scaling usually isn't there yet (if support is there at all), and there tends to be much more variability in frame times (even if overall frame rate is higher).

As a side note, I actually found that I preferred the frame pacing on AMD's CrossfireX when I was running 2x290x versus when I had the 2x980Ti setup. That said, SLI generally works in a greater number of games--just with slightly worse scaling and frame pacing when both are on recent drivers.

I think unless you absolutely need to max out every single game at 4k60 ultra settings (and can live with simply high/very high in those games), a single heavily OCed TXP will more than suffice. After all, in my experience 1x TXP at 2126 MHz is as fast or slightly faster than 2x 980TI at max OC under water-cooling, and about as fast as 2x GTX 1080 when the 1080s are left to stock speeds.

Let's remember that with 40% more shaders, a 2126 MHz TXP pushes 15.24 TFLOPS, which is nearly double the 1080's 8.2 at stock clocks. Then, factoring in that SLI scaling is around 70-85% most of the time, and that the TXP isn't memory-bandwidth-constrained like the 1080 is at 4K resolution, you'll get similar if not better performance than 2x 1080 at stock. Similarly, a 1080 at max OC under water is about as fast as 2x 290X at max OC, but without the multi-GPU issues.

Don't get me wrong... I'd still prefer 2x Titan XP if someone gave me free money, but since I have to pay another $1200 for another, it's not high on my list of priorities. Similarly, if you can avoid having to buy 2x 1080s and can buy a single TXP for around the same money that when OCed will match the performance of 2x 1080 at stock, it seems silly to go with dual card.
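For what it's worth, the napkin math in this post can be checked in a few lines. This is a hedged sketch, not a benchmark: shader counts are the published specs, and the 8.2 TFLOPS figure for the 1080 lines up with its 1607 MHz reference base clock (an assumption on my part; boost-clock numbers come out higher).

```python
# Theoretical FP32 throughput: shaders x 2 FLOPs per cycle (FMA) x clock.
# Shader counts are published specs; clocks are the example values above.

def tflops(shaders: int, clock_mhz: float) -> float:
    """Theoretical single-precision TFLOPS."""
    return shaders * 2 * clock_mhz * 1e6 / 1e12

txp_oc  = tflops(3584, 2126)  # Titan X Pascal at the 2126 MHz OC quoted above
gtx1080 = tflops(2560, 1607)  # GTX 1080 at its 1607 MHz reference base clock

print(f"TXP @ 2126 MHz: {txp_oc:.2f} TFLOPS")    # ~15.24
print(f"1080 @ 1607 MHz: {gtx1080:.2f} TFLOPS")  # ~8.23
```

By this measure one overclocked TXP really does land at roughly 1.85x a stock 1080, which is where the "about two stock 1080s after typical SLI scaling losses" claim comes from.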


----------



## mbze430

Can someone here give me their SLI performance in DX11 in RoTTR?

I have been using DX12 in RoTTR with mGPU (SLI disabled), but last night I decided to run RoTTR in DX11 with SLI. I noticed it was only using 65-75% on each card. I'm wondering if anyone else is seeing the same.


----------



## jcde7ago

Quote:


> Originally Posted by *willverduzco*
> 
> Like many others here, I've been running dual cards for ages. I started with 2x Voodoo 2 12 MB SLI and then continued my experiences with 2x GTX 460, 2x 570, 2x 670, 2x 7970, 2x 290x, and 2x 980Ti. However starting with my GTX 1080 (Gigabyte Xtreme Gaming Waterforce), I started experimenting with single GPU and saw that it was sort of doable. Now with my Titan XP (also water-cooled), I just can't justify ever going back to the nightmare of support that is multi-GPU. When it works, it'll usually work great. But if you like playing AAA titles on launch day, the scaling usually isn't there yet (if support is there at all), and there tends to be much more variability in frame times (even if overall frame rate is higher).
> 
> As a side note, I actually found that I preferred the frame pacing on AMD's CrossfireX when I was running 2x290x versus when I had the 2x980Ti setup. That said, SLI generally works in a greater number of games--just with slightly worse scaling and frame pacing when both are on recent drivers.
> 
> I think unless you absolutely need to max out every single game at 4k60 ultra settings (and can live with simply high/very high in those games), a single heavily OCed TXP will more than suffice. After all, in my experience 1x TXP at 2126 MHz is as fast or slightly faster than 2x 980TI at max OC under water-cooling, and about as fast as 2x GTX 1080 when the 1080s are left to stock speeds.
> 
> Let's remember that with 40% more shaders, a 2126 MHz TXP pushes 15.24 TFLOP, which is nearly double the 1080's 8.2 at stock boost clocks. Then when factoring how SLI scaling is around 70-85% most of the time, as well as how the TXP isn't memory bandwidth constrained like the 1080 is at 4k resolution, you'll get similar if not better performance than 2x 1080 at stock. Similarly, a 1080 at max OC under water is about as fast as 2x 290x at max OC, but without the multi-GPU issues.
> 
> Don't get me wrong... I'd still prefer 2x Titan XP if someone gave me free money, but since I have to pay another $1200 for another, it's not high on my list of priorities. Similarly, if you can avoid having to buy 2x 1080s and can buy a single TXP for around the same money that when OCed will match the performance of 2x 1080 at stock, it seems silly to go with dual card.


I understand where you're coming from, but I've also been running multi-GPU configs for over a decade and have never had this "nightmare of support" that you seem to be portraying. It's also never been a problem to have more than a single GPU and then choose to run only one for a new-release title that may not have the best SLI support at launch...

The things you're referring to like lack of bandwidth constraints, etc., are all benefits that equally transfer to multi-GPU configs; I mean, that's what the HB SLI bridge is for.

Lastly, you keep reiterating that a single OC'd TXP is going to "perform similarly" to 2x 1080s at stock, but let's be real...NO ONE on OCN is going to spend $1,400 on a TXP and a block/hybrid cooler and NOT OC the crap out of it, nor would someone spend an equal amount of cash on 2x 1080s and NOT OC the crap out of those...

From a raw performance and versatility standpoint, there is no way that anyone who can choose between a single TXP and 1080 SLI, and who's had a decent experience with SLI, is going to pay the same amount for less performance...because there's no getting around it: a single Titan X Pascal IS, objectively speaking, a lesser performer than 2x 1080s. You can't cherry-pick a situation where someone is going to OC their TXP but not their 1080s just to prop up your argument on an enthusiast forum.


----------



## RobotDevil666

I have to agree with jcde7ago here: 1080 SLI, which I will obviously OC as much as possible, will definitely be faster than a single TXP given even a half-decent SLI profile.
Now, like both of you guys, I've been on SLI for ages; I've had an SLI setup every gen since the 9800GX2, and I know SLI support can be hit and miss, but when it works it's massive.
Lately I've had a spate of issues with SLI which made me consider a single-card solution, but the games I play that need more power than a TXP has to offer happen to have good SLI support (Witcher 3, The Division, GTA V to name a few), and in those 1080 SLI will easily be faster than a TXP


----------



## llll

I chose 2x 780ti over a Titan X a couple years back. I regret the decision.

SLI works, mostly, but it's a headache, and for some reason it doesn't work at all with newer drivers: after a DDU uninstall it'll be fine, but after a couple of restarts/shutdowns SLI stops working, and I get ~50% usage and huge stuttering even in benchmarks like Heaven and Fire Strike. Which is very weird. This time, instead of going 2x 1080, I'll get a Titan XP and I won't have to deal with that nonsense.


----------



## msp1609

Quote:


> Originally Posted by *Lobotomite430*
> 
> Was that stable at all? My card hovers around 2050-2088 and seems stable. No shunt mod though.


I was able to mess around with it more last night and got it to where it's stable now. The card will start at 2126 MHz, then drop to a solid 2088 MHz. The card is now hitting the voltage limit, though. My OC on the memory is at +475.


----------



## willverduzco

Quote:


> Originally Posted by *jcde7ago*
> 
> I understand where you're coming from but i've also been running multi-GPU configs for over a decade and have never had this "nightmare of support" that you seem to be portraying. It's also never been a problem to have more than a single GPU and then choosing to run only one to support a new-release title that may not have the best SLI support at launch...


Well there's where we disagree. If you have 2x1080 and get used to playing supported games at 4k60 with very high/ultra settings and suddenly have to deal with 1440p60 at ultra settings, that's quite horrible. I know when I was running 2x980TI, having to play Doom 2016 at 1440p was simply unacceptable. That's what made me buy my 1080, and when I saw that wasn't enough, the TXP. And let's not kid ourselves, a single 1080 is just not fast enough for 4k, so your experience would suffer.

Personally, I consider not being able to play AAA titles at launch day with the performance I have become accustomed to expect from double the investment to be a "nightmare." I also am very attuned to stutter and frame drops, so the frame time variance / micro-stutter (especially on Nvidia SLI) is unbearable to me. I assume this is much better with the HB SLI bridge, though.
Quote:


> Originally Posted by *jcde7ago*
> 
> The things you're referring to like lack of bandwidth constraints, etc., are all benefits that equally transfer to multi-GPU configs; I mean, that's what the HB SLI bridge is for.


Yes and no. The HB SLI Bridge solves inter-card bandwidth, but it does nothing to alleviate the fact that the 1080's GPU is bottlenecked at 4k resolution by its memory bandwidth and performs worse than its raw compute performance would lead you to believe. Don't believe me? Look at 1080 vs 980ti at low res (1440p and below) and then at 4k. The 980Ti and Maxwell Titan X catch up quite a bit because they aren't memory bandwidth-constrained. The same situation is the case when running SLI. This doesn't magically change in that scenario, since there is not enough memory bandwidth for the card's high compute performance. Adding more compute and more bandwidth at the same time in a similar ratio doesn't really address the issue.
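The compute-to-bandwidth argument above can be sketched numerically. These are approximate published reference specs, not measurements, and the bytes-per-FLOP ratio is just a rough heuristic for how bandwidth-starved a card is at 4K:

```python
# Rough compute-to-bandwidth comparison for the cards discussed above.
# TFLOPS and GB/s figures are approximate reference specs.

cards = {
    # name: (peak FP32 TFLOPS, memory bandwidth in GB/s)
    "GTX 980 Ti":     (5.6, 336),
    "GTX 1080":       (8.2, 320),
    "Titan X Pascal": (11.0, 480),
}

# Bytes of bandwidth available per FLOP of compute; a lower number means
# the GPU is more likely to starve at bandwidth-heavy resolutions like 4K.
ratios = {name: bw / (tf * 1000) for name, (tf, bw) in cards.items()}

for name, r in ratios.items():
    print(f"{name}: {r:.3f} bytes/FLOP")
```

On these numbers the 1080 has the least bandwidth per unit of compute of the three, and SLI doubles compute and bandwidth together, so the ratio (and the 4K bottleneck) is unchanged, which is the point being made here.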
Quote:


> Originally Posted by *jcde7ago*
> 
> Lastly, you keep reiterating that a single OC'd TXP is going to "perform similarly" to 2x 1080s at stock, but let's be real...NO ONE on OCN is going to spend $1,400 for a TXP and a block/hybrid cooler and NOT OC the crap out of it, nor would someone spend an equal amount of cash on 2x 1080s and NOT OC the crap out of those...
> 
> From a raw performance and versatility standpoint there is no way that anyone who can choose between a single TXP and 1080 SLI who's had a decent experience with SLI is going to pay the same amount for less performance...cause there is no going around it....a single Titan X Pascal IS, objectively speaking, a lesser performer than 2x 1080s. You can't cherry pick a situation where someone is going to OC their TXP but not their 1080s just to prop up your argument on an enthusiast forum.


Yes, 2x 1080 at max OC will OBVIOUSLY be faster than 1x TXP at max OC. I wasn't debating that. I was just saying that 1x TXP at max OC is as fast as 2x 1080 at stock (or 2x 980ti at max OC), which shows how capable it would be, especially since you don't have to deal with the aforementioned SLI issues.

Also, let's not forget how this is all going to become more of an issue with DX12 and Vulkan, where it's now up to the developer to actively support EMA. Driver-based multi-GPU is now a thing of the past, and I see quite a lot of titles either forgoing support or taking a while to get it. Look at how long it took RoTTR to get DX12 EMA support, and note how we don't even have official support for Doom. Game engines are getting more and more complicated each passing generation. Simple immediate-mode rendering is no longer feasible when doing lots of shader work. Modern engines are either deferred renderers or hybrids like id Tech 6, which makes it much, much harder for devs to properly support AFR.

All in all, when the choice of selling my 1080 and buying a TXP vs buying a second 1080 presented itself, the choice was crystal clear. Obviously if someone gave me a second TXP for free, I'd love to deal with those SLI issues and have a setup like yours. But since I'd have to pay for a second card, I am more than happy to get away from the issues at half the cost.


----------



## jcde7ago

Quote:


> Originally Posted by *RobotDevil666*
> 
> I have to agree with jcde7ago here: 1080 SLI, which I will obviously OC as much as possible, will definitely be faster than a single TXP given even a half-decent SLI profile.
> Now, like both of you guys, I've been on SLI for ages; I've had an SLI setup every gen since the 9800GX2, and I know SLI support can be hit and miss, but when it works it's massive.
> Lately I've had a spate of issues with SLI which made me consider a single-card solution, but the games I play that need more power than a TXP has to offer happen to have good SLI support (Witcher 3, The Division, GTA V to name a few), and in those 1080 SLI will easily be faster than a TXP


I think the "need" for SLI becomes more readily apparent for those of us on [email protected] or [email protected] (and also [email protected]) who want to graphically max out games as much as humanly possible while also trying to have the highest average framerates as possible...the benefits far, far outweigh the negatives especially when there are ways to tweak SLI profiles and such for better performance.

Now, if we were talking about SLI past 2 cards then my opinion would shift drastically after having come off of 3x Titan X Maxwells and knowing how THAT experience was like...things might be different now with high-bandwidth Pascal GPUs/HB SLI bridge but 3-way SLI Titan X Maxwells was a NIGHTMARE for a lot of games, even WITH proper SLI support/SLI profile tweaking; I had to resort to 2 cards a lot of the times just because the scaling was poor or the AFR implementation so bad after 2-cards that the microstuttering was completely unavoidable.
Quote:


> Originally Posted by *llll*
> 
> after a couple restarts/shutdowns, SLi stops working, and I get ~50% usage and huge stuttering even in benchmarks like Heaven and Firestrike.


I'm willing to give people the benefit of the doubt, but that kind of behavior screams of a different problem than SLI driver issues to me...or else literally everyone who has ever run benchmarks on more than a single Nvidia GPU would be reporting the same problems and suffering from the same low scores, etc.

Sorry you had a crap SLI experience but it's not like that for everyone.


----------



## TremF

One era ends (I have sold my GTX Titan X SLI setup) and another begins: Titan X Pascal en route!


----------



## jcde7ago

@RobotDevil666

Here's Guru3D's GTX 1080 SLI review with some FCAT results (that's the only reliable way to accurately measure any scaling/microstuttering issues with SLI):

http://www.guru3d.com/articles-pages/geforce-gtx-1080-2-way-sli-review,1.html

Basically, the 1080s in SLI scale phenomenally in games that have proper SLI support....and that's what my experience has been with SLI'd Titan X Pascals as well.









What it's going to come down to really is the way particular game engines deal with SLI...even with proper driver support a game can still just not do well with multi-GPUs overall, but from my experience this is pretty rare and I have no issues switching to only a single GPU for that particular game if there are noticeable issues.

Any other microstutter or perceived issues with SLI in the last couple of generations of GPUs come down to either eyeballing it and being sensitive to those issues or, as I said, the game engine not translating well to SLI in the first place...and unless everyone is showing their FCAT results, I don't want to hear that "zomg SLI issues!" nonsense.


----------



## bwana

Quote:


> Originally Posted by *eliau81*
> 
> thanks again for the post
> 
> as I see it I've got 2 options:
> 
> 1. wait for a proper EVGA Hybrid Titan XP release (which can take some time, if it comes at all).
> 2. make a mod like the one you mentioned with the 1080
> 
> a user on reddit has done so;
> he said he had to make 3 aggressive changes to the EVGA heatspreader
> 
> https://www.reddit.com/r/4ziyr1/evga_hybrid_kit_modded_for_titan_x_pascal_album/
> thinking of doing that also, but wondering what kind of tool he used to grind?
> 
> can you give me a link to Amazon for the fan? I want to make sure to buy the one you recommend
> did you do the shunt mod?


Why is this necessary? Will the shroud that comes with the 1080 kit not fit on the Titan baseplate?


----------



## willverduzco

Quote:


> Originally Posted by *bwana*
> 
> Why is this necessary? Will the shroud that comes with the 1080 kit not fit on the Titan baseplate?


It does not fit properly because the TXP has an extra VRM phase for the core that the reference 1080 does not have. Thus, you either have to cut a hole in the baseplate to make room for that extra choke (and cut a hole for the extra power connector) or simply use the stock TXP baseplate and only use the GPU core portion of the 1080 hybrid kit.


----------



## SuCCEzz

Quote:


> Originally Posted by *bwana*
> 
> Why is this necessary? Will the shroud that comes with the 1080 kit not fit on the Titan baseplate?


Don't worry about modding the EVGA cover. Just do this, unless you are just partial to the EVGA look.

http://i.imgur.com/8sderDh.jpg


----------



## bwana

I guess I am missing something. Is the 1080 shroud not the same as the TXP shroud? Can I not just mod the shroud that comes with the 1080 kit to make it fit on the TXP baseplate?


----------



## bwana

Has anyone used these fans instead of Gentle Typhoons?

https://www.amazon.com/gp/product/B00KFCR5BA/ref=ox_sc_act_title_5?ie=UTF8&psc=1&smid=ATVPDKIKX0DER

If I use 2 fans on the circuit that powers the rad fans, can the TXP support that? Anyone know what the max amps are on that circuit?


----------



## SuCCEzz

Quote:


> Originally Posted by *bwana*
> 
> I guess I am missing something. Is the 1080 shroud not the same as the TXP shroud? Can I not just mod the shroud that comes with the 1080 kit to make it fit on the TXP baseplate?


You can mod the shroud; it's actually quite simple.


----------



## jhowell1030

Quote:


> Originally Posted by *SuCCEzz*
> 
> Don't worry about modding the EVGA cover. Just do this, unless you are just partial to the EVGA look.
> 
> http://i.imgur.com/8sderDh.jpg


Mind explaining a bit what "this" is? Yeah, we see it (for the third time), but without an explanation of what was done, and with what, to achieve this beauty, the picture is kind of pointless.


----------



## axiumone

Quote:


> Originally Posted by *jhowell1030*
> 
> Mind explaining a bit what "this" is? Yeah, we see it (for the third time), but without an explanation of what was done, and with what, to achieve this beauty, the picture is kind of pointless.


Same thing as this - http://www.overclock.net/t/1601323/gtx-1080-fe-ref-hybrid-guide-minimal-tools-clean-look/0_100


----------



## kx11

Quote:


> Originally Posted by *RobotDevil666*
> 
> Well that's the choice I'm facing 1080SLI on air or watercooled Titan XP
> I was considering 2 of those
> 
> https://www.scan.co.uk/products/zotac-geforce-gtx-1080-amp-edition-8gb-gddr5x-vr-ready-graphics-card-2560-core-1683mhz-gpu-1822mhz-b
> 
> or those
> 
> https://www.scan.co.uk/products/palit-geforce-gtx-1080-super-jetstream-8gb-gddr5x-vr-ready-graphics-card-2560-core-1708mhz-gpu-1847m
> 
> and potentially watercool them bit later, maybe ... depends on the noise really


I made my decision 2 weeks ago: selling my 2 Strix 1080s and replacing them with a watercooled TXP


----------



## RobotDevil666

Quote:


> Originally Posted by *kx11*
> 
> I made my decision 2 weeks ago: selling my 2 Strix 1080s and replacing them with a watercooled TXP


Care to elaborate why ?


----------



## jcde7ago

Quote:


> Originally Posted by *RobotDevil666*
> 
> Care to elaborate why ?


I'd hazard a guess that it's because a single TXP will hit 120FPS+ in a large majority of games, including an EASY 100FPS+ in most demanding titles @2560x1440p....and that's BEFORE overclocking. The net benefit is basically nothing and he gets to stay with a single card solution.

I'll say again though that anyone looking for more than a single TXP is 1) bat**** crazy and 2) wanting to squeeze every last drop out of [email protected] or [email protected] primarily.


----------



## Jpmboy

Quote:


> Originally Posted by *jcde7ago*
> 
> I'd hazard a guess that it's because a single TXP will hit 120FPS+ in a large majority of games, including an EASY 100FPS+ in most demanding titles @2560x1440p....and that's BEFORE overclocking. The net benefit is basically nothing and he gets to stay with a single card solution.
> 
> I'll say again though that anyone looking for more than a single TXP is 1) bat**** crazy and 2) wanting to squeeze every last drop out of [email protected] or [email protected] primarily.


#1 is the answer... wait, wut? that's me.


----------



## RobotDevil666

One funny thing in this whole discussion: 1080 SLI = not good cuz SLI sucks; Titan X Pascal SLI = awesome cuz Titan.







wait what ?


----------



## Creator

Quote:


> Originally Posted by *RobotDevil666*
> 
> One funny thing in this whole discussion is ....... 1080SLI = not good cuz SLI sucks Titan X Pascal SLI = awesome cuz Titan
> 
> 
> 
> 
> 
> 
> 
> wait what ?


That's because SLI only truly sucks when there's actually a single-GPU alternative, and as you know, there is currently none in the case of TXP SLI. But as soon as the 1180 or Titan X Volta comes out, Titan XP SLI will be the worst thing ever.


----------



## scoobied77

Quote:


> Originally Posted by *Menthol*
> 
> I got a TH10 Magnum when they first came out; I was so glad to see someone selling American craftsmanship that I felt compelled to support them, not to mention they sell brackets, plates, spare parts, etc.
> It is kind of large for a desktop case. I use it for benching since it has radiators and pumps installed. I keep planning on getting an external radiator/pump setup but have never gotten around to it; I'm thinking one of their bottom expansion cases for this purpose, since it could also house a power supply


Shame they look so dated now. . .

The look was cool about 15 years ago.


----------



## Jpmboy

Quote:


> Originally Posted by *scoobied77*
> 
> Shame they look so dated now. . .
> 
> The look was cool about 15 years ago.


I'm into _ratrods_... this looks cool.












Spoiler: Warning: Spoiler!


----------



## atreides

Quote:


> Originally Posted by *jcde7ago*
> 
> I'd hazard a guess that it's because a single TXP will hit 120FPS+ in a large majority of games, including an EASY 100FPS+ in most demanding titles @2560x1440p....and that's BEFORE overclocking. The net benefit is basically nothing and he gets to stay with a single card solution.
> 
> I'll say again though that anyone looking for more than a single TXP is 1) bat**** crazy and 2) wanting to squeeze every last drop out of [email protected] or [email protected] primarily.


I have a 3440x1440 100Hz panel, and today my second TXP arrived. Unfortunately, Nvidia mistakenly sent me a 3-slot SLI HB bridge instead of a 4-slot one, so now I have to send it back, wait for Nvidia to receive it, and then wait for them to ship the correct bridge. I have the soft bridge connecting my Titans in SLI atm, and so far I can't really see much of a difference in performance. I really thought that SLI would improve my Fallout 4 ENB performance, but even with two Titans I can't stay above 60 fps. I know ENB is very demanding, but I was convinced that I could achieve a stable 60 fps with 2 of these. Maybe that will change when I use the HB bridge? What kind of overclock would you guys recommend? I have a Cosmos II case, and right now on stock settings my cards are staying pretty cool; I'm just wondering how high I should go. Any help would be great, thanks guys.


----------



## scoobied77

Quote:


> Originally Posted by *Jpmboy*
> 
> I'm into _ratrods_... this looks cool.


Meh,

This looks a darn sight cooler (and modern).

http://smg.photobucket.com/user/scoobiedave/media/phanteks-4b_zpsqz4wrtru.jpg.html


----------



## 295033

Quote:


> Originally Posted by *scoobied77*
> 
> Meh,
> 
> This looks a darn sight cooler (and modern).
> 
> http://smg.photobucket.com/user/scoobiedave/media/phanteks-4b_zpsqz4wrtru.jpg.html


Agree to disagree. Nothing beats the aesthetics of a CaseLabs case in my opinion.


----------



## Godsarmy

got mine finally on water


----------



## jcde7ago

Quote:


> Originally Posted by *Iorek*
> 
> Agree to disagree. Nothing beats the aesthetics of a CaseLabs case in my opinion.


Agreed...love my SMA8...will not be buying another case for many, many years...it's well-built, simple, clean and intuitive.

Also, there are literally endless ways to make a case look 'modern,' and a lot of that is how clean the component installs look, down to the paint job and how the room is set up and what other hardware/parts are on/around the desk, etc....
Quote:


> Originally Posted by *scoobied77*
> 
> Meh,
> 
> This looks a darn sight cooler (and modern).
> 
> http://smg.photobucket.com/user/scoobiedave/media/phanteks-4b_zpsqz4wrtru.jpg.html


That build/case could look like absolute trash without LED lighting highlights (easy to do on literally anything) + if the component setup inside was all over the place with zero color coordination, air instead of water cooling, etc...again, pretty easy to make most builds/cases look 'modern' with minimal effort. I'm picking my build/SMA8 as being 'darn sight cooler and modern' every time over that picture, no offense.


----------



## kx11

Quote:


> Originally Posted by *RobotDevil666*
> 
> Care to elaborate why ?


SLI isn't worth it


----------



## jcde7ago

Quote:


> Originally Posted by *kx11*
> 
> SLi isn't *worth* it


Lol...talk about being super subjective...but hey, whatever floats your boat...it's worth it to a whole helluva lot of people, even if it's worth nothing to you.


----------



## MunneY

Quote:


> Originally Posted by *kx11*
> 
> SLi isn't worth it


This is an opinion, not a fact


----------



## Glerox

I had two GTX 1080s in SLI. I switched to a TXP and put it under water. Overclocked at +235/+550 with stock voltage; stable at 2076-2088 MHz, which is freaking awesome because I was at 2101 MHz on my 1080s. 80-90 fps in Battlefield 1 at 4K Ultra on a single GPU! Couldn't be happier with my decision.







But yes, two 1080s are way more powerful.

Here are my 3DMark Fire Strike Ultra scores:

One GTX 1080 OC = 5500
One TXP OC = 7700 (+40%)
Two 1080s OC = 10000 (+30%)

As you can see, the scaling in 3DMark is almost perfect, which unfortunately is not representative of reality.
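The percentages in that score list can be made explicit. A quick sketch, using only the scores quoted in this post:

```python
# Fire Strike Ultra scores quoted above.
single_1080 = 5500   # one OC'd GTX 1080
single_txp  = 7700   # one OC'd Titan X Pascal
sli_1080    = 10000  # two OC'd GTX 1080s

txp_gain     = single_txp / single_1080 - 1   # TXP over one 1080
sli_over_txp = sli_1080 / single_txp - 1      # 1080 SLI over the TXP
# How much of the second 1080's theoretical 100% actually shows up:
sli_scaling  = (sli_1080 - single_1080) / single_1080

print(f"TXP vs one 1080:     +{txp_gain:.0%}")      # +40%
print(f"1080 SLI vs TXP:     +{sli_over_txp:.0%}")  # +30%
print(f"Second-card scaling:  {sli_scaling:.0%}")   # 82%
```

So by these scores the second 1080 delivers about 82% of an extra card's worth of performance in this benchmark, which is excellent scaling even if, as noted, games rarely match it.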


----------



## kx11

Quote:


> Originally Posted by *MunneY*
> 
> This is an opinion, not a fact


When you play a game that you really like (Batman: AK) and it doesn't support SLI, it feels like you wasted your money.

In that case, it is a fact.


----------



## llll

Quote:


> Originally Posted by *jcde7ago*
> 
> I think the "need" for SLI becomes more readily apparent for those of us on [email protected] or [email protected] (and also [email protected]) who want to graphically max out games as much as humanly possible while also trying to have the highest average framerates as possible...the benefits far, far outweigh the negatives especially when there are ways to tweak SLI profiles and such for better performance.
> 
> Now, if we were talking about SLI past 2 cards then my opinion would shift drastically after having come off of 3x Titan X Maxwells and knowing how THAT experience was like...things might be different now with high-bandwidth Pascal GPUs/HB SLI bridge but 3-way SLI Titan X Maxwells was a NIGHTMARE for a lot of games, even WITH proper SLI support/SLI profile tweaking; I had to resort to 2 cards a lot of the times just because the scaling was poor or the AFR implementation so bad after 2-cards that the microstuttering was completely unavoidable.
> I'm willing to give people the benefit of the doubt but that kind of interaction screams of a different problem than SLI driver issues to me...or else literally everyone who has ever ran benchmarks on more than a single Nvidia GPU would be reporting the same problems and would suffer from the same low scores, etc.
> 
> Sorry you had a crap SLI experience but it's not like that for everyone.


Prior to the 2x780tis, I owned a 3x580 system and it was fantastic. I didn't suspect a hardware issue because it always works immediately following a DDU driver wipe+clean install, though not for more than a day or so. It's probably the strangest problem I've ever encountered since I can't identify any sort of cause.


----------



## axiumone

Did the hybrid mod on my cards tonight. Push/pull fans at 1000 RPM keep both cards at 45-50°C @ 2100 MHz core and +700 VRAM, game stable.



http://www.3dmark.com/fs/10111187



Spoiler: PICS



One of the cards had a nice thin layer of thermal paste from the factory. The second was a horror show.


----------



## lilchronic

Quote:


> Originally Posted by *kx11*
> 
> When you play a game that you really like (Batman: AK) and it doesn't support SLI, it feels like you wasted your money.
> 
> In that case, it is a fact.


Use one as a PhysX card; guaranteed it will run better.


----------



## jcde7ago

Quote:


> Originally Posted by *kx11*
> 
> When you play a game that you really like (Batman: AK) and it doesn't support SLI, it feels like you wasted your money.
> 
> In that case, it is a fact.


No, that's still subjective as that is YOUR definition of worth/value; someone else can feel like it's a waste for that particular game but feel that SLI is still a worthy investment for them overall if it works for all of the other games they play. Others simply aren't bothered by outlier games even if they like those games a lot.

In any case, we're at a point now where Batman: AK doesn't even need SLI to be maxed out at most resolutions @ max settings anyway, so of course it's not objectively "worth it" anymore today (also, AK's engine just flat out doesn't jive well with ANY multi-GPU config because of the PhysX implementation; it gets screwy if you have SLI but don't use a dedicated PhysX card, etc. You can Google the SLI driver bits to at least get ~180% scaling in AK with minimal issues).


----------



## scoobied77

Quote:


> Originally Posted by *jcde7ago*
> 
> Agreed...love my SMA8...will not be buying another case for many, many years...it's well-built, simple, clean and intuitive.
> 
> Also, there are literally endless ways to make a case look 'modern,' and a lot of that is how clean component installs look, down to the paint job and however else a room is setup and what other hardware/parts are there on/around a desk, etc....
> That build/case could look like absolute trash without LED lighting highlights (easy to do on literally anything) + if the component setup inside was all over the place with zero color coordination, air instead of water cooling, etc...again, pretty easy to make most builds/cases look 'modern' with minimal effort. I'm picking my build/SMA8 as being 'darn sight cooler and modern' every time over that picture, no offense.


Enjoy your 90's looking server case.


----------



## jcde7ago

Quote:


> Originally Posted by *scoobied77*
> 
> Enjoy your 90's looking server case.


Heh, thanks, I will...and I'll take that as a compliment because the 90's were the *****.


----------



## scoobied77

Quote:


> Originally Posted by *jcde7ago*
> 
> Heh, thanks, I will...and i'll take that as a complement because the 90's were the *****.


I'll have to agree with you there; the 90's were great, and the 80's too.


----------



## shapin

I love the SMA8; no case comes near its quality. CaseLabs makes the best cases by far.


----------



## RobotDevil666

Quote:


> Originally Posted by *jcde7ago*
> 
> No, that's still subjective as that is YOUR definition of worth/value; someone else can feel like it's a waste for that particular game but feel that SLI is still a worthy investment for them overall if it works for all of the other games they play. Others simply aren't bothered by outlier games even if they like those games a lot.
> 
> In any case, we're at a point now where Batman: AK doesn't even need SLI to be maxed out at most resolutions @ max settings anyways, so of course it's not objectively "worth it" anymore today (and also, AK's engine just flat out doesn't jive well with ANY multi-GPU configs because of the PhysX implementation; it gets screwy if you have SLI but don't use a dedicated PhysX card too, etc. You can Google the SLI driver bits to at least get 180%~ scaling in AK with minimal issues).


It's done now







I've ordered 2 EVGA SC 1080's + EK waterblocks/backplates








Here's why: 1080 SLI will definitely be faster than 980 Ti SLI when supported, and in games that don't support it, a single OC'd 1080 will still be much faster than a single 980 Ti, so it's an upgrade in both respects.
Also, I've done a little count: the games I play that don't support SLI I can easily max with a 1080, and those that need more power than 980 Ti SLI have good SLI profiles, like GTA V, Witcher 3 and The Division for example.
Also, EVGA has Step-Up, so if the 1080 Ti happens to drop within 90 days I'll jump on that; if it doesn't, 1080 SLI should be enough power to tide me over till Volta.
Gear comes tomorrow, really excited, I feel like a fat kid in a candy store


----------



## Jpmboy

Quote:


> Originally Posted by *scoobied77*
> 
> Meh,
> 
> This looks a darn sight cooler (and modern).
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://smg.photobucket.com/user/scoobiedave/media/phanteks-4b_zpsqz4wrtru.jpg.html


lol - cooler, maybe; that's subjective. Slower, definitely.








I stopped building "pretty" PCs like that a decade ago. And when you change gear every few months.. well, it ain't so pretty.
Quote:


> Originally Posted by *jcde7ago*
> 
> Heh, thanks, I will...and i'll take that as a complement because the 90's were the *****.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Nice one!


----------



## mbze430

Quote:


> Originally Posted by *lilchronic*
> 
> Use one as a physx card, guaranteed it will run better


I can confirm it does! Running SLI Titan XP w/ 980TI as dedicated PhysX
Quote:


> Originally Posted by *Jpmboy*
> 
> And when you change gear every few months.. well, it ain't so pretty.


AGREE! And it's a waste of a lot of good zip ties lol


----------



## davidm71

Wonder where the cheapest place is to buy a Titan XP?

Thanks


----------



## DADDYDC650

Quote:


> Originally Posted by *davidm71*
> 
> Wonder where is the cheapest place to buy a Titan XP?
> 
> Thanks


Used on ebay or craigslist. Perhaps the sales section on this site?


----------



## Jpmboy

Quote:


> Originally Posted by *davidm71*
> 
> Wonder where is the cheapest place to buy a Titan XP?
> 
> Thanks


new.. only from Nvidia. Used... OCN market place, fleaBay etc.


----------



## DarkIdeals

Quote:


> Originally Posted by *DADDYDC650*
> 
> Used on ebay or craigslist. Perhaps the sales section on this site?


I picked up my 2nd TITAN XP from a Facebook meet-up with a guy who does YouTube tech reviews etc. for a living. Paid $1,200 shipped, so I avoided the taxes and stuff. My first TITAN XP I paid $1,309 for with one-day shipping and tax.

Honestly though, it's not worth it in most cases to buy used. The only reasons I did were that 1) they were out of stock and it was a Friday, so I didn't want to wait till Monday to see if they re-stocked, and 2) the guy was pretty legit; he had charity events livestreamed on his channel and stuff, you could see the exact same two TITAN XPs he was selling in his review videos, and he stated the only reason he bought them was for review anyway. So I knew they were only used for a couple of days of review testing and that he seemed to take care of them.

If you don't have someone you trust like that, I'd definitely just recommend buying from Nvidia to get the warranty and peace of mind (unless you can snag one for like $1,000 or less, then I'd definitely jump on it).


----------



## davidm71

Thanks. Don't want to spend more than a thousand dollars on a new one. Wonder if Nvidia will drop the price soon, or by Xmas. Think it would pair nicely with an Acer Predator 4K monitor, and those go for $900. Got to budget. Thanks.


----------



## mbze430

Didn't realize people are already selling their used Titan XPs.


----------



## eliau81

Quote:


> Originally Posted by *mbze430*
> 
> I can confirm it does! Running SLI Titan XP w/ 980TI as dedicated PhysX
> AGREE! and its a waste of a lot of good zip ties lol


Wait... are you saying that my 980, not the Ti, just a 980, could be used as a PhysX card also?


----------



## lilchronic

Quote:


> Originally Posted by *eliau81*
> 
> Wait... are you saying that my 980, not the Ti, just a 980, could be used as a PhysX card also?


Yep


----------



## jcde7ago

Quote:


> Originally Posted by *eliau81*
> 
> Wait... are you saying that my 980, not the Ti, just a 980, could be used as a PhysX card also?


Please don't use a 980 for a dedicated PhysX card...if you do, somewhere in the world a kitten will die...


----------



## eliau81

Quote:


> Originally Posted by *lilchronic*
> 
> Yep


Hmmm... LinusTechTips has a video on using another card for PhysX.
Dunno if it applies to TXP + GTX 980; they didn't test it.


----------



## jcde7ago

Quote:


> Originally Posted by *eliau81*
> 
> Hmmm... LinusTechTips has a video on using another card for PhysX.
> Dunno if it applies to TXP + GTX 980; they didn't test it.


A GTX 980 (non-Ti) can still fetch around ~$220 USD...unless you're playing at 4K with a TXP and REALLY into a game that takes advantage of a dedicated PhysX card, you're much better off selling it (you're not going to get $200+ worth of performance out of the 980 in a PhysX game if you're already using a TXP).


----------



## PowerK

I have a spare Asus Strix 1070 laying around and wondering whether it's a good idea to add it to my Titan SLI as a dedicated PhysX. (Occupies PCI-E lane, heat, power etc)

Edit: 6950X with X99 has 40 PCI-E lanes, so.. Should be ok in terms of lane.


----------



## jcde7ago

Quote:


> Originally Posted by *PowerK*
> 
> I have a spare Asus Strix 1070 laying around and wondering whether it's a good idea to add it to my Titan SLI as a dedicated PhysX. (Occupies PCI-E lane, heat, power etc)
> 
> Edit: 6950X with X99 has 40 PCI-E lanes, so.. Should be ok in terms of lane.


This would be an absolute waste for TXPs in SLI; sell that 1070. You stand to gain literally nothing from having a $450 dedicated Pascal PhysX card when you're running a single 2560x1440p @ 144hz monitor that won't even take full advantage of 2x TXPs as it is.

If I were in your position I'd sell the 1070 AND your S2716DG and either get an Acer Predator X34 or a 4K display...but if you're not interested in either, then get your $400+ back anyway and stash it away for the next inevitable Volta SLI upgrade before its value depreciates (unless you really hate money and/or are too lazy to resell; either way, the 1070 would be wasted heat and electricity).


----------



## lilchronic

Mafia 3 is coming out soon. A physx card sure helped a little in mafia 2

GTX780


GTX 780 + GTX480 Dedicated to physx


----------



## PowerK

Quote:


> Originally Posted by *jcde7ago*
> 
> This would be an absolute waste for TXPs in SLI; sell that 1070. You stand to gain literally nothing from having a $450 dedicated Pascal PhysX card when you're running a single 2560x1440p @ 144hz monitor that won't even take full advantage of 2x TXPs as it is.
> 
> If I were in your position i'd sell the 1070 AND your S2716DG and either get an Acer Predator X34 or a 4k display...but if you're not interested in either then get your $400+ back anyway and stash it away for the next inevitable Volta SLI upgrade before its value depreciates (unless you really hate money and/or are too lazy to resell. Either way, the 1070 would be wasted heat and electricity).


1070 being a waste for dedicated PhysX, I have to agree with you.
As for the monitor, nah..
I also am a videophile who loves eye candy.
Used to own a Sony GDM-FW900 since 2001.
For me, 4K panels are not there yet.
2560x1440 @ 144Hz > 4K @ 60Hz for me, currently.

Last but not least, your comment about a 1440p @ 144Hz monitor not taking full advantage of TXP SLI: I have no idea what you're talking about. I can get even a 1080p monitor taking full advantage of TXP SLI. Downsampling, DSR, SGSSAA.. the list goes on.


----------



## jcde7ago

Quote:


> Originally Posted by *PowerK*
> 
> 1070 being a waste for dedicated PhysX, I have to agree with you.
> As for monitor, nah..
> I also am a videophile who loves eye candy.
> Used to own Sony GDM-FW900 since 2001.
> For me, 4K panels are not there yet.
> 2560x1440 @ 144Hz > 4K @ 60Hz for me, currently.
> 
> Last but not least, your comment about a 1440p @ 144Hz monitor not taking full advantage of TXP SLI: I have no idea what you're talking about. I can get even a 1080p monitor taking full advantage of TXP SLI. Downsampling, DSR, SGSSAA.. the list goes on.


It was just a suggestion; I'm pretty sure most people are waiting on high-refresh 4K displays, but some prefer higher native resolutions over FPS.

Sure, mods and upsampling/DSR can take advantage of lower-resolution monitors, but I'm speaking in generalities; most people do not dabble in DSR and upsampling because the performance hit on likely smaller monitors isn't worth it when people can take advantage of native 2K-4K displays at appropriate sizes.

Generally speaking, without upsampling or DSR, 2x TXPs will destroy pretty much every game on the planet right now at 2560x1440 @ 144Hz with every in-game setting at the highest, outside of the most taxing SSAA implementations, and still hold 144Hz 99.9% of the time. Sticking a 1070 in for the scenarios where DSR/upsampling is used in the very limited set of PhysX games is STILL not likely to make much of a difference, if any, in those games...you'd basically be choosing to keep a $450 PhysX paperweight while simultaneously heating up your house and consuming more electricity. But if that's what you want because you feel like it will make a difference, that is certainly your prerogative.


----------



## PowerK

Quote:


> Originally Posted by *jcde7ago*
> 
> It was just a suggestion; I'm pretty sure most people are waiting on high-refresh 4K displays, but some prefer higher native resolutions over FPS.
> 
> Sure, mods and upsampling/DSR can take advantage of lower-resolution monitors, but I'm speaking in generalities; most people do not dabble in DSR and upsampling because the performance hit on likely smaller monitors isn't worth it when people can take advantage of native 2K-4K displays at appropriate sizes.
> 
> Generally speaking, without upsampling or DSR, 2x TXPs will destroy pretty much every game on the planet right now at 2560x1440 @ 144Hz with every in-game setting at the highest, outside of the most taxing SSAA implementations, and still hold 144Hz 99.9% of the time. Sticking a 1070 in for the scenarios where DSR/upsampling is used in the very limited set of PhysX games is STILL not likely to make much of a difference, if any, in those games...you'd basically be choosing to keep a $450 PhysX paperweight while simultaneously heating up your house and consuming more electricity. But if that's what you want because you feel like it will make a difference, that is certainly your prerogative.


Understood, and a valid point.
I won't waste my time installing 1070 as a dedicated PhysX.


----------



## jcde7ago

Quote:


> Originally Posted by *lilchronic*
> 
> Mafia 3 is coming out soon. A physx card sure helped a little in mafia 2
> 
> GTX780
> 
> 
> GTX 780 + GTX480 Dedicated to physx


Yeah, we've come a long way since the days of using the 400 series as beneficial PhysX cards....architecture changes since then basically mean that GPUs below the mid-high-end GTX 700 series used as dedicated PhysX cards are likely to have a NEGATIVE effect on performance; you now need a "decent" card that won't slow down your primary GPU if you want to see performance increases from a dedicated PPU, but that "decent" extra PhysX card has to make financial sense, otherwise it's not worth it at all.

If you have Pascals in SLI I would never recommend getting a third card for dedicated PhysX, as those two in SLI should be able to handle everything including PhysX; however, with a single 1060/1070/1080/TXP I could definitely see some benefits from having a GTX 7xx as a dedicated PhysX card.

I'd imagine that anything higher than a GTX 7xx card would be a waste, especially if you don't have that extra card already, as then you're getting into "might as well pick up a second GPU for SLI instead" territory (why would you waste money on a dedicated PhysX card instead of just outright going for SLI, which would have a bigger net performance increase?).
Quote:


> Originally Posted by *PowerK*
> 
> Understood, and a valid point.
> I won't waste my time installing 1070 as a dedicated PhysX.


No worries dude, I've experienced the pain firsthand of trying to squeeze out that extra bit of performance, particularly with using a dedicated PhysX card...I tried that various times since my GTX 470 SLI days (using a GTX 260 as dedicated PhysX, and then getting GTX 480 SLI and using a GTX 470 as PhysX); it was ALWAYS disappointing.


----------



## bl4ckdot

There we go


----------



## Ninjawithagun

My Titan XP is finally watercooled (EKWB full cover waterblock and back plate) and temps are awesome! 33-46C when gaming and max 50C when heavy load benchmarking...and those temps are with an Intel Core i7 3930K @ 4.5Ghz at 1.41V ;-)


----------



## Ninjawithagun

Nice man ;-)


----------



## lilchronic

Quote:


> Originally Posted by *jcde7ago*
> 
> Yeah, we've come a long way since the days of using the 400 series as beneficial PhysX cards....architecture changes since then basically mean that GPUs below the mid-high-end GTX 700 series used as dedicated PhysX cards are likely to have a NEGATIVE effect on performance; you now need a "decent" card that won't slow down your primary GPU if you want to see performance increases from a dedicated PPU, but that "decent" extra PhysX card has to make financial sense, otherwise it's not worth it at all.
> 
> If you have Pascals in SLI I would never recommend getting a third card for dedicated PhysX, as those two in SLI should be able to handle everything including PhysX; however, with a single 1060/1070/1080/TXP I could definitely see some benefits from having a GTX 7xx as a dedicated PhysX card.
> 
> I'd imagine that anything higher than a GTX 7xx card would be a waste, especially if you don't have that extra card already, as then you're getting into "might as well pick up a second GPU for SLI instead" territory (why would you waste money on a dedicated PhysX card instead of just outright going for SLI, which would have a bigger net performance increase?).
> No worries dude, I've experienced the pain firsthand of trying to squeeze out that extra bit of performance, particularly with using a dedicated PhysX card...I tried that various times since my GTX 470 SLI days (using a GTX 260 as dedicated PhysX, and then getting GTX 480 SLI and using a GTX 470 as PhysX); it was ALWAYS disappointing.


Ok, so say you have TXP SLI and you play a game with heavy PhysX. Disable SLI and dedicate one card to PhysX; I can almost guarantee it will run better than with SLI alone.

It's been like that with GTX 480 SLI, GTX 680 SLI and GTX 780 Ti SLI. Games with heavy PhysX are meant to be played with a dedicated PhysX card.


----------



## Stateless

Quote:


> Originally Posted by *Ninjawithagun*
> 
> My Titan XP is finally watercooled (EKWB full cover waterblock and back plate) and temps are awesome! 33-46C when gaming and max 50C when heavy load benchmarking...and those temps are with an Intel Core i7 3930K @ 4.5Ghz at 1.41V ;-)


Looks good. And good to see a fellow 3930K owner here. Have you run Firestrike yet? How did you do? I'm in the top 60 with my 3930K, surrounded by nothing but 6950X owners on the Hall of Fame. I have the oldest CPU in the HOF listing as of right now, so it would be nice to have a fellow owner on that list.


----------



## kx11

Finally got watercooling (milkcooling according to my friends)



Heaven benchmark, core clock +200, memory +500MHz; temps got up to 73C, hopefully that's ok


----------



## scgeek12

Quote:


> Originally Posted by *kx11*
> 
> Finally got watercooling (milkcooling according to my friends)
> 
> 
> 
> Heaven benchmark, core clock +200, memory +500MHz; temps got up to 73C, hopefully that's ok


Holy crap, that's hot.. mine barely hit that on air; thinking you did something wrong. (Also, do that card some justice and clean up those wires!)


----------



## Stateless

Quote:


> Originally Posted by *kx11*
> 
> Finally got watercooling (milkcooling according to my friends)
> 
> 
> 
> Heaven benchmark, core clock +200, memory +500MHz; temps got up to 73C, hopefully that's ok


Your temp is too high for a watercooled GPU. Something is not right with the cooling. How many rads are you using? I have 2 cards in SLI and my highest temp is 54C, which I think is a little high as well; if I hit 73C I would be freaking out.


----------



## kx11

Quote:


> Originally Posted by *Stateless*
> 
> Your temp is too high for a watercooled GPU. Something is not right with the cooling. How many RAD's are you using? I have 2 cards in SLI and my highest temp is 54c, which I think it a little high as well, if I hit 73c I would be freaking out.


1 rad, EK 360 XE

I think the liquid is too thick and needs more water mixed in


----------



## msp1609

Yeah, my card with a hybrid cooler only hits 55C. That's while holding 2114MHz during gaming.


----------



## kx11

this is after FS Ultra , the RAD fans are running 100%


----------



## Jpmboy

Quote:


> Originally Posted by *kx11*
> 
> this is after FS Ultra , the RAD fans are running 100%


What TIM did you use on the block mount? And IMO, that white sheeet is worthless. Just use distilled water and a stabilizer. Redline Water Wetter works best.


----------



## eliau81

Quote:


> Originally Posted by *msp1609*
> 
> Yea my card with a hybrid cooler only hits 55c. Thats while holding 2114mhz during gaming.


pretty good OC, can you upload some pic?


----------



## scgeek12

Quote:


> Originally Posted by *kx11*
> 
> 1 RAD , EK 360xe
> 
> i think the liquid is too thick and needs more water mixed in


I'm running 2 Titans in SLI and my 6700K clocked to 4.8 @ 1.40V in the same loop with 2 rads (1 360mm and 1 480mm), and I have never seen mine go above 45C. At 70+ I'd be taking the block back off and trying to reseat it.

Edit: also, what pump and CPU block is that?


----------



## Mbbx

I just bought a Titan X (pascal).

I was sent a used one.

http://www.overclock.net/t/1611216/used-titan-x-pascal-sent-from-nvidia-be-aware/0_100


----------



## kx11

Quote:


> Originally Posted by *scgeek12*
> 
> I'm running 2 Titans in sli and my 6700k that's clocked to 4.8 @ 1.40V in the same loop with 2 rads (1 360mm and 1 480mm and I have never seen mine go above 45C, at 70+ I'd be taking the block back off and trying to reseat it
> 
> Edit- also what pump and CPU block is that?


EK-XRES 100 Revo D5 PWM (incl. pump)

and
ASUS X99 Monoblock

probably one 360 rad isn't enough for 2 blocks?

ran FS Ultra just fine btw and scored a good one

http://www.3dmark.com/fs/10133489


----------



## TremF

Quote:


> Originally Posted by *Mbbx*
> 
> I just bought a Titan X (pascal).
> 
> I was sent a used one.
> 
> http://www.overclock.net/t/1611216/used-titan-x-pascal-sent-from-nvidia-be-aware/0_100


Hmmm. I purchased one Friday. I hope they don't deliver a pre-owned one to me too.









Update: Mine arrived yesterday. Everything was properly sealed and brand spanking new. Not sure how they managed to send you a used one. I hope you get it sorted soon.


----------



## Glzmo

Quote:


> Originally Posted by *kx11*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MunneY*
> 
> This is an opinion, not a fact
> 
> 
> 
> when you play a game that you really like ( batman AK ) and it doesn't support SLi , it feels you wasted your money
> 
> in that case it is a fact

Well, in the case of Batman: Arkham Knight you can simply use the second card as dedicated PhysX card as the game's framerates greatly benefit from it, especially in PhysX-heavy Batmobile scenarios. Even a GTX 1060 as dedicated PhysX card does wonders in conjunction with the Titan X in that game at high resolutions.

Batman: Arkham Knight all in-game options including gameworks/PhysX maxed, 3840x2160, Windows 10 Pro 64 bit 1607:

Titan X only (PhysX set to Titan X):
Min FPS: 26
Max FPS: 81
Average FPS: 63

Titan X for rendering + GTX 1060 as dedicated PhysX card:
Min FPS: 54
Max FPS: 100
Average FPS: 73

In games that support neither SLI nor GPU/hardware PhysX, though, it's a different story, as your additional GPU will sit idle.
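Putting rough numbers on those gains, here's a throwaway Python snippet that just recomputes the percentages from the FPS figures above:

```python
# Percentage gains from the Batman: Arkham Knight figures above
# (Titan X alone vs. Titan X + GTX 1060 as dedicated PhysX, 3840x2160).
titan_only = {"min": 26, "max": 81, "avg": 63}
with_physx_1060 = {"min": 54, "max": 100, "avg": 73}

for key in ("min", "max", "avg"):
    before, after = titan_only[key], with_physx_1060[key]
    gain = (after - before) / before * 100
    print(f"{key} FPS: {before} -> {after} (+{gain:.1f}%)")
```

Worth noting that the dedicated PhysX card lifts the minimums (+107.7%) far more than the averages (+15.9%), which fits the Batmobile-heavy scenes being the PhysX bottleneck.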


----------



## pompss

Anyone have an MSI Titanium Gaming X99 motherboard or an Asus Rampage V Edition 10?

I'm experiencing very slow boot times with the MSI; even with the fast boot option it's still taking too long to boot. Also, one of the RAM slots is defective; it won't boot at all.








Shipping the MSI back to Amazon; ordered the Asus Edition 10. I hope I don't have the same slow boot crap.

My ASRock X99 ITX was three times faster, like 7-10 sec to boot, and I expect at least the same boot time from a $600 motherboard.


----------



## profundido

Quote:


> Originally Posted by *RobotDevil666*
> 
> Aaaaaand I think this is pretty much my conclusion too
> 
> 
> 
> 
> 
> 
> 
> 
> Titan XP SLI is definitely out of my budget and with 1080 SLI I get better performance now and even in the games that do not support SLI it's still better than single 980Ti.
> Thank you for the input well deserved +REP for you mate


lmao, can't believe you went with my initial suggestion after all, once other people repeated the same arguments. The human mind is a strange thing


----------



## profundido

Quote:


> Originally Posted by *atreides*
> 
> I have a 3440x1440 100Hz panel, and today my 2nd TXP arrived. Unfortunately Nvidia mistakenly sent me a 3-slot SLI HB bridge instead of a 4-slot one, so now I have to send the one I have back, wait for Nvidia to receive it, and then wait for them to send me the correct HB bridge. I have the soft bridge connecting my Titans in SLI atm, and so far I can't really see much of a difference in performance. I really thought that having SLI would improve my Fallout 4 ENB performance, but even with two Titans I can't stay above 60 FPS. I know ENB is very demanding, but I was convinced I could achieve a stable 60 FPS with two of these. Maybe when I use the HB bridge to SLI the cards this will change? What kind of overclock would you guys recommend? I have a Cosmos II case, and right now on stock settings my cards are staying pretty cool; just wondering how high I should overclock. Any help would be great, thanks guys.


First let's see if the basics are in order. Are you reaching roughly 14K timespy with SLI enabled ? (With HB SLI bridge that would be above 16-17K)


----------



## profundido

Quote:


> Originally Posted by *kx11*
> 
> EK-XRES 100 Revo D5 PWM (incl. pump)
> 
> and
> ASUS X99 Monoblock
> 
> probably 1 360 RAD isn't enough for 2 Blocks ?
> 
> ran FS Ultra just fine btw and scored a good one
> 
> http://www.3dmark.com/fs/10133489


Correct, you need more rad space. Adding one or two big (6cm thick) ones outside the case would solve it and make it a lot quieter. I don't suppose you have much space left inside the case?

For reference, both my Titans never exceed 42°C, and I'm using the same EK pump as you, just with different rads (and more of them, btw)
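For a rough sense of why a single 360 falls short here, a back-of-the-envelope budget; the ~100W per 120mm of rad figure is just the usual forum rule of thumb at moderate fan speeds, and the CPU/GPU wattages below are assumptions, not measurements:

```python
# Back-of-the-envelope loop heat budget. Assumes ~100 W of heat dissipated
# per 120 mm of radiator at moderate fan speeds (rule of thumb, not measured).
def rad_capacity_watts(rad_lengths_mm):
    # Each full 120 mm segment of radiator handles roughly 100 W.
    return sum(length // 120 * 100 for length in rad_lengths_mm)

# Assumed loads: overclocked X99 CPU ~200 W, overclocked Titan XP ~300 W.
heat_load = 200 + 300
capacity = rad_capacity_watts([360])  # a single 360 mm rad

print(f"load ~{heat_load} W vs. rad capacity ~{capacity} W")
```

By this rough estimate a lone 360 is a couple hundred watts short for a CPU monoblock plus an OC'd TXP, while adding a second big rad flips the balance, which matches the advice above.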


----------



## kx11

Quote:


> Originally Posted by *profundido*
> 
> Correct, you need more rad space. Adding one or two big (6cm thick) ones outside the case would solve it and make it a lot quieter. I don't suppose you have much space left inside the case?
> 
> For reference, both my Titans never exceed 42°C, and I'm using the same EK pump as you, just with different rads (and more of them, btw)


Space inside the case? Nah, nothing left. Outside the case is probably the only option for me, or re-seating the GPU again.


----------



## Lobotomite430

Which one of you is selling this?
http://www.ebay.com/itm/EVGA-Hybrid-Cooled-Titan-X-Pascal-12GB-GDDR5X-Video-Card/122123794308?_trksid=p2050601.c100085.m2372&_trkparms=aid%3D111001%26algo%3DREC.SEED%26ao%3D1%26asc%3D37338%26meid%3Ddcbdc95676bf40a6a42b92dff8f45221%26pid%3D100085%26rk%3D3%26rkt%3D4%26sd%3D281948275125%26clkid%3D8208719244919696085&_qi=RTM2247627


----------



## profundido

Quote:


> Originally Posted by *kx11*
> 
> space inside the case ? nah nothing left , probably outside the case is the only option for me , or re-seating the GPU again


If it's any consolation, that D5 pump here easily handles 3 big 6cm rads (high flow though), a CPU block, 2 GPU waterblocks and 4 Koolance QDCs at only 3600 of 5000 RPM. So it looks like you could get away with simply extending your loop outside the case with 1 or 2 big rads and a few fans on them. A cheap and efficient extension. Not only your temps but the fan noise will be a world of difference, I guarantee.


----------



## jhowell1030

Betcha it won't. For folks out there on 16:9 1440p, 21:9 1440p and 4K displays, that didn't matter. They needed SLI at the time to hit consistent frames above 60 FPS. If they didn't have the horsepower from just one card...it didn't matter.


----------



## SuCCEzz

Quote:


> Originally Posted by *PowerK*
> 
> 1070 being a waste for dedicated PhysX, I have to agree with you.
> As for monitor, nah..
> I also am a videophile who loves eye candy.
> Used to own Sony GDM-FW900 since 2001.
> For me, 4K panels are not there yet.
> 2560x1440 @ 144Hz > 4K @ 60Hz for me, currently.
> 
> Last but not least, your comment about a 1440p @ 144Hz monitor not taking full advantage of TXP SLI: I have no idea what you're talking about. I can get even a 1080p monitor taking full advantage of TXP SLI. Downsampling, DSR, SGSSAA.. the list goes on.


FW900 FTW! Contrast, motion reproduction, no lag. Much better than a resolution bump, and the FW900 already does 2304x1440.


----------



## kx11

Quote:


> Originally Posted by *profundido*
> 
> If it's any consolation, that D5 pump here easily handles 3 big 6cm rads (high flow though), a CPU block, 2 GPU waterblocks and 4 Koolance QDCs at only 3600 of 5000 RPM. So it looks like you could get away with simply extending your loop outside the case with 1 or 2 big rads and a few fans on them. A cheap and efficient extension. Not only your temps but the fan noise will be a world of difference, I guarantee.


No worries, I fixed it.

Ran FS Ultra, GPU OC +230 core / +520 mem, CPU 4.4GHz



Why were the temps so hot before? Because the Asus AI Suite water pump tab was manually set to the lowest RPM (1100, I think)



and my FS Ultra score
http://www.3dmark.com/fs/10140795


----------



## CallsignVega

Quote:


> Originally Posted by *SuCCEzz*
> 
> FW900 FTW! Contrast, motion reproduction, no lag. Much better than resolution bump. Which the FW900 already does 2304x1440.


Not really. The FW900 has a 1920x1200 shadow mask. Any resolution higher than that blurs considerably.


----------



## SuCCEzz

Uniformity and geometry are better on the LCD. I prefer movies and media on my FW900, but I actually enjoy gaming better on my Acer XB271HU due to the clarity of text and G-Sync.


----------



## jcde7ago

Quote:


> Originally Posted by *kx11*
> 
> Why were the temps so hot before? Because the Asus AI Suite water pump tab was manually set to the lowest RPM (1100, I think)


If your loop is properly bled I would run the D5 at no more than 50-65% max RPM; anything over *might* make a 1C difference, but the noise trade-off isn't worth it for that 1C. I run dual D5s and I couldn't stand anything over 65% due to the noise, so I actually disconnected the PWM leads entirely and am running strictly off Molex for a constant 60% on both.

Helps prolong pump life too, as D5s in particular were more prone to failure when run at 100% RPM 24x7.


----------



## eliau81

Quote:


> Originally Posted by *Lobotomite430*
> 
> Which one of you is selling this?
> http://www.ebay.com/itm/EVGA-Hybrid-Cooled-Titan-X-Pascal-12GB-GDDR5X-Video-Card/122123794308?_trksid=p2050601.c100085.m2372&_trkparms=aid%3D111001%26algo%3DREC.SEED%26ao%3D1%26asc%3D37338%26meid%3Ddcbdc95676bf40a6a42b92dff8f45221%26pid%3D100085%26rk%3D3%26rkt%3D4%26sd%3D281948275125%26clkid%3D8208719244919696085&_qi=RTM2247627


lol








this is probably a unique Titan made of kryptonite
how does he even dare


----------



## eliau81

Has anyone tried to install the Corsair H115i on the Titan?


----------



## 295033

Quote:


> Originally Posted by *pez*
> 
> Definitely curious to know what fixes it if you do find out
> 
> 
> 
> 
> 
> 
> 
> .


I received my replacement One Connect box from Samsung... it fixed the stutter. Hooray!


----------



## jcde7ago

Quote:


> Originally Posted by *Iorek*
> 
> I received my replacement One Connect box from Samsung... it fixed the stutter. Hooray!


Oh sweet! Glad to know I was wrong! Those things are bricks, didn't think one was going to fail on you out of the blue like that. Good stuff.


----------



## 295033

Maybe I spoke too soon? After replacing the box I booted up Dark Souls III and saw no stutter, so I assumed it was fixed. A while later I tried DOOM, and it had stutter. Then I re-tried Dark Souls III and it was stuttering again. Same for Alien: Isolation. Then all of a sudden they all stopped stuttering. I cycled through those 3 as well as Dark Souls and The Witcher III and couldn't get any of them to stutter. Rebooted and everything was still good.

I don't get it...


----------



## jcde7ago

Quote:


> Originally Posted by *Iorek*
> 
> Maybe I spoke too soon? After replacing the box I booted up Dark Souls III and saw no stutter so I assumed it was fixed. A while later I tried DOOM, and it had stutter. Then I re-tried Dark Souls III and it was stuttering again. Same for Alien: Isolation. Then all the sudden they all stopped stuttering. I cycled through those 3 as well as Dark Souls and The Witcher III and couldn't get any of them to stutter. Rebooted and everything was still good.
> 
> I don't get it...


I can do some extensive testing for you, but you'll need to ship me that OneConnect box *and* the $8,000 TV it attaches to for a thorough evaluation.


----------



## Baasha

Titan X Pascal 4 Way SLI Benchmarks:


----------



## CallsignVega

Quote:


> Originally Posted by *SuCCEzz*
> 
> Uniformity and geometry is better on the lcd. I prefer movies on my FW900 and media, but I actually enjoy gaming better on my Acer XB271hu due to the clarity of text and gsync.


Ya I loved the FW900 back in the day. So smooth. But with G-Sync and backlight strobing monitors out there, that 22.5" viewable of the FW900 just doesn't cut it.


----------



## STRYC

Quote:


> Originally Posted by *profundido*
> 
> First let's see if the basics are in order. Are you reaching roughly 14K timespy with SLI enabled ? (With HB SLI bridge that would be above 16-17K)


Quote:


> Originally Posted by *atreides*
> 
> I have a 3440x1440p 100hz panel and today my 2nd TXP arrived, unfortunately Nvidia mistakenly sent me a 3 slot sli hb bridge instead of a 4 slot sli hb bridge so now I have to send the one I have back and then wait for Nvidia to receive that and then wait for them to send me the correct hb bridge. I have the soft bridge connecting my Titan in SLI atm and so far I can't really see much of a difference in performance. I really thought that having having SLI would improve my fallout 4 enb performance but even with two Titan's I cant stay above 60fps. I know enb is very demanding but I was convinced that I could achieve stable 60 fps with 2 of these. Maybe when I use the HB Bridge to sli the cards this will change? What kind of overclock would you guys recommend? I have a Cosmos II case and right now on stock settings my cards are staying pretty cool just wondering how high I should overclock. Any help would be great thanks guys.


I'm cuddling up to a pair of EK water-cooled Pascal Titan XPs. Using various SLI bridges, I ran some benchmarks and was disappointed. I ran them with 1 and 2 soft bridges (no difference there), 1 EVGA PRO V2 bridge (LED hard bridge), and Nvidia's HB bridge (I modded the HB bridge to fit with my EK waterblocks). Tom Clancy's The Division showed marked improvement at 4K from the soft bridge to the EVGA hard bridge, but a very slight difference from the EVGA bridge to Nvidia's HB bridge. RB6S, Unigine Valley and 3DMARK showed no difference at all between soft, EVGA or HB. Here are my numbers:

ALL BENCHMARKS ARE AT STOCK SPEED NO OVERCLOCK.

In RB6 Siege these numbers did not improve with any combination of SLI bridge.

RB6S SLI 2560x1440 AVERAGE: 114.1 MIN: 89.9 EVERYTHING MAXED With Post-Process [email protected] FXAA (T-AA looks fuzzy) and Multisample AA @TXAA 4X

RB6S SINGLE CARD: 2560x1440 AVERAGE: 64.4 MIN: 87.0

RB6S 4K 3840x2160 EVERYTHING MAXED With Post-Process OFF AND Multisample AA TEMPORAL

SINGLE CARD: AVERAGE: 101.4 MIN: 64.4
SLI: AVERAGE: 53.1 MIN: 46.4 (At 4K I lose performance in SLI mode. I've tried 4K with 3 different SLI setups, and all lost performance at 4K in SLI in this game. Kepler GTX Titan x2, 980 Ti x2, and Pascal Titan X x2 all showed diminished performance at 4K in SLI.)

THE DIVISION:
SOFT BRIDGE: 4K Typical: 59.6
EVGA Hard LED: 4K Typical: 86.5
HB BRIDGE: 4K Typical: 87.8 (This 1 frame improvement is actually consistent; I'll take it!)

Unigine Valley Numbers did not improve with any combination of SLI bridge.
ULTRA 4K X8 AA FPS: 81 MIN FPS: 27.3

3DMARK ULTRA GRAPHICS SCORE: One run each. Shows what you would expect from standard variance from one run to the next without changing bridges. So no real change here.

SOFT BRIDGE: 13887
EVGA BRIDGE: 13869
NVIDIA HB: 13819 (This is not an error. I ran these 3 times each and posted the best. The lower two scores, with the soft and LED bridges, were slightly less than this. So again, no difference between the bridges in 3DMark.)

Additionally, on the stock BIOS, I cannot overclock any higher on my water-cooled XPs than I could on air because I'm power limited. Using MSI Afterburner: on stock air I ran the fans to 100%, voltage +100mV, power +120, and my thermals were in the 70s C while benchmarking. On water my thermals are in the 30s C, but I can still only run stable at +225 on the core, and I can run over +600 on the memory. My scores, however, were slightly and consistently better under water because the cards are running nice and cool. But until the gee-whiz guys come out with a BIOS mod and Pascal tweaker, I'm stuck at these clocks.
Here is a screenshot of my overclocked score, which is above.
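For what it's worth, the Division numbers above work out like this (a quick sketch using only the FPS figures quoted in the post):

```python
# Percent gain of each bridge over the soft bridge in The Division at 4K,
# using the "Typical" FPS figures reported above.

DIVISION_4K = {"soft": 59.6, "evga_led": 86.5, "nvidia_hb": 87.8}

def gain_over_soft(bridge: str) -> float:
    """Percent FPS gain of `bridge` vs. the soft bridge."""
    return (DIVISION_4K[bridge] / DIVISION_4K["soft"] - 1) * 100

for name in ("evga_led", "nvidia_hb"):
    print(f"{name}: +{gain_over_soft(name):.1f}% over soft bridge")
# Roughly +45% going soft -> hard bridge, but only about 1.5% more
# going hard -> HB, which matches the post's conclusion.
```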


----------



## jcde7ago

There are supposedly some 'frame quality' improvements with the HB SLI bridge that aren't reflected in maximum/average FPS, but I personally can't confirm that. I too have a modded Nvidia HB SLI bridge (cut off the tips to fit with the EK waterblocks), so I'm just rolling with that as it was the recommended bridge for my setup and I'd already sold my standard SLI bridge when I sold my Titan X Maxwells.

Most reviews claim a minor improvement in minimum, average and maximum FPS using the HB bridge over a SINGLE standard SLI bridge, but using a couple of the standard flex bridges over a single HB bridge appears to yield similar results...so who really knows except Nvidia.


----------



## Vamrick

Serious problem here guys. I went from air with an OC of 2088 on the core and installed a waterblock.

Now my Titan XP will not boost past 1417, the base clock. It runs fine... 33C under load, but it will not turbo up at all. I have tried multiple programs:

MSI Afterburner, EVGA Precision.
If I try to add ANYTHING to the core or power limit, it locks up my system.

I have uninstalled the Nvidia drivers using DDU and installed old drivers.
Every program reports the exact same boost clock. I am so lost guys, and not sure what happened from the time I put my block on until now. I didn't change anything else in the system.

Any insight is helpful



http://imgur.com/jr3kjSG


Screen shot of furmark maxing out at 1417 the BASE clock
















HELP ME!!!


----------



## PowerK

Quote:


> Originally Posted by *jcde7ago*
> 
> There are supposedly some 'frame quality' improvements with the HB SLI bridge that aren't reflected in maximum/average FPS, but I personally can't confirm that. I too have a modded Nvidia HB SLI bridge (cut off the tips to fit with the EK waterblocks) so i'm just rolling with that as it was the recommended bridge for my setup and i'd already sold my standard SLI bridge when I sold my Titan X Maxwells.
> 
> Most reviews claim a minor improvement in minimum, average and maximum FPS using the HB bridge over a SINGLE stadard SLI bridge, but using a couple of the standard flex bridges over a single HB bridge appears to yield similar results...so who really knows except Nvidia.


Indeed.
I've posted this in another thread, but this is one of those articles I came across a few weeks ago.
http://www.hardwareunboxed.com/nvidias-hb-sli-bridge-surprising-gains-gtx-1080-sli-testing-inside/

Anyway, I have not done any A/B comparisons with HB bridge vs. LED bridge vs. flex bridge. However, it's my understanding that the HB bridge's primary focus is on improving frametimes in SLI configurations.

Flex Bridge = 400MHz single link
LED Bridge = 650MHz single link
HB bridge = 650MHz dual link.

Also, according to the article, using two flex bridges (instead of one) seems to provide the same performance boost as using a HB bridge.
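If those clock figures are right, the comparison is easy to sketch. This assumes, loosely, that usable bridge bandwidth scales with interface clock times number of links; that's a simplification for illustration, not an official Nvidia spec:

```python
# Relative SLI bridge bandwidth, assuming throughput scales with
# interface clock x number of links (a simplification, not a spec).

BRIDGES = {
    "flex":   {"clock_mhz": 400, "links": 1},
    "led":    {"clock_mhz": 650, "links": 1},
    "hb":     {"clock_mhz": 650, "links": 2},
    "2xflex": {"clock_mhz": 400, "links": 2},  # two flex bridges together
}

def relative_bandwidth(bridge: str, baseline: str = "flex") -> float:
    """Bandwidth of `bridge` relative to a single flex bridge."""
    b, base = BRIDGES[bridge], BRIDGES[baseline]
    return (b["clock_mhz"] * b["links"]) / (base["clock_mhz"] * base["links"])

for name in BRIDGES:
    print(f"{name:>6}: {relative_bandwidth(name):.2f}x")
```

Under that assumption, two flex bridges get you 2x a single flex, which is most of the way to the HB bridge's 3.25x, consistent with the article's finding that doubled flex bridges perform about the same as an HB bridge in practice.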


----------



## pompss

Quote:


> Originally Posted by *Vamrick*
> 
> Serious problem here guys i went from air with oc of 2088 on the core and installed a water block.
> 
> Now my titan xp will not boost past 1417 the base clock. it runs fine.. 33 C under load but will not turbo up any? I have tried multiple programs
> 
> Msi afterburner , evga percision,
> If i try to add ANYTHING to core or power limit it locks up my system.
> 
> I have uninstalled nvidia Drivers using DDU and installed old drivers.
> every program reports the exact same boost clock I am so lost guys and not sure what happend from the time i put my block on until now. I didnt change anything else in the system.
> 
> Any insight is helpful
> 
> 
> 
> http://imgur.com/jr3kjSG
> 
> 
> Screen shot of furmark maxing out at 1417 the BASE clock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> HELP ME!!!


The best thing you can do is uninstall all the drivers, then remove the card from the PCI Express slot and disconnect all the 8/6-pin PCIe power connectors from the card.
I know you're under water, but if you have flexible tubing it shouldn't be a problem.
Let the card sit for 20-30 min without power, then reinstall it.


----------



## CreepinD

Quote:


> Originally Posted by *Vamrick*
> 
> Serious problem here guys i went from air with oc of 2088 on the core and installed a water block.
> 
> Now my titan xp will not boost past 1417 the base clock. it runs fine.. 33 C under load but will not turbo up any? I have tried multiple programs
> 
> Msi afterburner , evga percision,
> If i try to add ANYTHING to core or power limit it locks up my system.
> 
> I have uninstalled nvidia Drivers using DDU and installed old drivers.
> every program reports the exact same boost clock I am so lost guys and not sure what happend from the time i put my block on until now. I didnt change anything else in the system.
> 
> Any insight is helpful
> 
> 
> 
> http://imgur.com/jr3kjSG
> 
> 
> Screen shot of furmark maxing out at 1417 the BASE clock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> HELP ME!!!


Another guy had the same problem a few pages back. He removed his card and re-installed it, and it worked fine after. I'd say start with that.


----------



## Vamrick

I removed it and tried multiple pcie slots


----------



## Vamrick

Quote:


> Originally Posted by *CreepinD*
> 
> Another guy had the same problem a few pages back. He removed his card and re-installed it, and it worked fine after. I'd say start with that


Can you link me to the page to see if it's exactly the same issue?

I have tried unplugging PCIe power, removing the card, and swapping positioning, and it all comes to the same conclusion.


----------



## CreepinD

I took a quick look but I couldn't find it, I know it's here though.

I would start with page 350 and work forward. It's a good read


----------



## mbze430

Why are people so hell-bent on finding more FPS with the Nvidia HB bridge, when it's supposed to deliver better frame latency instead..... nobody ever goes and tests for frame latency.......


----------



## STRYC

So far you've done everything I would do in your situation. The next step would be to pull the waterblock off and double-check the install. You might have something touching a circuit that's keeping you at base clock. What has changed is your cooler, so I'd pull the waterblock and put the stock cooler back on for a test run. If it boosts, then I'd put the waterblock back on, triple-checking that you're doing it right. If it doesn't boost correctly after you put the stock cooler back on, I'd call support for help; maybe RMA it.


----------



## pompss

Sometimes residual charge stays inside the GPU for quite a while.
If nothing works, I would suggest leaving the card overnight with the power connectors and PCI Express disconnected, then redoing the test tomorrow morning.
As some suggested, it might be a wrong waterblock installation, or something on the waterblock (like nickel or copper residue) touching a circuit on the PCB and causing the issue.


----------



## STRYC

Also check your BIOS. I flashed my motherboard BIOS a while ago and had issues with my card not boosting. I re-flashed and it fixed my problem.


----------



## CoolGTX

Anyone ever find a "promo code" to use at store.nvidia.com, even for free faster shipping or a game?


----------



## xarot

I'd give a shot at uninstalling Riva Tuner Statistics Server, EVGA Precision and MSI Afterburner with the drivers. Then install new drivers, use GPU-Z in the background to monitor core clock usage without any OC. If it still doesn't work, try a fresh install of OS.


----------



## Nizzen

Quote:


> Originally Posted by *jcde7ago*
> 
> There are supposedly some 'frame quality' improvements with the HB SLI bridge that aren't reflected in maximum/average FPS, but I personally can't confirm that. I too have a modded Nvidia HB SLI bridge (cut off the tips to fit with the EK waterblocks) so i'm just rolling with that as it was the recommended bridge for my setup and i'd already sold my standard SLI bridge when I sold my Titan X Maxwells.
> 
> Most reviews claim a minor improvement in minimum, average and maximum FPS using the HB bridge over a SINGLE stadard SLI bridge, but using a couple of the standard flex bridges over a single HB bridge appears to yield similar results...so who really knows except Nvidia.


It depends on the game and resolution.

Some good examples here:

2x soft briges VS HB bridge:

http://www.pcworld.com/article/3087524/hardware/tested-the-payoff-in-buying-nvidias-40-sli-hb-bridge.html

I have 2x GTX 1080 and an HB bridge, and can confirm that the HB bridge helps at 3440x1440 and 4K res.


----------



## profundido

Quote:


> Originally Posted by *PowerK*
> 
> Indeed.
> I've posted this in other thread but this is one of those articlesI came across a few weeks ago.
> http://www.hardwareunboxed.com/nvidias-hb-sli-bridge-surprising-gains-gtx-1080-sli-testing-inside/
> 
> Anyway, I have not done any A/B comparisons with HB bridge vs. LED bridge vs. flex bridge. However, it's my understanding that HB bridge's primary focus is on improving frametime in SLI configration.
> 
> Flex Bridge = 400MHz single link
> LED Bridge = 650MHz single link
> HB bridge = 650MHz dual link.
> 
> Also, according to the article, using two flex bridges (instead of one) seems to provide the same performance boost as using a HB bridge.


Confirmed, I got the exact same results with 2 flex bridges as with the HB, except that the HB one shorts out an extra circuit to make the warning message disappear (cosmetic).

Also, the higher bandwidth was needed to overcome a new bottleneck SLI developers ran into while transferring a lot of static data from one card to another (super high textures at super high resolution), not calculation information issues (high-FPS games). So in order to see a lot of benefit from an HB bridge, one needs to run heavy textures @4K or higher. If you're running high-FPS games you'll be disappointed, because you won't hit that bottleneck anyway and won't see a lot of difference.


----------



## jcde7ago

Quote:


> Originally Posted by *profundido*
> 
> Confirmed, I got the exact same results with 2 flex bridges as with the HB, except that the HB one shorts out an extra circuit to make the warning message dissappear (cosmetic)
> 
> also the higher bandwidth was needed to overcome a new bottleneck SLI developers ran into while transferring alot of static data from 1 card to another (super high textures on super high resolution), not calculation information issues (high fps games). So in order to see alot of benefit from a HB bridge one needs to run heavy textures @4K or higher. If you're running high fps games you'll be dissapointed because you won't hit that bottleneck anyway and those won't see alot of difference.


So the TL;DR of using an HB SLI bridge vs. 2x flex normal SLI bridges with Pascal cards according to your post + @Nizzen above is:

An HB bridge is only required if gaming at a minimum of 3440x1440p resolution w/ a game that has some really high res textures...otherwise, there is no performance difference over 2x 'normal' SLI flex bridges.


----------



## profundido

Quote:


> Originally Posted by *jcde7ago*
> 
> So the TL;DR of using an HB SLI bridge vs. 2x flex normal SLI bridges with Pascal cards according to your post + @Nizzen above is:
> 
> An HB bridge is only required if gaming at a minimum of 3440x1440p resolution w/ a game that has some really high res textures...otherwise, there is no performance difference over 2x 'normal' SLI flex bridges.


yes indeed.

Besides that, perfectionists typically don't sleep well as long as the Nvidia control panel keeps showing that yellow warning message, although it is purely cosmetic.









Keep in mind, though, that although one may not see a lot of performance increase, there may be more to it. Imagine you're running a game that sits just below the bottleneck threshold, with occasional spikes into that "red zone". Best case you'll see sudden FPS drops during those spikes; worst case you'll experience microstuttering, and many gamers are sensitive to that.

The same goes for the many people with 28-lane processors (as opposed to 40-lane processors), or the people whose motherboards drop 1 or 2 of the GPU slots back to 8x mode when any extra PCIe cards (besides their 2 GPUs in SLI) are installed, because the wiring of that specific motherboard is set to share bandwidth with the other slots. It used to be no problem in the past because our GPU cards never saturated 8x anyway; 16x was just a useless figure on a spec sheet. But nowadays, with cards like these TXPs, we start surpassing the 8x threshold, leading to the same bottleneck scenarios and reduced FPS or microstuttering as a possible consequence, and people breaking their heads as to where it could be coming from.

So, it's not just about less FPS. It's also about ensuring a smooth, worry-free experience by making sure you have no bottlenecks anywhere in the system, because a closed-circuit system only runs as well as its narrowest bottleneck, and that goes for SLI as well. One of the ways Nvidia handles the increased needs of this generation of cards is by creating more bandwidth for inter-card SLI communication, until we have a better chipset design with a lot more bandwidth (near future) and HBM memory, which will make the need for HB SLI bridges obsolete.
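For the x8 vs. x16 point, the raw numbers are easy to sketch. PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, which works out to roughly 0.985 GB/s of usable bandwidth per lane:

```python
# Back-of-the-envelope PCIe 3.0 bandwidth for the x8 vs. x16 scenario
# described above.

GT_PER_S = 8           # PCIe 3.0 transfer rate per lane (GT/s)
ENCODING = 128 / 130   # 128b/130b line encoding overhead

def pcie3_bandwidth_gbs(lanes: int) -> float:
    """Approximate one-direction PCIe 3.0 bandwidth in GB/s."""
    return lanes * GT_PER_S * ENCODING / 8  # 8 bits per byte

for lanes in (8, 16):
    print(f"x{lanes}: ~{pcie3_bandwidth_gbs(lanes):.1f} GB/s")
```

An x8 slot halves the available bandwidth (~7.9 GB/s vs. ~15.8 GB/s), which is where the bottleneck the post describes can start to show once cards are fast enough to exceed what x8 provides.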


----------



## Vamrick

This is a fresh OS install, the 2nd one actually. I also uninstalled every 3rd-party program but GPU-Z and unplugged the power; still the same. I will be removing the block tonight, but I'm unsure how the block could cause it to lock in a BASE state. Will update this evening. Thanks for everyone's help.


----------



## KillerBee33

Need to replace the thermal pads for the new waterblock. Can you guys advise if this is a good product? I need 0.5mm.
https://www.amazon.com/gp/product/B00UYTTLI4/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=A23NVCSO4PYH3S


----------



## zipper17

I just found KingPin's Time Spy run; here are his scores:

http://www.3dmark.com/spy/336566

Titan XP (1x) Graphics Score = 11 385

i7 6950X 5.1GHZ CPU Score = 14 015

I'm just curious why an i7 6950X 10c/20t @ 5.1GHz manages only 47 FPS in the CPU test. Anybody know why the Time Spy CPU test is so much heavier than the Firestrike CPU test? The 6950X is a freakin' 10-core, 20-thread chip.


----------



## Jpmboy

Quote:


> Originally Posted by *Vamrick*
> 
> Can you link me to the page to see if its exactly the same issue?
> 
> I have tried unplugging pcie power removing the card and swapping positioning and all comes to the same conclusion?


First check that the block is not over-tightened on the PCB. The TXP PCB is thin and prone to flexing and crushing of trace layers. Loosen it up if possible and test again. It won't hurt the card - you can run these completely open, without any cooling on the VRAM or anything else except the core. If that's not it, sweep the system with DDU and reinstall the drivers. Make sure that PX server and/or AB are not starting with the OS.
Quote:


> Originally Posted by *pompss*
> 
> Sometimes electricity still run inside the gpu for quite time.
> If nothing work i would suggest to leave the card without power connector and pci express for the night and tomorrow morning redo the test.
> As some suggest it might be a wrong waterblock installation or something with the waterblock like nickel or copper residue its touching some circuit on the pcb causing the issue.


disconnecting the PSU power and holding down the start button for 20 sec will do the same thing (discharge caps).


----------



## Vamrick

Quote:


> Originally Posted by *Jpmboy*
> 
> First check that the block is not over tightened on the PCB. The TXP PCB is thin and prone to flexing and crushing of trace layers. loosen it up if possible and test again. It won't hurt the card - you can run these completely open without any cooling on the vram or anything else except the core. If that's not it, sweep the system with DDU and reinstall drivers. Make sure that PX server and/or AB are not starting with he OS.
> disconnecting the PSU power and holding down the start button for 20 sec will do the same thing (discharge caps).


How do I make sure PX server and AB are not starting? I have done a DDU uninstall and reinstall of 3 different drivers. I will be formatting again tonight; I need to RAID 0 my PCIe NVMe drives anyway. And I will be taking apart the GPU block and reinstalling it; I'll make sure to tighten it very loosely.

Thanks for your help man, this has been stressing me out lol


----------



## Jpmboy

Quote:


> Originally Posted by *Vamrick*
> 
> How do i make sure PX server and AB are not starting? I have done a DDU uninstall and reinstall of 3 different drivers. I will be formatting again tonight i need to Raid 0 my PCIE NVME drives anyway. And I will be taking appart the gpu block and reinstalling it i will make sure to very loosely tighten.
> Thanks for your help man this has been stresssing me out lol


AB is easy - just uncheck the "start with Windows" box in the settings. PX is more of an issue sometimes. Best to just uninstall it. And then, from the Windows Run command, type "msconfig". In the config window, select the Services tab and disable any PX-related services if any are there. Leave the NV service enabled.


----------



## Lobotomite430

Quote:


> Originally Posted by *KillerBee33*
> 
> Need to replace Thermal Pad for the new WBlock can you guys advise if this is a good product? need 0.5
> https://www.amazon.com/gp/product/B00UYTTLI4/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=A23NVCSO4PYH3S


Just put the 145x145 1.0mm on my titan last night! No complaints.


----------



## KillerBee33

Quote:


> Originally Posted by *Lobotomite430*
> 
> Just put the 145x145 1.0mm on my titan last night! No complaints.


Putting a block on it and need to change the thermal pads. The ones on the block now are 0.5, so I decided to get 0.5 ARCTIC all over.


----------



## inoran81

Titan X Pascal Blocks are here!!





kryographics Pascal for NVIDIA TITAN X acrylic glass edition, nickel-plated version x 2pcs ..... this thing weighs a ton... excellent quality



Back plate for kryographics Pascal NVIDIA TITAN X, active XCS x2 pcs .... first time trying active backplate....hmm...

Bonus blocks :





AQC latest cuplex kryos XT for Socket 2011/2011-3, Special Edition - finally they nickel plated the base plate!





kryoM.2 PCIe 3.0 x4 adapter for M.2 NGFF PCIe SSD, M-Key with nickel plated water block


----------



## bee144

My custom loop water cooling parts should arrive tomorrow. While having my two Titans run at 30C would be neat, I'm also hoping to keep things near silent. At what temperature does the Titan XP seem to start downclocking, etc.? My goal would be to keep it below this target and find the right fan speed.


----------



## Yuhfhrh

Quote:


> Originally Posted by *bee144*
> 
> My custom loop water cooling parts should arrive tomorrow. While having my two titans run at 30 c would be neat, I'm also hoping to keep things near silent. At what temperature does the titan XP seem to start down locking etc? My goal would be to keep it below this target and find the right fan speed


It downclocks for me going from 27 to 32C. I don't think there's a specific spot; it downclocks about 13MHz every 10C-ish (?)
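A rough sketch of that behavior. The 13 MHz bin and the ~10C step come from the observation above; the 30C reference point and the stepwise model are assumptions for illustration, not an official GPU Boost table:

```python
# Rough model of the observed behavior: GPU Boost drops the clock by
# one ~13 MHz bin roughly every 10C. Fitted to the observation above,
# not taken from any official Nvidia boost table.

BIN_MHZ = 13       # size of one boost bin (observed)
STEP_C = 10        # approx. temperature interval per bin (observed)
BASE_TEMP_C = 30   # reference temp where max boost holds (assumed)

def estimated_boost(max_boost_mhz: int, temp_c: int) -> int:
    """Estimate the sustained boost clock at a given GPU temperature."""
    bins_lost = max(0, (temp_c - BASE_TEMP_C) // STEP_C)
    return max_boost_mhz - bins_lost * BIN_MHZ

for t in (30, 45, 60, 75):
    print(f"{t}C: ~{estimated_boost(2000, t)} MHz")
```

The practical takeaway matches bee144's question: there is no single cliff, just a gentle stairstep, so a water loop mostly buys you the top one or two bins plus silence.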


----------



## CoolGTX

Quote:


> Originally Posted by *Jpmboy*
> 
> AB is easy - just uncheck the "start with windows" box in the settings. PX... more of a n issue sometimes. Best to just uninstall it. And then, from the windows run command, type" msconfig". in the config window, select the Services tab, and disable any PX related services if any are there... Leave the NV service Enabled.


I've read on a different forum that you should uninstall AB or PX before changing drivers with DDU, then reinstall, for best stability.

If you're not installing the driver with DDU in safe mode, go offline and disable your security software and all other background software before loading the GPU drivers as Admin.


----------



## mbze430

Quote:


> Originally Posted by *inoran81*
> 
> Titan X Pascal Blocks are here!!
> 
> 
> 
> 
> 
> kryographics Pascal for NVIDIA TITAN X acrylic glass edition, nickel plated version x 2pcs ..... this thing weights a ton... excellent quality
> 
> 
> 
> Back plate for kryographics Pascal NVIDIA TITAN X, active XCS x2 pcs .... first time trying active backplate....hmm...
> 
> Bonus blocks :
> 
> 
> 
> 
> 
> AQC latest cuplex kryos XT for Socket 2011/2011-3, Special Edition - finally they nickel plated the base plate!
> 
> 
> 
> 
> 
> kryoM.2 PCIe 3.0 x4 adapter for M.2 NGFF PCIe SSD, M-Key with nickel plated water block


I am waiting for my Aquacomputer blocks to come in... I don't have room for the active backplates, so I settled for the passive ones. Looks goooood


----------



## Jpmboy

Lookin Good! AQ stuff is very well made.








Quote:


> Originally Posted by *CoolGTX*
> 
> I've read on a different forum, that you should uninstall AB or PX before, DDU changing drivers then reinstall for best stability
> 
> If not installing driver with DDU in safe mode; go off line and disable your security software and All other background software before loading GPU drivers as Admin.


I haven't had to jump through those hoops yet with multiple driver installs and AB versions (switching back and forth between non-Pascal cards). IMO, PX is generally a problem. I only load it for cards on which it can enable K-Boost. Otherwise, I uninstall it and quarantine it.


----------



## Sketchus

Hi all,

Like others, I've managed to fit a 980 hybrid kit on my Pascal Titan (it was originally on my Maxwell Titan). So far everything seems good.


http://i.imgur.com/KIxODpH.jpg


----------



## Vamrick

As an update: I have taken the block off completely and replaced it with the original air cooler, and the problem persists. I formatted my PC and the problem persists. At this point it has to be the GPU, so I contacted Nvidia for an RMA.... wish me luck


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Vamrick*
> 
> As an update I have taken the block off completely and replaced with the original air cooler and the problem persists i formatted my pc and the problem persists at this point it has to be the gpu and contacted nvidia for an rma.... wish me luck


The BIOS could be corrupt. You could reflash; I posted the TXP BIOS a few pages back.


----------



## Vamrick

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Bios could be corrupt. Could reflash. I posted the tx P bios a few pages back.


Think that's really possible?


----------



## eliau81

What thermal compounds are good for the Titan XP?


----------



## cookiesowns

Quote:


> Originally Posted by *eliau81*
> 
> What thermal compound are good for titan xp?


Basically any TIM that's good on anything else. The consensus among enthusiasts is GC-Extreme and Thermal Grizzly Kryonaut.


----------



## eliau81

Quote:


> Originally Posted by *cookiesowns*
> 
> Basically any TIM that's good on any thing else. Consensus among enthusiasts are GC-Extreme and Thermal Grizzly Kryonaut.


What about ic diamond 24 karat?


----------



## PasK1234Xw

Quote:


> Originally Posted by *eliau81*
> 
> What about ic diamond 24 karat?


IC Diamond is really good, but it can be a pain to put on evenly, especially if you get a tube that is dry. And when you remove it, don't wipe it off dry, as it will scratch the surface of whatever it's on; use something like ArctiClean or rubbing alcohol and let it soak before you wipe it off.

I used it for a long time on CPUs and on my 780/980s, but I got sick of it. There's much better stuff out there, like the GC-Extreme and Thermal Grizzly Kryonaut mentioned above.


----------



## eliau81

Quote:


> Originally Posted by *PasK1234Xw*
> 
> IC diamond is really good but can be pain to put on evenly especially if you get one that is dry. And when you remove don't wipe off dry as it will scratch surface of what it is one use something like arctic clean or rubbing alcohol let it soak before i wipe off.
> 
> I used for so long on CPUs and on my 780/980s but got sick of it other much better stuff out there like mention above GC-Extreme and Thermal Grizzly Kryonaut


Are GC-Extreme and Thermal Grizzly Kryonaut conductive?


----------



## mbze430

Quote:


> Originally Posted by *eliau81*
> 
> Are GC-Extreme and Thermal Grizzly Kryonaut conductive?


nope, they are non-conductive


----------



## GunnzAkimbo

Idea:

2 EK Predators 280mm

1 for cpu only

1 for GPUs only (remove the CPU block and install onto the GPU blocks).

There's a 280mm unit available (soon).

That's what I would do with $$$


----------



## eliau81

Found myself ordering GC-Extreme at a really good price on eBay: $4.95 for 1g.
Thanks for helping, guys.


----------



## W1zzard

Could one of you proud owners check if latest GPU-Z (1.11.0) can save Titan X Pascal BIOS?


----------



## inoran81

Quote:


> Originally Posted by *mbze430*
> 
> I am waiting for my Aquacomputer blocks to come in... I don't have room for the active backplates, so I settle for the passive. Looks goooood


hope they arrive fast for you....









Quote:


> Originally Posted by *Jpmboy*
> 
> Lookin Good! AQ stuff is very well made.
> 
> 
> 
> 
> 
> 
> 
> 
> I haven't had to jump thru those hoops yet with multiple driver installs and AB versions (switching back and forth between non-pascal cards). IMO, PX is generally a problem. I only load it for cards that it can enable K-boost on. Otherwise, I uninstall it and quarantine it.


thanks bud - agree AQC and their services are top notch....


----------



## Diarrhea

Quote:


> Originally Posted by *cookiesowns*
> 
> Basically any TIM that's good on any thing else. Consensus among enthusiasts are GC-Extreme and Thermal Grizzly Kryonaut.


Just making sure... the thermal paste that comes with the EK GPU water blocks is re-branded GC-Extreme, right?

Also, for all those out there who water-cooled their Titan X Pascal(s): has anyone used CLP/CLU on the GPU die? If yes, did/should you have to put a protective coating around the transistors (at least I think they're transistors) too, or should I just stick with the EK paste? Just wondering, because if the temperature drop is significant I might do it; otherwise, it'll be a costly mistake if something goes wrong.


----------



## Lennyx

Quote:


> Originally Posted by *Diarrhea*
> 
> Just making sure...the thermal paste that comes with the EK GPU Water blocks are re-branded GC-Extreme right?
> 
> Also. for all those out there that water-cooled their Titan X Pascal(s), has anyone used CLP/CLU on the GPU die? If yes, did/should you have to put a protective coating around the transistors (at least I think they're transistors) too or should I just stick with the EK paste? Just wondering because if the temperature drop is significant, I might do it otherwise, it'll be a costly mistake if something goes wrong.


The thermal paste is not GC-Extreme. I actually have an EK-branded GC-Extreme syringe; it is probably around 5 years old, and I don't know when they changed to their own brand.

I don't have any experience with CLU on GPUs; I've only used it on the CPU die.


----------



## VulcanVFX

Got my Titan in today. I'll be putting it in my build as it comes along over the next few weeks, and will be doing a build vlog on here for anybody interested.


----------



## Vamrick

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Bios could be corrupt. Could reflash. I posted the tx P bios a few pages back.


Flashed the card's BIOS and the motherboard BIOS, with the original air cooler back on and a fresh OS... it's the card. Somehow I managed to **** it up and make it not boost?


----------



## MrKenzie

Quote:


> Originally Posted by *Vamrick*
> 
> Flashed the card's BIOS and the motherboard BIOS, with the original air cooler back on and a fresh OS... it's the card. Somehow I managed to **** it up and make it not boost?


Don't you have another card that uses GPU Boost you can test? It's possible but unlikely that a performance setting could be holding the card back from boosting, but I'm sure you have checked the power level settings etc. in Nvidia Control Panel. I think Afterburner overrides the power setting anyway?


----------



## cg4200

Quote:


> Originally Posted by *W1zzard*
> 
> Could one of you proud owners check if latest GPU-Z (1.11.0) can save Titan X Pascal BIOS?


Yes, you can now save the BIOS with the new GPU-Z 1.11... hopefully someone can get the Maxwell BIOS editor to work; it still says unsupported device. Is there any other BIOS editing program that might work?


----------



## Vamrick

Quote:


> Originally Posted by *MrKenzie*
> 
> You don't have another card that uses GPU boost you can test? It's possible but unlikely that a performance setting could be holding the card back from boosting, but I'm sure you have checked the power level settings etc in Nvidia control panel. I think afterburner overrides the power setting anyway?


yes sir first thing i checked


----------



## KillerBee33

Quote:


> Originally Posted by *Vamrick*
> 
> yes sir first thing i checked


It's not boosting, but have you checked your TDP in GPU-Z, even at stock under full load?
Also try creating a NEW USER in Windows, log out of the current one and run some tests in the new user.
Try using this to uninstall the driver and, for once, let Windows install it for you, just to check you're not doing something wrong.
http://www.wagnardmobile.com/forums/viewtopic.php?f=5&t=592


----------



## Vamrick

Quote:


> Originally Posted by *KillerBee33*
> 
> It's not boosting but have you checked your TDP in GPU Z even at stock in full load?
> Also try creating a NEW USER in Windows , logout of current and run some test in new user.
> Try using this to uninstall driver and for once let Windows install it for you, just to check if you doing something wrong.
> http://www.wagnardmobile.com/forums/viewtopic.php?f=5&t=592


Yes, it idles down and boosts up to BASE clock, and everything games correctly; it just won't go past 1417.

Also, this is the 3rd time I have reformatted, to make sure it's not a driver or Windows issue.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Vamrick*
> 
> Yes it idles down and boosts up to BASE clock even while games all correctly it just wont go past 1417.
> 
> Also this is the 3rd time I have reformatted to ensure its not a driver or windows issue


Cleaned the PCIe pins of the GPU? Could be thermal pad or paste residue on a pin. Try different PSU cables to the GPU; go native if you're using extensions.

Try a different monitor cable. Just a few more ideas.

Very strange issue I must say.

I'm sure you'll get RMA approval anyways, but what a hassle.


----------



## Vamrick

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Clean the pci-E pins of the gpu? Could be thermal pad or paste residue on a pin. Try different psu cables to the gpu, go native if using extensions.
> 
> Try a different monitor cable. Just a few more ideas.
> 
> Very strange issue I must say.
> 
> I'm sure you'll get RMA approval anyways, but what a hassle.


PCIe pins are good, checked them for residue. Tried an entirely different PSU. Not using extensions. Tried both HDMI and DP.

lol I'm so ******* lost.

I'm not sure if there's something I haven't checked, but please don't stop giving me ideas lol... the card works ROCK STABLE lol, just at 1418 hahaha


----------



## EniGma1987

Quote:


> Originally Posted by *W1zzard*
> 
> Could one of you proud owners check if latest GPU-Z (1.11.0) can save Titan X Pascal BIOS?


I think a few pages back a couple people said it works now; I'll check for sure myself when I get home today.


----------



## Jpmboy

Quote:


> Originally Posted by *W1zzard*
> 
> Could one of you proud owners check if latest GPU-Z (1.11.0) can save Titan X Pascal BIOS?


The new nvflash will save and flash the TXP safely. Disable the driver in Device Manager first.
Quote:


> Originally Posted by *Vamrick*
> 
> Flashed the card's BIOS and the motherboard BIOS, with the original air cooler back on and a fresh OS... it's the card. Somehow I managed to **** it up and make it not boost?


no... no you didn't do anything.. it just happened and you do not know why! RMA.


----------



## W1zzard

Thanks everyone!


----------



## jodasanchezz

Quote:


> Originally Posted by *Vamrick*
> 
> PCIe pins are good, checked them for residue. Tried an entirely different PSU. Not using extensions. Tried both HDMI and DP.
> 
> lol I'm so ******* lost.
> 
> I'm not sure if there's something I haven't checked, but please don't stop giving me ideas lol... the card works ROCK STABLE lol, just at 1418 hahaha


Hi, I had the same issue with my Titan.
After a long conversation with Nvidia (they have no idea), I unplugged the GPU, restarted the PC on the iGPU, shut the PC down, re-plugged the GPU... and everything worked fine.

A few days later I got the problem again. It happened after the PC had been in hibernate or sleep, so I think that's what triggers it, but I'm not 100% sure.
Updated the motherboard BIOS, btw.


----------



## Vamrick

Quote:


> Originally Posted by *jodasanchezz*
> 
> Hi, I had the same issue with my Titan.
> After a long conversation with Nvidia (they have no idea), I unplugged the GPU, restarted the PC on the iGPU, shut the PC down, re-plugged the GPU... and everything worked fine.
> 
> A few days later I got the problem again. It happened after the PC had been in hibernate or sleep, so I think that's what triggers it, but I'm not 100% sure.
> Updated the motherboard BIOS, btw.


Exactly the same issue? Where it's stuck at 1417.5 MHz core, and if you try to overclock or raise the power limit it locks up your entire PC?

I will try this tonight.


----------



## cookiesowns

Quote:


> Originally Posted by *Diarrhea*
> 
> Just making sure... the thermal paste that comes with the EK GPU water blocks is re-branded GC-Extreme, right?
> 
> Also, for all those out there who water-cooled their Titan X Pascal(s): has anyone used CLP/CLU on the GPU die? If so, did/should you have to put a protective coating around the transistors (at least I think they're transistors) too, or should I just stick with the EK paste? Just wondering, because if the temperature drop is significant I might do it; otherwise it'll be a costly mistake if something goes wrong.


The CPU one is GC-Extreme, on the Supremacy EVOs. The GPU blocks come with some sort of ceramic paste, like Arctic's ceramic compound.

Honestly, just grab a tube of GC-Extreme. Easy to find on Amazon.


----------



## Vamrick

Quote:


> Originally Posted by *jodasanchezz*
> 
> Hi, I had the same issue with my Titan.
> After a long conversation with Nvidia (they have no idea), I unplugged the GPU, restarted the PC on the iGPU, shut the PC down, re-plugged the GPU... and everything worked fine.
> 
> A few days later I got the problem again. It happened after the PC had been in hibernate or sleep, so I think that's what triggers it, but I'm not 100% sure.
> Updated the motherboard BIOS, btw.


Nope, didn't work for me.


----------



## 295033

Just a small update to my stutter problem... I've discovered that it doesn't just happen in games. It happens on the desktop as well, so I'm assuming it just happens everywhere. The weird thing is if I switch from 60 to 59hz and vice versa, it can fix the problem a lot of the time. Once it's stutter-free on the desktop I can load up a game and it'll be perfectly smooth, even set to 59hz. Once I quit the game, however, there's a high chance it'll be stuttering again and I'll have to fiddle with the refresh rates. And again this only happens in my TV's game mode.

So to me it seems like there's some miscommunication between the TV and the GPU regarding the refresh rate, and toggling the refresh rates in Nvidia CP forces it to re-try that handshake. I've taken a look at both Monitor Asset Manager and Custom Resolution Utility and nothing seems out of place, but I'm not an expert on the matter of refresh rates.

I've sent Nvidia the necessary information and they're on the case. I just want this nightmare to be over. At least I have a way to somewhat quickly and reliably get rid of the stutter for long enough to play games.


----------



## W1zzard

Quote:


> Originally Posted by *EniGma1987*
> 
> I think a few pages back a couple people said it works now, ill check for sure myself when I get home today.


If you have a minute, upload the BIOS using GPU-Z so that it's in the database.


----------



## mbze430

Quote:


> Originally Posted by *W1zzard*
> 
> If you have a minute, upload the BIOS using GPU-Z so that it's in the database.


I just did the GPU-Z online submission. When I tried to save to a file, though, it rebooted my system. But the online submission went through.



----------



## eliau81

Quote:


> Originally Posted by *Vamrick*
> 
> Nope, didn't work for me.


Just RMA the damn thing, probably a factory defect.


----------



## xbiker321

Just watercooled my Titan X. Just curious what clock speeds people are getting? 3DMark Ultra?

7899 Firestrike Ultra graphics score (couldn't get past 8k)
2139 MHz core
5594 MHz memory
GPU temp max 40°C


----------



## kx11

post the results link


----------



## xbiker321

Quote:


> Originally Posted by *kx11*
> 
> post the results link


http://www.3dmark.com/fs/10166846

Crappy CPU at 4.4ghz. Was mostly just focused on the graphics score on the Titan X


----------



## kx11

very good GPU score you got there


----------



## dante`afk

It's time for a Titan Black or Ti now; the TXP has been inside my tower for too long already.


----------



## Lobotomite430

Played GTA V last night and I was getting frequent drops into the 30-45 fps range with my Titan X running at 3440x1440. That doesn't seem normal; I remember my 980 Ti getting a solid 60 fps. Driver issue?


----------



## Baasha

Quote:


> Originally Posted by *dante`afk*
> 
> I'ts time for a titan black or Ti now. the txp is already too long inside my tower.


http://media.photobucket.com/user/darksong444/media/1302.jpg.html


----------



## DNMock

Quote:


> Originally Posted by *Lobotomite430*
> 
> Played GTA V last night and I was getting frequent drops into the 30-45 fps range with my Titan x running at 3440x1440. That doesnt seem normal I remember my 980ti getting solid 60 fps. Driver issue?


Did you make any changes to your Nvidia Control Panel global settings compared to when you had the 980 Ti?

I had a similar problem and realized that I had MSAA set to x4 in the global settings, when I used to have AA turned off or set to application-controlled. When I fired up Witcher 3, it took me a while to figure out why my FPS was at or below what it was with my Maxwell Titan setup.


----------



## CRITTY

What was the fix for card/cards not boosting? My power went out and when I went back into the game I was playing, one of the cards was not boosting. Any ideas?


----------



## Glerox

After changing from 1080s in SLI to a Titan XP 2 months later (blame Nvidia...), I'm relieved to see that the GTX 1080 Ti will not be more powerful than the Titan X Pascal hahaha.

http://www.overclock3d.net/news/gpu_displays/nvidia_gtx_1080ti_specifications_leak/1


----------



## Jpmboy

Quote:


> Originally Posted by *xbiker321*
> 
> Just watercooled my Titan X. Just curious what clock speeds people are getting? 3DMark Ultra?
> 
> 7899 Firestrike Ultra graphics score (couldn't get past 8k)
> 2139 MHz core
> 5594 MHz memory
> GPU temp max 40°C


there is some comparative data here


----------



## EniGma1987

Quote:


> Originally Posted by *Glerox*
> 
> After changing from 1080s SLI to a titan XP 2 months later (blame on Nvidia...), I'm relieved to see that the gtx 1080 TI will not be more powerful than the titan X Pascal hahaha.
> 
> http://www.overclock3d.net/news/gpu_displays/nvidia_gtx_1080ti_specifications_leak/1


That image has been well proven to be photoshopped...


----------



## Glerox

Quote:


> Originally Posted by *EniGma1987*
> 
> That is a well proven photoshopped image...


Haha, oops, my bad then.
I was also wondering why the hell they would put GDDR5 VRAM on it (and not GDDR5X).


----------



## chronicfx

Quote:


> Originally Posted by *EniGma1987*
> 
> That is a well proven photoshopped image...


So does that mean the Ti doesn't exist, or just that there are no known specs for it?


----------



## Stateless

I was able to resolve my issue with one of my GPUs running 12-20°C hotter than my other card. Both cards have the same EK full-cover block and backplates, but for some reason GPU 1 was running 12-20°C hotter under heavy load than the bottom GPU 2 card. I ended up doing a few things. The main one was removing the block, removing the TIM, reapplying TIM, and reseating the block.

I also decided to run my GPU's on their own loop separate from the CPU. I know many say it does not make a difference, but I just wanted to give it a shot since I had some time off of work and had the funds to buy some water cooling gear. GPU's are running off a XSPC 360 Rad + XSPC 240 Rad. So far after testing, my max temp is 44c on the top card and 40c on the bottom, this is after an hour of running Heaven & Valley. Also broke my record in SLI Firestrike to boot.

What was weird is that my cards were only boosting to 1853 with a +200 to the core/+200 to the memory. I changed the core to +220, Memory to +600 and now they boost to 2050 and after about 15 min they drop to 2035 or so and basically stay there after an hour or so of Valley. It was only an additional +20 to the core and the boost jumped by almost 250. Not sure if something did not stick with the initial +200, but just happy to report that both cards are running cooler and was able to remove a potential issue with one card getting a lot hotter than the other.

Will try to post pics of my setup. I have the CPU running with red tubing and the GPUs with clear tubing but blue water. Some of the power cables are white, so it's almost an American/patriotic color scheme. I already want to swap the red tubing for clear with red water because of how good that looks to me. Lots of tubing though, since I wanted to set it up so I can cut off the water flow in two areas, to minimize the amount of water I lose when switching out GPUs.


----------



## Menthol

Quote:


> Originally Posted by *chronicfx*
> 
> So does that mean there is no existence of the Ti or are there just no known specs for it?


I would be surprised if there were no 1080 Ti or some other high-end card; otherwise Nvidia screwed their partner card manufacturers.


----------



## mbze430

Quote:


> Originally Posted by *Glerox*
> 
> After changing from 1080s SLI to a titan XP 2 months later (blame on Nvidia...), I'm relieved to see that the gtx 1080 TI will not be more powerful than the titan X Pascal hahaha.
> 
> http://www.overclock3d.net/news/gpu_displays/nvidia_gtx_1080ti_specifications_leak/1


Another tech review site has had that up for a month now; many say it is PS'd.


----------



## guttheslayer

Quote:


> Originally Posted by *Menthol*
> 
> I would be surprised if there was no 1080ti or same other high end card, otherwise Nvidia screwed there partner card manufacturers


Who knows; if Nvidia wanted, they could release a new revision of the Titan-class gaming card that is full-blown GP102, runs faster, and probably comes cheaper at $1k, just to counter Vega when it drops next year.

Anything can happen


----------



## carlhil2

Finally did the shunt mod today; waited till I put my card in its own loop. TDP went from a max of 135% down to 65.......

A couple of loops of Valley says the card used 125 watts? My ambient was 24°C during this run...


----------



## xarot

Finally got the motivation to put my TXP on water and in the loop. Had to partly drain it and change one tube; took around two nights with the block installation... never enough time. Some I/O brackets are still missing.



The loop is so much cooler than with TXM SLI.


----------



## TurricanM3

I did the shunt mod on all three shunts. I have a maximum PT in GPU-Z of ~45% while gaming, but Firestrike Ultra still keeps throttling down from 2100 to 1800-2000 MHz. When I raise the power limit to 120% it gets better, but I still have throttling.
What about your modded cards?


----------



## carlhil2

I only did two of them..., happy with results though...


----------



## Jpmboy

Quote:


> Originally Posted by *xarot*
> 
> Finally got the motivation to put my TXP on water and in the loop. Had to partly drain and change one tube, took around two nights with the block installation..never enough time. Some I/O brackets still missing.
> 
> 
> 
> Loop is so much cooler after TXM SLI.


nice set up buddy! And... QDCs!


----------



## carlhil2

Because the previous owner thought that they were so ugly, I got some for free, I love free stuff....


----------



## xarot

Quote:


> Originally Posted by *Jpmboy*
> 
> nice set up buddy! And... QDCs!


Thanks! Never another WC build for me without QDCs.


----------



## Steven185

Is there any way at all to avoid throttling? Even as I keep my temps to 42°C max, I'm still getting frequent downvolting/downclocking. The max clock my card can achieve is around 2075 MHz, but it hovers around 2000-2020. I mean ***.

At first I thought it was the temps, but even maxing at 42°C (instead of the 85 it used to) makes little difference. Is there any way, any way at all, to sustain high clocks with this card?


----------



## Jared Pace

Quote:


> Originally Posted by *Steven185*
> 
> Is there any way at all to avoid throttling? Is there any way, any way at all to sustain high clocks with this card?


Reducing voltage should equal a 1:1 ratio increase in boost clock until reaching instability/crashing. In the case of the card being at 40C and Power limited at ~2000mhz & 1060mv then -40mv should = +80mhz while keeping the same TDP, but increasing the likelihood of crashing. Try the generic I2C control in Msi Afterburner, and try to get a 10% reduction in voltage, and max out the power limit sliders. Maybe not using the Titan's fan or not using increased GDDR5X memory speed/volts/timings will add more headroom to the core mhz.

This wouldn't really be a problem if there were not power limits
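To put rough numbers on the undervolt-for-headroom idea, here is a crude dynamic-power sketch (P ≈ k·V²·f). The voltages and clocks are just the example figures above, and real boost behavior is stepped and includes leakage, so treat this as illustrative only:

```python
# Crude dynamic-power model: P ≈ k * V^2 * f.
# Holding P constant, a voltage drop frees up frequency headroom.

def max_freq_at_same_power(f0_mhz, v0_mv, v1_mv):
    """Frequency the same power budget allows after changing voltage."""
    # k * v0^2 * f0 == k * v1^2 * f1  =>  f1 = f0 * (v0 / v1)^2
    return f0_mhz * (v0_mv / v1_mv) ** 2

# ~2000 MHz at 1060 mV, then shave 40 mV off:
print(round(max_freq_at_same_power(2000, 1060, 1020)))  # 2160
```

By this model, -40 mV buys ~160 MHz at the same TDP; the real gain is smaller (closer to the +80 MHz rule of thumb above) because leakage and the discrete boost bins aren't captured here.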


----------



## DNMock

Quote:


> Originally Posted by *carlhil2*
> 
> Finally did the shunt mod today; waited till I put my card in its own loop. TDP went from a max of 135% down to 65.......
> 
> A couple of loops of Valley says the card used 125 watts? My ambient was 24°C during this run...


did something happen while I was gone? what's this 135 max PL you speak of?


----------



## axiumone

Quote:


> Originally Posted by *DNMock*
> 
> did something happen while I was gone? what's this 135 max PL you speak of?


Shunt mod.

Tutorial for a 1080; the same principle applies.


----------



## DNMock

Quote:


> Originally Posted by *axiumone*
> 
> Shunt mod.
> 
> Tutorial for a 1080; the same principle applies.


Your PL was at 135% before the shunt mod though, right? That's how I'm reading it anyway and I wanted to know how you were able to get past the 120% wall to begin with.


----------



## Jpmboy

Quote:


> Originally Posted by *carlhil2*
> 
> Because the previous owner thought that they were so ugly, I got some for free, I love free stuff....


lol - ugly to one is beautiful (and free) to another.









Quote:


> Originally Posted by *xarot*
> 
> Thanks! Never another WC build for me without QDCs.


^^ Absolutely. I just finished a Caselabs Mercury 8 build (5960X/R5E/SLI TitanXM... etc.) and put QDCs in so that I can separately remove the drop-in 2x360 rad assembly, the CPU block, and the SLI TitanX assembly without draining the loop. Even the pump/res can be removed without draining anything else. Only way to go.


----------



## carlhil2

Quote:


> Originally Posted by *DNMock*
> 
> did something happen while I was gone? what's this 135 max PL you speak of?


Before the mod, my card, according to GPU-Z, was hitting 135% TDP during 3DMark benching; my card was, in effect, the throttle KING in those benches. After the mod I get much higher scores with the same OC, but only hit about 65% TDP. The clock is steady = SUCCESS.... 2088 is my game-stable clock, so it's what I bench at....
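For anyone wondering why the mod roughly halves the reported TDP: the controller computes current as the shunt voltage drop divided by the stock shunt resistance, so bridging the shunt with liquid metal (a parallel resistance) scales the reading down. A sketch, assuming a 5 mΩ stock shunt and a similar-valued bridge (both values are assumptions; actual parts vary):

```python
def parallel(r1, r2):
    """Two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def reported_tdp(actual_tdp_pct, r_stock, r_bridge):
    """The controller still divides the measured voltage drop by the
    stock shunt value, so the reading scales by r_eff / r_stock."""
    r_eff = parallel(r_stock, r_bridge)
    return actual_tdp_pct * r_eff / r_stock

# 135% actual draw, 5 mOhm stock shunt bridged by ~5 mOhm of liquid metal:
print(reported_tdp(135, 0.005, 0.005))  # 67.5 -- right around the 65% seen here
```

Note the card still pulls the same real wattage; only the telemetry (and thus the throttle point) moves, which is why the UPS complains.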


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> Before the mod, my card, according to GPU-Z, was hitting 135% TDP during 3DMark benching; my card was, in effect, the throttle KING in those benches. After the mod I get much higher scores with the same OC, but only hit about 65% TDP. The clock is steady = SUCCESS.... 2088 is my game-stable clock, so it's what I bench at....


Can you run Time Spy and post GFX score on a single Modded Titan?


----------



## kx11

Batman AK runs a solid 60 fps @ 4K maxed out with this GPU.


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> Can you run Time Spy and post GFX score on a single Modded Titan?


http://www.3dmark.com/spy/451693


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> http://www.3dmark.com/spy/451693


Thx , what was it before the mod? in gfx?


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> Thx , what was it before the mod? in gfx?


http://www.3dmark.com/spy/451651


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> http://www.3dmark.com/spy/451651


Thank you.


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> Thank you.


No problem....


----------



## mbze430

Quote:


> Originally Posted by *TurricanM3*
> 
> I did the shunt mod on all three shunts. I have a maximum PT in GPU-Z of ~45% while gaming, but Firestrike Ultra still keeps throttling down from 2100 to 1800-2000 MHz. When I raise the power limit to 120% it gets better, but I still have throttling.
> What about your modded cards?


check it with GPU-Z, I am pretty sure it's limited and throttling because of voltage limits.


----------



## zipper17

Is it recommended to use a Molex-to-PCIe adapter? I read some Google results saying it's not recommended and causes problems.

I have a Molex-to-PCIe adapter that came with the graphics card itself (a free accessory).

Planning to SLI 1070s; my PSU is non-modular, so PCIe connectors are limited.

(Sorry, wrong thread lol, I thought this was the 1070 owners thread.) But if anyone can answer...


----------



## Steven185

Quote:


> Originally Posted by *Jared Pace*
> 
> Reducing voltage should equal a 1:1 ratio increase in boost clock until reaching instability/crashing. In the case of the card being at 40C and Power limited at ~2000mhz & 1060mv then -40mv should = +80mhz while keeping the same TDP, but increasing the likelihood of crashing. Try the generic I2C control in Msi Afterburner, and try to get a 10% reduction in voltage, and max out the power limit sliders. Maybe not using the Titan's fan or not using increased GDDR5X memory speed/volts/timings will add more headroom to the core mhz.
> 
> This wouldn't really be a problem if there were not power limits


Funny thing is that I get downvolting/downclocking even when I don't hit the power limit. For example, power may hover around 100% and I'm still being throttled; Afterburner tells me the voltage limit has been reached. I mean, what the heck. Who builds a card that hits its limits even during normal operation? It happens even when I don't overclock and my temps hover around the low 40s...

A BIOS mod cannot come soon enough. Upping the power limit to 140% (350 W) would be great, but also removing that d*mn voltage limit would be heaven.


----------



## EniGma1987

Quote:


> Originally Posted by *zipper17*
> 
> Is it recommended to use a Molex-to-PCIe adapter? I read some Google results saying it's not recommended and causes problems.
> 
> I have a Molex-to-PCIe adapter that came with the graphics card itself (a free accessory).
> 
> Planning to SLI 1070s; my PSU is non-modular, so PCIe connectors are limited.
> 
> (Sorry, wrong thread lol, I thought this was the 1070 owners thread.) But if anyone can answer...


It would be bad to use those for one main reason: a PCIe power connector has 3-4 12 V power lines and 3-4 ground lines, while a Molex has only one 12 V and one ground, so you are trying to draw a lot more amperage through the single power cable than it is rated for. Even the adapters that have 2 Molex plugs are still kind of unsafe, because they still draw a lot more power from those two 12 V cables than they are supposed to carry.

Another minor note: using Molex adapters for GPU power on an old multi-rail PSU could also go badly. Some PSUs used 2-3 12 V rails for the GPU and one 12 V rail for Molex and SATA, so in such cases the Molex adapter would overdraw the one rail meant for drives. Those PSUs are not really made any more, though, and were not common to begin with, so this probably won't happen to anyone.
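Rough per-contact numbers behind that warning (the 150 W load and the 3 x 12 V contacts of an 8-pin plug are illustrative figures; exact contact ratings depend on the connector and wire gauge):

```python
def amps_per_contact(watts, volts=12.0, contacts=1):
    """Current each 12 V contact carries, assuming an even share of the load."""
    return watts / volts / contacts

# 150 W through an 8-pin PCIe plug (3 x 12 V contacts):
print(round(amps_per_contact(150, contacts=3), 1))  # 4.2 A per contact
# The same 150 W through a single-Molex adapter (1 x 12 V contact):
print(round(amps_per_contact(150, contacts=1), 1))  # 12.5 A on one contact
```

That roughly 3x current per pin is why the single-Molex route gets dicey.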


----------



## xarot

Quote:


> Originally Posted by *Jpmboy*
> 
> ^^ Absolutely. I just finished a Caselabs Mercury 8 build (5960x/R5E/sli titanXM... etc) and put QDCs in so that you can separately remove the dropin 2x360 rad assembly, the cpu block and the SLI TitanX assembly without draining the loop. Even the pump/res could be removed without draining anything else. only way to go.


Same...every component is removable without draining. Using 6x pairs of Koolance QDCs.


----------



## Stateless

Quote:


> Originally Posted by *xarot*
> 
> Same...every component is removable without draining. Using 6x pairs of Koolance QDCs.


I'll have to test out QDCs when I do another build or a modification. I just finished adding a separate loop for my GPUs with shut-off valves; looking at QDCs, these could have saved me some time, with minimal drainage when switching out GPUs. I placed the shut-offs so I can stop the flow right before and right after the GPUs, but QDCs probably would have worked better. Oh well... live and learn.


----------



## DNMock

Quote:


> Originally Posted by *carlhil2*
> 
> Before the mod, my card, according to GPU-Z, was hitting 135% TDP during 3DMark benching; my card was, in effect, the throttle KING in those benches. After the mod I get much higher scores with the same OC, but only hit about 65% TDP. The clock is steady = SUCCESS.... 2088 is my game-stable clock, so it's what I bench at....


Ok, you are talking about TDP not PL, that's my bad.


----------



## Kendragon

Count me in!!! Got me a Titan XP! Waiting on a block from PerformancePCs. Cannot stand the loud fan and the 80°C temps!


----------



## mouacyk

Quote:


> Originally Posted by *xarot*
> 
> Same...every component is removable without draining. Using 6x pairs of Koolance QDCs.


Man, that can't be healthy. I hope you run dual pumps at least.


----------



## eliau81

So I have ordered the HYBRID KIT 1080 from EVGA and am going to mod it to fit the TXP.
I also ordered a Gentle Typhoon 120mm 1850 rpm to replace the stock fan on the rad, and GELID GC-EXTREME thermal compound. I'm also thinking of shunt-modding my TXP, so I have a few questions:

1. Is it safe to shunt-mod with the hybrid (I don't want to fry anything), and is there any (performance) gain from shunting?
2. I will use the CLU on the 5 mΩ shunts; do I need all 3 or just 2 of them (without the lower right)?
3. I have a Core i7 4790K @ 4.7 GHz with a Seasonic G-750 750 W; is it good enough for TXP OC?


----------



## carlhil2

Quote:


> Originally Posted by *eliau81*
> 
> So I have ordered the HYBRID KIT 1080 from EVGA and am going to mod it to fit the TXP.
> I also ordered a Gentle Typhoon 120mm 1850 rpm to replace the stock fan on the rad, and GELID GC-EXTREME thermal compound. I'm also thinking of shunt-modding my TXP, so I have a few questions:
> 
> 1. Is it safe to shunt-mod with the hybrid (I don't want to fry anything), and is there any (performance) gain from shunting?
> 2. I will use the CLU on the 5 mΩ shunts; do I need all 3 or just 2 of them (without the lower right)?
> 3. I have a Core i7 4790K @ 4.7 GHz with a Seasonic G-750 750 W; is it good enough for TXP OC?


I just put electrical tape around that area and smacked it with some Thermal Grizzly Conductonaut, and the mod made my clocks more stable... I only did two, though; knocked my TDP down to almost half... had my UPS crying today with the extra power being pulled through it...


----------



## xarot

Quote:


> Originally Posted by *mouacyk*
> 
> Man, that can't be healthy. I hope you run dual pumps at least.


Sure, there is some restriction introduced into the loop, but the current loop has been going strong for 16 months without any issues and temps are fine. Good enough for me.


----------



## Silent Scone

Quote:


> Originally Posted by *mouacyk*
> 
> Man, that can't be healthy. I hope you run dual pumps at least.


There's nothing wrong with it, stop being such a tart. Flow rate has minimal impact. My cards don't go over 38c and I've 3 pairs of QD3s on a single D5.


----------



## Nizzen

Quote:


> Originally Posted by *Silent Scone*
> 
> There's nothing wrong with it, stop being such a tart. Flow rate has minimal impact. My cards don't go over 38c and I've 3 pairs of QD3s on a single D5.


QD ftw

I have it on ALL components in both my PCs. Even the server

http://s413.photobucket.com/user/Nizzen/media/nizzen data 3x980.jpg.html


----------



## MunneY

Quote:


> Originally Posted by *Nizzen*
> 
> QD ftw
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have it on ALL components in both my pc's. Even the server
> 
> 
> 
> 
> 
> 
> 
> 
> http://s413.photobucket.com/user/Nizzen/media/nizzen data 3x980.jpg.html


Hey there LD PC-V8!

Sometimes I miss my V10... other than the noise.

And isn't 1200i x2 a lil overkill lol


----------



## mouacyk

Quote:


> Originally Posted by *xarot*
> 
> Sure there is some restriction introduced in the loop but current loop has been going strong for 16 months without any issues and temps are fine. Good enough for me.


That's good to know.
Quote:


> Originally Posted by *Silent Scone*
> 
> There's nothing wrong with it, stop being such a tart. Flow rate has minimal impact. My cards don't go over 38c and I've 3 pairs of QD3s on a single D5.


Not sure about minimal impact. Can we at least agree that people shouldn't be combining QDC's with a single pump willy-nilly, if quality of given QDCs and pump are unknown, and there may be excessive quantities of QDC's used? 3 pairs vs 6 pairs is also a factor difference of two.


----------



## Nizzen

Quote:


> Originally Posted by *mouacyk*
> 
> That's good to know.
> Not sure about minimal impact. Can we at least agree that people shouldn't be combining QDC's with a single pump willy-nilly, if quality of given QDCs and pump are unknown, and there may be excessive quantities of QDC's used? 3 pairs vs 6 pairs is also a factor difference of two.


I use one pump on all my systems, full of QDs. No problems.


----------



## Jpmboy

Quote:


> Originally Posted by *Nizzen*
> 
> QD ftw
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have it on ALL components in both my pc's. Even the server
> 
> 
> 
> 
> 
> 
> 
> 
> http://s413.photobucket.com/user/Nizzen/media/nizzen data 3x980.jpg.html


lol - that is an affliction.








Quote:


> Originally Posted by *mouacyk*
> 
> That's good to know.
> Not sure about minimal impact. Can we at least agree that people shouldn't be combining QDC's with a single pump willy-nilly, if quality of given QDCs and pump are unknown, and there may be excessive quantities of QDC's used? 3 pairs vs 6 pairs is also a factor difference of two.


the QDCs are not the main points of restriction... pushing coolant through serial CPU and GPU waterblocks introduces waaay more flow reduction. My bench table has 8 to 10 QDCs (depending on whether the EK chiller is in the loop). A single DDC-1T pump yields 2 L/min; switch on the second and it's 4 L/min. If I remove the GPUs to run an air-cooled card, the flow jumps to 4 L/min on a single pump.

Maybe there are QDC brands that are very restrictive, but the Koolance products have little impact on flow rate. That said, the guys at Aqua Computer pretty much showed that flow rate is... overrated anyway.
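The flow numbers above can be sketched with a toy pump-curve model: treat each component as a quadratic restriction and see how little a handful of low-restriction QDCs moves the operating point compared to two restrictive serial blocks. All resistance and pump values here are invented for illustration, not measurements of any real Koolance or EK part.

```python
# Toy loop-flow model: each component adds a quadratic pressure drop
# dP = R * Q^2, and the pump provides dP = p_max * (1 - Q / q_max).
# Units are arbitrary; the numbers are illustrative, not measured.

def flow_rate(resistances, p_max=4.0, q_max=6.0):
    """Solve p_max*(1 - Q/q_max) = sum(R)*Q^2 for Q by bisection."""
    r_total = sum(resistances)
    lo, hi = 0.0, q_max
    for _ in range(200):
        q = (lo + hi) / 2
        if p_max * (1 - q / q_max) > r_total * q * q:
            lo = q   # pump still has headroom: flow is higher
        else:
            hi = q   # loop restriction dominates: flow is lower
    return (lo + hi) / 2

blocks = [0.30, 0.30]   # two serial waterblocks, high restriction each
qdcs = [0.02] * 8       # eight QDCs, low restriction each

q_no_qdc = flow_rate(blocks)
q_full = flow_rate(blocks + qdcs)
print(f"blocks only: {q_no_qdc:.2f} L/min, blocks + 8 QDCs: {q_full:.2f} L/min")
```

With these made-up values the eight QDCs together cost under 10% of the flow, while the two blocks dominate the total restriction, which matches the qualitative point in the post.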


----------



## chronicfx

Quote:


> Originally Posted by *carlhil2*
> 
> Because the previous owner thought that they were so ugly, I got some for free, I love free stuff....


If those were cheap... quick connects just make so much more sense.


----------



## chronicfx

Quote:


> Originally Posted by *mouacyk*
> 
> Man, that can't be healthy. I hope you run dual pumps at least.


Do QDCs cause a lot of flow loss?


----------



## chronicfx

Quote:


> Originally Posted by *Nizzen*
> 
> QD ftw
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have it on ALL components in both my pc's. Even the server
> 
> 
> 
> 
> 
> 
> 
> 
> http://s413.photobucket.com/user/Nizzen/media/nizzen data 3x980.jpg.html


Holy... one for the computer and one for the cooling, lol? That's a lot of watts for two cards, I don't care what they are!


----------



## chronicfx

Quote:


> Originally Posted by *mouacyk*
> 
> That's good to know.
> Not sure about minimal impact. Can we at least agree that people shouldn't be combining QDC's with a single pump willy-nilly, if quality of given QDCs and pump are unknown, and there may be excessive quantities of QDC's used? 3 pairs vs 6 pairs is also a factor difference of two.


Somewhat true... but virtually every component that can be damaged nowadays has a thermal trigger. I bet the worst that could happen is finding it blue-screened... let's be real.


----------



## chronicfx

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - that is an affliction.
> 
> 
> 
> 
> 
> 
> 
> 
> the QDCs are not the main points of restriction... pushing coolant thru serial cpu and gpu waterblocks introduce waaay more flow reduction. My bench table has 8 to 10 QDCs (depending on whether the EK chiller is in the loop). A single DDC-1T pump yields 2 L/m, switch on the second and it's 4 L/min. If I remove the GPUs to run an air cooled card, the flow jumps to 4 l/m on a single pump.
> 
> Maybe there are QDCs brands that are very restrictive, the Koolance products have little impact on flow rate. That said, the guys at aquacomputer pretty much showed that flow rate is... over rated anyway.


You know this is like the 9th time or maybe the 900th that I have scrolled back to find you have either asked the same question or answered the same question just like I did... We have a lot in common!


----------



## carlhil2

Quote:


> Originally Posted by *chronicfx*
> 
> Holy... One for the computer and one for the cooling lol? Thats alot of watts for two cards I don't care what they are!


What? My pump/fan controllers have their own PSU, lol... at full blast they use close to 200 watts: 2 loops, 3 controllers, and 26 fans. Got rid of 4 of them; on low, they were humming...


----------



## MrKenzie

Quote:


> Originally Posted by *carlhil2*
> 
> What, my pumps/fans controllers have their own PSU, lol....at full blast, they use close to 200 watts...2 loops, 3 controllers, and, 26 fans. got rid of 4 of them, on low, they were humming...


I don't feel so bad now that my water chiller draws 220W! No fans or radiators required either.


----------



## carlhil2

Quote:


> Originally Posted by *MrKenzie*
> 
> I don't feel so bad now that my water chiller draws 220W! No fans or radiator's required either.


I need one of those, hopefully by the end of the year...will have to check with Jpmboy on a good one to buy..which model are you using?


----------



## KillerBee33

Got it all together; waiting for the EK block. So far, not very happy with the outcome


----------



## kx11

some kind of a Joker/riddler theme happening there


----------



## Asmodian

I noticed an odd phenomenon with my Titan X (Pascal) and madVR: up to a 60% swing in the power usage graphs on a ~30 second time scale, for no apparent reason. These results are using NNEDI3 with 256/32 neurons to double 720p24 to 1440p (if you happen to know what that means).

Titan X (Pascal) with: madVR, ETH Mining, Heaven Extreme Benchmark (144FPS cap)


Edit:
Of course, right after I posted I figured it out: it was a polling issue, simply the sampling interval drifting in and out of phase with a very spiky instantaneous power draw.









Increasing the polling speed to once every 100ms makes it obvious:









madVR renders each frame in 19ms but only needs a new frame every 41.7ms (23.976 FPS video), so the entire GPU powers down after rendering every single frame, much like C-states on a CPU, without changing the clock speed.
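The aliasing effect described above is easy to reproduce in a few lines: sample a spiky ~41.7 ms power cycle at a slow interval that is slightly incommensurate with it, and the readings swing wildly even though the real load is steady. The wattages and the square-wave power model are hypothetical, not madVR measurements.

```python
# Sketch of the polling-phase aliasing: instantaneous power is a 41.7 ms
# cycle (19 ms render burst, then idle), and the polling interval drifts
# in and out of phase with it. All wattages are made-up example values.

RENDER_MS, FRAME_MS = 19.0, 1000.0 / 23.976   # ~41.7 ms per video frame
P_BUSY, P_IDLE = 200.0, 60.0                  # hypothetical watts

def instant_power(t_ms):
    return P_BUSY if (t_ms % FRAME_MS) < RENDER_MS else P_IDLE

# Poll once per second: each sample lands at a slightly different phase of
# the frame cycle, so the graph slowly swings between busy and idle power.
slow = [instant_power(i * 1000.0) for i in range(120)]

# Poll every 100 ms: the individual spikes themselves become visible.
fast = [instant_power(i * 100.0) for i in range(1200)]

# The duty-cycle-weighted average never actually changes.
true_avg = P_IDLE + (P_BUSY - P_IDLE) * RENDER_MS / FRAME_MS
print(f"1 s polling sees {min(slow):.0f}-{max(slow):.0f} W; true average is {true_avg:.1f} W")
```

The slow samples span the full busy-to-idle range over a couple of minutes, which is exactly the ~30-second "mystery" swing; the fast samples reveal the per-frame spikes instead.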


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Got all together , waiting for EK Block , so far not very happy with the outcome


colorful.








Quote:


> Originally Posted by *kx11*
> 
> some kind of a Joker/riddler theme happening there


lol- that must be the theme.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> colorful.
> 
> 
> 
> 
> 
> 
> 
> 
> lol- that must be the theme.


Adding more dye does not seem to change the color any more


----------



## st0necold

Quote:


> Originally Posted by *Baasha*
> 
> Titan X Pascal 4 Way SLI Benchmarks:


that looks like a lot of fun
Quote:


> Originally Posted by *KillerBee33*
> 
> Got all together , waiting for EK Block , so far not very happy with the outcome


looking good bro


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Adding more DYE does not seem to change color any more


Handsome build! ... but are you really using colored dye in the loop?

(I mean, it's colorful, but these dyes can cause more contact-surface issues than you'd want to hear about.)


----------



## KillerBee33

What do you mean?


----------



## piee

In BF4, if I drop post-processing from ultra to medium, my FPS jumps from 71 to 138 on the TXP, and I don't notice any visual difference. What is post-processing even doing? It doesn't seem to do much in BF4 at 4K.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> What do you mean?


It's fine, bro... I just have an aversion to those types of additives. Just check the pH after a month or so; a simple pool pH test kit works fine. Make sure it's not dropping below 6.5 (ideally keep it between 7 and 7.5); low pH is the main cause of staining and muck buildup.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> it's fine bro... I just have an aversion to those types of additives. JUst check the pH after a month or so a simple pool pH kit works just fine - make sure it is not getting below 6.5, ideally between 7 and 7.5 - that's the main cause of staining and muck buildup.


Gonna flush it soon anyway, this was just a first try.
Ordering this for the top in push: http://www.newegg.com/Product/Product.aspx?Item=9SIA9F93EB0755&ignorebbr=1
Also ordered a 1.5mm thermal pad; this block wasn't meant for the Titan, so not all VRMs are in direct contact with it. Will probably use this for my coolant:
https://www.amazon.com/gp/product/B00QDSH8K8/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=A35IHEXR76RHMZ
Hard-modded that block to fit L32 on the Titan PCB.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Gonna flush it soon anyway , this was just a first try
> Ordering this for the top in PUSH http://www.newegg.com/Product/Product.aspx?Item=9SIA9F93EB0755&ignorebbr=1also ordered 1.5 Thermal Pad , this block wasnt meant for Titan so not all vrms ate in direct contact with it and probably will use this for my coolant
> https://www.amazon.com/gp/product/B00QDSH8K8/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=A35IHEXR76RHMZ
> Hard modded that block to fit L32 on the Titan pcb.


Yeah, the VRMs do need cooling. Unless you can machine the block, or use a cut copper shim with the right-size pads, stacking pads really cuts down on their thermal flux (a lot). To use a shim, it's: block - TIM (or thermal glue) - shim - T-pad - VRM. Thermal glue makes a semi-permanent setup for the VRMs; it works great, I did this on GPUs years ago.

Coolant? Just go to the grocery store, buy some distilled water, and add 1% Redline Water Wetter. Nothing works better (or costs less). PT-Nuke or any other stabilizer is fine also.
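The mix ratios in this thread (1% Water Wetter, or a 25% coolant concentrate blend) reduce to trivial arithmetic; a quick sketch, where the 1.5 L loop volume is just a hypothetical example:

```python
# Dilution arithmetic for the mixes mentioned above: 1% Water Wetter in
# distilled water, or a 25% concentrate / 75% DW blend. The loop volume
# is an example value, not anyone's actual loop.

def additive_ml(loop_ml, fraction):
    """Millilitres of additive for a given loop volume and mix fraction."""
    return loop_ml * fraction

loop_ml = 1500                          # example: a 1.5 L loop
ww = additive_ml(loop_ml, 0.01)         # 1% Water Wetter
conc = additive_ml(loop_ml, 0.25)       # 25% coolant concentrate
print(f"{ww:.0f} ml Water Wetter, or {conc:.0f} ml concentrate + {loop_ml - conc:.0f} ml DW")
```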


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> yeah, the VRMs do need cooling. Unless you can machine the block, or use a cut copper shim with the right size pads, stacking pads really cuts down on their thermal flux (a lot). To use a shim, its: "block-TIM (or thermal glue) - shim - Tpads - VRM". Thermal glue works great and make a semi-permanent setup for the vrms. Works great, I did this on gpus years ago.
> 
> coolant? just go to the grocery store, buy some distilled water, and add 1% redline water wetter. Nothing works better (or costs less). PT-Nuke or any other stabilizer is fine also.


Well, the space between a few of the VRMs and the block is a little under 1mm now, and not all of them; 1.5+1.5 on the left + top + right. Heh, done modding that block for now.








Distilled water... ehh, hopefully the UV dye won't eat up much in the 6 to 8 months before the new GPU is out.
Thank you for the info








Hmm, did I miss it, or did you say I can use TIM on the VRMs even if they're not in contact with the block?


----------



## MrKenzie

Quote:


> Originally Posted by *carlhil2*
> 
> I need one of those, hopefully by the end of the year...will have to check with Jpmboy on a good one to buy..which model are you using?


I'm using a Teco TK500. It's spot on for a CPU and 1x GPU; it keeps the liquid temp at 15-20C depending on the game/load, with ambient temps of about 25C.

I have mine set up so the compressor won't start until the liquid temp hits 32C, and it shuts off once it drops to 15C. Set up like that, it hardly ever runs when I'm not gaming; if I start gaming it kicks in automatically when the coolant hits 32C, and it runs continuously while I'm gaming because under load it usually can't get below 15C.

You'll definitely need a larger compressor than mine if you have multiple GPUs, though.
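The compressor behaviour described above is plain hysteresis (bang-bang) control: start at 32C, stop at 15C, and hold state anywhere in between. A toy state machine, assuming only that simple rule (this is not the TK500's actual firmware):

```python
# Hysteresis controller sketch: the compressor starts when the coolant
# reaches ON_AT and stops at OFF_AT; between the two it keeps its state.
# Setpoints match the post; everything else is a toy model.

ON_AT, OFF_AT = 32.0, 15.0

def step(compressor_on, temp_c):
    """Return the new compressor state for the current coolant temp."""
    if not compressor_on and temp_c >= ON_AT:
        return True
    if compressor_on and temp_c <= OFF_AT:
        return False
    return compressor_on   # inside the band: keep doing what we were doing

state = False
trace = []
for t in [20, 28, 32, 25, 16, 15, 20, 31, 33]:
    state = step(state, t)
    trace.append(state)
print(trace)
```

The wide 15-32C band is why it hardly cycles at idle: the compressor only flips state at the edges, instead of chattering on and off around a single setpoint.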


----------



## carlhil2

Quote:


> Originally Posted by *MrKenzie*
> 
> I'm using a Teco TK500. It's spot on for a CPU and 1x GPU, keeps the liquid temp at 15-20c depending on the game/load. That is with ambient temps of about 25c.
> 
> I have mine set up so the compressor won't start until the liquid temp hits 32c, and it will shut off when it drops it to 15c. Having it like that I find it hardly ever runs when I'm not gaming, and then if I start gaming it will automatically start when the coolant hits 32c and run continuously while I'm gaming because it can't lower it below 15c under load on most occasions.
> 
> You definitely need a larger compressor than mine if you have multi GPU's though.


Thanks for the info. And yeah, CPU/one GPU only. Pascal is easy to cool with water; it's my CPU that gets the water warm, which is why I went back to dual loops. Since I reseated my GPU block, my temps haven't hit more than +12 over ambient on a hot day. During gaming, my card hitting about 34C is the norm at 25/26C ambient.


----------



## MrKenzie

Quote:


> Originally Posted by *carlhil2*
> 
> Thanks for the info, and, yeah, cpu/one gpu only. Pascal is easy to cool with water, it's my cpu that gets the water warm, that's why I went back to dual loops....since I reseated my gpu block, my temps haven't hit more than +12 over ambient, on a hot day....during gaming, my card hitting about 34c is the norm, @25/26c ambient......


The GPU should create much more heat than the CPU; 200W vs 100W is twice the heat potential. I find my CPU hardly heats the water at all compared to the GPU. The CPU's heat transfer into the block is also poor compared to the GPU's.


----------



## aliron

Hi everyone, this is my first post here.
I bought the Titan X Pascal, and a few days later I saw this while browsing in Firefox:
http://subefotos.com/ver/?e384de19b0335cde8de6ad62b9301de7o.jpg

The artifacts change a little when I move the mouse over them, and they only show in that one tab. Just closing Firefox makes everything normal again. I sent a ticket to Nvidia, and because it didn't happen again, they closed it.
But days later (on the 372.70 drivers instead of 372.54), it happened again in a video in Quantum Break, and two days after that while scrolling a Google results page. Those two were just a flash, but looked like the first one. I reopened the ticket and I'm waiting for Nvidia's answer.

I have to say that apart from this, games seem to work normally, with no errors (I've had the card since August 25th).
The errors happen not in 3D mode, but with some hardware acceleration (Firefox and video playback), and in those moments there are spikes in the memory and GPU frequency and voltage (as with other GPUs I've owned).
Has anything like this happened to any of you? Do you think the GPU is defective, or is it caused by drivers or something else?
Thank you.
EDIT: I forgot to say the Titan X has the stock cooler, so I haven't touched anything.


----------



## bl4ckdot

Quote:


> Originally Posted by *aliron*
> 
> Hi everyone in my first post here.
> I bought the Titan X Pascal and few days late i saw this when browsing in firefox:
> http://subefotos.com/ver/?e384de19b0335cde8de6ad62b9301de7o.jpg
> 
> This artifacts change a little when i move the mouse over them, and it only shows in this tab. Just closing Firefox everything goes normal again. I send a ticket to Nvidia and because not happend again, they close it.
> But days later, (372.70 drivers instead 372.54) it happend again in a video in the Quantum Break game and 2 days later when scrolling in a google results page. This 2 were just a flash, but looks like the first one. I reopened the ticket and im waiting the Nvidia answer.
> 
> I have to say that except for this, in games seems to work normal, with no errors (i have it since august 25th).
> The errors happend not in 3D mode, just with some hardware acceleration (Firefox and playing video). And in that moments there are peaks in the memory and GPU frequency and voltage (like with other GPUs i own)
> Did anything like this happend to anyone of you? Do you think the GPU is defective or its originated because drivers or something?
> Thank You.
> EDIT: I forgot to say that the titan X is with the stock cooler. So i dont touch anything.


It made me remember an old thread I started with more or less the same issue : http://www.overclock.net/t/1539080/artifacts-not-gpu-related


----------



## Jpmboy

Quote:


> Originally Posted by *aliron*
> 
> Hi everyone in my first post here.
> I bought the Titan X Pascal and few days late i saw this when browsing in firefox:
> http://subefotos.com/ver/?e384de19b0335cde8de6ad62b9301de7o.jpg
> 
> This artifacts change a little when i move the mouse over them, and it only shows in this tab. Just closing Firefox everything goes normal again. I send a ticket to Nvidia and because not happend again, they close it.
> But days later, (372.70 drivers instead 372.54) it happend again in a video in the Quantum Break game and 2 days later when scrolling in a google results page. This 2 were just a flash, but looks like the first one. I reopened the ticket and im waiting the Nvidia answer.
> 
> I have to say that except for this, in games seems to work normal, with no errors (i have it since august 25th).
> The errors happend not in 3D mode, just with some hardware acceleration (Firefox and playing video). And in that moments there are peaks in the memory and GPU frequency and voltage (like with other GPUs i own)
> Did anything like this happend to anyone of you? Do you think the GPU is defective or its originated because drivers or something?
> Thank You.
> EDIT: I forgot to say that the titan X is with the stock cooler. So i dont touch anything.


can be caused by several things:

1) check that all driver settings are at defaults
2) memory OC can do this during video playback in 2D mode (P8-P5 state) since the voltage table may not raise appropriately in those states with a high vram OC
3) disable hardware acceleration in firefox
4) lastly, the card may just be a bad part, and should be returned for RMA (soon).


----------



## profundido

Quote:


> Originally Posted by *aliron*
> 
> Hi everyone in my first post here.
> I bought the Titan X Pascal and few days late i saw this when browsing in firefox:
> http://subefotos.com/ver/?e384de19b0335cde8de6ad62b9301de7o.jpg
> 
> This artifacts change a little when i move the mouse over them, and it only shows in this tab. Just closing Firefox everything goes normal again. I send a ticket to Nvidia and because not happend again, they close it.
> But days later, (372.70 drivers instead 372.54) it happend again in a video in the Quantum Break game and 2 days later when scrolling in a google results page. This 2 were just a flash, but looks like the first one. I reopened the ticket and im waiting the Nvidia answer.
> 
> I have to say that except for this, in games seems to work normal, with no errors (i have it since august 25th).
> The errors happend not in 3D mode, just with some hardware acceleration (Firefox and playing video). And in that moments there are peaks in the memory and GPU frequency and voltage (like with other GPUs i own)
> Did anything like this happend to anyone of you? Do you think the GPU is defective or its originated because drivers or something?
> Thank You.
> EDIT: I forgot to say that the titan X is with the stock cooler. So i dont touch anything.


This is a new feature of the Titan X Pascal and Firefox called "*ATARI mode*"....


----------



## KillerBee33

This is why I was skeptical about ordering from overseas.

Shipped today from SLOVENIA, delivery tomorrow in "WOODSIDE NY", BROOKLYN NY... so many things wrong....


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Well space between few VRMs and the block is a little under 1 mm now and not all of them 1.5+1.5 Left +Top+Right , heh done modding that block for now
> 
> 
> 
> 
> 
> 
> 
> 
> Distilled Water ...ehh hopefully UV DYE wont eat up much in 6 to 8 months before new GPU is out
> Thank you for the info
> 
> 
> 
> 
> 
> 
> 
> 
> Humm , did i miss it or you said i can use TIM on the VRMs even if they not in contact with the block?


IF you use a copper shim to cut down on the stacking of T-Pads, use thermal glue to attach the shim to the block, then t-Pad to the vrm.. so in this order: Block-glue/tim-shim-pad-vrm.








Quote:


> Originally Posted by *carlhil2*
> 
> Thanks for the info, and, yeah, cpu/one gpu only. Pascal is easy to cool with water, it's my cpu that gets the water warm, that's why I went back to dual loops....since I reseated my gpu block, my temps haven't hit more than +12 over ambient, on a hot day....during gaming, my card hitting about 34c is the norm, @25/26c ambient......


You thinking of getting a chiller? Aquarium chillers work great; I have the AquaEuroUSA 1/10 HP. The Koolance EXC-800 is a beast, and really made for PC cooling, but I think it would be overkill for one GPU/CPU (and is also like 5x the cost!)
Quote:


> Originally Posted by *profundido*
> 
> This a new feature of the Titan X Pascal and Firefox called "*ATARI mode*"....











Quote:


> Originally Posted by *KillerBee33*
> 
> This is why i was skeptical ordering from Overseas
> 
> 
> 
> 
> 
> 
> 
> 
> Shipped today from SLOVENIA , Delivery tomorrow in "WOODSIDE NY" BROOKLYN NY... so many things wrong ....


if that's your Ek shipment, no worries. I have ordered dozens of times from them - direct - and delivery is 100% perfect (so far).


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> IF you use a copper shim to cut down on the stacking of T-Pads, use thermal glue to attach the shim to the block, then t-Pad to the vrm.. so in this order: Block-glue/tim-shim-pad-vrm.


As you may notice, I'm fairly new to all this.

Any chance you can point me to where I can get these things? I really like this block and don't mind doing mods (I've been working in jewelry a long time), but I have no idea what to work with.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> As you may notice i'm fairly new to all this
> 
> 
> 
> 
> 
> 
> 
> Any chance you can point to where i can get these things? I really like this block and don't mind doing mods" been working in jewelry a long time" but have no idea what to work with .


Copper shim? Just a piece of copper plate or sheet at the thickness you need. So if the gap is, say, 1.2mm, get a piece of scrap copper sheeting (like roofing) that's 0.7mm (hardware store, Home Depot, etc.) and a 0.5mm T-pad (a common size). Thermal adhesive from Performance-PCs,
or get it all from Performance-PCs:

http://www.performance-pcs.com/thermal-compounds/shopby/tim-type--copper-pad/?limit=90

http://www.performance-pcs.com/arctic-alumina-thermal-adhesive-5-0-grams.html

http://www.performance-pcs.com/akasa-thermal-adhesive-tape-80-x-80mm.html#Specifications
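The shim sizing above is simple subtraction; a small sketch that picks the thinnest standard pad and fills the rest of the gap with copper (the pad thicknesses are common catalogue sizes, and the 1.2 mm gap reproduces the example from the post):

```python
# Shim sizing sketch: for a measured block-to-VRM gap, use the thinnest
# standard thermal pad that fits and make up the difference with copper,
# so the pad stack stays thin. Pad sizes are common catalogue values.

PAD_SIZES_MM = [0.5, 1.0, 1.5]

def pick_shim(gap_mm):
    """Return (shim_mm, pad_mm) using the thinnest pad smaller than the gap."""
    for pad in PAD_SIZES_MM:
        if pad < gap_mm:
            return round(gap_mm - pad, 2), pad
    raise ValueError("gap smaller than the thinnest pad")

shim, pad = pick_shim(1.2)
print(f"1.2 mm gap -> {shim} mm copper shim + {pad} mm pad")
```

For the 1.2 mm gap this gives the 0.7 mm copper + 0.5 mm pad combination Jpmboy suggests; a thicker pad would fit too, but a thinner pad plus copper conducts heat better.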


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> copper shim? just a piece of copper plate or sheet at the thickness you need. So.. if the gap is say, 1.2mm, get a piece of scrap copper sheeting (like roofing ) that's 0.7mm (hardware store, homedepot etc) and a 0.5mm t-pad (common). Thermal adhesive from Performance PCS.
> or get it all from PCS
> 
> http://www.performance-pcs.com/thermal-compounds/shopby/tim-type--copper-pad/?limit=90
> 
> http://www.performance-pcs.com/arctic-alumina-thermal-adhesive-5-0-grams.html
> 
> http://www.performance-pcs.com/akasa-thermal-adhesive-tape-80-x-80mm.html#Specifications


Thank you. Block - glue - copper - pad - VRM.

After all, I might have to use distilled water just to check this block's performance a few times. Any suggestions on what to use when flushing?
This is what you suggest mixing with distilled water? http://www.newegg.com/Product/Product.aspx?Item=9SIA13209P3110&ignorebbr=1


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Thank you. Block-Glue-Copper-Pad-VRM
> 
> 
> 
> 
> 
> 
> 
> 
> After all i might have to use Distilled Water just to check this Blocks performance few times , any suggestions on what to use when flushing?
> This is what you suggest with Distilled Water? http://www.newegg.com/Product/Product.aspx?Item=9SIA13209P3110&ignorebbr=1


Yes... just use a CAPFUL per gallon of DW for PC cooling, no more. If you prefer an additive made for PC water cooling, get a bottle of clear or UV coolant (Koolance is very good stuff) and run it as a 25% mix with DW.
Regarding RLWW, I have been using it in loops since 2013... one loop has not been drained or flushed since then. The liquid is clear, the pH is nominal (this is really critical; pH makes all the difference in any coolant mix), and the components (plexi blocks) look as shiny and clear as when they were new.

You mean flushing the loop? Just use DW or RO water. Check the watercooling section here at OCN for the best procedures; lots of expert help there.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> yes... just use a CAPFULL per gallon of DW. No more. If you prefer to use an additive made for PC water cooling, just get a bottle of clear or UV coolant (koolance is very good stuff) and run it as a 25% mix with DW.
> regarding RLWW, I have been using it in loops since 2013.. one has not been drained or flushed since then. liquid is clear, pH is nominal (this is really critical, pH makes all the difference in any coolant mix) and components (plexi blocks) looks are shinny and clear as when they were new.
> 
> you mean to flush the loop? Just use water DW or RO water. check the watercooling section here at OCN for the best procedures. Lots of expert help there.


Thanks again.
Adding a 240 rad on top, and will try to mod that block.








Ordered this for a finished Build
https://www.amazon.com/gp/product/B00QDSH8K8/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1


----------



## DADDYDC650

Finally got a refund for the defective XP I sent back to Nvidia. Question, should I settle for a 1080 or purchase another Titan XP? I'm planning on either going for a 60Hz UltraWide 34" or the Acer x34p. I plan on playing BF1, Forza Horizon 3, Gears 4 and whatever highly rated fps/action adventure game comes out in the next couple of months. Any thoughts would be appreciated.


----------



## xbiker321

The X34 will do 100Hz, so you'll probably need the Titan XP to push those frames at that 1440p ultrawide res. Unless money is a factor, I'd get the Titan hands down.


----------



## aliron

Quote:


> Originally Posted by *bl4ckdot*
> 
> It made me remember an old thread I started with more or less the same issue : http://www.overclock.net/t/1539080/artifacts-not-gpu-related


How did you solve it, and what was the problem? In my case it's only happened 3 times in almost a month. The only similarity is that the GPU is under low load, but I can't reproduce the error.

Quote:


> Originally Posted by *Jpmboy*
> 
> can be caused by several things:
> 
> 1) check that all driver settings are at defaults
> 2) memory OC can do this during video playback in 2D mode (P8-P5 state) since the voltage table may not raise appropriately in those states with a high vram OC
> 3) disable hardware acceleration in firefox
> 4) lastly, the card may just be a bad part, and should be returned for RMA (soon).


1- The driver should be OK, because I've reinstalled several times with Display Driver Uninstaller.
2- There was no memory OC while browsing (I only set an OC when playing), but in MSI Afterburner I see the memory go from 405MHz to 5006MHz (the default speed). I think my previous 980 Ti did the same, but I'm not 100% sure now.
4- For now it seems the only option. Nvidia already answered me and asked me to send the card in for a replacement, so let's hope the new one is OK.

Thanks all for answering


----------



## jhowell1030

If you're thinking about getting an ultrawide that runs above 60Hz, you're going to want an XP for sure.

I have an X34 and it drives it nicely. Even some demanding titles like Witcher 3 and Crysis 3 run above 60 FPS for me with everything turned up.


----------



## Jpmboy

Quote:


> Originally Posted by *aliron*
> 
> How you solve it and what was the problem? In my case its only 3 times in almost a moth. The only similarities is that the GPU is working with low load, but i cant reproduce the error.
> 1- The driver should be ok, because i reinstall several times with Display Driver Uninstaller.
> 2- There was no memory OC when browsing (i only set OC when playing), but in MSI Afterburner i see how the memory goes from 405Mhz to 5006Mhz (default speed). But i think its something my previous 980ti do, but im not 10% sure now.
> 4-For now it seems the only option. Nvidia already answer me and ask me to send a replacement, so lets hope the new one its ok
> 
> Thanks all for answering


good going. A replacement is the right thing to do. It's a new card, let NVidia sort it out.


----------



## TurricanM3

The shunt mod doesn't completely work for me.
Fire Strike Ultra still keeps throttling. When I run Test 1 in loop mode I can see the card slowing down from 2114 to 19XX, though mostly it stays above 2000MHz. Some games also throttle in 4K. Sometimes you can't see the short drops in the monitoring history; you have to watch the OSD or log the frequency with a 200ms polling rate.

Today I added some LM; all three shunts are completely covered with a lot of LM (Liquid Pro):

http://abload.de/image.php?img=20160919_1852005jsb3.jpg

Highest TDP in GPU-Z while running Test 1 in a loop was ~35%.
My max TDP in the 30s GPU-Z render test (without OC) is just 17.5%.

Any ideas how to get rid of the throttling entirely?


----------



## Jpmboy

Quote:


> Originally Posted by *TurricanM3*
> 
> The shunt mod doesn't work for me completely.
> Fire Strike Ultra still keeps throtteling. When i run Test 1 in loop mode i can see the card slowing down from 2114 to 19XX. Mostly it stays above 2000MHz though. Some games also throttle in 4k. Sometimes you can't see the short drops in the monitoring history. You have to watch the OSD or log the frequency with 200ms polling rate.
> 
> Today i added some LM, all three shunts are completely covered with a lot of LM (Liquid Pro):
> 
> http://abload.de/image.php?img=20160919_1852005jsb3.jpg
> 
> Highest TDP in GPU-Z while running test 1 in loop was ~35%.
> My max. TDP in 30s GPU-Z rendertest (without OC) is just 17.5%.
> 
> Any ideas how to entirely get rid of the throtteling?


cool the core... and be aware that should any of the LM wander off the resistors - dead card and no RMA.


----------



## TurricanM3

That's quite plain to me.
I have LM on the GPU too. The problem isn't temperature-related; the card is watercooled and hardly reaches 42°C.


----------



## mbze430

it's probably limited by Vrel...


----------



## TurricanM3

Quote:


> Originally Posted by *mbze430*
> 
> it's probably limited by Vrel...


What does that mean exactly?
Is that a TXP problem with all cards?


----------



## Jpmboy

Quote:


> Originally Posted by *TurricanM3*
> 
> That's quite plain to me.
> I have LM on the GPU too. The problem isn't related to temperatures. The card is watercooled and hardly reaches 42°.


LM on the GPU is no benefit if the coolant temp is in the high 30s. lol, I don't consider 42C "cool". From what I've seen with two cards, if you keep the core temp below 35C there is no thermal throttling.
Like mbze430 said, what throttle reason does GPU-Z show? Vrel?


----------



## Asmodian

Quote:


> Originally Posted by *Jpmboy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TurricanM3*
> 
> That's quite plain to me.
> I have LM on the GPU too. The problem isn't related to temperatures. The card is watercooled and hardly reaches 42°.
> 
> 
> 
> LM on the GPU is no benefit if the coolant temp is in the high 30s. lol, I don;t consider 42C "cool". From what I've seen with two cards, if you keep the core temp below 35C there is no thermal throttling.
> Like MBZE said, what throttle reason does GPUZ show? Vrel?
Click to expand...

I think there is thermal throttling even at 35°C; there is probably a 20°C-35°C step, with steps all the way down to -80°C.









This new boost is actually pretty funny. I like how applying an OC in Afterburner drops the voltage of all the middle "slightly using the GPU" states. For example, on my GPU at 36-37°C: 1418 MHz goes from 775 mV to 743 mV when hitting apply on a +100 MHz OC, and further to 700 mV at +178 MHz; the clock speed doesn't change but the voltage drops. It is very reproducible, and a larger overclock always drops the voltage more bins. 1569 MHz goes from 800 mV at +100 MHz to 762 mV at +178 MHz. 1734 MHz goes from 943 mV at stock to 843 mV at +178 MHz.









Edit: There seems to be a floor at 625 mV, and temperature does the opposite of a +OC, moving the voltage up the bins at each clock speed. Based on my testing, there is clearly a temperature step between 34°C and 35°C.


----------



## glnn_23

Hi all. Can someone tell me if I would gain much from the shunt mod if I'm at 1.08V at the moment?

Also here's something I'm trying with ek water block and back plate.

Only tried 3dMark and max temp so far 25C.


----------



## Asmodian

Quote:


> Originally Posted by *glnn_23*
> 
> Hi all. Can someone tell me if I would gain much from the shunt mod if I'm at 1.08v at the moment.
> 
> Also here's something I'm trying with ek water block and back plate.
> 
> Only tried 3dMark and max temp so far 25C.


I do see my card power throttling to between 1900 and 2000 MHz with various loads, while keeping it below ~45°C at all times. It is hard to measure the performance impact. I am unstable over ~2050 MHz, so I don't have much room above there even without it throttling, but probably at least +100 MHz.









Set your polling rate in Afterburner really high (100 ms) and see how often the power limit triggers.
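If you log the monitoring data rather than watch it live, counting the perf-cap flags is straightforward. A hedged sketch (the column name and row format are assumptions; adjust them to whatever your Afterburner or GPU-Z log export actually contains):

```python
from collections import Counter

def perfcap_counts(rows, column="PerfCap Reason"):
    """Tally each non-empty throttle reason across log samples."""
    return Counter(row[column].strip() for row in rows if row.get(column, "").strip())

# toy samples standing in for parsed log rows
samples = [
    {"PerfCap Reason": "Pwr"},
    {"PerfCap Reason": "Vrel"},
    {"PerfCap Reason": "Pwr"},
    {"PerfCap Reason": ""},  # no cap that sample
]
assert perfcap_counts(samples)["Pwr"] == 2
assert perfcap_counts(samples)["Vrel"] == 1
```

The ratio of "Pwr" samples to total samples gives a rough sense of how often the power limiter is actually costing you clocks.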


----------



## glnn_23

Quote:


> Originally Posted by *Asmodian*
> 
> I do see my card power throttling to between 1900 and 2000 MHz with various loads, while keeping it below ~45°C at all times. It is hard to measure the performance impact. I am unstable over ~2050 MHz, so I don't have much room above there even without it throttling, but probably at least +100 MHz.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Set your polling rate in afterburner really high (100ms) and see how often the power limit triggers.


Thanks for the info, I'll give it a try.


----------



## bee144

Quote:


> Originally Posted by *KillerBee33*
> 
> This is why i was skeptical ordering from Overseas
> 
> 
> 
> 
> 
> 
> 
> 
> Shipped today from SLOVENIA , Delivery tomorrow in "WOODSIDE NY" BROOKLYN NY... so many things wrong ....


I just placed two orders with EK a few weeks ago. DHL moved quickly, assuming your order doesn't get caught up on the weekend, since DHL stops all movement on weekends for general express packages.

DHL has two main hubs. One in Cincinnati and the other in NY. I assume you must live in NY given the quick turnaround.

Your package will require a signature, FYI. Also, one of my orders required that I pay an import tax before DHL released the package. It only held it up by a day, though.


----------



## Jpmboy

Yeah - boost 3.0 is a strange thing. I never see temp throttling or voltage limit (via AB or GPUZ), only power limit and SLI (?). This has been the case down to 8C with a max T of 19C (EXC-800 Chiller). Like any otherr IC, cold always buys some Hz.








Three consecutive runs of FSU test 1:



Time Spy gives the perf cap reasons as Vrel and SLI.


----------



## KillerBee33

Quote:


> Originally Posted by *bee144*
> 
> I just placed two orders from EK a few weeks ago. DHL moved quickly assuming your order doesn't get caught up on the weekend as DHL stops all movement on the weekend for general express packages.
> 
> DHL has two main hubs. One in Cincinnati and the other in NY. I assume you must live in NY given the quick turnaround.
> 
> Your package will require a signature, FYI. Also, one of my orders required that I pay an import tax before DHL released the package. It only held it up by a day, though.


Emailed EK about better shipping options, got an email back; will be getting it this Friday, which would make it exactly a week after ordering.







not sure if it's believable though








What Import TAX?


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Emailed EK about better shipping options, got an email back; will be getting it this Friday, which would make it exactly a week after ordering.
> 
> 
> 
> 
> 
> 
> 
> 
> not sure if it's believable though
> 
> 
> 
> 
> 
> 
> 
> 
> What Import TAX?


none into the US that I ever had to pay.


----------



## Yuhfhrh

Quote:


> Originally Posted by *KillerBee33*
> 
> Emailed EK about better shipping options, got an email back; will be getting it this Friday, which would make it exactly a week after ordering.
> 
> 
> 
> 
> 
> 
> 
> not sure if it's believable though
> 
> 
> 
> 
> 
> 
> 
> 
> What Import TAX?


Quote:


> Originally Posted by *Jpmboy*
> 
> none into the US that I ever had to pay.


No import tax in the US on shipments under $2500.


----------



## CallsignVega

Boost 3.0 sucks. To start throttling at only 35C? Can we say ridiculously and pointlessly restrictive.


----------



## bee144

Quote:


> Originally Posted by *KillerBee33*
> 
> Emailed EK about better shipping options, got an email back; will be getting it this Friday, which would make it exactly a week after ordering.
> 
> 
> 
> 
> 
> 
> 
> not sure if it's believable though
> 
> 
> 
> 
> 
> 
> 
> 
> What Import TAX?


Quote:


> Originally Posted by *Jpmboy*
> 
> none into the US that I ever had to pay.


Quote:


> Originally Posted by *Yuhfhrh*
> 
> No import tax in the US on shipments under $2500.


My first order only cost $78 so there was no import tax. My second order cost ~$1060 and I had to pay about a $50 import tax, which is about 5%. I think once you go above a certain amount, you get charged import tax.

Also keep in mind that import taxes are different depending on what type of items you are buying.

If you're buying jewelry overseas, then the import tax limit might be $2500 for that category, for example.
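For what it's worth, the figures in the post above are self-consistent; a quick sanity check of the "about 5%" duty rate:

```python
# sanity check: duty paid relative to declared order value (figures from the post)
order_value = 1060.0
duty_paid = 50.0
rate_pct = 100 * duty_paid / order_value
assert 4.5 < rate_pct < 5.0  # about 4.7%, i.e. roughly the "5%" quoted
```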


----------



## DADDYDC650

Just ordered another Titan XP to go along with my new Samsung UN65KS8500 which I got for a sweet deal thru EPP.


----------



## Yuhfhrh

Quote:


> Originally Posted by *bee144*
> 
> My first order only cost $78 so there was no import tax. My second order cost ~$1060 and I had to pay about a $50 import tax, which is about 5%. I think once you go above a certain amount, you get charged import tax.
> 
> Also keep in mind that import taxes are different depending on what type of items you are buying.
> 
> If you're buying jewelry overseas, then the import tax limit might be $2500 for that category, for example.


Unless you're using it for business, most anything under $2500 can be imported as an informal entry, AFAIK. Sure, there are some exceptions, but I think computer hardware is safe. I've certainly received packages valued near $2000 that did not require any import duties.


----------



## HotClock

Quote:


> Originally Posted by *aliron*
> 
> Hi everyone, this is my first post here.
> I bought the Titan X Pascal, and a few days later I saw this while browsing in Firefox:
> http://subefotos.com/ver/?e384de19b0335cde8de6ad62b9301de7o.jpg
> 
> These artifacts change a little when I move the mouse over them, and they only show in this tab. Just by closing Firefox, everything goes back to normal. I sent a ticket to Nvidia and, because it didn't happen again, they closed it.
> But days later (on the 372.70 drivers instead of 372.54) it happened again in a video in the game Quantum Break, and two days later when scrolling on a Google results page. These two were just a flash, but looked like the first one. I reopened the ticket and I'm waiting for Nvidia's answer.
> 
> I have to say that, except for this, games seem to work normally, with no errors (I have had it since August 25th).
> The errors happen not in 3D mode, just with some hardware acceleration (Firefox and playing video). And in those moments there are peaks in the memory and GPU frequency and voltage (like with other GPUs I own).
> Has anything like this happened to any of you? Do you think the GPU is defective, or is it caused by drivers or something?
> Thank you.
> EDIT: I forgot to say that the Titan X has the stock cooler, so I haven't touched anything.


I had the same thing happen to me with MS Edge and videos playing, and this was even before I did any overclocking. Hopefully they patch it. Would be nice too if they patched in 10-bit support, WCG and HDR for the desktop. After all, it's only $1200, and for that price we should expect the latest and greatest.


----------



## aliron

Quote:


> Originally Posted by *HotClock*
> 
> I had the same thing happen to me with MS Edge and videos playing, and this was even before I did any overclocking. Hopefully they patch it. Would be nice too if they patched in 10-bit support, WCG and HDR for the desktop. After all, it's only $1200, and for that price we should expect the latest and greatest.


Does it happen all the time, or only a few times like on my PC?
I still think it's a problem that a new BIOS or driver can fix, because it only happens in rare cases and doesn't freeze the PC or anything like that. But who knows.


----------



## axiumone

Quote:


> Originally Posted by *CallsignVega*
> 
> Boost 3.0 sucks. To start throttling at only 35C? Can we say ridiculously and pointlessly restrictive.


They've decided on a very strict TDP envelope with Pascal cards, regardless of what the user wants, and will do anything and everything to stay in that envelope, even gimping performance of their ultra-expensive flagship product in the process. It makes no sense.


----------



## Fera Nenem

Hi guys,

I am also seeing artifacts on my Titan X Pascal and I need some advice from you.

I tested the card for 3 days on air before putting it on water.
The OC I did for testing was +200/0. Temperatures were around 84°C.

After putting it on water I have seen artifacts a couple of times. Once playing BF4 at 4K Ultra: the bottom half of the screen had artifacts for like 2 seconds and then they disappeared. The other time was playing Onward on my Vive with SS 1.7: just like the other time, the bottom half showed artifacts for 2 seconds and then they went away.
The OC I used was +175/+175. The temperature never went over 48°C. The card was at the volt limit, with 100% GPU usage, hitting 2012 MHz/5175 MHz.

Since then I have done some FurMark at 4K 8xMSAA with the following results: max core OC +275 MHz/0 (after that the screen locked; no artifacts before that); max memory OC 0/+400 MHz (after that there were some artifacts visible in the upper quarter of the screen).
I also ran the FurMark stress test for 2 hours with +200/+200, which is the best performing OC I could get stable. No artifacts or freezing.
I also ran Valley at 1440p 8xMSAA for 2 hours without issues.

I talked to NVIDIA and explained that I have seen artifacts while gaming, and they told me to RMA the card, since it is probably defective hardware. However, I would have to revert to air and risk having the RMA denied. I am not sure if what I am seeing is a real problem, and therefore I would like to keep the card.
Right now I am gaming at 2038 MHz/5200 MHz and I am really happy with the performance. The card never goes above 48°C.

Now my questions:
1) Is there any reason for me to RMA the card?
2) Is there any way for me to confirm a hardware defect?
3) Has anyone tried to RMA a card that was put on water?
4) Should I panic?


----------



## TurricanM3

Quote:


> Originally Posted by *Jpmboy*
> 
> LM on the GPU is no benefit if the coolant temp is in the high 30s.


That's wrong. My delta T is just 4-5°. With regular paste you reach around 10° or more.

Quote:


> Originally Posted by *Jpmboy*
> 
> lol, I don't consider 42C "cool". From what I've seen with two cards, if you keep the core temp below 35C there is no thermal throttling.
> Like MBZE said, what throttle reason does GPUZ show? Vrel?


I can bench @35° and the card still throttles. This is not thermal throttling. Don't you know that the cards throttle just 2-3 steps in this temp region? The first step is around 28°, the next ~35°, and the last step ~43°. My card throttles a lot more than that. As I wrote, even down to 19XX with the temp under 40°. GPU-Z perf cap is the known rainbow.


----------



## KillerBee33

You can't make this stuff up


----------



## cg4200

Quote:


> Originally Posted by *TurricanM3*
> 
> That's wrong. My delta T is just 4-5°. With regular paste you reach around 10° or more.
> I can bench @35° and the card still throttles. This is not thermal throttling. Don't you know that the cards throttle just 2-3 steps in this temp region? The first step is around 28°, the next ~35°, and the last step ~43°. My card throttles a lot more than that. As I wrote, even down to 19XX with the temp under 40°. GPU-Z perf cap is the known rainbow.


Quote:


> Originally Posted by *Fera Nenem*
> 
> Hi guys,
> 
> I am also seeing artifacts on my Titan X Pascal and I need some advice from you.
> 
> I tested the card for 3 days on air before putting it on water. The OC I did for testing was +200/0; temperatures were around 84°C.
> 
> After putting it on water I have seen artifacts a couple of times. Once playing BF4 at 4K Ultra: the bottom half of the screen had artifacts for like 2 seconds and then they disappeared. The other time was playing Onward on my Vive with SS 1.7: just like the other time, the bottom half showed artifacts for 2 seconds and then they went away. The OC I used was +175/+175; the temperature never went over 48°C. The card was at the volt limit, with 100% GPU usage, hitting 2012 MHz/5175 MHz.
> 
> Since then I have done some FurMark at 4K 8xMSAA: max core OC +275 MHz/0 (after that the screen locked; no artifacts before that); max memory OC 0/+400 MHz (after that there were artifacts in the upper quarter of the screen). I also ran the FurMark stress test for 2 hours with +200/+200, the best performing OC I could get stable: no artifacts or freezing. I also ran Valley at 1440p 8xMSAA for 2 hours without issues.
> 
> I talked to NVIDIA, explained that I have seen artifacts while gaming, and they told me to RMA the card, since it is probably defective hardware. However, I would have to revert to air and risk having the RMA denied. I am not sure if what I am seeing is a real problem, and therefore I would like to keep the card. Right now I am gaming at 2038 MHz/5200 MHz and really happy with the performance; the card never goes above 48°C.
> 
> Now my questions:
> 1) Is there any reason for me to RMA the card?
> 2) Is there any way for me to confirm a hardware defect?
> 3) Has anyone tried to RMA a card that was put on water?
> 4) Should I panic?



I would not panic. I had no artifacts while playing until I installed the newest drivers; now, randomly while playing Star Citizen or browsing, I will see artifacts. I have an EK block with a +200/+500 overclock, same as I have been running; the only thing I changed was the driver. I am going to try going back to the old driver and see if it still does it. You don't say which driver you have? Worst case, if it persists, take off the water block carefully so as not to put spin scratches where the screws are, and RMA. I will test mine out tonight. Good luck!


----------



## Jpmboy

Quote:


> Originally Posted by *CallsignVega*
> 
> Boost 3.0 sucks. To start throttling at only 35C? Can we say ridiculously and pointlessly restrictive.


yeah, I'd call it that. A Pascal BIOS editor would fix these issues very quickly.
Quote:


> Originally Posted by *TurricanM3*
> 
> That's wrong. My delta T is just 4-5°. With regular paste you reach around 10° or more.
> I can bench @35° and the card still throttles. This is not thermal throttling. Don't you know that the cards throttle just 2-3 steps in this temp region? The first step is around 28°, the next ~35°, and the last step ~43°. My card throttles a lot more than that. As I wrote, even down to 19XX with the temp under 40°. GPU-Z perf cap is the known rainbow.


lol Wrong? Don't I know? C'mon. Bench what? Where are your scores? It depends on the TIM and the quality of the block mount; LM is just overkill on a TXP. Here ya go: after 8 hours folding on 2 TXPs (2000 to 1987 MHz depending on the folding core), coolant is 30C cold side, 33C hot side, GPUs are 38C. Max delta 8C. (This is with Thermal Grizzly on copper blocks, which are a no-no with LM.)
But if you are a believer in LM, then fine.
You'd do better by improving your cooling loop.









after overnight folding on two cards: 4x 420 rads (Aquacomputer Gigant, 1x 220 fan), 1x 360 rad (3x 120 fans)



GPU temps were the same whether using full-cover or uniblocks, TGK or Gelid Extreme.

btw - "don't you know" that if you run your mouse over the "known rainbow" you can ID every cause? Like I said, at 9C there is no thermal throttling. The CLU resistor mod needs to be done stepwise; not all cards benefit from all 3 resistors being shorted, and some end up as cards locked into the P8 state. Did you measure the change in resistance for each resistor? If not, you are doing it blind and hoping for a good result.
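The delta both posters are arguing about is just GPU core temperature minus coolant temperature; a lower delta means a better TIM/mount at the same coolant temp. A minimal sketch using the figures quoted in this exchange (the 37C coolant value for the LM card is my reading of "high 30s", so treat it as an assumption):

```python
def delta_t(gpu_temp_c, coolant_temp_c):
    """Die-to-coolant temperature delta in degrees C."""
    return gpu_temp_c - coolant_temp_c

# figures quoted in this exchange (coolant for the LM card assumed ~37C)
assert delta_t(38, 30) == 8  # paste on copper blocks, cold-side coolant at 30C
assert delta_t(42, 37) == 5  # liquid metal, coolant in the "high 30s"
```

Either way, the boost table keys off the absolute core temperature, which is why improving the loop (lowering coolant temp) buys more than swapping TIM once the delta is already single-digit.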


----------



## Fera Nenem

Quote:


> Originally Posted by *cg4200*
> 
> I would not panic. I had no artifacts while playing until I installed the newest drivers; now, randomly while playing Star Citizen or browsing, I will see artifacts. I have an EK block with a +200/+500 overclock, same as I have been running; the only thing I changed was the driver. I am going to try going back to the old driver and see if it still does it. You don't say which driver you have? Worst case, if it persists, take off the water block carefully so as not to put spin scratches where the screws are, and RMA. I will test mine out tonight. Good luck!


I am using the latest. It is 372.xx something... it was the latest when I first installed the card, and it still is.
Please keep us posted here. This thing has been giving me a bad headache!

Thanks!


----------



## TurricanM3

I have my loop running below 500 rpm because I like it silent. My delta got cut in half after using LM; I had Kryonaut on the cards before. Folding doesn't stress the GPU as much as FSU test 1 in a loop. But that's not the point. If you want to use paste, stick with it.

I wonder why some (not all) cards still throttle. That was my initial question.


----------



## Silent Scone

Quote:


> Originally Posted by *CallsignVega*
> 
> Boost 3.0 sucks. To start throttling at only 35C? Can we say ridiculously and pointlessly restrictive.


I've not seen this?


----------



## eliau81

Quote:


> Originally Posted by *Fera Nenem*
> 
> Hi guys,
> 
> I am also seeing artifacts on my Titan X Pascal and I need some advice from you.
> 
> I tested the card for 3 days on air before putting it on water. The OC I did for testing was +200/0; temperatures were around 84°C.
> 
> After putting it on water I have seen artifacts a couple of times. Once playing BF4 at 4K Ultra: the bottom half of the screen had artifacts for like 2 seconds and then they disappeared. The other time was playing Onward on my Vive with SS 1.7: just like the other time, the bottom half showed artifacts for 2 seconds and then they went away. The OC I used was +175/+175; the temperature never went over 48°C. The card was at the volt limit, with 100% GPU usage, hitting 2012 MHz/5175 MHz.
> 
> Since then I have done some FurMark at 4K 8xMSAA: max core OC +275 MHz/0 (after that the screen locked; no artifacts before that); max memory OC 0/+400 MHz (after that there were artifacts in the upper quarter of the screen). I also ran the FurMark stress test for 2 hours with +200/+200, the best performing OC I could get stable: no artifacts or freezing. I also ran Valley at 1440p 8xMSAA for 2 hours without issues.
> 
> I talked to NVIDIA, explained that I have seen artifacts while gaming, and they told me to RMA the card, since it is probably defective hardware. However, I would have to revert to air and risk having the RMA denied. I am not sure if what I am seeing is a real problem, and therefore I would like to keep the card. Right now I am gaming at 2038 MHz/5200 MHz and really happy with the performance; the card never goes above 48°C.
> 
> Now my questions:
> 1) Is there any reason for me to RMA the card?
> 2) Is there any way for me to confirm a hardware defect?
> 3) Has anyone tried to RMA a card that was put on water?
> 4) Should I panic?


1. I do not see any reason to RMA the card; you have a very good OC with nice temps. Don't panic.

2. In this case there is no hardware defect. The factory clocks are stable; you got a little issue when trying to push a bit further, so I'm guessing you probably didn't win the silicon lottery.

3. I tried to replace my stock cooler with the EVGA 980 HYBRID kit, and while removing the hex screws with a wrench (yeah, I know, unprofessional work, don't judge me, I didn't have the 4.5mm socket for those damn screws) I ended up scratching the PCB and the card was totaled. So I contacted the green team, told my story (all true), and they approved the RMA, and I was like

so don't panic; I was, and I had a good reason to.


----------



## Jpmboy

Quote:


> Originally Posted by *Silent Scone*
> 
> I've not seen this?


yeah - it's a load-dependent clock-bin drop that is not consistent across applications. Time Spy (DX12) does not trip the drop, and it seems that Heaven does not either, but Firestrike will. IDK, it's usually only 13 MHz anyway. Games... eh, hard to tell since the load is so erratic and there's no K-boost. Tho you can lock the card in P0 using NVI.


----------



## Jpmboy

Quote:


> Originally Posted by *eliau81*
> 
> 1. I do not see any reason to RMA the card; you have a very good OC with nice temps. Don't panic.
> 
> 2. In this case there is no hardware defect. The factory clocks are stable; you got a little issue when trying to push a bit further, so I'm guessing you probably didn't win the silicon lottery.
> 
> 3. I tried to replace my stock cooler with the EVGA 980 HYBRID kit, and while removing the hex screws with a wrench (yeah, I know, unprofessional work, don't judge me, I didn't have the 4.5mm socket for those damn screws) I ended up scratching the PCB and the card was totaled. *So I contacted the green team, told my story (all true), and they approved the RMA*, and I was like
> 
> so don't panic; I was, and I had a good reason to.


This is very good to hear!


----------



## Fera Nenem

Quote:


> Originally Posted by *eliau81*
> 
> 1. I do not see any reason to RMA the card; you have a very good OC with nice temps. Don't panic.
> 
> 2. In this case there is no hardware defect. The factory clocks are stable; you got a little issue when trying to push a bit further, so I'm guessing you probably didn't win the silicon lottery.
> 
> 3. I tried to replace my stock cooler with the EVGA 980 HYBRID kit, and while removing the hex screws with a wrench (yeah, I know, unprofessional work, don't judge me, I didn't have the 4.5mm socket for those damn screws) I ended up scratching the PCB and the card was totaled. So I contacted the green team, told my story (all true), and they approved the RMA, and I was like
> 
> so don't panic; I was, and I had a good reason to.


I did read the older posts and saw your history. You had a lot of luck! Good for you.
I talked with NVIDIA and they said that ANY change would void the warranty. Taking off the original cooler is not allowed, and the warranty is void even for problems that could not be related to the change.


----------



## Silent Scone

That's good to know for people genuinely just trying to fit aftermarket cooling, so they're not waving any warranty-void flags. Did wonder how much of a pig it would be.


----------



## eliau81

Quote:


> Originally Posted by *Fera Nenem*
> 
> I did read the older posts and saw your history. You had a lot of luck! Good for you.
> I talked with NVIDIA and they said that ANY change would void the warranty. Taking off the original cooler is not allowed, and the warranty is void even for problems that could not be related to the change.


Yes, indeed.
I also explained that the stock cooler is very loud and unbearable; the GPU was very hot, like 85C, without OC.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*


How are you keeping them @ 35?


----------



## Silent Scone

With a good loop. Mine only go above that when the weather is particularly warm.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> How are you keeping them @ 35?


Quote:


> Originally Posted by *Silent Scone*
> 
> With a good loop. Mine only go above that when the weather is particularly warm.


It really is that simple. The Pascal core is very cool-running.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> It really is that simple. Pascal core is very cool-running










Does that mean I MUST use that ugly @ss EK block, huh...
Feel bad letting the KOMODO go. Beautiful-looking thing.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> 
> 
> 
> 
> 
> 
> 
> Does that mean I MUST use that ugly @ss EK block, huh...
> Feel bad letting the KOMODO go. Beautiful-looking thing.


why get rid of the komodo block? Just needs a little "customization" right?


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> why get rid of the komodo block? Just needs a little "customization" right?


It's not doing well so far: 54 in Time Spy "UNCLOCKED", +/- a few degrees because, well, windows open, AC off, and 80 outside.
Ordered what you suggested; will see how that goes first, for sure, but I'm skeptical about it.
Also, an Alphacool 240 rad is on its way for the top in push; that might save me a few degrees, hopefully.


----------



## CallsignVega

Quote:


> Originally Posted by *Jpmboy*
> 
> yeah - it's a load-dependent clock-bin drop that is not consistent across applications. Time Spy (DX12) does not trip the drop, and it seems that Heaven does not either, but Firestrike will. IDK, it's usually only 13 MHz anyway. Games... eh, hard to tell since the load is so erratic and there's no K-boost. Tho you can lock the card in P0 using NVI.


I've also seen higher resolutions/demand trigger the drop much earlier. Look at some of Baasha's videos; his downclock to even the 16xx range! Granted, he's running at like 90C, lol, but still.


----------



## Yuhfhrh

Quote:


> Originally Posted by *Jpmboy*
> 
> yeah - it's a load-dependent clock-bin drop that is not consistent across applications. Time Spy (DX12) does not trip the drop, and it seems that Heaven does not either, but Firestrike will. IDK, it's usually only 13 MHz anyway. Games... eh, hard to tell since the load is so erratic and there's no K-boost. Tho you can lock the card in P0 using NVI.


The latest Afterburner can lock the frequency now like K-boost. Ctrl+F, choose frequency, then Ctrl+L.


----------



## KillerBee33

Quote:


> Originally Posted by *Yuhfhrh*
> 
> The latest Afterburner can lock the frequency now like K-boost. Ctrl+F, choose frequency, then Ctrl+L.


You mean the custom curve can be locked? Tried the curve with the 1080; got a higher clock but worse performance.


----------



## Yuhfhrh

Quote:


> Originally Posted by *KillerBee33*
> 
> You mean the custom curve can be locked? Tried the curve with the 1080; got a higher clock but worse performance.


No need for a custom curve. Use a standard offset, and double-click on the highest frequency/1.09V point.


----------



## KillerBee33

Quote:


> Originally Posted by *Yuhfhrh*
> 
> No need for a custom curve. Use a standard offset, and double click on the highest frequency/1.09V


Cool. Will give it a shot tonight.


----------



## mbze430

We just need a BIOS mod... then we can all be happy.


----------



## KillerBee33

Quote:


> Originally Posted by *mbze430*
> 
> We just need a BIOS mod... then we can all be happy.


Amen to that.
Chose the cheapest shipping on Thursday night.


----------



## Jpmboy

Quote:


> Originally Posted by *Yuhfhrh*
> 
> The latest Afterburner can lock the frequency now like K-boost. Ctrl+F, choose frequency, then Ctrl+L.


gotta try that again (different GPU on that MB atm)... if you open NV Inspector after doing Ctrl+F, Ctrl+L, what does it say in the P-State field?


----------



## unreality

It says P0


----------



## Yuhfhrh

Quote:


> Originally Posted by *Jpmboy*
> 
> gotta try that again (different GPU on that MB atm)... if you open NV Inspector after doing Ctrl+F, Ctrl+L, what does it say in the P-State field?


Quote:


> Originally Posted by *unreality*
> 
> It says P0


Yup, P0.


----------



## Menthol

Jpmboy, in SLI I could only get Ctrl+L to work on one card at a time, never on both cards at the same time, although it does work on one card. Setting up X99 for benching again soon; will try after I get some tubing in this week.


----------



## DADDYDC650

Looks like Microsoft is giving away free Gears 4 keys with 1070's/1080's.... should I go SLI 1080 for 4K or 1 XP?


----------



## axiumone

Quote:


> Originally Posted by *DADDYDC650*
> 
> Looks like Microsoft is giving away free Gears 4 keys with 1070's/1080's.... should I go SLI 1080 for 4K or 1 XP?


1 XP. Especially if you're looking to play dx12 games from the windows store. None of them support multi GPU yet.


----------



## DADDYDC650

Quote:


> Originally Posted by *axiumone*
> 
> 1 XP. Especially if you're looking to play dx12 games from the windows store. None of them support multi GPU yet.


Typical MS...


----------



## axiumone

Quote:


> Originally Posted by *DADDYDC650*
> 
> Typical MS...


Here's some awesome new technology. Everyone go use it right now! Except we won't use any of it in the titles that we are going to produce, because it's outside their scope.







I think my eyes just rolled out of their sockets.


----------



## bee144

Quote:


> Originally Posted by *axiumone*
> 
> Here's some awesome new technology. Everyone go use it right now! Except we won't use any of it in the titles that we are going to produce, because it's outside their scope.
> 
> 
> 
> 
> 
> 
> 
> I think my eyes just rolled out of their sockets.


That's because DX12 changed how multi-GPU setups are integrated into games; I know you already know this, though. Also, the Windows Store was recently rewritten from the ground up for Universal apps. Between the DX12 multi-GPU changes and the newness of the Windows Store, there are likely going to be some growing pains as things are finalized.

Please note that while I work for Microsoft, I don't work on DirectX or the Windows Store. Everything I've shared in this post is my own viewpoint and not reflective of Microsoft. I have not disclosed anything that isn't already publicly known. (Required when I speak about my employer.)


----------



## axiumone

Quote:


> Originally Posted by *bee144*
> 
> That's because DX12 changed how multi gpu setups are integrated into games. I know you already know this though. Also, the Windows Store was recently rewritten from the ground up for Universal apps. Between the DX12 multi gpu changes and the newness of the Windows Store, there is likely going to be some growing pains as things are finalized.
> 
> Please note that while I work for Microsoft, I don't work with DirectX or the Windows Store. Everything I've shared in this post is my own view point and not reflective of Microsoft. I have not disclosed anything that isn't already publicly known. (Required when I speak about my employer)


Didn't know we had anyone from MS on here. Welcome aboard!

Any radical redesign will have a lengthy adoption period. I'm just cranky because it feels like a lot of devs have abandoned multi-GPU support over the last few years.


----------



## Jpmboy

Quote:


> Originally Posted by *Yuhfhrh*
> 
> Yup, P0.











Quote:


> Originally Posted by *unreality*
> 
> It says P0


thx bro. +1
Quote:


> Originally Posted by *Menthol*
> 
> Jpmboy, in SLI I could only get CTRL-L to work on one card at a time, never on both cards at the same time, although it works on one card. Setting up X99 for benching again soon; will try after I get some tubing in this week


cool. ATM I'm playing with this base-model ASUS GTX 1080 that I never tested, and now find that it's doing over 2100 on air! I gifted it to my nephew and borrowed it back for a few days... he's havin' fun with two water-cooled TXPs for a few days.








... well maybe just tonight.


----------



## eliau81

Quote:


> Originally Posted by *axiumone*
> 
> Didn't know we had anyone from MS on here. Welcome aboard!
> 
> Any radical redesign will have a lengthy adoption period. I'm just cranky because it feels a lot of devs have abandoned multi gpu support over the last few years.


You also have someone who used to work for Intel ... Hmm that's me


----------



## Jpmboy

Quote:


> Originally Posted by *CallsignVega*
> 
> I've also seen higher resolutions/demand trigger the drop much earlier. Look at some of Baasha's videos; his downclock to even the 16xx range! Granted he's running like 90C lol, but still.


bro - I see you are selling one of your TXPs? gonna run solo?


----------



## willverduzco

Quote:


> Originally Posted by *aliron*
> 
> Hi everyone, this is my first post here.
> I bought the Titan X Pascal, and a few days later I saw this while browsing in Firefox:
> http://subefotos.com/ver/?e384de19b0335cde8de6ad62b9301de7o.jpg
>
> These artifacts change a little when I move the mouse over them, and they only show in this tab. Just closing Firefox makes everything normal again. I sent a ticket to Nvidia, and because it didn't happen again, they closed it.
> But days later (on the 372.70 drivers instead of 372.54) it happened again in a video in the Quantum Break game, and two days later while scrolling a Google results page. These two were just a flash, but looked like the first one. I reopened the ticket and I'm waiting for Nvidia's answer.
>
> I have to say that apart from this, games seem to work normally with no errors (I've had the card since August 25th).
> The errors happen not in 3D mode, just with some hardware acceleration (Firefox and video playback), and at those moments there are peaks in memory and GPU frequency and voltage (as with other GPUs I own).
> Has anything like this happened to any of you? Do you think the GPU is defective, or is it caused by drivers or something else?
> Thank you.
> EDIT: I forgot to say that the Titan X has the stock cooler, so I haven't touched anything.


Like a few others in this thread, I've occasionally gotten this same exact pattern of artifacts on my Titan XP. Both of my Gigabyte GTX 1080 Waterforce cards did the same as well at mild memory OCs while browsing YouTube through MS Edge, but never in a game.

Given how sporadic the issue is, I wouldn't worry about it.

On my TXP, I've gotten the artifacts about 3 times while using Edge (YouTube) in the last month. It's never happened in a game for me, but I don't own Quantum Break. I've never seen it in anything other than HW accelerated browsing in Edge (low performance state); and never in Heaven, Timespy, Firestrike, 3dm11, or any game that I've played (which would run in high performance states). The same goes for the 1080s that I got rid of before upgrading to the TXP.

As Jpmboy stated, I think the most likely explanation is that video playback in 2D (P8-P5 performance states) does not raise the voltage. See if it still happens at stock memory speed (but first reboot after changing clocks to fully clear out the memory). If so, then you MAY want to RMA.


----------



## carlhil2

I don't understand how guys are getting temps in the mid 40's to mid 50's with these cards under water.....maybe too much paste/block too tight?


----------



## Yuhfhrh

Quote:


> Originally Posted by *carlhil2*
> 
> I don't understand how guys are getting temps in the mid 40's to mid 50's with these cards under water.....maybe too much paste/block too tight?


I don't break 35C even after hours of gaming; I have a 480mm rad dedicated to the card though.


----------



## Woundingchaney

Quote:


> Originally Posted by *carlhil2*
> 
> I don't understand how guys are getting temps in the mid 40's to mid 50's with these cards under water.....maybe too much paste/block too tight?


I'm getting mid 50s with an aio. I would assume those with higher end water solutions would be getting a good bit lower temperatures.


----------



## carlhil2

Quote:


> Originally Posted by *Yuhfhrh*
> 
> I don't break 35C even after hours of gaming, I have 480MM dedicated to the card though.


Yeah, that sounds about right, but, I have seen guys hit mid 50's...







even a good 240mm or even a 140mm Monsta would suffice...I have used both, separately and together, no difference...together though, I can run my fans at half speed...







push only...


----------



## Fera Nenem

Quote:


> Originally Posted by *carlhil2*
> 
> I don't understand how guys are getting temps in the mid 40's to mid 50's with these cards under water.....maybe too much paste/block too tight?


And I don't understand how you are getting 35°C...

Can I add some info so we're sure the comparison is fair?

THE LOOP ITSELF:
- aquaduct XT 720 external with 6x120mm fans;
- about 2L of water in the system;
- measuring 80L/h;
- 11/13mm tubes;
- water goes in at about 33°C and comes out 0.5 to 1°C cooler under load.

ON THE LOOP:
- CPU i7 [email protected] running somewhere around 50-60°C under load;
- GPU Titan [email protected]+200/+400 running 48°C under load.

THE WATER BLOCK:
- kryographics Pascal for NVIDIA TITAN X, acrylic glass edition, nickel-plated version from Aqua Computer;
- backplate has not arrived yet.

COMMENTS:
- This is the third time I've mounted a water block on a graphics card;
- This is the first time I've used kryographics from Aqua Computer; I always used EK blocks before;
- If there is anything wrong with my build, it should be too much thermal paste and/or too tight a mount. I used 0.8g of Thermal Grizzly Kryonaut for the GPU and VRAM;
- The block from Aqua Computer works with thermal paste on the VRAM modules. That was new for me and I wasn't very comfortable with it;
- (edit) my room temp is somewhere around 25 to 27°C.
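For what it's worth, the flow and in/out delta quoted above are enough to sanity-check how much heat the loop is actually shedding: at steady state the radiator rejects roughly mass flow times the specific heat of water times the in/out delta. A minimal sketch (the function name is mine; it assumes plain water and the figures from the post):

```python
def radiator_watts(flow_l_per_h: float, delta_t_c: float) -> float:
    """Heat shed by the radiator: mass flow (g/s) x c_water x deltaT."""
    grams_per_second = flow_l_per_h * 1000 / 3600  # ~1 g per mL of water
    c_water = 4.186  # J/(g*K), specific heat of water
    return grams_per_second * c_water * delta_t_c

# 80 L/h with a 0.5 to 1.0 C delta across the radiator:
low = radiator_watts(80, 0.5)   # ~46 W
high = radiator_watts(80, 1.0)  # ~93 W
```

If the sensors are right, that delta only accounts for roughly 45-95 W, well under a loaded CPU plus GPU, so it may be worth double-checking the flow meter and temperature probes before blaming the block mount.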


----------



## carlhil2

Quote:


> Originally Posted by *Fera Nenem*
> 
> And I don't understand how you are getting 35°C...
> 
> Can I add some info so we are sure that the comparison is fair?
> 
> THE LOOP ITSELF:
> - aquaduct XT 720 eternal with 6x120mm fans;
> - about 2L water in the system;
> - measuring 80L/h;
> - 11/13mm tubes;
> - water goes in at about 33 and goes out 0,5 to 1°C less out under load.
> 
> ON THE LOOP:
> - CPU i7 [email protected] running somewhat 50-60°C under load;
> - GPU Titan [email protected]+200/+400 running 48°C under load.
> 
> THE WATER BLOCK:
> - kryographics Pascal for NVIDIA TITAN X acrylic glass edition, nickel plated version from Aqua Computer;
> - backplate did not arrive yet.
> 
> COMMENTS:
> - This is the third time I build water blocks on a graphic card;
> - This is the first time I use kryographics from Aqua Computer. I always used EK blocks before;
> - If there is anything wrong with my build it should be too much thermal paste and/or too tight. I used 0,8g Thermal Grizzly Kryonaut for GPU and VRAM;
> - The block from Aqua Computer works with thermal paste on the VRAM modules. That was new for me and I wasn't very confortable with that;
> - (edit) my room temp is somewhat 25 to 27°C.


I can only speak for myself, but when I first installed my uniblock I was hitting about 42C on average. I was satisfied, up to the point when I pulled the card to do some work on it. Once reinstalled, I booted up and my card's temp was up to 85C. Alarmed, I checked my loop and realized that I had forgotten to open one of the valves I had attached to the block. lol, water flow wasn't reaching it, of course..







After that incident, my temps were up to 55C during benching. Anyway, I pulled the card and reseated the block along with some fresh TIM, and BAM, my temps never go above 10C over ambient; on average the card hits about 35C, and that's during long, I mean hours, of gaming... yesterday temps hit 38C, but that's because my ambient was 28C; I had the AC turned off, too loud, and it got HOT in here... my ambient on average is about 26C max...


----------



## xarot

I am getting some very annoying stuttering in Witcher 3. I didn't see such issue with Titan X Maxwell SLI but this happens now with my single TXP. Anyone else?

GPU is water cooled and max temps are 43c. No OC.


----------



## DooRules

Using the EK block on my Titan, Kryonaut for thermal paste. Ambient temp is usually around 20-21°C. This is the temp the card idles at as well, 20-21°C. The hottest I ever see is running Heaven, when it can get to 32°C. Have yet to see higher than that with the block on.

The Titan is on its own loop with a four-fan CoolStream XE rad and an XSPC Photon 270 reservoir/pump combo. Put cold air to the rads and I can get the GPU temp down to low single digits.

Outside of having very high ambient temps, not sure how you can get temps above 35°C with this card under water.


----------



## carlhil2

Quote:


> Originally Posted by *DooRules*
> 
> Using the EK block on my Titan. Kryonaut for thermal paste. Ambient temp is usually around 20 -21' C. This is the temp the card idles at as well, 20 - 21' C. The hottest I ever see is running Heaven when it can get to 32' C. Have yet to see higher than that with the block on.
> 
> The Titan is on its own loop with a 4 fan Coolstreams XE rad. XSPC Photon 270 reservoir pump combo. Put cold air to the rads and I can get gpu temp down to low single digits.
> 
> Outside of having very high ambient temps not sure how you can get temps above 35' C with this card under water.


Lol, for reals...45-55c is unacceptable with Pascal under water...and I throw the max voltages at mine sometimes...I thought that my previous 42-43c was proper, the temps that I also got with my 980Ti under water, it wasn't....


----------



## DooRules

Yeah, for reals


----------



## carlhil2

Also, when I reinstalled my block, I only used my fingers to tighten the block. I might have had it too tight the first time..


----------



## jeff3206

My 2 TXPs finally arrived. Now with 1080 hybrid coolers in my Corsair Graphite 380T modded to micro ATX.

I have the hybrid pumps powered from a motherboard header instead of the card, using this cable running out the back of the card. Hybrid radiator fans and pumps are then controlled by a custom curve in speedfan. I find this works well for managing hybrid pump noise.

Notwithstanding the compromised airflow I can keep everything in the high 50s (C) when gaming, with acceptably low noise.

Fire Strike Ultra 12,730 (14,389 graphics) with 4790K @ 4.8GHz (with H100i), DDR3 @ 2400MHz and TXPs @ +200/+350 (around 2,050MHz). For daily use I run the CPU at 4.6GHz for reduced H100i fan noise.


----------



## aliron

Quote:


> Originally Posted by *willverduzco*
> 
> Like a few others in this thread, I've occasionally gotten this same exact pattern of artifacts on my Titan XP. Both of my Gigabyte GTX 1080 Waterforce cards did the same as well at mild memory OCs while browsing YouTube through MS Edge, but never in a game.
> 
> Given how sporadic the issue is, I wouldn't worry about it.
> 
> On my TXP, I've gotten the artifacts about 3 times while using Edge (YouTube) in the last month. It's never happened in a game for me, but I don't own Quantum Break. I've never seen it in anything other than HW accelerated browsing in Edge (low performance state); and never in Heaven, Timespy, Firestrike, 3dm11, or any game that I've played (which would run in high performance states). The same goes for the 1080s that I got rid of before upgrading to the TXP.
> 
> As Jpmboy stated, I think the most likely explanation is that vid playback in 2D (p8-p5 performance states) do not raise the voltages. See if it still happens at stock memory speed (but first reboot after changing clocks to fully clear out the memory). If so, then you MAY want to RMA.


So it's a "common problem". The question is why it only happens to a few of us. Is it because of other hardware or software?
When it happens to me it's without a memory OC, but I don't think there is a great difference between 5000MHz and, for example, 5300MHz when rising from 495MHz. As you say, that could be the problem.
In the Quantum Break game the video part plays like any other video, and the card was almost idle.
I already sent the card in; I don't think it's a problem that new drivers or a BIOS can't fix, but they told me to do the RMA, so I'm doing it.
Let's see how the new one works when it arrives.


----------



## carlhil2

My GPU clocks down from 2100 to 2088 once the temp hits 33C; that's why 99% of my benches are run @2088. Hurry up, winter.....
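That 2100 to 2088 step is consistent with how GPU Boost 3.0 behaves on Pascal: the clock moves in fixed bins of roughly 13 MHz, and the card drops one bin each time the core crosses a temperature threshold. A toy model of the idea (the bin size matches commonly observed Pascal behaviour; the threshold temperatures and function are made up here purely for illustration):

```python
BIN_MHZ = 13  # approximate size of one Pascal boost bin

def boosted_clock(peak_mhz: int, temp_c: int, thresholds=(33, 43, 53, 60)) -> int:
    """Drop one boost bin per temperature threshold crossed (hypothetical thresholds)."""
    bins_dropped = sum(1 for t in thresholds if temp_c >= t)
    return peak_mhz - bins_dropped * BIN_MHZ

print(boosted_clock(2101, 30))  # 2101: below every threshold, full boost
print(boosted_clock(2101, 35))  # 2088: one bin lost once the core passes 33C
```

Which is why chasing every last degree pays off: each threshold you stay under is one bin you keep.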


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> My gpu clocks down from 2100 to 2088 once the temp hits 33c, that's why 99% of my benches are ran @2088. hurry up winter.....


When validating any of the 3DMark tests online, what does it show for the GPU clock?


----------



## Jpmboy

Quote:


> Originally Posted by *Menthol*
> 
> Jpmboy, in SLI I could only get CTRL-L to work on one card at a time, never on both cards at the same time, although it works on one card. Setting up X99 for benching again soon; will try after I get some tubing in this week


only one card here too (well, on my nephew's rig). lol - that 1080 took second place on air! http://hwbot.org/submission/3320500_
Quote:


> Originally Posted by *KillerBee33*
> 
> When Validatiing Online any of the 3DMark tests , what does it show on the GPU Clock?


I usually get the base OC, not the boost clock.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> only one card here too. (well, on mt nephew's rig). lol - that 1080 took second place on air! http://hwbot.org/submission/3320500_
> I usually get the base OC, not the boost clock.


I used to get 2126, but since this block is giving me higher temps than the hybrid kit did, I see 2114








And instead of 11088-11130 GPU score in Time Spy, I can't get over 11044 nowadays


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> When Validatiing Online any of the 3DMark tests , what does it show on the GPU Clock?


It shows 2088, but HWiNFO64 will show the boost of 2100+. I need about +215 for 2100+; +200 gets me 2088. Since that is my stable clock in the games that I play, I stick with that. My core stays cooler in games than in benches, about 2-3 Celsius less on average; it will downclock to 2075 after an hour or so. My voltage is usually set to 50%... +500 on the RAM... with that I could do 2100+ all day, but my GPU is average. I will be selling it by the end of the year anyway, if the Volta xx80 has HBM2; will get two of those and be set... one GPU just doesn't look sexy in my case. I can't afford another TXP any time soon...


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> I 2088, but, HWiNFO64, it will show the boost of 2100+. I need +2015 for 2100+, +200 gets me 2088. since that is my stable clock in the games that I play, I stick with that.my core stays cooler in games than in benches. it will downclock to 2075 after an hour or so......on average, about 2-3 Celsius less. my voltage is usually set to 50%....+500 on the ram...which that I could do 2100+ all day, but, my gpu is average..I will be selling it by the end of the year anyways, if Volta xx80 will have HBM2. will get 2 of those and be set...


Heh, I got a [email protected] setup, "stock"..... or with a 240+650 OC the GPU tops out @ 52-54







Room Temp. is around 85







Same exact MAX


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> Heh i got a [email protected] setup , "Stock" .....or 240+650 OC GPU tops out @ 52-54
> 
> 
> 
> 
> 
> 
> 
> Room Temp. is around 85
> 
> 
> 
> 
> 
> 
> 
> Same exact MAX


You have a better gpu than I have, but, OUCH, those temps...


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> You have a better gpu than I have, but, OUCH, those temps...


Ehh, I don't see a BIG step down from running 45 degrees if you look two posts back. I do have an EK block just in case, but I won't be using it unless BIOS tools are out and we finally get that power limit out of the way; then I'm sure I'll need better cooling







Will try a few more things with this block later this week.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Ehh don't see a BIG step down from running 45 degrees if you look two posts back , i do have EK block just in case , but wont be using it unless BIOS tools are out and we finally get that Power Limit out of the way, then im sure i'll need better cooling
> 
> 
> 
> 
> 
> 
> 
> Will try few more things with this Block later this week .


there are gains in clock stability to be had with just good cooling. If you keep the GPU below 40C (preferably 35C) it really helps with maintaining higher frequencies. Just my experience with 2 TXPs and 2 1080s.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> ther's gains in clock stability to be had with just good cooling. if you keep the GPU below 40C (preferably 35C) it really hewlps with maintaining higher frequencies. just my experience with 2 TXPs and 2 1080s.


Sure, I know that. But you know my dilemma








A beautiful block that I've spent so much effort getting to work, which still performs weird ("not bad"), or something simpler but with a better chance of working.


----------



## profundido

Quote:


> Originally Posted by *xarot*
> 
> I am getting some very annoying stuttering in Witcher 3. I didn't see such issue with Titan X Maxwell SLI but this happens now with my single TXP. Anyone else?
> 
> GPU is water cooled and max temps are 43c. No OC.


I believe I had that on the first run of the game. After I closed the game and had Nvidia Experience detect and "optimize" the game, all stuttering was gone immediately. If not, you might have a bottleneck somewhere in your system


----------



## eliau81

any word on a BIOS mod???
hhhhhgggg my card is so thirsty for some more juice


----------



## Artah

Quote:


> Originally Posted by *eliau81*
> 
> any word on a BIOS mod???
> hhhhhgggg my card is so thirsty for some more juice


pour some vanilla coke on it and see it smoke







or any other video cards you've had in the past! jk hehe. Did Nvidia lock something down on this card that makes it more difficult to edit the BIOS?


----------



## Jpmboy

Quote:


> Originally Posted by *Artah*
> 
> pour some vanilla coke on it and see it smoke
> 
> 
> 
> 
> 
> 
> 
> any other video cards you have had in the past! jk hehe. Did NVidia lock something down on this card that makes it more difficult to edit the bios?


yes they did.


----------



## willverduzco

Quote:


> Originally Posted by *aliron*
> 
> So is a "common problem". And the question is why only happend to a few of us. Its because other hardware or software?
> When it happens to me, is without memory OC, but i dont think there is a great difference between 5000Mhz and for example 5300Mhz when rising from 495Mhz. As you say that could be the problem.
> In the Quantum Brake game, the video part works like other video, and the card was almost idle
> I already send the the card, but i dont think there is a problem that new drivers or Bios cant fix. But they told me make the RMA, so im doing it.
> Lets see how works the new one when arrived.


Yeah... well hopefully the new card fixes it for you! Keep us informed as to whether your new card does it too!


----------



## bwana

Quote:


> Originally Posted by *carlhil2*
> 
> I don't understand how guys are getting temps in the mid 40's to mid 50's with these cards under water.....maybe too much paste/block too tight?


why would a block too tight cause high temps


----------



## Jpmboy

Quote:


> Originally Posted by *bwana*
> 
> why would a block too tight cause high temps


It can warp the PCB, decreasing proper contact with the GPU die.


----------



## carlhil2

Quote:


> Originally Posted by *bwana*
> 
> why would a block too tight cause high temps


Just an assumption of mine, nothing scientific... it just seemed to work in my situation..







When I had my block on tighter, my temps were about 42C; reseated using only my fingers, about 35C. Same amount of TIM used. Matter of fact, in Valley for example, before reseating my temps were 44C max; after, 33C. I originally used an allen wrench..


----------



## carlhil2

Quote:


> Originally Posted by *Jpmboy*
> 
> warp the pcb decreasing proper contact with the core die.


That too...


----------



## Jpmboy

Quote:


> Originally Posted by *carlhil2*
> 
> That too...


yeah - this is actually pretty common on motherboards also... especially full cover blocks. We tend to over-tighten these things


----------



## bwana

So I put the 1080 EVGA hybrid unit on a TXP. 40°C at idle with 681 mV; I am getting 63°C during the Heaven bench. Is this about normal for an AIO cooler? The backplate gets toasty, too hot to leave my hand on it for any length of time. Maybe I'll put some sinks on it.

During Heaven, max volts jump around a lot, from 1.035 up to 1.083. The power limit flag spends more time at 1 than at 0, indicating the card is being throttled due to the power limit.

Clocks are more stable though, 2025 to 2088. Max GPU clock offset in Precision 6.06 is +209 MHz on the core; the next step to +222 MHz crashes the Heaven bench.
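The "power limit = 1" flag just means board power is bumping into the configured cap. The TXP's reference board power is 250 W and the slider tops out at 120%, so the headroom check the card is effectively doing looks something like this (a sketch; the function is mine, and your monitoring tool reports board power directly):

```python
TDP_W = 250          # Titan X Pascal reference board power
MAX_LIMIT_PCT = 120  # highest power-limit slider setting

def power_capped(board_power_w: float, limit_pct: float = MAX_LIMIT_PCT) -> bool:
    """True when measured board power has reached the configured cap."""
    return board_power_w >= TDP_W * limit_pct / 100

print(power_capped(300))  # True: 300 W hits the 120% (300 W) ceiling
print(power_capped(280))  # False: roughly 20 W of headroom left
```

So with the slider maxed, anything pushing the card past about 300 W gets clocks pulled back regardless of temperature, which matches the voltage bouncing around while the clocks stay in a narrow band.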


----------



## markklok

Room temperature is slowly dropping and so is my precious 










What I do find strange is that in the beginning I couldn't get over 2038... then a couple of weeks later 2044, and now it's like 2066.
The temperature difference between those different core points is negligible...

Card temperature at idle = room temperature.
When playing games it's like room + 12-15 degrees Celsius.

My card hangs vertically and the CLU mod hasn't dripped or moved an inch, so all is good.

In the beginning I started with max voltage in AB, but now I just put the slider back and I didn't notice any change.... So I wonder if the card really gets any extra voltage at all


----------



## xarot

Quote:


> Originally Posted by *profundido*
> 
> I believe I had that first run of the game. After I closed the game and had the nvidia experience detect and "optimize" the game all stuttering was gone immediately. If not you might have a bottleneck somewhere in your system


Thanks. Actually, after this I noticed my PCIe link was at x8. I moved my Intel 750 PCIe NVMe card from one slot to another and now it is x16. The GPU is now working at x16, but it's actually not any better.

I don't think there should be a bottleneck from a hardware perspective... CPU, RAM or GPU.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> yeah - this is actually pretty common on motherboards also... especially full cover blocks. We tend to over-tighten these things


How do you find the right spot in this case?


----------



## Lobotomite430

If anyone is curious I uploaded a video of me installing a EVGA hybrid kit on my Titan X Pascal.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> How do you find the right spot in this case?


no torque specs that are useful to us (tho the LAND spec sheet does have them)... so it's just a feel. If seating a block, tighten down finger tight and allow the TIM to squeeze out slowly... then 30 min later just check the screws for any slack and tighten again.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> no torque specs that are useful to us (tho the LAND spec sheet does have them)... so it's just a feel. If seating a block, tighten down finger tight and allow the TIM to squeeze out slowly.. than 30 min later just check the screws for any slack and tighten again.


Thnx will try this weekend







What about the GPU?
BTW, decided to try with the AC on last night. The highest I've seen on the 6700 @ 4.6 @ 1.34V was 63°, and the GPU at 340+650 is 48°








Also, the CPU block bolts down with springs, so you never know how far to go.... any suggestions on that?


----------



## eliau81

Quote:


> Originally Posted by *Lobotomite430*
> 
> If anyone is curious I uploaded a video of me installing a EVGA hybrid kit on my Titan X Pascal.


great tutorial thanks








What kind of tools did you use for cutting?


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Thnx will try this weekend
> 
> 
> 
> 
> 
> 
> 
> What about GPU?
> BTW decided to try with AC on last night , highest I've seen on 6700 @ 4.6 @ 1.34V was 63* and on GPU 340+650 is 48*
> 
> 
> 
> 
> 
> 
> 
> 
> Also, CPU Block bolted with springs so you never know how far to go.... any suggestions on that?


for something like an EK uniblock I usually tighten to 1/2 the uncompressed length... but it is really guess work.

edit - for the stock spring/screws - these are tightened to the point where they cannot go further. - fully seated.


----------



## Jpmboy

Quote:


> Originally Posted by *Lobotomite430*
> 
> If anyone is curious I uploaded a video of me installing a EVGA hybrid kit on my Titan X Pascal.


are you really handling that card on a granite counter top?


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> for something like an EK uniblock I usually tighten to 1/2 the uncompressed length... but it is really guess work.
> 
> edit - for the stock spring/screws - these are tightened to the point where they cannot go further. - fully seated.


Yeap , figured as much , tried finger tight and temps were horrible on that CPU. Thnx.


----------



## fernlander

Quote:


> Originally Posted by *bwana*
> 
> so I put the 1080 evga hybrid unit on a TXP 40 deg C at idle w 681 mV.. I am getting 63deg C during the Heaven bench. Is this about normal for an AIO cooler? The back plate gets toasty-too hot to leave my hand on it for any length of time. Maybe I'll put some sinks on it.
> 
> During Heaven, max volts jump around alot - 1.035 up to 1.083. Power limit spends more time at 1 than at 0 indicating the card is being throttled due to power limit.
> 
> Clocks are more stable tho - 2025 to 2088. Max gpu clock offset on precision 6.06 is +209 mhz on core. The next step to +222 mhz crashes heaven bench.


Way too high. Shouldn't really cross 45C in heaven benchmark. Probably a mounting issue.


----------



## bizplan

Quote:


> Originally Posted by *fernlander*
> 
> Way too high. Shouldn't really cross 45C in heaven benchmark. Probably a mounting issue.


Agree way too high, my 980ti hybrid kit on my TXP does not exceed 50C on Heaven at max settings (+15 minutes). Would re-check pump, rad, TIM, mounting & fan (I have push/pull fans on the rad).


----------



## fernlander

Quote:


> Originally Posted by *bizplan*
> 
> Agree way too high, my 980ti hybrid kit on my TXP does not exceed 50C on Heaven at max settings (+15 minutes). Would re-check pump, rad, TIM, mounting & fan (I have push/pull fans on the rad).


I actually had a situation where I forgot to connect the pump. It very quickly reaches 80C and the GPU fan goes to 100% very quickly.

40C idle is too much unless you're running it in the open in the Sahara desert. I suspect mounting first. If the fans aren't enough you can feel the heat when touching the rad.

But all in all, the 980 Ti hybrid kit is pretty darn great. I have not felt any desire to build custom water myself; it's basically two loops if you have an AIO on both CPU and GPU. Though those water chillers do seem really cool. It's sad this thing throttles at 35C.


----------



## bizplan

Quote:


> Originally Posted by *fernlander*
> 
> I actually had a situation where I forgot to connect the pump. And it very quickly reaches 80C and the fan (Gpu fan) goes to 100% very quickly.
> 
> 40C idle is too much unless you're running it in the open in the Sahara desert. I suspect mounting first. If the fans aren't enough you can feel the heat when touching the rad.
> 
> But all I all the 980ti hybrid kit is pretty darn great. I have not felt any desire for building custom water myself. It's basically two loops if you have AiO on both CPU and GPU. Though those water chillers seem really cool. It's sad this thing throttles at 35C.


Right, the card at idle should run [only] a few degrees higher than ambient temps.


----------



## KillerBee33

Quote:


> Originally Posted by *bizplan*
> 
> Right, the card at idle should run [only] a few degrees higher than ambient temps.


That's not possible! I have a hybrid kit on a 980 and had one on the current Titan; 45-50 in 3DMark is absolutely normal, and, say, The Witcher 3 @ 4K might easily give you 60 to 62 degrees.
If ambient is 28, a hybrid kit will easily run you to the mid 50's in my experience


----------



## Lobotomite430

Quote:


> Originally Posted by *Jpmboy*
> 
> are you really handling that card on a granite counter top?


It's just laminate. I'll take that as a compliment though; must be good laminate! Thanks!


----------



## bwana

@Lobotomite430
That's the best video of the process I've seen. Thank you.

I changed the stock rad fan for 2 Noctua i2000s (the new 'industrial' 12 cm fans) and plugged them into a motherboard header. They run at 1000 rpm and are quiet at that speed.



Regarding the mounting, I left the stock Titan baseplate on; never took it off. I don't have the tools to grind, but I guess I should get a Dremel. I tried adding heatsinks and a fan to cool the card itself; it did not make a lick of difference. I even did a ghetto mod of just putting the EVGA shroud over the card: you can gently squeeze it over the TXP baseplate and it goes down to within 1/2 in of seating all the way. I taped the gap closed (~1/2 in). I backed off the mounting screws 1/2 turn and temps went up 2 degrees. Now it's 39 deg idle, 65 deg during Heaven. I can get some thermal paste and remount, but I'm leery of all those little connections coming out of the GP102. What is the best electrically nonconductive paste to use?


----------



## Lobotomite430

Quote:


> Originally Posted by *eliau81*
> 
> great tutorial thanks
> 
> 
> 
> 
> 
> 
> 
> 
> what kind of tools did you used for cutting ?


The snips linked below were what I used on the power connector spot; on the VRM spot I used a hacksaw. A hacksaw is what I should have used from the start; that went through no problem. I recently got a Dremel tool, which would have made life even easier and cleaner, though it's pretty clean as is because I took a file to smooth out any rough edges.
http://www.homedepot.com/p/Milwaukee-Straight-Cutting-Aviation-Snips-48-22-4030/202950858


----------



## bizplan

Quote:


> Originally Posted by *KillerBee33*
> 
> Thats not possible ! Have Hybrid Kit on a 980 and had one on Current Titan , 45-50 in 3DMark is absolutely normal and let's say The Witcher 3 @ 4K might give you a 60 to 62 degrees easy.
> Lets say ambient is 28 a Hybrid kit will run you easy to mid 50's in my experiences


I said that at idle [a properly operating EVGA hybrid kit] should keep core temps a few degrees over ambient; I agree with your numbers when the card is under load (albeit anything over 55C is high, IMO and in my experience).


----------



## KillerBee33

Quote:


> Originally Posted by *bizplan*
> 
> I said at idle [an EVGA hybrid kit] should keep core temp a few degrees over ambient, agree with your numbers when TXP under load..


Idle shouldn't matter much at this point; you're not buying a Titan to run it at IDLE

but yes, that's usually the case: if room temp is around 80, the Titan idles at 28-32


----------



## Lobotomite430

Quote:


> Originally Posted by *KillerBee33*
> 
> Idle shouldn't matter much at this point; you're not buying a Titan to run it at IDLE
> 
> but yes, that's usually the case: if room temp is around 80, the Titan idles at 28-32


That is what my Titan idles at with the EVGA kit. It will rarely go over 50c unless the room temp goes up with it.


----------



## KillerBee33

Quote:


> Originally Posted by *Lobotomite430*
> 
> That is what my Titan idles at with the EVGA kit. It will rarely go over 50c unless the room temp goes up with it.


Ehh, I turned my AC off for the past 2 weeks; can't stand it anymore


----------



## bizplan

Quote:


> Originally Posted by *KillerBee33*
> 
> Ehh, I turned my AC off for the past 2 weeks; can't stand it anymore


Can't do that in Dallas, it got up to 96F yesterday!


----------



## KillerBee33

Quote:


> Originally Posted by *bizplan*
> 
> Can't do that in Dallas, it got up to 96F yesterday!


Hehe, NY isn't in the 90's anymore but pollution makes it feel like 120

I'm in the office 13 hours a day with the AC blasting and I'm allowed to smoke here; who cares about AC, I just need AIR


----------



## fernlander

I left the stock baseplate on. The baseplate seemed like plastic, so I'm wondering what the point of all that thermal tape is.

The backplate seems like metal but I'm not sure if it's in contact with anything that needs to lose heat. Would sticking a heat sink or two on it help with temps?


----------



## Lobotomite430

Quote:


> Originally Posted by *fernlander*
> 
> I left the stock baseplate on. The baseplate seemed like plastic, so I'm wondering what the point of all that thermal tape is.
> 
> The backplate seems like metal but I'm not sure if it's in contact with anything that needs to lose heat. Would sticking a heat sink or two on it help with temps?


Probably not a whole lot. Heat will still come off the card regardless of the cooling solution. Cooler ambient temps help, since the case can draw cool air over the card and exhaust the hot air, but even then it's only good to a point. Full waterblocks seem to give the best results. The thermal tape acts as a cushion and, more importantly, transfers heat from one surface to the other.


----------



## fernlander

Quote:


> Originally Posted by *Lobotomite430*
> 
> Probably not a whole lot. Heat will still come off the card regardless of the cooling solution. Cooler ambient temps help, since the case can draw cool air over the card and exhaust the hot air, but even then it's only good to a point. Full waterblocks seem to give the best results. The thermal tape acts as a cushion and, more importantly, transfers heat from one surface to the other.


A full water block seems ideal. I still keep the fan at 45% to try to keep the VRMs cool. I run without the shroud because I didn't modify the baseplate. But since I use a 380t case the air can escape directly to the outside even without the shroud.

It's still a bit more fan noise than I'd like.


----------



## Jpmboy

Quote:


> Originally Posted by *fernlander*
> 
> I left the stock baseplate on. The baseplate seemed like plastic, so I'm wondering what the point of all that thermal tape is.
> 
> The backplate seems like metal but I'm not sure if it's in contact with anything that needs to lose heat. Would sticking a heat sink or two on it help with temps?


just so you guys don't have to read through the "event"... the backplate is metal; the plastic film on the inside is there to avoid shorting across the many exposed ICs and, more importantly, to protect the very fragile ICs on the backside of the PCB. We've already seen 2 (or more) folks with borked cards due to knocking off or knocking loose components on the back of the card. Yeah, tragic.
The TXP OEM backplate is not there as a heat sink.


----------



## fernlander

Quote:


> Originally Posted by *Jpmboy*
> 
> just so you guys don't have to read through the "event"... the backplate is metal; the plastic film on the inside is there to avoid shorting across the many exposed ICs and, more importantly, to protect the very fragile ICs on the backside of the PCB. We've already seen 2 (or more) folks with borked cards due to knocking off or knocking loose components on the back of the card. Yeah, tragic.
> The TXP OEM backplate is not there as a heat sink.


Wow ok. So its purpose is just protection. My question was more about the baseplate that's inside. It too feels like plastic yet has thermal tape all over it.


----------



## Jpmboy

Quote:


> Originally Posted by *fernlander*
> 
> Wow ok. So its purpose is just protection. My question was more about the baseplate that's inside. It too feels like plastic yet has thermal tape all over it.


base plate is metal. the white (fibrous) pads are thermal/electrical insulation.


----------



## Jpmboy

Quote:


> Originally Posted by *Menthol*
> 
> Jpmboy, in sli, I could only get CRTL-L to work on one card at a time, never on both cards at the same time, although it works on one card, setting up X-99 for benching again soon, will try after I get soon tubing in this week


hey menthol - to get multiple cards locked in P0 you need to set one, then open AB settings, change the card listed under Master Graphics Adapter, highlight the point in the graph again, hit Ctrl-L, and apply. Now both cards are in the P0 state and you can just adjust the offset as usual!


----------



## kx11

Forza Horizon 3 @ 4K ultra settings runs at 60fps and drops to 50 when I start recording


----------



## bizplan

Quote:


> Originally Posted by *fernlander*
> 
> A full water block seems ideal. I still keep the fan at 45% to try to keep the VRMs cool. I run without the shroud because I didn't modify the baseplate. But since I use a 380t case the air can escape directly to the outside even without the shroud.
> 
> It's still a bit more fan noise than I'd like.


Has anyone heard of someone's VRM ever going bad because of heat? Or that they got a bad or unstable OC because the VRMs got too hot?


----------



## fernlander

Quote:


> Originally Posted by *bizplan*
> 
> Has anyone heard of someone's VRM ever going bad because of heat? Or that they got a bad or unstable OC because the VRMs got too hot?


I haven't but I don't take chances. If you leave the fan on auto it will go by GPU temps which will be low for a hybrid cooler. So I keep it around 45-50% manually to protect the VRMs. It keeps some decent airflow over them.


----------



## eliau81

Quote:


> Originally Posted by *Lobotomite430*
> 
> It's just laminate. I will take that as a compliment though, must be good laminate! Thanks!


it's not a granite countertop ?!?!

damn... I also fell for it


----------



## carlhil2

A little chilly this morning, and my score went up a bit at the same clocks.. http://www.3dmark.com/spy/485610 highest score in the HOF pushing a 5960x.. should last a day anyway..


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*


All the ingredients are here to try and F*** up that KOMODO block even more


----------



## carlhil2

Is the new driver any good? just found out about it, but, am at work...


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> Is the new driver any good? just found out about it, but, am at work...


Absolutely no changes here from the .70


----------



## piee

Got the EK block and an XE360 (push): 38-41C. At +203 / max P.L. it's 2088 initially (31-36C) and settles to 2076 (37-41C) in BF4, tearing it up at 70-90 FPS with 4K DSR. Trying to figure out how to add my 240 rad or 120 rad, or whether to just sell them; I don't think they'd make much difference with the XE360 (awesomeness). Modded a Corsair 450D to fit it in the front.


----------



## carlhil2

I did a Firestrike run this morning and didn't break 28c, true story..

it was the first thing that I ran on boot though..

didn't help my score much though; there's only so much you can get out of +200... 2088 is my max stable unless I don't OC my RAM much... FSU and Extreme are a different story though, weird... might be my 4k monitor...


----------



## Menthol

Quote:


> Originally Posted by *Jpmboy*
> 
> hey menthol - to get multiple cards locked in P0 you need to set one, then open AB settings, change the card listed under Master Graphics Adapter, highlight the point in the graph again and cntrl-L, apply. Now both cards are in P0 state and then you can just adj the offset as usual!


Thanks, that's what I tried before and the first card wouldn't hold when I changed the master card. I must have done something wrong, or maybe the new AB beta fixed this; will check this weekend. Going to have one of my grandsons to entertain today - it's great having kids that leave at the end of the day


----------



## markklok

So I'm still wondering if I over-tightened my EK block....

idle temp = 24 degrees
when I start an Ultra 3DMark session.. it will immediately rise to 34 (like in 2 seconds).
Then it can flip between 34 and 27 depending on the load... and the max temp will slowly rise to around 36

room temp = 24-25 degrees

What do you guys think.. just leave it for what it is, or re-paste?
(paste used = Grizzly)


----------



## carlhil2

Quote:


> Originally Posted by *markklok*
> 
> So I'm still wondering if I over-tightened my EK block....
> 
> idle temp = 24 degrees
> when I start an Ultra 3DMark session.. it will immediately rise to 34 (like in 2 seconds).
> Then it can flip between 34 and 27 depending on the load... and the max temp will slowly rise to around 36
> 
> room temp = 24-25 degrees
> 
> What do you guys think.. just leave it for what it is, or re-paste?
> (paste used = Grizzly)


Those temps seem about right...


----------



## Lobotomite430

Quote:


> Originally Posted by *markklok*
> 
> So I'm still wondering if I over-tightened my EK block....
> 
> idle temp = 24 degrees
> when I start an Ultra 3DMark session.. it will immediately rise to 34 (like in 2 seconds).
> Then it can flip between 34 and 27 depending on the load... and the max temp will slowly rise to around 36
> 
> room temp = 24-25 degrees
> 
> What do you guys think.. just leave it for what it is, or re-paste?
> (paste used = Grizzly)


Those seem really good to me, my EVGA hybrid is 26-30 idle and 45-50 under load.


----------



## dureiken

Hi,

I just received my Titan X Pascal today, with EK WB and BP !!!

I did a +220/+600 stable OC; my max core clock tops out at +240 (2063MHz). I would like to know how people get 2120MHz stable on Firestrike Ultra. Is there a tweak, or is it just luck with the GPU?

Is there any hard mod or bios mod atm ?

Thanks a lot


----------



## willverduzco

Quote:


> Originally Posted by *bwana*
> 
> so I put the 1080 evga hybrid unit on a TXP 40 deg C at idle w 681 mV.. I am getting 63deg C during the Heaven bench. Is this about normal for an AIO cooler? The back plate gets toasty-too hot to leave my hand on it for any length of time. Maybe I'll put some sinks on it.
> 
> During Heaven, max volts jump around alot - 1.035 up to 1.083. Power limit spends more time at 1 than at 0 indicating the card is being throttled due to power limit.
> 
> Clocks are more stable tho - 2025 to 2088. Max gpu clock offset on precision 6.06 is +209 mhz on core. The next step to +222 mhz crashes heaven bench.


Quote:


> Originally Posted by *dureiken*
> 
> Hi,
> 
> I just received my Titan X Pascal today, with EK WB and BP !!!
> 
> I did a +220/+600 stable OC; my max core clock tops out at +240 (2063MHz). I would like to know how people get 2120MHz stable on Firestrike Ultra. Is there a tweak, or is it just luck with the GPU?
> 
> Is there any hard mod or bios mod atm ?
> 
> Thanks a lot


That's quite strange that you're only hitting 2063 at +240. I hit around 2075 at +175 on my card and leave it at 2126 MHz (+225) for the most part. Other than the power limit drops to ~2100 when in Heaven and Firestrike, the clocks are pretty solid for me, even without the CLU mod. There is unfortunately no way to flash an unsigned BIOS at the moment, and nobody has hexedited a TXP bios to increase the power limits because there'd be no way to flash it anyway. Thus if you're running into power limits, you may need to shunt mod it.

Are your temps sub optimal? I have peak load temps of about 42 C when running Heaven (extreme preset, but at 4k res) on my EVGA hybrid with 2x Gentle Typhoons in push/pull (didn't want to drain/refill/bleed my admittedly overkill CPU loop), and that really helps keep your clocks high. I think the first clock reduction step is somewhere in the 30s, and each 10 degrees or so seem to drop it another ~13 MHz step. When I was on air and hitting 80+ C at 100% fan speed, the same +225 offset would only get me to the low 2000s.
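The Boost throttling described above can be sketched as a toy model. To be clear, the thresholds below are guesses inferred from observed behavior in this thread (first step somewhere in the 30s C, roughly one ~13 MHz bin per further ~10 C), not published NVIDIA numbers:

```python
def boost_clock(base_boost_mhz, temp_c, first_step_c=37, step_c=10, bin_mhz=13):
    """Very rough model of GPU Boost 3.0 thermal behavior: lose one
    ~13 MHz bin at first_step_c, plus one for every further step_c.
    All thresholds are assumptions, not official values."""
    if temp_c < first_step_c:
        return base_boost_mhz
    bins_lost = 1 + (temp_c - first_step_c) // step_c
    return base_boost_mhz - bins_lost * bin_mhz

print(boost_clock(2126, 30))  # cool water temps: full boost held
print(boost_clock(2126, 80))  # hot air-cooled temps: several bins lost
```

Which is consistent with the same +225 offset landing in the low 2000s on air but holding ~2126 on water.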


----------



## MrTOOSHORT

Quote:


> Originally Posted by *willverduzco*
> 
> That's quite strange that you're only hitting 2063 at +240. I hit around 2075 at +175 on my card and leave it at 2126 MHz (+225) for the most part. Other than the power limit drops to ~2100 when in Heaven and Firestrike, the clocks are pretty solid for me, even without the CLU mod. There is unfortunately no way to flash an unsigned BIOS at the moment, and nobody has hexedited a TXP bios to increase the power limits because there'd be no way to flash it anyway. Thus if you're running into power limits, you may need to shunt mod it.


It's not strange at all, your card overclocks better than the other guys.

That's why it's better to post actual core clocks over offset.


----------



## paxw

so it looks like an EVGA Hybrid cooler is coming eventually


__ https://twitter.com/i/web/status/779006683708137472


----------



## Lobotomite430

Quote:


> Originally Posted by *paxw*
> 
> so it looks like an Evga Hybrid cool is coming eventually
> 
> 
> __ https://twitter.com/i/web/status/779006683708137472


He said the same thing on their forums over a month ago. I still don't believe it will happen; the change they'd need to make to the 1080 kit to fit a Titan is very minimal. If he's saying Nvidia gave EVGA the go-ahead to produce a Titan Hybrid - not a kit, but the real deal card with the cooler installed - that would be a different story.


----------



## Lobotomite430

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> It's not strange at all, your card overclocks better than the other guys.
> 
> That's why it's better to post actual core clocks over offset.


So how does that actually work then? This is the first GPU I've had any interest in overclocking, and I get 2088MHz at +200 but the other guy gets 2063MHz at +240? It's been a long week and it's the tail end of a Friday, so pardon my brain for not being able to make sense of it.


----------



## opt33

Quote:


> Originally Posted by *Lobotomite430*
> 
> So how does that actually work then? This is my first GPU I have had any interest in overclocking and I get 2088mhz on +200 but the other guy gets 2063mhz at +240? Its been a long week and its the tail end of the day on a friday so pardon my brain for not being able to make sense of it.


Stock boost speed varies, ie on one may be 1800, another 1850, etc. So +200 on one may be 2000, another 2050.
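In other words, the offset just rides on top of whatever stock boost the silicon lottery gave you. A trivial sketch (the stock boost figures here are made up for illustration):

```python
def effective_clock(stock_boost_mhz, offset_mhz):
    """An offset OC shifts the whole boost table, so the resulting
    clock is simply the card's stock boost plus the offset."""
    return stock_boost_mhz + offset_mhz

# Two hypothetical cards: a better bin needs less offset for more clock.
card_a = effective_clock(1888, 200)  # -> 2088 MHz at +200
card_b = effective_clock(1823, 240)  # -> 2063 MHz at +240
```

Hence MrTOOSHORT's point: comparing actual core clocks is meaningful, comparing offsets isn't.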


----------



## dureiken

Thanks for your answer,

where do you see the clocks? In MSI AB?

I have 480+360+240 radiators for the CPU and GPU, and they're at maximum for benchmarks. The voltage limit flag is at 1 constantly; the power limit flag peaks to 1 during Heaven

Thanks


----------



## cg4200

Quote:


> Originally Posted by *dureiken*
> 
> Thanks for your answer,
> 
> where do you see clocks ? in MSI AB ?
> 
> I have 480+360+240 radiators, for CPU and GPU and they are at maximum for benchmark. I have voltage limit at 1 constant, power limit at 1 by peaks during heaven
> 
> Thanks


I'm sure you know to use the 4.3.0 beta 14 Afterburner and not the regular download. The clock is at the top left; the card usually downclocks with no load, so open GPU-Z, start the render, then look at the clock. Also make sure to turn your power limit up to the 120% max. In Firestrike at +200 core I get 2100, then it downclocks; I have the CLU mod so it stays around 2088. Gaming I can go +210 core and get 2126 or so.. luck of the draw. +240 seems like a lot to get to your core clock


----------



## dureiken

I have MSI AB 4.3.0 beta4


----------



## Jpmboy

Quote:


> Originally Posted by *carlhil2*
> 
> A little chilly this morning, and my score went up a bit at the same clocks.. http://www.3dmark.com/spy/485610 highest score in the HOF pushing a 5960x.. should last a day anyway..


nice gfx score!








Quote:


> Originally Posted by *KillerBee33*
> 
> All the ingredients are here to try and F***up that KOMODO block even more


post up with how it goes... remember, you're fitting a "custom" shoe... do some test fits and use a cheap TIM to ensure you are getting proper contact between the block and the hot parts.
Quote:


> Originally Posted by *Menthol*
> 
> Thanks, that's what I tried before and the first card wouldn't hold when I changed the master card. I must have done something wrong, or maybe the new AB beta fixed this; will check this weekend. Going to have one of my grandsons to entertain today - it's great having kids that leave at the end of the day


lol - aren't those the best kids.









hey - the dual card P0 thing works for sure. I haven't benched it yet, but the clocks do stay up - gotta see if Pascal likes that.


----------



## Jpmboy

the TXP is an incredibly "productive" folding unit... join up and SHOW 'EM WHAT THE TXP CAN DO (FYI - 2 cards do >2.5M PPD).









http://www.overclock.net/t/1611567/september-2016-foldathon-monday-26th-28th-12-noon-est-4pm-utc/0_20


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> post up with how it goes... remember, you're fitting a "custom" shoe... do some test fits and use a cheap TIM to ensure you are getting proper contact between the block and the hot parts.


Will do. Problem is, the chip on the Titan seems to sit a bit higher off the PCB than on the 1080; I should've gotten 1.0 thermal pads instead of 0.5 because I'm bending the PCB from the sides


----------



## dureiken

Hi again,

do you use +100mV in AB for your OC or not? And do you force constant voltage?

Thanks


----------



## willverduzco

Quote:


> Originally Posted by *dureiken*
> 
> Hi again,
> 
> do you use +100mV in AB or not for your OC ? and force constant voltage ?
> 
> Thanks


First off, the setting you're talking about is +100%, not +100 mV. All it does is allow the card to use the entire voltage range (up to 1.093 V, from 1.05 V at default). In other words, the "+100%" setting is actually only +43 mV.

Anyway getting to your question, I've tried both with +100% voltage and without on my Hybrid mod card that is kept at 42C peak temp at full load. I have not done the CLU mod because I'm waiting on BIOS editing. In my case, upping the max voltage helps hit higher peak clocks at first, but I then just run into the power limit and get very variable clocks. At default voltage settings, I get a stable 2126 MHz with +225 MHz offset, and only very occasionally does it go down a step or two when looping benchmarks. If I raise that voltage setting to +100%, I'll hit about two or three frequency steps higher (each is 12.5 MHz) for a brief moment, only to hit the power limiter and get very inconsistent clocks.

If you've done the CLU shunt mod, you may want to play with this setting. But if you haven't done the shunt mod, it's probably not a good idea to raise the voltage cap. You'll end up with inconsistent clocks, which lead to worse frame rate consistency and a worse overall gaming experience.

If you already did the CLU shunt mod and raise voltage, just make sure your VRMs are kept nice and cool (to prevent them from derating). Don't forget that you're ... The VRM high side is rated for a total of 176A continuous / 704A pulsed. That is calculated from 8 MOSFETs, each rated for 22A continuous / 88A pulsed (PDF datasheet). This isn't bad if you keep your power draw under around 150% total (120% in software + shunt mod), but if you do the shunt mod on all 3 of the 5 milliohm shunt resistors, run the max 120% power limit in software, run the voltage at the full 1.093 V, run something very intense like Furmark, AND don't properly cool the VRM MOSFETs to prevent them from derating, you may end up with a very expensive paperweight.
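For anyone sanity-checking the numbers above, a quick sketch (voltage caps taken straight from the post; the MOSFET ratings are from the datasheet figures quoted there):

```python
# Voltage caps per the post: default vs the "+100%" slider fully unlocked.
DEFAULT_VMAX = 1.050   # V
UNLOCKED_VMAX = 1.093  # V

# The "+100%" slider therefore only adds this many millivolts.
extra_mv = round((UNLOCKED_VMAX - DEFAULT_VMAX) * 1000)
print(extra_mv)  # 43

# High-side VRM rating: 8 MOSFETs, each 22 A continuous / 88 A pulsed.
mosfets = 8
continuous_a = mosfets * 22  # total continuous rating
pulsed_a = mosfets * 88      # total pulsed rating
print(continuous_a, pulsed_a)  # 176 704
```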


----------



## bizplan

Quote:


> Originally Posted by *willverduzco*
> 
> First off, the setting you're talking about is +100%, not +100 mV. All it does is allow the card to use the entire voltage range (up to 1.093 V, from 1.05 V at default). In other words, the "+100%" setting is actually only +43 mV.
> 
> Anyway, getting to your question: I've tried both with +100% voltage and without on my Hybrid mod card that is kept at 42C peak temp at full load. I have not done the CLU mod because I'm waiting on BIOS editing. In my case, upping the max voltage helps hit higher peak clocks at first, but I then just run into the power limit and get very variable clocks. At default voltage settings, I get a stable 2126 MHz with +225 MHz offset, and only very occasionally does it go down a step or two when looping benchmarks. If I raise that voltage setting to +100%, I'll hit about two or three frequency steps higher (each is 12.5 MHz) for a brief moment, only to hit the power limiter and get very inconsistent clocks.
> 
> If you've done the CLU shunt mod, you may want to play with this setting. But if you haven't done the shunt mod, it's probably not a good idea to raise the voltage cap. You'll end up with inconsistent clocks, which lead to worse frame rate consistency and a worse overall gaming experience.
> 
> If you already did the CLU shunt mod and raise voltage, just make sure your VRMs are kept nice and cool (to prevent them from derating). Don't forget that you're ... The VRM high side is rated for a total of 176A continuous / 704A pulsed. That is calculated from 8 MOSFETs, each rated for 22A continuous / 88A pulsed (PDF datasheet). This isn't bad if you keep your power draw under around 150% total (120% in software + shunt mod), but if you do the shunt mod on all 3 of the 5 milliohm shunt resistors, run the max 120% power limit in software, run the voltage at the full 1.093 V, run something very intense like Furmark, AND don't properly cool the VRM MOSFETs to prevent them from derating, you may end up with a very expensive paperweight.


That's some good stuff, Willver. One would think there'd be some logic in the VRM circuitry that would down-clock the GPU to prevent the MOSFETs from overheating or excessively derating. I have not done the CLU mod, but have run at 100%, 120%, 90C, +195/+595 (1999-2088, 11,200, 1.093), stock fan curve (~1500 RPM at 54C max core temp) using the EVGA hybrid kit, for hours on end (i.e. playing Doom at 4k/60 on an Acer XB321K w/G-SYNC (~120FPS)), 6700K at 4.7, with no degradation in GPU performance.. these TXP cards seem to hold up very well!


----------



## willverduzco

Quote:


> Originally Posted by *bizplan*
> 
> That's some good stuff Willver, one would think there is some logic in the VRM circuitry that would down-clock the GPU to prevent the MOSFETs from overheating or excessively de-rating. I have not done the CLU mod but have run at 100%, 120%, 90C,+195/+595 (1999-2088, 11,200, 1.093), stock fan curve (~1500 RPM at 54C max core temp) using EVGA hybrid kit, for hours on end (i.e. playing Doom at 4k/60, Acer XB321K w/G-SYNC (~120FPS)), 6700K at 4.7, with no degradation in GPU performance.. these TXP cards seem to hold up very well!


You would hope! But unfortunately, the VRMs on GPUs don't have temp sensors built in. This means they're going to work until they die, at which point you'll be passing 12V directly to the core.







I guess theoretically a board partner could design a PCB with a temp diode right around the MOSFETs (like basically all mobos have around their VRMs) and have a limiter kick in if it gets too hot, but unfortunately I haven't seen any GPU that does this. Alternatively, since VRM efficiency goes down as temp goes up, you could somehow measure input and output current levels and throttle if efficiency goes under a set threshold.

Luckily looking at the data sheet for these VRMs, you can see that despite not having a lot of current capability, they don't start derating until very high temperatures. Up to about 130C, they're capable of outputting the same current, with a very sharp drop off beyond that. That's actually quite remarkable now that I think about it, as you're not going to exceed that unless you're drawing current significantly beyond spec (hardware volt mod + very high clocks with exotic cooling). Also, I guess the typical use-case of a high-side is more similar to pulsed output, so hopefully the voltage controller is smart in how it controls the MOSFETs.

As for performance consistency, I guess I overstated the potential downsides to the horribly named "+100% voltage" option a little. In your own example, you mention a 1999-2088 range, so even at its worst, you're only 4.2% down when compared to your highest clocks. I guess when you're running a TXP and getting 120 FPS, you'll never notice the difference between 120 and 128 or so.







Some part of me just inherently prefers more stable clocks, even if they're a tad lower, just so that I can keep my minimums higher. And at least on my card, my minimums do go down further if I set it to the full range (~2063-2138 range at 100% vs ~2100-2126 range at 0%).
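For reference, the worst-case drop in the example above works out to a bit over 4% below peak:

```python
# Clock range quoted above (MHz): worst case vs best case observed.
low, high = 1999, 2088
pct_down = (high - low) / high * 100
print(pct_down)  # a bit over 4% below the highest clocks
```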


----------



## fewness

finally followed these instructions (http://forums.guru3d.com/showthread.php?t=409468) to get 3-way SLI working









pretty good GPU utilization








and very nice scaling from 1 to 2 to 3 cards


----------



## KillerBee33

Weird question .... Does anyone know where to get Factory Pascal Thermal Pads?


----------



## carlhil2

Today is a sad Day. RIP Jose Fernandez.....


----------



## Silent Scone

Quote:


> Originally Posted by *carlhil2*
> 
> Today is a sad Day. RIP Jose Fernandez.....


Did he have a TX?


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Weird question .... Does anyone know where to get Factory Pascal Thermal Pads?


aren't they on the stock air cooler you pulled?

__________________________

if anyone is interested: http://www.overclock.net/t/1611567/september-2016-foldathon-monday-26th-28th-12-noon-est-4pm-utc/0_20


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> aren't they on the stock air cooler you pulled?


Removed them when taking it apart and I don't have 'em.
Currently on nVidia chat for the RMA. Still can't figure out WTH happened








Will have to slap the thermal pads from the EK block on to ship it out.
Wrong timing too; I'd finally set the CPU block nicely...


Spoiler: Warning: Spoiler!


----------



## Silent Scone

Quote:


> Originally Posted by *KillerBee33*
> 
> Removed when taking it apart and don't have'em .
> Currently with nVidia Chat for RMA. Still can't figure out WTH happened
> 
> 
> 
> 
> 
> 
> 
> 
> Will have to slap TP from EK Block to ship it out .
> Wrong timing too, finally set CPU block nicely ...


What _did_ happen? Is the card dead?


----------



## carlhil2

Quote:


> Originally Posted by *Silent Scone*
> 
> Did he have a TX?


Lol, not sure, but I wanted to express my feelings and this is the only thread that is always open in my browser when I open it. Sorry for the OT post though....


----------



## KillerBee33

Quote:


> Originally Posted by *Silent Scone*
> 
> What _did_ happen? Is the card dead?


Uhummm









Spoiler: Warning: Spoiler!


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> Uhummm
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


WT....


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> WT....


Do that for about 20 hours and you'll be where I am

Don't friggin' know.
Even if the RMA works out, I might get some crappy clocker; mine was sublime

I have a slight suspicion about what happened: the KOMODO block has two LED power cables hanging from the back; one is in use, and the second probably got in between there and shorted it. What's weird is that it's not the closest VRM, it's the second from the end


----------



## Zurv

The Nvidia RMA process is a pain in the butt. I just RMA'd one of my cards because it wanted to run at x8 and was also unstable. I had to talk to someone on IM chat for like an hour, send a billion pix, then wait days for a response. Even if Nvidia wanted to be the only one making this model, they could have at least used Amazon or something; I could have had a return in a day.


----------



## KillerBee33

Quote:


> Originally Posted by *Zurv*
> 
> The Nvidia RMA process is a pain in the butt. I just RMA'd one of my cards because it just wanted to run at x8 and was also unstable. I had to talk to someone on IM chat for like an hour. Send a billion pix. then wait days for a response. Even if Nvidia wanted to be the only one making this model, they could have at least use amazon or something. I could have had a return in a day.


I was only on chat for about 20 minutes; I copied and pasted all the info from the invoice and had to take a picture of the box and a screenshot of the order confirmation. Hopefully having EK thermal pads instead of the factory ones won't be an issue. As for the actual RMA email, not sure how long that'll take.


----------



## eliau81

Quote:


> Originally Posted by *Zurv*
> 
> The Nvidia RMA process is a pain in the butt. I just RMA'd one of my cards because it just wanted to run at x8 and was also unstable. I had to talk to someone on IM chat for like an hour. Send a billion pix. then wait days for a response. Even if Nvidia wanted to be the only one making this model, they could have at least use amazon or something. I could have had a return in a day.


For me the RMA went very quickly; it took only a few days to get approval even though I broke the card.
Quote:


> Originally Posted by *KillerBee33*
> 
> Do that for about 20 hours and you'll be where I am
> 
> Don't friggin' know.
> Even if the RMA works out, I might get some crappy clocker; mine was sublime
> 
> I have a slight suspicion about what happened: the KOMODO block has two LED power cables hanging from the back; one is in use, and the second probably got in between there and shorted it. What's weird is that it's not the closest VRM, it's the second from the end


what do you mean, crappy clocker? They're sending a new one...


----------



## KillerBee33

Quote:


> Originally Posted by *eliau81*
> 
> what do you mean crappy clocker? they sending a new one...


Ever run TimeSpy with the absolute highest clock on your Titan? Every GPU is different: some clock higher and better, some worse. Some can go up to 700-750 on the memory and 2151 on the core, and there are a few which can't go over 2088 and 500 on the memory.
In my case the Titan was close to one of the best out there


----------



## eliau81

Quote:


> Originally Posted by *KillerBee33*
> 
> Ever run TimeSpy with absolute highest clock on your titan? Every GPU is different , some clock Higher and better , some worse , some can go up to 700-750 on the memory and 2151 on the core and there are few which cant go over 2088 and 500 on the memory.
> In my case Titan was close to one of the best out there


So they won't send a bad one on purpose; it depends on availability.
You do know that they can also send a refurbished one...


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> Ever run TimeSpy with absolute highest clock on your titan? Every GPU is different , some clock Higher and better , some worse , some can go up to 700-750 on the memory and 2151 on the core and there are few which cant go over 2088 and 500 on the memory.
> In my case Titan was close to one of the best out there


Yeah, you have an above-average card; mine is average. It might be better, but since I only have a uniblock on it and the rest of the PCB is bare, I don't even try to push it too hard. I am satisfied though...


----------



## KillerBee33

Quote:


> Originally Posted by *eliau81*
> 
> so they won't send a bad one on purpose, it depends on availability.
> you do know that they also can send a refurbished one...


It's not an exact science; refurbished or fresh off the factory line, every GPU is just different. Ehh, I'll deal with what I get.


----------



## eliau81

Quote:


> Originally Posted by *KillerBee33*
> 
> It's not exact science so refurbished or off the factory every GPU is just different. Ehh i'll deal with what i get


Did you get approval? If so you should be happy; replacing the stock cooler voids the warranty.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> *Removed when taking it apart and don't have'em .*
> Currently with nVidia Chat for RMA. Still can't figure out WTH happened
> 
> 
> 
> 
> 
> 
> 
> 
> Will have to slap TP from EK Block to ship it out .
> Wrong timing too, finally set CPU block nicely ...
> 
> 
> Spoiler: Warning: Spoiler!











I bet that won't happen again.
Quote:


> Originally Posted by *KillerBee33*
> 
> *Do that for about 20 hours and you'll be where i am*
> 
> 
> 
> 
> 
> 
> 
> Don'f friggin know.
> Even if RMA works out , i might get some crappy clocker , mine was sublime
> 
> 
> 
> 
> 
> 
> 
> 
> Might have a slight assumption on what happened , KOMODO block has two LED power cables hanging from the back , one is in use and the second probably got in between there and shortened it. Whats weird that it's not the first closest vrm , it's second from the end











Quote:


> Originally Posted by *carlhil2*
> 
> Yeah, you have an above average card, mine is average. it might be better, but, since I only have a uniblock on it, and, the rest of the pcb is bare, I don't even try to push it too hard. I am satisfied though...


With the uniblock, all ya really need is a decent "breeze" over the exposed RAM, VRMs, and chokes.


----------



## KillerBee33

Looking forward to BURNING another one if the RMA works out, because I'm slapping that KOMODO block back ON






















This is my assumption







forgot to hold it in place when installing


Spoiler: Warning: Spoiler!


----------



## KillerBee33

Quote:


> Originally Posted by *eliau81*
> 
> did you got approval ? if so you should be happy , replacing the stocke cooler void warranty .


No questions on HOW and WHERE. They just said the RMA was submitted, and I got an email saying the RMA shipping label will be sent within 24 hours.


----------



## carlhil2

Quote:


> Originally Posted by *Jpmboy*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I bet that won't happen again.
> 
> 
> 
> 
> 
> 
> 
> 
> with the uniblock, all ya really need is a decent "breeze" over the exposed ram, vrms and chokes.


Oh, no doubt. My airflow is allll set...


----------



## bizplan

Quote:


> Originally Posted by *KillerBee33*
> 
> No questions on HOW and Where, Just said RMA sent and got email about RMA sticker will be sent in 24 hours


Back in July I was moving one of the temp sensors from my fan controller while the computer was on; the tip of the sensor must have brushed up against something on my mobo and shorted out my system. I ended up having to replace my CPU, RAM, mobo, PSU, and fan controller. About a $700 mistake.


----------



## KillerBee33

Quote:


> Originally Posted by *bizplan*
> 
> Back in July I was moving one of the temp sensors from my fan controller while the computer was on, tip of sensor must have brushed up against something on my mobo, shorted out my system and ended up having to replace my CPU, RAM, mobo, PSU, and fan controller. About a $700 mistake.


Heh, at first I didn't know what got burned, and I had just put this thing together. Then I unplugged the GPU, and I'm so friggin' glad the 6700 has integrated graphics; I honestly thought everything had burned down. Almost reached for my medicine cabinet.







Flushed and refilled the whole setup twice yesterday.









Spoiler: Warning: Spoiler!






This could've been a lot more expensive than a $1,300 mistake.


----------



## mbze430

Nvidia must be having a great weekend; one of my TXPs is dying... it's artifacting and showing big ole squares without even an OC. Hopefully I get an RMA # by Monday.


----------



## axiumone

Quote:


> Originally Posted by *mbze430*
> 
> Nvidia must being having a great weekend, one of my TXP is dying... it's artifacting and showing big ole squares not even OC. Hopefully I get a RMA # by Monday.


One of my cards has a faulty display port. It's a good clocker, so I'm torn about sending it in.


----------



## Maintenance Bot

One of the LED GeForce GTX logos on one of my cards is very dim and doesn't work.

Four busted cards in the owners' club today.


----------



## jcde7ago

Had a near heart attack this weekend as the eBay buyer of my old 3x Titan X Maxwells initiated a return (and he had paid $685 each for them)!

Turns out, I made a completely amateur move and forgot to BIOS flash the TXMs back to stock...after a lot of angry back and forth with the buyer he flashed back to stock, I partially refunded him $50 for the troubles and we went our merry ways leaving each other positive feedback and the return case was closed.









Just a reminder to those of you selling GPUs on fleabay or anywhere else for that matter to pick up a TXP(s) ....REMEMBER TO FLASH YOUR CARDS BACK TO STOCK BIOS! DON'T DERP OUT LIKE I DID!









Also, what's going on with all the reports of problematic TXPs?? My two are humming along just fine... they're actually a bit underused, as there are simply no games that really stretch their legs... I'm replaying Witcher 3 with all the DLC just to experience some legit eye candy... sigh.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> No questions on HOW and Where, Just said RMA sent and got email about RMA sticker will be sent in 24 hours


lol - fantastic!


----------



## KillerBee33

Anyone want a slightly used and slightly burned TITAN X? HEHEHE






















Hate putting these things back together ...


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - fantastic!


Yeah, had to use the thermal pads from the EK block; hoping for the best.


----------



## lanofsong

Hey Titan Pascal owners,

Would you consider putting all that power to a good cause for the next 2 days? If so, come *sign up* and fold with us for our monthly Foldathons - see attached link.

September Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - you need a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726
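For reference, those three identity entries can also be dropped straight into FAHClient's `config.xml`. A minimal Python sketch that generates one; the folding name and passkey below are placeholders, only the team number comes from this post, and the element names assume the v7 FAHClient config format:

```python
# Sketch: generate a minimal FAHClient config.xml with the OCN team number.
# "YourFoldingName" and the passkey value are placeholders -- use your own.
import xml.etree.ElementTree as ET

def build_fah_config(user: str, passkey: str, team: int = 37726) -> str:
    """Return a minimal FAHClient config.xml as a string."""
    config = ET.Element("config")
    ET.SubElement(config, "user", value=user)
    ET.SubElement(config, "passkey", value=passkey)
    ET.SubElement(config, "team", value=str(team))
    # One GPU folding slot; FAHClient detects the card itself.
    ET.SubElement(config, "slot", id="0", type="GPU")
    return ET.tostring(config, encoding="unicode")

xml_text = build_fah_config("YourFoldingName", "0123456789abcdef")
print(xml_text)
```

FAHClient normally reads `config.xml` from its data directory on startup, so restarting the client should pick the identity up.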

later
lanofsong


----------



## Jpmboy

Quote:


> Originally Posted by *jcde7ago*
> 
> Had a near heart attack this weekend as the eBay buyer of my old 3x Titan X Maxwells initiated a return (and he had paid $685 each for them)!
> 
> Turns out, I made a completely amateur move and forgot to BIOS flash the TXMs back to stock...after a lot of angry back and forth with the buyer he flashed back to stock, I partially refunded him $50 for the troubles and we went our merry ways leaving each other positive feedback and the return case was closed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just a reminder to those of you selling GPUs on fleabay or anywhere else for that matter to pick up a TXP(s) ....REMEMBER TO FLASH YOUR CARDS BACK TO STOCK BIOS! DON'T DERP OUT LIKE I DID!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, what's going on with all the reports of problematic TXPs?? My two are humming along just fine...*they're actually a bit underused* as there's simply no games that really stretch their legs...i'm replaying Witcher 3 with all the DLC just to experience some legit eye candy...sigh.


FOLD for a few days:
Quote:


> Originally Posted by *lanofsong*
> 
> Hey Titan Pascal owners,
> 
> Would you consider putting all that power to a good cause for the next 2 days? If so, come *sign up* and fold with us for our monthly Foldathons - see attached link.
> 
> September Foldathon
> 
> To get started:
> 
> 1.Get a passkey (allows for speed bonus) - need a valid email address
> http://fah-web.stanford.edu/cgi-bin/getpasskey.py
> 
> 2.Download the folding program:
> http://folding.stanford.edu/
> 
> Enter your folding name (mine is the same as my OCN name)
> Enter your passkey
> Enter Team OCN number - 37726
> 
> later
> lanofsong


ya beat me to it.


----------



## lanofsong




----------



## willverduzco

Quote:


> Originally Posted by *KillerBee33*
> 
> Uhummm
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Jeez. How'd you blow the VRM? Are you sure you shorted something out? It seems far more likely that something like this is due to excessive power load on that specific MOSFET. Did you do the CLU shunt mod? If so, were you at the software max power limit and running the card at prolonged full load? Our cards have pretty weak VRMs, as I talked about a couple of pages ago, so I wouldn't be surprised if something like this were caused by shunt mod + increased PL + heavy load...


----------



## Silent Scone

Quote:


> Originally Posted by *KillerBee33*
> 
> Uhummm
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Had you applied the shunt mod? It's not really normal for that to happen. Looks like the pad has shifted as well, so you probably borked that. Who knows.
Quote:


> Originally Posted by *jcde7ago*
> 
> Had a near heart attack this weekend as the eBay buyer of my old 3x Titan X Maxwells initiated a return (and he had paid $685 each for them)!
> 
> Turns out, I made a completely amateur move and forgot to BIOS flash the TXMs back to stock...after a lot of angry back and forth with the buyer he flashed back to stock, I partially refunded him $50 for the troubles and we went our merry ways leaving each other positive feedback and the return case was closed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just a reminder to those of you selling GPUs on fleabay or anywhere else for that matter to pick up a TXP(s) ....REMEMBER TO FLASH YOUR CARDS BACK TO STOCK BIOS! DON'T DERP OUT LIKE I DID!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, what's going on with all the reports of problematic TXPs?? My two are humming along just fine...they're actually a bit underused as there's simply no games that really stretch their legs...i'm replaying Witcher 3 with all the DLC just to experience some legit eye candy...sigh.


Just don't use eBay. It's where mongs go to grab a bargain. I got sick of idiots abusing the returns system. Plus the final value fees are criminal.


----------



## KillerBee33

@ willverduzco
@ Silent Scone
Nope, no mods to the GPU; everything was covered and working an hour earlier. The only thing that was bare and loose is that LED power cable, which still seems unlikely, but it's the only thing I can think of.








The pad did not shift; I ripped a little piece off when trying to remove it. There were also no burns visible on the outward-facing side until I looked underneath the pad. I just flipped the pad upside down and set it to the side for the image.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> @ willverduzco
> @ Silent Scone
> Nope,no mods to the GPU , everything covered and working an hour earlier , the only thing was naked and loose is that LED power cable which seems unlikely still but it's the only thing i can think of
> 
> 
> 
> 
> 
> 
> 
> 
> Pad did not shift , ripped a little piece when trying to take it off and also no burns on the outside facing up until i looked underneath the pad, just flipped that pad upside down and put it on the side for the image.


I'm tellin' ya... all you know is "the part just blew up" and "I'm soooo distraught". Send it back to NV (which I know you have already).


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> I'm tellin' ya.. all you know "is the part just blew up" and "I'm soooo distraught" . And send it back to NV (which I know you have already).


Uhumm, did that! Still sad the KOMODO block finally let me know it's not meant to be.








I saw your post about being done making things pretty, but you probably remember the feeling.








Slapping that simple EK block on wasn't in the plan, but it looks like the only reasonable choice.


----------



## Delitus

Quote:


> Originally Posted by *aliron*
> 
> Hi everyone in my first post here.
> I bought the Titan X Pascal and few days late i saw this when browsing in firefox:
> http://subefotos.com/ver/?e384de19b0335cde8de6ad62b9301de7o.jpg
> 
> This artifacts change a little when i move the mouse over them, and it only shows in this tab. Just closing Firefox everything goes normal again. I send a ticket to Nvidia and because not happend again, they close it.
> But days later, (372.70 drivers instead 372.54) it happend again in a video in the Quantum Break game and 2 days later when scrolling in a google results page. This 2 were just a flash, but looks like the first one. I reopened the ticket and im waiting the Nvidia answer.
> 
> I have to say that except for this, in games seems to work normal, with no errors (i have it since august 25th).
> The errors happend not in 3D mode, just with some hardware acceleration (Firefox and playing video). And in that moments there are peaks in the memory and GPU frequency and voltage (like with other GPUs i own)
> Did anything like this happend to anyone of you? Do you think the GPU is defective or its originated because drivers or something?
> Thank You.
> EDIT: I forgot to say that the titan X is with the stock cooler. So i dont touch anything.


Quote:


> Originally Posted by *HotClock*
> 
> I had the same thing happen to me with MS Edge and videos playing, and this was even before I did any overclocking. Hopefully they patch it. It would be nice too if they patched in 10-bit support, WCG and HDR for the desktop. After all, it's only $1200, and for that price we should expect the latest and greatest for what we're buying.


Noticed the exact same issue on my new ASUS GTX 1080 FE, no overclocks. (Sorry, not a Titan XP owner...) Still, the same problem occurring on two different Pascal-based cards - does this indicate a possible driver/firmware issue?

The problem seems to be strictly visual-only - zero errors on MemtestCL, zero artifacts using OC Scanner, zero crashes.

I have also noticed that these artifacts, when they happen, also appear within screen captures - meaning the image corruption has occurred at the framebuffer level.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Delitus*
> 
> Noticed the exact same issue on my new ASUS GTX 1080 FE, no overclocks. (Sorry, not a Titan XP owner...) Still, the same problem occurring on two different Pascal-based cards - does this indicate a possible driver/firmware issue?
> 
> The problem seems to be strictly visual-only - zero errors on MemtestCL, zero artifacts using OC Scanner, zero crashes.
> 
> I have also noticed that these artifacts, when they happen, also appear within screen captures - meaning the image corruption has occurred at the framebuffer level.


I get that weird glitch too once in a blue moon when browsing. I'm thinking driver issue myself.


----------



## Jpmboy

Hey guys... the TXP folds off the charts: almost 3M PPD and cores never above 37°C.











join in (the OCN team needs the points!!): http://www.overclock.net/t/1611567/september-2016-foldathon-monday-26th-28th-12-noon-est-4pm-utc/0_20


----------



## Yuhfhrh

Quote:


> Originally Posted by *Jpmboy*
> 
> hey guys... the TXP Folds off the charts. almost 3M PPD and cores never above 37C.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> join in (the OCN team needs the points!!): http://www.overclock.net/t/1611567/september-2016-foldathon-monday-26th-28th-12-noon-est-4pm-utc/0_20


Started a bit late, but should be able to keep it up the next couple days.







Will see if I can get a couple other machines going too.


----------



## lanofsong

Ughhh - it would take 3 x 980s to put out similar PPD to one TXP - awesome folding power there.


----------



## Jpmboy

Quote:


> Originally Posted by *lanofsong*
> 
> Ughhh - It will take 3 x 980's to put out similar PPD as a TXP - Awesome folding power there


well it's 2 TXPs.


----------



## lanofsong

Quote:


> Originally Posted by *Jpmboy*
> 
> well it's 2 TXPs.


Then I will need 6 x 980 (which also have to be good clockers)


----------



## jodasanchezz

Hi guys,
I just finished my build.

Got a Titan in a separate loop, with:
100% voltage
120% power limit
94°C temp target
+230 core
+450 memory

I can only manage to keep 1964-2025 MHz in Witcher 3 (4K, all maxed, HairWorks off). FPS are fine, but is this normal? I would be happy if I could reach and keep 2000+ on the core.

Temps are 56°C max.


----------



## V I P E R

If you're using the stock cooler, that's normal. You can manage 2000+ with a waterblock and keep the temps under 45 degrees.


----------



## KillerBee33

Quote:


> Originally Posted by *jodasanchezz*
> 
> HI guys,
> if just finished my build.
> 
> Got a Titan in an separated Loop an wich
> 100% V
> 120% Power
> 94°C
> +230
> +450
> 
> i can can only manage to Keep 1964-2025 in Witcher 3 (4k all max hw off) fps are fine but is this normal.
> I would be happy if i could reach an keep 2000+ on the core.
> 
> Temps are MAX 56C


Keep the voltage at stock.


----------



## EniGma1987

Quote:


> Originally Posted by *axiumone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DADDYDC650*
> 
> Looks like Microsoft is giving away free Gears 4 keys with 1070's/1080's.... should I go SLI 1080 for 4K or 1 XP?
> 
> 
> 
> 1 XP. Especially if you're looking to play dx12 games from the windows store. None of them support multi GPU yet.
Click to expand...

That was just Quantum Break.

Gears 4: http://arstechnica.com/gaming/2016/09/gears-of-war-4-reveals-offline-lan-free-matchmaking-dlc-smooth-4k-on-pc/
Quote:


> _Yes, there are "insane" settings... but why?_
> 
> "DirectX 12 is a big part of how we're able to get additional performance out of the PC," *Technical Director Mike Raynor told Ars*. "The DX12 graphics API takes away a bunch of layers from the developer and hardware. It's letting us have more direct control over what the GPU's doing... We've completely parallelized the rendering system, meaning we're utilizing and running more CPU cores. One big focus we had was to lower the simulation cost to give headroom for the GPU. A lot of games out there have monster GPU but get framelocked because you're CPU bound. There's a huge upper range we support with what we've done."
> 
> I also jacked up settings beyond my rig's near-ultra default and dropped the resolution to 1440p, which still looked gorgeous and rocked a 60fps refresh. I didn't have as much luck pushing some of the settings into the menus' crazy-high "insane" setting, which might be because, *according to Raynor, that setting has been offered for future systems and SLI power users*: "The game will scale with new hardware that will come out," he said. "You need a really high spec to get in there. It looks awesome, but it's very GPU-hungry." (*While I couldn't test Raynor's assertions that the game has been optimized to make the most of SLI graphics card performance, he insisted that users would see substantial boosts if they double-card*.)


Quote:


> Originally Posted by *bizplan*
> 
> Has anyone heard of someone's VRM ever going bad because of heat? Or that they got a bad or unstable OC because the VRMs got too hot?


I have had a VRM catch fire from being run too hot before. Typically on GPUs, as the VRMs get hotter their current capacity is derated, which means they can fail from too much current passing through them while hot. With this Titan X, though, the VRM is actually rated for essentially the same current at both cool temperatures and the maximum temperature it will ever realistically reach. It's the first VRM I have ever seen behave like that.
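To picture the derating behaviour, here's a toy Python sketch. The rated current, knee temperature, and cutoff are all made-up illustrative numbers, not values from any Titan X datasheet: a conventional power stage loses current capacity roughly linearly above some knee temperature, while a "flat-rated" part like the one described keeps its rating across the usable range.

```python
def derated_capacity(temp_c: float, rated_a: float = 60.0,
                     knee_c: float = 85.0, zero_c: float = 150.0) -> float:
    """Toy linear derating curve: full rating up to the knee temperature,
    then linearly down to zero amps. All numbers are illustrative only."""
    if temp_c <= knee_c:
        return rated_a
    if temp_c >= zero_c:
        return 0.0
    return rated_a * (zero_c - temp_c) / (zero_c - knee_c)

# A conventional VRM phase loses headroom as it heats up...
print(derated_capacity(25))    # 60.0 A
print(derated_capacity(110))   # ~36.9 A
# ...so a load that is fine at 25C can kill the same phase when hot.
```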

Quote:


> Originally Posted by *KillerBee33*
> 
> Do that for about 20 hours and you'll be where i am
> 
> 
> 
> 
> 
> 
> 
> Don'f friggin know.
> Even if RMA works out , i might get some crappy clocker , mine was sublime
> 
> 
> 
> 
> 
> 
> 
> 
> Might have a slight assumption on what happened , KOMODO block has two LED power cables hanging from the back , one is in use and the second probably got in between there and shortened it. Whats weird that it's not the first closest vrm , it's second from the end


That is probably why you should not have used a water block meant for a different card and modified it to go on a Titan X.


----------



## KillerBee33

Quote:


> Originally Posted by *EniGma1987*
> 
> That is probably why you should not have used a water block meant for a different card and modified it to go on a Titan X.


Thank you. Most helpful info I've gotten so far.








For the record, this block works just fine, and the block itself has nothing to do with the mistake I made.


----------



## EniGma1987

Quote:


> Originally Posted by *KillerBee33*
> 
> Thank you. Most helpful info i got so far
> 
> 
> 
> 
> 
> 
> 
> 
> For the Record , this block works just fine and Block itself has nothing to do with a mistake i made.


And how do you know? You already said you don't even know what caused the problem; you're only guessing it's related to a fan power cable. The cutouts on the 1080 block are not the same as on Titan blocks. You very well could have had the edge of that VRM-section cutout touch something it shouldn't have. Don't insist that using the wrong part is perfectly fine and works great when you've had such a massive problem and don't even know the cause yourself.


----------



## KillerBee33

Quote:


> Originally Posted by *EniGma1987*
> 
> And how do you know? You already said you dont even know what caused the problem only guessing it is related to a fan power cable. Cutouts on the 1080 block are not the same as titan blocks. You very well could have had the end of that VRM section cutout touch something it shouldnt have. Dont just insist that using wrong parts in places is perfectly fine and works great when you have had such a massive problem and dont even know a cause yourself.


It worked just fine for two weeks before the incident. There are two possible things that might have happened: 1) the bare LED power cable, or 2) a thermal pad shifted and shorted the VRM.
Whichever it was, it still has nothing to do with the modded block itself.
This 7x7 mm, 4 mm deep opening is the only difference there, which I modded.


Spoiler: Warning: Spoiler!






This wouldn't have happened if I had been a bit more careful putting things together the second time.


----------



## tin0

Joining the club @ 2100MHz core clock, still on stock cooler, can't complain


----------



## willverduzco

Quote:


> Originally Posted by *tin0*
> 
> Joining the club @ 2100MHz core clock, still on stock cooler, can't complain


Sheezus. 2100 MHz on the stock cooler is something special. I only get 2126 sustained on water, with max temps in the high 30s in games and low 40s in benchmarks. Any higher clocks (by adjusting the overvolting slider) trip my power limiter. I see in that pic that your max temp was in the 60s, which is very high for GPU Boost 3.0, which begins lowering the max boost somewhere in the 30s. =X If you cool it down to the high 30s/low 40s, you're probably looking at another 50-75 MHz. For reference, my card only did 2050ish on air (hitting the high 70s).


----------



## typingofthedead

Ugg, hey, looking for help-- I just got my Titan X Pascal today and immediately put on a waterblock... temps were great, but I was getting some crashing in games while testing, and after a crash I rebooted...

Now the green 'GeForce GTX' light comes on during boot, but then goes off after about 5 seconds and never comes back on. It never seems to output anything to a monitor... has anyone seen this / is there any fix? Or is my card borked? Thanks for any insight.


----------



## hotrod717

Quote:


> Originally Posted by *Jpmboy*
> 
> are you really handling that card on a granite counter top?


Thank goodness there is a voice of sanity in this thread. Lol.


----------



## xarot

You guys are getting some serious OCs on these. My card is not much above 2025 stable in 3DMark with an EK block, and it already throttles to under 2 GHz at times, especially in Witcher 3, regardless of the set clock offset. Well, for 24/7 use I run stock anyway, as I'm not a fan of third-party software OC tools and no BIOS editor is available.


----------



## tin0

Did some LN2 benching on Saturday with my buddy, first on a 980 Ti KP. After that, my friend put his TITAN XP on the setup (still on the stock cooler). Seems like his chip and mine are twins, both 2100 MHz. Time for water & SLI












Score


----------



## Lobotomite430

Quote:


> Originally Posted by *hotrod717*
> 
> Thank goodness there is a voice of sanity in this thread. Lol.


It's still laminate. Everything is fine.


----------



## EniGma1987

Quote:


> Originally Posted by *Jpmboy*
> 
> are you really handling that card on a granite counter top?


Why are so many people against putting computer parts on granite? Some weird thing I don't know about? It doesn't really conduct electricity, no more than any other tabletop would. Any spills can easily be taken out completely with some bleach. It's hard and stable, and PCB is softer than granite, so you aren't going to scratch the countertop. I can't see why people would be so against it.

Just last week I showed a guy how to apply some CLU to a delidded 6700K on my granite counter. The week before, I built a whole computer on the same granite countertop. And the week before that, I was modding my GTX 1060 on the counter. Everything seems fine.


----------



## Silent Scone

Quote:


> Originally Posted by *EniGma1987*
> 
> Why are so many people against putting computer parts on granite? Some weird thing I dont know about? It doesnt really conduct electricity, no more than any other table top would. Any spills of something can easily be taken out completely with some bleach. It is hard and stable. I cant see why people would be so against it.


Yeah, or why not just get a couple of paving slabs and push them together so you can vice the card in that way.


----------



## pez

I'm obviously missing the importance of this as well. Never been fortunate enough to have granite counter tops







. What's the deal, here?


----------



## Jpmboy

Quote:


> Originally Posted by *pez*
> 
> I'm obviously missing the importance of this as well. Never been fortunate enough to have granite counter tops
> 
> 
> 
> 
> 
> 
> 
> . What's the deal, here?











Look at the bare backside of the TXP PCB... I already know of two owners who knocked one of those tiny components loose. Hey, handle your card the way you like, but frankly, a video showing such goofy practice is not smart.
It's just not good assembly/disassembly practice, and that's why no facility I've ever been to has rock-hard/unpadded surfaces for component assembly. Flow bench, soldering - sure (not granite though).

Some folks can only learn an expensive lesson.



not this:

Quote:


> Originally Posted by *hotrod717*
> 
> Thank goodness there is a voice of sanity in this thread. Lol.


Hey buddy! what's news?


----------



## EniGma1987

Quote:


> Originally Posted by *Jpmboy*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> look at the bare backside of the TXP PCB... already know of two owners who knocked one of those tiny components loose. Hey, handle your card the way you like, but frankly - a video showing such goofy practice is not smart.
> it's just not good assembly/disassembly practice (and that's why NO facility I've ever been to has rock-hard/unpadded surfaces for component assembly. flow bench, soldering - sure (not granite tho)
> 
> Some folks can only learn an expensive lesson.
> 
> 
> 
> not this:
> 
> Hey buddy! what's news?


I guess to each their own. At least there's no technical reason I'm missing why granite is somehow bad. I'll just continue using it, and the marble countertops in my bathroom, like I always have. Haven't had an "incident" in the 15 years I've been doing it.


----------



## mbze430

I do all my work on a metal sheet that is highly conductive


----------



## Lee0

Hello! I'm finally joining this club tomorrow!








I originally planned on buying a 1080 AMP! Extreme, but its actual noise levels and power consumption left me disappointed. Just then the Titan X came into stock in Sweden, so I seized the opportunity.







I will be on air for maybe 4-5 months before I plan my custom loop. Until then, I will be in your care, Titan X club!


----------



## mbze430

If you think the 1080 AMP! fan is bad, wait till you hear the Titan XP fan... better get that Titan XP waterblock PRONTO!


----------



## mbze430

Looks like the news outlets are starting to pick up the real story on the 1080 Ti.

http://wccftech.com/nvidia-gtx-1080-ti-launch-january/


----------



## Jpmboy

Quote:


> Originally Posted by *mbze430*
> 
> Looks like the news outlet are starting to pick up the real story on the 1080TI
> 
> http://wccftech.com/nvidia-gtx-1080-ti-launch-january/


January is good!


----------



## bwana

Remounted my EVGA 1080 Hybrid cooler on the TXP using Thermal Grizzly Kryonaut. Temps dropped 10 degrees! Max in Heaven is now 51°C, down from 63°C when I was using the stock paste on the EVGA block. Just trying to figure out the best way to OC the card:

Leave voltage at default.
Turn power to 120% (max).
Turn the temp target to 90 (max).
Set priority to temp.
Move the core OC slider to +200.
Hit apply and go.

Now what confuses me is what the graph thingy does. You know, when you move to the second screen of Precision XOC you get to pick from manual, basic, and linear. In basic, I can add 200 MHz. Does this curve override what I put on the first page of Precision XOC?
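Not an authoritative answer, but as I understand GPU Boost 3.0, the slider on the first page applies one flat offset to every point of the stock voltage/frequency curve, while the curve editor lets you shift points individually. A toy Python sketch of the flat-offset idea; the V/F points below are made up for illustration, not real Titan X bins:

```python
# Rough model of what a flat core-clock offset does to the GPU Boost V/F
# curve. The voltage/frequency points here are hypothetical values.

base_curve = {  # voltage (mV) -> stock boost clock (MHz), illustrative only
    800: 1544,
    900: 1746,
    1000: 1911,
    1062: 1987,  # roughly a stock top bin
}

def apply_offset(curve: dict, offset_mhz: int) -> dict:
    """A flat offset shifts every point of the curve by the same amount."""
    return {mv: mhz + offset_mhz for mv, mhz in curve.items()}

oc_curve = apply_offset(base_curve, 200)
print(oc_curve[1062])  # 2187 -- though the card won't hold this bin if the
                       # power or temperature limiter pulls the clock down
```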


----------



## hotrod717

Other interests taking the lead, or the green, so to speak. Shame: a pile of hardware and kit with a slacker staring at it. Lol. I'll get the itch soon enough. Winter's almost here, and I've got damn near all my vacation time to burn.


----------



## bizplan

Quote:


> Originally Posted by *mbze430*
> 
> Looks like the news outlet are starting to pick up the real story on the 1080TI
> 
> http://wccftech.com/nvidia-gtx-1080-ti-launch-january/


I don't believe it; the specs are too close to the TXP's. I'll believe it when I see it, though.


----------



## pez

Quote:


> Originally Posted by *Jpmboy*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> look at the bare backside of the TXP PCB... already know of two owners who knocked one of those tiny components loose. Hey, handle your card the way you like, but frankly - a video showing such goofy practice is not smart.
> it's just not good assembly/disassembly practice (and that's why NO facility I've ever been to has rock-hard/unpadded surfaces for component assembly. flow bench, soldering - sure (not granite tho)
> 
> Some folks can only learn an expensive lesson.
> 
> 
> 
> not this:
> 
> Hey buddy! what's news?


Ah, the hard-surface thing I understand. I have a big QcK Heavy I don't use that I do my stuff on. Just didn't know if there was some property of granite I didn't understand









----------



## meson1

So basically, the upshot is, if you're working on a hard surface, you just need to be a bit careful.


----------



## Ghoxt

Source: Functional 4 Way SLI Titan X Pascal - in several games - how he did it, in his words.

Educational/ Info only on how one guy says he did it in his niche "server" setup with listed games









Quote: Post #23


> 1. Make sure you have a 4-way bridge with metal connectors. The prongs on each side actually close a circuit and notify the card that you have a higher-bandwidth bridge.
> 
> 2. Make sure you're on driver 372.54; 369.05 is broken.
> 
> 3. Pick any game in NVCPL, go to SLI mode under programs. Does the dropdown show 3- and 4-GPU AFR 1 and AFR 2 as a choice, as in the picture I have earlier in the post? It should have 4-GPU in the dropdown.
> 
> 4. With Nvidia Inspector, in the base profile, under SLI mode, instead of auto, make sure you selected 4 cards. Apply changes.
> 
> 5. I don't have Rise of the Tomb Raider. I get my games from GOG and never Steam (don't like rent-to-play). I like clicking on the executable and bam, being right in the game WITHOUT a third-party app constantly running in the background. Otherwise you don't own the game, you're streaming it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> lastly:
> 
> I don't run Windows 10. Never will.
> So I don't have DX12.
> 
> My rig is on *Windows Server 2012 R2*, running as a workstation, with no issues from that whatsoever.
> So maybe in Windows 10 there is a driver issue/difference that I don't know about.
> I'm using 372.54 without any issues or any changes in SLI compared to when I had regular 4-way SLI with the original Titan X.
> 
> I launch about half of my work stuff as a VM so windows 2012r2 was a good choice for that. Protected environment for a lot of stuff.
> 
> I also launch in no GUI mode when I have a lot of stuff to run. One of my apps is a unix only App so I use Ubuntu for that one.
> 
> I have had no issues in Ubuntu, but that is a different post.


Current build:
Supermicro X10DRG-Q (One of the few boards that numbers the PCI slots in reverse. Slot #1 is furthest away from CPU) hmm
2x E5-2699 v4 (44 cores/ 88HT) (3.7ghz turbo)
512GB ram DDR4 2400mhz ecc reg
(QUAD SLI) 4 Titan X PASCAL
2x Samsung NVMe 961 Pro pcie 3.0 (os drives)
10x Samsung 850 Pro SSD RAID (apps drive)
LG 31MU97z 4096x2160 true 4K rev C
modded P5 case, Noctua heatsinks
Digital power supply 1650w
Windows Server 2012 R2 Data Center, Ubuntu 15

Quote:


> Originally Posted by *XcroN*
> Have you tried some games with 4 way SLI under pascal?
> 
> Yes, a little...
> 
> Fallout 4. custom sli profile, excellent
> No man's sky. Tried nvidia and custom, poor results so far, very poor implementation for pc so far
> Shadwen. custom sli profile, excellent
> We happy few. custom sli profile, excellent
> Hard reset redux. custom sli profile, excellent
> Doom 4 So so, still trying to figure it out
> Crystals. custom sli profile, excellent
> Crysis 2 and 3. custom sli profile, excellent
> Dying light. custom sli profile, awesome
> Ethan carter redux. custom sli profile, awesome
> Ashes of the singularity. Poor
> Soma. custom sli profile, So so
> Quake 4, doom 3, prey, So so, never ran sli well
> Soma. custom sli profile, excellent
> UT3. 32xAf, custom sli profile, Awesome
> Unreal gold. custom sli profile, excellent
> Ziggurat. custom sli profile, excellent
> Singularity. custom sli profile, excellent
> Sir, you're being hunted. custom sli profile, excellent
> Quake HD. custom sli profile, excellent
> Quake 2 custom sli profile, excellent
> Assassin's Creed. custom sli profile, excellent
> Batman Arkham. custom sli profile, with physics mod, good
> Batman ark city. custom sli profile, good
> Bloodrayne 2. HD \mod, threaded mod, custom sli profile, excellent
> Necropolis. Custom profile, work in progress, good
> Pollen. custom sli profile, excellent
> SuperHot. custom sli profile, excellent
> Solus project. Poor
> Far cry 1,2, 3 custom sli profile, good/ok
> Layers of fear. custom sli profile, excellent
> The Witcher 3. ok 2 way sli, poor 3 or 4 way
> The witcher 1 and 2. custom sli profile, excellent
> Metro and metro 2033. custom sli profile, mediocre/ok
> Outlast. custom sli profile, excellent
> Valve engines- portal, hL2, custom sli profile, excellent
> Portal 2- Very poor
> 
> Etc etc about 30 more titles


Again, as far as I'm concerned this is info only, about a wild one-off. But this is OCN, so nothing can be considered out of bounds, as we have the crazy 1%'ers here as well.

P.S., the politics of Nvidia shutting >2-way SLI down, someone else can talk to that. My personal opinion is that ultimately even for me it's too expensive to play with, only for it to stop working on a whim or a driver update... I'm going single card next gen.


----------



## Jpmboy

Quote:


> Originally Posted by *meson1*
> 
> So basically, the upshot is, if you're working on a hard surface, you just need to be a bit careful.


..."on any surface"...

corrected.


----------



## meson1

Quote:


> Originally Posted by *Jpmboy*
> 
> ..."on any surface"...
> 
> corrected.


----------



## bl4ckdot

Quote:


> Originally Posted by *Ghoxt*
> 
> Source: Functional 4 Way SLI Titan X Pascal - in several games - how he did it, in his words.
> 
> Educational/ Info only on how one guy says he did it in his niche "server" setup with listed games
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Current build:
> 
> Supermicro X10DRG-Q _(One of the few boards that numbers the PCI slots in reverse. Slot #1 is furthest away from CPU) hmm_
> 
> 2x E5-2699 v4 (44 cores/ 88HT) (3.7ghz turbo)
> 
> 512GB ram DDR4 2400mhz ecc reg
> 
> (QUAD SLI) 4 Titan X PASCAL
> 
> 2x Samsung NVMe 961 Pro pcie 3.0 (os drives)
> 
> 10x Samsung 850 Pro SSD RAID (apps drive)
> 
> LG 31MU97z 4096x2160 true 4K rev C
> 
> modded P5 case, Noctua heatsinks
> 
> Digital power supply 1650w
> 
> Windows Server 2012 R2 Data Center, Ubuntu 15
> 
> Again this is as far as I'm concerned, info only, about a wild one-off. But this is OCN so nothing can be considered out of bounds as we have the crazy 1%'ers here as well.
> 
> P.S., the politics of Nvidia shutting >2 way SLI down, someone else can talk to it. My personal opinion is that ultimately even for me it's too expensive to play with, for it to not work on a whim or driver update etc... I'm going single card next gen.


Damn his build is sexy


----------



## jodasanchezz

Hello, need some help here.

My Titan is under water; everything was fine until today.









I'm not able to overclock at all.

I get heavy artifacts.
Even at stock settings there are some artifacts visible.

I have no idea what's going on.
Yesterday it was fine.

Tried:

Removed and reinstalled the driver
Reseated the card in the PCIe slot
Checked the 6- and 8-pin power

Does anyone know a reason for what's going on?
What should I do?

Temps at full load in Heaven max 47°C.

Here's a video of my situation.

Sorry for the bad English!


----------



## TremF

Well, I bought a new playmate for my TXP today... My 27" 1440p ViewSonic VP2770-LED monitor is being retired and an Acer Predator XB321HK is taking its place!

I am moving on from 27" 1440p to 32" 4K G-Sync! I think that's going to be as big a step as going from 22" 1080p to 27" 1440p, but with a few more bells and whistles thrown in. Delivery tomorrow, so I have all weekend to play.









I am an impulse buyer but tend to read reviews before taking the plunge. In this instance I had a good friend looking at various options with me. I was looking at 1440p G-Sync, but that wouldn't have made a big enough difference, so I initially looked at standard 4K to keep the price down a bit. Then my mate showed me the 32" and, with the good reviews, I quickly took the leap. Once I clicked to buy I thought to myself that the 27" version with the smaller bezel would probably have done, but it's done now! The extra screen space will give me more to look at and help with immersion when I can't be bothered with my Oculus Rift VR headset









Can anyone tell I am happy and excited? lol GADGETS!


----------



## KillerBee33

Not sure what to say here. BURNED my Titan on Saturday, RMA'd and shipped it out on Tuesday, got this in the mail today









Anticipated ship date:
Thu, 9/29/2016
Shipping Department
NVIDIA Corporation
Santa Clara, CA 95050
US
Scheduled delivery:
Fri, 9/30/2016 by 10:30 am


----------



## bizplan

Quote:


> Originally Posted by *TremF*
> 
> Well I bought a new playmate for my TXP today... My 27" 1440P ViewSonic VP2770-LED monitor is being retired and an Acer Predator XB321HK is taking it's place!
> 
> I am moving on from 27" 1440P to 32" 4k G-Sync! I think that's going to be as big a step as it was going from 22" 1080P to 27"1440P but with a few more bells and whistles thrown in. Delivery tomorrow so all weekend to play.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am an impulse buyer but tend to read reviews before taking the plunge. In this instance I had a good friend looking at various options with me - I was looking at 1440P G-Sync but that wouldn't have made a big enough difference, so I was looking at standard 4K initially to keep the price down a bit . Then my mate showed me the 32" and with the good reviews I quickly took the leap. Once I clicked to buy I thought to myself that the 27" version with smaller bezel would have probably done but it's done now! The extra screen space will give more to look at and help with immersion when I can't be bothered with my Oculus Rift VR headset
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can anyone tell I am happy and excited? lol GADGETS!


Got an XB321HK a couple of weeks ago, you will really like this monitor, playing Doom at 4K, ~120 FPS average with TXP (max eye candy settings), unbelievable clarity and color. Even at 60Hz refresh rate, very smooth, no latency or input lag.


----------



## bizplan

Quote:


> Originally Posted by *jodasanchezz*
> 
> Hello, need some help here.
> 
> My Titan is under water; everything was fine until today.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm not able to overclock at all.
> 
> I get heavy artifacts.
> Even at stock settings there are some artifacts visible.
> 
> I have no idea what's going on.
> Yesterday it was fine.
> 
> Tried:
> 
> Removed and reinstalled the driver
> Reseated the card in the PCIe slot
> Checked the 6- and 8-pin power
> 
> Does anyone know a reason for what's going on?
> What should I do?
> 
> Temps at full load in Heaven max 47°C.
> 
> Here's a video of my situation.
> 
> Sorry for the bad English!


I think some of us may be pushing our cards too hard. I've noticed that I am getting some artifacting at high core offsets (Heaven & Fire Strike) where I once did not get any artifacts at those settings, so I have had to dial my card back a bit. I am monitoring to see if the problem gets worse. I may have to RMA my card.








Edit: what's weird is I'm not getting any artifacts playing Doom (as high as 2100 MHz), so not sure what is going on...
Edit: dialing back memory settings makes no difference; it is the core clock setting that is the problem (max temp 54°C under water)..


----------



## tin0

Did anyone here try to get a free copy of Gears of War 4 with the TXP? I know the promotion page says GTX 1080/1070 and notebook only, but I was just wondering. I tried Nvidia chat, but they were sorry and couldn't help, and kept pointing at the terms & conditions. Shame us Titan X Pascal buyers are left out.

Something else though, managed to break my PR for single card graphics score in Fire Strike at 2100 MHz, well over 33K now still on stock cooler


----------



## KillerBee33

Does anyone know how to revert to previous Monitoring View in 3DMark?


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Not sure what to say here , BURNED my Titan on Saturday, RMA'd and shipped out on Tuesday , got this in the mail today
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anticipated ship date:
> Thu, 9/29/2016
> Shipping Department
> NVIDIA Corporation
> Santa Clara, CA 95050
> US
> Scheduled delivery:
> Fri, 9/30/2016 by 10:30 am











Quote:


> Originally Posted by *KillerBee33*
> 
> Does anyone know how to revert to previous Monitoring View in 3DMark?


not sure what you are asking... in test sensor monitoring? It's on the Option tab.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> not sure what you are asking... in test sensor monitoring? It's on the Option tab.


The monitor Chart or Details


Spoiler: Warning: Spoiler!






You can see where it gets annoying








Nvidia emailed me a confirmation of the RMA stating Part #*********RETAIL , does that mean it's not refurbished?
Quote:


> Originally Posted by *Jpmboy*


Any way to remove the thermal adhesive I used on the Komodo block, btw?
I used Arctic Alumina Thermal Adhesive, 5.0 grams.


----------



## mbze430

The first bad Titan XP I replaced came in completely retail packaging. Unfortunately that same replacement has been RMA'd for another replacement. Mine will come in tomorrow, so we'll see how this one works out.

I really hope this one holds. I am tired of sending stuff back (even though they pay for it, I just hate having to drop it off at FedEx)


----------



## KillerBee33

Quote:


> Originally Posted by *mbze430*
> 
> the first bad Titan XP I replaced came up in a completely retail package. Unforuntely that same replacement has been RMA for another replacement. Mine will come in tomorrow. So we'll see how this one works out.
> 
> I really hope this one holds, I am tired of sending stuff back (even though they pay for it, I just hate having to drop it off at FedEX)


What was bad on those two? Bad clockers, or?


----------



## mbze430

The memory started to fail: artifacting and big blocks. The first one got so bad it was practically unusable; even on the desktop it would flash artifacts on screen.


----------



## jodasanchezz

Quote:


> Originally Posted by *mbze430*
> 
> the memory started to fail. artifacting and big blocks the first one gotten so bad it was practically unuseable. even on desktop it would flash artifacts on screen


Looks like I got the same problem...

Everything was fine until yesterday.
I was playing with the OC curve in Afterburner and it started to artifact heavily at about 2068 MHz core, +0 MHz on the RAM.

Backed it off a bit, still the same.
Back at stock it works for 5-15 seconds, then gets artifacts.

When I try to dial in any OC it starts to artifact even in 2D.









The Titan is in a single loop with a 360 rad and never goes over 47°C.

What did you tell Nvidia? Did you have your card under water?


----------



## KillerBee33

@ jodasanchezz

Put it back together and tell Nvidia "the screen flashing was so bad you were scared for the rest of the system, had to unplug it, and don't feel safe installing it back".
In my case they never asked what was done to the card previously. The answer would always be "absolutely nothing was done; the last driver was installed a month ago and I hadn't seen a single issue". Hopefully you get the same treatment: out and back in the system in 5 days.


----------



## eliau81

Quote:


> Originally Posted by *KillerBee33*
> 
> Not sure what to say here , BURNED my Titan on Saturday, RMA'd and shipped out on Tuesday , got this in the mail today
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anticipated ship date:
> Thu, 9/29/2016
> Shipping Department
> NVIDIA Corporation
> Santa Clara, CA 95050
> US
> Scheduled delivery:
> Fri, 9/30/2016 by 10:30 am


good for you


----------



## KillerBee33

Quote:


> Originally Posted by *eliau81*
> 
> good for you


Do i smell sarcasm ?


----------



## mbze430

Quote:


> Originally Posted by *jodasanchezz*


> all was fine until yesterday....
> I was playing with the OC curve in AB and it started to artifact heavily @ about 2068 MHz core, 0 MHz on the RAM
> 
> Backed it off a bit, still the same
> Back to stock it works for 5-15 sec, then gets artifacts....
> 
> When I try to dial in any OC it starts to artifact even in 2D
> 
> The Titan is in a single loop with a 360 rad and never goes over 47°C
> 
> What have you told Nvidia? Have you had your card under water?

My 2 cards were NEVER OC'd, nor have I modified them. They ran stock.


----------



## eliau81

Quote:


> Originally Posted by *KillerBee33*
> 
> Do i smell sarcasm ?


What??
No bro
Nvidia spared me too when I broke my card


----------



## KillerBee33

Quote:


> Originally Posted by *eliau81*
> 
> What??
> No bro
> Nvidia spared me too when I broke my card


Cool







It's supposed to be here now, out for delivery.


----------



## mbze430

got mine, this 3rd card is actually the 'best' packaged lol... the anti-static bag in this new one is actually not ripped. Hopefully this will last me more than 2 weeks. Now I can play RoTTR back in SLI again!


----------



## KillerBee33

Quote:


> Originally Posted by *mbze430*
> 
> got mine, this 3rd card is actually the 'best' packaged lol... the anti-static bag in this new one is actually not ripped. Hopefully this will last me more than 2 weeks. Now I can play RoTTR back in SLI again!


Yeap, this one is definitely brand new and also came sealed with better packaging









Spoiler: Warning: Spoiler!






Downloaded ROTTR and Quantum Break , will check it out today


----------



## eliau81

Does our card support FreeSync?

I know it doesn't, but my friend here bets that it does, so I want proof that I am right.
I tried to explain that FreeSync is AMD's.


----------



## Yuhfhrh

Quote:


> Originally Posted by *eliau81*
> 
> Does our card support FreeSync?
> 
> I know it doesn't, but my friend here bets that it does, so I want proof that I am right.
> I tried to explain that FreeSync is AMD's.


Nvidia drivers do not support FreeSync.


----------



## mbze430

Yeah, my first and 2nd replacements had the 2 side seals, but the inside was a mess.

However, I bought my 2nd SLI card maybe about 2 weeks after the first, and that one came in perfect.

Quote:


> Originally Posted by *eliau81*
> 
> Does our card support free sync?
> 
> I know that not, but my friend here bet that it does so I want a proof that I am right
> Tried to explain that this is for AMD


Hahahaha, that would be Nvidia shooting themselves in the foot.


----------



## eliau81

Thanks guys, I have just won 100 bucks, lol


----------



## KillerBee33

Quote:


> Originally Posted by *eliau81*
> 
> Thanks guys I have just won a 100 buck's, lol


But we do have a new VSync option in Pascal, I think it's called Fast Sync


----------



## mbze430

Although when you enable Adaptive VSync, it is very close to what FreeSync does.


----------



## istudy92

So, I bought 1080 SLI since it fits my needs to play 1440p on G-Sync at max settings; I get 80+ FPS in most games.

I was quite disappointed though, too many issues with freezes, and I play Dota a lot: in SLI I get like 80 FPS, while with a single card I get 180 FPS.

Like Minesweeper gets 40 FPS and Solitaire is even worse at 30 FPS.

Anyone here selling a spare TXP?


----------



## skypine27

Hey guys

Been gone a while....

Did anyone ever come out with a modded BIOS (or a way for us to mod the BIOS)? I want to change the power slider max from 120% up to 125 or 130%

Thanks for any updates


----------



## Silent Scone

Quote:


> Originally Posted by *istudy92*
> 
> So, I bought 1080 SLI since it fits my needs to play 1440p on gsync max settings, I get 80+FPS most games.
> 
> I was quite disappointed though, too many issues from freezes and I play DOTA alot, and I get like 80FPS, when I single card it I get 180 FPS.
> 
> Like minesweeper gets 40FPS and solitaire is even worse at 30FPS.
> 
> Anyone here selling spare TXP?


I do, but I can't tell if your post is salty or not.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *istudy92*
> 
> So, I bought 1080 SLI since it fits my needs to play 1440p on gsync max settings, I get 80+FPS most games.
> 
> I was quite disappointed though, too many issues from freezes and I play DOTA alot, and I get like 80FPS, when I single card it I get 180 FPS.
> 
> Like minesweeper gets 40FPS and solitaire is even worse at 30FPS.
> 
> Anyone here selling spare TXP?


Two people selling a TX P in the market place:

*http://www.overclock.net/f/14779/video*
Quote:


> Originally Posted by *skypine27*
> 
> Hey guys
> 
> Been gone a while....
> 
> Did anyone ever come out with a modded BIOS (or a way for us to mod the BIOS)? I want to change the power slider max from 120% up to 125 or 130%
> 
> Thanks for any updates


Nothing yet. If you want a smidge more performance, do the shunt mod.


----------



## skypine27

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Two people selling a TX P in the market place:
> 
> *http://www.overclock.net/f/14779/video*
> Nothing yet. If you want a smidge more performance, do the shunt mod.


Thx bro. No way I'm dropping solder between resistors etc. Thanks for the update


----------



## lowbudgethero

Quote:


> Originally Posted by *skypine27*
> 
> Thx bro. No way im dropping solder between resistors etc. thx for the update


The minor increase in performance isn't worth the risk even for me. Maybe when the Titan XP starts to show its age I'll reconsider.


----------



## mouacyk

When someone has a BIOS mod to let the XP (or 1080TI FTM) suck 400W, wake me up.


----------



## istudy92

Quote:


> Originally Posted by *Silent Scone*
> 
> I do, but I can't tell if your post is salty or not.


Haha, posts back from October someone mentioned that they were getting 45 FPS in Minesweeper and Solitaire. Blast from the past, I was hoping someone would catch it.
Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Two people selling a TX P in the market place:
> 
> *http://www.overclock.net/f/14779/video*
> Nothing yet. If you want a smidge more performance, do the shunt mod.


Yeah I know. One TXP is modded with fans on it though, and the other is from the UK, so the price would exceed that of one here in the US, shipped.


----------



## KillerBee33

Ehh , just as expected







4 in a row and finally broke the cycle....
New Titan can't go over +210 but rolls along just fine at +800 on the memory.


Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!






http://www.3dmark.com/3dm/15193020


----------



## cg4200

Quote:


> Originally Posted by *KillerBee33*
> 
> Ehh , just as expected
> 
> 
> 
> 
> 
> 
> 
> 4 in the row and finally broke the cycle ....
> New Titan cant go over +210 but rolls just fine +800 on the memory ..
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/15193020


With +210 what do you get as an average core clock? And is that water cooled or air? Thanks


----------



## MrTOOSHORT

Installed the nickel back plate today:




Spoiler: Warning: Spoiler!










Those millions of little resistors made me nervous running bare for the last 6 weeks. Feels better to have some armor there now.


----------



## bouncingsoul

Quote:


> Originally Posted by *cg4200*
> 
> with 210 what do you get average core?and is that water cooled or air ? thanks


Hey there!

I'm new on the forums although I've been reading this thread for some time.
My TX is watercooled and I'm running +225 MHz core and +675 MHz mem. When playing Deus Ex: Mankind Divided the card boosts up to 2088 MHz. Depending on the temp it clocks down to approx. 2040 MHz at a max of 55 degrees, which is the highest temp I've gotten so far after hours of running at 100% load in DE:MD.

I was quite surprised to get temps that high in Deus Ex... I hadn't seen any temps above 44 degrees until now.
Is anyone getting similar temps? Or is it just my insufficient radiator?
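That 2088-to-2040 drop is consistent with how GPU Boost 3.0 behaves on Pascal: the card sheds roughly one ~13 MHz boost bin each time the core crosses a temperature threshold. A toy model of that behavior (the bin size is approximate and the threshold temperatures here are made up for illustration; the real table varies per card):

```python
BIN_MHZ = 13  # approximate Pascal boost-bin size; illustrative only

def boost_clock(base_boost, temp_c, thresholds=(37, 46, 54, 63)):
    """Drop one ~BIN_MHZ bin for each temperature threshold crossed.
    Thresholds are hypothetical, chosen only to illustrate the shape."""
    bins_dropped = sum(1 for t in thresholds if temp_c >= t)
    return base_boost - bins_dropped * BIN_MHZ

print(boost_clock(2088, 35))  # cool card: full 2088 boost
print(boost_clock(2088, 55))  # warmer: three bins lower, 2049
```

So a few degrees either side of a threshold is enough to explain seeing 2088 on a cold card and ~2040-2050 after the loop heats up; it doesn't necessarily mean the radiator is undersized.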


----------



## chantruong

Just received this.



Should humans be allowed such power?


----------



## bujao

Finally installed the EVGA hybrid kit for the 980 Ti on my Titan X. I'm very happy with it; it's quiet. So far it seems I can't get past 2062, but temps are 40°C or lower under load, sometimes up to 44°C. Idles at 23-24°C.

Install was pretty easy.

BTW, current settings are +240/+400.


----------



## carlhil2

Quote:


> Originally Posted by *KillerBee33*
> 
> Ehh , just as expected
> 
> 
> 
> 
> 
> 
> 
> 4 in the row and finally broke the cycle ....
> New Titan cant go over +210 but rolls just fine +800 on the memory ..
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/15193020


You should have cracked 11G with those clocks, maybe the RAM is too high? I might move on to 1080 SLI in order to push Star [email protected], which should be dropping before the year is out... Oops, my bad, you are only at 2063 core..


----------



## DooRules

That backing plate looks awesome MrTOOSHORT.


----------



## hotrod717

I'm sure some of you have seen this. This is is what I'm waiting for - http://www.overclock.net/t/1612538/wccftech-nvidia-gtx-1080-ti-launching-in-january-with-10-8-tflops-12gb-gddr5x-to-deliver-titan-x-pascal-performance-at-a-much-lower-price#post_25549189


----------



## KillerBee33

Quote:


> Originally Posted by *carlhil2*
> 
> You should have cracked 11G with those clocks, maybe ram too high? I might move on to 1080 SLI in order to push Star [email protected] should be dropping before the Year is out... Oops, my bad, you are only at 2063 core..


Got 11K once with those clocks; it mostly stays in the 10900s.
Some lucky bastard is gonna get my refurbished Titan


----------



## cg4200

Hey, your temps seem a little high. I have 2 Black Ice Nemesis 360GTX rads in the loop running a 6700K at 4.8 and my XP runs 40°C max, and that's after hours of gaming. It is cooler now in New England; I played GTA V for 2 hours and temps were 34°C.

What kind of block? I have an EK, and I don't think it is as good a fit as on my 980 Ti or my Titan X. I didn't have my spark plug gap gauge, but if you look at the 4 RAM chips closest to the VRM as you lower the block, it does not make good contact. So I took out the 0.5 mm pad on the VRM and tried again; still not a great fit. I only looked because when I took off the block I could see spots on the other side where the thermal pad left little squares on the copper. Anyway, maybe it's only my block, and I'm not telling anyone to try this, but I replaced the 0.5 mm pad on the VRM with 1 mm and also added 1 mm to the VRM area EK gives you nothing for. I carefully placed it down and it makes good contact on mine. It still does not appear to hit the 4 closest VRAM chips, so I put Thermal Grizzly on those, tightened, and tested. I now see a benefit with RAM at +750, where before anything over +710 would bring the score down.

20160930_173549_resized.jpg 383k .jpg file


20160930_171628.jpg 1977k .jpg file


Also, there seems to be a misconception about the shunt mod. It is easy and takes 25 minutes, but like anything, if you're clumsy, do not try it. Do not use pliers; buy the right tool if you can afford a $1200 card. I placed electrical tape over the chip below the shunt that could catch a drip, then put glue from a glue gun on the tape to make a catch (the glue comes off without damaging the card). Then I put painters tape around the shunt and laid down a thin layer of CLU. Next, blow dry carefully, let it cool, and put on another coat so you make sure to lower the reported TDP. Blow dry again, remove the tape, clean up, done. My card is not a lottery winner, but I get 2114 gaming for hours with no down clocking. I also see many people messing up their cards, and I feel bad for you; make sure to take your time. I wish someone would crack the BIOS already. I took a look and it's over my head. I think we should start a PayPal pool and come up with 500 dollars to make it worth someone's time to crack it?


----------



## cg4200

20160930_173539_resized.jpg 493k .jpg file
Forgot this pic, much better view.


----------



## Artah

Quote:


> Originally Posted by *eliau81*
> 
> Thanks guys I have just won a 100 buck's, lol


There was wishful thinking that they would make a driver, but it's never going to happen.


----------



## bouncingsoul

Quote:


> Originally Posted by *cg4200*
> 
> I took a look am over my head I think we should start a paypal pool and come up with 500.00 dollars to make it worth someones time to crack????


I'm in


----------



## kx11

got Nickel plate in last night


----------



## KillerBee33

Quote:


> Originally Posted by *kx11*
> 
> got Nickel plate in last night


Weird that it's ribbed; I always thought it'd be nice flat nickel.


----------



## kx11

Quote:


> Originally Posted by *KillerBee33*
> 
> Weird that it's ribbed , always thought it'd be nice and flatout nickel .


The first time you take it out and look at it, it's smooth flat nickel, but after a few seconds the texture somehow shows up


----------



## KillerBee33

Quote:


> Originally Posted by *kx11*
> 
> the 1st time you take it out and look at it it's smooth flat nickel but after some seconds the texture shows up somehow


It's fine. I can't believe they didn't include nickel screws with it; I had to use black...


----------



## jeff3206

Has anyone found a workaround to get SLI working in Deus Ex: Mankind Divided? I am getting negative scaling. Thanks.


----------



## bizplan

Quote:


> Originally Posted by *jeff3206*
> 
> Has anyone found a workaround to get SLI working in Deus Ex: Mankind Divided? I am getting negative scaling. Thanks.


Ask this gent: http://forums.guru3d.com/showthread.php?t=409468


----------



## Baasha

Here's a petition to get Nvidia to "unlock" or "enable" 3-way or 4-way SLI on Pascal GPUs: https://goo.gl/331yGY

It would be great if all of us can sign it and get them to take notice.


----------



## FattysGoneWild

Quote:


> Originally Posted by *Baasha*
> 
> Here's a petition to get Nvidia to "unlock" or "enable" 3-way or 4-way SLI on Pascal GPUs: https://goo.gl/331yGY
> 
> It would be great if all of us can sign it and get them to take notice.


SLI has a foot in the grave now as it is. Nvidia has no interest any more in 3-4 way SLI. They know the market best. Customers that want that are few and far between. SLI will be a thing of the past as cards get more powerful. I would not be surprised if they dropped SLI support completely by next year or 2018 at the latest.

You know it and so does everyone else. 3-4 SLI is just for e-peen status. Scaling always has and will be horrific with setups like that in games. Its a complete waste.


----------



## Maxxamillion

I just got my 2 titan x's installed with EK water blocks and the main gpu is 5 degrees higher than the the other. I took it out and reinstalled the block and the temps are still the same. I'm not really sure what else to do about it.


----------



## meson1

Quote:


> Originally Posted by *Maxxamillion*
> 
> I just got my 2 titan x's installed with EK water blocks and the main gpu is 5 degrees higher than the the other. I took it out and reinstalled the block and the temps are still the same. I'm not really sure what else to do about it.


Surely the workload is not usually split exactly between the two. I thought it was normal for the main card to have to work a little harder?


----------



## lowbudgethero

Serial loop?
Quote:


> Originally Posted by *Maxxamillion*
> 
> I just got my 2 titan x's installed with EK water blocks and the main gpu is 5 degrees higher than the the other. I took it out and reinstalled the block and the temps are still the same. I'm not really sure what else to do about it.


picture of your loop?


----------



## Maxxamillion

Quote:


> Originally Posted by *lowbudgethero*
> 
> Serial loop?
> picture of your loop?


My loop is parallel and I had the older titan x's in it before and there was a 2 degree difference. Here are some pics.


----------



## lowbudgethero

If you increase pump speed does it make a difference?


----------



## Maxxamillion

Quote:


> Originally Posted by *lowbudgethero*
> 
> If you increase pump speed does it make a difference?


No it does not.


----------



## ratzofftoya

Just got two of these in my test bench. Benchmarks to come!


----------



## kx11

cool intro man


----------



## Maxxamillion

I took the block off and reseated it for the 3rd time now. While tightening the screws I looked very carefully and made sure nothing got bent. I also switched the positions of the GPUs, so it is now the first to receive liquid. I did use the X formation when putting the paste on, if that matters. The temps still differ by 3-5 degrees. Not sure what the problem is, but I'm about to give up.


----------



## kx11

Quote:


> Originally Posted by *KillerBee33*
> 
> Ehh , just as expected
> 
> 
> 
> 
> 
> 
> 
> 4 in a row and finally broke the cycle....
> New Titan can't go over +210 but rolls along just fine at +800 on the memory.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/15193020


I wonder how your OC core clock gets reported correctly by 3DMark while mine reports wrong values

http://www.3dmark.com/fs/10361626

my GPU can hit +240 core and +550 memory just fine, but the score is bad compared to +520 memory in FS


----------



## Maxxamillion

Are people using pea or X formation when installing their water blocks for this card?


----------



## DooRules

Quote:


> Originally Posted by *kx11*
> 
> I wonder how your OC core clock gets reported correctly by 3DMark while mine reports wrong values


Go to Options in 3DMark and turn on SystemInfo hardware monitoring; it will then show the correct clock, but there is a hit to your score when it runs.


----------



## kx11

Quote:


> Originally Posted by *DooRules*
> 
> Go to Options in 3DMark and turn on SystemInfo hardware monitoring; it will then show the correct clock, but there is a hit to your score when it runs.


I hate that thing, I lose a lot of points because of it


----------



## KillerBee33

Quote:


> Originally Posted by *kx11*
> 
> I wonder how your OC core clock gets reported correctly by 3DMark while mine reports wrong values
> 
> http://www.3dmark.com/fs/10361626
> 
> my GPU can hit +240 core and +550 memory just fine, but the score is bad compared to +520 memory in FS


+520 on the memory vs. +800
http://www.3dmark.com/fs/10366404
http://www.3dmark.com/fs/10366429
Not sure what's wrong with this card








Same run with Sys Info+ Hardware Monitoring OFF


Spoiler: Warning: Spoiler!






http://www.3dmark.com/3dm/15236727


----------



## tin0

Quote:


> Originally Posted by *Maxxamillion*
> 
> Are people using pea or X formation when installing their water blocks for this card?


Rule of thumb:
- For CPUs, use the pea method.
- For GPUs, always use the X method, since the GPU is a bare die with no heatspreader; use enough paste (much more than with the pea method). Most of the time I use the X method and even connect the ends with a small line to make sure the full die is covered (so make a square along the outside of the die and put the X inside it). I must have done over 200 cards that way, whether reseating coolers, installing waterblocks, or doing LN2/phase-change benching. Never had any problems.


----------



## KillerBee33

Quote:


> Originally Posted by *tin0*
> 
> Rule of thumb:
> - For CPUs, use the pea method.
> - For GPUs, always use the X method, since the GPU is a bare die with no heatspreader; use enough paste (much more than with the pea method). Most of the time I use the X method and even connect the ends with a small line to make sure the full die is covered (so make a square along the outside of the die and put the X inside it). I must have done over 200 cards that way, whether reseating coolers, installing waterblocks, or doing LN2/phase-change benching. Never had any problems.


This is about right












I just used a half-inch line in the middle, vertically, on the Titan. 36C in Fire Strike Ultra at 2050 core / 1451 memory, with a 6700K @ 4.6GHz @ 1.34V in the same loop.


----------



## Maxxamillion

Quote:


> Originally Posted by *tin0*
> 
> Rule of thumb:
> - For CPUs, use the pea method.
> - For GPUs, always use the X method, since the GPU is a bare die with no heatspreader; use enough paste (much more than with the pea method). Most of the time I use the X method and even connect the ends with a small line to make sure the full die is covered (so make a square along the outside of the die and put the X inside it). I must have done over 200 cards that way, whether reseating coolers, installing waterblocks, or doing LN2/phase-change benching. Never had any problems.


Could you post a picture of this method? I'm just wondering how big of an X that should be and how it looks all done.


----------



## eliau81

So I want to get started putting the Hybrid 1080 kit on my Titan,
but all I have is the Cooler Master Thermal Fusion 400 paste... would that be good enough?


----------



## KillerBee33

Quote:


> Originally Posted by *eliau81*
> 
> So I want to get started putting the Hybrid 1080 kit on my Titan,
> but all I have is the Cooler Master Thermal Fusion 400 paste... would that be good enough?


In my experience, EVGA's paste ran a 980 @ 1582MHz at 45-50C.
Gelid GC-Extreme ran a TXP at 50-60C with the EVGA kit.
I'm sure you won't go by a single user review, but I doubt it matters much, 3-5 degrees at most.


----------



## axiumone

Quote:


> Originally Posted by *Maxxamillion*
> 
> Could you post a picture of this method? I'm just wondering how big of an X that should be and how it looks all done.


I've been using the pea method on both GPUs and CPUs for as long as I can remember, and I've never been unhappy with temps. The pressure from mounting the cooler spreads the thermal paste as needed.


----------



## eliau81

Quote:


> Originally Posted by *KillerBee33*
> 
> In my experience, EVGA's paste ran a 980 @ 1582MHz at 45-50C.
> Gelid GC-Extreme ran a TXP at 50-60C with the EVGA kit.
> I'm sure you won't go by a single user review, but I doubt it matters much, 3-5 degrees at most.


With my 980 @ 1555MHz +87mV, the EVGA kit with Cooler Master Thermal Fusion 400 got me about 50-60C.
I did order the Gelid GC-Extreme, but it hasn't arrived yet and I don't want to wait any more.


----------



## pez

The pea method usually still works with GPU dies, since the dies are small enough that the paste ends up spreading fully anyway.


----------



## KillerBee33

Quote:


> Originally Posted by *eliau81*
> 
> With my 980 @ 1555MHz +87mV, the EVGA kit with Cooler Master Thermal Fusion 400 got me about 50-60C.
> I did order the Gelid GC-Extreme, but it hasn't arrived yet and I don't want to wait any more.


Just use EVGA's paste as-is.








If you feel like it's not doing the job, you can repaste without taking off the radiator, like I did the first 3 times.


----------



## eliau81

will do and update
thx man


----------



## KillerBee33

Quote:


> Originally Posted by *eliau81*
> 
> will do and update
> thx man


NP, here's what EVGA's factory paste did with a 980, as proof:










Spoiler: Warning: Spoiler!






Look at the bottom right for Temps.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> The monitor Chart or Details
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> You can see where it gets annoying
> 
> 
> 
> 
> 
> 
> 
> 
> Nvidia emailed me a confirmation of the RMA stating Part #*********RETAIL , does that mean it's not refurbished?
> 
> 
> 
> 
> 
> 
> 
> 
> *Any way to remove Thermal Adhesive i used on Komodo block btw*?
> Used Arctic Alumina Thermal Adhesive 5.0 grams


Acetone or methylene chloride.








Quote:


> Originally Posted by *mbze430*
> 
> the first bad Titan XP I replaced came in a completely retail package. Unfortunately that same replacement has been RMA'd for another replacement. Mine will come in tomorrow. So we'll see how this one works out.
> 
> I really hope this one holds, I am tired of sending stuff back (even though they pay for it, I just hate having to drop it off at FedEX)


Nv rule is 3 strikes and you're (SOL) out.








Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Installed the nickel back plate today:
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Those millions of little resistors made me nervous running bare for the last 6 weeks. Feels better to have some armor there now.


Quote:


> Originally Posted by *kx11*
> 
> i hate the thing i lose a lot of points because of it


Looks great T. I agree 100%... leaving all those PCB components "hanging out in traffic" is an oh-sheet moment waiting to happen. The backplate NV provides isn't there for cooling.









( I really like the Intel 750 waterblock.







)


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> Acetone or methylene chloride.


All off.







Just can't find a way to dry the inside of that block, though.


----------



## eliau81

Got some pretty bad results with the EVGA 1080 kit with stock paste:
temps 55-62C
clocks +180/+450 @ 2050MHz ~ 1925MHz
+100mV / 120%
Fire Strike Extreme: 15320 graphics score (previous was 12600)
Valley Benchmark 1.0: 6047
The Witcher at 4K ultra settings with NVIDIA HairWorks off gets me 60-65 FPS


----------



## Woundingchaney

Quote:


> Originally Posted by *eliau81*
> 
> Got some pretty bad results with the EVGA 1080 kit with stock paste:
> temps 55-62C
> clocks +180/+450 @ 2050MHz ~ 1925MHz
> +100mV / 120%
> Fire Strike Extreme: 15320 graphics score (previous was 12600)
> Valley Benchmark 1.0: 6047
> The Witcher at 4K ultra settings with NVIDIA HairWorks off gets me 60-65 FPS


That's not too far off what you can expect with the hybrid cooler. Using a higher-end paste will net you around 3 degrees lower temps. Realistically you are sustaining 20+ degrees cooler under load.


----------



## KillerBee33

Quote:


> Originally Posted by *eliau81*
> 
> Got some pretty bad results with the EVGA 1080 kit with stock paste:
> temps 55-62C
> clocks +180/+450 @ 2050MHz ~ 1925MHz
> +100mV / 120%
> Fire Strike Extreme: 15320 graphics score (previous was 12600)
> Valley Benchmark 1.0: 6047
> The Witcher at 4K ultra settings with NVIDIA HairWorks off gets me 60-65 FPS


That looks about right; I had the same temps on a TXP with Gelid GC-Extreme, which is one of the best on the market. You may gain 1-2 degrees at most with something else. I've seen 60s in 4K gaming.


----------



## Lobotomite430

Quote:


> Originally Posted by *eliau81*
> 
> Got some pretty bad results with the EVGA 1080 kit with stock paste:
> temps 55-62C
> clocks +180/+450 @ 2050MHz ~ 1925MHz
> +100mV / 120%
> Fire Strike Extreme: 15320 graphics score (previous was 12600)
> Valley Benchmark 1.0: 6047
> The Witcher at 4K ultra settings with NVIDIA HairWorks off gets me 60-65 FPS


What is your ambient temp?


----------



## eliau81

Quote:


> Originally Posted by *Lobotomite430*
> 
> What is your ambient temp?


about 24~25c


----------



## Lobotomite430

Quote:


> Originally Posted by *eliau81*
> 
> about 24~25c


That doesn't seem so bad to me, as I can creep into the mid-50s C with an ambient of 21C under heavy load.


----------



## CallsignVega

Haven't checked in for a few weeks. Is there a modded BIOS yet?


----------



## TopicClocker

Quote:


> Originally Posted by *Ghoxt*
> 
> Source: Functional 4 Way SLI Titan X Pascal - in several games - how he did it, in his words.
> 
> Educational/ Info only on how one guy says he did it in his niche "server" setup with listed games
> 
> 
> 
> 
> 
> 
> 
> 
> 
> -Snip-
> 
> Again this is as far as I'm concerned, info only, about a wild one-off. But this is OCN so nothing can be considered out of bounds as we have the crazy 1%'ers here as well.
> 
> P.S., the politics of Nvidia shutting >2 way SLI down, someone else can talk to it. My personal opinion is that ultimately even for me it's too expensive to play with, for it to not work on a whim or driver update etc... I'm going single card next gen.


Fascinating!

I've seen another member of that forum's YouTube videos; he goes by "ThirtyIR" on Guru3D and YouTube and also posted in that thread. The 4-way Titan X Pascal SLI numbers are incredible! And it seems to work fairly well too, judging by both his posts and those of Venturi, who helped him set it up.

I really wish NVIDIA didn't try to stop people from running 4-way SLI; it already works for previous-generation GPUs. If people want to run 4-way SLI, let them, but let them know that NVIDIA might not be actively optimizing for more than 2 GPUs in SLI.

If you want to see ThirtyIR's videos here's a link. It's incredible to see such powerful GPU hardware at work!


----------



## carlhil2

Quote:


> Originally Posted by *CallsignVega*
> 
> Haven't checked in for a few weeks. Is there a modded BIOS yet?


Lol, Volta will drop first....


----------



## profundido

Quote:


> Originally Posted by *Maxxamillion*
> 
> I took the block off and reseated it, for the 3rd time now. While tightening the screws I looked very carefully and made sure nothing got bent. I also switched the positions of the GPUs, so the warmer card is now the first to receive liquid. I did use the X formation when putting the paste on, if that matters. The temps still differ by 3-5 degrees. Not sure what the problem is, but I'm about to give up.


Reverse the entire loop flow at your pump + res. In other words: swap out and in, and watch the magic happen.


----------



## Maxxamillion

Quote:


> Originally Posted by *profundido*
> 
> Reverse the entire loop flow at your pump+res. In other words: swap out and in and watch the magic happen


I'm sorry, I don't entirely understand what you mean. Do you want me to reverse the loop flow, and if so, why would that make a difference?


----------



## profundido

Quote:


> Originally Posted by *Maxxamillion*
> 
> I'm sorry, I don't entirely understand what you mean. Do you want me to reverse the loop flow, and if so, why would that make a difference?


You understand me perfectly, but the existing theory in your head refuses to accept that this could be a possible solution worth the trouble of trying.







Let me take a peek inside your head:

You're a perfectionist, and if the theory doesn't add up, why would the practice, unless the current theory in your head needs to be revised and was wrong all along? After all, the general consensus nowadays is that once you reach and stay above a certain flow rate, the average temperature will not differ by more than roughly 2 degrees Celsius at any point or in any direction, because we're no longer considering static fluid but fluid in motion, where heat never "sticks" to any component, right? Hence you would not expect that a 4-5 degree difference between two side-by-side components could suddenly become more or less equal temps just by reversing the entire loop flow while the components stay side by side... right?

Yet that is exactly what I saw happen when I did it. My previously equal temps became 4-5 degrees different on the TXPs in SLI upon reversing loop order. Believe me, it's better if you first see it work with your own eyes, as I have, before we start on the thermodynamic theory that matches it...

Then again, your loop may respond differently depending on your components, but if you can try my suggestion, please give it a go; compared to all the other things you've apparently done so far, this is easy and worth a try.


----------



## opt33

Quote:


> Originally Posted by *Maxxamillion*
> 
> I just got my two Titan Xs installed with EK water blocks, and the main GPU runs 5 degrees higher than the other. I took it out and reinstalled the block, and the temps are still the same. I'm not really sure what else to do about it.


It is normal for 2 GPUs to have different temps; it's just sensor variation. Intel sensors are +/-5C in accuracy at the calibrated point of Tjmax, and the spread gets wider away from the calibration point. GPU sensors are not any better. If you had 2 GPUs with the same or closer temps, it was just luck in getting similar sensors. I've been water cooling for 15 years and have seen both more and less than a 5C difference with SLI GPUs... all normal.

Don't waste time changing loop order; a 5C temp difference from loop order would require a very-low-flow problem, i.e. a dying pump or an obstruction. For those who measure with calibrated sensors (mine all agree within 0.2C against a Fluke): the water temp going into my components vs. coming out is never more than 1C apart, even at full component load, even when first starting. Accurate testing simply mimics the physics. At 1.5 GPM it takes ~400W to heat the water 1C. At 1 GPM (most setups with decent pumps will be at least that high) it takes ~264W. You can calculate how low your flow would have to be to produce a 5C difference.

Q watts (joules/second) = mdot x Cp x dt = 1.5 gal/min x 8.34 lbs/gal x (1 min / 60 sec) x (0.4536 kg / 1 lb) x 4186 J/(kg C) x 1 C = 396W (or ~264W at 1 GPM)

calculations in parentheses are just unit conversions
mdot = flow x density (of water)
Cp = specific heat capacity (of water here)
dt = delta temp of the water before and after the heat load
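As a sanity check, the heat-balance arithmetic above can be run as a short script (a minimal sketch; the constants and flow rates are the ones quoted in the post):

```python
# Q = mdot * Cp * dT: watts needed to warm the coolant by dT
# at a given volumetric flow rate, using the post's unit conversions.

LBS_PER_GAL = 8.34   # pounds of water per US gallon
KG_PER_LB = 0.4536   # kilograms per pound
CP_WATER = 4186.0    # specific heat of water, J/(kg*C)

def watts_to_heat_water(flow_gpm: float, delta_t_c: float = 1.0) -> float:
    """Heat load in watts that raises the water temp by delta_t_c
    when the loop is flowing at flow_gpm gallons per minute."""
    mdot_kg_per_s = flow_gpm * LBS_PER_GAL * KG_PER_LB / 60.0  # mass flow, kg/s
    return mdot_kg_per_s * CP_WATER * delta_t_c

print(round(watts_to_heat_water(1.5)))  # 396
print(round(watts_to_heat_water(1.0)))  # 264
```

At those flow rates, even a full SLI load of ~600W lifts the water only a degree or two end to end, which is the point being made: a 5C gap between blocks cannot come from loop order unless the flow is badly broken.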


----------



## EniGma1987

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Installed the nickel back plate today:
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Those millions of little resistors made me nervous running bare for the last 6 weeks. Feels better to have some armor there now.


That is really sexy. Both the backplate and your idea to use plain fittings and an extension to do a nice little jump from the CPU to GPU like that.

Quote:


> Originally Posted by *hotrod717*
> 
> I'm sure some of you have seen this. This is what I'm waiting for - http://www.overclock.net/t/1612538/wccftech-nvidia-gtx-1080-ti-launching-in-january-with-10-8-tflops-12gb-gddr5x-to-deliver-titan-x-pascal-performance-at-a-much-lower-price#post_25549189


That is just a rehash of the same rumor they have posted multiple times before, all based on a specs-page image that was proven fake.

Quote:


> Originally Posted by *profundido*
> 
> You understand me perfectly, but the existing theory in your head refuses to accept that this could be a possible solution worth the trouble of trying.
> 
> 
> 
> 
> 
> 
> 
> Let me take a peek inside your head:
> 
> You're a perfectionist, and if the theory doesn't add up, why would the practice, unless the current theory in your head needs to be revised and was wrong all along? After all, the general consensus nowadays is that once you reach and stay above a certain flow rate, the average temperature will not differ by more than roughly 2 degrees Celsius at any point or in any direction, because we're no longer considering static fluid but fluid in motion, where heat never "sticks" to any component, right? Hence you would not expect that a 4-5 degree difference between two side-by-side components could suddenly become more or less equal temps just by reversing the entire loop flow while the components stay side by side... right?
> 
> Yet that is exactly what I saw happen when I did it. My previously equal temps became 4-5 degrees different on the TXPs in SLI upon reversing loop order. Believe me, it's better if you first see it work with your own eyes, as I have, before we start on the thermodynamic theory that matches it...
> 
> Then again, your loop may respond differently depending on your components, but if you can try my suggestion, please give it a go; compared to all the other things you've apparently done so far, this is easy and worth a try.


OK, so first and foremost, you come across like a dick, so you might want to change the elitist attitude.
Second, you seem to be saying to push water from the pump up into the reservoir??? That is what would happen with a simple out/in swap... How do you expect people to push water from the pump, up the reservoir, out the top, and into the other parts? That seems like begging to force air bubbles into your loop.


----------



## pompss

Hey guys

My Titan X overclocks really well but has coil whine, which is very annoying for me.

It's a shame I have to RMA the card, since it overclocks really high.

If someone wants the card, I can sell it for $1300 + shipping, which is what I paid for it.

I'll wait 1 day; after that I have to RMA the card.


----------



## skypine27

Quote:


> Originally Posted by *pompss*
> 
> Hey guys
> 
> My Titan X overclocks really well but has coil whine, which is very annoying for me.
> 
> It's a shame I have to RMA the card, since it overclocks really high.
> 
> If someone wants the card, I can sell it for $1300 + shipping, which is what I paid for it.
> 
> I'll wait 1 day; after that I have to RMA the card.


A new one that doesn't even make noise is only $1200....

http://www.geforce.com/hardware/10series/titan-x-pascal


----------



## havabeer

Hi everyone

Australian owner here; sold my Maxwell Titans for 2x Pascal Titans. Cost a pretty penny to get them shipped out to Aus, but I'm happy with them.

Bought some water blocks for them as well (didn't even try them on air). Originally had them paired with an i5-4690K, but I have upgraded the whole system to an i7-5960X and 16GB of DDR4 RAM. Literally just putting on the finishing touches and installing Windows now.

Currently how it looks









Here are some benchmarks I did on the old system. I couldn't get the cards stable at 2100MHz; not sure if the new system and new PSU will help get there. I really need to give the cards some extra volts to keep them stable.

As mentioned above, I'm getting some coil whine as well. Very happy with the temps so far, though ambient temps were low-ish, so I'll have to wait and see how it handles some 35-degree summer days.










Should point out that all those tests were done at 4K res.

Rather than reading through the 155 pages: is there any development on a custom BIOS in the works? I managed to get my old Titan Xs to 1500MHz stable with one, and I really want to unleash these current Titans, but the lack of voltage support is holding them back.

Cheers


----------



## Jpmboy

^^ No BIOS mods yet... no Pascal BIOS tweaker (due to NV's hash lock). Probably won't happen until the core is released to 3rd-party vendors.


----------



## mbze430

IF they let 3rd-party vendors release any Titan XPs... I'm thinking this is going to be a very big IF.


----------



## DADDYDC650

What is the high end stock boost clock for this card again? How would I know if I have a good card off the bat without overclocking?


----------



## eliau81

Something odd I've noticed:

my clocks always stay at 1417.5MHz
while playing it boosts fine, but while resting and doing nothing it stays at 1417.5MHz
shouldn't it go down to save power or whatnot?


----------



## cg4200

You might want to look in the NV control panel and make sure the power mode is set to Adaptive. If it's set to Prefer Maximum Performance, it won't really downclock much below the clocks you're describing.


----------



## bee144

Quote:


> Originally Posted by *pompss*
> 
> Hey guys
> 
> My titan X overlock really good but have whine coil which is very annoying for me
> 
> Its a shame i have to rma the card since overlocks really high
> 
> If someone want the card i can sell it for 1300 + shipping which is what i paid for it.
> 
> i will wait 1 days after i have to rma the card


No offense, but why would someone buy a $1300 used Titan with a serious coil whine issue when they can hope for a better card directly from NVIDIA, and for cheaper? Oh, and don't forget that the warranty does not transfer.

Someone could buy the card from you tomorrow, and if it broke the next day, they'd be out $1300. Pass on this offer, people.


----------



## Jpmboy

Quote:


> Originally Posted by *DADDYDC650*
> 
> What is the high end stock boost clock for this card again? How would I know if I have a good card off the bat without overclocking?


stock load voltage below 1.05V?


----------



## DADDYDC650

Quote:


> Originally Posted by *Jpmboy*
> 
> stock load voltage below 1.05V?


I'm not on my PC at the moment. Just wanted to know what the max stock boost is running something like the regular Fire Strike benchmark at stock volts. Better-overclocking XPs run about 40MHz higher stock boost, from what I remember.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *DADDYDC650*
> 
> I'm not on my PC at the moment. Just wanted to know what the max stock boost is running something like the regular Fire Strike benchmark at stock volts. Better-overclocking XPs run about 40MHz higher stock boost, from what I remember.


1860MHz stock boost and over is a good card imo.


----------



## MrKenzie

Quote:


> Originally Posted by *havabeer*
> 
> Hi everyone
> 
> Australian owner here; sold my Maxwell Titans for 2x Pascal Titans. Cost a pretty penny to get them shipped out to Aus, but I'm happy with them.
> 
> Bought some water blocks for them as well (didn't even try them on air). Originally had them paired with an i5-4690K, but I have upgraded the whole system to an i7-5960X and 16GB of DDR4 RAM. Literally just putting on the finishing touches and installing Windows now.


It seems stupid that Nvidia doesn't ship to Australia. I bet if we had a warranty claim we would have to ship the Titan X back to the US at our own cost, and that's if it's covered under warranty at all! My one Titan X cost $90 AUD to send with insurance cover. I'm happy I didn't have to pay tax or import duty on it, though.


----------



## piee

I get 1886 stock at 34C and below.


----------



## Sheyster

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> 1860MHz stock boost and over is a good card imo.


LOL - 1860 is EXACTLY my stock boost level.


----------



## Jpmboy

Quote:


> Originally Posted by *DADDYDC650*
> 
> I'm not on my PC at the moment. Just wanted to know what the max stock boost is running something like the regular Fire Strike benchmark at stock volts. Better-overclocking XPs run about 40MHz higher stock boost, from what I remember.


Fire Strike at 1080p really is not the load to get max boost with, but here ya go:
both boost to 1873, but I know one card is much stronger than the other.


----------



## eliau81

Is there any way to read the ASIC quality?


----------



## Jpmboy

Quote:


> Originally Posted by *eliau81*
> 
> is there any way to read ASIC?


not as far as I know.


----------



## EniGma1987

Quote:


> Originally Posted by *DADDYDC650*
> 
> What is the high end stock boost clock for this card again? How would I know if I have a good card off the bat without overclocking?


Most do somewhere around 1850-1900 completely stock out of the box, with no settings touched. That range seems to be the max and min of stock, with the vast majority hitting either 1860 or 1886. Nvidia specifies 1531MHz, though; not sure why, since that isn't even close to what cards actually do at stock.


----------



## lanofsong

Hey Titan X Pascal owners,

We are having our monthly Foldathon from Monday the 17th to the 19th, starting at 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see the attached link.

October Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - needs a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

3. Enter your folding name (mine is the same as my OCN name), enter your passkey, and enter the Team OCN number - 37726
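For anyone who prefers setting the client up by hand, the same three values can go straight into FAHClient's `config.xml` (a sketch; `YourName` and the passkey value are placeholders you'd replace with your own):

```xml
<config>
  <!-- identity used for points; the passkey enables the quick-return bonus -->
  <user value="YourName"/>
  <passkey value="your32characterpasskey"/>
  <!-- Team OCN -->
  <team value="37726"/>
  <!-- fold on the GPU -->
  <slot id="0" type="GPU"/>
</config>
```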

later
lanofsong


----------



## Jpmboy

TXP owners really should participate for the two days! I did last month, and the TXP is simply incredible at folding. Set it and forget it for 48h... ~1.5M points per day. TEAM OCN NEEDS THE POINTS!!!


----------



## lanofsong

^ This, and these GPUs are awesome folders.


----------



## KillerBee33

What exactly is FOLDING?


----------



## lanofsong

Quote:


> Originally Posted by *KillerBee33*
> 
> What exactly is FOLDING?


http://web.stanford.edu/group/pandegroup/folding/FoldingFAQ.pdf


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> What exactly is FOLDING?


It uses the computational capabilities of the GPU to participate in a distributed-computing project on protein folding. You'll be amazed at how powerful these cards are. Way more than any CPU... any!


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> It uses the computational capabilities of the GPU to participate in a distributed-computing project on protein folding. You'll be amazed at how powerful these cards are. Way more than any CPU... any!


I may check that out when I'm home later and leave it running overnight, I guess, but I'm still not sure what it does.


----------



## pompss

Does anyone have coil whine issues with the Titan X? Mine has a terribly high noise level.
I bought a new one days ago, got it today, and I have the same coil whine issue with the second card.
I also ordered another Seasonic Snow White 1050W, and with both PSUs I have the same noise coming from both Titans.

Planning to ship back both cards.
I think it's unacceptable to spend $1200 on a GPU and put up with this annoyingly high level of noise.

Does anyone have the same PSU and coil whine issue?

Any help would be appreciated.
Thanks


----------



## Jpmboy

Quote:


> Originally Posted by *pompss*
> 
> Does anyone have coil whine issues with the Titan X? Mine has a terribly high noise level.
> I bought a new one days ago, got it today, and I have the same coil whine issue with the second card.
> I also ordered another Seasonic Snow White 1050W, and with both PSUs I have the same noise coming from both Titans.
> Planning to ship back both cards.
> I think it's unacceptable to spend $1200 on a GPU and put up with this annoyingly high level of noise.
> Does anyone have the same PSU and coil whine issue?
> Any help would be appreciated.
> Thanks


No coil whine on either card I have... I assume the cards are water cooled, else you would not hear anything but the fan. So make sure the block is not overtightened and that all the required plastic washers are in place. It would be very unusual for multiple cards to have such a problem, unless it is being driven by another component in the system (PSU grounding etc.). If you get another pair and they both whine, it's not the cards - 100% sure.


----------



## pompss

Quote:


> Originally Posted by *Jpmboy*
> 
> No coil whine on either card I have... I assume the cards are water cooled, else you would not hear anything but the fan. So make sure the block is not overtightened and that all the required plastic washers are in place. It would be very unusual for multiple cards to have such a problem, unless it is being driven by another component in the system (PSU grounding etc.). If you get another pair and they both whine, it's not the cards - 100% sure.


I was thinking that too, about the block being overtightened, but after receiving the new card today I noticed the same issue and the same noise.

What I did is set the fan to its minimum and keep it at 25%.

When playing Gears of War 4 I hear the same noise coming from the card, and it wasn't the fan.

Maybe it's the Seasonic Snow White causing this issue? Is it possible that both Seasonic PSUs have the same issue?


----------



## Jpmboy

Quote:


> Originally Posted by *pompss*
> 
> I was thinking that too, about the block being overtightened, but after receiving the new card today I noticed the same issue and the same noise.
> 
> What I did is set the fan to its minimum and keep it at 25%.
> 
> When playing Gears of War 4 I hear the same noise coming from the card, and it wasn't the fan.
> 
> Maybe it's the Seasonic Snow White causing this issue? Is it possible that both Seasonic PSUs have the same issue?


Is that one of the Seasonic "Platinum" PSUs? If yes, they are not the best Seasonic has put out (and they normally make great PSUs).


----------



## pompss

Quote:


> Originally Posted by *Jpmboy*
> 
> Is that one of the Seasonic "Platinum" PSUs? If yes, they are not the best Seasonic has put out (and they normally make great PSUs).


SeaSonic Snow Silent-1050 1050W ATX12V / EPS12V 80 PLUS Platinum

Here's the link: Seasonic

Changed the PCI Express cable, same thing. Still thinking both Titan Xs may be defective.

I've never had a problem with this PSU, and I always use it in various builds.
I also have an AMD RX 480, and I will install that card to see if it has coil whine.

I think if the RX 480 doesn't have coil whine, then it's 100% not the PSU.
Otherwise I will order another PSU from EVGA or Corsair and test it.


----------



## pompss

OK, I tested the AMD RX 480.

I can't hear the same noise as with the Titan X; there is some very light coil whine, but it's very low, which is acceptable.

Tested Doom, and no coil whine at all.

I'm concluding that the Titan X must be the problem here. I have an EVGA 850W PSU in storage; tomorrow I will test it to see if there is coil whine with the Titan X.

Still waiting for NVIDIA to send me the RMA and shipping label.


----------



## pez

Quote:


> Originally Posted by *Jpmboy*
> 
> TXP owners really should participate for the two days! I did last month and the TXP is simply incredible at folding. Set it and forget it for 48h... ~ 1.5M points per day. TEAM OCN NEEDS THE POINTS!!!


Man. That's crazy considering when I last folded, it took me I think a whole week of folding to reach 1 million (hence my badge).


----------



## Leyaena

Anyone here that went from two Titan X (Maxwell)s to a single Titan X (Pascal)?
How did you like the experience?

I'm seriously considering making the switch; SLI nowadays seems to be broken in so many game launches that there have been periods of weeks where I didn't even bother re-enabling SLI after a driver update...

As a bonus question, would you wait for the 1080ti instead?
I'd watercool the card for sure, and the money isn't a huge problem for me but saving a bit is never a bad thing.
That said, there's always something better/faster/cheaper around the corner...


----------



## Edge0fsanity

Quote:


> Originally Posted by *Leyaena*
> 
> Anyone here that went from two Titan X (Maxwell)s to a single Titan X (Pascal)?
> How did you like the experience?
> 
> I'm seriously considering making the switch, SLI nowadays seems to be broken in so many game launches that there's been periods of weeks where I didn't even bother re-enabling sli after a driver update....
> 
> As a bonus question, would you wait for the 1080ti instead?
> I'd watercool the card for sure, and the money isn't a huge problem for me but saving a bit is never a bad thing.
> That said, there's always something better/faster/cheaper around the corner...


I didn't go from 2 TXMs, but I did go from 2 980 Tis @ 1525MHz to a TXP @ 2100MHz, which is close enough. I really ended up liking the switch down to a single card: no more dealing with SLI issues, and games seem a bit smoother, some more so than others. It's nice being able to buy a game on launch and play it right away without issue. I run close to the same settings as with my 980 Tis; I might have to turn AA down one notch from what my 980 Tis ran. I'm likely going to run a single Titan of whatever the current gen is from now on, as they seem to handle 3440x1440 just fine.

Personally, I'd only wait on the 1080 Ti if I was going to buy two of them again.


----------



## Jpmboy

Quote:


> Originally Posted by *pez*
> 
> Man. That's crazy considering when I last folded, it took me I think a whole week of folding to reach 1 million (hence my badge).


Yeah - crazy, right? You should sign up for the foldathon (it's only 2 days).


----------



## Sphere07

Should I actually buy this from Australia?


----------



## iKoshee

Hello everyone - has anybody here been able to resolve Titan XP coil whine? I just got my 2 RMAs today and guess what, both of these cards still have coil whine. I tried them with and without waterblocks. It looks like my cards have coil whine during mid loads, nothing during high loads. I also changed the PSU today, because some people mentioned online that it could be an issue - still no luck. I have always purchased cards from EVGA and never had anything like this; this is my first time purchasing cards directly from NVIDIA and this nightmare started. Do you guys think changing the motherboard might help? Swapping hardware is a bit annoying since I'm hard tubing everything. P.S. NVIDIA's customer service sucks big time - it took them 1 week to RMA my cards, and even for that I had to yell at the support guy. I wish EVGA had Titans :|


----------



## iKoshee

Quote:


> Originally Posted by *pompss*
> 
> Anyone have coil whine issue with the titan x? Mine have terrible high noise level.
> I bought a new one days ago , got it today and i have the same coil whine issue with the second card
> I also ordered another seasonic snow white 1050w and with both psu i have the same noise coming from both titan
> 
> Planning to Ship back both card.
> I think its unacceptable spending $1200 for a gpu and have this annoying high level of noise.
> 
> Has anyone the same psu and coil whine issue?
> 
> Any help would be appreciated.
> Thanks


I went through this as well, except I have an EVGA 1600W SuperNOVA PSU. I RMAed the cards, so 4 of them have had coil whine. Also, today I changed my PSU, and the cards still whine. I don't know what else to do :\


----------



## GunnzAkimbo

Quote:


> Originally Posted by *iKoshee*
> 
> I went through this as well, except I have an EVGA 1600W SuperNOVA PSU. I RMAed the cards, so 4 of them have had coil whine. Also, today I changed my PSU, and the cards still whine. I don't know what else to do :\


VSync


----------



## xarot

I have coil whine too on my card, but after I put it on water and it's inside the case, I cannot hear it at all. The D5 pump is the loudest part in my computer.


----------



## MrTOOSHORT

Recent cards: two Titans, two KPE 780 Tis, one Maxwell Titan and one Pascal Titan - no coil whine.

Odd thing to point out: I had massive coil whine when I installed a nickel backplate from EK on the Maxwell Titan. The regular black EK backplate was OK.

When I went to reinstall the nickel backplate again a couple of months later, the coil whine was a lot better.

Coil whine has to be system/grounding related - it shows up with some systems, some of the time.


----------



## Lobotomite430

Quote:


> Originally Posted by *GunnzAkimbo*
> 
> VSync


Ha no kidding!


----------



## Jpmboy

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Recent cards, two Titans, Two kpe 780tis, one Maxwell Titan and one Pascal Titan, no coil whine.
> 
> Odd thing to point out, had massive coil whine when I installed a nickel back plate from ek on the Maxwell Titan. The regular black ek back plate was ok.
> 
> Went to reinstall the nickel back plate again a couple months later, the coil whine was a lot better.
> 
> Has to be system, grounding related with some systems and coil whine some of the time.


Yeah - unfortunately, the same users keep RMAing cards claiming coil whine... one, maybe, but 4? I'm 100% sure it is not the cards whining.


----------



## TremF

Quote:


> Originally Posted by *Leyaena*
> 
> Anyone here that went from two Titan X (Maxwell)s to a single Titan X (Pascal)?
> How did you like the experience?
> 
> I'm seriously considering making the switch, SLI nowadays seems to be broken in so many game launches that there's been periods of weeks where I didn't even bother re-enabling sli after a driver update....
> 
> As a bonus question, would you wait for the 1080ti instead?
> I'd watercool the card for sure, and the money isn't a huge problem for me but saving a bit is never a bad thing.
> That said, there's always something better/faster/cheaper around the corner...


I moved from a GTX Titan X (Maxwell) SLI setup to a single Titan X Pascal and it has been brilliant. For games where SLI doesn't work, or for VR where SLI isn't utilised, the TXP is massively faster than one GTX TX; games where SLI did work are now running super smooth. I never thought I noticed SLI stutter previously, but seeing how smoothly one card runs, I am sure I had it.

I have now also upgraded my 27" 1440p monitor to a 32" 4K G-Sync monitor, as most games were running at 4K downscaled to 1440p previously anyway, but for any that drop below 60 FPS the G-Sync now smooths it out. Anything over 60 FPS is managed with Fast Sync, which really works well.

I thought about the 1080 Ti, but I wanted a faster single card now, and the TXP was a big enough jump in performance for me to move from the previous GTX TX.

Weirdly enough, I haven't actually bothered with VR since getting the TXP and 4K monitor. I'm loving the new level of gaming


----------



## ratzofftoya

Just put up a video of my Titan X SLI benchmarks:


----------



## Maintenance Bot

Quote:


> Originally Posted by *ratzofftoya*
> 
> Just put up a video of my Titan X SLI benchmarks:


Nice video!

How did you get SLI working for The Division? I thought it was forever broken.


----------



## Lobotomite430

Quote:


> Originally Posted by *Jpmboy*
> 
> yeah - unfortunately the same users keep RMAing cards claiming coil whine... one, maybe, but 4 and I'm 100% sure it is not the cards whining.


They don't even realize that a cooler does not cause coil whine; it would still be present without any cooling solution.


----------



## X3NEIZE

Hmm, looking at some results here, it seems my Titan X setup may not be performing as well as I thought...

Running a 6850K with Titan X SLI, my 3DMark scores are as follows:

Firestrike: http://www.3dmark.com/fs/10164052

Firestrike Extreme: http://www.3dmark.com/fs/10447138

Firestrike Ultra: http://www.3dmark.com/fs/10107982

Timespy: http://www.3dmark.com/spy/444210

These are with OC (~150 core / 300 mem)

What do you think is happening here? Are these numbers normal?

I just saw JayzTwoCents on YouTube; his Titan X SLI is getting a 54k graphics score vs my 48k... what is my issue?


----------



## EniGma1987

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Nice video!
>
> How did you get SLI working for The Division? I thought it was forever broken.


When did SLI supposedly break? It worked fine from launch of the game till I stopped playing it 2-3 months in.


----------



## GosuPl

Quote:


> Originally Posted by *X3NEIZE*
> 
> Hmm looking at some results here suggests my Titan X setup may not be performing as well as I thought...
> 
> Running 6850k with Titan X SLI my 3DMark scores are as follow:
> 
> Firestrike: http://www.3dmark.com/fs/10164052
> 
> Firestrike Extreme: http://www.3dmark.com/fs/10447138
> 
> Firestrike Ultra: http://www.3dmark.com/fs/10107982
> 
> Timespy: http://www.3dmark.com/spy/444210
> 
> These are with OC (~150 core / 300 mem)
> 
> What do you think is happening here? are these numbers normal?
> 
> I just saw Jayztwocents on you tube, his Titan X SLI is getting a 54k graphic score vs my 48k... what is my issue


You will need an SLI HB bridge.

My scores from when I had 2x TX P (I've now returned to my watercooled 2x TX M and want to skip Pascal ;-) ):

http://www.3dmark.com/compare/fs/10447138/fs/9745619 - you have much worse FPS and scores

http://www.3dmark.com/compare/spy/444210/spy/279601 - Time Spy is OK

In your Extreme score, your FPS is not much better than my 2x TX M:

http://www.3dmark.com/compare/fs/10447138/fs/9991936

I think there are two possible reasons:

1. You don't use an SLI HB bridge
2. You have Vsync ON


----------



## X3NEIZE

Quote:


> Originally Posted by *GosuPl*
> 
> You will need SLI HB bridge.
> 
> My scores when i have 2x TX P (now return to watercooled 2x TX M and want skip Pascal ;-) )
> 
> http://www.3dmark.com/compare/fs/10447138/fs/9745619 - You have much worse fps and scores
> 
> http://www.3dmark.com/compare/spy/444210/spy/279601 - Time Spy is ok
> 
> On your extreme score, you have fps not much more better than my 2x TX M.
> 
> http://www.3dmark.com/compare/fs/10447138/fs/9991936
> 
> I think this is two possible reasons.
> 
> 1.You dont use SLI HB
> 2.You have Vsync ON


I do have an HB SLI bridge... and V-Sync is clearly off, otherwise the FPS would top out at 100...

I just benched one card with no OC, and the results seem to be in line with the reviews out there, so I'm guessing there's something wrong with my bridge??

http://www.3dmark.com/3dm/15404931?


----------



## GunnzAkimbo

Quote:


> Originally Posted by *Lobotomite430*
> 
> Ha no kidding!


----------



## Maintenance Bot

Quote:


> Originally Posted by *EniGma1987*
> 
> When did SLI supposedly break? It worked fine from launch of the game till I stopped playing it 2-3 months in.


Probably an issue on my end somewhere.


----------



## X3NEIZE

Quote:


> Originally Posted by *GosuPl*
> 
> You will need SLI HB bridge.
> 
> My scores when i have 2x TX P (now return to watercooled 2x TX M and want skip Pascal ;-) )
> 
> http://www.3dmark.com/compare/fs/10447138/fs/9745619 - You have much worse fps and scores
> 
> http://www.3dmark.com/compare/spy/444210/spy/279601 - Time Spy is ok
> 
> On your extreme score, you have fps not much more better than my 2x TX M.
> 
> http://www.3dmark.com/compare/fs/10447138/fs/9991936
> 
> I think this is two possible reasons.
> 
> 1.You dont use SLI HB
> 2.You have Vsync ON


I actually figured out what it was...

G-Sync... I disabled G-Sync and:

http://www.3dmark.com/3dm/15406847

I'll have to play with the OCs, but it looks like G-Sync is the devil for 3DMark...


----------



## kx11

Solid 4K 60 FPS gameplay on ultra settings in the BF1 campaign.

I think the highest temp I got was 50C on the CPU/GPU, but I was running the silent cooling mode, so it's alright.


----------



## MrKenzie

I finally put my Titan XP in my cooling loop and I'm pretty happy with the results. At the moment I have settled on +220 core / +400 memory which has yet to crash the driver. My GTX 1080 would not clock above +190 on the core so I'm happy with the Titan's overclock!

During gaming it is pretty solid at around 2114, with slightly higher on occasion; I think it was Rise of the Tomb Raider where it would occasionally drop to 2064 or so, likely because of the power limit.

So far my GPU core temp has not gone above 25C even after 3+ hours of gaming in Deus Ex, with an ambient of 25C (heater on).
My old 780 Ti would hit 32C with the same cooling setup; the Titan is much more efficient at dissipating heat.

I can't post Firestrike results as it won't run since the latest windows update!


----------



## Ghostface

To those with the Coil Whine I have the exact same issue and it's pretty disheartening after spending a fair bit of cash and time to get the ultimate gaming rig to hear unbelievably annoying whine, even at a reasonable distance from the case.

I've had it on 2 different Titan X Pascal cards now, and it's a lot worse than my 980 Ti was.

I have just replaced my Corsair AX860i Platinum PSU with a Super Flower 1000W Titanium PSU: exactly the same level of coil whine. I originally had a 600W Seasonic Gold PSU which I thought might be to blame, so that's 3 high-quality PSUs and no fix.

To quash the usual theories:

- V-Sync does not fix the issue at all
- If I run my monitor at 60Hz or 100Hz, the whine happens in the same places in the same games
- I have replaced my motherboard and PSU, swapped the GPU and power cables, and rewired my case several times over; I am using some of the best components available
- It's not a high frame rate that triggers it; it happens in the mid range
- PSU quality is not the issue
- I have tried a mains conditioner on the PC and monitor, and also on the usual power drains in the house (washing machine, fridge/freezer etc.) to see if they were causing an issue with the power in the house
- Different overclocking levels on the CPU and GPU make no difference
- It's definitely coming from the card and not my perception

After water cooling it twice over, there is no way I am taking it apart to send it back, as I know the replacement card will have exactly the same issue. For those convinced it isn't the card, I'd like to hear what you think the cause is, as everything has been swapped out at least twice over and mixed and matched with exactly the same result at the end.


----------



## ratzofftoya

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Probably an issue on my end somewhere.


Yeah this has not come up for me.


----------



## DADDYDC650

Seems like most of these bad boys can do +200 core and +500 memory no problem. Both XPs I've owned have similar OCs.


----------



## X3NEIZE

Quote:


> Originally Posted by *MrKenzie*
> 
> I finally put my Titan XP in my cooling loop and I'm pretty happy with the results. At the moment I have settled on +220 core / +400 memory which has yet to crash the driver. My GTX 1080 would not clock above +190 on the core so I'm happy with the Titan's overclock!
> 
> During gaming it is pretty solid at around 2114 with slightly higher on occasion and I think it was "rise of the tomb raider" where it would drop to 2064 or so occasionally likely because of power limit.
> 
> So far my GPU core temp has not gone above 25c even after 3+ hours of gaming in Deus Ex, with an ambient of 25c (heater on).
> My old 780Ti would hit 32c with the same cooling setup, the Titan is much more efficient at dissipating heat..
> 
> I can't post Firestrike results as it won't run since the latest windows update!


25C?? Wow, what's your room temperature? With my custom loop (a great system) with EK waterblocks and dual rads, I'm looking at the 40s while heavy gaming... and my room temperature is ~75F.


----------



## Lobotomite430

Quote:


> Originally Posted by *DADDYDC650*
> 
> Seems like most of these bad boys can do +200 core and +500 memory np. Both XP's I've owned have similar OC's.


But +200 could mean anywhere from 1900MHz to 2200MHz depending on how the card boosts - still, these cards do overclock nicely.
Quote:


> Originally Posted by *Ghostface*
> 
> To those with the Coil Whine I have the exact same issue and it's pretty disheartening after spending a fair bit of cash and time to get the ultimate gaming rig to hear unbelievably annoying whine, even at a reasonable distance from the case.
> 
> Had it on 2 different Titan X Pascal cards now, a lot worse than my 980ti was.
> 
> I have just replaced my Corsair AX860i Platinum PSU for a Super Flower 1000watt Titanium PSU, exactly the same level of coil whine. I originally had a 600watt Seasonic Gold PSU which I thought might be to blame, so that's 3 high quality PSU's and no fix.
> 
> To quash the usual theories:
> 
> V-Sync does not fix the issue at all
> If I run my monitor at 60hz or 100hz, the whine happens in the same places in the same games
> I have replaced my motherboard, PSU, swapped GPU, power cables and rewired my case several times over. I am using some of the best components available
> It's not high frame rate which triggers it, it happens at mid range
> PSU quality is not the issue
> I have tried a mains conditioner on the PC and Monitor and also on the usual Power Drains in the house (washing machine, fridge/freezer etc) to see if they were causing an issue with power in the house
> Different overclocking levels on CPU and GPU make no difference
> It's definitely coming from the card and not my perception
> After water cooling it two times over, there is no way I am taking it apart to send it back as I know the replacement card will have exactly the same issue. For those convinced it isn't the card, I'd like to hear what you think is the cause as everything has been swapped out at least 2 times over and mixed and matched with exactly the same result at the end.


Coil whine can exist in pretty much any electronic component. The only time I have had issues with it was in game menus with V-Sync off and FPS at 200+.
What games are you trying, and what FPS are you seeing? It's certainly possible you are just super unlucky in the hardware lottery and got some bad hardware.
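The offset-versus-effective-clock point above can be sketched with a bit of throwaway arithmetic. The stock boost figures below are made-up examples for illustration (every card's boost bins differ); they are not measurements of any particular Titan XP:

```python
# Hypothetical sketch: the same +200 MHz offset lands at different effective
# clocks depending on the card's out-of-the-box boost behavior.
# The stock boost numbers here are illustrative, not measured values.

def effective_clock(stock_boost_mhz: int, offset_mhz: int) -> int:
    """Rough effective clock: observed stock boost plus the applied offset."""
    return stock_boost_mhz + offset_mhz

# Two hypothetical cards, both given the same +200 offset:
low_binned = effective_clock(1700, 200)   # card that boosts low at stock
high_binned = effective_clock(2000, 200)  # card that boosts high at stock
print(low_binned, high_binned)  # 1900 2200
```

In practice GPU Boost also moves clocks around with temperature and power limits during a session, so this only captures why the same offset reads so differently from card to card.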


----------



## Jpmboy

Quote:


> Originally Posted by *iKoshee*
> 
> Hello everyone - has anybody here been able to resolve Titan XP coil whine? I just got my 2 RMAs today and guess what, both of these cards still have coil whine. I tried them with and without waterblocks. It looks like my cards have coil whine during mid loads, nothing during high loads. I also changed the PSU today, because some people mentioned online that it could be an issue - still no luck. I have always purchased cards from EVGA and never had anything like this; this is my first time purchasing cards directly from NVIDIA and this nightmare started. Do you guys think changing the motherboard might help? Swapping hardware is a bit annoying since I'm hard tubing everything. P.S. NVIDIA's customer service sucks big time - it took them 1 week to RMA my cards, and even for that I had to yell at the support guy. I wish EVGA had Titans :|


Quote:


> Originally Posted by *Ghostface*
> 
> To those with the Coil Whine I have the exact same issue and it's pretty disheartening after spending a fair bit of cash and time to get the ultimate gaming rig to hear unbelievably annoying whine, even at a reasonable distance from the case.
> 
> Had it on 2 different Titan X Pascal cards now, a lot worse than my 980ti was.
> 
> I have just replaced my Corsair AX860i Platinum PSU for a Super Flower 1000watt Titanium PSU, exactly the same level of coil whine. I originally had a 600watt Seasonic Gold PSU which I thought might be to blame, so that's 3 high quality PSU's and no fix.
> 
> To quash the usual theories:
> 
> V-Sync does not fix the issue at all
> If I run my monitor at 60hz or 100hz, the whine happens in the same places in the same games
> I have replaced my motherboard, PSU, swapped GPU, power cables and rewired my case several times over. I am using some of the best components available
> It's not high frame rate which triggers it, it happens at mid range
> PSU quality is not the issue
> I have tried a mains conditioner on the PC and Monitor and also on the usual Power Drains in the house (washing machine, fridge/freezer etc) to see if they were causing an issue with power in the house
> Different overclocking levels on CPU and GPU make no difference
> It's definitely coming from the card and not my perception
> After water cooling it two times over, there is no way I am taking it apart to send it back as I know the replacement card will have exactly the same issue. For those convinced it isn't the card, I'd like to hear what you think is the cause as everything has been swapped out at least 2 times over and mixed and matched with exactly the same result at the end.


Coil whine is caused by a frequency harmonic in the power section (there are no "coils" in a modern GPU architecture). One possible electronic cause is a ground loop (mostly when using dual PSUs) combined with poor power isolation - sometimes running a wire from one PSU mounting screw to a clean chassis ground can help. You can check this with a DMM between the PSU and the chassis; it should read ZERO volts. Since you seem to have eliminated that possibility (the Super Flower PSU is solid), the only other cause - aside from incredibly bad luck in the lottery; the probability of getting 2 or 4 cards with the same manufacturing defect is really low - is mechanical... usually unevenly tightened block mounts. No need to remount, just re-torque the screws. And check that the I/O panel bracket is not applying pressure in either direction in the slot. The card should be self-centering (though this is really a rare problem).
Of the two dozen or so cards I've had in the past 3 years, I can recall only one that "buzzed", and that was at stupid-high VDDC (3x 780 Ti Kingpins) when the Unigine credits would pop up at the end of Valley or Heaven, using two 1200W PSUs.
Look at it this way: if your 980 Ti had whine, and every card put in the rig does too... it's the rig, not the cards.
IDK bud, 1 card - okay, bad luck... 2 or more - it's not the cards, by all reasoning.
Quote:


> Originally Posted by *X3NEIZE*
> 
> Hmm looking at some results here suggests my Titan X setup may not be performing as well as I thought...
> 
> Running 6850k with Titan X SLI my 3DMark scores are as follow:
> 
> Firestrike: http://www.3dmark.com/fs/10164052
> 
> Firestrike Extreme: http://www.3dmark.com/fs/10447138
> 
> Firestrike Ultra: http://www.3dmark.com/fs/10107982
> 
> Timespy: http://www.3dmark.com/spy/444210
> 
> These are with OC (~150 core / 300 mem)
> 
> What do you think is happening here? are these numbers normal?
> I just saw Jayztwocents on you tube, his Titan X SLI is getting a 54k graphic score vs my 48k... what is my issue


Don't forget to post your benchmarks in the OCN bench threads:
http://www.overclock.net/t/1518806/firestrike-ultra-top-30/0_20
http://www.overclock.net/t/1443196/firestrike-extreme-top-30
http://www.overclock.net/t/1464813/3d-mark-11-extreme-top-30
http://www.overclock.net/t/872945/top-30-3d-mark-13-fire-strike-scores-in-crossfire-sli
http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores
http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad
http://www.overclock.net/t/1406832/single-gpu-firestrike-top-30
http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/0_20


----------



## EniGma1987

Quote:


> Originally Posted by *Ghostface*
> 
> After water cooling it two times over, there is no way I am taking it apart to send it back as I know the replacement card will have exactly the same issue. For those convinced it isn't the card, I'd like to hear what you think is the cause as everything has been swapped out at least 2 times over and mixed and matched with exactly the same result at the end.


Did you use nylon washers on all the bolts that secure the waterblock to the GPU except for the ones that connect the block to the IO bracket?


----------



## Ghostface

Thank you for the replies.

In terms of games where I get it. I get it heavy on the menus for Shadows of Mordor, The Division, Sniper Elite V2, Splinter Cell Conviction. In the games if I have V-Sync on and keep the overclock on the monitor off, I am at a solid 60FPS. The Division is the only game where I don't hit 100fps if I overclock the monitor to 100hz with G-Sync and V-Sync on. The whine comes in at both 60fps and anywhere between 60 and 100fps. In Splinter Cell Conviction during gameplay it quietens down in tight corridor sections and then opens up and whines like crazy in the more open sections, even with no change in the frame rate.

I am actually away until Sunday evening but I will try loosing the screws on the waterblock and backplate. I have used the washers which came with the EK waterblock as the instructions stated, the backplate itself didn't come with any washers so the screws are just straight in on those. I will try removing the chassis screws to hold the card in place as I know the case wasn't perfect for the card (Phantek Enthoo Primo) so I think there is a bit of tension there holding the card in place.

I appreciate the help so far and will post back to see if loosening the screws and removing the chassis screws altogether makes a difference. What's been so frustrating so far is the sheer amount of different things I have tried and it seems so unlikely that through different motherboards, PSU's and graphics cards that the cause is the GPU.


----------



## CallsignVega

Quote:


> Originally Posted by *Ghostface*
> 
> Thank you for the replies.
> 
> In terms of games where I get it. I get it heavy on the menus for Shadows of Mordor, The Division, Sniper Elite V2, Splinter Cell Conviction. In the games if I have V-Sync on and keep the overclock on the monitor off, I am at a solid 60FPS. The Division is the only game where I don't hit 100fps if I overclock the monitor to 100hz with G-Sync and V-Sync on. The whine comes in at both 60fps and anywhere between 60 and 100fps. In Splinter Cell Conviction during gameplay it quietens down in tight corridor sections and then opens up and whines like crazy in the more open sections, even with no change in the frame rate.
> 
> I am actually away until Sunday evening but I will try loosing the screws on the waterblock and backplate. I have used the washers which came with the EK waterblock as the instructions stated, the backplate itself didn't come with any washers so the screws are just straight in on those. I will try removing the chassis screws to hold the card in place as I know the case wasn't perfect for the card (Phantek Enthoo Primo) so I think there is a bit of tension there holding the card in place.
> 
> I appreciate the help so far and will post back to see if loosening the screws and removing the chassis screws altogether makes a difference. What's been so frustrating so far is the sheer amount of different things I have tried and it seems so unlikely that through different motherboards, PSU's and graphics cards that the cause is the GPU.


Did you test the outlet you are plugged into for a good ground and correct polarity? What about using an uninterruptible power supply? I've used a UPS for over a decade, and in that time I've probably had 40+ video cards with zero coil whine. This is with open-air cases, too. My Titan XPs also have zero coil whine.


----------



## jsutter71

Hello all... I've been watching this thread for a while now in anticipation of selling my 980 Tis and upgrading to TXPs. Last week I sold 2 of my 980 Tis, and I will be selling the 3rd once its replacement arrives Monday from EVGA. Long story short, it died and I had to RMA it. *Last night I took the plunge and purchased 2 TXPs and an HB bridge from Nvidia with next-day shipping. How long does Nvidia take to ship them out?*


----------



## jodasanchezz

Finally, after nearly 2 weeks, I got an answer from NVIDIA. I RMAed my Titan because of this (all settings stock).

They sent me this back:

900-1G611-2500-000, NVIDIA TITAN X (PG611-A00), Retail, SKU0

In Germany the RMA support is not bad, but it takes more time...


----------



## MrTOOSHORT

Quote:


> Originally Posted by *jsutter71*
> 
> Hello all....I've been watching this thread for a while now in anticipation of selling my 980Ti's and upgrading to TXPs. Last week I sold 2 of my 980Ti's and will be selling the 3rd once it's replacement arrives Monday from EVGA. Long story short. It died and I had to RMA it. *Last night I took the plunge and purchased 2 TXPs and HB bridge from Nvidia with next day shipping. How long does Nvidia take to ship them out?*


I ordered Aug 2nd and received my card Aug 3rd and I'm in Canada too.


----------



## mbze430

When I bought my TXPs I didn't use express service; it took about 5 days (2 days to process the order and 3 days in actual transit).


----------



## iTurn

Is this a good standard measurement to go by "Titan Xp is 30-33% faster than the GTX 1080"?
http://www.eurogamer.net/articles/digitalfoundry-2016-nvidia-titan-x-pascal-review


----------



## jsutter71

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> I ordered Aug 2nd and received my card Aug 3rd and I'm in Canada too.


Interesting. I still have not received the second email from Nvidia regarding the status; seems slow to me. I also ordered the water blocks yesterday from MAINFrame Customs, and the tracking email said they will arrive tomorrow. It's a shame that Nvidia does not allow other companies to sell this card; I would have preferred to purchase from EVGA if the option were available.


----------



## meson1

Well that's gone and done it. One Titan XP ordered from nvidia.co.uk; free delivery.

Now I need a waterblock. A Heatkiller IV in nickel looks like the favourite.


----------



## animeowns

Quote:


> Originally Posted by *Leyaena*
> 
> Anyone here that went from two Titan X (Maxwell)s to a single Titan X (Pascal)?
> How did you like the experience?
> 
> I'm seriously considering making the switch, SLI nowadays seems to be broken in so many game launches that there's been periods of weeks where I didn't even bother re-enabling sli after a driver update....
> 
> As a bonus question, would you wait for the 1080ti instead?
> I'd watercool the card for sure, and the money isn't a huge problem for me but saving a bit is never a bad thing.
> That said, there's always something better/faster/cheaper around the corner...


Going from a Maxwell Titan to 3 Pascal Titan XPs. Currently using 1 card; my 2nd one is here, and I'm waiting on my 3rd to show up.


----------



## willverduzco

Quote:


> Originally Posted by *iTurn*
> 
> Is this a good standard measurement to go by "Titan Xp is 30-33% faster than the GTX 1080"?
> http://www.eurogamer.net/articles/digitalfoundry-2016-nvidia-titan-x-pascal-review


Maybe at stock, but if you OC both to max speeds, the difference jumps to about 40-45%. (Max 100% stable clocks were 2126 for me on my TXP, and 2176 for my 1080 when I had a single card, and 2100 when I had dual 1080s for a week.)

At equal clocks (water cooling required for stable 2000+ without hairdryer noise), the TXP has 40% more shader cores, and thus 40% more raw processing power than a 1080. That said, it can use the shaders that it has slightly more efficiently since it faces less of a memory bandwidth constraint (50% more bandwidth for 40% higher shader count than a 1080). As an aside, the full fat GP102 (used in the P6000) would probably demonstrate the same kind of bandwidth- rather than compute-limited behavior at high resolutions as the 1080 does--though since the bandwidth is higher, the resolutions would also be correspondingly higher (e.g. 5k). To see this in action, look at a 1080 vs 980ti. In low to mid res (e.g. 1440p), the 1080 has a pretty huge lead, but this declines the higher the resolution due to bandwidth constraints on the 1080 which don't affect the 980ti (a much slower card in compute, but not that much lower bandwidth) to the same degree.
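To put rough numbers on it, here's the back-of-the-envelope arithmetic (the shader counts and bandwidth figures are the published specs for these two cards; just a sketch, not a benchmark):

```python
# Published FP32 shader counts and memory bandwidth (GB/s) for each card.
specs = {
    "GTX 1080":       {"shaders": 2560, "bandwidth": 320},
    "Titan X Pascal": {"shaders": 3584, "bandwidth": 480},
}

# Ratio of TXP to 1080 for each metric.
shader_ratio = specs["Titan X Pascal"]["shaders"] / specs["GTX 1080"]["shaders"]
bandwidth_ratio = specs["Titan X Pascal"]["bandwidth"] / specs["GTX 1080"]["bandwidth"]

print(f"Shader advantage:     {shader_ratio:.0%}")       # 140% -> 40% more cores
print(f"Bandwidth advantage:  {bandwidth_ratio:.0%}")    # 150% -> 50% more bandwidth
print(f"Bandwidth per shader: {bandwidth_ratio / shader_ratio:.2f}x")
```

The last line is why the TXP is slightly less bandwidth-starved per shader than a 1080 at the same clocks.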


----------



## iTurn

Quote:


> Originally Posted by *willverduzco*
> 
> Maybe at stock, but if you OC both to max speeds, the difference jumps to about 40-45%. (Max 100% stable clocks were 2126 for me on my TXP, and 2176 for my 1080 when I had a single card, and 2100 when I had dual 1080s for a week.)
> 
> At equal clocks (water cooling required for stable 2000+ without hairdryer noise), the TXP has 40% more shader cores, and thus 40% more raw processing power than a 1080. That said, it can use the shaders that it has slightly more efficiently since it faces less of a memory bandwidth constraint (50% more bandwidth for 40% higher shader count than a 1080). As an aside, the full fat GP102 (used in the P6000) would probably demonstrate the same kind of bandwidth- rather than compute-limited behavior at high resolutions as the 1080 does--though since the bandwidth is higher, the resolutions would also be correspondingly higher (e.g. 5k). To see this in action, look at a 1080 vs 980ti. In low to mid res (e.g. 1440p), the 1080 has a pretty huge lead, but this declines the higher the resolution due to bandwidth constraints on the 1080 which don't affect the 980ti (a much slower card in compute, but not that much lower bandwidth) to the same degree.


Sweet thanks! Made my choice easier.


----------



## X3NEIZE

Can someone help me understand how to successfully OC a Titan X in SLI? My core clocks seem to be all over the place even though my temperatures are well in check at ~55C (watercooled).

I'm using Precision XOC and it seems I can't hit an ideal OC. My max graphics score in Firestrike is ~51k.

Anyone experienced care to assist? What am I not doing here? My best score was achieved with +160 core / +200 mem, Power Target at 120%, and the max temperature target.

Any help will be appreciated.

Edit (some observations):

1) By analyzing the monitor graphs, it looks like the Titan (or maybe Pascal cards in general) starts to throttle at around 49-50C and then downclocks further as it closes in on 60C. Is this actually how these cards were designed? Is that what GPU Boost 3.0 does?

2) I keep hitting the power limit on both cards very frequently. Why is this happening?


----------



## jhowell1030

Congrats? None of that really answers his questions though.









Quote:


> Originally Posted by *animeowns*
> 
> going from titan maxwell to 3 titan xp's pascal currently using 1 card my 2nd one is here Im waiting on my 3rd to show.


----------



## Baasha

After having 4-Way SLI for several weeks, I tried just SLI last week and it holds its own up to 5K which is quite amazing tbh.

Once you bump up the resolution to 8K, that's when the 4 cards really shine.

Finally got Rise of the Tomb Raider to work with 4-Way SLI - getting around 105-125 FPS at 5K maxed out in DX12.


----------



## piee

Noticed the power limit reading goes to 213 at 2K; the 4K reading sits a little below the 2K one.


----------



## piee

At 2K the power limit reading is 213; at 4K it's 203, in BF4 gaming.


----------



## xfachx

Quick question if anyone knows: will a 1080 reference waterblock fit on the Titan X Pascal card? There is one on Amazon from XSPC which I can have tomorrow vs waiting on the one sold on EKWB. Just curious!


----------



## willverduzco

Quote:


> Originally Posted by *xfachx*
> 
> Quick question if anyone knows: will a 1080 reference waterblock fit on the Titan X Pascal card? There is one on Amazon from XSPC which I can have tomorrow vs waiting on the one sold on EKWB. Just curious!


The same waterblocks won't work because the VRM configuration is different. For one, the TXP has an added inductor near the top of the card that isn't present on the 1080. I'm sure there are a bunch of other differences in layout that would make it incompatible as well.


----------



## xfachx

Quote:


> Originally Posted by *willverduzco*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xfachx*
> 
> Quick question if anyone knows: will a 1080 reference waterblock fit on the Titan X Pascal card? There is one on Amazon from XSPC which I can have tomorrow vs waiting on the one sold on EKWB. Just curious!
> 
> 
> 
> The same waterblocks won't work because the VRM configuration is different. For one, the TXP has an added inductor near the top of the card that isn't present on the 1080. I'm sure there are a bunch of other differences in layout that would make it incompatible as well.
Click to expand...

Thanks for the quick reply! I will be patient with EK then!


----------



## Vellinious

Quote:


> Originally Posted by *xfachx*
> 
> Thanks for the quick reply! I will be patient with EK then!


Aquacomputer's Kryographics blocks for the Titan are out. They're awesome blocks.


----------



## animeowns

Quote:


> Originally Posted by *jhowell1030*
> 
> Congrats? None of that really answers his questions though.


Sorry, I did not read the whole post. To answer his question: from my testing it should be equal or better in some cases, but I can't see Nvidia releasing a 1080 Ti below $800 early next year.


----------



## hotrod717

Quote:


> Originally Posted by *KillerBee33*
> 
> Weird that it's ribbed , always thought it'd be nice and flatout nickel .


Tooling marks from face milling. You'd think they would have finished it better, or flagged it in QC before plating.


----------



## MrKenzie

Quote:


> Originally Posted by *X3NEIZE*
> 
> 25C?? wow what's your room temperature? My custom loop (great system) with EK waterblocks, dual rad, i'm looking at 40's while heavy gaming.... my room temperature is ~75c


I use an aquarium chiller in the loop; I have no radiators or fans. I decided to go this way and I don't regret it. My PC only has case fans running when I'm not gaming, and the aquarium chiller turns on when the liquid temp rises above 30c and shuts off at 15c. The temp settings are fully adjustable, but during extended gaming the liquid temp seems to settle around 17-18c.


----------



## SlayVus

So this is kind of obscure and probably won't get a true fix, but I'm having issues with my Titan X downclocking during the X3 Terran Conflict rolling demo. The core clock will run at around 1200 MHz and the VRAM at 640 MHz. I finish the benchmark with an average of ~120 FPS, whereas with a GTX 970 my system scored a 161 FPS average. I don't understand why the GPU isn't fully engaging in the benchmark.


----------



## MrKenzie

Quote:


> Originally Posted by *SlayVus*
> 
> So this is kind of obscure and probably won't get a true fix for it, but I'm having issues with my Titan X downclocking during the X3 Terran Conflict rolling demo. The core clocks will run at around 1200 Mhz and the VRAM will run at 640MHz. I finish the benchmark out with an average of ~120 FPS. Whereas with a GTX 970 my system scored a 161 FPS average. I don't understand why the GPU isn't fully engaging in the benchmark.


Are the FPS going above 120 at all during the benchmark? It seems like the card is throttling down because it's hitting a FPS cap.


----------



## Jpmboy

Quote:


> Originally Posted by *MrKenzie*
> 
> Are the FPS going above 120 at all during the benchmark? It seems like the card is throttling down because it's hitting a FPS cap.


Disable G-Sync and set "Preferred refresh rate" in the NVCP to "Highest available".


----------



## Sheyster

Quote:


> Originally Posted by *Leyaena*
> 
> Anyone here that went from two Titan X (Maxwell)s to a single Titan X (Pascal)?
> How did you like the experience?


I did, no complaints!


----------



## ocvn

Quote:


> Originally Posted by *Sheyster*
> 
> I did, no complaints!


Mod the BIOS like you did with the TX Maxwell.


----------



## Kold

Forgive me if this has been asked, but does the Corsair HG10 work with this?

Does the EVGA 980 Ti Hybrid cooler work with it?

Does the NZXT G10 work?

I'd prefer to use the EVGA, but it is very hard to find now.

EDIT: Just found a 980 TI Hybrid shroud in one of my storage cabinets.. not sure why I have it or when I got it, but it's brand new. Anyways, it looks to have the same power cut outs that would be needed for the Titan XP. Gonna order the 1080 Hybrid kit and try to use it with this shroud. I'll let y'all know.


----------



## SlayVus

@Kold - The hybrid cover will only fit on the VRAM/VRM block included in the kit. Making the 1080 hybrid block fit on the XP requires cutting and grinding. If you don't care about looks you can just take off the GPU heatsink and install the block, though the card will then exhaust into the case instead of outside.


----------



## Kold

I don't quite understand. The shroud that I showed won't sit flush with the card?


----------



## SlayVus

No, it won't fit what is already installed on the Titan itself. The screw holes for the shroud won't line up with the screw holes on the base plate.


----------



## bizplan

Quote:


> Originally Posted by *SlayVus*
> 
> @Kold - The hybrid over will only fit on the VRAM/VRM block included in the kit. To make the 1080 hybrid block fit on the XP require cutting and grinding. If you dont care about looks you can just take off the GPU heatsink and install the block. Though the card will then exhaust into the case instead of outside.


One of our members posted a video on YouTube showing a way to modify the 1080 kit's baseplate and cover shroud for use on the TXP. Many of us gave him a hard time for doing the mod on a granite counter top!


----------



## Kold

Ah, damn. Well thanks for the save. Almost ordered it. I suppose I'll just wait for the hybrid kit from EVGA like everyone else.


----------



## SlayVus

You can still use the included GPU AIO without any modification; it just won't look as good. The modifications to the hybrid front plate aren't that difficult, they just take some time and the right equipment.


----------



## pompss

Third Titan X installed and guess what? Coil whine.

I changed the PSU; no difference, still coil whine.

I went to a friend who has another Titan and completely different components, and still coil whine.

Honestly I really don't understand how this is possible. Is Nvidia just shipping out the same cards that people return?

$1200 is a lot of money, and after thinking about it I just asked for a refund. I'll wait until next year to buy a new card.


----------



## Sphere07

It's a scary thought that I might end up with a card that has coil whine. Kinda like when I bought the HTC Vive and ended up with dead pixels, and replacing the unit via RMA just gets you somebody else's unit with dead pixels.


----------



## Artureld

That's crazy having coil-whine on a $1200 card.


----------



## meson1

Quote:


> Originally Posted by *Artureld*
> 
> That's crazy having coil-whine on a $1200 card.


I didn't know coil whine was supposed to be inversely proportional to price.









Seriously though, I get your point..

Thinking of the bigger picture, how long has coil whine been an issue on graphics cards? It's probably always been around, but it has only really grown prevalent over the past few years, probably due to the greater number of people wanting to run ultra-high frame rates - not helped by the number of monitors that will now run at refresh rates as high as 144Hz. Should we not be making this a bigger issue, pushing it back to the manufacturers and asking them to design coil whine out of future GPUs? Otherwise, it's only ever going to get worse.


----------



## Artureld

Lol, the only reason I mentioned the price was that with a higher price tag you'd imagine much better components, thus reducing the chances of coil whine.

But you're completely right, this needs to be factored out of the equation - especially since water-cooled builds are quiet enough for one to hear what's going on.

Thankfully I haven't had an issue with either of my Titan X (Pascal)s. Hopefully it stays that way.


----------



## Sphere07

I think I will wait for it to release in Australia, that way I can make use of the Australian consumer rights. If I buy from America, I will probably be screwed over.


----------



## Artureld

If you are going to wait, Sphere07, then I think the rumored 1080 Ti would be a better bet, as it would be offered directly by local retailers as opposed to just Nvidia's online store, and the performance variance wouldn't be all that different. Exchanges and peace of mind should thus be all but guaranteed.


----------



## meson1

Well this morning I received my email from Nvidia telling me they have finally dispatched my card.

Two hours later, and I now have my Titan X Pascal in my actual hands!!!








Not bad for a delivery time.
I am now a real actual proper TXP owner.


----------



## Artureld

Congratulations, That's fantastic, Enjoy! It's a beast of a card as others have undoubtedly said.


----------



## meson1

I can finally begin my build after a year of collecting the necessary parts. Starting with a basic test of the electronic components and making sure the whole thing will post. Then begins the work to put it all under water in a TH10A.


----------



## Ghostface

Just as a follow up to the suggested Coil Whine fixes. To add to the list I posted earlier I have now gone back and loosened off all the screws on the water block back plate, all the screws on the water block (they weren't very tight). I have done the same with the Motherboard screws and CPU screws so that they are very loose around the components. Booted up, flashed the BIOS and reconfigured the settings...

... same Coil Whine issues at the same points in the same games.

That's me finished in terms of trying different things. I have spent days running into weeks trying to fix it, including hundreds of extra pounds buying different components to see if it's the mix of parts I am using. Don't honestly have the heart to keep trying it so I will just have to use it as a very expensive work computer and give gaming a miss. I'll come back to PC gaming when someone finds a definitive resolution to it.

Cheers for all the help, wish it had turned out different!


----------



## TK421

Quote:


> Originally Posted by *SlayVus*
> 
> @Kold - The hybrid over will only fit on the VRAM/VRM block included in the kit. To make the 1080 hybrid block fit on the XP require cutting and grinding. If you dont care about looks you can just take off the GPU heatsink and install the block. Though the card will then exhaust into the case instead of outside.


That's what I actually did with my TX Maxwell


----------



## Jpmboy

Quote:


> Originally Posted by *Ghostface*
> 
> Just as a follow up to the suggested Coil Whine fixes. To add to the list I posted earlier I have now gone back and loosened off all the screws on the water block back plate, all the screws on the water block (they weren't very tight). I have done the same with the Motherboard screws and CPU screws so that they are very loose around the components. Booted up, flashed the BIOS and reconfigured the settings...
> 
> ... same Coil Whine issues at the same points in the same games.
> 
> That's me finished in terms of trying different things. I have spent days running into weeks trying to fix it, including hundreds of extra pounds buying different components to see if it's the mix of parts I am using. Don't honestly have the heart to keep trying it so I will just have to use it as a very expensive work computer *and give gaming a miss*. I'll come back to PC gaming when someone finds a definitive resolution to it.
> 
> Cheers for all the help, wish it had turned out different!


that's tragic... headphones? or better yet - quality in-ear monitors? (IEMs are the best IMO. *My TripleFi 10s* are incredible)


----------



## Ghostface

I don't like gaming with headphones; I am a developer, so I spend a lot of my time listening to music that way during the day.









I have some really good IEM and headphones, but I also have a really good Amp and 5.1 Speaker setup and I prefer to game like that. I wouldn't have gone to the trouble of water cooling if I was just going to wear headphones, the primary reason was the noise of the fan on the Titan X which made me go down that route... and now I have something even more annoying as the fan was working by design.


----------



## Jpmboy

Quote:


> Originally Posted by *Ghostface*
> 
> I don't like gaming with headphones, I am a developer so spend a lot of my time listening to music that way during the day
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have some really good IEM and headphones, but I also have a really good Amp and 5.1 Speaker setup and I prefer to game like that. I wouldn't have gone to the trouble of water cooling if I was just going to wear headphones, the primary reason was the noise of the fan on the Titan X which made me go down that route... and now I have something even more annoying as the fan was working by design.


you can actually hear the coil hum over a 5.1 sound system? really?


----------



## jsutter71

I just received a delivery from FedEx. The water blocks arrived on Friday, so I paused just long enough to take these pics and post. I'm very impressed with the way they came packed. NICE JOB NVIDIA!!!


----------



## axiumone

So, crazy idea here. Anyone here with access to a 3d printer and scanner? Any interest in fabricating some custom shrouds for the stock cooler converted to an evga hybrid?


----------



## Azazil1190

Hey guys!!!
Does anyone know where I can order a Titan XP in Europe?

Thanks in advance


----------



## -terabyte-

Quote:


> Originally Posted by *Azazil1190*
> 
> Hey guys!!!
> Does anyone knows where I can order an titan xp for europe.
> 
> Thanks in advance


The only place is the Nvidia site; Nvidia didn't allow any AIB partners to sell the Titan X this time.


----------



## Azazil1190

Quote:


> Originally Posted by *-terabyte-*
> 
> The only place is the Nvidia site, Nvidia this time didn't allow any AIB partner to sell Titan X.


Thanks mate.
I'll try to order from Nvidia and hope I can buy from them.
If not, I'll have to wait for the Pascal refresh.


----------



## DADDYDC650

Returning my Titan X for good. Still stuck at 1440p so no point in owning one of these bad boys. Was fun while it lasted but I'll be ready for AMD Vega and the 1080 Ti/Titan X Black 2.


----------



## jsutter71

Question. When I removed the back plate on my first card I saw this on one of the chips. Has anyone else seen this?


----------



## Yuhfhrh

Quote:


> Originally Posted by *jsutter71*
> 
> Question. When I removed the back plate on my first card I saw this on one of the chips. Has anyone else seen this?


Just a (the) thermal pad that makes contact with the back plate.


----------



## jsutter71

Quote:


> Originally Posted by *Yuhfhrh*
> 
> Just a (the) thermal pad that makes contact with the back plate.


Yes I just saw it on the second card I received.


----------



## DarkIdeals

So the weirdest thing happened to my cards. I've got my two TITAN XPs set up with waterblocks in my loop, and had a weird issue where one of my GPUs wasn't showing up or being detected by the computer, and therefore I couldn't enable SLI or anything. I had previously had issues with the card not looking like it was seated in the slot all the way, and accidentally broke off the clip on the first PCIe slot of my Rampage V Edition 10 board. But after that I was able to get both cards working without the clip just fine.

Then I go to sleep, and the next day only one card is working for seemingly no reason. I've tried everything imaginable: used DDU to completely remove and reinstall the drivers, removed my RAM, took the CMOS battery out, etc., and nothing works.

I figured it might have been a damaged PCIe slot, so I moved my two cards from slots 1 and 3 to slots 2 and 4, but it STILL only shows one card. The strangest thing is that Device Manager says the one working card is in slot TWO, but my DisplayPort cable is plugged into the bottom card, which is in slot FOUR! How could a DisplayPort on a card in slot 4 be providing an image when only the card in slot 2 is being detected? I also have one of my RAM slots (DIMM slot D1) not being detected; it's insane. I'm losing my mind with all the nonsensical problems I've been having lately...

Anyone have an idea here?


----------



## Silent Scone

Stop breaking things? lol.

Test both cards individually, and more importantly breathe.


----------



## DarkIdeals

Quote:


> Originally Posted by *Silent Scone*
> 
> Stop breaking things? lol.
> 
> Test both cards individually, and more importantly breathe.


Well it's not like I'm some idiot who's smashing things, lol. Somehow I got it working again. I think it had something to do with me assembling the EK terminal with the cards outside the system and then putting the whole assembly into the PCIe slots at once, which caused problems with a card not seating in its slot all the way. After taking the loop apart I tested the one card that wasn't working by itself, and it was fine; after re-connecting the second one I got it all working again. The key seems to be putting the cards in individually and THEN fitting the EK terminal afterwards.

Haven't tested the RAM DIMM slot yet, so I can't say whether that is working or not.


----------



## Silent Scone

This is why I stay away from the EK bridges. Adjustable fittings all the way.


----------



## EniGma1987

Quote:


> Originally Posted by *pompss*
> 
> third Titan X installed and guess what ?? Coil whine.
> 
> I change the psu no difference still coil Whine.
> 
> I went to a friend who has another titan and complete different components and still coil whine
> 
> Honestly i really don't understand how this is possible. Its Nvidia shipping those card that i return back?
> 
> $1200 its a lot of money and after thinking about it i just asked for a refund. Will wait next year to buy a new card


You sure your ears aren't just ringing?


----------



## Jpmboy

Quote:


> Originally Posted by *Silent Scone*
> 
> This is why I stay away from the EK bridges. Adjustable fittings all the way.


lol, I do like the easily handled unit a hard bridge creates, but one does need to be careful when inserting the "gpu pack"



Quote:


> Originally Posted by *EniGma1987*
> 
> You sure your ears aren't just ringing?


Ha! must be Tinnitus.


----------



## bizplan

Quote:


> Originally Posted by *meson1*
> 
> Well this morning I received my email from Nvidia telling me they have finally dispatched my card.
> 
> Two hours later, and I now have my Titan X Pascal in my actual hands!!!
> 
> 
> 
> 
> 
> 
> 
> 
> Not bad for a delivery time.
> I am now a real actual proper TXP owner.


Share with us your experience with this card!


----------



## TangoDJ

*How to flash the BIOS of Titan X Pascal to change the default Fanspeed settings ?*

Hi Guys,

I built a computing machine (mostly floating-point calculations for deep learning) with four Titan X Pascals. (part list)



Loaded Windows first, and during stress testing I noticed the GPU fans don't go above 50%! Within a minute or so the GPU temps hit 84C and they start throttling.

Got MSI Afterburner, and with it I could tie the fan speed to temperature (yay). Right now, at 85% load, temps are ~76C, fans are at 77%, and the GPUs are running at ~1800 MHz. Good.

The problem is that the system will run Linux (Ubuntu, as that's the platform the deep learning software supports), and there is NO "Afterburner" for Linux.

One suggestion I got is to flash the GPU BIOS with custom temperature settings. I found "THE ULTIMATE NiBiTor "FANSPEED IC SETTINGS" GUIDE" - a six-year-old guide. Is this still valid? The "GUIDE for Flashing BIOS of NVIDIA GPU" was also written quite a while ago.

Has anyone changed the BIOS temperature settings of a Titan X Pascal?

Thanks
AR


----------



## mbze430

BIOS is locked, there is no way (as of right now) to modify the BIOS.


----------



## bizplan

Quote:


> Originally Posted by *Ghostface*
> 
> Just as a follow up to the suggested Coil Whine fixes. To add to the list I posted earlier I have now gone back and loosened off all the screws on the water block back plate, all the screws on the water block (they weren't very tight). I have done the same with the Motherboard screws and CPU screws so that they are very loose around the components. Booted up, flashed the BIOS and reconfigured the settings...
> 
> ... same Coil Whine issues at the same points in the same games.
> 
> That's me finished in terms of trying different things. I have spent days running into weeks trying to fix it, including hundreds of extra pounds buying different components to see if it's the mix of parts I am using. Don't honestly have the heart to keep trying it so I will just have to use it as a very expensive work computer and give gaming a miss. I'll come back to PC gaming when someone finds a definitive resolution to it.
> 
> Cheers for all the help, wish it had turned out different!


Someone was crazy enough to do this:


http://imgur.com/Bgtll


----------



## Baasha

Quote:


> Originally Posted by *TangoDJ*
> 
> *How to flash the BIOS of Titan X Pascal to change the default Fanspeed settings ?*


Good to see another person w/ 4x Titan X Pascal.









What are some good number-crunching programs to test out the computational capacity of the four GPUs? Any specific benchmarks etc. specifically for FP operations?

My rig is a gaming rig but I'd like to test out some of these programs. Going to do an Octane Render test w/ 4x Titan X Pascal - would be fun to see!


----------



## Maintenance Bot

@TangoDJ


Spoiler: Warning: Spoiler!



How to flash the BIOS of Titan X Pascal to change the default Fanspeed settings?

Hi Guys,

I built a computing machine (mostly floating points calculations for deep learning) with four Titan X Pascal. (part list)



Loaded Windows first and during stress testing, noticed the GPU fans don't go above 50%! Within a minute or so the GPU temps hits 84C and the they start slowing down.

Got the MSI Afterburner, and I could change the fan speed with temperature (yay). Right now, at 85% load, temp ~76C and fans are at 77%, GPUs are running ~1800 MHz. Good.

The problem is: the system will run Linux (Ubuntu, that's the platform deep learning software supports) and *there is NO "Afterburner" for Linux







*

One suggestion I got is to the flash the BIOS of the GPU with a custom temperature settings. Found this "THE ULTIMATE NiBiTor "FANSPEED IC SETTINGS" GUIDE" - a six year old guide. Is this still valid ? Also the "GUIDE for Flashing BIOS of NVIDIA GPU" was written quite a while ago.

Wonder if anyone has changed the BIOS temperature settings of Titan X Pascal ?

Thanks
AR


I used Coolbits to control my Nvidia 980's fan in Linux. It has been about a year since I last used Linux, but this is what worked for me. Try method 1: http://www.upubuntu.com/2015/05/how-to-controladjust-gpu-fan-speed-for.html
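In case it helps, the usual Coolbits route sketched out (the option value and the gpu/fan indices follow the nvidia-settings documentation; adjust the indices for a four-card box, and treat this as a sketch rather than gospel):

```shell
# Enable manual fan control by adding the Coolbits option to xorg.conf
# (bit value 4 unlocks GPUFanControlState on every GPU).
sudo nvidia-xconfig --enable-all-gpus --cool-bits=4

# After restarting X, take manual control of a GPU's fan and set a speed (%).
# Repeat for [gpu:1]..[gpu:3] / [fan:1]..[fan:3] on a four-card machine.
nvidia-settings -a "[gpu:0]/GPUFanControlState=1" \
                -a "[fan:0]/GPUTargetFanSpeed=80"
```

The catch TangoDJ mentions still applies: nvidia-settings talks to a running X server, so a headless box needs an X session (real or virtual screens) for this to work.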


----------



## SlayVus

So I'm planning on using copper shims instead of thermal pads when installing the EVGA hybrid kit front plate on my Titan XP. I have some 0.3mm on the way, with some 0.5mm and 0.8mm on hand in 20x20mm pieces. What would be the best way to go about test-fitting these on the card? I've never done anything like this before.

I figure with the 20x20mm pieces I have, I can cut them to fit where I need them. Then when I actually install the plate onto the card, I have an 8cc (giant) tube of Arctic Silver Ceramique that I would use to affix the shims to the various components, dabbing the Ceramique on both sides of each shim.


----------



## Jpmboy

Quote:


> Originally Posted by *SlayVus*
> 
> So I'm planning on using copper shims instead of thermal pads for installing the EVGA hybrid kit front plate to my Titan XP. I have some 0.3mm on the way with some 0.5mm and 0.8mm on hand in 20x20mm pieces. What would be the best way to going about test fitting these for the card? I've never done anything like this before.
> 
> I figure with the 20x20mm piece that I have, I can cut to fit where I need them. Then when I actually go about installed the plate onto the card I have a 8cc(Giant tube) of Arctic Silver Ceramique that I would use to actually affix the shims to the various components for cooling, dabbing the ceramique on both sides of the shim.


Unless you hammer the shims perfectly flat, you'll have bad contact and will likely crack an IC (especially if cutting from a 20x20 sheet - maybe if you had a cutting bit and a press). More importantly, the PCB is not manufactured to tight tolerances in terms of how high the ICs sit off the PCB. This is why pads are used: to account for that variance and provide thermal flux, but also to provide electrical insulation in some places. Bad idea IMO. TXPs run cool; what you are thinking of doing is totally unnecessary. Just get Fujipoly pads.


----------



## TangoDJ

Quote:


> Originally Posted by *Maintenance Bot*
> 
> @TangoDJ
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> How to flash the BIOS of Titan X Pascal to change the default Fanspeed settings?
> 
> Hi Guys,
> 
> I built a computing machine (mostly floating points calculations for deep learning) with four Titan X Pascal. (part list)
> 
> 
> 
> Loaded Windows first and during stress testing, noticed the GPU fans don't go above 50%! Within a minute or so the GPU temps hits 84C and the they start slowing down.
> 
> Got the MSI Afterburner, and I could change the fan speed with temperature (yay). Right now, at 85% load, temp ~76C and fans are at 77%, GPUs are running ~1800 MHz. Good.
> 
> The problem is: the system will run Linux (Ubuntu, that's the platform deep learning software supports) and *there is NO "Afterburner" for Linux
> 
> 
> 
> 
> 
> 
> 
> *
> 
> One suggestion I got is to the flash the BIOS of the GPU with a custom temperature settings. Found this "THE ULTIMATE NiBiTor "FANSPEED IC SETTINGS" GUIDE" - a six year old guide. Is this still valid ? Also the "GUIDE for Flashing BIOS of NVIDIA GPU" was written quite a while ago.
> 
> Wonder if anyone has changed the BIOS temperature settings of Titan X Pascal ?
> 
> Thanks
> AR
> 
> 
> I used coolbits to control nvidia 980 fan in Linux, it has been about a year since I used linux but this is what worked for me. Try method 1 http://www.upubuntu.com/2015/05/how-to-controladjust-gpu-fan-speed-for.html


Thanks for pointing out the Coolbits option. I read somewhere that I need to run an X server and attach a monitor to each of the GPUs (4 monitors total). There should be a way to 'fool' the GPUs with virtual monitors; exploring this now.

Quote:


> Originally Posted by *Baasha*
> 
> Good to see another person w/ 4x Titan X Pascal.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What are some good number-crunching programs to test out the computational capacity of the four GPUs? Any specific benchmarks etc. specifically for FP operations?
> 
> My rig is a gaming rig but I'd like to test out some of these programs. Going to do an Octane Render test w/ 4x Titan X Pascal - would be fun to see!


It's the NVIDIA CUDA Toolkit that made scientific computing on GPUs popular. cuBLAS would be a good one for benchmarking. Also, have a look at the CUDA Tools & Ecosystem page.

Cheers
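For a quick FP32 sanity check, a GEMM throughput measurement is the classic test, since that's exactly what cuBLAS is built around. A minimal sketch of the bookkeeping, with NumPy standing in for the GPU call (on the actual cards you'd issue the same-shaped matmul through cuBLAS, e.g. via CuPy or the CUDA samples):

```python
import time
import numpy as np

def gemm_gflops(n: int = 2048, repeats: int = 5) -> float:
    """Time an n x n matrix multiply and return achieved GFLOP/s.

    A GEMM of (m x k) by (k x n) costs 2*m*k*n floating-point ops
    (one multiply plus one add per inner-product term).
    """
    a = np.random.rand(n, n).astype(np.float32)
    b = np.random.rand(n, n).astype(np.float32)
    a @ b  # warm-up so setup cost isn't timed

    start = time.perf_counter()
    for _ in range(repeats):
        a @ b
    elapsed = time.perf_counter() - start

    flops = 2 * n**3 * repeats      # total floating-point operations performed
    return flops / elapsed / 1e9    # GFLOP/s

if __name__ == "__main__":
    print(f"{gemm_gflops():.1f} GFLOP/s")
```

On a CPU this reports whatever your BLAS can do; the same 2*m*k*n accounting is how GPU GEMM numbers are usually quoted, so it gives you a comparable figure once ported to cuBLAS.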


----------



## pompss

Quote:


> Originally Posted by *EniGma1987*
> 
> You sure your ears aren't just ringing?


Quote:


> Originally Posted by *Jpmboy*
> 
> lol, I do like the easily handled unit a hard bridge creates, but one does need to be careful when inserting the "gpu pack"
> 
> 
> Ha! must be Tinnitus.


Lol !!!

Trust me, I have an AMD RX 480 and there is no coil whine. My ears are really good. By the way, the coil whine was really loud.
I say *was* because I sold both overseas and also made some good money.









Waiting for Vega or a 1080 Ti, hoping for no coil whine.


----------



## xarot

Quote:


> Originally Posted by *DarkIdeals*
> 
> Well it's not like i'm some idiot who's smashing things lol. Somehow i got it working again. I think it had something to do with me putting the EK Terminal together with the cards outside the system and then putting the entire thing in the pci slots all at once which caused problems with the card not going in the slot all the way. After taking the loop apart i tested just the one card that wasn't working before by itself and it did fine; then after re-connecting the 2nd one i got it all working again. The key seems to be that i put the cards in individually and THEN put the EK terminal on afterwards.


Quote:


> Originally Posted by *Jpmboy*
> 
> lol, I do like the easily handled unit a hard bridge creates, but one does need to be careful when inserting the "gpu pack"


I too like using the terminal. The GPU pack the terminal creates is very sturdy, and I've never had a single issue even when inserting 3x GPUs at the same time. They are very heavy when all together, though. When removing them, make sure the mobo clips are open and go very carefully so you don't scratch the board near the GPU I/O plates. Same goes for installing them, of course.


----------



## Jpmboy

Quote:


> Originally Posted by *xarot*
> 
> I too like using the terminal. The GPU pack the terminal creates is very sturdy and I've never had a single issue even when inserting 3x GPUs in the same time. They are very heavy though when all are together. When removing them, you'll need to make sure that the mobo clips are open and do it very carefully so you don't scratch the board near the GPU I/O plates. Same goes of course when installing them.


yeah - I run both adj (2 TXMs) and ek terminals (2 txp). But for sure it is the only way to go with 3 cards:

3x 980 Strix was the last tri-sli setup I had.


----------



## jsutter71

What software do some of you use to monitor temps. With my Evga cards I was using their software but now that my TXPs are running that software does not want to work.


----------



## TremF

Quote:


> Originally Posted by *jsutter71*
> 
> What software do some of you use to monitor temps. With my Evga cards I was using their software but now that my TXPs are running that software does not want to work.


MSI Afterburner

It may also work if you uninstall your previous software, including any profiles, and re-install it.
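If you'd rather skip third-party overlays entirely, the driver also ships `nvidia-smi`, which can poll temps from a console. A small sketch that wraps it; the query flags are standard nvidia-smi ones, while the parsing helper is my own:

```python
import subprocess

def parse_temps(csv_text: str) -> list[int]:
    """Parse output of `nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader`:
    one integer Celsius reading per line, one line per GPU."""
    return [int(line.strip()) for line in csv_text.splitlines() if line.strip()]

def read_gpu_temps() -> list[int]:
    """Return the current core temperature (C) of each NVIDIA GPU."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_temps(out)

if __name__ == "__main__":
    try:
        for i, t in enumerate(read_gpu_temps()):
            print(f"GPU {i}: {t} C")
    except (FileNotFoundError, subprocess.CalledProcessError):
        print("nvidia-smi not available - is the NVIDIA driver installed?")
```

Handy in a loop for logging temps during a burn-in without any vendor tool installed.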


----------



## jsutter71

Thank you much.


----------



## jsutter71

How do you disable g-sync?


----------



## TremF

Quote:


> Originally Posted by *jsutter71*
> 
> How do you disable g-sync?


Right click on your desktop and select NVIDIA Control Panel, then go to the "Set up G-SYNC" option and de-select it.


----------



## jsutter71

Quote:


> Originally Posted by *TremF*
> 
> Right click on your desktop and select NVIDIA Control Panel, then go to the "Set up G-SYNC" option and de-select it.


Where in the NVIDIA driver is that located? I can't seem to find it.


----------



## TremF

Quote:


> Originally Posted by *jsutter71*
> 
> Where in the NVIDIA driver is that located? I can't seem to find it.


Right click on your desktop and you should see "NVIDIA Control Panel". Select that, then in the control panel select "Set up G-SYNC" and disable it.

I presume you do have a G-Sync monitor then?


----------



## DarkIdeals

Ugh....this is getting absolutely ridiculous. Had things going fine for the past day and a half and now the top card isn't being recognized again. i kept getting random restarts last night too, one of which gave a USB overvoltage error. Haven't done anything to cause this that i know of either; it's driving me crazy! i just can't understand why a GPU keeps appearing and then disappearing from the list. It always seems to be fine when i leave the PC to go to sleep and when i wake up in the morning it's just not recognized (3 times i've had this happen the same way...) i've tried doing a ton of DDU clean re-installs of drivers, i tried ensuring the cards were in the slot all the way etc.. and nothing works. I'm starting to genuinely hate this damn system....i can't afford to have the thing down any longer than it already has been.


----------



## jsutter71

Quote:


> Originally Posted by *TremF*
> 
> Right click on your desktop and you should see "NVidia Control Panel". Select that then in the control panel select "Set up G-Sync" as shown below and disable it.
> 
> I presume you do have a G-Sync monitor then?


Interesting. I don't have that option available. Drivers version 373.06


----------



## TremF

Quote:


> Originally Posted by *DarkIdeals*
> 
> Ugh....this is getting absolutely ridiculous. Had things going fine for the past day and a half and now the top card isn't being recognized again. i kept getting random restarts last night too. Haven't done anything to cause this that i know of either; it's driving me crazy! i just can't understand why a GPU keeps appearing and then disappearing from the list. It always seems to be fine when i leave the PC to go to sleep and when i wake up in the morning it's just not recognized (3 times i've had this happen the same way...) i've tried doing a ton of DDU clean re-installs of drivers, i tried ensuring the cards were in the slot all the way etc.. and nothing works. I'm starting to genuinely hate this damn system....i can't afford to have the thing down any longer than it already has been.


What driver are you using? Have you tried an older one?

I think I am having an issue with the latest driver. My PC ran fine with my old GTX Titan X SLI setup and also ran fine with the TXP until the new driver and now it locks up now and again when starting a game. All temperatures are fine and my PSU is plenty powerful enough - like I said it ran my GTX TX SLI.
Quote:


> Originally Posted by *jsutter71*
> 
> Interesting. I don't have that option available. Drivers version 373.06


Do you have a G-Sync monitor, and if so, do you have multiple monitors? If the answer is yes to both, is the G-Sync monitor set as your primary display?


----------



## DarkIdeals

Quote:


> Originally Posted by *TremF*
> 
> What driver are you using? Have you tried an older one?
> 
> I think I am having an issue with the latest driver. My PC ran fine with my old GTX Titan X SLI setup and also ran fine with the TXP until the new driver and now it locks up now and again when starting a game. All temperatures are fine and my PSU is plenty powerful enough - like I said it ran my GTX TX SLI.


I'm using latest drivers now, but it was still causing problems with the 3rd driver down the list as well a couple days ago.

It's just the most bizarre behavior, sometimes my video signal would just go out, other times the whole system would restart, now the gpu isn't recognized AGAIN, and i'm getting usb devices hanging up until i unplug and put them in a different port, usb overvoltage errors when it restarts etc..etc.. just ALL kinds of nonsense with no apparent cause.

I replaced my PSU recently so i didn't think it was that (using an EVGA 1050 watt GS) but SOMETHING is wrong here....


----------



## TremF

Quote:


> Originally Posted by *DarkIdeals*
> 
> I'm using latest drivers now, but it was still causing problems with the 3rd driver down the list as well a couple days ago.
> 
> It's just the most bizarre behavior, sometimes my video signal would just go out, other times the whole system would restart, now the gpu isn't recognized AGAIN, and i'm getting usb devices hanging up until i unplug and put them in a different port, usb overvoltage errors when it restarts etc..etc.. just ALL kinds of nonsense with no apparent cause.


Hmm, have you tried another PSU? It sounds like a lack of power if it restarts. Also try a memory check in case you have bad RAM, though if that were the case you'd probably get BSODs rather than restarts and dropped cards. Another option is IRQ conflicts.

Another weird one: check for malware. It can do some strange things.

Sorry I'm not much help. I've been really lucky over the years and not had issues like this. I hope you manage to sort it soon.


----------



## jsutter71

Quote:


> Originally Posted by *TremF*
> 
> What driver are you using? Have you tried an older one?
> 
> I think I am having an issue with the latest driver. My PC ran fine with my old GTX Titan X SLI setup and also ran fine with the TXP until the new driver and now it locks up now and again when starting a game. All temperatures are fine and my PSU is plenty powerful enough - like I said it ran my GTX TX SLI.
> Do you have a G-Sync monitor, and if so, do you have multiple monitors? If the answer is yes to both, is the G-Sync monitor set as your primary display?


Ok. No, my monitor is not. Makes sense why the feature is not available then.


----------



## jsutter71

I have the Advanced Edition of 3DMark, but when I click on the install button to load the benchmarks, nothing happens. Any ideas?


----------



## KillerBee33

Quote:


> Originally Posted by *jsutter71*
> 
> I have the Advanced Edition of 3DMark, but when I click on the install button to load the benchmarks, nothing happens. Any ideas?


NUKE everything , Start FRESH https://www.microsoft.com/en-us/software-download/windows10/
No reason to fiddle with every little thing


----------



## jsutter71

Quote:


> Originally Posted by *KillerBee33*
> 
> NUKE everything , Start FRESH https://www.microsoft.com/en-us/software-download/windows10/
> No reason to fiddle with every little thing


Already running Win 10 pro


----------






## bizplan

Quote:


> Originally Posted by *jsutter71*
> 
> I have the Advanced Edition of 3DMark, but when I click on the install button to load the benchmarks, nothing happens. Any ideas?


I believe you have to run the INSTALL program using Win XP compatibility.


----------



## jsutter71

Quote:


> Originally Posted by *bizplan*
> 
> I believe you have to run the INSTALL program using Win XP compatibility.


Ok. Now it's a different issue. 3DMark has installed the benchmarks but gives me an error every time I try to run them. Man, what a waste of money this program is. It doesn't even describe the error.


----------



## mbze430

should go get a refund


----------



## Jpmboy

Quote:


> Originally Posted by *jsutter71*
> 
> Interesting. I don't have that option available. Drivers version 373.06


Download Display Driver Uninstaller (DDU) and run it. Then reinstall the most recent driver.
Quote:


> Originally Posted by *DarkIdeals*
> 
> I'm using latest drivers now, but it was still causing problems with the 3rd driver down the list as well a couple days ago.
> 
> It's just the most bizarre behavior, sometimes my video signal would just go out, other times the whole system would restart, now the gpu isn't recognized AGAIN, and i'm getting usb devices hanging up until i unplug and put them in a different port, usb overvoltage errors when it restarts etc..etc.. just ALL kinds of nonsense with no apparent cause.
> 
> I replaced my PSU recently so i didn't think it was that (using an EVGA 1050 watt GS) but SOMETHING is wrong here....


Shut down, unplug, and ensure that the cards are fully and correctly seated in the slots. In fact, ensure that all power connectors are correctly seated and that you have the SLI 4-pin molex powered. Then unplug any USB devices except mouse and keyboard, switch on the PSU, and try a CLR_CMOS. If it runs fine at full stock, apply your OC (no new USB devices). Does it still have the problem? If it is not running clean at full stock and is dropping a PCIe slot, check that the board standoffs are not over-tightened and that there are no "extra" standoffs under the board (I had a guy fry a motherboard by leaving an unused standoff from an mATX board in the motherboard tray; it shorted the board out).


----------



## KillerBee33

Quote:


> Originally Posted by *jsutter71*
> 
> Ok. Now it's a different issue. 3DMark has installed the benchmarks but gives me an error every time I try to run them. Man, what a waste of money this program is. It doesn't even describe the error.


Here is Standalone Edition no STEAM required...
http://www.guru3d.com/files-details/3dmark-download.html


----------



## jsutter71

Quote:


> Originally Posted by *KillerBee33*
> 
> Here is Standalone Edition no STEAM required...
> http://www.guru3d.com/files-details/3dmark-download.html


The version that I installed was the standalone version that I purchased directly from their web site. Must be a 3DMark issue. I have no problem running Heaven. Also opened Witcher 3 with no issues; it looks beautiful on dual TXPs. My gaming monitor is 31" at 4096x2160 resolution. My temps maxed out at 53C for both cards with 8X AA and ultra settings in Heaven.

*This destroys my previous configuration: triple SLI 980 Ti's.*

*No AA*


*8X AA*


----------



## Jpmboy

Quote:


> Originally Posted by *jsutter71*
> 
> The version that I installed was the Standalone version that I purchased directly from their web site. Must be a 3Dmark issue. I have no problem running Heaven. Also opened Witcher 3 with no issues. Looks beautiful on dual TXPs. My gaming monitor is a 31" 4096 X 2160 resolution. Also my temps maxed out at 53C for both cards with 8X AA and ultra settings with Heaven.
> *This destroys my previous configuration: triple SLI 980 Ti's.*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *No AA*
> 
> 
> *8X AA*


you need win10 to run timespy. It needs dx12.


----------



## KillerBee33

Not sure what's going on. Just ran Mafia 3 maxed at 1620p for about an hour, TXP and 6700K at stock and on water: the TXP at 55 degrees and the 6700K at 60-70.


----------



## ocvn

Quote:


> Originally Posted by *jsutter71*
> 
> Ok. Now it's a different issue. 3DMark has installed the benchmarks but gives me an error every time I try to run them. Man, what a waste of money this program is. It doesn't even describe the error.


Close MSI AB


----------



## jsutter71

Quote:


> Originally Posted by *Jpmboy*
> 
> you need win10 to run timespy. It needs dx12.


I'm running Win 10 pro


----------



## jsutter71

Quote:


> Originally Posted by *ocvn*
> 
> Close MSI AB


Thanks. I'll try that tomorrow. Why would Afterburner interfere?


----------



## ocvn

Quote:


> Originally Posted by *jsutter71*
> 
> Thanks. I'll try that tomorrow. Why would Afterburner interfere?


Frankly, I don't know. MSI AB works perfectly with FSU, but every time I run Time Spy with MSI AB open, I get the same problem as you.


----------



## DarkIdeals

after swapping to my old PSU (evga 1000w G2) it started detecting my 2nd GPU again; but within a couple minutes of turning on i get constant random reboots or video signal cutting out followed by reboot. I thought maybe it was the drivers again so i did a full DDU clean and every time i went to install the Nvidia drivers it would shut down on its own the second the drivers started to install. After like 10 tries i FINALLY got it to install the drivers and it shut off again right after.

I keep getting my USB devices not functioning where i have to swap ports, and i figured maybe it was the USB drivers but i can't install new USB drivers as it shuts down the second i try to install them. I then used the EZ Flash tool to switch to the new 1003 BIOS on the Rampage V Edition 10 and now it's only detecting one card again and i just got a "USB OVERCURRENT STATUS DETECTED SHUTTING DOWN IN 15 SECONDS"

This is THE most infuriating thing i've ever had happen to me, is there anyone with any idea what the hell this nightmare is all about? I've tried for nearly four days and i can't get the PC to stay on longer than a few minutes and can't get it to consistently detect both GPUs either.


----------



## DarkIdeals

And for some reason the random shut downs etc.. appear to stop or at least get less common once the 2nd GPU stops being recognized by the system and starts back up real bad when i get the 2nd GPU detected properly again.

It's also having issues with detecting my "D" slot for the RAM. says i only have 12gb of RAM but i have 16gb.

Anyone have ANY ideas here?


----------



## MrKenzie

Quote:


> Originally Posted by *DarkIdeals*
> 
> And for some reason the random shut downs etc.. appear to stop or at least get less common once the 2nd GPU stops being recognized by the system and starts back up real bad when i get the 2nd GPU detected properly again.
> 
> It's also having issues with detecting my "D" slot for the RAM. says i only have 12gb of RAM but i have 16gb.
> 
> Anyone have ANY ideas here?


Honestly if it were me I would replace the motherboard..


----------



## Menthol

Quote:


> Originally Posted by *DarkIdeals*
> 
> And for some reason the random shut downs etc.. appear to stop or at least get less common once the 2nd GPU stops being recognized by the system and starts back up real bad when i get the 2nd GPU detected properly again.
> 
> It's also having issues with detecting my "D" slot for the RAM. says i only have 12gb of RAM but i have 16gb.
> 
> Anyone have ANY ideas here?


Remove the memory and test each stick in each slot to isolate either a bad memory module or a bad slot. Do you have another CPU to test with? It could possibly be the CPU and/or the motherboard.


----------



## V I P E R

Hello Titan XP owners









I currently have 2 Titan XP's which soon will have waterblocks, but I have a question for the people that have SLI Titan XP and play BF1. With 2 cards @ 4K Ultra I get very poor GPU Core Load - 50-60% on both cards with FPS between 60-80. With single card I get normal 95-100 GPU utilisation and same FPS. Anyone else noticed this behaviour in BF1?

I've made a clip with MSI Afterburner overlay in which you can see GPU Core Load:


----------



## EniGma1987

Quote:


> Originally Posted by *MrKenzie*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DarkIdeals*
> 
> And for some reason the random shut downs etc.. appear to stop or at least get less common once the 2nd GPU stops being recognized by the system and starts back up real bad when i get the 2nd GPU detected properly again.
> 
> It's also having issues with detecting my "D" slot for the RAM. says i only have 12gb of RAM but i have 16gb.
> 
> Anyone have ANY ideas here?
> 
> 
> 
> Honestly if it were me I would replace the motherboard..
Click to expand...

Or the CPU is somehow extremely damaged and part of the DRAM controller and PCI-E controller is messed up. More likely the board though, since if it were a damaged DRAM controller (I have had that happen before) then both sticks on that channel would probably not be recognized. And we do know Rampage boards can be quite finicky.


----------



## Jpmboy

Quote:


> Originally Posted by *EniGma1987*
> 
> Or the CPU is somehow extremely damages and part of the dram controller and PCI-E controller is messed up. More likely the board though since if it was a damaged DRAM controller (I have had that happen before) then both sticks on that channel would probably not be recognized. And we do know Rampage boards can be quite finicky


.... and or bent pins.


----------



## Ghostface

Quote:


> Originally Posted by *DarkIdeals*
> 
> This is THE most infuriating thing i've ever had happen to me, is there anyone with any idea what the hell this nightmare is all about? I've tried for nearly four days and i can't get the PC to stay on longer than a few minutes and can't get it to consistently detect both GPUs either.


Before you go to the trouble of replacing your motherboard, make sure that you don't have any metal stand offs or screws touching the circuitry on your motherboard.

I have an MSI X-Power Titanium Motherboard and spent an infuriating couple of days wondering why my machine was crashing, failing to detect memory running dual channel and 75% of the time just failing to boot. Turned out one of the stand offs on the CPU cooler was touching the circuit on the motherboard, which on the MSI Titanium runs directly behind the screw socket.

I've had a quick look on Google Images, and although it doesn't look like the CPU socket on your board has circuitry behind the screw sockets, it does look as though some of the motherboard standoffs might. Try removing the screws from any sockets that are very close to the circuitry. The fact that it works intermittently indicates it could be falling foul of interference.


----------



## Baasha

Quote:


> Originally Posted by *V I P E R*
> 
> Hello Titan XP owners
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I currently have 2 Titan XP's which soon will have waterblocks, but I have a question for the people that have SLI Titan XP and play BF1. With 2 cards @ 4K Ultra I get very poor GPU Core Load - 50-60% on both cards with FPS between 60-80. With single card I get normal 95-100 GPU utilisation and same FPS. Anyone else noticed this behaviour in BF1?


Your CPU is bottlenecking the cards. In other words, your CPU core clock is not high enough for the GPUs to be utilized at their optimal levels (>95% each).

4K is too *low* of a resolution for 2x Titan XP.

I played the other night with everything maxed out including TAA @ 4K w/ just one Titan XP and was getting around 70FPS constantly with GPU usage around 99% at all times.

Here's BF1 in 5K w/ 4x Titan XP and even w/ a 6950X @ 4.30GHz, you can see that GPU utilization is not that great (~70% across all 4 cards). (5K is about 78% more pixels than 4K.)




To test this out, put your resolution scale to 133% at 4K which will be approx. 5K resolution - the cards should then be utilized at their optimal level. Two cards is ideal for 5K. One card for 4K.

We really have reached the promised land of 4K 60FPS w/ just one card!
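The arithmetic behind that 133% figure, for anyone curious (standard 16:9 resolutions assumed; the scale applies per axis, so pixel count grows with the square):

```python
def pixels(w: int, h: int) -> int:
    """Total pixel count of a w x h frame."""
    return w * h

uhd_4k = pixels(3840, 2160)   # 8,294,400 px
uhd_5k = pixels(5120, 2880)   # 14,745,600 px

# 5K vs 4K: each axis is 5120/3840 = 1.33x longer,
# so the frame has ~1.78x the pixels (1.33 squared).
print(f"5K / 4K pixel ratio: {uhd_5k / uhd_4k:.2f}")

# A 133% resolution scale at 4K renders roughly 3840*1.33 x 2160*1.33,
# which lands almost exactly on 5K:
w, h = round(3840 * 1.33), round(2160 * 1.33)
print(f"4K @ 133% scale: {w}x{h} = {pixels(w, h):,} px")
```

So "133% scale at 4K" and "native 5K" are nearly the same GPU load, which is why the cards finally get saturated there.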


----------



## V I P E R

Quote:


> Originally Posted by *Baasha*
> 
> Your CPU is bottlenecking the cards. In other words, your CPU core clock is not high enough for the GPUs to be utilized at their optimal levels (>95% each).
> 
> 4K is too *low* of a resolution for 2x Titan XP.
> 
> I played the other night with everything maxed out including TAA @ 4K w/ just one Titan XP and was getting around 70FPS constantly with GPU usage around 99% at all times.
> 
> Here's BF1 in 5K w/ 4x Titan XP and even w/ a 6950X @ 4.30GHz, you can see that GPU utilization is not that great (~70% across all 4 cards). (5K is about 78% more pixels than 4K.)
> 
> 
> 
> 
> To test this out, put your resolution scale to 133% at 4K which will be approx. 5K resolution - the cards should then be utilized at their optimal level. Two cards is ideal for 5K. One card for 4K.
> 
> We really have reached the promised land of 4K 60FPS w/ just one card!


I get the same results with a 5960X @ 4700 MHz and my 6700K @ 5000 MHz. I know I can get better utilization with scaling, but when 4K monitors with more than 60 Hz come out, it will be impossible to play with 2 Titan XPs, and that bothers me. I'll have to test with the 6700K on LN2 to confirm that the CPU is the bottleneck here, but it is hard to believe.


----------



## jsutter71

Quote:


> Originally Posted by *V I P E R*
> 
> I get the same results with 5960X @ 4,700 Mhz and my 6700K @ 5000 Mhz. I know I can get better utilization with scaling, but when 4K monitors with more than 60Hz came out it will be impossible to play with 2 Titan XP and that bothers me. I'll have to test with 6700K on LN2 to confirm that the CPU is the bottleneck here.


Interesting. I never understood why so many people here have paid the extra money for the 6950X. Why do all the review sites say that the best CPU for gaming is the 6700K when that CPU is so much less powerful?


----------



## Lee0

Quote:


> Originally Posted by *jsutter71*
> 
> Interesting. I never understood why so many people here have paid the extra money for the 6950X. Why do all the review sites say that the best CPU for gaming is the 6700K when that CPU is so much less powerful?


Not everyone here is a pure gamer.


----------



## pompss

http://www.overclock.net/t/1614093/ek-waterblock-titan-pascal-used-for-few-days

If someone is looking for a Pascal Titan X waterblock, I have one to sell now.
I also have one from Aqua Computer, plus a backplate.


----------



## Lobotomite430

Quote:


> Originally Posted by *jsutter71*
> 
> Interesting. I never understood why so many people here have paid the extra money for the 6950X. Why do all the review sites say that the best CPU for gaming is the 6700K when that CPU is so much less powerful?


I don't think I've seen anyone say the 6950X is better for gaming. Everything I have read points to the 6700K because hardly anything needs 10 cores. I think it's because a more powerful GPU matters more than CPU cores at this time. It's kind of like buying a car with a V8 when you could have got a turbo 4-cylinder: more cores, more cylinders. Basically why I went with a 5820K over a 4770K.


----------



## jsutter71

Quote:


> Originally Posted by *Lee0*
> 
> Not everyone here is a pure gamer.


Funny. But how many games can actually utilize a CPU with 10 cores? Like NONE.


----------



## Lee0

Quote:


> Originally Posted by *jsutter71*
> 
> Funny. But how many games can actually utilize a CPU with 10 cores? Like NONE.


Please read my first message as it seems like you just ignored it.


----------



## jsutter71

Quote:


> Originally Posted by *Lobotomite430*
> 
> I dont think Ive seen anyone say the 6950x is better for gaming. Everything I have read points to 6700k because hardly anything needs 10 cores. I think its because a more powerful gpu is more important than cpu cores at this time. I think its kinda like buying a car with a V8 when you could have got a turbo 4 cylinder. More cores more cylinder. Basically why I went with a 5820k over a 4770k.


LOL... I understand that logic. I bought my 2014 Camaro 2SS brand new in 2014 and just hit 5,000 miles on the odo. Some people can afford the ultimate system and other nice things together, but I'm not rich, so after dropping a ton of cash on dual TXPs, I'll have to stay satisfied with my 5930K for a while.


----------



## V I P E R

Quote:


> Originally Posted by *jsutter71*
> 
> Interesting. I never understood why so many people here have paid the extra money for the 6950X. Why do all the review sites say that the best CPU for gaming is the 6700K when that CPU is so much less powerful?


I was just interested in why my two Titan X's cannot give me 120-140 FPS and utilize better. I don't think the CPU is the bottleneck in BF1, nor do I agree that 4K Ultra with TAA is too light a load for 2 Titan XPs. I think the drivers may just need to mature a little more.

I don't expect SLI to scale perfectly, nor are a 6700K @ 4600 MHz or a 5960X @ 4500 MHz (my 24/7 settings) weak CPUs, but getting almost the same FPS whether I'm using 1 or 2 cards seemed strange, and that's why I asked here if anyone has had the same experience.


----------



## jsutter71

Quote:


> Originally Posted by *Lee0*
> 
> Please read my first message as it seems like you just ignored it.


I apologize if I sounded impolite. I did not mean to be rude and should have used the emoji.


----------



## jsutter71

Quote:


> Originally Posted by *V I P E R*
> 
> I was just interested why my two Titan X's cannot give me 120-140 FPS and utilize better, and I don't think that the CPU is bottleneck in BF1 nor I agree that 4K Ultra with TAA is weak for 2 Titan XP's. I think that drivers need to mature a little more maybe.
> 
> I don't expect SLI to scale perfectly, nor that 6700K @ 4600 Mhz or 5960X @ 4500 Mhz (my 24/7 settings) are weak CPU's, but getting almost same FPS whether I'm using 1 or 2 cards seemed strange and that's why I asked here if anyone had same experience.


All this new technology is making my head spin. Going to get some member berries and play a game of missile command on the Atari 2600


----------



## Lobotomite430

Quote:


> Originally Posted by *jsutter71*
> 
> All this new technology is making my head spin. Going to get some member berries and play a game of missile command on the Atari 2600


oooo I member! member 2d graphics and 8 colors?


----------



## SlayVus

Quote:


> Originally Posted by *Baasha*
> 
> 4K is too *low* of a resolution for 2x Titan XP


It really is amazing how powerful the Titan XP is. I was messing around with Dynamic Super Resolution on my X34 at four times native resolution and was getting 30 FPS at 6880x2880 in Divinity: Original Sin 2 at high settings with no AA. That's a staggering 11.5 million pixels more than 4K and 5.1 million more than 5K; in fact, it's more total pixels than some 6K formats.


----------



## bizplan

Quote:


> Originally Posted by *Lee0*
> 
> Not everyone here is a pure gamer.


As for why all the review sites say the best CPU for gaming is the 6700K: it's partly for economic reasons; the price-to-performance ratio of the 6700K is much better than the 6950X's (for gaming, that is). Also, the single-threaded performance of the 6700K is higher than the 6950X's; check it out in CPU-Z.


----------



## mouacyk

Quote:


> Originally Posted by *SlayVus*
> 
> It really is amazing how powerful the Titan XP is. I was messing around with Dynamic Super Resolution on my x34 at four times native resolution. I was getting 30 FPS at 6880x2880 in Divinity Original Sin 2 at high settings with no AA. That's a staggering 11.7 million pixels more than 4k, 5.2 million more than 5k, in fact it's 500k more pixels than even 6k.


I think you can use a little more RAM in your system.


----------



## Maintenance Bot

Quote:


> Originally Posted by *V I P E R*
> 
> Hello Titan XP owners
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I currently have 2 Titan XP's which soon will have waterblocks, but I have a question for the people that have SLI Titan XP and play BF1. With 2 cards @ 4K Ultra I get very poor GPU Core Load - 50-60% on both cards with FPS between 60-80. With single card I get normal 95-100 GPU utilisation and same FPS. Anyone else noticed this behaviour in BF1?


I also see no scaling with the 2nd GPU here. Official drivers tomorrow, hopefully.


----------



## Jpmboy

Quote:


> Originally Posted by *Lobotomite430*
> 
> I dont think Ive seen anyone say the 6950x is better for gaming. Everything I have read points to 6700k because hardly anything needs 10 cores. I think its because a more powerful gpu is more important than cpu cores at this time. I think its kinda like buying a car with a V8 when you could have got a turbo 4 cylinder. More cores more cylinder. Basically why I went with a 5820k over a 4770k.


6700K is a super good price-performance point... but the idea that it "games better" is basically a myth, as nearly every benchmark and game benchmark demonstrates. We had that debate months ago in this thread.


----------



## jsutter71

After I recover from the sticker shock of purchasing dual TXPs, I'll consider the 6950X. In the meantime I'll do my best to stay content with my 5930K.


----------



## mbze430

Once game developers start using DX12 scaling for both GPU and CPU, it will be a whole new story for 8C+ CPUs. Right now, it's too early, and probably will be for a while. Consider how long each DX revision took for developers to utilize its full potential.


----------



## Jpmboy

Quote:


> Originally Posted by *jsutter71*
> 
> After I recover from the sticker shock of purchasing dual TXP's I'll consider the 6950X. In the mean time I'll do my best to stay content with my 5930K


lol - a 6950X may not be the best choice for a gaming-only rig... since it is capable of so much more. For a gaming-only rig (and of course the basic 2D "office-like" productivity stuff) a 6700K is the right choice ATM, especially at 1440P and lower resolutions.


----------



## jhowell1030

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - a 6950X may not be the best choice for a gaming-only rig... since it is capable of so much more. For a gaming-only rig (and of course the basic 2D "office-like" productivity stuff) a 6700K is the right choice ATM, especially at 1440P and lower resolutions.


For a gaming only rig...that's sufficient for any resolution.


----------



## bizplan

Quote:


> Originally Posted by *jhowell1030*
> 
> For a gaming only rig...that's sufficient for any resolution.


Yeah, and very soon the 7700K: one can swap out their 6700K for it w/o having to buy a different motherboard, only a BIOS update. That's a claimed 15%+ performance boost for around $400 (net $150-$175, assuming one can sell their used 6700K for $225-$250).
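The net-cost arithmetic spelled out (the resale figures are the assumed range above):

```python
# Net upgrade cost if the old CPU is sold (figures from the post above;
# the resale range is an assumption, not a quote).
new_price = 400                       # rough 7700K street price
resale_low, resale_high = 225, 250    # assumed used-6700K resale range

print(new_price - resale_high)   # 150 (best-case net cost)
print(new_price - resale_low)    # 175 (worst-case net cost)
```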


----------



## jsutter71

Quote:


> Originally Posted by *bizplan*
> 
> Yeah and very soon the 7700K, one can swap out their 6700K for that w/o having to buy a different motherboard; only a bios update, 15%+ performance boost for around $400 (net $150-$175 assuming one can sell their used 6700K for $225-$250(?)).


Understandably the hardcore gaming community is driving the TXPs, in the same sense that technical professionals are driving the Quadro community. Some of us non-hardcore gamers happen to have nice big expensive TRUE 4K 4096 x 2160 monitors that need powerful GPUs to drive them for things like video editing and Photoshop work for our nice expensive 5D Mark III cameras. The ability to play Battlefront and Witcher 3 on those nice big monitors is just an added benefit.


----------



## Nikos4Life

I do not think there is just one right choice for gaming purposes, not even for just one person, as needs may change over time.


----------



## Nikos4Life

By the way, I would like to introduce myself on this thread as a proud owner of two Titan X Pascals,
and show my rig to my fellow owners:

Hope you like it












Spoiler: BFG

















Spoiler: Benchmarks



Fire Strike 1.1: 30 909
3DMark 11 Performance: P34 549



Note: I have changed the loop from parallel to serial.


----------



## Jpmboy

Quote:


> Originally Posted by *jhowell1030*
> 
> For a gaming only rig...that's sufficient for any resolution.


so is a 6600K for that matter.


----------



## st0necold

Quote:


> Originally Posted by *jsutter71*
> 
> Understandably the hardcore gaming community is driving the TXPs, in the same sense that technical professionals are driving the Quadro community. Some of us non-hardcore gamers *happen to have nice big expensive TRUE 4k 4096 X 2160* monitors that need powerful GPUs to drive them for things like video editing and Photoshop work for our nice expensive 5D Mark III cameras. The ability to play Battlefront and Witcher 3 on those nice big monitors is just an added benefit.


What 4096 monitor do you have? I am just curious what you do with them. (I did a Google search and saw the $25k+ price tags; I assume these are for professional photo and video editing rigs?)


----------



## bizplan

Quote:


> Originally Posted by *st0necold*
> 
> What 4096 monitor do you have? I am just curious what you do with them. (I did a Google search and saw the $25k+ price tags; I assume these are for professional photo and video editing rigs?)


One can get an LG Electronics IPS Digital Cinema 4K monitor (31MU97-B, 31.0-inch LED-lit, true 4K at 4096x2160) for $880 at Amazon, which I believe is the monitor that was referenced.


----------



## ttg35fort

Here is my rig:

Fractal Design Define S case
Seasonic Prime Titanium 850 PSU
Asus X99-Deluxe II MB
Core I7 6800k
Titan X (Pascal)
G.Skill TridentZ Series 32GB DDR4 3400
Samsung 950 Pro M.2 512 GB
Windows 10 64 Bit.

Originally I was running a Noctua NH-D15. I used the Asus automatic overclock and was at 3.98 GHz (39 x 102) at stock voltage. The default RAM frequency was around 2.4 GHz. I changed this to 3.4 GHz since I have DDR4-3400 RAM, and I was stable in IntelBurnTest.

I played with overclocking the Titan X on air in MSI Afterburner. I pushed the GPU with a +200 MHz overclock and was stable at stock voltage. Then I started playing with the memory and got to +600 MHz. It was stable and I ran out of time, so I left it there. At full load, the Titan X was running in the low to mid 80 deg. C range and the CPU was upper 60s to low 70s, if I remember correctly.

FireStrike Extreme gave me a score in the low 14000 range.

I decided to change to open loop water cooling. I took out the Noctua and added:

EK-FB ASUS X99 Monoblock
EK-FC Titan X Pascal waterblock
EK-XRES 140 Revo D5 PWM (incl. pump)
Alphacool NexXxoS ST 30 420 mm Radiator
Alphacool NexXxoS ST 30 280 mm Radiator

I manually OC'd the processor and am stable at 4.41 GHz (43 x 102.5) at 1.28 V, although I had to bump the memory to 1.35V and drop the memory frequency down to around 2.85 GHz to get stability in IntelBurn test. With the CPU at 4.4 GHz and the memory at 3.4 GHz, the system would freeze in IntelBurn test.

That is the background. Now, here is where I am struggling:

On water, my CPU at full load is in the upper 50 to low 60 deg. C range, and the Titan X at full load is running in the low to mid 50 deg. C range. So, my CPU temp has dropped about 10 deg. C and my GPU temp has dropped about 30 deg. C. Yet, I cannot get more OC out of the Titan X at stock voltage. Moreover, even with the CPU bumped up to 4.4 GHz from 4.0 GHz, my FireStrike Extreme score remains about the same, in the low 14000 range. Perhaps the increased CPU speed is being offset by the reduced RAM speed. I'm not sure.

I played with the Titan X voltage in MSI Afterburner, and it took a 30% voltage bump to get it stable at a +205 MHz OC. After that, though, my FireStrike Extreme score actually went down into the mid 13000 range. I then tried a +210 MHz OC, and it took a 75% voltage bump to get it stable. Then, my FireStrike Extreme score dropped even more, down to the low 13000 range. I tried going the other direction and dropped the OC to +190 MHz, and my score again fell below 14000. So, +200 MHz seems to be the sweet spot, regardless of whether I am on air or water.

So, after adding all of the water cooling, my rig does not seem to be any faster. Any suggestions for what I should try next?
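For reference, the CPU clocks above are just multiplier x BCLK; a quick check using the values from this post:

```python
# Effective CPU frequency = multiplier x base clock (BCLK), both in MHz.
def cpu_mhz(multiplier, bclk_mhz):
    return multiplier * bclk_mhz

print(cpu_mhz(39, 102.0))   # 3978.0 MHz, the ~3.98 GHz auto overclock
print(cpu_mhz(43, 102.5))   # 4407.5 MHz, the ~4.41 GHz manual overclock
```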


----------



## jhowell1030

Quote:


> Originally Posted by *ttg35fort*
> 
> Here is my rig:
> 
> Fractal Design Define S case
> Seasonic Prime Titanium 850 PSU
> Asus X99-Deluxe II MB
> Core I7 6800k
> Titan X (Pascal)
> G.Skill TridentZ Series 32GB DDR4 3400
> Samsung 950 Pro M.2 512 GB
> Windows 10 64 Bit.
> 
> Originally I was running a Noctua NH-D15. I used the Asus automatic overclock and was at 3.98 GHz (39 x 102) at stock voltage. The default RAM frequency was around 2.4 GHz. I changed this to 3.4 GHz since I have DDR 3400 RAM, and I was stable in IntelBurn test.
> 
> I played with overclocking the Titan X on air in MSI Afterburner. I pushed the GPU with an added a +200 MHz overclock and was stable at stock voltage. Then I started playing with the memory and got to 600 MHz. It was stable and I ran out of time, so I left it there. At full load, the Titan X was running in the low to mid 80 deg. C range and the CPU was upper 60s to low 70s if I remember correctly.
> 
> FireStrike Extreme gave me a score in the low 14000 range.
> 
> I decided to change to open loop water cooling. I took out the Noctua and added:
> 
> EK-FB ASUS X99 Monoblock
> EK-FC Titan X Pascal waterblock
> EK-XRES 140 Revo D5 PWM (incl. pump)
> Alphacool NexXxoS ST 30 420 mm Radiator
> Alphacool NexXxoS ST 30 280 mm Radiator
> 
> I manually OC'd the processor and am stable at 4.41 GHz (43 x 102.5) at 1.28 V, although I had to bump the memory to 1.35V and drop the memory frequency down to around 2.85 GHz to get stability in IntelBurn test. With the CPU at 4.4 GHz and the memory at 3.4 GHz, the system would freeze in IntelBurn test.
> 
> That is the background. Now, here is where I am struggling:
> 
> On water, my CPU at full load is in the upper 50 to low 60 deg. C range, and the Titan X at full load is running in the low to mid 50 deg. C range. So, my CPU temp has dropped about 10 deg. C and my GPU temp has dropped about 30 deg. C. Yet, I cannot get more OC out of the Titan X at stock voltage. Moreover, even with the CPU bumped up to 4.4 GHz from 4.0 GHz, my FireStrike Extreme score remains about the same, in the low 14000 range. Perhaps the increase CPU speed is being offset by the reduced RAM speed. I'm not sure.
> 
> I played with the Titan X voltage in MSI Afterburner, and it took a 30% voltage bump to get it stable at a +205 MHz OC. After that, though, my FireStrike Extreme actually went down into the mid 13000 range. I then tried a 210 MHz OC, and it took a 75% voltage bump to get it stable. Then, my FireStrike Extreme score dropped even more, down to the low 13000 range. I tried going the other direction and dropped the OC to +190 MHz, and my score again down below 14000. So, +200 MHz seems to be the sweet spot, regardless of whether I am on air or water.
> 
> So, after adding all of the water cooling, my rig does not seem to be any faster. Any suggestions for what I should try next?


If you had good/sufficient cooling with air coolers (which you did) adding water to the mix isn't going to make things any faster. Some folks *may* get a tiny increase in overclocks but that's usually when their limit was due to temps.


----------



## bizplan

Quote:


> Originally Posted by *ttg35fort*
> 
> Here is my rig:
> 
> Fractal Design Define S case
> Seasonic Prime Titanium 850 PSU
> Asus X99-Deluxe II MB
> Core I7 6800k
> Titan X (Pascal)
> G.Skill TridentZ Series 32GB DDR4 3400
> Samsung 950 Pro M.2 512 GB
> Windows 10 64 Bit.
> 
> Originally I was running a Noctua NH-D15. I used the Asus automatic overclock and was at 3.98 GHz (39 x 102) at stock voltage. The default RAM frequency was around 2.4 GHz. I changed this to 3.4 GHz since I have DDR 3400 RAM, and I was stable in IntelBurn test.
> 
> I played with overclocking the Titan X on air in MSI Afterburner. I pushed the GPU with an added a +200 MHz overclock and was stable at stock voltage. Then I started playing with the memory and got to 600 MHz. It was stable and I ran out of time, so I left it there. At full load, the Titan X was running in the low to mid 80 deg. C range and the CPU was upper 60s to low 70s if I remember correctly.
> 
> FireStrike Extreme gave me a score in the low 14000 range.
> 
> I decided to change to open loop water cooling. I took out the Noctua and added:
> 
> EK-FB ASUS X99 Monoblock
> EK-FC Titan X Pascal waterblock
> EK-XRES 140 Revo D5 PWM (incl. pump)
> Alphacool NexXxoS ST 30 420 mm Radiator
> Alphacool NexXxoS ST 30 280 mm Radiator
> 
> I manually OC'd the processor and am stable at 4.41 GHz (43 x 102.5) at 1.28 V, although I had to bump the memory to 1.35V and drop the memory frequency down to around 2.85 GHz to get stability in IntelBurn test. With the CPU at 4.4 GHz and the memory at 3.4 GHz, the system would freeze in IntelBurn test.
> 
> That is the background. Now, here is where I am struggling:
> 
> On water, my CPU at full load is in the upper 50 to low 60 deg. C range, and the Titan X at full load is running in the low to mid 50 deg. C range. So, my CPU temp has dropped about 10 deg. C and my GPU temp has dropped about 30 deg. C. Yet, I cannot get more OC out of the Titan X at stock voltage. Moreover, even with the CPU bumped up to 4.4 GHz from 4.0 GHz, my FireStrike Extreme score remains about the same, in the low 14000 range. Perhaps the increase CPU speed is being offset by the reduced RAM speed. I'm not sure.
> 
> I played with the Titan X voltage in MSI Afterburner, and it took a 30% voltage bump to get it stable at a +205 MHz OC. After that, though, my FireStrike Extreme actually went down into the mid 13000 range. I then tried a 210 MHz OC, and it took a 75% voltage bump to get it stable. Then, my FireStrike Extreme score dropped even more, down to the low 13000 range. I tried going the other direction and dropped the OC to +190 MHz, and my score again down below 14000. So, +200 MHz seems to be the sweet spot, regardless of whether I am on air or water.
> 
> So, after adding all of the water cooling, my rig does not seem to be any faster. Any suggestions for what I should try next?


Re: TXP, +200 MHz on the core and +600 MHz on the memory is usually the top end for this card. Under water, the core/memory can max out at 2,100 MHz / 11,200 MHz (effective) and stay in that range; above 35C you will see some thermal [clock] throttling.
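Pascal's GPU Boost drops the core clock in roughly 13 MHz bins as the temperature crosses internal thresholds. A toy sketch of that behavior (the threshold temperatures here are invented for illustration; the real boost table is not user-editable):

```python
# Toy model of temperature-based boost binning on Pascal: the core clock
# drops in ~13 MHz steps as temperature crosses thresholds. The threshold
# temperatures below are invented for illustration, not NVIDIA's real table.
BIN_MHZ = 13
THRESHOLDS_C = [35, 45, 55, 65]   # hypothetical bin-drop temperatures

def boost_clock(max_boost_mhz, temp_c):
    bins_lost = sum(1 for t in THRESHOLDS_C if temp_c >= t)
    return max_boost_mhz - bins_lost * BIN_MHZ

print(boost_clock(2100, 30))   # 2100 -- below the first threshold
print(boost_clock(2100, 36))   # 2087 -- one bin lost past 35C
```

This is why a cooler loop holds higher sustained clocks even when the card never comes close to its hard temperature limit.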


----------



## ttg35fort

I read what I could find re: the TXP, and one review indicated that water cooling it would virtually eliminate the throttling. I thought this would show up in the benchmark score, but for some reason it doesn't. On air I had the fan speed set really high, so perhaps I was not getting much throttling, which might explain why the scores are essentially the same. I'm not sure.


----------



## ttg35fort

Oops. I meant to respond to another post and it showed up here. I didn't see how to delete it, so I am just editing it. I'm a newb.


----------



## KillerBee33

Hey all, I see different numbers on watercooled TXPs so just wanted to ask again: 6700K @ 4.6 @ 1.34V, TXP at stock, running around 50-55 degrees at 1440p when gaming. Is that anywhere near normal?
My room temps are around 28 btw.


----------



## axiumone

Quote:


> Originally Posted by *KillerBee33*
> 
> Hey all , i see different numbers on watercooled TXPs just wanted to ask again, 6700k @ 4.6 @ 1.34V TXP on Stock and running around 50-55 degrees in 1440 when gaming. Is that anywhere normal?
> My room temps are around 28 btw.


That looks a bit high. My hybrid TXPs are running about the same temp @ 2050 core. I'd expect a full water setup to perform better, especially at stock.


----------



## KillerBee33

Quote:


> Originally Posted by *axiumone*
> 
> That looks a high. My hybrid TXP are running about the same temp @2050 core. I'd expect a full water setup to perform better. Especially at stock.


Well, I'm basing this off a single game right now, Mafia 3. I can do Borderlands 1, 2, 3 @ 2160p or NFS 2016 @ 2160p and it runs around 39-45. My idle is 24 CPU and 26 GPU; Firestrike/TimeSpy tops out @ 42 degrees at 2050 MHz with +800 on memory, +120 power, +100 voltage.
Just wanted to check before I go NUTZ


----------



## bizplan

Quote:


> Originally Posted by *KillerBee33*
> 
> Hey all , i see different numbers on watercooled TXPs just wanted to ask again, 6700k @ 4.6 @ 1.34V TXP on Stock and running around 50-55 degrees in 1440 when gaming. Is that anywhere normal?
> My room temps are around 28 btw.


When I play Doom at 2160p (100-120 FPS range, 6700K @ 4.7, single TXP, XB321HK 4K/60Hz monitor), my temps max out at 55C (using the EVGA hybrid kit in push/pull); when I limit FPS to 59 (which I usually do to maintain G-Sync) my max temps are 48C. Obviously lower frame rates mean less work for the CPU; CPU package temps avg. 50C @ ~1.35V, maybe 10-15C higher at max FPS. I also have 16GB of G.Skill Trident Z memory at 3600 MHz, which raises my CPU core/package temps quite a bit as the CPU/memory transact at higher frequencies.


----------



## KillerBee33

Quote:


> Originally Posted by *bizplan*
> 
> When I play Doom at 2160p (100-120 FPS range, 6700K @ 4.7, single TXP, XB321HK 60Hz monitor), my temps max out at 55C (using EVGA hybrid kit), when I limit FPS to 59 (which I usually do to maintain G-Sync) my max temps are 48C. Obviously lower frame rates means less work for the CPU, CPU package temps avg. 50C, maybe 10-15C higher at max FPS.


I think i have DOOM somewhere , will check those temps today @ 2160p.


----------



## jsutter71

Quote:


> Originally Posted by *st0necold*
> 
> What 4096 monitor do you have? I am just curious what you do with them. (I did a Google search and saw the $25k+ price tags; I assume these are for professional photo and video editing rigs?)


Hello... My monitor is the LG 31MU97-B. I paid about $1300 when I first got it, but Amazon sells them for $899 new. You can get them for less than $600 used, but buying used LED monitors is risky.


----------



## jsutter71

Quote:


> Originally Posted by *KillerBee33*
> 
> Hey all , i see different numbers on watercooled TXPs just wanted to ask again, 6700k @ 4.6 @ 1.34V TXP on Stock and running around 50-55 degrees in 1440 when gaming. Is that anywhere normal?
> My room temps are around 28 btw.


No...My TXP's under water are running at 23-26C idle and hit mid 30s gaming, depending on how warm my office gets. Also, my cards have 0 extra slots spaced between them.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Hey all , i see different numbers on watercooled TXPs just wanted to ask again, 6700k @ 4.6 @ 1.34V TXP on Stock and running around 50-55 degrees in 1440 when gaming. Is that anywhere normal?
> My room temps are around 28 btw.


it really depends on the water temp. EK blocks will run about ~10C higher than the cold side of the loop. My TXPs run ~5C higher than the _hot side temp_ when folding at 2101 (in the mid-high 30s, like 37-38C). Looping Heaven 4.0, this WC loop reaches steady-state at ~38C on both cards (again, 2100/11,000 or so)


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> it really depends on the water temp. EK blocks will run about ~10C higher than the cold side of the loop. My TXPs run ~5C higher than the _hot side temp_ when folding at 2101 (in the mid-high 30s, like 37-38C). Looping Heaven 4.0, this WC loop reaches steady-state at ~38C on both cards (again, 2100/11,000 or so)


Hmm, not sure what's wrong then... it was doing well when I put it all together.
BTW, running the 3x120 fans @ 2000 RPM and the pump @ 3000 RPM changes things by only 3-5 degrees...


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Humm not sure whats wrong then...it was doing well when i put it all together.
> BTW ramming 3x120 fans @ 2000 RPM and Pump @ 3000 RPM changes things by only 3-5 degrees...


Do you have an in-line water temp sensor in the loop?


----------



## bizplan

Quote:


> Originally Posted by *KillerBee33*
> 
> Humm not sure whats wrong then...it was doing well when i put it all together.
> BTW ramming 3x120 fans @ 2000 RPM and Pump @ 3000 RPM changes things by only 3-5 degrees...


I don't think you can come to any solid conclusions on "right" temps, in that each of us running TXP(s) under water has a very different water-cooling setup; not so much with hybrid kits, but the custom water setups are all different (fan size & speed, # of fans, rad size, # of rads, diameter/length of loops/pipes, reservoir(s), pumps, etc.). Notwithstanding, I think anything over a 55C GPU core temp while under water is probably too high.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> Do you have an in-line water temp sensor in the loop?


Nope...


----------



## cg4200

Yeah, your temp seems high for a full loop. What pump? How many ml is the res? My friend runs one 360 rad with 120mm fans out on top, a 200ml res, and a 6700K [email protected] volts; he tops out at 47C gaming, even last month when it was still warm out.
I have two 360 rads with six 120mm fans on top and a 400ml-or-more res, a 6700K at 4.85 @ 1.45, and a TXP at 100% voltage / 120 power, 2114 on core and +585 on mem; it tops out at 40C gaming GTA 5 maxed for hours. Not sure how you run your loop into the card, and people will say it does not matter (also directions), but going back to my 980 EK blocks, I noticed a temp change if you run the water inlet closest to the power connector on the back side of the card, with the front (farthest) port as the return; it's also easier for getting air bubbles out. My 980 Ti was the same. Last tip: it's a big chip to cover. I use Thermal Grizzly and a cut-up old credit card with the spread method, making sure not to overwork it and trap air spots.
I don't over-tighten the screws on the water block either, and after running it the first time I turn it off, unplug, power discharge, then give a quarter turn on the screws, especially near the RAM; I notice the EK fit does not make the greatest contact on the RAM on my TXP. My 2 cents, good luck.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Nope...


yeah - so, to the point you make about increasing fan and pump speed lowering temps 3-5C (which is significant): it may be that after running a number of thermal cycles (heat-cool), the coolant has finally degassed and any air in the system has collected, maybe in the rads. Run the pump at its highest speed (if it does not have a "de-aeration" mode) and... yes, tip the rig around to move the air thru the rad. May help.


----------



## ttg35fort

I did some more tweaking. I changed the BCLK freq. back to 100 MHz and the CPU multiplier to 44 to get 4.4 GHz on the CPU. I upped the cache voltage to 1.30 V and got the cache up to 3.7 GHz. I upped the system agent voltage to 1.23 V and got the memory up to 3.2 GHz. FireStrike Extreme score is still in the low 14000 range, 14,2xx. I'm guessing that is as far as I can go for now without going significantly higher on voltages.


----------



## jsutter71

And after upgrading the drivers today and disabling Afterburner, I am still getting a stupid error message when I try to run 3DMark. Funny how it's the only program I have that's giving me problems.


----------



## Dr Mad

You guys seem to be very lucky pushing TX-P to 2100/2114 game stable with up to 40° gpu temp.

I guess mine is not a really good clocker.

+120% everything else at stock = 1860
+120% / +200 = 2050 (1.05v)
+120% / +200 / +100mv = 2073 (1.08v)

I need +220 to get 2100, but as soon as temp reaches 27/28°, I lose a bin, and once it goes up to 32°, I lose another bin (2076)
This with CLU mod on 2 of the 3 resistors (85% TDP in FS Extreme and 105% FS Ultra)

But it's fine though, I'm able to maintain 2076 (+205) / +600 mem stable (Witcher 3, Deus Ex MD...) and temp never exceeds 35° after some hours of gaming (ambient at 22-23°, EK waterblock)
If only we could edit the boost table









*ttg35fort* --> Is 14K FS Extreme score the overall or GPU score?
With TX-P at 2050, you should get ~16K GPU score or a bit less depending on the memory overclock.

FS Extreme --> http://www.3dmark.com/fs/10400010
FS Ultra --> http://www.3dmark.com/fs/10392611
FS --> http://www.3dmark.com/fs/10392644


----------



## DarkIdeals

Quote:


> Originally Posted by *Menthol*
> 
> Remove the memory and test each stick in each slot for either a bad memory module or slot, or CPU, do you have another CPU to test with,possibly CPU and or motherboard


Yeah it's only slot D1 that gives the issue. And no i don't have another CPU or motherboard; these things are far too expensive as it is ($1700 CPU and $600 motherboard on top of two $1200 GPUs. I have NO money left to be buying anything else really)

Quote:


> Originally Posted by *EniGma1987*
> 
> Or the CPU is somehow extremely damages and part of the dram controller and PCI-E controller is messed up. More likely the board though since if it was a damaged DRAM controller (I have had that happen before) then both sticks on that channel would probably not be recognized. And we do know Rampage boards can be quite finicky


Yeah, it shows up as TRIPLE channel when I look at the RAM, and shows 12GB of 16GB. It's only one RAM slot that won't detect properly. Then it also won't detect one of my GPUs either. The weirdest thing is that it wouldn't detect the TOP graphics card before, and after taking everything apart including the monoblock, motherboard, taking GPU waterblocks off and reseating it all just in case some fluid got on something causing problems, it's now the THIRD slot that won't detect the GPU. I've tried switching them to as many different slots as I can too, but for some damn reason it just won't detect both GPUs for longer than a few hours (and while the 2nd GPU is detected I get constant crashes; the crashes stop happening as soon as it stops detecting two GPUs).

It's seriously weird. I manage to get the 2nd GPU to detect from time to time but it never detects the D1 slot for RAM no matter what i've tried. I can't tell what's going on really. I thought maybe one of the GPUs went bad but both GPUs "seemed" to be fine when i put them in by themselves so that seems unlikely; but with two cards plugged in things go crazy.

Unfortunately i can't afford a new board and i doubt ASUS will take my board after i removed the heatsinks and pipe etc.. assembly stuff to put the monoblock on.

Quote:


> Originally Posted by *Jpmboy*
> 
> .... and or bent pins.


Checked the pins and they are fine.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> yeah - so, to the point you make about increasing fan and pump speed lowering temps 3-5C (which is significant): it may be that after running a number of thermal cycles (heat-cool), the coolant has finally degassed and any air in the system has collected, maybe in the rads. Run the pump at its highest speed (if it does not have a "de-aeration" mode) and... yes, tip the rig around to move the air thru the rad. May help.


Will be installing Noiseblocker fans tomorrow , will see what changes , also will try a new pump ...thnx


----------



## MrKenzie

Quote:


> Originally Posted by *Dr Mad*
> 
> You guys seem to be very lucky pushing TX-P to 2100/2114 game stable with up to 40° gpu temp.
> 
> I guess mine is not a really good clocker.
> 
> +120% everything else at stock = 1860
> +120% / +200 = 2050 (1.05v)
> +120% / +200 / +100mv = 2073 (1.08v)
> 
> I need +220 to get 2100 but as soon as temp reach 27/28°, I lose a bin and once it goes up to 32°, I lose another bin (2076)
> This with CLU mod on 2 of the 3 resistors (85% TDP in FS Extreme and 105% FS Ultra)
> 
> But it's fine though, I'm able to maintain 2076 (+205) / +600 mem stable (witcher 3, deux ex MD...) and temp never exceeds 35° after some hours of gaming (ambient at 22-23°, EK waterblock)
> If only we could edit the boost table
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *ttg35fort* --> Is 14K FS Extreme score the overall or GPU score?
> With TX-P at 2050, you should get ~16K GPU score or a bit less depending on the memory overclock.
> 
> FS Extreme --> http://www.3dmark.com/fs/10400010
> FS Ultra --> http://www.3dmark.com/fs/10392611
> FS --> http://www.3dmark.com/fs/10392644


I would try your overclock with the voltage slider at stock. I use +220 core / +400 mem at stock voltage. Some games fluctuate between 2100-2114, while others will drop as low as 2050. The drops I encounter are always power related or another "safety" feature, as my GPU temp is stable at 25C constantly.


----------



## ttg35fort

Quote:


> Originally Posted by *MrKenzie*
> 
> I would try your overclock with the voltage slider at stock. I use +220core / +400mem stock voltage. Some games fluctuate between 2100-2114, while others will drop to as low as 2050. The drops I encounter are always power related or another "safety" feature as my GPU temp is stable at 25c constantly.


Quote:


> Originally Posted by *Dr Mad*
> 
> You guys seem to be very lucky pushing TX-P to 2100/2114 game stable with up to 40° gpu temp.
> 
> I guess mine is not a really good clocker.
> 
> +120% everything else at stock = 1860
> +120% / +200 = 2050 (1.05v)
> +120% / +200 / +100mv = 2073 (1.08v)
> 
> I need +220 to get 2100 but as soon as temp reach 27/28°, I lose a bin and once it goes up to 32°, I lose another bin (2076)
> This with CLU mod on 2 of the 3 resistors (85% TDP in FS Extreme and 105% FS Ultra)
> 
> But it's fine though, I'm able to maintain 2076 (+205) / +600 mem stable (witcher 3, deux ex MD...) and temp never exceeds 35° after some hours of gaming (ambient at 22-23°, EK waterblock)
> If only we could edit the boost table
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *ttg35fort* --> Is 14K FS Extreme score the overall or GPU score?
> With TX-P at 2050, you should get ~16K GPU score or a bit less depending on the memory overclock.
> 
> FS Extreme --> http://www.3dmark.com/fs/10400010
> FS Ultra --> http://www.3dmark.com/fs/10392611
> FS --> http://www.3dmark.com/fs/10392644


Overall score. You have me beat by 1000 points. I'm not sure what is going on with mine.


----------



## ttg35fort

Quote:


> Originally Posted by *Dr Mad*
> 
> You guys seem to be very lucky pushing TX-P to 2100/2114 game stable with up to 40° gpu temp.
> 
> I guess mine is not a really good clocker.
> 
> +120% everything else at stock = 1860
> +120% / +200 = 2050 (1.05v)
> +120% / +200 / +100mv = 2073 (1.08v)
> 
> I need +220 to get 2100 but as soon as temp reach 27/28°, I lose a bin and once it goes up to 32°, I lose another bin (2076)
> This with CLU mod on 2 of the 3 resistors (85% TDP in FS Extreme and 105% FS Ultra)
> 
> But it's fine though, I'm able to maintain 2076 (+205) / +600 mem stable (witcher 3, deux ex MD...) and temp never exceeds 35° after some hours of gaming (ambient at 22-23°, EK waterblock)
> If only we could edit the boost table
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *ttg35fort* --> Is 14K FS Extreme score the overall or GPU score?
> With TX-P at 2050, you should get ~16K GPU score or a bit less depending on the memory overclock.
> 
> FS Extreme --> http://www.3dmark.com/fs/10400010
> FS Ultra --> http://www.3dmark.com/fs/10392611
> FS --> http://www.3dmark.com/fs/10392644


After comparing your score to mine, I see the problem. During the test my core clock was 1618 MHz and the memory clock stayed at 1401 MHz. You were at 2088 MHz and 1395 MHz. So, for some reason my Afterburner settings are not staying set during the benchmark. I will look at it to see what is going on.
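One way to verify which clocks actually ran during a benchmark is to log them with `nvidia-smi --query-gpu=clocks.gr,clocks.mem,temperature.gpu --format=csv,noheader -l 1` and check the log afterward. A minimal parser sketch for that output (the sample line is made up):

```python
# Parse one line of `nvidia-smi --query-gpu=clocks.gr,clocks.mem,temperature.gpu
# --format=csv,noheader` output, e.g. "1618 MHz, 1401 MHz, 52".
def parse_smi_line(line):
    core, mem, temp = [field.strip() for field in line.split(",")]
    return {
        "core_mhz": int(core.split()[0]),   # strip the " MHz" unit
        "mem_mhz": int(mem.split()[0]),
        "temp_c": int(temp),                # csv temperature has no unit
    }

print(parse_smi_line("1618 MHz, 1401 MHz, 52"))
```

Logging this once a second while the benchmark runs makes it obvious whether the Afterburner offset was actually in effect.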


----------



## Jpmboy

Quote:


> Originally Posted by *ttg35fort*
> 
> I did some more tweaking. I changed the BLCK freq. back to 100 MHz and the CPU multiplier to 44 to get 4.4 GHz on the CPU. I upped the cache voltage to 1.30 V and got it up to 3.7 GHz. I upped the system agent voltage to 1.23 V and got the memory up to 3.2 GHz. FireStrike Extreme score is still in the low 14000 range, 14,2xx. I'm guessing that is as far as I can go for now without going significantly higher on voltages.


that's a very high VSA for X99 - unless you are running 128GB of RAM, it's totally unnecessary. Drop VSA to 1.05V or so, and increase VDIMM instead. CPU VCCIO up a notch or two would be a better balance.
Quote:


> Originally Posted by *jsutter71*
> 
> And after upgrading the drivers today and disabling afterburner I am still getting a stupid error message when I try to run 3DMark. Funny HOW it's the only program I have that's giving me problems.


Right click on the windows menu, select "run" type "winver". What version of w10 are you running? Also, in the 3Dmark window> settings> deselect In-run monitoring, and allow 3dmark to complete the initial system scan it does when starting.


----------



## carlhil2

Quote:


> Originally Posted by *Dr Mad*
> 
> You guys seem to be very lucky pushing TX-P to 2100/2114 game stable with up to 40° gpu temp.
> 
> I guess mine is not a really good clocker.
> 
> +120% everything else at stock = 1860
> +120% / +200 = 2050 (1.05v)
> +120% / +200 / +100mv = 2073 (1.08v)
> 
> I need +220 to get 2100 but as soon as temp reach 27/28°, I lose a bin and once it goes up to 32°, I lose another bin (2076)
> This with CLU mod on 2 of the 3 resistors (85% TDP in FS Extreme and 105% FS Ultra)
> 
> But it's fine though, I'm able to maintain 2076 (+205) / +600 mem stable (Witcher 3, Deus Ex MD...) and temp never exceeds 35° after some hours of gaming (ambient at 22-23°, EK waterblock)
> If only we could edit the boost table
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *ttg35fort* --> Is 14K FS Extreme score the overall or GPU score?
> With TX-P at 2050, you should get ~16K GPU score or a bit less depending on the memory overclock.
> 
> FS Extreme --> http://www.3dmark.com/fs/10400010
> FS Ultra --> http://www.3dmark.com/fs/10392611
> FS --> http://www.3dmark.com/fs/10392644


We have similar builds, clocks and scores. In FS Ultra, run those same settings on a cool night with a window open and you'll do even better... http://www.3dmark.com/fs/10251603 I am only one spot ahead of you in the HOF....


----------



## Iceman2733

Anyone here upgrade from 2x 1080 to a Titan XP? I recently picked up two EVGA 1080 FTWs but am now thinking I want to send them back, go for a Titan XP, and maybe add a second one around tax season. I wanted to see what the real-life performance difference is between these setups.

Sent from my iPhone using Tapatalk


----------



## MrTOOSHORT

Quote:


> Originally Posted by *jsutter71*
> 
> No...My TXP's under water are running at 23-26C idle and hit mid 30s gaming, depending on how warm my office gets. Also, my cards have 0 extra slots spaced between them.


Look at those EK thermal pads.









They stick out like a sore thumb.









Wish they just stuck to gray.


----------



## MrKenzie

I just did a test of temperature vs. clock running Shadow Warrior 2. I had Lo Wang just standing in one spot to make this more accurate.

At 25°C core temp I was seeing 2128 MHz. I turned off my cooling system and let the water temp rise until the core temp hit 48°C, at which point I was getting 2025 MHz.

So there was around a 5% drop from 25°C core to 48°C core. Not a lot, but enough to make a difference!
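That matches how Pascal's GPU Boost sheds clock in fixed ~13 MHz bins as the core warms. A minimal model fitted to the two data points above (the 3 °C-per-bin step and 25 °C reference are assumptions from this one test, not an NVIDIA spec):

```python
def boost_clock(cold_clock_mhz, core_temp_c,
                ref_temp_c=25, step_c=3, bin_mhz=13):
    """Estimate the sustained boost clock at a given core temperature.

    Pascal drops the boost clock in ~13 MHz bins as the core warms;
    step_c (degrees per lost bin) is a rough fit, not a spec value.
    """
    if core_temp_c <= ref_temp_c:
        return cold_clock_mhz
    bins_lost = (core_temp_c - ref_temp_c) // step_c
    return cold_clock_mhz - bins_lost * bin_mhz

print(boost_clock(2128, 25))  # 2128 MHz, the cold reading
print(boost_clock(2128, 48))  # 2037 MHz, within a bin of the observed 2025
```

The takeaway is that every few degrees of water temperature you save buys back one bin, which is why the cold-night runs score better.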


----------



## outofmyheadyo

How many of you have no coil whine with your Titans? Or is it a frequent problem?


----------



## MrKenzie

Quote:


> Originally Posted by *outofmyheadyo*
> 
> How many of you have no coilwhine with your titans? Or is it a frequent problem?


Slight (barely audible) coil whine here. I have to open my case and put my ear about 30cm from the GPU to be able to hear it. This is with water cooling also.


----------



## cisco0623

Quote:


> Originally Posted by *outofmyheadyo*
> 
> How many of you have no coilwhine with your titans? Or is it a frequent problem?


None at all on mine.


----------



## Jpmboy

Quote:


> Originally Posted by *outofmyheadyo*
> 
> How many of you have no coilwhine with your titans? Or is it a frequent problem?


More like only a very few people "have" VRM buzz (there are no coils), and it seems they have it with every card they install, so I doubt it is the cards. They just complain a lot.


----------



## Maintenance Bot

Quote:


> Originally Posted by *outofmyheadyo*
> 
> How many of you have no coilwhine with your titans? Or is it a frequent problem?


No noise here on 2 cards.


----------



## outofmyheadyo

Quote:


> Originally Posted by *Ghostface*
> 
> Thank you for the replies.
> 
> In terms of games where I get it. I get it heavy on the menus for Shadows of Mordor, The Division, Sniper Elite V2, Splinter Cell Conviction. In the games if I have V-Sync on and keep the overclock on the monitor off, I am at a solid 60FPS. The Division is the only game where I don't hit 100fps if I overclock the monitor to 100hz with G-Sync and V-Sync on. The whine comes in at both 60fps and anywhere between 60 and 100fps. In Splinter Cell Conviction during gameplay it quietens down in tight corridor sections and then opens up and whines like crazy in the more open sections, even with no change in the frame rate.
> 
> I am actually away until Sunday evening but I will try loosening the screws on the waterblock and backplate. I have used the washers which came with the EK waterblock as the instructions stated; the backplate itself didn't come with any washers so the screws are just straight in on those. I will try removing the chassis screws that hold the card in place as I know the case wasn't perfect for the card (Phantek Enthoo Primo) so I think there is a bit of tension there holding the card in place.
> 
> I appreciate the help so far and will post back to see if loosening the screws and removing the chassis screws altogether makes a difference. What's been so frustrating so far is the sheer amount of different things I have tried and it seems so unlikely that through different motherboards, PSU's and graphics cards that the cause is the GPU.


Quote:


> Originally Posted by *Jpmboy*
> 
> yeah - unfortunately the same users keep RMAing cards claiming coil whine... one, maybe, but 4 and I'm 100% sure it is not the cards whining.


Quote:


> You test the outlet you are plugged into for a good ground and correct polarity? What about using an uninterruptible power supply? I've used a UPS for over a decade and in that time I've probably had 40+ video cards with zero coil whine. This is with open-air cases too. My Titan XPs also have zero coil whine.


If the polarity of my electrical outlets is wrong, or if there is an issue with the ground (living in an old house), would a UPS help with that?

But what is it that's whining then? I've probably had 5 of my previous cards whine: 2x 980 Ti, 1x Titan X (Maxwell), a GTX 1070 and a GTX 1080. I have tried different power supplies; heck, I even sold my old PC and built a brand new one with all new components, and all of them whine. I have tried different outlets in different rooms of my house, still nothing.
The only thing I have not tried, as "callsignvega" recommended, is a UPS. I don't really feel like spending 200€ on the off chance that it might help, because it makes no sense for it to work.
I also have an open case, and I'm not just complaining for the sake of complaining, but the whine is there and it's real and it's annoying as hell.
The PSU was a Seasonic X-650 in the old PC and now it's an EVGA G2 750.


----------



## Ghostface

The UPS route is the only one I haven't tried in terms of throwing money at the issue.

I am hesitant because, on the back of a forum post a few years ago saying a mains conditioner resolved someone's coil whine, I tried 3 different mains conditioners and none of them made a difference.

Like you, outofmyheadyo, I have had 2x MSI 970 cards (the first time I really noticed coil whine), which I swapped, thinking they were faulty, for 2x Asus Strix 970 cards. The whine was different with the Asus cards but still audible with the case closed while gaming. I then went for a single 980 Ti from Inno3D, which was definitely quieter than the Asus cards, but some games would drive me mad because the buzzing was so loud. During this time I have gone through all permutations of component combinations and a completely new case.

My only two options left are to switch off power to the big electrical items in the house and test, or buy a UPS. The latter I may try ordering from somewhere like Amazon, where you can usually return stuff with no questions asked.

I'll try playing around with some of the big power drain items in the house and let you know if I get any luck from that.


----------



## outofmyheadyo

I had the same idea to order a UPS from somewhere where returns are hassle-free, like Amazon, but sadly here in Europe, and maybe in the US as well, Amazon does not like shipping UPS units for some reason (even small batteries).

I wish someone who has been using a UPS for ages with no coil whine would conduct a little test: simply bypass it for 5 minutes. If the card or cards whine without it, we might have the answer. It should not take long; a quick run in Heaven or whatever would work, or testing out a couple of games, for example.


----------



## Jpmboy

Quote:


> Originally Posted by *outofmyheadyo*
> 
> But what is it that`s whining then ? I probably had 5 of my previous cards coilwhining 2x980ti, 1x titanx(m), gtx 1070 and gtx 1080 have tried different powersupplies, heck I even sold my old pc and build a brand new one with all new components and all of them whine. I have tried different outlets in different rooms in my house, still nothing.
> Only thing I have not tried like " callsignvega " reccomended, is a UPS, dont really feel like spending 200€ on an offchance that it might help, because it makes no sense for it to work.
> I also have an open case, and im not just complaining for the sake of complaining, but the whine is there and it`s real and it`s annoying as hell.
> PSU was seasonic X-650 on the old PC and now it`s an EVGA G2 750


VRM buzz (again, look at the naked PCB: there are no coils) is not likely a component defect, but maybe an assembly (flow bench) defect, if it is a defect at all. CSV's suggestion of using a UPS basically has it acting as a line conditioner; a dedicated conditioner is cheaper than a UPS (e.g., a Tripp Lite hospital-grade Iso-Lator). Now, if as you say all the cards you've had in several setups have whine/buzz, the problem is highly likely to be external to all of the rigs and cards. Really check the grounding of the chassis, PSU and house ground. PSUs have varying ground potentials; make sure that there is no leakage from the PSU to the chassis: put one probe of a good DMM on a non-painted surface of the PSU and the other probe on a non-painted surface of the chassis. Is the reading zero millivolts? If not, run any type of copper wire from a PSU mount screw to a clean chassis ground to shunt some of that ground potential through the wire.
I mean, I had a house ground loop in a custom-made 50 ft RCA (shield-grounded) cable running between a Denon amp and a 2000-watt Carver subwoofer that drove me and everyone here freaking nutz... I had to run a separate common ground between the two units (even though they shared a common socket ground, which was likely not "common") in order to remove the 15Hz hum/rumble, since the two components had different ground potentials.


----------



## Jpmboy

Quote:


> Originally Posted by *Ghostface*
> 
> The UPS route is the only one I haven't tried in terms of throwing money at the issue.
> I am hesitant because on the back of a forum post saying a mains conditioner resolved someone's coil whine a few years a go, I tried 3 different mains conditioners and none of them made a difference.
> Like you outofmyheadyo I have had 2 x MSI 970 cards (first time I really noticed coil whine) which I swapped thinking they were faulty for 2 x Asus Strix 970 cards. The whine was different with the Asus cards but still audible with the case closed and gaming. I then went for a single 980ti by Inno3D which was definitely quieter than the Asus cards but some games would drive me made as the buzzing would be so loud. During this time I have gone through all permutations of component combinations and a completely new case.
> My only two options left are to switch off power to the big electrical items in the house and test it, or buy a UPS. The latter I may try ordering from somewhere like Amazon where you can usually return stuff with no questions asked.
> I'll try playing around with some of the big power drain items in the house and let you know if I get any luck from that.


IME, the only type of mains conditioner that might work is an induction type. Check out the Tripp Lite products.


----------



## Ghostface

Please could you confirm what you mean by clean chassis ground?

If I remember rightly my old Seasonic PSU had a grounding wire with it which I had connected, that was back when I had my 970 cards and they both had the buzz/vrm interference. Happy to try something like that again though if you can clarify what I should connect the grounding cable to.

Cheers.


----------



## Jpmboy

Quote:


> Originally Posted by *Ghostface*
> 
> Please could you confirm what you mean by clean chassis ground?
> 
> If I remember rightly my old Seasonic PSU had a grounding wire with it which I had connected, that was back when I had my 970 cards and they both had the buzz/vrm interference. Happy to try something like that again though if you can clarify what I should connect the grounding cable to.
> 
> Cheers.


What I mean by clean is one where there is good bare metal-to-wire contact, that's all. This really only works if there is a ground-potential problem, but it is easy to try.


----------



## KillerBee33

Quote:


> Originally Posted by *bizplan*
> 
> When I play Doom at 2160p (100-120 FPS range, 6700K @ 4.7, single TXP, XB321HK 4K/60Hz monitor), my temps max out at 55C (using EVGA hybrid kit push/pull), when I limit FPS to 59 (which I usually do to maintain G-Sync) my max temps are 48C. Obviously lower frame rates means less work for the CPU, CPU package temps avg. 50C @~1.35V, maybe 10-15C higher at max FPS. I also have 16Gb of G-Skill Trident Z memory at 3600 MHz, that raises my CPU core/package temps quite a bit as CPU/memory transact at higher frequencies.


Tried DOOM3 Maxed out @ 2160p...40-42 MAX


----------



## feznz

Quote:


> Originally Posted by *Ghostface*
> 
> The UPS route is the only one I haven't tried in terms of throwing money at the issue.
> 
> I am hesitant because on the back of a forum post saying a mains conditioner resolved someone's coil whine a few years a go, I tried 3 different mains conditioners and none of them made a difference.
> 
> Like you outofmyheadyo I have had 2 x MSI 970 cards (first time I really noticed coil whine) which I swapped thinking they were faulty for 2 x Asus Strix 970 cards. The whine was different with the Asus cards but still audible with the case closed and gaming. I then went for a single 980ti by Inno3D which was definitely quieter than the Asus cards but some games would drive me made as the buzzing would be so loud. During this time I have gone through all permutations of component combinations and a completely new case.
> 
> My only two options left are to switch off power to the big electrical items in the house and test it, or buy a UPS. The latter I may try ordering from somewhere like Amazon where you can usually return stuff with no questions asked.
> 
> I'll try playing around with some of the big power drain items in the house and let you know if I get any luck from that.


Or take the PC to another person's house to see if it is a power service problem.

I personally like to run a UPS just for the peace of mind of knowing I still have my VoIP phone and Internet if there is a power outage, with the benefit of "clean" power delivery to the modem, PC, phone, screens and printer.
I could probably run a few days with just the phone and modem.

We've had quite a few earthquakes here in the last few years, and with software monitoring I can tell you that the power can easily have a dozen spikes a day.


----------



## cookiesowns

A UPS will really only help if it's an online UPS, assuming power is the problem.

A line-interactive unit will be essentially useless for cleaning up power.

That said, I have VRM whine on mine, but only in high-FPS scenarios. I'm also on an old AX1200 PSU.

I don't think I've ever encountered a card with no whine at all. It's really only noticeable when you take off the stock heatsink, IMO.


----------



## HyperMatrix

Finally got around to installing my Aqua Computer blocks. Used Thermal Grizzly Conductonaut on the GPU and on the resistors. Idle temps are at 23°C. With Conductonaut on 2 resistors, TDP under max load is between 50-60% when going full blast. As for OC levels, I can only get to about 2126 MHz core and 11610 MHz on the memory stable. Beyond that it starts artifacting due to lack of voltage. I'm still using the +100mV generic voltage setting in MSI Afterburner. Is there any way to get some more juice to this sucker?


----------



## ttg35fort

Quote:


> Originally Posted by *Jpmboy*
> 
> that's a very high VSA for X99. Unless you are running 128GB of RAM, it's totally unnecessary. Drop VSA to 1.05V or so, and increase VDIMM instead. CPU VCCIO up a notch or two would be a better balance.


Thank you for the guidance. I dropped the VSA to 1.05V and increased the CPU VCCIO to 1.1V. It worked like a charm. (I already had the memory voltage increased.)


----------



## Ghostface

Quote:


> Originally Posted by *Jpmboy*
> 
> what I mean by clean is one where there is good bare metal-to-wire contact, that's all. This really only works if there is a ground-potential problem. But it is easy to try.


Cheers, but unfortunately no cigar.

I have also tried temporarily turning off the fridge/freezers in the house and then re-testing on a scene where it is particularly loud and it was still there.

What's really odd is that using the MSI Afterburner graphs, I have picked Splinter Cell Conviction and Shadows of Mordor to compare usage. Between the two I have managed to find two scenes which hit the same max Power Limit on the card, the same Core Clock speed, in Splinter Cell no buzzing, in Shadows of Mordor, a really loud hiss. Moving the camera in Shadows of Mordor and navigating the map a bit more brings the hiss back, but it also changes the load on the card based on the monitoring. Likewise in Splinter Cell, the more open the map becomes, the louder it gets.

From the menu in Splinter Cell Blacklist, if I start off with low graphics settings it is silent. Then as I crank them up and turn AA to 4x, I am back hissing again, and this is at 6-8 feet away from the case.


----------



## outofmyheadyo

I would gladly test the UPS solution, but sadly most stores where I live only stock crap, and returning things is complicated to say the least.
Perhaps some of you guys who live in the US could test the UPS theory; you should have pretty relaxed return policies there, and even ordering one from Amazon would make returns easy.


----------



## Jpmboy

Quote:


> Originally Posted by *cookiesowns*
> 
> UPS will really only help if it's an online UPS assuming power is the problem.
> 
> Line interactive will essentially be useless for cleaning up power.
> 
> That said. I have VRM whine *on mine but only in high FPS scenarios.* I'm also on an old AX1200 PSU
> 
> I don't think I've ever encountered a card with no whine ever. It's really only notable when you take off the stock heat sink IMO


^^ This, or a static screen in 3D mode, and then I can hear a buzz that has been present in every card I have had in the last 2-3 years (dozens).
Quote:


> Originally Posted by *ttg35fort*
> 
> Thank you for the guidance. I dropped the VSA to 1.05V and increased the CPU VCCIO to 1.1V. It worked like a charm. (I already had the memory voltage increased.)


you're welcome.


----------



## jsutter71

I'm taking a look at the Intel Xeon E5-2687W v4 for a bunch of different uses, but also some gaming. I figure if I was going to drop $1500 on a 6950X then I might consider the Xeon too. Any suggestions? Currently running a 5930K and dual TXPs.


----------



## MunneY

Quote:


> Originally Posted by *jsutter71*
> 
> I'm taking a look at the Intel Xeon E5-2687W v4 for a bunch of different uses but also some gaming. I figure if I was going to drop $1500 for a 6950X then I might consider the Xeon also. Any Suggestions? Currently running a 5930K and Dual TXPs.


The only real downside to the Xeons is you get about a 10% OC via BCLK if you are lucky.

Depending on your budget... I'd suggest looking for an ES chip or 2 :-D


----------



## jsutter71

Quote:


> Originally Posted by *MunneY*
> 
> The only real downside to the Xeons is you get about a 10% OC via BCLK if you are lucky.
> 
> Depending on your budget... I'd suggest looking for an ES chip or 2 :-D


That particular Xeon chip mirrors the 6950X with 2 more cores. I'm not changing my motherboard, which is an Asus X99-E WS USB 3.1.


----------



## KillerBee33

Both the .57 and .63 drivers are crashing my TXP when folding... anyone else getting this?


----------



## jodasanchezz

Well, after my RMA to Nvidia I finally got a new card (much better than my first).

As my last Titan broke after roundabout 6 weeks, I'm a little afraid to push this card as hard as the first one.

I'm asking here for recommended RAM speeds: should I be fine with +400 on the RAM for 24/7 use?
I've figured out that +400-450 MHz on the RAM gives me the biggest boost in fps.


----------



## jodasanchezz

What do you mean by folding?

BTW: the RMA is done. It took a little longer than yours here in Germany (14 days), but a new card was shipped.


----------



## KillerBee33

Quote:


> Originally Posted by *jodasanchezz*
> 
> what do u mean with folding ?


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> both .57 and .63 drivers crashing TXP when folding...anyone else getting this?


no I haven't.. is that at stock clocks?


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> no I haven't.. is that at stock clocks?


Yeap...all stock.
Everything else runs just fine btw.


----------



## cookiesowns

Quote:


> Originally Posted by *jsutter71*
> 
> I'm taking a look at the Intel Xeon E5-2687W v4 for a bunch of different uses but also some gaming. I figure if I was going to drop $1500 for a 6950X then I might consider the Xeon also. Any Suggestions? Currently running a 5930K and Dual TXPs.


Do you know if you can actually leverage the extra 2 cores? Remember you're also sacrificing single-thread and memory performance by going with a locked CPU.

Personally, if you have the cooling, I would get a high-binned 6950X and run it 1-2 clock bins lower with reduced voltages for your use case. It will outperform the Xeon without sacrificing an ounce of gaming performance.

I run a 6950X @ 4.4 GHz, and a dual E5-2676 v3 or E5-2683 v3 system isn't that much faster in x265 or heavily threaded workloads.


----------



## kx11

Quote:


> Originally Posted by *KillerBee33*
> 
> both .57 and .63 drivers crashing TXP when folding...anyone else getting this?


not me

running it at full workload mode in chrome (latest update) with geforce xxx.63


----------



## KillerBee33

Quote:


> Originally Posted by *kx11*
> 
> not me
> 
> running it at full workload mode in chrome (latest update) with geforce xxx.63


Ran a bunch of tests and 4K gaming without issues, but folding knocks the GPU off.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Ran a bunch of Tests and 4K gaming without issues but Folding knocks GPU off


This is the new OpenCL: https://drive.google.com/open?id=0B7gpMyj43ZFjR2FGdTFzbUQzWTA
You can also get it directly from the Intel site. It's the best performing yet.

BTW, are you getting a driver crash or a system restart?


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> this is the new openCL : https://drive.google.com/open?id=0B7gpMyj43ZFjR2FGdTFzbUQzWTA
> you can also get it directly from the INtel site. it's the best performing yet.
> 
> btw - are you getting a driver crash or system restart?


Neither of those, just a screen flash, and the taskbar Folding icon says "One Slot Failed"...


----------



## mbze430

Nvidia's recent drivers are a NIGHTMARE!


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> None of those , just a screen flash and it says One Slot Failed on taskbar Folding Icon..


Might be the card or the slot... does that board have PCIe lane switches (so you can test each card in-situ)?


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> might be the card or the slot... does that board have PCIE lane switches (so you can test each card "in-situ".


I've refilled my loop 6 times in the past 1.5 months. Not going through that again; it runs fine in all the tests and all the games.
WEIRD... it's doing fine NOW.








Hmm... it's fine now, 20 minutes in at full load.


----------



## jsutter71

Quote:


> Originally Posted by *mbze430*
> 
> Nvidia's recent drivers are NIGHTMARE!


They released another update today to fix the update from Friday. My Start10 was all messed up until I loaded today's drivers.


----------



## KillerBee33

Quote:


> Originally Posted by *jsutter71*
> 
> They released another update today to fix the update from Friday. My start10 was all messed up until I loaded todays drivers.


Other than .63?


----------



## jsutter71

Quote:


> Originally Posted by *KillerBee33*
> 
> Other than .63?


.63 is the latest.


----------



## clipse84

Sold my Tri-SLI Titan Xs (Maxwell) for $2,400 and purchased these babies instead. Time to start overclocking. Can't wait for a custom BIOS.


----------



## skypine27

Quote:


> Originally Posted by *clipse84*
> 
> 
> 
> 
> 
> 
> Sold My Tri Sli Titans X (Maxwell) for $2,400. And Purchase These Baby's Instead. Time To Start Overclocking. Can't Wait for Custom Bios


Great looking build. I love the silver EK backplates.

I don't think you'll have much luck with a modded BIOS. All of us have been waiting a long time for one with no luck, unless I recently missed something.


----------



## jsutter71

Quote:


> Originally Posted by *skypine27*
> 
> Great looking build. I love the silver EK backplates.
> 
> I don't think you'll have much luck with a modded bios. All of us have been waiting a long time for one with no luck unless I recently missed something.


I have the same blocks and backplates for my Titans... They are absolutely beautiful and my temps are very cool.


----------



## skypine27

Quote:


> Originally Posted by *jsutter71*
> 
> I have the same blocks and back plates for my Titans...The are absolutely beautiful and my temps are very cool.


Nice!

I run the same blocks, but did a minor mod to allow me to run the factory backplates even though EK says they aren't officially supported (EK backplates weren't available at the time and I was tired of waiting to build!):
http://s82.photobucket.com/user/skypine27/media/IMG_3601.jpg.html


----------



## Menthol

I can't lock the clocks on more than one card at a time using Ctrl+L. Can anyone help, please?


----------



## pompss

Selling two waterblocks for the Pascal Titan X. If someone is interested, here are the links:

ek waterblock

http://www.overclock.net/t/1614492/ek-pascal-titan-x-waterblock-like-new-opened

acquacomputer waterblock + backplate kryographics Pascal NVIDIA TITAN X, active XCS

http://www.overclock.net/t/1614493/aquacomputer-pascal-titan-x-waterblock-backplate-kryographics-pascal-nvidia-titan-x-active-xcs


----------



## scottb75

Quote:


> Originally Posted by *Iceman2733*
> 
> Anyone upgrade here from 2x 1080 to a Titan XP? I recently picked up two EVGA 1080 FTW but now thinking I want to send them back and maybe go for a Titan XP and add a second one around tax season. But wanted to see how the real life performance is between these setups
> 
> Sent from my iPhone using Tapatalk


I'm going that route right now. I currently have 2x 1080 FTWs, and today I ordered a single Titan XP, which I should get Thursday or Friday. If the Titan works out I'll probably sell the 1080s.


----------



## Iceman2733

Quote:


> Originally Posted by *scottb75*
> 
> I'm going that route right now. I currently have 2 x 1080 FTW and today I ordered a single TItan XP which I should get Thursday or Friday. If the Titan works out I'll probably sell the 1080s.


Let me know what you think, please. Depending on which review you read, it differs how far apart they are: some reviews have two 1080s beating a Titan, other reviews have the Titan beating the two 1080s.


----------



## jhowell1030

Quote:


> Originally Posted by *Iceman2733*
> 
> Let me know what you think please. Depending on what review you read defers on how much different they are, some reviews has two 1080 beating a titan other reviews have the two 1080 beating a Titan


So all reviews are showing the same thing then?


----------



## Iceman2733

Quote:


> Originally Posted by *jhowell1030*
> 
> So all reviews are showing the same thing then?


lmao, these 16hr shifts are taking their toll. As you can tell, I have corrected it now lol


----------



## bizplan

Quote:


> Originally Posted by *Iceman2733*
> 
> Let me know what you think please. Depending on what review you read defers on how much different they are, some reviews has two 1080 beating a titan other reviews have the titan beating the two 1080


All of us Titan XP owners are [by now] burned out on the two 1080s in SLI vs. one Titan XP debate.


----------



## chronicfx

Quote:


> Originally Posted by *jsutter71*
> 
> I have the same blocks and back plates for my Titans...The are absolutely beautiful and my temps are very cool.


You should call this build the Ba Da Bing! It looks like a stripper stage! Nice build


----------



## jsutter71

Quote:


> Originally Posted by *chronicfx*
> 
> You should call this build the Ba Da Bing! It looks like a stripper stage! Nice build


Thank you. Today I ordered a 6950X from Silicon Lottery, so I'm looking forward to adding that into the mix.


----------



## jodasanchezz

Hi guys,

How is your performance in BF1 multiplayer, 64 players (for example)?

I'm just asking because I think I'm running into a CPU bottleneck, but I'm not sure.

System:

CPU: 6600K @ 4.5 GHz (don't blame me for this, my Z97 board just died and I'm waiting for a 7700K; I had this tiny sucker in my living-room rig)
16 GB DDR4-3000
Asus Hero VIII
Titan X @ 2050 / 5400

I noticed in Afterburner when I run BF1 at 4K Ultra TAA (85-115 fps):

My CPU is running at 95-100% load; my GPU is also at 99% but sometimes drops to 78%.

> This is the bottleneck, I think.
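That read is right: a CPU pegged near 100% while the GPU dips well below full load is the classic CPU-bound signature. As a sketch, here's how you could label samples from an Afterburner log (the 95%/90% thresholds are rules of thumb I'm assuming, not fixed constants):

```python
def classify_bottleneck(cpu_load_pct, gpu_load_pct,
                        cpu_thresh=95, gpu_thresh=90):
    """Label one utilisation sample: a near-pegged CPU feeding an
    under-utilised GPU suggests the CPU is the limiting factor."""
    if cpu_load_pct >= cpu_thresh and gpu_load_pct < gpu_thresh:
        return "CPU-bound"
    if gpu_load_pct >= gpu_thresh:
        return "GPU-bound"
    return "inconclusive"

# The numbers from the post: CPU at 95-100%, GPU dipping to 78%
print(classify_bottleneck(98, 78))  # CPU-bound
print(classify_bottleneck(60, 99))  # GPU-bound
```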


----------



## feznz

Quote:


> Originally Posted by *Iceman2733*
> 
> Let me know what you think please. Depending on what review you read defers on how much different they are, some reviews has two 1080 beating a titan other reviews have the titan beating the two 1080


On paper, 1080 SLI should whip a single Titan, but in practice the immature drivers are failing, and in all honesty some games will never be fully patched to run SLI.


----------



## KillerBee33

The HULK. lil Green Hulk


----------



## jodasanchezz

Quote:


> Originally Posted by *KillerBee33*
> 
> The HULK. lil Green Hulk


Very Nice


----------



## mouacyk

Quote:


> Originally Posted by *KillerBee33*
> 
> The HULK. lil Green Hulk


Gonna get gamma radiation from all that green


----------



## cookiesowns

Quote:


> Originally Posted by *jodasanchezz*
> 
> Hi guys,
> 
> How is your performance in BF1 multiplayer, 64 players (for example)?
> 
> I'm just asking because I think I'm running into a CPU bottleneck, but I'm not sure.
> 
> System:
> 
> CPU: 6600K @ 4.5 GHz (don't blame me for this, my Z97 board just died and I'm waiting for a 7700K; I had this tiny sucker in my living-room rig)
> 16 GB DDR4-3000
> Asus Hero VIII
> Titan X @ 2050 / 5400
> 
> I noticed in Afterburner when I run BF1 at 4K Ultra TAA (85-115 fps):
> 
> My CPU is running at 95-100% load; my GPU is also at 99% but sometimes drops to 78%.
> 
> > This is the bottleneck, I think.


With 64 players your 6600K is definitely a bottleneck. A 6700K will improve things, but nowadays, especially on game engines that are much better threaded, a 6-core or higher will do much better.

High-refresh-rate screens only make things more CPU-centric as well.

I could easily CPU-bottleneck a 4790K back in the day with just CF R9 280Xs in BF4 at 1440p/60 and 1080p/144+.


----------



## uggy

What's your experience with OC on the Titan XP?
I have 2 on water and I'm getting +200 on the core and +550 on the memory, but I don't get a stable voltage at 1.092, so my core clock comes down to around 2070ish.

Do you guys think I should back off the memory overclock, so I don't get a power problem?

What's your OC on 2 Titan XPs on water?


----------



## TremF

Quote:


> Originally Posted by *uggy*
> 
> What's your experience with OC on the Titan XP?
> I have 2 on water and I'm getting +200 on the core and +550 on the memory, but I don't get a stable voltage at 1.092, so my core clock comes down to around 2070ish.
> 
> Do you guys think I should back off the memory overclock, so I don't get a power problem?
> 
> What's your OC on 2 Titan XPs on water?


I have to say my TXP is on the stock air cooler and not overclocked. The most I have done is up the power limit to 120%, up the temp limit to 90°C, and set up a custom fan profile.

With my i7 4930K running at 4.3GHz (cooled with a Corsair H115i; it never goes over 40°C in any game), I am able to run any game on my 32" 4K Acer Predator XB321HK G-Sync monitor with everything set to Ultra/Max except AA, which isn't needed.

Seeing as it runs so well, I don't see the point in overclocking. The 4K G-Sync monitor looks after any frames that drop below 60, and everything looks and plays fantastic.









Sorry I'm not really helping but the TXP is an utter beast and, along with the monitor, has been my best purchase this year.


----------



## Dr Mad

Quote:


> Originally Posted by *uggy*
> 
> What's your experience with OC on the Titan XP?
> I have 2 on water and I'm getting +200 on the core and +550 on the memory, but I don't get a stable voltage at 1.092, so my core clock comes down to around 2070ish.
> 
> Do you guys think I should back off the memory overclock, so I don't get a power problem?
> 
> What's your OC on 2 Titan XPs on water?


Your overclock is pretty much the same as mine, assuming you did the CLU mod; otherwise you're likely being power-limited at 1.092v.
+200 core / +600 memory / +100mV results in 2076 at 1.08v, at 80-85% TDP in heavy games at 3440x1440.
(2050 if voltage remains at stock.)

+210 = 2088 at 1.092v, but as soon as the core exceeds 32°C, it drops to 2076 at 1.08v.

Some lucky guys here can do 2114 with just +200 / stock voltage and watercooling.


----------



## Woundingchaney

Quote:


> Originally Posted by *Dr Mad*
> 
> Your overclock is pretty much the same as mine, assuming you did the CLU mod; otherwise you're likely being power-limited at 1.092v.
> +200 core / +600 memory / +100mV results in 2076 at 1.08v, at 80-85% TDP in heavy games at 3440x1440.
> (2050 if voltage remains at stock.)
> 
> +210 = 2088 at 1.092v, but as soon as the core exceeds 32°C, it drops to 2076 at 1.08v.
> 
> Some lucky guys here can do 2114 with just +200 / stock voltage and watercooling.


You can come down on the memory, but ultimately 2070 is a good overclock and approximately where you'll stabilize. The core starts to downclock slightly even at relatively low operating temperatures.


----------



## uggy

Quote:


> Originally Posted by *Dr Mad*
> 
> Your overclock is pretty much the same as mine, assuming you did the CLU mod; otherwise you're likely being power-limited at 1.092v.
> +200 core / +600 memory / +100mV results in 2076 at 1.08v, at 80-85% TDP in heavy games at 3440x1440.
> (2050 if voltage remains at stock.)
> 
> +210 = 2088 at 1.092v, but as soon as the core exceeds 32°C, it drops to 2076 at 1.08v.
> 
> Some lucky guys here can do 2114 with just +200 / stock voltage and watercooling.


What is the CLU mod? Sorry for asking, but I have never heard of that.
I have water on mine









Quote:


> Originally Posted by *Woundingchaney*
> 
> You can come down on the memory, but ultimately 2070 is a good overclock and approximately where you'll stabilize. The core starts to downclock slightly even at relatively low operating temperatures.


Is it a good overclock for water, or for stock? Maybe I am being too demanding of my two Titan XPs on water









Do I lose some FPS by downclocking my memory a little, or do I gain from it? Is memory not that important on Pascal?


----------



## jhowell1030

Quote:


> Originally Posted by *uggy*
> 
> What is the CLU mod? Sorry for asking, but I have never heard of that.
> I have water on mine
> 
> Is it a good overclock for water, or for stock? Maybe I am being too demanding of my two Titan XPs on water
> 
> Do I lose some FPS by downclocking my memory a little, or do I gain from it? Is memory not that important on Pascal?


Many people, both here and in reviews, have found that OCing the memory beyond +500 doesn't get them any benefit in FPS whatsoever. Some folks have even had results diminish when pushing the memory further, even though it OCed stable.


----------



## mouacyk

The problem is the power limit. The power limit on the TXP is so low that it's not worth wasting it on memory overclocking when you're already at 10GHz effective and 480 GB/s. Remember, the previous flagship could only reach 8GHz with a decent OC, being 7GHz at stock. Unless you're aiming for 4K+ and 60Hz+, memory OC should not be a priority, considering the limited power.
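For reference, that 480 GB/s figure falls straight out of the bus width and the effective transfer rate. A quick sanity check (the constants are the stock Titan X Pascal values mentioned above; the helper name is mine):

```python
BUS_WIDTH_BITS = 384  # Titan X Pascal memory bus width

def bandwidth_gb_s(effective_mt_s):
    """Memory bandwidth in GB/s from effective transfer rate in MT/s:
    transfers/s * bits per transfer / 8 bits per byte, scaled to GB."""
    return effective_mt_s * BUS_WIDTH_BITS / 8 / 1000

print(bandwidth_gb_s(10_000))  # stock 10 GT/s effective -> 480.0 GB/s
```

The same formula shows why a +500 memory offset buys only a few percent more bandwidth while eating into the shared power budget.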


----------



## scohen158

Any word on an official AIO for the Titan XP? I've been trying to wait patiently, but I'd like to get lower temps.


----------



## KillerBee33

Quote:


> Originally Posted by *scohen158*
> 
> Any word on an official AIO for the Titan XP? I've been trying to wait patiently, but I'd like to get lower temps.


Wait for the 1080 Ti to come out; it might have the same PCB, and it won't be Nvidia-exclusive, meaning there will be AIOs for it. It won't say TITAN X on its shroud but will most likely fit the TXP.
Again, just an assumption.


----------



## Seyumi

Quote:


> Originally Posted by *scohen158*
> 
> Any word on an official AIO for the Titan XP? I've been trying to wait patiently, but I'd like to get lower temps.


Yesterday EVGA_Jacob said "TITAN is on the way..." in the Titan X Hybrid petition thread on the official forum. We've unfortunately been hearing this since the first confirmation back on July 25th, though then it was more like "in the works" or "working on it." With a solid month+ of silence from him and only now this announcement, it should hopefully be out relatively soon.


----------



## jsutter71

Quote:


> Originally Posted by *Seyumi*
> 
> Yesterday EVGA_Jacob said "TITAN is on the way..." in the Titan X Hybrid petition thread on the official forum. We've unfortunately been hearing this since the first confirmation back on July 25th, though then it was more like "in the works" or "working on it." With a solid month+ of silence from him and only now this announcement, it should hopefully be out relatively soon.


Oh well... I have been using EVGA video cards exclusively since the time they provided *LIFETIME* warranties for their cards. So long ago now. Anyone else remember that? The first break from that was my TXPs. I must say that even though I was skeptical about purchasing a card from a company that refused to allow other companies to sell it, so far my cards have met all my expectations.


----------



## bizplan

Quote:


> Originally Posted by *jsutter71*
> 
> Oh well... I have been using EVGA video cards exclusively since the time they provided *LIFETIME* warranties for their cards. So long ago now. Anyone else remember that? The first break from that was my TXPs. I must say that even though I was skeptical about purchasing a card from a company that refused to allow other companies to sell it, so far my cards have met all my expectations.


Skeptical about buying directly from Nvidia?


----------



## ttg35fort

Quote:


> Originally Posted by *jhowell1030*
> 
> Many people, both here and in reviews, have found that OCing the memory beyond +500 doesn't get them any benefit in FPS whatsoever. Some folks have even had results diminish when pushing the memory further, even though it OCed stable.


Tonight I ran Fire Strike Extreme with different memory overclock settings. 600 MHz gave me the highest score. I lost about 500 pts. at 500 MHz. I am water cooled with 280mm and 420mm radiators.


----------



## jsutter71

Quote:


> Originally Posted by *bizplan*
> 
> Skeptical about buying directly from Nvidia?


Yes. Before I bought them I read through this thread. What stuck out for me was a lot of discussion about bad cards related to water cooling and poor customer service. EVGA has great customer service and my cards are water cooled.


----------



## jhowell1030

Quote:


> Originally Posted by *jsutter71*
> 
> Yes. Before I bought them I read through this thread. What stuck out for me was a lot of discussion about bad cards related to water cooling and poor customer service. EVGA has great customer service and my cards are water cooled.


Which is exactly why I used FPS as a frame of reference vs synthetic benchmarking.

That being said...every card is different.


----------



## piee

Now playing 4K at 80Hz on the new monitor; UFO test passed.


----------



## giggsy07

Patiently waiting on this Hybrid kit. Considering the 1080 kit mod, but afraid I may f**k it up.


----------



## Lobotomite430

Quote:


> Originally Posted by *giggsy07*
> 
> Patiently waiting on this Hybrid kit. Considering the 1080 kit mod, but afraid I may f**k it up.


If you are impatient I can make you one; I'm selling them for $150 and have a listing on eBay for it. Maybe EVGA saw my listing and is pushing up the release of the Titan kit to put me out of business


----------



## mouacyk

Quote:


> Originally Posted by *piee*
> 
> Now playing 4K at 80Hz on the new monitor; UFO test passed.


Minecraft?


----------



## KillerBee33

Quote:


> Originally Posted by *mouacyk*
> 
> Minecraft?


The Borderlands series (1, 2, 3) runs great @ 4K with PhysX on in single player.


----------



## Seyumi

Quote:


> Originally Posted by *Lobotomite430*
> 
> If you are impatient I can make you one; I'm selling them for $150 and have a listing on eBay for it. Maybe EVGA saw my listing and is pushing up the release of the Titan kit to put me out of business


Not a bad idea. I may buy 2 from you. I'm waiting until November 13 or so, when I plan on buying my 2nd Titan X Pascal, and if EVGA hasn't released by then I'm just getting two 1080 kits, or two of your modded kits if you'd be willing to make them.


----------



## Lobotomite430

Quote:


> Originally Posted by *Seyumi*
> 
> Not a bad idea. I may buy 2 from you. I'm waiting until November 13 or so, when I plan on buying my 2nd Titan X Pascal, and if EVGA hasn't released by then I'm just getting two 1080 kits, or two of your modded kits if you'd be willing to make them.


I would gladly do so!


----------



## Seyumi

Quote:


> Originally Posted by *Lobotomite430*
> 
> I would gladly do so!


Do you dremel the shroud as well, to make room for the extra pin connectors? You only mentioned modifying the base-plate in the listing.


----------



## bizplan

Quote:


> Originally Posted by *jsutter71*
> 
> Yes. Before I bought them I read through this thread. What stuck out for me was a lot of discussion about bad cards related to water cooling and poor customer service. EVGA has great customer service and my cards are water cooled.


You may recall our friend in this forum who broke his new TXP while affixing a water-cooling hybrid kit. Nvidia replaced his card with a new one despite him clearly voiding the warranty; he was totally honest with them and they took care of him anyway. In his case Nvidia was most accommodating, as he repeatedly attested in this forum.


----------



## KillerBee33

Quote:


> Originally Posted by *bizplan*
> 
> You may recall our friend in this forum who broke his new TXP while affixing a water-cooling hybrid kit. Nvidia replaced his card with a new one despite him clearly voiding the warranty; he was totally honest with them and they took care of him anyway. In his case Nvidia was most accommodating, as he repeatedly attested in this forum.


Burned my first TXP; out and back in, in 5 days. Not sure where this bad rep for Nvidia is coming from.


----------



## Nizzen

Quote:


> Originally Posted by *piee*
> 
> Now play 4k at 80hz on new monitor, ufo tested success


What monitor?

No g-sync is epic fail


----------



## jsutter71

Quote:


> Originally Posted by *bizplan*
> 
> You may recall our friend in this forum who broke his new TXP while affixing a water-cooling hybrid kit. Nvidia replaced his card with a new one despite him clearly voiding the warranty; he was totally honest with them and they took care of him anyway. In his case Nvidia was most accommodating, as he repeatedly attested in this forum.


Good to know.


----------



## bizplan

Quote:


> Originally Posted by *KillerBee33*
> 
> Burned my first TXP; out and back in, in 5 days. Not sure where this bad rep for Nvidia is coming from.


I think Nvidia knows they have the market cornered with the TXP and that many folks are buying more than one of them at 1,200 bucks a pop. They know they have to take good care of us!

Else we buy AMD!


----------



## jsutter71

Quote:


> Originally Posted by *bizplan*
> 
> I think Nvidia knows they have the market cornered with the TXP and that many folks are buying more than one of them at 1,200 bucks a pop. They know they have to take good care of us!
> 
> Else we buy AMD!










I think the last AMD anything I owned was a video card 10 years ago.


----------



## KillerBee33

Quote:


> Originally Posted by *bizplan*
> 
> I think Nvidia knows they have the market cornered with the TXP and that many folks are buying more than one of them at 1,200 bucks a pop. They know they have to take good care of us!
> 
> Else we buy AMD!


5700SE, 6800GT, 7900GS, GT555, 760Ti, 970, 980, 1080, and now TXP. Not a fanboy, but it just so happens I've neva had an AMD and most likely neva will


----------



## bizplan

Quote:


> Originally Posted by *KillerBee33*
> 
> 5700SE, 6800GT, 7900GS, GT555, 760Ti, 970, 980, 1080, and now TXP. Not a fanboy, but it just so happens I've neva had an AMD and most likely neva will


When I can afford it, I might buy that $10,000 (http://wccftech.com/amd-dracrays-vega-10/) AMD card with 1 TB of memory!


----------



## KillerBee33

Quote:


> Originally Posted by *bizplan*
> 
> When I can afford it, I might buy that $10,000 (http://wccftech.com/amd-dracrays-vega-10/) AMD card with 1 TB of memory!


Ehh, just be waitin for when they fit 100 TXPs in that lil nVidia Shield


----------



## bizplan

Y'all check this out:

{Build Log} Upside Down S8 project.
http://www.overclock.net/t/1608897/build-log-upside-down-s8-project

He is spending over $6K just on water cooling for a 6700K machine.

Partly for Show


----------



## eliau81

Does my 4790K CPU bottleneck my Titan XP?


----------



## Maintenance Bot

Quote:


> Originally Posted by *eliau81*
> 
> Does my 4790K CPU bottleneck my Titan XP?


Try some of the maps in BF1 if you have it. BF1 hammers the 6700K here atm.


----------



## jsutter71

Does this look right? Using the drivers that were posted yesterday. It won't let me add my results because it doesn't like the new drivers yet.
http://www.3dmark.com/fs/10622261


----------



## oquraishi

Remember that when you overclock, both the core clock and the memory clock draw on the same power and voltage budget.
Dial back your memory overclock and you'll be able to put a bit more into your core overclock, or vice versa, to strike the optimal balance.
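The trade-off above can be pictured with a toy model. This is not how the firmware actually allocates power; the numbers and the function are purely illustrative assumptions to show why freeing watts on the memory side leaves headroom for the core.

```python
BOARD_POWER_W = 300  # illustrative TXP-class board power limit, not official

def core_headroom(memory_power_w, rest_of_board_w=60):
    """Watts left for the core after memory and fixed board draw.
    All inputs are hypothetical: the point is that core and memory
    share one fixed budget, so one side's draw caps the other's."""
    return BOARD_POWER_W - memory_power_w - rest_of_board_w

print(core_headroom(50))  # 190
print(core_headroom(40))  # dialing memory back 10 W frees 10 W for core: 200
```

In practice you find the balance empirically, but the arithmetic is why a big memory offset can cost you core clocks at the power limit.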


----------



## ENZO1

EVGA 1080/1070s are catching fire due to overheating VRMs; does this affect those of us who watercooled our Titan XPs?

Please reference this article: http://wccftech.com/nvidia-gtx-1080-1070-evga-cards-dying/
Guys who watercooled your Titan XPs: do you think our MOSFETs are safe or not? I used a 980 Ti water cooler to cool my Titan. It looks like if the MOSFETs are not properly cooled they will fry.


----------



## outofmyheadyo

What are u on about? The watercooled cards are cooled properly because the companies that manufactured the waterblocks actually took the time to make things fit; EVGA just said "looks close enough, let's go"...


----------



## hotrod717

Quote:


> Originally Posted by *bizplan*
> 
> Y'all check this out:
> 
> {Build Log} Upside Down S8 project.
> http://www.overclock.net/t/1608897/build-log-upside-down-s8-project
> 
> He is spending over $6K just on water cooling for a 6700K machine.
> 
> Partly for Show


LOL. Unfortunately ambient temps become limiting at probably a little over $1000. A chiller is cheaper and better!


----------



## bizplan

Quote:


> Originally Posted by *hotrod717*
> 
> LOL. Unfortunately ambient temps become limiting at probably a little over $1000. A chiller is cheaper and better!


Yeah & all this is supposed to work upside down!


----------



## bizplan

Quote:


> Originally Posted by *ENZO1*
> 
> EVGA 1080/1070s are catching fire due to overheating VRMs; does this affect those of us who watercooled our Titan XPs?
> 
> Please reference this article: http://wccftech.com/nvidia-gtx-1080-1070-evga-cards-dying/
> Guys who watercooled your Titan XPs: do you think our MOSFETs are safe or not? I used a 980 Ti water cooler to cool my Titan. It looks like if the MOSFETs are not properly cooled they will fry.


Re: EVGA 980ti hybrid kit on TXP, Nvidia stock fan seems to keep MOSFETS cool enough (been running OC'd under full load for hours on end, now for several months).


----------



## MunneY

Quote:


> Originally Posted by *hotrod717*
> 
> LOL. Unfortunately ambient temps become limiting at probably a little over $1000. A chiller is cheaper and better!


yes it is, but then you gotta worry about condensation!


----------



## mouacyk

Quote:


> Originally Posted by *outofmyheadyo*
> 
> What are u on about? The watercooled cards are cooled properly because the companies that manufactured the waterblocks actually took the time to make things fit; EVGA just said "looks close enough, let's go"...


Well, there's that... and then there's the possibility of an inadequate VRM because it "looks close enough". No amount of water cooling can fix an inadequate VRM.


----------



## Seyumi

I'm afraid this is the beginning of the end for having a factory-built, custom-cooled top-end Nvidia card. The previous generation of Titan X at least had a few variants allowed from EVGA. This generation there are none, and only Nvidia is allowed to sell the card. It has been out since August 2nd; it's now almost 3 months later and still no Hybrid kit from EVGA.

I think GPU partner companies such as EVGA knew this was a long time coming. Haven't you all noticed that EVERY GPU company now makes and sells PSUs, fans, cases, motherboards, memory, keyboards, mice, headsets, laptops, etc.? They knew they would have to expand their product lines or eventually go out of business selling GPUs only.

My mind was 100% set on having a CPU & 2x GPU AIO water cooling setup, but it looks like I'm going to have to go custom loop, which I've been trying to avoid like the plague for the longest time. To make this easier on myself I'll probably end up going the EK Predator route with the prefilled water blocks, since I doubt we'll see any new generation of Titans with anything but a stock cooler. My case is set up for an AIO layout now, but I think if I just jump ship and go the Predator route it'll be easy to swap out the GPUs in the future and have a nice plug-and-play setup. Technically I'll have better cooling, but it will cost more, have less resale value, take longer to install and set up, and have a higher chance of leaks versus just sticking to the AIO route.

I can't settle for anything less, such as 1080s or the supposed 1080 Ti, since there will always be games that struggle unless you have 2 of the top-end GPUs. Monitor tech advances faster than GPU tech, and I doubt we'll ever see the day when a single GPU runs every game at max settings at max resolution. Having the top-end GPU is even more important when something like 75% of all "next gen" DX12 games coming out don't even support SLI, or have it patched in months later.


----------



## jsutter71

Quote:


> Originally Posted by *Seyumi*
> 
> I'm afraid this is the beginning of the end for having a factory-built, custom-cooled top-end Nvidia card. The previous generation of Titan X at least had a few variants allowed from EVGA. This generation there are none, and only Nvidia is allowed to sell the card. It has been out since August 2nd; it's now almost 3 months later and still no Hybrid kit from EVGA.
> 
> I think GPU partner companies such as EVGA knew this was a long time coming. Haven't you all noticed that EVERY GPU company now makes and sells PSUs, fans, cases, motherboards, memory, keyboards, mice, headsets, laptops, etc.? They knew they would have to expand their product lines or eventually go out of business selling GPUs only.
> 
> My mind was 100% set on having a CPU & 2x GPU AIO water cooling setup, but it looks like I'm going to have to go custom loop, which I've been trying to avoid like the plague for the longest time. To make this easier on myself I'll probably end up going the EK Predator route with the prefilled water blocks, since I doubt we'll see any new generation of Titans with anything but a stock cooler. My case is set up for an AIO layout now, but I think if I just jump ship and go the Predator route it'll be easy to swap out the GPUs in the future and have a nice plug-and-play setup. Technically I'll have better cooling, but it will cost more, have less resale value, take longer to install and set up, and have a higher chance of leaks versus just sticking to the AIO route.
> 
> I can't settle for anything less, such as 1080s or the supposed 1080 Ti, since there will always be games that struggle unless you have 2 of the top-end GPUs. Monitor tech advances faster than GPU tech, and I doubt we'll ever see the day when a single GPU runs every game at max settings at max resolution. Having the top-end GPU is even more important when something like 75% of all "next gen" DX12 games coming out don't even support SLI, or have it patched in months later.


I was *VERY* hesitant myself to build my own custom loop, and I put it off for years. Now that I have taken the plunge, I have no regrets.

Best damn decision I ever made. The biggest issue for most people is the cost, but if you're willing to purchase expensive hardware, then set aside the money needed to properly cool it. It's no different than people who spend thousands of dollars on a system and go cheap on the power supply. I'm not saying you did that; that's just an example.

Don't believe the horror stories about people flooding their systems with custom loops. People who did that either used cheap parts or did not take the time to properly install their equipment. Regardless of your choice of flexible or hard tubing, make sure to use compression fittings to properly secure your connections. Take your time and do your homework. Prior to my current system I had always used closed-loop water cooling; with your own design you never have to worry about too much or too little tubing, and you get proper routing.

One last note: my temps are ridiculously cooler using a custom loop than they ever were with a closed loop.


----------



## MaDeOfMoNeY

Should have my card in 2 weeks.


----------



## MrKenzie

Quote:


> Originally Posted by *MunneY*
> 
> yes it is, but then you gotta worry about condensation!


I have been using an aquarium cooler for nearly 2 years, and I have only been worried about condensation when it is raining outside and my coolant temp is under 10°C. Other than that there is nothing to worry about! If you have one running at under 10°C then sure, condensation can be a problem, but I found very little gain from coolant temps of 5°C compared to 15°C.


----------



## MunneY

Quote:


> Originally Posted by *MrKenzie*
> 
> I have been using an aquarium cooler for nearly 2 years, and I have only been worried about condensation when it is raining outside and my coolant temp is under 10°C. Other than that there is nothing to worry about! If you have one running at under 10°C then sure, condensation can be a problem, but I found very little gain from coolant temps of 5°C compared to 15°C.


I see you are in Australia, so you oughta be familiar with humidity! Here, even right now in "fall", the humidity is around 70-90%. That means you constantly have to worry about the ambient-to-coolant delta.
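The reason that delta matters is that condensation forms once a surface drops below the dew point of the surrounding air. A rough check using the Magnus approximation (the 25°C / 80% RH figures are just illustrative, and the function is my own sketch):

```python
import math

def dew_point_c(air_temp_c, rel_humidity_pct):
    """Dew point via the Magnus approximation (good to ~0.4 C in
    typical indoor conditions)."""
    a, b = 17.62, 243.12
    gamma = math.log(rel_humidity_pct / 100) + a * air_temp_c / (b + air_temp_c)
    return b * gamma / (a - gamma)

# At 25 C room temp and 80% RH, any surface below ~21.3 C will sweat:
print(round(dew_point_c(25, 80), 1))  # 21.3
```

So in a humid room even a modest chiller can put blocks and fittings below the dew point, while in dry air you can run much colder coolant safely.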


----------



## MrKenzie

Quote:


> Originally Posted by *MunneY*
> 
> I see you are in Australia, so you oughta be familiar with humidity! Here, even right now in "fall", the humidity is around 70-90%. That means you constantly have to worry about the ambient-to-coolant delta.


Well, that would suck! I'm from Victoria, Australia, so the humidity is normally low; Northern Australia (Darwin) would be closer in humidity to where you are. It would probably be easier to use direct phase-change cooling, as there are fewer surfaces for water to condense on. I did that until I forgot to turn the motherboard heater on and fried the motherboard and CPU. An i7 920 at 5GHz was pretty awesome though!


----------



## jsutter71

And to think I actually considered dual EVGA 1080's instead of TXPs.
http://www.digitaltrends.com/computing/evga-gtx-1080-1070-overheating-issue/


----------



## KillerBee33

Quote:


> Originally Posted by *jsutter71*
> 
> And to think I actually considered dual EVGA 1080's instead of TXPs.
> http://www.digitaltrends.com/computing/evga-gtx-1080-1070-overheating-issue/


I keep saying this was my fault, but my first TXP's VRM burned in that exact spot.




----------



## axiumone

Pretty sure it's completely unrelated. The TXP and the EVGA FTW cards have different power delivery components.


----------



## jhowell1030

Quote:


> Originally Posted by *axiumone*
> 
> Pretty sure it's completely unrelated. The TXP and the EVGA FTW cards have different power delivery components.


It's not just their FTW cards; it's every card that uses their ACX cooler, with the exception of the Classified. There was no direct cooling for the VRMs.


----------



## axiumone

It's just a percentage of the FTW cards. Look at EVGA's own website for info. Some media outlet copy/pasted the news as "all ACX cards" and now everyone is running with it like a game of broken telephone.


----------



## jhowell1030

Quote:


> Originally Posted by *axiumone*
> 
> It's just a percentage of the FTW cards. Look at EVGA's own website for info. Some media outlet copy/pasted the news as "all ACX cards" and now everyone is running with it like a game of broken telephone.


That actually makes sense. I don't know why I didn't think about it...but the SCs would have the original PCB.


----------



## Claustrum

Noob here, in need of some help. I'm looking for the best CPU match for running a single Titan X. I'll be looking to run 2 Titan Xs in SLI next year, but I don't necessarily want to future-proof my CPU. I'll have the Asus Deluxe II mobo and 4x8GB 3333/C16 Dominators. I was originally going to go with the 6850K, but I'm learning the 6850-6950 would mainly be for SLI or other uses besides gaming, and this is primarily a gaming PC. I've heard recommendations for the 6700 and 6800. Any help would be greatly appreciated! Thanks in advance for your time.


----------



## outofmyheadyo

If it's for gaming, grab a 6700K and overclock the snot out of it, or better yet wait for the new 7700K; it should be out any time now.


----------



## Dark

Placed an order yesterday for two Titans (pascal). I had originally swapped out my previous titans (maxwell) with 1080 Classifieds and I wasn't completely sold on the performance.

Hoping they show up this week.


----------



## TremF

Quote:


> Originally Posted by *Dark*
> 
> Placed an order yesterday for two Titans (pascal). I had originally swapped out my previous titans (maxwell) with 1080 Classifieds and I wasn't completely sold on the performance.
> 
> Hoping they show up this week.


Congratulations on your decision.









I moved from a dual GTX Titan X SLI setup to a single TXP and, along with my i7 4930K @ 4.3GHz, I am able to run every game at 4K ultra/max settings except for AA, which isn't needed for most games at 4K. For any game that drops below 60FPS, my 32" 4K G-Sync Acer Predator XB321HK means I still have a silky smooth experience.


----------



## Dark

Quote:


> Originally Posted by *TremF*
> 
> Congratulations on your decision.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I moved from a dual GTX Titan X SLI setup to a single TXP and, along with my i7 4930K @ 4.3GHz, I am able to run every game at 4K ultra/max settings except for AA, which isn't needed for most games at 4K. For any game that drops below 60FPS, my 32" 4K G-Sync Acer Predator XB321HK means I still have a silky smooth experience.


That's great to hear! The reviews were too shiny to not go with a Titan XP.


----------



## Baasha

Guys,

Planning to take out the back-plates on the cards.

What is the 'best' screwdriver set I can get from Amazon for this?

TIA


----------



## EniGma1987

Quote:


> Originally Posted by *Baasha*
> 
> Guys,
> 
> Planning to take out the back-plates on the cards.
> 
> What is the 'best' screwdriver set I can get from Amazon for this?
> 
> TIA


I think the little screws use a #00? Maybe a #0.

Just buy this and you cover all your bases:
https://www.amazon.com/TEKTON-2977-Phillips-Precision-Screwdriver/dp/B008TM1910/ref=sr_1_4?ie=UTF8&qid=1478032571&sr=8-4&keywords=%2300+phillips


----------



## KillerBee33

Quote:


> Originally Posted by *Baasha*
> 
> Guys,
> 
> Planning to take out the back-plates on the cards.
> 
> What is the 'best' screwdriver set I can get from Amazon for this?
> 
> TIA


https://www.amazon.com/gp/product/B016Q3D4AC/ref=oh_aui_detailpage_o07_s00?ie=UTF8&psc=1
The last SET you'll need


----------



## Claustrum

Yeah, I went with the 6700. The X99 Deluxe II is only compatible with LGA 2011, so I switched my mobo to the Maximus VIII. I'm saving a heap compared to my original projected build (6850K with the Deluxe II mobo), and from what I understand it's going to be faster for gaming. I live in Bali most of the time, and everything is so expensive here: the Maximus VIII goes for $800 USD, and don't even think about getting a Titan X Pascal until next year. And when they do get here they'll be outrageously priced. Flying home to the States this coming week and picking up everything I need, including an HTC Vive.

I'll be putting the waterblock on the Titan X. Does anyone know what hex key I need?


----------



## Dark

I keep seeing folks mention removing the backplates. I understand that removing one in an SLI setup (when they are sandwiched together) will allow for additional airflow but what if you have 1-2 slots between the cards? Still worth removing?


----------



## Leyaena

It's official, I'm getting rid of my SLI Titan X Maxwells.
The order for my Titan X Pascal has been placed


----------



## bizplan

Quote:


> Originally Posted by *Dark*
> 
> I keep seeing folks mention removing the backplates. I understand that removing one in an SLI setup (when they are sandwiched together) will allow for additional airflow but what if you have 1-2 slots between the cards? Still worth removing?


Probably not. The card is designed to dissipate heat through the backplate and operate properly at the temps generated by the GPU and certain board components. You would want to make sure there is sufficient airflow inside the case, however.


----------



## Dark

Quote:


> Originally Posted by *bizplan*
> 
> Probably not. The card is designed to dissipate heat through the backplate and operate properly at the temps generated by the GPU and certain board components. You would want to make sure there is sufficient airflow inside the case, however.


I'll inspect it more when they arrive, but I thought I read somewhere that the new backplate is lined with plastic, so it wasn't dissipating much heat at all. Regardless, I appreciate your response, and that's what I'm hoping for.

The case has plenty of airflow (high positive pressure).


----------



## bizplan

Quote:


> Originally Posted by *Dark*
> 
> I'll inspect it more when they arrive but I thought I read somewhere that the new backplate is lined with plastic so it wasn't dissipating heat much at all. Regardless, I appreciate your response and that's what I'm hoping for.
> 
> Case has plenty of airflow (high positive pressure).


This might help you: http://www.guru3d.com/articles-pages/nvidia-geforce-titan-x-pascal-review,9.html


----------



## Baasha

Quote:


> Originally Posted by *KillerBee33*
> 
> https://www.amazon.com/gp/product/B016Q3D4AC/ref=oh_aui_detailpage_o07_s00?ie=UTF8&psc=1
> The last SET you'll need


Thanks! +REP









Just ordered it...


----------



## Dark

Things are happening...

Out with the old, in with the new.


----------



## Baasha

^welcome to the club.









How were the 1080 Classified GPUs btw?

I'm thinking of upgrading my 980 Ti Classified on my 2nd rig but was going to wait for the 1080 Ti to drop -- what kind of OC did you achieve with the 1080 Classy?


----------



## jmaz87

Hey guys! Loving my single GPU, but should I be considering a second one sooner rather than later? The current limiting factor is the EK HB bridge not existing yet...









but when higher-framerate 4K monitors come out I really want to be ready!

Also, does everyone have power issues when overclocking? I hit the power limit pretty quickly under load with even a moderate OC, and with anything over 2000MHz it hits it all the time... I don't know if that's bad, but I assume it leads to "throttling", for lack of a better term, just like it would thermally, which would make things less smooth...

Games without G-Sync are more noticeable when the GPU is pushed: Skyrim [email protected] looks perfect but [email protected] is a bit jumpy at times, while other games like Overwatch, Battlefield 1, and GTA V look perfect on either monitor...

First build with a custom loop, btw, and my old rig is an i7-930, so I'm a bit out of the loop


----------



## MrKenzie

Quote:


> Originally Posted by *Dark*
> 
> I'll inspect it more when they arrive but I thought I read somewhere that the new backplate is lined with plastic so it wasn't dissipating heat much at all. Regardless, I appreciate your response and that's what I'm hoping for.
> 
> Case has plenty of airflow (high positive pressure).


Yes the factory backplate is lined with what I assume is an insulator so no PCB components touch bare metal. There is however a small part on the PCB that makes direct contact with the aluminum backplate for heat dissipation. The backplate does get very hot under load so I can only assume there is a lot of hot air trapped inside..


----------



## Claustrum

I'm not sure about others, but I'm installing a waterblock and a new backplate. With the new backplate you can add more thermal tape to help with cooling, on top of the additional cooling from the waterblock. Overclocking a Titan on a custom loop should open up some awesome benchmark scores. But you need a hex key and I'm not sure what size... does anyone know?


----------



## Greens

Is it possible to put some thermal pads under the stock backplate?


----------



## jsutter71

Quote:


> Originally Posted by *Greens*
> 
> Is it possible to put some thermal pads under the stock backplate?


Just got my TXPs a couple of weeks ago, and mine DID have thermal pads on the backplates. Granted, I had no use for them because I had purchased nickel ones to match my water blocks. Now, I know a lot of people here air-cool, but water cooling is the way to go. My cards are positioned next to each other with no spacing and I have never seen them above 50C under heavy load. Nvidia really did a great job, because my 980 Tis were constantly hitting 70-85C under the same conditions.


----------



## jsutter71

Quote:


> Originally Posted by *Claustrum*
> 
> I'm not sure about others, but I'm installing a waterblock and a new backplate. With the new backplate you can add more thermal tape to help with cooling, on top of the additional cooling from the waterblock. Overclocking a Titan on a custom loop should open up some awesome benchmark scores. But you need a hex key and I'm not sure what size... does anyone know?


This is what I use... expensive but *WORTH IT!!!!!* Never have to worry about having the correct size or the proper bit ever again. Anyone who does any kind of detailed or PC-related repair needs a set like this. Don't go cheap when it comes to your tools.

https://www.amazon.com/Wiha-75965-Precision-Ratchet-65-Piece/dp/B00JQ753W8/ref=sr_1_29?ie=UTF8&qid=1478183661&sr=8-29&keywords=wera+tools


----------



## Dark

Quote:


> Originally Posted by *Baasha*
> 
> ^welcome to the club.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How were the 1080 Classified GPUs btw?
> 
> I'm thinking of upgrading my 980 Ti Classified on my 2nd rig but was going to wait for the 1080 Ti to drop -- what kind of OC did you achieve with the 1080 Classy?


I didn't notice much of a difference coming from SLI Titan X (Maxwell), besides the Classified cards being rather large. I would almost say wait and make the 1080 Ti jump. The only reason I sold my Maxwell Titans was their current resale value (which will almost certainly dive when the 1080 Ti is released).
Quote:


> Originally Posted by *MrKenzie*
> 
> Yes the factory backplate is lined with what I assume is an insulator so no PCB components touch bare metal. There is however a small part on the PCB that makes direct contact with the aluminum backplate for heat dissipation. The backplate does get very hot under load so I can only assume there is a lot of hot air trapped inside..


Good to know! I ended up just ordering a complete loop + backplates but it will be interesting to see the stock ones once they're off.


----------



## Claustrum

Quote:


> Originally Posted by *jsutter71*
> 
> This is what I use... expensive but *WORTH IT!!!!!* Never have to worry about having the correct size or the proper bit ever again. Anyone who does any kind of detailed or PC-related repair needs a set like this. Don't go cheap when it comes to your tools.
> 
> https://www.amazon.com/Wiha-75965-Precision-Ratchet-65-Piece/dp/B00JQ753W8/ref=sr_1_29?ie=UTF8&qid=1478183661&sr=8-29&keywords=wera+tools


Thanks! This looks great


----------



## Claustrum

Quote:


> Originally Posted by *Greens*
> 
> Is it possible to put some thermal pads under the stock backplate?


I'm not really sure... Basically, the backplate comes with instructions showing where to place the tape. If I had to guess I would say yes. Just look up an installation guide or find the manual, and place the tape where instructed. I'm pretty sure the material of the stock backplate can handle it, since it's taking the heat regardless of whether there is thermal tape or not. So the thermal tape would just transfer the heat into the plate... right?


----------



## mouacyk

To those complaining that the backplate gets hot -- would you rather have that heat added to your GPU, VRM, VRAM? If it gets hot, it's doing its job. Just make sure there's airflow to help taper off that heat.


----------



## Lennyx

Quote:


> Originally Posted by *mouacyk*
> 
> To those complaining that the backplate gets hot -- would you rather have that heat added to your GPU, VRM, VRAM? If it gets hot, it's doing its job. Just make sure there's airflow to help taper off that heat.


I'm pretty sure I read somewhere that they saw a 4-6C difference after taking off the backplate. I don't remember if that was an SLI or single-card setup.


----------



## KillerBee33

Anyone got extra nickel M2.5X7 A1 screws?
Putting everything together this weekend; forgot that I never got those with the backplate from EK


----------



## axiumone

Quote:


> Originally Posted by *KillerBee33*
> 
> Anyone got extra nickel M2.5X7 A1 screws?
> Putting everything together this weekend; forgot that I never got those with the backplate from EK


Last time I was missing those, I found them on ebay. Apparently it's a popular hobby size.


----------



## KillerBee33

Quote:


> Originally Posted by *axiumone*
> 
> Last time I was missing those, I found them on ebay. Apparently it's a popular hobby size.


will give it a try







Nope, no luck there...


----------



## Dark

Quote:


> Originally Posted by *KillerBee33*
> 
> will give it a try
> 
> 
> 
> 
> 
> 
> 
> Nope, no luck there...


http://www.laptopscrews.com/M2.5x7.htm


----------



## KillerBee33

Quote:


> Originally Posted by *Dark*
> 
> http://www.laptopscrews.com/M2.5x7.htm


Yeah, I looked there too... they list the M2.5x7mm OEM wafer screws in black, and I already have those


----------



## ttg35fort

Quote:


> Originally Posted by *Claustrum*
> 
> I'm not really sure... Basically, the backplate comes with instructions showing where to place the tape. If I had to guess I would say yes. Just look up an installation guide or find the manual, and place the tape where instructed. I'm pretty sure the material of the stock backplate can handle it, since it's taking the heat regardless of whether there is thermal tape or not. So the thermal tape would just transfer the heat into the plate... right?


I was wondering the same thing. I have the EK water block, but I drilled out the holes on the stock back plates and put them back on with the water block. But, the back plates appear to have an electrical insulation coating, and that might not work very well as a thermal conductor. So I'm not sure just adding the thermal tape is going to accomplish much unless you take the time to remove the electrical insulation coating where back plates make contact with the thermal tape. If you do that, though, you need to make sure you don't inadvertently remove the insulation where the back plates might touch electrical connections on the back of the card.

I'm debating whether to remove the back plates, scrape off the electrical insulation in the appropriate areas, and put them back on with thermal tape in those areas; order an EK back plate; or leave it as is. The question, which I don't know the answer to, is whether using thermal tape with the stock back plates or getting an EK back plate will lower temperatures enough to get any additional overclock out of the card or make it last significantly longer.


----------



## Dark

Quote:


> Originally Posted by *KillerBee33*
> 
> Yeah, I looked there too... they list the M2.5x7mm OEM wafer screws in black, and I already have those


Ah, my fault, I wasn't tracking on the color/coating.


----------



## Claustrum

Quote:


> Originally Posted by *ttg35fort*
> 
> I was wondering the same thing. I have the EK water block, but I drilled out the holes on the stock back plates and put them back on with the water block. But, the back plates appear to have an electrical insulation coating, and that might not work very well as a thermal conductor. So I'm not sure just adding the thermal tape is going to accomplish much unless you take the time to remove the electrical insulation coating where back plates make contact with the thermal tape. If you do that, though, you need to make sure you don't inadvertently remove the insulation where the back plates might touch electrical connections on the back of the card.
> 
> I'm debating whether to remove the back plates, scrape off the electrical insulation in the appropriate areas, and put them back on with thermal tape in those areas; order an EK back plate; or leave it as is. The question, which I don't know the answer to, is whether using thermal tape with the stock back plates or getting an EK back plate will lower temperatures enough to get any additional overclock out of the card or make it last significantly longer.


Well, if you already took the time to put on a waterblock, you may want to consider it. If you order a backplate from EK I don't think there's any coating. I'll be installing a waterblock and EK backplate in the next couple of weeks; I'll take a look then.


----------



## Baasha

If anyone wonders how powerful the Titan X Pascal GPUs are, I present you this:

Battlefield 1 in 8K (native) with everything on Ultra including HBAO (no AA) at ~ 100 FPS. o_0


----------



## roccale

Hi all, I'm new here (sorry for my bad English) and I bought a TXP...
For the moment I don't want to swap the cooler for a waterblock.
I ran a test with Heaven; can someone tell me if everything is OK, please?
+205 core, +705 memory, 2113 MHz * 11414, stock air, max temp 75°.
375.70 drivers
Voltage not even touched
CPU 6700K stock at 4200MHz



I have done some more testing and am able to go further up on both core and memory...


----------



## Dark

Quote:


> Originally Posted by *Baasha*
> 
> If anyone wonders how powerful the Titan X Pascal GPUs are, I present you this:
> 
> Battlefield 1 in 8K (native) with everything on Ultra including HBAO (no AA) at ~ 100 FPS. o_0


Sounds great (for $4800 in GPUs...)

What other games have you found that are supporting that setup?


----------



## Ghostface

Quote:


> Originally Posted by *outofmyheadyo*
> 
> I would gladly test the UPS solution but sadly most stores here where I live only stock crap, and the return on things is complicated to say the least.
> Perhaps some of you guys who live in the US could test the UPS theory, you should have pretty relaxed return policies there and even ordering one from amazon would make returns easy.


Just thought I'd post an update to let you know I have invested in a CyberPower Sinewave UPS, rated up to a 1000-watt load....

... The vibration/whine still exists.

This UPS has a voltage regulator, power conditioning, the full shebang.

So that completes the full circle of testing... and it's just the graphics card, out of ALL the electrical items in this room (of which I have a bullet load), that has this issue.

I will keep the UPS as it is handy to have for other reasons, but don't buy one if you are hoping to resolve your VRM buzz/vibration; bear in mind it made absolutely no difference for me.

Thankfully I am over it and game on the rig regardless. It doesn't take away from the fact that a 1200-quid card should have countermeasures in place to stop this from happening at 60-100 fps.


----------



## ttg35fort

Quote:


> Originally Posted by *Claustrum*
> 
> Well, if you already took the time to put on a waterblock, you may want to consider it. If you order a backplate from EK I don't think there's any coating. I'll be installing a waterblock and EK backplate in the next couple of weeks; I'll take a look then.


When I ordered my water block and video card the back plates were not yet available. When running the graphics test in Fire Strike Extreme, my GPU gets to about 42 deg. C, which is reasonably cool. I have seen posted here in the forum that once the GPU gets above 37 deg. it starts throttling. I have seen that happen to some extent. The frequency varies between about 2020 and 2080 MHz in Fire Strike with the GPU overclocked at 200 MHz. The question is whether the EK back plate will help drop my GPU by at least 5 deg. I am not optimistic it will, but I don't know. Maybe I'll order one and give it a try. I'm just not enthused about draining the coolant and re-bleeding the system, especially if it turns out that it makes no difference. Besides, I kind of like the way the stock back plate looks in my system. Decisions, decisions...


----------



## MrTOOSHORT

Quote:


> Originally Posted by *ttg35fort*
> 
> When I ordered my water block and video card the back plates were not yet available. When running the graphics test in Fire Strike Extreme, my GPU gets to about 42 deg. C, which is reasonably cool. I have seen posted here in the forum that once the GPU gets above 37 deg. it starts throttling. I have seen that happen to some extent. The frequency varies between about 2020 and 2080 MHz in Fire Strike with the GPU overclocked at 200 MHz. The question is whether the EK back plate will help drop my GPU by at least 5 deg. I am not optimistic it will, but I don't know. Maybe I'll order one and give it a try. I'm just not enthused about draining the coolant and re-bleeding the system, especially if it turns out that it makes no difference. Besides, I kind of like the way the stock back plate looks in my system. Decisions, decisions...


Card will run cooler without the back plate.

Get the back plate for looks and to help protect the pcb and components.


----------



## ttg35fort

Quote:


> Originally Posted by *roccale*
> 
> Hi all, I'm new here (sorry for my bad English) and I bought a TXP...
> For the moment I don't want to swap the cooler for a waterblock.
> I ran a test with Heaven; can someone tell me if everything is OK, please?
> +205 core, +705 memory, 2113 MHz * 11414, stock air, max temp 75°.
> 375.70 drivers
> Voltage not even touched
> CPU 6700K stock at 4200MHz
> 
> 
> 
> I have done some more testing and am able to go further up on both core and memory...


Those are good numbers.

I have a water block. Although I can get the card stable with + 210 MHz on the GPU and +650 MHz on the memory with a significant voltage increase, my card throttles back more with those settings. I spent hours playing with my settings, and I found that I get the highest score in Fire Strike Extreme with +200 MHz on the GPU and +600 MHz on the memory, and no overvoltage. At +205 MHz on the GPU I lose about 500 points in the graphics score due to throttling, and at +210 I lose about 1000 points due to even greater throttling. This is likely due to the temperature increase resulting from the increased voltage I need to get the card stable at those frequencies.

You might want to benchmark your graphics settings while making the adjustments to see what settings give you the highest graphics scores (i.e., the lowest amount of throttling).
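The tune-and-benchmark loop described above (try each offset, score it, keep the best) can be sketched in a few lines. This is only an illustration: `run_benchmark` is a hypothetical stand-in for applying the offset in Afterburner and running Fire Strike Extreme by hand, and the scores below are invented.

```python
# Hypothetical sketch of the tuning method described above: try a range of
# core-clock offsets, record the benchmark score for each, keep the best.
# run_benchmark is a stand-in; the scores here are invented for illustration.

def best_offset(offsets, run_benchmark):
    """Return (offset, score) for the offset with the highest score."""
    results = {off: run_benchmark(off) for off in offsets}
    best = max(results, key=results.get)
    return best, results[best]

# Invented scores mimicking the behaviour described: gains climb with the
# offset until throttling starts eating them back.
fake_scores = {180: 9600, 190: 9650, 200: 9700, 205: 9200, 210: 8700}
offset, score = best_offset(fake_scores, fake_scores.get)
```

The same pattern works for memory offsets; just hold the core offset fixed while sweeping.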


----------



## KillerBee33

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Card will run cooler without the back plate.
> 
> Get the back plate for looks and to help protect the pcb and components.


How much COOLER? If it's 1-3 degrees, it's not worth taking it off...


----------



## ttg35fort

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Card will run cooler without the back plate.
> 
> Get the back plate for looks and to help protect the pcb and components.


Quote:


> Originally Posted by *KillerBee33*
> 
> How much COOLER? If it's 1-3 degrees, it's not worth taking it off...


I just ordered the back plate. It is due to arrive around the 15th. Once I get it installed I will re-run testing to see if it reduces the throttling of the GPU, and post the results. I am not optimistic the little amount of extra cooling it will provide will make much of a difference, but who knows. If I don't try it, I will be left wondering. So maybe the $50 (back plate + shipping) is worth it just to satisfy my curiosity. At least I can share the knowledge with others who also may be wondering.


----------



## KillerBee33

Quote:


> Originally Posted by *ttg35fort*
> 
> I just ordered the back plate. It is due to arrive around the 15th. Once I get it installed I will re-run testing to see if it reduces the throttling of the GPU, and post the results. I am not optimistic the little amount of extra cooling it will provide will make much of a difference, but who knows. If I don't try it, I will be left wondering. So maybe the $50 (back plate + shipping) is worth it just to satisfy my curiosity. At least I can share the knowledge with others who also may be wondering.


What do you get now at the end of Second Test in TimeSpy?


----------



## ttg35fort

Quote:


> Originally Posted by *KillerBee33*
> 
> What do you get now at the end of Second Test in TimeSpy?


I don't remember. I am at work now, so I can't check it. I will run TimeSpy before I install the back plate, then re-run it with the back plate installed. I'll do that with Fire Strike Extreme too.

My last score in Fire Strike Extreme was something like 14,385. About a month ago someone else in the forum with almost the exact same setup as mine, and about the same overclocking, indicated he was getting over 15,2xx. I'm not sure why there was such a big difference. I tried turning off G-Sync and made some other minor changes in the NVidia control panel, but it did not make a difference on my score. I figured it may be background services or something running on my rig that kept me from getting to 15,000, but I haven't taken the time to start disabling any services.

The OS is Win 10 with the latest updates. The only software installed on the machine is Steam, games, NVidia Control Panel, Afterburner, Asus Extreme Tuning Utility (which I found not to be very useful), Samsung Magician, Prime95, IntelBurn test, CPU-Z, HW Monitor, Chrome and the Edge browser.


----------



## KillerBee33

Quote:


> Originally Posted by *ttg35fort*
> 
> I don't remember. I am at work now, so I can't check it. I will run TimeSpy before I install the back plate, then re-run it with the back plate installed. I'll do that with Fire Strike Extreme too.
> 
> My last score in Fire Strike Extreme was something like 14,385. About a month ago someone else in the forum with almost the exact same setup as mine, and about the same overclocking, indicated he was getting over 15,2xx. I'm not sure why there was such a big difference. I tried turning off G-Sync and made some other minor changes in the NVidia control panel, but it did not make a difference on my score. I figured it may be background services or something running on my rig that kept me from getting to 15,000, but I haven't taken the time to start disabling any services.
> 
> The OS is Win 10 with the latest updates. The only software installed on the machine is Steam, games, NVidia Control Panel, Afterburner, Asus Extreme Tuning Utility (which I found not to be very useful), Samsung Magician, Prime95, IntelBurn test, CPU-Z, HW Monitor, Chrome and the Edge browser.


Well , i get 42 in TimeSpy but FSU tops @37. My 320 30MM rad blows into the case


----------



## ttg35fort

Quote:


> Originally Posted by *KillerBee33*
> 
> Well , i get 42 in TimeSpy but FSU tops @37. My 320 30MM rad blows into the case


Is that 4,200, and 3,700?


----------



## KillerBee33

Quote:


> Originally Posted by *ttg35fort*
> 
> Is that 4,200, and 3,700?


Sorry, I didn't specify... 42 degrees, and 37 degrees in FSU


----------



## ttg35fort

Quote:


> Originally Posted by *KillerBee33*
> 
> Sorry, I didn't specify... 42 degrees, and 37 degrees in FSU


I'm at 42 deg. in Fire Strike Extreme during the graphics testing. I have a 420x30 mm radiator blowing out, and a 280x30 mm radiator blowing in, but low fan speeds. My fans are Noctua NF-P14s - 1500 rpm, but my fan profile keeps them running well below that until the CPU gets over 70 deg., which it never does. I might try a more aggressive fan profile. That might help keep my GPU temp lower. I might also try a more aggressive profile for the pump control. It's connected to a fan header, and right now I am using a similar profile for controlling its speed.
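The fan-profile idea being discussed (duty rising with temperature along a set of curve points) boils down to linear interpolation. A minimal sketch, assuming made-up curve points rather than anyone's actual profile:

```python
# Minimal sketch of a fan/pump profile: linearly interpolate fan duty (%)
# between (temperature, duty) points, clamping below the first point and
# above the last. Curve values are illustrative, not a real profile.

CURVE = [(30, 30), (50, 45), (70, 70), (80, 100)]  # (deg C, PWM %)

def duty_for_temp(temp_c, curve=CURVE):
    """Return the fan duty (%) for a given temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two surrounding points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
```

A "more aggressive" profile just means steeper curve points, i.e. higher duty at the same temperature.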


----------



## KillerBee33

Quote:


> Originally Posted by *ttg35fort*
> 
> I'm at 42 deg. in Fire Strike Extreme during the graphics testing. I have a 420x30 mm radiator blowing out, and a 280x30 mm radiator blowing in, but low fan speeds. My fans are Noctua NF-P14s - 1500 rpm, but my fan profile keeps them running well below that until the CPU gets over 70 deg., which it never does. I might try a more aggressive fan profile. That might help keep my GPU temp lower.


All my fans run under 1000 RPM at all times; changing that makes a 1-2 degree difference and isn't worth the noise. But my pump runs at 2400 RPM minimum and 3000 top.


----------



## ttg35fort

Quote:


> Originally Posted by *ttg35fort*
> 
> I'm at 42 deg. in Fire Strike Extreme during the graphics testing. I have a 420x30 mm radiator blowing out, and a 280x30 mm radiator blowing in, but low fan speeds. My fans are Noctua NF-P14s - 1500 rpm, but my fan profile keeps them running well below that until the CPU gets over 70 deg., which it never does. I might try a more aggressive fan profile. That might help keep my GPU temp lower. I might also try a more aggressive profile for the pump control. It's connected to a fan header, and right now I am using a similar profile for controlling its speed.


Quote:


> Originally Posted by *KillerBee33*
> 
> All my fans run under 1000 RPM at all times; changing that makes a 1-2 degree difference and isn't worth the noise. But my pump runs at 2400 RPM minimum and 3000 top.


I have my pump running on a fan header with a fan profile. It only indicates percentages. Are you extrapolating the rpm numbers from that, or are you able to directly monitor the pump rpm? If so, how?


----------



## KillerBee33

Quote:


> Originally Posted by *ttg35fort*
> 
> I have my pump running on a fan header with a fan profile. It only indicates percentages. Are you extrapolating the rpm numbers from that, or are you able to directly monitor the pump rpm? If so, how?


Pump is plugged into the mobo and controlled from the BIOS. For monitoring, well, there are so many programs; I just use Corsair LINK. Also, in the BIOS monitoring it's RPM instead of %


----------



## ttg35fort

Quote:


> Originally Posted by *KillerBee33*
> 
> Pump is plugged into the mobo and controlled from the BIOS. For monitoring, well, there are so many programs; I just use Corsair LINK. Also, in the BIOS monitoring it's RPM instead of %


I'll check it out.


----------



## roccale

This is my latest score with the CPU at 4.6GHz:



And this is 2560*1440, clocks 2126*11430, 8x AA, tessellation extreme, quality ultra:


----------



## MrKenzie

Quote:


> Originally Posted by *ttg35fort*
> 
> When I ordered my water block and video card the back plates were not yet available. When running the graphics test in Fire Strike Extreme, my GPU gets to about 42 deg. C, which is reasonably cool. I have seen posted here in the forum that once the GPU gets above 37 deg. it starts throttling. I have seen that happen to some extent. The frequency varies between about 2020 and 2080 MHz in Fire Strike with the GPU overclocked at 200 MHz. The question is whether the EK back plate will help drop my GPU by at least 5 deg. I am not optimistic it will, but I don't know. Maybe I'll order one and give it a try. I'm just not enthused about draining the coolant and re-bleeding the system, especially if it turns out that it makes no difference. Besides, I kind of like the way the stock back plate looks in my system. Decisions, decisions...


The GPU throttles with temperature at least once before 37 degrees, at about 29 degrees from memory, and then throttles again between 45-50C. That was the case with mine anyway when I did testing: almost 100MHz of throttling between 25C and 50C.
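The stepped throttling described above can be modelled roughly as GPU Boost dropping the clock by one small bin each time a temperature threshold is crossed. The thresholds, bin size, and max boost below are assumptions for illustration, not Nvidia's actual boost tables:

```python
# Rough, illustrative model of stepped temperature throttling: drop one
# ~13 MHz bin per temperature threshold crossed. All numbers are assumed.

MAX_BOOST_MHZ = 2000
BIN_MHZ = 13
TEMP_THRESHOLDS = [29, 37, 45, 50]  # deg C, approximated from the thread

def boost_clock(temp_c):
    """Effective clock (MHz) after dropping one bin per threshold crossed."""
    bins_dropped = sum(1 for t in TEMP_THRESHOLDS if temp_c >= t)
    return MAX_BOOST_MHZ - bins_dropped * BIN_MHZ
```

The real behaviour has many more bins (hence ~100MHz of total droop between 25C and 50C), but the shape is the same: keeping the card under the first threshold keeps the full boost.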


----------



## Revan654

Just got mine today:










Build it's going into: http://www.overclock.net/t/1610817/build-log-project-frost-case-labs-sm8-with-pedestals-x99-watercooled-i7-6950x-titan-x


----------



## Lee0

Quote:


> Originally Posted by *Revan654*
> 
> Just got mine today:
> 
> Build it's going into: http://www.overclock.net/t/1610817/build-log-project-frost-case-labs-sm8-with-pedestals-x99-watercooled-i7-6950x-titan-x


Oh man, I love that box. It's really cocky and mighty. I usually don't have boxes out in the open because most are ugly and swarmed with marketing, but I have my Titan XP box next to the PC on a shelf, purely because of the looks.


----------



## uggy

Do you guys max the power and voltage when you overclock the txp? On water that is.


----------



## MrKenzie

Quote:


> Originally Posted by *uggy*
> 
> Do you guys max the power and voltage when you overclock the txp? On water that is.


Leave the voltage stock, you generally get worse results by adding more voltage because it hits the power limit quicker.


----------



## bizplan

Quote:


> Originally Posted by *uggy*
> 
> Do you guys max the power and voltage when you overclock the txp? On water that is.


I set the core voltage slider to 100% when OC'ing; increasing this setting purportedly stabilizes higher core overclocks. I have found that it may increase voltage by up to 0.05 V, up to a maximum of 1.093 V, and that voltage is subject to the card's current power draw, temperature, core clock speed, and load.
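That slider behaviour — up to ~0.05 V added on top of stock, hard-capped at 1.093 V — amounts to a simple clamp. A sketch under that assumption; the stock-voltage figure here is invented, since the real value varies per card with load and temperature:

```python
# Sketch of the reported voltage-slider behaviour: the slider adds up to
# ~0.05 V over the stock voltage, hard-capped at 1.093 V. STOCK_V is an
# assumed value for illustration only.

STOCK_V = 1.062         # assumed stock core voltage (illustrative)
MAX_V = 1.093           # hard cap reported in the thread
SLIDER_RANGE_V = 0.05   # maximum extra voltage at 100% slider

def requested_voltage(slider_pct):
    """Voltage requested for a slider position (0-100%), after the cap."""
    v = STOCK_V + SLIDER_RANGE_V * (slider_pct / 100.0)
    return min(v, MAX_V)
```

Note the cap means the top of the slider range is unreachable on this assumed stock voltage: 1.062 + 0.05 would be 1.112 V, but the card never exceeds 1.093 V.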


----------



## hlucn8tn

I cannot get a straight answer from Nvidia about my card. Is it messed up, or is this normal?!

In MSI Afterburner or EVGA Precision X at stock clocks, my power limit flag keeps jumping back and forth between "0" and "1".

Is this typical behavior for the Titan X Pascal, or is there something wrong with the card?
My FPS doesn't seem to be affected unless the card hits the power limit and stays there, but because of this power throttling the card does not heat up past 74 deg on the stock cooler.
The card doesn't seem to be able to boost up properly either.
I don't want to overclock the card. I have changed the fan curve to keep the temps lower, and if I raise the power limit from 100% to 120% it doesn't limit as much, but I shouldn't have to do this at stock clocks just to keep the card from hitting the power limit under gaming and benchmark loads. It still hits the power limit at 120%, just not as often; maybe 3-5 times a game.

Has anyone else experienced this problem?
If so, how are you overclocking this card, and what would be the point of going to liquid cooling if it isn't the temperature holding the card back?

I keep hearing about temp throttling, but this card will not temp throttle, because the power limit does not allow the card to be pushed hard enough to heat up to 84 deg C.

Once again, this is with stock clock settings for the core and memory; other than the fan curve, nothing has been changed.

The first picture is the title screen of Black Ops 3 (avg 115-150 FPS during gameplay, max settings).
The second picture is the Valley benchmark.
The third picture is the title screen of Titanfall 2 (avg 90-120 FPS during gameplay, max settings).
The fourth and fifth pictures are two different rounds of Titanfall 2.

The last image is Guru3D's overclock image. Do they not see the card is power throttling?

CHECK OUT MY RESULTS HERE ---->


http://imgur.com/c1eCh


----------



## MrKenzie

Quote:


> Originally Posted by *hlucn8tn*
> 
> I cannot get a straight answer from Nvidia about my card. Is it messed up, or is this normal?!
> 
> In MSI Afterburner or EVGA Precision X at stock clocks, my power limit flag keeps jumping back and forth between "0" and "1".
> 
> Is this typical behavior for the Titan X Pascal, or is there something wrong with the card?
> My FPS doesn't seem to be affected unless the card hits the power limit and stays there, but because of this power throttling the card does not heat up past 74 deg on the stock cooler.
> The card doesn't seem to be able to boost up properly either.
> I don't want to overclock the card. I have changed the fan curve to keep the temps lower, and if I raise the power limit from 100% to 120% it doesn't limit as much, but I shouldn't have to do this at stock clocks just to keep the card from hitting the power limit under gaming and benchmark loads. It still hits the power limit at 120%, just not as often; maybe 3-5 times a game.
> 
> Has anyone else experienced this problem?
> If so, how are you overclocking this card, and what would be the point of going to liquid cooling if it isn't the temperature holding the card back?
> 
> I keep hearing about temp throttling, but this card will not temp throttle, because the power limit does not allow the card to be pushed hard enough to heat up to 84 deg C.
> 
> Once again, this is with stock clock settings for the core and memory; other than the fan curve, nothing has been changed.
> 
> The first picture is the title screen of Black Ops 3 (avg 115-150 FPS during gameplay, max settings).
> The second picture is the Valley benchmark.
> The third picture is the title screen of Titanfall 2 (avg 90-120 FPS during gameplay, max settings).
> The fourth and fifth pictures are two different rounds of Titanfall 2.
> 
> The last image is Guru3D's overclock image. Do they not see the card is power throttling?
> 
> CHECK OUT MY RESULTS HERE ---->
> 
> 
> http://imgur.com/c1eCh


Seems normal to me for the Titan X Pascal. If you overclock the card, you should average at least 2000MHz on the stock cooler even though it keeps hitting the power limit. Water cooling does help, but probably only by ~50MHz, so if you can deal with a card running at around 75-80C, keep the stock cooler.
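For anyone who wants to actually log this instead of watching the Afterburner graph flick between 0 and 1, a rough sketch of the idea in Python. The `nvidia-smi` query fields in the comment are real documented ones, but the command, the log values, and the 5W "near the limit" margin are illustrative assumptions, not measurements from a TXP:

```python
# Rough sketch: tally how often the card sits at its power limit from an
# nvidia-smi polling log. The sampling command (run separately) would be
# something like:
#   nvidia-smi --query-gpu=power.draw,power.limit,clocks.gr \
#              --format=csv,noheader,nounits -l 1 > power_log.csv
# Field names come from nvidia-smi's --help-query-gpu list; the log lines
# below are made-up sample values, not real measurements.

def throttle_summary(csv_text, margin_w=5.0):
    """Count samples where draw is within margin_w watts of the limit."""
    total = throttled = 0
    for line in csv_text.strip().splitlines():
        draw, limit, clock = (float(x) for x in line.split(","))
        total += 1
        if draw >= limit - margin_w:
            throttled += 1
    return throttled, total

sample_log = """\
238.1, 250.0, 1898
249.7, 250.0, 1873
251.2, 250.0, 1860
212.4, 250.0, 1911
"""

hits, n = throttle_summary(sample_log)
print(f"{hits}/{n} samples at or near the power limit")  # 2/4 here
```

A log like this makes it obvious whether the "power limit" flag is a few transient spikes per session or the card pinned at its cap the whole time.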


----------



## Jpmboy

Quote:


> Originally Posted by *uggy*
> 
> Do you guys max the power and voltage when you overclock the txp? On water that is.


voltage will not help. If you have the temps under 35C, power limit can make a big difference in holding max boost under load.








Quote:


> Originally Posted by *hlucn8tn*
> 
> I can not get a straight answer from Nvidia about my card. Is it messed up or normal!?!?
> 
> In MSI Afterburner or EVGA Precision X @ stock clocks my power limit keeps jumping back and forth from "0" to "1"
> 
> *Is this a typical behavior for the Titan X Pascal or is there something wrong with the card?*
> My FPS doesn't seem to be effected unless the card hits the power limit and stays there but because of this power throttling the card does not heat up past 74 deg on a stock cooler.
> the card does seem to be able to boost up properly either.
> I don't want to over clock the card. I have changed the fan curve to keep the temps lower and if I change the power limit % from 100% to 120% it does not limit as much but I should not have to do this on the card with stock clocks in order to keep the card from hitting the power limits under gaming and benchmark loads. It still does hit the power limit at 120% but not as much, maybe 3-5 times a game.
> 
> Has anyone else experienced this problems?
> If so, how are you overclocking this card and what would be the sense of going to liquid cooling if it isn't the temperature holding the card back
> 
> I keep hearing about Temp Throttling but this card will not temp throttle because the power limit does not allow the card to be push hard enough to heat up to 84 dec C
> 
> Once again this is with stock clock setting for the core and the memory and other then the fan curve nothing has been done.
> 
> the first picture is the title screen on black ops 3 (avg frames 115-150 during game play max settings)
> the second picture is on Valley Benchmark
> the 3rd picture is on the title screen for Titanfall 2 (avg frames 90-120 during game play max settings)
> the 4th and 5th pictures are 2 different rounds on Titanfall 2
> 
> the last image is GURU3D overclock image. Do they not see the card is power throttling?
> 
> CHECK OUT MY RESULTS HERE ---->
> 
> 
> http://imgur.com/c1eCh


At stock clocks, max the power limit - does it still happen? and in what game or benchmark?


----------



## hlucn8tn

Yes, it does still happen, but only maybe 3-5 times over 7-10 minutes of gaming, so at least it is not constantly up and down throughout the game. This happens in Titanfall 2 and Black Ops 3; the Valley benchmark pretty much stays in range.

I did take the GPU and put it in the other computer, and its power % was about 10% lower on average. That was on a Thermaltake 600W PSU rather than the 750W EVGA, and the card seemed to run more efficiently on that board, with that CPU, RAM, etc. It sits at 85%-95% there, whereas it runs at 95%-105% in my normal rig, and only occasionally hits the power limit at stock settings. I don't know if the PSU is what's causing it.

So what I did was order a larger, newer Titanium-rated PSU rather than wait to see if the power supply is the problem. If it isn't, then there is something wrong with the board, or who knows what - but at least it will be one less thing to worry about. I am also reinstalling the operating system just in case something is wrong there. If the problem still exists at that point, I will just have to run the card in the other computer with the new, larger power supply that can handle it better. That means an AMD CPU and an MSI mini-ATX motherboard rather than my i7 and ATX motherboard. I really don't know what else to think.


----------



## SnakeBiteScares

So many posts, so many op builds. Tried skimming through and looking at sigs but some of them seem too much for what I'm asking. Which cpu would you put with a rig with 2 of these cards for 4K? Anyone got 2 of them and can tell me more about how the SLI is working out for them? It's already a high cost for one card so want to make sure it's worth it if I'm getting two


----------



## TremF

Quote:


> Originally Posted by *SnakeBiteScares*
> 
> So many posts, so many op builds. Tried skimming through and looking at sigs but some of them seem too much for what I'm asking. Which cpu would you put with a rig with 2 of these cards for 4K? Anyone got 2 of them and can tell me more about how the SLI is working out for them? It's already a high cost for one card so want to make sure it's worth it if I'm getting two


I don't have two cards and I'm able to run pretty much every game at 4K ultra/max settings (except AA, which isn't really needed) on my 32" Acer Predator XB321HK G-Sync monitor.

I have an i7 4930K @ 4.3GHz. Any game that drops below 60FPS is managed by my monitor's G-Sync, and anything over by Fast Sync.


----------



## Jpmboy

Quote:


> Originally Posted by *SnakeBiteScares*
> 
> So many posts, so many op builds. Tried skimming through and looking at sigs but some of them seem too much for what I'm asking. Which cpu would you put with a rig with 2 of these cards for 4K? Anyone got 2 of them and can tell me more about how the SLI is working out for them? It's already a high cost for one card so want to make sure it's worth it if I'm getting two


I run two. Depending on the games, a 6700K can be enough.


----------



## axiumone

Quote:


> Originally Posted by *Jpmboy*
> 
> I run two. depending on which games, a 6700K can be enough.


Can confirm, a well overclocked 6700k is in fact better for gaming than a haswell-e or a broadwell-e.


----------



## lilchronic

I see it like this: the 6700K is good if you don't plan on running above 60Hz. Once you double the FPS to 120-144, the 6700K starts to fall behind, but it also depends on the games you play as well.


----------



## axiumone

Quote:


> Originally Posted by *lilchronic*
> 
> I see it like this 6700k is good if you don't plan on running above 60Hz once you double the fps like 120 -144 the 6700k starts to fall behind but It also depends on the games you play aswell.


You say that, but - http://www.overclock.net/t/1608309/x99-5960x-4-6-vs-z170-6700k-4-8-w-1080-sli-3440x1440/0_100

Testing a 5960X @ 4.6GHz vs the same 6700K produced very similar results.


----------



## lilchronic

Quote:


> Originally Posted by *axiumone*
> 
> You say that, but - http://www.overclock.net/t/1608309/x99-5960x-4-6-vs-z170-6700k-4-8-w-1080-sli-3440x1440/0_100
> 
> Testing a [email protected] vs the same 6700k produced very similar results.


I still stand by what I said; it also depends on the resolution. 3440x1440 is up there and still pretty GPU-limited.

When you tested Far Cry and Valley, did you look at the CPU usage on each core? Guaranteed it's not using more than 4 cores, so it does depend on the games you play.

Even dropping to 2560x1440 or 1080p would change those results. It would be interesting to see different resolutions and how much that changes things.


----------



## axiumone

If you get a more consistent frame rate and smaller frame-time variance, with way less heat to manage, while paying a fraction of the cost, what is the point of a 6-10 core CPU right at this very moment if all you're focusing on is gaming? That's the question I wanted answered for myself, and I believe I have.

I'm planning on redoing these tests using the PresentMon tool and some DX12 titles when I have some time. Should be interesting!
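For anyone curious what PresentMon-style analysis might look like: PresentMon's CSV output includes a per-frame `MsBetweenPresents` column (frame time in milliseconds), and a minimal sketch of the stats worth pulling from it is below. The frame times here are invented sample data, not a real capture:

```python
# Minimal sketch of frame-time crunching on PresentMon-style data.
# PresentMon logs MsBetweenPresents (per-frame time in ms) in its CSV;
# the sample list below is made up for illustration.

def frame_stats(ms_between_presents):
    """Return (avg_fps, p99_frametime_ms) for a list of frame times."""
    times = sorted(ms_between_presents)
    avg_ms = sum(times) / len(times)
    # 99th-percentile frame time: a common "smoothness" metric, since
    # averages hide stutter spikes.
    idx = min(len(times) - 1, int(round(0.99 * (len(times) - 1))))
    return 1000.0 / avg_ms, times[idx]

# Nine ~60FPS frames plus one 33ms stutter spike:
sample = [16.6, 16.8, 16.5, 17.1, 33.4, 16.7, 16.6, 16.9, 16.5, 16.8]
fps, p99 = frame_stats(sample)
print(f"avg {fps:.0f} fps, p99 frame time {p99:.1f} ms")
```

This is exactly why frame-time variance matters more than average FPS for the 6700K-vs-HEDT question: the average here still looks close to 60Hz while the p99 shows the stutter.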


----------



## lilchronic

Quote:


> Originally Posted by *axiumone*
> 
> If you get better consistent frame rate as well as smaller frame time variance, way less heat to manage while paying a fraction of the cost, what is the point of a 6-10 core cpu right at this very moment if all you're focusing on is gaming? That's the question I wanted answered for myself and I believe that I have.
> 
> I'm planing on redoing these tests using the presentmon tool and some dx12 titles when I have some time. Should be interesting!


Yeah, I agree with ya. Great work! The 6700K is an awesome chip - looking forward to seeing more of your results.


----------



## Baasha

Quote:


> Originally Posted by *SnakeBiteScares*
> 
> So many posts, so many op builds. Tried skimming through and looking at sigs but some of them seem too much for what I'm asking. Which cpu would you put with a rig with 2 of these cards for 4K? Anyone got 2 of them and can tell me more about how the SLI is working out for them? It's already a high cost for one card so want to make sure it's worth it if I'm getting two


At 4K, you don't need two cards. This is from someone who is running 4x Titan X Pascals.

Here's a benchmarking video of various games using one Titan X Pascal at 4K and 5K (which is ~78% more pixels than 4K):

This is of course assuming your 4K monitor is a 60Hz panel, and since no 120Hz (or higher) 4K monitors are out yet, it's safe to say that you definitely do NOT need 2x Titan XPs to run 4K @ 60Hz in pretty much every game out there.
Quote:


> Originally Posted by *TremF*
> 
> I don't have two cards and am able to run pretty much every game in 4K ultra/max settings (except AA which isn't really needed) on my 32" Acer Predator XB321HK GSync monitor.
> 
> I have an I7 4930K @ 4.3GHz. Any game that drop below 60FPS is managed by my monitors GSync and anything over by FastSync.


How is that Acer 32" G-Sync 4K monitor? I was thinking of doing 4K Surround again on the Uber Rig and moving my 5K monitor to the 2nd rig.


----------



## TremF

Quote:


> Originally Posted by *Baasha*
> 
> How is that Acer 32" G-Sync 4K monitor? I was thinking of doing 4K Surround again on the Uber Rig and moving my 5K monitor to the 2nd rig.


I'm loving it. Any games under 60FPS are managed by G-Sync, and Fast Sync takes care of any over. If it wasn't for the MSI AB overlay I wouldn't know whether my FPS was high or low. The built-in speakers aren't the best, but most people use external speakers or a headset, so that's not a problem. I don't mind the sound - I've used a headset for so many years that it's good not to have to for a change.

This is the biggest monitor I've used (apart from using a 40" 1080p TV now and again). My last was a 27" 1440p ViewSonic VP2770-LED, which I liked, but since getting this 4K I haven't even bothered with my Rift CV1.

Before getting this I was looking at standard 4K monitors or 1440p G-Sync monitors, but nothing grabbed me, and I'm not into ultrawide at all. As soon as a friend drew my attention to this one I knew instantly it was for me. No regrets. Gaming at 4K with just one TXP is amazing - it was worth moving from 2x GTX Titan X.


----------



## bizplan

Quote:


> Originally Posted by *TremF*
> 
> I'm loving it. Any games under 60FPS are managed by the GSync and any over FastSync takes care of. If it wasn't for the overlay from MSI AB I wouldn't know if my FPS was high or low. The built in speakers aren't the best but most people use external speakers or a headset so not a problem. I don't mind the sound as I have used a headset for so many years it's good to not have to for a change.
> 
> This is the biggest monitor I've used (apart from using a 40" 1080P TV now and again). My last was a 27" 1440P ViewSonic VP2770-LED which I liked but since getting this 4K I haven't even bothered with my Rift CV1.
> 
> Before getting this I was looking at standard 4K monitors or 1440P Gsync monitors but nothing grabbed me and I'm not into the ultrawide at all. As soon as a friend drew my attention to this I knew instantly it was for me. No regrets. Gaming at 4K with just one TXP is amazing. It was worth moving from 2 GTX Titan X


I second that Acer XB321HK 32" G-Sync 4K monitor and single TXP combo (+200/+600). I'm running DOOM at 5K (5120x2880 using DSR, setting the game AND monitor to that resolution) at 60 FPS average, on a 6700K @ 4.7GHz.


----------



## Jpmboy

Quote:


> Originally Posted by *axiumone*
> 
> Can confirm, a well overclocked 6700k is in fact better for gaming than a haswell-e or a broadwell-e.


I wouldn't go so far as "better"... unless one is playing old 1-2 core CPU-bound games. And very few "gaming" benchmarks support your conclusion.








We had this discussion here months ago... my point is that for the price, a 6700K is good enough.


----------



## SnakeBiteScares

Quote:


> Originally Posted by *TremF*
> 
> I don't have two cards and am able to run pretty much every game in 4K ultra/max settings (except AA which isn't really needed) on my 32" Acer Predator XB321HK GSync monitor.
> 
> I have an I7 4930K @ 4.3GHz. Any game that drop below 60FPS is managed by my monitors GSync and anything over by FastSync.


Quote:


> Originally Posted by *Jpmboy*
> 
> I run two. depending on which games, a 6700K can be enough.


Quote:


> Originally Posted by *axiumone*
> 
> Can confirm, a well overclocked 6700k is in fact better for gaming than a haswell-e or a broadwell-e.


Quote:


> Originally Posted by *lilchronic*
> 
> I see it like this 6700k is good if you don't plan on running above 60Hz once you double the fps like 120 -144 the 6700k starts to fall behind but It also depends on the games you play aswell.


Quote:


> Originally Posted by *axiumone*
> 
> You say that, but - http://www.overclock.net/t/1608309/x99-5960x-4-6-vs-z170-6700k-4-8-w-1080-sli-3440x1440/0_100
> 
> Testing a [email protected] vs the same 6700k produced very similar results.


Quote:


> Originally Posted by *lilchronic*
> 
> Yeah i agree with ya. great work! 6700k is a awesome chip. looking forward to see some more results of your's.


Seems like the 6700K is the choice for me then, which is great, as that is what I'm already using. I was just wondering about the number of PCIe lanes required for two cards, but it seems like that CPU will be fine.
Quote:


> Originally Posted by *Baasha*
> 
> At 4K, you don't need two cards. This is from someone who is running 4x Titan X Pascals.
> 
> Here's a benchmarking video of various games using one Titan X Pascal at 4K and 5K (which is 70% more pixels than 4K):


Thanks for the video. The games I'd be playing are Witcher 3, BF1, DOOM, and GTA V. It looks like a single card would manage most of these fine. I notice, though, that Witcher 3 doesn't even hit 60fps at 4K, and that's the kind of thing that bothers me and pushes me toward SLI instead. I'll think more on this before I decide if it'll be worth it. Good to know I won't have to upgrade my CPU, though. Thanks for the info, guys.


----------



## axiumone

Quote:


> Originally Posted by *Jpmboy*
> 
> I wouldn;t go so far as better... unless one is playing old 1-2 core cpu-bound games. And very few "gaming" benchmarks support your conclusion.
> 
> 
> 
> 
> 
> 
> 
> 
> we had this discussion here months ago.... my point is for the price a 6700K is good enough.


Can you point me to a few that don't support my conclusion? I'd like to test them - genuinely asking. In the testing I've done so far, the evidence points to the contrary. Even games that people always assumed were heavily threaded, like GTA V, performed better on the 6700K vs some of the socket 2011 counterparts I've tested.


----------



## Jpmboy

Quote:


> Originally Posted by *axiumone*
> 
> Can you point me to a few that don't support my conculsion? I'd like to test them. Genuinely asking. In the testing that I've done so far, evidence points to the contrary. Even games that people always assumed were heavily threaded like GTAV performed better on the 6700k vs some of the socket 2011 counterparts that I've tested.


I'd have to search through this thread for the examples, but I do have these handy:

http://www.overclock.net/t/1518806/firestrike-ultra-top-30/0_20
http://www.overclock.net/t/1443196/firestrike-extreme-top-30
http://www.overclock.net/t/1464813/3d-mark-11-extreme-top-30
http://www.overclock.net/t/872945/top-30-3d-mark-13-fire-strike-scores-in-crossfire-sli
http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores
http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad
http://www.overclock.net/t/1406832/single-gpu-firestrike-top-30
http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/0_20

You can see GFX-only scores in some of these directly, and depending on the physics load in a given game, PhysX and combined scores are "directionally consistent" with the core-count benefit. But no doubt the IPC in a 6700K is very good... with the 7700K arriving very soon, that may be the mainstream CPU to go with.


----------



## axiumone

Ohh, you mean straight-up benchmarks - I misunderstood. Yeah, 3DMark definitely scales the physics score with more cores; unfortunately those gains don't carry over to actual gaming.


----------



## jhowell1030

Quote:


> Originally Posted by *axiumone*
> 
> Ohh, you mean straight up benchmarks, I misundestoood. Yeah 3dmark definitely scales the physics score with more cores, unfortunately those gains don't carry over to actual gaming.


Exactly.


----------



## ttg35fort

I debated whether to go 6700K or 6800K. For me, the decision came down to how long I expect to use my CPU. The 6700K will outperform the 6800K a bit in many, if not most, of today's games. I have only seen benchmarks for one game that benefits from more than 4 cores, Ashes of the Singularity, and that was using DirectX 12. http://www.pcworld.com/article/3039552/hardware/tested-how-many-cpu-cores-you-really-need-for-directx-12-gaming.html

As developers build more games that are configured to maximize the benefits of DirectX 12 or Vulkan, more titles will likely see benefits from 6 or more cores. It may take a year or two for such titles to be released.

My last CPU was an i7 920 clocked at 4GHz; I used it for 7 years. I expect to use my current CPU for nearly that long, so I went with the 6800K. If I were planning on upgrading in less than 2 years, I would have gone with the 6700K or waited for the 7700K.


----------



## jhowell1030

Odd question for you folks out there.

I really want to watercool this bad boy. Optimally, would like to have 1 280mm radiator up front (slim would be fine seeing as I have an x61 on my cpu right now) and 240mm radiator on top. All I would be cooling would be my Titan and 5820k (4.3ghz @ stock voltage) Ideally I would like to do hard tubing, one quiet pump, and a small reservoir. I am completely new to this though.

Here's the interesting part. I have a gift card to Amazon for $250. If you could get $250 worth of parts from Amazon...what would you suggest I pick up there? I do plan on spending more than that on the watercooling stuff (probably $600ish out of pocket) but I didn't really know how to best utilize the $250 from Amazon.

Right now I do have a Kraken G10 bracket laying around... but I don't know how successful folks have been using that on the Titan.


----------



## ttg35fort

I would make your component selections first, then see which ones are available on Amazon. As far as GPU, I went with a water block specifically designed for the Titan X (Pascal). Mine is an EK, but I think other manufacturers make them for the Titan X (Pascal) as well.


----------



## jhowell1030

Quote:


> Originally Posted by *ttg35fort*
> 
> I would make your component selections first, then see which ones are available on Amazon. As far as GPU, I went with a water block specifically designed for the Titan X (Pascal). Mine is an EK, but I think other manufacturers make them for the Titan X (Pascal) as well.


I guess I'm asking for input on components available from Amazon. I know nothing about watercooling components - who makes the good ones (other than EK), who cheaps out, etc.


----------



## Jpmboy

Quote:


> Originally Posted by *axiumone*
> 
> Ohh, you mean straight up benchmarks, I misundestoood. Yeah 3dmark definitely scales the physics score with more cores, unfortunately those gains don't carry over to actual gaming.


Quote:


> Originally Posted by *jhowell1030*
> 
> Exactly.


If you guys are comparing a 6700K @ 5.0 to a 5960X @ 4.7 or a 6950X at 4.4... yeah, sure, when running low-core-count games with console physics. In that use case, my 6320 at 4875 with RAM at 4333 is the best choice, I guess.

Like I said, users here went through this a while ago and posted game benchmarks they ran in this thread. Price/performance, a 6700K is best (as I recommended). Can't say I'm disappointed by my 6950X or 5960X - or 6700K for that matter - and they are all running within 20ft of each other.


----------



## stephen427

Hey guys! Just preordered my first Titan GPU - a lovely Titan XP coming in the mail next week. I've always been a high-end chaser, going from the 780 Ti to the 980 Ti every gen. I really can't wait for the 1080 Ti this time around, though I have my doubts it will be much better than the Titan XP, the way the 980 Ti was next to the Maxwell Titan.

I will be installing an EKWB full-cover block on it, and I hope the temps will be alright. With the full-cover block on my 980 Ti, temps are in the 60s C, which I found a bit disappointing for stock voltage - and it's even hooked up to its own 360mm rad with high-quality Noiseblocker fans at 1300rpm, plus a 240mm rad for the CPU alone.

Anyone know what to expect in terms of temps on the Titan XP? I really hope it won't be as high this time. I paid as much for my watercooling parts as I did for this Titan XP - all EKWB parts, no cheap stuff. :/


----------



## ttg35fort

Quote:


> Originally Posted by *stephen427*
> 
> Anyone know what to espect in terms of temp on titan xp? I really hope it wont be as high this time. I paid as much on my watercooling parts as I did on this titan xp. All EKWB parts. No cheap stuff. :/


Mine runs about 42 deg C in stress testing at a +200MHz overclock on the GPU and +600MHz on the VRAM, at stock voltage. I am using the EK TXP water block, a CPU/motherboard water block, a 420mm radiator and a 280mm radiator. I have 1500rpm fans, with the fans and water pump controlled by the motherboard, so they are not running at full speed - maybe 50-70%. I have been wanting to play with the fan/pump speed settings to see how much impact they have on the CPU and GPU temperatures, but I have not gotten around to it yet.

I have the stock TXP back plate on right now, but ordered the EK back plate to see if it will reduce the GPU temp further at my present fan/pump speed settings. It was not yet available when I ordered the water block. I am not optimistic it will lower the temperature by a noticeable amount, but I figured I would try it as an experiment.


----------



## ttg35fort

Quote:


> Originally Posted by *jhowell1030*
> 
> I guess I'm asking for input on components available from Amazon. I know nothing about watercooling components, who makes the good ones (other than ek), who cheaps out, etc.


I have Alphacool radiators and EK water blocks, pump and fittings. But, there are other reputable manufacturers, and they may be available on Amazon as well. For reference, here is my water cooling setup:

EKWB Asus X99 Full Water Block
EK-FC Titan X Pascal Water Block
EK-XRES 140 Revo D5 PWM (incl. pump)
Alphacool NexXxoS ST30 420mm
Alphacool NexXxoS ST30 280mm

The radiators look to have a decent price on Amazon, but the TXP Water Block and pump looked to be more expensive than buying directly from EK. I did not find the Asus X99 full water block on Amazon.


----------



## jhowell1030

Quote:


> Originally Posted by *ttg35fort*
> 
> I have Alphacool radiators and EK water blocks, pump and fittings. But, there are other reputable manufacturers, and they may be available on Amazon as well. For reference, here is my water cooling setup:
> 
> EKWB Asus X99 Full Water Block
> EK-FC Titan X Pascal Water Block
> EK-XRES 140 Revo D5 PWM (incl. pump)
> Alphacool NexXxoS ST30 420mm
> Alphacool NexXxoS ST30 280mm
> 
> The radiators look to have a decent price on Amazon, but the TXP Water Block and pump looked to be more expensive than buying directly from EK. I did not find the Asus X99 full water block on Amazon.


Folks out there that have an ek res or ek res/pump combo...

Can you remove that ugly ek insert that's in them? I know it sounds stupid but I just want a very clean looking res and that ruins it for me.


----------



## ttg35fort

Quote:


> Originally Posted by *jhowell1030*
> 
> Folks out there that have an ek res or ek res/pump combo...
> 
> Can you remove that ugly ek insert that's in them? I know it sounds stupid but I just want a very clean looking res and that ruins it for me.


It is removable. Also, they provide an "anti-vortex foam" piece you can use in place of, or with, the "anti-cyclone". [EDIT] Also, looking at the user guide, there is a tube you can use in place of the "anti-cyclone", which you can use with or without the foam piece. The unit is pretty versatile.

https://www.ekwb.com/shop/EK-IM/EK-IM-3831109843079.pdf


----------



## skypine27

I guess a modded BIOS (i.e. increased power draw % cap) is never coming ?


----------



## EniGma1987

Quote:


> Originally Posted by *skypine27*
> 
> I guess a modded BIOS (i.e. increased power draw % cap) is never coming ?


They have to crack the BIOS's protection methods first, which may or may not ever happen. You can easily increase your power draw % cap already, though.


----------



## skypine27

Quote:


> Originally Posted by *EniGma1987*
> 
> They have to first crack the protection methods of the bios, which may or may never happen. You can easily increase your power draw % cap already though.


I wouldn't say "easily". I'm not soldering things to my PCB


----------



## Dark

Finally received the parts from EKWB to complete the loop.


----------



## Yuhfhrh

Quote:


> Originally Posted by *skypine27*
> 
> I wouldn't say "easily". I'm not soldering things to my PCB


Just a dab of CLU that you can wipe off when you're done.


----------



## Captain_cannonfodder

How well suited is the TXP for folding at home? Not interested in WC'ing it or overclocking it, just running stock clocks and voltages.


----------



## TremF

Quote:


> Originally Posted by *Captain_cannonfodder*
> 
> How well suited is the TXP for folding at home? Not interested in WC'ing it or overclocking it, just running stock clocks and voltages.


The original GTX Titan Xs were great, so the TXP will be fantastic.


----------



## Jpmboy

Quote:


> Originally Posted by *Captain_cannonfodder*
> 
> How well suited is the TXP for folding at home? Not interested in WC'ing it or overclocking it, just running stock clocks and voltages.


Each TXP is ~1.5M PPD at 2088 on the core. Air cooled, you will need good airflow or it will thermal throttle.

TXP ~ 1.5M PPD
1080 ~ 800k PPD
TXM ~ 700k PPD

(I currently fold with all three GPUs; I posted these results in the last foldathon thread.)


----------



## Captain_cannonfodder

Quote:


> Originally Posted by *Jpmboy*
> 
> each TXP is ~ 1.5M PPD. at 2088 on the core. Air cooled you will need good air flow or it will thermal throttle.
> txp ~ 1.5MPPD
> 1080 ~ 800k PPD
> TXM ~ 700 PPD
> ( I currently fold with all three GPUs)
> 
> ( I posted these results in the last foldathon thread
> 
> 
> 
> 
> 
> 
> 
> )


What about stock clocks? I don't plan on OC'ing one if I do buy one, not really interested in ripping apart a £1100 video card to install a waterblock, something I've never done before.


----------



## WaXmAn

Dumb question, but I just sold my two Titan X Maxwell cards, and the buyer only wanted the stock OEM heatsinks, so I still have the EK waterblocks and backplates I took off. Question is: will these Titan X Maxwell blocks fit the newer Titan X Pascal? Just wondering... guess not.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *WaXmAn*
> 
> Dumb question, but I just sold my (2) Titan X maxwell video cards and the guy only wanted the stock OEM heatsinks. I have my EK waterblocks that I took off left with back plates...Question is...Will these Titan X Maxwell blocks fit the newer Titan X Pascal? Just wondering...guess not.


No they won't, sorry.


----------



## MrTOOSHORT

double post


----------



## hotrod717

Quote:


> Originally Posted by *MunneY*
> 
> yes it is, but then you gotta worry about condensation!


I use a 10,000 BTU A/C vented directly into my case and haven't had a single issue with condensation. The entire inside of the case stays close to the same temperature as the water in the loop. Voila - no condensation.


----------



## xixou

In case you have more than two Pascal GPUs, there is a trick to make games work in 3-way/4-way SLI:


----------



## chantruong

Dumb question. What size socket do you need to remove the stock cooler off this card? Thanks in advance.


----------



## MaDeOfMoNeY

My card should be here on Wednesday, so excited!!!


----------



## MrTOOSHORT

Quote:


> Originally Posted by *MaDeOfMoNeY*
> 
> My card should be here on Wednesday, so excited!!!


My excitement died off a week after I got my card, but I still remember the feeling. Felt nice.

Good luck on getting a good clocker!


----------



## TremF

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> My excitement died off a week after I got my card but I still remember the feeling. Felt nice.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Good luck on getting a good clocker!


I'm still excited about my purchase and I've had the card since September lol. It has replaced a 2x GTX Titan X SLI setup. I've even upgraded my 1440p monitor to a 4K G-Sync one and play every game at 4K ultra/max settings, except for AA, as it isn't really needed for most games at 4K.

It's a beast of a card for 4K or VR, though I haven't touched my VR since getting the 4K monitor.


----------



## Jpmboy

Quote:


> Originally Posted by *chantruong*
> 
> Dumb question. What size socket do you need to remove the stock cooler off this card? Thanks in advance.


4mm
Quote:


> Originally Posted by *Captain_cannonfodder*
> 
> What about stock clocks? I don't plan on OC'ing one if I do buy one, not really interested in ripping apart a £1100 video card to install a waterblock, something I've never done before.


Lower PPD.

(Don't really know exactly - I kept the stock heatsink on for only a few days. The 1080 value posted is air cooled - an Asus "Turbo" model - running at 21-something; max temp is mid-60s with the fan at 90%.)


----------



## stephen427

Guys, just installed my Titan XP. I'm wondering if you could confirm it's running OK.

I put on +200 core and +400 mem and it's running alright. It settles around 1950MHz on the core when gaming under load with 75-80% fan. Doesn't matter - I'm going to put the waterblock on this weekend.

I also have a problem with the card not dropping to idle clocks. What's up with this? It was not like this yesterday when I first installed it - any ideas?

It's running on 375.70, the latest Nvidia drivers; I have not changed this yet. I'm also on a 144Hz 1440p monitor, but I've tried putting it at 60Hz or 120Hz to no avail. And no, I'm not running dual monitors. I know this was an issue with drivers that was supposedly fixed, but that does not seem to be the case here. Help!


----------



## Jpmboy

Quote:


> Originally Posted by *stephen427*
> 
> guys just installed my titan XP. im wondering if you guys could confirm its running ok.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I put on 200+ core and 400+ mem its running alright. It settles around 1950Mhz on core when gaming on loads with 75-80% fan. Dont matter going to put on waterblock this weekend.
> 
> 
> 
> 
> I have problem with card running not in idle clocks? Whats up with this. It was not like this yesterday when I first installed it any ideas?.
> 
> Its running on 375.70 latest nvidia drivers. I have not changed this yet. Im also on 144hz 1440p monitor but I've tried putting it on 60Hz or 120hz to no avail. And no im not running duel monitors. I know this was a issue with drivers sometimes that was fixed does not seem case here. help


If you are not running a 144Hz monitor (a high refresh rate can keep the card from entering the P8 idle state),
set power management to Adaptive in NVCP. Or clean the drivers out with DDU and install fresh.
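A quick way to sanity-check the idle state, if it helps: the sketch below parses the kind of line that `nvidia-smi --query-gpu=pstate,clocks.gr,clocks.mem --format=csv,noheader` prints and checks for the P8 idle state. The sample strings are assumed examples, not captured from a real TXP.

```python
# Minimal sketch: decide whether the card has dropped to its P8 idle state.
# smi_line is assumed to be nvidia-smi CSV output of the form
# "<pstate>, <gpu clock>, <mem clock>"; on a real system you would
# capture it with subprocess.check_output.
def is_idle(smi_line: str) -> bool:
    """True if the reported performance state is the P8 idle state."""
    pstate = smi_line.split(",")[0].strip()
    return pstate == "P8"

print(is_idle("P8, 139 MHz, 405 MHz"))    # True: card has clocked down
print(is_idle("P0, 1949 MHz, 5005 MHz"))  # False: stuck at 3D clocks
```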


----------



## stephen427

The new Nvidia driver hotfix worked! Also, is my memory supposed to run at 5000MHz? Isn't the Titan XP supposed to be at 10000MHz?


----------



## Jpmboy

Quote:


> Originally Posted by *stephen427*
> 
> New nvidia driver hotfix worked!. Also is my memory supposed to run at 5000Mhz? Isn't the titanXP supposed to be at 10000mhz?


Neither GPU-Z nor AB beta shows the effective memory speed directly. GPU-Z reads the base memory clock (effective speed ÷ 8 for GDDR5X, so ~1251MHz), and Afterburner reports the effective speed ÷ 2 - that's the 5000MHz you're seeing.
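For anyone who wants the arithmetic spelled out, here's a minimal sketch of that relationship, assuming Afterburner shows half the effective rate and GPU-Z shows the base GDDR5X memory clock (one-eighth of it):

```python
# Sketch of how the two tools' readings map to the advertised "10000 MHz"
# effective figure for the Titan X Pascal's GDDR5X.
def effective_from_afterburner(mhz):
    return mhz * 2   # Afterburner shows effective speed / 2

def effective_from_gpuz(mhz):
    return mhz * 8   # GPU-Z shows the base memory clock (effective / 8)

print(effective_from_afterburner(5000))  # 10000
print(effective_from_gpuz(1251))         # 10008
```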


----------



## mbze430

I'd be more excited to see the BIOS being hacked....


----------



## Captain_cannonfodder

Caved and ordered one on Monday night, along with some new parts for a dedicated folding PC. Super annoyed, as the only place to order them is Nvidia directly, which is run by Digital River... My Scan order was processed and picked today and will be with me later tonight - it's 2am here now. Meanwhile at Digital River, it's still stuck on "processing". The card hasn't been charged yet, and the customer service rep said to give it till Friday and, if nothing has changed, to call in again... So much for 1-3 day shipping.


----------



## Jpmboy

Quote:


> Originally Posted by *Captain_cannonfodder*
> 
> Caved and ordered one on Monday night along with some new parts for a dedicated folding PC. Super annoyed as the only place to order them is Nvidia directly which is run by Digital River... My Scan order was processed and picked today and will be with me later tonight, 2am here now. Meanwhile at Digital River, its still stuck on "processing". Card hasn't been charged yet and the customer service rep said to give it till Friday, if nothing has changed to call in again... So much for 1-3 day shipping.


Mine arrived in 3 days (ordered at launch). Nvidia has kept the TXP to itself... hence the lack of a BIOS unlock.


----------



## Captain_cannonfodder

Quote:


> Originally Posted by *Jpmboy*
> 
> mine arrived in 3 days (ordered at launch). Nvidia has held the TXP to itself... hence the lack of a bios unlock.


It's not been a good day for me. My new Google Pixel died for no reason and it's out of stock at EE, so I have to wait till it's back in stock for a replacement to be shipped to me. EE, to their credit, gave me a free month off my bill and will ship one same-day if they get stock in before 11am. Just hoping the order status will update soon, as I've only got this week off work and it's £1100! Having that sit in limbo is making me worried.


----------



## Jpmboy

Quote:


> Originally Posted by *Captain_cannonfodder*
> 
> Its not been a good day for me. My new Google Pixel died for no reason and its out of stock at EE so I have to wait till it comes back for a new model to be shipped to me. EE to their credit gave me a free month off my bill and will ship one same day if they get the stock in before 11am. Just hoping that it will update soon as I've only got this week off work and its £1100! Having that sit in limbo is making me worried.


How do (or did) ya like the Pixel? I've been thinking of upgrading my S4.


----------



## Captain_cannonfodder

Quote:


> Originally Posted by *Jpmboy*
> 
> how do ya like(d) the Pixel, I've been thinking of upgrading my S4.


As far as phones go it's alright. The camera is great, and the interface is also good but takes a bit of getting used to coming from an iPhone. It uses a USB Type-C connector, which isn't as secure as the Lightning one on the iPhone, which can be annoying. I haven't used the fingerprint scanner as it's on the back of the phone. The screen is lovely and the regular model is plenty big enough. No expandable storage, though, which could be an issue for some, but I got the 128GB model. It does not, however, play well with my Mercedes' Bluetooth system. Phone calls work OK, but the audio is temperamental at best. Sometimes it works, sometimes it doesn't. Sometimes it lets you change tracks using the controls on the steering wheel, sometimes it doesn't. Sometimes it connects, sometimes it doesn't. I've been reading that it's more of an Android problem than a device issue.

Build quality isn't as nice as the iPhone's. The back of the phone is plastic rather than metal, and the front looks a little odd without a Home button. The speakers are OK - they get the job done but aren't anything to write home about.

I got the Pixel over the S7 Edge because of Samsung's exploding batteries; had Samsung made proper batteries and not bombs, I would have had one of their phones. The problem with the Pixel is that it's directly competing with the iPhone. It looks like an iPhone and it's around the same price, but it doesn't have the same level of quality in the handset itself. That being said, I moved away from getting the iPhone 7 because of Apple's "courageous" decision to ditch the headphone jack. I drive HGVs for a living and nearly all of them have aux ports. I need the headphone jack so I can listen to my own music. It was that big of a deal for me that I had to switch.


----------



## Jpmboy

Quote:


> Originally Posted by *Captain_cannonfodder*
> 
> As far as phones go its alright. Camera is great, interface is also good but takes a bit of getting used to coming from an iPhone. It uses a Type C USB connector which isn't as secure as the Lightning one on the iPhone which can be annoying. Not used the finger print scanner thing as its on the back of the phone. The screen is lovely and the regular model is plenty big enough. No expandable storage though which could be an issue for some but I got the 128GB model. It does not however play well with my Mercedes car bluetooth system. Phone calls work ok but the audio is tempermental at best. Sometimes it works, sometimes it doesn't. Sometimes it lets you change tracks using the controls on the steering wheel, sometimes it doesn't. Sometimes it connects, sometimes it doesn't. Been reading that its more of an Android problem rather than a pure device issue.
> 
> Build quailty isn't as nice as the iPhone. The back of the phone is plastic rather than metal and the front looks a little odd without a Home button. The speakers are OK, they get the job done but they aren't anything to write home about.
> 
> I got the Pixel over the S7 Edge because of Samsungs exploding batteries, had Samsung made proper batteries and not bombs I would of had one of their phones. The problem with the Pixel is that its directly competing with the iPhone. It looks like an iPhone and its around the same price but it doesn't have the same level of quality in the handset itself. That being said, I moved away from getting the iPhone 7 because of Apple's "courageous" decision to ditch the headphone jack. I drive HGV's for a living at nearly all of them have aux ports. I need the headphone jack so I can listen to my own music. It was that big of a deal for me that I had to switch.


Super review! thanks. +1


----------



## MaDeOfMoNeY

Look what just got here


----------



## Yuhfhrh

Quote:


> Originally Posted by *MaDeOfMoNeY*
> 
> Look what just got here


Welcome to the club!


----------



## MaDeOfMoNeY

Quote:


> Originally Posted by *Yuhfhrh*
> 
> Welcome to the club!


Thanks I appreciate it!!


----------



## Baasha

Any suggestions on fixing the VRAM bug I keep getting? I've had this issue since I started running 4 Way SLI in early Aug.

It shows around 4TB of usage when obviously that's not the case. I asked the guy who built Afterburner and he just said it's a 'driver bug.'

Would like to see the VRAM usage when playing at 5K and 8K...


----------



## Silent Scone

Quote:


> Originally Posted by *Baasha*
> 
> Any suggestions on fixing the VRAM bug I keep getting? I've had this issue since I started running 4 Way SLI in early Aug.
> 
> It shows around 4TB of usage when obviously that's not the case. I asked the guy who built Afterburner and he just said it's a 'driver bug.'
> 
> Would like to see the VRAM usage when playing at 5K and 8K...


I've found a fix.

Limit SLI to 2 cards.

You're welcome


----------



## EniGma1987

Quote:


> Originally Posted by *Baasha*
> 
> Any suggestions on fixing the VRAM bug I keep getting? I've had this issue since I started running 4 Way SLI in early Aug.
> 
> It shows around 4TB of usage when obviously that's not the case. I asked the guy who built Afterburner and he just said it's a 'driver bug.'
> 
> Would like to see the VRAM usage when playing at 5K and 8K...


I would guess it has to be fixed by Nvidia in the driver. The bug probably relates to there no longer being real support for more than 2-way, so the driver flips out when trying to read the VRAM with 4 cards present.


----------



## Baasha

Quote:


> Originally Posted by *Silent Scone*
> 
> I've found a fix.
> 
> Limit SLI to 2 cards.
> 
> You're welcome


----------



## xixou

Quote:


> Originally Posted by *Silent Scone*
> 
> I've found a fix.
> 
> Limit SLI to 2 cards.
> 
> You're welcome


Actually, the same bug exists in 2-way SLI ^^

BTW, if you play games, Baasha, you can enjoy your 4-way SLI as well by changing one profile value (instead of only 2 cards being used).


----------



## EniGma1987

Quote:


> Originally Posted by *xixou*
> 
> Actually the same bug exists in 2 way sli ^^
> 
> BTW, if you play games Baasha, you can enjoy your 4 way sli as well by changing one profile value (instead of only 2 cards being used).


Maybe Nvidia intentionally broke VRAM reading in SLI to hide what it's doing with memory on the new SLI hardware revision, so that AMD doesn't try to do something similar. Wouldn't surprise me at all.


----------



## Baasha

Quote:


> Originally Posted by *xixou*
> 
> Actually the same bug exists in 2 way sli ^^
> 
> BTW, if you play games Baasha, you can enjoy your 4 way sli as well by changing one profile value (instead of only 2 cards being used).


Hmm... that's strange - when I was running 2-way SLI, the VRAM reading was fine. Only when I switched to 4-way did it break.

Also, you have PM.


----------



## TheGeneralLee86

Just bought a Titan X Pascal and an i7-6950X Extreme Edition for my new mATX build, and will also be buying a second one later on!







:thumb:


----------



## Jpmboy

Quote:


> Originally Posted by *TheGeneralLee86*
> 
> Just bought a Titan X Pascal and i7 6950X Extreme Edition for my new mATX build and will also be buying second one later on!
> 
> 
> 
> 
> 
> 
> 
> :thumb:


avatar and location? huh?


----------



## pompss

Time to sell those Titan Xs, guyzzzzz... the 1080 Ti is coming in January.

By the way, I just want to give an update on the Titan X I sold.

Apparently the guy who bought my Titan X was able to reach 2190MHz on water after doing the mod.

He got a really good card.

I think that's the highest core clock I've seen on a water-cooled Titan X.


----------



## Silent Scone

Quote:


> Originally Posted by *pompss*
> 
> Time to sell those titan x guyzzzzz.....
> 
> 
> 
> 
> 
> 
> 
> .
> 1080 ti coming in January
> 
> By the wait i just want to update about my titan x that i sold.
> 
> Apparently the guy who bought My titan x after doing the mod he was able to reach 2190 mhz on water.
> 
> Really a good card he got
> 
> 
> 
> 
> 
> 
> 
> .
> 
> I think thats the higher core clock i saw on water on the Titan X


You sold your Titan X to get a 1080Ti? *Heavy mouth breathing*


----------



## carlhil2

That's like selling a 1080 to cop a 1070 isn't it?


----------



## Leyaena

Quote:


> Originally Posted by *TheGeneralLee86*
> 
> Just bought a Titan X Pascal and i7 6950X Extreme Edition for my new mATX build and will also be buying second one later on!
> 
> 
> 
> 
> 
> 
> 
> :thumb:


What motherboard will you be using? I've been wanting to downscale, but I honestly couldn't find a 2011-3 mATX motherboard I liked enough to make the switch.


----------



## Silent Scone

Quote:


> Originally Posted by *Leyaena*
> 
> What motherboard will you be using? I've been wanting to downscale, but I honestly couldn't find a 2011-3 mATX motherboard I liked enough to make the switch.


There's only one in that form factor, which is the ASRock one.


----------



## Leyaena

Quote:


> Originally Posted by *Silent Scone*
> 
> There's only one in that SKU which is the Asrock one.


Not quite, Asus has one:
https://www.asus.com/Motherboards/X99M_WS/

And EVGA has two as well, the X99 Micro and the X99 Micro 2.


----------



## Silent Scone

Ignore me, I read that as ITX


----------



## Leyaena

Hey, no problem


----------



## Lobotomite430

If anyone is curious, I made a more detailed video about modding an EVGA 1080/1070 Hybrid kit to fit the Titan X.


----------



## guttheslayer

Quote:


> Originally Posted by *Lobotomite430*
> 
> If anyone is curious I made a more detailed video about modding a EVGA 1080/1070 hybrid kit to fit the Titan X


Wow, maybe ppl will start finding you to mod their hybrids.

I really hope Nvidia releases the TXP to their AIB partners, just so we can see an AIO version of it.


----------



## Lobotomite430

Quote:


> Originally Posted by *guttheslayer*
> 
> Wow maybe ppl will start finding you to mod their hybrid.
> 
> I really hope Nvidia would release their TXP to their AiB just so we can see a AiO version of this.


A few have found me, as I have sold some modded kits. I wish EVGA had made one - it's not that different from the 1080 kit.


----------



## jhowell1030

Quote:


> Originally Posted by *Lobotomite430*
> 
> A few have found me as I have sold some modded kits. I wish EVGA would have made one its not that different from the 1080 kit.


I do too, but you can't blame them for not rushing one out. I feel like more Titan Xs would've been sold if Nvidia had let partners have a slice of the pie.


----------



## Baasha

Happy to report that the VRAM bug has been fixed - the new 375.86 driver works great. Using about 11.6GB at 8K with everything on Ultra in BF1:


----------



## shiokarai

The VRAM bug is indeed fixed. Dishonored 2 running with 4x DSR on my Asus PG348Q (so 6880x2880) reports >10GB of VRAM usage. Running at native res (3440x1440) uses about 8GB of VRAM.


----------



## mbze430

Hot Fix 375.95 for low GPU usage

https://forums.geforce.com/default/topic/977133/geforce-drivers/announcing-hot-fix-driver-375-95/


----------



## Artah

Quote:


> Originally Posted by *mbze430*
> 
> Hot Fix 375.95 for low GPU usage
> 
> https://forums.geforce.com/default/topic/977133/geforce-drivers/announcing-hot-fix-driver-375-95/


Nice, I need this fix. Have you tried it?


----------



## TheGeneralLee86

Quote:


> Originally Posted by *Leyaena*
> 
> What motherboard will you be using? I've been wanting to downscale, but I honestly couldn't find a 2011-3 mATX motherboard I liked enough to make the switch.


The one I am buying is the ASUS X99-M WS: µATX, dual PCIe 3.0 x16, DDR4, USB 3.1, with onboard 802.11ac Wi-Fi, Bluetooth and audio. It supports 64GB of RAM and SLI.


----------



## Baasha

Quote:


> Originally Posted by *mbze430*
> 
> Hot Fix 375.95 for low GPU usage
> 
> https://forums.geforce.com/default/topic/977133/geforce-drivers/announcing-hot-fix-driver-375-95/


Are you kidding me?









Sick and tired of these quick-fire driver releases; each one takes a while to uninstall (DDU), reinstall and set up (custom settings, etc.).

Ugh...


----------



## meson1

Quote:


> Originally Posted by *Baasha*
> 
> Are you kidding me?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sick and tired of these quick-fire driver releases; each one takes a while to uninstall (DDU), reinstall and setup (custom settings etc.).
> 
> Ugh...


That sounds like overkill to me. I get what you're doing, but I don't find that installing over the top mucks up Nvidia's drivers all that often. Surely you only need to do that when you hit problems, or every now and again to keep things clean.


----------



## CptSpig

Quote:


> Originally Posted by *meson1*
> 
> That sounds like overkill to me. I get what you're doing, but I don't find that installing over the top mucks up Nvidia's drivers all that often. Surely you only need to do that when you hit problems, or every now and again to keep things clean.


Absolutely wrong - installing Nvidia drivers over the top, or using GeForce Experience, will definitely give you problems. Best to use DDU, then do a clean install of only the graphics and PhysX drivers. This will give you the best gaming and benchmarking experience. That way you know it's done right.


----------



## meson1

Quote:


> Originally Posted by *CptSpig*
> 
> Absolutely wrong installing Nvidia drivers over the top or using GeForce experience will definitely give you problems. Best to use DDU and load the graphics and physics drivers only using clean install. This will give you the best gaming and bench marking experience. Now you know it's done right.


I'm sure you're right. But I've never had a problem in all these years. Either I've gotten lucky, or I'm leaving a small amount of performance untapped. Or both.


----------



## Artah

Quote:


> Originally Posted by *meson1*
> 
> I'm sure you're right. But I never had a problem in all these years. Either I've gotten lucky, or I'm leaving a small amount of performance untapped. Or both.


I've been installing drivers on top of one another myself, unless I run into issues - then I DDU. I've had no problems overclocking my video cards to hell on multiple rigs doing this. The only time I use DDU every single time is when I switch video cards, which I have done very often lately: 780, 980, 980 Ti, Titan X, 1080 FE, 1080 FTW and Titan X Pascal. I use Nvidia Experience all the time to baseline all my game settings before I go into the games and make my changes.


----------



## meson1

Quote:


> Originally Posted by *Artah*
> 
> I've been installing drivers on top of another myself unless I run into issues then I DDU. I had no problems overclocking my video cards to hell on multiple rigs by doing this. The only time I use DDU every single time is when I switch video cards which I have done very often lately 780/980/980 ti/titan x/ 1080 FE, 1080 FTW and titan x pascal. I use the NVidia experience all the time to baseline all my games settings before I go into the games and make my changes.


Mind you, CptSpig is talking about benchmarking. If you're into benchmarking and eking out every last drop of performance, then you need every edge you can get, which would mean a clean, fresh driver install every time. So I get where he's coming from.


----------



## CptSpig

Quote:


> Originally Posted by *Artah*
> 
> I've been installing drivers on top of another myself unless I run into issues then I DDU. I had no problems overclocking my video cards to hell on multiple rigs by doing this. The only time I use DDU every single time is when I switch video cards which I have done very often lately 780/980/980 ti/titan x/ 1080 FE, 1080 FTW and titan x pascal. I use the NVidia experience all the time to baseline all my games settings before I go into the games and make my changes.


Like I said above, if you want the best performance out of your computer, do it right. You may not be as picky as some of us computer builders. When you see tiny glitches or micro stutter while gaming, you probably think it's the game when it is in fact a driver issue. If you're OK with that, then that's all that matters.


----------



## Artah

Quote:


> Originally Posted by *CptSpig*
> 
> Like I said above if you want the best performance out of your computer do it right. You may not be as picky as some of us computer builders. When you see tiny glitches or a micro shutter while gaming you probably think it's the game when it is in fact a driver issue. If your ok with this
> than that's all that matters.


I see many reporting this micro stutter but I've never once experienced it. I've been building computers since the 8086 days, so I have a tiny bit of experience with it.


----------



## Yuhfhrh

Quote:


> Originally Posted by *Artah*
> 
> I see many reporting this micro stutter but never once experienced it. I've been building computers since the 8086 days so I have a tiny bit of experience with it.


Use Nvidia Fast Sync on a 60Hz monitor with a game running 70-150 fps, then compare directly afterwards with V-Sync on. That is the best example I can give you of what micro stutter looks like.

If you've never seen it, I would not recommend looking for it. Once you see it, it can't be unseen.


----------



## CptSpig

Quote:


> Originally Posted by *Artah*
> 
> I see many reporting this micro stutter but never once experienced it. I've been building computers since the 8086 days so I have a tiny bit of experience with it.


My friend, I am not arguing with you... if it works for you, keep doing what you're doing.


----------



## Artah

Quote:


> Originally Posted by *CptSpig*
> 
> My friend I am not arguing with you....if it works for you keep doing what your doing.


It's not that - I'm just wondering why so many get micro stutter while I'm not seeing it, when I should be more susceptible to it since I run SLI. All I have ever seen is rubber banding in GTA5; I guess that's when my neighbors download pron and slow down my cable internet connection.


----------



## TremF

Quote:


> Originally Posted by *Artah*
> 
> It's not that, I'm just wondering why so many get micro stutter but I'm not seeing it and I should be more susceptible to it since I SLI. All I have ever seen is rubber banding in GTA5, I guess that's when my neighbors download pron and slowing down my cable internet connection.


You may just not notice it, or be able to ignore it. It wasn't until I moved from the GTX Titan X SLI setup I had on a 1440p monitor to a TXP with 4K G-Sync (and Fast Sync for anything over 60fps) that I noticed just how much smoother it was. It was a definite wow moment.


----------



## CptSpig

Quote:


> Originally Posted by *Artah*
> 
> It's not that, I'm just wondering why so many get micro stutter but I'm not seeing it and I should be more susceptible to it since I SLI. All I have ever seen is rubber banding in GTA5, I guess that's when my neighbors download pron and slowing down my cable internet connection.


No worries! Try a fresh driver install and see if you still get rubber banding. Only install the graphics and PhysX drivers.


----------



## WaXmAn

Just got my Titan X Pascal installed with an EK block and backplate. Anyone else have annoying coil whine?


----------



## Blaise Pascal

Hi guys, been following this thread since I got my Titan XP at the start of September. I've really enjoyed the conversation. Great comments and advice!

As my new build quickly turned me into a poor-ass, I'm still using an old Dell 1080p 60Hz 24" LCD as my display and am vastly under-utilizing my TXP in every respect... Like a... commoner. Bleh.

My funds are back up though and I come seeking advice!

--Currently I am considering the Asus ROG swift PG278Q - 144hz, g-sync, 1ms, 1440 at about $650.

--I can't help but notice though that there are some wonderful monitors that don't have g-sync though that are cheaper:
BenQ XL2730Z - 144hz, 1ms, 1440 at about $600

--and the particularly beautiful Acer XG270HU - 144hz, 1ms, 1440, edge to edge/borderless at about $475.

Reviews for all of those are well within my standards. I am not particularly interested in 4k, as I have tried and can't tell too much of a difference past 2k (I don't have the best eyes). I also believe that 27" is an ideal size for my setup.

I don't want to take up too much space here, but does anyone have a quick thought on this? Is G-sync THAT important if I properly manage things (overwatch, BF1, etc)?
Money isn't too important, but a borderless display would be pretty sexy. And it's always nice to save a buck, as I do need a few other things. My headset is from 2004 and all my friends hate me for it LOL.


----------



## Blaise Pascal

Oh!
Additionally the Dell S2716DG - 1440, 144hz, gsync, edge-to-edge, 1ms, $500


----------



## xarot

I usually run with G-Sync off because it introduces some noticeable lag to mouse movement, at least on my PG279Q in multiplayer FPS games... and I don't bother to turn it back on after I finish.


----------



## Lee0

double post ???


----------



## mbze430

Just so you all know... new drivers lol















https://forums.geforce.com/default/topic/977502/geforce-drivers/official-375-95-game-ready-whql-display-driver-feedback-thread-released-11-18-16-/

375.95


----------



## Artah

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Blaise Pascal*
> 
> Hi guys, been following this thread since I got my Titan XP at the start of September. I've really enjoyed the conversation. Great comments and advice!
> 
> As my new build quickly turned me into a poor-ass, I'm still using an old Dell 1080 60hz 24" lcd as my display and am vastly under-utilizing my TXP in every respect... Like a... commoner. Bleh.
> 
> My funds are back up though and I come seeking advice!
> 
> --Currently I am considering the Asus ROG swift PG278Q - 144hz, g-sync, 1ms, 1440 at about $650.
> 
> --I can't help but notice though that there are some wonderful monitors that don't have g-sync though that are cheaper:
> BenQ XL2730Z - 144hz, 1ms, 1440 at about $600
> 
> --and the particularly beautiful Acer XG270HU - 144hz, 1ms, 1440, edge to edge/borderless at about $475.
> 
> Reviews for all of those are well within my standards. I am not particularly interested in 4k, as I have tried and can't tell too much of a difference past 2k (I don't have the best eyes). I also believe that 27" is an ideal size for my setup.
> 
> I don't want to take up too much space here, but does anyone have a quick thought on this? Is G-sync THAT important if I properly manage things (overwatch, BF1, etc)?
> Money isn't too important, but a borderless display would be pretty sexy. And it's always nice to save a buck, as I do need a few other things. My headset is from 2004 and all my friends hate me for it LOL.






I used to use that Asus monitor at 1440. I don't see enough benefit from G-Sync and 144Hz to pay more than double for a monitor (rotten tomatoes incoming for that comment). I run just 4K 60Hz now on a 43" monitor. 4K resolution will give your Titan XP a good workout, and certain games will at 1440 too. I can tell the huge difference between 1080p and 4K for sure.


----------



## Baasha

Quote:


> Originally Posted by *mbze430*
> 
> Just so you all know... new drivers lol
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://forums.geforce.com/default/topic/977502/geforce-drivers/official-375-95-game-ready-whql-display-driver-feedback-thread-released-11-18-16-/
> 
> 375.95


Please tell me that this is the same driver as the 'Hotfix' they _just_ released two days ago?









I just installed that driver.

Here we go again. Grrrr......


----------



## jhowell1030

Quote:


> Originally Posted by *Blaise Pascal*
> 
> Hi guys, been following this thread since I got my Titan XP at the start of September. I've really enjoyed the conversation. Great comments and advice!
> 
> As my new build quickly turned me into a poor-ass, I'm still using an old Dell 1080 60hz 24" lcd as my display and am vastly under-utilizing my TXP in every respect... Like a... commoner. Bleh.
> 
> My funds are back up though and I come seeking advice!
> 
> --Currently I am considering the Asus ROG swift PG278Q - 144hz, g-sync, 1ms, 1440 at about $650.
> 
> --I can't help but notice though that there are some wonderful monitors that don't have g-sync though that are cheaper:
> BenQ XL2730Z - 144hz, 1ms, 1440 at about $600
> 
> --and the particularly beautiful Acer XG270HU - 144hz, 1ms, 1440, edge to edge/borderless at about $475.
> 
> Reviews for all of those are well within my standards. I am not particularly interested in 4k, as I have tried and can't tell too much of a difference past 2k (I don't have the best eyes). I also believe that 27" is an ideal size for my setup.
> 
> I don't want to take up too much space here, but does anyone have a quick thought on this? Is G-sync THAT important if I properly manage things (overwatch, BF1, etc)?
> Money isn't too important, but a borderless display would be pretty sexy. And it's always nice to save a buck, as I do need a few other things. My headset is from 2004 and all my friends hate me for it LOL.


Personally, I think G-Sync is only worth it if/when you fall under the monitor's refresh rate or your preferred target frame rate.

That being said... I have a Predator X34 and love G-Sync. However, if I were gaming at 16:9 1440p, I don't know if I'd spend the additional money.


----------



## Nunzi

Quote:


> Originally Posted by *Baasha*
> 
> Please tell me that this is the same driver as the 'Hotfix' they _just_ released two days ago?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just installed that driver.
> 
> Here we go again. Grrrr......


I think it's just the WHQL version.


----------



## CptSpig

Quote:


> Originally Posted by *Nunzi*
> 
> I think its just the WHQL version


Yes, that is correct - it is the WHQL driver released Nov. 18, 2016.


----------



## Captain_cannonfodder




----------



## Glerox

A little update on my Titan XP overclocking. Might be useful to some of you!
With the EKWB-included thermal paste and an overclock of +235/+550 at a 120% power limit and no voltage increase (it doesn't change anything because of power-limit throttling), here were my results:

Load temp: 52C
Load clock: 2063-2076MHz
Firestrike Ultra: 7851

Yesterday I finally decided to do the shunt mod with CLU on the two resistors, according to this guide:
http://www.overclock.net/t/1608437/tutorial-power-target-limit-hardware-mod-shunt-mod-for-titan-x-and-many-other-nvidia-gpus
I also decided to swap the EKWB thermal paste for CLU.

At first my display wasn't showing anything and I thought I'd broken my card :S Then I removed the CLU from one of the two resistors and it worked, but power-limit throttling was still a problem. I then reapplied CLU to the second resistor, but in a minimal amount, and now it's working great! Power never goes beyond 100%, so with a 120% power limit there is no throttling. I had to lower my overclock to +230/+550 for stability. But now increasing the voltage to 100% actually increases the clock speed! Here are my results:

Load temp: 44C (-8) (I also switched my top rad to intake instead of exhaust, so it's not only the CLU effect)
Load clock: 2101MHz (+25 to +38)
Firestrike Ultra: 7951 (+100)
Load voltage: 1.081V

So these results are quite amazing and I'm really happy to finally have a stable 2.1GHz
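For quick reference, a small sketch tallying the before/after numbers from this post (the percentage column is my own arithmetic, nothing measured):

```python
# Before/after comparison of the shunt-mod results reported above
# (the "after" run also includes the CLU repaste and rad change).
before = {"load_temp_C": 52, "load_clock_MHz": 2063, "firestrike_ultra": 7851}
after  = {"load_temp_C": 44, "load_clock_MHz": 2101, "firestrike_ultra": 7951}

for key in before:
    delta = after[key] - before[key]
    pct = 100.0 * delta / before[key]
    print(f"{key}: {before[key]} -> {after[key]} ({delta:+d}, {pct:+.1f}%)")
```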


----------



## Sheyster

Quote:


> Originally Posted by *Jpmboy*
> 
> how do ya like(d) the Pixel, I've been thinking of upgrading my S4.


Once you go pure Android you never go back.


----------



## Jpmboy

Quote:


> Originally Posted by *Baasha*
> 
> Please tell me that this is the same driver as the 'Hotfix' they _just_ released two days ago?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just installed that driver.
> 
> Here we go again. Grrrr......


The 375 drivers have been failures during folding... whereas the 372 series hasn't failed any work units IME. So far every 375 driver has been a fail IMO.
Quote:


> Originally Posted by *Sheyster*
> 
> Once you go pure Android you never go back.


yeah - I've been on android since my 1st gen droid (with a few iphones I had to use for work related stuff) - lack of memory and battery upgrades on Apple's phones has been an issue for me (especially in a professional setting).


----------



## GunnzAkimbo

Anyone play Transport fever on a Large map 1:3 scale with all the eye candy enabled?

Wondering how the XP handles it, seems to be pretty resource intensive.

If you like transport games, check this one out; it's quite a huge map if you create a transport line from the top to the bottom of it.


----------



## Sheyster

Quote:


> Originally Posted by *Jpmboy*
> 
> yeah - I've been on android since my 1st gen droid (with a few iphones I had to use for work related stuff) - lack of memory and battery upgrades on Apple's phones has been an issue for me (especially in a professional setting).


The Google phones (Pixel, Nexus 6, etc.) are the best. I was using Samsung but switched over and never looked back. I've basically boycotted Samsung. I will never again buy one of their products. They really just don't care about their customers at the end of the day.


----------



## lanofsong

Hey Titan X Pascal owners,

We are having our monthly Foldathon from Monday the 21st to the 23rd, starting at 12 noon EST.
Would you consider putting all that power to a good cause for those two days? If so, come sign up and fold with us - see the attached link.

November Foldathon

To get started:

1. Get a passkey (allows for a speed bonus) - requires a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## stephen427

Hey people, I've got a question regarding the power limit: is it safe to reach this?

Ignore the core clock bouncing around; it's Ashes of the Singularity doing that. No other games make it do that. Also wondering if I got a good card: I can reach 2050MHz stable with +500 on memory. It only drops to 2038MHz after hours of gaming, when it goes over 50°C.


----------



## EniGma1987

Quote:


> Originally Posted by *GunnzAkimbo*
> 
> Anyone play Transport fever on a Large map 1:3 scale with all the eye candy enabled?
> Wondering how the XP handles it, seems to be pretty resource intensive.
> 
> If you like transport games, check this out, quite a huge map, if you create a transport line from top to bottom of the map.


Now that you are actually back could you fix the OP finally to be a decent owners thread?


----------



## Jpmboy

@Baasha - unleash those quad TXPs for a 2 day fold: http://www.overclock.net/t/1615724/november-2016-foldathon-monday-21st-23rd-12-noon-est-4pm-utc/80_20#post_25662281

(... the 375.xx drivers do not hold up; 373 or 372 work fine)


----------



## Lee0

Quote:


> Originally Posted by *EniGma1987*
> 
> Now that you are actually back could you fix the OP finally to be a decent owners thread?


Yeah this thread is such a clustertruck...


----------



## Fiercy

Hello everyone, I just wanted to come and warn you guys. I bought my Titan X on day one, and this Sunday I'm pretty sure it died, and I have no idea why.

I only remember updating to the latest drivers, and I'm not even sure I was running the overclock at the time, but it was usually +200 on the core with an EK waterblock.

The computer right now goes through the B2 and A2 POST codes when it boots up, but once it shows the CPU temp (meaning it booted), my display lights up (like it has a signal) but shows no image, nothing...

Maybe it's my luck, but I've actually never had a GPU die like this.


----------



## Menthol

My opinion is if your installing $25
Quote:


> Originally Posted by *Jpmboy*
> 
> @Baasha - unleash those quad TXPs for a 2 day fold: http://www.overclock.net/t/1615724/november-2016-foldathon-monday-21st-23rd-12-noon-est-4pm-utc/80_20#post_25662281
> 
> (... 375.xx drivers do not hld up, 373 or 372 work fine)


I had just removed my cards to get ready for the RB challenge, will there be another foldathon in Dec.?


----------



## EniGma1987

Quote:


> Originally Posted by *Fiercy*
> 
> Hello everyone, I just wanted to come and warn you guys. I bought my Titan X on day one, and this Sunday I'm pretty sure it died, and I have no idea why.
> 
> I only remember updating to the latest drivers, and I'm not even sure I was running the overclock at the time, but it was usually +200 on the core with an EK waterblock.
> 
> The computer right now goes through the B2 and A2 POST codes when it boots up, but once it shows the CPU temp (meaning it booted), my display lights up (like it has a signal) but shows no image, nothing...
> 
> Maybe it's my luck, but I've actually never had a GPU die like this.


Are A2 and B2 error codes on your MB even related to the GPU? Don't just spit out random info and warnings of death and horror without any actual information. What is your computer hardware? Did you even look up the error codes? You have an EK waterblock on the GPU; are you sure you didn't over-tighten the screws and warp the PCB and damage something? Are you sure you didn't get a drip when reconnecting the system?


----------



## Lee0

Quote:


> Originally Posted by *EniGma1987*
> 
> Are A2 and B2 error codes on your MB even related to the GPU? Don't just spit out random info and warnings of death and horror without any actual information. What is your computer hardware? Did you even look up the error codes? You have an EK waterblock on the GPU; are you sure you didn't over-tighten the screws and warp the PCB and damage something? Are you sure you didn't get a drip when reconnecting the system?


Exactly; just a warning about a ''death of a Titan'' doesn't help at all. The fact that you have it under a water-cooling block suggests it's not just random, either.


----------



## ratzofftoya

Hey all, I made a little GPU overclocking guide using my SLI Titan XPs. Just on air now, but migrating to water this weekend....


----------



## Jpmboy

Quote:


> Originally Posted by *Menthol*
> 
> My opinion is if your installing $25
> I had just removed my cards to get ready for the RB challenge, will there be another foldathon in Dec.?


I think there is one every month...









(yeah - I still have a couple of days till I need to pull these TXPs and stick a 1080 on this R5E-10. Not that I'm in contention from 5th place...)


----------



## DNMock

So I know this has been asked 100 times, but I really don't wanna sift through 3 months worth of posts to check.

Custom Bios Progress?


----------



## Jpmboy

Quote:


> Originally Posted by *DNMock*
> 
> So I know this has been asked 100 times, but I really don't wanna sift through 3 months worth of posts to check.
> 
> Custom Bios Progress?


zero


----------



## Taint3dBulge

Wonder if the TX will go on sale for Black Friday?


----------



## Lee0

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Wonder if the TX will go on sale for Black Friday?


Doubt it.


----------



## DNMock

Quote:


> Originally Posted by *Jpmboy*
> 
> zero


----------



## Jpmboy

Quote:


> Originally Posted by *DNMock*


----------



## Exnetic

Quote:


> Originally Posted by *DNMock*
> 
> So I know this has been asked 100 times, but I really don't wanna sift through 3 months worth of posts to check.
> 
> Custom Bios Progress?


Just do the shunt mod and add a waterblock, and you should get very close to 2.1GHz stable clocks.


----------



## Exnetic

Edit.


----------



## Fiercy

Quote:


> Originally Posted by *EniGma1987*
> 
> Are A2 and B2 error codes on your MB even related to the GPU? Don't just spit out random info and warnings of death and horror without any actual information. What is your computer hardware? Did you even look up the error codes? You have an EK waterblock on the GPU; are you sure you didn't over-tighten the screws and warp the PCB and damage something? Are you sure you didn't get a drip when reconnecting the system?


No, I am a complete idiot and I didn't think about any of this. Thank you for the info; that was a lot to go over.









Meanwhile, I got an answer from Nvidia: they are replacing the card, though they didn't say what happened.


----------



## jodasanchezz

Hi there,

I have modified my loop.

Specs:

EK XRES with D5 pump
Alphacool 280 Monsta + 2x Noctua NF-A14 FLX @ 900rpm
EK XE360 + 3x GT AP-15 @ 900rpm

i5 6600K @ 4.5GHz @ 1.32V (EK Supremacy EVO)
Titan X Pascal @ 2050MHz (+220 core, +300 mem) (EK waterblock)

Loop order:
Pump/res -> GPU -> CPU -> 360 rad -> 280 rad -> pump/res

At these settings the Titan runs at about 53°C max, the CPU ~60°C.

I'm wondering how people keep their GPUs at around 40°C?

A 360 plus a 280 thick rad should be more than enough... I thought.


----------



## CptSpig

Pm
Quote:


> Originally Posted by *jodasanchezz*
> 
> Hi there,
> 
> I have modified my loop.
> 
> Specs:
> 
> EK XRES with D5 pump
> Alphacool 280 Monsta + 2x Noctua NF-A14 FLX @ 900rpm
> EK XE360 + 3x GT AP-15 @ 900rpm
> 
> i5 6600K @ 4.5GHz @ 1.32V (EK Supremacy EVO)
> Titan X Pascal @ 2050MHz (+220 core, +300 mem) (EK waterblock)
> 
> Loop order:
> Pump/res -> GPU -> CPU -> 360 rad -> 280 rad -> pump/res
> 
> At these settings the Titan runs at about 53°C max, the CPU ~60°C.
> 
> I'm wondering how people keep their GPUs at around 40°C?
> 
> A 360 plus a 280 thick rad should be more than enough... I thought.


The higher the overclock and voltage, the higher the temperatures. If you remove your overclocks, your temperatures will drop significantly. Also try a different thermal paste; that can help two to three degrees C.


----------



## Jpmboy

Quote:


> Originally Posted by *jodasanchezz*
> 
> Hi there,
> I have modified my loop.
> Specs:
> EK XRES with D5 pump
> Alphacool 280 Monsta + 2x Noctua NF-A14 FLX @ 900rpm
> EK XE360 + 3x GT AP-15 @ 900rpm
> i5 6600K @ 4.5GHz @ 1.32V (EK Supremacy EVO)
> Titan X Pascal @ 2050MHz (+220 core, +300 mem) (EK waterblock)
> Loop order: Pump/res -> GPU -> CPU -> 360 rad -> 280 rad -> pump/res
> At these settings the Titan runs at about 53°C max, the CPU ~60°C.
> I'm wondering how people keep their GPUs at around 40°C?
> A 360 plus a 280 thick rad should be more than enough... I thought.


Do you have an in-line coolant temp sensor in the loop? If yes, what was the cold-side water temp at those CPU and GPU max temps? What was the ambient (room) temperature when you hit those max temps? And what TIM did you use on the GPU? Also, in general, a pump adds heat to a loop... and needs the coolant flow to keep itself cool... it's always best to have the rad(s) last in the loop, just before the first waterblock.
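The reason the water temp matters: on a healthy block, the GPU core sits a roughly constant thermal-resistance step above the coolant, so the core temperature alone can't separate a warm loop from a bad mount. A toy model of that (the thermal-resistance figures are made-up ballparks, not EK specs):

```python
def core_temp(water_c, power_w, r_jw_c_per_w=0.03):
    """GPU core temp ~= water temp + power * junction-to-water thermal
    resistance (die, TIM, block). The 0.03 C/W default is a guessed ballpark."""
    return water_c + power_w * r_jw_c_per_w

# The same ~53C core reading can hide two very different situations:
print(core_temp(45.0, 250))                      # warm loop, good mount
print(core_temp(30.0, 250, r_jw_c_per_w=0.09))   # cool loop, poor TIM contact
```

That's why knowing the cold-side water temperature tells you whether to chase radiators and airflow or to remount the block.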


----------



## Nikos4Life

Hello I have been playing around my OC profile for a little bit and testing stability.

How do you see this result?


http://www.3dmark.com/3dm11/11772265

I think I can hit 35K, maybe by tuning the CPU. What do you think?


----------



## CptSpig

Quote:


> Originally Posted by *Nikos4Life*
> 
> Hello I have been playing around my OC profile for a little bit and testing stability.
> 
> How do you see this result?
> 
> 
> http://www.3dmark.com/3dm11/11772265
> 
> I think I can hit 35K, maybe by tuning the CPU. What do you think?


Yes, I agree; you should be able to get that CPU to 4.6 GHz. That will get you to 35000-36000. I have a 3590k OC'd to 4.6 with an offset voltage of 225 mV. I am at 28000-plus in 3DMark 11 with a single card. I think with a little more I can get 29000.


----------



## Dagamus NM

Forgive me if this is posted in here, but what mods have people made to their HB SLI bridge to make it fit with EK waterblocks and a triple slot bridge?

I am thinking I may just remove the cover and the green light bar, then trim the pointless points off of the board so that it snugs up against the bridge.

Oh, I has four of these rascal Pascal titans. Better late than never to join this thread I suppose.


----------



## Dr Mad

Since I mounted the EK backplate, I've noticed my GPU/water temp delta increased by 3-4°.
Before that, the GPU core was 8° above the water; now it's ~12°.

Not sure if it's related to a bad TIM application (Kryonaut), but the backplate is the only change to my water-cooling loop.


----------



## Lee0

Quote:


> Originally Posted by *Dr Mad*
> 
> Since I mounted EK backplate, I noticed my GPU/water temp delta increased by 3-4°.
> Before that, gpu core was 8° more than water and now it's ~12°.
> 
> Not sure if it's related to a bad tim application (kryonaut) but the backplate is the only change to my watercooling loop.


Higher ambient temps?


----------



## outofmyheadyo

Quote:


> Originally Posted by *jodasanchezz*
> 
> Hi there,
> 
> I have modified my loop.
> 
> Specs:
> 
> EK XRES with D5 pump
> Alphacool 280 Monsta + 2x Noctua NF-A14 FLX @ 900rpm
> EK XE360 + 3x GT AP-15 @ 900rpm
> 
> i5 6600K @ 4.5GHz @ 1.32V (EK Supremacy EVO)
> Titan X Pascal @ 2050MHz (+220 core, +300 mem) (EK waterblock)
> 
> Loop order:
> Pump/res -> GPU -> CPU -> 360 rad -> 280 rad -> pump/res
> 
> At these settings the Titan runs at about 53°C max, the CPU ~60°C.
> 
> I'm wondering how people keep their GPUs at around 40°C?
> 
> A 360 plus a 280 thick rad should be more than enough... I thought.


Just because you have thick rads doesn't mean they perform when you don't run the fans at any decent speed; Nemesis GTS rads (28mm thick) would most surely outperform your thick rads with the low-RPM fans you are using.
Monsta and other needlessly thick rads are about the worst rads you can buy for low-RPM fans. If you want to use them, you've got to pump up your fan speeds or use push/pull.
That's why I never see myself using those thicker rads again, since even 1000rpm fans are way too loud.


----------



## Glerox

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Just because you have thick rads doesn't mean they perform when you don't run the fans at any decent speed; Nemesis GTS rads (28mm thick) would most surely outperform your thick rads with the low-RPM fans you are using.
> Monsta and other needlessly thick rads are about the worst rads you can buy for low-RPM fans. If you want to use them, you've got to pump up your fan speeds or use push/pull.
> That's why I never see myself using those thicker rads again, since even 1000rpm fans are way too loud.


Exactly. I had 1000 RPM fans on my setup before and my temps were around 60... Now I have 2400 RPM fans and my load temp is 44-48.
It's noisier, but I have the NZXT Grid fan controller and I've set it so the fans only ramp up when my GPU heats up.


----------



## jodasanchezz

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Just because you have thick rads doesn't mean they perform when you don't run the fans at any decent speed; Nemesis GTS rads (28mm thick) would most surely outperform your thick rads with the low-RPM fans you are using.
> Monsta and other needlessly thick rads are about the worst rads you can buy for low-RPM fans. If you want to use them, you've got to pump up your fan speeds or use push/pull.
> That's why I never see myself using those thicker rads again, since even 1000rpm fans are way too loud.


Quote:


> Originally Posted by *Jpmboy*
> 
> Do you have an in-line coolant temp sensor in the loop? If yes, what was the cold-side water temp at those CPU and GPU max temps? What was the ambient (room) temperature when you hit those max temps? And what TIM did you use on the GPU? Also, in general, a pump adds heat to a loop... and needs the coolant flow to keep itself cool... it's always best to have the rad(s) last in the loop, just before the first waterblock.




Thanks for all your responses.

Well, I gave the loop some time to bleed more.
I was trying different types of fan curves on the fans and pump.

While playing BF4 (100% CPU usage, 98% all the time), GPU @ 2050~2068MHz, after 30 min the temp of the CPU settles at 48-49°C.
The system is really quiet; the loudest parts are the GPU coils.

For thermal paste I use Thermal Grizzly Kryonaut on both the GPU and CPU.

I know the pump adds heat to the loop, but routing

pump -> rad -> GPU -> CPU -> rad ... would be a mess of tubing.


----------



## Blaise Pascal

An update to an earlier information request that I made for an appropriate monitor for my Titan XP:
After getting the lay of the land and surveying prices this last week, I bought an Asus PG279Q (27", 1440p, 165Hz variable, G-Sync, IPS panel) for $739 at Microcenter. I was cautious, as these models have commonly been plagued by factory defects with pixels and crazy IPS glow.

I am EXTREMELY happy with this purchase. Coming from a 60hz 1080p, I am stunned by the color, responsiveness, and overall quality of the product. The controls are extremely intuitive (the joystick selector is so easy to use. No fumbling around at all), and the mount is both ergonomic and stable.

I had NO dead pixels, and only a small amount of IPS glow focused mainly in the corners. The right side exhibited a noticeably greater amount of glow than the left on a black screen, but in real-life use this did not bother me. I even tested this in the darkest areas of Dark Souls 3 and was not distracted.

In conclusion, the Titan fits wonderfully with this monitor. If you choose to buy this or any other high end IPS display, I suggest buying it in-store (not online) if possible in case you need to do a quick return and trade-out on yours if necessary. Microcenter said to test it asap and that they'd be happy to exchange if it was sub-quality.


----------



## CptSpig

Quote:


> Originally Posted by *Blaise Pascal*
> 
> An update to an earlier information request that I made for an appropriate monitor for my Titan XP:
> After getting the lay of the land and surveying prices this last week, I bought an Asus PG279Q (27", 1440p, 165hz variable, g-sync, IPS panel) for 739$ at microcenter. I was cautious, as these models have commonly been plagued by factory defects with pixels and crazy IPS glow.
> 
> I am EXTREMELY happy with this purchase. Coming from a 60hz 1080p, I am stunned by the color, responsiveness, and overall quality of the product. The controls are extremely intuitive (the joystick selector is so easy to use. No fumbling around at all), and the mount is both ergonomic and stable.
> 
> I had NO dead pixels, and only a small amount of IPS glow focused mainly in the corners. The right side exhibited a noticeably greater amount of glow than the left on a black screen, but in real-life use this did not bother me. I even tested this in the darkest areas of Dark Souls 3 and was not distracted.
> 
> In conclusion, the Titan fits wonderfully with this monitor. If you choose to buy this or any other high end IPS display, I suggest buying it in-store (not online) if possible in case you need to do a quick return and trade-out on yours if necessary. Microcenter said to test it asap and that they'd be happy to exchange if it was sub-quality.


Great choice! I also have this monitor and love the G-Sync and 165Hz refresh rate. 1440p is the sweet spot right now, until they improve 4K monitors with faster refresh rates. The Titan X Pascal works very well with this monitor, handling anything I have thrown at it with ease. I also have no dead pixels and almost no IPS glow. Enjoy!


----------



## Dagamus NM

I need to run two different sets of monitors, three Acer h257hu 1440P monitors and three Asus pb287q 4k 60p monitors.

Which hardware would you choose to run the 4k surround? Two Titan XPs or four 980Tis?

I currently have the four 980Tis setup with these monitors but my PC is too big to fit under my desk and the cables don't reach from the PC to the monitors unless I have them in portrait mode. I am going to make the move from a caselabs sm8 to an s8 to fit under the desk.

One setup will get the Asus RVE/5960X combo and the other the RVE10 and 6950X if it matters. I don't think it really will.

In the past I would have just run four cards on the 4K monitor setup and called it good but Nvidia took that away.

So I just really need to decide if I want to use the pair of Titan XPs for this or if that would be underpowered compared to the four 980Tis.

Thank you.


----------



## Jpmboy

Quote:


> Originally Posted by *Dagamus NM*
> 
> I need to run two different sets of monitors, three Acer h257hu 1440P monitors and three Asus pb287q 4k 60p monitors.
> 
> Which hardware would you choose to run the 4k surround? Two Titan XPs or four 980Tis?
> 
> I currently have the four 980Tis setup with these monitors but my PC is too big to fit under my desk and the cables don't reach from the PC to the monitors unless I have them in portrait mode. I am going to make the move from a caselabs sm8 to an s8 to fit under the desk.
> 
> One setup will get the Asus RVE/5960X combo and the other the RVE10 and 6950X if it matters. I don't think it really will.
> 
> In the past I would have just run four cards on the 4K monitor setup and called it good but Nvidia took that away.
> 
> So I just really need to decide if I want to use the pair of Titan XPs for this or if that would be underpowered compared to the four 980Tis.
> 
> Thank you.


2 Titan XPs >> quad-SLI with 980 Tis. AFAIK, there really isn't the power available to run surround 4K at a constant >60FPS unless you run 4 TXPs.
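The raw pixel arithmetic supports this: surround 4K at 60 Hz asks the GPUs to fill three times the pixels of a single 4K panel. A quick comparison (pure pixel throughput; real game cost doesn't scale perfectly linearly with resolution):

```python
def mpix_per_s(width, height, hz, panels=1):
    # Millions of pixels per second the GPU(s) must deliver
    return width * height * hz * panels / 1e6

single_4k    = mpix_per_s(3840, 2160, 60)       # one 4K panel @ 60 Hz
surround_4k  = mpix_per_s(3840, 2160, 60, 3)    # three 4K panels
surround_qhd = mpix_per_s(2560, 1440, 60, 3)    # three 1440p panels
print(surround_4k / single_4k)    # 3x the single-panel target
print(surround_4k / surround_qhd) # 2.25x the 1440p-surround target
```

So a setup that barely holds 60FPS on one 4K panel has no headroom left for the other two.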


----------



## Dagamus NM

Quote:


> Originally Posted by *Jpmboy*
> 
> 2 Titan XPs >> quad-SLI with 980 Tis. AFAIK, there really isn't the power available to run surround 4K at a constant >60FPS unless you run 4 TXPs.


Making four TXPs work with everything seems to be a pain in the arse. I have four but decided to run them in two pairs instead.

If a pair of TXPs are better than four 980Tis then I will let the TXPs do the 4k and the 980Tis do the surround 1440P. At least I won't have to gut my sm8 this way. I will just swap the monitors and build out the S8 fresh.

I used the other pair of TXPs to update my quad 780Ti setup. Decent performance upgrade so far. The pair of TXPs run the single 4k panel there just fine.


----------



## EniGma1987

Quote:


> Originally Posted by *Dagamus NM*
> 
> I need to run two different sets of monitors, three Acer h257hu 1440P monitors and three Asus pb287q 4k 60p monitors.
> 
> Which hardware would you choose to run the 4k surround? Two Titan XPs or four 980Tis?
> 
> I currently have the four 980Tis setup with these monitors but my PC is too big to fit under my desk and the cables don't reach from the PC to the monitors unless I have them in portrait mode. I am going to make the move from a caselabs sm8 to an s8 to fit under the desk.
> 
> One setup will get the Asus RVE/5960X combo and the other the RVE10 and 6950X if it matters. I don't think it really will.
> 
> In the past I would have just run four cards on the 4K monitor setup and called it good but Nvidia took that away.
> 
> So I just really need to decide if I want to use the pair of Titan XPs for this or if that would be underpowered compared to the four 980Tis.
> 
> Thank you.


The GPUs just really aren't there yet for surround 4K. I would suggest living with the current setup for another year and then getting two of the top-end cards from the next gen. Those should be able to just about push surround 4K in 2-way SLI. Right now the current Titan X can barely push the 60fps mark on a single monitor; running two will struggle a bit with two 4K monitors, so adding the 3rd in there will just be too much. Either do 4-way Titan Xs now, which costs a ton, is hard to implement, time-consuming, and a bit buggy, or just wait and get two of the next-gen cards.


----------



## Dagamus NM

Quote:


> Originally Posted by *EniGma1987*
> 
> The GPUs just really aren't there yet for surround 4K. I would suggest living with the current setup for another year and then getting two of the top-end cards from the next gen. Those should be able to just about push surround 4K in 2-way SLI. Right now the current Titan X can barely push the 60fps mark on a single monitor; running two will struggle a bit with two 4K monitors, so adding the 3rd in there will just be too much. Either do 4-way Titan Xs now, which costs a ton, is hard to implement, time-consuming, and a bit buggy, or just wait and get two of the next-gen cards.


Yeah. Such first world problems.


----------



## Baasha

Quote:


> Originally Posted by *Dagamus NM*
> 
> I need to run two different sets of monitors, three Acer h257hu 1440P monitors and three Asus pb287q 4k 60p monitors.
> 
> Which hardware would you choose to run the 4k surround? Two Titan XPs or four 980Tis?


Neither. 4K Surround requires a lot more GPU power than 2x Titan XPs can provide. I ran 4K Surround (3x 4K monitors) back in 2014 with 4x Titan Black SCs, and they were fun. I've tried 4K Surround with my current 4x Titan XP setup and it works great; I get >70FPS in most games, sometimes even 100FPS (BF1 etc.).

Here's BF1 in 4K Surround with 4x Titan XPs:


----------



## Jpmboy

Quote:


> Originally Posted by *Dagamus NM*
> 
> Making four TXPs work with everything seems to be a pain in the arse. I have four but decided to run them in two pairs instead.
> 
> If a pair of TXPs are better than four 980Tis then I will let the TXPs do the 4k and the 980Tis do the surround 1440P. At least I won't have to gut my sm8 this way. I will just swap the monitors and build out the S8 fresh.
> 
> I used the other pair of TXPs to update my quad 780Ti setup. Decent performance upgrade so far. The pair of TXPs run the single 4k panel there just fine.


That's been my experience - 2 TXPs handle one 4K panel with HP to spare. Once higher-frequency 4K shows up, things on the bleeding edge will change again.








Quote:


> Originally Posted by *Baasha*
> 
> Neither. 4K Surround requires a lot more GPU power than 2x Titan XPs. I've run 4K Surround (3x 4K monitors) in 2014 w/ 4x Titan Black SC and they were fun. I've tried 4K Surround w/ my current 4x Titan XP setup and it works great - I get > 70FPS in most games, sometimes even 100FPS (BF1 etc.).
> 
> Here's BF1 in 4K Surround with 4x Titan XPs:
> 
> 
> Spoiler: Warning: Spoiler!


Ninja'd... was gonna suggest to ping Baasha.


----------



## Dagamus NM

Yeah, Baasha's builds have been inspirational. I am not against making custom SLI profiles. I suppose once I have it figured out it should be pretty simple.

What SLI bridge are you using Baasha? I got the MSI quad setup as it has the metal fingers like the HB SLI does. I read on guru3d of a couple fellows that have done just this and they used the MSI one or some other but the MSI one was cheaper after rebate. I will have to get a couple more TXPs then.

I already swapped four 780Tis out in favor of a pair of TXPs with waterblocks and backplates. I used an EK triple slot parallel and blocked the middle slot. I had to remove the HB SLI bridge cover and light and trim off the points to get it to fit.

I am less concerned with games as I am using Monte Carlo programs that use CUDA, that said I do enjoy a game or two from time to time.

Does the same process for a custom SLI profile for a game work for software? Like rendering in Premiere Pro.


----------



## Djootn

Hi Guys,

First post here
I'm having trouble with my SLI setup.
Both of my GPUs have an EK block + backplate.
The thing is, at full load for 30 min one of my GPUs stays around 37°C while my second slowly heats up and caps out at around 80°C.
I'm guessing I need to open my loop and reseat my second GPU (new TIM and cooling pads); it's my first water loop, so I'm kind of a novice.
Has anyone had the same experience?
Sorry for grammatical errors and such; English is not my native tongue.









greetings
Djootn


----------



## Jpmboy

Quote:


> Originally Posted by *Dagamus NM*
> 
> Yeah, Baasha's builds have been inspirational. I am not against making custom SLI profiles. I suppose once I have it figured out it should be pretty simple.
> 
> What SLI bridge are you using Baasha? I got the MSI quad setup as it has the metal fingers like the HB SLI does. I read on guru3d of a couple fellows that have done just this and they used the MSI one or some other but the MSI one was cheaper after rebate. I will have to get a couple more TXPs then.
> 
> I already swapped four 780Tis out in favor of a pair of TXPs with waterblocks and backplates. I *used an EK triple slot parallel and blocked the middle slot. I had to remove the HB SLI bridge cover and light and trim off the points to get it to fit.
> *
> I am less concerned with games as I am using Monte Carlo programs that use CUDA, that said I do enjoy a game or two from time to time.
> 
> Does the same process for a custom SLI profile for a game work for software? Like rendering in Premiere Pro.


did the same:


----------



## Glerox

Do you force constant voltage in MSI AB?

I tried it, and even though the voltage is not constant, I got 60 more points in Firestrike Ultra.
With that I just reached 8k for the first time in Firestrike Ultra, so I'm quite happy!
This GPU is so crazy powerful.

But I don't know if it was just "a good day at work" or if the constant voltage actually changed something.


----------



## jhowell1030

Quote:


> Originally Posted by *Glerox*
> 
> Do you force constant voltage in MSI AB?
> 
> I tried and even if the voltage is not constant, I got 60 more points in Firestrike Ultra.
> With that I just reached 8k fort the first time in Firestrike ultra so I'm quite happy!
> This GPU is so crazy powerful.
> 
> But I don't know if it's just "a good day at work" or if the constant voltage actually changed something.


Personally I think 60 points is small enough to be within margin of error.


----------



## ulnevrgtit

Got one... dope card, but it still can't hold Horizon 3 at 60fps @ 3840x1600 :(


----------



## Jpmboy

Quote:


> Originally Posted by *Glerox*
> 
> Do you force constant voltage in MSI AB?
> 
> I tried and even if the voltage is not constant, I got 60 more points in Firestrike Ultra.
> With that I just reached 8k fort the first time in Firestrike ultra so I'm quite happy!
> This GPU is so crazy powerful.
> 
> But I don't know if it's just "a good day at work" or if the constant voltage actually changed something.


Voltage application is tied to frequency and load in this architecture; you are better off forcing K-Boost in AB... Ctrl+F, shift-and-slide the graph line up, Ctrl+L, then Apply. Now you can adjust the core clock with the slider and have a locked frequency... and have the cooling to avoid temperature clock-bin drops.


----------



## Glerox

Quote:


> Originally Posted by *Jpmboy*
> 
> Voltage application is tied to frequency and load in this architecture; you are better off forcing K-Boost in AB... Ctrl+F, shift-and-slide the graph line up, Ctrl+L, then Apply. Now you can adjust the core clock with the slider and have a locked frequency... and have the cooling to avoid temperature clock-bin drops.


Wow, thanks, it's the first time I've heard about this! I'll definitely try it! Did you get better performance with this?


----------



## Dagamus NM

Quote:


> Originally Posted by *Jpmboy*
> 
> did the same:


Do you have pics of the rest of your bench?


----------



## Jpmboy

Quote:


> Originally Posted by *Dagamus NM*
> 
> Do you have pics of the rest of your bench?


sure:


Spoiler: Warning: Spoiler!


----------



## Petnax

I recently upgraded my 4x Titans to 2x Titan XPs and noticed a significant performance boost compared to 4-way SLI. Both are connected to a water chiller. Core clock 2025 MHz. I achieved 110-140FPS in Battlefield 4 @ surround 7680x1440. Here are my 3DMark results: http://www.3dmark.com/fs/10920096

Does anyone know of an existing vBIOS with a voltage unlock for the Titan XP? I'm planning to push them higher on the clock.


----------



## Fiercy

I have a question: if you are installing the same waterblock on another Titan X, can you reuse the old thermal pads or do I have to order new ones?


----------



## Petnax

Quote:


> Originally Posted by *Fiercy*
> 
> I have a question: if you are installing the same water block on another Titan X, can you use the old thermal pads, or do you have to order new ones?


I would replace them to avoid inconsistent pressure from water block on vram.


----------



## Jquala

Guys, I seek the wisdom of this forum; I am stumped. First off, both of my Titan X Pascals are shunt modded (2 of 3 shunts, with Grizzly's LM) and under water, never getting hotter than 39C. My top card boosts to 1898 stock and runs its clocks at low voltages; an ASIC winner, I thought. I set the cards to run unsynced due to the voltage discrepancy between them. The top card is at 2114 MHz, and for testing purposes I set it at 1.075 V to match my second card at 2114 MHz @ 1.093 V. Both cards are stable independently, and my second card, although it requires more voltage, can clock higher. My problem, and the source of hours of tinkering in AB, is that the top card keeps hitting the power limit wall at 1.075 V (131% TDP in GPU-Z and 320 W in HWiNFO), while my second card, using more voltage at 1.093 V, sits well under 290 W and around 100% TDP under load. Did I mess up the shunt mod on the top card? If I did, wouldn't I be unable to surpass 300 W at all? Why is the card using less voltage hitting the power limit, while the bottom card cruises along just fine at 100% voltage, 1.093 V, pulling 290 W? I thought it would follow a linear slope: more voltage = more watts. Am I missing something?

@Jpmboy


----------



## skypine27

Quote:


> Originally Posted by *Petnax*
> 
> Recently upgraded my 4x Titans to 2x Titan XPs and noticed a significant performance boost compared to 4-way SLI. Both connected to a water chiller. Core clock 2025 MHz. Achieved 110-140 FPS in Battlefield 4 @ Surround 7680x1440. Here are the 3DMark results: http://www.3dmark.com/fs/10920096
> 
> Does anyone know of an existing V-BIOS with a voltage unlock for the Titan XP? I'm planning to push them higher on the clock.


Petnax:

Nope. No modded BIOS yet. No one has cracked it. Chances look slim to none.


----------



## meson1

Quote:


> Originally Posted by *skypine27*
> 
> Petnax:
> 
> Nope. No modded BIOS yet. No one has cracked it. Chances look slim to none.


Not that I know much about these things, but a question. If and when the 1080 Ti comes out, I'm guessing that it will ship with a less locked down BIOS, so the AIB partners can apply their own tweaks. So given that it's a GP102 chip and more or less the same basic PCB architecture, what are the chances that a 1080 Ti BIOS could be reverse-adapted and modded for the TXP?


----------



## EniGma1987

Quote:


> Originally Posted by *Jquala*
> 
> Guys, I seek the wisdom of this forum; I am stumped. First off, both of my Titan X Pascals are shunt modded (2 of 3 shunts, with Grizzly's LM) and under water, never getting hotter than 39C. My top card boosts to 1898 stock and runs its clocks at low voltages; an ASIC winner, I thought. I set the cards to run unsynced due to the voltage discrepancy between them. The top card is at 2114 MHz, and for testing purposes I set it at 1.075 V to match my second card at 2114 MHz @ 1.093 V. Both cards are stable independently, and my second card, although it requires more voltage, can clock higher. My problem, and the source of hours of tinkering in AB, is that the top card keeps hitting the power limit wall at 1.075 V (131% TDP in GPU-Z and 320 W in HWiNFO), while my second card, using more voltage at 1.093 V, sits well under 290 W and around 100% TDP under load. Did I mess up the shunt mod on the top card? If I did, wouldn't I be unable to surpass 300 W at all? Why is the card using less voltage hitting the power limit, while the bottom card cruises along just fine at 100% voltage, 1.093 V, pulling 290 W? I thought it would follow a linear slope: more voltage = more watts. Am I missing something?
> 
> @Jpmboy


My guess would be the shunt mod is not quite right. You should never hit 120% TDP with the shunt mod in place.
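For anyone puzzling over readings like these, the arithmetic behind the mod is worth sketching. This is a rough model only: the shunt value and the bridge resistance below are assumptions for illustration, not measurements of this board.

```python
# Rough model of why a shunt mod lowers *reported* power (assumed
# values, not measured). The controller senses the voltage drop across
# a known shunt resistor and infers current from it. Bridging the
# shunt with liquid metal adds a parallel path, so the same real
# current produces a smaller drop and a smaller reported power.

R_SHUNT = 0.005   # ohms, assumed stock shunt value
R_BRIDGE = 0.005  # ohms, hypothetical liquid-metal bridge resistance

def reported_power(true_power_w, r_shunt, r_bridge):
    """Power the card *thinks* it is drawing after the mod."""
    r_eff = (r_shunt * r_bridge) / (r_shunt + r_bridge)
    # The sensed drop scales with r_eff, but the firmware still
    # divides by the stock r_shunt, so it under-reads.
    return true_power_w * (r_eff / r_shunt)

# With an equal-resistance bridge, a true 320 W draw reads as ~160 W,
# so a modded card still showing 131% TDP is drawing far more:
print(round(reported_power(320, R_SHUNT, R_BRIDGE), 1))  # 160.0
```

The practical upshot: if one modded card still reads high, its bridge may simply have far higher resistance than intended, so it under-reads much less and hits the limiter first.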


----------



## Glerox

Quote:


> Originally Posted by *Jpmboy*
> 
> Voltage application is tied to frequency and load on this architecture. You are better off forcing K-boost in AB: Ctrl-F, shift-drag the graph line up, then Ctrl-L to apply. Now you can adjust the core clock with the slider and have a locked frequency... and have the cooling to avoid temp clock-bin drops.


If my max frequency is 2101 MHz and I want a stable clock, down to how low a voltage do you fix this max frequency with the slider?
For example, at the 1.050 V point, do I select as high as 2101 MHz? Or will it crash?
(knowing that the maximum voltage for Pascal is 1.093 V)


----------



## Jpmboy

Quote:


> Originally Posted by *Glerox*
> 
> If my max frequency is 2101 MHz and I want a stable clock, down to how low a voltage do you fix this max frequency with the slider?
> For example, at the 1.050 V point, do I select as high as 2101 MHz? Or will it crash?
> (knowing that the maximum voltage for Pascal is 1.093 V)


The stock card will run 1.05-1.063 V at load... I use the 1.05 V point. Whether "it crash(es)" or not is up to the card - right?








remember - what Ctrl-F, Ctrl-L does is lock the card in P0, so it will hold that boost clock even at idle.


----------



## Glerox

Quote:


> Originally Posted by *Jpmboy*
> 
> the stock card will run 1.05-1.063V at load.. I use the 1.05V point. whether "it crash(es)" or not is up to the card - right?
> 
> 
> 
> 
> 
> 
> 
> 
> remember - what Ctrl-F, Ctrl-L does is lock the card in P0, so it will hold that boost clock even at idle.


Thanks, I tried the 1.093 V point with Ctrl-L to keep the card at full load while running benchmarks, but it still throttles down, even at 40-45 degrees and with the shunt mod.
I guess there is no way of keeping the card at 1.093 V / max frequency 100% of the time until we have a BIOS mod.


----------



## Jpmboy

Quote:


> Originally Posted by *Glerox*
> 
> Thanks, I tried the 1.093v point with ctrl-L to maintain the card at full load while running benchmark but it still throttles down, even with 40-45 degrees and shunt mod.
> I guess there is no way of keeping the card at 1.093v/max frequency 100% of the time until we have a bios mod.


The shunt mod does not stop the temp-induced clock-bin drops, and 40-45C is well above the first T-bin. A BIOS mod may not be able to address the hardware safety feature... just keep the core temp below 30C and the downclocking is minimized. Below 20C it holds the set value.
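The clock-bin behavior described here can be sketched numerically. The ~13 MHz step is what Pascal owners commonly observe; the threshold temperatures below are assumptions for illustration, not official values:

```python
# Toy model of Pascal's temperature-based clock bins. The ~13 MHz
# step matches common owner reports; the threshold temperatures are
# assumed for illustration only.

BIN_MHZ = 13
THRESHOLDS_C = [37, 46, 54, 63]  # assumed bin boundaries

def estimated_clock(set_clock_mhz, core_temp_c):
    """Boost clock after dropping one bin per threshold crossed."""
    bins_dropped = sum(core_temp_c >= t for t in THRESHOLDS_C)
    return set_clock_mhz - bins_dropped * BIN_MHZ

print(estimated_clock(2114, 25))  # 2114 - below the first bin
print(estimated_clock(2114, 42))  # 2101 - one bin down
```

In this toy model a card set to 2114 MHz holds that clock below 37C but sits a bin or two lower at typical water-loop temps, which matches why chilled loops hold their set clocks.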


----------



## Glerox

Quote:


> Originally Posted by *Jpmboy*
> 
> shunt mod does not stop the temp-induced clock bin drops. 40-45C is well above the first T-bin. A bios mod may not be able to address the hardware safety feature... just keep the core temp below 30C and the downclocking minimizes. Below 20C and it holds the set value.


Ok thanks! Keeping the GPU colder is out of reach for me after spending so much on my rig... so I'll have to accept only 2088 MHz.









This is such a first world problem haha, like all of this thread


----------



## Menthol

Quote:


> Originally Posted by *Glerox*
> 
> Ok thanks! Keeping the gpu colder is out of reach for me after spending so much for my rig... so I'll have to accept only 2088mhz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is such a first world problem haha, like all of this thread


Several people here have reported components falling off their cards due to the liquid metal dissolving solder, be careful


----------



## Dr Mad

Quote:


> Originally Posted by *Menthol*
> 
> Several people here have reported components falling off their cards due to the liquid metal dissolving solder, be careful


You mean 1080/1070 reference design owners?

Because I haven't read anything about that, and I'm interested since I applied a thin layer of Conductonaut on 2 of 3 resistors.
It seems my card is power hungry: without the mod, the max I can get without power throttling is 1950 (stock voltage), and now the card handles 2088 stable 24/7 (temp <34C / ambient temp 22C).


----------



## EniGma1987

Quote:


> Originally Posted by *Menthol*
> 
> Several people here have reported components falling off their cards due to the liquid metal dissolving solder, be careful


Several people = 1, and that person completely covered all around the resistor with liquid metal, which is not the correct way at all. You are not supposed to put LM *around* the resistor, only on top. In his picture here you can even see some liquid metal TIM stuck on the underside of the resistor...



I don't know how I could be any more clear on how to do it correctly:
Quote:


> 4) Now use the brush included in the CLU package to spread the CLU over the whole *top* of the resistors. Be careful not to spill it off the resistor. If the CLU spills off the top of the resistor it will most likely destroy your graphics card. Be careful not to use too much CLU as using too much will cause a spill as well when you put the graphics card back together. You do not want to smother this and have CLU completely enclosing and surrounding the resistor, you are just covering the top of it. There is an inductor very close to resistor #2 (RS2) that can easily be spilled onto. Be very careful with this step.


----------



## Jpmboy

Quote:


> Originally Posted by *EniGma1987*
> 
> Several people = 1 and the person completely covered all around the resistor with liquid metal, which is not the correct way at all. You are not supposed to put LM *around* the resistor, only on the top. In his picture here you can even see some liquid metal TIM stuck on the underside of the resistor.......
> 
> 
> 
> I dont know how I could be any more clear on how to do it correctly.:


good thing it fell off before the wandering LM shorted the whole card out. We should establish a Darwin Award for this kind of stuff.


----------



## ttg35fort

Quote:


> Originally Posted by *Jpmboy*
> 
> shunt mod does not stop the temp-induced clock bin drops. 40-45C is well above the first T-bin. A bios mod may not be able to address the hardware safety feature... just keep the core temp below 30C and the downclocking minimizes. Below 20C and it holds the set value.


On my particular card, I maintain the highest clock frequencies at stock voltage, ranging between 2.012 GHz and 2.08 GHz. In graphics test #1 of Fire Strike Extreme it mostly settles around 2.035 GHz; in graphics test #2, around 2.05 GHz.

Even though I am water cooled, if I increase the voltage by even 10-20% in AB, I get more throttling, the average frequencies drop a bit, and my graphics score drops. I am assuming this is due to increased temps.

So I wonder if the shunt mod buys you anything unless you have enough cooling to keep the GPU below 37C, or maybe even 30C. I have a 420 mm radiator and a 280 mm radiator, and in my system this still is not enough, though I am not running my fans at full speed.


----------



## Jpmboy

Quote:


> Originally Posted by *ttg35fort*
> 
> On my particular card, I maintain the highest clock frequencies at stock voltage, ranging from 2.012 GHz and 2.08 GHz. In graphics test #1 in Fire Strike Extreme, it mostly settles around 2.035 GHz. In graphics test #2, it mostly settles around 2.05 GHz.
> 
> Even though I am water cooled, if I increase the voltage even by 10-20% in AB, I get more throttling, the average frequencies drop a bit, and my graphics score drops. I am assuming that this is due to increased temp.
> 
> So, I wonder if the shunt mod buys you anything unless you have enough cooling to keep the GPU below the 37C, or maybe even 30C. I have a 420mm radiator and a 280mm radiator, and in my system this still is not enough, though I am not running my fans at full speed.


Yeah, the power control system on these cards is overkill. Not only do load (current draw), TDP and temp factor in, the voltage (at which the current is delivered) is weighted in as well. For the most part, voltage (especially in AB or PX) has little effect, or a mostly negative one. Temperature has the greatest effect on unmodified cards, not just for holding a given frequency but for outright stability. I can run much higher clocks and complete benchmarks with the water chiller on (cards @ 9C) than I can with the cards at 30C. Subzero guys get even better scaling.
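As a back-of-the-envelope check on why extra voltage costs so much headroom: dynamic power in CMOS scales roughly with V² times frequency. The scaling law is standard; the example voltages and clocks below are just illustrative numbers from this thread's range:

```python
# Rough CMOS dynamic-power scaling: P is roughly proportional to
# V^2 * f, so a small voltage bump costs disproportionate power.

def relative_power(v1, f1, v2, f2):
    """Power at (v2, f2) relative to (v1, f1), ignoring static leakage."""
    return (v2 / v1) ** 2 * (f2 / f1)

# Example: going from 1.050 V @ 2063 MHz to 1.093 V @ 2101 MHz buys
# about 2% clock for roughly 10% more power, which the power limiter
# then tends to claw back:
print(round(relative_power(1.050, 2063, 1.093, 2101), 3))  # 1.104
```

That asymmetry is why bumping the voltage slider often nets a *lower* average clock on a power-limited card, while dropping the temperature costs nothing at the limiter.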


----------



## ttg35fort

Quote:


> Originally Posted by *Jpmboy*
> 
> Yeah, the power control system on these cards is overkill. Not only do load (current draw), TDP and temp factor in, the voltage (at which the current is delivered) is weighted in as well. For the most part, voltage (especially in AB or PX) has little effect, or a mostly negative one. Temperature has the greatest effect on unmodified cards, not just for holding a given frequency but for outright stability. I can run much higher clocks and complete benchmarks with the water chiller on (cards @ 9C) than I can with the cards at 30C. Subzero guys get even better scaling.


Water chiller - Nice!


----------



## Jpmboy

Quote:


> Originally Posted by *ttg35fort*
> 
> Water chiller - Nice!


----------



## ttg35fort

Cool setup! Overclocker's paradise.


----------



## KillerBee33

Ouch!!! Still talking about shunt Mod








Was hoping BIOS Tools would be around by now...


----------



## Jpmboy

Quote:


> Originally Posted by *ttg35fort*
> 
> Cool setup! Overclocker's paradise.


Thanks - it's actually my home office. The wife DOES NOT appreciate the computer lab look.


----------



## ttg35fort

Here's mine


----------



## DsLiadz

So, la crème de la crème is in here? Is there any real incentive to buy a Titan X Pascal besides it being the most powerful GPU in the world, and the bragging rights? Also, I salute you, kind gentlemen.


----------



## ttg35fort

Troll


----------



## DsLiadz

Quote:


> Originally Posted by *ttg35fort*
> 
> Troll


Troll? Nah man, I don't "troll"; it was a serious question. I mean, there has to be something else besides bragging and having the world's most powerful GPU to date. A lot of people might buy this just for the heck of it, like a wall painting or something.


----------



## Jpmboy

Quote:


> Originally Posted by *ttg35fort*
> 
> Here's mine


I think you need another monitor...








Nice!
Quote:


> Originally Posted by *DsLiadz*
> 
> So, la crème de la crème is in here? Is there any real incentive to buy a Titan X Pascal *besides it being the most powerful GPU in the world* and the bragging rights? Also, I salute you, kind gentlemen.


you answered your own question.


----------



## DsLiadz

Quote:


> Originally Posted by *Jpmboy*
> 
> I think you need another monitor...
> 
> 
> 
> 
> 
> 
> 
> 
> Nice!
> you answered your own question.


Yeah, I meant besides that. 'Cause, you know, there is always a xxxxTi variant on the way that pretty much beats it with fewer CUDA cores and assets. I was wondering whether there are people who just buy it for the sake of having one, or simply for the design. More or less, I was trying to ask: what would you say to convince me to buy one, besides "being the most powerful GPU to date and bragging rights"? Like the performance for the value: is it worth it? Note: I am not a fanboy; Nvidia and AMD have both had flaws in their lifetimes. People say Nvidia has usually been shadier, but hey, nobody is perfect.


----------



## Lee0

Quote:


> Originally Posted by *DsLiadz*
> 
> snipidisnip snip


It depends. Are you looking for bang for your buck? Then no, the TXP is not for you. Are you looking to game at 4K, without an overpriced 4K G-Sync monitor (AKA a normal 4K monitor), at 60+ fps with the highest settings? Then yes, the TXP is right for you. But in most other cases other cards do what you want for a cheaper price, IMHO.


----------



## Jpmboy

Quote:


> Originally Posted by *DsLiadz*
> 
> Yeah i meant besides that, cause you know, there is always a XXXXTi variant on the way that pretty much beats it with fewer cuda cores and assets, i was wondering if there was any people that just buy it for the sake of having one, or simply for the design. Well more or less i was trying to say that what would you say to convince me to buy one besides the "being the most powerful gpu to date and bragging rights", *like the performance for the value* ¿is it worth it? note: i am not a fanboy, Nvidia and AMD they both had flaws on their lifetime, people said that Nvidia usually has been more shady, but hey nobody is perfect.


The same stupid question has been asked since the original Titan.
It's a $1000 card. You are in an Aston Martin club asking about gas mileage.


----------



## DsLiadz

Quote:


> Originally Posted by *Jpmboy*
> 
> Same stupid question has been asked since the original Titan.
> It's $1000 card. You are in an Aston-Martin club asking about gas mileage.


No need to be rude, man. I apologize if I somehow offended you gents with this question. I'm off the thread; thank you anyway, I guess.


----------



## meson1

Quote:


> Originally Posted by *DsLiadz*
> 
> Yeah. I meant besides that. ' Cause, you know, there is always a xxxx Ti variant on the way that pretty much beats it with fewer cuda cores and assets. I was wondering if there was any people that just buy it for the sake of having one, or simply for the design. Well more or less i was trying to say that what would you say to convince me to buy one besides the "being the most powerful gpu to date and bragging rights"; like the performance for the value ¿is it worth it? Note: i am not a fanboy, Nvidia and AMD they both had flaws on their lifetime, people said that Nvidia usually has been more shady, but hey nobody is perfect.


You have to ask yourself, what are your reasons for wanting one? We cannot answer that for you, because everyone's reasons are different.

I bought one because it met my requirement, that being for a single card that would give me (or as close as possible to) 60fps performance at 4K. And because it represents a significant step up from my current 780 Ti.

A 1080 Ti probably would have represented better value, if it ever comes out. I didn't have time to wait, because my new rig is ready to build now and I didn't want to wait until next year to see whether or not a 1080 Ti would be released AND THEN have to wait for a water block for it.

I didn't buy it merely to have the most powerful (although it happens to be by virtue of my requirement). And I didn't buy it to brag, because I haven't told anyone about it (apart from mentioning it in here where other people own one anyway, so they're not going to care).


----------



## DsLiadz

Quote:


> Originally Posted by *meson1*
> 
> You have to ask yourself, what are your reasons for wanting one? We cannot answer that for you, because everyone's reasons are different.
> 
> I bought one because it met my requirement, that being for a single card that would give me (or as close as possible to) 60fps performance at 4K. And because it represents a significant step up from my current 780 Ti.
> 
> A 1080 Ti probably would have represented better value, if it ever comes out. I didn't have time to wait, because my new rig is ready to build now and I didn't want to wait until next year to see whether or not a 1080 Ti would be released AND THEN have to wait for a water block for it.
> 
> I didn't buy it merely to have the most powerful (although it happens to be by virtue of my requirement). And I didn't buy it to brag, because I haven't told anyone about it (apart from mentioning it in here where other people own one anyway, so they're not going to care).


Really grateful for the reply; indeed, I was looking for an answer like this one. It has great specs, and benchmarks show a nice improvement over the past Titan X. It's expensive, but it's a true 4K performer. I would definitely buy one too if I had the money; you must feel like a god with a Titan XP in your rig. Anyway, thank you buddy, I'm gonna go see how much I can get for a healthy kidney.


----------



## Jpmboy

Quote:


> Originally Posted by *DsLiadz*
> 
> Really grateful for the reply; indeed, I was looking for an answer like this one. It has great specs, and benchmarks show a nice improvement over the past Titan X. It's expensive, but it's a true 4K performer. I would definitely buy one too if I had the money; you must feel like a god with a Titan XP in your rig. Anyway, thank you buddy, I'm gonna go see how much I can get for a healthy kidney.


Hold the kidney until the 1080 Ti... but for nearly all gaming uses under 4K, a single 1080 is more than enough.


----------



## jhowell1030

Not quite. Don't forget about those 3440 x 1440 ultra-wides back there.


----------



## Jpmboy

Quote:


> Originally Posted by *jhowell1030*
> 
> Not quite. Don't forget about those 3440 x 1440 ultra-wides back there,


The Asus 1080 "Turbo" I have drives a 1440p/120 monitor just fine. Not as well as a TXP does, but certainly playable.


----------



## jhowell1030

Sure, but what about a Predator X34 @ 100? My two 980 Kingpins fell short even in games that had decent SLI profiles. I pulled the trigger on a Titan because, at the time of the 1080's release, reviewers everywhere were saying that it at most matched the performance of two 980s.

Anyhow, I'm happy to be back on a single-card solution. Now comes the process of putting everything on water... for the first time.


----------



## CptSpig

Quote:


> Originally Posted by *jhowell1030*
> 
> Sure, but what about a predator x34 @ 100? My two 980 kingpins fell short even on games that had decent SLI profiles. I pulled the trigger on a Titan because at the time of the 1080's release reviewers everywhere were saying that it at most matched the performance of two 980s.
> 
> Anyhow, I'm happy to be back to a single card solution. Now comes the process of putting everything on water...for the first time.


Here is a simple solution for water cooling using EKWB AIO products: the EK-XLC Predator 360 (incl. QDC) and a full-cover water block, the EK-FC Titan X Pascal - Acetal+Nickel Pre-Fill. I have these in my machine now and they are a great solution. Temps: CPU OC'd to 4.6 GHz, 56C at full load in 3DMark Fire Strike; GPU OC'd to 2110 MHz, 48C at full load. Idle temps: CPU OC 4.2 GHz, 32C; GPU with no OC, 25C.

http://s1164.photobucket.com/user/CptSpig/media/Mobile Uploads/20161029_225432_zpslarfbv2w.jpg.html


----------



## Jpmboy

Quote:


> Originally Posted by *CptSpig*
> 
> Here is a simple solution for water cooling using EKWB AIO products. EK-XLC Predator 360 (incl. QDC) and a full cover Water Block EK-FC Titan X Pascal - Acetal+Nickel Pre-Fill. I have these in my machine now and they are a great solution. Temps are CPU OC 4.6 GHz 56c full load 3D Mark Fire Strike. GPU OC 2110 MHz is 48c full load 3D Mark Fire Strike. Idle temps are CPU OC 4.2 GHz 32c and GPU No OC 25c.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s1164.photobucket.com/user/CptSpig/media/Mobile Uploads/20161029_225432_zpslarfbv2w.jpg.html


Nice. How about a sustained load, like looping Heaven or Valley? Looking for some experience with that rad's ability to shed a sustained heat load for @Gerbacio; he's thinking of getting the same setup.


----------



## CptSpig

Quote:


> Originally Posted by *Jpmboy*
> 
> Nice., how about a sustained load like looping heaven or valley? Looking for some experience with the ability of that rad to shed a sustained heat load for @Gerbacio, he's thinking of getting the same setup.


When I overclocked the system I used Unigine Valley for about two hours to check stability. Temperatures max out the same as 3D Mark Fire Strike.


----------



## Dagamus NM

Quote:


> Originally Posted by *CptSpig*
> 
> Here is a simple solution for water cooling using EKWB AIO products. EK-XLC Predator 360 (incl. QDC) and a full cover Water Block EK-FC Titan X Pascal - Acetal+Nickel Pre-Fill. I have these in my machine now and they are a great solution. Temps are CPU OC 4.6 GHz 56c full load 3D Mark Fire Strike. GPU OC 2110 MHz is 48c full load 3D Mark Fire Strike. Idle temps are CPU OC 4.2 GHz 32c and GPU No OC 25c.
> 
> http://s1164.photobucket.com/user/CptSpig/media/Mobile Uploads/20161029_225432_zpslarfbv2w.jpg.html


I had to do a double take. On mobile it looked like you had the CPU and GPU looped together and that was it. Reminded me of a funny drawing of an engine concept where the exhaust manifold just dumped straight into the intake.


----------



## CptSpig

Quote:


> Originally Posted by *Dagamus NM*
> 
> I had to do a double take. On mobile it looked like you had the CPU and GPU looped together and that was it. Reminded me of a funny drawing of an engine concept where the exhaust manifold just dumped straight into the intake.


----------



## jhowell1030

Quote:


> Originally Posted by *CptSpig*
> 
> Here is a simple solution for water cooling using EKWB AIO products. EK-XLC Predator 360 (incl. QDC) and a full cover Water Block EK-FC Titan X Pascal - Acetal+Nickel Pre-Fill. I have these in my machine now and they are a great solution. Temps are CPU OC 4.6 GHz 56c full load 3D Mark Fire Strike. GPU OC 2110 MHz is 48c full load 3D Mark Fire Strike. Idle temps are CPU OC 4.2 GHz 32c and GPU No OC 25c.
> 
> http://s1164.photobucket.com/user/CptSpig/media/Mobile Uploads/20161029_225432_zpslarfbv2w.jpg.html


Nice! Yeah, I got in on the 25% off Cyber Monday sale at EKWB. Saved $165! Ordered a bunch of stuff from there that arrives today. Probably won't be able to install any of it for a few weeks (daddy of a two-year-old), but I can't wait. I'm gonna put a slim 360 in the top and a slim 280 in the front of my Define S. Front will be pull/push and top will be exhaust.


----------



## CptSpig

Quote:


> Originally Posted by *jhowell1030*
> 
> Nice! Yeah, I got in on the 25% off cyber Monday sale at ekwb. Saved $165! Ordered a bunch of stuff from there that arrives today. Probably won't be able to install any of it for a few weeks (daddy with a two year old) but I can't wait. I'm gonna put a slim 360 on the top and a 280 slim in the front of my Define S. Front will be pull/push and top will be exhaust.


Very Nice! Pictures when complete?


----------



## jhowell1030

Trust me...I can't wait to show off!

Old setup before the upgrade:
http://i.imgur.com/gxNneDa.jpg
http://i.imgur.com/59tNeAo.jpg


----------



## Jpmboy

Quote:


> Originally Posted by *CptSpig*
> 
> When I overclocked the system I used Unigine Valley for about two hours to check stability. Temperatures max out the same as 3D Mark Fire Strike.


No sheet, that's pretty strange; the loop coolant should heat up a bit on sustained loading.


----------



## Vellinious

Quote:


> Originally Posted by *Jpmboy*
> 
> No sheet, that's pretty strange; the loop coolant should heat up a bit on sustained loading.


One would think....especially with that setup. Makes me wonder what the delta is on the coolant.


----------



## CptSpig

Quote:


> Originally Posted by *Jpmboy*
> 
> No sheet, that's pretty strange; the loop coolant should heat up a bit on sustained loading.



In Unigine the max temp is consistent; in Fire Strike it varies. The ambient temperature is 20C.


----------



## Fiercy

I must say the Nvidia RMA process was the best one I've had. They paid for overnight shipping both to them and back to me. Sent the card on Monday and got a new one Wednesday morning. Wow, at least you're paying for good support!

As I was putting the water block on the card, one thing I noticed on the new one was that there was a lot more thermal paste than last time.


----------



## Jpmboy

Quote:


> Originally Posted by *Vellinious*
> 
> One would think....especially with that setup. Makes me wonder what the delta is on the coolant.


yeah, I'm not sure the info was helpful for a purchase decision by a buddy...


----------



## Sheyster

Quote:


> Originally Posted by *ttg35fort*
> 
> Here's mine


Nice setup! I'm thinking about setting up a new rig for remote work using 3 x BENQ BL3200's (32" 2K 2560x1440). Definitely not a gaming rig with those monitors, but their size is perfect for the type of work I do. I don't really need 4K.


----------



## CptSpig

Quote:


> Originally Posted by *Jpmboy*
> 
> yeah, I'm not sure the info was helpful for a purchase decision by a buddy...


Here you go







Max temp on the CPU during the Fire Strike physics test was 55C. Temps during Valley after one hour are below. Ambient temp 20C.
http://s1164.photobucket.com/user/CptSpig/media/Valley_zpsvlkudig9.png.html
http://s1164.photobucket.com/user/CptSpig/media/Fire Strike_zpsszrk4svn.png.html


----------



## kx11

With this new pre-built PC I just got from OriginPC, I think the TXP it's got ain't all that much.

This is the highest I could get it to run the 3DMark benchmark with no crashes.


----------



## TheGeneralLee86

Here are some pictures of mine, in the box and in my case:

https://flic.kr/p/PhfoAG by general_lee862000, on Flickr

https://flic.kr/p/ND7CqC by general_lee862000, on Flickr


----------



## xarot

Quote:


> Originally Posted by *kx11*
> 
> With this new pre-built PC I just got from OriginPC, I think the TXP it's got ain't all that much.
> 
> This is the highest I could get it to run the 3DMark benchmark with no crashes.


About the same as mine: +193 on the core and +475 on the memory, and the card is water cooled with an EK block. I tend to lose the silicon lottery... on the other hand, those last few tens of MHz don't really mean much. The card is very good at these clocks already.


----------



## Jpmboy

Quote:


> Originally Posted by *xarot*
> 
> About the same as mine. +193 on the core and +475 mem and card is water cooled by EK block. I tend to lose silicon lottery...on the other hand, those last few tens of MHzs don't really mean much. *The card is very good at these clocks already*.


^^ This!


----------



## kx11

Quote:


> Originally Posted by *xarot*
> 
> About the same as mine. +193 on the core and +475 mem and card is water cooled by EK block. I tend to lose silicon lottery...on the other hand, those last few tens of MHzs don't really mean much. The card is very good at these clocks already.


i'm not complaining









just wondering if my unit is a good one


----------



## Menthol

Quote:


> Originally Posted by *kx11*
> 
> i'm not complaining
> 
> 
> 
> 
> 
> 
> 
> 
> 
> just wondering if my unit is a good one


Without water cooling, that's about as good as it gets. Even water cooling only gets you a little more, but it's more consistent, with less throttling.


----------



## Silent Scone

Even then, the range is limited from there. At least you don't have to listen to an army of hairdryers.


----------



## jodasanchezz

Quote:


> Originally Posted by *kx11*
> 
> With this new pre-built PC I just got from OriginPC, I think the TXP it's got ain't all that much.
> 
> This is the highest I could get it to run the 3DMark benchmark with no crashes.


I had 2 Titans; one was RMA'd and the second is in my system.
Try not to focus on +core clock.

Check your favourite game, for example at 4K @ stock settings.
Tab out, add your max MHz on the core.
Tab back in and look at the performance increase.
Tab out, do the same with the memory, but start at +300 MHz.
Tab in, check.
Tab out, add 50... and so on.

I have realised that there is a point where adding more on the memory decreases the performance in-game.

For example, I'm running max +450 MHz on the memory.
And maybe, if you have some watts left over, you can add another 10 MHz on the core.

What's your boost clock and temperature?
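The tab-out/tab-in procedure above is effectively a manual sweep: record fps at each memory offset, then pick the peak. A conceptual sketch, where the offsets and fps numbers are invented purely for illustration:

```python
# Conceptual version of the manual memory-offset sweep described
# above: note the average fps at each tested offset, then pick the
# peak. GDDR5X error correction can silently retry past a point, so
# fps falls even though the clock rises. The numbers are made up.

fps_at_offset = {
    300: 92.1,
    350: 93.0,
    400: 93.6,
    450: 94.0,   # peak: past this, error retries start costing fps
    500: 93.2,
}

def best_mem_offset(results):
    """Return the offset with the highest measured fps."""
    return max(results, key=results.get)

print(best_mem_offset(fps_at_offset))  # 450
```

The point of sweeping by fps rather than by "highest stable clock" is exactly the one made above: the fps-optimal memory offset is often lower than the maximum offset that merely avoids crashing.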


----------



## jodasanchezz

Is it possible that an i5 6600K @ 4.6 GHz is bottlenecking a Titan X Pascal @ 2560x1440 or above?

Playing BF1 1080p 120-144fps
playing BF1 4K 95-120fps

Ultra TAA off


----------



## unreality

Battlefield 1 sure is really CPU hungry too, especially in the bigger Conquest maps with a lot of players. I had to add some things to the config so the game uses all cores on my [email protected]

Before, I had fps drops to 70-80; now it's all good at ~140fps 99% of the time (capped at 140 because of G-Sync input lag).

In GTA5 I'm at a CPU bottleneck most of the time, so the answer is *yes*. These cards are so friggin' powerful, it's actually a nice feeling.


----------



## jodasanchezz

Quote:


> Originally Posted by *unreality*
> 
> battlefield 1 sure is really cpu hungry too, especially in the bigger conquest maps with a lot of players. i had to add some things to config so the game uses all cores on my [email protected]
> 
> before i had fps drops to 70-80, now its all good at ~140fps 99% of the time (capped at 140 bc of gsync inputlag)
> 
> In gta5 im at cpu bottleneck most of the time so answer is *yes*. those cards are so friggin powerful, its actually a nice feeling












Thanks for the answer... hoping for the Kaby Lake 7700K because I don't want to switch platforms.
Yes, it's nice and sad at the same time.


----------



## Jpmboy

Quote:


> Originally Posted by *jodasanchezz*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for the answer... hoping for the Kaby Lake 7700K because I don't want to switch platforms.
> Yes, it's nice and sad at the same time.


from what I've seen, a 7700K is about the same as a 6700K when they are run at the same OC.


----------



## veaseomat

Just added a 2nd TXP to the rig and separated the CPU loop from the GPU loop. I might go back to an E-series chip when the new ones release in 2017, because it looks like gaming might finally benefit from more cores.


----------



## ottoore

Quote:


> Originally Posted by *jodasanchezz*
> 
> Is it possible that an i5 6600K @ 4.6GHz is bottlenecking a Titan X Pascal @ 2560x1440 or above?
> 
> Playing BF1 1080p 120-144fps
> playing BF1 4K 95-120fps
> 
> Ultra TAA off


I guess you meant below.


----------



## Enapace

Is a 5930K going to be a good enough CPU to pair with two Titan X Pascals?

Or should I try to replace my 5930K with a 5960X?


----------



## jhowell1030

Quote:


> Originally Posted by *Enapace*
> 
> Is 5930K going to be a good enough CPU to pair with two Titan Pascal ?
> 
> Or Should I try and replace my 5930K with a 5960X?


You won't see any substantial benefit from the 5960X. Might as well save some dough.


----------



## Enapace

Quote:


> Originally Posted by *jhowell1030*
> 
> You won't see any substantial benefit from the 5960X. Might as well save some dough.


Figured as much, but thought I would ask.


----------



## Jpmboy

Quote:


> Originally Posted by *Enapace*
> 
> Is 5930K going to be a good enough CPU to pair with two Titan Pascal ?
> 
> Or Should I try and replace my 5930K with a 5960X?


depends on the games you play and whether your MB will allow full lane access with a 28 lane cpu. the 5960X's extra cores and 40 lanes can help if you have things like NVMe and/or PCIE drives on board, and of course with post processing and physics. Frankly, a 5960X running 4.6-4.7 (or higher) is a very strong system.


----------



## CptSpig

Quote:


> Originally Posted by *Jpmboy*
> 
> depends on the games you play and whether your MB will allow full lane access with a 28 lane cpu. the 5960X's extra cores and 40 lanes can help if you have things like NVMe and/or PCIE drives on board, and of course with post processing and physics. Frankly, a 5960X running 4.6-4.7 (or higher) is a very strong system.


The Intel 5930K processor has 40 lanes, and mine overclocks to 4.6 with no issues on liquid cooling. In real-world applications I don't know how much two more cores will help. I do know it helps with 3DMark scores.


----------



## Jpmboy

Quote:


> Originally Posted by *CptSpig*
> 
> The Intel 5930K processor has 40 lanes and mine overclocks to 4.6 with no issues on liquid cooling. In real world applications don't know how much two more cores will help. I do know it helps with 3D Mark scores.


yeah - with 40 lanes you're fine loading the PCIE system up. As far as real world apps go, that depends on the apps. Very few games take advantage of 6, let alone 10 cores. But many actual _real world_ apps will... encoding, encryption, scientific and financial computation, etc., and yes, 3DMark physics. The 5930K is a very strong cpu.








my 5960x is a busy little guy...









Spoiler: Warning: Spoiler!




5960x 4.6/4.4/3200c13, 2 TX maxwells


----------



## CptSpig

Quote:


> Originally Posted by *Jpmboy*
> 
> yeah - with 40 lanes you're fine loading the PCIE system up. As far as real world apps go, that depends on the apps. Very few games take advantage of 6 let alone 10 cores. But many actual _real world_ apps will... encoding, encryption, sci and financial computation etc, and yes, 3D mark physics. The 5930K is a very strong cpu.
> 
> 
> 
> 
> 
> 
> 
> 
> my 5960x is a busy little guy...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 5960x 4.6/4.4/3200c13, 2 TX maxwells


Thanks for the insight. I am trying to decide if I want to upgrade to an Intel i7 6900K or wait until a new platform comes around. Could be a long wait?


----------



## meson1

Quote:


> Originally Posted by *CptSpig*
> 
> Thanks for the insight. I am trying to decide if I want to upgrade to a Intel i7 6900k or wait until a new platform comes around. Could be a long wait?


If I were in your position, with a i7 5930K, I'd definitely skip Broadwell-E and wait for Skylake-X/Kaby Lake-X which is due H2 next year.

Broadwell-E is just Haswell-E on a smaller process and moving from one to the other provides little or no gains. In fact, I am given to understand that Broadwell-E just doesn't overclock as well as Haswell-E.

The only slight gain you might get would be from increasing your core count which will only count for anything if you are doing work for which those extra cores will make a crucial difference.

From Skylake-X and Kaby Lake-X you will be looking at significant gains by virtue of the change in architecture, so you'll be getting a much bigger bump in speed. But you will need to be patient while they come.


----------



## Dagamus NM

Quote:


> Originally Posted by *CptSpig*
> 
> Thanks for the insight. I am trying to decide if I want to upgrade to a Intel i7 6900k or wait until a new platform comes around. Could be a long wait?


Depends on what you consider a long wait. Probably 6-9 months based on Intel's past behavior.

A 6 series processor makes sense as an upgrade to anything 4 series or older, but as you have a very nice processor there are little real world gains to be had.

I picked up a 6950X to replace a 3930K to host four TXPs. This is to run 4K surround, and in all actuality it is most likely a colossal waste of money.

My 4930K will be replaced with Skylake X next year.


----------



## Enapace

I think I will keep my 5930K for now. I'll give Skylake-E a look, but a 5930K should be able to handle my Titan X Pascal SLI at 4K for a while to come.


----------



## s1rrah

Building a new system currently ... I can get a Titan X Pascal or wait for 1080ti's ... any reasonable argument for going with Titan X Pascal now instead of waiting for the Ti's??

Thanks..


----------



## Blaise Pascal

Given the current lack of aftermarket cooler support and custom BIOSes, unless you plan on investing in a full water loop, I'd wait through January to see what's up with the 1080 Ti.

That being said, if you don't care about a little extra money, go with the Titan. It's a beast and I think that everyone here agrees. By all rumors, I don't think that the 1080ti will outperform the titan (though it should be extremely close), and a hundred bucks-ish extra for a card now rather than later is never a bad way to go.

Also, if you aren't really an overclocker, I'd say there is no doubt that you should go with the Titan.


----------



## Baasha

I just tried upping my OC the other day and got 2000mhz across all 4 GPUs.









It was a sight to see. Oh, and +700 on the mem too!

God I love the Uber Rig!









I need MOAR challenging software - the games are too easy to crush even at 8K! hehe...


----------



## Blaise Pascal

^ What a man.


----------



## s1rrah

Quote:


> Originally Posted by *Blaise Pascal*
> 
> Given our lack of cooler support and bios, unless you plan on investing in a full water loop, I'd wait through January to see what's up with the 1080ti.
> 
> That being said, if you don't care about a little extra money, go with the Titan. It's a beast and I think that everyone here agrees. By all rumors, I don't think that the 1080ti will outperform the titan (though it should be extremely close), and a hundred bucks-ish extra for a card now rather than later is never a bad way to go.
> 
> Also, if you aren't really an overclocker, I'd say there is no doubt that you should go with the Titan.


Thanks for the reply; I'm inclined to agree with you on waiting. Hard though when the Titan is a click away .. LOL; I'm completely comfortable with the 1080 hybrid mod so that's what I'd do if I went with the Titan. I've used two of those hybrid coolers on x2 980's for a long time and can't find anything wrong with them, other than the obvious "pumpfest" going on in the case ... but never had noise or otherwise issues so ...

I'm really wanting to go single card, too as I've been using SLI for quite a few years (since the GTX670 days so many decades ago) ... and though I really enjoy it, I'd still rather have one super capable card (and then go SLI a year later) ...

I'll see though ... if I find that both of my 980's are still functioning after my recent, still unexplained/troubleshooted system failure last week, then I'll definitely wait for the 1080ti's as I'm still pretty stoked on 980's SLI ...

Thanks again ..


----------



## Dagamus NM

Quote:


> Originally Posted by *s1rrah*
> 
> Thanks for the reply; I'm inclined to agree with you on waiting. Hard though when the Titan is a click away .. LOL; I'm completely comfortable with the 1080 hybrid mod so that's what I'd do if I went with the Titan. I've used two of those hybrid coolers on x2 980's for a long time and can't find anything wrong with them, other than the obvious "pumpfest" going on in the case ... but never had noise or otherwise issues so ...
> 
> I'm really wanting to go single card, too as I've been using SLI for quite a few years (since the GTX670 days so many decades ago) ... and though I really enjoy it, I'd still rather have one super capable card (and then go SLI a year later) ...
> 
> I'll see though ... if I find that both of my 980's are still functioning after my recent, still unexplained/troubleshooted system failure last week, then I'll definitely wait for the 1080ti's as I'm still pretty stoked on 980's SLI ...
> 
> Thanks again ..


What display are you planning to run with said card?


----------



## Jpmboy

Quote:


> Originally Posted by *CptSpig*
> 
> Thanks for the insight. I am trying to decide if I want to upgrade to a Intel i7 6900k or wait until a new platform comes around. Could be a long wait?


the only reason I see to go BWE is for 10 unlocked cores. For 6 and 8 core processors stick with HWE until a next gen HEDT CPU with improved IPC comes along.


----------



## Nikos4Life

Hello mates,

After reading the last few pages about CPUs bottlenecking the Titans, it came to my mind that while playing BF1 (all maxed out @ 1440p) I do not see the cards being used at ~90% or so, so my FPS are not constant at the refresh rate of the monitor (144Hz G-Sync). Cards are overclocked and watercooled.
My CPU is a 6850K @ 4.4 and the RAM is running @ 3200MHz.
Do you think the CPU is possibly bottlenecking the cards?
Anything I can do about it?

Thanks in advance


----------



## Maintenance Bot

Quote:


> Originally Posted by *Nikos4Life*
> 
> Hello mates,
> 
> After reading last few pages about CPUs bottlenecking the Titans, It just came to my mind that while playing BF1 (all maxed out @ 1440p) I do not see the cards being use at 90% or so, so my FPS are not constant at the refresh rate of the monitor(144Hz GSYNC). Cards are overclocked and watercooled.
> My CPU is a 6850K @ 4.4 and the ram is running @ 3200MHz
> Do you think the CPU is possibly bottlenecking the cards?
> Anything I can do about it?
> 
> Thanks in adavance


SLI doesn't work well in BF1 at the moment; scaling was not that good, so I sold my 2nd Titan XP.

With the rig in my sig I get around 150 fps (1440p, 165Hz) in BF1 with one Titan XP.


----------



## Nikos4Life

Quote:


> Originally Posted by *Maintenance Bot*
> 
> SLi not work well in BF1 at the moment, scaling was not that good for BF1 so I sold my 2nd Titan XP.
> 
> Rig in sig I get around 150 fps ( 1440p 165hz ) in BF1 with 1 titan XP.


Thanks!

Everything maxed out? O_O, that is much better than what I am getting with SLI.
DX12?


----------



## unreality

Quote:


> Originally Posted by *Nikos4Life*
> 
> Hello mates,
> 
> After reading last few pages about CPUs bottlenecking the Titans, It just came to my mind that while playing BF1 (all maxed out @ 1440p) I do not see the cards being use at 90% or so, so my FPS are not constant at the refresh rate of the monitor(144Hz GSYNC). Cards are overclocked and watercooled.
> My CPU is a 6850K @ 4.4 and the ram is running @ 3200MHz
> Do you think the CPU is possibly bottlenecking the cards?
> Anything I can do about it?
> 
> Thanks in adavance


create a file called user.cfg in bf1 folder and put the following text in there

Code:

thread.processorcount 12
thread.maxprocessorcount 12
thread.minfreeprocessorcount 0
gametime.maxvariablefps 140

That should fix it, especially in the bigger multiplayer maps. I'm running 1 TX @ 2060 and it stays at 140 fps (which you should cap to if you have G-Sync, because of input lag) like 99.9% of the time.
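If you'd rather script the file than create it by hand, here's a minimal sketch. It just writes the exact lines from the post above; the folder argument is wherever your Battlefield 1 install lives (the example path below is an assumption for a default Origin install, so point it at yours), and the thread counts presumably want adjusting to your own CPU.

```python
from pathlib import Path

# The exact cvars from the post above.
BF1_USER_CFG = "\n".join([
    "thread.processorcount 12",
    "thread.maxprocessorcount 12",
    "thread.minfreeprocessorcount 0",
    "gametime.maxvariablefps 140",
]) + "\n"

def write_user_cfg(bf1_dir):
    """Write user.cfg into the given Battlefield 1 install folder."""
    path = Path(bf1_dir) / "user.cfg"
    path.write_text(BF1_USER_CFG)
    return path
```

Usage would be something like `write_user_cfg(r"C:\Program Files (x86)\Origin Games\Battlefield 1")`.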


----------



## Nikos4Life

Quote:


> Originally Posted by *unreality*
> 
> create a file called user.cfg in bf1 folder and put the following text in there
> 
> Code:
> 
> thread.processorcount 12
> thread.maxprocessorcount 12
> thread.minfreeprocessorcount 0
> gametime.maxvariablefps 140
> 
> Should fix it, especially in bigger multiplay maps. Im running 1 TX @ 2060 and it stays at 140 fps (which you should cap if you have gsync bc of inputlag) like 99.9% of the time


This improved my framerate (thanks), but it is not a steady 140. Do you have all the settings maxed out?
If I use SLI with this config I am getting worse performance than with one card :|
I think it is the game and not my config :S


----------



## Jpmboy

Quote:


> Originally Posted by *Nikos4Life*
> 
> This improved my framerate (thanks) but it is not 140 steady. Do you have all the settings maxed out?
> If I use SLI with this config I am getting worse performance than with one :|
> I think it is the game and not any of my config :S


it's the game.


----------



## CptSpig

Quote:


> Originally Posted by *Jpmboy*
> 
> the only reason I see to go BWE is for 10 unlocked cores. For 6 and 8 core processors stick with HWE until a next gen HEDT CPU with improved IPC comes along.


Thanks, have you heard when they will move from X99 and come out with a new chipset/socket?


----------



## Piospi

Will an i7 6700K and a Samsung 950 PRO bottleneck two Titan Xs? I'm sorry for this question, but I am confused and don't know if it will be a good buy.


----------



## Dagamus NM

Quote:


> Originally Posted by *CptSpig*
> 
> Thanks, have you heard when they will move from X99 and come out with a new chipset/socket?


Second half 2017. August or September most likely.


----------



## s1rrah

Quote:


> Originally Posted by *Dagamus NM*
> 
> What display are you planning to run with said card?


1440p 144hz ...

I'm a FPS junkie and so that's where I will be playing for the next couple years ... when 4K is comfortable at 100fps then I'll upgrade.


----------



## CptSpig

Quote:


> Originally Posted by *Jpmboy*
> 
> the only reason I see to go BWE is for 10 unlocked cores. For 6 and 8 core processors stick with HWE until a next gen HEDT CPU with improved IPC comes along.


Thanks, have you heard when they will move from X99 and come out with a new chipset/socket?
Quote:


> Originally Posted by *Dagamus NM*
> 
> Second half 2017. August or September most likely.


Thanks, I think I will wait.


----------



## Jpmboy

Quote:


> Originally Posted by *CptSpig*
> 
> Thanks, have you heard when they will move from X99 and come out with a new chipset/socket?
> Thanks, I think I will wait.


x299 & socket 2066 is the lead rumor.


----------



## Dagamus NM

Quote:


> Originally Posted by *s1rrah*
> 
> 1440p 144hz ...
> 
> I'm a FPS junkie and so that's where I will be playing for the next couple years ... when 4K is comfortable at 100fps then I'll upgrade.


What are you using to drive it now? I would expect a 1080Ti could drive it. I am surprised that NVidia decided to wait and miss Christmas. 980Ti was out this time last year.


----------



## CptSpig

Quote:


> Originally Posted by *Jpmboy*
> 
> x299 & socket 2066 is the lead rumor.


Thanks


----------



## wizaga

Hi everyone, I have a big problem with my Titan XP SLI config; I hope somebody here can help me.
I just posted on the Nvidia forums as well, so I will copy and paste what I wrote there:

First of all, I just want to say sorry if my English isn't the best (not my native language).
I have a problem with my Titan X Pascal SLI (with HB bridge) setup and I think it's the driver's fault.
The problem is that every time I run anything at 4K and exit the game/benchmark etc., my computer becomes SUPER slow. Anything I click on takes like 5-10 seconds to open, and it doesn't recover unless I reboot or disable/enable SLI again.
I searched other threads/forums and found other people having a similar problem, and nobody had an answer as to what the solution was.
I have tested each card on its own and they both work perfectly, and as long as I am not in SLI there is no slowdown. I did a fresh Windows install and still no difference; I reset every OC I had on CPU, memory and GPU and still no difference.

I was monitoring it all with HWMonitor and there was no difference in temps or any warning signs.

This is my setup:
Motherboard: MSI Z170 XPower Titanium Gaming Edition
GPU: Titan X Pascal SLI (with 1080 Hybrid coolers)
CPU: i7 6700K at 4.6GHz stable (also water cooled)
RAM: Corsair Vengeance 32GB 3200MHz
Power supply: EVGA T2 1600W
SSD: 256GB (only Windows)
HDD: 3TB

So please, is there any fix for this?
Does anybody else have this problem?
HELP!

Edit: I don't have this problem with my former Titan X Maxwells in SLI (same PC). I tried rolling back a couple of drivers and still no difference; I also tried another motherboard and still no difference.


----------



## Menthol

Quote:


> Originally Posted by *Dagamus NM*
> 
> What are you using to drive it now? I would expect a 1080Ti could drive it. I am surprised that NVidia decided to wait and miss Christmas. 980Ti was out this time last year.


Nvidia sells the TXP; the 1080 Ti will be sold by board partners. If the 1080 Ti were out now, how many TXP sales would they lose?


----------



## Dagamus NM

Quote:


> Originally Posted by *Menthol*
> 
> Nvidia sells the TXP; the 1080 Ti will be sold by board partners. If the 1080 Ti were out now, how many TXP sales would they lose?


Are there many TXP buyers that have yet to get it?


----------



## Menthol

I'll ask Santa and get back to you


----------



## Enapace

Quote:


> Originally Posted by *Dagamus NM*
> 
> Are there many TXP buyers that have yet to get it?


Well, I only have one; debating if I should get a second for 4K.


----------



## Nikos4Life

This fixed my FPS problems in BF1:

[user.cfg]

thread.processorcount 6
thread.maxprocessorcount 6
thread.minfreeprocessorcount 0
gametime.maxvariablefps 140



Spoiler: NVIDIA PROFILE



Code:

<?xml version="1.0" encoding="utf-16"?>
<ArrayOfProfile>
  <Profile>
    <ProfileName>Battlefield 1</ProfileName>
    <Executeables>
      <string>bf1.exe</string>
      <string>tunguska.main_win64_final.exe</string>
      <string>bf1trial.exe</string>
      <string>tunguska.main_win64_release.exe</string>
    </Executeables>
    <Settings>
      <ProfileSetting>
        <SettingNameInfo>Vertical Sync Tear Control</SettingNameInfo>
        <SettingID>5912412</SettingID>
        <SettingValue>2525368439</SettingValue>
      </ProfileSetting>
      <ProfileSetting>
        <SettingNameInfo>Preferred refresh rate</SettingNameInfo>
        <SettingID>6600001</SettingID>
        <SettingValue>1</SettingValue>
      </ProfileSetting>
      <ProfileSetting>
        <SettingNameInfo>Vertical Sync</SettingNameInfo>
        <SettingID>11041231</SettingID>
        <SettingValue>1620202130</SettingValue>
      </ProfileSetting>
      <ProfileSetting>
        <SettingNameInfo>Flag to control smooth AFR behavior</SettingNameInfo>
        <SettingID>270198627</SettingID>
        <SettingValue>0</SettingValue>
      </ProfileSetting>
      <ProfileSetting>
        <SettingNameInfo>SLI rendering mode</SettingNameInfo>
        <SettingID>271830737</SettingID>
        <SettingValue>1</SettingValue>
      </ProfileSetting>
      <ProfileSetting>
        <SettingNameInfo>Number of GPUs to use on SLI rendering mode</SettingNameInfo>
        <SettingID>271834321</SettingID>
        <SettingValue>1</SettingValue>
      </ProfileSetting>
      <ProfileSetting>
        <SettingNameInfo>Power management mode</SettingNameInfo>
        <SettingID>274197361</SettingID>
        <SettingValue>1</SettingValue>
      </ProfileSetting>
      <ProfileSetting>
        <SettingNameInfo>VRR requested state</SettingNameInfo>
        <SettingID>278196727</SettingID>
        <SettingValue>1</SettingValue>
      </ProfileSetting>
      <ProfileSetting>
        <SettingNameInfo>G-SYNC</SettingNameInfo>
        <SettingID>279476687</SettingID>
        <SettingValue>0</SettingValue>
      </ProfileSetting>
      <ProfileSetting>
        <SettingNameInfo>Enable G-SYNC globally</SettingNameInfo>
        <SettingID>294973784</SettingID>
        <SettingValue>1</SettingValue>
      </ProfileSetting>
      <ProfileSetting>
        <SettingNameInfo>Multi-display/mixed-GPU acceleration</SettingNameInfo>
        <SettingID>537586684</SettingID>
        <SettingValue>0</SettingValue>
      </ProfileSetting>
      <ProfileSetting>
        <SettingNameInfo>Threaded optimization</SettingNameInfo>
        <SettingID>549528094</SettingID>
        <SettingValue>1</SettingValue>
      </ProfileSetting>
    </Settings>
  </Profile>
</ArrayOfProfile>


----------



## toncij

I need someone to confirm something for me:
- when overclocking the TXP (and I recall the same for the 1080), I think the power limit envelops both the GPU and VRAM, and thus if the VRAM is overclocked too high, it cannibalizes power available to the GPU.

Was my perception wrong or correct?


----------



## Jpmboy

Quote:


> Originally Posted by *Menthol*
> 
> I'll ask Santa and get back to you


----------



## Dagamus NM

Quote:


> Originally Posted by *Enapace*
> 
> Well I only have one debating if I should get a second for 4k


You already know the answer to that question. And to the point above, you are not going to hold out for the 1080Ti because you want another Titan.


----------



## Enapace

Quote:


> Originally Posted by *Dagamus NM*
> 
> You already know the answer to that question. And to the point above, you are not going to hold out for the 1080Ti because you want another Titan.


Going to order another one. Both will have to sit on air for 2-4 months though, so I can afford a custom loop lol.


----------



## Seyumi

Quote:


> Originally Posted by *toncij*
> 
> I need someone to confirm something for me:
> - when overclocking TXP (and I recall the same for 1080) I think the power limit enveloped the GPU and VRAM and thus if VRAM was overclocked too high, it was cannibalizing into power available to the GPU.
> 
> Was my perception wrong or correct?


You are correct. I believe the sweet spot is around +450 on the memory. Anything higher and the GPU clock drops more often, since power is taken away from the core and given to the memory, which in turn reduces overall performance and you get fewer frames. A website did a huge chart testing memory/GPU clocks in +50MHz intervals, and +450 was the safe spot (it varied from +400 to +500). That's what I have my Titan Xs at.
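The shared-budget idea can be pictured with simple arithmetic. A toy model (every number below is an illustrative assumption, not a measured figure): the board power limit is fixed, memory draw rises with the offset, and whatever is left over is what the core can use before it has to downclock.

```python
def core_budget_watts(power_limit_w, mem_offset_mhz,
                      base_mem_w=40.0, mem_w_per_100mhz=1.5):
    """Toy model: watts left for the GPU core after the memory takes its cut.

    base_mem_w and mem_w_per_100mhz are made-up illustrative constants,
    not real measurements of GDDR5X power draw.
    """
    mem_w = base_mem_w + (mem_offset_mhz / 100.0) * mem_w_per_100mhz
    return power_limit_w - mem_w

# At a fixed limit, raising the memory offset leaves less for the core,
# which is why a too-high memory OC can lower the boost clock:
print(core_budget_watts(300.0, 0))    # 260.0 W left for the core
print(core_budget_watts(300.0, 450))  # 253.25 W at +450 MHz
```

The crossover point (where the extra memory bandwidth stops paying for the lost core clock) is what the +400 to +500 "sweet spot" above corresponds to in this picture.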


----------



## Murlocke

Is this heatsink the same as the 1080? I'm trying to find a guide to reapply the TIM on this card and I don't wanna risk it by guessing. I generally have to run my card at 100% fan in games like Witcher 3 for about 81-82C when TDP is set to 120%. I have 5 140mm noctua case fans, so definitely not a case issue, 3 fans blow almost directly at the card.

I believe I saw someone claim upward of 10C drops when replacing the TIM with something better but I couldn't find any other results. Anyone else have good results replacing the stock TIM?


----------



## KillerBee33

Quote:


> Originally Posted by *Murlocke*
> 
> Is this heatsink the same as the 1080? I'm trying to find a guide to reapply the TIM on this card and I don't wanna risk it by guessing. I generally have to run my card at 100% fan in games like Witcher 3 for about 81-82C when TDP is set to 120%. I have 5 140mm noctua case fans, so definitely not a case issue, 3 fans blow almost directly at the card.
> 
> I believe I saw someone claim upward of 10C drops when replacing the TIM with something better but I couldn't find any other results. Anyone else have good results replacing the stock TIM?


Tried Arctic Silver 5 and Gelid Extreme twice....1-2 degrees difference.


----------



## axiumone

Quote:


> Originally Posted by *Murlocke*
> 
> Is this heatsink the same as the 1080? I'm trying to find a guide to reapply the TIM on this card and I don't wanna risk it by guessing. I generally have to run my card at 100% fan in games like Witcher 3 for about 81-82C when TDP is set to 120%. I have 5 140mm noctua case fans, so definitely not a case issue, 3 fans blow almost directly at the card.
> 
> I believe I saw someone claim upward of 10C drops when replacing the TIM with something better but I couldn't find any other results. Anyone else have good results replacing the stock TIM?


Absolutely; in my opinion it's a must. I was shocked at the factory TIM application on a $1,200 card. Here, take a look for yourself. I used Thermal Grizzly Kryonaut and I'm happy with the results compared to stock.





Although the heatsinks on the TXP and 1080 are a little different, the parts you need to disassemble to replace the TIM are the same. Take the top front portion of the shroud off, then undo the four large screws that hold the actual heatsink block on. That's all you need to do. You don't need to take off the backplate or disassemble the card any further.


----------



## KillerBee33

Quote:


> Originally Posted by *axiumone*
> 
> Absolutely, in my opinion it's an absolute must. I was shocked at the factory tim application on a $1,200 card. Here, take a look for yourself. I used thermal grizzly kryonaut and I'm happy with the results compared to stock.


Ouch, that looks horrible.


----------



## Murlocke

Quote:


> Originally Posted by *axiumone*
> 
> Absolutely, in my opinion it's an absolute must. I was shocked at the factory tim application on a $1,200 card. Here, take a look for yourself. I used thermal grizzly kryonaut and I'm happy with the results compared to stock.
> 
> 
> 
> 
> 
> Although the heatsinks on the TXP and 1080 are a little different, the parts you need to disassemble to replace the TIM are the same. Take the top front portion of the shroud off, then undo the four large screws that hold the actual heatsink block on. That's all you need to do. You don't need to take off the backplate or disassemble the card any further.


Thanks. If I may ask, how much did temps drop? It looks terrible, but sometimes it does the job decently. If it's just 1-2C then I'm not gonna bother.


----------



## axiumone

Quote:


> Originally Posted by *Murlocke*
> 
> Thanks, if I may ask how much did temps drops? It looks terrible but sometimes it does the job decently. If it's just 1-2C then I'm not gonna bother.


I can't honestly remember, but I think it was worth it. More than a few degrees.


----------



## BelowAverageIQ

In a conundrum. First world problem really.

I have SLI GTX 980's at the moment. They do the job, quite well.

I want to go to a single card; I've had enough of SLI not being supported, or needing game-specific drivers/coding. Obviously the Titan XP is the fastest single card currently on the market.

The only things stopping me from buying are:

1. I am located in Australia (will have to use a forwarding company, not a big problem). Plus the exchange rate









2. The 1080Ti is rumor, but will possibly be paper launched at CES.

3. Last lot of Ti's have left the Titans wanting.

4. AMD is allegedly going to bring something to the market.

5. I will immediately remove the cooler and put a block on it for my custom loop.

6. I basically browse forums, play a few games: Arma 3, BF1, Rust.

I don't usually buy cards and flip them. I hate selling things to others. When I do replace the 980's, either both will go into my son's system, or one into his system and one into my daughter's system.

On the flip side:

1. I love latest tech.

2. I like to have the best (see No. 3 above).

3. I need a bigger e-peen









Thoughts? Any regrets from anyone here (probably a stupid question).

Cheers


----------



## MrTOOSHORT

Quote:


> Originally Posted by *BelowAverageIQ*
> 
> In a conundrum. First world problem really.
> 
> I have SLI GTX 980's at the moment. They do the job, quite well.
> 
> I want to go to single card. had enough of SLI not being supported, or needing drivers/coding specific to the game. Obviously the Titan XP is the fastest current single card on the market.
> 
> The only things stopping me from buying are:
> 
> 1. I am located in Australia (will have to use a forwarding company, not a big problem). Plus the exchange rate
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2. The 1080Ti is rumor, but will possibly be paper launched at CES.
> 
> 3. Last lot of Ti's have left the Titans wanting.
> 
> 4. AMD is allegedly going to bring something to the market.
> 
> 5. I will immediately remove the cooler and put a block on it for my custom loop.
> 
> 6. I basically browse forums, play a few games: Arma 3, BF1, Rust.
> 
> I dont usually buy cards and flip them. I hate selling things to others. When I do replace the 980's either both will go into my sons system or one into his system and one into my daughters system.
> 
> On the flip side:
> 
> 1. I love latest tech.
> 
> 2. I like to have the best (see No. 3 above).
> 
> 3. I need a bigger e-peen
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thoughts? Any regrets from anyone here (probably a stupid question).
> 
> Cheers


I have a Titan X Maxwell and moved to a TXP, no regrets. Voltage control on a 1080 Ti won't matter much with this architecture, and that's what would be needed to get past TXP performance. So the TXP will stay king over the 1080 Ti, on water or air.

We are close to the rumored announcement of the 1080 Ti, so you could wait and see. But since you are going water, and extra voltage does basically nothing for Pascal on water or air, just go for the big one.


----------



## Murlocke

I hate SLI, so I am super biased, but I'd do everything in my power to get rid of it if you don't actually need the performance. 60FPS on SLI still feels like 45-50FPS to me, which makes it rather pointless. Microstuttering is still a thing and I must be super sensitive to it. I've tried it about 6 times over the years and always returned or sold the second card.
Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Have a Titan X Maxwell, moved to a TXP, no regrets.


Same, went from 4K being nearly unplayable at max settings to 60FPS in almost every title. Worth every penny.


----------



## MrKenzie

Quote:


> Originally Posted by *BelowAverageIQ*
> 
> In a conundrum. First world problem really.
> 
> I have SLI GTX 980's at the moment. They do the job, quite well.
> 
> I want to go to single card. had enough of SLI not being supported, or needing drivers/coding specific to the game. Obviously the Titan XP is the fastest current single card on the market.
> 
> The only things stopping me from buying are:
> 
> 1. I am located in Australia (will have to use a forwarding company, not a big problem). Plus the exchange rate
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2. The 1080Ti is rumor, but will possibly be paper launched at CES.
> 
> 3. Last lot of Ti's have left the Titans wanting.
> 
> 4. AMD is allegedly going to bring something to the market.
> 
> 5. I will immediately remove the cooler and put a block on it for my custom loop.
> 
> 6. I basically browse forums, play a few games: Arma 3, BF1, Rust.
> 
> I dont usually buy cards and flip them. I hate selling things to others. When I do replace the 980's either both will go into my sons system or one into his system and one into my daughters system.
> 
> On the flip side:
> 
> 1. I love latest tech.
> 
> 2. I like to have the best (see No. 3 above).
> 
> 3. I need a bigger e-peen
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thoughts? Any regrets from anyone here (probably a stupid question).
> 
> Cheers


Hey mate, I'm from Australia too. I went from a 780 Ti to the Titan XP and upgraded to 4K. I have no regrets, obviously, as it's a huge improvement compared to what I had! I used ShopMate from AusPost to get it here and that went pretty well. It cost me AUD $90 for insured shipping (insurance optional) and took about 3 weeks to arrive because of customs etc. I got lucky and the card delivered to my door cost a total of AUD $1670 with no duty or GST applied; Nvidia marked it as a "gift".

I can't see the 1080 Ti being much of an improvement, if any at all, considering even the custom-board 1080s max out at about 2100-2150MHz, the same as the Titan X does. It wouldn't surprise me if the 1080 Ti is just a US$899 card that's 10-15% faster than the 1080.

As long as you don't have to pay the duty and GST, a Titan X will cost a similar amount to a locally delivered 1080 Ti...


----------



## BelowAverageIQ

I can guarantee that if I buy the Titan XP today, the 1080 Ti will be announced tomorrow and it will be an XP killer. Guaranteed.

Then you guys and gals will hate me, because of what I have done


----------



## Exnetic

A 980 Ti or 1080 Ti is not better than a Titan X/XP; at the same clocks, the Titans are faster.


----------



## KillerBee33

Quote:


> Originally Posted by *Exnetic*
> 
> A 980Ti or 1080Ti is not better than a titan x/xp at same clocks titans is faster.


The rumored clock on the 1080 Ti is higher than the TXP's, just with fewer CUDA cores on the 1080 Ti. Not sure why Nvidia would do something like that.


----------



## BelowAverageIQ

Quote:


> Originally Posted by *KillerBee33*
> 
> Rumored cClock on the 1080ti is higher than the TXP , only fewer Cuda cores on the 1080ti. Not sure why nvidia would do something like that


That, combined with partner-designed boards, voltage regulation and power delivery, will probably lead to a comparable or better-performing board, as has been demonstrated in the past.

Realistically though, with a paper launch in January we probably won't see the cards in sufficient numbers, or waterblocks, here in Australia until May 2017 at my guess.

I appreciate that Nvidia and its partners are there to make money. There will ALWAYS be something better around the corner. It just makes the decision that little bit harder.


----------



## KillerBee33

Quote:


> Originally Posted by *Exnetic*
> 
> A 980Ti or 1080Ti is not better than a titan x/xp at same clocks titans is faster.


The rumored clock on the 1080 Ti is higher than the TXP's, just with fewer CUDA cores on the 1080 Ti. Not sure why Nvidia would do something like that.
Quote:


> Originally Posted by *BelowAverageIQ*
> 
> That combined with partner designed boards, voltage regulation and power will probably lead to a comparable or better performing board, as has been demonstrated in the past.
> 
> Realistically though, a paper launch in January will probably not see the cards in sufficient numbers and waterblocks here in Australia, my guess until May 2017.
> 
> I appreciate that Nvidia and partners are there to make money. There will ALWAYS something better around the corner. Just makes the decision that little bit harder.


I lost $100 on a 1080 resale and am quite happy with the output after getting the TXP. I was hoping for BIOS tools to see its full power, but it looks like we ain't getting those for the 10 series. If you do the math, though, the boost on the 10 series is pretty much what we were getting with a custom BIOS on the 9 series.


----------



## bl4ckdot

Would an upgrade to a 6850K, Asus Edition 10 and some good DDR4 RAM be a waste, coming from a 4790K with my Titan XP? A new chipset is coming next year, so I don't really know.

I know I should wait, but damn this is tempting


----------



## Exnetic

Quote:


> Originally Posted by *KillerBee33*
> 
> Rumored cClock on the 1080ti is higher than the TXP , only fewer Cuda cores on the 1080ti. Not sure why nvidia would do something like that
> I lost $100 on a 1080 resale and quite happy with the output after getting TXP , was hoping for bios tools to see it's full power but it looks like we ain't getting those in the 10 Series.But if you do the math the BOOST on the 10Series is pretty much what we were getting with custom BIOS on the 9Series


I find no need for a custom BIOS etc.; a shunt mod and OC will get 1080s and Titan XPs stable over 2.1GHz with waterblocks.


----------



## Enapace

Quote:


> Originally Posted by *bl4ckdot*
> 
> Would an upgrade to a 6850K, Asus Edition 10 and some good DDR4 RAM be a waste coming from a 4790K with my Titan XP ? New chipset is coming next year so I don't really know
> 
> 
> 
> 
> 
> 
> 
> 
> I know I should wait, but damn this is tempting


It would be a bigger upgrade if you were going Titan XP SLI. The reason I went with a 5930K is that I knew I was likely to end up getting a second Titan Pascal lol.

You would be getting DDR4 support, which is a plus, an additional 2 physical cores for your CPU, and also M.2 NVMe support, which I don't think was on Z97.

Whether it's a big enough improvement is up to you; it's not like the 4790K is a bad processor by any stretch of the imagination.


----------



## Nikos4Life

Quote:


> Originally Posted by *Exnetic*
> 
> I find no need for custom bios etc, a shunt mod and oc will gett 1080 and titan xps over 2.1ghz stable with waterblocks


Any place where I can read a guide to do the shunt mod step-by-step?

Thanks


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Nikos4Life*
> 
> Any place where I can read a guide to do the shunt mod step-by-step?
> 
> Thanks


Watch this video:

(embedded video)


----------



## Nikos4Life

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Watch this video:
> 
> *
> 
> 
> 
> *


Thanks!!

Also found this thread:

http://www.overclock.net/t/1608437/tutorial-power-target-limit-hardware-mod-shunt-mod-for-titan-x-and-many-other-nvidia-gpus

Thanks


----------



## Exnetic

A shunt mod is mainly for the power target, so the card won't hit the limit and clock itself down. It may still clock down a tiny, tiny bit if the power target is reached, but not often at all, and you get far more headroom. The temps need to stay down though, so a waterblock is needed!

http://www.overclock.net/t/1608437/tutorial-power-target-limit-hardware-mod-shunt-mod-for-titan-x-and-many-other-nvidia-gpus
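For the curious, the arithmetic behind why a shunt mod raises the effective power target can be sketched like this (illustrative values only; the 5mOhm shunt and 250W limit below are assumptions, not specs for any particular card):

```python
# Sketch of the arithmetic behind a shunt mod. Illustrative values only:
# the 5 mOhm shunt and 250 W limit below are assumptions, not specs for
# any particular card.
#
# The card estimates current from the voltage drop across a small shunt
# resistor of known value. Bridging the shunt with extra conductance
# lowers the effective resistance, so the same real current produces a
# smaller drop and the card under-reports its power draw.

def effective_power_limit(stock_shunt_mohm, bridge_mohm, stock_limit_w):
    """Real power the card can draw before it *thinks* it hit the limit."""
    # Parallel resistance of the stock shunt and the bridge.
    r_eff = (stock_shunt_mohm * bridge_mohm) / (stock_shunt_mohm + bridge_mohm)
    underreport = stock_shunt_mohm / r_eff  # factor by which power reads low
    return stock_limit_w * underreport

# Bridging a 5 mOhm shunt with another 5 mOhm halves the effective
# resistance, so a 250 W limit behaves like roughly 500 W.
print(effective_power_limit(5.0, 5.0, 250.0))  # 500.0
```

Which is also why the temps warning above matters: the card really is drawing that extra power, it just doesn't know it.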


----------



## Nikos4Life

Quote:


> Originally Posted by *Exnetic*
> 
> A shunt mod is more for power target, so it wont hitt the limit and and clocks it self down, it may clock down just a tiny tiny bit if the power target reaches but not often at all, but far more headroom, the temps needs to stay down, so waterbloock is needed!
> 
> http://www.overclock.net/t/1608437/tutorial-power-target-limit-hardware-mod-shunt-mod-for-titan-x-and-many-other-nvidia-gpus


I do have them watercooled. I am going to split my current loop into two separate ones, so the CPU isn't heating up the same liquid as the GPUs.

Thanks buddy!


----------



## Enapace

Quote:


> Originally Posted by *Nikos4Life*
> 
> I do have them watercooled, I am going to change my current loop from one to two different ones, so the CPU is not heating up also the same liquid as the GPUs.
> 
> Thanks buddy!


This is the reason I'm going for a dual-loop system as well, as I want to use two different coolants. I'm just a bit worried that I'm going to struggle with only a 360mm radiator for Titan Pascal SLI.


----------



## Murlocke

Quote:


> Originally Posted by *BelowAverageIQ*
> 
> I can guarantee if I buy the Titan XP today, the 1080Ti will be announced tomorrow and it will be a XP killer. Guaranteed.
> 
> Then you guys and gals will hate me, because of what I have done


I can guarantee that will never happen. With every single Titan, people worry about/claim this. I know because I've bought every single Titan, and every single time multiple people tell me the Ti will be better and cheaper. It never happens, and it never will happen. The Titan has always been the best-performing card of its generation.

The 1080 Ti will at best be equal to a TXP with less VRAM, but more than likely a few percent slower, like previous Ti versions. It will be far cheaper, so at worst you'll feel like you wasted $400 for a few percent and some more VRAM.


----------



## ESRCJ

What kind of core clocks are you folks reaching with liquid cooling? My Predator 360 replacement (first was leaking out of the box) is coming soon and I can't wait to properly cool my Titan XP with the pre-filled EK water block. Is 2200MHz for 3DMark reasonable under liquid cooling? What about 2100MHz for consistent gaming?


----------



## unreality

Quote:


> Originally Posted by *gridironcpj*
> 
> What kind of core clocks are you folks reaching with liquid cooling? My Predator 360 replacement (first was leaking out of the box) is coming soon and I can't wait to properly cool my Titan XP with the pre-filled EK water block. Is 2200MHz for 3DMark reasonable under liquid cooling? What about 2100MHz for consistent gaming?


Most people are getting 2050-2100MHz under water. You won't get much higher without extra voltage, which eats into the max PT @ 120%, so it's not worth it. The watercooling "only" helps by stabilizing those clocks at a low temperature, because on air you will get throttled to ~1900MHz really fast, even at 100% fan speed.


----------



## Nikos4Life

Quote:


> Originally Posted by *gridironcpj*
> 
> What kind of core clocks are you folks reaching with liquid cooling? My Predator 360 replacement (first was leaking out of the box) is coming soon and I can't wait to properly cool my Titan XP with the pre-filled EK water block. Is 2200MHz for 3DMark reasonable under liquid cooling? What about 2100MHz for consistent gaming?


More interesting than the clocks are the temperatures people are getting. I would like to know your temps under load, and if others are willing to share their temps that would be really nice.

Thanks


----------



## Jpmboy

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Have a Titan X Maxwell, moved to a TXP, no regrets. *Voltage control on a 1080ti won't matter much with this architecture which is what will be needed to get past TXP performance*. So TXP will be king on water or air over the 1080ti.
> 
> We are close, to the rumored announcement of the 1080ti, so you can wait and see. But because you are going water and the voltage does really nothing on water or air for pascal, just go for the big one.


^^This
The only advantage a 1080 Ti might have with 3rd-party assembly is an unlocked BIOS. It's gonna need something tho, since the gap between the highest-performing current 1080s and even mid-performing TXPs is much wider this generation than in previous generations, where a "Ti" version had incremental improvements over the majority of Titan cards lol - except Mr.T's cards!







Quote:


> Originally Posted by *Nikos4Life*
> 
> More interesting than the clock is the temperature they are getting, I would like to know your temps under load and if others are willing to share their temps will be really nice
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks


Folding, gaming or benchmarking, my two WC TXPs never get out of the 30s. Loop temp is in the mid-to-high(er) 20s, so basically 10C above loop temperature. These have been running overnight:
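As a side note, that "10C above loop temperature" figure can be turned into a rough coolant-to-die thermal resistance (a sketch only; the 250W figure is just the TXP's stock power limit, not a measured draw):

```python
# Turning "about 10C above loop temperature" into a rough coolant-to-die
# thermal resistance. Illustrative only: the 250 W figure is the TXP's
# stock power limit, not a measured draw, and "die temp" here is the
# reported GPU temperature.

def thermal_resistance_c_per_w(gpu_temp_c, loop_temp_c, power_w):
    """Approximate coolant-to-die thermal resistance in C/W."""
    return (gpu_temp_c - loop_temp_c) / power_w

# ~10 C of rise at roughly 250 W is about 0.04 C/W, plausible for a
# well-mounted full-cover block.
print(thermal_resistance_c_per_w(38.0, 28.0, 250.0))  # 0.04
```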


----------



## CptSpig

Quote:


> Originally Posted by *BelowAverageIQ*
> 
> I can guarantee if I buy the Titan XP today, the 1080Ti will be announced tomorrow and it will be a XP killer. Guaranteed.
> 
> Then you guys and gals will hate me, because of what I have done


The Ti will never be an XP killer.







I went from two EVGA 980 KingPins in SLI to a single Titan X Pascal. Fire Strike went from a score of 21540 to 23580 with the single XP. It's a no brainer. Just do it, do it now!









http://s1164.photobucket.com/user/CptSpig/media/Fire Strike_zpsakdpcfvr.jpg.html


----------



## Nikos4Life

Quote:


> Originally Posted by *Jpmboy*
> 
> ^^This
> Only advantage a 1080Ti might have with 3rd party assembly is an unlocked bios. It's gonna need something tho, since the gap between the highest performing current 1080s and even mid performing TXPs is much wider this generation than in previous generations where a "Ti" version had incremental improvements over the majority of Titan cards lol - except Mr.T's cards!
> 
> 
> 
> 
> 
> 
> 
> .
> folding, gaming or benchmarking, my two WC TXPs never get out of the 30s. Loop temp in the mid to high(er) 20s. Basically, 10C above loop temperature. these have been running overnight:


HOLY SH*** those temps seem unreal to me!

Can you share your loop components please? (and loop order)

I do have both of them under water, but I am thinking that maybe my loop is not properly set up.

Both cards under load hit almost 60°C; one of them hits 62°C.

Thanks buddy, and awesome job there with your temps!


----------



## Jpmboy

Quote:


> Originally Posted by *Nikos4Life*
> 
> HOLY SH*** those temps seems ureal to me!
> 
> Can you share your loop components please? (and loop order
> 
> 
> 
> 
> 
> 
> 
> )
> 
> I do have both of them WC but I am thinking that maybe my loop is not properly setup.
> 
> Both cards under load hit almost 60ºC one of them hits 62ºC.
> 
> Thanks buddy! and awesome job there with your temps


The components are in the rig drop-down in my sig, but here ya go:
2 DDC1T pumps
1 XSPC fat 360 rad
1 Aquacomputer Gigant (4x420 rads with 1 220 fan)
EK copper blocks (mounted with Thermal Grizzly Kryonaut TIM)
Plain distilled water coolant (~2gal) with 10mL Redline Water Wetter.
Pump/Res -> rad(s) -> CPU -> GPUs -> pump/res.
That's all. The main thing is to have good (clean) airflow for the rads and to avoid recirculating already-heated air.



The 2 TXMs in the Caselabs SM8 (these run at 1.274V via BIOS mod) in the picture will hit 43C while folding or gaming, etc. That rig has 2 thin 360 rads in the top mount, 1 D5 pump, and 6 Cougar fans in "pull" only. Again, EK nickel-plated copper blocks with Grizzly TIM.


----------



## jhowell1030

Quote:


> Originally Posted by *s1rrah*
> 
> ...since the GTX670 days so many decades ago...
> 
> I


Many decades ago?
Quote:


> Originally Posted by *CptSpig*
> 
> The ti will never be a XP killer.
> 
> 
> 
> 
> 
> 
> 
> I went from two EVGA 980 KingPin's in SLI to a single Titan X Pascal. Fire Strike went from a score of 21540 to 23580 with the single XP. It's a no brainier. Just do it do it now!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://s1164.photobucket.com/user/CptSpig/media/Fire Strike_zpsakdpcfvr.jpg.html

I made the exact same move! Although, I have no idea how my Fire Strike scores have changed, because I have yet to get it to work. It used to finish the benchmark but not show a score. Now it's just stuck on the "gathering System Info" screen before the benchmark begins.

I deleted it from my Steam library. Might try to reinstall and see if that fixes anything. Hopefully I'll have enough time soon to assemble the watercooling stuff. Still on the fence though about grabbing a 2nd pump/res so I have the CPU and GPU on separate loops.


----------



## CptSpig

Quote:


> Originally Posted by *jhowell1030*
> 
> Many decades ago?
> I made the exact same move! Although, I have no idea how my Fire Strike test scores have changed because I have yet to get it to work. It used to finish the benchmark and not have a score. Now, it's just stuck on the "gathering System Info" screen before the benchmark begins.
> 
> I deleted it from my steam library. Might try to reinstall and see if that fixes anything. Hopefully I'll have enough time soon to assemble the watercooling stuff. Stuck on the fence thoughabout grabbing a 2nd pump/res So i have the CPU and GPU on separate loops.


Download from 3DMark and run it from your HD. Don't use the latest Nvidia drivers; it takes 3DMark a while to catch up. Make sure you have stable OCs on your components, verified with a good stress test, before running benchmarks. I have mine on an EKWB Predator 360 QDC and a full-cover waterblock, QDC pre-filled. I am very happy with the temps. Good luck


----------



## jhowell1030

Quote:


> Originally Posted by *CptSpig*
> 
> Download from 3D Mark and run from your HD. Don't use the latest Nvidia drivers it takes 3D mark a while to update. Make sure you have stable OC's on your components with a good stress test before running bench marks. I have mine on a EKWB Predator 360 QDC and a full cover water block QDC pre-fill. I am very happy with the temp's. Good Luck


Can't do a regular install on just the HD to run it. I got it for free years ago and have no idea what the serial is for it.


----------



## Blaise Pascal

CptSpig, I want you to know that it's my personal mission right now to reach your benchmark score as soon as another cold front hits the Texas coast! haha. I must break into the 1%! [Still on air cooling for the GPU.] 1911MHz is the best that I can do on the core without thermal/power issues at the moment. Based on MSI, it seems like the benchmark isn't CPU-limited, but I guess I could be wrong.

Current best effort with hot (72 inside) weather: http://www.3dmark.com/fs/11023734


----------



## CptSpig

Quote:


> Originally Posted by *jhowell1030*
> 
> Can't do a regular install on just the HD to run it. I got it for free years ago and have no idea what the serial is for it.


Go to this link. Go to the bottom of the page and you can download the basic edition and get a single benchmark in each test for free.
http://www.futuremark.com/benchmarks/3dmark?_ga=1.153706622.734057688.1463431589


----------



## jhowell1030

Quote:


> Originally Posted by *CptSpig*
> 
> Go to this link. Go to the bottom of the page and you can download the basic edition and get a single benchmark in each test for free.
> http://www.futuremark.com/benchmarks/3dmark?_ga=1.153706622.734057688.1463431589


Thanks, I may have to do that soon. There used to be a way you could look up your serials on Steam, but I couldn't find a way to do so with 3DMark.


----------



## Lee0

Before I do something stupid, I came here to ask what you guys think. Is it a good idea to put an Arctic Cooling Accelero Xtreme IV on the TXP? I'll also swap out the thermal paste for Arctic Silver 5. Has anyone else got some experience with this?


----------



## CptSpig

Quote:


> Originally Posted by *Blaise Pascal*
> 
> CptSpig, I want you to know that it's my personal mission right now to reach your benchmark as soon as another cold front hits the Texas coast! haha. I must break into the 1%! [Still on air cooling for GPU.] 1911Mhz is the best that I can do on the core without thermal/power issues at the moment. Based on MSI, it seems like the benchmark isn't CPU limiting, but i guess I could be wrong.
> 
> Current best effort with hot (72 inside) weather: http://www.3dmark.com/fs/11023734


Here is the link to my score and settings to help you with your quest!
http://www.3dmark.com/fs/10633858


----------



## Viveacious

Quote:


> Originally Posted by *Lee0*
> 
> Before I do something stupid I came here to ask what you guys think. Is it a good idea to put a Arctic Cooling Accelero Xtreme IV on the TXP? I'll also swap out the thermal paste for arctic silver 5. Has anyone else got some experience with this?


I have one on mine. Temps are around 18C lower than on the stock cooler. I'm OC'd to 2000MHz and temps are staying at 60C or under when gaming (they were peaking at ~78-80C before and throttling the clock).

It's a bit difficult to get on due to the 50 small screws that have to be removed, but if you've swapped coolers before then you should be able to handle it.

The Accelero... well, it's simply massive. Taking into account the backplate heatsink, it has, in total, around 5-6x the surface area of the stock heatsink.


----------



## Lee0

Quote:


> Originally Posted by *Viveacious*
> 
> I have one on mine. Temps are around 18c lower than on the stock cooler. I'm OC'd to 2000mhz and temps are staying at 60c or under when gaming. (they were peaking ~78-80 before and throttling the clock)
> 
> It's a bit difficult to get on due to the 50 small screws that have to be removed, but if you've swapped coolers before then you should be able to handle it.
> 
> The Accelero.. well it's simply massive. Taking into account the backplate heat sink, it has, in total, around 5-6x the surface area of the stock heat sink.


Ok thanks!


----------



## Viveacious

Quote:


> Originally Posted by *Lee0*
> 
> Ok thanks!


No problem. One more thing - this was my first Accelero, and it was a little different than other coolers I've worked with in that the thermal pads for the VRMs and RAM actually mount to the back of the PCB, instead of the front, to draw heat up and through the backplate heatsink. The front heatsink is entirely for the GPU.


----------



## Jpmboy

Quote:


> Originally Posted by *Viveacious*
> 
> No problem. One more thing - this was my first Accelero, and it was a little different than other coolers I've worked with in that the *thermal pads for the VRMs and RAM actually mount to the back of the PCB*, instead of the front, to draw heat up and through the backplate heatsink. The front heatsink is entirely for the GPU.


Wut? Really? Can you plz post a pic if possible?


----------



## Lee0

Yeah, I read that they have some patent (might be pending) on the passive backplate heatsink.


----------



## Viveacious

Quote:


> Originally Posted by *Jpmboy*
> 
> wut? Really? CAn you plz post a pic if possible.


I didn't take any photos from the back of it before installing, and my PC is really hard to get into (I'm using all 12 of my USB ports for stuff and it's stacked on top of my 10-core renderbox) so I can't share any.

Basically, you just have to line the thermal pads up with the RAM chips and the VRMs as best you can. The thermal pads are nice and thick. The Accelero IV is different than the III in this way. This is supposed to be a superior cooling method, according to them. If you've laid your hand on a GPU backplate before when it's under load then you probably agree with this design. The heatsink on the front is for the GPU only and has no surface contact with the VRMs or RAM.

Here's the only pic I took of it after it was installed. The support bracket is covering up the view of the backplate heat sink (it's black), but if you look in the upper right you can see a small piece of it. It runs the entire length of the back of the card.


----------



## Jpmboy

Quote:


> Originally Posted by *Viveacious*
> 
> I didn't take any photos from the back of it before installing, and my PC is really hard to get into (I'm using all 12 of my USB ports for stuff and it's stacked on top of my 10-core renderbox) so I can't share any.
> 
> Basically, you just have to line the thermal pads up with the RAM chips and the VRMs as best you can. The thermal pads are nice and thick. The Accelero IV is different than the III in this way. This is supposed to be a superior cooling method, according to them. If you've laid your hand on a GPU backplate before when it's under load then you probably agree with this design. The heatsink on the front is for the GPU only and has no surface contact with the VRMs or RAM.


Interesting backside heatsink for an AIO. The stock backplate is not there for cooling; it's there to cover some delicate ICs on the back of the PCB, and maybe for a little structural stiffness. EK backplates do act as a heatsink, along with active cooling of the ICs on the top side of the card. Unfortunately, there is no DTS in the memory ICs on these cards, so the effectiveness of any of these methods is really unknown.


----------



## Viveacious

Quote:


> Originally Posted by *Jpmboy*
> 
> The stock backplate is not there for cooling


Didn't say it was.

That's pretty obvious from the fact that there's no thermal material there. I just said that it gets hot back there, and I can see their reasoning for this new design.


----------



## jhowell1030

Watercoolers out there: I currently have everything needed for one loop for both the GPU and CPU. I was entertaining the idea of doing a separate loop for each. Have any of you out there experimented with both?

Since becoming a dad, I really don't game anywhere near as much as I used to. I just hate the idea of warming up the CPU any more than necessary by dumping extra heat from the GPU its way. Currently the CPU is on a Kraken X61 and the warmest it gets while gaming is in the mid 40s. Those of you out there with only one loop, what can I expect(ish) to see?


----------



## Enapace

Quote:


> Originally Posted by *jhowell1030*
> 
> Watercoolers out there: I currently have everything needed for one loop for both the GPU and CPU. I was entertaining the idea if doing a separate loop for each. Any of you out there experimented with both?
> 
> Since becoming a dad, I really don't game anywhere near as much as I used to. I just hate the idea of warming up the CPU any more than necessary by dumping extra heat from the GPU it's way. Currently the cpuis on a kraken x61 and the warmest it get's while gaming is in the mid 40's. Those of you out there with only one loop, what can I expect(ish) to see?


I'm planning on doing separate loops: one for my CPU and a second for my two Titan Pascals. 2 EK PE 360mm rads, one for each loop, will hopefully be enough. I'll let you know how it works out if you want.


----------



## jhowell1030

Quote:


> Originally Posted by *Enapace*
> 
> I'm planning on doing a seperate loop one for my CPU and second for my two Titan Pascal. 2 EK PE360MM Rads one for each loop hopefully going be enough will let you know how it works out if you want.


I was thinking the same. As it sits right now, I have all the parts to put in a single loop with a 360 and a 280 rad for just one CPU and one GPU. I didn't know if having two loops was really worth throwing down another $200 for more parts.


----------



## Enapace

Quote:


> Originally Posted by *jhowell1030*
> 
> I was thinking the same. As it sits right now I have all the parts to put in a single loop with a 360 and 280 rad for just once CPU and one GPU. I Didn't know if having two loops was really worth throwing down another $200 for more parts.


I'm doing it because I have a completely custom case with built-in passthrough and 2 custom reservoirs in it, so it's for looks as well as keeping the heat separate.


----------



## Jpmboy

Quote:


> Originally Posted by *Viveacious*
> 
> Didn't say it was.
> 
> 
> 
> 
> 
> 
> 
> That's pretty obvious from the fact that there's no thermal material there. Just said that it gets hot back there, and I can see their reasoning for this new design.


Thanks for posting the pic! Yeah, the stock cooler is just that... not the best way to cool these things. But no doubt, the TXP (and 1080) are very cool-running cards compared to any previous-generation halo product.


----------



## Nautilus

A single 360 rad is more than enough for a Titan XP and your overclocked CPU (4 or 6 core). You can even do Titan XP SLI with it, but with push-pull and high fan speeds.

I'm using a modded Predator 360 AIO kit on a Titan XP and 6700K; temps rarely exceed 47C on the GPU.
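For anyone wondering how far a single 360 stretches, the back-of-the-envelope arithmetic looks like this (a rough sketch; the ~100W-per-120mm rule of thumb and the wattage figures are assumptions, not measurements from anyone's rig in this thread):

```python
# Back-of-the-envelope radiator sizing. Rule of thumb only: roughly 100 W
# of heat per 120 mm of radiator at moderate fan speed, more with
# push-pull and high RPM. The wattage figures below are assumptions, not
# measurements from anyone's rig.

def rad_sections_needed(component_watts, watts_per_120mm=100):
    """Total heat load and the minimum number of 120 mm rad sections."""
    total = sum(component_watts)
    sections = -(-total // watts_per_120mm)  # ceiling division
    return total, sections

# One Titan XP near its 250 W limit plus an overclocked quad-core around
# 150 W is a 400 W load. The rule of thumb asks for 4 sections, so a 360
# (3 sections) only keeps up with push-pull and higher fan speeds, which
# matches the caveat above.
print(rad_sections_needed([250, 150]))  # (400, 4)
```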


----------



## KillerBee33

Quote:


> Originally Posted by *Nautilus*
> 
> A single 360 rad is more than enough for Titan XP and your overclocked CPU (4 or 6 core). You can even do Titan XP SLI with it, but with push-pull and high fan speeds.
> 
> I'm using a modded predator 360 AIO kit on titan xp and 6700k. temps rarely exceed 47 on the gpu.


What temperatures do you get with that loop?


----------



## Nautilus

Quote:


> Originally Posted by *KillerBee33*
> 
> What temperatures you get with that loop?


GPU:

idle: 28C

load: 47C (skyrim)

CPU:

idle: 27C

load: 70C (prime95)

But please note that 70C is the temp I get with my extreme OC. With my daily OC, which is 4.6 at 1.36V, I get a max about 10 degrees lower.


----------



## KillerBee33

Quote:


> Originally Posted by *Nautilus*
> 
> GPU:
> 
> idle: 28C
> 
> load: 47C (skyrim)
> 
> CPU:
> 
> idle: 27C
> 
> load: 70C (prime95)
> 
> But please note that 70C is the temp i get with my extreme OC. With my daily OC, which is 4.6 v1.36. I get 10 degrees lower max.


What's Skyrim?

Have you tried TimeSpy with 3DMark?
http://www.guru3d.com/files-details/3dmark-download.html


----------



## Nautilus

Quote:


> Originally Posted by *KillerBee33*
> 
> What's Skyrim?
> 
> 
> 
> 
> 
> 
> 
> 
> Have you tried TimeSpy with 3DMark?
> http://www.guru3d.com/files-details/3dmark-download.html


3DMark won't heat up the GPU more than Skyrim, believe me. I play it without Vsync and GPU usage is always 99%. It is also the Special Edition, and heavily modded.


----------



## KillerBee33

Quote:


> Originally Posted by *Nautilus*
> 
> 3D Mark won't heat up the GPU more than Skyrim believe me. I play it without Vsync and GPU usage is always 99%. It is also the Special Edition and heavily modded.


You won't know until you try. Got any newer games?


----------



## Nikos4Life

Quote:


> Originally Posted by *Jpmboy*
> 
> components are in the rig drop down in my sig, but here ya go:
> 2 DDC1T pumps
> 1 XSPC fat 360 rad
> 1 Aquacomputer Gigant (4x420 rads with 1 220 fan)
> EK copper blocks (mounted with grizzly kryonaut TIM)
> Plain distilled water coolant (~ 2gal) with 10mL redline water wetter.
> Pump/Res -> rad(s) -> CPU -> GPUs -> pump/Res.
> that's all. The main thing is to have good (clean) air flow for the rads and aviod recirc already heated air.
> 
> 
> 
> the 2 TXMs in the Caselabs SM8 (these run at 1.274V via bios mod) in the picture will hit 43C while folding or gaming.. etc. that has 2 thin 360 rads in the top mount, 1 D5 pump. and 6 cougar fans in "pull" only. Again, EK nickel-plated copper blocks with Grizzly TIM.


Your loop seems incredible, well done buddy.

I am planning to change my current loop, which is: Radiator 1 -> CPU -> TXP 1 -> TXP 2 -> Pump -> Radiator 2

To this one:


Is there anything there you would change? (Parts, order, or whatever.)
Any tips will be more than welcome.


Thanks people


----------



## Nautilus

Quote:


> Originally Posted by *KillerBee33*
> 
> You won't know until you try, got any newer games ?


I ran 2 loops of Unigine Valley for you. The highest GPU temp was 48C.


----------



## KillerBee33

Quote:


> Originally Posted by *Nautilus*
> 
> I ran 2 loops of Unigine Valley for you. Highest gpu temp was 48.


Haven't tried Unigine Valley so I wouldn't know, but Mafia III @ 1440p gets me into the mid 50s...
Cranking up the fans doesn't help, so I keep 'em @ 1100RPM and my pump @ 2800-3000RPM.


----------



## Nautilus

Quote:


> Originally Posted by *KillerBee33*
> 
> Haven't tried Unigine V so wouldn't know but MAFIA III @ 1440 gets me to mid 50's....
> Cranking up fans don't help so keep'em @1100rpm and my pump @ 2800-3000RPM


Just ran Witcher 3 all maxed 1440p for 10 minutes, highest I saw was 50C.


----------



## DeathroW31

Hi guys,

I just received my TXP and am still waiting for the EK waterblock & backplate. I will OC for gaming on a 1440p 144Hz screen, and run 3DMark for tests. (No hard benches, no extreme OC.)
Before installing the parts, I'm wondering if I should do the shunt mod to increase the TDP limit... OR wait for a custom BIOS. Does anyone know if we will get that kind of mod?


----------



## KillerBee33

Quote:


> Originally Posted by *Nautilus*
> 
> Just ran Witcher 3 all maxed 1440p for 10 minutes, highest I saw was 50C.


Yeah, that seems to be the general report with a regular PC chassis. I've got 3 120mm fans running through a 320 radiator as intake; I can't seem to find the time to try otherwise, but I think I'll have to...


----------



## bl4ckdot

Quote:


> Originally Posted by *Nautilus*
> 
> Just ran Witcher 3 all maxed 1440p for 10 minutes, highest I saw was 50C.


I do have a rather similar setup (also with a 360 push pull) and have the same temps.


----------



## Jpmboy

Quote:


> Originally Posted by *Nikos4Life*
> 
> Your loop seems incredible
> 
> 
> 
> 
> 
> 
> 
> well done buddy.
> 
> I am planning to change my current loop which is -> Radiator 1 -> CPU -> TXP 1 -> TXP2 -> PUMP -> Radiator 2
> 
> To this one:
> 
> 
> Is there anything there you would change? (Parts/order or whatever)
> Any tips will be more than welcome
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks people


Should work. My preference is to not split the loop, but to configure the flow from CPU to GPU(s) and have the GPUs plumbed for parallel flow (reduces restriction). Even with a 6950X (or 5960X) this flow path works fine, and the effectiveness of serial radiators in lowering coolant temp is maintained (and very noticeable).
The only parts I would add are a flow meter and 2 in-line coolant temp sensors (hot side and cold side). If you know the deltaT and flow rate you can calculate the system cooling power.
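The deltaT-and-flow-rate calculation comes straight from Q = mass flow × specific heat × temperature rise. A minimal sketch, assuming plain distilled water as the coolant (the function name and constants are illustrative, not from this thread):

```python
# Estimate loop cooling power from flow rate and coolant delta-T.
# Assumes plain water: density ~998 g/L, specific heat ~4.186 J/(g*K).

def cooling_power_watts(flow_lpm: float, delta_t_c: float) -> float:
    """Heat removed by the loop in watts.

    flow_lpm:  coolant flow rate in liters per minute
    delta_t_c: hot-side minus cold-side coolant temperature in C
    """
    density_g_per_l = 998.0   # water near room temperature
    specific_heat = 4.186     # J/(g*K)
    grams_per_second = flow_lpm * density_g_per_l / 60.0
    return grams_per_second * specific_heat * delta_t_c

# Example: 4 L/min with a 2C delta-T is roughly 557 W of heat rejected.
print(round(cooling_power_watts(4.0, 2.0)))
```

With in-line temp sensors on both sides of the rads and a flow meter, those two readings drop straight into this formula.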









Honestly... the easiest way, and the most portable to any future rig, is the Aquacomputer 720 XT. This one has been running (24/7/365) for 5 years and is now cooling a 295X2 firebreather. It has previously handled 780 Ti Kingpins, OG Titans, and 980 Ti Kingpins... very solid. A bit pricey, but considering it has cooled 3 different HEDT rigs over the years, the investment is worth it.

Quote:


> Originally Posted by *DeathroW31*
> 
> Hi guys,
> 
> I just received my TXP and still waiting for EK Waterblock & Backplate. I will OC for gaming on 1440p 144hz screen, and run 3DMark for tests. (No hard benchs, no extreme OC).
> Before installing the parts, i'm wondering if i do the shunt mod for increase TDP limit ... OR wait for a custom bios. Do someone know if we will have that kind of mod ?


It is VERY unlikely there will be a custom BIOS, and if you are not doing "extreme" OC you do not need the shunt mod. Basically, if you keep these cards cool, the stock power limit will not hold back your OC without the shunt mod... especially since you are not planning any "extreme" OC.


----------



## DeathroW31

Thanks for the reply. I hit the TDP limit at +120% in GTA 5 (+ Redux mod) and Battlefield 1 without any OC (spikes to +122%), that's why I asked. I haven't tried other games since I have a fresh Windows 10 x64 install. As soon as I have the EK WB fitted, I will run tests and see how far I can go with stock voltage and no mod.


----------



## jhowell1030

Quote:


> Originally Posted by *Nautilus*
> 
> GPU:
> 
> idle: 28C
> 
> load: 47C (skyrim)
> 
> CPU:
> 
> idle: 27C
> 
> load: 70C (prime95)
> 
> But please note that 70C is the temp i get with my extreme OC. With my daily OC, which is 4.6 v1.36. I get 10 degrees lower max.


Thanks a lot, these are helpful! Yeah, I knew with my radiator setup that I'd have plenty of overhead I was just curious as to how much warmer it would make CPU while gaming.


----------



## Jpmboy

Quote:


> Originally Posted by *DeathroW31*
> 
> Thanks for the reply. I hit the TDP limit +120% in GTA 5 (+Redux Mod) and Battlefield 1 without OC (spikes to +122%), that's why i ask. I didn't try others games since i have a Windows 10 x64 fresh install. As soon as i have EK WB fitted, i will run tests and see how far i can go with stock voltage and no mod


Yeah, the PL will drop clock bins, though this may be something you wouldn't want to defeat for 24/7 use in a gaming rig (the VRMs get really hot - I posted this effect, measured with an IR thermometer, right after launch when using uniblocks on the cards). But IME, temperature-induced clock-bin drops are more of a problem for an overclocked day-driver config. Benching? Sure, short the resistors. Just my 2 cents.


----------



## axiumone

Quote:


> Originally Posted by *Jpmboy*
> 
> Yeah, the PL will drop clock bins, though this may be something you wouldn't want to defeat for 24/7 use in a gaming rig (the VRMs get really hot - I posted this effect, measured with an IR thermometer, right after launch when using uniblocks on the cards). But IME, temperature-induced clock-bin drops are more of a problem for an overclocked day-driver config. Benching? Sure, short the resistors. Just my 2 cents.


Oh that's great info. I must have missed the vrm temps that you posted. Would you mind referencing the post? Would love to see it.


----------



## tooterz

Hey folks. Nice to see all the camaraderie here. I had a quick question for those with SLI TXP's. For those who own the new Battlefront, are you able to maintain 60 FPS @ 4k while utilizing SLI? At any preset, at any resolution, my FPS will jump from 30-60 constantly. It's not so bad with a G-sync monitor but it's very annoying and if I can avoid it, I will. DOOM, Witcher 3, ROTR all work fine with SLI. Thanks in advance!









Edit: I own the 32" 4k G-sync monitor from Acer. Vsync is on in NCP, off in-game. Behavior changes when I flip flop but it is still all over the place.


----------



## Jpmboy

Quote:


> Originally Posted by *axiumone*
> 
> Oh that's great info. I must have missed the vrm temps that you posted. Would you mind referencing the post? Would love to see it.


Erm... my post log only holds the last 100... it was back in August in this thread when I took these pictures. The red box indicates the hottest part on the card (>> 85C if I recall). It could be tamed with a Delta fan blowing directly on the PCB.











now:



so I can't get the surface temps any more.


----------



## jhowell1030

Quote:


> Originally Posted by *tooterz*
> 
> Hey folks. Nice to see all the camaraderie here. I had a quick question for those with SLI TXP's. For those who own the new Battlefront, are you able to maintain 60 FPS @ 4k while utilizing SLI? At any preset, at any resolution, my FPS will jump from 30-60 constantly. It's not so bad with a G-sync monitor but it's very annoying and if I can avoid it, I will. DOOM, Witcher 3, ROTR all work fine with SLI. Thanks in advance!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: I own the 32" 4k G-sync monitor from Acer. Vsync is on in NCP, off in-game. Behavior changes when I flip flop but it is still all over the place.


I think folks have already discussed that BF1 performs poorly with SLI.


----------



## tooterz

Quote:


> Originally Posted by *jhowell1030*
> 
> I think folks have already discussed that BF1 performs poorly with SLI.


You're not by chance confusing Battlefield and Battlefront, are you? I'm not sure if I can link articles in OC, but users in this thread: https://forums.geforce.com/default/topic/900688/sli/star-wars-battlefront-poor-performance-and-stuttering-2x-gtx-670/2/ state that SLI is under par and they need to use NV Inspector.


----------



## jhowell1030

Quote:


> Originally Posted by *tooterz*
> 
> You're not by chance confusing Battlefield and Battlefront, are you? I'm not sure if I can link articles in OC, but users in this thread: https://forums.geforce.com/default/topic/900688/sli/star-wars-battlefront-poor-performance-and-stuttering-2x-gtx-670/2/ state that SLI is under par and they need to use NV Inspector.


Yep, I was. My bad!


----------



## unreality

Quote:


> Originally Posted by *tooterz*
> 
> Hey folks. Nice to see all the camaraderie here. I had a quick question for those with SLI TXP's. For those who own the new Battlefront, are you able to maintain 60 FPS @ 4k while utilizing SLI? At any preset, at any resolution, my FPS will jump from 30-60 constantly. It's not so bad with a G-sync monitor but it's very annoying and if I can avoid it, I will. DOOM, Witcher 3, ROTR all work fine with SLI. Thanks in advance!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: I own the 32" 4k G-sync monitor from Acer. Vsync is on in NCP, off in-game. Behavior changes when I flip flop but it is still all over the place.


I actually have the same monitor and I'm running BF1, Witcher 3 and DOOM all at a steady 60 fps with a single card @ 4K. I can actually downclock and undervolt the card and still hold a constant 60 fps, even in BF1 on Ultra.

SLI really seems to be broken for most games.


----------



## Nikos4Life

Finally I broke the 36K points barrier!! yay!
With better airflow I think I can scratch a little bit more there









http://www.3dmark.com/3dm11/11821223
What do you think buddies?


----------



## Jpmboy

Quote:


> Originally Posted by *Nikos4Life*
> 
> Finally I broke the 36K points barrier!! yay!
> With better airflow I think I can scratch a little bit more there
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/11821223
> What do you think buddies?


sub that *here*


----------



## Djootn

Hi,

I'm having the same issue with Battlefield 1. After the patch yesterday there was no flickering anymore with SLI and DX11 (still no SLI support with DX12).
For me, enabling SLI didn't actually increase my fps. I play on an overclocked 4K monitor (70fps) and playing with one card and DX12 gives me 60+ fps.
Usually a steady 70fps on Ultra with resolution scale at 100% (I used to play at 115 but had to lower it after overclocking my screen).
Yesterday I played on DX11 with SLI and had fps drops to 50 or even less, so I actually lost performance.
I've re-enabled DX12 and play on 1 GPU. For Battlefield 1 a single Titan XP is sufficient, so I won't bother checking after future patches whether SLI actually works.
Hope this helps


----------



## tooterz

Quote:


> Originally Posted by *Djootn*
> 
> Hi,
> 
> I'm having the same issue with battlefield one. After the patch yesterday there was no flickering anymore with sli and dx11 (still no sli support with dx12).
> For me enabling sli didn't actually increase my fps. I play on an overclocked 4K monitor (70fps) and playing with one card and dx12 gives me 60+ fps.
> Usually a steady 70fps on ultra with resolution scale to 100% (Used to play on 115 but had to lower after the overclock on my screen).
> Yesterday i've played on dx11 and sli and i had fps drops to 50 or even less, so i've actually lost performance.
> I've re-enabled dx12 an play on 1 gpu. For battlefield one a single titan xp is sufficient so I wont bother anymore to check after following patches if sli actually works.
> Hope this helps


I was referring to Star Wars Battlefront, but I guess the issues may be similar. While 1 TXP is certainly enough at 4K, I'm all for enabling the extra eye candy (resolution upscaling, more AA), even if it's unnecessary.


----------



## sgs2008

No issues for me with Battlefront - great scaling running at 4K on my 27-inch 4K Swift with G-Sync. As far as I can tell it doesn't drop below 60. Now Watch Dogs 2 on the other hand -_-


----------



## ttg35fort

AMD shows off Zen, now called Ryzen, using two Titan X GPUs in SLI.

http://www.anandtech.com/show/10907/amd-gives-more-zen-details-ryzen-34-ghz-nvme-neural-net-prediction-25-mhz-boost-steps

I find it funny that, lately, when AMD demonstrates their GPUs, they use Intel processors, and when they demonstrate their CPUs, they use NVIDIA GPUs. I do understand why they do so, but it is kind of sad for AMD. Hopefully 2017 will come together well for them between Vega and Ryzen. Both Intel and NVIDIA need the competition...


----------



## jhowell1030

Quote:


> Originally Posted by *ttg35fort*
> 
> AMD shows off Zen, now called Ryzen, using two Titan X GPUs in SLI.
> 
> http://www.anandtech.com/show/10907/amd-gives-more-zen-details-ryzen-34-ghz-nvme-neural-net-prediction-25-mhz-boost-steps
> 
> I find it funny that, lately, when AMD demonstrates their GPUs, they use Intel processors, and when they demonstrate their CPUs, they use NVidia GPUs. I do understand why the do so, but it is kind of sad for AMD. Hopefully 2017 will come together well for them between Vega and Ryzen. Both Intel and NVidia need the competition...


I don't think it's sad at all. It's brilliant marketing if you ask me. Not only does it show the full potential of their product (although they should've picked a game that scales better in SLI, but hey... win some, lose most), it also shows that they aren't biased toward their own products for testing.

As consumers...we would like to hope that companies do more stuff like this.


----------



## Blaise Pascal

Totally agreed! Everyone should really be rooting for AMD right now anyways. If their products are roughly comparable to intel and nvidia, then we'll win out big as consumers due to the increased competition!

Sidenote: They had battlefront 3 running on zen/vega simultaneously (for the first public time as far as I know) at the event last night by the way. Seemed to work pretty well, but there was no frame counter. Didn't seem to have any lag or tearing though.


----------



## shiokarai

Well, let's hope this "competition" doesn't turn into something like AMD setting their prices about 5% below NVIDIA's (and Intel's) and bragging about how they're "competing" with them







NVIDIA pushing high-end prices past $1k has set a dangerous precedent, and AMD has every business reason to follow it, unfortunately.


----------



## CptSpig

Quote:


> Originally Posted by *shiokarai*
> 
> Well, let's hope this "competition" doesn't turn out something like AMD setting their prices about 5% less than NVIDIA (and INTEL) and brag about how they're "competing" with them
> 
> 
> 
> 
> 
> 
> 
> NVIDIA pushing high end prices to $1k+ prices has set the dangerous precedent and AMD has every business reason to follow it, unfortunately.


I agree; it would not surprise me in the least.


----------



## jhowell1030

Quote:


> Originally Posted by *shiokarai*
> 
> Well, let's hope this "competition" doesn't turn out something like AMD setting their prices about 5% less than NVIDIA (and INTEL) and brag about how they're "competing" with them
> 
> 
> 
> 
> 
> 
> 
> NVIDIA pushing high end prices to $1k+ prices has set the dangerous precedent and AMD has every business reason to follow it, unfortunately.


While I do agree wholeheartedly...
Quote:


> Originally Posted by *CptSpig*
> 
> I agree would not surprise me in the least.


...you also have to keep in mind the years and $$$ that AMD has thrown into R&D.

That being said...it'll be interesting to see what's available this time next year.


----------



## KillerBee33

http://www.nvidia.com/download/driverResults.aspx/113448/en-us


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> http://www.nvidia.com/download/driverResults.aspx/113448/en-us


thanks. I hope they fixed the folding fail in the last driver.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> thanks. I hope they fixed the folding fail in the last driver.


Had a GPU fail while folding after about an hour, ever since the 375 drivers...


----------



## DNMock

Been tinkering around with OCs on my SLI setup a bit, and I'm starting to wonder if GPU Boost 3.0 doesn't quite know how to handle SLI properly.

One GPU is running at 1.03V, at a slightly lower clock speed, and topping out at 108% of the PL, while the other GPU is running at 1.061V, at a higher clock speed, and is the one slamming the power wall.

This is with the GPUs synced in Afterburner.

Has anyone tinkered around with unsyncing the GPUs in Afterburner and tweaking a bit?


----------



## Jpmboy

Quote:


> Originally Posted by *DNMock*
> 
> Been tinkering around with O/C's on my SLI set up a bit, and I'm starting to wonder if GPU boost 3.0 doesn't quite know how to handle SLI properly.
> 
> When one GPU is running at 1.03, a little slower clock speed, and topping out at 108% of the PL and the other GPU is running at 1.061, at a higher clock speed and the one that is slamming the power wall.
> 
> This is with GPU's synched in afterburner.
> 
> Anyone tinkered around with unsyncing the GPU's in afterburner and tweaking a bit?


That difference is more likely than not due to the "ASIC" quality of the individual cards. Most pairs behave the same way. When I set my 2 at 2088, #1 needs 1.063V, the other needs 1.050V. Unsyncing and using Ctrl-F, Ctrl-L to lock the core freqs still runs the same voltages. The higher-voltage card hits the PL more often.


----------



## DNMock

Quote:


> Originally Posted by *Jpmboy*
> 
> that difference is more likely than not due to the "asic" of the individual cards. Most pairs behave the same way. When I set my 2 at 2088, #1 needs 1.063V the other needs 1.050V. Unsynching and cntrl-F, cntrl -L to lock the core freqs still runs the same voltage. The higher voltage card hit the PL more often.


aah, control F locks the frequencies, didn't know that bit. The fact that the lower voltage card would generally run at a lower clock speed is what bothered me the most. That's the ticket I needed


----------



## Jpmboy

Quote:


> Originally Posted by *DNMock*
> 
> aah, control F locks the frequencies, didn't know that bit. The fact that the lower voltage card would generally run at a lower clock speed is what bothered me the most. That's the ticket I needed


If you do unsync the cards, you need to select the card you want to control... and Ctrl-F, Ctrl-L on each one. It works... may take a little practice to get it right.


----------



## DNMock

Quote:


> Originally Posted by *Jpmboy*
> 
> If you do unsynch the cards, you need to select the card you want to control... and cntrl-F, cntrl-L on each one. It works... may take a little practice to get it right.


Yeah, I remember I think it was with crossfire 7970's maybe when I first started going dual GPUs and overclocking, I didn't know about the sync setting in afterburner so I was constantly having to adjust the overclock twice like that. felt like an R-Tard when I saw the little check box for sync.


----------



## Jpmboy

Quote:


> Originally Posted by *DNMock*
> 
> Yeah, I remember I think it was with crossfire 7970's maybe when I first started going dual GPUs and overclocking, I didn't know about the sync setting in afterburner so I was constantly having to adjust the overclock twice like that. felt like an R-Tard when I saw the little check box for sync.


lol - we all have those moments with this stuff.


----------



## Blaise Pascal

Wooohoo! What's up with their bonobo clock speed though?


----------



## kx11

those clocks are very low

here are mine


----------



## DNMock

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - we all have those moments with this stuff.


So I got it figured out. It wasn't really a big deal, it just seemed to crash more often than it should.

Turns out my two cards are about as polar opposite as you can get:


Spoiler: Warning: Spoiler!






Note 2050 was my target point for clock speed so bottom card had plenty of room.

For giggles I slid the voltage slider all the way up while synced and check out the massive differences.

Turns out I was right, but it probably only really matters when dealing with yin-and-yang GPUs. As luck would have it, I had the GPU shown on the bottom as my primary GPU, with its settings mirrored over, and when GPU Boost got ahold of that OC the cards were so vastly different that I would end up with one running at a higher speed than the other. I swapped them so the card listed at the top in GPU-Z is now the primary card, and while my benchmark scores fell a bit, I didn't have a single hiccup and games seem to run smoother (placebo maybe?). Anyway, now they are both clocking in at the same 2037.

Feel bad for the other card though, it seems so bored. I'll have to give her a solo run sometime over the break and see if it can hit 2100.


----------



## Jpmboy

Quote:


> Originally Posted by *DNMock*
> 
> So I got it figured out. Wasn't really a big deal just seemed to crash more often than it seemed like it should.
> 
> Turns out my two cards are about as polar opposite as you can get:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Note 2050 was my target point for clock speed so bottom card had plenty of room.
> 
> For giggles I slid the voltage slider all the way up while synced and check out the massive differences.
> 
> Turns out I was right, but probably only really matters when dealing with yin and yang gpus. Out of luck I had the GPU shown on the bottom as my primary GPU to have it's settings mirrored over, and when gpu boost got ahold of that O/C the cards were so vastly different I would end up with one running at a higher sped than the other. I swapped them so the card listed on gpuz on the top there is now the primary card, and while my benchmark scores fell a bit, didn't have a single hiccup and games seem to run smoother (placebo maybe?) but anyway, now they are both clocking in at the same 2037.
> 
> Feel bad for the other card though, it's seems so bored. I'll have to try and give her a solo run sometime over the break, see if it can hit 2100.


Yeah, that's the trade-off. Normally you'd want the stronger card in slot 1 (to run solo using MB PCIe lane switches), but when you want all cards to play well together it works best to put the weakest card in slot 1 and let it set the pace, contrary to lore about this. We should be able to do the same by changing the "Master Graphics Card Selection" in AB (that hasn't worked very well for me).


----------



## Chaoszero55

I use two 480 GTS Stealth Black Ice rads hooked up to my titan xp and 6700k + mosfet block on formula VIII and get ~36 C. Without the loud fans, I'm noticing all this coil whine on the titan xp on games like Ark Evolved/Bf1 and I'm not sure if there is any way to dampen the noise.

Question: Is there any way to remove the automatic downclocking on the Titan XP as the temp increases? I'm talking about my GPU running at ~29-36C under load, and there seems to be an automatic downclock at the transition from 34C to 35C, even though that's still a very cool GPU temp.


----------



## jedi95

Quote:


> Originally Posted by *Chaoszero55*
> 
> Question: Is there any way to remove the automatic downclocking on the titan xp as the temp increases? I'm talking about my gpu's running ~29-36C on load and there seems to be an automatic downclock at that transition between 34C to 35C even though that's still a very cool gpu temp.


I have noticed that same downclocking behavior, and there is no way to change it other than keeping the card cooler. I get very similar temps in my build with 2x 480mm rads. I just ignore it since it's only a single clock step.


----------



## DNMock

Quote:


> Originally Posted by *Chaoszero55*
> 
> I use two 480 GTS Stealth Black Ice rads hooked up to my titan xp and 6700k + mosfet block on formula VIII and get ~36 C. Without the loud fans, I'm noticing all this coil whine on the titan xp on games like Ark Evolved/Bf1 and I'm not sure if there is any way to dampen the noise.
> 
> Question: Is there any way to remove the automatic downclocking on the titan xp as the temp increases? I'm talking about my gpu's running ~29-36C on load and there seems to be an automatic downclock at that transition between 34C to 35C even though that's still a very cool gpu temp.


Depends: are you downclocking because of the TDP limit, or because the card isn't being fully utilized or doesn't need to be?

If the first is your problem, then the only solution is to pull your OC back to the point where you don't hit that limit *OR* to hard mod your card via the shunt mod.

If the 2nd is your issue, scroll back to the last page, as JPM just told me about a nice Afterburner trick to lock your GPUs down.

Do note though that it won't prevent the downclocking forced by Boost 3.0 if you are hitting your power limit wall, but it will prevent downclocking due to under-utilization.

From the way it sounds, the first one is your problem, so unfortunately cranking down your OC a bit or hard modding are your only real options, I'm afraid.
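Since the shunt mod keeps coming up: the reason it raises the effective power limit is plain Ohm's-law bookkeeping. The card computes power from the voltage drop across its current-sense shunts, so paralleling each shunt with another resistor makes it under-report. A hedged sketch of the arithmetic (the 5 mOhm stock value is a commonly cited figure, not something confirmed in this thread):

```python
# Shunt-mod arithmetic: paralleling each current-sense shunt lowers the
# resistance the card measures across, so the same real current produces
# a smaller voltage drop and the card under-reports its power draw.

def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

stock_shunt = 0.005   # ohms (5 mOhm, a commonly cited value; verify on your card)
added = 0.005         # soldering an identical resistor on top of each shunt
effective = parallel(stock_shunt, added)

scale = effective / stock_shunt
print(f"card now reports {scale:.0%} of its real power draw")
```

Halving the sensed resistance means a card showing "120%" is really pulling about twice what it reports against the same limit, which is why the VRM-heat caveat matters for 24/7 use.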


----------



## MrKenzie

Quote:


> Originally Posted by *jedi95*
> 
> I have noticed that same downclocking behavior, and there is no way to change it other than keeping the card cooler. I get very similar temps in my build with 2x 480mm rads. I just ignore it since it's only a single clock step.


There are at least 2 downclock steps before 36C, but there's nothing you can do other than making the card run cooler. The roughly 35MHz that you are missing out on won't equate to much in performance loss.
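To put that 35 MHz in perspective, the loss is easy to express as a percentage (back-of-envelope arithmetic; the ~2000 MHz boost clock and ~13 MHz bin size are illustrative assumptions, not measurements from this thread):

```python
# Rough cost of thermal downclocking: GPU Boost 3.0 steps clocks in
# ~13 MHz bins, so ~35 MHz is two to three bins off the top.

lost_mhz = 35.0      # thermal bins lost before 36C
boost_mhz = 2000.0   # illustrative water-cooled TXP boost clock
loss_pct = 100.0 * lost_mhz / boost_mhz
print(f"~{loss_pct:.2f}% clock loss")  # ~1.75%, within run-to-run noise
```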


----------



## Chaoszero55

Quote:


> Originally Posted by *MrKenzie*
> 
> There is at least 2 downclock steps before 36C, but there's nothing you can do other than making the card run cooler. The roughly 35MHz that you are missing out on won't equate to much in performance loss.


True, I know it won't equate to any real world performance but just wanted to see if there was something I was missing


----------



## Jpmboy

Quote:


> Originally Posted by *Chaoszero55*
> 
> True, I know it won't equate to any real world performance but just wanted to see if there was something I was missing


Set the boost to run higher and let it drop a bin or two down to the frequency you want it to run at. This works if the card is not already at its ceiling.


----------



## lanofsong

Hey Titan X Pascal owners,

We are having our monthly Foldathon from Monday the 19th to the 21st, starting at 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see the attached link.

December Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - needs a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong
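For anyone who prefers editing the client config by hand, the three values above (name, passkey, team) end up in FAHClient's config.xml. A rough sketch of what that might look like - the placeholder name and passkey are mine to illustrate, so double-check the syntax against your own client's file:

```xml
<config>
  <!-- Folding@home identity (your OCN name and the passkey from step 1) -->
  <user value='YourOCNName'/>
  <passkey value='your-32-character-passkey-here'/>
  <team value='37726'/>

  <!-- One folding slot per GPU in the rig -->
  <slot id='0' type='GPU'/>
</config>
```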


----------



## Jpmboy

I'm in.


----------



## lanofsong

Quote:


> Originally Posted by *Jpmboy*
> 
> I'm in.


Crazy PPD from you


----------



## Jpmboy

Quote:


> Originally Posted by *lanofsong*
> 
> Crazy PPD from you


these TXPs make _huge_ PPD and don't even break a sweat.


----------



## KillerBee33

Quote:


> Originally Posted by *lanofsong*
> 
> Hey Titan X Pascal owners,
> 
> We are having our monthly Foldathon from Monday 19th - 21st 12noon EST.
> Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.
> 
> December Foldathon
> 
> To get started:
> 
> 1.Get a passkey (allows for speed bonus) - need a valid email address
> http://fah-web.stanford.edu/cgi-bin/getpasskey.py
> 
> 2.Download the folding program:
> http://folding.stanford.edu/
> 
> Enter your folding name (mine is the same as my OCN name)
> Enter your passkey
> Enter Team OCN number - 37726
> 
> later
> lanofsong


Signed In


----------



## lanofsong

Excellent and thank you


----------



## MunneY

Quote:


> Originally Posted by *lanofsong*
> 
> Excellent and thank you


----------



## MunneY

I actually moved my Monsta 480 external to the window so it could suck in cold air from outside... The stupid card is running at 30C and actually upclocked to 2012MHz


----------



## Jpmboy

Quote:


> Originally Posted by *MunneY*
> 
> I actually moved my Monsta 480 External to the window so it could suck in cold air from outside... The stupid card is running 30c and actually upclocked to 2012mhz


The 5960X is probably generating more heat than the Titan.


----------



## Menthol

@Jpmboy
what driver are you using for folding? I seem to be having trouble with cards dropping out and failing

I think I have it going OK now on 372.90


----------



## roccale

My result, air cooled:

+220 +800 no overvolt.

https://postimg.org/image/jruup4rgn/

https://postimg.org/image/kgwos3i1t/

https://postimg.org/image/nwzxa521z/







Please take a look at the FIRE STRIKE graphics score - 32593 points ....


----------



## lanofsong

Quote:


> Originally Posted by *Menthol*
> 
> @Jpmboy
> what driver are you using for folding? I seem to be having trouble with cards dropping out an failing
> 
> I think I have it going OK now on 372.90


373.xx or lower









Note: Once you have folded 50K points for OCN, you can apply for your 'very colorful and most awesome' Postbit







See attached thread. Do not fill in the Folding team name box









http://www.overclock.net/t/1164344/folding-postbit-folding-team-name/0_20


----------



## Jpmboy

Quote:


> Originally Posted by *Menthol*
> 
> @Jpmboy
> what driver are you using for folding? I seem to be having trouble with cards dropping out an failing
> 
> I think I have it going OK now on 372.90


372.90 is what works best for me too (with or without the odd 1080 added to the 2 TXPs).









you got all 4 TXPs loaded?


----------



## Blaise Pascal

JPM, it's possible to effectively SLI a 1080 into two TXP's to compute with??


----------



## Menthol

Quote:


> Originally Posted by *Jpmboy*
> 
> 372.90 is what works best for me too (with or without the odd 1080 added to the 2 TXPs).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> you got all 4 TXPs loaded?


Yes, 6950/4 TXPs and a 6700/1080.
I have never seen temps benching with my chiller like the ones I see folding.


----------



## Menthol

Quote:


> Originally Posted by *Blaise Pascal*
> 
> JPM, it's possible to effectively SLI a 1080 into two TXP's to compute with??


Cards don't need to be in SLI to fold.


----------






## lanofsong

Quote:


> Originally Posted by *Blaise Pascal*
> 
> JPM, it's possible to effectively SLI a 1080 into two TXP's to compute with??


Quote:


> Originally Posted by *Menthol*
> 
> yes 6950/4 txp's and a 6700/1080
> I have never seen temps using my chiller benching that I see folding


Did you sign up for this month's FAT? I don't see you on the list







- not too late to do so








http://folding.axihub.ca/foldathon_signup.php


----------



## Enapace

Quote:


> Originally Posted by *lanofsong*
> 
> Did you sign up for this months FAT? as i don't see you on the list
> 
> 
> 
> 
> 
> 
> 
> - not too late to do so
> 
> 
> 
> 
> 
> 
> 
> 
> http://folding.axihub.ca/foldathon_signup.php


I would join in, but my Titan Pascal SLI setup is still sitting in boxes; I don't have a PSU yet that can power the system. Wish you well.


----------



## lanofsong

Quote:


> Originally Posted by *Enapace*
> 
> I would join in but my Titan Pascal SLI Setup is still sitting in boxes don't have a PSU yet which can power the system wish you well.


We will have another FAT in January 2017, maybe you can let loose all that Pascal Power then. If not, then maybe February for the real biggie - The 2017 Forum Folding War (not all teams are up yet).


----------



## Enapace

Quote:


> Originally Posted by *lanofsong*
> 
> We will have another FAT in January 2017, maybe you can let loose all that Pascal Power then. If not, then maybe February for the real biggie - The 2017 Forum Folding War (not all teams are up yet).


Should be ready in Jan; going to order my PSU soon.


----------



## Menthol

Quote:


> Originally Posted by *lanofsong*
> 
> Did you sign up for this months FAT? as i don't see you on the list
> 
> 
> 
> 
> 
> 
> 
> - not too late to do so
> 
> 
> 
> 
> 
> 
> 
> 
> http://folding.axihub.ca/foldathon_signup.php


I just did that a few minutes ago, I'm kind of slow


----------



## MunneY

Quote:


> Originally Posted by *Jpmboy*
> 
> The 5960X is probably generating more heat than the Titan.


Yup. It's only running in the 50s though at 4.4

Quote:


> Originally Posted by *Menthol*
> 
> yes 6950/4 txp's and a 6700/1080
> I have never seen temps using my chiller benching that I see folding


I have always wanted a chiller, butttttttttttt I live in a high humidity environment.


----------



## KillerBee33

Quote:


> Originally Posted by *MunneY*
> 
> Yup. Its only running in the 50s though at 4.4
> I have always wanted a chiller, butttttttttttt I live in a high humidity environment.


What is that Chiller? I'm running [email protected] 4.6 and run into 70's when folding....


----------



## Dagamus NM

What size of chiller are you running Menthol? I picked up a couple of 800W industrial CO2 laser chillers so I can be like the cool kids.

Actually one is to try and tame a pair of r9 295x2s, but the other can be used with my quad XPs.

I am reading up on everything I need to know about it.

Any not so obvious pointers?


----------



## Jpmboy

Quote:


> Originally Posted by *Blaise Pascal*
> 
> JPM, it's possible to effectively SLI a 1080 into two TXP's to compute with??


for compute - you really want to disable SLI; each card is its own compute slot.
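(For anyone new to folding: this is roughly what per-GPU compute slots look like in a Folding@home `config.xml` — a minimal sketch; the user and team values are placeholders, not anyone's actual settings.)

```xml
<config>
  <!-- identity: placeholder values, put your own here -->
  <user value='YourName'/>
  <team value='37726'/>

  <!-- one GPU slot per card; SLI stays disabled -->
  <slot id='0' type='GPU'/>
  <slot id='1' type='GPU'/>
</config>
```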
Quote:


> Originally Posted by *Menthol*
> 
> yes 6950/4 txp's and a 6700/1080
> I have never seen temps using my chiller benching that I see folding


Yeah - it gives the OC a good workout! I would not fold on the CPU, just the GPUs. The rig is completely usable. Also note, the folding client will run the CPU cache at full bore the entire time you are folding (for I/O), even when the CPU is not folding. Best to keep this at a tame setting.
4 TXP ~ 6M PPD, 1 1080 ~ 800K PPD. Crazy!!
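(Rough arithmetic on those figures, nothing measured here:)

```python
# Back-of-envelope math on the PPD (points per day) figures quoted above
txp_total_ppd = 6_000_000   # ~4 Titan X Pascals combined
gtx1080_ppd = 800_000       # a single GTX 1080

per_txp_ppd = txp_total_ppd / 4
print(per_txp_ppd)                  # 1500000.0 per card
print(per_txp_ppd / gtx1080_ppd)    # 1.875, so each TXP ~1.9x a 1080
```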
Quote:


> Originally Posted by *Menthol*
> 
> I just did that a few minutes ago, I'm kind of slow


He's just being modest.








Quote:


> Originally Posted by *KillerBee33*
> 
> What is that Chiller? I'm running [email protected] 4.6 and run into 70's when folding....





BUT a simple aquarium chiller works just fine.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> 
> 
> 
> BUT a simple aquarium chiller works just fine.


Ouch ....can't have this monstrosity in my Living Room...
Aquarium Chiller....any smaller?


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Ouch ....can't have this monstrosity in my Living Room...
> Aquarium Chiller....any smaller?



somewhat smaller... it's on the floor on the left.








the koolance unit has pump and reservoir built in.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> 
> somewhat smaller... it's on the floor on the left.
> 
> 
> 
> 
> 
> 
> 
> 
> the koolance unit has pump and reservoir built in.


Ehh ... my whole setup is in the middle of the Living Room "full Entertainment" can't find a place for those...


Spoiler: Warning: Spoiler!






Will stick with what it is until I get a bigger place


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Ehh ... my whole setup is in the middle of the Living Room "full Entertainment" can't find a place for those...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Will stick with what it is until i get a bigger place


nice - you get to put this crap in the Living Room. You married? For me it's either the office or the garage.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> nice - you get to put this crap in the Living Room. You married? For me either in my office or the garage.


Was married until my last Bender...


----------



## TheGeneralLee86

My rig with 2 Titan X Pascals in SLI and an i7 6950X Extreme Edition!



This is my favorite build I have done so far!


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Was married until my last Bender...


benders are why I'm still married. eh, gonna stay this way, 'cause she'd never give me half my sht back.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> benders are why I'm still married. eh, gonna stay this way, 'cause she'd never give me half my sht back.


Yeah, splitting up after 10 years is a B*tch... Getting all my crap together and movin' to FL
A place to be when single







also don't have much for her to take....


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Was married until my last Bender...


Quote:


> Originally Posted by *KillerBee33*
> 
> Yeah , splitting up after 10 years is a B*tch...Getting all my crap together and movin' to FL
> *A place to be when single :*thumb: also don't have much for her to take....


oh, you got that right.


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> oh, you got that right.


Get myself a nice studio, a crappy job and start over...


----------



## KillerBee33

I'm giving up on Folding... left it while at work, came back.... CPU ain't doing ***** and the GPU failed, all was set on FULL... what a waste


----------



## Blaise Pascal

double post


----------



## Blaise Pascal

Quote:


> Originally Posted by *lanofsong*
> 
> Did you sign up for this months FAT? as i don't see you on the list
> 
> 
> 
> 
> 
> 
> 
> - not too late to do so
> 
> 
> 
> 
> 
> 
> 
> 
> http://folding.axihub.ca/foldathon_signup.php


Having internet issues at home at the moment. Comcast does NOT want me to fold. Or watch Netflix. Or anything. I've been sneaking over to work at night to use the wifi on my craptop to play lame-ass League of Legends for entertainment LOL


----------



## arrow0309

Quote:


> Originally Posted by *roccale*
> 
> My resut air cooled:
> 
> +220 +800 no overvolt.
> 
> https://postimg.org/image/jruup4rgn/
> 
> https://postimg.org/image/kgwos3i1t/
> 
> https://postimg.org/image/nwzxa521z/
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please take a look at FIRE STRIKE graphic score 32593 points ....


+Rep
Nice score









How was your fanspeed?


----------



## lanofsong

Quote:


> Originally Posted by *KillerBee33*
> 
> I'm giving up on Folding...left it while at work , came back ....CPU ain't doing ***** and GPU Failed all was set on FULL ...what a waste


What drivers are you folding on? 373.xx or lower work really well; anything above will cause Core 0x21 WUs to crash







Don't quit yet - let's see what the problem was first


----------



## Menthol

Quote:


> Originally Posted by *Jpmboy*
> 
> Yeah - it gives the OC a good workout! I would not fold on the CPU, just the GPUs. The rig is completely useable. Also note, the folding client will run the cpu cache at full bore the entire time you are folding for I/O even when the CPU is not folding. Best to keep this at a tame setting
> 4 TXP ~ 6M PPD, 1 1080 ~ 800KPPd. Crazy!!

I don't even know what you mean; I am getting ready to disassemble those TXPs soon anyway, so it is all going full bore till this is done. So many screws to put the stock heatsinks back on
Quote:


> Originally Posted by *KillerBee33*
> 
> I'm giving up on Folding...left it while at work , came back ....CPU ain't doing ***** and GPU Failed all was set on FULL ...what a waste


First time folding for me, driver 372.90 did the trick for me


----------



## Celcius

If you buy a Titan X and it has coil whine, does nvidia give a full refund no questions asked within a certain period of time?


----------



## Jpmboy

Quote:


> Originally Posted by *Menthol*
> 
> Yeah - it gives the OC a good workout! I would not fold on the CPU, just the GPUs. The rig is completely useable. Also note, the folding client will run the cpu cache at full bore the entire time you are folding for I/O even when the CPU is not folding. Best to keep this at a tame setting
> 4 TXP ~ 6M PPD, 1 1080 ~ 800KPPd. Crazy!!
> 
> I don't even know what you mean, I am getting ready to disassemble those TXP's soon anyway so it is all going full bore tell this is done, *so many screws to put the stock heatsinks back on*
> First time folding for me, driver 372.90 did the trick for me


ugh - I dread even thinking about that backplate screw-in-a-screw reassembly.

Quote:


> Originally Posted by *Celcius*
> 
> If you buy a Titan X and it has coil whine, does nvidia give a full refund no questions asked within a certain period of time?


Define "coil" whine, since the TX has no coils.


----------



## Celcius

Quote:


> Originally Posted by *Jpmboy*
> 
> define "coil" whine. since the TX has no coils.


Coil whine = a high-pitched noise emanating from an electronic device that is loud enough to be annoying.


----------



## Jpmboy

Quote:


> Originally Posted by *Celcius*
> 
> Coil whine = a high pitched noise emanating from an electronic device that is loud enough to be annoying.


That's not quite coil whine, but it is annoying. Does it occur only under certain high loads or is it constant, even at idle? If the latter, RMA it under warranty. Whether you get a refund or a replacement, IDK. If it only occurs under heavy load it could be the card, or the PSU. Some chokes will "buzz".


----------



## Jpmboy

Quote:


> Originally Posted by *Menthol*
> 
> Yeah - it gives the OC a good workout! I would not fold on the CPU, just the GPUs. The rig is completely useable. Also note, the folding client will run the cpu cache at full bore the entire time you are folding for I/O even when the CPU is not folding. Best to keep this at a tame setting
> 4 TXP ~ 6M PPD, 1 1080 ~ 800KPPd. Crazy!!
> 
> I don't even know what you mean, I am getting ready to disassemble those TXP's soon anyway so it is all going full bore tell this is done, so many screws to put the stock heatsinks back on
> First time folding for me, driver 372.90 did the trick for me


this is what I was talking about... even with the CPU not folding, only GPUs, the Cache runs at the max clock set in bios all the time (I/O related)


----------



## lilchronic

Quote:


> Originally Posted by *Jpmboy*
> 
> That's not quite coil whine, but it is annoying. Does it occur only under certain high loads or is it constant, even at idle? If the latter, RMA it under warranty. Whether you get a refund or a replacement, IDK. If it only occurs under heavy load it could be the card, or the PSU. Some chokes will "buzz".


Think you all should watch this.
Some good info


----------



## jhowell1030

Quote:


> Originally Posted by *Jpmboy*
> 
> ugh - I dread even thinking about that backplate screw-in-a-screw reassembly.
> define "coil" whine. since the TX has no coils.


Or...since you knew what he was talking about you could drop being facetious.

Seriously...it's not cute.







If you want to "correct"
Quote:


> Originally Posted by *lilchronic*
> 
> Think you all should watch this.
> Some good info


Thanks for supplying information and trying to be helpful instead of being facetious like others.

Seriously guys... the only one laughing at that is you. To everyone else you look pompously conceited.


----------



## Jpmboy

whoa - lighten up.

been a while since I've seen one of these on a graphics card











anyway, as I said, he might get justice if it is producing whine at idle.. but at load very few manufacturers will RMA on that basis, let alone a full refund.

and @jhowell1030 , I asked the question for a specific reason which seems to have escaped your rant-read. FO


----------



## KillerBee33

Quote:


> Originally Posted by *lanofsong*
> 
> What drivers are you folding on? 373.xx or lower work really well, anything above will cause 0X21WU's to crash
> 
> 
> 
> 
> 
> 
> 
> Don't quit yet - let's see what the problem was first


Had this problem since 375, but running 376.33... also have a high-pitched buzzing noise from the PSU when folding only....


----------



## lilchronic

Quote:


> Originally Posted by *Jpmboy*
> 
> whoa - lighten up.
> 
> been a while since I've seen one of these on a graphics card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> anyway, as I said, he might get justice if it is producing whine at idle.. but at load very few manufacturers will RMA on that basis, let alone a full refund.
> 
> and @jhowell1030 , I asked the question for a specific reason which seems to have escaped your rant-read. FO


They're a lot smaller now and have shielding over them so you can't see the coils.

Inductor



Inside are coils....


----------



## lanofsong

Please run 373.xx or lower.


----------



## KillerBee33

Quote:


> Originally Posted by *lanofsong*
> 
> Please run 373.xx or lower.


Not sure what the problem is... been running Medium from 8AM, still goin', no crash....
Is there any way to reset Folding@home settings? I might've messed around with it.


----------



## MunneY

Quote:


> Originally Posted by *jhowell1030*
> 
> Or...since you knew what he was talking about you could drop being facetious.
> 
> Seriously...it's not cute.
> 
> 
> 
> 
> 
> 
> 
> If you want to "correct"
> Thanks for supplying information and trying to be helpful instead of being facetious like others.
> 
> Seriously guys...the only one laughing at that is you. To everyone else you look pompously-conceited.


Maybe if people didn't run around dropping misnomers all the time and spreading misinformation, then we wouldn't have these issues.


----------



## Chaoszero55

Quote:


> Originally Posted by *lilchronic*
> 
> They're a lot smaller now and have shielding over them so you can't see the coils.
> 
> Inductor
> 
> 
> 
> Inside are coils....


So have any of you used coil lacquer to dampen the sound?


----------



## Jpmboy

Quote:


> Originally Posted by *lilchronic*
> 
> They're a lot smaller now and have shielding over them so you can't see the coils.
> 
> Inductor
> 
> 
> 
> Inside are coils....


though none of this is relevant to whether he can get a refund or RMA... where are the coils on the Titan X Pascal?? Those shots are not from this card.


----------



## Celcius

I haven't actually bought one (yet?), I just wanted to know what their return policy was like just in case as I've never ordered directly from nvidia before.


----------



## axiumone

Quote:


> Originally Posted by *Jpmboy*
> 
> though none of this is relevant to whether he can get a refund or RMA... where are the coils on the Titan X Pascal?? Those shots are not from this card.


Aren't these coils?



Edit 1 - Sure seems like it by the ID markings on the housing.

http://www.mouser.com/ds/2/392/products_24-220115.pdf

Edit 2 - So is the bank of chips marked R22 highlighted in dark red next to the phases.
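(Side note: that "R22" marking usually just encodes the inductance, with the 'R' standing in for the decimal point, so R22 would read as 0.22 µH. That's the generic chip-inductor marking convention, not something pulled from the datasheet linked above. A throwaway decoder, if anyone's curious:)

```python
def decode_inductor_marking(code: str) -> float:
    """Decode a 3-character chip inductor marking into microhenries.

    'R' acts as a decimal point ('R22' -> 0.22 uH); otherwise the
    first two digits are the value and the third is a power-of-ten
    multiplier ('221' -> 220 uH). Generic convention, not card-specific.
    """
    if "R" in code:
        return float(code.replace("R", "."))
    return int(code[:2]) * 10 ** int(code[2])

print(decode_inductor_marking("R22"))  # 0.22
print(decode_inductor_marking("221"))  # 220
```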


----------



## Jpmboy

Quote:


> Originally Posted by *axiumone*
> 
> Aren't these coils?
> 
> 
> 
> Edit 1 - Sure seems like it by the ID markings on the housing.
> 
> http://www.mouser.com/ds/2/392/products_24-220115.pdf
> 
> Edit 2 - So is the bank of chips marked R22 highlighted in dark red next to the phases.


I thought those were radial (not axial) encapsulated chokes... IDK, probably best that an EE chime in here.

that said.. in over 3 dozen GPUs in the past few years, I've luckily never had one whine.. had a bunch that buzz with certain loads though.


----------



## arrow0309

Quote:


> Originally Posted by *axiumone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jpmboy*
> 
> though none of this is relevant to whether he can get a refund or RMA... where are the coils on the Titan X Pascal?? Those shots are not from this card.
> 
> 
> 
> Aren't these coils?
> 
> 
> 
> Edit 1 - Sure seems like it by the ID markings on the housing.
> 
> http://www.mouser.com/ds/2/392/products_24-220115.pdf
> 
> Edit 2 - So is the bank of chips marked R22 highlighted in dark red next to the phases.
Click to expand...

Yeah, the R22 are chokes, low profile but they're still (coil) inductors, like these:

https://ae01.alicdn.com/kf/HTB19bfRJFXXXXcKXpXXq6xXFXXXU.jpg

However, since I have this good EVGA G2 PSU, I haven't noticed any coil whine with any of the AMD 290, GTX 980, GTX 980 Ti or the current GTX 1080 (all liquid cooled).
Gonna buy a new Titan X soon and I'll let you know if there's any coil whine.


----------



## Menthol

It does seem strange that some of us never see this coil whine and some do. Like Jpmboy, I have not experienced this in years. Is it faulty house wiring, living close to overhead power lines, other appliances in your house, faulty cards, a cheap power supply?


----------



## axiumone

Quote:


> Originally Posted by *Jpmboy*
> 
> I thought those were radial (not axial) encapsulated chokes... IDK, probably best that an EE chime in here.
> 
> that said.. in over 3 dozen GPUs in the past few years, I've luckily never had one whine.. had a bunch that buzz with certain loads though.


Yeah, that's way beyond my knowledge as well. Same experience for me though. I've had a lot of Maxwell cards, and most of my Pascals buzzed when frame rates go over 200+, but none of them whined or squealed.


----------



## DooRules

Never had that coil whine myself either through many gpus. As mentioned I have heard a buzz under high loads at times.


----------



## lilchronic

Quote:


> Originally Posted by *Jpmboy*
> 
> though none of this is relevant to whether he can get a refund or RMA... where are the coils on the Titan X Pascal?? Those shots are not from this card.


Well, the point of my post was to inform you, so it was totally relevant. I see around 12 inductors on the Pascal Titan X.

The buzzing noise that people say they hear when their card is pushing 200 FPS is coil whine; whine or buzzing, it's all the same thing. Almost every card I have owned has coil whine when pushing high FPS. One example where you can find coil whine is when you close out the Valley bench and get that splash screen at 2000 FPS; the card will make some buzzing noises.

And the people that say they don't hear it probably really don't hear it because of old age or poor hearing. There are frequencies that you can no longer hear when you get older.


----------



## Jpmboy

Quote:


> Originally Posted by *axiumone*
> 
> Yeah, that's way beyond my knowledge as well. Same experience with me though. I've had a lot of Maxwell cards and most of my Pascall buzzed when frame rates go over 200+, but none of them whined or squealed.


Well.. I think I made a 780Ti Kingpin squeal once.








Quote:


> Originally Posted by *lilchronic*
> 
> Well, the point of my post was to inform you, so it was totally relevant. I see around 12 inductors on the Pascal Titan X.
> 
> The buzzing noise that people say they hear when their card is pushing 200 FPS is coil whine; whine or buzzing, it's all the same thing. Almost every card I have owned has coil whine when pushing high FPS. One example where you can find coil whine is when you close out the Valley bench and get that splash screen at 2000 FPS; the card will make some buzzing noises.
> 
> And the people that say they don't hear it probably really don't hear it because of *old age or poor hearing*. There are frequencies that you can no longer hear when you get older.


Pulleze. Thank you for informing me. My hearing is fine. Whether one can get an RMA based on whine or buzz is still questionable IMO. But... now we know you are smart.








The Valley splash screen has done that across many generations of cards. A few game credits do it also. This type of buzz, and lol, coil whine from encapsulated chokes during "normal" use are two different things.
Buzz is very condition specific - and if you need to run the Valley splash screen to hear it - no RMA. A CONTINUOUS whine at idle - yeah, that would be accepted for an RMA.

That said - I'll admit, my right ear may have lost a bit in picking up shrill noises. But it's not from age


----------



## lilchronic

Quote:


> Originally Posted by *Jpmboy*
> 
> Well.. I think I made a 780Ti Kingpin squeal once.
> 
> 
> 
> 
> 
> 
> 
> 
> Pulleze. Thank you for informing me. My hearing is fine. The relevancy as to whether one can get an RMA based on whine or buzz is still questionable IMO. But... now we know you are smart.
> 
> 
> 
> 
> 
> 
> 
> 
> The Valley splash screen has done that across many generations of cards. A few game credits do it also. This type of buzz, and lol, coil whine from encapsulated chokes during "normal" use are two different things.
> Buzz is very condition specific - and if you need to run the valley splash screen to hear it - no RMA. Yeah, A CONTINUOUS whine at idle - yeah, that would be accepted as an RMA.
> 
> That said - I'll admit, my right ear may have lost a bit in picking up shrill noises. But it's not from age


It's a known fact that when you get older you start to lose hearing at high frequencies. That was not directed specifically at you, just in general; sorry for the misunderstanding.

As for asking someone on the forums whether you can RMA a card for coil whine, it's pretty much pointless; you won't get an answer. Best to contact the company you got the card from.


----------



## KillerBee33

http://nvidia.custhelp.com/app/answers/detail/a_id/4288
*Workaround to fix incorrect Folding@home work units.*
Fixed random flashes in Just Cause 3.
Fixed some issues that could lead to Battlefield 1 crash
Fixed SLI texture flickering in Battlefield 1
Fixed corruption in Wargame: Red Dragon game.


----------



## MunneY

Quote:


> Originally Posted by *Jpmboy*
> 
> Well.. I think I made a 780Ti Kingpin squeal once.
> 
> 
> 
> 
> 
> 
> 
> 
> Pulleze. Thank you for informing me. My hearing is fine. The relevancy as to whether one can get an RMA based on whine or buzz is still questionable IMO. But... now we know you are smart.
> 
> 
> 
> 
> 
> 
> 
> 
> The Valley splash screen has done that across many generations of cards. A few game credits do it also. This type of buzz, and lol, coil whine from encapsulated chokes during "normal" use are two different things.
> Buzz is very condition specific - and if you need to run the valley splash screen to hear it - no RMA. Yeah, A CONTINUOUS whine at idle - yeah, that would be accepted as an RMA.
> 
> That said - I'll admit, my right ear may have lost a bit in picking up shrill noises. But it's not from age


I'm guessing cause "bang, bang, bang, bang, bang"

Ya know... As a fellow boomer


----------



## Menthol

I definitely have hearing loss and after a couple long term relationships and several kids I have conditioned myself to tune out whining


----------



## xTesla1856

I'm about to pull the trigger on a new Titan X Pascal (Late to the party, as usual). Since I'm in Switzerland and Nvidia doesn't ship here, it'll involve a little day trip over the German border to pick up the card. What does the owner's club think about the 1080Ti rumors, will we see a 780Ti situation or a 980Ti situation?


----------



## Chaoszero55

Quote:


> Originally Posted by *xTesla1856*
> 
> I'm about to pull the trigger on a new Titan X Pascal (Late to the party, as usual). Since I'm in Switzerland and Nvidia doesn't ship here, it'll involve a little day trip over the German border to pick up the card. What does the owner's club think about the 1080Ti rumors, will we see a 780Ti situation or a 980Ti situation?


I'm sure we will, it's just a matter of when we'll see it. Either way, the Titan should still be slightly more powerful than a theoretical 1080 Ti, but if cost isn't an issue then go for it


----------



## bl4ckdot

Quote:


> Originally Posted by *xTesla1856*
> 
> I'm about to pull the trigger on a new Titan X Pascal (Late to the party, as usual). Since I'm in Switzerland and Nvidia doesn't ship here, it'll involve a little day trip over the German border to pick up the card. What does the owner's club think about the 1080Ti rumors, will we see a 780Ti situation or a 980Ti situation?


I may be wrong, but I don't see the 1080Ti being cheap this generation. So if you are already on the fence, you might as well buy the TX and call it a day.


----------



## Chaoszero55

Quote:


> Originally Posted by *bl4ckdot*
> 
> I may be wrong, but I don't see the 1080Ti being cheap this generation. So if you are already on the edge, you might as well buy the TX and call it a day.


Yep, you get expensive cards when the competition sucks


----------



## xTesla1856

Quote:


> Originally Posted by *bl4ckdot*
> 
> I may be wrong, but I don't see the 1080Ti being cheap this generation. So if you are already on the edge, you might as well buy the TX and call it a day.


Yeah, this was my thinking all along. I'll order a Titan X, slap a block on it and be happy for a long while


----------



## axiumone

Quote:


> Originally Posted by *xTesla1856*
> 
> I'm about to pull the trigger on a new Titan X Pascal (Late to the party, as usual). Since I'm in Switzerland and Nvidia doesn't ship here, it'll involve a little day trip over the German border to pick up the card. What does the owner's club think about the 1080Ti rumors, will we see a 780Ti situation or a 980Ti situation?


CES 2017 is right around the corner in January. I think it's worth waiting just a bit if you haven't picked one up yet.


----------



## BelowAverageIQ

I bit the bullet and just purchased one as well. I had to get it shipped from the USA using a forwarding company. Not the easiest process. The exchange rate is horrible as well.

With CES just a few days away, I was torn about the 1080 Ti. I hope it will be a beast of a card. Yes, I will be disappointed if it crushes the Titan, but at the same time happy that it will be a great card for performance.

I assume, given the lack of information or leaks, that it will be a paper launch. Partner manufacturers probably have started, or will soon start, manufacturing. Companies like EK will then design and start to manufacture water blocks. This will probably take months.

Was I able to wait? Yes. My SLI 980s are going well.

But the Titan is available now, supposedly the same GPU (GP102), the same GDDR5X. Sure the Titan is expensive, but it already has waterblocks and backplates readily available. In fact I received my block and backplate from Performance PC's already!

I will be glad to run a single high-performance card and get away from SLI.

Will carefully install the stock coolers on the 980s and put them both in my son's system, or give one to him to replace his 970 and one to my daughter to replace her 950Ti.


----------



## arrow0309

Quote:


> Originally Posted by *BelowAverageIQ*
> 
> I bit the bullet and just purchased one as well. I had to get it shipped from the USA using a forward company. Not the easiest process. The exchange rate is horrible as well.
> 
> With CES just a few days away, I was torn about the 1080Ti. I hope it will be a beast of a card. Yes I will be disappointed if it crushes the Titan, but at the same time happy that it will be a great card for performance.
> 
> I assume due to the lack of information or leaks that it will be a paper launch. Partner manufacturers, will or have probably started manufacturing. Companies like EK will then design and start to manufacture water blocks. This will probably be months.
> 
> Was I able to wait. yes. My SLI 980's are going well.
> 
> But the Titan is available now, supposedly same GPU (102), same DDR5X. Sure the Titan is expensive, but already has waterblocks and backplates readily available. In fact I received my block and backplate from Performance PC's already!
> 
> I will be glad to run a single high performance card and get away from SLI.
> 
> Will carefully install the stock coolers on the 980's and put them both in my sons system or give one to him to replace his 970 and one to my daughter to replace her 950Ti.


Hi, I'm thinking of ordering one too from nvidia.co.uk (£1099) on 6 Jan (I'm coming back to London on 10 Jan), plus Aqua Computer's nickel/plexi waterblock and active backplate, and I agree with you that it'll probably take months for the water blocks to become available.
As for the 1080 Ti's performance, I don't think it will overtake the Titan X (not even a custom one, since we're putting the Titan X under water anyway)








The rumours increasingly suggest they're going to build a 3200 CUDA core / 320-bit / 10 GB GDDR5(X) card (a bit disappointing though).
Or maybe a 3328 CUDA core one (all the rest like the TX), and they'll prepare to launch a new TX Black or something on a full GP102 with 24 GB (like the Quadro P6000), or a new line (2080 or 1180) on a new refresh (14 nm?) before Volta.
In any case, the upcoming 1080 Ti won't crush the Titan X Pascal


----------



## BigBeard86

What heatsinks would be a good fit for the Titan Pascal VRM? I plan to put a Kraken G10 on it.


----------



## xTesla1856

Welp, the 1080 was sold today, for a very good price I might add. Now I've been hovering over the "Checkout" button over at Nvidia's store all day...


----------



## cookiesowns

Haven't checked in a while. I'm guessing there are still no tools to "uncork" the Titan XPs without resorting to hardware modifications?

That said, my cards have been under the uni blocks with zero passive/active VRM/VRAM cooling, apart from a 120x38mm Panaflo fan sitting on top of the two cards providing occasional airflow (controlled manually via SpeedFan for when I play more graphically intense games). No degradation that I can tell









It's almost funny, since even in the most demanding games at 1440p 165Hz I'm still CPU limited


----------



## czin125

Since both the P100 and the Titan X are 471mm^2, can you unlock the last 256 cores, like unlocking the Fury to the Fury X's remaining 512?


----------



## Chaoszero55

Quote:


> Originally Posted by *cookiesowns*
> 
> Haven't checked in a while. I'm guessing still no tools to "uncork" the Titan XP's without resorting to hardware modifications?
> 
> That said, my cards have been under the uniblocks with 0 passive/active VRM / VRAM cooling apart from a 120x38mm fan Panaflo sitting on top of the two cards getting occasional passive airflows. ( controlled manuallyy via speed fan for when I play some more graphically intense games ) No degradation that I can tell
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's almost funny since even in the most demanding games at 1440P 165Hz I'm still CPU limited


Nope, can't do anything except a hardware mod if you want to change voltage/power on the Titan XP


----------



## Asus11

I don't know why, but I'm thinking of buying a Titan X Pascal

when I first thought about it I was like... whattt, hell no.. then I remembered it's no longer ''£1099'' if I sell my 1080 for 5-600 with waterblock etc

always fooling myself into buying things


----------



## arrow0309

Quote:


> Originally Posted by *Asus11*
> 
> I dont know why but im thinking of buying a titan x pascal
> 
> when I first thought about it I was like...whattt hell no.. *then I remembered its no longer ''£1099''* if I sell my 1080 for 5-600 with waterblock etc
> 
> always fooling myself into buying things


What do you mean?


----------



## arrow0309

Quote:


> Originally Posted by *Asus11*
> 
> I dont know why but im thinking of buying a titan x pascal
> 
> when I first thought about it I was like...whattt hell no.. then I remembered its no longer ''£1099'' if I sell my 1080 for 5-600 with waterblock etc
> 
> *always fooling myself into buying things*


Edit:
Got it, I see what you mean.








And yeah, we're in the same boat!


----------



## QSS-5

Quote:


> Originally Posted by *czin125*
> 
> Since both the P100 and the Titan X are 471mm^2, can you unlock the last 256 cores, like unlocking the Fury's remaining 512 to match the Fury X?


Mate, the P100 is not 471mm^2, it is 610mm^2; the extra space is taken up by double-precision cores.


----------



## Asus11

Quote:


> Originally Posted by *arrow0309*
> 
> Edit:
> Got it what you mean
> 
> 
> 
> 
> 
> 
> 
> 
> And yeah, we're in the same boat!


It's the only way to do it.









I'll probably wait two weeks, then make a decision.

Because tbh I can't be bothered with the 1080 Ti fiasco; it's going to be slower no matter what, unless they give it the same core count, which I doubt.

Also, there's going to be too much hassle with people buying them, price hikes, etc., and waiting for waterblocks is annoying as hell, just like with the 1080.

Also, the Titans hold their value well, I'd say, especially this one, because it's sold by Nvidia only.


----------



## xTesla1856

Quote:


> Originally Posted by *Asus11*
> 
> only way to do it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> will probably wait 2 weeks then make a decision
> 
> cause tbh I cba with fiasco of 1080 ti its going to be slower no matter what unless they give it the same cores which I doubt
> 
> also theres going to be too much hassle with people buying them etc price hikes.. waiting for waterblocks annoying as hell just like the 1080
> 
> also the titans hold money well I would say especially this one because its limited from Nvidia only


My opinion exactly. As a former owner of two Maxwell Titans, they did hold value very well. Also, stock supply issues and retailer price gouging are a nightmare. I remember how horrific the 980 Ti and 1080 launches were here in Switzerland: 7 months after the 980 Ti release, they still weren't available here. And 1080s are still going for 900 at some retailers.

With a Titan, you know what you get.


----------



## czin125

Quote:


> Originally Posted by *QSS-5*
> 
> Mate, the P100 is not 471mm^2, it is 610mm^2; the extra space is taken up by double-precision cores.


http://videocardz.net/nvidia-quadro-p6000-24gb/

I mean the P6000: 471mm^2 with 3840 cores.


----------



## Captain_cannonfodder

I don't know what my Titan XP's gaming performance is like; I bought mine solely for Folding@Home.


----------



## xTesla1856

Whoops


----------



## KillerBee33

Quote:


> Originally Posted by *xTesla1856*
> 
> Whoops


How the F****... I had to pay $106 in taxes here in the US.




----------



## xTesla1856

Quote:


> Originally Posted by *KillerBee33*
> 
> How the F****... I had to pay $106 in taxes here in the US.


I'm using a mail forwarding company in tax-free Delaware.


----------



## KillerBee33

Quote:


> Originally Posted by *xTesla1856*
> 
> I'm using a mail forwarding company in tax-free Delaware.


Heh, when spending $1200 on a GPU, $106 for tax isn't a big deal. Just wanted to know, and now I know...


----------



## Captain_cannonfodder

This is my invoice when I bought mine.


----------



## BelowAverageIQ

Quote:


> Originally Posted by *xTesla1856*
> 
> Whoops


Nice one mate, you bit the bullet and bought it!! Well done.









I purchased mine last week via credit card and sent it to a freight-forwarding company. With the AUD-to-USD exchange rate at 0.69 it was AUD $1739, plus AUD $55 for overnight shipping to Portland. Then an additional AUD $81 to ship it to Australia, where it is currently held up by holidays, DHL and customs... yay.









AUD$1875









I hope it is worth it.

Good luck with the silicon lottery


----------



## arrow0309

So, what have you guys chosen (or already got) to best watercool your Titan X: which water block, EK or Aquacomputer?
I feel like giving the Aquacomputer block and its active backplate a try; I kinda like them (especially the backplate)









https://shop.aquacomputer.de/product_info.php?products_id=3460&XTCsid=5m12t3nudt9624l1ggn6oelsarpbn9ep
https://shop.aquacomputer.de/product_info.php?products_id=3463&XTCsid=5m12t3nudt9624l1ggn6oelsarpbn9ep

On the other hand, I don't want to lose any single bit of cooling performance, and if I find out the EK is the best this time as well, then I'll come back to EK (no more Bitspower whatsoever)

Can any of you share or link anything about it?


----------



## xTesla1856

Quote:


> Originally Posted by *arrow0309*
> 
> So, what do ya guys choose (or taken already) to best watercool your TitanX, which water block, EK or Aquacomputer?
> I feel like going to give the aquacomputer block and its active backplate's a try, kinda like them (especially the backplate)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://shop.aquacomputer.de/product_info.php?products_id=3460&XTCsid=5m12t3nudt9624l1ggn6oelsarpbn9ep
> https://shop.aquacomputer.de/product_info.php?products_id=3463&XTCsid=5m12t3nudt9624l1ggn6oelsarpbn9ep
> 
> On the other hand, I don't want to lose any single bit of cooling performance, and if I find out the EK is the best this time as well, then I'll come back to EK (no more Bitspower whatsoever)
> 
> Any of you can share or link anything about it?


I'll be going with a prefilled QDC block for my Predator 360 from EK. I'm too lazy to build a custom loop yet


----------



## xTesla1856

Quote:


> Originally Posted by *BelowAverageIQ*
> 
> Good luck with the silicon lottery


Same to you my friend


----------



## arrow0309

Quote:


> Originally Posted by *xTesla1856*
> 
> Quote:
> 
> 
> 
> Originally Posted by *arrow0309*
> 
> So, what do ya guys choose (or taken already) to best watercool your TitanX, which water block, EK or Aquacomputer?
> I feel like going to give the aquacomputer block and its active backplate's a try, kinda like them (especially the backplate)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://shop.aquacomputer.de/product_info.php?products_id=3460&XTCsid=5m12t3nudt9624l1ggn6oelsarpbn9ep
> https://shop.aquacomputer.de/product_info.php?products_id=3463&XTCsid=5m12t3nudt9624l1ggn6oelsarpbn9ep
> 
> On the other hand, I don't want to lose any single bit of cooling performance, and if I find out the EK is the best this time as well, then I'll come back to EK (no more Bitspower whatsoever)
> 
> Any of you can share or link anything about it?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll be going with a prefilled QDC block for my Predator 360 from EK. I'm too lazy to build a custom loop yet
Click to expand...

Then EK (once again)
Any backplate as well?


----------



## xTesla1856

Quote:


> Originally Posted by *arrow0309*
> 
> Any backplate as well?


Debating between the EK black or shiny silver...


----------



## KillerBee33

Quote:


> Originally Posted by *xTesla1856*
> 
> Debating between the EK black or shiny silver...


It's not as shiny as it looks on their website; it's got a fingerprint-prone look, just nickel-plated and "ribbed"


----------



## Chaoszero55

I've got the shiny silver and it's pretty nice.


----------



## xTesla1856

I'll probably spring for the shiny one; it should look nice with my Rampage V Edition 10. That said, I was a little sad to ship my 1080 away today. It did 2152MHz on air before throttling down to 2088MHz in games, and took +500 on the memory all day as well. But for the price I sold it for, I really shouldn't complain.


----------



## arrow0309

Found these pics, and I just have to say the shiny nickel backplate looks gorgeous









https://linustechtips.com/main/uploads/gallery/album_3436/gallery_269294_3436_113812.jpg
http://cdn.overclock.net/1/19/19b17421_DSC_0486.jpeg

You guys are a really bad influence








Gonna buy the nickel/plexi EK with the above backplate for £142.94 with free standard shipping (from watercooling.uk)


----------



## xTesla1856

Quote:


> Originally Posted by *arrow0309*
> 
> Found these pics and I just have to say the nickel shiny backplate looks gorgeous
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://linustechtips.com/main/uploads/gallery/album_3436/gallery_269294_3436_113812.jpg
> http://cdn.overclock.net/1/19/19b17421_DSC_0486.jpeg
> 
> You guys are really addictive
> 
> 
> 
> 
> 
> 
> 
> 
> Gonna buy the nickel plexy EK with the above bp for the £142.94 free standard shipping (from the watercooling.uk)


Damn, those pictures just sealed the deal for me. Maybe I should do a build log with all these new parts.


----------



## EniGma1987

Quote:


> Originally Posted by *czin125*
> 
> http://videocardz.net/nvidia-quadro-p6000-24gb/
> 
> I mean the P6000 at 471mm^2 3840 cores


Nvidia tends to laser-cut the disabled die sections, and those cannot be unlocked. Even if Nvidia only disabled them through the BIOS, we would still need a flashable custom BIOS to unlock them, which we do not have and may never have.

Quote:


> Originally Posted by *arrow0309*
> 
> So, what do ya guys choose (or taken already) to best watercool your TitanX, which water block, EK or Aquacomputer?
> I feel like going to give the aquacomputer block and its active backplate's a try, kinda like them (especially the backplate)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://shop.aquacomputer.de/product_info.php?products_id=3460&XTCsid=5m12t3nudt9624l1ggn6oelsarpbn9ep
> https://shop.aquacomputer.de/product_info.php?products_id=3463&XTCsid=5m12t3nudt9624l1ggn6oelsarpbn9ep
> 
> On the other hand, I don't want to lose any single bit of cooling performance, and if I find out the EK is the best this time as well, then I'll come back to EK (no more Bitspower whatsoever)
> 
> Any of you can share or link anything about it?


I still think Heatkiller is the best block in both temperature and looks







The only advantage the Aqua block has is the active backplate, but since the components stay pretty cool, and the VRMs do not lose current capacity at these temperatures, the backplate doesn't actually matter.


----------



## meson1

Quote:


> Originally Posted by *EniGma1987*
> 
> Nvidia tends to laser cut the die sections and they cannot be unlocked. Even if Nvidia only disabled them through the bios, we would still need a flashable custom bios to unlock which we do not have and may not ever have.
> I still think Heatkiller is the best block in both temperature and looks
> 
> 
> 
> 
> 
> 
> 
> The only advantage the Aqua block has is the active backplate, but since the components stay pretty cool, and the VRMs do not lose current capacity at these temperatures, the backplate doesn't actually matter.


The specific sections that get disabled are the ones that didn't test well, are they not?


----------



## arrow0309

Quote:


> Originally Posted by *EniGma1987*
> 
> I still think Heatkiller is the best block in both temperature and looks
> 
> 
> 
> 
> 
> 
> 
> The only advantage the Aqua block has is the active backplate, but since the components stay pretty cool, and the VRMs do not lose current capacity at these temperatures, the backplate doesn't actually matter.


You're right about the VRM temps under water; the look of the Aquacomputer backplate is OK though.
I didn't know Watercool had also made their lineup (Heatkiller IV) for the Titan X Pascal; their best-looking choice IMO is this one:

http://shop.watercool.de/epages/WatercooleK.sf/en_GB/?ObjectPath=/Shops/WatercooleK/Products/15584

And their black aluminium (passive cooling) backplate as well (no more silver steel one):

http://shop.watercool.de/epages/WatercooleK.sf/en_GB/?ObjectPath=/Shops/WatercooleK/Products/16063

How is the Heatkiller's performance?
Don't we have any reviews yet?
I'd also like to see its flow performance as well (EK used to be the winner there)


----------



## Asus11

Quote:


> Originally Posted by *arrow0309*
> 
> You're right about the vrm's temps under water, the look of the aquacomputer bp is OK though.
> I didn't know the Watercool also made their lineup (Heatkiller iv) for the Titan X P, their best looking choice IMO is this one:
> 
> http://shop.watercool.de/epages/WatercooleK.sf/en_GB/?ObjectPath=/Shops/WatercooleK/Products/15584
> 
> And their black aluminium (passive cooling) bp also (no more silver steel one):
> 
> http://shop.watercool.de/epages/WatercooleK.sf/en_GB/?ObjectPath=/Shops/WatercooleK/Products/16063
> 
> How are the Heatkiller's performance?
> Don't we have any review yet?
> I'd also like to see their flow perf. as well (EK used to be a winner there)


Have you taken the plunge then and ordered the Titan X? If so, what made you pull the trigger?

I usually just grab an EK acetal/nickel block and backplate, but I'm boring nowadays









I always used to go plexi, but it always gets stained; the acetal looks just as good as day one. Plus my build is mostly just black and red, no shiny bits


----------



## looniam

Quote:


> Originally Posted by *KillerBee33*
> 
> Its not as shiny as it looks on their website, it's got a fingerprint look just nickeled "ribbed"


for her pleasure™









I'll see myself out now...


----------



## arrow0309

Quote:


> Originally Posted by *Asus11*
> 
> Have you took the plunge then to get the titan X? if so what made you pull the trigger
> 
> 
> 
> 
> 
> 
> 
> ?
> 
> I usually just grab a EK Acetal x nickel and backplate but im boring nowdays
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I used to always go plexi but it always gets stained the acetal looks just as good as day one  plus my build is mostly just black and red no shinys


Not yet








I'll have to order around 6 Jan at the earliest, as I'm coming back from Italy to London on the 10th.
I just wanted to settle on a fancy choice for the waterblock and backplate and (why not) maybe even pull the trigger on ordering them sooner, before the Titan X (so I couldn't change my mind any more)








I know EK's black acetal is supposedly "the best"; I had a 980 Ti G1 with that block but didn't fall in love with it (and I still like to have a look at the block's fins and surroundings through a plexi window once in a while).
And I also have all of my fittings in shiny silver and nickel


----------



## Asus11

Quote:


> Originally Posted by *arrow0309*
> 
> Not yet
> 
> 
> 
> 
> 
> 
> 
> 
> I've to order somehow on 6 Jan (starting with) as I'm coming back from Italy to London on 10
> Just wanted to get a fancy choice for the wb & bp and (why not) even pull the trigger ordering them even sooner, before the Titan X (so I couldn't change my mind no more)
> 
> 
> 
> 
> 
> 
> 
> 
> I know the EK's black acetal is somehow "the best", had a 980 ti G1 with that block but didn't fall in love with it (and I still like to have a look at the wb's fins and surroundings through plexi window once in a while).
> And I also have all of my fittings in shiny silver & nickel


It works out to £123.94 for both the backplate and the block.









I'll probably order from OcUK because of the free delivery.

Also, another reason for me to go acetal is that the front side faces the PSU, so I don't even get to see it anyway, lol; all I get to see is the backplate.

Like this:


----------



## MunneY

I'm sad to say, I may be going to sell my Titan XP...

The Ti's are looming! I was going to go ITX, but I've gone a different route.



> http://instagr.am/p/BOOdivlB5IL%2F/
> on Dec 19, 2016 at 8:43pm PST


----------



## CptSpig

Quote:


> Originally Posted by *MunneY*
> 
> I'm sad to say, I may be going to sell my Titan XP...
> 
> The Ti's are looming! I was going to go ITX, but I've gone a different route.
> 
> 
> http://instagr.am/p/BOOdivlB5IL%2F/
> on Dec 19, 2016 at 8:43pm PST


How much?


----------



## arrow0309

Quote:


> Originally Posted by *Asus11*
> 
> works out £123.94 for both backplate and block
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ill probably order from ocuk because free del
> 
> also another reason for me to go acetal is because the front side faces the PSU so I dont even get to see it anyway lol all I get to see is the backplate lol
> 
> like this


Nice, and good prices at OcUK








I was close to ordering an acetal/nickel EK, but then I saw the silver backplate was out of stock.
I'll keep an eye out though


----------



## xarot

Hi guys,

I've had my TXP for a while now. Some month(s) ago I started noticing I get a blue screen every now and then when starting or shutting down my computer. It does it even at bare stock clocks on the CPU and JEDEC RAM speeds, and BlueScreenView tells me it's nvlddmkm.sys causing the issue, so the Nvidia drivers. I just did a fresh install of Win 8.1 and it does it again. There have also been a few driver releases already with this issue.

On the other hand, everything runs well in games, and I have yet to experience any blue screens in normal use. My card is a very poor overclocker though, not running much above 2GHz stable under water.

I'm just wondering whether I should try RMAing it, in case it's the card? I know that, for the most part, Nvidia drivers have never come without issues, but this has been going on for a long time and I cannot find a solution. Has anyone tried RMAing in the EU? Did you have any trouble when you swapped the stock cooler back on?


----------



## devilhead

Got my Titan XP yesterday and ran some benchmarks a couple of times: it is a beast compared to the 980 Ti







but I still need to get a waterblock, because right now the core clock jumps all over the place....
http://www.3dmark.com/3dm/16977190


----------



## BelowAverageIQ

My Titan Arrived this afternoon. Already had the EK block and backplate waiting.

Installed it in my son's system initially to see if all was OK. Perfect.

Removed the stock cooler, installed the block and backplate, without any known issues.

Drained my hard tube loop, removed the SLI 980's and installed the Titan.

Connected up tubing, but needed to do some cutting for the hard tube.

All up and running. Room temp is 27 degrees, card is idle at 26 degrees.

No overclocks yet, but noticed that it automatically boosted to 1837 straight off.

We'll see how she goes. First time in ages that I have run a single card; it looks lonely by itself


----------



## arrow0309

Quote:


> Originally Posted by *BelowAverageIQ*
> 
> My Titan Arrived this afternoon. Already had the EK block and backplate waiting.
> 
> Installed in my sons system initially to see if all ok. Perfect.
> 
> Removed the stock cooler, installed the block and backplate, without any known issues.
> 
> Drained my hard tube loop, removed the SLI 980's and installed the Titan.
> 
> Connected up tubing, but needed to do some cutting for the hard tube.
> 
> All up and running. Room temp is 27 degrees, card is idle at 26 degrees.
> 
> No overclocks yet, but noticed that it automatically boosted to 1837 straight off.
> 
> Will see how she goes. First time in ages that I have run a single card. looks lonely by itself


Nice, I've decided to go with the EK waterblock and (silver) backplate as well.
Which block did you get, nickel/plexi or acetal?
Did you replace the stock VRM pads with different ones, or simply install with the stock stuff (still 0.5mm for the memory ICs and 1mm for the VRMs)?

Keep us informed of any OC, throttling steps and temps


----------



## xTesla1856

My order status changed from "Pending" to "Processing"








Can't wait for this beast to arrive


----------



## bizplan

Quote:


> Originally Posted by *xarot*
> 
> Hi guys,
> 
> I've had my TXP for a while now, some month(s) ago I started noticing I am getting a blue screen every now and then when I am starting or shutting down my computer. It does it even at bare stock clocks on the CPU and JEDEC ram speeds, and BluescreenView tells me it's nvlddmkm.sys causing the issue so NVidia drivers. Just did a fresh install of Win 8.1 and it does it again. Also there have been a few drivers already with this issue.
> 
> On the contrary everything is running well in games and yet to experience any blue screens in use. My card is a very poor overclocker though, not running much above 2 GHz stable under water.
> 
> Just thinking should I try RMAing it just in case it's the card? I know that for the most part, NVidia drivers have never came without any issues, but this has been going on for long and I cannot find a solution. Anyone tried RMAing in EU? Did you have any trouble when you swapped the stock cooler back?


May I suggest booting via msconfig and testing the nvlddmkm.sys driver against the other drivers (enabling them one at a time) until you find the driver/device that is interfering with nvlddmkm.sys? You can search the web for how to use msconfig this way. Best of luck!


----------



## BelowAverageIQ

Well, the system was running OK. Then it shut down. It tries to restart, with the power and board lighting up for half a second at most. It boot-loops about 10 times, eventually gets into the BIOS, restarts the system into the desktop for a short while, then does it all again.

System and card NOT overclocked.

Temps are good as such. It's very warm here at the moment, with a room temp of at least 27°C. The CPU now gets to 60 under load (50 before) and the Titan gets to 40 (the SLI 980s hit 36 before).

Checked all wiring and cables. Checked for leaks, nothing obvious.

The EK block looks horrible where it does not get coolant flow. Big stains.









I was getting a better Firestrike score with my 5960X at 4.5GHz and slightly overclocked SLI 980s than with this system, an i7 6700K at stock and the Titan X at stock. An expensive way to get out of SLI.









**Does the nickel EK block have a protective film covering it where it touches the die??**


----------



## Asus11

Quote:


> Originally Posted by *BelowAverageIQ*
> 
> Well the system was running ok. Then shutdown. It tries to restart with the power and board lighting up for 1/2 second at most. Boot loops about 10 times. Eventually gets into bios, restarts the system into desktop for a short while then does it again.
> 
> System and card NOT overclocked.
> 
> Temps are good as such. Very warm here at the moment with room temp of at least 27 degrees C. The cpu gets to 60 now under load (50 before) and Titan gets to 40 (SLI 980's 36 before).
> 
> Checked all wiring and cables. Checked for leaks, nothing obvious.
> 
> EK Block looks horrible where it does not get cooling. Big stains.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was getting a better score in Firestrike with my 5960X at 4.5Ghz and SLI 980's with slight overclock than this system I7 6700K at stock and Titan X at stock. Expensive way to get out of SLI
> 
> 
> 
> 
> 
> 
> 
> 
> 
> **** Does the Nickel EK Block have protecting film covering on it, where it touches the die??*


Looks fine to me.

No... no protective film. I hope you took care when removing the rear screws, as I have seen people knock a few components off using tools that are not suited for the job. Also make sure the block/backplate is not super tight.


----------



## xTesla1856

I just got shipping confirmation







Let's hope the mail forwarding company is equally as fast


----------



## BelowAverageIQ

Quote:


> Originally Posted by *Asus11*
> 
> looks fine to me
> 
> no.. no protective film... I hope you took care to remove the rear screws as I have seen people knock a few components off using tools that are not suited for the job, also make sure the block/backplate is not super tight


Sorry mate, not sure what you mean by rear screws? You mean the little 4mm hex screws? If so, yes. I used a socket with film on it to protect the PCB.

I fastened the block and backplate securely, but I never overtighten. Around the core, I use a star pattern to tighten the screws.

The block looks ugly in the middle where it has no coolant flow. Is that normal? Is it oil that heated up when the card was powered on?


----------



## Asus11

Quote:


> Originally Posted by *BelowAverageIQ*
> 
> Sorry mate, not sure what you mean by rear screws? You mean the little hex screws that are 4mm? If so yes. Used a socket with film to protect the PCB.
> 
> I secured the block and backplate securely, but I never overtighten. Around the core, I use a star pattern to tighten the screws.
> 
> The block looks ugly in the middle where it has no coolant flow. Is that normal? is it oil that has heated up when the card is powered up?


That discolouration you're seeing seems to be normal, and it will get worse as time goes by. That's another reason why I chose to get acetal blocks: you can't see anything.









Also make sure you have decent coolant to prevent corrosion etc., but everything seems A-OK from the pictures. Sometimes EK's QC is hit and miss.


----------



## MunneY

And Nvidia has an announcement at CES... so I guess it's time to list this Titan X


----------



## arrow0309

Quote:


> Originally Posted by *MunneY*
> 
> And Nvidia has an announcement at CES... so I guess its time to list this Titan X


You mean this?

http://wccftech.com/nvidia-pascal-quadro-p6000-gaming-benchmarks/

Just in case: do you have a liquid-cooled Titan X Pascal? Aquacomputer block and backplate?
Would you even ship to the UK?


----------



## xarot

Quote:


> Originally Posted by *bizplan*
> 
> May I suggest you boot up using msconfig and test the nvlddmkm.sys driver with all other drivers (however, only one at a time) until you find the driver/device that is interfering with nvlddmkm.sys? You can search the web on using msconfig this way. Best of luck!


It's worth a shot. I am trying to disable fast startup in Windows; I had it disabled in my previous installation, but it didn't seem to fix the issue. I've also uninstalled GeForce Experience now. So far, only two BSODs in two days, but none yesterday. This one's difficult...
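For anyone else chasing this: fast startup rides on hibernation, so turning hibernation off also turns fast startup off. A minimal sketch using Windows' built-in `powercfg` tool (nothing card-specific; run from an elevated command prompt):

```shell
:: Disabling hibernation also disables Windows fast startup.
powercfg /hibernate off

:: To restore hibernation (and fast startup) later:
powercfg /hibernate on
```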


----------



## devilhead

What about those white pads on the stock cooler? Why is nothing included with the waterblock? Are they not so important?


----------



## EniGma1987

Quote:


> Originally Posted by *devilhead*
> 
> 
> 
> What about those white pads on stock cooler? Why nothing is included with waterblock? It's not so important?


New ones of those pads were included with my waterblock. EK must just not give new ones out.


----------



## feitlinger

hi guys,

I found your thread by googling and I'm hoping to find answers to my questions. I have the following setup:

intel i7 6850k @ 4.2Ghz
MSI X99A Godlike Gaming Carbon
2x Titan X Pascal (SLI)
128GB Corsair Platinum 2800Mhz
CPU, RAM and both GPUs are watercooled

Like my CPU, I want to overclock the GPUs as well. Without increasing the voltage I could raise the core clock by 200MHz and the memory clock by 500MHz in MSI Afterburner. Temperatures are fine, 48°C under load.

Even though I unlocked voltage control in the settings and put the core voltage (%) slider to 100%, the setting doesn't seem to take effect. Voltages are still the same (between 600-850mV) and not at a constant value (even though I selected that option in the Afterburner settings).
Do you guys have any advice on how to get a constant voltage over 1000mV?

Thanks a lot in advance!


----------



## CptSpig

Quote:


> Originally Posted by *xarot*
> 
> It's worth a shot. I am trying to disable fast startup in Windows, I had it disabled in my previous installation but it didn't seem to fix the issue. Also I've uninstalled Geforce Experience now. So far, only two BSODs in two days, but none yesterday. This one's difficult...


I had a similar issue with a machine. Try booting into safe mode and wiping the Nvidia drivers with DDU. Next, reboot and do a clean install of the graphics and PhysX drivers only and see if this helps; it took care of my problems.


----------



## EniGma1987

Quote:


> Originally Posted by *feitlinger*
> 
> hi guys,
> 
> found your thread by googling and i'm hoping to find answers for my questions. I have the following setup:
> 
> intel i7 6850k @ 4.2Ghz
> MSI X99A Godlike Gaming Carbon
> 2x Titan X Pascal (SLI)
> 128GB Corsair Platinum 2800Mhz
> CPU, RAM and both GPUs are watercooled
> 
> Like my CPU I want to overclock the GPUs aswell. Without increasing the voltage I could increase core clock by 200Mhz and memory clock by 500Mhz on MSI Afterburner. Temperature are fine, 48° on load.
> 
> Even though I unlocked the voltage control in the settings and put the slider (core voltage (%)) to 100% it doesn't seem that this setting is active. Voltages are still the same (between 600-850mV) and not on a constant value (even tough I selected the option in the Afterburner settings).
> Do you guys have any advice for my how to get a constant voltage over 1000mV?
> 
> Thanks a lot in advance!


Adding voltage doesn't really do much of anything right now, so you probably won't clock higher whether you get it working or not.







You will also be throttling on the power limit more than the voltage limit with a waterblock installed, and if you do a mod to remove the power limit, then you will next be throttling on temperature more than voltage. So it's not really worth worrying about the voltage right now.
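You can check which limiter is actually capping the card: `nvidia-smi -q -d PERFORMANCE` prints a "Clocks Throttle Reasons" section. Below is a minimal sketch of picking out the active reasons from that text; the sample report is illustrative (typed by hand, not captured from a Titan X Pascal), and the exact field names can vary between driver versions:

```python
# Hedged sketch: find which throttle reasons nvidia-smi reports as Active.
# SAMPLE is an illustrative stand-in for the output of:
#   nvidia-smi -q -d PERFORMANCE
SAMPLE = """\
    Clocks Throttle Reasons
        Idle                        : Not Active
        SW Power Cap                : Active
        HW Thermal Slowdown         : Not Active
        SW Thermal Slowdown         : Not Active
"""

def active_throttle_reasons(report):
    """Return the names of throttle reasons marked 'Active' in the report."""
    reasons = []
    for line in report.splitlines():
        # Split each "Name : State" line; skip lines without a colon.
        name, sep, state = line.partition(":")
        if sep and state.strip() == "Active":
            reasons.append(name.strip())
    return reasons

print(active_throttle_reasons(SAMPLE))  # ['SW Power Cap'] for this sample
```

On a live system you would feed the captured nvidia-smi output into the same function; if "SW Power Cap" shows Active under gaming load, you are on the power limit rather than the voltage limit, matching the behaviour described above.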


----------



## Asus11

Quote:


> Originally Posted by *MunneY*
> 
> And Nvidia has an announcement at CES... so I guess its time to list this Titan X


1080ti ?









IMO I think you're making the wrong move... you might only come out a few hundred in pocket, plus have an inferior card, and probably have to wait months without a card

but its up to you I guess


----------



## Enapace

Quote:


> Originally Posted by *Asus11*
> 
> 1080ti ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> IMO I think you're making the wrong move... you might only come out a few hundred in pocket, plus have an inferior card, and probably have to wait months without a card
> 
> but its up to you I guess


This was why I went for the Titan X Pascal even though I knew the 1080 Ti was coming out. I have two of them in boxes just waiting for my PSU and HB SLI bridge, which I've just ordered; the system should hopefully be up and running this weekend or early next.

We all saw how long it took to get a 1080 at launch; it wouldn't be a shock if it were the same for the 1080 Ti. It wasn't worth taking that risk. I'm putting the cards under water eventually, so I'm going to have solid performance.

Greatly looking forward to it.


----------



## devilhead

So the card works fine under water







so those extra white pads aren't so important







Now I need a cold ambient temperature to do some overclocking














Here is my run under water: http://www.3dmark.com/3dm/17022410


----------



## jsutter71

Quote:


> Originally Posted by *devilhead*
> 
> So card works fine under water
> 
> 
> 
> 
> 
> 
> 
> 
> so those extra white pads is't so important
> 
> 
> 
> 
> 
> 
> 
> now need cold ambient and do some overclock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> here is my run under water : http://www.3dmark.com/3dm/17022410


This was mine with dual TXPs under water:
http://www.3dmark.com/fs/10622261


----------



## xTesla1856

Quote:


> Originally Posted by *jsutter71*
> 
> This was mine with Dual TXPs under watter
> http://www.3dmark.com/fs/10622261


I chuckled at "better than 99% of results"


----------



## jsutter71

*Sorry, devilhead, about the boneheaded move on my part. After I came back to the post and read what I wrote, I realized how obnoxious I sounded with the whole one-upmanship-esque comment. To be fair, my results were Fire Strike 1.1 and devilhead's are Fire Strike Ultra 1.1.*


----------



## MunneY

Quote:


> Originally Posted by *Asus11*
> 
> 1080ti ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> IMO I think you're making the wrong move... you might only come out a few hundred in pocket, plus have an inferior card, and probably have to wait months without a card
> 
> but its up to you I guess


Sli incoming!


----------



## arrow0309

Quote:


> Originally Posted by *devilhead*
> 
> So card works fine under water
> 
> 
> 
> 
> 
> 
> 
> *so those extra white pads is't so important*
> 
> 
> 
> 
> 
> 
> 
> now need cold ambient and do some overclock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> here is my run under water : http://www.3dmark.com/3dm/17022410


???


----------



## devilhead

Quote:


> Originally Posted by *arrow0309*
> 
> ???


The red-marked areas are the white pads







while the blue-marked area has no pad under the waterblock.










Quote:


> Originally Posted by *jsutter71*
> 
> *Sorry devilhead about the boneheaded move on my part. After I came back to the post and read what I wrote, I realized how obnoxious I sounded with the whole one upmanship esque comment. To be fair my results were Fire Strike 1.1 and devilheads are Fire Strike Ultra 1.1*


here is my fs run







http://www.3dmark.com/3dm/17033099


----------



## arrow0309

Quote:


> Originally Posted by *devilhead*
> 
> The red-marked areas are the white pads,
> 
> 
> 
> 
> 
> 
> 
> and the blue-marked area has no pad under the waterblock.


I wouldn't worry about any red-marked or blue-marked areas: under water the whole card runs much cooler, and you only need to put TIM & pads at the contact points on your wb (gpu, memory ICs and mosfets).
Some blocks have recently (and optionally) added the option of placing another pad on the poscap line (blue-marked in your pic), to the left of the R22 chokes; others (like my 1080 Strix Bitspower block) put one on the choke line.
Please, someone correct me if I'm wrong; the Titan may be somehow different (but I don't think so).

Edit:

I did some research, and it seems that all the red-marked zones above (white pads) correspond to either (other) poscaps and/or some low-profile chokes:



And I don't see any way to cool them, since the (EK) block has no contact surface over these areas.


----------



## BelowAverageIQ

I have not had a chance to work through why my system suddenly shut down, then tried to power on in a loop about 10 times, for half a second each time. I turned the power off, then rebooted the system. It got to the desktop and then did the same thing again.

System was rock solid before that with the 2 980's in SLI.

Only thing I changed was install of Titan X with EK block and backplate and new Drivers 376.45 after using DDU.

Custom Loop was completely drained. No leaks.

Power Supply is a Corsair 1200i and I used a jump connector to power the pump to refill the loop. I had to keep turning the power supply off and on via the switch on the unit itself.

Prior to the unexpected shutdown and power loop, system was stable, temps were good on CPU and Titan.

I had my 6700K with adaptive manual voltage set. Voltages were:

Core 1.175

CPU SA 1.050

CPU IO 0.950

DRAM 1.300

These were 100% stable with SLI 980's, which I assume drew more power than the Titan.

I have custom cables which I made. The 6 pin and 8 pin PCIE were tested using a PSU Tester before plugging into the Titan. Both 12v and PSU were good.

The green light on the Corsair PSU was on. I have NOT adjusted the power level via the Corsair software for the PCIE outputs.

It seemed to me it was a power issue, as if the power supply was restarting due to a fault or protection feature.

I am now away on a trip. The possible problems are eating away at me, not least of which is a problem with the new Titan.

I was thinking of draining the loop again and removing then reseating the card, but it looks to be seated perfectly.

Using the Corsair supplied cables for 24 pin, CPU power and PCIE.............

Anyone else experienced a similar problem before?

My apologies for the long post.

Thank you.

*** How many hex head screws were to be removed? I am positive I removed them all. I can count how many when I get home to be sure*


----------



## arrow0309

Quote:


> Originally Posted by *BelowAverageIQ*
> 
> I have not had a chance to work through why my system just shut down all of a sudden, then power on loop 10 times for half a second. Turned off power, then rebooted system. Got to desktop and did the same again.
> 
> System was rock solid before that with the 2 980's in SLI.
> 
> Only thing I changed was install of Titan X with EK block and backplate and new Drivers 376.45 after using DDU.
> 
> Custom Loop was completely drained. No leaks.
> 
> Power Supply is a Corsair 1200i and I used a jump connector to power the pump to refill the loop. I had to keep turning the power supply off and on via the switch on the unit itself.
> 
> Prior to the unexpected shutdown and power loop, system was stable, temps were good on CPU and Titan.
> 
> I had my 6700K with adaptive manual voltage set. Voltages were:
> 
> Core 1.175
> 
> CPU SA 1.050
> 
> CPU IO 0.950
> 
> DRAM 1.300
> 
> These were 100% stable with SLI 980's, which I assume drew more power than the Titan.
> 
> I have custom cables which I made. The 6 pin and 8 pin PCIE were tested using a PSU Tester before plugging into the Titan. Both 12v and PSU were good.
> 
> The green light on the Corsair PSU was on. I have NOT adjusted the power level via the Corsair software for the PCIE outputs.
> 
> It seemed to me it was a power issue, as if the power supply was restarting due to a fault or protection feature.
> 
> I am now away on a trip. It is eating me away at the moment as to possible problems, least of which is a problem with the new Titan.
> 
> I was thinking of draining the loop again and removing then reseating the card, but it looks to be seated perfectly.
> 
> Using the Corsair supplied cables for 24 pin, CPU power and PCIE.............
> 
> Anyone else experienced a similar problem before?
> 
> My apologies for the long post.
> 
> Thank you.
> 
> *** How many hex head screws were to be removed? I am positive I removed them all. I can count how many when I get home to be sure*


You're not at (cpu) default settings, are you?
Is the card working well at default settings?
Sometimes when moving to a more powerful vga, your overclocked cpu might need a little more vcore (it happened to me in the past going from the HD 7970 to the R9 290 Tri-X)


----------



## Jpmboy

Quote:


> Originally Posted by *devilhead*
> 
> 
> 
> What about those white pads on stock cooler? Why nothing is included with waterblock? It's not so important?


AFAIK, the white "pads" are not thermally conductive, they are heat and electrical insulation. There are no EK pads or instructions regarding those stock cooler locations and I have been running 2 TXPs since launch with EK blocks and nothing in those spots.

lol - all this talk about blocks staining... it's the coolant guys, not the blocks. Check the pH of your mix.


----------



## Dr Mad

Quote:


> Originally Posted by *devilhead*
> 
> here is my fs run
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/17033099


I'm doing 32800 with 2101 (stable) +550vram and 5960X at 4600.

How can you do 1K more points?? What's the magic here?


----------



## devilhead

Quote:


> Originally Posted by *Dr Mad*
> 
> I'm doing 32800 with 2101 (stable) +550vram and 5960X at 4600.
> 
> How can you do 1K more points?? What's the magic here?


maybe +1000vram







))


----------



## Jpmboy

Quote:


> Originally Posted by *Dr Mad*
> 
> I'm doing 32800 with 2101 (stable) +550vram and 5960X at 4600.
> 
> How can you do 1K more points?? What's the magic here?


Try it with the updated sysinfo and Fire Strike, with the most recent driver, and make sure you have the NV driver set up properly. The 376 driver boosts FS and TS a bit. If you are using Win 7, expect several hundred points less vs W10. Win 8 and 8.1 give the highest scores.


----------



## MunneY

Quote:


> Originally Posted by *Jpmboy*
> 
> try it with the updated sysinfo and firestrike... with the most recent driver, and make sure you have the NV driver set up properly. the 376 driver boosts FS and TS some. If you are using win 7... minus several hundred points vs W10. Win 8 and 8.1 give the highest scores.


like he said... Driver tweaking makes a BIGGGG difference too.


----------



## KillerBee33

Quote:


> Originally Posted by *MunneY*
> 
> like he said... Driver tweaking makes a BIGGGG difference too.


Driver Tweaking?


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Driver Tweaking?


first level tweaks... all "legal" and Valid:


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> first level tweaks... all "legal" and Valid:


Is that for Benching only? I doubt these will be any good for the rest of PC entertainment ...


----------



## Asus11

Quote:


> Originally Posted by *KillerBee33*
> 
> Is that for Benching only? I doubt these will be any good for the rest of PC entertainment ...


revert back when done benching


----------



## KillerBee33

Will give it a try one day...


----------



## jsutter71

Quote:


> Originally Posted by *jsutter71*
> 
> *Sorry devilhead about the boneheaded move on my part. After I came back to the post and read what I wrote, I realized how obnoxious I sounded with the whole one-upmanship-esque comment. To be fair, my results were Fire Strike 1.1 and devilhead's are Fire Strike Ultra 1.1*


*Just out of curiosity I went and reran 3DMark using Fire Strike Ultra 1.1, and here are my results, just to give a good comparison of an SLI configuration against a single-card TXP.
http://www.3dmark.com/3dm/17050656*


----------



## CoD511

Quote:


> Originally Posted by *KillerBee33*
> 
> Is that for Benching only? I doubt these will be any good for the rest of PC entertainment ...


You'd be surprised how often games underutilise the GPU until maximum power is forced in the control panel. If you want to go deeper, Nvidia Inspector's Profile Manager exposes a far larger selection of what can be accessed via NVAPI, and the latest build added more descriptions and flags to control. That made the difference between Dishonored 2 performing terribly (unplayable for me) and being quite playable.


----------



## KillerBee33

Quote:


> Originally Posted by *CoD511*
> 
> You'd be surprised sometimes that games have underutilised the GPU until maximum power was forced in the control panel. If you want to go deeper, Nvidia Inspector's Profile Manager has a far larger selection of what can be accessed by NVAPI with the latest build having added more descriptions and flags to control. And that made the difference between Dishonored 2 being quite significantly playable and terribly performing, unplayable for me


The only reason I chose a single GPU instead of SLI was to save myself from Nvidia Inspector and constant driver tweaking








I'll try the posted driver settings for benching but will refuse any further tweaking; haven't had any issues since the 376.48 hotfix


----------



## CoD511

Quote:


> Originally Posted by *KillerBee33*
> 
> The only reason i chose a Single GPU instead of SLI , to save myself from Nvidia Inspector and constant Driver Tweaking
> 
> 
> 
> 
> 
> 
> 
> 
> Will try posted Driver settings for Benching but will refuse any further tweaking , haven't had any issues since 376.48 HotFix


Heh, sorry, I wasn't fully aware of the context, if there are issues in the picture. I understand it though; it's annoying, and I ditched SLI for similar reasons, no doubt. In a pinch though, I'm glad I could control the shader garbage collection rate to save my Dishonored 2 experience.


----------



## KillerBee33

Quote:


> Originally Posted by *CoD511*
> 
> Heh, sorry, wasn't fully aware of the context if there are issues in the picture. I understand it though, it's annoying and I ditched SLI for the similar reasons too no doubt. In a pinch though, I'm glad I could control the shader garbage collection rate to save my Dishonored 2 experience.


No Gaming issues so far , haven't had a reason to tweak anything.


----------



## BelowAverageIQ

Quote:


> Originally Posted by *arrow0309*
> 
> I assume you're not at (cpu) default settings, are you?
> Is the card working well at default setting?
> Sometimes when passing to a more powerful vga your cpu in oc might need a little more vcore (it happend to me in the past from the HD 7970 to the R 290 Tri-X)


No overclock on the CPU or memory (as such, 3000MHz).

Stock voltages my MSI board wants to "auto" apply on the CPU are:

CPU Core 1.176
CPU SA 1.256
CPU IO 1.168

I think they are too high, and they actually show "red" when I enter those voltages manually. I am wary of board manufacturers' default "auto" settings now.
Quote:


> Originally Posted by *Jpmboy*
> 
> AFAIK, the white "pads" are not thermally conductive, they are heat and electrical insulation. There are no EK pads or instructions regarding those stock cooler locations and I have been running 2 TXPs since launch with EK blocks and nothing in those spots.
> 
> lol - all this talk about blocks staining... it's the coolant guys, not the blocks. Check the pH of your mix.


@Jpmboy, the discolouration on my new EK block is in an area with no coolant contact. I am using coolant, Mayhems X1, mixed at the correct ratio. I admit, though, I had NOT checked the pH. Just got home and checked: pH is 7.5.

My guess is it is oil left over from machining or testing/fitting which has heated up with the card.

Thank you for those that are trying to help me with the power issue.

I would think that SLI 980's would draw more on each rail (dual 6-pin) than the Titan in this case?

Cheers


----------



## Dodam

Hi guys - I have a quick question about repasting GPUs for stock air cooling.

I've just put together a new system after everything in my last system died simultaneously:


Spoiler: Warning: Spoiler!







Since the case (Crystal 570X) is tight I didn't want to put in a custom loop - for the time being I'm sticking to air cooling until the Enthoo Elite is released.

I'm planning on repasting the Titan XPs with Thermal Grizzly Conductonaut, as the GPUs have been regularly thermal throttling with a small OC - is this safe? I can't tell from the photos whether the stock heatsink is nickel-plated copper.

Also, there's no point in using thermal tape instead of the white tape since the stock cooler is just plastic there, right?


----------



## xTesla1856

Quote:


> Originally Posted by *Dodam*
> 
> Hi guys - I have a quick question about repasting GPUs for stock air cooling.
> 
> I've just put together a new system after everything in my last system died simultaneously:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Since the case (Crystal 570X) is tight I didn't want to put in a custom loop - for the time being I'm sticking to air cooling until the Enthoo Elite is released.
> 
> I'm planning on repasting the Titan XPs with Thermal Grizzly Conductonaut as the GPUs have been regularly thermal throttling with a small OC - is this safe? I can't tell if the stock heatsink is nickel-plated copper from the photos.
> 
> Also, there's no point in using thermal tape instead of the white tape since the stock cooler is just plastic there, right?


Repasting always helps; it's the first thing I do when I get a new GPU. For reference, I had two Maxwell Titan X's, and when I repasted them with Kryonaut, temps dropped about 8°C. I had them running undervolted at 1430MHz all day on the stock air cooler. Ah, good times


----------



## Asus11

Quote:


> Originally Posted by *Dodam*
> 
> Hi guys - I have a quick question about repasting GPUs for stock air cooling.
> 
> I've just put together a new system after everything in my last system died simultaneously:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Since the case (Crystal 570X) is tight I didn't want to put in a custom loop - for the time being I'm sticking to air cooling until the Enthoo Elite is released.
> 
> I'm planning on repasting the Titan XPs with Thermal Grizzly Conductonaut as the GPUs have been regularly thermal throttling with a small OC - is this safe? I can't tell if the stock heatsink is nickel-plated copper from the photos.
> 
> Also, there's no point in using thermal tape instead of the white tape since the stock cooler is just plastic there, right?


Don't bother, you won't notice any difference with ''Thermal Grizzly''; it's over-hyped, over-marketed thermal paste.

The stock paste is decent enough, and no, you should not be experiencing that; it's not down to the paste, it's down to the bad cooler Nvidia put on, which is not very good at cooling.

Sidenote: I'll be ordering a Titan on the 5th of Jan if all goes well.. see what Nvidia are hiding..


----------



## KillerBee33

Quote:


> Originally Posted by *Asus11*
> 
> dont bother you wont notice any difference with ''Thermal Grizzly'' its over-hyped over-marketed thermal paste
> 
> stock paste is decent enough and no you should not be experiencing what you are and its not down to paste its down to bad cooler Nvidia put on which is not very good at cooling
> 
> sidenote will be ordering a Titan on 5th of Jan if all goes well.. see what Nvidia are hiding..


In my experience, changing stock paste to anything else (tried Arctic Silver & Gelid Ex. on a 980, a 1080 and the TXP), all I got was a 2-4 degree difference.
But then some of us get this from the factory









Spoiler: Warning: Spoiler!


----------



## EniGma1987

Quote:


> Originally Posted by *Dodam*
> 
> Hi guys - I have a quick question about repasting GPUs for stock air cooling.
> 
> I've just put together a new system after everything in my last system died simultaneously:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Since the case (Crystal 570X) is tight I didn't want to put in a custom loop - for the time being I'm sticking to air cooling until the Enthoo Elite is released.
> 
> I'm planning on repasting the Titan XPs with Thermal Grizzly Conductonaut as the GPUs have been regularly thermal throttling with a small OC - is this safe? I can't tell if the stock heatsink is nickel-plated copper from the photos.
> 
> Also, there's no point in using thermal tape instead of the white tape since the stock cooler is just plastic there, right?


Liquid metal TIMs have ruined all the coolers I have used them with (two AIOs and a waterblock), so I would not recommend using them unless you plan to never remove the stock heatsink again, or, if replacing it, you never plan to put the stock heatsink back on. The couple extra degrees of difference it makes over regular Kryonaut won't usually matter unless you are in a pretty cold climate, going full water, and right on the edge of that first or second clock/temp bin.


----------



## arrow0309

Quote:


> Originally Posted by *Asus11*
> 
> dont bother you wont notice any difference with ''Thermal Grizzly'' its over-hyped over-marketed thermal paste
> 
> stock paste is decent enough and no you should not be experiencing what you are and its not down to paste its down to bad cooler Nvidia put on which is not very good at cooling
> 
> sidenote will be ordering a Titan on 5th of Jan if all goes well.. see what Nvidia are hiding..


That makes two of us








Early (chinese forum) rumors 1080Ti 3328 Cuda Cores, 384 Bit, 12 GB GDDR5, 96 ROPs


----------



## unreality

Happy New Year fellow TitanX brothers


----------



## arrow0309

*Happy New 2017!*








Full of vgas (no Vegas) lol


----------



## CptSpig

Quote:


> Originally Posted by *arrow0309*
> 
> That makes two of us
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Early (chinese forum) rumors 1080Ti 3328 Cuda Cores, 384 Bit, 12 GB GDDR5, 96 ROPs


And a $1,000.00+ price tag? Thanks, but I will stick with my Titan X Pascal; it just sounds faster, probably because it is and always will be...


----------



## Dodam

Quote:


> Originally Posted by *EniGma1987*
> 
> Liquid metal TIMs have ruined all the coolers I have used them with (two AIOs and a waterblock), so I would not recommend using them unless you plan to never remove the stock heatsink again, or if replacing it you never plan to put the stock heatsink back on. The copuple extra degrees difference it will make over regular Kryonaut will not usually matter unless you are in a pretty cold climate and going full water, and are right on the edge of that first or second clock/temp bin.


Yeah - I have noticed that CLU binds to exposed copper waterblocks / heatsinks after a few months. I guess I'll go with regular Kryonaut or IC Diamond 7 - thanks!


----------



## xTesla1856

Anyone ever get this when ordering their cards? I guess it's because of New Year's Eve:


----------



## Asus11

Quote:


> Originally Posted by *arrow0309*
> 
> That makes two of us
> 
> 
> 
> 
> 
> 
> 
> 
> Early (chinese forum) rumors 1080Ti 3328 Cuda Cores, 384 Bit, 12 GB GDDR5, 96 ROPs




GDDR5... but why?

GDDR5 is horrible on the 1070 compared to the 1080 imo


----------



## fat4l

Subbed!

What's the ~max frequency people are achieving with the Titan XP nowadays?
Does it behave the same way as the 1080 FE (voltage locked at 1.094v, TDP locked, etc.)?

How does the card behave under water?


----------



## arrow0309

You'd better start reading back through some pages; I'm already on page 435


----------



## arrow0309

Be prepared folks, Las *Vega*s coming


----------



## aylan1196

Hi all, installed the Swiftech waterblock on the Titan X Pascal. Very nice temps, 25 idle and 47 max, and I like the looks of it. Have a peek


----------



## arrow0309

Quote:


> Originally Posted by *aylan1196*
> 
> Hi all, installed the Swiftech waterblock on the Titan X Pascal. Very nice temps, 25 idle and 47 max, and I like the looks of it. Have a peek


Not bad; even if it's not to my taste, we could call it a good and easy "all in one" H140-X / H240-X dual-loop (dual integrated pump) setup








Even if you could gain an extra (5 to 10) °C with a nice custom liquid cooling loop, especially with larger, well-placed rads and IMO with another block (like EK or Watercool), it's still basically a cooler version of EVGA's Hybrid kit









Good job









And BTW:

I've just ordered (on Dec 31, from Watercool's online shop) their fancy line for the Titan XP: a Heatkiller IV nickel-acryl (silver) block and their black (aluminium) backplate













Still have to order a new (or get a used) TXP; gonna wait a couple more days


----------



## Lee0

Quote:


> Originally Posted by *arrow0309*
> 
> Be prepared folks, Las *Vega*s coming
> 
> 
> 
> 
> 
> 
> 
> 
> (...)


Not to sound rude or pretentious, but why are you posting this in the _Titan X owners thread?_ It has no relevance to Titan X owners, and there are already tons of threads dedicated to Vega and stuff like that. This thread gets hijacked and goes off topic so many times, it's quite unbelievable. Oh well.


----------



## arrow0309

It was a joke about their "claim" that the Vega gpu will be even more powerful than the upcoming Volta (I don't think it'll even overtake a 1080).
Sorry if the joke was in bad taste for you


----------



## Lee0

Quote:


> Originally Posted by *arrow0309*
> 
> It was a joke on their "claiming" the Vega gpu to be even more powerful than the upcoming Volta (I don't think it'll overtake not even a 1080).
> Sorry if that was bad taste joke for you


No problem mate, my reply was more about this thread going off topic than just your specific reply.
On another note, as holidays are soon to be over it won't be long before I'll order my _Arctic Accelero_ cooler. I'll be sure to update when it arrives.


----------



## lukerobi

Has anyone thought of using conductive tape rather than liquid metal to bridge the resistors?


----------



## EniGma1987

Quote:


> Originally Posted by *lukerobi*
> 
> Has anyone thought of using conductive tape rather than liquid metal to bridge the resistors?


Probably won't have low enough resistance. The stock shunt resistors are already 0.005 ohms, and the liquid metal lowers that to roughly 0.002.
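For anyone curious, here's a back-of-envelope sketch of the math behind those numbers. The 0.005 Ω and ~0.002 Ω values are just the figures quoted above (not measurements): the liquid metal acts as a resistance in parallel with the stock shunt, and since the card infers current from the voltage drop across the shunt while still assuming the stock value, the reported power scales with the effective resistance.

```python
# Shunt-mod arithmetic (illustrative only; resistor values are the
# forum-quoted figures, not measured on a real card).

def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def reported_fraction(r_stock: float, r_modded: float) -> float:
    """The controller senses V_drop = I * R_shunt but still assumes the
    stock resistance, so it reports only r_modded / r_stock of the true
    power draw."""
    return r_modded / r_stock

R_STOCK = 0.005   # ohms, stock shunt resistor (forum figure)
R_MODDED = 0.002  # ohms, approximate value after the liquid-metal mod

# A liquid-metal film of ~3.3 milliohm in parallel with the 5 milliohm
# shunt lands near the quoted ~2 milliohm:
print(parallel(0.005, 0.0033))

print(f"card reports ~{reported_fraction(R_STOCK, R_MODDED):.0%} of true draw")
```

At those values the card would read only about 40% of its real draw, which is why the same BIOS power limit suddenly permits much higher actual power; conductive tape laid over the resistor would need a comparably tiny resistance to achieve the same effect.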


----------



## Dagamus NM

So I decided to do the liquid metal power limit mod on my Titans. They will be vertically mounted on a horizontal motherboard (Caselabs S8), so I opted to isolate the resistors to ensure no mess ended up in unwanted places. Hot glue didn't get me motivated, so I masked everything off around the two resistors and applied the CLU in the smallest amount possible: I simply dabbed the syringe on top of the resistors and let the cohesive forces pull out just enough to barely coat them. Then I sprayed about eight layers of Plasti Dip. Here is the final result:


----------



## Asyrin

So after getting my Titan X and being completely disgusted at the thermal throttling, I've broken down and ordered the EVGA Hybrid kit.

For those of you that have done this mod: is it worth cutting the hybrid shroud with a Dremel to be able to fit it over the PCB? Or should I do the simpler thing and leave the front half without a shroud and keep the Titan fan on the back half?

Let's presume I don't care how the card looks I only care how it performs and lasts. Let's also presume that while I'm not an idiot, I've never actually used a dremel before.

Thanks in advance!


----------



## EniGma1987

Quote:


> Originally Posted by *Dagamus NM*
> 
> 
> So I decided to do the liquid metal power limit mod to my titans. I will have them vertically mounted on a horizontal motherboard (Caselabs S8) so I opted to isolate the resistors to ensure no mess ended up in unwanted places but the hot glue did not get me motivated so I masked everything off around the two resistors, did the clu in the smallest amount possible. I simply dabbed the syringe on top of the resistors and let the cohesive forces pull enough out to barely coat it. Then I sprayed about eight layers of plasti dip. Here is the final result:


That looks pretty awesome, and Plasti Dip can be easily removed too, so a very nice solution.
What are your power consumption readings now under 100% load (% TDP)?


----------



## arrow0309

Quote:


> Originally Posted by *Lee0*
> 
> Quote:
> 
> 
> 
> Originally Posted by *arrow0309*
> 
> It was a joke on their "claiming" the Vega gpu to be even more powerful than the upcoming Volta (I don't think it'll overtake not even a 1080).
> Sorry if that was bad taste joke for you
> 
> 
> 
> No problem mate, my reply was more about this thread going off topic than just your specific reply.
> On another note, as holidays are soon to be over it won't be long before I'll order my _Arctic Accelero_ cooler. I'll be sure to update when it arrives.
Click to expand...

No worries








Btw:
Accelero Hybrid or air cooled?
Nice & chilly temps you've got there in Sweden, right?








Quote:


> Originally Posted by *EniGma1987*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dagamus NM*
> 
> 
> So I decided to do the liquid metal power limit mod to my titans. I will have them vertically mounted on a horizontal motherboard (Caselabs S8) so I opted to isolate the resistors to ensure no mess ended up in unwanted places but the hot glue did not get me motivated so I masked everything off around the two resistors, did the clu in the smallest amount possible. I simply dabbed the syringe on top of the resistors and let the cohesive forces pull enough out to barely coat it. Then I sprayed about eight layers of plasti dip. Here is the final result:
> 
> 
> 
> That looks pretty awesome. And plastidip can be easily removed too so a very nice solution.
> What are your power consumption reading now under 100% load (% TDP)?
Click to expand...

@Dagamus NM
Yeah, also looking forward to seeing what you actually manage to achieve in OC








What block and temps do you have on the gpu?

PS:
Aside from these shunt mods, are we gonna see any BIOS mod soon?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Dagamus NM*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> So I decided to do the liquid metal power limit mod to my titans. I will have them vertically mounted on a horizontal motherboard (Caselabs S8) so I opted to isolate the resistors to ensure no mess ended up in unwanted places but the hot glue did not get me motivated so I masked everything off around the two resistors, did the clu in the smallest amount possible. I simply dabbed the syringe on top of the resistors and let the cohesive forces pull enough out to barely coat it. Then I sprayed about eight layers of plasti dip. Here is the final result:


That looks nice: neat and clean.







Interested to see if it helped or not.


----------



## Dagamus NM

I have no idea what clocks these will hit yet. Untested. I put the EK full cover TXP blocks with the nickel back plates. There are four of these guys on an RVE e10 with 6950X. I am going to experiment a bit with this one as I have the rads external to the S8 and then an 850W chiller that I will open up when benching or maybe in the summer if needed. Just going to play with it a bit.


----------



## jhowell1030

Man oh man. I am LOVING the performance I've gotten since putting in a custom loop! I've been really busy with work, being a daddy and a husband, and the holidays...so I haven't had time to do much gaming. However, I did get the chance to do some benchmarking last night and was able to get a stable +201 on the gpu clock, kept the memory at +500 (too many tests/reviewers out there showing that anything much more than that on pascal either yields no real-world returns and/or actually hurts FPS in gaming) and was able to get my 5820k from 4.3ghz to *4.6*! That was kind of my biggest surprise seeing as I had that on a kraken x61 and it never really got very hot. Went from a firestrike score of 21080 to +23000. I only did the original firestrike because that was the only benchmark I was able to get working correctly when I got the card at launch and forgot to run the beefier versions and/or timespy before ripping off the cooler.

I was completely blown away with idle temps hovering 27-29C. I know that firestrike temps aren't really indicative of real-world gaming temps but it never broke 42C. That was a lot better than I was expecting.

Hopefully I'll have time to post pictures soon. The tubes came out ugly: 1. first-timer; 2. I meant to order PETG (hearing that it was more forgiving) and ordered acrylic by mistake; 3. I used EK's little tool that lets you pick what you want to cool and tells you roughly what you need. After it suggested the amount of tubing for a one-rad setup (I went with two), I added one extra 2-tube kit for the additional radiator, plus two more 2-tube kits to account for trial and error. Turns out I probably should have ordered _one_ more; my last couple of "final" tubes were made from leftover errors. I'll be sure to order more tubing to redo it all and make it prettier.


----------



## jhowell1030

Quote:


> Originally Posted by *jhowell1030*
> 
> Man oh man. I am LOVING the performance I've gotten since putting in a custom loop! I've been really busy with work, being a daddy and a husband, and the holidays...so I haven't had time to do much gaming. However, I did get the chance to do some benchmarking last night and was able to get a stable +201 on the gpu clock, kept the memory at +500 (too many tests/reviewers out there showing that anything much more than that on pascal either yields no real-world returns and/or actually hurts FPS in gaming) and was able to get my 5820k from 4.3ghz to *4.6*! That was kind of my biggest surprise seeing as I had that on a kraken x61 and it never really got very hot. Went from a firestrike score of 21080 to +23000. I only did the original firestrike because that was the only benchmark I was able to get working correctly when I got the card at launch and forgot to run the beefier versions and/or timespy before ripping off the cooler.
> 
> I was completely blown away with idle temps hovering 27-29C. I know that firestrike temps aren't really indicative of real-world gaming temps but it never broke 42C. That was a lot better than I was expecting.
> 
> Hopefully I'll have time to post pictures soon. The tubes came out ugly. 1. First timer 2. Meant to order PETG (hearing that it was more forgiving) and ordered acrylic by mistake 3. I used EK's little tool that lets you pick what you want to cool and kind of tells you what you need. After it suggested the amount of tubing to buy for a one rad setup (I went with two) I added in one additional 2 tube kit for the additional radiator plus two more 2 tube kits to account for trial and error. Turns out I probably should have ordered _one_ more. My last couple of "final" tubes that went in were made from leftover errors. I'll be sure to order more tubing to redo it all to make it more pretty.


I did forget to mention the one drawback: now, whenever the GPU is used for gaming or benchmarking, there is a vibrating, coil-whine type of sound. I suppose it's something I'll have to get used to, because I don't really want to go through the process of RMAing it.


----------



## Jpmboy

Quote:


> Originally Posted by *Dagamus NM*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> So I decided to do the liquid metal power limit mod to my titans. I will have them vertically mounted on a horizontal motherboard (Caselabs S8) so I opted to isolate the resistors to ensure no mess ended up in unwanted places but the hot glue did not get me motivated so I masked everything off around the two resistors, did the clu in the smallest amount possible. I simply dabbed the syringe on top of the resistors and let the cohesive forces pull enough out to barely coat it. Then I sprayed about eight layers of plasti dip. Here is the final result:


That's the neatest, OEM-looking shunt mod I've ever seen!


----------



## Dagamus NM

Quote:


> Originally Posted by *Jpmboy*
> 
> That's the neatest, OEM-looking shunt mod I've ever seen!


Thank you sir, that means a lot coming from you.

I obsessed over how to do it for a couple of days.


----------



## jhowell1030

Quote:


> Originally Posted by *Jpmboy*
> 
> That's the neatest, OEM-looking shunt mod I've ever seen!


I have to agree. It looks very clean. Keep us posted! That was a great idea.


----------



## arrow0309

So, it seems there's no 1080 Ti yet








Do ya guys still advise buying a TXP new right now, or are we gonna see a nice Pascal refresh (both 1080 Ti and TXP) soon, or even the new Volta?


----------



## xTesla1856

Quote:


> Originally Posted by *arrow0309*
> 
> So, it seems there's no 1080 Ti yet
> 
> 
> 
> 
> 
> 
> 
> 
> Do ya guys still advise buying a TXP new right now, or are we gonna see a nice Pascal refresh (both 1080 Ti and TXP) soon, or even the new Volta?


I say if you have the money now, go for it. IMO, the Ti isn't even supposed to come out unless AMD delivers a hammer with Vega that jeopardizes Nvidia's position at the top end.


----------



## arrow0309

Quote:


> Originally Posted by *xTesla1856*
> 
> Quote:
> 
> 
> 
> Originally Posted by *arrow0309*
> 
> So, it seems there's non 1080 ti yet
> 
> 
> 
> 
> 
> 
> 
> 
> Do ya guys still advise / consider buying a Txp new right now or we gonna see a nice (both 1080 ti and txp) P refresh soon or even the new Volta?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I say if you have the money now, go for it. IMO, the Ti isn't even supposed to come out unless AMD delivers a hammer with Vega that jeopardizes Nvidia's position at the top end.

A full Vega is supposedly close, and should offer better-than-1080 performance, but at what price?
Will they still announce the new 1080 Ti a little later, or even revise the TXP price a bit lower?
I hate that AMD still hasn't shown Vega off yet









Edit:
OK, it seems their countdown is finishing in 7 or so hours


----------



## xTesla1856

I don't think the TXP price will ever drop.


----------



## Seyumi

I just ordered my 2nd Titan X Pascal this morning. I was waiting to see if the 1080 Ti would be a possible full chip, like they pulled off with the 780 Ti against the original Titan. If they did release it, I was 80% sure it was going to be the even-more-cut-down 10GB version all the leaks suggested. All we got instead was a 2-hour presentation of autonomous cars, house-spying gadgets, and poor person's game streaming.


----------



## arrow0309

Quote:


> Originally Posted by *Seyumi*
> 
> I just ordered my 2nd Titan X Pascal this morning. I was waiting to see if the 1080 Ti would be a possible full chip, like they pulled off with the 780 Ti against the original Titan. If they did release it, I was 80% sure it was going to be the even-more-cut-down 10GB version all the leaks suggested. All we got instead was a 2-hour presentation of autonomous cars, house-spying gadgets, and poor person's game streaming.


The TXP is not a full chip, not even a full GP102; the Quadro P6000 is the full GP102. And we still don't know anything about the new 1080 Ti yet. I think that will depend on how fast the new full Vega really is, which will probably show up in a couple of months.
However, nice choice (still), congrats








I'll be ordering my (first) txp very soon, maybe tomorrow


----------



## xTesla1856

I still haven't received my Titan X, these damn mail forwarding companies are so slow...


----------



## piee

As long as it's on the 16nm architecture, it's only going to be 15% faster than the TXP at most. Talking about a 2080 Ti or a TXP Black with Volta/HBM2: not as big a jump as from 28nm to 16nm.


----------



## Dagamus NM

Making progress. All four are dressed and ready for the party.


----------



## jhowell1030

Quote:


> Originally Posted by *Dagamus NM*
> 
> Making progress. All four are dressed and ready for the party.


Purty. Love that CPU block.


----------



## EniGma1987

Quote:


> Originally Posted by *piee*
> 
> As long as it's on the 16nm architecture, it's only going to be 15% faster than the TXP at most. Talking about a 2080 Ti or a TXP Black with Volta/HBM2: not as big a jump as from 28nm to 16nm.


Not necessarily. You can add 25% more transistors to the current Titan XP die size and still be under the typical maximum die size we get for GPUs on a node. That means 25% more hardware resources to add to performance, plus the Titan XP is slightly cut down already anyway, so we could see as much as 40% more brute force resources in a later generation on the same node, plus small improvements due to arch optimizations.
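Rough numbers, if you want to sanity-check that headroom argument. These are ballpark assumptions, not official figures: GP102 at ~471 mm² with 3840 CUDA cores (3584 enabled on the Titan XP) and ~600 mm² taken as a practical die-size cap for the node.

```python
# Ballpark headroom math - die size, core counts, and the die cap
# are assumptions, not official figures.
DIE_MM2 = 471          # assumed GP102 die size
DIE_CAP_MM2 = 600      # assumed practical max die size on the node
CORES_FULL = 3840      # full GP102 (Quadro P6000)
CORES_ENABLED = 3584   # Titan X Pascal

growth = DIE_CAP_MM2 / DIE_MM2        # extra transistors a bigger die could hold
uncut = CORES_FULL / CORES_ENABLED    # gain from enabling the full die
headroom = growth * uncut             # combined raw-resource multiplier

print(f"bigger die: +{(growth - 1) * 100:.0f}%")
print(f"full chip:  +{(uncut - 1) * 100:.0f}%")
print(f"combined:   +{(headroom - 1) * 100:.0f}%")
```

That lands in the same 30-40% ballpark for brute-force resources, before counting any clock or architecture gains.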


----------



## Jpmboy

Quote:


> Originally Posted by *Dagamus NM*
> 
> Making progress. All four are dressed and ready for the party.


daaum nice! did you mod all 4 cards?
... and for a short break:


----------



## Dagamus NM

Thank you guys, I modded all four cards. Tedious process overall. Had to use the 0.5mm thermal pads that came with the blocks as I was out of fujipoly. Luckily that is only for the memory, I have plenty of 1.0mm for the VRMs. Installing GPU blocks four at a time takes a decent amount of time.

I used the MSI sli bridge. Now that the spacing is set I will install the bridge this evening. Then wiring and fill the loop. I already know that I will use four 420mm rads and they will be external to the case. I am debating putting the pumps and res in the case or leaving those outside as well. I will have the chiller in the room with two valves to turn on flow when it is in use. Or maybe that is not necessary and I just let the two d5s push the whole loop and simply turn on the chiller when needed.


----------



## arrow0309

Quote:


> Originally Posted by *Dagamus NM*
> 
> Making progress. All four are dressed and ready for the party.











No comment, +1 Rep








Quote:


> Originally Posted by *Jpmboy*
> 
> daaum nice! did you mod all 4 cards?
> ... and for a short break:


Wow, he's got to earn a lot to afford himself a quad-TXP rig like Dagamus NM's above


----------



## Dagamus NM

That video gave me a bit of vertigo. I am not a fan of heights. I can do them if needed but I prefer to skulk around in basements. I think I could be a permanent subterranean dweller.

That said, that light isn't going to change itself.


----------



## jsutter71

Quote:


> Originally Posted by *Dagamus NM*
> 
> That video gave me a bit of vertigo. I am not a fan of heights. I can do them if needed but I prefer to skulk around in basements. I think I could be a permanent subterranean dweller.
> 
> That said, that light isn't going to change itself.


I'm also terrified of heights but Mamma told me to stop acting like a little girl and go conquer my fears. So I decided to join the army and go to Airborne school. Retired in 2013. Still hate heights but I'm ok as long as I have a parachute.


----------



## piee

That's only about 15% more transistors


----------



## Dagamus NM

Quote:


> Originally Posted by *jsutter71*
> 
> I'm also terrified of heights but Mamma told me to stop acting like a little girl and go conquer my fears. So I decided to join the army and go to Airborne school. Retired in 2013. Still hate heights but I'm ok as long as I have a parachute.


Yeah, my mom told me to slither around in the crawl space to get over my fear of spiders. I put her in a nursing home to get over her fear of dying alone.


----------



## Jpmboy

Quote:


> Originally Posted by *Dagamus NM*
> 
> That video gave me a bit of vertigo. I am not a fan of heights. I can do them if needed but I prefer to skulk around in basements. I think I could be a permanent subterranean dweller.
> 
> That said, that light isn't going to change itself.


Dizzying.
lol - I can't imagine that last 100 feet not resulting in "sewing machine leg". No ladder... and the safety rope does not look like a smart clip to me.


----------



## Asus11

Quote:


> Originally Posted by *arrow0309*
> 
> It was a joke about their "claiming" that the Vega GPU will be even more powerful than the upcoming Volta (I don't think it'll even overtake a 1080).
> Sorry if the joke was in bad taste for you


nice heatkiller block!!









Don't buy a used Titan XP, that is false economy! It is only warrantied to the original owner; you may save £100 but lose £1000


----------



## jsutter71

Quote:


> Originally Posted by *Dagamus NM*
> 
> Yeah, my mom told me to slither around in the crawl space to get over my fear of spiders. I put her in a nursing home to get over her fear of dying alone.


----------



## arrow0309

Quote:


> Originally Posted by *Asus11*
> 
> nice heatkiller block!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Don't buy a used Titan XP, that is false economy! It is only warrantied to the original owner; you may save £100 but lose £1000


Just ordered a new one!








I'll be back in London (from Italy) on 10 Jan, can't wait to get my new beastie


----------



## xTesla1856

Quote:


> Originally Posted by *arrow0309*
> 
> Just ordered a new one!
> 
> 
> 
> 
> 
> 
> 
> 
> I'll be back to London (from Italy) on 10 Jan, can't wait to get my new beastie


You finally bit the bullet, congratulations! Mine will be here on Monday, I'm refreshing the tracking page all day at work hehe


----------



## arrow0309

Quote:


> Originally Posted by *xTesla1856*
> 
> You finally bit the bullet, congratulations
> 
> 
> 
> 
> 
> 
> 
> ! Mine will be here on Monday, I'm refreshing the tracking page all day at work hehe


Thanks!
Btw, I see you have an MG279Q and still chose the Titan








I also have an "almost" perfect-panel PG279Q, long live the G-Sync
Tomorrow we're gonna buy ourselves a second TXP in order to power this new ROG Swift









http://videocardz.com/65369/asus-announces-swift-pg27uq-4k-ips-144hz-g-sync-monitor


----------



## xTesla1856

Quote:


> Originally Posted by *arrow0309*
> 
> Thanks!
> Btw, I see you have an MG279Q and still chosen the Titan
> 
> 
> 
> 
> 
> 
> 
> 
> I also have an "almost" perfect panel PG279Q, long live the g-sync
> Tomorrow we'll gonna buy ourselves a second txp in order to power this new Rog Swift
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://videocardz.com/65369/asus-announces-swift-pg27uq-4k-ips-144hz-g-sync-monitor


I had R9 Furys in Crossfire very briefly. I only used FreeSync maybe 3 times while I had the setup, and I saw no real benefits from using it. Depending on the game engine and its support, sometimes it wouldn't work at all. I think FreeSync and G-Sync are great if you can't achieve a high enough framerate with your card. Above, say, 90-100fps I see no tearing anyway, which in my opinion negates the need for any kind of sync. But remember, this is just my personal view, and I know there are lots of people who swear by G-Sync even with high-end rigs.









As for that 2nd Titan and that monitor, we'll see.....


----------



## MunneY

Quote:


> Originally Posted by *xTesla1856*
> 
> I had R9 Furys in Crossfire very briefly. I only used FreeSync maybe 3 times while I had the setup, and I saw no real benefits from using it. Depending on the game engine and its support, sometimes it wouldn't work at all. I think FreeSync and G-Sync are great if you can't achieve a high enough framerate with your card. Above, say, 90-100fps I see no tearing anyway, which in my opinion negates the need for any kind of sync. But remember, this is just my personal view, and I know there are lots of people who swear by G-Sync even with high-end rigs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As for that 2nd Titan and that monitor, we'll see.....


Okay, what? How does the game engine make any difference in how it works? It's supported through drivers; it has nothing to do with individual games, even on the G-Sync side.

There is a lot of opinion in that statement, and a lot of it is wrong.

I own both a high-end FreeSync rig and a high-end G-Sync rig, and it's VERY obvious when it's working vs when it's not... no matter the FPS.

on a completely separate and unrelated note... I think we've found @Baasha's new monitor!
https://www.pcper.com/news/Displays/CES-2017-Dell-Announces-UltraSharp-32-Ultra-HD-8K-Monitor


----------



## xTesla1856

Quote:


> Originally Posted by *MunneY*
> 
> Okay, what? How does the game engine make any difference in how it works? It's supported through drivers; it has nothing to do with individual games, even on the G-Sync side.
> 
> There is a lot of opinion in that statement, and a lot of it is wrong.
> 
> I own both a high-end FreeSync rig and a high-end G-Sync rig, and it's VERY obvious when it's working vs when it's not... no matter the FPS.
> 
> on a completely separate and unrelated note... I think we've found @Baasha's new monitor!
> https://www.pcper.com/news/Displays/CES-2017-Dell-Announces-UltraSharp-32-Ultra-HD-8K-Monitor


That's why I said it was my opinion; I don't claim to know the exact workings of either sync technology. Maybe if I got to properly experience G-Sync with a high-end display, my opinion would change, but as of now it stands. As for FreeSync not working with certain engines: I know for a fact that FreeSync does not work with GTA IV. No matter what I did, it never worked. There are also other game engines where FreeSync doesn't play along nicely. This has been discussed for ages on AMD's support forums as well as on reddit.


----------



## piee

I've got a QNIX UHD325: 10-bit, IPS, great color, same panel as the Samsung 32E850. It works flawlessly: turn vsync on, limit pre-rendered frames to 1, then lock the in-game FPS to the monitor refresh rate. No lag, no tearing, smooth 4K with the TXP.


----------



## Baasha

Quote:


> Originally Posted by *MunneY*
> 
> on a completely separate and unrelated note... I think we've found @Baasha's new monitor!
> https://www.pcper.com/news/Displays/CES-2017-Dell-Announces-UltraSharp-32-Ultra-HD-8K-Monitor


wheeeeeeeeeeeeeeeeeeeeeeee!
























I just hope it's not vaporware like the 4K 120hz OLED panel they announced last CES.


----------



## asheth007

Double post sorry


----------



## asheth007

Quote:


> Originally Posted by *Asus11*
> 
> only way to do it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> will probably wait 2 weeks then make a decision
> 
> cause tbh I cba with fiasco of 1080 ti its going to be slower no matter what unless they give it the same cores which I doubt
> 
> also theres going to be too much hassle with people buying them etc price hikes.. waiting for waterblocks annoying as hell just like the 1080
> 
> also the titans hold money well I would say especially this one because its limited from Nvidia only


This is my first post on OC.net, but I wanted to say that when I was researching, this was the post that made me decide to purchase a Titan X Pascal. Thank you!


----------



## Jpmboy

Quote:


> Originally Posted by *asheth007*
> 
> This is my first post on OC.net, but I wanted to say when I was researching this was the post that made me decide to purchase a Titan X pascal. Thank you!


Welcome to OCN!


----------



## CptSpig

Quote:


> Originally Posted by *asheth007*
> 
> This is my first post on OC.net, but I wanted to say when I was researching this was the post that made me decide to purchase a Titan X pascal. Thank you!


Welcome to OCN and you are going to love that Titan XP.


----------



## BelowAverageIQ

Hi Team,

My Pascal Titan X has been in the system nearly 2 weeks. Seems to be running well.

Using MSI Afterburner, the maximum I can get is Core +210 and Memory +450...........

On water, custom loop. Max temp is 42 degrees C with an ambient of 27 degrees C.

I have the power limit at 120% and Temp limit at 90 degrees.

Is it worth increasing the voltage slider at all?

Cheers


----------



## Silent Scone

Quote:


> Originally Posted by *BelowAverageIQ*
> 
> Hi Team,
> 
> My Pascal Titan X has been in the system nearly 2 weeks. Seems to be running well.
> 
> Using MSI Afterburner, the maximum I can get is Core +210 and Memory +450...........
> 
> On water, custom loop. Max temp is 42 degrees C with an ambient of 27 degrees C.
> 
> I have the power limit at 120% and Temp limit at 90 degrees.
> 
> Is it worth increasing the voltage slider at all?
> 
> Cheers


Probably not. You'll be power limited, and they don't scale very well with voltage.


----------



## arrow0309

Quote:


> Originally Posted by *BelowAverageIQ*
> 
> Hi Team,
> 
> My Pascal Titan X has been in the system nearly 2 weeks. Seems to be running well.
> 
> Using MSI Afterburner, the maximum I can get is Core +210 and Memory +450...........
> 
> On water, custom loop. Max temp is 42 degrees C with an ambient of 27 degrees C.
> 
> I have the power limit at 120% and Temp limit at 90 degrees.
> 
> Is it worth increasing the voltage slider at all?
> 
> Cheers


Nice temps at such a high ambient!
Do you mind letting me know how many rads and fans you have, plus their specs and positions?


----------



## BelowAverageIQ

Quote:


> Originally Posted by *arrow0309*
> 
> Nice temp at such high tamb
> 
> 
> 
> 
> 
> 
> 
> ,
> Do you mind letting me know how many rads & fans do ya have and their specs and positions?


I have 2 x RX360 on intake and 1 x AX360, for the GPU and CPU.

CPU is NOT overclocked.

I am using the new Corsair 120 fans for the rads and 1 x 140 for exhaust.

Coolant is Mayhems X1. GPU block is EK, Nickel. CPU Block is EK, also nickel.

All set in a CaseLabs S8


----------



## Enapace

I'm worried about how my temps are going to be running two Titan Pascals off an EK PE360 radiator, no CPU tho.


----------



## CptSpig

Quote:


> Originally Posted by *Enapace*
> 
> I'm worried about how my temps are going to be running two Titan Pascals off an EK PE360 radiator, no CPU tho.


The system will cool just fine, especially without the CPU block. From the EK installation manual:

http://s1164.photobucket.com/user/CptSpig/media/20161104_165443_zpsvbavyubd.jpg.html


----------



## Asus11

Quote:


> Originally Posted by *asheth007*
> 
> This is my first post on OC.net, but I wanted to say when I was researching this was the post that made me decide to purchase a Titan X pascal. Thank you!


no problem! glad I made your mind up!

I need to hurry up and buy one too but lately all I play is Rocket League *Sigh*


----------



## BelowAverageIQ

Nvidia US site showed "out of stock" yesterday. Maybe a run on them after the 1080Ti was not released, or something new coming out........................


----------



## ESRCJ

Could someone test the following? I started a thread recently about my Titan XP issues and I'm curious to see if anyone else experiences the same.

- OC your Titan XP however you want (but no underclocking), just make sure Power limit and Temp Limit are cranked up all the way.
- Run the Witcher 3, max settings (hairworks on or off, doesn't matter), at least 2560x1440 resolution.
- Go to a crowded part of a town and leave Geralt standing there. Let it run for an hour.

Does it crash your PC at some point? I'd like to know because I'm on my second Titan and both have crashed during this test.


----------



## arrow0309

Quote:


> Originally Posted by *gridironcpj*
> 
> Could someone test the following? I started a thread recently about my Titan XP issues and I'm curious to see if anyone else experiences the same.
> 
> - OC your Titan XP however you want (but no underclocking), just make sure Power limit and Temp Limit are cranked up all the way.
> - Run the Witcher 3, max settings (hairworks on or off, doesn't matter), at least 2560x1440 resolution.
> - Go to a crowded part of a town and leave Geralt standing there. Let it run for an hour.
> 
> Does it crash your PC at some point? I'd like to know because I'm on my second Titan and both have crashed during this test.


Hi, there's a pal on my Italian thread running the TXP air cooled. The max OC he achieved is 2000, but TW3 was crashing and he had to downclock to 1950; he was running the fans at 50°/60%, 60°/70%, 70°/80%.
However, since it was hellishly loud, he downclocked to 1900 afterwards and is keeping the fan at 1:1 with the temp (still getting up to 74°).
You have to put it under water


----------



## meson1

Quote:


> Originally Posted by *BelowAverageIQ*
> 
> Nvidia US site showed "out of stock" yesterday. Maybe a run on them after the 1080Ti was not released, or something new coming out........................


Or maybe ... and I'm going out on a limb here ... maybe they really are out of stock and are waiting for more.


----------



## MrKenzie

Quote:


> Originally Posted by *gridironcpj*
> 
> Could someone test the following? I started a thread recently about my Titan XP issues and I'm curious to see if anyone else experiences the same.
> 
> - OC your Titan XP however you want (but no underclocking), just make sure Power limit and Temp Limit are cranked up all the way.
> - Run the Witcher 3, max settings (hairworks on or off, doesn't matter), at least 2560x1440 resolution.
> - Go to a crowded part of a town and leave Geralt standing there. Let it run for an hour.
> 
> Does it crash your PC at some point? I'd like to know because I'm on my second Titan and both have crashed during this test.


My old 780 Ti and my current Titan XP have both done this, not necessarily in The Witcher. I don't think it's entirely uncommon.


----------



## DarkHell2

Quote:


> Originally Posted by *CallsignVega*
> 
> First installed.
> 
> 
> 
> 
> These EVGA 980Ti Hybrid kits are pretty nice quality. Glad I found them on Amazon for only $67.


Are the memory chips no longer cooled once the stock cooler is removed, or am I wrong?
I am new to modifying GPUs and want better cooling for my Titan XP.
This looks interesting.
Is EVERYTHING still getting cooled?

Or does someone have a better way to cool the TXP? At the moment I am not able to afford a complete water cooling system (guess why?!)

Thanks in advance!









Yours sincerely


----------



## roccale

Hi guys, I have a problem.
My EK waterblock for the TXP has arrived, but they didn't deliver the backplate for the back of the card.
Is it essential for cooling the card, or can I use the original one for the moment?
Thx in advance.


----------



## xTesla1856

Quote:


> Originally Posted by *roccale*
> 
> Hi guys, I have a problem.
> My EK waterblock for the TXP has arrived, but they didn't deliver the backplate for the back of the card.
> Is it essential for cooling the card, or can I use the original one for the moment?
> Thx in advance.


The EK backplate is sold separately and doesn't come included with your waterblock. You can continue using the stock backplate until you order the EK one, or you can remove the backplate altogether.


----------



## ESRCJ

Quote:


> Originally Posted by *MrKenzie*
> 
> My old 780Ti and my current Titan XP have both done this, not necessarily in the Witcher. I don't think it is entirely uncommon..


I have a GTX 980 Ti as a backup card and it never crashes when I play The Witcher 3, even at a heavy OC (~1500MHz).


----------



## roccale

Thx man.


----------



## arrow0309

Quote:


> Originally Posted by *xTesla1856*
> 
> Quote:
> 
> 
> 
> Originally Posted by *roccale*
> 
> Hi guys, i have a problem.
> My ek waterblock for txp is arrived, but they don't delivery black plate for the back of the vga.
> It is essential for the vga cooling or i can use the original for the moment?
> Thx in advance.
> 
> 
> 
> The EK backplate is sold separately and doesn't come included with your waterblock. You can continue using the stock backplate until you order the EK one, or you can remove the backplate altogether.

I don't think it's possible to keep the stock backplate, except maybe with a modification that I wouldn't advise anyway.


----------



## CptSpig

Quote:


> Originally Posted by *roccale*
> 
> Hi guys, i have a problem.
> My ek waterblock for txp is arrived, but they don't delivery black plate for the back of the vga.
> It is essential for the vga cooling or i can use the original for the moment?
> Thx in advance.


You can use the original back plate, but you need to use the original hex-head screws from the shroud. Here is a picture of my card with the original back plate.
http://s1164.photobucket.com/user/CptSpig/media/Mobile Uploads/20161029_221954_zpszpkrc7de.jpg.html


----------



## roccale

@ CptSpig:

Ooooo OK, thx man, I understand perfectly.
You have a Predator 240 AIO with the Pascal EKWB, or am I wrong?
Are temperatures acceptable?
I have the same config, but with a 6700K/7700K... and I'm assembling my Pascal WB this weekend. What should I expect from the 240?
Thx


----------



## roccale

@ CptSpig:

Do I have to put the plastic washers under the hex screws?


----------



## CptSpig

Quote:


> Originally Posted by *roccale*
> 
> @ CptSpig:
> 
> Do I have to put the plastic washers under the hex screws?


I have a Predator 360 1.1c and temps are amazing: 32C idle and 54C max on the CPU, 24C idle and 44C max on the GPU. I put them in as they were originally installed; I did not use the washers.


----------



## jhowell1030

Quote:


> Originally Posted by *CptSpig*
> 
> I have a Predator 360 1.1c and temps are amazing 32c Idle and 54c max on the cpu, 24c idle and 44c max on the gpu. I put them in as they were originally installed. I did not use the washers?


Are you asking for the block on the card or on the radiator?


----------



## ESRCJ

Is anyone here using pigtail cables from their PSU to their Titan XP? In other words, one 8-pin from the PSU that splits into 2x 6+2 pin for the graphics card? Could this somehow limit power for the card? I'm asking because I've had my Titan XP connected this way since I got it and it always crashes in The Witcher 3 when I increase the power limit above stock.


----------



## jhowell1030

I do not for the XP but I did a while back for one of my 980 KINGPINS. I was having similar issues with games either crashing or being inconsistent. Upgraded to a new PSU that allowed me to hook it up properly and all of my issues went away.


----------



## xTesla1856

Quote:


> Originally Posted by *CptSpig*
> 
> I have a Predator 360 1.1c and temps are amazing 32c Idle and 54c max on the cpu, 24c idle and 44c max on the gpu. I put them in as they were originally installed. I did not use the washers?


How is pump noise with the predator when cooling a CPU and a GPU? I am planning on using your exact setup with a Predator 360.


----------



## Dagamus NM

Quote:


> Originally Posted by *gridironcpj*
> 
> Is anyone here using pigtail cables from their PSU to their Titan XP? In other words, one 8-pin from the PSU that splits into 2x 6+2 pin for the graphics card? Could this somehow limit power for the card? I'm asking because I've had my Titan XP connected this way since I got it and it always crashes in The Witcher 3 when I increase the power limit above stock.


This absolutely could be your problem. When you up the power, what goes up? Current, and you can only push so much across those cables. Yes, you need separate cables for each connector. Honestly, I am surprised that the TXP has a six-pin and an eight-pin instead of two eight-pins. Many others have voiced this; the smaller configuration is good for what it is rated at, but that is about it.

Power hungry cards pull a lot of power. When I was running a pair of 295x2s I had to upgrade my power supply substantially. Needed a single rail 1600W for the two cards and cpu. Multi rail did not have enough current on any one rail and the same thing would happen in dying light.

What PSU are you running?
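To put rough numbers on the current point: the PCI-E spec budgets are 75W for the slot, 75W for a 6-pin, and 150W for an 8-pin, on a nominal 12V rail, and the TXP is a 250W-rated card. A quick sketch (spec figures only; actual per-wire gauge and PSU cable ratings vary):

```python
# Rough current math for the TXP's power inputs.
# Spec budgets: slot 75 W, 6-pin 75 W, 8-pin 150 W; 12 V nominal rail.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150
RAIL_V = 12.0
TDP_W = 250  # Titan X Pascal rated board power

def target_draw(power_limit_pct, tdp_w=TDP_W):
    """Board power the card may pull at a given power-limit setting."""
    return tdp_w * power_limit_pct / 100.0

spec_budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # combined spec budget in W
draw_120 = target_draw(120)                      # draw at the 120% power limit

print(f"spec budget: {spec_budget} W, draw at 120%: {draw_120:.0f} W")
print(f"8-pin at full spec load: {EIGHT_PIN_W / RAIL_V:.1f} A on the 12 V rail")
```

At 120% the card is already sitting right at the combined 300W spec budget, which is why a single pigtailed PSU cable feeding both plugs leaves no margin.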


----------



## ESRCJ

Quote:


> Originally Posted by *Dagamus NM*
> 
> This absolutely could be your problem. When you up the power what goes up? Current and you can only push so much across those cables. Yes you need separate cables for each connector. Honestly, I am surprised that the TXP has a six pin and an eight pin instead of two eight pins. Many others have voiced this as the smaller configuration is good for what it is rated but that is about it.
> 
> Power hungry cards pull a lot of power. When I was running a pair of 295x2s I had to upgrade my power supply substantially. Needed a single rail 1600W for the two cards and cpu. Multi rail did not have enough current on any one rail and the same thing would happen in dying light.
> 
> What PSU are you running?


It's a Corsair AX1200i. I was using the pigtail cable to keep things cleaner. They came with the supported Corsair cable kit that I purchased, so I figured it wouldn't be an issue. I'm going to test stability with two cables instead of the single pigtail cable. I'll report back with the results *fingers crossed*


----------



## Enapace

Quote:


> Originally Posted by *Dagamus NM*
> 
> This absolutely could be your problem. When you up the power what goes up? Current and you can only push so much across those cables. Yes you need separate cables for each connector. Honestly, I am surprised that the TXP has a six pin and an eight pin instead of two eight pins. Many others have voiced this as the smaller configuration is good for what it is rated but that is about it.
> 
> Power hungry cards pull a lot of power. When I was running a pair of 295x2s I had to upgrade my power supply substantially. Needed a single rail 1600W for the two cards and cpu. Multi rail did not have enough current on any one rail and the same thing would happen in dying light.
> 
> What PSU are you running?


After hearing this I'm worried that my EVGA G3 1000W might not be enough power for my 5930K and Titan Pascal SLI


----------



## CptSpig

Quote:


> Originally Posted by *xTesla1856*
> 
> How is pump noise with the predator when cooling a CPU and a GPU? I am planning on using your exact setup with a Predator 360.


I do not hear the pump at all, just the fans when the load gets extreme.


----------



## TheGeneralLee86

Quote:


> Originally Posted by *Enapace*
> 
> After hearing this i'm worried that my EVGA G3 1000W might not be enough power for my 5930K and Titan Pascal SLI


You should have no problem with it. I am running 2 Titan X Pascals with an i7-6950X Extreme Edition on an EVGA 1000 G2 SuperNOVA, and I have not had a problem since I built it, which was at Thanksgiving.


----------



## Enapace

Quote:


> Originally Posted by *TheGeneralLee86*
> 
> You should have no problem with it because I am running 2 Titan X pascals with i7 6950X extreme edition with a evga 1000g2 supernova and have not had a problem since I built which has been since thanksgiving.


Nice, thanks man. I know the G3 is meant to be even better for power regulation, especially given its small size; I couldn't believe how tiny it was.


----------



## ESRCJ

Quote:


> Originally Posted by *gridironcpj*
> 
> It's a Corsair AX1200i. I was using the pigtail cable to keep things cleaner. They came with the supported Corsair cable kit that I purchased, so I figured it wouldn't be an issue. I'm going to test stability with two cables instead of the single pigtail cable. I'll report back with the results *fingers crossed*


It's still crashing in The Witcher 3 with only the power limit and temp limit turned to the max. Could it be possible that the Titan XP is drawing too much power from the motherboard when overclocked? I've heard some mobos can't handle too much of a power draw from the PCI-E ports. For reference, my motherboard is an MSI X99A Titanium. I would think my mobo would be fine since it's essentially MSI's flagship (aside from the Godlike, which is the same board with RGB).

Also, note that I've tried the same test with an MSI Gaming GTX 980 Ti overclocked to 1500MHz. It's completely stable, no crashes. I'll note that the 980 Ti has two 8-pins, as opposed to the Titan XP's 8-pin and 6-pin.


----------



## Dagamus NM

Quote:


> Originally Posted by *Enapace*
> 
> After hearing this i'm worried that my EVGA G3 1000W might not be enough power for my 5930K and Titan Pascal SLI


Nah, you are fine. This is the perfect size PSU for your setup.

I run a G2 1600W for four of them. Hopefully I will get to boot it for the first time tonight. The G2 1600W has been perfect for quad 780TIs and quad 980Tis. I expect no different for the TXPs.

But yeah, run two separate cables from the PSU to the GPU.

The G3 is single rail?


----------



## roccale

I mounted my waterblock and I can confirm that the thermal pads provided with the EK Titan X Pascal kit are not thick enough to let the block actually touch the memory chips.
I did several tests, and in the end I absolutely had to reuse the original Nvidia thermal pads taken from the stock heatsink.
Their thickness is at least 3 times that of the EK pads.
The problem is all in the point of contact with the GPU: it sits too high. In fact, if you look closely, the standoffs the screws thread into remain clearly separated from the PCB.
Be careful not to tighten too much...
After 3 test mounts with the EK thermal pads, the result was a gap of about 1mm between the pads and the block.
I checked this against the light, and the gap was obvious!
Pay attention to your memory; it is always better to check, to prevent failures or system crashes.
Thanks for your attention.


----------



## Dagamus NM

Quote:


> Originally Posted by *gridironcpj*
> 
> It's still crashing in The Witcher 3 with only the power limit and temp limit turned to the max. Could it be possible that the Titan XP is drawing too much power from the motherboard when overclocked? I've heard some mobos can't handle too much of a power draw from the PCI-E ports. For reference, my motherboard is an MSI X99A Titanium. I would think my mobo would be fine since it's essentially MSI's flagship (aside from the Godlike, which is the same board with RGB).
> 
> Also, note that I've tried the same test with an MSI Gaming GTX 980 Ti overclocked to 1500MHz. It's completely stable, no crashes. I'll note that the 980 Ti has two 8-pins, opposed to the Titan XP's 8 and 6 pins.


Stock clocks? Toy with the voltage maybe?


----------



## Enapace

Quote:


> Originally Posted by *Dagamus NM*
> 
> Nah, you are fine. This is the perfect size PSU for your setup.
> 
> I run a G2 1600W for four of them. Hopefully I will get to boot it for the first time tonight. The G2 1600W has been perfect for quad 780TIs and quad 980Tis. I expect no different for the TXPs.
> 
> But yeah, run two separate cables from the PSU to the GPU.
> 
> The G3 is single rail?


I'm honestly not sure if it is or not; I just kept hearing good things about it.

AC Input: 100 - 240 VAC, 15A, 50 - 60 Hz
DC Output:  +3.3V | +5V | +12V  | +5Vsb | -12V
Max Output: 24A   | 24A | 83.3A | 3A    | 0.5A
Combined:   120W (3.3V + 5V) | 999.6W | 15W | 6W
Output Power: 1000W @ +50C

Those are the stats copied from the EVGA site.


----------



## ESRCJ

Quote:


> Originally Posted by *Dagamus NM*
> 
> Stock clocks? Toy with the voltage maybe?


I'm letting GPU Boost do its thing. The clocks bump up to 1850-1890 or so just with Afterburner running. I'm curious to know if this is just an issue with Titan XPs, because I've gone through two of them and they both had the same issue. Granted, both were taken apart to install the water block. Either I'm the worst at installing a water block and damaged both cards, or there is something inherently wrong with an overclocked Titan XP (max power limit and max temp limit) and The Witcher 3. YouTube videos suggest the latter is likely not true, but it's also possible they didn't play long enough to induce a crash (most gameplay footage is less than 10 minutes).


----------



## Dagamus NM

Quote:


> Originally Posted by *Enapace*
> 
> I'm honestly not sure if it is or not I just kept hearing good things about it.
> 
> AC Input 100 - 240 VAC, 15A, 50 - 60 Hz
> DC Output +3.3V +5V +12V +5Vsb -12V
> MAX Output 24A 24A 83.3A 3A 0.5A
> 83.3A
> Combined 120W 999.6W 15W 6W
> Output Power 1000W @ +50C
> 
> Thats stats copied for EVGA Site


It is single rail. The 83.3A on the +12V rail tells the story. Plenty of output. The 1600W has an output of ~130A off the top of my head.
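The rail math above checks out; a tiny sketch makes the headroom concrete (the CPU and per-card wattage figures below are my own illustrative assumptions, not measurements):

```python
# Rough PSU headroom check for a single-rail unit.
# The 12V rail rating (83.3A) times 12V gives the usable wattage
# for CPU + GPUs, matching EVGA's 999.6W combined figure.

def rail_watts(amps: float, volts: float = 12.0) -> float:
    """Maximum continuous wattage available on a rail."""
    return amps * volts

def headroom(rail_w: float, loads_w: list[float]) -> float:
    """Wattage left on the rail after the listed loads."""
    return rail_w - sum(loads_w)

g3_12v = rail_watts(83.3)  # ~999.6W on the +12V rail
# Assumed loads: ~140W for an overclocked 5930K, ~300W per
# overclocked Titan XP (stock TDP is 250W) -- illustrative only.
left = headroom(g3_12v, [140.0, 300.0, 300.0])
print(f"12V rail: {g3_12v:.1f}W, headroom after SLI load: {left:.1f}W")
```

Even with generous overclocking allowances, a quality 1000W single-rail unit leaves a couple hundred watts spare for a 5930K + SLI setup.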

Nice that they reduced the footprint because the G2 is ridiculously large and heavy. I mean, for what you get it is not surprising but it looks like PSU manufacturers are focusing on quality as opposed to quantity (in Watts).

I might upgrade PSUs this year, that and a Skylake-X setup. GPUs again next year unless Nvidia really pulls out something amazing.
Quote:


> Originally Posted by *gridironcpj*
> 
> I'm letting GPU boost do it's thing. The clocks bump up to 1850-1890 or so just with Atferburner running. I'm curious to know if this is just an issue with Titan XPs because I've gone through two of them and they both had the same issue. Granted, both were taken apart to install the water block. Either I'm the worst at installing a water block and damaged both cards or there is something inherently wrong with Titan XP overclocked (max power limit and max temp limit) and The Witcher 3. Youtube videos show the latter is likely not true, but it's also possible they didn't play long enough to induce a crash (most gameplay footage is less than 10 minutes).


Not likely. The 980 Ti you tried was on the exact same system? No changes to memory, CPU, or anything? I have The Witcher 3 but haven't played more than five minutes of it. The game was unplayable on the 295X2s so I moved on. I don't have much time for games anyhow.

Don't get me wrong, you could really be bad at installing waterblocks. I don't want to take away from your glory on that, but it just isn't likely that you have three duds.


----------



## Dagamus NM

derp


----------



## ESRCJ

Quote:


> Originally Posted by *Dagamus NM*
> 
> Not likely, the 980Ti you tried was one the exact system? No changes to memory, CPU or anything? I have the witcher 3 but have not played but five minutes on it. The game was unplayable on the 295x2s so I moved on. I don't have much time for games anyhow.
> 
> Don't get me wrong, you could really be bad at installing waterblocks. I don't want to take away from your glory on that, but it just isn't likely that you have three duds.


Yes, the GTX 980 Ti was tried on the exact same system. It's in there right now. Everything is stable at a heavy OC. I have no glory with installing water blocks. It's a pain since I'm not the best at handling such small components. Technically, 2 duds. The replacement card for the first was through an RMA and it wouldn't hit 1900MHz and was hitting the voltage limit without overclocking. Luckily I was able to refund since it was still within 30 days. The third card was brand new and worked fine before adding the water block. It also passed my Witcher 3 test a few hours after the water block was installed. Somehow, something went wrong after that.


----------



## Dagamus NM

Quote:


> Originally Posted by *gridironcpj*
> 
> Yes, the GTX 980 Ti was tried on the exact same system. It's in there right now. Everything is stable at a heavy OC. I have no glory with installing water blocks. It's a pain since I'm not the best at handling such small components. Technically, 2 duds. The replacement card for the first was through an RMA and it wouldn't hit 1900MHz and was hitting the voltage limit without overclocking. Luckily I was able to refund since it was still within 30 days. The third card was brand new and worked fine before adding the water block. It also passed my Witcher 3 test a few hours after the water block was installed. Somehow, something went wrong after that.


So strange. Maybe it is an issue with the Witcher install itself, or maybe some driver issue. I am installing the game right now on my SLI TXP setup. I will try my quad setup later.

Any other benchmarks that you can run to see?

All other games are fine?


----------



## jsutter71

Quote:


> Originally Posted by *gridironcpj*
> 
> Yes, the GTX 980 Ti was tried on the exact same system. It's in there right now. Everything is stable at a heavy OC. I have no glory with installing water blocks. It's a pain since I'm not the best at handling such small components. Technically, 2 duds. The replacement card for the first was through an RMA and it wouldn't hit 1900MHz and was hitting the voltage limit without overclocking. Luckily I was able to refund since it was still within 30 days. The third card was brand new and worked fine before adding the water block. It also passed my Witcher 3 test a few hours after the water block was installed. Somehow, something went wrong after that.


Could you please be more specific about what you mean by "crash"? Are you talking system lockups? BSODs? Shutdowns or restarts? I ask because I have had several instances where my system would shut down and restart when running graphics-intensive benchmarks like Heaven or 3DMark. In my situation this only happens in SLI, and only if I connect the 4 PCIe power cables on my power supply in slots 1-4. Previously I was running three 980 Tis and never had any problems with them.


----------



## ESRCJ

Quote:


> Originally Posted by *Dagamus NM*
> 
> So strange. Maybe it is an issue with the Witcher install itself, or maybe some driver issue. I am installing the game right now on my sli TXP setup. I will do my quad later.
> 
> Any other benchmarks that you can run to see?
> 
> All other games are fine?


I re-installed The Witcher 3 just to be sure and that didn't make a difference. I tried different SSDs as well. The other games where I've experienced similar crashes are Battlefield 1 and Rise of the Tomb Raider. The card passed the Time Spy stress test a few hours after I installed the water blocks, but eventually I started noticing problems, such as YouTube videos and videos saved to my PC not playing or playing at 1FPS. Putting the stock air cooler back on the card fixed the video playback issue, but the crashing still persists. Non-demanding games are 100% fine. If the power slider isn't touched, the card is fine for ALL applications. When the power slider is increased to the max, that's when those games eventually crash.

Quote:


> Originally Posted by *jsutter71*
> 
> Could you please be more specific about what you mean by "crash"? Are you talking system lockups? BODs? shutdowns or restarts? I ask because I have had several instances where my system would shutdown and restart when running graphic intensive benchmarks like heaven or 3dmark. In my situation this only happens in SLI and if I connect the 4 PCIe power cables on my power supply in slots 1-4. Previously I was running three 980Ti's and never had any problems with them.


There are two crashes that I've experienced: (i) the PC randomly restarts itself during gameplay (less common) or (ii) the PC completely freezes and I'm forced to do a hard shut down. On the latter, I checked to see how long the screen would stay frozen. This was a mistake, as the screen eventually started flickering. The flickering persisted after the restart, so I had to reinstall the Nvidia drivers to fix that. The crashing also persists regardless of whether or not g-sync is on.

I wonder if an overclocked Titan XP is simply drawing too much power from the motherboard? Or maybe only in certain applications/games? That's why I was hoping some owners would attempt my Witcher 3 test: max settings, at least 1440p, power limit set to the max (additional overclocks optional), and let Geralt stand there for up to an hour. Mine usually crashed between 10 and 40 minutes.
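For anyone running that soak test, it helps to log telemetry alongside it so a crash leaves a trail. A minimal sketch, assuming `nvidia-smi` is on the PATH; the field list, file name, and polling interval are my own choices, not part of any official test:

```python
import csv
import subprocess
import time

# Fields to poll each sample: timestamp, graphics clock,
# board power draw, and GPU temperature.
QUERY = "timestamp,clocks.gr,power.draw,temperature.gpu"

def parse_sample(line: str) -> list[str]:
    """Split one CSV line from nvidia-smi into trimmed fields."""
    return [f.strip() for f in line.split(",")]

def read_sample() -> list[str]:
    """Poll nvidia-smi once and return the parsed fields."""
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return parse_sample(out)

def log_soak(path: str, minutes: int = 60, interval_s: int = 5) -> None:
    """Append one sample per interval; the last rows before a
    freeze show the clocks/power/temps at the moment it died."""
    with open(path, "a", newline="") as fh:
        writer = csv.writer(fh)
        for _ in range(minutes * 60 // interval_s):
            writer.writerow(read_sample())
            fh.flush()  # make sure rows survive a hard power-off
            time.sleep(interval_s)

# e.g. start it before the Geralt soak: log_soak("soak_log.csv")
```

If the last logged rows show clocks and power pinned at the limits, that points the same direction as the power-limit theory; if they look normal right up to the freeze, it's something else.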


----------



## roccale

Check against the light whether your memory chips are making good contact with the waterblock, or whether there is a gap letting light through.
The original thermal pads in my EK kit were outrageously thin.
The memory stayed about 1mm away from the block despite the thermal pads...
It follows that the memory doesn't dissipate heat at all...
Maybe that's why so many people can't exceed +500MHz on the memory... I get to +850MHz.


----------



## AndreTM

What about buying a TX now? In your opinion, could the upcoming cards (1080 Ti and Vega) outperform this monster? I'm a bit afraid because a few years ago NVIDIA did exactly this with the 780 Ti.


----------



## MunneY

Quote:


> Originally Posted by *AndreTM*
> 
> What about buying a TX now? In your opinion could upcoming cards (1080Ti and Vega) outperform this monster? I'm a bit afraid because few years ago NVIDIA did this with the 780Ti.


The Ti cards will always be close to the Titan parts, but beating it is a different story. Since Nvidia sold the Titan X exclusively while partners like EVGA will handle the Ti cards, custom boards like the Classifieds and Lightnings might come close to beating it. As for Vega, I seriously doubt it'll come close, even though it needs to.


----------



## jsutter71

Quote:


> Originally Posted by *gridironcpj*
> 
> I re-installed The Witcher 3 just to be sure before and that didn't make a difference. I tried different SSDs as well. The other games I've experienced similar crashes are Battlefield 1 and Rise of the Tomb Raider. The card passed the Time Spy stress test a few hours after I installed the water blocks, but eventually I started noticing problems, such as Youtube videos and videos saved to my PC not playing or playing at 1FPS. Putting the stock air cooler back on the card fixed the video playback issue, but the crashing still persists. Non-demanding games are 100% fine. If the power slider isn't touched, the card is fine for ALL applications. When the power slider is increased to the max, that's when the said games experience crashes eventually.
> There are two crashes that I've experienced: (i) the PC randomly restarts itself during gameplay (less common) or (ii) the PC completely freezes and I'm forced to do a hard shut down. On the latter, I checked to see how long the screen would stay frozen. This was a mistake, as the screen eventually started flickering. The flickering persisted after the restart, so I had to reinstall the Nvidia drivers to fix that. The crashing also persists regardless of whether or not g-sync is on.
> 
> I wonder if an overclocked Titan XP is simply drawing too much power from the motherboard? Or maybe only in certain applications/games? That's why I was hoping some owners would attempt my Witcher 3 test: max settings, at least 1440p, power limit set to the max (additional overclocks optional), and let Geralt stand there for up to an hour. Mine usually crashed between 10 and 40 minutes.


Interesting. In my situation I am using an EVGA SuperNOVA T2 1600 power supply, which I have RMA'd. Unfortunately the same thing happens with both the old and the new power supply. In my situation the system just shuts down completely and restarts. I have discovered a fix, which suggests that it is a power supply problem, but I find it strange that the problem was not resolved by the RMA. I have tried a wide variety of stress tests and have found that my solution is mostly stable. I say mostly because if I overclock my CPU past a certain point, it will exhibit the exact same behavior. Please see the attached pics. I am also linking my build log; the last 2 pages describe the problem in more detail.
http://www.overclock.net/t/1595092/my-new-sth10-build/220_20


----------



## jsutter71

Quote:


> Originally Posted by *AndreTM*
> 
> What about buying a TX now? In your opinion could upcoming cards (1080Ti and Vega) outperform this monster? I'm a bit afraid because few years ago NVIDIA did this with the 780Ti.


The old expression is that if you wait 10 minutes your brand new system will already be obsolete. Just get the best most powerful system you can afford which suits your needs and enjoy what you have. The other alternative is selling a kidney on the black market. A quick internet search shows the current going rate is $262,000.


----------



## CptSpig

Quote:


> Originally Posted by *jsutter71*
> 
> The old expression is that if you wait 10 minutes your brand new system will already be obsolete. Just get the best most powerful system you can afford which suits your needs and enjoy what you have. The other alternative is selling a kidney on the black market. A quick internet search shows the current going rate is $262,000.


----------



## AndreTM

I'm not a father yet, so I don't have one or more kids to sell, sorry.








BTW, I understood what you said; my worry is only about the possibility of a full-CUDA-core, unlocked version of Pascal at a lower price than the TITAN X. Exactly like what happened with the 780 Ti.

I know that no one has a crystal ball, so my question is more like: "Would you buy a TITAN X now that it has been out since August, or would you wait?"


----------



## jhowell1030

Quote:


> Originally Posted by *AndreTM*
> 
> I'm not a father yet so I don't have one or more kids to sell sorry.
> 
> 
> 
> 
> 
> 
> 
> 
> BTW I understood what you are said, my worries are only about the possibility of a full CUDA cores unclocked version of Pascal at a lower price than the TITAN X. Exactly like it happened with the 780Ti.
> 
> I know that no one has got the crystal sphere so my question is more like: "Would you buy a TITAN X now that is out since August or would you wait?"


My question is: why do you think they would unlock all of the CUDA cores on the 1080 Ti? Have they done anything similar in the past?


----------



## jsutter71

Quote:


> Originally Posted by *jhowell1030*
> 
> My question is why you think they would unlock all of the cuda cores on the 1080ti? Have they done similar in the past?


No, because then you would have a Quadro P6000, and they'd charge an additional $2K for it.


----------



## AndreTM

Quote:


> Originally Posted by *jhowell1030*
> 
> My question is why you think they would unlock all of the cuda cores on the 1080ti? Have they done similar in the past?


Yeah, with the 780 Ti:
2880 CUDA cores vs. the 2688 of the TITAN that was on the market.

Afterwards, NVIDIA also updated the TITAN with the same number of CUDA cores (the TITAN Black).


----------



## jsutter71

Quote:


> Originally Posted by *AndreTM*
> 
> Yeah, with the 780Ti.
> 2880 CUDA cores VS 2688 of the TITAN that was on the market.
> 
> After NVIDIA updated also the TITAN with the same number of CUDA cores (TITAN Black).


The TXP and the P6000 are basically the same card, apart from the P6000's extra 2 SMs enabled, its double memory, and of course the drivers. Nvidia won't be enabling those extra 2 SMs. That makes no sense.
http://www.babeltechreviews.com/pascal-titan-x-vs-quadro-p6000/


----------



## jhowell1030

Quote:


> Originally Posted by *jsutter71*
> 
> The TXP and the P6000 are basically the same card minus the extra 2 SMs enabled, double the memory, and of course the drivers. Nvidia won't be enabling those extra 2 SMs. That makes no since.
> http://www.babeltechreviews.com/pascal-titan-x-vs-quadro-p6000/


Not to mention that the original Titan and 780 Ti were not only 9 months apart (almost a whole generation), but wasn't the original Titan the first card to ship the big (at the time) Kepler chip? Seeing as the 10-series cards had been out for a bit before the Titan X launched, I'd be surprised to see a fully unlocked chip unless it was right before Volta... if that.


----------



## AndreTM

Yeah you're right, thanks guys.
Which waterblock do you suggest for this monster?


----------



## jsutter71

Quote:


> Originally Posted by *AndreTM*
> 
> Yeah you're right, thanks guys.
> Which waterblock do you suggest for this monster?


Lot of people here LOVE Watercool but I'm happy with my EK-FC Nickel blocks


----------



## jhowell1030

Quote:


> Originally Posted by *jsutter71*
> 
> Lot of people here LOVE Watercool but I'm happy with my EK-FC Nickel blocks


Love how clean those LEDs are!


----------



## arrow0309

Quote:


> Originally Posted by *jsutter71*
> 
> Quote:
> 
> 
> 
> Originally Posted by *AndreTM*
> 
> Yeah you're right, thanks guys.
> Which waterblock do you suggest for this monster?
> 
> 
> 
> Lot of people here LOVE Watercool but I'm happy with my EK-FC Nickel blocks
Click to expand...

Nice setup!

+Rep!


----------



## arrow0309

Speaking of TITAN X and w.blocks, look what I've just found at home today, coming back from Rome!


----------



## AndreTM

Quote:


> Originally Posted by *jsutter71*
> 
> Lot of people here LOVE Watercool but I'm happy with my EK-FC Nickel blocks


Beautiful! I'm a fan of nickel backplates!


----------



## shonik09

I have an EK block on my Titan XP too, works a treat


----------



## jsutter71

*AndreTM, arrow0309, jhowell1030, Thank you Gentlemen. And may I also congratulate arrow0309. Welcome to the club*.









*It's looking like a tradition for people to show off their new TXPs when they arrive. I know I did the same.*


----------



## arrow0309

Quote:


> Originally Posted by *jsutter71*
> 
> *AndreTM, arrow0309, jhowell1030, Thank you Gentlemen. And may I also congratulate arrow0309. Welcome to the club*.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *It's looking more like a tradition for people to pose their new TXP's when they arrive. I know I did the same.*


Thank you sir!








Now my Swift PG279Q can finally feel happy








Do you all use the latest drivers?


----------



## ESRCJ

Regarding the GTX 1080 Ti, it will probably be a cut-down GP102. I believe the rumors we heard earlier about its CUDA core count were true, but Nvidia scrapped the CES launch since a Vega launch is still a bit off. Nvidia wants to sell as many Titan XPs as possible; notice how the Titan XP was sold out for a while over the weekend. Nvidia's strategy worked. Now, if Vega somehow outperforms the Titan XP, then expect a card with a full GP102 sooner rather than later. This is actually the longest Nvidia has gone between releasing a Titan and releasing a cut-down version.

Kepler: GTX Titan (cut-down GK110) launched in Q1 2013 and GTX 780 (further cut-down GK110) launched in Q2 2013. GTX 780 Ti (full GK110) launched in Q4 2013, but only had 3GB of VRAM. Titan Black released shortly after with the full GK110 and 6GB of VRAM.

Maxwell: GTX Titan X (full GM200) launched in Q1 2015 and GTX 980 Ti (cut-down GM200) launched in Q2 2015

My guess is Nvidia will bring a full GP102 towards the end of the Pascal era in the form of another Titan, unless AMD can somehow beat the Titan XP as I mentioned; then we might see it in a non-Titan card. I also think it's possible we'll see a Pascal refresh sort of like the Kepler refresh, where the '80 is bumped down to the '70 spot and the '80 and '80 Ti become GP102 cards. It actually makes a lot of sense to do this if the RX 590 (placeholder name) beats the GTX 1080 at the price point of a GTX 1070.


----------



## asheth007

There is a new hot fix driver update today

http://nvidia.custhelp.com/app/answers/detail/a_id/4293

This is GeForce Hot Fix driver version 376.60 that addresses the following:

Battlefield 1 crash on some Kepler based GPUs
Dark puddles in Battlefield 1
Random black screen in DOTA 2

So MSI Afterburner and GPU-Z are showing my idle clocks lower than normal.

I'm on the stock blower currently. I ran a couple of Fire Strike runs and saw in the overlay that, between loading each test, the clocks do indeed drop to these numbers, letting the temps drop lower than normal at idle. I'm currently at http://i.imgur.com/ZAdI2dS.jpg sitting at 26C; even reverting to the stock out-of-the-box settings lets me sit at 26C with the stock fan curve, where previously I would idle around 40-43C on the stock fan curve. My 3DMark scores aren't far off my previous high, lower by about 300 points. The differences are definitely within margin of error, 1-2% across the board.


----------



## BelowAverageIQ

Quote:


> Originally Posted by *asheth007*
> 
> There is a new hot fix driver update today
> 
> http://nvidia.custhelp.com/app/answers/detail/a_id/4293
> 
> This is GeForce Hot Fix driver version 376.60 that addresses the following:
> 
> Battlefield 1 crash on some Kepler based GPUs
> Dark puddles in Battlefield 1
> Random black screen in DOTA 2
> 
> So MSI afterburner and GPU-Z are showing my idle clocks lower than normal.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm on the stock blower currently. I ran a couple of firemark runs and saw that in the overlay between loading each test it does indeed drop to these numbers. Allowing the temps to actually drop lower than normal when idle I'm currently at http://i.imgur.com/ZAdI2dS.jpg I'm sitting at 26c currently usually even reverting to the stock out of the box settings allows me to sit at 26c with a stock fan curve where previously I would sit around 40-43c stock fan curve. My 3Dmark scores aren't far off of my previous high they are lower by about 300 points. The differences are definitely within margin of error though 1-2% across the board.


Strange, I have had low idle clocks since doing a fresh install with the drivers. Currently on 376.33


----------



## roccale

I had always this idle clocks...


----------



## ChronoBodi

Ok, I gave up on waiting for the 1080 Ti and ordered a Titan XP. Yeah yeah, call me a hypocrite; I've made posts about how expensive it is and all that, but I don't care.

I chose standard shipping and used guest checkout. How long does it usually take Nvidia to ship? This is obviously my first time buying direct from Nvidia, no Gigabyte or Asus or anything, just straight through Nvidia.

I've gotten two emails about the order so far, so the third email should be the shipping confirmation, correct?

Also, should I get the newest version of MSI Afterburner as well?


----------



## ESRCJ

So I finally solved my problem described in this thread about the card crashing my PC when overclocked playing games like The Witcher 3. Here is what I posted in my dedicated thread (I'm copying it here so more of you see this just in case you're having problems):

Problem solved! I was right in that the Titan XP pulls "too much" power from the motherboard when overclocked. Apparently my motherboard has an additional 6-pin power connector for the PCI-E slots, which was sort of hidden. I plugged one in and my problem vanished. The card was fully stable with The Witcher 3 with a solid OC.

I also noticed something very interesting: Before this fix, I noticed the voltage limit was always hit when the card was overclocked, according to MSI Afterburner's monitoring. After the fix, there were only brief spikes from 0 to 1 (binary indicator) for the voltage limit. Giving the PCI-E additional power via the 6-pin seemed to keep voltages under control.

So if anyone else is having crashes or any sort of instability with overclocking, check to see if your motherboard has an additional plug for PCI-E power. According to pcper's Titan XP review, the card draws power from the mobo above the PCI-E spec when overclocked. Here is the link to the article:

https://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Titan-X-Pascal-12GB-Graphics-Card-Review/Detailed-Power-Consumption-Te
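pcper's measurements aside, the spec-level budget behind that fix can be sketched quickly (the per-connector limits are the PCI-E spec values; the 310W overclocked draw is an illustrative assumption, not a measurement):

```python
# PCI-E power budget for a Titan X Pascal (8-pin + 6-pin card).
# Per the spec: the slot supplies up to 75W, a 6-pin cable 75W,
# and an 8-pin cable 150W.

CONNECTOR_LIMITS_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def total_budget() -> int:
    """Theoretical in-spec ceiling across slot + cables."""
    return sum(CONNECTOR_LIMITS_W.values())

def slot_overdraw(card_draw_w: float, cable_draw_w: float) -> float:
    """Watts the slot must supply beyond its 75W share.
    Positive means the card is out of spec at the slot."""
    return (card_draw_w - cable_draw_w) - CONNECTOR_LIMITS_W["slot"]

print(total_budget())  # 300W in-spec ceiling
# Assumed numbers: ~310W overclocked card draw, with the cables
# delivering their full 225W -- leaving 85W demanded of the slot,
# i.e. over the 75W slot spec.
print(slot_overdraw(310, 225))
```

Whenever the card demands more than 75W through the slot, the motherboard's 24-pin (or, here, the auxiliary 6-pin) has to make up the difference, which is consistent with the extra connector fixing the crashes.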


----------



## meson1

Quote:


> Originally Posted by *gridironcpj*
> 
> So I finally solved my problem described in this thread about the card crashing my PC when overclocked playing games like The Witcher 3. Here is what I posted in my dedicated thread (I'm copying it here so more of you see this just in case you're having problems):
> 
> Problem solved! I was right in that the Titan XP pulls "too much" power from the motherboard when overclocked. Apparently my motherboard has an additional 6-pin power connector for the PCI-E slots, which was sort of hidden. I plugged one in and my problem vanished. The card was fully stable with The Witcher 3 with a solid OC.
> 
> I also noticed something very interesting: Before this fix, I noticed the voltage limit was always hit when the card was overclocked, according to MSI Afterburner's monitoring. After the fix, there were only brief spikes from 0 to 1 (binary indicator) for the voltage limit. Giving the PCI-E additional power via the 6-pin seemed to keep voltages under control.
> 
> So if anyone else is having crashes or any sort of instability with overclocking, check to see if your motherboard has an additional plug for PCI-E power. According to pcper's Titan XP review, the card draws power from the mobo above the PCI-E spec when overclocked. Here is the link to the article:
> 
> https://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Titan-X-Pascal-12GB-Graphics-Card-Review/Detailed-Power-Consumption-Te


Noted. There's a 6 pin power connector on my Asus X99-E WS. When I finally get the bloody thing built, I will ensure I hook up this connector.

Valuable info, so +rep


----------



## arrow0309

Hope my Z87M OC Formula (micro-ATX) has the proper PCIe power; it only has one 8-pin 12V connector.
Otherwise I'll have to switch back to the bigger Z87 OC Formula, which I still have.
However, that one won't fit inside my current case, an Enthoo Mini XL.


----------



## MunneY

Quote:


> Originally Posted by *gridironcpj*
> 
> So I finally solved my problem described in this thread about the card crashing my PC when overclocked playing games like The Witcher 3. Here is what I posted in my dedicated thread (I'm copying it here so more of you see this just in case you're having problems):
> 
> Problem solved! I was right in that the Titan XP pulls "too much" power from the motherboard when overclocked. Apparently my motherboard has an additional 6-pin power connector for the PCI-E slots, which was sort of hidden. I plugged one in and my problem vanished. The card was fully stable with The Witcher 3 with a solid OC.
> 
> I also noticed something very interesting: Before this fix, I noticed the voltage limit was always hit when the card was overclocked, according to MSI Afterburner's monitoring. After the fix, there were only brief spikes from 0 to 1 (binary indicator) for the voltage limit. Giving the PCI-E additional power via the 6-pin seemed to keep voltages under control.
> 
> So if anyone else is having crashes or any sort of instability with overclocking, check to see if your motherboard has an additional plug for PCI-E power. According to pcper's Titan XP review, the card draws power from the mobo above the PCI-E spec when overclocked. Here is the link to the article:
> 
> https://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Titan-X-Pascal-12GB-Graphics-Card-Review/Detailed-Power-Consumption-Te


Quote:


> Originally Posted by *meson1*
> 
> Noted. There's a 6 pin power connector on my Asus X99-E WS. When I finally get the bloody thing built, I will ensure I hook up this connector.
> 
> Valuable info, so +rep


Quote:


> Originally Posted by *arrow0309*
> 
> Hope my Z87M OC Formula (micro-atx) will have the proper pcie power. It only has one 8pin 12v.
> Otherwise I'll have to switch back to the bigger Z87 OC Formula which I still possess.
> However it won't enter inside my current case, Enthoo Mini XL.


This shouldn't be right. I'm not saying it doesn't work, but the card should only pull the 75w from the board and everything else from the Power cables. Then again, we are talking about PC's here.

AFAIK those booster slots were for when you were running SLI so that the CPU wasn't forced to distribute all the power.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *MunneY*
> 
> This shouldn't be right. I'm not saying it doesn't work, but the card should only pull the 75w from the board and everything else from the Power cables. Then again, we are talking about PC's here.
> 
> AFAIK those booster slots were for when you were running SLI so that the CPU wasn't forced to distribute all the power.


I agree too. Something else is going on, and the 6-pin is just masking the problem rather than solving it. Glad it's working though.


----------



## EniGma1987

I suspect that the motherboard is only designed to send ~65 watts from the 24-pin to the PCI-E slots, since that is normal, and they provided an extra power plug for the slots. It could be it was designed this way on purpose, and when overclocking you are supposed to always use the extra power connector for exactly this reason. We saw issues like this with the RX 480 on some boards too: tripping OCP on the motherboard and causing a shutdown, crash, or lockup because the board was not designed to send more than 65-66 watts from the 24-pin to the slots.
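To put rough numbers on this, here is a back-of-the-envelope sketch only: the 75W/150W figures are the PCI-E spec limits for the slot, 6-pin and 8-pin, and the assumption that the connectors are loaded first is just a way of bounding the worst case (a real card splits load across rails):

```python
# Rough PCI-E power budget for a Titan X Pascal (one 6-pin + one 8-pin).
# Spec limits: slot = 75 W, 6-pin = 75 W, 8-pin = 150 W.
SLOT_SPEC = 75
CONNECTORS = 75 + 150  # 6-pin + 8-pin combined

def slot_draw(total_board_power):
    """Worst-case power the slot must supply if the connectors max out first."""
    return max(0, total_board_power - CONNECTORS)

# Stock 250 W TDP: the slot's share stays comfortably within spec.
print(slot_draw(250))  # 25
# 120% power limit -> ~300 W: the slot is asked for its full 75 W,
# above what some boards reportedly budget (~65 W) from the 24-pin.
print(slot_draw(300))  # 75
```

This lines up with pcper's measurement in the linked review of the card drawing above-spec slot power when overclocked.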


----------



## xTesla1856

Does the Rampage V Edition 10 have such an extra connector? If I recall correctly, it only has an extra 4-pin EPS power connector for the CPU and a 4-pin Molex connector below the PCI-E slots at the lower edge of the board. I'll have a look when I get home from work.


----------



## ESRCJ

I'm curious: Can someone post their MSI Afterburner hardware monitor info for the Power Limit and Voltage Limit after a good 10 minutes of Witcher 3 or Unigine Heaven? Make sure your power limit is set to 120% and manually OC your card a little. I'd just like to see what this looks like on a different system.
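For anyone who wants to log this outside Afterburner, `nvidia-smi` can dump clocks/power/temperature as CSV. The query fields below are real `nvidia-smi` options, but the sample line and parser are just an illustrative sketch, not a real capture:

```python
# One log line produced by e.g.:
#   nvidia-smi --query-gpu=clocks.gr,power.draw,temperature.gpu \
#              --format=csv,noheader -l 5
# Sample values are made up for illustration:
sample = "1987 MHz, 249.53 W, 72"

def parse(line):
    """Split one CSV row into (core MHz, watts, deg C)."""
    clock, power, temp = [f.strip() for f in line.split(",")]
    return (int(clock.split()[0]),    # "1987 MHz" -> 1987
            float(power.split()[0]),  # "249.53 W" -> 249.53
            int(temp))                # "72" -> 72

print(parse(sample))  # (1987, 249.53, 72)
```

Logged every few seconds during a Heaven run, this makes it easy to see whether the card is sitting at its power limit.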


----------



## asheth007

Quote:


> Originally Posted by *roccale*
> 
> I had always this idle clocks...


That's interesting. The 376.33 and 376.40 drivers both gave me the minimum base clocks at idle, never this low. I also posted about this on the Nvidia subreddit in the driver's stickied thread, and others are reporting the same across all Pascal cards. Some have lower-than-minimum base/memory clocks, some have base clocks/memory at idle.


----------



## AndreTM

Really interesting that NVIDIA raised the price in their EU store...
(In Italy) Before: €1329 ---> After: €1379


----------



## xTesla1856

Hey, look what just got here


----------



## xTesla1856

Preliminary testing looks very promising: the card does +220 on the core and +500 on the memory with ease. This puts the core clock in the neighbourhood of 2GHz (it hovers between 1963 and 2025). So far I'm very impressed; the stock cooler is screaming for dear life, and the card is screaming for a water block.







I've attached a screenshot of the Heaven bench as well as a link to my FireStrike Extreme run. I've never seen numbers this high on my computer before.









FireStrike Extreme run


----------



## arrow0309

Quote:


> Originally Posted by *xTesla1856*
> 
> Preliminary testing looks very promising: the card does +220 on the core and +500 on the memory with ease. This puts the core clock in the neighbourhood of 2GHz (it hovers between 1963 and 2025). So far I'm very impressed; the stock cooler is screaming for dear life, and *the card is screaming for a water block*.
> 
> 
> 
> 
> 
> 
> 
> I've attached a screenshot of the Heaven bench as well as a link to my FireStrike Extreme run. I've never seen numbers this high on my computer before.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> FireStrike Extreme run


I bet it is








You'll have to deal with a large number of small screws and parts.


----------



## arrow0309

Some hi-res pics of my Titan XP's VRM:


----------



## Dagamus NM

Nobody is going to buy a gall bladder. It has no use to another person.

So, crashes related to power supply cable location choice and possibly spacing on thermal pads for memory. Odd. Let me double-check, but I believe all of my blocks have no gap.


----------



## arrow0309

I agree, my memory pads were also 0.5mm (as always); they're from Watercool.
Just finished assembling this steel, copper, glass and aluminium block (and backplate).


----------



## Jpmboy

Nice! Glass or plexiglass??


----------



## jsutter71

Quote:


> Originally Posted by *gridironcpj*
> 
> So I finally solved my problem described in this thread about the card crashing my PC when overclocked playing games like The Witcher 3. Here is what I posted in my dedicated thread (I'm copying it here so more of you see this just in case you're having problems):
> 
> Problem solved! I was right in that the Titan XP pulls "too much" power from the motherboard when overclocked. Apparently my motherboard has an additional 6-pin power connector for the PCI-E slots, which was sort of hidden. I plugged one in and my problem vanished. The card was fully stable with The Witcher 3 with a solid OC.
> 
> I also noticed something very interesting: Before this fix, I noticed the voltage limit was always hit when the card was overclocked, according to MSI Afterburner's monitoring. After the fix, there were only brief spikes from 0 to 1 (binary indicator) for the voltage limit. Giving the PCI-E additional power via the 6-pin seemed to keep voltages under control.
> 
> So if anyone else is having crashes or any sort of instability with overclocking, check to see if your motherboard has an additional plug for PCI-E power. According to pcper's Titan XP review, the card draws power from the mobo above the PCI-E spec when overclocked. Here is the link to the article:
> 
> https://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Titan-X-Pascal-12GB-Graphics-Card-Review/Detailed-Power-Consumption-Te


Looks like you struck gold. I did not even think about this as a possible reason why my system was restarting for the same reason as yours, but today I went and reattached the PCIe cable on my motherboard that I had previously removed after I upgraded from three 980 Tis to dual TXPs. Afterwards I ran the same benchmarks that were causing my issues, and this time no restarts. Apparently running dual TXPs in SLI with a 6950X causes stability problems without the additional power.


----------



## EniGma1987

Quote:


> Originally Posted by *Jpmboy*
> 
> NIce! Glass or plexiglass??


I can't be sure from the feel of it on these waterblocks. I always assumed it was plexi, but their reservoirs are actual glass, so maybe these are too.


----------



## arrow0309

It's plexi: "PLEXIGLAS® GS by EVONIK".

http://shop.watercool.de/epages/WatercooleK.mobile/en_GB/?ObjectPath=/Shops/WatercooleK/Products/15584&Locale=en_GB


----------



## shonik09

Why do you guys prefer Watercool over EK waterblocks? Is there any performance difference? Also how do people find SLI? Are there many games that are well optimised for it anymore?


----------



## xTesla1856

Honestly, I prefer the look of the Watercool block to the EK block, now that I've seen more pictures. I'd probably get the Watercool one, if I wasn't tied in to the EK ecosystem with a Predator 360. The ability to use prefilled blocks with the QDC is a huge bonus for me, as I don't have as much time anymore to spend on building computers. Oh well, maybe when I finally crack down and build a custom loop, I'll get the Watercool block.


----------



## Konex

edited - not needed anymore.


----------



## Asus11

Quote:


> Originally Posted by *arrow0309*
> 
> I agree, my memory pads were also 0.5mm (as always), they're from Watercool.
> Just finished assembling this steel, copper, glass and aluminium block (and bp)


Very nice!! I like the block and backplate; it looks a lot different than EK stuff.









did you buy the Titan for £1099?

I've just checked Nvidia's site and it's now £1,179, which is only £80 extra, but it's a serious p*** take imo.


----------



## EniGma1987

Quote:


> Originally Posted by *Konex*
> 
> hi there,
> 
> I need a very quick advice (next 2 hours at most).
> 
> 2 x Titan X Pascal air
> 
> or
> 
> 1 x Titan X Pascal liquid cooled
> 
> I can buy both configs but I'm having so many different opinions that it's driving me crazy.
> 
> At this moment, I have a quote for a 2 x Titan X Pascal because I don't like the liquid idea too much
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please please help me!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> thanks for your advices.


I still like a single card a lot better than SLI, so my vote is for that with a watercooling setup. Stock Titans can throttle to around 1850MHz in some games, so the liquid cooling helps keep the card stable at around 2GHz. That isn't a huge improvement for the money spent, but it is nice knowing the card stays steady under all circumstances.

Quote:


> Originally Posted by *shonik09*
> 
> Why do you guys prefer Watercool over EK waterblocks? Is there any performance difference? Also how do people find SLI? Are there many games that are well optimised for it anymore?


Watercool performs a bit better than EK and also looks better IMO.
People go with EK because they are usually 2-4 weeks faster at getting a block out when a new product launches (because Watercool does a lot more R&D on new product releases) and because of the pre-filled option with quick disconnects.


----------



## arrow0309

Quote:


> Originally Posted by *Asus11*
> 
> Very nice!! I like the block and backplate; it looks a lot different than EK stuff.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> did you buy the Titan for £1099?
> 
> I've just checked Nvidia's site and it's now £1,179, which is only £80 extra, but it's a serious p*** take imo.


Yeah, £1099, caught the last train.
AMD = Epic Fail


----------



## xTesla1856

Quote:


> Originally Posted by *arrow0309*
> 
> Yeah, 1099, with the last train.
> AMD = Epic Fail


Prices went up in Germany as well; it is now a whopping €1359, up from €1299. I don't know how to interpret this price hike though; surely it can't be because of Brexit?

As for the AMD bit, what is there that hasn't been said already?


----------



## shonik09

Quote:


> Originally Posted by *EniGma1987*
> 
> Watercool performs a bit better than EK and also looks better IMO.
> People go with EK because they are usually 2-4 weeks faster at getting a block out when a new product launches (because Watercool does a lot more R&D on new product releases) and because of the pre-filled option with quick disconnects.


Do you know how big the difference in performance is? Atm mine idles at around 25, load is 40-50, with an OC applied.


----------



## Jpmboy

Quote:


> Originally Posted by *Asus11*
> 
> Very nice!! I like the block and backplate; it looks a lot different than EK stuff.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> did you buy the Titan for £1099?
> 
> I've just checked Nvidia's site and it's now £1,179, which is only £80 extra, but it's a serious p*** take imo.


not sure why Nvidia is jacking prices... except to increase the butthurt.


----------



## Asus11

Quote:


> Originally Posted by *Jpmboy*
> 
> not sure why Nvidia is jacking prices... except to increase the butthurt.


gives me another reason not to buy one









1080 & titan Z should do for now


----------



## arrow0309

Quote:


> Originally Posted by *Jpmboy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Asus11*
> 
> Very nice!! I like the block and backplate; it looks a lot different than EK stuff.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> did you buy the Titan for £1099?
> 
> I've just checked Nvidia's site and it's now £1,179, which is only £80 extra, but it's a serious p*** take imo.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> not sure why Nvidia is jacking prices... except to increase the butthurt.
Click to expand...

LMAO


----------



## shonik09

Quote:


> Originally Posted by *arrow0309*
> 
> did you buy the Titan for £1099?
> 
> I've just checked Nvidia's site and it's now £1,179, which is only £80 extra, but it's a serious p*** take imo.


Probably down to the exchange rate drop tbh, same with the euro...


----------



## EniGma1987

Quote:


> Originally Posted by *shonik09*
> 
> Do you know how big the difference in performance is? Atm mine idles at around 25, load is 40-50, with an OC applied.


My idle is 27 (high ambient in California) and load has never gone past 42, usually 40-42 at "100%" with a 2.05GHz overclock and power limit mod applied. I do have 360mm and 120mm extra-thick radiators though. I'm sure the block does a lot, but the overkill radiator space also helps.


----------



## arrow0309

Quote:


> Originally Posted by *EniGma1987*
> 
> Watercool performs a bit better than EK and also looks better IMO.
> People go with EK because they are usually 2-4 weeks faster at launch of a product with a new block out (Cause Watercool does a lot more R&D on new product releases) and because of the pre-filled with quick disconnect option


Never had any Watercool gear whatsoever; I didn't like their former (nice-looking) steel backplate with zero cooling. I've always had EK blocks (still have one on my CPU), except for the last 1080 Strix, where I made a mistake: I spent a lot on the Bits. block (even VAT & customs) and, except for the looks, it's not the best performer.
I believe you when you say these Heatkiller IV blocks perform nicely, and I'm happy about it.
I also liked their pads, all perfectly measured, cut and fitted just right (and all of them, just like the Nvidia reference design; not sure if they're all needed, but I placed them all), in different thicknesses: 0.5mm (mem ICs), 1mm (VRM), 1.5mm and 2mm (backplate)!









I've just unmounted it (not installed yet) to check the imprints, and they all seem fine:





Gonna install it later, this evening!


----------



## xTesla1856

Quote:


> Originally Posted by *Asus11*
> 
> gives me another reason not to buy one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1080 & titan Z should do for now


How's that Titan Z holding up in 2017? I always wanted one when they came out.


----------



## arrow0309

Quote:


> Originally Posted by *xTesla1856*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Asus11*
> 
> gives me another reason not to buy one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1080 & titan Z should do for now
> 
> 
> 
> How's that Titan Z holding up in 2017? i always wanted one when they came out
Click to expand...

He's using it for PhysX lol


----------



## Dagamus NM

Quote:


> Originally Posted by *gridironcpj*
> 
> So I finally solved my problem described in this thread about the card crashing my PC when overclocked playing games like The Witcher 3. Here is what I posted in my dedicated thread (I'm copying it here so more of you see this just in case you're having problems):
> 
> Problem solved! I was right in that the Titan XP pulls "too much" power from the motherboard when overclocked. Apparently my motherboard has an additional 6-pin power connector for the PCI-E slots, which was sort of hidden. I plugged one in and my problem vanished. The card was fully stable with The Witcher 3 with a solid OC.
> 
> I also noticed something very interesting: Before this fix, I noticed the voltage limit was always hit when the card was overclocked, according to MSI Afterburner's monitoring. After the fix, there were only brief spikes from 0 to 1 (binary indicator) for the voltage limit. Giving the PCI-E additional power via the 6-pin seemed to keep voltages under control.
> 
> So if anyone else is having crashes or any sort of instability with overclocking, check to see if your motherboard has an additional plug for PCI-E power. According to pcper's Titan XP review, the card draws power from the mobo above the PCI-E spec when overclocked. Here is the link to the article:
> 
> https://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Titan-X-Pascal-12GB-Graphics-Card-Review/Detailed-Power-Consumption-Te


That'll do it. Glad you got it solved.


----------



## Jpmboy

Quote:


> Originally Posted by *Asus11*
> 
> gives me another reason not to buy one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1080 & titan Z should do for now


but you know you want one.


----------



## CptSpig

Quote:


> Originally Posted by *Jpmboy*
> 
> but you know you want one.


Deep down everyone wants one.


----------



## jsutter71

*More details emerging about the 1080 Ti. For those hoping for the same performance parallels that occurred between the 980 Ti and the Titan X, it looks like that probably won't happen this time. If this is correct, it makes me glad that I decided not to wait. Looking forward to the benchmarks.*
http://www.guru3d.com/news-story/geforce-gtx-1080-ti-announced-at-pax-march-10th.html


----------



## stryker7314

Quote:


> Originally Posted by *jsutter71*
> 
> *More details emerging about the 1080 Ti. For those hoping for the same performance parallels that occurred between the 980 Ti and the Titan X, it looks like that probably won't happen this time. If this is correct, it makes me glad that I decided not to wait. Looking forward to the benchmarks.*
> http://www.guru3d.com/news-story/geforce-gtx-1080-ti-announced-at-pax-march-10th.html


Based on the specs, the situation is exactly the same, and the Ti will have the same performance out of the box due to higher clocks.


----------



## MunneY

Quote:


> Originally Posted by *stryker7314*
> 
> Based on the specs the situation is exactly the same and the ti will have the same performance out the box due to higher clocks.


No.. the GDDR5X is going to make a difference.

The place where it might actually beat a Titan X is if they can deliver a Lightning/Classified/K|ngP|n card that can break that 2200mhz barrier.


----------



## ChronoBodi

The difference here is that there is no full-fat GP102; if any are made, they are sold for far more as a Quadro P6000.

So whatever doesn't make the cut as a P6000 gets 256 cores cut off and becomes the Titan XP; whatever fails even that gets another 256 cores cut off plus GDDR5 memory, and there's the 1080 Ti.

You can wait 9 months or more to get anything like a Titan XP perf-wise for cheaper. Only AMD has the competitive power to force anything better on the GPU market, and we won't know until they release a Vega part that competes with the GTX 1080, let alone the GP102-powered parts.

I mean, it did happen before: the original Titan, then AMD made the R9 290X seven months later. But this seems to be a longer stretch.


----------



## xTesla1856

Quote:


> Originally Posted by *MunneY*
> 
> No.. the GDDR5X is going to make a difference.
> 
> The place where it might actually beat a Titan X is if they can deliver a Lightning/Classified/K|ngP|n card that can break that 2200mhz barrier.


Which, knowing how that also didn't happen with the 1080 (EVGA never even bothered with a KingPin 1080), is highly unlikely. An overclocked/watercooled Titan X will reign supreme. For reference: my air-cooled TX hovers at 2025-2088MHz during gameplay, with memory at +500 too. I just don't see the Ti beating it this time around, given how good the Titans have proven to be as overclockers so far.


----------



## meson1

Quote:


> Originally Posted by *ChronoBodi*
> 
> The difference here is that there is no full-fat GP102, if any are made they are sold for far more as an Quadro P6000....


... and the Tesla P40.


----------



## ChronoBodi

Quote:


> Originally Posted by *xTesla1856*
> 
> Which, knowing how that also didn't happen with the 1080 (EVGA never even bothered with a KingPin 1080), is highly unlikely. Titan X overclocked/watercooled will reign supreme. For reference: My air cooled TX hovers at 2025-2088mhz during gameplay. Memory is at +500 too. I just don't see the Ti beating it this time around, given how the Titans have proven to be good overclockers so far.


Unless AMD pulls a quick move somehow and forces Nvidia to essentially release the full-fat die for cheaper, aka the 290x vs 780ti scenario.


----------



## meson1

Quote:


> Originally Posted by *ChronoBodi*
> 
> Unless AMD pulls a quick move somehow and forces Nvidia to essentially release the full-fat die for cheaper, aka the 290x vs 780ti scenario.


I suspect Nvidia have a fair idea of what AMD are doing and what performance to expect, even before any of it became public knowledge. I'm not suggesting they conducted industrial espionage or anything underhanded, just that, being in the industry, they have a reasonable idea of what is going on.


----------



## DNMock

Quote:


> Originally Posted by *MunneY*
> 
> No.. the GDDR5X is going to make a difference.
> 
> The place where it might actually beat a Titan X is if they can deliver a Lightning/Classified/K|ngP|n card that can break that 2200mhz barrier.


Doubtful from what I have heard/seen. Even with frankensteined PCBs on LN2, the TXP is only getting to the 2400-2500 clock range. The limitation seems to be the die itself, so throwing it on the best PCB in the world and tweaking the BIOS to the moon wouldn't make much difference.

Now if they can get an air/water-cooled chip to clock within 10% of the LN2-cooled FrankenPCBs, I'll be extremely impressed.


----------



## jsutter71

AMD has been building this up for months. While customers have been gobbling up Pascal-based GPUs, they have yet to deliver on those promises. AMD was successful in locking in the market for console-based gaming systems, and they have released some very powerful video cards; I'm talking about the Radeon Pro Duo. But technology-wise it's nothing more than dual GPUs on a single card. What AMD has failed to do is engineer a better single-GPU solution. Certainly nothing that compares to the 1080, and certainly not the TXP. Worse yet, their drivers are terrible.

On paper, AMD is advertising some impressive new hardware that is supposed to be a direct competitor to Nvidia's higher-end GPUs. Unfortunately, their track record as of late has not been consistent. The same can be said of their CPU division. In the 90's AMD was dominating the GPU market and smacking Intel with their CPUs. There was a time when I was only building PCs using AMD parts. Then the heavens opened up and Intel released the Core i7 processors, which I was an early adopter of, by the way. After that, AMD stopped making consumer CPUs that could compete.

I'll admit there was a lot of leapfrogging with Nvidia on their consumer GPUs, but it has been a while since they have been able to provide a truly competitive consumer single-GPU solution. Nvidia has shown the ability to not only improve their hardware but also design a whole new product with brand-new architecture, not just slap two older GPUs on one card. And lastly, even if AMD does pull a rabbit out of their hat, will they be able to provide the drivers to back it up? Or, for that matter, the cooling solutions?


----------



## EniGma1987

Quote:


> Originally Posted by *DNMock*
> 
> Doubtful from what I have heard/seen. Even with frankensteined PCB's on LN2, TXP is only getting to the 2400 to 2500 clock range. Seems the limitation is the die itself, so throwing it on the best PCB in the world and tweaking the BIOS to the moon wouldn't make much difference.
> 
> Now if they can get an air/water cooled chip to clock within 10% of LN2 cooled FrankenPCBs I'll be extremely impressed.


Here is an interesting question. Since GDDR5 and GDDR5X are basically the same apart from a couple of minor tweaks, and apparently you can use GDDR5X on the same controllers as GDDR5, would someone like ASUS or EVGA be able to put out a Classified or MARS card that has the X RAM on it, same as the Titan? Hmm.


----------



## xTesla1856

Quote:


> Originally Posted by *EniGma1987*
> 
> Here is an interesting question. Since GDDR5 and GDDR5X are basically the same and just a couple minor tweaks, and apparently you can use GDDR5X on the same controllers as GDDR5, would someone like ASUS or EVGA be able to put out a Classified or MARS card that has the X RAM on it same as the Titan? hmm.


Maybe, but Nvidia could perhaps lock that out in the vBIOS?


----------



## DNMock

Quote:


> Originally Posted by *jsutter71*
> 
> AMD has been building this up for months. While customers have been gobbling up Pascal-based GPUs, they have yet to deliver on those promises. AMD was successful in locking in the market for console-based gaming systems, and they have released some very powerful video cards; I'm talking about the Radeon Pro Duo. But technology-wise it's nothing more than dual GPUs on a single card. What AMD has failed to do is engineer a better single-GPU solution. Certainly nothing that compares to the 1080, and certainly not the TXP. Worse yet, their drivers are terrible. On paper, AMD is advertising some impressive new hardware that is supposed to be a direct competitor to Nvidia's higher-end GPUs. Unfortunately, their track record as of late has not been consistent. The same can be said of their CPU division. In the 90's AMD was dominating the GPU market and smacking Intel with their CPUs. There was a time when I was only building PCs using AMD parts. Then the heavens opened up and Intel released the Core i7 processors, which I was an early adopter of, by the way. After that, AMD stopped making consumer CPUs that could compete. I'll admit there was a lot of leapfrogging with Nvidia on their consumer GPUs, but it has been a while since they have been able to provide a truly competitive consumer single-GPU solution. Nvidia has shown the ability to not only improve their hardware but also design a whole new product with brand-new architecture, not just slap two older GPUs on one card. And lastly, even if AMD does pull a rabbit out of their hat, will they be able to provide the drivers to back it up? Or, for that matter, the cooling solutions?


Nvidia hasn't really released a truly new card since the 980; all we have gotten since then is bigger dies and a die shrink, so they haven't done anything to advance their technology in years either, which kind of leaves them open.

Combine that with the fact that silicon is reaching its limits, and AMD may well be able to close the gap and be competitive again in the GPU market.

That being said, I'm almost positive AMD has been putting 90% of their combined efforts (money) into the development of Zen, so aside from developing a series of new GPU techs, I doubt they will be implemented in a manner that fully utilizes their strengths until Navi, by which time Volta will be up and running. Volta is lining up to be when Nvidia puts out all their new GPU techs, probably with proper utilization.

Of course, if a fundamental shift in GPU design doesn't arrive by about 2020, like going to carbon nanotubes instead of silicon, or a silicon/germanium type of GPU, the only option left for either side is going to be developing a way to have multiple GPUs on a single PCB operating as if they were a single GPU. A MOAR COREZ approach, as it were.


----------



## jsutter71

Quote:


> Originally Posted by *DNMock*
> 
> Nvidia hasn't really released a truly new card since the 980; all we have gotten since then is bigger dies and a die shrink, so they haven't done anything to advance their technology in years either, which kind of leaves them open.
> 
> Combine that with the fact that silicon is reaching its limits, and AMD may well be able to close the gap and be competitive again in the GPU market.
> 
> That being said, I'm almost positive AMD has been putting 90% of their combined efforts (money) into the development of Zen, so aside from developing a series of new GPU techs, I doubt they will be implemented in a manner that fully utilizes their strengths until Navi, by which time Volta will be up and running. Volta is lining up to be when Nvidia puts out all their new GPU techs, probably with proper utilization.
> 
> Of course, if a fundamental shift in GPU design doesn't arrive by about 2020, like going to carbon nanotubes instead of silicon, or a silicon/germanium type of GPU, the only option left for either side is going to be developing a way to have multiple GPUs on a single PCB operating as if they were a single GPU. A MOAR COREZ approach, as it were.


All very interesting points. Things in the home PC industry really took off for a while but seem to have leveled out over the last few years. A lot of broken promises and slow implementation. Like Dell ditching their 30" OLED display. Still waiting on ANY company to make DisplayPort cables and monitors beyond version 1.2, or USB 3.1 devices that actually support USB 3.1 speeds. And how many Kickstarters announced some revolutionary new device that would make our lives so much easier, yet failed to follow through and issue refunds? Let's face it, we've hit a slump. Do you think it's the beginning of the Idiocracy? I guess we could only milk that crashed space ship or ships in Roswell for their technology for so long.


----------



## arrow0309

*Installed!*

















And my very first Heaven run @1440p, stock CPU & GPU (with power limit and temp maxed). What do you think about its boost?


----------



## Kaapstad

Mine before I put waterblocks on the cards.



NVidia used the pic on their UK Facebook page.


----------



## Gemini2039

The memory on my Titan X is insane... Fully tested and fully working overclock.


----------



## ChronoBodi

Quote:


> Originally Posted by *Gemini2039*
> 
> The memory on my Titan X is insane... Fully tested and fully working overclock.


BS... no way that memory OC is actually stable? Are you sure you stress-tested that with the 3DMark stress test or something?

I just have a hard time believing you literally maxed out the memory slider and it's stable.

www.babeltechreviews.com/overclocking-titan-x/2/

If you take a look at that link, +250 memory was slightly unstable, and they only got +200 memory stable.

There is no way that +1000 memory is stable; you would crash any game you play on it or get massive graphics corruption.


----------



## Gemini2039

Quote:


> Originally Posted by *ChronoBodi*
> 
> BS... no way that memory OC is actually stable? Are you sure you stress-tested that with the 3DMark stress test or something?
> 
> I just have a hard time believing you literally maxed out the memory slider and it's stable.


I'm trying to figure out whether it's a bug or whether I'm a lottery winner, but I played GTA V for almost 5 hours and tested with the Heaven benchmark without any artefacts, and it really boosts performance. So I think it's true.


----------



## DooRules

Both of my Titans will run benches at +850 mem, and one of them will run FireStrike at +1000 on the memory, so they are out there. Just a matter of luck.


----------



## ChronoBodi

Why is it that I don't believe such insane memory OCs?

Is it something to do with GDDR5X? Intentional huge headroom in memory OC?

Is this done with voltage changes or stock voltage?


----------



## DooRules

I get those clocks on the memory on my GPUs with cold air to the rads and by using the curve in AB to set voltage. But I am still limited to what Pascal allows.

I can run BF1 all day at +230 on the core and +650-700 on the memory in SLI.


----------



## Gemini2039

Quote:


> Originally Posted by *DooRules*
> 
> Both of my Titans will run benches at +850 mem and one of them will run firestrike at +1000 on the memory so they are out there. Just a matter of luck


Ok then I'm not crazy









I just followed the settings in the Guru3D overclocking guide... then tried to push it as far as I can.
I'm just a little disappointed with the GPU... I saw people @ +250MHz; if I go further than +200MHz it's not stable.


----------



## Gemini2039

Quote:


> Originally Posted by *DooRules*
> 
> I get those clocks on the memory on my GPUs with cold air to the rads and by using the curve in AB to set voltage. But I am still limited to what Pascal allows.
> 
> I can run BF1 all day at +230 on the core and +650-700 on the memory in SLI.


How do you change the voltage curve?


----------



## xTesla1856

Finished my rig today, enjoy the plastic peel guys


----------



## arrow0309

Quote:


> Originally Posted by *xTesla1856*
> 
> Finished my rig today, enjoy the plastic peel guys


Nice case and nice system (although your Titan X is screaming for water cooling).








I got rid of the cable mods and used the black ones from my EVGA G2 directly to the mainboard & VGA this time instead.


----------



## patrickisfrench

Just picked up my Titan XP yesterday, popped it in, and added a custom fan curve. No OC yet, but I sit at 1850MHz under load with a 70% fan speed, at 72C on air. Ordering an EVGA Hybrid and plan to install just the AIO block and leave the reference die-cast plate on (because it has all the right thermal pads, and I read the VRAM doesn't get too hot).

Anything else I should be on the lookout for or be doing with the card? Will the default TIM applied to the Hybrid cooler from EVGA be fine, or should I order some of the Thermal Grizzly Kryonaut that I see everyone talking about in here?

Love this card. First-time owner of a AAA video card. Feels good!


----------



## lanofsong

Hello Titan X Pascal owners,

We are having our monthly Foldathon from Monday the 16th to Wednesday the 18th, starting at 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see the link below.

January 2017 Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - you need a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

3. Enter your folding name (mine is the same as my OCN name), your passkey, and the Team OCN number - 37726

later
lanofsong


----------



## MunneY

Quote:


> Originally Posted by *patrickisfrench*
> 
> Just picked up my Titan XP yesterday, popped it in, and added a custom fan curve. No OC yet, but I sit at 1850Mhz under load with a 70% fan speed, at 72C on air. Ordering an EVGA Hybrid and plan to install just the AIO block and leave the reference die cast plate on (cause it has all the right thermal pads, and I read the VRAM doesn't get too hot).
> 
> Anything else I should be on the look out for or be doing with the card? Will the default TIM applied to the Hybrid cooler from EVGA be fine or should I order some grizzly kryonaut that I see every one talking about in here?
> 
> Love this card. First time owner of a AAA video card. Feels good!


EVGA is about to come out with a Titan XP Kit, so if you have patience there will be one. At least according to @EVGA-JacobF


----------



## BelowAverageIQ

Quote:


> Originally Posted by *DooRules*
> 
> Both of my Titans will run benches at +850 mem and one of them will run firestrike at +1000 on the memory so they are out there. Just a matter of luck


Wow that is awesome!! Holy c$5p, insane. I have NEVER won the silicon lottery, not ever. In all the tens of thousands I have spent on PC gear, not once.

My Titan is under water. I checked that the EK block touches everything and the pads have contact. Max temp is 42 degrees. The most I can get is +225 on core and +480 on memory, sigh.


----------



## Garrett1974NL

Quote:


> Originally Posted by *BelowAverageIQ*
> 
> Wow that is awesome!! Holy c$5p, insane. I have NEVER won the silicon lottery, not ever. The tens of thousands I have spent on PC gear, not once.
> 
> My Titan is under water. Checked that the EK block touches everything, pads have contact. Max temp is 42 degrees. Most I can get on core is +225 and memory is +480, sigh.


And this results in... what frequency? Under 2GHz?


----------



## MrKenzie

Quote:


> Originally Posted by *Garrett1974NL*
> 
> And this results in... what frequency? Under 2GHz?


It would be around 2130MHz before it drops due to temperature and power limits


----------



## octiny

So call me crazy, but I bought an Alienware Aurora R5 in late December due to the crazy deal they were having on a Titan XP SLI configuration & Ebates.

$2635 (that's after tax, multiple discounts through rep)

6700K/Titan XP SLI
850w w/liquid cooler
8GB
1TB

PLUS I received around $350 back through Ebates & a $130 Dell promo credit.

BUT BUT... it arrived with a deep scratch and a couple of scuffs on a side panel. So I got $200 refunded and had a tech come over to replace the panel for free.

2635
-350
-140
-200

= $1945









Threw in a couple of SSDs and 64GB of RAM afterwards.

Still tweaking everything.

6700K @ 4.7ghz/4.6ghz cache 1.36v (Max 74c Real Bench loop 4hr)
Titan XP's @ +200 core/+575 memory (GPU 1 77C, GPU2 75C 100% fan on Crysis 3 maxed @ 4K after 1hr)





Feels good not to be hampered by the 4GB on my Radeon Duo Pro


----------



## Jquala

I'm happy to report I have successfully applied CLU after 2 failed attempts (not enough applied; you really need a good helping of *liquid* touching one end of the resistor to the other). I used to hit 120-122% power in FSU at 2114MHz, but it would fluctuate down to 2025 or even 2012, and now I don't even hit 75% on either card. Problem is I can't get through a single run of FSU at 2114, 2101, or even 2088, all at 1.075-1.093v, but I'm not getting fluctuations, just a consistent 2075-2062. When I check what caused the program to crash... wait for it... the damn power limit! GPU-Z shows the perf cap reason as power at 75% TDP. One card sometimes reports every perf cap reason under the sun, which makes me think it's getting too much voltage, since with the shunt mod it only reports 200w in HWiNFO.
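A side note on why shunt-modded readings look so odd: the card derives its TDP% from the voltage drop across its shunt resistors, so putting liquid metal across a shunt lowers the sensed value and the card under-reports its real draw. A rough illustration in Python (the 250W reference TDP is the stock Titan X Pascal figure; the halving factor is only an example, since the real ratio depends on the resistance added):

```python
STOCK_TDP_W = 250  # Titan X Pascal reference board power

def reported_tdp_pct(real_draw_w, shunt_factor=1.0):
    """TDP% as the card sees it. shunt_factor < 1 models a shunt mod
    that makes sensed current (and thus reported power) a fraction of real."""
    return real_draw_w * shunt_factor / STOCK_TDP_W * 100

print(reported_tdp_pct(250))        # unmodded card at full TDP -> 100%
print(reported_tdp_pct(400, 0.5))   # modded card pulling 400 W reads only ~80%
```

So a modded card can sit well past its real power budget while GPU-Z still shows a modest TDP%, which is consistent with the low readings above.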


----------



## arrow0309

Quote:


> Originally Posted by *BelowAverageIQ*
> 
> Wow that is awesome!! Holy c$5p, insane. I have NEVER won the silicon lottery, not ever. The tens of thousands I have spent on PC gear, not once.
> 
> My Titan is under water. Checked that the EK block touches everything, pads have contact. Max temp is 42 degrees. Most I can get on core is +225 and memory is +480, sigh.


Yeah, I can confirm it as well, though maybe I'm a little "luckier" (as far as I've tested so far). I benched at +230 / +497: clocks of 2125 max, dropping as low as 2038 (occasionally, in Time Spy, due to the power limit), and 5500 on the mem; power limit maxing over 130% and temps never over 38C (still an open PC).
But I'm happy enough; I played TW3 for hours without a single issue at +199 / +497 (GPU clocks mostly 2050-2075), max GPU temp 39C











Waiting for a "last miracle" called biosmod


----------



## MunneY

Quote:


> Originally Posted by *octiny*
> 
> So call me crazy, but I bought an Alienware Aurora R5 late december due the crazy deal they were having for a Titan XP SLI
> configuration & Ebates.
> 
> $2635 (that's after tax, multiple discounts through rep)
> 
> 6700K/Titan XP SLI
> 850w w/liquid cooler
> 8GB
> 1TB
> 
> PLUS received around $350 after through Ebates & $130 dell promo credit.
> 
> BUT BUT....it arrived with a deep scratch and couple scuffs on a side panel. So I got $200 refunded, and had a tech come over
> to replace the panel for free.
> 
> 2635
> -350
> -140
> -200
> 
> = $1945
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Threw in a couple of SSDs and 64GB of RAM afterwards.
> 
> Still tweaking everything.
> 
> 6700K @ 4.7ghz/4.6ghz cache 1.36v (Max 74c Real Bench loop 4hr)
> Titan XP's @ +200 core/+575 memory (GPU 1 77C, GPU2 75C 100% fan on Crysis 3 maxed @ 4K after 1hr)
> 
> 
> 
> 
> 
> Feels good not to be hampered by the 4GB on my Radeon Duo Pro


Wow... Now that's a good deal there... basically paying for the GPU's and getting the rest for free... You should run me down next time something like that comes along LOL.


----------



## patrickisfrench

It's been shown by many reviewers (Gamers Nexus, Guru3D, etc.) that setting memory over +500, and in most cases over +450, actually _decreases_ performance, not to mention eats up WAY more TDP that the core could otherwise use.


----------



## kvickstick

Quote:


> Originally Posted by *patrickisfrench*
> 
> It's been shown by many reviewers (Gamers Nexus, Guru3D, etc.) that setting memory over +500, and in most cases over +450, actually _decreases_ performance, not to mention eats up WAY more TDP that the core could otherwise use.


I never OC VRAM, and I never push my system RAM beyond XMP either...!


----------



## CptSpig

Quote:


> Originally Posted by *patrickisfrench*
> 
> It's been shown by many reviewers (Gamers Nexus, Guru3D, etc.) that setting memory over +500, and in most cases over +450, actually _decreases_ performance, not to mention eats up WAY more TDP that the core could otherwise use.


+1. I got my best Fire Strike score with no memory overclock and +225 on the core. I can overclock my Titan XP to +240 on the core and +550 on the memory and the card is stable, but I don't see any benefit in any benchmarks or games.


----------



## Techenthused73

Quote:


> Originally Posted by *CptSpig*
> 
> +1 I got my best Fire Strike score with no memory overclock and 225 on the core. I can overclock my Titan XP with 240 on the core and 550 memory and the card is stable. Don't see any benefit on any bench marks or games.


On air or water? I just got my TX Pascal on Friday. I watched JayzTwoCents' guide on overclocking Pascal and it seemed really complicated, with voltage limits etc. Also, it was an instruction on using Precision X, but Precision X does not recognize the Titan X since it is not an EVGA card. Should I just use MSI Afterburner, add 200 on the core and 500 on the memory, and call it a day? The card is plenty powerful without an overclock, so like you I'm not sure it's worth it with the temps. I'm not sure either what the fan curve should be or where the power/voltage sliders should be.


----------



## patrickisfrench

Quote:


> Originally Posted by *CptSpig*
> 
> +1 I got my best Fire Strike score with no memory overclock and 225 on the core. I can overclock my Titan XP with 240 on the core and 550 memory and the card is stable. Don't see any benefit on any bench marks or games.


The memory bus is so efficient at 384 bits with GDDR5X that higher memory clocks end up yielding diminishing returns much quicker than with previous generations on other architectures.

I'd rather save any power draw from an already limited VRM for GPU core performance only.
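For a sense of what that bus already provides: peak bandwidth is just bus width times per-pin data rate. A quick sanity check in Python (10 Gbps is the stock GDDR5X rate on this card; the 11 Gbps line is a hypothetical ~+500 offset for comparison):

```python
def bandwidth_gbs(bus_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s = (bus width in bytes) * per-pin data rate."""
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(384, 10.0))  # stock Titan X Pascal: 480.0 GB/s
print(bandwidth_gbs(384, 11.0))  # hypothetical memory OC: 528.0 GB/s
```

A ~10% bandwidth gain on a card that is rarely bandwidth-bound is consistent with the small (or negative) benchmark deltas people report above.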


----------



## patrickisfrench

Quote:


> Originally Posted by *Techenthused73*
> 
> On air or water? I just got my TX pascal on Friday. I watched Jayz 2 cents guide on overclocking Pascal and it seemed really complicated with voltage limits etc. Also it was a instruction on using Precision X but Precision X does not recognize the Titan X as it is not a EVGA card. Should I just use MSI afterburner and add 200 on the core and 500 on the memory and call it a day. Card is plenty powerful without an overclock so like you I'm not sure if it is worth it with temps. Not sure either what the fan curve should be or what power/voltage sliders should be at.


I would use Afterburner, yes. I wouldn't set the clocks too high on air unless you have really good airflow in the case. What I would do right away is up the power limit to the 120% max using the slider. Even if you're not overclocking, you will notice steadier core clock behaviour with this setting enabled.

I would then set up a custom fan profile with 70% fan at 70C GPU temp and 100% at 82C as your two plot points. See how that is and adjust the speed down if it's too loud.

For me this allows the card to sit around 70-71C under load at 1805MHz boost without any other settings. It doesn't move and performance is great. I'll of course push this thing when EVGA releases their Titan-specific hybrid AIO cooler later in Q1.
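A two-point fan profile like the one described above is effectively a linear ramp between plot points. A small Python sketch of that idea (illustrative only, not how Afterburner is implemented internally):

```python
def fan_speed(temp_c, points=((70, 70), (82, 100))):
    """Linearly interpolate fan speed (%) between (temp C, speed %) plot points.
    Below the first point the fan holds the first speed; above the last, the last."""
    points = sorted(points)
    if temp_c <= points[0][0]:
        return points[0][1]
    if temp_c >= points[-1][0]:
        return points[-1][1]
    for (t0, s0), (t1, s1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(70))  # 70 (% fan at the first plot point)
print(fan_speed(82))  # 100 (% fan at the second plot point)
print(fan_speed(76))  # 85.0 (midway between the two points)
```

Adding more plot points just adds more line segments, which is all a "custom curve" in these tools amounts to.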


----------



## bizplan

Quote:


> Originally Posted by *Techenthused73*
> 
> On air or water? I just got my TX pascal on Friday. I watched Jayz 2 cents guide on overclocking Pascal and it seemed really complicated with voltage limits etc. Also it was a instruction on using Precision X but Precision X does not recognize the Titan X as it is not a EVGA card. Should I just use MSI afterburner and add 200 on the core and 500 on the memory and call it a day. Card is plenty powerful without an overclock so like you I'm not sure if it is worth it with temps. Not sure either what the fan curve should be or what power/voltage sliders should be at.


Try Precision XOC - it's made for Pascal-based cards. +200/+600 should work fine; max out the voltage %, power limit, and temp limit for OC stability.


----------



## jsutter71

Quote:


> Originally Posted by *octiny*
> 
> So call me crazy, but I bought an Alienware Aurora R5 late december due the crazy deal they were having for a Titan XP SLI
> configuration & Ebates.
> 
> $2635 (that's after tax, multiple discounts through rep)
> 
> 6700K/Titan XP SLI
> 850w w/liquid cooler
> 8GB
> 1TB
> 
> PLUS received around $350 after through Ebates & $130 dell promo credit.
> 
> BUT BUT....it arrived with a deep scratch and couple scuffs on a side panel. So I got $200 refunded, and had a tech come over
> to replace the panel for free.
> 
> 2635
> -350
> -140
> -200
> 
> = $1945
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Threw in a couple of SSDs and 64GB of RAM afterwards.
> 
> Still tweaking everything.
> 
> 6700K @ 4.7ghz/4.6ghz cache 1.36v (Max 74c Real Bench loop 4hr)
> Titan XP's @ +200 core/+575 memory (GPU 1 77C, GPU2 75C 100% fan on Crysis 3 maxed @ 4K after 1hr)
> 
> 
> 
> 
> 
> Feels good not to be hampered by the 4GB on my Radeon Duo Pro


What a difference a CPU makes. Here are my results.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *jsutter71*
> 
> What a difference a CPU makes. Here are my results.
> 
> 
> Spoiler: Warning: Spoiler!


octiny has you on gpu score though.


----------



## octiny

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> octiny has you on gpu score though.












Edit:


Optimized a few things to get the GPU score even higher. Combined score too. Same 4.7/+200/+575 (highest clock is only 2012 before gpu boost brings the clocks down).

Still have room to push, but these will be my 24/7 clocks.


----------



## Robilar

Considering grabbing one of these in place of my current 1080 and skipping the Ti when it's released.

Are there summaries of FPS comparisons between this card and the 1080?

Also, it looks like it's only available through the Nvidia store?


----------



## jsutter71

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> octiny has you on gpu score though.


He should be, since I have not overclocked mine. I purchased my CPU from Silicon Lottery and I'm using the same overclock settings: 4.3GHz. I have been able to reach 4.6GHz stable, but I don't want or need to push it that high, especially since I can't afford to replace a $1600 CPU.


----------



## BelowAverageIQ

Quote:


> Originally Posted by *Garrett1974NL*
> 
> And this results in... what frequency? Under 2GHz?


Sorry Garrett, I have been away with work and unable to reply.

It equates to:

Core: 2055 - 2103 MHz (+225)

Mem: 10984 MHz (+480)

Not bad, but certainly not up there with some cards and recent reports. My card is under water. Max temp is 42 degrees.

Even if I do NOT overclock the memory, the max core is 225.

Using MSI Afterburner. Power Limit 120%, Temp Limit 90 degrees.

Cheers


----------



## ChronoBodi

Quote:


> Originally Posted by *Robilar*
> 
> Considering grabbing one of these in place of my current 1080 and bypassing the Ti when its released.
> 
> Are there summaries of fps comparisons between this card and the 1080?
> 
> Also it looks like its only available through the nvidia store?


From what I recall, it's an extra 20-30 FPS in general over a 1080 depending on OC and games.

Yes, it's Nvidia exclusive, no partners at all on this Titan X release.

That being said, they ship fast; I ordered last week and I will get mine this Tuesday.


----------



## xarot

I grabbed another one for SLI; the second card is on air. I am getting quite bad SLI scaling in many games. Witcher 3 is anything from 70% to 95% on the first card and 70% to 86% on the second. On the other hand, some games like RotTR and Sleeping Dogs can utilize up to 99% on both cards. I am using only two old SLI bridges because no HB bridges were available locally. Is this just how it is, or has SLI support gone downhill after Maxwell? I doubt my overclocked CPU is the bottleneck. If this is the case, the second card will be pretty much useless for me.







I am using a 2560x1440 144Hz monitor, but using DSR and 4K doesn't seem to improve scaling.

Titan X Maxwell SLI was amazing and worked very well on this system.

Also, when enabling G-Sync I am getting even worse scaling and massive stuttering, so I disabled it. I found similar cases on the NVIDIA forums too.
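For anyone wanting to put a single number on scaling like this, the usual figure is the extra FPS the second card adds relative to one card. A minimal Python helper (hypothetical, with made-up example numbers):

```python
def sli_scaling_pct(fps_single, fps_sli):
    """Extra performance from the second card, as a percentage of one card.
    100% would be perfect doubling; near 0% means the second card adds nothing."""
    return (fps_sli / fps_single - 1) * 100

print(sli_scaling_pct(60, 120))  # perfect scaling -> 100.0
print(sli_scaling_pct(60, 110))  # good scaling -> ~83.3
print(sli_scaling_pct(60, 75))   # poor scaling / CPU-bound territory -> 25.0
```

GPU utilization percentages hint at the same thing, but the FPS ratio is the measure that actually matters to the player.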


----------



## unreality

Scaling is pretty bad at 1440p. I get CPU-limited with one TX in many games even with an overclocked CPU. So yes, you probably are at a CPU limit.

This should change with DSR at 4K or higher; if not, reinstall your drivers with DDU.


----------



## Sheyster

Quote:


> Originally Posted by *Techenthused73*
> 
> On air or water? I just got my TX pascal on Friday. I watched Jayz 2 cents guide on overclocking Pascal and it seemed really complicated with voltage limits etc. Also it was a instruction on using Precision X but Precision X does not recognize the Titan X as it is not a EVGA card. Should I just use MSI afterburner and add 200 on the core and 500 on the memory and call it a day. Card is plenty powerful without an overclock so like you I'm not sure if it is worth it with temps. Not sure either what the fan curve should be or what power/voltage sliders should be at.


For gaming I add +150 to core and call it a day. This nets me 2000 MHz before any throttling. Very diminishing returns over 2000 MHz in games (FPS) with this card.


----------



## Sheyster

Quote:


> Originally Posted by *Robilar*
> 
> Considering grabbing one of these in place of my current 1080 and bypassing the Ti when its released.
> 
> Are there summaries of fps comparisons between this card and the 1080?
> 
> Also it looks like its only available through the nvidia store?


DO IT! (We have the same monitor BTW.)

I came from an EVGA GTX 1080 FTW, and this card just smokes it in everything. I had a good OC'ing 1080 too (2100+). I bought the TXP and never looked back. As long as you can stomach the $1200 price tag, get one.







Yes, nVidia store only.


----------



## xTesla1856

Quote:


> Originally Posted by *Sheyster*
> 
> DO IT! (We have the same monitor BTW.)
> 
> I came from an EVGA GTX 1080 FTW, and this card just smokes it in everything. I had a good OC'ing 1080 too (2100+). I bought the TXP and never looked back. As long as you can stomach the $1200 price tag, get one.
> 
> 
> 
> 
> 
> 
> 
> Yes, nVidia store only.


Hello Sheyster, nice to see you in a Titan thread again


----------



## Sh3perd

What program should I use for OC'ing this card on a watercooled setup? I'm very new to overclocking GPUs, so whatever is most beginner-friendly.


----------



## arrow0309

Msi Afterburner


----------



## CptSpig

Quote:


> Originally Posted by *Sh3perd*
> 
> What program should I use for OC'ing this card on a watercooled setup? Im very new to overclocking GPU's, so whatever is more beginner friendly.


MSI Afterburner


----------



## bizplan

Quote:


> Originally Posted by *Sh3perd*
> 
> What program should I use for OC'ing this card on a watercooled setup? Im very new to overclocking GPU's, so whatever is more beginner friendly.


EVGA Precision XOC is OK although I was able to get higher scores in 3dMark with MSI AB (using the same settings).


----------



## BelowAverageIQ

Well, I take my other post back. My card will do +1000 on the memory and +245 on the core (12000/2138) in Heaven, Arma 3 and DayZ Standalone without any issues.

Max temp today though was 46 degrees, which brought it back down a little. Room temp is 32 degrees; outside it's 41 degrees today


----------



## Baasha

Quote:


> Originally Posted by *BelowAverageIQ*
> 
> Well I take my other post back. *My card will do +1000 on the memory and +245 on core* (12000/2138) in heaven, Arma 3 and DayZ standalone without any issues.
> 
> Max temp today though was 46 degrees which brought it back down a little. Room temp is 32 degrees. Outside is 41 degrees today


o_0

You have a beast of a Titan XP!









Mine can do "only" +200mhz on the core and +675Mhz on the Mem.. but in 4-Way SLI though.


----------



## Glerox

Quote:


> Originally Posted by *bee144*
> 
> Just received my two Titan Xp yesterday. Starting to overclock.
> 
> Is my Corsair AX1200i enough for the rig in my signature?


Your Corsair is definitely enough in my opinion! I bought a 1000W PSU just in case I wanted to add a second TXP, so at 1200W you're fine.

Btw, I can't say enough how much this GPU amazes me when overclocked/watercooled.
Just bought DOOM, and playing in 4K with everything maxed out including AA I'm around 90-100 fps. With only one GPU!


----------



## qazplm5089

As for modding the EVGA Hybrid kit for the Titan XP, how come people don't use the FTW kit? The FTW kit doesn't need to be cut near the power connectors because the 1080 FTW is already an 8+8. Are there other incompatibilities that prevent the FTW kit from being used rather than the regular hybrid kit? I'm asking because I would rather not make the necessary cut on the shroud.


----------



## Menthol

The FTW is a custom board, the shroud is not the same dimensions as a FE or TXP board


----------



## asheth007

Quote:


> Originally Posted by *CptSpig*
> 
> +1 I got my best Fire Strike score with no memory overclock and 225 on the core. I can overclock my Titan XP with 240 on the core and 550 memory and the card is stable. Don't see any benefit on any bench marks or games.


This was my experience: I kept upping my memory clocks, then dialed them back down and got higher scores in Unigine and Fire Strike with a lower memory overclock.


----------



## patrickisfrench

Quote:


> Originally Posted by *qazplm5089*
> 
> As for modding the EVGA Hybrid kit for the Titan XP, how come people don't use the FTW kit? The FTW kit doesn't need to be cut near the power connectors because the 1080 FTW is already an 8+8. Are there other incompatibilities that prevent the FTW kit from being used rather than the regular hybrid kit? I'm asking because I would rather not make the necessary cut on the shroud.


just wait for the titan specific cooler from evga. just a few more weeks


----------



## patrickisfrench




----------



## qazplm5089

Oh. Looks like "a few weeks" could mean it's launching alongside the rumored 1080 Ti.


----------



## Silent Scone

Quote:


> Originally Posted by *qazplm5089*
> 
> Oh. Looks like a few weeks could mean it's launching on the rumored 1080 ti launch with the 1080 ti.


Ti will be around March. Perhaps. Who knows?


----------



## MrKenzie

Quote:


> Originally Posted by *Glerox*
> 
> Your corsair is really enough in my opinion! I bought a 1000w PSU just in case I wanted to buy a second TXP so 1200w you're fine.
> 
> Btw, I can't tell enough how this GPU amazes me when overclocked/watercooled.
> Just bought DOOM and playing in 4k everything maxed out including AA I'm around 90-100 fps. With only one GPU!


Agreed! I was seeing 55-60fps with a single 1080 overclocked on air. Then with my Titan XP overclocked on water I was seeing 100fps!


----------



## DarkHell2

Quote:


> Originally Posted by *MrKenzie*
> 
> Agreed! I was seeing 55-60fps with 1x 1080 overclocked on air. Then my overclocked on water Titan XP I was seeing 100fps!


Really? nearly 50% fps rise with water cooling + overclock? Really?


----------



## axiumone

Quote:


> Originally Posted by *qazplm5089*
> 
> As for modding the EVGA Hybrid kit for the Titan XP, how come people don't use the FTW kit? The FTW kit doesn't need to be cut near the power connectors because the 1080 FTW is already an 8+8. Are there other incompatibilities that prevent the FTW kit from being used rather than the regular hybrid kit? I'm asking because I would rather not make the necessary cut on the shroud.


The FTW is a completely different PCB from a TXP: taller, with a different VRM layout.


----------



## ChronoBodi

So my TXP arrived, finally.

I may be part of this club now.


----------



## NRosko

I'm struggling to decide on a monitor for my new Titan XP. I only have one card; is that enough for a 32-inch 4K? Am I better off going for one of those X34 widescreens? Trying to decide on a monitor has been much harder than picking my system parts. I use a Rift for gaming but want a better monitor for games like Witcher 3, Dishonored 2, GTA V, or any good upcoming single-player games. When a good cooler arrives I will OC the GPU if I think it's worth it.


----------



## asheth007

Quote:


> Originally Posted by *NRosko*
> 
> I struggling to decide on a monitor for my new titan xp, i only have one card is that enough for a 32inch 4k? Am i better going for one of those x34 wide screens? Trying to decide on a monitor has been much harder than picking my system bits. I use a rift for gaming but want a better monitor for games like witcher3, dishonored2, gtaV, or any good upcoming one player games. When a good cooler arrives i will oc the GPU if i think its worth it.


I've got my eye on the HP Omen X 35 that's supposed to release Feb 1. It's a 35-inch 3440x1440 ultrawide, supposed to be 100Hz native, with an AMVA panel rather than IPS, so that's a personal preference. On r/ultrawidemasterrace on Reddit I see people bring up QC issues with the X34 and PG348Q, so I'm not sure I want to get either of them since they use the same panel.


----------



## unreality

I did try the new Asus 32-inch 4K G-Sync monitor and kept returning to my Swift at 1440p/144Hz. I guess it's just personal preference, but I could - or can, in that case - never go back to 60Hz. Colors were much better on the 60Hz IPS though.

Also, at 4K 60Hz the TX (OC'd under water) sometimes is not at 100% usage; 1440p at 144Hz is more taxing.

But in the end it's just a personal preference you have to decide on.

@ChronoBodi: Welcome to the club


----------



## jhowell1030

Quote:


> Originally Posted by *NRosko*
> 
> I struggling to decide on a monitor for my new titan xp, i only have one card is that enough for a 32inch 4k? Am i better going for one of those x34 wide screens? Trying to decide on a monitor has been much harder than picking my system bits. I use a rift for gaming but want a better monitor for games like witcher3, dishonored2, gtaV, or any good upcoming one player games. When a good cooler arrives i will oc the GPU if i think its worth it.


Personally, I prefer the added immersion of 21:9 and higher framerates over the harder-to-drive 4K. I grabbed the PA328Q for a month just to try out 4K and immediately returned it to go back to my Predator X34.

I also do a little bit of audio production and like the wider aspect ratio for Adobe Audition.

Edited to specify my 21:9 monitor.


----------



## NRosko

Quote:


> Originally Posted by *asheth007*
> 
> I've got my eye on the HP Omen X 35 that's suppose to release Feb 1 it's 35 inch 3440x1440p ultrawide. It's suppose to be 100hz native it's AMVA panel not IPS so that would a personal preference. On r/ultrawidemasterrace on reddit I see people bring up QC issues with the X34 and PG348Q some I'm not sure if I want to get either of them since they have the same panel.


Yes, I have my eye on that also; to me it looks less like a toy than the Acer & Asus designs, so I'm interested to see what it's like. The only thing is I've not seen a UK release date for it; I emailed them about it but got no response.


----------



## NRosko

Quote:


> Originally Posted by *jhowell1030*
> 
> Personally, I prefer the added immersion of a 21:9 and higher framerates over the GPU harder hitting 4k. I grabbed the PA328Q for a month just to try out 4k and immediately returned it. to go back to my predator x34.
> 
> I also do a little bit of audio production and like the wider aspect ratio for adobe audition.
> 
> Edited to specify my 21:9 monitor.


Yes, I'm leaning toward a wider monitor, so it's interesting that you tried both and prefer the X34. What I can't really get my head around is how 4K @ 60Hz compares to 21:9 @ 100Hz in terms of the power needed - is higher Hz less taxing than higher resolution at lower Hz? I also do audio production (I use Ableton Live), so I'm interested in how that program works on a widescreen.
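On raw pixel throughput the two options are closer than they look: pixels per second is just resolution times refresh rate. A quick Python comparison (first-order only, since per-pixel shader cost varies by game):

```python
def pixels_per_second(width, height, hz):
    """Raw pixel throughput a GPU must sustain at a given resolution and refresh rate."""
    return width * height * hz

uw = pixels_per_second(3440, 1440, 100)   # 21:9 1440p ultrawide at 100 Hz
uhd = pixels_per_second(3840, 2160, 60)   # 4K at 60 Hz
print(uw, uhd)          # 495,360,000 vs 497,664,000 pixels/s
print(uhd / uw)         # nearly identical (within ~0.5%)
```

So by this rough measure the two targets demand almost exactly the same from the GPU, which matches the in-game reports above of similar framerate headroom on both.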


----------



## NRosko

Quote:


> Originally Posted by *unreality*
> 
> Also, at 4K 60Hz the TX (OC'd under water) sometimes is not at 100% usage; 1440p at 144Hz is more taxing.


That is actually not what I would expect. I'm running such old monitors I can't really imagine what 144Hz is like or how it affects the display.


----------



## jhowell1030

Quote:


> Originally Posted by *NRosko*
> 
> Yes i'm leaning that way for a wider monitor, so its interesting you try both & prefer the x34. What i can't really get my head around is how does a 4K @ 60hz compare to 21:9 @ 100hz in terms of the power needed, is higher hz less taxing than higher resolution at lower hz? i also do audio production I use Ableton live so I'm interested in how that program works on a widescreen.


There are some games that I don't hit 100FPS on with the XP. GTA V, Witcher 3, Batman Arkham Knight, all average around 85-90 with everything maxed out for me. Deus Ex: Mankind Divided was probably the worst offender recently averaging about 65. Other titles like Gears 4 and Tomb Raider (haven't installed the new one yet) I hit 100fps no problem.

I didn't try all of these games on the pa328q but I remember GTA V and Witcher both had lower frames.

For me it honestly came down to the overall immersion of the 21:9 and higher FPS overall. Honestly, the fidelity of 4k @ 32" vs the pixel density of 1440 at ultrawide was nice...but it didn't blow me away enough to leave the smoother, more immersive experience behind.

Everyone likes something different.


----------



## ChronoBodi

Well balls, this OC'd Titan XP can do a constant 65-70 fps on Ultra settings at 4K in Gears of War 4, with no AA or motion blur, as I always turn those off no matter what.

Finally, the first true 4K60 card.

Now the thing is, I heard it's like 45 FPS in Crysis 3, but at this point don't we have BF1, which looks about as good as Crysis but with better performance?

I'm just saying, is Crysis 3 still a valid benchmark anymore?


----------



## jhowell1030

Quote:


> Originally Posted by *ChronoBodi*
> 
> Well balls, this OCed Titan XP can do 65-70 fps constant on Ultra settings at 4K on Gears of War 4, no AA or motion blur as I always turn those off no matter what


I was surprised at how well that game handles. Turned everything up to max, ran it at DSR +25% (just for fun), and still hit an easy 100fps.

The cut scenes, though. I don't know why, but half of them would play at less than 30 while the other half would run fine. ???


----------



## NRosko

Well, I just did a stress test with 3DMark Time Spy and the GPU was hitting 1850MHz in MSI Afterburner. Is this correct, as I thought the boost was 1531MHz? I don't know a huge amount about overclocking GPUs - did I OC by mistake by running Afterburner? I was just using it to monitor things; I didn't touch a thing.


----------



## ChronoBodi

Quote:


> Originally Posted by *NRosko*
> 
> Well i just did a stress test with 3d mark time spy & the GPU was hitting 1850mhz in msi afterburner is this correct as i thought the boost was 1531mhz? I don't know a huge amount about overclocking GPUs did i OC by mistake by running afterburner? I was just using it to monitor things i didn't touch a thing.


1531MHz is the MINIMUM it will ever boost to.

OC'd, though, it can boost to 1900-2000MHz consistently, as long as you keep a fan profile that goes to 100% fan speed at 60C, set an OC of +180 on the core and +200 on the memory in MSI Afterburner, AND raise the power limit to 120% and the temp limit to 90C.

Here's my Time Spy and Fire Strike:

http://www.3dmark.com/3dm/17450663?
http://www.3dmark.com/3dm/17450787?
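The behaviour described above, with boost clocks sliding down as the card warms, can be sketched as a simple model. This Python snippet is purely illustrative: the ~13MHz bin size is the commonly observed Pascal granularity, but the temperature thresholds here are made-up parameters, not NVIDIA's actual tables:

```python
BIN_MHZ = 13  # approximate Pascal boost bin granularity

def boosted_clock(max_boost_mhz, temp_c, thresholds=(40, 50, 60, 70, 80)):
    """Illustrative GPU Boost model: drop one ~13 MHz bin per temperature
    threshold crossed. Real thresholds vary per card and BIOS."""
    bins_dropped = sum(1 for t in thresholds if temp_c >= t)
    return max_boost_mhz - bins_dropped * BIN_MHZ

print(boosted_clock(2000, 35))  # 2000 -> a cool (watercooled) card holds max boost
print(boosted_clock(2000, 65))  # 1961 -> three bins dropped on a warm air-cooled card
```

This is why an aggressive fan profile effectively "overclocks" the card even with the sliders untouched: keeping the die cooler simply avoids crossing the bins.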


----------



## Robilar

Quote:


> Originally Posted by *NRosko*
> 
> Yes I have my eye on that also, for me its looks less of a stupid toy than Acer & Asus designs, so i'm interested to see what its like. Only thing is i've not seen a UK release date for it & I emailed them about it but no response.


I was looking at the HP Omen as well. Same size and screen type as mine (VA) and it has G-Sync, but only 100Hz?


----------



## NRosko

Quote:


> Originally Posted by *Robilar*
> 
> I was looking at the hp omen as well. Same size and screen as mine (VA) and has gsync but only 100hz?


3440×1440 @ 100hz native.


----------



## DNMock

Quote:


> Originally Posted by *jhowell1030*
> 
> There are some games that I don't hit 100FPS on with the XP. GTA V, Witcher 3, Batman Arkham Knight, all average around 85-90 with everything maxed out for me. Deus Ex: Mankind Divided was probably the worst offender recently averaging about 65. Other titles like Gears 4 and Tomb Raider (haven't installed the new one yet) I hit 100fps no problem.
> 
> I didn't try all of these games on the pa328q but I remember GTA V and Witcher both had lower frames.
> 
> For me it honestly came down to the overall immersion of the 21:9 and higher FPS overall. Honestly, the fidelity of 4k @ 32" vs the pixel density of 1440 at ultrawide was nice...but it didn't blow me away enough to leave the smoother, more immersive experience behind.
> 
> Everyone likes something different.


I don't hit 100fps in many games running SLI TXPs @ 2000MHz... Of course, I usually tweak all the hidden ini settings through the roof too, though.


----------



## EniGma1987

Quote:


> Originally Posted by *NRosko*
> 
> Well, I just did a stress test with 3DMark Time Spy and the GPU was hitting 1850MHz in MSI Afterburner. Is this correct, as I thought the boost was 1531MHz? I don't know a huge amount about overclocking GPUs. Did I OC by mistake by running Afterburner? I was just using it to monitor things; I didn't touch a thing.


Default out of the box speed for nearly every Titan X is 1850MHz with no overclocking.


----------



## ChronoBodi

Quote:


> Originally Posted by *EniGma1987*
> 
> Default out of the box speed for nearly every Titan X is 1850MHz with no overclocking.


More like throttling to 1550-1750MHz due to the 50% fan cap, if MSI Afterburner is never used.

If OCed and kept cool by a more aggressive fan profile, an easy 1950MHz boost, consistently.


----------



## MrKenzie

Quote:


> Originally Posted by *DarkHell2*
> 
> Really? nearly 50% fps rise with water cooling + overclock? Really?


Yes, I don't know why or how, but the maximum frame rate I would get with the 1080 was 70fps, averaging around 55-60, while I was seeing up to 110fps with the Titan, with the average around 95-100fps.


----------



## xTesla1856

This is insane, and I love it.

http://www.3dmark.com/3dm/17468210?


----------



## CptSpig

Quote:


> Originally Posted by *xTesla1856*
> 
> This is insane, and I love it.
> 
> http://www.3dmark.com/3dm/17468210?


I agree! http://www.3dmark.com/fs/11486413


----------



## Nicklas0912

Any BIOS mod out there to increase the power target limit?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Nicklas0912*
> 
> Any BIOS mod out there to increase the power target limit?


No.


----------



## jsutter71

Quote:


> Originally Posted by *CptSpig*
> 
> I agree! http://www.3dmark.com/fs/11181198


I know I am.
http://www.3dmark.com/fs/10622261


----------



## CptSpig

Quote:


> Originally Posted by *jsutter71*
> 
> I know I am.
> http://www.3dmark.com/fs/10622261


Nice!


----------



## pez

Quote:


> Originally Posted by *NRosko*
> 
> I'm struggling to decide on a monitor for my new Titan XP. I only have one card; is that enough for a 32-inch 4K? Am I better off going for one of those X34 widescreens? Trying to decide on a monitor has been much harder than picking my system parts. I use a Rift for gaming but want a better monitor for games like The Witcher 3, Dishonored 2, GTA V, or any good upcoming single-player games. When a good cooler arrives I will OC the GPU, if I think it's worth it.


I really enjoyed 4K for gaming on SLI 1080s prior to having my Titan X P. I think Gsync would be a must for a single Titan, though.

21:9 1440p IMO, is a perfect match for the Titan X P at the moment. You hit 100fps in quite a few modern titles and it looks great. All of the games it looks like you're interested in playing are 21:9 compatible (at least I know Dishonored 2 and GTA V are), so I think you would have a good experience there.

I picked up a Dell S2716DGR yesterday to try out 1440p @ 144hz for myself and so far I'm leaning towards the higher refresh rate boat. 100hz to 144hz is noticeable, but not as noticeable as 60 to 100hz was. My current biggest gripe with the x34 and 21:9 in general is the lack of motivation in support of some titles. I'm not really into coaxing my games into working with the resolution sometimes.

So far, playing Overwatch, and Fallout 4 have been a bit miserable with 21:9. CS:GO stretches out the 16:9 resolution while setting the 21:9 counterpart, and Overwatch cuts off your FOV on the top and bottom if using 21:9. So you're essentially forced to play a really closed in feeling FOV, or playing the game with flanking black bars. And then there's playing older titles. I've been playing Borderlands (1) with my GF and it too has the Overwatch effect. Both Borderlands and Fallout 4 have fixes online, but for me, I don't fancy the trouble of getting it to work, and then run poorly (i.e. Fallout 4).

TL;DR when 21:9 is supported by a title, it's amazing. When it's not, it's a bit frustrating.


----------



## jhowell1030

Quote:


> Originally Posted by *pez*
> 
> I really enjoyed 4K for gaming on SLI 1080s prior to having my Titan X P. I think Gsync would be a must for a single Titan, though.
> 
> 21:9 1440p IMO, is a perfect match for the Titan X P at the moment. You hit 100fps in quite a few modern titles and it looks great. All of the games it looks like you're interested in playing are 21:9 compatible (at least I know Dishonored 2 and GTA V are), so I think you would have a good experience there.
> 
> I picked up a Dell S2716DGR yesterday to try out 1440p @ 144hz for myself and so far I'm leaning towards the higher refresh rate boat. 100hz to 144hz is noticeable, but not as noticeable as 60 to 100hz was. My current biggest gripe with the x34 and 21:9 in general is the lack of motivation in support of some titles. I'm not really into coaxing my games into working with the resolution sometimes.
> 
> So far, playing Overwatch, and Fallout 4 have been a bit miserable with 21:9. CS:GO stretches out the 16:9 resolution while setting the 21:9 counterpart, and Overwatch cuts off your FOV on the top and bottom if using 21:9. So you're essentially forced to play a really closed in feeling FOV, or playing the game with flanking black bars. And then there's playing older titles. I've been playing Borderlands (1) with my GF and it too has the Overwatch effect. Both Borderlands and Fallout 4 have fixes online, but for me, I don't fancy the trouble of getting it to work, and then run poorly (i.e. Fallout 4).
> 
> TL;DR when 21:9 is supported by a title, it's amazing. When it's not, it's a bit frustrating.


Personally, any game that doesn't support 21:9 I just play at 16:9 1440p. Doesn't bother me a bit


----------



## willverduzco

Quote:


> Originally Posted by *pez*
> 
> I really enjoyed 4K for gaming on SLI 1080s prior to having my Titan X P. I think Gsync would be a must for a single Titan, though.
> 
> 21:9 1440p IMO, is a perfect match for the Titan X P at the moment. You hit 100fps in quite a few modern titles and it looks great. All of the games it looks like you're interested in playing are 21:9 compatible (at least I know Dishonored 2 and GTA V are), so I think you would have a good experience there.
> 
> I picked up a Dell S2716DGR yesterday to try out 1440p @ 144hz for myself and so far I'm leaning towards the higher refresh rate boat. 100hz to 144hz is noticeable, but not as noticeable as 60 to 100hz was. My current biggest gripe with the x34 and 21:9 in general is the lack of motivation in support of some titles. I'm not really into coaxing my games into working with the resolution sometimes.
> 
> So far, playing Overwatch, and Fallout 4 have been a bit miserable with 21:9. CS:GO stretches out the 16:9 resolution while setting the 21:9 counterpart, and Overwatch cuts off your FOV on the top and bottom if using 21:9. So you're essentially forced to play a really closed in feeling FOV, or playing the game with flanking black bars. And then there's playing older titles. I've been playing Borderlands (1) with my GF and it too has the Overwatch effect. Both Borderlands and Fallout 4 have fixes online, but for me, I don't fancy the trouble of getting it to work, and then run poorly (i.e. Fallout 4).
> 
> TL;DR when 21:9 is supported by a title, it's amazing. When it's not, it's a bit frustrating.


+1 to most of that from me as well. Just like you, I went from 2x 1080 SLI to 1x TXP under water at 2176 MHz. The difference in raw pixel pushing and compute power is a little noticeable at 4k, but it's so much nicer to not deal with multi-card issues any more. Also like you, I went from a high res 27" display to a 34" ultrawide at 3440x1440, 100 Hz monitor with Gsync. (I chose the Asus PG348 variant due to the better menu and factory calibration, though it took me a while to get used to the "unique" base).

Just like your experiences, I really dislike vertical black bars when viewing 16:9 content / games on my 21:9 screen. In fact, unless it's a game I'm truly addicted to (e.g. Killer Instinct on the Win10 app store), I basically refuse to play the games that force me to use 16:9. However, I think you are painting a bit too bleak a picture of ultrawide support. Most modern games work just fine in ultrawide with minimal or no tweaking at all.

In CSGO (which you bring up as an example), the game itself scales perfectly, increasing FOV proportionally with horizontal resolution (HOR+). In fact, multi-monitor gives you even more of an advantage by offering an extremely wide FOV (though these setups have other issues). I'm currently only a legendary eagle master (and I've never been global elite, even before the rank update last year), but I rarely ever get caught fully off guard when a lurker is in the periphery. This is very useful when you're in Kerrigan spot as a CT and have to watch both long A and short A, if you're pushing catwalk as a T and have to look at mid double doors and cat stairs at the same time, and if you're waiting top mid as a T and need to keep an eye at both cat/mid and the entrance to long hut. The only place where I run into an issue with CSGO is in the hud scaling. In 21:9 it's fine if you turn down the scaling size, but in multi-monitor, it simply gets huge. In addition, the score screen scales according to horizontal instead of vertical resolution, leading to a cutoff display. This isn't a major issue since you can still see how you're doing in a 5v5 comp, but I'd imagine that in casual or DM you'd not see everyone. Moreover, it's hard to report players at the bottom end of the screen if you have to, even in a 5v5 comp. (To get around that, I click on "commend" and then switch tabs to report.)

I don't have Overwatch or FO4, but other modern games like Doom 2016, Battlefield 4 and One, Rainbow Six Siege, Shadow Warrior 2013 and 2, Deus Ex Human Revolution, Deus Ex Mankind Divided, Alien Isolation, Rocket League, Forza Horizon, and countless others work just fine. Some don't offer a wide enough FOV, which was the case for me in games like Singularity, Dead Space, Remember Me, and Bioshock Infinite. However in these games, you can always edit the INIs or download a utility like Flawless Widescreen which does all the hard work for you. You can always look up the game's compatibility and simple fixes on WSGF. With that website and utility, it's now a pretty rare occurrence that I can't get a game working at 21:9 well with very minimal effort (if any is required at all).


----------



## jhowell1030

Quote:


> Originally Posted by *willverduzco*
> 
> +1 to most of that from me as well. Just like you, I went from 2x 1080 SLI to 1x TXP under water at 2176 MHz. The difference in raw pixel pushing and compute power is a little noticeable at 4k, but it's so much nicer to not deal with multi-card issues any more. Also like you, I went from a high res 27" display to a 34" ultrawide at 3440x1440, 100 Hz monitor with Gsync. (I chose the Asus PG348 variant due to the better menu and factory calibration, though it took me a while to get used to the "unique" base).
> 
> Just like your experiences, I really dislike vertical black bars when viewing 16:9 content / games on my 21:9 screen. In fact, unless it's a game I'm truly addicted to (e.g. Killer Instinct on the Win10 app store), I basically refuse to play the games that force me to use 16:9. However, I think you are painting a bit too bleak a picture of ultrawide support. Most modern games work just fine in ultrawide with minimal or no tweaking at all.
> 
> In CSGO (which you bring up as an example), the game itself scales perfectly, increasing FOV proportionally with horizontal resolution (HOR+). In fact, multi-monitor gives you even more of an advantage by offering an extremely wide FOV (though these setups have other issues). I'm currently only a legendary eagle master (and I've never been global elite, even before the rank update last year), but I rarely ever get caught fully off guard when a lurker is in the periphery. This is very useful when you're in Kerrigan spot as a CT and have to watch both long A and short A, if you're pushing catwalk as a T and have to look at mid double doors and cat stairs at the same time, and if you're waiting top mid as a T and need to keep an eye at both cat/mid and the entrance to long hut. The only place where I run into an issue with CSGO is in the hud scaling. In 21:9 it's fine if you turn down the scaling size, but in multi-monitor, it simply gets huge. In addition, the score screen scales according to horizontal instead of vertical resolution, leading to a cutoff display. This isn't a major issue since you can still see how you're doing in a 5v5 comp, but I'd imagine that in casual or DM you'd not see everyone. Moreover, it's hard to report players at the bottom end of the screen if you have to, even in a 5v5 comp. (To get around that, I click on "commend" and then switch tabs to report.)
> 
> I don't have Overwatch or FO4, but other modern games like Doom 2016, Battlefield 4 and One, Rainbow Six Siege, Shadow Warrior 2013 and 2, Deus Ex Human Revolution, Deus Ex Mankind Divided, Alien Isolation, Rocket League, Forza Horizon, and countless others work just fine. Some don't offer a wide enough FOV, which was the case for me in games like Singularity, Dead Space, Remember Me, and Bioshock Infinite. However in these games, you can always edit the INIs or download a utility like Flawless Widescreen which does all the hard work for you. You can always look up the game's compatibility and simple fixes on WSGF. With that website and utility, it's now a pretty rare occurrence that I can't get a game working at 21:9 well with very minimal effort (if any is required at all).


VERY good points. It's worth noting that Blizzard purposefully made 21:9 more narrow in order to keep the FOV at 103 for all aspect ratios.


----------



## DNMock

Quote:


> Originally Posted by *pez*
> 
> I really enjoyed 4K for gaming on SLI 1080s prior to having my Titan X P. I think Gsync would be a must for a single Titan, though.
> 
> 21:9 1440p IMO, is a perfect match for the Titan X P at the moment. You hit 100fps in quite a few modern titles and it looks great. All of the games it looks like you're interested in playing are 21:9 compatible (at least I know Dishonored 2 and GTA V are), so I think you would have a good experience there.
> 
> I picked up a Dell S2716DGR yesterday to try out 1440p @ 144hz for myself and so far I'm leaning towards the higher refresh rate boat. 100hz to 144hz is noticeable, but not as noticeable as 60 to 100hz was. My current biggest gripe with the x34 and 21:9 in general is the lack of motivation in support of some titles. I'm not really into coaxing my games into working with the resolution sometimes.
> 
> So far, playing Overwatch, and Fallout 4 have been a bit miserable with 21:9. CS:GO stretches out the 16:9 resolution while setting the 21:9 counterpart, and Overwatch cuts off your FOV on the top and bottom if using 21:9. So you're essentially forced to play a really closed in feeling FOV, or playing the game with flanking black bars. And then there's playing older titles. I've been playing Borderlands (1) with my GF and it too has the Overwatch effect. Both Borderlands and Fallout 4 have fixes online, but for me, I don't fancy the trouble of getting it to work, and then run poorly (i.e. Fallout 4).
> 
> TL;DR when 21:9 is supported by a title, it's amazing. When it's not, it's a bit frustrating.


https://www.flawlesswidescreen.org/

helps a ton


----------



## Glerox

Quote:


> Originally Posted by *MrKenzie*
> 
> Agreed! I was seeing 55-60fps with a single overclocked 1080 on air. Then, with my overclocked Titan XP on water, I was seeing 100fps!


Nice! I did the same move.
Had I known Nvidia would release a Titan just two months after the GTX 1080, I would never have bought it (I actually bought two...).
I'm still glad I made the move from two 1080s in SLI to one Titan XP.


----------



## arrow0309

Any of you guys running a single Titan X overclocked with a Predator XB271HK 4K?
I've just ordered one from Amazon. I've been running an X34A for a couple of days, but it didn't quite convince me: same detail as at 1440p, and too wide for my taste.


----------



## pez

Quote:


> Originally Posted by *jhowell1030*
> 
> Personally, any game that doesn't support 21:9 I just play at 16:9 1440p. Doesn't bother me a bit


Yeah, I don't mind the black bars in many things, but I find myself playing a lot of Overwatch, or a backlog of games that require tweaking to make work. And I'm not always in the mood for tweaking everything before loading up a title. Again, it's me being super nitpicky, but when I spend >$1200 on a monitor, I can be nitpicky. Keep in mind Fallout 4 made me extremely sour about the whole situation as well. There are fixes for the game, and they work generally well, but they didn't make up for the game's poor performance and bad GPU scaling at that res. My best experience in that game was at 4K, honestly.

However, I know neither 21:9 nor either of the games here can be put fully at fault. In the end, it's just my personal experience and general frustrations with it.

I've bought a S2716DGR in the meantime to see if I really want to give up the Predator. Because, like it was said, when it works, it's amazing.
Quote:


> Originally Posted by *willverduzco*
> 
> +1 to most of that from me as well. Just like you, I went from 2x 1080 SLI to 1x TXP under water at 2176 MHz. The difference in raw pixel pushing and compute power is a little noticeable at 4k, but it's so much nicer to not deal with multi-card issues any more. Also like you, I went from a high res 27" display to a 34" ultrawide at 3440x1440, 100 Hz monitor with Gsync. (I chose the Asus PG348 variant due to the better menu and factory calibration, though it took me a while to get used to the "unique" base).
> 
> Just like your experiences, I really dislike vertical black bars when viewing 16:9 content / games on my 21:9 screen. In fact, unless it's a game I'm truly addicted to (e.g. Killer Instinct on the Win10 app store), I basically refuse to play the games that force me to use 16:9. However, I think you are painting a bit too bleak a picture of ultrawide support. Most modern games work just fine in ultrawide with minimal or no tweaking at all.
> 
> In CSGO (which you bring up as an example), the game itself scales perfectly, increasing FOV proportionally with horizontal resolution (HOR+). In fact, multi-monitor gives you even more of an advantage by offering an extremely wide FOV (though these setups have other issues). I'm currently only a legendary eagle master (and I've never been global elite, even before the rank update last year), but I rarely ever get caught fully off guard when a lurker is in the periphery. This is very useful when you're in Kerrigan spot as a CT and have to watch both long A and short A, if you're pushing catwalk as a T and have to look at mid double doors and cat stairs at the same time, and if you're waiting top mid as a T and need to keep an eye at both cat/mid and the entrance to long hut. The only place where I run into an issue with CSGO is in the hud scaling. In 21:9 it's fine if you turn down the scaling size, but in multi-monitor, it simply gets huge. In addition, the score screen scales according to horizontal instead of vertical resolution, leading to a cutoff display. This isn't a major issue since you can still see how you're doing in a 5v5 comp, but I'd imagine that in casual or DM you'd not see everyone. Moreover, it's hard to report players at the bottom end of the screen if you have to, even in a 5v5 comp. (To get around that, I click on "commend" and then switch tabs to report.)
> 
> I don't have Overwatch or FO4, but other modern games like Doom 2016, Battlefield 4 and One, Rainbow Six Siege, Shadow Warrior 2013 and 2, Deus Ex Human Revolution, Deus Ex Mankind Divided, Alien Isolation, Rocket League, Forza Horizon, and countless others work just fine. Some don't offer a wide enough FOV, which was the case for me in games like Singularity, Dead Space, Remember Me, and Bioshock Infinite. However in these games, you can always edit the INIs or download a utility like Flawless Widescreen which does all the hard work for you. You can always look up the game's compatibility and simple fixes on WSGF. With that website and utility, it's now a pretty rare occurrence that I can't get a game working at 21:9 well with very minimal effort (if any is required at all).


Most of my reply is above, but IIRC, CS:GO only allows up to a certain FOV, no?
Quote:


> Originally Posted by *DNMock*
> 
> https://www.flawlesswidescreen.org/
> 
> helps a ton


Yeah, I kinda mentioned that. That was part of my small rant about not wanting to always tweak things to get them working.


----------



## Wyllliam

Hi,
Anybody here feel like putting their Titans to use for a good cause?
Join the forum folding war!
Team Intel could use some people with Titans.
For more info, follow the link:

Forum folding war Team Intel


----------



## Leyaena

Quote:


> Originally Posted by *Wyllliam*
> 
> Hi,
> Anybody here feel like putting their Titans to use for a good cause?
> Join the forum folding war!
> Team Intel could use some people with Titans.
> For more info, follow the link:
> 
> Forum folding war Team Intel


Antwerp! Sup?

Also, I'll probably set that up tonight; it's been a while since I messed with Folding@Home.


----------



## Wyllliam

Quote:


> Originally Posted by *Leyaena*
> 
> Antwerp! Sup?
> 
> Also, I'll probably set that up tonight; it's been a while since I messed with Folding@Home.


You're from Antwerp too?
Nice to hear you'll join; it will help a lot.


----------



## willverduzco

Quote:


> Originally Posted by *pez*
> 
> Most of my reply is above, but IIRC, CS:GO only allows up to a certain FOV, no?


It just seems to keep on stretching that FOV to whatever your screen width is. Look at how much more you see horizontally when going from 16:9 to 3x1 surround.




(Images from WSGF, where you can also find an animated gif and HOR+ scaling information.)
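The HOR+ behavior those comparison shots illustrate is standard projection math: hold the vertical FOV constant and let the horizontal FOV grow with the aspect ratio. A minimal sketch of that relationship (the 103-degree base FOV is just an example value, not a claim about CS:GO's internal settings):

```python
import math

def horplus_hfov(base_hfov_deg, base_aspect, target_aspect):
    """HOR+ scaling: keep the vertical FOV implied by the base horizontal
    FOV and base aspect ratio fixed, then widen the horizontal FOV to
    match the target aspect ratio (standard perspective-projection math)."""
    half_h = math.radians(base_hfov_deg) / 2
    half_v = math.atan(math.tan(half_h) / base_aspect)  # implied vertical half-FOV
    return math.degrees(2 * math.atan(math.tan(half_v) * target_aspect))

# An example 103-degree horizontal FOV at 16:9 widens at 21:9, and widens
# much further on 48:9 (3x1) surround, matching the HOR+ screenshots.
print(horplus_hfov(103, 16 / 9, 21 / 9))
print(horplus_hfov(103, 16 / 9, 48 / 9))
```

This is why 21:9 and surround show more at the periphery without distorting the vertical view: only the horizontal half-angle changes.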


----------



## pez

Quote:


> Originally Posted by *willverduzco*
> 
> It just seems to keep on stretching that FOV to whatever your screen width is. Look at how much more you see horizontally when going from 16:9 to 3x1 surround.
> 
> 
> 
> 
> (Images from WSGF, where you can also find an animated gif and HOR+ scaling information.)


I'll have to test it out again. For some reason it felt really strange to me, if not a bit fish-eyed.

On a side note: what are you guys doing as far as noise with the Titan X P? It's one of my biggest cons with the card so far, and it's exacerbated a bit by the fact I have mine in an Ncase. The airflow it's getting is direct, but the case is small and rather 'open', and not necessarily silent, if that makes sense. So far I've applied my OC and then set a max fan of 70% as a 'reasonable' sounding noise profile. Of course, this affects my max OC/boost by knocking me down from 1924-2024MHz to 1794-1850MHz. I haven't done any testing to see if the 200MHz really makes that huge of a difference, but at this point, I'm more interested in keeping my PC a bit silent.

So have you guys (that are still on air) just sucked it up, or what?

I currently have a GTX 1080 on the way (ACX 3.0) to test out. I feel that it's going to fall a bit flat on its face at 21:9 1440p, but it might pair well with the Dell I currently have on hand. At this point I'm deciding between what I'm guessing is 16:9 and low noise, or 21:9 and moderate to high noise. First world problems.


----------



## jhowell1030

Quote:


> Originally Posted by *pez*
> 
> I'll have to test it out again. For some reason it felt really strange to me, if not a bit fish-eyed.
> 
> On a side note: what are you guys doing as far as noise with the Titan X P? It's one of my biggest cons with the card so far, and it's exacerbated a bit by the fact I have mine in an Ncase. The airflow it's getting is direct, but the case is small and rather 'open', and not necessarily silent, if that makes sense. So far I've applied my OC and then set a max fan of 70% as a 'reasonable' sounding noise profile. Of course, this affects my max OC/boost by knocking me down from 1924-2024MHz to 1794-1850MHz. I haven't done any testing to see if the 200MHz really makes that huge of a difference, but at this point, I'm more interested in keeping my PC a bit silent.
> 
> So have you guys (that are still on air) just sucked it up, or what?
> 
> I currently have a GTX 1080 on the way (ACX 3.0) to test out. I feel that it's going to fall a bit flat on its face at 21:9 1440p, but it might pair well with the Dell I currently have on hand. At this point I'm deciding between what I'm guessing is 16:9 and low noise, or 21:9 and moderate to high noise. First world problems.


I honestly never even thought about going on water until I got this card. I even had two 980 K|NGP|N cards which were practically begging for water. However, the noise of that stock XP cooler... I just couldn't handle it.

Now that I have it on water... it makes a vibrating, whine-like noise when it's under any sort of load. Swapped one devil for another. I usually game with a headset on, but I still know it's there.


----------



## pez

Quote:


> Originally Posted by *jhowell1030*
> 
> I honestly never even thought about going on water until I got this card. I even had two 980 K|NGP|N cards which were practically begging for water. However, the noise of that stock XP cooler...I just couldn't handle.
> 
> Now that I have it on water...it makes a vibrating, whining like noise when it's under any sort of load. Swapped one devil for another. I usually game with a headset on but I still know it's there.


Yeah, I'm too in love with the case to swap it just yet. It's that or I go to a different case and repurpose the Ncase. I've seen the sound profile of the ACX 3.0 in videos and it's pretty awesome, but I'm skeptical of the 1080's performance. I know it handled 1440p well, but hopefully it does fine with 21:9, or I can come to some compromise with AA, etc.

I tested out the Titan X at a 60% cap and it did well, and I like to think that's acceptable noise, but I'm not sure. I'm just being picky at this point, really.


----------



## jhowell1030

Quote:


> Originally Posted by *pez*
> 
> Yeah, I'm too in love with the case to swap it just yet. It's that or I go to a different case and repurpose the Ncase. I've seen the sound profile of the ACX 3.0 in videos and it's pretty awesome, but I'm skeptical of the 1080's performance. I know it handled 1440p well, but hopefully it does fine with 21:9, or I can come to some compromise with AA, etc.
> 
> I tested out the Titan X at a 60% cap and it did well, and I like to think that's acceptable noise, but I'm not sure. I'm just being picky at this point, really.


Keep me posted on your experience with 21:9 with a 1080. I had two 980's and almost went to one 1080 because I hated the issues I was having with two cards...but then they announced the XP and my mind was made.

Love that Ncase. Thought about looking into one once upon a time.


----------



## pez

Quote:


> Originally Posted by *jhowell1030*
> 
> Keep me posted on your experience with 21:9 with a 1080. I had two 980's and almost went to one 1080 because I hated the issues I was having with two cards...but then they announced the XP and my mind was made.
> 
> Love that Ncase. Thought about looking into one once upon a time.


Will do. I had 2 1080s, but decided I wanted to go to the Ncase and figured I'd go all out with the Titan and x34. I'm not sure why the noise is bothering me all of a sudden considering I use headphones exclusively. But we shall see.


----------



## jhowell1030

Quote:


> Originally Posted by *pez*
> 
> Will do. I had 2 1080s, but decided I wanted to go to the Ncase and figured I'd go all out with the Titan and x34. I'm not sure why the noise is bothering me all of a sudden considering I use headphones exclusively. But we shall see.


I'm the same way. Can't even hear my XP since I have headphones but just knowing it sounds like a vibrator bothers the heck outta me.


----------



## pez

Quote:


> Originally Posted by *jhowell1030*
> 
> I'm the same way. Can't even hear my XP since I have headphones but just knowing it sounds like a vibrator bothers the heck outta me.


This made me lol pretty hard.


----------



## Nicklas0912

Hello Boys!

How much better will my overclock be with the Titan on water? Right now it can do 2050/2025 stable with the stock fan.


----------



## DNMock

Quote:


> Originally Posted by *Wyllliam*
> 
> Hi,
> Anybody here feel like putting their Titans to use for a good cause?
> Join the forum folding war!
> Team Intel could use some people with Titans.
> For more info, follow the link:
> 
> Forum folding war Team Intel


If they finally fixed the drivers, I'll throw in on it. Last time I checked, the TXP wasn't functioning correctly due to Nvidia driver issues with FAH.

I think I'm still signed up with Team Intel, actually.
Quote:


> Originally Posted by *Nicklas0912*
> 
> Hello Boys!
> 
> How much better will my overclock be with the Titan on water? Right now it can do 2050/2025 stable with the stock fan.


Fans set to 100%?

Max clocks won't improve much, if any, but if you check, you are probably not running that speed consistently as temps go up and the power limit wall gets slammed into.

Run Valley or Heaven or something on loop for about 20 or 30 minutes and check what your clocks level off at on your current setup; odds are you'll be down in the 1900s once it settles after a bit.

A waterblock should remedy that a fair bit, and you will be able to maintain those clocks consistently, since the power limit is tied to temps.

That's really the thing about the TXP: keeping the clocks consistent and preventing Boost 3.0 from downclocking you when you hit that 120% mark. Max recorded clocks won't see much, if any, improvement though, unfortunately.
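As a rough illustration of that behavior, here is a toy model (not NVIDIA's actual Boost 3.0 logic; the power cap, boost-bin size, and watts-per-bin figures are all made-up numbers) of a card that holds its top boost bin until estimated power crosses the limit, then steps down bin by bin:

```python
# Toy model of power-limit throttling, illustrating the behavior described
# above. Hypothetical numbers: a 300 W effective cap (e.g. 250 W at a 120%
# power target), 13 MHz boost bins, and ~5 W saved per bin dropped.

POWER_LIMIT_W = 300

def boost_clock(power_draw_w, max_clock=2000, bin_mhz=13, watts_per_bin=5):
    """Step the clock down one boost bin per `watts_per_bin` watts of
    estimated draw over the power limit; hold max clock under the limit."""
    clock = max_clock
    over = power_draw_w - POWER_LIMIT_W
    while over > 0 and clock > 0:
        clock -= bin_mhz
        over -= watts_per_bin
    return clock

print(boost_clock(290))  # under the cap: holds 2000 MHz
print(boost_clock(320))  # 20 W over: drops 4 bins to 1948 MHz
```

The point of the model is the shape, not the numbers: lowering temperature (water) lowers power draw at the same clock, so the card sits under the cap longer and the sustained clock, not the peak clock, is what improves.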


----------



## Wyllliam

Quote:


> Originally Posted by *DNMock*
> 
> If they fixed the drivers finally I'll throw in on it. Last time I checked TXP wasn't functioning correctly due to Nvidia driver issues w/ FAH.
> 
> I think I'm still signed up with team intel actually.


You can use driver 372.70 for folding.
Or the 376.48 hotfix works as well.
Give it a try and let me know if you get it going.
You have to re-sign up every year.
More info if you follow the link.


----------



## jhowell1030

Quote:


> Originally Posted by *pez*
> 
> This made me lol pretty hard.


Good deal! My wife thought I was hiding some kind of new toy by my desk the first time I went to game out.

"Sorry to disappoint you, babe..."


----------



## DNMock

Quote:


> Originally Posted by *Wyllliam*
> 
> You can use driver 372.70 for folding.
> Or 376.48 hotfix works as well.
> Give it a try and let me know if you get it going.


Ah sweet, I'll give it a go this evening. It's been annoying me for a while now, not being able to put all that muscle to use while at the office.


----------



## Wyllliam

Quote:


> Originally Posted by *DNMock*
> 
> ah sweet, I'll give it a go this evening. Been annoying me for a while now not being able to put all that muscle to use while at the office.


Happy to help.
Don't forget to sign up for team Intel again.
Teams get cleared every year.


----------



## Nicklas0912

Quote:


> Originally Posted by *DNMock*
> 
> If they fixed the drivers finally I'll throw in on it. Last time I checked TXP wasn't functioning correctly due to Nvidia driver issues w/ FAH.
> 
> I think I'm still signed up with team intel actually.
> Fans set to 100%?
> 
> Max clocks won't improve much if any, but if you check you are probably not running that speed consistently as temps go up and the Power Limit wall gets slammed in to.
> 
> Run Valley or Heaven or something on loop for about 20 or 30 minutes and check what your clocks are leveling off at on your current set-up, odds are you will be down in the 1900's once it levels off after a bit.
> 
> Waterblocks should remedy that a fair bit and you will be able to maintain those clocks consistently since Power Limit is tied to temps.
> 
> That's really the thing about TXP, keeping the clocks consistent preventing Boost 3.0 from downclocking you when you hit that 120% mark. Max recorded clocks won't see much, if any improvement though unfortunately.


Most benchmarks yes, just not GPU test 2 in Time Spy. In that one it jumps from 2025 to 1950, but in all the other tests it's 2050 -> 2012.

But with a waterblock you will gain 2-5% TDP headroom, because you no longer have the fan at 100%; it eats some TDP too.

With the stock cooler at 100% the card is max 65C with the OC (I have a total of 15 fans in my case, custom water cooling and stuff). So I hope with the waterblock it'll be 2100 -> 2050 in all tests.


----------



## aomsin2526

Hello guys, I just bought this card last week and tried overclocking it today.
I can reach a max core clock of +250 (2050MHz) and +950 (5950MHz) on the memory. Is that good or bad?


----------



## ChronoBodi

Quote:


> Originally Posted by *aomsin2526*
> 
> Hello guys, I just bought this card last week and tried overclocking it today.
> I can reach a max core clock of +250 (2050MHz) and +950 (5950MHz) on the memory. Is that good or bad?


I wouldn't call that stable if on 100% fan curve stock cooler.

+230 core crashes my TXP, and memory over +400 eats into TDP power that could otherwise go to the core clock to boost higher.

So I keep mine at +200 core, +204 memory, with a 100% fan curve starting at 60C in MSI Afterburner;

it's consistently 1950-2000MHz at all times.


----------



## arrow0309

How am I running?
gpu +227 (highest 2125 lowest 2038) mem +497 (5500), 31C max gpu temp

http://www.3dmark.com/3dm/17518963
http://www.3dmark.com/3dm/17519077
http://www.3dmark.com/3dm/17519277
http://www.3dmark.com/3dm/17518838

Someone told me we "could" stabilise the PL a bit more by tweaking an undervolted custom curve.
Is it possible that our TXP, under heavy load and OC, could improve with less voltage?


----------



## carlhil2

To me, the most impressive thing about the TXP is the temps under water.


----------



## CptSpig

Quote:


> Originally Posted by *arrow0309*
> 
> How am I running?
> gpu +227 (highest 2125 lowest 2038) mem +497 (5500), 31C max gpu temp
> 
> http://www.3dmark.com/3dm/17518963
> http://www.3dmark.com/3dm/17519077
> http://www.3dmark.com/3dm/17519277
> http://www.3dmark.com/3dm/17518838
> 
> Someone told me we "could" stabilise the PL a bit more by tweaking an undervolted custom curve.
> Is it possible that our TXP, under heavy load and OC, could improve with less voltage?


Your graphics scores are very good and your temps are exceptional. Good job.


----------



## MXPOC

Hi all, I'm REALLY hoping you guys can help me out.

I'm not sure that I'm posting in the right place for a bit of technical expertise on the Titan X Pascal. So I recently upgraded from 2X 970s to the mighty Titan X, mainly because 2016 wasn't the best year for SLI support. I was instantly blown away by the sheer performance of the beast but the joy didn't last very long. My PC is restarting during use, but only in very specific situations. The one that seems to trigger it every time is the combined Firestrike test on 3DMark. This test ran perfectly fine with my old cards but seems to be causing a major malfunction with the Titan X Pascal. I have a 750W Corsair PSU which seems to be pulling about 450W at the moment the restart triggers so I seem to have enough overhead. I have also tried running the test at base clock speeds on both the Titan and the CPU but it still cuts out after just a few seconds. Strangely I can run other benchmarks like Heaven and Furmark for 8 to 10 hours straight without issue. I've tried both driver and BIOS updates but nothing seems to help. At this point I'm running out of ideas.

Intel 5820K @ 4.3 GHz (Corsair H100i)
16GB Team Vengeance DDR4
MSI X99A SLI Plus


----------



## Dagamus NM

Quote:


> Originally Posted by *MXPOC*
> 
> Hi all, I'm REALLY hoping you guys can help me out.
> 
> I'm not sure that I'm posting in the right place for a bit of technical expertise on the Titan X Pascal. So I recently upgraded from 2X 970s to the mighty Titan X, mainly because 2016 wasn't the best year for SLI support. I was instantly blown away by the sheer performance of the beast but the joy didn't last very long. My PC is restarting during use, but only in very specific situations. The one that seems to trigger it every time is the combined Firestrike test on 3DMark. This test ran perfectly fine with my old cards but seems to be causing a major malfunction with the Titan X Pascal. I have a 750W Corsair PSU which seems to be pulling about 450W at the moment the restart triggers so I seem to have enough overhead. I have also tried running the test at base clock speeds on both the Titan and the CPU but it still cuts out after just a few seconds. Strangely I can run other benchmarks like Heaven and Furmark for 8 to 10 hours straight without issue. I've tried both driver and BIOS updates but nothing seems to help. At this point I'm running out of ideas.
> 
> Intel 5820K @ 4.3 GHz (Corsair H100i)
> 16GB Team Vengeance DDR4
> MSI X99A SLI Plus


Do you have the extra power connections on your motherboard connected? That has been the issue with others having the same symptoms.


----------



## MXPOC

Quote:


> Originally Posted by *Dagamus NM*
> 
> Do you have the extra power connections on your motherboard connected? That has been the issues with other having the same symptoms


Hi, thanks for the reply. There is an additional 8-pin connector that runs from PSU to the board. I've checked that it is plugged in firmly but to no avail.


----------



## Dagamus NM

Quote:


> Originally Posted by *MXPOC*
> 
> Hi, thanks for the reply. There is an additional 8-pin connector that runs from PSU to the board. I've checked that it is plugged in firmly but to no avail.


Does the x99 board you are using have a four pin molex connector at the bottom?


----------



## MXPOC

Quote:


> Originally Posted by *Dagamus NM*
> 
> Does the x99 board you are using have a four pin molex connector at the bottom?


I've just had a look to double check but I don't believe that it does.


----------



## ChronoBodi

Is this because the OCed Titan XP is drawing slightly over spec from the PCI-E bus? The question is why some of us are just fine at max power limit/OC while others get issues...

What is MXPOC's OC settings?


----------



## bizplan

Quote:


> Originally Posted by *MXPOC*
> 
> I've just had a look to double check but I don't believe that it does.


I wonder if there's a BIOS setting that needs to be changed, e.g. voltage to the CPU or another motherboard component such as the system agent. Otherwise the PSU may be too light; maybe upgrade?


----------



## Seyumi

Quote:


> Originally Posted by *MXPOC*
> 
> Hi all, I'm REALLY hoping you guys can help me out.
> 
> I'm not sure that I'm posting in the right place for a bit of technical expertise on the Titan X Pascal. So I recently upgraded from 2X 970s to the mighty Titan X, mainly because 2016 wasn't the best year for SLI support. I was instantly blown away by the sheer performance of the beast but the joy didn't last very long. My PC is restarting during use, but only in very specific situations. The one that seems to trigger it every time is the combined Firestrike test on 3DMark. This test ran perfectly fine with my old cards but seems to be causing a major malfunction with the Titan X Pascal. I have a 750W Corsair PSU which seems to be pulling about 450W at the moment the restart triggers so I seem to have enough overhead. I have also tried running the test at base clock speeds on both the Titan and the CPU but it still cuts out after just a few seconds. Strangely I can run other benchmarks like Heaven and Furmark for 8 to 10 hours straight without issue. I've tried both driver and BIOS updates but nothing seems to help. At this point I'm running out of ideas.
> 
> Intel 5820K @ 4.3 GHz (Corsair H100i)
> 16GB Team Vengeance DDR4
> MSI X99A SLI Plus


I'm suspecting your PSU regardless of the total watts it has. Out of the half a dozen computer restart issues I've had over the past decade or two, the issue always came back to the PSU.


----------



## MXPOC

Quote:


> Originally Posted by *ChronoBodi*
> 
> Is this because of the OCed Titan XP drawing slightly over spec from PCI-E bus? The question is why me and others are just fine on max power limit/OC and others get issues...
> 
> What is MXPOC's OC settings?


It happens at base clock speed, admittedly less frequently. I can't even hit +50 on the Core Clock without it restarting every single time on Fire Strike.


----------



## MXPOC

Quote:


> Originally Posted by *Seyumi*
> 
> I'm suspecting your PSU regardless of the total watts it has. Out of the half a dozen computer restart issues I've had over the past decade or two, the issue always came back to the PSU.


Thanks for the advice everyone. Do you have any recommendations for a new power supply? It looks like I'm going to have to bite the bullet and go for it. My PSU is definitely the weak link in my PC right now.


----------



## bizplan

Quote:


> Originally Posted by *MXPOC*
> 
> Thanks for the advice everyone. Do you have any recommendations for a new power supply? It looks like I'm going to have to bite the bullet and go for it. My PSU is definitely the weak link in my PC right now.


I would recommend at least a 1,000 watt unit for the X99 platform. I'm pretty happy with my EVGA PSU (although admittedly it's only an 850 watt unit for the Z170 platform).

http://www.evga.com/products/productlist.aspx?type=10&family=Power+Supplies&chipset=1000+Watts


----------



## MXPOC

Quote:


> Originally Posted by *bizplan*
> 
> I wonder if there's a BIOS setting that needs to be changed, e.g. voltage to the CPU or another motherboard component such as the system agent. Otherwise the PSU may be too light; maybe upgrade?


Quote:


> Originally Posted by *bizplan*
> 
> I would recommend at least a 1,000 watt unit for the X99 platform. I'm pretty happy with my EVGA PSU (although admittedly it's only an 850 watt unit for the Z170 platform).
> 
> http://www.evga.com/products/productlist.aspx?type=10&family=Power+Supplies&chipset=1000+Watts



Do you think an 'EVGA SuperNova T2 850W 80 Plus Titanium' would be sufficient or do I really need to push up to 1000W+?


----------



## bizplan

Quote:


> Originally Posted by *MXPOC*
> 
> Do you think an 'EVGA SuperNova T2 850W 80 Plus Titanium' would be sufficient or do I really need to push up to 1000W+?


With a PSU, you should roughly double the maximum anticipated wattage, since PSUs run most efficiently at around half their rated wattage. You hit 450 watts when it restarts, so doubling that gives 900 watts; that's why I was recommending a 1kW unit (and you're on X99).
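That rule of thumb is simple enough to put in a few lines. A quick sketch in Python (the function name, the 2x headroom factor, and the 50W rounding step are just my own choices, not any official sizing tool):

```python
def recommended_psu_watts(peak_draw_watts, headroom=2.0, step=50):
    """Apply the 'double your peak draw' rule of thumb, rounded up
    to the next common PSU size increment (`step` watts)."""
    target = peak_draw_watts * headroom
    return int(-(-target // step) * step)  # ceiling to nearest `step`

print(recommended_psu_watts(450))        # 900
print(recommended_psu_watts(450, 2.2))   # 1000
```

Plug in the peak draw you actually measure under your worst-case load, not the GPU's TDP on paper.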


----------



## Dagamus NM

Quote:


> Originally Posted by *MXPOC*
> 
> I've just had a look to double check but I don't believe that it does.


Too bad that isn't it, would have been the easiest fix.

As the others have said, next stop is the PSU.

1K gives some extra room and keeps you in the efficiency range. I'd bet the 850W would be fine too, but if the price difference is less than $100, why not get the bigger unit anyhow. Most likely the same form factor, with a few extra power connectors.


----------



## Nicklas0912

Quote:


> Originally Posted by *ChronoBodi*
> 
> I wouldn't call that stable if on 100% fan curve stock cooler.
> 
> +230 core crashes my TXP, and memory over +400 eats away TDP power that can go to core clock to boost higher.
> 
> So i keep mine on +200 core, +204 memory, and 100% fan curve starting at 60C in MSI afterburner,
> 
> it's consistently 1950-2000 mhz at all times.


For benchmarks I don't care what my fan speed is; it's just about getting the best score.

I don't need to OC the GPU for daily use anyway.


----------



## Nicklas0912

Quote:


> Originally Posted by *DNMock*
> 
> If they fixed the drivers finally I'll throw in on it. Last time I checked TXP wasn't functioning correctly due to Nvidia driver issues w/ FAH.
> 
> I think I'm still signed up with team intel actually.
> Fans set to 100%?
> 
> Max clocks won't improve much if any, but if you check you are probably not running that speed consistently as temps go up and the Power Limit wall gets slammed in to.
> 
> Run Valley or Heaven or something on loop for about 20 or 30 minutes and check what your clocks are leveling off at on your current set-up, odds are you will be down in the 1900's once it levels off after a bit.
> 
> Waterblocks should remedy that a fair bit and you will be able to maintain those clocks consistently since Power Limit is tied to temps.
> 
> That's really the thing about TXP, keeping the clocks consistent preventing Boost 3.0 from downclocking you when you hit that 120% mark. Max recorded clocks won't see much, if any improvement though unfortunately.


That is the stable clock under 3DMark.
Max peak is 2075.

But I will get water on this card next week, so I hope I will hit 2100.


----------



## ChronoBodi

MXPOC, get an EVGA SuperNOVA PSU of 850W or more, but avoid the "NEX" first-gen EVGA PSUs. They used a poor OEM for those, but went to Super Flower for their 2nd-generation PSUs, and it's been smooth sailing from there.

Check out JonnyGuru's site if you need more PSU info.


----------



## jsutter71

The power supply makes the difference. Why go low wattage when you're running a system with a $1200 GPU? Personally I wouldn't use anything under 1000 watts.


----------



## MrKenzie

Quote:


> Originally Posted by *MXPOC*
> 
> Hi all, I'm REALLY hoping you guys can help me out.
> 
> I'm not sure that I'm posting in the right place for a bit of technical expertise on the Titan X Pascal. So I recently upgraded from 2X 970s to the mighty Titan X, mainly because 2016 wasn't the best year for SLI support. I was instantly blown away by the sheer performance of the beast but the joy didn't last very long. My PC is restarting during use, but only in very specific situations. The one that seems to trigger it every time is the combined Firestrike test on 3DMark. This test ran perfectly fine with my old cards but seems to be causing a major malfunction with the Titan X Pascal. I have a 750W Corsair PSU which seems to be pulling about 450W at the moment the restart triggers so I seem to have enough overhead. I have also tried running the test at base clock speeds on both the Titan and the CPU but it still cuts out after just a few seconds. Strangely I can run other benchmarks like Heaven and Furmark for 8 to 10 hours straight without issue. I've tried both driver and BIOS updates but nothing seems to help. At this point I'm running out of ideas.
> 
> Intel 5820K @ 4.3 GHz (Corsair H100i)
> 16GB Team Vengeance DDR4
> MSI X99A SLI Plus


I very much doubt it is the PSU causing your issues, considering you were successfully running SLI 970s, which will draw more power than a single Titan. I think it is another problem that may be hard to pinpoint. 3DMark failed to run at all for me for over 2 months! I literally gave up on trying to use it, until one day I tried it again and it worked perfectly! I know that isn't much help to you, but I think you will waste your money buying a bigger PSU. Good PSUs can run at 90% load all day long; the efficiency loss is minimal and nothing to worry about! I run my Titan XP with a Corsair AX760i, no problems at all.


----------



## Artah

Quote:


> Originally Posted by *Nicklas0912*
> 
> For benchmarks I don't care what my fan speed is; it's just about getting the best score.
> 
> I don't need to OC the GPU for daily use anyway.


Are you sure it's not overheating? Maybe crank the fan up a bit using some utilities, or at least keep an eye on it?


----------



## kx11

I like this result a lot:

http://www.3dmark.com/3dm/17541135?

I don't know how some people get the exact clock of the GPU on the results page while I get the base clock every time.


----------



## Nicklas0912

Quote:


> Originally Posted by *Artah*
> 
> are you sure it's not overheating? maybe crank it up a bit using some utilities or at least keep an eye on it?


For things longer than 3DMark, I'm sure it will throttle on the stock cooler!

In the whole of Fire Strike it's 2025/2012 the entire time.
Time Spy GPU test 1: 2050/2012
Time Spy GPU test 2: 2012 -> the 19xx range, because of temps I think?

The TXP is limited by temps? High temps = low volts, or something.

I will get my waterblock, and hope 2100 will be stable.

The cooling system: 2x triple 120mm rads, 2x 140mm rads, 2x D5 pumps.


----------



## DooRules

Quote:


> Originally Posted by *kx11*
> 
> i like this result a lot
> 
> http://www.3dmark.com/3dm/17541135?
> 
> i don't know how some peole get the exact clock of the GPU in the results page while i get the base clock all the time


Go to options in 3d mark when you have it open. Turn on systeminfo hardware monitoring. In my experience this does come at a small cost in score, you may see a small hitch in the run when it does its scan. But you will see the exact clock if that is what you want to see.


----------



## kx11

Quote:


> Originally Posted by *DooRules*
> 
> Go to options in 3d mark when you have it open. Turn on systeminfo hardware monitoring. In my experience this does come at a small cost in score, you may see a small hitch in the run when it does its scan. But you will see the exact clock if that is what you want to see.


Yeah, that option hits the score badly. I'll skip it if that is the only way.


----------



## Seyumi

Quote:


> Originally Posted by *MrKenzie*
> 
> I very much doubt it is the PSU causing your issues considering you were successfully running SLI 970 which will draw more power than a single Titan. I think it is another problem that may be hard to pin-point. 3DMark failed to run at all for me for over 2 months! I literally gave up on trying to use it, until one day I tried it again and it worked perfectly! I know that isn't much help to you but I think you will waste your money buying a bigger PSU. Good PSU's can run at 90% load all day long, the efficiency loss is minimal and nothing to worry about! I run my Titan XP with a Corsair AX760i no problems at all.


It doesn't even matter if his system had 4x 970 SLI and didn't restart. Many times it has nothing to do with the total power draw and more to do with the load on each cable/rail. I had a poor-quality Alienware 1200 watt PSU back in the GTX 480/580 days. It could handle 2x stock GTX 580s no problem at all, yet it would restart under heavy load with just 1 heavily overclocked GTX 580 all by itself. An overclocked Titan X Pascal is obviously drawing more power out of each cable/rail than a 970 would.
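For anyone curious about the per-source numbers: the PCI-E spec budgets roughly 75W for the slot, 75W for a 6-pin, and 150W for an 8-pin connector. A quick sketch of that arithmetic in Python (the function and table are just illustrative, not from any official tool):

```python
# Rough per-source power limits from the PCI-E spec (watts).
PCIE_BUDGET = {"slot": 75, "6-pin": 75, "8-pin": 150}

def connector_headroom(draw_watts, connectors):
    """Compare a card's draw against the spec budget of the slot
    plus its auxiliary power connectors."""
    budget = PCIE_BUDGET["slot"] + sum(PCIE_BUDGET[c] for c in connectors)
    return budget - draw_watts

# Titan X Pascal: 6-pin + 8-pin, 250 W TDP -> 50 W of spec headroom.
print(connector_headroom(250, ["6-pin", "8-pin"]))  # 50

# Pushed to a 120% power limit (300 W), the headroom is gone.
print(connector_headroom(300, ["6-pin", "8-pin"]))  # 0
```

So an overclocked TXP can sit right at the edge of what each source is rated for, even when the PSU's total wattage looks comfortable.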


----------



## xTesla1856

Guys I need your help. I'm paging all the experienced guys in here to read this:

About 2 weeks ago, my rig started behaving very weirdly (sig rig). BSOD's out of the blue, seemingly at random. Codes are "PFN_LIST CORRUPT", "SYSTEM_SERVICE_EXCEPTION". Also, sometimes the Windows 10 recovery would crash as well and give me some fatal sys32 driver error. Funny thing is, I reinstalled Windows 3 days ago, because of these very issues, thinking it was just a botched driver or update. But today, about 2 hours into playing Forza, it started again. "PFN_LIST_CORRUPT" BSOD, a few seconds before that, I'd get stuttering and a Windows10 notification saying "Having problems with audio playback?". Wouldn't reboot, it power cycled itself a few times, then booted into the BIOS, recognizing only 12 out of 16GB of RAM (3 DIMM LED's on my board lit up, not 4). Now at stock settings, it booted into Windows again, and I'm typing this post with no shenanigans happening.

I had a similar issue with my old RVE and 5820K; it turned out to be a defective stick of Trident Z. Now I ask you guys: what should I check, what should I do? I'm at my wits' end here. The rig used to work PERFECTLY up until about 2 weeks ago. Could it be my SSD or a hard drive?

I appreciate any and all help


----------



## CptSpig

Quote:


> Originally Posted by *xTesla1856*
> 
> Guys I need your help. I'm paging all the experienced guys in here to read this:
> 
> About 2 weeks ago, my rig started behaving very weirdly (sig rig). BSOD's out of the blue, seemingly at random. Codes are "PFN_LIST CORRUPT", "SYSTEM_SERVICE_EXCEPTION". Also, sometimes the Windows 10 recovery would crash as well and give me some fatal sys32 driver error. Funny thing is, I reinstalled Windows 3 days ago, because of these very issues, thinking it was just a botched driver or update. But today, about 2 hours into playing Forza, it started again. "PFN_LIST_CORRUPT" BSOD, a few seconds before that, I'd get stuttering and a Windows10 notification saying "Having problems with audio playback?". Wouldn't reboot, it power cycled itself a few times, then booted into the BIOS, recognizing only 12 out of 16GB of RAM (3 DIMM LED's on my board lit up, not 4). Now at stock settings, it booted into Windows again, and I'm typing this post with no shenanigans happening.
> 
> I had a similar issue with my old RVE and 5820K, it turned out it was a defective stick of Trident Z. Now I ask you guys, what should I check, what should I do? I'm at the very end of wits here. The rig used to work PERFECTLY up until about 2 weeks ago. Could it be my SSD or a hard drive?
> 
> I appreciate any and all help


Try these links:

PFN_LIST CORRUPT: https://msdn.microsoft.com/en-us/library/windows/hardware/ff559014(v=vs.85).aspx

SYSTEM_SERVICE_EXCEPTION: https://msdn.microsoft.com/en-us/library/windows/hardware/ff558949(v=vs.85).aspx


----------



## jhowell1030

Quote:


> Originally Posted by *xTesla1856*
> 
> Guys I need your help. I'm paging all the experienced guys in here to read this:
> 
> About 2 weeks ago, my rig started behaving very weirdly (sig rig). BSOD's out of the blue, seemingly at random. Codes are "PFN_LIST CORRUPT", "SYSTEM_SERVICE_EXCEPTION". Also, sometimes the Windows 10 recovery would crash as well and give me some fatal sys32 driver error. Funny thing is, I reinstalled Windows 3 days ago, because of these very issues, thinking it was just a botched driver or update. But today, about 2 hours into playing Forza, it started again. "PFN_LIST_CORRUPT" BSOD, a few seconds before that, I'd get stuttering and a Windows10 notification saying "Having problems with audio playback?". Wouldn't reboot, it power cycled itself a few times, then booted into the BIOS, recognizing only 12 out of 16GB of RAM (3 DIMM LED's on my board lit up, not 4). Now at stock settings, it booted into Windows again, and I'm typing this post with no shenanigans happening.
> 
> I had a similar issue with my old RVE and 5820K, it turned out it was a defective stick of Trident Z. Now I ask you guys, what should I check, what should I do? I'm at the very end of wits here. The rig used to work PERFECTLY up until about 2 weeks ago. Could it be my SSD or a hard drive?
> 
> I appreciate any and all help


First of all, understand the error. The PFN_LIST_CORRUPT BSOD is raised when the page frame number (PFN) list is somehow corrupted. The PFN database is what the memory manager uses to track pages of physical memory, so if it's a hardware issue it's most likely your RAM, though a failing OS disk can corrupt things on the way through as well.

If it were me, I'd run a disk check on the OS disk first, then try to rule out the RAM with MemTest. Seeing that your BIOS only recognized 3 of the 4 DIMMs as you described, I'd 1. make sure the BIOS is up to date and 2. assuming chkdsk passes on the OS drive, start by swapping the RAM first.

Any programs installed during that time period? Sometimes if one or more programs try to access a needed system file at the same time, this can happen.


----------



## bizplan

Quote:


> Originally Posted by *xTesla1856*
> 
> Guys I need your help. I'm paging all the experienced guys in here to read this:
> 
> About 2 weeks ago, my rig started behaving very weirdly (sig rig). BSOD's out of the blue, seemingly at random. Codes are "PFN_LIST CORRUPT", "SYSTEM_SERVICE_EXCEPTION". Also, sometimes the Windows 10 recovery would crash as well and give me some fatal sys32 driver error. Funny thing is, I reinstalled Windows 3 days ago, because of these very issues, thinking it was just a botched driver or update. But today, about 2 hours into playing Forza, it started again. "PFN_LIST_CORRUPT" BSOD, a few seconds before that, I'd get stuttering and a Windows10 notification saying "Having problems with audio playback?". Wouldn't reboot, it power cycled itself a few times, then booted into the BIOS, recognizing only 12 out of 16GB of RAM (3 DIMM LED's on my board lit up, not 4). Now at stock settings, it booted into Windows again, and I'm typing this post with no shenanigans happening.
> 
> I had a similar issue with my old RVE and 5820K, it turned out it was a defective stick of Trident Z. Now I ask you guys, what should I check, what should I do? I'm at the very end of wits here. The rig used to work PERFECTLY up until about 2 weeks ago. Could it be my SSD or a hard drive?
> 
> I appreciate any and all help


Sounds like an intermittent hardware issue: 1) sound card/sound driver or 2) bad memory stick. Good luck!


----------



## Sheyster

Quote:


> Originally Posted by *MXPOC*
> 
> Thanks for the advice everyone. Do you have any recommendations for a new power supply? It looks like I'm going to have to bite the bullet and go for it. My PSU is definitely the weak link in my PC right now.


If I was buying a PSU today, I'd buy this one:

http://www.evga.com/products/product.aspx?pn=220-G3-1000-X1

http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story6&reid=494

9.8 score on JG.


----------



## Enapace

Quote:


> Originally Posted by *Sheyster*
> 
> If I was buying a PSU today, I'd buy this one:
> 
> http://www.evga.com/products/product.aspx?pn=220-G3-1000-X1
> 
> http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story6&reid=494
> 
> 9.8 score on JG.


I've just bought one of those. Waited ages for them to come to the UK; I nearly bought the T2 instead, it took that long lol.


----------



## CptSpig

Quote:


> Originally Posted by *xTesla1856*
> 
> Guys I need your help. I'm paging all the experienced guys in here to read this:
> 
> About 2 weeks ago, my rig started behaving very weirdly (sig rig). BSOD's out of the blue, seemingly at random. Codes are "PFN_LIST CORRUPT", "SYSTEM_SERVICE_EXCEPTION". Also, sometimes the Windows 10 recovery would crash as well and give me some fatal sys32 driver error. Funny thing is, I reinstalled Windows 3 days ago, because of these very issues, thinking it was just a botched driver or update. But today, about 2 hours into playing Forza, it started again. "PFN_LIST_CORRUPT" BSOD, a few seconds before that, I'd get stuttering and a Windows10 notification saying "Having problems with audio playback?". Wouldn't reboot, it power cycled itself a few times, then booted into the BIOS, recognizing only 12 out of 16GB of RAM (3 DIMM LED's on my board lit up, not 4). Now at stock settings, it booted into Windows again, and I'm typing this post with no shenanigans happening.
> 
> I had a similar issue with my old RVE and 5820K, it turned out it was a defective stick of Trident Z. Now I ask you guys, what should I check, what should I do? I'm at the very end of wits here. The rig used to work PERFECTLY up until about 2 weeks ago. Could it be my SSD or a hard drive?
> 
> I appreciate any and all help


I would recommend using the Windows Media Creation tool to put a fresh copy of 10 on your machine. Then you can format your drive before install and load fresh board drivers. This should fix your issues. If it does not, start hardware testing: begin with the minimum hardware needed to boot, then add each component back until you find the problem. Good luck.


----------



## kx11

8.7GB VRAM used in RE7, maxed out @ 4K



http://imgur.com/7ux3Yes


----------



## MXPOC

Quote:


> Originally Posted by *Enapace*
> 
> I've just brought one of those waited ages for them come to UK nearly brought the T2 it took that long lol.


Thanks, I just read the review. I'll be ordering one of these today.


----------



## Dagamus NM

Quote:


> Originally Posted by *kx11*
> 
> 8.7gb vram used in RE7 , maxed @ 4k
> 
> 
> 
> http://imgur.com/7ux3Yes


Interesting. What are you using to generate that overlay to determine this usage?


----------



## Sh3perd

I think I can finally join this club.

Last piece of the puzzle:



I have to work late this week, but hopefully I'll have the build done by the end of it.


----------



## CptSpig

Quote:


> Originally Posted by *Sh3perd*
> 
> I think I can finally join this club.
> 
> Last piece of the puzzle:
> 
> 
> 
> I have to work late this week, but hopefully I'll have the build done by the end of it.


----------



## jhowell1030

Quote:


> Originally Posted by *Dagamus NM*
> 
> Interesting. What are you using to generate that overlay to determine this usage?


RivaTuner maybe?


----------



## jsutter71

Upgraded score for 3DMARK firestrike 1.1
http://www.3dmark.com/fs/11512998


----------



## EniGma1987

Quote:


> Originally Posted by *Dagamus NM*
> 
> Interesting. What are you using to generate that overlay to determine this usage?


That is what you get from Afterburner/RivaTuner


----------



## jsutter71

*This applies to anyone who is running multi monitors or thinking about it. Running TXP's in SLI mode.*

My previous configuration was 980Ti in triple SLI mode driving 4 monitors: 1 at 4096x2160 and the other 3 at 2560x1440, all under water. Driving 4 monitors in that configuration gave me nothing but headaches. Half the time during boot-up the resolution was off, and boot times were slow. It got to the point where I dropped down to 3 monitors; after I did that my issues were resolved and it was business as usual.

Fast forward to 3 months ago, when I upgraded my GPUs and CPU to my current configuration. I forgot to mention that I also upgraded my CPU from a 5930K to a 6950X. I was still running the same 3 monitors I did with the 3 980Tis with no issues. So today I decided to add the 4th monitor to see how the TXPs would perform, again all connected to the first card, which is running in SLI. First thing I tried was power cycling the system to see if it would show the proper resolution from a cold start. No issues. I did this a few times just to make sure. My experience with multi-monitor systems on NVIDIA drivers is that you have good days and bad days. Regardless, I saw a very noticeable improvement in stability. Next up were the benchmarks. Again, no issues there.
http://www.3dmark.com/fs/11512998

So the takeaway from this is that it looks like Nvidia has made some remarkable improvements in regard to multi-monitor systems under SLI.


----------



## CptSpig

Quote:


> Originally Posted by *jsutter71*
> 
> Upgraded score for 3DMARK firestrike 1.1
> http://www.3dmark.com/fs/11512998


----------



## Nicklas0912

Now I got watercooling on my Titan XP









Idle 28C, max load 37C with overclock (quick test): 2078MHz with no throttle.


----------



## MrKenzie

I did a Firestrike ultra run with my final overclock which has proven to be very stable for gaming!
+226 core
+800 memory (no, it didn't lower my scores or maximum boost clock surprisingly; +900 did!)

My graphics score is just below steponz in 7th spot, but my CPU is a stock clock i7 4790K so I'm way down the list!
http://www.3dmark.com/fs/11489082

Quote:


> Originally Posted by *Nicklas0912*
> 
> Now I got watercooling on my Titan XP
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Idle 28C, max load 37C with overclock (quick test): 2078MHz with no throttle.


It will throttle in certain games. My Titan runs at 25C maximum and still throttles by about 100MHz in some of them.


----------



## Nicklas0912

Quote:


> Originally Posted by *MrKenzie*
> 
> I did a Firestrike ultra run with my final overclock which has proven to be very stable for gaming!
> +226 core
> +800 memory (no, it didn't lower my scores or maximum boost clock surprisingly; +900 did!)
> 
> My graphics score is just below steponz in 7th spot, but my CPU is a stock clock i7 4790K so I'm way down the list!
> http://www.3dmark.com/fs/11489082
> It will throttle in certain games. My Titan runs at 25C maximum and still throttles by about 100MHz in some of them.


How can that be?

What's your OC?


----------



## MrKenzie

Quote:


> Originally Posted by *Nicklas0912*
> 
> How can that be?
> 
> What's your OC?


My OC is +226 / +800 as above.

I run an aquarium cooler so temps are pretty good. I think Steponz has a bigger cooler, so his card doesn't even go over 10C! My boost clock is 2135MHz or so, but it mostly runs between 2050 and 2124MHz because of the power limit!


----------



## Nicklas0912

Quote:


> Originally Posted by *MrKenzie*
> 
> My OC is +226 / +800 as above.
> 
> I run an aquarium cooler so temps are pretty good. I think Steponz has a bigger cooler, so his card doesn't even go over 10C! My boost clock is 2135MHz or so, but it mostly runs between 2050 and 2124MHz because of the power limit!


Damn, not bad. My card will only max at 2088MHz, but it never goes under 2050MHz. Can you post your Time Spy score?


----------



## MrKenzie

Quote:


> Originally Posted by *Nicklas0912*
> 
> Damn, not bad. My card will only max at 2088MHz, but it never goes under 2050MHz. Can you post your Time Spy score?


http://www.3dmark.com/spy/1119010
As requested, once again GPU score is good, CPU score is trash!


----------



## unreality

While you guys are breaking overclocking records, I've been downclocking and undervolting my card for weeks, because it's just too powerful for all of my games (WoW, Diablo 3 @ 1440p/144Hz).


----------



## pez

Quote:


> Originally Posted by *unreality*
> 
> While you guys are breaking overclocking records, I've been downclocking and undervolting my card for weeks, because it's just too powerful for all of my games (WoW, Diablo 3 @ 1440p/144Hz).


Heh, is this so that you can actually see 144hz/fps in those games?


----------



## CptSpig

Quote:


> Originally Posted by *MrKenzie*
> 
> I did a Firestrike ultra run with my final overclock which has proven to be very stable for gaming!
> +226 core
> +800 memory (no it didn't lower my scores or maximum boost clock susprisingly, +900 did!)
> 
> My graphics score is just below steponz in 7th spot, but my CPU is a stock clock i7 4790K so I'm way down the list!
> http://www.3dmark.com/fs/11489082
> It will throttle with certain games. My Titan runs at 25c maximum and throttles by about 100MHz in certain games.


Fire Strike does not like a high memory overclock. I did 227 on the core and 350 on memory.
http://s1164.photobucket.com/user/CptSpig/media/Untitled_zpsopscaot9.png.html


----------



## DooRules

I get my best scores in Firestrike with memory in the +700 to +800 range, and even higher in Timespy. I think it simply comes down to the individual GPU and how it handles the higher memory OC. Same for SLI runs as well.


----------



## jhowell1030

Quote:


> Originally Posted by *unreality*
> 
> While you guys are breaking overclocking records, I've been downclocking and undervolting my card for weeks, because it's just too powerful for all of my games (WoW, Diablo 3 @ 1440p/144Hz).


Not me. Watching all of these guys with their beautiful overclocks while my card can't go past +200 on the core.


----------



## unreality

Quote:


> Originally Posted by *pez*
> 
> Heh, is this so that you can actually see 144hz/fps in those games?


I don't really get the logic behind this, but if your question is whether you can see the difference from 60Hz, then the answer is HELL YES! I still reach max fps with a downclocked TX.

Quote:


> Originally Posted by *jhowell1030*
> 
> Not me. Watching all of these guys with their beautiful overclocks while my card can't go past +200 on the core.


I'm running my card at 1480MHz @ 0.8V. With watercooling the card doesn't really change temperature.
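For anyone curious why such an aggressive undervolt barely warms the loop, here is a rough first-order sketch: dynamic power scales roughly with frequency times voltage squared. The 2000MHz / 1.062V baseline below is an illustrative stock operating point, not a measured value.

```python
# Rough dynamic-power estimate for an undervolt, assuming P scales with f * V^2.
# The 2000 MHz / 1.062 V baseline is illustrative, not a measured stock point.

def relative_power(freq_mhz, volts, base_freq_mhz=2000.0, base_volts=1.062):
    """Return dynamic power as a fraction of the baseline operating point."""
    return (freq_mhz / base_freq_mhz) * (volts / base_volts) ** 2

ratio = relative_power(1480, 0.8)
print(f"1480 MHz @ 0.8 V draws ~{ratio:.0%} of baseline dynamic power")  # roughly 42%
```

By this back-of-envelope estimate the card sheds more than half its dynamic power, which fits a water-cooled card that barely changes temperature under load.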


----------



## jhowell1030

Quote:


> Originally Posted by *unreality*
> 
> HELL YES! I still reach max fps with a downclocked TX
> I'm running my card at 1480MHz @ 0.8V. With watercooling the card doesn't really change temperature.


I wish I could get away with doing that. I'm trying to chase a constant 60+ FPS (I'd prefer 100 FPS) @ 3440x1440. Not that my system gets too hot, since I have everything on a full loop now (if I ever get time I'll undress her a bit and take some pictures), but I might have to play around a little bit.


----------



## Gunslinger.

You guys all running with a stock bios or has someone come up with a tweaked version?


----------



## xTesla1856

Quote:


> Originally Posted by *Gunslinger.*
> 
> You guys all running with a stock bios or has someone come up with a tweaked version?


Yes and no.


----------



## arrow0309

Quote:


> Originally Posted by *xTesla1856*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gunslinger.*
> 
> You guys all running with a stock bios or has someone come up with a tweaked version?
> 
> 
> 
> Yes and no.

What do you mean?


----------



## DooRules

Stock bios. Nothing has changed on that front. Doesn't look like it will at this point.


----------



## CptSpig

Quote:


> Originally Posted by *DooRules*
> 
> Get my best scores in Firestrike with memory at +700 to +800 range. Even higher in Timespy. I think it simply comes down to the individual gpu and how they can handle the higher memory o/c. Same for sli runs as well.


Not true. Each benchmark looks at hardware differently. It's best to read the notes for each of the benchmarks; they are really helpful. Yes, SLI is a totally different ball game. Having a processor with 40 lanes makes a difference. Cooling makes the biggest difference: the colder the hardware, the higher the overclocks.


----------



## jhowell1030

Quote:


> Originally Posted by *CptSpig*
> 
> The colder the hardware, the higher the overclocks.


That's only true to a point. I'm nowhere near hitting the thermal limit. I'm on water and I can't push my card past +200MHz on the core.


----------



## CptSpig

Quote:


> Originally Posted by *Gunslinger.*
> 
> You guys all running with a stock bios or has someone come up with a tweaked version?


Not that I know of. If you find one, let us know.


----------



## DooRules

Quote:


> Originally Posted by *CptSpig*
> 
> Not true. Each benchmark looks at hardware differently. It's best to read the notes for each of the benchmarks; they are really helpful. Yes, SLI is a totally different ball game. Having a processor with 40 lanes makes a difference. Cooling makes the biggest difference: the colder the hardware, the higher the overclocks.


Even though each benchmark looks at hardware differently, you are still inherently bound by the limitations of the GPU in question. If this were not the case, everyone with basically the same hardware could attain the same scores, but as you know, this does not happen.

Simply getting the GPU colder does not allow for much higher clocks, just more consistent runs. Or at least that has been my experience.


----------



## CptSpig

Quote:


> Originally Posted by *jhowell1030*
> 
> That's only true to a point. I'm nowhere near hitting the thermal limit. I'm on water and I can't push my card past +200MHz on the core.


This is very true! All of the world records are done on LN2, which freezes the cores. If you could freeze your hardware you would definitely see higher clocks. Take a look at the hall of fame for any Futuremark score.


----------



## jhowell1030

Quote:


> Originally Posted by *CptSpig*
> 
> This is very true! All of the world records are done on LN2, which freezes the cores. If you could freeze your hardware you would definitely see higher clocks. Take a look at the hall of fame for any Futuremark score.


Once again... only true to a point. Eventually you hit a wall with the silicon. That's why they call it the silicon lottery. You have folks like Vince Lucido who are given cards that have been binned to ensure maximum overclockability.

My card proves that point. The hottest it gets in Firestrike is only 40C. Still plenty of room before it'd hit any sort of thermal limit. Anything past +201MHz on the core clock, though, and it crashes. Now, I'm not an overclocking expert like him, nor am I using any sort of exotic cooling... but I'm obviously not limited by thermals at that temperature.


----------



## jhowell1030

Quote:


> Originally Posted by *DooRules*
> 
> Even though each benchmark looks at hardware differently, you are still inherently bound by the limitations of the GPU in question. If this were not the case, everyone with basically the same hardware could attain the same scores, but as you know, this does not happen.
> 
> Simply getting the GPU colder does not allow for much higher clocks, just more consistent runs. Or at least that has been my experience.


I did not see this post before I wrote my previous one. This is very well explained.


----------



## Dagamus NM

Finally got my quad TXP rig running. Got the OS installed last night, will install nvidia software and get to tweaking it this evening.


----------



## jsutter71

Quote:


> Originally Posted by *Dagamus NM*
> 
> Finally got my quad TXP rig running. Got the OS installed last night, will install nvidia software and get to tweaking it this evening.


*Quad TXP.* My wife would divorce me. When she found out that I had bought 2 of them, the fight that ensued lasted for 2 days. I had to make it up by getting her a new wedding band set for Christmas. She told me not to come home with anything smaller than 2 carats. That cost me $6000. Of course my son had to get a PS4 Pro with VR to replace the PS4 he got last Christmas.

Is the rest of your hardware the same as what's listed in your signature? Previously I was running 3 980Tis in SLI, which is what dictated my choice in motherboards. I wanted a board that supported quad x16 PCIe. I understand that the difference from x8 is minimal at best, but most people who can afford and purchase a quad-TXP system are the type who want the best in performance. The same goes for your CPU.


----------



## Dagamus NM

Quote:


> Originally Posted by *jsutter71*
> 
> *Quad TXP.* My wife would divorce me. When she found out that I had bought 2 of them, the fight that ensued lasted for 2 days. I had to make it up by getting her a new wedding band set for Christmas. She told me not to come home with anything smaller than 2 carats. That cost me $6000. Of course my son had to get a PS4 Pro with VR to replace the PS4 he got last Christmas.
> 
> Is the rest of your hardware the same as what's listed in your signature? Previously I was running 3 980Tis in SLI, which is what dictated my choice in motherboards. I wanted a board that supported quad x16 PCIe. I understand that the difference from x8 is minimal at best, but most people who can afford and purchase a quad-TXP system are the type who want the best in performance. The same goes for your CPU.


Well, I got divorced last year so the sky is the limit now that I am free of nagging.

For quad x16 PCIe don't you need an onboard PLX chip or a multi socket server board?

I agonized over the decision between the Rampage V Extreme and the X99-WS as that had the PLX chips for many PCIe lanes. Ultimately, I took a pair of RVEs. I still think about the x99-WS, such a nice looking board.

The others are all current rigs. I am on the quad 980ti machine at the moment at the physics office waiting on a matlab install to finish, listening to music and typing this.


----------



## jsutter71

Quote:


> Originally Posted by *Dagamus NM*
> 
> Well, I got divorced last year so the sky is the limit now that I am free of nagging.
> 
> For quad x16 PCIe don't you need an onboard PLX chip or a multi socket server board?
> 
> I agonized over the decision between the Rampage V Extreme and the X99-WS as that had the PLX chips for many PCIe lanes. Ultimately, I took a pair of RVEs. I still think about the x99-WS, such a nice looking board.
> 
> The others are all current rigs. I am on the quad 980ti machine at the moment at the physics office waiting on a matlab install to finish, listening to music and typing this.


Yes, I'm running an X99-E WS USB 3.1. I've had 4 of them, and aside from one that I had to RMA after 7 months of non-use, I've had no real issues. The one I RMA'd had a 2-week turnaround from the time I sent it to Asus to the time I received a new replacement (and it was a brand new board they sent back), contrary to the horror stories regarding Asus's RMA department. When I commented to Asus about their RMA department's poor reputation, they told me that server/workstation boards get special treatment.


----------



## Dagamus NM

Quote:


> Originally Posted by *jsutter71*
> 
> Yes, I'm running an X99-E WS USB 3.1. I've had 4 of them, and aside from one that I had to RMA after 7 months of non-use, I've had no real issues. The one I RMA'd had a 2-week turnaround from the time I sent it in to the time I received a new replacement (and it was a brand new board they sent back), contrary to the horror stories regarding Asus's RMA department.


Nice. I pretty much only use Asus motherboards. I have not cared much for the motherboards I have had from ASRock and Gigabyte. Even the UD7 was a letdown.

As some of these builds get further out of date I will need to pick up some LN2 pots. I have a good sized jug of LN2 here at the office. Using it for cooling the high purity germanium detector. Might as well find some other utility for it.

After learning the ropes of old systems I might work up the nerve to try out some new ones.

Before that, I just want to see how this chiller cools this quad titan rig. My dual titan rig stays plenty cool just on water. But it is winter here so we will see what next summer brings.


----------



## MrKenzie

Quote:


> Originally Posted by *CptSpig*
> 
> Fire Strike does not like a high memory overclock. I did 227 on the core and 350 on memory.
> 
> 
> 
> 
> 
> 
> http://s1164.photobucket.com/user/CptSpig/media/Untitled_zpsopscaot9.png.html


Actually I kept upping my memory overclock until my score stopped increasing or went backwards. +800 memory gave 120 more points than +500 for instance.

+230 core clock gave worse results than +220, and even though I can benchmark at +250 core the score is worse than +220.


----------



## MrKenzie

Quote:


> Originally Posted by *jhowell1030*
> 
> Once again... only true to a point. Eventually you hit a wall with the silicon. That's why they call it the silicon lottery. You have folks like Vince Lucido who are given cards that have been binned to ensure maximum overclockability.
> 
> My card proves that point. The hottest it gets in Firestrike is only 40C. Still plenty of room before it'd hit any sort of thermal limit. Anything past +201MHz on the core clock, though, and it crashes. Now, I'm not an overclocking expert like him, nor am I using any sort of exotic cooling... but I'm obviously not limited by thermals at that temperature.


If you are hitting 40C then you are already throttling due to temperature! Pascal Titans throttle at least 2-3 times before the temps reach 40C! I have tested this by letting mine warm up from 5C all the way to 50C and watching the downclocks happen.

With that said, it's not the reason why you can't go above +200 on the core. You just didn't get a high overclocker as you have said.
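The step-down behaviour described above can be sketched as a toy model: GPU Boost 3.0 sheds the clock one bin (roughly 13MHz) at a time as the core crosses temperature thresholds, long before the 84C limit. The threshold temperatures in the sketch are assumptions for illustration, not NVIDIA-documented values.

```python
# Toy model of Pascal GPU Boost thermal step-downs: one ~13 MHz bin is shed
# each time the core crosses a temperature threshold, long before the 84C
# limit. The thresholds below are assumed for illustration only.

STEP_MHZ = 13
THRESHOLDS_C = [35, 45, 54, 63, 72]  # assumed step-down points

def boost_clock(cold_clock_mhz, temp_c):
    """Boost clock after thermal step-downs at the given core temperature."""
    steps = sum(1 for t in THRESHOLDS_C if temp_c >= t)
    return cold_clock_mhz - steps * STEP_MHZ

print(boost_clock(2114, 25))  # 2114 -> no bins lost at 25C
print(boost_clock(2114, 40))  # 2101 -> one bin lost by 40C
```

The point of the model is only that a card reading 40C has already lost a bin or two relative to its cold-start clock, even though it is nowhere near the thermal limit.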


----------



## ChronoBodi

Quote:


> Originally Posted by *jsutter71*
> 
> *Quad TXP.* My wife would divorce me. When she found out that I had bought 2 of them the fight that ensued lasted for 2 days. I had to make it up by getting her a new wedding band set for Christmas. She told me to not come home with anything smaller then 2 carats. That cost me $6000. Of course my son had to get a PS4 pro with VR to replace the PS4 he got last Christmas.
> 
> Is the rest of your hardware the same as what's listed in your signature? Previously I was running 3 980Ti's in SLI which is what dictated my choice in motherboards. I wanted a board that supported quad X16 PCIE. I understand that the difference with X8 is minimal at best, but most people who can afford, and purchase a quad TXP system are the type of people who want the best in performance. The same with your CPU.


Everybody's got their thing to spend on.

For you it's TXPs; I guess for your wife it's jewelry.

They're both flashy, either by the LEDs of the logo on the TXP or the glitter of the band.

Although... at least a TXP can render stuff fast and is good for deep learning from what I heard, whereas jewelry... is a status symbol?

Sorry if this is off-topic.

On-topic: can anyone tell whether memory overclocks in excess of +400 impact performance more due to the limited power budget than anything else? Just curious if it's worth OCing my memory up from +204 for the nice even number of 500 GB/s.

Anyway, obligatory Titan XP shots in my rig. It should be white for my mobo scheme though, lol.
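For the 500 GB/s arithmetic: assuming the offset adds to the roughly 5005MHz memory clock Afterburner reports, the effective GDDR5X data rate is double that figure, and the Titan XP's bus is 384 bits wide, the bandwidth works out like this.

```python
# Back-of-envelope GDDR5X bandwidth for a given Afterburner memory offset.
# Assumes the offset adds to the 5005 MHz clock Afterburner reports, the
# effective data rate is double that, and the Titan XP's 384-bit bus.

BUS_BITS = 384
BASE_CLOCK_MHZ = 5005

def bandwidth_gbps(offset_mhz):
    """Theoretical memory bandwidth in GB/s for a memory clock offset in MHz."""
    effective_mts = (BASE_CLOCK_MHZ + offset_mhz) * 2   # mega-transfers/s
    return effective_mts * BUS_BITS / 8 / 1000          # bits -> bytes -> GB/s

print(f"stock: {bandwidth_gbps(0):.0f} GB/s")    # ~480 GB/s
print(f"+204:  {bandwidth_gbps(204):.0f} GB/s")  # ~500 GB/s
```

That is where the "nice even number" comes from: +204 on the memory offset is almost exactly the point where theoretical bandwidth crosses 500 GB/s.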


----------



## MrKenzie

Quote:


> Originally Posted by *ChronoBodi*
> 
> Everybody's got their thing to spend on.
> 
> For you it's TXPs; I guess for your wife it's jewelry.
> 
> They're both flashy, either by the LEDs of the logo on the TXP or the glitter of the band.
> 
> Although... at least a TXP can render stuff fast and is good for deep learning from what I heard, whereas jewelry... is a status symbol?
> 
> Sorry if this is off-topic.
> 
> On-topic: can anyone tell whether memory overclocks in excess of +400 impact performance more due to the limited power budget than anything else? Just curious if it's worth OCing my memory up from +204 for the nice even number of 500 GB/s.
> 
> Anyway, obligatory Titan XP shots in my rig. It should be white for my mobo scheme though, lol.
> 
> 
> 


I can't speak for everyone, but my Titan kept increasing in benchmark scores by increasing memory clock all the way up to +800.

If I kept memory at +500 and increased core clock to +240 I got lower scores than +800 memory and +220 core clock.

You can't test this yourself?


----------



## ChronoBodi

Quote:


> Originally Posted by *MrKenzie*
> 
> 
> 
> I can't speak for everyone, but my Titan kept increasing in benchmark scores by increasing memory clock all the way up to +800.
> 
> If I kept memory at +500 and increased core clock to +240 I got lower scores than +800 memory and +220 core clock.
> 
> You can't test this yourself?


Well, which benchmark really makes sure that the memory OC is stable?

I've used the Unigine Valley bench, but something tells me that some OCs are stable in Valley but not in some games.


----------



## MXPOC

Quote:


> Originally Posted by *Sheyster*
> 
> If I was buying a PSU today, I'd buy this one:
> 
> http://www.evga.com/products/product.aspx?pn=220-G3-1000-X1
> 
> http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story6&reid=494
> 
> 9.8 score on JG.


Thanks for the recommendation. I bought one and it's working like a charm and my system is super stable now.

Thanks to everyone else who suggested that my PSU was causing the issue. Very happy with the results!


----------



## CptSpig

Quote:


> Originally Posted by *MrKenzie*
> 
> 
> 
> Actually I kept upping my memory overclock until my score stopped increasing or went backwards. +800 memory gave 120 more points than +500 for instance.
> 
> +230 core clock gave worse results than +220, and even though I can benchmark at +250 core the score is worse than +220.


You can keep on doing whatever makes you happy. I will just enjoy my number one spot as the only person to break 24,000 on Fire Strike with a 5930K and one Titan XP. Oh, and by the way, I'm number three on 3DMark 11 with 120 on the core and 550 on memory (it loves a memory OC), and number three on Time Spy with 230 on the core and 425 on the memory. One more note: your CPU overclock impacts all your benchmarks as well.








http://s1164.photobucket.com/user/CptSpig/media/Untitled_zpsopscaot9.png.html


----------



## ChronoBodi

I'll ask again: what is one benchmark that will make sure of a stable memory OC?


----------



## Leyaena

I quite like Furmark for weeding out unstable memory clocks.
Be warned though, Furmark can draw an insane amount of power, much more than you'd see in day to day use.


----------



## jhowell1030

Quote:


> Originally Posted by *Leyaena*
> 
> I quite like Furmark for weeding out unstable memory clocks.
> Be warned though, Furmark can draw an insane amount of power, much more than you'd see in day to day use.


YES!


----------



## pez

Quote:


> Originally Posted by *unreality*
> 
> I dont really get your logic behind this, but if your question is if you can see the difference to 60 Hz thenn the answer is HELL YES! I still reach max fps with downclocked TX
> Im running my card at 1480 MHz @ 0.8 Volts. With watercooling the card doesnt really change temperature


I was actually making a joke about how cards like the Titan, and even the 1080 and 1070 in some situations, don't hit full utilization at their stock clocks.

I had tried something like this before to force my Titan to use more GPU resources in some older titles, with mixed results. I have no complaints about high refresh rate panels. I personally love mine.


----------



## MrKenzie

Quote:


> Originally Posted by *CptSpig*
> 
> You can keep on doing whatever makes you happy. I will just enjoy my number one spot as the only person to break 24,000 on Fire Strike with a 5930K and one Titan XP. Oh, and by the way, I'm number three on 3DMark 11 with 120 on the core and 550 on memory (it loves a memory OC), and number three on Time Spy with 230 on the core and 425 on the memory. One more note: your CPU overclock impacts all your benchmarks as well.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://s1164.photobucket.com/user/CptSpig/media/Untitled_zpsopscaot9.png.html


What graphics score do you get on Firestrike Ultra? I don't use Firestrike since my CPU holds the scores back at low resolutions. I know overclocking it would help performance, but I can't stand crashes while gaming, and that's what I get when I overclock this CPU even just a little. It's a heap of crap, this one, but I won't spend money on a CPU upgrade when it barely increases gaming performance.


----------



## Jpmboy

Quote:


> Originally Posted by *ChronoBodi*
> 
> I'll ask again: what is one benchmark that will make sure of a stable memory OC?


There really is none, since running a benchmark only demonstrates stability for that specific benchmark. Games load the memory bus completely differently; the only benchmarks that come close to gaming are the Unigine series, IMO. Run/loop these at the max resolution you can (or downsample) with settings maxed out (e.g., run them not as you would while benchmarking).








Quote:


> Originally Posted by *Leyaena*
> 
> I quite like Furmark for weeding out unstable memory clocks.
> Be warned though, Furmark can draw an insane amount of power, much more than you'd see in day to day use.


It's a power virus. NV and all 3rd-party vendors have defined a specific TDP in the BIOS as "virus mode" for this reason. A power virus has nothing to do with memory stability.


----------



## MunneY

Well....

I might be leaving you guys. I still have my XP, but I just bought a 1080 for $450 with a waterblock... I'm doing a new build and don't want to go to 2 TXPs in SLI for $ reasons.


----------



## Pirazy

Sorry if this has been asked already but the thread is 600 pages long.









Has anyone here done any actual testing to see if Titans in SLI will saturate the lanes on a z170/z270 board? Been googling for hours and the closest I could find was an article on techpowerup where they tested a single 1080 in various lane speeds which didn't see any performance hit going from 3.0 x16 to x8. There's been no tests on SLI and pci-e scaling on the pascal architecture at all from what I can see.

Well, there was that test over at Puget Systems, but the results from that were so weird they must have done something wrong; you shouldn't get less performance from more bandwidth, not to mention their lackluster choice of real-world gaming benchmarks. Plus it was an X99 platform where they gimped the lanes in the BIOS.


----------



## CptSpig

Quote:


> Originally Posted by *MrKenzie*
> 
> 
> 
> What graphics score do you get on Firestrike Ultra? I don't use Firestrike since my CPU holds the scores back at low resolutions. I know overclocking it would help performance, but I can't stand crashes while gaming, and that's what I get when I overclock this CPU even just a little. It's a heap of crap, this one, but I won't spend money on a CPU upgrade when it barely increases gaming performance.


I have not run Fire Strike Ultra with this machine. If I get a chance this weekend I will run it and let you know. What CPU are you running? I OC my 5930K at 4.5 to 4.7 depending on the benchmark I am running.


----------



## MrKenzie

Quote:


> Originally Posted by *CptSpig*
> 
> I have not run Fire Strike Ultra with this machine. If I get a chance this weekend I will run it and let you know. What CPU are you running? I OC my 5930K at 4.5 to 4.7 depending on the benchmark I am running.


I am running a 4790K at 4.4, which is stock; all I have done is make the 3rd and 4th cores run at 4.4, up from 4.0 (from memory).

I have done a few hours of testing today, and at 4K resolution, running the memory at +800 benefits me more than running the core higher than +220.

The following are games that I tested this on;
*Divinity: Original Sin 2* (3% increase)
*Far Cry 4* (6% increase)
*Project CARS* (1% increase)
*Shadow Warrior 2* (4% increase)
*For Honor* (5% increase)

The increases I have shown are fps increases going from +400 memory to +800 memory that I measured at specific scenes in-game. Although not scientific, they were repeatable time and time again.

I was surprised to see 2114MHz core / 11,600 (+800) memory to consistently outperform 2139MHz core / 10,800 (+400) memory.
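The percentages above are straightforward to reproduce from two repeatable fps readings at the same in-game scene; a minimal helper follows (the sample numbers are illustrative, not MrKenzie's actual measurements).

```python
# Percentage fps gain between two memory-offset runs measured at the same
# in-game scene. The sample numbers are illustrative only.

def pct_gain(fps_before, fps_after):
    """Percentage fps change going from one setting to another."""
    return (fps_after - fps_before) / fps_before * 100

print(f"{pct_gain(50.0, 53.0):.0f}%")  # 6% -- a Far Cry 4-sized gain
```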


----------



## ESRCJ

I decided to try to surpass my TimeSpy score with my current system. Success!



http://www.3dmark.com/spy/1128212

+250 on the core clock
+400 on the memory clock

I didn't try pushing it further, but I might be able to add a little more to each without issue.


----------



## ChronoBodi

quick question, to satisfy my curiosity:

What's with the unused extra 8-pin left on the PCB, and there seems to be 2 or 3 markings on the PCB itself that looks like it once held a capacitor or a MOSFET or a chip of sorts. What exactly are those missing pieces?


----------



## axiumone

Quadro cards.


----------



## CptSpig

Quote:


> Originally Posted by *MrKenzie*
> 
> I am running a 4790K at 4.4, which is stock; all I have done is make the 3rd and 4th cores run at 4.4, up from 4.0 (from memory).
> 
> I have done a few hours of testing today, and at 4K resolution, running the memory at +800 benefits me more than running the core higher than +220.
> 
> The following are games that I tested this on;
> *Divinity: Original sin 2* (3% increase)
> *Far Cry 4* (6% increase)
> *Project cars* (1% increase)
> *Shadow Warrior 2* (4% increase)
> *For Honor* (5% increase)
> 
> The increases I have shown are fps increases going from +400 memory to +800 memory that I measured at specific scenes in-game. Although not scientific, they were repeatable time and time again.
> 
> I was surprised to see 2114MHz core / 11,600 (+800) memory to consistently outperform 2139MHz core / 10,800 (+400) memory.


Games are totally different from benchmarks. This is why, if you are trying to get a stable OC for games, it's best to test stability in the games you play most. I also mentioned in a prior post that not all benchmarks are the same: some like more memory OC and some like more core OC. You just have to be patient when testing.


----------



## CptSpig

Quote:


> Originally Posted by *MrKenzie*
> 
> 
> 
> What graphics score do you get on Firestrike Ultra? I don't use Firestrike since my CPU holds the scores back at low resolutions. I know overclocking it would help performance, but I can't stand crashes while gaming, and that's what I get when I overclock this CPU even just a little. It's a heap of crap, this one, but I won't spend money on a CPU upgrade when it barely increases gaming performance.


I did one run with 227 on the core and 350 on the memory. CPU OC 4.7 GHz. Top ten without tweaking and multiple runs. http://www.3dmark.com/fs/11550696

http://s1164.photobucket.com/user/CptSpig/media/Untitled4_zpshlqt84en.png.html


----------



## Jpmboy

Quote:


> Originally Posted by *MunneY*
> 
> Well....
> I might be leaving you guys. I still have my XP, but I just bought a 1080 for 450 with a waterblock.... I'm doing a new build and dont want to go to 2 TXP in SLI for $ reasons.


So what's leaving? You still have the TXP?
Quote:


> Originally Posted by *Pirazy*
> 
> Sorry if this has been asked already but the thread is 600 pages long.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Has anyone here done any actual testing to see if Titans in SLI will saturate the lanes on a z170/z270 board? Been googling for hours and the closest I could find was an article on techpowerup where they tested a single 1080 in various lane speeds which didn't see any performance hit going from 3.0 x16 to x8. There's been no tests on SLI and pci-e scaling on the pascal architecture at all from what I can see.
> 
> Well, there was that test over at Puget Systems but the results from that was so weird they must have done something wrong, you shouldn't get less performance from more bandwidth, not to mention their lackluster choice of realworld gaming benchmarks. Plus it was an x99 platform where they gimped the lanes in BIOS.


It's really difficult to saturate the x8 lane config on a z270. I really doubt any current game will do so, and even with a synthetic, saturation at very high resolution is probably limited by a 4 core CPU. I have done this on 2 x99 boards, verifying that 2x16 lanes vs 2x8 lanes for a TXP SLI setup did not make any major difference in several 4K benchmarks. In fact, many benchers will actually dial back to Gen2 x16 (same as gen3x8) in order to get higher scores in some applications.
Only way to know for sure is to test the specific applications (games) you play.
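The Gen2 x16 = Gen3 x8 equivalence mentioned above falls out of the raw link math: Gen2 signals at 5 GT/s with 8b/10b encoding, while Gen3 signals at 8 GT/s with the leaner 128b/130b encoding. A quick sketch of the theoretical per-direction numbers:

```python
# Theoretical per-direction PCIe bandwidth, showing why Gen2 x16 and Gen3 x8
# land in roughly the same place (encoding overhead differs per generation).

def pcie_gbs(gen, lanes):
    """Usable GB/s per direction for a PCIe gen 2 or gen 3 link."""
    rate_gts, efficiency = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}[gen]
    return rate_gts * efficiency / 8 * lanes  # Gb/s per lane -> GB/s * lanes

print(f"Gen3 x16: {pcie_gbs(3, 16):.2f} GB/s")  # 15.75 GB/s
print(f"Gen3 x8:  {pcie_gbs(3, 8):.2f} GB/s")   # 7.88 GB/s
print(f"Gen2 x16: {pcie_gbs(2, 16):.2f} GB/s")  # 8.00 GB/s
```

So dialing a Gen3 slot back to Gen2 x16 gives essentially the same usable bandwidth as Gen3 x8, which is why benchers can do it without losing throughput.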


----------



## axiumone

Quote:


> Originally Posted by *Pirazy*
> 
> Sorry if this has been asked already but the thread is 600 pages long.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Has anyone here done any actual testing to see if Titans in SLI will saturate the lanes on a z170/z270 board? Been googling for hours and the closest I could find was an article on techpowerup where they tested a single 1080 in various lane speeds which didn't see any performance hit going from 3.0 x16 to x8. There's been no tests on SLI and pci-e scaling on the pascal architecture at all from what I can see.
> 
> Well, there was that test over at Puget Systems but the results from that was so weird they must have done something wrong, you shouldn't get less performance from more bandwidth, not to mention their lackluster choice of realworld gaming benchmarks. Plus it was an x99 platform where they gimped the lanes in BIOS.


http://www.overclock.net/t/1616578/x99-6850k-4-5-vs-z170-6700k-4-8-w-titan-xp-sli-benchmarks-and-results/0_100


----------



## auraofjason

Do you guys think my 4770k @ 4.5ghz is bottlenecking my titan x? For regular 60hz I don't think it'd be a problem, but I think my cpu might be limiting me trying to reach 165hz in certain games. You guys think upgrading to a 5.1ghz 7700k would be worth?


----------



## BigBeard86

What heatsinks would be a good fit for titan pascal vrm? I plan to put a kraken g10 on it.
Quote:


> Originally Posted by *auraofjason*
> 
> Do you guys think my 4770k @ 4.5ghz is bottlenecking my titan x? For regular 60hz I don't think it'd be a problem, but I think my cpu might be limiting me trying to reach 165hz in certain games. You guys think upgrading to a 5.1ghz 7700k would be worth?


Go look at benchmarks... 0-5 fps difference, closer to 0 than 5 in most games. Keep in mind that the 4770K uses different RAM than the 7700K, and even then there's almost zero difference in gaming.


----------



## Pirazy

Quote:


> Originally Posted by *axiumone*
> 
> http://www.overclock.net/t/1616578/x99-6850k-4-5-vs-z170-6700k-4-8-w-titan-xp-sli-benchmarks-and-results/0_100


Top notch job axiumone! Do you think there would be any significant difference running them in x8/x8?


----------



## axiumone

Thank you.

I can only speculate, as I didn't have any way to get both cards to run at x8 at the time of testing. Potentially, yes, the performance gap would be even greater. On the other hand, it's a generally accepted theory that when one GPU runs at x8, the pair can only communicate at x8 worth of bandwidth, so the performance could stay the same as x16/x8.


----------



## MunneY

Quote:


> Originally Posted by *Jpmboy*
> 
> so what's leaving? YOu still have the TXP?
> It's really difficult to saturate the x8 lane config on a z270. I really doubt any current game will do so, and even with a synthetic, saturation at very high resolution is probably limited by a 4 core CPU. I have done this on 2 x99 boards, verifying that 2x16 lanes vs 2x8 lanes for a TXP SLI setup did not make any major difference in several 4K benchmarks. In fact, many benchers will actually dial back to Gen2 x16 (same as gen3x8) in order to get higher scores in some applications.
> Only way to know for sure is to test the specific applications (games) you play.


I'll probably sell it as soon as I find another cheap 1080. That'll allow me to get 3 in SLI if I want to.


----------



## unreality

Quote:


> Originally Posted by *auraofjason*
> 
> Do you guys think my 4770k @ 4.5ghz is bottlenecking my titan x? For regular 60hz I don't think it'd be a problem, but I think my cpu might be limiting me trying to reach 165hz in certain games. You guys think upgrading to a 5.1ghz 7700k would be worth?


Play games, then check the sensors tab in GPU-Z to see whether the TX is hitting 100% usage or not. In my experience, high refresh rates (120+) also need a lot more CPU power to keep the GPU fully utilized. I'd guess in some games you could see some improvement, especially with the newer IPC and 600MHz more, but it won't be 50%.
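
The same check can be scripted without GPU-Z. This is just a sketch of the idea (the 95% threshold is my own rough heuristic, not anything official): sample `utilization.gpu` via nvidia-smi while the game runs, and if the average sits well below 100% while you're under your frame-rate target, the CPU is the likely limiter.

```python
import subprocess
import time

def read_utilization():
    """Read GPU 0's current utilization (percent) from nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return int(out.strip().splitlines()[0])

def sample_utilization(seconds=30, interval=1.0):
    """Collect one utilization sample per interval while a game is running."""
    samples = []
    for _ in range(int(seconds / interval)):
        samples.append(read_utilization())
        time.sleep(interval)
    return samples

def looks_cpu_bound(samples, threshold=95):
    """Rough heuristic (my assumption): an average GPU utilization below
    `threshold` percent suggests the GPU is waiting on the CPU."""
    return sum(samples) / len(samples) < threshold
```

So `looks_cpu_bound(sample_utilization())` during a 165Hz session would give a quick yes/no before spending money on a 7700K.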


----------



## MrKenzie

Quote:


> Originally Posted by *CptSpig*
> 
> I did one run with 227 on the core and 350 on the memory. CPU OC 4.7 GHz. Top ten without tweaking and multiple runs. http://www.3dmark.com/fs/11550696
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s1164.photobucket.com/user/CptSpig/media/Untitled4_zpshlqt84en.png.html


That goes to show the difference between firestrike, where we are CPU limited, to firestrike ultra, where we are GPU limited.
Your firestrike score is around 4,000 points higher than mine, but my firestrike ultra score is higher than yours, especially the GPU score.
http://www.3dmark.com/fs/11489082

I am all for gaming performance over benchmark performance though; benchmark performance can be misleading because the hardware will only run at that level until the temps rise and performance drops. Unless you have a chilled system like I do, where the temps remain low and performance stays high.


----------



## CptSpig

Quote:


> Originally Posted by *MrKenzie*
> 
> [/SPOILER]
> 
> That goes to show the difference between firestrike, where we are CPU limited, to firestrike ultra, where we are GPU limited.
> Your firestrike score is around 4,000 points higher than mine, but my firestrike ultra score is higher than yours, especially the GPU score.
> http://www.3dmark.com/fs/11489082
> 
> I am all for gaming performance over benchmark performance though, benchmark performance can be misleading because the hardware will only run at that performance until the temps rise and performance drops. Unless you have a chilled system as I do, where the temps remain low and performance stays high.


Like I said, I only ran Fire Strike Ultra one time. With some tweaking I can beat your GPU score as well. I am running my machine on water so it stays pretty cold. I only guessed at what the OC should be for the GPU. To get the best scores from your machine you have to OC all the hardware to work together; if your CPU is not overclocked to match a heavily overclocked GPU, the GPU will not reach its potential. Our graphics scores are within 5%, so it would not take much to beat your graphics score. See below.
http://s1164.photobucket.com/user/CptSpig/media/Untitled5._zpsftjb5t3w.png.html


----------



## Jpmboy

Quote:


> Originally Posted by *axiumone*
> 
> http://www.overclock.net/t/1616578/x99-6850k-4-5-vs-z170-6700k-4-8-w-titan-xp-sli-benchmarks-and-results/0_100


A lot of work went into that data set. +1 (from everybody, I hope)
It doesn't quite address the question of saturation, though; maybe it's more related to the single-core frequency of the processors?


----------



## MrKenzie

Quote:


> Originally Posted by *CptSpig*
> 
> Like I said, I only ran Fire Strike Ultra one time. With some tweaking I can beat your GPU score as well. I am running my machine on water so it stays pretty cold. I only guessed at what the OC should be for the GPU. To get the best scores from your machine you have to OC all the hardware to work together; if your CPU is not overclocked to match a heavily overclocked GPU, the GPU will not reach its potential. Our graphics scores are within 5%, so it would not take much to beat your graphics score. See below.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s1164.photobucket.com/user/CptSpig/media/Untitled5._zpsftjb5t3w.png.html


I would be interested to see if you could get a graphics score higher than 8,181, to give me some incentive to upgrade the CPU side of my PC, as I'm yet to be convinced I would benefit enough from an upgrade.
My graphics score is higher than 6th place among overall Firestrike Ultra single-GPU results.


----------



## CptSpig

Quote:


> Originally Posted by *MrKenzie*
> 
> [/SPOILER]
> 
> I would be interested to see if you could get higher than 8181 graphic score, to give me some incentive to upgrade the CPU side of my PC as I'm yet to be convinced I would benefit enough to upgrade.
> My graphic score is higher than 6th place in overall Firestrike ultra single GPU's..


If you are interested in a CPU, you should be looking at the Physics score, where I was 52% higher. I have a 4790K in a second machine; its single-core is pretty strong, but it's no match for the 5930K overall. When I get a chance I will run Fire Strike Ultra and see what I can achieve.


----------



## axiumone

Quote:


> Originally Posted by *Jpmboy*
> 
> a lot of work went into that data set. +1 (from every body I hope)
> Doesn't quite address the question of saturation, maybe more related to the single core frequency of the processors?


Thanks a lot, I appreciate it. A portion of the testing directly compares the Skylake chip at the same frequency in x16/x16 vs x16/x8. Some games are more susceptible to performance decreases than others, but on average there's around a 5% drop-off. A lot of us will pay good money for a better cooler, etc., to get the same 5% gain. At that point you have to compare what getting x16 to all GPUs will cost.


----------



## AlRayesBRN

Gentlemen .. need your advice ..

After 3 agonizing hours I installed the hybrid kit on my Titan XP. On air, my card used to overclock to a max of +225/+450 with a 75% fan profile, reaching 83C.

Now, after installing the hybrid kit, I get artifacts at those clocks! I toned the overclock down to +175/+350 and the artifacts are gone (42-44C max) with voltage at 100% in MSI Afterburner. With these settings I'm getting a max boost of 2088, jumping around between 1970 and 2057 and sometimes hitting 2088.

Any idea why this is happening?

I'll try to push the card to +190/+360 today


----------



## Artah

Quote:


> Originally Posted by *AlRayesBRN*
> 
> Gentlemen .. need your advice ..
> 
> After 3 agonizing hours I installed the hybrid kit on my Titan XP .. on Air my card used to overclock max to 225/450 with 75% fan profile and reaches 83c
> 
> Now after installing the hybrid kit I get artifacts on these clocks! I toned down the overclock to 175/350 and the artifacts are gone (42 to 44c max) with volts at 100% in MSI after burner .. with these setting im getting max boost at 2088 and jumping around between 1970 and 2057 and sometimes hitting 2088
> 
> Any idea why is this happening?
> 
> Ill try to push the card to 190/360 today


Ouchie, hope you didn't knock off one of those tiny parts on the board. Have you tried DDU and reinstalling the driver?


----------



## AlRayesBRN

Quote:


> Originally Posted by *Artah*
> 
> Ouchie, hope you didn't knock off one of those tiny parts on the board. You tried DDU and reinstall the driver?


Not yet .. finished installing the card at 2 am .. hope I didn't knock off any parts


----------



## ESRCJ

Quote:


> Originally Posted by *AlRayesBRN*
> 
> Not yet .. finished installing the card at 2 am .. hope I didnt knock off any parts


Have you tried disabling the voltage control and trying again? Pascal cards don't seem to like higher voltages. They seem to do more harm than good.


----------



## AlRayesBRN

Quote:


> Originally Posted by *gridironcpj*
> 
> Have you tried disabling the voltage control and trying again? Pascal cards don't seem to like higher voltages. They seem to do more harm than good.


Yup, tried both, and artifacts are still present if my clocks are +200 or above ..


----------



## ESRCJ

Quote:


> Originally Posted by *AlRayesBRN*
> 
> Yup tried both and artificats still present if my clocks is 200 or above ..


Hm, something may have gone wrong during the installation. Worst case scenario, you could request an RMA with Nvidia. They're pretty lax with their RMAs from what I've read. However, they send refurbished cards as replacements and some of them are pretty awful, such as cards that can't even hit 1900MHz.

You could also consider putting the air cooler back on and retest. If everything is fine then, it might just be something off with the installation.


----------



## AlRayesBRN

Quote:


> Originally Posted by *gridironcpj*
> 
> Hm, something may have gone wrong during the installation. Worst case scenario, you could request an RMA with Nvidia. They're pretty lax with their RMAs from what I've read. However, they send refurbished cards as replacements and some of them are pretty awful, such as cards than can't even hit 1900MHz.
> 
> You could also consider putting the air cooler back on and retest. If everything is fine then, it might just be something off with the installation.


Damn .. removing all the screws and the cooler is a pain tbh .. the artifacts only pop up in Heaven though .. looks like I did something bad to the board by mistake ..


----------



## ESRCJ

Quote:


> Originally Posted by *AlRayesBRN*
> 
> Damn .. removing all the screws and cooler is a pain tbh .. the artifacts only pop up in heaven though .. looks like I did something bad to the board by mistake


If it's only Heaven, then it might not be worth worrying about. Yeah, the Titan XP has so many screws that it becomes a major pain to deal with. It's definitely worth it if you get it right though, as these cards flourish with liquid cooling.


----------



## AlRayesBRN

Quote:


> Originally Posted by *gridironcpj*
> 
> If it's only Heaven, then it might not be worth worrying about. Yeah, the Titan XP has so many screws that it becomes a major pain to deal with. It's definitely worth it if you get it right though, as these cards flourish with liquid cooling.


But I don't understand why I can't reach the same overclock I had on air? It's weird


----------



## tonnytech

Quote:


> Originally Posted by *AlRayesBRN*
> 
> but I dont understand why cant I reach the same overclock I had on Air?? Its weird


Possibly the memory isn't being cooled as effectively as it was with the air cooler; it's also possible the hybrid cooler isn't seated correctly on the GPU.

It'd be worth putting the card back on air to check. Fingers crossed there's no damage from the installation


----------



## tonnytech

**del


----------



## AlRayesBRN

Quote:


> Originally Posted by *tonnytech*
> 
> possible the memory isent being cooled as effective as it was with the air cooler , also guessing may be possible the hybrid cooler isent seated correctly on gpu.
> 
> Be worth putting the card back to air to check , fingers crossed no damage on installation


Stable max is +204/+475 now .. temps don't exceed 47C .. 2088 max playing Watch Dogs 2 (rarely exceeds 2100)


----------



## bizplan

Quote:


> Originally Posted by *AlRayesBRN*
> 
> Stable max on 204/475 now .. temps dont exceed 44 - 45c .. 2088 max playing watch dogs 2


Are the artifacts showing up in Firestrike and/or other Direct X-based games at high core clock settings?


----------



## AlRayesBRN

I only saw the artifacts in Heaven when I tried different clocks .. I'll try Firestrike tomorrow and let you know how it goes


----------



## CptSpig

Quote:


> Originally Posted by *MrKenzie*
> 
> [/SPOILER]
> 
> I would be interested to see if you could get higher than 8181 graphic score, to give me some incentive to upgrade the CPU side of my PC as I'm yet to be convinced I would benefit enough to upgrade.
> My graphic score is higher than 6th place in overall Firestrike ultra single GPU's..


I could not beat your graphics score in the time I had this weekend. Below is the best I could do but I will keep trying.








http://s1164.photobucket.com/user/CptSpig/media/Untitled6_zpsnfb2vujl.png.html


----------



## ESRCJ

Quote:


> Originally Posted by *CptSpig*
> 
> I could not beat your graphics score in the time I had this weekend. Below is the best I could do but I will keep trying.
> 
> 
> 
> 
> 
> 
> 
> 
> http://s1164.photobucket.com/user/CptSpig/media/Untitled6_zpsnfb2vujl.png.html


What are your core clocks like during graphics test 1? I've noticed that mine don't break 2100MHz in FS Ultra with the same overclock that gets them to 2137MHz in Fire Strike (normal).

Also, what in the world is with the small weight 3DMark puts on the physics score??? Yours is 50% higher than the other user, a few percentage points lower on graphics and combined, yet your overall score is only 0.6% higher.


----------



## ESRCJ

Accidental post.


----------



## PowerK

Having owned TITAN X Pascal SLI since summer last year, today was the first time running 3DMark for fun.


----------



## CptSpig

Quote:


> Originally Posted by *gridironcpj*
> 
> Accidental post.


Quote:


> Originally Posted by *gridironcpj*
> 
> What are your core clocks like during graphics test 1? I've noticed that mine don't break 2100MHz in FS Ultra with the same overclock that gets them to 2137MHz in Fire Strike (normal).
> 
> Also, what in the world is with the small weight 3DMark puts on the physics score??? Yours is 50% higher than the other user, a few percentage points lower on graphics and combined, yet your overall score is only 0.6% higher.


I noticed the same thing: in my Fire Strike 1.1 run my core clock is 2,101 and my memory is 1,339, but I can't seem to get that in Ultra. I don't know what's up with physics being worth so little in the overall score; maybe someone else can answer that question. I am pretty happy with my scores on water.


----------



## MrKenzie

Quote:


> Originally Posted by *gridironcpj*
> 
> What are your core clocks like during graphics test 1? I've noticed that mine don't break 2100MHz in FS Ultra with the same overclock that gets them to 2137MHz in Fire Strike (normal).
> 
> Also, what in the world is with the small weight 3DMark puts on the physics score??? Yours is 50% higher than the other user, a few percentage points lower on graphics and combined, yet your overall score is only 0.6% higher.


My guess as to why cards tend to boost lower in Ultra is that the GPU draws more power and hits the power limit more often than in regular Firestrike. Mine hits 2088 max in Ultra graphics test 1, whereas in almost all games it runs steady at 2114-2126.
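
If you want to confirm it's the power limit doing the holding back (rather than temps), here's a rough sketch along those lines; the nvidia-smi query fields are real, but the 98% margin is just an assumption I picked for illustration.

```python
import subprocess

def read_power():
    """Return (draw_watts, limit_watts) for GPU 0 via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw,power.limit",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    draw, limit = (float(f) for f in out.strip().splitlines()[0].split(","))
    return draw, limit

def power_limited_fraction(samples, margin=0.98):
    """Fraction of (draw, limit) samples where draw sits within `margin`
    of the board power limit (0.98 is an arbitrary cutoff for illustration)."""
    hits = sum(1 for draw, limit in samples if draw >= margin * limit)
    return hits / len(samples)
```

Log `read_power()` once a second during the Ultra graphics tests and feed the list to `power_limited_fraction()`; a result near 1.0 means the boost clock is power-bound, which would fit the lower clocks seen in Ultra.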


----------



## arrow0309

Still no one working on a custom bios for the txp?
Something like the T4 bios for my (ex) 1080 Strix


----------



## xTesla1856

Quote:


> Originally Posted by *arrow0309*
> 
> Still no one working on a custom bios for the txp?
> Something like the T4 bios for my (ex) 1080 Strix


Paging @Sheyster


----------



## Nunzi

I'm still using Sheyster's BIOS for the Titan XM; works great.


----------



## arrow0309

Nooooo








And when were you guys going to let me know?
Cause I wanna rock a 2100 fixed, no throttling








So, where can I get this bios?


----------



## xTesla1856

Quote:


> Originally Posted by *arrow0309*
> 
> Nooooo
> 
> 
> 
> 
> 
> 
> 
> 
> And when have you guys decided to let me know?
> Cause I wanna rock a 2100 fixed, no throttling
> 
> 
> 
> 
> 
> 
> 
> 
> So, where can I get this bios?


Calm down, there are no Pascal vBIOSes yet.


----------



## arrow0309




----------



## ChronoBodi

From what I heard, Nvidia signs the vBIOS with a signature key for the Pascal series. So it's not so easy compared to previous Nvidia GPUs and AMD GPUs.


----------



## arrow0309

Quote:


> Originally Posted by *ChronoBodi*
> 
> from what i heard, Nvidia has a signature key lock to their VBIOS for the Pascal series. So, it's not so easy compared to previous Nvidia GPUs and AMD GPUs.


And how come they came out with the T4 (1080) Strix bios?


----------



## axiumone

Quote:


> Originally Posted by *arrow0309*
> 
> And how come they came out with the T4 (1080) Strix bios?


Because it's a partner board with a custom BIOS from the get-go; I'd assume they've used different security in that BIOS. There are no partner boards for the TXP, hence no custom BIOS progress.


----------



## jhowell1030

Quote:


> Originally Posted by *AlRayesBRN*
> 
> Only saw the artificats in heaven when I tried different clocks .. Ill try firestrike tomorrow and let you how it goes


This is going to sound a little ignorant of me... but bear with me. Before getting my Titan XP, the last time I really messed with overclocking was when I picked up my 2 980 K|NGP|N cards at launch. Back then the Firestrike interface was a little different. When I'd mess with settings trying to push my cards, I'd hit a point where I'd see artifacts and then know I needed to dial it back from there.

Fast forward a couple of years to today. When I'm messing with clock speeds now, +201 runs fine; at +202, sometimes (not all the time) the benchmark will straight up stop as if I hit escape to cancel out of it. That's why I've not pushed it any further than +201. I could run +201 20 times, no problem. At +202 sometimes it works... sometimes it ends prematurely.

Since I'm not seeing artifacts or anything before that happens...could I have another issue going on here?

Thanks guys


----------



## xTesla1856

Quote:


> Originally Posted by *jhowell1030*
> 
> This is going to sound a little ignorant of me...but bare with me. So, before getting my Titan XP the last time I really messed with overclocking was when I pickup up my 2 980 K|NGP|N cards when they launched. Back then the Firestrike interface was a little different. Now then when I'd mess with setting trying to push my cards I'd hit a point where I'd see artifacts and then know I needed to dial it back from there.
> 
> Fast forward a couple years to today. When I'm messing with clock speeds now, +201 runs fine, at +202 sometimes (not all the time) the benchmark will straight up stop as if I hit escape to cancel out of it. That's why I've not pushed it any further than +201. I could run +201 20 times, no problem. At +202 sometimes it works...sometimes it ends prematurely.
> 
> Since I'm not seeing artifacts or anything before that happens...could I have another issue going on here?
> 
> Thanks guys


Happens to me too, from time to time. I just repeat the benchmark until I get a pass









No idea what causes it, though.


----------



## arrow0309

Wow, currently playing WD2 at 3440x1440 at 2100/11000, 2100 fixed!








Niiiiice


----------



## Jpmboy

^^ Slammin' that power limit.


----------



## arrow0309

Quote:


> Originally Posted by *Jpmboy*
> 
> ^^ Slammin' that power limit.


Yeah, dammit
I don't even look at the **** however
What matters is that the clock stays pretty stable at 2100 and 1.05-1.065v (+228/+497) in this game; the same OC downclocks heavily in Time Spy or TW3, for instance


----------



## Lee0

So, as I've mentioned in this thread, I've been looking for a cooling solution for my TXP. Right now it's between the Arctic Accelero IV air cooler and their hybrid one. However, they are kind of ugly (at least the hybrid one) and I want the card on water. So I'm thinking about doing the EVGA hybrid mod. I have experience with a dremel, so that's no problem, and the cuts are really minor. But I'm a bit confused about which kit I should use. There are two different ones: http://www.evga.com/products/product.aspx?pn=400-HY-5188-B1 - the normal 1080 one
and the FTW one http://www.evga.com/products/product.aspx?pn=400-HY-5288-B1 .
Notice that the FTW one has a different fan and a cutout for 2 connectors.
I'm mainly directing this question to @Lobotomite430 and the other people who've done this mod.
And as a final note, I plan on keeping the EVGA shroud on.
Thanks in advance.


----------



## jhowell1030

Quote:


> Originally Posted by *Lee0*
> 
> So as I've mentioned in this thread I've been looking for a cooling solution for my TXP. Right now it's either between the arctic cooler accelero IV air or their hybrid one. However they are kind of ugly (at least the hybrid one) and I want it on water. So I'm thinking about doing the EVGA hybrid mod. I've experience with a dremel so no problem and the cuts are really minor. But I'm a bit confused what kit should I use? There are two different: http://www.evga.com/products/product.aspx?pn=400-HY-5188-B1 - The normal 1080 one
> and the FTW one http://www.evga.com/products/product.aspx?pn=400-HY-5288-B1 .
> Notice that the FTW one has a different fan and a connector cut out for 2 connectors.
> I'm kind of directing this question to you: @Lobotomite430 and the other people who've done this mod.
> And as a final note I plan on keeping the EVGA shroud on.
> Thanks in advance.


The FTW has a different layout for the VRMs. You'll want the one for the reference 1080.


----------



## Lee0

Quote:


> Originally Posted by *jhowell1030*
> 
> The FTW has a different layout for the VRMs. You'll want the one for the reference 1080.


Ok, thank you.
And I think I pretty much have everything down.
-Uninstall the original shroud, etc.
-Ground myself
-Mask off other areas that aren't being cut to protect them.
-Cut for the extra connector and VRM.
-Grind space for the capacitors.
-Lastly, install the kit.
And of course wear gloves and reapply TIM.


----------



## jhowell1030

Quote:


> Originally Posted by *Lee0*
> 
> Ok, thank you.
> And I think I pretty much have everything down.
> -Uninstall original shroud and etc.
> -Ground myself
> -Mask of other areas that aren't being cut to protect them.
> -Cut for the extra connector and VRM.
> -Grind space for capacitors
> -lastly install the kit.
> And of course the wear gloves and reapply TIM.


Yep! I *almost* went that route. I had a Kraken X61 on my 5820K and I loved the temps I was getting with that setup. I was nervous about what would happen if I put both the CPU and GPU in a single loop. Ultimately, I thought the hybrid kit mod along with the Kraken would look really cluttered in my case... so I spent the money on a loop.


----------



## phattypatty

Has anyone else experienced a crash when running through DisplayPort? For me it starts with my screen going black, then a quick audio loop, and it's gone. It only happens when running through DisplayPort, which is ironic because our mighty Titan comes with three DP ports and only one HDMI, yet the DisplayPorts crash it. I've run benchmarks from low to ultra and it crashes 5 seconds in; switch to HDMI and it runs the entire benchmark. Any ideas, anyone?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *phattypatty*
> 
> Has anyone else experienced a crash when running through displayport. For me it starts with my screen going black then a quick audio loop and gone. Only happens when running through displayport which is ironic because our mighty titan come with three dp slots and only 1 hdmi yet the display ports crash it. Ive run benchmarks from low to ultra crash 5 seconds in go to hdmi runs yhe entire benchmark. Any ideas anyone


Bad cord?


----------



## phattypatty

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Bad cord?


Yeah, I realized I had a bad off-brand cable after coming across the article about DisplayPort cables not working out of the store and how you have to buy a proper one. Switched to a new one and ran my benchmark perfectly on ultra


----------



## jsutter71

I have had MANY issues related to bad displayport cables. Especially since my primary display is 4096X2160. Nothing but issues when I was running it on my 980Tis. Most of my issues went away with the TXPs.


----------



## labjet

Can i join the club?


----------



## Leyaena

Damn I love that coolant color.
Is it off-the-shelf, or is it something you mixed yourself?


----------



## jsutter71

Quote:


> Originally Posted by *labjet*
> 
> Can i join the club?


That looks like mine.


Great Score. What are your overclock settings?


----------



## labjet

Thanks!!! It's a one-off; I was trying to get a dark grey color. I used Mayhems extreme white for the base, plus green and red dye. When the lights are off it's a greyish/greenish/brownish color -_- lol


----------



## labjet

Nice, I like your bends lol. I think I need to use extensions on the EK terminal; the hardline tube to the rad sits on the 6+8-pin connector on the 2nd GPU









For that run I think I had +213 on the core and +500 on the memory. It didn't translate well when I tried that in Time Spy, and it's also not that stable; I normally run +200/+500


----------



## Jpmboy

Quote:


> Originally Posted by *labjet*
> 
> Can i join the club?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


I like the plexi look.


----------



## bizplan

Quote:


> Originally Posted by *AlRayesBRN*
> 
> Only saw the artificats in heaven when I tried different clocks .. Ill try firestrike tomorrow and let you how it goes


Firestrike... What did you learn?


----------



## labjet

Question for the SLI and multi monitor users. Do you have any issues or is there a special setup? and what drivers are you running?

my current monitor setup is 3 x Ben Q xl2720 144hz and 1 x Acer X34 100hz.

Previously, on one Titan, I was able to run 3-monitor surround with a 4th monitor; I'd also be able to switch between the 2 setups by changing the primary monitor, as I like to do surround for certain games and single monitor for others.

This is my first time running SLI, and I've been experiencing low fps or stuttering issues when trying to run 4 monitors. The issues disappear when running a single monitor.


----------



## jsutter71

Running 4 monitors with TXPs in SLI. Using the latest drivers and have no issues. Your issues may be your cables.
Center monitor is a LG 31" 4096X2160 and the other 3 are 27" 2560X1440


----------



## jhowell1030

Well boys. I finally did it. I BROKE 24000!

http://www.3dmark.com/3dm/17781250?

I'm scratching my head. After lots of testing... any overclock over +200MHz on the core can't have anything added to the power limit, or else Fire Strike closes as if I hit escape on the 2nd graphics test. +213 seems to be the highest I can go in Fire Strike without it doing the same thing. I've been able to game at +220 no problem; I just can't seem to benchmark it that high.


----------



## arrow0309

And I finally broke the 34000 GS
















http://www.3dmark.com/3dm/17780821


----------



## AlRayesBRN

Quote:


> Originally Posted by *bizplan*
> 
> Firestrike... What did you learn?


Tried Firestrike and artifacts were present when going above +203/+475.

I downloaded the latest Nvidia hotfix and guess what!! The clocks now reach +225/+480!

I tried +250/+480 but Watch Dogs 2 crashed after 15 minutes or so ..

I think I can get it a bit higher than +225/+480 but didn't have the time to try .. might do it tonight!


----------



## jhowell1030

Quote:


> Originally Posted by *AlRayesBRN*
> 
> Tried Firestrike and artificats were present when going above 203/475
> 
> I downloaded the latest Nvodia hotfix and guess what!! the clocks now reach to 225/480!
> 
> I tried 250/480 but watch dogs 2 crashed after 15 minutes or so ..
> 
> I think I can get it a bit higher than 225/480 but didnt have the time to try it .. might do it tonight!


Hotfix? Can you share the url?

Or are you talking about a driver update?


----------



## hertz9753

http://nvidia.custhelp.com/app/answers/detail/a_id/4378/~/geforce-hot-fix-driver-version-378.57

It's probably that one.


----------



## AlRayesBRN

https://nvidia.custhelp.com/app/answers/detail/a_id/4378

This one .. make sure you do a clean install


----------



## AlRayesBRN

Quote:


> Originally Posted by *jhowell1030*
> 
> Hotfix? Can you share the url?
> 
> Or are you talking about a driver update?


https://nvidia.custhelp.com/app/answers/detail/a_id/4378

This one .. make sure you do a clean install


----------



## TonyRoma

Just signed up today. I ordered my Titan XP on 5th August as a birthday treat. I've been enjoying it immensely since then, clocking and gaming, BOINC/SETI too. I've been a long-time lurker on this thread and thought it's about time I give my thanks to the many, many posters with great information regarding the Titan XP. Thank you guys, much appreciated for all the info you have provided since way back in August.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *TonyRoma*
> 
> Just signed up today. I ordered my Titan XP on 5th August as a birthday treat. I've been enjoying it immensely since then, clocking and gaming. I've been a long time lurker on this thread and thought it's about time I give my thanks to the many many posters with great information regarding the Titan XP. Thank you guys, much appreciated for all the info you have provided since way back in August


Welcome to OCN and to the club too!









Yes, great card; it just would have been nice to be able to tweak it more at the BIOS level to get more out of it. I'm still happy though.


----------



## TonyRoma

Thank you @MrTOOSHORT, I do recall seeing you often in this thread. During the time I've had my TXP I've been toying with the idea of a custom water loop, but I just don't have that kind of experience. I'm currently quite happy on air thanks to Noctua fans and my Corsair 540 case, which puts the fans very close to the GPU and keeps the temps fairly under control.









The less said about fan noise the better though haha


----------



## CptSpig

Quote:


> Originally Posted by *jhowell1030*
> 
> Well boys. I finally did it. I BROKE 24000!
> 
> http://www.3dmark.com/3dm/17781250?
> 
> I'm scratching my head. After lots of testing....any overclock over 200Mhz on the core can't have anything added to the power limit or else Fire Strike closes as if I hit escape on the 2nd graphics test. +213 seems to be the highest I can go in Fire Strike without it doing the same thing. I've been able to game on it @220 no problem. I just can't seem to benchmark it that high.


+1 Feels good to break 24,000!


----------



## CptSpig

Quote:


> Originally Posted by *arrow0309*
> 
> And I finally broke the 34000 GS
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/17780821


+1 Great graphics score!







Are you using a chiller?


----------



## jsutter71

I can't seem to get to 14000 for my combined score for Ultra.
http://www.3dmark.com/fs/11604314

I have noticed that the more I overclock my CPU, the lower my GPU score goes. Running stable at almost 4500MHz. At that CPU speed, the higher I raised my GPU clock, the lower my scores got.
http://www.3dmark.com/fs/11604599
http://www.3dmark.com/fs/11604730
http://www.3dmark.com/fs/11604815


----------



## arrow0309

Quote:


> Originally Posted by *CptSpig*
> 
> +1 Great graphics score!
> 
> 
> 
> 
> 
> 
> 
> Are you using a chiller?


Nope, only a simple custom mid-tower liquid-cooling loop and some fresh air!









http://www.xtremeshack.com/photos/20170204148622459823975.JPG

Other 3DMarks at the same OC (+227, +797):

http://www.xtremeshack.com/photos/20170204148622459826284.JPG
http://www.xtremeshack.com/photos/20170204148622459423192.JPG
http://www.xtremeshack.com/photos/20170204148622459312378.JPG


----------



## jsutter71

Getting closer to my goal of 14000 on firestrike ultra.
http://www.3dmark.com/fs/11615890


----------



## jsutter71

Got it...WOOHOO!!!!!
Core clock 225
Mem Clock 500

http://www.3dmark.com/fs/11616131


----------



## ocvn

Quote:


> Originally Posted by *jsutter71*
> 
> Got it...WOOHOO!!!!!
> Core clock 225
> Mem Clock 500
> 
> http://www.3dmark.com/fs/11616131


http://www.3dmark.com/fs/10519071
Check your CPU score, it seems a bit low. What is your memory speed?


----------



## jsutter71

Quote:


> Originally Posted by *ocvn*
> 
> http://www.3dmark.com/fs/10519071
> Check your CPU score, it seems a bit low. What is your memory speed?


Don't rain on my parade, but my memory is at 2398MHz. Yours is at 2998MHz, which explains the difference.

Here is my Heaven score running on my 4096X2160 monitor


----------



## ocvn

Quote:


> Originally Posted by *jsutter71*
> 
> Don't rain on my parade, but my memory is at 2398MHz. Yours is at 2998MHz, which explains the difference.
> 
> Here is my Heaven score running on my 4096X2160 monitor










OC it. 3000-3200 should be fine with the VRAM at 1.45V.


----------



## labjet

Awesome setup! The desk looks a lil small for those monitors though lol

Yeah, not sure what was causing it, but I reinstalled the driver and now it's working properly: no stuttering, FPS drops, or anything, and no issues changing the primary display from the X34 to Surround. Must have just been something random screwing with it.


----------



## labjet

I think watercooling the Titan is well worth it. I did not like the temps at all with the reference cooler, even at max fan speed, not to mention the noise at that speed. Lower temps, more consistent boost, quiet operation, better overclocking: it brings out the full potential of the Titan.

Watercooling also isn't scary to do; just do your research and get the correct, quality parts. I've built 2 watercooled PCs so far and even went straight for hardline tubing on my first build. I haven't run into any leaks so far, but one mistake I did make was running the pump dry. I didn't realize the PSU was in the on position when I plugged it in, and there was no fluid in the reservoir.


----------



## labjet

Have you tried running the benchmark with just one monitor active? I achieved much higher scores when I disabled my other monitors.


----------



## CptSpig

Quote:


> Originally Posted by *arrow0309*
> 
> Nope, only a simple custom mid tower liquid cooling and some fresh air!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.xtremeshack.com/photos/20170204148622459823975.JPG
> 
> Other 3DMarks at the same OC (+227, +797):
> 
> http://www.xtremeshack.com/photos/20170204148622459826284.JPG
> http://www.xtremeshack.com/photos/20170204148622459423192.JPG
> http://www.xtremeshack.com/photos/20170204148622459312378.JPG



Here are some of my scores. GPU OC: +227 core, +650 memory; CPU at 4.6GHz.


----------



## uggy

What do you guys recommend for an OC on two cards in SLI on water?
Core and memory.
Voltage at 100%?
And power at 120%?


----------



## jsutter71

Quote:


> Originally Posted by *ocvn*
> 
> 
> 
> 
> 
> 
> 
> 
> OC it. 3000-3200 should be fine with vram 1.45V


Interesting issue with the core clock: anything above +225 will cause 3DMark to lock up. Not my entire system, just the program itself. Anything above +500 on the memory clock shows no improvement in the benchmark. I tried 500 and 525 and received the same score.


----------



## patrickisfrench

I saw someone do this to their card on the EVGA forums, with the heatsinks next to the water block from the hybrid cooler. Would thermal paste alone work to keep these small heatsinks in place?


----------



## hertz9753

Those heat sinks probably use double-sided tape on the back.


----------



## octiny

24/7 scores.

*6950X @ 4.3/3.5 w/Titan X Pascal SLI +200/+575(DDR4 2666)*


http://www.3dmark.com/fs/11626484

http://www.3dmark.com/fs/11626531

http://www.3dmark.com/spy/1172736

*6950x @ 4.3/3.5 w/GTX 1080 SLI +170/+500 (DDR4 2666)*


http://www.3dmark.com/fs/11603007

http://www.3dmark.com/fs/11598750 *Bad FSU run, GPU score usually @ 11500 w/ CPU @ 26700

http://www.3dmark.com/spy/1159804

*6700K @ 4.7/4.5 w/Titan X Pascal SLI +200/575 (DDR4 2400mhz)*


http://www.3dmark.com/fs/11480147

http://www.3dmark.com/fs/11477784

http://www.3dmark.com/spy/1087488

Cheers.


----------



## CptSpig

Big difference in scores based on different GPUs, CPUs, and possibly cooling.


----------



## octiny

Quote:


> Originally Posted by *CptSpig*
> 
> Big difference in scores based on different GPUs, CPUs, and possibly cooling.


Yessir









Always nice to get some perspective.


----------



## jhowell1030

Quote:


> Originally Posted by *jsutter71*
> 
> Interesting issue with the core clock. Anything above 225 will cause 3Dmark to lock up. Not my entire system. Just the program itself. Anything above 500 on the memory clock shows no improvement with the benchmark. I tried 500 and 525 and received the same score.


I have similar issues. If I go higher than +213 on the core, Firestrike stops as if I hit Escape. What doesn't make sense is that I've been able to game with no problem at +220. I just can't seem to benchmark it that high.
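A side note for the bench-crash hunting above: logging clocks and power while the test runs makes it easy to see whether the card is slamming into its power limit right before 3DMark dies. A minimal Python sketch; the query fields are standard `nvidia-smi` properties, but the helper names and the ~300W figure for +20% power are my own assumptions:

```python
import subprocess

# Standard nvidia-smi query-gpu properties: temp, SM clock, board power.
QUERY = "temperature.gpu,clocks.sm,power.draw"

def parse_sample(line):
    """Parse one CSV sample like '44, 2012 MHz, 287.3 W' into numbers."""
    temp_s, clock_s, power_s = [field.strip() for field in line.split(",")]
    return {
        "temp_c": int(temp_s),
        "clock_mhz": int(clock_s.split()[0]),
        "power_w": float(power_s.split()[0]),
    }

def sample_gpu():
    """Take one sample from the first GPU (needs the NVIDIA driver tools)."""
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        text=True,
    )
    return parse_sample(out.strip().splitlines()[0])

# Call sample_gpu() once a second in a loop while the bench runs; if
# power_w sits pinned near the raised limit (~300 W at +20%) right before
# the crash, the offset is power-starved rather than simply unstable.
```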


----------



## Jpmboy

You guys should post your Futuremark and Unigine results in these threads (the data is very welcome!).
Please read post #1 in each for the proper settings and entry requirements.








http://www.overclock.net/t/1518806/firestrike-ultra-top-30/0_20
http://www.overclock.net/t/1443196/firestrike-extreme-top-30
http://www.overclock.net/t/1464813/3d-mark-11-extreme-top-30
http://www.overclock.net/t/872945/top-30-3d-mark-13-fire-strike-scores-in-crossfire-sli
http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores
http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad
http://www.overclock.net/t/1406832/single-gpu-firestrike-top-30
http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/0_20

Tossed around the idea of a VRMark Top 30... doesn't seem to be much interest. Let's see what the new Unigine bench looks like.


----------



## xTesla1856

Quote:


> Originally Posted by *Jpmboy*
> 
> you guys should post your Futuremark and Unigine results in these threads (the data is very welcome!)
> Please read post #1 in each for the proper settings and entry requirements.
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1518806/firestrike-ultra-top-30/0_20
> http://www.overclock.net/t/1443196/firestrike-extreme-top-30
> http://www.overclock.net/t/1464813/3d-mark-11-extreme-top-30
> http://www.overclock.net/t/872945/top-30-3d-mark-13-fire-strike-scores-in-crossfire-sli
> http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
> http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad
> http://www.overclock.net/t/1406832/single-gpu-firestrike-top-30
> http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/0_20
> 
> Tossed around the idea of a VRMark Top 30... doesn't seem to be much interest. Let's see what the new Unigine bench looks like.


Will do some runs when I get home, thanks


----------



## octiny

Quote:


> Originally Posted by *Jpmboy*
> 
> you guys should post your Futuremark and Unigine results in these threads (the data is very welcome!)
> Please read post #1 in each for the proper settings and entry requirements.
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1518806/firestrike-ultra-top-30/0_20
> http://www.overclock.net/t/1443196/firestrike-extreme-top-30
> http://www.overclock.net/t/1464813/3d-mark-11-extreme-top-30
> http://www.overclock.net/t/872945/top-30-3d-mark-13-fire-strike-scores-in-crossfire-sli
> http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
> http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad
> http://www.overclock.net/t/1406832/single-gpu-firestrike-top-30
> http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/0_20
> 
> Tossed around the idea of a VRMark Top 30... doesn't seem to be much interest. Let's see what the new Unigine bench looks like.


Done.


----------



## CptSpig

Quote:


> Originally Posted by *Jpmboy*
> 
> you guys should post your Futuremark and Unigine results in these threads (the data is very welcome!)
> Please read post #1 in each for the proper settings and entry requirements.
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1518806/firestrike-ultra-top-30/0_20
> http://www.overclock.net/t/1443196/firestrike-extreme-top-30
> http://www.overclock.net/t/1464813/3d-mark-11-extreme-top-30
> http://www.overclock.net/t/872945/top-30-3d-mark-13-fire-strike-scores-in-crossfire-sli
> http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
> http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad
> http://www.overclock.net/t/1406832/single-gpu-firestrike-top-30
> http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/0_20
> 
> Tossed around the idea of a VRMark Top 30... doesn't seem to be much interest. Let's see what the new Unigine bench looks like.


I posted my Fire Strike 1.1 score about a month ago and it doesn't look like anyone is updating the list. I will post in Time Spy and Fire Strike Ultra and see what happens.


----------



## jsutter71

Quote:


> Originally Posted by *octiny*
> 
> 24/7 scores.
> 
> *6950X @ 4.3/3.5 w/Titan X Pascal SLI +200/+575(DDR4 2666)*
> 
> 
> http://www.3dmark.com/fs/11626484
> 
> http://www.3dmark.com/fs/11626531
> 
> http://www.3dmark.com/spy/1172736
> 
> *6950x @ 4.3/3.5 w/GTX 1080 SLI +170/+500 (DDR4 2666)*
> 
> 
> http://www.3dmark.com/fs/11603007
> 
> http://www.3dmark.com/fs/11598750 *Bad FSU run, GPU score usually @ 11500 w/ CPU @ 26700
> 
> http://www.3dmark.com/spy/1159804
> 
> *6700K @ 4.7/4.5 w/Titan X Pascal SLI +200/575 (DDR4 2400mhz)*
> 
> 
> http://www.3dmark.com/fs/11480147
> 
> http://www.3dmark.com/fs/11477784
> 
> http://www.3dmark.com/spy/1087488
> 
> Cheers.


How are you overclocking your CPU, by individual core or by syncing, and what are your settings? Also, what are your 3D settings in the Nvidia panel? I have played around a lot with my settings; with the same CPU (6950X) mine are set per-core at 43 and I have the Intel Turbo Boost Max 3.0 driver enabled. I can overclock higher, but for some reason when I go beyond 43 my 3DMark scores drop.


----------



## octiny

Quote:


> Originally Posted by *jsutter71*
> 
> How are you overclocking your CPU? By individual core or by syncing and what are your settings? Also what are your 3D settings in the Nvidia panel. I have played around a lot with my settings and with the same CPU 6950x my settings are set to per core at 43 and I have the Intel turbo boost max 3.0 driver enabled. I can overclock higher but for some reason when I go beyond 43 my 3dmark scores drop.


Just all cores at a 43 multiplier, AVX offset at 2, cache at 35, override voltage, 1.336v core / 1.248v cache, 1.9 VRIN, 1.1 SA. Haven't touched a single thing in the Nvidia control panel; it's set to the default "optimal power". Nothing special. I still need to tighten my mem timings, which should give me a nice increase.

I do notice your CPU/combined scores are kind of low, as I've tried 2400MHz too and only lose about 200 points vs. 2666. One thing I did notice while dialing in a final 24/7 overclock: the slightest instability in the system will cause the physics and combined scores to drop dramatically in FS/FSU, similar to what I'm seeing with your 3DMark scores. I've noticed it happening quite a bit on the Futuremark site too, at least with 6950X/XP SLI systems, even when they're clocked 200+ higher with much faster memory. It took me a while to dial in my voltage and settings because of it.

Stress tested via 4hr RealBench with 64GB loaded, then RealBench x264 for another 4hr, then another 4hr of cache/memory testing via AIDA64.


----------



## jhowell1030

Quote:


> Originally Posted by *octiny*
> 
> Just all cores at a 43 multiplier, AVX offset at 2, cache at 35, override voltage, 1.336v core / 1.248v cache, 1.1 SA. Haven't touched a single thing in the Nvidia control panel; it's set to the default "optimal power". Nothing special. I still need to tighten my mem timings, which should give me a nice increase.
> 
> I do notice your cpu/combined scores are kind of low, as I've tried 2400mhz too and only lose about 200 points vs. 2666. One thing I did notice while getting a final 24/7 overclock, the littlest of instability in the system will cause the physics and combined score to drop dramatically in FS/FSU, similar to what I'm seeing with your 3Dmark scores. I've noticed it happening quite a bit on the Futuremark site too, at least with 6950x/XP SLI systems. Took me awhile to dial in my voltage and settings because of it.
> 
> Stress tested via 4hr RealBench with 64GB loaded, then RealBench x264 for another 4hr, then another 4hr of cache/memory testing via AIDA64.


Exact same here. Before I was tinkering to try and get +24000 on Fire Strike I had what tested as a 24/7 stable overclock on my 5820k @ 4.7Ghz using AIDA64 and Prime95. I have a spreadsheet of all of my different settings and how they benchmark in Fire Strike to use for reference anytime I change settings or upgrade. I hadn't tried 4.7 too much in Fire Strike but noticed that the Physics and Combined score were about 18% worse than what they were at 4.6. I went and added just a bit more voltage to the CPU (went from 1.35 to 1.375) and saw that now I was getting 3% on average better scores than what I was at 4.6. Turns out my rock solid voltage for AIDA64 and Prime95 wasn't quite cutting it. Now I'll have to remember that for chips in the future.


----------



## Jpmboy

Quote:


> Originally Posted by *CptSpig*
> 
> I posted my Fire Strike 1.1 score about a month ago and it doesn't look like anyone is updating the list. I will post in Time Spy and Fire Strike Ultra and see what happens.


.. I don't maintain that leaderboard.









Quote:


> Originally Posted by *octiny*
> 
> Done.


thanks!
Quote:


> Originally Posted by *xTesla1856*
> 
> Will do some runs when I get home, thanks


thanks!


----------



## stocksux

Hey I'm here for the gangbang...oh wait ? Look what showed up today!!


----------



## xTesla1856

Quote:


> Originally Posted by *stocksux*
> 
> Hey I'm here for the gangbang...oh wait ? Look what showed up today!!


Enjoy! Tell us about your OC results as well


----------



## stocksux

Quote:


> Originally Posted by *xTesla1856*
> 
> Enjoy! Tell us about your OC results as well


Roger that. Still waiting for more parts for the build. So far just case (CaseLabs SMA8) and card (Titan XP).


----------



## NemChem

Hey all! Have been following the thread and got my Titan yesterday so I've joined the forums to partake in the fun you're all having ;D!

Build isn't complete yet (waiting for Ryzen) but here is the best I've gotten in Time Spy so far: i5 3570K @ 4.7GHz, Titan X(P) @ +220 core/+900 memory. Gradually getting everything together for my cooling loop, so hopefully I'll be able to push a bit more out of it then!



http://www.3dmark.com/spy/1186840


----------



## DooRules

Looking good NemChem. These Titans are a beast.

Welcome to the forum.


----------



## mitcHELLspawn

Hey there guys. I've been going back and forth on whether I should pick one of these up. I've been running SLI GTX 1080s since early August (and SLI Titan X Hybrids before that), and being a gamer who plays pretty much all AAA releases, I have been beyond disappointed with SLI support from the beginning of 2016 until now. I've hit my limit with it, so I decided to sell one of my 1080s... but being the power addict that I am lol, it just isn't enough, so I've been thinking about selling the second one and going with one of these bad boys.

The only thing that has been holding me back is those damn 3584 CUDA cores. The fact that it's a cut-down card scares the hell out of me. I will be beyond mad if they release the 1080 Ti in a month or two with the fully unlocked GP102 chip with 3840 cores (remember the 780 Ti, anyone?).

I guess what I'm trying to say is... I'm just looking for someone to convince me one way or another!

Then another thing: the Titan X Pascal actually just went out of stock yesterday.. and I noticed a bunch of industry guys were at an event this week that is under embargo until next week. I'm hoping and praying it's the 1080 Ti, but I know in my heart it isn't lol.. still, I have not seen the Titan XP go out of stock in ages... weird.


----------



## NemChem

Quote:


> Originally Posted by *DooRules*
> 
> Looking good NemChem. These Titans are a beast.
> 
> Welcome to the forum.


Thanks DooRules; indeed they are!

mitcHELLspawn, I'm not sure about a 1080Ti with 3840 cores but I know there have been rumours that there will be a Titan Black V2 with 3840 cuda cores. I remain unconvinced - all the single chip Titans have had a TDP of 250W and the 3584 core Titan is hitting up against that immediately. When +20% power is set, overclocking quickly hits the new 300W power limit... with 3840 cores the clocks would have to come down to fit inside the power budgets and it would be a case of which ratio is better, 3584:224:96 or 3840:240:96? When overclocked that might mean the current Titan is faster for some workloads. I'd love to see a comparison between an overclocked Titan X(P) and an overclocked P6000 both hitting up against the 300W limit! This is of course all conjecture and I'm probably biased since I just picked up the current Titan and don't want it to become obsolete over night ;D!
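NemChem's ratio question can be roughed out with back-of-the-envelope math. Under the crude first-order assumption that power scales linearly with active cores times clock at a fixed voltage, here's a sketch of the trade-off; the function names and the 2000 MHz example clock are illustrative, not figures from the thread:

```python
import math

def clock_under_cap(ref_cores, ref_clock_mhz, new_cores):
    """Clock a wider chip could hold at the same power draw, assuming
    power ~ cores * clock at fixed voltage (a crude first-order model)."""
    return ref_clock_mhz * ref_cores / new_cores

def throughput(cores, clock_mhz):
    """Relative shader throughput ~ cores * clock."""
    return cores * clock_mhz

# If a 3584-core card holds 2000 MHz at the 300 W cap, a hypothetical
# 3840-core part would have to drop to roughly 1867 MHz under that cap:
wide_clock = clock_under_cap(3584, 2000, 3840)

# ...at which point both configurations land on essentially the same
# cores-times-clock product, i.e. the extra cores buy nothing once
# both parts are power-limited:
assert math.isclose(throughput(3584, 2000), throughput(3840, wide_clock))
```

In reality the voltage/frequency curve is non-linear (wider-but-slower can sit at a lower, more efficient voltage), so this only illustrates NemChem's point that a 3840-core part isn't automatically a big win inside the same 300W limit.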


----------



## NemChem

I think this is the highest I'm going to get until I get a new CPU or move from air to water! Changed from last run: i5 3570K @ 4.7 GHz -> 4747 MHz, RAM from 1866 9-13-13-34 -> 2020 10-11-11-31 and GPU from 220+ core / +900 mem -> +225 core / +925 mem.



http://www.3dmark.com/spy/1189785


----------



## Fredthehound

Honestly, I'd wait until Vega releases, as I doubt Nvidia will release anything like a Ti until then. Whether or not it's Quadro-lite will be determined by how much faster than a 1080 Vega is.

With that said, although I have maxed out my Titan XP in modded Fallout and Skyrim SSE and see dips under 60 FPS, I find it VERY hard to be disappointed, as my visuals are jacked through the roof. All the big mods and texture packs, the ENBs and weather mods... GLORIOUS. The Titan does more than I could have hoped for when I bought it day 1.

Which is not to say I won't be sobbing like a dumped schoolgirl if the full-boat Pascal is indeed released as a 1080 Ti. Somehow, though, I think I can manage to survive my disappointment considering the past several months of gaming bliss.


----------



## st0necold

call me crazy i'm holding off for the 1080ti..

I am not a fan of paying $1200 when something with less "VRAM" (who cares), which is otherwise the exact same thing, will do better in games for $600.


----------



## mitcHELLspawn

^^ Well, call me crazy, but I assume this is not the ideal thread to be sharing those sentiments. Unless you had the direct intention to start an argument, of course..

I actually have to disagree with the VRAM opinion as well.. to a point. As an SLI GTX 1080 user I have come up against that 8GB of VRAM in quite a few instances already in games at 4K, and I don't see that getting better in the future, only worse...

So yeah, I would definitely have to take a step back if the 1080 Ti ends up with only 8GB of VRAM. But hopefully it launches with the 10GB of G5X that was rumored.. then I would be much more comfortable choosing it over a Titan X Pascal.

I hope the Ti has the same memory bandwidth as the Titan as well, because they really handicapped the hell out of the 1080's G5X VRAM with that crappy 256-bit memory bus.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *st0necold*
> 
> call me crazy i'm holding off for the 1080ti..
> 
> I am not a fan of paying $1200 for something that's going to have less "vram" (who cares) and be the exact same thing and do better in games for $600.


That's great.


----------



## Fredthehound

Quote:


> Originally Posted by *mitcHELLspawn*
> 
> ^^ well, call me crazy, but I assume this is not the ideal thread to be sharing those sentiments. Unless you had the direct intention to start an argument of course..
> 
> I actually have to disagree with the Vram opinion as well.. to a point. As a SLI gtx 1080 user i have come up against that 8GB of Vram in quite a few instances already in games in 4k, and I don't see that getting better in the future, only worse...


I went to almost 9 in Fallout with the new texture pack and am constantly bordering on 8 in Skyrim SSE. That's with DSR at x2 and x3 respectively. I have little doubt full 4K would now push Fallout over 10.


----------



## mitcHELLspawn

Don't really know how you pull that off lol.. I'm not using DSR, I'm playing at true 4K, and in Fallout 4 with the new texture pack it barely uses 5GB of VRAM.


----------



## Fredthehound

ENB, weather mods, 2x MSAA, CBBE, high-res clothing in fully populated settlements filled with a ton of extra stuff... it adds up.


----------



## jhowell1030

Quote:


> Originally Posted by *Fredthehound*
> 
> I went to almost 9 in Fallout with the new texture pack and am constantly bordering on 8 in Skyrim SSE. Thats with DSR at X2 and X3 respectively. I have little doubr full 4K would now push Fallout over 10.


Honest to God... one of the reasons why I bought a Titan XP was to be able to mod Skyrim and Fallout again. I had a 980 KINGPIN and had Skyrim (original) looking absolutely stunning at 1080p @ 60Hz. Then I went and got an ultrawide and NOPE.

Fast forward to today... and I'm kind of having issues with both Skyrim SSE and Fallout 4 at either 3440x1440 or 2560x1440. Any place you recommend I can go to for pointers? (Or maybe you'd even be willing to share your mod list and game/Nvidia settings?) I just want it to look sexy AND smooth.

Please feel free to send them in a message. I don't want to hijack the thread with Bethesda talk if folks don't want to read it.


----------



## mitcHELLspawn

You also need to take into account that the more VRAM you have, the more you will use. This was shown really well by Digital Foundry back in the Shadow of Mordor high-res texture pack days: a 980 Ti would use around 5.5GB of VRAM whereas the Titan X Maxwell would use over 8GB, and they would deliver identical performance.


----------



## Fredthehound

It might be the ultrawide res giving you trouble, but I wouldn't swear to that. I just use the normal 'big' mods, nothing too esoteric. I run 1080p on a 65-inch TV with DSR at 2x or 3x. I ran 4x without ENBs, but ENBs are too much for even Lord Titan at 1080p with all the other graphical stuff.

Go into the ini and turn shadow resolution to 2048. That's a must. Turn off vsync; you won't have to worry about the physics bug from too-high framerates, trust me. Cap at 60 for a TV with RivaTuner.

In the control panel I use 2x MSAA and triple buffering. That's really about it for the setup. It looks beyond anything I ever accomplished in the original game, by orders of magnitude.

I also bought the TXP specifically for Skyrim and Fallout. And even at 1080p with DSR, they WILL eat every single cycle the GPU can put out and want more. I still use original Skyrim with vorpX for my Vive and moderate modding. Its 4-gig limit stops anything more, but the TXP is still struggling to keep it smooth in heavy areas like Dragonsreach. VorpX has high overhead and consumes RAM, so it can be dicey with mods beyond a certain point, even with all the crash fixes and memory mods.
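For anyone wanting to try that shadow tweak: the relevant key should sit in SkyrimPrefs.ini, though the section and name below are from memory, so double-check against your own file before editing:

```ini
[Display]
; 2048 is the value suggested above; 4096 looks slightly better but
; costs real framerate in heavy, mod-packed areas.
iShadowMapResolution=2048
```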


----------



## Lobotomite430

Just installed a full waterblock on my titan!
Bitspower GPU Waterblock for Nvidia Titan X Pascal, Clear Acrylic https://www.amazon.com/dp/B01N0ONJAV/ref=cm_sw_r_cp_apap_CvtHyzmLytCda


----------



## mitcHELLspawn

65 inch at 1080p? My eyes... my poor eyes lol. I don't even want to do the math on the ppi for that one. But to each their own I suppose !

I have a 75 inch 4k screen in my theater room but I don't touch it for gaming.

When I game I either use the Acer Predator X34, if it's a fast-paced multiplayer game or just a game I use keyboard and mouse with in general, because the 100Hz G-Sync is so damn nice with a mouse.

But for everything else, and games I use a controller with, I use a 43-inch 4K HDR Sony TV and sit about 4 feet away in a recliner. It's absolutely pixel perfect. Having such a high resolution on (by perspective, due to how close I am) such a big screen just makes games look stunning maxed out at 4K/60 FPS.


----------



## mitcHELLspawn

Quote:


> Originally Posted by *Lobotomite430*
> 
> Just installed a full waterblock on my titan!
> Bitspower GPU Waterblock for Nvidia Titan X Pascal, Clear Acrylic https://www.amazon.com/dp/B01N0ONJAV/ref=cm_sw_r_cp_apap_CvtHyzmLytCda


Awesome dude, looks fantastic. You should check out v1tech for a custom backplate; he really does some incredible work! I had him make me a sweet custom SLI bridge when I was running SLI Titan X Hybrids (Maxwell): it was a giant black-on-white X. Looked super cool.


----------



## Fredthehound

It looks fine. Never once had anyone ever say anything but WOW! when I had Skyrim up. I swear you ultra high end monitor guys are spoiled









Believe me, if I could run this mod setup at 4K, I'd be doing it. But I can't hold 60FPS even with a 7700k and 4266 ram. So 1080P it is.


----------



## Lobotomite430

It did come with a backplate but I wouldn't mind looking into a custom one. I will have to google them!


----------



## CptSpig

Quote:


> Originally Posted by *Lobotomite430*
> 
> It did come with a backplate but I wouldn't mind looking into a custom one. I will have to google them!


I just used the stock screws and backplate. I like the original look.








http://s1164.photobucket.com/user/CptSpig/media/Mobile Uploads/20161029_221954_zpszpkrc7de.jpg.html


----------



## jhowell1030

Quote:


> Originally Posted by *Fredthehound*
> 
> It looks fine. Never once had anyone ever say anything but WOW! when I had Skyrim up. I swear you ultra high end monitor guys are spoiled
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Believe me, if I could run this mod setup at 4K, I'd be doing it. But I can't hold 60FPS even with a 7700k and 4266 ram. So 1080P it is.


Kinda makes me want to try streaming it to my 55-inch TV @ 1080p with my Steam Link and see how it works with mods from there. At least I'll be able to game in my Lovesac that way!


----------



## xTesla1856

Quote:


> Originally Posted by *CptSpig*
> 
> I just used the stock screws and backplate. I like the original look.
> 
> 
> 
> 
> 
> 
> 
> 
> http://s1164.photobucket.com/user/CptSpig/media/Mobile Uploads/20161029_221954_zpszpkrc7de.jpg.html


I just ordered the prefilled Predator block for the TX about 20 minutes ago. Should be here on Tuesday; I'm curious to see the clock speeds I can achieve under water.


----------



## Lobotomite430

I don't think I could have used the stock screws with this block. It came with beefier screws, but I didn't really pay attention as I just wanted to use their kit. What kind of temps do you get with yours? Mine was hitting 40c last night and I kind of expected it to be lower.


----------



## CptSpig

Quote:


> Originally Posted by *Lobotomite430*
> 
> I don't think I could have used the stock screws with this block. It came with beefier screws, but I didn't really pay attention as I just wanted to use their kit. What kind of temps do you get with yours? Mine was hitting 40c last night and I kind of expected it to be lower.


Absolute max with +230 on the core and +575 memory OC is 44c. Idle is 24c, and that is with a CPU block also in the loop. Ambient temp is 22c. Very happy.


----------



## CptSpig

Quote:


> Originally Posted by *xTesla1856*
> 
> I just ordered the prefilled Predator block for the TX about 20 minutes ago. Should be here on Tuesday; I'm curious to see the clock speeds I can achieve under water.


Best I can do in Futuremark is +227 core and +675 memory. Very happy!


----------



## stocksux

Nice looking block! Just got my block in the mail yesterday. Haven't had a chance to install it yet though.


----------



## Gunslinger.

Is there a consensus on what the best full coverage water block available is for these cards?

I've historically always used EK but am itching to try something different this time.


----------



## Lobotomite430

Hmmm, I think my temps are a tad high then; I'm at 40c with an OC at an ambient of 15c.


----------



## stocksux

Quote:


> Originally Posted by *Gunslinger.*
> 
> Is there a consensus on what the best full coverage water block available is for these cards?
> 
> I've historically always used EK but am itching to try something different this time.


They are all about the same as far as performance goes. Choose something that fits your build and looks great in your case!


----------



## Gunslinger.

Quote:


> Originally Posted by *stocksux*
> 
> They are all about the same as far as performance goes. Choose something that fits your build and looks great in your case!


No case, open bench station FTW


----------



## CptSpig

Quote:


> Originally Posted by *Gunslinger.*
> 
> Is there a consensus on what the best full coverage water block available is for these cards?
> 
> I've historically always used EK but am itching to try something different this time.


I really like my EK full cover block! I was able to use the original screws and backplate.


----------



## NemChem

Quote:


> Originally Posted by *Gunslinger.*
> 
> Is there a consensus on what the best full coverage water block available is for these cards?
> 
> I've historically always used EK but am itching to try something different this time.


I took the plunge with an Aquacomputer block and their active cooling backplate for the VRMs. The only review I could find for a power-hungry card was at xtremerigs, where they tested it on a 290. It seemed to make a sizeable difference, and I read (I think it was somewhere earlier in this thread) that the Titan X(P) doesn't have the best VRMs, so I want to give them as much love as possible!


----------



## jhowell1030

Quote:


> Originally Posted by *Lobotomite430*
> 
> hmmm I think my temps are a tad high then, Im at 40c with OC with an ambient of 15c


Wow, I wish my wife would let me keep the place that cool!


----------



## Lobotomite430

To be fair, my computers and I are in the basement; she's on the main floor where it's a good 10 degrees warmer.


----------



## xTesla1856

Quote:


> Originally Posted by *Gunslinger.*
> 
> Is there a consensus on what the best full coverage water block available is for these cards?
> 
> I've historically always used EK but am itching to try something different this time.


The Heatkiller block is your best alternative. If I were building a custom loop, I'd go full Heatkiller. But the EK Predator is soooooooo convenient


----------



## Fredthehound

Worth at least trying, I think. Might hate it, might be OK for you. At least more comfortable.


----------



## kx11

Do you guys think the temps are alright with these settings?

Running watercooled here in a 21°C room with fans at 100%.

The game is the For Honor open beta with DSR 2.00x (3620x2036) at highest settings, no AA, no vsync.


----------



## stocksux

Quote:


> Originally Posted by *kx11*
> 
> do you guys think the temps are alright with these settings ?? running watercooled operation here in a 21c room temp w/fans running 100% the game is For Honor open beta with DSR 2x00 (3620x2036) on @ highest settings , no AA no vsync


What are your temps with the game in the foreground and not the background? Clock speeds are way low, as Afterburner is showing in the foreground in your pic. Temps will differ greatly based on clock speed.


----------



## CptSpig

Quote:


> Originally Posted by *kx11*
> 
> do you guys think the temps are alright with these settings ??
> 
> running watercooled operation here in a 21c room temp w/fans running 100%
> 
> the game is For Honor open beta with DSR 2x00 (3620x2036) on @ highest settings , no AA no vsync


I would disable the core voltage slider; these cards do not like extra voltage, and your temps will come down slightly. If those are in-game temps, they look good. On my Titan XP with an EK full cover block at 22°C ambient, idle is 24°C. Max for in-game or benchmarking is 44°C.


----------



## ChronoBodi

Quote:


> Originally Posted by *NemChem*
> 
> Thanks DooRules; indeed they are!
> 
> mitcHELLspawn, I'm not sure about a 1080Ti with 3840 cores but I know there have been rumours that there will be a Titan Black V2 with 3840 cuda cores. I remain unconvinced - all the single chip Titans have had a TDP of 250W and the 3584 core Titan is hitting up against that immediately. When +20% power is set, overclocking quickly hits the new 300W power limit... with 3840 cores the clocks would have to come down to fit inside the power budgets and it would be a case of which ratio is better, 3584:224:96 or 3840:240:96? When overclocked that might mean the current Titan is faster for some workloads. I'd love to see a comparison between an overclocked Titan X(P) and an overclocked P6000 both hitting up against the 300W limit! This is of course all conjecture and I'm probably biased since I just picked up the current Titan and don't want it to become obsolete over night ;D!


Jeez. No GPU lasts forever. I bought the TXP knowing it's cut down by 256 cores. The difference between 3584 and 3840 cores would be no greater than 6-8%.

The law of diminishing returns strikes.

Add an extra 256 cores to a 256-core card, yeah, 100% difference.

But add 256 cores to 3584, not so much.

Forget the idea that a single GPU will be king forever. I give the TXP 9 months before something is better, but not like "OMG, replace the TXP now!"

3-4 years before we see something 2x faster than the TXP.
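The diminishing-returns arithmetic is easy to check. A quick sketch, assuming throughput scales perfectly with core count at equal clocks (an optimistic upper bound that real games never reach):

```python
# Best-case speedup of a hypothetical full 3840-core GP102 over the
# 3584-core Titan X (Pascal), assuming equal clocks and perfect scaling
# with CUDA core count (real workloads scale worse than this).
cores_txp = 3584
cores_full = 3840

uplift = cores_full / cores_txp - 1
print(f"Best-case uplift from the extra 256 cores: {uplift:.1%}")

# Versus adding the same 256 cores to a 256-core card:
print(f"256 extra cores on a 256-core card: {(512 / 256 - 1):.0%}")
```

That is roughly 7% best case for the full die, before any clock regression from the shared 250 W TDP eats into it.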


----------



## kx11

Quote:


> Originally Posted by *stocksux*
> 
> What are your temps with the game in the foreground and not the background? Clock speeds are way low as adterburner is showing in the foreground in your pic. Temps will differ greatly based on clock speed.


Sorry, I forgot to mention: the OSD in the background is from a video recorded while playing, while MSI AB was showing the current state without gaming.


----------



## kx11

Quote:


> Originally Posted by *CptSpig*
> 
> I would disable the core voltage slider these cards do not like voltage and your temps will come down slightly. If those are in game temps they look good. My Titan XP with EK full cover block ambient temperature 22c idle is 24c. Max for in game or benchmarking is 44c.


your temps are really good


----------



## TheFallenDeity

I took the plunge yesterday and will be getting my TITAN X Pascal on Tuesday. I had a GTX 1080 and while powerful, I needed more "oomph" to drive 1440p/165Hz G-Sync. Pretty excited about this. It's my first TITAN card and I am excited to see what this puppy can do.


----------



## stocksux

Quote:


> Originally Posted by *TheFallenDeity*
> 
> I took the plunge yesterday and will be getting my TITAN X Pascal on Tuesday. I had a GTX 1080 and while powerful, I needed more "oomph" to drive 1440p/165Hz G-Sync. Pretty excited about this. It's my first TITAN card and I am excited to see what this puppy can do.


Welcome to the club!


----------



## kx11

Quote:


> Originally Posted by *TheFallenDeity*
> 
> I took the plunge yesterday and will be getting my TITAN X Pascal on Tuesday. I had a GTX 1080 and while powerful, I needed more "oomph" to drive 1440p/165Hz G-Sync. Pretty excited about this. It's my first TITAN card and I am excited to see what this puppy can do.


I have a G-Sync 1440p 165Hz monitor, but all I get in NVCP is 144Hz. Any idea how to enable it?


----------



## ottoore

Quote:


> Originally Posted by *kx11*
> 
> i have a gsync 1440p 165hz monitor but all i got in NVCP is 144hz , any idea how to enable it ?


" _The overclocking feature can be quickly enabled via the OSD menu as shown above_"

http://www.tftcentral.co.uk/reviews/asus_rog_swift_pg279q.htm


----------



## NapalmV5

Sup guys! How's everyone doing? Hey, any word/rumors on when a full 3840cc Titan XP Black edition will arrive? End of February/March, possibly? Or will it take a year, just like Kepler took to release the full Black edition in August? The GP100 3584cc/16GB HBM2 Quadro was just paper launched to be available in March, and still doesn't offer the full 3840cc. Sure, they have been offering a very limited full-GP102 Quadro, but I'm not too crazy about spending $20k+ on GPUs. Oh, and anything on the 1080 Ti? Nothing since CES in January.


----------



## stocksux

I don't think you'll hear anything until AMD shows their hand, really. Why would Nvidia show off any more product when they're the only one in the game at the moment? If they've got bigger and better, they'll hold it to one-up AMD and show they are still king.


----------



## ChronoBodi

Hmm, yeah.

If AMD Vega kills the GTX 1080, unleash the 1080 Ti with 3328 cores.

If AMD Vega really, really knocks it out of the park, a desperate 3840-core OCed edition of GP102.

If AMD beats anything GP102 can do, they have to make a 600mm² 16nm chip that's all FP32, and put in like 5,120 cores. And that would be awesome.

The Titan XP isn't even the full 600mm² potential of the 16nm node to begin with, albeit it's all FP32-focused, like Titan X Maxwell/980 Ti was for the 28nm node.


----------



## krizby

With Pascal still leading in efficiency, Vega needs to be a 250W TDP card just to compete with the 1080, or a 350W card to compete with a 1080 Ti. With some process improvement, Nvidia can pretty much give Pascal a clock boost and enjoy another year of glory before unleashing Volta. I hope this will not turn into another 8800GTX era, when Nvidia was just rehashing cores but still came out on top :/


----------



## hertz9753

I remember the 8800GTX came after the 7900s. The 8800GT, GTS 250 and GTS 450 were the rehashes in my memory, but I'm old.


----------



## NemChem

So I just took my Titan apart and I have to say... the cooling solution is a work of engineering art.

Edit: Not a peltier... the LED strip! Doh!

Loads of screws so it took a while, but once you get it apart, you see the heat sink has a phase change element at the base to spread the heat from the die across all of the block extremely quickly and then up against the side of the block is a long, thin peltier (I think, not sure what else it could be!) that runs along the side of, and cools the heat sink whilst the hot side dissipates to the metal shroud. When I first got my Titan I thought "oh wow, metal" when I first held it as it was lovely and cool to the touch. I thought it was just cosmetic for a quality feel, but seems it serves a practical purpose too!

Took some pictures which I'll put up once I've got it all back together.

Edit: I realise lots of you have personally taken your Titans apart to attach waterblocks, so this won't be news, but for everyone else I thought it might be interesting!

Edit 2: Picture time!
Please excuse the hairy carpet, our pug is molting (I think she's confused... molting in February?!)

Phase change base of the heatsink:


The not-a-peltier-LED-strip... I still think a peltier would be a cool idea:


Pre-mod:


Our weapons:


Sooo tiny!


The glue is terrible but at least it keeps them in place for now...


Nerves of steel for the conductive paint!


Looking good on this side too!


Success! Sort of... graphics score up by 1.5% (+225/+935)!

http://www.3dmark.com/spy/1207034

Power usage now reads 110%-115%, so it worked a bit. Over on the [H]ardOCP thread there were thoughts that conductive paint would be too resistive to work, but it looks like it works enough for a small boost: my boost clock is much more stable and only slowly goes down over time due to rising temps (I guess?). Afterburner says it is limited by voltage now instead of power during a Time Spy run.
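For anyone wondering why painting the shunt changes the *reported* power, here's a minimal sketch of the electronics. The resistor values are illustrative assumptions, not the TXP's actual shunt specs:

```python
# The card estimates current from the voltage drop across a tiny shunt
# resistor. Adding a conductive path (paint / liquid metal) in parallel
# lowers the effective resistance, so the same real current produces a
# smaller drop -- and the firmware, still assuming the stock shunt value,
# under-reports current and power. All values here are assumptions.

def parallel(r1: float, r2: float) -> float:
    """Effective resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

R_STOCK = 0.005                    # assumed 5 mOhm stock shunt
R_MOD = parallel(R_STOCK, 0.010)   # assumed ~10 mOhm paint path in parallel

actual_amps = 20.0                             # example real current
reported_amps = actual_amps * R_MOD / R_STOCK  # what the firmware "sees"

print(f"Real {actual_amps:.0f} A reads as {reported_amps:.1f} A "
      f"({reported_amps / actual_amps:.0%} of true draw)")
```

With these example numbers the card under-reads by about a third, which lines up with the post-mod behaviour described in the thread: the same real load that used to read 120% now reads around 100-110%.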


----------



## ChronoBodi

Um, just saying, I've seen peak usage of 129% TDP in GPU-Z on my OCed TXP without any hard mods of the sort...

Orrrr... is that on stock? Does doing what you did basically increase the power limit?


----------



## xTesla1856

Quote:


> Originally Posted by *ChronoBodi*
> 
> Um, just saying i've seen peak usage of 129% TDP in GPU-Z on my OCed TXP without any hardmods of the sorts...
> 
> Orrrr.... is this on stock, doing what you did increases the power limit basically?


Yup, I've seen spikes up into the 130s with my card. But that was with power, voltage, temp limit and fan speed all cranked to 11. +800 mem and +220 core will do that.


----------



## xTesla1856

Quote:


> Originally Posted by *NapalmV5*
> 
> sup guys! hows everyone doing? hey any word/rumors on when a full 3840cc titan xp black edition will arrive? end of february/march possibility? or will it take a year just like kepler to release the full black edition in august? gp100 3584cc/16gb hbm2 quadro was just paper released to be available in march and still not offering full 3840cc.. sure they have been offering very limited full gp102 quadro not too crazy about spending 20k+ on gpus.. oh and anything on 1080ti? nothing since ces in january


We might not see a Black Edition at all. Depending on how good Vega is, nVidia _might_ release a 1080Ti to combat AMD. But from all I've seen of Vega so far (Yeah I know, unfinished drivers, bad cooling etc), I wouldn't bet my money on it. Vega (in my mind) will be a 1080 competitor at launch.


----------



## pez

I'm just ready for EVGA to release a Hybrid AIO for the TXP. I'd love to keep mine, but noise is pretty much keeping me from this ATM.


----------



## NemChem

Quote:


> Originally Posted by *ChronoBodi*
> 
> Um, just saying i've seen peak usage of 129% TDP in GPU-Z on my OCed TXP without any hardmods of the sorts...
> 
> Orrrr.... is this on stock, doing what you did increases the power limit basically?


On the stock cooler for now. Yes, the power limit has been increased by tricking the reporting; say before it would report it was at 120%, now it thinks it's at ~100-110%. Whilst I was doing some Time Spy runs, I was reading the back of the conductive paint packet and it said "becomes more conductive as it dries completely". The reported power is indeed lower now than immediately after the mod: it was still going up to 120% on the second graphics test, and now it only reaches 110%. The boost speed graph is much flatter now, no little fluctuations! Before, Afterburner reported that it was limited by power all the time; now it is voltage the majority of the time. Since a higher voltage is needed to achieve, say, 2100MHz at 80°C than at 40°C, I am hopeful that when my waterblock arrives I'll see slightly higher boosts, and even if I don't, it will be silent instead of 100% fan noise (not that I game at those frequencies and fan speed, but still)!


----------



## aylan1196

I've done the shunt mod today; PL dropped from 120 to the mid 80s.
So satisfied with the clocks, stable with no more fluctuation... but still voltage limited and temp limited. The card holds 2076 on the core and +500 memory, stable in all games so far. It's water cooled and the temps are mid 40s to mid 50s. I love the TXP even more now.


----------



## NemChem

Quote:


> Originally Posted by *aylan1196*
> 
> I've done the shunt mod today pl dropped from 120 to mid 80s ?
> So satisfied with clocks stable no more fluctuate... but still voltage limit and temp limit the card holds 2076 on core and 500 memory stable on all games so far it's water cooled and the temps in mid 40s to mid 50s I love the txp more now


Awesome! Did you solder the resistors or use conductive paint? I'm thinking about soldering them now, as I think the paint has come away from the contacts as it dried completely: the power limit went down initially and then improved a bit as the paint dried, but after running the card for a day or two it seems it's back to pre-mod!

Do you have a backplate to accompany your waterblock? I couldn't find a review of Titan X (Pascal) blocks but if the temps from an R9 290 are anything to go by the VRMs can get extremely hot without a backplate so I'd be careful pushing the card too hard to avoid a similar situation.


----------



## aylan1196

Quote:


> Originally Posted by *NemChem*
> 
> Awesome! Did you solder the resistors or use conductive paint? I'm thinking about soldering them now as I think the paint has come away from the contacts as it dried completely - the power limit went down initially and then improved a bit as the paint dried but after running the card for a day or two it seems it's back to pre-mod!
> 
> Do you have a backplate to accompany your waterblock? I couldn't find a review of Titan X (Pascal) blocks but if the temps from an R9 290 are anything to go by the VRMs can get extremely hot without a backplate so I'd be careful pushing the card too hard to avoid a similar situation.


Hi, I am using the Thermal Grizzly liquid, so far so good, but if what you're describing eventually happens, I think I'll solder them down the road. I'll watch the PL this week and report back.
I am using a Swiftech WB with backplate and it's good.


----------



## TheFallenDeity

Does anyone know the thickness of Nvidia's factory green thermal pads for the VRM and VRAM? I want to buy some Fujipoly pads to replace them. Thanks!

Edit: I'll be adding an AIO to my TXP.


----------



## xTesla1856

I just received a little package from Slovenia...


----------



## NemChem

Quote:


> Originally Posted by *xTesla1856*
> 
> I just received a little package from Slovenia...










Me too! But mine was a Supremacy EVO in preparation for Ryzen... still waiting on my Titan XP waterblock.


----------



## TheFallenDeity

Would adding thermal pads to the back of the TXP's PCB over the VRM (and under the backplate) area help with extra heat dissipation? Or is it not necessary?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *TheFallenDeity*
> 
> Would adding thermal pads to the back of the TXP's PCB over the VRM (and under the backplate) area help with extra heat dissipation? Or is it not necessary?


Won't help. The card is voltage starved as is. No stress on the VRMs.


----------



## TheFallenDeity

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Wont help. The card is voltage straved as is. No stress on vrms.


Thank you.


----------



## EniGma1987

Quote:


> Originally Posted by *NemChem*
> 
> The not-a-peltier-LED-strip... I still think a peltier would be a cool idea:


Nvidia would have to use a 300W peltier for the card to run stock, and we would not be able to OC the card. We would need a 500+ watt peltier to support overclocking. No way anyone would launch a card that consumes 700+ watts at stock just to have an exotic cooling solution like that. If Nvidia put only a 200W peltier on, it would act as an insulator and temps would skyrocket. So yeah, way too much power draw to do such a thing.
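A rough sketch of why those numbers balloon: a TEC's hot side has to shed the pumped heat plus the TEC's own electrical input. The COP below is an assumed figure; real modules at a useful temperature delta often manage well under 1.

```python
# Rough TEC (peltier) energy balance for a 250 W GPU.
# Hot-side heat = pumped heat + electrical input, where
# electrical input = pumped heat / COP. COP here is an assumption.
def tec_budget(gpu_heat_w: float, cop: float):
    tec_input = gpu_heat_w / cop       # electrical power the TEC draws
    hot_side = gpu_heat_w + tec_input  # heat the cooler must reject
    return tec_input, hot_side

tec_in, hot = tec_budget(250.0, cop=0.8)
print(f"TEC input: {tec_in:.1f} W; hot side to reject: {hot:.1f} W; "
      f"total wall draw: {250 + tec_in:.1f} W")
```

Even at this fairly generous COP, the shroud would have to dump well over twice the GPU's own heat, which is why the 700W+ figure above isn't far-fetched.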

Quote:


> Originally Posted by *TheFallenDeity*
> 
> Would adding thermal pads to the back of the TXP's PCB over the VRM (and under the backplate) area help with extra heat dissipation? Or is it not necessary?


The VRMs actually don't run *that* hot. The memory VRM is the hottest and it is still only 70-80°C. These VRMs are rated for over 100°C (just think of 100°C as the limit, though) and are actually pretty special. They do not have very high current limits, but they do not derate with temperature, so running them really cool does not give more headroom than running them really hot. Extra cooling is not needed for the VRM at all. I have never seen a VRM rated like these before: the current limit is actually total-package related rather than a MOSFET limit in the VRM, which is why it doesn't derate at all. The MOSFETs are capable of way higher current, but the package limits total current capability.


----------



## kx11

Do you guys think the frametime/latency is OK in this one?!

vsync is enabled btw


----------



## TheFallenDeity

Quote:


> Originally Posted by *EniGma1987*
> 
> Nvidia would have to have a 300w peltier for the card to run stock and we would not be able to OC the card. We would have to have a 500+ watt peltier to support overclocking. No way anyone would launch a card that consumes 700+ watts at stock just to have an exotic cooling solution like that. If nvidia only put even a 200w peltier on it would act as an insulator and temps would skyrocket. So ya, way too much power draw to do such a thing.
> The VRMs actually dont run *that* hot. The memory VRM is the hottest and it is still only 70-80c. These VRMs are rated for about over 100c (just think 100c being the limit though) and are actually pretty special. They do not have very high current limits, but they do not derate with temp. So running them really cool does not give more headroom than running them really hot. So extra cooling is not needed for the VRM at all. I have never seen a VRM rated like these before. The current limit is actually total package related, rather than a mosfet limit in the VRM. So that is the reason it doesnt de-rate at all. Mosfets are capable of way higher current, but the package limits total current capability.


Thank you guys, you guys are awesome! I'm new here and just trying to find my way around right now.

I'll be doing some benchmarks this weekend with my new card. Would love to do them now but I unfortunately have a job. Haha.


----------



## xTesla1856

I did some testing last night after I installed the waterblock, and I have to say I'm amazed by what my Titan can do: when it's still cold right after booting, it hits 2152MHz before it downclocks twice and settles at 2126MHz, where it stays the whole time in game. I think this is a great result for any Pascal GPU, let alone a Titan X. Next step is to see what the memory can do with the core that high. However, I'm a little disappointed that despite the voltage slider at +100%, Afterburner is permanently showing the voltage limit at "1". Card temp hits 49°C maximum with a single EK Predator 360 and a 6800K with 1.35V in the loop. All the while my rig is completely silent.


----------



## xarot

Hi, does anyone have an idea if this new Asus HB bridge will be compatible with EK fullcover blocks in SLI?

https://rog.asus.com/articles/gaming-graphics-cards/republic-of-gamers-announces-rog-hb-sli-bridge/


----------



## DooRules

Quote:


> Originally Posted by *xTesla1856*
> 
> I did some testing last night after I installed the waterblock and I have to say I'm amazed by what my Titan can do: When it's still cold right after booting, it hits 2152MHz before it downclocks 2 times and settles at 2126MHz where it stays all the time in game. I think this is a great result for any Pascal GPU, let alone a Titan X. Next step is to see what the memory can do with the core that high. However I'm a little disappointed that despite the Voltage slider at +100% AfterBurner is permamnently showing the Voltage Limited at "1". Card temp hits 49°C maximum with a single EK Predator 360 and a 6800K with 1.35V flowing through it. All the while my rig is completely silent.


Max voltage for the Titan is 1.093V.


----------



## Jpmboy

Quote:


> Originally Posted by *xarot*
> 
> Hi, does anyone have an idea if this new Asus HB bridge will be compatible with EK fullcover blocks in SLI?
> 
> https://rog.asus.com/articles/gaming-graphics-cards/republic-of-gamers-announces-rog-hb-sli-bridge/


it was the decorative "pointy ends" on the Nvidia HB bridge that would not fit with an EK block (tho I just cut mine down with a dremel tool). The ASUS bridge looks to address this problem.


----------



## Dagamus NM

Quote:


> Originally Posted by *Jpmboy*
> 
> it was the decorative "pointy ends" on the Nvidia HB bridge that would not fit with an EK block (tho I just cut mine down with a dremel tool). The ASUS bridge looks to address this problem.


I did the same, though I just removed the cover and trimmed the PCB with shears. It is an unattractive PCB, but with my SM8 set up in reverse it sits behind the metal part of the door, so I don't really care.


----------



## jsutter71

I just use the EVGA bridge


----------



## Kommando Kodiak

I signed my card in over at the titan x owners club, whoops.


----------



## ottoore

Quote:


> Originally Posted by *kx11*
> 
> do you guys think the frametime/latency is ok in this one ?!
> 
> vsync is enabled btw


1000/60 = 16.7ms.
If you can render 60fps, your frametime will always be ~16.7ms with vsync.
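The same arithmetic for the other refresh rates people in this thread run, as a quick sketch:

```python
# Frametime is just the reciprocal of the frame rate, in milliseconds.
def frametime_ms(fps: float) -> float:
    return 1000.0 / fps

for hz in (60, 120, 144, 165):
    print(f"{hz:3d} fps -> {frametime_ms(hz):5.2f} ms per frame")
```

So a flat 16.7 ms line with vsync at 60Hz is exactly what you want to see; spikes above it are the dropped frames.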


----------



## NemChem

Quote:


> Originally Posted by *xTesla1856*
> 
> When it's still cold right after booting, it hits 2152MHz before it downclocks 2 times and settles at 2126MHz where it stays all the time in game.


Awesome! Looking forward to getting my waterblock (still overdue at Aquatuning); mine boosts to about the same at the start on air before downclocking, so I'm hopeful to get results like yours on water.


----------



## NemChem

Edit: Double post my bad!


----------



## EniGma1987

Quote:


> Originally Posted by *NemChem*
> 
> Awesome! Looking forward to getting my waterblock (still overdue at Aquatuning), mine boosts to about the same at the start on air before downclocking so hopeful to get results like yours on water.


You could always cancel that order and buy a better block from Watercool. They have fairly fast shipping and are in stock now:
http://shop.watercool.de/epages/WatercooleK.sf/en_GB/?ObjectPath=/Shops/WatercooleK/Categories/Wasserkühler/GPU_Kuehler/"TITAN%20X%20%28Pascal%29"

ModMyMods has their blocks with 7% off right now for Presidents Day, though they can sometimes be a little slow:
https://modmymods.com/gpu-blocks/gtx-titan-x-pascal.html


----------



## DNMock

So has anyone benched their TXP at x8/x8 versus x16/x16 and can tell me how close it is to saturating the x8/x8 setup?

Thinking long and hard about retiring the 5930K for a Ryzen chip if they are all that and a bag of chips, but with so few PCIe lanes available, and entire platforms being upgraded far less often than GPUs for me, I'm worried that Titan X Voltas in SLI or the 2nd set of Vega cards (assuming they can compete... nevermind) might need x16 or PCIe 4.0 to run at full speed.


----------



## pez

Quote:


> Originally Posted by *DNMock*
> 
> So has anyone benched their TXP on 8x by 8x versus 16x by 16x and can tell me how close it is to saturating the 8x by 8x set-up?
> 
> Thinking long and hard about retiring the 5930K for a Ryzen chip if they are all that and a bag a chips, but with so few PCIE lanes available and entire platforms being upgraded far less often than GPU's for me, I'm worried that Titan X Volta's in sli or the 2nd set of Vega cards (assuming they can compete... nevermind) might need 16x or pcie 4.0 to run full speed.


There was a video where a guy did this with 980 Tis, I believe, and there was a difference in some titles. What resolution are you running?


----------



## DNMock

Quote:


> Originally Posted by *pez*
> 
> There was a video where I guy did this with 980Tis i believe and there was a difference in some titles. What resolution are you running?


3440x1440 currently, and hopefully 4K 120Hz in the near future.

Edit - Found my own answer:

https://www.pugetsystems.com/labs/articles/Titan-X-Performance-PCI-E-3-0-x8-vs-x16-851/

Looks like the TXP is indeed the point where PCIe 3.0 x8 begins to get saturated at high resolutions.
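A back-of-envelope on what those link widths mean, using ~985 MB/s of usable bandwidth per PCIe 3.0 lane (approximate, after 128b/130b encoding). The uncompressed-frame example is purely illustrative; real game traffic is mostly textures and geometry, not whole frames:

```python
# Approximate usable PCIe 3.0 bandwidth per lane, in MB/s (post-encoding).
MB_PER_LANE = 985

def link_gbs(lanes: int) -> float:
    """Aggregate usable link bandwidth in GB/s."""
    return lanes * MB_PER_LANE / 1000.0

# Illustrative worst case: shipping whole uncompressed 4K RGBA8 frames
# across the link every refresh.
frame_mb = 3840 * 2160 * 4 / 1e6  # ~33 MB per frame

for lanes in (8, 16):
    bw = link_gbs(lanes)
    print(f"x{lanes:<2}: {bw:.2f} GB/s, ~{bw * 1000 / frame_mb:.0f} "
          f"uncompressed 4K frames/s ceiling")
```

Raw frame traffic alone doesn't saturate x8, which is why the measurable x8 vs x16 gap in the Puget article only shows up when asset streaming and inter-GPU transfers pile on top at high resolutions.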


----------



## NemChem

Quote:


> Originally Posted by *EniGma1987*
> 
> You could always cancel that order and buy a better block from Watercool. They have fairly fast shipping and are in stock now:
> http://shop.watercool.de/epages/WatercooleK.sf/en_GB/?ObjectPath=/Shops/WatercooleK/Categories/Wasserkühler/GPU_Kuehler/"TITAN%20X%20%28Pascal%29"
> 
> ModMyMods has their blocks with 7% off right now for Presidents Day, though they can sometimes be a little slow:
> https://modmymods.com/gpu-blocks/gtx-titan-x-pascal.html


Thanks for the links; the Watercool block was the other one I was considering. I think I'll wait a little longer, but if the Aquacomputer one doesn't come back in stock soon I'll cancel and go for a Watercool block. The reason I went for the Aquacomputer block was the direct block-to-VRAM contact rather than thermal pads. From the reviews I've seen, Aquacomputer are a little optimistic about their tolerances and the VRAM chips aren't always getting perfect contact, but with a little experimentation on the thermal paste front, good contact can be made. The machine learning work I'm doing really loves memory bandwidth, so every MHz higher on a stable memory OC is very welcome.
Quote:


> Originally Posted by *DNMock*
> 
> 3440x1440 currently, and hopefully 4k 120hz in the near future.
> 
> Edit - Found my own answer:
> 
> https://www.pugetsystems.com/labs/articles/Titan-X-Performance-PCI-E-3-0-x8-vs-x16-851/
> 
> Looks like TXP is indeed the point where PCIE 3.0 x8 begins to get fully saturated on high resolutions.


4K and 120Hz... drool. That's the dream, isn't it! I've got the P2715Q and it is lovely now that I've got the graphical might for it, but 120Hz would be wonderful. It seems with Brexit weakening the pound a P2715Q costs significantly more now so, if I decide to sell it, the resale value might be quite close to what I paid for it! Then again, the new 120Hz 4K monitor replacing it will presumably cost more than it would have done.


----------



## jhowell1030

Quote:


> Originally Posted by *NemChem*
> 
> 4k and 120hz - drool. That's the dream isn't it! I've got the P2715Q and it is lovely now I've got the graphical might for it, but 120hz would be wonderful. It seems with Brexit weakening the pound a P2715Q costs significantly more now so, if I decide to sell it, the resale value might be quite close to what I paid for it! Then again, the new 120hz 4k monitor replacing it will presumably cost more than it would have done.


If/When the day comes that a single card can pull off this feat...lord have mercy on my wallet for it will be ravaged.


----------



## Lobotomite430

Quote:


> Originally Posted by *jhowell1030*
> 
> If/When the day comes that a single card can pull off this feat...lord have mercy on my wallet for it will be ravaged.


Then the 8k monitors come out


----------



## jhowell1030

Quote:


> Originally Posted by *Lobotomite430*
> 
> Then the 8k monitors come out


You know...I can honestly say that I'm content with 21:9 @ 1440 especially if I can get +60 FPS. Of course, that'll mean that there will be 21:9 @ 4k by then.


----------



## DNMock

Quote:


> Originally Posted by *jhowell1030*
> 
> You know...I can honestly say that I'm content with 21:9 @ 1440 especially if I can get +60 FPS. Of course, that'll mean that there will be 21:9 @ 4k by then.


Yeah, what I really want to get ahold of is 21:9 @ 2160 and 100+ Hz, but I doubt we'll see one for at least a couple more years.


----------



## pez

Quote:


> Originally Posted by *DNMock*
> 
> 3440x1440 currently, and hopefully 4k 120hz in the near future.
> 
> Edit - Found my own answer:
> 
> https://www.pugetsystems.com/labs/articles/Titan-X-Performance-PCI-E-3-0-x8-vs-x16-851/
> 
> Looks like TXP is indeed the point where PCIE 3.0 x8 begins to get fully saturated on high resolutions.


Yeah, with a single Titan the 3440x1440 res is a great match IMO.

I'm currently only using the 1080, as the noise profile got to me with the Titan (small system with the GPU being the loudest component).


----------



## auraofjason

Is it just me, or is anyone else's Titan XP not downclocking properly when using drivers past 376.33? With any driver beyond that one, my Titan XP idles at around 700MHz; with 376.33 it idles at around 130MHz.

I'm running 1440p/165Hz, which might be a factor too.


----------



## xTesla1856

Quote:


> Originally Posted by *auraofjason*
> 
> Is it just me or is anyone else's titan xp not downclocking properly when using drivers past 376.33? Any driver I use beyond that one my titan xp idles at around 700mhz, with 376.33 it idles at around 130mhz.
> 
> I'm running 1440p165hz which might be a factor too.


Did you check power settings in Control Panel already?


----------



## auraofjason

Quote:


> Originally Posted by *xTesla1856*
> 
> Did you check power settings in Control Panel already?


Yeah, same settings on 376.33 and any driver after it.


----------



## jhowell1030

Anyone getting Ryzen? I have the 1800X in my cart for preorder. Can't decide if it's worth upgrading from my 5820K.


----------



## EniGma1987

Quote:


> Originally Posted by *jhowell1030*
> 
> Anyone getting Ryzen? I have the 1800X in my cart for preorder. Can't decide if it's worth upgrading from my 5820K.


I'd get it just for fun. If it turns out better, then great! If not, you have a new toy for a while, and then you can sell it to someone without too much loss.
Well, the 1800X might be a bit of a loss with a $500 retail price paid. I bet the 1700X will be the best deal all around and will overclock about the same anyway.


----------



## NemChem

Quote:


> Originally Posted by *jhowell1030*
> 
> Anyone getting Ryzen? I have the 1800X in my cart for preorder. Can't decide if it's worth upgrading from my 5820K.


Did a pre-order for a 1800X and a 1700. Waiting to see the reviews, and then I'll cancel one of them - hoping the 1700 will OC just as well as the 1800X. At 65W, my thinking was that perhaps those and the 1800X chips would be similarly binned, with the worst chips going to the 1700X (Goldilocks effect). Though on Scan, their prebuilt systems say "professionally overclocked to 4.0 GHz" for the 1700X and 1800X systems and "3.8 GHz" for the 1700 system, so maybe my logic is flawed - that, or Scan don't have any chips yet and so were being conservative on the OCs? Or maybe the cheaper mobo in the 1700 system is the reason for a more conservative OC... who knows!

Very excited to be upgrading from my i5 3570K - first AMD chip since my A64 3200+.


----------



## xTesla1856

Well, we have a counter now, guys

http://www.geforce.com/


----------



## jhowell1030

Quote:


> Originally Posted by *xTesla1856*
> 
> Well, we have a counter now, guys
>
> http://www.geforce.com/


HYPE!


----------



## Dagamus NM

Counter for what??


----------



## jhowell1030

The release of a 1080Ti


----------



## EniGma1987

Quote:


> Originally Posted by *Dagamus NM*
> 
> Counter for what??


For when it is *TI*me for a new release. Seems pretty obvious what it's for.


----------



## greatxerox

Hello guys!!

Where is the DSR option for the Titan X Pascal???

I have 2x Titan X Pascal in SLI and an HTC Vive.

In the Nvidia control panel, the DSR option is there for the 1000 series and the 900 series, but on my system I don't have it. Is that normal?

I don't see the Titan X Pascal on this page:
http://www.geforce.com/hardware/technology/dsr/supported-gpus

Is it normal that the best Nvidia graphics card doesn't support DSR??

Could someone explain this to me, please? Thank you!


----------



## jhowell1030

Quote:


> Originally Posted by *greatxerox*
> 
> Hello guys!!
>
> Where is the DSR option for the Titan X Pascal???
>
> I have 2x Titan X Pascal in SLI and an HTC Vive.
>
> In the Nvidia control panel, the DSR option is there for the 1000 series and the 900 series, but on my system I don't have it. Is that normal?
>
> I don't see the Titan X Pascal on this page:
> http://www.geforce.com/hardware/technology/dsr/supported-gpus
>
> Is it normal that the best Nvidia graphics card doesn't support DSR??
>
> Could someone explain this to me, please? Thank you!


I think you have to enable it first in the 3D options, then apply, and then go back into it. I know I'm able to use it, but I'm at work and don't have it right in front of me.


----------



## greatxerox

"you have to enable it first in the 3D options, then apply"

>>> Thank you, but WHERE exactly, please??? I don't see anything.


----------



## Dagamus NM

Ahh, it wasn't popping up on mobile. I see it now.

I am curious after all of the speculation to see what all it really is.


----------



## axiumone

Quote:


> Originally Posted by *greatxerox*
> 
> "you have to enable it first in the 3D options, then apply"
>
> >>> Thank you, but WHERE exactly, please??? I don't see anything.


You have 3 displays. If you have Surround enabled, the DSR option will not appear.


----------



## gamingarena

Quote:


> Originally Posted by *Dagamus NM*
> 
> Counter for what??


Counter for nerfed Titan XP


----------



## greatxerox

Quote:


> Originally Posted by *axiumone*
> 
> You have 3 displays. If you have surround enabled, DSR option will not appear.


I just tried with only one display - same thing :/


----------



## stocksux

Quote:


> Originally Posted by *gamingarena*
> 
> Counter for nerfed Titan XP


That's about right!!


----------



## jhowell1030

Quote:


> Originally Posted by *greatxerox*
> 
> "you have to enable it first in the 3D options, then apply"
>
> >>> Thank you, but WHERE exactly, please??? I don't see anything.




Right here


----------



## greatxerox

jhowell, look at my first post - you'll see I don't have this option. That's the reason I'm asking :/


----------



## pez

Maybe I'll upgrade the 1080 via step-up and put my Titan back in, in anticipation of the EVGA Hybrid cooler.


----------



## greatxerox

I FOUND IT!!

It's Surround that deactivates the DSR option in the Nvidia panel!! yeaaaaaaaaaaaaaaaaaah


----------



## Lobotomite430

Quote:


> Originally Posted by *pez*
> 
> Maybe I'll upgrade the 1080 via step-up and put my Titan back in, in anticipation of the EVGA Hybrid cooler.


Did I by chance offer you my Titan EVGA Hybrid cooler on reddit yesterday? It's been 6 months since the Titan released and EVGA hasn't made a kit for it.


----------



## jhowell1030

Quote:


> Originally Posted by *Lobotomite430*
> 
> Did I by chance offer you my Titan EVGA Hybrid cooler on reddit yesterday? It's been 6 months since the Titan released and EVGA hasn't made a kit for it.


Honestly, this late into the Titan XP's lifecycle, I'd be surprised if they ever went to market with it.


----------



## Lobotomite430

Quote:


> Originally Posted by *jhowell1030*
> 
> Honestly, this late into the Titan XP's lifecycle, I'd be surprised if they ever went to market with it.


My thoughts exactly. It wouldn't take them much to make it, but at this point there's not much money to be made by making them now, since there are so many other kits or full blocks you can make work.


----------



## jhowell1030

Quote:


> Originally Posted by *Lobotomite430*
> 
> My thoughts exactly. It wouldn't take them much to make it, but at this point there's not much money to be made by making them now, since there are so many other kits or full blocks you can make work.


Exactly! Not to mention... we all kind of know there will be something bigger and better come August/September. If they really wanted to make any money off one, they would've put one out months ago.


----------



## jmaz87

When is the Titan XP gonna drop in price so I can justify buying a 2nd one / EK waterblock?


----------



## xTesla1856

Quote:


> Originally Posted by *jmaz87*
> 
> When is the Titan XP gonna drop in price so I can justify buying a 2nd one / EK waterblock?


Same as previous Titans: Never.


----------



## jhowell1030

Quote:


> Originally Posted by *xTesla1856*
> 
> Same as previous Titans: Never.


True. Unless you buy second-hand.


----------



## gee4vee

I have dual Titan X Pascals with latest drivers as of today. Suddenly I'm experiencing a black screen and fans at full throttle in the middle of gameplay. This has never happened before. I put them in SLI mode. Anyone else seeing this?


----------



## Lobotomite430

Quote:


> Originally Posted by *jmaz87*
> 
> When is the Titan XP gonna drop in price so I can justify buying a 2nd one / EK waterblock?


I kind of wanted to sell my Titan and get 2 1080 Tis, but 2 Titans... oooooo weeeeee


----------



## stocksux

Quote:


> Originally Posted by *Lobotomite430*
> 
> I kind of wanted to sell my Titan and get 2 1080 Tis, but 2 Titans... oooooo weeeeee


Keep the Titan and add another one. No sense in going down a level. I doubt the price difference between the two will be that drastic - maybe $300ish. I mean c'mon, $1200 for the one is already ludicrous and we paid it already ?


----------



## Lobotomite430

Quote:


> Originally Posted by *stocksux*
> 
> Keep the Titan and add another one. No sense in going down a level. I doubt the price difference between the two will be that drastic - maybe $300ish. I mean c'mon, $1200 for the one is already ludicrous and we paid it already ?


Yea I hear ya, my titan is making me pretty happy with my new full waterblock!


----------



## Baasha

Wrong thread perhaps, but is anyone having issues with Battlefield 1 and Battlefield 4 throwing DirectX "DEVICE_REMOVED" errors?

Started getting these recently and it's really aggravating - only in these two games, though.


----------



## Dagamus NM

Quote:


> Originally Posted by *Baasha*
> 
> Wrong thread perhaps, but is anyone having issues with Battlefield 1 and Battlefield 4 throwing DirectX "DEVICE_REMOVED" errors?
>
> Started getting these recently and it's really aggravating - only in these two games, though.


Don't know, but I do need to pick your brain regarding NVI profiles for some games on quad sli TXPs.

I sent a pm about this a while back but didn't hear back from you.


----------



## Remidi

My Titan XP has a nasty coil whine at full load, even with an EKWB block on it. I called NVIDIA to set up an RMA and mentioned I put the block on a couple of months ago, hoping that would get rid of the coil whine. The first thing out of the rep's mouth was that they don't support that, and he had to speak with a supervisor to see if they could help me. I hung up. I'll try again later and not mention the water block. The guy was pretty insistent that I broke my card.

When I returned GPUs to EVGA it was simple, and I was never made to feel that I'm the one who broke the GPU. ******* coil whine. The stupid thing won't go over 1900 MHz even when under 50C anyway.

Anyone have any suggestions for thermal pads? I'd rather just order from Amazon instead of EKWB, but I'm not exactly sure what to buy. I'm going to start looking now.

**That rep must have input my info somewhere, because now they won't accept an RMA due to the added cooler...

***Spoke with a supervisor... the ONLY reason they MAY accept the RMA is that I originally stated the noise was there on day 1, before I installed the water block (which it was). He has to forward the information to his supervisor. I'll never buy another GPU from NVIDIA directly again; this is ridiculous.

***I asked him why this process is such a hassle when the warranty doesn't state that I can't change the cooler. He decided against sending me an email requesting a picture of my GPU with the serial number on it, and is now forwarding me to "level 2 RMA"... I thought I was already talking to a supervisor...

***I was transferred to someone not in India, and he couldn't understand why they were giving me a hard time. As long as I didn't damage the GPU by replacing the heatsink, it's fine. RMA is set up. I'm selling the replacement and my EKWB... never again. I'll be without my PC until the 1080 Ti comes out, though.


----------



## clipse84

Hey guys, quick question: I have SLI Titan XPs with an Acer Predator 1440p 165Hz monitor. Should I be gaming with the DSR option on to get 4K quality? Can you tell the difference, and is it worth it on a 1440p monitor, or should I just run games at native 1440p? Thanks in advance.


----------



## bee144

Quote:


> Originally Posted by *clipse84*
> 
> Hey guys, quick question: I have SLI Titan XPs with an Acer Predator 1440p 165Hz monitor. Should I be gaming with the DSR option on to get 4K quality? Can you tell the difference, and is it worth it on a 1440p monitor, or should I just run games at native 1440p? Thanks in advance.


I use resolution scaling in BF1 with my 2560x1440 144Hz monitor. I set the resolution to 150%, which is equal to 4K. I use this instead of AA and I can see a difference. Anything over 150% and I can't notice the difference.
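For anyone checking the numbers here: the resolution-scale percentage applies per axis, so 2560x1440 at 150% really does render at 3840x2160, i.e. 4K. A quick sketch of that arithmetic (the function name is just for illustration, not anything from the game):

```python
def scaled_resolution(width, height, scale_pct):
    """Per-axis render resolution for a given in-game resolution scale."""
    return round(width * scale_pct / 100), round(height * scale_pct / 100)

# 1440p at 150% resolution scale renders at 4K
assert scaled_resolution(2560, 1440, 150) == (3840, 2160)
# 200% would be 4x the pixel count per the same rule
assert scaled_resolution(2560, 1440, 200) == (5120, 2880)
```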


----------



## EniGma1987

Quote:


> Originally Posted by *clipse84*
> 
> hey guys quick question I have sli titan xps's with an acer predator 1440p,165hz monitor should I be gaming with the dsr option on to get 4k quality.? can you tell the diffrence and is it woth it on a 1440p monitor or should I just run games at native 1440p? Thanks, in advance


You will never get true 4K quality unless you use a 4K or higher display. No amount of DSR can display pixels that simply aren't there. All DSR does is render at a higher resolution and downscale, which is basically similar to some AA techniques, and the resulting image quality is about the same as if you were running AA.
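The render-high-then-downscale idea described above can be sketched in a few lines of numpy. This is a plain box filter for illustration - Nvidia's actual DSR uses a Gaussian filter with an adjustable smoothness, so treat this as the concept, not their implementation:

```python
import numpy as np

def dsr_downscale(rendered, factor):
    """Average factor x factor pixel blocks back down to native resolution.
    A box filter stands in here for DSR's Gaussian downscale."""
    h, w = rendered.shape[0] // factor, rendered.shape[1] // factor
    blocks = rendered[:h * factor, :w * factor].reshape(h, factor, w, factor)
    return blocks.mean(axis=(1, 3))

# Render at 4x DSR (2x per axis) for a 1080p panel, then downscale
hi_res = np.random.rand(2160, 3840)
native = dsr_downscale(hi_res, 2)
assert native.shape == (1080, 1920)
```

Each output pixel is the average of several rendered samples, which is why the result looks like supersampled AA rather than a genuinely higher-resolution image.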


----------



## invincible20xx

anybody using a titan x with a 3770k here ?


----------



## Exnetic

Yes, but DSR with smoothness set to 0% in the Nvidia control panel is not that bad - better than AA if you ask me.


----------



## NemChem

Quote:


> Originally Posted by *invincible20xx*
> 
> anybody using a titan x with a 3770k here ?


Not quite a 3770K, but using a 3570K here...


----------



## NemChem

So did one of you lot win the Titan XP eBay auction that just finished?


----------



## mitcHELLspawn

The one that went for 1755 CAD? I was looking at that thinking: why the hell would someone pay more than retail for a used Titan X... lol, especially with the 1080 Ti launching next week.


----------



## NemChem

Quote:


> Originally Posted by *mitcHELLspawn*
> 
> The one that went for 1755 cad? I was looking at that thinking why the hell would someone pay more than retail for a used titan x... lol especially with the 1080ti launching next week.


Ah, sorry, I should've said eBay UK. It went for £1066.66 + £12 shipping; nVidia raised the new price from £1099 to £1179 in January due to a weak pound, so it's a good price but a bit of a gamble around the 1080 Ti launch.

Might've been the same one - Google says "1066.66 British Pound equals: 1743.03 Canadian Dollar".

I wasn't willing to gamble at that high a price, especially with the rumours of a 3840 CUDA core 1080 Ti. Logic tells me that doesn't make sense, since Volta isn't due for quite a while (there needs to be a halo card later in the year?) and people would buy a 1080 Ti whether it's 3384(?) cores or 3840, whilst the former option leaves a 3840-core Titan for later in the year - but my gut said watch out! A fully fledged GP102 7 months after the Titan just feels a little too soon, but who knows, maybe nVidia knows something about Vega we don't! How many months was the 780 Ti after the Titan?


----------



## mitcHELLspawn

Oh man, I am really hoping for a 3840-core card... it would be incredible to see, but man, would the Titan guys be mad. But there definitely is precedent for it, as you said, with the 780 Ti and the OG Titan. And as for how much later it launched, I'm pretty sure it was less than a year.

I will definitely be picking up one of the very first 1080 Tis, whether it's a full GP102 or not - like you said, most people will as well. I'm hoping that because Nvidia held off on launching slightly, board partners all have their cards ready to go with at least some decent stock. I've been running dual MSI GTX 1080 Gaming X since they launched in August, and I would love to throw in an identical card, just a Ti version. It really is a great card. Amazing temps.


----------



## BenchZowner

I'd rather have a Quadro P6000 at 1.6k (and bang goes the Titan X (P))


----------



## NemChem

Haha, I would be kinda upset, but if it were 3840 cores and 10 GB, not quite so much, since some machine learning workloads can fill up 12 GB. What I really would like is Vega 10 - but Vega 10 with the level of software support for machine learning that CUDA has. I think AMD announced something about a CUDA -> OpenCL translator, so I guess that's a start! An 8 GB Vega 10 would essentially be like 2x Titan XPs with 16 GB of memory, since it runs FP16 at 2x the speed of FP32 like GP100 - so 25 TFlops - and FP16 values take up less memory than FP32, so there's effectively more to play with.

One thing that makes me think the 1080 Ti could be 3840 cores is that the Titan XP, after being out of stock quite often after launch, was then in stock for a looooong time, and now it seems to be going in and out of stock very frequently, like they're only keeping a very small inventory - that, or loads of university computer science departments are buying them all up for deep learning... Like you said, "why would someone pay more than retail?" - I think new ones do actually sell on eBay for more than retail because of the 2-per-customer limit and university research teams wanting lots of them!

Fun to speculate! 28 Feb can't come soon enough though!

Edit:
Quote:


> Originally Posted by *BenchZowner*
> 
> I'd rather have a Quadro P6000 at 1.6k (and bang goes the Titan X (P))


Haha that would be completely awesome - can but dream!


----------



## pez

Quote:


> Originally Posted by *Lobotomite430*
> 
> Did I by chance offer you my TItan EVGA Hybrid cooler on reddit yesterday? Its been 6 months since Titan released and EVGA hasnt made a kit for it.


Wasn't me. What cooler was this, though? A previous Titan-gen cooler modded to work with the TXP?
Quote:


> Originally Posted by *jhowell1030*
> 
> Honestly, this late into the Titan XP's lifecycle, I'd be surprised if they ever went to market with it.


Yeah, I'm starting to think this as well.

I'm still tossing around the idea of just putting the Titan XP back into my main rig, stepping up the 1080 to a Ti, and then selling off the 1070... or keeping it for an HTPC build. I'd think the 1070 is a fairly decent HTPC 1080p card.


----------



## jhowell1030

Quote:


> Originally Posted by *pez*
> 
> Wasn't me. What cooler was this, though? A previous Titan-gen cooler modded to work with the TXP?
> Yeah, I'm starting to think this as well.
>
> I'm still tossing around the idea of just putting the Titan XP back into my main rig, stepping up the 1080 to a Ti, and then selling off the 1070... or keeping it for an HTPC build. I'd think the 1070 is a fairly decent HTPC 1080p card.


LOL. "Fairly decent" - that's a good one!


----------



## pez

In all seriousness, I do want to get a 4K display by the end of the year, and having the 1070 for an HTPC would be pretty awesome. My goal is to make a really, really SFF build and sit it alongside my consoles. I may go as big as mATX, though, unless I can find a speaker amp that I would like to use over a receiver.


----------



## hertz9753

Quote:


> Originally Posted by *pez*
> 
> In all seriousness, I do want to get a 4K display by the end of the year, and having the 1070 for an HTPC would be pretty awesome. My goal is to make a really, really SFF build and sit it alongside my consoles. I may go as big as mATX, though, unless I can find a speaker amp that I would like to use over a receiver.


http://www.parts-express.com/

That is where I would look.


----------



## pez

Quote:


> Originally Posted by *hertz9753*
> 
> http://www.parts-express.com/
> 
> That is where I would look.


I've seen a couple of things on there so far that I'm going to research further. Thanks for that.


----------



## Glerox

OUCH.

GTX 1080 Ti, faster than the Titan X Pascal, for $699...

I feel bad right now with my $1200 Titan.


----------



## wiretap

Good thing I waited.. almost pulled the trigger on a Pascal Titan on Ebay the other day for $1k.


----------



## Glerox

The official boost clock is 1600 MHz, but during the demo it boosted up to 2000 MHz on stock clocks with the stock cooler. OC performance will be crazy with watercooling - probably higher than 2200 MHz.


----------



## axiumone

Quote:


> Originally Posted by *Glerox*
> 
> The official boost clock is 1600 MHz, but during the demo it boosted up to 2000 MHz on stock clocks with the stock cooler. OC performance will be crazy with watercooling - probably higher than 2200 MHz.


Lol, no chance. The 1080 presentation showed the same thing - the GPU running at 2100 on a stock cooler at 65C. Reality is a little different.


----------



## opt33

Yeah, buying a Titan XP now would be a bummer, but 6 months ago it was just the choice to pay more to have it early, since the Ti version always comes out later with similar performance at a cheaper price. I'll probably buy another Titan (4th time) instead of waiting for the Ti version again, provided it has another huge performance increase like previous ones. A 2000 boost, overclocking a little higher, is nice, but not enough of an upgrade over my 2000 speed now. But yeah, it would be nice if they just offered the Ti to start with... for those of us with no patience.


----------



## Glerox

Quote:


> Originally Posted by *axiumone*
> 
> Lol, no chance. The 1080 presentation showed the same thing - the GPU running at 2100 on a stock cooler at 65C. Reality is a little different.


Haha, I hope so, because my old Titan XP with a waterblock maxes out at 2088 MHz...


----------



## ChronoBodi

I admit I would have bought a Ti, though, if it came out in January.

Ah well, imma wait for the 1180 Ti then.

Actually, if the 1080 Ti is already at 2 GHz boost out of the box, where's the OC room?

This is still Pascal; I'm wondering if 2200 MHz is possible.


----------



## Glerox

Quote:


> Originally Posted by *opt33*
> 
> Yeah, buying a Titan XP now would be a bummer, but 6 months ago it was just the choice to pay more to have it early, since the Ti version always comes out later with similar performance at a cheaper price. I'll probably buy another Titan (4th time) instead of waiting for the Ti version again, provided it has another huge performance increase like previous ones. A 2000 boost, overclocking a little higher, is nice, but not enough of an upgrade over my 2000 speed now. But yeah, it would be nice if they just offered the Ti to start with... for those of us with no patience.


I agree we knew the Ti was coming, but I hoped the 1080 Ti wouldn't be BETTER than the Titan XP... according to Nvidia it's faster. Waiting for benchmarks.


----------



## ChronoBodi

Quote:


> Originally Posted by *Glerox*
> 
> I agree we knew the Ti was coming, but I hoped the 1080 Ti wouldn't be BETTER than the Titan XP... according to Nvidia it's faster. Waiting for benchmarks.


The 1080 Ti is literally a Titan XP that's OCed, with 1GB less VRAM and a 64-bit chunk of the bus cut off.

It is, for all intents and purposes, a slightly bus-neutered, OCed Titan XP.

Nvidia says it's faster than the TXP, but how? Simple: they overclocked it like we did. How else are 3584 cores going to beat 3584 cores?


----------



## Glerox

Quote:


> Originally Posted by *ChronoBodi*
> 
> The 1080 Ti is literally a Titan XP that's OCed, with 1GB less VRAM and a 64-bit chunk of the bus cut off.
>
> It is, for all intents and purposes, a slightly bus-neutered, OCed Titan XP.
>
> Nvidia says it's faster than the TXP, but how? Simple: they overclocked it like we did. How else are 3584 cores going to beat 3584 cores?


Yeah, but maybe they found a way to achieve higher stable clocks, like the GTX 1080 achieves higher stable clocks than the TXP - but now with the same number of CUDA cores.


----------



## animeowns

Quote:


> Originally Posted by *Glerox*
> 
> Yeah, but maybe they found a way to achieve higher stable clocks, like the GTX 1080 achieves higher stable clocks than the TXP - but now with the same number of CUDA cores.


Yeah, I was reading the specs and comparing it to the Titan XP - it's literally the same card minus a few things, like a 352-bit bus vs a 384-bit bus. But I think Nvidia will wait for AMD to release big Vega before releasing the full chip to consumers. The only way I can see it beating the Titan XP is if they cripple driver support and give the better support to the 1080 Ti meant for gamers!


----------



## Fredthehound

Soooo...since it's the same core count/chip, will it SLI with a TXP?


----------



## ChronoBodi

No, as stupid as that sounds.

A 680 cannot SLI with a 770 despite both being exactly the same GK104 die.

So, no, a TXP cannot SLI with a 1080 Ti, the driver blocks it.


----------



## animeowns

Quote:


> Originally Posted by *Fredthehound*
> 
> Soooo...since it's the same core count/chip, will it SLI with a TXP?


I doubt it - one card is 11 GB, the other is 12 GB, plus the 352-bit vs 384-bit bus.


----------



## Fredthehound

That's what I figured. Driver.

Oh well, I still don't regret buying my TXP for a second. It's been 7 months of gaming bliss.


----------



## stocksux

I have a Titan X I haven't even used yet. I'm building a system (going on a month now) and I just finished putting an EKWB block on it... I'm gonna be sick ? Guess I'll sell it at a loss /sigh. Someone from Nvidia please see this and help me out! I feel so jaded by Nvidia.


----------



## Fredthehound

With Crossfire, the 'lead' card just defaults to the lower across SKUs. I can't say I'm surprised Nvidia would lock it down to exact cards, though.


----------



## Fredthehound

Why? For MAYBE gaining 100 MHz but losing a full gig of RAM? There's NOTHING to haz sadz about with 2 Titans on water!


----------



## patrickisfrench

Is there nothing Nvidia can do for people who bought a Titan XP in early January?


----------



## Fredthehound

Why? It's not like we all didn't know it was only a matter of time till the Ti released. And even if we didn't, we all still willingly chose to pay halo pricing for halo performance - and we all still have exactly that. In fact, considering the OC levels we've hit without voltage mods, and the rarity of actual problems that weren't self-inflicted, I'd say we got more than we paid for, even at $1200.


----------



## markklok

Maybe there's more chance of a possible BIOS hack, since the 1080 Ti is using the same chip...


----------



## Fredthehound

Hope so.


----------



## patrickisfrench

Titan XP (NVIDIA direct)

Subtotal $1,200.00
Tax $109.69
Shipping $35.89
Total $1,345.58

+ EVGA hybrid cooler for mod (Amazon)

Order Summary
Item(s) Subtotal: $107.99
Shipping & Handling: $5.99
Free Shipping: -$5.99
Total before tax: $107.99
Estimated tax to be collected: $9.58
Grand Total: $117.57

*$1463.15*


----------



## patrickisfrench

Quote:


> Originally Posted by *Fredthehound*
> 
> Why? It's not like we all didn't know it was only a matter of time till the Ti released. And even if we didn't, we all still willingly chose to pay halo pricing for halo performance - and we all still have exactly that. In fact, considering the OC levels we've hit without voltage mods, and the rarity of actual problems that weren't self-inflicted, I'd say we got more than we paid for, even at $1200.


Titan XP (NVIDIA direct)

Subtotal $1,200.00
Tax $109.69
Shipping $35.89
Total $1,345.58

+ EVGA hybrid cooler for mod (Amazon)

Order Summary
Item(s) Subtotal: $107.99
Shipping & Handling: $5.99
Free Shipping: -$5.99
Total before tax: $107.99
Estimated tax to be collected: $9.58
Grand Total: $117.57

*$1463.15*

edit: double post, please delete above? mods? thank you


----------



## mbze430

Anyone else planning to sell their Titan X Pascal to get the 1080 TI?


----------



## Fredthehound

Yeah, but you knew the Ti was coming soon. So yeah, it's a tough break, but I don't see what Nvidia or any other company should do about it. They are in business to sell us stuff, not to make financial decisions for us.

Why 2 months and not 3? Or 4? Sure, if they want to offer up some trade-in or other deal, that's great, but it's really up to us to buy smarter.


----------



## patrickisfrench

After there was no word at all at CES, I lost faith. Figured... whatever, Titan XP for me then, lol - boom!


----------



## Fredthehound

I don't blame you. I'd be torqued if that happened to me too. But look on the bright side: you STILL have the best of the best out there, and will for a good long time. Hell, I still get called bad names for just having one of them.


----------



## ChronoBodi

Quote:


> Originally Posted by *mbze430*
> 
> Anyone else planning to sell their Titan X Pascal to get the 1080 TI?


No. No one is going to pay the original price for a TXP, considering the 1080 Ti is literally a TXP with a 64-bit chunk of bus cut off, 1 GB less VRAM, and an OC.

And again, our OCed TXPs give the same performance as a 1080 Ti anyway, except for price/perf, which Titans aren't exactly known for.

Titans serve as the no-holds-barred early adopter option and the herald of what's to come in the later Ti variant, as usual.

I'll just get the Ti version of Volta next generation.


----------



## gamingarena

Quote:


> Originally Posted by *mbze430*
> 
> Anyone else planning to sell their Titan X Pascal to get the 1080 TI?


Why would you do that and gain what?


----------



## xTesla1856

I don't think anyone has a compelling reason to sell.


----------



## pez

I will probably get a Ti, but I'll do so by upgrading my 1080 via step-up. In my case, the lower noise profile of air cooling on an AIB card is really appetizing. I'm not sure what I'll do with the Titan X P, but I might sell it, for the sheer fact it might not be worth keeping around... unless I'm feeling that kind of way and just want to put it on a shelf in my den as 'art'.


----------



## Silent Scone

At first I thought I read it had all SMs enabled, so I was a little gobsmacked - but thankfully, logic/reality took over.


----------



## bl4ckdot

What do you guys think of the tiled rendering on the Ti?
Also, anyone planning to pair an 1800X with our Titan XPs?


----------



## xTesla1856

They lopped off 1GB of VRAM, gimped the bus, and lowered the TDP. The ultimate? There can only be one...


----------



## pez

Quote:


> Originally Posted by *bl4ckdot*
> 
> What do you guys think of the Tiled Rendering of the Ti ?
> Also anyone planning to get a 1800x with our Titan XP ?


If they release a semi-decent AM4 ITX board, I'd be very interested to at least do a 1600X with my TXP. It's discouraging to see no news or images of ITX boards, though.


----------



## xarot

TXP owners will "lose" nothing with the 1080 Ti except the value of the card. You'll lose a lot of money on this hobby over the years anyway... my 4x 8800 Ultras are on the shelf.

I sold my second TXP, but mainly because I looked at my games list and realised I mostly play games that don't know what SLI is. I could count the games I'm going to play that support SLI on one hand, and those wouldn't even struggle with one TXP. Yes, it can play Crysis too... Also, I didn't have the second card on water yet, and felt I could use the money for a PCIe SSD.


----------



## DooRules

Now would be a great time for Nvidia to unlock the Titan XP BIOS. One can dream, right?


----------



## meson1

Quote:


> Originally Posted by *markklok*
> 
> Maybe more chance for a possible bios hack since 1080ti is using the same chip......


That was my thought too. It would be nice to be able to free up the voltage limit without having to physically mod the PCB with conductive paint or whatever.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *DooRules*
> 
> Now would be a great time for Nvidia to unlock the Titan XP BIOS. One can dream, right?


One possibility would be to flash a modded Asus Strix 1080 Ti BIOS to the TXP, like how it went for the 1080. That's a realistic dream, methinks.


----------



## piee

Curious about the 1080 Ti benchmarks. The TXP rocks - gaming bliss. Also bought a Phantom 4 Pro drone (20 megapixels, 4K/60fps), and the video is sweet on the Qnix 325 4K. The TXP is solid with the EK block. Should we see benchmarks today?


----------



## xarot

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> One possibility would be to flash a modded Asus Strix 1080ti bios to the TXP, like how it went for the 1080. That's a realistic dream me thinks.


Probably won't work due to differences in ROPs and memory bus. But one can hope.


----------



## NemChem

Quote:


> Originally Posted by *patrickisfrench*
> 
> after there was no word at all at CES I lost faith. figured.. whatever Titan XP for me then, lol - boom!


Ditto.

"Thank you for ordering from NVIDIA on 07 February 2017. The following product(s) have shipped. If you paid by credit card, your credit card has now been charged."

Damn.

Not the end of the world though; worse things could have happened!

Positive notes: the TDP of the Titan is 250W, so +20% gives us 300W to play around in when overclocking; the TDP of the 1080 Ti is 220W, so +20% gives 264W to play around in. The Ti might have a newer revision of the silicon but I imagine the Titan's headroom is still higher? Also, Titans might still retain their second hand value after an initial panic selloff due to the number that are wanted in academia for deep learning (12GB over 11GB is a bonus there for certain problems).

Question: The presentation said the 1080 Ti had "7-phase dualFET power". Is that the same as on our Titans? I know the Titan has 7+2 dualFET, but did Nvidia leave the +2 out of the 1080 Ti presentation for simplicity, or is the power delivery on the Titan better?
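The headroom arithmetic above is just TDP times the power-limit slider percentage; a minimal sketch (the function name is mine, the wattages are the ones quoted in this post):

```python
def max_board_power(tdp_watts: float, power_limit_pct: float = 120.0) -> float:
    """Board power available when the power-limit slider is maxed out.

    Assumes the slider scales TDP linearly, e.g. a +20% limit -> 120.
    """
    return tdp_watts * power_limit_pct / 100.0

print(max_board_power(250))  # Titan X Pascal, 250 W TDP -> 300.0
print(max_board_power(220))  # 1080 Ti at the 220 W figure quoted here -> 264.0
```

If the Ti's TDP is actually 250 W, the same formula gives it the same 300 W ceiling as the Titan.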


----------



## Fredthehound

I only really play two games consistently. Fallout and Skyrim. Both modded to hell and back. From what I hear, Fallout's SLI is really good and I would LOVE the extra horsepower to get the last few ini settings maxed in Boston. Skyrim SE doesn't have a profile and I hear the hacks to get it working are never close to flawless. And I now have it about 99% over 60FPS anyway with all the eye candy tweaked up. So $1200 is a bit steep for one game.. To put it mildly.

By the time Cyberpunk 2077 releases, we'll be into Volta anyway and thats really the only other potentially taxing game I'm interested in so far.


----------



## Gunslinger.

Quote:


> Originally Posted by *mbze430*
> 
> Anyone else planning to sell their Titan X Pascal to get the 1080 TI?


Yes, have 2 with EK waterblocks that will be sold.


----------



## xTesla1856

Quote:


> Originally Posted by *NemChem*
> 
> Ditto.
> 
> "Thank you for ordering from NVIDIA on 07 February 2017. The following product(s) have shipped. If you paid by credit card, your credit card has now been charged."
> 
> Damn.
> 
> Not the end of the world though; worse things could have happened!
> 
> Positive notes: the TDP of the Titan is 250W, so +20% gives us 300W to play around in when overclocking; the TDP of the 1080 Ti is 220W, so +20% gives 264W to play around in. The Ti might have a newer revision of the silicon but I imagine the Titan's headroom is still higher? Also, Titans might still retain their second hand value after an initial panic selloff due to the number that are wanted in academia for deep learning (12GB over 11GB is a bonus there for certain problems).
> 
> Question: The presentation said the 1080 Ti had "7 phase dualFET power", is that the same as on our Titans? I know the Titan has 7+2 dualFET but did nvidia leave the +2 part out for simplicity in the 1080 Ti presentation or is the power delivery on the Titan better?


nvidia.com lists the TDP for the Ti at 250W. So same as the TXP.


----------



## xTesla1856

Quote:


> Originally Posted by *Gunslinger.*
> 
> Yes, have 2 with EK waterblocks that will be sold.


I'm curious, why are you selling them?


----------



## DooRules

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> One possibility would be to flash a modded Asus Strix 1080ti bios to the TXP, like how it went for the 1080. That's a realistic dream me thinks.


I'm not particular about how we get access, just that we do.


----------



## NemChem

Quote:


> Originally Posted by *xTesla1856*
> 
> nvidia.com lists the TDP for the Ti at 250W. So same as the TXP.


Ah, seems TechPowerUp has it listed wrong, my bad!


----------



## stocksux

Quote:


> Originally Posted by *patrickisfrench*
> 
> is there nothing that nvidia can do for people having bought a titan xp in early jan?


I'm 100% on board with you, as I also purchased a Titan X in January. A month later, after taking my $1300+, they drop a card they claim will outperform my Titan X, for $500 less! So disgusted...


----------



## Nicklas0912

Hello.

I'm going to do a vmod on my Titan X on Friday.

Do you guys know if I can use this one for the 3x vcore chips?
https://www.coolerkit.dk/shop/coollaboratory-liquid-712p.html


----------



## EniGma1987

Quote:


> Originally Posted by *Nicklas0912*
> 
> Hello.
> 
> Im gonna do vmod on my Titan X friday.
> 
> you guys know if I can use this one for the 3x vcore chips?
> https://www.coolerkit.dk/shop/coollaboratory-liquid-712p.html


To do a vmod you need multi-turn variable resistors, some wire, and soldering skills.

The CLU is just for a power mod








You can use CLP, but it will be a lot harder. The ultra applies a lot easier and sticks where you put it more. You will most likely have issues applying the pro unless you have a good bit of experience with how it moves.


----------



## stocksux

So after digging into things, I'm not sure the 1080 Ti is "faster" than the Titan X. The 1080 Ti essentially takes the Titan X's 12GB of GDDR5X and cuts it to 11GB, the memory interface is bumped down from 384-bit to 352-bit, the ROP count drops from 96 to 88, and 256K of the Titan X Pascal's L2 cache is also omitted.

Here's what the 1080 Ti has over the Titan X: 11Gbps memory modules versus the 10Gbps modules found on the Titan X Pascal, which raises memory bandwidth from the Titan X's 480GB/s to 484GB/s, and a boost clock increased from 1.53GHz on the Titan X to 1.58GHz on the 1080 Ti. All that being said, the biggest difference looks to be the faster but smaller memory. Nvidia seems to be arguing that games don't use the 12GB that's on the Titan X, and that they can run 11GB of this new Micron memory a full 1Gbps faster.

So here's what I'm asking myself: sure, price/performance hands down goes to the 1080 Ti. Heck, you can almost buy two of them for the cost of a single Titan X. But people are overclocking Titan X memory to 11Gbps and getting boost clocks over 2000MHz. It seems like an overclocked Titan X would still outperform the new 1080 Ti. The real question will be how far the 1080 Ti can overclock its memory past 11Gbps. If that number creeps up into 12Gbps+ territory, it would certainly be the winner.

So who thinks there's a Titan X Black or some variant around the corner??? Let's face it, we're still only using 3584 of the 3840 CUDA cores available on the GP102 processor.
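The bandwidth figures being compared above fall straight out of per-pin speed times bus width; a back-of-the-envelope sketch (the function name is mine, the numbers are the ones in the post, and the 12Gbps case is the hypothetical OC mentioned there):

```python
def mem_bandwidth_gb_s(pin_speed_gbps: float, bus_width_bits: int) -> float:
    """GDDR5X bandwidth in GB/s: per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte."""
    return pin_speed_gbps * bus_width_bits / 8

print(mem_bandwidth_gb_s(10, 384))  # Titan X Pascal stock -> 480.0
print(mem_bandwidth_gb_s(11, 352))  # 1080 Ti stock -> 484.0
print(mem_bandwidth_gb_s(11, 384))  # Titan X with an 11 Gbps memory OC -> 528.0
print(mem_bandwidth_gb_s(12, 352))  # 1080 Ti at a hypothetical 12 Gbps OC -> 528.0
```

Which is why a Titan X overclocked to 11Gbps still out-bandwidths a stock Ti, and the Ti's memory would need roughly 12Gbps just to catch up.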


----------



## Gunslinger.

Quote:


> Originally Posted by *xTesla1856*
> 
> I'm curious, why are you selling them?


Because I'm an extreme overclocker (LN2), and the 1080 Ti is going to have non-reference cards available (just like the 980 Ti).

These non-reference cards will have full voltage control, won't be 100% locked down, and will be able to overclock much, much higher than the Titan XP.


----------



## Nicklas0912

Quote:


> Originally Posted by *EniGma1987*
> 
> To do a vmod you need multi-turn variable resistors, some wire, and soldering skills.
> 
> The CLU is just for a power mod
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can use CLP, but it will be a lot harder. The ultra applies a lot easier and sticks where you put it more. You will most likely have issues applying the pro unless you have a good bit of experience with how it moves.


Yeah, sorry, I meant power mod.

Can I use this CLU, or do you know of a better one? Also, can I use the card 24/7 afterwards and then remove it later?


----------



## stocksux

Quote:


> Originally Posted by *Gunslinger.*
> 
> These non-reference cards will have full voltage control and not be 100% locked down and will be able to overclock much much higher than the Titan XP.


Where are you getting this information? There are non-reference 1080s, and they are all voltage locked. Where have you seen that the Ti will not be?


----------



## xTesla1856

Quote:


> Originally Posted by *stocksux*
> 
> Where are you getting this information? There are non reference 1080s and they are all voltage locked. Where have you seen that the ti will not be?


I would like to know as well. Pascal hits a wall around 2150; even the aftermarket 1080s with extended power limits and "full voltage control" didn't change that fact. That's why EVGA and MSI haven't bothered with cards like the KingPin or Lightning. All we know is that the 1080 Ti has a slightly higher boost clock and faster memory, and will overclock to somewhere around 2GHz. Sound familiar? Yeah...


----------



## Vellinious

Quote:


> Originally Posted by *xTesla1856*
> 
> I would like to know as well. Pascal hits a wall around 2150; even the aftermarket 1080s with extended power limits and "full voltage control" didn't change that fact. That's why EVGA and MSI haven't bothered with cards like the KingPin or Lightning. All we know is that the 1080 Ti has a slightly higher boost clock and faster memory, and will overclock to somewhere around 2GHz. Sound familiar? Yeah...


False. Gotta keep them cool and they'll go much higher, and perform like they should at higher clocks. The colder the better.


----------



## stocksux

Quote:


> Originally Posted by *Vellinious*
> 
> False. Gotta keep them cool and they'll go much higher, and perform like they should at higher clocks. The colder the better.


It's actually very true. Without upping the voltage, it doesn't matter what the temp is. It could be a crap-ton of degrees below zero and still not clock higher if there isn't enough voltage.


----------



## Vellinious

Quote:


> Originally Posted by *stocksux*
> 
> It's actually very true. Without upping voltage it doesn't matter what the temp is. Could be -crap ton Celsius and still not clock higher if there isn't enough voltage.


Might be a little bit different on the Titan (I didn't see it when I was messing with mine), but, the cooler you run them, the higher the clocks without adding any additional voltage. I've had 5 different 1080s and they're all the same. At room temps, 2130 to 2150 usually run pretty well. Lower the temps and you'll see higher clocks that perform as they should. The lower the temps go, the higher they'll boost. With coolant temps down around 0c, mine will run as high as 2278, and run WELL at 2252 all with stock voltage.

So....no, it's really not.


----------



## jhowell1030

Quote:


> Originally Posted by *Vellinious*
> 
> False. Gotta keep them cool and they'll go much higher, and perform like they should at higher clocks. The colder the better.


Only true to a point

Quote:


> Originally Posted by *stocksux*
> 
> It's actually very true. Without upping voltage it doesn't matter what the temp is. Could be -crap ton Celsius and still not clock higher if there isn't enough voltage.


Low temps aren't the _end-all be-all_ of a card's performance. If that were the case, then there would be no reason for new architectures to deliver better performance.

Eventually the chip can only do so much.


----------



## Vellinious

Quote:


> Originally Posted by *jhowell1030*
> 
> Only true to a point
> Low temps isn't the _end all be all_ to a cards performance. If that was the case than there would be no reason for new architectures for better performance.
> 
> Eventually the chip can only do so much


Absolutely... at some point, the core just won't clock any higher without additional voltage. I've had my coolant temps down to -1C and haven't found that point yet. /shrug


----------



## Gunslinger.

Quote:


> Originally Posted by *stocksux*
> 
> Where are you getting this information? There are non reference 1080s and they are all voltage locked. Where have you seen that the ti will not be?


From the people who are responsible for designing, testing and marketing the cards.









To do so is a pretty big task; why would they put the effort into the 1080 cards if they knew the 1080 Ti was coming down the pipe?
This is the same scenario we've already seen with GTX 980 - Titan X - GTX 980 Ti.


----------



## Silent Scone

Given it has the same number of SMs, anything that comes out from EVGA and ASUS is likely going to smash most TXPs. Although as far as everyday clocks are concerned, I'd wager the differences will be negligible. If these manage to clock to 2200 out of the box, I'll be hankering for one to play with lol.


----------



## Lobotomite430

"Featuring the most powerful and efficient hardware we've ever designed, the $699 GeForce GTX 1080 Ti is up to 35% faster than the GeForce GTX 1080, and is even faster in games than the $1200 NVIDIA TITAN X."

Just read this on geforce.com. I'm almost kinda sad.


----------



## stocksux

Quote:


> Originally Posted by *Vellinious*
> 
> Might be a little bit different on the Titan (I didn't see it when I was messing with mine), but, the cooler you run them, the higher the clocks without adding any additional voltage. I've had 5 different 1080s and they're all the same. At room temps, 2130 to 2150 usually run pretty well. Lower the temps and you'll see higher clocks that perform as they should. The lower the temps go, the higher they'll boost. With coolant temps down around 0c, mine will run as high as 2278, and run WELL at 2252 all with stock voltage.
> 
> So....no, it's really not.


Yes, the cards thermal throttle. We all know that. Most people are able to OC into the 2000-2100 range and stay there on water; that's not a big deal. The claim was that voltage would be unlocked on the Ti, allowing for even faster clocks. If the voltage doesn't go up, you'll be capped on clock speed. Also, at 2278, consider yourself lucky at winning the silicon lottery.


----------



## stocksux

Quote:


> Originally Posted by *Gunslinger.*
> 
> From the people who are responsible for designing, testing and marketing the cards.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> To do so is a pretty big task, why would they put the effort into doing the same for the 1080 cards if they know that the 1080 Ti is coming down the pipe?
> This is the same scenario that we've already seen with GTX 980 - Titan X - GTX 980 Ti


Could you provide a link to this info??


----------



## looniam

Quote:


> Originally Posted by *Gunslinger.*
> 
> Because I'm an extreme overclocker (LN2) and the 1080 Ti is going to have non-reference cards available. (just like 980 Ti)
> 
> These non-reference cards will have full voltage control and not be 100% locked down and will be able to overclock much much higher than the Titan XP.


well that info strikes me like lightning!


----------



## Gunslinger.

Quote:


> Originally Posted by *stocksux*
> 
> Could you provide a link to this info??


You want a link to my private conversations with these individuals?


----------



## Gunslinger.

Quote:


> Originally Posted by *looniam*
> 
> well that info strikes me like lightning!


KP Edition enters the Matrix to see the Lightning.

You see what I did there.


----------



## looniam

copy that.


----------



## stocksux

Quote:


> Originally Posted by *Gunslinger.*
> 
> You want a link to my private conversations with these individuals?


Didn't know it was a private conversation. I suppose we'll need to address you as the Nvidia insider. As a community, we heard it here first. Straight from the private conversations of Gunslinger and Nvidia individuals, the 1080ti will have unlocked voltage.


----------



## Gunslinger.

Quote:


> Originally Posted by *stocksux*
> 
> Didn't know it was a private conversation. I suppose we'll need to address you as the Nvidia insider. As a community, we heard it here first. Straight from the private conversations of Gunslinger and Nvidia individuals, the 1080ti will have unlocked voltage.


Certain non-reference 1080 Ti's will have unlocked voltage.


----------



## stocksux

So after watching GDC and fighting back getting physically sick, I wrote Nvidia an email about my $1200 Titan X versus the upcoming $699 1080 Ti, seeing as I haven't even had my Titan for a month yet. Here is the response I got from Nvidia:

"From the problem description, I understand that you purchased the TITAN X Pascal card and now you want a refund since you feel the GTX 1080 Ti would be a better option and you can go for 2 GTX 1080 Ti cards with a 200 $ more, The TITAN X Pascal would still be more powerful however 2 GTX 1080 Ti would be more powerful when compared to a single TITAN X Pascal."

I was then directed to a refund center to process a refund. What strikes me as odd is that this Nvidia rep said "The TITAN X Pascal would still be more powerful", which completely goes against what CEO Jen-Hsun Huang said at GDC.


----------



## Lobotomite430

Quote:


> Originally Posted by *stocksux*
> 
> So after watching the GDC and fighting back getting physically sick I wrote Nvidia an email on a $1200 Titan X that will be a $699 1080ti. Being as I haven't even had my Titan for a month yet. Here is the response I got from Nvidia,
> 
> "From the problem description, I understand that you purchased the TITAN X Pascal card and now you want a refund since you feel the GTX 1080 Ti would be a better option and you can go for 2 GTX 1080 Ti cards with a 200 $ more, The TITAN X Pascal would still be more powerful however 2 GTX 1080 Ti would be more powerful when compared to a single TITAN X Pascal."
> 
> I was then directed to a refund center to process a refund. What strikes me as odd, is that this rep of Nvidia said, "The TITAN X Pascal would still be more powerful" which completely goes against what CEO Jen-Hsun Huang said at GDC.


They think the Titan would be more powerful than the 1080 Ti? I really wasn't expecting the 1080 Ti to be that close to the Titan, but wow. Hopefully you get your money back.


----------



## Vellinious

Quote:


> Originally Posted by *stocksux*
> 
> Yes the cards thermal throttle. We all know that. Most people are able to OC into the 2000-2100 range and stay there on water. That's not a big deal. The claim was that the voltage would be unlocked on the ti thus allowing for even faster clocks. If voltage doesn't go up you will become capped from clock speeds. Also at 2278 consider yourself lucky at winning the silicon lottery


I hit those clocks on all 3 of the 1080s I tested with coolant temps at 0C to -1C. The two I have now run it in SLI.

Adding voltage will only get you so much without also dropping temps. I've seen the T4 bios at work, and while it adds clock speed, it doesn't add much performance, and in some cases it just hurts performance because it's adding heat. They just don't run well at higher temps. At 35C, 2202 might run, but at 25C or lower, 2202 will run a LOT better, adding frames and probably allowing for lower voltages. It doesn't have anything to do with "THROTTLING".

Ya can't He-Man this architecture with more voltage like we used to in the past. The game changed. It's all about the temps..... but hey.... I hope you get all your wishes and your new 1080 Ti will add extra voltage for you. rofl


----------



## pez

They might be saying that to discourage you, and in certain situations, hardware-wise, the Titan XP will reign victorious. So while they're not lying to you, they're being selective in their wording.


----------



## octiny

Quote:


> Originally Posted by *stocksux*
> 
> I was then directed to a refund center to process a refund. What strikes me as odd, is that this rep of Nvidia said, "The TITAN X Pascal would still be more powerful" which completely goes against what CEO Jen-Hsun Huang said at GDC.


It's a gimped Titan XP; I'm not sure why anyone would expect it to be more powerful. It's being marketed as the fastest "gaming" GPU, which makes sense, as the Titan XP isn't considered a "gaming"-only card. Unless the Ti can magically break the Pascal wall, OC for OC, the XP will still pull slightly ahead.


----------



## Lobotomite430

Quote:


> Originally Posted by *octiny*
> 
> It's a gimped Titan XP, I'm not sure why anyone would expect it to be more powerful. It's being marketed as the fastest "gaming" gpu, which makes sense as the Titan XP isn't considered a "gaming" only card. Unless the TI can magically break the Pascal wall, OC for OC, XP will still pull slightly ahead.


I hope so, but at that price it sure would be tempting to ditch my Titan for two 1080 Tis.


----------



## octiny

Quote:


> Originally Posted by *Lobotomite430*
> 
> I hope so but at the price it sure would be tempting to ditch my Titan for 2 1080ti.


If you are still within the return period, I would definitely get 2. Almost Titan XP performance x 2 would be a no-brainer for $200 more, if I was in your position.

I had thought of selling my two cards so I could pick up a few hundred dollars, but realized that I'd lose that money simply on Ti resell value once Volta/VEGA hits.

Ti's are the absolute worst at holding resale value versus Titans once a new gen gets released. Titans hold their value quite well.


----------



## Lobotomite430

Quote:


> Originally Posted by *octiny*
> 
> If you are still within the return period, I would definitely get 2. Almost Titan XP performance x 2 would be a no-brainer for $200 more, if I was in your position.
> 
> I had thought of selling my two cards so I could pick up a few hundred dollars, but realized that I'd lose that money simply on Ti resell value once Volta/VEGA hits.
> 
> Ti's are the absolute worst at holding resale value versus Titans once a new gen gets released. Titans hold their value quite well.


I bought my Titan the day it came out, so I would have to resell it at this point.


----------



## Slushpup

I'll be sitting back till Vega with my two powerhouses.


----------



## foggyNA

I put my request in for a refund on my Titan, but I don't think it will get approved. I understand the nature of this hobby, but this was a huge blow to enthusiasts. How can I resell my Titan when the 1080 Ti can do the job just as well for almost half the price?


----------



## Slushpup

We as consumers could just stop buying the Titan-class cards; Nvidia would have no reason to sell them at that point.


----------



## pez

You've gotta think of it like cars. It costs to have the latest and greatest and the moment you drive it off the lot even, it depreciates unless demand is high.


----------



## foggyNA

Quote:


> Originally Posted by *pez*
> 
> You've gotta think of it like cars. It costs to have the latest and greatest and the moment you drive it off the lot even, it depreciates unless demand is high.


I would normally agree with this, but the performance gap the 1080 Ti closed, at that price, is outrageous.


----------



## pez

Quote:


> Originally Posted by *foggyNA*
> 
> I would normally agree with this but the performance gap that was closed with the 1080ti for the price is outrageous.


Well a similar thing also happened with the 780Ti and 980Ti so if it took you by surprise, I'm a bit perplexed.


----------



## xTesla1856

How long is the refund period with NV?


----------



## stocksux

Quote:


> Originally Posted by *xTesla1856*
> 
> How long is the refund period with NV?


What is your refund policy?
We offer a 30-day money back guarantee for all products including Gear Store apparel.


----------



## Lobotomite430

Quote:


> Originally Posted by *pez*
> 
> Well a similar thing also happened with the 780Ti and 980Ti so if it took you by surprise, I'm a bit perplexed.


Well, I was a bit shocked, but my 980 Ti was my first Nvidia card since the GeForce 4 series. I suppose it's not so much hurting their Titan sales as it is moving products and getting them out there before their competition has a chance.


----------



## pez

Yeah, I see that. If you've had the card for more than 3 months, I see no need to fret. It's still going to battle head-to-head, and you'll still have the satisfaction that you have 12GB of VRAM instead of 11GB.


----------



## Lobotomite430

Quote:


> Originally Posted by *pez*
> 
> Yeah, I see that. If you've had the card for more than 3 months, I see no need to fret. It's still going to battle head-to-head, and you'll still have the satisfaction that you have 12GB of VRAM instead of 11GB.


I guess I'm not too sad, as I still don't know many games that like Crossfire or SLI enough to justify two cards. I've been pretty happy running 60fps on my 3440x1440 since August 2nd.


----------



## xTesla1856

Quote:


> Originally Posted by *stocksux*
> 
> What is your refund policy?
> We offer a 30-day money back guarantee for all products including Gear Store apparel.


I'm waay past 30 days but thanks anyway


----------



## Lobotomite430

Quote:


> Originally Posted by *xTesla1856*
> 
> I'm waay past 30 days but thanks anyway


Your Titan runs at 2114mhz? Is that part of the lottery or did you do something?


----------



## jhowell1030

Quote:


> Originally Posted by *Lobotomite430*
> 
> I guess im not too sad as I still dont know many games that like xfire or SLI enough to justify two cards. Ive been pretty happy running 60fps on my 3440x1440 since August 2nd


Same boat. I probably won't be upgrading until a single-card solution can do this at 100FPS with no sweat.


----------



## EniGma1987

Quote:


> Originally Posted by *stocksux*
> 
> So after digging into things, I'm not sure the 1080 Ti is "faster" than the Titan X. The 1080 Ti essentially takes the Titan X's 12GB of GDDR5X and cuts it to 11GB, the memory interface is bumped down from 384-bit to 352-bit, the ROP count drops from 96 to 88, and 256K of the Titan X Pascal's L2 cache is also omitted. Here's what the 1080 Ti has over the Titan X: 11Gbps memory modules versus the 10Gbps modules found on the Titan X Pascal, which raises memory bandwidth from the Titan X's 480GB/s to 484GB/s, and a boost clock increased from 1.53GHz on the Titan X to 1.58GHz on the 1080 Ti. All that being said, the biggest difference looks to be the faster but smaller memory. Nvidia seems to be arguing that games don't use the 12GB that's on the Titan X, and that they can run 11GB of this new Micron memory a full 1Gbps faster. So here's what I'm asking myself: sure, price/performance hands down goes to the 1080 Ti. Heck, you can almost buy two of them for the cost of a single Titan X. But people are overclocking Titan X memory to 11Gbps and getting boost clocks over 2000MHz. It seems like an overclocked Titan X would still outperform the new 1080 Ti. The real question will be how far the 1080 Ti can overclock its memory past 11Gbps. If that number creeps up into 12Gbps+ territory, it would certainly be the winner. So who thinks there's a Titan X Black or some variant around the corner??? Let's face it, we're still only using 3584 of the 3840 CUDA cores available on the GP102 processor.


My thoughts are the same.
One thing that may be a real difference, though, is that Nvidia claims the new Ti has a software-based cache for tiled rendering, and that this cache puts its performance above the Titan. I don't see how that can be, given that even if the driver caches to a small bit of system RAM, that's still drastically slower than any of the hardware resources on the GPU. And if this software-based cache works, why won't it be given to the Titan as well?

Quote:


> Originally Posted by *Gunslinger.*
> 
> Because I'm an extreme overclocker (LN2) and the 1080 Ti is going to have non-reference cards available. (just like 980 Ti)
> 
> These non-reference cards will have full voltage control and not be 100% locked down and will be able to overclock much much higher than the Titan XP.


I'm a little confused, though. As an LN2 overclocker, why do you care about custom BIOS support with unlocked voltage? LN2 runs should always use hardmod control to set voltage.

Quote:


> Originally Posted by *Nicklas0912*
> 
> Yea sory, I ment power mod
> 
> 
> 
> 
> 
> 
> 
> Can I use this Clu? or does you know one there is better ? + can I use the card 24/7 after? then remove it later,


You can use CLP, but it likes to squirm around when being applied so you run a much higher risk of spilling it all over the GPU's board. You will want some acetone around to clean things up afterwards. You can remove the mod any time you want with some acetone and paper towels.


----------



## EniGma1987

Quote:


> Originally Posted by *stocksux*
> 
> So after watching the GDC and fighting back getting physically sick I wrote Nvidia an email on a $1200 Titan X that will be a $699 1080ti. Being as I haven't even had my Titan for a month yet. Here is the response I got from Nvidia,
> 
> "From the problem description, I understand that you purchased the TITAN X Pascal card and now you want a refund since you feel the GTX 1080 Ti would be a better option and you can go for 2 GTX 1080 Ti cards with a 200 $ more, The TITAN X Pascal would still be more powerful however 2 GTX 1080 Ti would be more powerful when compared to a single TITAN X Pascal."
> 
> I was then directed to a refund center to process a refund. What strikes me as odd, is that this rep of Nvidia said, "The TITAN X Pascal would still be more powerful" which completely goes against what CEO Jen-Hsun Huang said at GDC.


Even if the Titan comes out 10% faster than the Ti (which I doubt it will), two Tis in SLI will crush a Titan in any game that has even barely decent SLI support.


----------



## Gunslinger.

Quote:


> Originally Posted by *EniGma1987*
> 
> Im a little confused though. As an LN2 overclocker, why do you care about custom bios support with unlocked voltage? LN2 should always be doing hardmod control to set voltage.
> You can use CLP, but it likes to squirm around when being applied so you run a much higher risk of spilling it all over the GPU's board. You will want some acetone around to clean things up afterwards. You can remove the mod any time you want with some acetone and paper towels.


CLP is terrible for LN2

I care because I don't like to butcher my cards to run on LN2; cards like the EVGA Kingpin series, Asus Matrix series, and MSI's Lightning series are all highly overclockable using software for voltage control, or an EVBot in EVGA's case.

All these cards can be benched hard on LN2 and then returned to stock condition for daily use.


----------



## Slushpup

I hope Volta is this big of a jump.


----------



## Maintenance Bot

Ryzen and now 1080 Ti.

Time to go throw up in a bucket again.


----------



## Nicklas0912

Quote:


> Originally Posted by *EniGma1987*
> 
> My thoughts are the same.
> One thing that may be a real difference though is Nvidia claims the new Ti has a software based cache for the tiled rendering. Nvidia claims this cache puts the performance above the Titan. Which I dont see how that is given that at best if the driver caches to a small bit of system RAM it is still drastically slower than any of the hardware resources on the GPU. And if this software based cache works, why wont it be given to the Titan as well?
> Im a little confused though. As an LN2 overclocker, why do you care about custom bios support with unlocked voltage? LN2 should always be doing hardmod control to set voltage.
> You can use CLP, but it likes to squirm around when being applied so you run a much higher risk of spilling it all over the GPU's board. You will want some acetone around to clean things up afterwards. You can remove the mod any time you want with some acetone and paper towels.


Okay, thanks. Is there any other product that's better to use?


----------



## jezzer

More RAM doesn't make a card faster; faster RAM does. So if you're saying the only real difference is faster but less RAM, you're kind of saying the Ti is faster while claiming you don't think it's faster.


----------



## ChronoBodi

Quote:


> Originally Posted by *jezzer*
> 
> More RAM doesn't make a card faster; faster RAM does. So if you're saying the only real difference is faster but less RAM, you're kind of saying the Ti is faster while also saying you don't think it's faster.


It takes 11,000 MHz (11 Gbps effective) GDDR5X to hit 484 GB/s of bandwidth on the Ti's 352-bit bus. The same speed gets you 528 GB/s on the Titan XP's 384-bit bus.

That being said, no one buys a TXP now; the same TXP performance I've had (OCed to 2 GHz boost) now comes in the 1080 Ti out of the box.

Why would the Nvidia CEO say the 1080 Ti is faster than the TXP when they're both 3584 cores? Simple: they overclocked it. Nothing more, nothing less.
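Those bandwidth figures fall straight out of the effective data rate and bus width; a quick, purely illustrative sanity check of the arithmetic in this post:

```python
# Memory bandwidth (GB/s) = effective data rate (Gbps per pin) * bus width (bits) / 8 bits per byte
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

# 11,000 MHz effective GDDR5X = 11 Gbps per pin
print(bandwidth_gb_s(11, 352))  # 1080 Ti's 352-bit bus -> 484.0
print(bandwidth_gb_s(11, 384))  # Titan XP's 384-bit bus -> 528.0
```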


----------



## Slushpup

44 more GB/s is not a big difference, and we don't know if the 11GB of RAM can be pushed higher. We do know the RAM on the Titan can be.


----------



## ChronoBodi

Quote:


> Originally Posted by *Slushpup*
> 
> 44 more GB/s is not a big difference, and we don't know if the 11GB of RAM can be pushed higher. We do know the RAM on the Titan can be.


It might be that it's basically an overclocked Titan XP on both core and memory, minus 32 bits of bus.


----------



## ChronoBodi

Tiled rendering has been a thing since Maxwell; Nvidia just used it as marketing.

David Kanter discovered it.

In fact, I see the tiles when I push my TXP a bit too far and it artifacts.


----------



## jhowell1030

Quote:


> Originally Posted by *ChronoBodi*
> 
> Tiled rendering has been a thing since Maxwell; Nvidia just used it as marketing.
> 
> David Kanter discovered it.
> 
> In fact, I see the tiles when I push my TXP a bit too far and it artifacts.


Yeah, I didn't think that this was new to the 1080ti. Don't all Pascal cards do this?


----------



## ChronoBodi

Quote:


> Originally Posted by *jhowell1030*
> 
> Yeah, I didn't think that this was new to the 1080ti. Don't all Pascal cards do this?


Since Maxwell, yes.


----------



## xTesla1856

Funny how people will tell you not to trust Nvidia's marketing, but as soon as a new product is released they all take it and gobble it up as if it were god's word. The things I've had to listen to today from friends and acquaintances are beyond me.


----------



## ChronoBodi

Quote:


> Originally Posted by *xTesla1856*
> 
> Funny how people will tell you not to trust Nvidia's marketing, but as soon as a new product is released they all take it and gobble it up as if it were god's word. The things I've had to listen to today from friends and acquaintances are beyond me.


Please, do tell.

We all know that the 1080 Ti is essentially an OCed TXP minus 1GB of VRAM and 32 bits of bus.
The thing is, is it easy to explain this to your friends?


----------



## Nicklas0912

Quote:


> Originally Posted by *ChronoBodi*
> 
> It takes 11,000 MHz (11 Gbps effective) GDDR5X to hit 484 GB/s of bandwidth on the Ti's 352-bit bus. The same speed gets you 528 GB/s on the Titan XP's 384-bit bus.
> 
> That being said, no one buys a TXP now; the same TXP performance I've had (OCed to 2 GHz boost) now comes in the 1080 Ti out of the box.
> 
> Why would the Nvidia CEO say the 1080 Ti is faster than the TXP when they're both 3584 cores? Simple: they overclocked it. Nothing more, nothing less.


Where do you get 2000 MHz pre-overclock? The boost clock is 1580; it can be overclocked to 2 GHz, as you saw in the demo.

The Titan still has more ROPs and bus width, which might come into play.


----------



## ChronoBodi

Quote:


> Originally Posted by *Nicklas0912*
> 
> Where do you get 2000 MHz pre-overclock? The boost clock is 1580; it can be overclocked to 2 GHz, as you saw in the demo.
> 
> The Titan still has more ROPs and bus width, which might come into play.


I did the OC myself. The Nvidia CEO is relying on the fact that most people outside this site have no clue how to OC, nor what the specs are. All they go on is the name of the card and fancy marketing.

This is my TXP, OCed to essentially slightly better than 1080 Ti specs.


----------



## EniGma1987

Quote:


> Originally Posted by *jhowell1030*
> 
> Yeah, I didn't think that this was new to the 1080ti. Don't all Pascal cards do this?


Yes. The thing that is "new" for the Ti is not tile-based rendering, but rather a software-based cache for the tile-based rendering.

Quote:


> Originally Posted by *Gunslinger.*
> 
> CLP is terrible for LN2


Well, yeah. Nobody was saying to use Liquid Pro for LN2, though.


----------



## Dagamus NM

We all knew this was going to happen. Meh, it just means Volta is that much closer: six months or less, I assume. Then everybody will cry that they bought a 1080 Ti only to have the 1180, or whatever they call it, come out six months later. This is the business model.

For people that are hurt about Ryzen and have an X99 setup, that is just absurd.

I am not sure if I want to replace my 780 Tis with 1080 Tis or wait for Titan Volta. Probably the latter.

I am glad to see AMD competing. Innovation would be nice, but at least we are getting competition.

Skylake-E and a Titan Volta sound like a winning combination. Maybe EVGA will have a 1600W P3 model by then.


----------



## ChronoBodi

Quote:


> Originally Posted by *Dagamus NM*
> 
> We all knew this was going to happen. Meh, just means that Volta is that much closer. 6 months or less I assume. Then everybody is going to cry that they bought a 1080Ti only to have the 1180 or whatever they call it come out six months later. This is the business model.
> 
> For people that are hurt about Ryzen and have an x99 setup, that is just absurd.
> 
> I am not sure if I want to replace my 780Tis with 1080Tis or wait until Titan Volta. Probably the latter.
> 
> I am glad to see AMD competing. Innovation would be nice, but at least we are getting competition.
> 
> Skylake-E and Titan Vega sound like a winning combination. Maybe eVGA will have a 1600W P3 model by then.


I mean, I'm getting Ryzen myself for my TV rig; it would be awesome.

Otherwise, GPU tech advances; it's all in the timing of when you buy that makes your money worth it. Of course, this varies from person to person.

I feel the *70 and *Ti models tend to be the best FPS/$ cards on the Nvidia side of the camp.

I mean, I had a 980 Ti before, but an OG Titan before that as well. I'm weird like that.

That being said, I'm getting a Volta *Ti next time.


----------



## Artah

Bad timing for me; I was getting ready to sell my two TXPs with EK water blocks, lol. I guess I can sell them for $699 each if I get desperate.


----------



## stocksux

I need benchmarks!! I need to see 1080 Ti overclockability! I'm days away from the refund window expiring on my Titan...


----------



## stryker7314

Quote:


> Originally Posted by *Artah*
> 
> Bad timing for me; I was getting ready to sell my two TXPs with EK water blocks, lol. I guess I can sell them for $699 each if I get desperate.


Let me know if you are before the 10th.


----------



## xTesla1856

Quote:


> Originally Posted by *Lobotomite430*
> 
> Your Titan runs at 2114mhz? Is that part of the lottery or did you do something?


It actually hits 2152MHz before downclocking 2-3 steps, where it hovers in game between 2114 and 2126. Yeah, luck is always involved; this time it was on my side.


----------



## ChronoBodi

Quote:


> Originally Posted by *stocksux*
> 
> I need benchmarks!! I need to see 1080 Ti overclockability! I'm days away from the refund window expiring on my Titan...


It's the same GP102, 3584 cores.

Basically, apply +200 on core and +500 on memory to a TXP, cut off 1GB of VRAM and 32 bits of bus to go from 384-bit to 352-bit, and bam: 1080 Ti.

That being said, there won't be much OC headroom on the 1080 Ti. The only reason we can OC the TXP so high is that untapped headroom, which Nvidia used to make the 1080 Ti and claim it's faster than the TXP despite both being 3584 cores.
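For what it's worth, at stock clocks the trade goes the other way: the Ti's faster 11 Gbps memory slightly outruns the Titan XP's 10 Gbps on the wider bus. A quick illustrative sketch using the launch memory specs (bandwidth = data rate x bus width / 8):

```python
# Launch memory specs: Titan X Pascal = 10 Gbps GDDR5X on 384-bit, GTX 1080 Ti = 11 Gbps on 352-bit
cards = {
    "Titan X Pascal": {"mem_gbps": 10, "bus_bits": 384},
    "GTX 1080 Ti":    {"mem_gbps": 11, "bus_bits": 352},
}

for name, c in cards.items():
    bw = c["mem_gbps"] * c["bus_bits"] / 8  # GB/s
    print(f"{name}: {bw:.0f} GB/s")  # 480 vs 484: the narrower bus is offset by faster memory
```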


----------



## Gunslinger.

So is there any boost with these HB SLI bridges vs. the standard bridges as far as benchmarking goes?

Inquiring minds want to know.


----------



## mbze430

Well, I have 2 Titan XPs... I think I might jump ship to Vega or wait until Volta comes out. Going to 1080 Tis would just put me deeper in the hole, lol.

BUT... I was thinking of putting a 1080 Ti in my HTPC instead. Idk... maybe just get all Voltas... so hard to decide now.


----------



## Nitemare3219

Quote:


> Originally Posted by *ChronoBodi*
> 
> It's the same GP102, 3584 cores.
> 
> Basically, apply +200 on core and +500 on memory to a TXP, cut off 1GB of VRAM and 32 bits of bus to go from 384-bit to 352-bit, and bam: 1080 Ti.
> 
> That being said, there won't be much OC headroom on the 1080 Ti. The only reason we can OC the TXP so high is that untapped headroom, which Nvidia used to make the 1080 Ti and claim it's faster than the TXP despite both being 3584 cores.


Are you 100% sure about that?

NVIDIA claims the TDP is now 220 watts thanks to the upgraded power delivery. Won't that increase OC headroom above the 2.1 GHz or so where all other cards seem to flop, or will they still hit a wall there?

I'm really trying to decide between picking up 2 new 1080 Tis or 2 used Titan XPs for about the same price... I game in 4K and want the best, and some of the 1080 Ti's upgrades, such as the better power delivery and the better cooler (which may negate the need for liquid cooling), make me want to consider it instead. At the same time, the additional ROPs, wider memory bus, and extra VRAM on the Titan X seem appealing too.


----------



## DooRules

Quote:


> Originally Posted by *Gunslinger.*
> 
> So is there any boost with these HB SLI bridges vs. the standard bridges as far as benchmarking goes?
> 
> Inquiring minds want to know.


I didn't notice any difference, other than GPU-Z telling me I had a high-bandwidth SLI link, lol.


----------



## ChronoBodi

Quote:


> Originally Posted by *Nitemare3219*
> 
> Are you 100% sure about that?
> 
> NVIDIA claims the TDP is now 220 watts due to the upgraded power delivery system. Won't that increase headroom for OC at all above 2.1 GHz or so where all other cards seem to flop, or will they still hit a wall there?
> 
> I'm really trying to decide between picking up 2 new 1080 Ti, or 2 used Titan XP's for about the same price... I game in 4K and want the best, but some of the upgrades of the 1080 Ti such as the power delivery being better, and the better cooler (may negate the need for liquid cooling), make me want to consider it instead. At the same time, the additional ROPs, wider memory bus, and extra VRAM on the Titan X seem appealing too.


The reduced TDP can be attributed to the missing 32 bits of bus and the 1GB module they left off to make the 1080 Ti.

Take a look at custom 1080s and 1070s: a lot of them hit the same 2100-2150 MHz wall, no matter the phases or cooling. Even the blowers can OC right up there too, give or take 25-50 MHz versus the better cooling of open-air coolers, but that's about it.


----------



## Menthol

Quote:


> Originally Posted by *Gunslinger.*
> 
> So is there any boost with these HB SLI bridges vs. the standard bridges as far as benchmarking goes?
> 
> Inquiring minds want to know.


I think if you double up flexible bridges or use one of the pre-HB LED type from EVGA or MSI, there is no difference. If you are going 3- or 4-way SLI, you need an EVGA or MSI LED bridge; the LED bridges have contacts on both sides of the connector, essentially the same as HB. If you don't, you get a warning in the Nvidia control panel and it may not let you SLI the cards.


----------



## NemChem

So is it 220W or 250W? I thought it was 220W, but I was corrected; the NVIDIA site said 250W, yet IIRC the stream said 220W... Different sites are reporting different numbers.
Quote:


> Originally Posted by *stocksux*
> 
> I need benchmarks!! I need to see 1080 Ti overclockability! I'm days away from the refund window expiring on my Titan...


Start your return process; that's what I've done. I got mine on the 8th of Feb. You have 30 days to get it to them once the RMA is open, so if you don't send it back, the RMA just expires. Good luck with NVIDIA CS at Digital River, though... asking me for info I already gave them and writing a mishmash of generic responses.

The return site gave me a "physical product return reference #" and an address to send it to, saying "a copy of this will be sent to your email along with instructions". Then the email included none of that and just said to contact CS, which I did, but they say a return is already in progress, yet they'll "happily process this return for me" - which is it? Lol... they completely ignored my question asking whether the physical return reference number is the RMA number and whether I can go ahead and send it to the address given.

I had a nightmare with Digital River and the HTC Vive; it seems they're still just as bad!


----------



## stocksux

Quote:


> Originally Posted by *NemChem*
> 
> So is it 220W or 250W? I thought it was 220W but I was corrected and the NVIDIA site said 250W yet iirc the stream said 220W... Different sites reporting different numbers.
> Start your return process, that's what I've done, got mine on the 8th Feb. You have 30 days to get it to them once the RMA is open so if you don't send it back the RMA just expires. Good luck with NVIDIA CS at Digital River though... asking me for info I already gave them and writing a mishmash of generic responses.
> 
> The return site gave me a "physical product return reference #" and address to send it to, saying "a copy of this will be sent to your email along with instructions", then the email included none of that and just says contact CS, which I did, but they say a return is in progress already, yet they'll "happily process this return for me" - which is it? Lol... completely ignored my question asking if the physical return reference number is the RMA number and if I can go ahead and send it to the address given.
> 
> Had a nightmare with Digital River and the HTC Vive, seems they're still just as bad!


Yeah I'll do that. Always better to have options.


----------



## Slushpup

At the speeds my card runs at, the 1080 Ti will need silicon just as good.

(Old build btw)

http://www.3dmark.com/fs/9938679


----------



## TheFallenDeity

So after having my Titan XP for 3 weeks, I've opened a return request for a refund. With the 1080 Ti offering essentially the same performance for $500 less, I could use the extra money for build upgrades. I just happened to buy a TXP at the wrong time (or the right time, for a refund). I figured the Ti would be more cut down from the Titan (3328 cores, 12GB of G5 memory, etc.); since it is practically the same, there's no sense in keeping the Titan. I could use a new SSD and new Noctua case fans, which the extra money will cover.

It was a short but sweet marriage. Anyone who bought a Titan in the last 30 days should return theirs and get the Ti, imo.


----------



## carlhil2

Nvidia is calling the 1080 Ti the fastest GPU because of the stock clocks it will come with. Out of the box it will, in fact, be faster than the TXP. I needs me 2 of these... unless I can find a decently priced TXP.


----------



## stryker7314

Quote:


> Originally Posted by *carlhil2*
> 
> Nvidia is calling the 1080 Ti the fastest GPU because of the stock clocks it will come with. Out of the box it will, in fact, be faster than the TXP. I needs me 2 of these... unless I can find a decently priced TXP.


We should start seeing used TXPs around $750 in the for-sale section soon. $50 for the extra gig of RAM.


----------



## MunneY

Well,

I sold my TXP and got $1,100 for it.

Time to buy 2 Ti's!


----------



## octiny

Quote:


> Originally Posted by *MunneY*
> 
> Well,
> 
> I sold my TXP and got $1,100 for it.
> 
> Time to buy 2 Ti's!


Congrats!

Lol @ people thinking they're going to go for $650-750. Titans have always had the best resale value, and it'll still be the fastest card OC for OC. I'd put money on that.

Shoot, Titan XMs sold at $750+ all the time until the Ti announcement, even though people were selling new 1080s on eBay at $500 a pop.

So for anyone thinking of selling their XP for two Tis, don't take less than $1,000. I've seen plenty sell for that price on eBay on my watch-list in the last 12 hours.


----------



## animeowns

Quote:


> Originally Posted by *octiny*
> 
> Congrats!
> 
> Lol @ people thinking they're going to go for $650-750. Titans have always had the best resale value, and it'll still be the fastest card OC for OC. I'd put money on that.
> 
> Shoot, Titan XMs sold at $750+ all the time until the Ti announcement, even though people were selling new 1080s on eBay at $500 a pop.
> 
> So for anyone thinking of selling their XP for two Tis, don't take less than $1,000. I've seen plenty sell for that price on eBay on my watch-list in the last 12 hours.


Nice. I'm getting enough performance with 3 Titan XPs; I can't see myself buying a 1080 Ti unless it can run 8K resolution. Dell is releasing that 32-inch behemoth next month, and it will be an excellent purchase to play with new video cards. Maybe AMD Vega will be better suited for this.


----------



## Maintenance Bot

Quote:


> Originally Posted by *MunneY*
> 
> Well,
> 
> I sold my TXP and got $1,100 for it.
> 
> Time to buy 2 Ti's!


Pre orders start tomorrow morning.


----------



## lilchronic

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Pre orders start tomorrow morning.


What time is considered morning?


----------



## MunneY

Quote:


> Originally Posted by *lilchronic*
> 
> What time is considered morning.


8 AM PST.

That's 10 AM for me and 11 for you.


----------



## lilchronic

Quote:


> Originally Posted by *MunneY*
> 
> 8 AM PST.
> 
> That's 10 AM for me and 11 for you.












I just sold my Titan X Maxwell for $600 with the water block. Now I need a Titan X Pascal block.


----------



## Nitemare3219

Quote:


> Originally Posted by *octiny*
> 
> Congrats!
> 
> Lol @ people thinking they're going to go for $650-750. Titans have always had the best resale value, and it'll still be the fastest card OC for OC. I'd put money on that.
> 
> Shoot, Titan XMs sold at $750+ all the time until the Ti announcement, even though people were selling new 1080s on eBay at $500 a pop.
> 
> So for anyone thinking of selling their XP for two Tis, don't take less than $1,000. I've seen plenty sell for that price on eBay on my watch-list in the last 12 hours.


The cards are no longer worth $1,000, sorry dude. The only things the TXPs have going for them are the wider memory bus (which could be irrelevant if the 1080 Tis can hit 12 GHz on their memory), a few more ROPs, and 1GB of extra VRAM... that's it. The 1080 Tis have newer hardware (supposedly) in terms of power delivery, the G5X memory is newer/refined, and the cooler is redesigned.


----------



## octiny

Quote:


> Originally Posted by *Nitemare3219*
> 
> The cards are no longer worth $1000, sorry dude. There is practically zero advantage (and in fact some disadvantages) over the new 1080 Ti.


LOL, not even close. The Titan X is a multi-purpose GPU... which is why its resale value is always higher. It'll be the same for the XP, just as it was for the Titan Black and Titan XM.

The 1080 Ti has less RAM, a narrower bus, and fewer ROPs. The only advantage is a higher overclock at default; all XPs hit 11 GHz on the RAM with ease. Tis will hit the same wall as all the other Pascal cards.

Sorry dude, but while you may not think it's worth that much, factual history disagrees with you.

Edit: It sounds like you're buying into the NVIDIA marketing hype, lol. We'll just wait for some reviews. In the meantime, don't expect XP prices to fall very much, regardless of whether it's worth a certain price or not.


----------



## Nitemare3219

Quote:


> Originally Posted by *octiny*
> 
> LOL, not even close. The Titan X is a multi-purpose GPU... which is why its resale value is always higher. It'll be the same for the XP, just as it was for the Titan Black and Titan XM.
> 
> The 1080 Ti has less RAM, a narrower bus, and fewer ROPs. The only advantage is a higher overclock at default; all XPs hit 11 GHz on the RAM with ease. Tis will hit the same wall as all the other Pascal cards.
> 
> Sorry dude, but while you may not think it's worth that much, factual history disagrees with you.


You seem like you're in denial. Multi-purpose GPU - really? How so? It lacks double precision. It is BARELY a leader in VRAM anymore (11 GB vs 12 GB is insignificant). The Titan XM was a leader until the Titan XP came out, but it continued to maintain resale value because no other cards had 12 GB of VRAM (or anything really close). Same story for Titan Black... nothing significant came along until Titan X, and the Titan Black remained relevant until the 980 Ti came out.

To sit here and think the Titan XP still has a high resale value is laughable when the 1080 Ti will deliver just as much performance, if not more, depending on how the power delivery is being handled now. And even NVIDIA themselves showed during the show how the 1080 Ti's 11 GHz memory was superior to the Titan XP running at 11 GHz as well.

Resale has already tanked. I already bought 1 Titan XP for $800. NVIDIA has never seriously slapped Titan owners in the face until now. A SAME GENERATION TI CARD is essentially matching a Titan (of the same generation) for the first time ever.


----------



## octiny

Quote:


> Originally Posted by *Nitemare3219*
> 
> You seem like you're in denial. Multi-purpose GPU - really? How so? It lacks double precision. It is BARELY a leader in VRAM anymore (11 GB vs 12 GB is insignificant). The Titan XM was a leader until the Titan XP came out, but it continued to maintain resale value because no other cards had 12 GB of VRAM (or anything really close). Same story for Titan Black... nothing significant came along until Titan X, and the Titan Black remained relevant until the 980 Ti came out.
> 
> To sit here and think the Titan XP still has a high resale value is laughable when the 1080 Ti will deliver just as much performance, if not more, depending on how the power delivery is being handled now. And even NVIDIA themselves showed during the show how the 1080 Ti's 11 GHz memory was superior to the Titan XP running at 11 GHz as well.
> 
> Resale has already tanked. I already bought 1 Titan XP for $800.


Seems like you're in denial of history.

Yes, because 4GB of a TXM over a GTX 1080 is worth $250? Especially when the TXM is the slower card?

I'm sure you bought one from someone you were trying to convince to sell it for that much. You're one of those, I see.

I've seen plenty sell for $1,000+ so far, not to mention a guy a few posts back just sold his for $1,100 just now.

So as I've said, you can think what you want about whether it's worth that price or not... but in the end, it doesn't matter what you or OCN thinks. Until Nvidia lowers their $1,200 asking price, the market value won't fluctuate much.

Also, I'm not sure if you've noticed, but the XP has been out for 9 months. I would hope the Ti would at least offer similar performance.

FYI, I couldn't care less about the pricing on the Ti; I got my cards for half off. What grinds my gears are people on OCN trying to artificially deflate the resale value of the XP, so they scare people into selling their XP's for much cheaper than what they could actually get. The same exact thing happened with the Titan XM.

Edit:

A 1-day Titan XP auction just ended at $983, around the $1,000 range as I've been saying; auctions usually save a few bucks versus "buy it now". I've watched a ton of "buy it nows" get snatched up at $1,000+ since the Ti reveal. They may not sell instantly, but they no doubt sell. On a side note, I sure hope Nvidia has a large inventory of Tis, because I can already picture scalpers snatching them all up, similar to the 1080s/1070s, and profiting large.


----------



## pez

Quote:


> Originally Posted by *Lobotomite430*
> 
> I guess im not too sad as I still dont know many games that like xfire or SLI enough to justify two cards. Ive been pretty happy running 60fps on my 3440x1440 since August 2nd


Yeah. I'm in the same boat. I want my Titan performance back, but not at the cost of the FE cooler....and watercooling just wasn't an option yet as I wanted to go with an official AIO if possible (no space or desire to WC in my current case). So getting Titan-ish performance with a cooler that's no louder than my 1080 ACX 3.0? Sign me up. I think the Titan XP and now the 1080 Ti are a perfect match for 21:9 1440p.


----------



## jhowell1030

Quote:


> Originally Posted by *octiny*
> 
> What grinds my gears are people on OCN trying to artificially deflate the resale value of the XP, so they scare people into selling their XP's for much cheaper than what they could actually get. The same exact thing happened with the Titan XM.


Couldn't agree more


----------



## jmaz87

What's the best HB bridge for EK full-cover blocks? This was another complete letdown; I never expected EK not to make an HB bridge for the Titan XP... probably the biggest reason I haven't gotten a second one by now. I'm a bit hesitant to buy a used GPU this expensive, but we'll see how the market changes, I suppose.


----------



## gamingarena

Quote:


> Originally Posted by *octiny*
> 
> What grinds my gears are people on OCN trying to artificially deflate the resale value of the XP, so they scare people into selling their XP's for much cheaper than what they could actually get. The same exact thing happened with the Titan XM.


Exactly. That's what happened when the 1080 came out: people panicked and sold TXMs for dirt cheap (because of the same scare tactics I was seeing on kijiji.ca). I picked up 5 Titan XMs at $500-550 USD on my local market (kijiji.ca) in Toronto, and just a month later, after the craze died, sold each one for $800-900 USD on eBay. Titans will sell, and they always keep their value.

I can bet anything you want that when Volta shows up, the Ti will drop to $350-400 at best, but the Titan XP will still fetch $600-800. It's been that way for the last 4 years, like a Swiss watch, even though they're virtually identical cards.


----------



## octiny

Quote:


> Originally Posted by *jhowell1030*
> 
> Couldn't agree more











Quote:


> Originally Posted by *gamingarena*
> 
> Exactly. That's what happened when the 1080 came out: people panicked and sold TXMs for dirt cheap (because of the same scare tactics I was seeing on kijiji.ca). I picked up 5 Titan XMs at $500-550 USD on my local market (kijiji.ca) in Toronto, and just a month later, after the craze died, sold each one for $800-900 USD on eBay. Titans will sell, and they always keep their value.
> 
> I can bet anything you want that when Volta shows up, the Ti will drop to $350-400 at best, but the Titan XP will still fetch $600-800. It's been that way for the last 4 years, like a Swiss watch, even though they're virtually identical cards.


Nailed it


----------



## Silent Scone

Without double precision the TX wasn't worth $1,100 when it came out, let alone now. I bought one because I wanted the best, and as "history" dictates, a more affordable variant gets dropped by NVIDIA a few months later.

No point crying over spilt milk. In this industry, if you want the best, your return on investment is limited.


----------



## xTesla1856

Quote:


> Originally Posted by *Silent Scone*
> 
> Without double precision the TX wasn't worth 1100 when it came out, let alone now. I bought one because I wanted the best, and as "history" dictates, a more affordable variant is dropped by NVIDIA a few months later.
> 
> No point crying over spilt milk. In this industry if you want the best, your return on investment is limited.


Return on investment? What's that? Never heard of it.


----------



## piee

The TXP is the first card to run 4K/60fps after years of waiting for one card to do 4K/60, on the 16nm node; it's been 4K/60 gaming bliss. The next substantial increase will come with the 10/7nm nodes, just like the 28nm-to-16nm jump. Also, a Titan Black-style refresh with all 3840 cores, like the Quadro P6000, would be next. The TXP is solid; I get 1860 MHz boost in games with no OC.


----------



## pez

The TXP runs 4K/60fps on select titles at select settings. I can also make a 1070 do that.


----------



## Benny89

To Titan XP owners (since I'm going to buy a 1080 Ti): OCed, does it max out 144Hz/165Hz at 1440p?

I think I won't go 4K until Volta drops, so one card can get more than 60fps in 4K. I will never go back to 60Hz after tasting 100+ fps in games.


----------



## pez

Quote:


> Originally Posted by *Benny89*
> 
> To Titan XP owners (since I'm going to buy a 1080 Ti): OCed, does it max out 144Hz/165Hz at 1440p?
> 
> I think I won't go 4K until Volta drops, so one card can get more than 60fps in 4K. I will never go back to 60Hz after tasting 100+ fps in games.


What games?


----------



## Benny89

Quote:


> Originally Posted by *pez*
> 
> What games?


You know, AAA games in general: RoTR, MGS, Witcher, Deus Ex, BF1, etc. Just in general.


----------



## pez

Quote:


> Originally Posted by *Benny89*
> 
> You know AAA games in general, RoTR, MGS, Witcher, Deus Ex, BF1 etc. Just in general


For those, you could essentially check reviews. Not a whole lot of AAA titles run at 144fps maxed out on any GPU. For me, though, I'm only concerned with crazy FPS in competitive shooters.


----------



## Dagamus NM

Quote:


> Originally Posted by *Nitemare3219*
> 
> You seem like you're in denial. Multi-purpose GPU - really? How so? It lacks double precision. It is BARELY a leader in VRAM anymore (11 GB vs 12 GB is insignificant). The Titan XM was a leader until the Titan XP came out, but it continued to maintain resale value because no other cards had 12 GB of VRAM (or anything really close). Same story for Titan Black... nothing significant came along until Titan X, and the Titan Black remained relevant until the 980 Ti came out.
> 
> To sit here and think the Titan XP still has a high resale value is laughable when the 1080 Ti will deliver just as much performance, if not more, depending on how the power delivery is being handled now. And even NVIDIA themselves showed during the show how the 1080 Ti's 11 GHz memory was superior to the Titan XP running at 11 GHz as well.
> 
> Resale has already tanked. I already bought 1 Titan XP for $800. NVIDIA has never seriously slapped Titan owners in the face until now. A SAME GENERATION TI CARD is essentially matching a Titan (of the same generation) for the first time ever.


Yep. This is as close to a Titan copy as they have come. People thought the original Titan got slapped by the 780 Ti, but they kept its VRAM at half of the Titan's, just as they did with Maxwell and the 980 Ti. Puzzling to see the 11GB/352-bit bus; seems lazy by Nvidia at best.

Makes me wonder what AMD has up their sleeve. Not that I really care; my Titans will keep cruising along, and I won't buy more. The 1080 Ti looks like a winner. I can't say I wouldn't swap these TXPs for Tis if I were in the early post-purchase window.

I wonder how many Titan X Pascal cards have sold since January 1st. I have to imagine the market of buyers was pretty well tapped; the 1080 Ti really opens it up for Nvidia. I also wonder if Asus will make a proper Matrix card this time and actually release it to market, unlike the limited 980 Ti variant.


----------



## Benny89

Quote:


> Originally Posted by *pez*
> 
> For those you could essentially check reviews. Not a whole lot of AAA titles are running on any GPU at 144fps maxed out. However, for me, I'm only concerned with crazy FPS for any competitive FPS games.


I understand but I prefer if all games I play are butter smooth and 85+ fps looks, move and feels sooooo goood. Its like having to put cold butter on your sandwitch instead of room temperature soft butter









I love high graphic settings, but without high fps I feel like I watch slide shows. Slides looks stunning but experience is ruined by that low fps. I prefer balance between those two, hence I will switch to 4K only when I will be able to get at least 85+ fps








But to each his own; the most important thing is to enjoy gaming.


----------



## stocksux

So on exactly the 30th day of owning a Titan X, Nvidia has granted a refund of said Titan X. Every situation will be different from person to person, but here is what made my decision. I've never owned Nvidia's top-tier card, simply because of cost. I only game on my machine; there's nothing else I do with it. I purchased the Titan X simply because I wanted to finally own top tier and experience it.

I understand the 1080 Ti is essentially a slightly cut-down Titan X that they overclocked, with marketing magic doing the rest. But for the dollar-per-performance and my needs specifically, the 1080 Ti is a better fit. The games I currently play also support SLI, and that is now within reach for me with only a $200 difference between a single Titan X and SLI 1080 Tis. I've never owned an SLI setup before. Yes, I've read numerous complaints about lack of support and fewer games using it now. However, of the 7 or 8 games I play, all but one fully support SLI, and the one that doesn't is an MMORPG that can run on a 5-year-old card.

I'm also within my full refund period on the Titan X. Had I not been, things would be different; it would have been tough to sell it at a loss so soon into ownership. So for what it's worth, that's my 2 cents.


----------



## Nitemare3219

Quote:


> Originally Posted by *octiny*
> 
> Seems like you're in denial with history
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes, because 4GB of a TXM over a GTX 1080 is worth $250? Especially when the TXM is the slower card?
> 
> I'm sure you bought one from someone trying to convince them to sell it for that much. You're one of those I see.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've seen plenty sell for $1000+ so far, not to mention a guy few posts back just sold his for $1100 just now.
> 
> So as I've said, you can think all you want when it comes to whether it's worth that price or not......but in the end, it doesn't matter what you or OCN thinks. Until Nvidia lowers their $1200 asking price, the market value won't fluctuate much.
> 
> Also, I'm not sure if you've noticed, but the XP has been out for 9 months. I would hope it would at least offer similar performance
> 
> 
> 
> 
> 
> 
> 
> 
> 
> FYI, I could care less about the pricing on the TI, I got my cards for half off. What grinds my gears are people on OCN trying to artificially deflate the resale value of the XP, so they scare people into selling their XP's for much cheaper than what they could actually get. The same exact thing happened with the Titan XM.
> 
> Edit:
> 
> 1-day Titan XP auction which just ended for $983. Around the $1000 dollar range as I've been saying, Auctions usually save a few bucks versus "buy it now". I've watched a ton of "buy it nows" get snatched up at $1000+ since the Ti reveal. They may not sell instantly, but they no doubt sell. On a side note, I sure do hope Nvidia has a large inventory of Ti's, because I can already picture scalpers snatching them all up similar to the 1080/1070's and profiting large.


You're hilarious.

TXM really wasn't any slower than a 1080 if it was overclocked to its limit. Additionally, yes, people were paying the added premium for the extra 4 GB of memory - probably for work applications.

I'm one of those? Who are you to call me anything? The seller actually posted asking for $850 on Reddit. I countered at $750. He countered at $800, and I accepted. It doesn't take a genius to realize that the TXP's value is no longer $1200 - are you really that blind? NOBODY is going to buy a new TXP from NVIDIA anymore for $1200 when they can get the SAME PERFORMANCE, if not better, from a 1080 Ti, for 60% of the price. What NVIDIA lists the card at is MEANINGLESS because they have another card for half the price with all the performance.

TXP has been out for 7 months, not 9 months, but you're pretty bad at math as we have seen.

I'm not trying to artificially deflate a damn thing. I don't have to. TXP's were selling on eBay for $1,500 before the 1080 Ti (who knows why - probably people who needed more than 2 for some reason, or were international). Now they are selling for $900-$1000 - IS THAT NOT MASSIVE DEFLATION? People can list their cards at $1,100 all damn day - they won't sell.

The one that you posted selling for $983 - cool story man. eBay takes 10% of that, so they basically sold it for under $900. And as I said, the value will continue to drop over the next few days. Just watch.
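For anyone checking the fee math, here's a rough sketch. It assumes a flat 10% final value fee; eBay's actual fee schedule varies by category, and PayPal took its own cut at the time:

```python
# Net proceeds on that $983 auction, assuming a flat 10% final value fee
# (eBay's real fee schedule varies by category -- this is an approximation).
sale_price = 983.00
ebay_fee = sale_price * 0.10
net = sale_price - ebay_fee
print(f"eBay fee: ${ebay_fee:.2f}, seller nets: ${net:.2f}")
# seller nets $884.70 -- "basically under $900", as claimed
```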


----------



## Gunslinger.

^^savage^^


----------



## KillerBee33

I say keep your Titans and just wait for VOLTA next year. I don't get why ppl start sellin' them to get 1080 Tis.








No Regrets here


----------



## meson1

Quote:


> Originally Posted by *KillerBee33*
> 
> I say keep your Titans and just wait for VOLTA next year . I don't get why ppl start sellin' them to get 1080Ti's.
> 
> 
> 
> 
> 
> 
> 
> 
> No Regrets here


The only possible benefit I could see to selling a TXP is to offset the cost of two 1080 Ti's in order to run SLI.

But you would have had to do that last week, before the value of TXPs dropped like a telly from a tower block.

I have a TXP still in its box. Bought it months ago; it's never even been plugged in yet. It half crossed my mind to sell it and get a pair of 1080 Tis, but even a cursory look makes it apparent that I can't recoup enough money from the TXP to make it worthwhile. I'm not upset about it though. It's not like the Titan suddenly became the slowest GPU on Earth overnight; it's still a damn fast piece of kit. So I'm going to stick with plan A and build my rig with the Titan.

After that, my upgrade path is Volta or beyond.


----------



## KillerBee33

Got mine the first day it showed up. Seven months with the best on the market is about what you usually get with any electronics.







Don't see a reason to panic and start selling... within 7 months of the 1080 Ti being fully available, Volta is gonna be introduced. Keep calm, keep your Titans!!!


----------



## meson1

Further to my post above, it occurs to me that there might be an alternative solution. If you have a Titan XP, and want to move to an SLI setup, would it not be better to let the prices of the TXP drop and then pick up a second hand one? Hazards of purchasing pre-owned computing electronics notwithstanding, it might prove to be a better option than swapping a TXP out for two 1080 Ti's. I haven't done the math properly, but the cash outlay to make up the difference doesn't seem to be too far apart between the two options.

Not that I'm considering this myself. But I thought I'd throw the idea out there.


----------



## Nitemare3219

Quote:


> Originally Posted by *octiny*
> 
> Seems like you're in denial with history


In case I needed to prove my point any further, just got offered another TXP for $750 (seller offered me this price).
Quote:


> Originally Posted by *meson1*
> 
> Further to my post above, it occurs to me that there might be an alternative solution. If you have a Titan XP, and want to move to an SLI setup, would it not be better to let the prices of the TXP drop and then pick up a second hand one? Hazards of purchasing pre-owned computing electronics notwithstanding, it might prove to be a better option than swapping a TXP out for two 1080 Ti's. I haven't done the math properly, but the cash outlay to make up the difference doesn't seem to be too far apart between the two options.
> 
> Not that I'm considering this myself. But I thought I'd throw the idea out there.


That is what I would do as well. I was dead-set on pre-ordering 2x 1080 Ti, but it'd cost $1,500 from NVIDIA after tax. I could wait for AIB to launch, but there's no guarantee I'd get 2 cards there either. Instead, I'm buying 2 used TXP's for about the same price as 2 new 1080 Ti's... and with a small amount of luck, I'm sure resale value later on will make up for it.

I've never bought used PC components before (but have sold my old stuff to people plenty of times and heard no complaints) - hopefully there's not some inherent risk I'm not seeing here. As long as the cards look good and function properly when I receive them, I'm good to go aren't I? I mean failure rates are extremely low... biggest downside is no warranty transfer from NVIDIA, so I'd have to rely on the seller for warranty service if I ever needed it.
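The back-of-envelope math behind "about the same price" looks like this (the 7.5% sales tax is my assumption; $699 was the 1080 Ti MSRP, and the used TXP prices are the two deals from this thread):

```python
# Rough cost comparison: two new 1080 Tis at MSRP plus assumed sales tax
# versus the two used Titan XPs bought in this thread.
ti_msrp = 699.00
tax_rate = 0.075                      # assumed local sales tax, not official
new_tis = 2 * ti_msrp * (1 + tax_rate)
used_txps = 810.00 + 750.00           # the two deals mentioned above
print(f"2x new 1080 Ti after tax: ${new_tis:.2f}")
print(f"2x used Titan XP:        ${used_txps:.2f}")
```

With those assumptions the two options land within about $60 of each other, which is why the used route looks attractive if resale holds up.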


----------



## xTesla1856

Quote:


> Originally Posted by *Nitemare3219*
> 
> In case I needed to prove my point any further, just got offered *another TXP for $750* (seller offered me this price).


Damn, for 750 I'd pick it up in a heartbeat


----------



## octiny

Quote:


> Originally Posted by *Nitemare3219*
> 
> You're hilarious.
> 
> TXM really wasn't any slower than a 1080 if it was overclocked to its limit. Additionally, yes, people were paying the added premium for the extra 4 GB of memory - probably for work applications.
> 
> I'm one of those? Who are you to call me anything? The seller actually posted asking for $850 on Reddit. I countered at $750. He countered at $800, and I accepted. It doesn't take a genius to realize that the TXP's value is no longer $1200 - are you really that blind? NOBODY is going to buy a new TXP from NVIDIA anymore for $1200 when they can get the SAME PERFORMANCE, if not better, from a 1080 Ti, for 60% of the price. What NVIDIA lists the card at is MEANINGLESS because they have another card for half the price with all the performance.
> 
> TXP has been out for 7 months, not 9 months, but you're pretty bad at math as we have seen.
> 
> I'm not trying to artificially deflate a damn thing. I don't have to. TXP's were selling on eBay for $1,500 before the 1080 Ti (who knows why - probably people who needed more than 2 for some reason, or were international). Now they are selling for $900-$1000 - IS THAT NOT MASSIVE DEFLATION? People can list their cards at $1,100 all damn day - they won't sell.
> 
> The one that you posted selling for $983 - cool story man. eBay takes 10% of that, so they basically sold it for under $900. And as I said, the value will continue to drop over the next few days. Just watch.


Actually, the TXM is still 10-15% slower even at 1500 MHz, OC for OC, but who's counting, right?

The guy/s on Reddit is a dumb*ss then.

Nobody will buy at $1200? Guess what? People are buying at $1100 used. You need to get over the fact that no one cares a damn thing about what you think regarding the price, nor about the idiots on Reddit enjoying screwing themselves. End of story.

"Same, if not better performance." Nice! Please share these imaginary benchmarks when you're done sipping the Nvidia marketing team's Kool-Aid. OC for OC, the TXP will still be faster, which will still serve as the e-peen premium... which SOME people will pay for, even if it's only 1% better.

What the hell do eBay fees have to do with this? You are creating this hysteria that nobody will buy at the asking price of $1000. It's the complete opposite: they are still selling for that and will continue to sell at that price. Do you want me to list every "buy it now" auction that sells for $1000+? Lol, I mean seriously, you're in denial and it's quite amusing.

The crazy thing is, people are buying them up knowing full well the Ti is coming out. If anything, the resale value will stay put, or may even rise, depending on whether Nvidia can keep the Tis stocked the first couple of months.

Moral of the story, people: if you want to sell something, don't ever sell on Reddit.









Edit:

Newest Titan X listing selling for $1060. I can do this all day.



Btw, the only time they sold at $1500+ was near launch. The $1500+ prices you saw recently are mainly because international buyers couldn't buy the TXP, since Nvidia didn't ship to some countries, so they unfortunately had no choice but to pay a huge premium.


----------



## octiny

delete


----------



## gamingarena

Again, it looks like people don't get it. Nvidia's strong marketing (brainwashing) made the Titan name a recognizable premium brand, so many people think the Ti is a Titan knockoff, or a poor man's Titan. That mindset will stay for a while and nothing can change it in people's minds, so the resale value will always be way better than the Ti's; yes, even if the Ti has the same specs as the Titan. Period.


----------



## octiny

Quote:


> Originally Posted by *gamingarena*
> 
> Again people looks like they dont get it, Nvidia strong marketing (brain washing) made Titan name recognizable premium brand, so many people think Ti is Titan knockoff or poor man s Titan, that mindset will stay for a while and nothing can chnange in peoples mind, so resale value will allways be way better then Ti and yes even if Ti is same specs as Titan period.


+1

It's the prestige brand. Always has been, and always will be.


----------



## CptSpig

Quote:


> Originally Posted by *KillerBee33*
> 
> I say keep your Titans and just wait for VOLTA next year . I don't get why ppl start sellin' them to get 1080Ti's.
> 
> 
> 
> 
> 
> 
> 
> 
> No Regrets here


+1. It's time now to get a second TXP for $700 to $800. No regrets.


----------



## animeowns

Quote:


> Originally Posted by *CptSpig*
> 
> +1 it's time now to get a second TXP for $700.00 to $800.00. No Regrets.


Currently using 3 TXPs; I don't think I need a 4th one. The games I have been playing have been very good with 3-way support, but there are only a few people who actually run 4-way whose benchmarks I can look at and compare against.


----------



## Artah

Quote:


> Originally Posted by *KillerBee33*
> 
> I say keep your Titans and just wait for VOLTA next year . I don't get why ppl start sellin' them to get 1080Ti's.
> 
> 
> 
> 
> 
> 
> 
> 
> No Regrets here


I'm actually wanting to sell my pair to fund a bunch of construction at my house, not to get a 1080 Ti.


----------



## piee

Discussions went on for years about a single card doing 4K/60 fps maxed out; the TXP is the first. The jump from 28nm to 16nm on the TXP is what sold me, and gaming at 4K/60 fps max settings is sweet, lots of eye candy. I get butter-smooth output by locking the card to 60 fps, then turning on vsync with no frame buffering, so no latency. There won't be as big a jump in performance until a new architecture, maybe 10nm or 7nm, which won't come to graphics cards for some years.


----------



## Gunslinger.

Quote:


> Originally Posted by *octiny*
> 
> Actually, TXM is still 10-15% slower even at 1500mhz OC for OC, but who's counting right?
> 
> The guy/s on Reddit is a dumb*ss then.
> 
> Nobody will buy at 1200? Guess what? People are buying at $1100 used. You need to get over the fact that no one cares a god damn thing about what you think regarding the price, nor about the idiots on reddit enjoying screwing themselves. End of story.
> 
> "Same, if not better performance". Nice! Please share these imaginary benchmarks when you're done sipping from Nvidia's marketing teams cool-aid. OC for OC, the TXP will still be faster which will still serve as the e-peen premium......which SOME people will pay for. Even if it's only 1% better.
> 
> What the hell does Ebay fees have to do with this? You are creating this hysteria that nobody will buy at the asking price of $1000. However, it's the complete opposite. They are still selling for that and will continue to sell at that price. Do you want me to list every "buy it now" auction that sells for 1000+? Lol I mean seriously, you're in denial and it's quite amusing.
> 
> The crazy thing, people are buying them up even BEFORE full well knowing the Ti is coming out. If anything, the resale value will stay put, or may rise, depending on whether Nvidia can keep the Ti's stocked the first couple months.
> 
> Moral of the story people. If you want to sell something, don't ever sell on Reddit
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit:
> 
> Newest Titan X listing selling for $1060. I can do this all day.


http://makeagif.com/gif/fpsrussia-flamethrowers-gHy0p2


----------



## CptSpig

Quote:


> Originally Posted by *Gunslinger.*
> 
> http://makeagif.com/gif/fpsrussia-flamethrowers-gHy0p2


+1 Please stop!


----------



## Nitemare3219

Quote:


> Originally Posted by *xTesla1856*
> 
> Damn, for 750 I'd pick it up in a heartbeat


Just got the invoice and paid it. So I paid $810 and $750 shipped for 2 Titan XP's in the past 24 hours.


----------



## Slushpup

Nice deals man!


----------



## KillerBee33

@ Artah
That makes more sense than the Ti.


----------



## jsutter71

Quote:


> Originally Posted by *stocksux*
> 
> Yes the cards thermal throttle. We all know that. Most people are able to OC into the 2000-2100 range and stay there on water. That's not a big deal. The claim was that the voltage would be unlocked on the ti thus allowing for even faster clocks. If voltage doesn't go up you will become capped from clock speeds. Also at 2278 consider yourself lucky at winning the silicon lottery


What are your settings to get to this level? I'm running in SLI under water.


----------



## jsutter71

Quote:


> Originally Posted by *jmaz87*
> 
> whats the best HB bridge for EK full blocks? this was another complete letdown. I never expected EK to not make an HB bridge for the Titan XP... probably the largest reason for me not getting a 2nd one by now.... I'm a bit hesitant to buy a used gpu so expensive but we will see how the market changes i suppose


I assume when you say full blocks you're talking about an FC terminal. If so, get an EVGA HB bridge.


----------



## DooRules

Who knew the Titan forum could be so entertaining, love it.









@ gunslinger... keep the fire on full blast







great stuff


----------



## chantruong

Which subreddit are you guys finding Titan XPs on for $700 to $800? I might want to pick up another one.









Also, any chance Nvidia lowers the MSRP on it?


----------



## pez

Quote:


> Originally Posted by *Benny89*
> 
> I understand but I prefer if all games I play are butter smooth and 85+ fps looks, move and feels sooooo goood. Its like having to put cold butter on your sandwitch instead of room temperature soft butter
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I love high graphic settings, but without high fps I feel like I watch slide shows. Slides looks stunning but experience is ruined by that low fps. I prefer balance between those two, hence I will switch to 4K only when I will be able to get at least 85+ fps
> 
> 
> 
> 
> 
> 
> 
> .
> 
> But each to his own, most important thing is to enjoy gaming


No, I definitely agree. You made it sound like you wanted 144+ and no less in every game. There are a lot of games where you'll get that at 1440p. I get 85+ on BF1 at full tilt, so I don't doubt that's doable.
Quote:


> Originally Posted by *Nitemare3219*
> 
> Just got the invoice and paid it. So I paid $810 and $750 shipped for 2 Titan XP's in the past 24 hours.


Cool...?

I'm glad you exploited Reddit and its generally clueless user base to get a good deal... can we move on?


----------



## Nitemare3219

Quote:


> Originally Posted by *pez*
> 
> Cool...?
> 
> I'm glad you exploited Reddit and it's generally clueless user base to get a good deal...can we move on?


I'm not sure how that's exploiting. I'd say it was a very fair price, although the 2nd XP at $750 shipped was a surprisingly good deal.
Quote:


> Originally Posted by *chantruong*
> 
> Which subreddit are you guys finding Titan XPs for $700 to $800. I might want to pick up another one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also any chance Nvidia lowers the MSRP on it?


I literally just searched /r/hardwareswap for "Titan" and kept checking posts in the last 24 hours. Someone posted up for $850 and I got it for $810. And as for the $750 one I snagged, I actually messaged eBay sellers asking to deal direct... most are more than happy to take direct payment and avoid eBay's atrocious 10% fees.


----------



## CptSpig

Quote:


> Originally Posted by *Nitemare3219*
> 
> I'm not sure how that's exploiting. I'd say it was a very fair price, although the 2nd XP at $750 shipped was a surprisingly good deal.
> I literally just searched /r/hardwareswap for "Titan" and kept checking posts in the last 24 hours. Someone posted up for $850 and I got it for $810. And as for the $750 one I snagged, I actually messaged eBay sellers asking to deal direct... most are more than happy to take direct payment and avoid eBay's atrocious 10% fees.


There are fourteen on eBay right now, all under $800.00. One at $720.00 with six hours left. Someone better jump on that!


----------



## pez

Quote:


> Originally Posted by *Nitemare3219*
> 
> I'm not sure how that's exploiting. I'd say it was a very fair price, although the 2nd XP at $750 shipped was a surprisingly good deal.
> I literally just searched /r/hardwareswap for "Titan" and kept checking posts in the last 24 hours. Someone posted up for $850 and I got it for $810. And as for the $750 one I snagged, I actually messaged eBay sellers asking to deal direct... most are more than happy to take direct payment and avoid eBay's atrocious 10% fees.


Maybe exploit was the wrong word. Either way, the panicking bunch selling their TXPs are silly to do so, and getting them cheap is great... but in the grand scheme of things, the topic is a bit silly.


----------



## BigMack70

What sort of clocks are common for custom BIOS under water for Titan XP?


----------



## KillerBee33

Quote:


> Originally Posted by *BigMack70*
> 
> What sort of clocks are common for custom BIOS under water for Titan XP?


Custom BIOS for Pascal?


----------



## BigMack70

Quote:


> Originally Posted by *KillerBee33*
> 
> Custom BIOS for Pascal?


Nobody managed to edit the Pascal BIOS? Interesting... and disappointing after so much was done with Maxwell.

Thanks.


----------



## MunneY

Quote:


> Originally Posted by *BigMack70*
> 
> Nobody managed to edit the Pascal BIOS? Interesting... and disappointing after so much was done with Maxwell.
> 
> Thanks.


Yeah, it was an Nvidia-only card, so it was locked down pretty tight. We'll probably get something this time around. At least, hopefully.


----------



## KillerBee33

Doubt it. Pascal with a slight OC is already boosting over 30%; you could only get that +30% on Maxwell with a BIOS mod. What I'm sayin' here is 2100 is basically the wall for Pascal (and that's 30%+ from stock).
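A quick sanity check on that 30%+ figure (this assumes the TXP's ~1531 MHz reference boost clock; actual boost bins vary from card to card):

```python
# Rough check of the "2100 MHz is 30%+ over stock" claim.
# 1531 MHz is the Titan X Pascal's reference boost clock (assumed here);
# real cards boost higher or lower depending on the silicon.
stock_boost = 1531
wall = 2100
gain_pct = (wall - stock_boost) / stock_boost * 100
print(f"2100 MHz is ~{gain_pct:.0f}% over a {stock_boost} MHz stock boost")
```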


----------



## bl4ckdot

Since the 1800X didn't live up to my expectations in gaming scenarios, I'm looking for something else to replace my 4790K. The 7700K would be the obvious choice since I mostly play games on this PC and occasionally run some VMs for work. Going for socket 2011-3 seems a bit late (X299 (?) is expected late 2017). What do you think?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *bl4ckdot*
> 
> Since the 1800X didn't lived up to my expectations in games scenarios, I'm looking for something else to replace my 4790K. 7700K would be the obvious choice since I mostly play games on this PC and occasionally run some VMs to work. Going for a 2011-3 socket seems to be at bit late (x299 (?) is expected late 2017). What do you think ?


Keep the 4790K, or go 6800K on X99. Myself, I'd go for the Ryzen 1700 anyway, even though the reviews don't paint a positive picture when it comes to gaming vs Intel CPUs.

Really your 4790k is still top notch.


----------



## bl4ckdot

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Keep the 4790k. Or go 6800k x99. Myself would go for the Ryzen 1700 anyways, even though the reviews don't paint a positive picture when it comes to gaming vs Intel cpus.
> 
> Really your 4790k is still top notch.


I know my 4790K isn't bad, but the chipset itself is limiting me: to take an example, I can't run an NVMe drive at PCIe x4 with my setup.
(I'm also craving an upgrade of my build, sooo







)


----------



## jmaz87

I would wait for X299 or whatever else is comparable later in the year. I'm very happy with my 6850K, especially after seeing the Ryzen benchmarks, but at this point it wouldn't be worth the money with better right around the corner.


----------



## MunneY

Quote:


> Originally Posted by *jmaz87*
> 
> i would wait for x299 or whatever else is comprable later in the year. Im very happy with my 6850k especially after seeing the ryzen benchmarks but at this point wouldn't be worth the money with better right around the corner.


This.

You are close enough to skylake-x that you might as well stick to your guns and see whats in the pipe.


----------



## bl4ckdot

Quote:


> Originally Posted by *MunneY*
> 
> This.
> 
> You are close enough to skylake-x that you might as well stick to your guns and see whats in the pipe.


I think this is indeed wise. Thanks, much appreciated.


----------



## Dagamus NM

Can't wait for x299 here. 3930K x79 needs an update.

So after reading the posts the past few days I have seen many talk about high refresh rates at high res.

What monitors are doing 4K or higher at above 60Hz and what connection cable supports the bandwidth?


----------



## Benny89

Quote:


> Originally Posted by *MunneY*
> 
> This.
> 
> You are close enough to skylake-x that you might as well stick to your guns and see whats in the pipe.


So if I understood this correctly, Intel is planning to release both Skylake-X and Kaby Lake-X in Q2, and on top of that, Coffee Lake 8th-generation CPUs? That doesn't make sense... nobody will buy the X parts if there will be new-generation CPUs on the market 2-3 months later.

I am confused


----------



## stocksux

Quote:


> Originally Posted by *Dagamus NM*
> 
> Can't wait for x299 here. 3930K x79 needs an update.
> 
> So after reading the posts the past few days I have seen many talk about high refresh rates at high res.
> 
> What monitors are doing 4K or higher at above 60Hz and what connection cable supports the bandwidth?


None. We're all waiting on the Asus PG27UQ or the Acer Predator XB272-HDR to launch.
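The bandwidth math is why. This is a rough sketch counting pixel data only (real links also carry blanking overhead, and the payload figures in the comments are approximate):

```python
# Why no monitor/cable combo did 4K above 60 Hz at the time:
# uncompressed pixel-data bandwidth at 8 bits per channel (24 bpp).
def gbit_per_s(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

for hz in (60, 98, 120, 144):
    need = gbit_per_s(3840, 2160, hz)
    print(f"4K @ {hz:>3} Hz needs ~{need:.1f} Gbit/s uncompressed")

# Approximate usable payloads: HDMI 2.0 ~14.4 Gbit/s, DP 1.3/1.4 ~25.9 Gbit/s.
# So 4K144 at full RGB overflows even DP 1.4 without chroma subsampling
# or compression -- hence the wait for those new panels.
```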


----------



## Dagamus NM

Quote:


> Originally Posted by *Benny89*
> 
> So if I understood this correctly- Intel is planning to release in Q2 both Skylake-X, Kaby Lake-X and on top off that- Coffe Lake 8th generations CPUs? That doesn't make sense... Nobody will buy X if 2-3 months later there will be new generation CPUs on market.
> 
> I am confused


Enthusiast chips for X299 will be both Skylake-X and Kaby Lake-X, with the KL part being a four-core that works on X299. Coffee Lake will be consumer chips.

The only change here is making another 4-core for HEDT.
Quote:


> Originally Posted by *stocksux*
> 
> none. We're all waiting on the Asus PG27UQ or the Acer Predator XB272-HDR to launch.


Funny that we are still waiting on more powerful GPUs.

Titan XPs in SLI give me zero drops below 60 fps in the couple of games I have played (Skyrim Special Edition and Borderlands: The Pre-Sequel). I have four XPs in SLI on my setup with three Asus PB287Q monitors, all set to run at 60 Hz, but I haven't bothered to try any games on them yet; I need to figure out how to make a custom SLI profile, I guess. I work on this PC, so games haven't been a concern. Maybe after this semester is over I will try Fallout 4. Bought it a while back but have yet to launch it. With the depth of current AAA titles, particularly Bethesda's, it seems I might get into two games a year if I am lucky.


----------



## meson1

Quote:


> Originally Posted by *Benny89*
> 
> So if I understood this correctly- Intel is planning to release in Q2 both Skylake-X, Kaby Lake-X and on top off that- Coffe Lake 8th generations CPUs? That doesn't make sense... Nobody will buy X if 2-3 months later there will be new generation CPUs on market.
> 
> I am confused


Now I could be wrong, but to the best of MY knowledge, X299, Skylake-X and Kaby Lake-X are slated for Q3 around August time. And Coffee Lake is supposed to be due Q1 2018.

And as was stated earlier, Skylake-X and Kaby Lake-X are HEDT processors with quad-channel memory controllers, extra PCIe lanes and all that jazz. Coffee Lake will be regular desktop and mobile offerings of next-generation CPUs.


----------



## spyui

Quote:


> Originally Posted by *jsutter71*
> 
> I assume when you say full blocks your talking about a FC terminal. If so then get a EVGA HB bridge.


Can I ask what your TXP temps are like in this setup? Are you cooling your TXP with 2 radiators?


----------



## Benny89

Quote:


> Originally Posted by *Dagamus NM*
> 
> Enthusiast chips for x299 will be both skylake-X and Kabylake-X with the KL part being a four core that works on x299. Coffeelake will be consumer chips.
> 
> The only change here is making another 4 core for HEDT.


So in theory the "X" chips should be more powerful than the Coffee Lake chips?


----------



## Fredthehound

I went from a 4790K to a 7700K because of the faster RAM availability, and I'm completely happy with it. But if you are willing to wait, you'd probably be better off doing so; Kaby will still be there if the Skylake-X stuff isn't up to what you need.


----------



## cg4200

With the 1080 Ti coming out from other vendors, and knowing it has the same 3584 cores, I am getting my hopes up that maybe we will finally get a custom BIOS. Would you be able to flash a 1080 Ti BIOS to an XP, even with the faster RAM?
I know my old Titan X needed more power even more than it needed volts, although it did also need volts to hit 1575.
I think the shunt mod works well, but it's not the same as editing the power limits yourself; my card is starved for power, and I don't think much more voltage will help, but it can't hurt to try. I am hoping to get another 100 MHz on top of my 2100, so 2200. For gaming, it would be great if I could hit [email protected] most games maxed out.


----------



## KillerBee33

Quote:


> Originally Posted by *cg4200*
> 
> With the 1080 ti coming out from other vendors knowing they have the same cores 3584 cores. I am getting my hopes up that maybe we will get
> to have custom bios finally?? Would you be able to flash a 1080 ti bios to xp even know faster ram??
> I know my old titan x needed more even power to it than volts although it did also need volts to hit 1575.
> I think the shunt works good but not the same as editing power levels yourself my card is starved for power and I don't think much more volts will help but can't hurt to try.. I am hoping to get another 100 MHz on to my 2100 =2200 for gaming would be great if I could hit [email protected] most games maxed out..


Talk to Dark, he'll cook you a nice trimmed BIOS for your Maxwell Titan








http://www.overclock.net/t/1573308/nvidia-gtx-900-cards-custom-bios-upon-request


----------



## stocksux

Hey everyone. So I finished building my new rig and, with everything completely stock, ran Heaven at 1080p, 8xAA, Ultra, Extreme, and scored 3731. I started messing with an overclock on the CPU and so far have managed 5.0 GHz on the 7700K. Just to see the difference before pushing the overclock further, I ran Heaven again. To my surprise, with the same settings and my CPU now at 5 GHz, my score dropped to 3001! What gives?!

| | Stock 7700K & stock Titan XP | 5.0 GHz 7700K & stock Titan XP |
| --- | --- | --- |
| FPS | 148.1 | 119.1 |
| Score | 3731 | 3001 |
| Min FPS | 9.4 | 36.6 |
| Max FPS | 295.5 | 264.7 |


----------



## xarot

Quote:


> Originally Posted by *stocksux*
> 
> Hey everyone. So I finished building my new rig and with everything completely atock ran Heaven at 1080p 8xAA, Ultra, Extreme and scored 3731. I started messing with an overclock on the CPU and so far have managed 5.0GHz on a 7700k. Just to see the difference before I keep going on the overclock I ran Heaven again. To my surprise with the same settings in Heaven and my CPU now at 5GHz, my score dropped down to 3001! What gives?!?!?!?!?!?


Sounds like an unstable overclock on the CPU.


----------



## Sheyster

Quote:


> Originally Posted by *xTesla1856*
> 
> I don't think anyone has a compelling reason to sell.


I may sell mine and go 1080 Ti SLI. Not sure yet though; just chewing on it for now.


----------



## meson1

Quote:


> Originally Posted by *Sheyster*
> 
> I may sell mine and go 1080 Ti SLI. Not sure yet though; just chewing on it for now.


LOL. For sale, one Titan X Pascal, some teeth marks.


----------



## xTesla1856

I get a 32k graphics score in Fire Strike with my TXP at 2152. No reason for me to sell.


----------



## qazplm5089

Does anyone know if the Corsair HG10 will work with the Titan X Pascal?


----------



## stocksux

Problem solved. I should have posted earlier, but the issue was that the GeForce control panel had reset my monitor to 60 Hz. It had also turned G-Sync on. I set the monitor back to 165 Hz, turned G-Sync off, and bam, the scores went up.


----------



## auraofjason

Quote:


> Originally Posted by *qazplm5089*
> 
> Does anyone know if the Corsair HG10 will work with the Titan X Pascal?


I don't think so; I remember people saying it wouldn't work with the 1080 and they had to mod theirs to make it fit. I would assume a Titan X wouldn't fit either.


----------



## vmanuelgm

What do you think, guys: will a custom 1080 Ti get past a Titan X Pascal at max OC?


----------



## Maintenance Bot

Quote:


> Originally Posted by *vmanuelgm*
> 
> What do u think guys, 1080Ti custom will get over TitanX Pascal at max oc???


Me thinks TI 5% to 10% faster.


----------



## stryker7314

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Me thinks TI 5% to 10% faster.


The Ti will be ~3% slower either way; the Ti won't get past the 2150 clock limit unless sub-ambient, so air and water won't make a difference. At that point the TXP will be faster due to not having cut-down memory, ROPs, etc...


----------



## stocksux

Quote:


> Originally Posted by *stryker7314*
> 
> The Ti will be ~3% slower either way; the Ti won't get past the 2150 clock limit unless sub-ambient, so air and water won't make a difference. At that point the TXP will be faster due to not having cut-down memory, ROPs, etc...


Which equates to $200 for each percentage point difference


----------



## Maintenance Bot

Quote:


> Originally Posted by *stryker7314*
> 
> Ti will be 3% slower either way, ti won't get past the 2150 clock limit unless subambient, air and water won't make a difference. At that point the txp will be faster do to not having cut down memory, rops, etc...


Yeah, maybe you're right. Just found these.

http://www.3dmark.com/compare/spy/1303084/spy/1307626/spy/1310153/spy/1301697/spy/1219203#


----------



## vmanuelgm

Thanks for your opinions guys.

Some people think the Ti will clock higher, and the customs won't be power restricted. I have a Titan X with the shunt mod, 2150 stable in gaming, so I wonder if the Ti can beat it.

In Heaven the higher memory clock can help a bit, especially if the maximum reachable clock is near/over 12 GHz.


----------



## MrTOOSHORT

The Ti will beat out the Titan X P, plus there will be BIOSes floating around to max out the power limit, like the ASUS Strix BIOS that could be flashed to a reference (Founders) card.

I guess we'll see.


----------



## axiumone

Does anyone know off hand if the base on the stock heat sink is copper or aluminum?


----------



## CptSpig

Quote:


> Originally Posted by *meson1*
> 
> LOL. For sale, one Titan X Pascal, some teeth marks.


How much!


----------



## stryker7314

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Yeah, maybe your right. Just found these.
> 
> http://www.3dmark.com/compare/spy/1303084/spy/1307626/spy/1310153/spy/1301697/spy/1219203#


Good find, the proof is in the pudding.


----------



## vmanuelgm

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> The Ti will beat out the Titan X P, plus there will be BIOSes floating around to max out the power limit, like the ASUS Strix BIOS that could be flashed to a reference (Founders) card.
> 
> I guess we'll see.


If the Ti is able to reach 2200 MHz and beyond easily, it would beat the shunt-modded TXP. If not, they should perform similarly, with the Ti winning sometimes and the TXP others.


----------



## meson1

Quote:


> Originally Posted by *CptSpig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *meson1*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sheyster*
> 
> I may sell mine and go 1080 Ti SLI. Not sure yet though; just chewing on it for now.
> 
> 
> 
> LOL. For sale, one Titan X Pascal, some teeth marks.
> 
> 
> How much!

You would have to ask Sheyster. LOL. I was referring to his prospective for-sale listing.









P.S. Did you know that in your sig. you've got a colon instead of a semi-colon after "&nbsp"?


----------



## CptSpig

Quote:


> Originally Posted by *meson1*
> 
> You would have to ask Sheyster. LOL. I was referring to his prospective for sale listing.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S. Did you know that in your sig. you've got a colon instead of a semi-colon after "&nbsp"?


So many people wanting to sell their Titan XPs. I see this as an opportunity to buy another Titan XP. I believe when all is said and done it will still be king.


----------



## Benny89

Quote:


> Originally Posted by *CptSpig*
> 
> So many people wanting to sell their Titan XPs. I see this as an opportunity to buy another Titan XP. I believe when all is said and done it will still be king.


In gaming (as always) the Ti will be better than the TITAN, especially the non-reference cards. On custom water builds they will perform at least the same, but if the Ti can OC higher then it will be a little better (really no difference IMO). In pro tasks the TITAN should be better, but really, a 1 GB memory difference (11 vs 12) is not that much.

The Ti is just a pure gaming card, like every Ti each generation.


----------



## TonyRoma

Don't forget the ROP count difference (96 vs 88) and the narrower bus on the Ti (384-bit vs 352-bit). Surely it all adds up. Isn't there also something about cache present on the TXP that is done via software on the 1080 Ti? The 1080 Ti can only come out on top if it has a decent clock speed advantage over the TXP.
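The raw bandwidth those bus widths imply is easy to sketch. A minimal calculation, assuming the TXP's stock 10 Gbps effective GDDR5X and the rumored 11 Gbps for the Ti:

```python
# Rough memory-bandwidth sketch: bandwidth (GB/s) = (bus width in bits / 8) * effective Gbps.
# The 10 and 11 Gbps effective speeds are assumptions (TXP stock GDDR5X vs the rumored Ti spec).

def mem_bandwidth_gbs(bus_bits, effective_gbps):
    """Peak memory bandwidth in GB/s for a given bus width and effective data rate."""
    return bus_bits / 8 * effective_gbps

txp = mem_bandwidth_gbs(384, 10.0)  # Titan X Pascal: 384-bit @ 10 Gbps -> 480.0 GB/s
ti = mem_bandwidth_gbs(352, 11.0)   # 1080 Ti: 352-bit @ 11 Gbps -> 484.0 GB/s
print(txp, ti)
```

So at stock, the Ti's faster memory roughly cancels its narrower bus; the ROP and cache cuts are what's left.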


----------



## CptSpig

Quote:


> Originally Posted by *Benny89*
> 
> In gaming (as always) the Ti will be better than the TITAN, especially the non-reference cards. On custom water builds they will perform at least the same, but if the Ti can OC higher then it will be a little better (really no difference IMO). In pro tasks the TITAN should be better, but really, a 1 GB memory difference (11 vs 12) is not that much.
> 
> The Ti is just a pure gaming card, like every Ti each generation.


I think we need to wait and see when the cards come out. Nvidia always promises more than the card can actually deliver.


----------



## willverduzco

Quote:


> Originally Posted by *CptSpig*
> 
> So many people wanting to sell their Titan XP's. I see this as a opportunity to by another Titan XP. I believe when all is said and done it will still be king.


Even if only nominally. The way I see it, we're going to see approximately the same range of poorly-clocking to well-clocking 1080 Ti cards as we see with the TXP, as both are equally cut down when it comes to shader cores. Since the Ti has 92% of the ROP count of our TXP cards, we will likely have an edge in some high-res games where we're not shader limited. Finally, since the Ti needs higher memory clocks to compensate for the narrower bus, we may still have an edge in overall memory bandwidth when both are pushed to max clocks (further helping the TXP relative to the Ti at high resolution).
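To put a number on that last point, a quick sketch (the overclock figure below is a hypothetical example, not a measured value): with a 352-bit bus, the Ti needs its effective memory clock roughly 384/352, or about 9%, higher than the TXP's just to match bandwidth.

```python
# Sketch of the bus-width argument: effective memory speed a 352-bit card
# needs to match a 384-bit card's bandwidth. The 11.4 Gbps TXP overclock
# is a hypothetical example, not a measured figure.

def parity_gbps(txp_gbps, txp_bus=384, ti_bus=352):
    """Effective Gbps the narrower-bus card needs for equal bandwidth."""
    return txp_gbps * txp_bus / ti_bus

rop_ratio = 88 / 96  # Ti ROPs relative to TXP, ~92%
print(round(parity_gbps(11.4), 2))  # -> 12.44: the Ti would need ~12.4 Gbps effective
print(round(rop_ratio, 3))          # -> 0.917
```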


----------



## DooRules

If any of the unlocked BIOSes that come out for the Tis can be flashed to the Titan, game on.


----------



## EvilPieMoo

Quote:


> Originally Posted by *DooRules*
> 
> If any of the unlocked BIOSes that come out for the Tis can be flashed to the Titan, game on.


At this point a custom BIOS wouldn't do much for the Titan, maybe a few MHz extra. Pascal really doesn't care for extra power/voltage; an example being how FE 1080s were constantly clocking higher than custom cards with an extra 8-pin, higher power target, etc.


----------



## symmetrical

I'm just hoping EVGA or another company makes a replacement cooler for the Titan XP with the 1080Ti coming out. I'm sick of my stock cooler being so noisy with a small overclock.


----------



## KillerBee33

Quote:


> Originally Posted by *EvilPieMoo*
> 
> At this point a custom BIOS wouldn't do much for the Titan, maybe a few Mhz extra. Pascal really doesn't care for extra power/voltage, example being how FE 1080's were constantly clocking higher than custom cards with an extra 8 pin, higher target etc.


Not by much though; this is the absolute wall for the FE 1080 I had...








And my first TXP did 2160 solid with +600 on the memory


----------



## cg4200

Not sure why people would say a custom BIOS would not matter for the TXP at this point. I have been waiting forever. I am not smart enough to unlock a BIOS, but I got good at Maxwell BIOS editing: my Titan X Maxwell was 1575 gaming and my 980 Ti was 1600 gaming. It makes no sense; of course it would matter, maybe not as much as with Maxwell, but the TXP is basically a Maxwell refresh. As an example, I could not run Firestrike Extreme at 2139 until the shunt mod, and even with the shunt mod and volts (not +100) I could not get that high; it took all of 1.09 V to get the 2139 run. So with proper cooling and an unlocked BIOS you could get more without tricking your GPU with unclean power. If given clean power to TDP, some more total power, a little to the PCI rails, a touch of voltage and maybe a final voltage bin, I don't see why we could not get another 100 MHz or so out of most cards; the BIOS should be close to Maxwell's, it just needs to be unlocked. Hoping that with the 1080 Ti launching as the full chip and going to third-party vendors, someone will upload a signed unlocked BIOS. Fingers crossed.


----------



## carlhil2

https://www.overclock3d.net/news/gpu_displays/nvidia_s_gtx_1080_ti_shows_up_in_futuremark_s_3dmark_database/1


----------



## feznz

Methinks if you are going to "slight-grade" from a TXP to a 1080 Ti, you must be a drug dealer









LOL, in all seriousness this must be about the e-peen and getting to the top of the leaderboard. Even if, at best, the 1080 Ti were 15% faster out of the box than a TXP, I personally wouldn't even consider changing out cards.
A 1080 Ti will be common soon; a TXP will always have a little rarity, so it should hold its value a little better.

my


----------



## roccale

I ran Time Spy with the card totally stock and got a 9890 graphics score.

https://postimg.org/image/kf9pgb5kz/

With only the power limit at 120% but all clocks stock, my result is a 10035 graphics score:

http://www.3dmark.com/3dm/18383001?

I use driver 378.77 and disable the demo in the benchmark.

The graphics scores at your link are strange:

https://www.overclock3d.net/news/gpu_displays/nvidia_s_gtx_1080_ti_shows_up_in_futuremark_s_3dmark_database/1


----------



## lilchronic

Quote:


> Originally Posted by *roccale*
> 
> I ran Time Spy with the card totally stock and got a 9890 graphics score.
> 
> https://postimg.org/image/kf9pgb5kz/
> 
> With only the power limit at 120% but all clocks stock, my result is a 10035 graphics score:
> 
> http://www.3dmark.com/3dm/18383001?
> 
> I use driver 378.77 and disable the demo in the benchmark.
> 
> The graphics scores at your link are strange:
> 
> https://www.overclock3d.net/news/gpu_displays/nvidia_s_gtx_1080_ti_shows_up_in_futuremark_s_3dmark_database/1


Set the power limit back to stock and test again. Anyway, each card is going to boost slightly differently.


----------



## Edge0fsanity

Has anyone with a horizontal motherboard orientation, like the CL S8 has, had long-term success with the CLU shunt mod? I've been wanting to do it for a while, but the chance that the CLU runs and ruins the card kept me from doing it back in August.


----------



## Dagamus NM

Quote:


> Originally Posted by *Edge0fsanity*
> 
> Has anyone with a horizontal motherboard orientation, like the CL S8 has, had long-term success with the CLU shunt mod? I've been wanting to do it for a while, but the chance that the CLU runs and ruins the card kept me from doing it back in August.


I did this with four Titans in an S8. I masked the card off and only had the individual resistors that I planned on using exposed. I then applied the CLU and then applied several coats of black plasti dip. After I removed the masking it sealed them perfectly and looks like it could come off easily if needed. Pictures are in my posts in this thread from a couple months ago.


----------



## pez

Quote:


> Originally Posted by *CptSpig*
> 
> So many people wanting to sell their Titan XP's. I see this as a opportunity to by another Titan XP. I believe when all is said and done it will still be king.


Even I'm looking at those people and starting to think if I should go ATX or even mATX again







.
Quote:


> Originally Posted by *Benny89*
> 
> In gaming (as always) the Ti will be better than the TITAN, especially the non-reference cards. On custom water builds they will perform at least the same, but if the Ti can OC higher then it will be a little better (really no difference IMO). In pro tasks the TITAN should be better, but really, a 1 GB memory difference (11 vs 12) is not that much.
> 
> The Ti is just a pure gaming card, like every Ti each generation.


If you can stand the noise, the Titan XP will clock fairly high. It's been a while, but mine would consistently do 2k+ and make a 'final' drop to 1974 MHz. The benefit you're going to see with the Ti is the AIB cards running nice and cool and holding that 2k+ OC longer... but in the grand scheme of things, you're looking at a margin-of-error performance difference between a high-1900s OC and a low-to-mid-2000s OC.
Quote:


> Originally Posted by *symmetrical*
> 
> I'm just hoping EVGA or another company makes a replacement cooler for the Titan XP with the 1080Ti coming out. I'm sick of my stock cooler being so noisy with a small overclock.


EVGA has said a few times they were working on an AIO for the Titan XP. They spoke of a delay but said it was still being worked on, so maybe they're making it compatible with both the TXP and the Ti. That would be especially good news for me, honestly. No clue on cross-compatibility, as I'm not sure the boards are similar enough to say that with confidence.


----------



## xTesla1856

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> The TI will beat out the Titan-X P, plus there will bios' floating around to max out the power limit. Like the Asus strix bios that could flash to a reference(founders)card.
> 
> I guess we'll see.


Power limit doesn't seem to be the issue with Pascal; both my 1080 FE and my TXP constantly hit the voltage limit, at 2126 MHz and 2152 MHz respectively. Power can spike up into the 130s on the TXP, but nothing sustained.


----------



## roccale

Yes, I did it and my score is 9836 all stock, with some things running in the background, like 3 Google Chrome pages open and something else.

At this link: http://www.3dmark.com/compare/spy/1303084/spy/1307626/spy/1310153/spy/1301697/spy/1219203#

we see that the 1080 Ti's best graphics score is 9303 points at 1860*1377.

My Titan X graphics score all stock (with demo on) at 1835*1251 is 9836.
With demo off at 1860*1251 it is 9890.
All clocks stock with power limit at 120% (like a 1080 Ti custom) it is 10035 graphics score...

So why is the Titan score so low at http://www.3dmark.com/compare/spy/1303084/spy/1307626/spy/1310153/spy/1301697/spy/1219203# ?

And can the 1080 Ti graphics score really be that low too? 9303 points.





http://www.3dmark.com/3dm/18383001?


----------



## NemChem

Quote:


> Originally Posted by *axiumone*
> 
> Does anyone know off hand if the base on the stock heat sink is copper or aluminum?


I believe it's aluminium; I might be wrong though. It's definitely silver in colour, but it might be nickel-plated copper...

Fairly angry about this... or well, not so much since I'm returning my card, but for all of you that can no longer do that, I'm angry at Nvidia for you! You'd expect them not to skimp on the premium product's VRM compared to the mainstream product. Yes, I know it has been 7 months, but we're not talking about GPU yields improving and therefore a more mainstream GP102 product becoming available; we're talking about off-the-shelf components that don't really change in 7 months.


----------



## symmetrical

Quote:


> Originally Posted by *pez*
> 
> Even I'm looking at those people and starting to think if I should go ATX or even mATX again
> 
> 
> 
> 
> 
> 
> 
> .
> If you can stand the noise, the Titan XP will clock fairly high. It's been a while, but mine consistently would do 2k+ and do a 'final' drop to 1974 MHz. The benefits you're going to see with the Ti are the AIBs being nice and cool and you being able to hold that 2k+ OC longer...but in the grand scheme of things, you're looking at a margin of error type of performance difference in a high 1900 OC and a low to mid 2000 OC.
> EVGA has said a few times they were working on an AIO for the Titan XP. They spoke of a delay but said it was still being worked on, so maybe they're making it compatible with both the TXP and the Ti. That would be especially good news for me, honestly. No clue on cross-compatibility, as I'm not sure the boards are similar enough to say that with confidence.










You are correct; my XP can clock up to just about 2025 MHz stable, but after temps kick in, it drops to about 1950. But at 1950 it gets so hot, and the fans are pretty much at full blast, which gets very loud and I can't stand it.

But yeah, I've been waiting for the EVGA AIO for a while now, and actually never thought of it that way. It makes sense now: they were probably making it for both the XP and the Ti. Hopefully they announce something soon.


----------



## Edge0fsanity

Quote:


> Originally Posted by *Dagamus NM*
> 
> I did this with four Titans in an S8. I masked the card off and only had the individual resistors that I planned on using exposed. I then applied the CLU and then applied several coats of black plasti dip. After I removed the masking it sealed them perfectly and looks like it could come off easily if needed. Pictures are in my posts in this thread from a couple months ago.


I went back and found that post; I think I might attempt that when I tear down my loop for an upgrade in a few weeks.

Did you need to allow any drying time on the CLU before using the Plasti Dip? Did the thin CLU layer get rid of the throttling for you?


----------



## Dagamus NM

Quote:


> Originally Posted by *Edge0fsanity*
> 
> I went back and found that post; I think I might attempt that when I tear down my loop for an upgrade in a few weeks.
> 
> Did you need to allow any drying time on the CLU before using the Plasti Dip? Did the thin CLU layer get rid of the throttling for you?


Not sure. I imagine that it has. Clocks hold steady, these are all on a large loop with 12x140mm^2 worth of cooling space.

I did let the clu sit overnight but that was mostly because I didn't want to paint at night. When I applied it I simply touched the syringe to the resistor dabbing it to let the cohesive forces pull the material out rather than apply any pressure to the plunger. As such, the layer was extremely thin and the paint had no trouble sticking.


----------



## Edge0fsanity

Quote:


> Originally Posted by *Dagamus NM*
> 
> Not sure. I imagine that it has. Clocks hold steady, these are all on a large loop with 12x140mm^2 worth of cooling space.
> 
> I did let the clu sit overnight but that was mostly because I didn't want to paint at night. When I applied it I simply touched the syringe to the resistor dabbing it to let the cohesive forces pull the material out rather than apply any pressure to the plunger. As such, the layer was extremely thin and the paint had no trouble sticking.


Good to know. I'll probably let the clu sit for a week then before plastidipping it. This upgrade is going to take me several weeks before its completed anyways.

Currently my TXP has horrible PL throttle issues so i was hoping to eliminate them forever.


----------



## Dagamus NM

Quote:


> Originally Posted by *Edge0fsanity*
> 
> Good to know. I'll probably let the clu sit for a week then before plastidipping it. This upgrade is going to take me several weeks before its completed anyways.
> 
> Currently my TXP has horrible PL throttle issues so i was hoping to eliminate them forever.


Is it only PL causing the throttling? I thought it was temperature dependent on the TXP. I wasn't really sure, so I did the mod to be safe, but I have a good amount of cooling and a chiller for benching whenever I get around to that.


----------



## Edge0fsanity

Quote:


> Originally Posted by *Dagamus NM*
> 
> Is it only PL causing the throttling? I thought it was temperature dependent on the TXP. I wasn't really sure so I did it to be sure but have a good amount of cooling and a chiller for benching whenever I get around to that.


It's PL throttling; I can watch it happen with Afterburner or HWiNFO showing the power % and see my clock speed and voltage bounce all over the place. Usually between 2000-2100 MHz with my OC enabled, but it can go as low as 1950 MHz. Cooling is a non-issue with 1320 mm of rad, and it's getting another 720 mm added along with a second TXP soon. The card doesn't get above 36C with a 5-6C air/water delta.

The thing that drives me nuts is I can turn off my GPU and memory OC and it'll still PL throttle.

Of course this only happens if I uncap the framerate. I'm normally playing games at 60 fps on a Dell UW, so it doesn't throttle unless I dip below 60 fps.

The throttling will be an issue for me if anyone releases the 144 Hz UW monitor I've been waiting for.


----------



## jsutter71

Ok. This one is for you overclockers running SLI under water with a 6950X CPU. My cards refuse to go stable beyond +225 core clock and +650 memory clock using Afterburner. I cannot seem to break past 14206 in Fire Strike Ultra and 17066 in Time Spy. My temps are always cool and never come close to max.
My system is overclocked as follows:
6950X set per-core @ 4.3 GHz for all cores
64GB G.Skill 2400 @ 2600

My results are faster when I set my CPU per-core as opposed to syncing all cores.
Memory speeds beyond 2600 have not improved my score.

Fire Strike Ultra
http://www.3dmark.com/fs/11616345
http://www.3dmark.com/fs/11616131

Time Spy
http://www.3dmark.com/spy/1053306




*So how can I raise this?*


----------



## vmanuelgm

Not impressed by these results...

http://www.guru3d.com/news-story/geforce-gtx-1080-ti-spotted-in-futuremark-orb.html

It's easy for me to reach 11400 in graphics score with the TXP.


----------



## Dagamus NM

Quote:


> Originally Posted by *Edge0fsanity*
> 
> It's PL throttling; I can watch it happen with Afterburner or HWiNFO showing the power % and see my clock speed and voltage bounce all over the place. Usually between 2000-2100 MHz with my OC enabled, but it can go as low as 1950 MHz. Cooling is a non-issue with 1320 mm of rad, and it's getting another 720 mm added along with a second TXP soon. The card doesn't get above 36C with a 5-6C air/water delta.
> 
> The thing that drives me nuts is I can turn off my GPU and memory OC and it'll still PL throttle.
> 
> Of course this only happens if I uncap the framerate. I'm normally playing games at 60 fps on a Dell UW, so it doesn't throttle unless I dip below 60 fps.
> 
> The throttling will be an issue for me if anyone releases the 144 Hz UW monitor I've been waiting for.


I hear that. Ok, I wasn't sure what your cooling situation was.

I have two in SLI for playing on a 4K panel. I just lock at 60Hz/60fps since that is what the panel does. No frame dips except for load screens which run at 24fps, for the whole 500ms that they are up.

I think I will jam some skyrim tonight.


----------



## DooRules

@ jsutter71

Why is Afterburner showing way-high temps and not reading your GPU voltage?

When I looked at your Futuremark results, the core and memory readings were way off; not sure what is up with that either.

I have never set the CPU that way; I have always used sync-all-cores, so I'm not sure about that one.


----------



## jsutter71

*Something that has not really been mentioned but worth a note.*

I have no intention whatsoever of replacing my TXPs with the same card that has less memory but adds some MOSFETs and capacitors to achieve slightly faster speeds. *That said, I think the real reason people are angry is the insane price drop.* That is where Nvidia is really screwing people over.

A little of my history: I was an early adopter of the 980 until I upgraded to 980 Tis as soon as they were released. The reason is that I have 4 monitors. My primary is an LG 31" 4096 x 2160 and the other 3 are 28" 2560 x 2160. I went from 980 SLI to 980 Ti triple SLI and still had nothing but issues, especially after driver updates. So much so that I had to drop down to 3 monitors for stability reasons. As soon as the 1080s were released I seriously considered upgrading, but chose to hold off in anticipation of the 1080 Ti. When December rolled around and there was still no sign of the new card, I decided to take the plunge and get a pair of TXPs. Fortunately I was able to sell my 3 980 Tis on eBay for about $600 a piece, which did help some. Because of the massive improvement in speed and drivers I was able to go back to my 4-monitor configuration. With the 980 Tis, SLI was a total PITA; half the time when I booted my system the drivers would default back to a single-card configuration. With the TXPs my system now defaults to SLI. It has improved so much that SLI stays turned on even after driver updates. I also have zero issues with flicker.

*OK, now to my point.* I am surprised nobody has really mentioned this yet: Nvidia was very smart to finally get rid of the DVI connector. This should have happened with the TXP. *Finally a flagship card will be able to operate in a single-slot configuration.* Or at least for those of us who watercool, which seems to be the majority of overclockers. I understand that there are people who have no problem removing the DVI connector by taking wire cutters to their $1200 card, but I am not one of them. Regardless, if there were a real reason to change cards, that would be the only one for me.


----------



## jsutter71

Quote:


> Originally Posted by *DooRules*
> 
> @ jsutter71
> 
> Why is Afterburner showing way-high temps and not reading your GPU voltage?
> 
> When I looked at your Futuremark results the core and memory readings are way off, not sure what is up with that either.
> 
> I have never set cpu to that setting, I have always used sync all cores, so not sure about that one.


It's set to F. Temps are 91 and 88. I could run cooler, but I have my pumps running at 65% and fans between 790-840 for silent operation. I'm content with those temps, and even while benchmarking my temps never go beyond 140F per card. Voltage monitoring was not checked.
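For anyone reading along in Celsius, those Fahrenheit figures convert with the usual formula; a trivial sketch:

```python
# Fahrenheit -> Celsius conversion for the temps quoted above.

def f_to_c(f):
    return (f - 32) * 5 / 9

print(round(f_to_c(91), 1))   # -> 32.8
print(round(f_to_c(140), 1))  # -> 60.0
```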


----------



## DooRules

Ah, that would explain the temp readings. I don't adjust my pumps, but I do have the fans turned down on the rads for everyday use. When benching they are full bore.

In that Time Spy link you have up, it seems you have no overclock applied to memory at all and very little to core? It looks to be running at just boost speeds.


----------



## jsutter71

Quote:


> Originally Posted by *DooRules*
> 
> Ah, that would explain the temp readings. I don't adjust my pumps but I do have fans turned down on rads for everyday use. When benching they are full bore.
> 
> In that TS link you have up it seems you have no overclock applied to memory at all and very little to core??? Looks to be running at just boost speeds?


I have 3 pumps which are kinda loud at 100%.


----------



## labjet

I agree with you; the price drop is the only thing that makes me feel bad about my Titan XP purchase, even more so since I purchased late in the year like you.

Nvidia getting rid of the DVI port is a great idea, and it would be beneficial for a water-cooled single-slot configuration. The only problem for me is that I run 4 monitors as well, at high refresh rates, and I use the 3 DP and 1 DVI.

If only one of those monitors had a newer HDMI port that could run at higher refresh rates, lol. I hate having to use DVI; it's so bulky.


----------



## gamingarena

Quote:


> Originally Posted by *vmanuelgm*
> 
> Not impressed by these results...
> 
> http://www.guru3d.com/news-story/geforce-gtx-1080-ti-spotted-in-futuremark-orb.html
> 
> Easy for me reaching 11400 in graphics score with TXP.


Yep, I just did a quick test on my modest TXP at 2050 MHz, dropping down to 1980 MHz during the tests, and hit a 10900 graphics score with 70.51 FPS in test 1 and 62.92 FPS in test 2. That is on a Haswell CPU @ 4.5 GHz.


----------



## vmanuelgm

Quote:


> Originally Posted by *gamingarena*
> 
> Yep, I just did a quick test on my modest TXP at 2050 MHz, dropping down to 1980 MHz during the tests, and hit a 10900 graphics score with 70.51 FPS in test 1 and 62.92 FPS in test 2. That is on a Haswell CPU @ 4.5 GHz.


The key will be in the custom models and their max overclock.

I did the test at 2062, without throttling in my case because of liquid cooling, and +200 on memory, reaching 11063 points in graphics score vs. the Ti's 10825 ([email protected]).


----------



## Baasha

Everyone knew the Ti version of the 1080 was coming out the moment the 1080 came out at the end of May last year.

The Titan XP was released early Aug which means it's been 8 months - that's 3/4ths of a year - since the 'king-of-the-hill' GPU has been reigning supreme.

Why all this drama of "waah Ti beats Titan" now? lol

I believe the 1080 Ti will outshine the Titan XP @ 4K. At higher resolutions than that, I'm not so sure.

Further, why not get both so only your wallet will cry instead of you?


----------



## vmanuelgm

Quote:


> Originally Posted by *Baasha*
> 
> Everyone knew the Ti version of the 1080 was coming out the moment the 1080 came out at the end of May last year.
> 
> The Titan XP was released early Aug which means it's been 8 months - that's 3/4ths of a year - since the 'king-of-the-hill' GPU has been reigning supreme.
> 
> Why all this drama of "waah Ti beats Titan" now? lol
> 
> I believe the 1080 Ti will outshine the Titan XP @ 4K. At higher resolutions than that, I'm not so sure.
> 
> Further, why not get both so only your wallet will cry instead of you?


Why crying???

Just commenting, nothing else...

If Ti gets over, will go for it for sure, xD...


----------



## Dagamus NM

Quote:


> Originally Posted by *Baasha*
> 
> Everyone knew the Ti version of the 1080 was coming out the moment the 1080 came out at the end of May last year.
> 
> The Titan XP was released early Aug which means it's been 8 months - that's 3/4ths of a year - since the 'king-of-the-hill' GPU has been reigning supreme.
> 
> Why all this drama of "waah Ti beats Titan" now? lol
> 
> I believe the 1080 Ti will outshine the Titan XP @ 4K. At higher resolutions than that, I'm not so sure.
> 
> Further, why not get both so only your wallet will cry instead of you?


Agreed.


----------



## vmanuelgm




----------



## labjet

Quote:


> Originally Posted by *jsutter71*
> 
> Ok. This one is for you overclockers running SLI under water with a 6950X CPU. My cards refuse to go stable beyond +225 core clock and +650 memory clock using Afterburner. I cannot seem to break past 14206 in Fire Strike Ultra and 17066 in Time Spy. My temps are always cool and never come close to max.
> My system is overclocked as follows:
> 6950X set per-core @ 4.3 GHz for all cores
> 64GB G.Skill 2400 @ 2600
> 
> My results are faster when I set my CPU per-core as opposed to syncing all cores.
> Memory speeds beyond 2600 have not improved my score.
> 
> Fire Strike Ultra
> http://www.3dmark.com/fs/11616345
> http://www.3dmark.com/fs/11616131
> 
> Time Spy
> http://www.3dmark.com/spy/1053306
> 
> 
> 
> 
> *So how can I raise this?*


@jsutter71 I noticed with my setup that when I have just 1 of my 4 monitors active during benchmarks, I achieve higher scores. Not sure if you have tried that yet.


----------



## pez

Quote:


> Originally Posted by *symmetrical*
> 
> 
> 
> 
> 
> 
> 
> 
> You are correct; my XP can clock up to just about 2025MHz stable, but once temps kick in it drops to about 1950. But at 1950 it gets so hot, and the fans are pretty much at full blast, which gets very loud and I can't stand it.
> 
> But yeah I've been waiting for the EVGA AIO for awhile now, and actually never thought of it that way. It makes sense now, they were probably making it for both the XP and Ti. Hopefully they announce something soon.


Indeed. My fingers are crossed with you! I would love to throw the TXP under an AIO in my Ncase and then put the Ti in the GF's system.
Quote:


> Originally Posted by *Baasha*
> 
> Everyone knew the Ti version of the 1080 was coming out the moment the 1080 came out at the end of May last year.
> 
> The Titan XP was released early Aug which means it's been 8 months - that's 3/4ths of a year - since the 'king-of-the-hill' GPU has been reigning supreme.
> 
> Why all this drama of "waah Ti beats Titan" now? lol
> 
> I believe the 1080 Ti will outshine the Titan XP @ 4K. At higher resolutions than that, I'm not so sure.
> 
> Further, why not get both so only your wallet will cry instead of you?


But 8 months is only 2/3 of a year







.....


----------



## meson1

Quote:


> Originally Posted by *jsutter71*
> 
> *OK now to my point.* I am surprised nobody has really mentioned this yet. Nvidia was very smart to finally get rid of the DVI connector. This should have happened with the TXP. *Finally a flagship card will be able to operate in a single-slot configuration.* Or at least for those of us who watercool, which seems to be the majority of overclockers. I understand that there are people who have no problem removing the DVI connector and taking wire cutters to their $1200 card, but I am not one of those people. Regardless, if there were a real reason to change cards, that would be the only reason for me.


Does (or will) anyone do a replacement endplate in single width for the Ti?


----------



## xTesla1856

Quote:


> Originally Posted by *meson1*
> 
> Does (or will) anyone do a replacement endplate in single width for the Ti?


Probably so, seeing as even the 980Ti KPE had a single slot cover available.


----------



## EDGERRIES

Quote:


> Originally Posted by *Baasha*
> 
> Everyone knew the Ti version of the 1080 was coming out the moment the 1080 came out at the end of May last year.
> 
> The Titan XP was released early Aug which means it's been 8 months - that's 3/4ths of a year - since the 'king-of-the-hill' GPU has been reigning supreme.
> 
> Why all this drama of "waah Ti beats Titan" now? lol
> 
> I believe the 1080 Ti will outshine the Titan XP @ 4K. At higher resolutions than that, I'm not so sure.
> 
> Further, why not get both so only your wallet will cry instead of you?


This is the best statement I have read in a long while. It's all about the time and utility you've gotten from the product; everyone knows the cycle. GTX 1080 users have had that performance (gaming, benching, etc.) since the end of May and might hold off until the GTX 1180, which will be faster than the 1080 Ti. Titan XP owners have had that performance since August and will probably grab the new Titan when it launches as the refresh for their cycle. It's a never-ending process.

It's going to be the same with the next cycle: the XX80, followed by Titan Volta, followed by the Ti variant.

It's your choice when to buy and gain utility from the product.

Baasha, I'm excited to see some videos of the 1080 Tis vs. the Titan XPs on your channel. Keep up the good work, champ!


----------



## Silent Scone

Not sure why it's the best statement; it's the same card with a process-refinement advantage on the memory ICs, a slightly narrower memory bus, and ramped clocks. I've been running my Titan XP at 2125MHz and +500 on the memory offset since launch day.

The only Ti cards worth buying over the TX (if you own one already), if you're that way inclined, will be the ones coming from add-in board partners. Baasha is just nuts, with his big air-heat sandwich builds. lol


----------



## vmanuelgm




----------



## Lobotomite430

Hey guys, is there any way to drop the temps of my Titan and have it sit in the mid-to-low 30°C range under load? I have it running on one 360mm rad, and while gaming it hovers around 41-45°C. I was wondering if adding another 360mm rad would reduce the temps more.


----------



## Silent Scone

Quote:


> Originally Posted by *Lobotomite430*
> 
> Hey guys, is there any way to drop the temps of my Titan and have it sit in the mid-to-low 30°C range under load? I have it running on one 360mm rad, and while gaming it hovers around 41-45°C. I was wondering if adding another 360mm rad would reduce the temps more.


Yes, more rad space will help. Temps can be affected by frame target and resolution also. If you add another 360 or two, that should get you closer to the mid 30s with any luck.
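As a rough sanity check on how much extra rad space buys, here's a back-of-the-envelope sketch. The ~7 W per °C per 120mm section figure is an assumption (a common community rule of thumb at moderate fan speeds), not a measured spec:

```python
# Back-of-the-envelope loop temperature estimate.
# Assumption: each 120mm radiator section sheds roughly 7 W per degree C
# of coolant-to-ambient delta at moderate fan speeds (rule of thumb only).

W_PER_C_PER_SECTION = 7.0

def coolant_temp(heat_watts, sections_120mm, ambient_c=25.0):
    """Estimate steady-state coolant temperature for a given heat load."""
    delta = heat_watts / (W_PER_C_PER_SECTION * sections_120mm)
    return ambient_c + delta

# One 360mm rad (3 sections) with ~350 W from an overclocked Titan + pump:
one_rad = coolant_temp(350, 3)   # ~41.7 C -- close to the 41-45 C reported
two_rads = coolant_temp(350, 6)  # ~33.3 C -- roughly the "mid 30s" target

print(f"one 360: {one_rad:.1f} C, two 360s: {two_rads:.1f} C")
```

Doubling the rad area halves the coolant-to-ambient delta, which is why the second 360 gets you most of the way there while a third buys noticeably less.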


----------



## Lobotomite430

Quote:


> Originally Posted by *Silent Scone*
> 
> Yes, more rad space will help. Temps can be affected by frame target and resolution also. If you add another 360 or two, that should get you closer to the mid 30s with any luck.


Excellent, I will have to mess with it more then. When I boot my computer after it has been off for a while, the clocks don't fluctuate at all while the temps are in the 30s; once it has been running for a while, they go all over the place.


----------



## Baasha

Quote:


> Originally Posted by *Dagamus NM*
> 
> Agreed.


PM replied.









Quote:


> Originally Posted by *pez*
> 
> But 8 months is only 2/3 of a year
> 
> 
> 
> 
> 
> 
> 
> .....


ROFL - brain fart on my part there.








Quote:


> Originally Posted by *EDGERRIES*
> 
> This is the best statement I have read in a long while. It's all about the time and utility you've gotten from the product; everyone knows the cycle. GTX 1080 users have had that performance (gaming, benching, etc.) since the end of May and might hold off until the GTX 1180, which will be faster than the 1080 Ti. Titan XP owners have had that performance since August and will probably grab the new Titan when it launches as the refresh for their cycle. It's a never-ending process.
> 
> It's going to be the same with the next cycle: the XX80, followed by Titan Volta, followed by the Ti variant.
> 
> It's your choice when to buy and gain utility from the product.
> 
> Baasha, I'm excited to see some videos of the 1080 Tis vs. the Titan XPs on your channel. Keep up the good work, champ!


Awesome! Thanks bud! Yea, it's like these threads/posts are almost guaranteed every year - Titan comes out --> "ZOMG SO EXPENSIVE / WASTE OF MUNNNNEY!".. Ti version comes out --> "ZOMG TITAN SUCKS / PEOPLE WHO BOUGHT TITAN WASTED MUNNNNEY!"









The biggest shenanigan was when they announced the 780 Ti *THREE MONTHS*, that's right, 3 months, after releasing the 780, which had people fuming. Thankfully, they've become more sane and now release the Ti versions 6 months or more after the flagship GPU (Titan) is released.

At the end of the day, most people (99.9%) will buy only a max of two GPUs per iteration --> they hold value pretty well IME and if you sell them, the amount you 'lose' should be well worth the enjoyment they've given you during that period.

The people who have buyer's remorse don't understand how technology evolves and how aggressively companies try to capitalize on it. On the bright side, it makes for comedic gold!









Quote:


> Originally Posted by *Silent Scone*
> 
> Not sure why it's the best statement, it's the same card with process refinement advantage on the memory IC, with a slightly narrower memory bus and ramped clocks. I've been running my Titan XP at 2125Mhz and 500+ on the memory offset since launch day.
> 
> The only Ti cards worth buying over TX(if you own one already) if you're that way inclined will be the ones coming from add in board partners. *Baasha is just nuts. With his big air-heat sandwich builds.* lol


Make me a GPU sammich!


----------



## jsutter71

Quote:


> Originally Posted by *Baasha*
> 
> PM replied.
> 
> 
> 
> 
> 
> 
> 
> 
> ROFL - brain fart on my part there.
> 
> 
> 
> 
> 
> 
> 
> 
> Awesome! Thanks bud! Yea, it's like these threads/posts are almost guaranteed every year - Titan comes out --> "ZOMG SO EXPENSIVE / WASTE OF MUNNNNEY!".. Ti version comes out --> "ZOMG TITAN SUCKS / PEOPLE WHO BOUGHT TITAN WASTED MUNNNNEY!"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The biggest shenanigan was when they announced the 780 Ti *THREE MONTHS*, that's right, 3 months, after releasing the 780 that had people fuming. Thankfully, they've become more sane and are now releasing the Ti versions 6 months or more after the flagship GPU (Titan) is released.
> 
> At the end of the day, most people (99.9%) will buy only a max of two GPUs per iteration --> they hold value pretty well IME and if you sell them, the amount you 'lose' should be well worth the enjoyment they've given you during that period.
> 
> The people who have buyer's remorse don't understand how technology evolves and how aggressively companies try to capitalize on it. On the bright side, it makes for comedic gold!
> 
> 
> 
> 
> 
> 
> 
> 
> Make me a GPU sammich!


I don't even want to imagine the amount of heat that comes from that.


----------



## jhowell1030

Quote:


> Originally Posted by *jsutter71*
> 
> I don't even want to imagine the amount of heat that comes from that.


If you can't stand the heat...


----------



## Silent Scone

Quote:


> Originally Posted by *jhowell1030*
> 
> If you can't stand the heat...


Buy some EK blocks and do things properly?


----------



## jsutter71

Quote:


> Originally Posted by *jhowell1030*
> 
> If you can't stand the heat...


LOL..I know a thing or two about heat. I served a total of 39 months of combat time in Iraq. Also I live in Texas.


----------



## CptSpig

Quote:


> Originally Posted by *jsutter71*
> 
> LOL..I know a thing or two about heat. I served a total of 39 months of combat time in Iraq. Also I live in Texas.


Thank you for your service and helping keep Americans safe!


----------



## toncij

Quote:


> Originally Posted by *Seyumi*
> 
> You are correct. I believe the sweet spot is around +450 on the memory. Anything higher and the GPU clock drops more often, since power is taken away from the core and given to the memory. This in turn reduces overall performance and you get fewer frames. A website did a huge chart where they tested the memory/GPU clock in +50MHz intervals, and +450 was the safe spot (it would vary from 400-500). That's what I have my Titan Xs at.


Thanks, I was told power limits are separate
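The chart Seyumi describes amounts to a simple offset sweep. Here's a toy simulation of the idea (the score model and helper names are made up for illustration; in practice you'd set offsets by hand in Afterburner or similar and record real benchmark scores):

```python
# Toy sweep over memory offsets, mimicking the +50MHz-interval chart described.
# simulated_score() is a stand-in for running a real benchmark: it models
# memory gains that are eventually eaten by the core downclocking as power
# shifts from core to memory (assumed behavior, not measured data).

def simulated_score(mem_offset_mhz):
    mem_gain = 3.6 * mem_offset_mhz             # bandwidth helps linearly...
    core_penalty = 0.004 * mem_offset_mhz ** 2  # ...until the core throttles
    return 10000 + mem_gain - core_penalty

def sweep(offsets):
    """Score every offset and return (best_offset, best_score)."""
    results = {off: simulated_score(off) for off in offsets}
    return max(results.items(), key=lambda kv: kv[1])

best_offset, best_score = sweep(range(0, 801, 50))
print(best_offset)  # peaks at +450 in this toy model
```

The point of the exercise is that "more memory offset" is not monotonically better once the card's power budget starts robbing the core, which is exactly why the chart found a sweet spot instead of a straight line.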
Quote:


> Originally Posted by *EDGERRIES*
> 
> This is the best statement I have read in a long while. It's all about the time and utility you've gotten from the product; everyone knows the cycle. GTX 1080 users have had that performance (gaming, benching, etc.) since the end of May and might hold off until the GTX 1180, which will be faster than the 1080 Ti. Titan XP owners have had that performance since August and will probably grab the new Titan when it launches as the refresh for their cycle. It's a never-ending process.
> 
> It's going to be the same with the next cycle: the XX80, followed by Titan Volta, followed by the Ti variant.
> 
> It's your choice when to buy and gain utility from the product.
> 
> Baasha, I'm excited to see some videos of the 1080 Tis vs. the Titan XPs on your channel. Keep up the good work, champ!


Ti can beat Titan only if we get a much more free BIOS...


----------



## jsutter71

Quote:


> Originally Posted by *CptSpig*
> 
> Thank you for your service and helping keep Americans safe!


Thank you much Sir. Fortunately my extended over seas visits are over. I retired in 2013. My 24yo daughter on the other hand chose to enter the service the same year I retired. She has been in almost 5 years now and she loves it.

A pic of her and me when she graduated from her Advanced Individual Training at Ft. Sam Houston in San Antonio, Texas.


----------



## jhowell1030

Quote:


> Originally Posted by *CptSpig*
> 
> Excuse me but show some respect!


Curious to see what folks think of the 1080 Ti now that the review embargo has lifted and reviews are being posted/shared. I still plan on keeping my Titan, don't get me wrong, but now I'm wondering if I should pick one of these up to play with?

http://hothardware.com/reviews/nvidia-geforce-gtx-1080-ti-performance-review-with-intel-and-ryzen?page=9


----------



## seven7thirty30

Quote:


> Originally Posted by *jhowell1030*
> 
> Curious to see what folks think of the 1080 Ti now that the review embargo has lifted and reviews are being posted/shared. I still plan on keeping my Titan, don't get me wrong, but now I'm wondering if I should pick one of these up to play with?
> 
> http://hothardware.com/reviews/nvidia-geforce-gtx-1080-ti-performance-review-with-intel-and-ryzen?page=9


I would have liked to see them throw the 980Ti into those charts, just out of curiosity. I didn't like that they overclocked the 1080Ti but didn't overclock the Titan X Pascal. I wanted to see those charts.


----------



## jhowell1030

Quote:


> Originally Posted by *seven7thirty30*
> 
> I would have liked to see them throw the 980Ti into those charts, just out of curiosity. I didn't like that they didn't compare it to an overclocked Titan X Pascal.


I agree. It would have been nice to see both cards head to head with overclocks.


----------



## xTesla1856

All I care about is max OC vs. max OC. At 2152/11k this TXP over here is tough to beat


----------



## mouacyk

Quote:


> Originally Posted by *xTesla1856*
> 
> All I care about is max OC vs. max OC. At 2152/11k this TXP over here is tough to beat


Just the guy I want to ask. Have you TXP owners been able to eliminate the pesky throttling completely and hold stable clocks above 2000MHz? I'd like to see some Firestrike graphics scores to get a sense of what might be possible with the 1080 Ti, fully unleashed without PL or thermal throttling. Thanks!


----------



## xTesla1856

Quote:


> Originally Posted by *mouacyk*
> 
> Just the guy I want to ask. Have you TXP owners been able to eliminate the pesky throttling completely and hold stable clocks above 2000MHz? I'd like to see some Firestrike graphics scores to get a sense of what might be possible with the 1080 Ti, fully unleashed without PL or thermal throttling. Thanks!


Throttling is still a reality, even under water. Mine stays at 2152 for a very long time, sometimes dipping to 2126 depending on load and temps. I wish we could unlock the vBIOS and disable boost like on Maxwell. I see potential for 2200MHz in this card.


----------



## Silent Scone

If the load permits, mine stays at 2100MHz no issue... 30°C load temps.


----------



## Artah

Quote:


> Originally Posted by *jhowell1030*
> 
> I agree. It would have been nice to see both cards head to head with overclocks.


I don't see why people make that kind of garbage review. It's like comparing a Lamborghini and a Ferrari and declaring the Ferrari the winner at 5,000 RPM, when the Lamborghini's power doesn't really come on until the 5,000-10,000 RPM range while the Ferrari's power stays linear even at 10,000 RPM.

tl;dr: That review is obviously aimed at people who don't plan to overclock. That's my 2 cents.


----------



## CptSpig

Quote:


> Originally Posted by *jhowell1030*
> 
> Curious to see what folks think of the 1080 Ti now that the review embargo has lifted and reviews are being posted/shared. I still plan on keeping my Titan, don't get me wrong, but now I'm wondering if I should pick one of these up to play with?
> 
> http://hothardware.com/reviews/nvidia-geforce-gtx-1080-ti-performance-review-with-intel-and-ryzen?page=9


I am sticking with the Titan X Pascal; it just sounds better. I want to see real-world benchmark scores. In the review it looked like the 1080 Ti hit a wall at 1999MHz. If that is the case, I think there will be a battle for supremacy. On that note, I think I can get a second Titan for a good price.


----------



## jsutter71

Quote:


> Originally Posted by *CptSpig*
> 
> I am sticking with the Titan X Pascal; it just sounds better. I want to see real-world benchmark scores. In the review it looked like the 1080 Ti hit a wall at 1999MHz. If that is the case, I think there will be a battle for supremacy. On that note, I think I can get a second Titan for a good price.


You will not be disappointed with that pair.


----------



## pez

Maybe I need to take advantage of the TXP crowd that's selling their GPUs too cheap. Would be a reasonable way to get a second one at a cheaper price







.


----------



## eliau81

We need a BIOS unlock more than ever!!!
Maybe we can convince NVIDIA to be kind and let us unlock it after stabbing us in the back with a new lower-priced, better card.


----------



## gamingarena

Quote:


> Originally Posted by *eliau81*
> 
> We need a BIOS unlock more than ever!!!
> Maybe we can convince NVIDIA to be kind and let us unlock it after stabbing us in the back with a new lower-priced, better card.


Or maybe they can send us $300-500 Voucher for next Purchase as good gesture...


----------



## pez

Stabbing our back?


----------



## Leyaena

Quote:


> Originally Posted by *eliau81*
> 
> ... stabbing us in the back with a new lower-priced, better card.


A newly released piece of technology is cheaper and/or performs better than a similar piece of technology that came out somewhere around the middle of last year?

*gasp*
Say it ain't so!

Really though, what did you expect?


----------



## meson1

Quote:


> Originally Posted by *Leyaena*
> 
> A newly released piece of technology is cheaper and/or performs better than a similar piece of technology that came out somewhere around the middle of last year?
> 
> *gasp*
> Say it ain't so!
> 
> Really though, what did you expect?


Yes. Stuff gets faster. And performance gets cheaper. It's the way of the tech world. It's not new for products to be superseded after 6 months.


----------



## cg4200

Stuff gets faster all the time, yes, agreed. But NVIDIA did something unusual by making the $700 1080 Ti 3% faster in games, stock, than the $1,200 Titan XP.
Yes, early adopters pay more for first use, and it's usually basically a wash when the Ti comes out and is close, but typically the Titan remains king.
I do not regret buying a Titan XP; the last 8 months have been great.
(But I do feel a little butt hurt, YES.)
I wonder what the max OC on the 1080 Ti's RAM will be and whether it scales well. The benchmarks being run are Titan XP vs. stock 1080 Ti, where the Ti's RAM is faster and its better fan holds boost longer.
My Titan can game at 2100 on the core with +650 on the memory, which puts me at 11.2 Gbps and an 8200 Fire Strike Ultra graphics-only score.
I have not had time to read all the reviews yet, so anyone who has read a lot: what can you overclock the core by on average, and how much more does the memory do, on the Ti?
Thanks


----------



## Silent Scone

Seen a few reviewers pushing the memory to 6GHz on the Ti due to IC improvements, but this will not give much of a lead, if any, if the Titan XP in the comparison is also overclocked. Given AIB partner cards will be volt-locked, there will be very little between the bunch, and those running decent watercooling with good silicon will reign on top, regardless of which card.

Anyone selling their TXP for one of these is buying into the hype train.


----------



## xTesla1856

Well guys, this is where my membership to this thread gets revoked. Sold the Titan today for a pretty insane amount. Now the only question is to SLI or not to SLI the 1080Ti


----------



## Slushpup

RIP brother.


----------



## xTesla1856

Quote:


> Originally Posted by *Slushpup*
> 
> RIP brother.


Let's just say I get to keep 600 bucks after I get a 1080Ti.


----------



## st0necold

Nothing like dropping almost 2 grand on a GPU and having the exact same GPU come out 4 months later for $699.


----------



## xTesla1856

Quote:


> Originally Posted by *st0necold*
> 
> Nothing like dropping almost 2 grand on a GPU and having the exact same GPU come out 4 months later for $699.


Imported for 1330, sold today for 1300. Nothing like dropping 2 G's


----------



## jhowell1030

Quote:


> Originally Posted by *xTesla1856*
> 
> Imported for 1330, sold today for 1300. Nothing like dropping 2 G's


I'm glad you got your money back! There are others out there that try to low ball in the most repugnant way possible while trying to justify it with the "fact" that something new is out there. Those individuals are what kill the secondary market.


----------



## jsutter71

What a bunch of Negative Nellies and Sally Sadsacks. Maybe it's just me, but anyone who has been in the PC community for more than a few days should realize that this happens all the time. I get that you run out and drop a ton of cash on the latest, fastest piece of hardware, but that new piece of equipment is no different than anything else: it's only a matter of time before something else gets released to replace it.

There is absolutely ZERO reason to swap out a TXP for a 1080 Ti. I seriously doubt anyone on this thread will see any speed differences in real-world use. Sure, perhaps in benchmarks, IF they don't bother to overclock their TXP. The truth is that it's almost the exact same card; Nvidia just clocked it higher to make up for the 1GB memory difference. Since it is clocked higher from the start, it is entirely possible that it will run hotter and might not overclock as well as a TXP. The comparison data released so far has not taken those considerations into account. Regardless, if you're going to drop a ton of cash on a GPU like this, then pick up a second and run them in SLI. You would be hard pressed to find any speed deficits in games and programs that support SLI running a pair of TXPs.

IMHO the only reason to be irritated is the cost difference, but even that is expected. The original Macintosh released in 1984 cost $2,495, which today would be about $6,018 adjusted for inflation. Most people willing to spend that much on PC hardware do it to try to future-proof as much as possible. For example, I have a 1st-gen Core i7 920 CPU which I bought in 2008 and was still using until around 2013. The TXPs are extremely powerful, so unless you're gaming in 8K, why stress it?
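The Macintosh inflation figure above checks out roughly. The CPI values below are approximate U.S. CPI-U numbers and an assumption of this sketch:

```python
# Rough CPI adjustment of the 1984 Macintosh price into 2017 dollars.
# CPI values are approximate U.S. CPI-U figures (assumed for illustration).

CPI_1984 = 101.9   # early 1984
CPI_2017 = 245.1   # early 2017

mac_1984 = 2495.00
mac_2017 = mac_1984 * CPI_2017 / CPI_1984

print(f"${mac_2017:,.0f}")  # roughly $6,000, in line with the $6,018 quoted
```

The exact answer shifts a little depending on which month's CPI you pick, which is why the quoted $6,018 and this estimate differ by a few dollars.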


----------



## xTesla1856

I never respond to lowball offers. When I sell, I have a set price in mind, if I can sell it for that, great. If not, I keep the product. In this case it worked and I'm very happy. Plus I got to diddle a Titan XP for almost 3 months


----------



## xTesla1856

Quote:


> Originally Posted by *jsutter71*
> 
> What a bunch of Negative Nellies and Sally Sadsacks. Maybe it's just me, but anyone who has been in the PC community for more than a few days should realize that this happens all the time. I get that you run out and drop a ton of cash on the latest, fastest piece of hardware, but that new piece of equipment is no different than anything else: it's only a matter of time before something else gets released to replace it. There is absolutely ZERO reason to swap out a TXP for a 1080 Ti. I seriously doubt anyone on this thread will see any speed differences in real-world use. Sure, perhaps in benchmarks, IF they don't bother to overclock their TXP. The truth is that it's almost the exact same card; Nvidia just clocked it higher to make up for the 1GB memory difference. Since it is clocked higher from the start, it is entirely possible that it will run hotter and might not overclock as well as a TXP. The comparison data released so far has not taken those considerations into account. Regardless, if you're going to drop a ton of cash on a GPU like this, then pick up a second and run them in SLI. You would be hard pressed to find any speed deficits in games and programs that support SLI running a pair of TXPs. IMHO the only reason to be irritated is the cost difference, but even that is expected. The original Macintosh released in 1984 cost $2,495, which today would be about $6,018 adjusted for inflation. Most people willing to spend that much on PC hardware do it to try to future-proof as much as possible. For example, I have a 1st-gen Core i7 920 CPU which I bought in 2008 and was still using until around 2013. The TXPs are extremely powerful, so unless you're gaming in 8K, why stress it?


If you can sell something for almost the same money you paid, and get essentially the same performance for almost half the price, why would you not do that? I love the Titans, with Maxwell I kept both of mine after the Ti came out, but this time it's a little much to swallow. The difference is just too minor to justify 600 bucks in my case. I get your general point though.


----------



## stocksux

Quote:


> Originally Posted by *xTesla1856*
> 
> If you can sell something for almost the same money you paid, and get essentially the same performance for almost half the price, why would you not do that? I love the Titans, with Maxwell I kept both of mine after the Ti came out, but this time it's a little much to swallow. The difference is just too minor to justify 600 bucks in my case. I get your general point though.


100% on board with this. I was able to return my Titan XP to Nvidia as I was under 30 days of ownership. I just grabbed a 1080 Ti from Micro Center. Same performance and a bunch of extra money in my pocket. Win-win.


----------



## vmanuelgm

Quote:


> Originally Posted by *mouacyk*
> 
> Just the guy I want to ask. Have you TXP owners been able to eliminate the pesky throttling completely and hold stable clocks above 2000MHz? I'd like to see some Firestrike graphics scores to get a sense of what might be possible with the 1080 Ti, fully unleashed without PL or thermal throttling. Thanks!


----------



## jsutter71

Quote:


> Originally Posted by *xTesla1856*
> 
> If you can sell something for almost the same money you paid, and get essentially the same performance for almost half the price, why would you not do that? I love the Titans, with Maxwell I kept both of mine after the Ti came out, but this time it's a little much to swallow. The difference is just too minor to justify 600 bucks in my case. I get your general point though.


So have you already purchased your 1080 Tis yet???? And I doubt it will be a $600 difference. The markup on these cards will be steep for the next several months because of supply and demand. Bet on it.


----------



## xTesla1856

Quote:


> Originally Posted by *jsutter71*
> 
> So have you already purchased your 1080TI's yet???? And I doubt it will be $600 difference. The mark up on these cards will be steep for the next several months because of supply and demand. Bet on it.


I'll be ordering from Nvidia directly like I did the Titan X.


----------



## jhowell1030

Quote:


> Originally Posted by *jsutter71*
> 
> What a bunch of Negative Nellies and *Sally Sadsacks*.


Sally Sadsacks...LOL That killed me!


----------



## jsutter71

Quote:


> Originally Posted by *xTesla1856*
> 
> I'll be ordering from Nvidia directly like I did the Titan X.


They're already listing it at $699 with a $500 difference from the TXP. You better watch that page like a hawk to be able to snag one because every other vendor will be charging a lot more. Nvidia will have to amp up production to keep them in stock so you may be waiting a while. It would be one thing to preorder at that price but Notify me is not very reassuring.


----------



## jhowell1030

Quote:


> Originally Posted by *xTesla1856*
> 
> I never respond to lowball offers. When I sell, I have a set price in mind, if I can sell it for that, great. If not, I keep the product. In this case it worked and I'm very happy. Plus I got to diddle a Titan XP for almost 3 months


That's the best mentality to have about it. When I sold my pair of 980 K|NGP|Ns I didn't budge. Bought them when they launched, sold them almost ten months later for $1050 when two would still retail for $1350. Kept having the same guy across a couple of different forums (same usernames) try to lowball me for $850. Eventually, all it did was attract more attention to the post and I sold them for what I wanted. Waited several months and $200 later...*BOOM*..Titan XP

I thought about listing my TXP with its waterblock, backplate, and stock cooler somewhere, just to see what it'd fetch, and grabbing a 1080 Ti to hold me over until something more powerful comes along. That way, I'd still be a little ahead. Ultimately the plan is to have one GPU able to pump out enough pixels to sustain 100FPS @ 3440 x 1440, but I think that may be the next generation or later. Then the 1080 Ti would go into an HTPC. Still on the fence, though, because I'm not dissatisfied with the Titan in any way.


----------



## jhowell1030

Quote:


> Originally Posted by *jsutter71*
> 
> They're already listing it at $699 with a $500 difference from the TXP. You better watch that page like a hawk to be able to snag one because every other vendor will be charging a lot more. Nvidia will have to amp up production to keep them in stock so you may be waiting a while. It would be one thing to preorder at that price but Notify me is not very reassuring.


Yeah, the notify-me option didn't quite get me there for the Titan XP. On launch day they sold out instantly. I am lucky enough to work in an office, so I was able to refresh all day and buy one before the notify-me email was sent.


----------



## stocksux

Quote:


> Originally Posted by *jsutter71*
> 
> So have you already purchased your 1080TI's yet???? And I doubt it will be $600 difference. The mark up on these cards will be steep for the next several months because of supply and demand. Bet on it.


I was able to buy one at Micro Center today. But only one per customer....they had six total


----------



## jsutter71

Quote:


> Originally Posted by *stocksux*
> 
> I was able to buy one at Micro Center today. But only one per customer....they had six total


So just out of curiosity, with tax was it a $500 difference? Kinda sucks that they limited you to one, though. I would have found an uninterested friend to take with you so they could have snagged a second one right behind you.


----------



## stocksux

Quote:


> Originally Posted by *jsutter71*
> 
> So just out of curiosity, with tax was it a $500 difference? Kinda sucks that they limited you to one, though. I would have found an uninterested friend to take with you so they could have snagged a second one right behind you.


Nvidia charges tax as well, so that's a wash. Also, I was told yesterday by an employee that they wouldn't limit it to one per customer, or I would have brought a friend. Now I have to play the online game if I want SLI.


----------



## xTesla1856

I'll have mine shipped to tax-free Delaware and then exported from there to Europe. It'll still end up cheaper than getting one here from our ridiculous price-gouging retailers.


----------



## jsutter71

Quote:


> Originally Posted by *stocksux*
> 
> Nvidia charges tax as well, so that's a wash. Also, I was told yesterday by an employee that they wouldn't limit it to one per customer, or I would have brought a friend. Now I have to play the online game if I want SLI.


The one downside to new, cheaper technology is the hassle you go through to make it happen, and, if you plan on liquid cooling, the additional expense of the water blocks. So between the time it takes to get one, the added expense of the cooling solution (unless you're recycling your TXP blocks, where applicable), the tax, and the inconvenience of swapping the old card for the new one if liquid cooling, I would ask myself how much all that time, patience, and inconvenience is worth.


----------



## stocksux

Quote:


> Originally Posted by *jsutter71*
> 
> The one downside to newer, cheaper technology is the hassle you go through to make it happen, and, if you plan on liquid cooling, the additional expense of the water blocks. So with the time it takes to get one, the added expense of the cooling solutions (unless you're recycling your TXP blocks, if applicable), the tax, and the inconvenience of swapping out the old for the new under liquid cooling, I would ask myself how much all that time, patience, and inconvenience is worth.


I am recycling my Titan block/backplate. Nvidia taxed the Titan and Micro Center taxed the 1080 Ti. I made one call to Nvidia and was sent instructions for the Titan return. Really no sweat, pretty easy process imo. And now $500 in my pocket.


----------



## jsutter71

Quote:


> Originally Posted by *stocksux*
> 
> I am recycling my Titan block/backplate. Nvidia taxed the Titan and microcenter taxed the 1080ti. I made one call to Nvidia and was sent instructions for the Titan return. Really no sweat. Pretty easy process imo. And now $500 in my pocket ?


And there is absolutely nothing bad about that.


----------



## xTesla1856

Quote:


> Originally Posted by *stocksux*
> 
> I am recycling my Titan block/backplate. Nvidia taxed the Titan and microcenter taxed the 1080ti. I made one call to Nvidia and was sent instructions for the Titan return. Really no sweat. Pretty easy process imo. And now $500 in my pocket ?


I was super happy when EK confirmed that the TXP blocks and backplates also work for the Ti. From the outside, it will still say "TITAN X" on the back







win win for me


----------



## jsutter71

Quote:


> Originally Posted by *xTesla1856*
> 
> I was super happy when EK confirmed that the TXP blocks and backplates also work for the Ti. From the outside, it will still say "TITAN X" on the back
> 
> 
> 
> 
> 
> 
> 
> win win for me










Well it is for the most part the same card. Just like the older Titans and the 980Ti.


----------



## jhowell1030

Massdrop's drop of these almost sold out in 30 minutes.

https://www.massdrop.com/buy/nvidia-geforce-gtx-1080-ti-founders-edition?referer=6Q3EN4


----------



## xTesla1856

Quote:


> Originally Posted by *jhowell1030*
> 
> Massdrop's drop of these almost sold out in 30 minutes.
> 
> https://www.massdrop.com/buy/nvidia-geforce-gtx-1080-ti-founders-edition?referer=6Q3EN4


Since when does MD do graphics cards?


----------



## jhowell1030

They've been doing them for a while. MSI goes to them a lot with their stuff.

Not too long ago you could get one of MSI's 1070 FEs with a block for around 15% more than a typical FE was running. Those are few and far between though.


----------



## jhowell1030

Is it sad that I'm toying with the idea of getting a 1080 Ti just to throw into my current rig under water to test/compare performance against its bigger brother, also under water? You know, for science.

Honestly, I don't think there would be that much of a gap at that point. (Not that there's one now)


----------



## stocksux

Card in hand!


----------



## jsutter71

I did not know that EVGA released one yet


----------



## jhowell1030

They did preorders


----------



## lilchronic

Quote:


> Originally Posted by *jhowell1030*
> 
> They did preorders


Probably picked it up locally.


----------



## mouacyk

You people are making my 980 Ti look dull and boring.


----------



## KillerBee33

Almost $800.
http://www.evga.com/products/product.aspx?pn=11G-P4-6390-KR
http://www.guru3d.com/articles-pages/geforce-gtx-1080-ti-review,30.html
Keep ya TITANS!


----------



## gamingarena

Quote:


> Originally Posted by *KillerBee33*
> 
> Almost $800.
> http://www.evga.com/products/product.aspx?pn=11G-P4-6390-KR
> http://www.guru3d.com/articles-pages/geforce-gtx-1080-ti-review,30.html
> Keep ya TITANS!


Unless you can get close to retail price on a Titan XP, absolutely keep the Titans. The price difference is not much, and you will recover it in the end; I can bet anything you want that in the end a Titan XP will fetch at least $200 more than a 1080 Ti, so you lose nothing.
As for the guys that have OCD problems, go check with your doctor and relax; 1-5 fps means nothing, especially if you overclock the TXP.


----------



## stocksux

Quote:


> Originally Posted by *lilchronic*
> 
> Probably picked it up locally.


I did get it locally. Micro Center had six cards: two EVGA, two MSI and two Gigabyte.


----------



## jsutter71

Quote:


> Originally Posted by *mouacyk*
> 
> You people are making my 980 Ti look dull and boring.


No buyer's remorse here. Upgraded three 980 Ti's to dual TXPs. My 980 Ti's were EVGA ACX 2.0+ with EK water blocks, which I sold on eBay for $650 each. All three were step-ups from 980's. The 980 series cards gave me nothing but headaches and I even had to RMA two of them. The TXPs run fast and stable and have no issues driving my four monitors.


----------



## jsutter71

Quote:


> Originally Posted by *KillerBee33*
> 
> Almost $800.
> http://www.evga.com/products/product.aspx?pn=11G-P4-6390-KR
> http://www.guru3d.com/articles-pages/geforce-gtx-1080-ti-review,30.html
> Keep ya TITANS!


*unfreakinbelievable !!!!*

Looks like people are just a little too quick to trash the TXP. This is what happens when people react before getting proper intel. There was a time when most people in the PC community, AKA PC gamers, were considered smart nerds. As of late this thread has looked like a bad case of Chicken Little syndrome.


----------



## KillerBee33

Quote:


> Originally Posted by *jsutter71*
> 
> *unfreakinbelievable !!!!*
> 
> Looks like people are just a little to quick to trash the TXP. This is what happens when people react before getting proper intel. Their was a time when most people in the PC community AKA PC gamers were considered smart nerds. As of late this thread has looked like a bad case of chicken little syndrome.


Doubt that "Ti" will be any better







Keepin' my TXP for another 7 months until Volta is out ...
Most TXP's do 11K on the TimeSpy Graphics, here's what OC'd 1080Ti does
http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/gpu/spy/P/1127/16261?minScore=0&gpuName=NVIDIA%20GeForce%20GTX%201080%20Ti


----------



## gamingarena

Quote:


> Originally Posted by *KillerBee33*
> 
> Doubt that "Ti" will be any better
> 
> 
> 
> 
> 
> 
> 
> Keepin' my TXP for another 7 months when Volta is out ...
> Most TXP's do 11K on the TimeSpy Graphics, here's what OC'd 1080Ti does
> http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/gpu/spy/P/1127/16261?minScore=0&gpuName=NVIDIA%20GeForce%20GTX%201080%20Ti


There is not a single 1080 Ti in the top 100 on any list yet; all you see is TXP, so either they are slower clock for clock or...


----------



## hertz9753

... not many people have them or are playing games and not benchmarking. Enjoy what you have.


----------



## MrKenzie

Someone is selling their brand new Titan X and EK waterblock/backplate here in Australia on Ebay, currently at US$1,000 with 2 days to go. Will be interesting to see what it eventually goes for because the Titan X never shipped to Australia. The reference 1080Ti sells for US$850 here for instance.


----------



## Nicklas0912

To the people who sell their Titan X Pascal for a Ti: that's just stupid.

Clock for clock, the Titan X Pascal performs better; the Ti needs to go 2100MHz or more before it matches a 2062MHz Titan X Pascal.

The wider bus and extra ROPs save the Titan.

The Titan X Pascal still has more memory bandwidth: the Ti gets 484GB/s at 11Gbps, while if you OC the Titan to 11Gbps ("+500") you get 528.4GB/s.

Keep your Titan X and enjoy it; you still have the faster card when both are overclocked.
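Those bandwidth numbers are just bus width times effective per-pin rate; a quick back-of-envelope sketch (the bus widths are the cards' published specs, the rates are the figures from this post, not official numbers):

```python
def mem_bandwidth_gbs(bus_width_bits, effective_gbps):
    """GDDR bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin rate."""
    return bus_width_bits / 8 * effective_gbps

# 1080 Ti: 352-bit bus at 11 Gbps stock
print(mem_bandwidth_gbs(352, 11.0))  # 484.0

# Titan X Pascal overclocked to ~11 Gbps on its 384-bit bus
print(mem_bandwidth_gbs(384, 11.0))  # 528.0
```

The 528.4 GB/s in the post suggests the OC'd rate landed a hair above 11 Gbps, but the ordering is the point: at equal memory clocks the wider bus wins.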


----------



## Sheyster

Quote:


> Originally Posted by *jsutter71*
> 
> Thank you much Sir. Fortunately my extended over seas visits are over. I retired in 2013. My 24yo daughter on the other hand chose to enter the service the same year I retired. She has been in almost 5 years now and she loves it.
> 
> A pic of her and I when she graduated from her Advanced Individual training at Ft Sam Houston in San Antonio Texas.


That's really great dude, I live in a mil town (San Diego) and have much respect for you both. Thank you.


----------



## kx11

hey guys

i just did a benchmark run with Deus EX DX12 @ 4k maxed - no AA and vsync off , i wanna know if the GPU temps are alright since i'm using watercooling and the fans are at full speed and i see people with water cooled PCs talking about playing at 40c max !!!


----------



## stocksux

Quote:


> Originally Posted by *kx11*
> 
> hey guys
> 
> i just did a benchmark run with Deus EX DX12 @ 4k maxed - no AA and vsync off , i wanna know if the GPU temps are alright since i'm using watercooling and the fans are at full speed and i see people with water cooled PCs talking about playing at 40c max !!!


What size radiator are you using to cool the card? Single loop or dual? If single, what else is in the loop (CPU, RAM, chipset, etc.)? Need some more info from you. As far as temps go, I'd say they are high for being on water in general.


----------



## kx11

Quote:


> Originally Posted by *stocksux*
> 
> What size radiator are you using to cool the card? Single loop or dual? If single, what else is in the loop (CPU, RAM, chipset, etc.)? Need some more info from you. As far as temps go, I'd say they are high for being on water in general.


it's actually OriginPC cryogenic water cooling ( hard tubes ) lvl 3 which means it's for CPU/GPU , the bottom of the case got 2 360 rads + next to the GPU 1x240 connected to a single 120 (i think ) fan

here's how it looks inside ( 90 degree oriented Mobo )


----------



## stocksux

Quote:


> Originally Posted by *kx11*
> 
> it's actually OriginPC cryogenic water cooling ( hard tubes ) lvl 3 which means it's for CPU/GPU , the bottom of the case got 2 360 rads + next to the GPU 1x240 connected to a single 120 (i think ) fan
> 
> here's how it looks inside ( 90 degree oriented Mobo )


That pic makes it tough to tell, but if you have two 360mm rads, a 240mm rad and a 120mm rad, that should be way more than enough to cool a single GPU and single CPU. Something's not right. Maybe the block on the GPU isn't seated properly, or maybe the thermal paste wasn't applied quite right. Could be something clogging up the block. Not real sure. But with 1080mm of radiator and fans on full you should not be in the 50's for the GPU.
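For anyone following along, here is the usual hobbyist rule of thumb behind that judgment: one 120mm radiator section per water-cooled component plus one spare for headroom. It's a rough heuristic, not a guarantee, but it frames the numbers:

```python
# Radiator sizing sanity check for the build described above.
rads_mm = [360, 360, 240, 120]   # the rads reported in the case
components = 2                    # CPU + one GPU in the loop

total_mm = sum(rads_mm)
recommended_min_mm = (components + 1) * 120  # rule-of-thumb minimum

print(total_mm)                         # 1080
print(total_mm >= recommended_min_mm)   # True -> capacity is not the problem
```

With triple the heuristic minimum installed, 50C+ load temps point at flow rate or block contact rather than radiator area.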


----------



## kx11

Quote:


> Originally Posted by *stocksux*
> 
> That pic makes it tough to tell but if you two 360mm rads, a 240mm rad a 120mm rad that should be way more than enough to cool a single gpu and single CPU. Something's not right. Maybe the block on the gpu isn't seated properly or maybe the thermal paste wasnt applied quite right. Could have something clogging up the block. Not real sure. But with 1080mm of radiator and fans ok full you should not be in the 50's for the gpu.


the GPU is OC'd btw and the CPU too (4.0GHz currently), so I would not worry too much about the CPU at all; it's the GPU that worries me


----------



## stocksux

Quote:


> Originally Posted by *kx11*
> 
> the GPU is OC btw and the CPU too (4.0 ghz currently) so iwould not worry too much about the CPU at all , it's the gpu that worries me


FWIW here's my setup: CPU i7 7700K + ASUS Formula IX (has a water block built in for cooling the power delivery), all being cooled by a 480mm radiator with fans in push only. While gaming I see mid 30s. During Realbench stress tests I see low 50s. Then I have a separate loop for the GPU, cooled by a 560mm radiator in push/pull. My max temps at load are mid/high 30s. Both components are overclocked and fan speeds are at a minimum for low noise. I could push temps down even further by turning the fans to full.


----------



## xarot

My card hits around 50c when overclocked to the max and depending on fan speed, I wouldn't worry about it at all. Maybe on the higher side when compared to some others but I don't really care that much.


----------



## kx11

Quote:


> Originally Posted by *stocksux*
> 
> FWIW here's my setup; CPU i7 7700k + asus fornula IX (has water block built in for cooling power delivery) and that is all being cooled by a 480mm radiator with fans in push only. While gaming I see mid 30s. During Realbench stress tests I see low 50s. Then I have a separate loop for the GPU. It's cooled by a 560mm radiator in push pull. My max temps at load are mid/high 30s. Both components are overclocked and fan speeds are at a minimum for low noise. I could push temps down even further by turning the fans to full.


your CPU is a quad core while mine has 8 cores









i might drain the liquid because i think it's kinda thick to run through the tubes fast enough to cool everything, since it's not a premixed liquid. I tried premixed before and it gets the job done (not like yours though, but better than what I have now)


----------



## Vellinious

Quote:


> Originally Posted by *kx11*
> 
> your CPU is a quad core while mine got 8 of them
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i might drain the liquid because i think it's kinda thick to run through the tubes fast enough to cool everything since it's not a premixed liquid which i tried before and it gets the job done ( not like yours though but better than mine now )


What kind of cooling are you running? What's your ambient temps? Temp inside the case? With a full coverage block, and the core speed shown in that video you posted, I wouldn't think the core should be running more than 20c over ambient...probably closer to 15c.


----------



## stocksux

I bid you all farewell. The Titan XP is gone (full refund from Nvidia); a 1080 Ti is running my rig. It was a fun month of Titan. Good luck to you all!


----------



## kx11

Quote:


> Originally Posted by *Vellinious*
> 
> What kind of cooling are you running? What's your ambient temps? Temp inside the case? With a full coverage block, and the core speed shown in that video you posted, I wouldn't think the core should be running more than 20c over ambient...probably closer to 15c.


Quote:


> OriginPC CRYOGENIC Hard-Line Liquid Cooling on GENESIS desktop!


that is the cooling I got in my PC. It arrived with leakage and a damaged tube, which made me change the tube and the liquid. I'll have to replace the liquid again for sure; the temps aren't what I thought watercooling should be

some stats from HWinfo

Idle , chrome , steam and MS Edge are open


----------



## xTesla1856

My buyer bailed out of the deal, I guess I'm staying in this (increasingly) exclusive club


----------



## jsutter71

Quote:


> Originally Posted by *xTesla1856*
> 
> My buyer bailed out of the deal, I guess I'm staying in this (increasingly) exclusive club


Why not just use eBay? I've had good success selling all my used PC equipment there.


----------



## KillerBee33

Quote:


> Originally Posted by *xTesla1856*
> 
> My buyer bailed out of the deal, I guess I'm staying in this (increasingly) exclusive club


You can also use OCN's Marketplace to sell parts; I sold a 1080 and a 10-series EVGA Hybrid kit here.


----------



## jsutter71

Quote:


> Originally Posted by *KillerBee33*
> 
> You can also use OCNs Market Place to sell parts , i sold 1080 and 10 series EVGA Hybrid Kit here.


Not without 35 reps.


----------



## KillerBee33

Quote:


> Originally Posted by *jsutter71*
> 
> Not without 35 reps.


Let's help the dude out


----------



## hertz9753

It won't work because it is not allowed under the ToS.


----------



## jsutter71

Quote:


> Originally Posted by *xTesla1856*
> 
> My buyer bailed out of the deal, I guess I'm staying in this (increasingly) exclusive club


Something to consider: this guy has an identical setup to mine except he's running 1080 Ti's.

His score
http://www.3dmark.com/fs/11988933



My score
http://www.3dmark.com/fs/11512998



Pretty sizable difference.


----------



## pez

I might actually sell the TXP only to fund a second Ti. I'm about to switch it back into my main build to ready the 1080 for the step-up process. Not sure how I feel about moving away from SFF again, but TXP SLI performance for a couple bills over the price of a single TXP is attractive to me. Otherwise, I don't see the hullabaloo about selling TXPs to get a side-grade.


----------



## xTesla1856

I'd lose massive money if I sold it anywhere right now. The guy originally bought it for 30 bucks less than what I paid. At this point I'll just keep it.


----------



## Silent Scone

Quote:


> Originally Posted by *jsutter71*
> 
> Something to consider. This guy has identical setup as mine except he's running 1080 Ti's
> 
> His score
> http://www.3dmark.com/fs/11988933
> 
> 
> 
> My score
> http://www.3dmark.com/fs/11512998
> 
> 
> 
> Pretty sizable difference.


Could be due to a number of reasons. Different driver, overclocking, LOD tweaks. The difference in reality all things considered would not be that great if the setups were identical.

The results indicate that the Titan is 13.7% faster in Graphics test 2... We know that's not really the case, don't we?
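For anyone wondering where a figure like 13.7% comes from, it's just the ratio of the two sub-test scores minus one. A sketch with illustrative placeholder scores (not the actual numbers from those 3DMark links):

```python
def pct_faster(score_a, score_b):
    """How much faster score_a is than score_b, in percent."""
    return (score_a / score_b - 1) * 100

# hypothetical Graphics test 2 results for two runs
print(round(pct_faster(17000, 14950), 1))  # 13.7
```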


----------



## MrKenzie

It looks like most 1080Ti owners are maxing out at around 2060MHz, so lower clocks than the Titan. They are still far better value for money though!


----------



## vmanuelgm

Quote:


> Originally Posted by *Nicklas0912*
> 
> To the people who sell their Titan X Pascal for a Ti: that's just stupid.
> 
> Clock for clock, the Titan X Pascal performs better; the Ti needs to go 2100MHz or more before it matches a 2062MHz Titan X Pascal.
> 
> The wider bus and extra ROPs save the Titan.
> 
> The Titan X Pascal still has more memory bandwidth: the Ti gets 484GB/s at 11Gbps, while if you OC the Titan to 11Gbps ("+500") you get 528.4GB/s.
> 
> Keep your Titan X and enjoy it; you still have the faster card when both are overclocked.


But Baasha says Ti is faster, xDD


----------



## CptSpig

Quote:


> Originally Posted by *vmanuelgm*
> 
> But Baasha says Ti is faster, xDD


So far his graphics scores do not indicate the 1080 Ti is faster. KingPin is the only one who has higher scores, but we can't compare since he took down his Titan XP scores. As of right now no other 1080 Ti's have made the top ten. Only time will tell.

So far a single 1080 Ti is only number eighty-six on the Time Spy list, not counting KingPin, who is using a binned EVGA card with an unlocked BIOS on LN2.








Quote:


> Originally Posted by *MrKenzie*
> 
> It looks like most 1080Ti owners are maxing out at around 2060MHz, so lower clocks than the Titan. They are still far better value for money though!


That is what I have seen also; it's like they are hitting a wall at 2060. We'll see, maybe we are still in shock over the price.


----------



## JunkaDK

Quote:


> Originally Posted by *jsutter71*
> 
> Something to consider. This guy has identical setup as mine except he's running 1080 Ti's
> 
> His score
> http://www.3dmark.com/fs/11988933
> 
> 
> 
> My score
> http://www.3dmark.com/fs/11512998
> 
> 
> 
> Pretty sizable difference.


I am happy my 2 x OC'ed 1080's beat the 2 x 1080 Ti's (graphics score)... but I guess they are not OC'ed.







http://www.3dmark.com/compare/fs/11988933/fs/11957272#


----------



## Vellinious

I would have thought the 1080 Ti, even on stock clocks, would do better than that. I'm getting 48k graphics scores on my 2 x 1080s.


----------



## Gunslinger.

Quote:


> Originally Posted by *CptSpig*
> 
> So far a single 1080 ti is only number eighty six on the Time Spy list not counting KingPin who is using a highly volt modded EVGA card on LN2?


Fixed that for you.









Picture of his modded cards.


----------



## CptSpig

Quote:


> Originally Posted by *JunkaDK*
> 
> I am happy my 2 x OC'ed 1080's beat the 2 X 1080ti's ( graphics score).. but i guess they are not OC'ed.
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/compare/fs/11988933/fs/11957272#


Quote:


> Originally Posted by *Gunslinger.*
> 
> Fixed that for you.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Picture of his modded cards.


Thanks, do you remember what his score was with the Titan XP, and was the setup the same?


----------



## Radox-0

Quote:


> Originally Posted by *jsutter71*
> 
> Something to consider. This guy has identical setup as mine except he's running 1080 Ti's
> 
> His score
> http://www.3dmark.com/fs/11988933
> 
> 
> 
> My score
> http://www.3dmark.com/fs/11512998
> 
> 
> 
> Pretty sizable difference.


Those scores seem extremely odd for the 1080 Ti. I've run Firestrike 3 times now with a pair of stock 1080 Ti's and all handily blew past the score he got. Here is a run at stock which coincidentally has the same core clock listed as your TXP run, although I only have a 5960X:



http://www.3dmark.com/fs/11994432


----------



## NemChem

TL;DR: Titan wins by 2% despite worse airflow.
Edit: TL;DR: Titan wins by 1.1% despite worse airflow.

Hey guys, here is my apples-to-apples comparison - well, as close to apples-to-apples as I can get, with both cards at 100% fan speed, so the 1080 Ti will have better cooling due to no DVI port. If I had an airspeed meter I'd measure that, multiply by the surface area of each card's exhaust, and then reduce the 1080 Ti's fan speed until the figures matched - alas I cannot do that







.

For some reason 3D Mark thinks both runs were done with the Titan X but I can assure you, the one on the left was done on the 1080 Ti as Afterburner reported it going to full load whilst the Titan remained idle - and I'd like to see the Titan try and do 12,268 Edit: 12528 MHz on the memory...

Edit: Felt the temperature of the exhausted air from the cards on this run just to make sure... definitely the 1080 Ti doing the work







.

*System*

ASUS Crosshair VI Hero
AMD Ryzen 7 1800X @ 4.1 GHz
G.Skill TridentZ @ 3200MHz CL14-13-13-13-31
*1080 Ti clocks*

120% power
+175 MHz core (Max boost seen 2076 MHz)
+625 MHz memory (539 GB/s)
Edit: +750 MHz memory (12.5 GHz effective, 551.2 GB/s)
*Titan X (Pascal) clocks*

120% power
+200 MHz core (Don't remember the max boost, sorry)
+750 MHz memory (11.5 GHz effective, 552 GB/s)
http://www.3dmark.com/compare/spy/1374347/spy/1358968
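The effective rates and bandwidths in those clock lists are consistent with the memory offset counting double on the effective (double-data-rate) figure, times the bus width. A sketch using the post's numbers; the "offset counts twice" convention is an assumption about how the tuning tool reports it:

```python
def effective_rate_mhz(stock_gbps, offset_mhz):
    # Memory offsets are applied to the command clock, so they count twice
    # on the effective (DDR) rate -- assumed convention here.
    return stock_gbps * 1000 + 2 * offset_mhz

def bandwidth_gbs(bus_width_bits, effective_mhz):
    return bus_width_bits / 8 * effective_mhz / 1000

# 1080 Ti: 352-bit bus, 11 Gbps stock, +750 offset -> "12.5 GHz effective"
ti_eff = effective_rate_mhz(11, 750)
print(ti_eff)                      # 12500
print(bandwidth_gbs(352, ti_eff))  # 550.0

# Titan X Pascal: 384-bit bus, 10 Gbps stock, +750 offset -> "11.5 GHz effective"
print(bandwidth_gbs(384, effective_rate_mhz(10, 750)))  # 552.0, matching the post
```

The post's 551.2 GB/s for the Ti implies its actual effective rate sat slightly above 12.5 GHz; the nominal arithmetic gives 550.0.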


----------



## Gunslinger.

Quote:


> Originally Posted by *CptSpig*
> 
> Thanks, do your rember what his score was with Titan XP and was the setup the same?


13062 with Titan X Pascal

http://hwbot.org/submission/3437024_kingpin_3dmark___time_spy_titan_x_pascal_13062_marks

13291 with 1080 Ti at +83MHz on the core and +50MHz on the memory, rest of the system clocks are the same.

http://hwbot.org/submission/3486294_kingpin_3dmark___time_spy_geforce_gtx_1080_ti_13291_marks


----------



## CptSpig

Quote:


> Originally Posted by *Radox-0*
> 
> Those scores seem extremely odd for the 1080Ti. Ran Firestrike 3 times now with a pair of stock 1080Ti's and all handily blew past the score he got. Here is a run at stock which coincidently has the same core clock listed as your TXP run although I only have a 5960x:
> 
> 
> 
> http://www.3dmark.com/fs/11994432


That only puts you at number 89 on the Hall of Fame list. Lots of Titan XP's in front. Your memory is not stock, it's overclocked? Give us your scores when you have your best overclocks.








Quote:


> Originally Posted by *Gunslinger.*
> 
> 13062 with Titan X Pascal
> 
> http://hwbot.org/submission/3437024_kingpin_3dmark___time_spy_titan_x_pascal_13062_marks
> 
> 13291 with 1080 Ti at +83MHz on the core and +50MHz on the memory, rest of the system clocks are the same.
> 
> http://hwbot.org/submission/3486294_kingpin_3dmark___time_spy_geforce_gtx_1080_ti_13291_marks


Thanks, When he made the Titan XP run did he mod the cards the same way?


----------



## xTesla1856

Here's my run from last night with my sig rig:

http://www.3dmark.com/3dm/18572268?

Memory is at +500, so 11k effective. Highest boost I saw was 2152 before it settles around 2100 (I need a damn custom loop already







)


----------



## Radox-0

Quote:


> Originally Posted by *CptSpig*
> 
> That only puts you at number 89 on the hall of fame list. Lots of Titan XP's in front. Your memory is not stock it's overclocked? Give us your scores when you have your best overclocks.


That is at stock, on everything including memory. That's how all 1080 Ti's are showing up on the runs: memory at stock. Number 89, you say? Did not even see; not too bad for a random stock run







Will do when I can; right now I'm into gaming


----------



## CptSpig

Quote:


> Originally Posted by *Radox-0*
> 
> That is at stock. On everything including memory. That's how all 1080tis are showing up on the runs for memory at stock. 89 you say, did not even see, not too bad for a random stock run
> 
> 
> 
> 
> 
> 
> 
> Will do when I can right now into gaming


Thanks


----------



## jsutter71

Quote:


> Originally Posted by *Radox-0*
> 
> Those scores seem extremely odd for the 1080Ti. Ran Firestrike 3 times now with a pair of stock 1080Ti's and all handily blew past the score he got. Here is a run at stock which coincidently has the same core clock listed as your TXP run although I only have a 5960x:
> 
> 
> 
> http://www.3dmark.com/fs/11994432


I like to keep an open mind, as there are multiple variables to consider. He is under water, same as mine. We're both using the same amount of memory, clocked the same. He has an ASUS Rampage V Edition 10 and I have an ASUS X99-E WS USB 3.1, same CPU. The differences are I was running an SM951 and he was using a 950 Pro for storage, plus the background programs and number of monitors. I don't ever disable background programs when I test, nor turn off additional monitors, since I'm running 4 of them. All my monitors are IPS panels with no G-Sync. My primary monitor, which is used for the testing, is a 31" 4096x2160 display. When I ran that test I was being conservative with my settings and was not overclocking my system memory.


----------



## Vellinious

Quote:


> Originally Posted by *jsutter71*
> 
> I like to open mind as their are multiple variables to consider. He is under water same as mine. Were both using the same amount of memory clocked the same. He has a Asus RAMPAGE V EDITION 10 and I have a Asus X99-E WS USB 3.1, Same CPU. The differences are I was running a SM951 and he was using a 950 pro for storage. The background programs and number of monitors. I don't ever disable background programs when I test nor turn off additional monitors since I'm running 4 of them. All my monitors are IPS panels with no g-sync. My primary monitor which is used for the testing is a 31" 4096x2160 display. When I ran that test I was being conservative with my settings and was not overclocking my system memory.


I don't think the system memory does much for the physics score in FS. Has a pretty sizable impact on the CPU test in Timespy, though.


----------



## Gunslinger.

Quote:


> Originally Posted by *CptSpig*
> 
> Thanks, When he made the Titan XP run did he mod the cards the same way?


No, that card is modded with an EPower (external power board) which is more complicated.

The 1080 Ti's are modded with standard VR's and in a more typical way than the EPower method.


----------



## Radox-0

Quote:


> Originally Posted by *jsutter71*
> 
> I like to open mind as their are multiple variables to consider. He is under water same as mine. Were both using the same amount of memory clocked the same. He has a Asus RAMPAGE V EDITION 10 and I have a Asus X99-E WS USB 3.1, Same CPU. The differences are I was running a SM951 and he was using a 950 pro for storage. The background programs and number of monitors. I don't ever disable background programs when I test nor turn off additional monitors since I'm running 4 of them. All my monitors are IPS panels with no g-sync. My primary monitor which is used for the testing is a 31" 4096x2160 display. When I ran that test I was being conservative with my settings and was not overclocking my system memory.


Don't get me wrong, there will be differences, but the FS GPU score does a pretty good job of just comparing overall graphics performance, which you noted. The overall system score is where those other factors start being involved pretty heavily.

Simply put, I cannot fathom why there would be a 5000 GPU score difference when in pretty much every other scenario the 1080 Ti and Titan XP at stock are relatively close performance-wise, more so when my own graphics scores show the same (disregarding the other scores).

A random review with a lower-clocked 5960X and a pair of FE 1080 Ti's at stock, Firestrike graphics result: https://us.hardware.info/reviews/7270/4/nvidia-geforce-gtx-1080-ti-review-incl-sli-faster-card-for-the-same-price-benchmarksn3dmark-fire-strike--extreme--ultra

The main reason I link it is that it shows a graphics score of 52k at stock in SLI, roughly in line with the stock TXP and 1080 Ti SLI scores I got.


----------



## vmanuelgm

Quote:


> Originally Posted by *CptSpig*
> 
> So far his graphic scores do not indicate the 1080 ti is faster. KingPin is the only one who has higher scores but we can't compare he took down his Titan XP scores. As of right now no other 1080 ti's have made the top ten. Only time will tell.


I was joking.

Kingpin has reached a higher score because of the Ti PCB. The Titan XP has to be hard modded; some custom Ti's come prepared for war, xDDD

Except on LN2, it will be difficult for a Ti to beat the TXP unless it does 2220+ on the core and 2GHz over stock memory.

I thought Heaven would be a great bench for the Ti, but the TXP wins easily.


----------



## Gunslinger.

Quote:


> Originally Posted by *vmanuelgm*
> 
> I was joking.
> 
> Kingpin has reached a higher score because of the Ti pcb. The TitanXP has to be hard modded, some custom Ti's come prepared for war, xDDD
> 
> Except for LN2, will be difficult a Ti can beat TXP unless it does 2220+ and 2GHz over stock memory.
> 
> I thought Heaven would be a great bench for Ti, but TXP wins easily.


Where to start...

Both the Titan XP and 1080 Ti scores from KP are run on cards that are "hard modded"; it has nothing to do with the PCB.

Heaven is strictly a CPU bench with today's powerful GPUs. A better comparison is Time Spy or the Firestrike benchmarks.


----------



## xTesla1856

To those who did the shunt mod:

How much did you gain, if anything?


----------



## Jpmboy

Quote:


> Originally Posted by *Gunslinger.*
> 
> Where to start...
> 
> Both Titan XP and 1080 Ti scores from KP are run on cards that are "hard modded" it has nothing to do with the PCB
> 
> Heaven is strictly a CPU bench with today's powerful GPU's. A better comparison is Time Spy or the Firestrike benchmarks.


To isolate the GPU, Firestrike Ultra maybe? Or, to really take the CPU out of the question, VR Mark Blue Room?
Let's hope the new Unigine benchmark (was to be out Q1







) ups the ante.


----------



## Gunslinger.

Quote:


> Originally Posted by *Jpmboy*
> 
> To isolate the GPU, Firestrike Ultra maybe? Or, to really take the CPU out of the question, VR Mark Blue Room?
> Let's hope the new Unigine benchmark (was to be out Q1
> 
> 
> 
> 
> 
> 
> 
> ) ups the ante.


Firestrike is the hardest of the three to pass at high clocks. (FS>FSE>FSU) At least from my experience with 980 Ti clocked at 2000MHz+ on LN2

Blue Room would be another good one to compare with.

It would be very interesting to see scores compared at identical GPU core and memory clocks as well as CPU clocks, off of the same system.


----------



## vmanuelgm

Quote:


> Originally Posted by *Gunslinger.*
> 
> Where to start...
> 
> Both Titan XP and 1080 Ti scores from KP are run on cards that are "hard modded" it has nothing to do with the PCB
> 
> Heaven is strictly a CPU bench with today's powerful GPU's. A better comparison is Time Spy or the Firestrike benchmarks.


You are right, I thought he used a custom PCB from EVGA. My fault...










In regard to performance, all benches are useful, Heaven being one more. I still don't see any 1080 Ti's in the Time Spy top 10, but it is too early. Must wait for shunt mods and custom water loops...










But Ti's need more than 2200 and 2 GHz over stock memory to beat the TitanXP at max. A few Titans are able to bench (custom water) at 2189+-5900+... This cant be denied.

Quote:


> Originally Posted by *Gunslinger.*
> 
> Firestrike is the hardest of the three to pass at high clocks. (FS>FSE>FSU) At least from my experience with 980 Ti clocked at 2000MHz+ on LN2
> 
> Blue Room would be another good one to compare with.
> 
> It would be very interesting to see scores compared at identical GPU core and memory clocks as well as CPU clocks, off of the same system.


A good bench to test stability is Heavensward.


----------



## Vellinious

Quote:


> Originally Posted by *Jpmboy*
> 
> To isolate the GPU, Firestrike Ultra maybe? Or, to really take the CPU out of the question, VR Mark Blue Room?
> Let's hope the new Unigine benchmark (was to be out Q1) ups the ante.


They're still using their engine, though, aren't they? I'm honestly not hoping for much....won't disappoint that way.


----------



## Jpmboy

Quote:


> Originally Posted by *Gunslinger.*
> 
> Firestrike is the hardest of the three to pass at high clocks. (FS>FSE>FSU) At least from my experience with 980 Ti clocked at 2000MHz+ on LN2
> 
> Blue Room would be another good one to compare with.
> 
> It would be very interesting to see scores compared at identical GPU core and memory clocks as well as CPU clocks, off of the same system.


Yeah, passing at 1080p is more of an on-card IO and driver issue/bottleneck than an actually harder load on the GPU, IMO. Basically, all I'm saying is that the less of an impact the CPU has on the graphics score (or overall score)... well, you know.








Quote:


> Originally Posted by *Vellinious*
> 
> They're still using their engine, though, aren't they? I'm honestly not hoping for much....won't disappoint that way.


I really don't know if the Unigine engine has been updated, or if they have (hopefully) integrated DX12.


----------



## Lee0

I read that some of us in the club have returned or sold their Titans to join the 1080 Ti club. For those who returned their Titans, how long had you owned them before returning?


----------



## PasK1234Xw

Nvidia has a 30-day return policy.

Honestly, if you're going to pay the Titan premium, you get it on release, not almost a year later when the Ti is about to release.


----------



## jhowell1030

Quote:


> Originally Posted by *PasK1234Xw*
> 
> nvidia has 30 day return
> 
> *Honestly if you're going to pay titan premium you get it on release not almost year later when Ti about to release*


That's the perfect point.


----------



## lilchronic

Quote:


> Originally Posted by *Jpmboy*
> 
> Yeah, passing at 1080p is more of an on-card IO and driver issue/bottleneck than an actually harder load on the GPU, IMO. Basically, all I'm saying is that the less of an impact the CPU has on the graphics score (or overall score)... well, you know.
> 
> I really don't know if the Unigine engine has been updated, or if they have (hopefully) integrated DX12.


They have a new benchmark coming out called superposition


----------



## Lee0

Quote:


> Originally Posted by *jhowell1030*
> 
> That's the perfect point.


I'll have to agree. But the idea of having 1080 Tis in SLI sounds kind of awesome. Then again, I don't really need it, as my Titan can handle 60fps in 4K with almost everything maxed out (some exceptions exist). I don't really need more than 60fps, since then my monitor would become the bottleneck, I would "have" to upgrade that and then something else, and the cycle continues.


----------



## Baasha

lol.. 2x 1080 Ti OC'd cannot do 60FPS @ 4K in GTA V (albeit with the NaturalVision 2.0 mod):


----------



## xTesla1856

Seems like the TXP clocks consistently higher than the Ti. None of the guys on OCN seem to reach 2100 core. I played around with my memory last night, and with +120 still on the core the memory will do +825 stable. Nuts!


----------



## BrainSplatter

Quote:


> Originally Posted by *Baasha*
> 
> lol.. 2x 1080 Ti OC'd cannot do 60FPS @ 4K in GTA V (albeit with the NaturalVision 2.0 mod):


GPU utilization looks a bit wonky. Maybe caused by some shaders in the mod. Also, DDR3 2133 RAM speed is not the greatest (although quad channel) which could cause some streaming performance issues when driving around.


----------



## Jpmboy

Quote:


> Originally Posted by *lilchronic*
> 
> They have a new benchmark coming out called superposition


yeah, that's the one I was referring to in the post above... couldn't remember the name. Anyway, it is/was to be out in Q1, so any day now.


----------



## JunkaDK

Can someone help me explain why my friend "KilmerDK" is scoring so much lower with his setup compared to Yakasa? 3DMark is showing identical clocks. Is 3DMark not showing his correct clocks, or what?

http://www.3dmark.com/compare/fs/12006051/fs/11223837


----------



## jhowell1030

It could be an overclock that's not 100% "stable." I had my 5820k clocked at 4.5GHz and was able to play games, run stress tests, render 4k video no problem. Went to run firestrike and my CPU scores were much worse than at 4.4. Gave it a little more juice on the voltage and that fixed it.


----------



## JunkaDK

Quote:


> Originally Posted by *jhowell1030*
> 
> It could be an overclock that's not 100% "stable." I had my 5820k clocked at 4.5GHz and was able to play games, run stress tests, render 4k video no problem. Went to run firestrike and my CPU scores were much worse than at 4.4. Gave it a little more juice on the voltage and that fixed it.


For the cpu score maybe, but what about gpu score?


----------



## jhowell1030

Other than every GPU being different, I really couldn't tell ya. I'm at work and couldn't open the link to see the benchmark comparison. (The site is blocked for some reason.)

Also, I have no idea of all the little tricks people use to squeeze every point out of their results, as far as NVIDIA Control Panel settings and all that jazz, so I don't know if any of those should be taken into consideration. One thing I noted from personal testing was to disable the Nvidia capture server. The only reason I thought of it was that during the CPU test, when the little pop-up window telling you that you could record came up, my FPS went from 60 down to 23 for as long as it was on screen.


----------



## stryker7314

Does the TX Maxwell waterblock fit the TX Pascal card?


----------



## jsutter71

Quote:


> Originally Posted by *stryker7314*
> 
> Does the TX Maxwell waterblock fit the TX Pascal card?


Are you serious?


----------



## JunkaDK

Quote:


> Originally Posted by *JunkaDK*
> 
> Can someone help me explain why my friend "KilmerDK" is scoring so much lower with his setup compared to Yakasa? 3DMark is showing identical clocks. Is it 3Dmark now showing his correct clocks or?
> 
> http://www.3dmark.com/compare/fs/12006051/fs/11223837


Anyone else have any input on this? Before he pulls his hair out.


----------



## MrKenzie

Quote:


> Originally Posted by *JunkaDK*
> 
> Anyone else have any input on this? Before he pulls his hair out.


My guess is "Yakasa" has much better cooling so is running the clocks higher for longer.


----------



## JunkaDK

Could be... But my friend has a dual-loop setup with his 2 Titans in one loop and the CPU on a separate loop... Will ask my friend to do another run and monitor clocks etc.


----------



## MrKenzie

Quote:


> Originally Posted by *JunkaDK*
> 
> Could be,., But my friend has a dual loop setup with his 2 titans in one loop and the CPU on a seperate loop... Will ask my friend to do another run and monitor clocks ect.


I have chilled water cooling which drops my coolant temp to about 8c below ambient. It's good enough for a FS Ultra graphics score of 8-9th place. Cooling helps a lot with Pascal!


----------



## JunkaDK

Quote:


> Originally Posted by *MrKenzie*
> 
> I have chilled water cooling which drops my coolant temp to about 8c below ambient. It's good enough for a FS Ultra graphics score of 8-9th place. Cooling helps a lot with Pascal!


How does that work? Please explain your setup.


----------



## MrKenzie

It's simply a Teko TK500 aquarium chiller in series with a pump, reservoir, and my CPU and GPU. To get it colder, use a bigger chiller. I went this way to avoid the hassle of radiator and fan setups.


----------



## JunkaDK

Quote:


> Originally Posted by *MrKenzie*
> 
> It's simply a Teko TK500 aquarium chiller in series with a pump, reservior and my CPU and GPU. To get it colder, use a bigger chiller. I went this way to avoid the hassle of radiator and fan setups.


Wow, that is awesome.. So no rads and fans in the case? Just the aquarium chiller outside the case, connected to a pump and res inside the case?


----------



## MrKenzie

Quote:


> Originally Posted by *JunkaDK*
> 
> Wow, that is awesome.. So no rads and fans in the case? Just the aquarium chiller outside the case, connected to a pump and res inside the case?


Yes that's right, I only have case fans. So my PC is virtually silent until I'm gaming, the chiller doesn't have to run at all unless I'm gaming (the coolant temperature never goes above 35c under general use).


----------



## Silent Scone

I've recently been messing about with an Aquaero to control over 1600mm of rad space. The system is near silent even under load with over 18 fans, water temps around 25-30c with the Titan around 30c. The level of control surpasses anything you can achieve with motherboard fan controls.

Although if I was in Australia, I'd likely have bitten on a chiller by now also lol.


----------



## MrKenzie

Quote:


> Originally Posted by *Silent Scone*
> 
> I've recently been messing about with an Aquaero to control over 1600mm of rad space. The system is near silent even under load with over 18 fans, water temps around 25-30c with the Titan around 30c. The level of control surpasses anything you can achieve with motherboard fan controls.
> 
> Although if I was in Australia, I'd likely have bitten on a chiller by now also lol.


My personal experience is that the GPU temp is always about 8c above the coolant temp under load; this is an immediate reading that can be seen the second the GPU ramps to full load. So for me, on air-cooled liquid, I would expect a GPU temp of around 38c if my room temp was 25c (considering a coolant temp 5c over ambient, then another 8c above that).

The highest I have ever seen my Titan during gaming was 26c and that was with an ambient of 33c!
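For anyone wanting to sanity-check their own loop, the delta arithmetic above is simple enough to sketch. The 5c and 8c offsets below are just the figures quoted in this post, not universal constants; measure your own deltas before trusting the output:

```python
# Estimate load GPU core temp from ambient using two fixed offsets:
# coolant runs ~5c over ambient on air-cooled liquid, and the GPU
# runs ~8c over coolant under load (figures from this post only).

def estimated_gpu_temp(ambient_c, coolant_delta=5.0, gpu_delta=8.0):
    """Return the expected GPU core temperature in degrees C under load."""
    return ambient_c + coolant_delta + gpu_delta

print(estimated_gpu_temp(25))  # 38.0, matching the 25c-room example above
```

With a chiller, `coolant_delta` goes negative (e.g. -8.0 for coolant 8c below ambient), which is why chilled loops hold such low GPU temps even in hot rooms.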


----------



## Silent Scone

Quote:


> Originally Posted by *MrKenzie*
> 
> My personal experience is that the GPU temp is always about 8c above the coolant temp under load, this is an immediate reading that can be seen the second the GPU throttles to full load. So for me, on air cooled liquid, I would expect a GPU temp of around 38c if my room temp was 25c (considering a 5c over ambient coolant temp, then another 8c above coolant temp)
> 
> The highest I have ever seen my Titan during gaming was 26c and that was with an ambient of 33c!


Depends on how much load is being placed on the GPU (in terms of strictly GPU temps), resolution and what frame rate is being targeted for instance. 30c coolant is an extreme case for my setup in this climate. Coolant temps rarely breach 26c currently with an ambient of 20-22c


----------



## bl4ckdot

I finally broke the 33k graphic score : http://www.3dmark.com/fs/12026182
And the less impressive 11k on Time Spy : http://www.3dmark.com/spy/1391750

My sweet spot is +215/+510.
I'm happy


----------



## Jpmboy

Quote:


> Originally Posted by *JunkaDK*
> 
> Anyone else have any input on this? Before he pulls his hair out.


different drivers, higher cpu clock, higher ram clocks and lower temperatures... that's all.








Quote:


> Originally Posted by *Silent Scone*
> 
> I've recently been messing about with an Aquereo to control over 1600mm of rad space. System is near silent even under load with over 18 fans, water temps around 25-30c with the Titan around 30c. Level of control surpasses anything you can achieve on motherboard fan controls.
> 
> Although if I was in Australia, I'd likely have bitten on a chiller by now also lol.


the Aquaero is teets. With a couple of in-line temp sensors and using the built-in relay, it can control just about any config... even switching on a chiller if desired. (Not that I know anyone who might have tried that.)


----------



## Silent Scone

Quote:


> Originally Posted by *Jpmboy*
> 
> different drivers, higher cpu clock, higher ram clocks and lower temperatures... that's all.
> 
> the Aquaero is teets. With a couple of in-line temp sensors and using the built-in relay, it can control just about any config... even switching on a chiller if desired. (Not that I know anyone who might have tried that.)


Ah dude, not even scratched the surface yet. It's amazing


----------



## JunkaDK

Quote:


> Originally Posted by *Jpmboy*
> 
> different drivers, higher cpu clock, higher ram clocks and lower temperatures... that's all.


So the CPU clock / RAM clock can affect the graphics score that much? It can hardly be the drivers... my friend's temps are 35c at max load on his cards.


----------



## lanofsong

Hey Titan X Pascal owners,

We are having our monthly Foldathon from Monday 20th - Wednesday 22nd, 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

March 2017 Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - needs a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

3. In the client, enter your folding name (mine is the same as my OCN name), your passkey, and the Team OCN number - 37726.
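If you run the v7 FAHClient, those same identity settings can also go straight into its config.xml; the user and passkey values below are placeholders to replace with your own:

```xml
<config>
  <!-- your identity for points credit; replace the placeholder values -->
  <user value="your_folding_name"/>
  <passkey value="your_passkey_here"/>
  <!-- Team OCN -->
  <team value="37726"/>
</config>
```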

later
lanofsong


----------



## Sheyster

Well, I still have my card. Thinking about adding a second (used) one, or possibly going with 2 x EVGA FTW3's when they're available. Decisions, decisions...


----------



## Jpmboy

Quote:


> Originally Posted by *Sheyster*
> 
> Well, I still have my card. Thinking about adding a second (used) one, or possibly going with 2 x EVGA FTW3's when they're available. Decisions, decisions...


wait for the 1080Ti Classified.


----------



## Sheyster

Quote:


> Originally Posted by *Jpmboy*
> 
> wait for the 1080Ti Classified.


I don't think I'm patient enough to wait for it! It's usually a few months after all the other cards in the series are released, right?


----------



## Nitemare3219

Quote:


> Originally Posted by *Jpmboy*
> 
> wait for the 1080Ti Classified.


Absolutely no reason to wait and pay more for a Classified card. Buy an FE and water cool, or just get a current AIB offering that isn't going to add ridiculous unneeded cost.


----------



## Sheyster

Quote:


> Originally Posted by *Nitemare3219*
> 
> Absolutely no reason to wait and pay more for a Classified card. Buy an FE and water cool, or just get a current AIB offering that isn't going to add ridiculous unneeded cost.


The reason I want the FTW3 is: 1) It's available soon and, 2) It has dual BIOS, one with a higher power limit. The older 1080 FTW's second BIOS had a 130% power limit; it basically never throttled. I owned one for about a month and ran it at 2150 MHz, temps never exceeded 70 degrees with the stock fan curve and it was virtually silent. Since I'm gonna run two of the FTW3 cards, I'm only planning a modest overclock. I do not intend to water cool them.


----------



## Menthol

Quote:


> Originally Posted by *Jpmboy*
> 
> wait for the 1080Ti Classified.


Definitely waiting to see what, if anything, the custom cards can do; my EVBots have been idle for some time.


----------



## jcde7ago

Well, by pure coincidence (no, seriously) one of my TXPs seemingly went KAPUT last night...was playing the ME: A trial, temps were in the mid-50s and running stock, and my monitor blackscreened in the middle of a multiplayer game while the sound continued to play through the speakers....I had to hard reset my PC and after a few more restarts the primary card simply wouldn't output a picture any longer....

After switching my DisplayPort cable to output from the 2nd card I got a picture, and Device Manager did not even pick up the TXP from the first PCI-E slot at all... swapped a spare 970 into the slot and it was detected just fine. So, dead TXP... this would be my first failed GPU since about 2005, when I had a 7900 GTX fail on me. Just in case, I checked the GPU block for any gunk or blockage/buildup from my loop, and it was perfectly fine...

The "good news" is, it just so happened that I was checking nowinstock.com this morning and the 1080 Ti FEs were on sale for a good hour, maybe a little less, on Nvidia's site...managed to pick up two! Will be RMA'ing the failed TXP with Nvidia and then my buddy has committed to buying both TXPs off me for a substantial discount....

Was good while it lasted, hopefully these 1080 Tis will treat me well....might have held off on buying a pair but the fact that EK's TXP blocks officially support the 1080 Tis made the decision much easier for me. I wish everyone the best of luck with their TXPs...would have stayed if both of mine had lived....this will be my last generation with SLI and I had a good experience with the TXPs overall on an X34 so I was confident in getting a pair of the 1080 TIs.


----------



## MrTOOSHORT

Sorry to hear that happen and best of luck to you with the 1080tis.


----------



## Gunslinger.

Quote:


> Originally Posted by *Nitemare3219*
> 
> Absolutely no reason to wait and pay more for a Classified card. Buy an FE and water cool, or just get a current AIB offering that isn't going to add ridiculous unneeded cost.


----------



## pez

Quote:


> Originally Posted by *jcde7ago*
> 
> Well, by pure coincidence (no, seriously) one of my TXPs seemingly went KAPUT last night...was playing the ME: A trial, temps were in the mid-50s and running stock, and my monitor blackscreened in the middle of a multiplayer game while the sound continued to play through the speakers....I had to hard reset my PC and after a few more restarts the primary card simply wouldn't output a picture any longer....
> 
> After switching my displayport cable to output from the 2nd card I got a picture, and hardware manager did not even pick up the TXP from the first PCI-E slot at all...swapped in a spare 970 in the spot and it was detected just fine. So, dead TXP....this would be my first failed GPU since about 2005 when I had a 7900 GTX fail on me...just in case I checked the GPU block for any gunk or blockage/buildup from my loop, and it was perfectly fine...
> 
> The "good news" is, it just so happened that I was checking nowinstock.com this morning and the 1080 Ti FEs were on sale for a good hour, maybe a little less, on Nvidia's site...managed to pick up two! Will be RMA'ing the failed TXP with Nvidia and then my buddy has committed to buying both TXPs off me for a substantial discount....
> 
> Was good while it lasted, hopefully these 1080 Tis will treat me well....might have held off on buying a pair but the fact that EK's TXP blocks officially support the 1080 Tis made the decision much easier for me. I wish everyone the best of luck with their TXPs...would have stayed if both of mine had lived....this will be my last generation with SLI and I had a good experience with the TXPs overall on an X34 so I was confident in getting a pair of the 1080 TIs.


I love the single card performance with the TXP and X34, but I can't lie, I've considered Ti SLI a couple of times.


----------



## CptSpig

Quote:


> Originally Posted by *pez*
> 
> I love the single card performance with the TXP and X34, but I can't lie, I've considered Ti SLI a couple of times.


Why not just stick with the King and add another TXP?


----------



## Sheyster

Quote:


> Originally Posted by *CptSpig*
> 
> Why not just stick with the King and add another TXP?


The main reason I'm hesitating to add a second (used) one is the noise factor; I don't intend to water cool the cards. Something like the 1080 Ti FTW3 (or Gigabyte G1) will be much cooler and quieter in an air-cooled setup. An added bonus is the better secondary BIOS with higher power limit.


----------



## pez

Quote:


> Originally Posted by *CptSpig*
> 
> Why not just stick with the King and add another TXP?


Quote:


> Originally Posted by *Sheyster*
> 
> The main reason I'm hesitating to add a second (used) one is the noise factor; I don't intend to water cool the cards. Something like the 1080 Ti FTW3 (or Gigabyte G1) will be much cooler and quieter in an air-cooled setup. An added bonus is the better secondary BIOS with higher power limit.


Yeah, this is kinda my situation, too. Otherwise I'd scour for a secondhand TXP for 'cheap'. I bought a 1080 in January as the TXP noise was getting to me; the EVGA ACX 3.0, for example, is quiet in comparison to the FE cooler at 70% or so.

I've since come to terms with the noise a bit more and dealt with it in the interim; the TXP's performance is worth the noise for now. I have 30 days left on my Step-Up with EVGA, so I'm hoping a normal iCX 1080 Ti releases for Step-Up so I can honestly have the best of both worlds. As much as I'd love to go back to SLI again, I'd have to 'upgrade' to a new platform to feel like it was worth it.


----------



## kx11

Changed the cooling water and everything is fine now. Here's the Time Spy stress test + CPU temps.

Fans are at 100%.


----------



## jsutter71

Quote:


> Originally Posted by *jcde7ago*
> 
> Well, by pure coincidence (no, seriously) one of my TXPs seemingly went KAPUT last night...was playing the ME: A trial, temps were in the mid-50s and running stock, and my monitor blackscreened in the middle of a multiplayer game while the sound continued to play through the speakers....I had to hard reset my PC and after a few more restarts the primary card simply wouldn't output a picture any longer....
> 
> After switching my displayport cable to output from the 2nd card I got a picture, and hardware manager did not even pick up the TXP from the first PCI-E slot at all...swapped in a spare 970 in the spot and it was detected just fine. So, dead TXP....this would be my first failed GPU since about 2005 when I had a 7900 GTX fail on me...just in case I checked the GPU block for any gunk or blockage/buildup from my loop, and it was perfectly fine...
> 
> The "good news" is, it just so happened that I was checking nowinstock.com this morning and the 1080 Ti FEs were on sale for a good hour, maybe a little less, on Nvidia's site...managed to pick up two! Will be RMA'ing the failed TXP with Nvidia and then my buddy has committed to buying both TXPs off me for a substantial discount....
> 
> Was good while it lasted, hopefully these 1080 Tis will treat me well....might have held off on buying a pair but the fact that EK's TXP blocks officially support the 1080 Tis made the decision much easier for me. I wish everyone the best of luck with their TXPs...would have stayed if both of mine had lived....this will be my last generation with SLI and I had a good experience with the TXPs overall on an X34 so I was confident in getting a pair of the 1080 TIs.


Mid 50s is a warm day in Iraq, in my experience. Are your cards under water? I had a 980 Ti fail the same way you describe. My 980 Tis were run on air, then water cooled later, which I suspect was the reason for the failure. My TXPs have been under water from day 1.


----------



## trippinonprozac

God, I love the fact that so many TXP owners are jumping ship prematurely to the 1080 Ti. I just picked up a second TXP for slightly less than a new 1080 Ti.

In my experience, when both are clocked to the wall, the TXP is AT LEAST as fast, if not faster.


----------



## jsutter71

I need some help. First question: can too much overclocking lower your scores?

My Fire Strike score has dropped from a peak of 32063 to 14725. Everything is the same with my system, except that now I have the latest drivers and I'm clocking my TXPs higher. Can someone please take a look at my results and provide some input?
Thanks.

http://www.3dmark.com/fs/11512998

http://www.3dmark.com/fs/12078368


----------



## DooRules

Maybe the different nvidia drivers, or maybe the new futuremark systeminfo version is playing havoc with the low run.


----------



## pez

Quote:


> Originally Posted by *trippinonprozac*
> 
> God I love the fact that so many TXP owners are jumping ship prematurely to the 1080ti. I just picked up a second TXP for slightly less than a new 1080ti.
> 
> In my experience when both are clocked to the wall the TXP is AT LEAST as fast, if not faster.


Wow, that's pretty awesome. At that kinda price, it's a no-brainer. May I ask where you sourced your card from?


----------



## Nitemare3219

Quote:


> Originally Posted by *Gunslinger.*


Check back with me in 6 months when the Classified is finally available and it still goes no higher than ~2100MHz. You'll be glad you paid $800+ and waited all that time for no reason.


----------



## Vellinious

Quote:


> Originally Posted by *Nitemare3219*
> 
> Absolutely no reason to wait and pay more for a Classified card. Buy an FE and water cool, or just get a current AIB offering that isn't going to add ridiculous unneeded cost.


You mean except for the fact that the custom boards won't hit the power limit nearly as fast... and that the Classy will be able to use the Classified voltage tool, which will help if you keep the card cold enough.

If you're going to work to keep the card cold, and really want to push clocks, the Classy is the way to go.

Assuming someone makes a block for them.


----------



## Jpmboy

Quote:


> Originally Posted by *Menthol*
> 
> Definitely waiting to see what if anything the custom cards can do, my Evbots have been idle for some time


mine too... hopefully they are not museum pieces already.


----------



## Silent Scone

Quote:


> Originally Posted by *Jpmboy*
> 
> mine too... hopefully they are not museum pieces already.


Seeing as they're basically the same GPU, I doubt that! You just need to price realistically when selling.


----------



## Jpmboy

Quote:


> Originally Posted by *Silent Scone*
> 
> Seeing as they're basically the same GPU, i doubt that! Just need to price realistically when selling


hopefully there is a 1080Ti classified with an EVBOT port... tho, core voltage may not do much on it. Fun to try tho.


----------



## Silent Scone

Yeah, no news on that front yet by the looks of things. There are a few decent AIB cards coming, though.


----------



## Gunslinger.

Quote:


> Originally Posted by *Jpmboy*
> 
> hopefully there is a 1080Ti classified with an EVBOT port... tho, core voltage may not do much on it. Fun to try tho.


Looks like we'll find out April 4th:

https://www.facebook.com/photo.php?fbid=1060602657378508&set=a.156685401103576.27809.100002863523994&type=3&theater


----------



## Baasha

^That is nuts - 3000MHz on the 1080 Ti?!? o_0

Imagine running a 6950X @ 6.0GHz and 4x 1080 Ti @ 3GHz.


----------



## KillerBee33

I think I have to put my Green LIL' Monster up for sale.

PC2.txt 1k .txt file


----------



## Artah

Anyone have a link to a class-action lawsuit for the Titan XP vs 1080 Ti fiasco? If not, I want to start one. It's crazy that a manufacturer can sell a card for $1,200 and 9 months later sell one that runs faster for $700. For me it was only $2,400 + tax for two, but for people who save for years to buy the latest, it's not right. This is not an ethical business practice and Nvidia must be stopped.


----------



## mouacyk

Quote:


> Originally Posted by *Baasha*
> 
> ^that is nuts - 3000Mhz on the 1080 Ti?!? o_0
> 
> Imagine running a 6950X @ 6.0Ghz and 4x 1080 Ti @ 3Ghz


And blacking out your town lights.


----------



## Gunslinger.

Quote:


> Originally Posted by *Artah*
> 
> Anyone have a link to a class action law suit for titan xp vs 1080 ti fiasco? If not I want to start one. It's crazy that a manufacturer can make a card for 1200 and 9 months later make one that runs faster for 700. For me it was only 2,400+ tax for two but for people that save for years to buy the latest it's not right. This is not an ethical business practice and Nvidia must be stopped.




Life's not fair, suck it up snowflake.


----------



## JedixJarf

Quote:


> Originally Posted by *Artah*
> 
> Anyone have a link to a class action law suit for titan xp vs 1080 ti fiasco? If not I want to start one. It's crazy that a manufacturer can make a card for 1200 and 9 months later make one that runs faster for 700. For me it was only 2,400+ tax for two but for people that save for years to buy the latest it's not right. This is not an ethical business practice and Nvidia must be stopped.


Quote:


> Originally Posted by *Gunslinger.*
> 
> 
> 
> Life's not fair, suck it up snowflake.


LOL! A class-action lawsuit? You purchased a product knowing that it won't be top of the line forever. The market obviously shifted, people weren't buying the Titan as much as they thought, and they are worried about competition from AMD (Vega?). I should start up a class-action lawsuit because my Nexus 6P is outdated now! Damn you, Google, for releasing new hardware!

But for real, grow up, kid. Don't drop 1200 bucks on gfx cards if it bothers you that they'll become second best the next day.


----------



## CptSpig

Quote:


> Originally Posted by *Gunslinger.*
> 
> Looks like we'll find out April 4th
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.facebook.com/photo.php?fbid=1060602657378508&set=a.156685401103576.27809.100002863523994&type=3&theater


I fell for the KingPin hype before; if you are not on LN2, it's not going to overclock worth a hoot. For guys like Gunslinger it's a great card. For the average machine on water it's not going to make much difference.


----------



## Jpmboy

Quote:


> Originally Posted by *Artah*
> 
> Anyone have a link to a class action law suit for titan xp vs 1080 ti fiasco? If not I want to start one. It's crazy that a manufacturer can make a card for 1200 and 9 months later make one that runs faster for 700. For me it was only 2,400+ tax for two but for people that save for years to buy the latest it's not right. This is not an ethical business practice and Nvidia must be stopped.


Faster? It's not. And the TXP has been out for a year. If you can buy any card that can hold that title for a year, it's a landmark. Besides, unless you are at the edge of the envelope, you'd never see any difference in gaming, and you have a year's head start already.
Basically, if you want the fastest _anything_ on the planet when it comes out, pay to run with the big dogs or don't get off the porch.








Quote:


> Originally Posted by *CptSpig*
> 
> I fell for the KingPin hype before if you are not on LN2 not going to overclock worth a hoot. For guys like Gunslinger it's a great card. For the average machine on water not going to make much difference.


yeah - the 980 KingPin was crap, and the 980 Ti KPE wasn't much better, but it was the fastest available if you paid the premium for ASIC... well, until the Matrix showed up. The only KPE that was worth the price to play was the 780 Ti KPE, a landmark card that obsoleted my OG Titans.


----------



## Artah

Quote:


> Originally Posted by *Gunslinger.*
> 
> 
> 
> Life's not fair, suck it up snowflake.


Quote:


> Originally Posted by *JedixJarf*
> 
> LOL! Class action law suit? You purchased a product knowing that it won't be top of the line forever. The market obviously shifted, people weren't buying the titan as much as they thought, and they are worried about competition from AMD's (vega?). I should start up a class action law suite because my Nexus 6p is outdated now! Damn you google for releasing new hardware!
> 
> But for real, grow up kid. Don't drop 1200 bucks on gfx cards that will just become second best the next day.


Quote:


> Originally Posted by *Jpmboy*
> 
> faster? it's not. and the TXP has been out for a year. If you can buy any card that can hold that title for a year, it's a landmark. Besides, unless you are at the edge of the envelope, you'd never see any difference gaming and have a year head start already.
> Basically, if you want the fastest _anything_ on the planet when it comes out, pay to run with the big dogs or don;t get off the porch.
> 
> 
> 
> 
> 
> 
> 
> 
> yeah - the 980 kingpin was crap, the 980Ti KPE wasn't all that better, but was the fastest available if you paid the premium for ASIC... well until the matrix showed up. The only KPE that was worth the price to play was the 780Ti KPE. A landmark card that obsoleted my OG Titans.


I'm actually fine. If I could go back in time, I wouldn't change a thing!


----------



## imLagging

Any chance someone who has done a shunt mod with 10 Ω 0805 resistors has some pictures they can share?
Reading https://xdevs.com/guide/pascal_m_oc/ I noticed the mention of a "reliable and static resistance change", and I really don't know how much that can affect the card. What I do know is that cleaning CLU off the shunt should be much easier than trying to unsolder and resolder caps if an RMA is needed.
Asking in this thread because users have been modding the Titan XP for a while, whereas the 1080 Ti has only a few mentions of shunt mods, and not many pictures online that I have found either.
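For anyone weighing the electrical side before soldering, the parallel-resistance arithmetic itself is simple. A quick sketch (the 5 mΩ stock shunt value is an assumption for illustration; check your own card and the xdevs guide for the actual values and placement):

```python
# Effective shunt resistance when a resistor is soldered in parallel
# with the stock current-sense shunt. The 5 mOhm stock value is an
# assumption for illustration, not a measured figure.

def parallel(r1, r2):
    """Resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

r_shunt = 0.005   # assumed stock shunt, ohms (5 mOhm)
r_mod = 10.0      # 0805 resistor value mentioned above, ohms

r_eff = parallel(r_shunt, r_mod)
# The monitoring circuit still assumes r_shunt, so the reported
# power scales by this factor:
scale = r_eff / r_shunt

print(f"effective shunt: {r_eff * 1000:.4f} mOhm")
print(f"reported power scales by {scale:.4%}")
```

Worth noting: across a milliohm-class shunt, a 10 Ω parallel resistor shifts the reading by only about 0.05%, so the exact placement the guide describes matters a great deal to what the mod actually does.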


----------



## toncij

Quote:


> Originally Posted by *Artah*
> 
> Anyone have a link to a class action law suit for titan xp vs 1080 ti fiasco? If not I want to start one. It's crazy that a manufacturer can make a card for 1200 and 9 months later make one that runs faster for 700. For me it was only 2,400+ tax for two but for people that save for years to buy the latest it's not right. This is not an ethical business practice and Nvidia must be stopped.


LOL. You should learn a lesson: don't buy things you can't afford. Getting premium products early costs money; there is nothing "unethical" here. Also, it's been like this forever. It's for the big boys, not the faint of heart.


----------



## hertz9753

I remember buying a Samsung 23 inch LCD TV monitor in 2006. I paid $800 and it was only 720P...


----------



## piee

The TXP will hold its value in the long run. It is also the first 4K/60fps single card, and the jump from 28 nm to 16 nm took years; it will be years again before 7 nm graphics cards. That means nothing is going to crush the TXP for some years, unlike what the TXP did to the TXM. 4K 60fps plays smooth for me with vsync, no buffer, and a locked 60 fps on the TXP. Each 100 MHz increase is about 2.5 fps, so to get a single card 30 fps above the TXP, a card has to hit 2800 or 3000 MHz. That's going to be a while.
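Taking the 2.5 fps per 100 MHz figure above at face value, the required clock is a simple linear extrapolation (the baseline boost clocks below are assumptions, and real scaling is rarely perfectly linear, so treat this as optimistic napkin math):

```python
# Back-of-envelope check of the claim above: ~2.5 fps gained per
# 100 MHz of core clock, extrapolated linearly.

FPS_PER_100MHZ = 2.5

def clock_needed(base_clock_mhz, extra_fps):
    """Clock required to gain extra_fps over the baseline, assuming
    purely linear GPU-clock scaling (an idealization)."""
    return base_clock_mhz + (extra_fps / FPS_PER_100MHZ) * 100

# +30 fps over a TXP at an assumed 1800 MHz boost:
print(clock_needed(1800, 30))  # -> 3000.0
# ...and over an assumed 1600 MHz stock boost:
print(clock_needed(1600, 30))  # -> 2800.0
```

Which is where the 2800-3000 MHz range in the post comes from, depending on the baseline clock you assume.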


----------



## Nitemare3219

Quote:


> Originally Posted by *Jpmboy*
> 
> faster? it's not. and the TXP has been out for a year. If you can buy any card that can hold that title for a year, it's a landmark. Besides, unless you are at the edge of the envelope, you'd never see any difference gaming and have a year head start already.


Why are you sitting here defending NVIDIA? They're a bunch of ****bags. The Titan XP has NOT been out for a year - just under 8 months. On top of that, they released a card with the same performance and 90+% of the capability (VRAM, cache, ROP limitation) for under 60% of the price. Additionally, they took it upon themselves to upgrade the board components of the Ti over what the Titan X offers. It's ridiculous. Titan cards are supposed to reign absolutely supreme over their respective generations until the next generation releases.


----------



## Jpmboy

Quote:


> Originally Posted by *Nitemare3219*
> 
> Why are you sitting here defending NVIDIA? They're a bunch of ****bags. Titan XP has NOT been out for a year - just under 8 months. On top of that, they released a card with the same performance and 90+%~ of the capability (VRAM, cache, ROP limitation), for under 60% the price. Additionally, they took it upon themselves to upgrade the board components of the Ti over what the Titan X offers. It's ridiculous. Titan cards are supposed to reign absolutely supreme over their respective generations until the next generation releases.


then buy something else.


----------



## hertz9753

I'm pretty sure what NVIDIA did is not against the law, and they have done the same thing in the past. Most people knew the Ti card was going to come out; just because you can't wait doesn't mean they are wrong.


----------



## beatfried

Quote:


> Originally Posted by *Nitemare3219*
> 
> Why are you sitting here defending NVIDIA? They're a bunch of ****bags. Titan XP has NOT been out for a year - just under 8 months. On top of that, they released a card with the same performance and 90+%~ of the capability (VRAM, cache, ROP limitation), for under 60% the price. Additionally, they took it upon themselves to upgrade the board components of the Ti over what the Titan X offers. It's ridiculous. Titan cards are supposed to reign absolutely supreme over their respective generations until the next generation releases.


lol... OG Titan would like to have a word with you.....


----------



## Jpmboy

Quote:


> Originally Posted by *hertz9753*
> 
> I'm pretty sure what NVIDIA did is not against the law and they have done the same thing in past. Most people knew the Ti card was going to come out, just because you can't wait doesn't mean they are wrong.


I'm 100% sure it is not against the law (here in the US). The "Ti" edition has always been in this position going back several generations. I'm really waiting to see the Ti Classified (hopefully)


----------



## mouacyk

Pretty happy to hear about the shunt mods working on the 1080 Ti, which effectively removes the power throttling. Gratz and keep up the good work. Unlike the Titan XP, the shunted 1080 Ti should work more reliably (last longer) due to the better VRM.


----------



## Blaise Pascal

Quick question:
I'll be buying one of the hybrid coolers when they start coming out (haha, someday soon supposedly), and I'd like to update all of my fans for the summer. Right now my case fans are 3-pin fans on 4-pin headers. Updating to all 4-pin fans will allow for fan curve control that is finer than just 50% or 100%, right? Which can be set in the BIOS via the PWM mode area? (ASUS X99 Deluxe, 4-pin power all around.) Save some power and noise and gain some cooling flexibility.
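Broadly yes: a 4-pin PWM header drives the fan with a duty cycle anywhere in the 0-100% range, not just a couple of fixed speeds, and BIOS fan-curve editors typically interpolate between the temperature/duty points you set. A rough sketch of that interpolation (the curve points here are made up for illustration, not ASUS defaults):

```python
# Sketch of a PWM fan curve: map temperature to fan duty cycle by
# linear interpolation between user-set points, similar in spirit to
# a BIOS fan-curve editor. Curve points are illustrative only.

CURVE = [(30, 20), (50, 40), (65, 70), (75, 100)]  # (degC, duty %)

def duty_for(temp_c):
    """Duty cycle for a given temperature, interpolated over CURVE."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]          # below the curve: minimum duty
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # linear interpolation between adjacent curve points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]             # above the curve: full speed

print(duty_for(40))  # halfway between 20% and 40% -> 30.0
```

Note that many boards can also do coarse voltage (DC) control on 3-pin fans when the header supports a DC mode, but PWM gives the finer, lower-speed control being asked about.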


----------



## Fredthehound

Quote:


> Originally Posted by *Artah*
> 
> Anyone have a link to a class action law suit for titan xp vs 1080 ti fiasco? If not I want to start one. It's crazy that a manufacturer can make a card for 1200 and 9 months later make one that runs faster for 700. For me it was only 2,400+ tax for two but for people that save for years to buy the latest it's not right. This is not an ethical business practice and Nvidia must be stopped.


Eh... I'm in the 'not rich, saved my pennies for a titan' club and don't feel slighted or used at all. I knew the score going in and bought 8 months of uncontested halo level performance. Big Green delivered.

If you want to sue, sue your economics professors/teachers for failing you miserably.


----------



## KillerBee33

Quote:


> Originally Posted by *Blaise Pascal*
> 
> Quick question:
> I'll be buying one of the hybrid coolers when they start coming out (haha someday soon supposedly), and I'd like to update all of my fans for the summer. Right now my case fans are 3 pin fans on 4 pin slots. Updating to all 4-pin fans will allow for fan curve control that is finer than just 50% on or 100% on right? Which can be set in bios via the PWM mode area? (ASUS X99 Deluxe, 4 pin power all around) Save some power and noise and gain some cooling flexibility.


Depending on the fans you get, you don't need to run them @ 50%; 1100-1300 RPM is more than enough and as quiet as if they weren't even on.
Check these out. I know they're a bit pricey, but they do the job well: https://www.amazon.com/Noiseblock-NB-eLoop-B-12-P-Bionic-2000rpm/dp/B00IYF9QIA/ref=sr_1_5?ie=UTF8&qid=1490384788&sr=8-5&keywords=Noiseblocker+NB-eLoop


----------



## Baasha

Quote:


> Originally Posted by *Artah*
> 
> Anyone have a link to a class action law suit for titan xp vs 1080 ti fiasco? If not I want to start one. It's crazy that a manufacturer can make a card for 1200 and 9 months later make one that runs faster for 700. For me it was only 2,400+ tax for two but for people that *save for years to buy the latest* it's not right. This is not an ethical business practice and Nvidia must be stopped.


Not sure if serious but...

Don't mean to sound offensive, but if someone has to "save for years" to buy a graphics card, I think his/her priorities are all messed up.

This hobby, especially for the 'high-end' parts, should be a drop in the bucket. If you're having to "save up" to buy computer parts, you better find yourself another hobby that doesn't impact your personal finances so much.

Saving for years to buy a house is understandable. Saving for years to buy a car is stupid. Saving for years to buy computer parts is... I don't even...


----------



## jsutter71

post deleted


----------



## hertz9753

What does that have to do with buying a Titan?


----------



## jsutter71

post deleted


----------



## st0necold

Quote:


> Originally Posted by *jsutter71*
> 
> Everything. *Some people*, myself excluded are angry *because Nvidia just demonstrated that they could release a near identical product at a $500 decrease.* These same angry people are posting comments about suing but are receiving negative feedback. IMHO they have the right to feel the way they do, and if they want to sue then it's their right. The subject pertains to the hardware on this thread therefore it is applicable.
> 
> I also want to add that I am not trying to make anyone angry nor step on any toes. I understand the cost of business and love my TXPs. I remember my mom buying me a IBM PC jr when I was 14 years old and a year later IBM discontinued it. I remember feeling angry, and upset at the time but that was because I was a just a kid. The take away being that the PC hobby is like buying a car. Their will always be something newer and better to replace it.


Dude... some people, including you, didn't listen to the many (many) people who reminded everyone on the forum that the 1080 Ti would do exactly the same thing the 980 Ti did to the Titan X. This was far from a secret, and they demonstrated they could do this years ago. You were unfortunately in denial, ignored all of the warnings, and did exactly as I said some people would do: "I got ripped off because the exact same thing is coming out for half the price."

Some people will repeat this the next cycle...

Some people think the next card that comes out every year will make their games look better; they are just suckers, and this is how they make their profit.


----------



## jsutter71

post deleted


----------



## toncij

Quote:


> Originally Posted by *jsutter71*
> 
> You obviously did not read what I said. I said I WAS HAPPY WITH MY TXPs.


But you told him you were angry too. Are you bipolar?


----------



## jsutter71

post deleted


----------



## hertz9753

Just let it go.


----------



## Sheyster

LOL, this literally happens every time a Ti card is released, since the OG Titan days. Tempers flare and scrubs who could not afford the latest Titan come into threads like this one to further inflame folks. Mods need to come down much harder on guys who are non-owners coming in here spewing crap about how they're getting the same thing for half price. They don't get it and they never will.


----------



## D749

The trick with video cards, if you always want the best and want to minimize loss, is to buy and sell, rinse and repeat. The quicker you sell, the less you lose. I consider it already "late" to sell a Titan XP; however, just last week I sold my two Titan XPs on eBay for over $1,000 each.


----------



## Silent Scone

Some of these guys just get too caught up in the excitement. Essentially, the way I look at it, it's no different to buying a second-hand TXP 6 months on for 700-800.

Welcome to the party.

The other takeaway is that the 1080 Ti is already that far into its life cycle. I don't intend on having my Titan in 6 months' time. In fact, quite possibly even less than that.


----------



## Glerox

Hi guys, I have a question for you!

Proud owner of a Titan XP from day one.
This GPU is amazing. I'm playing Mass Effect: Andromeda in 4K HDR at 60 Hz at high settings. I'm blown away.

Now, I could buy a second Titan XP for $800-900. Is it worth it? I know a lot of you here have Titans in SLI.

The other option would be to wait for the Titan Volta and replace my Titan XP.

Usually, the best answer in tech is to wait. Thanks for your thoughts!


----------



## kx11

I would not recommend SLI for anyone. Games that support it are awesome, but how many of them support it at launch?

It's your call, and $800 is too much for a TXP after the 1080 Ti release.


----------



## bl4ckdot

Quote:


> Originally Posted by *kx11*
> 
> i would not recommend SLi for anyone , games that support it are awesome but how many of them support it when they launch ??
> 
> it's your call and 800$ is too much for a TXP after 1080ti release


I personally think that $800 for a Titan XP is very good. I agree on the SLI part though.


----------



## pez

Is this a millennial thing? I mean, I'm young and technically fall into that age-range, but come on....

I bought my TXP knowing full well a Ti might be announced a month later. But all in all... the Ti doesn't get that snazzy-looking FE cooler.


----------



## Silent Scone

Quote:


> Originally Posted by *Baasha*
> 
> Not sure if serious but...
> 
> Don't mean to sound offensive, but if someone has to "save for years" to buy a graphics card, I think his/her priorities are all messed up.
> 
> This hobby, especially for the 'high-end' parts, should be a drop in the bucket. If you're having to "save up" to buy computer parts, you better find yourself another hobby that doesn't impact your personal finances so much.
> 
> Saving for years to buy a house is understandable. Saving for years to buy a car is stupid. Saving for years to buy computer parts is... I don't even...


Buying 4 TITAN is stupid lol.


----------



## pez

Quote:


> Originally Posted by *Silent Scone*
> 
> Buying 4 TITAN is stupid lol.


Not when he's getting them to scale and run appropriately.


----------



## Silent Scone

Quote:


> Originally Posted by *pez*
> 
> Not when he's getting them to scale and run appropriately
> 
> 
> 
> 
> 
> 
> 
> .


Have you used them? Running properly is subjective to say the least, and that's when the profiles work at all.

Maybe he can share some frametime data in a few games to support the claims...


----------



## xTesla1856

To me, a Titan is not a piece of "hardware", it's a piece of "heartware".


----------



## meson1

Quote:


> Originally Posted by *pez*
> 
> I bought my TXP knowing full well a Ti might be announced a month later.


Me too. I bought mine back in September/October. In fact, I'm surprised how long it took them to release the Ti; it's usually just a month or two. But thinking about it, I don't think the Ti was later than usual. I think the Titan was earlier than expected. In fact, if you think back to August, it did catch everyone on the hop.

I don't think anyone has much of a right to be upset at Nvidia. Nvidia are well known for this kind of practice. And it wasn't as if no one was saying it. Nearly everyone to a man talked about how the 1080 Ti would come along sooner or later. You would have had to be living in a cave in some remote corner of the planet to miss that advice.


----------



## pez

Quote:


> Originally Posted by *Silent Scone*
> 
> Have you used them? Running properly is subjective to say the least, and that's when the profiles work at all.
> 
> Maybe he can share some frametime data in a few games to support the claims...


Sure, but calling someone's setup stupid because they spent money where they want to....

That's like me criticizing you for buying x99 to run one GPU on a 1200w PSU....

You can check through the pictures of him running 4 TXP (as well as the Ti thread for 4 x Ti) in a few titles at 90+% on each GPU.
Quote:


> Originally Posted by *meson1*
> 
> Me too. I bought mine back in September/October. In fact I'm surprised how long to took them before they released the Ti. It's usually just a month or too. But thinking about it, I don't think the Ti was later than usual. I think the Titan was earlier than expected. In fact, if you think back to August, it did catch everyone on the hop.
> 
> I don't think anyone has much of a right to be upset at Nvidia. Nvidia are well known for this kind of practise. And it wasn't like it was something no-one was saying. Nearly everyone to a man talked about how the 1080 Ti would come along sooner or later. You would have had to have been living in a cave in some remote corner of the planet to miss that advice.


Yeah, I'm pretty sure it was earlier than normal. I sold off my two 1080s and directly funded my move to the TXP, as well as the ITX components with it. That being said, I will be getting a Ti out of the sheer fact that I have the ability to step up my 1080 to one. There's no telling if I'll sell it, repurpose it, or what, but in the end, one of my PCs will have a fast single-card solution.


----------



## meson1

Quote:


> Originally Posted by *pez*
> 
> Yeah, I'm pretty sure it was earlier than normal. I sold off my 2 1080s and directly funded my move to the TXP as well as the ITX components with it. That being said, I will be getting a Ti out of the shear fact I have the ability to step up my 1080 to one. There's no telling if I sell it, re-purpose it or what, but in the end, one of my PCs will have a fast single card solution
> 
> 
> 
> 
> 
> 
> 
> .


I don't normally 'micro-upgrade' like that. I stick with a card for a couple of generations before skipping. My last card was a 780 Ti. By rights I should skip Volta and get whatever comes next. But if Volta will give me 60 fps at 4K on Ultra settings and with some headroom, I might be tempted.


----------



## pez

Quote:


> Originally Posted by *meson1*
> 
> I don't normally 'micro-upgrade' like that. I stick with a card for a couple of generations before skipping. My last card was a 780 Ti. By rights I should skip Volta and get whatever comes next. But if Volta will give me 60 fps at 4K on Ultra settings and with some headroom, I might be tempted.


Yeah, I normally wouldn't have, but I wanted to go with SFF, have the best single GPU I could, and also avoid spending another $1200 a month later to SLI it. In the end, I'm super happy with whichever card remains the permanent card in this system for a while (TXP or Ti).

Previously I went from a 780 to 970 SLI (not my choice to go with a 970; EVGA sent me one to replace the second failed 780). Getting a Ti, preferably ICX, in this system would be the best-case scenario for me because of noise. Otherwise, I can suffer happily with fan noise for the performance I get.


----------



## Silent Scone

Quote:


> Originally Posted by *pez*
> 
> Sure, but calling someone's setup stupid because they spent money where they want to....


That's exactly what Basha was doing, though.

Quote:


> Originally Posted by *pez*
> 
> That's like me criticizing you for buying x99 to run one GPU on a 1200w PSU....


No, it's not even remotely the same, considering I had two GPUs in the system at one point, and will again. So you may want to rethink that analogy.

Quote:


> Originally Posted by *pez*
> 
> You can check through the pictures of him running 4 TXP (as well as the Ti thread for 4 x Ti) in a few titles at 90+% on each GPU.


Again, this doesn't tell you anything. You'd need to see consistent frame time data. Raw scaling is nothing but bragging without this.


----------



## pez

Quote:


> Originally Posted by *Silent Scone*
> 
> That's exactly what Basha was doing, though.
> No, it's not even remotely the same. Considering I had two GPU in the system at one point, and will again. So you may want to rethink that analogy.
> Again, this doesn't tell you anything. You'd need to see consistent frame time data. Raw scaling is nothing but bragging without this.


The analogy is as silly as the argument. I'm not saying I totally agree that you shouldn't have to save to buy a system you want, but to a point: if you're exhausting funds to where it often puts you in financial jeopardy, then yes, I agree something may need to be addressed elsewhere.

EDIT:

Just for the sake of argument, even with two GPUs, you're still using nothing close to what you'd need with that PSU.


----------



## Silent Scone

Quote:


> Originally Posted by *pez*
> 
> The analogy is silly like the argument. I'm not saying I totally agree that you shouldn't have to save to buy a system you want, but to a point, if you're exhausting funds to where it puts you in financial jeopardy often, then yes I agree something may need to be addressed elsewhere.
> 
> EDIT:
> 
> Just for the sake of argument, even with two GPUs, you're still using nothing close to what you'd need with that PSU.


You don't have an argument. You've addressed my post by saying you've seen Baasha's scaling, but that doesn't tell you anything without frametime data, which you do not have. Bringing up PSU requirements in an argument about questionable SLI performance is somewhat of a deflection.


----------



## pez

Quote:


> Originally Posted by *Silent Scone*
> 
> You don't have an argument. You've addressed my post by saying you've seen Basha has scaling, but this doesn't tell you anything without seeing frametime data, which you do not have. Bringing up PSU requirements in an argument about questionable SLI performance is somewhat of a deflection.


My initial argument was more to address the fact that he's used it, is happy with it, and doesn't seem to care that you might think it's stupid. Like I said, the argument/conversation, or whatever you would like to call this, is silly. As silly as me bringing up your PSU usage. I couldn't care less what his frametimes are, what your system is, or what people spend money on, as long as they're happy. But all of us criticizing people for how they spend their money is pure silliness. Sorry if you can't see that overall point enough to move on...


----------



## Silent Scone

Quote:


> Originally Posted by *pez*
> 
> My initial argument was more to address the fact that he's used it, is happy with it, and doesn't seem to care that you might think it's stupid
> 
> 
> 
> 
> 
> 
> 
> . Like I said, the argument/conversation or whatever you would like to call this is silly
> 
> 
> 
> 
> 
> 
> 
> . As silly as me bringing up your PSU usage. I could care less what his frametimes are, what your system is, or what people spend money on as long as they're happy. But us all calling people that spend money how they want is pure silliness
> 
> 
> 
> 
> 
> 
> 
> . Sorry if you can't see that overall point enough to move on...


If you continually move the goal posts, yes, it can be difficult to come to a conclusion. From your last post, then, frame times likely don't matter to Baasha, and we should leave it at that...

I could quite easily have just dropped it two posts ago, but sometimes it's nice to get down to the raw deal here, and the fact of the matter is, performance with that many cards is sketchy. NVIDIA themselves dropped 3- and 4-way support from an official stance; that tells you everything you need to know about their long-term plans and what this means for the technology.


----------



## Artah

Quote:


> Originally Posted by *Sheyster*
> 
> LOL, this literally happens every time a Ti card is released, since the OG Titan days. Tempers flare and scrubs who could not afford the latest Titan come into threads like this one to further inflame folks. Mods need to come down much harder on guys who are non-owners coming in here spewing crap about how they're getting the same thing for half price. They don't get it and they never will.


LOL, this is dead on. I was getting tired of people saying their Ti is better than the Titan that we enjoyed for many months and paid double for... Sorry guys, if you don't own a TXP or got rid of yours, it's time to move on to the Ti thread, unless you have something constructive to say that helps actual TXP owners.


----------



## dboythagr8

Quote:


> Originally Posted by *Baasha*
> 
> lol.. 2x 1080 Ti OC'd cannot do 60FPS @ 4K in GTA V (albeit with the NaturalVision 2.0 mod):


2x 1080 Tis can absolutely do GTA V at 4K60; in fact, they can do higher. Granted you're running a mod, but I played the game for a few hours this past weekend and was able to use DSR 4x to downsample to my 1440p resolution, and it was still over 60 fps. Everything maxed out.


----------



## Artah

Quote:


> Originally Posted by *xTesla1856*
> 
> To me, a Titan is not a piece of "hardware", it's a piece of "heartware".


It's a piece of art! Looking forward to the next iteration of Titan X when I can possibly run 4k/60fps/144hz with a single card with zero issues.


----------



## Jpmboy

Quote:


> Originally Posted by *Artah*
> 
> It's a piece of art! Looking forward to the next iteration of Titan X when I can possibly run 4k/60fps/144hz with a single card with zero issues.


We have a lot of complaining... yeah, okay, price-wise early adopters always pay more. But performance-wise, looking at a bunch of bench threads I maintain, the Ti is just not performing as well as the Titan XP. With the exception of a few extreme guys running "franken-Tis" (which I ignore for this comparo), the Ti is just not keeping up with the TXP. Am I wrong?


----------



## xTesla1856

Quote:


> Originally Posted by *Jpmboy*
> 
> we have a lot of complaining... yeah, okay price wise early adopters always pay more. But performance wise, and looking at a bunch of bench threads I maintain, the Ti is just not performing as well as the titanXP. With the exception of a few extreme guys running "franken Ti's" (which I ignore for this comparo) the Ti is just not keeping up with the TXP. Am I wrong?


Seeing most struggle to hit 2100MHz or how some cards have iffy memory, the Titan seems like the more mature, hardcore card. The Ultimate? There can only be one.


----------



## bl4ckdot

Quote:


> Originally Posted by *Jpmboy*
> 
> we have a lot of complaining... yeah, okay price wise early adopters always pay more. But performance wise, and looking at a bunch of bench threads I maintain, the Ti is just not performing as well as the titanXP. With the exception of a few extreme guys running "franken Ti's" (which I ignore for this comparo) the Ti is just not keeping up with the TXP. Am I wrong?


True, I have yet to see 33k+ on FS.


----------



## Baasha

Quote:


> Originally Posted by *pez*
> 
> Not when he's getting them to scale and run appropriately
> 
> 
> 
> 
> 
> 
> 
> .










Some people are too stupid to see this. I'll just leave it at that.

Quote:


> Originally Posted by *dboythagr8*
> 
> 2x 1080Tis can absolutely do GTA V at 4k60, in fact it can do higher. Granted you're running a mod, but I played the game for a few hours this past weekend and I was able to use DSR 4x to downsample to my 1440p resolution, and it was still over 60fps. Everything maxed out.


For a vanilla game, I'd agree. That 'mod' is NaturalVision 2 which is insanely demanding - much more so than Redux it seems.

You would be lucky if you got 30FPS consistently w/ 2x 1080 Ti at 4K with that mod. It is an absolute performance killer.


----------



## Artah

Quote:


> Originally Posted by *Jpmboy*
> 
> we have a lot of complaining... yeah, okay price wise early adopters always pay more. But performance wise, and looking at a bunch of bench threads I maintain, the Ti is just not performing as well as the titanXP. With the exception of a few extreme guys running "franken Ti's" (which I ignore for this comparo) the Ti is just not keeping up with the TXP. Am I wrong?


Hah, love that description, "franken-Tis". I haven't put my TXP through the fryer yet because I haven't had a need to, but soon I'll have time to put it to the test. Last time I had some time to mess with it, I had no issues hitting 2100 MHz on EK blocks and backplates.


----------



## Silent Scone

Quote:


> Originally Posted by *Baasha*
> 
> 
> 
> 
> 
> 
> 
> 
> Some people are too stupid to see this. I'll just leave it at that.


Yeah, you sure showed me. lol


----------



## CptSpig

Quote:


> Originally Posted by *Jpmboy*
> 
> we have a lot of complaining... yeah, okay price wise early adopters always pay more. But performance wise, and looking at a bunch of bench threads I maintain, the Ti is just not performing as well as the titanXP. With the exception of a few extreme guys running "franken Ti's" (which I ignore for this comparo) the Ti is just not keeping up with the TXP. Am I wrong?


+1 You are absolutely correct!


----------



## Artah

Quote:


> Originally Posted by *Jpmboy*
> 
> we have a lot of complaining... yeah, okay price wise early adopters always pay more. But performance wise, and looking at a bunch of bench threads I maintain, the Ti is just not performing as well as the titanXP. With the exception of a few extreme guys running "franken Ti's" (which I ignore for this comparo) the Ti is just not keeping up with the TXP. Am I wrong?


I'm going to have to buy my wife a pair of Tis to see for myself. She's due for an upgrade anyway; she's still using a pair of TXMs.


----------



## jsutter71

How are you all powering your TXPs? A single power cable with an 8-pin and a 6-pin connector, or separate 6- and 8-pin cables? Up until today I was using separate 8- and 6-pin cables, but today I switched to a single cable.


----------



## Artah

Quote:


> Originally Posted by *jsutter71*
> 
> How are you'all powering your TXP's A single power cable with a 8 and 6 pin connector or a separate 6 and 8 pin cable? Up until today I was using as separate 8 and 6 pin cable but today I switched to a single cable.


I always use separate ones.


----------



## Jpmboy

Quote:


> Originally Posted by *bl4ckdot*
> 
> True, I have yet to see 33k+ on FS.


Even then, FS at 1080p is too CPU-bound. Extreme, Ultra, Time Spy, and VRMark Blue Room are better at separating the GPU from the rest of the components... then there's Folding/Compute.
Quote:


> Originally Posted by *Artah*
> 
> I'm going to have to buy my wife a pair of Tis to see for myself. She's due for an upgrade anyway she's still using a pair TXMs.


Yeah... that's the excuse. Unfortunately (or fortunately?) my wife doesn't know what a GPU is.
Quote:


> Originally Posted by *jsutter71*
> 
> How are you'all powering your TXP's A single power cable with a 8 and 6 pin connector or a separate 6 and 8 pin cable? Up until today I was using as separate 8 and 6 pin cable but today I switched to a single cable.


Depends on the PSU. With most single-rail PSUs it won't matter, assuming the split cable uses a larger gauge on the common run.


----------



## jsutter71

Quote:


> depends on the PSU. With most all single rail PSUs it won;t matter assuming the single-split cable is larger gauge on the common.


Using an EVGA T2 1600. I ran FS and FSU with HWiNFO sensors and the cards looked like they were still able to hit their peak power even with the single cable. Over 300W per cable.

Interesting note in the User manual

NOTE 2: We recommend to use a single PCI-E cable to connect per port on graphic cards if your video card requires high power such as equipped with more than 2 PCI-E connectors 6pin + 8pin and 8pin + 8pin.

This note contradicts the bundled cables, which include single cables with both 6-pin and 8-pin connectors.


----------



## Jpmboy

Quote:


> Originally Posted by *jsutter71*
> 
> Using a EVGA T2 1600. I ran FS and FSU with HWiNFO sensors and the cards looked like they were still able to hit their peak power even with the single cable. Over 300W per cable.
> 
> Interesting note in the User manual
> 
> NOTE 2: We recommend to use a single PCI-E cable to connect per port on graphic cards if your video card requires high power such as equipped with more than 2 PCI-E connectors 6pin + 8pin and 8pin + 8pin.
> 
> This note contradicts the bundled cables which includes single cables with 6 and 8 pin connectors.


No contradiction, since the note stipulates "video card requires high power". Just FYI, TXPs really are not very power hungry... especially compared to something like a 780 Ti KPE.


----------



## jsutter71

Quote:


> Originally Posted by *Jpmboy*
> 
> no contradiction since the note stipulates "video card requires high power". Just FYI, TXPs really are not very power hungry.. especially compared to something like a 780Ti KPE.


Looks like I'll be switching back to individual cables. System has locked up multiple times.


----------



## pez

Quote:


> Originally Posted by *jsutter71*
> 
> How are you'all powering your TXP's A single power cable with a 8 and 6 pin connector or a separate 6 and 8 pin cable? Up until today I was using as separate 8 and 6 pin cable but today I switched to a single cable.


Quote:


> Originally Posted by *jsutter71*
> 
> Looks like I'll be switching back to individual cables. System has locked up multiple times.


You're on a PSU with a single 12v rail, so unless your cable is just crappy, you shouldn't be seeing issues there. Have been running a TXP on a single cable from a SF600 for months now with no issues.


----------



## xTesla1856

Quote:


> Originally Posted by *jsutter71*
> 
> How are you'all powering your TXP's A single power cable with a 8 and 6 pin connector or a separate 6 and 8 pin cable? Up until today I was using as separate 8 and 6 pin cable but today I switched to a single cable.


Always use separate cables. Call me paranoid, but I like the peace of mind.


----------



## soccastar001

Finally got around to overclocking my Titan XP (I'm new to the whole overclocking game) and got weird numbers. I couldn't go higher than +225 on the core clock without crashing 3DMark, but I didn't get artifacts until +750 on the memory clock. That seems really high considering I couldn't get the core clock past +225. Am I doing something wrong?

I also moved onto overclocking my 7700K the following day and after an overnight stress test woke up to a black screen even though the computer was running. Could this be caused by my Titan and not the CPU?

Thanks for any assistance, sorry for the inexperience.


----------



## pez

Quote:


> Originally Posted by *soccastar001*
> 
> Finally got around to overclocking my Titan XP (I'm new to the whole overclocking game) and got weird numbers. I couldn't go higher than +225 on core clock without crashing 3DMark but I didn't get artifacts until +750 on memory clock. That seems really high considering I couldn't get core clock higher than +225, am I doing something wrong?
> 
> I also moved onto overclocking my 7700K the following day and after an overnight stress test woke up to a black screen even though the computer was running. Could this be caused by my Titan and not the CPU?
> 
> Thanks for any assistance, sorry for the inexperience.


It would be better to test each OC individually and then together. I.e. ensure your GPU OC is stable with some stress tests and some real-world usage (whatever is real world for you, be it rendering or gaming, etc). Then move on to your CPU OC. I'd even go as far as suggesting you put your GPU at stock while OC'ing CPU.
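pez's one-overclock-at-a-time procedure can be sketched as a pair of loops. This is only a sketch of the testing order, and `run_stress_test` is a hypothetical stand-in for whatever stability test you actually run (Heaven, Time Spy, a long gaming session):

```python
# Sketch of testing one overclock at a time, as described above.
# run_stress_test() is a placeholder, not a real benchmark hook.

def run_stress_test(gpu_offset_mhz, cpu_ghz):
    # Placeholder: pretend anything past +225 GPU or 4.3 GHz CPU crashes.
    return gpu_offset_mhz <= 225 and cpu_ghz <= 4.3

def find_stable_gpu_offset(step=25, limit=400, cpu_ghz=4.0):
    """Raise the GPU core offset in steps with the CPU at stock,
    keeping the last offset that passed the stress test."""
    stable = 0
    for offset in range(step, limit + 1, step):
        if run_stress_test(offset, cpu_ghz):
            stable = offset
        else:
            break
    return stable

def find_stable_cpu_clock(gpu_offset=0, start=4.0, step=0.1, limit=5.0):
    """Then tune the CPU with the GPU back at stock (offset 0)."""
    stable = start
    clock = start
    while clock <= limit and run_stress_test(gpu_offset, clock):
        stable = clock
        clock = round(clock + step, 1)
    return stable

gpu = find_stable_gpu_offset()   # GPU first, CPU at stock
cpu = find_stable_cpu_clock()    # CPU next, GPU at stock
print(gpu, cpu)
```

The point of the structure is simply that only one variable changes per test run, so a crash points at exactly one suspect.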


----------



## soccastar001

Quote:


> Originally Posted by *pez*
> 
> It would be better to test each OC individually and then together. I.e. ensure your GPU OC is stable with some stress tests and some real-world usage (whatever is real world for you, be it rendering or gaming, etc). Then move on to your CPU OC. I'd even go as far as suggesting you put your GPU at stock while OC'ing CPU.


I thought I had properly tested my GPU overclock first. I pushed both clock speeds until I got crashes or artifacts and used the last best value (factor of 25 for core and 50 for memory) and ran a full stress test, Time Spy benchmark and played games on it for several days before moving onto my CPU.


----------



## pez

Quote:


> Originally Posted by *soccastar001*
> 
> I thought I had properly tested my GPU overclock first. I pushed both clock speeds until I got crashes or artifacts and used the last best value (factor of 25 for core and 50 for memory) and ran a full stress test, Time Spy benchmark and played games on it for several days before moving onto my CPU.


In that case, it sounds like the CPU OC is causing your issues. Running the same benchmarks while stressing your CPU with the GPU at stock will confirm whether the CPU, and not the GPU, is the culprit.


----------



## soccastar001

Quote:


> Originally Posted by *pez*
> 
> In that case, it sounds like a CPU OC is causing your issues. However, if you run the same benchmarks while stressing your CPU and the GPU is at stock, that will undoubtedly let you know that something is wrong that isn't the CPU.


Good point, thanks!


----------



## Asus11

Quote:


> Originally Posted by *Jpmboy*
> 
> even then, FS 1080P is too cpu bound. Extreme is better, Ultra, timespy and VRMark Blueroom are better at separating out GPUs from the rest of the components. .. then there's Folding/Compute.
> Yeah... that's the excuse. Unfortunately (or fortunately?) my wife doesn't know what a GPU is.
> depends on the PSU. With most all single rail PSUs it won;t matter assuming the single-split cable is larger gauge on the common.


hey JPM.. thinking of buying a used Titan XP.. just because I tend to like things people have less of

can you explain why the XP is getting a better score? the 1080 Ti has 11Gbps memory, and that can be clocked to an effective 12Gbps

just wondered, as I know you own an XP, what your experience was with overclocking it, core / mem etc

much appreciated


----------



## nycgtr

Quote:


> Originally Posted by *Asus11*
> 
> hey JPM.. thinking of buying a used titan XP.. just because I tend to like things people have less of
> 
> can you explain why the XP is getting a better score? the 1080 ti has 11Gps then can be clocked to have effective 12Gps
> 
> just wondered as I know you own a XP what your experience was with overclocking it core / mem etc
> 
> much appreciated


I picked up quite a few that people dumped when the Ti news broke, the cheapest being blocked with the original cooler included for less than 900 lol. Prices seem to have gone back up a bit the past week or two. There were peeps locally dumping them for 800-850 when the Ti got announced. I'd say if you can get one for 100 or so more than a 1080 Ti, just do it. You can OC the RAM to match. Somehow resale value is better as well. I see Maxwell Titans going for 550-600 while Tis are, what, under 300 these days.


----------



## jsutter71

Quote:


> Originally Posted by *pez*
> 
> You're on a PSU with a single 12v rail, so unless your cable is just crappy, you shouldn't be seeing issues there. Have been running a TXP on a single cable from a SF600 for months now with no issues.


Yes, I thought it might have been the cable, but switching back to individual cables did not fix the issue. Resetting my BIOS and all my settings didn't either, so thinking it was Windows I did a clean install this morning. That also did not fix it, so I pulled every power cable, tested, and reattached, switching back to single cables during the process. Turned my system back on and the issue was resolved. Must have been a loose connection, because all my cables tested fine.


----------



## trippinonprozac

Has anyone experienced SLI sync errors or voltage limit errors in Afterburner on these cards?

I am trying to troubleshoot both as since going SLI I constantly get them, even at stock.


----------



## Silent Scone

Quote:


> Originally Posted by *trippinonprozac*
> 
> Has anyone experienced SLI sync errors or voltage limit errors in afterburner on these cards??
> 
> I am trying to troubleshoot both as since going SLI I constantly get them, even at stock.


Afterburner has been causing me issues recently. Switched to GPUTweak. Cards were not boosting half the time, or at all in some games. Removing AB solved the issue.
Quote:


> Originally Posted by *jsutter71*
> 
> Yes I thought it might have been the cable but switching back to individual cables did not fix the issue. Reset my BIOS and all my settings didn't either so thinking it was Windows I did a clean install this morning. That also did not fix it so I pulled every power cable, tested, and reattached. Switched back to single cables during the process. Turned my system back on and the issue was resolved. Must have been a lose connection because all my cables tested fine.


There should be no issues running one of these cards on a split cable on that PSU, so that doesn't surprise me.


----------



## pez

Quote:


> Originally Posted by *jsutter71*
> 
> Yes I thought it might have been the cable but switching back to individual cables did not fix the issue. Reset my BIOS and all my settings didn't either so thinking it was Windows I did a clean install this morning. That also did not fix it so I pulled every power cable, tested, and reattached. Switched back to single cables during the process. Turned my system back on and the issue was resolved. Must have been a lose connection because all my cables tested fine.


Glad you got it resolved. Sounds like you had a busy day/night.


----------



## Jpmboy

Quote:


> Originally Posted by *Asus11*
> 
> hey JPM.. thinking of buying a used titan XP.. just because I tend to like things people have less of
> 
> can you explain why the XP is getting a better score? the 1080 ti has 11Gps then can be clocked to have effective 12Gps
> 
> just wondered as I know you own a XP what your experience was with overclocking it core / mem etc
> 
> much appreciated


Only explanation I can come up with is higher clocks and/or less error correction... the benchmarks speak for themselves. Anyway, it was a good time to buy a used TXP. Kinda opposite the ole stock market mantra in this setting; here it's "sell on the rumor and buy on the news".

I never had plans to sell my 2 TXPs for a Ti - doesn't make sense at all. A Ti Classified? Maybe.


----------



## Silent Scone

Yeah, can't believe people have considered and gone through with it.


----------



## ratzofftoya

Alrighty. In the process of deciding whether to put my pair of Titan XPs in the main rig or the pair of 1080 Tis, I benchmarked the XPs last night. They'll be under water in the main rig, but I am benching on air. Despite the new drivers, my results are pretty much the same as when I first got them. With fans cranked up to 80%, power up to 120%, and voltage all the way up, I was able to push the GPU clock up to +180 and the memory up to +350. It hovered a bit above 2GHz for a while in Heaven, then slipped to the ~1980 range for the rest of the benchmark. In the second run, I turned the memory up to +400. It made it through the benchmark, but the score was a bit lower because the clocks started in the ~1980 range and stayed there.

With +185 and above, I couldn't make it through more than a minute of Heaven running.

Is this fairly typical? Did I get a somewhat weaker batch? Much weaker batch?


----------



## bl4ckdot

Quote:


> Originally Posted by *ratzofftoya*
> 
> Alrighty. In the process of deciding whether to put my pair of Titan XPs in the main rig or the pair of 1080 Tis, and I benchmarked the XPs last night. They'll be under water in the main rig but I am benching on air. Despite the new drivers, my results are pretty much the same as when I first got them. With fans cranked up to 80%, power up to 120%, and voltage all the way up, I was able to push the GPU clock up to +180 and the memory up to +350. It hovered a bit above 2Ghz for a while in Heaven, then slipped to the ~1980 range for the rest of the benchmark. In the second run, I turned the memory up to +400. It made it through the benchmark but the score was a bit lower because the clocks started in the ~1980 range and stayed there.
> 
> With +185 and above, I couldn't make it through more than a minute of Heaven running.
> 
> Is this fairly typical? Did I get a somewhat weaker batch? Much weaker batch?


I don't think upping the voltage is helping ...


----------



## ratzofftoya

Quote:


> Originally Posted by *bl4ckdot*
> 
> I don't think upping the voltage is helping ...


Should I try afterburner instead of PrecisionX?


----------



## bl4ckdot

I mean, you can always try it to see if it works better. My understanding is that low temps help way more than voltage.


----------



## jhowell1030

Quote:


> Originally Posted by *bl4ckdot*
> 
> I mean you can always try to see if it works better. My undestanding is that low temps help way more than voltage


This is especially true with Pascal


----------



## ratzofftoya

Quote:


> Originally Posted by *jhowell1030*
> 
> This is especially true with Pascal


But I'm not seeing the temps ever get hotter than 60-70, even when I go +200 Mhz.


----------



## mouacyk

Quote:


> Originally Posted by *Silent Scone*
> 
> Yeah, can't believe people have considered and gone through with it.


Economics and exclusive early access. Paid $600 for 9 months of early access. Recoup ~$200-$300 from resale, and you end up losing about $300-$400 for exclusive access to the world's fastest single GPU. Then at the end, you still end up with a GPU that's within 5% of it. Overall, you spend around $350 for 9 months of early access to the flagship single GPU.
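The arithmetic behind this estimate, treating the poster's rough figures as assumptions (they are his guesses, not market data):

```python
# Back-of-envelope version of the early-access estimate above.
# All figures are the poster's own approximations.

early_access_premium = 600          # extra paid over a 1080 Ti, up front
resale_recovery = (200 + 300) / 2   # midpoint of the $200-$300 resale range
months = 9

net_cost = early_access_premium - resale_recovery
print(net_cost, round(net_cost / months, 2))  # ~$350 total, ~$38.89/month
```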


----------



## jhowell1030

Quote:


> Originally Posted by *ratzofftoya*
> 
> But I'm not seeing the temps ever get hotter than 60-70, even when I go +200 Mhz.


Just from experience with my card, I got better scores with a slightly lower clocked (+212) speed at stock voltage than I did at a higher "stable" clock speed (+225) with higher voltage.


----------



## bl4ckdot

Quote:


> Originally Posted by *jhowell1030*
> 
> Just from experience with my card, I got better scores with a slightly lower clocked (+212) speed at stock voltage than I did at a higher "stable" clock speed (+225) with higher voltage.


Same. Broke the 33K barrier graphics score with +215/+510. Higher was stable but scored lower.


----------



## jhowell1030

Quote:


> Originally Posted by *bl4ckdot*
> 
> Same. Broke the 33K barrier graphic score with +215/+510. Higher was stable but lower scores.


Yep, if I was at home I could tell you the exact amount but I'm sure mine were at least 1500 apart.


----------



## ratzofftoya

Quote:


> Originally Posted by *bl4ckdot*
> 
> Same. Broke the 33K barrier graphic score with +215/+510. Higher was stable but lower scores.


So what does it mean that I can only get stability at +180/+400? Am I doing something wrong? Are your numbers under water?


----------



## bl4ckdot

Quote:


> Originally Posted by *ratzofftoya*
> 
> So what does it mean that I can only get stability at +180/+400. Am I doing something wrong? Are your numbers under water?


You just have to do multiple runs: first sweep the GPU (or memory), find the frequency at which you score highest, then do the same with the memory (or the GPU, if you did memory first).
While benching I didn't get past 40°C. It was at 27°C at idle.
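The sweep bl4ckdot describes (hold one offset fixed, find the best-scoring value of the other, then swap) can be sketched like this; `bench` is a toy scoring model standing in for a real 3DMark run:

```python
# One-axis-at-a-time sweep: find the core offset that scores best,
# then sweep memory with that core fixed. bench() is a fake scoring
# model, not a real benchmark; its sweet spots are made up.

def bench(core, mem):
    # Toy model: score rises with clocks but falls off past a sweet spot.
    return (30000 + 10 * core - 0.5 * (core - 215) ** 2
                  + 4 * mem - 0.2 * (mem - 510) ** 2)

def best_offset(candidates, score_fn):
    """Run each candidate offset and keep the best scorer."""
    return max(candidates, key=score_fn)

core_candidates = range(150, 251, 25)   # +150 .. +250 in 25 MHz steps
best_core = best_offset(core_candidates, lambda c: bench(c, mem=0))

mem_candidates = range(400, 601, 50)    # +400 .. +600 in 50 MHz steps
best_mem = best_offset(mem_candidates, lambda m: bench(best_core, m))

print(best_core, best_mem)
```

This matches the posters' observation that the highest *stable* clock is not necessarily the highest *scoring* one.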


----------



## ratzofftoya

Quote:


> Originally Posted by *bl4ckdot*
> 
> You just have to do multiple runs, first with GPU (or memory), find at which frequency you score the higher then do the same with the memory (or GPU if you did memory first).
> While benching I didn't get past 40°C. It was at 27°C on idle.


Are you watercooling? What percentage were your fans at?


----------



## bl4ckdot

Quote:


> Originally Posted by *ratzofftoya*
> 
> Are you watercooling? What percentage were your fans at?


Yes, I'm on water (EK Predator 360). Fans were at full speed.


----------



## jhowell1030

On air I was only able to get +185 / +500 so that could also be an issue.

Stupid question: have you checked to make sure the power setting is set to maximum performance and vsync is off in the NVIDIA control panel? A buddy of mine was having issues and hadn't checked those.


----------



## ratzofftoya

Quote:


> Originally Posted by *jhowell1030*
> 
> On air I was only able to get +185 / +500 so that could also be an issue.
> 
> Stupid question: Have you checked to make sure the the power setting is set to maximum performance and vsync is off in the NVIDIA control panel? Buddy of mine was having issues and didn't check those.


I did. But +185/+500 with 80% fans is about where I'm at, so I am not as concerned. Time to bench at these settings, then compare to the 1080 Tis....


----------



## xTesla1856

Anyone else worried? My slider sits at 100%


----------



## pez

Someone posted that in another thread in these subforums a week or so back. I don't think he's stating anything new; he's saying 'using higher voltage shortens lifespan'. I think it's dramatic to say a year, however. And I think he's not accounting for the fact that the people doing the 'normal' voltage tweaks are usually under water, running max fan speed, or using an aftermarket cooling solution.


----------



## xTesla1856

Quote:


> Originally Posted by *pez*
> 
> Someone posted that in another thread in these subforums a week or so back. I don't think he's stating anything new. He's saying 'using higher voltage shortens lifespan'. I think it's dramatic to say a year, however. Though, I think he's not accounting for the people that are doing the 'normal' voltage tweaks are usually under water, running max fan speed, or even an aftermarket solution.


The way I understand it, if your Voltage Slider in AB is set to 100% all the time like mine, your card will only last about one year.


----------



## MrTOOSHORT

1.05v to 1.09v? Not worried.


----------



## BrainSplatter

Quote:


> Originally Posted by *xTesla1856*
> 
> your card will only last about one year.


One year at full load 24/7? Without more specific info, this statement is pretty pointless. Also, increased electromigration from overvolting will not kill your GPU from one moment to the next, but will at some point make the highest clock speeds unstable.

On the other hand, smaller chip structures are more sensitive to voltage, so voltage limits may have to be followed more closely in the future than they were before.


----------



## mouacyk

You have NVIDIA's 3-year warranty, for those using the stock cooler:
Quote:


> *FOR HOW LONG?*
> Three (3) years from the date of purchase of your new Warranted Product based on product specific warranty.
> 
> *WHAT DOES THIS WARRANTY NOT COVER?*
> Any problems that do not relate specifically to a manufacturing defect or hardware product failure, including, but not limited to, problems caused by abuse, misuse, negligence, act of God (such as flood), misapplication of service by a party other than an authorized service representative, software, shipment damages, etc.


Not sure, though, how NVIDIA will treat those who fried their cards on water or custom aftermarket coolers.


----------



## Asus11

Quote:


> Originally Posted by *Jpmboy*
> 
> only explanation i can come up with is higher clocks and/or less error correction... the benchmarks speak for them selves. Anyway, it was a good time to buy a used TXP. Kinda opposite the ole stock market mantra in this setting, here it's "sell on the rumor and buy on the news".
> 
> I never had plans ot sell my 2 TXPs for a Ti - doesn't make sense at all. A Ti classified? maybe.


thanks JPM, I have a TXP on the way btw.. only 50 over a 1080 Ti.. would rather have a Titan than a mass-produced card

also the full 12GB, more ROPs / L2 cache / bigger bus.. think it's defo worth the extra 50.. also a Titan block for a Titan lol


----------



## jsutter71

I noticed that the more I overclock, the lower my score. Strange. I still can't seem to break 3400 in FS though.
http://www.3dmark.com/fs/12170530


----------



## Dagamus NM

Quote:


> Originally Posted by *jsutter71*
> 
> I noticed that the more I overclock the lower my score. Strange. I still can't seem to break 3400 in FS though.
> http://www.3dmark.com/fs/12170530


Every GPU has its own sweet spot. Unless you are simply chasing raw frequency you should tune your machine for a specific benchmark. I have noticed that when I am fully dialed in on all gpus a change of a single MHz can have quite a large difference in point score.

On a separate note, have you all noticed that NVIDIA Control Panel seems to be getting less and less functional with recent driver updates? It has to be the slowest program on my computer. Switching from surround back to individual monitors takes forever, with the program going unresponsive at every change.


----------



## jsutter71

Quote:


> Originally Posted by *Dagamus NM*
> 
> Every GPU has its own sweet spot. Unless you are simply chasing raw frequency you should tune your machine for a specific benchmark. I have noticed that when I am fully dialed in on all gpus a change of a single MHz can have quite a large difference in point score.
> 
> On a separate note, have you all noticed that NVidia Control Panel seems to be getting less and less functional with recent driver updates. It has to be the slowest program on my computer. Switching from durround back to individual monitors takes forever with the program going unresponsive every change?


I have noticed the same exact thing. I realized that the reason my scores were so terrible before was that I did not have "force constant voltage" selected in Afterburner.


----------



## Sheyster

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> 1.05v to 1.09v? *Not worried.*


Same here.. If you're worried then keep it stock for everything and live worry free.


----------



## jsutter71

Starting to get a little better
http://www.3dmark.com/fs/12171513


----------



## EvilPieMoo

Quote:


> Originally Posted by *jsutter71*
> 
> Starting to get a little better
> http://www.3dmark.com/fs/12171513


Keep at it, you should be able to do 58k graphics score no problem.


----------



## qazplm5089

I checked that box also but the voltage still jumps around in Time Spy. When you're testing, is the voltage actually constant, or does it jump around?


----------



## jsutter71

Quote:


> Originally Posted by *qazplm5089*
> 
> I checked that box also but the voltage still jumps around in Time Spy. When you're testing, is the voltage actually constant, or does it jump around?


Before I had that box checked my average score in FS was under 20000


----------



## jsutter71

Quote:


> Originally Posted by *EvilPieMoo*
> 
> Keep at it, you should be able to do 58k graphics score no problem.


This is as close to 58K as I've gotten so far, from an earlier test.
http://www.3dmark.com/fs/12170530


----------



## kx11

the CPU needs a push to 4.5GHz and the RAM to 3000MHz; you should hit at least 57k


----------



## jsutter71

Quote:


> Originally Posted by *kx11*
> 
> the CPU needs a push to 4.5 and Ram to 3000mhz , you should hit at least 57k


My CPU and RAM will not overclock that high. 4.3GHz is the highest stable setting, and my RAM will push to 2600 but I see very little difference between that and 2400 so I just leave it at 2400. I might upgrade to 3200 soon.


----------



## jsutter71

Something I've noticed about FS is that the overall score seems to weight the physics test more heavily than the other tests. Take a look at these scores:

KilmerDK and yakasa both have higher frame rates overall than my system, yet because they have lower physics scores they received a lower overall score than mine.

http://www.3dmark.com/compare/fs/12006051/fs/11223837#

This is my score and demonstrates how the 6950X CPU stands out.
http://www.3dmark.com/fs/12170530


----------



## jsutter71

Quote:


> Originally Posted by *bl4ckdot*
> 
> True, I have yet to see 33k+ on FS.


I would like to know how this reviewer only got a graphics score of 28297 with a pair of TXPs and 52306 for the 1080Ti's. Something seems off with this test.

https://us.hardware.info/reviews/7270/4/nvidia-geforce-gtx-1080-ti-review-incl-sli-faster-card-for-the-same-price-benchmarksn3dmark-fire-strike--extreme--ultra


----------



## KillerBee33

Quote:


> Originally Posted by *jsutter71*
> 
> I would like to know how this reviewer only got a graphics score of 28297 with a pair of TXPs and 52306 for the 1080Ti's. Something seems off with this test.
> 
> https://us.hardware.info/reviews/7270/4/nvidia-geforce-gtx-1080-ti-review-incl-sli-faster-card-for-the-same-price-benchmarksn3dmark-fire-strike--extreme--ultra


It doesn't say SLI where the Titan X (Pascal) is...


----------



## jsutter71

Quote:


> Originally Posted by *pez*
> 
> Glad you got it resolved. Sounds like you had a busy day/night.


Busy day. My wife gets irritated if I'm still on the PC when she gets home from work. Even though I can overclock my TXPs with one power cable with no issues, I can't help but wonder if I could push them a little further with separate 6-pin and 8-pin cables. My thinking is that instead of pulling all the power through a single cable, which creates more heat, individual cables would even out the load, resulting in less power per cable. Less power equals less heat. That could explain why my cards start dropping frame rates the higher I clock them. I've noticed a HUGE performance drop when these cards heat up.
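The intuition about splitting the load holds up: resistive heating in a conductor scales with the square of the current, so halving the current per cable quarters the heat in each cable and halves the total. A minimal sketch, where the resistance and wattage figures are illustrative assumptions rather than measurements of real PCIe cables:

```python
# Why two cables run cooler than one: conductor heating scales as I^2 * R.
# The 0.01-ohm cable resistance and 300 W load are assumed for illustration.

def cable_heat_watts(load_watts, volts=12.0, resistance_ohms=0.01, n_cables=1):
    """Total resistive loss in the cabling when the load splits evenly."""
    current_per_cable = (load_watts / volts) / n_cables
    return n_cables * current_per_cable ** 2 * resistance_ohms

load = 300.0  # watts drawn on the 12V rail
one_cable = cable_heat_watts(load, n_cables=1)
two_cables = cable_heat_watts(load, n_cables=2)
print(one_cable, two_cables)  # 6.25 W vs 3.125 W: half the total loss,
                              # and each cable carries a quarter of it
```

Whether that cable heating actually affects GPU boost clocks is a separate question; the card throttles on its own die temperature, not the cable's.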


----------



## jsutter71

Quote:


> Originally Posted by *KillerBee33*
> 
> It doesn't say SLI where TitanX P is...


Sensing a little bias.


----------



## KillerBee33

Quote:


> Originally Posted by *jsutter71*
> 
> Sensing a little bias.


Quite simple really... it's not an SLI benchmark.
Also, a single TXP beats a single 1080 Ti in his test.


----------



## bl4ckdot

Quote:


> Originally Posted by *jsutter71*
> 
> I would like to know how this reviewer only got a graphics score of 28297 with a pair of TXPs and 52306 for the 1080Ti's. Something seems off with this test.
> 
> https://us.hardware.info/reviews/7270/4/nvidia-geforce-gtx-1080-ti-review-incl-sli-faster-card-for-the-same-price-benchmarksn3dmark-fire-strike--extreme--ultra


Just to be clear, when I said I had yet to see 33k+ with a Ti on FS, I was talking about the graphics score (single card only).


----------



## pez

Quote:


> Originally Posted by *jsutter71*
> 
> Busy day. My wife gets irritated if I'm still on the PC when she gets home from work. Even though I am able to overclock my TXPs with 1 power cable with no issues I can't help but wonder if I could push them a little further with a separate 6 and 8 pin cable. My thinking is that instead of pulling more power through a singe cable which would create more heat, individual cables might even out the power resulting in less power per cable. Less power equals less heat. That could explain why my cards start dropping frame rates the higher I clock them. I've noticed a HUGE performance drop when these cards heat up.


I'm not an expert, so I couldn't tell you if that would be a possibility, but if you can test it both ways, I don't see why not.

I just got some custom cables in, so my TXP will be switching from a single power cable to two separate ones. Assuming the custom cables were made correctly, that is.


----------



## Jpmboy

Quote:


> Originally Posted by *xTesla1856*
> 
> Anyone else worried? My slider sits at 100%


Not worried at all. Frankly and unfortunately, the voltage slider does very little, as MrT points out.
Quote:


> Originally Posted by *jsutter71*
> 
> I noticed that the more I overclock the lower my score. Strange. I still can't seem to break 3400 in FS though.
> http://www.3dmark.com/fs/12170530


There is a very strong error-correction trap in this architecture that kicks in well before the GPU will crash. The best way to get max performance out of these cards is to keep the temperature below 40C, and preferably below 25C. Makes all the difference with clock-bin drops and thermally-induced error correction at higher temps.
Quote:


> Originally Posted by *jsutter71*
> 
> I would like to know how this reviewer only got a graphics score of 28297 with a pair of TXPs and 52306 for the 1080Ti's. Something seems off with this test.
> 
> https://us.hardware.info/reviews/7270/4/nvidia-geforce-gtx-1080-ti-review-incl-sli-faster-card-for-the-same-price-benchmarksn3dmark-fire-strike--extreme--ultra


Because he's only running ONE TXP.

The point he's trying to make is that you can get 2 1080 Tis for the original cost of one TXP. What a revelation. Guy must be a genius.

Quote:


> Originally Posted by *bl4ckdot*
> 
> Just to be clear, when I said I had yet to see 33k+ with Ti on FS I was talking of Graphic Score (single card only)


You guys need to drop Firestrike 1080p, it really can't load this card. Use Extreme, Ultra, Time Spy... or if you really want to push the GPU (and take the CPU out of the equation) use VRMark Blue Room.
And instead of clicking "force constant voltage" in AB: open AB, set a core clock and apply, put your mouse in the sensor window, and hit Ctrl+F. Select the 1050mV point on the graph (while holding the Shift key), then hit Ctrl+L. Now click Apply and your card is locked at that frequency. Save this to a save slot and adjust the other sliders as needed. This, plus cold, is the best way to get the max out of these cards.


----------



## kx11

Quote:


> Originally Posted by *jsutter71*
> 
> I would like to know how this reviewer only got a graphics score of 28297 with a pair of TXPs and 52306 for the 1080Ti's. Something seems off with this test.
> 
> https://us.hardware.info/reviews/7270/4/nvidia-geforce-gtx-1080-ti-review-incl-sli-faster-card-for-the-same-price-benchmarksn3dmark-fire-strike--extreme--ultra


his charts are weird, 1080 SLI beats 1080 Ti SLI in almost everything??


----------



## jsutter71

Quote:


> Originally Posted by *kx11*
> 
> his charts are weird , 1080 SLi betas 1080ti SLi in almost everything ??


His review is terrible. How can you make a FAIR recommendation without equal testing? If he wanted to throw in SLI, then he should have had an SLI score for all the cards he listed, all tested under the same conditions. Either the tester lacked the hardware or he was too lazy to follow through. He wasn't too lazy to write the stupid review, though.


----------



## DNMock

Quote:


> Originally Posted by *jsutter71*
> 
> His review is terrible. How can you make a FAIR recommendation without equal testing. If he wanted to throw in SLI then he should have had a SLI score for all the cards he listed. All tested under the same conditions.. Either the tester lacked the hardware of he was to lazy to follow through. He wasn't to lazy to write the stupid review though.


Does it really matter? Anyone who doesn't know by now that the x80 Ti cards put out numbers comparable to the Titan series, six months after the Titan releases and for $300 to $500 less, isn't worth the time to care about, since they either A) are new and too lazy to bother putting in the research on arguably the most expensive component in their system, or B) are just an ignorant shill.


----------



## MrKenzie

I managed to squeeze some more out of my Titan with fine-tuning of the curve, previous best graphics score was 8181 and have managed 8242 with curve adjustment. Pretty happy and I could probably get it higher but I prefer stability as I do a lot of gaming.

http://www.3dmark.com/fs/12182845

I will look at upgrading the CPU side in 9-12 months but currently about 5-7% gain by spending AU$3500 is not worth it, I could buy another 2 Titan's for that money!


----------



## MrTOOSHORT

Quote:


> Originally Posted by *MrKenzie*
> 
> I managed to squeeze some more out of my Titan with fine-tuning of the curve, previous best graphics score was 8181 and have managed 8242 with curve adjustment. Pretty happy and I could probably get it higher but I prefer stability as I do a lot of gaming.
> 
> http://www.3dmark.com/fs/12182845
> 
> I will look at upgrading the CPU side in 9-12 months but currently about 5-7% gain by spending AU$3500 is not worth it, I could buy another 2 Titan's for that money!


Nice gpu score, right up there with the top TXPs.


----------



## MrKenzie

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Nice gpu score, right up there with the top TXPs.


Yes all down to keeping it as cool as possible.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *MrKenzie*
> 
> Yes all down to keeping it as cool as possible.


I know about keeping it cool, I stick my PC outside in -7°C weather for benching. Sometimes colder.


----------



## MrKenzie

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> I know about keeping it cool, I stick my PC outside in -7°C weather for benching. Sometimes colder.


What sort of load GPU temps do you get when you do that? I can end Firestrike at 10°C, and with 100% load for 2 hours I get a maximum of 26°C (it will hold 26°C forever as long as ambient isn't over 30°C).


----------



## MrTOOSHORT

Load around 18 to 25°C depending on how cold the night is.


----------



## Asus11

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Load around 18 to 25'C depending on cold night.


would hate to be trying to move your big computer around lol must be a pain

on another note my XP just came


----------



## Asus11

better late than never


----------



## arrow0309

Quote:


> Originally Posted by *Asus11*
> 
> better late than never


Woohoo, nice!















How come you didn't get a 1080 Ti and save some coin?


----------



## Asus11

Quote:


> Originally Posted by *arrow0309*
> 
> Woohoo, nice!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How come you didn't get a 1080 Ti and save some coin?


because I would not have saved much









plus I'd rather have a Titan than a 1080 Ti

the card is not new, but the guy hardly ever used it


----------



## arrow0309

Quote:


> Originally Posted by *Asus11*
> 
> because I would not have saved much
> 
> 
> 
> 
> 
> 
> 
> 
> 
> plus id rather have a titan than a 1080ti
> 
> the card is not new but the guy hardly ever used it


Well, in that case I can only give you congratulations!








It's a nice piece of hardware, planning to water cool it?









Btw:
I've recently changed some hardware (CPU, motherboard and RAM) and the CPU block as well, went 100% Watercool Heatkiller (IV)


----------



## Asus11

Quote:


> Originally Posted by *arrow0309*
> 
> Well, in that case I can only give you congratulations!
> 
> 
> 
> 
> 
> 
> 
> 
> It's a nice piece of hardware, planning to water cool it?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Btw:
> I've recently changed some hw (cpu, mb and ram) and the cpu block also, went to 100% Watercool Heatkiller's (IV)


the watercooling block etc. was meant to come a few days ago but will come tomorrow now







EK block & backplate
hopefully I'll get back to some good benchmark scores









not going to lie, that Heatkiller looks really nice









have you done the shunt mod? I'm planning to do it... wondering if it's worth it


----------



## xTesla1856

Quote:


> Originally Posted by *arrow0309*
> 
> Woohoo, nice!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How come you didn't get a 1080 Ti and save some coin?


I think he may have got it used. The smart choice at the moment


----------



## MrTOOSHORT

Quote:


> Originally Posted by *xTesla1856*
> 
> I think he may have got it used. The smart choice at the moment


He already said that a few posts up.


----------



## nycgtr

Hmm never joined this club


----------



## skypine27

685 pages so excuse (please) this but....

Still no unlocked BIOS to up the power limit ?

Flame on!


----------



## xTesla1856

Quote:


> Originally Posted by *skypine27*
> 
> 685 pages so excuse (please) this but....
> 
> Still no unlocked BIOS to up the power limit ?
> 
> Flame on!


None so far..


----------



## EvilPieMoo

Quote:


> Originally Posted by *skypine27*
> 
> 685 pages so excuse (please) this but....
> 
> Still no unlocked BIOS to up the power limit ?
> 
> Flame on!


Nope, I wouldn't count on one ever happening either.


----------



## soccastar001

Wanted to post my results. I'm a first-time Titan owner and brand new to overclocking a GPU. This is my best score with an OC of 235/570 on water. I really wanted to break 10000 but I don't think it is happening.

http://www.3dmark.com/spy/1475016

I also noticed a lot of people posting temps a lot lower than I'm getting. When running at full load for multiple hours (like a gaming session) I'm getting high 40s, close to 50°C. Should I lower the clock to keep it cooler?

Edit: My core OC is 235, not 265. At 240 or higher Time Spy's stress test would crash. I was able to get my memory OC much higher, to about 750, but it wasn't having much if any positive effect on my benchmarks so I rolled it back.


----------



## bl4ckdot

Quote:


> Originally Posted by *soccastar001*
> 
> Wanted to post my results. I'm a first time Titan owner and brand new to overclocking a GPU. This is my best score with OC of 265/570 on water. I really wanted to break 10000 but I don't think it is happening.
> 
> http://www.3dmark.com/spy/1475016
> 
> I also noticed a lot of people posting temps a lot lower than I'm getting. When running on a full load for multiple hours (like a gaming session) I'm getting high 40s, close to 50c. Should I lower the clock to keep it cooler?


Welcome to the club.
This is a very solid score. I see the highest GPU clock reported is 2050 MHz, did it throttle a lot during the test? Your gaming temps seem fine, I have the same. If you want to break the 10k score you will have to keep the GPU as cool as possible to avoid the throttling, even though 11100 on the graphics score is pretty high already (the max I have seen is 11400).


----------



## Jpmboy

Quote:


> Originally Posted by *soccastar001*
> 
> Wanted to post my results. I'm a first time Titan owner and brand new to overclocking a GPU. This is my best score with OC of 235/570 on water. I really wanted to break 10000 but I don't think it is happening.
> 
> http://www.3dmark.com/spy/1475016
> 
> I also noticed a lot of people posting temps a lot lower than I'm getting. When running on a full load for multiple hours (like a gaming session) I'm getting high 40s, close to 50c. Should I lower the clock to keep it cooler?
> 
> Edit. My core OC is 235, not 265. At 240 or higher Time Spy's stress test would crash. I was able to get my memory OC much higher, to about 750, but it wasn't having much if any positive affect on my benchmarks so I rolled it back.


your graphics score is very good.. no need to change anything unless you think you need to lower operating temperature.

compare *here*


----------



## Asus11

Quote:


> Originally Posted by *Jpmboy*
> 
> your graphics score is very good.. no need to change anything unless you think you need to lower operating temperature.
> 
> compare *here*


JP is it worth shunt modding the titan x?


----------



## Jpmboy

Quote:


> Originally Posted by *Asus11*
> 
> JP is it worth shunt modding the titan x?


it's reversible so you may find that your specific sample benefits. I did one card at launch and then removed it. Cold works as well IMO.


----------



## jsutter71

My highest scores in FS.. 34000 just out of reach
http://www.3dmark.com/fs/12206964


----------



## DooRules

Quote:


> Originally Posted by *jsutter71*
> 
> My highest scores in FS.. 34000 just out of reach
> http://www.3dmark.com/fs/12206964


I am going to assume you can't reasonably go higher on the chip? Other than that, get the GPUs as cold as you can.


----------



## DNMock

Quote:


> Originally Posted by *skypine27*
> 
> 685 pages so excuse (please) this but....
> 
> Still no unlocked BIOS to up the power limit ?
> 
> Flame on!


As far as I can tell it wouldn't matter much anyway. The shunt mod accomplishes the same thing for all intents and purposes, and aside from allowing max clocks to stay stable it doesn't seem to offer any real benefits beyond that. Pascal itself seems to hard-cap at around 2100 MHz without a Frankenstein board and LN2 or some other extreme approach.

Here's hoping the new fab process for Volta will allow a little more wiggle room
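For anyone wondering why the shunt mod raises the effective power limit, here's a back-of-envelope Python sketch. The 5 milliohm shunt value and the 300 W figure are illustrative assumptions, not measured from a real card:

```python
# The power controller infers current from the voltage drop across a
# known shunt resistance: I = V_drop / R. Soldering a second resistor
# in parallel across the shunt lowers the effective resistance, so the
# same real current produces a smaller drop and the card under-reports
# its power draw, effectively raising the power limit.

def parallel(r1, r2):
    return r1 * r2 / (r1 + r2)

R_SHUNT = 0.005      # 5 milliohm stock shunt (assumed value)
R_MOD = 0.005        # identical resistor stacked on top

r_eff = parallel(R_SHUNT, R_MOD)          # 0.0025 ohm
real_watts = 300.0
reported = real_watts * r_eff / R_SHUNT   # controller sees half the drop
print(reported)  # 150.0 -> a 300 W real draw reads as 150 W
```

Which is also why it's a blunt instrument: the card's telemetry is simply lied to, it isn't actually drawing less power.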


----------



## jsutter71

Quote:


> Originally Posted by *DooRules*
> 
> I am going to assume you can't reasonably go higher on the chip? Other than that get the gpu's as cold as you can.


Running 4 EK XE rads in push/pull configuration, so they won't be getting any cooler. I have never once seen my cards go beyond 60°C even with heavy loads. So which is it: is their speed limited by the amount of power they receive, or are they just a couple of spoiled brats who refuse to work unless they're cold?


----------



## skypine27

Quote:


> Originally Posted by *DNMock*
> 
> As far as I can tell it wouldn't matter much anyway. The shunt mod accomplishes the same thing for all intents and purposes, and aside from allowing max clocks to stay stable it doesn't seem to offer any real benefits beyond that. Pascal itself seems to hard-cap at around 2100 MHz without a Frankenstein board and LN2 or some other extreme approach.
> 
> Here's hoping the new fab process for Volta will allow a little more wiggle room


Thx for the heads up.

I run 2 of them under custom water, but with ultra-low-RPM fans (I aimed at silence under max load), and the loop also has to cool a 10-core overclocked 6950X... so my temps are nothing to write home about.

But from watching benchmark and gaming sessions, I see 1911 a lot (and I'm a .45 ACP fan so I like this), but they are always bouncing off the 120% power limit and never crack 2k. I'd like a solid 2000-2100, which was what I was asking about with the power limit mod. I checked the thread on the shunt mod and it seems pretty easy to do, but I don't feel like pulling the blocks and going through the time required (my interest in tinkering is slowly dying...)

Thx for the info


----------



## paxw

Not sure if many still care, but the EVGA GTX TITAN X (Pascal) / GTX 1080 Ti FE HYBRID waterblock cooler actually was in stock a couple of times today.
It sold out crazy fast, but I suspect you will see some more go up.


----------



## jsutter71

So would it be unreasonable to upgrade my system memory from 2400MHz to 3200MHz to squeeze out some additional performance? For the record, I already have a 6950X and a Samsung 960 Pro. Is it worth the cost, which in my case would be $668 for the new DIMMs?
This is what I'm looking at.
https://www.newegg.com/Product/Product.aspx?item=N82E16820232349


----------



## MrTOOSHORT

Quote:


> Originally Posted by *jsutter71*
> 
> So would it be unreasonable to upgrade my system memory from 2400mhz to 3200mhz to squeeze out some additional performance? For the record I already have a 6950x and samsung 960 pro. Is it worth the cost which in my case would be $668 for the new dimms?
> This is what I'm looking at.
> https://www.newegg.com/Product/Product.aspx?item=N82E16820232349


If you have the money to spend, those are some nice sticks. Your 6950X should be able to reach those speeds too. 2400MHz to 3200MHz is definitely worth it, well, in my eyes it is.
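For what it's worth, the peak-bandwidth arithmetic behind a 2400 to 3200 jump on quad-channel X99 looks like this. These are theoretical ceilings; real-world gaming gains are much smaller:

```python
# Rough theoretical memory bandwidth for DDR4 on a quad-channel
# platform like the 6950X: transfer rate x 8 bytes per channel x
# number of channels.

def peak_bw_gbs(mt_per_s, channels=4, bus_bytes=8):
    return mt_per_s * bus_bytes * channels / 1000  # GB/s

bw_2400 = peak_bw_gbs(2400)   # 76.8 GB/s
bw_3200 = peak_bw_gbs(3200)   # 102.4 GB/s
print(bw_2400, bw_3200, f"+{bw_3200 / bw_2400 - 1:.0%}")
```

So on paper it's a third more bandwidth; whether a given game or benchmark actually feels that is another question.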


----------



## DooRules

I got the 32 gb kit of the same ram. Runs at 3400 with no effort at all.


----------



## pez

TXP will probably be up for sale soon, so long as my Ti isn't faulty. I'd keep the TXP, but putting it into the secondary rig would be a bit senseless for many reasons. The TXP should at least pay for the Ti sidegrade. After seeing the pricing on that Hybrid kit, and the fact I had to wait a ridiculous amount of time because EVGA wanted it to be compatible with the Ti as well... yeah, you can keep that, EVGA. I will happily 'downgrade' to a Ti AIB to avoid your cash grab.


----------



## Asus11

Quote:


> Originally Posted by *pez*
> 
> TXP will probably be up for sale soon so long as my Ti isn't faulty. I'd keep the TXP, but putting it into the secondary rig would be a bit senseless for many reasons. The TXP should at least pay for the Ti sidegrade. After seeing the pricing on that Hybrid kit, and the fact I had to wait a ridiculous amount of time because EVGA wanted it to be compatible with the Ti as well....yeah, you can keep that EVGA. I will happily 'downgrade' to a Ti AIB to avoid your cash grab.


I don't get it when people do this... maybe gain $100 to downgrade?

Is it really worth the hassle?


----------



## pez

Quote:


> Originally Posted by *Asus11*
> 
> I dont get it when people do this.. maybe gain $100 to downgrade?
> 
> is it really worth the hassle


I've stated it in here before, but I did so for the benefits of AIB cards, i.e. cooler-running cards under air, lower temps, etc. I had no intention of going water on my TXP unless with an AIO, but that proved to be cost-ineffective.

Seeing as I got to play with the card for 8 months, and now I can move on to an AIB cooler that does what I need it to, I don't mind losing $200-300 in the end. I'll still make more selling the card than what the Ti costs, and if I truly want, I can use the money to go SLI.


----------



## Asus11

Quote:


> Originally Posted by *pez*
> 
> I've stated it in here before, but I did so for the benefits of AIB cards. I.e. cooler running cards under air, lower temps, etc. I had no intentions of going for water on my TXP unless with a AIO, but that proved to be cost ineffective.
> 
> Seeing as I got to play with the card for 8 months, and now I can move on to a AIB cooler that does what I need it to, I don't mind losing $200-300 in the end. I'll still make more selling the card than what the Ti costs and if I truly want, I can use the money to go SLI.


I see now. IMO the Titan XP is only good under water.

Good luck with your next card


----------



## pez

Quote:


> Originally Posted by *Asus11*
> 
> I see now, imo the Titan XP is only good under water
> 
> goodluck with your next card


I mean it's amazing even at 70% fan, but 70% fan is enough to make it impossible to have a conversation in the same room as another person







. But yes, generally I would agree. It's a very specific situation in the end, but I'm going to end up happy







.


----------



## xTesla1856

Quote:


> Originally Posted by *pez*
> 
> I mean it's amazing even at 70% fan, but 70% fan is enough to make it impossible to have a conversation in the same room as another person
> 
> 
> 
> 
> 
> 
> 
> . But yes, generally I would agree. It's a very specific situation in the end, but I'm going to end up happy
> 
> 
> 
> 
> 
> 
> 
> .


Mine did over 2GHz on air with fan speeds of about 70-80%. But at that point it sounds like it will self-destruct at any time. Water is the only way to go for such a beast


----------



## pez

Quote:


> Originally Posted by *xTesla1856*
> 
> Mine did over 2GHz on air with fan speeds of about 70-80%. But at that point it sounds like it will self-destruct at any time. Water is the only way to go for such a beast


Yeah, if I capped it at 70% fan (honestly the max I could tolerate) it would plateau after a few hours of gaming at something like 1924 MHz. I was perfectly happy with that. With the stock fan curve, it was just all over the place, and for some games that can mean the difference between 90 FPS and 110+ FPS (i.e. Doom).


----------



## Jpmboy

Quote:


> Originally Posted by *jsutter71*
> 
> So would it be unreasonable to upgrade my system memory from 2400mhz to 3200mhz to squeeze out some additional performance? For the record I already have a 6950x and samsung 960 pro. Is it worth the cost which in my case would be $668 for the new dimms?
> This is what I'm looking at.
> https://www.newegg.com/Product/Product.aspx?item=N82E16820232349


good stuff, but if you are looking for 64GB, get the 8x8GB kit. Filling all RAM slots on T-topology boards basically always performs better, and the 8GB sticks are more likely to OC than high-density 16GB sticks. I'm running the 3200c14 8x8GB kit on this R5E10 at 3400c13 with 1.45V.


----------



## Sheyster

Quote:


> Originally Posted by *Asus11*
> 
> I dont get it when people do this.. maybe gain $100 to downgrade?
> 
> is it really worth the hassle


The FTW3 cards I have on order will *both* be much quieter than 1 Titan XP. It was an easy decision for me. I don't water cool my video cards anymore; I turn them over 2-3 times a year.


----------



## Sheyster

Quote:


> Originally Posted by *pez*
> 
> I mean it's amazing even at 70% fan, but 70% fan is enough to make it impossible to have a conversation in the same room as another person
> 
> 
> 
> 
> 
> 
> 
> . But yes, generally I would agree. It's a very specific situation in the end, but I'm going to end up happy
> 
> 
> 
> 
> 
> 
> 
> .


I run my card at 70% fan, voltage locked at 1050mV, 2000 MHz. It loses some MHz as it heats up, but the card is quite noisy at 70% fan. I use my case (Core X9) just like an open bench, so nothing is baffling that horrid sound.


----------



## MrTOOSHORT

So what do you guys think, flash an AIB 1080ti bios to the TXP?







or







?


----------



## Sheyster

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> So what do you guys think, flash an AIB 1080ti bios to the TXP?
> 
> 
> 
> 
> 
> 
> 
> or
> 
> 
> 
> 
> 
> 
> 
> ?


LOL. All I can say is:

"Good luck." - sinister voice on the phone from _Taken_.


----------



## MrTOOSHORT

I have a specific set of skills though!









I don't have another free card just in case the flash goes wrong; the TXM is watercooled in another machine.


----------



## Asus11

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> So what do you guys think, flash an AIB 1080ti bios to the TXP?
> 
> 
> 
> 
> 
> 
> 
> or
> 
> 
> 
> 
> 
> 
> 
> ?


I wondered when someone was going to do this









do let us know if you do haha

I would do it, but I'm mini-ITX and only have 1 slot available; if I mess the BIOS up I can't add another card in to flash it again


----------



## xTesla1856

I have a spare 980 laying around.....


----------



## Asus11

Quote:


> Originally Posted by *xTesla1856*
> 
> I have a spare 980 laying around.....


2152mhz on the XP? dam your card is OP lol

imagine it actually worked


----------



## Kendragon

I run my Titan XP at +240 on core and +440 on memory and haven't had any issues that I can notice. I have an EK full block on it.


----------



## Jpmboy

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> So what do you guys think, flash an AIB 1080ti bios to the TXP?
> 
> 
> 
> 
> 
> 
> 
> or
> 
> 
> 
> 
> 
> 
> 
> ?


lol - I was first to flash the strix bios to a kingpin.... but yeah, you gotta have a spare NV card (an old 9800GT will do just fine).








I'm happy with the performance of this pair of TXPs, but I am waiting to see if a Ti Classified shows up.


----------



## arrow0309

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - I was first to flash the strix bios to a kingpin.... but yeah, you gotta have a spare NV card (an old 9800GT will do just fine).
> 
> 
> 
> 
> 
> 
> 
> 
> I'm happy with the performance of this pair of TXPs, but I am waiting to see if *a Ti Classified* shows up.


A Ti Kingpin Classified, that makes us two!


----------



## xTesla1856

A Ti Kingpin might make me weak. The 980 Ti KPE was the best-looking card ever, IMO.


----------



## pez

Quote:


> Originally Posted by *Sheyster*
> 
> I run my card at 70% fan, voltage locked at 1050mv, 2000 MHz. It loses some MHz as it heats up but the card is quite noisy at 70% fan. I use my case (Core X9) just like an open bench so nothing is baffling that horrid sound.


Indeed. Maybe some dampening could go a long way, but yeah







. Glad you caught my drift







. I can't say the NCASE is necessarily a quiet case since it's about 90% metal and then some plastic bits.


----------



## kx11

Quote:


> Originally Posted by *xTesla1856*
> 
> Ti Kingping might make me weak. The 980Ti KPE was the best looking card ever, IMO.


i'd vote for GALAX HOF


----------



## axiumone

Haha, Nvidia just dropped the bomb. Full-fat Titan XP, now literally called Titan Xp, with 3840 CUDA cores.

http://nvda.ws/2nBObA2


----------



## pez

Quote:


> Originally Posted by *axiumone*
> 
> Haha, nvidia just dropped the bomb. Full fat titan xp, now literally called Titan Xp. With 3840 cuda cores.
> 
> http://nvda.ws/2nBObA2


Was this announced somewhere? I had no idea this was coming...and same price as our TXPs were







.


----------



## axiumone

I don't think anyone did. It was announced via a strange Twitter post from Nvidia. I thought it was a hijacked account at first, but the link leads to the Nvidia store buy page.


----------



## pez

Quote:


> Originally Posted by *axiumone*
> 
> I don't think anyone did. It was announced via a strange twitter post from nvidia. I thought it was a hijacked account at first, but the link leads to nvidia store buy page.


Yeah, I couldn't find anything on Google for it either. Outside of some site predicting it happening in September







. Inb4 Baasha orders 4 of these







.


----------



## axiumone

From what I see on the buy page, they've cut the DVI out on this one as well, and they are now using the same heatsink as on the Ti.


----------



## JackCY

It's on all the NV store pages, some even have a direct link in the products list. Titan X is gone, replaced with Titan Xp, or XP replaced with XPp as the slang goes, I guess.
New club for the XPp?









They made a smaller or bigger heatsink for the 1080 Ti? They really bothered making a different heatsink beyond the cover? A total waste. Wasn't it like 1 fin more or something anyway?


----------



## Anzial

Looks like nvidia is feeling the pressure from Vega lol


----------



## Jpmboy

With Tin at it (at EVGA) a Ti Classy is possible.
http://forum.hwbot.org/showthread.php?t=167865?utm_source=email&utm_medium=newsletter&utm_campaign=march2017


----------



## Sheyster

Quote:


> Originally Posted by *pez*
> 
> Inb4 Baasha orders 4 of these
> 
> 
> 
> 
> 
> 
> 
> .


----------



## pez

Quote:


> Originally Posted by *axiumone*
> 
> From what I see on the buy page, they've cut the dvi out on this one as well and are now using the same heatsink as on the Ti now.


I was wondering this, but couldn't find an angle of that...do you have a pic?
Quote:


> Originally Posted by *Sheyster*


I mean...I may or may not have ordered one with next morning shipping....


----------



## rt123

Ordered.


----------



## axiumone

No pic, but dvi is missing from the specs table.


----------



## pez

Quote:


> Originally Posted by *axiumone*
> 
> No pic, but dvi is missing from the specs table.


Aaah, good call. Welp -- we finally got Big Pascal and bragging rights over Ti users again







.


----------



## Maintenance Bot

In for 1.


----------



## MrTOOSHORT

The naming is just stupid, it was stupid last year too calling it Titan X.


----------



## Maintenance Bot

Looks like they juiced the memory bandwidth a good bit.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Looks like they juiced the memory bandwidth a good bit.


Yeah, that's +700 over the OG TXP.


----------



## xTesla1856

Kinda pissed actually


----------



## Jpmboy

Quote:


> Originally Posted by *rt123*
> 
> Ordered.


same here.


----------



## Jpmboy

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> The naming is just stupid, was stupid last year too calling it Titan-X.


gotta figure out how to enter that name in the bench tables.


----------



## rt123

Windows XP?


----------



## axiumone

I wonder if they also upgraded the power delivery on this one as well.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *axiumone*
> 
> I wonder if they also upgraded the power delivery on this one as well.


I'm going to say yes, as it's probably the exact same PCB and components as the 1080 Ti FE.


----------



## xTesla1856

Quote:


> Originally Posted by *axiumone*
> 
> I wonder if they also upgraded the power delivery on this one as well.


Seems like it

EDIT: I have one in my cart, must resist


----------



## octiny

Sold my "old" Titan X Pascals for a little over $1000 each 2 days ago, since I'm doing a completely new build from scratch. Jeez, so glad I didn't procrastinate.


----------



## bl4ckdot

Fk me, I bought my new audio setup a week ago and now I must wait


----------



## Jpmboy

looks like it's a uniblock or the 1080Ti waterblock?


----------



## xTesla1856

Quote:


> Originally Posted by *Jpmboy*
> 
> looks like it's a uniblock or the 1080Ti waterblock?


Regular TXP block won't fit?


----------



## MrTOOSHORT

The old block fit a 1080 Ti, so it'll fit the new TXP.

I want to sell the TXM now so I can hand the old TXP down to my son, then get the new TXP. It's getting costly here.


----------



## DooRules

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Old block fit a 1080ti, so it'll fit the new TXP.
> 
> I want to sell the TXM now so I can hand me down the old TXP to my son. Then get the new TXP. It's getting costly here.


Getting costly for sure. I am just gonna stand pat with what I got and wait for next gen to drop.


----------



## Jpmboy

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Old block fit a 1080ti, so it'll fit the new TXP.
> 
> I want to sell the TXM now so I can hand me down the old TXP to my son. Then get the new TXP. It's getting costly here.


I hope so. Just PM'd EK to ask if they are compatible.

edit: damn nice hand-me-down.


----------



## vmanuelgm

If someone wants the TXP 1.0, PM me!!!

Just bought the 3840-CUDA one!!!


----------



## nycgtr

256 extra CUDA cores, lol. What are we looking at realistically for a performance increase?


----------



## axiumone

Around 7% from cores and probably close to 2-3% from memory. So roughly 9-10% faster than the "old" TXP.
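The core-count part of that estimate is just arithmetic, assuming roughly linear scaling with shaders at equal clocks (which is an assumption, not a guarantee):

```python
# Expected shader-throughput uplift of the full-die Titan Xp over the
# original Titan X (Pascal) at identical clock speeds.

txp_cores = 3584   # Titan X (Pascal), cut-down GP102
xp_cores = 3840    # Titan Xp, full GP102

core_gain = xp_cores / txp_cores - 1
print(f"+{core_gain:.1%}")  # +7.1% more shaders at the same clock
```

Anything past that has to come from the memory bump or slightly better binning.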


----------



## xTesla1856

Eager to see if these reach the same or higher clocks than the TXP(leb)


----------



## nycgtr

So whos going to flash their old one


----------



## willverduzco

Quote:


> Originally Posted by *nycgtr*
> 
> 256 extra cuda cores lol. What are we looking at realistically for performance increase?


7% more at the same clocks in raw compute and shading. The ROP count is the same, so strictly ROP-bound rendering performance is unchanged... but games are shader- rather than ROP-limited more often than not. I assume with good cooling they'll hit ~2100 just like we can, so that 7% faster should hold true.

Memory seems like a non-issue, since we can hit 12000 MT/s (+1000 MHz DDR) on memory (the most Afterburner allows without tweaking) without any artifacts. I recently switched from +450 MHz (which was a previous sweet spot) to +1000 MHz (which is fully stable and gives a slight boost).
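For reference, the bandwidth math on GP102's 384-bit bus works out like this (stock transfer rates plus the +1000 MHz overclock):

```python
# Memory bandwidth = transfer rate (MT/s) x bus width / 8 bits per byte.

def bw_gbs(mt_per_s, bus_bits=384):
    return mt_per_s * bus_bits / 8 / 1000  # GB/s

print(bw_gbs(10000))  # 480.0  -- Titan X Pascal stock (10 GT/s GDDR5X)
print(bw_gbs(11400))  # 547.2  -- Titan Xp stock (11.4 GT/s)
print(bw_gbs(12000))  # 576.0  -- a 10 GT/s card overclocked by +1000 MHz DDR
```

So an overclocked original TXP can actually exceed the new card's stock bandwidth, which fits the "memory is a non-issue" read.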


----------



## Silent Scone

Wow, talk about smacking you about the face...

Didn't expect them to stab me in the back till Vega dropped.


----------



## axiumone

I will die laughing if some of the old cards unlock to 3840 cores.


----------



## rt123

Nvidia & unlock? Yeah forget it.


----------



## nycgtr

Quote:


> Originally Posted by *rt123*
> 
> Nvidia & unlock? Yeah forget it.



















If it was possible, I am sure they'd charge for it. Wouldn't be surprised if it showed up lol.


----------



## rt123

No kidding lol.

For $200, make your TXP into a TXp.


----------



## nycgtr

Quote:


> Originally Posted by *rt123*
> 
> No kidding lol.
> 
> For $200, make your TXP into a TXp.


I am sure if they were to do it at 50 -100 bucks many would buy it lol.


----------



## Silent Scone

They're likely all laser-cut as per usual. I'm a bit miffed they've done this so soon after the Ti. Probably not going to buy either now.


----------



## bl4ckdot

Quote:


> Originally Posted by *nycgtr*
> 
> I am sure if they were to do it at 50 -100 bucks many would buy it lol.


Like everyone IMO


----------



## rt123

Quote:


> Originally Posted by *nycgtr*
> 
> I am sure if they were to do it at 50 -100 bucks many would buy it lol.


And that's EXACTLY the reason why it wouldn't be 50-100 bucks.


----------



## Blaise Pascal

Think there's any chance of a step-up program? It's probably been too long for most of us, haha. Not sure how those work when buying straight from Nvidia


----------



## Radox-0

Quote:


> Originally Posted by *Blaise Pascal*
> 
> Think there's any chance of a step up program? It's probably been too long for most of us, haha. Not sure how those work when buying straight from nvidia


A step up that would cost you $1200


----------



## axiumone

Quote:


> Originally Posted by *rt123*
> 
> And that's EXACTLY the reason why it wouldn't be 50-100 bucks.


It would be a monthly subscription available exclusively through geforce experience.


----------



## Blaise Pascal

I could sell my AMD stock and buy the new XP, but it just chunked LOL.


----------



## rt123




----------



## Jpmboy

Quote:


> Originally Posted by *Silent Scone*
> 
> They're likely all laser cut as per usual. I'm a bit miffed they've done this so soon to the Ti. Probably not going to buy either now.


Of course they are milking us... the TX Pascal should have been full-die. But it's business as usual. For a gamer at 1440p or 4K the difference is nominal. Would be nice if this full die had DP (double precision).


----------



## pez

What, you guys aren't happy to order one and have the true king again?


----------



## nycgtr

I am thinking about buying it, but honestly I don't see the real gaming benefit from it, and I can't play Firestrike as well anymore anyway. I gave up my 5960X lol.


----------



## spyshagg

Nvidia's motto of late is "balls out". A preemptive move? They seem to want to kill the competition before there even is competition on the market


----------



## nycgtr

Don't think there's competition in the 1k plus market for single card. This is just epeen heaven lol.


----------



## pez

I love ITX, so the single fastest card is a big thing for me







. I'll shell out the money to drive the res I prefer and get a few more frames







.


----------



## xTesla1856

Quote:


> Originally Posted by *Blaise Pascal*
> 
> I could sell my AMD stock and buy the new XP, but it just chunked LOL.


With that username? You must do it now.


----------



## GosuPl

At last! Full GP102







Meanwhile, on my floor ;-)







Hmm, but what if Nvidia releases a third TITAN X Pascal on GP100 instead of GP102?









I'm thinking about buying 2x the new TITAN X for SLI but...


----------



## Jpmboy

Quote:


> Originally Posted by *GosuPl*
> 
> At last! Full GP102
> 
> 
> 
> 
> 
> 
> 
> Meanwhile on my floor ;-)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hmm, but what if Nvidia releases a third TITAN X Pascal on GP100 instead of GP102?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm thinking about buying 2x new TITAN X for SLI, but...


there is no therapy for your affliction.


----------



## GosuPl

Haha! Yes, but I just love that







:thumb:


----------



## xTesla1856

I wonder how long until my "condition" gets as bad as yours


----------



## GnarlyCharlie

So will there be a TitanXPp owner's thread now? Sign me up! I was >this< close to buying a 1080Ti when I saw this, and mITX will keep me from the temptation to buy 2.


----------



## rcfc89

Quote:


> Originally Posted by *GnarlyCharlie*
> 
> So will there be a TitanXPp owner's thread now? Sign me up! I was >this< close to buying a 1080Ti when I saw this, and mITX will keep me from the temptation to buy 2.


Hope so







This will be my first Titan.

As far as the naming goes: the last Titan was simply called Titan X. The P was added by the community to separate it from the Maxwell version. Nvidia is actually calling this one Titan Xp.


----------



## GosuPl

We all are doomed by our madness







Great!


----------



## Jpmboy

Quote:


> Originally Posted by *GnarlyCharlie*
> 
> So will there be a TitanXPp owner's thread now? Sign me up! I was >this< close to buying a 1080Ti when I saw this, and mITX will keep me from the temptation to buy 2.


start one up.


----------



## iamjanco

Going to wait for Volta (or Vega) myself, as I just pushed the button on a second 1080 EK X, which I was keeping an eye on as prices dropped (thanks, early adopters). It'll fit in nicely in the SMA8 build I'm currently working on, and since I'm only a casual gamer and more of a Premiere/After Effects type, it should suit my needs better for the time being.

That said, if I had the cash, I probably would have gone high-end Quadro long ago.


----------



## GnarlyCharlie

Quote:


> Originally Posted by *rcfc89*
> 
> Hope so
> 
> 
> 
> 
> 
> 
> 
> This will be my first Titan.
> 
> As far as the naming goes: the last Titan was simply called Titan X. The P was added by the community to separate it from the Maxwell version. Nvidia is actually calling this one Titan Xp.


Right, but the community will still need a way to differentiate the two - I don't think you could rely on the lower case p vs an upper case P as the sole difference with message board grammar/syntax/punctuation/spelling in effect.


----------



## Zurv

what the what!?
hrmm.. must... not.. buy.. new.. card... ugh...

I really wish they would sell via amazon, i hate getting stuff directly from Nvidia.

We going to need a new block or something? Tell me, Jpmboy! You know all!


----------



## Sheyster

Quote:


> Originally Posted by *pez*
> 
> I mean...I may or may not have ordered one with next morning shipping....


I have to admit, I've been hovering on the buy page for a few hours now...


----------



## Zurv

Quote:


> Originally Posted by *Sheyster*
> 
> I have to admit, I've been hovering on the buy page for a few hours now...


I'm pretty happy with the Titan XPs (2016) I have now... but.. new is new... If the waterblocks are the same, I'm pretty sure I'll break down and buy.









But if I need a new waterblock.. blah









We are expecting the new GPUs at the end of the year, right?


----------



## Jpmboy

Quote:


> Originally Posted by *Zurv*
> 
> what the what!?
> hrmm.. must... not.. buy.. new.. card... ugh...
> 
> I really wish they would sell via amazon, i hate getting stuff directly from Nvidia.
> 
> We going to need a new block or something? Tell me, Jpmboy! You know all!


Waiting for a PM from EK about waterblocks. I think the TXP block will fit the TXp.


----------



## cookiesowns

I feel extremely gypped.. but I still want one. Help me.


----------



## Sheyster

Eff it... Ordered one... Sigh... The peer pressure is just too much around here!









I'm keeping the FTW3's on order for now. This card better be to my liking or it's going back quick.. Just sayin'..


----------



## Zurv

Quote:


> Originally Posted by *Sheyster*
> 
> Eff it... Ordered one... Sigh... The peer pressure is just too much around here!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm keeping the FTW3's on order for now. This card better be to my liking or it's going back quick.. Just sayin'..


just one?!? ^_^


----------



## Sheyster

Proposal for new name: *Titan XP²* ...


----------



## Sheyster

Quote:


> Originally Posted by *Zurv*
> 
> just one?!? ^_^


Yes, I'm getting more conservative in my old age.







(except when it comes to single malt scotch and women)


----------



## xTesla1856

Quote:


> Originally Posted by *Sheyster*
> 
> I have to admit, I've been hovering on the buy page for a few hours now...


Same here buddy, same here


----------



## pez

Quote:


> Originally Posted by *Sheyster*
> 
> Proposal for new name: *Titan XP²* ...


I agree with this. Though raised 2 will be hard on mobile for me







. Maybe Titan XpP.


----------



## xTesla1856

The raised 2 works on mobile if you squint just a bit


----------



## Jpmboy

TXFp (F not equal to Full).


----------



## pez

Why don't we just call it Big Pascal







. #BigPascalMasterRace

Also, my order is confirmed but my order status shows 'Processing'. What is everyone else seeing? NVIDIA doesn't seem to ever do pre-orders, so I really do hope I'll have it tomorrow morning. I ordered next-morning shipping because it means I'll have all weekend to play with it.


----------



## Jpmboy

Quote:


> Originally Posted by *pez*
> 
> Why don't we just call it Big Pascal
> 
> 
> 
> 
> 
> 
> 
> . #BigPascalMasterRace
> 
> Also, my order is confirmed but my order status shows 'Processing'. What is everyone else seeing? NVIDIA doesn't seem to ever do pre-orders, so I really do hope I'll have it tomorrow morning. I ordered next-morning shipping because it means I'll have all weekend to play with it.


yeah, I did "next morning" also.. let's see which morning that is.


----------



## aylan1196

Back to the club, I bought the big brother.
Long live the king, Titan Xp.
R.I.P 1080 Ti, short lived.


----------



## Artah

Quote:


> Originally Posted by *Jpmboy*
> 
> Waiting for a PM from EK about waterblocks. I think the TXP block will fit the TXp.


You get a response on this? I wanted to keep my blocks and just change out my TXP GPUs.


----------



## rt123

Quote:


> Originally Posted by *Jpmboy*
> 
> TXFp (F not equal to Full).












*I APPROVE THIS!!!*

Alternatively, TitanFJenHsung.


----------



## mouacyk

This is funny. NVidia watching OCN's naming conundrum.


Spoiler: Warning: Spoiler!



We got them this time!


----------



## gamingarena

Quote:


> Originally Posted by *Jpmboy*
> 
> same here.


But why? Whoever buys this overpriced fossil at this time is delusional; just get 2x 1080Tis for $200 more and enjoy 50-65% extra performance. Whoever buys this new TXP is out of his mind, period!

Yes, and I own and have owned every single Titan in SLI, including the TXP from the beginning, but this milking has to stop at some point.


----------



## rcfc89

Quote:


> Originally Posted by *gamingarena*
> 
> But why? Whoever buys this overpriced fossil at this time is delusional; just get 2x 1080Tis for $200 more and enjoy 50-65% extra performance. Whoever buys this new TXP is out of his mind, period!
> 
> Yes, and I own and have owned every single Titan in SLI, including the TXP from the beginning, but this milking has to stop at some point.


You must have missed the memo. SLI is dead. Single gpu DX12 is the future.

http://www.babeltechreviews.com/gtx-1080-ti-sli-performance-25-games/3/


----------



## GnarlyCharlie

Quote:


> Originally Posted by *gamingarena*
> 
> But why? Whoever buys this overpriced fossil at this time is delusional; just get 2x 1080Tis for $200 more and enjoy 50-65% extra performance. Whoever buys this new TXP is out of his mind, period!
> 
> Yes, and I own and have owned every single Titan in SLI, including the TXP from the beginning, but this milking has to stop at some point.


I haven't bought any Pascal cards aside from a 1070 that was a gift, and I'm building an ITX rig, so no SLI. It's a hobby, not some crucial decision that the fate of the world rests on. And as several in this thread - and probably others lurking who haven't posted - have already made the purchase, your measured and profound observation concerning our mental state is duly noted.


----------



## gamingarena

Quote:


> Originally Posted by *rcfc89*
> 
> You must have missed the memo. SLI is dead. Single gpu DX12 is the future.
> 
> http://www.babeltechreviews.com/gtx-1080-ti-sli-performance-25-games/3/


Nothing to miss, I'm playing at home, and if you have Titans you should know what you're doing and how to get SLI working where it doesn't.


----------



## jhowell1030

Quote:


> Originally Posted by *gamingarena*
> 
> If you have Titans you should know what you're doing and how to get SLI working where it doesn't


That last statement seems belligerently ignorant to me. SLI utilization is up to the devs. There isn't much that the users can do about that.


----------



## gamingarena

Quote:


> Originally Posted by *jhowell1030*
> 
> That last statement seems belligerently ignorant to me. SLI utilization is up to the devs. There's isn't much that the users can do about that.


Regardless, my point is that at this point in time, with the 1080Ti on the market, there is absolutely zero reason to pay $500 extra for 7% more CUDA cores, period. That is why I called anyone considering the new Titan Xp delusional.
When the "OLD" Titan XP came out it was 35% faster than the 1080, so it made sense, but now it just makes zero sense! SLI makes more sense: even if it works in only 50% of titles, you still get 40-50% extra performance and lose at most 10% in the other half of the games. 1080Ti SLI, that is.


----------



## nycgtr

Quote:


> Originally Posted by *Artah*
> 
> You get a response on this? I wanted to keep my blocks and just change out my TXP GPUs.


They got rid of the DVI port as well on this one. Honestly, it's most likely the 1080Ti PCB with the extra 1GB of memory still there and a full chip. There's no reason why it would be different, given how the 1080Ti reference PCB is actually pretty overbuilt for NV's reference crap PCB standards.


----------



## rt123

Quote:


> Originally Posted by *gamingarena*
> 
> Regardless, my point is that at this point in time, with the 1080Ti on the market, there is absolutely zero reason to pay $500 extra for 7% more CUDA cores, period. That is why I called anyone considering the new Titan Xp delusional.
> When the "OLD" Titan XP came out it was 35% faster than the 1080, so it made sense, but now it just makes zero sense! SLI makes more sense: even if it works in only 50% of titles, you still get 40-50% extra performance and lose at most 10% in the other half of the games. 1080Ti SLI, that is.


It will be more than 10% faster. It has more cores & more ROPs. The memory is up in the air: it may outclock the one on the 1080Ti, or both may end up in the same place after OC.


----------



## EvilPieMoo

Eh, I'm a fan of the Titan line of cards, but the "new" Titan doesn't seem to be anything more than an incremental update rather than a new card. Going by the specs, it looks to barely boast a 10% increase over the Titan/Ti.


----------



## qazplm5089

Do you guys think it's possible that the full chip could be unlocked? Has anything like that ever been done?


----------



## gamingarena

Quote:


> Originally Posted by *rt123*
> 
> It will be more than 10% faster. It has more Cores & more ROPs. Mem is up in the air if it will outclock the one on 1080Ti or both will endup in the same place after OC.


Yes, because more ROPs on the old Titan XP helped against the 1080Ti, right?


----------



## toncij

Well, we all expected the hit 1080Ti did, but Titan Xp has just completely killed the value of old Titan X (Pascal).


----------



## jhowell1030

Quote:


> Originally Posted by *toncij*
> 
> Well, we all expected the hit 1080Ti did, but Titan Xp has just completely killed the value of old Titan X (Pascal).


Yep. That's the only reason I'm salty. They've always held their value well. Not this time around.


----------



## gamingarena

Quote:


> Originally Posted by *jhowell1030*
> 
> Yep. That's the only reason I'm salty. They've always held their value well. Not this time around.


The new Titan Xp didn't do anything to its value; it was already killed by the 1080Ti...


----------



## Sheyster

Quote:


> Originally Posted by *pez*
> 
> Why don't we just call it Big Pascal
> 
> 
> 
> 
> 
> 
> 
> . #BigPascalMasterRace
> 
> Also, my order is confirmed but my order status shows 'Processing'. What is everyone else seeing? NVIDIA doesn't seem to ever do pre-orders, so I really do hope I'll have it tomorrow morning. I ordered next-morning shipping because it means I'll have all weekend to play with it.


I like BigBoiPascal..









I've received 2 emails so far, the second being the order confirmation email. No ETA/ship date yet though.


----------



## toncij

Quote:


> Originally Posted by *gamingarena*
> 
> The new Titan Xp didn't do anything to its value; it was already killed by the 1080Ti...


Well, to an extent, but not exactly. You see, the TXP was still the best and you could find someone to buy them (I sold 3 a few weeks before the 1080Ti though, for about 1200







). But now... it's the same as with the Titan Black - the old Titan is a paperweight.

The only problem is the morons at Nvidia sell them only via their site, which is unreachable in Ireland, Switzerland, Croatia... pretty much all three countries I can buy it in. :facepalm:


----------



## nycgtr

I bought 2. Screw it. I can offload the older version locally fairly easily.


----------



## jhowell1030

Quote:


> Originally Posted by *gamingarena*
> 
> The new Titan Xp didn't do anything to its value; it was already killed by the 1080Ti...


Before the new TXP, no it didn't. Take a look at how former Titan cards' values vs the Tis (Titan Maxwell vs 780Ti is the perfect example) were selling when the new Titan released.


----------



## GnarlyCharlie

Quote:


> Originally Posted by *gamingarena*
> 
> The new Titan Xp didn't do anything to its value; it was already killed by the 1080Ti...


As it has been through every generation since Nvidia started releasing the Ti cards. They did it on the last generation and we had to suffer through commentators such as yourself, they'll do it next generation and we'll suffer it again. I bought my Titan X Maxwells AFTER the 980Ti was released and the commentators declared Titan XM dead - and they still outperformed all but the top of the line (read very expensive) 980Tis. We'll be OK, your concern is duly noted.


----------



## jhowell1030

Quote:


> Originally Posted by *toncij*
> 
> Well, to an extent, but not exactly. You see, the TXP was still the best and you could find someone to buy them (I sold 3 a few weeks before the 1080Ti though, for about 1200
> 
> 
> 
> 
> 
> 
> 
> ). But now... it's the same as with the Titan Black - the old Titan is a paperweight.
> 
> The only problem is the morons at Nvidia sell them only via their site, which is unreachable in Ireland, Switzerland, Croatia... pretty much all three countries I can buy it in. :facepalm:


Yep, that's exactly what I was thinking of. The Titan Black.


----------



## Zurv

Quote:


> Originally Posted by *rcfc89*
> 
> You must have missed the memo. SLI is dead. Single gpu DX12 is the future.
> 
> http://www.babeltechreviews.com/gtx-1080-ti-sli-performance-25-games/3/


that URL test isn't very useful.
Most DX12 games don't work in SLI. It isn't the scaling - it just isn't using the second card, period. DX12 really adds nothing at this stage (other than some more perf for ATI cards.. but the visuals are the same). Hopefully it does at some point. Also, the only real Vulkan game doesn't support multi-GPU (though Vulkan can), but Doom didn't need it.

As someone that only plays in 4K, SLI is a must. Do also note that many games have higher-than-ultra settings that also need more power. SLI is still a must for most games to run in 4K. (Yes, some don't need 2 cards, but the ones with sexy grafix normally do.)

Whatever you want to do is fine. True, if you are playing at 1080p, you are in a good spot.
SLI works great. I'm sick to death of people *****ing saying that it doesn't. Of course most of these people are using low-rez screens, old hardware/OS, or just don't use SLI at all. If a dev gimps something and SLI doesn't work, that is a shame, but games that need SLI and don't support it are much rarer these days than in the past. (I honestly can't think of any new game that needed it but doesn't support it...)

Deus Ex, BF1, GTA 5, Mass Effect: A, NieR, Fallout 4, Witcher 3, Watch Dogs 2, Ghost Recon, Rise of the Tomb Raider, XCOM 2, etc. (just random games that came to mind) would have all been unplayable without SLI for me.
Cost? I don't really care.

The 1080 (non-Ti) totally wasn't powerful enough with only 2-way SLI. The Titan XP (2016) mostly is, and I.. might hold off getting this card.. maybe








That said, if these new cards can get up to 2ghz as i'd expect.. they will be sexy monsters.


----------



## Sheyster

Quote:


> Originally Posted by *toncij*
> 
> Well, we all expected the hit 1080Ti did, but Titan Xp has just completely killed the value of old Titan X (Pascal).


If anyone is that worried about this, they really shouldn't be buying any of the Titan series cards when they're new (used is a different story). Just sayin'...


----------



## Jpmboy

Quote:


> Originally Posted by *gamingarena*
> 
> But why? Whoever buys this overpriced fossil at this time is delusional; just get 2x 1080Tis for $200 more and enjoy 50-65% extra performance. Whoever buys this new TXP is out of his mind, period!
> 
> Yes, and I own and have owned every single Titan in SLI, including the TXP from the beginning, but this milking has to stop at some point.


Pretty easy to tell who's losing it.








why would I buy 1080Tis when I'm writing this from a machine with SLI TXPs? Which sits next to a rig with 2 TXMs.


Quote:


> Originally Posted by *jhowell1030*
> 
> Before the new TXP, no it didn't. Take a look at how former Titan cards' values vs the Tis (Titan Maxwell vs 780Ti is the perfect example) were selling when the new Titan released.


The value comparison always shows up when a new product launches, and the mention of it in a thread discussing the Titan "halo" product series is just stupid. Wrong thread for that argument. If you are here, "value" is not relevant in the first place.
Move on to an AMD thread, please.


----------



## nycgtr

Quote:


> Originally Posted by *Zurv*
> 
> that URL test isn't very useful.
> Most DX12 games don't work in SLI. It isn't the scaling - it just isn't using the second card, period. DX12 really adds nothing at this stage (other than some more perf for ATI cards.. but the visuals are the same). Hopefully it does at some point. Also, the only real Vulkan game doesn't support multi-GPU (though Vulkan can), but Doom didn't need it.
> 
> As someone that only plays in 4K, SLI is a must. Do also note that many games have higher-than-ultra settings that also need more power. SLI is still a must for most games to run in 4K. (Yes, some don't need 2 cards, but the ones with sexy grafix normally do.)
> 
> Whatever you want to do is fine. True, if you are playing at 1080p, you are in a good spot.
> SLI works great. I'm sick to death of people *****ing saying that it doesn't. Of course most of these people are using low-rez screens, old hardware/OS, or just don't use SLI at all. If a dev gimps something and SLI doesn't work, that is a shame, but games that need SLI and don't support it are much rarer these days than in the past. (I honestly can't think of any new game that needed it but doesn't support it...)
> 
> Deus Ex, BF1, GTA 5, Mass Effect: A, NieR, Fallout 4, Witcher 3, Watch Dogs 2, Ghost Recon, Rise of the Tomb Raider, XCOM 2, etc. (just random games that came to mind) would have all been unplayable without SLI for me.
> Cost? I don't really care.
> 
> The 1080 (non-Ti) totally wasn't powerful enough with only 2-way SLI. The Titan XP (2016) mostly is, and I.. might hold off getting this card.. maybe
> 
> 
> 
> 
> 
> 
> 
> 
> That said, if these new cards can get up to 2ghz as i'd expect.. they will be sexy monsters.


How did you get NieR to work with SLI? I have TXP SLI and I think some of the titles you mentioned only don't work if you want heavy AA.


----------



## jhowell1030

Quote:


> Originally Posted by *Jpmboy*
> 
> Pretty easy to tell who's losing it.
> 
> 
> 
> 
> 
> 
> 
> 
> why would I buy 1080Tis when I'm writing this from a machine with SLI TXPs? Which sits next to a rig with 2 TXMs.


ROFLCOPTERSKATES! I just lost it in the office. That's gold.


----------



## toncij

7% faster than Titan X (Pascal), 7% faster than 1080Ti, 50% faster than 1080, 80% faster than Titan X... Nice.


----------



## Sheyster

Quote:


> Originally Posted by *nycgtr*
> 
> I bought 2. Screw it. I can offload the older version locally fairly easily.


Atta boi! Double down FTW!









Seriously though, these cards at these prices aren't for everyone. It is what it is.


----------



## NYU87

Quote:


> Originally Posted by *gamingarena*
> 
> But why? Whoever buys this overpriced fossil at this time is delusional; just get 2x 1080Tis for $200 more and enjoy 50-65% extra performance. Whoever buys this new TXP is out of his mind, period!
> 
> Yes, and I own and have owned every single Titan in SLI, including the TXP from the beginning, but this milking has to stop at some point.


If someone has the money why not? It's not your problem.

Or are you jealous?


----------



## Nitemare3219

Quote:


> Originally Posted by *nycgtr*
> 
> I bought 2. Screw it. I can offload the older version locally fairly easily.


And you are the reason NVIDIA does **** like this.


----------



## nycgtr

Quote:


> Originally Posted by *Sheyster*
> 
> Atta boi! Double down FTW!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Seriously though, these cards at these prices aren't for everyone. It is what it is.


Hopefully the box is different lol.


----------



## Sheyster

Quote:


> Originally Posted by *toncij*
> 
> 7% faster than Titan X (Pascal), 7% faster than 1080Ti, 50% faster than 1080, 80% faster than Titan X... Nice.


You forgot the most important metric:

+3 inches to EPEEN!


----------



## toncij

Quote:


> Originally Posted by *Sheyster*
> 
> You forgot the most important metric:
> 
> +3 inches to EPEEN!


Yeah, that too. I can now return my Quadro P6000s for a full refund...


----------



## nycgtr

Quote:


> Originally Posted by *Nitemare3219*
> 
> And you are the reason NVIDIA does **** like this.


I have my limits too. Like the $1,700 6950X was a no-go for me, as that was past acceptable. They have good offerings at other price ranges too. There are much more expensive hobbies in life.


----------



## rcfc89

Quote:


> Originally Posted by *gamingarena*
> 
> The new Titan Xp didn't do anything to its value; it was already killed by the 1080Ti...


Quote:


> Originally Posted by *Zurv*
> 
> that URL test isn't very useful.
> Most DX12 games don't work in SLI. It isn't the scaling - it just isn't using the second card, period. DX12 really adds nothing at this stage (other than some more perf for ATI cards.. but the visuals are the same). Hopefully it does at some point. Also, the only real Vulkan game doesn't support multi-GPU (though Vulkan can), but Doom didn't need it.
> 
> As someone that only plays in 4K, SLI is a must. Do also note that many games have higher-than-ultra settings that also need more power. SLI is still a must for most games to run in 4K. (Yes, some don't need 2 cards, but the ones with sexy grafix normally do.)
> 
> Whatever you want to do is fine. True, if you are playing at 1080p, you are in a good spot.
> SLI works great. I'm sick to death of people *****ing saying that it doesn't. Of course most of these people are using low-rez screens, old hardware/OS, or just don't use SLI at all. If a dev gimps something and SLI doesn't work, that is a shame, but games that need SLI and don't support it are much rarer these days than in the past. (I honestly can't think of any new game that needed it but doesn't support it...)
> 
> Deus Ex, BF1, GTA 5, Mass Effect: A, NieR, Fallout 4, Witcher 3, Watch Dogs 2, Ghost Recon, Rise of the Tomb Raider, XCOM 2, etc. (just random games that came to mind) would have all been unplayable without SLI for me.
> Cost? I don't really care.
> 
> The 1080 (non-Ti) totally wasn't powerful enough with only 2-way SLI. The Titan XP (2016) mostly is, and I.. might hold off getting this card.. maybe
> 
> 
> 
> 
> 
> 
> 
> 
> That said, if these new cards can get up to 2ghz as i'd expect.. they will be sexy monsters.


I currently have a pair of 980Ti Lightnings. Had a pair of 780Tis before that, and a 690. I'm done with SLI. In some games it works well, but in most it either doesn't work at all or scales terribly. The fact that most DX12 games don't use SLI should be a sign of what's to come.


----------



## Jpmboy

Quote:


> Originally Posted by *nycgtr*
> 
> I have my limits too. Like the $1,700 6950X was a no-go for me, as that was past acceptable. They have good offerings at other price ranges too. *There are much more expensive hobbies in life*.


tell me about it... i think my wife spent that much yesterday getting the horses re-shod and the donkey's hooves clipped.


----------



## Zurv

Quote:


> Originally Posted by *nycgtr*
> 
> How did you nier to work with sli? I have txp sli and I think some of the titles you mentioned only don't work if you want heavy aa


NieR was a pain in the butt








The FAR fix (which I'm sure you are already running);
also, add the NieR exe to the Rise of the Tomb Raider profile.
It worked great











The stats on top left.


Clearly this game should NOT have needed SLI. It isn't much of a looker. But it is always nice to throw horsepower at bad programming









*It will be Mr Jpmboy that will MAKE me spend money.*









If the perf jump is enough over his old 3dmark scores i'll upgrade









But I'm not going to get it for all my systems, just SLI in one; I already have 4 Titan X (Pascal).

Also, for the peeps here with Titan X Pascal cards looking at these new ones: we've had them since launch week a year ago







(shut your mouth.. it was a year ago!







haha.. ok fine.. august.. but .. yeah.. long time ago? )








I was happy with them when I got them (and totally wasn't with the 1080s), and I would have got them again (rather than waiting for the Ti)
... of course I still wish 4-way SLI still worked!!


----------



## Artah

Quote:


> Originally Posted by *nycgtr*
> 
> I have my limits too. Like the $1,700 6950X was a no-go for me, as that was past acceptable. They have good offerings at other price ranges too. There are much more expensive hobbies in life.


I'll trade you a 6950x for an unopened TXp


----------



## nycgtr

Quote:


> Originally Posted by *Jpmboy*
> 
> tell me about it... i think my wife spent that much yesterday getting the horses re-shod and the donkey's hooves clipped.


I have a bad thing for watches. That's most likely the worst hobby. My wife is pretty good about stuff, but she likes to eat, and eating around in NYC gets expensive real quick. This is a much better deal than 2k in restaurant charges in a month that went nowhere but the crapper for me.


----------



## Sheyster

Quote:


> Originally Posted by *Jpmboy*
> 
> tell me about it... i think my wife spent that much yesterday getting the horses re-shod and the donkey's hooves clipped.


Okay, so now I have to ask.. WTH do you have a donkey?!







Lots of folks around here have horses but I have not seen a donkey since the last time I was in TJ (Mexico).


----------



## Artah

Quote:


> Originally Posted by *Sheyster*
> 
> Okay, so now I have to ask.. WTH do you have a donkey?!
> 
> 
> 
> 
> 
> 
> 
> Lots of folks around here have horses but I have not seen a donkey since the last time I was in TJ (Mexico).


Never seen it but I heard about those donkeys







.


----------



## jcde7ago

I hate you Nvidia, just a bit more than I hate myself...

- Sold 1 of 2x 2016 TXPs for $1K last week, not a bad return considering the existence of the 1080 Ti.
- Have not RMA'd the second/dead 2016 TXP yet even though it's under warranty for another 2+ years...wonder if Nvidia will give me the new version instead? Doubt it.








- Bought 2x 1080 TIs just last week.
- Bought 2x 2017 TXPs today, will be returning the 1080 TIs asap since they were poor overclockers anyways.

Kill meeeeeeeeee.


----------



## Menthol

Wild burros are a common sight in SoCal; they come down from the hills into my neighborhood to feed on vegetation all the time.


----------



## Glerox

A new titan?!? Seriously, ***!!! OMG NVIDIA WHYYYYYY ARE YOU DOING THIS TO ME!


----------



## Zurv

Quote:


> Originally Posted by *jcde7ago*
> 
> I hate you Nvidia, just a bit more than I hate myself...
> 
> - Sold 1 of 2x 2016 TXPs for $1K last week, not a bad return considering the existence of the 1080 Ti.
> - Have not RMA'd the second/dead 2016 TXP yet even though it's under warranty for another 2+ years...wonder if Nvidia will give me the new version instead? Doubt it.
> 
> 
> 
> 
> 
> 
> 
> 
> - Bought 2x 1080 TIs just last week.
> - Bought 2x 2017 TXPs today, will be returning the 1080 TIs asap since they were poor overclockers anyways.
> 
> Kill meeeeeeeeee.


I feel your pain, sir. I bought 8!! (two 4-way SLI setups) 1080s at launch. (This was two weeks before they changed their mind about "unlocking" SLI and made the max 2. 2-way 1080 sucks for 4K; my old 4-way Titan X was faster.)

I also got 2 1080 Classifieds a month before the Ti...

But nice sale on the TXP. Where did you sell it? Clearly not Amazon. Holy cow, used ones are cheap on there now.. ugh..


----------



## Sheyster

Quote:


> Originally Posted by *Artah*
> 
> Never seen it but I heard about those donkeys
> 
> 
> 
> 
> 
> 
> 
> .


Yeah, they're not a myth, unfortunately for my eyes.


----------



## alucardis666




----------



## Jpmboy

Quote:


> Originally Posted by *Sheyster*
> 
> Okay, so now I have to ask.. WTH do you have a donkey?!
> 
> 
> 
> 
> 
> 
> 
> Lots of folks around here have horses but I have not seen a donkey since the last time I was in TJ (Mexico).


keeps the horses company... no really. Besides, he's a pisser!


Spoiler: Warning: Spoiler!


----------



## kx11

looks like this thread got 6 more months of life in it


----------



## Sheyster

Quote:


> Originally Posted by *Menthol*
> 
> Wild burros are a common sight in SoCal; they come down from the hills into my neighborhood to feed on vegetation all the time.


Come down from the hills of TJ?


----------



## alucardis666

Quote:


> Originally Posted by *kx11*
> 
> looks like this thread got 6 more months of life in it


Especially since a mod killed my thread because they're uneducated about the new card. If you guys buying the Titans could PM them for me, I'd appreciate it.

http://www.overclock.net/t/1627390/2017-nvidia-titan-xp-owners-thread


----------



## nycgtr

Quote:


> Originally Posted by *Artah*
> 
> I'll trade you a 6950x for an unopened TXp


Well I got rid of my 5960x and went Ryzen 1800x. I am waiting on x299 to buy the next extreme chip. At this point I wouldn't reinvest in x99 but thanks for the offer.

Quote:


> Originally Posted by *alucardis666*


Paid so much less in tax than me. $213 in tax for 2 here.


----------



## Sheyster

Quote:


> Originally Posted by *Jpmboy*
> 
> keeps the horses company... no really. Besides, he's a pisser!
> 
> 
> Spoiler: Warning: Spoiler!


LOL nice! Always good for the pets to have a pet of their own.


----------



## alucardis666

Quote:


> Originally Posted by *nycgtr*
> 
> Paid so much less in tax than me. 213 in tax for 2 here.


Ouch, where do you live?


----------



## nycgtr

Quote:


> Originally Posted by *alucardis666*
> 
> Ouch, where do you live?


new york city


----------



## alucardis666

Quote:


> Originally Posted by *nycgtr*
> 
> new york city


And I thought Chicago was bad.


----------



## Menthol

Quote:


> Originally Posted by *Sheyster*
> 
> Come down from the hills of TJ?


The hills around Moreno Valley, but they're all over Riverside and San Bernardino counties. The last job I worked on was the Metrolink line from Riverside to Perris; every time bushes and trees were planted at the train stations, the burros would feast on them that night. It was quite the conundrum.


----------



## mbze430

Anyone else selling their Titan X Pascal for the Xp???

Cause I am selling 2 Titan X Pascals!


----------



## nycgtr

Quote:


> Originally Posted by *mbze430*
> 
> Anyone else selling their Titan X Pascal for the Xp???


I am sure there are. What price are you looking for?


----------



## Artah

Quote:


> Originally Posted by *mbze430*
> 
> Anyone else selling their Titan X Pascal for the Xp???


You buying them? I'm thinking about it. I have EK blocks + backplates.


----------



## nycgtr

Quote:


> Originally Posted by *Artah*
> 
> you buying them? I'm thinking about it. I have ek blocks + backplates.


The backplates and blocks should fit, IMO. I really don't see why not. Unless you want that single-slotter.


----------



## pez

Quote:


> Originally Posted by *Sheyster*
> 
> I like BigBoiPascal..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've received 2 emails so far, the second being the order confirmation email. No ETA/ship date yet though.


Same. I'm just patiently waiting on the shipping one...pls NVIDIA







Quote:


> Originally Posted by *gamingarena*
> 
> New Titan XP didn't do anything to its value, it was already killed by the 1080 Ti...


Quote:


> Originally Posted by *Jpmboy*
> 
> Pretty easy to tell who's losing it.
> 
> 
> 
> 
> 
> 
> 
> 
> why would I buy 1080tis when I'm writing this email from a machine with SLI TXPs? Which sits next to a rig with 2 TXMs.
> 
> 
> The value comparison always shows up when a new product launches, and just the mention of it in a thread discussing the Titan "halo" product series is just stupid. Wrong thread for that argument. If you are here, "value" is not relevant in the first place.
> move on to an AMD thread please.


Thank you for saying this. Nail. On. The. Head.
Quote:


> Originally Posted by *mbze430*
> 
> Anyone else selling their Titan X Pascal for the Xp???


Mine will be listed tomorrow night assuming that these do ship today and I have mine in the morning. Feel free to PM me.


----------



## Artah

Quote:


> Originally Posted by *nycgtr*
> 
> The backplates and block should fit imo. I really don't see why not. Unless you want that single slotter.


I bought 1080 Tis and saw the motherboard upgrades but never opened them to look. I'll be giving them back to get the TXp x2.


----------



## nycgtr

Quote:


> Originally Posted by *Artah*
> 
> I bought 1080 tis and seen the motherboard upgrades but never opened them to look, I'll be giving them back to get the TXp x2


I got a TXP block on a ref 1080 Ti and it fit fine. This shouldn't be any different.


----------



## mbze430

Quote:


> Originally Posted by *nycgtr*
> 
> I am sure there are. What price are you looking for.


Quote:


> Originally Posted by *Artah*
> 
> you buying them? I'm thinking about it. I have ek blocks + backplates.


Quote:


> Originally Posted by *pez*
> 
> Same. I'm just patiently waiting on the shipping one...pls NVIDIA
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Thank you for saying this
> 
> 
> 
> 
> 
> 
> 
> . Nail. On. The. Head.
> Mine will be listed tomorrow night assuming that these do ship today and I have mine in the morning. Feel free to PM me.


No, I just bought two Titan Xp, so I am going to sell mine

Just wondering if I should sell them on here or just on eBay.


----------



## Artah

Quote:


> Originally Posted by *nycgtr*
> 
> I got a TXP block on a ref 1080 Ti and it fit fine. This shouldn't be any different.


Thanks, I hope that's true. I'll be replacing my cards tomorrow if "next morning" actually means Friday morning Pacific Time.


----------



## Glerox

Does somebody know if the 2016 Titan x Pascal EKWB waterblock will fit on the 2017 Titan Xp?


----------



## Menthol

Is the new driver that was released today the only driver that will work on the new card? Or is it needed for the Win 10 Creators Update?
When downloaded from Nvidia it was the only driver that showed for Win 10 without searching deeper.


----------



## Jbravo33

Not sure where to post; the other thread got locked, not sure why. Would hate to have to sift through 700-plus pages.
Regardless, I'm here.


----------



## Sheyster

Quote:


> Originally Posted by *Menthol*
> 
> The hills around Moreno Valley, but they're all over Riverside and San Bernardino counties. The last job I worked on was the Metrolink line from Riverside to Perris; every time bushes and trees were planted at the train stations, the burros would feast on them that night. It was quite the conundrum.


Cool, I'd heard about some wild goats out there, but the coyotes probably keep their population well in check. I'll keep an eye out for them burros, but they may not get this far south. I'm 30 miles from the border.


----------



## Artah

Quote:


> Originally Posted by *Glerox*
> 
> Does somebody know if the 2016 Titan x Pascal EKWB waterblock will fit on the 2017 Titan Xp?


@nycgtr put TXP blocks on 1080 Ti so most likely it will fit. No guarantees but I might find out tomorrow for myself.


----------



## jsutter71

If anyone is interested, there is a bug with RivaTuner and the newest Nvidia driver that was released today. The bug prevents some programs from opening. Before I uninstalled RivaTuner, I saw that Aquasuite refused to open with an error message, and PCMark 8 would not respond. To verify the problem I uninstalled RivaTuner, and my programs returned to normal. I then reinstalled RivaTuner and had the same problem again. Afterburner is not affected. This might be a combination of issues between Nvidia's drivers and the Windows 10 Creators Update, since I did a fresh install of that this morning.


----------



## alucardis666

Anyone gotten shipping info on their cards yet?


----------



## Jpmboy

Quote:


> Originally Posted by *Menthol*
> 
> Is the new driver that was released today the only driver that will work on the new card? Or is it needed for the Win 10 Creators Update?
> When downloaded from Nvidia it was the only driver that showed for Win 10 without searching deeper.


menthol.. you get 4 of these?


----------



## Dagamus NM

Not this time Gear Acquisition Syndrome. Get out of my head.

Holding out for Titan Volta. 256 more cores and some memory tweaks are not worth the hassle.

This is pretty hilarious to me.


----------



## Zurv

Quote:


> Originally Posted by *Dagamus NM*
> 
> Not this time Gear Acquisition Syndrome. Get out of my head.
> 
> Holding out for Titan Volta. 256 cores and some memory tweaks are not worth the hassle.
> 
> This is pretty hilarious to me.


what is the guess for Volta? end of the year?


----------



## alucardis666

Got our thread back! For those buying the new Titan, I'll see you there!

http://www.overclock.net/t/1627390/2017-nvidia-titan-xp-owners-thread/10#post_25991607


----------



## Menthol

Quote:


> Originally Posted by *Jpmboy*
> 
> menthol.. you get 4 of these?


Only 2 at this time, who will get theirs first?


----------



## jcde7ago

Quote:


> Originally Posted by *Zurv*
> 
> what is the guess for Volta? end of the year?


End of year is highly doubtful for consumer gaming cards but we may see Quadros out by then. I'm guessing a solid 10-12 months from now until high-end Volta GTX cards are out and 16+ months for a high-end Ti/Titan variant.


----------



## Jpmboy

Quote:


> Originally Posted by *Menthol*
> 
> Only 2 at this time, who will get theirs first?


lol ... only 2. I'll prob pick up a second after seeing how it performs. I was hoping you'd get 4 and cause stinky to visit the euthanasia clinic.


----------



## alucardis666

Why's it feel like no one here is paying any attention to me...


----------



## EvilPieMoo

Who is this new Titan even marketed towards? Titans have never been known to be good value for money, but $500 more for maybe 10% more performance over a TX/Ti is awful.

I get the feeling Nvidia released this just to see if people would actually pay for it and is now crying with laughter every time somebody buys one.


----------



## kx11

I ordered 1 and will replace the current TXP with the new TXp

Same old EK block and everything else. I'll try to sell the old GPU at a cheap price, since I don't have anything for it but the chip itself: no blower fan, etc.


----------



## jhowell1030

Quote:


> Originally Posted by *jcde7ago*
> 
> End of year is highly doubtful for consumer gaming cards but we may see Quadros out by then. I'm guessing a solid 10-12 months from now until high-end Volta GTX cards are out and 16+ months for a high-end Ti/Titan variant.


Although I don't think it will be available by the end of the year, I think 10-12 months is way too far out.


----------



## Sheyster

Quote:


> Originally Posted by *EvilPieMoo*
> 
> Who is this new Titan even marketed towards?


A: Those who want the best regardless of cost.

Most of the buyers will be dumping their TXP 1.0's, so the out-of-pocket cost won't be as much as you'd think.


----------



## Artah

Quote:


> Originally Posted by *Jpmboy*
> 
> lol ... only 2. I'll prob pick up a second after seeing how it performs. I was hoping you'd get 4 and cause stinky to visit the euthanasia clinic.


I went with two; it performs much better than a 1080 Ti, or bust!


----------



## Dagamus NM

Quote:


> Originally Posted by *Zurv*
> 
> what is the guess for Volta? end of the year?


That would be ideal. Volta chips go to driverless car stuff first I believe.

I will get Titan Voltas for at least two builds this year or whenever they release.


----------



## mbze430

when Volta comes out.... TAKE MY MONEY TIMES 2! lol

The extra cores on the new Xp should make it run faster than the 1080 Ti...


----------



## Menthol

Quote:


> Originally Posted by *Jpmboy*
> 
> lol ... only 2. I'll prob pick up a second after seeing how it performs. I was hoping you'd get 4 and cause stinky to visit the euthanasia clinic.


If that would do it buddy I would have 5 just to make sure


----------



## nycgtr

Quote:


> Originally Posted by *mbze430*
> 
> when Volta comes out.... TAKE MY MONEY TIMES 2! lol
> 
> The extra cores on the new Xp should run faster than the 1080TI......


Too bad monitors aren't catching up.


----------



## mbze430

Didn't they have a 4K HDR 120Hz monitor announced at CES? And the 8K from Dell?

That's what I am looking forward to: 4K HDR @ 120Hz, but 32"+.


----------



## nycgtr

Quote:


> Originally Posted by *mbze430*
> 
> Didn't they have 4k HDR 120hz monitor announce at CES? and the 8k from Dell


The 8K monitor is $5k. Yeah, no thanks. The 144Hz one is $2k and 27 inches. I don't mind the $2k price tag; I mind the 27-inch size for 4K. That's crap.


----------



## jcde7ago

Quote:


> Originally Posted by *EvilPieMoo*
> 
> Who is this new Titan even marketed towards? Titans have never been known to be good value for money, but $500 more for maybe 10% more performance over a TX/Ti is awful.
> 
> I get the feeling Nvidia released this just to see if people would actually pay for it and is now crying with laughter every time somebody buys one.


Previous Titan X Pascal 1.0 owners are who this is aimed at... I mean, it's pretty obvious NOW that this was Nvidia's plan all along, even if it's a complete shock to most people that another Pascal-based Titan X was released. But as was mentioned, many people are going to offload Titan XP 1.0 or 1080 Ti cards to help cover the cost, so the price difference isn't absolutely monumental (and even then, cost is relative for everyone).

Nvidia knows they've shafted us original Titan X Pascal owners and even some 1080 Ti owners by releasing this, but they also knew that they weren't going to leave the "Titan" name to be de-throned by a Ti card so the balance was always going to be restored.

The 1080 Ti, which most consumers have now come around to accepting as a "really good value because it's a beast of a performer" card at a whopping $700, is even MORE of a value now compared to the new Titan Xp, which carries a $500 premium over it. That just means more 1080 Ti sales for Nvidia from anyone not wanting an Xp, but either way, that person is probably buying an Nvidia card of some sort regardless. It's a brutal strategy by Nvidia, but AMD can't compete, so this is what we get, and they're taking advantage of it because nothing is forever. Can't blame them, really... no one is forcing anyone to buy Nvidia cards, but here we are.


----------



## Jpmboy

Quote:


> Originally Posted by *Menthol*
> 
> If that would do it buddy I would have 5 just to make sure


yeah, I have to admit the thought did cross my mind... but reason got the better of me.

Quote:


> Originally Posted by *jcde7ago*
> 
> Previous Titan X Pascal 1.0 owners is who this is aimed at....I mean, it's pretty obvious NOW that this was Nvidia's plan all along, even if it's a complete shock to most people that another Pascal-based Titan X was released. But as was mentioned many people are going to offload Titan XP 1.0 or 1080 Ti cards to help cover the costs so the price difference isn't absolutely monumental (and even then, cost is relative for everyone).
> 
> Nvidia knows they've shafted us original Titan X Pascal owners and even some 1080 Ti owners by releasing this, but they also knew that they weren't going to leave the "Titan" name to be de-throned by a Ti card so the balance was always going to be restored.
> 
> The 1080 Ti, which most consumers have now come around to accepting as a "really good value because it's a beast of a performer" card at a whopping $700 is even MORE of a value now compared to the new Titan Xp which has a $500 premium over it...that just means more 1080 Ti sales for Nvidia for anyone not wanting an Xp, but either way, that person is probably buying an Nvidia card of some sort regardless. It's a brutal strategy by Nvidia but AMD can't compete, so this is what we get and they're taking advantage of it because nothing is forever. Can't blame them, really...no one is forcing anyone to buy Nvidia cards, but here we are.


Just for accuracy, I'm not seeing the Ti being faster than the TXP (the new one is called the "TXFp")


----------



## NemChem

Glad I just got this:


----------



## toncij

Quote:


> Originally Posted by *NemChem*
> 
> Glad I just got this
> 
> 
> 
> 
> 
> 
> 
> :


But why?


----------



## Baasha

Quote:


> Originally Posted by *pez*
> 
> Yeah, I couldn't find anything on Google for it either. Outside of some site predicting it happening in September
> 
> 
> 
> 
> 
> 
> 
> . Inb4 Baasha orders 4 of these
> 
> 
> 
> 
> 
> 
> 
> .


You know it!

Titan XP vs. Titan Xp 4-way SLI showdown... coming soon.


----------



## jcde7ago

Quote:


> Originally Posted by *toncij*
> 
> But why?


Looks like a return of 2x 1080 Ti FEs for maybe 2x Titan Xps? That GBP converts to ~$1,470.00 USD.


----------



## Lee0

Quote:


> Originally Posted by *nycgtr*
> 
> The 8k monitor is 5k. Yea no thanks. The 144hz is 2k and 27inches. I don't mind 2k price tag. I mind the 27inch size for 4k. That's crap.


Why do you think that 4k 27" is crap?


----------



## toncij

Quote:


> Originally Posted by *nycgtr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mbze430*
> 
> Didn't they have 4k HDR 120hz monitor announce at CES? and the 8k from Dell
> 
> 
> 
> The 8k monitor is 5k. Yea no thanks. The 144hz is 2k and 27inches. I don't mind 2k price tag. I mind the 27inch size for 4k. That's crap.

Quote:


> Originally Posted by *Lee0*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nycgtr*
> 
> The 8k monitor is 5k. Yea no thanks. The 144hz is 2k and 27inches. I don't mind 2k price tag. I mind the 27inch size for 4k. That's crap.
> 
> 
> 
> Why do you think that 4k 27" is crap?

4K @ 27" is ideal


----------



## nycgtr

Quote:


> Originally Posted by *Lee0*
> 
> Why do you think that 4k 27" is crap?


I own a 32-inch 4K monitor at the moment, and I have a 40-inch 4K as well. I had a 30-inch Dell 3007FPW-HC when that came out, what, 10 years ago? Using a 27-inch 4K is painful. I have one at work and I have to scale up the UI, which isn't fun and seriously not something I would game on. I have 21:9 34-inchers as well. 27 is just painful for 4K. Honestly, at that size I'd just stick to 1440p.

Maybe if I was in a dorm room or a very small apartment. Again, the smaller the screen, the harder it is to see the DPI difference.


----------



## xTesla1856

Well guys it's official:

In for a Titan X BigBoiPascal









Sold my TXP(leb) locally for four figures, so I'm pretty stoked.


----------



## Lee0

Quote:


> Originally Posted by *nycgtr*
> 
> I own a 32inch 4k monitor at the moment and I have a 40inch 4k as well. I had a 30inch dell 3007fpw-hc when that came out what 10 years ago? Using a 27inch 4k is painful. I have one at work and I have to scale up the UI which isn't fun and seriously not something I would game on. I have 21;9 34inchers as well. 27 is just painful for 4k. Honestly, at that size I'd just stick to 1440p.


I can't exactly see your point at all, but I'm going to be subjective, of course, since a 4K 27" monitor *is* exactly what I game on. I haven't had a single problem with GUI scaling, and you didn't mention a reason not to game on it (I'm assuming you meant GUI scaling was only a work problem).
My reasoning for liking my monitor is that I tend to sit quite close to it. A bigger monitor would force me to sit further away, and sitting as close to the screen as I do requires a higher pixel density (PPI), since otherwise it would just look bad.
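For anyone curious, the density point is easy to put numbers on with the standard diagonal-PPI formula; this is just a quick sketch, nothing specific to any particular monitor in this thread:

```python
# Pixels per inch (PPI) from a resolution and a diagonal screen size.
# Standard formula: diagonal pixel count divided by diagonal inches.
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density for a panel of the given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 4K at 27" vs 32", plus 1440p at 27" for comparison
print(round(ppi(3840, 2160, 27)))  # ~163 PPI
print(round(ppi(3840, 2160, 32)))  # ~138 PPI
print(round(ppi(2560, 1440, 27)))  # ~109 PPI
```

So 4K at 27" is roughly 163 PPI versus about 138 PPI at 32", which is the density gap being argued about here.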


----------



## Lee0

Quote:


> Originally Posted by *xTesla1856*
> 
> Well guys it's official:
> 
> In for a Titan X BigBoiPascal
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sold my TX*P(leb)* locally for four figures, so I'm pretty stoked.


Hehe


----------



## nycgtr

Quote:


> Originally Posted by *Lee0*
> 
> I can't exactly see your point at all but I'm going to be subjective ofc since a 4k 27" monitor *is* exactly what I use to game on. I haven't had a single problem with GUI scaling and you didn't mention a reason why not to game on it (I'm assuming that you meant GUI scaling was only a work problem).
> My reasoning why I like my monitor is because I tend to sit quite close to it. A bigger monitor would force me to sit further away and not only that, sitting as close to the screen as I do requires a higher pixel density (ppi) since otherwise it would just look bad.


For me it's the immersion factor. My desk isn't that small. A 27-incher is just not immersive; it's almost like watching a 40-inch TV from a couch 10 feet away. Some people can do it, but I can't. I'm at work right now looking at a 27-inch screen and all I see are bezels.

As for gaming, some games I play have UIs that won't scale well. I can't even read the HUD well on a 40-incher, let alone a 27.


----------



## mbze430

I personally want 3x 32" or 35" 4K @ 120Hz HDR monitors. I have a 49" KS8000 4K HDR TV at my primary PC location as well. I prefer the bigger screen to the 32" 4K that is sitting on my desk; however, I'd like a surround setup once those 4K @ 120Hz HDR monitors come out.

I sold 2 of the 32" 4Ks and am now holding out for those 4K @ 120Hz HDRs.


----------



## Lee0

Quote:


> Originally Posted by *nycgtr*
> 
> For me it's the immersion factor. My desk isn't that small. A 27 incher is just not immersive. It's almost like watching a 40inch tv on your couch that's 10 ft away. Some people can do it but I can't. I am at work right now looking at a 27inch screen and all i see are bezels


Then you definitely wouldn't be happy looking at my screen at all; it's bezel-less, so you wouldn't see a thing at all.


----------



## nycgtr

Quote:


> Originally Posted by *Lee0*
> 
> Then you definitely wouldn't be happy to look at my screen at all; it's bezels less - you wouldn't see a thing at all.


It's one of those things where I feel that once you go bigger, it's hard to go back. Back when I got a 24-inch widescreen (back when these things were like 800 dollars), I was like OMG THIS THING IS MASSIVE. As screen sizes have gone up and the average screen size for most consumers is in the 20s, it's not so amazing. 32 inches, IMO, is really the smallest a 4K monitor should be.


----------



## NemChem

Quote:


> Originally Posted by *toncij*
> 
> But why?


Return of Titan XP (3584 core) I bought at end of Feb







.


----------



## Sheyster

Quote:


> Originally Posted by *nycgtr*
> 
> It's one of those things I feel that once you go bigger it's hard to go back.


TWSS...


----------



## nycgtr

Quote:


> Originally Posted by *Sheyster*
> 
> TWSS...


I hope that isn't from personal experience


----------



## Clint Black

Quote:


> Originally Posted by *nycgtr*
> 
> 32inch imo is really the smallest a 4k monitor should be.


32 should be the biggest size for a 4K monitor; otherwise do yourself a favor and get a lower-resolution monitor,


----------



## Asus11

Quote:


> Originally Posted by *NemChem*
> 
> Return of Titan XP (3584 core) I bought at end of Feb
> 
> 
> 
> 
> 
> 
> 
> .


buying the new shiny?


----------



## Sheyster

Personally I just want to see an ultra-wide 35" 21:9 curved 144 Hz 1440P HDR IPS G-Sync monitor please!


----------



## toncij

Quote:


> Originally Posted by *Clint Black*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nycgtr*
> 
> 32inch imo is really the smallest a 4k monitor should be.
> 
> 
> 
> 32 should be the biggest monitor for 4k, otherwise do yourself a favor and get lower resolution monitor
> 
> 
> 
> 
> 
> 
> 
> you don't need 4k.

No. You need 4K at 27". In desktop and regular usage, as with games, you benefit from the fantastic density.

Saying that 4K is irrelevant if scaled can come only from someone who has never used a 4K display.


----------



## Lee0

Quote:


> Originally Posted by *nycgtr*
> 
> It's one of those things I feel that once you go bigger it's hard to go back. Back when I got a 24inch wide (back when these things were like 800 dollars) I was like OMG THIS THING IS MASSIVE. As screen sizes have gone up and the average screen size for most consumers is in the 20s it's not so amazing. 32inch imo is really the smallest a 4k monitor should be.


That kind of feeling isn't only limited to screen size; it applies to everything. Once you go 1080p you can never go back to 720p, and the same goes for 1080p to 4K. People say the same about monitor refresh rates. But just because one thing is better than the other in one instance doesn't mean it is in every situation. E.g., a 1080p TV will be almost as good as a 4K TV at a long viewing distance (as long as the screen size isn't too big for 1080p).


----------



## Jbravo33

Quote:


> Originally Posted by *Sheyster*
> 
> Personally I just want to see an ultra-wide 35" 21:9 curved 144 Hz 1440P HDR IPS G-Sync monitor please!


samezies


----------



## NemChem

Quote:


> Originally Posted by *Asus11*
> 
> buying the new shiny?


Tempted... got 3x 1080 Ti FE just before I sent the Titan back. I use them for training neural nets, and it's a lot more convenient being able to try three different sets of hyperparameters at the same time! This PC was paid for with gains from AMD stock (so it's kinda funny it has Nvidia graphics cards), and I'm still a student. If I'm lucky and finances change I might go for some full-fat Titan Xp cards, but 11 GB is enough for what I'm doing at the moment, and I expect the Titan XV will be out by the time I have a job in the field!
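For what it's worth, the "one hyperparameter set per card" setup usually comes down to pinning each training process to its own GPU with CUDA_VISIBLE_DEVICES. A minimal sketch, with the caveat that the `train.py` script and its `--lr` flag below are hypothetical placeholders (the demo launches harmless `python -c` commands instead so it runs anywhere):

```python
# Sketch: one training job per GPU, pinned via CUDA_VISIBLE_DEVICES.
import os
import subprocess
import sys

def gpu_env(gpu_id):
    """Copy of the current environment that exposes only GPU `gpu_id` to the child."""
    return dict(os.environ, CUDA_VISIBLE_DEVICES=str(gpu_id))

def launch_per_gpu(commands):
    """Start each command pinned to its own GPU index; returns the Popen handles."""
    return [subprocess.Popen(cmd, env=gpu_env(i)) for i, cmd in enumerate(commands)]

# Real usage would look something like (hypothetical script and flag):
#   sweeps = [["python", "train.py", "--lr", lr] for lr in ("1e-3", "3e-4", "1e-4")]
# Demo with placeholder commands:
procs = launch_per_gpu([[sys.executable, "-c", "print('job')"] for _ in range(3)])
exit_codes = [p.wait() for p in procs]
```

Each child process then sees a single device as `cuda:0`, so the training script itself needs no device-selection logic.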


----------



## Clint Black

Quote:


> Originally Posted by *toncij*
> 
> No. You need 4K at 27". In desktop and regular usage as with games you benefit from the fantastic density.
> 
> Saying that 4K is irrelevant if scaled can come only from someone who never used a 4K display.


I know, that's why I am saying 32 inches can be my max for a monitor. I currently have 4K @ 28" and it's enough.
Anything over 32 inches is too big and would need 8K to enjoy it.


----------



## Lee0

Quote:


> Originally Posted by *toncij*
> 
> No. You need 4K at 27". In desktop and regular usage as with games you benefit from the fantastic density.
> 
> Saying that 4K is irrelevant if scaled can come only from someone who never used a 4K display.


I don't understand who you're agreeing or disagreeing with. Could you please clarify? Sorry.


----------



## xTesla1856

Is there a new owners club yet?


----------



## Lee0

Quote:


> Originally Posted by *xTesla1856*
> 
> Is there a new owners club yet?


Yupp. Go back 2 pages or something.


----------



## Zurv

Might I suggest picking up an LG OLED 4K TV to go along with your new Titan Xp? (It is sooooo gwed... you know I mean it because I spelled good that way...)


----------



## NemChem

Quote:


> Originally Posted by *xTesla1856*
> 
> Is there a new owners club yet?


http://www.overclock.net/t/1627390/official-2017-nvidia-titan-xp-owners-thread


----------



## Lee0

This guy (whether you like him or not, I know he's hated by some) explained what Nvidia did in a really good and objective way. I recommend watching it if you are/were uninformed like me.


----------



## mbze430

65" LG E6.... its soooooooo gewd


----------



## Jpmboy

Quote:


> Originally Posted by *mbze430*
> 
> 65" LG E6.... its soooooooo gewd


lol - this is on my buy list (just need to get my current Pioneer Elite up to the fishing "lodge" to make room). But the new B6 may be the better choice (unless you go G6).
peel and stick 65" 4K OLED.


----------



## Sheyster

Quote:


> Originally Posted by *nVidia*
> 
> Your Order Has Shipped
> 
> Dear XXXXX,
> 
> Thank you for ordering from NVIDIA on April 6, 2017. The following product(s) have shipped. If you paid by credit card, your credit card has now been charged.












Shipped from: CIRCLE PINES, MN USA


----------



## Zurv

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - this is on my buy list (just need to get my current Pioneer Elite up to the fishing "lodge" to make room). But the new B6 may be the better choice (unless you go G6).
> peel and stick 65" 4K OLED.


I have the G6... and I'd suggest anyone getting a new screen now should get the C7 (which I also have). Unlike the 2016 units, all 2017 models have the same SoC (the G6 was a stunning waste of money over the E6).
The C7 is also 20ms in gaming (HDR gaming too), vs. 33ms for the 2016 models.


----------



## jcde7ago

Quote:


> Originally Posted by *Sheyster*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Shipped from: CIRCLE PINES, MN USA


GG...I ordered around noon Pacific time and no shipping email yet!


----------



## freitz

So I hope everyone saw this today.

http://www.kitguru.net/components/graphic-cards/matthew-wilson/nvidia-has-stealthily-replaced-the-titan-x-with-an-upgraded-titan-xp/

The new TITAN Xp... As a Titan X owner I am extremely pissed off; not sure how the rest of you feel. They need to do an upgrade program.


----------



## Sheyster

Quote:


> Originally Posted by *freitz*
> 
> So I hope everyone saw this today.
> 
> http://www.kitguru.net/components/graphic-cards/matthew-wilson/nvidia-has-stealthily-replaced-the-titan-x-with-an-upgraded-titan-xp/
> 
> The new TITAN Xp... As a titan X owner I am extremely pissed off not sure how the rest of you feel. They need to do an upgrade program.


Wait a minute you're a Titan X MAXWELL owner right? Why are you mad???


----------



## freitz

Quote:


> Originally Posted by *Sheyster*
> 
> Wait a minute you're a Titan X MAXWELL owner right? Why are you mad???


I haven't updated my Sig. I am a pascal owner as well.


----------



## Jpmboy

Quote:


> Originally Posted by *Zurv*
> 
> I have the G6... and i'd suggest anyone getting a new screen now should get the c7 (which i also have). Unlike the 2016 units, all 2017 models have the same SoC. (g6 was was stunning waste of money of the e6
> 
> 
> 
> 
> 
> 
> 
> )
> the c7 is also 20ms in gaming (HDR gaming too) (vs 33 for the 2016 models)


Thanks for the heads-up bro! I wouldn't be using it for gaming, just movies etc. I'll check the c7....
http://4k.com/

... just received note from NV. card shipped.


----------



## Sheyster

Quote:


> Originally Posted by *freitz*
> 
> I haven't updated my Sig. I am a pascal owner as well.


Well then you should be used to this by now! We're just now coming back full circle to the Titan Black days is all. Get those older Titan X's sold quick, son!


----------



## freitz

Quote:


> Originally Posted by *Sheyster*
> 
> Wait a minute you're a Titan X MAXWELL owner right? Why are you mad???


Fixed as fast as I could. It's been a while since I ventured onto OCN. Just really upset about this.

There is a thread going on Nvidia forums I would suggest following.

https://forums.geforce.com/default/topic/1002821/should-nvidia-offer-an-upgrade-program-/?offset=5#5121972


----------



## Jpmboy

Quote:


> Originally Posted by *freitz*
> 
> fixed as fast as I could. Its been a while since I ventured on OCN. Just really upset about this.
> 
> There is a thread going on Nvidia forums I would suggest following.
> 
> https://forums.geforce.com/default/topic/1002821/should-nvidia-offer-an-upgrade-program-/?offset=5#5121972


Why are you upset? By that rationale, any product/process improvement should halt. Early yields on this die were very poor. Production is stabilizing, hence a consumer-grade P6000.


----------



## freitz

Quote:


> Originally Posted by *Jpmboy*
> 
> why are you upset? By that rationale any product/process improvement should halt. Early yields on this die were very poor. Production is stabilizing, hence consumer-grade p6000.


As an owner of every Titan they have made, of course I am upset. I bought Titans thinking I'd at least have a year of GPU dominance... I was a little late this gen, buying in November. Then the 1080 Ti came out with the same performance, minus the early-adoption fee (the last-gen Titan was still faster). I ate it then, figuring the next one would be better. For another Titan to come out that fully unlocks the GP102 chip, which is what the Titan should have been in the first place... they just ripped off everyone who bought a Titan X Pascal. That's my problem with this.


----------



## Sheyster

Just noticed this in the official TXp specs:

Maximum GPU Temperature (in C) *96*

So this is up by 5 deg C...


----------



## PowerK

Just placed an order for 2x Titan Xps.
My 2016 Titan Pascals will go into 2nd rig.


----------



## nycgtr

Quote:


> Originally Posted by *Jpmboy*
> 
> why are you upset? By that rationale any product/process improvement should halt. Early yields on this die were very poor. Production is stabilizing, hence consumer-grade p6000.


I think they could have given us the full chip from the get-go; they just didn't want to and needed something to do with the bad chips. They could have hung onto them for a Ti, but the market didn't need a Ti then.

Quote:


> Originally Posted by *freitz*
> 
> As a owner of ever titan they have made. Of course I am upset. I bought titans thinking I at least have a year of gpu dominance... I was a little late this gen buying in November. Then 1080ti came out same performance but early adoption fee (last gen Titan still was faster). I ate it then figure next will be better. To have another titan come up that fully unlocks the GP102 chip is what the Titan should have been in the first place. They just ripped everyone off who bought a Titan X pascal.... thats my problem with this.


I can relate to this argument. I think part of officially adding the "p" was to head off potential problems, as in to say: this is a different product than what was sold before.

Quote:


> Originally Posted by *Sheyster*
> 
> Just noticed this in the official TXp specs:
> 
> Maximum GPU Temperature (in C) *96*
> 
> So this is up by 5 deg C...


That's just to make people feel better about a crappy cooler on a $1,200 part.

I, for one, am being milked, but I'm aware of it. Am I that angry about it? Not really, but if I had bought this thing in, say, February from Nvidia directly, I'd be pissed.


----------



## pez

Quote:


> Originally Posted by *Baasha*
> 
> you know it!.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Titan XP vs. Titan Xp 4 Way SLI showdown.. coming soon.


Called it....







.
Quote:


> Originally Posted by *Sheyster*
> 
> Just noticed this in the official TXp specs:
> 
> Maximum GPU Temperature (in C) *96*
> 
> So this is up by 5 deg C...


I actually checked that earlier and I thought it was just 2, but I could have been too distracted.

Also, shipping notification came while I was sleeping. Card is getting here by 10:30AM. Interestingly enough, I don't see a signature being required.

EDIT:

Oh yeah....#HYPED


----------



## freitz

Quote:


> Originally Posted by *nycgtr*
> 
> I for one am being milked but I am aware of it. Am I that angry about it? No not really but if I had bought this thing in say February from Nvidia directly, I would be pissed.


Thats why I am pissed.


----------



## Jpmboy

Quote:


> Originally Posted by *pez*
> 
> Called it....
> 
> 
> 
> 
> 
> 
> 
> .
> I actually checked that earlier and I thought it was just 2, but I could have been too distracted.
> 
> Also, shipping notification came while I was sleeping. Card is getting here by 10:30AM. Interestingly enough, I don't see a signature being required.
> 
> EDIT:
> 
> Oh yeah....#HYPED


baasha is waaay off the reservation.


----------



## pez

Quote:


> Originally Posted by *Jpmboy*
> 
> baasha is waaay off the reservation.


Indeed







.

I guess I need to find the motivation to install my sleeved cables tonight, then, so they can welcome the new Titan with open arms.


----------



## Baasha

Quote:


> Originally Posted by *Lee0*
> 
> Why do you think that 4k 27" is crap?


For a moment I did a double take - you quoted a video I posted but also someone else's post which makes it look like mine. I did NOT say that for the record.
Quote:


> Originally Posted by *Jpmboy*
> 
> baasha is waaay off the reservation.










I have a GPU addiction.. When is the X299 platform going to be released? Can't wait to try 4 Way SLI + M2.


----------



## guttheslayer

Now the irony: Nvidia has already released its entire Pascal lineup, and AMD has shown nothing beyond a Polaris refresh.

sigh.


----------



## Jpmboy

Quote:


> Originally Posted by *nycgtr*
> 
> I think they could have given us the full chip from the get-go; they just didn't want to, and they needed something to do with the bad chips. They could have hung onto those for a Ti, but at the time the market didn't need a Ti.


But they did; it just cost almost $6,000. Again, AFAIK the yields were too low at the time to qualify full dies for a consumer part. Of course we all wanted a full GP102 last year. As you say, it's not like anyone forced me to buy these all along.
Funny thing is, I saw the same complaints about the GNX on GN forums and the ZR1 on Z06 forums.


Spoiler: Warning: Spoiler!











Quote:


> Originally Posted by *Baasha*
> 
> For a moment I did a double take - you quoted a video I posted but also someone else's post which makes it look like mine. I did NOT say that for the record.
> 
> 
> 
> 
> 
> 
> 
> I have a GPU addiction.. When is the X299 platform going to be released? Can't wait to try 4 Way SLI + M2.


That you do... better than some other addictions I can think of.


----------



## guttheslayer

Quote:


> Originally Posted by *Baasha*
> 
> For a moment I did a double take - you quoted a video I posted but also someone else's post which makes it look like mine. I did NOT say that for the record.
> 
> 
> 
> 
> 
> 
> 
> I have a GPU addiction.. When is the X299 platform going to be released? Can't wait to try 4 Way SLI + M2.


Plenty of people in this world have a GPU addiction; not everyone can afford to feed it like you can.


----------



## Jbravo33

Why am I the only one with no shipping confirmation?







I even called; they said they could still have it out by tomorrow. I keep telling myself patience is a virtue. He's not listening.


----------



## Glerox

It's been a roller-coaster day for me, but I think Nvidia is going too far with the new Titan Xp. I understood the 1080 Ti, even if it was a punch in the face for 2016 Titan X Pascal owners; we all knew a Ti would come sooner or later.

But this is just laughing in the face of consumers who paid $1,200 US ($1,800 CAD) seven months ago for a supposed "no compromises" card... F U, Nvidia. I hope Vega will be really competitive.

At least JayzTwoCents agrees with me.


----------



## WaXmAn

Quote:


> Originally Posted by *Glerox*
> 
> It's been a roller coaster day for me, but I think Nvidia is going too far with the new Titan Xp. I understood the 1080ti even if it was a punch in the face for Titan X pascal 2016 owners. We all knew a Ti would come sooner or later.
> 
> But now this is just laughing in the face of consumers who paid 1200$ US (1800$ CAN) 7 months ago for a supposed "no compromises" card... F U Nvidia. I hope Vega will be really competitive.
> 
> At least JayzTwoCents agrees with me.


I agree 100%....F U Nvidia


----------



## KillerBee33

Hmm, I can't see the difference between the TXP and the TXp... What am I missing?


----------



## WaXmAn

Quote:


> Originally Posted by *KillerBee33*
> 
> Humm can't see the difference in TXP VS TXp...What am i missing?


Same chip, but more CUDA cores and faster RAM... it should have been the first Titan XP.


----------



## JedixJarf

Quote:


> Originally Posted by *WaXmAn*
> 
> Same Chip, but more (cuda cores) and faster ram....should have been the first Titan XP


Someone should try to flash the Titan Xp BIOS on a 1080 Ti :P


----------



## blackforce

Quote:


> Originally Posted by *Glerox*
> 
> It's been a roller coaster day for me, but I think Nvidia is going too far with the new Titan Xp. I understood the 1080ti even if it was a punch in the face for Titan X pascal 2016 owners. We all knew a Ti would come sooner or later.
> 
> But now this is just laughing in the face of consumers who paid 1200$ US (1800$ CAN) 7 months ago for a supposed "no compromises" card... F U Nvidia. I hope Vega will be really competitive.
> 
> At least JayzTwoCents agrees with me.


hahahahaha


----------



## pez

Quote:


> Originally Posted by *Jbravo33*
> 
> why am i only one with no shipping confirmation
> 
> 
> 
> 
> 
> 
> 
> i even called said they could still have it out by tomorrow. keep telling myself patience is a virtue, hes not listening


Not sure, bud







. As I type this, my tracking just updated to show my card is now at the sorting facility it'll ship from.
Quote:


> Originally Posted by *KillerBee33*
> 
> Humm can't see the difference in TXP VS TXp...What am i missing?


Quote:


> Originally Posted by *WaXmAn*
> 
> Same Chip, but more (cuda cores) and faster ram....should have been the first Titan XP


It's the full chip, i.e. Big Pascal.


----------



## hotrod717

I believe at some point last evening there was a cutoff for next-day shipping. I realized this, but hoped just the same that I might see a shipping notification and tracking number when I woke. Nope; looks like Saturday delivery for me, unless they pull a Cannonball Run.


----------



## BrainSplatter

Quote:


> Originally Posted by *Glerox*
> 
> As an owner of every Titan they've made, of course I'm upset. I bought Titans thinking I'd have at least a year of GPU dominance. I was a little late this gen, buying in November.


I really don't get all the complaints. If you buy a $1,200 GPU, you should also have the money to upgrade half a year later to a 10% faster one for a couple hundred, no?

Also, you should buy a Titan right after release, to maximize the bragging rights







.


----------



## nycgtr

Quote:


> Originally Posted by *BrainSplatter*
> 
> I really don't get all the complaints. If u buy a $1200 GPU u should also have the money to upgrade half a year later to a 10% faster one for a couple of hundred ? No?
> 
> Also, a Titan you should buy right after release in order to maximize the bragging rights
> 
> 
> 
> 
> 
> 
> 
> .


I don't think money is the issue here. I think some people are salty that they should have gotten this from the get-go, yields or not. I think Nvidia added the "p" to the name just so there wouldn't be an issue where you could ask, "How are you now selling the exact same SKU with better specs than before, using the same components?" $1,200 isn't a lot of money for enthusiasts in this price range, but no one likes paying twice for the exact same thing just to get what it should have been to begin with. The Quadro M6000 had the same chip as the Titan X Maxwell, so maybe people felt they weren't going to do the Titan Black thing again.


----------



## freitz

Quote:


> Originally Posted by *nycgtr*
> 
> I don't think money is the issue here. I think some people are salty that they should have gotten this from the get-go, yields or not. I think Nvidia added the "p" to the name just so there wouldn't be an issue where you could ask, "How are you now selling the exact same SKU with better specs than before, using the same components?" $1,200 isn't a lot of money for enthusiasts in this price range, but no one likes paying twice for the exact same thing just to get what it should have been to begin with. The Quadro M6000 had the same chip as the Titan X Maxwell, so maybe people felt they weren't going to do the Titan Black thing again.


100% right... it's nothing to do with the money; this is a stab in the back from Nvidia for those of us who bought the Titan X (Pascal).


----------



## Artah

Quote:


> Originally Posted by *freitz*
> 
> 100% right... it's nothing to do with the money; this is a stab in the back from Nvidia for those of us who bought the Titan X (Pascal).


I think it's fine; the Titan series is definitely the ruler here, for a price. I'm glad they didn't let the 1080 Ti sit too long at roughly even performance with the Titan, and quickly released the TXp. I had a feeling the TXP was an interim release of the big dog now known as the TXp; I had that feeling when I bought my two TXPs, and I was completely OK with it.


----------



## jsutter71

I would like to throw in my 2 cents, since everybody is excited about the release of the new Titans (and by "excited" I don't mean good or bad). I would just like to share my thoughts on the matter, since we're all free thinkers here. IMHO, I did not buy the TXPs just so I could have the latest, greatest, fastest card out there. I bought them so that I wouldn't have to worry about having enough power to drive my four monitors, one of which is a 31" 4096x2160 display. Flash back to when they released this beast after the 1080: a lot of comments were made about future-proofing, and about how these cards were an awesome thing because people were finally able to play their games in 4K with decent frame rates. I for one noticed a massive improvement over the triple 980 Ti SLI configuration I upgraded from. Fast forward six months, and the 1080 Tis were released. To most everybody's amazement and surprise, the card *SEEMED* to be faster than the TXP; at least that's what all the initial reviews stated, until actual real-world usage showed otherwise. So Nvidia heard the cries of the TXP owners who felt betrayed and, as a result, released a slightly faster consumer version of the Quadro P6000. Again the Titan regains the crown... Hail the Titan!!!!

OK, so what has changed? A bunch of really happy new TXP owners got to game in 4K and feel the warm fuzzies of having the fastest consumer card on the planet, with bragging rights. I too fell victim to the sway of pride, where I felt it necessary to tell all my interested friends about my brand-new TXP SLI system. Then, GASP!!!! Something new and shinier was released, and I did not own it. But you know what? I can still do all the wonderful things I was able to do after I bought my TXPs that my previous configuration struggled with.

Granted, if you are one of the few who game in 8K, then you have a legitimate reason to upgrade; likewise if you're an engineer, doctor, or scientist whose work requires massive amounts of GPU performance. And yes, your Fire Strike scores will be truly amazing. But for the rest of us who felt the thrill of having the fastest gear, if only for a little while: take comfort that the vast majority of PC gurus still wish they could afford the original TXP.

Again, just my 2 cents.


----------



## AdamK47

Quote:


> Originally Posted by *hotrod717*
> 
> I believe at some point last evening was a cutoff on next day shipping. Realized this, but hoping just the same, that I might have seen shipping notification and tracking when I woke. Nope, looks like Sat. delivery for me, unless they pull a canonball run.


I ordered just before noon Eastern time yesterday and received the order confirmation four hours later. No shipping confirmation email yet. I chose next-day afternoon shipping.

Digital River is terrible about shipping times and customer service. There's no rhyme or reason to how they order the shipments. For the Pascal Titan release in August, I got mine with two-day shipping before some people who chose next-day shipping on orders placed before mine. Don't even bother trying to call them. All their call center customer service does is give a broad, non-specific answer in the hope that it makes you hang up. Totally useless.


----------



## jhowell1030

Quote:


> Originally Posted by *jsutter71*
> 
> I would like to throw in my 2 cents, since everybody is excited about the release of the new Titans (and by "excited" I don't mean good or bad). I would just like to share my thoughts on the matter, since we're all free thinkers here. IMHO, I did not buy the TXPs just so I could have the latest, greatest, fastest card out there. I bought them so that I wouldn't have to worry about having enough power to drive my four monitors, one of which is a 31" 4096x2160 display. Flash back to when they released this beast after the 1080: a lot of comments were made about future-proofing, and about how these cards were an awesome thing because people were finally able to play their games in 4K with decent frame rates. I for one noticed a massive improvement over the triple 980 Ti SLI configuration I upgraded from. Fast forward six months, and the 1080 Tis were released. To most everybody's amazement and surprise, the card *SEEMED* to be faster than the TXP; at least that's what all the initial reviews stated, until actual real-world usage showed otherwise. So Nvidia heard the cries of the TXP owners who felt betrayed and, as a result, released a slightly faster consumer version of the Quadro P6000. Again the Titan regains the crown... Hail the Titan!!!!
> 
> OK, so what has changed? A bunch of really happy new TXP owners got to game in 4K and feel the warm fuzzies of having the fastest consumer card on the planet, with bragging rights. I too fell victim to the sway of pride, where I felt it necessary to tell all my interested friends about my brand-new TXP SLI system. Then, GASP!!!! Something new and shinier was released, and I did not own it. But you know what? I can still do all the wonderful things I was able to do after I bought my TXPs that my previous configuration struggled with.
> 
> Granted, if you are one of the few who game in 8K, then you have a legitimate reason to upgrade; likewise if you're an engineer, doctor, or scientist whose work requires massive amounts of GPU performance. And yes, your Fire Strike scores will be truly amazing. But for the rest of us who felt the thrill of having the fastest gear, if only for a little while: take comfort that the vast majority of PC gurus still wish they could afford the original TXP.
> 
> Again, just my 2 cents.


Nicely put. We should all voice these opinions in the NVIDIA forums as well. They listen a little more than we think.


----------



## jsutter71

Quote:


> Originally Posted by *jhowell1030*
> 
> Nicely put. We should all voice these opinions in the NVIDIA forums as well. They listen a little more than we think.


























I had a little bit of help collecting my thoughts after a couple of shots of Patrón. I always have my most inspirational thoughts after a little tequila.


----------



## Glerox

Quote:


> Originally Posted by *nycgtr*
> 
> Don't think money is the issue here. I think some people are salty that they should of gotten this from the get-go yields or not. I think nvidia added the P to the name just so there wouldn't be an issue where you could say " how are you now selling the exact same sku with better specs than before when using the same components". 1200 isn't a lot of money for enthusiasts in this price range but no one likes the feeling of paying for the exact same thing twice and getting what should of been to begin with. The quadro m6000 had the same chip as the titan x maxwell, so maybe people felt they weren't going to do the titan black thing again.


This is exactly my point. It's not a money issue.
Fat Pascal could have been released eight months ago, but Nvidia was being greedy.

Another YouTuber arguing the same:


----------



## Blaise Pascal

Any tips from people that have sold locally??


----------



## jhowell1030

Quote:


> Originally Posted by *Blaise Pascal*
> 
> Any tips from people that have sold locally??


Don't let the fact that it's a local sale push you into budging on price for "convenience." You can get your money whether local or online; no need to short yourself.


----------



## Jbravo33

Quote:


> Originally Posted by *AdamK47*
> 
> I ordered just before noon eastern time yesterday. Received the order confirmation 4 hours later. No shipping confirmation email yet. I did next day afternoon shipping.
> 
> Digital River is terrible about getting shipping times and customer service. There is no rhyme or reason to how they order the shipments. For the Pascal Titan release in August I got mine with two day shipping before some who did next day shipping on orders placed before mine. Don't even bother trying to call them. All their Indian call center customer service does it give a broad non-specific answer in the hopes that it makes you hang up. Totally useless.


I've called twice; they've told me nothing beyond what I can already read on their order status page: "Your order is boxed for shipment." They don't say anything else. Customer service for the store is an absolute joke. Went through the same thing with the Tis; not sure why I expected that to change. You'd think they'd care about giving you info after you've spent 2500 bucks, but I guess not.


----------



## Artah

Quote:


> Originally Posted by *Jbravo33*
> 
> I've called twice; they've told me nothing beyond what I can already read on their order status page: "Your order is boxed for shipment." They don't say anything else. Customer service for the store is an absolute joke. Went through the same thing with the Tis; not sure why I expected that to change. You'd think they'd care about giving you info after you've spent 2500 bucks, but I guess not.


I chose next-morning shipping; that's a bunch of baloney, because I ordered yesterday and haven't even gotten a shipping email.


----------



## mbze430

I don't look at it as a "stab in the back." I bought the TXP, the 1080 Ti, and the TXp (all in SLI)... As a techie and also a businessman, I applaud Nvidia (I might have stock in Nvidia too







)

As a techie = I want the top of the line. $$$ = no object.
As a businessman = it's all about finding ways to increase profit and sales. This is exactly what Nvidia did. They know that in this niche of the market, people will dump dumb money.

Those who are whining and *****ing... simple: stop supporting the manufacturer. Just know that even though you stopped, someone else has probably taken your place.

And if you are one of those ultra-liberals... go to Nvidia HQ and start a march/protest and deface their building


----------



## mbze430

Quote:


> Originally Posted by *Artah*
> 
> I did next morning shipping, that's a bunch of baloney for me because not even a shipping email and I ordered yesterday.


Looks like LA peeps got screwed... no email either


----------



## Sheyster

Quote:


> Originally Posted by *BrainSplatter*
> 
> Also, a Titan you should buy right after release in order to maximize the bragging rights
> 
> 
> 
> 
> 
> 
> 
> .


*+1*

Indeed, gotta maintain that E-peen erection as long as possible!


----------



## nycgtr

Anyone get theirs yet?


----------



## mbze430

Not even an email

Both Artah and I are in Los Angeles... neither of us got a shipping email. Oh well, I'm not the one who needs it "right now."


----------



## Jbravo33

Quote:


> Originally Posted by *nycgtr*
> 
> Anyone get theirs yet?


a couple have
http://www.overclock.net/t/1627390/2017-nvidia-titan-xp-owners-thread/110


----------



## Dagamus NM

Quote:


> Originally Posted by *Baasha*
> 
> 
> 
> 
> 
> 
> 
> 
> I have a GPU addiction.. When is the X299 platform going to be released? Can't wait to try 4 Way SLI + M2.


Do you think x16/x8/x8/x8 + x4 M.2 will make a lot of difference?

I have 4-way SLI on my 6950X with M.2, running x8/x8/x8/x8. I want to get an M.2-to-U.2 adapter, as I have an extra 1TB SSD doing nothing at the moment.

Don't get me wrong, I am pretty excited for X299; more so for that than for the TXp. Even though I want them, I have a couple of engines to build in my garage, as well as a house to remodel.

When the heck does X299 come out, anyhow? I suppose I will get GPUs for it, but I'd rather not buy more Pascal cards.


----------



## Sheyster

Quote:


> Originally Posted by *mbze430*
> 
> Not even an email
> 
> both Artah and I are in Los Angeles.... both of us didn't get an shipping email... oh well. I am not the one that needs it "right now"


I'm in San Diego, my card shipped yesterday. It probably has more to do with WHEN you placed the order yesterday, not where you live in the US.


----------



## Jpmboy

Quote:


> Originally Posted by *nycgtr*
> 
> Anyone get theirs yet?


yes.


----------



## Artah

Quote:


> Originally Posted by *Jpmboy*
> 
> yes.


Quote:


> Originally Posted by *nycgtr*
> 
> Anyone get theirs yet?


Jpmboy lives across the street from Nvidia


----------



## pez

You guys need to come join us in the new thread as well as join us in purchasing this new card







. The water is fine







.

Also, as far as shipping: my card shipped out of Roseville, MN, and I live in NC (in the Triangle). I ordered my card around 7 AM EST.


----------



## jhowell1030

Quote:


> Originally Posted by *pez*
> 
> You guys need to come join us in the new thread as well as join us in purchasing this new card
> 
> 
> 
> 
> 
> 
> 
> . The water is fine
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Also as far as shipping, my card shipped out of Roseville, MN and I live in NC (in the Triangle). I also ordered my card around 7AM or so EST.


Honestly, if the wife hadn't talked me into dropping $3,000+ on a gaming laptop, I would've thought about picking one of these up and throwing my current Titan X in the HTPC.


----------



## Artah

Quote:


> Originally Posted by *jhowell1030*
> 
> Honestly, if I hadn't had the wife talk me into dropping +$3000 on a gaming laptop I would've thought about picking one of these up and throwing my current Titan X in the HTPC.


return the gaming laptop


----------



## jhowell1030

Can't. I already opened it up to replace the 128GB NVMe drive with two 256GB NVMe drives (RAID 0), added a 1TB M.2 SSD, and replaced the 1TB HDD with a 2TB SSHD. Had to remove the warranty sticker to do it.


----------



## OZrevhead

Are any scores up for the TXp rev 2 yet?


----------



## KillerBee33

Anyone managed to do a TXP vs. TXp comparison yet? Unbelievable that they couldn't come up with a better name..


----------



## MaDeOfMoNeY

As much as I wanna buy the new shiny I just don't see a point in doing so.


----------



## Artah

Quote:


> Originally Posted by *OZrevhead*
> 
> Are any scores up for TXp rev 2 ?


Big dog's early tests show the TXp performing like a pro. http://www.overclock.net/t/1627390/2017-nvidia-titan-xp-owners-thread/200_100#post_25995352

Quote:


> Originally Posted by *KillerBee33*
> 
> Anyone managed to do a TXP vs. TXp comparison yet? Unbelievable that they couldn't come up with a better name..


. http://www.overclock.net/t/1627390/2017-nvidia-titan-xp-owners-thread/200_100#post_25995352

Quote:


> Originally Posted by *MaDeOfMoNeY*
> 
> As much as I wanna buy the new shiny I just don't see a point in doing so.


I would say it's worth it if you crave bleeding-edge power; if you can't spare the cash, though, don't do it.


----------



## Jpmboy

Quote:


> Originally Posted by *KillerBee33*
> 
> Anyone managed to do a TXP vs. TXp comparison yet? Unbelievable that they couldn't come up with a better name..


I did, using Time Spy... Same rig and settings; the TXp easily ran right by my 2162MHz TXP, and almost did it while air-cooled. I put a uni block on the TXp, since it's as temp-sensitive as the TXP.


----------



## OZrevhead

Can you put up some time spy comparison scores please? I need to know if its worth getting an Xp


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> I did, using Time Spy... Same rig and settings; the TXp easily ran right by my 2162MHz TXP, and almost did it while air-cooled. I put a uni block on the TXp, since it's as temp-sensitive as the TXP.



Since I'm not benchin', just playin', I don't think it's worth going through this again


----------



## Jpmboy

Quote:


> Originally Posted by *OZrevhead*
> 
> Can you put up some time spy comparison scores please? I need to know if its worth getting an Xp


This is my best with the TXP:


This is NOT my best with the new TXp: http://hwbot.org/submission/3514475_
(just ran a 12350!) All higher scores are using LN2 on the GPU.
Quote:


> Originally Posted by *KillerBee33*
> 
> Since im not Benchin' , just playin i don't think it's worth goin through this again


You are 100% correct. Gaming, you're not gonna see a worthwhile benefit.








I still have both my TXPs and plan to keep them for quite a while.


----------



## OZrevhead

Thanks Jpmboy. So if I can buy a TXP for 2/3 the price of a TXp, then the P is the better value, right? I will be benching it hard, but I'm not chasing world rankings, just local ones.

Cheers.


----------



## Artah

Quote:


> Originally Posted by *OZrevhead*
> 
> Thanks jpmboy, so if I can buy a TXP for 2/3 the price of a TXp then P is better value right? I will be benching it hard but Im not chasing world rankings, just local ones.
> 
> Cheers.


You're better off buying a 1080 Ti; it's close enough, and you save half the cash for other fun things.


----------



## OZrevhead

Quote:


> Originally Posted by *Artah*
> 
> You're better off buying a 1080 Ti; it's close enough, and you save half the cash for other fun things.


But I can get a TXP with a full water block for the price of a new Ti, so that seems like better value to me (and the TXP scores higher than the 1080 Ti with equal cooling).


----------



## Artah

Quote:


> Originally Posted by *OZrevhead*
> 
> But I can get a TXP with a full water block for the price of a new Ti, so that seems like better value to me (and TXP scores higher than 1080Ti with equal cooling).


For the same price as a 1080 Ti but with a water block, yeah, that's better value. Funny how prices have suddenly plummeted; I'm thinking they'll go back up a bit once the rush of people dumping them passes.


----------



## madmeatballs

Anyone in this thread have an Aqua Computer block on their card?


----------



## invincible20xx

Who here is selling the gen-1 Titan XP for the gen-2 one?


----------



## Artah

Quote:


> Originally Posted by *invincible20xx*
> 
> who here is selling the gen 1 titan xp for the gen 2 one ?


I've seen a couple selling them on the marketplace here on OCN. http://www.overclock.net/f/14779/video


----------



## PowerK

Quote:


> Originally Posted by *invincible20xx*
> 
> who here is selling the gen 1 titan xp for the gen 2 one ?


Just sold mine for 1k USD each.


----------



## invincible20xx

Quote:


> Originally Posted by *PowerK*
> 
> Just sold mine for 1k USD each.


Oh boy, it's hard to always stay on top, isn't it









a new $1,200 GPU king every few months now


----------



## PowerK

Quote:


> Originally Posted by *invincible20xx*
> 
> oh boy it's hard to always stay on top isn't it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> new 1200$ gpu king now every few months


I was going to move the existing 2016 Titan X Pascals to the secondary PC. On second thought, though, I just couldn't be bothered to move them and rebuild the custom loop on the secondary PC. I put them up on the marketplace several hours ago for $1k USD each, and they've sold. Now I just need to work on the main PC.


----------



## invincible20xx

Quote:


> Originally Posted by *PowerK*
> 
> I was going to move those existing Titan X Pascals (2016) to the secondary PC. However, on second thought, I just could not be bothered to move and rebuild custom loop on the secondary PC. I put it up on the market several hours ago for 1k USD each. They are sold now. I just need to work on the main PC then.


I hope you enjoy the new GPUs









have a good day









Long live the king, down with the king... such is life, lol


----------



## pez

Quote:


> Originally Posted by *invincible20xx*
> 
> who here is selling the gen 1 titan xp for the gen 2 one ?


I was going to go with an AIB Ti, but ended up getting a new TXp. Hopefully the old one will sell soon enough







.


----------



## bouncingsoul

I'm willing to buy one. Is anyone selling in Europe?


----------



## drfouad

I have a 2016 Titan X Pascal that I could sell internationally, provided we work out shipping. (My card is just out of its 30-day return period.)


----------



## axiumone

Quote:


> Originally Posted by *madmeatballs*
> 
> Anyone in this thread who has an aqua computer block on their card?


Nope, not this time. I've had enough Titan cards, and this round just made me extra salty. After three TXMs and two TXPs, I've had my fill. I sold one TXP and traded the other to a friend for two EVGA 1080 ACX 3.0s, which I upgraded to ICX and subsequently to two FE Tis, for a fraction of the cost. That leaves me just a bit below where I started, but since I'm not running 5x [email protected] anymore, I just don't care. I'm fed up with Nvidia's marketing practices and general stance. We all know full well that Pascal yields were good enough to produce a $1,200, 3840-CUDA-core card from the start, but they chose to withhold it in order to one-up the Ti cards months later. Screw that.


----------



## CptSpig

Quote:


> Originally Posted by *invincible20xx*
> 
> who here is selling the gen 1 titan xp for the gen 2 one ?


I am thinking about selling my Titan X Pascal.


----------



## Unimag

I'll be selling mine after I receive the new one hopefully tomorrow.

I'm based in U.K. (Sheffield)

Please drop me a line if interested


----------



## TonyRoma

Quote:


> Originally Posted by *axiumone*
> 
> Nope, not this time. I've had enough Titan cards and this round just made me extra salty. After 3 TXM and 2 TXP, I've had my fill. I've sold 1 TXP and traded the other to a friend for 2 EVGA 1080 ACX 3.0's, which I've upgraded to ICX and consequently to 2 FE Tis for a fraction of the cost. Which leads me to just a bit less than where I started, but since I'm not running 5x [email protected] anymore, I just don't care. I'm fed up with Nvidia's marketing practices and general stance. We all know full well that Pascal yields were good enough to produce a $1,200, 3840 CUDA core card from the start, but they chose to withhold that in order to one-up the Ti cards months later, screw that.


Indeed, I feel your pain.


----------



## eliau81

are they for real (nvidia)?!!

deja vu all over me


----------



## TheGeneralLee86

Are there any new benchmarks for the Titan Xp that show what the real difference between them is?


----------



## KillerBee33

Quote:


> Originally Posted by *TheGeneralLee86*
> 
> Are there any new benchmarks for the Titan Xp that show what the real difference between them is?


¯\_(ツ)_/¯
5% over the TXP at most
http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30


----------



## Glerox

If anyone is selling his 2016 Titan XP in Canada, please Inbox me.
Import fees are expensive here...


----------



## bizplan

Quote:


> Originally Posted by *axiumone*
> 
> Nope, not this time. I've had enough Titan cards and this round just made me extra salty. After 3 TXM and 2 TXP, I've had my fill. I've sold 1 TXP and traded the other to a friend for 2 EVGA 1080 ACX 3.0's, which I've upgraded to ICX and consequently to 2 FE Tis for a fraction of the cost. Which leads me to just a bit less than where I started, but since I'm not running 5x [email protected] anymore, I just don't care. I'm fed up with Nvidia's marketing practices and general stance. We all know full well that Pascal yields were good enough to produce a $1,200, 3840 CUDA core card from the start, but they chose to withhold that in order to one-up the Ti cards months later, screw that.


Indeed, I feel your pain.



Edit: on a side note, did you see that one of the brokerage houses put a "sell" recommendation on Nvidia's stock? (after it tripled over the last year).


----------



## ratzofftoya

Selling both of my Gen 1 (meaning August 2016) Titan X(capital)Ps. Can ship pretty much anywhere. PM me if interested!


----------



## alexthemans0527

I will sell my Titan X (Pascal), or Titan XP 2016, in Hong Kong, after receiving Titan Xp.

My Titan XP does pretty well under water cooling (+185 core / +600 RAM), and I hope my new card is better...

P.S. I know it is costly, but it will serve me for two or three years


----------



## bouncingsoul

Why is everyone referring to "+MHz" numbers when talking about OC capability? Wouldn't it be much more precise to say, e.g., it settles at [email protected] with +200 on the core?

Anyway, I'd like to ask something else as well. Do you guys think that 2 TXp will exceed the capacity of my cooling? I have an [email protected],9GHz (not delidded so far) and a TXp at the above-mentioned OC. The CPU reaches ~75-80°C, and the GPU settles at 55°C while playing Ghost Recon Wildlands for multiple hours. I have a custom loop with one 360 and one 240 radiator with Noisekiller fans at ~1100rpm. Not too sure about the water temps though.
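To put numbers on the "+MHz" point: an offset like "+200" only means anything relative to a reference boost clock, and GPU Boost varies the real clock with temperature and power anyway, which is why quoting the settled absolute clock is more informative. A minimal Python sketch (the 1531 MHz reference boost is the published Titan X Pascal spec; the offset is illustrative):

```python
# Convert an overclocking offset to the nominal absolute clock it implies.
# GPU Boost 3.0 moves the real clock around with temperature and power,
# so the settled in-game clock is the number actually worth reporting.

def absolute_clock(stock_boost_mhz: int, offset_mhz: int) -> int:
    """Nominal clock implied by a '+offset' overclock."""
    return stock_boost_mhz + offset_mhz

# Titan X Pascal reference boost is 1531 MHz; cards typically boost
# well past that on their own before any offset is even applied.
print(absolute_clock(1531, 200))  # nominal figure only, not the real boost
```

The printed number is only a floor for where the card ends up, which is exactly why two people with the same "+200" can settle at very different clocks.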


----------



## Silent Scone

Quote:


> Originally Posted by *alexthemans0527*
> 
> I will sell my Titan X (Pascal), or Titan XP 2016, in Hong Kong, after receiving Titan Xp.
> 
> My Titan XP does pretty well under water cooling (+185 core / +600 RAM), and I hope my new card is better...
> 
> *P.S. I know it is costly, but it will serve me for two or three years*


So would your Titan X, in that case... lol


----------



## ChronoBodi

I can't be arsed to sell mine for a mere 7% difference.

Basically, it's the fully unlocked GP102: 3584 vs. 3840 cores. Assuming the same memory speed, here's the difference:

3840 cores at ~1870MHz gives roughly the same throughput as 3584 cores at 2000MHz.

Not really arsed to "upgrade"; let me know when Nvidia has a Volta Ti, or AMD has something similar.
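The back-of-envelope math above can be checked with a quick cores × clock calculation, a rough shader-throughput proxy that ignores memory bandwidth and boost behaviour (the 1840/2000 MHz clocks are the ones quoted in the post, not measurements):

```python
# Back-of-envelope shader throughput: cores x clock.
# Titan X Pascal: 3584 CUDA cores; Titan Xp: 3840 (full GP102).

def throughput(cores: int, clock_mhz: float) -> float:
    return cores * clock_mhz

xp_full = throughput(3840, 1840)   # Titan Xp at 1840 MHz
tx_2016 = throughput(3584, 2000)   # Titan X Pascal overclocked to 2000 MHz

print(round(xp_full / tx_2016, 3))  # ~0.986 -- effectively a wash
# The raw core-count advantage is 3840/3584 ~ 7.1%, which a higher
# overclock on the older card largely erases.
```

In other words, the ~7% extra hardware only shows up if both cards settle at the same clocks.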


----------



## alexthemans0527

Quote:


> Originally Posted by *Silent Scone*
> 
> So would your Titan X, in that case... lol


At least I fulfilled my goal to build a dream PC now, by upgrading to Titan Xp, which uses a complete GP102.


----------



## Jpmboy

Quote:


> Originally Posted by *bizplan*
> 
> Indeed, I feel your pain.
> 
> 
> 
> Edit: on a side note, did you see that one of the brokerage houses put a "sell" recommendation on Nvidia's stock? (after it tripled over the last year).


The price of NV stock is NOT being driven by any _halo_ SKU, silly... look to AI, automotive hardware, etc.


----------



## axiumone

Quote:


> Originally Posted by *Jpmboy*
> 
> The price of NV stock is NOT being driven by any _halo_ SKU silly... look to AI, automobile hardware etc.


Well, yes and no. It's not directly impacted by a $1,200 Titan alone, but it is impacted by GPUs. Nvidia's gaming revenue is substantially higher than anything else they're doing right now: deep learning, auto, pro visualization. The non-gaming fields are showing good growth, and that's certainly helping the stock; however, the downgrades are coming from analysts who think Nvidia completely oversaturated the market with GPUs and that it's going to stagnate. That's my take on it, anyway.









See here for some easy to read info. - http://files.shareholder.com/downloads/AMDA-1XAJD4/4251406238x0x919755/BA04DEA9-2D69-43AF-85CF-3402A58884BC/Corporate_Presentation.pdf


----------



## bizplan

Quote:


> Originally Posted by *bizplan*
> 
> Indeed, I feel your pain.
> 
> 
> 
> Edit: on a side note, did you see that one of the brokerage houses put a "sell" recommendation on Nvidia's stock? (after it tripled over the last year).


Quote:


> Originally Posted by *Jpmboy*
> 
> The price of NV stock is NOT being driven by any _halo_ SKU silly... look to AI, automobile hardware etc.


Quote:


> Originally Posted by *axiumone*
> 
> Well, yes and no. It's not directly impacted by a $1,200 Titan alone, but it is impacted by GPUs. Nvidia's gaming revenue is substantially higher than anything else they're doing right now: deep learning, auto, pro visualization. The non-gaming fields are showing good growth, and that's certainly helping the stock; however, the downgrades are coming from analysts who think Nvidia completely oversaturated the market with GPUs and that it's going to stagnate. That's my take on it, anyway.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> See here for some easy to read info. - http://files.shareholder.com/downloads/AMDA-1XAJD4/4251406238x0x919755/BA04DEA9-2D69-43AF-85CF-3402A58884BC/Corporate_Presentation.pdf


The market is reacting negatively to Nvidia milking its customers. In the span of what seemed like a week, NV likely sold thousands of Tis to folks who thought they were buying the fastest GPU available, only to have their cards usurped by the Xp.









Edit: what NV is doing is not good business, although it's good business for AMD!









Edit: although it would have been bad business to kill off the Xp, which (for one week) everyone thought NV had done with the Ti. I guess the joke was on us...


----------



## mbze430

Quote:


> Originally Posted by *madmeatballs*
> 
> Anyone in this thread who has an aqua computer block on their card?


I used to have the AquaComputer nickel blocks on my Titan XP. I have since removed them for the Xp.


----------



## jsutter71

A little off topic from the current discussions.

Not long ago I discussed the issue of individual 6- and 8-pin power cables versus a single cable carrying both connectors. My verdict is as follows: with a single cable carrying both connectors, overclocking headroom is limited. This affected not only my Titans but also my CPU; I experienced frequent lockups and system restarts, especially during benchmarking. As soon as I attached individual power cables, these problems went away and I was able to overclock my CPU and Titans again.


----------



## bizplan

Quote:


> Originally Posted by *jsutter71*
> 
> A little off topic from the current discussions.
> 
> Not long ago I discussed the issue about individual 6 and 8 pin power cords versus a single cable with both connections. My verdict is as follows. With a single power cable with both connections the ability to overclock is limited. This not only affected my Titans but also my CPU. I experienced frequent lockups and system restarts especially during benchmarking. As soon as I attached individual power cords these problems went away. I was able to overclock my CPU and Titans again.


Apollo 13 main bus B under-volt?


----------



## Jusiz

I know it's an insane idea, but can someone crossflash the Xp BIOS to a Titan X Pascal? Could that unlock cores or something?
Is it even possible?


----------



## xTesla1856

Quote:


> Originally Posted by *Jusiz*
> 
> I know it's an insane idea, but can someone crossflash the Xp BIOS to a Titan X Pascal? Could that unlock cores or something?
> Is it even possible?


They're laser-cut; this isn't AMD.


----------



## Lee0

EVGA has finally released the Hybrid kit for the *original Titan X Pascal* _(this name is too long :c)_ (it fits the 1080 Ti as well)!
Product page link: http://www.evga.com/products/product.aspx?pn=400-HY-5388-B1


----------



## octiny

Quote:


> Originally Posted by *Lee0*
> 
> EVGA has finally released the *Original Titan X Pascal* _this name is too long :c_ (/1080 ti, it fits this card as well) Hybrid kit!
> Product page link: http://www.evga.com/products/product.aspx?pn=400-HY-5388-B1


It's been out for about 10 days.

Grabbed two a while back, 2 minutes after it was listed; it sold out within 5 minutes.

Edit: It also works on the Titan Xp with no modifications.


----------



## nycgtr

I am a big fan of EVGA, but this for $160 is just BS. These kits were $120 before, and that was kinda meh. $160 for a plastic cover and a $60 AIO.


----------



## toncij

Quite a sum for a weak single-vent rad...


----------



## Jpmboy

sli txps [email protected]
Quote:


> Originally Posted by *toncij*
> 
> Quite a sum for a weak single-vent rad...


... Fischer-Price watercooling.


----------



## jsutter71

Quote:


> Originally Posted by *Jpmboy*
> 
> sli txps [email protected]
> ... Fischer-Price watercooling.


Once you go open loop you never go back.


----------



## axiumone

Not true.

Once you go back to high-end air, you can't contain the pleasure of knowing there's next to no maintenance besides wiping the dust off once in a while.


----------



## jsutter71

Quote:


> Originally Posted by *axiumone*
> 
> Not true.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Once you go back to high end air and can't contain the pleasure of knowing there's next to no maintenance. Besides wiping the dust off once in a while.


I prefer quiet overclocking, and the maintenance is easy. It takes longer to get an oil change on my car than it took me to drain, flush, and refill my loop.


----------



## axiumone

I thought so too, for a very long time. I recently invested in a Cryorig R1 after years of high-end water, and I'm incredibly impressed by the cooling/noise ratio. The delidded [email protected] - 1.38v sits at 45-50C while gaming, and it's very quiet.

Edit: GPUs are a little tougher to keep quiet, though, although a lot of the triple-fan solutions have very respectable noise levels, even overclocked.


----------



## Jbravo33

Quote:


> Originally Posted by *axiumone*
> 
> I thought so too, for a very long time. I recently invested in a Cryorig R1 after years of high-end water, and I'm incredibly impressed by the cooling/noise ratio. The delidded [email protected] - 1.38v sits at 45-50C while gaming, and it's very quiet.
> 
> Edit: GPUs are a little tougher to keep quiet, though, although a lot of the triple-fan solutions have very respectable noise levels, even overclocked.


Did you delid the 6850k yourself? I was thinking about it but couldn't find a tool that supported broadwell.


----------



## axiumone

Quote:


> Originally Posted by *Jbravo33*
> 
> Did you delid the 6850k yourself? I was thinking about it but couldn't find a tool that supported broadwell.


Purchased it delidded. Although if you look really hard, I hear you can find the tool der8auer made; I think he released about 75 of them through private sales. I've seen the process, and I didn't feel like taking a razor to the IHS and the die to scrape off the solder, as I'm prone to messing stuff up. Buying a ready-delidded chip felt like the better way. It's not like delidding a 115x chip; those are much easier, and I've done a few of them.


----------



## Baasha

wheeeee.. ran Superposition @ 8K with OG Titan XP in 4 Way SLI:


----------



## Silent Scone

Pointless exercise #4,603

Your mins are poo, which essentially shows how impractical that is lol


----------



## pez

Quote:


> Originally Posted by *jsutter71*
> 
> I prefer quiet overclocking. The maintenance is easy. It takes longer to get an oil change in my car then it took me to drain, flush, and refill my loop.


Excluding FE coolers, it's very possible to have an air cooled rig perform just as quietly as a WC'ed one.


----------



## Jpmboy

Quote:


> Originally Posted by *Baasha*
> 
> wheeeee.. ran Superposition @ 8K with OG Titan XP in 4 Way SLI:


Baasha, how did you get SLI to work? And ~75% scaling at that...
Quote:


> Originally Posted by *pez*
> 
> Excluding FE coolers, it's very possible to have an air cooled rig perform just as quietly as a WC'ed one.


Except the component temperatures will be much higher in any air-cooled rig. I mean, my NH-D14 is an exceptional air cooler, but it's not even close to a custom loop, and GPUs are in a different category altogether. At full tilt they are hair dryers running at high speed, roasting the card.


----------



## piee

Gotta ask yourself: is a $400-500 upgrade from TXP to TXp worth 5-8 fps? Then gotta ask yoself, do you feel lucky... well, do you?
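Taking the midpoints of the ranges piee quotes ($400-500 and 5-8 fps), the value question works out like this (a toy calculation, not a recommendation):

```python
# Rough value check on the TXP -> TXp upgrade described above:
# net cost of the swap divided by the frames gained.

def cost_per_fps(net_cost_usd: float, fps_gain: float) -> float:
    return net_cost_usd / fps_gain

# Midpoints of the quoted ranges: $450 for ~6.5 fps.
print(round(cost_per_fps(450, 6.5), 2))  # ~69 dollars per extra frame
```

Roughly $69 per extra frame per second, which frames the "do you feel lucky" question in dollars.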


----------



## Jpmboy

Quote:


> Originally Posted by *piee*
> 
> gota ask yourself is 400-$500 upgrade from TXP to TXp worth 5-8 fps, then gotta ask yoself do you feel lucky.....well do you?


best to just stick to asking _yourself_.


----------



## Sheyster

Quote:


> Originally Posted by *Jpmboy*
> 
> ... Fischer-Price watercooling.


Indeed. A good compromise might be an EK Predator 360 with QDC and a pre-filled block; pretty damn easy and much better results too:


----------



## axiumone

Quote:


> Originally Posted by *Jpmboy*
> 
> baasha, how did you get SLi to work ? and ~ 75% scaling at that...


It's not complicated. Just tweak the existing SLI profiles from 2 GPUs to 3-4. Sometimes there will be custom SLI bits floating around to get the game engine to see more than two cards. You'll obviously need to use one of the old SLI bridges as well.

http://www.forum-3dcenter.org/vbulletin/showthread.php?t=509912

Above link has a lot of those sli bits available.


----------



## Jpmboy

Quote:


> Originally Posted by *Sheyster*
> 
> Indeed. A good compromise might be an EK Predator 360 with QDC and a pre-filled block; pretty damn easy and much better results too:


that is a very good product, and expandable.








Quote:


> Originally Posted by *axiumone*
> 
> It's not complicated. Just tweak the existing sli profiles from 2 GPU's to 3-4. Sometimes there will be custom sli bits floating around to get the game engine to see more than two cards. You'll obviously need to use one of the old sli bridges as well.
> 
> http://www.forum-3dcenter.org/vbulletin/showthread.php?t=509912
> 
> Above link has a lot of those sli bits available.


Thanks! I'll take a look. Why the old SLI bridge??
... more than 2 cards? It only loads one.


----------



## xTesla1856

Quote:


> Originally Posted by *Jpmboy*
> 
> that is a very good product, and expandable.


Great product, until they discontinued it due to massive leak issues. The Predators are EOL, which is a shame since I loved mine to bits whilst I had it. Now on to my custom loop.


----------



## jsutter71

Quote:


> Originally Posted by *axiumone*
> 
> Purchased delided. Although if you look really hard, I heard you can find the tool der8auer made. I think he released about 75 of them through private sales. I've seen the process and I didn't feel like taking a razor to the ihs and the die to scrape off the solder, as I'm prone to messing stuff up. Feel like purchasing a ready chip was the better way. It's not like deliding a 115x chip, those are much easier and I've done a few of them.


Zero benefit to delidding my 6950X. It's just a matter of personal preference, but after years of air cooling I love the clean look of my system without all the huge fans. It's also more than that: my motherboard temps and drives stay cool, and my cables route easier.


----------



## axiumone

Quote:


> Originally Posted by *Jpmboy*
> 
> thanks! i'll take a look. Why the old SLI bridge??
> ... more than 2 cards? it only loads one.


Well, the new HB bridges are limited to 2 cards only; there are no other configs. The previous-generation Maxwell high-frequency bridges go up to 4-GPU configs.
Quote:


> Originally Posted by *jsutter71*
> 
> Zero benefit to delidding my 6950X. It's just a matter of personal preference, but after years of air cooling I love the clean look of my system without all the huge fans. It's also more than that: my motherboard temps and drives stay cool, and my cables route easier.


Can't argue on looks. A nicely laid out wc system will always look awesome.


----------



## Baasha

Quote:


> Originally Posted by *Jpmboy*
> 
> baasha, how did you get SLi to work ? and ~ 75% scaling at that...


Create your own custom profile in Inspector - PM me and I can send you the profile via email.









Also, the cards were scaling at ~98 - 99% across all 4 - not 75%









Btw, turning on DOF *wrecks* performance. I went from avg 121 fps in 8K to like 90 fps after turning it on.
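For anyone wondering how these "X% scaling" figures are derived: it's the multi-GPU frame rate relative to a perfect n×-single-card result. A small sketch with hypothetical numbers (the 31/121 fps pair is illustrative, not a measurement from this thread):

```python
# SLI scaling efficiency: measured multi-GPU fps relative to a
# perfect n-times-single-card result.

def scaling_efficiency(single_fps: float, multi_fps: float, n_gpus: int) -> float:
    return multi_fps / (single_fps * n_gpus)

# Hypothetical: one card at 31 fps, a 4-way rig at 121 fps.
print(round(scaling_efficiency(31, 121, 4), 3))  # ~0.976, i.e. ~97.6% scaling
```

The disagreement upthread (75% vs 98-99%) comes down to what single-card baseline each person assumed, so stating the baseline fps matters as much as the multi-GPU number.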


----------



## jsutter71

Quote:


> Originally Posted by *axiumone*
> 
> Well, the new HB bridges are limited to 2 cards only, there's just no other configs. The previous generation maxwell high frequency bridges go up to 4 gpus in configs.
> 
> Can't argue on looks. A nicely laid out wc system will always look awesome.


Thank you sir....


----------



## Jpmboy

Quote:


> Originally Posted by *axiumone*
> 
> *Well, the new HB bridges are limited to 2 cards only, there's just no other configs.* The previous generation maxwell high frequency bridges go up to 4 gpus in configs.
> 
> Can't argue on looks. A nicely laid out wc system will always look awesome.


Yeah... I only have two cards each (TXPs and TXps).
I'm just trying to get Superposition to use more than one card... can't believe Unigine released this benchmark. Slightly better than a POS.


----------



## xarot

Quote:


> Originally Posted by *jsutter71*
> 
> Zero benefit to delidding my 6950X. It's just a matter of personal preference, but after years of air cooling I love the clean look of my system without all the huge fans. It's also more than that: my motherboard temps and drives stay cool, and my cables route easier.


Almost too clean; my eyes can't find an SLI bridge?







Anyway looks very nice!
Quote:


> Originally Posted by *xTesla1856*
> 
> Great product until they discontinued it due to massive leak issues. The Predators are EOL, which is a shame since I loved mine to bits whilst I had it. Now on to my custom loop


I like to tighten the fittings myself, after checking individual parts...either Asetek AIO or full custom loop only.


----------



## jsutter71

Quote:


> Originally Posted by *xarot*
> 
> Almost too clean, my eyes can't find a SLI bridge?
> 
> 
> 
> 
> 
> 
> 
> Anyway looks very nice!
> I like to tighten the fittings myself, after checking individual parts...either Asetek AIO or full custom loop only.





First pic was taken with my iphone and the second with my EOS 5DM3


----------



## jsutter71

https://www.newegg.com/Product/Product.aspx?item=N82E16820232349
Just bought some new ram this morning that will be arriving tomorrow. Racers start your engines. Looking forward to some new overclocking and benchmarking.


----------



## jsutter71

How to break 34,000 in Firestrike... it's all about the memory.


----------



## Gunslinger.

There's more to it than that.


----------



## jsutter71

Quote:


> Originally Posted by *Gunslinger.*
> 
> There's more to it than that.


OK... so without any exotic cooling, what do you suggest?


----------



## Gunslinger.

Nvidia Inspector, properly set up driver, nice tight memory timings.

https://www.google.com/#q=nvidia+inspector


----------



## jsutter71

Quote:


> Originally Posted by *Gunslinger.*
> 
> Nvidia Inspector, properly set up driver, nice tight memory timings.
> 
> https://www.google.com/#q=nvidia+inspector


So that was it? How are you cooling your system?


----------



## piee

Looks like the TXp gets about 4-7 fps over the TXP; maybe they will unlock voltage before AMD releases HBM2 cards.


----------



## Jpmboy

Quote:


> Originally Posted by *Gunslinger.*
> 
> There's more to it than that.


HOF valid or tess tweak?








Quote:


> Originally Posted by *jsutter71*
> 
> Ok...So *without any exotic cooling* what do you suggest


now why would you ask gunney that question?


----------



## Sheyster

Quote:


> Originally Posted by *Jpmboy*
> 
> now why would you ask gunney that question?


LOL, his LN2 tank is probably as big as my water heater.


----------



## Silent Scone

Firestrike is super sensitive to system memory timings also. I gained almost 1,000 points at the same frequency over some users in the past.

I really hate synthetic tests, though!

First run of Superposition using 24/7 settings

6850 @ 4.4
3200C14
TXP(3584) @ 2100/5310


----------



## Jpmboy

Quote:


> Originally Posted by *Silent Scone*
> 
> Firestrike is super sensitive to system memory timings also. I gained almost 1,000 points at the same frequency over some users in the past.
> 
> I really hate synthetic tests, though!
> 
> First run of Superposition using 24/7 settings
> 
> 6850 @ 4.4
> 3200C14
> TXP(3584) @ 2100/5310


wait.. wut? you picked up a PASCAL CARD??


----------



## xfachx

Hey guys,

I have been having a lot of trouble overclocking my Titan X Pascal card. On the latest drivers. Using a 6600K without an OC at the moment and have 32GB of RAM. Using an Asus Maximus Gene motherboard. I have my whole system running under a custom water cooled loop and load temps never exceed 60C. Think that covers it.

So basically, any change I make in Precision or Afterburner, regardless of how small the increments, leads to issues. For example, I will increase both GPU and mem by 50 points, set the power limit to 120, and max the temp target out to 90. I also maxed out the voltage slider on the left. I can see the changes in GPU-Z after adjusting, and then I try to test it out.

Any time I run FurMark or boot up a game it works for 2 seconds, freezes for a few moments, then crashes to desktop and then defaults all the settings in Precision to stock levels.

Any idea what I may be doing wrong? I am really confused as to what's happening.


----------



## axiumone

That's a bummer.


----------



## xfachx

Quote:


> Originally Posted by *axiumone*
> 
> That's a bummer.


Lol. Post updated! Keyboard flipped out!


----------



## jsutter71

Quote:


> Originally Posted by *Sheyster*
> 
> LOL, his LN2 tank is probably as big as my water heater.


No doubt


----------



## jsutter71

I'm not getting any warm fuzzies about this benchmark. I'm basically smoking FS and TS, relatively speaking, but this one is a different story. BTW, I have the latest drivers and Windows updates, with no issues overclocking. A clean install of the Creators Update may be the reason.


----------



## Glerox

Sup guys. I bought a second Titan XP to make an SLI build.
I'm thinking about the cpu I want and I can't find the answer.

What are your thoughts on SLI between 7700k @ 5GHz with 4000MHz RAM @ PCI-e x8/x8 VS something like a 6850k @ 4.2-4.4GHz with 3000MHz RAM @ PCI-e x16/x16?
I wonder if the extra single core speed and RAM speed compensate for the loss of PCI-e lanes.

It's only for gaming at 4k and light workload.

Thanks!


----------



## mitcHELLspawn

I can personally attest to the fact that anything more powerful than a 1080 can completely saturate PCIe Gen3 x8 bandwidth. I've personally run SLI for quite a few years: I went from SLI 780 Tis to SLI 980s to SLI Titan X Hybrids (Maxwell) to SLI GTX 1080s, and finally to just a single 1080 Ti.

I originally was running a 6700K at 4.8GHz with my SLI 1080s, and I was noticing some serious stutters and very bad scaling in games that had great scaling with my previous cards. I had a feeling the cards were getting close to that cut-off point, and I know for sure that in some games (definitely not all) the 1080s were being held back by x8/x8 and only 4 cores. I made the switch to a 6850K overclocked to 4.5GHz and immediately saw my scaling improve in the games that were lagging, and I can guarantee the bandwidth problem will only be compounded with something as powerful as a Titan XP.

But can I just say that I truly don't believe you should spend the money on a second card. Hear me out. I've been a huge SLI user for a long time, and back in 2014 and 2015 the scaling was pretty decent; not only that, but around 90+% of AAA games at the time had at least some form of SLI support. Then the 10 series released, and we saw Nvidia drop support for anything over dual cards, with no SLI at all for the 60 series. Most of us thought this was a good sign, as it would give targeted attention to dual-card setups. Unfortunately it truly wasn't... it was the beginning of the end. In 2016 and so far in 2017, SLI support has taken an insane drop-off in games. Last year around 40% of AAA games had support, and this year is looking just as bad. The worst thing is, not only are games not supporting it, but we're seeing negative scaling in a lot of games and tons of extra stutter and issues.

Now don't get me wrong: if you play a lot of AAA games from 2015 and older, it could be good for you. But if you're like me and you pick up pretty much all the newest AAA games, you're in for a terrible time. So much more time gets spent looking for fixes and messing with Nvidia Inspector etc. than actually sitting down and enjoying the game. I know, because I've been there. I realized I was spending all this time trying to get the game to run the way I want instead of just being able to enjoy it.

All I can say is, since I switched to a single 1080 Ti and overclocked it heavily, my gaming experience has been incredible compared to before. Games just work and run beautifully, and I don't have to always wonder whether my cards are getting usage or scaling well.

If you really want some extra performance, maybe drop the Titan X Pascal and pick up the new Titan Xp. It should be an extra ~10% performance, which unfortunately is, on average, all you're going to get from a second card anyway... with a few exceptions of course... but then there are also the exceptions with performance worse than a single card ;/
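For context on the x8/x8 claims above: PCIe Gen3 runs 8 GT/s per lane with 128b/130b encoding, so the per-direction ceilings work out as below (a rough figure that ignores packet and protocol overhead):

```python
# Usable PCIe Gen3 bandwidth per direction, from lane count.
# Gen3 signals at 8 GT/s per lane with 128b/130b encoding.

def pcie3_bandwidth_gbps(lanes: int) -> float:
    """Approximate GB/s per direction, ignoring protocol overhead."""
    gt_per_s = 8.0
    encoding = 128 / 130          # 128b/130b line code efficiency
    return lanes * gt_per_s * encoding / 8  # bits -> bytes

print(round(pcie3_bandwidth_gbps(8), 2))   # x8  -> ~7.88 GB/s
print(round(pcie3_bandwidth_gbps(16), 2))  # x16 -> ~15.75 GB/s
```

So dropping from x16 to x8 halves the ceiling to roughly 7.9 GB/s each way; whether a given game actually hits that ceiling is the part that needs measurement, as noted later in the thread.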


----------



## trippinonprozac

Anyone had issues with gsync and SLI?

I am having a weird issue since using DDU: when G-Sync is on in NVCP, my GPU usage is about 50-70% on each GPU. Setting it to fixed refresh rate fixes the issue and usage returns to normal.

All testing was done with vsync turned off.


----------



## mitcHELLspawn

Yeah, that's actually been a pretty big issue for quite a while now, unfortunately :/ If you go over to the Nvidia forums, there are lots of angry G-Sync users like myself for whom G-Sync totally breaks SLI. It's pretty crappy, but it's only another reason why I stand behind my previous post :/


----------



## trippinonprozac

Quote:


> Originally Posted by *mitcHELLspawn*
> 
> Yeah, that's actually been a pretty big issue for quite a while now unfortunately:/ if you go over onto the nvidia forums theres lots of angry gsync users like myself where gsync totally breaks sli. Its pretty crappy, but only another reason why I stand behind my previous post:/


Do you just turn Gsync off so you get the most out of your cards or leave it on and suffer with lessened performance?


----------



## mitcHELLspawn

I actually don't do either anymore lol, I switched to a single card and haven't looked back.

But as far as what I did: it depended on the situation/game and whether I needed the performance or not. I game at either 4K or 3440x1440 @ 100Hz G-Sync, and I've actually found I've been able to get everything I need with a water-cooled, overclocked 1080 Ti.

I know it's not a great answer for your situation, and I'm not sure how knowledgeable you are with SLI, but if you're playing a game from 2016 or 2017 there's a pretty good chance you're not getting very much from that second card, if anything at all.


----------



## trippinonprozac

Quote:


> Originally Posted by *mitcHELLspawn*
> 
> I actually don't do either anymore lol I switched to single card and haven't looked back.
> 
> But as far as what I did, it depended on the situation/game and whether I needed the performance or not. I game at either 4k or 3440x1440 @ 100hz gsync and ive actually found ive been able to get everything I need with a water cooled over clocked 1080Ti.
> 
> I know its not a great answer for your situation, but im not sure how knowledgeable you are with SLI but if you're playing a game from 2016 or 2017 theres a pretty good chance you're not getting very much from that second card, if anything at all.


I know plenty about both SLI and G-Sync, but only recently picked up the second Titan. I too game at 3440x1440 @ 100Hz, so there were a few games I wanted the second card for, plus benchmarking.

You are right though, 1 card serves you pretty damn well at that res.


----------



## Silent Scone

Quote:


> Originally Posted by *Jpmboy*
> 
> wait.. wut? you picked up a PASCAL CARD??


Nope! Well, same one I've had since August lol

I'm going SLI with a pair of AIB 1080s soon to drive the FOVE. Need the grunt, and not willing to pay out for two Henry VIII (fat kings).
Quote:


> Originally Posted by *mitcHELLspawn*
> 
> I can personally attest to the fact that anything more powerful than a 1080 can completely saturate the pcie gen 3 x8 bandwith. I've personally ran sli for quite a few years. I went from sli 780ti to sli 980 to sli TitanX hybrids (maxwell) to sli gtx 1080s and finally to just a single 1080Ti.
> 
> I originally was running a 6700k at 4.8ghz with my SLI 1080s and I was noticing some serious stutters and very bad scaling in games that had great scaling with my previous cards. I had a feeling that the titans were getting close to hitting that cut off point and I know for sure in some games, definitely not all the 1080s were being held back by x8x8 and only 4 cores. I made the switch to a 6850k and overclocked to 4.5ghz and immediately saw my scaling improve in the games that were lagging.. and i can guarantee the bandwith problem will only be compounded with something as powerful as a titan XP.
> 
> But can I just say that I truly don't believe you should spend the money on a second card. Hear me out. I've been a huge SLI user for a long time, and back in 2014 and 2015 the scaling was pretty decent, not only that but we had around 90+% of AAA games at the time with at least some form of SLI support. Then the 10 series released, and we saw nvidia drop anything over dual cards, and no sli for 60 series. Most of us thought this was a good sign, as it would give targeted attention to the dual cards. Unfortunately it truly wasn't... it was the beginning of the end. In the year 2016 and so far in 2017 sli support has taken an insane drop off in games supported. Last year we had around 40% AAA games with support and this year is looking just as bad. The worst thing is, not only are games not supporting it but we're seeing negative scaling in a lot of games and tons of extra stutter and issues.
> 
> Now don't get me wrong, if you play a lot of AAA games from 2015 and older it could be good for you, but if you're like me and you pick up pretty much all the newest AAA games, youre in for a terrible time. So much more time spent on looking for fixes and messing with nvidia inspector etc etc than actually sitting down and enjoying the game. I know because I've been there. I realized I was spending all this time trying to get the game to run the way i want instead of being able just to enjoy it.
> 
> All I can say is since I switched to a single 1080Ti and overclocked it heavily, my gaming experience has just been so incredible compared to before. Games just work, and run beautifully, and I don't have to always be wondering if my cards are getting usage or if it's scaling well etc...
> 
> If you really want some extra performance, maybe drop the titan X pascal and pick up the new titan Xp. It should be an extra 10% ish performance which unfortunately is really on average all you're going to get with 2 cards... with a few exceptions of course... but then there's also the other exceptions with performance worse than a single card ;/


That's an awfully long post with no mention of resolution or display type. For the FOVE on arrival, I'd probably say one 1080 is minimum...

Also, your claims of bus saturation would need to be backed up with some data.
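For context, the raw numbers behind the x8 vs x16 debate are easy to sketch. PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding; the figures below ignore protocol overhead, so real-world throughput is a bit lower still. A rough sketch, not a measurement:

```python
# Rough per-direction PCIe 3.0 bandwidth, ignoring protocol overhead.
GT_PER_LANE = 8.0     # giga-transfers/s per lane (PCIe 3.0)
ENCODING = 128 / 130  # 128b/130b line-encoding efficiency

def pcie3_gbps(lanes: int) -> float:
    """Approximate usable GB/s for a PCIe 3.0 link with `lanes` lanes."""
    return lanes * GT_PER_LANE * ENCODING / 8  # bits -> bytes

print(f"x8:  {pcie3_gbps(8):.2f} GB/s")   # ~7.88 GB/s
print(f"x16: {pcie3_gbps(16):.2f} GB/s")  # ~15.75 GB/s
```

So x8 halves the available link bandwidth, but whether a game actually saturates even that is exactly the data that's missing from the discussion.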


----------



## mitcHELLspawn

My resolution? I run both a 4K HDR TV and a Predator X34... and as far as the data, I did the testing for myself. I'm in no way trying to convince you or anyone lol... although the hardware reviewer for PC Gamer has done the testing as well and come to the same conclusion. He has given his findings in a few reviews as well as quite a few comment sections on the site. Unfortunately I'm not going to try to find that tonight lol... if you want to try to SLI two cards as powerful as the 1080 or higher on a quad core with 16 PCIe lanes, have at it buddy lol... just don't say I didn't warn you when your gaming experience is garbage.

Anyway, I was just giving advice from my own (very thorough) experience. Feel free to listen to it, or don't.


----------



## Silent Scone

Quote:


> Originally Posted by *mitcHELLspawn*
> 
> My resolution? I run both a 4k HDR TV and a predator x34... and as far as the data, I personally did the testing for myself. I'm in no way trying to convince you or anyone lol..although the hardware reviewer for pc gamer has actually done the testing as well and come back with the same conclusion. He has given his findings in a few reviews as well as quite a few comments sections on the site. Unfortunately I'm not going to be trying to find that tonight lol... if you want to try to SLI 2 cards as powerful as the 1080 or higher with a quad core with 16 pcie lanes you can have at it buddy lol... just don't say I didn't warn you when your gaming experience is garbage.
> 
> Anyway, I was just giving advice from my own (very thorough) experience. Feel free to listen to it, or don't.


One Titan/Ti card for an X34 is sufficient, sure. I have both the card and the screen here. For VR on the FOVE, one won't cut the mustard at all. On the bus saturation, again, I'd need to see data. You're describing more of a CPU bottleneck than any lane saturation.


----------



## piee

I just use Vsync with no buffer and lock the in-game fps to the monitor's refresh rate: no delay, smooth, and less stress on the graphics card. You can also use Fast Sync if you want. Got a Qnix 32" 4K, perfect IPS, 10-bit, nice.


----------



## Jpmboy

Quote:


> Originally Posted by *Silent Scone*
> 
> Nope! Well, same one I've had since August lol
> 
> I'm going SLI with a pair of AIB 1080s soon to drive the FOVE. Need the grunt, and not willing to pay out for two Henry VIII (fat kings).
> That's an awfully long post with no mention of resolution or display type. For the FOVE on arrival, I'd probably say one 1080 is minimum...
> 
> Also, your claims of bus saturation would need to be backed up with some data.


what's a FOVE?


----------



## Silent Scone

Quote:


> Originally Posted by *Jpmboy*
> 
> what's a FOVE?


https://www.getfove.com/?gclid=Cj0KEQjwicfHBRCh6KaMp4-asKgBEiQA8GH2xx-WmMWKsiuoJRHH367JEN2qBHp2I7-TkOSwS11C6eMaAgcZ8P8HAQ


----------



## Jpmboy

Quote:


> Originally Posted by *Silent Scone*
> 
> https://www.getfove.com/?gclid=Cj0KEQjwicfHBRCh6KaMp4-asKgBEiQA8GH2xx-WmMWKsiuoJRHH367JEN2qBHp2I7-TkOSwS11C6eMaAgcZ8P8HAQ


very cool... does it come with a box of airsick bags?


----------



## KillerBee33

Quote:


> Originally Posted by *Jpmboy*
> 
> very cool... does it come with a box of airsick bags?


You get used to VR after about 10 hours of use.
But I had a friend who was throwing up after 30 minutes.


----------



## Glerox

Quote:


> Originally Posted by *mitcHELLspawn*
> 
> I can personally attest to the fact that anything more powerful than a 1080 can completely saturate the pcie gen 3 x8 bandwith. I've personally ran sli for quite a few years. I went from sli 780ti to sli 980 to sli TitanX hybrids (maxwell) to sli gtx 1080s and finally to just a single 1080Ti.
> 
> I originally was running a 6700k at 4.8ghz with my SLI 1080s and I was noticing some serious stutters and very bad scaling in games that had great scaling with my previous cards. I had a feeling that the titans were getting close to hitting that cut off point and I know for sure in some games, definitely not all the 1080s were being held back by x8x8 and only 4 cores. I made the switch to a 6850k and overclocked to 4.5ghz and immediately saw my scaling improve in the games that were lagging.. and i can guarantee the bandwith problem will only be compounded with something as powerful as a titan XP.
> 
> But can I just say that I truly don't believe you should spend the money on a second card. Hear me out. I've been a huge SLI user for a long time, and back in 2014 and 2015 the scaling was pretty decent, not only that but we had around 90+% of AAA games at the time with at least some form of SLI support. Then the 10 series released, and we saw nvidia drop anything over dual cards, and no sli for 60 series. Most of us thought this was a good sign, as it would give targeted attention to the dual cards. Unfortunately it truly wasn't... it was the beginning of the end. In the year 2016 and so far in 2017 sli support has taken an insane drop off in games supported. Last year we had around 40% AAA games with support and this year is looking just as bad. The worst thing is, not only are games not supporting it but we're seeing negative scaling in a lot of games and tons of extra stutter and issues.
> 
> Now don't get me wrong, if you play a lot of AAA games from 2015 and older it could be good for you, but if you're like me and you pick up pretty much all the newest AAA games, youre in for a terrible time. So much more time spent on looking for fixes and messing with nvidia inspector etc etc than actually sitting down and enjoying the game. I know because I've been there. I realized I was spending all this time trying to get the game to run the way i want instead of being able just to enjoy it.
> 
> All I can say is since I switched to a single 1080Ti and overclocked it heavily, my gaming experience has just been so incredible compared to before. Games just work, and run beautifully, and I don't have to always be wondering if my cards are getting usage or if it's scaling well etc...
> 
> If you really want some extra performance, maybe drop the titan X pascal and pick up the new titan Xp. It should be an extra 10% ish performance which unfortunately is really on average all you're going to get with 2 cards... with a few exceptions of course... but then there's also the other exceptions with performance worse than a single card ;/


Thanks for your detailed answer. Unfortunately I have already bought the second TXP. I did it for the upcoming 4k144hz monitor.

I'll think about it, maybe I'll return my z270 combo for X99. Still in the box.


----------



## mitcHELLspawn

LOL, you thought SLI in regular games was bad? Good luck with SLI in VR. I've owned the Oculus Rift since launch and have the triple-sensor roomscale Touch setup, and I absolutely love VR... but if you think you're going to get literally ANY use out of the second card in 99% of VR games, unfortunately you are delusional. I've had SLI the entire time I've had the Oculus and I have played a ton of games, first the Titan X hybrids and then the GTX 1080s, and any time I play VR the second card just sits there looking pretty... so yeah, FOVE or not, the second card is still useless.

VR developers are operating on extremely tight budgets right now. They literally have zero dollars in the budget to develop for a niche market of a niche market...


----------



## Jbravo33

Only thing holding me back on 3DMark is the CPU. The card is a beast: got two good ones, but the top one is special. Best Superposition so far. Want to upgrade the CPU, but X299 shouldn't be very far out.


----------



## skypine27

This is probably a dumb question:

Are the Titan X (Pascal) and the new Titan Xp the same physical specs? I want to know if my EK water blocks for the Titan X (Pascal) will fit the new Titan Xp.

I'm about to order 2 of them if they do.

Thx


----------



## deafboy

Quote:


> Originally Posted by *skypine27*
> 
> This is probably a dumb question:
> 
> Are the Titan X (Pascal) and the new Titan Xp the same physical specs? I want to know if my EK water blocks for the Titan X (Pascal) will fit the new Titan Xp.
> 
> I'm about to order 2 of them if they do.
> 
> Thx


Yes, blocks will work...


----------



## skypine27

Quote:


> Originally Posted by *deafboy*
> 
> Yes, blocks will work...


Thanks!


----------



## vmanuelgm

Anyone tried to flash TXp bios in TXP or vice versa???










The TXP block fits the new TXp perfectly.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *vmanuelgm*
> 
> Anyone tried to flash TXp bios in TXP or vice versa???


Doesn't work, says gpu mismatch. Tried a bunch of 1080ti bios' same thing.


----------



## vmanuelgm

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Doesn't work, says gpu mismatch. Tried a bunch of 1080ti bios' same thing.


Thanks, mate!!!

Real pity!!!


----------



## Silent Scone

Quote:


> Originally Posted by *Jpmboy*
> 
> very cool... does it come with a box of airsick bags?


lol! Have you tried VR yet? I suffer from motion sickness at times, haven't really had that with VR.


----------



## Silent Scone

https://www.facebook.com/NVIDIAGeForceIT/videos/10155256213219626/


----------



## xTesla1856

Quote:


> Originally Posted by *Silent Scone*
> 
> https://www.facebook.com/NVIDIAGeForceIT/videos/10155256213219626/


It's for a gaming gathering event in southern Europe in collaboration with Asus ROG.


----------



## Silent Scone

You hope


----------



## MrTOOSHORT

Representing the old dog TXP cards...


----------



## Silent Scone

Chilly


----------



## lanofsong

Hey Titan X Pascal owners,

We are having our monthly Foldathon from Monday 17th - Wednesday 19th - 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

April 2017 Foldathon

BTW - make sure you sign up









To get started:

1. Get a passkey (allows for the speed bonus) - you need a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## Baasha

you guys want to test your GPUs? Play this in 8K 60fps!


----------



## DooRules

I was able to average 71.59 with just two, with SLI working correctly. With 4 it must be smooth as butter.


----------



## Jpmboy

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Doesn't work, says gpu mismatch. Tried a bunch of 1080ti bios' same thing.


which nvflash?
Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Representing the old dog TXP cards...


"old dog"... lol. not.


----------



## MrTOOSHORT

I'm using 5.319. Other versions, including the newest, just give some WoW64 error.


----------



## Jpmboy

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> I'm using 5.319. Other versions, including the newest just says some WoW64 error.


ugh - I tried the same and a few versions back (to 5.227). I don't think we can unlock CUDA cores; probably failed cores anyway, considering the yields on this die. Could be cut, but that would be one less $5000 Tesla.


----------



## Glerox

I swapped my poor-overclocker 6800K (4.0 GHz, RAM 3000 MHz) for a lucky-winner 6850K (4.4 GHz, RAM 3200 MHz) because I'm planning SLI and need the extra PCIe lanes.
It allowed me to beat my benchmark records.








Still rocking that "old" Titan XP in the Firestrike Ultra hall of fame!



core +225
vram +550
temp around 40 Celsius

Can't wait to try my second TXP


----------



## deafboy

Waiting on the block to arrive.... ahhh, can't wait to install this thing, haha.


----------



## bouncingsoul

My best result so far... 13034 in Firestrike Ultra

http://www.3dmark.com/fs/12385703

One Titan is still on the stock cooler, and the CPU is not yet delidded, so there's still lots of work to do.
The biggest limit right now is the PSU. It's hitting its limit hard... I can even hear noises from it.


----------



## Glerox

What's your psu?

I just installed my second TXP also.
My psu is 1000w. I think it's a bit short for overclocking...
I ordered a 1200w.


----------



## bouncingsoul

Mine has 630W... I'm looking forward to upgrading to 900 or 1000W max. :-D


----------



## Glerox

Quote:


> Originally Posted by *bouncingsoul*
> 
> Mine has 630W... I'm looking forward to upgrading to 900 or 1000W max. :-D


Jesus! Way too low.
A maxed-out OC on a Titan XP can draw 400W of power... x2 = 800W.
I also have an overclocked 6850K, which can take around 150W...
Plus the other parts, plus the pump... and you want to leave some headroom, because PSUs are more efficient when they run below 80% of their rated output...
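The headroom math from the discussion above can be sketched quickly. The per-component numbers are the estimates claimed in this thread (worst-case 400W per overclocked card, ~150W CPU), not measurements of any specific rig, and the 100W for everything else is a rough guess:

```python
# Hedged system power budget for dual overclocked Titan X Pascals.
gpus = 2 * 400  # worst-case per-card draw claimed at 120% power limit
cpu = 150       # overclocked 6850K estimate
rest = 100      # motherboard, drives, fans, pump (rough guess)

load = gpus + cpu + rest
psu = load / 0.8  # keep the PSU below ~80% of its rating

print(load, "W load ->", psu, "W PSU minimum")
```

By that arithmetic a 1000W unit is borderline for a fully overclocked dual-TXP rig, which matches the upgrade decisions people are making in this thread.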


----------



## bouncingsoul

Which one exactly did you order?

I was estimating that each Titan would need around 300W and another 300W for the CPU and everything else, so 1000W should be plenty.
Does the Titan draw more than 300W with that restrictive power target?


----------



## Glerox

I've seen people with powermeters measuring the TXP with 400w power draw because you increase the power limit to 120%

I ordered the EVGA P2 1200w


----------



## JrClocker

I have 2 Titan X Pascal cards (overclocked to 2050 MHz each) and a 5820K (overclocked to 4.3 GHz).

When everything is pulling hard, it draws 800W from the UPS with an EVGA 1000W T2 PSU.

Be careful with the P2 1200W if you are running through a UPS. The inrush current is high and could trip the UPS. This is why I went with the T2 1000W.


----------



## jsutter71

I bought an EVGA T2 1600 when I was running 980 Tis in triple SLI because I did not want to take any chances. Not that I need that much now, but I have noticed a significant improvement in overclocking and stability by not cutting corners. An example from a previous post: using a single power cable with both a 6-pin and an 8-pin connector versus an individual cable per connection. When I used a single cable for both connections, my system would frequently lock up at any slight attempt at overclocking, and not just the Titans but also my CPU and system memory.


----------



## Glerox

Quote:


> Originally Posted by *JrClocker*
> 
> I have 2 Titan X Pascal cards (overclocked to 2050 MHz each) and a 5820 K (overclocked to 4.3 GHz).
> 
> When everything is pulling hard, it draws 800 W from the UPS with an EVGA 1000W T2 PSU.
> 
> Be careful with the P2 1200 W if you are running through a UPS. The inrush current is high and could trip the UPS. This is why I went to the T2 1000 W.


I'm not sure what a UPS is apart from meaning "uninterruptible power supply" so I'm probably not using one


----------



## PowerK

Quote:


> Originally Posted by *Glerox*
> 
> I've seen people with powermeters measuring the TXP with 400w power draw because you increase the power limit to 120%
> 
> I ordered the EVGA P2 1200w


There's no way these cards (250W TDP) draw 400W of power.
With power limit set to max. (120%), we are talking about 300W for each card.


----------



## Jbravo33

Quote:


> Originally Posted by *Glerox*
> 
> I changed my poor overclocker 6800k (4 GHz RAM 3000 MHz) for a lucky winner 6850k (4.4 GHz RAM 3200 Mhz) because I'm planning SLI and need the extra PCI-e lanes.
> It allowed me to beat my benchmarks record
> 
> 
> 
> 
> 
> 
> 
> 
> Still rocking that "old" Titan XP on firestrike ultra hall-of-fame!
> 
> 
> 
> core +225
> vram +550
> temp around 40 celcius
> 
> Can't wait to try my second TXP


You'll enjoy the 6850K. I think it's a great balance between quad-core and higher-core-count CPUs, and if it overclocks well, single-threaded performance is above the 6900K and 6950X and right below the 7700K.


----------



## axiumone

Quote:


> Originally Posted by *PowerK*
> 
> There's no way these cards (250W TDP) draw 400W of power.
> With power limit set to max. (120%), we are talking about 300W for each card.


TDP ≠ Power draw.


----------



## MrTOOSHORT

Had the shunt mod on for a few days. Used CLU, but after some researching, it doesn't look like it's a good idea as people have said it eats solder. So I cleaned it off. Now back to the throttling for now at least.









Might try the shunt mod again with conductonaut, might not as the cold season is over weather wise.


----------



## JrClocker

I have 2 Titan X Pascal running on a 5820 K machine...both Titans are overclocked to 2050 MHz
Quote:


> Originally Posted by *Glerox*
> 
> I'm not sure what a UPS is apart from meaning "uninterruptible power supply" so I'm probably not using one


Yeah - UPS = Uninterruptible Power Supply.

I leave my PCs on 24/7...so a UPS is a must for me!


----------



## deltamono

Keep in mind the maximum wattage of the UPS. Most manufacturers advertise it in volt-amperes (VA), which is not the same as watts; usable watts are about 60 to 70 percent of the total volt-amperes the UPS supports. That means if your computer consumes 600W at full load (assuming the PSU has about 200W of headroom at maximum load), you need a UPS of at least 1000VA.
It's also important to choose a UPS with line-interactive or online topology. Those features extend battery life and attenuate fluctuations in the input voltage.
Buying a good UPS is worth it, but it isn't cheap.
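The VA-to-watts rule of thumb above is just a division; the 0.6 power factor below is the conservative end of the 60-70% figure from the post, not a universal spec (check your UPS's rated power factor):

```python
# UPS sizing: vendors rate capacity in volt-amperes (VA);
# usable watts = VA x power factor.
def min_va(load_watts: float, power_factor: float = 0.6) -> float:
    """Minimum UPS VA rating for a given wattage at an assumed power factor."""
    return load_watts / power_factor

print(min_va(600))  # 600 W load -> 1000.0 VA at PF 0.6
```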


----------



## PowerK

Quote:


> Originally Posted by *axiumone*
> 
> TDP ≠ Power draw.


Sure. Let me put it this way then.
8pin = 150W
6pin = 75W
PCI-E = 75W
Total = 300W


----------



## hotrod717

Quote:


> Originally Posted by *PowerK*
> 
> Sure. Let me put it this way then.
> 8pin = 150W
> 6pin = 75W
> PCI-E = 75W
> Total = 300W


Ummm no. If I had a nickel for every time I saw someone post something similar. - http://forum.kingpincooling.com/showthread.php?t=3961

"Often gamers and users are mistakenly referring to 6-pin or 8-pin MiniFit.JR connectors as 75W or 150W capable inputs. Nothing can be further from truth. These power levels are nothing but just way for NV determine how capable is used board hardware in terms of power delivery system. It's imaginary target number and have nothing to do with actual power taken from connector nor power input capability. Software and NV BIOS will handle GPU clocks and reduce voltages if measured power hitting programmed BIOS limit (which can and usuall is different value than 75/150W per connector!).

If you intend to do serious overclocking and benchmarking, it may be required to trick power monitoring circuitry to report lower power reading, so you not run into power throttle. Also to make sure we are not at any physical limit of power connector itself, check Molex 26-01-3116 specifications, which have specifications both 13A *per contact* (16AWG wire in small connector) to 8.5A/contact (18AWG wire).

This means that using common 18AWG cable, 6-pin connector specified for 17A of current (3 contacts for +12V power, 2 contacts for GND return, one contact for detect). 8-pin have 25.5A current specification (3 contacts for +12V power, 3 contacts for GND return and 2 contacts for detection). This is 204W at +12.0V level or 306W for 8-pin accordingly.

Now if somebody tells you that 6-pin can't provide more than 75W, you know they don't understand the topic very well. It's not the connector itself or cable limit the power, but active regulation of GPU/BIOS/Driver according to detection of used cables and preprogrammed limits. So now we getting to know how actual power measured?"
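The arithmetic in that quote is easy to sanity-check: watts available at +12V from the quoted Mini-Fit Jr contact ratings (the amp figures below are the ones the quote gives for 18 AWG wiring, not my own measurements):

```python
# Watts at +12 V from the connector current ratings quoted above.
VOLTS = 12.0
SIX_PIN_AMPS = 17.0    # per the quote: 6-pin connector rating (18 AWG)
EIGHT_PIN_AMPS = 25.5  # per the quote: 8-pin connector rating (18 AWG)

six_pin_watts = SIX_PIN_AMPS * VOLTS      # 204 W
eight_pin_watts = EIGHT_PIN_AMPS * VOLTS  # 306 W
print(six_pin_watts, eight_pin_watts)
```

So the physical connectors can carry far more than the 75W/150W figures in the ATX spec; the spec numbers are what the BIOS power limits are programmed around, not a hard electrical ceiling.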


----------



## PowerK

Quote:


> Originally Posted by *hotrod717*
> 
> Ummm no. If i had a nickel for everytime i saw some post something similar. - http://forum.kingpincooling.com/showthread.php?t=3961
> 
> "Often gamers and users are mistakenly referring to 6-pin or 8-pin MiniFit.JR connectors as 75W or 150W capable inputs. Nothing can be further from truth. These power levels are nothing but just way for NV determine how capable is used board hardware in terms of power delivery system. It's imaginary target number and have nothing to do with actual power taken from connector nor power input capability. Software and NV BIOS will handle GPU clocks and reduce voltages if measured power hitting programmed BIOS limit (which can and usuall is different value than 75/150W per connector!).
> 
> If you intend to do serious overclocking and benchmarking, it may be required to trick power monitoring circuitry to report lower power reading, so you not run into power throttle. Also to make sure we are not at any physical limit of power connector itself, check Molex 26-01-3116 specifications, which have specifications both 13A *per contact* (16AWG wire in small connector) to 8.5A/contact (18AWG wire).
> 
> This means that using common 18AWG cable, 6-pin connector specified for 17A of current (3 contacts for +12V power, 2 contacts for GND return, one contact for detect). 8-pin have 25.5A current specification (3 contacts for +12V power, 3 contacts for GND return and 2 contacts for detection). This is 204W at +12.0V level or 306W for 8-pin accordingly.
> 
> Now if somebody tells you that 6-pin can't provide more than 75W, you know they don't understand the topic very well. It's not the connector itself or cable limit the power, but active regulation of GPU/BIOS/Driver according to detection of used cables and preprogrammed limits. So now we getting to know how actual power measured?"


You should check ATX spec. Those are the limits.


----------



## deafboy

That is the spec, not the limits. You can push more power through all of those connections, they'd just be out of spec at that point.


----------



## arrow0309

Quote:


> Originally Posted by *deafboy*
> 
> That is the spec, not the limits. You can push more power through all of those connections, they'd just be out of spec at that point.


That is correct indeed


----------



## hotrod717

Quote:


> Originally Posted by *deafboy*
> 
> That is the spec, not the limits. You can push more power through all of those connections, they'd just be out of spec at that point.


LOL. Overclocking.


----------



## Jpmboy

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Had the shunt mod on for a few days. Used CLU, but after some researching, it doesn't look like it's a good idea as people have said it eats solder. So I cleaned it off. Now back to the throttling for now at least.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Might try the shunt mod again with conductonaut, might not as the cold season is over weather wise.


Not sure Conductonaut is materially different regarding gallium content (which really is the key component of all these liquid metal mixtures).
IDK - I can't get CLU or CLP to do anything on my TXp... got the PCIe ribbon today so I can lay the card on the MB, but I won't be able to play with it until Mon/Tues (I hope).








Quote:


> Originally Posted by *PowerK*
> 
> You should check ATX spec. Those are the limits.


They are the min spec for those rails and terminals. Each is capable of delivering a lot more than the min spec. It's mainly an issue for the connectors, assuming the cables are of sufficient gauge. I mean, using an EVBOT on 780 Ti Kingpins, the PCIe connectors combined had to deliver nearly 2x this spec, and they did without even getting warm on an IR thermometer.


----------



## bl1tzk1213g

Off topic question here. I'm looking for the follow specs for titan x (pascal 2016)

1. Thermal pad sizes for stock cooler / backplate
2. Backplate screw size (the tiny little ones that hold the backplate)


----------



## Artah

Quote:


> Originally Posted by *bl1tzk1213g*
> 
> Off topic question here. I'm looking for the follow specs for titan x (pascal 2016)
> 
> 1. Thermal pad sizes for stock cooler / backplate
> 2. Backplate screw size (the tiny little ones that hold the backplate)


Not really off topic. 0.5mm thick for the memory and VRM, in strips 16mm wide by 120mm long. For the MOSFETs it's 1mm thick by 24mm wide in a 120mm strip. Same as the TXp 2017. The backplate pads are 0.5mm IIRC. All I can say about the screws is they're REALLY small; they thread into bigger screws with a hex head on top, the same thread size as the standard TXP EKWB screws. I don't have measurements for those, sorry.


----------



## DerComissar

Quote:


> Originally Posted by *deafboy*
> 
> Waiting on the block to arrive.... ahhh, can't wait to install this thing, haha.


Yeah!

That's going to look great in TUCM!









Which block did you order?


----------



## deafboy

Quote:


> Originally Posted by *DerComissar*
> 
> Yeah!
> 
> That's going to look great in TUCM!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Which block did you order?


EK Copper/Acetal. My go-to combo.


----------



## DerComissar

Quote:


> Originally Posted by *deafboy*
> 
> EK Copper/Acetal My go to combo


Good choice.

No nickel flaking worries, nor cracking acrylic.


----------



## PowerK

Quote:


> Originally Posted by *Jpmboy*
> 
> they are the min spec for those rails and terminals. each is capable of delivering a lot more that the min spec. It's mainly an issue for the connectors assuming the cables are sufficient gauge. I mean, using an EVBOT on 780Ti Kingpins, the PCIE combined had to deliver nearly 2x this spec - and they did without even getting warm by IR thermo.


Hi Jpm.
Understood.


----------



## bouncingsoul

Quote:


> Originally Posted by *Glerox*
> 
> I've seen people with powermeters measuring the TXP with 400w power draw because you increase the power limit to 120%
> 
> I ordered the EVGA P2 1200w


Just to make sure the PSU stays relaxed, I went with a Corsair 1200W. Thanks for your input, guys.


----------



## Glerox

Nice, better have more than not enough!


----------



## Jpmboy

Quote:


> Originally Posted by *PowerK*
> 
> Hi Jpm.
> Understood.


Yeah... the days of melting the ATX connector because the graphics subsystem was pulling ALL of its power from the slot are gone (thankfully).


----------



## Artah

Anyone running a single 360 rad on two of these TXPs? I have these on my wife's rig with a push pull setup and she's going past 80c sometimes. Wonder if someone is running this with only that one rad for two GPUs and not having crazy heat.


----------



## Silent Scone

Quote:


> Originally Posted by *Artah*
> 
> Anyone running a single 360 rad on two of these TXPs? I have these on my wife's rig with a push pull setup and she's going past 80c sometimes. Wonder if someone is running this with only that one rad for two GPUs and not having crazy heat.


Xp? Or XP? Believe it or not, there's a separate thread lol.


----------



## Artah

Quote:


> Originally Posted by *Silent Scone*
> 
> Xp? Or XP? Believe it or not, there's a separate thread lol.


XP. I have a pair of Maxwells, a TXP, and a TXp.

The TXP is the one giving off too much heat in my wife's rig, and I've double-cleaned the loop already. I had it on 11x140mm and it was fine overclocked when it was in my rig. Maybe my wife's pump is too weak: the Koolance bay res with 3x120mm.

The Maxwell is in my daughter's rig and it seems to be fine with 5x120mm, but she's got a single D5 pump.


----------



## JrClocker

I'm running each of my Titan X Pascal cards off a 120 mm radiator (AIO kit mod).

They don't go past 46 C.


----------



## Jpmboy

Quote:


> Originally Posted by *Artah*
> 
> Anyone running a single 360 rad on two of these TXPs? I have these on my wife's rig with a push pull setup and she's going past 80c sometimes. Wonder if someone is running this with only that one rad for two GPUs and not having crazy heat.


One 360 really can't handle my two TXPs (when folding, for example). Two 360s and 6 fans keep the hot-side water in the low to mid 30s.


----------



## CptSpig

Quote:


> Originally Posted by *Artah*
> 
> Anyone running a single 360 rad on two of these TXPs? I have these on my wife's rig with a push pull setup and she's going past 80c sometimes. Wonder if someone is running this with only that one rad for two GPUs and not having crazy heat.


I have my CPU and Titan X Pascal on one 360 with great temps. Even though EK says adding another card is OK, I would be skeptical.


----------



## Jusiz

Quote:


> Originally Posted by *Artah*
> 
> XP I have a pair of Maxwell, TXP and TXp.
> 
> The TXP is the one giving off too much heat on my wife's rig and I double cleaned the loop already. I had it on 11x140mm and it was fine overclocked when I had it on my rig. Maybe my wife's pump is too weak with the koolance bay/res with 3x120mm.
> 
> The maxwell is on my daughter's rig and it seems to be fine with 5x120mm but she's got a single DD5 pump.


The titan family.







I think it's either a too-weak pump or bad contact?


----------



## Artah

Quote:


> Originally Posted by *Jusiz*
> 
> The titan family.
> 
> 
> 
> 
> 
> 
> 
> i think too weak pump or bad contacts?


I bought my wife an SMA8 and I'm going to add a second 360; hope that cools it enough.


----------



## Naennon

anyone tried the Xp BIOS on the X Pascal?


----------



## Naennon

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Doesn't work, says gpu mismatch. Tried a bunch of 1080ti bios' same thing.


have you tried it with the force switch? -4 -5 -6 ?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Naennon*
> 
> have you tried it with the force switch? -4 -5 -6 ?


Yes, I have.


----------



## jsutter71

Updated video of my system


----------



## DerComissar

Quote:


> Originally Posted by *jsutter71*
> 
> Updated video of my system


Now that is a class act!

Off to the build log I go....................


----------



## Jpmboy

Quote:


> Originally Posted by *jsutter71*
> 
> Updated video of my system


PC porn.


----------



## Glerox

I can't get Superposition to work in SLI. I imported the SLI profile through Nvidia 3D profile manager but it doesn't work.
It shows both GPUs using 100% but performance is terrible.

Any tricks?


----------



## OZrevhead

With both gpus on water the Titan XP should beat a 1080Ti in benchmarks right? Does the AB curve mod work on TXP?


----------



## Silent Scone

Quote:


> Originally Posted by *OZrevhead*
> 
> With both gpus on water the Titan XP should beat a 1080Ti in benchmarks right? Does the AB curve mod work on TXP?


Don't think so. Depending on achievable clocks, both should be pretty much equal. I'll let you know come next week, waiting for my Strix block. (Assuming you're talking about Pascal Titan X and not Titan Xp)


----------



## OZrevhead

Damn it, I didn't realize that both have 3584 CUDA cores; I thought the Ti had a little fewer.

So it's all silicon lottery.


----------



## Silent Scone

Quote:


> Originally Posted by *OZrevhead*
> 
> Damn it, I didn't realize that both have 3584 CUDA cores; I thought the Ti had a few less.
> 
> So it's all silicon lottery.


Yeah, they're the same part: a snipped memory bus with 11GB, but with the newer GDDR5X ICs. There should be nothing in it clock for clock.


----------



## st0necold

Quote:


> Originally Posted by *jsutter71*
> 
> Everything. Some people, myself excluded are angry because Nvidia just demonstrated that they could release a near identical product at a $500 decrease. These same angry people are posting comments about suing but are receiving negative feedback. IMHO they have the right to feel the way they do, and if they want to sue then it's their right. The subject pertains to the hardware on this thread therefore it is applicable.
> 
> I also want to add that I am not trying to make anyone angry nor step on any toes. I understand the cost of business and love my TXPs. I remember my mom buying me an IBM PCjr when I was 14 years old, and a year later IBM discontinued it. I remember feeling angry and upset at the time, but that was because I was just a kid. The takeaway being that the PC hobby is like buying a car: there will always be something newer and better to replace it.


Just joined the owners club thanks to the For Sale sub. Can't wait to get it.


----------



## GosuPl

Quote:


> Originally Posted by *Silent Scone*
> 
> Yeah, they're the same part: a snipped memory bus with 11GB, but with the newer GDDR5X ICs. There should be nothing in it clock for clock.


The TITAN X Pascal has 96 ROPs, 3584 CUDA cores and 224 TMUs, with a 384-bit memory bus. The GTX 1080 Ti has 88 ROPs, 3584 CUDA cores and 224 TMUs, with a 352-bit bus.

Clock for clock, the TITAN X Pascal performs better than the GTX 1080 Ti. The 1080 Ti has faster memory, yes, but with its narrower bus the TX P still has better potential.

I'm swapping my 2x 1080 Ti for 2x TITAN X Pascal right now, and skipping the Titan Xp until Volta ;-)
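For what it's worth, the spec-sheet bandwidth math comes out closer than you might expect. A quick sketch using the published memory specs (384-bit at 10Gbps effective for the TXP, 352-bit at 11Gbps for the 1080 Ti):

```python
# Peak theoretical memory bandwidth = (bus width in bits / 8) * effective rate in Gbps

def bandwidth_gb_s(bus_bits: int, rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s from bus width and effective data rate."""
    return bus_bits / 8 * rate_gbps

titan_x_pascal = bandwidth_gb_s(384, 10.0)
gtx_1080_ti = bandwidth_gb_s(352, 11.0)

print(f"TITAN X Pascal: {titan_x_pascal:.0f} GB/s")  # 480 GB/s
print(f"GTX 1080 Ti:    {gtx_1080_ti:.0f} GB/s")     # 484 GB/s
```

So on paper the Ti's faster GDDR5X roughly cancels out its narrower bus; the TXP's clock-for-clock edge comes mainly from the 8 extra ROPs.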


----------



## Jpmboy

Quote:


> Originally Posted by *GosuPl*
> 
> The TITAN X Pascal has 96 ROPs, 3584 CUDA cores and 224 TMUs, with a 384-bit memory bus. The GTX 1080 Ti has 88 ROPs, 3584 CUDA cores and 224 TMUs, with a 352-bit bus.
> 
> Clock for clock, the TITAN X Pascal performs better than the GTX 1080 Ti. The 1080 Ti has faster memory, yes, but with its narrower bus the TX P still has better potential.
> 
> I'm swapping my 2x 1080 Ti for 2x TITAN X Pascal right now, and skipping the Titan Xp until Volta ;-)


^^ this.

Honestly, not really seeing the 1080 Ti pass (or generally match) the TXP in just about any bench thread here at OCN. They're close, but the 1080 Ti clusters right at the lower end of the TXP scores overall. Maybe it's the ROPs. Looking to see how the ASUS 1080 Ti Strix performs.









the most GPU-centric benchmark seems to be VR Mark Blue Room... they trade blows. Otherwise, it's TXp > TXP >~ 1080Ti...


----------



## MrKenzie

I finally bit the bullet and did the liquid metal shunt mod on my Titan XP. I used Thermal Grizzly and applied it to just 2 of the 3 shunts. The maximum power I see is 90% now.

I was running a fairly consistent 2126MHz, but dropping to as low as 2050MHz in one or two power-intensive titles. Now I have it dialed in at 2139MHz and have not seen it drop yet (still need to test more titles though).

As long as my PCB doesn't short circuit from the liquid metal, I can see myself doing this to all future cards that will benefit from it.
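For anyone wondering why the reported power drops after this mod: the card estimates current from the voltage drop across known shunt resistors, so bridging a shunt with liquid metal lowers the effective resistance and scales the reading down. A rough sketch; the 5mΩ stock shunt and the liquid-metal bridge resistance below are illustrative assumptions, not measured values:

```python
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

R_SHUNT = 0.005  # assumed stock shunt value, 5 milliohm (illustrative)
R_LM = 0.015     # assumed resistance of the liquid-metal bridge (illustrative)

r_eff = parallel(R_SHUNT, R_LM)

# The controller still divides the measured voltage drop by the stock shunt
# value, so the reported power scales by r_eff / R_SHUNT.
reported_fraction = r_eff / R_SHUNT
print(f"Reported power is ~{reported_fraction:.0%} of actual")  # ~75% with these values
```

That's also why the mod is risky beyond the short-circuit danger: the card is now drawing more real power than its power limit logic believes.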


----------



## MrKenzie

Cracked 8000 in FS Ultra.. But look at my GPU score









http://www.3dmark.com/fs/12530438


----------



## GosuPl

I compared the GTX 1080 Ti vs the TITAN X Pascal (2016), clock for clock, with +1000 on effective memory.

1080Ti @2012/12000
TITAN X Pascal @2012/11000

http://www.3dmark.com/compare/fs/12537346/fs/12536362

http://www.3dmark.com/compare/fs/12537379/fs/12536406

http://www.3dmark.com/compare/fs/12537309/fs/12536330

Nvidia claims the 1080 Ti is faster than the TITAN X. Nope, it's not faster, ONLY stock vs stock, and that's because of the 1080 Ti's better temperatures and higher core clock.

Maybe soon I will buy a TITAN Xp and compare it vs the 1080 Ti and TX Pascal clock for clock. But the performance gain is not worth it in real scenarios.

On the GPU-Z screenshot, you can see memory bandwidth, pixel fillrate and texture fillrate.

Please don't look at the clocks in 3DMark, they're bugged as always. Same GPU and CPU.







The boost clock in GPU-Z is misleading too, because GPU Boost 3.0 is very strange.









Both cards work at the same clocks; the TX P even throttles more, from 2012 down to 1974/1961-1923.

The 1080 Ti stays at 1974 most of the time.

I am satisfied with swapping 2x 1080 Ti for 2x TITAN X Pascal; now time to liquid cool them


----------



## Glerox

Nice!


----------



## GosuPl

Quote:


> Originally Posted by *Glerox*
> 
> Nice!


Thanks


----------



## Glerox

I'm not sure I understand why you changed your tis for two TXP but nevermind, thanks for the comparative benchmarks!

TXP is the best gpu I have ever had. I just bought my 2nd one and I'm planning to do my first crazy hardline build with them!

I'll try to do a nice video of the build


----------



## DerComissar

Quote:


> Originally Posted by *GosuPl*
> 
> I compared the GTX 1080 Ti vs the TITAN X Pascal (2016), clock for clock, with +1000 on effective memory.
> 
> 1080Ti @2012/12000
> TITAN X Pascal @2012/11000
> 
> http://www.3dmark.com/compare/fs/12537346/fs/12536362
> 
> http://www.3dmark.com/compare/fs/12537379/fs/12536406
> 
> http://www.3dmark.com/compare/fs/12537309/fs/12536330
> 
> Nvidia claims the 1080 Ti is faster than the TITAN X. Nope, it's not faster, ONLY stock vs stock, and that's because of the 1080 Ti's better temperatures and higher core clock.
> 
> Maybe soon I will buy a TITAN Xp and compare it vs the 1080 Ti and TX Pascal clock for clock. But the performance gain is not worth it in real scenarios.
> 
> On the GPU-Z screenshot, you can see memory bandwidth, pixel fillrate and texture fillrate.
> 
> Please don't look at the clocks in 3DMark, they're bugged as always. Same GPU and CPU.
> 
> 
> 
> 
> 
> 
> 
> The boost clock in GPU-Z is misleading too, because GPU Boost 3.0 is very strange.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Both cards work at the same clocks; the TX P even throttles more, from 2012 down to 1974/1961-1923.
> 
> The 1080 Ti stays at 1974 most of the time.
> 
> I am satisfied with swapping 2x 1080 Ti for 2x TITAN X Pascal; now time to liquid cool them
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 


Imo you don't need to change over to the TXp, enjoy those two sweet TX Pascals to the fullest.

On water they'll really shine!


----------



## bouncingsoul

Yesterday I added the second Titan to my loop, delidded the CPU and changed the TIM for the Titans to Kryonaut LIM. Actually I'm really happy the system still works, as it is my first build like that.
The temperature of the Titans dropped significantly with the LIM: from approx. 55-60 degrees to 38-42 degrees under heavy load. The CPU dropped from 80-85 to the mid 60s.
The cards clock up to 2050MHz, which is a nice result for me as I decided not to shunt mod them.
For the cards that's pretty much the limit, but there should still be plenty of OC potential for the CPU, as I'm currently still using the profile I used before changing to LIM.
I'm pretty happy with the results so far: http://www.3dmark.com/fs/12540052
What do you guys think?


----------



## GosuPl

Quote:


> Originally Posted by *Glerox*
> 
> I'm not sure I understand why you changed your tis for two TXP but nevermind, thanks for the comparative benchmarks!
> 
> TXP is the best gpu I have ever had. I just bought my 2nd one and I'm planning to do my first crazy hardline build with them!
> 
> I'll try to do a nice video of the build


I changed them because:

1. I like having 12GB of VRAM (my former GPUs were 2x TITAN X Maxwell).
2. I sold the 2x 1080 Tis and bought the 2x TX P for the same price.









So why not ? ;-)


----------



## GosuPl

Quote:


> Originally Posted by *bouncingsoul*
> 
> Yesterday I added the second Titan to my loop, delidded the CPU and changed the TIM for the Titans to Kryonaut LIM. Actually I'm really happy the system still works, as it is my first build like that.
> The temperature of the Titans dropped significantly with the LIM: from approx. 55-60 degrees to 38-42 degrees under heavy load. The CPU dropped from 80-85 to the mid 60s.
> The cards clock up to 2050MHz, which is a nice result for me as I decided not to shunt mod them.
> For the cards that's pretty much the limit, but there should still be plenty of OC potential for the CPU, as I'm currently still using the profile I used before changing to LIM.
> I'm pretty happy with the results so far: http://www.3dmark.com/fs/12540052
> What do you guys think?


Not bad







Compared to mine at 2012/11000, the GPUs drop to 1885, LOL









http://www.3dmark.com/compare/fs/12540052/fs/12541683

I need waterrr, now waiting for blocks ^^

Much better than my former 2x TX M ^^

http://www.3dmark.com/compare/fs/12541683/fs/9992259

But even 3x TX M can't outperform 2x TX P

http://www.3dmark.com/compare/fs/12541683/fs/9913863


----------



## bouncingsoul

Quote:


> Originally Posted by *GosuPl*
> 
> Not bad
> 
> 
> 
> 
> 
> 
> 
> Compared to mine at 2012/11000, the GPUs drop to 1885, LOL
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/compare/fs/12540052/fs/12541683


Well... :-D
http://www.3dmark.com/compare/fs/12542072/fs/12541683#

It's hard to beat your physics score with "only" 4 cores


----------



## GosuPl

Quote:


> Originally Posted by *bouncingsoul*
> 
> Well... :-D
> http://www.3dmark.com/compare/fs/12542072/fs/12541683#
> 
> It's hard to beat your physics score with "only" 4 cores


GPU score is GPU score ;-)

But PCI-E lanes are VERY important too: a lot in games (it depends on the game), much less in 3DMark.


----------



## bouncingsoul

Quote:


> Originally Posted by *GosuPl*
> 
> GPU score is GPU score ;-)
> 
> But PCI-E lanes are VERY important too: a lot in games (it depends on the game), much less in 3DMark.


But is that also the case when you use PCIe 3.0 x8? Didn't the change from 2.0 to 3.0 double the bandwidth?


----------



## GosuPl

Quote:


> Originally Posted by *bouncingsoul*
> 
> But is that also the case when you use PCIe 3.0 x8? Didn't the change from 2.0 to 3.0 double the bandwidth?




In this test, PCI-E x8 for both cards is native; I just swapped the cards from the x16 slots to the x8 slots.

Soon I will record a test on my second PC: 7700K + 2x TX P vs 5930K + 2x TX P.

A CPU with 40 PCI-E lanes is very important for maximizing SLI performance.
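On bouncingsoul's earlier question: yes, PCIe 3.0 roughly doubles 2.0 per lane (8GT/s with 128b/130b encoding vs 5GT/s with 8b/10b), so 3.0 x8 lands almost exactly where 2.0 x16 does. A quick sketch of the math:

```python
def pcie_bandwidth_gb_s(gen: int, lanes: int) -> float:
    """Approximate one-direction PCIe bandwidth in GB/s for a given generation."""
    # (transfer rate in GT/s, encoding efficiency)
    specs = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}
    rate, eff = specs[gen]
    return rate * eff * lanes / 8  # bits -> bytes

print(f"PCIe 2.0 x16: {pcie_bandwidth_gb_s(2, 16):.1f} GB/s")  # 8.0
print(f"PCIe 3.0 x8:  {pcie_bandwidth_gb_s(3, 8):.1f} GB/s")   # ~7.9
print(f"PCIe 3.0 x16: {pcie_bandwidth_gb_s(3, 16):.1f} GB/s")  # ~15.8
```

So on a Z270 board, two cards at 3.0 x8/x8 still get close to old x16/x16 bandwidth each; the 40-lane HEDT chips are about headroom, not necessity, for most games.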


----------



## BrainSplatter

Quote:


> Originally Posted by *GosuPl*
> 
> A CPU with 40 PCI-E lanes is very important for maximizing SLI performance.


And that's the problem with the whole situation, since single-core performance is better on the 7700K than on any X99 CPU. You have to decide between maximum PCIe bandwidth and maximum single-core performance, which sucks a lot.

And it seems like Intel is continuing this stupid division with their upcoming 4-core CPU for the X299 platform (supposedly the 7740K), since it also seems to be limited in lane count.


----------



## jsutter71

Quote:


> Originally Posted by *GosuPl*
> 
> I compared the GTX 1080 Ti vs the TITAN X Pascal (2016), clock for clock, with +1000 on effective memory.
> 
> 1080Ti @2012/12000
> TITAN X Pascal @2012/11000
> 
> http://www.3dmark.com/compare/fs/12537346/fs/12536362
> 
> http://www.3dmark.com/compare/fs/12537379/fs/12536406
> 
> http://www.3dmark.com/compare/fs/12537309/fs/12536330
> 
> Nvidia claims the 1080 Ti is faster than the TITAN X. Nope, it's not faster, ONLY stock vs stock, and that's because of the 1080 Ti's better temperatures and higher core clock.
> 
> Maybe soon I will buy a TITAN Xp and compare it vs the 1080 Ti and TX Pascal clock for clock. But the performance gain is not worth it in real scenarios.
> 
> On the GPU-Z screenshot, you can see memory bandwidth, pixel fillrate and texture fillrate.
> 
> Please don't look at the clocks in 3DMark, they're bugged as always. Same GPU and CPU.
> 
> 
> 
> 
> 
> 
> 
> The boost clock in GPU-Z is misleading too, because GPU Boost 3.0 is very strange.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Both cards work at the same clocks; the TX P even throttles more, from 2012 down to 1974/1961-1923.
> 
> The 1080 Ti stays at 1974 most of the time.
> 
> I am satisfied with swapping 2x 1080 Ti for 2x TITAN X Pascal; now time to liquid cool them


*More reinforcement of my earlier point that the TXP is superior to the 1080 Ti.*


----------



## jsutter71

Quote:


> Originally Posted by *bouncingsoul*
> 
> Yesterday I added the second Titan to my Loop, delidded the CPU and changed the TIM for the Titans to Kryonaut LIM. Actually I'm really happy the system still works, as it is my first built like that.
> The temperature of the Titans dropped significantly with the LIM. Went from approx. 55-60 degrees to 38-42 degrees under heavy load. CPU dropped from 80-85 to mid 60s.
> The cards clock up to 2050MHz which is a nice result for me as I decided to not shunt mod them.
> For the cards thats pretty much the limit but there's still should beplenty of OC potential for the CPU as I'm currently still using the profile I used before changing to LIM.
> Im pretty happy with the results so far : http://www.3dmark.com/fs/12540052
> What do you guys think?


Did not do anything near the mod you did to your card but I think you might wanna do some more tweaking.
http://www.3dmark.com/fs/12360076


----------



## bouncingsoul

Quote:


> Originally Posted by *jsutter71*
> 
> Did not do anything near the mod you did to your card but I think you might wanna do some more tweaking.
> http://www.3dmark.com/fs/12360076


What do you mean?


----------



## GosuPl

Quote:


> Originally Posted by *jsutter71*
> 
> *More reinforcement of my earlier point that the TXP is superior to the 1080 Ti.*


Thanks


----------



## lilchronic

Quote:


> Originally Posted by *jsutter71*
> 
> *More reinforcement of my earlier point that the TXP is superior to the 1080 Ti.*


Quote:


> Originally Posted by *jsutter71*
> 
> Did not do anything near the mod you did to your card but I think you might wanna do some more tweaking.
> http://www.3dmark.com/fs/12360076


It's pretty much the same card. I think the memory OC on the 1080 Tis is causing them to draw more power, thus making the core clocks throttle more.









I would like to see the cards compared clock for clock without any power throttling, maybe at 0.925V and 1911MHz to ensure no power throttling.

Either way, the Titan XP beats the 1080 Ti in pretty much all of OCN's benchmark threads...


----------



## deafboy

Finally got the card installed, loop filled, and up and running yesterday. Have to say I'm super excited; such a better experience than my 290 Tri-Fire setup, ugh.

It is a shame though that you can't pump more power into these things without mods. Did some quick overclocks last night just to test things out; I was quite happy with how relatively easy it was to get to 2113, and it never broke 38°C.

Haven't dialed in and stress tested 24/7 OCs yet, but I'm one happy camper


----------



## jsutter71

Quote:


> Originally Posted by *bouncingsoul*
> 
> What do you mean?


Looking at your system, you may be able to increase your memory and clock settings a little bit. Disregard the physics and combined scores, because you're limited by your CPU and memory speeds. Your graphics score, on the other hand, may be able to increase, depending on your cooling.


----------



## DerComissar

Quote:


> Originally Posted by *deafboy*
> 
> Finally got the card installed, loop filled, and up and running yesterday. Have to say I'm super excited; such a better experience than my 290 Tri-Fire setup, ugh.
> 
> It is a shame though that you can't pump more power into these things without mods. Did some quick overclocks last night just to test things out; I was quite happy with how relatively easy it was to get to 2113, and it never broke 38°C.
> 
> Haven't dialed in and stress tested 24/7 OCs yet, but I'm one happy camper


2113!









You don't neeeeed more voltage, you're there!








Nice Supo score, 10K busted and then some!

IMO that max GPU temp of 38°C is a good example of how effective moar rads can be, lol.


----------



## Apples10304

Is there a hard limit to the voltage? Is there a recommended amount of voltage to stay below, or a recommended "don't add more than X mV"? Sorry if my question is not good. I am very bad at stringing words together.


----------



## bouncingsoul

Quote:


> Originally Posted by *jsutter71*
> 
> Looking at your system, you may be able to increase your memory and clock settings a little bit. Disregard the physics and combined scores, because you're limited by your CPU and memory speeds. Your graphics score, on the other hand, may be able to increase, depending on your cooling.


Especially for the core clock I'm hitting a wall. Currently I apply +200, which is stable. Up to +202 it crashes only sometimes, and +203 results in a bluescreen for sure. For the memory, the sweet spot for the first card was +475; that's where I got the best scores. The cooling should be OK, I think: it's one 360 and one 240 rad.
But yeah, you are right, there might be a few more points to tweak... RAM for example, as I can't get the Dominators to run at the clocks they are supposed to.
Here's a picture:


----------



## PowerK

Quote:


> Originally Posted by *BrainSplatter*
> 
> And that's the problem with the whole situation since single core performance is better on the 7700K compared to any X99 CPU. You have to decide between maximum PCIE bandwidth or maximum single core performance which sucks a lot.
> 
> And it seems like Intel is continuing this stupid division with their upcoming 4 core CPU (supposedly the 7740K) for the X299 platform since it seems to be also limited in number of lanes


I honestly don't think Intel even remotely cares about multi-GPU performance. Neither do game developers.
It was all the GPU makers, making display drivers work in AFR mode, fooling APIs and game engines into working as if there were only one display adapter. GPU makers can't do that (fooling the API and engine) any more in DX12, resulting in poor multi-GPU support these days.

Multi-GPU support still has a long way to go, I think, and if anything it's getting worse. Unless we see a major paradigm shift in how multi-GPU works, I really think it will remain very niche.
Heck, multi-thread/core CPU support in games is still rare to this day. Too many games are still single threaded.

For the ultimate gaming rig, the fastest single-core CPU paired with the fastest single GPU is best.







(and I don't think this trend is going to change any time soon.)


----------



## Jpmboy

Quote:


> Originally Posted by *Apples10304*
> 
> Is there a hard limit to the voltage? Is there a recommended amount of voltage to stay below? Or a recommended "dont add more than "X" mV? Sorry if my question is not good. I am very bad at stringing words together.


The real question is whether it is actually getting any higher voltage from the AB slider, or from any OS-based tool's slider.


----------



## jsutter71

Quote:


> Originally Posted by *bouncingsoul*
> 
> Especially for the core clock I'm hitting a wall. Currently I apply +200, which is stable. Up to +202 it crashes only sometimes, and +203 results in a bluescreen for sure. For the memory, the sweet spot for the first card was +475; that's where I got the best scores. The cooling should be OK, I think: it's one 360 and one 240 rad.
> But yeah, you are right, there might be a few more points to tweak... RAM for example, as I can't get the Dominators to run at the clocks they are supposed to.
> Here's a picture:


When you say +200, you're referring to Afterburner? I understand that. Your level of cooling seems to have a big impact on how far you can raise those settings; some people are able to push their clocks to extreme levels in Afterburner. On my system I can raise it to +226 on the core clock and +550 on the memory clock. For benchmarking I set mine at +225 and +525; anything beyond that is not always stable for me. Afterburner is only part of the equation, though. When I upgraded my system memory from DDR4-2400 CL15 to DDR4-3200 CL14, my benchmarks improved dramatically: on average my scores improved by 500 points in TS and 700 points in FS.

Nice system BTW


----------



## lanofsong

Hey there Titan X Pascal owners,

Would you consider signing up with Team OCN for the 2017 Pentathlon (*May 5th through May 19th*)? There is still plenty of time left, and we really could use your help.

This event is truly a GLOBAL battle, with Team OCN going up against many teams from across the world, and while we put in a good showing at last year's event by finishing 6th, we could do with a lot more CPU/GPU compute power. All you need to do is sign up and crunch on any available hardware you can spare.

The cool thing about this event is that it is spread over 5 disciplines of *varying lengths of time* (different projects), so there is a lot of *strategy/tactics* involved.

We look forward to having you and your hardware on our team. Again, this event lasts for two weeks and takes place May 5th through the 19th.


Download the software here.

https://boinc.berkeley.edu/download.php

Presently we really would like some help with the following project:

Add the following *GPU* project - *Einsteinathome.org*



Note: For every project you crunch on, you will be asked if you want to join a team - type in overclock.net (enter) then JOIN team.


Remember to sign up for the BOINC team by going here. You can also post any questions that you may have - this group is very helpful:









8th BOINC Pentathlon thread

To find your Cross Project ID# - sign into your account and it will be located under Computing and Credit


Please check out the GUIDE - How to add BOINC Projects page for more information about running different projects:

This really is an exciting and fun event; I look forward to it every year, and I am hoping that you will join us and participate.









BTW - There is an awesome BOINC Pentathlon badge for those who participate









lanofsong

OCN - FTW


----------



## st0necold

This is the best GPU I've ever purchased. I could never really get my 980 Ti Classifieds to clock well in SLI. I got the Titan X Pascal yesterday and ran some benchmarks today; I got it to 2025MHz and never even crashed. +200 core / +200 mem

http://www.3dmark.com/fs/12556981


----------



## MrKenzie

I need help with locking the voltage at 1.093V. I feel I can get another 50MHz out of my overclock but I can't seem to make it go higher than 1.062V. I have selected the 1.093V marker on the curve in afterburner 4.3.0 and pressed "L" and even "CTRL L" but it does not affect the voltage at all. What am I missing?

Currently running stable 2126MHz @ 1.050V and 2139MHz @ 1.062. But adjusting the 1.075, 1.081, and 1.093 voltage/MHz does nothing.
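In case it helps to see what the curve editor is doing conceptually: one common way to cap voltage is to flatten the curve past a chosen point, so GPU Boost gains nothing from requesting higher voltages and stops there. A toy sketch of that flattening; the curve points are made-up illustrative values, not a real card's table:

```python
# V/F curve as (voltage, MHz) points, sorted ascending by voltage.
curve = [(1.050, 2126), (1.062, 2139), (1.075, 2152), (1.081, 2164), (1.093, 2177)]

def flatten_above(curve, v_lock):
    """Cap every point above v_lock to the clock at the locked voltage."""
    mhz_lock = max(mhz for v, mhz in curve if v <= v_lock)
    return [(v, min(mhz, mhz_lock)) if v > v_lock else (v, mhz) for v, mhz in curve]

locked = flatten_above(curve, 1.062)
print(locked)  # every point above 1.062 V is capped at 2139 MHz
```

After flattening, the highest clock on the curve sits at the locked voltage, so the boost algorithm has no incentive to push past it. If the "L" lock does nothing for you, it's usually because voltage control isn't unlocked in Afterburner's settings.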


----------



## CptSpig

Quote:


> Originally Posted by *st0necold*
> 
> This is the best GPU I've ever purchased. I could never really get my 980 Ti Classifieds to clock well in SLI. I got the Titan X Pascal yesterday and ran some benchmarks today; I got it to 2025MHz and never even crashed. +200 core / +200 mem
> 
> http://www.3dmark.com/fs/12556981


Congratulations on the new card.







You can go much higher! My best OC was +227 on the core and +702 on the memory, so plan on having more fun......









http://www.3dmark.com/fs/11486413 Look at the graphics score.


----------



## Unimag

Hi. I also have a TXP (2017) but am fairly new to overclocking.

When you guys talk about stable 2000+, I assume this is boost clock; if so, what's the best way to get this result?

I've run most of the benchmark tools, but surely, being synthetic, they don't give you an idea of real-world gaming performance?


----------



## jsutter71

Quote:


> Originally Posted by *st0necold*
> 
> This is the best GPU I've ever purchased. I could never really get my 980 Ti Classifieds to clock well in SLI. I got the Titan X Pascal yesterday and ran some benchmarks today; I got it to 2025MHz and never even crashed. +200 core / +200 mem
> 
> http://www.3dmark.com/fs/12556981


I was blown away going from triple 980 Ti SLI to TXP SLI. For me the performance increase was noticeable right away, since I have a multi-monitor setup. I wrongly assumed that I could get away with 4 monitors on 3 980 Tis and had to downgrade to 3 monitors; the biggest issue was that my primary monitor always switched to one of the other 3. After a month with the TXPs I figured I'd try again with 4 monitors, and I'm happy that I did. I've been running with 4 monitors since December of last year and I rarely have any issues. Having a 31" monitor with 3 more 28" monitors is an amazing experience. Even so, I still dream of having 4 of the same monitor. Only issue is that my primary monitor cost me $1100, and even though I bought it in 2014 it still costs $900 on Amazon.
http://www.lg.com/us/monitors/lg-31MU97-B-4k-ips-led-monitor


----------



## Sheyster

Quote:


> Originally Posted by *MrKenzie*
> 
> I need help with locking the voltage at 1.093V. I feel I can get another 50MHz out of my overclock but I can't seem to make it go higher than 1.062V. I have selected the 1.093V marker on the curve in afterburner 4.3.0 and pressed "L" and even "CTRL L" but it does not affect the voltage at all. What am I missing?
> 
> Currently running stable 2126MHz @ 1.050V and 2139MHz @ 1.062. But adjusting the 1.075, 1.081, and 1.093 voltage/MHz does nothing.


If you're just gaming, stick to 2126 @ 1050mV IMHO. That tiny difference you want is negligible in games. Benching is another story, though, if you're chasing a spot on a list somewhere.


----------



## st0necold

Quote:


> Originally Posted by *CptSpig*
> 
> Congratulations on the new card.
> 
> 
> 
> 
> 
> 
> 
> You can go much higher! My best OC was +227 on the core and +702 on the memory, so plan on having more fun......
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/11486413 Look at the graphics score.


You're the man! I'm going to go for the gold!


----------



## MrKenzie

Quote:


> Originally Posted by *Sheyster*
> 
> If you're just gaming stick to 2126 @ 1050mv IMHO. That tiny difference you want is negligible in games. Benching is another story though if you're chasing a spot on a list somewhere.


I just figure that if I can get it to clock higher at 1.093V, then all the better; it's free performance, and cooling is no issue as it's a chiller system with 15-17°C coolant temperature. The voltage "lock" sounds so simple, but it just doesn't work for me, even after re-installing Afterburner.


----------



## MrKenzie

Got it working: when I re-installed Afterburner I had to unlock voltage control again! But it didn't achieve anything; I could not run any higher clocks. 2139MHz seems to be the limit, but it is 100% constant there with the liquid metal shunt mod and a 25°C max GPU temp. It does clock down marginally in Fire Strike Ultra graphics test 1, but that's it.


----------



## Sheyster

Quote:


> Originally Posted by *MrKenzie*
> 
> I just figure that if I can get it to clock higher at 1.093V then all the better, it's free performance and cooling is no issue as it's a chiller system with 15-17c coolant temperature. The Voltage "lock" sounds so simple but it just doesn't work for me, even after re-installing afterburner.


Ah, don't blame you if you're on chilled water. Max that sucka out!


----------



## Glerox

Omg, OK... I know I'm really stupid, but I need your help.

While using pliers to remove the stock cooler on my 2nd Titan XP, I chopped off a small piece (a resistor, I guess?).

As you can see in the picture, my Phillips screwdriver is pointing at it.
As you can also see, I "fixed" it with 3M Tartan 369 box sealing tape.

Do you guys think this will work? Is the tape conductive?
Or should I go see someone who does electronics soldering?

Thanks! (yes, I know it's fu*?&$ stupid)


----------



## pez

Well, I ended up returning my TXp, since I was going to have a couple of Tis and this TXP to play with. My TXP is a fairly decent OCer on air, so I've ordered a hybrid kit and will be attaching the block to it, hopefully by this weekend. Also picked up some Thermal Grizzly Kryonaut for a repaste. Will be curious to see how NVIDIA did on the original TIM.


----------



## Jpmboy

Quote:


> Originally Posted by *Glerox*
> 
> Omg, OK... I know I'm really stupid, but I need your help.
> 
> While using pliers to remove the stock cooler on my 2nd Titan XP, I chopped off a small piece (a resistor, I guess?).
> 
> As you can see in the picture, my Phillips screwdriver is pointing at it.
> As you can also see, I "fixed" it with 3M Tartan 369 box sealing tape.
> 
> Do you guys think this will work? Is the tape conductive?
> Or should I go see someone who does electronics soldering?
> 
> Thanks! (yes, I know it's fu*?&$ stupid)


Oh man, that's a sphincter-relaxing moment for sure!
You're not the first. The last guy didn't replace it and the card still worked, if I recall correctly. Gotta use a 4mm socket.
What's the worst that can happen? If the card is dead, it's dead; it won't take out your MB or CPU if it is dead or fouled. The only way to know is to plug it in and see.


----------



## Glerox

Quote:


> Originally Posted by *Jpmboy*
> 
> oh man - that's a sphincter relaxing moment for sure!
> you're not the first. Last guy didn't replace it and the card still worked if i recall correctly. gotta use a 4mm socket.
> What's the worst that can happen... if the card is dead, it's dead. it won't take out your MB or CPU if it is dead or fouled. Only way to know is to plug it in and see.


I'm going to an electronics repair shop to have it soldered back on. I think it's the best I can do in this situation...


----------



## lanofsong

Hey there Titan X Pascal owners,

We could truly use your help here. Presently we are #1, just ahead of two of the great TITANs, when it comes to Distributed Computing, and to stay there we could use help from you and your boss GPUs. Only 4 days left.




Download the software here.

https://boinc.berkeley.edu/download.php

Add the following *GPU* project - *Einsteinathome.org*



Note: For every project you crunch on, you will be asked if you want to join a team - type in overclock.net (enter) then JOIN team.


Remember to sign up for the BOINC team by going here. You can also post any questions that you may have - this group is very helpful:









8th BOINC Pentathlon thread

Thanks in advance.

lanofsong

OCN - FTW


----------



## jsutter71

Quote:


> Originally Posted by *Glerox*
> 
> Omg, OK... I know I'm really stupid, but I need your help.
> 
> While using pliers to remove the stock cooler on my 2nd Titan XP, I chopped off a small piece (a resistor, I guess?).
> 
> As you can see in the picture, my Phillips screwdriver is pointing at it.
> As you can also see, I "fixed" it with 3M Tartan 369 box sealing tape.
> 
> Do you guys think this will work? Is the tape conductive?
> Or should I go see someone who does electronics soldering?
> 
> Thanks! (yes, I know it's fu*?&$ stupid)


Ever thought about getting insurance? It's a small price to pay for something like that. I can't recommend mine because it's with USAA, which the vast majority of people here would not qualify to join; however, there are some insurers that are open to the general public. My coverage specifically covers mistakes like that. I have a $250 deductible, which is tiny in comparison to the thousands I spent on my system.


----------



## bizplan

Quote:


> Originally Posted by *jsutter71*
> 
> Ever thought about getting insurance? It's a small price to pay for something like that. I can't recommend mine because it's with USAA, which the vast majority of people here would not qualify to join; however, there are some insurers that are open to the general public. My coverage specifically covers mistakes like that. I have a $250 deductible, which is tiny in comparison to the thousands I spent on my system.




You think you're the only one who has spent thousands on their system (or who has them insured)? Your insensitivity to his plight and your general comments, bragging, and one-upmanship throughout this thread are getting old. Stop it already!


----------



## pez

Uh oh lol.

It sounds like an ad for USAA Renter's/Property insurance.

I'll be putting my OG TXP or Ti on a hybrid block this weekend. Can't waitttt.


----------



## OZrevhead

Is there a Superposition ranking thread or something here? Or do you guys just search hwbot results?

I haven't tried Superposition yet - what's good about it? That it's geared towards 4K results?


----------



## GnarlyCharlie

Quote:


> Originally Posted by *OZrevhead*
> 
> Is there a Superposition ranking thread or something here? Or do you guys just search hwbot results?
> 
> I haven't tried superposition yet, whats good about it? That it's geared towards 4k results?


OCN has its own thread with scores at different resolutions (1080 Extreme, 4K Optimized, 8K Optimized)
http://www.overclock.net/t/1627767/top-30-unigine-superposition-benchmark


----------



## Jpmboy

Quote:


> Originally Posted by *bizplan*
> 
> 
> 
> You think you're the only one who has spent thousands on their system (or who has them insured)? Your insensitivity to his plight and your general comments, bragging, (and one-ups-manship) continually throughout this thread are getting old. Stop it already!


whoa... try the decaf brother.


----------



## jsutter71

I meant no disrespect. Quite the contrary. I am a retired US Army combat medic who is also 100% disabled related to 3 1/2 years of combat in Iraq. I am not an insurance salesman. My comment was only meant to provide feedback pertaining to an unfortunate situation. This thread is all about helping people and discussing experiences pertaining to a specific piece of hardware. If preventive protection is an option - i.e. insurance - then I don't see any harm in mentioning it.


----------



## jsutter71

Quote:


> Originally Posted by *Jpmboy*
> 
> whoa... try the decaf brother.


Thank you


----------



## Glerox

Pascal is over. Rise of Volta!

Still no Vega, poor AMD... they missed a complete cycle.



15 TFLOPS vs 12 on the Titan Xp.
So at least a 25% performance improvement, I would guess.
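The back-of-envelope math above checks out, assuming the quoted peak figures (15 vs 12 TFLOPS); note that real game performance rarely scales linearly with TFLOPS:

```python
# Rough relative-throughput estimate from quoted peak FP32 TFLOPS figures.
# The 15 (rumored Volta) and 12 (Titan Xp) numbers come from the post above.
def pct_improvement(new_tflops: float, old_tflops: float) -> float:
    """Percentage improvement of new over old peak throughput."""
    return (new_tflops / old_tflops - 1.0) * 100.0

print(f"{pct_improvement(15.0, 12.0):.0f}%")  # → 25%
```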


----------



## Dagamus NM

Quote:


> Originally Posted by *jsutter71*
> 
> Ever thought about getting insurance. Small price to pay for something like that. I can't recommend mine because it's with USAA which the vast majority of people here would not qualify to join. However their are some that are open to the general public. My coverage specifically covers mistake like that. I have a $250 deductible which is tiny in comparison to the thousands I spent on my system.


USAA let me do an add on to my homeowners policy for all of my PC and photography gear. I need to go in and validate the list, include pictures of anything new. Been about a year.

I think I pay an extra $15/month on top of my homeowners policy for this.
Quote:


> Originally Posted by *Glerox*
> 
> Pascal is over. Rise of Volta!
> 
> Still no Vega, poor AMD... they missed a complete cycle.
> 
> 
> 
> 15 tflops vs 12 on the Titan Xp.
> So at least 25% of performance improvment I would guess.


This is for the new Tesla card. It will be a bit before we see the Titan XV or whatever they call it. It will probably be about a 10-15% increase over the Titan XP. AMD has missed several cycles now. When was their last new card? It was Fury, right? That was June of 2015?


----------



## Sheyster

Quote:


> Originally Posted by *Jpmboy*
> 
> whoa... try the decaf brother.


Indeed, that was a bit harsh IMHO. I was not offended in the least by what was posted.


----------



## Glerox

FYI, for those of you who saw my post on ripping off a part of my Titan XP while using pliers to remove the stock cooler:

I went to see a guy who does PCB repairs.
He soldered the part back onto the PCB.

The GPU is now working perfectly!
So if it happens to you, don't panic - sometimes it can be repaired.


----------



## jsutter71

Quote:


> Originally Posted by *Glerox*
> 
> FIY, those of you who saw my post on me ripping off a part of my Titan XP while using pliers to remove the stock cooler.
> 
> I went to see a guy who does PCB repairs.
> He soldered back the part on the PCB.
> 
> The gpu is now working perfectly!
> So if it happens to you, don't panic, sometime it can be repaired


Out of curiosity, why did you use pliers to remove the stock cooler? Were you using the pliers as leverage to separate the cooler from the PCB? If so, why pliers instead of a flat-tip screwdriver? Not to suggest that as an option. When I removed mine I applied slight pressure against the underside of the cooler until it separated. Not trying to sound like a KNOW-IT-ALL WHO WANTS TO ONE-UP ANYONE. God knows I've accidentally destroyed my fair share of PC hardware during my 20+ years of PC building experience.


----------



## xTesla1856

I'm surprised you guys need tools at all to separate the cooler from the card. I've blocked/repasted/taken apart 4 Titans so far and never needed to do anything more than slightly pull and twist the card off the cooler. Maybe I was just lucky with the stock TIM.


----------



## deafboy

If I remember correctly he was using pliers on the screws instead of a torx/socket


----------



## GnarlyCharlie

I had a couple of the standoffs on the EK block back out when I went to remove it from the card, the screw threads were galled up. No way to grab that but with pliers, it's not a hex head.

And I was under the impression he was trying to remove the fasteners holding the card PCB to the cooler, not simply separate the TIM joint.


----------



## Glerox

I was using pliers to remove the stock cooler because I don't have this:

https://www.ekwb.com/shop/hex-socket-4mm

to remove the hexagonal screws holding the cooler from the backside of the PCB.

I will NEVER use pliers again


----------



## Dagamus NM

Quote:


> Originally Posted by *Glerox*
> 
> Pascal is over. Rise of Volta!
> 
> Still no Vega, poor AMD... they missed a complete cycle.
> 
> 
> 
> 15 tflops vs 12 on the Titan Xp.
> So at least 25% of performance improvment I would guess.


Quote:


> Originally Posted by *Glerox*
> 
> I was using pliers to remove the stock cooler because I don't have this :
> 
> https://www.ekwb.com/shop/hex-socket-4mm
> 
> to remove the hexagonal screws holding the cooler from the backside of the PCB
> 
> 
> 
> I will NEVER use pliers again


Good that EK makes that available. Getting those off of four blocks took forever. Bought a proper tool afterwards.


----------



## GnarlyCharlie

And for a final insult, I carefully put all those little screws back into the stock cooler shroud so I wouldn't lose them....

....and forgot to put the dang heatsink back inside first. So now it won't fit back in the box it shipped in.


----------



## jsutter71

I highly recommend a set of precision tools. I use this Wiha set.


----------



## Glerox

I just completed my build with my 2nd Titan XP!

I did a couple of builds in the last year, but this time I wanted a professional finish.
So this means hardline tubing and custom cables.
I think I did a pretty good job








Will post a video timelapse build soon.






And some benchmarks, of course:




Now I'm ready for some [email protected] ultra settings gaming


----------



## CptSpig

Quote:


> Originally Posted by *Glerox*
> 
> I just completed my build with my 2nd Titan XP!
> 
> I did a couple of builds in the last year, but this time I wanted a professional finish.
> So this means hardline tubing and custom cables.
> I think I did a pretty good job
> 
> 
> 
> 
> 
> 
> 
> 
> Will post a video timelapse build soon.
> 
> 
> 
> 
> 
> 
> And some benchmarks of course :
> 
> 
> 
> 
> 
> 
> Now I'm ready for some [email protected] ultra settings gaming


Very nice build. I really like the case.


----------



## Glerox

Quote:


> Originally Posted by *CptSpig*
> 
> Very nice build. I really like the case.


Thanks!


----------



## DerComissar

Quote:


> Originally Posted by *Glerox*
> 
> I just completed my build with my 2nd Titan XP!
> 
> I did a couple of builds in the last year, but this time I wanted a professional finish.
> So this means hardline tubing and custom cables.
> I think I did a pretty good job
> 
> 
> 
> 
> 
> 
> 
> 
> Will post a video timelapse build soon.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And some benchmarks of course :
> 
> 
> 
> 
> 
> 
> Now I'm ready for some [email protected] ultra settings gaming


Great benches with those two lovely cards, and the build looks fantastic.









Lian-Li really outdid themselves making that stunning case.


----------



## OZrevhead

That's a beautiful build Glerox, well done

I don't keep my hardware long enough to worry about how it looks lol


----------



## Glerox

Quote:


> Originally Posted by *DerComissar*
> 
> Great benches with those two lovely cards, and the build looks fantastic.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Lian-Li really outdid themselves making that stunning case.


Yeah the case is awesome and the price is reasonable!

Thanks


----------



## Glerox

Quote:


> Originally Posted by *OZrevhead*
> 
> Thats a beautiful build glenrox, well done
> 
> I dont keep my hardware long enough to worry about how it looks lol


Yeah, I understand... I told myself I would skip Volta after having invested so much time and money in this build. It will be hard to skip Volta haha


----------



## GnarlyCharlie

Quote:


> Originally Posted by *Glerox*
> 
> Yeah I understand... I told myself I would skip volta after having invest so much time and money in this build. It will be hard to skip volta haha


That's what I'm holding out for on my Titan X Maxwell SLI rig, some Big Voltas.


----------



## Dagamus NM

Quote:


> Originally Posted by *jsutter71*
> 
> I highly recommend a set of precision tools. I use this Wiha set.


A man after my own heart. I love Wiha tools, though I use the ESD-safe drive-loc sets with the adjustable-length blades. I have a small set for computers and a large set for the garage. German hand tools are the best.


----------



## GnarlyCharlie

Looks like that 5.5mm socket has seen better days.


----------



## jsutter71

Quote:


> Originally Posted by *GnarlyCharlie*
> 
> Looks like that 5.5mm socket has seen better days.


It has. I emailed Wiha to request a new 5.5 bit. Here was their response.

Hello John,

Please be advised bits are not covered under warranty as they are considered consumable products. As a courtesy we will be sending out a onetime replacement for your 5.5mm nut setter (driver) to the address on the Amazon Invoice.

You may review our warranty policy using the link below. http://www.wihatools.com/warranty-information

Thank you,

Tracy


----------



## jsutter71

Quote:


> Originally Posted by *Dagamus NM*
> 
> A man after my own heart. I love wiha tools, though I use the ESD safe drive-loc sets with the adjustable length blades. I have a small set for computers and a large set for the garage. German hand tools are the best.


I also like Wera and Klein tools. This is my PC tool bag.


----------



## GnarlyCharlie

Quote:


> Originally Posted by *jsutter71*
> 
> It has. I emailed Wiha to request a new 5.5 bit. Here was their response.
> 
> Hello John,
> 
> Please be advised bits are not covered under warranty as they are considered consumable products. As a courtesy we will be sending out a onetime replacement for your 5.5mm nut setter (driver) to the address on the Amazon Invoice.
> 
> You may review our warranty policy using the link below. http://www.wihatools.com/warranty-information
> 
> Thank you,
> 
> Tracy


Can't beat that!


----------



## Dagamus NM

Quote:


> Originally Posted by *jsutter71*
> 
> I also like Wera an Klein tools. This is my PC tool bag


Nice. I have a fair amount of Klein stuff for bigger jobs. I don't have any Wera, but I have some Weidmueller stuff that is pretty awesome for ferrule crimping and stuff related to wiring.


----------



## Jquala

I have an extra BNIB 2017 Titan Xp - anyone in the SoCal area want it for $1k? PM me!


----------



## pez

I personally love the iFixit set that I've had for some time now. Needed 5 separate bits out of it for my Ti disassembly and it was super painless: PH0, PH00, 4.0mm hex, and then 2.0mm and 2.5mm Torx. That kit has yet to fail me.
Quote:


> Originally Posted by *jsutter71*
> 
> Out of curiosity. Why did you use pliers to remove the stock cooler? Were you using the pliers as leverage to separate the cooler from the PCB? If so why pliers instead of a flat tip screw driver? Not to suggest that as an option. When I removed mine I applied slight pressure against the underside of the cooler until it separated. Not trying to sound like a NO IT ALL WHO WANTS TO ONE UP ANYONE. God knows I've accidentally destroyed my fare share of PC hardware during my 20+ years of PC building experience.


Quote:


> Originally Posted by *xTesla1856*
> 
> I'm surprised you guys need tools at all to separate the cooler from the card. I've blocked/repasted/taken apart 4 Titans so far and never needed to do anything more than slightly pull and twist the card off the cooler. Maybe I was just lucky with the stock TIM.


Quote:


> Originally Posted by *deafboy*
> 
> If I remember correctly he was using pliers on the screws instead of a torx/socket


Quote:


> Originally Posted by *Glerox*
> 
> I was using pliers to remove the stock cooler because I don't have this :
> 
> https://www.ekwb.com/shop/hex-socket-4mm
> 
> to remove the hexagonal screws holding the cooler from the backside of the PCB
> 
> 
> 
> I will NEVER use pliers again


Was starting to wonder myself how you ended up with pliers, and now I see. Yeah... next time, going out to the hardware store for a <$5 tool would be ideal.


----------



## bl1tzk1213g

What's the maximum safest power limit and core voltage (%) under water?


----------



## MrKenzie

Quote:


> Originally Posted by *bl1tzk1213g*
> 
> What's the maximum safest power limit and core voltage (%) under water?


You can safely run them both as high as they go (+100mV, 120% power limit), but I have not seen a need to go over +30mV on my card whilst using 120% power limit.


----------



## Jpmboy

Quote:


> Originally Posted by *Glerox*
> 
> I just completed my build with my 2nd Titan XP!
> 
> I did a couple of builds in the last year, but this time I wanted a professional finish.
> So this means hardline tubing and custom cables.
> I think I did a pretty good job
> 
> 
> 
> 
> 
> 
> 
> 
> Will post a video timelapse build soon.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And some benchmarks of course :
> 
> 
> 
> 
> 
> 
> Now I'm ready for some [email protected] ultra settings gaming


beautiful!


----------



## Glerox

Quote:


> Originally Posted by *Jpmboy*
> 
> beautiful!


Thanks!


----------



## st0necold

I just ordered the EVGA AIO for mine, can't wait to get it and see if I can improve on my score.


----------



## OZrevhead

So has anyone here applied the AB curve mod with the TXp (or TXP)? Any results?


----------



## Menthol

Quote:


> Originally Posted by *OZrevhead*
> 
> So has anyone here applied the AB curve mod with the TXp (or TXP)? Any results?


Not exactly sure what you mean by curve mod, but the use of the curve method to overclock is explained in this thread many times over, and it is the best method to use for overclocking and benching.


----------



## OZrevhead

Same same...

Can someone confirm for me if Liquid Ultra and Conductonaut both work for performing a shunt mod? This stuff seems reluctant to spread, so is it really going to drip (vertical GPU)? What if a piece of plastic or something was applied over the CLU so it has something to hold on to? Does heat make it more likely to drip or run?

Thanks


----------



## WaXmAn

Would you sell your Titan XP for $850? I have a local buyer wanting it. Debating if it's worth it to sell and get a 1080 Ti, or maybe upgrade to a Titan Xp, since I know my EK block from my Titan XP will fit a Titan Xp.


----------



## CptSpig

Quote:


> Originally Posted by *WaXmAn*
> 
> Would you sell your Titan XP for $850? I have a local buyer wanting it. Debating if its worth to sell and get a 1080Ti or maybe upgrade to a Titan Xp since I know my EK block from my Titan XP will fit a Titan Xp.


I sold my Titan X Pascal and purchased a Titan Xp and could not be happier!


----------



## WaXmAn

Quote:


> Originally Posted by *CptSpig*
> 
> I sold my Titan X Pascal and purchased a Titan Xp and could not be happier!


Did you see a nice jump in performance for the added $300+ cost? Questioning if it's worth it with Volta already in the works.


----------



## bl1tzk1213g

Quote:


> Originally Posted by *WaXmAn*
> 
> Would you sell your Titan XP for $850? I have a local buyer wanting it. Debating if its worth to sell and get a 1080Ti or maybe upgrade to a Titan Xp since I know my EK block from my Titan XP will fit a Titan Xp.


I'd sell mine if he's interested. And the waterblock too, for extra.


----------



## CptSpig

Quote:


> Originally Posted by *WaXmAn*
> 
> Did you see a Nice Jump in performance for the added $300+ cost? Questioning IF its worth it with Volta already in the works


My Time Spy score went up 800 points. I have Battlefield 1 capped at 144 and it maintains this no problem. It is a beast.


----------



## DerComissar

Quote:


> Originally Posted by *CptSpig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *WaXmAn*
> 
> Would you sell your Titan XP for $850? I have a local buyer wanting it. Debating if its worth to sell and get a 1080Ti or maybe upgrade to a Titan Xp since I know my EK block from my Titan XP will fit a Titan Xp.
> 
> 
> 
> I sold my Titan X Pascal and purchased a Titan Xp and could not be happier!

Imo the TX Pascal is still a damn good card.

I came close to getting one myself, but then the Ti dropped.
Bought into that, then the Xp dropped.

At that point I had gone somewhat gpu crazy, and decided I just had to have a full-core chip, so I bought an Xp.

Imo 850 bucks is a good price to sell for, if you really want to get an Xp.

Before Volta drops.


----------



## lanofsong

Hey Titan X Pascal owners,

We are having our monthly Foldathon from Monday 22nd - Wednesday 24th - 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

May 2017 Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - needs a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## Glerox

For those who like timelapse videos, for fun I did an amateur video of my build, from unboxing to benchmark in 8 minutes.
That project was fun!
Thanks for all the times you guys answered my questions hehe.
Overclock.net is the best PC forum!


----------



## CptSpig

Quote:


> Originally Posted by *Glerox*
> 
> For those who like timelapse videos, for fun I did an amateur video of my build, from unboxing to benchmark in 8 minutes.
> That project was fun!
> Thanks for all the times you guys answered my questions hehe.
> Overclock.net is the best PC forum!


Great video, and the build is very nice. I really like my MSI board - good choice.


----------



## Glerox

Quote:


> Originally Posted by *CptSpig*
> 
> Great video and the build is very nice. I really like my MSI board good choice.


Thanks! Yeah, especially this model - you get every bit of bang for your buck!


----------



## arrow0309

Quote:


> Originally Posted by *Glerox*
> 
> For those who like timelapse videos, for fun I did an amateur video of my build, from unboxing to benchmark in 8 minutes.
> That project was fun!
> Thanks for all the times you guys answered my questions hehe.
> Overclock.net is the best PC forum!


+Rep!

You deserve it

Nice and quick build


----------



## pez

Quote:


> Originally Posted by *st0necold*
> 
> I just ordered the EVGA AIO for mine cant wait to get it and see if I can improve on my score.


I think you'll be very pleased. If you're just after performance alone, the lower noise and temps will be icing on the cake.


----------



## GosuPl

Anyone having problems with SLI in Battlefield 1? On the last 3 drivers, including 382.33, I have huge FPS drops and low GPU usage.

GPU1 - 85%-93%, GPU2 - 73%-75%

BF1 4K maxed - FPS swings from 91 to 105

When I use older drivers, I have 95/99% GPU usage on both cards and FPS from 112 to 135. Same test place etc.

*** with these drivers?

378.92 - best Pascal drivers for SLI so far.


----------



## jsutter71

Quote:


> Originally Posted by *Glerox*
> 
> For those who like timelapse videos, for fun I did an amateur video of my build, from unboxing to benchmark in 8 minutes.
> That project was fun!
> Thanks for all the times you guys answered my questions hehe.
> Overclock.net is the best PC forum!


Awesome video. Beautiful case but how does it compare to previous Lian Li cases? I always thought Lian Li cases had amazing high quality engineering. I hate using the word "but", but my only issue with their cases was that they used very thin aluminum compared to Silverstone and Case Labs. That case has tempered glass, correct? I bet that adds some weight.

Not trying to be critical because your build is BEAUTIFUL. I'm just curious.


----------



## jsutter71

Quote:


> Originally Posted by *GosuPl*
> 
> Anyone have problems with SLI in Battlefiled 1? 3 last drivers included 382.33 and i have huge fps drops, low GPU usage.
> 
> GPU1 - 85%-93% GPU2 - 73%-75%
> 
> BF 1 4k maxed - FPS spikes from 91 for 105
> 
> When i use older drivers, i have 95/99% GPU usage on both cards and FPS from 112 for 135 . Same test place etc.
> 
> *** with this drivers ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 378.92 - Best Pascal drivers for SLI so far .


As of late all my scores have sunk... 2000 points in TS alone. I haven't done anything to my system other than update the drivers. Sometimes I think Microsoft and other vendors release substandard updated drivers so people will spend more money and upgrade their hardware.


----------



## Glerox

Quote:


> Originally Posted by *GosuPl*
> 
> Anyone have problems with SLI in Battlefiled 1? 3 last drivers included 382.33 and i have huge fps drops, low GPU usage.
> 
> GPU1 - 85%-93% GPU2 - 73%-75%
> 
> BF 1 4k maxed - FPS spikes from 91 for 105
> 
> When i use older drivers, i have 95/99% GPU usage on both cards and FPS from 112 for 135 . Same test place etc.
> 
> *** with this drivers ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 378.92 - Best Pascal drivers for SLI so far .


Quote:


> Originally Posted by *jsutter71*
> 
> As of late all my scores have sunk...2000 in TS alone. I haven't done anything to my system other then update the drivers. Sometimes I think Microsoft and other vendors release substandard updated drivers so people will spend more money and upgrade their hardware.


In BF1, are you disabling TAA?
With TAA off I have 90% scaling. With TAA on I have 70%.
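For reference, "scaling" here is typically the achieved multi-GPU FPS as a fraction of the ideal n-GPU FPS. A quick illustrative sketch (the sample FPS numbers are made up, not from the posts above):

```python
def sli_scaling(single_fps: float, dual_fps: float, n_gpus: int = 2) -> float:
    """Scaling efficiency in percent: achieved FPS vs ideal n-GPU FPS."""
    return dual_fps / (n_gpus * single_fps) * 100.0

# Hypothetical numbers: 60 FPS on one card, 108 FPS on two -> 90% scaling.
print(f"{sli_scaling(60, 108):.0f}%")
```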


----------



## Glerox

Quote:


> Originally Posted by *jsutter71*
> 
> Awesome video. Beautiful case but how does it compare to previous Lian Li cases? I always thought Lian Li cases had amazing high quality engineering. I hate using the word "but", but my only issue with their cases was that they used very thin aluminum compared to Silverstone and Case Labs. That case has tempered glass, correct? I bet that adds some weight.
> 
> Not trying to be critical because your build is BEAUTIFUL. I'm just curious.


It's my first Lian Li case. I admit the aluminium is quite thin, especially the corner between the two glass panels. I feel like I could really bend it with light pressure. But it stays on my desk and I don't move it, so it's not really a big problem for me.

Yes, it is tempered glass. I think for such a beautiful case, the price is reasonable.


----------



## Blaise Pascal

Hi all, just some results of my EVGA Hybrid cooler on the 2016 Titan X Pascal. I was blown away. Feel free to ask questions. I put yellow boxes around the important plots.

Info:
- NO overclock limits or fan curves were changed from the standard settings at all!
- Game: PlayerUnknown's Battlegrounds gameplay at steady state
- The cooler was not the 1080 one; it was the one specifically for the Titan X/1080 Ti.
- Ambient temperature was 72F

(attachment: PUBG1.jpg)


alternative imgur link:


http://imgur.com/v0coQ


----------



## jsutter71

Quote:


> Originally Posted by *Glerox*
> 
> In BF1, Are you disabling TAA?
> With TAA off I have 90% scaling. With TAA on I have 70%.


I was speaking about a general lack of performance. I don't have BF1. I keep telling myself that I need to get it, but I'm already neglecting Civ6, Fallout 4 and The Witcher. Just not enough hours in the day. How are you adjusting your settings anyway - through the Nvidia Control Panel or Nvidia Inspector?


----------



## OZrevhead

Quote:


> Originally Posted by *Blaise Pascal*
> 
> Hi all, just some results of my EVGA Hybrid cooler on the 2016 Titan Pascal. I was blown away. Feel free to ask questions. I put yellow boxes around the important plots.
> 
> Info:
> -NO overclock limits or fan curves were changed from standard settings at all!!!
> -Game: Player Unknown Battlegrounds gameplay at steady state
> -The cooler was not the 1080 one, it was the one specifically for the TitanX/1080ti.
> -Ambient temperature was 72F
> 
> PUBG1.jpg 1124k .jpg file
> 
> 
> alternative imgur link:
> 
> 
> http://imgur.com/v0coQ


Great results.


----------



## Glerox

Quote:


> Originally Posted by *jsutter71*
> 
> I was speaking in general lack of performance. I don't have BF1. I keep telling myself that I need to get it but I'm already neglecting Civ6, Fallout4 and Witcher. Just not enough hours in the day. How are you adjusting your settings anyways. Through the Nvidia control panel or Nvidia inspector?


You can disable Temporal Anti-aliasing directly in game settings. It seems that TAA is a post-processing method that doesn't work well with SLI. It shows the same bad scaling in Mass Effect Andromeda.


----------



## jsutter71

Quote:


> Originally Posted by *Glerox*
> 
> You can disable Temporal Anti-aliasing directly in game settings. It seems that TAA is a post-processing method that doesn't work well with SLI. It shows the same bad scaling in Mass Effect Andromeda.


Yesterday before I launched TS I opened up Nvidia Inspector and used the default profile for TS. Afterwards I ran the benchmark and sure enough my score went back into the 19000 range like before. Thought that was odd, so I ran the benchmark a few more times with the same results. Then I turned off Nvidia Inspector, rebooted my PC, reran TS, and again it was back where it was supposed to be. So this morning I did the same thing without launching Nvidia Inspector and my scores were where they were supposed to be. This time my TS score was 19194. Not my highest but close enough. My thinking is that during one of the last driver updates my settings must have been messed up, and Nvidia Inspector corrected the problem. I just about wiped my system drive and performed a clean Windows install to fix the issue. I even uninstalled some of my later programs thinking they might have slowed down my performance. Live and learn, right?


----------



## GosuPl

Funny thing. I just tested 2x TX P on my second platform with an i7 [email protected], and what I see is just madness.

Witcher 3 - 4K uber + HWAAx8 + AA ON = GPU usage 75/80% (both) and 68-72 fps on the 7700K rig

Witcher 3 - 4K uber + HWAAx8 + AA ON = GPU usage 94/95% (both) and 79-84 fps on the 5930K rig

Witcher 3 - 4K uber + HWAAx8 + AA OFF = GPU usage 95/96% (both) and 88-92 fps on the 7700K rig

Witcher 3 - 4K uber + HWAAx8 + AA OFF = GPU usage 96/98% (both) and 94-98 fps on the 5930K rig

Same drivers, same settings. HB SLI bridges on both rigs. Does the CPU matter even at 4K? Or is this the influence of PCI-E lanes (2x16 on the 5930K vs 2x8 on the 7700K)?

What do you think?
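For context on the x8 vs x16 question, the theoretical per-slot PCIe 3.0 bandwidth is easy to estimate (a rough sketch only; real-world transfer rates are lower due to protocol overhead, and this says nothing about how much of that bandwidth a given game actually uses):

```python
def pcie3_bandwidth_gbs(lanes: int) -> float:
    """Theoretical PCIe 3.0 bandwidth in GB/s for a slot with the given lane count."""
    gt_per_s = 8.0            # PCIe 3.0 signaling rate: 8 giga-transfers/s per lane
    encoding = 128.0 / 130.0  # 128b/130b line-code efficiency
    return lanes * gt_per_s * encoding / 8.0  # convert bits to bytes

for lanes in (8, 16):
    print(f"x{lanes}: {pcie3_bandwidth_gbs(lanes):.2f} GB/s")  # x8: 7.88, x16: 15.75
```

So an x8 link halves the theoretical slot bandwidth, which matters most when SLI traffic spills over the PCIe bus rather than the HB bridge.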


----------



## Glerox

Quote:


> Originally Posted by *GosuPl*
> 
> Funny thing. I just test 2x TX P on my second platform with I7 [email protected], and what i see is just madness
> 
> Witcher 3 - 4k uber + HWaaX8 + AA ON = GPU usage 75/80% (both) and 68-72 fps on 7700K rig
> 
> Witcher 3 - 4k uber + HWaaX8 + AA ON = GPU usage 94/95% (both) and 79-84 fps 5930K rig
> 
> Witcher 3 - 4k uber + HWaaX8 + AA OFF = GPU usage 95/96% (both) and 88-92 fps on 7700K rig
> 
> Witcher 3 - 4k uber + HWaaX8 + AA OFF = GPU usage 96/98% (both) and 94-98 fps on 5930K rig
> 
> Same drivers, same settings. HB SLI bridges on both RIGs. CPU have Importance even on 4k ? Or this maybe PCI-E lanes (2x 16 on 5930K vs 2x 8 on 7700K ) influence?
> 
> What do You think ?


Thank you for proving once again that 8 PCI-E lanes bottleneck performance vs 16.


----------



## GosuPl

Quote:


> Originally Posted by *Glerox*
> 
> Thank you for proving once again that 8 pci-e lanes bottlenecks performance vs 16.


No problem, the test results will be on my YT channel.

https://www.youtube.com/channel/UCQSy7A7a75eE0H7bDfRpEew

Here you have my old PCI-E 2x8 vs 2x16 test, on 2x TITAN X Maxwell.


----------



## OZrevhead

Does anyone have a download link for AB 4.3.0 beta 14? I can't find it; all links seem to point to Guru3D, which has updated to the full release version.


----------



## OZrevhead

OK, so I finally got my vGPU slider unlocked and I get 1.081V max - have any other Titan owners seen this issue? Can I get the full 1.093V?

Thanks


----------



## Overfiend1981

Hi Guys,

And so my adventure with Titan Xp ends for now....temporarily.

And it ended with...this:




After 4 weeks of working flawlessly, the EVGA Hybrid leaked from the pipe joints directly onto the PCB. The cooler was mounted on the back of a Cosmos 2, directly above the card.

Just bringing it up here for people to see, and to ask if anyone has experience with EVGA's RMA service. I'm currently in the process, but so far I have only received automated messages from their system requesting details and asking for the Hybrid to be sent back. They haven't replied to any of the tickets or even seemed to acknowledge the fact that the card was destroyed because of the leak.

I'm quite confident about the legal basis for this (I have already taken legal advice) - but would really like to hear from people first. I do want to tread lightly and make them accept responsibility as they should, rather than firing legal jargon at them. Besides, I've heard nothing but good things about their customer service and RMAs so far.

Any thoughts, anyone? For now, a bit miffed re my 5-week-old card being obliterated by the bloody cooler! :/


----------



## bizplan

Quote:


> Originally Posted by *Overfiend1981*
> 
> Hi Guys,
> 
> And so my adventure with Titan Xp ends for now....temporarily.
> 
> And it ended with...this:
> 
> 
> 
> 
> After 4 weeks of working flawlessly, the EVGA Hybrid leaked from the pipe joints directly onto the PCB. The cooler was mounted on the back of my Cosmos 2, directly above the card.
> 
> Just bringing it up here for people to see, and to ask if anyone has had experience with EVGA's RMA service. I'm currently in the process, but so far I have only received automated messages from their system requesting details and asking for the Hybrid to be sent back. They haven't replied to any of the tickets or even seemed to acknowledge the fact that the card was destroyed because of the leak.
> 
> I'm quite confident about the legal basis for this (I have already taken legal advice) - but would really like to hear from people first. I want to tread lightly and make them accept responsibility, as they should, rather than firing legal jargon at them. Besides, I've heard nothing but good things about their customer service and RMAs so far.
> 
> Any thoughts, anyone? For now I'm a bit miffed re my 5-week-old card being obliterated by the bloody cooler! :/


I wonder if EVGA disclaims liability for consequential damages from their products; I believe that is their standard legal policy, as it is with most manufacturers. Having said that, you should for sure be able to get a replacement Hybrid unit from EVGA (their customer service/RMA policy is quite liberal). I would also think that Nvidia, as a goodwill gesture, would replace the damaged Xp with a new one (I have seen them do this many times even when it wasn't their fault). I would tell them the truth and avoid a legal discussion. You may be quite surprised!


----------



## Lee0

Quote:


> Originally Posted by *bizplan*
> 
> I wonder if EVGA disclaims liability for consequential damages from their product, I believe that is their standard legal policy as it is with most manufacturers. Having said that, you should for sure be able to get a replacement Hybrid unit from EVGA (their customer service/RMA policy is quite liberal). I would also think that Nvidia, as a result of goodwill, would replace the damaged Xp for a new one (I have seen them do this many times even though it wasn't their fault). I would tell them the truth and avoid a legal discussion. You may be quite surprised!


^-- What this guy said. EVGA, or really any other company that sells water-cooling components, can't/won't be held responsible for leak damage to other components, and that includes AIOs. They will only replace the faulty cooler itself, not anything else. But considering Nvidia, and how cool they can be, as the guy above me said, they might replace the card.


----------



## jsutter71

Quote:


> Originally Posted by *Overfiend1981*
> 
> Hi Guys,
> 
> And so my adventure with Titan Xp ends for now....temporarily.
> 
> And it ended with...this:
> 
> 
> 
> 
> After 4 weeks of working flawlessly, the EVGA Hybrid leaked from the pipe joints directly onto the PCB. The cooler was mounted on the back of my Cosmos 2, directly above the card.
> 
> Just bringing it up here for people to see, and to ask if anyone has had experience with EVGA's RMA service. I'm currently in the process, but so far I have only received automated messages from their system requesting details and asking for the Hybrid to be sent back. They haven't replied to any of the tickets or even seemed to acknowledge the fact that the card was destroyed because of the leak.
> 
> I'm quite confident about the legal basis for this (I have already taken legal advice) - but would really like to hear from people first. I want to tread lightly and make them accept responsibility, as they should, rather than firing legal jargon at them. Besides, I've heard nothing but good things about their customer service and RMAs so far.
> 
> Any thoughts, anyone? For now I'm a bit miffed re my 5-week-old card being obliterated by the bloody cooler! :/


Did you completely dry out the card and verify that it is inoperable? Did you contact Nvidia and try to RMA it? There have been situations in this thread where people in similar circumstances were able to get Nvidia to replace their cards.


----------



## st0necold

EDIT: I screwed up the install it's fine now.


----------



## pez

I just transferred the EVGA block from my Ti to my TXP, so it's a bit unsettling to see these stories. No issues for a couple of weeks now with the CLC itself, but it's never confidence-inspiring to hear the opposite.








Hope both of you are taken care of to your expectations!


----------



## st0necold

Quote:


> Originally Posted by *pez*
> 
> I just transferred the EVGA block from my Ti to my TXP, so a bit unsettling to see these stories. No issues for a couple weeks now with the CLC itself, but never confidence inspiring to hear the opposite
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Hope that the both of you are taken care of to your expectations!


Don't be unsettled.

I woke up and decided to look over everything again, and I noticed that I had the AIO's fan blocked by the relay wire inside of the unit--- and I accidentally connected the 4-pin header outside of the card (it's supposed to go in the lower corner of the card). I am so happy, because when I got everything back in my case I got temps of 25 degrees and the pump/fans are working fine.

It literally took 10-15 minutes to break the entire stock cooler down and get the AIO on. Don't be worried about anything. If I had read the directions I would have noticed that I didn't have the pump/fans properly connected. It's really a simple install.

Also, thanks to EVGA - they were beyond helpful and were literally willing to send me a brand new cooler (as they thought mine was defective). I had a brain fart and thought the shroud's VRM fan was spinning (which it was not). I'm sure the rep would have told me to double-check the wiring inside if I had not told him all fans were working.


----------



## st0necold

I ran firestrike extreme today and set a new personal best.

The guy who said the card would do better with a water cooler was right. With my EVGA AIO cooler I was able to go +205 on core / +495 on mem and got 14303.

http://www.3dmark.com/fs/12744676

It ran at an impressive 2088MHz. I wonder how much more I can squeeze out of it.


----------



## OZrevhead

Very nice, what were your gpu temps during the run?


----------



## CptSpig

Quote:


> Originally Posted by *st0necold*
> 
> I ran firestrike extreme today and set a new personal best.
> 
> The guy who said the card would do better with a water cooler was right. With my EVGA AIO cooler I was able to go +205 on core / +495 on mem and got 14303.
> 
> http://www.3dmark.com/fs/12744676
> 
> It ran at an impressive 2088MHz. I wonder how much more I can squeeze out of it.


Great Score!


----------



## pez

Quote:


> Originally Posted by *st0necold*
> 
> Don't be unsettled.
> 
> I woke up and decided to look over everything again, and I noticed that I had the AIO's fan blocked by the relay wire inside of the unit--- and I accidentally connected the 4-pin header outside of the card (it's supposed to go in the lower corner of the card). I am so happy, because when I got everything back in my case I got temps of 25 degrees and the pump/fans are working fine.
> 
> It literally took 10-15 minutes to break the entire stock cooler down and get the AIO on. Don't be worried about anything. If I would have read the directions I would have noticed that I didn't have the pump/fans properly connected. It's really a simple install.
> 
> Also, thanks to EVGA - they were beyond helpful and were literally willing to send me a brand new cooler (as they thought mine was defective). I had a brain fart and thought the shroud's VRM fan was spinning (which it was not). I'm sure the rep would have told me to double-check the wiring inside if I had not told him all fans were working.


Oh for sure. I got it done pretty simply (this was my second time at this point). I did notice that some screws were tightened more on the TXP than they were on my Ti. Still, a fairly straightforward install. It's just tedious because of the backplate screws and the cooler bolts.

My card currently peaks at 60C (I'm in a rather small case and using a slower-spinning fan), but sits around 55-58C in games. The OC is sitting around 2025-2050MHz, so I'm happy. Neither of my Tis wanted to break 2K, so I'll just keep the higher-clocking card.







I think I'm going to do a GT fan on both of my CLCs to have the faster-spinning fan and what I find to be an attractive-looking fan (http://www.performance-pcs.com/darkside-gentle-typhoon-performance-radiator-fan-2150rpm-68cfm-black-edition.html).


----------



## DerComissar

Quote:


> Originally Posted by *pez*
> 
> Quote:
> 
> 
> 
> Originally Posted by *st0necold*
> 
> Don't be unsettled.
> 
> I woke up and decided to look over everything again, and I noticed that I had the AIO's fan blocked by the relay wire inside of the unit--- and I accidentally connected the 4-pin header outside of the card (it's supposed to go in the lower corner of the card). I am so happy, because when I got everything back in my case I got temps of 25 degrees and the pump/fans are working fine.
> 
> It literally took 10-15 minutes to break the entire stock cooler down and get the AIO on. Don't be worried about anything. If I would have read the directions I would have noticed that I didn't have the pump/fans properly connected. It's really a simple install.
> 
> Also, thanks to EVGA - they were beyond helpful and were literally willing to send me a brand new cooler (as they thought mine was defective). I had a brain fart and thought the shroud's VRM fan was spinning (which it was not). I'm sure the rep would have told me to double-check the wiring inside if I had not told him all fans were working.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh for sure. I got it done pretty simply (this was my second time at this point). I did notice that some screws were tightened more on the TXP than my Ti was. However, a fairly straightforward install. It's just tedious because of the backplate screws and the cooler bolts.
> 
> My card currently peaks at 60C (I'm in a rather small case and using a slower spinning fan), but sits around 55-58C in games. The OC is sitting around 2025-2050MHz, so I'm happy. Neither of my Tis wanted to break 2K, so I'll just keep the higher clocking card
> 
> 
> 
> 
> 
> 
> 
> . I think I'm going to do a GT fan on both of my CLCs to have the faster spinning fan and what I find to be an attractive looking fan (http://www.performance-pcs.com/darkside-gentle-typhoon-performance-radiator-fan-2150rpm-68cfm-black-edition.html).
Click to expand...

Static pressure and industrial-grade build quality (they are made by Nidec) are further attributes of the GTs.









I'm still using my original Scythe GT's from 2012, and they are still going strong.
They are a great rad fan, so they should work well for your CLCs.


----------



## pez

Quote:


> Originally Posted by *DerComissar*
> 
> Static pressure, and industrial-grade build quality (as they are made by Nidec) are further attributes to the GT's.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm still using my original Scythe GT's from 2012, and they are still going strong.
> They are a great rad fan, so they should work well for your CLCs.


Oh yeah, don't get me wrong. If the F120MP came in a higher-RPM version, I'd jump on that. However, the GTs seem to have the most pleasing noise profile of any of the 2k+ RPM fans. Otherwise, in any other WC situation, I'd just run the F120MPs.









----------



## st0necold

Quote:


> Originally Posted by *OZrevhead*
> 
> Very nice, what were your gpu temps during the run?


I forgot to log the run through 3DMark, so although I don't have exact numbers, I don't think the card went over 60 degrees. I am going to bench today with Afterburner and come back with results.
Quote:


> Originally Posted by *CptSpig*
> 
> Great Score!


Thanks bro, I was looking for the post you made to quote it but I could not find it.
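On the logging point: next time you can just let nvidia-smi record the whole run in the background (the query flags in the comment are standard nvidia-smi options; the sample log lines are invented for illustration). A minimal Python sketch that pulls the peak temperature out of such a log:

```python
# Parse a CSV log captured during a bench run with, e.g.:
#   nvidia-smi --query-gpu=timestamp,clocks.sm,temperature.gpu \
#              --format=csv,noheader -l 1 > bench_log.csv
# The sample lines below are invented for illustration.
sample_log = """\
2017/08/10 20:01:01.000, 2088 MHz, 52
2017/08/10 20:01:02.000, 2076 MHz, 57
2017/08/10 20:01:03.000, 2088 MHz, 55
"""

def peak_temp(log: str) -> int:
    """Return the highest GPU temperature (deg C) seen in the log."""
    # temperature.gpu is the last CSV field on each line
    return max(int(line.rsplit(",", 1)[1]) for line in log.strip().splitlines())

print(peak_temp(sample_log))  # -> 57
```

Kill the nvidia-smi process when the bench finishes and you have the whole temperature trace, not just a guess.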


----------



## st0necold

Quote:


> Originally Posted by *pez*
> 
> Oh for sure. I got it done pretty simply (this was my second time at this point). I did notice that some screws were tightened more on the TXP than my Ti was. However, a fairly straightforward install. It's just tedious because of the backplate screws and the cooler bolts.
> 
> My card currently peaks at 60C (I'm in a rather small case and using a slower spinning fan), but sits around 55-58C in games. The OC is sitting around 2025-2050MHz, so I'm happy. Neither of my Tis wanted to break 2K, so I'll just keep the higher clocking card
> 
> 
> 
> 
> 
> 
> 
> . I think I'm going to do a GT fan on both of my CLCs to have the faster spinning fan and what I find to be an attractive looking fan (http://www.performance-pcs.com/darkside-gentle-typhoon-performance-radiator-fan-2150rpm-68cfm-black-edition.html).


The screws were a PITA to get out... I am glad I had a precision screwdriver. I couldn't even get the backplate back on because those tiny screws were damn near impossible to re-mount, so I left the backplate off of mine (I don't think we're supposed to re-use it anyway).


----------



## pez

Quote:


> Originally Posted by *st0necold*
> 
> The screws were a PITA to get out... I am glad I had a precision screwdriver. I couldn't even get the backplate back on because those tiny screws were damn near impossible to re-mount, so I left the backplate off of mine (I don't think we're supposed to re-use it anyway).


Ah, so it sounds like you did the full hybrid kit? I just ordered the 1080 kit and put the block on with the window portion of the shroud removed. It's probably not as pretty or streamlined, but I dig it.









----------



## st0necold

Quote:


> Originally Posted by *pez*
> 
> Ah so it sounds like you did the full hybrid kit? I just ordered the 1080 kit and put the block on it with the window portion of the shroud removed. It's probably not as pretty or streamlined, but I dig it
> 
> 
> 
> 
> 
> 
> 
> .


I got the custom one for the TX Pascal--

https://www.evga.com/products/product.aspx?pn=400-HY-5388-B1

Also, guys, it's not going past 47C under hours of gaming. I'm going to see what the temps are during my next bench and post results. The 60-degree guess was probably off; I can't get it to go past 47.


----------



## OZrevhead

Guys, I have my TXp unlocked at 1.093v, but now I am testing a second one (which I will SLI next) and I can't get past 1.031v with it. Has anyone else had this issue? I have not fitted the water block yet or done the resistor mod; with the air cooler the PWR limit flag comes on and off constantly. Is this just from the air cooler?



Edit: Actually, I have found that the slider moves but the vGPU doesn't change. What causes this?


----------



## Nicklas0912

The voltage slider does not give the card more voltage.

It only makes the card reach its maximum allowed voltage earlier rather than later.
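In other words, the slider shifts where on the boost curve the cap is reached; it never raises the cap itself. A toy model of that behaviour (all clock and voltage numbers here are invented, not real Pascal curve data):

```python
# Toy model of the boost behaviour described above (all numbers invented).
# Each boost step pairs a clock with a voltage; the card never exceeds V_MAX.
V_MAX = 1.062  # driver-enforced voltage cap in volts (actual value varies per card)

# (clock MHz, voltage V) points of a made-up stock boost curve
CURVE = [(1900, 1.000), (1950, 1.031), (2000, 1.050), (2050, 1.062), (2100, 1.075)]

def effective_curve(voltage_offset: float):
    """Apply a 'slider' offset: voltages rise, but are clamped at V_MAX."""
    return [(clk, min(v + voltage_offset, V_MAX)) for clk, v in CURVE]

# With a +0.02 V offset the cap is hit by 2000 MHz instead of 2050 MHz:
# the ceiling arrives earlier, but it is the same ceiling.
print(max(v for _, v in effective_curve(0.02)))  # -> 1.062
```

So a "maxed" slider just means more of the curve sits pinned at the cap, which is why the vGPU reading stops responding past a point.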


----------



## Nicklas0912

Still enjoying my Titan X Pascal!

No reason to change before Volta comes!









http://www.3dmark.com/spy/1428084

Core around 2088MHz.
mem +620


----------



## xTesla1856

One of these cards just popped up for sale locally for 550 bucks


----------



## jsutter71

Quote:


> Originally Posted by *xTesla1856*
> 
> One of these cards just popped up for sale locally for 550 bucks


That's like $560 US. Jump on it, because that's an incredible deal. Better yet, do they ship to the US?


----------



## jsutter71

Quote:


> Originally Posted by *Nicklas0912*
> 
> Still enjoying my Titan X Pascal!
> 
> No reason to change before Volta comes!
> 
> http://www.3dmark.com/spy/1428084
> 
> Core around 2088MHz.
> mem +620


I keep telling myself to be patient with all the new hardware coming out, but it's like a crack addiction... My wife would kill me, because she's been bugging me to take her to Portugal for vacation. To me, Portugal equals long flights, hotel stays, and being overcharged on everything. For her, it's a dream vacation and a check off her list of places she's always wanted to see.


----------



## xTesla1856

Quote:


> Originally Posted by *jsutter71*
> 
> That's like $560 US. Jump on it, because that's an incredible deal. Better yet, do they ship to the US?


Local pickup only, sorry









I would if I still had my TX and hadn't upgraded to the Xp


----------



## Nikos4Life

Hello all,

I am mining with my Titan X, just for fun and to learn about this cryptocurrency world. As the pic below shows, the cards are mining at good temps, but they are throttling. I don't know if I am missing something here. Do you have any tips or further info regarding downclocking with this card?

Thank you










Regards,

Nikos
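For what it's worth, throttling on these cards while mining is usually the power limiter rather than temperature: `nvidia-smi -q -d PERFORMANCE` lists the active throttle reasons, and `nvidia-smi --query-gpu=power.draw,power.limit --format=csv,noheader` gives the numbers (those flags are real nvidia-smi options; the wattage values in the sketch below are invented). A rough heuristic:

```python
# Heuristic throttle check on values reported by, e.g.:
#   nvidia-smi --query-gpu=power.draw,power.limit --format=csv,noheader
# Sample wattages below are invented for illustration.
def is_power_throttled(draw_w: float, limit_w: float, margin_w: float = 5.0) -> bool:
    """Sitting within `margin_w` watts of the enforced limit usually means
    the power limiter (not temperature) is pulling clocks down."""
    return draw_w >= limit_w - margin_w

print(is_power_throttled(248.7, 250.0))  # True: pinned at the cap
print(is_power_throttled(180.0, 250.0))  # False: limiter is not the cause
```

If it is the power limiter, many miners deliberately lower the cap (`nvidia-smi -pl <watts>`, needs admin rights) and accept slightly lower clocks for much better efficiency, rather than fighting the throttle.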


----------



## Lee0

Hello guys, I decided to go all-out with my water-cooling loop, and I will be adding my GPU to it now. I'm currently in the disassembly process of the video card and I hit a few bumps. First things first: I can't seem to remove these 2 screws, while their neighbouring screws come out just fine.  and the opposite side. 
Secondly, I need the correct tool to remove the "hex screws" (I don't know their name) on the back of the card *(I won't be using pliers).* I am talking about these screws.
Thirdly, what is Nvidia doing with their thermal paste application??? Is this normal?

So my questions are:
1: Is there any way to remove the small stuck screws (I am using the correct bit)?
2: What are the tool and the screws on the back of the card called, and what is their specific size?
3: Does Nvidia like TIM as much as fast-food chains like mayo? Because they both put a damn lot of it on their products. (Is it normal? And yeah, I will be cleaning it and changing it, I am not that "nooby".)
Thanks for any help I can get.


----------



## dreamcat4

Quote:


> Originally Posted by *uggy*
> 
> 3: does Nvidia like TIM as much as fast food chains like mayo? Because they both put a damn lot of it on their products. (Is it normal?,(and yea I will be cleaning it and changing it, I am not that "nooby")).
> Thanks for any help I can get.


Yes, because on a bare silicon die, under-applying can be quite dangerous. It would add unneeded risk to such an expensive part. If any area of the die is missing paste, it's all too likely to damage the GPU core and basically destroy it.

Reason: when the thermal sensor is at a different location, it cannot know or measure the temperature in some bad corner, and hence cannot throttle back for that bad corner it cannot see overheating (because... no paste there). Whereas over-application is really harmless by comparison. It just makes a mess around the edges, a place where nobody is going to see it anyway.

On a manufacturing line, speed is a requirement. They cannot afford to do anything that wastes time, so over-applying makes the best sense. It's possible to apply paste more sparingly if you take more time/care, or else with a tested, very consistent and repeatable machine process (robots).

It's different from something like a CPU with an IHS, which helps to spread the heat.

Also, the same situation is common in laptops (bare silicon dies).


----------



## Lee0

Quote:


> Originally Posted by *dreamcat4*
> 
> Yes, because on a bare silicon die, under-applying can be quite dangerous. It would add unneeded risk to such an expensive part. If any area of the die is missing paste, it's all too likely to damage the GPU core and basically destroy it.
> 
> Reason: when the thermal sensor is at a different location, it cannot know or measure the temperature in some bad corner, and hence cannot throttle back for that bad corner it cannot see overheating (because... no paste there). Whereas over-application is really harmless by comparison. It just makes a mess around the edges, a place where nobody is going to see it anyway.
> 
> On a manufacturing line, speed is a requirement. They cannot afford to do anything that wastes time, so over-applying makes the best sense. It's possible to apply paste more sparingly if you take more time/care, or else with a tested, very consistent and repeatable machine process (robots).
> 
> It's different from something like a CPU with an IHS, which helps to spread the heat.
> 
> Also, the same situation is common in laptops (bare silicon dies).


Oh, I understand now, thanks for the info! It felt very weird seeing the naked GPU die compared to my processor, which only needs a fraction of what was on the video card.


----------



## GnarlyCharlie

Quote:


> Originally Posted by *Lee0*
> 
> Secondly I need the correct tool to remove the "hex screws" (I don't know their name) on the back of the card *(I won't be using pliers).* I am talking about these screws.


The screws are 4mm hex socket IIRC.


----------



## Lee0

Quote:


> Originally Posted by *GnarlyCharlie*
> 
> The screws are 4mm hex socket IIRC.


Ok, thank you very much. I will be going out and buying a proper tool to remove them now.







Now all that remains is my first question.


----------



## GnarlyCharlie

Quote:


> Originally Posted by *Lee0*
> 
> Ok,thank you very much. I will be going out and buying a proper tool to remove them now.
> 
> 
> 
> 
> 
> 
> 
> now all that remains is my first question.


On the two Phillips head screws? Try heating them up with a soldering iron pressed into the cross, then unscrew. The thread locking compound will soften with heat.


----------



## Lee0

Quote:


> Originally Posted by *GnarlyCharlie*
> 
> On the two Phillips head screws? Try heating them up with a soldering iron pressed into the cross, then unscrew. The thread locking compound will soften with heat.


Thank you for the answer, but I already fixed it! I took my Dremel to the screw and made a slit in it, then used a flathead screwdriver. I made sure to mask off the exposed PCB and the GPU, to protect them from any potential sparks and metal shavings.


----------



## songokuj5

Any chance of unlocking a Titan X Pascal into a Titan Xp, or is it certain that it suffered the laser cut?


----------



## DerComissar

Quote:


> Originally Posted by *songokuj5*
> 
> Any chance of unlocking a Titan X Pascal into a Titan Xp, or is it certain that it suffered the laser cut?


Not possible, you would have to buy a TXp for the full-core deal.

Or a Quadro P6000, lol.


----------



## songokuj5

Quote:


> Originally Posted by *DerComissar*
> 
> Not possible, you would have to buy a TXp for the full-core deal.
> 
> Or a Quadro P6000, lol.


I knew... but... hope is the last thing to die. Hehehe


----------



## DerComissar

Quote:


> Originally Posted by *songokuj5*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DerComissar*
> 
> Not possible, you would have to buy a TXp for the full-core deal.
> 
> Or a Quadro P6000, lol.
> 
> 
> 
> I knew... but... hope is the last thing to die. Hehehe
Click to expand...

Yeah, that would have been nice if it were possible.

I really haven't seen much difference in performance with the full-core TXp, at least compared to a 1080Ti I had previously.
But I just had to have one, lol.

Imo, the TX Pascal is still a damn good card.


----------



## Jpmboy

Quote:


> Originally Posted by *DerComissar*
> 
> Yeah, that would have been nice if it were possible.
> 
> I really haven't seen much difference in performance with the full-core TXp, at least compared to a 1080Ti I had previously.
> But I just had to have one, lol.
> 
> I*mo, the TX Pascal is still a damn good card*.


It's a fantastic card. I have a pair of each running right next to each other.... unless benching at the edge, you'd never know which was which.


----------



## dreamcat4

Hey guys! Since you are all enthusiasts here: did any of you know that apparently some guys over on laptop forums have been hardware-flashing the BIOS on Pascal, in order to get around the Nvidia driver signing?

It puzzles me, because no one else seems to have done that. Yet modding the BIOS and flashing it like that is significantly less dangerous than the hard voltage mod, and isn't for sure going to void the warranty either. You use an SPI flash clip on the SOIC-8 BIOS SMD package, then remove the clip when done.

Prior to finding that, I had assumed from the outset that the Nvidia signing DRM [for the BIOSes] was checked at runtime by the driver, at boot / upon driver loading and initialization. For this method to work, it would mean only the flashing process itself (nvflash or whatever they call it) is being blocked by a signature check. If that's true, then I guess what these laptop guys have been doing is legitimate.

Just thought you guys should know. Again, this all really surprises me. I must also point out that I have not been able to check/verify it myself. I certainly don't want to spread false hope about these things. And yes... I know maybe this topic belongs in another thread, in which case please move it there, moderators/admins.


----------



## Sheyster

Quote:


> Originally Posted by *Jpmboy*
> 
> It's a fantastic card. I have a pair of each running right next to each other.... unless benching at the edge, you'd never know which was which.


Indeed, I kinda regret bothering to upgrade to the Xp, but at least a good friend got a nice deal on the older card.


----------



## fernlander

Quote:


> Originally Posted by *dreamcat4*
> 
> Hey guys! Since you are all enthusiasts here: did any of you know that apparently some guys over on laptop forums have been hardware-flashing the BIOS on Pascal, in order to get around the Nvidia driver signing?
> 
> It puzzles me, because no one else seems to have done that. Yet modding the BIOS and flashing it like that is significantly less dangerous than the hard voltage mod, and isn't for sure going to void the warranty either. You use an SPI flash clip on the SOIC-8 BIOS SMD package, then remove the clip when done.
> 
> Prior to finding that, I had assumed from the outset that the Nvidia signing DRM [for the BIOSes] was checked at runtime by the driver, at boot / upon driver loading and initialization. For this method to work, it would mean only the flashing process itself (nvflash or whatever they call it) is being blocked by a signature check. If that's true, then I guess what these laptop guys have been doing is legitimate.
> 
> Just thought you guys should know. Again, this all really surprises me. I must also point out that I have not been able to check/verify it myself. I certainly don't want to spread false hope about these things. And yes... I know maybe this topic belongs in another thread, in which case please move it there, moderators/admins.


This is what I came to the thread to read about. Not so much about people consoling Titan X owners etc.


----------



## SirCanealot

Hey guys,

Got a Titan XP a lil while back (2nd hand, fairly cheap) and it's been a fun upgrade from my 1080!









I've put an Accelero Xtreme IV on it (I was thinking of putting something better on it, but at £45 the Accelero is great value for money) and the card under load is around 60-70C (while being pretty much silent).

I was thinking of doing the shunt mod at some point to get me a little more juice and have a couple of questions:

Would there be much point in getting some heatsinks for the memory and VRM? The Accelero does have a large backplate which does seem to work pretty well (it gets quite hot!), but would it help a little more to put some heatsinks on the front too?

And was there a consensus on the shunt mod if the card is vertical? I have a Raven 3 case, so did we figure out if the paste can actually run off the shunts over time? If so, I think people suggested sealing around it with liquid electrical tape, then sealing over the shunt to hold the paste in place?


----------



## jsutter71

Has anyone had any issues with the latest drivers? I haven't benchmarked in a while, so today when I opened up 3DMark and Afterburner my system showed artifacts and locked up. It continued to do this until I uninstalled Afterburner. Then I reinstalled, and even with a slight increase in core clock and memory it did the same thing. A few months ago I stopped using RivaTuner for the same reason. I'm getting so PO'd at these system updates which do more harm than good. These coders need to go back to school.


----------



## songokuj5

*Competition! Thx AMD!*

http://www.guru3d.com/files-details/geforce-385-12-driver-download.html

Nvidia just released a new driver. It is a bit of a mystery driver, as it *"Provides multiple Titan Xp performance optimizations on a variety of applications for prosumers and creatives"*. It seems Nvidia unlocked a couple of features in answer to yesterday's release of the AMD Radeon Pro WX 9100. These drivers add NVIDIA professional features for applications such as Maya, unlocking "3X more performance" for the software.

"Our latest driver - available today - delivers 3x more performance in applications like Maya to help you create and design faster than ever"


----------



## GosuPl

Quote:


> Originally Posted by *songokuj5*
> 
> *Competition! Thx AMD!*
> 
> http://www.guru3d.com/files-details/geforce-385-12-driver-download.html
> 
> Nvidia just released a new driver. It is a bit of a mystery driver, as it *"Provides multiple Titan Xp performance optimizations on a variety of applications for prosumers and creatives"*. It seems Nvidia unlocked a couple of features in answer to yesterday's release of the AMD Radeon Pro WX 9100. These drivers add NVIDIA professional features for applications such as Maya, unlocking "3X more performance" for the software.
> 
> "Our latest driver - available today - delivers 3x more performance in applications like Maya to help you create and design faster than ever"


Not bad







But does this work on the TX Pascal (2016) too? I can't find an answer :/


----------



## songokuj5

Quote:


> Originally Posted by *GosuPl*
> 
> Not bad
> 
> 
> 
> 
> 
> 
> 
> But does this work on the TX Pascal (2016) too? I can't find an answer :/


I hope so... I will do benchmark and post here later


----------



## GosuPl

Quote:


> Originally Posted by *songokuj5*
> 
> I hope so... I will do benchmark and post here later


Thanks!


----------



## GnarlyCharlie

Looks like more of a productivity upgrade - make Xp work more like a Quadro - than any sort of a gaming driver.


----------



## songokuj5




----------



## DerComissar

Quote:


> Originally Posted by *GnarlyCharlie*
> 
> Looks like more of a productivity upgrade - make Xp work more like a Quadro - than any sort of a gaming driver.
> 
> 
> Spoiler: Spoiler!


I did get a nice gain of about 250 points in Superposition with 385.12, vs. the last few drivers.



The only game I have installed atm is GTA V, it seems to be running smoother, with less stuttering issues than the previous drivers.
IQ looks good, that may be due to the placebo effect however.


----------



## GnarlyCharlie

Quote:


> Originally Posted by *DerComissar*
> 
> I did get a nice gain of about 250 points in Supo with 385.12, vs the last few drivers.
> 
> 
> 
> The only game I have installed atm is GTA V, it seems to be running smoother, with less stuttering issues than the previous drivers.
> IQ looks good, that may be due to the placebo effect however.


Good to hear! I'm surprised that it's crossing over into gaming gains, but that's welcome news. I haven't even downloaded it - my Xp rig has no "productivity" stuff on it. But it sounds like it might be worth the effort.


----------



## scoobied77

Quote:


> Originally Posted by *songokuj5*


So both the Titan X Pascal and the Titan Xp get a nice boost from the latest drivers, it seems.


----------



## Praze

Any chance I could get a benchmark or two from you nice folks in OctaneBench using the new driver? Can't seem to find anyone posting numbers online :\

Thanks!


----------



## Timmaigh!

Quote:


> Originally Posted by *Praze*
> 
> Any chance I could get a benchmark or two from you nice folks in OctaneBench using the new driver? Can't seem to find anyone posting numbers online :\
> 
> Thanks!


I am convinced the unlocked Quadro capabilities are going to have no influence on Octane. But yeah, just to be sure, I would like to see it as well.

Anyway, what interests me more personally is performance within 3ds Max. Dealing with massive scenes with millions of triangles using a regular GeForce card (I have a 1080) is, and always has been, an incredible pain, and I am pretty sure it was the same even with the Titan X before the latest driver - I wonder how much it has improved.


----------



## songokuj5

Quote:


> Originally Posted by *Praze*
> 
> Any chance I could get a benchmark or two from you nice folks in OctaneBench using the new driver? Can't seem to find anyone posting numbers online :\
> 
> Thanks!




Seems to be no difference compared with: https://render.otoy.com/octanebench/summary_detail_item.php?v=3.06.2&systemID=1x+TITAN+X+%28Pascal%29... Mine scores just a bit higher because it's overclocked.


----------



## jsutter71

Any stability issues with the beta driver? I'm always apprehensive with beta drivers from Nvidia, especially since I have 4 monitors in SLI.


----------



## GnarlyCharlie

It hasn't been great for me. I cleaned the old one and installed the new one, and that meant reconfiguring Afterburner. As soon as I just tried to move the mem slider, my system locked up hard and re-booted. I've been bashing video cards at the edge of stability for years and never saw that one before. Got it running and tried Superposition as it's generally a very forgiving load and wasn't doing all that great. Looked at AB and it was now set to Curve- I never use Curve.

GTA V crashed/hard locked. That game has never done that for me. I did have AB set maybe a little higher on the core than I normally run, but this was more than a driver crash. And the next day when I tried to power that rig up, it failed to boot. This has been an extremely stable rig that has never failed to boot since I got it all squared away a couple of months ago. Seems strange it would do that the day after I installed this driver. It's still loaded, I'll play with it a little more before throwing in the towel, but so far it's not been all that.


----------



## jsutter71

I've noticed some issues with Afterburner as of late: artifacts and lockups. I think MSI has not kept up with Nvidia's latest driver updates. At this point the only time I use Afterburner is during benchmarking. I went ahead and installed the beta driver. Just ran this.


----------



## DerComissar

Quote:


> Originally Posted by *GnarlyCharlie*
> 
> It hasn't been great for me. I cleaned the old one and installed the new one, and that meant reconfiguring Afterburner. As soon as I just tried to move the mem slider, my system locked up hard and re-booted. I've been bashing video cards at the edge of stability for years and never saw that one before. Got it running and tried Superposition as it's generally a very forgiving load and wasn't doing all that great. Looked at AB and it was now set to Curve- I never use Curve.
> 
> GTA V crashed/hard locked. That game has never done that for me. I did have AB set maybe a little higher on the core than I normally run, but this more than a driver crash. And the next day I tried to power that rig up, it failed to boot. This has been an extremely stable rig that has never failed to boot since I got it all squared away a couple of months ago. Seems strange it would do that the next day after I installed this driver. It's still loaded, I'll play with it a little more before throwing in the towel, but so far it's not been all that.


I'm really sorry to hear you're still having issues with this driver.
As I mentioned previously, it wasn't running properly for me either right after the initial install, but re-booting and re-starting AB got everything running normally.
We have some similarities as we're both running Supo and GTA V, lol.

I'm using AB version 4.3.0., and I did have a curve set-up on it previously, which hasn't changed:

Somewhat of a sloppy curve perhaps, but it works for me, lol.
No issues with it not downclocking at idle.

Well, if you don't get it working right for you, there will certainly be more driver releases to come from Nvidia.

Quote:


> Originally Posted by *jsutter71*
> 
> I've noticed some issues with afterburner as of late. artifacts and lockups. I think MSI has not kept up with NVIDIA with the latest driver updates. At this point the only time I use afterburner is during benchmarking. I went ahead and installed the beta driver. Just ran this.


Nice score, a lot of horsepower there with the dual-TXP's and 6950X.
Did you get any improvement over previous drivers?


----------



## GnarlyCharlie

Quote:


> Originally Posted by *DerComissar*
> 
> I'm really sorry to hear you're still having issues with this driver.


Well, not really "still", I haven't had that rig on since the failure to boot - other than to reset and see if it would boot at all.

And jsutter, I don't think you have the SLI patch or whatever it's called in Supo - your score is right in line with what a single card should do in 4K Optimized Supo. It doesn't support SLI without some sort of patch from what I've read.


----------



## jsutter71

Quote:


> Originally Posted by *DerComissar*
> 
> I'm really sorry to hear you're still having issues with this driver.
> As I mentioned previously, it wasn't running properly for me either right after the initial install, but re-booting and re-starting AB got everything running normally.
> We have some similarities as we're both running Supo and GTA V, lol.
> 
> I'm using AB version 4.3.0., and I did have a curve set-up on it previously, which hasn't changed:
> 
> Somewhat of a sloppy curve perhaps, but it works for me, lol.
> No issues with it not downclocking at idle.
> 
> Well, if you don't get it working right for you, there will certainly be more driver releases to come from Nvidia.
> Nice score, a lot of horsepower there with the dual-TXP's and 6950X.
> Did you get any improvement over previous drivers?


Some. Here is my previous best score


----------



## Nikos4Life

Still not possible to flash this card, right? As I've seen, it is now possible to flash the 1080 Ti.
Kind regards


----------



## Jpmboy

Quote:


> Originally Posted by *fernlander*
> 
> This is what I came to the thread to read about. Not so much about people consoling Titan X owners etc.


Necro.. but flashing Pascal cards is not a mystery (I've done 1080s and 1080 Tis many times). The TXp has a different lockout than the 1080 and 1080 Tis. Laptop mobile GPUs are not full-die Pascal.


----------



## Vellinious

Yeah, but what would you flash the Titan with? There are no custom boards out there with unlocked voltages, and they all have the same power limit. /shrug


----------



## Jpmboy

Quote:


> Originally Posted by *Vellinious*
> 
> Yeah, but what would you flash the Titan with? There are no custom boards out there with unlocked voltages, and they all have the same power limit. /shrug


that's the issue, not flashing. it's the lockout for bios editing.


----------



## lilchronic

Quote:


> Originally Posted by *Jpmboy*
> 
> that's the issue, not flashing. it's the lockout for bios editing.


I always thought it was that you can't flash an edited bios.


----------



## Vellinious

Quote:


> Originally Posted by *lilchronic*
> 
> I always thought it was that you cant flash an edited bios.


There is no pascal bios editor.


----------



## lilchronic

Quote:


> Originally Posted by *Vellinious*
> 
> There is no pascal bios editor.


There are other ways to edit a bios.


----------



## Zurv

Quote:


> Originally Posted by *jsutter71*
> 
> Some. Here is my previous best score


why disable DOF? that makes the score misleading.


----------



## Zurv

Why are people still buying the Titan X (2016)? I just put a few up on Amazon for over $700 each.. and they sold right away. So strange.


----------



## jsutter71

Quote:


> Originally Posted by *Zurv*
> 
> Why are people buying titan X (2016) still? i just put a few up on amazon for over $700 each.. and they sold right away. so strange.


Maybe they thought they were pascal versions. Or perhaps they're just impulsive shoppers who don't do their homework. A lot of possible reasons.


----------



## GnarlyCharlie

Quote:


> Originally Posted by *jsutter71*
> 
> Maybe they thought they were pascal versions. Or perhaps they're just impulsive shoppers who don't do their homework. A lot of possible reasons.


The Titan X 2016 is a Pascal; it was the first Pascal Titan X. Then along came the second Pascal Titan X, the Xp.


----------



## unreality

Still astounded what these cards can do. Playing GTA V NaturalVision Remastered on 5k atm with 80-100 fps.

With one card that is! Awesome!


----------



## CptSpig

Quote:


> Originally Posted by *unreality*
> 
> Still astounded what these cards can do. Playing GTA V NaturalVision Remastered on 5k atm with 80-100 fps.
> 
> With one card that is! Awesome!


Now that's awesome!


----------



## DerComissar

Quote:


> Originally Posted by *unreality*
> 
> Still astounded what these cards can do. Playing GTA V NaturalVision Remastered on 5k atm with 80-100 fps.
> 
> With one card that is! Awesome!


----------



## Mr-Dark

Hello

My first Titan ever













BNIB for same price as 1080 Ti FE.. so why not ?


----------



## KillerBee33

Quote:


> Originally Posted by *Mr-Dark*
> 
> Hello
> 
> My first Titan ever
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BNIB for same price as 1080 Ti FE.. so why not ?


Welcome Dark, finally made it


----------



## Mr-Dark

Quote:


> Originally Posted by *KillerBee33*
> 
> Welcome Dark, finally made it


Thanks bro







i'm also thinking about custom loop


----------



## KillerBee33

Quote:


> Originally Posted by *Mr-Dark*
> 
> Thanks bro
> 
> 
> 
> 
> 
> 
> 
> i'm also thinking about custom loop


Haven't started mine in 6 months, it's locked up in storage... not sure what the procedure is when I do, having had that liquid untouched for 6 months.


----------



## Mr-Dark

Quote:


> Originally Posted by *KillerBee33*
> 
> Havent started mine in 6 months , it's locked up in Storage...not sure what the Procedure is when i do...Having that liquid untouched for 6 months


Why? Is your Titan on air now?

What do you think about the EVGA Hybrid kit for the Ti on this Titan?


----------



## KillerBee33

Quote:


> Originally Posted by *Mr-Dark*
> 
> why ? your titan on AIR now ?
> 
> what you think about the Evga hybrid kit for the Ti on this Titan ?


Whole setup is in storage now.
Had the Hybrid kit on it for a few days, then decided to try a loop. The kit keeps the TITAN in the mid 60's, which isn't bad compared to what you'll spend on a loop.


----------



## Mr-Dark

Quote:


> Originally Posted by *KillerBee33*
> 
> Whole setup in Storage now.
> Had Hybrid Kit on it for few days than decided to try a loop, KIT gets TITAN in the mid 60's which isn't bad comparing to what you'll spend on a loop.


Oops, I think it's a lack of time for the PC









the loop will cost me $700 easily.. while the Hybrid kit will cost me $250..

but I have a Corsair H110 and I can buy the Kraken G10.. so the total cost is around $130.. what do you think?


----------



## KillerBee33

Quote:


> Originally Posted by *Mr-Dark*
> 
> Oops, i think its the lack of time for the pc
> 
> 
> 
> 
> 
> 
> 
> 
> 
> the loop will cost me 700$ easily.. while the Hybrid kit will cost me 250$..
> 
> but i have Corsair H110 and i can buy the kraKEN g10 .. so the total cost around 130$.. what you think ?


Get the KIT , i only got the loop to waste time and try to be creative


----------



## Mr-Dark

Quote:


> Originally Posted by *KillerBee33*
> 
> Get the KIT , i only got the loop to waste time and try to be creative


Hello

Finally the 1080 ti kit in stock on amazon..

https://www.amazon.com/gp/product/B074CPP2TS/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1

this will fit the Titan X without any modification, right?


----------



## KillerBee33

Quote:


> Originally Posted by *Mr-Dark*
> 
> Hello
> 
> Finally the 1080 ti kit in stock on amazon..
> 
> https://www.amazon.com/gp/product/B074CPP2TS/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1
> 
> this will fit the Titan X without any modification right ?


I had the 1080 kit on a TXP without the cover... I think I saw people doing mods with 1080 covers to fit the TXP. Honestly I don't know, but you could ask EVGA directly and see what they say. Hmm, I don't see Live Chat on EVGA anymore. I guess ask people around here then.


----------



## pez

You can do a 'ghetto mod' and use most of the original shroud of the Titan with a 1080 Hybrid kit or even cut a couple places on the EVGA shroud if you're in love with it for whatever reason -- or overpay for the 1080 Ti kit that uses the same AIO part. Plus, the 'ghetto modded' look of the FE cooler and the AIO pump still looks better than the EVGA shroud for the 1080/1080 Ti.


----------



## jbyron

The Titan X Pascal specific hybrid cooler (EVGA GTX TITAN X (Pascal) / GTX 1080 Ti FE HYBRID Waterblock Cooler, Cooling, 400-HY-5388-B1) came back in stock briefly a week ago and I managed to snag one before it went out of stock.

Gained almost 1k in TimeSpy just from the card not throttling


----------



## jsutter71

Does anyone know how to remove these left side green status bars while playing steam games?


----------



## CptSpig

Quote:


> Originally Posted by *jsutter71*
> 
> Does anyone know how to remove these left side green status bars while playing steam games?


In your Nvidia control panel, at the top near 3D settings, there's an SLI option. Drop down the box and turn off the SLI OSD.


----------



## jsutter71

Quote:


> Originally Posted by *CptSpig*
> 
> In your Nvidia control panel at the top near 3D settings under SLI. Drop down the box and turn of SLI OSD.


Thank you much. I just gave you another REP.


----------



## CptSpig

Quote:


> Originally Posted by *jsutter71*
> 
> Thank you much. I just gave you another REP.


Thanks so much! I had to try and remember my SLI days, but it looks like you figured it out.


----------



## jsutter71

Quote:


> Originally Posted by *CptSpig*
> 
> Thanks so much! I had to try and remember my SLI days but it looks like you figuard it out.


HAHA. I just gave you your 35th REP. That means you can start selling if you like.


----------



## CptSpig

Quote:


> Originally Posted by *jsutter71*
> 
> HAHA. I just gave you your 35th REP. That means you can start selling if you like.


Yes it's about time to sell my board and processor moving to X299....


----------



## jsutter71

Quote:


> Originally Posted by *CptSpig*
> 
> Yes it's about time to sell my board and processor moving to X299....


I'm waiting for the manufacturers to release a workstation board before I consider any type of upgrade. I'm not pleased with the current offerings and don't like sacrificing bandwidth for peripherals.


----------



## CptSpig

Quote:


> Originally Posted by *jsutter71*
> 
> I'm waiting for the manufacturers to release a workstation board before I consider any type of upgrade. I'm not pleased with the current offerings and don't like to sacrifice bandwidth over peripherals.


I agree, that's why I am going with an Asus X299 Apex and i9-7980XE. Simple and fast.


----------



## jsutter71

Quote:


> Originally Posted by *CptSpig*
> 
> I agree that's why I am going with a Asus x299 apex and i9-7980xe. Simple and fast.


Yeah, about that. Something's not right. Looking at the manual next to the overview page, there looks to be a discrepancy with the PCIe slots.


----------



## CptSpig

Quote:


> Originally Posted by *jsutter71*
> 
> Yeah about that. Something's not right. Looking at the manual next to the overview page. Looks like a discrepancy with the PCIe slots.
> 
> 
> Spoiler: Warning: Spoiler!


Yes, I noticed that when I first looked at the manual. The board actually has 4-way SLI. See picture below.


----------



## Nikos4Life

I am waiting for the Asus Rampage VI Extreme as well, plus the 7980XE. Do you think delidding here is a must?

cheers


----------



## CptSpig

Quote:


> Originally Posted by *Nikos4Life*
> 
> I am waiting to the Asus Rampage VI Extreme as well + 7980XE. Do you think delidding here is a must?
> 
> cheers


Depends on cooling. Water for 24/7 4.5 GHz OC should be fine. For benching I am using a Koolance chiller. I don't like delidding a $2,000.00 processor.


----------



## jsutter71

My issue with the X299 motherboards: the new CPUs support 44 PCIe lanes total. That's great on paper, but none of the new boards I've seen so far fully take advantage of the added bandwidth. You're still choosing between quad x16 PCIe support, which I have yet to see offered, or multiple x4 M.2 slots. Then there's my motherboard, which adds a second PLX chip: I can have quad x16 PCIe, x4 M.2, and 12 SATA3 ports simultaneously with zero bandwidth loss. That's my point. Why haven't any of the manufacturers done this yet for the X299 platform? Granted, I only have 1 M.2 slot, but with the addition of my HighPoint controller cards I added 8 more RAIDable M.2 slots with no bandwidth loss.
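To put rough numbers on that lane budget, here's a quick Python sketch (the slot mix is illustrative, not any particular board's actual layout) showing why quad x16 can't be fed from a 44-lane CPU without a PLX switch:

```python
# Simplified PCIe lane budget for a 44-lane X299 CPU (illustrative slot mix).
CPU_LANES = 44

slots = {
    "quad x16 GPU slots": 4 * 16,  # 64 lanes
    "three x4 M.2 slots": 3 * 4,   # 12 lanes
}

demand = sum(slots.values())
deficit = demand - CPU_LANES
print(f"Demand: {demand} lanes vs {CPU_LANES} from the CPU")
print(f"Shortfall without a switch: {deficit} lanes")
# A PLX switch bridges the gap: it takes one x16 uplink from the CPU and
# fans it out to multiple downstream x16 links, sharing the uplink's
# bandwidth rather than adding real lanes.
```

The switch doesn't create bandwidth, it multiplexes it, which is part of why both sides of this debate can be right: every card negotiates x16, but they share uplink bandwidth when loaded simultaneously.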


----------



## CptSpig

Quote:


> Originally Posted by *jsutter71*
> 
> My issue with the X299 motherboards. So 44 lanes total bandwidth support for the new CPU's in their PCIe slots. That's great on paper for the CPU but non of these new boards from what I've seen so far are fully able to take advantage of the added bandwidth. Still choosing between either quad X16 PCie support, which I have yet to seem offered, or multiple X4 M.2 support. And then you have my motherboard which adds a second PLX chip. I can have quad X16 PCIe, X4 M.2, and 12 SATA3 slots simultaneously with zero bandwidth loss. That's my point. Why hasn't any of the manufacturers done this yet for the X299 platform. Granted I only have 1 M.2 slot but with the addition of my Highpoint controller cards I added 8 additional & RAIDable M.2 slots with no bandwidth loss.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 


The machine I am building is for benching and gaming with only one card. Using a PLX chip to emulate x16 is not good for a gaming board. I have a few PLX-equipped boards, and they all suffer from some sort of delay.


----------



## jsutter71

Quote:


> Originally Posted by *CptSpig*
> 
> The machine I am building is for benching and gaming with only one card. Using a PLX chip to resemble or imitate x16 is not good for a gaming board. I have a few of PLX equipped boards, and they all suffer from some sort of delay


I'm not sure why you would think that, because my system has consistently ranked high among its gaming equivalents, with the benchmarks to back it up. As far as delay, I presume you're referring to the boot process. The slower boot times have to do with the system checking the memory. That feature can be disabled, and my boot times aren't insufferable: not 7 seconds, but 34. I'd gladly sacrifice 27 seconds for more bandwidth. If gaming is all you're doing and you're spending 2K on the CPU, then surely you could afford to SLI for maximum speed. Another added benefit of owning a workstation board with Asus is a 3-year warranty. Just my 2 cents.


----------



## CptSpig

Quote:


> Originally Posted by *jsutter71*
> 
> I'm not sure why you would think that because my system has consistently ranked high among it's gaming equivalents with the benchmarks to back it up. As far as delay I presume your referring to the boot process. The slower boot times have to do with the system checking the memory. That feature is able to be disabled and my boot times aren't insufferable. Not 7 seconds but 34. I'd gladly sacrifice 27 seconds for more bandwidth. If gaming is all you're doing and your spending 2K on the CPU then surely your could afford to SLI for maximum speed. Another added benefit of owning a workstation board with Asus is a 3 year warranty. Just my 2 cents.


I am talking about the time it takes going through a PLX chip vs. straight to the CPU. Do you have Titan X (Pascal) or Titan Xp cards? We can compare benchmarks. I am not being sarcastic, just curious.


----------



## jsutter71

Quote:


> Originally Posted by *CptSpig*
> 
> I am talking about the time it takes going through a PLX chip vs straight to the CPU. Do you have Titan X (Pascal) or Titan Xp cards. We can compare bench marks. I am not being sarcastic just curious


Titan X (Pascal)

FS
https://www.3dmark.com/fs/12514604
FS ultra
https://www.3dmark.com/fs/12360076
FS extreme
https://www.3dmark.com/fs/12392853
TS
https://www.3dmark.com/spy/1679851




Just for grins Here's 8K I didn't really tweak this one as much as I could have.


----------



## CptSpig

Quote:


> Originally Posted by *jsutter71*
> 
> Titan X (Pascal)
> 
> FS
> https://www.3dmark.com/fs/12514604
> FS ultra
> https://www.3dmark.com/fs/12360076
> FS extreme
> https://www.3dmark.com/fs/12392853
> TS
> https://www.3dmark.com/spy/1679851
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Just for grins Here's 8K I didn't really tweak this one as much as I could have.


Here are my benches with one Titan Xp. You can see the difference between one card and two; you should be able to blow me away.


----------



## jsutter71

Quote:


> Originally Posted by *CptSpig*
> 
> Here is my benches with one TitanXp. You can see the difference one card vs two cards you should be able to blow me away.


And I'm sure I could if I was also using a chiller. What are your Futuremark scores?


----------



## CptSpig

Quote:


> Originally Posted by *jsutter71*
> 
> And I'm sure I could If I was also using a chiller. What are your Futuremark scores?


@jsutter71 Titan X (Pascal) single card no chiller:

http://www.3dmark.com/spy/1575675
http://www.3dmark.com/3dm11/12112457
http://www.3dmark.com/fs/12273154
http://www.3dmark.com/fs/12273074
http://www.3dmark.com/fs/12272158

Titan Xp single card:

http://www.3dmark.com/fs/13542383
http://www.3dmark.com/fs/13535792
http://www.3dmark.com/fs/13499488
http://www.3dmark.com/spy/2292442


----------



## jsutter71

Quote:


> Originally Posted by *CptSpig*
> 
> @jsutter71 Titan X (Pascal) single card no chiller:
> 
> http://www.3dmark.com/spy/1575675
> http://www.3dmark.com/3dm11/12112457
> http://www.3dmark.com/fs/12273154
> http://www.3dmark.com/fs/12273074
> http://www.3dmark.com/fs/12272158
> 
> Titan Xp single card:
> 
> http://www.3dmark.com/fs/13542383
> http://www.3dmark.com/fs/13535792
> http://www.3dmark.com/fs/13499488
> http://www.3dmark.com/spy/2292442


Comparison

TS
https://www.3dmark.com/compare/spy/1575675/spy/1679851/spy/2292442#
FS
https://www.3dmark.com/compare/fs/12273154/fs/13542383/fs/12514604
FS ultra
https://www.3dmark.com/compare/fs/12273074/fs/13499488/fs/12360076
FS Extreme.
https://www.3dmark.com/compare/fs/12272158/fs/13535792/fs/12392853

I never used 3DMark 11

Regarding Superposition, I put very little thought into improving those scores. Background apps and having 4 monitors attached to my system might have affected the outcome. Back to the original statement regarding PLX chips affecting system performance or causing any type of delay: I just don't see how that could be true. I believe that BIOS settings, cooling, and peripheral equipment are the primary determining factors for system performance.


----------



## CptSpig

Quote:


> Originally Posted by *jsutter71*
> 
> Comparason
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> TS
> https://www.3dmark.com/compare/spy/1575675/spy/1679851/spy/2292442#
> FS
> https://www.3dmark.com/compare/fs/12273154/fs/13542383/fs/12514604
> FS ultra
> https://www.3dmark.com/compare/fs/12273074/fs/13499488/fs/12360076
> FS Extreme.
> https://www.3dmark.com/compare/fs/12272158/fs/13535792/fs/12392853
> 
> 
> 
> I never used 3DMark 11
> 
> Regarding Superposition I put very little thought process into improving those scores. Background apps and having 4 monitors attached to my system might have affected the outcome. Back to the original statement regarding PLX chips affecting system performance or causing any type of delays. I just don't see how that could be true. I believe that BIOS settings, cooling, and peripheral equipment are the primary determining factors for system performance.


Yes, these are all factors. Try running your benchmarks with SLI disabled (single card) and see how it affects your scores. When you get a chance, of course.


----------



## jsutter71

Quote:


> Originally Posted by *CptSpig*
> 
> Yes these are all factors. Try running your bench marks with SLI disabled (single card) and see how it effects your scores. When you get a chance of corse.


Good call... I think that would be a better comparison of our systems. There is another factor to consider though: memory speed. When I first set up my system with the current CPU I was using DDR4-2400 CL15. After I upgraded to 3200 CL14 I had significant speed increases all around.


----------



## CptSpig

Quote:


> Originally Posted by *jsutter71*
> 
> Good Call...I think that would be a better comparison of our systems. Their is another factor to consider though. Memory speed. When I first set up my system with current CPU I was using DDR4 2400 CL 15. After I upgraded to 3200 CL14 I had significant speed increases all around.


That's good, I am running 3200 MHz 14-16-16-42 1T memory. Should be a good comparison.


----------



## jsutter71

Quote:


> Originally Posted by *CptSpig*
> 
> Yes these are all factors. Try running your bench marks with SLI disabled (single card) and see how it effects your scores. When you get a chance of corse.


That chiller makes a big difference.







Your CPU at 45 with NB @ 38.

This is the fastest I've been able to push my CPU, and I purchased it from Silicon Lottery.

CPU @ 44 and NB @ 37


CPU @ 43 and NB @ 38. By lowering the CPU clock and raising the NB, memory speed improves.


With your CPU speeds that high your results would be faster.


----------



## CptSpig

Quote:


> Originally Posted by *jsutter71*
> 
> That chiller makes a big difference.
> 
> 
> 
> 
> 
> 
> 
> Your CPU at 45 with NB @ 38.
> 
> This is the fastest I've been able to push my CPU and I purchased from Silicon Lottery.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> CPU @ 44 and NB @ 37
> 
> 
> CPU @ 43 and NB @ 38. By lowering the CPU clock, and raising the NB, memory speed improves
> 
> 
> 
> 
> With your CPU speeds that high your results would be faster.


@jsutter71 Did you look at these scores? Titan X (Pascal), no chiller, CPU @ 4.4 GHz:

http://www.3dmark.com/spy/1575675
http://www.3dmark.com/3dm11/12112457
http://www.3dmark.com/fs/12273154
http://www.3dmark.com/fs/12273074
http://www.3dmark.com/fs/12272158


----------



## jsutter71

Quote:


> Originally Posted by *CptSpig*
> 
> @jsutter71 Did you look at these scores Titan X (Pascal) no chiller. CPU @ 4.4 GHz
> 
> http://www.3dmark.com/spy/1575675
> http://www.3dmark.com/3dm11/12112457
> http://www.3dmark.com/fs/12273154
> http://www.3dmark.com/fs/12273074
> http://www.3dmark.com/fs/12272158


The scores yes but not the rest.


----------



## CptSpig

Quote:


> Originally Posted by *jsutter71*
> 
> The scores yes but not the rest.


By the way, thanks for your service to our country. People like you make it very easy to be a proud American!







Your machine is very nice... what's up with all the storage, i.e. the hard drives?


----------



## jsutter71

Quote:


> Originally Posted by *CptSpig*
> 
> By the way thanks for your service to our country people like you make it very easy to be a proud American!
> 
> 
> 
> 
> 
> 
> 
> Your machine is very nice.....what's up with all the storage i.e.: hard drives?


I can't express how much I appreciate what you said, and thank you for checking out my system.

About the storage.
Games are awesome, and when I built my system I wanted to be able to play them in 4K with the hardware to support it. My other hobby is photography. To support that I wanted a fast CPU, tons of storage, and great monitors with near-perfect color accuracy. My primary monitor is a 4096x2160 IPS display, which makes picture viewing & editing very nice. Granted it's not a gaming monitor, not the fastest @ 5ms, but it's fast enough to make the games I do play look beautiful.


----------



## CptSpig

Quote:


> Originally Posted by *jsutter71*
> 
> I can't express how much I appreciate what you said, and thank you for checking out my system.
> 
> About the storage.
> Games are awesome, and when I built my system I wanted to be able to do that in 4K with the hardware to support it. My other hobby is photography. To support that I wanted a fast CPU, tons of storage, and great monitors with near perfect color accuracy. My primary monitor is a 4096x2160 IPS display, which makes picture viewing & editing very nice. Granted it's not a gaming monitor, not the fastest @ 5ms, but It's fast enough to make the games I do play look beautiful.


----------



## PowerK

Quote:


> Originally Posted by *jsutter71*
> 
> I can't express how much I appreciate what you said, and thank you for checking out my system.
> 
> About the storage.
> Games are awesome, and when I built my system I wanted to be able to do that in 4K with the hardware to support it. My other hobby is photography. To support that I wanted a fast CPU, tons of storage, and great monitors with near perfect color accuracy. My primary monitor is a 4096x2160 IPS display, which makes picture viewing & editing very nice. Granted it's not a gaming monitor, not the fastest @ 5ms, but It's fast enough to make the games I do play look beautiful.


----------



## Jpmboy

Quote:


> Originally Posted by *jsutter71*
> 
> That chiller makes a big difference.
> 
> 
> 
> 
> 
> 
> 
> Your CPU at 45 with NB @ 38.
> 
> This is the fastest I've been able to push my CPU and I purchased from Silicon Lottery.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> CPU @ 44 and NB @ 37
> 
> 
> 
> CPU @ 43 and NB @ 38. *By lowering the CPU clock, and raising the NB, memory speed improves*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> With your CPU speeds that high your results would be faster.


it should.
AIDA64 bandwidth is very cache dependent. Not all memory tests are as latched to cache speed as AIDA's. SiSoft Sandra gives a more "granular" assessment of memory, but it takes much longer to run.


----------



## CptSpig

Quote:


> Originally Posted by *Jpmboy*
> 
> it should.
> AID64 bandwidth is very cache dependent. Not all memory tests are a latched to cache speed as AIDA. Sisoft Sandra gives a more "granular" assessment of memory, but takes much longer to run.


The links I attached are with the same CPU speeds and same card he is running. No chiller! Should be a fair comparison.


----------



## jsutter71

GeForce Experience system tray icon removed.

I know that some of you do not use GeForce Experience or even have it installed, but some of us do. I really wish that Nvidia didn't try to "improve" their software by throwing up additional barriers. At the very least, give us the option to turn the system tray icon back on. What an annoyance. Some of us have the horsepower where this is not an issue.


----------



## jsutter71

*Parallel VS Serial*

Is there anyone here with an SLI setup using a serial terminal block over their cards? If so, have you ever used parallel? And again, if so, which was preferred? I chose parallel for increased flow, but aesthetically speaking I'm not fond of having to route hard tubing around power cords. It would be easier to just have the coolant exit front to back. I'm considering changing, but am still undecided due to the decreased flow. Bubbles aren't a huge issue because I have 3 pumps, but some always seem to get trapped in the block, likely because of how the tubing is routed.
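One way to reason about the serial vs. parallel trade-off is a crude flow-restriction model, treating each block like a resistor (identical blocks assumed; real loops with pump head curves and turbulence are messier, so this is only a sketch):

```python
# Crude model: each water block is a flow restriction R (resistor analogy).
def series(*restrictions):
    # Serial terminal: coolant passes through every block in turn.
    return sum(restrictions)

def parallel(*restrictions):
    # Parallel terminal: flow splits between the blocks.
    return 1.0 / sum(1.0 / r for r in restrictions)

R_BLOCK = 1.0  # arbitrary restriction units for one GPU block

print(series(R_BLOCK, R_BLOCK))    # two blocks in series: double the restriction
print(parallel(R_BLOCK, R_BLOCK))  # two blocks in parallel: half the restriction
```

So a parallel terminal roughly halves the loop restriction, but each block only sees about half the total flow, which is why temperatures between the two layouts usually end up closer than the raw flow numbers suggest.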


----------



## 5150 Joker

Any unlocked vbios yet?


----------



## Artah

Guys help us out in folding to cure diseases this Monday info here. http://www.overclock.net/t/1637951/september-2017-foldathon-monday-18th-wednesday-20th-1200-et-1600-utc/0_100#post_26344781


----------



## Glerox

Sup

I have a pretty beefy rig with two Titan Xs and a custom loop.
I've been mining almost 24/7 with it (I don't pay for my electricity) for a month or two, and I think it really doesn't like it lol...

I can see the flow is really week now. And the temps are soooo bad. There is always bubbles also in the loop. I have to move them around each time and replace the air with liquid in the reservoir... pretty annoying.

Anyone else tried running a custom loop and gpus 24/7? I wonder if the pump is dying. And i don't understand why there is always bubbles.
This is my rig : https://pcpartpicker.com/b/DvfH99

Thanks!


----------



## jmaz87

Had to edit to say nice PC btw! It's like u took mine and kept going with hardlines and another Titan! Looks great!
What power mod are u referring to in your build? I'm very curious, I must have missed that discussion in the thread. I think I'm only in the 2050MHz area with +500 vram, and my 6850K is normally @4.2; full load is usually 50s tops, with the GPU never exceeding 45.

I've had my Titan X Pascal since day one launch, and other than taking my EK block off once recently while I was flushing the system to clean things, I haven't touched the loop. Single GPU and CPU and 2 rads; temps, pump etc. all fine. I have changed the coolant once. Mostly EK gear btw.

Leaving any OC'd PC on full load 24/7 puts mileage on it, and ANY really high temps can degrade your coolant, causing premature breakdown. Bad stuff...

As for bubbles, if ur loop is poorly designed air can take a while to work its way out, and everybody has to add a little over time, but we're talking mL.


----------



## lanofsong

Hey Titan X Pascal owners,

We are having our monthly Foldathon from Monday 16th - Wednesday 18th - 12noon EST.
Would you consider putting all that awesome GPU power to a good cause for those 2 days? If so, come *sign up* and fold with us - see attached link.

October 2017 Foldathon

To get started:

1. Get a passkey (allows for speed bonus) - needs a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong
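For reference, the same name / passkey / team settings from the steps above can also go straight into FAHClient's config.xml (a minimal sketch; the exact schema and the placeholder values are assumptions to check against your client version):

```xml
<config>
  <!-- Identity: your folding name and the passkey from step 1 (placeholders) -->
  <user value="YourFoldingName"/>
  <passkey value="your32characterpasskey"/>
  <!-- Team OCN -->
  <team value="37726"/>
  <!-- One folding slot per GPU -->
  <slot id="0" type="GPU"/>
</config>
```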


----------



## Glerox

This is the power mod I did : http://www.overclock.net/t/1608437/tutorial-power-target-limit-hardware-mod-shunt-mod-for-titan-x-and-many-other-nvidia-gpus

Honestly, it's not worth it except for benchmarking higher scores.

I haven't cleaned my loop yet. Maybe I'll try that. Some guy thinks it's the block that needs to be cleaned.

This is when hardline is less fun...


----------



## GosuPl

TITAN X Pascal LC SLI , with [email protected] Ghz vs i9 [email protected] Ghz






Look at the temps, a decent custom loop is a great thing ;-)

CPU bottlenecks at 1080p/1440p ;-)


----------



## jmaz87

I'd be interested to see what voltages people are getting away with for different things. IIRC I hit a wall @4300MHz, but now I suspect my PSU since replacing it recently...


----------



## Asus11

Quote:


> Originally Posted by *GosuPl*
> 
> TITAN X Pascal LC SLI , with [email protected] Ghz vs i9 [email protected] Ghz
> 
> 
> 
> 
> 
> 
> Look at the temps, a decent custom loop is a great thing ;-)
> 
> CPU bottlenecks at 1080p/1440p ;-)


nice review.. the 6950X seems to be the winner









I can get the 6950X very cheap now as well, which makes it tempting to think of an ITX build with it









13.75MB cache vs 25MB cache.. maybe that's the reason?

I was pondering a 6950X and disabling HT, what do you think? Hopefully it'd run cooler, plus maybe a little better overclock


----------



## GosuPl

Quote:


> Originally Posted by *Asus11*
> 
> nice review.. the 6950X seems to be the winner
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can get the 6950X very cheap now as well, which makes it tempting to think of an ITX build with it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 13.75MB cache vs 25MB cache.. maybe that's the reason?
> 
> I was pondering a 6950X and disabling HT, what do you think? Hopefully it'd run cooler, plus maybe a little better overclock


I always use HT









Soon I will upload test part II, then GTX 1070 Ti tests (after the NDA lifts), and after that TR 1950X + SLI TXP / single TXP vs 6950X / 7900X.

TR 1950X is trash for SLI, at 1440p it works worse than a single GPU ^^

Anyway, I will upload a few interesting tests with 6950X vs 7900X:

- [email protected] vs [email protected] + mem 3200 CL15.15.15 T1 both (6950X just crushed the 7900X)
- [email protected] + mem 3600 CL15.15.15 T1 vs mem 2400 CL14.14.14 T1, HUGE perf differences.

And soon I will test the i9 7980XE + 4x TXP
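The memory comparisons above read more easily in absolute time: effective CAS latency is CL cycles at a clock of half the transfer rate, so DDR4-3600 CL15 is actually lower latency than DDR4-2400 CL14 as well as higher bandwidth. A quick sketch of the arithmetic:

```python
def cas_ns(mt_per_s, cl):
    """Absolute CAS latency in ns: CL clock cycles, clock = transfer rate / 2."""
    return cl * 2000.0 / mt_per_s  # period in ns is 2000 / (MT/s)

for rate, cl in ((3600, 15), (3200, 15), (2400, 14)):
    print(f"DDR4-{rate} CL{cl}: {cas_ns(rate, cl):.2f} ns")
# DDR4-3600 CL15: 8.33 ns
# DDR4-3200 CL15: 9.38 ns
# DDR4-2400 CL14: 11.67 ns
```

So the 3600 kit wins on both counts, which is consistent with the "HUGE perf differences" reported above.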


----------



## Asus11

Quote:


> Originally Posted by *GosuPl*
> 
> I always use HT
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Soon I will upload test part II, then GTX 1070 Ti tests (after the NDA lifts), and after that TR 1950X + SLI TXP / single TXP vs 6950X / 7900X.
> 
> TR 1950X is trash for SLI, at 1440p it works worse than a single GPU ^^
> 
> Anyway, I will upload a few interesting tests with 6950X vs 7900X:
> 
> - [email protected] vs [email protected] + mem 3200 CL15.15.15 T1 both (6950X just crushed the 7900X)
> - [email protected] + mem 3600 CL15.15.15 T1 vs mem 2400 CL14.14.14 T1, HUGE perf differences.
> 
> And soon I will test the i9 7980XE + 4x TXP


I will look forward to these









if I could use a 6950X in an mITX build I might consider it









but it seems the ASRock X99E-ITX board has a custom mount for the CPU cooler, so I would not be able to watercool









edit: look what I found https://www.bit-tech.net/news/tech/bitspower-waterblock-for-asrock-s-x99e-itx/1/


----------



## xarot

Quote:


> Originally Posted by *Asus11*
> 
> I will look forward to these
> 
> 
> 
> 
> 
> 
> 
> 
> 
> if I could use 6950x in Mitx build I might consider it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> but it seems the Asrock X99e itx board has a custom design for the CPU cooler so I would not be able to watercool
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: look what I found https://www.bit-tech.net/news/tech/bitspower-waterblock-for-asrock-s-x99e-itx/1/


You can use the Narrow ILM mounting for normal EK Supremacy block. https://www.ekwb.com/shop/mounting-plate-supremacy-lga-2011-narrow-ilm


----------



## GosuPl

SLI 2x TITAN X Pascal LC on i7 [email protected] GHz vs i9 [email protected] Ghz

Mem for 6950X - 3200 CL 15.15.15 T1 (above that my BW-E hits a wall ^^)
Mem for 7900X - 3600 CL 15.15.15 T1 (SKL-X IMC is great)














10 games - 1080p/1440p - CPU bottlenecks ;-) 4k/5k - a delicious meal for the PC master race


----------



## d34th25

I'm looking to buy another Titan X Pascal (2016) from anyone who is willing to sell.

We can discuss the price in a PM thanks.


----------



## nycgtr

Quote:


> Originally Posted by *d34th25*
> 
> I'm looking to buy another Titan X Pascal (2016) from anyone who is willing to sell.
> 
> We can discuss the price in a PM thanks.


You have PM


----------



## d34th25

Quote:


> Originally Posted by *nycgtr*
> 
> You have PM


Thanks for the pm, I have replied to you.


----------



## xzamples

Just got a Titan X Pascal (2016). Thinking of either using it as my main GPU and throwing an aftermarket cooler on it, or selling it.

Any BIOS mods for it?

EDIT:

So I just did some gaming on it and it hits a max of 84°C. I don't like those temps, and I definitely can't game at those temps all the time, so I'm asking: what AIO GPU cooler is compatible with the Titan X Pascal (2016)?

I've seen the Gamers Nexus test and they highly recommend putting a better cooler on the Titan X Pascal; it gives better performance on top of good temps.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *xzamples*
> 
> Just got a Titan X Pascal (2016). Thinking of either using it as my main GPU and throwing an aftermarket cooler on it, or selling it.
> 
> Any BIOS mods for it?


No BIOS mods. The one that's on there is the only one.


----------



## The EX1

I hate myself.


----------



## pez

LOL....what's with the price (drop?) on that thing?


----------



## ESRCJ

Has anyone here been able to push a TXP (big p) past 7000 in this benchmark? If so, what were your OC settings?


----------



## The EX1

Quote:


> Originally Posted by *pez*
> 
> LOL....what's with the price (drop?) on that thing?


They marked down the Star Wars edition by 5%-ish last week. First time I have ever seen a Titan on sale.


----------



## pez

Maybe they'll keep the sale on it active until the new movie hits the cinema









----------



## KillerBee33

Uhummmm.....
https://www.nvidia.com/en-us/titan/titan-v/?ncid=em-ded-tnvptlh-29076&deliveryName=DM6510


----------



## Artah

Quote:


> Originally Posted by *KillerBee33*
> 
> Uhummmm.....
> https://www.nvidia.com/en-us/titan/titan-v/?ncid=em-ded-tnvptlh-29076&deliveryName=DM6510


$3,000 for Titan V that is supposed to go 5x faster than Titan Xp. Hmmm single GPU time but for that price though...


----------



## KillerBee33

Quote:


> Originally Posted by *Artah*
> 
> $3,000 for Titan V that is supposed to go 5x faster than Titan Xp. Hmmm single GPU time but for that price though...


MEHH, I'M OUT OF THIS GAME. TXP SITTING IN STORAGE, SO I'M ON A 1070x2 LAPTOP!


----------



## xzamples

Is this cooler compatible with the Titan X Pascal (2016)? https://www.arctic.ac/worldwide_en/accelero-hybrid-iii-120.html


----------



## d34th25

cleanup


----------



## d34th25

cleanup


----------



## KillerBee33

REMOVED POST


----------



## d34th25

I am still looking for a 2016 Titan X Pascal (2 GPUs). Please PM me if you're interested in selling. EU sellers. Somewhat urgent.


----------



## jsutter71

Quote:


> Originally Posted by *gridironcpj*
> 
> 
> 
> Has anyone here been able to push a TXP (big p) past 7000 in this benchmark? If so, what were your OC settings?


Sure...Just add another card.


----------



## unreality

Happy New Year fellow Titan XP owners! This card and the thread deserve more activity


----------



## CptSpig

Quote:


> Originally Posted by *unreality*
> 
> Happy New Year fellow Titan XP owners! This card and the thread deserve more activity


Happy New Year! The Titan V has been getting all the attention. I still love my Titan Xp and will be waiting for the fully unlocked Titan Vp.


----------



## MrKenzie

Yes, the influx of Titan Xp and Titan V's has pushed my Time Spy Extreme GPU score down into the 50s. Still good for a card that I've had for 16 months though!

I'm still hanging out for a 120Hz 4k monitor, although even a Titan V will only manage 80-90fps on most new games


----------



## Artah

Quote:


> Originally Posted by *MrKenzie*
> 
> Yes, the influx of Titan Xp and Titan V's has pushed my Time Spy Extreme GPU score down into the 50s. Still good for a card that I've had for 16 months though!
> 
> I'm still hanging out for a 120Hz 4k monitor, although even a Titan V will only manage 80-90fps on most new games









That's why you have to get two if you can SLI them...

Wow, this thread went almost completely dead. Is there a new Titan V discussion? If not, I started one here for those of you that have these cards. I just ordered two of them.

Edit: Oops, someone already started a thread a while back. http://www.overclock.net/forum/nvidia/1644280-official-nvidia-titan-v-owner-s-club.html


----------



## bee144

Hello All, 

I have been playing Battlefield 1 on my PC in DX11 or DX12 (specs below). Monitor is 2560x1440 144Hz G-Sync. I've added a user.cfg file, which takes advantage of 10 of my 12 threads. My processor is at roughly 90-95% CPU usage on average. At the same time, both of my Titans are somewhere between 40-60% GPU usage (80-120% combined). My FPS is around 80-110, far from 144.

Is my setup bottlenecked by my CPU? If I turn off SLI, I get 100% usage on the first GPU.

The reason I'm asking is that I'm wondering whether upgrading to a used 6850K (x16/x16) or a new 8700K (x8/x8) would help reduce the bottleneck.

4.7 GHz Intel 4960X
ASUS Rampage IV Black Edition
Corsair Platinum DDR3 2133 MHz RAM
Two Titan X (Pascal) with boost up to ~2076 MHz Core. (both are at x16)
Custom EKWB loop cooling the CPU and both GPUs. Temps don't go over 60°C under full load.

Thank you
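Those numbers can be sanity-checked with a crude rule of thumb (hypothetical thresholds, not a real profiler): a near-saturated CPU next to underused GPUs is the classic CPU-bottleneck signature.

```python
def likely_cpu_bound(cpu_util, gpu_utils, cpu_thresh=0.90, gpu_thresh=0.90):
    """Crude heuristic: CPU pegged while every GPU sits well below full load."""
    return cpu_util >= cpu_thresh and all(g < gpu_thresh for g in gpu_utils)

# The figures from the post above: ~90-95% CPU, 40-60% per Titan
print(likely_cpu_bound(0.93, [0.50, 0.55]))  # True -> consistent with a CPU bottleneck
```

The fact that a single GPU hits 100% usage with SLI off points the same way.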


----------



## KillerBee33

[email protected] is it dead here....


----------



## bl4ckdot

I'm waiting for the next Titan


----------



## bee144

bl4ckdot said:


> I'm waiting for the next Titan


I've regretted buying my two Titans. Going to wait for the next Ti card.


----------



## KillerBee33

bl4ckdot said:


> I'm waiting for the next Titan


I just got mine back, been sittin' in storage for a year and a half. Hey, how did you get your images back in your sig? Since the site update I lost most of my crap...


----------



## bl4ckdot

KillerBee33 said:


> I just got mine back, been sittin' in storage for a year and a half. Hey, how did you get your images back in your sig? Since the site update I lost most of my crap...


Have a look here : http://www.overclock.net/forum/3-ov...-rigbuilder-sig-rigs-2018-a.html#post27313529


----------



## KillerBee33

Any news on overclocking these TXPs, or is an updated Afterburner the only new thing? Haven't done this in a while...


----------



## bl4ckdot

KillerBee33 said:


> Any news on overclocking these TXPs, or is an updated Afterburner the only new thing? Haven't done this in a while...


Nothing new, same as the old days


----------



## Lee0

Not really much to say. My card is hard at work in my rig, playing games in 4K at almost max settings in every game @60fps. I slapped a Heatkiller block on it (along with a backplate) and it's much better than stock. Haven't overclocked it though, haven't been bothered to.

EDIT: Might also add that I thought Nvidia was screwing us original Titan X (Pascal) owners over with the new release of the card. I probably won't upgrade to the next gen since I can't upgrade my 4K 60Hz monitor (yet).


----------



## bl4ckdot

Lee0 said:


> Not really much to say. My card is hard at work in my rig, playing games in 4K at almost max settings in every game @60fps. I slapped a Heatkiller block on it (along with a backplate) and it's much better than stock. Haven't overclocked it though, haven't been bothered to.
> 
> EDIT: Might also add that I thought Nvidia was screwing us original Titan X (Pascal) owners over with the new release of the card. I probably won't upgrade to the next gen since I can't upgrade my 4K 60Hz monitor (yet).



Yup same as mine. It's not getting any rest ^^


----------



## Glerox

Anyone here with 1 or 2 Titan XP waiting for new Titan or 1180 TI or even longer instead of 1180?

I'm tired of SLI but will wait at least for 30% performance increase over one Titan XP/1080 TI


----------



## KillerBee33

Glerox said:


> Anyone here with 1 or 2 Titan XP waiting for new Titan or 1180 TI or even longer instead of 1180?
> 
> I'm tired of SLI but will wait at least for 30% performance increase over one Titan XP/1080 TI


If you don't mind losing a few hundred in the transition... Nvidia brought out 3 Titans this year.. LOL, God knows what they're gonna do next year...


----------



## bee144

Glerox said:


> Anyone here with 1 or 2 Titan XP waiting for new Titan or 1180 TI or even longer instead of 1180?
> 
> I'm tired of SLI but will wait at least for 30% performance increase over one Titan XP/1080 TI


I have two Titan X (Pascal) in SLI. SLI is annoying, but it's necessary when targeting 4K at 144Hz.

I would be fine with a single 1180 Ti though, as SLI is a headache and a waste of money, even in some AAA games.


----------



## Glerox

bee144 said:


> I have two Titan X (Pascal) in SLI. SLI is annoying, but it's necessary when targeting 4K at 144Hz.
> 
> I would be fine with a single 1180 Ti though, as SLI is a headache and a waste of money, even in some AAA games.


exactly. waiting for 1180ti also.


----------



## bl4ckdot

Glerox said:


> Anyone here with 1 or 2 Titan XP waiting for new Titan or 1180 TI or even longer instead of 1180?
> 
> I'm tired of SLI but will wait at least for 30% performance increase over one Titan XP/1080 TI



I am. I'll most probably buy the next Titan


----------



## bee144

bl4ckdot said:


> Glerox said:
> 
> 
> 
> Anyone here with 1 or 2 Titan XP waiting for new Titan or 1180 TI or even longer instead of 1180?
> 
> I'm tired of SLI but will wait at least for 30% performance increase over one Titan XP/1080 TI
> 
> 
> 
> 
> I am. I'll most probably buy the next Titan
Click to expand...

Why buy the Titan when you could pay half as much for an 1180Ti?

That's one thing I'll never do again. I bought three of the OG Titans and two of the 2016 Titans, but after realizing I spent 2x as much for the same performance, I'm not doing that again. I love my money too much.


----------



## bl4ckdot

bee144 said:


> Why buy the Titan when you could pay half as much for an 1180Ti?
> 
> That's one thing I'll never do again. I bought three of the OG Titans and two of the 2016 Titans, but after realizing I spent 2x as much for the same performance, I'm not doing that again. I love my money too much.


Sure, you can have the same performance by waiting 6+ months. I usually want to play with new toys ASAP ^^


----------



## stefxyz

Depends of course. If the 1180 can sport at least a 20% lead over my Titan X Pascal at 2000MHz, then I will go for it right away and buy the new GTX Titan when released too (got 3 PCs in use). If the new 1180 is only marginally quicker, then I will skip it and wait.

I really hope they can show huge performance gains. I am desperate for more power. 4K 144Hz and especially VR beg for way more power.

Knowing myself, I will buy the 1180 anyway on day 1, just because I am curious how it performs, what it can do, and how it overclocks. I just enjoy fiddling around with it... maybe more than actually gaming on it, to be honest...

Also I really got tired of my Titan X, which I bought August 2nd 2016, so 2 years ago...


----------



## Pandora's Box

Nope. Zero plans to upgrade here; none of my games are struggling with 2 Titan Xps @ 2GHz core, 12GHz mem. Higher priorities right now than buying new video card(s).


----------



## BGaming

Has anyone managed to make a volt mod or something like that? I have done the shunt mod and liquid metal on the GPU. It won't pass 37°C at full load with ambient temps at 21°C. On hot days (normal here), 47°C at full load with 32°C ambient. Core clock @2088 (GPU Boost hits 2063), memory clock @11.8GHz. I think I still have room for more overclocking, but I need more volts, or a stable voltage that doesn't move around. Any suggestions? What can I do? Thanks in advance.


----------



## Timp74

*Help. Full Titan X Pascal VBIOS Image needed...*

I managed to completely erase the VBIOS of a Titan X Pascal I'm trying to fix. :doh:

I re-flashed it with an image from techpowerup, but the card still won't POST. It gives error 62. I noticed that the EEPROM on the card is 512KB, but the image from techpowerup is 256KB. :headscrat

Looking at the images from other Pascal-based cards I have, the second half of the VBIOS isn't always blank. I'm hoping there is something there that will make the card work again. :glasses

It would be fantastic if someone could use nvflash (or an external EEPROM programmer) to dump the full 512KB VBIOS image of their card and send it to me.

Thanks.


----------



## mattxx88

Timp74 said:


> I managed to completely erase the VBIOS of a Titan X Pascal I'm trying to fix. :doh:
> 
> I re-flashed it with an image from techpowerup, but the card still won't POST. It gives error 62. I noticed that the EEPROM on the card is 512KB, but the image from techpowerup is 256KB. :headscrat
> 
> Looking at the images from other Pascal-based cards I have, the second half of the VBIOS isn't always blank. I'm hoping there is something there that will make the card work again. :glasses
> 
> It would be fantastic if someone could use nvflash (or an external EEPROM programmer) to dump the full 512KB VBIOS image of their card and send it to me.
> 
> Thanks.


As far as I know there isn't a modded BIOS for the Titan X Pascal.. if you want I can give you the original one from my GPU (but I have to close the water loop first, and I'll have the mainboard this Wednesday)


----------



## Timp74

mattxx88 said:


> As far as I know there isn't a modded BIOS for the Titan X Pascal.. if you want I can give you the original one from my GPU (but I have to close the water loop first, and I'll have the mainboard this Wednesday)


Hi, Yes, I'm just after the factory image so that would be great! Thanks.


----------



## KillerBee33

Hey, I know this may be a bit late, but after all the reviews on RTX are out I've decided to keep my TXP till next gen. The question is: did anyone else notice that bumping up the voltage with MSI Afterburner lowers the 3DMark score?
1. Voltage at 100: https://www.3dmark.com/fs/17569581
2. Voltage untouched: https://www.3dmark.com/fs/17628911
I've noticed it boostin' 25MHz less with voltage untouched, but it scores higher. OC and driver were exactly the same on both runs.


----------



## CptSpig

KillerBee33 said:


> Hey, I know this may be a bit late, but after all the reviews on RTX are out I've decided to keep my TXP till next gen. The question is: did anyone else notice that bumping up the voltage with MSI Afterburner lowers the 3DMark score?
> 1. Voltage at 100: https://www.3dmark.com/fs/17569581
> 2. Voltage untouched: https://www.3dmark.com/fs/17628911
> I've noticed it boostin' 25MHz less with voltage untouched, but it scores higher. OC and driver were exactly the same on both runs.


You need to find out what the max voltage is for your card.

1. Open Afterburner.
2. Set the core clock slider to any value (a multiple of 13), so 130, 143, etc., and apply.
3. Put your mouse in the graph window and hit Ctrl+F to open the voltage/frequency curve.
4. Select the point at 1025mV or 1050mV (where the sloped and flat parts of the curve meet) and hit Ctrl+L.
5. Apply.
6. Save to a profile so you can set your max voltage any time you like.

Enjoy. :thumb:


----------



## KillerBee33

CptSpig said:


> You need to find out what the max voltage is for your card.
> 
> 1. Open Afterburner.
> 2. Set the core clock slider to any value (a multiple of 13), so 130, 143, etc., and apply.
> 3. Put your mouse in the graph window and hit Ctrl+F to open the voltage/frequency curve.
> 4. Select the point at 1025mV or 1050mV (where the sloped and flat parts of the curve meet) and hit Ctrl+L.
> 5. Apply.
> 6. Save to a profile so you can set your max voltage any time you like.
> 
> Enjoy. :thumb:


Clocked higher, Scored Lower https://www.3dmark.com/3dm/31767790


----------



## CptSpig

KillerBee33 said:


> Clocked higher, Scored Lower https://www.3dmark.com/3dm/31767790


All 3DMark benches are different. Time Spy likes high core and lower memory; Time Spy Extreme likes high core and memory.


----------



## KillerBee33

CptSpig said:


> All 3DMark benches are different. Time Spy likes high core and lower memory; Time Spy Extreme likes high core and memory.


I did get a stable 1.093V and a higher clock, but PerfCap was still VRel and the score went down. Mehh, I'll wait for Borderlands 3 and take it from there I guess; for now I'll run it stock, which seems to do just fine in games.


----------



## bl4ckdot

Hello, I'm in need of help. For 2 months now there has been a noise (a popping sound, to be more precise) coming from my PC. I finally found where it's coming from: my Titan XP. This is the sound I recorded: https://instaud.io/3cT4 It sounds like a relay switch, but I'm not sure. My card is watercooled, and if I lay my hand on it I can feel the "pop". The GPU isn't OC'd (I have 0 issues with the GPU, no artifacts). It happens mostly while gaming, but can happen in a game's menu, while watching a YouTube video, etc. It once even happened after the shutdown of my PC, like 2 or 3 sec after it stopped. Very random. It can also be pretty loud; I hear it with my headphones on while gaming. So... what could make this noise, and what can I do?


----------



## bl4ckdot

bl4ckdot said:


> Hello, I'm in need of help. For 2 months now there has been a noise (a popping sound, to be more precise) coming from my PC. I finally found where it's coming from: my Titan XP. This is the sound I recorded: https://instaud.io/3cT4 It sounds like a relay switch, but I'm not sure. My card is watercooled, and if I lay my hand on it I can feel the "pop". The GPU isn't OC'd (I have 0 issues with the GPU, no artifacts). It happens mostly while gaming, but can happen in a game's menu, while watching a YouTube video, etc. It once even happened after the shutdown of my PC, like 2 or 3 sec after it stopped. Very random. It can also be pretty loud; I hear it with my headphones on while gaming. So... what could make this noise, and what can I do?


Fixed : https://www.overclock.net/forum/31-power-supplies/1722418-seasonic-prime-titanium-1000-pop-sound-3.html#post27915750


----------



## jsutter71

*Cleanup*

So I'm in the middle of a major cleanup and overhaul of my system, and when I took my TXPs out I noticed what look like scorch marks on the SLI fingers. I spent some quality time removing the black and ordered a new SLI bridge. I also have some rust in the cooling chamber of one of the cards; this is after my initial cleaning. Any thoughts on how this can affect performance? The black marks were limited to one side of the card. I'm very displeased with the rust, since I spent enough on those EK blocks.


----------



## TheGeneralLee86

How many on here still own their Titan X Pascal? I know I still do, waiting to save up for another super machine!


----------



## deafboy

I'm still rocking mine and haven't really felt the need for an upgrade... it's been a solid performer for as long as I've had it, lol


----------



## TheGeneralLee86

deafboy said:


> I'm still rocking mine and haven't really felt the need for an upgrade... it's been a solid performer for as long as I've had it, lol


Yeah, I know what you mean. I have 2 in SLI; it will be a while before I upgrade again because everything has gotten so expensive! I also have 32GB (4x8GB) of DDR4 2666MHz RAM and the i7 6950X Extreme Edition it's attached to, all on an Asus X99-M WS workstation MicroATX motherboard. I've had it all since they first came out, so my system is a little over 2 years old! It still plays every game awesomely, especially on my three LU28E590D 28" 4K UHD Samsung monitors at 60Hz, which are still plenty good for me. Someday I hope to get a 49" ultra-wide 31:9 monitor, which would be better because it would be one monitor versus three, taking less space and being easier to move around!


----------



## Multiplectic

Hey guys! I'm getting my Titan tomorrow! 
I always wanted a Titan, never really could afford one... Had to wait a few years, lol. It'll be a good step up from my 980Ti.

Any quick tips or tweaks I should do?
I'm planning on getting a Kraken G12 + AIO in a couple of months, but I wanted to try out the stock cooler first.


----------



## ryward

Hello everyone, I am looking for an EKWB EK-FC Titan X backplate, which all look to be end-of-life. Preferably one in nickel. Does anyone have one that they would be willing to part with? If so, feel free to DM me.

Thank you!


----------



## Buford458

TheGeneralLee86 said:


> How many on here still own their Titan X Pascal? I know I still do, waiting to save up for another super machine!



I'm running one and I see no reason to upgrade for the near future.


----------



## MrTOOSHORT

ryward said:


> Hello Everyone, I am looking for an EKWB EK-FC Titan X Backplate, which all look End of Life. Preferably one in Nickel, does anyone have one that they would be willing to part with? If so feel free to DM me.
> 
> Thank you!


I have one, for Titan X Pascal version, let me know...


----------



## ryward

MrTOOSHORT said:


> I have one, for Titan X Pascal version, let me know...


Thanks I just sent you a private message! Would love to grab it.


----------



## KillerBee33

Yeap, still got mine here. Flushing the loop "as we speak". I thought 4 years was about time, but it looks curiously clean...


----------



## Jedi Mind Trick

Man, this thread seems super dead! Guess those with TXP/TXps upgrade more frequently!

Anyways, I got one of these cards recently as a side-grade from a 1080 Ti, and it doesn't clock too well (though I'm never sure how good a 'good' clock is / how bad a 'bad' clock is). It tops out right around 2075/2088 on the core and 5500-5600 on the memory (under water). The card is a champ for 1440p 144Hz; I'll hopefully ride it until something comes along that can handle 4K/120Hz better.

The card is awesome and I have no complaints; hopefully I'll keep it longer than the myriad 1080 Tis I've owned (which, no joke, was way too many!).


----------



## deafboy

Still have my card, threw it into my mITX build

Haven't felt the need to upgrade, lol


----------



## Jedi Mind Trick

deafboy said:


> Still have my card, threw it into my mITX build
> 
> *Haven't felt the need to upgrade*, lol


That is what I hope this card does for me! If nothing else the idea that I'd have to get a new WB if I upgrade should be enough of a deterrent for now! I have a problem, but hopefully this card will keep my love to side-grade at bay!


----------



## Rizen

Jedi Mind Trick said:


> Man, this thread seems super dead! Guess those with TXP/TXps upgrade more frequently!
> 
> Anyways, I got one of these cards recently as a side-grade from a 1080 Ti, and it doesn't clock too well (though I'm never sure how good a 'good' clock is / how bad a 'bad' clock is). It tops out right around 2075/2088 on the core and 5500-5600 on the memory (under water). The card is a champ for 1440p 144Hz; I'll hopefully ride it until something comes along that can handle 4K/120Hz better.
> 
> The card is awesome and I have no complaints; hopefully I'll keep it longer than the myriad 1080 Tis I've owned (which, no joke, was way too many!).


Yeah, mine tops out around 2063/2075 on the core. I'm 100% power limited; the GPU never goes over ~45°C.


----------



## Multiplectic

So, a couple of days ago I got my G12 and an Asetek OEM from eBay:



Dropped load temps by about 40°C, and now my boost clocks are limited by power only, hitting about 1850 with power @ 120%.
I still have to do some OC testing to see how far I can go now.

The AIO came with an incredible Nidec fan that pumps >140CFM @ 3300 RPM, but right now I'm controlling it via the mobo and it's running @ 10-15%, virtually silent.


----------



## Astroflash

Anyone willing to sell me a stock cooler for the Titan X Pascal in the UK? Cheers!


----------



## Sheytan

nevermind


----------

